Context: The conflict between free speech and consent
- Publishing non-consensual intimate images (NCII) is a criminal offence under the Information Technology Act, 2000, yet platforms struggle to filter such content out.
- While a criminal conviction is desirable, the more urgent need for victims is to stop the spread of this illegal content.
- The Intermediary Guidelines 2021 provide a partial solution.
- They empower victims to complain directly to any website that has allowed the uploading of non-consensual images or videos of a person in a state of nudity or engaging in a sexual act.
- The website must remove the content within 24 hours of receiving a complaint, or risk facing criminal charges.
- This approach relies on victims identifying and sharing every URL hosting their intimate images.
- Further, the same images may be re-uploaded at different locations or by different user accounts in the future.
- While the Intermediary Guidelines do encourage large social media platforms to proactively remove certain types of content, the focus is on child pornography and rape videos.
- Victims of NCII abuse thus have little choice but to lodge complaints each time their content resurfaces, often forcing them to approach courts.
Existing practices to fight NCII
- Meta recently built a tool to curtail the spread of NCII.
- The tool relies on a “hashing” technology to match known NCII against future uploads.
- Other websites could eventually use this NCII hash database to identify illegal content.
- Australia has appointed an “e-Safety Commissioner” who receives complaints against NCII and coordinates between complainants, websites, and individuals who posted the content – with the Commissioner empowered to issue “removal notices” against illegal content.
- Pairing a hash database with an independent body like the Commissioner may significantly reduce the spread of NCII.
- The use of automated tools raises free speech concerns that lawful content may accidentally be taken down.
- Automatic filters often ignore context.
- Content that may be illegal in one context may not be illegal in another.
- While there exist tricky cases where courts may be required to intervene, the vast majority of NCII has no public interest component and can be taken down quickly.
- Automated tools also have a much better record with images than with text, since images are less likely to be misinterpreted.
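The hash-matching approach described above can be sketched in a few lines. This is an illustrative simplification: the class and function names are hypothetical, and a real system (such as Meta's) would use perceptual hashes that survive resizing and re-encoding, not an exact cryptographic digest as here.

```python
import hashlib


def image_hash(data: bytes) -> str:
    """Exact-match digest of an image's bytes.

    Production systems use perceptual hashing, which tolerates
    cropping, resizing, and re-compression; SHA-256 is used here
    only to illustrate the match-against-a-database idea.
    """
    return hashlib.sha256(data).hexdigest()


class HashDatabase:
    """Registry of hashes of reported NCII.

    Only hashes are stored and shared between platforms --
    never the images themselves.
    """

    def __init__(self) -> None:
        self._known: set = set()

    def register(self, data: bytes) -> None:
        # A victim (or reviewer) reports an image; its hash joins the list.
        self._known.add(image_hash(data))

    def is_known(self, data: bytes) -> bool:
        # Called at upload time: a match means the upload can be blocked
        # automatically, without the victim filing a fresh complaint.
        return image_hash(data) in self._known


db = HashDatabase()
db.register(b"reported-image-bytes")
print(db.is_known(b"reported-image-bytes"))   # matched: block the upload
print(db.is_known(b"unrelated-image-bytes"))  # no match: allow the upload
```

Because only hashes circulate, any website could consult a shared database maintained by an independent body without that body ever redistributing the underlying images.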
The government’s reported overhaul of the Information Technology Act is an opportunity to develop a coordinated response to NCII abuse that provides victims meaningful redress without restricting online speech.
In the interim, courts should balance the harm caused by NCII against the need to protect online speech; they may consider tasking a state functionary or an independent body with verifying the URLs and coordinating with online platforms and internet service providers.
Source: Indian Express