From the UPSC perspective, the following things are important:
Prelims level: NA
Mains level: Online child abuse and protection
- Last month, the Central Bureau of Investigation (CBI) conducted searches across States and Union Territories as part of a pan-India operation, “Megh Chakra”. The operation, against the online circulation and sharing of Child Sexual Abuse Material (CSAM) using cloud-based storage, was reportedly based on inputs received from Interpol’s Singapore special unit, which were in turn based on information received from New Zealand.
Current system of detecting CSAM
- Help of foreign agencies: As the public reporting of circulation of online CSAM is very low and there is no system of automatic electronic monitoring, India’s enforcement agencies are largely dependent on foreign agencies for the requisite information.
- Operation Carbon: In November 2021, a similar exercise, code-named “Operation Carbon”, was launched by the CBI, with many accused booked under the IT Act, 2000.
American Model of fighting CSAM
- CyberTipline programme under NCMEC: The National Centre for Missing & Exploited Children (NCMEC), a non-profit organisation in the United States, operates a programme called the CyberTipline for the public and electronic service providers (ESPs) to report instances of suspected child sexual exploitation. In 2021, the CyberTipline received more than 29.3 million reports (99% from ESPs) of suspected CSAM hosted in the U.S.
- Mandatory reporting for Internet service providers (ISPs): ISPs are mandated to report the identity and the location of individuals suspected of violating the law. Also, NCMEC may notify ISPs to block transmission of online CSAM.
UK Model of fighting CSAM
- Internet Watch Foundation (IWF) to ensure safe online environment: In the United Kingdom, the mission of the Internet Watch Foundation (IWF), a non-profit organisation established by the United Kingdom’s Internet industry to ensure a safe online environment for users with a particular focus on CSAM, includes disrupting the availability of CSAM and deleting such content hosted in the U.K.
- ISPs may be held responsible: The IWF engages analysts to actively search for criminal content rather than relying only on reports from external sources. Though the U.K. does not explicitly mandate the reporting of suspected CSAM, ISPs may be held responsible for third-party content if they host or cache such content on their servers. In 2021, the IWF assessed 3,61,062 reports (about 70% contained CSAM), and seven in 10 reports contained “self-generated” CSAM.
Efforts of Global community
- Global network for secure IT infrastructure: INHOPE, a global network of 50 hotlines (46 member countries), provides the public with a way to anonymously report CSAM. It provides a secure IT infrastructure, ICCAM (“I see child abuse material”), hosted by Interpol, and facilitates the exchange of CSAM reports between hotlines and law enforcement agencies. ICCAM is a tool to facilitate image/video hashing (fingerprinting) and reduce the number of duplicate investigations.
- Removal of illegal URLs: In 2021, the number of exchanged content URLs stood at 9,28,278, of which 4,43,705 contained illegal content. About 72% of all illegal content URLs were removed from the Internet within three days of a notice and takedown order.
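The hashing/fingerprinting idea behind ICCAM, described above, can be illustrated with a minimal Python sketch. Note that real CSAM-detection systems typically use perceptual hashes (such as Microsoft's PhotoDNA) that survive re-encoding and resizing; the cryptographic SHA-256 hash used here is a simplification that only matches byte-identical files. The function names and file paths are illustrative, not part of any real system.

```python
import hashlib

def fingerprint(path: str) -> str:
    """Compute a SHA-256 fingerprint of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def deduplicate(paths):
    """Group file paths by fingerprint so each unique file is examined only once."""
    groups = {}
    for p in paths:
        groups.setdefault(fingerprint(p), []).append(p)
    return groups
```

Given a list of reported files, `deduplicate` returns one group per unique fingerprint, so two byte-identical copies reported by different hotlines trigger a single investigation rather than two.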
Indian scenario of fighting CSAM
- Internet service providers are exempted from liability: In India, the Supreme Court, in Shreya Singhal (2015), read down Section 79(3)(b) of the IT Act to mean that an ISP, only upon receiving actual knowledge of a court order or on being notified by the appropriate government, shall remove or disable access to illegal content. Thus, ISPs are exempted from liability for any third-party information.
- In the Kamlesh Vaswani (WP(C) 177/2013) case: The petitioner sought a complete ban on pornography. After the Court’s intervention, the advisory committee (constituted under Section 88 of the IT Act) issued orders in March 2015 to ISPs to disable nine (domain) URLs that hosted content in violation of the morality and decency clause of Article 19(2) of the Constitution. The petition is still pending in the Supreme Court.
- Aarambh India portal: Aarambh India, a Mumbai-based non-governmental organisation, partnered with the IWF and launched India’s first online reporting portal in September 2016 to report images and videos of child abuse. These reports are assessed by the expert team of IWF analysts, and offending URLs are added to its blocking list. Till 2018, out of 1,182 reports received on the portal, only 122 were found to contain CSAM.
- National cybercrime reporting portal: The Ministry of Home Affairs (MHA) launched a national cybercrime reporting portal in September 2018 for filing online complaints pertaining to child pornography and rape/gang rape. This facility was developed in compliance with Supreme Court directions in a public interest litigation filed by Prajwala, a Hyderabad-based NGO that rescues and rehabilitates sex trafficking survivors. As not many cases of child pornography and rape were reported, the portal was later extended to all types of cybercrime.
- National Crime Records Bureau (MHA): The National Crime Records Bureau (MHA) signed a memorandum of understanding with the NCMEC in April 2019 to receive CyberTipline reports to facilitate action against those who upload or share CSAM in India. The NCRB has received more than two million CyberTipline reports, which have been forwarded to the States for legal action.
- The ad hoc Committee of the Rajya Sabha: In its report of January 2020, the committee made wide-ranging recommendations on ‘the alarming issue of pornography on social media and its effect on children and society as a whole’.
- Widening of the definition of ‘child pornography’: On the legislative front, the committee not only recommended the widening of the definition of ‘child pornography’ but also proactive monitoring, mandatory reporting and taking down or blocking CSAM by ISPs.
- Breaking of end-to-end encryption: On the technical front, the committee recommended permitting the breaking of end-to-end encryption, building partnerships with industry to develop tools using artificial intelligence for dark-web investigations, tracing the identity of users engaged in cryptocurrency transactions to purchase child pornography online, and liaising with financial service companies to prevent online payments for purchasing child pornography.
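The URL “blocking list” mechanism mentioned in the bullets above (IWF blocking lists, ISP orders to disable URLs) is described only at a high level. As a minimal sketch under stated assumptions, an ISP-side filter might normalise each requested URL and match it, or its whole domain, against a blocklist; the URLs, helper names, and matching rules here are illustrative, not drawn from any real blocklist or filtering product.

```python
from urllib.parse import urlsplit

def normalize(url: str) -> str:
    """Reduce a URL to lower-cased host plus path, dropping scheme and fragment."""
    parts = urlsplit(url.strip())
    host = parts.hostname or ""   # .hostname is already lower-cased
    path = parts.path or "/"
    return f"{host}{path}"

def is_blocked(url: str, blocklist: set) -> bool:
    """Block if the exact normalised URL, or its entire domain, is listed."""
    entry = normalize(url)
    domain = entry.split("/", 1)[0]
    return entry in blocklist or domain in blocklist
```

A domain entry such as `bad.example.com` blocks every page on that host (as in the Kamlesh Vaswani domain-level orders), while a host-plus-path entry blocks a single offending page, which is closer to how the IWF’s URL-level list is described.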
What needs to be done?
- Mandatory reporting of CSAM by ISPs: According to the ninth edition (2018) of the International Centre for Missing & Exploited Children’s report, “Child Sexual Abuse Material: Model Legislation & Global Review”, more than 30 countries now require mandatory reporting of CSAM by ISPs. Surprisingly, India also figures in this list, though Indian law does not actually provide for such mandatory reporting.
- Establish liability of legal persons: The Optional Protocol to the United Nations Convention on the Rights of the Child that addresses child sexual exploitation encourages state parties to establish liability of legal persons.
- Convention on the Protection of Children against Sexual Exploitation and Sexual Abuse: The Council of Europe’s Convention on Cybercrime and its Convention on the Protection of Children against Sexual Exploitation and Sexual Abuse also require member states to address the issue of corporate liability.
- India should join INHOPE: It is time India joins INHOPE and establishes its own hotline to utilise Interpol’s secure IT infrastructure, or collaborates with ISPs and financial companies by establishing an independent facility along the lines of the IWF or the NCMEC.
- India needs to explore all options and adopt an appropriate strategy to fight the production and the spread of online CSAM. Children need to be saved.
Q. How are children vulnerable to child sexual abuse material (CSAM)? What legal remedies are available in India against CSAM?