Why in the News?
The Ministry of Electronics and Information Technology (MeitY) has proposed mandatory labelling of Artificial Intelligence–generated synthetic content on social media platforms to curb deepfakes, through draft amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.
2025 Draft Amendment on AI Content:
- AI Regulation: Introduced by MeitY to address synthetic and AI-generated media such as deepfakes.
- Mandatory Disclosure: Users must self-declare AI-generated content; platforms must detect and label undeclared synthetic material.
- Labelling Standards: Labels must cover at least 10% of the display area for images and videos, or 10% of the duration for audio; the requirement extends across text, audio, and video formats.
- Platform Obligations: Ensure metadata embedding and automated verification of user declarations.
- Legal Liability: Non-compliance leads to loss of “safe harbour” protection under Section 79(1), making intermediaries liable for hosted content.
- Public Consultation: Comments invited until 6 November 2025.
Back2Basics: IT Rules, 2021:
- Legal Basis: Framed under Sections 87(2)(z) and 87(2)(zg) of the Information Technology Act, 2000 to regulate social media, digital news, and OTT platforms.
- Objective: To ensure accountability, transparency, and user protection in India’s digital ecosystem while balancing free speech with responsible governance.
- Evolution: Replaced the IT (Intermediary Guidelines) Rules, 2011, expanding obligations for intermediaries like Facebook, X (Twitter), YouTube, and Instagram.
- Scope: Applies to social media intermediaries, messaging services, digital news publishers, and OTT streaming platforms.
- Compliance Framework: Platforms must appoint Chief Compliance Officer (CCO), Nodal Contact Person, and Resident Grievance Officer (RGO), all based in India.
- Traceability Clause (Rule 4(2)): Mandates messaging services to identify the “first originator” of unlawful content, raising privacy and surveillance concerns.
Regulation of Social Media Content in India:
- Legislative Basis: Governed by the IT Act, 2000, notably Section 69A (blocking powers) and Section 79(1) (safe harbour for intermediaries).
- Obligations: Intermediaries must remove unlawful content within 36 hours of a government or court order.
- 2023 Amendment: Required intermediaries to take down content flagged as false about the Central Government by a government Fact-Check Unit; its operation was stayed by the Supreme Court.
- Judicial Context:
  - Shreya Singhal (2015): Struck down Section 66A of the IT Act, upholding free speech online.
  - K.S. Puttaswamy (2017): Recognised privacy as a fundamental right, influencing digital governance.