
Freedom of Speech – Defamation, Sedition, etc.

Should commercial speech on digital platforms be regulated?

Introduction

On August 25, 2025, the Supreme Court of India asked the Union government to frame guidelines for regulating social media content, noting that influencers often commercialise speech in ways that offend vulnerable groups. The case arose from derogatory remarks made by comedians about persons with Spinal Muscular Atrophy. While well-intentioned, the order has raised concerns about overregulation of free speech.

Why in the news

The Supreme Court of India’s intervention is significant because it directs the executive to draft specific rules for social media even though existing laws, such as the Bharatiya Nyaya Sanhita, 2023 (BNS) and the Information Technology Act, 2000 (IT Act), already provide mechanisms. For the first time, the Court has nudged the government toward formal regulation triggered by a single incident, raising alarms of censorship and judicial overreach.

The presence or absence of a regulatory vacuum

  1. Existing provisions: FIRs can be filed under the Bharatiya Nyaya Sanhita, 2023 and the Information Technology Act, 2000. The IT Act already empowers courts or the executive to order takedowns.
  2. Opaque enforcement: Takedowns often occur without notifying the affected individual, undermining natural justice.
  3. Critics’ view: No regulatory vacuum exists; additional rules may be an overreaction to a single case.

The question of dignity as a ground for restricting free speech

  1. Constitutional limits: Article 19(2) of the Constitution of India exhaustively lists permissible restrictions: security of the State, public order, decency, morality, etc. Dignity is not among them.
  2. Judicial precedents: In Subramanian Swamy v. Union of India (2016), the Supreme Court of India upheld criminal defamation, indirectly protecting individual dignity, but did not treat dignity as an independent ground.
  3. Slippery slope risk: Recognising dignity as a separate basis for restriction could legitimise expansive censorship.

The risk of silencing uncomfortable speech

  1. Chilling effect: Overbroad regulations may deter comedians, satirists, and artists from bold expression.
  2. Supreme Court stance: In March 2025, in Imran Pratapgarhi v. State of Gujarat, the Court quashed charges against a Member of Parliament, reaffirming that Article 19(1)(a) protects even disturbing or offensive views.
  3. Censorship creep: Proposals like the Broadcasting Services (Regulation) Bill may expand state control over independent creators.

The place of commercial speech in free expression

  1. Judicial recognition: In Sakal Papers Pvt. Ltd. v. Union of India (1962) and Tata Press Ltd. v. Mahanagar Telephone Nigam Limited (1995), the Supreme Court of India affirmed that commercial speech falls under Article 19(1)(a).
  2. Commerce and speech: Just as newspapers rely on advertisements, comedians and influencers rely on monetisation. Profit motive does not make speech less deserving of protection.
  3. Criticism: Comedy and satire do not neatly fall into the narrow category of “commercial speech,” traditionally reserved for advertisements.

Judicial polyvocality and consistency of precedent

  1. Court’s nature: Divergent views are part of common law, but binding precedent ensures continuity.
  2. Problem here: Directing the executive to draft rules risks giving regulations undue legitimacy and making constitutional challenges harder.
  3. Judicial discipline: When coordinate Benches wish to depart from earlier rulings, the proper procedure is a reference to a larger Bench.

Safeguards needed in future regulations

  1. Transparent review: Any regulation must ensure robust review mechanisms and fairness in takedown procedures.
  2. Broad consultation: Stakeholder engagement should extend beyond industry associations to include civil society and affected communities.
  3. Opacity concerns: Section 69A of the Information Technology Act, 2000 and its rules (2009) are already opaque; future regulations must not repeat these flaws.

Conclusion

The Supreme Court’s intention to protect dignity is laudable, but creating fresh regulations risks undermining the freedom of expression. India already has legal frameworks to tackle offensive content. Expanding restrictions based on vague concepts like dignity may lead to excessive censorship, weaken democratic discourse, and erode artistic freedom.

Value Addition

Social Media Regulation in India

Existing legal framework:

  1. Information Technology Act, 2000 (IT Act) – Section 69A empowers the government to block content in the interest of sovereignty, security, or public order.
  2. Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 – impose obligations on intermediaries (traceability, grievance redressal, content takedown within 24 hours).
  3. Bharatiya Nyaya Sanhita, 2023 (BNS) – contains provisions criminalising hate speech, obscenity, and defamation.

Judicial interventions:

  1. Shreya Singhal v. Union of India (2015) – struck down Section 66A of the IT Act for being vague and unconstitutional.
  2. Subramanian Swamy v. Union of India (2016) – upheld criminal defamation, linking dignity and reputation to Article 21.
  3. Concerns: Opaque takedown orders, executive overreach, limited transparency, chilling effect on creators.

Comparative Global Perspective

  • European Union (EU):
    • Digital Services Act (DSA), 2022 – imposes strict obligations on platforms to remove illegal content, ensures algorithmic transparency, and penalises non-compliance heavily.
    • Focus on user rights, platform accountability, and transparency reports.
  • United States:
    • Section 230 of the Communications Decency Act, 1996 – grants platforms immunity for third-party content but allows them to moderate in “good faith.”
    • Debate ongoing about reforming Section 230 to tackle misinformation and hate speech.
  • United Kingdom: Online Safety Act, 2023 – places a “duty of care” on platforms to protect children and curb illegal content.
  • Australia: Online Safety Act, 2021 – empowers the eSafety Commissioner to order removal of harmful content (cyberbullying, image-based abuse, terrorist material).
  • China: Heavily restrictive model – extensive censorship, mandatory real-name verification, and state monitoring of digital platforms.
  • Global South: Many countries (e.g., Nigeria, Pakistan) have passed restrictive social media laws under the pretext of national security, raising concerns about authoritarian misuse.

International Bodies and Global Norms

  • United Nations Human Rights Council (UNHRC): Stresses that restrictions on online speech must comply with Article 19 of the International Covenant on Civil and Political Rights (ICCPR) – legality, necessity, and proportionality.
  • UNESCO: Advocates for a multi-stakeholder approach to digital governance, focusing on protecting human rights, access to information, and pluralism.
  • OECD (Organisation for Economic Cooperation and Development): Encourages transparency and accountability frameworks for digital platforms.
  • Global Internet Forum to Counter Terrorism (GIFCT): A tech industry-led initiative to remove extremist content online.

Good Examples

  • Germany: Network Enforcement Act (NetzDG), 2017 – requires platforms to remove “manifestly unlawful” content (hate speech, fake news) within 24 hours. Criticised for overblocking but effective in quick takedowns.
  • France: Passed “Avia Law” (2020) against online hate but was struck down by the Constitutional Council for disproportionate restrictions. Illustrates the tension between free speech and regulation.
  • EU’s GDPR (General Data Protection Regulation) indirectly regulates platforms by holding them accountable for data privacy and targeted advertising.

Way Forward for India

  • Principle-based framework: Regulations should follow constitutional safeguards (Article 19(2)), ensure proportionality, and avoid vague categories like “dignity.”
  • Transparency and due process: Mandatory publication of takedown orders, notice to affected parties, and avenues for appeal.
  • Independent oversight: Instead of executive dominance, an independent regulator (like an ombudsman or tribunal) could review takedown requests.
  • Stakeholder-driven approach: Consultation must involve civil society, creators, tech companies, and vulnerable communities.
  • Digital literacy: Public campaigns to counter hate speech and misinformation organically, rather than relying solely on punitive regulation.
  • Learning from global practices: India could adapt elements of the EU’s Digital Services Act (transparency), US’s Section 230 immunity, and Australia’s safety-first approach, while avoiding China’s over-control.

UPSC Relevance

[UPSC 2013] Discuss Section 66A of IT Act, with reference to its alleged violation of Article 19 of the Constitution.

Linkage: Section 66A of the Information Technology Act, 2000 was struck down in Shreya Singhal v. Union of India (2015) for being vague and violating Article 19(1)(a) beyond the limits of Article 19(2). The present debate on regulating commercial speech on digital platforms raises a similar concern, as introducing “dignity” as a restriction risks the same arbitrariness. Both highlight the constitutional need for clear, proportionate, and narrowly defined limits on free speech in India.
