💥 UPSC 2026, 2027 UAP Mentorship September Batch

Artificial Intelligence (AI) Breakthrough

Unseen labour, exploitation: the hidden human cost of Artificial Intelligence

Introduction

The promise of AI as an automated, error-free technology often masks the unseen human labour that makes it possible. From labelling raw data to moderating harmful content, “ghost workers” form the backbone of AI ecosystems. Yet, their contributions remain invisible, underpaid, and unprotected. The debate on AI is incomplete without recognising the human cost of automation, a matter of global ethics, labour rights, and governance.

The Hidden Human Cost of AI

Why is AI’s invisible labour in the news?

AI companies, especially in Silicon Valley, outsource essential annotation and moderation work to low-paid workers in developing countries. Recent revelations of exploitative conditions, such as Kenyan workers earning less than $2 an hour for traumatic tasks like filtering violent content, have exposed the dark underbelly of AI. This has amplified global concerns about modern-day slavery, violation of labour rights, and the absence of legal safeguards in AI supply chains.

Areas of Human Involvement in AI

  1. Data Annotation: Machines cannot interpret meaning; humans label text, audio, video, and images to train AI models.
  2. Training LLMs: Models like ChatGPT and Gemini depend on supervised learning and reinforcement learning from human feedback, requiring annotators to correct errors, flag jailbreak attempts, and refine responses.
  3. Subject Expertise Gap: Workers without domain knowledge label complex data, e.g., Kenyan annotators labelling medical scans, leading to inaccurate AI outputs.

Are Automated Features Truly Automated?

  1. Content Moderation: Social media “filters” rely on humans reviewing sensitive content (pornography, beheadings, bestiality). This causes severe mental health risks like PTSD, anxiety, and depression.
  2. AI-Generated Media: Voice actors, children, and performers record human sounds and actions for training datasets.
  3. Case Study (2024): Kenyan workers wrote to U.S. President Biden describing their labour as “modern-day slavery.”

What Challenges Do Workers Face?

  1. Poor Wages: Often less than $2/hour, far below fair global standards.
  2. Harsh Conditions: Tight deadlines of a few seconds/minutes per task; strict surveillance; risk of instant termination.
  3. Union Busting: Workers raising concerns are dismissed, with collective bargaining actively suppressed.
  4. Fragmented Supply Chains: Work outsourced via intermediary digital platforms; lack of transparency about the actual employer.

Why Is This a Global Governance Issue?

  1. Exploitation in Developing Countries: Kenya, India, Pakistan, Philippines, and China host the bulk of annotators, highlighting global North-South labour inequities.
  2. Digital Labour Standards: Current international labour frameworks inadequately cover digital gig work.
  3. Ethical Responsibility: Big Tech profits from AI breakthroughs while rendering the labour behind them invisible.
  4. Need for Regulation: Stricter global and national laws must ensure fair pay, transparency, and dignity at work.

Way Forward

  1. Transparency Mandates: Disclosure of supply chains by tech companies.
  2. Fair Labour Standards: Minimum wages, occupational safety norms, and psychological health safeguards.
  3. Recognition of Workers: From “ghost workers” to “digital labour force.”
  4. Global Collaboration: Similar to climate treaties, AI labour governance requires multilateral regulation.

Conclusion

Artificial Intelligence is not fully autonomous—it rests on millions of invisible workers whose exploitation challenges the ethics of the digital age. For India and the world, the future of AI must balance innovation with human dignity, equity, and justice. Without recognising and regulating this labour, the AI revolution risks deepening global inequalities.

Value Addition

Global Frameworks and Conventions

  1. ILO Convention 190 (2019): Addresses workplace violence and harassment — highly relevant to content moderators exposed to graphic/traumatic data.
  2. ILO Recommendation 204: Transition from informal to formal economy — ghost workers are currently informal, with no rights.
  3. UN Guiding Principles on Business and Human Rights (2011): Corporate duty to respect human rights across supply chains, including digital gig platforms.
  4. EU Artificial Intelligence Act (2024): First comprehensive law regulating AI systems; includes risk categories and human oversight requirements.
  5. Santa Clara Principles (2018): Framework for transparency, accountability, and due process in online content moderation.

Conceptual Tools and Keywords

  1. Digital Colonialism: Global North exploits cheap digital labour in Global South for AI systems.
  2. Surveillance Capitalism (Shoshana Zuboff): Big Tech monetises personal data and labour while eroding privacy and dignity.
  3. Platform Precarity: Gig workers face algorithmic control, constant surveillance, and lack of social protection.
  4. Ghost Work (Mary Gray & Siddharth Suri, 2019): Term for invisible human labour powering AI systems.
  5. Cognitive Labour: Work that relies on human judgment, emotional resilience, and meaning-making (beyond physical labour).
  6. Algorithmic Management: Use of algorithms to allocate, monitor, and discipline workers—stripping them of agency.
  7. Ethics of Invisibility: Recognition gap when workers’ contributions are hidden, making justice claims difficult.

Reports and Studies

  1. Oxford Internet Institute (2019, “Ghost Work”): Estimated millions of hidden workers behind AI, mainly in developing countries.
  2. WEF Future of Jobs Report (2023): Warned of AI-induced job displacements alongside new digital gig work.
  3. ILO Report on Digital Labour Platforms (2021): Documented widespread exploitation, lack of contracts, and cross-border regulatory challenges.

Indian Context

  1. Code on Social Security, 2020: Recognises gig and platform workers, but still weak on implementation.
  2. NITI Aayog Report on “India’s Booming Gig and Platform Economy” (2022): Predicts 23.5 million gig workers by 2030.
  3. Digital Personal Data Protection Act, 2023: Regulates data, but is silent on the labour rights of those who process AI data.
  4. India’s AI Mission (National Strategy for AI, NITI Aayog): Envisions “AI for All” but doesn’t sufficiently cover labour dimensions.

PYQ Relevance

[UPSC 2023] Introduce the concept of Artificial Intelligence (AI). How does AI help clinical diagnosis? Do you perceive any threat to privacy of the individual in the use of AI in healthcare?

Linkage: AI aids clinical diagnosis by analysing medical scans and predicting outcomes with high accuracy, but it relies on human annotators to label sensitive data. The article shows how even untrained workers in Kenya were tasked with labelling medical scans, raising concerns of reliability. Such outsourcing also heightens the risk of privacy violations in handling patient data across insecure global supply chains.


