Snap & ShareChat offer minimal disclosure on content removed via automated tools: Report

The report evaluates content moderation and grievance redressal practices of platforms like Facebook, Instagram, WhatsApp, YouTube, Twitter/X, LinkedIn, Snap, ShareChat, and Koo from June 2021 to December 2023.

Social Samosa

The Internet Governance and Policy Project (IGAP) has published its latest report, 'Social Media Transparency Reporting: A Performance Review', offering a detailed analysis of how Significant Social Media Intermediaries (SSMIs) are complying with India’s Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. The report assesses the performance of platforms, including Facebook, Instagram, WhatsApp, YouTube, Twitter/X, LinkedIn, Snap, ShareChat, and Koo, focusing on their content moderation practices and grievance redressal mechanisms between June 2021 and December 2023.

As concerns over harmful online content, including misinformation and hate speech, continue to grow, IGAP’s report highlights key gaps in transparency across these platforms and calls for enhanced accountability to Indian users. The study also provides a comparative analysis with international frameworks such as the European Union’s 'Digital Services Act' and presents actionable recommendations for improving transparency and compliance practices.

Key Findings:

  • Inconsistent reporting: While platforms like Facebook and YouTube offer relatively comprehensive reports, others such as Koo and LinkedIn provide limited data, making it difficult to evaluate their adherence to content moderation guidelines.

  • Lack of clarity on automated monitoring: Platforms like Snap and ShareChat offer minimal disclosure on content proactively removed using automated tools, raising questions about the effectiveness of their systems in addressing harmful content such as hate speech and child exploitation.

  • Complex grievance redressal mechanisms: Platforms such as Instagram, Facebook, and Twitter/X maintain both global and India-specific grievance systems, leading to confusion for users and inconsistencies in data reporting.

  • Need for more granular data: The report emphasises the importance of providing more detailed disclosures, particularly on content moderation in regional Indian languages, law enforcement requests, and actions against repeat offenders.

Recommendations: IGAP’s report outlines several key reforms to improve transparency and ensure better accountability:

  • More granular disclosures: Social media platforms should provide detailed data on user complaints, content removals, and moderation across the diversity of Indian languages to ensure fair and equitable moderation practices.

  • Standardised reporting formats: Platforms should adopt consistent and uniform reporting formats to improve accessibility and comparison of data across different platforms.

  • Improved oversight of automated tools: The report calls for clearer reporting on the types of content flagged and removed by automated systems, with a focus on emerging concerns such as misinformation, deepfakes, and synthetic media.

Rakesh Maheswari, lead author of the report and former Senior Director and Group Coordinator (Cyber Laws and Data Governance Division) at the Ministry of Electronics and Information Technology (MeitY), stated, “Social media platforms have a responsibility to create a transparent and accountable digital environment, especially given their influence on public discourse. This report underscores the need for uniform, more granular reporting in line with the intent of IT Rules, 2021 and aims to help bridge the gaps in content moderation practices across platforms operating in India.”
