Amnesty International Urges Meta to Tackle Harmful Content Escalating Sectarian Violence in Bangladesh

Amnesty International has called on Meta to urgently address the proliferation of harmful content on its platform Facebook, which is fueling sectarian tensions and violence against minority communities in Bangladesh. The rights group highlights a rise in misleading content inciting discrimination and violence, particularly content targeting political parties and minority groups. Alia Al Ghussain of Amnesty underscored how the interplay of harmful content, political tensions, and Meta’s algorithmic functions creates a precarious situation that threatens human rights.

Criticism of Meta has intensified following recommendations that it implement “break the glass” measures to curb algorithmic amplification of harmful content. Amnesty recommended that states regulate social media algorithms but emphasized that social media companies bear an intrinsic responsibility to uphold human rights. In December 2025, the Bangladesh Telecommunication Regulatory Commission (BTRC) wrote to Meta, highlighting instances in which threats made on social media led to violent mob attacks on media outlets such as The Daily Star and Prothom Alo. Such incidents point to a troubling pattern of Facebook’s algorithms exacerbating violence in the region.

The BTRC has expressed dissatisfaction with Meta’s content moderation processes, emphasizing that the platform has been used as a tool for inciting “large-scale violence.” It urged Meta to enforce its community standards more stringently for Bangladesh-specific content, improve Bengali-language moderation, and take immediate action on reported violent content.

Bangladesh’s recent political upheaval, involving mass protests and a change of government, heightens the urgency of Meta’s intervention. Concerns over Facebook’s role in spreading harmful narratives are magnified by past incidents, such as the human rights violations in Ethiopia and the Rohingya crisis in Myanmar, which show a recurring pattern of sectarian violence linked to content on social media platforms. The issue is discussed in further detail in an article by JURIST News.

Globally, the call for stronger content moderation by social media platforms has been echoed in other contexts. Reports indicate that similar moderation failures have contributed to societal unrest in various regions, prompting experts to examine the wider implications of algorithm-driven content. Having failed to effectively mitigate these risks, Meta faces mounting pressure to balance its engagement-driven business model with the ethical obligation to avoid exacerbating social and political unrest.