Amnesty Report Highlights X’s Role in Fostering LGBTQ+ Abuse in Poland Amid Content Policy Criticisms

A recent report from Amnesty International has shed light on the role of the social media platform X, previously known as Twitter, in facilitating online abuse against the LGBTQ+ community in Poland. It documents alarming trends in technology-facilitated gender-based violence (TfGBV) on the platform following significant changes to its content guidelines under the ownership of Elon Musk. These changes have reportedly left X inundated with harmful content, prompting calls for urgent reform.

The report underscores the detrimental impact of X’s current content moderation practices, or lack thereof. Alia Al Ghussain, Amnesty International’s Researcher and Advisor on technology and human rights, pointed out that these inadequate practices, coupled with an absence of human rights due diligence, have fueled abuse of LGBTQ+ individuals in Poland. An examination of tweets on the platform revealed a high prevalence of homophobic and transphobic content, particularly from accounts aligned with anti-LGBTQ+ political figures.

A key mechanism driving this issue is the platform’s algorithm, particularly the “For You” feed. Designed to boost user engagement, this system often amplifies content likely to receive interaction, inadvertently promoting harmful narratives. Amnesty characterized this as part of X’s “surveillance-based” business model, rooted in intensive data collection to target advertisements, similar to other social media giants.

Compounding these issues is the apparent lack of resources allocated to content moderation in the Polish language. The report found that only two Polish-speaking moderators are tasked with handling content for a country of over 37 million people, including 5.33 million X users. This under-resourcing, combined with weak policy frameworks, has left the platform susceptible to becoming a channel for hate-fueled content.

Beyond internal policy failures, Amnesty International has identified compliance issues with the European Union’s Digital Services Act (DSA). Specifically, the report alleges that X has not fulfilled its obligations under Article 34(1) of the DSA, which mandates thorough risk assessment and mitigation concerning systemic human rights risks. The report suggests that X’s practices infringe on rights such as human dignity, private life, and non-discrimination, as enshrined by the Charter of Fundamental Rights of the European Union.

In December 2023, the European Commission initiated an investigation against X under the DSA framework, requiring the platform to disclose information about its recommender system by January 2025. Amnesty International’s report suggests that these investigations should also scrutinize X’s effectiveness in addressing the risks associated with TfGBV, highlighting the urgency of addressing these systemic challenges.

Further details are available in Amnesty International’s full report.