EU Seeks Meta and TikTok Compliance Details on Digital Services Act Amid Israel-Hamas Conflict

The European Commission has issued formal requests for information to social media heavyweights Meta and TikTok, seeking detailed explanations of their compliance with the European Union’s (EU) Digital Services Act (DSA). The move comes in the wake of the ongoing Israel-Hamas conflict, a situation that demands these platforms play an active role in preventing disinformation and harmful content.

The platforms have been given a deadline of October 25 to furnish details of the proactive steps they have taken to fulfil their obligations under the DSA. These obligations primarily include measures to curb the spread of violent content, hate speech, and disinformation across their platforms.

Under Article 74(2) of the DSA, the European Commission holds the authority to impose penalties on these platforms if the information provided is found to be “incorrect, incomplete, or misleading”. Moreover, should the platforms miss the deadline or otherwise fail to comply, the Commission can compel the necessary information by formal decision, with the possibility of imposing recurring fines as penalties.

Implemented in 2022, the Digital Services Act serves as a regulatory framework aimed at preserving human rights and limiting the circulation of illegal online content. It imposes a comprehensive set of duties on digital service providers to protect users, affirm fundamental rights online, and improve transparency and accountability across online platforms operating in Europe.

Meta, on its part, has implemented measures to curb the spread of disinformation and misinformation, specifically in the context of the ongoing Israel-Hamas conflict. These include classifying Hamas under its Dangerous Organizations and Individuals Policy, effectively banning the group from its platforms.

Concurrently, TikTok has rolled out measures to control the dissemination of violent and misleading content associated with the Israel-Hamas conflict. These measures notably involve upgrading its automated detection systems to recognize and remove graphic and violent content in real time, helping protect both its moderators and community members.

Only last week, social media platform X (formerly Twitter) took similar steps after the European Commission’s Internal Market Commissioner, Thierry Breton, raised concerns about violent and misleading content on the platform. In a parallel development, New York Attorney General Letitia James called on Meta, TikTok, X, Google, Reddit, and Rumble to explain in detail their strategies for preventing the misuse of their platforms to propagate violence and terrorism. She also urged these companies to stop the spread of hateful content targeting Jewish and Muslim individuals and institutions.