The Danish government has announced a national minimum age of 15 for certain social media platforms, making Denmark the first EU member state to adopt such a restriction. The rule, overseen by the Ministry of Digitalisation, is intended to prevent children under 15 from independently registering on platforms that may expose them to harmful content or features, although parents may consent to access for children aged 13 and older.
The initiative is aimed at protecting young people, giving them more time for activities conducive to healthy development before they engage with social media. Digitalisation Minister Caroline Stage emphasized that the age limit places Denmark at the forefront in Europe and forms part of a comprehensive plan to improve the digital well-being of children and adolescents. Moderate Party lawmaker Rasmus Lund-Nielsen pointed to the pervasiveness of social media as a drag on social interaction and physical activity among young people, citing statistics showing a decline in both among Danish children and noting that a significant share receive a psychiatric diagnosis before adulthood.
In parallel with the age restriction, Denmark is introducing several “gatekeeping” measures to protect young users from digital abuse, including closer scrutiny of offensive content, aggressive marketing, and the use of minors in advertising. These efforts align with the Danish Marketing Practices Act and reinforce the case for stricter advertising rules involving children.
The initiative also forms part of Denmark’s broader compliance with the European Union’s Digital Services Act (DSA), adopted in 2022. The regulation, particularly Article 35(1)(j), requires providers of very large online platforms and search engines to take targeted measures to protect the rights of the child, including age verification and parental control tools, as well as tools to help minors report abuse or obtain support.
Denmark’s policy arrives amid criticism from organizations such as Amnesty International, which have singled out platforms like TikTok for failing to protect young users from harmful content. Particular concern has been raised about the algorithmic amplification of content related to mental health issues, which can draw teens into increasingly distressing material.
The announcement marks a significant policy shift, reflecting Denmark’s commitment to shielding its younger generation from digital risks while encouraging healthier lifestyle choices. More details can be found on JURIST.