
CEE Digital Democracy Watch Submits Key Recommendations for Youth Protection Under EU’s Digital Services Act

The European Commission is currently developing guidelines for the protection of minors online under the Digital Services Act (DSA), a landmark piece of legislation aimed at creating a safer digital space. In this context, CEE Digital Democracy Watch has submitted its response to the Commission's call for evidence on the protection of minors online under the DSA framework.

Our key recommendations include:
1️⃣ Prioritising the protection of minors in efforts to combat misinformation
2️⃣ Safeguarding online communities for marginalised groups
3️⃣ Implementing responsible age verification without compromising privacy

By actively engaging with the European Commission’s consultation process, our organisation is helping to shape digital rights policies for young people across the European Union. The Commission will use the input gathered to draft the guidelines, which are planned to be adopted before summer 2025.

Full text:

Warsaw, 30 September 2024
Call for Evidence for Guidelines on the Protection of Minors Online under the Digital Services Act


CEE Digital Democracy Watch is a Polish non-profit organisation dedicated to promoting responsible online discourse and advocating for a democratic future where regulation and free expression go hand in hand. We strive to ensure that fundamental online values are protected, particularly for young people.


We recognise the significant effort involved in expanding the regulatory framework under the Digital Services Act (DSA), as well as the European Commission’s emerging priorities indicated in the Political Guidelines, including the focus on mental health in digital spaces and the regulation of dark patterns. In this context, we believe that the overarching goal of European digital policy should be to ensure that young people are not excluded from meaningful connectivity. Such connectivity must allow for communication, entertainment, information access, online education, public services, and financial activities. To achieve this, the following areas must be addressed:


Quality of Information | One of the most pressing risks with long-term societal implications is the declining quality of online information. A Pew Research Center study showed that social media platforms increasingly serve as the primary news source for individuals aged 18 to 29. Furthermore, research by the Center for Countering Digital Hate revealed that a significant proportion of teenagers aged 13 to 17 exhibit concerning levels of belief in conspiracy theories, particularly regarding health and racialised narratives.
For this reason, the development of a European Democracy Shield and any related framework addressing misinformation must prioritise the protection of minors. Such efforts should ensure impartial and transparent content reviews without depriving young people of access to valuable resources.


Safeguarding Communities | The internet should be recognised as a platform for community-building among young people, serving as an outlet for creativity and a means to engage in rich digital lives.
Online connections are especially crucial for dispersed groups, such as LGBTQ+ youth, young community organisers, and providers of credible health information. These groups are at heightened risk of being deplatformed, shadowbanned, or subjected to non-transparent moderation practices. Their ability to connect and explore should not be curtailed, but rather encouraged. In addition, their culture and forms of expression face constant threats from national regulations that seek to suppress education and expression, as seen in recent years in countries such as Bulgaria, Hungary, Lithuania, and Poland.


Responsible Age Verification | The expansion of age verification is often presented as a catch-all solution for protecting minors online. However, such measures may prove ineffective if they require the collection of sensitive data, thereby infringing on the privacy and safety of both minors and adults. Proposals that seek to weaken encryption or to implement age verification systems that track users and link them to specific content undermine the protections currently guaranteed by the EU legal framework. This also applies to national exceptions being proposed in several Member States.