Social Media Ban for Under-16s in Spain: Opportunities, Risks, and What It Says About Our Behaviour
- Larissa
- 3 hours ago
- 4 min read
In recent months, a remarkable trend has emerged in digital policy: more and more countries, including Spain, Australia, and several other European nations, are planning or have already passed laws that significantly restrict, or even ban, access to social media for children and adolescents under 16. This shift does not happen in isolation; it reflects growing societal concern about the impact of digital platforms on young people.

Spain has announced that social networks will no longer be accessible to under-16s. Providers will be required to implement age verification systems that go beyond simple birthdate declarations and create genuine technical barriers. Prime Minister Pedro Sánchez justifies this move as a way to protect minors from a “digital Wild West,” where harmful content, hate speech, and algorithmic manipulation are widespread.
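To make the difference between a simple birthdate declaration and a genuine technical barrier concrete, here is a minimal, purely illustrative sketch in Python (a hypothetical example, not any platform's actual code) of the self-declared age check the new rules consider insufficient: the service trusts whatever date the user types in, so a false date passes without resistance.

```python
from datetime import date

MINIMUM_AGE = 16  # the threshold Spain and Australia are moving towards

def is_old_enough(declared_birthdate: date) -> bool:
    """Check a self-declared birthdate against the minimum age.

    This is the weak form of gating: the service simply trusts
    whatever date the user enters at sign-up.
    """
    today = date.today()
    age = today.year - declared_birthdate.year - (
        (today.month, today.day)
        < (declared_birthdate.month, declared_birthdate.day)
    )
    return age >= MINIMUM_AGE

# A 12-year-old who types in an invented date sails straight through:
print(is_old_enough(date(2000, 1, 1)))  # True, whatever the user's real age
```

The "genuine technical barriers" demanded of providers would have to replace this trust-the-user check with some independent verification step, which is exactly where the enforcement difficulties discussed further below begin.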
Australia has already gone down this road: from 2025, there is a legal minimum age of 16 for the use of major social platforms. Australian law makes the networks themselves legally responsible for ensuring that no one under that age can access their services, with heavy penalties for non-compliance. France and Denmark have also proposed minimum ages of 15 or 16, and surveys indicate that majorities in many countries support age restrictions for children and adolescents.
Why are these bans being proposed?
The reasons for banning social media for under-16s are varied but can be summarised in three main concerns:
Mental health: Studies and experts point out that excessive use of social media among young people is linked to increased anxiety, depression, sleep problems, and body image issues. Social media can promote idealised portrayals, cyberbullying, and constant comparison, which are especially problematic during sensitive developmental stages.
Protection from harmful content: Platforms are full of violence, sexualised content, misinformation, and algorithmic recommendations designed to maximise engagement rather than safety. For young users without fully developed media literacy, this poses a significant challenge.
Data and privacy protection: Children often share personal data without understanding the consequences and are targets for personalised advertising or manipulative content. A higher minimum age is intended to prevent the collection and misuse of minors’ data.
Advantages of a ban
From the perspective of supporters, a general minimum age can bring several benefits:
Reduced mental and social pressure: Without a constant feed of likes, filters, and trends, young people’s daily lives could focus more on real-life social interactions and activities.
Reduction of cyberbullying and risky behaviours: Many forms of self-harm, harassment, or dangerous online challenges spread through social networks. A ban could slow how quickly and intensely children encounter them.
Protection of minors’ data: Stricter age verification would make it harder for platforms to collect and exploit data from very young users.
Disadvantages and criticisms
Despite the good intentions, there are significant objections to such bans:
Difficult enforcement: Age verification is technically challenging. Children can bypass restrictions using fake documents, alternative email addresses, or VPNs, which limits how effective a ban can be in practice.
Loss of positive aspects: Social media also offers young people opportunities for peer exchange, creative expression, access to information, and support for those in marginalised groups. Total exclusion can also cut off these positive experiences.
Risk of secret use: If children are officially banned, they may use hidden accounts or alternative platforms without parental oversight, which can increase risks.
Inequalities: Access to safe educational content or useful communities on social media may be restricted if no nuanced solution is implemented.
What does this say about adults’ social media use?
The focus on protecting children also invites a critical look at how adults themselves use social media. If safeguarding minors is considered important, society should also ask:
How do adults manage their own screen time, algorithmic exposure, and online habits?
To what extent do parents, teachers, and role models foster responsible media use?
How well do we teach critical media literacy before children open their first accounts?
A closer look shows that problematic social media use in adults is rarely purely a technical issue. For many, these platforms fulfil emotional and psychological needs: connection, recognition, distraction, or temporary relief from stress, loneliness, or inner emptiness. Likes, comments, and constant updates can temporarily ease insecurity, while endless scrolling offers an easy escape from uncomfortable thoughts or feelings.
At the same time, daily life often leaves little room for real breaks, reflection, or deep interpersonal contact, making digital platforms an easily available substitute. Algorithms amplify this pattern by rewarding attention and emotional engagement rather than wellbeing. Adults can fall into usage habits that are hard to control, even when aware of the negative consequences.
In this context, the debate about protecting children and adolescents also prompts us to honestly reflect on how adults deal with pressure, uncertainty, and unmet needs—and whether social media increasingly functions as a coping strategy rather than a conscious choice.
Lessons we can learn from the debate
A social media ban for under-16s, as in Spain or Australia, is a bold and far-reaching step in digital youth protection. It offers clear advantages, especially for protecting the mental health and privacy of minors. At the same time, the criticism shows that simply banning access is not a perfect solution, and that accompanying measures such as media education, parental guidance, and a culture of responsible use are just as important. What no ban can replace is the ongoing work on media literacy, resilience, and digital self-determination, for young and adult users alike.