Featured Article: Greece To Ban Social Media For Under-15s
Greece is set to ban social media access for under-15s from 2027, marking a significant step in a growing global effort to limit the impact of platforms on young people’s health and behaviour.
Why Is Greece Taking Action?
The Greek government has positioned the move as a response to rising concerns about children’s mental health, particularly anxiety, sleep disruption, and compulsive use of social media. Prime Minister Kyriakos Mitsotakis has pointed directly to what he describes as the “addictive design” of platforms, arguing that the way apps are built to capture attention is now part of the problem.
Reports from schools and parents in Greece suggest that excessive screen time is affecting sleep patterns and concentration, with some teachers describing children arriving at school exhausted. The government has already taken earlier steps, including banning mobile phones in schools and introducing parental control tools, but has now concluded that broader restrictions are necessary.
The proposed law will require platforms to block access for under-15s or face financial penalties, with further details on enforcement expected as legislation progresses. Greece is also pushing for a coordinated European approach, including standardised age verification and a common digital age threshold.
A Growing International Trend
Greece is not the only country introducing this kind of ban. Australia became the first country to implement a nationwide ban on social media for under-16s in late 2025, requiring platforms such as TikTok, Instagram, and Snapchat to remove underage accounts or face substantial fines.
Across Europe, similar proposals are gaining traction. France has already moved legislation forward to restrict access for younger users, while Denmark, Spain, and Slovenia are developing comparable measures. Germany has debated an under-16 ban, and the UK is currently consulting on whether to introduce restrictions or alternative controls such as screen time limits and digital curfews.
Outside Europe, countries including Indonesia and Malaysia are also moving towards tighter controls. This reflects a broader change in how governments are approaching social media, not simply as a communication tool, but as a potential public health issue requiring intervention.
What The Evidence Says About Health Impacts
The policy momentum is being driven by a growing body of research linking heavy social media use with negative outcomes for children and teenagers. Studies have associated prolonged screen time with increased levels of anxiety, depression, poor sleep quality, and reduced attention span.
Sleep disruption is one of the most consistent findings. Late-night usage, constant notifications, and the pressure to remain engaged can reduce both the quantity and quality of sleep, which in turn affects cognitive performance and emotional regulation.
There is also increasing focus on the role of comparison and social validation. Young users are exposed to curated content and constant feedback through likes and comments, which can contribute to feelings of inadequacy and social pressure.
When it comes to countries that have already introduced restrictions, the picture is still unclear. Australia’s under-16 ban only came into force in late 2025, meaning there is not yet enough long-term data to show whether it has improved mental health outcomes. Early signs suggest platforms are being forced to take age verification more seriously, but evidence of measurable health improvements has not yet emerged.
This means governments are largely acting on existing research and precaution rather than proven results from national bans.
At the same time, the evidence is not entirely one-sided. Some researchers and platforms argue that social media can provide benefits, including social connection, access to information, and support networks, particularly for isolated or vulnerable individuals. This is one reason why some policymakers are cautious about blanket bans.
How Governments Are Responding To Social Media Risks
What is clear is that governments are increasingly willing to intervene directly in how social media is used. The framing is changing from personal responsibility to systemic risk, with platform design, algorithms, and engagement models coming under scrutiny.
Recent legal action in the United States has reinforced this direction, with court cases finding major platforms liable for harm linked to addictive design, adding weight to arguments that these systems are not neutral tools but engineered environments with measurable effects.
For policymakers, this creates a rationale for regulation that goes beyond content moderation and into the structure of the platforms themselves.
What Does This Mean For Your Business?
For businesses, this is part of a wider change in digital regulation that is likely to expand beyond children’s use into broader platform accountability.
Right now, there is limited real-world evidence on the outcomes of these bans, simply because most have only recently been introduced. However, if early restrictions, such as Australia’s, begin to show measurable improvements in areas like sleep, attention, or mental wellbeing, that will significantly strengthen the case for wider and more permanent regulation.
If that happens, businesses should expect tighter controls not just on age access, but potentially on platform design itself, including features that drive prolonged engagement, such as endless scrolling, notifications, and algorithmic content feeds. This could directly affect how audiences interact with content and how effectively platforms can be used for marketing and engagement.
Organisations that rely on social media for marketing, recruitment, or customer engagement should also expect stricter age verification requirements and more defined audience segmentation. Younger demographics may become harder to reach on mainstream platforms or may move to alternative, less regulated spaces.
There is also a reputational dimension. As awareness of the health impact of social media grows, businesses may face greater scrutiny over how they use these platforms, particularly if their content or campaigns are seen to contribute to excessive use or target younger audiences.
This all seems to point to a future where digital platforms are treated less as open channels and more as regulated environments, with clearer rules around access, design, and responsibility. Businesses that understand this direction early will be better placed to adapt as those rules tighten.