Growing Global Scrutiny of Social Media and Youth

November 25, 2025

Around the world, social media is facing a new wave of scrutiny. Governments, regulators, courts and youth advocates are all tightening their focus on digital rights, online safety and internet governance, especially when it comes to children and teenagers.

From age-based bans and stronger verification rules to UN-level debates on youth rights in the digital environment, 2025 is shaping up as a turning point in how societies govern the online spaces where young people spend so much of their time.

Growing Global Scrutiny in Late 2025

In the second half of 2025, several developments underscored a growing consensus: social media can no longer be left to self-regulate when youth safety and democratic health are at stake.

  • Denmark announced plans to ban social media platforms for children under 15, with limited exceptions for 13–14-year-olds whose parents consent, citing severe concerns about mental health and screen time (Reuters).

  • Australia has moved in a similar direction, introducing a law to ban major social media apps for under-16s — a law now facing a High Court challenge from a 15-year-old who argues the ban may actually make things less safe by pushing teens into unregulated corners of the internet (The Times of India).

These moves illustrate the tension at the heart of the debate: how to protect young people from harm without cutting them off from the social, educational and civic value that digital platforms can provide.

Youth Protection Laws and Age Limits

Children’s access to social media is increasingly seen as a public-policy issue rather than a purely private family decision.

  • Denmark’s plan puts strict age thresholds on platforms such as Snapchat, TikTok and Instagram, with political backing across multiple parties (Reuters).

  • In Indonesia, the new Government Regulation No. 17 of 2025 on Electronic System Governance for Child Protection (“PP Tunas”) is designed to shield children from online threats and to impose sanctions on platforms that fail to comply with stricter safety rules (ANTARA News).

These measures echo broader international efforts to define what age-appropriate design and duty of care should mean in law — a trend already visible in Europe’s Digital Services Act and various children’s codes.

Internet Governance Forums Put Youth at the Centre

At the global level, internet governance spaces have been increasingly dominated by questions about youth, platforms and power.

The Internet Governance Forum (IGF) 2025, hosted in Norway, has been described as one of the largest UN digital policy meetings ever, bringing together states, companies, civil society and youth delegates (Digital Watch Observatory).

A major highlight was the Global Youth Summit “Too Young to Scroll?” (Digital Watch Observatory), which focused on:

  • Age verification on social platforms

  • Online safety for children and teenagers

  • Balancing protection with digital rights and free expression

Youth IGF initiatives stressed that young people must be co-creators of digital rules, not just passive subjects of regulation (events.youthigf.com).

Children’s Rights, Protection and Participation

Children’s rights organisations have used 2025 to push a clear message: safety, empowerment and participation must all be part of the same conversation.

At IGF 2025, multiple sessions addressed children’s rights to protection, empowerment and participation in the digital environment (Childrens Rights Digital).

European examples showcased how regulators are using tools like the EU’s Digital Services Act (DSA) to demand:

  • Stricter content moderation around harmful and violent content

  • Age-appropriate terms and conditions

  • Transparency on recommender systems and targeted advertising for minors (Childrens Rights Digital)

The child-rights NGO 5Rights Foundation called on governments and tech firms at the IGF ministerial panel to redesign digital products around children’s best interests, not just engagement metrics (5rights).

Mental Health, Algorithms and Design Duties

Policymakers are increasingly concerned not only with what content young people see, but how platforms are designed to keep them engaged.

A 2025 European Parliament briefing on youth and social media highlighted that young people themselves are raising alarms about constant exposure to violent and harmful content, and the risk of becoming desensitised to it (European Parliament).

Parallel research on digital media consumption trends among youth in Europe (EBU) shows that:

  • Teens spend several hours a day across multiple platforms

  • Algorithms heavily influence content discovery

  • Short-form video dominates attention spans

These findings are strengthening calls for “safety by design” — requiring platforms to change notification flows, recommendation systems and default settings to prioritise mental health and safety, especially for minors.

Digital Rights and Free Expression

At the same time, there is growing concern that aggressive regulation could undermine digital rights and freedom of expression if not carefully designed.

  • Critics of blanket youth bans argue they risk pushing teens into VPNs, anonymous spaces or fringe platforms with fewer safeguards (The Times of India).

  • Youth advocates at IGF 2025 warned that safety measures must not become an excuse to silence dissent, activism or marginalised voices online (events.youthigf.com).

This is why many global discussions now frame the challenge as balancing protection, privacy and participation rather than simply restricting access.

The Role of Platforms and Tech Companies

Governments are no longer satisfied with voluntary self-regulation. New rules increasingly impose:

  • Transparency demands for algorithms and content moderation

  • Stronger reporting obligations for harmful content involving minors

  • Fines or sanctions when platforms fail to protect children

Campaigns such as “End digital violence”, under the UN’s 16 Days of Activism theme, highlight how harassment, doxxing and image-based abuse disproportionately affect women and young people, and call for systemic changes from platforms (unwomen.org).

The message is clear: big platforms must shoulder more legal and ethical responsibility for the environments they create.

Looking Ahead: Governance for a Generation That Grew Up Online

As of 25 November 2025, the trend is unmistakable: social media is entering a new era of harder rules, higher expectations and deeper public scrutiny.

Key questions for the next few years include:

  • Can governments craft laws that protect children without eroding fundamental digital rights?

  • Will platforms meaningfully redesign their systems, or simply adapt at the margin to avoid penalties?

  • How can young people be involved not just as users, but as shapers of digital policy and governance?

What is certain is that the generation that grew up online will not accept a future where safety and rights are treated as an either–or choice. The debates of 2025 — from age-verification laws to UN governance forums — show that the world is finally starting to treat youth, social media and digital rights as a central part of public policy, not a side issue.

In that sense, today’s scrutiny is less about “cracking down” and more about growing up the internet itself.
