The Great Disconnect: Australia Becomes the First Nation to Ban Social Media for Under-16s

"Can’t Look Away: The Case Against Social Media" directed by Matthew O’Neill and Perri Peltz

CANBERRA - On December 10, 2025, Australia's landmark social media ban officially took effect. The Online Safety Amendment (Social Media Minimum Age) Act 2024 requires platforms to take "reasonable steps" to prevent children under 16 from creating or maintaining accounts, making Australia the first country to enforce such a nationwide restriction.

In March 2025, the Court of the Citizens of the World convened the Social Media Tribunal, a five-day legal forum that examined Facebook (Meta), X, YouTube, and TikTok for their role in facilitating cybercrimes, election interference, and disinformation. The tribunal, presided over by former U.S. Federal Judge Hon. Shira A. Scheindlin and opened by Nobel Peace laureate Maria Ressa, concluded that these platforms systematically prioritize profits over public safety, knowingly enabling bad actors to exploit their services for harm.

Platforms including TikTok, Instagram, Snapchat, X (formerly Twitter), YouTube, Facebook, Reddit, and Threads have been forced to implement measures to block young Australians from holding accounts. Violations carry fines of up to A$49.5 million (approximately $33 million USD). Importantly, the law requires platforms to demonstrate reasonable effort, not achieve perfect enforcement. Age estimation tools, rather than strict ID verification, may suffice, and platforms won't be penalized if minors circumvent controls using VPNs or false information. It's a big moment in a debate that has been building for years, one that has moved from worried parents to legislators drafting bills.

A Crisis in Numbers

In the United States, suicide rates among 10- to 24-year-olds jumped 62% between 2007 and 2021. That timeline tracks almost exactly with the explosion of smartphones and social media.

A 2023 U.S. Youth Risk Behavior Survey analysis found that high school students who frequently use social media were significantly more likely to experience bullying and persistent hopelessness, and to have seriously considered suicide. Nearly one in three reported poor mental health most of the time. These are not abstract numbers: they represent classrooms where students are struggling and emergency rooms increasingly treating adolescents in acute psychological distress.

The European Picture

Europe's seeing similar trends. New data from the WHO Regional Office for Europe shows that "problematic" social media use among adolescents rose from 7% in 2018 to 11% in 2022. Girls are disproportionately affected (13% vs. 9% for boys). In countries like Romania, Malta, and Bulgaria, rates reach as high as 17–22%.

A French study tried to model the long-term cost and found that excessive social media use could contribute to hundreds of thousands of new depression cases over a generation. Dr. Hans Kluge, WHO's Regional Director for Europe, put it bluntly: “The evidence shows social media is driving depression, bullying, anxiety, and poor school performance. The question isn't whether it's harmful anymore. It's whether governments can keep doing nothing.”

Films That Crystallized the Debate

Two powerful documentaries have helped shift public discourse from parental responsibility to corporate accountability, providing both the emotional testimony and forensic evidence that made this law politically possible.

Molly vs The Machines, directed by Marc Silver, chronicles the tragic story of Molly Russell, a 14-year-old British girl who took her own life after being exposed to vast quantities of self-harm content. A 2022 inquest found social media contributed "in a more than minimal way" to her death. The film acts as a forensic investigation into how engagement-based algorithms actively guided Molly deeper into darkness, portraying platforms not as neutral town squares but as "machines" built for surveillance capitalism. The work draws heavily on the research of Harvard professor Shoshana Zuboff, whose analysis of platform economics frames the narrative.

Can't Look Away: The Case Against Social Media, directed by Matthew O'Neill and Perri Peltz, provides the legal angle. It follows lawyers who are treating social media the way regulators once treated tobacco and opioids: as products designed to be addictive. Infinite scroll, notifications that come at random intervals - these aren't accidents. They're features built to hijack your brain.

Together, these films underscore a stark reality: even the most watchful parent can't compete with an algorithm trained to maximize engagement. The problem isn't bad parenting; it's predatory design.

A Global Experiment

Implementation began swiftly. Snapchat and Meta began locking accounts they suspected belonged to minors, asking users to verify their age. Australia tested several methods in 2024, including biometric scans, behavioral analysis, and government ID uploads. That last method worries privacy advocates, even with the strict privacy rules enforced by Australia's Information Commissioner.

Social media is starting to be treated less like a teenage rite of passage and more like a product that needs regulation—think pharmaceuticals or cars. Millions of Australian kids and teens just lost access to their accounts. Prime Minister Anthony Albanese calls it "families taking back power from tech giants."

The world is watching and the era of unregulated social media might finally be ending.

10 COMMANDMENTS ON SOCIAL MEDIA by The Court of the Citizens of the World 

1. Make the distribution of independently disproven claims and deliberate fake news a punishable crime; reinstate truth and the values of your society: Criminalize the deliberate distribution of demonstrably false information that causes provable harm to public health, safety, or the human rights of any individual, such as incitement to violence, election manipulation, serious defamation, or fraud. This must not be used to silence dissent, satire, journalism, or artistic expression, all of which are protected under Article 19 of the International Covenant on Civil and Political Rights (ICCPR). Enforcement under this rule shall be overseen by an independent council, ensuring transparency, due process, and protection of fundamental rights.

2. Make international centralized social media platforms with more than 1 million users accountable for criminal content*, unverified identities, and the content distributed, which must be monitored by content moderators with the support of AI (for example, with regard to pedophilia, antisemitism, homophobia, racism, etc.). Platforms must transparently report AI moderation practices, including error rates and enforcement metrics, subject to independent audits to verify effectiveness.

3. Establish a content moderation ratio of one qualified full-time moderator per 1,000 live users in each country for international centralized platforms exceeding 1 million users. Moderators must hold appropriate legal qualifications and knowledge of local laws. Platforms should publicly disclose moderation policies, enforcement statistics, and provide accessible complaint and appeal mechanisms.

4. Require international centralized platforms with over 1 million users to implement functionality allowing any user to flag and temporarily take down suspected false, harmful, or criminal content for review. The review process must include clear timelines, appeal rights, audit trails, and public reporting of outcomes.

5. Approval processes for fact-checking of social media posts: unless a post comes from journalists or news media with a proven fact-checking record, or deals with lifestyle, sport, or other non-factual or private topics, it requires digital approval by volunteers when addressed to more than 10,000 recipients; digital approval by volunteers of a Council (as outlined at the end of these guidelines) when addressed to more than 100,000 recipients; and digital approval by professionally qualified editors of the same Council when addressed to more than 1 million recipients. Similar to editorial meetings at newspapers, content must be fact-checked and evaluated before publication; a simplified sketch of this tiered routing follows these guidelines. For this, you may utilize independent councils in your country or request review services from The World Council on AI and Social Media.

6. International centralized social media platforms with more than 1 million users must provide legal departments* that are reachable 24/7 digitally, and court decisions must be executed within 60 minutes of digital receipt.

7. International centralized social media platforms with more than 1 million users must pay taxes in your country for your users, pro rata to the annual profits and valuation of the social media platform.

8. International centralized social media platforms with more than 1 million users must annually ask the citizens of your country if they wish to keep their profile and/or data on the platform or have it erased. If users from your country do not give active annual consent or fail to respond, the social media platforms must erase all their data. Users must be able to control, in very simple ways, the purposes for which their data is used and must be able to erase all information gathered on them with one click on a prominently featured button on every social media platform and digital membership provider. 
All users must have the right to access, review, and delete their data, with full transparency about how it is used. Data must not be sold, cloned, or transferred to third parties without explicit annual opt-in consent. Micro-targeting based on behavioral, psychological, or political profiling is prohibited. Platforms must not deploy AI systems that manipulate user emotions or behavior without transparency and safeguards.

9. International Compliance and Accountability: International centralized platforms with over 1 million users must provide technological measures that enable legal authorities, upon a valid court order and following due process*, to restrict or disable platform access within their jurisdiction in cases of serious non-compliance with court orders. Governments should require companies to retain all internal communications, particularly those related to legal or competitive matters, with clear penalties for intentional deletion or non-compliance. All legal complaints and notices must be answered digitally within one hour. Managers, the CEO, and the owners of the international company shall face personal accountability for repeated failures to comply with court orders if an independent international body confirms the court’s findings.

10. Algorithmic Transparency and Ethical Design: Social media platforms must disclose the logic and impact of their content recommendation algorithms*. Users must have the option to decline algorithm-driven recommendations. Independent audits shall evaluate and publicly report whether the algorithms amplify harmful, extremist, or manipulative content. Ethical design must be prioritized, and platforms that amplify content based on outrage, fear, or false engagement metrics shall face penalties. Like any product (medical drugs or vehicles), social media must be tested for years for its benefits and harms before companies and customers are allowed to use it.
A global independent council of civil society members, journalists, technologists, and legal scholars shall monitor platform compliance with these guidelines. Public figures, media outlets, and watchdog groups should be empowered to raise awareness, propose reforms, and represent public interest, regionally and globally. Governments cannot unilaterally determine which content must be deleted. Immediate decisions on takedowns should be made within one hour, following a preliminary assessment. A first, provisional decision by a court must follow within 24 hours. Only after this initial court ruling can standard legal procedures commence. Online platforms must adjust their content algorithms to comply with the national laws and regulations of each country where they operate, while continuing to maintain user engagement.
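To illustrate how rules like these could translate into platform logic, here is a minimal Python sketch under stated assumptions: the function and tier names are hypothetical, and only the recipient thresholds (guideline 5) and the one-moderator-per-1,000-users ratio (guideline 3) are taken from the guidelines above.

import math
from typing import Optional

# Guideline 5: recipient thresholds mapped to the reviewer tier that must
# digitally approve a post before publication (largest threshold first).
REVIEW_TIERS = [
    (1_000_000, "council professional editors"),
    (100_000, "council volunteers"),
    (10_000, "volunteers"),
]

def required_review_tier(recipients: int, exempt: bool = False) -> Optional[str]:
    """Return the reviewer tier a post needs, or None if no approval is required.

    `exempt` covers posts by journalists or news media with a proven
    fact-checking record, and lifestyle, sport, non-factual, or private topics.
    """
    if exempt:
        return None
    for threshold, tier in REVIEW_TIERS:
        if recipients > threshold:
            return tier
    return None  # 10,000 recipients or fewer: no pre-publication approval

def required_moderators(live_users: int) -> int:
    """Guideline 3: one qualified full-time moderator per 1,000 live users."""
    return math.ceil(live_users / 1_000)

# Examples: a non-exempt post to 250,000 recipients needs Council volunteers;
# a platform with 4.2 million live users in a country needs 4,200 moderators.
assert required_review_tier(250_000) == "council volunteers"
assert required_moderators(4_200_000) == 4_200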

The World Forum 2026 Suggests an Independent Global Social Media Platform

At The World Forum 2025, one of the "godfathers of AI", Yoshua Bengio, Nobel Peace laureate Maria Ressa, and 30 leading experts discussed the creation of "The World Council on AI - Algorithms, Social Media and Digital Life". Ressa, who also serves on The World Forum's Advisory Board, sees social media as a crucial element of living in freedom: "Without (proven) facts, there is no truth. Without truth, there is no democracy."

In order for the world's media to become independent of existing social media platforms, with their own values and fact-checking standards, The World Forum suggests creating an independent global social media platform with the world's leading news organisations, one that may unite their essential functions in one ecosystem: a chat app (like WhatsApp), a news and discussion platform (like X, Instagram, and Facebook), a search engine (like Google), a video content platform (like YouTube), and possibly even a film and documentary space, all designed to serve journalism, democracy, and socially relevant storytelling, with the potential to unite over 2 billion users worldwide.

How to create such an independent, decentralised social media platform was discussed at The World Forum 2025 with Nobel laureate and Rappler co-founder Maria Ressa and the founder of Matrix and Element, Matthew Hodgson, who has created secure, open-source communication platforms for the media platform Rappler in the Philippines, NATO, France, the UK, Germany, and other institutions. In an era of intensifying disinformation, surveillance capitalism, and algorithmic control, further workshops are designed to empower trusted media institutions to reclaim digital sovereignty by building and operating their own independent, secure, and ethical social media ecosystems. Data is the commodity of the 21st century, like gold and oil in the 20th: owning social media platforms will give media organisations new revenue and income sources, which will support the independence of journalism and may guarantee its survival.

As this newsletter is also sent to parliamentarians of the world's leading democracies and many leading journalists, we invite you to contact us at communication@theworldforum.eu if you wish to get in touch with us, Maria Ressa's team, or Matthew Hodgson, in order to explore your options as a government or media organisation.

"Molly vs The Machines" directed by Marc Silver

Jaka Bizilj