Two months into Australia's SMMA law, implementation is a disaster. Despite claims of 4.7 million blocked accounts, Snapchat's own reporting shows major gaps: age verification is inaccurate, under-16s can bypass protections, and over-16s are wrongly blocked. Platforms aren't assessing users' age or gender accurately, kids are still online, and the government refuses to release transparency data because it knows the scheme hasn't worked. A centralized app-store verification system, not piecemeal bans, is the only way to enforce protections consistently.
Despite imperfections and some under-16s bypassing verification measures, Australia’s world-leading SMMA law is delivering meaningful results two months into its implementation. Platforms are actively restricting access and improving age assurance, demonstrating real progress in protecting children online. Early figures, including 4.7 million accounts blocked, show platforms are taking the rules seriously. Success isn’t measured by perfection but by harm reduction, norm-setting, and long-term benefits for young Australians.
After two months of Australia's SMMA law, people should be questioning the direction of the law itself. Recent research challenges the assumption that time spent on social media or gaming causes anxiety or depression in teens, so efforts to strictly limit under-16s' access may be aimed at the wrong target. Teen mental health is shaped by support, social connections, and online experiences, not by hours spent online. Policy should prioritize healthy engagement and safeguards, not blanket bans that treat social media as inherently harmful.