Australia's Social Media Age Restrictions: What Parents Need to Know
From 10 December 2025, Australia is introducing new rules that will change how young people use social media. Many parents are hearing bits and pieces online and feeling unsure about what it all means. Belinda Young spoke with Digital Rights Watch to understand the changes and discuss ways families can support young people through them.
This guide brings everything together in one place so our community has the facts—and can support each other through the transition.
The Basics
Australia's new social media age restrictions will require major platforms to take reasonable steps to prevent anyone under 16 from creating or keeping an account. This world-first legislation passed Parliament in November 2024 and became law in December 2024.
It’s important to remember that this isn't technically a "ban" but a delay: young people will be able to have accounts once they turn 16. The aim is to give them more time to develop digital, social and emotional skills before facing the pressures of social media.
The restrictions aim to protect young Australians from the pressures and risks created by design features that encourage more screen time while serving up content that can harm their health and wellbeing. Research has linked heavy and early social media use to anxiety, depression, body image concerns, cyberbullying, and exposure to inappropriate content.
The government's position emphasises that many platforms were never designed for children and young teens, and that stronger protections are needed during critical developmental years.
Over the next year, social media platforms will begin rolling out new systems to meet the under-16 requirements. Families may start noticing:
More frequent age-verification prompts
Accounts being restricted, locked or removed if a user appears to be underage
Stricter parental-consent screens for older teens
You may also notice fewer under-age accounts slipping through the cracks as platforms tighten their checks.
As this happens, some young people may naturally shift to other digital spaces—such as messaging apps, gaming chats or YouTube—that aren’t covered by the new rules. All of this is expected, and simply something for families and communities to keep an eye on.
Which Platforms Are Affected?
As of November 2025, eSafety considers these platforms age-restricted: Facebook, Instagram, Snapchat, Threads, TikTok, X, YouTube, Kick, Reddit and Twitch (see the eSafety Commissioner links at the end of this guide for the current list).
Exempted Platforms (Under-16s Can Still Use)
Services exempted from restrictions include messaging apps, email, voice calling, video calling services, and platforms used for education or healthcare. This means:
Messaging apps like WhatsApp, Messenger, and Messenger Kids remain accessible
Education platforms like Google Classroom continue to be available
Health services like Kids Helpline and headspace are not restricted
Gaming platforms like Roblox remain accessible (though this is still under review)
Under Review: Discord, Steam, and some other platforms are still being assessed to determine whether they function more like social media or messaging services.
What Parents Need to Know
There are no penalties for under-16s who access age-restricted platforms, or for their parents or carers. The responsibility lies entirely with the social media companies.
Parents cannot give consent for their children to access social media before they turn 16. This differs from previous age restrictions, where parental permission could override minimum age requirements.
What Happens to Existing Accounts?
Under the legislation there are no 'grandfathering' arrangements: even young people who already have accounts will not be allowed on age-restricted platforms once the rules come into effect. Some platforms may deactivate accounts and reactivate them when users turn 16, but this isn't guaranteed across all services.
Before 10 December, young people should download any data they want to save, including connections, posts, chats, photos and videos.
How Will Platforms Enforce This?
Platforms must demonstrate they've taken reasonable steps to prevent age-restricted users from having an account. Methods being considered include:
Age verification through ID documents
Behavioural signals (usage patterns, language, community memberships)
Biometric scanning
A government-funded trial of age assurance technologies is currently underway, with findings expected to inform what counts as "reasonable steps." Social media companies face fines of up to $49.5 million for failing to take reasonable steps to prevent under-16s from creating accounts.
Preparing Your Family
Talk openly about the benefits and risks of the internet, what's safe to share, how to protect personal information, and what to do if something online feels uncomfortable. Many young people will feel upset, worried, or angry about these changes—acknowledge their feelings while explaining that safety comes first.
Show balance in your own screen use and set family-wide screen-free times, such as during dinner or before bed. Children tend to imitate adult behaviour, so your actions matter.
Young people may migrate to other platforms or find workarounds. Keep conversations going about:
New apps and platforms they're interested in
The risks and benefits of different online spaces
Where they can connect safely with friends
The restriction doesn't mean young people will automatically shift offline. Help them find healthy ways to stay connected:
Encourage hobbies, sports, and in-person activities
Arrange family game nights or movie nights
Support peer connections through offline activities
Keep communication channels open without judgment
Build Digital Literacy
Use this transition period to teach responsible online behaviour so young people are better prepared when they turn 16. This includes:
Understanding privacy settings
Recognising misinformation
Managing screen time
Knowing when to seek help
Support Resources
If your child is struggling with these changes or experiencing mental health concerns:
Kids Helpline: 1800 55 1800 (24/7)
headspace: Find your nearest centre at headspace.org.au
Beyond Blue: 1300 22 4636 (24/7)
Looking Ahead
The eSafety Commissioner has stated that there will not be a static list of platforms that are age-restricted, as services and technology continue to change. The government has committed to an independent review of the law within two years of implementation.
Australia's approach is being watched internationally, with several European countries, including Denmark, considering similar measures. The focus remains on protecting young people while they develop the resilience and skills needed to navigate social media safely in the future.
For the Latest Information:
eSafety Commissioner: esafety.gov.au/about-us/industry-regulation/social-media-age-restrictions
Raising Children Network: raisingchildren.net.au (search "social media ban")
headspace: headspace.org.au (family guidance section)
Other Sources of Information
Here are direct links to the key resources for the upcoming social media minimum age rule:
eSafety Commissioner — Social media “ban or delay” FAQs & resource hub
https://www.esafety.gov.au/about-us/industry-regulation/social-media-age-restrictions/faqs
Department of Infrastructure, Transport, Regional Development, Communications & the Arts (Australia) — Social Media Minimum Age policy and factsheet
https://www.infrastructure.gov.au/media-communications/internet/online-safety/social-media-minimum-age
Additional family-friendly summary from RaisingChildren.net.au
https://raisingchildren.net.au/pre-teens/entertainment-technology/media/social-media-ban-faqs
