Australia has taken a significant step toward protecting minors on social networks. Through its eSafety Commissioner, Australia has notified social media platforms that they must implement age assurance technology, as required by local law, by December 10, 2025. From that date, users under 16 will no longer have access to applications such as Facebook, Instagram, Snapchat, TikTok, X, and YouTube. Justin Warren, founder and principal analyst of the Australian firm PivotNine and a technology rights advocate, argues, however, that age verification technology does not work reliably.
The core requirement is that the user, in this case the child, should not have to reveal their age directly. According to eSafety, useful signals include account age, engagement with content aimed at children or adolescents, facial analysis, and voice analysis, among others. This type of regulation aims to safeguard minors online, and as Anika Wells, Australia's Minister for Communications, acknowledges, the measure is not infallible, but it does represent a significant step toward protecting minors.
Safety for Minors on the Internet
It is no secret that social media, despite its positive aspects, poses serious problems for minors. Addiction, exposure to inappropriate content, and the fact that anyone without any training can stand in front of a camera and pass judgments that are then repeated, especially by young people, are the main issues. This is why Australian authorities have stepped up and taken action on the matter.
Social Media Law in Australia
Starting December 10, 2025, young people under the age of 16 will not have access to platforms such as Facebook, YouTube, Snapchat, TikTok, Instagram, and X in Australia. Unsurprisingly, the measure has been rejected by big tech companies and much of the technical community, but it is strongly supported by parents and institutions. Justin Warren, founder and principal analyst of the Australian firm PivotNine and a technology rights advocate, stated in a report that age verification technology does not work correctly 100% of the time: “Theoretically, if you choose a specific set of tools and use them under carefully controlled conditions, you can sometimes make age verification work.”
Australia’s Proposed Age Verification Model for Platforms
According to Australian authorities, the “reasonable measure” is one that does not require the individual, in this case the minor, to disclose their age when using a platform. To this end, they have developed a guide on how platforms can implement this age verification, although they warn that “there is no one-size-fits-all approach to what constitutes the adoption of reasonable measures.” Australia wants platforms to adopt a model they define as a “cascading approach,” in which, according to authorities, “multiple independent age verification methods are used sequentially to establish an age verification outcome”.
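The "cascading approach" described above can be sketched as a pipeline that tries independent age-assurance methods in sequence until one produces a sufficiently confident result. This is a minimal illustrative sketch only; all function names, confidence values, and thresholds here are assumptions for illustration, not part of any real platform API or of eSafety's guidance.

```python
# Hypothetical sketch of a "cascading" age-assurance pipeline:
# independent checks run in order until one result is confident enough.
# Every name, weight, and threshold below is an illustrative assumption.

from typing import Callable, Optional

# Each check returns (estimated_age, confidence in [0, 1]),
# or None if it cannot produce an estimate for this user.
AgeCheck = Callable[[dict], Optional[tuple[int, float]]]

def check_declared_age(profile: dict) -> Optional[tuple[int, float]]:
    age = profile.get("declared_age")
    # Self-reported age is easy to falsify, so confidence is low.
    return (age, 0.3) if age is not None else None

def check_account_history(profile: dict) -> Optional[tuple[int, float]]:
    years = profile.get("account_age_years")
    if years is not None and years >= 10:
        # An account older than 10 years suggests the holder is an adult.
        return (16 + years, 0.8)
    return None

def cascade(profile: dict, checks: list[AgeCheck],
            threshold: float = 0.7) -> Optional[int]:
    """Run checks in order; return the first estimate meeting the threshold."""
    for check in checks:
        result = check(profile)
        if result is not None:
            age, confidence = result
            if confidence >= threshold:
                return age
    # No method was confident enough; a platform would escalate to a
    # stronger (more intrusive) verification step here.
    return None

estimate = cascade({"account_age_years": 12},
                   [check_declared_age, check_account_history])
```

The point of the sequential design is that cheap, privacy-preserving signals are tried first, and more intrusive checks are only reached when earlier methods fail to produce a confident outcome.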
eSafety Proposal
According to its own website, the eSafety Commissioner is Australia's independent regulator for online safety, operating as an Australian government agency. Its role includes protecting Australians online and promoting safer online experiences. According to the regulator, useful measures for achieving age assurance include:
- Account age (e.g., an account more than 10 years old).
- Engagement with content aimed at children or adolescents.
- Language analysis indicating that the end user is likely a child.
- Analysis of the information/publications provided by the end user.
- Analysis of visual content (e.g., facial age analysis conducted on photos and videos uploaded to the platform).
- Audio analysis (e.g., age estimation based on voice).
- Activity patterns consistent with school schedules.
- Connections with other end users who appear to be under 16 years old.
- Participation in groups, forums, or communities focused on youth.
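Each of the signals listed above is weak on its own, so a platform would plausibly combine them into an overall likelihood that a user is under 16. The sketch below shows one simple way to do that with weighted signals; the signal names, weights, and threshold are illustrative assumptions, not eSafety's actual methodology.

```python
# Hypothetical aggregation of the weak age signals listed above into a
# single "likely under 16" score. Names and weights are assumptions.

UNDER_16_WEIGHTS = {
    "engages_with_child_content": 0.25,  # content aimed at children/teens
    "childlike_language": 0.20,          # language analysis
    "school_hours_activity": 0.15,       # activity follows school schedules
    "young_connections": 0.20,           # connections who appear under 16
    "youth_group_membership": 0.20,      # youth-focused groups and forums
}

def under_16_score(signals: dict[str, bool]) -> float:
    """Sum the weights of the signals that fired; result lies in [0, 1]."""
    return sum(w for name, w in UNDER_16_WEIGHTS.items()
               if signals.get(name, False))

def likely_under_16(signals: dict[str, bool], threshold: float = 0.5) -> bool:
    """Flag the account when enough independent signals point the same way."""
    return under_16_score(signals) >= threshold
```

Requiring several signals to fire before flagging an account mirrors the regulator's caution: no single indicator is reliable, but agreement across independent ones is harder to fake.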
Like any initiative in its early stages, it will require improvements, but it is undoubtedly a significant step forward. As Anika Wells, the Minister for Communications, puts it: “We are not anticipating perfection here.” Authorities expect “kindness, care, and clear communication” from social media platforms, and that they will do their part to give minors a safer online experience.
