Australia to require age verification when using search engines



Summary

Age verification

Australia is implementing new codes that require users to verify their age when using search engines.

Why?

The change is part of a recent effort to limit children’s access to adult content online.

Teen ban

The new rule follows the announcement that Australia would ban children younger than 16 from using social media sites.


Full story

Beginning in December, Australians will be required to verify their age to use search engines like Google and Bing. Users who are logged in will have to scan their face or complete an identity check to use the search engine.

The changes are the result of an industry code, not rule changes from the country’s parliament, according to The Guardian. The tech industry develops the codes, and the eSafety commissioner, an unelected official, registers them in a process called co-regulation.


The change is expected to take effect on Dec. 27 and is part of an effort to limit children’s access to adult content online. 

How it works

According to the Australian Broadcasting Corporation, Google and its rival Microsoft will be required to implement some form of age-assurance technology by Dec. 27. If the companies fail to comply, they could face fines of nearly $50 million per breach.

The new regulations list seven main methods for verifying age: 

  • Photo ID checks.
  • Face-scanning age estimation tools.
  • Credit card checks.
  • Digital ID.
  • Vouching by the parent of a young person.
  • Using AI to estimate a user’s age based on data the company already holds.
  • Relying on a third party that has already checked the user’s age.

The search results for logged-in users under the age of 18 will then be filtered for pornography, high-impact violence, material promoting eating disorders and a range of other content. 


Beginning Dec. 10, Australia will ban children under 16 from using social media apps, including TikTok, Instagram, Snapchat, X, Facebook and Reddit.

Changes cause concerns

The co-regulation process has some Australian politicians nervous. Australian Sen. David Shoebridge called the new code “staggering.”

“These proposals don’t have to go through an elected parliament and we can’t vote them down no matter how significant concerns are,” Shoebridge said. “That, combined with lack of public input, is a serious issue.”

Lizzie O’Shea with Digital Rights Watch, an Australian nonprofit, expressed concerns about the public’s lack of say in the new codes. 

“It’s not clear that there is a social licence for such important and nuanced changes,” O’Shea said. “We would argue that the public deserves more of a say in how to balance these important human rights issues.”

Other online safeguards

The new safeguards come after Australia announced it would ban children under the age of 16 from using social media, beginning Dec. 10. That ban sparked significant media attention, unlike the new codes, which have mostly gone unnoticed.

Julie Inman Grant, Australia’s eSafety commissioner, briefly mentioned the new codes in June, saying, “These provisions will serve as a bulwark and operate in lock step with the new social media age limits.”

“It’s critical to ensure the layered safety approach,” Inman Grant said. “Including on the app stores and at the device level — the physical gateways to the internet where kids sign up and first declare their ages.”

Alan Judd (Content Editor) contributed to this report.


Why this story matters

Australia’s new industry-driven age verification requirements for search engines signal an increased focus on child online safety but also raise concerns regarding privacy, oversight and public input in digital regulation.

Child online safety

The new regulations aim to limit minors’ access to harmful content, reflecting growing concerns about children’s exposure to adult material and other risks on the internet.

Privacy and surveillance

Methods such as face scans and identity checks have sparked debate about user privacy and data protection, as Australians may need to provide sensitive information to access basic online services.

Regulatory governance

The co-regulatory process allows industry-created codes to be adopted without parliamentary debate, prompting concern about transparency, public input and democratic oversight in digital policymaking.
