Australia mandates pornography sites block under-18 users starting Monday

Australia’s online regulator has ordered pornography websites to block people under 18 starting Monday, enforcing age verification under sweeping new restrictions aimed at shielding children from harmful content across the internet.

Some adult sites in Australia had already barred non-members and halted new registrations in recent days ahead of the deadline, as platforms prepare to deploy age checks that go beyond simple self-declaration.

The crackdown expands Australia’s child online safety framework following its Dec. 10 move to ban children under 16 from joining social media platforms. The rules target access to what authorities deem “age-inappropriate content,” including pornography, high-impact violence, suicide and eating disorders.

“Make no mistake, where we see failures or foot-dragging, we will hold companies to account,” eSafety Commissioner Julie Inman Grant said. Failure to comply could bring penalties of up to A$49.5 million per breach, the regulator said.

Under the new standards, users must verify their age before accessing restricted material on pornography websites and services. “Clicking a button that says ‘I am 18 years or older’ is no longer sufficient. This is consistent with similar efforts being implemented internationally,” the eSafety office said.

Inman Grant framed the measures as bringing digital spaces in line with established real-world safeguards. “We don’t allow children to walk into bars or bottle shops, adult stores or casinos, but when it comes to online spaces where they are spending a lot of their time, there are no such safeguards,” she said. “But that changes for Australian kids.”

The rules extend beyond adult sites to cover a broader slice of the technology ecosystem. AI companion chatbots that can generate sexually explicit, violent or self-harm content must confirm a user’s age before engagement. App stores and online gaming platforms are required to block minors from adult-only content and features.

Search engines will also need to limit exposure to harmful material, with results containing pornography and high-impact violence blurred by default when a user is not signed in. Queries related to suicide or eating disorders must surface links to appropriate mental health support services before other results.

The eSafety Commissioner’s office said industry will be obliged to apply consistent safeguards across products and services to reduce the risk of accidental exposure. The regulator will monitor compliance and “take enforcement action against systemic non-compliance,” it said.

Inman Grant acknowledged the limits of regulation but said the codes would still deliver meaningful protections. “No piece of regulation will eliminate all risks and harms all at once, but these codes create meaningful protections for children across the tech ecosystem,” she said, adding that the government’s planned “digital duty of care” would further strengthen protections over time.

Australia’s approach places greater responsibility on platforms to verify age and manage risks proactively, as governments worldwide wrestle with how to curb children’s exposure to online pornography and other harmful material without compromising privacy or free expression. The eSafety Commissioner’s office said it would continue to assess adherence and adjust its enforcement posture as the rules bed in.

By Abdiwahab Ahmed
Axadle Times International Monitoring