Posts

I questioned the Commissioner regarding her September trip to Stanford and meetings with US tech firms. She will provide a detailed log of her itinerary, speaking engagements, and total costs on notice. Australians deserve to know exactly how their money is being spent and what is being discussed behind closed doors.

I then queried the Minister regarding concerns raised by US House Judiciary Committee Chairman Jim Jordan about the Commissioner’s conduct. While I support protecting children from harm, we must be vigilant when unelected officials are labelled “extreme” by international peers.

Lastly, I was interested to know what the Commissioner’s philosophy was regarding censorship, noting the “enormous power” that has been given to her. She denied being a censor, stating she only acts on public complaints regarding “highly damaging” and “refused classification” material, specifically excluding political speech.

The eSafety Commissioner has enormous power over what you see and say online. I will continue to hold this agency to account to protect the rights of adult Australians from government overreach.

P.S. At one point during this session, Senator Green accidentally called me “Minister” – saying “maybe one day, if the LNP has their way.” She even joked that One Nation is already writing policy for the LNP! 😆😆

— Senate Estimates | December 2025

Transcript

CHAIR: Senator Roberts, I understand you have a few more questions.

Senator ROBERTS: Yes, just three. Commissioner, you visited Stanford University in September this year as part of a USA trip. Did Australian taxpayers fund that?

Ms Inman Grant: Yes, I went, and I met with eight of the AI companies and the social media companies. Then I spent a day and a half at the Trust and Safety Research Conference.

Senator ROBERTS: Could you please provide a log of meetings and a record of your speeches, or any other documentation, to assure taxpayers that their money was spent appropriately, as well as the total cost of the trip?

Ms Inman Grant: I sure can.

Senator ROBERTS: On notice.

Ms Inman Grant: Yes.

Senator ROBERTS: Thank you. You’ve already answered a question from Senator Whitten about the House Judiciary Committee chairman wanting you to testify, so I don’t need to cover that. Minister, does it concern you that your commissioner is engaging in conduct that is so extreme that the US Congress, specifically the House Judiciary Committee chairman, Jim Jordan, is alarmed?

Senator Green: Minister, I think the eSafety Commissioner’s address—

Senator ROBERTS: I’m not a minister.

Senator Green: Sorry, Senator—maybe one day, if the LNP has their way.

*Senator Henderson interjecting—*

Senator Green: You never know. They wrote your net zero policy, so you never know. We are very proud of the reforms that we are undertaking. To be fair, I’m sure the coalition was very proud of the steps that they took in terms of online safety when the eSafety Commissioner was established. For the most part, we have had bipartisan support for these types of reforms, because they keep Australians safe. The social media ban or minimum age will seek to keep our children safe. It’s incredibly important. I know you come in here quite often talking about the safety of children and wanting to keep harmful material away from them. That is the work of the eSafety Commissioner. It’s open to other governments or other people in other parliaments to have their judgment of it, but from an Australian government point of view we are very proud of the work that she does.

Senator ROBERTS: Commissioner, you said earlier, in roughly these words, that you’ve never claimed to censor the net globally. Why do you think people think this?

Ms Inman Grant: We talked about Elon Musk's tweet that said she's the eSafety commissar trying to globally regulate the internet, and Ben Fordham then picked it up, and it's just had a life of its own.

Senator ROBERTS: I’ve complimented your office on its work in protecting children, quite clearly. There are other concerns we have with your work because it can cause consequences for adults that we don’t like, but it’s not appropriate to discuss it here. What’s your philosophy on censorship?

Ms Inman Grant: My philosophy is I’m not a censor. I respond to complaints from the public. We received many about the Charlie Kirk assassination and about the stabbing of Iryna Zarutska on a train where she bled to death and the decapitation of the Dallas hotel owner. If you think that that’s overstepping when that’s something that’s highly damaging and was determined—

Senator ROBERTS: No, I didn’t say that. I was wanting to know your thoughts on censorship—that’s all—because you’ve got enormous power.

Ms Inman Grant: My thoughts on censorship? Well, what has been helpfully built into the Online Safety Act is that we’re not regulating for political speech or commentary. It’s where either online invective or imagery veers into the lane of serious harm. You provide us with thresholds. Sometimes those thresholds are tested and sometimes they’re a grey area, but I think we help thousands of people every year. We’re doing world-leading work that the rest of the governments around the world are following. I think we’re punching above our weight. We’re a very small agency given the size of our population. So I guess I don’t have a view. I don’t see myself as a censor. I don’t tell you what you can or can’t say unless it’s refused classification or it’s trying to silence someone else’s voice by targeted online abuse that reaches the threshold of adult cyberabuse.

Senator ROBERTS: Thank you. Lastly, I think it was Mr Fleming who invited us to have a briefing. We haven’t forgotten. We’d like to do that, but we’ve been a bit busy. We will do it one day.

Mr Fleming: Maybe in the new year. The offer still stands.

Senator ROBERTS: Thank you.

During Senate Estimates in December, I asked the eSafety Commissioner why social media platforms like X are being targeted, while Bluesky — a known hangout for the left — seems to be getting a free pass.

The Commissioner claimed there’s no “political bias” and that Bluesky has not been exempted – they’re just focusing on where the most kids are. She called Bluesky a “young company” that’s still finding its feet. It looks like a double standard to me — conservative platforms get targeted, while ‘left-wing hangouts’ get a free pass for being ‘low risk.’

Government shouldn’t be picking winners and losers based on politics. We need transparency, not a “dynamic list” that changes whenever a bureaucrat feels like it. Whether it’s the Labor Party or the Coalition, Australians are sick of the double standards and the “Big Brother” tactics.

I’ll keep speaking up to make sure your voice isn’t silenced by bureaucratic overreach. We need one rule for everyone, and total protection for our free speech.

— Senate Estimates | December 2025

Transcript

CHAIR: It wasn’t my intention.

Senator ROBERTS: No, I know that. Thank you for appearing again. I have, perhaps, an insight. Since COVID, people in Australia are very wary of government. That’s not just the Labor Party; that’s both. Commissioner, you have exempted Bluesky from your under-16 social media minimum-age restrictions, yet Bluesky is almost identical to X, as I understand it. It currently allows 13-year-olds or younger people saying they are 13 to sign up, and they have no age verification. Do you understand, Commissioner, that you have an obligation to discharge your duties without the perception of political bias? Your decision to exempt a left-wing hangout and to include a conservative hangout, X, looks like political bias.

Ms Inman Grant: Bluesky has not been exempted. They present a very low risk. They have actually identified themselves as an age-restricted social media platform. They probably have 50,000 Australian users—a very small number of young users. They're building up their age inference tools. They're a very young company. What we've decided to do—we're talking to a range of companies that could be age restricted social media platforms, whether it's Yubo, Yope, Lemon8 or other ones that we know we're going to go to. But you missed the opening statement, where I said our focus—these assessments that we're doing are voluntary. I don't have specific declaratory powers in terms of who is in and who is out, so I can't say anyone is exempted. It's up to the legal teams of those companies to determine whether they're in or out. Where we will focus our compliance is where the vast majority of young people are. For the purposes of transparency, fairness and due process, we developed the self-assessment tools. Then we did some initial assessments so that we could at least have a body of major companies that would fit the criteria set forth by parliament. We've got 10 that we're starting with, but I've always said that this will be a dynamic list. If we see that there are significant migratory patterns with young people that are going over to Bluesky—again, we've had three conversations with them—we expect that they will start applying some of their age assurance tools. They're just at the beginning of that journey.

Senator ROBERTS: So what you’re saying, Commissioner, as I interpret it, is that you’ve got objective criteria that you assess platforms against.

Ms Inman Grant: We developed a self-assessment tool, so there are consistent assessment criteria. The criteria we have to use are the criteria that were in the legislation that parliament passed. That primary test is around whether or not a particular site—if it didn't meet an exclusion, say, the messaging exclusion, the online gaming exclusion or the education and mental health exclusion, we had to do a sole-and-significant-purpose test. If its sole or significant purpose was online social interaction, then our preliminary view—it is not a determination—was that they were an age restricted social media platform.

Mr Fleming: Senator, just to give you a pointer, it’s in section 63C of the Online Safety Act. The criteria are set out in the act, and, if someone meets those criteria, there are a set of rules that the minister made. If the rules apply to that platform, then they’re out of the scheme. That’s how it works. To reinforce the point the commissioner made, there’s no determination that platforms are in or out. We’ve just expressed our preliminary view based on our assessments against the criteria, like the platforms can do for themselves. Then we focused on where most of the kids are, and that’s where we’re going to focus our initial efforts.

Senator ROBERTS: Thank you. Minister, your government chose to use legislation against social media platforms. However, the commissioner has then included search engines in the scope of age restrictions, using an industry code under the Online Safety Act. Couldn’t you have simply done the whole thing under existing powers and created an industry code of practice, mandatory if necessary, for age control of social media instead of this whole blunt instrument legislation—an industry code as opposed to enforcement?

Senator Green: I’ll let the eSafety Commissioner answer because there would have been advice given to government about the best way forward. This is a very important step forward that we’re taking, and legislation was required.

Senator ROBERTS: Well, I asked you because I’m not allowed to ask—

Senator Green: No, you are allowed to—

Senator ROBERTS: for an opinion of an officer.

Senator Green: No, it’s not an opinion.

Senator ROBERTS: If the minister wants you to, that’s fine.

Senator Green: You’re asking why legislation was required. They can answer that question.

Ms Inman Grant: The industry codes were included in the Online Safety Act of 2021 under the then coalition government. What they decided was that they would split the technology industry into eight different sectors, from search engines to social media sites to ISPs to some broader categories, including the designated internet services and relevant electronic services. What Paul Fletcher, who was my minister at the time, decided was that he wanted to continue the tradition of co-regulation that had existed for many years across telecommunications and ensure that the industry developed the codes. We would decide whether or not they met appropriate community safeguards. If they did, we’d register them. If they did not, then I would create standards, and that would be a disallowable instrument that would require additional parliamentary scrutiny. It took 4½ or five years for all this deliberation, for this to happen. In most other jurisdictions, the regulator writes the code, but, with respect to the search engine code that I think you’re referring to, I don’t know if you missed the interaction I just had with Charlotte Walker—

Senator ROBERTS: I did.

Ms Inman Grant: They were written by Google and Bing, and they pretty much codify safe search practices that are used today. So, come 27 December, if you're searching the internet and you come across violent pornography or explicit violence, it will be blurred. This is because 40 per cent of kids tend to come across this kind of violent content. The search engine is the gateway, and it's unexpected, it's unsolicited and it's in their face. If you're an adult and you want to continue through, you can do that. You only have to be age verified if you decide to search the internet with a Google account on, for instance, and a lot of families may choose to have a Google account on so that they can have different age-appropriate settings set up. But, if you're concerned about it, you just use DuckDuckGo, Bing or whatever other one. The other thing that I think is really important about the search engine code is that, if there's a person in distress who is seeking to take their life, rather than the search engine taking them directly to a lethal-method site, it will redirect them in the first instance to an Australian mental health support provider. We all know that suicide is a terribly damaging thing for families and communities. So, if we can give someone in distress the support that they need rather than the directions in terms of how to take their life, any family would be grateful.

Senator ROBERTS: I’m sure they would. X currently—

Senator Green: I’m sorry, Senator, I misunderstood your question at the beginning. I thought you were asking about the minimum age legislation, so I apologise.

Senator ROBERTS: That’s alright.

Senator Green: I understand now what you were asking, and the eSafety Commissioner has given a very good answer.

Senator ROBERTS: X currently has, in early deployment, routines which do the following: pattern matching to determine age without the use of personal identifiers, such as a digital ID; pin protected parental controls—I tend to think government should not be undermining parents—to allow parents to set guardrails for their children on content that will be granulated to individual accounts, keywords or topics; and interaction monitoring to identify what could be harassment based on the pattern of posting, the words used and the ages of the people involved to stop offending posts being seen by anyone but the poster. If industry can do this by themselves, why did we need legislation? Why wasn’t a simple code of practice used instead of this ‘big brother, big stick’ drama?

Ms Inman Grant: Is that a question? I would just say in response—

Senator ROBERTS: It looks like the platforms are developing new technology.

Ms Inman Grant: I would just say that we had a very constructive meeting with X. They walked us through a number of the tools. They did say they were going to use age assurance with Grok, which could have some interesting outcomes. But a large number of parents don’t utilise parental controls. Sometimes it’s because they’re too difficult for parents to find or to work. This was a bipartisan act that the parliament obviously started. The momentum started in South Australia and then in New South Wales. But my view, after talking to so many of the ministers, the Prime Minister and the opposition leader, who supported it, was that they wanted to do something monumental. They wanted to create a significant normative change.

Senator ROBERTS: That’s what scares us.

Ms Inman Grant: One normative change that isn’t scary, I would think, is that we know that 84 per cent of eight- to 12-year-olds already have social media accounts, and, in 90 per cent of cases, parents have helped them set them up. Why? Because they wanted them to be exposed to harm early? No. It’s because they’re concerned that their kids’ friends are all on the sites and their kid will be excluded. What this change does is delay them from being exposed to all the harmful and deceptive design features. They can also sit down with their kids and say: ‘Hey, you’re not ready for this. You’re not going to be on it and your friends shouldn’t be on it either.’ So it takes the FOMO, exclusionary element out of it, and this is what we’ll be measuring.

Senator ROBERTS: So the government excludes them instead of their friends? It should be the parents, shouldn’t it?

Ms Inman Grant: They’re setting a standard like you’d set a drinking age or the age for cigarettes. They’re setting an age for social media that they think is the right age and—

Senator ROBERTS: Let’s move on. Commissioner, the search engine code included a grace period of 12 months to allow search companies to write their code to comply. As I just indicated, social media companies are close to a technological solution that will also solve their compliance. Will you allow a grace period to allow social media companies to properly write, test and deploy age-verification technology in an orderly manner—in other words, delay?

Ms Inman Grant: We’re following the letter of the law, but what we’ve said is that we are looking for systemic failures. We don’t expect accounts to immediately disappear overnight. We also have another requirement beyond the deactivating of the under-16 accounts on 10 December, which is preventing under-16s from creating accounts. We accept that that’s going to be a longer-term journey for a lot of these companies, and many that we’re talking about here already have very sophisticated age-inference tools or AI tools. Some of them will be supplementing them with third-party tools that have been tested with the age assurance technical trial. Again, they’re taking a layered approach. We will watch closely. If they have glitches, we’ll talk to them about it. What we care about is that they’re clear with us about the tools and the success of validation or the layered approach they plan to take. If it’s not working, the other requirement is continued improvement, which the technology is doing every day. So in some ways we will be providing a grace process.

Senator ROBERTS: It seems that you accept that this rushed introduction with insufficient time for social media companies to get the software right, with no time for testing and very little public education, could be a recipe for chaos.

Ms Inman Grant: I think they’ve had plenty of time and they’re all technically capable of achieving this.

CHAIR: Senator Roberts, noting the time, we’re due to take a short break. Do you have a final question? Then we’ll take a break and rotate the call after that.

Senator ROBERTS: Why was the decision made to time the introduction for school holidays, which is when children will be wanting to access social media to stay in contact with their friends, sports and activities?

Ms Inman Grant: It was written into the legislation.

Senator ROBERTS: It was one of the reasons we opposed it.

Senator CANAVAN: It’s killed Christmas.

Ms Inman Grant: That’s a legitimate concern. Kids are—

Senator CANAVAN: They’ll get new gadgets that they won’t be able to use.

Ms Inman Grant: Only for gaming.

CHAIR: That’s a good note.

Australia’s eSafety Commissioner, Julie Inman Grant, recently directed social media companies to take steps to prevent children from altering or falsifying their age to bypass upcoming restrictions for users under 16. This directive is part of the broader implementation of the Online Safety Amendment (Social Media Minimum Age) Act 2024, which requires platforms to enforce age verification and block underage users starting 10 December 2025.

The Commissioner stressed that platforms must identify and remove existing underage accounts (84% of children aged 8 to 12 already use social media, often with parental assistance) and take “reasonable steps” to prevent new sign-ups or age changes by minors.

This will include multi-layered age assurance technologies – such as facial age estimation, behavioural inference and successive validation – to detect and block attempts by children to lie about their age or edit profiles post-creation, without relying solely on self-reported birthdates, which can be easily manipulated.

Platforms such as Meta (Facebook and Instagram), TikTok, Snapchat, and others are explicitly required to audit their user bases and redesign onboarding processes to close existing loopholes. Non-compliance may result in fines of up to AUD 49.5 million per violation.

The guidance builds on earlier research from September 2024 showing that only 13% of underage accounts were previously shut down for age violations, highlighting the need to stop children from “changing their age” on profiles to evade detection.

This aligns with the law’s goal to stop social media access until age 16 in order to “protect young users from harms such as cyberbullying and addictive platform features”, while exemptions will apply for non-social platforms like gaming and messaging apps.

Life will never guarantee safety. That is not an excuse to legislate danger.

Australia is set to ‘quietly’ introduce ‘unprecedented’ age verification checks for YouTube and Google as part of a wider push to gate-keep access to social media.

Quietly.

Without the consent of the Australian people.

Labor is pushing ahead despite alarming fallout from the UK, where its Online Safety Act, claimed to have been created in the interests of ‘child safety’, has led to the immediate censoring of political discussion surrounding mass migration and Grooming Gangs.

What began as genuine concern for children on social media has rapidly expanded to mandatory, wide-ranging, biometric age checking across the digital landscape.

Not only here – throughout the Western world.

Australia, the United Kingdom, and the European Union have all decided that information is the enemy of political ideas.


The Coalition established the eSafety Commissioner

Liberal Leader Sussan Ley continues to support the eSafety ‘Commissar’s’ proposed restrictions on X, Facebook, TikTok, and Instagram, set to begin in December and now expanded to YouTube and Google (including Google Maps).

Failure to comply will see the imposition of extraordinary and ludicrous fines.

This is to satisfy an age verification technology whose reliability is yet to be proven. While these biometric technologies can guess at ages, they cannot reliably distinguish between teenagers close in age, and can return different results for the same individual. How can this be the proposed basis for adult rights to digital communication?

It is only natural that when adults find themselves unable to access essential digital services, or when a 16-year-old wants to download X on their birthday, a more reliable form of identification will be sought – and that will almost certainly be Digital ID.

So much for promises that this will be ‘voluntary’ and ‘only for government paperwork and applying for rental properties’.

Most believe, logically, that the point of ‘child safety’ legislation is to force the implementation of Digital ID and perhaps begin the crackdown against VPNs.

These are policy positions that would have been rejected if it weren’t for the added layer of ‘think of the children’, just as deconstructing our energy grid required the weaponisation of screaming children gluing themselves to the road believing they were ‘going to die’ because of fossil fuels.

It is a sickening form of confected emotionally-manipulative hysteria.

We may ask, for what other reasons have these extreme measures been placed upon the digital realm?

Especially considering YouTube is one of the most heavily regulated established platforms and Google has a fully-functional adult-content setting.

Safety?

I don’t believe that. I’m sure you don’t believe that. Chalk up another Uniparty lie.

There is more going on.

While the government continues turning a blind eye to gaming chats and unregulated message boards, it clearly does not believe in child safety online. And even if the eSafety Commissioner believes in her mission to ‘protect children online’, why not wait until the under-16 social media ban comes into force in December?

There is enormous doubt about its functionality and, many assume, about its public reception. It is likely to be a social disaster. Children around Australia will suddenly realise that government power extends beyond campaign slogans aimed at their frustrated parents.

The UK is experiencing only a fraction of the power the Australian legislation proposes, yet it is already an unmitigated disaster, one that has been called an assault on fundamental human and civil rights. Instead of protecting children, the UK’s Online Safety Act has put them in danger because it silences information about police complicity in the Grooming Gang assaults and removes public protests about illegal migrants who have been accused of sexually assaulting young people on the street.


Censorship is creating a world where criminals and predators are protected for the sake of political harmony

This is why we say, over and over, the government cannot be trusted with censorship.

Even good intentions turn sour, and we are confronted with ridiculous scenes, such as Peter Kyle, the UK Science Secretary, accusing Reform leader Nigel Farage of being on the side of Jimmy Savile. Needless to say, Farage is demanding an apology.

‘If you want to overturn the Online Safety Act you are on the side of predators. It is as simple as that,’ Mr Kyle spat back.

This is what authoritarian governments do when they overreach – falsely framing any challenge to their legislation as an existential threat to safety. If you don’t want your free speech rights or privacy erased, you must be supporting predators. If you think industrial renewable energy projects are a bad idea, you must want the world to burn. Or freeze. Or flood. (They’re not quite sure on that one.)

The truth is, the UK has shown us what awaits Australia in the immediate future.

Even those who dislike social media need to be concerned about the impacts on search engines such as Google and Microsoft. Those who do not verify their age will have their results automatically filtered to child settings. This does not mean the standard ‘safe search’. No, it is instead a more complex algorithm that, we have been warned, will sweep up ‘harmful content’ – which could simply mean a discussion on migration, or whatever the government deems to be misinformation.

It might be an article from the wonderful Professor Ian Plimer challenging the United Nations Intergovernmental Panel on Climate Change.

It is via these methods of ‘child safety’ that our access to knowledge shrinks.

The government has become what the late Christopher Hitchens warned about – an entity deciding what you can read.

Did the Prime Minister mention this at the last election?

Who is responsible for subverting democracy and taking this decision away from the Australian people and our elected representatives?

This intolerable story of the erosion of rights comes down to the eSafety Commissar, Julie Inman Grant. She seems absolutely giddy at the thought of more power. I’m disgusted. Enough is enough. We need to have a talk about digital overreach and the misuse of child safety as a means to control people’s access to the digital world.

Built on the precursor policies and legislation the Coalition introduced, Labor’s assault on the digital world is so expansive and severe that it is difficult to know which argument to take into battle.

And that is the point.

Destroying the modern public forum is an essential step on the path to cementing an era of unchallenged propaganda capable of re-shaping the social conversation of Australia.

The people who founded our democracy wrote privacy into the system for a reason.


It was to protect people from the government

First, terrorism was used. Then climate change. Then Covid. Now, child safety.

All of these have been used to deceitfully chip away at privacy and free speech. It must stop. We have to draw a line in the sand and protect the internet, for ourselves and for the next generation of children who deserve to grow up in a free country and indeed, a free world.

Life will never guarantee safety. That is not an excuse for the government to legislate danger.

‘Child safety’ or deliberate political censorship? by Senator Malcolm Roberts


Using ‘child safety’ to restrict social media is a dangerous path for Australia

There are some policies so unworkable, so obscene, and so detached from reality that the public may be forgiven for thinking they will never come to pass – even after the Senate approves a bill.

Banning Australians under 16 from social media was an idea pitched by former Liberal Leader Peter Dutton and formalised by Prime Minister Anthony Albanese before the Federal Election.

Then it was forgotten…

The policy had a strange birth, following a tiny frenzy of media articles which sprang up out of nowhere describing a ‘social media bullying epidemic!’ A ‘crisis’ that vanished from the headlines once digital censorship had been cheered onto the agenda by politicians desperate to talk about anything other than energy, migration, or debt…

These articles briefly reappeared when criticism against the original bill reached its peak, painting those who dared to oppose online censorship and intrusive biometric identification as being insensitive to the ‘plight of children’.

It’s not clear who is pulling the strings.

However, on more than one occasion the media has pitched a ‘crisis’ peddled by ‘experts’ that was ‘solved’ at a politically opportune time.

Call me a cynic, but something’s up. Another agenda disclosed to a select few, perhaps?

The under 16 ban will enter the real world in December, with children already being advised to ‘download their profiles’ and delete social media apps.

Some parents believe having the strong-arm of government in the living room will help, although it is more likely this interference will create an acrimonious social rift between generations that is far worse than the ‘you don’t understand my music’ sentiment.

‘I used the government to ban you from talking to your friends…’ is hardly expected to help strained relationships between parents and children.

Every generation has a desire to preserve the world they grew up with, and I understand a lot of people are hostile to social media and its uncertain future.

This is often because media entities describe the online world as a ‘sewer’. To them, X, Facebook, and YouTube represent an army of keyboard critics and free market competition.

The media present a narrow view of a sprawling advancement which has become as integral to civilisation as the roads our truck drivers use to deliver food.

Social media is one technological creation we must adapt to – or at least accept – as we did the invention of the internet itself.

Banning children from what has become a fundamental tool for future business could saddle them with a disadvantage on the global stage and deny them opportunities.

Australia’s eSafety Commissioner has evolved from the late Christopher Hitchens’ warning about ‘who gets to decide what I can read’ into the more sinister ‘we will decide which libraries you can enter’.

While the regulator might not be burning books, she is definitely smacking children who try to read them.

Worse, this expensive regulatory mess will solve nothing. It may even be used to create additional restrictions on adults as part of a larger crackdown on freedom.

Certainly, it is already spawning censorial bureaucracies to watch over us…

As the eSafety Commissioner menacingly advises, ‘We’ve only used our formal powers 6% of the time.’

The demonisation of the digital world is a philosophy the major parties share.

Opposition Leader Sussan Ley made her position clear at the National Press Club:

‘Another area that demands stronger government intervention is the protection of our children from devices and technology. We have allowed the smartest people in the world to make billions of dollars by peddling addictive technology to children and it is shortening their childhoods. Parents need government in their corner.’

Do they?

Do parents want government in their homes, holding the strap?

That is not something I hear from the community.

I’ve never had a parent lean on my shoulder and exclaim, ‘Gosh! If only we had MORE government!’

It is terrible to watch the Labor and Liberal parties treating young people like helpless sheep – herding them into government-moderated holding pens until they have endured 16 years of uninterrupted brainwashing from the education system.

Re-making bright, eager, healthy children into docile sheep.

These are children who know more about the Digital Age than every single adult drafting the under 16 ban.

Not a ‘ban’, apparently…

‘Calling it a ban misunderstands its core purpose and the opportunity it presents,’ said the eSafety Commissioner.


‘We’re not building a Great Australian Internet Firewall, but we are seeking to protect under 16s from those unseen yet powerful forces … it may be more accurate to frame this as a social media delay.’


Later she contradicts herself, adding: ‘Children have important digital rights to participation.’

This is not only about Australia. The eSafety Commissioner was adamant that this regulation was both ‘bold’ and ‘leading the world’.


‘Global collaboration is what we have to be doing. The internet’s global. We know laws are national and local and that’s why we’re the founders of the Global Online Safety Regulators Network – as we’re much stronger together. A lot of these companies are as large and wealthy as nation states so we need to band together with like-minded countries.’


A ‘United Nations’ of Digital Censors operating above government to control global speech…?

Astonishingly, that did not make it to the headlines.

Cutting Australian children off from the outside world leaves their minds to be poisoned with government-scripted paranoia. These are the fears and terrors of Parliament. Once mistrust has been sown against alternative media sources that contradict policy – only then, apparently, can young Australians be ‘safely’ released into the digital wilderness to become crusading activists policing the digital realm on behalf of the government.

This is how you create neurotic, ideological busy-bodies championing government policy.

It is not how you support young Australians in their experience interacting with and shaping the digital realm.

Listening to the eSafety Commissioner, Julie Inman Grant, give her recent speech at the Press Club in Canberra, it appears her ‘advice’ is being crafted from two positions: a grievance regarding her brief employment at Twitter (now X), and the belief that a global framework of eSafety bureaucrats should control the flow of information online.

Julie Inman Grant, who once introduced herself as the ‘censorship commissar’ (quoting Elon Musk), described her interaction with the tech giant as a ‘war’.

‘I made a strategic decision to withdraw here … let’s face it, the war is going to be much longer and more extended.’

That was in 2024.

As an elected Senator, I find it extremely concerning and distasteful to hear the eSafety Commissioner openly pitch the regulation of digital media as a ‘war’. The word insinuates that social media platforms are hostile foes rather than private companies providing an extraordinary advancement of technology and – for the first time in human history – a global platform for real-time speech between the peoples of the world.

While speaking to the Canberra Press Club, the eSafety Commissioner pitched her argument by comparing social media to a beach.

‘There are indeed treacherous waters for our children to navigate, especially while their maturity and critical reasoning skills are still developing. And this is where we can learn so much from tried and tested lessons of water safety that Australia pioneered. From the backyard pools to the beach, Australia’s water safety culture is a global success story.’

Pardon me, but the eSafety Commissioner appears to be confused.

Australian children under 16 are not banned from pools and beaches. Nor are they indoctrinated into a cult of terror surrounding water.

‘A mixture of regulation, education, and community participation that reduces risks and supports parents keeping their children happily and safely frolicking in the sea. Picture any major beach in Australia and [it] will likely include the familiar sight of yellow and red flags fluttering in the breeze, children splashing in the waves, and lifeguards standing watch. Parents keep a watchful eye too, but are quietly confident in the knowledge that their kids will be okay. Not because the ocean is safe, but because we have learned to live beside it.’

Aside from the insult of using an Australian beach scene to sell censorship to children, her focus on community adult presence as a safety measure clashes with her later comments, where she says: ‘…the difference will be that they are grouped more with their peers rather than – you know – billions of people around the world that are adults and kids and strangers.’

Are communities good or bad?

The eSafety Commissioner doesn’t know because she cannot get her messaging straight from one breath to the next.

Too much focus has been placed on the (manageable) problems unavoidable in a revolutionary technology development and not enough said about the extraordinary benefit that comes with opening up the world’s information, opinion, debate, and minds.

Of course, there will be a period of adjustment.

For children and parents.

That is not an excuse for regulators to reach into the cradle and suffocate social media in its crib.

This attitude would have seen Rome’s stone tablets smashed, Alexandria’s libraries burned, and the printing presses of Europe fall silent. All to ‘protect’ people from unregulated knowledge.

And it is not as if the internet is an unregulated ‘Wild West’ as claimed. There are many laws – most of which go unenforced for reasons that remain a mystery to the public – that deal with most of the examples the eSafety Commissioner offers as justification.

Deep fakes, blackmail, underage sexual content, harassment – these are all crimes.

We would support an investigation into how many of these reports authorities leave unanswered.

These failures are domestic. They are related to Australia’s weak criminal justice system, not Silicon Valley CEOs who are being used as scapegoats to disguise the irresponsible failure of ‘soft-touch’ sentencing.

Peer-based bullying, which makes up the bulk of tragic youth suicides, largely involves school peers known to both parents and teachers. These terrible stories almost always reveal the systemic failure of an education system that has shied away from punishing bullies and removing them from the school environment.

Before banning children from the internet, we should find out why schools have lost control of students.

Banning under 16s from social media also has the potential to turn the government into the worst schoolyard bully.

Imagine a class where only one person is under 16. All of their peers are on social media – except them. Difference is what drives exclusion, and in this case the government is creating an insurmountable social divide that will expose untold thousands of children to a friendship disadvantage.

And what of children who struggle with school?

The eSafety Commissioner said at one point, ‘…a vision the Prime Minister had of seeing more kids kicking the footy. That’s what we plan to help measure in…’

Not all kids ‘kick the footy’.

The children who do not fit in with their school peers often engage with small international niche creative communities. These children make school bearable through their social media friendships in the same way my generation had pen pals or friends in other clubs and areas.

Cutting children off from their best friends online is worse than bullying. It is cruelty.

Despite what is suggested by regulators, these children will not ‘just move to other platforms’. One Australian child cannot compel international children to change to another unregulated social platform because that is not how reality works. They will simply be excluded and forgotten.

This is before we consider sick children who live at home or in hospital, for whom social media is the thread that connects them to the world.

Social media lived peacefully side by side with the Millennial generation, who are now in their 30s and 40s.

How is that possible?

No doubt it had something to do with the rigour of their education and domestic environment which provided the balance lacking in schools which routinely engage in public activism – dragging children onto the streets as pawns in adult political games.

When the education system decided to focus on politics, it began to see free speech and the platforms that facilitate critical thinking, live news, and global knowledge as ‘dangerous’.

School eSafety programs spend much of their time obsessing about which sources of news can be ‘trusted’, although it is never made clear when educators were handed the task of ranking news organisations in the minds of children.

Who gets to decide which news outlets are ‘trustworthy’?

The hypocrisy of National Press Club host Tom Connell informing the audience that they could watch YouTube and follow the conversation on X cannot be overstated.

In 2025, the news unfolds on social media – much to the frustration of the legacy media.

World leaders correspond via Truth Social and X.

The eSafety Commissioner is effectively banning children under 16 from the news – from the world – and from their friends.

Imagine if she had insisted children be banned from reading newspapers ‘for their safety’.

It’s the same thing, yet the danger is easier to recognise in the latter.

Our children deserve protection – protection from the expansion of government into the role of parenting.

‘The eSafety Commissioner has gone too far’ by Senator Malcolm Roberts

Using ‘child safety’ to restrict social media is a dangerous path for Australia

The Office of the eSafety Commissioner does commendable work in protecting children and adults from bullying and, most importantly, removing child abuse material. I praised the Office for this work.

However, in my opinion, the eSafety Commissioner has brought the office into disrepute with her personal vendetta against Twitter/X and her attempt to become the world internet police.

Last year, the Commissioner finalised investigations into 9,500 pieces of violent and extremist content. I asked what these were. The answer provided was that the Commissioner was taking down material from anywhere in the world, detecting it in part because they actively searched for it, even without a complaint.

Given that the Commissioner is positioning herself as the world internet police at our expense, I asked what benefit removing the 9,500 pieces of material had for Australians.

The answer relied on a single incident, and there was no proof the material actually caused a terrorist act. I asked why there was no explanation of what the other material was – such as a transparency register, so we can see what material they are requiring to be taken down and check it for political bias. The question was ignored.

I also asked what direct benefit her actions had in addressing terrorism and violent material. The Commissioner answered regarding child material, which I had already praised.

The Commissioner is avoiding scrutiny of her takedown notices for violent and extremist material, and I believe it is because they follow a political bias.

One Nation calls for the eSafety Commissioner to stand down.

Transcript

Senator ROBERTS: Can I, first of all, pay a compliment and I’ll read out some statistics. From the ACMA annual report 2023-24, the office of the eSafety Commissioner has received 13,824 complaints regarding web URLs, with 82 per cent relating to reports about child sexual abuse, child abuse or paedophile activity. This is a 19 per cent increase from the previous year. Your office sent 9,190 notifications related to child sexual abuse material to the INHOPE network—which I understand are the good guys, the right people to work with—and referred 130 investigations to the Australian Federal Police. On cyber abuse, you received 2,695 complaints to the Cyberbullying Scheme for Australian children and 3,113 complaints to the Adult Cyber Abuse Scheme with a removal rate of 88 per cent where removal was required. My opening comment is simple: well done; thank you very much. This is important work. 

My first question is that you finalised 9,461 critical investigations into terrorist and violent extremist content, representing a 229 per cent increase—that’s amazing—in these types of complaints from the previous year. I’d like to ask about that. How do you define terrorist and violent extremist content? 

Ms Inman Grant : I will turn over to Ms Snell to talk about that. That is part of our illegal and restricted content team under the Online Content Scheme. 

Ms Snell : I’m actually going to invite Mr Downie, who is the executive manager for our Investigations Branch, who oversees this work, to talk specifically to this. 

Mr Downie : When we’re dealing with terrorism and violent extremist content under the Online Safety Act, we deal with terrorism as defined under the Criminal Code to the pure definition of what a terrorist act is. However, when we’re applying the Online Safety Act, we apply the content according to the classification scheme, and we’ll classify that material as ‘refuse classification’, which then falls into class 1 and class 2 definitions. 

Senator ROBERTS: Is this content relating to Australian content or international content? 

Mr Downie : With the complaints that we receive, we receive content that can be generated or hosted anywhere in the world, but the key is that it’s accessible by the people within the Australian community. 

Senator ROBERTS: Do you seek this content out yourself, or do you rely on a complaint before acting? 

Mr Downie : Generally, we rely on a complaint before acting; however, we do have own-motion investigation provisions where we are then able to further conduct investigations to locate material that may be in furtherance of that complaint. 

Senator ROBERTS: Of those 9,461 completed investigations, what was the outcome, please? 

Mr Downie: I’d have to take that on notice for the specific details of those investigations, but in the majority of cases that content is removed. 

Senator ROBERTS: Is there any demonstrable benefit from you taking this material down? What is the benefit to the taxpayer of this aspect of your office? 

Mr Downie : Having access to that type of content, whether it be globally or not, is very harmful to members of the community. That material can be used to incite violence. It can be used to radicalise vulnerable people or youth, which, as we’ve seen in the media, can be then used to incite further violence within the community. So less access to that type of content can only be beneficial for the Australian community. 

Ms Inman Grant : And I’d note that ASIO Director-General Burgess has said that the vast majority of terrorism investigations conducted right now are of young people between the ages of 14 and 21 and in every single case they have been radicalised somehow on the internet. You would probably also be aware of, heartbreakingly, the stabbing video of bishop Mar Mari Emmanuel, which was geo-blocked here by X but was available in the rest of the world. In the sentencing of the 17-year-old Southport killer, Axel Rudakubana, who went and stabbed three little girls to death while they were making bracelets at a Taylor Swift themed dance party, that very video, that very Wakeley stabbing video, he accessed on X 25 minutes before he stabbed those little girls and claimed that that was his inspiration. So you can imagine that this is something that the UK government has wanted to talk to us about. We have a partnership with Ofcom. We of course have different powers, but I think it’s just a very powerful reminder that this kind of content is accessed by young people. It can normalise, desensitise and, in the worst cases, radicalise. 

Senator ROBERTS: On page 206 of the ACMA report, there’s a graph which shows X is the source of five per cent of your cyberabuse claims and Google four per cent, compared to Facebook at 25 per cent. Page 216 of your report lists major noncompliance actions. X has four and Google one. Why does X occupy so much of your time? 

Ms Inman Grant : In terms of adult cyberabuse? 

Senator ROBERTS: In terms of terrorism complaints and cyberabuse. 

Ms Inman Grant : If you recall back to 16 April, around the Wakeley stabbing, we worked with all platforms. With the exception of Meta and X Corp., they all did a good job in trying to identify, detect and remove the Wakeley terrorism video. We weren’t satisfied that either Meta or X did, but, once we issued formal removal notices, Meta responded and complied within the hour, and, of course—you know the story—X said, ‘We’ll see you in court.’ That’s what has taken our time. 

Senator ROBERTS: What about the others? That would apply to one of your complaints against them. What about the others? Why the other three? 

Ms Inman Grant : It depends on the type of harm. For instance, when we’re talking about youth based cyberbullying, most of the cyberbullying happens on the top four platforms where children spend their time, on YouTube, TikTok, Snap and Instagram. When it comes to image based abuse, there’s a much higher proportion now of sexual extortion targeting young men between the ages of 18 and 24. They tend to meet on Instagram, sometimes on Snap, and then they’re moved off platform. So it depends on the form of abuse. It also depends on the complaints we get. But, when it comes to the terrorist and child sexual abuse material, we go to where the content is hosted and shared. 

Senator ROBERTS: That still doesn’t answer the question. You’ve got four major noncompliance actions against X and only one against Google, yet you’ve mentioned several platforms. Why does X have to occupy so much of your time? 

Ms Inman Grant: Because they did not comply with our notices. Google came close to not complying, so we gave them a formal warning. 

Mr Fleming : Those tribunal and court cases are often initiated by X, so we’re responding to the claims that they make challenging our powers. That’s why they feature the most. 

Senator ROBERTS: The report goes on to list how many notices are issued under each part of the act yet does not provide a detailed list. This is fine for child and adult abuse material, of course. We’re happy with that. For class 1 extremist and violent material, why are we not provided a list of what the commissioner considers worthy of a takedown notice and the reasons why? There’s a widespread belief in the public that you’re overstepping on your choice of material to take down. 

Ms Inman Grant: Respectfully, I’d like to read from some weighted and validated surveys of the Australian public. In November 2024, a weighted survey of Australians found that 87 per cent of those surveyed supported the introduction of stronger penalties for social media companies that do not comply with Australian laws, 77 per cent supported the proposed ban on social media for children and 75 per cent supported the Australian government’s plan to introduce a digital duty of care. In August 2024, a weighted survey of Australians found that 79 per cent said that social media platforms should operate with a regulator with the power to order content removal. That seems like a pretty overwhelming amount of support from the public. 

Senator ROBERTS: That wasn’t my question. My question was: why are we not provided a list of what the commissioner considers worthy of a takedown notice and a breakdown of the reasons why? 

Ms Inman Grant : We provide as much transparency as we can. You would understand that confidentiality is incredibly important. We can’t describe these in great detail. We can’t name names. What kind of information do you think would be helpful to your understanding? That’s something that we can certainly look at in the interests of transparency. 

Senator ROBERTS: The specific behaviours, without breaching confidentiality, would be helpful. We wouldn’t expect you to breach confidentiality or name names—certainly not—but we would like the types of actions that the commissioner thinks worthy of a takedown notice, as I said, and the reasons why. 

Senator McAllister: The commissioner and I are trying to understand, with a little more precision, what sort of information. You’re simply saying a generalised list of examples that are deidentified— 

Ms Inman Grant : Of 40,000 complaints we receive annually. 

Senator ROBERTS: You’re dealing with them, so presumably you know what they are. I’d like to see some sort of classification so that people could understand the proportions, because at the moment I don’t think you’re accountable for that. 

Ms Inman Grant : We can take that on notice. We would have to look at privacy and confidentiality. We would also have to look at resource implications and how that might serve the public interest, but we’re happy to take a look at that. 

Senator ROBERTS: I think the people have a right to know. Referencing unofficial takedown notices, which I note are issued under section 183(2)(zk), these go to the question of your secrecy. If these are dangerous enough to require a takedown, then they should be dangerous enough for you to list out by making the register of takedown notices public knowledge—that’s what I was getting at. Otherwise, you’re simply exercising power without any accountability, power that can be abused. How would we know? Can you, Commissioner, point to one terrorist act you’ve prevented, one person you’ve deradicalised or one benefit to Australian society from the money you have spent on your campaign against extremist material? 

Ms Inman Grant : I go back to what D-G Burgess often says, ‘You’re never congratulated when you stop something from happening.’ Again, do we have to have more heartbreaking examples of, like I just explained to you, what happened with those three little girls murdered in Southport, UK? We’ll never know. What I do know is I have parents coming up to me and saying: ‘You’ve saved my son’s life. He was sexually extorted. He had just turned 18. He went to the police; no-one would help him. I wasn’t going to let it go. I found your website. Your investigators supported him, got the content down, gave him advice and sent him on to mental health support services.’ So I do know that we’re saving lives every day. 

How many cases of 12- and 13-year-old girls being cyberbullied and bullied do you need to prove that this is a veritable epidemic and that young people are losing their lives? We’re here to help them and to prevent that from happening. My biggest regret, if there is one, is that more people don’t know about us. Only about 40 per cent of the Australian population knows about us, but we do everything we can to help people. When we stop helping people and making the online world a safer and better place, then, yes, it’s time to hang up our hats, but we’re just getting started. 

Senator ROBERTS: With due respect, Ms Inman Grant, you didn’t answer my question— 

CHAIR: Senator Roberts, we have to rotate the call. There are a lot of senators who wish to ask questions. 

Senator ROBERTS: I just want to clarify that one. 

CHAIR: I can come back to you, if you wish. 

Senator ROBERTS: It’ll only take a second to do this. 

CHAIR: Go on then. 

Senator ROBERTS: I asked, ‘Can you point to one terrorist act?’ I accept you’re doing a good job. You’re preventing child abuse, no doubt about that. We’ve discussed that in the past. Can you point to one terrorist act you prevented, one person deradicalised or one benefit to Australian society from the money you have spent on your campaign against extremist material? That’s what I want to know. 

Ms Inman Grant : We’re not going out into the public asking young people if they saw a particular video that radicalised them or not. We do know when people have been radicalised by content that has been online. Some of the gore content that we’ve taken down includes the manifestos, the horrific imagery of people at Christchurch huddling in the corner while being shot. Anything that’s dehumanising that we are able to get down, so as not to cause further pain to victims and their families and not to incite others into taking the same action, is, I think, worth doing. I don’t need proof that I prevented this, that or the other from happening. We’re trying to make the internet a safer, more positive place with less violent extremist material, and that’s why we take these issues so seriously. 

Senator ROBERTS: My concern is with— 

CHAIR: We’ll go to Senator Darmanin— 

Senator ROBERTS: I’ll put one more question on notice. 

I had a fantastic time chatting with Brodie Buchal on The Right Side Show! We dove into a range of topics, from Australian politics to the heated debate over the Under 16’s social media ban bill. We also tackled the lack of accountability in government processes and so much more.

The Labor-Liberal Uniparty has been advancing this bill based on a case where bullying on social media led to a tragic suicide. In submissions on this bill, it became apparent that banning children from social media would cause as much harm as good. The best response to these tragic cases would be to empower parents to better manage their children’s use of social media. This can be achieved by enhancing parental lock technology, making it more powerful, easier to use, and free (the best apps available are commercial). The government ignored concerns raised by experts in their submissions and testimony, and pushed ahead with a bill that introduces a blanket ban for under-16s.

Let’s be clear – this is a ‘world-first’ because the rest of the world knows such a ban is counterproductive.

Tech-savvy kids will get around the ban, and that’s where the real harm begins. The ban does not cover chat rooms in video games, which lack the supervision present on social media platforms. Peer-to-peer chat apps are making a comeback, and some children may even turn to TOR, which is not supervised at all and, by its design, is almost impossible to supervise. This bill will end up exposing kids to even worse forms of bullying.

One Nation and the Greens united to stop Labor’s guillotine. We forced the government to remove the bill banning under-16s from social media and extend scrutiny until February. Then, incredibly, the Liberal Senate leader, Simon Birmingham, moved to put the bill back into the guillotine process. Mere hours later, Simon Birmingham informed the Senate that he was leaving. It’s clear he knew he was leaving and that this was his parting gift.

I want to thank Senators Alex Antic and Matt Canavan for crossing the floor to vote against the Liberal-Nationals-Labor guillotine.  

One Nation will continue to fight against the social media ban, returning power to parents and families.  

Included are comments around Digital ID, which—despite claims to the contrary—will inevitably become part of this outrageous power grab.

Transcript

My remarks are directed to the minister but also to people listening at home to the Senate and to researchers and historians who will look back at this vote today in an attempt to understand what the hell the Senate was thinking. The amendment the government circulated, no doubt with the approval of the Liberal Party, answers that question. The Online Safety Amendment (Social Media Minimum Age) Bill 2024 can act to force every Australian to be the subject of a digital ID in the name of keeping children safe—and that’s what my question is about. 

The government accepted widespread public concern that the bill was designed to force everyone to get a digital ID and promised to include an amendment to specifically rule that out. In this government amendment that you’ve moved, SY115, new provision 63DB(1) excludes use of government issued identification or use of digital ID. That is great, except 63DB(2) provides that, if social media platforms can come up with an alternative means of assessing age that does not involve digital ID or government documents, they can—wait for it—accept digital ID as identification. In effect, this amendment specifies that a social media platform cannot use digital ID by itself but it can use digital ID as part of a more comprehensive verification. There’s no need to guess what that could be; this bill contains the answer: age-assurance software. The company which has been awarded the tender for the age-assurance trial is a British company called Age Check Certification Scheme, whose main business is provision of digital IDs backed by age-assurance software. 

TikTok has used age-assurance software to remove one million underage accounts from TikTok in Australia. This software can tell if a person is, for instance, under 12. That’s useful. The smaller the gap between the user and target age—16 in this case—the less accurate it is. This software can’t tell age within six months, and there’s no way of knowing a person turned 16 on the day of their application. You just can’t tell that from a face scan. Accessing social media on your 16th birthday and, most likely, for months afterwards will require a second identifier containing the child’s facial scan and their date of birth, which is a digital ID, which this company specialises in. You’re setting them up. 

I have criticised this bill as an opportunistic attempt to capitalise on the public desire for better regulation of social media to force all Australians to get a digital ID. I’ll say that again. I have criticised this bill repeatedly, as have others, as an opportunistic attempt to capitalise on the public desire for better regulation of social media to force all Australians to get a digital ID. This amendment requires a change in my language, which is now that this bill is an opportunistic attempt to require every child, once they turn 16, to get a digital ID if they want to access social media. What age does the government’s digital ID start from? Sixteen. What a coincidence! This wasn’t the intention all along? That’s misinformation. 

This amendment exposes the original intention of the bill, which was hidden in what looked like a poorly drafted bill. It wasn’t poorly drafted; it was deliberately dishonest, and the short committee referral, which the government fought against, has exposed the deceit. The truth is now out there, and the decision before the Senate is a simple one. A vote for this bill is a vote to require every child to get a digital ID on their 16th birthday. 

Compulsory digital IDs aside, there are many other reasons not to pass this bill. I will now share with the Senate and with posterity the words of Australian Human Rights Commission on the bill. One Nation fully supports the commission’s position, which deserves to be included in the Hansard record of the debate: 

Social media is a vital platform for young people to share their ideas and opinions, engage in dialogue, and participate in social and cultural activities. It can be a valuable educational tool by providing access to diverse perspectives, news and learning opportunities, as well as vital information about health, well-being and safety. A blanket ban risks unjustly curtailing these freedoms. 

Social media is integral to modern communication and socialisation. Excluding young people from these platforms may isolate them from their peers and limit their ability to access much-needed information and support. This is particularly important for young people from marginalised, vulnerable or remote communities. 

These are the words of the Human Rights Commission. 

The social media ban will rely on effective age assurance processes being adopted, which means that all Australians may be required to prove their identity in order to access social media. This may potentially require all Australians to provide social media companies with sensitive identity information, which poses a risk to our privacy rights in light of recent examples of data breaches and personal information being stolen. 

Technological workarounds – such as VPNs and false age declarations – may undermine the effectiveness of the ban. Additionally, a ban will not address the root causes of online risks or make the platforms safer for everyone. 

The workarounds to this measure have not received enough debate. The bill carves out gaming sites, many of which have a chat feature. Children will move over to chatrooms and gaming sites which are not supervised. Tor—or, more accurately, onion routing—will provide another avenue for communication which is designed to make supervision exponentially harder than on mainstream social media platforms. I have advice from a leading internet security company that peer-to-peer social media, which again is harder for parents to supervise than current social media platforms, is making a comeback. As a result of this legislation, children will be exposed to more harm, not less. I had a call from a constituent— 

Senator Hanson-Young: You are right. 

Senator ROBERTS: It's not often Senator Hanson-Young tells me I'm right. A moment ago, I had a call from a constituent who had called their local Liberal member of parliament about this bill and was told, 'Oh, it's okay; you can just sign up for your children.' With age-assurance software, that will not work. With a digital ID connected to age-assurance software, the social media platform will know what you're doing. Don't be telling people: 'It's nothing. You can defeat it. You can still talk to Grandad on Facebook.' You won't be able to. Children may be able to use VPNs, virtual private networks, and the new PPNs, personal private networks, to appear to be in another country. That really won't work either. The keystroke logging that accompanies the age-assurance software will assume someone pretending to be in Canada but interacting with Australian accounts is probably using a VPN. 

Minister, why did you say that this won’t lead to Digital ID when your amendment says exactly that? 

Today, the Senate held a committee hearing on the Online Safety Amendment (Social Media Minimum Age) Bill 2024. This expedited inquiry was scheduled with just one day's notice, as the Liberal and Labor parties want to rush this legislation through. The first witness, Ms Lucy Thomas OAM, CEO of Project Rockit, delivered six minutes of the most relevant, heartfelt, and inspirational testimony on the issue of censoring social media for those under 16. Her insights demonstrated the benefit of lived experience.

Before taking a position on this bill, take the time to listen to her testimony.

Transcript

Senator ROBERTS: Thank you all for being here. Ms Thomas, there are harms and benefits at school, and there are harms and benefits in life generally. Claude Mellins, professor of medical psychology in the Departments of Psychiatry and Sociomedical Sciences at Columbia University, stated: 'For young people, social media provides a platform to help them figure out who they are. For very shy or introverted young people, it can be a way to meet others with similar interests.' She added: 'Social support and socializing are critical influences on coping and resilience.' They provide an important point of connection. She then said in relation to Covid: 'On the other hand, fewer opportunities for in-person interactions with friends and family meant less of a real-world check on some of the negative influences of social media.' Isn't the professor making an important point? It's not about stopping real-world interactions; it's about balancing social media with real-world interactions. Isn't it about a balance, not about prohibition? Isn't it also the fact that parents, and not governments, are best placed to decide how their children develop?

Ms Thomas: Thank you for the question. I think you're speaking to that idea of balance that a lot of us have been trying to refer to. We are acutely aware of the harms, and I think they're beautifully captured in that quote, and acutely aware of the risk that we may create new harms by cutting young people off. I think this is a really important point, and I'd like to give you one example, a quote from a young person, Rhys from Tamworth, who commented: 'Social media has helped me figure out and become comfortable with my sense of self, as there is a large community that is able to connect me with people all over the world. Living in a regional area, it's difficult to find people dealing with the same personal developments, and social media really helped.' This is beyond just direct mental health intervention; this is about finding other people like you. This is about finding spaces where we can affirm ourselves, use our voices and mobilise around actions that we care about, just like we're doing here today. I'd love to point out that the Office of the eSafety Commissioner has done some fantastic research into the experiences of specific groups—those who are First Nations, LGBTQIA+ Australians, and disabled and neurodivergent young people. All of these groups face greater hate speech online. Actually belonging to one of those communities, I can say that we also face greater hate speech offline. What was really important is they also found that young people in these communities that already face marginalisation are more likely to seek emotional support—not just mental health support, but connection, news and information, including information about themselves and the world around them. So I take your point.

Senator ROBERTS: Thank you. I have another quote from Deborah Glasofer, Associate Professor of Clinical Medical Psychology at Stanford University:

Whether it’s social media or in person, a good peer group makes the difference. A group of friends that connects over shared interests like art or music, and is balanced in their outlook on eating and appearance, is a positive. In fact, a good peer group online may be protective against negative or in-person influences.

Is this bill throwing out the good with the bad, instead of trying to improve support in digital media skills to allow children and parents to handle these trials better?

Ms Thomas: I think there is a risk of that, yes. I think we really need to, in a much longer and more thorough timeframe, interrogate and weigh up all of these risks and unintended possible impacts. I’d like to draw another quote from Lamisa from Western Sydney University. You spoke about influencers; we tend to imagine those being solely negative. Lamisa says: ‘Social media has given me creators who are people of colour, and I think it has really allowed me to learn that I don’t have to justify my existence, that I am allowed to have an opinion and that I am allowed to have a voice about who I am.’ So I absolutely think that there is a risk that we’ll throw out these experiences; in our desire to protect people, we create unintended harms that they have to live with.

Senator ROBERTS: I just received a text message from someone in this building, a fairly intelligent person, and he said: ‘I was born with a rare disorder. I spent more than four decades feeling isolated until I discovered people with the same disorder on social media. This legislation would prevent people under 16 from linking with the communities online that can provide them with shared lived experience.’ What do you say?

Ms Thomas: I’m going to give you one more quote. I’m aware that young people aren’t in the room, so I’m sorry I’m citing these references. Hannah from Sydney says: ‘Where I struggled in the physical world thanks to a lack of physically accessible design and foresight by those responsible for building our society, I have thrived online.’ The digital world has created so much opportunity for young people to participate and fully realise their opportunities. We just need to be very careful.

I know in talking about all these benefits, I’m probably going to receive an immediate response about some of the harms. I’m not here to say that harms don’t exist. They do. If anyone is aware of them, it’s me. I’ve been working in this space for 20 years. I started Project Rockit because I wanted to tackle these issues as a young person fresh out of school. We know they’re there, but we have to be very careful not to impact these positive benefits young people face.

Senator ROBERTS: Ms Thomas, isn't there very important access to parents and grandparents on social media for their support and experiential interaction? A lot of children interact with their parents and grandparents through social media.

Ms Thomas: Am I allowed to answer this one?

CHAIR: Yes.

Ms Thomas: I think one of the big, grave concerns around implementation and enforcement is that it won't just be young people who need to verify their ages online; it will be every Australian. The methods available, every Australian sharing their biometric data or presenting a government-issued ID, are going to pose challenges for those Australians that you are talking about—older Australians who are already facing higher rates of digital exclusion and those from marginalised communities. Absolutely, this is a vital tool for grandparents and kids, for intergenerational play and learning, and we risk cutting young people off but also cutting older people off.

This is the third and final session on the Online Safety Amendment (Social Media Minimum Age) Bill 2024, also known as the Under-16s Social Media Ban, an important piece of legislation being waved through by the Liberal and Labor parties with minimal debate. The Department was called to explain the bill, which of course they defended with responses that would not hold up under closer scrutiny. If only senators had the time to apply that scrutiny.

Several serious revelations emerged during the Department’s testimony, including this little pearl: it’s better for foreign-owned multinational tech platforms to control children’s internet use than for parents to supervise or manage their children’s social media and online interactions. One Nation strongly disagrees.  

I also raised concerns about the YouTube exemption, which is worded in such a way that it could apply to any video streaming site, including pornographic sites. The Department’s response was to point to other regulations and codes that “supposedly” protect children from accessing porn.   What utter nonsense! Any child in this country without a parental lock can access Pornhub by simply clicking the “Are you over 18?” box. Teachers nationwide report that even primary school students are being exposed to and influenced by pornography. If this bill accomplishes anything good, it should be to prevent children from accessing pornography, which it deliberately avoids doing.  

This bill claims to be about many things – keeping children safe is not one of them.

Transcript

Senator ROBERTS: Thank you for appearing today. Could you please explain the provisions around exemptions for sites that do not require a person to have an account, meaning they can simply arrive and watch? An example would be children watching cartoons on YouTube. What’s the definition here of a site that can be viewed without an account?

Mr Irwin: I guess it goes to the obligation around holding an account, or having an account, which relates to the creation or the holding of an account. So if there is any process—

Senator ROBERTS: Is it the creator’s responsibility?

Mr Irwin: Sorry?

Senator ROBERTS: Is it the creator’s responsibility? Is the account the creator’s responsibility?

Mr Irwin: No, all responsibility is on the platform. If a platform under this definition has the facility to create an account and/or has under 16s who have an account on there already, then they will have to take reasonable steps.

Senator ROBERTS: What's the functional difference in your definition between YouTube and Pornhub?

Mr Chisholm: One contains restricted content that is prohibited from being accessed by children under law. Pornhub is a pornographic website.

Senator ROBERTS: I understand that.

Mr Chisholm: YouTube has a whole range of information, including educational content and a range of information that doesn't really match up with a site like Pornhub.

Mr Irwin: That was the second limb of the age-assurance trial: looking at technologies for 18 or over, looking at pornographic material for age assurance. That also goes to the matter of the codes that DIGI were talking about before. Those codes relate to access to particular types of content including pornographic content.

Senator ROBERTS: Let me try and understand—

Mr Chisholm: The design of Pornhub is to provide pornographic material to people who are permitted to watch it. That’s the difference.

Senator ROBERTS: I guessed that, but I asked for the functional difference. Pornhub is 18-plus, but apparently you don’t have to prove it. Could you show me where in the legislation, in this child protection bill, you’re actually including porn sites?

Mr Chisholm: There are separate laws in relation to pornographic material, which we can step you through. This bill is more about age limits for digital platforms, imposing a 16-year age limit for digital platforms. There are other laws that prohibit access to pornographic material online including the codes process and classification system.

Mr Irwin: That’s correct.

Senator ROBERTS: What’s required for someone aged 16 or 17 to get access to Pornhub?

Mr Irwin: That’s subject to the codes that industry is developing right now, which DIGI talked about, in terms of what specifically is required. There is also a whole system of classification laws that are designed to prevent access to adult content by children. On top of that, there’s the eSafety Commissioner’s administration of things like basic online safety expectations and the phase 2 codes that are under development.

Senator ROBERTS: I’m glad you raised that because I was going to raise it. You exempt gaming sites because they already carry age recommendations. In fact, some video game sites are MA 15+; they’re not 16-plus. What will have to change? Will it be your bill or the MA 15+ rating?

Mr Chisholm: The bill doesn’t require them to change—

Ms Vandenbroek: Nothing will change.

Mr Chisholm: because gaming isn’t caught by the new definition. There’s nothing that requires gaming systems to change.

Senator ROBERTS: So social media is 16-plus, but video games are 15-plus.

Mr Chisholm: The policy here is to treat games as different to social media. For some of the reasons we talked about before, they are seen as a different form of content consumption and engagement to social media.

Senator ROBERTS: Doesn’t this indicate to people that this bill’s intent is not about what the government says?

Mr Chisholm: No, the bill is definitely about what the government says. It imposes a firm age limit of 16 on account creation for social media for all of the concerns and reasons outlined about the damage that’s being done to under-16s through exposure to social media. Games are also subject to classification rules, so they have their own regime they have to comply with now.

Mr Irwin: They’re subject to the broader Online Safety Act as well.

CHAIR: Senator Roberts, I’ll get you to wrap up.

Senator ROBERTS: I have a last question. I understand that there are parental controls that parents can buy—they’re sometimes free—in the form of apps that watch over what children are watching. What alternatives are already available for parents to control children’s social media and control their exposure? Did you evaluate them, and why don’t you just hand the authority back to where it belongs—to parents—because they can do a better job of parenting their child than government can?

Mr Chisholm: The very strong feedback that we received from parents during this consultation is that they do not want to bear the burden or responsibility of making decisions that should be better reflected in the law. At the moment, parents often refer to the 13-year age limit that’s part of the US terms of service—

Mr Irwin: For privacy reasons.

Mr Chisholm: for privacy reasons, that apply in Australia. That's often used for parents to say to their children, 'You can't have a social media account until you're 13.' It's really important for parents to point to a standard law, an age limit, that will apply to everybody. It's also feedback we've received from a lot of children. They would rather have a universal law that applies to all children under the age of 16 instead of a situation where some children have it and some children don't, and where all of the harms that we're aware of from exposure to social media continue to magnify. We also don't want a situation where there is any question that parents have some legal responsibility in relation to an age limit. The very strong view of the government is that that responsibility should be borne by the platforms, not parents.

Senator ROBERTS: We’re not going to have—

Mr Chisholm: The platforms are in a much better position to control their services than parents are.

Senator ROBERTS: So we want to put parenting in the hands of social media platforms?

Mr Chisholm: The parents have said to us that they have a very strong view that they want a 16-year age limit, and that the platforms are better placed to enforce that because it is their platforms.

Senator ROBERTS: How much notice did the parents get to give their comments? Because we got 24 hours' notice of the closing of submissions.

Mr Irwin: We’ve been consulting, and I will add we do have evidence that 58 per cent of parents were not aware of social media parental monitoring, and only 36 per cent actually searched for online safety information.

Senator ROBERTS: So wouldn’t it be better to educate the parents?

Mr Chisholm: We are educating parents, too. That’s part of the digital literacy and other measures we are undertaking. Education is important, but it’s not enough.

Senator ROBERTS: I meant educating parents about the controls already available to keep the control over their children in parents’ hands, not usurping it and putting it in the government’s hands.

Mr Chisholm: I think it comes back to the point that we’ve made that the very strong view here is that platforms should bear the responsibility for imposing or following an age limit, not parents, who don’t have as much information about how these platforms operate as the platforms themselves.