
I questioned the Commissioner regarding her September trip to Stanford and meetings with US tech firms. She will provide a detailed log of her itinerary, speaking engagements, and total costs on notice. Australians deserve to know exactly how their money is being spent and what is being discussed behind closed doors.

I then queried the Minister regarding concerns raised by US House Judiciary Committee Chairman Jim Jordan about the Commissioner’s conduct. While I support protecting children from harm, we must be vigilant when unelected officials are labelled “extreme” by international peers.

Lastly, I was interested to know what the Commissioner’s philosophy was regarding censorship, noting the “enormous power” that has been given to her. She denied being a censor, stating she only acts on public complaints regarding “highly damaging” and “refused classification” material, specifically excluding political speech.

The eSafety Commissioner has enormous power over what you see and say online. I will continue to hold this agency to account to protect the rights of adult Australians from government overreach.

P.S. At one point during this session, Senator Green accidentally called me “Minister” – saying “maybe one day, if the LNP has their way.” She even joked that One Nation is already writing policy for the LNP! 😆😆

— Senate Estimates | December 2025

Transcript

CHAIR: Senator Roberts, I understand you have a few more questions.

Senator ROBERTS: Yes, just three. Commissioner, you visited Stanford University in September this year as part of a USA trip. Did Australian taxpayers fund that?

Ms Inman Grant: Yes, I went, and I met with eight of the AI companies and the social media companies. Then I spent a day and a half at the Trust and Safety Research Conference.

Senator ROBERTS: Could you please provide a log of meetings and a record of your speeches, or any other documentation, to assure taxpayers that their money was spent appropriately, as well as the total cost of the trip?

Ms Inman Grant: I sure can.

Senator ROBERTS: On notice.

Ms Inman Grant: Yes.

Senator ROBERTS: Thank you. You’ve already answered a question from Senator Whitten about the House Judiciary Committee chairman wanting you to testify, so I don’t need to cover that. Minister, does it concern you that your commissioner is engaging in conduct that is so extreme that the US Congress, specifically the House Judiciary Committee chairman, Jim Jordan, is alarmed?

Senator Green: Minister, I think the eSafety Commissioner’s address—

Senator ROBERTS: I’m not a minister.

Senator Green: Sorry, Senator—maybe one day, if the LNP has their way.

*Senator Henderson interjecting—*

Senator Green: You never know. They wrote your net zero policy, so you never know. We are very proud of the reforms that we are undertaking. To be fair, I’m sure the coalition was very proud of the steps that they took in terms of online safety when the eSafety Commissioner was established. For the most part, we have had bipartisan support for these types of reforms, because they keep Australians safe. The social media ban or minimum age will seek to keep our children safe. It’s incredibly important. I know you come in here quite often talking about the safety of children and wanting to keep harmful material away from them. That is the work of the eSafety Commissioner. It’s open to other governments or other people in other parliaments to have their judgment of it, but from an Australian government point of view we are very proud of the work that she does.

Senator ROBERTS: Commissioner, you said earlier, in roughly these words, that you’ve never claimed to censor the net globally. Why do you think people think this?

Ms Inman Grant: We talked about Elon Musk’s tweet that said she’s the eSafety commissar trying to globally regulate the internet, and then Ben Fordham picked it up, and it’s just had a life of its own.

Senator ROBERTS: I’ve complimented your office on its work in protecting children, quite clearly. There are other concerns we have with your work because it can cause consequences for adults that we don’t like, but it’s not appropriate to discuss it here. What’s your philosophy on censorship?

Ms Inman Grant: My philosophy is I’m not a censor. I respond to complaints from the public. We received many about the Charlie Kirk assassination and about the stabbing of Iryna Zarutska on a train where she bled to death and the decapitation of the Dallas hotel owner. If you think that that’s overstepping when that’s something that’s highly damaging and was determined—

Senator ROBERTS: No, I didn’t say that. I was wanting to know your thoughts on censorship—that’s all—because you’ve got enormous power.

Ms Inman Grant: My thoughts on censorship? Well, what has been helpfully built into the Online Safety Act is that we’re not regulating for political speech or commentary. It’s where either online invective or imagery veers into the lane of serious harm. You provide us with thresholds. Sometimes those thresholds are tested and sometimes they’re a grey area, but I think we help thousands of people every year. We’re doing world-leading work that the rest of the governments around the world are following. I think we’re punching above our weight. We’re a very small agency given the size of our population. So I guess I don’t have a view. I don’t see myself as a censor. I don’t tell you what you can or can’t say unless it’s refused classification or it’s trying to silence someone else’s voice by targeted online abuse that reaches the threshold of adult cyberabuse.

Senator ROBERTS: Thank you. Lastly, I think it was Mr Fleming who invited us to have a briefing. We haven’t forgotten. We’d like to do that, but we’ve been a bit busy. We will do it one day.

Mr Fleming: Maybe in the new year. The offer still stands.

Senator ROBERTS: Thank you.

During Senate Estimates in December, I asked the eSafety Commissioner why social media platforms like X are being targeted, while Bluesky — a known hangout for the left — seems to be getting a free pass.

The Commissioner claimed there’s no “political bias” and that Bluesky has not been exempted – they’re just focusing on where the most kids are. She called Bluesky a “young company” that’s still finding its feet. It looks like a double standard to me — conservative platforms get targeted, while ‘left-wing hangouts’ get a free pass for being ‘low risk.’

Government shouldn’t be picking winners and losers based on politics. We need transparency, not a “dynamic list” that changes whenever a bureaucrat feels like it. Whether it’s the Labor Party or the Coalition, Australians are sick of the double standards and the “Big Brother” tactics.

I’ll keep speaking up to make sure your voice isn’t silenced by bureaucratic overreach. We need one rule for everyone, and total protection for our free speech.

— Senate Estimates | December 2025

Transcript

CHAIR: It wasn’t my intention.

Senator ROBERTS: No, I know that. Thank you for appearing again. I have, perhaps, an insight. Since COVID, people in Australia are very wary of government. That’s not just the Labor Party; that’s both. Commissioner, you have exempted Bluesky from your under-16 social media minimum-age restrictions, yet Bluesky is almost identical to X, as I understand it. It currently allows 13-year-olds or younger people saying they are 13 to sign up, and they have no age verification. Do you understand, Commissioner, that you have an obligation to discharge your duties without the perception of political bias? Your decision to exempt a left-wing hangout and to include a conservative hangout, X, looks like political bias.

Ms Inman Grant: Bluesky has not been exempted. They present a very low risk. They have actually identified themselves as an age-restricted social media platform. They probably have 50,000 Australian users—a very small number of young users. They’re building up their age inference tools. They’re a very young company. What we’ve decided to do—we’re talking to a range of companies that could be age restricted social media platforms, whether it’s Yubo, Yope, Lemon8 or other ones that we know we’re going to go to. But you missed the opening statement, where I said our focus—these assessments that we’re doing are voluntary. I don’t have specific declaratory powers in terms of who is in and who is out, so I can’t say anyone is exempted. It’s up to the legal teams of those companies to determine whether they’re in or out. Where we will focus our compliance is where the vast majority of young people are. For the purposes of transparency, fairness and due process, we developed the self-assessment tools. Then we did some initial assessments so that we could at least have a body of major companies that would fit the criteria set forth by parliament. We’ve got 10 that we’re starting with, but I’ve always said that this will be a dynamic list. If we see that there are significant migratory patterns with young people that are going over to Bluesky—again, we’ve had three conversations with them—we expect that they will start applying some of their age assurance tools. They’re just at the beginning of that journey.

Senator ROBERTS: So what you’re saying, Commissioner, as I interpret it, is that you’ve got objective criteria that you assess platforms against.

Ms Inman Grant: We developed a self-assessment tool, so there are consistent assessment criteria. The criteria we have to use are the criteria that was in the legislation that parliament passed. That primary test is around whether or not a particular site—if it didn’t meet an exclusion, say, the messaging exclusion, the online gaming exclusion or the education and mental health exclusion, we had to do a sole-and-significant-purpose test. If its sole or significant purpose was online social interaction, then our preliminary view—it is not a determination—was that they were an age restricted social media platform.

Mr Fleming: Senator, just to give you a pointer, it’s in section 63C of the Online Safety Act. The criteria are set out in the act, and, if someone meets those criteria, there are a set of rules that the minister made. If the rules apply to that platform, then they’re out of the scheme. That’s how it works. To reinforce the point the commissioner made, there’s no determination that platforms are in or out. We’ve just expressed our preliminary view based on our assessments against the criteria, like the platforms can do for themselves. Then we focused on where most of the kids are, and that’s where we’re going to focus our initial efforts.

Senator ROBERTS: Thank you. Minister, your government chose to use legislation against social media platforms. However, the commissioner has then included search engines in the scope of age restrictions, using an industry code under the Online Safety Act. Couldn’t you have simply done the whole thing under existing powers and created an industry code of practice, mandatory if necessary, for age control of social media instead of this whole blunt instrument legislation—an industry code as opposed to enforcement?

Senator Green: I’ll let the eSafety Commissioner answer because there would have been advice given to government about the best way forward. This is a very important step forward that we’re taking, and legislation was required.

Senator ROBERTS: Well, I asked you because I’m not allowed to ask—

Senator Green: No, you are allowed to—

Senator ROBERTS: for an opinion of an officer.

Senator Green: No, it’s not an opinion.

Senator ROBERTS: If the minister wants you to, that’s fine.

Senator Green: You’re asking why legislation was required. They can answer that question.

Ms Inman Grant: The industry codes were included in the Online Safety Act of 2021 under the then coalition government. What they decided was that they would split the technology industry into eight different sectors, from search engines to social media sites to ISPs to some broader categories, including the designated internet services and relevant electronic services. What Paul Fletcher, who was my minister at the time, decided was that he wanted to continue the tradition of co-regulation that had existed for many years across telecommunications and ensure that the industry developed the codes. We would decide whether or not they met appropriate community safeguards. If they did, we’d register them. If they did not, then I would create standards, and that would be a disallowable instrument that would require additional parliamentary scrutiny. It took 4½ or five years for all this deliberation, for this to happen. In most other jurisdictions, the regulator writes the code, but, with respect to the search engine code that I think you’re referring to, I don’t know if you missed the interaction I just had with Charlotte Walker—

Senator ROBERTS: I did.

Ms Inman Grant: They were written by Google and Bing, and they pretty much codify safe search practices that are used today. So, come 27 December, if you’re searching the internet and you come across violent pornography or explicit violence, it will be blurred. This is because 40 per cent of kids tend to come across this kind of violent content. The search engine is the gateway, and it’s unexpected, it’s unsolicited and it’s in their face. If you’re an adult and you want to continue through, you can do that. You only have to be age verified if you decide to search the internet with a Google account on, for instance, and a lot of families may choose to have a Google account on so that they can have different age-appropriate settings set up. But, if you’re concerned about it, you just use DuckDuckGo, Bing or whatever other one. The other thing that I think is really important about the search engine code is that, if there’s a person in distress who is seeking to take their life, rather than the search engine taking them directly to a lethal-method site, it will redirect them in the first instance to an Australian mental health support provider. We all know that suicide is a terribly damaging thing for families and communities. So, if we can give someone in distress the support that they need rather than the directions in terms of how to take their life, any family would be grateful.

Senator ROBERTS: I’m sure they would. X currently—

Senator Green: I’m sorry, Senator, I misunderstood your question at the beginning. I thought you were asking about the minimum age legislation, so I apologise.

Senator ROBERTS: That’s alright.

Senator Green: I understand now what you were asking, and the eSafety Commissioner has given a very good answer.

Senator ROBERTS: X currently has, in early deployment, routines which do the following: pattern matching to determine age without the use of personal identifiers, such as a digital ID; pin protected parental controls—I tend to think government should not be undermining parents—to allow parents to set guardrails for their children on content that will be granulated to individual accounts, keywords or topics; and interaction monitoring to identify what could be harassment based on the pattern of posting, the words used and the ages of the people involved to stop offending posts being seen by anyone but the poster. If industry can do this by themselves, why did we need legislation? Why wasn’t a simple code of practice used instead of this ‘big brother, big stick’ drama?

Ms Inman Grant: Is that a question? I would just say in response—

Senator ROBERTS: It looks like the platforms are developing new technology.

Ms Inman Grant: I would just say that we had a very constructive meeting with X. They walked us through a number of the tools. They did say they were going to use age assurance with Grok, which could have some interesting outcomes. But a large number of parents don’t utilise parental controls. Sometimes it’s because they’re too difficult for parents to find or to work. This was a bipartisan act that the parliament obviously started. The momentum started in South Australia and then in New South Wales. But my view, after talking to so many of the ministers, the Prime Minister and the opposition leader, who supported it, was that they wanted to do something monumental. They wanted to create a significant normative change.

Senator ROBERTS: That’s what scares us.

Ms Inman Grant: One normative change that isn’t scary, I would think, is that we know that 84 per cent of eight- to 12-year-olds already have social media accounts, and, in 90 per cent of cases, parents have helped them set them up. Why? Because they wanted them to be exposed to harm early? No. It’s because they’re concerned that their kids’ friends are all on the sites and their kid will be excluded. What this change does is delay them from being exposed to all the harmful and deceptive design features. They can also sit down with their kids and say: ‘Hey, you’re not ready for this. You’re not going to be on it and your friends shouldn’t be on it either.’ So it takes the FOMO, exclusionary element out of it, and this is what we’ll be measuring.

Senator ROBERTS: So the government excludes them instead of their friends? It should be the parents, shouldn’t it?

Ms Inman Grant: They’re setting a standard like you’d set a drinking age or the age for cigarettes. They’re setting an age for social media that they think is the right age and—

Senator ROBERTS: Let’s move on. Commissioner, the search engine code included a grace period of 12 months to allow search companies to write their code to comply. As I just indicated, social media companies are close to a technological solution that will also solve their compliance. Will you allow a grace period to allow social media companies to properly write, test and deploy age-verification technology in an orderly manner—in other words, delay?

Ms Inman Grant: We’re following the letter of the law, but what we’ve said is that we are looking for systemic failures. We don’t expect accounts to immediately disappear overnight. We also have another requirement beyond the deactivating of the under-16 accounts on 10 December, which is preventing under-16s from creating accounts. We accept that that’s going to be a longer-term journey for a lot of these companies, and many that we’re talking about here already have very sophisticated age-inference tools or AI tools. Some of them will be supplementing them with third-party tools that have been tested with the age assurance technical trial. Again, they’re taking a layered approach. We will watch closely. If they have glitches, we’ll talk to them about it. What we care about is that they’re clear with us about the tools and the success of validation or the layered approach they plan to take. If it’s not working, the other requirement is continued improvement, which the technology is doing every day. So in some ways we will be providing a grace process.

Senator ROBERTS: It seems that you accept that this rushed introduction with insufficient time for social media companies to get the software right, with no time for testing and very little public education, could be a recipe for chaos.

Ms Inman Grant: I think they’ve had plenty of time and they’re all technically capable of achieving this.

CHAIR: Senator Roberts, noting the time, we’re due to take a short break. Do you have a final question? Then we’ll take a break and rotate the call after that.

Senator ROBERTS: Why was the decision made to time the introduction for school holidays, which is when children will be wanting to access social media to stay in contact with their friends, sports and activities?

Ms Inman Grant: It was written into the legislation.

Senator ROBERTS: It was one of the reasons we opposed it.

Senator CANAVAN: It’s killed Christmas.

Ms Inman Grant: That’s a legitimate concern. Kids are—

Senator CANAVAN: They’ll get new gadgets that they won’t be able to use.

Ms Inman Grant: Only for gaming.

CHAIR: That’s a good note.

Life will never guarantee safety. That is not an excuse to legislate danger.

Australia is set to ‘quietly’ introduce ‘unprecedented’ age verification checks for YouTube and Google as part of a wider push to gate-keep access to social media.

Quietly.

Without the consent of the Australian people.

Labor is pushing ahead despite alarming fallout from the UK, where its Online Safety Act, claimed to be created in the interests of ‘child safety’, has led to the immediate censoring of political discussion surrounding mass migration and Grooming Gangs.

What began as genuine concern for children on social media has rapidly expanded to mandatory, wide-ranging, biometric age checking across the digital landscape.

Not only here – throughout the Western world.

Australia, the United Kingdom, and the European Union have all decided that information is the enemy of political ideas.


The Coalition established the eSafety Commissioner

Liberal Leader Sussan Ley continues to support the eSafety ‘Commissar’s’ proposed restrictions on X, Facebook, TikTok, and Instagram, set to begin in December and now expanded to YouTube and Google (including Google Maps).

Failure to comply will see the imposition of extraordinary and ludicrous fines.

This is to satisfy an age verification technology whose reliability is yet to be proven. While these biometric technologies can guess at ages, they cannot reliably distinguish the ages of teenagers, and they can return different results for the same individual. How can this be the proposed basis for adult rights to digital communication?

It is only natural that, when adults find themselves unable to access essential digital services, or a 16-year-old on their birthday wants to download X, a more reliable form of identification will be sought – and that will almost certainly be Digital ID.

So much for promises that this will be ‘voluntary’ and ‘only for government paperwork and applying for rental properties’.

Most believe, logically, that the point of ‘child safety’ legislation is to force the implementation of Digital ID and perhaps begin the crackdown against VPNs.

These are policy positions that would have been rejected if it weren’t for the added layer of ‘think of the children’, just as deconstructing our energy grid required the weaponisation of screaming children gluing themselves to the road, believing they were ‘going to die’ because of fossil fuels.

It is a sickening form of confected, emotionally manipulative hysteria.

We may ask, for what other reasons have these extreme measures been placed upon the digital realm?

Especially considering YouTube is one of the most heavily regulated established platforms and Google has a fully functional adult-content setting.

Safety?

I don’t believe that. I’m sure you don’t believe that. Chalk up another Uniparty lie.

There is more going on.

While the government continues turning a blind eye to gaming chats and unregulated message boards, it clearly does not believe in child safety online. And even if the eSafety Commissioner believes in her mission to ‘protect children online’, why not wait until the under-16 social media ban comes into force in December?

There is enormous doubt about its functionality and, most assume, its public reception. It is likely to be a social disaster. Children around Australia will suddenly realise that government power extends beyond campaign slogans aimed at their frustrated parents.

The UK is experiencing a fraction of the power the Australian legislation proposes, and it is an unmitigated disaster which has been called an assault on fundamental human and civil rights. Instead of protecting children, the UK’s Online Safety Act has put them in danger because it silences information about police complicity in the Grooming Gang assaults and removes public protests about illegal migrants who have been accused of sexually assaulting young people on the street.


Censorship is creating a world where criminals and predators are protected for the sake of political harmony

This is why we say, over and over, the government cannot be trusted with censorship.

Even good intentions turn sour, and we are confronted with ridiculous scenes, such as Peter Kyle, the UK Science Secretary, accusing Reform leader Nigel Farage of being on the side of Jimmy Savile. Needless to say, Farage is demanding an apology.

‘If you want to overturn the Online Safety Act you are on the side of predators. It is as simple as that,’ Mr Kyle spat back.

This is what authoritarian governments do when they overreach – falsely frame opposition to a legislative reform as an existential threat to safety. If you don’t want your free speech rights or privacy erased, you must be supporting predators. If you think industrial renewable energy projects are a bad idea, you must want the world to burn. Or freeze. Or flood. (They’re not quite sure on that one.)

The truth is, the UK has shown us what awaits Australia in the immediate future.

Even those who dislike social media need to be concerned about the impacts on search engines such as Google and Microsoft’s Bing. Those who do not verify their age will have their results automatically filtered to child settings. This does not mean the standard ‘safe search’. No, it is instead a more complex algorithm that, we have been warned, will filter ‘harmful content’, which could simply mean a discussion on migration or whatever the government deems to be misinformation.

It might be an article from the wonderful Professor Ian Plimer challenging the United Nations Intergovernmental Panel on Climate Change.

It is via these methods of ‘child safety’ that our access to knowledge shrinks.

The government has become what the late Christopher Hitchens warned about – an entity deciding what you can read.

Did the Prime Minister mention this at the last election?

Who is responsible for subverting democracy and taking this decision away from the Australian people and our elected representatives?

This intolerable story of the erosion of rights comes down to the eSafety Commissar, Julie Inman Grant. She seems absolutely giddy at the thought of more power. I’m disgusted. Enough is enough. We need to have a talk about digital overreach and the misuse of child safety as a means to control people’s access to the digital world.

Built on the necessary precursor policies and legislation introduced by the Coalition, Labor’s assault on the digital world is so expansive and severe that it is difficult to know which argument to take into battle.

And that is the point.

Destroying the modern public forum is an essential step on the path to cementing an era of unchallenged propaganda capable of re-shaping the social conversation of Australia.

The people who founded our democracy wrote privacy into the system for a reason.


It was to protect people from the government

First, terrorism was used. Then climate change. Then Covid. Now, child safety.

All of these have been used to deceitfully chip away at privacy and free speech. It must stop. We have to draw a line in the sand and protect the internet, for ourselves and for the next generation of children who deserve to grow up in a free country and indeed, a free world.

Life will never guarantee safety. That is not an excuse for the government to legislate danger.

‘Child safety’ or deliberate political censorship? by Senator Malcolm Roberts


The eSafety Commissioner wants search engines like Google to have mandatory age verification. This will automatically censor search results.

We need to have an urgent and serious talk about the misuse of ‘child safety’ for the purposes of mass government censorship.

Using ‘child safety’ to restrict social media is a dangerous path for Australia

There are some policies so unworkable, so obscene, and so detached from reality that the public may be forgiven for thinking they will never come to pass – even after the Senate approves a bill.

Banning Australians under 16 from social media was an idea pitched by former Liberal Leader Peter Dutton and formalised by Prime Minister Anthony Albanese before the Federal Election.

Then it was forgotten…

The policy had a strange birth, following a tiny frenzy of media articles which sprang up out of nowhere describing a ‘social media bullying epidemic!’ A ‘crisis’ that vanished from the headlines once digital censorship had been cheered onto the agenda by politicians desperate to talk about anything other than energy, migration, or debt…

These articles briefly reappeared when criticism against the original bill reached its peak, painting those who dared to oppose online censorship and intrusive biometric identification as being insensitive to the ‘plight of children’.

It’s not clear who is pulling the strings.

However, on more than one occasion the media has pitched a ‘crisis’ peddled by ‘experts’ that was ‘solved’ at a politically opportune time.

Call me a cynic, but something’s up. Another agenda disclosed to a select few, perhaps?

The under 16 ban will enter the real world in December, with children already being advised to ‘download their profiles’ and delete social media apps.

Some parents believe having the strong arm of government in the living room will help, although it is more likely this interference will create an acrimonious social rift between generations that is far worse than the ‘you don’t understand my music’ sentiment.

‘I used the government to ban you from talking to your friends…’ is hardly expected to help strained relationships between parents and children.

Every generation has a desire to preserve the world they grew up with, and I understand a lot of people are hostile to social media and its uncertain future.

This is often because media entities describe the online world as a ‘sewer’. To them, X, Facebook, and YouTube represent an army of keyboard critics and free market competition.

The media present a narrow view of a sprawling advancement which has become as integral to civilisation as the roads our truck drivers use to deliver food.

Social media is one technological creation to which we must adapt – or accept – as we did with the invention of the internet itself.

Banning children from what has become a fundamental tool for future business could saddle them with a disadvantage on the global scale and deny them opportunities.

Australia’s eSafety Commissioner has moved beyond the late Christopher Hitchens’ warning about ‘who gets to decide what I can read’ into the more sinister ‘we will decide which libraries you can enter’.

While regulators might not be burning books, they are definitely smacking children who try to read them.

Worse, this expensive regulatory mess will solve nothing. It may even be used to create additional restrictions on adults as part of a larger crackdown on freedom.

Certainly, it is already spawning censorial bureaucracies to watch over us…

As the eSafety Commissioner menacingly advises, ‘We’ve only used our formal powers 6% of the time.’

The demonisation of the digital world is a philosophy the major parties share.

Opposition Leader Sussan Ley made her position clear at the National Press Club:

‘Another area that demands stronger government intervention is the protection of our children from devices and technology. We have allowed the smartest people in the world to make billions of dollars by peddling addictive technology to children and it is shortening their childhoods. Parents need government in their corner.’

Do they?

Do parents want government in their homes, holding the strap?

That is not something I hear from the community.

I’ve never had a parent lean on my shoulder and exclaim, ‘Gosh! If only we had MORE government!’

It is terrible to watch the Labor and Liberal parties treating young people like helpless sheep – herding them into government-moderated holding pens until they have endured 16 years of uninterrupted brainwashing from the education system.

Re-making bright, eager, healthy children into docile sheep.

These are children who know more about the Digital Age than every single adult drafting the under 16 ban.

Not a ‘ban’, apparently…

‘Calling it a ban misunderstands its core purpose and the opportunity it presents,’ said the eSafety Commissioner.


‘We’re not building a Great Australian Internet Firewall, but we are seeking to protect under 16s from those unseen yet powerful forces … it may be more accurate to frame this as a social media delay.’


Later she contradicts herself and adds: ‘Children have important digital rights to participation.’

This is not only about Australia. The eSafety Commissioner was adamant that this regulation was both ‘bold’ and ‘leading the world’.


‘Global collaboration is what we have to be doing. The internet’s global. We know laws are national and local and that’s why we’re the founders of the Global Online Safety Regulators Network – as we’re much stronger together. A lot of these companies are as large and wealthy as nation states so we need to band together with like-minded countries.’


A ‘United Nations’ of Digital Censors operating above government to control global speech…?

Astonishingly, that did not make it to the headlines.

Cutting Australian children off from the outside world leaves their minds to be poisoned with government-scripted paranoia. These are the fears and terrors of Parliament. Once mistrust has been sown against alternative media sources that contradict policy – only then, apparently, can young Australians be ‘safely’ released into the digital wilderness to become crusading activists policing the digital realm on behalf of the government.

This is how you create neurotic, ideological busy-bodies championing government policy.

It is not how you support young Australians in their experience interacting with and shaping the digital realm.

Listening to the eSafety Commissioner, Julie Inman Grant, give her recent speech at the Press Club in Canberra, it appears her ‘advice’ is being crafted from two positions: a grievance regarding her brief employment at Twitter (now X), and the belief that a global framework of eSafety bureaucrats should control the flow of information online.

Julie Inman Grant, who once introduced herself as the ‘censorship commissar’ (quoting Elon Musk), described her interaction with the tech giant as a ‘war’.

‘I made a strategic decision to withdraw here … let’s face it, the war is going to be much longer and more extended.’

That was in 2024.

As an elected Senator, I find it extremely concerning and distasteful to hear the eSafety Commissioner openly pitch the regulation of digital media as a ‘war’ which insinuates that social media platforms are hostile foes rather than private companies providing an extraordinary advancement of technology and – for the first time in human history – a global platform for real-time speech between the peoples of the world.

While speaking to the Canberra Press Club, the eSafety Commissioner pitched her argument by comparing social media to a beach.

‘There are indeed treacherous waters for our children to navigate, especially while their maturity and critical reasoning skills are still developing. And this is where we can learn so much from tried and tested lessons of water safety that Australia pioneered. From the backyard pools to the beach, Australia’s water safety culture is a global success story.’

Pardon me, but the eSafety Commissioner appears to be confused.

Australian children under 16 are not banned from pools and beaches. Nor are they indoctrinated into a cult of terror surrounding water.

‘A mixture of regulation, education, and community participation that reduces risks and supports parents keeping their children happily and safely frolicking in the sea. Picture any major beach in Australia and [it] will likely include the familiar sight of yellow and red flags fluttering in the breeze, children splashing in the waves, and lifeguards standing watch. Parents keep a watchful eye too, but are quietly confident in the knowledge that their kids will be okay. Not because the ocean is safe, but because we have learned to live beside it.’

Aside from the insult of using an Australian beach scene to sell censorship to children, her focus on community adult presence as a safety measure sits at odds with her comments later, where she says: ‘…the difference will be that they are grouped more with their peers rather than – you know – billions of people around the world that are adults and kids and strangers.’

Are communities good or bad?

The eSafety Commissioner doesn’t know because she cannot get her messaging straight from one breath to the next.

Too much focus has been placed on the (manageable) problems unavoidable in a revolutionary technology development and not enough said about the extraordinary benefit that comes with opening up the world’s information, opinion, debate, and minds.

Of course, there will be a period of adjustment.

For children and parents.

That is not an excuse for regulators to reach into the cradle and suffocate social media in its crib.

This attitude would have seen Rome’s stone tablets smashed, Alexandria’s libraries burned, and the printing presses of Europe fall silent. All to ‘protect’ people from unregulated knowledge.

And it is not as if the internet is an unregulated ‘Wild West’ as claimed. There are many laws – most of which go unenforced for reasons that remain a mystery to the public – that deal with most of the examples the eSafety Commissioner offers as justification.

Deep fakes, blackmail, underage sexual content, harassment – these are all crimes.

We would support an investigation into how many of these reports authorities leave unanswered.

These failures are domestic. They are related to Australia’s weak criminal justice system, not Silicon Valley CEOs who are being used as scapegoats to disguise the irresponsible failure of ‘soft-touch’ sentencing.

Peer bullying, which lies behind the bulk of tragic youth suicides, largely comes from school peers known to both parents and teachers. These terrible stories almost always reveal the systemic failure of an education system which has shied away from punishing bullies and removing them from the school environment.

Before banning children from the internet, we should find out why schools have lost control of students.

Banning under 16s from social media also has the potential to turn the government into the worst schoolyard bully.

Imagine a class where only one person is under 16. All of their peers are on social media – except them. Differences are what drive exclusion, and in this case the government is creating an insurmountable social divide that will expose untold thousands of children to a friendship disadvantage.

And what of children who struggle with school?

The eSafety Commissioner said at one point, ‘…a vision the Prime Minister had of seeing more kids kicking the footy. That’s what we plan to help measure in…’

Not all kids ‘kick the footy’.

The children who do not fit in with their school peers often engage with small international niche creative communities. These children make school bearable through their social media friendships in the same way my generation had pen pals or friends in other clubs and areas.

Cutting children off from their best friends online is worse than bullying. It is cruelty.

Despite what is suggested by regulators, these children will not ‘just move to other platforms’. One Australian child cannot compel international children to change to another unregulated social platform because that is not how reality works. They will simply be excluded and forgotten.

This is before we consider sick children who live at home or in hospital and for whom social media is the thread that connects them to the world.

Social media lived peacefully side-by-side with the Millennial generation, who are now in their 30s-40s.

How is that possible?

No doubt it had something to do with the rigour of their education and domestic environment, which provided the balance that is lacking in schools that routinely engage in public activism – dragging children onto the streets as pawns in adult political games.

When the education system decided to focus on politics, it began to see free speech and the platforms that facilitate critical thinking, live news, and global knowledge as ‘dangerous’.

School eSafety programs spend much of their time obsessing about which sources of news can be ‘trusted’, although it is never made clear when educators were handed the task of ranking news organisations in the minds of children.

Who gets to decide which news outlets are ‘trustworthy’?

The hypocrisy of National Press Club host Tom Connell informing the audience that they could watch YouTube and follow the conversation on X cannot be overstated.

In 2025, the news unfolds on social media – much to the frustration of the legacy media.

World leaders correspond via Truth Social and X.

The eSafety Commissioner is effectively banning children under 16 from the news – from the world – and from their friends.

Imagine if she had insisted children be banned from reading newspapers ‘for their safety’.

It’s the same thing, yet the danger is easier to recognise in the latter.

Our children deserve protection – protection from the expansion of government into the role of parenting.

The eSafety Commissioner has gone too far by Senator Malcolm Roberts


The Office of the eSafety Commissioner does commendable work in protecting children and adults from bullying and, most importantly, removing child abuse material. I praised the Office for this work.

However, in my opinion, the eSafety Commissioner has brought the office into disrepute with her personal vendetta against Twitter/X and her attempt to become the world internet police.

Last year, the Commissioner finalised investigations into 9,500 pieces of violent and extremist content. I asked what these were. The answer was that the Commissioner takes down material from anywhere in the world, detecting some of it by actively searching for it, even without a complaint.

Given that the Commissioner is positioning herself as the world internet police at our expense, I asked what benefit removing the 9,500 pieces of material had for Australians.

The answer relied on a single incident, and there was no proof the material actually caused that terrorist incident. I asked why there was no explanation of what the other material was, such as a transparency register, so we can see what material they are requiring to be taken down and check for political bias. The question was ignored.

I also asked what direct benefit her actions had in addressing terrorism and violent material. The Commissioner answered regarding child material, which I had already praised.

The Commissioner is avoiding scrutiny of her takedown notices for violent and extremist material, and I believe it is because they follow a political bias.

One Nation calls for the eSafety Commissioner to stand down.

Transcript

Senator ROBERTS: Can I, first of all, pay a compliment and I’ll read out some statistics. From the ACMA annual report 2023-24, the office of the eSafety Commissioner has received 13,824 complaints regarding web URLs, with 82 per cent relating to reports about child sexual abuse, child abuse or paedophile activity. This is a 19 per cent increase from the previous year. Your office sent 9,190 notifications related to child sexual abuse material to the INHOPE network—which I understand are the good guys, the right people to work with—and referred 130 investigations to the Australian Federal Police. On cyber abuse, you received 2,695 complaints to the Cyberbullying Scheme for Australian children and 3,113 complaints to the Adult Cyber Abuse Scheme with a removal rate of 88 per cent where removal was required. My opening comment is simple: well done; thank you very much. This is important work. 

My first question is that you finalised 9,461 critical investigations into terrorist and violent extremist content, representing a 229 per cent increase—that’s amazing—in these types of complaints from the previous year. I’d like to ask about that. How do you define terrorist and violent extremist content? 

Ms Inman Grant : I will turn over to Ms Snell to talk about that. That is part of our illegal and restricted content team under the Online Content Scheme. 

Ms Snell : I’m actually going to invite Mr Downie, who is the executive manager for our Investigations Branch, who oversees this work, to talk specifically to this. 

Mr Downie : When we’re dealing with terrorism and violent extremist content under the Online Safety Act, we deal with terrorism as defined under the Criminal Code to the pure definition of what a terrorist act is. However, when we’re applying the Online Safety Act, we apply the content according to the classification scheme, and we’ll classify that material as ‘refused classification’, which then falls into class 1 and class 2 definitions.

Senator ROBERTS: Is this content relating to Australian content or international content? 

Mr Downie : With the complaints that we receive, we receive content that can be generated or hosted anywhere in the world, but the key is that it’s accessible by the people within the Australian community. 

Senator ROBERTS: Do you seek this content out yourself, or do you rely on a complaint before acting? 

Mr Downie : Generally, we rely on a complaint before acting; however, we do have own-motion investigation provisions where we are then able to further conduct investigations to locate material that may be in furtherance of that complaint. 

Senator ROBERTS: Of those 9,461 completed investigations, what was the outcome, please? 

Mr Downie: I’d have to take that on notice for the specific details of those investigations, but in the majority of cases that content is removed. 

Senator ROBERTS: Is there any demonstrable benefit from you taking this material down? What is the benefit to the taxpayer of this aspect of your office? 

Mr Downie : Having access to that type of content, whether it be globally or not, is very harmful to members of the community. That material can be used to incite violence. It can be used to radicalise vulnerable people or youth, which, as we’ve seen in the media, can be then used to incite further violence within the community. So less access to that type of content can only be beneficial for the Australian community. 

Ms Inman Grant : And I’d note that ASIO Director-General Burgess has said that the vast majority of terrorism investigations conducted right now are of young people between the ages of 14 and 21 and in every single case they have been radicalised somehow on the internet. You would probably also be aware of, heartbreakingly, the stabbing video of bishop Mar Mari Emmanuel, which was geo-blocked here by X but was available in the rest of the world. In the sentencing of the 17-year-old Southport killer, Axel Rudakubana, who went and stabbed three little girls to death while they were making bracelets at a Taylor Swift themed dance party, that very video, that very Wakeley stabbing video, he accessed on X 25 minutes before he stabbed those little girls and claimed that that was his inspiration. So you can imagine that this is something that the UK government has wanted to talk to us about. We have a partnership with Ofcom. We of course have different powers, but I think it’s just a very powerful reminder that this kind of content is accessed by young people. It can normalise, desensitise and, in the worst cases, radicalise. 

Senator ROBERTS: On page 206 of the ACMA report, there’s a graph which shows X is the source of five per cent of your cyberabuse claims and Google four per cent, compared to Facebook at 25 per cent. Page 216 of your report lists major noncompliance actions. X has four and Google one. Why does X occupy so much of your time?

Ms Inman Grant : In terms of adult cyberabuse? 

Senator ROBERTS: In terms of terrorism complaints and cyberabuse. 

Ms Inman Grant : If you recall back to 16 April, around the Wakeley stabbing, we worked with all platforms. With the exception of Meta and X Corp., they all did a good job in trying to identify, detect and remove the Wakeley terrorism video. We weren’t satisfied that either Meta or X did, but, once we issued formal removal notices, Meta responded and complied within the hour, and, of course—you know the story—X said, ‘We’ll see you in court.’ That’s what has taken our time. 

Senator ROBERTS: What about the others? That would apply to one of your complaints against them. What about the others? Why the other three? 

Ms Inman Grant : It depends on the type of harm. For instance, when we’re talking about youth based cyberbullying, most of the cyberbullying happens on the top four platforms where children spend their time, on YouTube, TikTok, Snap and Instagram. When it comes to image based abuse, there’s a much higher proportion now of sexual extortion targeting young men between the ages of 18 and 24. They tend to meet on Instagram, sometimes on Snap, and then they’re moved off platform. So it depends on the form of abuse. It also depends on the complaints we get. But, when it comes to the terrorist and child sexual abuse material, we go to where the content is hosted and shared. 

Senator ROBERTS: That still doesn’t answer the question. You’ve got four major noncompliance actions against X and only one against Google, yet you’ve mentioned several platforms. Why does X have to occupy so much of your time? 

Ms Inman Grant: Because they did not comply with our notices. Google came close to not complying, so we gave them a formal warning. 

Mr Fleming : Those tribunal and court cases are often initiated by X, so we’re responding to the claims that they make challenging our powers. That’s why they feature the most. 

Senator ROBERTS: The report goes on to list how many notices are issued under each part of the act yet does not provide a detailed list. This is fine for child and adult abuse material, of course. We’re happy with that. For class 1 extremist and violent material, why are we not provided a list of what the commissioner considers worthy of a takedown notice and the reasons why? There’s a widespread belief in the public that you’re overstepping on your choice of material to take down. 

Ms Inman Grant: Respectfully, I’d like to read from some weighted and validated surveys of the Australian public. In November 2024, a weighted survey of Australians found that 87 per cent of those surveyed supported the introduction of stronger penalties for social media companies that do not comply with Australian laws, 77 per cent supported the proposed ban on social media for children and 75 per cent supported the Australian government’s plan to introduce a digital duty of care. In August 2024, a weighted survey of Australians found that 79 per cent said that social media platforms should operate with a regulator with the power to order content removal. That seems like a pretty overwhelming amount of support from the public. 

Senator ROBERTS: That wasn’t my question. My question was: why are we not provided a list of what the commissioner considers worthy of a takedown notice and a breakdown of the reasons why? 

Ms Inman Grant : We provide as much transparency as we can. You would understand that confidentiality is incredibly important. We can’t describe these in great detail. We can’t name names. What kind of information do you think would be helpful to your understanding? That’s something that we can certainly look at in the interests of transparency. 

Senator ROBERTS: The specific behaviours, without breaching confidentiality, would be helpful. We wouldn’t expect you to breach confidentiality or name names—certainly not—but we would like the types of actions that the commissioner thinks worthy of a takedown notice, as I said, and the reasons why. 

Senator McAllister: The commissioner and I are trying to understand, with a little more precision, what sort of information. You’re simply saying a generalised list of examples that are deidentified— 

Ms Inman Grant : Of 40,000 complaints we receive annually. 

Senator ROBERTS: You’re dealing with them, so presumably you know what they are. I’d like to see some sort of classification so that people could understand the proportions, because at the moment I don’t think you’re accountable for that. 

Ms Inman Grant : We can take that on notice. We would have to look at privacy and confidentiality. We would also have to look at resource implications and how that might serve the public interest, but we’re happy to take a look at that. 

Senator ROBERTS: I think the people have a right to know. Referencing unofficial takedown notices, which I note are issued under section 183(2)(zk), these go to the question of your secrecy. If these are dangerous enough to require a takedown, then they should be dangerous enough for you to list out by making the register of takedown notices public knowledge—that’s what I was getting at. Otherwise, you’re simply exercising power without any accountability, power that can be abused. How would we know? Can you, Commissioner, point to one terrorist act you’ve prevented, one person you’ve deradicalised or one benefit to Australian society from the money you have spent on your campaign against extremist material? 

Ms Inman Grant : I go back to what D-G Burgess often says, ‘You’re never congratulated when you stop something from happening.’ Again, do we have to have more heartbreaking examples of, like I just explained to you, what happened with those three little girls murdered in Southport, UK? We’ll never know. What I do know is I have parents coming up to me and saying: ‘You’ve saved my son’s life. He was sexually extorted. He had just turned 18. He went to the police; no-one would help him. I wasn’t going to let it go. I found your website. Your investigators supported him, got the content down, gave him advice and sent him on to mental health support services.’ So I do know that we’re saving lives every day. 

How many cases of 12- and 13-year-old girls being cyberbullied and bullied do you need to prove that this is a veritable epidemic and that young people are losing their lives? We’re here to help them and to prevent that from happening. My biggest regret, if there is one, is that more people don’t know about us. Only about 40 per cent of the Australian population knows about us, but we do everything we can to help people. When we stop helping people and making the online world a safer and better place, then, yes, it’s time to hang up our hats, but we’re just getting started. 

Senator ROBERTS: With due respect, Ms Inman Grant, you didn’t answer my question— 

CHAIR: Senator Roberts, we have to rotate the call. There are a lot of senators who wish to ask questions. 

Senator ROBERTS: I just want to clarify that one. 

CHAIR: I can come back to you, if you wish. 

Senator ROBERTS: It’ll only take a second to do this. 

CHAIR: Go on then. 

Senator ROBERTS: I asked, ‘Can you point to one terrorist act?’ I accept you’re doing a good job. You’re preventing child abuse, no doubt about that. We’ve discussed that in the past. Can you point to one terrorist act you prevented, one person deradicalised or one benefit to Australian society from the money you have spent on your campaign against extremist material? That’s what I want to know. 

Ms Inman Grant : We’re not going out into the public asking young people if they saw a particular video that radicalised them or not. We do know when people have been radicalised by content that has been online. Some of the gore content that we’ve taken down includes the manifestos, the horrific imagery of people at Christchurch huddling in the corner while being shot. Anything that’s dehumanising that we are able to get down, to not cause further pain to victims and their families and not incite others into taking the same action, I think, is worth doing. I don’t need proof that I prevented this, that or the other from happening. We’re trying to make the internet a safer, more positive place with less violent extremist material, and that’s why we take these issues so seriously.

Senator ROBERTS: My concern is with— 

CHAIR: We’ll go to Senator Darmanin— 

Senator ROBERTS: I’ll put one more question on notice. 

The eSafety Commissioner has the power to issue takedown notices on various types of material, with exploitation material being the most common. One Nation supports these powers being used for this purpose. A small portion of their work involves removing material that is deemed “violent or distressing.” This was the power used in the case of the Bishop Mar Mari Emmanuel video. One Nation is concerned that these powers could be misused, as they are subject to political interpretation regarding what is and is not “violent or distressing.”

I asked the eSafety Commissioner if her department had a transparency portal where Senators and the public could see the material being taken down. The Commissioner responded by including exploitation material in her count, to show why such a portal was not feasible, yet I did not ask about exploitation material; my question specifically concerned material categorised as “violent or distressing.”

It is my belief that social media platforms primarily use AI to remove most of this material and that the department has only had to issue a small number of notices. I want to know what those notices were issued for and I will continue this inquiry during the next estimates session.

Transcript

Senator ROBERTS: Thank you for attending. My first question is about your newsroom statement from 4 October about the social media platform X and a transparency notice on the measures it’s taking to combat child sexual exploitation material. Is this the only transparency notice that has not been complied with?

Ms Inman Grant: Thus far, yes. Where we issued an infringement notice, we issued something called a service provider notification to Google for the same set of child sexual abuse material.

Senator ROBERTS: The only other platform is Google, and that hasn’t been issued with a transparency notice. Are there any others like Telegram or Facebook? Telegram does a lot of work in that area.

Ms Inman Grant: We are in the midst of a process around a series of very complex transparency notices in relation to terrorist and violent extremist material. Telegram is amongst them, and we’re engaging with them.

Senator ROBERTS: Thank you. This thread asks about a subset of your work—material that is violent or distressing. Do you have a transparency portal where your instructions to social media platforms to take down such material are registered in as close to real time as possible so we can see what you’re censoring?

Ms Inman Grant: We weren’t set up as a censor, Senator. We have frameworks provided through complaint schemes. Members of the public report content to us, particularly when the social media platform or messaging platform hasn’t responded. With respect to illegal and harmful online content, we also have very well defined legal requirements. We have both notice powers under the Criminal Code and then removal notices under the Online Safety Act and formal removal notices, which we exercised against both X and Meta during the Wakeley terrorist incident.

Mr Dagg: Can I just explain how we achieve the objective of transparency in terms of our actions. You may know that the Online Safety Act requires us to publish, under section 183, actions that we’ve taken in relation to a variety of harms. Our annual report has been published. You can find all of the information—

Senator ROBERTS: Your report has been published?

Mr Dagg: The annual report has been published, and we are required to report all of that information in the annual report. You can find that from page 223 in the appendices that relate to the eSafety Commissioner. That will show you all of the actions that we took for the financial year 2023-24.

Senator ROBERTS: Can you give us a bit of background on each one?

Mr Dagg: No—these are aggregated figures, so there’s no specific breakdown of each individual matter.

Senator ROBERTS: So there’s no breakdown and no opportunity for people to see how you’re doing it?

Mr Dagg: It would not be operationally feasible for us to report in real time the actions that we’re taking. Parliament expected us to report on an aggregated basis about the actions that we’ve taken, including requests, but we haven’t broken them down—

Senator ROBERTS: It’s just the aggregate numbers—

Mr Dagg: The aggregate numbers for a range of operational purposes, including security and operational feasibility.

Senator ROBERTS: So the platforms have to be transparent, and you don’t?

Mr Dagg: Well, the platforms report on things in an aggregated way, too, Senator. They’re not reporting on each individual specific matter that they deal with. They deal with millions of matters on a yearly basis. So, again, that just wouldn’t be feasible for them to do.

Senator ROBERTS: But the platforms have to be transparent to you.

Mr Dagg: Through the exercise of our compulsory transparency powers under the basic online safety expectations. But it’s important to note, Senator, that those transparency powers are around how the platforms are meeting the expectations. We’re not extracting from them specific information about how they’re dealing with this matter or that matter that might be reported to them. We’re interested in understanding how they take user reports, for example—if they’ve got reporting schemes in place, how their terms of service and policies are developed to meet the objects of the basic online safety expectations. The most recent determination includes some measures in relation to generative AI and how the companies are ensuring that these technologies aren’t being used, for example, to produce child sexual abuse material on a synthetic basis. That’s the kind of information that we’re drawing from the companies. We’re not drawing information about how they’re dealing with individual complaints.

Senator ROBERTS: The police force has long had transparency to the public through the court system. Whether you agree that the court system is perfect or not, that’s not the point. Who do you go through to provide transparency? How can we assess what you’re doing, rather than just in the aggregate?

Mr Dagg: When it comes to the principles of open justice, as a former police officer myself, the matters that make their way to court represent a tiny fraction of all matters that are reported to police. The matters that are reported to police are not reported on an individual basis. There are strict privacy concerns, for example, that ensure the protection of complainants’ identities and the specific matters that are reported to police forces. The Wakeley matter—the section 109 notice that we issued to Twitter X—is a good example of how that principle of transparency plays out in the Federal Court. The online file, for example, includes all of the evidence that the eSafety Commissioner relied on to make the case that the interlocutory measures ought to be accepted by the court.

Senator ROBERTS: The Senate is the house of review. What facility exists for the Senate to review your take-down notices of material? Where’s the supervision of your activity? Who oversees you?

Ms Inman Grant: There are a few different ways. One is through FOI, which you’ve exercised yourself, Senator. We’ve had a 2,288 per cent increase in FOIs over the past year. We are held accountable. We have reporting requirements that include any informal actions we take. Of course, we can be challenged in the Federal Court. We can be challenged at the AAT, or now the ART. We can be challenged by the Ombudsman, and a complainant can ask for an internal review to be done. So there are a number of different ways that we can provide transparency when it is asked for or required.
But, as Mr Dagg said, with 41,000 reports this year—and I think Mr Downey, who is now running the investigations branch, is expecting at least 60,000 reports next year—it would operationally be infeasible, and it would violate the privacy of the complainants. As I said before, that confidentiality is important. Even young people understand that one of the reasons children don’t report cyberbullying is they don’t want to be the dobber or the snitch, and they fear retribution. If we were to not treat some of these complaints as personal information—and the Information Commissioner agrees with us—I think it would undermine trust in us as an organisation.

Senator ROBERTS: I get that. Did you say that there was a 2,000 per cent increase in FOIs?

Ms Inman Grant: Yes, 2,288 per cent.

Senator ROBERTS: That’s a huge increase. It tells me that people are hungry to learn more.

Ms Inman Grant: Yes, and there have been some campaigns that have also encouraged people to put in FOIs, which we respond to.

Senator ROBERTS: You’ve used the defence of having so many infringements to take care of. That’s a big workload. What I’m interested in is not so much that but how you’re being held accountable. How can we see transparently what you’re doing?

Senator McAllister: Here we all are, Senator. What is the question that you seek to ask?

CHAIR: We call it estimates.

Senator McAllister: We are at estimates. The commissioner is here to answer your questions. If there are particular things that you’re interested in, you really should ask her.

Senator ROBERTS: What about the public? They need to know.

Senator McAllister: You are their representative, as you so often remind us.

CHAIR: You can send them the video of this.

Senator McAllister: You are a humble servant of the people of Queensland.

Senator ROBERTS: I want to go to freedom of information 24118, which asked for any guidelines you have with regard to the implied right to political communication to make sure you aren’t infringing on it as you issue take-down notices. I note that your freedom of information decision says: ‘There are no dedicated guides or policies with respect to the interaction of the implied right of political communication in use by the eSafety Commissioner or personnel who implement the various schemes under the OSA.’ There are no dedicated guides or policies?

Mr Dagg: We would need to assess each and every action we take through the lens of whether or not the implied constitutional right to political communication is infringed. That’s just operationally infeasible.

Senator ROBERTS: So are you saying, ‘To hell with the Constitution’?

Mr Dagg: No, not at all. The concern that a particular person’s interests may have been infringed in such a way as to raise a claim that the operation of the Online Safety Act is invalid is absolutely a matter that can be pursued through merits review or judicial review. But, to the commissioner’s point, we are going to be dealing with 60,000 complained URLs this year, which accounts for a significant proportion of the actions we take. I’m sure you can understand that rigorously assessing whether or not they raise any specific issues in relation to the implied constitutional right makes it very difficult for us to make rapid decisions in line with the threshold set by the act. I think it’s important to note that the act contains very clear thresholds and very clear parameters for us to apply in terms of operational decision-making. The act itself, as you would have seen, is supported by a bill which was subject to exhaustive human rights review in its construction. We believe that, by properly administering the act on behalf of the commissioner, we’re taking actions which are in line with parliament’s expectations. If a person believes that their constitutional right—the implied right—has been infringed, there are avenues for review of that decision.

Senator ROBERTS: I can’t see how bypassing the Constitution or not including it as a consideration is in any way okay. The eSafety Commissioner and the delegates ordinarily—this is the quote: ‘The eSafety Commissioner and the delegates ordinarily proceed on the basis that the powers given to them under the OSA by the Australian Parliament are reasonably appropriate and adapted’. So you don’t turn your mind to whether you’re acting constitutionally at all; you just assume you are. How can this Senate be convinced that you are able to act within the Constitution when you don’t even have a document outlining the fundamental right of Australians to communicate in political matters? If you infringe on someone’s constitutional rights, then they complain? That’s it?

Senator McAllister: As you know, the constitutionality of any piece of legislation that comes before the parliament—

Senator ROBERTS: Not the legislation—

Senator McAllister: is quite frequently a matter of some discussion. Unless you seek to challenge it, we can assume that the legislative framework within which the commissioner and her staff operate is constitutional.

Senator ROBERTS: That’s a misrepresentation of what I said, Minister. I’m not saying that the act is unconstitutional; I’m saying that the consideration to take someone down needs to maintain constitutional rights—particularly political.

Senator McAllister: I think the two things are interconnected, Senator, because the powers that are exercised by the commissioner and the staff that work with her are enabled by the parliament and by the legislation.

Senator ROBERTS: I get that.

Senator McAllister: As I have indicated to you already, that is quite often subject to a discussion among senators about constitutional arrangements.

Senator ROBERTS: That still doesn’t answer the question—the right to political communication.

CHAIR: Senator Roberts, I am going to move on.

Senator ROBERTS: Thank you.

I break down the contents of the government’s proposed Misinformation and Disinformation (MAD) Bill and see it for exactly what it is – a censorship regime that would make George Orwell blush.

Liberal Shadow Minister for Communications, David Coleman, on Insiders on Sunday 9 June 2024.

The eSafety Commissioner still has the full support of Dutton’s Coalition. In fact, they’re proud of establishing Julie Inman Grant’s position.

Don’t expect any respect for freedom of speech from Labor, or Liberals if they’re returned to Government.

You can only trust One Nation to stand against the tyrants who want to tell Australians what they can and can’t see, or say.