The Office of the eSafety Commissioner does commendable work in protecting children and adults from bullying and, most importantly, in removing child abuse material. I praised the Office for this work.
However, in my opinion, the eSafety Commissioner has brought the office into disrepute with her personal vendetta against Twitter/X and her attempt to become the world internet police.
Last year, the Commissioner finalised investigations into 9,500 pieces of violent and extremist content. I asked what these were. The answer was that the Commissioner takes down material from anywhere in the world, detecting it in part by actively searching for it, even without a complaint.
Given that the Commissioner is positioning herself as the world internet police at our expense, I asked what benefit removing the 9,500 pieces of material had for Australians.
The answer relied on a single incident, and there was no proof the material actually caused a terrorist act. I asked why there was no explanation of what the other material was, such as a transparency register, so we can see what material they are requiring to be taken down and check it for political bias. The question was ignored.
I also asked what direct benefit her actions had in addressing terrorism and violent material. The Commissioner answered with reference to child abuse material, work I had already praised.
The Commissioner is avoiding scrutiny of her takedown notices for violent and extremist material, and I believe it is because they follow a political bias.
One Nation calls for the eSafety Commissioner to stand down.
Transcript
Senator ROBERTS: Can I, first of all, pay a compliment and I’ll read out some statistics. From the ACMA annual report 2023-24, the office of the eSafety Commissioner has received 13,824 complaints regarding web URLs, with 82 per cent relating to reports about child sexual abuse, child abuse or paedophile activity. This is a 19 per cent increase from the previous year. Your office sent 9,190 notifications related to child sexual abuse material to the INHOPE network—which I understand are the good guys, the right people to work with—and referred 130 investigations to the Australian Federal Police. On cyber abuse, you received 2,695 complaints to the Cyberbullying Scheme for Australian children and 3,113 complaints to the Adult Cyber Abuse Scheme with a removal rate of 88 per cent where removal was required. My opening comment is simple: well done; thank you very much. This is important work.
My first question concerns the 9,461 critical investigations you finalised into terrorist and violent extremist content, representing a 229 per cent increase—that’s amazing—in these types of complaints from the previous year. I’d like to ask about that. How do you define terrorist and violent extremist content?
Ms Inman Grant: I will turn over to Ms Snell to talk about that. That is part of our illegal and restricted content team under the Online Content Scheme.
Ms Snell: I’m actually going to invite Mr Downie, who is the executive manager for our Investigations Branch, who oversees this work, to talk specifically to this.
Mr Downie: When we’re dealing with terrorism and violent extremist content under the Online Safety Act, we deal with terrorism as defined under the Criminal Code, to the pure definition of what a terrorist act is. However, when we’re applying the Online Safety Act, we apply the content according to the classification scheme, and we’ll classify that material as ‘refused classification’, which then falls into the class 1 and class 2 definitions.
Senator ROBERTS: Is this content relating to Australian content or international content?
Mr Downie: The complaints we receive can concern content generated or hosted anywhere in the world, but the key is that it’s accessible by people within the Australian community.
Senator ROBERTS: Do you seek this content out yourself, or do you rely on a complaint before acting?
Mr Downie: Generally, we rely on a complaint before acting; however, we do have own-motion investigation provisions, under which we are able to conduct further investigations to locate material in furtherance of that complaint.
Senator ROBERTS: Of those 9,461 completed investigations, what was the outcome, please?
Mr Downie: I’d have to take that on notice for the specific details of those investigations, but in the majority of cases that content is removed.
Senator ROBERTS: Is there any demonstrable benefit from you taking this material down? What is the benefit to the taxpayer of this aspect of your office?
Mr Downie: Having access to that type of content, whether it be globally or not, is very harmful to members of the community. That material can be used to incite violence. It can be used to radicalise vulnerable people or youth, which, as we’ve seen in the media, can then be used to incite further violence within the community. So less access to that type of content can only be beneficial for the Australian community.
Ms Inman Grant: And I’d note that ASIO Director-General Burgess has said that the vast majority of terrorism investigations conducted right now are of young people between the ages of 14 and 21, and in every single case they have been radicalised somehow on the internet. You would probably also be aware of, heartbreakingly, the stabbing video of Bishop Mar Mari Emmanuel, which was geo-blocked here by X but was available in the rest of the world. The 17-year-old Southport killer, Axel Rudakubana, who stabbed three little girls to death while they were making bracelets at a Taylor Swift-themed dance party, accessed that very Wakeley stabbing video on X 25 minutes before he stabbed those little girls, and at his sentencing it emerged that he claimed it was his inspiration. So you can imagine that this is something that the UK government has wanted to talk to us about. We have a partnership with Ofcom. We of course have different powers, but I think it’s just a very powerful reminder that this kind of content is accessed by young people. It can normalise, desensitise and, in the worst cases, radicalise.
Senator ROBERTS: On page 206 of the ACMA report, there’s a graph which shows X is the source of five per cent of your cyberabuse claims and Google four per cent, compared to Facebook at 25 per cent. Page 216 of your report lists major noncompliance actions. X has four and Google one. Why does X occupy so much of your time?
Ms Inman Grant: In terms of adult cyberabuse?
Senator ROBERTS: In terms of terrorism complaints and cyberabuse.
Ms Inman Grant: If you recall back to 16 April, around the Wakeley stabbing, we worked with all platforms. With the exception of Meta and X Corp., they all did a good job in trying to identify, detect and remove the Wakeley terrorism video. We weren’t satisfied that either Meta or X did, but, once we issued formal removal notices, Meta responded and complied within the hour, and, of course—you know the story—X said, ‘We’ll see you in court.’ That’s what has taken our time.
Senator ROBERTS: What about the others? That would apply to one of your complaints against them. What about the others? Why the other three?
Ms Inman Grant: It depends on the type of harm. For instance, when we’re talking about youth-based cyberbullying, most of the cyberbullying happens on the top four platforms where children spend their time: YouTube, TikTok, Snap and Instagram. When it comes to image-based abuse, there’s a much higher proportion now of sexual extortion targeting young men between the ages of 18 and 24. They tend to meet on Instagram, sometimes on Snap, and then they’re moved off platform. So it depends on the form of abuse. It also depends on the complaints we get. But, when it comes to terrorist and child sexual abuse material, we go to where the content is hosted and shared.
Senator ROBERTS: That still doesn’t answer the question. You’ve got four major noncompliance actions against X and only one against Google, yet you’ve mentioned several platforms. Why does X have to occupy so much of your time?
Ms Inman Grant: Because they did not comply with our notices. Google came close to not complying, so we gave them a formal warning.
Mr Fleming: Those tribunal and court cases are often initiated by X, so we’re responding to the claims that they make challenging our powers. That’s why they feature the most.
Senator ROBERTS: The report goes on to list how many notices are issued under each part of the act yet does not provide a detailed list. This is fine for child and adult abuse material, of course. We’re happy with that. For class 1 extremist and violent material, why are we not provided a list of what the commissioner considers worthy of a takedown notice and the reasons why? There’s a widespread belief in the public that you’re overstepping on your choice of material to take down.
Ms Inman Grant: Respectfully, I’d like to read from some weighted and validated surveys of the Australian public. In November 2024, a weighted survey of Australians found that 87 per cent of those surveyed supported the introduction of stronger penalties for social media companies that do not comply with Australian laws, 77 per cent supported the proposed ban on social media for children and 75 per cent supported the Australian government’s plan to introduce a digital duty of care. In August 2024, a weighted survey of Australians found that 79 per cent said that social media platforms should operate with a regulator with the power to order content removal. That seems like a pretty overwhelming amount of support from the public.
Senator ROBERTS: That wasn’t my question. My question was: why are we not provided a list of what the commissioner considers worthy of a takedown notice and a breakdown of the reasons why?
Ms Inman Grant: We provide as much transparency as we can. You would understand that confidentiality is incredibly important. We can’t describe these in great detail. We can’t name names. What kind of information do you think would be helpful to your understanding? That’s something that we can certainly look at in the interests of transparency.
Senator ROBERTS: The specific behaviours, without breaching confidentiality, would be helpful. We wouldn’t expect you to breach confidentiality or name names—certainly not—but we would like the types of actions that the commissioner thinks worthy of a takedown notice, as I said, and the reasons why.
Senator McAllister: The commissioner and I are trying to understand, with a little more precision, what sort of information. You’re simply saying a generalised list of examples that are deidentified—
Ms Inman Grant: Of the 40,000 complaints we receive annually.
Senator ROBERTS: You’re dealing with them, so presumably you know what they are. I’d like to see some sort of classification so that people could understand the proportions, because at the moment I don’t think you’re accountable for that.
Ms Inman Grant: We can take that on notice. We would have to look at privacy and confidentiality. We would also have to look at resource implications and how that might serve the public interest, but we’re happy to take a look at that.
Senator ROBERTS: I think the people have a right to know. I note that unofficial takedown notices are issued under section 183(2)(zk); these go to the question of your secrecy. If these are dangerous enough to require a takedown, then they should be dangerous enough for you to list out by making the register of takedown notices public knowledge—that’s what I was getting at. Otherwise, you’re simply exercising power without any accountability, power that can be abused. How would we know? Can you, Commissioner, point to one terrorist act you’ve prevented, one person you’ve deradicalised or one benefit to Australian society from the money you have spent on your campaign against extremist material?
Ms Inman Grant: I go back to what D-G Burgess often says: ‘You’re never congratulated when you stop something from happening.’ Again, do we have to have more heartbreaking examples, like I just explained to you, of what happened with those three little girls murdered in Southport, UK? We’ll never know. What I do know is I have parents coming up to me and saying: ‘You’ve saved my son’s life. He was sexually extorted. He had just turned 18. He went to the police; no-one would help him. I wasn’t going to let it go. I found your website. Your investigators supported him, got the content down, gave him advice and sent him on to mental health support services.’ So I do know that we’re saving lives every day.
How many cases of 12- and 13-year-old girls being cyberbullied and bullied do you need to prove that this is a veritable epidemic and that young people are losing their lives? We’re here to help them and to prevent that from happening. My biggest regret, if there is one, is that more people don’t know about us. Only about 40 per cent of the Australian population knows about us, but we do everything we can to help people. When we stop helping people and making the online world a safer and better place, then, yes, it’s time to hang up our hats, but we’re just getting started.
Senator ROBERTS: With due respect, Ms Inman Grant, you didn’t answer my question—
CHAIR: Senator Roberts, we have to rotate the call. There are a lot of senators who wish to ask questions.
Senator ROBERTS: I just want to clarify that one.
CHAIR: I can come back to you, if you wish.
Senator ROBERTS: It’ll only take a second to do this.
CHAIR: Go on then.
Senator ROBERTS: I asked, ‘Can you point to one terrorist act?’ I accept you’re doing a good job. You’re preventing child abuse, no doubt about that. We’ve discussed that in the past. Can you point to one terrorist act you prevented, one person deradicalised or one benefit to Australian society from the money you have spent on your campaign against extremist material? That’s what I want to know.
Ms Inman Grant: We’re not going out into the public asking young people if they saw a particular video that radicalised them or not. We do know when people have been radicalised by content that has been online. Some of the gore content that we’ve taken down includes the manifestos and the horrific imagery of people at Christchurch huddling in the corner while being shot. Anything dehumanising that we are able to get down, so as not to cause further pain to victims and their families and not to incite others into taking the same action, is, I think, worth doing. I don’t need proof that I prevented this, that or the other from happening. We’re trying to make the internet a safer, more positive place with less violent extremist material, and that’s why we take these issues so seriously.
Senator ROBERTS: My concern is with—
CHAIR: We’ll go to Senator Darmanin—
Senator ROBERTS: I’ll put one more question on notice.