Australia’s online safety regulator, eSafety, has imposed a $386,000 fine on X, previously known as Twitter, for its failure to provide crucial information about its efforts to combat child abuse content. The action comes after eSafety issued legal notices to major tech companies, including Google, TikTok, Twitch, Discord, and X, under the country’s Online Safety Act in February. The notices demanded detailed answers about how each company tackles child sexual abuse material (CSAM).

While the monetary value of the fine may appear modest, it poses a challenge for X, which is already struggling to retain advertisers and manage its reputation.

In a press release, eSafety noted that X had left some sections of its responses completely blank, while others were incomplete or inaccurate. The regulator also criticized X, owned by Elon Musk, for failing to respond promptly to its inquiries.

Among the most significant shortcomings were X’s failure to provide information about CSAM detection in live streams and its admission that it does not use any technology to detect grooming.

The regulator also found that Google had offered generic responses, which it deemed inadequate. Instead of a fine, however, Google received a formal warning, indicating that its infractions were less severe.

eSafety Commissioner Julie Inman Grant expressed disappointment in Twitter/X for not living up to its public commitments in the fight against CSAM. She emphasized that the company must translate words into tangible actions.

“Twitter/X has publicly stated that combating child sexual exploitation is its top priority, but mere rhetoric is insufficient. We need to see concrete actions,” she said in a statement. “If Twitter/X and Google cannot provide answers to essential questions about their approach to addressing child sexual exploitation, it suggests they either do not want to be transparent about their actions or need to enhance their systems for monitoring their operations. Both scenarios are concerning and indicate that they are falling short of their responsibilities and the expectations of the Australian community.”

Notably, X removed an option for users to report political misinformation last month, prompting concerns from the Australian digital research group Reset. The group warned that the change could mean violative content is not properly reviewed and that the company may fail to follow its own policies for labeling or removing such content.

Following Elon Musk’s takeover, X/Twitter cut staff working on trust and safety issues. In December last year, the company disbanded the Trust & Safety Council, an advisory group that provided guidance on effectively removing CSAM. As part of cost-cutting measures, the social media company also closed its physical office in Australia earlier this year.

This month, India issued notices to X, YouTube, and Telegram ordering them to remove CSAM from their platforms. Last week, the European Union formally requested details from X under the Digital Services Act (DSA) about its measures to combat misinformation related to the Israel-Hamas conflict.