X gutting safety staff creates ‘perfect storm’ for online hate


Joseph Brookes
Senior Reporter

X, the social media platform formerly known as Twitter, has gutted its safety and public policy personnel and reinstated thousands of banned accounts since billionaire Elon Musk’s takeover, creating a “perfect storm” for online hate speech.

The cuts to safety staff were confirmed by the company for the first time in disclosures to Australia’s online safety regulator, which is pursuing the company in a civil case for its alleged failure to comply with online safety laws.

X said that since Mr Musk’s takeover it had more than halved the number of moderators it directly employs and cut its global public policy staff by almost 80 per cent.

The platform has also reinstated over 6,100 previously banned Australian accounts in the first six months since the November 2022 acquisition announcement, including 195 accounts that were banned for hateful content violations, according to the regulator.

Globally, around 62,000 previously suspended accounts have reportedly been reinstated.

These accounts are believed to be operating without any additional scrutiny. X has also confirmed it is no longer running tests on its recommender systems to reduce the risk of amplifying hateful conduct, nor using automated tools to detect volumetric attacks or “pile-ons”.

“It’s almost inevitable that any social media platform will become more toxic and less safe for users if you combine significant reductions to safety and local public policy personnel with thousands of account reinstatements of previously banned users,” eSafety Commissioner Julie Inman Grant said on Thursday.

“You’re really creating a bit of a perfect storm.”

X has been under fire in Australia for allowing misinformation to flourish and for failing to answer the regulator’s questions on how it is tackling child sexual exploitation material.

In February last year, the regulator used the Online Safety Act’s Basic Online Safety Expectations to issue X and other platforms with notices requiring information on their protections for children.

The eSafety Commissioner released a report based on the responses in October, revealing that X had given answers that were incorrect, significantly incomplete or irrelevant. The company was issued a $610,500 fine, the first ever under Australia’s online safety laws.

X did not pay the fine and sought a judicial review.

In December, the regulator commenced civil proceedings against X for its alleged failure to comply with the reporting notice. The eSafety Commissioner is requesting the judicial review be heard in tandem with the civil penalty proceedings to avoid delays.

External regulation has proved necessary, according to Reset Australia executive director Alice Dawkins, who says the self-regulatory model platforms have operated under has produced mixed results.

“Regulation is only as effective as the compliance of the sector’s worst performers… The outcome achieved by the Office of the eSafety Commissioner, announced today, provides a clear example of the regulatory architecture required to compel basic responsiveness from industry participants,” Ms Dawkins told InnovationAus.com.

X has already been removed from the tech industry’s self-regulatory code for misinformation and disinformation after it shut down a user reporting tool and stopped cooperating with the industry group that administers the code.

Reset Australia exposed the removal of the tool and sought a resolution under the industry’s Code of Practice on Disinformation and Misinformation.

But its voluntary nature — X was removed from the code without additional punishment — shows it is “completely insufficient” to mitigate risks, Ms Dawkins said.

“Evidently, the more robust model driving the Basic Online Safety Expectations presents a more functional direction of travel,” she said.

The Albanese government will review the Online Safety Act this year and amend the Basic Online Safety Expectations for online services, including social media, to better protect children.

Ms Dawkins said it is an opportunity to “ramp up the focus on tech accountability to expand avenues for probes like this and take a systemic approach”.

“In a year widely recognised to carry severe digital risks for democracies worldwide, tech companies need maximum pressure from robust regulation to report on their risk mitigation efforts and be held accountable for them.”

Communications minister Michelle Rowland will also revamp the government’s plan to tackle online misinformation and disinformation with a more powerful regulator, following the struggles of the self-regulatory model.

The first version of this new approach was widely criticised, however, and is expected to be changed significantly.

X was contacted for comment but did not provide a substantive response.

Do you know more? Contact James Riley via email.
