Govt backs away from Big Tech self-regulation to give ACMA new powers


Denham Sadler
Senior Reporter

The media regulator will be handed new powers to enforce social media misinformation codes, nearly three years after this recommendation was first made and almost a year since a report on it was provided to the federal government.

The announcement represents a significant change in approach from the federal government, after its previous reliance on industry self-regulation proved to be ineffective.

Facebook removed 110,000 pieces of COVID misinformation posted by Australians last year. Credit: Twin Design / Shutterstock.com

Communications minister Paul Fletcher on Monday announced a widening of the Australian Communications and Media Authority’s (ACMA) remit when it comes to social media companies and Big Tech firms.

This will come in the form of legislation which the Coalition will introduce to Parliament in the second half of this year, if it wins the May federal election.

ACMA will be given powers to gather information from social media companies about their efforts to combat misinformation and disinformation on their platforms, and handed “reserve powers” to register and enforce existing industry codes if voluntary efforts are deemed to be inadequate.

This expansion of ACMA’s powers has been a long time coming, after it was first recommended by the competition watchdog in July 2019 as part of the digital platforms inquiry. ACMA itself provided a report to the government on the need for these powers in June last year, with this report only released publicly on Monday, nine months later.

Following the Australian Competition and Consumer Commission’s final report from its digital platforms inquiry in mid-2019, the government ordered social media companies to develop a voluntary code covering how they would deal with misinformation and disinformation on their platforms.

This code came into effect in February last year and has been adopted by eight companies: Google, Facebook, Microsoft, Twitter, TikTok, Redbubble, Apple and Adobe.

The code commits these companies to introducing and maintaining safeguards against misinformation, through a range of “scalable measures” to reduce its spread and visibility.

Under the code, signatories can choose which objectives and measures to agree to, and it does not apply to private messages, emails, satire, authorised government content or political advertising.

The code was slammed as being “woefully inadequate” at the time by the Centre for Responsible Technology.

ACMA delivered a report to the government on the effectiveness of this code, with recommendations for improvement, in June last year.

Nine months later, the government has now released this report and agreed to the five recommendations included in it.

It comes just days after Labor Senators called on the government to release this report and respond to it “immediately”.

The recommendations from ACMA include legislating to hand it new information-gathering powers to encourage better transparency from the tech companies about the effectiveness of measures aimed at addressing misinformation and disinformation.

These powers were first recommended by the ACCC in 2019 as part of the digital platforms inquiry.

The government will also look to give ACMA extra “reserve powers” to register and enforce the industry code and make industry standards if the voluntary efforts prove to be “inadequate or untimely”.

These initiatives will be legislated in the second half of this year, if the Coalition wins the upcoming federal election.

Under the plan, a Misinformation and Disinformation Action Group will be established to coordinate efforts in the space.

“ACMA’s report highlights that disinformation and misinformation are significant and ongoing issues,” Mr Fletcher said in a statement.

“Digital platforms must take responsibility for what is on their sites and take action when harmful or misleading content appears. This is our government’s clear expectation – and just as we have backed that expectation with action in recently passing the new Online Safety Act, we are taking action when it comes to disinformation and misinformation.”

ACMA’s report, delivered to the government in June last year, found that misinformation poses an “increasing threat to Australians”, and that more than 80 per cent of Australians had experienced misinformation about the COVID-19 pandemic.

The report was also critical of the industry’s voluntary code, saying it is “limited by its definitions”.

The code’s threshold for requiring action is that the misinformation or disinformation poses “serious” and “imminent” harm.

“The effect of this is that signatories could comply with the code without having to take any action on the type of information which can, over time, contribute to a range of chronic harms, such as reductions in community cohesion and a lessening of trust in public institutions,” the ACMA report said.

The watchdog also recommended that the code be made opt-out rather than opt-in, with signatories required to opt out of any outcomes that are not relevant to them and to provide a justification for doing so.

The code is also limited by the types of services and products it covers, according to ACMA, mainly due to private messaging being excluded.

Tech industry group DIGI, which led the development of the code, welcomed the release of the ACMA report and supported its recommendations in principle.

“This report will be a critical tool in our efforts to strengthen the code and maximise its effectiveness in addressing mis and disinformation online in Australia,” DIGI managing director Sunita Bose said.

“DIGI supports the ACMA’s five key recommendations in principle and we look forward to further work with the government on the details. We’ll be closely reviewing the report’s findings, as part of DIGI’s planned review of the code, where we intend to proactively invite views from the public, civil society and government about how it can be improved.”

The report is “well overdue” and confirms that digital platforms can’t be trusted to manage safety on their own sites, Centre for Responsible Technology director Peter Lewis said.

“It has been obvious for some time that advertising monopolies that are designed to keep their users on their platform for as long as possible should not manage their own affairs, as proposed by the industry,” Mr Lewis said.

According to Reset Australia director of tech policy Dhakshayini Sooriyakumaran, the new ACMA powers will only have a “limited impact”, and a complete reform of the misinformation code is needed.

“The era of Big Tech self-regulation is over. It is a failed project,” Ms Sooriyakumaran said.

“These new laws point to this reality, which tech regulation experts globally have recognised for years. The next crucial step for this legislation is shifting away from regulating bad content and bad actors, and instead regulating the platforms’ systems and processes.”
