Misinformation reports produced by digital platform providers like Meta, Google and TikTok have been plagued by data integrity issues for the fourth year running, according to the regulator that could soon be stepping in.
Some companies are also resisting requests for information on specific incidents linked to online misinformation, like the Bondi Junction stabbing, while X failed to provide any information after being kicked off the industry-led code last year.
The regulator is warning that these failures are undermining public confidence and exposing the limits of the current voluntary arrangements in Australia, as the Albanese government pursues mandatory reporting and new watchdog powers.
The Australian Communications and Media Authority on Wednesday repeated its annual warning that platforms aren’t providing consistent or localised data on their efforts to curb misinformation and are taking only “minimal steps” to improve the disclosures.
The lack of detail, and the few consequences for it, mean “Australians cannot be confident that platforms are delivering” on their requirements under the industry code, the watchdog said.
It urged Parliament to pass legislation that would allow it to step in with higher expectations for platforms, backed by large fines, warning that the rise of AI and looming global elections threaten to turbocharge misinformation.
A bill before Parliament would hand the ACMA new powers to collect more information from the companies and step in with its own standard if the code continues to fall short, backed by large civil penalties.
But it currently acts only as an overseer of the four-year-old industry effort coordinated by the Digital Industry Group, and has few enforcement options.
The ACMA warned about data quality issues last year and its latest report says “minimal if any improvement to qualitative reporting has been identified” this time around. Four years in, it remains “disappointing to see limited progress in this area”.
The industry code is outcomes-based, allowing the platforms flexibility and a risk-based approach to minimising the risks and harms of misinformation and disinformation. The code currently has nine signatories: Adobe, Apple, Google, Meta, Microsoft, Redbubble, TikTok, Twitch and Legitimate.
But there have been repeated data integrity issues across years and between companies, and no signatory provided Australia-specific metrics for all code outcomes. The ACMA’s analysis also found “no discernible progress” on identifying key performance indicators, making it hard to hold the platforms to account.
Stakeholders are instead left with complex and potentially inaccurate reports to make sense of, the ACMA said.
“In our view, transparency reports should allow interested members of the public, academics, researchers and government agencies to easily access and extract information,” the report said.
“The significant effort required to extract and verify data significantly limits the utility of information reported by signatories.”
The regulator has warned of these integrity issues before and this year began developing its own framework for more consistent reporting on misinformation. It hopes the industry code will be updated to include it but is yet to finalise the metrics and has no assurance it will be added to the code.
In 2021, the ACMA asked the government for formal investigative and information gathering powers, along with the ability to enforce the industry code and, when it isn’t working, to implement its own standard.
On Tuesday, the regulator said its three-year-old recommendations “remain relevant” and “their implementation is more urgent than ever, given community concern” about misinformation.
Both Coalition and Labor governments have put the ACMA at the heart of new schemes to administer the code and implement standards when needed. The Morrison government’s plan never made it to Parliament, while the Albanese government is now on its second attempt at the underlying legislation.
A bill introduced by Communications Minister Michelle Rowland last week would establish the scheme, giving the ACMA the graduated powers it wants while setting high thresholds for misinformation and protections for free speech.
No government has proposed allowing the ACMA to remove specific content.
X, the platform formerly known as Twitter, lost its code signatory status last year after it removed user misinformation reporting tools and stopped cooperating with the Digital Industry Group.
The company never formally responded to the decision and has not answered the ACMA’s enquiries on the matter either.
It also did not provide information requested by the ACMA in April about how it was addressing misinformation during the Bondi Junction and Wakeley Church attacks in Sydney.
“This is particularly concerning given analysis published by the Australian Broadcasting Corporation which tracked the spread of misinformation on the platform following the Bondi Junction attack,” the ACMA said in the new report.
Almost all the other platforms did provide the requested information, but it was patchy.
“The limited and asymmetric information provided reflects digital platforms’ unwillingness to voluntarily engage with the ACMA. This behaviour highlights the urgent need for information gathering powers. In the absence of such powers, the ACMA has been unable to provide transparency to the Australian public about the effectiveness of platform responses after these events.”
Do you know more? Contact James Riley via email.