A review has been launched into the Big Tech misinformation code a year after it came into effect, but the global giants have already flagged they are likely to reject many of the media watchdog’s recommendations to improve it.
The Australian Code of Practice on Disinformation and Misinformation was launched in February last year after being developed by industry group DIGI in response to the competition watchdog’s digital platforms inquiry.
It includes a number of opt-in commitments for its voluntary signatories, including public reporting on efforts to combat misinformation and disinformation, and having effective complaints mechanisms in place.
The code has been adopted by Adobe, Apple, Google, Meta, Microsoft, Redbubble, TikTok and Twitter.
DIGI has now begun the planned review of the code, a year on from its launch, and has also released an annual report on its implementation.
“The goal of the code is to incentivise best practice by technology companies in relation to misinformation and disinformation, through driving greater transparency, consistency and public accountability,” DIGI managing director Sunita Bose said.
“Hearing the views of academics, civil society and other Australians through our previous public consultation helped DIGI shape the code, and now we want to hear from them again as to whether it needs to be amended.”
An ACMA report completed in June last year made a number of recommendations for changes to the code, including that the commitments in it should be opt-out rather than opt-in, that it should be expanded to include private messaging and that its definitions should be altered.
In its discussion paper, DIGI rejected these changes.
On the opt-out issue, DIGI said that having flexibility is crucial and it would instead support a requirement that companies regularly reassess which outcomes should apply to them.
“DIGI considers the flexibility provided by the opt-in model is currently working well. Five out of the eight signatories have currently opted in to all of the measures set out in the code. DIGI’s experience is that this flexibility was key to securing the commitment of diverse signatories to the code,” the DIGI paper said.
ACMA had raised concerns that because the code’s definition of misinformation and disinformation requires content to pose a “serious” and “imminent” threat, the code is “limited” and does not address broader harms online.
But this definition is in line with the ACCC’s digital platforms inquiry, DIGI said.
“The difficulty with that approach is that it is extremely difficult for platforms to foresee when an accumulation of instances of misinformation on a given topic is likely to result in harm that may be years or decades away,” it said.
DIGI also railed against the inclusion of private messaging in the code, saying this would impact the privacy of users.
“Users of messaging services have greater expectations about the privacy of their personal communications and will likely be seriously concerned about the imposition of regulations that require businesses to monitor or control the content of their messages for truthfulness,” the DIGI discussion paper said.
“In our view, the scope of the existing laws do not indicate that there is any need for digital platforms to undertake voluntary intrusive monitoring of users’ private communications beyond those given to enforcement agencies.”
DIGI also said that the ACMA report was completed before new governance arrangements around the code were implemented in October last year, including the establishment of an Administration Sub-Committee with representatives from signatory companies and independent members.
Companies signed on to the code released their annual reports under it last week, revealing how they are tracking against each outcome they have agreed to.
This revealed that Facebook saw a sharp increase in the amount of misinformation and disinformation it removed, hitting 180,000 removals in 2021, up from 110,000 the previous year.
Submissions to DIGI’s review of the code are open until July 18.
The previous Coalition government pledged to hand ACMA significant enforcement powers over the code, including the ability to compel the tech firms to hand over information on their performance.
New Communications Minister Michelle Rowland has criticised the Coalition for waiting until late in its term to announce the new powers.
“The government waited until the dying days of the 46th parliament to flag a process to ‘scope’ ACMA powers, nine months after the ACMA recommended legislation in this area and almost three years since the ACCC recommended an enforceable code for disinformation and misinformation,” Ms Rowland said.
“While a voluntary code is in place, the regulator still doesn’t have the power to investigate what harms are presenting or how they’re being dealt with by digital platforms.
“The question of whether and how to regulate misinformation and disinformation is a live one in Australia and other jurisdictions around the world, and debate around this issue is healthy.”