If you think ‘transparency’ is a buzzword, you could easily be forgiven. We regularly hear calls for, and commitments to, corporate transparency and, while we know it’s a good thing, do we really know why?
It’s a question that the Digital Industry Group Inc. (DIGI) grappled with when we enshrined transparency as central to the theory of change in the Australian Code of Practice on Disinformation and Misinformation (ACPDM).
This week, we launched the third set of tech companies’ transparency reports under the code, a reminder that while the code is no longer new, it remains novel as one of only two such codes in the world.
Signatory technology companies have voluntarily committed to, and continue to evolve, safeguards that protect against online disinformation and misinformation.
A novel code and a highly complex issue mean the policy pathway forward is not clearly laid out, which is why DIGI has had to create structures that deepen accountability and encourage continued industry action.
This includes the appointment of Hal Crawford as an independent reviewer, who develops best practice reporting guidelines to drive improvements in signatories’ transparency reports. He assesses each transparency report against those guidelines, asks signatories for improvements, and attests to their claims prior to publication.
The first set of transparency reports contained mostly qualitative descriptions of signatories’ efforts; since the independent review process was introduced, subsequent reports have also included more quantitative insights, laying the foundations for comparison.
For example, TikTok has improved its proactive enforcement, removing a greater share of misinformation content before users even view it: up from 37.6 per cent in Q1 to 69.8 per cent in Q4 of 2022. Meta has reported on its work to display warnings on content found to be false by independent third-party fact-checkers; over 2022, it displayed warnings on more than 9 million distinct pieces of content in Australia.
Yet in their efforts toward transparency, signatories have to be extremely careful that the information they release does not inadvertently hand relentless and resourceful bad actors a playbook for amplifying harmful misinformation.
Google’s report details a Chinese influence campaign known as ‘Dragonbridge’ that perpetuates disinformation on news topics ranging from China’s COVID-19 response to the war in Ukraine. Despite this content receiving relatively low engagement from users, Google disrupted over 50,000 instances of ‘Dragonbridge’ activity across its services in 2022, and terminated over 100,000 accounts.
The transparency reports shed light on how the issues prone to disinformation have shifted. While efforts continue in relation to COVID-19, 2022 presented new challenges, such as the war in Ukraine, that signatories have reported on. Microsoft reported that Russian influence operations leveraged energy and food supply concerns across Europe to weaken the alliance against the Russian invasion.
Importantly, 2022 saw the first Australian federal election since the code was introduced, and we believe the code contributed positively to the Australian Electoral Commission’s determination that the election saw much lower levels of electoral mis- and disinformation than comparable elections in like-minded democracies across the globe.
The electoral integrity work behind that outcome is detailed in the reports: in addition to removing content, multiple signatories elevated official electoral information by directing Australians to the AEC.
We know that signatories are now preparing similar work to prevent mis- and disinformation in relation to the referendum on an Aboriginal and Torres Strait Islander Voice to Parliament, which we expect will be reported on in their reports covering the 2023 calendar year.
Content removal is often the focus of conversations about tackling misinformation, but it is only one lever in an effective, holistic response. This broader approach is partly due to the diversity of business models and services among signatories, and to a recognition that research and education measures are essential to a sustainably effective response.
The reports show how signatories are supporting important media and digital literacy initiatives both ‘on platform’, such as by providing fact-checking training to influencers and helping people identify trusted information, and ‘off platform’, such as by sponsoring expert-led programs.
Transparency reporting helps create a culture of accountability, establishes benchmarks for future progress, and creates public resources that can provide researchers, civil society and governments with insights into the scale and management of mis- and disinformation.
Transparency needs to be seen in concert with other industry, government and civil society efforts. That is one of the reasons we strengthened other code commitments in a planned review last year, and why we remain supportive of the ACMA receiving a longer-term mandate on this issue to reinforce DIGI’s efforts.
Ultimately, sustained shifts in the fight against mis- and disinformation require a multi-stakeholder and multi-pronged approach, of which industry transparency is a key component.
Sunita Bose is the managing director of the Digital Industry Group Inc. (DIGI)