Australia’s online safety codes should be abandoned at the eleventh hour in favour of stronger regulator-led rules, digital rights advocates say, warning the self-regulation model will put young Australians at risk compared with tougher approaches overseas.
Australia’s online regulator, the eSafety Commissioner, is finalising her approval of the safety codes developed by industry over the past year to regulate the treatment of certain online content.
The approach, like similar industry co-developed regulation such as Australia’s Code of Practice on Disinformation and Misinformation, offers weaker protection for end users than regulator-led models, according to Reset Australia.
The group will release a report on Saturday showing similar public sentiment, with a 1500-person survey finding 86 per cent of respondents want either the regulator or the Parliament to write online safety codes. Just five per cent backed the social media industry to write its own rules.
The codes are part of Australia’s controversial Online Safety Act, which passed Parliament last year with bipartisan support despite a truncated public consultation and wide criticism from digital rights groups.
Reset Australia says the draft codes released for public consultation in September reveal comparatively weaker rules for internet companies in Australia.
“This approach just doesn’t work,” said Reset Australia director Dr Rys Farthing.
“Social media companies have had 20-odd years to improve children’s safety and privacy, but they chose not to. It’s unclear why they’d change their minds now, so allowing them to draft any codes will not improve things one iota.”
The proposed codes, drafted by the industry group representing social media companies, commit those companies to certain default settings for children’s accounts.
The default setting when a “child” creates an account must protect the user “from unwanted contact from unknown end-users” and prevent their location data being shared with other accounts.
The draft code did not originally specify the age at which this would apply, but the industry groups have since updated it to clarify that the default settings apply only to users under 16.
In the UK, Ireland, and California – where regulators or legislators have developed rules for default privacy settings – they apply to any user under 18 by default.
The overseas regulator-led approaches also impose tougher rules on the collection of children’s location data.
“In Australia, the proposal is to not broadcast children’s location,” the Reset Australia report said. “In jurisdictions where codes have been drafted by regulators and legislators, they propose the stronger step of not collecting children’s locations in the first instance.”
Prohibiting sharing but not necessarily collection ignores the risk of data security failures, errors, or companies flouting the rule by selling the data for purposes such as targeted advertising.
Reset Australia also warns the proposed industry code dealing with reporting child exploitation material is not as strong as the legislator-led rules in the UK.
The draft Australian codes require a company to report child sexual exploitation material only when it discovers the material and forms the view that it poses a risk to an Australian.
The double requirement creates a higher reporting threshold, Reset Australia said.
“This potentially provides social media companies discretion to decide which images represent a serious and immediate threat to a child and which do not, and which to report and which not to,” the report said.
“This discretion may be problematic. Reporting all CSEM materials to authorities may share useful evidence or allow authorities to uncover patterns of behaviour and threats unknown to staff at Meta or Snapchat.”
Reset Australia wants the industry-drafted safety codes to be abandoned and replaced with ones written by the eSafety Commissioner.
Longer term, the group recommends that the codes become the remit of the privacy regulator and that any existing co-regulatory models be replaced by regulator- and legislator-drafted ones.