Social media plan backlash


Denham Sadler
Senior Reporter

A federal government plan to block sites hosting “extreme and violent content” during a crisis event has been labelled a “dangerous proposition” by digital rights advocates, with the Coalition urged not to rush its latest social media crackdown.

Prime Minister Scott Morrison on Monday took his efforts to crack down on social media platforms hosting violent, terrorism-related content to the world stage.

A new partnership with the OECD and New Zealand to develop Voluntary Transparency Reporting Protocols was unveiled, which would apply to the major social media companies.

Back in Australia, the Coalition announced over the weekend that it would be creating new rules to formally allow telcos to block sites hosting extreme content during “crisis events”.

The new global protocols, which Australia has contributed $300,000 towards, are to “strengthen transparency by tech companies in a bid to prevent online terrorist activity”, Mr Morrison said, and would reveal how the platforms are “preventing, detecting and removing terrorist and violent extremist content”.

“This work will establish standards and provide clarity about how online platforms are protecting their users, and help deliver commitments under the Christchurch Call to implement regular and transparent public reporting in a way that is measurable and supported by clear methodology,” Mr Morrison said in a statement.

“Digital industry will benefit from establishing a global level playing field. The project will assist to reduce the risk of further unilateral action at national levels, avoid fragmentation of the regulatory landscape and reduce reporting burdens for online platforms.”

Two of the largest social media companies, Facebook and Twitter, already produce twice-yearly reports focusing on the content and accounts that they have removed from their platforms, and it is unclear how Mr Morrison’s plans differ from these voluntary reports.

Over the weekend the Coalition also revealed its plan to give local telcos permission to block sites hosting “harmful and extreme content” during terror incidents.

Such blocking occurred following the Christchurch terrorist attack earlier this year, when telcos including Telstra and Optus temporarily blocked sites found to be hosting footage of the attack.

This was done independently by the telcos, and the Coalition’s new policy would “establish a clear and unambiguous content-blocking framework for crisis events”, Communications Minister Paul Fletcher said.

“This was a responsible move taken by Australia’s major telecommunications companies that prioritised the safety of Australians online,” Mr Fletcher said.

“It is important that the government gives the industry the backing it needs for this type of action, now and into the future.”

Under the plan, formal powers will be given to the telcos to block sites hosting “extreme violent content” during “crisis events”, with the eSafety Commissioner to direct the companies based on independent determinations on a case-by-case basis to “keep Australians safe online while upholding important internet freedoms”.

The government would also establish a 24-7 Crisis Coordination Centre to “monitor and notify relevant government agencies of online crisis events” and assist the eSafety Commissioner during these events.

But Electronic Frontiers Australia policy committee chair Angus Murray said this is a “dangerous proposition”.

“The power to censor is a slippery slope that ought not be implemented without comprehensive, informed and careful consideration about the adequacy and proportionality of such a scheme,” Mr Murray told InnovationAus.com.

“At this stage, it is difficult to see how content blocking arrangements could be implemented without causing great cost and inconvenience.

“It is also a path that leads towards censorship and, although terror attacks should not be promoted, it is important that the social, cultural and political advantages of an open internet are not thrown out with the bathwater,” Mr Murray said.

The government’s announcements lack detail and appear to have been rushed, according to Fergus Hanson, head of the Australian Strategic Policy Institute’s International Cyber Policy Centre.

“We’ve had a lot of new initiatives in this space recently and it tends to be issues that have been around for a long time, but suddenly we have an initiative to try to fix them and then there’s not a lot of time to get the details right,” Mr Hanson told ABC Radio National on Monday morning.

“We don’t have a lot of details at the moment, and they could run into problems trying to implement something very quickly,” he said.

“As we’ve seen in the past with the abhorrent content law, you can quickly make a law that doesn’t really work and can’t be effective.”

It’s important the government outlines exactly what sites will be targeted with the new site-blocking powers, and the type of content that is the focus of the new legislation, Mr Hanson said.

“We want to be very clear about what is going to be blocked and what isn’t, and the parameters that are going to be used to determine what is taken down. It’s really difficult if you don’t have really clear rules in place,” he said.

“This is a problem that has been around for a long time. That is justification for taking a little more time to make sure we get the legislation right.”

Mr Morrison has been pushing for a range of measures to force tech companies to better handle and remove “abhorrent” content from their platforms. He has taken these attempts to the world stage, securing an agreement from leaders at the G20 conference earlier this year to take on social media companies that fail to act against the live-streaming of terrorist events.

Following the Christchurch terror attack, which was live-streamed on Facebook, the government formed a taskforce, with members coming from the likes of Facebook, YouTube, Microsoft, Twitter, Telstra and Optus.

The Coalition also unveiled and quickly passed legislation in April introducing new fines and potential jail sentences for the executives of tech companies that fail to take down “abhorrent violent material” quickly enough.

But Mr Hanson said this legislation was too rushed and is ineffective.

“Pretty much all the platforms would be in breach at the moment. It’s currently impossible for them to completely remove this content from their systems. We’ve created a law that doesn’t really have the right parameters to make it effective, and that’s a product of trying to do it too quickly without enough consultation,” Mr Hanson said.

