The government will look to regulate the use of individuals’ data by tech giants like Google and Apple if the private sector can’t enforce its own standards, Assistant Minister for Digital Transformation Angus Taylor said.
Appearing on Q&A on Monday night, Mr Taylor discussed how the government would deal with big data, the ethics of using algorithms to make decisions, and the impact of automation on Australian jobs.
Mr Taylor said the government wants to give Australians control of their own data.
“The way government is looking at it as the starting point is the customer must be in control of their data. That is absolutely essential,” Mr Taylor said.
“This means your banking data, energy usage data, telecommunications data – you have to be in control. That’s the fundamental principle. Start from there and a lot of the ethical issues become a lot easier to deal with.”
The government is “working through” the Productivity Commission’s report into data availability and use, which recommended a new Data Sharing and Release Act and the creation of a Data Custodian to guide the government’s use of and access to data.
But Mr Taylor admitted these efforts might clash with large tech giants like Google and Facebook, which currently control so much of our data. He said the government would look to regulate them if satisfactory standards weren’t set by the private sector.
“The customer being in control of their own data will come into tension with many organisations that have a lot of data. Ultimately we would want industry to regulate it themselves,” he said.
He pointed to the government’s work on open banking, which would allow consumers to direct their banks to hand over their own data or pass it on to a different financial services provider. Flagged in this year’s budget, the government is conducting a review of open banking in Australia and looking at how it should be implemented.
This strategy of encouraging the private sector to regulate itself before moving to make it mandatory is also how the government would approach other big data issues, Mr Taylor said.
“We’re saying that we want the customer to have control of their own data, and we want industry to come to its own conclusions. But the government will step in if it can’t sort it out. I think that’s the right way to do it,” Mr Taylor said.
But fellow panellist and technology futurist Shara Evans cast doubt on the government’s ability to impose any new laws or regulations on tech powerhouses like Google or Facebook.
“You have services, Google in particular, that are built on giving really cool stuff to people for free, on the trade that you’re making your data available to them to mine, scale and resell,” Ms Evans said.
“I don’t know if a little country like Australia is going to change Google’s global practices.”
Labor MP Terri Butler agreed with Mr Taylor that the government has an important role to play in ensuring people have control over their own data.
“When we see a situation where more and more data comes into existence, then the question is: how do we make sure that the power over that data resides with the people whose interests it should serve?
“The only real force against corporate use of our data is democracy, and the collective way we can stand together as a community against the interests of the very wealthy and very powerful, small few,” Ms Butler said.
“Even smaller countries like Estonia manage to have a system where every single person can see every time their data is accessed.”
“We might be a small country when it comes to standing up to Google and Apple, but we are nonetheless a country of 23 million people who can decide what laws operate here, and how they should be enforced. I don’t think we should write off democracy.”
The panel also discussed the increasing use of computer algorithms in place of human decision-making.
The Australian Government in particular has increasingly turned to these algorithms to make decisions: in the robo-debt scheme, in its immigration detention centres to assess the security risk of detainees, and in selecting the areas where drug testing of welfare recipients will be conducted.
Ms Evans said removing the human element of decision-making can have a range of negative consequences.
“Big data, like social media data, YouTube videos or databases, is fed into these artificial intelligence software constructs,” Ms Evans said.
“The software learns by basically sucking in that information from these millions and millions of data points, and it can unconsciously pick up social biases,” she said.
“If we start to have AIs that have these capabilities and are sucking in these unconscious social biases, and we don’t have humans in the loop checking that the inputs from this massive data are congruent with the outputs and the decisions being reached, then we can end up with some serious issues.”
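Ms Evans’s point about systems absorbing bias from their training data can be illustrated with a short sketch. The example below is purely hypothetical and not drawn from the program: the group labels, the historical records, the per-group approval-rate “model” and the audit threshold are all invented, but it shows how a system trained on skewed decisions reproduces that skew unless a human-in-the-loop check compares its outcomes across groups.

```python
# Illustrative sketch only: a toy "model" that learns approval rates from
# historical decisions, showing how a skewed training set reproduces its own
# bias unless an audit step compares outcomes across groups for human review.
from collections import defaultdict

# Hypothetical historical decisions (group label, approved?) with a built-in skew.
history = [("A", True)] * 80 + [("A", False)] * 20 + \
          [("B", True)] * 40 + [("B", False)] * 60

def train(records):
    """Learn the historical approval rate for each group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [approvals, total]
    for group, approved in records:
        counts[group][0] += int(approved)
        counts[group][1] += 1
    return {g: approvals / total for g, (approvals, total) in counts.items()}

def predict(model, group):
    """Approve whenever the learned rate for that group is at least 50%."""
    return model[group] >= 0.5

def audit(model, threshold=0.2):
    """Human-in-the-loop check: flag large gaps in approval rates between groups."""
    rates = sorted(model.values())
    gap = rates[-1] - rates[0]
    if gap > threshold:
        print(f"WARNING: approval rates differ by {gap:.0%} across groups - review needed.")

model = train(history)
print({g: f"{r:.0%}" for g, r in model.items()})  # {'A': '80%', 'B': '40%'}
print(predict(model, "A"), predict(model, "B"))   # True False: the historical skew is repeated
audit(model)                                      # flags the disparity for a human to examine
```

Nothing in this sketch is specific to any government system discussed on the panel; it simply makes concrete the gap Ms Evans describes between checking inputs and checking the decisions that come out the other end.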
Prominent British scientist Brian Cox used an “extreme” example to describe the inherent dangers of removing human morality from decision-making through the use of these algorithms.
“One of the concerns is that you could sub out decision-making, for nuclear war say, to an expert system. You could probably write an algorithm to run your defence forces, so if you saw a particular threat to your country it would deliver some sort of military response,” Prof Cox said.
“But that runs the risk of removing human morality from the decision-making. That’s a real threat to civilisation. To what level do you allow automatic, autonomous experts to make decisions where it may be good to have a moral component in those decisions?”
Discussion also turned to automation and job losses, with Professor Cox saying that governments have an important role to play in ensuring the population is prepared for this disruption and ready to take on the jobs of the future.
“The definition of an innovation system is that it replaces the jobs that it destroys with new jobs, and it creates new jobs faster than it destroys old ones,” he said.
“The challenge for government is to make sure it has a research, innovation and education system that creates new jobs and educates the workforce faster than the old jobs are destroyed.
“You can’t hang on to the old jobs, they will be destroyed.”