The experts charged with advising the Prime Minister on technology have spent the last month examining artificial intelligence and a potential government response, with findings expected “shortly” to inform new national policies.
Australia’s National Science and Technology Council was directed last month by Industry and Science Minister Ed Husic to investigate the increasingly popular technology, which experts say is laden with risk.
It comes as federal government AI initiatives have slowed over the last two years, a time in which AI — particularly generative AI — has risen to prominence.
Mr Husic revealed his direction on Wednesday at the National Press Club, where he also advocated for a proactive approach to the technology to avoid misuse.
“I quietly, about a month ago, asked the National Science and Technology Council to consider this: to consider the pathways that got us to this point, how the technology will evolve, the sort of generative AI, some of which has been captured around GPT,” he said.
“And to think through what the implications are and how government should respond. That work is being done.”
The National Science and Technology Council is the preeminent group for providing scientific and technological advice to the Australian government. It was formed in 2019 and meets three or four times a year.
The Council is chaired by Prime Minister Anthony Albanese, with Mr Husic as deputy chair. Australia’s Chief Scientist Dr Cathy Foley is its executive officer, and CSIRO chief executive Dr Larry Marshall serves as an ex officio member.
Its expert members are:
- Professor Genevieve Bell AO
- Professor Debra Henly
- Professor Brian Schmidt AC
- Professor Fiona Wood AM
- Emeritus Professor Cheryl Praeger AC
- Associate Professor Jeremy Brownlie
Professor Bell, a former Intel board member and director of the ANU School of Cybernetics, is heading the work on AI, which Mr Husic said will be peer reviewed. Professor Bell has previously represented Australia in a global AI partnership promoting responsible and human-centred use.
“We’re expecting that report shortly to help inform policy work,” Mr Husic said.
Mr Husic, who last week launched Australia’s Responsible AI Network, advocated for a more proactive approach to the safe use of emerging technologies, including by government.
“There may be in time an expectation, when the quality and the confidence around the use of technology occurs, that we’ll expect AI and automated decision making to grow at scale.
“Governments will potentially — there’s a scenario — be questioned about why they didn’t use AI to do that.”
Mr Husic said Australian businesses and government need to take a proactive approach because regulation of the technology is almost inevitable.
“It will trigger calls for regulation and necessity for protection. And if we don’t get it right, you’ll see the type of response that’s come out of, for instance, what you’ve seen with Optus and Medibank.
“If businesses don’t get their frameworks right, the community expectation, absolutely understandably, is the government will step in. And then there’s a whole debate about are you being too tough or too soft? [It is] better to think ahead, get it right that way.”
The Australian Information Industry Association quickly welcomed the movement on an AI framework on Wednesday. The industry group is developing its own guidance on AI, which will be put to government next week.
“AI is not merely the stuff of science-fiction; it is already playing a role across the economy,” AIIA chief executive Simon Bush said.
“Now is the time to embrace these technologies and implement practical guardrails around its ongoing use.”
Do you know more? Contact James Riley via Email.
Checked LinkedIn, guys. Genevieve is an anthropologist. Deb did biochemistry, biophysics and molecular biology. Brian read books on astronomy and physics. Fiona is into plastic surgery. Cheryl is a mathematician, which is cool, and Jeremy is a big guy in genomics and molecular entomology. Apart from Cheryl’s work there’s not much AI in there.

How does this keep happening? Let’s get a bunch of people who aren’t from the topic area, label them “experts”, and get Ed Husic to “quietly” ask them how the technology will evolve generative AI. This amazes me every time. Why do we keep asking people from anthropology, astronomy, plastic surgery, genomics and biochemistry about things outside their fields? Should we ask them to advise Phil Lowe on economics too? They didn’t study that. Or maybe we’ll ask them to report on the Murray River for the MDBA, or electricity stations for the AEMO.

How does a report on the Australian Public Service sound? Dave Thodey got a job doing that and he’s an anthropologist. Is that weird? Sure … but then Ed is a journalist, and journos can write about things they never studied, and people can “think” about AI – like, “I wonder what it is?”