CSIRO’s Data61 has commissioned research projects from five universities into the use of emerging technologies to improve health and safety outcomes at potentially dangerous worksites.
Prototype projects will be built over five years and be trialed in real-world scenarios with the aim of developing commercially viable products that are “responsible by design”.
Data61 is committing $4.75 million in cash and $5.1 million in-kind, with the universities contributing another $4.75 million in cash and $3.5 million in-kind.
Researchers will bring expertise in a range of emerging technologies such as generative and immersive artificial intelligence, augmented reality, and cybersecurity.
The Tech4HSE program was announced by Minister for Industry and Science Ed Husic at the launch of Australia’s AI Month in Canberra on Wednesday.
The program will involve researchers from the University of Queensland, Swinburne University of Technology, University of New South Wales, Curtin University, and the Australian National University.
The first research project, a collaboration between Data61 and the University of Queensland, focuses on supporting crisis preparedness and response for workers in the energy industry.
Technology being developed as part of the project, which is already underway, will combine computer vision models with 3D generative AI.
Data61 science director Professor Aaron Quigley said the technology could support health, safety, and environmental (HSE) objectives for workers in a wide range of industries “whether they’re working with electrical equipment, heavy machinery or on our roads”.
“We’re bringing the best researchers in the nation together to help get everyone home safely, by creating advanced digital tools for training, identifying and monitoring hazards, and planning responses and actions,” Mr Quigley said.
The Australian Bureau of Statistics recorded around 497,300 people suffering a work-related injury or illness in 2021-22.
In developing the new AI tools as “responsible by design”, the researchers may refer to Australia’s AI Ethics Framework, released by Data61 in 2019. The federal government is currently considering legislative reforms to foster safe and responsible AI.
At the start of the month, Mr Husic travelled with Deputy Prime Minister Richard Marles to the United Kingdom to attend the AI Safety Summit hosted by UK Prime Minister Rishi Sunak.
During the summit, Australia joined 28 other countries in signing a joint declaration for safe and responsible AI development, as well as the separate Declaration on Responsible Military Use of AI and Autonomy. Ahead of the conference, US President Joe Biden issued an executive order to implement new AI guidelines.
“Australia is considering how our expertise, and we have a considerable pool to draw from, can best support the work of new AI safety summits that are being set up in the UK and the US,” Mr Husic said.
“These institutes are going to set up new ways of testing AI models to ensure they’re being developed in line with global safety standards, and we’re currently working out how we will support the agreement through our own actions in Australia.”
Mr Husic told InnovationAus.com that the government would need to be especially careful about how it uses AI, particularly in the context of automated decision-making.
“Automated decision-making will get increasingly used by governments off the back of AI, largely because AI can crunch through a lot of data very quickly, and generally a lot more accurately. But you can’t just make the assumption that it works flawlessly,” Mr Husic said.
“You need to have things built into your system that allow people to say something has gone wrong or not met expectations and allow it to be fixed, and to not have what people experienced through Robodebt.”
In its response to Robodebt, the government committed to a new oversight regime for automated decision-making algorithms, which may include making business rules and algorithms available “to enable independent expert scrutiny”.
Do you know more? Contact James Riley via Email.