Govt technology procurement is a ‘vulnerability’


Denham Sadler
Senior Reporter

The procurement process is a key point of weakness in the government’s development and use of emerging technologies, and human rights need to be better considered in this process, according to Human Rights Commissioner Ed Santow.

After three years of research and work, the Australian Human Rights Commission released its final report on human rights and technology last week, with nearly 40 recommendations for government to ensure emerging technologies are fair and protect human rights.

A number of the recommendations concern government procurement, which Mr Santow said is a significant vulnerability at the moment.

“The government needs to be able to ask the right questions in the procurement process. That is a point of real vulnerability. If you don’t get that right you can end up with an AI system that causes real harm,” Mr Santow told InnovationAus.

[Image: Coat of Arms. Procurement is the next big challenge for government innovation. Credit: Sunflowerey / Shutterstock.com]

“If you do get that right, you’re able to make sure that the process of procurement asks the right questions and has the right protections, and then you’ve got a really robust system that the government can be confident in.”

The Commission called on the government to instruct the Department of Finance and the Digital Transformation Agency to amend the current procurement laws, policies and guidelines to ensure human rights are protected in the design and development of new technology.

“It is increasingly common to use government procurement processes as a lever to influence behaviour to achieve other policy outcomes. It is vital that the government procures AI-informed decision-making systems that are safe and protect human rights,” the report said.

“The Australian government generally works with, and relies on, the private sector to develop AI-informed decision-making systems. It is well recognised, therefore, that government procurement should focus on ensuring these systems are safe and comply with human rights.”

This should start with a review of the current procurement rules and policies to ensure they reflect a human rights-based approach to new technologies. Protections should then be included for when the government is looking to procure an AI-informed decision-making system, with a focus on the tool being transparent, explainable and accountable.

The Human Rights Commission pointed to the UK government's Office for Artificial Intelligence guidelines for AI procurement, released last year, as an example of a way forward for Australia.

These guidelines aim to ensure that the risks associated with the technology are identified and managed in the early stages of procurement, and that explainability and interpretability of the algorithms are included as design criteria.

There is also a pressing need to improve knowledge within the public sector of these new technologies and the risks associated with them, Mr Santow said.

“You need to understand at a high level what the strategic risks and opportunities are so you can make a good decision about where it is safe and effective to use AI in government agencies or in a company,” he said.

“The government needs to understand in more detail in the procurement process where the weak points are so they can conduct the process well and address those risks.”

This doesn’t require every public servant to know about AI in depth, but a general upskilling around AI is needed, he said.

“You don’t need to be able to pull apart a car and build it from scratch, but you do need to know that if you push this lever then it goes forward so you can operate it safely. That level of upskilling is important,” Mr Santow said.

“It’s not about making everybody a data scientist or AI expert, it’s to enable people to operate effectively in a world that is increasingly operating using AI.”

The Human Rights Commission also recommended that the Digital Transformation Agency’s Digital Sourcing Framework for ICT procurement be amended to include specific references to human rights protections, and to require any vendor of an AI-informed decision-making system to complete a human rights impact assessment.

Further guidance should also be provided to help government decision-makers determine whether an AI-informed decision-making system will support compliance with the legal measures guaranteeing a right to a remedy, and to ensure compliance with relevant Australian and international standards, the report said.

Do you know more? Contact James Riley via Email.
