The use of facial recognition by law enforcement is “misunderstood” and there is no agreed definition of high-risk applications, according to NSW Police, which has defended employing the controversial technology for “intelligence purposes”.
Australia’s largest police force has also refused to suspend its use of facial recognition, as alarm over its use grows following revelations major retail chains were using the technology without the active consent of customers.
Calls for a temporary ban on facial recognition technology (FRT) in government decision-making that has a “legal, or similarly significant, effect for individuals” were first sounded by the Australian Human Rights Commission (AHRC) last year.
In the absence of legislation, the Commission called out policing and law enforcement as areas where there is a “high risk to human rights” with its use of “one-to-many facial recognition to identify criminal suspects, detect crimes and help find missing persons”.
“While these objectives are legitimate and can promote community safety, there are significant human rights risks associated with the use of facial recognition for these ends,” the report said, adding that FRT can result in individuals being “wrongly identified”.
But responding to questions on notice from recent budget estimates hearings, Police Minister Paul Toole flatly rejected the findings of the AHRC report, which he said was considered by NSW Police’s Governance Board in August 2021, three months after its release.
“The board noted that the use of facial recognition capability by law enforcement had been misunderstood, and the concept of ‘high-risk’ facial recognition has not been effectively described, nor does it reflect the way in which the NSW Police Force uses this technology,” he said.
Minister Toole was responding to questions from Greens MP Abigail Boyd, who also asked whether NSW Police would suspend its use of facial recognition until the AHRC recommendations are implemented, to which he responded: “no”.
“The NSW Police Force will continue to use commercially available technologies in its mission to ensure a safer NSW community,” he said, while adding that the force will consider AHRC recommendations that go to the “practice and governance” of FRT.
The previous Coalition federal government never provided a formal response to the AHRC report but the Attorney General’s Department said the report will “inform Government’s continued engagement on the various issues raised”.
NSW Police advised that FRT is being used to “identify potential suspects of crime, unidentified deceased and missing persons”, but stressed that the information is only used for “intelligence purposes” and it is “committed to the responsible and ethical use of FRT”.
The force also hasn’t ruled out expanding the use of the technology for predictive policing, with Minister Toole saying: “The NSW Police Force continually reviews new technology to assist police in their role, and will consider expanding the use of technology, as required”.
NSW Police last year began testing the federal government’s Face Matching Services, which allows photos to be compared against existing government-issued identity documents without legislation governing its use. The force has used FRT since 2004.
A key point of contention with the report appears to be the AHRC’s definition of high-risk, which Minister Toole said is “not agreed upon”, despite growing calls globally for changes to law enforcement access to facial recognition.
Last month, experts at the University of Technology Sydney’s Human Technology Institute, including former Australian Human Rights Commissioner Ed Santow, called for the government to adopt a model law that would govern the use of FRT, including in policing.
Attorney General Mark Dreyfus has previously signalled FRT could be best addressed through the Privacy Act review, already underway for two years and scheduled to be with government by the end of 2022.
Governments internationally are also looking at the issue of FRT including in the United States, where a group of US Democrats introduced a bill to “place strong limits and prohibitions on law enforcement use of facial recognition technology” last month.
The Facial Recognition Act of 2022 would require police to obtain a warrant “showing probable cause that an individual committed a serious violent felony” from a judge before using the technology.
The bill would also prohibit law enforcement from using facial recognition “at protests, to track individuals with real-time face surveillance, in conjunction with body cameras, and in other problematic scenarios”.