Digital Rights Watch welcomes OAIC landmark determination that Bunnings breached Australians’ privacy with facial recognition
Digital Rights Watch welcomes the determination from the Office of the Australian Information Commissioner today on Bunnings’ use of dangerous and invasive facial surveillance technology. This represents a landmark decision, and corporate Australia should take it as a warning about the use of this technology.
Digital Rights Watch is pleased by the outcome of the OAIC investigation. It represents a far-reaching and significant determination on the legality of facial recognition technology in Australia, clearly setting the rules for all businesses and organisations using, or considering using, the technology.
Facial recognition technology as used by Bunnings collects sensitive biometric information that can uniquely identify you, much like a fingerprint. The huge public outcry at the time of the CHOICE investigation showed that Australians are deeply concerned about the use of this invasive technology. Our friends at CHOICE should be commended for their groundbreaking investigation and for their tireless advocacy to hold Bunnings to account.
Covert use of facial recognition technology in retail settings and in public spaces impinges on our human right to privacy and normalises surveillance. The technology is prone to inaccuracies and bias, with higher rates of false identification for people with darker skin leading to discrimination.
Quotes attributable to Kate Bower, Digital Rights Advocate, Digital Rights Watch
“We commend the OAIC for making a clear determination and ordering Bunnings to stop the practice. But the fact that it has taken 2.5 years for the regulator to come to a decision highlights the inadequacies of our current law,” says Digital Rights Advocate, Kate Bower. “Australians want and deserve clear and enforceable laws that match community expectations. They want to be able to go to the shops without having their biometric information collected by big corporations. We need a properly resourced regulator that can take on these issues, and the capacity for complaints about privacy to be heard by courts directly.”
“We urgently need strong technology-specific regulation that clearly prohibits the use of facial surveillance technologies in retail and other settings. Without strong privacy protections or a human rights act, Australians’ human rights are undefended and open to exploitation for profit.”
“Digital Rights Watch calls on the Attorney-General to take swift and decisive action to prohibit the use of facial surveillance technology by urgently introducing legislation so that Australians are not left waiting for more than two years for their rights to be protected.”
Media contact for interview: kate.bower@digitalrightswatch.org.au or media@digitalrightswatch.org.au +61 403 015 007