On 29 October 2024, after an almost two-year investigation, the Australian Privacy Commissioner (Commissioner) determined that retail giant Bunnings had interfered with the privacy of hundreds of thousands of customers through its use of facial recognition technology at 62 of its retail stores around the country between 6 November 2018 and 30 November 2021.[1]
Facial recognition technology (FRT) is a computer vision technology that uses biometric information (e.g. facial features) to identify or verify a person in a digital image or video. A well-known example is where law enforcement uses FRT to determine whether a suspected criminal’s face matches any template held in their database. Private companies also increasingly use FRT for various purposes, which has raised concerns about the potential misuse of the technology and its ethical and privacy implications.
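By way of illustration only, the short sketch below shows, in simplified Python, how one-to-many “identification” of the kind described above can work in principle: a captured face is converted into a numerical embedding and compared against a database of enrolled templates, with a match reported only above a similarity threshold. The embedding dimension, the 0.8 threshold and the function names are assumptions made purely for illustration; real FRT systems rely on trained neural-network models and vendor-specific matching logic, and nothing below describes any particular deployment.

```python
# Illustrative sketch only: simplified one-to-many face "identification".
# Real systems derive embeddings from trained face-recognition models;
# here random vectors stand in for them, and the 0.8 threshold is assumed.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two embedding vectors (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def identify(probe: np.ndarray, database: dict, threshold: float = 0.8):
    """Return the ID of the best-matching enrolled template above the
    threshold, or None if the probe matches no enrolled individual."""
    best_id, best_score = None, threshold
    for person_id, template in database.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id


# Example usage with random vectors standing in for real face embeddings.
rng = np.random.default_rng(seed=0)
enrolled = {"person_A": rng.normal(size=128), "person_B": rng.normal(size=128)}
probe = enrolled["person_A"] + rng.normal(scale=0.05, size=128)  # noisy re-capture
print(identify(probe, enrolled))  # prints "person_A"
```

Verification (one-to-one matching) is, in essence, the same comparison restricted to a single claimed identity rather than a whole database.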
In the landmark Bunnings decision, Commissioner Kind confirmed that “facial recognition technology, and the surveillance it enables, has emerged as one of the most ethically challenging new technologies” of our time. She went on to find, inter alia, that Bunnings had collected the sensitive personal information of customers entering its stores without adequately notifying them of that collection and, importantly, without obtaining consent to the collection or use of this sensitive information. The Bunnings determination follows previous high-profile investigations over the last 18 months by the Commissioner and her Office of the Australian Information Commissioner (OAIC) into the use of FRT by 7-Eleven, Kmart and Clearview AI.
The Commissioner’s decision raises critical questions for Australian businesses seeking to leverage new and emerging technologies to protect their operations and/or offer new products and services to their customers, as to how they can and must mitigate privacy risks arising under the Privacy Act 1988 (Cth) (Privacy Act) and the Australian Privacy Principles (APPs).
In light of the Bunnings decision and ongoing international regulatory restrictions on the use of FRT, the OAIC has since published detailed guidance on assessing the ethical and privacy risks of FRT usage, to help Australian businesses navigate these challenges.
Below, we work through that OAIC guidance, the practical challenges and issues it raises for business, and the privacy lessons that can be learned from the Bunnings decision.
The facial recognition system deployed by Bunnings at its stores functioned broadly in four key steps.[2]
Bunnings argued that its stores were experiencing an increase in violence and organised crime, particularly against staff, and explained that FRT was used as a safeguard to protect staff, customers and suppliers.[3]
Rather than obtaining customer consent to the collection of their sensitive personal information as required under APP 3, Bunnings sought to rely on the exemption under s 16A of the Privacy Act, which allows the collection and use of personal information without consent where a permitted general situation exists (e.g. a serious threat to health or safety, or suspected unlawful activity or serious misconduct).
Bunnings outlined to the OAIC that, before enrolling an individual into the FRT database, it categorised individuals by reference to their behaviour and the risk they posed to Bunnings’ operations.[4] These included categories of individuals who:
The OAIC’s investigation of Bunnings’ use of FRT began in 2022, when consumer advocacy group CHOICE revealed that Bunnings (along with other companies, including The Good Guys and 7-Eleven) was using the technology in its stores across Victoria and New South Wales. Bunnings stopped using FRT within its CCTV systems in 2021, although it was noted in the OAIC proceedings that in-store signage spotted in 2023 indicated that use may have resumed.
The Commissioner noted that, under the Privacy Act, sensitive information includes biometric information that is to be used for the purpose of automated biometric verification or biometric identification, and was satisfied that the facial images Bunnings uploaded to its database met this definition.
The Commissioner found that Bunnings had interfered with the privacy of hundreds of thousands of customers by collecting their personal and sensitive information through the use of FRT, including by:
When determining whether an exemption to the collection and consent requirements of APP 3 applied under s 16A of the Privacy Act, Commissioner Kind rejected Bunnings’ arguments and held that the key threshold element for the exemption did not arise here – i.e. that “the entity reasonably believes that the collection, use or disclosure is necessary to lessen or prevent a serious threat to the life, health or safety of any individual, or to public health or safety [emphasis added]”.
The Commissioner found that the collection was not necessary in this case and that “deploying facial recognition technology was the most intrusive option, disproportionately interfering with the privacy of everyone who entered its stores, not just high-risk individuals.”[6]
Further, the Commissioner stated that while the technology increased the convenience (and cost-effectiveness) for Bunnings of identifying repeat violent offenders, its use was not proportionate and its impact on the privacy of individuals outweighed the benefits – “just because a technology may be helpful or convenient, does not mean its use is justifiable.”[7] Bunnings’ use of FRT was an “indiscriminate collection of personal information” in order to “take appropriate action in respect of actual or suspected unlawful activity by a relatively small number of individuals and in a limited set of circumstances”.
The Commissioner accordingly made a declaration, under section 52(1A) of the Privacy Act, that Bunnings:
Bunnings has announced that it will seek a review of the OAIC’s determination before the Administrative Review Tribunal. It remains to be seen whether the OAIC’s decision will be set aside on review but, noting the changes afoot internationally and the broader ethical issues being raised in the market on the use of FRT, we consider that such an outcome may be unlikely.
The decision of the Commissioner sends a strong message to businesses seeking to utilise new technology involving biometric information that compliance with Australian privacy law needs to be actively considered at the design and implementation stage of any such technology.
This includes, at a minimum, the use of privacy-by-design and security-by-design methodologies, carrying out privacy impact assessment(s) on the proposed technology and its anticipated implications for the privacy of affected individuals, and implementing any recommended actions to mitigate the privacy and ethical risks arising. Consideration should also be given, at an early stage, to the necessity of the technology and to whether the business need can be met using other, less privacy-intrusive, technologies or processes.
Following its highly publicised decision, the OAIC has released a facial recognition technology guide (Guide) to assist Australian businesses seeking to assess and mitigate their privacy law risks prior to rolling out this technology. The Guide differentiates between the use of FRT for:
and broadly:
Importantly, the Guide acknowledges the practical challenges with obtaining meaningful consent to the use of FRT, noting that the nature of FRT means that it is not often practical to obtain true, express consent from individuals whose biometric information might be captured by FRT. It notes that ‘merely having signage or notice about the use of FRT in and of itself, will not generally be sufficient to show that an individual has consented to the use of this technology’.
Tips for organisations considering deploying FRT to carry out facial identification in a commercial or retail setting are included – although arguably the OAIC provides few practical examples of how the four elements of consent (informed, voluntary, current and specific, and given with capacity) can in practice be obtained so as to satisfy the detailed requirements set out in the Guide.
We recommend that organisations that currently deploy FRT, or are considering its use, consider the following specific questions (in addition to the usual reviews against the Privacy Act and APPs):
The Bunnings decision and the release of the OAIC Guide coincide with the passing of the Privacy and Other Legislation Amendment Act 2024 (POLA) last month, which uplifts the Privacy Act to include a host of new requirements, including (inter alia) transparency obligations around the use of automated decision-making, a new statutory tort for serious invasions of privacy and tiered civil penalties for less serious privacy breaches.
The OAIC has also been provided with enhanced monitoring and investigation powers under POLA, which will allow Commissioner Kind to obtain the level of transparency she has been seeking of late on Australian businesses’ use of emerging technologies, such as FRT, and on the processes, practices and methodologies they have in place to mitigate the privacy and ethical risks arising.
To that end, a program of active review and audit, the roll-out of privacy risk management, and the adoption of a ‘privacy by design’ and ‘security by design’ approach to projects and activities involving personal information is strongly recommended, both to weather the privacy regulatory storm currently circling and to provide your business with a competitive edge in this ever-changing technological environment.
Authored by:
Sinead Lynch, Partner
Eve Lillas, Senior Associate
Steven Schwartz, Paralegal
Isabel Taylor, Seasonal Clerk
[1] See Commissioner Initiated Investigation into Bunnings Group Ltd (Privacy) [2024] AICmr 230 (29 October 2024) (Determination)
[2] Determination para 25
[3] See ABC News article, Bunnings breached privacy laws by using facial recognition on customers, Commissioner finds
[4] Determination para 28
[5] Determination para 274
[6] OAIC Media Release, Bunnings breached Australians’ privacy with facial recognition tool
[7] OAIC Media Release, Bunnings breached Australians’ privacy with facial recognition tool