
Bunnings’ use of facial recognition technology found to breach the Privacy Act – What lessons can be learned?

12 December 2024
Dudley Kneller, Partner, Melbourne
Sinead Lynch, Partner, Sydney
Antoine Pace, Partner, Melbourne
Mitchell Wright, Partner, Canberra

On 29 October 2024, after an almost two-year investigation, the Australian Privacy Commissioner (Commissioner) determined that retail giant Bunnings had interfered with the privacy of hundreds of thousands of customers through its use of facial recognition technology at 62 of its retail stores around the country between 6 November 2018 and 30 November 2021.[1]

Facial recognition technology (FRT) is a computer vision technology that uses biometric information (e.g. facial features) to identify or verify a person in a digital image or video. A well-known example is where law enforcement uses FRT to determine whether a suspected criminal’s face matches any template held in their database. Private companies also increasingly use FRT for various purposes, which has raised concerns on the potential misuse of the technology and its ethical and privacy implications.

In the landmark Bunnings decision, Commissioner Kind confirmed that facial recognition technology, and the surveillance it enables, has emerged as one of the most ethically challenging new technologies of our time. She went on to find, inter alia, that Bunnings had collected the sensitive personal information of customers entering its stores without adequately notifying them of such collection and, importantly, without obtaining consent to the collection or use of this sensitive information. The Bunnings determination followed previous high-profile investigations over the last 18 months by the Commissioner and the Office of the Australian Information Commissioner (OAIC) into the use of FRT by 7-Eleven, Kmart and Clearview AI.

The Commissioner’s decision raises critical questions for Australian businesses seeking to leverage new and emerging technologies to protect their operations and/or offer new products and services to their customers, as to how they can and must mitigate privacy risks arising under the Privacy Act 1988 (Cth) (Privacy Act) and the Australian Privacy Principles (APPs).

In light of the Bunnings decision and ongoing international regulatory restrictions on the use of FRT, the OAIC has since published detailed guidance on assessing the ethical and privacy risks of FRT usage, to help Australian businesses navigate these challenges.

Here, we work through that OAIC guidance note, the practical challenges and issues it raises for business, and the privacy lessons that can be learned from the Bunnings decision.

How did Bunnings’ facial recognition system work?

The facial recognition system deployed by Bunnings at its stores functioned broadly in five key steps, illustrated in the code sketch that follows the list below.[2]

  1. CCTV footage of customers was collected as they entered and exited the store. This footage was then separated into individualised still images;
  2. the Bunnings server applied a type of technical filter known as a ‘Gabor filter’ to each of the still images to evaluate whether it contained images of human faces;
  3. where human faces were found, the technology extracted various vector points to analyse specific features of each identified face and form a vector set;
  4. these vector sets were then compared against vector points previously analysed by the server and enrolled in the Bunnings database (i.e. customers who may have previously entered the relevant store, or other Bunnings stores, and were similarly identified). Where the server found a match, an alert was generated and that information was then held in the server database; and
  5. in cases where no match was found, the data was automatically deleted – though this took up to 4.17 milliseconds (a point that became important in the later decision).
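
For readers who want a concrete picture of the comparison stage, below is a minimal Python sketch of a one-to-many matching loop of the general kind described above. It is illustrative only: the function names, the 0.9 threshold and the use of cosine similarity are our assumptions, not details drawn from the determination, and the actual (proprietary) Bunnings system is not reproduced here.

```python
# Minimal, illustrative sketch of a one-to-many face-matching loop.
# All names, the threshold and the similarity metric are assumptions
# for illustration only - none are taken from the determination.
import numpy as np

MATCH_THRESHOLD = 0.9  # hypothetical similarity cut-off


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face feature vectors (step 3 output)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_against_database(face_vector: np.ndarray,
                           enrolled_db: dict[str, np.ndarray]) -> str | None:
    """Step 4: compare an extracted vector against every enrolled
    template; return the matched ID, or None where nothing matches."""
    for person_id, template in enrolled_db.items():
        if cosine_similarity(face_vector, template) >= MATCH_THRESHOLD:
            return person_id  # match: alert generated, data retained
    return None  # step 5: no match, so the probe data is simply discarded


# Usage with synthetic vectors. A real system would first detect faces in
# CCTV stills (e.g. via a Gabor-style filter) and extract feature vectors
# (steps 1-3) before reaching this comparison stage.
enrolled = {"enrolled-person-001": np.array([0.1, 0.9, 0.3])}
probe = np.array([0.1, 0.9, 0.3])
print(match_against_database(probe, enrolled))  # -> enrolled-person-001
```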

Why was FRT used?

Bunnings argued that its stores were experiencing an increase in violence and organised crime, particularly against staff, and explained that FRT was used as a safeguard to protect staff, customers and suppliers.[3]

In seeking to collect this information without customer consent (which APP 3 would otherwise require for sensitive personal information), Bunnings sought to rely on the exemption under s 16A of the Privacy Act, which allows the collection and use of personal information without consent in a ‘permitted general situation’ (e.g. where a serious threat to health or safety exists, or where there is unlawful activity or serious misconduct).

It was outlined to the OAIC that, before enrolling an individual into the FRT database, Bunnings categorised individuals by reference to their behaviour and the risk they posed to Bunnings’ business operations.[4] These included categories of individuals who:

  • engaged in or were suspected of having engaged in ‘actual or threatening violence’ towards staff or other customers of Bunnings;
  • committed or were suspected of committing ‘Organised Retail Crime’ – such as theft or fraudulent activity on multiple occasions across multiple stores by two or more individuals;
  • demonstrated ‘violent, threatening or other inappropriate behaviour’ and, as a result, had been issued a probation notice by staff, effectively banning them from entering any Bunnings store; and/or
  • engaged in ‘serious cases of theft’.

The OAIC’s investigation into Bunnings’ use of FRT began in 2022, when consumer advocacy group CHOICE revealed that Bunnings (along with other companies, including The Good Guys and 7-Eleven) was using the technology in stores across Victoria and New South Wales. Bunnings had stopped using FRT in its CCTV systems in 2021, although it was noted in the OAIC proceedings that in-store signage spotted in 2023 suggested that use may have resumed.

OAIC decision

The Commissioner noted that, under the Privacy Act, sensitive information includes biometric information that is to be used for the purpose of automated biometric verification or biometric identification, and was satisfied that the facial images Bunnings uploaded to its database met this definition.

The Commissioner found that Bunnings had interfered with the privacy of hundreds of thousands of customers by collecting their personal and sensitive information through the use of FRT, including by:

  • collecting the data in circumstances where the individuals did not provide consent to the collection of their sensitive information, in violation of APP 3.3, where no exception under APP 3.4 applied;
  • failing to take reasonable steps in the circumstances to notify these individuals of the facts, circumstances and purposes of collection, as well as the consequences of not collecting the information, contrary to APP 5.1;
  • failing to take reasonable steps as required by APP 1.2(a) to implement practices, procedures and systems to comply with the APPs; and
  • breaching APP 1.3 (and APP 1.4) by failing to include in its privacy policy(ies) information regarding the kinds of personal and sensitive information being collected and held, and the process by which such collection and retention took place.[5]

When determining whether an exemption applied to the collection and consent requirements of APP 3 under s 16A of the Privacy Act, Commissioner Kind rejected Bunnings’ arguments and held that the key threshold element for the exemption to apply did not arise here – i.e. “the entity reasonably believes that the collection, use or disclosure is necessary to lessen or prevent a serious threat to the life, health or safety of any individual, or to public health or safety [emphasis added]”.

The Commissioner found that it was not necessary in this case and that “deploying facial recognition technology was the most intrusive option, disproportionately interfering with the privacy of everyone who entered its stores, not just high-risk individuals.”[6]

Further, the Commissioner stated that while the technology assisted Bunnings, and increased the convenience and cost-effectiveness of identifying repeat violent offenders, its use was not proportionate and its impact on the privacy of individuals outweighed the benefits – “just because a technology may be helpful or convenient, does not mean its use is justifiable.”[7] Bunnings’ use of FRT was an “indiscriminate collection of personal information” in order to “take appropriate action in respect of actual or suspected unlawful activity by a relatively small number of individuals and in a limited set of circumstances”.

The Commissioner accordingly made a declaration, under section 52(1A) of the Privacy Act, that Bunnings:

  • must not repeat or continue to use the FRT to interfere with the privacy of individuals; and
  • must, within 30 days of the determination, publish a statement on its website setting out:
    • the determination of the Commissioner;
    • details of the breach; and
    • advice for customers on how they can find more information regarding the breach and if necessary, make complaints.

Bunnings has announced that it will seek a review of the OAIC’s determination before the Administrative Review Tribunal. It remains to be seen whether the OAIC’s decision will be overturned on review, but given the changes afoot internationally and the broader ethical concerns being raised in the market about the use of FRT, we consider such an outcome unlikely.

Takeaways from the decision

The decision of the Commissioner sends a strong message to businesses seeking to utilise new technology involving biometric information that compliance with Australian privacy law needs to be actively considered at the design and implementation stage of any such technology.

This includes, at a minimum, the use of privacy-by-design and security-by-design methodologies, carrying out privacy impact assessment(s) on the proposed technology and its anticipated implications for the privacy of affected individuals, and implementing any recommended actions to mitigate the privacy and ethical risks arising. Consideration should also be given, at an early stage, to the necessity of the technology and whether the business need can instead be met by other, less privacy-intrusive, technologies or processes.

OAIC guidance

Following its highly publicised decision, the OAIC has released a facial recognition technology guide (Guide) to assist Australian businesses seeking to assess and mitigate their privacy law risks prior to rolling out this technology. The Guide differentiates between the use of FRT for:

  • facial verification – which refers to ‘one-to-one’ matching. It involves determining whether a face matches a single biometric template; and
  • facial identification – which refers to ‘one-to-many’ matching. It involves determining whether a face matches any biometric template in a database (the two modes are contrasted in the code sketch following this list).

and broadly:

  • confirms that biometric templates and biometric information (including when used for automated verification or identification purposes) are sensitive personal information under the Privacy Act, to which a higher degree of protection is afforded;
  • emphasises that sufficient notice and information must be proactively provided to ensure the ‘meaningful consent’ of the individual is obtained to collect such sensitive personal information (unless a valid exception applies);
  • recommends that, to ensure a privacy by design approach, businesses should undertake a privacy impact assessment prior to implementing or deploying any FRT;
  • emphasises the OAIC’s expectation that any business proposing to use FRT must explore, and take steps to address, the key issues of: (i) necessity and proportionality (APP 3) (is a less privacy-intrusive method available?); (ii) consent and transparency (APP 3 and 5); (iii) accuracy, bias and discrimination (APP 10); and (iv) governance and ongoing assurance (including privacy risk management practices and policies), from the start; and
  • recommends that staff are trained on the privacy risks of FRT, with periodic reviews and monitoring on usage of FRT (and its conformance with the Privacy Act) carried out.
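
To make the Guide’s one-to-one versus one-to-many distinction concrete, the short sketch below contrasts the two operations. It is illustrative only – the similarity function, threshold and identifiers are hypothetical assumptions rather than anything prescribed by the Guide – but it shows why identification is the more privacy-intrusive mode: every probe face is compared against an entire database rather than a single claimed identity.

```python
# Illustrative contrast between facial verification (one-to-one) and facial
# identification (one-to-many). Threshold, similarity metric and data are
# hypothetical assumptions, not details from the OAIC Guide.
import numpy as np

THRESHOLD = 0.9  # hypothetical match cut-off


def similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def verify(probe: np.ndarray, claimed_template: np.ndarray) -> bool:
    """Verification: does this face match one claimed identity's template?"""
    return similarity(probe, claimed_template) >= THRESHOLD


def identify(probe: np.ndarray, gallery: dict[str, np.ndarray]) -> str | None:
    """Identification: does this face match anyone in the whole database?"""
    scores = {pid: similarity(probe, t) for pid, t in gallery.items()}
    best = max(scores, key=scores.get, default=None)
    return best if best is not None and scores[best] >= THRESHOLD else None
```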

Importantly, the Guide acknowledges the practical challenges with obtaining meaningful consent to the use of FRT, noting that the nature of FRT means that it is not often practical to obtain true, express consent from individuals whose biometric information might be captured by FRT. It notes that ‘merely having signage or notice about the use of FRT in and of itself, will not generally be sufficient to show that an individual has consented to the use of this technology’.

The Guide includes tips for organisations considering deploying FRT to carry out facial identification in a commercial or retail setting – although, arguably, the OAIC provides few practical examples of how the four elements of consent (informed; voluntary; current and specific; and given with capacity) can in practice be obtained so as to satisfy the detailed requirements set out in the Guide.

Recommendations and final note

We recommend that organisations that currently deploy FRT, or that are considering its use, consider the following specific questions (in addition to the usual reviews against the Privacy Act and APPs):

  1. Why is FRT required or proposed to be used?
  2. Is its use necessary, or is there another less privacy-intrusive way to achieve the same outcome?
  3. Is the use proportionate to the privacy and ethical risks identified from its usage? Has a Privacy Impact Assessment (PIA) been undertaken to identify and assess these risks?
  4. Does the business understand how the technology operates to: (i) avoid ethical issues regarding accuracy, bias, discrimination; and (ii) mitigate privacy risks such as transparency, security and accountability?
  5. How have individuals been notified and have they provided ‘meaningful’ consent (which is fully informed and voluntary)? The OAIC Guide confirms that (at a minimum) organisations need to have a clearly expressed and up to date Privacy Policy about how they manage personal information, including biometric information collected using FRT.
  6. Has the business documented its policies, practices and procedures adequately and taken steps to minimise privacy risks?
  7. What level of privacy governance and ongoing monitoring / review is in place (or proposed)?
  8. If relying on a third party hosted FRT system, what arrangements are in place?

The Bunnings decision and the release of the OAIC Guide coincide with the passing of the Privacy and Other Legislation Amendment Act 2024 (Cth) (POLA) last month, which has uplifted the Privacy Act to include a host of new requirements, including (inter alia) transparency on the use of automated decision-making, a new statutory tort for serious interference with privacy, and tiered civil penalties for less serious privacy breaches.

The OAIC has also been provided with enhanced monitoring and investigation powers under POLA which will allow Commissioner Kind to obtain the level of transparency she has been seeking of late on the use of emerging technologies, such as FRT, by Australian businesses, and the processes, practices and methodologies they have in place to mitigate privacy and ethical risks arising.

To that end, we strongly recommend a program of active review and audit, the roll-out of privacy risk management, and the adoption of a ‘privacy by design’ and ‘security by design’ approach to projects and activities involving the use of personal information – both to weather the privacy regulatory storm currently circling and to provide your business with a competitive edge in this ever-changing technological environment.


Authored by:
Sinead Lynch, Partner
Eve Lillas, Senior Associate
Steven Schwartz, Paralegal
Isabel Taylor, Seasonal Clerk


 

[1] See the Determination: Commissioner Initiated Investigation into Bunnings Group Ltd (Privacy) [2024] AICmr 230 (29 October 2024)

[2] Determination para 25

[3] See ABC News article, ‘Bunnings breached privacy laws by using facial recognition on customers, Commissioner finds’

[4] Determination para 28

[5] Determination para 274

[6] OAIC media release, ‘Bunnings breached Australians’ privacy with facial recognition tool’

[7] OAIC media release, ‘Bunnings breached Australians’ privacy with facial recognition tool’

This update does not constitute legal advice and should not be relied upon as such. It is intended only to provide a summary and general overview on matters of interest and it is not intended to be comprehensive. You should seek legal or other professional advice before acting or relying on any of the content.
