A few reports we were reading this week about facial recognition that we found of note.
First, Odia Kagan of Fox Rothschild writes:
Following the Federal Trade Commission’s decision in December 2023 to ban Rite Aid from using AI facial recognition, it has become crystal clear that U.S. regulators expect a risk assessment when a retailer uses facial recognition technology.
A new, and detailed, report from the New Zealand privacy commission provides helpful considerations for such Data Protection Impact Assessments (DPIAs). They include:
- Was the system trained on data that included minorities?
- How long will the retailer retain data that wasn’t matched?
- What data minimization techniques are in place (including when to share among stores and when to add to a watchlist)?
- How accurate should a match be to trigger consideration (92.5%)?
Second, over in Ireland, the DPC announced the conclusion of its investigation into the use of facial matching technology in connection with the Public Services Card by the Department of Social Protection (DSP). The four-year investigation followed an earlier investigation. The findings of the current investigation were that DSP:
- Infringed Articles 5(1)(a), 6(1), and 9(1) GDPR by failing to identify a valid lawful basis for the collection of biometric data in connection with SAFE 2 registration at the time of the inquiry;
- Having regard to the preceding finding, infringed Article 5(1)(e) GDPR by retaining biometric data collected as part of SAFE 2 registration;
- Infringed Articles 13(1)(c) and 13(2)(a) GDPR by failing to put in place suitably transparent information to data subjects as regards SAFE 2 registration; and
- Infringed Articles 35(7)(b) and (c) GDPR by failing to include certain details in the Data Protection Impact Assessment that it carried out in relation to SAFE 2 registration.
In light of the infringements identified above, the DPC has (1) reprimanded the DSP, (2) issued administrative fines totalling €550,000, and (3) issued an order to the DSP requiring it to cease processing of biometric data in connection with SAFE 2 registration within 9 months of this decision if the DSP cannot identify a valid lawful basis.
Read more about the investigation and findings on the DPC’s site.
And third, let us also take this opportunity to remind entities of the need to consider at what point the use of facial recognition is even warranted. Joe Cadillic sent along a recent item from The Guardian in the UK about a retail store customer who was put on a facial ID watchlist at Home Bargains after a dispute over 39 pence worth of paracetamol that they accused her of stealing. She firmly denies stealing it, and her complaint notes:
“To be clear: [she] did not steal the paracetamol during the first visit. The allegations by Home Bargains are false. However, even taking Home Bargains’ allegations at face value, their – and Facewatch’s – biometric processing was clearly not in the substantial public interest.
“The watchlist entry was created and acted upon in order to apprehend someone supposedly guilty of (on one occasion) stealing goods valued at less than £1. It is scarcely possible to imagine a less serious ‘offender’.”