Recent technological advances in artificial intelligence and machine learning have dramatically improved the utility of Automated Facial Recognition (AFR) technology, which is increasingly being deployed in both commercial and state sponsored contexts.
Facial data is a form of biometric data and, as such, is typically subject to additional layers of regulation. AFR has clearly drawn the attention of a number of commentators and has the potential to cause serious reputational damage to a business that fails to take sensitive account of the wider societal implications of its use – let alone the narrower regulatory implications.
Recent studies have also indicated that the technology may not be as accurate as initially thought, an issue considered in R (Bridges) v. Chief Constable of South Wales Police [2019] EWHC 2341 (Admin), discussed further below.
Use cases
This makes any use case evaluation of AFR all the more important. Use of AFR technologies is perhaps more widespread than you might think – especially in areas that aren’t state or government sponsored (it is the state uses that tend to garner the newspaper headlines).
The following list is a non-exclusive sample of some commercial use cases:
- AFR can be used to target particular audience demographics – recognising whether the person staring at the advert is male, female, old, young, or from a particular ethnic background. More advanced systems are even able to detect different emotional states expressed on an individual’s face, helping companies to work out the effectiveness of their advertising (although as you can imagine this raises a variety of equality and discrimination related issues).
- Social media platforms already deploy AFR as part of a network classification exercise, ‘tagging’ individuals who are identified as existing members of that network and making it easier for other people to search for contacts.
- AFR is a valuable tool in the diagnosis of diseases and physical conditions which cause detectable changes in appearance, such as some rare genetic disorders (but also more recently in relation to airport scanning for Covid-19 infections). It is also being used in other situations where insight into health status is useful – for example, by insurance underwriters to determine whether a person is a heavy smoker or drinker (surprisingly, the technology company using this approach claims it to be more accurate than traditional underwriting means, such as interviews).
- AFR is being used by major online retailers to disrupt traditional “bricks and mortar” cash and card payment models – Amazon Go stores, in which customers walk in, take an item and leave, are becoming a fixture of US high streets. On a similar level, AFR can enhance payment security by logging an individual’s unique facial features and using them for additional payment authentication with traditional tokenised payments, such as credit and debit cards.
- Many AFR applications are now being used to control access to secure sites and environments, typically replacing token-based systems, which rely on physical photo identity cards and RFIDs to open entry ways. As systems become affordable, use is spreading to general entry systems, where AFR is increasingly used for its convenience.
- On a similar but device-based level, AFR systems are used to unlock computers and phones (the most commonly deployed being Face ID on Apple’s iPhone X), replacing the biometric fingerprint scanning method.
Whilst it is true to say that the legal position in relation to AFR is evolving, it is not correct to assume that there is no regulation.
By its nature, much of the applicable existing law is “use case” dependent and will apply differently, for example, if you are using the technology as part of an active video surveillance application, querying a pre-existing facial database, or a combination of both.
Different laws and codes of conduct will apply depending upon the extent to which your facial recognition application is used on public or private property and, of course, state actors such as the police, counterterrorism and intelligence services will be bound by a separate regime applicable to the covert gathering and use of such data that is outside the scope of this short article.
Each use case will also likely fall under the oversight of a different regulator, or multiple regulators. Be warned – the regulatory environment is complex.
Compliance
Any analysis of the compliance of your AFR system will inevitably need to start with an assessment of its adherence to the data protection regime under the General Data Protection Regulation (GDPR) and Data Protection Act 2018 (DPA 2018), and relevant guidance issued by the Information Commissioner’s Office (ICO).
Facial images are themselves classed as personal data (i.e. data that can readily identify individuals), but processing them through an AFR system, which converts them into biometric data, brings with it additional, more stringent regulation.
Issues you will need to consider include ensuring that your application is designed with a view to maintaining privacy (referred to as “privacy by design”) and determining the lawful basis upon which you are seeking to use AFR.
Understanding how AFR data will be held and processed is vital to determining the legality of your use case. From an AFR perspective, you’ll also need to be aware of the ICO’s guidance on CCTV (the Code of Practice for Surveillance Cameras and Personal Information), if your application involves video surveillance.
Turning to human rights, Article 8 of the European Convention on Human Rights provides for a right to respect for one’s private and family life, home and correspondence, and is directly actionable under English law through the mechanism of the Human Rights Act 1998.
The right is clearly designed to circumscribe and limit intrusive activity which could damage privacy. Although it most obviously applies to state-sponsored uses, you’ll need to think carefully about how your AFR application might interfere with that right.
Current police trials of AFR are subject to challenge by human rights organisations on this basis (as we discuss shortly in the context of the South Wales Police case).
Chapter one of part one of the Protection of Freedoms Act 2012 (PFA 2012) provides for oversight by the Biometrics Commissioner of certain biometric data gathered under the Police and Criminal Evidence Act 1984 – principally fingerprints, DNA samples and custody pictures, but increasingly also facial databases (such as the Police National Database) maintained by law enforcement authorities.
Chapter one of part two (s33) of the PFA 2012 governs the use and regulation of CCTV and other surveillance camera technology by “relevant authorities” (principally state actors) and provides for the Surveillance Camera Commissioner, of whom more below.
The Surveillance Camera Code of Practice 2013 (SCC Code) applies to the use of “overt” video surveillance systems in public places by relevant public authorities, for whom compliance is mandatory. Chapter four, and Principle 12 specifically, addresses the use and/or processing of images obtained by CCTV.
The code is administered by the Surveillance Camera Commissioner (SCC). Where AFR is used as part of a public authority-administered public scheme, the SCC will have jurisdiction and can enforce compliance with the SCC Code. The SCC also seeks to encourage voluntary compliance with the code by unregulated (private) entities, so you need to be mindful of this in any commercial use case.
If your use of AFR could produce biased outcomes (in particular through facial training data that does not adequately represent the ethnic diversity of modern society), then you will also need to be mindful of the impact of the Equality Act 2010 (EA 2010) and in particular the need to avoid both direct and indirect discrimination.
Finally, if your AFR system forms part of a security application that extends to public spaces and includes CCTV monitoring and/or door supervision, then this may constitute a licensable activity under the Private Security Industry Act 2001 (PSIA 2001).
This legislation was originally enacted to provide a licensing regime for the private security industry, which had previously suffered from a poor reputation.
Licences are administered by the Security Industry Authority. Failure to obtain a licence, when one is needed, is a criminal offence punishable by fines and/or imprisonment.
Case law
So that was a very high-level view of the relevant legislative framework applying to commercial use cases. What, then, of the applicable case law?
As you would imagine, this is currently sparse. At the time of writing there is only one UK Court of Appeal case worth mentioning, R (Bridges) v. Chief Constable of South Wales Police [2020] EWCA Civ 1058, although the principles arising from it in relation to AFR are necessarily limited to use by public sector organisations.
In this case, South Wales Police (SWP) had been piloting AFR in Cardiff city centre, matching faces captured on live camera feeds against a “watch list” of wanted individuals, suspects and other persons of interest drawn from their facial database.
Crucially for the purposes of the courts’ analysis, the police immediately deleted facial data in respect of unknown people and retained only data relating to facial matches.
The appellant, Ed Bridges, who had been in the vicinity of SWP’s AFR vans while they were in use, brought an appeal against the earlier Divisional Court judgment on a number of grounds challenging the lawfulness of SWP’s use of the technology.
The Court of Appeal agreed with the Divisional Court that there had been an interference with the appellant’s human rights – specifically Article 8 of the ECHR – but, crucially, it disagreed with the lower court’s assessment that the underlying law was sufficient to justify that interference under Article 8(2).
As the Data Protection Impact Assessment prepared by SWP under s64 of the DPA 2018 had incorrectly concluded that there was no infringement of Article 8, the court found that, in consequence, it had failed properly to address the risks arising from the use of the AFR system.
At the core of the judgment were concerns that this insufficiency in the law meant it did not adequately deal with what the Court termed the “who” question (i.e. who should be identified as a “person of interest” on SWP’s watch list) and the “where” question (i.e. where deployment of AFR would be justified).
Currently, too much discretion is accorded to individual police officers in determining these issues (indeed, the Court described the discretion here as “impermissibly wide”).
In terms of the existing legal framework, the Court’s view was that whilst the DPA 2018 formed an important part of the framework, it was not by itself sufficient and whilst the SCC Code briefly mentioned AFR, it did not deal specifically with these fundamental “who” and “where” issues.
Finally, and perhaps most controversially for the use of AFR in public sector contexts, the Court held that in its use of the technology SWP had not met its Public Sector Equality Duty (PSED) under s149(1) of the EA 2010.
The PSED is designed amongst other things to ensure that public sector authorities eliminate discrimination, harassment, victimisation and any other conduct prohibited by the EA 2010.
In sharp contrast to the earlier Divisional Court judgment, the Court of Appeal held that SWP had failed to meet the standard required by the PSED by not investigating (or even considering) the potential for the AFR system to produce biased outcomes in relation to BAME community members.
There were many issues which, frustratingly, were not dealt with conclusively in the Bridges judgment, and given the widespread public interest in this topic it seems likely that the case will be appealed further to the Supreme Court.
It’s also apparent that the Court’s assessment of the insufficiency of the legal framework supporting public sector use of AFR will very likely lead to further legislative and policy interventions.
All of this does mean that applying AFR in both the public and private sectors remains a complex, evolving and delicate exercise within the UK and Europe.
The utility of the technology needs to be balanced with thorny and substantial issues involving individual liberty and personal privacy. Whether the new EU-proposed framework on AI addresses these issues substantively remains to be seen.
John Buyers is a partner and head of AI at Osborne Clarke