Facial Recognition Technology: The High Court’s flawed approach to information privacy

by Sapan Maini-Thompson | Feb 24, 2020


About Sapan Maini-Thompson

Sapan Maini-Thompson recently completed an LLM at University College London. Previously, he graduated with a BA from Oxford University and an MSc from the London School of Economics. He tweets @SapanMaini.

Citations


Sapan Maini-Thompson, “Facial Recognition Technology: The High Court’s flawed approach to information privacy” (OxHRH Blog, February 2020) <https://ohrh.law.ox.ac.uk/facial-recognition-technology-the-high-courts-flawed-approach-to-information-privacy> [Date of Access].

In September 2019, the English High Court ruled in R (Bridges) v Chief Constable of South Wales Police that the use of automated facial recognition technology (AFR) by South Wales Police was consistent with Article 8 ECHR and data protection legislation. Facial recognition technology harnesses a new form of biometric data and thereby engages a novel aspect of information privacy. Because the Court failed to interrogate how and why AFR challenges privacy rights more intrusively than existing surveillance methods, its proportionality analysis was flawed.

Interference with Article 8

The Court accepted that AFR engages the Article 8 rights of anyone whose face is scanned (or is at risk of being scanned). Following S. and Marper v. United Kingdom (2008), the Court held that AFR enables the extraction of “intrinsically private” information, similar to the retention of fingerprint records and DNA samples.

The instantaneous capture of an individual’s facial geometry is distinguishable from those precedents, however, on two grounds.

Firstly, the indiscriminate and non-consensual application of AFR is contrary to personhood because an individual’s face reflects their interpersonal identity. As Philip Brey has argued, facial recognition constitutes a form of “functional reductionism”, which “involves the creation of informational equivalents of body parts that exist outside their owner and are used and controlled by others”. In this way, AFR departs from a conventional understanding of the relationship between bodily integrity and personal autonomy. It demeans personal identity by transforming personhood from an intrinsic quality inhering in individuals into a quantity mapped by biological diagrams.
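
To see concretely what an “informational equivalent” of a face looks like, consider a minimal sketch using the open-source Python face_recognition library. This is an illustration only, not a description of South Wales Police’s system; the filenames and watchlist are hypothetical.

```python
# Illustrative sketch: how a photograph is reduced to a portable numeric
# template that can be stored, copied and matched without its owner's
# involvement. Requires the open-source face_recognition library
# (pip install face_recognition); filenames are hypothetical placeholders.
import face_recognition

# Reduce the face in a photograph to a 128-dimensional vector -- an
# "informational equivalent" of a body part that exists outside its owner.
image = face_recognition.load_image_file("crowd_frame.jpg")
encodings = face_recognition.face_encodings(image)

if encodings:
    template = encodings[0]

    # It is this template, not the image, that a deployment searches
    # against a watchlist of pre-computed encodings.
    watchlist_image = face_recognition.load_image_file("watchlist_photo.jpg")
    watchlist = face_recognition.face_encodings(watchlist_image)

    if watchlist:
        matches = face_recognition.compare_faces(watchlist, template)
        print("Watchlist hit:", any(matches))
```

The asymmetry described above is visible even in this sketch: the person scanned contributes nothing but their presence in frame, while the operator controls capture, retention and matching.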

Secondly, the automatic extraction of unique personal information further shrinks the sphere in which an individual has the “right to be let alone”. This shifts the balance of power between the state and the individual in public space, with potentially far-reaching psychological ramifications. According to Julie Cohen, changes of this nature threaten privacy because they may induce individuals to adapt their behaviour to a conformist norm.

The diminution of privacy in the public sphere therefore risks eroding participation rights such as freedom of assembly. As Collins LJ observed in R (Wood) v Commissioner of Police of the Metropolis (2009), the knowledge that one is being monitored can have a “chilling effect on the exercise of lawful rights” [92]. The risk of being wrongly identified amplifies this concern. While that interference is mitigated by the deletion of facial biometrics from police databases, it cannot be eliminated altogether, because the intrusion into one’s psychological integrity has already taken place.

Article 8(2) and the Rule of Law

The Court’s failure to acknowledge these conceptual distinctions resulted in a series of false equivalences with existing surveillance regimes and regulatory instruments. This matters for whether the lack of a specific statutory basis for the use of AFR renders the technology ultra vires. The Court held that it does not, relying on Lord Sumption JSC’s statement in R (Catt) v Association of Chief Police Officers [2015]: “At common law the police have the power to obtain and store information for policing purposes.” [7]

On this broad construal of the police’s common law powers, the Court reasoned, the only issue was whether the use of AFR constitutes an “intrusive method” and therefore falls outwith those powers. It read “intrusive method” as a clear reference to physical intrusion, and concluded that “the [use of AFR] is no more intrusive than the use of CCTV in the streets” [75].

This reasoning is flawed. First, it fails to consider that AFR extracts more private information from an individual than a video image does, and therefore intrudes more intensely upon privacy than CCTV. Second, and by extension, it exposes the inadequacy of interpreting intrusion in purely physical terms.

(Dis)proportionality

The Court’s reductionist approach to data extraction was compounded by a mistaken analysis of proportionality. Identifying criminal suspects is a legitimate security aim to which AFR is rationally connected. It is not clear, however, that a regime which enables the state to obtain the biometric information of its citizens indiscriminately meets the threshold of strict necessity.

Further, given this unique interference with information privacy, it is doubtful that the pervasive application of AFR strikes the right balance between Article 8 rights and national security objectives. It is perhaps in anticipation of such generalised usage that the Court recommended periodic review. Indeed, the Information Commissioner has noted that it is likely to be less challenging to justify sensitive processing where an AFR deployment is targeted, intelligence-led and time-limited.

A moratorium is in order.


1 Comment

  1. Kishor Dere

    The growing use of facial recognition technology poses new challenges to the exercise of human rights and fundamental freedoms. The right to privacy is certainly the most vulnerable. The state as well as citizens have valid and genuine concerns. There is an urgent need to strike a balance between the two.
