Police Use of Automated Facial Recognition Technology: an Emerging Challenge for Human Rights

by Jack Maxwell | Sep 9, 2019


About Jack Maxwell

Jack Maxwell is an Australian lawyer currently based in London, with interests in administrative law, constitutional law and human rights law. He completed the BCL at the University of Oxford in 2019, and has degrees in law and philosophy from the University of Melbourne. He blogs occasionally at foursciences.wordpress.com.

Citations


Jack Maxwell, “Police Use of Automated Facial Recognition Technology: an Emerging Challenge for Human Rights”, (OxHRH Blog, September 2019), <https://ohrh.law.ox.ac.uk/police-use-of-automated-facial-recognition-technology-an-emerging-challenge-for-human-rights/>, [Date of access].

Earlier this week, the High Court at Cardiff held that police use of automated facial recognition technology (AFR) is lawful. This is the first time that any court in the world has considered the issue. And it won’t be the last, as Edward Bridges, the claimant, has already announced that he will appeal the decision.

The South Wales Police have trialled AFR at public events since mid-2017. They deploy cameras which capture images of people in the vicinity, and the AFR system then processes these images and compares them to those of people on police watchlists. If the system identifies a match, it alerts an officer, who reviews the images to decide whether a match has in fact been made. If no match is identified, the system deletes the person’s data.
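The judgment turns on how this pipeline handles biometric data, so it may help to see the sequence described above (capture, comparison against a watchlist, officer review, deletion on no match) laid out step by step. The following is a minimal, illustrative Python sketch; the function names, data structures and similarity threshold are my own assumptions, not details drawn from the judgment or from South Wales Police’s actual system.

```python
# Illustrative sketch only: hypothetical names, threshold and data structures,
# not the system considered in R (Bridges) v Chief Constable of South Wales Police.
from dataclasses import dataclass
from typing import List, Optional
import math


@dataclass
class WatchlistEntry:
    person_id: str
    embedding: List[float]  # biometric template derived from a custody image


def cosine_similarity(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0


def process_capture(captured: List[float],
                    watchlist: List[WatchlistEntry],
                    threshold: float = 0.8) -> Optional[str]:
    """Compare a captured face embedding against the watchlist.

    Returns a person_id for an officer to review if the similarity exceeds
    the threshold; otherwise returns None, and the captured data is not
    retained.
    """
    best_id, best_score = None, 0.0
    for entry in watchlist:
        score = cosine_similarity(captured, entry.embedding)
        if score > best_score:
            best_id, best_score = entry.person_id, score
    if best_score >= threshold:
        return best_id  # alert: a human officer decides whether it is a true match
    return None  # no match: the system deletes the person's data
```

The point to note for the art 8 analysis that follows is that, even on the no-match path, a biometric template has already been extracted and compared before the data is deleted.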

Mr Bridges argued that the police’s use of AFR violated art 8 of the European Convention on Human Rights (ECHR), which enshrines the right to respect for private life. He also argued that the police had violated the Equality Act 2010 by failing to consider the risk that AFR would disproportionately affect women or minority ethnic groups. (I leave aside here Mr Bridges’ unsuccessful data protection claims.)

Several important points emerge from the judgment.

First, the court rightly rejected the police’s argument that the AFR system did not engage art 8 at all. The police argued that AFR only captured a person’s image while in public, and that it was an almost instantaneous process. But AFR processes a person’s image to extract biometric data, which enables their identification in a wide range of circumstances. This information has an ‘intrinsically private’ character, similar to DNA or fingerprints. The jurisprudence of the European Court of Human Rights (ECtHR) makes clear that the capture, storage and processing of such information, even if only briefly, engages art 8.

Second, the court held that the police’s use of AFR satisfied the legality requirement under art 8(2), even though no specific legislation currently regulates this technology. The police’s broad common law powers authorised them to use AFR, and data protection laws, the Surveillance Camera Code, and police policy documents provided a sufficiently clear legal framework for their doing so. The court ignored the ECtHR’s detailed jurisprudence on the legality requirements for covert surveillance, because AFR was said to be overt surveillance. This seems to underplay the intrusiveness of AFR, and to overlook the ECtHR’s indication that scrutiny of overt surveillance powers ‘should be guided’ by its other jurisprudence. But in any case, the legality requirement is a frail reed in the struggle to protect privacy against state interference. It has underpinned several successful challenges to covert surveillance regimes in the past, only for the state to enact detailed and sweeping laws which, on one view, serve principally to expand and legitimise that surveillance.

Third, the court’s analysis of the proportionality of AFR was unconvincing. As is common in this kind of case, the court spent much less time on the proportionality of AFR than on its legality. And several aspects of the analysis are questionable. The court stressed that the watchlists were targeted, but they in fact covered a broad range of people, including the rather ominous categories of ‘persons whose presence at a particular event causes particular concern’ and ‘persons simply of possible interest to [police] for intelligence purposes’. At times, the court struggled to make sense of the interests protected by privacy, and how they might be threatened by AFR. It noted that the breadth of the watchlists ‘did not have any impact’ on Mr Bridges, because he was not in fact included on them. But this overlooks the deleterious effect of a person’s very uncertainty over whether they might be on a watchlist, and thus liable to be stopped by police.

Finally, the court rejected Mr Bridges’ claim about the potentially discriminatory impact of AFR, because, although other facial recognition systems have been shown to perform less accurately for women and minority ethnic people, the police had no reason to believe that their particular system was similarly affected. This sets a low bar for the state in its use of algorithmic tools, and its management of the concomitant risks of discrimination – an area of vital importance for public law in the years to come, even beyond the immediate context of AFR.

 
