
Government covertly employs facial recognition technology for passport and immigration records in the UK

Home Office's refusal to disclose information criticized as "astonishing" and "dangerous" by advocacy groups

British authorities covertly deploy facial recognition technology on passport and immigration databases.


The use of facial recognition technology (FRT) by UK police is expanding, with authorities claiming that accuracy and transparency measures are in place. Significant concerns remain, however, particularly over police use of passport and immigration databases and over issues raised by privacy groups.

Accuracy and Transparency

The Home Office and police assert that the facial recognition technology deployed in live facial recognition (LFR) vans is subject to strict rules, including targeted deployments against bespoke watchlists of serious offenders [1][2][3]. The technology has been independently tested by the National Physical Laboratory (NPL), which found it to be accurate at operational settings, without bias regarding ethnicity, age, or gender [3][4]. Deployments comply with guidelines from the College of Policing and the Surveillance Camera Code of Practice, and trained officers review every match to avoid wrongful arrests [1][3]. Signs must notify the public in areas where LFR is in use, and non-matches are deleted automatically to protect privacy [2]. The government is planning public consultations and a review of the legal framework to ensure transparency and safeguards [3][4].

Use of Passport and Immigration Databases

Despite assurances that passport and immigration databases are used only for retrospective and operator-initiated FRT, methods considered less intrusive than live scanning, reports reveal an increase in police searches against these government-held databases with limited parliamentary or public oversight [5]. Privacy campaigners have criticized this lack of transparency, labelling the covert use of passport and immigration photo databases for FRT "astonishing" and "dangerous" [5]. The Home Office states that police must request access to the passport database and that its use is controlled, but critics note that such access has escalated sharply, from just 2 searches in 2020 to 417 in 2023, conducted by 31 police forces [5].

Concerns Raised by Privacy Groups

Human rights and privacy organizations like Amnesty International UK and Big Brother Watch criticize the technology for being discriminatory, especially against people of colour, and warn about risks of misidentification and wrongful arrest [1][5]. Amnesty International argues that FRT disproportionately misidentifies people of colour, raising serious ethical and human rights questions [1]. Privacy groups also condemn the lack of transparency and parliamentary oversight over the use of FRT linked to passport and immigration databases, viewing it as a covert and potentially intrusive state surveillance practice [5].

In summary, while UK authorities assert that police FRT is accurate, overseen, and used with safeguards, privacy advocacy groups strongly dispute these claims, highlighting technical limitations, potential racial bias, and significant transparency and accountability concerns, particularly around the growing but secretive use of passport and immigration databases for FRT purposes. The Home Office has updated its statement to clarify that it uses passport and immigration databases only for retrospective facial recognition (RFR), not LFR, and that police must request access from the Home Office before searching the passport database. The controversy surrounding FRT use in the UK continues, with calls for transparency, accountability, and a reconsideration of the technology's role in law enforcement.

  1. The escalating use of passport and immigration databases by UK police for facial recognition technology (FRT) raises concerns about the lack of transparency and parliamentary oversight, as highlighted by privacy groups such as Amnesty International UK.
  2. Despite the Home Office's claims that the facial recognition technology in live facial recognition (LFR) vans avoids ethnic, age, or gender bias, the increasing use of software linked to government databases has sparked concerns about potential privacy infringements in public places.
