
U.K. quietly implements biometric facial identification in passport and immigration databases.

Home Office's alleged lack of transparency denounced as 'astonishing' and 'perilous' by activists


The UK government's use of facial recognition technology has come under scrutiny, with concerns raised about transparency, accuracy, and privacy rights.

Recent data reveals that UK police forces have been secretly running facial recognition searches against more than 150 million photos held in passport and immigration databases, without notifying Parliament or the public [1][3][4][5]. The practice has expanded rapidly: searches of the passport database rose from 2 in 2020 to 417 in 2023, and searches of the immigration database rose more than sixfold, from 16 in 2023 to 102 in 2024.

Transparency Concerns

The government has not publicly disclosed this extensive use, nor provided a clear legal basis or policy framework governing these searches. Campaigners and former officials criticize the secrecy as “Orwellian” and a historic breach of privacy that undermines democratic accountability [1][2][3][5].

Accuracy and Impact on Privacy

Facial recognition algorithms are not infallible, so misidentification is a real risk. Police can search the photos of people who are not suspects without their knowledge, including images taken at protests or drawn from social media. This expansive surveillance affects tens of millions of innocent people without their consent, constituting a serious intrusion into privacy rights [3][4].
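
To put the misidentification risk in perspective, here is a minimal back-of-the-envelope sketch in Python. The database size and 2023 search count are the figures reported in this article; the one-in-a-million false-match rate is purely a hypothetical assumption for illustration, not a figure from the article or from any vendor.

    # Back-of-the-envelope sketch: even a highly accurate matcher is expected to
    # produce many false matches when one probe image is compared against a
    # database of roughly 150 million photos. The false-match rate is assumed.

    DATABASE_SIZE = 150_000_000     # combined passport + immigration photos cited above
    FALSE_MATCH_RATE = 1e-6         # assumed one-in-a-million false match per comparison
    SEARCHES_IN_2023 = 417          # passport-database searches reported for 2023

    false_matches_per_search = DATABASE_SIZE * FALSE_MATCH_RATE
    false_matches_per_year = false_matches_per_search * SEARCHES_IN_2023

    print(f"Expected false matches per search: {false_matches_per_search:.0f}")          # 150
    print(f"Expected false matches across 2023 searches: {false_matches_per_year:.0f}")  # 62550

Even under this optimistic assumed error rate, each retrospective search of a database this size would be expected to surface roughly 150 innocent people as candidate matches.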

Privacy watchdogs highlight that retaining biometric data unlawfully (such as from people acquitted of crimes) damages public trust. The lack of public or parliamentary scrutiny bypasses “policing by consent,” a core principle of UK policing [2]. Legal campaigns have been launched to challenge what is described as a historic breach of the right to privacy [1][3][4].

Looking Ahead

The UK government's secret use of facial recognition on passport and immigration databases has expanded dramatically without transparency or clear regulation, raising profound concerns about privacy rights, potential inaccuracies, and democratic oversight. Authorities have begun formulating policies in response to public outcry and legal pressure, but significant issues remain unresolved [2].

It is essential for the government to address these concerns and establish clear guidelines for the use of facial recognition technology to ensure transparency, maintain public trust, and uphold privacy rights.

References

[1] Big Brother Watch, Privacy International, and their respective directors have described the databases and lack of transparency as "Orwellian" and have called for a ban on the practice.

[2] The Home Office uses the passport and immigration databases only for retrospective facial recognition (RFR), not live facial recognition (LFR), and police must request access from the Home Office before being allowed into the passport database.

[3] The number of searches by 31 police forces against the passport database rose from two in 2020 to 417 in 2023, and scans using immigration database photos rose from 16 in 2023 to 102 the following year.

[4] Big Brother Watch claims the passport database contains around 58 million headshots of Brits, and the immigration database has around 92 million images.

[5] The UK will install its first permanent LFR camera in Croydon, South London. The Home Office insists that efforts are made to inform the public when a camera is due to be set up in any given location.

  1. The rapid, largely untransparent expansion of facial recognition searches against the UK's passport and immigration databases has sparked concerns about privacy rights and democratic accountability, drawing comparisons to Orwellian practices from campaigners and former officials [1][2][3][5].
  2. Facial recognition algorithms are fallible, and police can secretly search non-suspects’ photos without their knowledge, intruding on the privacy of tens of millions of innocent people without their consent [3][4].
  3. Legal and ethical concerns about the government's extensive, secretive use of facial recognition technology have led to calls for clearer guidelines to ensure transparency, maintain public trust, protect privacy rights, and uphold the principle of policing by consent [1][2].
