Thousands of passengers travelling from Dublin to Holyhead will be scanned by live facial recognition technology this week as part of a UK immigration enforcement pilot.
The use of the controversial technology at the Welsh port comes as police forces in England and Wales prepare a major expansion of the powerful surveillance tactic.
The trial at Holyhead this week follows a previous pilot in November last year. Transparency data published by the UK Home Office showed that during the six-day trial more than 7,500 faces were scanned, resulting in one arrest.
The operation is a “proof-of-concept pilot” by the Home Office's immigration enforcement division, which plans to use the technology to locate people within their “Population of Interest.”
Holyhead was chosen after intelligence showed individuals were returning to the UK in breach of deportation orders and using the Common Travel Area to circumvent immigration controls, the documents note.
Most passengers on the route are British or Irish nationals who move freely without routine checks and could “argue they are placed at a disadvantage by the location of the pilot.”
Human rights groups warn that the rapid adoption of facial recognition technology could enable widespread surveillance and discrimination.
Concerns have also been raised about its use on journeys within the Common Travel Area.
Úna Boyd, of the Committee on the Administration of Justice, said the trial “is yet another example of the UK government attempting to circumvent the law and carry out checks on Common Travel Area journeys.”
“We have long raised human rights concerns that CTA checks rely on racial profiling, and deploying this technology, which is inherently racist and misogynistic, will only deepen discriminatory and unlawful practices,” she added.
A Home Office spokeswoman rejected the claims, stating that “it is inaccurate to state Live Facial Recognition is used to circumvent border arrangements.”
The Department of Justice in Dublin said it would not comment on operations in another jurisdiction.

Shaun Thompson, an anti-knife crime campaigner who was stopped by police after being misidentified by facial recognition technology. Photo by Big Brother Watch.
Rollout
Live facial recognition technology captures digital images as people move through a designated “Zone of Recognition”.
Facial features are then “extracted and expressed as numerical values,” the Home Office documents state.
At Holyhead, images will be compared against a watchlist of individuals who have previously been issued deportation orders and are suspected of attempting to return to the UK.
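The pipeline the Home Office documents describe — faces reduced to numerical values, then compared against a stored watchlist — can be sketched in simplified form below. This is illustrative only: the function names, the tiny feature vectors and the 0.8 similarity threshold are assumptions for the sketch, and real live facial recognition systems use high-dimensional deep-learning embeddings and vendor-specific matching thresholds.

```python
import math

def cosine_similarity(a, b):
    """Compare two face feature vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.8):
    """Return watchlist IDs whose stored template is similar enough
    to the probe vector to raise an alert (threshold is illustrative)."""
    alerts = []
    for person_id, template in watchlist.items():
        if cosine_similarity(probe, template) >= threshold:
            alerts.append(person_id)
    return alerts

# Toy example: two enrolled templates, one passing face scanned.
watchlist = {"subject_A": [1.0, 0.0], "subject_B": [0.0, 1.0]}
probe = [0.9, 0.1]  # numerically close to subject_A's template
print(match_against_watchlist(probe, watchlist))
```

In a deployment like Holyhead, most probes would fall below the threshold and be discarded; only above-threshold matches generate the "alerts" the trial data counts, which is why 5,474 scans can yield just two alerts.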
The initial three-day trial in November scanned 2,038 faces, but there were no matches to the watchlist.
A second three-day trial later that month scanned a further 5,474 faces, leading to two alerts, one of whom was arrested.
The Home Office announced a third trial will start the week commencing 23 February 2026.
The Labour government has invested millions and pledged to ramp up the use of facial recognition technology, which it argues helps police solve crimes and protect vulnerable people.
But civil liberties groups say the technology also invades privacy and targets minorities.
Dr Elizabeth Farries, of University College Dublin’s Centre for Digital Policy (CDP), said there are “established problems” with live facial recognition technology (FRT).
“We are being continuously fingerprinted as members of the public going about our daily lives. We are caught in this continuous monitoring, which poses a high risk to our fundamental rights.”
She said that “it is an empirical fact” that black men are more likely to be incorrectly identified by AI surveillance technology.
“Bias is not just a technical error in AI, but a structural and systemic issue where AI systems reinforce existing societal prejudices, including bias against race.”
The Metropolitan Police are currently being taken to court by Shaun Thompson, a black community worker, who was wrongly identified as a criminal suspect.
Mr Thompson - who had to provide his ID, have his fingerprints scanned, and was inspected for scars and tattoos - described live facial recognition technology as “stop and search on steroids”.
Mission creep
Attempts to grant Gardaí facial recognition powers since 2022 have met opposition from TDs and civil liberties groups.
Last year, the government’s expert group on artificial intelligence warned that plans for facial recognition technology risk “gradual mission creep towards an untargeted mass surveillance state”.
Olga Cronin, a senior policy officer at the Irish Council for Civil Liberties (ICCL), warned about the potential future use of powerful surveillance technologies.
“When there are new technologies put on the table, we always have to consider what that looks like in the hands of a non-democratic ruler,” Ms Cronin said.
She cited its use by ICE in the US, by Iran against protesters, and by Hungary against LGBT parade participants.
“We might have a benign government today, and they might want to use tools for legitimate reasons,” she added.
“But we always have to be mindful that we are embedding surveillance infrastructure that may be in the hands of not-so-benign governments in the future.”
A version of this article appeared in The Irish Times
