‘I was left homeless after the Home Office claimed I wasn’t a child – here’s why AI age checks are dangerous for migrants like me’
The Home Office has advertised a £1.3m contract for an algorithm that can ‘accurately predict someone’s age’. One former asylum seeker tells Holly Bancroft why he thinks this could go very wrong
Jean was just 16 when he was left outside the front door of the headquarters of UK Visas and Immigration in Croydon, south London – alone, frightened and without any documents to prove who he was. He had arrived in Britain just hours before – his first ever trip outside his central African country and the first time he had even travelled out of his hometown.
The past few days had been a nightmare that had seen him witness a horrific attack on his family. He himself had been subjected to torture and, left without anyone else to turn to, he had managed to find help from a trusted friend of the family.
She had brought him to the UK by plane, and Jean briefly thought he might have found safety with her, until he was taken to Croydon’s Lunar House and told he was now on his own.
“She said, ‘I can’t support you any more’. All I remember is she said, ‘Go into the building and tell them who you are’. It was a bit of a fight to let her go. I was scared, I didn’t want to go into that building. But she convinced me to go inside. I found out later that it was an immigration centre,” Jean, a pseudonym used for safety reasons, told The Independent.
“I was feeling confusion and fear. The weather, the language... everything was new to me. I was just lost. Initially, I was scared seeing people in uniform because that brought me back to what I had witnessed at home. I was traumatised from what I had experienced.”
Thousands of unaccompanied asylum-seeking children seek help from the UK authorities each year, the majority of them aged 16 or 17. In the year ending March 2025, there were 3,707 asylum claims from lone children.
For those aged 17 and under, social services must provide somewhere safe to live, as well as provide clothes, food and education, and help with an asylum claim.
However, hundreds of children are wrongly assessed by Home Office officials as adults, meaning they do not get the help they are entitled to and are often put into dangerous situations.
Data obtained by the Helen Bamber Foundation revealed that at least 678 children in 2024 were wrongly classified as adults after a human “visual assessment” at the border.
David Bolt, the independent chief inspector of borders and immigration, found factors like “lack of eye contact” were used to make decisions, and said that children were being “pressured” into declaring they were over 18. From a sample of 55 cases that the inspector looked at where the Home Office had said the asylum seeker was “significantly over 18”, 76 per cent were in fact found to be children.

Ministers now plan to replace human judgement with AI facial-recognition technology, in a move that charities and rights groups have said amounts to an “experiment on migrants” that will lead to “serious, life-changing consequences”.
The Home Office is in the market for “an algorithm that can accurately predict the age of a subject”. A government contract notice, seen by The Independent, says that the technology “will have multiple use cases for Home Office, an example could/would be to assist in determining the age of those who are encountered without verifiable identity documentation”.
The three-year contract, which will start in February next year, is valued at £1.3m. Announcing the plans in July, the then Home Office minister Dame Angela Eagle said that the AI facial age-estimation technology would be the “most cost-effective option”.
The aim is for facial age estimation to be “fully integrated into the current age assessment system over the course of 2026”, she said.
It is not yet clear whether the AI age-estimation technology would be used on children as they arrive in the UK on small boats, or to inform final asylum claim decisions. The Home Office has said that the technology will be used to assist officials, and that no final decisions have been made about at what stage of the process it will be integrated.
If used on arrival, the algorithm would have to account for the ageing effect of traumatic journeys, past torture and abuse – experiences that can often make young asylum seekers appear older.

Jean initially found help from social services when he arrived in the UK in 2012, and was housed in care with other children. However, Home Office officials later decided he wasn’t a child after all, and his support was taken away.
The decision was devastating. “I was called to an interview at 4pm. They gave me a time when offices are about to close, that’s my understanding now, but I didn’t realise it at the time,” he said.
“I had to wait for them to receive me at 5pm. They said, ‘You are not a child’, saying, ‘You are a liar’. I told them, ‘I am not a liar, I know who I am, I know my age.’ When someone is at a desk questioning your age, you feel like you are invisible. You have to fight for your identity, and it is not easy to fight for yourself.
“You feel like you have to isolate yourself to cope with what you have been through, constantly questioning, why, why, why? You feel like you want to end everything because they don’t believe you, and I know for sure that many young people are in the same situation.”
He explained that the immigration officials told him he had to go to the offices of a charity, Refugee Council, and that he should find his own way there: “They gave me a map, and it was a long journey to get there, especially as I was struggling with the language. I managed to get there but it was about to close, and I got sent to a hostel to sleep”.
By this stage a 17-year-old boy with little English, Jean was housed with adult asylum seekers in a hostel. He felt incredibly unsafe and decided to leave, something he later viewed as a mistake.
“I was traumatised, anxious, and I just wanted to be on my own. That was the idea,” he explained. This then led to around four years sleeping rough in London – until a stranger who saw him begging for money at a train station directed him to Notre Dame charity in Leicester Square.
He got a referral to migrant charity Freedom from Torture, which was able to support Jean in submitting a fresh asylum claim. A judge’s decision to grant him sanctuary in 2018, and a recognition that he should have been helped as a child refugee all those years ago, has meant Jean now has a roof over his head in council-provided accommodation.
On the day of our interview, he heard that he is now a British citizen. However, he fears for others like him who arrive in the UK as children but who are told they are liars.

Speaking about the government’s plans to use AI to help with decision-making, he said: “It’s a way of not treating people as human beings. They are treating us as a tool to train their AI.
“They are testing something, and it’s like we are not human. They are thinking, ‘OK, let’s use them.’
“Making decisions based on a computer, we all know it’s not always accurate. They need to understand that a lot of young people are going through trauma, and they may look different at that moment when they really need help.”
Kamena Dorling, director of policy at the Helen Bamber Foundation, said the government’s plans were “concerning unless significant safeguards are put in place”.
She added: “Existing evidence has found that AI can be even less accurate and more biased than human decision-making when judging a person’s age, with similar patterns of errors.
“Crucially, AI cannot account for factors that can significantly alter a young person’s appearance after fleeing conflict and persecution and making dangerous journeys, such as trauma, malnutrition and exhaustion.”
Anna Bacciarelli, senior AI researcher at Human Rights Watch, said: “The UK government’s plans to use facial age estimation are misguided at best, and should be scrapped immediately.
“In addition to subjecting vulnerable children and young people to a dehumanising process that could undermine their privacy, non-discrimination and other human rights, we don’t actually know if the technology works. There are no standardised industry benchmarks, and simply no ethical way to train and audit this technology on like-for-like populations.
“In the UK, it’s been used so far in shops and bars, not refugee processing centres.”
A Home Office spokesperson said: “Robust age assessments are a vital tool in maintaining border security.
“We will start to modernise that process in the coming months through the testing of fast and effective AI age-estimation technology. We then intend to integrate facial age estimation into the current system, subject to the results of testing and assurance.”