Home Office drops 'racist' algorithm from visa decisions

A photo of UK Border Control at an airport shows two signs, reading "Welcome to the UK border" and "Welcome to the UK", with a warning about passport control. Image source: Getty Images

The Home Office has agreed to stop using a computer algorithm to help decide visa applications after allegations that it contained "entrenched racism".

The Joint Council for the Welfare of Immigrants (JCWI) and digital rights group Foxglove launched a legal challenge against the system.

Foxglove characterised it as "speedy boarding for white people".

The Home Office, however, said it did not accept that description.

"We have been reviewing how the visa application streaming tool operates and will be redesigning our processes to make them even more streamlined and secure," it said in a statement.

The controversy centred on the use of an applicant's nationality as part of the automated system.

Use of the controversial algorithm will be suspended on Friday 7 August, with a redesigned system expected to be in place by the autumn.

Foxglove said the system had "been used for years to process every visa application to the UK".

What did the algorithm do?

The Home Office characterised the algorithm as a "streamlining" system.

The system took some information provided by visa applicants and automatically processed it, giving each person a colour code based on a "traffic light" system - green, amber, or red.

One factor used was nationality - and Foxglove alleged that the Home Office kept a "secret list of suspect nationalities" which would automatically be given a red rating.

Those people were likely to be denied a visa, the group said.

"The visa algorithm discriminated on the basis of nationality - by design," added JCWI.

People from red-flagged countries, it said, "received intensive scrutiny by Home Office officials, were approached with more scepticism, took longer to determine, and were much more likely to be refused".

The group argued this process amounted to racial discrimination, putting it in breach of the Equality Act.

There was another factor at play, which the JCWI and Foxglove called a "feedback loop".

Visa decision rates would be used to decide which countries were on the "suspect nationalities" list, they said.

But the algorithm used that list, and red-flagged applications were less likely to succeed. Those results were then used to reinforce the list.

The JCWI said it was "a vicious circle".
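The Home Office has not published the tool's logic, but the feedback loop that JCWI and Foxglove describe can be sketched in simple Python. Everything here - the function names, the nationalities, the refusal rates, and the threshold - is invented for illustration only:

```python
# Hypothetical sketch of the "feedback loop" JCWI and Foxglove describe.
# All countries, rates, and thresholds are invented; the Home Office has
# not published the streaming tool's actual logic.

def stream_application(nationality, suspect_list):
    """Assign a traffic-light rating; listed nationalities go straight to red."""
    if nationality in suspect_list:
        return "red"    # intensive scrutiny - more likely to be refused
    return "green"      # lighter-touch processing

def update_suspect_list(refusal_rates, threshold=0.5):
    """Rebuild the list from past refusal rates - outcomes the list itself shaped."""
    return {country for country, rate in refusal_rates.items() if rate > threshold}

# One turn of the loop: red-flagged applications are refused more often,
# which keeps that country's refusal rate high, which keeps it on the list.
suspect_list = {"CountryA"}
refusal_rates = {"CountryA": 0.7, "CountryB": 0.2}

print(stream_application("CountryA", suspect_list))   # red
suspect_list = update_suspect_list(refusal_rates)
print("CountryA" in suspect_list)                     # True - the rating reinforces itself
```

The circularity is the point of the complaint: the list drives the refusals, and the refusals regenerate the list.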

'Institutionally racist'

"We're delighted the Home Office has seen sense and scrapped the streaming tool. Racist feedback loops meant that what should have been a fair migration process was, in practice, just speedy boarding for white people," said Cori Crider, founder of Foxglove.

Chai Patel, legal policy director of JCWI, said the Windrush scandal had shown the Home Office was "oblivious to the racist assumptions and systems it operates".

"This streaming tool took decades of institutionally racist practices, such as targeting particular nationalities for immigration raids, and turned them into software," he said.

The Home Office said it could not comment further while litigation was still ongoing.

Until the new system is in place, the streaming of visa applications will be based on information about the specific person - such as their previous travel - and nationality will not be taken into account.