The growth of technology has brought a great deal of efficiency and security to almost all organisations and businesses. But such progress may have taken a wrong turn: the Home Office's reliance on artificial intelligence as a streaming tool for visa applications may in fact be operating on grounds of racial bias.

Last week the Guardian reported that legal action had been filed by the Joint Council for the Welfare of Immigrants (JCWI) against the Home Office over the use of an algorithm which filters visa applications on potentially unlawful grounds such as nationality. This follows a report by the Financial Times in June this year that the Home Office was 'secretly' using the algorithm to process visa applications. The Home Office has since confirmed the technology is only used to allocate applications, not to decide them. But why the mystery?

I am among many who would consider the advancement of technology a blessing to mankind, as I, quite frankly, cannot imagine a world without smartphones. On the other hand, some would consider it a disaster. While that is slightly dramatic, I can see why one would come to this conclusion. We are increasingly dependent on technology for almost all everyday tasks. To find directions to a location we use GPS; for cab services we resort to the numerous apps on our phones. We no longer memorise directions because Google Maps does it for us. In many ways, such levels of reliance on technology can be daunting.

The Home Office's reliance on the algorithm to stream visa applications may have been intended to improve its efficiency and, finally, its service standards and visa processing times (which is itself a topic for another blog). The Home Office is said to have been relying on an algorithm which uses nationality to stream applicants into the categories green, yellow and red. Despite the Home Office's assurance that the algorithm is only used to allocate applications and not to make decisions, the basis on which applicants are streamed and their applications allocated is crucial. We cannot be so caught up celebrating the technological advancement of our time that we ignore an unhealthy reliance on artificial intelligence in dealing with people's lives, and in so doing discriminate against them on grounds of their race and nationality.

The process of allocating cases (by an algorithm) and alerting human caseworkers that an application has been streamed as red, simply on the basis of the applicant's nationality (if this is indeed the case), tells the caseworker the application is a potential refusal before it has been fully considered. By contrast, an application streamed as green by the algorithm tells the caseworker the case is worthy of approval. This means two almost identical applications, made on the same grounds, are treated and processed differently. The case labelled red is likely to be treated with more suspicion and subjected to intense and intrusive scrutiny, while the case given the green light receives less scrutiny and is more likely to be approved.
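To make the objection concrete, the traffic-light streaming described above can be sketched in a few lines of code. This is purely illustrative: the Home Office has not published the algorithm's logic, so the nationality lists, the function, and the labels below are all invented assumptions, not the actual system.

```python
# Hypothetical sketch of nationality-based streaming -- NOT the actual
# Home Office algorithm, whose inputs and rules are not public.

# Invented example lists; the real risk lists (if any) are unknown.
HIGH_RISK_NATIONALITIES = {"Country A"}
MEDIUM_RISK_NATIONALITIES = {"Country B"}

def stream_application(nationality: str) -> str:
    """Assign a red/yellow/green stream based solely on nationality."""
    if nationality in HIGH_RISK_NATIONALITIES:
        return "red"     # flagged as a likely refusal before any review
    if nationality in MEDIUM_RISK_NATIONALITIES:
        return "yellow"
    return "green"       # presumed low risk, lighter scrutiny

# Two otherwise identical applications diverge on nationality alone:
print(stream_application("Country A"))  # red
print(stream_application("Country C"))  # green
```

The point of the sketch is that nationality is the only input: every other detail of the two applications could be identical, yet the outcomes differ.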

It doesn't take a genius to know which nationalities and races are likely to get the red light and which are likely to get the green light. Some may accept the justification for making such a distinction: that applicants from a select few countries have previously breached immigration laws, are therefore considered higher risk, and so it is acceptable for everyone of those nationalities to be tarred with the same brush. This is embarrassing to write, let alone say out loud. Why should an applicant who genuinely wishes to visit his or her family in the UK be subjected to harsher scrutiny and treated differently because previous applicants from his or her country of nationality have overstayed or committed a crime? An applicant should only be held to that standard if he or she has personally breached an immigration law or committed a crime in the past. Applying a harsh standard to a completely new applicant simply because their nationality matches that of previous offenders is both unfair and prejudicial. Applicants of all nationalities have the potential to breach immigration rules.

Aside from the increased scrutiny and chance of refusal, the streaming alone means that processing times will vary according to, for example, the applicant's nationality. Nearly all applicants want and need their application to be dealt with as quickly as possible. They all pay the same high Home Office fees, often including priority processing, and yet receive different service standards based on preconceived applicant characteristics. Treating applicants streamed as red (on the basis of their nationality alone) differently from applicants streamed as green, also on the basis of their nationality alone, for the same type of application is without question discriminatory.

Nationality falls within the protected characteristic of race under the Equality Act 2010, and a public authority such as the Home Office "must when making decisions of a strategic nature about how to exercise its functions, have due regard to the desirability of exercising them in a way that is designed to reduce the inequalities of outcome...".

As mentioned above, if visa applicants are being treated differently because an algorithm has streamed them as red, this not only helps explain the high number of UK visa refusals for African visitors, but also indicates that the Home Office may be in breach of the Equality Act. So while the use of technology should be appreciated for the remarkable improvement it has brought to our lives, it should not be programmed in a way that encourages racial bias, making an already bad situation worse.

In conclusion, the Home Office must be transparent about how the algorithm works and the basis on which it has been designed to stream visa applications. It will be interesting to see whether JCWI's legal action reveals the full details of how the algorithm operates, which could, for example, stream applications not only by nationality but also by sex and age.