
Public Law Project begins judicial review over ‘discriminatory’ Home Office algorithm used to identify potential sham marriages

Summary

Marriages involving Greeks, Bulgarians, Romanians and Albanians are disproportionately investigated, PLP says

By EIN

The Public Law Project (PLP) announced on Friday that it had begun a legal challenge over what it argues is an unlawful and discriminatory automated algorithm used by the Home Office to identify and investigate potential sham marriages.

According to the PLP, the Home Office has used the algorithm in its automated 'triage' tool since at least April 2019. For more background on the algorithm, see PLP's September 2021 written evidence to Parliament's Justice and Home Affairs Committee.

In 2021, PLP explained that the algorithm gives each couple a 'red' or 'green' rating. 'Red' means that the Home Office should investigate the couple 'to rule out or identify sham activity', while 'green' means that an investigation is not warranted. PLP said it was concerned that the algorithm may be flawed and discriminatory: "In particular, the Home Office's documents show that some nationalities, including Bulgarian, Greek, Romanian and Albanian people, have their marriages rated 'Red' at a much higher rate than others."

Ariane Adam, PLP's Legal Director, stated last week that the information that is publicly known about the algorithm demonstrates prima facie indirect discrimination against these nationalities.

In launching its legal challenge, PLP commented: "Profiling by nationality is not permitted without ministerial authority. Based on the information publicly available, the Home Office has not sought such authority. There is no evidence, other than the outcomes of the triage tool itself, that the nationalities disproportionately affected are more likely to be involved in sham marriages. Where there is prima facie discriminatory impact, the Home Office must justify it by demonstrating that it is a proportionate means of achieving a legitimate aim. No publicly available documents demonstrate such a justification has been made."

PLP also argues that the Home Office's secrecy around the algorithm and triage tool breaches transparency rules under the General Data Protection Regulation (GDPR).

Last month, the First-tier Tribunal (General Regulatory Chamber) gave its decision in Public Law Project v The Information Commissioner [2023] UKFTT 00102 (GRC). PLP was appealing against the Information Commissioner's decision upholding the Home Office's refusal to disclose certain information about the triage tool in response to a Freedom of Information (FOI) request. The Home Office refused to disclose the criteria used by the tool after concluding that their publication would prejudice its ability to detect and deter sham marriages and would not be in the public interest.

The First-tier Tribunal dismissed PLP's appeal, finding that section 31(1) of the Freedom of Information Act 2000 was engaged, which allowed the information to be withheld by virtue of paragraphs (a) "the prevention or detection of crime" and (e) "the operation of the immigration controls".

The decision stated: "The Tribunal have considered the criteria set out in the closed bundle and are satisfied that it would be likely, or more than probable that there would be prejudice, that would be real, actual and of substance. This prejudice would result from disclosure of the withheld information to the world at large. It is, in our view predictable that understanding the criteria could lead interested individuals or parties, to adapt their behaviour or answers to any questioning or subsequent investigation. We find that this in turn would have a negative effect, including on the voluntary supply of information to the [Home Office] in the future. The Tribunal therefore accept that Section 31(1)(a) is engaged."

As PLP noted, the Tribunal added, however, that it recognised the potential for bias and noted that the apparent discriminatory effect of the Home Office's use of the algorithm could be challenged by way of judicial review.

The Tribunal's decision continued: "Specific nationalities may be more vulnerable to suspicions or accusations of abuse of systems and the processes involved. The Tribunal also accept that there will be some indirect discrimination for the reasons in the appellants arguments, however the Tribunal cannot support the suggestion that disclosure of the criteria will help to minimise or help to understand such prejudice in so far as the referral into the tool could equally have impact in terms of indirect discrimination. […] The Appellant further argues that disclosure is particularly important given that the available evidence demonstrates a prima facie situation of indirect discrimination. Whilst clearly this is a matter of public interest, and the Commissioner has accepted this, there are alternative means to address the issues, for example judicial review, or other legal causes of action, challenges, and other public officials and/or authorities that can provide a means of redress to various injustices that might be suffered by concerned individuals or groups."

PLP began the judicial review process last Thursday by writing a pre-action protocol letter to the Home Office.

BBC News reported that the Home Office had declined to comment on the specifics of PLP's legal challenge, though the department has previously denied that the algorithm uses nationality as a factor.