TLEF sought expert legal opinion from Robin Allen QC and Dee Masters of Cloisters chambers on the impact of using algorithms in government decision-making.
Decisions made using AI under the post-Brexit EU Settlement Scheme and in welfare benefits may disadvantage women, the equality and discrimination experts say.
As part of TLEF’s work to identify gaps in legal analysis and understanding, the Foundation has obtained a legal opinion from specialist discrimination law barristers which raises concerns about the impact of algorithms and automated data processing on the transparency and fairness of government decision-making.
Read the opinion: https://www.cloisters.com/equality-implications-of-government-decision-making-and-artificial-intelligence/
Barristers Dee Masters and Robin Allen QC, both from Cloisters, concluded in their opinion for TLEF: ‘There is a very real possibility that the current use of governmental automated decision-making is breaching the existing equality law framework in the UK. What is more, it is hidden from sight due to the way in which the technology is being deployed.’
They predicted that equality claims arising from AI will become ‘the next battle ground over coming decades’.
A key focus of the Foundation’s policy work, led by Swee Leng Harris, is on ensuring that automated processes which are intended to make applications quicker and simpler are fair and open to proper scrutiny.
The Cloisters team were asked to consider two specific areas of government decision-making: the EU Settlement Scheme for European nationals who want to stay in the UK post-Brexit; and the use of ‘risk-based verification’ by some local authorities to detect fraudulent housing and council tax benefit claims.
EU nationals who have been in the UK continuously for five years are eligible for settled status, which means they have the right to stay here. To determine eligibility, the Home Office uses an automated decision-making process which analyses data from the Department for Work and Pensions (DWP) and the applicant’s tax records.
‘It appears that a case-worker is also involved in the decision-making process but the government has not fully explained how its AI system works or how the human case worker can exercise discretion,’ say Allen and Masters.
They are concerned that although the automated process checks some DWP data to determine whether an applicant meets the five-year residence requirement, records relating to Child Benefit and Child Tax Credit are not interrogated. ‘This is important because the vast majority of Child Benefit recipients are women, and women are more likely to be in receipt of Child Tax Credits.’ This means that the system could be skewed against women applicants and ‘could very well lead to indirect sex discrimination contrary to s19 Equality Act 2010’. They add that there may also be implications for disabled applicants, as commentators have suggested that they and their carers need to provide additional information as part of the settled status process.
Allen and Masters also identified transparency concerns over the use of algorithms by local authorities to identify fraudulent housing and council tax benefit applications. The software gives each applicant a risk-rating for fraud, which determines how much scrutiny their application gets. The barristers say: ‘There is no publicly available information which explains how such algorithms are being deployed and on what basis.’
They add: ‘Our view is that if there is some evidence that an individual has been discriminated against by a risk-based verification system and this is coupled with a complete lack of transparency, then the burden of proof should shift to the local authority to prove that discrimination is not happening.’
Swee Leng Harris says: ‘Our aim in obtaining this legal opinion was to add to the available technical legal analysis of automated decision-making, which needs to be fair and open to proper scrutiny. Where this doesn’t happen, it is clear that there may well be grounds for legal challenge under existing equality legislation.’