U-turn on A-level algorithm in wake of JR threats

17 Aug 2020

Information Law, Public Law and Judicial Review

At 4pm today the Education Secretary Gavin Williamson announced a U-turn on Ofqual’s ‘standardised’ A-level results, in the face of an imminent judicial review of Ofqual’s algorithm brought by Ealing A-level student Curtis Parfitt-Ford, supported by Leigh Day and the tech-justice non-profit Foxglove. A team of barristers, comprising Estelle Dehon of Cornerstone Barristers and Ciar McAndrew of Monckton Chambers, led by David Wolfe QC of Matrix Chambers, was instructed.

The announcement confirmed that teachers’ predictions, known as Centre Assessment Grades (CAGs), would be used to grade students’ A-level performance instead of the algorithm, which had caused many students to drop two or three grades. The impact was not evenly spread: state schools with cohorts of more than 15 students were particularly disadvantaged, as were sixth form colleges.

The government was less than 24 hours away from a judicial review claim being launched. A pre-action letter sent on Friday explained why Ofqual’s algorithmic grading exceeded its statutory powers, violated key principles of data protection law, was unlawfully discriminatory and was tainted by procedural error.

The proposed legal challenge would have sat at the intersection of equality law and fairness under the GDPR. It alleged breaches of the duty of fairness because the algorithm profiled students based not on their own academic performance as assessed by their teachers, but predominantly on the historic performance of their school or educational institution and on the relative academic performance of their cohort. Furthermore, by giving historical data on school performance a decisive role in the standardisation process, it baked into individual grading decisions factors such as the historic disadvantage of students at that school, producing algorithmic bias. The result was that the model exacerbated unfairness through a standardisation process nominally designed to prevent it.
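To make that criticism concrete, the sketch below is a loose illustration of the kind of cohort-level standardisation being challenged; it is not Ofqual’s published model, and all names, functions and numbers in it are hypothetical. It shows the pattern objected to: each student’s grade is read off the school’s historic grade distribution according to their rank within the cohort, so the teacher’s assessment of the individual plays almost no role.

```python
# Illustrative sketch only -- NOT Ofqual's actual model.
# It demonstrates the criticised pattern: grades are assigned from the
# school's historic grade distribution by cohort rank, not from the
# teacher's assessment of the individual student.

GRADES = ["A*", "A", "B", "C", "D", "E", "U"]  # best to worst


def standardise(cohort_rank_order, historic_distribution):
    """Assign grades to a ranked cohort from a historic distribution.

    cohort_rank_order: student names, best first (the teacher's ranking).
    historic_distribution: dict of grade -> proportion of that grade
        achieved at the school in previous years (should sum to ~1.0).
    """
    n = len(cohort_rank_order)
    results = {}

    # Build cumulative upper boundaries for each grade band.
    boundaries = []
    cumulative = 0.0
    for grade in GRADES:
        cumulative += historic_distribution.get(grade, 0.0)
        boundaries.append((grade, cumulative))

    # Place each student into a band purely by cohort position.
    for position, student in enumerate(cohort_rank_order):
        percentile = (position + 0.5) / n  # 0 = top of the cohort
        for grade, upper in boundaries:
            if percentile <= upper:
                results[student] = grade
                break
        else:
            results[student] = GRADES[-1]
    return results


# Hypothetical example: a school whose past results cluster around B-D.
history = {"A*": 0.0, "A": 0.05, "B": 0.25, "C": 0.40, "D": 0.20, "E": 0.10}
cohort = [f"student_{i}" for i in range(1, 21)]  # 20 students, teacher-ranked
print(standardise(cohort, history))
```

On this illustration, no student at that school can be awarded an A*, however strong their individual work, because the school has never produced one in the historic data; that is the sense in which a standardisation of this kind bakes historic disadvantage into individual grades.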

Estelle Dehon said: “Data protection laws are often seen as unsexy, but this case shows their true power and import. As more and more decisions are influenced by algorithmic models, or actually taken by the algorithms, the GDPR becomes a key bulwark against bias.”

Notes

Curtis Parfitt-Ford is an A-level student at Elthorne Park High School, a comprehensive school in Ealing, London. He excelled in his GCSEs, achieving six grade 9s and five grade 8s, and brought a legal challenge out of deep concern that the system would further ingrain inequality in education, treating him and his peers, especially those at disadvantaged schools, unfairly. Curtis is also a programmer and has started his own company, Loudspeek, a digital system that aims to help people launch campaigns and write to their MPs. He is actively considering his next educational steps.

Foxglove, which has supported Curtis in bringing this case, is a new non-profit that exists to make tech fair. It was behind the recent successful legal challenge, brought for the Joint Council for the Welfare of Immigrants (JCWI), to the Home Office visa algorithm, another biased algorithm that harmed millions of people.

Legal team: Curtis was represented by solicitor Rosa Curling at Leigh Day. Counsel are David Wolfe QC at Matrix Chambers, Estelle Dehon at Cornerstone Barristers and Ciar McAndrew at Monckton.
