In the wake of the COVID-19 pandemic, schools have increasingly turned to digital learning platforms to help students continue their education. These new technologies, many of which use machine learning (ML) algorithms, have been vital in the effort to bring degree-conferring educational institutions online.

The appeal of ML is its potential to make highly accurate predictions with minimal human intervention. In online learning, ML is applied in numerous ways, such as mapping a student's behavioural patterns, predicting learning outcomes, personalising curricula, and customising instruction. Applications like SpotterEDU, which has seen rapid growth since the start of the pandemic, use ML to monitor students' attendance and manage communications between educators and students.

However, the widespread collection and processing of students' personal data raises serious issues. Applications like SpotterEDU make use of geolocation data to provide insights to educators and school administrators, which raises obvious concerns over the physical privacy of students. Moreover, educators' premature and wholesale reliance on data and ML recommendations can lead to biased and discriminatory outcomes in education.

The recent release of A-level exam results in the UK has brought major concerns surrounding ML in education to light. The A-level examinations, roughly comparable to a standard US high school diploma, are taken annually by final-year students as a graduation requirement. With in-person exams cancelled by the pandemic, regulators in the UK relied on an algorithm known as the Direct Centre Performance Model to assign grades. When the results were released in August, they caused significant turmoil: thousands of students were substantially marked down, prompting scrutiny of the use of algorithms and automated grading systems. The downgrades sparked widespread debate over whether the government's reliance on an algorithm to mark the A-level exams, in its aim to produce an accurate and fair grading system, was lawful.

Article 22 of the GDPR stipulates that individuals have the right not to be subject to a decision based solely on automated processing that produces legal or similarly significant effects on them. The Office of Qualifications and Examinations Regulation (Ofqual) defended the 2020 A-level outcome by arguing that the results were not based on solely automated decision-making, because teachers provided predicted grades and rankings for students; that argument can be challenged.

The A-level algorithm relied on schools' historical performance data to calculate current results. This means that students attending a historically underperforming school may themselves be doing well, yet still be marked down because of the weaker results of previous cohorts.
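To see why this is problematic, consider a minimal sketch in Python of rank-based standardisation against a school's historical grade distribution. This is purely illustrative and is not Ofqual's published model; the function, data, and grade proportions are hypothetical.

```python
# Illustrative sketch (not Ofqual's actual model): grades are anchored to a
# school's historical grade distribution, so a strong current cohort at a
# historically weak school is pulled down regardless of individual ability.

def standardised_grades(teacher_rankings, historical_distribution):
    """Map each student's within-school rank onto the school's
    historical grade distribution.

    teacher_rankings: students ordered best to worst by their teachers.
    historical_distribution: list of (grade, proportion) from past cohorts.
    """
    n = len(teacher_rankings)
    grades = {}
    position = 0
    for grade, proportion in historical_distribution:
        count = round(proportion * n)
        for student in teacher_rankings[position:position + count]:
            grades[student] = grade
        position += count
    # Students left over by rounding receive the lowest historical grade.
    lowest_grade = historical_distribution[-1][0]
    for student in teacher_rankings[position:]:
        grades[student] = lowest_grade
    return grades

# A historically weak school: only 10% of past students achieved an A.
history = [("A", 0.1), ("B", 0.2), ("C", 0.4), ("D", 0.3)]
cohort = [f"student_{i}" for i in range(10)]  # current cohort may be stronger
print(standardised_grades(cohort, history))
# Even if teachers predicted A grades for half the cohort, only one student
# can receive an A, because the historical distribution caps the outcome.
```

The design flaw the sketch exposes is that individual attainment never enters the calculation: only rank and school history do, so a student's grade is capped by how previous students performed.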

Where a Data Protection Officer (DPO) can help

In June 2019, the Information Commissioner's Office (ICO) examined how ML can perpetuate human bias and discrimination under the General Data Protection Regulation (GDPR) and the Equality Act 2010. The ICO expects organisations to comply with provisions of the GDPR, specifically those provisions that protect the fundamental rights and freedoms of data subjects and prevent possible discrimination. According to the ICO, an ML algorithm can become discriminatory if it receives imbalanced training data, which could be a reflection of past discrimination. The ICO highlights technical approaches to mitigate discrimination risk in ML models, which include: anti-classification methods, outcome and error parity, and equal calibration.
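As a concrete example of what an error parity check might look like in practice, here is a minimal Python sketch. The group labels, data, and function are hypothetical illustrations, not an ICO-prescribed implementation.

```python
# Hedged illustration of an "error parity" check: compare a model's
# false negative rates across groups (here, hypothetical schools).

def false_negative_rates_by_group(y_true, y_pred, groups):
    """Return the per-group false negative rate: the share of genuinely
    qualified students (y_true == 1) whom the model failed to pass."""
    rates = {}
    for group in set(groups):
        fn = sum(1 for t, p, g in zip(y_true, y_pred, groups)
                 if g == group and t == 1 and p == 0)
        positives = sum(1 for t, g in zip(y_true, groups)
                        if g == group and t == 1)
        rates[group] = fn / positives if positives else 0.0
    return rates

# Hypothetical outcomes: 1 = deserved / was awarded a passing grade.
y_true = [1, 1, 1, 1, 1, 1, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0]
groups = ["school_A", "school_A", "school_A",
          "school_B", "school_B", "school_B", "school_A", "school_B"]

print(false_negative_rates_by_group(y_true, y_pred, groups))
# {'school_A': 0.0, 'school_B': 1.0} -- a large gap between groups signals
# disparate error rates, which an auditor or DPO would flag for review.
```

A check like this does not by itself fix a biased model, but it gives a DPO or auditor a measurable signal that one group is bearing a disproportionate share of the model's errors.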

Many organisations are turning to these new technologies to address the challenges posed by the COVID-19 pandemic. But, as demonstrated in the recent A-level examination controversy, these technologies can lead to discriminatory and potentially unlawful outcomes.

To capture the benefits of ML algorithms while avoiding the obvious pitfalls of this new technology, organisations need to find progressive ways to design and use algorithms that comply with the GDPR. By hiring a data protection officer (DPO), organisations can rely on the knowledge and expertise of an individual or team that is familiar with these challenges and can help design an internal data governance system that provides value for both data controllers/processors and data subjects.