Automated essay grading in the US has been shown to mark down African American students and students from other countries.
This white paper takes a deeper dive into the data and algorithm that underestimated the pass rate of students of certain nationalities, examining how data and modelling choices can lead to bias.
This short article gives an example of how predictive algorithms can penalise underrepresented groups of people. In this example, students from Guam had their pass rate underestimated relative to other nationalities because the data set used to build the prediction model contained too few students from Guam, resulting in insufficient accuracy.
This article details the algorithm used to inform A Level results for students who could not take exams due to the 2020 pandemic. The algorithm took into account the postcode of the student, which meant that students from lower-income areas were more likely to have their grades reduced, whilst students in high-income areas were […]
An outcry over alleged algorithmic bias against pupils from more disadvantaged backgrounds has left teenagers and experts alike calling for greater scrutiny of the technology.
A case study explaining the algorithmic bias inherent in grade prediction for A Level students, demonstrating the real-world impact AI can have if it is not scrutinised for bias.