Using AI in the examination process

As exams have moved online, technology to protect remote exam-taking against fraud has quickly become necessary. Combinations of machine learning algorithms, facial recognition technology and eye-tracking software are used to verify that the person taking the exam is who they claim to be and to detect cheating. However, there have been many documented cases in which these systems fail to identify students with dark skin, preventing them from taking their exams.
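
To make that failure mode concrete, here is a minimal sketch (not any vendor's actual code) of the kind of face-detection gate such systems rely on. It uses OpenCV's bundled Haar-cascade detector purely for illustration; real proctoring products use proprietary models, but the gating logic is similar: if the detector does not find a face, the student cannot proceed.

```python
import cv2

# OpenCV's stock Haar-cascade face detector, used here only as a
# stand-in for whatever proprietary model a proctoring vendor ships.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def may_start_exam(webcam_frame) -> bool:
    """Return True only if a face is detected in the webcam frame."""
    gray = cv2.cvtColor(webcam_frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # Admission hinges entirely on this one detector output. A model
    # that under-detects darker skin tones will return no faces for
    # those students, locking them out through no fault of their own.
    return len(faces) > 0
```

Because access is gated on a single detector decision, any systematic bias in that detector translates directly into students being denied entry to their exams.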

Software that monitors students during tests perpetuates inequality and violates their privacy

In this opinion piece, a university librarian argues that millions of algorithmically proctored (invigilated) tests take place every month around the world, a number that grew sharply during the pandemic. In his experience, algorithmic ‘proctoring’ reinforces white supremacy, sexism, ableism and transphobia, invades students’ privacy and often amounts to a civil rights violation.

Remote testing monitored by AI is failing the students forced to undergo it

An opinion piece giving examples of students severely disadvantaged by exam-proctoring software, including a Muslim woman whom the software forced to remove her hijab to prove she was not hiding anything beneath it.

Exams that use facial recognition may be ‘fair’ – but they’re also intrusive

A news article arguing that whilst AI facial recognition during exams might be ‘fair’, it both invades students’ privacy and risks introducing unwarranted biases.
