Using AI in the examination process

Technology to protect remote exam-taking against fraud has quickly become necessary as exams have moved online. Combinations of machine learning algorithms, facial recognition and eye-tracking software are used to verify that the person taking the exam is who they claim to be, and to detect cheating. However, there are many documented cases of these systems failing to recognise students with dark skin, preventing them from taking their exams at all.


Software that monitors students during tests perpetuates inequality and violates their privacy

The coronavirus pandemic has been a boon for the test proctoring (invigilating) industry. About half a dozen companies in…

Remote testing monitored by AI is failing the students forced to undergo it

What’s worse than remote school? Remote test-taking with AI proctors. While automated proctoring (invigilating) might seem like a panacea in this age of virtual schooling, it’s a terrible solution for millions…

Exams that use facial recognition may be ‘fair’ – but they’re also intrusive

Although companies behind remote tests say their technology ensures integrity, I’m concerned about privacy and bias. News article which…


Actions you can take
