AI systems and tools can be used to enhance learning in the classroom, to facilitate online learning, and to administer exams online. However, these technologies can disadvantage people with darker skin, as they are often not designed to work for them. Algorithms are also used in automated decision-making to predict grades, either to inform decisions about students or to award exam results. There have been cases where these algorithms were biased against marginalised groups, making discriminatory decisions about students' futures.

Using AI in the examination process

Technology to protect remote exam-taking against fraud has quickly become necessary as exams move online. Combinations of machine learning algorithms, facial recognition technology and eye-tracking software are used to verify that the person taking the exam is who they claim to be, and to identify cheating.
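One way such bias shows up in practice is in verification thresholds: a face-match score is compared to a cut-off, and a system that scores some groups systematically lower will wrongly reject them more often. The sketch below is purely illustrative (not any vendor's actual system, and the scores are hypothetical): it shows how a per-group false rejection rate could be measured.

```python
# Illustrative sketch: proctoring tools typically verify identity by
# comparing a face-match score against a threshold. One basic fairness
# check is the false rejection rate per demographic group.

def false_rejection_rate(scores, threshold):
    """Fraction of genuine test-takers wrongly rejected (score < threshold)."""
    rejected = sum(1 for s in scores if s < threshold)
    return rejected / len(scores)

# Hypothetical match scores for genuine test-takers in two groups.
group_a_scores = [0.91, 0.88, 0.95, 0.87, 0.93]
group_b_scores = [0.82, 0.71, 0.90, 0.68, 0.85]

threshold = 0.80
frr_a = false_rejection_rate(group_a_scores, threshold)  # 0.0 — no rejections
frr_b = false_rejection_rate(group_b_scores, threshold)  # 0.4 — 2 of 5 rejected
```

With the same threshold, one group is never rejected while the other is rejected 40% of the time; a disparity like this is exactly what audits of these systems look for.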


How AI applications are used to aid learning

Many exciting AI applications are being used to enhance learning in the classroom, from automated 'smart tutors' that can assess pupil performance and tailor learning interventions more accurately than humans, to facial recognition cameras that claim to assess pupils' understanding by analysing their facial expressions.
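To make the 'smart tutor' idea concrete, here is a minimal, hypothetical sketch of how such a system might track whether a pupil has mastered a skill: a simple Bayesian update after each answer. The parameter values and threshold logic are assumptions for illustration, not a description of any real product.

```python
# Illustrative sketch of a 'smart tutor': a simple Bayesian update of the
# probability that a pupil has mastered a skill, revised after each answer.
# Parameter values are hypothetical.

P_SLIP = 0.1    # chance a pupil who knows the skill answers wrongly
P_GUESS = 0.2   # chance a pupil who doesn't know it answers correctly

def update_mastery(p_known, correct):
    """Posterior probability the skill is mastered, given one answer."""
    if correct:
        likelihood_known = 1 - P_SLIP
        likelihood_unknown = P_GUESS
    else:
        likelihood_known = P_SLIP
        likelihood_unknown = 1 - P_GUESS
    evidence = p_known * likelihood_known + (1 - p_known) * likelihood_unknown
    return p_known * likelihood_known / evidence

p = 0.5  # start uncertain
for answer in [True, True, False, True]:
    p = update_mastery(p, answer)
# p now reflects the tutor's confidence in mastery; an intervention could
# be triggered whenever p falls below some threshold.
```

The point of the sketch is that the tutor's "assessment" is just a statistical estimate, so its quality depends entirely on how well its parameters fit each pupil.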


Using AI systems in school admissions

A range of approaches are being used to make decisions about admissions to universities and schools, including algorithms that draw on data pulled from students' social media channels. There is concern that these systems will be biased against ethnic minorities, both because less data is available about them and because human bias is built into the models.
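The "smaller amounts of data" problem can be made precise: a model's accuracy for a group measured on few examples carries much wider uncertainty, so harm to that group can go undetected. A minimal sketch, with hypothetical group sizes:

```python
import math

# Illustrative sketch: the standard error of an observed accuracy p
# measured on n examples. Smaller groups -> wider uncertainty, so a
# model that fails a minority group can look fine in aggregate checks.

def standard_error(p, n):
    """Standard error of an observed accuracy p on a sample of size n."""
    return math.sqrt(p * (1 - p) / n)

majority_n, minority_n = 10_000, 200   # hypothetical group sizes
se_major = standard_error(0.9, majority_n)
se_minor = standard_error(0.9, minority_n)
# se_minor / se_major == sqrt(10_000 / 200) ≈ 7.07: the minority-group
# accuracy estimate is roughly seven times less certain.
```

This is one reason auditors ask for performance to be reported per group, not just overall.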
