A quick, concise Axios video describing algorithmic bias: how and why human bias ends up in systems used for hiring, criminal justice, and other domains.
Through a case study of mortgage applications, this article shows how bias can be introduced into AI systems through bias within historical data and/or the inherent biases of AI programmers and employers. The article explains why this presents a risk to businesses, which may miss out on customers (by refusing credit to creditworthy people) […]
This article explains how make-up can be used both as a way to evade facial recognition systems and as an art form.
A New Jersey man was accused of shoplifting and of trying to hit an officer with a car. He is the third known Black man to be wrongfully arrested based on a facial recognition match.
Data predict that introducing AI into the financial system will bring increased racial bias and discrimination. Through a case study of mortgage applications, this article shows how bias can be introduced into AI systems by (1) bias within historical data and (2) the inherent biases of AI programmers […]
This article argues that leading AI ethics researchers, such as Timnit Gebru, are often promised total academic freedom when recruited or interviewed for in-house roles at technology companies. In practice, internal roadblocks, a lack of employee diversity, and hierarchical issues limit the impact of this research, which aims to promote equity, fairness, and accountability in AI products. […]
In 2018, Amazon’s AI hiring tool was found to favour male job candidates because its algorithms had been trained on 10 years of internal data that skewed heavily male. In effect, the algorithm was trained to believe that male candidates were better than female candidates.
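The Amazon case illustrates a general mechanism: a model trained on skewed historical outcomes encodes the skew as a preference. A minimal, purely synthetic sketch of that mechanism (the data-generating numbers and the naive correlation "model" below are illustrative assumptions, not Amazon's actual data or system):

```python
import random

random.seed(0)

# Synthetic "historical" hiring records: qualification is the only real
# signal, but past decisions favoured men (illustrative numbers only).
def make_history(n=10_000):
    rows = []
    for _ in range(n):
        male = random.random() < 0.8           # applicant pool skews male
        qualified = random.random() < 0.5      # independent of gender
        # Historical decisions: qualified men always hired,
        # qualified women hired only 30% of the time.
        hired = qualified and (male or random.random() < 0.3)
        rows.append((int(male), int(qualified), int(hired)))
    return rows

def naive_weights(rows):
    """Score each feature by the difference in hire rate when it is 1 vs 0."""
    def rate_gap(idx):
        hired_if_1 = [r[2] for r in rows if r[idx] == 1]
        hired_if_0 = [r[2] for r in rows if r[idx] == 0]
        return (sum(hired_if_1) / len(hired_if_1)
                - sum(hired_if_0) / len(hired_if_0))
    return {"male": rate_gap(0), "qualified": rate_gap(1)}

w = naive_weights(make_history())
# 'male' gets a positive weight even though gender carries no information
# about qualification: the historical skew is learned as a preference.
print(w)
```

Any model fitted to these labels, however sophisticated, faces the same trap: gender genuinely predicts the *historical* hire label, so removing the bias requires auditing the data, not just the algorithm.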
This video argues that hiring is largely analogue and broken, leading to major problems: inefficiency, ineffectiveness (50% of first-year hires fail), poor candidate experience, and a lack of diversity. The hiring process is plagued by gender, age, socioeconomic, and racial bias. Pymetrics intentionally audits its algorithms to weed out unconscious human […]
PwC has come under fire for developing a facial recognition tool that logs when employees are absent from their computer screens while working from home. Since image recognition is known to work better for white males than for other parts of the population, would non-white employees be penalised by the tool […]
An AI expert's anecdotal view of human job loss: which jobs are already being displaced by AI and automation.