AI systems could unfairly decline new bank account applications, block payments and credit cards, deny loans, and withhold other vital financial services and products from qualified customers because of how their data is treated or labelled.
This is an overview of racial justice in tech and in AI, considering how systemic change must happen for technology to support equity.
This short article looks at the link between the lack of diversity in the AI workforce and the bias against ethnic minorities within financial services – the “new danger of ‘bias in, bias out’”.
A detailed summary of research by Mark Weber, Mikhail Yurochkin, Sherif Botros and Vanio Markov, breaking down the lack of racial justice in the current US financial system, which leads to the loss of the right to financial security, along with an examination of possible solutions.
This video is an in-depth panel discussion of the issues uncovered in the ‘Unmasking Facial Recognition’ report from WebRootsDemocracy. The report found that the use of facial recognition technology is likely to exacerbate racist outcomes in policing, and revealed that London’s Metropolitan Police failed to carry out an Equality Impact Assessment before trialling the technology at […]
Machine Bias – There’s software used across the country to predict future criminals and it’s biased against blacks
This article details software used to predict the likelihood of reoffending. It uses case studies to demonstrate the racial bias prevalent in the software used to predict the ‘risk’ of further crimes. Even for a similar crime, a white offender would be much more likely to be judged low-risk. […]
Explains (from a US perspective) how the development of machine learning and algorithms has left financial services at risk of exacerbating biases. Using the example of lending, the article explains how algorithms incorporate biases into our systems, and what organisations can do to limit risk, particularly from a legal perspective.
UN Working Paper providing the evidence base for a conceptual framework of the cyclical relationship between climate change and social inequality.
Algorithms can be used to decide whether to withhold loans, mortgages and even bank accounts, on the basis of who is likely to make money for the bank. Minority ethnic groups can be disproportionately disadvantaged within these prediction systems by being judged not to meet the criteria for lending, even when others with the same financial status are approved.
This report provides an overview of AI and Climate Change with considerations for AI Climate Solutions. It discusses values-based climate communication and the difference between climate engagement and climate manipulation. A good source on localising climate visualisation, and how AI has been utilised to do this (including the risks).