We and AI, PO Box 76297
© We and AI 2020
AI systems are used to decide who has access to bank accounts, credit, loans, and mortgages, with the aim of reducing risk for lenders. Algorithms predict who should get credit, what rates they should pay, and whether an individual payment should be authorised. The models built to make these predictions are often trained on historical data, postcodes, and other proxy variables, which can discriminate against ethnic minorities. This means that, depending on your ethnicity and other factors, you may receive fewer financial opportunities than others, which can affect the education, career, home, and lifestyle available to you.
Algorithms can also be used to decide when to withhold loans, mortgages, and even bank accounts, on the basis of who is likely to make money for the bank. Minority ethnic groups can be disproportionately disadvantaged by these prediction systems, being judged not to meet the lending criteria even when applicants with the same financial status are approved.
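The proxy effect described above can be illustrated with a minimal sketch. Everything here is hypothetical: the scoring function, the weights, the threshold, and the `postcode_risk` feature are invented for illustration and do not come from any real lender's model. The point is that when an area-based feature correlates with ethnicity, two applicants with identical finances can receive different decisions.

```python
# Illustrative sketch only: hypothetical data and weights, not a real credit model.
# It shows how a postcode-derived feature can act as a proxy for ethnicity,
# so applicants with identical income and debt get different outcomes.

def credit_score(income, debt, postcode_risk):
    # Hypothetical linear score. "postcode_risk" stands in for a feature
    # derived from historical default rates by area, which can correlate
    # with the ethnic makeup of that area.
    return 0.5 * income - 0.3 * debt - 40 * postcode_risk

# Two applicants with the same financial status, different postcodes.
applicant_a = {"income": 100, "debt": 50, "postcode_risk": 0.1}
applicant_b = {"income": 100, "debt": 50, "postcode_risk": 0.9}

THRESHOLD = 25  # arbitrary cut-off for approval in this sketch

for name, applicant in [("A", applicant_a), ("B", applicant_b)]:
    score = credit_score(**applicant)
    decision = "approved" if score >= THRESHOLD else "declined"
    print(name, round(score, 1), decision)
```

Applicant A clears the threshold while applicant B, financially identical, does not: the only difference the model sees is where they live.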