Why we need to act on racial bias in AI

Some starting points, and why inaction makes you part of the problem

For many who saw a man murdered in front of complicit police colleagues and helpless bystanders, it has become ever more apparent that something needs to change in a system which allows this to happen repeatedly without consequence.

What might be less apparent is that every one of us in tech and business needs to be part of that change, whether we believe ourselves to be racially prejudiced or “woke”.

In the UK, as protesters fill Hyde Park today, many are aware that while systemic racism may not result in as many horrific deaths of black people at the hands of police as in the US, racism is still pervasive in our society and politics. Nowhere is this more visible than in the tech industry and in business leadership: a brief look at the lack of diversity in boardrooms and tech teams across the country will testify to it.

And it is a monumental problem: while the technology industry moves at pace to automate, innovate and drive efficiencies with data and algorithms, in doing so it codifies and amplifies the biases and inequalities in our society. Businesses and organisations eager to implement these technologies overlook the biased, inadequate, historic and inaccurate data and modelling which inform the predictive algorithms and facial and voice recognition tools they rely on. The result: historic prejudice and inequality mean technology products (and the businesses which run on them) disadvantage BAME people by denying them opportunities to get jobs, financial help, residency, even freedom from incarceration. Products are developed which only work properly for white people, pushing black people in particular further into the margins.

If you or your colleagues have not yet fully understood the significance of this, now is the time to learn. Below is a very small selection from a larger list we are looking at summarising on our website as an accessible resource. Please add your own suggestions for texts to include in the comments section.

Articles

Just this year, in 2020: 

These examples come from a database of articles compiled by our member Charlie Pownall documenting all sorts of contentious uses of AI which we are working on building into resources.

Our team also noted that healthcare technology carries a number of issues for non-white people; tools optimised for pale skin risk, for example, failing to detect skin cancer in those with darker skin.

Books

Race After Technology by Ruha Benjamin

Algorithms of Oppression by Safiya Noble 

Automating Inequality by Virginia Eubanks

Videos

Videos discussing all of these authors can be found easily online, for example Ruha Benjamin in discussion with Meredith Whittaker 

A seminal video is Joy Buolamwini discussing her fight against algorithmic bias

A short explanation of how algorithmic bias works

And more

If, however, you are already familiar with these issues and the way systemic bias creeps into code, then there is even more that can be done. Here are some starting points:

  1. Work to get more black people hired and recognised, to increase the diversity of those developing and making decisions on AI. Whether you are in a position to hire, mentor, inspire or advocate, you can make a difference. Support organisations such as https://www.blackgirlscode.com/ or https://blackinai.github.io/ in the US, or Tech London Advocates Black Women in Tech.
  2. Speak up publicly when you feel that tech strategy, development, procurement or implementation decisions are being made without enough thought given to their consequences for racial discrimination and racial difference.
  3. Engage your colleagues and partners in these issues; don’t let the moment pass, but instead hold them and yourself accountable on a daily basis.
  4. Ask questions about algorithmic auditing. Does your company audit its algorithms? Does it plan to? If it does not, bias will go undetected. There are many approaches and methods for auditing (a minimal sketch follows this list); contact us to find out more.
  5. Find out what more you can do. Contact local movements, for example in the UK Stand Against Racism and Inequality, the Coalition for Racial Equality and Rights, or Stand Up To Racism, who have called for everyone to #taketheknee on their doorsteps at 6pm tonight.
  6. Join We and AI at weandai.org. We are a volunteer organisation working specifically to raise awareness of how all lives are being impacted by AI, particularly those of marginalised communities. Wider public understanding of how AI amplifies systemic racism is high on our agenda, and we need volunteers from all walks of life to help us with our programmes.
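
On point 4, here is a minimal sketch of what one simple audit check can look like: comparing a model’s positive-outcome (“selection”) rates across groups and flagging large gaps. The column names, sample data and the 0.8 rule-of-thumb threshold are illustrative assumptions on our part, not a standard or a complete audit method; real audits also examine data provenance, error rates per group and downstream impact.

```python
# Minimal sketch of a "disparate impact" style audit check.
# Assumptions: decisions are in a DataFrame with a group column and a
# binary outcome column; names and the 0.8 threshold are illustrative only.
import pandas as pd

def selection_rates(df: pd.DataFrame, group_col: str, outcome_col: str) -> pd.Series:
    """Share of positive outcomes (e.g. 'approved') for each group."""
    return df.groupby(group_col)[outcome_col].mean()

def disparate_impact_ratio(rates: pd.Series) -> float:
    """Lowest group selection rate divided by the highest; 1.0 means parity."""
    return rates.min() / rates.max()

if __name__ == "__main__":
    # Hypothetical model decisions: 1 = positive outcome, 0 = negative.
    decisions = pd.DataFrame({
        "ethnic_group": ["A", "A", "A", "B", "B", "B", "B", "B"],
        "approved":     [1,   1,   0,   1,   0,   0,   0,   1],
    })
    rates = selection_rates(decisions, "ethnic_group", "approved")
    ratio = disparate_impact_ratio(rates)
    print(rates)
    print(f"Disparate impact ratio: {ratio:.2f}")
    if ratio < 0.8:  # common rule-of-thumb threshold, not a legal standard
        print("Warning: selection rates differ enough to warrant investigation.")
```

Running a check like this on real decision data is only a first step, but it turns a vague worry (“is our model biased?”) into a number that can be tracked, questioned and acted on.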

Most important to remember is that, although we have spent the last few years of rapid AI deployment teaching our machines to be racist, we have been teaching our children for millennia. Machine behaviour is easier to change than human behaviour, IF we take the opportunity to correct it while we still can. Let’s make technology better than us, not worse, and let’s make ourselves better along the way.

“If you are neutral in situations of injustice, you have chosen the side of the oppressor.” Desmond Tutu