For many who watched a man being murdered in front of complicit police colleagues and helpless bystanders, it has become ever more apparent that something must change in a system which allows this to happen repeatedly without consequence.
What might be less apparent is that every one of us in tech and business needs to be part of that change, however racially unprejudiced or “woke” we believe ourselves to be.
In the UK, as protesters fill Hyde Park today, many are aware that while systemic racism here may not result in as many horrific deaths of black people at the hands of police as in the US, racism is still pervasive within our society and politics. Nowhere is this more true than in the tech industry and in business leadership; a brief look at the lack of diversity in boardrooms and tech teams across the country will testify to this.
And it is a monumental problem, because while the technology industry moves at pace to automate, innovate and drive efficiencies with data and algorithms, in doing so it codifies and amplifies the biases and inequalities in our society. Businesses and organisations eager to implement these technologies overlook the biased, inadequate, historic or inaccurate data and modelling which inform their predictive algorithms and facial and voice recognition tools. The result: historic prejudice and inequality mean technology products (and the businesses which run on them) disadvantage BAME people by denying them opportunities to get jobs, financial help, residency, even freedom from incarceration. Products are developed which only work properly for white people, pushing black people in particular further into the margins.
If this is something which you or your colleagues have not yet fully understood the significance of, now is the time to learn. Below is a very small selection from a larger list we are looking at summarising on our website as an accessible resource. Please add your own suggestions on texts to include in the comments section.
Just this year, in 2020:
These examples come from a database of articles, compiled by our member Charlie Pownall, documenting all sorts of contentious uses of AI, which we are working to build into a resource.
Our team also noted that healthcare technology raises a number of issues for non-white people: tools optimised for pale skin have been shown to risk missing skin cancer in those with darker skin.
Videos
Videos discussing all of these authors can be found easily online, for example Ruha Benjamin in discussion with Meredith Whittaker.
A seminal video is Joy Buolamwini discussing her fight against algorithmic bias.
A short explanation of how algorithmic bias works.
If, however, you are already familiar with these issues and the way our systemic bias creeps into code, then there is even more that can be done. Here are some starting points:
Most important to remember is that, although we have spent the last few years of rapid AI deployment teaching our machines to be racist, we have been teaching our children for millennia. Machine behaviour is easier to change than human behaviour – IF we take the opportunity to correct it while we still can. Let’s make technology better than us, not worse, and let’s make ourselves better along the way.
“If you are neutral in situations of injustice, you have chosen the side of the oppressor.” Desmond Tutu
We and AI Ltd, a Charitable Company Limited by Guarantee Company Registration Number 13376771