Algorithms are increasingly the arbiters of determinations about individuals (e.g. government benefits, granting licenses, pre-sentencing and sentencing, granting parole). Whilst AI tools may be used to mitigate human biases and to make trials faster and cheaper, there is evidence that they can entrench bias by using characteristics such as postcode or socio-economic status as proxies for ethnicity.
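To make the proxy point concrete, the sketch below is our own minimal illustration (not taken from any of the resources listed here): a model that never sees ethnicity can still produce skewed risk scores through a correlated feature such as postcode. All variable names and numbers are invented for illustration only.

```python
# Hedged, synthetic sketch of proxy discrimination: the model excludes the
# protected attribute, but a correlated feature (postcode) carries it anyway.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Protected attribute: two demographic groups (illustrative).
group = rng.integers(0, 2, size=n)

# Postcode is strongly correlated with group (e.g. residential segregation).
postcode = np.where(rng.random(n) < 0.8, group, 1 - group)

# Historical "high risk" labels carry bias against group 1.
base_risk = 0.2 + 0.3 * group
label = (rng.random(n) < base_risk).astype(int)

# The model is trained only on the proxy plus a noise feature, never on `group`.
X = np.column_stack([postcode, rng.normal(size=n)])
model = LogisticRegression().fit(X, label)
scores = model.predict_proba(X)[:, 1]

for g in (0, 1):
    print(f"group {g}: mean predicted risk = {scores[group == g].mean():.2f}")
```

Even though the protected attribute is dropped, the mean predicted risk differs by group, because postcode reconstructs it almost perfectly.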
The use of commercial AI tools such as speech recognition – which have been shown to be less reliable for non-white speakers – can actively harm some groups when criminal justice agencies use them to transcribe courtroom proceedings.
When it comes to decision making, it might seem that computers are less biased than humans. But algorithms can be just as biased as the people who create the… A quick, concise Axios video describing algorithmic bias and how and why human bias ends up in systems used for hiring and criminal justice, among other things.
Measuring racial discrimination in algorithms: There is growing concern that the rise of algorithmic decision-making can lead to discrimination against legally protected groups, but measuring such algorithmic discrimination is often hampered by a fundamental selection challenge. We develop new quasi-experimental tools to overcome this challenge and measure algorithmic discrimination in the setting of pre-trial bail […]
Dartmouth professor Dr. Hany Farid reverse engineers the inherent dangers and potential biases of recommendation engines built to mete out justice in today’s criminal justice system. In this video, he provides an example of how the number of crimes is used as a proxy for race.
Whilst some believe AI will increase police and sentencing objectivity, others fear it will exacerbate bias. For example, the historical over-policing of minority communities has generated a disproportionate number of recorded crimes in some areas; those records are fed into algorithms, which in turn reinforce over-policing.
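As a rough illustration of that feedback loop, the toy simulation below is our own sketch, not drawn from the resource above. Two areas have identical underlying offence rates, but one starts with more patrols; the "discovery" exponent of 1.1 is an assumption standing in for the tendency of denser patrolling to record proportionally more incidents.

```python
# Toy predictive-policing feedback loop: recorded crime reflects where patrols
# go, and next round's patrols are allocated from recorded crime, so an initial
# skew compounds even though the true offence rates are equal.
TRUE_RATE = 0.05                               # identical in both areas
TOTAL_PATROLS = 100
patrols = {"area_A": 40.0, "area_B": 60.0}     # historically skewed start

for step in range(10):
    # Recorded crime: underlying rate times a mildly super-linear patrol effect
    # (the assumed 1.1 exponent models extra discovery from denser patrols).
    recorded = {a: TRUE_RATE * (p ** 1.1) for a, p in patrols.items()}
    total = sum(recorded.values())
    # Reallocate patrols in proportion to what was recorded.
    patrols = {a: TOTAL_PATROLS * r / total for a, r in recorded.items()}
    print(f"step {step}: " + ", ".join(f"{a}={p:.1f}" for a, p in patrols.items()))
```

Run it and the already over-policed area steadily absorbs more patrols, illustrating how the data generated by past enforcement, not underlying crime, can drive the algorithm's recommendations.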
A study on the discriminatory impact of algorithms in pre-trial bail decisions.
Racial disparities in automated speech recognition: Automated speech recognition (ASR) systems are now used in a variety of applications to convert spoken language to text, from virtual assistants, to closed captioning, to hands-free computing. By analyzing a large corpus of sociolinguistic interviews with white and African American speakers, we demo… An analysis of five state-of-the-art automated […]
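The headline metric in that kind of study is word error rate (WER) by speaker group. As a hedged illustration of the computation only (this is not the study's code, and the sample transcripts are invented), a per-group WER calculation might look like the sketch below.

```python
# Compute word error rate (WER) per demographic group from
# (group, reference transcript, ASR hypothesis) triples.
from collections import defaultdict

def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Standard dynamic-programming edit distance over words.
    dist = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dist[i][0] = i
    for j in range(len(hyp) + 1):
        dist[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dist[i][j] = min(dist[i - 1][j] + 1,         # deletion
                             dist[i][j - 1] + 1,         # insertion
                             dist[i - 1][j - 1] + cost)  # substitution
    return dist[len(ref)][len(hyp)] / max(len(ref), 1)

# Invented sample data, purely to show the grouping step.
samples = [
    ("group_A", "the witness arrived at nine", "the witness arrived at nine"),
    ("group_B", "the witness arrived at nine", "the witness arrive at nine then"),
]

per_group = defaultdict(list)
for group, ref, hyp in samples:
    per_group[group].append(word_error_rate(ref, hyp))
for group, wers in per_group.items():
    print(f"{group}: mean WER = {sum(wers) / len(wers):.2f}")
```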
Sophisticated computational techniques, known as machine-learning algorithms, increasingly underpin advances in business practices, from investment banking to product marketing and self-driving cars. Machine learning, the foundation of artificial intelligence, portends vast changes to the private sect… This article highlights the benefits of artificial intelligence in adjudication and law-making in terms of improving accuracy and reducing human biases […]
The impact of AI on litigation: the current use of AI in reviewing documents, predicting the outcomes of cases and predicting success rates for lawyers. This article highlights concerns about fallibility and the need for human oversight.
We examine the impact of artificial intelligence on the UK’s legal sector.
Law Society partner and equity crowdfunding platform Seedrs explains how developments within AI are taking law firms and solicitors to the next level. An article on how AI can be used in adjudication and in law in general. It highlights that although AI has vast potential, adoption so far remains limited.