Biased Algorithms Are a Racial Justice Issue
Increasingly, life-or-death decisions made by our government are being automated
Decisions about where to send police patrol cars, which foster parents to investigate, and who gets released on bail before trial are among the most consequential, even life-or-death, decisions our government makes.
And, increasingly, those decisions are being automated.
The last eight years have seen an explosion in the capability of artificial intelligence, which is now used for everything from arranging your news feed on Facebook to identifying enemy combatants for the U.S. military. The automated decisions that affect us most fall somewhere between those extremes.
At its core, A.I. is pattern matching. The easiest way to understand it is by thinking about pictures of dogs. If you show an image recognition algorithm hundreds or thousands of images of dogs, it will start to draw conclusions about what a dog is based on the similarities among those images: the shape of pointed ears, fur, paws, a snout. This analysis of data is called “training.” Once trained, the algorithm can look at a new picture and tell whether anything in it matches its idea of a dog.
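To make the idea concrete, here is a deliberately toy sketch of “training,” not any real image recognition system. It assumes each animal has already been reduced to two hypothetical numeric features (ear pointiness and snout length), and the “model” is nothing more than the average of the training examples:

```python
# Toy "training": the model is just the average of the dog examples'
# hypothetical feature vectors (ear_pointiness, snout_length), each 0-1.
def train(examples):
    """Average the feature vectors of the known dogs."""
    n = len(examples)
    return [sum(e[i] for e in examples) / n for i in range(2)]

def looks_like_dog(model, candidate, tolerance=0.3):
    """Match if the candidate's features are close to the learned average."""
    distance = sum((model[i] - candidate[i]) ** 2 for i in range(2)) ** 0.5
    return distance <= tolerance

# Training set: three pointy-eared dogs.
dogs = [(0.9, 0.6), (0.8, 0.5), (0.85, 0.55)]
model = train(dogs)

print(looks_like_dog(model, (0.82, 0.58)))  # similar features -> True
print(looks_like_dog(model, (0.10, 0.90)))  # floppy ears -> False
```

Real systems learn millions of parameters rather than two averages, but the principle is the same: the model is a summary of whatever examples it was shown, and nothing more.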
This idea scales to any kind of data. It’s how voice recognition works: the algorithms have been trained on which spoken sounds correspond to which written words. It’s how Facebook’s photo-tagging works, because Facebook already has labeled images of your face and your friends’ faces to train on.
But it’s also how predictive policing works: the idea that by identifying when and where crime has occurred, you can send police officers to patrol those areas as a deterrent. By looking at trends in past data — say, muggings repeatedly occurring in a specific neighborhood between 1 a.m. and 2 a.m. — a predictive policing algorithm might suggest that is the time and place to send officers.
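Stripped to its essentials, that kind of suggestion can be sketched as counting. This is a hypothetical illustration, not any vendor’s actual product; the neighborhood names and records are invented:

```python
from collections import Counter

# Hypothetical incident records: (neighborhood, hour of day reported).
past_incidents = [
    ("Riverside", 1), ("Riverside", 1), ("Riverside", 2),
    ("Hillcrest", 14), ("Riverside", 1),
]

def suggest_patrol(incidents):
    """Recommend the (neighborhood, hour) with the most recorded incidents."""
    counts = Counter(incidents)
    return counts.most_common(1)[0][0]

print(suggest_patrol(past_incidents))  # -> ('Riverside', 1)
```

Notice that the recommendation reflects where incidents were *recorded*, not necessarily where crime occurs: if officers are sent back to Riverside, they will record more incidents there, and the pattern feeds on itself.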
Where that data comes from is the problem. Remember the dog example? If you trained the algorithm only on your favorite dogs, say huskies and Pomeranians, it would conclude that all dogs have pointy ears. Shown a picture of a basset hound, it likely wouldn’t recognize a dog at all. This is bias, meaning a predisposition of the algorithm that…