![March 25, 2014: Inmates gather in the common room at the new 192-bed facility at the Stanislaus County Jail in Modesto, Calif.](https://media.wnyc.org/i/800/0/c/85/1/AP_17083672546832.jpg)
Invisible computer algorithms increasingly shape the world we live in, but they don't always get it right. There is a raging debate about whether these algorithms mitigate human bias or compound it. A ProPublica investigation found that one of the major algorithms used to inform the judicial process is racially biased: it predicts that black defendants have a higher risk of recidivism than they actually do, and that white defendants have a lower risk than they actually do. In response, the New York City Council recently passed a bill, the first of its kind in the country, to study whether the algorithms used by city agencies are inadvertently discriminatory. Lauren Kirchner, senior reporting fellow at ProPublica, weighs in on the bill and the use of algorithms by government agencies.