Julia Angwin

02-12-2021, 13:21

Critics have long suspected that predictive policing software was racially biased. Today, we have the answer: @themarkup & @gizmodo analyzed 5.9 million algorithmic crime predictions. We found they disproportionately target Black & Latino areas. /1

Police across the U.S. use software from a company called PredPol (recently renamed @Geolitica_PS) that says it predicts future crime without racial bias. But we found it rarely predicted crime in White areas & disproportionately predicted it in Black & Latino areas. /2

In other words, the whiter the neighborhood, the fewer crime predictions. The same trend proved true for income: the wealthier the neighborhood, the fewer predictions. /3

When he was at @gizmodo, @dmehro found 7.8 million PredPol crime prediction reports exposed on the open web from 70 jurisdictions. He & @suryamattu dove into the data while @asankin & @anniegilbertson reported on the ground. @elarrubia was their fearless leader. /4

We limited our analysis to U.S. city and county law enforcement agencies for which we had at least six months’ worth of data. This left us with 38 jurisdictions. We then matched the predictions to Census data to determine the demographics of the targeted areas. /5
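For readers curious what that step might look like in practice, here is a minimal pandas sketch of the filter-and-join described above. The file names, column names ("agency", "prediction_date", "block_group_id", "pct_white"), and the exact 182-day cutoff are illustrative assumptions, not The Markup's published code.

```python
import pandas as pd

# A minimal sketch of the filtering-and-matching step described above.
# File and column names and the 182-day cutoff are assumptions for
# illustration; The Markup's actual pipeline is not shown in this thread.
predictions = pd.read_csv("predpol_predictions.csv",
                          parse_dates=["prediction_date"])

# Keep only agencies with at least six months' worth of predictions.
span = (predictions.groupby("agency")["prediction_date"]
        .agg(lambda dates: dates.max() - dates.min()))
eligible = span[span >= pd.Timedelta(days=182)].index
predictions = predictions[predictions["agency"].isin(eligible)]

# Match each predicted location to Census data (here, hypothetically,
# by block group) to get the demographics of the targeted area.
census = pd.read_csv("census_block_groups.csv")  # one row per block group
merged = predictions.merge(census, on="block_group_id", how="left")

# Summarize: how White, on average, are the areas receiving predictions?
print(merged.groupby("agency")["pct_white"].mean())
```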

This is the first independent analysis of the popular algorithmic crime prediction software, even though the tech has been used by law enforcement for a decade, @ProfFerguson told us. /6

Our work builds on the seminal 2016 study by @KLdivergence and @wsisaac, which showed that PredPol’s open-source algorithm, applied to drug crime data, would likely have disproportionately targeted Black & Latino neighborhoods, even though people of all races use drugs at similar rates. /7

When we asked PredPol CEO Brian MacDonald whether he was concerned about the racial disparities we found, he didn’t address the question directly but said the software mirrored reported crime rates. /8

But crimes are not reported equally by different demographic groups. @BJSGov has repeatedly found that White crime victims are less likely to report violent crime to police than Black or Hispanic victims. /9

PredPol has long maintained that because the software doesn’t include race or other demographic data, it “eliminates the possibility for privacy or civil rights violations seen with other intelligence-led or predictive policing models.” /10

But PredPol does know something about the racial effects of its algorithm. In 2018, PredPol’s co-founders published a paper showing their algorithm would have targeted Black & Latino neighborhoods in Indianapolis up to 400% more than White ones. /11

The co-founders’ paper described how to make the algorithm more fair, but “at a cost of reduced accuracy of the algorithms.” PredPol didn’t make the change, & the CEO said he did not inform clients because it “was an academic study conducted independently of PredPol.” /12

And so PredPol continues to make crime predictions that are starkly uneven by race. Take these two neighborhoods in Plainfield, N.J., where 11 crimes were predicted in the White neighborhood and 1,940 in the Black & Brown one. /13

Some police chiefs have noticed the disparities. In the Chicago suburb of Elgin, Ill., Police Department Deputy Chief Adam Schuessler told us that PredPol was like “policing bias-by-proxy.” The department has stopped using the software. /14
