Predictive Analytics: A Time and a Place, Not Just in LE?

August 17, 2020

The concept seems sound: analyze data from past crimes to predict future crimes and stop them before they happen. In practice, however, it is not so simple. That, as Popular Mechanics explains, is “Why Hundreds of Mathematicians Are Boycotting Predictive Policing.” Academic mathematicians are in a unique position: many were brought into the development of predictive policing algorithms in 2016 by the Institute for Computational and Experimental Research in Mathematics (ICERM). One of ICERM’s partners, PredPol, makes and sells predictive policing tools. Reporter Courtney Linder informs us:

“Several prominent academic mathematicians want to sever ties with police departments across the U.S., according to a letter submitted to Notices of the American Mathematical Society on June 15. The letter arrived weeks after widespread protests against police brutality, and has inspired over 1,500 other researchers to join the boycott. These mathematicians are urging fellow researchers to stop all work related to predictive policing software, which broadly includes any data analytics tools that use historical data to help forecast future crime, potential offenders, and victims. … Some of the mathematicians include Cathy O’Neil, author of the popular book Weapons of Math Destruction, which outlines the very algorithmic bias that the letter rallies against. There’s also Federico Ardila, a Colombian mathematician currently teaching at San Francisco State University, who is known for his work to diversify the field of mathematics.”

Linder helpfully explains what predictive policing is and how it came about. The embedded four-minute video is a good place to start (interestingly, it is produced from a pro-predictive policing point of view). The article also details why many object to the use of this technology. Chicago’s Office of the Inspector General has issued an advisory with a list of best practices to avoid bias, while Santa Cruz has banned the software altogether. We’re told:

“The researchers take particular issue with PredPol, the high-profile company that helped put on the ICERM workshop, claiming in the letter that its technology creates racist feedback loops. In other words, they believe that the software doesn’t help to predict future crime, but instead reinforces the biases of the officers.”
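The feedback-loop objection is easy to illustrate. The following Python sketch is ours, not PredPol’s actual algorithm, and its numbers (five districts, a uniform 30 percent crime rate, one seed arrest) are assumptions chosen purely for illustration: patrols go wherever the most arrests have been recorded, and crime is recorded only where patrols go.

```python
import random

NUM_DISTRICTS = 5
TRUE_CRIME_RATE = 0.3          # assumption: identical true crime rate in every district
arrest_history = [0] * NUM_DISTRICTS
arrest_history[0] = 1          # a single early arrest seeds the loop

random.seed(42)                # reproducible run
for day in range(1000):
    # "Predict" today's hot spot from recorded arrests alone.
    hot_spot = max(range(NUM_DISTRICTS), key=lambda d: arrest_history[d])
    # Crime is only observed where the patrol is sent.
    if random.random() < TRUE_CRIME_RATE:
        arrest_history[hot_spot] += 1

print(arrest_history)          # roughly [300, 0, 0, 0, 0]: every arrest lands in district 0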

Structural bias also comes into play, as does the fact that some crimes go underreported, skewing the data. The piece wraps up by describing how widespread the technology is, a reach summed up by PredPol’s own claim that one in 33 Americans is “protected” by its software.

With physics and other disciplines, not to mention Google’s online advertising, resting on probabilities and predictive analytics, what is the scientific limit on real-world applications? Subjective perceptions?

Cynthia Murrell, August 17, 2020
