
PREDICTIVE POLICING

2017-04-21/ StratLytics

The National Institute of Justice explains that “predictive policing tries to harness the power of information, geospatial technologies and evidence-based intervention models to reduce crime and improve public safety. This two-pronged approach — applying advanced analytics to various data sets, in conjunction with intervention models — can move law enforcement from reacting to crimes into the realm of predicting what and where something is likely to happen and deploying resources accordingly.”

Today, more and more police departments are using algorithms that predict future crimes. Predictive policing is just one tool in this new, tech-enhanced and data-fortified era of fighting and preventing crime. As the ability to collect, store and analyze data becomes cheaper and easier, law enforcement agencies all over the world are adopting techniques that harness the potential of technology to provide more and better information. But while these new tools have been welcomed by law enforcement agencies, they’re raising concerns about privacy, surveillance and how much power should be given over to computer algorithms1.

The Origins of Predictive Policing

The notion of crime forecasting dates back to 1931, when sociologist Clifford R. Shaw of the University of Chicago and criminologist Henry D. McKay of Chicago's Institute for Juvenile Research wrote a book exploring the persistence of juvenile crime in specific neighborhoods. Scientists have experimented with using statistical and geospatial analyses to determine crime risk levels ever since. In the 1990s, the National Institute of Justice (NIJ) and others (including the New York Police Department) embraced geographic information system tools for mapping crime data, and researchers began using everything from basic regression analysis to cutting-edge mathematical models to forecast when and where the next outbreak of crime might occur. But until recently, the limits of computing power and storage prevented them from using large data sets.

Jeffrey Brantingham is a professor of anthropology at UCLA who helped develop the predictive policing system that is now licensed to dozens of police departments under the brand name PredPol. “This is not Minority Report,” he’s quick to say, referring to the science-fiction story often associated with PredPol’s technique and proprietary algorithm. “Minority Report is about predicting who will commit a crime before they commit it. This is about predicting where and when crime is most likely to occur, not who will commit it.”

Brantingham also emphasized that the algorithm cannot replace police work; it’s intended to help police officers do their jobs better. “Our directive to officers was to ‘get in the box’ and use their training and experience to police what they see,” said Cmdr. Sean Malinowski, the LAPD’s chief of staff. “Flexibility in how to use predictions proved to be popular and has become a key part of how the LAPD deploys predictive policing today2.”

What is PredPol?

Dozens of cities across the US and beyond are using the PredPol software to predict a range of crimes, including gang activity, drug crimes and shootings. Police in Atlanta use PredPol to predict robberies. Seattle police are using it to target gun violence. In England, Kent police have used PredPol to predict drug crimes and robberies. In Kent, it’s not just police taking a more proactive approach by concentrating officers in prediction areas, but also civilian public safety volunteers and drug intervention workers.

The prediction algorithm constantly reacts to crime reports in these cities, so a red box predicting crime can move at any moment. But although officers in the divisions using PredPol are required to spend a certain amount of time in those red boxes every patrol, they’re not blindly following the orders of the crime map. The officer retains a lot of discretion, and still has to know the area well enough to decide when to adjust and go back into manual mode.

PredPol’s predictive policing is the sum of two parts:

1. Predictive Policing Technology: An algorithm developed from high-level mathematics and sociological and statistical analysis of criminality. This algorithm factors in historical crime data from the police department and produces predictions on where and when a crime is most likely to occur.

2. Insights of Officers and Crime Analysts: According to the National Institute of Justice, “the predictive policing approach does not replace traditional policing. Instead, it enhances existing approaches such as problem-oriented policing, community policing, intelligence-led policing and hot spot policing.”

Predictive policing is more than traditional hotspot mapping. Its forecasting technology combines high-level mathematics, machine learning, and proven theories of crime behavior to take a forward-looking approach to crime prevention3.

Although PredPol’s boxes flag prediction areas where crime is most likely, there is no guarantee that an incident or arrest will occur there. The presence of police officers in the prediction areas creates a deterrence and suppression effect, preventing crime in the first place.

PredPol does not collect, upload, analyze or in any way involve any information about individuals or populations and their characteristics – PredPol’s software technology does not pose any personal privacy or profiling concerns. The algorithm uses only three pieces of data – type, place, and time – of past crimes.
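PredPol’s actual model is proprietary, but the narrow three-field input makes the general shape of such a system easy to sketch. The following is a hypothetical illustration only — not PredPol’s algorithm — in which grid cells are scored by exponentially time-decayed counts of past crimes, and the highest-scoring cells become the day’s “red boxes.” All names, grid sizes, and parameters here are invented for illustration.

```python
from dataclasses import dataclass
from collections import defaultdict
import math

@dataclass
class CrimeRecord:
    crime_type: str   # e.g. "burglary" -- the crime type field
    cell: tuple       # (row, col) index of a grid cell -- the place field
    day: int          # days since the start of the data set -- the time field

def hotspot_scores(records, today, half_life_days=30.0):
    """Score each grid cell by exponentially time-decayed crime counts.
    Recent crimes count more; a crime half_life_days old counts half."""
    decay = math.log(2) / half_life_days
    scores = defaultdict(float)
    for r in records:
        scores[r.cell] += math.exp(-decay * (today - r.day))
    return scores

def top_boxes(records, today, k=3):
    """Return the k highest-scoring cells -- the day's 'red boxes'."""
    scores = hotspot_scores(records, today)
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

Note that even this toy version consumes nothing but crime type, place, and time — the same three fields the article describes — yet, as the critiques later in this piece point out, those fields still reflect whatever biases shaped the underlying crime reports.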

The Chicago Police Department Takes Predictive Policing One Step Further

One approach to predictive policing, exemplified by PredPol, seeks to forecast where and when crime will happen; another focuses on who will commit crime or become a victim.

The Chicago Police have made it personal. The department is using network analysis to generate a highly controversial Strategic Subject List of people deemed at risk of becoming either victims or perpetrators of violent crimes. Officers and community members then pay visits to people on the list to inform them that they are considered high-risk4.

The Custom Notification program, as it's called, was inspired in part by studies done by Andrew Papachristos, a sociologist at Yale University. Papachristos grew up in Chicago's Rogers Park neighborhood in the 1980s and '90s, at the height of the crack era. When he started studying crime, Papachristos wanted to understand the networks behind it. For a 2014 paper, he and Christopher Wildeman of Cornell University studied a high-crime neighborhood on Chicago's West Side. They found that 41% of all gun homicide victims in the community of 82,000 belonged to a network of people who had been arrested together, and who comprised a mere 4% of the population—suggesting, with other studies, that much can be learned about crime by examining the company people keep, Papachristos says.
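The co-arrest finding rests on a simple network idea: link people who were arrested together, then ask what share of homicide victims falls inside that network. A toy sketch of that idea — with invented data and function names, not the paper’s actual methodology — might look like this:

```python
from collections import defaultdict

def build_coarrest_network(arrest_events):
    """Each event is a set of person IDs arrested together.
    Returns an adjacency map linking everyone co-arrested at least once."""
    adj = defaultdict(set)
    for group in arrest_events:
        for person in group:
            adj[person] |= group - {person}
    return adj

def share_in_network(adj, victims):
    """Fraction of victims who appear anywhere in the co-arrest network."""
    members = set(adj)
    return sum(1 for v in victims if v in members) / len(victims)
```

For example, with arrest events `[{"a", "b"}, {"b", "c"}]` and victims `["a", "d"]`, half the victims ("a") sit inside the network — the kind of disproportion (41% of victims inside a network covering 4% of the population) that Papachristos and Wildeman reported.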

Intrigued by these ideas, the Chicago police teamed up with Miles Wernick, a medical imaging researcher at the Illinois Institute of Technology in Chicago, to develop the Custom Notification program. Because gang violence was distributed across the city, hot spot policing wasn't as effective in Chicago, says Commander Jonathan Lewin, head of technology for the department. "The geography of the map isn't as helpful as looking at people and how risky a person is," he says. The hope was that the list would allow police to provide social services to people in danger, while also preventing likely shooters from picking up a gun.

Validations / Concerns

A recent detailed report from the RAND Corporation concluded that the Custom Notification program implemented in Chicago saved zero lives — and that overall the list of hundreds of likely shooters it generated wasn’t even being used as intended. “There was no practical direction about what to do with individuals on the ‘Strategic Suspect List,’ little executive or administrative attention paid to the pilot, and little to no follow-up with district commanders,” the report concluded. One of its authors pointed out that Chicago’s police department had 11 different anti-violence programs going on, and the list of likely shooters “just got lost.” But the report did identify one result of the program: people on the list were more likely to be arrested, prompting the conclusion that it “essentially served as a way to find suspects after the fact5”.

That’s one of the biggest concerns about predictive policing. Some civil liberties groups argue that it merely hides racial prejudice “by shrouding it in the legitimacy accorded by science.” If there’s a bias in the criminal justice system, that bias carries through to the statistics that are ultimately fed into the algorithms, says one analyst with the Human Rights Data Analysis Group and a Ph.D. candidate at Michigan State University. “They’re not predicting the future. What they’re actually predicting is where the next recorded police observations are going to occur.” In addition, regarding programs such as Chicago’s and proprietary software like PredPol, the Human Rights Data Analysis Group stated: “For the sake of transparency and for policymakers, we need to have some insight into what’s going on so that it can be validated by outside groups.”
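The “predicting recorded observations” critique can be made concrete with a toy feedback loop — entirely hypothetical numbers, not a model of any real department. Two districts have identical true crime rates, patrols follow whichever district has more recorded crime, and patrol presence makes a crime far more likely to be recorded. A small initial imbalance in the records then snowballs:

```python
import random

def simulate_feedback(steps=200, seed=1):
    """Naive predict-then-patrol loop over two districts with IDENTICAL
    true crime rates. Patrol goes where past records are densest, and
    patrol presence inflates the chance a crime enters the data set.
    Returns the recorded crime counts per district."""
    rng = random.Random(seed)
    recorded = [5, 4]                     # district 0 starts one record ahead
    p_record = {True: 0.9, False: 0.2}    # recording prob. with/without patrol
    for _ in range(steps):
        patrolled = 0 if recorded[0] >= recorded[1] else 1
        for d in (0, 1):
            crime_happened = rng.random() < 0.5   # same true rate everywhere
            if crime_happened and rng.random() < p_record[d == patrolled]:
                recorded[d] += 1
    return recorded
```

Even though both districts generate crime at the same rate, the district that starts with one extra record keeps attracting the patrol, so its recorded count pulls far ahead — the feedback loop the analyst describes, in which the algorithm ends up forecasting police observations rather than crime.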

Predictive policing techniques such as PredPol have shown promising results. But thoroughly validating the models through a third party has proved challenging (for the analytics as well as the public policies built on them). With the advent of Big Data, predictive policing is still evolving, and civil liberties will have to be an integral part of that evolution. At the end of the day, the analytics behind predictive policing are just another set of tools, not an end in themselves.
