
Publication | Legaltech News

Nervous System: The Police Beat Algorithm and Automated Criminal Justice Information Systems

David Kalat

August 4, 2020

David Kalat writes about the use of computer technology to predict crime and allocate police resources—and the legacies that have resulted.


Modern uses of computer technology to predict crime and allocate police resources have their roots in a 1965 initiative by President Lyndon Johnson. The President’s Commission on Law Enforcement and Administration of Justice was tasked with determining how to leverage computers to help solve the nation’s “crime problem.”

Between 1964 and 1968, the overall violent crime rate increased fifty percent, and by the early 1970s it had doubled. Beginning in 1965, the Commission’s Science and Technology Task Force met at the Institute for Defense Analyses in Alexandria, Virginia, under the guidance of Attorney General Nicholas Katzenbach. The Task Force sought technological and scientific solutions to rising crime rates.

Saul Gass, a pioneering operations researcher with IBM and a member of the Task Force, developed the Police Beat Algorithm to calculate how and where to deploy patrol officers for maximum effectiveness. The algorithm promised to generate automated suspect profiles based on the demographics of urban spaces, before a crime was actually committed. At least, that was the premise. Critics note that the algorithm offered new, computerized support to an existing infrastructure of racial prejudice.

Gass developed a mathematical model to allocate police patrol units based on statistical analysis of past crime data. The algorithm weighted different types of crimes by threat level: criminal homicide, forcible rape, robbery, aggravated assault, burglary, larceny, and auto theft were the “index” crimes with the highest weighted scores. The algorithm then divided police precincts into beats by correlating the demographic information in census tract data with historical crime data across five distinct variables: number of index crimes, population, area, overall crime rate against total population, and overall crime rate against total area.
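The mechanics are easier to see in miniature. The Python sketch below shows one way a weighted-workload beat partition along these lines could be expressed. It is an illustration only: the Tract record, the blend weights, and the greedy balancing step are assumptions made for demonstration, not Gass’ published formulation, and it ignores the geographic contiguity a real beat design would require.

from dataclasses import dataclass

@dataclass
class Tract:
    tract_id: str
    index_crimes: int    # count of the seven weighted "index" crimes
    population: int
    area_sq_mi: float

def workload(t, w_volume=0.5, w_pop_rate=0.25, w_area_rate=0.25):
    # Blend the variables the article names: index-crime volume,
    # crime rate against population, and crime rate against area.
    # The blend weights are arbitrary placeholders, not Gass' values.
    pop_rate = t.index_crimes / max(t.population, 1)
    area_rate = t.index_crimes / max(t.area_sq_mi, 0.01)
    return w_volume * t.index_crimes + w_pop_rate * pop_rate + w_area_rate * area_rate

def partition_into_beats(tracts, n_beats):
    # Greedily assign the heaviest tracts first to the least-loaded beat,
    # balancing total workload (a real design also needs contiguous geography).
    beats = [[] for _ in range(n_beats)]
    totals = [0.0] * n_beats
    for t in sorted(tracts, key=workload, reverse=True):
        i = totals.index(min(totals))
        beats[i].append(t)
        totals[i] += workload(t)
    return beats

demo = [Tract(f"T{k}", index_crimes=10 * k, population=1000 + 200 * k,
              area_sq_mi=0.5 + 0.1 * k) for k in range(1, 10)]
for i, beat in enumerate(partition_into_beats(demo, 3), start=1):
    print(f"Beat {i}:", [t.tract_id for t in beat])

Even in this toy form, the core property is visible: the beats are drawn entirely from historical crime counts, so whatever bias is embedded in the underlying data is carried directly into the patrol allocation.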

The Commission’s work led to a pattern of investment in automated criminal justice information systems at the same time that the nation experienced a cycle of civil unrest.

Following the assassination of Dr. Martin Luther King Jr. on April 4, 1968, these tensions came to a head in ways that would directly influence the development of computerized policing tools.

On the day of Dr. King’s funeral, April 9, many schools in the Kansas City, Missouri, area closed to allow students to mourn. On the night before the funeral, however, James Hazlett, superintendent of the Kansas City School District, decided that schools in Kansas City itself would fly flags at half-mast but otherwise stay open as usual.

Heartbroken and angry, students walked out in protest and marched quietly to City Hall to register their grief publicly. As the students headed downtown, Mayor Ilus Davis granted permission for the protest at City Hall. Even so, police massed to meet them.

Not long after the students’ arrival at City Hall, the police started firing tear gas at the crowd. When police threw tear gas canisters into a church where a dance was being held, the infuriated response from the protesters escalated into riots. During the chaos, police killed five black men and a teenager, none of whom were armed. So many fires raged that firefighters were unable to respond to all the calls. When all was said and done, the unrest had caused an estimated $4 million in damage ($29 million in today’s dollars).

In the aftermath of the unrest, the city turned to Kansas City Police Chief Clarence Kelley for a solution. At first, Kelley’s answer was better communication technology.

On July 1, 1968, Kansas City deployed the ALERT (Auxiliary Law Enforcement Response Team) system to leverage the latest telecommunications and computing technologies to link police officers to a central command, so that radio calls by officers would receive a response within ten seconds. ALERT was also a computerized database, connected via high-speed microwave communications links to the FBI computer in Washington.

As the use of ALERT grew, and more data was fed into it, a new opportunity presented itself. Combining ALERT’s database of past criminal activity with Gass’ algorithm provided a framework for modeling future crimes—predicting and profiling.

This was put to the test in 1973. With funding from the Police Foundation, Kansas City conducted experiments to probe the effectiveness of proactive crime prevention.

The first study compared three policing methods. From August 1972 to July 1973, patrol officers were divided into three groups: the control group performed its duties as usual; the “Location Oriented Patrol” focused attention on high-crime areas, as identified by the computerized data; and the “Perpetrator Oriented Patrol” focused on specific individuals suspected of criminal activity. The experiment found no significant improvement over the status quo: “neither (as tried in Kansas City) represented a substantial improvement over the more usual mix of tactical unit activities,” concluded the report.

A second study, from October 1972 to September 1973, compared the effects of increasing or decreasing police activity. The city’s precincts were divided into fifteen beats using the algorithm. Five beats were designated the “control” set, in which routine patrol practices continued unchanged. In a second set of five beats, patrols were virtually eliminated, with police responding only to calls from residents. In the final set of five beats, patrols were doubled or tripled in intensity.

The study found no difference among the reduced patrols, the increased patrols, and the control group in terms of the number of “index” crimes or citizens’ perceived sense of safety, as measured in post hoc surveys. The Police Foundation summarized its finding: “routine preventive patrol in marked police cars has little value in preventing crime or making citizens feel safe.”

Shortly after the studies concluded, the Kansas City Police Department conducted “Operation Robbery Control” to try to reduce the number of robberies during the Christmas season. Using Gass’ algorithm with the ALERT data to predict likely sites for potential robberies, the police saturated predominantly black neighborhoods with patrols and increased the use of undercover officers, informants, surveillance, and other tools. Operation Robbery Control suppressed robberies by 27 percent compared with the preceding year, helping to embed the concept of racial profiling into police practices even though the most rigorous scientific tests of the idea had conclusively shown its failure.

The views and opinions expressed in this article are those of the author and do not necessarily reflect the opinions, position, or policy of Berkeley Research Group, LLC or its other employees and affiliates.

