Black Software: The Internet and Racial Justice, from the AfroNet to Black Lives Matter
by Charlton D. McIlwain
Ring's effort to cozy up to law enforcement agencies and launch a citizen-installed surveillance state is undoubtedly a danger to our civil liberties. But the doorbell camera company is far from the first corporation willing to lend its technology to the US government in the name of "fighting crime," really just a euphemism for enforcing America's tradition of racial segregation.
As the excerpt from Black Software by Charlton D. McIlwain illustrates, law enforcement technology has long served as unofficial cover for local and federal officers in their efforts to protect White Americans from their single biggest existential threat: black neighbors.
If you thought Stop and Frisk was wrong, wait until you see how Civil Rights-era Kansas dealt with the prospect of a "suspicious" black person even existing in a predominantly white neighborhood. Because it sure sounds familiar.
The President's Crime Commission report in 1968 had recommended that the federal government invest massive amounts of resources into what were later dubbed Criminal Justice Information Systems. It invested millions of dollars to design and build them. The growing and persisting fear of crime was its underlying rationale. But the commission's long list of use cases for these systems ultimately proved most persuasive.
The computing industry (led by IBM), the federal government, national and local law enforcement agencies, and academics at elite science and engineering institutions had started developing these use cases in 1965. That's when New York City police commissioner Harold Leary formed the Joint Study Group. This study group included representatives from the police department's planning and communications departments and four representatives from IBM. One was a sales manager. The other three were computer programmers.
In the end, the Joint Study Group outlined thirteen potential new law enforcement computer applications. The list included applications for computer-aided dispatch, crime analysis, fingerprint identification, resource allocation, and election returns. New York City began pursuing only one of these identified systems—a computer-aided dispatch system. They called it SPRINT—Specialty Police Radio Inquiry Network. The system was built from scratch. But builders based it on an existing IBM design model for a flight reservation system.
During the same time, Kansas City's chief of police Kelley had assembled a team of his own. It consisted of an in-house team of two: his assistant, Lt. Col. James Newman, and the department's chief data systems director, Melvin Bockelman. Both were dubbed "patrolmen programmers." They were policemen first, but they were armed with technical data processing training. Two IBM personnel, marketing representative Owen Craig and Roger Eggerling, an IBM systems engineer, rounded out Kelley's team.
IBM described its systems engineers as "assisting our customers in defining their systems problems and determining the best combination of IBM equipment to solve them." Speaking more holistically about how IBM built its enterprise, the company had reported to its board and shareholders back in 1961 that "this era demands a higher degree of professionalism than ever before among the sales representatives who initiate and develop customer interest, the systems engineers who help our customers study, define and develop solutions for their problems, and the customer engineers who install and maintain equipment at peak efficiency."
IBM systems engineers were also its link to the scientific and engineering academic community. In one year alone, for example, they presented seventy papers. IBM systems engineers refined their computing knowledge within an academic field and distributed their knowledge about systems building throughout both the scientific and industrial communities. The plan?
"Imbed the police beat algorithm within a geographical crime information system with graphical inputs and outputs, thus enabling us to bring the proper man-machine interaction to bear on this heuristic-analytic type of decision problem."
IBM systems engineer Saul Gass worked in IBM's government services division. Gass divided command and control systems into the two primary problem areas they confronted: police planning and police operations. Police planning had much to do with allocating human and material resources. How many police personnel should be dedicated to a given geographical area based on its population size and crime rate? How should you divide up a geographical area into efficient police patrol beats? How much equipment should be stored, and in what locations, in order to be ready to respond swiftly and effectively to a riot situation? These are examples of planning problems that police had to solve in order to maximize success.
Operational problems, on the other hand, involved different types of questions. How do you identify crime patterns? How do you both predict and apprehend suspects based on those patterns? Once apprehended, how do you associate suspects with other crimes they may have committed? And, when you know all this, how can you prevent crime from being committed in the first place?
These concerns were packaged into a command and control solution called computer-aided dispatch (CAD). Underlying the CAD system was software, powered by an algorithm that automated solutions to specific operational and planning problems. Its task was to answer two questions: how to allocate a finite number of police patrol units to police beats (parsed geographical areas), and how to distribute those units so that officers were positioned to be dispatched to and arrive at the scene of a crime. Gass's mathematical model could be used to determine this, given some known factors and data. He had already developed such a model. He also possessed "real-world" crime data, from New York City's SPRINT. The array of symbols, functions, and notations looks complicated to the non-mathematician, but the information and data the algorithm called for tell us everything we need to know.
First, US Census tracts parsed geographic areas and provided uniform, structured data about those areas—primarily population size and racial demographics. These tracts enabled strategic deployment of police officers by geography, population size, and racial composition.
Gass's model (and the police community) contended that all crimes were not created equal. Thus, Gass's algorithm required "weighted" crimes. Like the census tract data, a police department like Kansas City's could rely on an existing weighting system. In the mid-1960s, the International Association of Chiefs of Police had already produced such a ranking. A score of four represented the highest-priority crime. A score of one was the least threat, and therefore least priority. Criminal homicide, forcible rape, robbery, aggravated assault, burglary, larceny, and auto theft all received a score of four. These were also known as "index crimes." The FBI had developed this system for its Uniform Crime Reports.
In addition to weighted crimes, Gass's formula required weighted crime incidents. And it required weighted workloads. Then, the algorithm required that police correlate workloads with the geographical areas where the greatest numbers of the highest-weighted crimes took place.
Using census tract designations, and these crime weights, Gass's formula used five measures of the workload for a census tract: number of index crimes, population, area, level of crime multiplied by the population, and the level of crime multiplied by the area. This produced a geographical map of a city, parsed by patrol beats. They could be designated as high to low threat. These criteria could then be used to determine police resource allocations. One might, for example, assign twelve police officers to regularly patrol the high-threat area, and only three for the low.
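To see how a scheme like this turns neighborhood data into patrol assignments, consider a minimal modern sketch. This is an illustration in the spirit of the text, not Gass's actual formula: the tract figures, the coefficients that collapse the five workload measures into one score, and the proportional allocation rule are all assumptions for demonstration. Only the five measures themselves and the index-crime weight of four come from the text.

```python
# Illustrative sketch of a beat-workload calculation. The five workload
# measures (index crimes, population, area, crime x population,
# crime x area) come from the text; everything else is assumed.

def tract_workload(index_crimes, population, area, crime_level):
    """Return the five workload measures for one census tract."""
    return (index_crimes, population, area,
            crime_level * population, crime_level * area)

def threat_score(measures, coeffs=(1.0, 0.001, 0.1, 0.0001, 0.01)):
    # Collapsing the five measures into a single score with arbitrary
    # coefficients is an assumption; Gass's model treated this as a
    # formal optimization problem, not a simple weighted sum.
    return sum(c * m for c, m in zip(coeffs, measures))

def allocate_officers(tracts, total_officers):
    """Split a fixed number of patrol officers across tracts in
    proportion to their threat scores (an illustrative rule)."""
    scores = {name: threat_score(tract_workload(*data))
              for name, data in tracts.items()}
    total = sum(scores.values())
    return {name: round(total_officers * s / total)
            for name, s in scores.items()}

# Hypothetical tracts: (index_crimes, population, area, crime_level);
# crime_level 4 = index crimes, 1 = lowest priority, per the IACP/UCR
# ranking described in the text.
tracts = {
    "tract_a": (120, 8000, 2.0, 4),  # scores "high threat"
    "tract_b": (10, 9000, 2.5, 1),   # scores "low threat"
}
print(allocate_officers(tracts, 15))  # → {'tract_a': 13, 'tract_b': 2}
```

The point of the sketch is the mechanism the chapter describes: once tract demographics and weighted crime counts feed the score, patrol strength concentrates wherever the data say the "threat" is, and the map of beats becomes a map of who gets policed.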
It could also be used to determine whom and how many police officers to dispatch to a given area when a crime was reported. It would determine with what urgency and speed the officer(s) should respond. And it determined what precautions police should take in order to protect their safety. A call reporting a "suspicious" Negro loitering in a low-threat area, for instance, might lead a dispatcher to hail four squad cars. The Negro profiled as high threat; the neighborhood coded as low threat and white. Of course, one need only correlate these threat areas with their corresponding census tract demographics to begin to formulate not only geographically based threat profiles, but the corresponding racial profiles as well.
Producing and then systematizing such a profile in ways that could have measurable effects, however, required a much larger system. It would have to include more applications than just CAD. It would need to be networked; reach beyond a single city or local area; and be able to constantly ingest new data, process that data, and use them to model criminal profiles and affect future police decision-making. Such a system would be a massive undertaking. It would cost millions of dollars. Those who commanded it would be compelled to demonstrate that the system's outputs produced the desired outcome: to efficiently protect America's white citizens from its most feared criminal suspects.
From Black Software: The Internet and Racial Justice, from the AfroNet to Black Lives Matter by Charlton D. McIlwain. Copyright © 2019 by Charlton D. McIlwain and published by Oxford University Press. All rights reserved.