Sunday, November 6, 2011


The Santa Cruz Experiment
(Popular Science)

Can a criminal act be prevented before it begins? By turning its crime problem into a data problem, one city is reinventing police work for the 21st century
By Kalee Thompson Posted 10.10.2011 at 3:53 pm

Predicting Crime. Bridgeman Art Library/Getty Images

Last year the criminals of Santa Cruz, California, stole 160 cars and committed 495 burglaries. For a city of 60,000, that’s about average. And so are the challenges facing its police force. Since 2001, the SCPD has laid off 10 of its 104 officers, even as the city’s population grew by 5,500. The department now has to do more with less, which is the story of just about every police force in America. But this summer, the way the SCPD fights crime changed. It began a six-month experiment using large sets of data and a sophisticated algorithm to forecast when and where future crimes are most likely to take place—and how officers could be deployed preemptively to stop them.

The approach is called predictive policing, and the experiment in Santa Cruz represents a leap forward in the data-driven crime-fighting models that began in the 1990s with CompStat, which uses mapping and statistics to track crime. Data-collection techniques have improved, processing power has increased, and police forces have refined their methods. In Chicago, officers are partnering with computer scientists at the Illinois Institute of Technology to develop a crime-fighting algorithm. In Memphis, a project called Blue CRUSH (Criminal Reduction Utilizing Statistical History) relies on analytics software created by IBM. In Richmond, Virginia, police have reduced crime by adopting consumer-research techniques that Walmart and Amazon use to predict what people will buy. Similar techniques can be used to predict where and when criminals will act.

The experiment in Santa Cruz is different. George Mohler, a 30-year-old mathematician, based the experiment’s algorithm on one used by seismologists to predict earthquakes and their aftershocks. The algorithm targets property crime, including home burglaries, car break-ins and vehicle thefts, which were up 25 percent in Santa Cruz in the first half of this year. Such crimes, Mohler has found, tend to cluster and spread in a way that is similar to tremors after a large quake. The scope of the experiment in Santa Cruz is broad. An entire department is using the software. An entire town is serving as its data set. And because Santa Cruz is so statistically average, the results of its experiment could be applied almost anywhere. The project went live on July 1. A week later, I arrived to find out how the police force in this seaside town is changing how crime gets fought in the 21st century.

The days of “primal policing,” when the department could “flood the streets with cops and hope you get lucky,” are over.

The black-and-white cruiser makes a slow turn onto Linden Street. It’s just after noon on a Thursday, and I’m riding shotgun with deputy chief Steve Clark, who is leading the rollout of Santa Cruz’s predictive-policing program. Clark is 47, with a graying buzz cut and a laid-back surfer’s drawl. He grew up in the area and has been a Santa Cruz cop for 25 years.

For almost a week now, Santa Cruz’s 60 patrol officers have been relying on George Mohler’s software to guide them to “hot spots,” areas at the highest risk for home break-ins and vehicle thefts. The department divides its city into five regions, with at least one car on duty in each. Before the experiment, individual officers decided where and how to focus their time when no calls were coming in. Now they will focus on patrolling hot spots, making two or three passes down a particular block during a one-hour window. Officers pick up their hot-spot maps at the roll-call meeting that precedes each shift. The goal, Clark tells me, is “to get smarter about the way we do the basic elements of police work.” The days of “primal policing,” when the department could “flood the streets with cops and hope you get lucky,” are over.

I’m holding a small stack of paper, 10 maps of the city, each marked with a different red box, representing today’s 10 hot spots. They are surprisingly small, just 500 feet by 500 feet. Above each map is a set of statistics: the probability that a crime will take place in that area today, the two hourlong windows when that potential crime is most likely to occur, and the likelihood that the crime will be a home break-in or an auto theft (“burgs,” Clark calls them). I flip through the stack until I find Linden Street, where, the statistics reveal, there is a 2.06 percent chance of a crime happening today, and 3:1 odds that a crime, should it occur, will be a home break-in versus an auto theft. “These are the high-probability windows,” Clark says, pointing to two times above the map, 7 a.m. and noon.
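The header above each of those maps can be pictured with a small sketch. This is illustrative only — the function name, inputs and output format below are invented, not SCPD’s actual software — but it shows how a cell’s crime probability and the burglary-versus-auto-theft odds on the Linden Street map might be produced from model estimates.

```python
# Hypothetical sketch of a hot-spot map header, assuming the model supplies
# two numbers per 500 x 500 ft cell: the chance of any crime today, and the
# conditional chance that such a crime would be a home break-in.
def hotspot_header(p_crime: float, p_burglary_given_crime: float) -> str:
    """Format a summary like the one printed above each hot-spot map."""
    p_auto = 1.0 - p_burglary_given_crime
    odds = p_burglary_given_crime / p_auto  # burglary : auto-theft odds ratio
    return (f"P(crime today) = {p_crime:.2%}; "
            f"odds burglary:auto theft = {odds:.0f}:1")

# The Linden Street numbers from the article: a 2.06 percent chance of a
# crime, at 3:1 odds that it would be a home break-in.
print(hotspot_header(0.0206, 0.75))
# → P(crime today) = 2.06%; odds burglary:auto theft = 3:1
```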

Numbers Game: Santa Cruz’s six-month predictive-policing trial is built around an algorithm that uses constantly updated burglary data. Officers register drives through predicted crime zones using an onboard computer. Cody Pickens
We pass a woman walking a pair of fluffy Maltese puppies. At a stop, two retirees, both wearing wide-brimmed hats in the midday sun, lean down to say hello through the cruiser’s unrolled window. Most break-ins happen during the day, when people are at work, but I’d expected the hot zones to be in or near bad neighborhoods. This isn’t a part of town many of his officers would usually pass through during day shifts, Clark says. “Now as I’m driving down the street I’m thinking, OK, we’re in an area where there’s a high probability of residential burglary. I’m looking at front doors. I’m looking at front yards. I’m looking for screens that are out. That tells me maybe somebody’s inside.”

On the next block, we pass a man sitting in his car, eating what looks like a fast-food burger. He slumps a little into his seat as the cruiser rolls by. “We’re gonna circle back around and look at Mr. Hamburglar there,” Clark says. “Usually if you’re going to have a cheeseburger, you’re going to bring it inside and eat it in your kitchen, right? Maybe he’s visiting somebody. Maybe he’s waiting to work on a house. Who knows? But when you talk about probabilities, there’s a probability that that guy doesn’t live in the neighborhood.”

George Mohler sits behind his computer in a basement office at Santa Clara University. He is wearing knee-length shorts and a button-down shirt with a baby-blue print, and looks less like a professor than a student (or an indie rocker, which he is—he plays bass in a band called the Idyllists). Mohler pulls up Santa Cruz’s crime stats. A string of coordinates, the precise location of individual crimes, runs down the left-hand side of an Excel spreadsheet. Each coordinate is followed by a date, time and code to identify the event—1 for a home break-in, 0 for a vehicle break-in. There are 4,300 crimes in all, dating back to 2006.
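The record layout Mohler describes — a coordinate pair, a timestamp, and a 1/0 category code — maps naturally onto a tiny parser. The sketch below is a guess at that shape; the field names and row format are invented, since the actual SCPD spreadsheet isn’t public.

```python
# A minimal, hypothetical parser for crime records of the shape described
# above: coordinates, then date and time, then a 1/0 event code.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CrimeEvent:
    lat: float           # latitude of the incident
    lon: float           # longitude of the incident
    when: datetime       # date and time of the event
    is_burglary: bool    # 1 = home break-in, 0 = vehicle break-in

def parse_row(row: str) -> CrimeEvent:
    """Parse one spreadsheet row like '36.9741,-122.0308,2011-07-01 13:20,1'."""
    lat, lon, stamp, code = row.split(",", 3)
    return CrimeEvent(float(lat), float(lon),
                      datetime.strptime(stamp.strip(), "%Y-%m-%d %H:%M"),
                      code.strip() == "1")

event = parse_row("36.9741,-122.0308,2011-07-01 13:20,1")
```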

In the past decade, property crime in the U.S. fell by 29 percent and violent crime by 39 percent. Both are now at the lowest levels since 1973, when systematic nationwide data collection began. Many factors—an aging population, the end of the crack epidemic, increased incarceration rates—have played a part in the decrease. But most criminologists give a large slice of the credit to the use of comparative statistics, which was pioneered in New York City by former police commissioner William Bratton. In CompStat, as Bratton called his approach, police departments collect data on recent crimes, map them, and patrol based on those maps.

"Predicting crime is no different from predicting the weather. Except that you can't do anything about the weather."

Mohler’s seismology-inspired algorithm is different. In his formula, the distance and time separating two crimes are data points too, so it assesses the risk of main “shocks” and the risk of aftershocks connected to that first event. “If you have 5,000 events, our model actually considers on the order of 5,000 factorial events: 5,000 multiplied by 4,999 multiplied by 4,998, and so on,” Mohler says. It’s this massive secondary data set that helps identify the high-probability zones, where “aftercrimes” are most likely to occur. Before he moved to Santa Clara, Mohler proved that his algorithm could work in a simulation run on crime data from Los Angeles’s San Fernando Valley. Mohler and his colleagues at the University of California at Los Angeles found that their maps successfully predicted 20 to 95 percent more crimes than maps used in CompStat.
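The “aftercrime” idea — a constant background rate plus a contribution from each past event that fades with time and distance — is the core structure of a self-exciting point process. The toy sketch below illustrates that structure with invented parameters; Mohler’s actual kernels and fitting procedure are more sophisticated than this.

```python
# Toy self-exciting intensity function, in the spirit of earthquake
# aftershock models. All parameter values are invented for illustration.
import math

def intensity(t, x, y, events, mu=0.1, theta=0.5, omega=1.0, sigma=0.05):
    """Estimated crime rate at time t and location (x, y).

    mu is the constant background rate; each past event (ti, xi, yi)
    adds a contribution that decays exponentially in time and falls off
    as a Gaussian in distance -- the "aftercrime" effect.
    """
    rate = mu
    for ti, xi, yi in events:
        if ti >= t:
            continue  # only earlier events can trigger aftercrimes
        dt = t - ti
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        rate += (theta * omega * math.exp(-omega * dt)
                 * math.exp(-d2 / (2 * sigma ** 2))
                 / (2 * math.pi * sigma ** 2))
    return rate

# Two recent break-ins near the origin:
past = [(0.0, 0.0, 0.0), (1.0, 0.01, 0.0)]
near = intensity(1.5, 0.0, 0.0, past)  # well above the background rate mu
far = intensity(1.5, 5.0, 5.0, past)   # essentially back down to mu
```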

Mohler admits that the earthquake analogy isn’t intuitive. “One is a physical process, and one is sociological,” he says. “They’re not related in terms of what’s driving it. But these models are flexible, and they describe a wide range of contagion-like processes.” Epidemiologists use the seismologists’ model to predict the spread of disease, and the same models are increasingly common in the financial world. “Corporate default is contagious,” Mohler says. “One event will trigger a sequence of further events. Basically, anytime an event increases the likelihood of more events, these kind of models can be used.”

An algorithm is a progressive series of calculations used to process and analyze large sets of data. Mohler’s algorithm draws on basic data from past Santa Cruz burglaries, although other crime-predicting algorithms might incorporate days of the week, holidays or the weather. Mohler’s algorithm probably won’t always be limited to burglaries, but burglary is the type of event it is best suited to. “We’re starting with the easiest thing to model,” he says. “This is the lowest-hanging fruit.”

Just as seismologists are helpless to predict a specific earthquake, the algorithm can’t stop a specific break-in. “It’s not the kind of thing where you can say, ‘There’s going to be a crime at this house at this time’ and you send the police there and they catch somebody,” Mohler says. “It’s making patrols more efficient. You have a whole city to cover, and crime is not evenly distributed throughout that city in space and time. Predicting crime is no different from predicting the weather. Except that weather you can’t do anything about.”

Predictive policing came to law enforcement by way of retail. In 2004, Walmart analyzed a decade’s worth of point-of-sale data. The company’s researchers discovered, among other things, that before a hurricane, its customers stock up on batteries, bottled water and flashlights. No surprise there. But the same analysis also revealed something less obvious. “In advance of bad weather, their sales of Pop-Tarts—strawberry Pop-Tarts, in fact—go through the roof,” says Colleen McCue, a psychologist who, in 2009, co-authored a paper with L.A. police chief Charlie Beck in the law-enforcement magazine The Police Chief titled “Predictive Policing: What Can We Learn from Walmart and Amazon about Fighting Crime in a Recession?”

Criminals are just another type of consumer, McCue says. Most property thefts are crimes of opportunity, and many are fueled by a need for drug money. “If drug prices go up, you see more property crimes,” she says. “But we also have some strawberry Pop-Tarts in law enforcement.” For eight years, McCue worked as a crime analyst for the Richmond, Virginia, police department, where she was among the first to deploy predictive-policing techniques. In 2003, she analyzed data in the weeks after Hurricane Isabel, and discovered that following bad weather, complaints of random gunfire—what police in Richmond call “promiscuous shooting”—increased. “No one really knew why it was happening,” she says. But knowing that it was going to occur and being able to pinpoint the areas at highest risk, the department could prepare accordingly. The Richmond PD deployed officers to targeted areas McCue identified, and random gunfire dropped by 47 percent.

In retail, the ability to predict trends can generate sales. In law enforcement, it can save money. “For every crime that was prevented, you didn’t have to arrest anyone,” says McCue, whose targeted Richmond initiative meant that the department could deploy 50 fewer officers on a single New Year’s Eve, saving $15,000 in personnel costs. “You didn’t have the time associated with processing and booking criminals. You didn’t have the costs associated to hold them if they needed to be held prior to trial, or the judicial resources to try them or the correctional resources to incarcerate them.”

Retailers also spend a lot of time thinking about space, McCue says. “How do you move someone through a store? How do you position things on shelves? We’re doing similar things. We’re asking, How do bad guys move through communities? How can we position our policing assets to be unfavorable to crime?” She says that someday police officers will be as adept at predicting what branch a bank robber will hold up next as Netflix or Amazon are at predicting what movie or book a customer will like. The data analysis incorporates not only the past behavior—the “likes”—of that particular consumer or criminal based on what books they bought or cars they burglarized, but also the preferences demonstrated by other, similar buyers, or bad guys. “Just knowing that a relationship exists, Walmart can make sure they have enough Pop-Tarts on their shelves to meet demand,” McCue says. A big-box store doesn’t need to understand why people crave toaster treats when the wind begins to howl, just as cops don’t need to understand why criminals fire guns or steal cars. They just need to know where and when.

Before we leave Linden Street, Clark records our drive-through on his dashboard computer. Every check-in means more data, and, after the six-month trial, the recorded check-ins will help Mohler determine how effective the program has been. Clark runs a check on the Hamburglar’s license plate, which doesn’t turn up anything. He then looks at the hot-spot maps and we head downtown, toward a triple-decker parking garage that’s been flagged all week. “You can see how something like this has a high potential for auto burglary,” Clark says as we zigzag up toward the open roof. “You’re isolated. You’ve got lots of areas to exit. You can walk up here with your backpack, smash a window, grab a purse, and go.”

One of the most common criticisms of predictive policing is that it will not tell police officers anything they don’t know already. In Santa Cruz, some officers work Sunday through Wednesday, Clark says, while others are on a Wednesday-through-Saturday schedule. There are three shifts during the day, and every four months the officers change shifts. Officers don’t necessarily talk to colleagues who aren’t on the same watch, which can lead to gaps in the collective knowledge of the department. Inevitably, some members of the force will be new to the job, new to the area or simply less vigilant than they could be. There isn’t always a good system (or any system at all) for capturing the institutional knowledge of a retiring officer. And although their experience may be invaluable, not even the best officers can process information the way a computer can. “The human brain cannot weigh more than three or four variables at one time,” says Sean Malinowski, a captain at the Los Angeles Police Department who plans to launch a predictive-policing program of his own. Whereas humans are emotional and their perceptions easily influenced, a computer is impartial. Clark points to the 12:00 and 14:00 windows on his map. “What wouldn’t necessarily ping are these high-probability time windows,” he says.

Afternoon passes, and the evening shift begins its patrols. Calls coming in over the radio increase: We respond to reports of a stolen car, an assault in a public restroom, and an eight-year-old girl missing on the beach, described as wearing a pink T-shirt and Hello Kitty flip-flops (she’s quickly found). The only arrest I witness is of a painfully skinny, needle-marked woman picked up for snatching a purse. The bag in question belongs to a college student, one of a group of teenagers living at the dingy Peter Pan Motel a couple of blocks from the boardwalk. As the police officers book the snatcher, I stand around talking with the kids, Christians from the Deep South who have devoted their summer to selling soft-serve and T-shirts and spreading the gospel on the California coast. I try to explain my own mission: the maps, the statistics, the effort to stop crimes before they even take place. “Sounds like Minority Report,” one of the kids says.

In October 2010, a man armed with a nine-millimeter handgun shot multiple rounds at the Pentagon, a Marine Corps recruiting station and the Marine Corps museum (no one was hurt in the incidents, which occurred late at night). McCue worked with the Department of Homeland Security to draw up a “sniper preference model” that could predict where the gunman would strike next. The sniper was fixated on military targets, which gave the task an unusual urgency. “We got a senior government official out of the gym on a Friday night,” McCue says. “He got the data for us, and we created the models over the weekend.”

For their model, McCue’s team relied on two types of data: the previous shootings, and geospatial information about the areas surrounding every target, including the roadways, terrain and even the socioeconomics of each community. “Where you go, places that you frequent—that says a lot about you,” McCue says. Looking closely at the characteristics of past targets allows crime analysts to identify new areas that might be at risk, a crucial skill given the reactive nature of the job. “Crime and terrorism are almost like a bubble under the carpet,” she says. “Something bad happens, you bring in a lot of resources, and then it moves.” The goal is to get ahead of the movement.
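One simple way to picture the approach McCue describes — scoring candidate areas by how closely their geospatial characteristics resemble those of past targets — is a feature-overlap score. Everything below (the feature names, the similarity measure) is invented for illustration; her team’s actual model is far richer.

```python
# Hypothetical target-similarity sketch: rate a candidate location by its
# average feature agreement with previously attacked locations.
def risk_score(candidate: dict, past_targets: list) -> float:
    """Average fraction of shared features matching each past target."""
    def similarity(a, b):
        keys = a.keys() & b.keys()
        return sum(a[k] == b[k] for k in keys) / len(keys)
    return sum(similarity(candidate, t) for t in past_targets) / len(past_targets)

# Invented features loosely inspired by the article: tree cover, highway
# access, and whether the site is itself a military facility.
past_targets = [
    {"wooded_cover": True, "highway_access": True, "military_site": True},
    {"wooded_cover": True, "highway_access": True, "military_site": True},
]
office_park = {"wooded_cover": True, "highway_access": True, "military_site": False}
score = risk_score(office_park, past_targets)  # 2 of 3 features match each target
```

A scheme like this is how a location that "seemed unlike previous ones" on the surface can still score high: it matches the targets on the features that matter.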

McCue’s software, like Mohler’s, produced maps of the areas most at risk for a future strike. Three days after McCue sent the maps to local, state and federal law-enforcement agencies, the sniper struck again, firing into a Coast Guard recruiting outpost in an office building in Woodbridge, Virginia. Even though the location seemed unlike previous ones, the office building fell within one of McCue’s high-likelihood target areas. Northern Virginia is filled with military targets, but McCue’s model picked up on the densely wooded area behind the Woodbridge office building and the easy access to nearby highways. After the Coast Guard shooting, the gunman went cold. He wasn’t heard from again until mid-June, when he was arrested in Arlington Cemetery, right outside the Pentagon. “He was picked up in one of the highest-likelihood areas identified by the model,” McCue says.

Knowing what a criminal is after isn’t all that different than knowing what a retail store’s customers want.

Knowing what a serial sniper is after isn’t all that different than knowing what a latte sipper wants. “When you pick your coffee shop, you may not have any idea why,” McCue says. But Starbucks does. You want good parking, or easy public transportation or foot access. Starbucks studies the demographics of the area to be sure the people passing by can afford to pay $5 for their specialty drink, just as McCue incorporated geospatial data to inform the sniper hunt. Today the Department of Defense uses a similar process to predict the areas at highest risk of improvised explosive device (IED) attacks in war zones. “If I’m going to place an IED, I need to make sure people are going to be going through the area,” McCue says. “I need to understand the routine activity of the target population. The math is very similar.”

When predictive policing was first introduced in Santa Cruz in July, Clark says, some of his officers told him they thought it sounded like “voodoo magic.” Relying on math to combat property crime ran counter to many officers’ notions of criminal behavior. “I think some took it as an affront to their skills. Others were concerned that this would create extra work,” Clark says. But driving through a 500-by-500-foot hot spot during an hour-long window isn’t asking much. Which is exactly the point. Small, directed efforts in policing can bring about great change. In the late 1980s and early 1990s, the New York City Transit Authority focused on removing graffiti from every subway car and cracked down on people who jumped turnstiles without paying. By 1996, felonies in the subway system had dropped by 50 percent.

By the end of the year, the Los Angeles Police Department hopes to begin testing Mohler’s algorithm. “We’ll probably do several experiments throughout the city, some in violent crime, some in property crime,” Malinowski, the LAPD captain, says.

Unlike the Santa Cruz program, the L.A. experiment will be run like a clinical trial, with control areas where crime is predicted and tracked but predictive-policing methods are not introduced. Malinowski says his department is following the Santa Cruz experiment closely. And the results, though too early to be conclusive, are promising. By the end of July, property crime was down 27 percent from the year before, an impressive drop, especially given the 25 percent rise in the first six months of the year. What’s more, seven criminals had been discovered inside the hot spots.

In one afternoon in July at the triple-decker garage, two women were detained after they were caught peering into cars. One had an outstanding warrant for possession of methamphetamine; the other was carrying the drug. And in late August, a few blocks from Linden Street, police officers stopped a man for suspicious behavior. When they searched him, they found stolen goods from a burglary that had taken place nearby a few days before. “This guy was actually wearing a ring that belonged to the victim in that theft case,” Clark says. “And here he is cruising around in one of our zones.”
