Technologies to Better Fight Crime

On a Saturday afternoon last summer, Mark Rasch took his son to his baseball game at a park in Georgetown, Maryland. The ballpark is located in an area that has zone parking with a two-hour limit. Rasch was forced to park in a spot that was a bit of a hike from the ball field. He later eyed an opening closer to the park and moved his car there. The game ended, Rasch packed up, and was ready to pull away when he noticed a parking enforcement officer writing tickets. "I'm OK, right?" he asked, assuming that because he had moved his car she wouldn't know he'd been parked in the zone longer than two hours.

Wrong. The officer not only knew that he had moved his car, but also when and how long he'd been parked within the zone. Fortunately, she didn't write him a ticket, as he was about to pull out. But the encounter left Rasch, who is a lawyer and a cyber-security consultant, a little spooked at the realization of just how much information law enforcement is generating.

If there was a time when law-enforcement agencies suffered from an information deficit, it's passed. Of the more than 18,000 law-enforcement agencies across the United States, the vast majority have some form of technology for collecting crime-related data in digital form. The biggest city agencies have sophisticated data warehouses, and even the most provincial are database savvy.

So it's not surprising that law-enforcement and criminal justice agencies are running into the same data-related problems that CIOs have been experiencing for years: ensuring data quality and accessibility, developing and enforcing standards for interoperability, and exploiting those digital resources in the most effective manner.

The era of data-driven law enforcement began in the early 1990s in New York City. It was there that police chief William Bratton sought to impress newly elected mayor Rudolph Giuliani with a radical approach to policing that came to be known as CompStat. CompStat put an emphasis on leveraging accurate, detailed, and timely data to optimize police work.

"Police departments are powerful collectors of data," says Michael Berkow, president of Altegrity Security Consulting, a newly launched division of security firm Altegrity. Before joining ASC last month, Berkow was chief of the Savannah-Chatham police department, and before that he was second-in-command to Bratton in Los Angeles after Bratton left New York to be chief of the LAPD.

Police departments were motivated to implement or upgrade IT systems by the Y2K frenzy, Berkow says. "By 2000-2001, everybody had some level of digital information," he says. That and CompStat led to a movement known by the initials ILP, which stand for "information-led policing" or, according to some, "intelligence-led policing."

The concept is simple: Leverage data to help position limited police resources where they can do the most good. It's an effort to be more proactive, to "change the environment," Berkow says, moving away from the reactive, response-oriented methods of the past.

To a great extent, data are about the context of criminal behavior. "We know that the same small group of criminals is responsible for a disproportionate amount of crime," says Berkow. Police refer to that group as PPOs: persistent prolific offenders. Past criminal behavior, such as domestic violence, can be a strong indicator of potential future problems.

When Berkow was chief in Savannah, his department went through data on recent homicide cases and noticed an interesting data point: Of 20-some arrests for homicide, 18 of those people had prior arrests for possession of firearms. "We started this very detailed review of every aspect of our gun arrests," he says.

Law-enforcement officials often refer to the need for "actionable information." One of the first ways police agencies used incident-report data in digital form was in conjunction with geographical information systems, in support of what's known as electronic crime mapping, or hot-spot analysis.
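The idea behind hot-spot analysis can be illustrated with a few lines of code. The sketch below is only a simplified illustration, not any vendor's actual algorithm; it assumes incident records have already been geocoded, and the record layout and grid size shown are hypothetical. It bins incidents into grid cells and ranks the cells with the most activity.

    from collections import Counter

    # Hypothetical geocoded incident records: (latitude, longitude, offense type).
    incidents = [
        (41.8781, -87.6298, "robbery"),
        (41.8790, -87.6305, "theft"),
        (41.8525, -87.6510, "robbery"),
        (41.8787, -87.6290, "assault"),
    ]

    CELL = 0.005  # grid cell size in degrees, roughly a few city blocks

    def grid_cell(lat, lon, cell=CELL):
        """Snap a point to the corner of the grid cell that contains it."""
        return (round(lat // cell * cell, 4), round(lon // cell * cell, 4))

    # Count incidents per cell; the busiest cells are the "hot spots."
    counts = Counter(grid_cell(lat, lon) for lat, lon, _ in incidents)
    for cell, n in counts.most_common(3):
        print(f"cell at {cell}: {n} incidents")

Real crime-mapping tools typically layer such counts onto a map and apply more sophisticated smoothing and filtering by offense type and time window, but the basic aggregation step is along these lines.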
Police in the city of Edmonton, Alberta, brought in data analysis technology from business intelligence vendor Cognos (now part of IBM) a few years ago. The first project police officials concentrated on was using the reporting tool in conjunction with a new geographic-based resource deployment model being implemented by the agency. "Our business analytics reports became a key component of how we deployed policemen around the city," says John Warden, staff sergeant in the business performance section of the Edmonton Police Service.

Now the agency is using the data to plot criminal activity according to both geographic area and comparative history. "We're really delving into those analytics in terms of place and time," says Warden. The holy grail of information-led policing is what's referred to as predictive policing: being able to predict where and when crimes may occur.

That's where Chicago wants to go. The Chicago PD operates what Jonathan Lewin, commander of information services, refers to as "the largest police transaction database in the United States." Costing $35 million, Chicago's Citizen and Law Enforcement Analysis and Reporting (CLEAR) system processes "all the arrests for all the departments in Cook County, about 120, in real time," Lewin says, and 450 local, state, and federal law enforcement agencies have query access to it. Lewin's IT shop has about 100 staffers and employs between 10 and 20 contract workers from Oracle, whose database technology the system is based on.

Chicago PD is working with the Illinois Institute of Technology (IIT), by way of a $200,000 grant from the National Institute of Justice, on an "initial exploration" of a predictive policing model. The grant was awarded partly on the basis of work done by Dr. Miles Wernick of IIT in the area of medical imaging and pattern recognition, and the project involves exploring "nontraditional disciplines" and how they might apply to crime projection. "We're going to be using all the data in the CLEAR system," Lewin says, including arrests, incidents, calls for service, and street gang activity, as well as weather data and community concerns such as reports of nonworking streetlights. "This model will seek to use all these variables in attempting to model future patterns of criminal activity," he says.

SPSS is a name often associated with predictive policing. The statistical analysis software developer, recently acquired by IBM, has customer histories that tout the success of its tools in the criminal justice environment, such as the Memphis, Tennessee, police force, which SPSS says reduced robberies by 80 percent by identifying a particular "hot spot" and proactively deploying resources there.

But can software really predict crime? "It's not a binary yes or no; it's more of an assessment of risk: how probable something is," says Bill Haffey, technical director for the public sector at SPSS.
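Haffey's point, that predictive policing produces graded risk estimates rather than yes/no predictions, can be made concrete with a small sketch. The code below is a hypothetical illustration, not the SPSS or CLEAR methodology: it simply turns historical incident counts for each (beat, hour-of-day) slot into an expected-incidents-per-week score that could be used to rank where to position patrols. Real models would fold in many more variables, such as weather, calls for service, and gang activity.

    from collections import Counter

    # Hypothetical history: one (beat, hour-of-day) entry per past incident,
    # covering a fixed number of weeks.
    history = [
        ("Beat 12", 22), ("Beat 12", 23), ("Beat 12", 22),
        ("Beat 07", 14), ("Beat 07", 15), ("Beat 12", 2),
    ]
    WEEKS_OF_HISTORY = 26

    counts = Counter(history)

    def risk(beat, hour):
        """Expected incidents per week for a beat/hour slot: a graded risk
        score rather than a binary prediction."""
        return counts[(beat, hour)] / WEEKS_OF_HISTORY

    # Rank the slots so limited patrol resources go where estimated risk is highest.
    ranked = sorted(counts, key=lambda slot: risk(*slot), reverse=True)
    for beat, hour in ranked[:3]:
        print(f"{beat} at {hour:02d}:00 -> {risk(beat, hour):.2f} incidents/week")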
The private sector is also doing its part. CargoNet, the first-ever national database of truck theft information, is a joint project from insurance data provider ISO and the National Insurance Crime Bureau (NICB). CargoNet will collect up to 257 fields of data, detailing such things as the destination, plate number, and carrier; the time, date, and location of the theft; as well as serial numbers and other identifying details on the stolen goods. Refreshed several times per day, CargoNet is expected to track more than 10,000 events per year, driving both a national alerting system and a corresponding truck-stop watch program.

Truck theft happens mostly on weekends, and it's rife around the Los Angeles basin, Atlanta, Miami, Dallas/Ft. Worth, and Memphis, Tennessee. Trucks and trailers typically slip away in the dark of night from truck stops, rest areas, distribution centers, and transfer points. The goods most often hit are consumer electronics, food, wine and spirits, clothing, and other items easily sold on the street.

These historical patterns are well known, but cops on the beat need up-to-the-minute information on the latest truck stops and distribution centers hit, the time of day perpetrators strike, and the type of goods stolen. Carriers and manufacturers want fresh, nationwide information so they can change the timing of deliveries and avoid specific truck stops and routes. Insurers want a single source of data so they can better gauge risk and bring the problem under control nationwide.

All this collecting, warehousing, and mining of crime-related data raises the question: How much is too much? The Georgetown incident still bothers Rasch. "What it meant was that D.C. was keeping a database of people who are legally parked," says Rasch, which, from a privacy standpoint, is "more intrusive than chalking the tires."

Pertinent questions include: How long do they hold on to that data? And with whom do they share it? It's an important discussion to have, both in terms of privacy and effective police methods. After all, as Rasch points out, it was a parking ticket that led to the arrest of serial killer Son of Sam.

SOURCE: John Soat, "Beyond Street Smarts," InformationWeek, November 16, 2009, and Doug Henschen, "National Database Tracks Truck Thefts," InformationWeek, January 26, 2010.

CASE STUDY QUESTIONS

1. What are some of the most important benefits derived by the law-enforcement agencies mentioned in the case? How do these technologies allow them to better fight crime? Provide several examples.

2. How are the data-related issues faced by law enforcement similar to those that could be found in companies? How are they different? Where do these problems come from? Explain.

3. Imagine that you had access to the same crime-related information as that managed by police departments. How would you analyze this information, and what actions would you take as a result?