Minority Report
John Anderton was Chief of Precrime in Minority Report, Spielberg's 2002 tech-noir film set in the year 2054. Crimes were detected before they occurred by three precognitive humans lying in a vat. Contemporary researchers and police believe that they can begin to pull off that trick—predict a crime before it happens—using computer algorithms. For example, police at the Real Time Crime Center in Fresno, California, are testing the Beware crime alert tool developed by the security systems company Intrado. The company claims:
Accessed through any browser (fixed or mobile) on any Internet-enabled device including tablets, smartphones, laptop and desktop computers, Beware® from Intrado searches, sorts and scores billions of publicly available commercial records in a matter of seconds—alerting responders to potentially dangerous situations while en route to, or at the location of, a 9-1-1 request for assistance.
In addition, Beware can calculate a criminal risk score for any resident of Fresno. For example, the Washington Post reported:
While officers raced to a recent 911 call about a man threatening his ex-girlfriend, a police operator in headquarters consulted software that scored the suspect's potential for violence the way a bank might run a credit report. The program scoured billions of data points, including arrest reports, property records, commercial databases, deep Web searches and the man's social-media postings. It calculated his threat level as the highest of three color-coded scores: a bright red warning. The man had a firearm conviction and gang associations, so out of caution police called a negotiator. The suspect surrendered, and police said the intelligence helped them make the right call — it turned out he had a gun.
But how results from Beware could mislead police was highlighted at a Fresno City Council meeting:
Councilman Clinton J. Olivier, a libertarian-leaning Republican, said Beware was like something out of a dystopian science fiction novel and asked [Police Chief Jerry] Dyer a simple question: "Could you run my threat level now?" Dyer agreed. The scan returned Olivier as a green, but his home came back as a yellow, possibly because of someone who previously lived at his address, a police official said. "Even though it's not me that's the yellow guy, your officers are going to treat whoever comes out of that house in his boxer shorts as the yellow guy," Olivier said. "That may not be fair to me." He added later: "[Beware] has failed right here with a council member as the example."
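Intrado does not disclose how Beware computes its scores, but the exchange above points to a likely failure mode: a risk score attached to an address outlives the resident who earned it. The following sketch is purely illustrative; the record fields, weights, and color thresholds are my assumptions, not Intrado's actual method.

```python
# Illustrative sketch only: Beware's actual scoring method is proprietary.
# The record fields, weights, and thresholds here are invented assumptions.

def threat_color(score: int) -> str:
    """Map a numeric risk score onto the three color-coded levels."""
    if score >= 60:
        return "red"
    if score >= 30:
        return "yellow"
    return "green"

def score_person(record: dict) -> int:
    """Toy per-person score built from hypothetical record flags."""
    score = 0
    if record.get("firearm_conviction"):
        score += 40
    if record.get("gang_association"):
        score += 30
    score += 5 * record.get("prior_arrests", 0)
    return score

def score_address(occupant_history: list[dict]) -> int:
    """Score an address by its *worst* occupant, past or present -- the
    flaw Olivier ran into: the rating belonged to a previous resident,
    but it attaches to whoever lives there now."""
    return max((score_person(p) for p in occupant_history), default=0)

council_member = {}                        # no flags: scores green
previous_tenant = {"prior_arrests": 7}     # hypothetical prior resident

print(threat_color(score_person(council_member)))                        # green
print(threat_color(score_address([previous_tenant, council_member])))    # yellow
```

Because score_address keeps the worst score of anyone ever linked to the address, a new resident inherits a previous occupant's rating, which is exactly the unfairness Olivier objected to.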
Researchers are also working on computer systems that can predict the probability that a parolee will re-offend. Bloomberg View reports on the crime-forecasting research of University of Pennsylvania statistician Richard Berk. Berk trained a machine-learning model on data from 100,000 cases, noting each offender's age, sex, zip code, age at first arrest, and a long list of possible previous charges for offenses such as drunk driving, animal mistreatment, and firearms crimes. The model was then asked to predict which offenders were likely to engage in domestic abuse again. As Bloomberg View reported:
Currently, about half of those arrested for domestic violence are released … The challenge [Berk] and [Penn psychologist Susan] Sorenson faced was to continue to release half but pick a less dangerous half. The result: About 20 percent of those released by judges were later arrested for the same crime. Of the computer's choices, it was only 10 percent.
That's a pretty good result. However, some worry that judges and other members of law enforcement might begin to "trust the software" more than their own knowledge and experience.
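Berk's published forecasting work is generally described as using random forests. To make the setup concrete, here is a minimal sketch of the workflow the article describes: train a classifier on historical cases with features like age, sex, and prior charges, then rank new arrestees by predicted risk and release the lower-scoring half. Everything in it (the synthetic data, the feature encoding, the scikit-learn model) is an assumption for illustration, not Berk's actual model or data.

```python
# A minimal sketch of risk-ranked release, assuming a random-forest
# classifier; the data are synthetic and the feature set is invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 100_000  # the article cites 100,000 training cases

# Hypothetical feature columns (zip code omitted for brevity).
X = np.column_stack([
    rng.integers(18, 70, n),   # age
    rng.integers(0, 2, n),     # sex, encoded 0/1
    rng.integers(12, 40, n),   # age at first arrest
    rng.poisson(0.5, n),       # prior drunk-driving charges
    rng.poisson(0.1, n),       # prior animal-mistreatment charges
    rng.poisson(0.3, n),       # prior firearms charges
])
# Synthetic labels: 1 = later re-arrested for domestic abuse. Random
# noise here, so the model learns nothing real; it only shows the flow.
y = rng.integers(0, 2, n)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

# Rank new arrestees by predicted re-offense risk and, as in the
# study's framing, release the half with the lowest scores.
new_cases = X[:10]
risk = model.predict_proba(new_cases)[:, 1]
release_half = np.argsort(risk)[: len(risk) // 2]
print("indices recommended for release:", release_half)
```

The design choice worth noticing is that the model does not decide who is dangerous; it only orders the cases, and the release threshold (here, half) remains a policy decision made by humans.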
No doubt such crime-prediction software will be increasingly refined. Its outputs can be used chiefly to cast police suspicion on citizens, or they can be used to intervene in ways that help the people it identifies as at-risk avoid becoming criminals.