There is some comfort to be gained from dystopian sci-fi: no matter how realistically rendered, and however pointedly it comments on current bleak realities, it presents a vision of the world as it is not, at least not yet. But prepare to have the boundary between everyday life and futurist fiction comprehensively shattered by Matthias Heeder and Monika Hielscher’s deft, chilling documentary “Pre-Crime.” The picture borrows its title from the Philip K. Dick story “The Minority Report” on which Steven Spielberg’s film was based, though it locates its dystopia in a very real, very present (indeed omnipresent) present tense. Turns out, we are living in the future right now; it’s just not the one with the jet packs and silver jumpsuits.
“Pre-Crime” is the collective term for the practice of “predictive policing”: aiding law enforcement by putting together lists and maps, based on probability math, of high-risk urban areas and of individuals likely to commit serious crimes. That “likely to” is key: drawing on an ever-expanding database that secretly culls information from social media, surveillance programs, credit card usage, police records, global mapping systems, cellphone tracking and so on, the various models used by police departments across the Western world, including PredPol in Kent in the U.K. and the Heat List compiled in Chicago, are built to allow the authorities to pre-empt crime. The words used to describe this practice are chunky and jargonistic: moving from “reactive” policing to “proactive” policing; allowing an officer’s role to become “preventative” rather than punitive; seeking to eradicate crime “before it happens” rather than standing by and waiting for innocent people to become victims and known persons of interest to become full-blown perpetrators.
It doesn’t take a master’s in ethics to understand the issues inherent in this deterministic view of human behavior, but Heeder and Hielscher’s icily arch approach brings them home with full force, marshaling a cunning selection of experts, from the developer of Ubisoft’s predictive-policing video game “Watch Dogs” to human rights activists, police officers, social workers and tech consultants, whose contributions make for a fluid, dynamic presentation of some pretty cold truths. They even suggest that our current reality has already surpassed Dick’s imagined future in its lack of humanity; rather than a bald, telepathic Samantha Morton in a bath full of goop, current predictive policing models rely on algorithms, databases and long, long strings of ones and zeros, and as one commentator says, “Code has no conscience.”
Conscienceless it may be, but code can have bias. When a key part of the salient data set is inevitably based on previously reported crimes, entire classes and segments of society are far more likely to appear on these watchlists than others. In one of the slightly whimsical cutaways to Heeder on a rocky outcrop, doodling and staring out to sea (which start off slightly irksome but quickly become a necessary respite from the blizzard of grim facts), he muses, “Is there any watchlist for corporate crime?” And of course there isn’t: put bluntly, if the color of your skin and the color of your collar are anything other than white, you are far more likely to be targeted by these tools, which enshrine existing biases in code that might as well be written in stone. There is no transparency about how the lists are compiled, no accountability for any errors made, and once you’re on one, there’s pretty much no way of getting off it.
To demonstrate life at the pointy end of pre-crime profiling, the directors select two articulate and sympathetic case studies, both young black men, who have each been informed that they are on their respective cities’ lists (letting the pre-crime “suspects” know they’re suspected is a fundamental part of these programs). American Robert McDaniel was flagged not for his few petty arrests for marijuana possession and the like, but because of who he was arrested with. That, plus the fact that his best friend was murdered, makes McDaniel, in the eyes of the almighty algorithm, a strong bet to turn to murder or violent crime himself, in a rather dizzying elision of victim and perpetrator. McDaniel, whom we see working his menial day job, sweeping up and stacking boxes, is pretty much resigned to a lifetime of keeping his head down, constantly aware of the cloud of unwarranted suspicion that surrounds him, one that means he can never afford the luxury of bad luck.
The other case is of Tottenham-based Smurfz, a young, hoodie-clad man whose pin-sharp assessment of the unfairness of his situation makes him a compellingly persuasive presence. Not only does he draw a straight line between PredPol and the targeting of ethnic minorities, but he also implies that knowing you’re already on a list calling you a criminal may in fact make you far more likely to become one, like a real-world echo of the observer effect in physics.
This is one of the most insidious and potentially explosive aspects of this absorbing and provocative film. Because the thing about adopting a predictive system of law enforcement, as many police departments around the globe already have, is that it’s hard for a non-mathematician to sort probability from prophecy, and prophecy has a tendency to become self-fulfilling. There’s a sense that, impersonal as it is, the system wants to prove itself right, so the line between identifying potential criminals and creating actual criminals becomes increasingly blurry.
Whatever the inherent biases of the tools themselves, just as frightening is the thought of what ends this evolving tech can be put to by authorities and regimes whose agenda may not simply be the prevention of crime but the suppression of dissent and the abrogation of human rights and freedoms, behind an increasingly opaque shield of secrecy. There’s no reason, says a German contributor commenting from Berlin, to believe that the tools used to monitor society will not soon be used to “adjust” it. The required-viewing “Pre-Crime,” however, at least means that when that crime is inevitably perpetrated on us, we won’t be able to say we didn’t see it coming. [A-]