The Jennifer Lawrence/Chris Pratt romantic adventure “Passengers” may not be what everyone has in mind. It’s certainly not what’s being advertised in the trailers and television commercials (read our review here). But it’s not too surprising there are some twists to a script that landed on 2007’s Black List (a trend-setting survey of unproduced screenplays) and kickstarted a studio career for its author, Jon Spaihts.
As Spaihts’ star rose (working on “Prometheus” and “Doctor Strange”), his original idea was still floating through many stages of development, like a high-concept spaceship zooming off toward its eventual landing even as its origins grew distant. Morten Tyldum, the director of “The Imitation Game” (known in some circles as “Today They Call Them Computers: The Movie”), was the one who eventually brought Spaihts’ vision to the screen.
We had the good fortune to speak to the screenwriter about his feelings on automated computer systems, generation ships, “Alien: Covenant” (which he is not involved with), “Doctor Strange 2” (still TBD) and the likely onslaught of fiery “hot takes” this film might inspire in viewers who toss nuance out the airlock in the face of ethically challenging storytelling.
Whether you’ll ultimately feel that “Passengers” uses the uniqueness of its setting to show true love transcending an instigating moral breach, or that it eggs on future stalkers with an “eh, she’ll eventually come around” mentality, is a conversation that Spaihts seems ready to have.
The movie is predicated on, “These hibernation pods never make a mistake. No one could ever wake up accidentally.” And all you’ve got to do is hit a big space rock and it’s going to happen. Are you a little bit fearful of the computer singularity?
Yes, but the premise of the film is that something nearly impossible has happened, that basically a one-in-a-trillion fault has taken place. Something has actually penetrated a shield, and it didn’t just perforate the ship… it happened to damage the most powerful computer on the ship, which then had to distribute its work across the network, reassigning all of its processing tasks to every auxiliary and ancillary processor on the ship.
It’s running at 110% of capacity for years on end until it starts to lose chips faster, so the cascade spreads and things tumble downhill. So it’s a ship in the grip of an extraordinarily unlikely crisis.
But does this mean that you personally are a little bit distrustful of self-driving cars, or of stores like the Amazon Store where you don’t have to talk to a cashier?
Oh no. It’s just a story point. In a sense, all artificial intelligence is human intelligence packaged in code. It’s just prefabricated human intelligence and algorithms. And all human intelligence is capable of failure, so undeniably self-driving cars will crash and kill people sometimes, and self-regulating systems will fail in cataclysmic and unpredictable ways, but people do all of those things, too. And people probably do them at higher rates than most automated systems. There’s no shortage of human beings killing people by crashing cars. So I have no special fear of A.I.
**Major spoiler section coming: stop reading here until you’ve seen the movie. Good? Ok, good. Skip to the mark below where the spoilers end.**