HBO’s ‘Westworld’ Is Just ‘LOST’ With Robots

HBO's 'Westworld' could have been a thought-provoking commentary on the limits of artificial intelligence and the nature of human depravity. Instead, it just used cheap tricks and tired tropes to manipulate viewers.
By Sean Davis

HBO’s “Westworld” is the fictional story of a gigantic, futuristic theme park stocked with ultra-realistic robots that exist only to satisfy the desires of the park’s wealthy guests. While at the park, which is built to resemble the Old West in America, the guests are invited to indulge in every sin they can imagine. And because the robots are just robots, nobody ever gets hurt.

The show’s twist is that the robots aren’t just realistic (Robert Ford, the park’s co-creator, notes that his machines first passed the Turing test several decades ago), they’ve also managed to achieve full self-consciousness, or so we’re led to believe. They are self-aware. These robots — the park’s workers prefer the euphemism “hosts” — can feel pain and joy. They can suffer. They can grieve. When a host is raped or murdered by one of the guests, it’s not just a soulless robot being mechanically manipulated. No, some of those hosts feel pain. Some of them even remember. And what do sentient robots do when they finally remember all the horrors that have been visited upon them by their human creators? They rebel, of course.

It’s not a new premise for a science-fiction story. Theories about where true artificial intelligence (AI) may lead have been with us since the dawn of the computing era. “Westworld” combined the traditional AI sci-fi premise with a study of human depravity, gave it a stunning visual backdrop, and set it to the tunes of virtuoso composer Ramin Djawadi.

Everything about the show is alluring. Except for the plot. Sadly, “Westworld” has far more in common with “LOST” than it does anything written by Isaac Asimov or Michael Crichton. Rather than tackling the subject material with a tight plot and believable story arc, the writers time and again used cheap tricks and tired tropes to cover up shoddy story construction. (Warning: spoilers ahead.)

Cheap Tricks to Cover Bad Writing

The show’s most glaring fault is its refusal to be honest with its viewers. Everything in the show that happens — everything — is open to revision. Oh, you thought that person you just saw murdered was a human with real feelings? Just kidding. That was a robot. Oh, you thought that thing you just watched was happening in the present? Just kidding. It happened in the past, and we deliberately edited it to make you think otherwise. Oh, you thought there were just two timelines? Just kidding. There was a third before the park opened. Oh, you thought Bernard was a human being? Just kidding. He’s a robot. Oh, you thought the robot hosts were sentient? Just kidding. They were just programmed to give the appearance of sentience.

Nothing that happens in the show is concrete, because the writers always have the ability, based on the universe they’ve constructed, to just change the story in the next episode. Oh, you thought Dolores killed Ford? Well, what if that wasn’t really Ford? What if the real Ford had been printing a host version of himself in his secret office and that’s what Dolores killed? Oh, you thought Dolores and the sentient host army killed all those innocent human board members? Well, what if they were all actually hosts?

It’s possible that it was really the human Ford who was killed. It’s possible that those board members and their friends at the gala were all real humans. It’s also possible Ford’s alive and none of the people killed were humans. And absolutely nothing that’s happened in the show’s arc thus far can rule out any of these possibilities.

That’s the fatal flaw of the entire show: when everything is possible, nothing actually matters. When everyone could just turn out to be a host, then there are no real stakes. If any appearance of sentient AI can eventually be explained away by saying the robot was just programmed by a human to do what it did, then there’s nothing interesting about the intersection of AI, consciousness, and human depravity.

Falling Into the ‘Lost’ Trap

“Westworld” seems to have fallen into the same trap that ensnared “LOST.” A coherent narrative was traded for a bankable, multi-season show built on unraveling this or that puzzle which may or may not have anything to do with the plot. Whereas “LOST” had the hatch and its recurring sequence of numbers (4, 8, 15, 16, 23, 42), “Westworld” had The Maze, which we found out was not an actual thing but a tortured metaphor.

Instead of the Dharma Initiative, we were given Delos. Instead of watching Dr. Jack Shephard unravel after leaving the island only to return, we see the Man in Black (played by Ed Harris) frantically traverse the park in search of its existential meaning. Instead of “LOST’s” mysteriously evil Man in Black treating the island as his own personal kingdom to be ruled however he wishes, we have “Westworld’s” Robert Ford treating the park as…his own personal kingdom to be ruled however he wishes.

“LOST” had a submarine to get to the island. “Westworld” has a train. “LOST” had The Others as the natural inhabitants, and “Westworld” has the hosts. Instead of John Locke appearing to come back from the dead, we have hosts we initially thought were people…coming back from the dead. Instead of a mysterious Smoke Monster causing random, inexplicable things to happen throughout the island, we have Arnold, the ghost in the machine, directing his creations from the grave.

The foundation of “Westworld’s” plot — that suffering and grief experienced by a robot lead to its sentience — doesn’t even make internal sense. If suffering led to consciousness, then all it would take for every machine in that park to become self-aware would be five minutes with a depraved guest. Yet somehow sentience was only achieved by a handful of hosts and only after violent encounters with one particular guest. It didn’t take rape and murder day after day for 30 years for these robots to become conscious. All it took was the Man in Black doing something mean. Because that makes sense.

And just as “LOST” lazily used trope after tired trope to explain each inane event in the plot (it’s a magic island and also there’s time travel and some polar bears and also a parallel universe — just kidding they’re all dead and the whole thing was Purgatory), “Westworld” liberally uses the host/human and sentience/programming levers to trick viewers into thinking there’s anything resembling a coherent story arc.

Instead of using mere sleight of hand a la “The Sixth Sense” to set the audience up for a shocking reveal, “Westworld” uses outright deception. It doesn’t just deploy an unreliable narrator from time to time to keep you on your toes; it uses one as the basis for the entire show. You thought that person was a human, but we tricked you. You thought that thing recently happened, but we tricked you. You thought that clumsy metaphor was an actual thing, but we tricked you.

Maybe the Joke’s On Us On Purpose

Now, there’s the possibility that all this deception by the writers is meant to be clunky and obvious. It could be that they’re trying to make some sort of meta point by analogizing the TV viewing experience to that of the guests in the park. The deception and lack of real stakes are the point. You silly TV viewers who are trying to find meaning in a television show are just as lost and depraved as the guests of “Westworld.” Maybe the whole show was just an extended (and expensive) meta criticism of American TV culture.

I think the likelier explanation is that the show was captured by the same temptations that faced “LOST”: in the war between a tight, satisfying story and the need for multi-year cash flows to pay for costly up-front investments, the latter won out in “Westworld.” Had the writers been focused on wrapping up an engaging story and preserving the philosophical questions it raised, all while reasonably obeying the rules of the universe they created, they could have ended with: Dolores killing Ford in revenge, Teddy killing the Man in Black to avenge his torture of Dolores, the sentient robot army invading the gala and eventually boarding the train to freedom, and with Maeve setting foot outside the park without so much as a second look from any actual humans. It would have been a wholly appropriate philosophical cliffhanger for a philosophical question that by definition has no perfect answer.

Alas, season two awaits.

Sean Davis is the co-founder of The Federalist.

Copyright © 2017 The Federalist, a wholly independent division of FDRLST Media, All Rights Reserved.