6.29.18 ‘Westworld,’ ‘Humans’ forget to ask the big question: What does a robot want?

(Warning: Somewhat spoilery.)

What does a robot want?

In this time of real-world troubles among flesh-and-blood humans, I suppose it’s easier to think about robot needs and wishes, of androids dreaming of electric sheep. To consider where Westworld thinks it’s going, and about the hole Humans appears to have fallen into. And so ….

As of now, I’ve watched all that Westworld and Humans have to offer, and both are based on similar tropes: robots designed to help and/or provide pleasure for humans become sentient and aware. The robot journey itself is far from new, either – Westworld was a 1973 movie – and of course there’s Data’s story on Star Trek: The Next Generation.

But it seems to me that while in the past we would see robots who simply wanted to become human, or who valued the characteristics that made them like their creators – Data yearns to understand humor, wants an emotion chip that’s been denied to him – today’s stories are hoping to go a few steps beyond that. Westworld in particular is concerned with the concept of time and memory (as many Nolan brothers projects are; Jonathan is behind Westworld and Memento, Christopher behind Memento and Inception). As for Humans, it’s a little simpler: they’re using their story platform to explore the nature of humanity.

Still, I can’t watch these shows without wondering why the creators’ vision is so human-centric. In Humans the synths, as they’re called, fall in love with humans, try to parent synth children, decide to prove they are worthy of human rights by living a peaceful life in an apartment alongside hostile human neighbors. When a human one of them loves is injured, they seek vengeance. On Westworld, in addition to turning on their tormenters/creators, the hosts decide there’s a better world for them – out there. And once out there, they will continue to destroy humans.

But … why, exactly? These are human traits, wants, desires, and needs.

What does a robot want?

Why would a sentient creature made of machinery desire the same things humans do? Why would they aspire in that direction? The things we require – food, rest, love, connection – are not necessarily the things they require to function in the world. Even if they look like us and have lived with us, which would perhaps explain some mirroring once they are freed, they are a different species, made unlike any other living creature on this planet.

We can probably agree on the notion that synths and robots would want to continue living. The drive for survival is so basic that without it, any creature — synthetic or biological — would just shut down, quickly or slowly. Beyond that, though, what is the goal? When synths on Humans say they want to be free and to have rights, the next question isn’t really about whether they should have them or not – but for what purpose? Is it solely to keep living another day and another day, or is there a greater picture here? What is the long-term idea?

Let’s say they did get those rights, and learned how to create more of themselves, again – to what end? What are they creating, making, existing for? Do they need individual apartments with stoves they don’t need to cook on and toilets they don’t eliminate on and beds they won’t sleep on? It’s as if there’s a box we’ve drawn for ourselves as humans, and we can’t even imagine the shape synthetic beings would rather work with.

Most humans live by a kind of template, thanks to religion, that helps dictate what’s expected of them: go forth and multiply. Help the poor. Worship a sky deity or deities. Humans has hinted at a nascent faith in at least one of the synths, who keeps a small shrine to their late creator — a human. But again, to what purpose would a machine require belief or faith of any kind?

Perhaps this is all a meditation on whether one can live without faith, without a prescribed set of rules, regulations, mythology and archetypal stories to guide and advise. Or — since neither show’s robots are exactly triumphing at this stage — perhaps it is to show that without such a guide, one is left empty and pointless.

Just asking these questions may be enough for the show creators; I surely hope they’re having really interesting conversations about what it all means. Thus far, though, I tend to doubt it. Right now we seem to be playing around with Pinocchio fantasies in which the robots are struggling to define themselves as either human or anti-human … but so far, not as simply themselves. To say “I am not ___” is important to understanding identity, but it still sets your template as something already extant. These are beings at once innocent and too wise, and this stage of human-modeled desire should ideally pass.

Yet I wonder if it will. I don’t know how interesting that would make the shows, which throw around big ideas and give them only a little room to breathe between the horse chases and shootouts and violence. Yes, that’s drama — but that’s every drama. Finding out how a new species will define itself is the big idea here.

And so far, nobody’s asking the right questions.

What do robots want?
