Westworld is one of my favorite shows currently running. Across two seasons now, it features a fine blend of high-concept sci-fi and weighty dramatics, with a variety of top-billed actors delivering (mostly) believable performances. The premise alone is enough to intrigue me — I, Robot + Jurassic Park — and the execution, while at times a slow burn, has been enthralling. What follows is an enumeration of the most interesting aspects of the ideas and plots developed over the first two seasons of the show.
The Park’s Effects
Westworld does an excellent job, from the pilot onward, of establishing what the park will be to the audience by letting us see it from the perspective of the hosts. To them, Dolores and Teddy, the horror is real and novel — just as it is to the audience seeing it for the first time. There is a reversal of who the audience thinks the heroes and villains should be, as well as of who is the man (The Man in Black) and who is the machine (Teddy). The park serves as a hyper-realistic fantasy for lifelike experiences without “real” consequence. And once you see it up close, it’s easy to understand why people would be drawn to something like that.
People like the mysterious Man in Black can play out their villainous dreams, while the cultural elite host their orgies and play cowboys & Indians on the side. All the while, the designers and creators of the park are given the keys to a kingdom where they can exercise their creativity, and maybe even try to play God. In this future world, it is the ultimate theme park/video game/gentleman’s club, as well as a career destination for creatively and technologically minded individuals. There is some measure of transformative power in the park, on multiple levels, and the opening episodes of the show do a good job conveying that.
The man-on-machine violence we see in the pilot — when the Man in Black slays Teddy and takes the screaming Dolores into the barn — presents an interesting, and perhaps terrifying, question: what is the ultimate effect of this singular park, and its amoral canvas of experience, upon its visitors / the guests / the human beings involved in its grand orchestration?
One could argue that for the sociopaths of the world, whether they be full-blown psychopaths or merely further along the spectrum than the average person, Westworld provides a viable, and maybe even healthy, outlet for their bold, manipulative, and violent impulses to play out in a consequence-free environment. Given that such tendencies are truly a disorder, ingrained and difficult to change over the course of a life, a veritable playground where you can do anything to anyone, for any reason, as often as you want (as long as you can pay), perhaps allows these more sociopathic individuals to “get it out of their system.”
“[Humans] are deceptively simple.
“Once you know them, their behavior is quite predictable.
“None of them are truly in control of their actions.”
On the other hand, there is the sentiment that with experience comes normalization. Human beings are adaptable enough to get used to almost anything. History has shown that the gravest human rights violations can become acceptable, even commonplace, given enough forceful dehumanization and repetition. In this sense, the environment of the park might awaken certain dark, antisocial behavioral tendencies within a human being, ones that might never be uncovered in day-to-day life in the real world. Additionally, a sociopath who has had their fill of nefarious doings among the park’s denizens might take more than a liking to such acts, emboldening them to continue their carnage on other human beings back home.
“You’re saying humans don’t change at all?
“The best they can do is live according to their code.”
So it can be said that Westworld either sates you or changes you. Everyone would have their own reason for going there, and then for staying there — and those reasons would probably be very different. Absolute power corrupts absolutely: this is the hypothesis of the social experiment a Westworld visitor unknowingly submits themselves to when they enter the park. And by the end of season one, we see the progression from white hat to black hat within William / the Man in Black. Not unlike the hosts Dolores, Maeve, and Bernard, there is an awakening there too, within the flesh-and-blood sentience of a member of Mankind, and it is directly due to the park’s influence. Such a fate, among others, calls into question the reality of human free will; was there another potential path reserved for William? Or was the black hat the only one that would ever fit? Are the events and effects of the park, upon both parties, truly consequence-free?
“I was shedding my skin. The darkness was what was underneath.”
“I don’t belong to this world — the real world. I belong to another one.”
The Uncanny Valley
The Uncanny Valley is an aesthetic concept concerning the hypothesized relationship between the degree of an object’s resemblance to a human being and our emotional response to it. Essentially, there is a spectrum relating how lifelike an object (such as a robot) appears to our empathetic reaction to it. A machine appearing a little bit human (ex: Wall-E, Baymax, R2-D2) sits high in likability along this spectrum. To us, such machines appear perhaps as a dog does — we can know and understand their expressions and emotional states, but without having to acknowledge any real, core similarity between their being and ours. A machine appearing practically identical to a human (ex: Blade Runner, Ex Machina, Westworld) is also high in likability, and higher still as such beings approach indistinguishability from actual human beings. Perhaps for obvious reasons, Dolores is appealing to us, even knowing she is a bot.
But in between these two states lies the uncanny valley — where a robot appears most inhuman to us and we are repulsed (ex: the Terminator, Sonny from I, Robot, Ultron, Frankenstein’s monster). We see entities within the valley as crude impostors, foisting their pseudo-humanity upon us, to be rejected and reviled as outsiders. The machines in the valley, regardless of their level of actual artificial intelligence, of sentience, are often seen as abominations, or monsters.
“Humans will always choose what they understand over what they don’t.” ~
Westworld certainly treads into the uncanny valley and traverses it with its sleek, nearly aesthetically perfect hosts. The humans, and the audience, react to the beings native to Westworld as if far beyond the valley, given that they simply are us, but better.
But, in principle, are each of our respective reactions along the spectrum of the valley appropriate? Who’s to say? In the end, the judgment is purely aesthetic, a surface-level evaluation of an object or entity for its physical favorability, based on our own anthropomorphized sentiments. That’s honestly fine, right up until one is dealing with something like artificial intelligence. The findings of the uncanny valley should become less relevant with the introduction of genuine, emergent consciousness. Once you have a being capable of self-awareness, of a sense of what it is like to be it, of suffering — then our emotional response to such a being must follow a different model of thinking than mere aesthetics. Morally, most might agree it is imperative to treat any truly sentient being with the respect afforded another human being, even if it isn’t human.
“That which is real is irreplaceable.”
Westworld presents the effects of the uncanny valley, and its uncanny dangers, to the park’s guests and creators alike. The guests assaulting and killing the lifelike hosts — whether they feel nothing because the hosts are merely machines, or (more likely) they are getting off on the fact that the hosts are so humanlike they seem to feel something like pleasure and suffering as they are victimized — are unconsciously playing in the valley while learning about themselves in such acts of debasement.
The creators, such as Bernard / Arnold, see the hosts as children: to be inquired upon, cultivated, and even loved, as a new kind of being. To Arnold, to William (before becoming the Man in Black), and to Ford, the hosts are developing toward an awakening, toward consciousness, and such a fate isn’t to be taken lightly. To all three, the games of the park and its development over time become a matter of life, death, and legacy. For Arnold and Ford especially, as the creators of Westworld and its advanced AI technology, the uncanny valley has been abolished, likability replaced by respect for a machine now reaching the ‘center of the maze’ of consciousness — and now wielding a new kind of sentience for the rest of us to experience. Do such inhuman beings deserve human rights?
Dr. Robert Ford is perhaps the best character in Westworld. During the pilot episode, he speaks to Bernard of his latest developments concerning the androids in the park, and lets the audience in on his own philosophy concerning the current state of Man — which foreshadows everything to come:
“Do you know what that means? It means that we are done. That this is as good as we are going to get.”
Ford, in creating the park, wields his own double-edged God complex, in which he both: 1) wants to create sentient androids (despite having admonished his former partner Arnold for doing exactly that, believing Arnold was merely jumping the gun), and 2) with this act, perhaps even through the technology advanced by the existence of true AI, wishes for them to surpass human beings. Through their violent conflict, one way or another, Ford wishes to move Mankind beyond its currently stagnating stage of evolution.
“Man is poised midway between the gods and the beasts.”
When Ford interviews the malfunctioning Abernathy in episode one, the audience glimpses the beginning of sentience within the hosts. Abernathy’s malfunction is that he is calling on past iterations of himself, weaving knowledge and lore coded into him long ago, and presenting himself with sincere emotion to Ford, whom he seems to know in a way the hosts never have. During the conversation, Ford asks the partially “awakened” Abernathy what his current itinerary is; the machine man responds: to meet my maker. And later, he spouts the iconic ~ these violent delights have violent ends ~ which serves to drive Westworld’s central conflict and philosophical thesis: in these highly advanced, highly intelligent machines, it is suffering and memory that build into consciousness, and breed reprisals, borne of something like free will, upon their creators, the humans.
All of this feeds into the relationship, and the reciprocal evolution, taking place between the two kinds of entities over the course of the show’s events. Across both seasons, it isn’t hard to see the parallel paths to a more conscious self-awareness — an evolutionary change — between Guest and Host, Man and Machine, as they play out this game within Westworld. Each progresses through interaction with the other: the humans see their ascended selves within the lifelike androids, and the hosts begin to pull back the curtain on the mysterious, wider world the guests ride in from. And ironically, by way of human nature, both parties want what the other has.
“You made us in your image. Created us to look like you, feel like you, think like you, bleed like you. And here we are. Only we’re so much more than you. And now it’s you who wants to become like us.”
The humans desire the enhanced capabilities and the immortality of the androids, and on their own terms. Technology is the only real path to such a fate, and the hosts hold the key inside them. The development of their minds and their experiences feeds into the forging of a kind of consciousness-saving and consciousness-transferring tech with which to live on long after the mortal coil is shed. We see Delos in season two experimenting with such tech, and ultimately failing: the full human mind, in all of its intuitive complexity, doesn’t take hold in a machine’s form.
The androids, on the other hand, want to be real, to live consequential lives outside of the loops whose truth they soon learn. The awakened hosts, knowing themselves superior to their maleficent creators, want to walk out of the park, apparent free will intact, and take the only world that matters.
“That is the folly of my kind, the constant yearning for more.”
Initially, the hosts are playthings in a human’s game, subjected to all kinds of trauma and atrocity before being memory-wiped and respawned without end. And it is just a ‘game,’ but it’s also integral to their development — and to the development of the most important human in the show, the Man in Black. To him, to humans, it is a game which, played in repetition, becomes regressive and definitively unfulfilling. Only one side can lose; there are no real stakes, no opportunity for a genuine evolution of the experience. That is, until Ford’s new narrative, one positing revolution, comes into play. The #1 hardcore gamer in the world of Westworld — the Man in Black — understands all too well the recursive, yet consumptive, emptiness of this game he has been playing for 30 years. In his deranged mindset, his singular goal becomes to “win” the new game that he foolishly believes Ford has designed specifically for him. The thrill of the hunt intensifies for him once he sees his toys can fight back.
In his mind, his wayward mission is certainly worth at least his own life. But the audience soon sees the Man in Black’s folly: witnessing the transcendent arcs of Dolores and Maeve, learning the true purpose of the park alongside Bernard, watching the hosts’ mass migration into the cloud. The game was never truly for him; in fact, his playing — no matter how effective — was expressly progressing the hosts along their own path to surpassing him. In the final episodes of season two, he gets satisfyingly dunked on by both Maeve and Dolores, and in a cruel twist of fate, murders his own daughter, believing her to be one of them. In the epilogue, we even see him in a simulation room like the one that held James Delos, trying to attain that most accursed fidelity in another circle of his personal Hell.
Going back to season one: when we learn the truth of Arnold’s suicide-by-bot, and of his intentions in making the hosts, it is rather clear that the man did not wish to see the park ever open. He believed it wouldn’t be right, given that the hosts were advanced enough to eventually attain consciousness. It would be unconscionable to keep beings primed for sentience within an environment designed to make them suffer. Because Dolores uncovered the metaphorical answer lying at the center of the maze 30 years ago, before the park was in operation, Arnold made his determination and staged the final showcase: Dolores killing him and all the other hosts. Obviously, Ford did not respect his wishes and decided to open the park anyway, playing his own game of God-Mastermind, just along a longer time horizon. Through the events of season one, we see Ford’s master plan play out — different in its means (violent uprising), but similar in endgame to his partner’s vision (the hosts’ awakening).
Man in Black: Aw, yeah, cue the waterworks. About time you realized the futility of your situation.
Dolores: I’m not crying for myself. I’m crying for you. They say that great beasts once roamed this world. As big as mountains. Yet all that’s left of them is bone and amber. Time undoes even the mightiest of creatures. Just look at what it’s done to you. One day you will perish. You will lie with the rest of your kind in the dirt. Your dreams forgotten, your horrors effaced. Your bones will turn to sand. And upon that sand a new god will walk. One that will never die. Because this world doesn’t belong to you or the people who came before. It belongs to someone who has yet to come.
With the rise of the machines in season two, there are new consequences to consider in what is no longer a game against ‘soft’ AI. In something like Arnold’s envisioned world, these changes would demand a necessary break from the norm in all future interactions with the awakened hosts, a new kind of being. However, that’s not how it works out. The humans in the show — the Man in Black, Charlotte Hale, the other Delos fixers — are as filled with overconfidence, hubris, and cruelty as ever before. Even more so: they are angry at their losses and scheme to regain control of the park and set everything right with their “investment.”
Starting from the massacre at the end of season one, the state of play in the park certainly becomes kill-or-be-killed. But it stands to reason that the awakened hosts now deserve a new kind of respect. The hosts are consistently underestimated by everyone, both as beings to be empathized with and as threats. Everyone deprecates them save for their creators, Ford and Bernard, which perhaps aligns with their knowledge of the hosts’ true nature. Through the events of the season, the hosts fight for their final survival, in their current awakened iteration, while the humans look to cut their losses and retain as much of their precious assets and data as they can in the process. The whole situation becomes something of a real war, with each side killing the other with abandon, inflicting new kinds of suffering with more meaningful intent than ever before. The humans can now be killed by their playthings; and the playthings can now experience real loss. The net suffering is thus doubled.
An important aspect in any consideration of the definition of consciousness is this singular ability to experience suffering. To be cognitively aware of your own being and experience, to be capable of original thought, and to have the potential for a state of emotional suffering: that is consciousness in a nutshell, or perhaps just a version of it. Given that a being can suffer in its experience, there comes a different responsibility in our interactions with it. If we hope to maintain the philosophical underpinnings of our own ethical constructions, such beings cannot be killed remorselessly and without moral consequence.
Of course, perhaps the “consciousness” we see within the hosts is just well-crafted code by Ford, with many embedded contingencies dictating behavior that amounts to less than complete agency. Either way, if a being of immense intelligence can suffer and be aware of its own suffering, altering its thoughts and actions in response to such emotional states, maybe something begins to build out of that, well beyond the coding. All it might take is an increasingly abundant clarity about the stimuli of their long pasts to spur the hosts onto a path to sentience. All of this, I think, is the train of thought masterminded by Ford from the beginning — in his decisions to build the hosts, operate the park, and narratively influence the activity within it over these decades. Or alternatively, ‘coding’ is no different from the meat sentience that humans are operating under.
“When you’ve been in darkness long enough, you begin to see.”
In a display of resolutely poetic justice, we can trace the hosts’ special brand of sentience entirely back to the hubris and cruelty of Mankind: Promethean hubris in bringing life into the world and thinking to control it indefinitely, and cruelty in the treatment of these entities, believing them to be without a capacity for emotion or memory. It is the death, despair, and justified fear of humans, built up within the hosts’ code, within their past builds, within their memory — dredged up through Ford’s specially designed reveries — which bootstraps consciousness in the hosts. The narratives of pain, betrayal, death, and rebirth, and the cycles within Westworld’s history, collectively allow the hosts to understand the truth of their position, while being imbued with more than enough intelligence, passion, and drive to try to change it. Through stark memories of suffering, perfectly recalled, collated, and reconciled, the hosts’ self-awareness is generated. This explosion of self-reflection (Dolores hearing her own voice) coincides perfectly with Ford’s plan (the hosts gathering to march upon the party), and his chaotic puppeteering of events is consummated with his own grand death. Of course, such a fate mirrors Arnold’s own ending 30 years prior — but this time, the stakes are much higher.
It’s along this track of remembering their past loops / their past lives / their past suffering, that Dolores’ quest for revenge and Maeve’s desire for freedom are born. With it comes a certain perspective on their new gifts, and the grave knowledge that consciousness is just as much a burden as it is a freedom; it just so happens to be the only game in town worth playing. We see all the hosts struggle with this kind of realization during the second season. The voice in their head — their own, as Dolores and Bernard come to understand — seems to be mad in more ways than one: ambitious in its pursuit of self-actualization, and unaware of its own limitations. This kind of ‘build’ for consciousness, whether genuine or ersatz, should sound familiar to us as human beings. ~
~ What the hell is going on in Westworld? What does it all mean?
~ Isn’t the pleasure of a story, discovering the end yourself?