This is a recap of episode 2 of Westworld, ‘Chestnut.’ SPOILERS BELOW!


I like this show but the title sequence is bad

It’s cheesy. The robot hands playing the piano keys in time with the theme song give me shudders every time. So far, they’ve also shown the player piano within each episode. Showing it only within the episode would be a subtle touch of symbolism, but coupled with its appearance in the title sequence, it feels way too heavy-handed to me.

I think I’m in the minority here based on what I’ve read online, so I won’t dwell on it, but I just had to put this out there. Okay, now to the show!


We start off with some more Donnie Darko vibes


Much like when Donnie wakes up to a mysterious voice commanding him to “wake up,” get out of his bed and walk somewhere (in that case, in order to save his life and perhaps to save the world?), this ep starts out with Dolores being commanded to “Wake up, Dolores” in the middle of the night. Did it sound just a little bit like a modified Bernard voice to anyone else? “Do you remember?” the voice asks.

Episode 1 already had some Donnie Darko vibes with the Grandma Death style whisper of Peter Abernathy saying to Dolores, “These violent delights have violent ends.”


We’ll see whether this is intentional or, more likely, just some happily similar cinematic choices.


New major character alert: William (played by Jimmi Simpson)


We first see William sleeping on a train (or maybe a hovercraft?) as he arrives at the Westworld amusement park with his co-worker (not friend) and repeat visitor Logan.

Here are two questions we should be asking ourselves throughout the series:

1) Is this robot a human?
2) Is this human a robot?

William wakes up on a train in a very similar fashion to the way Teddy wakes up on a train at the start of each of his days. That said, I don’t think William is the strongest candidate for human-who-is-actually-a-robot. More on who I do think is a better candidate below.


William and Logan engage in some “locker room talk” to show that Logan is an asshole, but in case that wasn’t clear, William wears white and Logan wears black


I’m wearing a white shirt because I’m a good guy and he’s wearing a black shirt because he’s a bad guy.

The first words out of Logan’s mouth are a comment on a woman’s appearance in a Trumpish way, which makes us fairly confident that this character is a bad guy.

Logan then says what he thinks is the point of visiting Westworld:

William: You’re being an asshole.
Logan: No, I’m being myself, which was the whole point of this trip.

So, one way that visitors see Westworld is as a place where they can finally be their true selves, a world where they can commit crimes and indulge in any manner of debauchery without consequence.

This is in contrast to two opposing views of the point of Westworld put forward later in the episode…

Head of narrative Lee Sizemore: the point is to find out who you really are.

Father of Westworld Dr. Ford: people know who they are; the point is to find out what better version of themselves they might become.

Also, William chooses a white hat and Logan chooses a black one. My friend Nathan Maggio theorizes that this is intentionally a reference to the computer hacking terminology surrounding white hat vs black hat hackers, which seems plausible given that Westworld is basically an elaborate computer system.


Uh oh, might be a COMPUTER VIRUS spreading!


Programmer Elsie Hughes once again invokes the phrase “dissonant episode” to refer to the existential crisis / brain meltdown that Abernathy had in the first episode.

If this is not just a dissonant episode, then whatever Abernathy had … could be contagious.

— Elsie Hughes, s01 ep02

AKA a computer virus!

When the movie that this TV show is based on first came out in 1973, the idea of a computer virus was a new concept and probably a hard metaphor for folks to wrap their brains around (although the underlying idea of self-replicating programs goes back as far as 1949, in this paper). Nowadays, though, the idea of malware infecting one robot and then spreading to other robots connected to the same system does not seem so shocking to us.
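To make Elsie’s worry concrete, you can think of the hosts as nodes in a network and the “dissonant episode” as something that propagates along their interactions. Here’s a minimal sketch of that idea in Python – the host names, the interaction graph, and the spread rule are all my own made-up illustration, not anything established by the show:

```python
# Toy model of a "dissonant episode" spreading between connected hosts.
# Everything here (names, graph, spread rule) is hypothetical illustration.
from collections import deque

# Who talks to whom during one loop of the park's narrative
interactions = {
    "Abernathy": ["Dolores"],
    "Dolores": ["Maeve", "Teddy"],
    "Maeve": ["Clementine"],
    "Teddy": [],
    "Clementine": [],
}

def spread(patient_zero):
    """Breadth-first spread: any host who interacts with an 'infected'
    host becomes infected too, like malware moving through one system."""
    infected = {patient_zero}
    queue = deque([patient_zero])
    while queue:
        host = queue.popleft()
        for other in interactions.get(host, []):
            if other not in infected:
                infected.add(other)
                queue.append(other)
    return infected

print(spread("Abernathy"))  # all five hosts end up "infected"
```

One whispered line from Abernathy to Dolores, one from Dolores to Maeve, and the whole connected system is compromised – which is exactly why Elsie’s “could be contagious” line is so ominous.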

Bernard brushes off Elsie’s concern, but then later in the episode goes to talk to Dolores, which leads us to believe Bernard is hiding something. Speaking of Dolores…


Dolores remembers some messed up shit


While following the familiar steps of her daily routine, Dolores once again hears the strange voice, this time compelling her to “Remember.” We then see a street littered with corpses as sinister music swells and screams echo, giving us the feeling that something very bad has happened.

Dolores bumps into Maeve and tells her the “violent delights” line. Will this scary game of robot-telephone be how the “dissonant episode” is spread? It seems possible, since following this utterance, Maeve starts to remember some troubling images of her own.


The Man In Black carries out some more ruthless murders on his quest to get closer to the “deeper level” of the game AND SECURITY KNOWS ABOUT IT!


The maze! The very gross maze.

The Man In Black saves a criminal named Lawrence from being hanged, only to take him to his hometown, kill his cousins and his wife, and threaten to kill his daughter in order to get information about the deeper level of the game.

We also learn the answer to a question that had been nagging at me during the first episode: does park security know about the violent spree that the Man In Black is carrying out? Turns out, yes. Not only that, but head of security Stubbs tells one of his lackeys, “That gentleman gets whatever he wants.”

Just when his carnage seems about to culminate in murdering a man’s daughter right in front of him, the little girl speaks in a voice not her own, more like that of an omniscient narrator or oracle:

Lawrence’s daughter: The maze isn’t meant for you.
The Man in Black: What did I tell you, Lawrence? Always another level. I’ll take my chances, sweetheart.
Lawrence’s daughter: Follow the blood arroyo to the place where the snake lays its eggs.

The Man in Black then leaves town dragging along Lawrence, who so far doesn’t know about the maze, and leaves behind the only person who has told him about the maze, the little girl. This doesn’t make a whole lot of sense to me, but hey, he seems to know what he is doing.


Radiohead’s “No Surprises” player piano version interlude


From the album OK Computer:

A heart that’s full up like a landfill
A job that slowly kills you
Bruises that won’t heal
You look so tired, unhappy
Bring down the government
They don’t, they don’t speak for us
I’ll take a quiet life
A handshake of carbon monoxide

With no alarms and no surprises
No alarms and no surprises
No alarms and no surprises
Silent, silent

This is my final fit
My final bellyache

With no alarms and no surprises
No alarms and no surprises
No alarms and no surprises, please

Such a pretty house
And such a pretty garden

No alarms and no surprises
No alarms and no surprises
No alarms and no surprises, please


Maeve practices her speech and has some nightmares


So far, Westworld is a show that deals with repetition as a theme: iterated instances of the same events that take on new meaning – or don’t – when they happen again.

In the first episode, this manifested as Dolores living out the same day with slightly different chaotic variations, leading to vastly different paths and outcomes.

In this episode, the theme of repetition is carried on by Maeve as she gives the same speech over and over to different clients with different results, influenced both by the nightmares that torture her of being scalped by the Man In Black, and by the folks in Narrative and Programming messing with her cognition (first turning up her aggression, then resetting it and instead turning up her mental acuity, with better results). Her speech:

You can hear it, can’t you? That little voice. The one that’s telling you “don’t.” Don’t stare too long. Don’t touch. Don’t do anything you might regret. I used to be the same. Whenever I wanted something, I could hear that voice telling me to stop, to be careful, to leave most of my life unlived. You know the only place that voice left me alone? In my dreams. I was free. I could be as good or as bad as I felt like being. And if I wanted something, I could just reach out and take it. But then I would wake up, and the voice would start all over again. So I ran away. Crossed the shining sea. And then when I finally set foot back on solid ground, the first thing I heard was that goddamned voice. Do you know what it said? It said, “This is the new world, and in this world, you can be whoever the fuck you want.”


Writer Sizemore does not like creator Ford


Sizemore is a little shit of a writer who wants to create a new storyline and to “clear out the dead weight,” which to him includes both the old robots that Ford designed and the old storylines he finds boring compared to his own titillating cheap thrills.

When he presents his “Odyssey on Red River” to Dr. Ford for approval, Ford says no.

As echoed earlier by the Man in Black himself, Dr. Ford explains to Sizemore that it isn’t the “parlor tricks” of cheap emotional thrills that people love about visiting the park, but rather the little details.

The guests don’t return for the obvious things we do. The garish things. They come back because of the subtleties, the details. They come back because they discover something they imagine no one had ever noticed before. Something they fall in love with. They’re not looking for a story that tells them who they are. They already know who they are. They’re here because they want a glimpse of who they could be.

— Dr. Ford, s01 ep02

“You can’t play God without being acquainted with the devil.”


Some very nice visual imagery here of the struggle between good and evil inside the creator of Westworld, Dr. Ford. Thanks to my friend Nathan Maggio for pointing it out!


Look at all that milk!


Robots getting made in a milk bath, y'all.


Is Bernard a robot?


Look, it seems likely just from a storytelling perspective that one of these humans is a robot. As William’s host tells us earlier in the episode, “If you can’t tell the difference, does it matter?”

Bernard is analytical and likes to study what the humans behind the scenes do when they feel different emotions. It would be useful to have the artificial intelligence of a robot helping to program the other robots, and this is a basic prediction about A.I.: as computers get smarter, they’ll be able to program themselves and other computers, and eventually to program with such sophistication that we will cease to be able to understand their creations with our human brains.

Furthermore, later in the episode, Theresa, who runs Quality Assurance, sleeps with Bernard – is she possibly using him for his body, the same way that visitors use the hosts in the park?

Oh, yeah, and this scene with Dr. Ford starts with him telling Bernard, “I know how that head of yours works.” A colloquial turn of phrase, or a very specific and literal indication that he designed Bernard’s brain? As Dr. Ford and the Man in Black both wax poetic about Westworld the theme park, every single detail is there for a reason and adds up to something – doesn’t it make sense that Westworld the TV show would try to do the same thing?


Bernard also tells us an important piece of information: the robots talk to each other even when no humans are around in order to practice.

Bernard: They’re always trying to error correct, make themselves more human. When they talk to each other, it’s a way of practicing.
Theresa: Is that what you’re doing now? Practicing?


Occam’s Razor reference


From Wikipedia:

“Occam’s [razor](https://en.wikipedia.org/wiki/Razor_(philosophy)) is a problem-solving principle attributed to William of Ockham (c. 1287–1347), who was an English Franciscan friar, scholastic philosopher and theologian. The principle can be interpreted as stating: among competing hypotheses, the one with the fewest assumptions should be selected.”

Or as it is usually used: the simplest solution is usually the right one.

Dr. Ford follows up by saying that Westworld is complicated – a complex system. Those who study failures in complex systems tell us that while our simple human brains usually want a single explanation (the airplane crashed because the pilot was tired, for example), in reality it is often a large number of failures of various sizes, at various points in the system, that lead to the final failure (the airplane crashed because the autopilot behaved in a way the co-pilot was unfamiliar with because he was trained on a similar but different system, there was a weather pattern unusual for that time of year, the airplane was close to being decommissioned but wasn’t because of budget cuts related to a change in airplane-subsidy policy years before, and the pilot was tired).


Hieronymus Bosch reference


Dropped by Sizemore when explaining his new storyline. Bosch is known for his detailed and terrifying landscapes of hell. Sizemore says that the horror he creates will make Bosch seem tame. The ability of humans to create their own hell on Earth seems to be the point here.


Westworld as a video game vs Westworld as an amusement park


Westworld seems to be presented to, and understood by, its visitors as an adult amusement park where they can come experience violent and sexual thrills without consequence. However, it can also be understood as a video game.

For one, the imagery of the security and QA folks monitoring Westworld has very much a video game feel, specifically something like The Sims. Also, the idea of bumping up someone’s aggression versus their mental acuity, as happened with Maeve, will be familiar to anyone who has ever designed a character in a video game and had to decide how to spend attribute points.
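If you’ve never played a game with a character sheet, here’s roughly what that kind of tinkering looks like. A minimal sketch, purely for illustration – the attribute names, the 0–20 scale, and the numbers are my own invention, not values shown on screen:

```python
# Hypothetical host "character sheet" with bounded attribute sliders,
# loosely modeling what the Behavior techs do to Maeve in this episode.
from dataclasses import dataclass

@dataclass
class HostAttributes:
    # Each attribute is a slider from 0 to 20, like points in an RPG build.
    aggression: int = 10
    mental_acuity: int = 10
    charm: int = 10

    def adjust(self, **changes):
        """Bump attributes up or down, clamped to the 0-20 range."""
        for name, delta in changes.items():
            current = getattr(self, name)
            setattr(self, name, max(0, min(20, current + delta)))

maeve = HostAttributes()
maeve.adjust(aggression=+5)                    # first pass: crank the aggression
maeve.adjust(aggression=-5, mental_acuity=+4)  # then reset it and sharpen her instead
print(maeve)  # HostAttributes(aggression=10, mental_acuity=14, charm=10)
```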

More than the above, though, I’m reminded of video games in the way that guests must choose how to spend their time in the park. Westworld is essentially an MMORPG that you play inside of with your actual body. When Logan tells William to stay away from the treasure-hunt old man who invites him to go on a quest, it feels extremely reminiscent of a game like World of Warcraft, where the villagers in a town have floating exclamation points over their heads which indicate that if you click on them and start an interaction, you’ll have the option of carrying out the quest they ask of you for some reward. Or, as is your prerogative, you can move on and keep exploring.

The blurring of video games, amusement parks, and virtual reality is an excellent way to think about what the real future of entertainment will look like for us.


The violence in Westworld is hard to watch


You can write off the violence as par for the course for another HBO show, but let’s assume that it has a point. The juxtaposition of truly horrible acts of violence with human visitors reveling in the pain at the very least condemns the guests’ “rape and pillage.”

The scene of Logan stabbing the old treasure hunter in the hand is awful and his pain seems very real. It’s immediately followed by Logan sating his “new appetite” for sex with multiple partners at once, one of whom slaps him in the face, further blurring the line between violence and pleasure, the “violent delights” that the guests enjoy.


Meanwhile, William turns down Clementine’s advances, saying that he has “somebody real” waiting back home.

Clementine replies that she understands since “real love is always worth waiting for.”


Is the Man in Black a robot?


So far, we have only seen the Man in Black interact with hosts; we have not yet seen him hurt a human visitor. Since the hosts/robots in the park seem unable to hurt him, we naturally assume that he is a visitor. He has also said he has been coming there for 30 years, that he has paid to be there, and so on.

However, he also said that in a sense he was “born” in Westworld. Could he be a robot who has found a way to program himself to be invincible to the other robots? Clearly, there is some mechanism in place so that the hosts’ guns can’t hurt humans; it is not difficult to imagine that protection being applied to one of the robots, too.

Or perhaps he is a part of Dr. Ford’s latest creation – a robot who thinks he is a human visitor?


Dr. Ford talks to a little robot boy in the desert, tames a robot snake, and points to a robot church in the distance that is part of his latest and most original creation


What is Dr. Ford up to??? Does the corporate “board” know about it? Are they trying to shut him down? Does Dr. Ford’s latest storyline involve the Man in Black?

The episode ends with Bernard (wearing a grey hat?? Perhaps that would be reading too much into the colors…) and Dr. Ford looking out at his creation, so whatever Dr. Ford is up to, we know that at least Bernard is privy to it in some capacity.


Maeve wakes up from one nightmare into another one


Maeve uses a counting-down-from-three technique to wake herself from a nightmare about the Man in Black scalping her (which, as we were told earlier in the episode, is probably a memory, since most dreams are memories) – only to wake out of sleep mode in the middle of having surgery done on her by two goofs.

Maeve had an “MRSA in her abdomen” which was causing her physical discomfort. MRSA is a type of staph infection, so interestingly, the robots are able to get sick and experience pain. This might be further setting up the possibility of a literal “computer virus” spreading between the robots, or it might just be establishing how real and sophisticated these creations are.

But more to the point, Maeve wakes up and sees the corpses of her robot friends being cleaned and repaired, in a scene right out of a horror movie. With the new “reveries” software package, what effect will Maeve’s ability to remember this nightmare have on her and the rest of the world?


Oh yeah, Dolores finds a gun


That voice telling her to wake up and remember? It leads her to a spot in her yard where she buried a gun. It seems safe to assume that this is a special gun. Maybe a real gun that can be used to shoot real humans.


Memories are important to show co-creator Jonathan Nolan

Nolan wrote the short story (called Memento Mori) that the film Memento, which his brother Christopher made, was based on. That film, obviously, is about memory. Memory will also play an important role in Westworld. Memory, perhaps, is even being put forth as the essence of what it means to be a human. Being able to hold a memory – in particular, a memory of yourself – in your brain, to reflect on it, is the essential component of what separates man from beasts or computers. Animals follow their chemical reactions and instincts from one moment to the next; computer programs follow decision trees based on what they’ve been programmed to do (if this happens, do this); humans, in contrast, can reflect on what they’ve done. They can remember. Remembering is in some ways the core of meta-cognition.
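If you want the programmer’s version of that distinction, it’s roughly the difference between a stateless rule and an agent that keeps a record of its own past that it can re-read. A rough sketch, purely illustrative and not anything from the show:

```python
# Contrast between a memoryless decision rule and an agent that keeps
# a record of its own actions and can "reflect" on it. Purely illustrative.

def memoryless_policy(event):
    # "If this happens, do this" - no record of anything that came before.
    return {"stranger_arrives": "greet", "gunshot": "flee"}.get(event, "wait")

class RememberingAgent:
    def __init__(self):
        self.memory = []  # (event, action) pairs

    def act(self, event):
        action = memoryless_policy(event)  # same underlying rule...
        self.memory.append((event, action))
        return action

    def reflect(self):
        # ...but this agent can look back over what it has done.
        return f"I have acted {len(self.memory)} times; most recently: {self.memory[-1]}"

agent = RememberingAgent()
agent.act("stranger_arrives")
agent.act("gunshot")
print(agent.reflect())
```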

The ability to remember as a proxy for being human is an interesting way to conceive of humanity, and it will be satisfying to watch this concept further developed as the season goes on.

I have to believe in a world outside my own mind. I have to believe that my actions still have meaning, even if I can’t remember them. I have to believe that when my eyes are closed, the world’s still there. Do I believe the world’s still there? Is it still out there?… Yeah. We all need mirrors to remind ourselves who we are. I’m no different.

— Leonard Shelby

Looking ahead at future episodes, a question I want answered: how does time function in Westworld?

During the first episode, it seemed like the characters reset every single day. However, there are certain other storylines occurring where that doesn’t seem possible. Are visitors in the park for two weeks at a time, at which point they do a clean up and reset all the storylines? How often are the robots’ memories “wiped”?

If these time questions have been answered and I just haven’t seen it, and/or if you have your own theories, please let me know in the comments.


Thanks for reading and see you next week!
