I am now on my fourth episode of Westworld, and the show never fails to captivate me. It continues to enthrall its viewers with superb acting, intriguing storylines, and mystery.
One way this show keeps its viewers captivated is by weaving difficult questions into its themes, questions that make us examine our morals, our beliefs, and humanity's future. Is it right for a theme park to exist where people can act out their darkest impulses? If the humanlike droids don't "feel," does that make assaulting, raping, or killing them acceptable? Is an act committed against a droid in the park of the same moral character as committing it in real life? The show continues to pose these questions without offering answers. Ultimately, it is up to us to decide.
A theme Westworld addresses in the third episode (and clearly will throughout the show) is whether machines can "feel." While this may seem cliché for a show about robotic humans, the way it poses the question is made all the more interesting by the acts the humans commit against these robots. The writers appear to leave it to us to form our own belief, but their argument seems to be "it depends." Inconclusive, clearly, but for good reason. The robots seem to be "learning" human emotions over the course of the show, but is that only because of programming created by the humans themselves? Is it really original? These questions form the basis of an argument the writers leave purposefully vague so that we can decide for ourselves.
The show develops this argument through character interactions. We repeatedly see the head programmer interact with one droid in particular, who begins to show signs of human behavior. This is our introduction to the conflict, as we weigh whether this droid is capable of having these emotions. The argument is again left inconclusive: other characters insist droids cannot feel, while the programmer grows increasingly attached. If I were to make a prediction, I believe this theme will be a major part of the show. Since the premise involves humans committing immoral acts against droids, the question of the morality of those acts against potentially "feeling" machines will keep arising. Surely the show will continue to explore this complicated theme, and as we move into a world of more advanced technology, it seems all the more relevant.