There's a moment in the third act of Westworld in which Peter finds a woman after the robots have gone... wrong. He believes she's a guest who has been injured because, unlike so many others nearby, she is semiconscious (and chained up). Then he gets her some water and she shorts out. Turns out she's a robot.
The fact that the robots cannot be distinguished from the guests (except by their hands) is important. And, it makes me wonder if there isn't some greater similarity between the two. I'm reminded of the Architect in The Matrix Reloaded...
[The Oracle] stumbled upon a solution whereby nearly ninety-nine percent of all test subjects accepted the program, as long as they were given a choice, even if they were only aware of the choice at a near unconscious level.
I'm reminded of The Purge, which, if you haven't seen it, a) isn't that great a movie, as it ends up being a rather straightforward home invasion story, but b) is based on an imagined future America in which, once a year, the law is abandoned for 24 hours. The following comes from the promotional website for The Purge: Anarchy:
It is a night that, year after year, has saved our country--and it is good.
It is normal to feel violence and rage, and the Purge has established a lawful and healthy outlet for those feelings.
For the record, that's from the fake page explaining, "Talking to Kids About the Purge." It's an attempt to explain how giving in to your impulses for that one day is entirely a good thing, something that keeps society going. In The Matrix, the equivalent is that some people know the world is artificial and choose to leave it; that element keeps the system going. Similarly, the premise of the Purge films suggests that allowing people to indulge their fantasies and act outside the parameters of law and society on a regular basis will keep things running smoothly otherwise.
Whether or not society works this way is open for debate. Tangentially, I've got to wonder, are minor perversions allowable to maintain a central tendency toward normalcy? That is, as long as we generally subscribe to and act within the parameters set forth by society around us, does it really matter what abnormal shit interests us, say, in the bedroom? Or when we're alone with the internet.
If we had Delos, would we be better off?
(Assuming it were cheaper to get in, of course.)
That is just one piece of this tangent, whether or not an outlet like Delos would be good for us. I heard from a therapist recently that those who practice cutting (deliberate self-harm) are less likely to commit suicide than others who may suffer from similar causative ailments (e.g., depression, anxiety). I wonder if being allowed to act out our fantasies helps us be more productive citizens, as it were. As long as we clearly separate fantasy from reality. I was reading about BDSM less recently, and there's a passage in The New Bottoming Book by Easton and Hardy (2001) that I'd like to cite. It reads:
The common thread that seems to run through this section is that fantasy is not reality. Good players learn to handle reality first and use it as a foundation on which to build really hot fantasies. When you confuse fantasies with reality, you distance yourself from your power...
It helps to be conscious of the boundary between "scene space" or "in the game" and out. (pp. 28-29)
The second piece of the tangent is this: do these robots have free will? Do we? In Cognitive Technologies and the Pragmatics of Cognition, Dror (2007) makes a distinction based on goals and the ownership thereof.
(I write this having just noticed that the technicians in the control room are all unconscious when Peter arrives... did they suffocate because of the ventilation problems, or are they robots? On a literal level, I think we're supposed to accept that they suffocated because they are underground and the ventilation system failed. But they look just like the various robots during maintenance time, slouched over their desks. Essentially, this is a robot world run by robots for a handful of (presumably) non-robotic guests. The fact that telling them apart is difficult is important, as I said above; it suggests that there is a very thin line between the two. If the technicians were demonstrating concern for the guests earlier--and they were--are they really less human than Peter and John, who made the deliberate choice to participate in simulated murder and debauchery?
The technicians are not robots, of course. But this movie gets me thinking about the fundamental differences between robots and, well, humans. Do we have free will, or do our decisions just come from a complicated process built on biological predilections and a history of other interactions that have fated us to make certain choices in certain situations? We like to pretend we've got choices to make, but those choices are just dictated by society, by biology, by life, the universe and everything... Let's get out of this parenthetical.)
Dror argues that robotic ownership of goals "raises questions about the integration between body, control system, and the aims of the actions undertaken by the system." Pretend all you want that you are an ensouled being with free will and actual choices to make; you're just a collection of biological circuits, neurons firing based on years of patterned responses to stimuli. But, Dror continues:
Even if I don't freely choose my goals, the goals I pursue are mine. My goals are important and intrinsically connected to me. Certain goals I do not pursue because others impose them upon me or ask me to achieve them, but because they matter to me. What makes goals belong to a system? How are they grounded in the system? (p. 68)
Complicated systems are awesome.
(And the movie ended like half an hour ago.)
In Westworld, we're told, the robots are "highly complicated pieces of equipment, almost as complicated as living organisms. In some cases, they've been designed by other computers. We don't know exactly how they work." Not much different from a lot of our understandings of life, of each other...