Friday’s film Ex Machina unexpectedly took me back to my Freshman Writing Seminar — a seminar about vegetarianism. On the surface, the two don’t seem to connect. But in the film, Ava (the AI) gets help from the main character by generating empathy in him: she becomes emotional about her situation, trapped as she is in her creator’s house. Yet when she is finally freed, her empathy seems to vanish, and she leaves the main character trapped in the house just as she was. So the question becomes: did she actually feel emotions, or was she just mimicking what human emotion looks like? This is where my FWS comes in. In that class we debated whether animals had feelings, and whether they deserved empathy and rights. The movie pushed that questioning one step further. If we create an AI that feels the way we do, what really differentiates us from machines? Why would we be more deserving of rights if we all act and feel the same? And how would we be able to tell whether they could actually feel, or were just more perfect pretenders than humans are? I’ve concluded that, just as with animals, we can’t really know for sure. And that is a scary thought.
Westworld and the movie Her explore similar themes. It’s always interesting to consider: when a machine becomes so similar to a person, should it be treated as a person, even if we don’t know whether it has any form of consciousness? This problem would be a lot less frustrating and philosophical if science could figure out what exactly consciousness is, but so far we still don’t know. My personal opinion is that if there is any chance a machine feels human emotions, you shouldn’t mistreat it — but that principle breaks down when you consider the small chance that your computer is conscious. Even a smart fridge might be conscious; we just can’t tell. Hopefully, if we do have cool robots in the future, we treat them well.