Tuesday, February 21, 2017

Determinists' Delusion

Before I get to this topic I want to apologize: it has been over a year since my last entry (I have been busy, but that's a lame excuse). Before I get into the meat of the issue, I want to talk about what started me thinking about it. I got into a short discussion on Facebook about the HBO show Westworld and the ramifications of artificial intelligence (AI) (I have since also started watching the British TV show Humans). Both shows deal with AI and how humans would interact with very human-like robots.

The premise of the show Westworld is complicated, and I won't give away any spoilers. Essentially, a group of people started a theme park (only one of the company's founders appears to still be alive; that is somewhat of a mystery). At this theme park, instead of riding roller coasters, you get to experience a fully authentic Old West town (and surrounding area). The visitors to the park are free to do or say anything they can imagine to anyone and anything, including killing and raping. The park is set in the future (obviously, since we don't have human-like robots today), and the park's technology keeps the visitors safe. The "hosts" (the term the show uses for the robots) can fight back, and they're programmed to behave in the typical manner of an Old West town, but they are incapable of hurting the visitors. The "hosts" can try to shoot the visitors, but the bullets don't really hurt the humans (they sting a little), and if a "host" tries to stab or otherwise seriously harm a visitor, his primary programming stops him and he freezes (usually). I can't say that I actually recommend the show ... I wish HBO hadn't made it. It intentionally and brazenly uses nudity/sex/violence/etc. to sell the show to a crass, sex-addled society. I understand the nudity (in a sense): when the "hosts" are in the shop for "repair," they never wear clothing (unless it's to practice a scene from the theme park). So, there are lots of scenes with lots of naked "people" in them. There is also a wide variety of sensuous and violent scenes, including rape scenes and scenes depicting the slaughter of random "people"/"hosts" (I don't recall any scenes directly displaying the slaughter of children). What often happens is that visitors come to the park and live out their wildest fantasies.
It appears (the non-violent characters are not highlighted) that some visitors just come to the park to have fun, explore, play games, get their picture taken (there is an old-timey photographer often shown in the background), and leave. But, of course, the show revolves around several characters living out their darkest, vilest, cruelest dreams in the park.

The show Humans (technically the 'a' is upside down) is a much less dark show (so far; I'm only on episode three of season one). There is certainly much less nudity; the "synths" are nude in some scenes, but they're mostly covered or clothed throughout the show (the nudity has not been full-frontal, as it is several times in Westworld). However, the show tackles similar philosophical and existential issues with AI. It seems, so far, that the theme is that some of these "synths" are gaining full consciousness, or at least seem to be (what does it even mean to have full consciousness?). And, in a sense, some of them seem to be rebelling against their human masters. There are some clear displays of how "synths" are enslaved (some even in the sex industry), but mostly they are just unpaid, unfeeling, slavish (though most often not mistreated) household servants. For example, the primary family characters bought their "synth" to help around the house because the mother was often away on business, the father also worked outside the home, and neither of them had time for domestic issues like cooking or cleaning. So, they bought a "synth" to cook and clean for them and to free them up to spend time together as a family. There are a couple of side-plots that haven't fully developed, but apparently "synths" are hackable, and some have gained a certain level of independence/consciousness. They're governed by something similar to Isaac Asimov's "Three Laws of Robotics":
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
It isn't overtly stated that they must obey these laws (the Will Smith movie I, Robot makes these laws a quintessential part of its plot, but the writers of Humans are more subtle).
The "synths" aren't permitted to touch a human unless given direct, explicit permission. Also, and this comes up in the plot, children are not permitted to be "primary users" and may not be touched by the "synths" unless an adult/"primary user" gives express permission. Anyways ... I prattle on. These two shows illustrate human depravity like few others I've ever seen. But all of that is just the precursor to what I really want to talk about: something that Sam Harris and Paul Bloom discussed in this podcast.

Sam Harris is (in)famous as one of the "four horsemen of atheism" and is an outspoken author and speaker on the subject of atheism. Even in this podcast, which isn't really about atheism/theism, Harris seems unable to stop himself from deriding the Christian God. He says that God is the worst possible "mind criminal" (a term from an author he references in the podcast). He says this because God created minds and then tortures them in Hell (I believe he may have phrased it, "consigns them to torture in Hell"). Nevertheless, the discussion is very interesting. They talk about empathy and compassion with regard to how we will or should treat (future) AI (I'm not going to keep calling these "robots" by the monikers used in the TV shows; I'm just going to call them "AI"). The biggest thing that hit me in the whole conversation wasn't directly about AI. In the course of the conversation, Harris insisted (without much argument or support) that though determinism leads us to less anger when people do evil, it doesn't lead us to less desire for, or love of, empathy/compassion (those are not the same, but in this discussion we can treat them together). This inexplicable insistence leads me to think that Harris (and presumably his guest) is deluding himself. Let me put this more straightforwardly. Determinism is the idea that everything, and I mean everything, is determined. There is no escape. This is not the "nature versus nurture" argument. This is a combination of nature and nurture that says, essentially, that given the combination of one's genes and upbringing, one will behave exactly the way one behaves. So we cannot blame the despot, because he was programmed to be despotic by his genes and upbringing. If the despot had been born with a different genetic makeup he would be different, and if he had been born to different parents, he would be different.
Regardless, one cannot change one's genes or upbringing, so we shouldn't be mad at the mass murderer because he's just acting out his genes and upbringing. People are just complicated biological machines.

On the other hand, though, Harris insists that we can praise the virtuous person. How does this make sense? He says that we can praise this person because she chooses to care about the wellbeing of others. But wait: doesn't determinism mean that a person is merely the sum total of his/her genes and upbringing? Then there's not really any such thing as "choosing" at all. Choosing requires an agent, a person, but determinism destroys personhood. The bad person is just being bad because his genes and upbringing make him so, just as the good person is just being good because her genes and upbringing make her so. Harris phrases it something like this: people with compassion want the best for others. I heard him say, "With loving kindness, you really just ... you want that person to be happy." Isn't "wanting something" a choice? He says that it's different in two ways: first, because in hating a bad man we're assigning ultimate agency to the man, and under determinism there is no ultimate agency; but loving kindness doesn't ascribe ultimate agency to the good person, it's just wanting that person to be happy. Well, no offense to Harris, but his logic is clearly flawed here. The statement may technically be true: loving kindness doesn't claim that the good person you're loving has ultimate agency. However, what he's clearly ignoring (perhaps unintentionally) is that his idea of loving kindness ascribes ultimate agency to the person doing the loving kindness. Also, the good deeds for which Harris says one might show someone loving kindness require that he/she have ultimate agency (despite Harris' claim otherwise). Harris is deluding himself if he thinks he can have his cake and eat it too. Either determinism absolves both bad and good behavior, or the determinists are just lying about their insistence on determinism. Harris (and many, many others) only really claims determinism in a way that suits his preconceived notions.