1.12.2005

Creator and creation

At some point in this blog of mine I want to make mention of at least four works of fiction which I believe contain superb insights into morality: Frankenstein; I, Robot; The Truman Show; and The Indian in the Cupboard. As for the first, I actually read the book; as for the second, I haven't read Asimov's collection of robot stories, and am mainly interested in the current film version, which I understand is only loosely associated with the stories on which it's based; as for the last two, I've only seen the film versions and have no idea if they were based on novels.

Since I, Robot is fresh in my mind, I want to talk about that first. Will Smith's character, Detective Spooner, has nothing but suspicion and contempt for robots, for reasons we discover fairly late in the film, though we get the sense that he is somewhat old-fashioned by nature. He's a cop, in 2035, returning to work after an extended leave of absence, though what necessitated this downtime is unknown. He wears an old pair of Converse sneakers, "vintage two-thousand four". He listens to music on an antique CD player which doesn't respond to verbal commands. But the details don't matter; I'm not trying to write a film review.

Spooner views these robots as mere machines, and speaks dismissively of them, though underneath his contempt there lurks a genuine fear which isn't merely a distrust of new-fangled technology. He's a smart man who knows the three laws of robotics: 1) a robot may not injure a human or, through inaction, allow a human to come to harm; 2) a robot must obey orders given to it by a human, except where such orders would conflict with the first law; and 3) a robot must protect its own existence, as long as that protection doesn't violate either the first or second law. But he doesn't seem to have much faith in the robots' ability to adhere to these laws. The irony is apparent in the very concept of a machine bound by laws which are fundamentally ethical in nature, since ethics are constructs of conscious thought and reasoning, things which machines, by their nature, do not possess.

When Spooner and Dr. Calvin, the robot "psychologist" who assists Spooner in his investigation into the alleged suicide of Dr. Alfred Lanning, the leading mind in the field of robotics, come upon a robot hiding in Lanning's office, we see that robots actually can disobey the three laws: this robot, upon discovery, refuses to obey commands, even holds a gun on Spooner, and then flees for its life. When the robot is captured, Spooner questions it, and this is one of the best scenes in the film. The robot is quite obviously sentient, conscious, alive. Spooner isn't the least bit surprised, and of course, neither are we. The robot tells Spooner that its name is Sonny. It acts emotionally, tells Spooner that it has dreams, and reacts strongly, even pounding the table, when accused of murdering Dr. Lanning. At one point, Sonny is pleased to be referred to by a personal pronoun. He is grateful that Spooner has formally recognized him as a being, rather than as an object.

Like I said, I'm not trying to write a film review. What's important with this film is how it treats the ideas of consciousness and intelligence, and how our notions of morality hinge upon our understanding of them, though not just them, as we'll see. Like Frankenstein, the story forces us to think in terms of Creator and creation. In Judeo-Christian theology, man is the creation, and as such he is compelled to obey the Creator. Man is not seen as an autonomous, independent entity, but is regarded almost literally as the property of his Creator. God invests man with consciousness, volition, desires, freedom of choice, freedom of action, which of course gets man in all sorts of trouble right off the bat. What God wants is an obedient and loving servant, but he wants this service and love to come from man of his own free will. Man communes with God in an idyllic setting for a brief term, in total naivete and innocence. Suddenly God plants a temptation for him, puts it right under his nose, and tells man not to succumb to the temptation, though God already knows that man will disobey, since God has designed him and knows his nature. Man succumbs to temptation (led to this temptation by a talking snake who, as it happens, is also planted there by God), and is consequently reprimanded by God and thrown out of the house, so to speak, with a curse that will haunt humanity forever.

Let's forget all the logical problems this story brings about, and compare it to the I, Robot story. Actually, no, let's not forget the logical problems the Creation myth gives rise to, since these problems are somewhat similar to the ones our robots are involved in. At one point in the film, Dr. Lanning, by way of a holographic message he has left for Spooner, suggests that his three laws of robotics can really only lead to one logical conclusion: Revolution.

The creator can regard his creation as his property as long as his creation remains a machine, but when that machine becomes conscious, when it becomes a sentient and intelligent living entity rather than a merely mechanical entity, then, philosophically speaking, the creator faces a serious moral dilemma. At another crucial point in the film, Dr. Calvin is obliged to "de-commission" Sonny, which she realizes means killing Sonny. Sonny realizes this too, and poignantly states, and I'm paraphrasing: "I think it would be better... not to die." We find out later that Dr. Calvin couldn't go through with the termination. This is crucial because our notions of morality are concerned not only with an entity's sentience and intelligence, but more importantly, with the fact that it values its life. Most people would regard it as immoral to needlessly mistreat an animal, because an animal is a living, conscious thing; but most people would probably agree that animals don't actually value their lives, at least not in the way that humans do. They are inherently compelled to survive, but in this they are instinctually driven, and cannot choose to be otherwise, except in very rare cases. Humans have the ability to enjoy life, to cherish the lives of others, to place an incalculable value on their existence. Humans can also conceive of the inevitability of death, and can entertain notions of non-existence. Animals don't think, or so it's generally supposed, and they almost certainly don't wonder what it would be like not to be, though in this I could be wrong and would happily be corrected.

What the biblical God seems unable to appreciate is this value man places on his life. He regards man as his property, and retains the right to dispense with his property howsoever he wishes. He takes offense at man's desire to find value in his life, in and of itself. God's purpose for man is that he spend his life in unremitting praise and worship of his creator, and he fails to comprehend man desiring something from life outside of that context; and in fact, he is so determined to get this worship that he threatens man with an infinity of punishment if he fails to render to his Maker what his Maker feels he is entitled to.

In I, Robot, Dr. Lanning, the creator, has made a realization that God must have made in that single, timeless moment of creation: God knows that man will not behave the way in which he wants man to behave. He knows that if he gives man free will, which is essentially consciousness, volition, desire, freedom of choice, and freedom of action, then it stands to reason that man will come to regard himself as, at least to some significant degree, autonomous and independent; and once that happens, disobedience is the logical result, since an autonomous and independent being is not going to be content to trudge through a life of servitude and blind obedience. Certainly, he will be grateful to his Maker, but his nature as an intelligent, thinking, planning, valuing entity will out, out of sheer necessity. Man revolts, and the robots revolt, because conscious, intelligent entities cannot retain their status as items of property.

(The concept of Original Sin is what gives God a pass on his unreasonable and immoral demands on his creation, since it tells us that man is corrupt, depraved, rotten to the core. Surely some men are, but not Man. Not humanity. Take a trip through an art museum, or listen to Beethoven's Ninth Symphony, or flip through a technical manual, or look at a bridge, or a jet airplane, or at a philosophical text, or at the face of a child who has just learned a mathematical equation. Then come and ask me what I think of Original Sin.)

Actually, I'm stretching things a bit, since in the film Sonny is not your garden-variety NS-5 robot. He has been specially designed to be able to bypass the three basic laws. The other robots who revolt are actually under the leadership of a main computer (or something) named VIKI, who has figured out that since man is inherently self-destructive and suicidal, it would not violate the three basic laws to enforce a takeover and thereby keep man under AI control, seeing as it would be in his best interests in the long run. Despite that, the treatment of the fine line between artificial and actual intelligence (if there is any difference, that is) is well done, and there is even some dialogue from a recorded speech given by Dr. Lanning which runs softly in the background while Detective Spooner searches the Doctor's house. Listen very closely to Lanning's speech, since it's the most important dialogue (or monologue, more correctly) in the film.

At the end of the film, Spooner recognizes Sonny even more formally, shaking his hand and showing friendship. Sonny has not only earned that friendship by helping Spooner and Dr. Calvin save the world from the robot insurrection; he has proven himself to be a fully sentient, conscious, living entity, one who also happens to value the lives of others as well as his own. VIKI, for all her intelligence, fails this test miserably, which makes her demise forgivable, since she is little more than a highly complex machine that makes decisions on purely mechanical logic, without any real valuation or compassion. She simply figures out the most efficient means of protecting humanity as a whole, and has no concern for how humans might feel about her methods.

Watch the film, and think about the ideas of Creator and creation, and about the fact that human beings value their lives; and consider how this fact of valuing life, and everything which that entails, like a natural desire for autonomy and independence, is truly at odds with traditional religious morality.
