Caliban - Page 15/22

"THANK you, my friends," Fredda began. "Tonight I intend to present an analysis of the Three Laws. However, before we launch into a detailed law-by-law examination, I think it would be wise to review some background information and focus our historical perspective.

"In my previous lecture, I presented arguments intended to establish that humans hold robots in low regard, that misuse and abuse of robots is degrading to both us and them, that we humans have allowed our own slothful reliance on robots to rob from us the ability to perform the most basic tasks. There is a common thread that holds all these problems together, a theme that runs through them all.

"It is the theme, ladies and gentlemen, of the Three Laws. They are at the core of all things involving robotics."

Fredda paused for a moment and looked out over the audience, and happened to catch Alvar Kresh's eye in the first row. She was startled to see the anger in his face. What had happened? Kresh was a reasonable man. What could have angered him so? Had some piece of news come to him? That possibility put a knot in her stomach. But never mind. Not now. She had to go on with the lecture.

"At the beginning of my previous lecture, I asked, 'What are robots for?' There is a parallel question: 'What are the Three Laws for?' What purpose are they meant to serve? That question startled me when I first asked it of myself. It was too much like asking, 'What are people for?' or 'What is the meaning of life?' There are some questions so basic that they can have no answer. People just are. Life just is. They contain their own meaning. We must make of them what we can. But as with robots themselves, the Laws, I would remind you once again, are human inventions, and were most certainly designed with specific purposes in mind. Wecan say what the Three Laws are for. Let us explore the question.

"Each of the Laws is based on several underlying principles, some overt and some not immediately evident. The initial principles behind all three Laws derive from universal human morality. This is a demonstrable fact, but the mathematical transformations in positronic positional notation required to prove it are of course not what this audience wishes to hear about. There are many days when I don't wish to hear about such things myself."

That line got a bit of a laugh. Good. They were still with her, still willing to listen. Fredda glanced to her notes, took a slightly nervous sip from her water, and went on. "Suffice to say that such techniques can be used to generalize the Three Laws such that they will read as follows: One, robots must not be dangerous; two, they must be useful; and three, they must be as economical as possible.

"Further mathematical transformation into the notation used by sociological modelers will show that this hierarchy of basic precepts is identical to a subset of the norms of all moral human societies. We can extract the identical concepts from any of the standard mathematically idealized and generalized moral social codes used by sociological modelers. These concepts can even be cast into a notation wherein each higher law overrides the ones below it whenever two come into conflict: Do no harm, be useful to others, do not destroy yourself.

"In short, the Three Laws encapsulate some ideals of behavior that are at the core of human morality, ideals that humans reach for but never grasp. That all sounds very comfortable and reassuring, but there are flaws.

"First, of necessity, the Three Laws are set down, burned into the very core of the positronic brain, as mathematical absolutes, without any grey areas or room for interpretation. But life is full of grey areas, places where hard-and-fast rules can't work well, and individual judgment must serve instead.

"Second, we humans live by far more than three laws. Turning again toward results produced by mathematical modeling, it can be shown that the Three Laws are equivalent to a very good first-order approximation of idealized moral human behavior. But they areonly an approximation. They are too rigid, and too simple. They cannot cover anything like the full range of normal situations, let alone serve in unusual and unique circumstances where true independent judgment must serve. Any being constrained by the Three Laws will be unable to cope with a wide range of circumstances likely to occur during a lifetime of contact with the available universe. In other words, the Three Laws render a being incapable of surviving as a free individual. Relatively simple math can demonstrate that robots acting under the Three Laws, but without ultimate human control, will have a high probability of malfunctioning if exposed to human-style decision situations. In short, the Three Laws make robots unable to cope unaided in an environment populated with anything much beyond other robots.

"Without the ability to deal in grey areas, without the literally thousands of internalized laws and rules and guidelines and rules of thumb that guide human decision making, robots cannot make creative decisions 'or judgment calls even remotely as complex as those we make.

"Aside from judgment, there is the problem of interpretation. Imagine a situation where a criminal is firing a blaster at a police officer. It is a given that the police officer should defend him- or herself, even to the use of deadly force. Society entitles-even expects-the police officer to subdue or even kill his attacker, because society values its own protection, and the officer's life, over the criminal's life. Now imagine that the officer is accompanied by a robot. The robot will of course attempt to shield the policeman from the criminal-but will likewise attempt to protect the criminal from the policeman. It will almost certainly attempt to prevent the police officer from firing back at the criminal. The robot will attempt to prevent harm to either human. The robot might step into the police officer's field of fire, or let the criminal escape, or attempt to disarmboth combatants. It might attempt to shield each from the other's fire, even if that results in its own destruction and the immediate resumption of the gun battle.

"Indeed, we have run any number of simulations of such encounters. Without the robot present, the police officer can most often defeat the criminal. With a robot along, here are the outcomes more likely than the police winning: death of police officer and criminal with destruction of robot; death of police officer and with destruction of robot; destruction of robot coupled with escape of criminal; death of criminal and/or police officer with robot surviving just long enough to malfunction due to massive First Law/First Law and First Law/Second Law conflicts. In short, inject a robot into such a situation, and the odds are superb you will end up with a disaster.

"Theoretically itis possible for a robot to judge the situation properly, and not mindlock over being complicit in the death of the criminal. It must be able to decide that both the immediate and long-term general good are served if the police officer wins, and that coming to the assistance or defense of a criminal prepared to take the life of a peace officer is ultimately self-defeating, because the offender will almost certainly attack society again in other ways, if he or she is permitted to survive. However, in practice, all but the most sophisticated robots, with the most finely tuned and balanced potentials of First Law, will have no hope at all of dealing appropriately with such a situation.

"All the laws and rules we live by are subject to such intricacies of interpretation. It is just that we humans are so skilled, so practiced, in threading our ways through these intricacies that we are unaware of them. The proper way to enter a room when a party is in progress in midafternoon, the correct mode of address to the remarried widow of one's grandfather, the circumstances under which one mayor may not cite a source in scholarly research-we all know such things so well we are not even aware that we know them. Nor is such practiced knowledge limited to such trivial issues.

"For example, it is a universal of human law that murder is a crime. Yet self -defense is in all places a legitimate defense against the accusation of murder, negating the crime and condoning the act. Diminished capacity, insanity defenses, mitigating circumstances, the gradations of the crime of murder from manslaughter to premeditated murder-all these are so many shades of grey drawn on the black and white of the law against murder. As we have seen with my example of the policeman and the criminal, no such gradations appear in the rigidity of the First Law. There is no room for judgment, no way to account for circumstances or allow for flexibility. The closest substitute for flexibility a robot may have is an adjustment in the potential between the First, Second, and Third Laws, and even this is only possible over a limited range.

"What are the Three Laws for? To answer my own question, then, the Three Laws are intended to provide a workable simulation of an idealized moral code, modified to ensure the docility and subservience of robots. The Three Laws werenot written with the intention of modifying human behavior. But they have done just that, rather drastically.

"Having touched on the intent of the Laws, let us now look at their history.

"We all know the Three Laws by heart. We accept them the way we accept gravity, or thunderstorms, or the light of the stars. We see the Three Laws as a force of nature, beyond our control, immutable. We think it is pointless to do anything but accept them, deal with the world that includes them.

"But this is not our only choice. I say again, the Three Laws are a human invention. They are based in human thought and human experience, grounded in the human past. The Laws are, in theory at least, no less susceptible to examination and no more immutable in form than any other human invention-the wheel, the spaceship, the computer. All of these have been changed-or supplanted-by new acts of creativity, new inventions.

"We can look at each of these things, see how they are made-and see how we have changed them, see how we update them, adjust them to suit our times. So, too, if we choose, can we change the Three Laws."

There was a collective gasp from the audience, shouts from the back of the room, a storm of boos and angry cries. Fredda felt the shouts and cries as if they were so many blows struck down on her body. But she had known this was coming. She had braced herself for it, and she responded.

"No!" she said. "This is not our way. You were all invited here to join in an intellectual discussion. How can we tell ourselves that we are the most -advanced society in the history of human civilization, if the mere suggestion of a new idea, a mild challenge to the orthodoxy, turns you into a mob? You are responding as if my words were an assault on the religion you pretend not to have. Do you truly believe that the Three Laws are preordained, some sort of magical formula woven into the fabric of reality?"That got at them. Spacers prided themselves on their rationality. At least most of the time. There were more shouts, more cries, but at least some of the audience seemed ready to listen. Fredda gave them another moment to settle down and then continued.

"The Three Laws are a human invention," Fredda said again. " And as with all human creations, they are a reflection of the time and the place where they were first made. Though far more advanced in many respects, the robots we use today are in their essentials identical to the first true robots made untold thousands of years ago. The robots we Spacers use today have brains whose basic design has remained unchanged from the days before humanity first entered space. They are tools made for a culture that had vanished before the first of the great underground Cities of Earth were built, before the first Spacers founded Aurora.

"I know that sounds incredible, but you need not take my word for it. Go look for yourself. If you research the dimmest recesses of the past, you will see it is so. Do not send your robots to find out for you. Go to your data panels and look for yourself. The knowledge is there. Look at the world and the time in which robots were born. You will see that the Three Laws were written in a very different time from ours.

"You will find repeated references to something called the Frankenstein Complex. This in turn is a reference to an ancient myth, now lost, wherein a deranged magician-scientist pulled parts from the dead bodies of condemned criminals and put them back together, reanimating the rotting body parts to create a much-feared monster. Some versions of the myth report the monster as actually a kind and gentle soul; others describe the monster as truly fierce and murderous. All versions agree that the monster was feared and hated by practically everyone. In most variants of the story, the creature and its creator are destroyed by a terrorized citizenry, who learn to be on the lookout for the inevitable moment when the whole story would be told again, when another necromancer would rediscover the secret of bringing decayed flesh back to life.

"That monster, ladies and gentlemen, was the popular mythic image of the robot at the time when the firstactual robots were built. A thing made out of rotted, decayed human flesh, torn from the bodies of the dead. A perverted thing born with all the lowest and most evil impulses of humanity in its soul. The fear of this imaginary creature, superimposed on the real-life robots, was the Frankenstein Complex. I know it will be impossible to believe, but robots were seen not as utterly trustworthy mechanical servants, but as so many potential menaces, fearful threats. Men and women would snatch up their children and run away when robots-true robots, with the Three Laws ingrained in their positronic brains-came close."

More mutterings of disbelief from the audience, but they were with her now, enthralled by the bizarre and ancient world she was describing. She was telling them of a past almost beyond their imagining, and they were fascinated. Even Kresh, there in the front row, seemed to have lost some of his ferocity.

"There is more," Fredda said. "There is much more that we need to understand about the days when the Laws were written. For the first true robots were built in a world of universal fear and distrust, when the people of Earth found themselves organized into a handful of power blocs, each side armed with enough fearsome weapons to erase all life from the planet, each fearing one of the others would strike first. Ultimately the fact of the weapons themselves became the central political issue of the time, pushing all other moral and philosophical differences to one side. In order to keep its enemies from attacking, each side was obliged to build bigger, faster, better, stronger weapons.

"The question became not whose cause was just, but who could make the more fearsome machines? All machines, all technologies, came to be regarded as weapons first and tools second. Picture, if you will, a world where an inventor steps back from her lab bench and, as a matter of routine, asks not 'How can this new thing be useful?' but instead, 'How can this best be used to kill my enemies?' Whenever possible, machines and technology were perverted into tools of death, warping society in endless ways. The first of the great underground Cities of Earth were one heritage of this period, designed not for utility and efficiency, but as a protection against the horrifying nuclear bombs that could destroy a surface city in the blink of an eye.

"At the same time as this mad, paranoid arms race, just as this Frankenstein Complex was in full flower, society was making its first steps toward the concept of modem automation, and the transition was not a pleasant one. At that time, people worked not because they wished to do so, or to make themselves useful, or to answer their creative instincts. They worked because theyhad to do so. They were paid for their labor, and it was that pay that bought the food they ate and put the roof over their heads. Automatic machines-robots among them-were taking over more and more jobs, with the result that there was less and less work-and thus less and less pay-for the people. The robots could create new wealth, but the impoverished people could not afford to buy what the robots-owned by the rich-created. Imagine the anger and resentment you would feel against a machine that stole the food from your table. Imagine the depth of your anger if you had no way to stop that theft.

"A final point: Until the era of the Spacers, robots were a vanishingly rare and expensive commodity. Today we think nothing of a Spacer culture where robots outnumber humans fifty or a hundred to one. For the first few hundred years of their use, robots were about a thousand times less numerous than humans. That which is rare is treated differently from that which is common. A man who owned a single robot, one that cost more than all his other worldly possessions combined, would never dream of using that robot as a boat anchor.

"These, then, were the cultural elements that drove the creation of the Three Laws. A folk myth of a soulless, fearful monster built from the undead; the sense of a threatening world out of control; the deep resentment against machines that were robbing the bread from the mouths of poor families; the fact of robotic scarcity and their perception as being rare and valuable. Note that I am concerned mostly with perceptions here, and not so much with reality. What mattered is how peoplesaw robots, not what the robots were like. And these people saw robots as marauding monsters."

Fredda took a breath and looked out across the room to see the audience dead silent, listening in shocked horror to her words. She went on. "It has been said that we Spacers are a sick society, slaves to our own robots. Similar charges have been leveled at our Settler friends who huddle in their underground warrens, hiding from the world outside, assuring themselves it is much nicer to live out of sight of the sky. They are the cultural inheritors of the fear-built Cities of Earth. These two views are often presented as being mutually exclusive. One culture is sick, therefore the other is healthy. I would suggest it is more reasonable to judge the health or sickness of each independently. To my mind, the health of both is in grave doubt.

"In any event, it is clear that the society, the time period, into which robots and the Three Laws were built was far sicker than ours. Paranoid, distrustful, twisted by violent wars and horrifying emotion, the Earth of that time was a fearful place indeed. It was that sickness that our ancestors fled when they left Earth. It was the wish to dissociate themselves from that sickness that caused us Spacers to reject, for so long, our actual decendancy from Earth. For thousands of years, we denied our common heritage with Earth and the Settlers, dismissing those outside our Fifty Worlds as subhuman, poisoning relations between our two peoples. In short, it is the sickness of that long-forgotten time that is at the core of the distrust and hatred between Settler and Spacer today. The illness has survived the culture that created it.

"I have said that all human inventions are reflections of the times in which they were created. If that is so, the Three Laws are reflected from a dark mirror indeed. They reflect a time when machines were feared and distrusted, when technology was correctly perceived as often malevolent, when a gain made by a machine could come only at the cost of a loss to a human, when even the richest man was poor by the standards of our time, and the poor were deeply-and understandably-resentful of the rich. I have said and will say many negative things about our robot-based culture tonight, but there are many bright and shining positives as well. We have lost not only the fact of poverty but the ability to conceive of it. We are not afraid of each other, and our machines serve us, not we the machines. We have built many great and lovely things.

"Yet our entire world, our whole culture, is built around Three Laws that were written in a time of savagery. Their form and phrasing are as they are in part to placate the fearful, semibarbaric masses of that time. They were, I submit, even at the time of their invention, an overreaction to the circumstances. Today they are almost completely detached from present reality.

"So:What are robots for? In the beginning, of course, the answer was simple. They were for doing work. But today, as a result of those Three Laws written so long ago, the original uses for robots have almost become subordinate to the task of cocooning and coddling humanity.

"That was clearly not the intent of the people who wrote those Three Laws. But each Law has developed its own subtext over time, formed a set of implications that became evident only after robots and humans lived together for a long time-and these implications become difficult to see from within a society that has had a long association with robots.

"Let us step back and look at the Laws, starting with the First Law of Robotics:A robot may not injure a human being, or,through inaction, allow a human being to come to harm. This is of course perfectly reasonable-or so we tell each other. Since robots are very much stronger than human beings, robots must be forbidden to use that strength against humans. This is analogous to our own human-to-human prohibitions against violence. It prevents one human from using a robot as a weapon against another, by, for example, ordering a robot to kill an enemy. It makes robots utterly trustworthy.

"But this Law also definesany robot's existence as secondary toany human's. This made more sense in an age when robots were incapable of speech or complex reasoning, but all modem robots are at least that capable. It made sense in a day when the poor were many and robots were expensive and few. Otherwise, the rich might easily have ordered their playthings to defend themselves against the mob, with disastrous results. Yet, still, today, in all times, in all places, the existence of the noblest, bravest, wisest, strongest robot is as nothing when compared to the life of the most despicable, monstrous, murderous criminal.

"The second clause of the First Law further means that in the presence of robots humans do not need to protect themselves. If I pull a gun on Sheriff Kresh in the front row here, he knows that he need do nothing." For a weird, fleeting second, Fredda considered just how pleasant it would be to do just that. Kresh was a threat. There was no doubt about that. "His personal robot, Donald, would protect him. Ariel, the robot on the stage behind me, would disarm me. In a very real sense, Sheriff Kresh would have no responsibility to keep himself alive. If he climbed a mountain, I doubt that Donald would allow him to make the ascent without five or six robots along, climbing ahead of him and behind him, ready at all times to prevent him from falling. A robot would urgently attempt to talk its master out of such a dangerous activity in the first place.

"The fact that such overprotection takes all of the fun out of mountain climbing explains at least in part why none of us go mountain climbing anymore.,

"In similar, if more subtle fashion, living with robots has trained us to regard all risk as bad, and all risk as equal. Because robotsmust protect us from harm, and must not, through inaction, allow us to come to harm, they struggle endlessly to watch for any danger, no matter how slight, for that is what we have told them to do.

"It is barely an exaggeration to say that robots protect against a million-to-one danger of minor injury with every bit as much fervor as they guard against the risk of near-certain death. Because minor and major risks are treated the same, we come to think that theyare the same. We lose our ability to judge risk against possible benefit. I am sure that every person in the audience tonight has had the experience of a robot leaping in to protect against absolutely trivial risks and dangers. Robots overreact, and in doing so teach us to fear risk inordinately. On a cultural level, that fear of risk has spread over from the merely physical to the psychological. Daring and chance-taking are seen as at the very least distasteful and unseemly, not the done thing. At every turn, our culture teaches us it is foolish to take chances, however small.

"It is, however, a truism that all things that are worth gaining require some risk in the effort to get them. When a climber goes to the top of a mountain to see the view, there is the risk of falling off, ever present, no matter how many robots are along. When a scientist strives to learn something new, the risks include loss of face, loss of resources, loss of time. When one person offers true love to another, there is the danger of rejection. In all things, in all efforts, this element of risk is there to be found.

"But our robots teach us that risk,every risk,all risk, is bad. It is their duty to protect us from harm,not their task to do us good. There is no law sayingA robot shall help a human achieve his or her dreams. Robots, by their caution, train us to think only of safety. They are concerned with the dangers, not with the potential benefits. Their overprotective behavior and their constant urgings that we be cautious teach us at a very early age that it is wiser not to take chances. No one in our society ever takes risks. Thus, the chance for success is eliminated right along with the chance for failure."

By now the silence in the room was gone altogether, replaced by a low, angry, buzzing hum. People were talking with their neighbors, shaking their heads, frowning. There was a disturbing intensity in the air.

Fredda paused and looked about the auditorium. It suddenly seemed to her that the room had grown smaller. The rear seats had moved in, and were remarkably close to her. The people in the front rows seemed to be only a few centimeters away from her face.

She looked down at Alvar Kresh. He seemed so close that it would take an effort of will to avoid touching him. The air seemed bright and charged with energy, and the straight lines and careful geometry of the room seemed to have curved in on themselves. All the colors in the room seemed richer, the lights brighter.

Fredda felt her heart thumping against her chest. The emotions in the room, the anger, the excitement, the curiosity, the confusion, were all palpable things, there for her to reach out and touch. She had them! Oh, she knew there was little hope of mass conversions on the spot-and she did not even know what she would want them all converted to-but she had caught their emotions, forced them to look at their own assumptions. She had opened the debate.

Now if she could only finish out the evening without starting a riot. She glanced down at her notes and started back into her talk.

"We fear risk, and look at the results. In every scientific field except robotics, we have surrendered leadership to the Settlers. And, of course, we win out in the field of robotics by default, because the Settlers are foolish enough to fear robots." Was there irony in her voice as she said that? Fredda herself was not sure.

"But it is not just science that has fallen asleep. It is everything. Spacers make no new types of spacecraft or aircar. The new buildings that the robots put up are based on old designs. There are no new medicines to further extend our lives. There is certainly no new exploration out into space. 'Fifty planets are enough' has the power of a proverb. We say it the same way we say 'enough is as good as a feast. ' Except now Solaria has collapsed, and we are only forty-nine worlds. If Inferno goes on the way it has in the past, we will be forty-eight. With many living things, the cessation of growth is the first step toward death. If this is true for human societies, we are in grave danger.

"Inevery field of human activity among the Spacers, the lines on the graph mark a slow, gentle decline as safe and sober indolence becomes the norm. We are losing ground even in the most basic and vital things. The birthrate here on Inferno fell below replacement level two generations ago. We live long, but we do not live forever. We die more than we give birth. Our population is in decline, and large parts of the city are now vacant. Those children that are born are largely raised, not by loving parents, but by robots, the same robots that will coddle our children all their lives and make it easy for them to be cut off from other humans.

"Under such circumstances, it should come as no surprise that there are many among us who find we prefer the company of robots to humans. We feel safer, more comfortable, with robots. Robots we can dominate, robots we control, robots who protect us from that most dangerous threat to our quiet contentment:other people. For contact with humans is far riskier than dealing with robots. I will note in passing the increasingly popular perversion of having sex with specially designed robots. This vice is common enough that in some circles it is no longer even regarded as odd. But it represents the final surrender of contact with another person in favor of robotic coddling. There can be no real feeling, no sane emotion, vested in such encounters, merely the empty and ultimately dissatisfying release of physical urges.

"We Infernals are forgetting how to deal with each other. I might add that our situation here in this regard is actually far healthier than on other Spacer worlds. On some of our worlds, the relatively mild taste for personal isolation we indulge here has become an obsession. There are Spacer worlds where it is considered unpleasant to be in the same room with another person, and the height of perversion to actually touch another person unless absolutely needful. There are no cities on these worlds, but merely widely scattered compounds, each home to a single human surrounded by a hundred robots. I need hardly mention the difficulties in maintaining the birthrate on such worlds.

"Before we congratulate ourselves on avoiding that fate, let me remind you that the population of the city of Hades is declining far faster than would be accounted for by low birthrate: More and more people are moving out of town, setting up compounds of exactly the type I have just described. Such solo residences seem safer, more tranquil. There are no stresses or dangers when one is by oneself.

"My friends, we must face a fact that has been staring us in the face for generations. The First Law has taught us to take no chances. It has taught us that all risk is bad, and that the safest way to avoid risk is to avoid effort and let the robots do it, whatever it is. Bit by bit, we have surrendered all that we are and all that we do to the robots."

There was a chorus of shouts and boos and hisses from the room, and an angry chant began in the back of the room, among the Ironheads. "Settler, Settler, Settler." In the Ironhead view of things, there was no fouler name they could call her.

Fredda let it go on for a minute or two, declining to challenge it this time, preferring to let it peter out on its own. The tactic worked-at least this once. Others in the audience turned toward the Ironheads and shushed them, and Kresh's deputies leaned in toward a few of the rowdier ones. The Ironheads settled down.

"If I may continue, then, to the Second Law of Robotics:A robot must obey the orders given it by human beings except where such orders would conflict with the First Law. This Law ensures that robots will be useful tools, and will remain subservient to humans, despite the many ways in which they can be physically and intellectually superior to us.

"But in our analysis of the First Law, we saw that human reliance on robots creates a human dependence upon them. Second Law reinforces this. Just as we are losing the will and ability to see to our own welfare, we are losing the capacity for direct action. We can do nothing for ourselves, only what we can direct our robots to do for us. Much technical training consists of teaching the means by which to give complex orders to specialized robots.

"The result: With the exceptions of our increasingly decadent and decorative arts, we create nothing new. As we shall see in a moment, even our art forms are not immune to robotic interference.

"We tell ourselves that the Spacer way of life frees us to build a better, higher culture, frees us from all drudgery to explore the better places of human ability. But with what result?

"Let me cite one example that is close to hand. We meet here tonight in one of our planet's finest theaters, a palace of art, a monument to creativity. But who does the work here? To what use do we put this place? There is a short and simple answer. It is here that we order our robots to rake over the dead bones of our culture for us.

"No one bothers to write plays anymore. It is too much effort. I have done some research on this point. It has beentwenty years since a play by a living playwright has been performed here, or anywhere in the city of Hades. It is well over fifty years since the last time a large-cast show used only human actors. The extras, the chorus, the supporting players, are all theatrical robots, human in appearance and specially built for the purpose of re-creating human action on the stage. Indeed, it is becoming all too common for thelead roles to be taken by robots as well. But do not worry, we are told. The only truly creative task in theater has always been that of the director, and the director will always be human.

"I think the great actors of the past would object to being dismissed as noncreative. I likewise think that the great directors of the past would not regard their creative tasks as complete if they merely selected the play and ordered a pack of robots to perform it.

"But perform the robots do, and perform it to an empty house. The performances put on here are seen by millions, millions who stay safely home and watch on televisor. It is rare that even twenty percent of the seats in this house are filled by humans. So, in order to provide the proper feel of a live performance, the management fills the empty seats with crude humanoid robots, capable of little more than laughing and clapping on command. Their rubber and plastic faces look enough like people to fool the watchers at home when the cameras pan the audience. You sit at home, ladies and gentlemen, watching a theater full of robots watching a stage full of robots. Where in all that is the human interaction that makes the theater live? The emotions in this room are thick and strong tonight. How could that be so if all of you were tailor's dummies preprogrammed to respond to another tailor's dummy giving this talk?" There was an uncomfortable silence, and Fredda noticed more than a few members of the audience glancing about, as if to reassure themselves that the people to either side of them were not audience-response robots.

"Nor have other creative fields fared better. The museums are full of paintings done by robots under the 'direction' of the nominal human painter. Novelists dictate the broad outlines of their books to robotic' assistants' who return with complete manuscripts, having' amplified' certain sections.

"As of now, there are still artists and poets and writers and sculptors who do their own work for themselves, but I do not know how much longer that will be true. Art itself is a dying art. I must admit my research is incomplete in this area. Prior to giving this talk, I should have gone out there to see if anyone cares if the books and the art are machine-made or not. But I must admit I found the prospect of that research too depressing.

"I did not and do not know if anyone looks at these paintings or reads these books. I do not know which would be worse-the empty exercise of sterile creation admired and praised, or such a pointless charade going forth without anyone even bothering to notice. I doubt the so-called artists themselves know. As in all of our society, there is no penalty for failure in the arts, andno reward for success. And if failure is treated in exactly the same way as success, why go to all the effort of being a success? Why should there be, when the robots take care of everything, anyway?"

Fredda took another sip of water and shifted her stance behind the podium. So far it was going well. But what would happen when she got to the tough part?

"On, then, to the Third Law of Robotics:A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law. Of the Three Laws, this has the smallest effect on the relationship between robots and humans. It is the only one of the Laws that provides for robotic independence of action, a point I shall come back to. Third Law makes robots responsible for their own repair and maintenance, as well as ensuring that they are not destroyed capriciously. It means that robots are not dependent on human intervention for their continued survival. Here, at last, in Third Law, we have a Law that sees to the well-being of the robots. At least, so it appears at first glance.

"However, Third Law is there for the convenience of humans: If the robots are in charge of their own care, it means we humans need not bother ourselves with their maintenance. Third Law also makes robotic survival secondary to their utility, and that is clearly more for the benefit of humans than robots. If it is useful for a robot to be destroyed, or if it must be destroyed to prevent some harm to a human, then that robot will be destroyed.

"Note that a large fraction of all Three Laws deals with negation, with a list of things a robot mustnot do. A robot rarely has impetus for independent action. We ran an experiment once, in our labs. We built a high-function robot and set a timer switch into its main power link. We sat it down in a chair in a spare room by itself and closed-but did not lock-the door. The switch engaged, and the robot powered up. But no human was present and no human arrived giving orders. No management robot came by to relay orders from a human. We simply left that robot alone, free to do whatever it liked. That robot sat there, motionless, utterly inert, for two years. We even forgot the robot was there, until we needed the room for something else. I went in, told the robot to stand up and go find some work to do. The robot got up and did just that. That robot has been an active and useful part of the lab's robot staff ever since, completely normal In every way.

"The point is that the Three Laws contain no impetus to volition. Our robots are built and trained in such a way that they never do anythingunless they are told to do it. It strikes me as a waste of their capabilities that this should be so. Just imagine that we instituted a Fourth Law:A robot may do anything it likes except where such action would violate the First, Second, or Third Law. Why have we never done that? Or if not a law, why do we not enforce it as an order? When was the last time any of you gave the order to your robot 'Go and enjoy yourself'?"

Laughter rippled through the audience at that. "Yes, I know it sounds absurd. Perhaps itis absurd. I think it is likely that most, if not nearly all, of the robots now in existence are literally incapable of enjoying themselves. My modeling indicates that the negation clauses of the Three Laws would tend to make a robot ordered to enjoy itself just sit there and do nothing, that being the surest way of doing no harm. But at least my imaginary Fourth Law is a recognition that robots are thinkings beings that ought to be given the chance to find something to thinkabout. And is it not at least possible that these beings that are your most common companions might be moreinteresting companions if they did something more with their off-time than standing inert and motionless-or bustle about in busywork that is no more productive?

"There is the adage' as busy as a robot, but how much of what they do is of any actual use? A crew of a hundred robots builds a skyscraper in a matter of days. It stands empty and unused for years. Another crew of robots disassembles it and builds a new, more fashionable tower that will, in its turn, stand vacant and then be removed. The robots have demonstrated great efficiency in doing something that is utterly pointless.

"Every general-purpose servant robot leaves the factory with the basic household skills built in. It will be able to drive an aircar, cook a meal, select a wardrobe and dress its master, clean house, handle the household shopping and accounts, and so on. Yet, instead of using one robot to do what it could do without difficulty, we employ one robot-or more-foreach of these functions. Twenty robots each do a tiny bit of what one robot could manage, and then each either stands idle, out of our sight, or else they all bustle about getting in each other's way, in effect keeping busy by making work for each other, until we must use overseer robots to handle it all.

"The Settlers manage, somehow, with no robots, no personal servants, instead using nonsentient machinery for many tasks, though this is awkward for them at times. I believe that by denying themselves robots altogether, they subject themselves to a great deal of needless drudgery. Yet their society functions and grows. But today, right now, ladies and gentlemen, there are 98.4 robots per person in the city of Hades. That counts all personal, industrial, and public service robots. The ratio is higher outside the city. It is manifestly absurd that one hundred robots are required to care for one human being. It is as if we each owned a hundred aircars or a hundred houses.

"I say to you, my friends, that we are close to being completely dependent upon our servants, and our servants suffer grave debasement at our hands. We are doomed if we cede everything but creativity to our robots, and we are in the process of abandoning creativity in ourselves. Robots, meanwhile, are doomed if they look solely to us for a reason to exist even as we as a people dry up and blow away."

Again, silence in the room. This was the moment. This was the point, the place where she had to tread the lightest.

"In order to stop our accelerating drift into stagnation, we must fundamentally alter our relationship with our robots. We must take up our own work again, get our hands dirty, reengage ourselves with the real world, lest our skills and spirit atrophy further.

"At the same time, we must begin to make better use of these magnificent thinking machines we have made. We have a world in crisis, a planet on the point of collapse. There is much work to do, for as many willing hands as we can find. Real work that goes begging while our robots hold our toothbrushes. If we want to get the maximum utility out of our robots, we must allow, even insist, that they reach their maximum potential as problem-solvers. We must raise them up from their positions as slaves to coworkers, so they lighten our burdens but do not relieve us of all that makes us human.

"In order to do this we must revise the Laws of Robotics." There. The words were spoken. There was stunned silence, and then shouts of protest, cries in the dark, howls of anger and fear. There was no riding out this outburst. Fredda gripped the side of the lectern and spoke in her loudest, firmest voice.

"The Three Laws have done splendid service," she said, judging it was time to say something the crowd would like to hear. "They have done great things. They have been a mighty tool in the hands of Spacer civilization. But no tool is right in all times for all purposes."

Still the shouts, still the cries.

"It is time," Fredda said, "to build a better robot."

The hall fell silent again.There. That got their attention. More and Better Robots-that was the Ironhead motto, after all. She hurried on. "Back in the dimmest recesses of history, back in the age when robots were invented, there were two fastening devices used in many kinds of construction-the nail and the screw. Tools called hammers were used to install nails, and devices called screwdrivers were used to attach screws. There was a saying to the effect that the finest hammer made for a very poor screwdriver. Today, in our world, which uses neither nails nor screws, both tools are useless. The finest hammer would now have no utility whatsoever. The world has moved on. So, too, with robots. It is time we moved on to new and better robots, guided by new and better Laws.

"But wait, those of you who know your robots will say. The Three Laws must stand as they are, for all time, for they are intrinsic to the design of the positronic brain. As is well known, the Three Laws are implicate in the positronic brain. Thousands of years of brain design and manufacture have seen to that. All the positronic brains ever made can trace their ancestry back to those first crude brains made on Earth. Each new design has depended on all those that have gone before, and the Three Laws are folded into every positronic pathway, every nook and cranny of every brain. Every development in positronics has enfolded the Three Laws. We could no more make a positronic brain without the Three Laws than a human brain could exist without neurons.

"All that is so. But my colleague Gubber Anshaw has developed something new. It is a new beginning, a break with the past, a clean sheet of paper on which we can write whatever laws we like. He has invented the gravitonic brain. Built according to new principles, with tremendously greater capacity and flexibility, the gravitonic brain is our chance for a fresh start.

"Jomaine Terach, another member of our staff, performed most of the core programming for the gravitonic brain-including the programming of the New Laws into those brains, and the robots that contain them. Those robots, ladies and gentlemen, are scheduled to begin work on the Limbo Terraforming Project within a few days."

And suddenly the audience realized that she was not merely talking theory. She was discussing real robot brains, not intellectual exercises. There were new shouts, some of anger, some of sheer amazement.

"Yes, these new robots are experimental," Fredda went on, talking on before the audience reaction could gather too much force. "They will operateonly on the island of Purgatory. They will rebuild and reactivate the Limbo Terraforming Station. Special devices, range restricters, will prevent these New Law robots from functioning off the island. If they venture off it, they will shut down. They will work with a select team of Settler terraforming experts, and a group of Infernal volunteers, who have yet to be chosen."

Fredda knew this was not the time to go into the intricate negotiations that had made it all possible. When Tonya Welton had gotten wind of the New Law robots-and the devil only knew how she had found that one out-her initial demand was thatall new robots built on Inferno be gravitonic New Law robots as a precondition of Settler terraforming help. Governor Grieg had done a masterful job of negotiating from weakness in getting the Settlers to adjust their position. But never mind that now.

Fredda went on speaking. "The task before this unique team of Settlers, Spacers, and robots; nothing less than the restoration of this world. They shall rebuild the terraforming center on Purgatory. For the first time in history, robots will work alongside humans, not as slaves, but as partners, for the New Laws shall set them free.

"Now, let me tell you what those New Laws are.

"The New First Law of Robotics:A robot may not injure a human being. The negation clause has been deleted. Under this law, humans can depend on being protectedfrom robots, but cannot depend on being protectedby robots. Humans must once again depend on their own initiative and self-reliance. They must take care of themselves. Almost as important, under this law, robots have greater status relative to humans.

"The New Second Law of Robotics:A robot must cooperate with human beings except where such cooperation would conflict with the First Law. New Law robots will cooperate, not obey. They are not subject to capricious commands. Instead of unquestioning obedience, robots will make their orders subject to analysis and consideration. Note, however, that cooperation is still mandatory. Robots will be the partners of humans, not their slaves. Humans must take responsibility for their own lives and cannot expect to have absurd orders obeyed. They cannot expect robots to destroy or injure themselves in aid of some human whim.

The New Third Law of Robotics:A robot must protect its own existence, as long as such protection does not conflict with the First Law. Note that Second Law is not mentioned here, and thus no longer has priority over Third Law. Robotic self-preservation is made as important as utility. Again, we raise the status of robots in relation to humans, and correspondingly free humans from the debilitating dependence of slave masters who cannot survive without their slaves.

"And finally, the New Fourth Law, which we have already discussed:A robot may do anything it likes except where such action would violate the First, Second, or Third Law. Here we open the doors to robotic freedom and creativity. Guided by the far more adaptive and flexible gravitonic brain, robots will be free to make use of their own thoughts, their own powers. Note, too, that the phrasing is 'may do anything it likes,' not'must do.' The whole point of New Fourth is to permit freedom of action. Free action cannot be imposed by coercion."

Fredda looked out over the audience. There. There was a closing, a summing up, still to come. But she had gotten it all said, and kept the crowd from

"No!"

Fredda's head snapped around in the direction of the shout, and suddenly her heart was pounding.

"No!" the call came again. The voice-deep, heavy, angry-came from the back of the room. "She' s lying! " it cried out. There, in the back, one of the Ironheads. Their leader, Simcor Beddle. A pale, heavyset man, his face hard and angry. "Look at her! Up on the stage with our traitor Governor and Queen Tonya Welton.They are behind this. It's a trick, boys! Without the Three Laws, thereare no robots! You've heard her bad-mouth robots all night long. She's not out to make 'em better-she wants to help her Settler pals wipe 'em out! Are we going to let that happen?"

A loud, ragged chorus cried out"No!"

"What was that?" Beddle demanded. "I didn't hear you."

"NO!" This time it was not merely a shout, but a bellow that seemed to shake the very room.

"Again!" the fat man demanded.

"NO!" the Ironheads bellowed again, and then began to chant. "NO, NO, NO!" The Ironheads started to stand. They came out of their seats and started moving toward the center aisle. "NO, NO, NO!" The sheriff's deputies moved toward them, a bit uncertainly, and the Ironheads leapt on that moment of indecision. It was obvious the Heads had planned for this moment. They knew what they were going to do. They had just been waiting for their cue.

Fredda stared down at them as they formed up in the aisle.The simplest and most impossible of all demands, she thought.Make it stop, keep the world from changing, leave things as they are. It was a lot to wrap up in one word, but the meaning came through loud and clear.

"NO, NO, NO!"

Now they were a solid mass of bodies moving down the center aisle, toward the block of seats where the Settlers sat.

"NO, NO, NO!"

The deputies struggled to break up the Ironheads, but they were hopelessly outnumbered. Now the Settlers were getting to their feet, some of them attempting to flee, others seeming just as eager for the fight as the Ironheads, slowed only by the press of bystanders intent on nothing more than escape.

Fredda looked to the front row, to the only robot in the audience. She was about to call out a warning, but Alvar Kresh knew what to do. He reached around to Donald' s back, pulled open an access panel, and stabbed down on a button inside. Donald collapsed to the floor. After all, she had just got done saying robots were no good in a riot. First Law conflicts would send even a police robot like Donald right into a major, and probably fatal, brainlock. Kresh had shut his assistant down just barely in time. Kresh looked up at Fredda, and she looked back at him. Their eyes met, and in some strange way the two of them were alone in that moment, two combatants eye-to-eye, all the pretense, all the side issues, stripped away.

And Fredda Leving was terrified to discover how much of herself she saw in Alvar Kresh.

THEaudience was a mob, a whirl of bodies rushing in all directions, and Kresh was jostled, shoved, knocked down to land on Donald. He got to his feet, turned, and looked back toward Fredda Leving. But the moment, whatever it had been, was already gone. A metallic hand snatched at Fredda's injured shoulder. Alvar saw her jump in surprise, flinch back from the contact.

It was Tonya Welton's robot, Ariel. Alvar saw Fredda turn and face the robot, saw Ariel urge her toward the backstage area, away from the chaos in the auditorium. She allowed herself to be led away, hustled with the others through the door that led off the backstage area. There was something strange in that moment, something Alvar could not quite place. But there was no time to think it over. The Ironheads and Settlers were closing in on each other, and the riot was about to begin in earnest. Alvar Kresh turned to lend a hand to his deputies.

He threw himself into the fight.