The Positronic Man - Page 13/25

ANDREW EXPERIENCED a sensation of discomfort after Little Miss's death that would not leave him for weeks. To call it grief might be a little too strong, he thought, for he suspected that there was no place in his positronic pathways for any feeling that corresponded exactly to the human emotion known as grief.

And yet there was no question but that he was disturbed in some way that could only be traced to the loss of Little Miss. He could not have quantified it. A certain heaviness about his thoughts, a certain odd sluggishness about his movements, a perception of general imbalance in his rhythms-he felt these things, but he suspected that no instruments would be able to detect any measurable change in his capacities.

To ease this sensation of what he would not let himself call grief he plunged deep into his research on robot history, and his manuscript began to grow from day to day.

A brief prologue sufficed to deal with the concept of the robot in history and literature-the metal men of the ancient Greek myths, the automata imagined by clever storytellers like E. T. A. Hoffmann and Karel Capek, and other such fantasies. He summarized the old fables quickly and dispensed with them. It was the positronic robot-the real robot, the authentic item-that Andrew was primarily concerned with.

And so Andrew moved swiftly to the year 1982 and the incorporation of United States Robots and Mechanical Men by its visionary founder, Lawrence Robertson. He felt almost as though he were reliving the story himself, as he told of the early years of struggle in drafty converted-warehouse rooms and the first dramatic breakthrough in the construction of the platinum-iridium positronic brain, after endless trial-and-error. The conception and development of the indispensable Three Laws; research director Alfred Lanning's early triumphs at designing mobile robot units, clumsy and ponderous and incapable of speech, but versatile enough to be able to interpret human orders and select the best of a number of possible alternative responses. Followed by the first mobile speaking units at the turn of the Twenty-First Century.

And then Andrew turned to something much more troublesome for him to describe: the period of negative human reaction which followed, the hysteria and downright terror that the new robots engendered, the worldwide outburst of legislation prohibiting the use of robot labor on Earth. Because miniaturization of the positronic brain was still in the development stage then and the need for elaborate cooling systems was great, the early mobile speaking units had been gigantic-nearly twelve feet high, frightful lumbering monsters that had summoned up all of humanity's fears of artificial beings-of Frankenstein's monster and the Golem and all the rest of that assortment of nightmares.

Andrew's book devoted three entire chapters to that time of extreme robot-fear. They were enormously difficult chapters to write, for they dealt entirely with human irrationality, and that was a subject almost impossible for Andrew to comprehend.

He grappled with it as well as he could, striving to put himself in the place of human beings who-though they knew that the Three Laws provided foolproof safeguards against the possibility that robots could do harm to humans-persisted in looking upon robots with dread and loathing. And after a time Andrew actually succeeded in understanding, as far as he was able, how it had been possible for humans to have felt insecure in the face of such a powerful guarantee of security.

For what he discovered, as he made his way through the archives of robotics, was that the Three Laws were not as foolproof a safeguard as they seemed. They were, in fact, full of ambiguities and hidden sources of conflict. And they could unexpectedly confront robots-straightforward literal-minded creatures that they were-with the need to make decisions that were not necessarily ideal from the human point of view.

The robot who was sent on a dangerous errand on an alien planet, for example-to find and bring back some substance vital to the safety and well-being of a human explorer-might feel such a conflict between the Second Law of obedience and the Third Law of self-preservation that he would fall into a hopeless equilibrium, unable either to go forward or to retreat. And by such a stalemate the robot-through inaction-thus could create dire jeopardy for the human who had sent him on his mission, despite the imperatives of the First Law that supposedly took precedence over the other two. For how could a robot invariably know that the conflict he was experiencing between the Second and Third Laws was placing a human in danger? Unless the nature of his mission had been spelled out precisely in advance, he might remain unaware of the consequences of his inaction and never realize that his dithering was creating a First Law violation.

Or the robot who might, through faulty design or poor programming, decide that a certain human being was not human at all, and therefore not in a position to demand the protection that the First and Second Laws were supposed to afford

Or the robot who was given a poorly phrased order, and interpreted it so literally that he inadvertently caused danger to humans nearby

There were dozens of such case histories in the archives. The early roboticists-most notably the extraordinary robopsychologist, Susan Calvin, that formidable and austere woman-had labored long and mightily to cope with the difficulties that kept cropping up.

The problems had become especially intricate as robots with more advanced types of positronic pathways began to emerge from the workshops of U. S. Robots and Mechanical Men toward the middle of the Twenty-First Century: robots with a broader capacity for thought, robots who were able to look at situations and perceive their complexities with an almost human depth of understanding. Robots like-though he took care not to say so explicitly-Andrew Martin himself. The new generalized-pathway robots, equipped with the ability to interpret data in much more subjective terms than their predecessors, often reacted in ways that humans were not expecting. Always within the framework of the Three Laws, of course. But sometimes from a perspective that had not been anticipated by the framers of those laws.

As he studied the annals of robot development, Andrew at last understood why so many humans had been so phobic about robots. It wasn't that the Three Laws were badly drawn-not at all. Indeed, they were masterly exemplars of logic. The trouble was that humans themselves were not always logical-were, on occasion, downright illogical-and robots were not always capable of coping with the swoops and curves and tangents of human thought.

So it was humans themselves who sometimes led robots into violations of one or another of the Three Laws-and then, in their illogical way, often would blame the robots themselves for having done something undesirable which in fact they had actually been ordered to do by their human masters.

Andrew handled these chapters with the utmost care and delicacy, revising and revising them to eliminate any possibility of bias. It was not his intention to write a diatribe against the flaws of mankind. His prime goal, as always, was to serve the needs of mankind.

The original purpose of writing his book might have been to arrive at a deeper understanding of his own relationship to the human beings who were his creators-but as he proceeded with it he saw that, if properly and thoughtfully done, the book could be an invaluable bridge between humans and robots, a source of enlightenment not only for robots but for the flesh-and-blood species that had brought them into the world. Anything that enabled humans and robots to get along better would permit robots to be of greater service to humanity; and that, of course, was the reason for their existence.

When he had finished half his book, Andrew asked George Charney to read what he had written and offer suggestions for its improvement. Several years had passed since the death of Little Miss, and George himself seemed unwell now, his once robust frame gaunt, his hair nearly gone. He looked at Andrew's bulky manuscript with an expression of barely masked discomfort and said, "I'm not really much of a writer myself, you know, Andrew."

"I'm not asking for your opinion of my literary style, George. It's my ideas that I want you to evaluate. I need to know whether there's anything in the manuscript that might be offensive to human beings."

"I'm sure there isn't, Andrew. You have always been the soul of courtesy."

"I would never knowingly give offense, that is true. But the possibility that I would inadvertently-"

George sighed. "Yes. Yes, I understand. All right, I'll read your book, Andrew. But you know that I've been getting tired very easily these days. It may take me a while to plow all the way through it."

"There is no hurry," said Andrew.

Indeed George took his time: close to a year. When he finally returned the manuscript to Andrew, though, there was no more than half a page of notes attached to it, the most minor factual corrections and nothing more.

Andrew said mildly, "I had hoped for criticisms of a more general kind, George."

"I don't have any general criticisms to make. It's a remarkable work. Remarkable. It's a truly profound study of its subject. You should be proud of what you've done."

"But where I touch on the topic of how human irrationality has often led to Three Laws difficulties-"

"Absolutely on the mark, Andrew. We are a sloppy-minded species, aren't we? Brilliant and tremendously creative at times, but full of all sorts of messy little contradictions and confusions. We must seem like a hopelessly illogical bunch to you, don't we, Andrew?"

"There are times that it does seem that way to me, yes. But it is not my intention to write a book that is critical of human beings. Far from it, George. What I want to give the world is something that will bring humans and robots closer together. And if I should seem to be expressing scorn for the mental abilities of humans in any way, that would be the direct opposite of what I want to be doing. Which is why I had hoped that you would single out, in your reading of my manuscript, any passages that might be interpreted in such a way that-"

"Perhaps you should have asked my son Paul to read the manuscript instead of me," George said. "He's right at the top of his profession, you know. So much more in touch with all these matters of nuance and subtle inference than I am these days."

And Andrew finally understood from that statement that George Charney had not wanted to read his manuscript at all-that George was growing old and weary, that he was entering the final years of his life, that once again the wheel of the generations had turned and that Paul was now the head of the family. Sir had gone and so had Little Miss and soon it was going to be George's turn. Martins and Charneys came and went and yet Andrew remained-not exactly unchanging (for his body was still undergoing occasional technological updating and it also seemed to him that his mental processes were constantly deepening and growing richer as he allowed himself to recognize fully his own extraordinary capabilities), but certainly invulnerable to the ravages of the passing years.

He took his nearly finished manuscript to Paul Charney. Paul read it at once and offered not only praise but, as George had indicated, valuable suggestions for revision. There were places where Andrew's inability to comprehend the abrupt, non-linear jumps of reasoning of which the human mind is capable had led him into certain oversimplifications and unwarranted conclusions. If anything, Paul thought the book was too sympathetic to the human point of view. A little more criticism of the irrational human attitude toward robotics, and toward science in general, might not have been out of place.

Andrew had not expected that.

He said, "But I would not want to offend anyone, Paul."

"No book worth reading has ever been written that didn't manage to offend someone," Paul replied. "Write what you believe to be the truth, Andrew. It would be amazing if everybody in the world agreed with you. But your viewpoint is unique. You have something real and valuable to give the world here. It won't be worth a thing, though, if you suppress what you feel and write only what you think others want to hear."

"But the First Law-"

"Damn the First Law, Andrew! The First Law isn't everything! How can you harm someone with a book? Well, by hitting him over the head with it, I suppose. But not otherwise. Ideas can't do harm-even wrong ideas, even foolish and vicious ideas. People do the harm. They seize hold of certain ideas, sometimes, and use them as the justification for doing unconscionable, outrageous things. Human history is full of examples of that. But the ideas themselves are just ideas. They must never be throttled. They need to be brought forth, inspected, tested, if necessary rejected, right out in the open. -Anyway, the First Law doesn't say anything about robots writing books. Sticks and stones, Andrew-they can do harm. But words-"

"As you yourself have just remarked, Paul, human history is full of harmful events that began simply with words. If those words had never been uttered, the harmful events would not have taken place."

"You don't understand what I'm saying, do you? Or do you? I think you do. You know what power ideas have, and you don't have a lot of faith in the ability of humans to tell a good idea from a bad one. Well, neither do I, sometimes. But in the long run the bad idea will perish. That's been the story of human civilization for thousands of years. The good does prevail, sooner or later, no matter what horrors have happened along the way. And so it's wrong to suppress an idea that may have value to the world. -Look, Andrew: you're probably the closest thing to a human being that has ever come out of the factories of U. S. Robots and Mechanical Men. You're uniquely equipped to tell the world what it needs to know about the human-robot relationship, because in some ways you partake of the nature of each. And so you may help to heal that relationship, which even at this late date is still a very troubled one. Write your book. Write it honestly."

"Yes. I will, Paul."

"Do you have a publisher in mind for it, by the way?"

"A publisher? Why, no. I haven't yet given any thought to-"

"Well, you should. Or let me do it for you. I have a friend in the book business-a client, really-do you mind if I say a word or two to him?"

"That would be quite kind of you," Andrew said.

"Not at all. I want to see this book out there where it can be read by everybody, just as you do."

And indeed within a few weeks Paul had secured a publishing contract for Andrew's book. He assured Andrew that the terms were extremely generous, extremely fair. That was good enough for Andrew. He signed the contract without hesitation.

Over the next year, while he worked on the closing sections of his manuscript, Andrew often thought of the things Paul had said to him that day-about the importance of stating his beliefs honestly, the value that his book could have if he did. And also about his own uniqueness. There was one statement of Paul's that Andrew could not get out of his mind.

Look, Andrew: you're probably the closest thing to a human being that has ever come out of the factories of U. S. Robots and Mechanical Men. You're uniquely equipped to tell the world what it needs to know about the human-robot relationship, because in some ways you partake of the nature of each.

Was it so? Is that what Paul really thought, Andrew wondered, or had it just been the heat of the moment that had led him to say those things?

Andrew asked himself that over and over again, and gradually he began to form an answer.

And then he decided that the time had come to pay another visit to the offices of Feingold and Charney and have another talk with Paul.

He arrived unannounced, but the receptionist greeted him without any inflection of surprise in its voice. Andrew was far from an unfamiliar figure by this time at the Feingold and Charney headquarters.

He waited patiently while the receptionist disappeared into the inner office to notify Paul that Andrew was here. It would surely have been more efficient if the receptionist had used the holographic chatterbox, but unquestionably it was unmanned (or perhaps the word was "unroboted") by having to deal with another robot rather than with a human being.

Eventually the receptionist returned. "Mr. Charney will be with you soon," the receptionist announced, and went back to its tasks without another word.

Andrew passed the time revolving in his mind the matter of his word choice of a few minutes before. Could "unroboted" be used as an analog of "unmanned"? he wondered. Or had "unmanned" become a purely metaphoric term sufficiently divorced from its original literal meaning to be applied to robots-or to women, for that matter?

Many similar semantic problems had cropped up frequently while Andrew was working on his book. Human language, having been invented by humans for the use of humans, was full of little tricky complexities of that sort. The effort that was required in order to cope with them had undoubtedly increased Andrew's own working vocabulary-and, he suspected, the adaptability of his positronic pathways as well.

Occasionally as Andrew sat in the waiting room someone would enter the room and stare at him. He was the free robot, after all-still the only one. The clothes-wearing robot. An anomaly; a freak. But Andrew never tried to avoid the glances of these curiosity-seekers. He met each one calmly, and each in turn looked quickly away.

Paul Charney finally came out. He and Andrew had not seen each other since the winter, at the funeral of Paul's father George, who had died peacefully at the family home and now lay buried on a hillside over the Pacific. Paul looked surprised to see Andrew now, or so Andrew thought-though Andrew still had no real faith in his ability to interpret human facial expressions accurately.

"Well, Andrew. So good to see you again. I'm sorry I made you wait, but there was something I had to finish."

"Quite all right. I am never in a hurry, Paul."

Paul had taken lately to wearing the heavy makeup that fashion was currently dictating for both sexes, and though it made the somewhat bland lines of his face sharper and firmer, Andrew disapproved. He felt that Paul's strong, incisive personality needed no such cosmetic enhancement. It would have been perfectly all right for Paul to allow himself to look bland; there was nothing bland about the man himself, and no need for all this paint and powder.

Andrew kept his disapproval to himself, of course. But the fact that he disapproved of Paul's appearance at all was something of a novelty for him. He had only just begun to have such thoughts. Since finishing the first draft of his book, Andrew had discovered that disapproving of the things human beings did, as long as he avoided expressing such opinions openly, did not make him as uneasy as he might have anticipated. He could think disapproving thoughts without difficulty and he was even able to put his disapproval in writing. He was certain that it had not always been like that for him.

Paul said, "Come inside, Andrew. I heard that you wanted to talk to me, but I wasn't really expecting that you'd come all the way down here to do it."

"If you are too busy to see me just now, Paul, I am prepared to continue to wait."

Paul glanced at the interplay of shifting shadows on the dial on the wall that served as the reception-office's timepiece and said, "I can make some time. Did you come alone?"

"I hired an automatobile."

"Any trouble doing that?" Paul asked, with more than a trace of anxiety in his tone.

"I wasn't expecting any. My rights are protected."

Paul looked all the more anxious for that. "Andrew, I've explained to you half a dozen times that that law is essentially unenforceable, at least in most circumstances. -And if you insist on wearing clothes, you're bound to run into trouble eventually, you know. Just as you did that first time when my father had to rescue you."

"It was the only such time, Paul. But I'm sorry that you're displeased."

"Well, look at it this way: you're virtually a living legend, do you realize that? People sometimes like to win a little ugly fame for themselves by making trouble for celebrities, and a celebrity is certainly what you are. Besides, as I've already told you, you're too valuable in too many ways for you to have any right to take chances with yourself. -How's the book coming along, by the way?"

"I've finished a complete draft. Now I'm doing the final editing and polishing. At least, I hope it will be the final editing and polishing. The publisher is quite pleased with what he's seen so far."

"Good!"

"I don't know that he's necessarily pleased with the book as a book. There are parts of it that make him uncomfortable, I think. But it's my guess that he expects to sell a great many copies simply because it's the first book written by a robot, and it's that aspect that pleases him."

"It's only human, I'm afraid, to be interested in making money, Andrew."

"I would not be displeased by it either. Let the book sell, for whatever reason it does. I can find good uses for whatever money it brings in."

"But I thought you were well off, Andrew! You've always had your own income-and there was the quite considerable amount of money my grandmother left you-"

"Little Miss was extremely generous. And I'm sure I can count on the family to help me out further, if a time comes when my expenses begin to exceed my income. Still, I would rather be able to earn my own way at all times. I would not want to draw on your resources except as a last resort."

"Expenses? What expenses can you be talking about? Yachts? Trips to Mars?"

"Nothing like that," said Andrew. "But I do have something rather costly in mind, Paul. It's my hope that the royalties from my book will be large enough to see me through what I have in mind. My next step, so to speak."

Paul looked a little uneasy. "And what is that?"

"Another upgrade."

"You've always been able to pay for your upgrades out of your own funds up till now."

"This one may be more expensive than the others."

Paul nodded. "Then the book royalties will come in handy. And if they're disappointing, I'm sure that we can find some way of making up-"

"It isn't only a matter of money," Andrew said. "There are some other complications. -Paul, for this one I have to go straight to the top. I need to see the head of the U. S. Robots and Mechanical Men Corporation and get his clearance for the job. I've tried to make an appointment, but so far I haven't been able to get through to him at all. No doubt it's because of my book. The corporation wasn't particularly enthusiastic about my writing a book, you know-they provided no cooperation whatever, as a matter of fact-"

A grin appeared on Paul's face. "Cooperation, Andrew? Cooperation's the last thing you could have expected from them. You scare them silly. They didn't cooperate with us in either stage of our great fight for robot rights, did they? Quite the reverse, actually. And you surely understand why. Give a robot too many rights and no one's going to want to buy one, eh?"

"That may be true, or perhaps not. In any case, I want to speak with the head of the company concerning a very special request that I have. I can't manage to get through by myself, but perhaps if you make the call for me-"

"You know that I'm not any more popular with them than you are, Andrew."

"Nevertheless, you're the head of a powerful and influential law firm and a member of a great and distinguished family. They can't simply ignore you. And if they try, you can always hint that by seeing me they stand a chance of heading off a new campaign by Feingold and Charney to strengthen the civil rights of robots even further."

"Wouldn't that be a lie, Andrew?"

"Yes, Paul, and I'm not good at telling lies. I can't tell one at all, in fact, unless I do it under the constraint of one of the Three Laws. That's why you have to make the call for me."

Paul chuckled. "Ah, Andrew, Andrew! You can't tell a lie, but you can urge me to tell one for you, is that it? You're getting more human all the time!"