The Robots of Dawn (Robot #3)

6

It was Baley's third time on a spaceship and the passage of two years had in no way dimmed his memory of the first two times. He knew exactly what to expect.

There would be the isolation - the fact that no one would see him or have anything to do with him, with the exception (perhaps) of a robot. There would be the constant medical treatment - the fumigation and sterilization. (No other way of putting it.) There would be the attempt to make him fit to approach the disease-conscious Spacers who thought of Earthpeople as walking bags of multifarious infections.

There would be differences, too, however. He would not, this time, be quite so afraid of the process. Surely the feeling of loss at being out of the womb would be less dreadful.

He would be prepared for the wider surroundings. This time, he told himself boldly (but with a small knot in his stomach, for all that), he might even be able to insist on being given a view of space.

Would it look different from photographs of the night sky as seen from Outside, he wondered?

He remembered his first view of a planetarium dome (safely within the City, of course). It had given him no sensation of being Outside, no discomfort at all.

Then there were the two times - no, three - that he had been in the open at night and had seen the real stars in the real dome of the sky. That had been far less impressive than the planetarium dome had been, but there had been a cool wind each time and a feeling of distance, which made it more frightening than the dome - but less frightening than daytime, for the darkness was a comforting wall about him.

Would, then, the sight of the stars through a spaceship viewing window seem more like a planetarium or more like Earth's night sky? Or would it be a different sensation altogether?

He concentrated on that, as though to wash out the thought of leaving Jessie, Ben, and the City.

With nothing less than bravado, he refused the car and insisted on walking the short distance from the gate to the ship in the company of the robot who had come for him. It was just a roofed-over arcade, after all.

The passage was slightly curved and he looked back while he could still see Ben at the other end. He lifted his hand casually, as though he were taking the Expressway to Trenton, and Ben waved both arms wildly, holding up the first two fingers of each hand outspread in the ancient symbol of victory.

Victory? A useless gesture, Baley was certain.

He switched to another thought that might serve to fill and occupy him. What would it be like to board a spaceship by day, with the sun shining brightly on its metal and with himself and the others who were boarding all exposed to the Outside?

How would it feel to be entirely aware of a tiny cylindrical world, one that would detach itself from the infinitely larger world to which it was temporarily attached and that would then lose itself in an Outside, infinitely larger than any Outside on Earth, until after an endless stretch of Nothingness it would find another -

He held himself grimly to a steady walk, letting no change in expression show - or so he thought, at least. The robot at his side, however, brought him to a halt.

"Are you ill, sir?" (Not "master," merely "sir." It was an Auroran robot.)

"I'm all right, boy," said Baley hoarsely. "Move on."

He kept his eyes turned to the ground and did not lift them again till the ship itself was towering above him.

An Auroran ship!

He was sure of that. Outlined by a warm spotlight, it soared taller, more gracefully, and yet more powerfully than the Solarian ships had.

Baley moved inside and the comparison remained in favor of Aurora. His room was larger than the ones two years before had been: more luxurious, more comfortable.

He knew exactly what was coming and removed all his clothes without hesitation. (Perhaps they would be disintegrated by plasma torch. Certainly, he would not get them back on returning to Earth - if he returned. He hadn't the first time.)

He would receive no other clothes till he had been thoroughly bathed, examined, dosed, and injected. He almost welcomed the humiliating procedures imposed on him. After all, they served to keep his mind off what was taking place. He was scarcely aware of the initial acceleration and scarcely had time to think of the moment during which he left Earth and entered space.

When he was finally dressed again, he surveyed the results unhappily in a mirror. The material, whatever it was, was smooth and reflective and shifted color with any change in angle. The trouser legs hugged his ankles and were, in turn, covered by the tops of shoes that molded themselves softly to his feet. The sleeves of his blouse hugged his wrists and his hands were covered by thin, transparent gloves. The top of the blouse covered his neck and an attached hood could, if desired, cover his head. He was being so covered, not for his own comfort, he knew, but to reduce his danger to the Spacers.

He thought, as he looked at the outfit, that he should feel uncomfortably enclosed, uncomfortably hot, uncomfortably damp. But he did not. He wasn't, to his enormous relief, even sweating.

He made the reasonable deduction. He said to the robot that had walked him to the ship and was still with him, "Boy, are these clothes temperature-controlled?"

The robot said, "Indeed they are, sir. It is all-weather cloth and is considered very desirable. It is also exceedingly expensive. Few on Aurora are in a position to wear it."

"That so? Jehoshaphat!"

He stared at the robot. It seemed a fairly primitive model, not very much different from Earth models, in fact. Still, there was a certain subtlety of expression that Earth models lacked. It could change expression in a limited way, for instance. It had smiled very slightly when it indicated that Baley had been given that which few on Aurora could afford.

The structure of its body resembled metal and yet had the look of something woven, something shifting slightly with movement, something with colors that matched and contrasted pleasingly. In short, unless one looked very closely and steadily, the robot, though definitely nonhumaniform, seemed to be wearing clothing.

Baley said, "What ought I to call you, boy?"

"I am Giskard, sir."

"R. Giskard?"

"If you wish, sir."

"Do you have a library on this ship?"

"Yes, sir."

"Can you get me book-films on Aurora?"

"What kind, sir?"

"Histories - political science - geographies - anything that will let me know about the planet."

"Yes, sir."

"And a viewer."

"Yes, sir.

The robot left through the double door and Baley nodded grimly to himself. On his trip to Solaria, it had never occurred to him to spend the useless time crossing space in learning something useful. He had come along a bit in the last two years.

He tried the door the robot had just passed through. It was locked and utterly without give. He would have been enormously surprised at anything else.

He investigated the room. There was a hyperwave screen. He handled the controls idly, received a blast of music, managed to lower the volume eventually, and listened with disapproval. Tinkly and discordant. The instruments of the orchestra seemed vaguely distorted.

He touched other contacts and finally managed to change the view. What he saw was a space-soccer game that was played, obviously, under conditions of zero-gravity. The ball flew in straight lines and the players (too many of them on each side - with fins on backs, elbows, and knees that must serve to control movement) soared in graceful sweeps. The unusual movements made Baley feel dizzy. He leaned forward and had just found and used the off-switch when he heard the door open behind him.

He turned and, because he thoroughly expected to see R. Giskard, he was aware at first only of someone who was not R. Giskard. It took a blink or two to realize that he saw a thoroughly human shape, with a broad, high-cheekboned face and with short, bronze hair lying flatly backward, someone dressed in clothing with a conservative cut and color scheme.

"Jehoshaphat!" said Baley in a nearly strangled voice.

"Partner Elijah," said the other, stepping forward, a small grave smile on his face.

"Daneel!" cried Baley, throwing his arms around the robot and hugging tightly. "Daneel!"

7

Baley continued to hold Daneel, the one unexpected familiar object on the ship, the one strong link to the past. He clung to Daneel in a gush of relief and affection.

And then, little by little, he collected his thoughts and knew that he was hugging not Daneel but R. Daneel - Robot Daneel Olivaw. He was hugging a robot and the robot was holding him lightly, allowing himself to be hugged, judging that the action gave pleasure to a human being and enduring that action because the positronic potentials of his brain made it impossible to repel the embrace and so cause disappointment and embarrassment to the human being.

The insurmountable First Law of Robotics states: "A robot may not injure a human being" - and to repel a friendly gesture would do injury.

Slowly, so as to reveal no sign of his own chagrin, Baley released his hold. He even gave each upper arm of the robot a final squeeze, so that there might seem to be no shame to the release.

"Haven't seen you, Daneel," said Baley, "since you brought that, ship to Earth with the two mathematicians. Remember?"

"Of a certainty, Partner Elijah. It is a pleasure to see you."

"You feel emotion, do you?" said Baley lightly.

"I cannot say what I feel in any human sense, Partner Elijah. I can say, however, that the sight of you seems to make my thoughts flow more easily, and the gravitational pull on my body seems to assault my senses with lesser insistence, and that there are other changes I can identify. I imagine that what I sense corresponds in a rough way to what it is that you may sense when you feel pleasure."

Baley nodded. "Whatever it is you sense when, you see me, old partner, that makes it seem preferable to the state in which you are when you don't see me, suits me well - if you follow my meaning. But how is it you are here?"

"Giskard Reventlov, having reported you - " R. Daneel paused.

"Purified?" asked Baley sardonically.

"Disinfected," said R. Daneel. "I felt it appropriate to enter then."

"Surely you would not fear infection otherwise?"

"Not at all, Partner Elijah, but others on the ship might then be reluctant to have me approach them. The people of Aurora are sensitive to the chance of infection, sometimes to a point beyond a rational estimate of the probabilities."

"I understand, but I wasn't asking why you were here at this moment. I meant why are you here at all?"

"Dr. Fastolfe, of whose establishment I am part, directed me to board the ship that had been sent to pick you up for several reasons. He felt it desirable that you have one immediate item of the known in what he was certain would be a difficult mission for you."

"That was a kindly thought on his part. I thank him."

R. Daneel bowed gravely in acknowledgment. "Dr. Fastolfe also felt that the meeting would give you" - the robot paused - "appropriate sensations - "

"Pleasure, you mean, Daneel."

"Since I am permitted to use the term, yes. And as a third reason - and the most important - "

The door opened again at that point and R. Giskard walked in.

Baley's head turned toward it and he felt a surge of displeasure. There was no mistaking R. Giskard as a robot and its presence emphasized, somehow, the robotism of Daneel (R. Daneel, Baley suddenly thought again), even though Daneel was far the superior of the two. Baley didn't want the robotism of Daneel emphasized; he didn't want himself humiliated for his inability to regard Daneel as anything but a human being with a somewhat stilted way with the language.

He said impatiently, "What is it, boy?"

R. Giskard said, "I have brought the book-films you wished to see, sir, and the viewer."

"Well, put them down. Put them down. - And you needn't stay. Daneel will be here with me."

"Yes, sir." The robot's eyes - faintly glowing, Baley noticed, as Daneel's were not - turned briefly to R. Daneel, as though seeking orders from a superior being.

R. Daneel said quietly, "It will be appropriate, friend Giskard, to remain just outside the door."

"I shall, friend Daneel," said R. Giskard.

It left and Baley said with some discontent, "Why does it have to stay just outside the door? Am I a prisoner?"

"In the sense," said R. Daneel, "that it would not be permitted for you to mingle with the ship's company in the course of this voyage, I regret to be forced to say you are indeed a prisoner. Yet that is not the reason for the presence of Giskard. - And I should tell you at this point that it might well be advisable, Partner Elijah, if you did not address Giskard - or any robot - as boy."

Baley frowned. "Does it resent the expression?"

"Giskard does not resent any action of a human being. It is simply that 'boy' is not a customary term of address for robots on Aurora and it would be inadvisable to create friction with the Aurorans, by unintentionally stressing your place of origin through habits of speech that are nonessential."

"How do I address it, then?"

"As you address me, by the use of his accepted identifying name. That is, after all, merely a sound indicating the particular person you are addressing - and why should one sound be preferable to another? It is merely a matter of convention. And it is also the custom on Aurora to refer to a robot as 'he' - or sometimes 'she' - rather than as 'it.' Then, too, it is not the custom on Aurora to use the initial 'R.' except under formal conditions where the entire name of the robot is appropriate and even then the initial is nowadays often left out."

"In that case - Daneel," (Baley repressed the sudden impulse to say "R. Daneel") "how do you distinguish between robots and human beings?"

"The distinction is usually self-evident, Partner Elijah. There would seem to be no need to emphasize it unnecessarily. At least that is the Auroran view and, since you have asked Giskard for films on Aurora, I assume you wish to familiarize yourself with things Auroran as an aid to the task you have undertaken."

"The task which has been dumped on me, yes. And what if the distinction between robot and human being is not self evident, Daneel? As in your case?"

"Then why make the distinction, unless the situation is such that it is essential to make it?"

Baley took a deep breath. It was going to be difficult to adjust to this Auroran pretense that robots did not exist. He said, "But then, if Giskard is not here to keep me prisoner, why is it - he - outside the door?"

"Those are according to the instructions of Dr. Fastolfe, Partner Elijah. Giskard is to protect you."

"Protect me? Against what? - Or against whom?"

"Dr. Fastolfe was not precise on that point, Partner Elijah. Still, as human passions are running high over the matter of Jander Panell - "

"Jander Panell?"

"The robot whose usefulness was terminated."

"The robot, in other words, who was killed?"

"Killed, Partner Elijah, is a term that is usually applied to human beings."

"But on Aurora distinctions between robots and human beings are avoided, are they not?"

"So they are! Nevertheless, the possibility of distinction or lack of distinction in the particular case of the ending of functioning has never arisen to my knowledge. I do not know what the rules are."

Baley pondered the matter. It was a point of no real importance, purely a matter of semantics. Still, he wanted to probe the manner of thinking of the Aurorans. He would get nowhere otherwise.

He said slowly, "A human being who is functioning, is alive. If that life is violently ended by the deliberate action of another human being, we call that 'murder' or 'homicide.' 'Murder' is, somehow, the stronger word. To be witness, suddenly, to an attempted violent end to the life of a human being, one would shout 'Murder!' It is not at all likely that one would shout 'Homicide!' It is the more formal word, the less emotional word."

R. Daneel said, "I do not understand the distinction you are making, Partner Elijah. Since 'murder' and 'homicide' are both used to represent the violent ending of the life of a human being, the two words must be interchangeable. Where, then, is the distinction?"

"Of the two words, one screamed out will more effectively chill the blood of a human being than the other will, Daneel."

"Why is that?"

"Connotations and associations; the subtle effect, not of dictionary meaning, but of years of usage; the - nature of the sentences and conditions and events in which one has experienced the use of one word as compared with that of the other."

"There is nothing of this in my programming," said Daneel, with a curious sound of helplessness hovering over the apparent lack of emotion with which he said this (the same lack of emotion with which he said everything).

Baley said, "Will you accept my word for it, Daneel?"

Quickly, Daneel said, almost as though he had just been presented with the solution to a puzzle, "Without doubt."

"Now, then, we might say that a robot that is functioning is alive," said Baley. "Many might refuse to broaden the word so far, but we are free to devise definitions to suit ourselves if it is useful. It is easy to treat a functioning robot as alive and it would be unnecessarily complicated to try to invent a new word for the condition or to avoid the use of the familiar one. You are alive, for instance, Daneel, aren't you?"

Daneel said, slowly and with emphasis, "I am functioning!"

"Come. If a squirrel is alive, or a bug, or a tree, or a blade of grass, why not you? I would never remember to say - or to think - that I am alive but that you are merely functioning, especially if I am to live for a while on Aurora, where I am to try not, to make unnecessary distinctions between a robot and myself. Therefore, I tell you that we are both alive and I ask you to take my word for it."

"I will do so, Partner Elijah."

"And yet can we say that the ending of robotic life - by the deliberate violent action of a human being is also 'murder'? We might hesitate. If the crime is the same, the punishment should be the same, but would that be right? If the punishment of the murder of a human being is death, should one actually execute a human being who puts an end to a robot?"

"The punishment of a murderer is psychic-probing, Partner Elijah, followed by the construction of a new personality. It is the personal structure of the mind that has committed the crime, not the life of the body."

"And what is, the punishment on Aurora for putting a violent end to the functioning of a robot?"

"I do not know, Partner Elijah. Such an incident has never occurred on Aurora, as far as I know."

"I suspect the punishment would not be psychic-probing," said Baley. "How about 'roboticide'?"

"Roboticide?"

"As the term used to describe the killing of a robot."

Daneel said, "But what about the verb derived from the noun, Partner Elijah? One never says 'to homicide' and it would therefore not be proper to say 'to roboticide.'"

"You're right. You would have to say 'to murder' in each case."

"But murder applies specifically to human beings. One does not murder an animal, for instance."

Baley said, "True. And one does not murder even a human being by accident, only be deliberate intent. The more general term is 'to kill'. That applies to accidental death as well as to deliberate murder - and it applies to animals as well as human beings. Even a tree may be killed by disease, so why may not a robot be killed, 'eh, Daneel?"

"Human beings and other animals and plants as well, Partner Elijah, are all living things," said Daneel. "A robot is a human artifact, as much as this viewer is. An artifact is 'destroyed', 'damaged', 'demolished', and so on. It is never 'killed'."

"Nevertheless, Daneel, I shall say 'killed.' Jander Panell was killed."

Daneel said, "Why should a difference in a word make any difference to the thing described?"

"That which we call a rose by any other name would smell as sweet. Is that it, Daneel?"

Daneel paused, then said, "I am not certain what is meant by the smell of a rose, but if a rose on Earth is the common flower that is called a rose on Aurora, and if by its 'smell' you mean a property that can be detected, sensed, or measured by human beings, then surely calling a rose by another sound combination and holding all else equal - would not affect the smell or any other of its intrinsic properties."

"True. And yet, changes in name do result in changes in perception, where human beings are concerned."

"I do not see why, Partner Elijah."

"Because human beings are often illogical, Daneel. It is not an admirable characteristic."

Baley sank deeper into his chair and fiddled with his viewer, allowing his mind, for a few minutes, to retreat into private thought. The discussion with Daneel was useful in itself, for while Baley played with the question of words, he managed to forget that he was in space, to forget that the ship was moving forward until it was far enough from the mass centers of the Solar System to make the Jump through hyperspace; to forget that he would soon be several million kilometers from Earth and, not long after that, several lightyears from Earth.

More important, there were positive conclusions to be drawn. It was clear that Daneel's talk about Aurorans making no distinction between robots and human beings was misleading. The Aurorans might virtuously remove the initial "R.," the use of "boy" as a form of address, and the use of "it" as the customary pronoun, but from Daneel's resistance to the use of the same word for the violent ends of a robot and of a human being (a resistance inherent in his programming which was, in turn, the natural consequence of Auroran assumptions about how Daneel ought to behave) one had to conclude that these were merely superficial changes. In essence, Aurorans were as firm as Earthmen in their belief that robots were machines that were infinitely inferior to human beings.

That meant that his formidable task of finding a useful resolution of the crisis (if that were possible at all) would not be hampered by at least one particular misperception of Auroran society.

Baley wondered if he ought to question Giskard, in order to confirm the conclusions he reached from his conversation with Daneel - and, without much hesitation, decided not to. Giskard's simple and rather unsubtle mind would be of no use. He would "Yes, sir" and "No, sir" to the end. It would be like questioning a recording.

Well, then, Baley decided, he would continue with Daneel, who was at least capable of responding with something approaching subtlety.

He, said, "Daneel, let us consider the case of Jander Panell, which I assume, from what you have said so far, is the first case of roboticide in the history of Aurora. The human being responsible - the killer - is I take it, not known."

"If," said Daneel, "one assumes that a human being was responsible, then his identity is not known. In that, you are right, Partner Elijah."

"What about the motive? Why was Jander Panell killed?"

"That, too, is not known."

"But Jander Panell was a humaniform robot, one like yourself and not one like, for instance, R. Gis - I mean, Giskard."

"That is so. Jander was a humaniform robot like myself."

"Might it not be, then, that no case of roboticide was intended?"

"I do not understand, Partner Elijah."

Baley said, a little impatiently, "Might not the killer have thought this Jander was a human being, that the intention was homicide, not roboticide?"

Slowly, Daneel shook his head. "Humaniform robots are quite like human beings in appearance, Partner Elijah, down to the hairs and pores in our skin. Our voices are thoroughly natural, we can go through the motions of eating, and so on. And yet, in our behavior there are noticeable differences. There may be fewer such differences with time and with refinement of technique, but as yet they are many. You - and other Earthmen not used to humaniform robots - may not easily note these differences, but Aurorans would. No Auroran would mistake Jander - or me - for a human being, not for a moment."

"Might some Spacer, other than an Auroran, make the mistake?"

Daneel hesitated. "I do not think so. I do not speak from personal observation or from direct programmed knowledge, but I do have the programming to know that all Spacer worlds are as intimately acquainted with robots as Aurora is - some, like Solaria, even more so - and I deduce, therefore, that no Spacer would miss the distinction between human and robot."

"Are there humaniform robots on the other Spacer worlds?"

"No, Partner Elijah, they exist only on Aurora so far."

"Then other Spacers would not be intimately acquainted with humaniform robots and might well miss the distinctions and mistake them for human beings."

"I do not think that is likely. Even humaniform robots will behave in robotic fashion in certain definite ways that any Spacer would recognize."

"And yet surely there are Spacers who are not as intelligent as most, not as experienced, not as mature. There are Spacer children, if nothing else, who would miss the distinction."

"It is quite certain, Partner Elijah, that the - roboticide was not committed by anyone unintelligent, inexperienced, or young. Completely certain."

"We're making eliminations. Good, If no Spacer would miss the distinction, what about an Earthman? Is it possible that - "

"Partner Elijah, when you arrive in Aurora, you will be the first Earthman to set foot on the planet since the period of original settlement was over. All Aurorans now alive were born on Aurora or, in a relatively few cases, on other Spacer worlds.

"The first Earthman," muttered Baley. "I am honored. Might not an Earthman be present on Aurora without the knowledge of Aurorans?"

"No!" said Daneel with simple certainty.

"Your knowledge, Daneel, might not be absolute."

"No!" came the repetition, in tones precisely similar to the first.

"We conclude, then," said Baley with a shrug, "that the roboticide was intended to be roboticide and nothing else."

"That was the conclusion from the start."

Baley said, "Those Aurorans who concluded this at the start had all the information to begin with. I am getting it now for the first time."

"My remark, Partner Elijah, was not meant in any pejorative manner. I know better than to belittle your abilities."

"Thank you, Daneel. I know there was no intended sneer in your remark. - You said just a while ago that the roboticide was not committed by anyone unintelligent, - inexperienced, or young and that this is completely certain. Let us consider your remark - "

Baley knew that he was taking the long route. He had to. Considering his lack of understanding of Auroran ways and of their manner of thought, he could not afford to make assumptions and skip steps. If he were dealing with an intelligent human being in this way, that person would be likely to grow impatient and blurt out information - and consider Baley an idiot into the bargain. Daneel, however, as a robot, would follow Baley down the winding road with total patience.

That was one type of behavior that gave away Daneel as a robot, however humaniform he might be. An Auroran might be able to judge him a robot from a single answer to a single question. Daneel was right as to the subtle distinctions.

Baley said, "One might eliminate children, perhaps also most women, and many male adults by presuming that the method of roboticide involved great strength - that Jander's head was perhaps crushed by a violent blow or that his chest was smashed inward. This would not, I imagine, be easy for anyone who was not a particularly large and strong human being." From what Demachek had said on Earth, Baley knew that this was not the manner of the roboticide, but how was he to tell that Demachek herself had not been misled?

Daneel said, "It would not be possible at all for any human being."

"Why not?"

"Surely, Partner Elijah, you are aware that the robotic skeleton is metallic in nature and much stronger than human bone. Our movements are more strongly powered, faster, and more delicately controlled. The Third Law of Robotics states: 'A robot must protect its own existence.' An assault by a human being could easily be fended off. The strongest human being could be immobilized. Nor is it likely that a robot can be caught unaware. We are always aware of human beings. We could not fulfill our functions otherwise."

Baley said, "Come now, Daneel. The Third Law states: 'A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.' The Second Law states: 'A robot must obey the orders given it by a human being, except where such orders would conflict with the First Law.' And the First Law states: 'A robot may, not injure a human being or, through inaction, allow a human being, to come to harm.' A human being could order a robot to destroy himself - and a robot would then use his own strength to smash his own skull. And if a human being attacked a robot, that robot could not fend off the attack without harming the human being, which would violate First Law."

Daneel Said, "You are, I suppose, thinking of Earth's robots. On Aurora - or on any of the Spacer worlds - robots are regarded more highly than on Earth, and are, in general, more complex, versatile, and valuable. The Third Law is distinctly stronger in comparison to the Second Law on Spacer worlds than it is on Earth. An order for self-destruction would be questioned and there would have to be a truly legitimate reason for it to be carried, through - a clear and present danger. And in fending off an attack, the First Law would not be violated, for Auroran robots are deft enough to immobilize a human being without hurting him."

"Suppose, though, that a human being maintained that, unless a robot destroyed himself, he - the human being - would be destroyed? Would not the robot then destroy himself?"

"An Auroran robot would surely question a mere statement to that effect. There would have to be clear evidence of the possible destruction of a human being."

"Might not a human being be, sufficiently subtle to so arrange matters in such a way as to make it seem to a robot that the human being was indeed in great danger? Is it the ingenuity that would be required that makes you eliminate the unintelligent, inexperienced, and young?"

And Daneel said, "No, Partner Elijah, it is not."

"Is there an error in my reasoning?"

"None."

"Then the effort may be in my assumption that he was physically damaged. He was not, in actual fact, physically damaged. Is that right?"

"Yes, Partner Elijah."

(That meant Demachek had had her facts straight, Baley thought.)

"In that case, Daneel, Jander was mentally damaged. Roblock! Total and irreversible!"

"Roblock?"

"Short for robot-block, the permanent shutdown of the functioning of the positronic pathways."

"We do not use the word 'roblock' on Aurora, Partner Elijah."

"What do you say?"

"We say 'mental freeze-out'."

"Either way, it is the same phenomenon being described."

"It might be wise, Partner Elijah, to use our expression or the Aurorans you speak to may not understand; conversation may be impeded. You stated a short while ago that different words make a difference."

"Very well. I will say 'freeze-out'. - Could such a thing happen spontaneously?"

"Yes, but the chances are infinitesimally small, roboticists say. As a humaniform robot, I can report that I have never myself experienced any effect that could even approach mental freeze-out."

"Then one must assume that a human being deliberately set up a situation in which mental freeze-out would take place."

"That is precisely what Dr. Fastolfe's opposition contends, Partner Elijah."

"And since this would take robotic training, experience, and skill, the unintelligent, the inexperienced, and the young cannot have been responsible."

"That is the natural reasoning, Partner Elijah."

"It might even be possible to list the number of human beings on Aurora with sufficient skill and thus set up a group of suspects that might not be very large in number."

"That has, in actual fact, been done, Partner Elijah."

"And how long is the list?"

"The longest list suggested contains only one name."

It was Baley's turn to pause. His brows drew together in an angry frown and he said, quite explosively, "Only one name?"

Daneel said quietly, "Only one name, Partner Elijah. That is the judgment of Dr. Han Fastolfe, who is Aurora's greatest theoretical roboticist."

"But what is, then, the mystery in all this? Whose is the one name?"

R. Daneel said, "Why, that of Dr. Han Fastolfe, of course. I have just stated that he is Aurora's greatest theoretical roboticist and, in Dr. Fastolfe's professional opinion, he himself, is the only one who could possibly have maneuvered Jander Panell into total mental freeze-out without leaving any sign of the process. However, Dr. Fastolfe also states that he did not do it."

"But that no one else could have, either?"

"Indeed, Partner Elijah. There lies the mystery."

"And what if Dr. Fastolfe - " Baley paused. There would be no point in asking Daneel if Dr. Fastolfe was lying or was somehow mistaken, either in his own judgment that no one but he could have done it or in the statement that he himself had not done it. Daneel had been programmed by Fastolfe and there would be no chance that the programming included the ability to doubt the programmer.

Baley said, therefore, with as close an approach to mildness as he could manage, "I will think about this, Daneel, and we will talk again."

"That is well, Partner Elijah, It is, in any case, time for sleep. Since, it is possible that, on Aurora, the pressure of events may force an irregular schedule upon you, it would be wise to seize the opportunity for sleep now. I will show you how one produces a bed and how one manages the bedclothes."

"Thank you, Daneel," muttered Baley.

He was under no illusion that sleep would come easily. He was being sent to Aurora for the specific purpose of demonstrating that Fastolfe was innocent of roboticide - and success in that was required for Earth's continued security and (much less important but equally dear to Baley's heart) for the continued prospering of Baley's own career - yet, even before reaching Aurora, he had discovered that Fastolfe had virtually confessed to the crime.

Baley did sleep - eventually, after Daneel demonstrated how to reduce the field intensity that served as a form of pseudogravity. This was not true antigravity and it consumed so much energy that the process could only be used at restricted times and under unusual conditions.

Daneel was not programmed to be able to explain the manner in which this worked and, if he had been, Baley was quite certain he would not have understood it. Fortunately, the controls could be operated without any understanding of the scientific justification.

Daneel said, "The field intensity cannot be reduced to zero at least, not by these controls. Sleeping under zero-gravity is not, in any case, comfortable, certainly not for those inexperienced in space travel. What one needs is an intensity low enough to give one a feeling of freedom from the - pressure of one's own weight, but high enough to maintain an up-down orientation. The level varies with the individual. Most people would feel most comfortable at the minimum intensity allowed by the control, but, you might find that, on first use, you would wish a higher intensity, so that you might retain the familiarity of the weight sensation to a somewhat greater extent. Simply experiment with different levels and find the one that suits."

Lost in the novelty of the sensation, Baley found his mind drifting away from the problem of Fastolfe's affirmation/denial, even as his body drifted away from wakefulness. Perhaps the two were one process.

He dreamed he was back on Earth (of course), moving along an Expressway but not in one of the seats. Rather, he was floating along beside the high-speed strip, just over the heads of the moving people, gaining on them slightly. None of the ground-bound people seemed surprised; none looked up at him. It was a rather pleasant sensation and he missed it upon waking.

8

After breakfast the following morning -

Was it morning actually? Could it be morning - or any other time of day - in space?

Clearly, it couldn't. He thought awhile and decided he would define morning as the time after waking, and he would define breakfast as the meal eaten after waking, and abandon specific timekeeping as objectively unimportant. - For him, at least, if not for the ship.

After breakfast, then, the following morning, he studied the news sheets offered him only long enough to see that they said nothing about the roboticide on Aurora and then turned to those book-films that had been brought to him the previous day ("wake period"?) by Giskard.

He chose those whose titles sounded historical and, after viewing through several hastily, he decided that Giskard had brought him books for adolescents. They were heavily illustrated and simply written. He wondered if that was Giskard's estimate of Baley's intelligence - or, perhaps, of his needs. After some thought, Baley decided that Giskard, in his robotic innocence, had chosen well, and that there was no point in brooding over a possible insult.

He settled down to viewing with greater concentration and noted at once that Daneel was viewing the book-film with him. Actual curiosity? Or just to keep his eyes occupied?

Daneel did not once ask to have a page repeated. Nor did he stop to ask a question. Presumably, he merely accepted what he read with robotic trust and did not permit himself the luxury of either doubt or curiosity.

Baley did not ask Daneel any questions concerning what he read, though he did ask for instructions on the operation of the print-out mechanism of the Auroran viewer, with which he was not familiar.

Occasionally, Baley stopped to make use of the small room that adjoined his room and could be used for the various private physiological functions, so private that the room was referred to as "the Personal," with the capital letter always understood, both on Earth and - as Baley discovered when Daneel referred to it - on Aurora. It was just large enough for one person, which made it bewildering to a City-dweller accustomed to huge banks of urinals, excretory seats, washbasins, and showers.

In viewing the book-films, Baley did not attempt to memorize details. He had no intention of becoming an expert on Auroran society, nor even of passing a high school test on the subject. Rather, he wished to get the feel of it.

He noticed, for instance, even through the hagiographic attitude of historians writing for young people, that the Auroran pioneers - the founding fathers, the Earthpeople who had first come to Aurora to settle in the early days of interstellar travel - had been very much Earthpeople. Their politics, their quarrels, every facet of their behavior had been Earthish; what happened on Aurora was, in ways, similar to the events that took place when the relatively empty sections of Earth had been settled a couple of thousand years before.

Of course, the Aurorans had no intelligent life to encounter and to fight, no thinking organisms to puzzle the invaders from Earth with questions of treatment, humane or cruel. There was precious little life of any kind, in fact. So the planet was quickly settled by human beings, by their domesticated plants and animals, and by the parasites and other organisms that were adventitiously brought along. And, of course, the settlers brought robots with them.

The first Aurorans quickly felt the planet to be theirs, since it fell into their laps with no sense of competition, and they had called the planet New Earth to begin with. That was natural, since it was the first extrasolar planet - the first Spacer world - to be settled. It was the first fruit of interstellar travel, the first dawn of an immense new era. They quickly cut the umbilical cord, however, and renamed the planet Aurora after the Roman goddess of the dawn.

It was the World of the Dawn. And so did the settlers from the start self-consciously declare themselves the progenitors of a new kind. All previous history of humanity was a dark Night and only for the Aurorans on this new world was the Day finally approaching.

It was this great fact, this great self-praise, that made itself felt over all the details: all the names, dates, winners, losers. It was the essential.

Other worlds were settled, some from Earth, some from Aurora, but Baley paid no attention to that or to any of the details. He was after the broad brushstrokes and he noted the two massive changes that took place and pushed the Aurorans ever farther away from their Earthly origins. These were, first, the increasing integration of robots into every facet of life and, second, the extension of the life-span.

As the robots grew more advanced and versatile, the Aurorans grew more dependent on them. But never helplessly so. Not like the world of Solaria, Baley remembered, on which a very few human beings were in the collective womb of very many robots. Aurora was not like that.

And yet they grew more dependent.

Viewing as he did for intuitive feel - for trend and generality - every step in the course of human/robot interaction seemed to depend on dependence. Even the manner in which a consensus of robotic rights was reached - the gradual dropping of what Daneel would call "unnecessary distinctions" - was a sign of the dependence. To Baley, it seemed not that the Aurorans were growing more humane in their attitude out of a liking for the humane, but that they were denying the robotic nature of the objects in order to remove the discomfort of having to recognize the fact that human beings were dependent upon objects of artificial intelligence.

As for the extended life-span, that was accompanied by a slowing of the pace of history. The peaks and troughs smoothed out. There was a growing continuity and a growing consensus.

There was no question but that the history he was viewing grew less interesting as it went along; it became almost soporific. For those living through it, this had to be good. History was interesting to the extent that it was catastrophic and, while that might make absorbing viewing, it made horrible living. Undoubtedly, personal lives continued to be interesting for the vast majority of Aurorans and, if the collective interaction of lives grew quiet, who would mind?

If the World of the Dawn had a quiet sunlit Day, who on that world would clamor for storm?

Somewhere in the course of his viewing, Baley felt an indescribable sensation. If he had been forced to attempt a description, he would have said it was that of a momentary inversion. It was as though he had been turned inside out and then back as he had been - in the course of a small fraction of a second.

So momentary had it been that he almost missed it, ignoring it as though it had been a tiny hiccup inside himself.

It was only perhaps a minute later, suddenly going over the feeling in retrospect, that he remembered the sensation as something he had experienced twice before: once when traveling to Solaria and once when returning to Earth from that planet.

It was the "Jump," the passage through hyperspace that, in a timeless, spaceless interval, sent the ship across the parsecs and defeated the speed-of-light limit of the Universe. (No mystery in words, since the ship, merely left the Universe and traversed something which involved no speed limit. Total mystery in concept, however for there was no way of describing what hyperspace was, unless one made use of mathematical symbols which could, in any case, not be translated into anything comprehensible.)

If one accepted the fact that human beings had learned to manipulate hyperspace without understanding the thing they manipulated, then the effect was clear. At one moment, the ship had been within microparsecs of Earth, and at the next moment, it was within microparsecs of Aurora.

Ideally, the Jump took zero time - literally zero - and, if it were carried through with perfect smoothness, there would not, could not be any biological sensation at all. Physicists maintained, however, that perfect smoothness required infinite energy so that there was always an "effective time" that was not quite zero, though it could be made as short as desired. It was that which produced that odd and essentially harmless feeling of inversion.

The sudden realization that he was very far from Earth and very close to Aurora filled Baley with a desire to see the Spacer world.

Partly, it was the desire to see somewhere people lived. Partly, it was a natural curiosity to see something that had been filling his thoughts as a result of the book-films he had been viewing.

Giskard entered just then with the middle meal between waking and sleeping (call it "lunch") and said, "We are approaching Aurora, sir, but it will not be possible for you to observe it from the bridge. There would, in any case, be nothing to see. Aurora's sun is merely a bright star and it will be several days before we are near enough to Aurora itself to see any detail." Then he added, as though in afterthought, "It will not be possible for you to observe it from the bridge at that time, either."

Baley felt strangely abashed. Apparently, it was assumed he would want to observe and that want was simply squashed. His presence as a viewer was not desired.

He said, "Very well, Giskard," and the robot left.

Baley looked after him somberly. How many other constraints would be placed on him? Improbable as successful completion of his task was, he wondered in how many different ways Aurorans would conspire to make it impossible.