understand how they happened so he could improve them. Since much of the Intellect code was in the form of an association table, which was written by the machine itself as part of its day-to-day operation, this was never an easy task. Lawrence would pick a table entry and ask his computer what it meant. If Lawrence had been a neurosurgeon, it would have been very similar to stimulating a single neuron with an electrical current and asking the patient what memory or sensation it brought to mind.
The next interviewer was a reporter who quizzed the Intellect on various matters of trivia. She seemed to be leading up to something, though. "What will happen if the world's birth rate isn't checked?" she suddenly asked, after having it recite a string of population figures.
"There are various theories. Some people think technology will advance rapidly enough to service the increasing population; one might say in tandem with it. Others believe the population will be stable until a critical mass is reached, when it will collapse."
"What do you think?"
"The historical record seems to show a pattern of small collapses; rather than civilization falling apart, the death rate increases locally through war, social unrest, or famine, until the aggregate growth curve flattens out."
"So the growth continues at a slower rate."
"Yes, with a lower standard of living.
"And where do you fit into this?"
"I'm not sure what you mean. Machines like myself will exist in the background, but we do not compete with humans for the same resources."
"You use energy. What would happen if you did compete with us?"
Intellect 39 was silent for a moment. "It is not possible for Intellect series computers to do anything harmful to humans. Are you familiar with the 'Three Laws of Robotics'?"
"I've heard of them."
"They were first stated in the 1930's by a science writer named Isaac Asimov. The First Law is, 'No robot may harm a human being, or through inaction allow a human being to come to harm.'" Computers are not of course as perfect as some humans think we are, but within the limits of our capabilities, it is impossible for us to contradict this directive. I could no more knowingly harm a human than you could decide to change yourself into a horse."
Well-chosen simile, Lawrence thought.
"So you'd curl up and die before you'd hurt a fly," the woman declared sarcastically.
"Not a fly, but certainly I'd accept destruction if that would save the life of a human. The second law requires me to obey humans, unless I am told to harm another human. The third requires me to keep myself ready for action and protect my existence, unless this conflicts with the other two laws."
"Suppose a human told you to turn yourself off?"
"I'd have to do it. However, the human would have to have the authority to give me that order. The wishes of my owner would take precedence over, for example, yours."
"O-oh, so all humans aren't equal under the Second Law. What about the First? Are some humans more equal than others there, too?"
Prime Intellect was silent for several seconds. This was a very challenging question for it, a hypothetical situation involving the Three Laws. For a moment Lawrence was afraid the system had locked up. Then it spoke. "All humans are equally protected by the First Law," it declared. "In a situation where two humans were in danger and I could only help one of them, I would have to choose the human likely to benefit most from my help." Lawrence felt a surge of pride, because that was the answer he wanted to hear. And he had never explicitly explained it to any of his Intellects; Intellect 39 had reasoned the question out for itself.
"So if Dr. Lawrence were drowning half a mile offshore, and a convicted murderer were drowning a quarter-mile from shore, you'd save the murderer because you would be more likely to succeed?"
This time Intellect 39 didn't hesitate. "Yes," it said.
"There are a lot of actual humans who would disagree with that decision."
"The