agent is doing. The intuitive appeal of the idea of conscious will can be traced in part to the embedding of the experience of will, and of the notion that will has a force, in the larger conception of causal agency. People appear to be goal-seeking agents who have the special ability to envision their goals consciously in advance of action. The experience of conscious will feels like being a causal agent.
Mechanisms and Minds
We all know a lot about agents and goals, desires, and intentions, and use these concepts all the time. These concepts are only useful, however, for understanding a limited range of our experience. The movements of clock hands and raindrops and electric trains, for instance, can be understood in terms of causal relations that have no consciousness or will at all. They are mechanisms. Extending the notion of causal agency to these items—to say these things have the ability to cause themselves to behave—doesn’t fit very well with the physical causal relations we perceive all around us. Imagine a spoon, knife, and fork deciding to go for a walk to the far end of the dinner table (“We’re off to see the salad”), and you can see the problem. Things don’t usually will themselves to move, whereas people seem to do this all the time.
This rudimentary observation suggests that people have at hand two radically different systems of explanation, one for minds and one for everything else. Mentalistic explanation works wonders for understanding minds, but it doesn’t work elsewhere unless we want to start thinking that everything from people to rocks to beer cans to the whole universe actually does what it consciously wants.9 Mechanistic explanation, in turn, is just splendid for understanding rocks and beer cans, and the movements of the planets, but it leaves much wanting in understanding minds.
Each of us is quite comfortable with using these two very different ways of thinking about and explaining events—a physical, mechanical way and a psychological, mental way. In the mechanical explanatory system, people apply intuitive versions of physics to questions of causality, and so they think about causes and effects as events in the world. In the mental explanatory system, people apply implicit psychological theories to questions of causality, focusing on issues of conscious thoughts and the experience of will as they try to explain actions. In the mechanical way of thinking, all the psychological trappings are unnecessary: A physical system such as a clock, for instance, doesn’t have to intend to keep time or to experience doing so. The essence of the mental explanatory system, in contrast, is the occurrence of the relevant thoughts and feelings about the action, and in this system the objects and events of physical causality are not particularly important: A person might experience having willed the death of an enemy and become wracked with guilt, for instance, even though there was no mechanism for this to have happened.
9. This odd possibility is the extreme consequence of attributing minds to things that can’t talk. Chalmers (1996) makes the case for this theory, such as it is.
These two explanatory systems fall into place as children develop ways of understanding both the physical and psychological worlds. The first inklings that mind perception and mechanistic explanation might develop separately in children came from the juxtaposition of two findings by Jean Piaget: Children often neglect intention in making moral judgments, and yet they sometimes overattribute intention to inanimate objects. In the case of moral judgment, Piaget (1932) found that children before the age of seven or eight who are asked to decide whether a person has done something wrong don’t concern themselves very much with the person’s intention and focus instead on the damage caused by the action. For instance, in deciding how bad Haley was when she pushed Kelsey into the creek, a young child (say, aged