of our own group. For example, I am very forgiving where my own actions are concerned. I will forgive myself in a heartbeat—and toss in some compassionate humor in the bargain—for a crime that I would roast anybody else for.
Social psychologists have shown these effects with an interesting twist. When a person is placed under cognitive load (by having to memorize a string of numbers while making a moral evaluation), the individual does not express the usual bias toward self. But when the same evaluation is made absent cognitive load, a strong bias emerges in favor of seeing oneself acting more fairly than another individual doing the identical action. This suggests that built deeply in us is a mechanism that tries to make universally just evaluations, but that after the fact, “higher” faculties paint the matter in our favor. Why might it be advantageous for our psyches to be organized this way? The possession of an unbiased internal observer ought to give benefits in policing our own behavior, since only if we evaluate our own behavior accurately can we decide who is at fault in conflicts with others.
The Illusion of Control
Humans (and many other animals) need predictability and control. Experiments show that administering electrical shocks at random creates much more anxiety (profuse sweating, high heart rate) than regular and predictable punishment. Certainty of risk is easier to bear than uncertainty. Controlling events gives greater certainty. If you can control, to some degree, how often you are shocked, you feel better than if you have less control, even when the shocks come less frequently. Similar effects are well known for other animals, such as rats and pigeons.
But there is also something called an illusion of control, in which we believe we have greater ability to affect outcomes than we actually do. For the stock market, we have no ability to affect its movements by any of our actions, so any notion that we do must be an illusion. This was measured directly on actual stockbrokers. Scientists set up a computer screen with a jagged line moving across it, up and down, more or less like a stock market average, initially with a general bias downward but then recovering into positive territory. A subject sits in front of the screen with a computer mouse and is told that pressing it “may” affect the progress of the line, up or down. In fact, the mouse is not connected to anything. Afterward, people are asked how much they thought they controlled the line’s movement, a measure of their “illusion of control.”
A very interesting finding emerged when those taking the tests were stockbrokers (105 men and 2 women) whose firms provided data both on internal evaluations and on salaries paid. In both cases, those with a higher illusion of control did worse: they were evaluated by their superiors as being less productive and, more important, they earned less money. Cause and effect is not certain, of course. But if the direction of effect were reversed, so that poor performers responded to their own failure by asserting greater control over external events, then they would in effect be blaming themselves more for failure than for success, contrary to the well-documented human bias to rationalize away one’s failures. The alternative scenario seems much more likely: imagining one has greater control over events than one actually has leads to poorer performance, in this case being a worse stockbroker. Note the absence of a social dimension here. One has no control over the movement of markets and scarcely any knowledge of them. There seems little possibility of fooling your superiors along these lines when they can measure your success easily and directly. In other situations, however, it is quite possible that such an illusion gives some social benefits, or even individual ones, such as prompting greater effort toward achieving actual control.
It is interesting to note that lacking control increases something called