Many of the conclusions presented in this book may already have occurred to you, for social psychological phenomena are all around you. We constantly observe people thinking about, influencing, and relating to one another. It pays to discern what a facial expression predicts, how to get someone to do something, or whether to regard another as friend or foe. For centuries, philosophers, novelists, and poets have observed and commented on social behavior. Does this mean that social psychology is just common sense in fancy words? Social psychology faces two contradictory criticisms: first, that it is trivial because it documents the obvious; second, that it is dangerous because its findings could be used to manipulate people. Here, let’s examine the first objection.

Do social psychology and the other social sciences simply formalize what any amateur already knows intuitively? Writer Cullen Murphy (1990) took that view: “Day after day social scientists go out into the world. Day after day they discover that people’s behavior is pretty much what you’d expect.” Nearly a half-century earlier, historian Arthur Schlesinger, Jr. (1949), reacted with similar scorn to social scientists’ studies of American World War II soldiers. Sociologist Paul Lazarsfeld (1949) reviewed those studies and offered a sample with interpretive comments, a few of which I paraphrase:   

1. Better-educated soldiers suffered more adjustment problems than did less educated soldiers. (Intellectuals were less prepared for battle stresses than were street-smart people.)

2. Southern soldiers coped better with the hot South Sea Island climate than did Northern soldiers. (Southerners are more accustomed to hot weather.)

3. White privates were more eager for promotion than were Black privates. (Years of oppression take a toll on achievement motivation.)

4. Southern Blacks preferred Southern to Northern White officers. (Southern officers were more experienced and skilled in interacting with Blacks.)

As you read those findings, did you agree that they were basically common sense? If so, you may be surprised to learn that Lazarsfeld went on to say, “Every one of these statements is the direct opposite of what was actually found.” In reality, the studies found that less-educated soldiers adapted more poorly; Southerners were not more likely than Northerners to adjust to a tropical climate; Blacks were more eager than Whites for promotion; and so forth. “If we had mentioned the actual results of the investigation first [as Schlesinger experienced], the reader would have labeled these ‘obvious’ also.” One problem with common sense is that we invoke it after we know the facts. Events are far more “obvious” and predictable in hindsight than beforehand.

Experiments reveal that when people learn the outcome of an experiment, that outcome suddenly seems unsurprising—much less surprising than it is to people who are simply told about the experimental procedure and the possible outcomes (Slovic & Fischhoff, 1977). Likewise, in everyday life we often do not expect something to happen until it does. Then we suddenly see clearly the forces that brought the event about and feel unsurprised. Moreover, we may also misremember our earlier view (Blank & others, 2008; Nestler & others, 2010). Errors in judging the future’s foreseeability and in remembering our past combine to create hindsight bias (also called the I-knew-it-all-along phenomenon). Thus, after elections or stock market shifts, most commentators find the turn of events unsurprising: “The market was due for a correction.” After the 2010 Gulf oil disaster, it seemed obvious in hindsight that BP employees had taken some shortcuts and ignored warnings, and that government oversight was lax.

As the Danish philosopher–theologian Søren Kierkegaard put it, “Life is lived forwards, but understood backwards.” If hindsight bias is pervasive, you may now be feeling that you already knew about this phenomenon. Indeed, almost any conceivable result of a psychological experiment can seem like common sense—after you know the result. You can demonstrate the phenomenon yourself. Take a group of people and tell half of them one psychological finding and the other half the opposite result. For example, tell half as follows:

Social psychologists have found that, whether choosing friends or falling in love, we are most attracted to people whose traits are different from our own. There seems to be wisdom in the old saying “Opposites attract.”

Tell the other half:

Social psychologists have found that, whether choosing friends or falling in love, we are most attracted to people whose traits are similar to our own. There seems to be wisdom in the old saying “Birds of a feather flock together.”

Ask the people first to explain the result. Then ask them to say whether it is “surprising” or “not surprising.” Virtually all will find a good explanation for whichever result they were given and will say it is “not surprising.” Indeed, we can draw on our stockpile of proverbs to make almost any result seem to make sense. If a social psychologist reports that separation intensifies romantic attraction, John Q. Public responds, “You get paid for this? Everybody knows that ‘absence makes the heart grow fonder.’” Should it turn out that separation weakens attraction, John will say, “My grandmother could have told you, ‘Out of sight, out of mind.’”

Karl Teigen (1986) must have had a few chuckles when he asked University of Leicester (England) students to evaluate actual proverbs and their opposites. When given the proverb “Fear is stronger than love,” most rated it as true. But so did students who were given its reversed form, “Love is stronger than fear.” Likewise, the genuine proverb “He that is fallen cannot help him who is down” was rated highly; but so too was “He that is fallen can help him who is down.” My favorites, however, were two highly rated proverbs: “Wise men make proverbs and fools repeat them” (authentic) and its made-up counterpart, “Fools make proverbs and wise men repeat them.” For more dueling proverbs, see “Focus On: I Knew It All Along.”

The hindsight bias creates a problem for many psychology students. Sometimes results are genuinely surprising (for example, that Olympic bronze medalists take more joy in their achievement than do silver medalists). More often, when you read the results of experiments in your textbooks, the material seems easy, even obvious. When you later take a multiple-choice test on which you must choose among several plausible conclusions, the task may become surprisingly difficult. “I don’t know what happened,” the befuddled student later moans. “I thought I knew the material.”

The I-knew-it-all-along phenomenon can have unfortunate consequences. It is conducive to arrogance—an overestimation of our own intellectual powers. Moreover, because outcomes seem as if they should have been foreseeable, we are more likely to blame decision makers for what are in retrospect “obvious” bad choices than to praise them for good choices, which also seem “obvious.”

Starting after the morning of 9/11 and working backward, signals pointing to the impending disaster seemed obvious. A U.S. Senate investigative report listed the missed or misinterpreted clues (Gladwell, 2003): The CIA knew that al Qaeda operatives had entered the country. An FBI agent sent a memo to headquarters that began by warning “the Bureau and New York of the possibility of a coordinated effort by Osama bin Laden to send students to the United States to attend civilian aviation universities and colleges.” The FBI ignored that accurate warning and failed to relate it to other reports that terrorists were planning to use planes as weapons. The president received a daily briefing titled “Bin Laden Determined to Strike Inside the United States” and stayed on holiday. “The dumb fools!” it seemed to hindsight critics. “Why couldn’t they connect the dots?”

But what seems clear in hindsight is seldom clear on the front side of history. The intelligence community is overwhelmed with “noise”—piles of useless information surrounding the rare shreds of useful information.

Analysts must therefore be selective in deciding which leads to pursue, and only when a lead is pursued does it stand a chance of being connected to another lead. In the 6 years before 9/11, the FBI’s counterterrorism unit could never have pursued all 68,000 uninvestigated leads. In hindsight, the few useful ones are now obvious. In the aftermath of the 2008 world financial crisis, it seemed obvious that government regulators should have placed safeguards against the ill-fated bank lending practices. But what was obvious in hindsight was unforeseen by the chief American regulator, Alan Greenspan, who found himself “in a state of shocked disbelief” at the economic collapse.

We sometimes blame ourselves for “stupid mistakes”—perhaps for not having handled a person or a situation better. Looking back, we see how we should have handled it. “I should have known how busy I would be at the semester’s end and started that paper earlier.” But sometimes we are too hard on ourselves.

We forget that what is obvious to us now was not nearly so obvious at the time. Physicians who are told both a patient’s symptoms and the cause of death (as determined by autopsy) sometimes wonder how an incorrect diagnosis could have been made. Other physicians, given only the symptoms, do not find the diagnosis nearly so obvious (Dawson & others, 1988). Would juries be slower to assume malpractice if they were forced to take a foresight rather than a hindsight perspective?

What do we conclude—that common sense is usually wrong? Sometimes it is. At other times, conventional wisdom is right—or it falls on both sides of an issue: Does happiness come from knowing the truth, or from preserving illusions? From being with others, or from living in peaceful solitude? Opinions are a dime a dozen. No matter what we find, there will be someone who foresaw it. (Mark Twain jested that Adam was the only person who, when saying a good thing, knew that nobody had said it before.) But which of the many competing ideas best fit reality? Research can specify the circumstances under which a commonsense truism is valid.

The point is not that common sense is predictably wrong. Rather, common sense usually is right—after the fact. We therefore easily deceive ourselves into thinking that we know and knew more than we do and did. And that is precisely why we need science to help us sift reality from illusion and genuine predictions from easy hindsight.
