Psychology: Trust Your Gut

When Benjamin Franklin's friend Joseph Priestley found himself stumped by a complex life decision, he wrote to Franklin for advice. In his 1772 letter of reply, Franklin described his own method for reasoning out complex problems, which he called "moral algebra." Divide a sheet of paper in half, he counseled, and make an exhaustive list of pros and cons. Then, over a couple of days, weigh the pros and cons, and when a pro and a con seem of equal weight, strike them both out. What is left in the balance is the best answer.
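Franklin's procedure is mechanical enough to sketch in a few lines of code. This is only an illustration: the items, the numeric weights, and the cancellation tolerance are all invented, not from Franklin's letter.

```python
# A minimal sketch of Franklin's "moral algebra": list pros and cons
# with rough weights, cancel roughly equal pairs, and see what remains.
def moral_algebra(pros, cons, tolerance=0.0):
    """pros and cons are lists of (reason, weight) pairs."""
    pros, cons = list(pros), list(cons)
    for p in pros[:]:
        for c in cons[:]:
            if abs(p[1] - c[1]) <= tolerance:  # roughly equal weights cancel
                pros.remove(p)
                cons.remove(c)
                break
    balance = sum(w for _, w in pros) - sum(w for _, w in cons)
    return ("yes" if balance > 0 else "no"), pros, cons

# Hypothetical job-offer decision:
decision, pros_left, cons_left = moral_algebra(
    [("better salary", 3), ("new city", 2)],
    [("leave friends", 2), ("longer hours", 1)],
)
```

Here "new city" and "leave friends" cancel each other out, and the surviving pro outweighs the surviving con, so the sketch answers "yes."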

Such "balance sheet" calculation is still taught today as the most logical and systematic method for dealing with many of life's complexities. Kids are counseled to choose colleges and careers this way, and managers similarly deliberate the pros and cons in important business decisions; some people are even methodical in matters of the heart.

But is moral algebra really the best method for decision making in today's dizzyingly complicated world? Or is there virtue in simplicity for many life choices? A growing number of psychologists are questioning the soundness of Franklin's method, and its modern iterations, including data-heavy calculations by increasingly powerful computers.

One of the leading challengers to the dogma of decision making is psychologist Gerd Gigerenzer, of the Max Planck Institute for Human Development in Berlin, whose new book "Gut Feelings" collects a convincing body of evidence for the power of hunches over laborious data crunching. Hunches, gut feelings, intuition—these are all colloquial English for what Gigerenzer and his colleagues call "heuristics": fast and efficient cognitive shortcuts that, according to the emerging theory, can help us negotiate life, if we let them.

Consider the "take the best" heuristic. "Take the best" means that you reason and calculate only as much as you absolutely have to; then you stop and do something else. So, for example, if there are 10 pieces of information that you might weigh in a thorough decision, but one piece of information is clearly more important than the others, then that one piece of information is often enough to make a choice. You don't need the rest; other details just complicate things and waste time.
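In code, "take the best" amounts to comparing options on the most important cue and ignoring everything else. The sketch below is a loose illustration, not Gigerenzer's formal model; the doctor example and its cue values are invented to echo the scenario discussed next.

```python
# "Take the best" sketch: rank cues from most to least important,
# decide on the first cue that discriminates, and stop there.
def take_the_best(option_a, option_b, cues_by_validity):
    """Options are dicts mapping cue -> value; higher is better.
    cues_by_validity is ordered from most to least important."""
    for cue in cues_by_validity:
        if option_a[cue] != option_b[cue]:  # first discriminating cue decides
            return option_a if option_a[cue] > option_b[cue] else option_b
    return option_a  # no cue discriminates; pick arbitrarily

# Hypothetical example: choosing a doctor on "listens well" alone.
home_doctor = {"listens_well": 0, "arrives_soon": 1}
clinic = {"listens_well": 1, "arrives_soon": 0}
best = take_the_best(clinic, home_doctor, ["listens_well", "arrives_soon"])
```

Because "listens_well" already discriminates, the second cue is never consulted: that is the whole point of the heuristic.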

Gigerenzer has demonstrated this in the laboratory. He asked a large number of parents to consider a scenario in which their child wakes up after midnight short of breath, wheezing and coughing. They are told that a doctor could make a home visit in 20 minutes; it's a physician they know but don't like all that much, because he never listens to their view. Alternatively, they could take their child to a clinic 60 minutes away; the doctors there are unknown, but good listeners by reputation. Which to choose?

There are actually four pieces of information in play here: 20 minutes vs. 60 minutes, home visit vs. driving to the clinic, familiar vs. unfamiliar doctor, and good vs. bad listener. Some parents in Gigerenzer's experiment did weigh all four pieces of information, but almost half did not. Instead they made this very important decision based on a single factor, and for the vast majority that factor was whether or not the physician was a good listener—even if it meant waiting 40 minutes longer for treatment. Far fewer based their decision on waiting time alone, and hardly anyone cared about the home visit.

Gigerenzer calls such decision making "satisficing" (a term coined by economist Herbert Simon), as in "satisfying" enough to "suffice." Satisficers don't feel the need to know everything, in contrast to "maximizers," who want to weigh every detail imaginable in making even minor life decisions. Interestingly, studies have found that satisficers are more optimistic about life, have higher self-esteem, and are generally happier than maximizers.
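Satisficing can be sketched as a stopping rule: take the first option that clears an aspiration level, instead of scanning everything for the maximum. The apartment list, scores, and threshold below are made up for illustration.

```python
# Satisficing sketch: stop at the first "good enough" option,
# rather than examining all options to find the single best one.
def satisfice(options, score, threshold):
    """Return the first option whose score meets the aspiration level."""
    for option in options:
        if score(option) >= threshold:
            return option  # good enough; stop searching
    return None            # nothing sufficed

# Hypothetical apartment hunt, scored on a 1-10 scale:
apartments = [("A", 5), ("B", 7), ("C", 9)]
choice = satisfice(apartments, score=lambda a: a[1], threshold=7)
```

A maximizer would keep looking and find apartment C; the satisficer stops at B, having spent less time and effort for a result that meets the aspiration level.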

Gigerenzer has had a hard time convincing other cognitive scientists of the power and accuracy of heuristics. Nobody quite believes that you can make sounder decisions with less information and less time, which is what the heuristics research claims to show. To prove his point, he has gone head-to-head with powerful computers, which can crunch vast amounts of information in the manner of Franklin's moral algebra. Consider another experiment involving parents: this time they must choose a Chicago high school for their children, and they want the one with the lowest dropout rate. But that information is unavailable, so how does one decide?

Well, there is a lot of other information available, including SAT scores, attendance rates, writing scores, and more—18 pieces of information in all. Gigerenzer had a computer do what's called "multiple regression" analysis, which is just modern jargon for Franklin's moral algebra. It estimated the importance of all 18 pieces of available information and did a complex calculation to predict the dropout rate for each school. Gigerenzer also had a computer choose a school using the "take the best" strategy. In this case, it looked first at attendance, but there was no significant difference in the schools, so it moved on to a second piece of information, writing scores. Based only on these two pieces of information, the "take the best" method was more accurate than the complex and time-consuming analysis in determining the actual dropout rates of Chicago schools—and much faster.
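The fallthrough described above, where a cue that shows no significant difference is skipped in favor of the next one, can be mimicked with a tie-aware version of take-the-best. The schools, cue values, and "significant difference" cutoff below are invented; this is a sketch of the idea, not a reconstruction of Gigerenzer's actual analysis.

```python
# Take-the-best with ties: differences smaller than min_diff count as
# "no significant difference", so the next cue is consulted instead
# (as with attendance rates vs. writing scores in the school study).
def take_the_best_with_ties(a, b, cues, min_diff):
    """a and b are dicts of cue -> value; higher is better."""
    for cue in cues:
        if abs(a[cue] - b[cue]) >= min_diff:
            return a if a[cue] > b[cue] else b
    return None  # no cue discriminates

# Hypothetical schools: attendance rates are nearly tied, so the
# decision falls through to writing scores.
school_x = {"attendance": 0.91, "writing": 62}
school_y = {"attendance": 0.90, "writing": 71}
pick = take_the_best_with_ties(
    school_x, school_y, ["attendance", "writing"], min_diff=0.05
)
```

Only two of the available cues are ever examined, which is why the method is so much faster than a regression over all eighteen.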

Gigerenzer and his colleagues have run similar head-to-head tests on dozens of real-world problems, in fields as diverse as economics and biology and health care. In every case, one good reason has proven superior to data-greedy mathematical equations in making the best choices. Psychologists now believe that these cognitive shortcuts evolved over eons in the brain's neurons, probably because exhaustive and complex calculation was so often impractical for our early ancestors, who were always only one step ahead of their predators. Today we're one step ahead of an information tsunami, so it's comforting to know that the quick and dirty choices we're forced to make on the fly are grounded in some ancient intelligence.

Wray Herbert writes the "We're Only Human…" column at