Tuesday, October 28, 2008

The Behavioral Revolution
By DAVID BROOKS, Op-Ed Columnist
October 28, 2008

Roughly speaking, there are four steps to every decision. First, you perceive a situation. Then you think of possible courses of action. Then you calculate which course is in your best interest. Then you take the action.

Over the past few centuries, public policy analysts have assumed that step three is the most important. Economic models and entire social science disciplines are premised on the assumption that people are mostly engaged in rationally calculating and maximizing their self-interest.

But during this financial crisis, that way of thinking has failed spectacularly. As Alan Greenspan noted in his Congressional testimony last week, he was "shocked" that markets did not work as anticipated. "I made a mistake in presuming that the self-interests of organizations, specifically banks and others, were such that they were best capable of protecting their own shareholders and their equity in the firms."

So perhaps this will be the moment when we alter our view of decision-making. Perhaps this will be the moment when we shift our focus from step three, rational calculation, to step one, perception.

Perceiving a situation seems, at first glimpse, like a remarkably simple operation. You just look and see what's around. But the operation that seems most simple is actually the most complex; it's just that most of the action takes place below the level of awareness. Looking at and perceiving the world is an active process of meaning-making that shapes and biases the rest of the decision-making chain.

Economists and psychologists have been exploring our perceptual biases for four decades now, beginning with the work of Amos Tversky and Daniel Kahneman and continuing with work by people like Richard Thaler, Robert Shiller, John Bargh and Dan Ariely.

My sense is that this financial crisis is going to amount to a coming-out party for behavioral economists and others who are bringing sophisticated psychology to the realm of public policy. At least these folks have plausible explanations for why so many people could have been so gigantically wrong about the risks they were taking.

Nassim Nicholas Taleb has been deeply influenced by this stream of research. Taleb not only has an explanation for what's happening, he saw it coming. His popular books "Fooled by Randomness" and "The Black Swan" were broadsides at the risk-management models used in the financial world and beyond.

In "The Black Swan," Taleb wrote, "The government-sponsored institution Fannie Mae, when I look at its risks, seems to be sitting on a barrel of dynamite, vulnerable to the slightest hiccup." Globalization, he noted, "creates interlocking fragility." He warned that while the growth of giant banks gives the appearance of stability, in reality, it raises the risk of a systemic collapse — "when one fails, they all fail."

Taleb believes that our brains evolved to suit a world much simpler than the one we now face. His writing is idiosyncratic, but he does touch on many of the perceptual biases that distort our thinking: our tendency to see data that confirm our prejudices more vividly than data that contradict them; our tendency to overvalue recent events when anticipating future possibilities; our tendency to spin concurring facts into a single causal narrative; our tendency to applaud our own supposed skill in circumstances when we've actually benefited from dumb luck.

And looking at the financial crisis, it is easy to see dozens of errors of perception. Traders misperceived the possibility of rare events. They got caught in social contagions and reinforced each other's risk assessments. They failed to perceive how tightly linked global networks can transform small events into big disasters.

Taleb is characteristically vituperative about the quantitative risk models, which try to model something that defies modelization. He subscribes to what he calls the tragic vision of humankind, which "believes in the existence of inherent limitations and flaws in the way we think and act and requires an acknowledgement of this fact as a basis for any individual and collective action." If recent events don't underline this worldview, nothing will.

If you start thinking about our faulty perceptions, the first thing you realize is that markets are not perfectly efficient, people are not always good guardians of their own self-interest and there might be limited circumstances when government could usefully slant the decision-making architecture (see "Nudge" by Thaler and Cass Sunstein for proposals). But the second thing you realize is that government officials are probably going to be even worse perceivers of reality than private business types. Their information feedback mechanism is more limited, and, being deeply politicized, they're even more likely to filter inconvenient facts.

This meltdown is not just a financial event, but also a cultural one. It's a big, whopping reminder that the human mind is continually trying to perceive things that aren't true, and not perceiving them takes enormous effort.

http://www.nytimes.com/2008/10/28/opinion/28brooks.html?sq=Behavioral%20Revolution&st=cse&scp=1&pagewanted=print

http://snipurl.com/7sioj
