Undoing Irrational Thinking

The Undoing Project: A Friendship That Changed Our Minds
by Michael Lewis
W.W. Norton and Company, 2016, $28.95; 368 pages.

As reviewed by Jay P. Greene

Michael Lewis’ The Undoing Project is an excellent new book about the amazing, and eventually problematic, collaboration between Daniel Kahneman and Amos Tversky.

Kahneman and Tversky achieved honor and fame for their research on how people make judgments and decisions and, in particular, on the less-than-rational ways that people’s minds work. But did they really revolutionize social science as much as Lewis suggests?

For one thing, while Kahneman and Tversky are remarkably persuasive in demonstrating how people regularly deviate from rationality, subsequent scholars have not had much success in building new theories based on systematic irrationality. While people may not behave rationally, it remains quite useful to assume rationality when building theories, as long as those theories yield accurate predictions.

Additionally, much of the work that has tried to build on Kahneman and Tversky seems to violate their basic finding that expert judgment is unreliable. The development of behavioral economics and its application to a variety of fields, including education, mostly seems to consist of trying to devise ways to correct the systematic irrationality of others. For instance, if low-income students are accepted to college but do not enroll after failing to complete the FAFSA financial aid form, we assume they are behaving counter to their long-term interests, and we propose interventions to induce them to complete the form and enroll.

As I’ve written elsewhere, this approach has a variety of problems, but chief among them is that it assumes too much rationality on the part of the social scientist devising the solutions. How do we know that people would be better off if we could nudge them into doing something other than what they had originally decided to do? Just as other people may be systematically irrational, so too may the social scientists devising plans for improving other people’s lives. I’m not saying that no interventions are helpful. I’m just saying that we should be extremely cautious and humble when developing plans for how other people should live their lives.

The need for humility among experts and social scientists was a central theme in Kahneman and Tversky’s work. Their approach was not, as one critic charged, a psychology of stupid people; it was a psychology of all people, including experts and social scientists. In fact, one of Kahneman and Tversky’s first experiments was to test statisticians to see if they would behave rationally or not.

They gave statisticians problems to solve and examined whether they properly took new evidence into account. Kahneman and Tversky found that statisticians neglected to update their priors properly when they encountered new evidence, contrary to what Bayes’ Theorem prescribes. In other words, even statisticians, whom you might expect to be particularly familiar with Bayes’ Theorem, do not actually think like Bayesians. In subsequent experiments, Kahneman and Tversky found that even warning subjects of the systematic irrationality to which they might be prone does not prevent them from being systematically irrational. Greater knowledge and expertise do not keep us from falling into the same intellectual potholes over and over again.
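To make the contrast concrete, here is a minimal sketch of what a proper Bayesian update looks like next to the kind of under-adjustment such subjects tend to make. The numbers, the function name, and the "split the difference" rule for the under-updater are invented purely for illustration; they are not drawn from Kahneman and Tversky’s actual experiments.

```python
# Hypothetical illustration: a correct Bayesian update versus a
# "conservative" judge who only moves partway toward the evidence.
# All numbers are invented for illustration only.

def bayes_posterior(prior, likelihood_if_true, likelihood_if_false):
    """Posterior probability of a hypothesis after one piece of evidence,
    via Bayes' Theorem: P(H|E) = P(E|H)P(H) / P(E)."""
    numerator = likelihood_if_true * prior
    denominator = numerator + likelihood_if_false * (1 - prior)
    return numerator / denominator

prior = 0.50                 # initial belief that the hypothesis is true
p_evidence_if_true = 0.80    # how likely the observed evidence is if it is true
p_evidence_if_false = 0.20   # how likely the same evidence is if it is false

posterior = bayes_posterior(prior, p_evidence_if_true, p_evidence_if_false)
print(f"Bayesian posterior:     {posterior:.2f}")   # 0.80

# The under-updater splits the difference between the prior and the
# correct posterior, clinging to the original belief.
under_updated = prior + 0.5 * (posterior - prior)
print(f"Under-updated estimate: {under_updated:.2f}")  # 0.65
```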

So Kahneman and Tversky’s research demonstrates that there is no priestly class immune to the shortcomings of others, and that even foreknowledge and the confession of one’s sins of irrationality provide little protection against repeating common errors. And yet, much of behavioral economics seems to pay little heed to this central finding as practitioners in the field move full steam ahead in devising solutions for other people’s irrationality. They seem to forget that devising solutions, building models, and testing these models all require human judgments, which are also prone to systematic error.

In his seminal volume, Thinking, Fast and Slow, Kahneman admits there is no real solution to our tendency to deviate from rationality. Instead, he suggests some habits to check the errors, mostly involving slowing down and being more cautious and self-critical, as well as inviting the criticism of others. Let’s not correct for popular mistakes by installing a technocratic elite, he implies, because that elite is also prone to common errors.

If Kahneman and Tversky’s insights do not easily lend themselves to the construction of new theories, they are still quite useful in offering bits of wisdom to practitioners of all types. Let me illustrate this with an example of what advice Kahneman and Tversky might have to offer to education foundations.

Foundations have particular challenges in detecting and correcting errors in their own thinking. Because many people want things from foundations, especially money, foundations frequently are organized so as to limit communication with the outside world. Foundations generally don’t open their doors, phone lines, and email inboxes to those who might want to suggest something to them or ask something of them, for fear that the foundation staff will be overwhelmed. So they typically hide behind a series of locked doors, avoid making their phone numbers or email addresses readily available, and insist that all applications follow a specific format, be submitted at a particular time, and be designed to address predetermined issues.

Insulating themselves from external influence is understandable, but it creates real problems for foundations if they ever hope to detect when they are mistaken and need to make changes. The limited communication that does make it through to foundation leaders tends to reaffirm whatever they are already doing. Prospective grantees don’t like to tell foundations that they need to change course because that makes getting funded unlikely. Instead, foundations tend to hear unequivocal positive reinforcement. To make matters worse, many foundations are run in very top-down ways, which discourage questioning and self-criticism.

The Undoing Project presents a very similar situation involving airline pilot error. Lewis describes a case in which a commercial airline was experiencing too many accidents caused by pilot error. The accidents were not fatal, but they were costly and dangerous — things like planes landing at the wrong airport. So the airline approached Amos Tversky and asked for help in improving its training so as to minimize these pilot errors. It wanted Tversky to design a pilot-training method that would make sure pilots had the information and skills to avoid errors.

Tversky told the airline that they were pursuing the wrong goal. Pilots are going to make mistakes, and no amount of information or training can stop them from committing those errors. We often assume that our errors are always caused by ignorance, but the deeper problem is that once we have a mental model of the world, we tend to downplay or ignore information that is inconsistent with our model and instead bolster facts that support it. If a pilot thinks he is landing at the right airport, he distorts available information to confirm he is landing in Fort Lauderdale rather than in nearby Palm Beach, even if he is incorrect. The problem is not a lack of information, but our tendency to fit information into preconceived beliefs.

Tversky’s advice was to change the cockpit culture to facilitate questioning and self-criticism. At the time, cockpits were very hierarchical, based on the belief that copilots needed to implement pilot orders quickly and without question, lest the delay and doubt promote indecision and disorder. So the airline implemented Tversky’s suggestions and changed its training to encourage copilots to doubt and question and pilots to be more receptive to that criticism. The result was a marked decline in accidents caused by pilot error. Apparently, we aren’t very good at detecting our own errors, but we are more likely to do so if others are encouraged to point them out.

So what might Tversky suggest to education foundations? I think he’d recognize that they have exceptional difficulty in detecting their own errors and need intentional, institutional arrangements to address that problem. In particular, he might suggest that they divide their staff into a Team A and Team B. Each team would work on a different theory of change — theories that are not necessarily at odds with each other but are also not identical. For example, one team might focus on promoting school choice and another on promoting test-based accountability. Or one team might promote tax credits and the other education savings accounts (ESAs). Dividing staff into somewhat competing teams would give them incentives to point out shortcomings in the other team’s approach, which could be a useful check on the all-too-common problem of groupthink.

Another potential solution is to hire two or three internal devil’s advocates who would question the assumptions and evidence believed by foundation staff. To protect those devil’s advocates, it would probably be best to have them report directly to the board rather than to the people they would be questioning.

Whatever the particular arrangements, the point is that education foundations should strive to promote an internal culture of doubt and self-criticism if they wish to catch and correct their own errors and avoid groupthink. One foundation that I think has taken steps in this direction is the Arnold Foundation. It actually holds internal seminars at which outside speakers are invited to offer critiques of its work. Neerav Kingsland, who heads education efforts at Arnold, is also notably accessible on blogs and Twitter for critical discussion. I don’t always agree with Neerav, but I am impressed by his openness to dissent.

The collaboration between Kahneman and Tversky was itself an example of the importance of tough criticism within a joint effort. Like the airline pilots, they developed habits of challenging each other, which made their work together better than anything either could have produced alone.

Jay P. Greene is distinguished professor of education reform at the University of Arkansas.
