213 – The environmental planning fallacy
Psychologists studying the way people plan projects have found that they are often far too optimistic: they expect a project to take less time, cost less, achieve greater change, or deliver change that is worth more than is realistic based on experience with other similar projects. We have observed exactly these tendencies in environmental planning, and seen that they create serious (but often unrecognised) problems for managers wishing to prioritise environmental projects.
Psychologists Daniel Kahneman and Amos Tversky made very successful careers out of studying the biases that commonly occur in people’s thinking, culminating in Kahneman being awarded the Nobel Prize for Economics in 2002 (Tversky would surely have shared the award if he had still been alive). Kahneman’s recent book for a general audience, ‘Thinking, Fast and Slow’ (Kahneman, 2011), includes very enlightening and entertaining information about the various biases.
The bias towards optimistic planning they called the ‘planning fallacy’. In his book, Kahneman provided a variety of striking examples. Here’s one that’s not unusual:
‘A 2005 study examined rail projects undertaken worldwide between 1969 and 1998. In more than 90% of the cases, the number of passengers projected to use the system was overestimated. Even though these passenger shortfalls were widely publicized, forecasts did not improve over those 30 years; on average planners overestimated how many people would use the new rail projects by 106%, and the average cost overrun was 45%’ (Kahneman, 2011).
Although we know these things occur, the reasons why they occur are less clear-cut. Suggested explanations include:
- a focus on the most optimistic scenario, rather than on the planner’s full experience;
- simple wishful thinking;
- biased interpretation of poor past results;
- underestimation or neglect of the variety of risks that could affect the project.
When people plan environmental projects, exactly the same sorts of things occur. In reviewing existing environmental plans, and helping people to develop new ones, my collaborators in the INFFER project and I (Pannell et al., 2012) have observed the phenomenon so often that we’ve developed our own term for it: the ‘culture of hope’. Unfortunately, overly optimistic claims about projects tend not to be detected by funding programs when they evaluate project proposals (e.g., Pannell and Roberts, 2010).
To illustrate the problem: people who develop an environmental project are frequently over-optimistic in their perceptions of the following variables.
- The value of the environmental assets that the project will protect or enhance
- The level of cooperation and behaviour change by landholders
- The various risks that might cause the project to fail (some of which tend to be ignored completely, not just understated)
- The cost of the project in the short term
- The longer-term costs, beyond the initial project (also commonly ignored)
With the combined effects of these biases and omissions, it’s common for the assumptions in the plan for an environmental project to make it look dramatically better than it really is. I reckon the implied benefit-cost ratio could be exaggerated by a factor of 10 or more in many cases we’ve seen. So the likelihood that decision making will be messed up, with adverse consequences for the environment, is very high.
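To see how quickly these biases compound, here is a rough back-of-the-envelope sketch in Python. Every number is an illustrative assumption of mine, not INFFER data, but it shows how several individually modest biases multiply into an order-of-magnitude exaggeration of the benefit-cost ratio.

```python
# Back-of-the-envelope sketch: how modest optimistic biases compound
# into a large exaggeration of a project's benefit-cost ratio (BCR).
# All numbers are illustrative assumptions, not INFFER data.

def bcr(value, adoption, success_prob, cost, maintenance):
    """Benefit-cost ratio: expected benefits over total costs."""
    return (value * adoption * success_prob) / (cost + maintenance)

# Realistic parameter values for a hypothetical project (normalised)
true_bcr = bcr(value=1.0,          # value of the environmental asset
               adoption=0.4,       # fraction of landholders who change behaviour
               success_prob=0.5,   # probability the project survives its risks
               cost=1.0,           # short-term project cost
               maintenance=0.5)    # longer-term costs beyond the project

# The same project as it might appear in an optimistic plan
plan_bcr = bcr(value=1.5,          # asset value overestimated by 50%
               adoption=0.8,       # adoption overestimated (2x)
               success_prob=1.0,   # risks ignored completely
               cost=0.8,           # short-term cost underestimated by 20%
               maintenance=0.0)    # long-term costs ignored completely

print(f"Realistic BCR:        {true_bcr:.2f}")  # ~0.13
print(f"Planned (biased) BCR: {plan_bcr:.2f}")  # 1.50
print(f"Exaggeration factor: {plan_bcr / true_bcr:.1f}x")  # ~11x
```

Because the biases enter multiplicatively, none of them needs to be extreme for the combined exaggeration to exceed a factor of 10.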
Although the biases might sometimes result from people trying to game the system to get support for their pet project, that is certainly not the only cause. The psychologists have shown that optimistic biases occur even when people are not trying to distort things. It’s human nature.
From an economist’s perspective, this matters because it is not possible to judge whether an environmental project is worth investing in if the assumptions made about it are inaccurate. From an environmentalist’s perspective, there is a serious risk that funders will end up supporting those projects that have been exaggerated the most, rather than the projects that will really deliver the most valuable environmental outcomes.
Having observed this phenomenon from the earliest days of developing INFFER (before we had even heard of the ‘planning fallacy’), we developed several strategies to try to counter it.
Explicit questions about negative factors that tend to be ignored. When people are completing our Project Assessment Form, we require responses to questions about a range of specific risks, and about long-term funding, so that these factors can be built into the project assessment.
Logical consistency checks. We ask those completing the form to check whether their answers to some later questions are logically consistent with their answers to specific earlier questions. This helps to flush out some biased responses.
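To make the idea concrete, here is a loose sketch in Python of how a check of this kind could be automated. The field names and thresholds are hypothetical, invented for illustration; they are not the actual questions or rules of the INFFER Project Assessment Form.

```python
# Hypothetical sketch of an automated consistency check of the kind
# described above. Field names and thresholds are invented for
# illustration; they do not come from the INFFER Project Assessment Form.

def check_consistency(answers: dict) -> list:
    """Flag answers to later questions that contradict earlier ones."""
    warnings = []

    # If the plan assumes widespread behaviour change by landholders,
    # the budget should include a plausible amount per landholder.
    if answers["expected_adoption_rate"] > 0.7:
        per_landholder = answers["engagement_budget"] / answers["num_landholders"]
        if per_landholder < 1000:
            warnings.append(
                "High adoption assumed, but the engagement budget is only "
                f"${per_landholder:.0f} per landholder."
            )

    # A project claiming long-lived benefits should identify a source
    # of funding for maintenance beyond the initial project.
    if answers["benefit_duration_years"] > 5 and not answers["maintenance_funding_identified"]:
        warnings.append(
            "Benefits claimed beyond 5 years, but no long-term "
            "maintenance funding is identified."
        )

    return warnings

# Example: an optimistic plan that trips both checks
plan = {
    "expected_adoption_rate": 0.9,
    "engagement_budget": 50_000,
    "num_landholders": 200,
    "benefit_duration_years": 20,
    "maintenance_funding_identified": False,
}
for warning in check_consistency(plan):
    print("WARNING:", warning)
```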
Review of assumptions by independent experts. Kahneman and Tversky found that we are much more realistic when judging other people’s projects than when judging our own. Our system includes a facility for independent reviewers to comment on particular assumptions, and we use this extensively when supporting people to develop projects.
Feasibility assessment. For large projects, we recommend that a feasibility assessment be included as the first phase of project funding, with further funding depending on its results. This phase should involve collecting additional information about the aspects of the project that were most uncertain in the project-assessment phase, followed by revision of the original assessment.
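A toy calculation illustrates why this staged approach can pay off. For simplicity it assumes the feasibility phase perfectly reveals whether the project will succeed, whereas a real feasibility study only reduces uncertainty; all the numbers are invented for illustration.

```python
# Toy value-of-information calculation: why staged funding with a
# feasibility phase can beat committing the full budget upfront.
# All numbers are illustrative assumptions.

full_cost = 10.0        # cost of the full project
benefit = 30.0          # benefit if the project succeeds
p_success = 0.3         # realistic probability of success, unknown upfront
feasibility_cost = 1.0  # cost of the feasibility phase

# Option A: fund the full project upfront, success or not.
ev_upfront = p_success * benefit - full_cost          # 0.3*30 - 10 = -1.0

# Option B: fund a feasibility phase first. Assume (optimistically,
# for illustration) that it reveals perfectly whether the project
# will succeed, so the full cost is spent only in the good cases.
ev_staged = p_success * (benefit - full_cost) - feasibility_cost   # 5.0

print(f"Expected value, funded upfront: {ev_upfront:.1f}")
print(f"Expected value, staged:         {ev_staged:.1f}")
```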
Even with all of these measures, I don’t think we eliminate the planning fallacy problem, but I do think we reduce its impacts quite a bit.
It’s interesting to observe that most environmental managers are quite oblivious to the problem. Some of them view our counter-measures as unnecessary inconveniences. These measures make our process more involved than the simpler processes many managers are used to, so there is some resistance to using it.
Despite this, our experience shows that it’s crucial to include these measures to counter the planning fallacy. Without them, the project plans developed are of little value to decision makers who genuinely wish to support projects that will deliver the best environmental outcomes with the available resources.
Further reading
Kahneman, D. (2011). Thinking, Fast and Slow, Farrar, Straus and Giroux, New York.
Pannell, D.J. and Roberts, A.M. (2010). The National Action Plan for Salinity and Water Quality: A retrospective assessment, Australian Journal of Agricultural and Resource Economics 54(4): 437-456.
Pannell, D.J., Roberts, A.M., Park, G., Alexander, J., Curatolo, A. and Marsh, S. (2012). Integrated assessment of public investment in land-use change to protect environmental assets in Australia, Land Use Policy 29(2): 377-387.
Comments
I like the “culture of hope” term. I think of it as “Field of Dreams Syndrome”: if you build something, they will come. I sometimes ask my students these kinds of questions (including in exams on benefit-cost analysis). Their optimism is heartening in one way, but alarming in another.
I am not familiar with the work of Kahneman and Tversky. However, I wonder if they considered an opposite but perhaps related tendency: for people to exaggerate the costs (or dangers) of doing something. For instance, people often seem pessimistic when facing change, overestimating the costs of having to change. When the change actually occurs, lo and behold, the sky doesn’t fall in, or at least doesn’t fall in as much as feared. Examples that readily come to mind include the costs of reducing emissions, or perhaps the Y2K bug. Perhaps this is a ‘culture of fear’.
Of the twenty-odd biases they talk about, this wasn’t one of them. Not to say it doesn’t happen.