Article

How to Deal with Severe Uncertainty?

Decision Sciences
Severe uncertainty, deep uncertainty, radical uncertainty, ambiguity… different actors in a range of fields – decision scientists, risk analysts, climate scientists, central bankers – use a variety of phrases to talk of some extreme, important yet too often ignored form of uncertainty. But what is it? And how should we deal with this particular species of uncertainty: how should we characterise it, communicate it, and decide in the face of it? In this interview, CNRS Research Director and HEC Paris Research Professor Brian Hill explains the concept and unveils applicable tools based on theoretical models for guiding decisions in situations of severe uncertainty.

(Photo: hourglass on the grass ©icedmocha on Adobe Stock)


What is severe uncertainty?

A central characteristic of severe uncertainty is the lack of justified probabilities. When tossing a coin, we know precisely the probability of heads. Economists standardly assume that all uncertainties are glorified coin tosses: we can come up with a precise probability for whatever might happen (even if we might not always be right about it). But clearly many real-life situations are just not like that. There are many cases where we don’t know something for sure, and, though that doesn’t necessarily mean that we know nothing at all, what we do know is not enough to justify a solid, precise probability.

 

A central characteristic of severe uncertainty is the lack of justified probabilities.

 

What’s the coronavirus mortality rate? We know that it’s worse than the flu, and below 15%, but beyond that? Can we give a number we are 90% sure about? How fast will the global economy recover to turn-of-the-year GDP levels, or the Dow Jones to its pre-Covid-19 levels? They will almost surely not be there by September, but beyond that? Can we put precise probabilities? What will happen to sea level in, say, New York over the next 30 years? Given our understanding of climate change, we know it will rise, and almost certainly by less than 4m, but beyond that?  

Why is severe uncertainty relevant now?

Severe uncertainty is especially relevant now because we increasingly face situations involving it. Examples abound, including climate mitigation policy, Coronavirus reaction, economic policy, and of course business decisions. I should also add that this is being increasingly recognized, with the ex-governor of the Bank of England, Lord King, having just published a book on Radical Uncertainty with John Kay.

 

These decisions don’t allow us the time to do that: we have to respond to the Coronavirus before fully understanding it.

 

What do all these examples have in common? Urgency. Since the problem is lack of knowledge, one instinctual response would be to go out and do (more) research. But these decisions don’t allow us the time to do that: we have to respond to the Coronavirus before fully understanding it; by the time we know the sea level in New York in 2050 it might be too late to save it from flooding; and so on.

"Can we put precise probabilities? What will happen to sea level in, say, New York over the next 30 years?" (Photo: South of Manhattan, New York City ©DiegoAransay on AdobeStock)

 

Why do most people in economics, finance and risk analysis continue to discount severe uncertainty by assuming that all uncertainty can be fully captured by probabilities?

There are basically two reasons: one pragmatic and the other principled. First, it’s easier to work with precise probabilities, and the mathematical methods are familiar. Second, a bunch of philosophical, “axiom-based” arguments purport to show that, if you stray from precise probabilities, your decision making will violate some seemingly “rational” dynamic principles. These arguments have persuaded many over the years. If they were right, then these rationality principles would justify pretending that we always had precise probabilities (despite the egregiousness of the pretence).

 

In my research, I show that you can satisfy the rationality principles, even if you do not stick to precise probabilities.

 

In my research (1), I show that these arguments rest on a mistake: you can satisfy (properly formalised versions of) the rationality principles even if you do not stick to precise probabilities. This removes the main hurdle to building an account of rational or sensible decision making that doesn’t need to assume precise probabilities. In sum, beyond these arguments, the only barrier to a more refined, richer approach to uncertainty is inertia.

How should we decide in the face of severe uncertainty, then? 

As I see it, severe uncertainty poses a double challenge. The first is to work out what we do know and how solid that knowledge is, avoiding two pitfalls: nihilism – assuming that because we can’t put probabilities, we don’t know anything at all – and self-deception – pretending or assuming that we know more or have more precise knowledge than we in fact do. The second is to work out how to harness what we know – and more importantly recognize what we don’t – in decision making. Good, responsible, and informed but not self-deceptive decision making.

In my research, "Confidence in Beliefs and Rational Decision Making" (2), I have developed an approach to decision under uncertainty that meets each of these challenges. It combines two ingredients:

1. Confidence

Forget pretending that you can always give a probability and:

a. Ask for your best guess. Then ask how confident you are of it. That might not be very confident at all (if so: don’t rely on it!)
b. Then ask: if you had to give a probability range that you were very confident in, what would it be? (For difficult cases, this range could be very large: that’s what makes the case difficult!)
c. Repeat, asking for ranges that you are more or less confident in, or sure of.
(Note that ranges are well-known ways of not having to give precise values. To take a topical example, often in discussions of Covid-19 (e.g. here), epidemiologists report ranges. Under the proposal, you don’t even need to settle on a single range, but just ask how confident you are in a given range – on the basis of what you know).

 

2. Confidence-based caution
a. For more important decisions, demand more confidence in the judgements on which you rely to take the decision. If you have lots of confidence in a judgement or an assessment, by all means base your decision on it. If not, perhaps you should fall back on the (weaker, more imprecise) judgements of which you are more sure – especially if the decision is very important.
b. Now these judgements may be so weak as not to support any option as best: you don’t know enough to categorically justify a single course of action. In such cases, acknowledging this is a crucial first step. In the face of it, it’s best to show caution and take an alternative that won’t lead to too bad a result, no matter which of the values in the range (of which you are sufficiently confident) turns out to be right.

 

Basically, this advice amounts to applying precaution when you are not confident enough for the importance of the decision, and choosing boldly when you are.
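
To make the two ingredients concrete, here is a minimal sketch in Python. Everything in it is illustrative rather than taken from the research: the confidence labels, the numbers and the toy payoff functions are hypothetical, and the cautious rule at the end is a simple maximin-style stand-in for "an alternative that won’t lead to too bad a result".

```python
# Illustrative sketch only: labels, numbers and payoffs are hypothetical.

# Ingredient 1 -- Confidence: instead of a single precise probability,
# record nested ranges for the quantity of interest (here, a rate p),
# each tagged with how confident you are that the true value lies inside it.
confidence_ranges = [
    ("best guess",  0.50, (0.005, 0.015)),
    ("fairly sure", 0.75, (0.003, 0.05)),
    ("very sure",   0.95, (0.001, 0.15)),  # e.g. "worse than flu, below 15%"
]

def range_for_stakes(ranges, required_confidence):
    """Ingredient 2(a): the more important the decision, the more confidence
    is demanded, so the wider (weaker) the range we fall back on."""
    for label, confidence, interval in ranges:
        if confidence >= required_confidence:
            return label, interval
    return "insufficient confidence", None  # Ingredient 2(b): acknowledge it

def cautious_choice(acts, interval):
    """Pick the act whose worst outcome over the retained range is least bad
    (payoffs below are monotone in p, so checking the endpoints suffices)."""
    low, high = interval
    return max(acts, key=lambda act: min(act(low), act(high)))

# A bold act pays off well if p is low but badly if it is high;
# a cautious act is modest but robust.
act_bold = lambda p: 100 - 2000 * p
act_cautious = lambda p: 40 - 100 * p

# An important decision demands 0.95 confidence, so the wide range is used.
label, interval = range_for_stakes(confidence_ranges, required_confidence=0.95)
if interval is not None:
    best = cautious_choice([act_bold, act_cautious], interval)  # -> act_cautious
```

In this toy example the bold act looks better at the best guess, but once the decision is important enough to demand the wide, very-sure range, the robust act comes out on top: caution when confidence is lacking, boldness when it is not.
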

This approach is not just common sense: in my research (2), I have shown that it can be defended by the sort of principled, “rationality” arguments used by some to defend the reducibility of all uncertainty to probabilities.

What about models? 

Criticism of, say, economic models often has a tendency, when attacking the use of probabilities to represent uncertainty, to throw the baby out with the bathwater. This is a case of what I previously called the pitfall of nihilism. By contrast, climate scientists have a relatively sophisticated use of models, which can serve as an example.

They realise that models are the input to an assessment or judgement about the question of interest (e.g. temperature in 2050), but no model – nor even all models together – provides the whole picture.

 

In my research on climate uncertainty, uncertainty is reported as a form of confidence judgements on the probability assessments that come out, or could have come out, of the models.

 

Climate scientists (e.g. in IPCC reports) have to make a judgement, drawing on models, but also on other evidence, their experience and common sense. And these judgements do not generally come in the form of precise probabilities, although that’s what models produce. Rather, as I have discussed in my research with co-authors on climate uncertainty ((3) and (4)), they rightly report uncertainty in the form of confidence judgements on the probability assessments that come out, or could have come out, of the models. In other words, they adopt as reporting practice the approach I set out above.
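
As an illustration of that reporting practice, here is a minimal sketch, with a made-up data structure and hypothetical numbers, of a finding stated as a probability assessment qualified by a separate confidence judgement (in the spirit of IPCC-style "likely … with medium confidence" statements).

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class AssessedFinding:
    statement: str                           # the finding being assessed
    probability_range: Tuple[float, float]   # likelihood suggested by models and evidence
    confidence: str                          # judgement about how much to trust that likelihood

# Hypothetical example in the spirit of the New York sea-level question.
finding = AssessedFinding(
    statement="Sea-level rise in New York stays below 1 m over the next 30 years",
    probability_range=(0.66, 1.0),   # "likely"
    confidence="medium",             # the confidence judgement sits on top of the probability
)
```
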

 
1. Dynamic consistency and ambiguity: A reappraisal, Games and Economic Behavior, 120: 289-310, 2020.
2. Confidence in Beliefs and Rational Decision Making, Economics and Philosophy, 35(2): 223-258, 2019.
3. Climate Change Assessments: Confidence, Probability and Decision, Philosophy of Science, 84(3): 500-522, 2017 (with R. Bradley, C. Helgeson).
4. Combining probability with qualitative degree-of-certainty metrics in assessment, Climatic Change, 149(3-4): 517-525, 2018 (with R. Bradley, C. Helgeson).

 

Learn more on Brian Hill’s “Decision Making under Severe Uncertainty” website, including filmed interviews of experts on the “Uncertainty Across Disciplines” project.  
