Uncertainty assessment tools

Many methodologies and tools suitable for supporting uncertainty assessment have been developed and reported in the scientific literature. Here we present a selection of uncertainty tools representing the commonly applied types.

The methods address different types and dimensions of uncertainty. To assist in selecting tools for conducting uncertainty assessment in a given case, the table below presents the uncertainty matrix and shows which tools can be used to address each of the sorts and locations of uncertainty distinguished.
Some of the tools are hard to map because they address a range of locations and dimensions. We have listed these methods in those boxes of the table where we consider them particularly strong. One example is the NUSAP method, which generally combines a quantitative tool (sensitivity analysis or Monte Carlo analysis) with systematic critical review and pedigree analysis; a small illustration of the pedigree component follows below.
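
To give a flavour of the pedigree component, the minimal Python sketch below averages reviewer scores per pedigree criterion and derives an overall pedigree strength. The criteria follow a commonly used pedigree scheme (proxy, empirical basis, theoretical understanding, methodological rigour, validation, each scored 0 to 4); the reviewer scores themselves are invented for illustration.

```python
# Illustrative NUSAP-style pedigree calculation; the criteria follow a
# common pedigree scheme, but all scores here are invented for illustration.
# Each criterion is scored on the usual 0 (weak) to 4 (strong) scale.
pedigree_scores = {
    "proxy": [3, 4, 3],                      # scores from three reviewers
    "empirical basis": [2, 2, 3],
    "theoretical understanding": [3, 3, 4],
    "methodological rigour": [2, 3, 2],
    "validation": [1, 2, 1],
}

MAX_SCORE = 4

# Average per criterion: where do reviewers see the weakest backing?
for criterion, scores in pedigree_scores.items():
    print(f"{criterion:>26}: {sum(scores) / len(scores):.2f} / {MAX_SCORE}")

# Overall pedigree strength, normalized to [0, 1]
all_scores = [s for scores in pedigree_scores.values() for s in scores]
strength = sum(all_scores) / (len(all_scores) * MAX_SCORE)
print(f"overall pedigree strength: {strength:.2f}")
```

In NUSAP practice such pedigree scores are usually presented alongside the quantitative spread (e.g. from Monte Carlo analysis), for instance in a diagnostic diagram, rather than collapsed into a single number.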

Further, it should be noted that the use of many of the tools is not limited to the boxes in which they are listed. For instance, sensitivity analysis could also be applied to assess sensitivity to different model structures, and scenario analysis and sensitivity analysis (screening) may overlap. In a Monte Carlo assessment one could address model structure uncertainty by introducing a switch parameter that switches between different model equations representing different conceivable model structures, and sampling that switch parameter from, for instance, a uniform distribution; a sketch of this idea follows below. The table should therefore not be interpreted too strictly: it gives a rough overview of the basic scope of application of each tool.
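
A minimal sketch of this switch-parameter approach, assuming two invented model structures and illustrative parameter distributions (none of which come from the toolbox itself):

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # number of Monte Carlo samples

def model_linear(x, a, b):
    # conceivable model structure 1: linear response
    return a * x + b

def model_saturating(x, a, b):
    # conceivable model structure 2: saturating response
    return a * x / (1.0 + b * x)

x = 5.0                          # fixed model input, chosen for illustration
a = rng.normal(1.0, 0.2, N)      # parameter uncertainty (invented distribution)
b = rng.normal(0.5, 0.1, N)      # parameter uncertainty (invented distribution)
switch = rng.integers(0, 2, N)   # switch parameter, uniform over {0, 1}

# For each sample, evaluate the model structure selected by the switch
y = np.where(switch == 0, model_linear(x, a, b), model_saturating(x, a, b))

print(f"mean = {y.mean():.3f}, "
      f"90% interval = [{np.percentile(y, 5):.3f}, {np.percentile(y, 95):.3f}]")
```

Fixing the switch at 0 or 1 isolates the parametric spread within each structure, so the contribution of structural uncertainty to the total spread can be judged.
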
Correspondence of the tools with the sorts and locations of uncertainty distinguished in the uncertainty matrix. Entries printed in italics in the original table are not described in this toolbox because there are no standard methods to perform these tasks. The first three type columns together cover the level of uncertainty (from determinism, through probability and possibility, to ignorance); the fourth covers the nature of uncertainty (knowledge-related and variability-related).

| Location ↓ \ Type → | Statistical uncertainty (range + probability) | Scenario uncertainty ('what-if' option) | Recognized ignorance | Nature of uncertainty | Qualification of knowledge base (backing) | Value-ladenness of choices |
|---|---|---|---|---|---|---|
| Context: ecological, technological, economic, social and political representation | SA, QA, EE | Sc, QA, SI, EE | Sc, MQC, QA, SI, NUSAP/EP, EE | NUSAP/EP, MQC, QA, EE | NUSAP/EP, MQC, QA, PR, EPR, EE | CRA, Sc, AA, SI, EE, PR, EPR |
| Data (in general sense): measurements and monitoring data; survey data | SA, EPE, MCA, EE | Sc, EE | Sc, QA, NUSAP, MQC, DV, MV, EE | NUSAP, MQC, DV, QA, EE | NUSAP, MQC, QA, PR, EPR, EE | CRA, Sc, PR, EPR, SI |
| Model – inputs: measurements, monitoring data; survey data | (see the Data row) | | | | | |
| Model – structure: parameters, relations | SA, MMS, EE, MQC, MC | Sc, MMS | NUSAP, MQC, MC, MV | MQC, NUSAP, QA, EE | MQC, NUSAP, MC, MV, PR, EPR, EE | CRA, MMS, PR, EPR, SI |
| Model – technical model: software and hardware implementation | QA, SA | QA, SA | QA, SA | PR | PR | SA, PR |
| Expert judgement: narratives; storylines; advice | SA, QA, EE | Sc, QA, SI, EE | Sc, MQC, QA, SI, NUSAP/EP, EE | NUSAP/EP, MQC, QA, EE | NUSAP/EP, MQC, QA, PR, EPR, EE | CRA, Sc, AA, SI, PR, EPR, EE |
| Outputs (indicators; statements) | Sc, SA, EPE, MC, EE | Sc, SA, EE | NUSAP, EE | NUSAP, MQC, PR, EPR, EE | NUSAP, MQC, QA, PR, EPR, EE | CRA, PR, EPR |

Explanation of abbreviations in the table: 

AA: Actor Analysis
CRA: Critical Review of Assumptions
DV: Data Validation
EE: Expert Elicitation
EP: Extended Pedigree scheme
EPE: Error Propagation Equation
EPR: Extended Peer Review (review by stakeholders)
MC: Model Comparison
MCA: Monte Carlo Analysis (Tier 2 analysis)
MMS: Multiple Model Simulation
MQC: Model Quality Checklist
MV: Model Validation
NUSAP: NUSAP (Numeral Unit Spread Assessment Pedigree)
PR: Peer Review
QA: Quality Assurance
SA: Sensitivity Analysis
Sc: Scenario Analysis
SI: Stakeholder Involvement / Extended Peer Review