Sensitivity Analysis
Sensitivity analysis (SA) is the study of how the variation in the output of a model (numerical or otherwise) can be apportioned, qualitatively or quantitatively, to different sources of variation, and of how the given model depends upon the information fed into it (Saltelli et al., 2000, 2008).
Goals and use
The goal of sensitivity analysis is to understand the quantitative sources of uncertainty in model calculations and to identify those sources that contribute the largest amount of uncertainty in a given outcome of interest.
Three types of sensitivity analysis can be distinguished:
- Screening, which is essentially a general investigation of the effects of variation in the inputs, not a quantitative method giving the exact percentage of the total variation that each factor accounts for. The main purpose of screening methods is to identify, in an efficient way, a short list of the most sensitive factors, so that in a follow-up uncertainty analysis the limited resources can be used in the most efficient way.
- Local SA, the effect of variation in each input factor while the others are kept at some constant level. The result is typically a series of partial derivatives (or approximations thereof), one for each factor, that define the rate of change of the output relative to the rate of change of the input.
- Global SA, the effects on the outcomes of interest of variation in the inputs, with all inputs allowed to vary over their ranges; this can be extended to take into account the shape of their probability density functions. It usually requires some procedure for sampling the parameters, often in a Monte Carlo form, and the result is more complex than for local SA. Saltelli et al. (2000) describe a range of statistics with which this type of information can be summarized. Global SA is a variance-based approach, using indices that express the contribution of parameters to the variance in the output (e.g. standardized rank correlation coefficients and partial rank correlation coefficients) (cf. Saltelli et al. 2000).
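As a purely illustrative example, the following minimal sketch (Python) shows what a Monte Carlo based global SA can look like in practice: inputs are sampled over assumed uniform ranges, the model is run for each sample, and Spearman rank correlations between each input and the output serve as a simple summary measure (a stand-in for the rank-based indices mentioned above). The model, parameter names and ranges are hypothetical placeholders, not a prescription.

# Minimal sketch of a Monte Carlo global sensitivity analysis summarized with
# rank correlations. The model and parameter ranges are hypothetical placeholders;
# a real study would use the model under assessment and its specified PDFs.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(seed=1)

# Hypothetical input factors with uniform ranges.
ranges = {"a": (0.5, 2.0), "b": (10.0, 30.0), "c": (0.0, 1.0)}
names = list(ranges)
n_samples = 1000

# Monte Carlo sample of the input space.
X = np.column_stack([rng.uniform(lo, hi, n_samples) for lo, hi in ranges.values()])

def model(a, b, c):
    # Placeholder for the (possibly expensive) model of interest.
    return a * np.sqrt(b) + 5.0 * c ** 2

y = model(*X.T)

# Rank correlation between each input and the output as a simple global
# sensitivity measure (robust to monotonic non-linearity).
for i, name in enumerate(names):
    rho, _ = spearmanr(X[:, i], y)
    print(f"{name}: Spearman rank correlation = {rho:.2f}")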
There is one particular (global) screening method for sensitivity analysis that we consider state of the art and recommend for its computational efficiency: the Morris algorithm (Morris, 1991). The typical case for applying this tool is when there are many parameters and the available resources do not allow one to specify probability density functions for a full Monte Carlo analysis. The description of Morris given here is taken from Potting et al. (2001): "The Morris method for global sensitivity analysis is a so-called one-step-at-a-time method, meaning that in each run only one input parameter is given a new value. It facilitates a global sensitivity analysis by making a number r of local changes at different points x(1→r) of the possible range of input values. The method starts by sampling a set of start values within the defined ranges of possible values for all input variables and calculating the subsequent model outcome. The second step changes the values for one variable (all other inputs remaining at their start values) and calculates the resulting change in model outcome compared to the first run. Next, the values for another variable are changed (the previous variable is kept at its changed value and all other ones kept at their start values) and the resulting change in model outcome compared to the second run is calculated. This goes on until all input variables have been changed. This procedure is repeated r times (where r is usually taken between 5 and 15), each time with a different set of start values, which leads to a total of r*(k+1) runs, where k is the number of input variables. This number is small compared to more demanding methods for sensitivity analysis (Campolongo et al. 1999).
The Morris method thus results in a number of r changes in model outcome from r times changing the input value of a given variable. This information is expressed in so-called elementary effects. These elementary effects are approximations of the gradient ∂y/∂x of the model output y with respect to a specific value of input variable x. The resulting set of r elementary effects is used to calculate the average elementary effect (to lose dependence on the specific point at which each measure was taken) and the standard deviation. The average elementary effect is indicated by μ, and the standard deviation by σ. The σ expresses whether the relation between input variable and model outcome has a linear (σ = 0) or a curvi-linear (σ > 0) character (Campolongo et al. 1999). Curvi-linearity will be caused by curvi-linear (main) effects and by interaction effects of the analysed input variable with other ones."
In summary, the Morris method applies a sophisticated one-step-at-a-time algorithm for global SA, designed such that if the sensitivity of one parameter is contingent on the values that other parameters take, the method is likely to capture such dependencies.
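The following minimal sketch (Python) illustrates the procedure described above: r trajectories are generated, each changing one (rescaled) input at a time by a fixed step, and the resulting elementary effects per factor are summarized by their mean μ and standard deviation σ. It is a simplified illustration (the original design samples start values from a p-level grid and ties the step size to p); the placeholder model and the chosen values of r, k and the step size are assumptions for illustration only.

# Simplified sketch of the Morris one-at-a-time screening design described above.
# Inputs are assumed to be rescaled to [0, 1]; r trajectories of k+1 model runs
# each (r*(k+1) runs in total) give r elementary effects per factor.
import numpy as np

rng = np.random.default_rng(seed=42)

def model(x):
    # Placeholder model with a quadratic term and an interaction,
    # so some factors show sigma > 0 (non-linearity and/or interaction).
    return 2.0 * x[0] + x[1] ** 2 + x[0] * x[2]

k = 3          # number of input factors
r = 10         # number of trajectories (typically 5-15)
delta = 0.25   # step size on the rescaled [0, 1] range

effects = np.zeros((r, k))
for t in range(r):
    x = rng.uniform(0.0, 1.0 - delta, size=k)   # start point of the trajectory
    y_prev = model(x)
    for j in rng.permutation(k):                # change one factor at a time
        x[j] += delta
        y_new = model(x)
        effects[t, j] = (y_new - y_prev) / delta   # elementary effect for factor j
        y_prev = y_new

mu = effects.mean(axis=0)            # average elementary effect (mu)
sigma = effects.std(axis=0, ddof=1)  # spread: non-linearity and/or interactions

for j in range(k):
    print(f"x{j}: mu = {mu[j]:.2f}  sigma = {sigma[j]:.2f}")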
Sorts and locations of uncertainty addressed
Sensitivity Analysis typically addresses statistical uncertainty (inexactness) in inputs and parameters. It is, however, also possible to use the technique to analyse sensitivity to changes in model structure. It does not treat knowledge uncertainty separately from variability-related uncertainty, and it provides no insight into the quality of the knowledge base or into issues of value loading.
Required resources
Skills:
- Basic computer skills
- Basic knowledge of statistical concepts
Computer requirements:
Software for sensitivity analysis will run on an average PC. The precise requirements depend on the model to which you apply the sensitivity analysis.
Strengths and limitations
Typical strengths of Sensitivity Analysis are:
- Gives insight into the potential influence of all sorts of changes in inputs
- Helps discriminate among parameters according to their importance for the accuracy of the outcome
- Software for sensitivity analysis is freely available (e.g. SIMLAB: http://sensitivity-analysis.jrc.cec.eu.int/default2.asp?page=SIMLAB)
- Easy to use
Typical weaknesses of Sensitivity Analysis are:
- Has a tendency to yield an overload of information.
- Sensitivity analysis does not require one to assess how likely it is that specific values of the parameters will actually occur.
- Sensitivity testing does not encourage the analyst to consider dependencies between parameters and probabilities that certain values will occur together.
- Morris method: interactions and non-linearity are hard to distinguish from one another.
These weaknesses can partly be overcome by a skilled design of the SA experiments, taking dependencies and restrictions into account, and by being creative in structuring, synthesizing and communicating the large volume of numbers produced by the sensitivity analysis.
Guidance on application
- If a likely range is not known, one can as a first pass use, for instance, the point value plus or minus 50%, or a factor of 2 (half the point value to double the point value), depending on the nature of the variable (see the sketch after this list)
- Make sure that the ranges do not include physically impossible values
- Explore possible dependencies
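The sketch below (Python) illustrates the first two points: first-pass ranges are derived from point values with a default factor of 2 and then clipped to physical bounds so that impossible values are excluded. The parameter names, point values and bounds are hypothetical.

# Derive first-pass ranges from point values (here: a factor of 2) and clip them
# to physical bounds so that impossible values are excluded.
# Parameter names, point values and bounds are hypothetical examples.
point_values = {"emission_factor": 2.4, "removal_efficiency": 0.85, "stack_height": 40.0}
physical_bounds = {"emission_factor": (0.0, None),
                   "removal_efficiency": (0.0, 1.0),   # a fraction cannot exceed 1
                   "stack_height": (0.0, None)}

def default_range(value, lo=None, hi=None, factor=2.0):
    """Half to double the point value, clipped to physical bounds."""
    low, high = value / factor, value * factor
    if lo is not None:
        low = max(low, lo)
    if hi is not None:
        high = min(high, hi)
    return low, high

ranges = {name: default_range(v, *physical_bounds[name]) for name, v in point_values.items()}
print(ranges)   # e.g. removal_efficiency is capped at 1.0 instead of 1.7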
Pitfalls
Typical pitfalls of SA can be:
- Forgetting that SA takes the model structure and boundaries for granted
- Wasting time on finding out likely ranges for unimportant parameters. This can be avoided with a two-step approach: first apply a default range (e.g. a factor of 2) to all parameters to find out which parameters appear to be sensitive at all; then, for the second step, specify well-founded ranges only for the sensitive parameters, but also include any parameters identified as insensitive for which one doubts whether the default range captures the full conceivable range (see the sketch at the end of this list).
- Ignoring dependencies between parameters
- Exploring irrelevant or physically unrealistic parts of the parameter hyperspace
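As a small illustration of the two-step approach mentioned under the second pitfall, the sketch below (Python) flags the parameters that appear sensitive in a first pass with default ranges, together with any parameters whose default range is doubted, as the ones deserving carefully specified ranges in the second step. The sensitivity measures, the 10% threshold and the parameter names are hypothetical.

# Two-step approach: a first pass with default factor-2 ranges flags candidate-
# sensitive parameters; only for those (plus any doubtful ones) are carefully
# justified ranges collected for the second step.
# first_pass_measure would come from a screening run (e.g. |mu| from Morris);
# the values and the 10% threshold are hypothetical.
first_pass_measure = {"a": 4.2, "b": 0.1, "c": 1.8, "d": 0.05}
doubtful_default_range = {"b"}   # insensitive in pass 1, but default range may be too narrow

total = sum(first_pass_measure.values())
sensitive = {p for p, m in first_pass_measure.items() if m / total > 0.10}
second_step = sorted(sensitive | doubtful_default_range)
print("Parameters needing carefully specified ranges:", second_step)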
References
Handbooks:
Saltelli, A., M. Ratto, T. Andres, F. Campolongo, J. Cariboni, D. Gatelli, M. Saisana and S. Tarantola. 2008. Global Sensitivity Analysis: The Primer. Chichester: John Wiley & Sons.
Saltelli, A., K. Chan and M. Scott. 2000. Sensitivity Analysis. John Wiley & Sons, Probability and Statistics series.
Saltelli, A., S. Tarantola, F. Campolongo and M. Ratto. 2004. Sensitivity Analysis in Practice: A Guide to Assessing Scientific Models. Chichester: John Wiley & Sons. (Where the Saltelli et al. 2000 book provides the theoretical basis, this book is a comprehensive practical compendium of recommended methods tailored to specified settings, built around a set of examples and the freely available SIMLAB software.)
Papers
Campolongo, F., S. Tarantola and A. Saltelli. 1999. Tackling quantitatively large dimensionality problems. Computer Physics Communications 117: 75-85.
Janssen, P.H.M., P.S.C. Heuberger and R.A. Sanders. 1994. UNCSAM: a tool for automating sensitivity and uncertainty analysis. Environmental Software 9: 1-11.
Morris, M.D. 1991. Factorial sampling plans for preliminary computational experiments. Technometrics 33(2): 161-174.
RIVM example of application of Morris:
Potting, J., P. Heuberger, A. Beusen, D. van Vuuren and B. de Vries. 2001. Sensitivity Analysis. Chapter 5 in: J.P. van der Sluijs, J. Potting, J. Risbey, D. van Vuuren, B. de Vries, A. Beusen, P. Heuberger, S. Corral Quintana, S. Funtowicz, P. Kloprogge, D. Nuijten, A. Petersen and J. Ravetz, Uncertainty assessment of the IMAGE/TIMER B1 CO2 emissions scenario, using the NUSAP method. Dutch National Research Program on Climate Change, Report no. 410 200 104, 227 pp. (downloadable from http://www.nusap.net)
Websites
http://sensitivity-analysis.jrc.ec.europa.eu/ (tutorial for sensitivity analysis)
Software
Available software for sensitivity analysis includes:
- SIMLAB: http://simlab.jrc.ec.europa.eu/
- @RISK: http://www.palisade.com (commercial plug-in for Excel)
- Crystal Ball: http://www.oracle.com/crystalball (commercial plug-in for Excel)