Dimensions of uncertainty

Uncertainty can be classified according to its type (Funtowicz and Ravetz, 1990):

Technical uncertainty: inexactness.
This refers to significant error bars, probability distribution functions, multiple tenable model structures, etc.

Methodological uncertainty: unreliability.
This refers to the level of confidence, quality, soundness, scientific status, etc. of the knowledge.

Epistemological uncertainty: ignorance.
This refers to everything we 'don't know we don't know'. Funtowicz and Ravetz speak of the border with ignorance rather than of ignorance itself, because by definition we cannot say anything useful about that of which we are ignorant, "but the boundless sea of ignorance has shores which we can stand on and map."

  • Inexactness can be addressed by quantitative uncertainty analysis tools such as sensitivity analysis and Monte Carlo analysis (a minimal sketch follows this list). We refer to the section on uncertainty tools for more details.
  • Unreliability can be addressed by quality control. See the section on quality for more details.
  • Ignorance is the most difficult category of uncertainty to address, since it covers everything we 'don't know we don't know'. One strategy for addressing ignorance is more research; another is to attempt to anticipate the unexpected (surprise).
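The sketch below illustrates the Monte Carlo approach mentioned in the first bullet: input uncertainty is propagated through a deliberately simple toy model y = a*exp(b*t) by sampling, and input-output correlations serve as a crude sensitivity indicator. All names, distributions, and parameter values are illustrative assumptions, not taken from any particular model.

```python
# Minimal Monte Carlo uncertainty propagation for a toy model y = a*exp(b*t).
# All distributions and values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # number of Monte Carlo samples

# Sample the uncertain inputs from assumed probability distributions.
a = rng.normal(loc=1.0, scale=0.1, size=N)    # hypothetical initial level
b = rng.uniform(low=0.01, high=0.03, size=N)  # hypothetical growth rate

t = 50.0
y = a * np.exp(b * t)  # propagate the input uncertainty through the model

print(f"mean = {y.mean():.2f}, std = {y.std():.2f}")
print(f"90% interval: [{np.percentile(y, 5):.2f}, {np.percentile(y, 95):.2f}]")

# Crude sensitivity screening: correlation of each input with the output.
for name, x in [("a", a), ("b", b)]:
    print(f"corr({name}, y) = {np.corrcoef(x, y)[0, 1]:.2f}")
```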


Reducing ignorance by research, a paradox
Ignorance is unassessable, so the only thing we can do is explore the border with ignorance. The paradox is that we try to reduce ignorance by doing more research, whereas more research enlarges the border with ignorance, and ignorance increases with increased commitments based on given knowledge (e.g. Wynne, 1992). Pascal once said: "Science is like a ball in a universe of ignorance. The more we expand knowledge, the greater the ignorance encountered by the ball's expanding surface." (cited in Giarini and Stahel, 1993). Giarini and Stahel (1993, pp. 219-220) have put forward the philosophical notion that "Our ignorance and our imperfect information are an instance of disequilibrium, a condition of life and of evolution. Our growing ignorance, determined by the growth of our knowledge which increases the number of unanswered questions, is the best evidence that we are part of the flow of life. Experience tells us that whenever we have the feeling of having completely mastered and understood a problem, it is often because the object or the situation of reference no longer exists: we are just about to discover that our confidence in our capacity 'totally' to understand is at least partly misplaced."


Anticipating surprise
"Much of the work to date has been based, implicitly or explicitly, on an evolutionary paradigm - the gradual, incremental unfolding of the world system in a manner that can be described by surprise-free models, with parameters derived from a combination of time series and crosssectional analysis of the existing system. ... The focus on surprise-free models and projections is not the result of ignorance or reductionism so much as of the lack of practically usable methodologies to deal with discontinuities and random events. The multiplicity of conceivable surprises is so large and heterogeneous that the analyst despairs of deciding where to begin, and instead proceeds in the hope that in the longer sweep of history surprises and discontinuities will average out, leaving smoother long-term trends that can be identified in retrospect and can provide a basis for reasonable approximations in the future." (Brooks, 1986).

Surprise can play a role in every step of the causal chain of environmental and sustainability issues. Examples from the past are discrete events such as the oil shocks of 1973 and 1979; discontinuities in long-term trends, such as the acceleration of US oil imports between 1966 and 1973; but also events that turned out to trigger or accelerate the policy process, such as the 1988 US heat wave and the unprecedented damage (US$ 15.5 billion) caused by Hurricane Andrew in 1992 (Property Claim Services, 1996). The natural system also produces surprises, such as the volcanic eruption of Mt. Pinatubo in June 1991, which is believed to be responsible for the observed discontinuity in the trends in atmospheric concentrations of CO2, CO and CH4 and in temperature (McCormick et al., 1995).

A further issue is that non-linear stochastic systems may have counter-intuitive future states which are missed if the system representation is inadequate. One such inadequacy is the neglect of feedbacks in the system. Another problem that can make models inadequate is that in real-world stochastic complex systems the probability values of the variables are constantly in flux. Further, stochasticity in nature constantly alters the relationships between system components, and new external variables are added regularly, which change the natural conditions for the overall system. For instance, the introduction of human-made substances, such as CFCs, into the atmosphere has dramatically changed stratospheric chemistry. As another example, the emission of a certain component can change the atmospheric chemistry pathways of a range of other components. These categories of "dynamic system dynamics" are not represented, or are only poorly represented, in current models.
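To make the first point concrete, here is a hedged sketch using a purely hypothetical one-variable map (all coefficients are invented for illustration): with a quadratic feedback term included, two almost identical initial conditions end up in qualitatively different futures, a tipping behaviour the linear version of the same model can never show.

```python
# Hypothetical toy map (not any published model): the state is damped back
# toward equilibrium but amplified by a quadratic feedback term.
def simulate(x0: float, feedback: float, steps: int = 500) -> float:
    """Iterate x <- x + forcing - damping*x + feedback*x**2; stop on runaway."""
    x = x0
    for _ in range(steps):
        x = x + 0.05 - 0.1 * x + feedback * x * x
        if abs(x) > 1e6:  # the state has run away: the system has tipped
            return float("inf")
    return x

print(simulate(x0=1.7, feedback=0.04))  # ~0.69: settles to an equilibrium
print(simulate(x0=1.9, feedback=0.04))  # inf: tips into runaway growth
print(simulate(x0=1.9, feedback=0.0))   # 0.50: without feedback it never tips
```

A model that neglects the feedback term is thus blind to the threshold separating the last two runs.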

The simplifications made to model complex systems despite our limited understanding might well rule out certain characteristics of system dynamics, such as the existence and nature of attractors in the system, which might be crucial in the evaluation of the future behaviour of the system.

Most Earth System Models use smoothed, idealized and deterministic functional relations. Part of the potentially identifiable surprise is thereby ruled out by the way the model is constructed: idealized smooth curves are used to represent relations between variables, whereas nature contains noise and proves time and time again to be much more capricious and erratic. For instance, the record of global mean temperature obtained from aggregated measurements and advanced reconstructions of past climates is non-smooth and is understood to be a mixture of cyclical behaviour on virtually all time scales (such as the diurnal cycle, the seasonal cycle, the El Niño Southern Oscillation, the 11-year and 22-year solar cycles, the 80-90 year solar cycle, and the Milankovitch cycles of roughly 23, 41 and 100 thousand years), trends, and irregular fluctuations. These fluctuations are usually called the 'natural variability' of the climate. The trends and the cyclic behaviour can be modelled under smooth assumptions; the irregular part cannot.
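This can be illustrated with a synthetic record (all numbers invented): a smooth trend plus a seasonal cycle plus irregular noise. A least-squares fit of the smooth components recovers the trend and the cycle, but the irregular component remains as an unexplained residual.

```python
# Synthetic "temperature" record: smooth trend + seasonal cycle + noise.
# All amplitudes and rates are illustrative, not fitted to real data.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0, 100, 1 / 12)               # monthly steps over 100 years

trend = 0.006 * t                            # slow warming trend
cycle = 0.4 * np.sin(2 * np.pi * t)          # seasonal cycle (period 1 year)
noise = 0.15 * rng.standard_normal(t.size)   # irregular 'natural variability'
record = trend + cycle + noise

# Least-squares fit of the smooth components only:
# record ~ c0 + c1*t + c2*sin(2*pi*t) + c3*cos(2*pi*t)
X = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
coef, *_ = np.linalg.lstsq(X, record, rcond=None)
residual = record - X @ coef

print(f"std of record:   {record.std():.3f}")
print(f"std of residual: {residual.std():.3f}  # what smooth models cannot capture")
```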

Recently, a new bottom-up modelling technique for complex adaptive (social) systems has been developed, called agent-based modelling. It can, to a certain extent, be used to model some aspects of surprise. The method has been demonstrated with a traffic model for the city of Albuquerque, in which the travel behaviour and decision rules of every single inhabitant (the agents) are modelled, together with the road network. The resulting aggregated traffic patterns over the course of a day show the build-up of morning rush-hour traffic and the resulting traffic jams. The hope is that a deeper understanding of how complex adaptive systems work will suggest the right type of mathematical structures and lead to a sound theory of these processes, which this new school of modelling believes will ultimately lead to increased predictability of surprises such as traffic jams (Casti, 1996a; Casti, 1996b).
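The sketch below is not the Albuquerque model itself, but a minimal example in the same spirit: the Nagel-Schreckenberg cellular automaton, a classic agent-based traffic model in which jams emerge from four simple per-driver rules. All parameter values are illustrative.

```python
# Nagel-Schreckenberg traffic cellular automaton on a circular one-lane road.
# Parameter values (road length, car count, V_MAX, P_SLOW) are illustrative.
import numpy as np

rng = np.random.default_rng(1)
ROAD, N_CARS, V_MAX, P_SLOW, STEPS = 100, 30, 5, 0.3, 50

pos = np.sort(rng.choice(ROAD, size=N_CARS, replace=False))  # car positions
vel = np.zeros_like(pos)                                     # car speeds

for _ in range(STEPS):
    gaps = (np.roll(pos, -1) - pos - 1) % ROAD  # free cells ahead of each car
    vel = np.minimum(vel + 1, V_MAX)            # 1. accelerate toward V_MAX
    vel = np.minimum(vel, gaps)                 # 2. brake to avoid the car ahead
    slow = rng.random(N_CARS) < P_SLOW          # 3. random slowdown (driver noise)
    vel = np.maximum(vel - slow, 0)
    pos = (pos + vel) % ROAD                    # 4. move

print(f"mean speed after {STEPS} steps: {vel.mean():.2f} (V_MAX = {V_MAX})")
print(f"cars standing still in jams: {(vel == 0).sum()} of {N_CARS}")
```

No rule mentions a jam, yet cars pile up behind randomly slowed drivers: the jam is an emergent, aggregate outcome of individual decision rules.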

Given the absence of adequate methodology to model surprise, a systematic search for examples of non-linearities from the past might be the prelude to a search for possible future surprises (Brooks, 1986). Other strategies that can help us to understand surprise include focusing on the underlying principles of surprise, which is what happens in surprise theory (Holling, 1986), and systematically 'thinking the unthinkable' by imagining unlikely future events followed by the construction of plausible scenarios by which they might be realized (Kates and Clark, 1996).

Non-smoothness introduces a problem into sensitivity and uncertainty analysis, because classical uncertainty analysis is based on smooth systems. Sensitivity analysis of non-smooth systems is a special topic that deserves more attention. Such analysis should focus on the identification of (thresholds in) indicators which could be used to predict jumps in the system and discontinuities in trends.
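As a hedged sketch of the difficulty, assume a hypothetical threshold response whose output jumps at a critical input value X_C (all numbers invented): a local, derivative-based sensitivity measure sees only the smooth slope and misses the jump entirely, whereas a simple Monte Carlo screening of sorted samples locates the threshold.

```python
# Hypothetical non-smooth response: smooth below a threshold X_C, with a
# discontinuous jump above it. All values are illustrative assumptions.
import numpy as np

X_C = 2.0

def response(x: np.ndarray) -> np.ndarray:
    return np.where(x < X_C, 0.1 * x, 0.1 * x + 1.0)

rng = np.random.default_rng(7)
x = rng.uniform(1.0, 3.0, size=100_000)
y = response(x)

# A local derivative-based sensitivity at x = 1.5 misses the jump entirely.
eps = 1e-6
slope = (response(np.array([1.5 + eps]))[0] - response(np.array([1.5]))[0]) / eps
print(f"local slope at x = 1.5: {slope:.3f}")

# Monte Carlo screening reveals the discontinuity: sort the samples by input
# and look for the largest jump between neighbouring outputs.
order = np.argsort(x)
jumps = np.diff(y[order])
print(f"estimated threshold near x = {x[order][np.argmax(jumps)]:.3f}")
```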


References

H. Brooks, The Typology of Surprises in Technology, Institutions and Development, in: W.C. Clark (ed.), Sustainable Development of the Biosphere, Cambridge University Press, New York, 1986, p. 325-348.

J. Casti, What if..., in: New Scientist, 13 July 1996 (1996a).

J. Casti, Would-Be Worlds, John Wiley & Sons, 1996 (1996b).

S.O. Funtowicz and J.R. Ravetz, Uncertainty and Quality in Science for Policy, Kluwer, Dordrecht, 1990.

O. Giarini and W.R. Stahel, The Limits to Certainty, Facing Risks in the New Service Economy, 2nd revised edition, Kluwer Academic Publishers, Dordrecht, 1993.

C.S. Holling, The Resilience of Terrestrial Ecosystems: Local Surprise and Global Change, in: W.C. Clark and R.E. Munn (eds.) Sustainable Development of the Biosphere, Cambridge University Press, 1986, p. 292-317.

R.W. Kates and W.C. Clark, Expecting the unexpected, in: Environment, 38 (2), 1996, p. 6-11 and 28-34.

M.P. McCormick, L.W. Thomason, and C.R. Trepte, Atmospheric Effects of the Mt Pinatubo Eruption, in: Nature, 373, 1995, p. 399-404.

B. Wynne, Uncertainty and Environmental Learning, in: Global Environmental Change, 2, 1992, p. 111-127.