Policy issues such as climate change, biodiversity loss, genetically modified crops, and the environmental health risks of ambient particulate matter are complex and contested (e.g., Funtowicz and Ravetz 1990, Funtowicz 2006, Pilkey and Pilkey-Jarvis 2007, Beck 2007). For lengthy periods of debate, there are few ‘facts’ that command universal assent. Every side may be totally certain of the validity of its arguments, but they cannot all be correct in their conviction. Decisions need to be made before conclusive supporting evidence is available, while at the same time the potential impacts of wrong decisions can be huge. Questions that cannot be answered because of inconclusive evidence include: how likely are human-caused abrupt climate changes, such as nonlinear sea level rise? How much mitigation of greenhouse gas emissions is needed to prevent dangerous anthropogenic interference with the climate system? What will be the future impact of human activities on biodiversity? Can there be science-based precaution? What fraction of particulate matter causes health risks?
Governmental and intergovernmental agencies that inform the public about such risks increasingly recognize that uncertainty and disagreement can no longer be suppressed or denied, but need to be dealt with in a transparent and effective manner. For instance, in its recent report Models in Environmental Regulatory Decision Making (NRC 2007), the US National Research Council recommends that the US EPA pay more attention to the systematic treatment and communication of uncertainties.
The problems of inconclusive and uncertain evidence in science-for-policy can be addressed along different lines. On the one hand, there are formal methods for sensitivity and uncertainty analysis (Saltelli et al 2008), as well as methods for making inferences from (uncertain) evidence. For instance, in Bayesian methods, evidence is used to update, or to newly infer, the probability that a hypothesis is true. In Dempster–Shafer theory (Dempster 1967, Shafer 1976), belief masses are assigned to sets of possible events rather than to single events, whereby evidence can be associated with multiple possible events or sets of events. The theory provides rules for combining evidence from multiple sources, including conflicting evidence. To give a third example of formal approaches, in epidemiological work there is a practice of performing meta-analysis to combine evidence from multiple studies (e.g. Schwartz 1994). On the other hand, it is increasingly recognized that not all uncertainties can be quantified or handled in a formal way, and complementary, reflective approaches to explore the quality of evidence have been developed. Examples of such methods are pedigree analysis (van der Sluijs et al 2005a, 2005b), the model quality checklist (Risbey et al 2005), data quality indicators (SETAC 1994), and the data attribute rating system (Beck and Wilson 1997).
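To make the flavour of these formal approaches concrete, the sketch below (an illustrative Python fragment, not drawn from the cited literature) shows the simplest textbook versions of all three: a Bayesian update of the probability that a hypothesis is true given one piece of evidence, Dempster's rule for combining two possibly conflicting bodies of evidence over the same frame, and fixed-effect inverse-variance pooling as used in basic meta-analysis. All numbers and the ‘harmful’/‘harmless’ frame are hypothetical.

```python
import math
from itertools import product


def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior probability of hypothesis H after observing evidence E (Bayes' rule)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1.0 - prior)
    return p_e_given_h * prior / p_e


def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions over the same frame.

    Mass functions map frozensets (focal elements) to masses summing to 1.
    Mass landing on conflicting (disjoint) pairs is discarded and the rest renormalized.
    """
    combined = {}
    conflict = 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2
    # Normalize by (1 - K), where K is the total conflicting mass (assumed < 1 here).
    return {s: w / (1.0 - conflict) for s, w in combined.items()}


def inverse_variance_pool(estimates, std_errors):
    """Fixed-effect meta-analysis: pool study estimates weighted by 1/variance."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se


if __name__ == "__main__":
    # Bayesian update: prior belief 0.3, evidence three times more likely under H.
    print(bayes_update(prior=0.3, p_e_given_h=0.6, p_e_given_not_h=0.2))  # ~0.56

    # Dempster-Shafer: two sources weigh in on the frame {harmful, harmless};
    # mass on the whole frame expresses "don't know".
    H, N = frozenset({"harmful"}), frozenset({"harmless"})
    THETA = H | N
    m1 = {H: 0.6, THETA: 0.4}   # source 1: some support for "harmful"
    m2 = {N: 0.3, THETA: 0.7}   # source 2: weak support for "harmless"
    print(dempster_combine(m1, m2))

    # Meta-analysis: pool three hypothetical effect estimates with their standard errors.
    print(inverse_variance_pool([0.8, 1.1, 0.9], [0.2, 0.3, 0.25]))
```

In real assessments such calculations are only a starting point: the quality of the inputs themselves still has to be appraised, which is where the reflective approaches listed above come in.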
In response to emerging needs, several institutions working at the interface of science and policy have adopted Knowledge Quality Assessment (KQA) approaches, which include both formal and reflective methods (IPCC 2005, EPA 2003, UK Strategy Unit 2002, MNP/UU 2003, Kinzig et al 2003, Janssen et al 2005, Ha-Duong et al 2007).
There is a growing need for comprehensive, multi-disciplinary approaches to KQA, which take into account the societal context of knowledge production. KQA can facilitate more transparent and systematic treatment and communication of uncertainties in environmental assessments.
Evidence in policy advice
In the modern view of scientific policy advice, science produces objective, valid, and reliable knowledge. That view resembles the situation in a mono-disciplinary textbook: for each problem, there is just one correct solution, derived from the facts produced by science. But in the real world of policy advice on complex issues, we face uncertainty and controversy, and questions arise as to what extent the information can really be objective, valid, and reliable (Oreskes et al 1994, Petersen 2000, Funtowicz 2006). Scientific assessments of complex policy issues have to integrate information covering the entire spectrum from well-established scientific knowledge to educated guesses, preliminary models, and tentative assumptions. Genuine policy debates draw on all such materials, not as certain, established facts, but as evidence, whose quality must be assessed through appropriate procedures. Since the evaluation of uncertain information can involve an assignment of the burden of proof (is a substance deemed harmless until proved otherwise?), the analogy with jurisprudence is much closer than spokesmen for science had previously realized. And when we consider such materials as evidence brought into an argument, rather than as imperfect facts or defective knowledge, the need to analyze their quality, including uncertainties, becomes natural and obvious.
Social studies of scientific advice show that for many complex problems, the processes within the scientific community, as well as between this community and the ‘external’ world of policy makers, stakeholders, and civil society, determine the acceptability of a scientific assessment as a shared basis for action. These processes concern, among others, the framing of the problem, the choice of methods, the strategy for gathering data, the review and interpretation of results, the distribution of roles in knowledge production and assessment, and the function of the results in the policy arena. Although the assumptions underlying the design of these processes are rarely discussed openly, they are important in determining whether the knowledge becomes ‘contested’ or ‘robust’. More research on complex issues sometimes reveals more uncertainties, and can even lead to more intense controversy and weaker evidence if these implicit assumptions are not adequately dealt with (Sarewitz 2004, van der Sluijs 2005).
Contrary to general practice, it is not enough to analyze uncertainty as a ‘technical’ problem or merely to seek consensus interpretations of inconclusive evidence. In addition, the production of knowledge and the assessment of uncertainty have to address the deeper uncertainties that reside in problem framings, expert judgments, assumed model structures, and so on. Because scientists are generally not well prepared for this new task, systematic guidance is needed.
This e-learning course presents state-of-the-art KQA concepts, tools, and approaches.
References
Beck L and Wilson D 1997 EPA’s data attribute rating system Proc. Specialty Conf. on Emission Inventory: Planning for the Future pp 176–89
Beck M B 2007 How best to look forward? Science 316 202–3
Dempster A P 1967 Upper and lower probabilities induced by a multivalued mapping Ann. Math. Stat. 38 325–39
EPA (Environmental Protection Agency) 2003 Draft Guidance on the Development, Evaluation, and Application of Regulatory Environmental Models (Washington, DC: US EPA Council for Regulatory Environmental Modeling)
Funtowicz S O 2006 Why knowledge assessment? Interfaces between Science and Society ed A Guimarães Pereira, S Guedes Vaz and S Tognetti (Sheffield: Greenleaf) pp 138–45
Funtowicz S O and Ravetz J R 1990 Uncertainty and Quality in Science for Policy (Dordrecht: Kluwer)
Ha-Duong M, Swart R, Bernstein L and Petersen A C 2007 Uncertainty management in the IPCC: agreeing to disagree Global Environ. Change 17 8–11
IPCC (Intergovernmental Panel on Climate Change) 2005 Guidance Notes for Lead Authors of the IPCC Fourth Assessment Report on Addressing Uncertainties
Janssen P H M, Petersen A C, van der Sluijs J P, Risbey J S and Ravetz J R 2005 A guidance for assessing and communicating uncertainties Water Sci. Technol. 52 125–31
Kinzig A et al 2003 Coping with uncertainty: a call for a new science-policy forum Ambio 32 330–5
MNP/UU 2003 RIVM/MNP Guidance for Uncertainty Assessment and Communication (Bilthoven: MNP) (Utrecht: Utrecht University)
NRC (National Research Council) 2007 Models in Environmental Regulatory Decision Making (Washington, DC: National Academies Press)
Oreskes N, Shrader-Frechette K and Belitz K 1994 Verification, validation, and confirmation of numerical models in the Earth sciences Science 263 641–6
Petersen A C 2000 Philosophy of climate science Bull. Am. Meteorol. Soc. 81 265–71
Pilkey O H and Pilkey-Jarvis L 2007 Useless Arithmetic: Why Environmental Scientists Can’t Predict the Future (New York: Columbia University Press)
Risbey J S, van der Sluijs J P, Kloprogge P, Ravetz J R, Funtowicz S and Corral Quintana S 2005 Application of a checklist for quality assistance in environmental modelling to an energy model Environ. Model. Assess. 10 63–79
Saltelli A, Ratto M, Andres T, Campolongo F, Cariboni J, Gatelli D, Saisana M and Tarantola S 2008 Global Sensitivity Analysis: The Primer (Chichester: Wiley)
Sarewitz D 2004 How science makes environmental controversies worse Environ. Sci. Policy 7 385–403
Schwartz J 1994 Air pollution and daily mortality: a review and meta-analysis Environ. Res. 64 36–52
SETAC 1994 Life-Cycle Assessment Data Quality: A Conceptual Framework (Pensacola, FL: Society of Environmental Toxicology and Chemistry and SETAC Foundation for Environmental Education)
Shafer G 1976 A Mathematical Theory of Evidence (Princeton, NJ: Princeton University Press)
UK Strategy Unit 2002 Risk: Improving Government’s Capability to Handle Risk and Uncertainty (London: United Kingdom Strategy Unit, Cabinet Office)
van der Sluijs J P 2005 Uncertainty as a monster in the science policy interface: four coping strategies Water Sci. Technol. 52 87–92
van der Sluijs J P, Craye M, Funtowicz S O, Kloprogge P, Ravetz J R and Risbey J S 2005a Combining quantitative and qualitative measures of uncertainty in model based environmental assessment: the NUSAP system Risk Anal. 25 481–92
van der Sluijs J P, Craye M, Funtowicz S O, Kloprogge P, Ravetz J R and Risbey J S 2005b Experiences with the NUSAP system for multidimensional uncertainty assessment in model based foresight studies Water Sci. Technol. 52 133–44