
Bayesian Analysis of Computer Code Outputs

Marc C. Kennedy, Anthony O'Hagan and Neil Higgins

National Institute for Statistical Sciences, USA, University of Sheffield and National Radiological Protection Board, UK

Publication details: In Quantitative Methods for Current Environmental Issues, C. W. Anderson, V. Barnett, P. C. Chatwin and A. H. El-Shaarawi (editors), pp. 227-243. Springer-Verlag, 2002.


Abstract

Complex computer models are widely used to describe and predict environmental phenomena. Although such models are generally deterministic, there is a flourishing area of research that treats their outputs as random quantities in order to provide powerful solutions to several important problems facing their users, including interpolation/emulation of the computer code itself, sensitivity and uncertainty analysis, and calibration. A minimal sketch of the emulation idea is given below.
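
The sketch below is only an illustration of the emulation idea, not the model or code used in the paper: it fits a Gaussian-process emulator to a handful of runs of a toy deterministic function, treating the unknown output at untried inputs as a random quantity. The toy function, kernel, length-scale and design points are all assumptions made purely for this example.

    import numpy as np

    # Toy stand-in for an expensive deterministic computer code
    # (hypothetical; not the deposition model discussed in the paper).
    def computer_model(x):
        return np.sin(3 * x) + 0.5 * x

    def sq_exp_kernel(a, b, variance=1.0, lengthscale=0.3):
        """Squared-exponential covariance between 1-D input arrays a and b."""
        d = a[:, None] - b[None, :]
        return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

    # A small number of code runs at design points: the emulator's training data.
    x_design = np.linspace(0.0, 2.0, 8)
    y_design = computer_model(x_design)

    # Condition a zero-mean GP on the code runs; a tiny nugget keeps the
    # Cholesky factorisation numerically stable.
    K = sq_exp_kernel(x_design, x_design) + 1e-8 * np.eye(len(x_design))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_design))

    # Emulator prediction (posterior mean and variance) at untried inputs.
    x_new = np.linspace(0.0, 2.0, 200)
    K_star = sq_exp_kernel(x_new, x_design)
    mean = K_star @ alpha
    v = np.linalg.solve(L, K_star.T)
    var = np.diag(sq_exp_kernel(x_new, x_new)) - np.sum(v ** 2, axis=0)

    print("max emulator error:", np.max(np.abs(mean - computer_model(x_new))))

Because the emulator returns a predictive variance as well as a mean, it quantifies how uncertain we are about the code output where no runs have been made, which is what makes the sensitivity, uncertainty and calibration analyses tractable without running the code exhaustively.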

This article reviews this field, with particular reference to a Bayesian methodology that offers a unified framework for addressing all of these problems. A substantial practical example is then presented, involving calibration of a model describing the radioactive deposition following the Windscale nuclear accident in the UK in 1957. The example illustrates some important features of the approach when the computer model cannot represent the real phenomenon accurately, in which case the Bayesian method seeks to correct the computer model.
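
As a deliberately simplified, hypothetical illustration of calibration with a correction term (not a reproduction of the Windscale analysis), the sketch below assumes field data of the form z = model(x, theta) + discrepancy(x) + noise and computes a grid posterior for the calibration parameter theta. The toy model, the i.i.d. Gaussian treatment of the discrepancy in place of a full Gaussian-process prior, and all numerical values are assumptions for illustration only.

    import numpy as np

    # Hypothetical one-dimensional model; theta is the unknown calibration input.
    def model(x, theta):
        return theta * np.exp(-x)

    rng = np.random.default_rng(1)
    x_obs = np.linspace(0.0, 3.0, 15)
    true_theta = 2.0
    discrepancy = 0.3 * np.sin(2 * x_obs)          # systematic model inadequacy
    z_obs = model(x_obs, true_theta) + discrepancy + rng.normal(0.0, 0.05, x_obs.size)

    # Grid posterior for theta under a flat prior. Treating the discrepancy as
    # zero-mean Gaussian with standard deviation tau inflates the error variance,
    # so the posterior does not force theta to compensate for model inadequacy.
    sigma, tau = 0.05, 0.3
    theta_grid = np.linspace(1.0, 3.0, 401)
    log_post = np.array([
        -0.5 * np.sum((z_obs - model(x_obs, th)) ** 2) / (sigma ** 2 + tau ** 2)
        for th in theta_grid
    ])
    post = np.exp(log_post - log_post.max())
    dtheta = theta_grid[1] - theta_grid[0]
    post /= post.sum() * dtheta                     # normalise to a density

    print("posterior mean of theta:", np.sum(theta_grid * post) * dtheta)

In the full Bayesian treatment the discrepancy is modelled as a smooth function (for example with a Gaussian-process prior) and estimated alongside theta, which is how the method corrects the computer model rather than merely tuning its inputs.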

