Data Assimilation and Climate Research, June 23-25

Lectures

Particle Filters and EnKF
Peter Bickel, University of California, Berkeley

On long time scales, nonlinearity in climate forecasting becomes even more significant than usual. One approach proposed to deal with this difficulty is particle filtering. I will discuss the theoretical basis of the method as well as some well-known limitations. We will also present some methods lying between particle filtering and the EnKF which have performed well in the usual test-bed models.
The non-classical parts are collaborative work with some combination of:
Jeff Anderson (NCAR), Thomas Bengtsson (Bell Labs), Jing Lei (UC Berkeley), Bo Li (Purdue University), and Chris Snyder (NCAR)
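
As a rough illustration of the basic method under discussion, the following is a minimal bootstrap particle filter cycle, assuming a generic stochastic model step `propagate` and Gaussian observation error; all names here are placeholders rather than anything taken from the talk.

```python
import numpy as np

def particle_filter_step(particles, observation, propagate, obs_operator, obs_std, rng):
    """One cycle of a bootstrap particle filter.

    particles    : (N, d) array of state samples
    observation  : (m,) observed vector
    propagate    : function advancing one particle to the observation time
    obs_operator : maps a state to observation space
    obs_std      : observation-error standard deviation (Gaussian, diagonal)
    """
    # Forecast step: advance each particle with the (possibly stochastic) model.
    particles = np.array([propagate(p, rng) for p in particles])

    # Weight each particle by its Gaussian observation likelihood.
    innovations = observation - np.array([obs_operator(p) for p in particles])
    log_w = -0.5 * np.sum((innovations / obs_std) ** 2, axis=1)
    weights = np.exp(log_w - log_w.max())
    weights /= weights.sum()

    # Multinomial resampling; the collapse of weights in high dimensions is
    # one of the well-known limitations mentioned above.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]
```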

Progress toward Dynamical Paleoclimate State Estimation
Gregory J. Hakim, University of Washington

While state estimation is a mature subject in weather analysis and prediction, it remains an open problem for estimating low-frequency variability of the coupled climate system. This problem has deep implications for understanding past climate variability and for predicting future climate change. Here we focus on the paleoclimate reconstruction problem, where one attempts to constrain state estimates of the climate system given sparse and noisy proxy data. There are at least three barriers to progress on this problem not faced by the analogous weather forecasting problem: (1) assimilating time-integrated observations rather than instantaneous values, (2) simulating long time periods, and (3) estimating the proxy, which is often biological or chemical in nature.
After reviewing these issues I will discuss recent and ongoing basic research to address these challenges, and present results from a hierarchy of idealized models.
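
To make the first barrier concrete, one simple ensemble treatment of a time-integrated observation is to update the time mean of each ensemble trajectory and carry the deviations along unchanged. The sketch below is only a schematic of that idea, not the speaker's implementation; the function and variable names are assumed, and spread adjustment (perturbed observations or a square-root correction) is omitted.

```python
import numpy as np

def assimilate_time_average(ensemble_trajs, y_obs, H, r_var):
    """Update the time mean of each trajectory with one scalar
    time-averaged observation; deviations from the mean are untouched.

    ensemble_trajs : (N, T, d) array, N trajectories of length T
    y_obs          : scalar time-averaged observation
    H              : (d,) linear operator mapping a time-mean state to the observation
    r_var          : observation-error variance
    """
    time_means = ensemble_trajs.mean(axis=1)               # (N, d)
    deviations = ensemble_trajs - time_means[:, None, :]   # (N, T, d)

    # Ensemble-estimated covariances between time-mean state and predicted obs.
    y_ens = time_means @ H                                  # (N,)
    x_anom = time_means - time_means.mean(axis=0)
    y_anom = y_ens - y_ens.mean()
    pxy = x_anom.T @ y_anom / (len(y_ens) - 1)              # (d,)
    pyy = y_anom @ y_anom / (len(y_ens) - 1)

    K = pxy / (pyy + r_var)                                 # Kalman gain, (d,)
    updated_means = time_means + np.outer(y_obs - y_ens, K)

    # Recompose trajectories: shifted time mean plus original deviations.
    return updated_means[:, None, :] + deviations
```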

Current Progress on Data Assimilation with the Local Ensemble Transform Kalman Filter
Eric Kostelich, School of Mathematical & Statistical Sciences, Arizona State University, Tempe, AZ 85287-1804

This talk will present a survey of recent work with a novel data assimilation framework, the Local Ensemble Transform Kalman Filter (LETKF). I will present a brief overview of the approach, then discuss some new research directions, which include modifications of the usual Kalman objective function to implement digital filtering directly in the data assimilation algorithm, to estimate bias parameters in the assimilation of satellite radiances, and to accommodate non-Gaussian data. Potential applications to climate models and carbon emission estimation will be described.
This is joint work with many people, including Istvan Szunyogh at Texas A&M University; Brian Hunt, Ed Ott and Eugenia Kalnay at U. Maryland, College Park; and Jose Arevequia of CPTEC, Brazil.
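
For readers unfamiliar with the method, the core ensemble-transform algebra of an LETKF analysis for a single local patch can be sketched as follows. This is only an illustrative reduction (diagonal observation errors, simple multiplicative inflation); the operational code adds observation localization, parallelization over grid points, and the extensions discussed above.

```python
import numpy as np
from scipy.linalg import sqrtm

def letkf_analysis(Xb, yo, Yb, r_inv_diag, rho=1.1):
    """LETKF analysis for one local patch.

    Xb         : (d, k) background ensemble (local state variables)
    yo         : (m,) local observations
    Yb         : (m, k) background ensemble mapped to observation space
    r_inv_diag : (m,) inverse observation-error variances (diagonal R)
    rho        : multiplicative covariance inflation factor
    """
    k = Xb.shape[1]
    xb_mean = Xb.mean(axis=1, keepdims=True)
    Xp = Xb - xb_mean                        # background perturbations
    yb_mean = Yb.mean(axis=1)
    Yp = Yb - yb_mean[:, None]

    # Analysis error covariance in the k-dimensional ensemble space.
    C = Yp.T * r_inv_diag                    # (k, m), equals Yp^T R^{-1}
    Pa_tilde = np.linalg.inv((k - 1) * np.eye(k) / rho + C @ Yp)

    # Weights for the mean update and square-root transform for perturbations.
    wa = Pa_tilde @ C @ (yo - yb_mean)       # (k,)
    Wa = np.real(sqrtm((k - 1) * Pa_tilde))  # (k, k)

    # Analysis ensemble: shifted mean plus transformed perturbations.
    return xb_mean + Xp @ (wa[:, None] + Wa)
```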

Uncertainty Prediction and Intelligent Sampling for Ocean Estimation
P.F.J. Lermusiaux and the MSEAS group
(Massachusetts Institute of Technology, Cambridge MA, USA)

A grand challenge in Ocean and Earth System Sciences is the ability to quantitatively predict the accuracy of predictions. Optimal prognostic approximations of the Fokker-Planck equations must capture the essence of the complex system while being computationally tractable. We derived new Dynamically Orthogonal (DO) equations that decompose the solution into mean and stochastic dynamical components. This leads to a coupled system of field equations consisting of a PDE for the mean field, a family of PDEs for the orthonormal basis that evolves the error subspace where the stochasticity 'lives', and a system of SDEs that defines how the stochasticity evolves in the time-varying error subspace. For this derivation, we impose nothing more than that the rate of change of the subspace be dynamically orthogonal to the subspace itself. Our work extends and generalizes the classic Proper Orthogonal Decomposition and the generalized Polynomial Chaos equations. Using these DO equations and the ideas of Error Subspace Statistical Estimation (ESSE), we provide adaptive schemes for learning the size of the error subspace. We discuss new DO numerical schemes and apply them to viscous Navier-Stokes flows as well as to idealized ocean-climate simulations, and we compare our results with Monte Carlo simulations.
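
In the notation commonly used for this type of decomposition, the stochastic field is written as a mean plus a finite sum of stochastic coefficients multiplying time-evolving orthonormal modes, with the modes constrained to change only orthogonally to the subspace they span. The display below is a schematic statement of the decomposition and the dynamical orthogonality condition only, not of the full field equations.

```latex
u(x,t;\omega) \;=\; \bar{u}(x,t) \;+\; \sum_{i=1}^{s} Y_i(t;\omega)\, u_i(x,t),
\qquad
\Big\langle \frac{\partial u_i}{\partial t}(\cdot,t),\; u_j(\cdot,t) \Big\rangle = 0
\quad \text{for all } i, j,
```

where \bar{u} is the mean field, the u_i form an orthonormal basis for the time-varying error subspace, and the Y_i are the stochastic coefficients governed by the system of SDEs.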
Another grand challenge is to develop algorithms for optimal sensing of the Ocean and Earth System using large numbers of smart vehicles. The more intelligent they become, i.e., the more knowledgeable about the predicted future dynamics and about the predicted effects of their sampling on field estimates, the greater their impact. We review our experience and future directions, focusing on ocean uncertainty reduction. The schemes we have developed include: adaptive sampling via ESSE with nonlinear predictions of error reductions; mixed-integer linear programming for path planning; nonlinear path planning using genetic algorithms; dynamic programming and onboard routing for path planning; level-set methods for ocean sampling swarms; adaptive sampling with DO assimilation and POMDPs; and command and control of autonomous surface craft over the Web, directly from ocean model instructions.

Nonlinear/Non-Gaussian Estimation and Stochastic Parametrization
Prof. Juan M. Restrepo
Group Leader, Uncertainty Quantification Group, Department of Mathematics, University of Arizona

In the first part of the presentation I will describe a particle-filter method that relies on a parameterization of the fluctuation field in order to obtain a cost reduction. The method is ad hoc (as is the EnKF, for example) but is capable of handling nonlinear, non-Gaussian estimation problems. Attempts at using dimension reduction, with the aim of increasing the filter's applicability to larger dynamic problems, led us to consider bred vectors. Two things resulted from this endeavor: we developed an alternative algorithm for the calculation of bred vectors, and we gained useful insight into their interpretation and their potential utility in sensitivity analysis and in the compact representation of posterior covariances. In the third part of my presentation I will show how we used a stochastic parameterization of a physical process and suggest that, while there is no general principle for the parameterization, the procedure can result in a dimension reduction along with a robust characterization of physical processes critical to the dynamics in question. To conclude, I will describe our current project in data assimilation, which combines stochastic parameterization and nonlinear/non-Gaussian estimation.
Joint work with P. Krause, J. Ramirez, J. C. McWilliams, M. Banner, A. Mazzucato, N. Balci, G. Sell
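
For reference, the classic breeding cycle that the alternative algorithm mentioned above would replace can be sketched in a few lines: a control and a perturbed trajectory are integrated side by side, and the grown difference is rescaled to a fixed amplitude at the end of each cycle. The model interface and norm below are placeholders, and nothing here reproduces the speaker's algorithm.

```python
import numpy as np

def breed_vector(x0, model_step, n_cycles, dt_cycle, amplitude, rng):
    """Classic breeding cycle.

    x0         : (d,) initial control state
    model_step : function advancing a state by dt_cycle
    amplitude  : norm to which the bred perturbation is rescaled
    """
    control = x0.copy()
    perturbation = rng.standard_normal(x0.shape)
    perturbation *= amplitude / np.linalg.norm(perturbation)

    for _ in range(n_cycles):
        control_next = model_step(control, dt_cycle)
        perturbed_next = model_step(control + perturbation, dt_cycle)

        # The grown difference, rescaled back to the prescribed amplitude,
        # becomes the bred perturbation for the next cycle.
        diff = perturbed_next - control_next
        perturbation = diff * (amplitude / np.linalg.norm(diff))
        control = control_next

    return perturbation  # bred vector after n_cycles breeding cycles
```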

Particle Filters for Lagrangian Data Assimilation
Elaine Spiller, Marquette University

Sequential Monte Carlo (SMC) techniques allow one to approximate posterior distributions of hidden states at observation instances without making assumptions of linearity on the dynamic model or of Gaussianity on the distribution of state variables. In the case of Lagrangian data assimilation, we are interested in making inference on a flow field by sequentially observing a tracer moving in that field.
Particle filters are a way to implement SMC using a large number of random samples, or particles, to obtain discrete approximations of these distributions. I will discuss the application of particle filters to Lagrangian data assimilation example problems, describing some difficulties and interesting modifications along the way.
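
One concrete way to set up the Lagrangian problem for a particle filter is to augment the state with the tracer position, so that the likelihood acts directly on the observed tracer location. The toy sketch below uses an assumed one-parameter rotational flow; none of the specifics come from the talk.

```python
import numpy as np

def advect_augmented_state(z, dt):
    """z = [a, x, y]: flow parameter a and tracer position (x, y).
    Toy flow: u = a*y, v = -a*x (a simple rotational velocity field)."""
    a, x, y = z
    return np.array([a, x + dt * a * y, y - dt * a * x])

def tracer_log_likelihood(z, obs_xy, obs_std):
    """Gaussian log-likelihood of an observed tracer position."""
    return -0.5 * np.sum(((z[1:] - obs_xy) / obs_std) ** 2)
```

Each particle carries both the unknown flow parameter and its own tracer trajectory; weighting and resampling then proceed exactly as in a standard particle filter.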

Reconstructing Paleoclimate Using a Data Assimilation Method - Challenges and Problems
Gerard van der Schrier, KNMI

Traditional data assimilation techniques require very detailed knowledge of the atmospheric state, down to the level of the smallest spatially resolved scales and at a temporal scale of days. In paleoclimatology, this precise knowledge of the past atmospheric state is absent; one typically has a low-spatial-resolution, statistical reconstruction of atmospheric circulation for a limited domain, based on proxy or documentary data from only a few locations. These reconstructions often have a temporal resolution of months or longer. This renders the traditional data-assimilation techniques useless in paleoclimatology, since the linearity assumption used in these techniques breaks down on these timescales. Here we apply a recently developed technique which can be used to overcome this problem. This technique determines small perturbations to the time evolution of the prognostic variables which optimally adjust the model atmosphere in the direction of a target pattern. These tendency perturbations are referred to as Forcing Singular Vectors (FSVs) and can be computed using an adjoint model. With FSVs, it is possible to make a simulation of global climate which reproduces, in a time-averaged sense, the statistical reconstructions of atmospheric circulation, while leaving the atmosphere free to respond in a dynamically consistent way to any changes in climatic conditions. Importantly, synoptic-scale variability internal to the atmospheric or climatic system is not suppressed and can adjust to the changes in the large-scale atmospheric circulation. This gives a simulation of paleoclimate which is close to the scarce observations. Two applications of FSV to paleoclimatology are discussed ("Little Ice Age" climate in Europe and rapid climate change at the end of the last glacial period). Both applications are in the framework of an intermediate-complexity coupled ocean-atmosphere GCM. Both the benefits and the problems associated with the application of FSV to paleoclimatology are discussed.
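
Schematically, and only as a simplified linear statement of the kind of optimization behind FSVs (the actual computation uses the full adjoint model and norms not specified here), the tendency perturbation can be thought of as solving:

```latex
\frac{d\mathbf{x}}{dt} \;=\; \mathbf{L}\,\mathbf{x} \;+\; \mathbf{f},
\qquad
\mathbf{f}^{\star} \;=\; \arg\max_{\|\mathbf{f}\| = 1}\;
\big\langle \mathbf{x}(T;\mathbf{f}),\, \mathbf{p} \big\rangle ,
```

where \mathbf{L} is the tangent-linear model, \mathbf{p} the target pattern, and the gradient of the inner product with respect to \mathbf{f} is obtained by integrating the adjoint model backward in time.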

Toward Practical Rare Event Simulation in High Dimensions
Jonathan Weare, Courant Institute, New York University

I will discuss an importance sampling method for certain rare event problems involving small noise diffusions. Standard Monte Carlo schemes for these problems behave exponentially poorly in the small noise limit. Previous work in rare event simulation has focused on developing, in specific situations, estimators with optimal exponential variance decay rates. I will introduce an estimator related to a deterministic control problem that not only has an optimal variance decay rate under certain conditions, but that can even have vanishingly small statistical relative error in the small noise limit.
The method can be seen as the limit of a well-known zero-variance importance sampling scheme for diffusions, which requires the solution of a second-order partial differential equation.
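
To make the flavor of such estimators concrete, here is a toy importance sampling estimator for a threshold probability of a one-dimensional small-noise diffusion, using a constant drift tilt as a deliberately crude control and reweighting with the Girsanov likelihood ratio. This is only an illustration of the change-of-measure mechanics; the estimator in the talk is built from a deterministic control problem, not from a constant tilt.

```python
import numpy as np

def rare_event_probability(n_samples, eps, T, dt, threshold, tilt, rng):
    """Estimate P(X_T > threshold) for dX = -X dt + sqrt(eps) dW, X_0 = 0,
    by simulating the tilted dynamics dX = (-X + tilt) dt + sqrt(eps) dW
    and reweighting each sample with the Girsanov likelihood ratio."""
    n_steps = int(T / dt)
    u = tilt / np.sqrt(eps)          # control added to the Brownian motion
    estimates = np.empty(n_samples)
    for i in range(n_samples):
        x, log_lr = 0.0, 0.0
        for _ in range(n_steps):
            dw = np.sqrt(dt) * rng.standard_normal()
            log_lr += -u * dw - 0.5 * u * u * dt   # accumulate log dP/dQ
            x += (-x + tilt) * dt + np.sqrt(eps) * dw
        estimates[i] = np.exp(log_lr) * (x > threshold)
    return estimates.mean()

# Example usage (assumed parameter values):
# rng = np.random.default_rng(0)
# p_hat = rare_event_probability(10_000, eps=0.05, T=1.0, dt=0.01,
#                                threshold=0.5, tilt=0.6, rng=rng)
```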

Assimilating Coherent Features
Jeffrey Weiss, University of Colorado

Coherent features in the climate system include spatial structures such as jets, fronts, and vortices. Dramatic failures of weather forecasts are often due to errors in the location and magnitude of these features. We propose a general, modular technique to assimilate coherent features. The modularity of the technique allows it to be used with different classes of coherent features and with a variety of assimilation techniques. As a first step, we implement a simplified jet assimilation algorithm in an idealized weather forecast system, and obtain a significant reduction in forecast error. Implementation of this technique in more realistic models and applicability to climate prediction are discussed.

Constraining Simulated Atmospheric States by Sparse Empirical Information
Martin Widmann
School of Geography, Earth and Environmental Sciences, University of Birmingham, UK
in collaboration with Hugues Goosse, Institut d'Astronomie et de Géophysique Georges Lemaître, Université catholique de Louvain

Empirical information on the state of the atmosphere before the mid-19th century is sparse and usually represents seasonal or annual means, and thus data assimilation methods used for the 20th century are not directly applicable. This presentation introduces three assimilation methods developed in recent years to constrain the states of atmosphere models with the empirical information typically available for the last few hundred years, and discusses their common aspects as well as their differences. These methods are Ensemble Member Selection, Forcing Singular Vectors, and Pattern Nudging.
The first method uses ensembles of simulations with an Earth System Model of Intermediate Complexity (EMIC) and selects the members that are closest to the empirical information with respect to some cost function. The Forcing Singular Vectors approach has also been implemented using an EMIC and is based on an adjoint model to define small additional forcings that, after a given time, bring the simulated state close to a target state. It will be covered in detail in the talk by Gerard van der Schrier but is briefly discussed here to complete the overview.
Pattern Nudging uses simple nudging terms to control the circulation in General Circulation Models (GCMs) such that the amplitudes of large-scale anomalies are close to prescribed values without suppressing synoptic-scale variability. In applications these large-scale anomalies may be obtained from proxy-based reconstructions or represent idealised situations for process studies. In comparison with the other methods, Pattern Nudging needs fewer computing resources, as it does not require ensemble simulations or adjoint models, which makes it particularly suitable for GCMs. Based on simulations with the atmosphere GCM ECHAM4, it will be shown that Pattern Nudging generally performs well when the aim is to control the state of the Northern Annular Mode, but that the performance varies seasonally. It will also be shown that the synoptic-scale variability responds realistically to the large-scale forcing.
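
A schematic of the nudging term itself: an extra tendency that relaxes the projection of the model state onto a prescribed large-scale pattern (for instance, the Northern Annular Mode) toward a target amplitude, while leaving components orthogonal to the pattern untouched. This is a toy sketch with assumed names, not the ECHAM4 implementation.

```python
import numpy as np

def pattern_nudging_tendency(state, pattern, target_amplitude, tau):
    """Extra tendency nudging the amplitude of `pattern` toward a target.

    state            : (d,) model state anomalies
    pattern          : (d,) large-scale pattern, normalised so pattern @ pattern = 1
    target_amplitude : prescribed (e.g. reconstructed) pattern amplitude
    tau              : relaxation time scale; larger tau means weaker nudging
    """
    amplitude = state @ pattern                      # current projection
    return (target_amplitude - amplitude) / tau * pattern
```

Because the term acts only on the projection onto the large-scale pattern, synoptic-scale variability orthogonal to it remains free, which is the property emphasised above.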

An Oceanographic Perspective on Climate State Estimation
Carl Wunsch, Department of Earth, Atmospheric and Planetary Sciences
Massachusetts Institute of Technology

Fifteen years of experience in estimating the oceanic state leads to some conclusions concerning existing and potential climate-state estimates and, hypothetically, their prediction. Among the major conclusions are: (1) The widespread use of "data assimilation" methods developed for weather forecasting is inappropriate for the climate problem. In the meteorological reanalyses, for example, one sees large-scale inconsistencies based on the same data sets, jumps in the state when new data types are introduced, and gross violation of basic constraints such as energy and water conservation. More generally, such estimates cannot be used for climatologically oriented budget calculations, regionally or globally. Practical methods do exist for appropriate estimates, but they are computationally more demanding than the meteorological methodologies: the price has to be paid. (2) Useful climate state estimates necessitate systems that couple ocean, atmosphere, sea ice, glacial ice, and groundwater models, as well as the global-scale observations pertaining to all of them. Problems in doing so range from computational loads, to numerical issues arising from the huge range of time scales embedded in these systems, to the need to understand the observational errors in all of the data types. Carrying out such calculations is beyond the resources of academic groups and requires a different infrastructure. (3) The ability to provide uncertainty estimates for any long computation with large numerical models remains very primitive, is a major obstacle to the credibility of climate change discussions, and gravely hampers discussions of risk factors. Although nice theoretical ideas exist (ensembles, Fokker-Planck calculations, inverse Hessians, ...), no practical, useful system has yet emerged. Ensembles, which are the most widely used approach, suffer from trivial dimensionality and a failure to span the uncertainty space in the various parameters. The others have computational costs which remain forbidding.