A Local Least Squares Framework for Ensemble Filtering

Anderson, Jeffrey L.
April 2003
Monthly Weather Review, Apr 2003, Vol. 131, Issue 4, p. 634
Many methods using ensemble integrations of prediction models as integral parts of data assimilation have appeared in the atmospheric and oceanic literature. In general, these methods have been derived from the Kalman filter and have been known as ensemble Kalman filters. A more general class of methods including these ensemble Kalman filter methods is derived starting from the nonlinear filtering problem. When working in a joint state–observation space, many features of ensemble filtering algorithms are easier to derive and compare. The ensemble filter methods derived here make a (local) least squares assumption about the relation between prior distributions of an observation variable and model state variables. In this context, the update procedure applied when a new observation becomes available can be described in two parts. First, an update increment is computed for each prior ensemble estimate of the observation variable by applying a scalar ensemble filter. Second, a linear regression of the prior ensemble sample of each state variable on the observation variable is performed to compute update increments for each state variable ensemble member from corresponding observation variable increments. The regression can be applied globally or locally using Gaussian kernel methods. Several previously documented ensemble Kalman filter methods, the perturbed observation ensemble Kalman filter and ensemble adjustment Kalman filter, are developed in this context. Some new ensemble filters that extend beyond the Kalman filter context are also discussed. The two-part method can provide a computationally efficient implementation of ensemble filters and allows more straightforward comparison of methods since they differ only in the solution of a scalar filtering problem.
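The two-part update described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: step 1 uses a deterministic ensemble adjustment update for the scalar observation-space ensemble (shift the ensemble mean to the posterior mean and rescale its spread to the posterior variance, assuming Gaussian statistics), and step 2 maps the resulting observation increments to a state variable by global linear regression. Function names are illustrative.

```python
import numpy as np


def eakf_scalar_update(y_prior, y_obs, obs_err_var):
    """Step 1: scalar ensemble adjustment update.

    Returns increments for the prior observation-space ensemble such that
    the updated ensemble has exactly the Gaussian posterior sample mean
    and variance.
    """
    ybar = y_prior.mean()
    s2 = y_prior.var(ddof=1)                        # prior sample variance
    p = 1.0 / (1.0 / s2 + 1.0 / obs_err_var)        # posterior variance
    m = p * (ybar / s2 + y_obs / obs_err_var)       # posterior mean
    # Shift to the posterior mean and contract the spread so the sample
    # variance matches the posterior variance.
    y_post = m + np.sqrt(p / s2) * (y_prior - ybar)
    return y_post - y_prior


def regress_increments(x_prior, y_prior, dy):
    """Step 2: regress the prior state ensemble on the observation-space
    ensemble, then convert observation increments to state increments."""
    b = np.cov(x_prior, y_prior, ddof=1)[0, 1] / y_prior.var(ddof=1)
    return b * dy
```

Each state variable reuses the same observation increments `dy`, so the scalar filtering problem is solved once per observation; the sketch uses a single global regression coefficient, whereas the paper also allows local regression via Gaussian kernel methods.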


Related Articles

  • A Preliminary Study on Thunderstorm Forecast with LS-SVM Method. Wang Zhen-hui; Zhang Yi; Zhu Jia // Journal of Tropical Meteorology, Mar 2013, Vol. 19, Issue 1, p. 104 

    The LS-SVM (Least squares support vector machine) method is presented to set up a model to forecast the occurrence of thunderstorms in the Nanjing area by combining NCEP FNL Operational Global Analysis data on 1.0°×1.0° grids and cloud-to-ground lightning data observed with a lightning...

  • Least squares support vector machine for short-term prediction of meteorological time series. Mellit, A.; Pavan, A.; Benghanem, M. // Theoretical & Applied Climatology, Jan 2013, Vol. 111, Issue 1/2, p. 297 

    The prediction of meteorological time series plays a very important role in several fields. In this paper, an application of least squares support vector machine (LS-SVM) for short-term prediction of meteorological time series (e.g. solar irradiation, air temperature, relative humidity, wind...

  • Prediction of daily crop reference evapotranspiration (ETo) values through a least-squares support vector machine model. Xianghong Guo; Xihuan Sun; Juanjuan Ma // Hydrology Research, 2011, Vol. 42, Issue 4, p. 268 

    Real-time prediction of daily reference crop evapotranspiration (ETo) is the basis for estimating crop evapotranspiration and for computing crop irrigation requirements. In recent years, least-squares support vector machines (LS-SVMs) have been applied for forecasting in many fields of...

  • Robust trend estimation of observed German precipitation. Trömel, S.; Schönwiese, C. D. // Theoretical & Applied Climatology, 2008, Vol. 93, Issue 1/2, p. 107 

    Trends in climate time series are habitually estimated on the basis of the least-squares method. This estimator is optimal if the residuals follow the Gaussian distribution. Unfortunately, only a small number of observed climate time series fulfil this assumption. This work introduces a robust...

  • Application and Comparison of Robust Linear Regression Methods for Trend Estimation. Muhlbauer, Andreas; Spichtinger, Peter; Lohmann, Ulrike // Journal of Applied Meteorology & Climatology, Sep 2009, Vol. 48, Issue 9, p. 1961 

    In this study, robust parametric regression methods are applied to temperature and precipitation time series in Switzerland and the trend results are compared with trends from classical least squares (LS) regression and nonparametric approaches. It is found that in individual time series...

  • Estimation of the impact of short-term fluctuations in inputs on temporally aggregated outputs of process-oriented models. Å. Forsman; C. Andersson; A. Grimvall; M. Hoffmann // Journal of Hydroinformatics, Jul 2003, Vol. 5, Issue 3, p. 169 

    Process-oriented models driven by highly resolved meteorological inputs and comprising a short internal time step are sometimes used to predict substance fluxes in air, soil and water over fairly long periods of time. To ascertain whether regression-based input–output analyses in such...

  • On Fitting a Straight Line to Data when the 'Noise' in Both Variables Is Unknown. Clarke, Allan J.; Van Gorder, Stephen // Journal of Atmospheric & Oceanic Technology, Jan 2013, Vol. 30, Issue 1, p. 151 

    In meteorology and oceanography, and other fields, it is often necessary to fit a straight line to some points and estimate its slope. If both variables corresponding to the points are noisy, the slope as estimated by the ordinary least squares regression coefficient is biased low; that is, for...

  • Reliability of Regression-Corrected Climate Forecasts. Tippett, Michael K.; DelSole, Timothy; Barnston, Anthony G. // Journal of Climate, May 2014, Vol. 27, Issue 9, p. 3393 

    Regression is often used to calibrate climate model forecasts with observations. Reliability is an aspect of forecast quality that refers to the degree of correspondence between forecast probabilities and observed frequencies of occurrence. While regression-corrected climate forecasts are...

  • Conditional bias-penalized kriging (CBPK). Seo, Dong-Jun // Stochastic Environmental Research & Risk Assessment, Jan 2013, Vol. 27, Issue 1, p. 43 

    Simple and ordinary kriging, or SK and OK, respectively, represent the best linear unbiased estimator in the unconditional sense in that they minimize the unconditional (on the unknown truth) error variance and are unbiased in the unconditional mean. However, because the above properties hold...

