Data Assimilation System

The varied nature and origin of the data make a control procedure essential in order to guarantee the quality of their informative content. Data coming from the observing network can indeed be affected by uncertainties of two kinds: intrinsic and gross. The former are attributable to the measurement instruments or to representativeness errors, i.e. to the discrete nature of the observing network, which is unable to resolve scales smaller than the average separation between two observation stations. By gross uncertainties we instead mean those resulting from poor calibration, erroneous recording or coding of the observations, and errors introduced in telecommunications. It is important to underline that these uncertainties can turn out to be correlated, both in space and time, with each other or with the synoptic situation, and that systematic uncertainties can add to them.
Algorithms implemented in quality control are therefore essential for the rejection or modification of contaminated data. Controls are performed cyclically and consist of a first check of the correct coding of the information, including the station location; next, the observed value is verified to be physically reasonable and, as a last control, temporal and spatial consistency among observations close to each other is required. The collected observations, even when properly selected, still provide only a spatial and temporal sampling of the meteorological conditions under examination and, being irregularly distributed in space, they must subsequently be analysed in order to produce, at a fixed instant of time, a regular spatial representation of the state of the atmosphere, which is needed for the integration of the numerical model.
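The three checks just described can be sketched in code. The following is an illustrative sketch only, not the operational implementation: the record layout, the plausibility bounds, and the neighbour-consistency threshold are all assumptions.

```python
# Illustrative sketch of the cyclic quality-control checks described above.
# Record fields, plausibility bounds and thresholds are hypothetical.

def coding_check(obs):
    """First check: correct coding of the report, including station location."""
    return (-90.0 <= obs["lat"] <= 90.0 and -180.0 <= obs["lon"] <= 180.0
            and obs["value"] is not None)

def plausibility_check(obs, bounds={"t2m": (-90.0, 60.0)}):
    """Second check: value must be physically reasonable (here 2 m temperature, degC)."""
    lo, hi = bounds[obs["var"]]
    return lo <= obs["value"] <= hi

def consistency_check(obs, neighbours, max_diff=15.0):
    """Last check: spatial consistency with nearby stations (threshold illustrative)."""
    if not neighbours:
        return True
    mean = sum(n["value"] for n in neighbours) / len(neighbours)
    return abs(obs["value"] - mean) <= max_diff

def quality_control(obs, neighbours):
    """Accept the observation only if it passes all three checks."""
    return (coding_check(obs) and plausibility_check(obs)
            and consistency_check(obs, neighbours))
```

In practice such checks run in a cycle so that an observation rejected at one stage is excluded from the neighbour statistics used by the others.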
This process, known as data assimilation, elaborates the observed data using interpolation and data-conditioning methods, and then forces the model state towards conditions as similar as possible to those described by the observations at the initial instant of the simulation.
In order to obtain the analysis, i.e. the most likely representation of the state of the atmosphere, data assimilation methods combine two kinds of information. The first derives from the observations and is a discontinuous, patchy field. The second is obtained through a short-range forecast of the model, known as the first guess or background, and is a continuous field, which allows the observations to be used to best effect and the information to be extrapolated coherently. By objective analysis, or spatial analysis, we mean the process of passing from the irregular and heterogeneous distribution of the data provided by the observation network to a three-dimensional regular grid, estimated through mathematical algorithms that maximize the informative content of the data. An accurate definition of the initial conditions is essential for the success of the assimilation/numerical prediction system, and the forecast uncertainty due to the initial conditions is predominant with respect to the other sources of uncertainty.
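The combination of background and observations can be made concrete with the standard best linear unbiased estimate, x_a = x_b + K(y - H x_b), with gain K = B H^T (H B H^T + R)^(-1). The following minimal sketch, with an invented 3-point grid and illustrative error covariances, shows how a single observation pulls the background towards the measured value:

```python
import numpy as np

# Minimal sketch of how an analysis combines background and observations,
# using the best linear unbiased estimate (BLUE):
#   x_a = x_b + K (y - H x_b),   K = B H^T (H B H^T + R)^-1
# The 3-point "grid" and the error covariances below are illustrative.

x_b = np.array([10.0, 12.0, 14.0])         # background (first guess) on the grid
y   = np.array([11.0])                     # one observation
H   = np.array([[0.0, 1.0, 0.0]])          # observation operator: obs at grid point 2
B   = 2.0 * np.exp(-np.abs(np.subtract.outer(range(3), range(3))))  # background error cov.
R   = np.array([[0.5]])                    # observation error covariance

K   = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain: weights obs vs. background
x_a = x_b + K @ (y - H @ x_b)                    # analysis

print(x_a)   # observed point moves from 12.0 towards 11.0; neighbours follow via B
```

Because B carries spatial correlations, the observation also corrects the neighbouring grid points, which is how a patchy observed field is spread coherently onto the regular grid.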
The Italian Air Force Meteorological Service has more than a decade of leading experience in this field, which allows it to maintain a major role even at the global level. An Optimal Interpolation (OI) method was implemented initially and later replaced by a variational 3DVAR scheme (Bonavita and Torrisi, 2005). More recently a stochastic Kalman filter (Ensemble Kalman Filter, EnKF) has been implemented (Bonavita et al., 2008, 2010). By EnKF we mean a class of algorithms that in recent years have received growing attention as possible candidates to replace the current generation of variational data assimilation systems in the meteorological and oceanographic fields. The particular version of the EnKF used at COMET, whose characteristics will be detailed in paragraph 6, is known as the Local Ensemble Transform Kalman Filter (LETKF) and has been operational since 1 June 2011. Before that date a similar algorithm was operationally used only by the Canadian Meteorological Centre (CMC) to initialise its global-scale probabilistic forecast system (Meng and Zhang, 2011). COMET is therefore the first operational centre on the world stage to use an ensemble analysis scheme to initialise a model on the regional scale.
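To illustrate the idea behind the EnKF class of algorithms, the following sketch implements the simplest stochastic (perturbed-observation) variant, in which the background error covariance is estimated from an ensemble of forecasts rather than prescribed. It is not the LETKF used at COMET; the toy state, ensemble size, and covariances are all assumptions.

```python
import numpy as np

# Illustrative stochastic EnKF update with perturbed observations.
# Not the LETKF variant described in the text; dimensions are made up.

rng = np.random.default_rng(0)
n, m, N = 3, 1, 20                       # state dim, obs dim, ensemble size

H = np.array([[0.0, 1.0, 0.0]])          # observation operator
R = np.array([[0.5]])                    # observation error covariance
y = np.array([11.0])                     # observation

# Background ensemble: perturbed draws around a first-guess state.
x_b = np.array([10.0, 12.0, 14.0])
E = x_b[:, None] + rng.normal(0.0, 1.0, size=(n, N))

# Background error covariance estimated from the ensemble perturbations.
X = E - E.mean(axis=1, keepdims=True)
B = X @ X.T / (N - 1)

# Kalman gain built from the ensemble-estimated covariance.
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)

# Each member assimilates its own perturbed copy of the observation.
Y = y[:, None] + rng.normal(0.0, np.sqrt(R[0, 0]), size=(m, N))
E_a = E + K @ (Y - H @ E)

print(E_a.mean(axis=1))                  # analysis ensemble mean
```

The analysis ensemble mean is drawn towards the observation, and the ensemble spread at the observed point shrinks, reflecting the reduced uncertainty after assimilation; deterministic variants such as the LETKF achieve the same effect without perturbing the observations.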


References:
- Bonavita M, Torrisi L. 2005. Meteorol. Atmos. Phys. 88: 39-52.

- Bonavita M, Torrisi L, Marcucci F. 2008. Q. J. R. Meteorol. Soc. 134.

- Bonavita M, Torrisi L, Marcucci F. 2010. Ensemble data assimilation with the CNMCA regional forecasting system. Q. J. R. Meteorol. Soc. 136: 132-145.

- Meng Z, Zhang F. 2011. Limited-area ensemble-based data assimilation. Mon. Wea. Rev. 139: 2025-2045.