The varied nature and origin of the data make a quality-control procedure essential in order to guarantee the reliability of their informative content. Data coming from the observational network can indeed be affected by uncertainties of two kinds: intrinsic errors and gross errors. The former are attributable to the measurement instruments or to representativeness errors, i.e. to the discrete nature of the observational network, which cannot resolve scales smaller than the average separation between two observing stations. By gross errors we instead mean those resulting from poor calibration, erroneous recording or coding of an observation, and errors introduced during telecommunication. It is important to underline that these errors may be correlated, in both space and time, with one another or with the synoptic situation, and that systematic errors may add to them.
The algorithms implemented in quality control are therefore essential for rejecting or correcting contaminated data. The controls are performed cyclically and consist of, first, a check that the information is correctly coded, including the station location; next, a verification that the observed value is physically plausible; and, as a last control, a check of temporal and spatial consistency among neighbouring observations. The collected observations, even when properly selected, still provide only a spatial and temporal sampling of the meteorological conditions under examination and, being irregularly distributed in space, must subsequently be analysed in order to produce, at a fixed instant of time, a regular spatial representation of the state of the atmosphere, which is needed for the numerical model integration.
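The three control steps above can be sketched as follows. This is a minimal illustration, not the operational procedure: the thresholds, field names and the simple neighbour ("buddy") comparison are all hypothetical.

```python
import numpy as np

def qc_coding(lat, lon):
    """Step 1: reject reports with an impossible station location."""
    return -90.0 <= lat <= 90.0 and -180.0 <= lon <= 180.0

def qc_range(temp_c, lo=-90.0, hi=60.0):
    """Step 2: reject values outside a physically plausible range
    (bounds here are illustrative, for a 2 m temperature in Celsius)."""
    return lo <= temp_c <= hi

def qc_buddy(value, neighbours, tol=10.0):
    """Step 3: spatial-consistency check against nearby stations;
    tol is a hypothetical tolerance in Celsius."""
    return abs(value - np.mean(neighbours)) <= tol

def quality_control(obs):
    """Apply the three checks in sequence, keeping only clean reports."""
    return [o for o in obs
            if qc_coding(o["lat"], o["lon"])
            and qc_range(o["t"])
            and qc_buddy(o["t"], o["buddies"])]

obs = [
    {"lat": 45.4, "lon": 9.3,  "t": 18.2,  "buddies": [17.9, 18.6]},  # clean
    {"lat": 95.0, "lon": 9.3,  "t": 18.0,  "buddies": [18.1]},        # bad coding
    {"lat": 41.9, "lon": 12.5, "t": 250.0, "buddies": [19.0]},        # bad range
    {"lat": 40.8, "lon": 14.2, "t": 35.0,  "buddies": [19.5, 20.1]},  # fails buddy check
]
clean = quality_control(obs)
print(len(clean))  # → 1: only the first report survives all three checks
```

In an operational system each step also uses station metadata and the expected synoptic situation, but the cyclic pass/reject structure is the same.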
This process, known as data assimilation, processes the observed data through interpolation and data-conditioning methods and then forces the model state towards conditions as close as possible to those described by the observations at the initial instant of the simulation.
In order to obtain the analysis, i.e. the most likely representation of the state of the atmosphere, data assimilation methods combine two kinds of information. The first derives from the observations and is a discontinuous, patchy field. The second is obtained from a short-range model forecast, known as the first guess or background, and is a continuous field that allows the observations to be exploited at best and the information to be extrapolated coherently. By objective analysis, or spatial analysis, we mean the process that passes from the irregular, heterogeneous distribution of the data provided by the observation network to a three-dimensional regular grid, estimated through mathematical algorithms that maximise the informative content of the data. An accurate definition of the initial conditions is essential for the success of the assimilation/numerical-prediction system, and in the forecast phase the uncertainty due to the initial conditions is predominant with respect to the other sources of uncertainty.
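The blending of the two information sources can be illustrated at a single grid point. This is the standard best-linear-unbiased (optimal-interpolation) estimate, not the full three-dimensional scheme; the error variances used are purely illustrative.

```python
# Combine a background value (first guess, from a short-range forecast)
# with a co-located observation, weighting each by the inverse of its
# error variance.  All numbers here are illustrative.

def analyse(x_b, y, var_b, var_o):
    """Return the analysis x_a = x_b + w (y - x_b), where the weight w
    grows with the background error variance var_b relative to the
    observation error variance var_o."""
    w = var_b / (var_b + var_o)   # gain: how much to trust the observation
    return x_b + w * (y - x_b)

x_b = 15.0   # background temperature at the grid point
y = 17.0     # observed temperature
x_a = analyse(x_b, y, var_b=1.0, var_o=1.0)
print(x_a)   # → 16.0: with equal variances the analysis sits halfway
```

The same formula, written with matrices for the full state vector and an observation operator, underlies both OI and the variational schemes mentioned below.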
The Italian Air Force Meteorological Service has more than a decade of leading experience in this field, which has allowed it to maintain a major role even at the global level. Initially an OI method was implemented, later replaced by a variational 3DVAR scheme (Bonavita, 2005). Recently a stochastic Kalman filter (Ensemble Kalman Filter, EnKF) has been implemented (Bonavita, 2008, 2010). By EnKF we mean a class of algorithms that in recent years have received growing attention as possible candidates to replace the current generation of variational data assimilation systems in meteorology and oceanography. The particular version of the EnKF used at COMET, whose characteristics are detailed in paragraph 6, is known as the Local Ensemble Transform Kalman Filter (LETKF) and has been operational since 1 June 2011. Before that date, a similar algorithm was used operationally only by the Canadian Meteorological Centre (CMC) to initialize their global-scale probabilistic forecast system (Meng and Zhang, 2011). COMET is therefore the first operational centre worldwide to use an ensemble analysis scheme to initialize a regional-scale model.
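The idea behind the stochastic EnKF can be shown in a toy setting: the background error statistics are estimated from an ensemble of forecasts rather than prescribed, and each member is updated with a randomly perturbed observation. This is only a sketch of the perturbed-observations variant for a single directly observed scalar; the operational LETKF is a deterministic, localized formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
Ne = 500                                  # ensemble size (illustrative)
ens = rng.normal(15.0, 1.0, Ne)           # background ensemble: mean 15, var ~1
y, var_o = 17.0, 1.0                      # observation and its error variance

var_b = ens.var(ddof=1)                   # background variance from the ensemble
K = var_b / (var_b + var_o)               # Kalman gain, as in the scalar analysis
y_pert = y + rng.normal(0.0, np.sqrt(var_o), Ne)  # one perturbed obs per member
ens_a = ens + K * (y_pert - ens)          # update every ensemble member

print(ens_a.mean())                       # analysis mean pulled towards the obs
print(ens_a.var(ddof=1) < var_b)          # analysis spread is reduced
```

Flow-dependent background statistics are exactly what the ensemble provides over a fixed, climatological error covariance, which is the main appeal of this class of methods over 3DVAR.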