Dense CTD survey versus glider fleet sampling: comparison of the performance for regional ocean prediction west of Sardinia


Ocean Science ◽  
2018 ◽  
Vol 14 (5) ◽  
pp. 1069-1084 ◽  
Author(s):  
Jaime Hernandez-Lasheras ◽  
Baptiste Mourre

Abstract. The REP14-MED sea trial carried out off the west coast of Sardinia in June 2014 provided a rich set of observations from both ship-based conductivity–temperature–depth (CTD) probes and a fleet of underwater gliders. We present the results of several simulations assimilating data either from CTDs or from different subsets of glider data, including up to eight vehicles, in addition to satellite sea level anomalies, surface temperature and Argo profiles. The Western Mediterranean OPerational forecasting system (WMOP) regional ocean model is used with a local multi-model ensemble optimal interpolation scheme to recursively ingest both lower-resolution large-scale and dense local observations over the whole sea trial duration. Results show the capacity of the system to ingest both types of data, leading to improvements in the representation of all assimilated variables. These improvements persist during the 3-day periods separating two analyses. At the same time, the system presents some limitations in properly representing the smaller-scale structures, which are smoothed out by the model error covariances provided by the ensemble. An evaluation of the forecasts using independent measurements from shipborne CTDs and a towed ScanFish deployed at the end of the sea trial shows that the simulations assimilating initial CTD data reduce the error by 39 % on average with respect to the simulation without data assimilation. In the glider-data-assimilative experiments, the forecast error is reduced as the number of vehicles increases. The simulation assimilating CTDs outperforms the simulations assimilating data from one to four gliders. A fleet of eight gliders provides similar performance to the 10 km spaced CTD initialization survey in these experiments, with an overall 40 % model error reduction capacity with respect to the simulation without data assimilation when comparing against independent campaign observations.


2019 ◽  
Vol 11 (7) ◽  
pp. 858 ◽  
Author(s):  
Redouane Lguensat ◽  
Phi Huynh Viet ◽  
Miao Sun ◽  
Ge Chen ◽  
Tian Fenglin ◽  
...  

Building on recent developments in data-driven methods for exploiting large-scale observation, simulation and reanalysis datasets to solve inverse problems, this study addresses the reconstruction of higher-resolution Sea Level Anomaly (SLA) fields using analog strategies. This reconstruction is stated as an analog data assimilation issue, where the analog models rely on patch-based and Empirical Orthogonal Functions (EOF)-based representations to circumvent the curse of dimensionality. We implement an Observing System Simulation Experiment (OSSE) in the South China Sea. The reported results show the relevance of the proposed framework, with a significant gain in terms of Root Mean Square Error (RMSE) for scales below 100 km. We further discuss the usefulness of the proposed analog model as a means to exploit high-resolution model simulations for the processing and analysis of current and future satellite-derived altimetric data, compared with conventional interpolation schemes, especially optimal interpolation.
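The patch-based EOF representation mentioned in this abstract can be illustrated with a minimal sketch (synthetic data and made-up sizes, not the authors' code): EOFs of a set of flattened SLA patches are the right singular vectors of the anomaly matrix, and projecting a patch onto a few leading modes yields the low-dimensional coordinates that make analog search tractable.

```python
import numpy as np

# Illustrative sketch: EOF compression of small SLA patches.
# All numbers are synthetic; this only shows the linear-algebra pattern.

rng = np.random.default_rng(0)
patches = rng.standard_normal((500, 64))   # 500 synthetic 8x8 patches, flattened

mean = patches.mean(axis=0)
anoms = patches - mean

# EOFs are the right singular vectors of the anomaly matrix
_, _, vt = np.linalg.svd(anoms, full_matrices=False)
n_modes = 10
eofs = vt[:n_modes]                 # (10, 64) leading spatial modes

coeffs = anoms @ eofs.T             # low-dimensional coordinates, (500, 10)
reconstructed = coeffs @ eofs + mean

# Relative reconstruction error shrinks as n_modes grows
err = np.linalg.norm(reconstructed - patches) / np.linalg.norm(patches)
```

In an analog scheme, nearest-neighbour search would then be run in the 10-dimensional coefficient space rather than on the full 64-pixel patches.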


2019 ◽  
Vol 147 (2) ◽  
pp. 627-643 ◽  
Author(s):  
Matthew J. Carrier ◽  
John J. Osborne ◽  
Hans E. Ngodock ◽  
Scott R. Smith ◽  
Innocent Souopgui ◽  
...  

Abstract Most ocean data assimilation systems are tuned to process and assimilate observations to constrain features on the order of the mesoscale and larger. Typically this involves removing observations or computing averaged observations. This procedure, while necessary, eliminates many observations from the analysis step and can reduce the overall effectiveness of a particular observing platform. Simply including these observations is not an option, as doing so can produce an overdetermined, ill-conditioned problem that is more difficult to solve. The approach presented here aims to avoid such issues while at the same time increasing the number of observations within the assimilation. A two-step assimilation procedure with the four-dimensional variational data assimilation (4DVAR) system is adopted. The first step attempts to constrain the large-scale features by assimilating a set of super observations with appropriate background error correlation scales and error variances. The second step then attempts to correct smaller-scale features by assimilating the full observation set with shorter background error correlation scales and appropriate error variances; here the background state is taken as the analysis from the first step. Results using a real high-density observation set from underwater gliders in the region southeast of Iceland, collected during the 2017 Nordic Recognized Environmental Picture (NREP) experiment, are shown using the Navy Coastal Ocean Model 4DVAR (NCOM-4DVAR).


2005 ◽  
Vol 133 (8) ◽  
pp. 2310-2334 ◽  
Author(s):  
Anna Borovikov ◽  
Michele M. Rienecker ◽  
Christian L. Keppenne ◽  
Gregory C. Johnson

Abstract One of the most difficult aspects of ocean-state estimation is the prescription of the model forecast error covariances. The paucity of ocean observations limits our ability to estimate the covariance structures from model–observation differences. In most practical applications, simple covariances are usually prescribed. Rarely are cross covariances between different model variables used. Here a comparison is made between a univariate optimal interpolation (UOI) scheme and a multivariate OI algorithm (MvOI) in the assimilation of ocean temperature profiles. In the UOI case only temperature is updated using a Gaussian covariance function. In the MvOI, salinity, zonal, and meridional velocities as well as temperature are updated using an empirically estimated multivariate covariance matrix. Earlier studies have shown that a univariate OI has a detrimental effect on the salinity and velocity fields of the model. Apparently, in a sequential framework it is important to analyze temperature and salinity together. For the MvOI an estimate of the forecast error statistics is made by Monte Carlo techniques from an ensemble of model forecasts. An important advantage of using an ensemble of ocean states is that it provides a natural way to estimate cross covariances between the fields of different physical variables constituting the model-state vector, at the same time incorporating the model’s dynamical and thermodynamical constraints as well as the effects of physical boundaries. Only temperature observations from the Tropical Atmosphere–Ocean array have been assimilated in this study. To investigate the efficacy of the multivariate scheme, two data assimilation experiments are validated with a large independent set of recently published subsurface observations of salinity, zonal velocity, and temperature. For reference, a control run with no data assimilation is used to check how the data assimilation affects systematic model errors. 
While the performance of the UOI and MvOI is similar with respect to the temperature field, the salinity and velocity fields are greatly improved when the multivariate correction is used, as is evident from the analyses of the rms differences between these fields and independent observations. The MvOI assimilation is found to improve upon the control run in generating water masses with properties close to those observed, while the UOI fails to maintain the temperature and salinity structure.
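The multivariate mechanism this abstract describes can be shown in miniature (synthetic numbers, not the MvOI implementation): when the background-error covariance is estimated from an ensemble, its temperature-salinity cross term lets a temperature observation update salinity as well.

```python
import numpy as np

# Sketch: a two-component state [T, S] with ensemble-estimated covariance.
# The 0.8 coupling below is an invented stand-in for water-mass-driven
# T-S covariability.

rng = np.random.default_rng(1)
n_ens = 200

t_pert = rng.standard_normal(n_ens)
s_pert = 0.8 * t_pert + 0.2 * rng.standard_normal(n_ens)

X = np.vstack([t_pert, s_pert])                  # state per ensemble member
xb = X.mean(axis=1)                              # background = ensemble mean
A = X - xb[:, None]
B = A @ A.T / (n_ens - 1)                        # covariance incl. T-S cross term

H = np.array([[1.0, 0.0]])                       # observe temperature only
R = np.array([[0.05]])
y = np.array([1.5])                              # one temperature observation

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)     # multivariate gain
xa = xb + K @ (y - H @ xb)
```

A univariate scheme would zero the off-diagonal of B, so the salinity component would not move; here it is nudged in proportion to the ensemble T-S covariance.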


2011 ◽  
Vol 21 (12) ◽  
pp. 3619-3626 ◽  
Author(s):  
ALBERTO CARRASSI ◽  
STÉPHANE VANNITSEM

In this paper, a method to account for model error due to unresolved scales in sequential data assimilation is proposed. An equation for the model error covariance required in the extended Kalman filter update is derived, along with an approximation suitable for application to the large-scale dynamics typical of environmental modeling. This approach is tested in the context of a low-order chaotic dynamical system. The results show that the filter skill is significantly improved by implementing the proposed scheme for the treatment of the unresolved scales.
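The effect the paper targets can be demonstrated with a minimal scalar filter (a toy sketch under invented parameters, not the authors' derivation): when the truth feels unresolved fast-scale forcing that the forecast model omits, a filter with zero model-error covariance becomes overconfident and stops tracking, while inflating the forecast variance by a model-error term q restores skill.

```python
import numpy as np

# Scalar Kalman filter; a = resolved slow dynamics, sig_u = amplitude of the
# unresolved-scale forcing that only the truth experiences.

rng = np.random.default_rng(2)
n_steps = 2000
a, sig_u, sig_o = 0.95, 0.3, 0.5

def run_filter(q):
    truth, x, p = 0.0, 0.0, 1.0
    err2 = 0.0
    for _ in range(n_steps):
        truth = a * truth + sig_u * rng.standard_normal()  # truth feels fast scales
        y = truth + sig_o * rng.standard_normal()
        x, p = a * x, a * a * p + q        # forecast; q = model-error covariance
        g = p / (p + sig_o**2)             # Kalman gain
        x, p = x + g * (y - x), (1 - g) * p
        err2 += (x - truth) ** 2
    return float(np.sqrt(err2 / n_steps))

rmse_no_q = run_filter(q=0.0)        # unresolved scales ignored: gain collapses
rmse_q = run_filter(q=sig_u**2)      # model error accounted for
```

With q = 0 the analysis variance decays to zero, the gain follows, and the filter diverges from the truth; any reasonable positive q keeps the filter responsive.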


2012 ◽  
Vol 27 (1) ◽  
pp. 124-140 ◽  
Author(s):  
Bin Liu ◽  
Lian Xie

Abstract Accurately forecasting a tropical cyclone’s (TC) track and intensity remains one of the top priorities in weather forecasting. A dynamical downscaling approach based on the scale-selective data assimilation (SSDA) method is applied to demonstrate its effectiveness in TC track and intensity forecasting. The SSDA approach retains the merits of global models in representing large-scale environmental flows and regional models in describing small-scale characteristics. The regional model is driven from the model domain interior by assimilating large-scale flows from global models, as well as from the model lateral boundaries by the conventional sponge zone relaxation. By using Hurricane Felix (2007) as a demonstration case, it is shown that, by assimilating large-scale flows from the Global Forecast System (GFS) forecasts into the regional model, the SSDA experiments perform better than both the original GFS forecasts and the control experiments, in which the regional model is only driven by lateral boundary conditions. The overall mean track forecast error for the SSDA experiments is reduced by over 40% relative to the control experiments and by about 30% relative to the GFS forecasts. In terms of TC intensity, benefiting from higher grid resolution that better represents regional and small-scale processes, both the control and SSDA runs outperform the GFS forecasts. The SSDA runs show approximately 14% less overall mean intensity forecast error than do the control runs. It should be noted that, for the Felix case, the advantage of SSDA becomes more evident for forecasts with a lead time longer than 48 h.
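The scale-selective idea can be sketched conceptually in one dimension (a hedged illustration with invented fields and cutoffs, not the SSDA code): the regional state is relaxed toward the global model only at large scales, isolated here with a spectral low-pass filter, so the regional model's small-scale detail survives.

```python
import numpy as np

# Toy 1-D periodic domain: the regional state carries extra small-scale
# structure plus a spurious large-scale offset relative to the global model.

def lowpass(field, n_keep):
    """Keep only the n_keep lowest wavenumbers of a periodic 1-D field."""
    coeffs = np.fft.rfft(field)
    coeffs[n_keep:] = 0.0
    return np.fft.irfft(coeffs, n=field.size)

n = 128
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
global_state = np.sin(x)                                  # large-scale flow
regional_state = np.sin(x) + 0.4 * np.sin(20 * x) + 0.1   # small scales + drift

alpha = 0.5  # relaxation coefficient (invented)
increment = alpha * (lowpass(global_state, 4) - lowpass(regional_state, 4))
nudged = regional_state + increment
```

Because the increment lives entirely in the retained low wavenumbers, the wavenumber-20 regional detail is untouched while the large-scale offset toward the global model is reduced.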


2015 ◽  
Vol 2 (2) ◽  
pp. 513-536 ◽  
Author(s):  
I. Grooms ◽  
Y. Lee

Abstract. Superparameterization (SP) is a multiscale computational approach wherein a large-scale atmosphere or ocean model is coupled to an array of simulations of small-scale dynamics on periodic domains embedded in the computational grid of the large-scale model. SP has been successfully developed in global atmosphere and climate models, and is a promising approach for new applications. The authors develop a 3D-Var variational data assimilation framework for use with SP; the relatively low cost and simplicity of 3D-Var in comparison with ensemble approaches makes it a natural fit for relatively expensive multiscale SP models. To demonstrate the assimilation framework in a simple model, the authors develop a new system of ordinary differential equations similar to the two-scale Lorenz-'96 model. The system has one set of variables denoted {Yi}, with large- and small-scale parts, and the SP approximation to the system is straightforward. With the new assimilation framework the SP model approximates the large-scale dynamics of the true system accurately.
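For orientation, the standard two-scale Lorenz-'96 system that the paper's new model resembles can be written down directly (this is the classic reference system with commonly used parameter values, not the authors' variant): K slow variables X_k, each coupled to J fast variables Y_{j,k}.

```python
import numpy as np

# Classic two-scale Lorenz-'96 tendencies; parameters are typical textbook
# choices, not taken from the paper.

K, J = 8, 32
F, h, b, c = 20.0, 1.0, 10.0, 10.0

def l96_two_scale_rhs(X, Y):
    """Right-hand side of the two-scale Lorenz-'96 equations."""
    Yk = Y.reshape(K, J)
    # dX_k = X_{k-1}(X_{k+1} - X_{k-2}) - X_k + F - (h c / b) sum_j Y_{j,k}
    dX = (np.roll(X, -1) - np.roll(X, 2)) * np.roll(X, 1) - X + F \
        - (h * c / b) * Yk.sum(axis=1)
    # dY_j = -c b (Y_{j+2} - Y_{j-1}) Y_{j+1} - c Y_j + (h c / b) X_k
    dY = -c * b * (np.roll(Y, -2) - np.roll(Y, 1)) * np.roll(Y, -1) \
        - c * Y + (h * c / b) * np.repeat(X, J)
    return dX, dY

# One small forward-Euler step, just to exercise the tendencies
rng = np.random.default_rng(3)
X = rng.standard_normal(K)
Y = 0.1 * rng.standard_normal(K * J)
dX, dY = l96_two_scale_rhs(X, Y)
X1, Y1 = X + 0.001 * dX, Y + 0.0005 * dY
```

In an SP approximation, each block of J fast variables would be simulated on its own periodic subdomain and coupled back to its slow variable only through the summed forcing term.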


Author(s):  
Ryan N. Smith ◽  
Jonathan Kelly ◽  
Yi Chao ◽  
Burton H. Jones ◽  
Gaurav S. Sukhatme

Autonomous underwater gliders are robust and widely used ocean sampling platforms characterized by their endurance, and are among the best means of gathering subsurface data at the spatial resolution needed to advance our knowledge of the ocean environment. Gliders generally do not employ sophisticated sensors for underwater localization, but instead dead-reckon between set waypoints. Thus, these vehicles are subject to large positional errors between prescribed and actual surfacing locations. Here, we investigate the incorporation of a large-scale, regional ocean model into the trajectory design for autonomous gliders to improve their navigational accuracy. We compute the dead-reckoning error for our Slocum gliders, and compare this to the average positional error recorded from multiple deployments conducted over the past year. We then compare trajectory plans computed on board the vehicle during recent deployments to our prediction-based trajectory plans for 140 surfacing occurrences.
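The source of the positional error can be illustrated with toy numbers for a single dive-climb cycle (all values invented for the sketch, not from the paper): dead reckoning ignores the ambient current, so the surfacing estimate drifts; folding an imperfect model current prediction into the estimate removes most of that drift.

```python
import numpy as np

# One hypothetical glider cycle. Currents, speed and duration are made up.

speed = 0.3                                 # speed through water, m/s
heading = np.array([1.0, 0.0])              # commanded direction (due east)
true_current = np.array([0.05, -0.08])      # actual depth-averaged current, m/s
model_current = np.array([0.04, -0.07])     # regional-model prediction, m/s
duration = 4 * 3600.0                       # cycle length, s

actual = (speed * heading + true_current) * duration   # real surfacing offset
dead_reckoned = speed * heading * duration             # current ignored
model_predicted = (speed * heading + model_current) * duration

dr_error = np.linalg.norm(actual - dead_reckoned)      # about 1.4 km here
model_error = np.linalg.norm(actual - model_predicted) # a few hundred metres
```

The residual error is set by the model's current prediction error rather than by the full current magnitude, which is the motivation for prediction-based trajectory planning.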


2013 ◽  
Vol 31 (2) ◽  
pp. 271 ◽  
Author(s):  
Leonardo Nascimento Lima ◽  
Clemente Augusto Souza Tanajura

ABSTRACT. In this study, assimilation of Jason-1 and Jason-2 along-track sea level anomaly (SLA) data was conducted in a region of the tropical and South Atlantic (7°N-36°S, from 20°W up to the Brazilian coast) using an optimal interpolation method and the HYCOM (Hybrid Coordinate Ocean Model). Four 24 h forecast experiments were performed daily from January 1 until March 31, 2011, considering different SLA assimilation data windows (1 day and 2 days) and different coefficients in the parameterization of the SLA covariance matrix model. The model horizontal resolution was 1/12° and the number of vertical layers was 21. The SLA analyses, added to the mean sea surface height, were projected to the subsurface with the Cooper & Haines (1996) scheme. The results showed that the experiment with a 2-day window of along-track data and with specific parameterizations of the model SLA error covariance for sub-regions of METAREA V was the most accurate. It completely reconstructed the model sea surface height and produced important improvements in the circulation. For instance, there was a substantial improvement in the representation of the Brazil Current and the North Brazil Undercurrent. However, since no vertical profiles of temperature and salinity and no sea surface temperature were assimilated, the methodology employed here should be considered only as a step towards a high-quality analysis for operational forecasting systems.   Keywords: data assimilation, optimal interpolation, Cooper & Haines scheme, altimetry data.


Ocean Science ◽  
2007 ◽  
Vol 3 (2) ◽  
pp. 321-335 ◽  
Author(s):  
V. Dulière ◽  
T. Fichefet

Abstract. Data assimilation into sea ice models designed for climate studies started about 15 years ago. In most of the studies conducted so far, it is assumed that the improvement brought by the assimilation is straightforward. However, some studies suggest this might not be true. In order to elucidate this question and to find an appropriate way to further assimilate sea ice concentration and velocity observations into a global sea ice-ocean model, we analyze here results from a number of twin experiments (i.e. experiments in which the assimilated data are model outputs) carried out with a simplified model of the Arctic sea ice pack. Our objective is to determine to what degree the assimilation of ice velocity and/or concentration data improves the global performance of the model and, more specifically, reduces the error in the computed ice thickness. A simple optimal interpolation scheme is used, and outputs from a control run and from perturbed experiments without and with data assimilation are thoroughly compared. Our results indicate that, under certain conditions depending on the assimilation weights and the type of model error, the assimilation of ice velocity data enhances the model performance. The assimilation of ice concentration data can also help in improving the model behavior, but it has to be handled with care because of the strong connection between ice concentration and ice thickness. This study is a first step towards real data assimilation into NEMO-LIM, a global sea ice-ocean model.

