Aviso+/CMEMS Observations


version information for this file:
$Id: AVISO.html 11877 2017-08-03 22:03:32Z thoar@ucar.edu $



convert_aviso was contributed by Frederic Castruccio - Thanks Fred!

This short description of the SEALEVEL_GLO_SLA_L3_REP_OBSERVATIONS_008_018 product is repeated from the INFORMATION tab from the Copernicus Marine Environment Monitoring Service online catalogue (in April 2017).

For the Global Ocean- Mono altimeter satellite along-track sea surface heights computed with respect to a twenty-year mean. Previously distributed by Aviso+, no change in the scientific content. All the missions are homogenized with respect to a reference mission which is currently Jason-2. This product is computed with an optimal and centered computation time window (6 weeks before and after the date). Two kinds of datasets are proposed: filtered (nominal dataset) and unfiltered.

The convert_aviso.f90 program is designed to read a netCDF file containing the (Level 3) sea surface anomalies from any of the following platforms: "Jason-1", "Envisat", or "Geosat Follow On". One of those platforms must be listed in the netCDF file's global attribute platform.

The data files have names like:
dt_global_j1_sla_vfec_20080101_20140106.nc,
dt_global_en_sla_vfec_20080101_20140106.nc, or
dt_global_g2_sla_vfec_20080101_20140106.nc;
corresponding to the Jason-1, Envisat, and Geosat Follow On platforms, respectively.

The DART observation TYPEs corresponding to these platforms are J1_SEA_SURFACE_ANOMALY, EN_SEA_SURFACE_ANOMALY, and GFO_SEA_SURFACE_ANOMALY, respectively, and are defined in obs_def_ocean_mod.f90.
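
The mission code embedded in the file name maps directly onto the DART observation TYPE. As a sketch only (the filename pattern and the helper function are assumptions inferred from the examples above, not part of DART):

```python
import re

# Map the two-letter mission code in the AVISO/CMEMS file name to the
# DART observation TYPE defined in obs_def_ocean_mod.f90.
MISSION_TO_TYPE = {
    "j1": "J1_SEA_SURFACE_ANOMALY",   # Jason-1
    "en": "EN_SEA_SURFACE_ANOMALY",   # Envisat
    "g2": "GFO_SEA_SURFACE_ANOMALY",  # Geosat Follow On
}

def dart_type_for(filename):
    """Return the DART observation TYPE for an along-track SLA file."""
    match = re.match(r"dt_global_([a-z0-9]+)_sla_vfec_\d{8}_\d{8}\.nc$",
                     filename)
    if match is None:
        raise ValueError("unrecognized AVISO filename: %s" % filename)
    code = match.group(1)
    if code not in MISSION_TO_TYPE:
        raise ValueError("unsupported mission code: %s" % code)
    return MISSION_TO_TYPE[code]
```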

Important Usage Note: Fred uses python scripts to repeatedly call convert_aviso and decided it was easiest to simply provide the input file name as a command-line argument and always name the output file obs_seq.aviso. convert_aviso has been modified to optionally take a second command-line argument: a netCDF file specifying the observation error standard deviation for each observation. If this file is not provided, the input namelist value for the observation error standard deviation is used. Other DART modules still require run-time control specified by input.nml.
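
A driver loop in the style Fred used could look like the following sketch. Only the calling convention (one required and one optional command-line argument, fixed obs_seq.aviso output) comes from the text above; the executable path and the renaming step are assumptions about your local setup.

```python
import os
import subprocess

def convert_aviso_command(input_file, error_std_file=None,
                          executable="./convert_aviso"):
    """Build the argv for one convert_aviso invocation.

    The first argument is the input netCDF file; the optional second
    argument is a netCDF file of per-observation error standard
    deviations.  The executable path is a placeholder.
    """
    argv = [executable, input_file]
    if error_std_file is not None:
        argv.append(error_std_file)
    return argv

def convert_one(input_file, output_file, error_std_file=None):
    """Run the converter, then rename the fixed obs_seq.aviso output,
    since convert_aviso always writes to the same file name."""
    subprocess.run(convert_aviso_command(input_file, error_std_file),
                   check=True)
    os.rename("obs_seq.aviso", output_file)
```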

Fred wrote another python script (shell_scripts/convert_aviso_2.py) to repeatedly call convert_aviso; it was subsequently modified by Romain Escudier of Rutgers so that you can select which year to convert, with a little more error-checking. Both convert_aviso.py and convert_aviso_2.py must be edited to reflect the location of the data files on your system.
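
The year selection can be done from the file names alone. A minimal sketch, assuming the dt_global_<mission>_sla_vfec_<start>_<end>.nc naming convention shown above (this is not the actual script):

```python
from datetime import datetime

def files_for_year(filenames, year):
    """Select files whose along-track date range overlaps a given year.

    Start and end dates are parsed from the trailing YYYYMMDD fields of
    the file name; files that do not cover the requested year are
    skipped.
    """
    selected = []
    for name in filenames:
        stem = name[:-3] if name.endswith(".nc") else name
        parts = stem.split("_")
        start = datetime.strptime(parts[-2], "%Y%m%d")
        end = datetime.strptime(parts[-1], "%Y%m%d")
        if start.year <= year <= end.year:
            selected.append(name)
    return selected
```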

DART provides shell_scripts/run_convert_aviso.csh to prepare the input required and run convert_aviso_2.py as a batch job.

After creating a large number of output observation sequence files, it is usually necessary to consolidate the files and subset them into files containing just the timeframe required for a single assimilation. NOTE: the obs_sequence_tool is constructed for just this purpose.

The shell_scripts/makedaily.sh script attempts to consolidate all the SLA observations and those that may have been (separately) converted from the World Ocean Database into 24-hour segments centered at midnight GMT. You will have to modify the makedaily.sh script to suit your filesystem and naming convention. It is provided as a starting point.
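
A 24-hour segment centered at midnight GMT runs from noon of the previous day to noon of the nominal day. The window arithmetic behind such a script can be sketched as follows (the helper name is hypothetical):

```python
from datetime import datetime, timedelta

def daily_window(date):
    """Return the (start, end) bounds of a 24-hour assimilation window
    centered at midnight GMT of `date`: noon of the previous day to
    noon of `date`.  A sketch of the arithmetic only; the real script
    passes equivalent bounds to obs_sequence_tool.
    """
    midnight = datetime(date.year, date.month, date.day)
    return midnight - timedelta(hours=12), midnight + timedelta(hours=12)
```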

Reminder: (according to the data providers): In order to compute Absolute Dynamic Topography, the Mean Dynamic Topography (MDT) can be added. It is distributed by Aviso+ ( http://www.aviso.altimetry.fr/en/data/products/auxiliary-products/mdt.html ). Fred was using this product in assimilations with POP, so he chose a different source for MDT - consistent with POP's behavior.



The Copernicus Marine Environment Monitoring Service (CMEMS) has taken over the processing and distribution of the Ssalto/Duacs multimission altimeter products formerly administered by Aviso+. After a registration process, the along-track sea level anomalies (SLA) may be downloaded from http://marine.copernicus.eu/services-portfolio/access-to-products/ - search for SEALEVEL_GLO_SLA_L3_REP_OBSERVATIONS_008_018 if it does not come up directly.



convert_aviso.f90: performs the actual conversion from netCDF to a DART observation sequence file, which may be ASCII or binary.
shell_scripts/convert_aviso.py: python script to convert a series of input files and datestamp the output files. This script works with a single satellite and issues a warning if a particular date has no input data.
shell_scripts/convert_aviso_2.py: python script to convert a series of input files and datestamp the output files. This version works with many satellite platforms.
shell_scripts/run_convert_aviso.csh: shell script that can be run interactively or as a batch job to run convert_aviso_2.py.
shell_scripts/makedaily.sh: shell script that repeatedly calls obs_sequence_tool to consolidate multiple observation sequence files into a single file containing ALL the observations from ALL platforms, as well as any other observations from pre-existing observation sequence files (from the WOD, for example). makedaily.sh can loop over time ranges, creating one observation sequence per range. This is also the place to do any geographic subsetting.



This namelist is read from the file input.nml. Namelists start with an ampersand '&' and terminate with a slash '/'. Character strings that contain a '/' must be enclosed in quotes to prevent them from prematurely terminating the namelist.

   &convert_aviso_nml
      observation_error_std = 0.03
      /

Contents                Type       Description
observation_error_std   real(r8)   observation error standard deviation to be used for all observations IF the optional second command-line argument is not provided.











Terms of Use

DART software - Copyright UCAR. This open source software is provided by UCAR, "as is", without charge, subject to all terms of use at http://www.image.ucar.edu/DAReS/DART/DART_download

Contact: Tim Hoar
Revision: $Revision: 11877 $
Source: $URL: https://svn-dares-dart.cgd.ucar.edu/DART/releases/classic/observations/AVISO/AVISO.html $
Change Date: $Date: 2017-08-03 16:03:32 -0600 (Thu, 03 Aug 2017) $
Change history:  try "svn log" or "svn diff"