Analysis of Ocean in Situ Observations and Web-Based Visualization: From Individual Measurements to an Integrated View

Alexander Barth (University of Liege, Belgium), Sylvain Watelet (University of Liege, Belgium), Charles Troupin (Balearic Islands Coastal Ocean Observing and Forecasting System (SOCIB), Spain), Aida Alvera-Azcárate (University of Liege, Belgium), and Jean-Marie Beckers (University of Liege, Belgium)
DOI: 10.4018/978-1-5225-0700-0.ch015

Abstract

The sparsity of observations poses a challenge common to various ocean disciplines. Even for physical parameters, where the spatial and temporal coverage is higher, current observational networks undersample a broad spectrum of scales. This situation is generally more severe for chemical and biological parameters because such sensors are less widely deployed. The present chapter describes the analysis tool DIVA (Data-Interpolating Variational Analysis), which is designed to generate gridded fields from in situ observations. DIVA has been applied to various physical (temperature and salinity), chemical (concentration of nitrate, nitrite and phosphate) and biological parameters (abundance of a species). The chapter also presents the technologies used to visualize the gridded fields. Visualizing analyses derived from in situ observations poses a unique set of challenges, since the accuracy of the analysed field is not spatially uniform and depends strongly on the location of the observations. In addition, an adequate treatment of the depth and time dimensions is essential.

Introduction

In situ measurements of ocean properties are generally sparsely distributed and thus undersample the ocean variability. A good knowledge of the scales and dynamics of the ocean is essential to make the best use of this limited amount of data. The data distribution is often very inhomogeneous, both in space and in time. Accuracy is also inhomogeneous, and point measurements (e.g., moorings, coastal stations) are not necessarily representative of typical ocean conditions. Therefore, it is necessary to take all these aspects into account when analysing in situ measurements.

Various applications and interpretations of marine data require a gridded field covering the complete domain as a first step. These applications include, for example:

  • Computation of budgets (e.g., heat, salt content),

  • Identification of long-term trends,

  • Identification and characterization of oceanographic features,

  • Determination of derived variables from parameters not necessarily measured at the same location (for example, density, which is a function of temperature and salinity; see the sketch after this list),

  • Model initialization and validation.
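
As an illustration of the derived-variable case, the following minimal sketch computes density from gridded temperature and salinity using a simplified linear equation of state. The coefficients are illustrative values chosen for this example only; operational work would use a full equation of state such as TEOS-10. The key point is that temperature and salinity must first be gridded onto the same grid before the derived variable can be evaluated.

    import numpy as np

    def density(T, S, rho0=1025.0, alpha=2.0e-4, beta=7.6e-4,
                T0=10.0, S0=35.0):
        """Simplified linear equation of state (illustrative only):
        rho = rho0 * (1 - alpha*(T - T0) + beta*(S - S0)),
        with T in deg C, S in g/kg and rho in kg/m^3. T and S must be
        defined on the same grid."""
        return rho0 * (1.0 - alpha * (np.asarray(T) - T0)
                           + beta * (np.asarray(S) - S0))

    # Example: density on a small hypothetical 2 x 2 grid.
    T = np.array([[12.0, 14.0], [11.0, 13.0]])
    S = np.array([[35.5, 36.0], [35.2, 35.8]])
    rho = density(T, S)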

In principle, some of these applications could also be performed directly on the measured observations. For example, the average ocean temperature could be naively computed as the mean of all observations. However, in practice the data distribution is very inhomogeneous in space (with a higher density of observations near the surface and close to the coast) and in time (e.g., more ship-based observations are generally available during summer than during winter, and the density of observations has increased substantially during the last decades). Hence, computing the mean directly from the observations would result in a biased estimate of the average temperature.
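
The following minimal sketch illustrates this sampling bias with synthetic data (the profile and sampling distribution are hypothetical, chosen only to make the effect visible). A naive mean over all observations is pulled towards the densely sampled warm surface layer, whereas averaging within depth bins first gives every part of the water column equal weight:

    import numpy as np

    def true_temp(z):
        # Idealized profile: temperature decreases linearly with depth (deg C).
        return 20.0 - 0.015 * z

    rng = np.random.default_rng(0)

    # Sampling biased towards the surface: 900 observations above 100 m,
    # only 100 between 100 m and 1000 m.
    depths = np.concatenate([rng.uniform(0.0, 100.0, 900),
                             rng.uniform(100.0, 1000.0, 100)])
    obs = true_temp(depths) + rng.normal(0.0, 0.1, depths.size)

    # Naive mean: dominated by the densely sampled warm surface layer.
    naive_mean = obs.mean()

    # Bin-averaged mean: average within depth bins first, then average the
    # bin means, so each depth range contributes equally.
    bins = np.linspace(0.0, 1000.0, 11)
    idx = np.digitize(depths, bins) - 1
    bin_means = [obs[idx == i].mean() for i in range(len(bins) - 1)
                 if np.any(idx == i)]
    binned_mean = np.mean(bin_means)

    print(f"naive mean:        {naive_mean:.2f}")   # about 18.5, warm-biased
    print(f"bin-averaged mean: {binned_mean:.2f}")  # close to the true 12.5

Bin averaging is of course only the crudest correction; the analysis methods discussed below address the same problem in a more general way.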


Background

Most analysis methods are linear techniques, which means that the analysed field (or its anomaly with respect to a reference state) at a given location is a weighted linear combination of the observations. Different methods exist to compute those weights. The most obvious way to compute them is to use a monotonically decreasing function of the distance between the location of the analysed field and the observations: observations farther away from a given location have less influence than nearby observations. This approach is implemented in the Cressman method (Cressman, 1959). The weighting function usually decreases over a given length-scale and is exactly zero beyond a given search radius. Various weighting schemes have been proposed (Barnes, 1964). The search radius can be made to vary in space depending on data coverage and/or dynamical scales. The Cressman weighting is a very simple and numerically efficient method. However, it suffers from some limitations; in particular, no estimate can be obtained at locations where no observation lies within the search radius. In regions with very few observations, the method can therefore return a discontinuous field. In addition, the presence of topographic barriers disconnecting different water bodies cannot easily be taken into account. Another limitation is that all observations are assumed to have a similar error variance, since the weighting is based only on distance.
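
The following minimal sketch implements the classical Cressman weighting, w = (R^2 - d^2) / (R^2 + d^2) for distances d smaller than the search radius R and w = 0 beyond it, on a flat two-dimensional domain. The function name and the observation data are hypothetical, chosen only for illustration; grid points with no observation inside the search radius are left undefined, which is precisely the discontinuity limitation noted above:

    import numpy as np

    def cressman_analysis(grid_x, grid_y, obs_x, obs_y, obs_val, radius):
        """Cressman (1959) weighted average: w = (R^2 - d^2) / (R^2 + d^2)
        inside the search radius R, w = 0 outside. Returns NaN at grid
        points where no observation falls within the radius."""
        analysis = np.full(grid_x.shape, np.nan)
        for idx in np.ndindex(grid_x.shape):
            # Squared distances from this grid point to all observations.
            d2 = (obs_x - grid_x[idx])**2 + (obs_y - grid_y[idx])**2
            w = np.where(d2 < radius**2,
                         (radius**2 - d2) / (radius**2 + d2), 0.0)
            if w.sum() > 0:
                analysis[idx] = np.sum(w * obs_val) / w.sum()
        return analysis

    # Usage: three observations analysed onto a 50 x 50 grid over the
    # unit square, with a search radius of 0.3.
    gx, gy = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
    ox = np.array([0.2, 0.5, 0.8])
    oy = np.array([0.3, 0.7, 0.2])
    ov = np.array([10.0, 12.0, 11.0])
    field = cressman_analysis(gx, gy, ox, oy, ov, radius=0.3)

Note that the weights depend only on distance, so all observations are implicitly treated as equally reliable, and nothing in the scheme prevents information from leaking across topographic barriers such as a peninsula separating two basins.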
