A Multispectral and Multiscale View of the Sun

T. Dudok de Wit (LPC2E, CNRS and University of Orléans, France)
DOI: 10.4018/978-1-60960-477-6.ch012


The emergence of a new discipline called space weather, which aims at understanding and predicting the impact of solar activity on the terrestrial environment and on technological systems, has led to a growing need for analysing solar images in real time. The rapidly growing volume of solar images, however, makes it increasingly impractical to process them manually for scientific purposes. This situation has prompted the development of novel processing techniques for feature recognition, image tracking, knowledge extraction, and so on. This chapter focuses on two particular concepts and lists some of their applications. The first one is Blind Source Separation (BSS), which has great potential for condensing the information that is contained in multispectral images. The second one is multiscale (multiresolution, or wavelet) analysis, which is particularly well suited for capturing scale-invariant structures in solar images.
Chapter Preview


The Sun is a world of paradoxes. It is our closest star and yet, distant stars and galaxies have received far more attention as far as data analysis techniques are concerned. Until the dawn of the space age, most solar images were taken in visible light only, since the terrestrial atmosphere absorbs most other wavelengths. Visible light, however, mostly reveals the lowest layer of the solar atmosphere, which is relatively featureless apart from the occasional presence of structures such as sunspots. Space-borne telescopes have opened the infrared, the ultraviolet and the X-ray windows, in which the Sun appears much more structured. The vacuum ultraviolet (VUV) domain, which extends from 10 to 200 nm, has received considerable attention since it provides deep insight into the highly dynamic and energetic solar atmosphere (Aschwanden, 2005a).

The prime objective of solar image analysis is a better understanding of the complex physical processes that govern the solar atmosphere. The traditional approach consists in observing the Sun simultaneously in different wavelengths and in matching the results obtained by spectroscopic diagnostics with physical models. Indeed, key quantities such as the temperature or the density cannot be directly accessed and so a quantitative picture can only be obtained at the price of time-consuming comparisons with simulations from radiation transfer models, using strong assumptions such as local thermodynamic equilibrium. A key issue is to find new and more empirical means for rapidly inferring pertinent physical properties from such data cubes.

This situation has recently evolved with the emergence of a new discipline called space weather, which aims at understanding and predicting solar variability in order to mitigate its adverse effects on Earth. Manifestations of solar activity such as flares and interplanetary perturbations indeed influence the terrestrial environment and sometimes cause significant economic losses by affecting satellites, electric power grids, radio communications, satellite orbits, airborne remote sensing and also climate. This new discipline has stimulated the search for new and quicker ways of characterising solar variability. For most users of space weather, empirical quantities that are readily available are valued more than physical quantities whose computation cannot be done in real-time. The sudden need for operational services has stimulated the search for novel multidisciplinary solutions for automated data processing that rely on concepts such as feature recognition, knowledge extraction, machine learning, classification, source separation, etc. (Schrijver et al., 2007). The truly multidisciplinary character of this quest is attested by the fact that most of these concepts are also exploited in other chapters of this book.

In most studies of the Sun, the focus has been on the identification and characterisation of individual solar features, such as loops (Inhester, Feng, & Wiegelmann, 2008), sunspots (Colak & Qahwaji, 2009), prominences (Labrosse, Dalla, & Marshall, 2010) and interplanetary disturbances (Robbrecht & Berghmans, 2005). As the cadence and the size of solar images increase, however, so does the need for extracting metadata and doing data mining. The human eye often remains one of the best expert systems, so tools are also needed to assist humans in visualising multiple images. The Solar Dynamics Observatory satellite, for example, which delivered its first images in April 2010, provides 4096 x 4096 pixel images simultaneously in 7 VUV wavelengths, several times per minute. For such purposes, it is desirable to have techniques that, in addition to extracting specific features, can display (1) multiple wavelengths simultaneously and (2) multiple scales, in a more compact way. Neither task has received much attention yet, but both will surely become an active field of research in the next decade.
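The idea of condensing many wavelength channels into a few common components can be sketched with a toy calculation. Everything here is illustrative and not the chapter's own pipeline: the synthetic data, its dimensions, and the use of a plain SVD (the decorrelation step that typically precedes a full BSS method such as ICA) are assumptions for the sake of a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "multispectral" data: 5 wavelength channels of 64 x 64 pixels,
# built as linear mixtures of 2 underlying source maps plus weak noise.
n_wav, ny, nx = 5, 64, 64
sources = rng.standard_normal((2, ny * nx))      # 2 hidden spatial maps
mixing = rng.standard_normal((n_wav, 2))         # per-wavelength weights
images = mixing @ sources + 0.01 * rng.standard_normal((n_wav, ny * nx))

# Centre each channel and take an SVD: the leading right-singular vectors
# are a few spatial maps that condense the information shared by all
# channels (a PCA, the usual first step of blind source separation).
X = images - images.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Fraction of the total variance captured by the first two components.
explained = (s[:2] ** 2).sum() / (s ** 2).sum()
print(f"variance captured by 2 components: {explained:.4f}")
```

Because the synthetic cube is (by construction) a rank-2 mixture, two components capture almost all the variance; on real VUV image cubes the same machinery yields a small number of maps summarising many wavelengths at once.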

In this short overview, we shall focus on two concepts that are particularly appropriate for handling such tasks: multispectral and multiscale analysis. In both cases, potential applications will be emphasized rather than their technical aspects, which can be found in the literature. For recent reviews on feature detection algorithms, see the chapter by Pérez-Suárez et al. (2010), and also Aschwanden (2010) and Zharkova, Ipson, Benkhalil, and Zharkov (2005). The book by Starck and Murtagh (2006) is another reference in the field, although it concentrates on astronomical objects that are point-like.
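The multiscale (multiresolution) idea can likewise be illustrated with a minimal pyramid decomposition. This sketch uses simple 2x2 block averaging in place of a proper wavelet filter bank, which is an assumed simplification rather than the method used in the chapter; what it does preserve is the defining property of a multiresolution analysis, namely that each level isolates structures of one scale and the original image is recovered exactly from the sum of all levels.

```python
import numpy as np

def multiscale_pyramid(img, levels):
    """Decompose an image into per-scale detail maps plus a coarse residual.

    At each level the image is smoothed by 2x2 block averaging; the
    difference between the image and its smoothed, re-upsampled version
    holds the structures living at that scale.
    """
    scales = []
    current = img.astype(float)
    for _ in range(levels):
        ny, nx = current.shape
        coarse = current.reshape(ny // 2, 2, nx // 2, 2).mean(axis=(1, 3))
        detail = current - np.kron(coarse, np.ones((2, 2)))
        scales.append(detail)
        current = coarse
    scales.append(current)   # coarsest approximation last
    return scales

# Tiny synthetic "image" with a smooth gradient.
img = np.add.outer(np.arange(8), np.arange(8)).astype(float)
scales = multiscale_pyramid(img, 2)

# Perfect reconstruction: upsample the coarse map and add back details.
recon = scales[-1]
for detail in reversed(scales[:-1]):
    recon = np.kron(recon, np.ones((2, 2))) + detail
print(np.allclose(recon, img))  # True
```

A genuine wavelet transform replaces the block average with smoother analysis filters, but the structure of the computation, and the reason it captures scale-invariant features so naturally, is the same.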
