Wavelet Transform Modulus Maxima Decay Lines: Damage Detection in Varying Operating Conditions

Andreas Kyprianou (University of Cyprus, Cyprus) and Andreas Tjirkallis (University of Cyprus, Cyprus)
Copyright: © 2015 |Pages: 21
DOI: 10.4018/978-1-4666-8490-4.ch003


An important task in structural health monitoring (SHM) is that of damage detection under varying environmental and operational conditions. Structures, under varying environmental conditions, change their mass, elasticity and damping properties, whereas changing operational conditions cause changes to the excitations. A damage detection methodology implemented in these circumstances faces serious challenges, since changes to structural behaviour imparted by environmental or operational conditions could be wrongly attributed to damage. The part of a damage detection decision algorithm that removes environmental and operational effects is called normalization. In this chapter a normalization methodology based on the similarity between continuous wavelet transform modulus maxima decay lines is presented. This methodology is implemented on both simulated and experimental data. Simulated data were obtained from a three degree of freedom system. Varying environmental conditions were simulated by temperature-dependent stiffness parameters, and varying operating conditions by changing the colour of the random excitation. Experimental data were obtained from damaged cantilever beams that were subjected to random excitations of different colour and varying temperatures.
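The central objects of the methodology are the continuous wavelet transform modulus maxima and the way their magnitudes decay across scales. The following is a minimal sketch of extracting such a decay line, assuming a real Morlet mother wavelet and a simple nearest-maximum chaining rule across scales; neither choice is taken from the chapter itself, and the test signal is an arbitrary stand-in for a measured response.

```python
import numpy as np

def morlet(t, w0=5.0):
    """Real-valued Morlet mother wavelet (an assumed choice)."""
    return np.exp(-0.5 * t**2) * np.cos(w0 * t)

def cwt_modulus(x, scales, dt=1.0):
    """|CWT| of signal x, computed by direct convolution with scaled wavelets."""
    out = np.empty((len(scales), len(x)))
    for i, s in enumerate(scales):
        t = np.arange(-4.0 * s, 4.0 * s + dt, dt)
        psi = morlet(t / s) / np.sqrt(s)
        out[i] = np.abs(np.convolve(x, psi, mode="same"))
    return out

def local_maxima(row):
    """Indices where one scale's modulus row has a local maximum in time."""
    return np.where((row[1:-1] > row[:-2]) & (row[1:-1] >= row[2:]))[0] + 1

def decay_line(modulus, start_idx):
    """Chain the modulus maximum nearest to start_idx across scales and
    return its magnitudes: one maxima decay line."""
    line, idx = [], start_idx
    for row in modulus:
        m = local_maxima(row)
        if len(m) == 0:
            break
        idx = m[np.argmin(np.abs(m - idx))]  # nearest maximum at this scale
        line.append(row[idx])
    return np.array(line)

# Example: a signal with a localized irregularity whose maxima decay
# could then be compared between structural states.
time = np.linspace(0.0, 10.0, 1000)
x = np.sin(2.0 * np.pi * time)
x[500:] += 0.5                      # abrupt local change at sample 500
scales = np.arange(1, 33)
M = cwt_modulus(x, scales)
line = decay_line(M, start_idx=500)
```

Comparing such decay lines between a reference state and the in-service structure is what the chapter's similarity measure operates on.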
Chapter Preview


The essential constituent of a structural health monitoring (SHM) methodology is its damage detection procedure. The underlying operating principle of damage detection algorithms is the comparison of the behavior of the structure under monitoring against its expected (nominal) behavior. If a significant difference between the two behaviors is observed, the damage detection algorithm has to decide whether it should be attributed to damage. Over the years many damage detection methodologies have been developed that exploit the effects of damage on structural responses. The most demanding challenge faced by these methodologies is detecting damage in structures in service, since these operate under varying environmental and loading conditions.

Various studies have demonstrated that the structural changes induced by varying environmental and operational conditions could be falsely attributed to damage. Therefore, damage detection algorithms should take the effects of changing environmental and operating conditions into consideration. The process of removing these effects from data measured on structures operating under such circumstances has been given the name normalization (Sohn & Farrar, 2001; Sohn, Worden, & Farrar, 2002; Sohn, 2007).

Over the years different normalization techniques have been developed by taking into consideration:

  1. Information about the actual operational and environmental conditions, usually available in terms of environmental measurements;

  2. The lack of this information; and

  3. That such information might or might not be available.

In the context of (1) and (2), damage detection algorithms utilize models, called baseline models, which capture the behavior of the undamaged structure under varying environmental and operational conditions. In (3), damage detection features that are insensitive to environmental or operational conditions are sought, and hence baseline models are not required. A review of the normalization methodologies developed up to 2007 can be found in Sohn (2007) and the references therein.

The normalization methodologies developed after 2007 in the context of (1) relied on statistical analysis. Deraemaeker et al. (2008) implemented statistical factor analysis of the eigen-properties, which they computed using stochastic subspace identification and the Fourier transforms of modal filters. Limongelli (2010) used spline interpolation to obtain frequency response functions (FRF) at locations on structures where measurements are not available. She proposed the difference of interpolation errors between a reference structure and the monitored in-service structure as a damage detection feature. To avoid environmental effects on this feature she employed a statistical classification criterion, realised as a statistical hypothesis test on the probability distribution of the FRF interpolation error.
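The interpolation-error idea can be illustrated with a leave-one-out spline fit over sensor positions: a spline is fitted through all sensors but one, and the discrepancy at the held-out sensor is the feature. The sensor layout and FRF magnitudes below are hypothetical, and scipy's `CubicSpline` stands in for whatever spline scheme was actually used.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def interpolation_errors(x, y):
    """Leave-one-out spline interpolation error at each sensor location."""
    err = np.empty_like(y)
    for k in range(len(x)):
        mask = np.arange(len(x)) != k
        # Fit through the other sensors, evaluate at the held-out one.
        err[k] = abs(CubicSpline(x[mask], y[mask])(x[k]) - y[k])
    return err

# Hypothetical FRF magnitudes sampled along a beam at 8 sensor positions.
x = np.linspace(0.0, 1.0, 8)
healthy = np.sin(np.pi * x)          # smooth, mode-shape-like profile
damaged = healthy.copy()
damaged[4] += 0.3                    # a local anomaly distorts one FRF value

e_healthy = interpolation_errors(x, healthy)
e_damaged = interpolation_errors(x, damaged)
```

The error concentrates at the anomalous location, and a statistical test on its distribution is what separates a genuine change from environmental scatter.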

Hypothesis testing on the predictive ability of a model generated by embedding delay vectors was proposed by Figueiredo et al. (2010) for situations where information about the environmental conditions is not available. Chang and Sohn (2009) presented an approach based on unsupervised support vector machines that obtains non-linear principal components capturing the effects of unmeasured environmental conditions on the structure. This procedure was also used with impedance measurements (Lim, Kim, & Sohn, 2011). An alternative approach, motivated by the biological immune system, was proposed by Surace and Worden (2010) for damage detection of structures operating under varying environmental conditions when information about the actual operating conditions is not available.
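The idea of absorbing unmeasured environmental effects into leading principal components can be sketched with plain linear PCA; Chang and Sohn's actual method uses unsupervised support vector machines and non-linear components, so the sketch below is only an illustration of the underlying principle, on synthetic data with a hypothetical one-dimensional temperature drift.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training features: a fixed baseline plus a one-dimensional
# environmental drift (e.g. temperature) and small measurement noise.
n, d = 200, 6
env_dir = np.zeros(d)
env_dir[0] = 1.0                               # hypothetical drift direction
temps = rng.uniform(-1.0, 1.0, size=n)
X = 0.5 + np.outer(temps, env_dir) + 0.01 * rng.standard_normal((n, d))

# Centre the data and keep the leading principal component, which
# absorbs the environmental variability present in the training set.
mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
V = Vt[:1].T                                    # environmental subspace

def residual(x):
    """Norm of the part of x not explained by the environmental
    subspace: the damage index."""
    xc = x - mu
    return np.linalg.norm(xc - V @ (V.T @ xc))

baseline_idx = np.array([residual(row) for row in X])
damaged = X[0].copy()
damaged[3] += 0.5                               # change off the drift direction
```

Features from the undamaged structure stay close to the environmental subspace regardless of temperature, while a change that does not lie along the learned drift direction produces a large residual.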
