Introduction
In nuclear engineering, the processing of radiation signals is of great significance for monitoring radioactive sources and detecting abrupt changes in consecutive measurements. Radiation measurement systems display the number of electric pulses (i.e., counts) produced by the interaction of a specific type of particle, such as photons, neutrons, or cosmic rays, in the detector’s sensing element (Tsoulfanidis & Landsberger, 2010) over a given time interval. Registered counts therefore commonly result from both radioactive sources and naturally occurring radioactive materials (Ely et al., 2003) in close proximity to the sensor location. In a static environment, a radiation measurement system displays a series of counts whose gross number fluctuates around a fixed value. In reality, however, most environments are dynamic and the gross count number exhibits several abrupt changes or transitions. Long-term recording of radiation measurements at a specific location therefore allows ahead-of-time estimation of the number of counts to be reported by a radiation sensor. As a result, the crucial step in monitoring is detecting an abrupt change as soon as possible by recognizing unusual measurements, i.e., radiation recordings that do not match the expectations fostered by previous observations.
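To make the notion of an "unusual measurement" concrete, the following sketch (illustrative only, not taken from this article) flags counts that stray from an expected background level. It relies on the standard Poisson property of counting statistics: the standard deviation of the counts equals the square root of the mean. The function name `flag_anomalies` and the `n_sigma` band width are hypothetical choices for this example.

```python
def flag_anomalies(counts, background_mean, n_sigma=3.0):
    """Return indices of counts outside the expected background band.

    For Poisson-distributed counts the standard deviation is the square
    root of the mean, so a count is treated as unusual when it falls
    outside background_mean +/- n_sigma * sqrt(background_mean).
    (Illustrative sketch; band width n_sigma is an assumed parameter.)
    """
    sigma = background_mean ** 0.5
    lo = background_mean - n_sigma * sigma
    hi = background_mean + n_sigma * sigma
    return [i for i, k in enumerate(counts) if not (lo <= k <= hi)]
```

For example, with a background of 100 counts per interval, `flag_anomalies([100, 95, 103, 160, 155], 100)` flags the last two samples, which lie well outside the 3-sigma band of roughly 70 to 130 counts.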
Towards that end, artificial intelligence (Russell & Norvig, 2002) has great potential for creating intermediate infrastructures that develop and/or combine various monitoring methodologies. Fusion and processing of multi-source data can be combined with intelligent tools to achieve significant improvement of surveillance parameters. More specifically, automated implementation of the associated processes can decrease the required measurement time, while intelligent processing of the obtained signals can shorten the abrupt change detection time.
Several methods for radiation monitoring and subsequent abrupt change detection are already in use, and even more have been or will be proposed (Fagan et al., 2012). The efficiency of each technique rests on a set of unique characteristics appropriate for either count estimation or measurement analysis. In particular, Stephens and Peurrung (2004) studied the use of multiple sensors and information fusion in monitoring radioactive sources. Apostolopoulos (2008) applied tools from statistical signal processing to radiation monitoring and the detection of abrupt changes in measurements. Pfund et al. (2010) proposed an anomaly detection algorithm based on weighted spectral comparison ratios, while Tardiff et al. (2006) proposed a methodology adopting principal component analysis for anomaly identification. Principal component analysis was also employed for data analysis in portal monitors by Runkle et al. (2006). The sequential probability ratio test (SPRT) has been extensively applied in radiation monitoring as part of several methodologies, such as those proposed by Fehlau (1993), Jarman et al. (2004), and Luo et al. (2010). Adoption of the Kalman filter for tracking background measurements was proposed by Jarman et al. (2008), and a matched filter based approach for vehicle portal monitors was introduced by Runkle et al. (2005). Further, the use of wavelets in detecting radiation anomalies was proposed by Ominoamou et al. (2009).
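Of the techniques surveyed above, the SPRT is compact enough to sketch here. The following is a minimal, illustrative implementation (not drawn from any of the cited methodologies) that tests a background-only Poisson rate `lam0` against a background-plus-source rate `lam1`, accumulating the per-sample log-likelihood ratio until one of Wald's two decision thresholds is crossed. The function name and parameter choices are assumptions made for this example.

```python
import math

def sprt_poisson(counts, lam0, lam1, alpha=0.01, beta=0.01):
    """Sequential probability ratio test between two Poisson rates.

    H0: counts follow Poisson(lam0) (background only)
    H1: counts follow Poisson(lam1) (background plus source), lam1 > lam0
    alpha, beta: target false-alarm and missed-detection probabilities.

    Returns ("H0" or "H1", index of the deciding sample), or
    ("undecided", len(counts)) if neither threshold is crossed.
    """
    upper = math.log((1.0 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1.0 - alpha))   # cross below: accept H0
    llr = 0.0
    for i, k in enumerate(counts):
        # Log-likelihood ratio increment for one Poisson observation k:
        # ln[P(k | lam1) / P(k | lam0)] = k*ln(lam1/lam0) - (lam1 - lam0)
        llr += k * math.log(lam1 / lam0) - (lam1 - lam0)
        if llr >= upper:
            return "H1", i
        if llr <= lower:
            return "H0", i
    return "undecided", len(counts)
```

For instance, with `lam0=5` and `lam1=15`, a run of background-level counts such as `[5, 4, 6, 5]` drives the statistic below the lower threshold within two samples, while an elevated run such as `[14, 16, 15]` crosses the upper threshold on the first sample; this sequential stopping is what makes the SPRT attractive for shortening detection time.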
Although these methods have traditionally been used for stand-alone monitoring, a belief that has been evolving in the research community is that combining techniques is likely to increase inspection efficiency. Toward that end, researchers have focused on developing methodologies that bring developed and proposed technologies together (Tsoukalas, 1997; Ikonomopoulos et al., 1993). The ultimate goal is to build hybrid systems with enhanced accuracy (i.e., an optimal proportion of true detections and false alarms) and increased detection speed.