Restoration of CT Images Corrupted With Fixed Valued Impulse Noise Using an Optimum Decision-Based Filter

Priyank Saxena, R. Sukesh Kumar
Copyright: © 2018 | Pages: 20
DOI: 10.4018/978-1-5225-5246-8.ch008

Abstract

The main aim of this chapter is to restore computed tomography (CT) images acquired at a reduced level of radiation dose. Reducing the radiation dose degrades image quality, as it increases noise and lowers low-contrast resolution. In this chapter, an optimum decision-based filter (ODBF) is proposed as an image-space denoising technique to detect and restore low dose CT (LDCT) images corrupted with fixed valued impulse noise (salt and pepper) of unequal density. The detection stage employs k-means clustering to discriminate noise-free pixels from noisy pixels by splitting the image data into three clusters of different intensities. The restoration stage employs mask else trimmed median (METM) estimation followed by optional adaptive mask sizing to restore the noisy pixels. The proposed method demonstrates noticeable improvement over other existing methods in the restoration of LDCT images while maintaining image contrast and edge details.
Chapter Preview

Introduction

Computed Tomography (CT) is the most widely used medical imaging technique for diagnosis, and its images commonly acquire noise and artefacts during acquisition. A CT scan is well suited to detecting lesions of very low contrast. Compared with other imaging modalities, however, there is growing concern regarding the amount of radiation dose associated with CT. High radiation doses during medical examination increase the examinee's lifetime risk of cancer (de González et al., 2009). In CT, there is a trade-off between image quality and the amount of radiation exposure to patients. Reducing the radiation dose degrades image quality, as it increases noise and lowers low-contrast resolution. For instance, reducing the radiation exposure by a factor of 2 increases the noise by a factor of approximately √2 (Borsdorf et al., 2008). Image noise and non-stationary streak artefacts are highly apparent in Low Dose CT (LDCT) images. The image noise limits the visualization of low-contrast structures, which in turn affects the diagnostic quality. It is therefore critically important to maintain diagnostically acceptable image quality in LDCT examinations. Nowadays, LDCT examinations are used in several clinical studies, such as annual repeat screening for lung cancer using shorter CT scan times (Oguchi et al., 2000).

CT images are often corrupted with impulse noise, which is mostly generated by errors in image acquisition, recording, and transmission, or by the use of low-quality sensors (Lin et al., 2015). Impulse noise can be either fixed valued (salt and pepper) or random valued. Fixed Valued Impulse Noise (FVIN) has the property that a pixel corrupted by it takes either the highest (255) or the lowest (0) intensity value of an 8-bit grayscale image, which degrades the edge sharpness and textural information of the image.
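
As a rough illustration of this noise model, the following sketch corrupts an 8-bit grayscale image with fixed valued impulse noise of unequal density using NumPy. The function name add_fvin and the default probabilities p_salt and p_pepper are illustrative assumptions, not values taken from the chapter.

import numpy as np

def add_fvin(image, p_salt=0.15, p_pepper=0.10, seed=None):
    # Corrupt an 8-bit grayscale image with fixed valued impulse noise of
    # unequal density: salt pixels take the value 255, pepper pixels take 0.
    rng = np.random.default_rng(seed)
    noisy = image.copy()
    r = rng.random(image.shape)
    noisy[r < p_pepper] = 0                                  # pepper (lowest intensity)
    noisy[(r >= p_pepper) & (r < p_pepper + p_salt)] = 255   # salt (highest intensity)
    return noisy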

In this chapter, the impulse noise model is considered to be fixed valued with an unequal probability distribution, and the restoration of LDCT images is performed by the proposed Optimum Decision Based Filter (ODBF). Detection of pixels corrupted with FVIN is performed by k-means clustering (Kanungo et al., 2014). The k-means clustering clearly discriminates noise-free pixels from noisy pixels by splitting the corrupted LDCT image into three clusters of high (salt), low (pepper), and medium (noise-free) intensities. Once the noise-free pixels are correctly identified, they remain unaltered. The proposed ODBF employs the Mask Else Trimmed Median (METM) estimation technique, followed by optional Adaptive Mask Sizing, to restore the noisy pixels over a wide range of Noise Density (ND). The adaptive mask sizing option is invoked when METM estimation fails to suppress noise of very high density. The proposed filter can be applied iteratively to remove heavy noise.
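
The sketch below approximates the two stages described above, assuming scikit-learn's KMeans for the detection stage. In place of the chapter's exact METM estimation, the restoration step simply replaces each flagged pixel with the median of the noise-free pixels inside a square mask that grows until such pixels are found, which mimics the adaptive mask sizing idea; function names and the mask-growth limit are illustrative assumptions.

import numpy as np
from sklearn.cluster import KMeans

def detect_noisy_pixels(noisy):
    # Detection stage: cluster pixel intensities into three groups and flag
    # the low (pepper) and high (salt) clusters as noisy.
    km = KMeans(n_clusters=3, n_init=10, random_state=0)
    labels = km.fit_predict(noisy.reshape(-1, 1).astype(np.float64)).reshape(noisy.shape)
    low, _, high = np.argsort(km.cluster_centers_.ravel())
    return (labels == low) | (labels == high)

def restore_noisy_pixels(noisy, noisy_mask, max_half_width=4):
    # Restoration stage (approximation of METM with adaptive mask sizing):
    # replace each flagged pixel with the median of the noise-free pixels in
    # a square mask, growing the mask (3x3, 5x5, ...) until one is available.
    restored = noisy.copy()
    for r, c in zip(*np.where(noisy_mask)):
        for half in range(1, max_half_width + 1):
            r0, r1 = max(r - half, 0), min(r + half + 1, noisy.shape[0])
            c0, c1 = max(c - half, 0), min(c + half + 1, noisy.shape[1])
            good = noisy[r0:r1, c0:c1][~noisy_mask[r0:r1, c0:c1]]
            if good.size:
                restored[r, c] = np.median(good)
                break
    return restored

A full pass would chain detect_noisy_pixels and restore_noisy_pixels on the corrupted image and, as noted above, could be repeated iteratively for very heavy noise.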

The performance of the proposed method for FVIN removal from LDCT images is evaluated by comparing it with other existing methods, namely the Standard Median Filter (SMF), Adaptive Median Filter (AMF), Decision Based Algorithm (DBA), Modified Decision Based Unsymmetric Trimmed Median Filter (MDBUTMF), and Decision Based Trimmed Median Filter (DBTMF). It is evident from the experimental results that the proposed ODBF restores LDCT images significantly better than the other methods in terms of visual quality and image quality assessment metrics such as Mean Square Error (MSE), Peak Signal to Noise Ratio (PSNR), Structural Similarity Index Measurement (SSIM), and Image Enhancement Factor (IEF).
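
A minimal sketch of these assessment metrics, assuming 8-bit images, NumPy, and scikit-image's structural_similarity. IEF is computed here as the ratio of the noise energy before filtering to the residual error after filtering, which is the commonly used definition and may differ in detail from the chapter's implementation.

import numpy as np
from skimage.metrics import structural_similarity

def quality_metrics(original, noisy, restored):
    # Image quality assessment for an 8-bit original/noisy/restored triple.
    original = original.astype(np.float64)
    noisy = noisy.astype(np.float64)
    restored = restored.astype(np.float64)
    mse = np.mean((original - restored) ** 2)
    psnr = 10.0 * np.log10(255.0 ** 2 / mse)                    # peak value 255 for 8-bit images
    ssim = structural_similarity(original, restored, data_range=255)
    ief = np.sum((noisy - original) ** 2) / np.sum((restored - original) ** 2)
    return {"MSE": mse, "PSNR": psnr, "SSIM": ssim, "IEF": ief}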
