An Image De-Noising Method Based on Intensity Histogram Equalization Technique for Image Enhancement

Shantharajah S. P. (VIT University, India), Ramkumar T (VIT University, India) and Balakrishnan N (Sona College of Technology, India)
Copyright: © 2018 |Pages: 13
DOI: 10.4018/978-1-5225-5204-8.ch012


Image enhancement sharpens images and improves their quality; many enhancement techniques are empirical and rely on interactive procedures to obtain precise results. The proposed Intensity Histogram Equalization (IHE) approach overcomes noise defects: a preprocessor removes noise, and the subsequent equalization stage enhances image contrast, improving the intensity distribution of the image. The preprocessor comprises mask production, illumination equalization, and color normalization, which respectively generate a binary image by labeling pixels, compensate for non-uniform illumination, and classify color capacity. A distinct and discrete mapping function then computes the histogram values and improves the contrast of the image. The performance of IHE is evaluated in terms of noise removal ratio, reliability rate, false positive error measure, and max-flow computational complexity, and compared against NDRA and Variation HOD. As the outcome, contrast at different levels is significantly improved relative to the existing systems.
Chapter Preview


In image processing applications, contrast enhancement plays a vital role in achieving good image quality. Today, image processing dominates fields such as digital photography, remote sensing, and LED- and LCD-display-based imaging. A poor-quality image fails to yield good results when image operations are performed on it, and the same holds for imaging devices. In many image and video applications, these devices serve as human eyes that determine the final visual quality, and viewers usually correlate high image contrast with good image quality. Certainly, growth in image display technology and in generated knowledge motivates further improvement of image enhancement methods.

A raw image has lower contrast than the ideal one because of poor illumination conditions, low-quality inexpensive imaging sensors, user operation errors, and media deterioration. Contrast improvement is therefore always a focus, both for enhanced human analysis of image semantics and for higher perceptual quality. Broadly, contrast enhancement methods fall into two categories: context-sensitive (local, neighborhood-based operators) and context-free (point operators). In the context-sensitive approach, contrast is defined by the rate of change in intensity between adjacent pixels, and it is raised directly by changing the local waveform on a pixel-by-pixel basis.

Context-free contrast enhancement methods do not change the local waveform on a pixel-by-pixel basis. Instead, this class of methods adopts a statistical approach: they manipulate the histogram of the input image to separate the gray levels of higher probability from the adjacent gray levels, thereby increasing the average difference between any two remapped input gray levels. Unlike its context-sensitive counterpart, a context-free method does not suffer from the ringing artifact, and it maintains the relative ordering of the gray levels.
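A point operator in this context-free sense remaps every pixel independently of its neighbors through a single monotone function, which is why the relative ordering of gray levels is preserved. As a minimal illustration (a NumPy sketch, not the chapter's method), a linear min-max contrast stretch expands the occupied gray-level range while keeping that ordering intact:

```python
import numpy as np

def stretch(img, out_max=255):
    """Context-free point operator: linear min-max contrast stretch.

    Every pixel passes through the same monotone mapping, so the
    relative ordering of gray levels is preserved.
    """
    lo, hi = int(img.min()), int(img.max())
    if hi == lo:                      # flat image: nothing to stretch
        return np.zeros_like(img)
    scaled = (img.astype(np.float64) - lo) * out_max / (hi - lo)
    return np.round(scaled).astype(np.uint8)

# A low-contrast patch occupying [50, 200] expands to the full [0, 255] range.
patch = np.array([[50, 100], [150, 200]], dtype=np.uint8)
stretched = stretch(patch)
```

Because the mapping is monotone, no two gray levels swap places; the stretch only widens the gaps between them.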

Minimization of Total Variation (TV) has emerged as an approach to image de-noising. A correspondence between the TV minimization problem and binary MRF models has been discovered, and a combinatorial optimization algorithm has been proposed for the TV minimization problem in the discrete setting using graph cuts. Image de-noising is the problem of recovering a true image from an observed noisy image, and the variational approach is a significant method for solving it when the image is defined on a continuous domain. Bae et al. (2011) describe Euler's elastica model, a higher-order model of essential importance that minimizes the curvature of all level lines in the image. Traditional numerical techniques for minimizing the energy in such higher-order models are difficult, so they propose an efficient graph-cut-based minimization algorithm for the Euler's elastica model, which reduces the problem to solving a sequence of simple graph-cut subproblems. The sequence is linked to the gradient flow of the energy function and converges to a minimum point.

Histogram Equalization (HE) is a method for enhancing the contrast of images. HE is easy to use and yields better results than many other methods. Histogram equalization works on a digital image by remapping its gray levels according to the probability distribution of the input gray levels.

Histogram Equalization increases the contrast of an image by expanding the dynamic range of intensities assigned to the pixels with the most probable intensity values. One transformation function that accomplishes this is the cumulative distribution function. The histogram equalization transformation is scaled so that the least intense value in the original image is mapped to zero intensity in the equalized image, and the most intense value is mapped to the maximum intensity permitted by the bit depth of the image. This produces results whose dynamic range is slightly larger than that produced by the histogram equalization algorithm described by Gonzalez and Woods (2009). HE methods can be grouped into two standard forms, global and local histogram equalization. Global Histogram Equalization (GHE) operates on the histogram information of the entire input image.
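The CDF-based remapping described above can be sketched in a few lines (a NumPy sketch under the usual 8-bit assumptions, not code from the chapter). The lookup table is the normalized cumulative histogram, rescaled so that the lowest occupied gray level maps to 0 and the highest to 255:

```python
import numpy as np

def histogram_equalize(img, levels=256):
    """Remap gray levels through the normalized CDF of the image histogram.

    img: 2-D uint8 array; returns the equalized uint8 array.
    """
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = np.cumsum(hist) / img.size               # normalized cumulative histogram
    cdf_min = cdf[np.nonzero(hist)[0][0]]          # CDF at the lowest occupied level
    # Scale so the lowest occupied level maps to 0, the highest to levels - 1.
    lut = np.round((cdf - cdf_min) / (1.0 - cdf_min + 1e-12) * (levels - 1))
    lut = np.clip(lut, 0, levels - 1).astype(np.uint8)
    return lut[img]

# A low-contrast ramp confined to [100, 150] spreads toward the full range.
low = np.tile(np.linspace(100, 150, 64).astype(np.uint8), (64, 1))
equalized = histogram_equalize(low)
```

Because the CDF is non-decreasing, the lookup table is monotone, so the relative ordering of gray levels is preserved while frequent levels are pushed farther apart.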
