New Artificial Neural Network Models for Bio Medical Image Compression

G. Vimala Kumari, G. Sasibhushana Rao, B. Prabhakara Rao
Copyright: © 2019 |Pages: 21
DOI: 10.4018/IJAMC.2019100106

Abstract

This article presents an image compression method based on feed-forward back-propagation neural networks (NNs). Marked progress has been made in image compression over the last decade. Image compression, which removes redundant information from image data, offers a solution to the storage and transmission problems posed by huge amounts of data. NNs offer a novel approach to image compression through their ability to generate an internal, compressed representation of the data. A comparison among various feed-forward back-propagation training algorithms is presented for different compression ratios and different block sizes. Two learning methods, the Levenberg-Marquardt (LM) algorithm and Gradient Descent (GD), are used to train the network architecture, and the performance is evaluated in terms of MSE and PSNR on medical images. The decompressed results obtained with the two algorithms are compared in terms of PSNR and MSE, along with performance plots and regression plots, from which it can be observed that the LM algorithm gives more accurate results than the GD algorithm.
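The evaluation metrics named in the abstract, MSE and PSNR, have standard definitions; a minimal sketch of how they would be computed for 8-bit images (the peak value of 255 is an assumption, not stated in the article) is:

```python
import numpy as np

def mse(original, decompressed):
    # Mean squared error between the original and decompressed images
    diff = original.astype(np.float64) - decompressed.astype(np.float64)
    return np.mean(diff ** 2)

def psnr(original, decompressed, max_val=255.0):
    # Peak signal-to-noise ratio in dB, assuming 8-bit images (MAX = 255)
    err = mse(original, decompressed)
    if err == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10((max_val ** 2) / err)
```

A higher PSNR (equivalently, a lower MSE) indicates that the decompressed image is closer to the original, which is how the article compares the LM- and GD-trained networks.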
Article Preview

1. Introduction

Artificial neural networks (ANNs) are models of the biological neuron system and thus draw on the abilities of the human brain. Because its architecture derives from the way the brain functions, a neural network is a highly interconnected network of a large number of neurons, which are simple processing elements. ANNs summarize and model some of the functional aspects of the human brain in an effort to acquire some of its computational strengths. A NN consists of eight components: neurons, signal function, activation state vector, activity aggregation rule, pattern of connectivity, learning rule, activation rule, and environment. Recently, ANNs have been applied in areas where high rates of computation are essential, and they are considered probable solutions to the problem of image compression. Generally, two different approaches have been put forward for enhancing the performance of compression methods: first, improving the design of existing compression methods by using ANN technology; second, applying neural networks to develop new compression methods. The backpropagation algorithm is one of the most extensively used learning algorithms in ANNs. With its generalization ability and high accuracy, the feedforward neural network architecture is capable of approximating most problems. This architecture is based on the error-correction learning rule. Error propagation comprises two passes through the different layers of the network: a forward pass and a backward pass. In the forward pass, the effect of an input vector applied to the sensory nodes of the network transmits through the network layer by layer, finally producing a set of outputs as the actual response of the network. The synaptic weights of the network are fixed during the forward pass and are adjusted according to the error-correction rule during the backward pass.
The error signal is produced by subtracting the actual output of the network from the expected output. This error signal is then propagated backward through the network, against the direction of the synaptic connections. The synaptic weights are adjusted until the actual output of the network comes close to the expected output. The backpropagation neural network is thus essentially a network of simple processing elements working together to produce a complex output. Based on this principle, image compression and decompression can be achieved.
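The forward pass, error signal, and weight adjustment described above can be sketched as a small feedforward network trained with gradient descent to reproduce its own input, so that the narrower hidden layer acts as the compressed representation. All specifics here (an 8x8 block of 64 pixels compressed to 16 hidden units, the learning rate, the toy training data) are illustrative assumptions, not the article's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes (illustrative): 64 input pixels (an 8x8 block) map to
# 16 hidden units whose activations form the compressed code (4:1).
n_in, n_hid = 64, 16
W1 = rng.normal(0.0, 0.1, (n_in, n_hid))  # encoder weights (input -> hidden)
W2 = rng.normal(0.0, 0.1, (n_hid, n_in))  # decoder weights (hidden -> output)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy training set: 200 uniform-intensity blocks scaled to [0, 1],
# which a 16-unit code can reconstruct easily.
X = np.repeat(rng.random((200, 1)), n_in, axis=1)

mse_before = np.mean((sigmoid(sigmoid(X @ W1) @ W2) - X) ** 2)

lr = 0.5
for _ in range(2000):
    # Forward pass: weights stay fixed while the input propagates
    # layer by layer to produce the actual output.
    H = sigmoid(X @ W1)   # compressed representation
    Y = sigmoid(H @ W2)   # reconstructed block

    # Error signal: actual output minus expected output (the input itself).
    E = Y - X

    # Backward pass: the error propagates back against the synaptic
    # connections and the weights are adjusted by gradient descent.
    dY = E * Y * (1.0 - Y)
    dH = (dY @ W2.T) * H * (1.0 - H)
    W2 -= lr * (H.T @ dY) / len(X)
    W1 -= lr * (X.T @ dH) / len(X)

mse_after = np.mean((sigmoid(sigmoid(X @ W1) @ W2) - X) ** 2)
```

To compress an image with such a network, each block is fed through the encoder and only the hidden activations are stored or transmitted; the decoder reconstructs the block on the receiving side. The article's LM training would replace the plain gradient-descent update shown here.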
