An Efficient Handwritten Character Recognition Using Quantum Multilayer Neural Network (QMLNN) Architecture: Quantum Multilayer Neural Network

Debanjan Konar, Suman Kalyan Kar
Copyright: © 2018 | Pages: 15
DOI: 10.4018/978-1-5225-5219-2.ch008

Abstract

This chapter proposes a quantum multi-layer neural network (QMLNN) architecture suitable for real-time handwritten character recognition, assisted by quantum backpropagation of errors calculated from a quantum-inspired fuzziness measure of the network output states. The architecture comprises three interconnected layers of neurons represented by qubits, namely the input, hidden, and output layers, connected through a second-order neighborhood topology. The QMLNN is a feed-forward network whose weighted interconnections are adjusted by a standard quantum backpropagation algorithm, and the interconnection weights are described using rotation gates. QMLNN self-organizes the quantum fuzzy input image information by backpropagating the quantum errors at the intermediate and output layers of the architecture. After the network stabilizes, a quantum observation at the output layer destroys the superposition of quantum states in order to obtain true binary outputs.
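To make the described architecture concrete, the following is a minimal quantum-inspired sketch in Python. It assumes that each qubit neuron is encoded by a single phase angle, that interconnection weights act as rotation-gate angles, and that a quantum observation collapses each output qubit to a binary value with probability given by the squared sine of its phase. The function names (encode_pixel, layer_forward, measure) and the aggregation rule are illustrative assumptions rather than the chapter's exact formulation.

# Minimal quantum-inspired neuron sketch (illustrative only; the chapter's exact
# QMLNN equations may differ). A qubit state |psi> = cos(t)|0> + sin(t)|1> is
# encoded by a single phase angle t, and each interconnection weight is a
# rotation-gate angle applied to the incoming phase.

import numpy as np

def encode_pixel(x):
    """Map a normalized pixel value x in [0, 1] to a qubit phase in [0, pi/2]."""
    return (np.pi / 2.0) * x

def layer_forward(phases, rotation_weights):
    """Aggregate incoming qubit phases through rotation-gate weights.

    phases           : (n_in,)        qubit phases of the previous layer
    rotation_weights : (n_out, n_in)  rotation angles (trainable parameters)
    Returns the qubit phases of the current layer.
    """
    # Each neuron sums rotated amplitudes of its inputs (an assumed stand-in
    # for the chapter's second-order neighborhood aggregation).
    real = np.cos(rotation_weights) @ np.cos(phases)
    imag = np.sin(rotation_weights) @ np.sin(phases)
    return np.arctan2(imag, real)  # resulting phase per output neuron

def measure(phases, rng=None):
    """Quantum observation: collapse each output qubit to a binary value,
    with P(|1>) = sin^2(phase)."""
    rng = rng or np.random.default_rng(0)
    p_one = np.sin(phases) ** 2
    return (rng.random(p_one.shape) < p_one).astype(int)

# Toy usage: a 4-pixel input, 3 hidden neurons, 2 output neurons.
x = np.array([0.0, 0.2, 0.8, 1.0])
w_hidden = np.random.default_rng(1).uniform(0, np.pi / 2, (3, 4))
w_output = np.random.default_rng(2).uniform(0, np.pi / 2, (2, 3))
hidden = layer_forward(encode_pixel(x), w_hidden)
output = layer_forward(hidden, w_output)
print(measure(output))  # true binary outputs after observation

In the full QMLNN, the rotation angles would additionally be updated by quantum backpropagation of the fuzziness-based error; that training step is omitted from this sketch.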
Chapter Preview

Introduction

Owing to wide variations in writing styles and in the size and orientation of handwritten characters, character recognition remains an uphill task for the computer vision and pattern recognition community. Numerous image processing applications rely on techniques for identifying and recognizing characters in real-life text documents and images. The primary objective of handwritten character recognition is to convert the characters present in an image into character codes suitable for text and image processing. Artificial neural networks (ANNs) are often used to solve unstructured machine learning problems, such as associative pattern recognition and image processing tasks, in a parallel processing mode. Basic feed-forward ANNs are employed by many computer vision researchers to solve pattern recognition problems, albeit at a high time complexity. The character recognition problem can be addressed using various feature selection techniques and neural network classifiers; the significant contribution of feed-forward ANNs trained with back-propagation to character recognition deserves special mention (Devireddy, 2005). Bayesian network classifiers (Bouchain, 2007; Bonci et al, 2006) are among the most suitable probabilistic approaches for character recognition. In handwritten character recognition, high recognition accuracy can be obtained using the back-propagation learning algorithm in multilayer neural network architectures.
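As a classical point of reference for the quantum architecture discussed later, the following is a minimal sketch of a feed-forward network trained with back-propagation on a flattened character image. The layer sizes, sigmoid activation, and squared-error loss are assumptions chosen for illustration and are not taken from the chapter or the cited studies.

# A minimal classical feed-forward network with backpropagation, of the kind the
# cited character-recognition studies build on (illustrative sketch only).

import numpy as np

rng = np.random.default_rng(42)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class MLP:
    def __init__(self, n_in=64, n_hidden=32, n_out=10, lr=0.1):
        self.w1 = rng.normal(0, 0.1, (n_hidden, n_in))
        self.w2 = rng.normal(0, 0.1, (n_out, n_hidden))
        self.lr = lr

    def forward(self, x):
        self.h = sigmoid(self.w1 @ x)       # hidden activations
        self.y = sigmoid(self.w2 @ self.h)  # output activations (class scores)
        return self.y

    def backward(self, x, target):
        # Backpropagate the squared error through both layers.
        delta_out = (self.y - target) * self.y * (1 - self.y)
        delta_hid = (self.w2.T @ delta_out) * self.h * (1 - self.h)
        self.w2 -= self.lr * np.outer(delta_out, self.h)
        self.w1 -= self.lr * np.outer(delta_hid, x)

# Toy usage: one 8x8 character image flattened to 64 features, 10 classes.
x = rng.random(64)
t = np.zeros(10); t[3] = 1.0  # one-hot target for class '3'
net = MLP()
for _ in range(100):
    net.forward(x)
    net.backward(x, t)
print(np.argmax(net.forward(x)))  # typically converges to class 3 on this example

Such a network recognizes characters by mapping pixel features to class scores, but the repeated weight adjustments required over large training sets are exactly what drives up the time complexity noted above.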

A hidden Markov model (HMM) based approach proposed by Kundu and Chen (2002) achieved 88% recognition accuracy on 100 postal words. Tomoyuki et al. (2002) achieved 80% recognition accuracy in experiments on a data set of 1646 European city names. A K-NN classifier employed by Gatos et al. (2006) to recognize 3799 words from the IAM database yields 81% accuracy. A plethora of supervised artificial neural networks (Samadiani et al, 2005; Chi et al, 1995) have been suggested to obtain real-time results. In addition, numerous neighborhood-based supervised neural network architectures have been applied to pattern recognition and found efficient at recognizing handwritten characters. However, owing to the interconnection weight adjustments performed by the standard back-propagation algorithm, the time complexity of these supervised architectures increases manifold. Efforts have therefore been made to combine quantum computing with the standard back-propagation algorithm, resulting in time-efficient network architectures.

Micro-quantum level effects allow computational tasks to be performed by time-efficient procedures in quantum computing, which outperforms classical computing approaches in terms of computational time (Mcmohan, 2008). The popularity of artificial neural networks combined with quantum computing is growing by leaps and bounds owing to the inherent parallelism offered by quantum computing. A quantum-dot-array-assisted quantum neural network (QNN) architecture was proposed by Behrman et al. (1994). Matsui et al. (2000) also proposed a quantum multilayer feed-forward neural network model, referred to as QNN, based on a quantum learning technique. Quantum associative memory (Ventura et al, 2000; Perus, 1998) and quantum-dot neural networks (Behrman, 1994) are the basic components of QNN research. An automated pattern recognition algorithm guided by the principles of quantum mechanics was proposed by Aytekin et al. (2013). A novel QNN model was also suggested by Ezhov (2001) to solve classification problems. Moreover, a quantum back-propagation based neural network architecture was introduced in (2013) to address pattern recognition tasks.
