Biometric: Authentication and Service to Cloud

Ajay Rawat, Shivani Gambhir
DOI: 10.4018/978-1-4666-6559-0.ch012

Abstract

Cloud computing gives clients little control over the physical and logical aspects of the system, which imposes profound changes on security and privacy procedures; hence, it needs a high level of security. Currently, much research and development is being carried out to provide clients with service-level agreements that address security issues. Researchers are drawn to biometrics and its security applications, since authentication based on biometric traits provides a high level of security. Given the benefits of biometrics and the advantages of the cloud, the collaboration of cloud and biometrics has opened up wide areas in this field. This chapter discusses some case studies of the integration of biometrics and cloud computing.
Chapter Preview

Biometrics and Its Work Process

The word ‘biometrics’ is derived from the Greek words ‘bio’, meaning life, and ‘metrikos’, meaning measure. Thus, the identification of humans through their characteristics and traits is referred to as biometrics. It is used in areas where an individual must be authenticated or individuals in a group must be supervised. Every individual has unique biometric characteristics that cannot be forgotten, stolen, or lost, whereas in token-based or knowledge-based security mechanisms there is a chance that the credential can be lost or stolen. The most commonly used biometric traits for authentication and recognition are faces, fingerprints, irises, palm-prints, and speech.

A biometric system is essentially a pattern recognition system. Regardless of the design used to deploy it, a biometric system contains four basic components (a minimal sketch of these modules follows the list).

  • Sensor Module: A data-acquisition module (or sensor) that captures images and/or video sequences of an individual who is either registering with the biometric system or using it for verification/identification purposes.

  • Feature Module: A template-generation module that derives a biometric template pattern from the input data using machine learning, computer vision, and pattern recognition techniques.

  • System Database Module: A repository of the registered/enrolled biometric patterns of users.

  • Matcher Module: A matching module that compares the biometric pattern of the ’live’ image of the user to the corresponding biometric patterns stored in the System Database Module. Based on the matching results, it makes a decision about the identity of the user currently presented to the system.
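
The following is a minimal sketch of how these four modules might fit together, assuming templates are fixed-length NumPy vectors and matching uses Euclidean distance; all module names, the normalization step, and the threshold are illustrative assumptions, not details from the chapter.

```python
import numpy as np

# Illustrative sketch only: names and parameters are assumptions,
# not the chapter's terminology for any concrete implementation.

def sensor_module():
    """Stand-in for a camera or scanner: returns raw sample data."""
    return np.random.default_rng(0).normal(size=256)

def feature_module(raw):
    """Reduce raw data to a compact template (here: simple normalization)."""
    return (raw - raw.mean()) / raw.std()

database = {}  # System Database Module: user id -> enrolled template

def enroll(user_id):
    database[user_id] = feature_module(sensor_module())

def matcher_module(live, template, threshold=5.0):
    """Accept if the live template is close enough to the stored one."""
    return np.linalg.norm(live - template) <= threshold

enroll("alice")
live_template = feature_module(sensor_module())
print(matcher_module(live_template, database["alice"]))  # True (same sensor output)
```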

A biometric system provides two functions: identification and authentication/verification.

In the identification process, the system identifies an individual by matching the captured sample against all the templates available in the biometric database. While performing this operation it makes one-to-many comparisons to ascertain the identity of an unknown individual. If the biometric sample matches a template in the database, identification of the individual succeeds; the system fails if the individual is not enrolled in the database. Identification is a significant element in both 'positive recognition’ and ‘negative recognition’. In positive recognition, the user does not have to provide any information about the template to claim an identity; negative recognition is used to prevent a single user from claiming multiple identities. Traditional methods such as PINs, keys, and tokens can be used in the former approach, but the latter can only be achieved through biometrics, since other methods of personal recognition such as passwords, PINs, or keys are ineffective for it.
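
As a rough illustration of the one-to-many comparison described above, the sketch below searches every enrolled template for the closest match and fails when even the best score misses a threshold; the distance metric and threshold value are assumptions for illustration.

```python
import numpy as np

def identify(probe, database, threshold=5.0):
    """One-to-many search: compare the probe against every enrolled template."""
    best_id, best_dist = None, float("inf")
    for user_id, template in database.items():
        dist = np.linalg.norm(probe - template)  # smaller distance = closer match
        if dist < best_dist:
            best_id, best_dist = user_id, dist
    # Even the best match must clear the threshold; otherwise the user
    # is treated as not enrolled and identification fails.
    return best_id if best_dist <= threshold else None

templates = {"alice": np.zeros(4), "bob": np.ones(4) * 3}
print(identify(np.zeros(4) + 0.1, templates))   # 'alice'
print(identify(np.ones(4) * 10, templates))     # None: not enrolled
```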

In the verification process, the system compares a person's captured biometric sample with the registered templates stored in a biometric database. Person verification involves three steps (a sketch of the threshold selection in Step 2 follows the list).

  • Step 1: Biometric templates are captured using sensing devices that generate reference models for all users, and these models are stored in a model database.

  • Step 2: Some samples are matched against the reference models to generate genuine and impostor scores, from which a decision threshold is calculated.

  • Step 3: Biometric template testing is accomplished. This step determines which stored template should be used for comparison, based on a claimed identity such as a smart card, username, or ID number (e.g., a PIN). 'Positive recognition' is a common use of the verification mode, “where the aim is to prevent multiple people from using the same identity”.
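
The sketch below illustrates the threshold calculation of Step 2 on synthetic genuine and impostor distance scores, choosing the threshold that minimizes total errors; the score distributions and the error criterion are illustrative assumptions, not the chapter's procedure.

```python
import numpy as np

# Synthetic genuine/impostor distance scores (assumed distributions).
rng = np.random.default_rng(0)
genuine = rng.normal(2.0, 0.5, 200)    # same-user comparison distances
impostor = rng.normal(6.0, 1.0, 200)   # different-user comparison distances

# Pick the threshold minimizing total errors (false rejects + false accepts).
candidates = np.linspace(0.0, 10.0, 1001)
errors = [(genuine > t).sum() + (impostor <= t).sum() for t in candidates]
threshold = candidates[int(np.argmin(errors))]

def verify(distance):
    """One-to-one check of a claimed identity against its stored template."""
    return distance <= threshold

print(round(float(threshold), 2), verify(1.8), verify(7.2))  # accepts 1.8, rejects 7.2
```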

Key Terms in this Chapter

Homomorphic Encryption Scheme: A form of encryption that allows specific types of computations to be carried out on ciphertext and generates an encrypted result which, when decrypted, matches the result of the same operations performed on the plaintext.

VQ (Vector Quantization): A classical quantization technique from signal processing which allows the modeling of probability density functions by the distribution of prototype vectors.

Paillier Algorithm: The Paillier cryptosystem, invented by and named after Pascal Paillier in 1999, is a probabilistic asymmetric algorithm for public-key cryptography. The problem of computing n-th residue classes is believed to be computationally difficult; the decisional composite residuosity assumption is the intractability hypothesis upon which this cryptosystem is based. The scheme is an additively homomorphic cryptosystem; this means that, given only the public key and the encryptions of m1 and m2, one can compute an encryption of m1 + m2.
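
To make the additive property concrete, here is a toy Paillier round trip with deliberately small primes (insecure, purely illustrative; requires Python 3.8+ for the modular-inverse form of pow): multiplying two ciphertexts modulo n^2 yields an encryption of the sum of the plaintexts.

```python
import math
import random

# Toy Paillier parameters: small primes for demonstration, NOT secure.
p, q = 1117, 1123
n = p * q
n2 = n * n
g = n + 1                                            # standard generator choice
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)    # lcm(p-1, q-1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)   # modular inverse (Python 3.8+)

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

m1, m2 = 41, 58
c1, c2 = encrypt(m1), encrypt(m2)
# Additive homomorphism: multiplying ciphertexts adds the plaintexts.
assert decrypt((c1 * c2) % n2) == (m1 + m2) % n
print(decrypt((c1 * c2) % n2))   # 99
```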

Linde-Buzo-Gray (LBG): The Linde–Buzo–Gray algorithm (introduced by Yoseph Linde, Andrés Buzo and Robert M. Gray in 1980) is a vector quantization algorithm to derive a good codebook.
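
A minimal sketch of the LBG splitting-and-refinement idea, assuming Euclidean distortion and a power-of-two codebook size; parameter values are illustrative.

```python
import numpy as np

def lbg(vectors, codebook_size, eps=0.01, n_iter=20):
    """Derive a codebook by iterative splitting plus Lloyd (k-means) refinement.

    codebook_size should be a power of two for this splitting scheme.
    """
    codebook = vectors.mean(axis=0, keepdims=True)   # start with one centroid
    while len(codebook) < codebook_size:
        # Split every code vector into a perturbed pair.
        codebook = np.vstack([codebook * (1 + eps), codebook * (1 - eps)])
        for _ in range(n_iter):                      # refine with Lloyd iterations
            dists = np.linalg.norm(vectors[:, None] - codebook[None], axis=2)
            nearest = dists.argmin(axis=1)
            for i in range(len(codebook)):
                members = vectors[nearest == i]
                if len(members):
                    codebook[i] = members.mean(axis=0)
    return codebook

data = np.random.default_rng(1).normal(size=(500, 2))
print(lbg(data, 8).shape)   # (8, 2): an 8-entry codebook
```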

Mel-Frequency Cepstral Coefficients: In sound processing, the mel-frequency cepstrum (MFC) is a representation of the short-term power spectrum of a sound, based on a linear cosine transform of a log power spectrum on a nonlinear mel scale of frequency. Mel-frequency cepstral coefficients (MFCCs) are coefficients that collectively make up an MFC.
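
For example, MFCCs can be extracted in a few lines with the librosa library (the audio file name below is hypothetical):

```python
import librosa

# "speech.wav" is a hypothetical input file.
y, sr = librosa.load("speech.wav")                  # waveform and sample rate
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)  # 13 coefficients per frame
print(mfcc.shape)                                   # (13, number_of_frames)
```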

Principal Component Analysis (PCA): A statistical procedure that uses orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components.
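
A compact NumPy sketch of PCA via eigendecomposition of the covariance matrix; the function name and the random data are illustrative.

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                   # center each variable
    cov = np.cov(Xc, rowvar=False)            # covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)    # symmetric eigendecomposition
    order = np.argsort(eigvals)[::-1][:k]     # top-k by variance explained
    return Xc @ eigvecs[:, order]

X = np.random.default_rng(0).normal(size=(100, 5))
print(pca(X, 2).shape)   # (100, 2)
```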

ElGamal: In cryptography, the ElGamal encryption system is an asymmetric key encryption algorithm for public-key cryptography which is based on the Diffie–Hellman key exchange.
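
A toy ElGamal round trip over a small prime field, showing its Diffie–Hellman structure (parameters are insecure and purely illustrative):

```python
import random

# Toy parameters: a small prime, NOT secure in practice.
p = 7919
g = 2
x = random.randrange(2, p - 1)   # private key
h = pow(g, x, p)                 # public key: h = g^x mod p

def encrypt(m):
    k = random.randrange(2, p - 1)              # ephemeral Diffie-Hellman exponent
    return pow(g, k, p), (m * pow(h, k, p)) % p

def decrypt(c1, c2):
    s = pow(c1, x, p)                           # shared secret g^(xk)
    return (c2 * pow(s, -1, p)) % p             # Python 3.8+ modular inverse

m = 4242
print(decrypt(*encrypt(m)))   # 4242
```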

Codebook: A codebook is a type of document used for gathering and storing codes. Originally codebooks were often literally books, but today codebook is a byword for the complete record of a series of codes, regardless of physical format.
