Learning Normal Maps for Robust 3D Face Recognition from Kinect Data

Ahmed Yassine Boumedine, Samia Bentaieb, Abdelaziz Ouamri
Copyright: © 2022 | Volume: 13 | Issue: 2 | Pages: 11
ISSN: 1942-3594 | EISSN: 1942-3608 | EISBN13: 9781683181149 | DOI: 10.4018/ijaec.314616
Cite Article

MLA

Boumedine, Ahmed Yassine, et al. "Learning Normal Maps for Robust 3D Face Recognition from Kinect Data." IJAEC, vol. 13, no. 2, 2022, pp. 1-11. http://doi.org/10.4018/ijaec.314616

APA

Boumedine, A. Y., Bentaieb, S., & Ouamri, A. (2022). Learning Normal Maps for Robust 3D Face Recognition from Kinect Data. International Journal of Applied Evolutionary Computation (IJAEC), 13(2), 1-11. http://doi.org/10.4018/ijaec.314616

Chicago

Boumedine, Ahmed Yassine, Samia Bentaieb, and Abdelaziz Ouamri. "Learning Normal Maps for Robust 3D Face Recognition from Kinect Data." International Journal of Applied Evolutionary Computation (IJAEC) 13, no. 2 (2022): 1-11. http://doi.org/10.4018/ijaec.314616

Abstract

Face recognition from 3D scans can be achieved by many approaches, but most rely on high-quality depth sensors. In this paper, the authors use normal maps computed from Kinect depth data to investigate the usefulness of data augmentation and signal-level fusion for recognition with a low-quality sensor. In this face recognition process, the authors first preprocess the captured 3D scan of each person by cropping the face and reducing the noise; surface normals are then computed and separated into three maps: Nx, Ny, and Nz. The authors combine the three normal maps to form an RGB image, and these images are used to train a convolutional neural network. They investigate which ordering of the three components yields the best accuracy and compare the results with those previously obtained on the CurtinFaces and KinectFaceDB databases, achieving rank-one identification rates of 94.04% and 91.35%, respectively.
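To make the normal-map step concrete, below is a minimal Python/NumPy sketch of the idea described in the abstract: normals are estimated from a depth image by finite differences (treating depth as a height field; the paper's exact estimator, preprocessing, and any back-projection through the Kinect intrinsics are not specified here), and the Nx, Ny, Nz components are packed into an 8-bit RGB image in a configurable channel order, mirroring the component-ordering experiment. The function names and the gradient-based estimator are illustrative assumptions, not the authors' implementation.

import numpy as np

def depth_to_normals(depth):
    # Treat the depth image as a height field z(x, y) and estimate
    # per-pixel normals by finite differences: n = (-dz/dx, -dz/dy, 1),
    # then normalize. Assumption: this is a simplified stand-in for
    # whatever estimator the paper actually uses.
    dzdy, dzdx = np.gradient(depth.astype(np.float32))
    nx, ny = -dzdx, -dzdy
    nz = np.ones_like(nx)
    n = np.stack([nx, ny, nz], axis=-1)                # shape (H, W, 3)
    n /= np.linalg.norm(n, axis=-1, keepdims=True) + 1e-8
    return n                                           # components in [-1, 1]

def normals_to_rgb(n, order=(0, 1, 2)):
    # Pack (Nx, Ny, Nz) into one 8-bit RGB image, in a chosen channel
    # order, e.g. (2, 0, 1), to test how ordering affects accuracy.
    packed = n[..., list(order)]
    return np.round((packed + 1.0) * 127.5).astype(np.uint8)

# Example: a synthetic 128x128 depth patch fed through both steps.
depth = np.fromfunction(lambda y, x: 800 + 0.1 * x + 0.05 * y, (128, 128))
rgb = normals_to_rgb(depth_to_normals(depth), order=(0, 1, 2))
print(rgb.shape, rgb.dtype)                            # (128, 128, 3) uint8

In the paper's pipeline, it is these RGB normal images, rather than raw depth, that the convolutional neural network is trained on; the order argument above is only a stand-in for the channel-ordering comparison the authors report.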
