An Iterative Method for 3D Body Registration Using a Single RGB-D Sensor

Victor Villena-Martinez, Andres Fuster-Guillo, Marcelo Saval-Calvo, and Jorge Azorin-Lopez (Department of Computer Technology, University of Alicante, Alicante, Spain)
Copyright: © 2017 | Pages: 14
DOI: 10.4018/IJCVIP.2017070103

Abstract

This paper addresses the problem of 3D body registration using a single RGB-D sensor, guided by three main requirements: low cost, unconstrained movement, and accuracy. To meet them, an iterative registration method for accurately aligning data from a single RGB-D sensor is proposed. The data are acquired while a person rotates in front of the camera, without the need for any external marker or constraint on their pose. The articulated alignment is carried out in a model-free approach in order to remain consistent with the real data. The iterative method is divided into stages that contribute to each other by refining specific parts of the acquired data. The exploratory results validate the proposed method, which feeds on itself at each iteration, progressively improving the final result and reaching the required precision under the conditions of affordability and unconstrained acquisition.
1. Introduction

Nowadays, there are several techniques to obtain a 3D model of the human body. Such a model is useful in applications across many domains: medicine, the textile industry, footwear, etc. The tandem of 3D technology and medicine has a long trajectory; 3D models have been used to help experts study patients and make decisions (Treleaven & Wells, 2007), e.g., in dietetic treatment, child growth monitoring, podiatry, and orthopedics, among others. There are several solutions to obtain a high-definition 3D model of the human body, but they are expensive and bulky, which hampers their portability and puts them at a prohibitive cost. The usefulness of these models has made the development of inexpensive systems, with few restrictions on the subject's pose during acquisition, more attractive. Consumer RGB-D sensors (also called low-cost RGB-D, or simply RGB-D sensors) have become popular for fitting those requirements (Lai, Bo, Ren, & Fox, 2013) due to their combination of affordability and portability. However, their accuracy is not high enough for some applications.

Therefore, in this paper we address the problem of providing a 3D body representation using RGB-D sensors. The proposal has to meet three requirements: (1) affordability and flexibility, for broadly transferable solutions; (2) an unconstrained method, in terms of free movement of the subject in front of the sensor; and (3) accurate alignment of the data, to create a faithful representation.

Low-cost RGB-D sensors are considered a new type of sensor owing to the wide usage they have received in the scientific and industrial communities. This kind of sensor combines color and depth information, the latter estimated using Time-of-Flight (ToF) or structured-light techniques. In this work, we propose a method for RGB-D devices due to the low-cost requirement; their accuracy is appropriate for body modeling, since they were originally developed for body movement tracking (Saval-Calvo et al., 2017). However, our approach could be used with any other sensor that provides color and depth information.

To develop a full model of bodies using RGB-D data, our proposal faces the registration, or alignment, of 3D points. Registration is the process of aligning one data set onto one or more others. It can be performed rigidly or non-rigidly, i.e., by transforming all the data with a single transformation or by applying a different transformation to each data point (Saval-Calvo, Azorin-Lopez, Fuster-Guillo, & Mora-Mora, 2015). Registration of 3D data is a widely studied problem. Henry, Krainin, Herbst, Ren, and Fox (2014) produced 3D reconstructions of indoor environments with an RGB-D sensor. Lovato, Bissolo, Lanza, Stella, and Giachetti (2014) carried out an accurate 3D registration of the foot using a PrimeSense sensor rotating around it. These sensors can also be combined with augmented-reality markers in the scene in order to estimate the transformation more accurately and obtain the 3D model (Mihalyi, Pathak, Vaskevicius, Fromm, & Birk, 2015).
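As an illustration of the rigid case, the single best-fit rotation and translation between two sets of corresponding 3D points can be recovered in closed form with the SVD-based Kabsch method. This is a minimal sketch under the assumption that point correspondences are already known, not a description of the paper's method:

```python
import numpy as np

def rigid_align(src, dst):
    """Estimate the rigid transform (R, t) mapping src onto dst.

    src, dst: (N, 3) arrays of corresponding 3D points.
    Uses the SVD-based Kabsch method.
    """
    src_c = src.mean(axis=0)
    dst_c = dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Example: recover a known rotation and translation exactly.
rng = np.random.default_rng(0)
P = rng.random((100, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
Q = P @ R_true.T + np.array([0.5, -0.2, 1.0])
R, t = rigid_align(P, Q)
print(np.allclose(P @ R.T + t, Q))  # True
```

In practice (e.g., in ICP-style pipelines) the correspondences are unknown and this closed-form fit is alternated with a nearest-neighbor correspondence step.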

The 3D registration of the human body is complex due to its articulated nature and the impossibility of keeping exactly the same pose at different instants of time. These difficulties can be addressed by different acquisition strategies to obtain a model:

  • Single view, performing a partial reconstruction of the subject of interest.

  • Multiple sensors, acquiring the subject from different angles at the same instant of time, thereby avoiding any movement of the subject.

  • Controlled environment, using elements external to the sensor, such as augmented-reality markers.

  • Articulated/isometric registration techniques, which take into account the movements performed by the subject during the acquisition.

This work focuses on the fourth approach in order to provide a registration of the human body that meets the aforementioned requirements of affordability, flexibility, and accuracy.
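To make the articulated case concrete, the simplest approximation is a piecewise-rigid alignment: each body segment receives its own rigid transform. The sketch below is illustrative only; it assumes corresponding point sets and a per-point segment labeling (neither of which this paper requires), and applies a closed-form rigid fit segment by segment:

```python
import numpy as np

def rigid_fit(src, dst):
    """SVD-based (Kabsch) rigid fit between corresponding point sets."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((src - sc).T @ (dst - dc))
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # avoid a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, dc - R @ sc

def articulated_align(src, dst, labels):
    """Piecewise-rigid alignment: one rigid transform per body segment.

    src, dst: (N, 3) corresponding 3D points; labels: (N,) segment ids.
    Each segment is aligned independently, approximating an
    articulated (non-rigid) registration.
    """
    out = np.empty_like(src)
    for seg in np.unique(labels):
        m = labels == seg
        R, t = rigid_fit(src[m], dst[m])     # per-segment closed-form fit
        out[m] = src[m] @ R.T + t
    return out
```

Real articulated methods additionally enforce consistency at segment boundaries (joints), which this independent per-segment fit does not.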

The rest of the paper is organized as follows: Section 1.1 reviews the background; Section 2 explains the proposed method; Section 3 presents the experiments; finally, Section 4 presents the conclusions.
