Virtual Interface With Kinect 3D Sensor for Interaction With Bedridden People: First Insights

Vítor Hugo Carvalho, José Eusébio
DOI: 10.4018/IJHISI.294114

Abstract

Human-machine interaction has evolved significantly in recent years, opening a new range of opportunities for developing solutions for people with physical limitations. Natural user interfaces (NUI) allow bedridden and/or physically disabled people to perform a set of actions through gestures, thus increasing their quality of life and autonomy. This paper presents a solution based on image processing and computer vision, using the Kinect 3D sensor, for the development of applications that recognize gestures made by the human hand. The gestures are identified by a software application that triggers a set of actions of utmost importance for the bedridden person, for example, calling for emergency help, switching the TV on/off, or controlling the bed slope. A shape matching technique was used to recognize six gestures, with the final actions activated through an Arduino platform. The results show a success rate of 96%. This system can improve the quality of life and autonomy of bedridden people and can be adapted to the specific needs of an individual subject.

1. INTRODUCTION

Human-machine interaction has evolved significantly, nowadays allowing the development of appropriate solutions to support people who have a certain type of physical or cognitive limitation. The development of natural and intuitive interaction techniques, called Natural User Interfaces (NUI), allows people who are bedridden and/or have a physical disability to perform a set of actions by means of gestures, thus increasing their quality of life (Lopes et al., 2015; Mendes, 2009; Buxton, 2009). Following this trend, current technologies allow great freedom of interaction between humans and machines, although several gaps still need to be addressed. The possibility for bedridden people to interact naturally with electronic devices is still far from maturity. This project aims to address this gap and enable bedridden people to perform actions autonomously through technology.

There are thousands of people around the world who depend upon external assistance to perform basic activities, and it is estimated that this number will increase in the forthcoming years. If there is a field in which science and technology can and should play a central role, it is precisely the field of health and wellbeing. In the particular case of electronics and computing, there is still a set of systems that can be developed to improve the quality of life of bedridden people. Considering this necessity, the prototype presented in this paper makes sense in today's society, with its increasingly aging population. The solution implemented is based on image processing and computer vision using the Kinect 3D sensor.

The goal of this project is to allow the user (a bedridden person who depends upon external help) to interact, in an intuitive and natural way, with a computerized system equipped with the Kinect 3D sensor, using six hand gestures to perform a set of basic actions that are fundamental for bedridden people, such as switching the TV on/off, calling for emergency help, controlling the bed inclination, and switching the light on/off, among others. From the user's point of view, the solution should be an intuitive and minimalist application whose benefits relate essentially to an improved quality of life, but which could also add value through its capacity to save patients confronted with life-threatening situations (by calling for emergency help). A shape matching algorithm was considered for the hand gesture recognition (Yam et al., 2004; Ren et al., 2013; Altman, 2013).
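The paper does not reproduce its shape matching implementation in this preview, but the idea of matching an observed hand contour against stored gesture templates can be illustrated with a minimal sketch. The snippet below uses a simple radial-distance signature (distances from the contour centroid, resampled and scale-normalized) as the shape descriptor and nearest-template classification; the descriptor, the `samples` parameter, and the template names are illustrative assumptions, not the authors' actual method.

```python
import math

def radial_signature(points, samples=32):
    """Compute a simple shape descriptor: normalized centroid-to-contour
    distances, resampled to a fixed length (assumed descriptor, for
    illustration only)."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    dists = [math.hypot(x - cx, y - cy) for x, y in points]
    # Resample to a fixed number of samples so contours of different
    # lengths are comparable.
    step = len(dists) / samples
    sig = [dists[int(i * step)] for i in range(samples)]
    # Normalize by the peak distance for scale invariance.
    peak = max(sig) or 1.0
    return [d / peak for d in sig]

def match_gesture(contour, templates):
    """Return the name of the stored template whose signature is closest
    (smallest sum of squared differences) to the observed contour's."""
    sig = radial_signature(contour)
    def distance(name):
        return sum((a - b) ** 2 for a, b in zip(sig, templates[name]))
    return min(templates, key=distance)
```

In a full system, each recognized gesture name would then be mapped to an output action (e.g., toggling a relay through the Arduino platform, as the paper describes). The scale normalization matters here because the hand's apparent size varies with its distance from the Kinect sensor.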

This paper is organized into six sections. Section 2, State of the Art, presents studies and solutions of existing assistive technologies for bedridden people, as well as the added value of the presented solution in improving their quality of life. Section 3, System Architecture, presents the hardware and software used in the developed solution, including the user interface. Section 4, System Development, describes the process of obtaining and converting the sensory device information, as well as the methods applied to recognize hand gestures. Section 5, Experimental Results, presents the tests performed with several subjects to validate the system. Finally, Section 6, Conclusion and Future Work, presents the final considerations, the system's limitations, and possible further developments.
