A Flexible Bio-Signal Based HMI for Hands-Free Control of an Electric Powered Wheelchair


Ericka Janet Rechy-Ramirez, Huosheng Hu
Copyright: © 2014 | Pages: 18
DOI: 10.4018/ijalr.2014010105

Abstract

This paper presents a bio-signal based human machine interface (HMI) for hands-free control of an electric powered wheelchair. In this novel HMI, an Emotiv EPOC sensor is deployed to detect facial expressions and head movements of users, which are then recognized and converted into four uni-modal control modes and two bi-modal control modes to operate the wheelchair. Nine facial expressions and up-down head movements have been defined and tested, so that users can select some of these facial expressions and head movements to form the six control commands. The proposed HMI is user-friendly and allows users to select one of the available control modes according to their comfort. Experiments are conducted to show the feasibility and performance of the proposed HMI.
Article Preview

Introduction

Current electric powered wheelchairs (EPWs) are mostly joystick-driven and cannot be used by disabled people whose autonomy is seriously affected by spinal cord injury, tetraplegia or amputation, or by elderly people with limited mobility. It is necessary to develop new human-machine interfaces so that these people can use EPWs and live independently. Up to now, several human machine interfaces have been developed for hands-free control of a wheelchair. Electromyography (EMG), Electroencephalography (EEG) and Electrooculography (EOG) signals, as well as vision techniques, have been used to obtain facial expressions, thoughts, eye gaze, and head and shoulder movements from the user to operate a wheelchair.

Human Machine Interfaces Based on Head Movements

Adachi, et al. (1998) detected the direction of the head by using a camera in front of the user to track ten feature points around the eyes, nose and mouth. Christensen & Garcia (2005) used forward-backward head movements to drive the wheelchair forward and backward, and left-right head turns to steer it; the head movements were detected by an infrared sensor placed behind the head of the user. Jia, et al. (2007) also developed a visual HMI that detects head movements for giving commands, in which the position of the nose on the user's face is used for head motion detection.
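As a rough illustration of this style of head-movement control (not the cited authors' implementations), the Python sketch below assumes a hypothetical tracker that reports head pitch and yaw angles and maps them to discrete wheelchair commands; the threshold values are arbitrary assumptions.

```python
# Illustrative sketch only: map head pitch/yaw from a hypothetical tracker
# to discrete wheelchair commands, in the spirit of the schemes above.

PITCH_THRESHOLD = 15.0   # degrees; forward/backward nod (assumed value)
YAW_THRESHOLD = 20.0     # degrees; left/right head turn (assumed value)

def head_to_command(pitch: float, yaw: float) -> str:
    """Translate a head orientation into a discrete wheelchair command."""
    if pitch > PITCH_THRESHOLD:
        return "FORWARD"      # head tilted forward -> drive forward
    if pitch < -PITCH_THRESHOLD:
        return "BACKWARD"     # head tilted backward -> reverse
    if yaw > YAW_THRESHOLD:
        return "TURN_RIGHT"   # head turned right -> turn right
    if yaw < -YAW_THRESHOLD:
        return "TURN_LEFT"    # head turned left -> turn left
    return "STOP"             # neutral head pose -> stop

# Example: a 25-degree forward nod yields the FORWARD command.
print(head_to_command(pitch=25.0, yaw=0.0))
```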

Human Machine Interfaces Based on Facial Expressions

Electromyography (EMG) signals are widely deployed to obtain facial expressions for hands-free control of a wheelchair. In Felzer and Freisleben (2002), a finite state machine was used to command the wheelchair: instead of employing one expression per control command, the user performs a single facial expression (i.e. raising the eyebrows) repeatedly until the desired command is reached (see the sketch below). In Tamura et al. (2010), three types of facial expressions were employed to control a wheelchair: winking with the right eye (to turn right), winking with the left eye (to turn left) and biting (to go forward and stop). Firoozabadi, Oskoei and Hu (2008) detected four facial expressions to operate a wheelchair: smiling (to go forward), tensing the eyebrows and pulling them up (to go backward), retracting and pulling the right lip corner upwards (to turn right), and retracting and pulling the left lip corner upwards (to turn left); to stop the wheelchair, the user has to relax the facial muscles. Xu, et al. (2013) used an incremental online learning algorithm to process surface EMG signals from facial movements in real time for adaptive control of a wheelchair.
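The following Python sketch illustrates the general idea behind a single-expression, state-machine style selector of the kind described by Felzer and Freisleben; it is not their implementation, and the command set and confirmation delay are assumptions made for illustration.

```python
# Illustrative sketch only: one facial expression cycles through candidate
# commands; a pause confirms the currently highlighted command.

import time

COMMANDS = ["FORWARD", "BACKWARD", "TURN_LEFT", "TURN_RIGHT", "STOP"]
CONFIRM_DELAY = 2.0  # seconds of inactivity that confirm the selection (assumed)

class SingleExpressionSelector:
    def __init__(self):
        self.index = 0
        self.last_event = time.monotonic()

    def on_eyebrow_raise(self):
        """Each detected expression advances to the next candidate command."""
        self.index = (self.index + 1) % len(COMMANDS)
        self.last_event = time.monotonic()
        print("highlighted:", COMMANDS[self.index])

    def poll(self):
        """If the user stops signalling, issue the highlighted command."""
        if time.monotonic() - self.last_event >= CONFIRM_DELAY:
            self.last_event = time.monotonic()
            return COMMANDS[self.index]
        return None
```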

Human Machine Interfaces Based on Eye-Gaze

Electrooculography (EOG) signals, vision techniques and infrared photo sensors have been used to detect the eye gaze of the user for controlling a wheelchair. Crisman, et al. (1991) employed timed eye winks, i.e. short and long closings of one or both eyelids, to operate a wheelchair; the timed eye winks were detected using two pairs of infrared photo sensors attached to the ear pieces of a normal pair of eyeglass frames. Barea, et al. (2000), Barea, et al. (2003) and Kuo, et al. (2009) used EOG signals to detect eye gaze for giving commands to the wheelchair, with the eye gaze detected by electrodes placed on the outer sides of the eyes. Bartolein, et al. (2008) employed an eye-tracking device from SensoMotoric Instruments GmbH to obtain the eye-gaze behavior of the user for executing commands on the wheelchair. Gajwani and Chhabria (2010) used eye tracking and eye blinking captured by a camera mounted on a cap to control a wheelchair. Nguyen and Jo (2012) built an eye-gaze tracker from a glasses frame, an infrared camera with two LEDs, and a 3D orientation sensor for giving commands to the wheelchair.
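To make the EOG-based approach concrete, the Python sketch below classifies horizontal gaze direction from a single EOG channel by simple thresholding; it is not the cited systems' code, and the voltage scale, threshold and signal polarity are assumptions for illustration only.

```python
# Illustrative sketch only: classify horizontal gaze direction from an EOG
# channel by thresholding the deflection measured at electrodes beside the eyes.

GAZE_THRESHOLD_UV = 50.0  # microvolts; assumed deflection for a deliberate gaze shift

def classify_gaze(horizontal_eog_uv: float) -> str:
    """Map a horizontal EOG deflection to a steering command."""
    if horizontal_eog_uv > GAZE_THRESHOLD_UV:
        return "TURN_RIGHT"   # gaze to the right -> positive deflection (assumed polarity)
    if horizontal_eog_uv < -GAZE_THRESHOLD_UV:
        return "TURN_LEFT"    # gaze to the left -> negative deflection
    return "KEEP_COURSE"      # gaze near centre -> no steering change

print(classify_gaze(72.5))   # -> TURN_RIGHT
```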
