An Electric Wheelchair Controlled by Head Movements and Facial Expressions: Uni-Modal, Bi-Modal, and Fuzzy Bi-Modal Modes


DOI: 10.4018/978-1-5225-5396-0.ch001

Abstract

A bio-signal-based human-machine interface is proposed for hands-free control of a wheelchair. An Emotiv EPOC sensor detects the user's facial expressions and head movements. Nine facial expressions and up-down head movements can be chosen to form five commands: move forward, move backward, turn left, turn right, and stop. Four uni-modal modes, three bi-modal modes, and three fuzzy bi-modal modes are created to control the wheelchair. The fuzzy modes use the strength with which the user performs the head movement or facial expression to adjust the wheelchair speed via a fuzzy logic system. Two subjects tested the ten modes with several command configurations. Mean, minimum, and maximum traveling times achieved by each subject in each mode were collected. Results showed that both subjects achieved their lowest mean, minimum, and maximum traveling times using the fuzzy modes. Statistical tests showed significant differences between the traveling times of the fuzzy modes of subject B, and between the traveling times of the bi-modal modes and those of their respective fuzzy modes for both subjects.

Introduction

Current electric-powered wheelchairs (EPWs) are mostly joystick-driven. Consequently, disabled people whose autonomy is seriously affected by spinal cord injuries, tetraplegia, or amputation, as well as elderly people with limited mobility, might not be able to use these EPWs. To date, several human-machine interfaces (HMIs) have been developed for hands-free control of a wheelchair in order to assist disabled and elderly people. Electromyography (EMG), electroencephalography (EEG), and electrooculography (EOG) signals, as well as vision techniques, have been used to identify users' facial expressions, thoughts, eye gaze, and head, hand, and shoulder movements to operate a wheelchair.

Human Wheelchair Interfaces Based on Head Movements

Vision techniques have been employed to detect head movements for controlling a wheelchair. For instance, the direction of the head has been used to provide commands to a wheelchair (Adachi et al., 1998). Specifically, the authors used a camera in front of the user to track ten feature points around the eyes, nose, and mouth, thereby identifying the direction of the head. Another study (Christensen & Garcia, 2005) used forward-backward head movements to move the wheelchair forward and backward, and left-right head movements to turn it; an infrared sensor placed behind the user's head detected these movements. Likewise, Jia et al. (2007) developed a visual HMI that detects head movements for issuing commands to the wheelchair; in this research, the position of the nose on the user's face is used for head-motion detection. Alternatively, the gyroscope of an Emotiv EPOC sensor has been used to detect up, down, left, and right head movements in order to control a wheelchair (Rechy-Ramirez & Hu, 2012).
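The gyroscope-based approach lends itself to a simple thresholding scheme. The sketch below is a minimal illustration of how accumulated gyroscope displacements might be classified into discrete head movements; the axis conventions, threshold value, and function name are assumptions for demonstration, not details taken from Rechy-Ramirez and Hu (2012).

```python
# Illustrative sketch: classifying head movements from accumulated
# gyroscope displacements. Axis conventions and the threshold value
# are assumptions for demonstration, not the cited authors' values.

def classify_head_movement(delta_x: float, delta_y: float,
                           threshold: float = 150.0) -> str | None:
    """Map accumulated gyroscope displacements to a discrete head movement.

    delta_x: accumulated horizontal displacement (left/right turns).
    delta_y: accumulated vertical displacement (up/down nods).
    Returns 'left', 'right', 'up', 'down', or None when the displacement
    is too small to count as a deliberate movement.
    """
    if abs(delta_x) < threshold and abs(delta_y) < threshold:
        return None                      # resting position: no command
    if abs(delta_x) >= abs(delta_y):     # horizontal motion dominates
        return "right" if delta_x > 0 else "left"
    return "up" if delta_y > 0 else "down"


# Example: a strong rightward head turn with a slight nod.
print(classify_head_movement(320.0, 40.0))   # -> 'right'
```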

Human Wheelchair Interfaces Based on Facial Expressions

Electromyography (EMG) signals (muscular activity) are widely used to obtain facial expressions for hands-free control of a wheelchair. A finite state machine has been employed to command the wheelchair through a single facial expression (Felzer & Freisleben, 2002). In this research, the user performs one facial expression (i.e., raising the eyebrows) until the desired command is reached, instead of employing one expression per control command. Furthermore, three types of facial expressions have been employed to control a wheelchair: winking with the right eye (to turn right), winking with the left eye (to turn left), and biting (to go forward and to stop) (Tamura et al., 2010). Another study (Firoozabadi, Oskoei & Hu, 2008) used four facial expressions to operate a wheelchair: smiling (to go forward), tensing the eyebrows and pulling them up (to go backward), retracting and pulling the right lip corner upward (to turn right), and retracting and pulling the left lip corner upward (to turn left); to stop the wheelchair, the user relaxes the facial muscles. Moreover, a real-time incremental online learning algorithm has been employed to process EMG signals from facial movements for adaptive control of a wheelchair (Xu et al., 2013).
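The single-expression scheme of Felzer and Freisleben (2002) can be pictured as a small finite state machine that cycles through the available commands each time the expression fires. The following sketch is a hypothetical reconstruction under that reading; the command list, class name, and the pause-to-confirm rule are illustrative assumptions, not their implementation.

```python
# Hypothetical reconstruction of a one-expression control scheme: each
# detected facial expression (e.g., an eyebrow raise) advances a cursor
# through the command list, and a pause confirms the highlighted command.
# Names and the confirmation rule are illustrative assumptions.

COMMANDS = ["forward", "backward", "turn_left", "turn_right", "stop"]

class SingleExpressionFSM:
    def __init__(self) -> None:
        self.index = 0  # currently highlighted command ('forward')

    def on_expression(self) -> str:
        """Advance to the next command each time the expression fires."""
        self.index = (self.index + 1) % len(COMMANDS)
        return COMMANDS[self.index]

    def on_pause(self) -> str:
        """A pause (no expression) selects the highlighted command."""
        return COMMANDS[self.index]

fsm = SingleExpressionFSM()
fsm.on_expression()        # highlight 'backward'
fsm.on_expression()        # highlight 'turn_left'
print(fsm.on_pause())      # -> 'turn_left' is issued to the wheelchair
```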

Key Terms in this Chapter

Fuzzy Logic: Lotfi Zadeh introduced fuzzy set theory, the foundation of fuzzy logic, in 1965. This technique is used to cope with uncertainty in data. It involves a fuzzification step, fuzzy inference, and a defuzzification step.
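As a concrete illustration of the fuzzification, inference, and defuzzification steps, the sketch below maps a normalized expression strength in [0, 1] to a wheelchair speed. The triangular membership functions, the three rules, and the output speeds are invented for demonstration and do not reproduce the fuzzy system described in this chapter.

```python
# Minimal fuzzy-logic sketch: map a facial-expression strength in [0, 1]
# to a wheelchair speed. Membership functions, rules, and output speeds
# are illustrative assumptions only.

def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_speed(strength: float) -> float:
    # 1) Fuzzification: degrees of membership in weak/medium/strong.
    weak   = tri(strength, -0.5, 0.0, 0.5)
    medium = tri(strength,  0.0, 0.5, 1.0)
    strong = tri(strength,  0.5, 1.0, 1.5)

    # 2) Inference: each rule maps one input set to a crisp speed (km/h).
    #    IF weak THEN 1.0; IF medium THEN 2.0; IF strong THEN 3.0.
    rules = [(weak, 1.0), (medium, 2.0), (strong, 3.0)]

    # 3) Defuzzification: weighted average (zero-order Sugeno-style output).
    total = sum(mu for mu, _ in rules)
    return sum(mu * speed for mu, speed in rules) / total if total else 0.0

print(round(fuzzy_speed(0.8), 2))   # -> 2.6: mostly 'strong', some 'medium'
```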

Expressiv® Suite of Emotiv: It is a suite provided by the Emotiv EPOC sensor. This suite detects facial expressions from the user (i.e., blink, right-wink, left-wink, look right/left, raise-brow, furrow-brow, smile, clench, right-smirk, left-smirk, and laugh).

Uni-Modal: In this chapter, this term is used to indicate that control modes use one modality for issuing commands to the wheelchair: either head movements or facial expressions.

Cognitiv® Suite of Emotiv: It is a suite provided by the Emotiv EPOC sensor. This suite recognizes 14 thoughts: neutral, right, left, push, pull, lift, drop, rotate-left, rotate-right, rotate-clockwise, rotate-anticlockwise, rotate-forwards, rotate-reverse, and disappear.

Bi-Modal: In this chapter, this term is used to indicate that control modes use two modalities for issuing commands to the wheelchair: head movements and facial expressions.
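One way such a bi-modal configuration could be wired is sketched below: head movements handle steering while facial expressions handle forward, backward, and stop, with expressions given priority so that a stop request always wins. The specific pairing and the priority rule are hypothetical examples, not the chapter's actual configurations.

```python
# Hypothetical bi-modal command mapping: head movements steer, facial
# expressions drive and stop. The pairing below is an illustrative
# configuration, not the one evaluated in this chapter.

HEAD_COMMANDS = {"left": "turn_left", "right": "turn_right"}
FACE_COMMANDS = {"smile": "forward", "clench": "backward", "raise_brow": "stop"}

def bimodal_command(head_movement: str | None,
                    facial_expression: str | None) -> str | None:
    """Fuse the two modalities; facial expressions take priority so a
    'stop' expression always overrides an ongoing turn."""
    if facial_expression in FACE_COMMANDS:
        return FACE_COMMANDS[facial_expression]
    if head_movement in HEAD_COMMANDS:
        return HEAD_COMMANDS[head_movement]
    return None  # no recognized input: keep the current state

print(bimodal_command("left", None))          # -> 'turn_left'
print(bimodal_command("left", "raise_brow"))  # -> 'stop' (expression wins)
```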

Emotiv EPOC: A sensor that measures EEG activity from 14 saline electrodes. The Emotiv EPOC has one gyroscope and three suites: Affectiv, Expressiv, and Cognitiv.

Human-Machine Interface: It is an interface that provides a friendly, intuitive, and transparent interaction between the human and the control system of any device.
