Design and Evaluation of Vision-Based Head and Face Tracking Interfaces for Assistive Input

Chamin Morikawa, Michael J. Lyons
ISBN13: 9781466644380 | ISBN10: 1466644389 | EISBN13: 9781466644397
DOI: 10.4018/978-1-4666-4438-0.ch007
Cite Chapter

MLA

Morikawa, Chamin, and Michael J. Lyons. "Design and Evaluation of Vision-Based Head and Face Tracking Interfaces for Assistive Input." Assistive Technologies and Computer Access for Motor Disabilities, edited by Georgios Kouroupetroglou, IGI Global, 2014, pp. 180-205. https://doi.org/10.4018/978-1-4666-4438-0.ch007

APA

Morikawa, C., & Lyons, M. J. (2014). Design and Evaluation of Vision-Based Head and Face Tracking Interfaces for Assistive Input. In G. Kouroupetroglou (Ed.), Assistive Technologies and Computer Access for Motor Disabilities (pp. 180-205). IGI Global. https://doi.org/10.4018/978-1-4666-4438-0.ch007

Chicago

Morikawa, Chamin, and Michael J. Lyons. "Design and Evaluation of Vision-Based Head and Face Tracking Interfaces for Assistive Input." In Assistive Technologies and Computer Access for Motor Disabilities, edited by Georgios Kouroupetroglou, 180-205. Hershey, PA: IGI Global, 2014. https://doi.org/10.4018/978-1-4666-4438-0.ch007

Abstract

Interaction methods based on computer vision hold the potential to become the next powerful technology to support breakthroughs in the field of human-computer interaction. Non-invasive vision-based techniques permit unconventional interaction methods to be considered, including the use of movements of the face and head for intentional gestural control of computer systems. Facial gesture interfaces open new possibilities for assistive input technologies. This chapter gives an overview of research aimed at developing vision-based head and face-tracking interfaces. This work has important implications for future assistive input devices. To illustrate this concretely, the authors describe work from their own research in which they developed two vision-based facial feature tracking algorithms for human-computer interaction and assistive input. Evaluation forms a critical component of this research, and the authors provide examples of new quantitative evaluation tasks as well as the use of model real-world applications for the qualitative evaluation of new interaction styles.
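
The chapter itself describes the authors' two tracking algorithms; the abstract does not specify how they work. As a purely illustrative sketch of the general idea behind vision-based head tracking for assistive pointer input (not the method presented in the chapter), the following Python/OpenCV snippet detects the face with the stock Haar-cascade model and turns frame-to-frame head motion into relative pointer deltas. The GAIN constant and the print-based output are placeholders, not part of the original work.

# Illustrative sketch only: head-motion-to-pointer mapping with OpenCV.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)          # default webcam
prev_centre = None
GAIN = 2.0                         # illustrative motion-to-pointer gain

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        # Treat the largest detected face as the user's head position.
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
        centre = (x + w / 2, y + h / 2)
        if prev_centre is not None:
            dx = GAIN * (centre[0] - prev_centre[0])
            dy = GAIN * (centre[1] - prev_centre[1])
            # A real assistive interface would pass these deltas to an
            # OS-level pointer API; here they are only printed.
            print(f"pointer delta: ({dx:+.1f}, {dy:+.1f})")
        prev_centre = centre
    cv2.imshow("head tracking sketch", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()

In a practical assistive pointer, the raw deltas would typically be smoothed and a deliberate facial gesture would serve as the selection (click) action; such design choices are the kind of issue evaluated in the chapter.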
