Line-of-Sight Detection Using Center of Gravity with Pixel Number Variation

Takahito Niwa, Ippei Torii, Naohiro Ishii
Copyright © 2017 | Pages: 13
DOI: 10.4018/IJSI.2017070105


This study extends the application software "Eyetalk", a communication-support tool for handicapped people developed at the Torii laboratory, Aichi Institute of Technology, so that it can also be used by physically handicapped people with involuntary movements (abnormal movements of the body that occur independently of consciousness). A line-of-sight detection method is developed for communication that determines whether the user is looking left or right, based on computing the center of gravity of the pixels in the eye image together with the pixel count. Further, a newly proposed blink detection method using an afterimage is applied to the developed system. This study will be extended into an application that anyone can use easily and quickly to express his or her thoughts and requests.
Article Preview


Recently, tools built on mobile information devices such as smartphones and tablets have grown significantly. Special schools for the handicapped in Japan have actively adopted these devices, using communication-support tools and applications to guide the actions of autistic children. We developed an app called "Hanamaru (Smiley)" (Torii, Ohtani, Shirahama et al., 2013) for children with autism who cannot understand what to do in daily life: a schedule app that shows them "what I should do next" with images, voice, and text. It is widely used in Japanese special schools for the handicapped, together with the communication-support tool "Let's Talk!" (Torii, Ohtani, Niwa, Yamamoto, & Ishii, 2012) for children with autism. The Torii laboratory has also developed support tools for physically handicapped people. A physical handicap means difficulty in activities of daily life, such as walking; in most cases it is caused by a brain disease such as cerebral palsy, and it is difficult for the patients to communicate with family and caregivers. To convey their feelings, various tools have been developed in the past (Kitazawa, 2007; Kitazawa & Nishida, 2008). We studied and developed the communication tool "Eyetalk" (Torii, Ohtani, Niwa, & Ishii, 2015), which uses blinks. The tool is based on a new blink detection method using an afterimage that can detect the weak and short blinks of physically handicapped users. This application has been used widely, not only by physically handicapped people but also by people who have lost the ability to write for communication. However, the conditions of physically handicapped people differ, so some people cannot use this application even though they need it.
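The page names the afterimage-based blink detection idea but gives no implementation details. As one plausible reading, a decaying average of past eye frames can serve as the "afterimage", so that a sudden eyelid movement produces a brief spike in the difference between the current frame and that average. The sketch below is our illustrative assumption, not the authors' exact algorithm; the class name and the `decay` and `threshold` parameters are hypothetical.

```python
class AfterimageBlinkDetector:
    """Illustrative sketch of blink detection via an 'afterimage' buffer.

    The afterimage is an exponentially decaying average of past eye
    frames.  A blink briefly changes the brightness of the eye region,
    so the mean absolute difference between the current frame and the
    afterimage spikes; the spike is thresholded to report a blink.
    """

    def __init__(self, decay=0.8, threshold=30.0):
        self.decay = decay          # weight of the old afterimage
        self.threshold = threshold  # mean-difference level for a blink
        self.afterimage = None      # running average of past frames

    def feed(self, frame):
        """frame: 2-D list of grayscale values (0-255); True on a blink."""
        if self.afterimage is None:
            # first frame just initializes the buffer
            self.afterimage = [row[:] for row in frame]
            return False
        diff_sum = 0
        count = 0
        for y, row in enumerate(frame):
            for x, value in enumerate(row):
                old = self.afterimage[y][x]
                diff_sum += abs(value - old)
                count += 1
                # blend the new frame into the afterimage
                self.afterimage[y][x] = (self.decay * old
                                         + (1 - self.decay) * value)
        return diff_sum / count > self.threshold
```

Because the afterimage averages over several frames, even a weak or short blink that barely darkens a single frame can still push the difference over the threshold, which matches the paper's motivation of detecting weak blinks.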
In the system developed so far, it is difficult for physically handicapped people with involuntary movements (abnormal movements of the body that occur independently of consciousness) to provide input, so a new input method is needed to accommodate them. The main author of this paper was involved in the study and development of "Eyetalk" at the Torii laboratory, Aichi Institute of Technology, and learned the determination method based on image processing there. Building on this experience, a new communication tool that allows words to be input through the line of sight toward the screen is useful. This line-of-sight application judges whether the user is looking right or left. In addition, the Japanese alphabetical table is modified and a predictive input system for character entry is developed. With an input system that can accommodate many physical conditions and a wide variety of input techniques, the communication burden on physically handicapped people can be reduced.
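The left/right judgment is described as a computation of the center of gravity with the pixel count in the eye image. A minimal sketch of that idea, under our own assumptions (dark pixels below a fixed threshold are treated as iris/pupil, and the pixel count guards against frames with too few dark pixels, e.g. during a blink; the function name and thresholds are illustrative):

```python
def gaze_direction(eye_image, threshold=60, min_pixels=20):
    """Judge whether the user looks LEFT or RIGHT from a grayscale eye image.

    eye_image: 2-D list of grayscale values (0-255), rows of equal length.
    Dark pixels (below `threshold`) are taken as iris/pupil; the x
    coordinate of their center of gravity is compared with the image
    center.  If fewer than `min_pixels` dark pixels are found, the
    frame is rejected as undetermined (e.g. a blink).
    """
    height = len(eye_image)
    width = len(eye_image[0])
    count = 0   # number of dark pixels found
    x_sum = 0   # running sum of their x coordinates
    for y in range(height):
        for x in range(width):
            if eye_image[y][x] < threshold:
                count += 1
                x_sum += x
    if count < min_pixels:
        return "UNDETERMINED"     # too few dark pixels to judge
    cog_x = x_sum / count         # x of the center of gravity
    center = (width - 1) / 2
    return "LEFT" if cog_x < center else "RIGHT"
```

Because the decision uses the centroid of many pixels rather than a single tracked point, small involuntary head or eye jitters average out, which fits the paper's goal of serving users with involuntary movements.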
