MobiGaze: Gaze Interface for Mobile Devices


Takashi Nagamatsu (Kobe University, Japan), Michiya Yamamoto (Kwansei Gakuin University, Japan) and Hiroshi Sato (Kwansei Gakuin University, Japan)
DOI: 10.4018/978-1-4666-1628-8.ch004


Today, touch-screen-based handheld mobile devices are widely used; however, they are awkward to operate with one hand. The authors propose MobiGaze, a user interface that uses the user's gaze to operate a handheld mobile device. With MobiGaze, one can target the entire display area easily, even when the device is quite large. Moreover, MobiGaze supports both gaze and touch interactions; the combination of the two constitutes a novel interaction paradigm, and a gaze-and-touch interface effectively avoids the Midas-touch problem. To develop MobiGaze, the authors adopted a gaze-tracking method that uses a stereo camera, because it detects the user's line of sight in 3D, allowing the user to move the handheld mobile device freely. They constructed a prototype MobiGaze system consisting of two cameras with IR-LEDs, a Windows-based notebook PC, and iPod touches. The authors evaluated its accuracy in a laboratory experiment and developed several applications for MobiGaze.
Chapter Preview


Today, handheld mobile devices with touch screens are widely used, and users can interact with them intuitively.

When using a mobile device, the user typically holds the device in one hand and touches the screen with the other. However, when one hand is occupied with something else, the device is difficult to control with the single available hand. In this common circumstance, most users touch the screen with the thumb of the holding hand, as shown in Figure 1. It is difficult, however, to reach the entire display area with one's thumb; when one touches the top of the screen, the device becomes unstable and may be dropped. Moreover, as displays become increasingly larger (e.g., the iPad), one-handed control becomes even more difficult. In addition, the thumb blocks a large portion of the display; consequently, the user cannot see the display area under the thumb and has difficulty touching the display accurately (the fat-finger problem).

Figure 1.

Difficulty of using a mobile touch-screen device with one hand


To solve these problems, we propose the use of a gaze-tracking technique in mobile devices. Gaze tracking can serve as a means of pointing on a mobile device; it requires only one hand and does not obstruct the screen. If a mobile device featured a gaze-tracking function for pointing, the user could simply gaze at an object and then touch the screen anywhere with his/her thumb to activate the selected object. Moreover, pointing with one's gaze is quicker than using a trackball or pointing stick, and the technique is easy to learn.
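The gaze-and-touch interaction described above can be sketched as a simple event dispatcher: gaze continuously selects the target, and a touch anywhere on the screen activates it. Because gaze alone never triggers an action, the Midas-touch problem is avoided. The class and widget names below are illustrative assumptions; the chapter does not specify MobiGaze's software interface.

```python
class GazeTouchDispatcher:
    """Select a widget by gaze; activate it only on a touch event.

    Gaze alone never triggers an action, which avoids the Midas-touch
    problem: looking at a button merely selects (highlights) it.
    """

    def __init__(self, widgets):
        self.widgets = widgets  # list of (name, (left, top, right, bottom))
        self.gazed = None       # widget currently under the point of gaze

    def on_gaze(self, x, y):
        """Update the selected widget from the point of gaze (POG)."""
        self.gazed = None
        for name, (left, top, right, bottom) in self.widgets:
            if left <= x <= right and top <= y <= bottom:
                self.gazed = name
                break
        return self.gazed

    def on_touch(self):
        """A touch anywhere on the screen activates the gazed widget."""
        return self.gazed   # None: nothing selected, so nothing happens


# Usage: the user looks at "play", then taps anywhere with the thumb.
d = GazeTouchDispatcher([("play", (0, 0, 100, 50)),
                         ("stop", (0, 60, 100, 110))])
d.on_gaze(40, 30)           # POG falls inside the "play" button
activated = d.on_touch()    # activates "play"
```

The key design point is the separation of selection (gaze) from activation (touch): the thumb never needs to reach the target, so the whole display becomes reachable with one hand.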

There are several types of gaze-tracking technologies, such as the search coil, the Electro-Oculogram (EOG), the infrared corneal limbus tracker, and camera-based systems. The search coil is a coil embedded in a scleral contact lens. The EOG uses electrodes placed on the skin around an eye. The infrared corneal limbus tracker uses infrared LEDs and photodiodes mounted on an eyeglass-like frame. Camera-based systems use a video camera and can be divided into two categories: head-mounted and remote. Because head-mounted gaze trackers (the EOG, the infrared corneal limbus tracker, and head-mounted camera-based systems) are not suitable for everyday life, we concentrate on developing a remote camera-based gaze tracker for a mobile device.

There are a number of technical problems associated with introducing gaze-tracking technology to a mobile device. Because a mobile device is small, the gaze-tracking system must be highly accurate to indicate objects on a small screen. Moreover, the user moves the hand that holds the device; because the camera is attached to the device, this hand movement causes quick changes in the direction in which the camera points. A successful gaze-tracking method must allow free hand movement while still providing the requisite accuracy.
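This is why detecting the line of sight in 3D matters: once the stereo camera yields the gaze as a 3D ray in device coordinates, the point of gaze is simply the intersection of that ray with the screen plane, which is rigidly fixed relative to the camera, so hand movement does not disturb the computation. The following is a minimal sketch of that ray-plane intersection; the coordinate frame and plane parameters are illustrative assumptions, not values from the chapter.

```python
import numpy as np

def point_of_gaze(o, d, plane_point, plane_normal):
    """Intersect the gaze ray o + t*d with the screen plane.

    o, d: gaze ray origin (eye position) and direction, in device
    coordinates. plane_point, plane_normal: any point on the screen
    plane and its normal. Returns the 3D point of gaze, or None if
    the ray is parallel to the screen.
    """
    o, d = np.asarray(o, float), np.asarray(d, float)
    p0, n = np.asarray(plane_point, float), np.asarray(plane_normal, float)
    denom = d @ n
    if abs(denom) < 1e-9:
        return None                       # ray parallel to the screen
    t = ((p0 - o) @ n) / denom
    return o + t * d

# Illustrative frame: the screen lies in the z = 0 plane of the
# device-fixed coordinate system; units in millimetres.
pog = point_of_gaze(o=[0.0, 0.0, 300.0],  # assumed eye position
                    d=[0.1, -0.2, -1.0],  # assumed gaze direction
                    plane_point=[0, 0, 0],
                    plane_normal=[0, 0, 1])
```

Because both the ray and the plane are expressed in the same device-fixed frame, rotating or translating the handheld device moves the eye position and gaze direction as measured by the camera, but the intersection formula itself is unchanged.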

In this chapter, we present a user interface that uses one's gaze (a gaze interface) for a mobile device, called MobiGaze. We describe the gaze-tracking method we adopted and the implementation of our current prototypes. MobiGaze controls a mobile device not only by gaze but also by a combination of gaze and touch, which is a new interaction technique. We describe the benefits of gaze-and-touch interaction and present several applications we developed using it.


Several studies have sought to detect the Point Of Gaze (POG) on the display of mobile devices.

Lukander constructed a system that makes it possible to measure the POG on the screen surface of a handheld mobile device (Lukander, 2006). Lukander’s system used a commercial head-mounted eye tracker and a magnetic positional tracker.

Drewes et al. investigated how gaze interaction can be used to control applications on handheld devices (Drewes, Luca, & Schmidt, 2007). They used a commercial tabletop eye tracker in combination with a mobile phone attached to a screen.

Tobii provided an eye-tracking testing solution for a mobile device (Tobii, 2011). However, this is a tabletop-type solution, so it is useful only in a laboratory.
