Gaze Interaction and Applications of Eye Tracking: Advances in Assistive Technologies

Päivi Majaranta (University of Tampere, Finland), Hirotaka Aoki (Tokyo Institute of Technology, Japan), Mick Donegan (The ACE Centre, UK), Dan Witzner Hansen (IT University of Copenhagen, Denmark), John Paulin Hansen (IT University of Copenhagen, Denmark), Aulikki Hyrskykari (University of Tampere, Finland) and Kari-Jouko Räihä (University of Tampere, Finland)
Indexed In: SCOPUS
Release Date: October 2011 | Copyright: © 2012 | Pages: 382
ISBN13: 9781613500989 | ISBN10: 161350098X | EISBN13: 9781613500996 | DOI: 10.4018/978-1-61350-098-9

Description

Recent advances in eye tracking technology will allow for a proliferation of new applications. Improvements in interactive methods using eye movement and gaze control could result in faster and more efficient human computer interfaces, benefitting users with and without disabilities.

Gaze Interaction and Applications of Eye Tracking: Advances in Assistive Technologies focuses on interactive communication and control tools based on gaze tracking, including eye typing, computer control, and gaming, with special attention to assistive technologies. For researchers and practitioners interested in the applied use of gaze tracking, the book offers instructions for building a basic eye tracker from off-the-shelf components, gives practical hints on building interactive applications, presents smooth and efficient interaction techniques, and summarizes the results of effective research on cutting-edge gaze interaction applications.

Topics Covered

The many academic areas covered in this publication include, but are not limited to:

  • Applications for People with Disabilities
  • Assistive Technologies
  • Eye Tracking
  • Gaze Control
  • Gaze Interaction
  • Gaze Tracking
  • Gaze-Aware Systems
  • Human computer interfaces
  • User Interfaces

Reviews and Testimonials

"I have been following the work of this book's editors and contributors since I met (most, if not all of) them over a decade ago. [...]

– Andrew T. Duchowski, Clemson University, USA

This is a unique book in a specialized area, providing both basic information on the mechanisms of operation of these highly sophisticated devices and practical descriptions of the usefulness of the applied technology. It is well written, well referenced, thorough, and appropriately organized.

– Doody's Book Review Service

Table of Contents and List of Contributors

Preface

Kari-Jouko Räihä
University of Tampere, Finland

Motivation

The ability to express oneself quickly and efficiently in precise language is fundamental for quality of life. Some people with special needs are not able to carry out interpersonal communication fluently. Producing simple, short sentences may take several minutes, depending on the specific impairment and the communication equipment in use. Similarly, access to the modern information society through the Internet may be severely limited by an inability to use the normal controls of a computer.

Another basic human right is being able to decide where one wishes to be and, equally importantly, where one wishes not to be. In other words, providing means by which people with motor impairments can move about and control their environment is another important goal for improving the quality of life.

Numerous assistive technologies and solutions, targeted at various user groups, have been developed over the years to overcome these problems. This book focuses on communication and control tools based on eye gaze tracking. Eye tracking makes it possible for people who are otherwise totally paralysed to use eye movements and gaze direction as a means to communicate with the outside world. The technology is crucial for users with conditions such as motor neuron disease (MND), including ALS, or locked-in syndrome. Once thought rare, such impairments are in fact quite common: nearly 120,000 cases are diagnosed worldwide each year. In addition, the technology and interaction methods developed can improve the quality and speed of communication for users with lesser impairments, and eventually for all users. According to Donegan et al. (2009), a new group of disabled users is emerging who are ‘choosing to use eye control either to augment or replace their existing method, often because they feel it is a more comfortable, quicker and/or easier way of controlling technology’. In general, our target audience consists of all user groups for whom advanced eye tracking would provide a better interaction method.

Proper tools can improve the quality of life and even life expectancy considerably. With eye typing systems, MND patients have written entire books (Vigand & Vigand, 1997; Martin & Yockey, 2000). In addition to productivity tools (for typing, editing, communication, and control), recent developments have provided access to entertainment applications, such as games played over the Internet, that will bring many everyday pastimes within the reach of this user group. The ultimate challenge could be formulated as a variation of the Turing test: if people are communicating or collaborating at a distance over the Internet, the impairment of one person, ideally, should not be apparent to the others.

Although this objective lies, in general, in the distant future, developments in several fields have brought us closer to reaching it, at least in specific cases. First and foremost, the technology has taken enormous steps forward in the last few decades. Eye tracking has a surprisingly long history, going back to the 19th century. For a long time, detecting where a person is looking required obtrusive means, such as magnetic search coils attached to the eye. Advances in video-based methods changed this completely toward the end of the 20th century. Video-based eye tracking, which requires no physical contact with the person, and EOG (eye tracking based on electrodes attached to the skin around the eyes) opened the gates to using eye tracking in augmentative and alternative communication (AAC). Systems for this purpose have been in use since the early 1980s (e.g., Hutchinson, White, Martin, Reichert, & Frey, 1989; Gips, Olivieri, & Tecce, 1993). The last decade has brought much progress in the packaging of the technology. It now has a small footprint and can even be integrated into spectacle frames or embedded within the monitor, making it much easier to install the device in varying environments and to support mobility, for instance by mounting the eye tracker on a wheelchair.

The extent of this development can be appreciated when one considers that some of the earlier devices required manual adjustment of the analogue video by means of a separate control unit: the optics and a servo-driven camera faced the user, while the bulky control unit sat in another room. Today the digital video image can be analysed in real time inside a system that combines a laptop computer, screen, and video camera in one portable unit. It can be mounted on an arm that can be rotated to any angle to accommodate users who cannot use the equipment in the normal position.

The use of an eye tracking device can open the door to a whole new world for users who have been unable to find an acceptable means of communication. Section 2 of the book presents many such examples, including that of Jayne, a young person in her early twenties with athetoid cerebral palsy. Jayne embraced gaze control technology with the support and encouragement of family, college staff, and friends, having previously rejected other computer control methods. After she was given access to an eye tracking system, her prolific e-mail correspondence caused the college server to become ‘jammed’ on one occasion. ‘I get emails now…’ her mum says, pausing to reflect, ‘…endless emails, but I am sure this will calm down.’ She continues, ‘I think it’s the technology itself. She so hated switches and, I think, felt they were unreliable.’ Jayne’s conviction that switch control was inaccurate was just too hard to shake. ‘The more excited she got, the more it made presses, whereas it’s the other way around for her when she is using gaze control technology. The more excited she gets, the less it activates’, her mother explains. Until then, other forms of high-tech communication hadn’t just stymied Jayne; she had rejected them out of hand.

The user base of gaze-based communication and control systems has been growing at an increasing rate. There are several reasons for this in addition to the advances in technology. Our knowledge of what is required to find a suitable solution for a particular individual has increased. For users with severe impairments, there can hardly ever be off-the-shelf solutions: the technology has to be carefully adapted to the needs of the user on a case-by-case basis. On the other hand, the more we have learnt about individual users, the easier it has become to apply that knowledge in finding working solutions for new users. The spread of information has been important in another sense as well: eye tracking devices are not cheap, and people who could make use of them often need financial help in acquiring one. Growing awareness of the effectiveness of this technology has made it increasingly acceptable for society to help fund the acquisition of eye trackers.

The early motivation for developing and using eye trackers was psychological research: the need to uncover the mechanisms that drive eye movements and the process of visual perception. AAC was the first major area where the technology was put to use in interaction with computers. It did not take long, however, before the potential of using the point of gaze as an additional input channel in general control of computers was recognised (Jacob, 1991). The eyes provide a natural means for indicating which object on the screen the user wants to manipulate: whatever means are used for issuing a command, the target of the command must be located first – by the user looking at it. Although this sounds natural, the simplicity is deceiving: we do not want the computer to react to everything we look at, a dilemma known as the Midas touch problem (Jacob, 1991). Finding natural, fluent, and efficient interaction techniques has given rise to a great deal of research, ingenious new interaction techniques, and experiments that tell us how well they work.
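
To make the Midas touch problem concrete, the sketch below shows the most common remedy, dwell-time selection: a command fires only if the estimated gaze point rests on the same target for longer than a threshold. This is a minimal illustrative sketch in Python, not code from any system described in the book; the class name, the 500 ms threshold, and the update protocol are all assumptions made for the example.

    # Illustrative dwell-time selection: act on a target only after gaze
    # has rested on it continuously for dwell_ms milliseconds.

    class DwellSelector:
        def __init__(self, dwell_ms=500):
            self.dwell_ms = dwell_ms      # too low a threshold causes false selections
            self.current_target = None
            self.enter_time_ms = 0
            self.fired = False

        def update(self, target, time_ms):
            """Feed one gaze sample; target is the object under the gaze
            point (or None). Returns the target once per dwell, else None."""
            if target != self.current_target:
                # Gaze moved to a new target: restart the dwell timer.
                self.current_target = target
                self.enter_time_ms = time_ms
                self.fired = False
                return None
            if (target is not None and not self.fired
                    and time_ms - self.enter_time_ms >= self.dwell_ms):
                self.fired = True  # fire only once per continuous dwell
                return target
            return None

In practice such a selector would be fed gaze samples at the tracker’s frame rate (e.g., 30–60 Hz), with the threshold exposed as a user-adjustable setting.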

Gaze-based interaction can never reach the speed of some other forms of communication, but careful interface design can take these systems a long way. A good example is text entry by gaze, a core element of any tool intended for communication. One natural gaze-based technique uses a soft keyboard on the computer screen, where keys are ‘pressed’ by looking at them long enough. This is inherently slow, since the threshold for selecting a key cannot be too low, lest false selections become annoyingly frequent. For this reason, it was long thought that this technique could provide text entry rates of only about 10 words per minute, where a ‘word’ is normalised to a sequence of five characters. To put this in perspective, normal typing on a keyboard reaches entry rates of between 50 and 100 words per minute. In Section 3, we present several approaches to improving the situation. An ingenious technique called Dasher, introduced by Ward and MacKay (2002), was based on a totally different text entry paradigm from the one familiar from keyboards and typewriters. The paradigm was equally suitable for use with eye gaze and with other input devices, and it allowed text entry rates of close to 30 words per minute. But much can also be done with traditional techniques by carefully adapting them for use via eye gaze. Majaranta, Ahola, and Špakov (2009) showed that the text entry rate even with soft keyboards can be almost doubled simply by giving the user the means to control the threshold for key selection. This is one of the many examples throughout the book stressing how important it is to adapt the solutions for each individual user and to let users adjust the settings of their devices themselves.
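
The rates quoted above follow the convention, standard in the text entry literature, that a ‘word’ is five characters. A minimal sketch of the arithmetic, with figures invented purely for illustration:

    # Text entry rate in words per minute (WPM), with a "word" normalised
    # to five characters, as in the text entry literature.

    def words_per_minute(chars_typed: int, seconds: float) -> float:
        return (chars_typed / 5.0) / (seconds / 60.0)

    # Example: 125 characters entered in 150 seconds -> 10 WPM,
    # the rate long assumed typical for dwell-based eye typing.
    print(words_per_minute(125, 150))  # 10.0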

This combination of different areas – advances in technology, increased knowledge of users and their needs, and development of algorithms that facilitate the interaction – makes gaze interaction using eye tracking both a challenging and a fascinating field. It is truly multidisciplinary and requires co-operation between researchers and practitioners.

Why This Book
To facilitate bringing together the different stakeholders in this field, the European Union funded a Network of Excellence called COGAIN (Communication by Gaze Interaction) in its 6th Framework Programme. The network consisted of 25 partners from universities and research institutes, care centres, end-user organisations, and manufacturers. The partners came from the Czech Republic, Denmark, Finland, France, Germany, Italy, Japan, Lithuania, Spain, Sweden, Switzerland, the UK, and the USA. It was a unique and unprecedented coalition: it put researchers in direct contact with the user community, enabled manufacturers to learn from the latest research and share their extensive experience with researchers, and provided the user community with information on the latest technological solutions.

This book is a result of the work done in COGAIN. In 2004–2009, the project implemented various gaze-operated systems; conducted research; and produced, alongside new hardware and software, many reports and deliverables containing a great deal of information on eye tracking, gaze estimation, safety issues, gaze-based interfaces, and (especially) gaze-based communication and control. The deliverables of the project are freely available on the Web (http://www.cogain.org/wiki/COGAIN_reports). Although they often represent the most comprehensive account of a specific theme, finding the right deliverable and obtaining the necessary background for reading it may be challenging. We are therefore proud to offer you this ‘COGAIN book’, which collects the essential knowledge between the covers of one volume in an easy-to-follow, organised format.

The COGAIN book is a collection of self-study materials for people interested in the applied use of gaze tracking. The book can also be used as a textbook for courses that include an eye tracking module. It provides basic knowledge of how a gaze tracker works, with pointers to further reading for those eager to know more. It also gives practical hints on building interactive applications that benefit from gaze input, and instructions for building a basic eye tracker from off-the-shelf components. Special attention is given to applications targeted at people with disabilities. The book will help professionals working in the field of assistive technology to know the key features of a gaze control system, so that they can make an assessment and select the best system for each client. Examples and case studies are provided to illustrate the issues. The book should be interesting and useful for anyone who has an interest in the subject but is not an expert in all areas, including user interface designers, software engineers, researchers, rehabilitation experts, and assistive technology professionals. Eye tracking brings together diverse skills and technologies and is a truly multidisciplinary area. This book is an attempt to bring the multidisciplinary expertise of the COGAIN Network of Excellence together in written form.

Outline of the Content
The book is divided into six thematic sections, organised so that the sections that are easier for a novice reader to follow come before the more demanding ones. Sections 2–4 provide practical advice and numerous case studies, while Sections 5 and 6 focus on more theoretical and technical information. Each section is complete on its own, and the reader may choose to read only the sections of interest. However, we recommend first reading the introduction, as it should aid in understanding the various thematic sections of the book and putting them into context.

Section 1 describes the book’s area of focus and introduces the basic concepts. Majaranta and Donegan provide a brief glance at the history of eye tracking and its application areas in ‘Introduction to Gaze Interaction’. Mulvey covers the fundamentals in ‘Eye Anatomy, Eye Movements, and Vision’. Hansen and Majaranta then provide a simple explanation of how a gaze tracking device works in ‘Basics of Camera-Based Gaze Tracking’.

Section 2 focuses on end-user-related issues. First, Donegan, in ‘Features of Gaze Control Systems’, discusses the user requirements and key features of a gaze communication and control system from the user’s point of view. Then ‘A Model for Gaze Control Assessments and Evaluation’, by Holmqvist and Buchholz, provides practical advice for assistive technology professionals on how to find the right combination of user needs and supporting technology. Pasian, Corno, Signorile, and Farinetti, in ‘The Impact of Gaze-controlled Technology on Quality of Life’, present several case studies and results from user trials, describing in detail the process of introducing an eye tracking device to impaired users. Further case studies, focusing on users whose impairments might at first seem to prevent the use of eye gaze for control, are described by Donegan in ‘Participatory Design – the Story of Jayne and Other Complex Cases’. This section should be useful for everybody who works with people with disabilities, but also for developers who wish to gain a deeper understanding of the target users and their needs.

Section 3 focuses on gaze control applications. It starts with the classic application area of gaze-based interaction: in ‘Communication and Text Entry by Gaze’, Majaranta reviews both traditional approaches and more recent research on the topic. Skovsgaard, Räihä, and Tall then take a broader look at gaze-based interaction techniques in ‘Computer Control by Gaze’. They take the reader through numerous techniques, citing results on their efficiency, and discuss as an example one specific field of application: browsing large information spaces, such as the World Wide Web, with eye gaze. In the last chapter, ‘Beyond Communication and Control: Environmental Control and Mobility by Gaze’, Bates, Castellina, Corno, Novák, and Štepánková review the challenges and requirements in taking gaze-based interaction beyond communication and control, to areas such as environmental control and mobility by gaze. This field is still in its early stages of development compared with general interaction and control of a computer. The chapter pins down issues of safety and compatibility with other control systems that need to be taken into account in further development of the solutions. In all three chapters, design issues are discussed in such a way that this part of the book should be useful for anyone interested in implementing a gaze-controlled application, independent of the application area.

Section 4 broadens the perspective from voluntary eye movements used for control to involuntary, natural eye movements. In the first chapter, ‘Eye Movements and Attention’, Mulvey and Heubner provide an overview of the relationship between eye movements, cognitive processing, and attention – examining visual attention in particular. The chapter looks at what might be possible beyond direct point-and-click gaze control, in inferring subjective states. Vidaurre, Tangermann, Kübler, Müller, and Millán, in ‘Brain–Computer Interfaces and Visual Activity’, present what is known about the effects of visual stimuli on brain activity as measured via an array of electrodes placed on the scalp. They also discuss the possibilities of brain-controlled interfaces, either with brain signals as the sole input or in combination with the measured point of gaze. Istance and Hyrskykari close this section with ‘Gaze-Aware Systems and Attentive Applications’, introducing applications that exploit the information from natural eye movements in the background of the application and do not require the user to intentionally change gaze behaviour. Gaze-aware applications provide information about where the user’s visual attention is directed. This additional information channel opens gaze tracking to mainstream applications. The chapter offers design tips and discusses lessons learnt from research applications.

Section 5 introduces research methods, metrics, and measures that have emerged in the field for the design of gaze interaction systems. Hansen and Aoki start this section with ‘Methods and Measures – an Introduction’. Some of the methods are inherited from the engineering approach to system design found within the human factors tradition, while others are unique to gaze interaction. In particular, the chapter takes a detailed look at the metrics used in the analysis of text entry techniques. MacKenzie discusses, in ‘Evaluating Eye Tracking Systems for Computer Input’, techniques applicable in general, not just for text entry. The tools range from evaluating throughput and modelling eye movements in pointing tasks using Fitts’ law to collecting subjective opinions via questionnaires. The chapter describes in detail several case studies (experiments and their statistical analyses) that can serve as models for the design and analysis of new interaction techniques. It is well known that eye tracking experiments produce large quantities of data, and this alone may make it tedious to find interesting patterns in gaze behaviour. In the next chapter, ‘Gaze Data Analysis: Methods, Tools, Visualisations’, Špakov discusses software tools that alleviate this problem by providing overviews of gaze behaviour that can be inspected visually. While most of this book focuses on the use of eye gaze for interaction, the visualisation tools can equally well be applied in traditional usability studies of various visual layouts, such as alternative designs of Web pages. Heikkilä and Ovaska give, in ‘Usability Evaluation of Gaze Interaction’, advice on how to carry out an experiment or usability test in a laboratory setting. They, too, discuss several examples of experiments analysed in various contexts. The chapter pays special attention to what should be done with the collected raw data: how to define fixation filters and specify areas of interest as the starting point for visualisations and statistical analyses. Finally, Donegan, Gill, and Oosthuizen take us back from mainstream applications and their analysis to the special issues raised by users with disabilities and the applications intended for them, in ‘A Client-focused Methodology for Gaze-Control Assessment, Implementation, and Evaluation’.
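
As a taste of the metrics covered in MacKenzie’s chapter, the sketch below computes the Fitts’ law index of difficulty and the resulting throughput for a pointing task, using the standard Shannon formulation; the sample distance, target width, and movement time are invented for illustration and do not come from any study cited here.

    import math

    # Fitts' law metrics for a pointing task (Shannon formulation).
    # d = distance to the target, w = target width, mt_s = movement time (s).

    def index_of_difficulty(d: float, w: float) -> float:
        return math.log2(d / w + 1.0)            # bits

    def throughput(d: float, w: float, mt_s: float) -> float:
        return index_of_difficulty(d, w) / mt_s  # bits per second

    # Invented example: a 512-pixel move to a 32-pixel target in 0.8 s.
    print(index_of_difficulty(512, 32))  # ~4.09 bits
    print(throughput(512, 32, 0.8))      # ~5.11 bits/s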

Section 6 takes the reader into the heart of a gaze tracking system. It introduces both the hardware and the software parts of an eye movement tracking system and explains in detail how such a system works and what is required for the estimation of gaze direction. This section can be seen as a tutorial for readers who wish to build their own gaze tracker. Hansen, Villanueva, Mulvey, and Mardanbegi open the section with ‘Introduction to Eye and Gaze Trackers’, which provides an overview of the main software and hardware components needed for eye tracking. It explains how the joint effect of eye position and head position can be used to determine the estimated point of gaze on the screen. In ‘Image Analysis’, Droege walks the reader through the many algorithms involved in identifying from a video image the user’s eye, the pupil, and a glint caused by an external light source (if one is used). After these core inputs have been obtained, another set of algorithms is needed for estimating the point of gaze; these are covered by Villanueva, Cabeza, and San Agustin in the chapter ‘Gaze Estimation’. A crucial element for making this possible is a geometric model of the eyeball, which allows mathematical computation of how light travels through the eye and is reflected from the cornea and retina. In ‘Eye Tracker Hardware Design’, Daunys looks at how to design the hardware components needed in an eye tracker. This involves selecting a suitable camera and processor, choosing the appropriate optical system, and deciding on the illumination of the eye needed to facilitate the image analysis algorithms. The most common source of illumination in use today is infrared light, since it is invisible to the human eye and produces images that are better suited to eye and gaze tracking. Nevertheless, its long-term use may be irritating or cause dryness of the eye. Next, Mulvey, Villanueva, Sliney, Lange, and Donegan present the work done in COGAIN to determine safety limits for exposure to infrared light, in ‘Safety Issues and Infrared Light’. Hansen, Mulvey, and Mardanbegi then wrap up the section by looking at where eye tracker development can be expected to head next, in their ‘Discussion and Future Directions’ chapter.
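
To give a flavour of the gaze estimation step, one widespread approach on remote trackers (one of several covered in the section, not necessarily the method of any particular system) maps the pupil–glint vector produced by the image analysis to screen coordinates with a low-order polynomial fitted during a short calibration. A minimal sketch, with invented calibration data standing in for real measurements:

    import numpy as np

    # Fit a second-order polynomial mapping the pupil-glint difference
    # vector (x, y) from the eye image to screen coordinates, using
    # samples gathered while the user fixates known calibration targets.

    def features(v):
        x, y = v[:, 0], v[:, 1]
        return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

    def calibrate(pupil_glint, screen_xy):
        # Least-squares fit, one coefficient vector per screen axis.
        coeffs, *_ = np.linalg.lstsq(features(pupil_glint), screen_xy, rcond=None)
        return coeffs

    def estimate(coeffs, pupil_glint):
        return features(pupil_glint) @ coeffs

    # Nine fixation targets (pixels) and fake measured vectors for them:
    targets = np.array([[x, y] for y in (100, 500, 900)
                        for x in (100, 800, 1500)], dtype=float)
    vectors = targets / 2000.0 + np.random.normal(0, 0.002, targets.shape)
    model = calibrate(vectors, targets)
    print(estimate(model, vectors[:1]))  # close to [100, 100]

Nine or more calibration points are commonly used so that the six coefficients per axis are well over-determined; model-based alternatives using eyeball geometry are discussed in the ‘Gaze Estimation’ chapter itself.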

Finally, the editors of the book – Donegan, Majaranta, Hansen, Hyrskykari, Aoki, Hansen, and Räihä – sum up the research presented in this book in ‘Conclusion and a Look to the Future’. The closing chapter looks at several themes in the area of gaze interaction and assistive technologies based on eye tracking where research and development are currently active.

Conclusion
The focus of this book is on the use of gaze for interaction. Other common uses of eye trackers are found in psychological studies and in usability analysis of visual designs, such as Web pages. There is a wealth of literature and books on both, but knowledge on the use of gaze for interaction has been scattered. A seminal book was published by Duchowski (2003), but since then the main sources for new techniques have been the biennial Symposium on Eye Tracking Research and Applications. In addition to bringing the reader up to date with the key developments in a single compendium, another element sets this volume apart from the symposium proceedings: the focus on solutions aimed at users with disabilities, without forgetting mainstream applications. The latter is of increasing importance as the algorithms improve, more inexpensive cameras can be used for eye tracking, and the technology can eventually be shipped as a commodity with modern laptop computers.

This book is a joint effort of key actors in the COGAIN Network of Excellence. Funding from the European Commission ended in 2009, but COGAIN continues its work in the form of the COGAIN Association; see http://www.cogain.org/. The association is open to anyone working in this fascinating field, for sharing knowledge and experiences to help others in the community. The authors of the chapters of this book are donating any proceeds from the book to the association.

We hope you enjoy reading the book and find it useful!

References

Donegan, M., Morris, D. J., Corno, F., Signorile, I., Chió, A., Pasian, V., … Holmqvist, E. (2009). Understanding users and their needs. Universal Access in the Information Society, 8(4), 259–275. doi:10.1007/s10209-009-0148-1

Duchowski, A. T. (2003). Eye tracking methodology: Theory and practice. London, UK: Springer.

Gips, J., Olivieri, C. P., & Tecce, J. J. (1993). Direct control of the computer through electrodes placed around the eyes. In M. J. Smith & G. Salvendy (Eds.), Human–computer interaction: Applications and case studies. Proceedings of HCI International ’93 (pp. 630–635). Amsterdam, The Netherlands: Elsevier.

Hutchinson, T. E., White, K. P. Jr., Martin, W. N., Reichert, K. C., & Frey, L. A. (1989). Human–computer interaction using eye-gaze input. IEEE Transactions on Systems, Man, and Cybernetics, 19(6), 1527–1534. doi:10.1109/21.44068

Jacob, R. J. K. (1991). The use of eye movements in human–computer interaction techniques: What you look at is what you get. ACM Transactions on Information Systems, 9(3), 152–169. doi:10.1145/123078.128728

Majaranta, P., Ahola, U.-K., & Špakov, O. (2009). Fast gaze typing with an adjustable dwell time. Proceedings of the 27th International Conference on Human Factors in Computing Systems (CHI’09) (pp. 357–360). New York, NY: ACM. doi:10.1145/1518701.1518758

Martin, J., & Yockey, R. (2000). On any given day. Winston-Salem, NC: Blair.

Vigand, P., & Vigand, S. (1997). Only the eyes say yes: A love story. New York, NY: Arcade.

Ward, D. J., & MacKay, D. J. C. (2002). Fast hands-free writing by gaze direction. Nature, 418(6900), 838. doi:10.1038/418838a

Author(s)/Editor(s) Biography

Päivi Majaranta is a researcher at the University of Tampere, where she also received her PhD in Interactive Technology in 2009. She has worked on several research projects related to eye tracking. She is especially interested in the application of eye tracking in gaze-controlled and gaze-aware interfaces.
Hirotaka Aoki is an Associate Professor at Tokyo Institute of Technology. He received his PhD in Engineering from Tokyo Institute of Technology. His current research interests lie in the application of eye tracking techniques to cognitive work analysis, usability engineering and consumer behaviour analysis.
Mick Donegan is the Founder and Director of SpecialEffect, a charity dedicated to providing enhanced opportunities for people with disabilities to access video games and express themselves through design and music. He was awarded a PhD by Birmingham University in 2006 for an investigation into the conditions for the successful use of Assistive Technology in mainstream education. He was the coordinator of the User Requirements element of COGAIN, a European gaze control and disability project, and is currently an advisor for TOBI, a European-funded project on brain control and disability.
Dan Witzner Hansen is an Associate Professor in the Innovative Communication group at the IT University of Copenhagen, where he also received his PhD. He has been an assistant professor at both ITU and the Technical University of Denmark, and a visiting researcher at the Cavendish Laboratory, University of Cambridge, UK. His research interests are in computer vision and machine learning for interactive purposes, with a special focus on eye tracking and gaze interaction in mobile scenarios. He is the author of several papers and patents related to eye and gaze tracking.
John Paulin Hansen is an Associate Professor at the IT University of Copenhagen. He received his PhD in psychology from Aarhus University. Hansen has a major interest in gaze interaction and Assistive Technologies. He has been pioneering the use of gaze tracking for usability studies and was one of the initiators of the COGAIN network. Hansen is now head of the Innovative Communication research group at IT University of Copenhagen.
Aulikki Hyrskykari is a Lecturer in Computer Science at the University of Tampere. She obtained her Lic. Phil. degree in Computer Science in 1995 and her PhD in Interactive Technology in 2006 at the University of Tampere. She worked as a coordinator of the EU FP5 IST project iEye, a three-year project focused on studying gaze-assisted access to information. She has also served as a programme and organising committee member for several international HCI conferences, most recently as program chair of the ACM Eye Tracking Research and Applications conference, ETRA 2010.
Kari-Jouko Räihä is a Professor of Computer Science at the University of Tampere. He received his PhD in 1982 from the University of Helsinki. He has done research on compiler construction and databases, and for the past 20 years on human–computer interaction. He is particularly interested in new interaction techniques for the desktop environment and in the use of eye gaze both for analysing interaction and as an input channel. He has led dozens of research projects, including the COGAIN Network of Excellence funded by the European Commission. He is currently the Dean of the School of Information Sciences at the University of Tampere.