Today, more than a third of all adults and more than half of all college students in the U.S. have smart phones (Nielsenwire, 2012). Commercially available smart phones have enough processor and graphics power to run augmented reality applications, and network connectivity fast enough to download high-resolution videos. In addition, they come with built-in GPS and motion sensors that can track the user’s location reliably and support location-sensitive applications in real time. Because many students carry these powerful devices everywhere they go, delivering educational material through mobile devices strikes us as a prime opportunity to reach the aspirational goal of “learning anytime, anywhere,” which extends Weiser’s vision (Weiser, 1991) of ubiquitous computing (ubicomp) to everyday educational contexts.
Mobile Augmented Reality (MAR)—the dynamic overlay of digital information onto the user’s view through mobile devices—is an increasingly popular technology for enhancing how people interact with and learn about the environment and objects in the physical world. In MAR, mobile devices act as the “magic lens” through which people can see the world annotated or augmented with digital information (Bier et al., 1993). Information displayed in MAR is meant to be understood in conjunction with the dynamically changing physical environment mobile users inhabit. MAR keeps people connected to the physical world through their natural senses (e.g., smelling the air, hearing ambient sounds, feeling the heat of the environment) while simultaneously giving them access to, and interaction with, a wealth of digital information. This combination has the potential to be a powerful way for people to learn about the environment or physical objects in situ, in a way that desktop computers or other stationary displays cannot match.
To date, much of the effort in the fields of HCI and Ubiquitous Computing has focused on technical enablers of MAR (e.g., Billinghurst et al., 2001; Cheok et al., 2004; Gleue et al., 2001; Henrysson et al., 2005; Rekimoto, 1997; Viega et al., 1996). However, many questions remain about how MAR can be used to meaningfully support learning. For example, what types of information or situations would most benefit from MAR displays? In what ways can having access to digital information while embedded in the real physical world contribute to users’ understanding of the environment?
In this article, we investigate the potential role MAR technologies could play in engaging students in uniquely mobile and personal learning experiences by providing experts’ perspectives and knowledge in situ (Goodwin, 1994)—that is, conveying how experts notice things and ask questions about the very environment students are standing in, so that even in the absence of experts, students can actively investigate and learn about their surroundings. Using MAR and experts’ videos on smart phones, we explore ways to draw students’ attention to the physical environment they are immersed in, and to actively engage them with the subjects of sustainability and biodiversity in the specific context of that environment. We report on our iterative design process and evaluation of GreenHat, a MAR system that aims to help students engage with the natural environment from multiple expert perspectives (landscape architecture and conservation biology). The evolution of the prototype design was informed by our observations of experts in the field, as well as of individuals using the MAR tools—in contrast to other mobile tools, such as interactive digital maps—to navigate and learn about their environment. Through multiple iterations of our design of a mobile learning tool, we investigate MAR’s role in enabling increased interaction with the physical environment.