A lifelog media system stores and manages users' everyday experiences in the form of multimedia data. Building such a system requires an integrated framework for capturing experiences as multimedia data, storing and managing those data, and presenting the data to the user in a user-friendly way. Because the user is mobile, we built a mobile framework that includes wearable devices, which let the user capture experiences easily, and a Web-based management system that can be accessed anytime and anywhere through a Web interface. In this chapter, we provide solutions for some of the issues that arise in this system (such as mobility and user friendliness), focusing mostly on database performance.
The lifelog system is intended to store the everyday experiences of a user in a database system. The objective of the lifelog concept is to be able to trace the "threads" of an individual's life in terms of events, states, and relationships (DARPA, 2003). Generically, the term lifelog (or "flog") describes a storage system that can automatically and persistently record and archive some useful informational dimension of an object or of a user's life experiences in a particular data structure (Wikipedia, 2007).
This kind of system involves capturing a great amount of personal experience in the form of digital multimedia. To manage those data systematically, so that the user can efficiently retrieve useful experiences whenever he or she needs them, an efficient metadata database management system is essential, one that enables user-friendly search of the experiences using human-conceivable cues.
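To make the idea of cue-based retrieval concrete, the sketch below shows one way such metadata records could be filtered by human-conceivable cues such as location or activity. The schema (fields like `location` and `activity`) is illustrative only and is not the metadata structure defined in this chapter:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class MediaMetadata:
    media_file: str        # path to the captured multimedia file
    captured_at: datetime  # timestamp from the wearable device
    location: str          # place name, e.g. resolved from a GPS sensor
    activity: str          # activity inferred from wearable sensor data

def search(records: List[MediaMetadata],
           location: Optional[str] = None,
           activity: Optional[str] = None) -> List[MediaMetadata]:
    """Return the records matching every cue that was supplied."""
    hits = records
    if location is not None:
        hits = [r for r in hits if r.location == location]
    if activity is not None:
        hits = [r for r in hits if r.activity == activity]
    return hits

log = [
    MediaMetadata("clip001.mp4", datetime(2007, 3, 1, 9, 0), "office", "meeting"),
    MediaMetadata("clip002.mp4", datetime(2007, 3, 1, 12, 30), "cafe", "lunch"),
]

# Retrieve the experience by a human-conceivable cue: "where was I at the cafe?"
print([r.media_file for r in search(log, location="cafe")])  # ['clip002.mp4']
```

In a real system the linear scan above would be replaced by database indexes over the cue attributes, which is exactly where the performance concerns discussed in this chapter come in.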
Several studies have addressed lifelog media (Aizawa, Tancharoen, Kawasaki & Yamasaki, 2004; de Silva, Oh, Yamasaki & Aizawa, 2005; He, Xiang & Shi, 2005). These studies focus on the concept of digital logging and the conceptual implementation of their lifelog media systems.
Takahashi et al. (2004) concentrated especially on data representation and introduced a multilayer data interpretation to represent human interaction. That multilayer interpretation is similar to the approach taken here, but this research uses metadata generated automatically from various wearable sensors, which lets users search for the desired media using a wide range of information.
To manage the enormous volume of lifelog media data efficiently, the system requires a special database system with a special indexing mechanism. Tusch, Kosch & Böszörményi (2000) introduced VIDEX, a generalized model for indexing video, which was applied in SMOOTH (Kosch et al., 2001) to manage soccer game records. They used an RDBMS to implement the database, while other projects, such as OpenDrama (Celma & Mieza, 2004), use a Native XML Database (NXD).
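The RDBMS-versus-NXD choice matters because lifelog annotations are naturally hierarchical. The fragment below is a minimal sketch of what an XML annotation for one captured clip might look like, and how it can be queried with path expressions rather than by first flattening it into relational tables. The element and attribute names here are assumptions for illustration, not the schema used by OpenDrama or by this chapter:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML annotation for a single lifelog clip.
annotation = """
<clip file="clip002.mp4">
  <time start="12:30" end="12:45"/>
  <location>cafe</location>
  <sensors>
    <heartRate avg="72"/>
  </sensors>
</clip>
"""

root = ET.fromstring(annotation)

# Path-style queries, of the kind a native XML database evaluates directly.
print(root.findtext("location"))          # cafe
print(root.find("time").get("start"))     # 12:30
print(root.find("sensors/heartRate").get("avg"))  # 72
```

An NXD stores and indexes such documents as-is, whereas an RDBMS would require mapping the nested structure onto tables before it could be queried.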
Key Terms in this Chapter
Wearable Devices: Devices that can be worn on a user’s body to capture the user’s activities and experiences.
Web Interface: A user interface that is implemented in the form of a Web page and can be navigated using a standard Web browser.
Lifelog: A storage system that can automatically and persistently record and archive some informational dimension of an object’s (object lifelog) or user’s (user lifelog) life experience in a particular data category.
XML Annotation: Extra information associated with a particular point in a document or other piece of information, expressed in the form of an XML document.
XML Database: A data persistence software system that allows data to be imported, accessed, and exported in an XML format
Metadata: Data about data, used to facilitate the understanding, use, and management of data.
Multimedia: Media that utilizes a combination of different content forms (e.g., text, audio, still images, animation, video, and interactive content).