The MobiFall Dataset: Fall Detection and Classification with a Smartphone

George Vavoulas, Matthew Pediaditis, Charikleia Chatzaki, Emmanouil G. Spanakis, Manolis Tsiknakis
DOI: 10.4018/ijmstr.2014010103

Abstract

Fall detection is receiving significant attention in the fields of preventive medicine, wellness management and assisted living, especially for the elderly. As a result, several fall detection systems are reported in the research literature or exist as commercial products. Most of them use accelerometers and/or gyroscopes attached to a person's body as the primary signal sources. These systems use either discrete sensors as part of a product designed specifically for this task, or sensors that are embedded in mobile devices such as smartphones. The latter approach has the advantage of offering well-tested and widely available communication services, e.g. for calling emergency services when necessary. Nevertheless, automatic fall detection continues to present significant challenges, with the recognition of the type of fall being the most critical. The aim of this work is to introduce a human fall and activity dataset to be used in testing new detection methods, as well as in performing objective comparisons between different reported algorithms for fall detection and activity recognition, based on inertial-sensor data from smartphones. The dataset contains signals recorded from the accelerometer and gyroscope sensors of a latest-technology smartphone for four different types of falls and nine different activities of daily living. Utilizing this dataset, the results of an elaborate evaluation of machine learning-based fall detection and fall classification are presented and discussed in detail.

1. Introduction

A fall is defined as a sudden, uncontrolled and unintentional downward displacement of the body to the ground. Falls affect millions of people (especially the elderly) and may result in significant injuries (Kannus, Sievänen, Palvanen, Järvinen, & Parkkari, 2005). Moreover, injury is a leading cause of death among elderly people (Stevens, Corso, Finkelstein, & Miller, 2006). Automatic fall detection systems rely on a set of threshold values for predetermined parameters, together with classification rules, to continuously process motion data obtained from an accelerometer, a gyroscope or other sensors and to determine in near real time whether a fall event has occurred.
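As an illustration of the threshold-based processing chain just described, the sketch below flags a candidate fall when the total acceleration magnitude first drops below a free-fall threshold and then spikes above an impact threshold within a short window. This is a minimal sketch only, written in Python for clarity: the threshold values (FREE_FALL_G, IMPACT_G, MAX_GAP_SAMPLES) and the detect_fall helper are illustrative assumptions and do not correspond to any specific method discussed in this article.

import math

# Illustrative thresholds (in units of g); real systems tune these empirically.
FREE_FALL_G = 0.4      # magnitude dip suggesting free fall
IMPACT_G = 2.5         # magnitude spike suggesting impact with the ground
MAX_GAP_SAMPLES = 50   # free fall and impact must occur within this many samples

def magnitude(ax, ay, az):
    """Euclidean norm of a 3-axis accelerometer sample, in g."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def detect_fall(samples):
    """Return the sample index of a candidate fall, or None.

    `samples` is a sequence of (ax, ay, az) tuples in g. A fall is flagged
    when a free-fall dip is followed by an impact spike within
    MAX_GAP_SAMPLES samples.
    """
    free_fall_at = None
    for i, (ax, ay, az) in enumerate(samples):
        m = magnitude(ax, ay, az)
        if m < FREE_FALL_G:
            free_fall_at = i
        elif (m > IMPACT_G and free_fall_at is not None
              and i - free_fall_at <= MAX_GAP_SAMPLES):
            return i
    return None

In practice such thresholds depend on the device, the sampling rate and the sensor placement, which is precisely the kind of variation that a common dataset is needed to compare across methods.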

Automatic fall detection has been one of the most active topics in preventive health care over the last decade. Numerous papers report approaches to automatic fall detection based on the analysis of images, video and audio, as well as of inertial sensor data from sensors that are either discrete (stand-alone) or integrated in a mobile phone (Abbate, Avvenuti, Bonatesta, Cola, Corsini, & Vecchio, 2012; Bagalà et al., 2012; Bourke, O’Brien, & Lyons, 2007; Fudickar, Karth, Mahr, & Schnor, 2012; Rougier, Meunier, St-Arnaud, & Rousseau, 2011; Sposaro & Tyson, 2009; Vaidehi, Ganapathy, Mohan, Aldrin, & Nirmal, 2011; Zhang, Wang, Liu, & Hou, 2006).

The utilization of mobile phones or smartphones for the provision of pervasive health care services (Hristoskova, Sakkalis, Zacharioudakis, Tsiknakis, & De Turck, 2014) provides a cost-effective and powerful answer to the well-known issue of increasing health-care needs and costs caused by the growing elderly population (Spanakis, Lelis, Chiarugi, & Chronaki, 2005; Spanakis et al., 2012). Various such fall detection systems already exist (Table 1), and each one of them uses a specific phone with different embedded sensors. Moreover, each method is evaluated within its own testing environment and with its own data. Thus, it is very difficult, if not impossible, to compare the validity and effectiveness of the different existing approaches.

Table 1.
Overview of fall detection and classification methods (Falls/ADLs: number or description of the motion types tested; SE: sensitivity; SP: specificity; FN: false negatives; FP: false positives)

Method Type | Article | No. of Subjects | Falls | ADLs | Phone/Sensor Position | Performance
Threshold based | Dai et al. (2010) | 15 | 3 | 3 | Waist | Forward fall: FN 2.6%; lateral fall: FN 3.3%; backward fall: FN 2.1%; ADLs: FP 8.7%
Threshold based | Lee et al. (2011) | 18 | 4 | 8 | Waist | SP: 81%, SE: 77%
Threshold based | Tolkiehn et al. (2011) | 12 | 13 | 12 | Waist | SP: 85.24%, SE: 87.77%
Threshold based | Fang et al. (2012) | 4 | Performed but not defined | 4 | Chest, waist, thigh | SP: 72.22%, SE: 73.78%
Threshold based | Cao et al. (2012) | 20 | Performed but not defined | 3 | Shirt pocket | SP: 92.75%, SE: 86.75%
Threshold based | Viet et al. (2012) | 5 | 3 | 6 | Shirt pocket, pants pocket, hand, ear | SP: 96%, SE: 80%
Machine learning | Zhang et al. (2006) | 32 | Low-risk fall, high-risk fall | Normal/high-intensity, special movements, critical movements | Clothes pocket, hung on neck | Mean ratio of correctness: 93.3%
Machine learning | Luštrek et al. (2009) | 3 | Performed but not defined | 6 | 12 body tags attached to shoulders, elbows, wrists, hips, knees and ankles | SVM accuracy: 97.7% on clear data, 96.5% on noisy data
Machine learning | Abbate et al. (2012) | 7 | 3 | 6 | Waist | Accuracy: 100%
Machine learning | Albert et al. (2012) | 15 | 4 | Fall-like events extracted while 9 subjects wore the device for 10 days | Standardized position & orientation: belt | RLR accuracy: detection 98%, classification 99.6%
Machine learning | Zhao et al. (2012) | 10 | 4 | 3 | Waist | Accuracy: 98.4%
Machine learning | Fahmi et al. (2012) | Unclear | 4 | 4 | 5 postures: holding phone, phone in ear, chest/pants pockets, sidewise lying, supine lying | SE: 85.3%, SP: 90.5%
Machine learning | Kansiz et al. (2013) | 8 | 4 | 6 | Pocket | Average recall: 0.88
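For reference, the sensitivity (SE) and specificity (SP) figures quoted in Table 1 follow the standard confusion-matrix definitions: SE is the fraction of actual falls that are detected, and SP is the fraction of non-fall (ADL) events that are not flagged as falls. The short Python sketch below restates these formulas; the counts used are hypothetical and are not taken from any of the cited studies.

def sensitivity(tp: int, fn: int) -> float:
    """SE = TP / (TP + FN): proportion of actual falls that were detected."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """SP = TN / (TN + FP): proportion of ADL events not flagged as falls."""
    return tn / (tn + fp)

# Hypothetical counts, for illustration only (not from any cited study).
tp, fn, tn, fp = 45, 5, 90, 10
print(f"SE: {sensitivity(tp, fn):.1%}, SP: {specificity(tn, fp):.1%}")
# -> SE: 90.0%, SP: 90.0%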
