Falls are one of the biggest risks to the health and wellbeing of the elderly and a major problem for today’s society. Approximately one in three adults aged 65 years or older falls each year, causing physical injuries as well as psychological harm, and the problem also has significant economic consequences. According to the World Health Organization, fall-related injuries are estimated to increase by 100% by the year 2030. The ability to distinguish between Activities of Daily Living (ADLs) and falls is therefore an important problem.
Data from the accelerometer and gyroscope sensors (plus orientation data; the orientation sensor is software-based and derives its data from the accelerometer and the geomagnetic field sensor) of a smartphone were recorded. Specifically, a Samsung Galaxy S3 device with the LSM330DLC inertial module (3D accelerometer and gyroscope) was used to capture the motion data. The gyroscope was calibrated prior to the recordings using the device’s integrated tool. For data capture, an Android application was developed that records raw acceleration, angular velocity and orientation data with the sampling-rate parameter set to SENSOR_DELAY_FASTEST, which provides the highest possible sampling rate. The signals can be subsampled at any time if lower sampling rates are desired.
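Because the raw signals are captured at the device’s maximum (and therefore slightly irregular) rate, lower sampling rates can be obtained by resampling onto a uniform time grid. A minimal sketch of one way to do this (names and the interpolation approach are illustrative, not part of the dataset tooling), assuming timestamps in nanoseconds as stored in the files:

```python
import numpy as np

def resample(timestamps_ns, values, target_hz):
    """Resample one axis of an unevenly sampled sensor signal
    onto a uniform grid via linear interpolation.

    timestamps_ns: 1-D array of sample timestamps in nanoseconds
    values:        1-D array of sensor readings (one axis)
    target_hz:     desired output sampling rate in Hz
    """
    # Convert to seconds relative to the first sample
    t = (np.asarray(timestamps_ns) - timestamps_ns[0]) / 1e9
    duration = t[-1]
    n_out = int(duration * target_hz) + 1
    t_uniform = np.arange(n_out) / target_hz
    return t_uniform, np.interp(t_uniform, t, np.asarray(values, dtype=float))

# Example: an irregular ~100 Hz signal resampled to 20 Hz
ts = np.cumsum(np.random.uniform(8e6, 12e6, 500)).astype(np.int64)  # ns
acc_x = np.sin(2 * np.pi * 0.5 * ts / 1e9)                          # dummy axis
t20, x20 = resample(ts, acc_x, 20)
```

Linear interpolation is only one choice; for large rate reductions a low-pass filter before decimation (e.g. `scipy.signal.decimate`) avoids aliasing.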
In an attempt to simulate everyday usage of mobile phones, the device was placed in a trouser pocket freely chosen by the subject, in any random orientation. For the falls, the subjects used the pocket on the side opposite to the falling direction. For the fall simulation, a relatively hard mattress of 5 cm thickness (as used in martial arts) was used to dampen the fall. All falls were performed under the strict instructions of the authors to ensure that the subjects performed the correct fall in a realistic way. The ADLs were chosen based on their commonness and on their similarity to actual falls, which may produce false positives.
Each sample is stored along with its timestamp in nanoseconds. The developed application uses an SQLite database to store activities and the subjects’ data. Three .txt files are stored for each trial: one for the accelerometer, one for the gyroscope, and one for the orientation data. The header section of every file contains information about the recording, the subject, and the activity code.
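A trial file can then be read back for analysis. The following sketch separates the header from the samples; note that the exact header marker (`#` here) and the comma-separated column layout (timestamp, x, y, z) are assumptions made for illustration, so they should be adapted to the actual files:

```python
import io

def load_trial(fileobj):
    """Parse a trial .txt file: header lines (assumed to start with '#'),
    followed by comma-separated rows of timestamp_ns, x, y, z."""
    header, samples = [], []
    for line in fileobj:
        line = line.strip()
        if not line:
            continue
        if line.startswith("#"):                     # assumed header marker
            header.append(line.lstrip("# "))
        else:
            ts, x, y, z = line.split(",")
            samples.append((int(ts), float(x), float(y), float(z)))
    return header, samples

# Hypothetical accelerometer file content
example = io.StringIO(
    "# subject: sub1\n"
    "# activity: WAL\n"
    "1000000000,0.12,9.77,0.31\n"
    "1005000000,0.10,9.80,0.29\n"
)
hdr, data = load_trial(example)
```

Returning the header separately keeps the subject and activity metadata available for labelling without mixing it into the numeric samples.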
Second Release of MobiAct dataset
MobiAct is a publicly available dataset which includes smartphone data recorded while participants perform different types of activities and a range of falls. It extends the previously released MobiAct dataset (Vavoulas et al. 2016).
The extended version of the MobiAct dataset includes 4 different types of falls, 12 different ADLs and a scenario of daily living from a total of 66 subjects with more than 3200 trials, all captured by a smartphone. The activities of daily living were selected based on the following criteria: a) fall-like activities were included first, i.e., sequences in which the subject usually remains motionless at the end in different positions, such as sitting on a chair or stepping in and out of a car; b) activities which are sudden or rapid and therefore similar to falls, such as jumping and jogging; c) the most common everyday activities, such as walking, standing, and ascending and descending stairs (“stairs up” and “stairs down”).
The MobiFall and MobiAct datasets are available for non-commercial research and educational purposes only.
* Please use the following citation: Chatzaki C., Pediaditis M., Vavoulas G., Tsiknakis M. (2017) Human Daily Activity and Fall Recognition Using a Smartphone’s Acceleration Sensor. In: Rocker C., O’Donoghue J., Ziefle M., Helfert M., Molloy W. (eds) Information and Communication Technologies for Ageing Well and e-Health. ICT4AWE 2016. Communications in Computer and Information Science, vol 736, pp 100-118. Springer, Cham, DOI 10.1007/978-3-319-62704-5_7.
Download the second release of MobiAct dataset from here.
First release of MobiAct dataset
* Please use the following citation: Vavoulas, G., Chatzaki, C., Malliotakis, T., Pediaditis, M. and Tsiknakis, M., The MobiAct Dataset: Recognition of Activities of Daily Living using Smartphones. In Proceedings of the International Conference on Information and Communication Technologies for Ageing Well and e-Health (ICT4AWE 2016), pages 143-151, ISBN: 978-989-758-180-9.
Download the first release of MobiAct dataset from here.
Second release of MobiFall dataset
* Please use the following citation: G. Vavoulas, M. Pediaditis, C. Chatzaki, E. G. Spanakis, M. Tsiknakis, The MobiFall Dataset: Fall Detection and Classification with a Smartphone, invited publication for the International Journal of Monitoring and Surveillance Technologies Research, pp 44-56, 2014, DOI:10.4018/ijmstr.2014010103.
Download the second release of MobiFall dataset from here.
First release of MobiFall dataset
* Please use the following citation: G. Vavoulas, M. Pediaditis, E. Spanakis, M. Tsiknakis, The MobiFall Dataset: An Initial Evaluation of Fall Detection Algorithms Using Smartphones, 6th IEEE International Symposium on Monitoring & Surveillance Research (ISMSR): Healthcare-Bioinformatics, Chania, Greece, 2013, DOI:10.1109/BIBE.2013.6701629.
Download the first release of MobiFall dataset from here.
The development of the MobiAct and MobiFall datasets was partially funded by the FP7 project “MyHealthAvatar – A Demonstration of 4D Digital Avatar Infrastructure for Access of Complete Patient Information”.