Summary

A standardized evaluation method was developed for Wearable Mobility Monitoring Systems (WMMS) that includes continuous activities in a realistic daily living environment. Testing with a series of daily living activities can decrease activity recognition sensitivity; therefore, realistic testing circuits are encouraged for valid evaluation of WMMS performance.

Abstract

An evaluation method that includes continuous activities in a daily-living environment was developed for Wearable Mobility Monitoring Systems (WMMS) that attempt to recognize user activities. Participants performed a pre-determined set of daily living actions within a continuous test circuit that included mobility activities (walking, standing, sitting, lying, ascending/descending stairs), daily living tasks (combing hair, brushing teeth, preparing food, eating, washing dishes), and subtle environment changes (opening doors, using an elevator, walking on inclines, traversing staircase landings, walking outdoors).

To evaluate WMMS performance on this circuit, fifteen able-bodied participants completed the tasks while wearing a smartphone at their right front pelvis. The WMMS application used smartphone accelerometer and gyroscope signals to classify activity states. A gold standard comparison data set was created by video-recording each trial and manually logging activity onset times. Gold standard and WMMS data were analyzed offline. Three classification sets were calculated for each circuit: (i) mobility or immobility, (ii) sit, stand, lie, or walking, and (iii) sit, stand, lie, walking, climbing stairs, or small standing movement. Sensitivities, specificities, and F-scores for activity categorization and changes-of-state were calculated.

The mobile versus immobile classification set had a sensitivity of 86.30% ± 7.2% and a specificity of 98.96% ± 0.6%, while the second classification set had a sensitivity of 88.35% ± 7.80% and a specificity of 98.51% ± 0.62%. For the third classification set, sensitivity was 84.92% ± 6.38% and specificity was 98.17% ± 0.62%. F-scores for the first, second, and third classification sets were 86.17 ± 6.3, 80.19 ± 6.36, and 78.42 ± 5.96, respectively. These results demonstrate that WMMS performance depends on the evaluation protocol in addition to the algorithms. The demonstrated protocol can be used and tailored for evaluating human activity recognition systems in rehabilitation medicine, where mobility monitoring may be beneficial for clinical decision-making.

Introduction

Ubiquitous sensing has become an engaging research area due to increasingly powerful, small, low cost computing and sensing equipment 1. Mobility monitoring using wearable sensors has generated a great deal of interest since consumer-level microelectronics are capable of detecting motion characteristics with high accuracy 1. Human activity recognition (HAR) using wearable sensors is a recent area of research, with preliminary studies performed in the 1980s and 1990s 2-4.

Modern smartphones contain the necessary sensors and real-time computation capability for mobility activity recognition. Real-time analysis on the device permits activity classification and data upload without user or investigator intervention. A smartphone with mobility analysis software could provide fitness tracking, health monitoring, fall detection, home or work automation, and self-managing exercise programs 5. Smartphones can be considered inertial measurement platforms for detecting mobile activities and mobile patterns in humans, using generated mathematical signal features calculated with onboard sensor outputs 6. Common feature generation methods include heuristic, time-domain, frequency-domain, and wavelet analysis-based approaches 7.
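To make the feature-generation step concrete, the following sketch computes a few common time- and frequency-domain features from one window of tri-axial accelerometer samples. The feature set, window length, and sampling rate here are illustrative assumptions, not the exact features used by any cited system:

```python
import numpy as np

def window_features(ax, ay, az, fs=50):
    """Example time- and frequency-domain features for one window of
    tri-axial accelerometer samples (hypothetical feature set)."""
    mag = np.sqrt(np.asarray(ax) ** 2 + np.asarray(ay) ** 2 + np.asarray(az) ** 2)
    feats = {
        "mean": mag.mean(),              # average magnitude (gravity + motion)
        "std": mag.std(),                # variability within the window
        "range": mag.max() - mag.min(),  # peak-to-peak excursion
    }
    # Frequency domain: dominant frequency of the zero-mean magnitude signal
    spectrum = np.abs(np.fft.rfft(mag - mag.mean()))
    freqs = np.fft.rfftfreq(len(mag), d=1.0 / fs)
    feats["dominant_freq"] = freqs[spectrum.argmax()]
    return feats
```

Features like these, computed per window, form the input vector that a classifier maps to an activity label.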

Modern smartphone HAR systems have shown high prediction accuracy when detecting specified activities 1,5,6,7. However, these studies vary in both evaluation methodology and reported accuracy, since most use their own training set, environmental setup, and data collection protocol. Sensitivity, specificity, accuracy, recall, precision, and F-score are commonly used to describe prediction quality. However, for HAR systems that attempt to categorize several activities, little information is available on methods for recognizing "concurrent activities" or for evaluating the ability to detect activity changes in real time 1. Assessment methods for HAR system accuracy vary substantially between studies and, regardless of the classification algorithm or applied features, descriptions of gold standard evaluation methods are vague in most HAR research.

Activity recognition in a daily living environment has not been extensively researched. Most smartphone-based activity recognition systems are evaluated under controlled conditions, which may favour the algorithm rather than reflect a real-world environment. In these evaluation schemes, participants often perform only the actions targeted for prediction, rather than a large range of realistic activities performed consecutively to mimic real-life events.

Some smartphone HAR studies 8,9 group similar activities together, such as stairs and walking, but exclude other activities from the data set. Prediction accuracy is then determined by how well the algorithm identifies the target activities. Dernbach et al. 10 had participants write down the activity they were about to execute before moving, interrupting continuous change-of-state transitions. HAR system evaluations should assess the algorithm while the participant performs natural actions in a daily living setting, permitting a real-life evaluation that replicates daily use of the application. A realistic circuit includes many changes-of-state as well as a mix of actions that the system cannot predict. An investigator can then assess the algorithm's response to these additional movements, thereby evaluating its robustness to anomalous movements.

This paper presents a Wearable Mobility Monitoring System (WMMS) evaluation protocol that uses a controlled course reflecting real-life daily living environments, so that WMMS evaluation can be made under controlled but realistic conditions. In this protocol, we use a third-generation WMMS that was developed at the University of Ottawa and Ottawa Hospital Research Institute 11-15. The WMMS was designed for smartphones with a tri-axis accelerometer and gyroscope. The mobility algorithm accounts for user variability, reduces the number of false positives in change-of-state identification, and increases sensitivity in activity categorization. Minimizing false positives is important because the WMMS triggers short video clip recording when activity changes-of-state are detected, enabling context-sensitive activity evaluation that further improves WMMS classification; unnecessary video recording wastes storage and battery. The WMMS algorithm is structured as a low-computational learning model and evaluated at different prediction levels, where an increase in prediction level signifies an increase in the number of recognizable activities.

Protocol

This protocol was approved by the Ottawa Health Science Network Research Ethics Board.

1. Preparation

  1. Provide participants with an outline of the research, answer any questions, and obtain informed consent. Record participant characteristics (e.g., age, gender, height, weight, waist girth, leg height from the anterior superior iliac spine to the medial malleolus), identification code, and date on a data sheet. Ensure that the second smartphone, which is used to capture video, is set to a capture rate of at least 30 frames per second.
  2. Securely attach a phone holster to the participant's front right belt or pant waist. Start the smartphone application that will be used to collect the sensor data (i.e., data logging or WMMS application) on the mobility measurement smartphone and ensure that the application is running appropriately. Place the smartphone in the holster, with the back of the device (rear camera) facing outward.
  3. Start digital video recording on a second smartphone. For anonymity, record the comparison video without showing the person's face, but ensure that all activity transitions are recorded. The phone can be handheld.

2. Activity Circuit

  1. Follow the participant and video their actions with the second smartphone while they perform the following actions, as instructed verbally by the investigator:
    1. From a standing position, shake the smartphone to indicate the start of the trial.
    2. Continue standing for at least 10 sec. This standing phase can be used for phone orientation calibration 14.
    3. Walk to a nearby chair and sit down.
    4. Stand up and walk 60 meters to an elevator.
    5. Stand and wait for the elevator and then walk into the elevator.
    6. Take the elevator to the second floor.
    7. Turn and walk into the home environment.
    8. Walk into the bathroom and simulate brushing teeth.
    9. Simulate combing hair.
    10. Simulate washing hands.
    11. Dry hands using a towel.
    12. Walk to the kitchen.
    13. Take dishes from a rack and place them on the counter.
    14. Fill a kettle with water from the kitchen sink.
    15. Place the kettle on the stove element.
    16. Place bread in a toaster.
    17. Walk to the dining room.
    18. Sit at a dining room table.
    19. Simulate eating a meal at the table.
    20. Stand and walk back to the kitchen sink.
    21. Rinse off the dishes and place them in a rack.
    22. Walk from the kitchen back to the elevator.
    23. Stand and wait for the elevator and then walk into the elevator.
    24. Take the elevator to the first floor.
    25. Walk 50 meters to a stairwell.
    26. Open the door and enter the stairwell.
    27. Walk up stairs (13 steps, around landing, 13 steps).
    28. Open the stairwell door into the hallway.
    29. Turn right and walk down the hall for 15 meters.
    30. Turn around and walk 15 meters back to the stairwell.
    31. Open the door and enter the stairwell.
    32. Walk down stairs (13 steps, around landing, 13 steps).
    33. Exit the stairwell and walk into a room.
    34. Lie on a bed.
    35. Get up and walk 10 meters to a ramp.
    36. Walk up the ramp, turn around, then down the ramp (20 meters).
    37. Continue walking into the hall and open the door to outside.
    38. Walk 100 meters on the paved pathway.
    39. Turn around and walk back to the room.
    40. Walk into the room and stand at the starting point.
    41. Continue standing, and then shake the smartphone to indicate the end of trial.
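The shake gestures that bracket the circuit produce a distinct spike in accelerometer magnitude, which is what makes them usable as start/end markers. A minimal sketch of locating the shake onset in logged sensor data, assuming a hypothetical 25 m/s² threshold:

```python
import numpy as np

def shake_onset(times, ax, ay, az, threshold=25.0):
    """Return the timestamp of the first sample whose acceleration
    magnitude exceeds the threshold (taken as the shake onset).
    The 25 m/s^2 default is an assumed value; tune per device."""
    mag = np.sqrt(np.asarray(ax) ** 2 + np.asarray(ay) ** 2 + np.asarray(az) ** 2)
    above = mag > threshold
    if not above.any():
        raise ValueError("no shake found above threshold")
    return times[int(np.argmax(above))]
```

Applying the same function to the end-of-trial data gives the second marker, so both data sources can be aligned to a common clock.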

3. Trial Completion

  1. Stop the video recording smartphone and ask the participant to remove and return the smartphone and holster. Stop the data logging or WMMS application on the smartphone. Copy the acquired motion data files and the video file from both phones to a computer for post-processing.

4. Post-processing

  1. Synchronize timing between the video and the raw sensor data by determining the time when the shake action started. This shaking movement corresponds to a distinct accelerometer signal and video frame. Check for synchronization error by subtracting the start shake time from the end shake time for both the sensor and video data sources. The time differences should be similar between the two data sets.
  2. Determine actual change-of-state times from the gold-standard video by recording the time difference from the start shake time to the video frame at the transition between activities. Use video editing software to obtain timing to within 0.033 sec (i.e., 30 frames per second video rate). Use WMMS software to generate comparable changes-of-state from the sensor data.
  3. Generate two data sets, one with true activities and the second with predicted activities, by labeling the activity for each video frame (based on the change-of-state timing) and then calculating the predicted activity at each video frame time from the WMMS output. For WMMS performance evaluation, calculate true positives, false negatives, true negatives, and false positives between the gold-standard activity and the WMMS predicted activity. Use these parameters to calculate sensitivity, specificity, and F-score outcome measures.
    Note: A tolerance setting of 3 data windows on either side of the window being analyzed can be used for determining change-of-state outcomes, and 2 data windows for classification outcomes. For example, since 1 second data windows were used for the WMMS in this study, 3 sec before and after the current window were examined so that consecutive changes within this tolerance are ignored. The consideration was that changes of state that happen in less than 3 sec can be ignored for gross human movement analysis since these states would be considered transitory.
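A sketch of the per-class outcome calculation described above, with the tolerance rule applied only when matching positive predictions (a simplification of the windowing in the note; the labels and tolerance are illustrative):

```python
def per_class_metrics(true_seq, pred_seq, c, tol=2):
    """Sensitivity, specificity, and F-score for class c, counting a
    prediction of c as a true positive if c occurs in the gold-standard
    sequence within +/- tol windows (illustrative tolerance rule)."""
    tp = fp = fn = tn = 0
    for i, pred in enumerate(pred_seq):
        window = true_seq[max(0, i - tol):i + tol + 1]
        if pred == c:
            if c in window:
                tp += 1   # predicted c near a true occurrence of c
            else:
                fp += 1   # predicted c with no true c nearby
        elif true_seq[i] == c:
            fn += 1       # missed a true occurrence of c
        else:
            tn += 1       # correctly did not predict c
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    f_score = 2 * tp / (2 * tp + fp + fn) if tp else 0.0
    return sensitivity, specificity, f_score
```

Running this once per class, then averaging, yields per-circuit outcomes of the kind reported in the Abstract.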

Results

The study protocol was conducted with a convenience sample of fifteen able-bodied participants whose average weight was 68.9 (± 11.1) kg, height was 173.9 (± 11.4) cm, and age was 26 (± 9) years, recruited from The Ottawa Hospital and University of Ottawa staff and students. A smartphone captured sensor data at a variable 40-50 Hz rate. Sample rate variations are typical for smartphone sensor sampling. A second smartphone was used to record digital video at 1280x720 (720p) resolution.
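Since the sampling rate varied between 40-50 Hz, analyses that assume a fixed rate may first resample the signals. One common approach (not necessarily the one used in this study) is linear interpolation onto a uniform time grid:

```python
import numpy as np

def resample_fixed_rate(timestamps, values, fs=50.0):
    """Linearly interpolate variably sampled sensor values (e.g., a
    40-50 Hz smartphone accelerometer stream) onto a uniform fs-Hz grid."""
    t = np.asarray(timestamps, dtype=float)
    v = np.asarray(values, dtype=float)
    uniform_t = np.arange(t[0], t[-1], 1.0 / fs)  # uniform grid within the recording
    return uniform_t, np.interp(uniform_t, t, v)
```

Each axis of the accelerometer and gyroscope would be resampled with the same target grid so the channels stay aligned.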

Discussion

Human activity recognition with a wearable mobility monitoring system has received more attention in recent years due to the technical advances in wearable computing and smartphones and systematic needs for quantitative outcome measures that help with clinical decision-making and health intervention evaluation. The methodology described in this paper was effective for evaluating WMMS development since activity classification errors were found that would not have been present if a broad range of activities of daily living...

Disclosures

The authors declare that they have no competing financial interests.

Acknowledgements

The authors acknowledge Evan Beisheim, Nicole Capela, Andrew Herbert-Copley for technical and data collection assistance. Project funding was received from the Natural Sciences and Engineering Research Council of Canada (NSERC) and BlackBerry Ltd., including smartphones used in the study.

Materials

  • Smartphone or wearable measurement device: BlackBerry Z10
  • Smartphone for video recording: BlackBerry Z10 or 9800
  • Phone holster: any
  • Data logger application for the smartphone: TOHRC Data Logger for BlackBerry 10, BlackBerry World (http://appworld.blackberry.com/webstore/content/32013891/?countrycode=CA)
  • Wearable mobility measurement: custom BlackBerry 10 and Matlab software for mobility monitoring (http://www.irrd.ca/cag/smartphone/)
  • Video editing or analysis software: Motion Analysis Tools (http://www.irrd.ca/cag/mat/)

References

  1. Lara, O. D., Labrador, M. A. A Survey on Human Activity Recognition using Wearable Sensors. IEEE Communications Surveys Tutorials. 15 (3), 1192-1209 (2013).
  2. Foerster, F., Smeja, M., Fahrenberg, J. Detection of posture and motion by accelerometry: a validation study in ambulatory monitoring. Computers in Human Behavior. 15 (5), 571-583 (1999).
  3. Elsmore, T. F., Naitoh, P. Monitoring Activity With a Wrist-Worn Actigraph: Effects of Amplifier Passband and Threshold Variations. (1993).
  4. Kripke, D. F., Webster, J. B., Mullaney, D. J., Messin, S., Mason, W. Measuring sleep by wrist actigraph. (1981).
  5. Lockhart, J. W., Pulickal, T., Weiss, G. M. Applications of mobile activity recognition. Proceedings of the 2012 ACM Conference on Ubiquitous Computing. , 1054-1058 (2012).
  6. Incel, O. D., Kose, M., Ersoy, C. A Review and Taxonomy of Activity Recognition on Mobile Phones. BioNanoScience. 3 (2), 145-171 (2013).
  7. Yang, C. C., Hsu, Y. L. A Review of Accelerometry-Based Wearable Motion Detectors for Physical Activity Monitoring. Sensors. 10 (8), 7772-7788 (2010).
  8. He, Y., Li, Y. Physical Activity Recognition Utilizing the Built-In Kinematic Sensors of a Smartphone. International Journal of Distributed Sensor Networks. 2013, (2013).
  9. Vo, Q. V., Hoang, M. T., Choi, D. Personalization in Mobile Activity Recognition System using K-medoids Clustering Algorithm. International Journal of Distributed Sensor Networks. 2013, (2013).
  10. Dernbach, S., Das, B., Krishnan, N. C., Thomas, B. L., Cook, D. J. Simple and Complex Activity Recognition through Smart Phones. 214-221 (2012).
  11. Hache, G., Lemaire, E. D., Baddour, N. Mobility change-of-state detection using a smartphone-based approach. IEEE International Workshop on Medical Measurements and Applications Proceedings (MeMeA). , 43-46 (2010).
  12. Wu, H. H., Lemaire, E. D., Baddour, N. Change-of-state determination to recognize mobility activities using a BlackBerry smartphone. 5252-5255 (2011).
  13. Wu, H., Lemaire, E. D., Baddour, N. Activity Change-of-state Identification Using a Blackberry Smartphone. Journal of Medical and Biological Engineering. 32 (4), 265-271 (2012).
  14. Tundo, M. D., Lemaire, E., Baddour, N. Correcting Smartphone orientation for accelerometer-based analysis. IEEE International Symposium on Medical Measurements and Applications Proceedings (MeMeA). , 58-62 (2013).
  15. Tundo, M. D. Development of a human activity recognition system using inertial measurement unit sensors on a smartphone. (2014).

Keywords: Smartphone-based, Human Activity Recognition, Daily Living Environment, Wearable Mobility Monitoring Systems, Activity Classification, Sensitivity, Specificity, F-score, Evaluation Protocol
