Complex Human Activity Recognition Using Smartphone and Wrist-Worn Motion Sensors

Sensors, 2016, Vol 16 (4), pp. 426. Author(s): Muhammad Shoaib, Stephan Bosch, Ozlem Incel, Hans Scholten, Paul Havinga
2019, Vol 11 (21), pp. 2531. Author(s): Zhiqiang Gao, Dawei Liu, Kaizhu Huang, Yi Huang

Today’s smartphones are equipped with embedded sensors, such as accelerometers and gyroscopes, which enable a variety of measurement and recognition tasks. In this paper, we jointly investigate two recognition problems, namely human activity recognition and smartphone on-body position recognition, in order to enable more robust context-aware applications. So far, these two problems have been studied separately, without considering their interactions. In this study, after applying a novel data preprocessing technique, we propose a joint recognition framework based on a multi-task learning strategy, which reduces computational demand, better exploits the complementary information between the two recognition tasks, and leads to higher recognition performance. We also extend the joint recognition framework so that additional information, such as user identification through biometric motion analysis, can be offered. We evaluate our work systematically and comprehensively on two datasets with real-world settings. Our joint recognition model achieves a promising F1-score of 0.9174 for user identification on the benchmark RealWorld Human Activity Recognition (HAR) dataset. Moreover, in comparison with the conventional approach, the proposed joint model improves human activity recognition and position recognition by 5.1% and 9.6%, respectively.
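The core idea of the multi-task setup can be sketched as a shared representation feeding two task-specific classification heads whose losses are summed. The sketch below is a minimal NumPy illustration, not the authors' model: the feature dimension, class counts, and random stand-in data are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 64-d shared feature, 6 activities, 7 on-body positions.
D, N_ACT, N_POS = 64, 6, 7

# Shared-encoder output for a batch of 8 sensor windows (random stand-in here).
batch = rng.normal(size=(8, D))

# Two task-specific linear heads on top of the same shared representation.
W_act = rng.normal(scale=0.1, size=(D, N_ACT))
W_pos = rng.normal(scale=0.1, size=(D, N_POS))

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

act_probs = softmax(batch @ W_act)  # activity predictions, shape (8, 6)
pos_probs = softmax(batch @ W_pos)  # position predictions, shape (8, 7)

# Multi-task loss: sum of the two cross-entropies against (dummy) labels.
act_y = rng.integers(0, N_ACT, size=8)
pos_y = rng.integers(0, N_POS, size=8)
loss = (-np.log(act_probs[np.arange(8), act_y]).mean()
        - np.log(pos_probs[np.arange(8), pos_y]).mean())
```

Because both heads backpropagate into the same encoder, each task acts as a regularizer for the other, which is one way complementary information between the tasks can be exploited.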


2019, Vol 5 (1), pp. 1-9. Author(s): Mohammad Iqbal, Chandrawati Putri Wulandari, Wawan Yunanto, Ghaluh Indah Permata Sari

Discovering rare human activity patterns from triggered motion sensors can deliver crucial information for notifying people about hazardous situations. This study aims to recognize rare human activities by mining non-zero-rare sequential patterns. In particular, it mines the triggered motion-sensor sequences to obtain non-zero-rare human activity patterns: patterns that do occur in the motion-sensor sequences, but whose occurrence counts fall below a pre-defined threshold. This study proposes an algorithm for mining non-zero-rare patterns in human activity recognition, called Mining Multi-class Non-Zero-Rare Sequential Patterns (MMRSP). The experimental results showed that non-zero-rare human activity patterns succeed in capturing unusual activity, and that MMRSP performed well in terms of the precision of rare activities.
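The "non-zero-rare" criterion can be illustrated with a toy miner over contiguous sensor-event n-grams: keep a pattern only if its count is greater than zero but at most a rarity threshold. This is a simplified sketch of the idea, not the MMRSP algorithm itself; the sensor IDs, pattern length, and threshold are illustrative assumptions.

```python
from collections import Counter

def nonzero_rare_patterns(sequences, length, max_count):
    """Count contiguous sensor-event n-grams across all sequences and keep
    the non-zero-rare ones: patterns that occur at least once but no more
    than `max_count` times overall."""
    counts = Counter()
    for seq in sequences:
        for i in range(len(seq) - length + 1):
            counts[tuple(seq[i:i + length])] += 1
    return {p: c for p, c in counts.items() if 0 < c <= max_count}

# Toy triggered-sensor sequences (sensor IDs); thresholds are illustrative.
seqs = [
    ["M1", "M2", "M3", "M1", "M2", "M3"],
    ["M1", "M2", "M3", "M7", "M9"],  # M7 -> M9 is an unusual transition
]
rare = nonzero_rare_patterns(seqs, length=2, max_count=1)
# Frequent bigrams like ("M1", "M2") are filtered out; ("M7", "M9") survives.
```

The lower bound (non-zero) distinguishes genuinely observed rare behavior from patterns that never happen at all, which is what makes the mined patterns useful as hazard alerts.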


Sensors, 2021, Vol 21 (7), pp. 2368. Author(s): Fatima Amjad, Muhammad Hassan Khan, Muhammad Adeel Nisar, Muhammad Shahid Farid, Marcin Grzegorzek

Human activity recognition (HAR) aims to recognize the actions of the human body through a series of observations and environmental conditions. The analysis of human activities has drawn the attention of the research community over the last two decades due to its widespread applications, the diverse nature of the activities, and the variety of recording infrastructure. Lately, one of the most challenging applications in this area is recognizing human body actions using unobtrusive wearable motion sensors. Since the activities of daily life (e.g., cooking, eating) comprise several repetitive and circumstantial short sequences of actions (e.g., moving an arm), it is difficult to use the sensory data directly for recognition, because multiple sequences of the same activity may vary widely. However, a similarity can be observed in the temporal occurrence of the atomic actions. Therefore, this paper presents a two-level hierarchical method to recognize human activities using a set of wearable sensors. First, the atomic activities are detected from the original sensory data and their recognition scores are obtained. Second, the composite activities are recognized using the scores of the atomic actions. We propose two different methods of extracting features from the atomic scores to recognize the composite activities: handcrafted features and features obtained using a subspace pooling technique. The proposed method is evaluated on the large, publicly available CogAge dataset, which contains instances of both atomic and composite activities. The data were recorded using three unobtrusive wearable devices: a smartphone, a smartwatch, and smart glasses. We also evaluated different classification algorithms for recognizing the composite activities.
The proposed method achieved average recognition accuracies of 79% and 62.8% using the handcrafted features and the subspace-pooling features, respectively. The recognition results of the proposed technique, and their comparison with existing state-of-the-art techniques, confirm its effectiveness.
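The second level of such a hierarchy consumes a time series of atomic-action score vectors and pools it into a fixed-size feature for a composite-activity classifier. The sketch below shows both pooling styles the abstract mentions, under stated assumptions: the SVD-based reading of "subspace pooling" and the mean/max handcrafted features are my illustrative choices, not necessarily the authors' exact definitions, and the score matrix is random stand-in data.

```python
import numpy as np

def subspace_pool(score_seq, k=1):
    """Pool a (T x A) sequence of atomic-action score vectors into a fixed-size
    feature by keeping the top-k right singular vectors of the score matrix
    (one common reading of 'subspace pooling'; the details are an assumption)."""
    _, _, vt = np.linalg.svd(score_seq, full_matrices=False)
    return vt[:k].ravel()

def handcrafted_pool(score_seq):
    """Simple handcrafted alternative: per-action mean and max over time."""
    return np.concatenate([score_seq.mean(axis=0), score_seq.max(axis=0)])

rng = np.random.default_rng(0)
scores = rng.random((20, 10))      # 20 time windows x 10 atomic-action scores
f_sub = subspace_pool(scores)      # shape (10,): dominant score subspace
f_hand = handcrafted_pool(scores)  # shape (20,): mean + max per action
```

Either pooled vector can then be fed to any standard classifier (e.g., an SVM or a small neural network) to predict the composite activity, which is where comparing different classification algorithms comes in.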

