In spite of extensive research in the last decade, activity recognition still faces many challenges in real-world applications. On the one hand, when attempting to recognize various activities, different sensors play different roles for different activity classes. This heterogeneity raises the need to learn the optimal combination of sensor modalities for each activity. On the other hand, users may annotate activities consistently or only occasionally. To boost recognition accuracy, we need to incorporate this user input and incrementally adjust the model. To tackle these challenges, we propose an adaptive activity recognition framework with dynamic heterogeneous sensor fusion. We dynamically fuse various modalities to characterize different activities, and the model is continually updated upon the arrival of newly labeled data. To evaluate the effectiveness of the proposed framework, we incorporate popular feature transformation algorithms, e.g., Linear Discriminant Analysis, Marginal Fisher's Analysis, and Maximum Mutual Information, into it. Finally, we carry out experiments on a real-world dataset collected over two weeks. The results demonstrate the practical utility of our framework and its advantage over existing approaches.
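To make the abstract's two key ideas concrete, the following is a minimal sketch of per-activity fusion of heterogeneous sensor modalities combined with model updates as newly labeled data arrives. It assumes scikit-learn and NumPy. The class name ModalityFusion, the per-class accuracy weights, and the full refit on buffered data are illustrative assumptions for this sketch; the paper's actual fusion and incremental update rules are not reproduced here.

```python
# Illustrative sketch only: per-activity weighting of sensor modalities,
# with one Linear Discriminant Analysis model per modality. Not the
# authors' implementation.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


class ModalityFusion:
    def __init__(self, n_classes):
        self.n_classes = n_classes
        self.models = {}   # one LDA model per sensor modality
        self.weights = {}  # per-modality, per-class reliability weights
        self.buffers = {}  # accumulated labeled data per modality

    def update(self, modality, X, y):
        """Absorb newly labeled data and refit that modality's model.

        The paper adjusts the model incrementally; refitting on the
        accumulated buffer is a simplification made for this sketch."""
        Xb, yb = self.buffers.get(
            modality, (np.empty((0, X.shape[1])), np.empty(0, int)))
        Xb, yb = np.vstack([Xb, X]), np.concatenate([yb, y])
        self.buffers[modality] = (Xb, yb)
        model = LinearDiscriminantAnalysis().fit(Xb, yb)
        self.models[modality] = model
        # Weight each modality per activity class by its training accuracy
        # on that class, so informative sensors dominate for each activity.
        pred = model.predict(Xb)
        self.weights[modality] = np.array(
            [(pred[yb == c] == c).mean() if (yb == c).any() else 0.0
             for c in range(self.n_classes)])

    def predict(self, features):
        """Fuse class posteriors across modalities with per-class weights.

        `features` maps modality name -> 1-D feature vector for one window."""
        score = np.zeros(self.n_classes)
        for m, x in features.items():
            proba = self.models[m].predict_proba(x.reshape(1, -1))[0]
            full = np.zeros(self.n_classes)
            full[self.models[m].classes_] = proba  # align to global labels
            score += self.weights[m] * full
        return int(np.argmax(score))


# Usage on synthetic data: two modalities, three activity classes.
rng = np.random.default_rng(0)
fusion = ModalityFusion(n_classes=3)
labels = np.repeat(np.arange(3), 30)
for m in ("accelerometer", "gyroscope"):
    X = rng.normal(size=(90, 4)) + labels[:, None]  # class-shifted features
    fusion.update(m, X, labels)
print(fusion.predict({"accelerometer": rng.normal(size=4) + 2,
                      "gyroscope": rng.normal(size=4) + 2}))  # expects 2
```

The design choice here is confidence-style late fusion: each modality votes with its class posterior, scaled by how reliable that modality has proven for each activity, which mirrors the abstract's point that the optimal sensor combination differs per activity.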
Citation:
Zeng, M., Wang, X., Nguyen, L. T., Wu, P., Mengshoel, O. J., & Zhang, J. (2015). Adaptive activity recognition with dynamic heterogeneous sensor fusion. In Proceedings of the 2014 6th International Conference on Mobile Computing, Applications and Services, MobiCASE 2014 (pp. 189–196). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.4108/icst.mobicase.2014.257787