Abstract
Human activity recognition (HAR) has been an important research field for more than a decade due to its versatile applications in different areas, and it has gained significant attention in the health-care domain. Although it shares similarities with other forms of activity recognition, it poses a unique set of challenges: body movements in a food-preparation environment are considerably smaller than in many other real-world activities of interest. In this paper, we present a comprehensive solution for activity recognition in the Bento Box Packaging Challenge: a well-planned approach to recognizing activities during packaging tasks from motion capture data. The dataset was collected with a motion capture system in which subjects wore 13 markers on the upper body, recorded using dedicated cameras and a body suit, yielding around 50,000 samples for each activity. We reduce the dimensionality of the data and make it suitable for classification by extracting reliable and efficient features. After feature extraction, three classifiers, namely a random forest classifier, an extra trees classifier, and a gradient boosting classifier, are compared. We conclude that on this challenging dataset the random forest classifier with hyperparameter tuning performs best.
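The classifier comparison described in the abstract can be sketched with scikit-learn. This is an illustrative sketch only, not the authors' code: the feature matrix is replaced by synthetic data, and the hyperparameter grid is a hypothetical example of the tuning step the paper mentions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import (
    RandomForestClassifier,
    ExtraTreesClassifier,
    GradientBoostingClassifier,
)
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import accuracy_score

# Stand-in for the extracted feature matrix; sample count, feature
# count, and class count are placeholders, not the challenge dataset.
X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=10, n_classes=4,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# The three classifiers compared in the paper.
models = {
    "random_forest": RandomForestClassifier(random_state=0),
    "extra_trees": ExtraTreesClassifier(random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, accuracy_score(y_test, model.predict(X_test)))

# Hyperparameter tuning for the random forest, mirroring the tuning
# step mentioned in the abstract; the grid itself is hypothetical.
grid = GridSearchCV(
    RandomForestClassifier(random_state=0),
    {"n_estimators": [100, 300], "max_depth": [None, 20]},
    cv=3,
)
grid.fit(X_train, y_train)
print("tuned RF accuracy:",
      accuracy_score(y_test, grid.predict(X_test)))
```

The same three-model loop applies directly once the synthetic data is replaced by windowed features extracted from the MoCap streams.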
6 Appendix
Used sensor modalities
Motion capture (MoCap)
Features Used
As described in Sect. 3.3 and summarized in Table 1.
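The specific features are listed in Sect. 3.3 and Table 1 of the paper; as a hypothetical placeholder only, the sketch below shows the common HAR pattern of computing per-channel summary statistics over sliding windows of MoCap marker coordinates. The window length, channel names, and statistics here are illustrative assumptions, not the paper's feature set.

```python
import numpy as np
import pandas as pd

def extract_window_features(window: pd.DataFrame) -> dict:
    """Per-channel summary statistics for one sliding window."""
    feats = {}
    for col in window.columns:
        x = window[col].to_numpy()
        feats[f"{col}_mean"] = x.mean()
        feats[f"{col}_std"] = x.std()
        feats[f"{col}_min"] = x.min()
        feats[f"{col}_max"] = x.max()
    return feats

# Toy stream: 3 coordinate channels for one marker, 1000 frames.
rng = np.random.default_rng(0)
stream = pd.DataFrame(rng.normal(size=(1000, 3)),
                      columns=["m1_x", "m1_y", "m1_z"])

# Non-overlapping windows of 100 frames; one feature row per window.
rows = [extract_window_features(stream.iloc[i:i + 100])
        for i in range(0, len(stream), 100)]
features = pd.DataFrame(rows)
print(features.shape)  # (10, 12)
```

With 13 upper-body markers, the same loop would simply run over more coordinate channels, and each feature row would be labeled with the activity of its window before classification.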
Programming Language and Libraries
Programming language: Python
Libraries: NumPy, Pandas, Matplotlib, Scikit-learn, SciPy
Machine Specification
- RAM: 8 GB
- Processor: 2.2 GHz dual-core Intel Core i7
- GPU: N/A
Training and Testing Time
Training: 10.8 min
Testing: 3 min
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Rafiq, J.I., Nabi, S., Amin, A., Hossain, S. (2022). Bento Packaging Activity Recognition from Motion Capture Data. In: Ahad, M.A.R., Inoue, S., Roggen, D., Fujinami, K. (eds) Sensor- and Video-Based Activity and Behavior Computing. Smart Innovation, Systems and Technologies, vol 291. Springer, Singapore. https://doi.org/10.1007/978-981-19-0361-8_15
DOI: https://doi.org/10.1007/978-981-19-0361-8_15
Publisher Name: Springer, Singapore
Print ISBN: 978-981-19-0360-1
Online ISBN: 978-981-19-0361-8
eBook Packages: Intelligent Technologies and Robotics (R0)