Abstract
Activity recognition is among the most actively researched topics in machine-learning-based recognition, and Human Activity Recognition poses many challenges. One of the most important is the simultaneous recognition of complex activities and of the smaller activities that compose them. The dataset used and the work reported in this paper are part of the Cooking Activity Recognition Challenge. The provided dataset contains three classes of complex (macro) activities and ten classes of smaller (micro) activities. The macro activities are mutually exclusive, whereas multiple micro activities can occur in sequence as parts of a particular macro activity. The dataset is challenging because its recorded segments have varying sampling rates, which we address in preprocessing, and several segments contain missing data. The task of the challenge is to classify macro and micro activities separately from this dataset. We introduce a deep learning framework combining a Convolutional Neural Network (CNN) and a Gated Recurrent Unit (GRU) to extract spatial and temporal features for the recognition of macro and micro activities. Our proposed model outperforms other conventional and existing deep learning models, with classification accuracies of 83.76% and 59.39% for macro- and micro-activity classification, respectively.
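The CNN-GRU combination described above can be sketched in a minimal form: 1-D convolutions over fixed-length sensor windows extract per-timestep spatial features across the sensor channels, and a GRU then models their temporal evolution before a linear head predicts the activity class. This is a hedged illustration only; the layer sizes, channel count, window length, and class names below are hypothetical and not taken from the paper.

```python
import torch
import torch.nn as nn


class ConvGRUClassifier(nn.Module):
    """Illustrative CNN-GRU pipeline for multichannel sensor windows.

    Input shape: (batch, channels, time), e.g. accelerometer/motion-capture
    axes stacked as channels over a fixed-length window.
    """

    def __init__(self, n_channels=12, n_classes=3, hidden=64):
        super().__init__()
        # 1-D convolutions along the time axis extract spatial features
        # across sensor channels at each timestep.
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # The GRU consumes the convolutional feature sequence and
        # summarizes its temporal dynamics in its final hidden state.
        self.gru = nn.GRU(input_size=64, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):          # x: (batch, channels, time)
        f = self.conv(x)           # (batch, 64, time)
        f = f.transpose(1, 2)      # (batch, time, 64) for the GRU
        _, h = self.gru(f)         # h: (num_layers, batch, hidden)
        return self.head(h[-1])    # (batch, n_classes) class logits


model = ConvGRUClassifier()
# 8 windows, 12 sensor axes, 128 samples each (hypothetical sizes).
logits = model(torch.randn(8, 12, 128))
print(logits.shape)  # torch.Size([8, 3])
```

In this sketch the convolutional stack plays the "spatial feature" role and the GRU the "temporal feature" role mentioned in the abstract; the micro-activity head would use `n_classes=10` instead of 3.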
Copyright information
© 2021 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this chapter
Siraj, M.S., Shahid, O., Ahad, M.A.R. (2021). Cooking Activity Recognition with Varying Sampling Rates Using Deep Convolutional GRU Framework. In: Ahad, M.A.R., Lago, P., Inoue, S. (eds) Human Activity Recognition Challenge. Smart Innovation, Systems and Technologies, vol 199. Springer, Singapore. https://doi.org/10.1007/978-981-15-8269-1_10
Publisher Name: Springer, Singapore
Print ISBN: 978-981-15-8268-4
Online ISBN: 978-981-15-8269-1
eBook Packages: Intelligent Technologies and Robotics (R0)