Abstract
We present a framework that generates beat-synchronous dance animation based on the analysis of both visual and audio data. First, the articulated motion of a dancer is captured from markerless visual observations obtained by a multi-camera system. We propose and employ a new method for the temporal segmentation of such motion data into the periods of the dance. Next, we use a beat tracking algorithm to estimate the pulse related to the tempo of a piece of music. Given an input piece of music of the same genre as the one corresponding to the visually observed dance, we automatically produce a beat-synchronous dance animation of a virtual character. The proposed approach has been validated with extensive experiments performed on a data set containing a variety of traditional Greek/Cretan dances and the corresponding music.
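The core idea of synchronizing a segmented dance period to the estimated musical pulse can be illustrated with a minimal sketch. This is not the authors' method; it only shows, under simplifying assumptions, how one detected motion period might be uniformly retimed so that its boundaries land on beat instants produced by a beat tracker. The function name and signature are hypothetical.

```python
import numpy as np

def retime_to_beats(frame_times, period_start, period_end, beat_times):
    """Linearly warp one motion segment so its boundaries land on beats.

    frame_times  : timestamps (s) of captured motion frames in the segment
    period_start : start (s) of the detected dance period
    period_end   : end (s) of the detected dance period
    beat_times   : beat instants (s) estimated by a beat tracker

    Returns new timestamps that uniformly rescale the segment onto the
    pair of beats nearest to its original boundaries.
    """
    beats = np.asarray(beat_times, dtype=float)
    # Snap each segment boundary to its nearest estimated beat.
    b0 = beats[np.argmin(np.abs(beats - period_start))]
    b1 = beats[np.argmin(np.abs(beats - period_end))]
    # Uniform time scaling of the frame timestamps into [b0, b1].
    t = np.asarray(frame_times, dtype=float)
    scale = (b1 - b0) / (period_end - period_start)
    return b0 + (t - period_start) * scale

# Toy example: a 2.0 s dance period aligned to beats ~0.7 s apart (~86 BPM).
frames = np.linspace(0.0, 2.0, 5)   # motion frame times
beats = np.arange(0.0, 4.0, 0.7)    # beat instants from a tracker
warped = retime_to_beats(frames, 0.0, 2.0, beats)
print(warped)
```

A full system would interpolate the articulated pose at the warped timestamps and handle tempo changes per beat interval rather than with a single global scale; this sketch only conveys the alignment step.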
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Panagiotakis, C., Holzapfel, A., Michel, D., Argyros, A.A. (2013). Beat Synchronous Dance Animation Based on Visual Analysis of Human Motion and Audio Analysis of Music Tempo. In: Bebis, G., et al. Advances in Visual Computing. ISVC 2013. Lecture Notes in Computer Science, vol 8034. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-41939-3_12
DOI: https://doi.org/10.1007/978-3-642-41939-3_12
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-41938-6
Online ISBN: 978-3-642-41939-3
eBook Packages: Computer Science, Computer Science (R0)