Non-choreographed Robot Dance


Alexandru Surpatean



This research investigates the difficulties of enabling the humanoid robot Nao to dance to music. The focus is on creating a dance that is not predefined by the researcher, but that emerges from the music played to the robot. Such an undertaking cannot be fully tackled in a small-scale project. Nevertheless, rather than concentrating on a single subtask, this research maintains a holistic view of the subject and aims to provide a framework on which future work in this area can build. The need for this research arises from the fact that current approaches to robot dance in general, and Nao dance in particular, rely on predefined dances built by the researcher. The main goal of this project is therefore to move away from choreographed approaches to Nao dance and to investigate how to make the robot dance in a non-predefined fashion. Moreover, since previous research has focused mainly on the analysis of musical beat, a secondary goal of this project is to create the dance not only from the beat but from other elements of the music as well.
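As a first step toward a dance that emerges from the music rather than from a script, the robot needs to extract musical features such as tempo from the audio it hears. The following is a minimal, self-contained sketch of a crude energy-based tempo estimator; the frame size, BPM search range, and synthetic click track are illustrative assumptions, and a real system would rely on a dedicated beat tracker such as those discussed in the literature below.

```python
import math

def onset_envelope(samples, frame=1024):
    """Frame-wise energy, then half-wave rectified differences, as a
    crude onset-strength signal (energy jumps mark likely beats)."""
    energies = [sum(s * s for s in samples[i:i + frame]) / frame
                for i in range(0, len(samples) - frame + 1, frame)]
    return [max(0.0, energies[i] - energies[i - 1])
            for i in range(1, len(energies))]

def estimate_tempo(samples, sr, frame=1024):
    """Pick the autocorrelation lag of the onset envelope with the
    strongest periodicity, restricted to 75-200 BPM to limit octave errors."""
    env = onset_envelope(samples, frame)
    fps = sr / frame                       # envelope frames per second
    best_lag, best_score = 0, 0.0
    for lag in range(int(fps * 60 / 200), int(fps * 60 / 75) + 1):
        score = sum(env[i] * env[i - lag] for i in range(lag, len(env)))
        if score > best_score:
            best_lag, best_score = lag, score
    return 60.0 * fps / best_lag if best_lag else 0.0

# Synthetic 120 BPM click track: a short 440 Hz burst every half second.
sr = 8000
samples = [0.0] * (sr * 4)
for start in range(0, sr * 4, sr // 2):
    for j in range(64):
        samples[start + j] = math.sin(2 * math.pi * 440 * j / sr)

tempo = estimate_tempo(samples, sr)        # roughly 117 BPM at this frame size
```

In the same spirit, other hypothetical features (frame energy for movement amplitude, spectral brightness for gesture selection) could be extracted and mapped to joint trajectories; exploring such music-to-motion mappings beyond the beat is exactly the kind of question this project addresses.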


Apostolos, M., Littman, M., Lane, S., Handelman, D., & Gelfand, J. (1996). Robot choreography: An artistic-scientific connection. Computers & Mathematics with Applications, 32(1), 1–4.

Arentz, W. (2001). Beat extraction from digital music.

Berenzweig, A., & Ellis, D. (2001). Locating singing voice segments within music signals. IEEE Workshop on Applications of Signal Processing to Audio and Acoustics.

Buell, K. (2001). International style standard [modern] ballroom dancing. Ballroom Dancing for Beginners.

Camurri, A., Mazzarino, B., & Volpe, G. (2004). Analysis of expressive gesture: The EyesWeb expressive gesture processing library. Gesture-Based Communication in Human-Computer Interaction, Vol. 2915/2004 of Lecture Notes in Computer Science.

Cone, E. (1971). Conversations with Roger Sessions. Perspectives on American Composers. New York, Norton.

Dannenberg, R. (2002). Listening to 'Naima': An automated structural analysis of music from recorded audio.

Foote, J., & Uchihashi, S. (2001). The beat spectrum: A new approach to rhythm analysis.

Foote, J. (1999). Visualizing music and audio using self-similarity. MULTIMEDIA '99: Proceedings of the seventh ACM international conference on Multimedia (Part 1), pp. 77–80, ACM, New York, NY, USA.

Goodridge, J. (1999). Description and classification of time elements in performance events: A synthesis of approaches. Rhythm and timing of movement in performance: drama, dance and ceremony.

Goto, M., & Muraoka, Y. (1999). Real-time beat tracking for drumless audio signals: Chord change detection for musical decisions. Speech Communication, 27(3–4), 311–335.

Goto, M. (2001). An audio-based real-time beat tracking system for music with or without drum-sounds. Journal of New Music Research, 30(2), 159–171.

(2007). Keepon keeps on shaking his robotic yellow booty.... Computer Weekly, p. 48.

Nakahara, N., Miyazaki, K., Sakamoto, H., Fujisawa, T., Nagata, N., & Nakatsu, R. (2009). Dance motion control of a humanoid robot based on real-time tempo tracking from musical audio signals. Entertainment Computing ICEC 2009, pp. 36–47.

Nakaoka, S., Nakazawa, A., Kanehiro, F., Kaneko, K., Morisawa, M., Hirukawa, H., & Ikeuchi, K. (2007). Learning from observation paradigm: Leg task models for enabling a biped humanoid robot to imitate human dances. International Journal of Robotics Research, 26(8), 829–844.

O'Keeffe, K. (2003). Dancing monkeys.

Or, J. (2009). Towards the development of emotional dancing humanoid robots. International Journal of Social Robotics, 1(4), 367–382.

Orife, I. (2001). Riddim: A rhythm analysis and decomposition tool based on independent subspace analysis. Thesis, Dartmouth College.

Scheirer, E. (1998). Tempo and beat analysis of acoustic musical signals. The Journal of the Acoustical Society of America, 103(1), 588–601.

Scott, C. (1989). How children grow: Musically. Music Educators Journal, 76(2), 28–31.

Technische Universitaet Graz (2009). RoboCup 2009.

Wright, S. (2003). Musical intelligence. The Arts, Young Children, and Learning, p. 85.

Yoshii, K., Nakadai, K., Torii, T., Hasegawa, Y., Tsujino, H., Komatani, K., Ogata, T., & Okuno, H. (2007). A biped robot that keeps steps in time with musical beats while listening to music with its own ears. Proceedings of the 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1743–1750, San Diego, CA, USA.

YouTube / horryville (2009). Nao @ RoboCup2009 Graz dances Michael Jackson 'Billie Jean'.

YouTube / mundolibreyloco (2009). Robot nao. v=Kn8gr6gJCCk.

YouTube / TeamKouretes (2009). Nao dancing infinity 2008. watch?v=SjzSdxPt3as./220-0-general.