Improvisation
Performance, 2022
This is an improvisational collaboration between humans and a robotic musician. It explores the co-creativity between robots and humans and the role of a robot musician as an active listener, bringing together acoustic instruments and electronic sounds. Shimon, the robot musician, is not merely operated by humans; it is an active agent that can understand and create music. Shimon composes using a variable-order Markov chain, which automatically adjusts its order based on the length of the input sequence, and sings and dances via a rule-based interactive system.
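As an illustration of the compositional idea, here is a minimal variable-order Markov sketch in Python. The note representation (MIDI pitches), the order-selection heuristic, and all names are assumptions for illustration only, not Shimon's actual implementation:

```python
import random
from collections import defaultdict

def build_model(notes, order):
    """Build an order-k Markov model mapping note contexts to observed next notes."""
    model = defaultdict(list)
    for i in range(len(notes) - order):
        context = tuple(notes[i:i + order])
        model[context].append(notes[i + order])
    return model

def generate(notes, length, max_order=4):
    """Generate `length` new notes from an input melody.

    The model order adapts to the input length (a simple stand-in for the
    variable-order behavior described above): short phrases get a low order
    so the model still generalizes; longer inputs support higher orders.
    """
    order = max(1, min(max_order, len(notes) // 8))
    model = build_model(notes, order)
    out = list(notes[:order])  # seed the generation with the opening context
    for _ in range(length):
        context = tuple(out[-order:])
        candidates = model.get(context)
        if not candidates:  # unseen context: restart from a known one
            context = random.choice(list(model.keys()))
            candidates = model[context]
        out.append(random.choice(candidates))
    return out
```

For example, feeding in a scale-like phrase of MIDI pitches yields a continuation drawn only from transitions observed in the input.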
In collaboration with Mir Jeffres
“Music is an agreeable harmony for the honor of God and the permissible delights of the soul.”
―Johann Sebastian Bach
Counterpoint is a key compositional technique involving the interweaving of multiple independent melodic lines, mastered by composers such as J.S. Bach. In the Baroque period, improvisation was central to musical performance and education. This project aims to improvise a counterpoint with the robot musician Shimon. Shimon receives a four-bar melody from a human performer as input, uses a genetic algorithm to further develop this melody, and harmonizes the result with a convolutional neural network-based approach trained on the Bach Chorale dataset.
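The melody-development step can be sketched as a toy genetic algorithm. The fitness function, operators, and parameters below are illustrative assumptions (a real system would score melodic and rhythmic features against counterpoint rules), not the project's actual code:

```python
import random

def mutate(melody, rate=0.1):
    """Randomly transpose individual notes by a small interval."""
    return [n + random.choice([-2, -1, 1, 2]) if random.random() < rate else n
            for n in melody]

def crossover(a, b):
    """Single-point crossover between two parent melodies of equal length."""
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def fitness(candidate, seed):
    """Toy fitness: reward closeness to the seed melody's pitches.
    (A real system would balance similarity against musical variation.)"""
    return -sum(abs(c - s) for c, s in zip(candidate, seed))

def evolve(seed, generations=50, pop_size=20):
    """Develop a seed melody: keep the fittest half, breed the rest."""
    population = [mutate(seed, rate=0.5) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda m: fitness(m, seed), reverse=True)
        parents = population[:pop_size // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=lambda m: fitness(m, seed))
```

Starting from a four-bar melody encoded as MIDI pitches, `evolve` returns a variation of the same length that stays close to the original contour under this toy fitness.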
Music, body, and machine: gesture-based synchronization in human-robot musical interaction
Musical performance relies on nonverbal cues for conveying information among musicians. Human musicians use bodily gestures to communicate their interpretation and intentions to their collaborators, from mood and expression to anticipatory cues regarding structure and tempo. Robotic musicians can use their physical bodies in a similar way when interacting with fellow musicians. The paper presents a new theoretical framework for classifying musical gestures and a study evaluating the effect of robotic gestures on synchronization between human musicians and Shimon, a robotic marimba player developed at Georgia Tech. Shimon utilizes head and arm movements to signify musical information such as expected notes, tempo, and beat. The study, in which piano players were asked to play along with Shimon, assessed the effectiveness of these gestures on human-robot synchronization. Subjects were evaluated on their ability to synchronize with unknown tempo changes as communicated by Shimon's ancillary and social gestures. The results demonstrate the significant contribution of non-instrumental gestures to human-robot synchronization, highlighting the importance of non-music-making gestures for anticipation and coordination in human-robot musical collaboration. Subjects also reported more positive feelings when interacting with the robot's ancillary and social gestures, indicating the role of these gestures in supporting engaging and enjoyable musical experiences.
[Publication] Gao, X., Rogel, A., Sankaranarayanan, R., Dowling, B. and Weinberg, G., 2024. Music, body, and machine: gesture-based synchronization in human-robot musical interaction. Frontiers in Robotics and AI, 11, p.1461615.
[Article Link] https://www.researchgate.net/publication/387264961_Music_body_and_machine_gesture-based_synchronization_in_human-robot_musical_interaction