The emerging field of "Robotic Musicianship"The development of machine intelligence to capture musical perception, composition, and performance capabilities in robots focuses on developing machine intelligence, in terms of algorithmsA set of rules or instructions given to a computer to help it solve problems or complete tasks and cognitive models, to capture musical perception, composition, and performance, and to transplant these skills into a robot that can then reproduce them in any context.
In such mixed human-machine ensembles, it must be assumed that humans will not play rigidly; rather, they will move and express the 'feel' of the music, and the roles of 'leader' and 'follower' within the group can change fluidly. Hence, for machines to participate in cooperative musical performances, where synchronization and adaptation play a vital role, they need to operate at a higher cognitive level.
We develop an approach based on a joint strategy:
We consider each musician (human or machine) as a separate oscillator, wherein mathematical models of oscillator coupling, for example the well-known Kuramoto model, can be used to establish and maintain synchronization.
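To make this concrete, below is a minimal numerical sketch of the Kuramoto model for a four-member ensemble. The governing equation is the standard one; the tempo (about 120 BPM), the coupling strength K, and the step size are illustrative assumptions, not parameters taken from this work.

```python
import numpy as np

def kuramoto_step(phases, natural_freqs, coupling, dt):
    """One Euler step of the Kuramoto model:
    d(theta_i)/dt = omega_i + (K / N) * sum_j sin(theta_j - theta_i)."""
    n = len(phases)
    diffs = phases[None, :] - phases[:, None]          # theta_j - theta_i
    coupling_term = (coupling / n) * np.sin(diffs).sum(axis=1)
    return (phases + (natural_freqs + coupling_term) * dt) % (2 * np.pi)

# Illustrative setup: four "musicians" whose beat phases drift around
# ~120 BPM (2 beats per second), each with a slightly different tempo.
rng = np.random.default_rng(0)
phases = rng.uniform(0.0, 2 * np.pi, 4)
freqs = 2 * np.pi * (2.0 + rng.normal(0.0, 0.05, 4))   # rad/s
K, dt = 1.5, 0.01                                      # assumed coupling, step

for _ in range(2000):                                  # ~20 s of playing
    phases = kuramoto_step(phases, freqs, K, dt)

# Order parameter r in [0, 1]: r near 1 means the ensemble is phase-locked.
r = abs(np.exp(1j * phases).mean())
print(f"phase coherence r = {r:.3f}")
```

Dropping K below the spread of the natural frequencies leaves r low, the mathematical analogue of an ensemble drifting apart; an adaptive agent can treat its coupling strength (how strongly it bends toward the others) as a 'listening' parameter.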
Historical context of automated musical systems and the central challenge of human-robot musical interaction.
Comprehensive review of existing work on human-robot musical synchronization and identification of research gaps.
Development of a comprehensive multimodal synchronization framework integrating audio, visual, and gestural cues.
Audio-based ensemble leadership tracking using advanced machine learning techniques (a minimal model sketch follows this outline).
Exploration of visual signals and gestural information in musical synchronization.
Integration of multiple modalities for enhanced human-robot musical collaboration.
Real-world implementation for human-robot musical ensemble with experimental validation.
Summary of contributions, limitations, and future research directions.
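As a pointer to how the audio-based leadership tracking outlined above could look in code, here is a hedged PyTorch sketch: an LSTM that maps a window of per-musician audio features to a per-frame probability of who is leading. The class name LeaderLSTM, the feature set, and all dimensions are hypothetical illustrations, not the actual system described in this work.

```python
import torch
import torch.nn as nn

class LeaderLSTM(nn.Module):
    """Hypothetical sequence model scoring which musician is leading.

    Input: windows of frame-level audio features (e.g. onset strength,
    tempo deviation, RMS energy) concatenated across all musicians.
    Output: a per-frame probability distribution over the musicians.
    """
    def __init__(self, n_musicians, feats_per_musician=3, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_musicians * feats_per_musician, hidden,
                            batch_first=True)
        self.head = nn.Linear(hidden, n_musicians)

    def forward(self, x):              # x: (batch, time, features)
        out, _ = self.lstm(x)          # out: (batch, time, hidden)
        return self.head(out).softmax(dim=-1)

# Dummy batch: 8 windows of 100 frames, 4 musicians x 3 features each.
model = LeaderLSTM(n_musicians=4)
x = torch.randn(8, 100, 12)
leader_probs = model(x)                # shape: (8, 100, 4)
print(leader_probs[0, -1])             # leader distribution at the last frame
```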
Achieving effective synchronization in human-robot musical ensembles presents a significant challenge. Human musicians naturally adapt to each other using subtle cues and variations in rhythm, tempo, and dynamics, elements that are not easily replicated by machines.
One strand ensures control and sensing of musical instrument components, along with parameters in sound synthesis.
The other focuses on capturing an overall representation of the musical process.
Novel multimodal synchronization system integrating audio, visual, and gestural cues for human-robot musical collaboration.
Advanced machine learning approach for dynamic leader identification in musical ensembles using LSTM networks.
Integration of computer vision techniques for gesture-based musical synchronization and conductor following (a toy beat-detection sketch follows this list).
Continuous learning mechanisms that adapt to individual performance styles and musical preferences.
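To illustrate the gesture-following contribution above, here is a toy OpenCV sketch that flags conducting beats where downward motion reverses to upward motion. The frame-differencing heuristic, the motion threshold, and the minimum inter-beat gap are assumptions for illustration; a production system would use proper pose tracking.

```python
import cv2
import numpy as np

def track_beats(video_path, motion_thresh=40, min_gap_frames=10):
    """Toy conductor follower: flag a beat when downward motion of the
    moving region (e.g. a hand or baton) reverses to upward motion,
    the 'ictus' of a conducting gesture. Thresholds are illustrative."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        return []
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    beats, last_y, falling = [], None, False
    frame_idx, last_beat = 0, -min_gap_frames
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame_idx += 1
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        motion = cv2.absdiff(gray, prev_gray)          # frame differencing
        prev_gray = gray
        ys, _ = np.nonzero(motion > motion_thresh)     # rows of moving pixels
        if len(ys) == 0:
            continue
        y = ys.mean()                                  # image y grows downward
        if last_y is not None:
            if y > last_y + 1.0:                       # still moving down
                falling = True
            elif y < last_y - 1.0 and falling:         # reversal: down -> up
                if frame_idx - last_beat >= min_gap_frames:
                    beats.append(frame_idx)            # beat at this frame
                    last_beat = frame_idx
                falling = False
        last_y = y
    cap.release()
    return beats

# Example usage on a hypothetical recording:
# print(track_beats("conductor.mp4"))
```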
This research has resulted in several peer-reviewed publications and has been presented at international conferences in the fields of music technology, robotics, and human-computer interaction.
S Chakraborty, S Dutta & J Timoney
Humanities and Social Sciences Communications 8 (1), 1–9
A Yaseen, S Chakraborty & J Timoney
International Conference on Human-Computer Interaction, pp. 85–92
S Chakraborty, A Yaseen, J Timoney, V Lazzarini & D Keller
International Computer Music Conference 2022, pp. 132–138
S Chakraborty & J Timoney
Companion Publication of the 25th International Conference on Multimodal Interaction
S Chakraborty, S Singh & S Thokchom
2018 Eleventh International Conference on Contemporary Computing (IC3), pp. 1–6
D Keller, A Yaseen, J Timoney, S Chakraborty & V Lazzarini
Future Internet 15 (4), 125
B Faghih, S Chakraborty, A Yaseen & J Timoney
Applied Sciences 12 (15), 7391
S Chakraborty, S Kishor, S Patil & J Timoney
Joint Conference on AI Music Creativity (AIMC 2019), Stockholm, Sweden
S Chakraborty & J Timoney
2020 5th International Conference on Robotics and Automation Engineering
S Chakraborty, S Aktaş, W Clifford & J Timoney
18th Sound and Music Computing Conference (SMC 2021), pp. 46–52
A Yaseen, S Chakraborty & J Timoney
International Conference on Human-Computer Interaction, pp. 335–347
S Chakraborty, D Keller, A Yaseen & J Timoney
Symposium 2024 (UbiMus 2024), p. 121