Artificial Creative Intelligence - Rethinking musical creativity in the era of deep machine learning

Abstract:

<aside> 💡 The research project led by the ACIDS team at IRCAM aims to model musical creativity by extending probabilistic learning approaches to multivariate and multimodal time series. Our main object of study lies in the properties and perception of musical synthesis and artificial creativity. In this context, we experiment with deep AI models applied to creative materials, with the goal of developing artificial creative intelligence. Our work aims both to decipher complex temporal relationships and to analyze musical information located at the intersection of symbolic (musical notation) and signal (audio recording) representations. Our team has produced many prototypes of innovative instruments and musical pieces in collaboration with renowned composers. Notably, we aim to demonstrate two of our most groundbreaking research prototypes through both a scientific presentation of the tools and a live techno duet performance: 1/ Neurorack // the first deep AI-based Eurorack synthesizer, and 2/ FlowSynth // a learning-based device that lets you travel the auditory space of a synthesizer simply by moving your hand.

</aside>

Resources: