[Column] The Affinity and Future of Live Coding and AI Music


Prologue: The day code took the stage

Text: mmr
Theme: The culture of performing code and the music generated by AI, exploring the creative changes occurring at their intersection.

In the darkness of the club, what appears on the screen is not notation but code: d1 $ sound "bd sn [hh*2]". Not a musical score, but an improvised algorithm.
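
For readers new to the notation, here is a minimal annotated sketch of that one-liner in TidalCycles (bd, sn, and hh are stock drum samples in the default sample library):

    -- d1 is the first of several pattern channels; the quoted string is
    -- Tidal "mini-notation" describing one cycle (roughly one bar).
    d1 $ sound "bd sn [hh*2]"
    -- bd      : bass drum
    -- sn      : snare
    -- [hh*2]  : a subgroup of two hi-hats squeezed into the last third of the cycle
    -- Editing the text and re-evaluating the line changes the music instantly.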

This culture, called "Live Coding," was born in Sheffield, England in the early 2000s: artists write programs in real time on stage and output them immediately as sound, a fusion of music and coding, of club culture and algorithms. This new form of expression would later resonate deeply with AI music.


Chapter 1: The birth of a culture of "playing" code

The origins of Live Coding lie in algorithmic composition; the earliest examples include the automatic-composition experiments of Lejaren Hiller and Iannis Xenakis in the 1950s. Live Coding carried this lineage into the 21st century and restored physicality and real-time performance to it.

In 2004, Alex McLean, Nick Collins, and others founded the community "TOPLAP" under the slogan "Show us your screens!" By sharing the process, the code, that produces the sound with the audience, the idea was to turn the act of production itself into the performance.

Environments such as TidalCycles, SuperCollider, and Sonic Pi make it possible to "write sound by hand" improvisationally, bringing a new kind of liveness to electronic music.
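
As a sketch of that improvisational workflow, the lines below show how a running pattern might be transformed mid-performance by editing and re-evaluating a single line (fast, every, and rev are standard Tidal combinators; the musical choices are arbitrary):

    -- Start with a plain beat and evaluate:
    d1 $ sound "bd sn [hh*2]"

    -- A moment later, edit the same line and re-evaluate:
    d1 $ every 4 rev $ fast 2 $ sound "bd sn [hh*2]"
    -- fast 2      : play the pattern twice per cycle
    -- every 4 rev : reverse it on every fourth cycle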


Chapter 2: Transformation of generation brought about by AI

In the context of AI music, deep-learning-based music generation has advanced rapidly since the late 2010s. Representative examples include Google's "Magenta," OpenAI's "Jukebox," and "Riffusion."

AI does not write code. Instead, it learns patterns from large amounts of data and "internalizes" the rules of production. In that sense, AI is algorithmic intelligence standing "outside" Live Coding. In recent years, however, that boundary has been blurring rapidly.

For example, some TidalCycles users now have GPT suggest code in real time, and cases are emerging in which an AI analyzes a live performance and predicts the next rhythm. This fusion points toward a future in which AI becomes a co-performer in Live Coding.
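
What such a suggestion might look like is sketched below; the prompt and the returned pattern are hypothetical, but the result is ordinary TidalCycles code a performer could evaluate on the spot:

    -- Hypothetical prompt sent to a GPT model from the editor:
    --   "four-on-the-floor kick, off-beat hats, clap on beat three"
    -- The kind of pattern the model might return for live evaluation:
    d1 $ stack
      [ sound "bd*4"        -- four-on-the-floor kick
      , sound "~ hh ~ hh"   -- hats on the off-beats
      , sound "~ ~ cp ~"    -- clap on beat three
      ]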


Chapter 3: Differences between human improvisation and machine “improvisation”

Human live coders delight in errors and chance; unexpected sounds and misreadings drive the music forward. AI "improvisation," by contrast, is reconstruction based on past data, and in essence it stays within the bounds of probability.

However, this difference is itself a source of creativity. AI supplies an effectively endless space of combinations, and humans find meaning within it. The relationship between the two is not one of dominance and subordination but a mutually complementary, creative one.


Chapter 4: Evolution and comparison of major tools

Tool name | Developer / organization | Features | AI collaboration potential
--- | --- | --- | ---
TidalCycles | Alex McLean | Haskell-based live-coding environment specialized in pattern description | Real-time code generation via ChatGPT integration
SuperCollider | James McCartney | Long-established environment for sound synthesis and algorithmic composition | AI-driven control of synthesis parameters in progress
Sonic Pi | Sam Aaron | Ruby-based, designed for both education and performance | AI-assisted code examples used in educational settings
Riffusion | Seth Forsgren et al. | Diffusion model that generates spectrograms | The AI itself generates the sound directly
Ocelot / Hydra | | Live-coding environments integrating generated visuals and sound | AI-driven synchronization of visuals and sound possible

Chapter 5: Examples of collaboration between AI and Live Coding

  • AI-DJ Experiment (CTM Festival, Berlin, 2023): A human live coder performs in TidalCycles while an AI analyzes BPM, harmony, and spatial arrangement to generate responsive mixes in real time. The result was a collaboration in which the AI follows the human's rhythm.

  • Algorave × GPT Jam (Tokyo, 2024): Several live coders receive GPT-based code suggestions on stage and perform while correcting them on the spot. Chat messages from the audience are fed in as input data, an attempt to have the AI read the "atmosphere of the room."

  • Riffusion + Tidal Loop: TidalCycles randomly rearranges sound fragments generated by the AI, a production format in which the AI supplies the "material" and the human supplies the "structure" (see the sketch after this list).
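
A minimal sketch of that division of labor in TidalCycles, assuming the AI-generated fragments have been loaded as a sample folder named riff (a hypothetical name; chop and scramble are standard Tidal functions):

    -- Assume the AI-generated fragments are available as samples "riff:0", "riff:1", ...
    d1 $ chop 8 $ sound "riff"                  -- slice one fragment into 8 grains
    d2 $ scramble 4 $ chop 4 $ sound "riff:1"   -- slice another and reorder its parts at random
    -- The AI supplies the raw material; the human decides how it is cut and sequenced.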


Chapter 6: Ethics and Creativity - What is a “performer” in the age of automation?

Who owns the copyright when an AI writes the code? Does the concept of an "original" even hold in improvised production?

These questions are bound up with the philosophy of Live Coding. TOPLAP's principle of "opening up the process" treats transparency as the democratization of creativity. If AI is to join this culture, we need to resist the black box in the same spirit.

When a generative AI proposes code, its training process and decision criteria should be opened up as well. That transparency is the key to the future of algorithmic music.


Chapter 7: Future prospects - Towards “algorithm-driven collaboration”

In the 2030s, "AI sessions" may well become commonplace in music production, with AI positioned not as a mere tool but as a co-performer. Humans steer the concept and the emotional direction, the AI improvises hundreds of sound patterns, and the act of selecting and editing among them becomes a "performance" in itself.

Furthermore, as live-coding environments integrate with AI, the prompt itself could become the performance interface. No mouse, no MIDI: an era is coming in which language and thought themselves become sound.


Illustration: Live Coding × AI evolution timeline

    timeline
        title Evolution of Live Coding and AI Music (2000–2025)
        2000 : Early live-coding performances emerge in the UK
        2004 : TOPLAP is founded and live-coding culture takes shape
        2009 : Initial version of TidalCycles released
        2016 : Deep-learning music generation models appear (Google Magenta)
        2019 : OpenAI MuseNet released
        2022 : Riffusion begins AI spectrogram generation
        2023 : AI × Live Coding collaboration events expand in Europe
        2025 : GPT-based real-time live-coding environments introduced

Correlation diagram: Collaborative structure of Live Coder and AI

    flowchart TD
        A["Human (Live Coder)"] -->|Code input / improvisation| B["Live-coding environment (Tidal, SuperCollider)"]
        B -->|Generated sound output| C["AI analysis module (tempo / structure analysis)"]
        C -->|Prediction / proposal| D["AI generator (Riffusion, GPT-based)"]
        D -->|Material generation| B
        B -->|Sound output| E["Audience (reaction data)"]
        E -->|Emotion analysis| C

Conclusion: A new democratization of creativity

Through the improvised expression of code, Live Coding opened music up into an act that anyone can create. AI pushes that democratization further, toward a culture in which the very "intelligence of performing" is shared.

Algorithms and humans, machines and emotions: where the boundaries between them melt, a new musical horizon rises.

Code transcends the score, and AI learns to improvise. Music is no longer an exclusively human preserve; it is becoming a co-creative intelligence.


Monumental Movement Records

We carry used records, CDs, cassette tapes, books, and more.