Live performance at ACMC 2017 with a bespoke interactive music system. Generation II is an interactive electronic music performance featuring a live improviser and the Evol algorithmic music system designed by the researcher.
History
ERA Category
Performance of Creative Work - Music
Funding type
Other
Eligible major research output?
Yes
Research Statement
Research Background
Interactive music systems enable collaborative performances between human musicians and computational systems. The field has roots in the early days of computing and has seen significant advances; the researcher has contributed to it for over 20 years. Generation II is an interactive electronic music performance featuring a live improviser and the Evol algorithmic music system designed by the researcher. The system is grounded in theories of music composition and psychology, and explores how musical structure can emerge through the variation and elaboration of motifs. The performance exemplifies the interactive application of these processes in a live, improvisational context.
Research Contribution
Generation II utilises multiple software agents within the Evol system to create layered musical textures that evolve subtly over time, synchronised to a shared temporal framework. The system iteratively elaborates on fragments captured from the live performance, using them as motifs; the aesthetic effect is reminiscent of 20th-century minimalist composition. The work extends the researcher's investigation into reflexive music software, in which live performance gestures are captured and transformed into generative musical accompaniment. It demonstrates how improvisation with reflexive software can produce aesthetically coherent music, emphasising the creative potential of real-time human-machine collaboration in performance.
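To illustrate the kind of process described above, the following minimal Python sketch shows several software agents sharing a common cycle clock and independently making small variations to a captured motif, producing layered, slowly evolving material. This is a hypothetical illustration only, not the Evol system or the researcher's code; all names (MotifAgent, perform) and the representation of the motif as MIDI pitch numbers are assumptions made for the example.

```python
"""Hypothetical sketch (not the actual Evol code): several agents share a
cycle clock and independently elaborate a captured motif by small,
cumulative variations, yielding layered textures."""
import random


class MotifAgent:
    """One generative layer: holds a variant of the captured motif and
    mutates it slightly on each cycle of the shared clock."""

    def __init__(self, motif, seed):
        self.motif = list(motif)          # working copy of the captured motif
        self.rng = random.Random(seed)    # per-agent randomness

    def elaborate(self):
        # Small variation: nudge one pitch by a step, or rotate the motif.
        if self.rng.random() < 0.5:
            i = self.rng.randrange(len(self.motif))
            self.motif[i] += self.rng.choice([-2, -1, 1, 2])
        else:
            self.motif = self.motif[1:] + self.motif[:1]
        return list(self.motif)


def perform(captured_motif, n_agents=3, cycles=4):
    """Step all agents in lockstep against a shared cycle counter and
    print the layered output for each cycle."""
    agents = [MotifAgent(captured_motif, seed=i) for i in range(n_agents)]
    for cycle in range(cycles):
        layers = [agent.elaborate() for agent in agents]  # synchronised step
        print(f"cycle {cycle}: {layers}")


if __name__ == "__main__":
    # A motif 'captured' from the improviser, as MIDI note numbers.
    perform([60, 62, 64, 67])
```

In this sketch the shared loop stands in for a common temporal framework, and the gradual per-agent mutations stand in for motif elaboration; a real interactive system would capture motifs from live input and render the layers as sound.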
Research Significance
Generation II was selected through peer review for performance at the Australasian Computer Music Conference (ACMC) in Adelaide in 2017. The performance illustrates the ensemble effect of human-machine collaboration, achieving coherence through mimetic compositional processes and alignment in musical articulation and structure. Such coherence and alignment are rarely achieved with this clarity in interactive music systems.