CREATING A RESPONSIVE VISUALIZATION THAT REACTS WITH MUSIC IN REAL TIME: INTEGRATING ABLETON LIVE 9 AND CYCLING ’74 MAX FOR LIVE INTO A MUSICAL PERFORMANCE
Publisher: The University of Arizona.
Rights: Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction, or presentation (such as public display or performance) of protected items is prohibited except with permission of the author.
Abstract: Over the past three semesters, I have been refining my Honors Thesis to match exactly what I wanted to study. When deciding what to focus on, it was obvious that I wanted to integrate musical performance with a visual performance, similar to what a Video Disc Jockey (VDJ) might incorporate into a live set. Currently, there are quite a few percussion pieces that use Max patches to process their sounds, or that use pre-made visuals that play along with the music. My goal with this Thesis was to find a way to use a patch whose visuals react to the music in real time. What makes this different from other performances is that each performance will be slightly different from the one before it or after it. To me, this is something very exciting about music and technology. Throughout this essay, I will explain how I approached composing a piece specifically for this purpose, as well as my learning process for integrating a Max patch with the music.
Degree Program: Honors College