CREATING A RESPONSIVE VISUALIZATION THAT REACTS WITH MUSIC IN REAL TIME: INTEGRATING ABLETON LIVE 9 AND CYCLING ’74 MAX FOR LIVE INTO A MUSICAL PERFORMANCE
dc.contributor.advisor | Weinberg, Norman | en |
dc.contributor.author | Barsetti-Nerland, Daniel Edward | |
dc.creator | Barsetti-Nerland, Daniel Edward | en |
dc.date.accessioned | 2016-06-10T18:58:04Z | |
dc.date.available | 2016-06-10T18:58:04Z | |
dc.date.issued | 2016 | |
dc.identifier.citation | BARSETTI-NERLAND, DANIEL EDWARD. (2016). CREATING A RESPONSIVE VISUALIZATION THAT REACTS WITH MUSIC IN REAL TIME: INTEGRATING ABLETON LIVE 9 AND CYCLING ’74 MAX FOR LIVE INTO A MUSICAL PERFORMANCE (Bachelor's thesis, University of Arizona, Tucson, USA). | |
dc.identifier.uri | http://hdl.handle.net/10150/612546 | |
dc.description.abstract | Over the past three semesters, I have been refining my Honors Thesis to focus on exactly what I wanted to study. When deciding on a topic, it was clear that I wanted to integrate musical performance with visual performance, similar to what a Video Disc Jockey (VDJ) might incorporate into a live set. Currently, quite a few percussion pieces use Max patches that process the sounds or play pre-made visuals along with the music. My goal with this Thesis was to find a way to use a Max patch whose visuals react to the music in real time. What makes this different from other performances is that each performance will be slightly different from the one before it or after it. To me, this is one of the most exciting things about combining music and technology. Throughout this essay, I will explain how I approached composing a piece specifically for this purpose, as well as my learning process in integrating a Max patch with the music. | |
dc.language.iso | en_US | en |
dc.publisher | The University of Arizona. | en |
dc.rights | Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author. | en |
dc.rights.uri | http://rightsstatements.org/vocab/InC/1.0/ | |
dc.title | CREATING A RESPONSIVE VISUALIZATION THAT REACTS WITH MUSIC IN REAL TIME: INTEGRATING ABLETON LIVE 9 AND CYCLING ’74 MAX FOR LIVE INTO A MUSICAL PERFORMANCE | en_US |
dc.type | text | en |
dc.type | Electronic Thesis | en |
thesis.degree.grantor | University of Arizona | en |
thesis.degree.level | Bachelors | en |
thesis.degree.discipline | Honors College | en |
thesis.degree.discipline | Music Education | en |
thesis.degree.name | B.M. | en |
refterms.dateFOA | 2018-09-11T12:26:31Z | |
html.description.abstract | Over the past three semesters, I have been refining my Honors Thesis to focus on exactly what I wanted to study. When deciding on a topic, it was clear that I wanted to integrate musical performance with visual performance, similar to what a Video Disc Jockey (VDJ) might incorporate into a live set. Currently, quite a few percussion pieces use Max patches that process the sounds or play pre-made visuals along with the music. My goal with this Thesis was to find a way to use a Max patch whose visuals react to the music in real time. What makes this different from other performances is that each performance will be slightly different from the one before it or after it. To me, this is one of the most exciting things about combining music and technology. Throughout this essay, I will explain how I approached composing a piece specifically for this purpose, as well as my learning process in integrating a Max patch with the music.