Program - Interactive Music


Creative Music Technology

INTERACTIVE MUSIC
Friday 27 May – Basil Jones Orchestral Hall

“Leviathan” by Zi Hui Chen & Rhiannon Hurn

Leviathan is heavily based on Everything Everything’s album Raw Data Feel, and features glitchy textures with a disco feel. We were influenced by the algorithmic nature of the album’s percussion textures and wanted to recreate that quality in our piece. We have created a story around which our piece evolves, and each section is distinctly different as the story progresses.

Zi Hui has drawn on her previous experience with Ableton’s Session View to organise the structure of the piece, making it easier to understand and perform. She has also produced most of the sounds by taking inspiration from Raw Data Feel, such as selecting sounds to load into the drum rack for percussive textures and deciding what parameters to automate and when. Zi Hui will also perform on the MIDI keyboard, for example playing the chords at the end of the piece while MIDI automation runs.

Rhiannon begins by speaking through a microphone; her vocals are distorted with an effects rack to simulate a computer announcing the download status of the AI. Once the “download” is complete, she uses the push pads to build up the percussion, simulating the AI growing in intelligence. Throughout the performance she uses faders and knobs to control levels and performs live MIDI automation. At the end of the performance, she uses the Wii Remote to control redux and vinyl static. She has also contributed musical decisions, such as deciding which textures are appropriate.

The primary stimulus for our piece was presented by Rhiannon, and it helped drive the creative process. Our piece follows the life of an AI: it begins at its ‘birth’, then forms its own understanding of musical structures, breaks out of its algorithm, and finally destroys itself through information overload. This can be heard in the gradual build-up of instrumental layers and effects. We have also made use of a microphone to perform spoken word, adding audio effects to it live.


“Dark Punk” by Ethan Waller, Matt Rabbas and Harry Morris

Our piece takes inspiration from the nature of cinematic music and electronica, as well as grunge rock. The three contrasting sections progress like a changing story with a changing soundscape, which could be compared to stages of the night (dusk, night, dawn) or the rapid mood changes in films (even though the music itself isn’t all that similar to film music). Our aim was to display the agency that the various control surfaces and instruments, as inherently interactive instruments, gave us.

Ethan (person 1): Ethan will manage the Ableton Push during his section and arm the instruments for Matt to play, as well as cue the final chords of his section to flow into Matt’s. He then moves to the keyboard to play chords for Matt’s section, before moving to the nano kit during Harry’s section.

Matt (person 2): Matt starts by playing chords during Ethan’s section, then takes control of the Ableton Push to fade the final chords of Ethan’s section into his own. Matt triggers the bass line using the Push kit while Ethan plays and fades in the final part of his section, which flows into Harry’s section. Matt plays pads for Harry’s section as well as the bass line during the climax of the entire piece (the finale of Harry’s section).

Harry (person 3): Harry begins by playing an improvised guitar solo during Ethan’s section, then moves to a Wii Remote to play drums. Harry then moves to the Push kit to trigger his section and arms the pads for Matt to play. During his section, Harry sings and loops ambient vocals and plays distorted guitar power chords, finishing with a guitar riff.

To further represent the idea of contrasting sections, we all rotate sections in performance so that each of us can add our own artistic views while keeping the piece a unified work.
We make use of the Wii Remote and live looping for theatrical effect, as well as a combination of acoustic and electronic/experimental sounds, to bring a sense of unexpectedness to the piece. We wanted to make the performance as lively as possible, creating literal movement as we switch from one instrument or tool to another.

“Look Up” by Morgan Chippendale, Matthew Johns and Jeremy Emot

Our performance is experimental electronica with a strong hip-hop influence, titled “Look Up”. The main influence for this piece was Flume’s 2016 album Skin, on which he experimented with a complex combination of synth timbres and heavy sidechaining to create intense pulsating effects. We have integrated synth instruments, vocals and guitar throughout the performance, using MIDI controllers to affect the instruments being played live. These instruments all have interesting timbres which fit together in a satisfying and unique way. The piece focuses on this use of timbre and texture to help the audience transcend the limits of reality and enter a new musical world.

Morgan (person 1): Morgan will control the Ableton Push to trigger loops and move through scenes, and will also manipulate textural sounds and effects using knobs. Later in the performance, Morgan will rap about the complex state of the world, utilising many vocal techniques to sit within the mix.

Matthew (person 2): Matthew will control the Novation Launchkey MK3 MIDI keyboard, playing different synth parts and drum grooves. He will use the drum pads to control the effects on Morgan’s vocals and Jeremy’s guitar. On Morgan’s vocals, Beat Repeat and Pitch Shifter will be used in specific phrases to add emphasis and create a satisfying sound. Finally, Matthew will use one of the drum pads to control the volume of essentially all parts simultaneously, silencing the project for a beat before bringing it back for the chorus.

Jeremy (person 3): Jeremy will play melodic ideas on the guitar, utilising both lead and ambient styles. This versatility allows the same instrument to take two different roles within the piece. Jeremy will add subtle textures to fill space and help immerse the audience, and he also designed the main effects on the guitar tracks to fit cohesively into the song.

We hope the live manipulation of audio is noted, especially on the vocals: not only are complex interactive techniques used, but precise timing is required.
We also hope that the combination of contrasting synthesisers playing in unison is recognised, as it creates a thick texture and adds energy to the piece.

“NOW 1” by Olivia Gadenne, Alexis Luxford and Sarah Matheson

For this performance, we decided to sample the acoustic guitar intro from Metallica’s “Fade to Black” and build alternative melodies and harmonies on top. We were also inspired by the lo-fi genre and aimed to give our performance a chilled-out, ambient vibe. We also drew a lot of inspiration from electronic music, which we tried to convey through the effects used.

Liv (person 1) will use the Ableton Push to trigger sounds, arm tracks and manage Session View during the performance. Alexis (person 2) will use the nanoKONTROL to adjust faders and panning, as well as Wii Remotes to handle other miscellaneous effects, such as EQ, reverb and fuzz. Sarah (person 3) will use the Novation Launchkey to play and record different MIDI instruments; the parts she plays include our melody and bass lines.

We hope our use of effects is noted, as well as our looping and Beat Repeat techniques. We would also like the assessors to look out for how we use effects such as fuzz, delay, reverb and EQ envelopes to achieve an ambient, lo-fi feel in our music.

“Dreamscape” by Ellee Chapman, James Madden and Caitlin Mills

In our ensemble performance, our trio will showcase the collaborative design and execution of an original interactive music work. The initial inspiration for our piece came from traditional Japanese anime openings; however, we decided to add an unexpected section where our song transitions into a more bittersweet melody. This structure helps us present the main idea of the piece: portraying someone entering a beautiful setting in their dream while they sleep and enjoying themselves, until the dream begins to end and they awaken to a much duller reality. The dream acts as an outlet, or escape, from their reality (hence the name: we thought it was a cool way to combine the words Dream and Escape).

Ellee (person 1) will use her voice as her primary instrument, singing with a microphone throughout most of the piece and manipulating her voice with an OSCulator-connected Wii Remote that controls dry/wet audio effects mapped in Ableton. Ellee will also use the nano controller to control mapped effects on Caitlin’s guitar, shaping the sound the audience hears as Caitlin plays chords.

James (person 2) will provide the percussive elements of our piece. Using the Ableton Push, he will play a bit-crushed hip-hop drum groove and an experimental bass arpeggio synth in the first section, then a more bittersweet melody on a MIDI synth pad and a different-sounding drum kit during the second section. James will also control the timing and volume of each of his loops using the knobs and buttons on the Push.

Caitlin (person 3) will start by playing a simple ‘anime-esque’ piano melody on the Novation Launchkey throughout section 1.
During this time, pre-set audio effects from the nano controller will be heard as she plays; this provided the basis around which we structured our piece. During section 2, Caitlin will switch from the Launchkey to her electric guitar, connected through the Focusrite, and will play various chords throughout the remainder of the performance while Ellee controls her sound using mapped effects.

Throughout our performance, we hope that our use of various mapped effects and looping methods is noted, as well as how the various technologies and instruments we incorporate are fused together to create an interactive work combining the elements of production and composition.
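The dry/wet control Ellee performs with the Wii Remote can be sketched in miniature. OSCulator can report the remote’s orientation as a pitch angle, which can be scaled to a 0–1 “wet” amount and used to crossfade between the dry and effected signal. This is a hypothetical illustration of the mapping idea only, not the group’s actual OSCulator patch; the angle range and the linear crossfade are assumptions.

```python
import math

# Hypothetical Wii Remote dry/wet mapping: the remote's pitch angle
# (assumed to range from -pi/2 pointing down to +pi/2 pointing up)
# is scaled to a 0-1 "wet" amount, which then crossfades two signals.

def pitch_to_wet(pitch: float) -> float:
    """Scale a pitch angle in radians to a dry/wet amount in [0.0, 1.0]."""
    clamped = min(max(pitch, -math.pi / 2), math.pi / 2)
    return (clamped + math.pi / 2) / math.pi

def mix(dry: float, wet: float, amount: float) -> float:
    """Linear dry/wet crossfade of two sample values."""
    return dry * (1.0 - amount) + wet * amount
```

Under this mapping, holding the remote level leaves the mix half wet, and tilting it fully upward passes only the effected signal.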

“Earthwax” by Jesse Beaumont, Eden Shepherd and Josh Crawford

In the Interactive Music concert, we will be performing our original composition “Earthwax”. This piece draws inspiration and takes stylistic cues from chill, “downtempo” electronic music, especially the artists Jamie xx and Tourist. Earthwax attempts to envelop and immerse the listener in its atmospheric, hypnotic soundscapes through its slow tempo, discerning use of repetition, and extensive implementation of effects, particularly reverb. Despite its initially laid-back vibe, it gradually increases in intensity and complexity, both melodically and rhythmically. It will be performed live on Friday using Ableton Live, controlled by a combination of MIDI devices and live instrumentation.

Jesse (position 1): Jesse will use an Ableton Push to essentially “run” the performance, utilising its 64 pads to trigger loops and transition between different parts of the project. He will also MIDI-map the Push’s knobs so he can modulate different effects and parameters in real time.

Eden (position 2): Eden will primarily perform live bass into the Focusrite, processed heavily to create new timbres and differentiate it from a clean bass sound. In the parts of the performance where synthesised bass is used instead, Eden will use a nanoKONTROL to modulate and filter certain parts of the composition.

Josh (position 3): Josh will use a Wii Remote and Nunchuk with OSCulator, mapped so that swinging them downwards triggers drum presets, mimicking the action of hitting a drum. He will also play MIDI parts on a Launchkey in other sections of the performance.
Assessors should pay close attention to our novel use of Wii Remotes with OSCulator, the gradual building and layering of musical elements throughout the piece’s duration, the creation of unique digital “instruments” using Wavetable, and our use of Ableton audio effects to create a more atmospheric, dreamlike soundscape.
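The swing-to-trigger idea behind the Wii Remote drums can be sketched as a simple edge detector on the accelerometer stream that OSCulator forwards: a hit fires the first time the vertical acceleration crosses a downward threshold, so one swing produces one drum hit rather than a burst of retriggers. The threshold value and the single-axis simplification here are assumptions for illustration, not the group’s actual mapping.

```python
# Assumed downward-acceleration threshold (in g) for registering a "strike".
STRIKE_THRESHOLD = -1.5

def is_strike(prev_z: float, z: float,
              threshold: float = STRIKE_THRESHOLD) -> bool:
    """True only when the vertical acceleration first crosses the threshold,
    i.e. the previous sample was above it and the current one is below."""
    return z < threshold <= prev_z

def count_hits(samples: list, threshold: float = STRIKE_THRESHOLD) -> int:
    """Count distinct strikes in a stream of vertical-acceleration samples."""
    hits = 0
    prev = 0.0
    for z in samples:
        if is_strike(prev, z, threshold):
            hits += 1  # in performance this would fire a drum preset
        prev = z
    return hits
```

In the live setup, each detected strike would be translated into a MIDI note for Ableton; here the stream is just counted so the gating logic is easy to check.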


“NOW 2” by Luka Ison, Oliver King and Jeremy Rich

For this performance piece, we were inspired by musical juxtaposition: the concept of two heavily contrasting sections that transition between one another abruptly yet fluently. Freely improvised chaotic noise, created via the electronic manipulation of acoustic instruments, is at the forefront of the first section, inspired by musical acts such as AMM, Otomo Yoshihide, Masonna and Supersilent. As this section builds through a crescendo to its intense, noisy climax, the second section is suddenly exposed: a drone-ambient passage featuring sparse, jazzy improvised chords and compressed lo-fi drums. This section draws heavily on minimalist acts such as Boards of Canada, Boris and Terry Riley.

Oliver King will perform improvisation on electric guitar, using extended techniques and acoustic manipulation such as a guitar slide, along with live electronic manipulation via a Nintendo Wii Remote strapped to the guitar and controlling parameters in Ableton. The movement and motion of the Wii Remote will control parameters such as echo and delay amount, ring modulation rate and amount, and pre-programmed 707 drum machine hits, among other details; certain buttons on the remote will control parameters such as distortion and delay.

Jeremy Rich will improvise on flugelhorn through a Shure PGA98H, which runs through his pedalboard of effects. He will mostly utilise his Boss MD-200 modulation multi-effects pedal, particularly its harmoniser and overtone functions. Combining these effects with his Swollen Pickle fuzz, and finally running through hard compression, results in a transformed sound that barely resembles a typical flugelhorn tone. Through lighter, stripped-back areas of the song he will disable his effects and go for a reverberated clean tone, with modulated delay at the ends of small improvised phrases.
Luka Ison will perform on the Ableton Push and the Novation Launchkey to control various parameters, particularly in the first section. FM synthesis is utilised in the first section using Ableton’s Operator, with parameters such as the coarse and fine pitch of the oscillators, along with effects like phaser, reverb and distortion to colour the sound. The keyboard and Push are used more in the second section to create and trigger loops of drums and synths, as well as to manipulate the timbre of those loops.

“The Monks of Cha Cha” by Joseph Cross, Charlotte Lennon and Connor Townson.


“The Monks of Cha Cha” is a performance piece that draws its sound world from the dulcet tones of choral music and the relaxed atmosphere of micro house. The piece begins with a ‘soundscape’ built from choir samples, presenting the musical idea of the chant and creating suspense leading into the second section. This new sonic landscape draws on the work of artists such as Daft Punk and Joris Voorn through a four-on-the-floor bass line (at 130 bpm) and sampled sounds that create space.

Joe will use the app TouchOSC on his iPhone, connected through TouchOSC Bridge and then directly into Ableton on the MacBook, to manipulate panning and faders throughout the performance. The program will also trigger the choir samples to add context before letting the entire sample play out, saying ‘we are the monks of Cha Cha, sacred keepers of the rhythm’. This helps deliver a clear message to the audience while letting us use wireless MIDI control to enhance the overall performance.

Charlotte will make use of the Novation Launchkey MIDI keyboard to add melodic elements to the piece. Leading up to the performance, instruments were chosen to suit and work well with the loops that had been created. Using either the MIDI keyboard or the Ableton Push, an instrument track can be selected and used to record or simply perform the melodic ideas, allowing the ideas throughout our piece to be expanded through melodic embellishment.

Throughout the performance, Connor will utilise the Ableton Push to trigger loops and instruments and develop the piece. The Push provides a strong basis for structuring the piece while leaving enough sonic space for timbral manipulation.
Additionally, altering certain frequencies on different instruments will further the piece by expanding on this timbral manipulation. Of particular interest in this performance is the group interaction and the use of different controllers. The main mode of development is timbre, reflected in the use of TouchOSC and the manipulation of different effects. Structurally, most of the performance’s direction is oriented around the Push, the use of which encourages interactivity. Lastly, more “live” melodic elements that accentuate the performance are played on the Novation Launchkey.
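The TouchOSC signal path described above can be sketched at the scaling step: TouchOSC faders send unit-range values over OSC, and TouchOSC Bridge hands them to Ableton as MIDI, whose control-change values are 7-bit (0–127). The functions below are a hypothetical illustration of that scaling; the pan convention (-1.0 hard left to +1.0 hard right, 64 as centre) is an assumption, not the group’s actual layout.

```python
# Hypothetical scaling from TouchOSC's unit-range OSC values to 7-bit
# MIDI CC values, the form in which Ableton receives them via the Bridge.

def fader_to_cc(value: float) -> int:
    """Map a fader value in [0.0, 1.0] to a MIDI CC value in [0, 127]."""
    clamped = min(max(value, 0.0), 1.0)
    return round(clamped * 127)

def pan_to_cc(pan: float) -> int:
    """Map an assumed pan range [-1.0, 1.0] to MIDI CC, centred near 64."""
    clamped = min(max(pan, -1.0), 1.0)
    return round((clamped + 1.0) / 2.0 * 127)
```

Clamping out-of-range input mirrors what a robust OSC-to-MIDI layer has to do, since a stray message outside the expected range would otherwise produce an invalid CC byte.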

