Music Technology Live Capstone

Mutual abductions
Spencer Wallace and Arthur Zhiliang Tan

Mutual abductions: an interactive improvisatory environment for two performers on vastly contrasting instruments, featuring a simple trill as its primary musical material. Spencer and Arthur attempt an imitative call-and-response via a bespoke Max MSP patch (a conceptual "tunnel" akin to that found in the children's game "Chinese Whispers"). This patch incorporates a granular synthesiser designed to garble the incoming audio signal to varying extents depending on how "complicated" the raw material is: simple, discrete sonic events pass through relatively unaffected, but longer messages are increasingly re-pitched, increasingly polyphonic, more finely granularised, and spat out ever faster. Metering of various input parameters (pitch, duration, loudness etc.) controls the granular synthesiser, resulting in a fully autonomous patch that interacts with the performers as much as the performers interact with it. Mutual abductions is therefore presented as a trio, not a duo, with an invisible third performer harbouring a dangerous capacity to transform audio into something completely unrelated, despite relying entirely on live-sampled input.

Each human performer attempts to imitate only the more complicated material from the Max patch, and not their counterpart's live input, so the messages leaving the patch become ever more mutilated. Inevitably, the human performers fail to realise this complexity on their individual instruments due to limitations of technique, technology and ability, culminating in a sonic outburst of frustration that only worsens the garbling of the messages; this frustration ultimately descends into a sombre acceptance of failure as the piece concludes.

Spencer's instrument is a multi-timbral synthesiser designed and realised in Ableton Live. He transitions selectively between its voices using a combination of live playing and looping to create texture and energy throughout the different sections. Arthur's instrument is a French horn, which he uses conventionally as the piece opens but increasingly resorts to extended techniques in his futile attempts to imitate the Max patch's output: a re-appropriation of the horn as a piece of technology. As the piece progresses, however, it should become apparent that for both performers, at least in the context of Mutual abductions, "technology" and "instrument" are synonymous terms: they merely describe the tools with which we realise a sonic concept.

On the surface, Mutual abductions is an exploration of granular synthesis as a compositional technique, demonstrating the capacity of a granular synthesiser to create sonically complex material out of extremely simple input (sustained trills, discrete percussive transients etc.). Of course, composition is better defined as intelligently designed sonic interactions rather than simply "a whole lot of complex sounds"; thus, as a piece of live music, the work relies on performer input to connect sonic events intelligently and present a unified gestalt: each performer interacts with their own technology, with the Max patch, and thus indirectly with each other, arriving at a sonic whole that is greater than the sum of its parts. Ultimately, then, Mutual abductions rejects a concrete compositional objective in favour of a focus on the advanced and fluent use of technology in whatever form.
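By way of illustration only, the Python sketch below shows one way the "complexity metering" described above could be expressed: metered duration, pitch movement and loudness are collapsed into a single score that scales grain size, re-pitching, polyphony and output rate. All names and ranges here are assumptions; the actual behaviour lives in the Max MSP patch.

# Hypothetical sketch of the complexity-metering idea (not the actual Max patch):
# metered input descriptors are collapsed into a single 0-1 "complexity" score,
# which then scales the granular engine's parameters.

from dataclasses import dataclass

@dataclass
class GrainSettings:
    grain_ms: float      # grain length (shorter = finer granulation)
    pitch_shift: float   # semitones of re-pitching
    voices: int          # simultaneous grain streams (polyphony)
    rate: float          # playback rate of the garbled output

def complexity(duration_s: float, pitch_var: float, loudness: float) -> float:
    """Collapse metered input parameters into a 0-1 complexity score."""
    d = min(duration_s / 8.0, 1.0)   # longer messages count as more complex
    p = min(pitch_var / 12.0, 1.0)   # wider pitch movement -> more complex
    l = min(loudness, 1.0)           # louder input -> more complex
    return (d + p + l) / 3.0

def granular_response(c: float) -> GrainSettings:
    """Simple material passes through; complex material is garbled harder."""
    return GrainSettings(
        grain_ms=200.0 - 180.0 * c,  # 200 ms down to 20 ms grains
        pitch_shift=12.0 * c,        # up to an octave of re-pitching
        voices=1 + int(7 * c),       # up to 8 overlapping grain streams
        rate=1.0 + 3.0 * c,          # spat out up to 4x faster
    )

print(granular_response(complexity(duration_s=0.3, pitch_var=1.0, loudness=0.4)))
print(granular_response(complexity(duration_s=6.0, pitch_var=10.0, loudness=0.9)))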
‘Boss Fight’
Liam Hartley, Jared Hundermark, Samuel Luck and Fin Wegener

‘Boss Fight’ is an audiovisual performance that features live electronic music, synthesised visuals, live lighting control and motion tracking technology. One performer uses motion tracking technology to control an avatar in a retro-styled video game, dodging projectiles from three floating antagonists in a neon, space-themed environment. Live electronic music is performed as the score, while a lighting system is controlled to emphasise the mood and events of the game. The avatar proceeds through three levels of increasingly difficult gameplay before facing the final ‘boss’ and determining the outcome of the adventure.

The accompanying soundtrack features chip-tune-inspired synthesiser sounds alongside modern rhythmic and atmospheric devices, in a contemporary take on the scores of early video games from Nintendo consoles such as the GameCube. The music explores both retrospective and futuristic approaches to its aesthetic while highlighting the increasing levels of intensity displayed in the video game through the use of ambience, motifs, repetition, timbral variation and the gradual introduction of rhythmic elements.

As the guiding concept of the piece, the video game system informed much of the construction of the work. Built in the visual synthesis software TouchDesigner, the game features an abstract humanoid avatar and three separate enemies of increasing complexity. The design of this game system involved character and scene design, advanced coding to simulate the physics of projectiles, and a MIDI control system that allows the game to be driven in order to propel the performance.

To achieve the integration of real-time motion tracking, the player of the video game is tracked by an Xbox Kinect’s infrared sensor to produce data that is interpreted within TouchDesigner. The data is processed, scaled and exported to a second instance of TouchDesigner on a separate computer that hosts the video game visualisation. Through this technique, the player is able to control the avatar in the game in real time, adding immersive control and engagement to the performance. Additionally, the body-tracking data set is converted to MIDI data and sent to a virtual modular synthesiser in VCV Rack. This MIDI data modulates multiple parameters of the synthesiser, such as pitch, filter, resonance and effect levels, allowing the player to determine the sound of the instrument through their body (a sketch of this mapping follows this note). Inspired by the work of new-media artists Chris Vik and Bileam Tchepe, this additional layer of immersive control bridges the gap between musical and dance performance to realise an improvisatory approach to choreomusical composition.

Accompanying the music and visuals is a synchronised lighting display sequenced in Ableton Live and programmed in PureData. Featuring colours consistent with the aesthetic of the game and levels of intensity complementary to the structure of the work, the lighting display emphasises key events and moods throughout the piece. ‘Boss Fight’ is an ambitious, experimental approach to live performance that balances technological integration between systems with human operation, as performers both interact with data sets from other systems and react to one another’s performance.
The predetermined narrative of the game allows the producers to improvise specific elements while leaving others to run without intervention, opening possibilities for serendipity while preventing collapse. The piece has been an extensive learning experience for all involved and has indicated numerous areas of immersive tech-art for future exploration.
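As a rough illustration of the body-tracking-to-MIDI idea described above, the sketch below scales normalised joint positions into MIDI control-change messages of the kind that could modulate a VCV Rack patch. The joint names, CC numbers and ranges are assumptions for the example, not the group's actual TouchDesigner network.

# Illustrative sketch: normalised Kinect joint positions -> MIDI CC messages.
# (The group's actual pipeline runs inside TouchDesigner; this is an analogue.)

import mido

# Hypothetical mapping: which joint axis drives which synth parameter.
CC_MAP = {
    ("right_hand", "y"): 74,   # e.g. filter cutoff
    ("right_hand", "x"): 71,   # e.g. resonance
    ("left_hand",  "y"): 1,    # e.g. effect level (mod wheel)
}

def joints_to_cc(joints: dict) -> list:
    """joints: {"right_hand": (x, y), ...} with coordinates normalised to 0-1."""
    messages = []
    for (joint, axis), cc in CC_MAP.items():
        x, y = joints[joint]
        value = x if axis == "x" else y
        value = max(0.0, min(1.0, value))          # clamp to the normalised range
        messages.append(mido.Message("control_change",
                                      control=cc,
                                      value=int(value * 127)))
    return messages

# Example frame of (already scaled) tracking data:
frame = {"right_hand": (0.2, 0.8), "left_hand": (0.5, 0.4)}
for msg in joints_to_cc(frame):
    print(msg)   # in performance these would be sent to a virtual MIDI port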
Digital Orthodox
Annika Pine, Sasha Muratidis, Kyla Ingham, Tae Young

With creative production and art existing as a reflection of the current times, our group wanted to delve into how different art forms and technologies inform, reveal and endanger one another. Each member of our group took a technological area of interest and explored how they could grow their skill set through our performance. We have focussed on many art forms to make this a completely immersive experience, hopefully letting the audience explore what traditionality truly means in a digital world.

Acting as the ‘traditional’ art aspect of our performance is a live drawing performed by Annika Pine. Operating as a creative vessel, it aims to explore how advanced technology and orthodox art forms can communicate with and inform each other. As it is performed under a projection of Kyla’s TouchDesigner piece, colour, gesture and linework are influenced in both directions. The drawing also heavily influences the sound through piezo (contact) microphones: microphones that pick up sound through the vibrations of solid objects rather than through the air.

Working alongside the contact microphones, Sasha’s work in Ableton Live is part of what really makes this performance an immersive use of music technology. To further enhance the perception of liveness, Sasha will apply a multitude of audio effects and live processing within Ableton Live to dynamically alter the original drawing sound, using a Novation Launchkey controller. She has prepared multiple layers of pre-composed instrumentation that she will manipulate and construct live, responding to every other member and their creative contribution.

The visuals created within TouchDesigner focus on the use of Noise TOPs, which are manipulated into different instances of generative art. Different parameters within these noise generators, along with levels and colour palettes, are controlled via a MIDI keyboard. Finally, a Wii Remote that controls the feedback is used in the finale of the performance to reset the visuals to a blank slate, bringing focus back to Annika’s artwork. Though the visuals are quite simple, they act as the buffer zone between the traditional art of drawing and the more advanced uses of music technology.

While none of the lighting techniques utilised are new to the industry on their own in a technical sense, particularly given the constraints of DMX lighting, the purposeful performability of the lighting allows full creative and emotive control over the performance, within the provided parameters, unlike a pre-programmed theatre show or a completely audio-reactive performance. This encourages a fluid and cohesive lighting atmosphere and makes responding to unpredictable on-stage events and generative expression more effective. It is achieved through a heavily modified Pure Data patch communicating with DMXIS (software for controlling lighting via DMX, the Digital Multiplex protocol) and MIDI inputs; a sketch of this kind of mapping follows this note. Several deliberate controls have been programmed to create variation in the lighting, including separate colour faders, independent colour panners, individual MIDI keys and strobing effects. Additionally, some 5-volt LED strips have been programmed using an Arduino Uno microcontroller; these also have performable parameters, reserved for the climax of the performance, whose timing is not fixed in advance.
Through ensemble-like communication, the manipulated live sounds, controlled lighting, musicianship, and both traditional and digital artwork generate a connection between sound and image that showcases our group’s advanced use of music technology.
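The sketch below is a Python analogue of the MIDI-to-DMX mapping idea mentioned above (the group's real implementation is a Pure Data patch talking to DMXIS). Fixture layout, CC sources and the panning law are assumptions for illustration only.

# Illustrative mapping: 7-bit MIDI control values -> 8-bit DMX channel levels,
# with a separate colour fader per channel and a simple colour panner.

def cc_to_dmx(value_7bit: int) -> int:
    """Scale a 0-127 MIDI CC value to a 0-255 DMX level."""
    return max(0, min(127, value_7bit)) * 2

def colour_fader(cc_red: int, cc_green: int, cc_blue: int) -> dict:
    """Separate colour faders: one CC per colour channel of an RGB fixture."""
    return {"red": cc_to_dmx(cc_red),
            "green": cc_to_dmx(cc_green),
            "blue": cc_to_dmx(cc_blue)}

def colour_panner(cc_pan: int, left: dict, right: dict) -> tuple:
    """Independent colour panner: crossfade one colour between two fixtures."""
    pan = cc_to_dmx(cc_pan) / 255.0
    dim_left = {k: int(v * (1.0 - pan)) for k, v in left.items()}
    dim_right = {k: int(v * pan) for k, v in right.items()}
    return dim_left, dim_right

warm = colour_fader(cc_red=127, cc_green=40, cc_blue=0)
print(colour_panner(cc_pan=96, left=warm, right=warm))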
REZNOR
Rhys Evans, Nick Broome, Hugh Sinclair and Sid Moktan

‘REZNOR’ is an audio-visual piece composed by Rhys Evans, Nick Broome, Hugh Sinclair and Sid Moktan. The piece is named after Nine Inch Nails frontman Trent Reznor, whose musical style was a major inspiration for this project. The industrial and noise soundscape is broken into a multi-section performative environment, with textural and sonic movements live-automated through the use of multiple MIDI controllers.

The slow-moving synth pad is played with an emphasis on bringing a harmonic structure to the piece. Multiple parameters are mapped to it, allowing the synth to evolve, move and have its fundamental structure modulated in real time. All the while, a droning, industrial and grungy soundscape provides a sense of drive and foreboding angst. The mangled metallic textures are modulated and played live to provide a motif throughout. The synth pad slowly evolves into a grungy, distorted and noisy rhythmic bass that folds into itself; a parameter allows the bass, which is being designed live, to switch between sustained notes and a fast, chugging movement, creating a change in momentum and energy. A drum part with complex repetition and modulation is also played in this section to provide a core rhythm.

One of the performers of this live industrial piece controls the majority of the soundscape's sounds and textures throughout the eight minutes. This role is undertaken using multiple knobs, keys and faders on a MIDI keyboard controller, which trigger and modulate the core detailing and subtle textures of the soundscape and ambient sounds. The soundscape uses layered, sustained electronic pads as well as complex rhythmic sounds, drones and real foley samples of rattling keys and sonic spaces to capture the listener's interest. This piece will be performed slightly differently each time, as the artists constantly use their musical instincts and visual cues to achieve the most live and authentic performance of this gritty, dark song.

Lighting is an important part of a performance, helping to convey the emotion, rhythm and texture of a piece. The group uses lighting in this performance to aid and support the overall musical energy of the piece. Tonight, audience members will see the lighting flash in time with the music, or in response to musical cues from the other group members, to convey the musical energy of the performance. Hugh, who will be operating the lights tonight, has created several different “scenes” in Ableton Live that each carry a different energy. Each scene has different colours, timing and patterns, and different scenes will be chosen to match the energy of the music being played. While the scenes have been created beforehand, Hugh's performance has not: it will be improvised live with the other members. The scenes are triggered via Ableton Push, a controller with a different pad for each scene, and Hugh can also speed up and slow down the scenes on the fly, matching the tempo of his group members as they improvise (a toy sketch of this scene idea follows this note).

The industrial and noise soundscape has a multitude of layers drawn from samples of both industrial and musical sounds. The role of the guitarist is to perform using both acoustic and electronic technology: a real, live pedalboard helps shape the sounds, while a virtual guitar rig in Ableton assists on the music technology side.
As the soundscape progresses, the sounds evolve from clean, wet samples to distorted, rhythmic noise. This transition provides both contrast and a change of mood and energy. From sustained rhythmic bass to guitar licks and chugs, all of these movements help to realise the sonic and musical capability of our soundscape, with the lighting setting the mood and atmosphere.
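The toy sketch below illustrates the "scene" idea described in the note above: each scene is a looping pattern of colour steps, a pad triggers it, and a rate multiplier speeds it up or slows it down on the fly. The scene contents and pad numbers are invented for the example and do not reflect Hugh's actual Ableton Live set.

# Toy lighting-scene sequencer: pad-triggered colour patterns with a live rate control.

import itertools
import time

SCENES = {
    36: {"name": "drone",  "colours": ["deep red", "off"],              "step_s": 1.0},
    37: {"name": "build",  "colours": ["amber", "red", "off"],          "step_s": 0.5},
    38: {"name": "climax", "colours": ["white", "off", "white", "off"], "step_s": 0.125},
}

def run_scene(pad_note: int, rate: float = 1.0, steps: int = 8):
    """Play a scene triggered by a pad; rate > 1.0 speeds the pattern up."""
    scene = SCENES[pad_note]
    print(f"scene: {scene['name']} (rate x{rate})")
    for colour in itertools.islice(itertools.cycle(scene["colours"]), steps):
        print("  lights ->", colour)             # a real patch would send DMX here
        time.sleep(scene["step_s"] / rate)

run_scene(37, rate=1.5)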
Colour Profiles
Theo Bourgoin and John D. Morris

Colour Profiles is an audiovisual work produced and developed by Theo Bourgoin and John D. Morris, utilising a combination of software (TouchDesigner, PureData and Ableton Live) and analogue equipment including synthesisers, FX chains, cassettes and a 4-track tape recorder. The objective of this performance is to explore methods of sonic composition and ways of blending analogue and digital technologies with a multimedia visual performance, making use of pre-planned processes and liveness and presenting narratives through non-linear musical sound sources.

Colour Profiles is based on static analogue synthesiser tones and swirling soundscapes captured and manipulated on cassette tape. Real-time visuals generated in TouchDesigner morph and augment the unfolding audio composition, which is performed using tape manipulation and MIDI controllers. Similarly, both sampling and resynthesis of these sounds are used as compositional devices within Ableton Live. Spectral analysis of auditory information, both musical and non-musical, is paired with basic colour information, as well as visual textures and noise, examined in diegetic contexts.

Visually, Colour Profiles makes use of solid RGB tones as a basis for dense, static audio drones. Pixels of red, green and blue make up each and every image viewed on-screen. The germ of this idea presents itself as follows: tones of the displayed colour, as well as the intensity of the visual noise surrounding it, are warped and sculpted based on the kind of audio information being received. TouchDesigner is utilised both as a real-time image generator and as an interface for LED lights using the DMX protocol. Image analysis allows the LED lights to extend the visual scope of the performance beyond the usual projection, enhancing visual ambience and reinforcing the motifs behind the displayed colours.

Sections are, at first glance, separated only by the concept of Red / Blue / Green, but they indulge in abstraction. Representations of non-existent contexts, relating moods and social behaviours, periods of history and so on, make themselves known in their own way to individual listeners through diegesis, allowing a less rigid association between sound and video. These humanistic elements run concurrently with the scientific lens through which this work was originally developed. More specific personal examples were certainly used in the development of each section, though it is best to allow the audience to develop their own relationship to the displayed colours and sounds. Red tends to represent feelings of warmth, density and energy. Interpretations of these concepts can run for a mile: is power explored from a social perspective, is it in reference to strength, or is it energy, literal power? Similarly, Blue takes the listener to a sparser place, referencing mid-range spectral content, ending its journey in the upper frequency ranges as Blue gives way to Green. Colour Profiles explores colours not only as data that can be transcribed, but as emotional spaces that can be worked within and around. In demonstrating the sonic relationship between displayed colours and sounds in tandem, the transitional periods between these colours are also explored thoroughly, making the relationship between Red and Blue, or Blue and Green, and even the sounds of these visual intersections, plainly known to the listener.
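As a rough sketch of the audio-to-colour pairing described above (the piece itself does this with TouchDesigner's real-time analysis), the example below maps the spectral energy of one audio frame onto an RGB triple, loosely following the note's low = Red, mid = Blue, high = Green association. The band edges and normalisation are assumptions for illustration only.

# Sketch: spectral band energies of an audio frame -> an RGB colour.

import numpy as np

def frame_to_rgb(frame: np.ndarray, sample_rate: int = 44100) -> tuple:
    """Map one audio frame's spectral energy distribution to an RGB triple."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)

    def band(lo, hi):
        return spectrum[(freqs >= lo) & (freqs < hi)].sum()

    low, mid, high = band(20, 300), band(300, 3000), band(3000, 16000)
    total = low + mid + high + 1e-12       # avoid division by zero on silence
    red = low / total                      # Red tracks low-end density
    blue = mid / total                     # Blue tracks the mid band
    green = high / total                   # Green tracks the upper ranges
    return (int(255 * red), int(255 * green), int(255 * blue))

# Example: a low 80 Hz drone should push the colour towards red.
t = np.arange(2048) / 44100.0
print(frame_to_rgb(np.sin(2 * np.pi * 80 * t)))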