Creative Music Technology: Capstone Program


CREATIVE MUSIC TECHNOLOGY: CAPSTONE
Friday 3 November: Basil Jones Orchestral Hall

Simulate

Oscar Tooms

Taking form as an audio-visual composition, 'Simulate' delves into the intricate relationship between humanity and technology. Presented as a narrative-driven composition, the story follows an unidentified protagonist as they navigate a simulated realm of consciousness brought about by their immersion in the digital world. Upon freeing themselves from this simulated world, they begin to experience the real world from a new-found perspective, becoming aware of the beauty and elegance they had previously overlooked. The composition offers its own interpretation of catharsis, impelling the audience to liberate themselves from technology and embrace a path of detachment. As a capstone project, it embodies the culmination of my skill development across various artistic disciplines, pushing the boundaries of what is possible through innovative use of technology in art.

'Simulate' aspires to immerse its audience in the contrasting worlds depicted in the composition, representing their disparities through tasteful avant-garde composition, daring sound design, and meticulously crafted visual art. While music serves as the focal point of the audio component, some scenes are supported exclusively by diegetic sounds such as foley, sound effects, and field recordings. These elements construct a rich sonic environment that not only accompanies the musical sections but also drives the narrative of the composition.

The decision to steer away from a live performance was intentional, driven by the desire to heighten the sense of immersion and encourage viewers to focus their attention on the visual component. Each individually rendered scene was meticulously crafted in Touch Designer, with strong attention to detail given to every on-screen element. With no on-stage presence to serve as a distraction, this approach allows for a deeper exploration of the intricate relationships between sound and visuals, resulting in a more profound and seamless narrative.

The driving creative force at the heart of this composition is innovation. Working with creative software such as Ableton, Touch Designer, and DaVinci Resolve, I explored the capabilities of these programs to new depths while searching for ways to interconnect them and create a sense of synergy between the audio and visual components. The resulting work strives to challenge preconceived notions of what technology can achieve in the realm of audio-visual storytelling. 'Simulate' is not merely an audio-visual composition but an exploration of the human experience in a digital age. It seeks to challenge and provoke thought, all while delivering a mesmerizing sensory experience that invites the audience to become an active participant in a world where reality and simulation intertwine.
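The notes above do not spell out how the audio and visual layers were interconnected; one common way to link Ableton and Touch Designer is to stream control data over OSC. The minimal Python sketch below is an illustration of that general idea rather than the actual project setup: it sends an amplitude envelope from an audio file to a Touch Designer OSC In CHOP. The port number, OSC address, and file name are all assumptions.

```python
# Hypothetical sketch: stream an amplitude envelope from an audio file
# to TouchDesigner over OSC, so visuals can react to the soundtrack.
# Assumes `pip install python-osc soundfile numpy`, a TouchDesigner
# project listening with an OSC In CHOP on port 7000, and a local
# audio file named "simulate_mix.wav" (placeholder name).
import time

import numpy as np
import soundfile as sf
from pythonosc.udp_client import SimpleUDPClient

RATE_HZ = 30                    # how often to send a control value
OSC_ADDRESS = "/simulate/amplitude"

client = SimpleUDPClient("127.0.0.1", 7000)

audio, sample_rate = sf.read("simulate_mix.wav")
if audio.ndim > 1:              # mix stereo down to mono
    audio = audio.mean(axis=1)

hop = int(sample_rate / RATE_HZ)        # samples per control frame
for start in range(0, len(audio) - hop, hop):
    frame = audio[start:start + hop]
    rms = float(np.sqrt(np.mean(frame ** 2)))   # RMS level of this frame
    client.send_message(OSC_ADDRESS, rms)
    time.sleep(1.0 / RATE_HZ)
```

Inside Touch Designer, the incoming channel could then drive any parameter of a rendered scene, such as bloom intensity or camera movement.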

Toi Bastar

Jack Meimaris, Jasper Hodgson

In this performance, Jack Meimaris and Jasper Hodgson will guide you on a sonic tour across the Eurasian continent. The concert serves as an ode to the players' instruments, with Jack showcasing his skill in Irish accordion playing and Jasper bringing to the table the unique sound of the Kazakh dombra. Jack has been involved with Irish music from a young age, competing in Irish dancing competitions and learning a range of instruments, including the accordion, banjo and bodhran. Jasper has been dedicated to the world of Kazakh music for many years, studying the traditional music of the dombra, its history, and the art of dombra building.

The two musicians have worked hard to integrate these instruments into a performance that also showcases their proficiency and deep understanding of electronic music production and performance. Jack will be using a SoftStep MIDI foot controller to trigger effects and scenes in Ableton, and will also control electronic instruments using an Ableton Push. Jasper will be controlling a range of spatial, modulation and other experimental FX parameters using an Akai MidiMix. In the production stage of this set, the two have gone in depth with their use of creative sound manipulation, building drum grooves out of environmental samples they recorded and experimenting with sound morphing and glitch techniques on their digital sounds.

At the beginning of the performance, Jack and Jasper explore some of the more unconventional sounds their instruments are capable of producing. For the rest of the performance, they fuse together elements of not only Irish and Kazakh music, but a range of musical influences that have shaped their musical journeys over the course of their lives, including Greek folk music from Jack's childhood and other ethnic music performances Jasper witnessed as a teenager. The performance comprises four main sections built around three main tunes. The first, written by Jasper, is a traditional-style Kazakh composition combining the influences of some of the most famous dombra composers. Next is a jig written by Jack, which diverges from traditional Irish music with a distinct, almost Mediterranean sound. The third tune sees the two instruments come together in a finale that combines the players' respective styles and influences.

For Jack and Jasper, playing their instruments is a cathartic experience; the accordion and dombra serve as a means to process and express emotion. This concert aims to celebrate how they value these instruments and exhibit the knowledge they have gained throughout their Bachelor degrees. This is why they have created a performance that combines traditional music with upbeat experimental electronic music. The audience is encouraged to immerse themselves in this journey from the rolling hills of Ireland, through the Mediterranean city streets, to the endless Kazakh steppe. The use of live instrument performance, electronic fusion, and interesting visuals should create a fun and captivating show.
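For readers curious what that controller data looks like under the hood, the sketch below uses the mido library to send the two kinds of MIDI messages Ableton typically maps: a note to launch a scene and a continuous controller to sweep an effect parameter. This is a generic illustration, not the duo's actual mapping; the port name, note number, and CC number are placeholders.

```python
# Hypothetical sketch of the MIDI messages a foot controller or MidiMix
# might send to Ableton Live: a note to launch a scene and a CC sweep
# for a mapped effect parameter. Requires `pip install mido python-rtmidi`.
# The port name, note and controller numbers below are placeholders.
import time

import mido

PORT_NAME = "IAC Driver Bus 1"   # assumed virtual MIDI port routed into Live
SCENE_NOTE = 36                  # note assumed mapped to "launch scene 1"
FX_CC = 20                       # controller assumed mapped to an FX knob

with mido.open_output(PORT_NAME) as port:
    # Launch a scene: Live only needs the note-on, but send note-off too.
    port.send(mido.Message("note_on", note=SCENE_NOTE, velocity=127))
    port.send(mido.Message("note_off", note=SCENE_NOTE, velocity=0))

    # Sweep the mapped effect parameter from minimum to maximum over ~2 s.
    for value in range(0, 128, 4):
        port.send(mido.Message("control_change", control=FX_CC, value=value))
        time.sleep(0.06)
```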

Organic Resonance

James Engwirda, Brendan Cardiff, Remi Raymond

Organic Resonance is an audio-visual composition creatively exploring the juxtaposition between two contemporary concepts: Nature and Artificial Intelligence. Within this, the composition explores sub-themes embedded in the work, which serve as a storytelling mechanism throughout. On the technical side, as a trio we focused heavily on innovative, unique methods of creating the music while underpinning the two juxtaposing themes. Sonically, the composition blends found sounds, creatively sampled in nature by Brendan, with technology-enhanced digital synthesis processes by Remi. Cohesively gluing the work together are the gripping visuals, created and devised by James, which provide a link throughout the entirety of the work as the juxtaposition between the two themes unfolds.

Aesthetically, we decided as a trio that it would be in our best interest to present the work in video format, as this let us truly focus on aspects such as timbral creation, compositional synthesis and visual integration, which further reinforce the contrast between the two themes. Blending organic, nature-focused samples with modern, artificial-intelligence-driven sound processes initially seemed simple, but quickly became an intricate yet informative engagement, especially once visuals were introduced into the composition.

Regarding software implementation, three key programs were used throughout the process of creating the work. Brendan primarily used Ableton Live to collate, organise and create the intriguing opening theme of the composition, with his found sounds sampled from Brisbane's Mt Coot-tha. Remi used Logic Pro, and specifically Alchemy, a powerful sample-manipulation synthesiser, as the basis for the technologically enhanced middle section of the work. Once each of these sections was finalised, Brendan combined his nature-inspired music with Remi's technologically infused section in Ableton Live to sonically finalise the composition. Lastly, to create the aesthetically riveting visuals, James used Touch Designer to build a visual network incorporating particle elements and an innovative use of colour manipulation. James then used Final Cut Pro to add finishing touches, implementing subtle details such as the rainwater effect to cohesively morph the composition into one wholescale work. Using a variety of software not only extended our creative possibilities as a group, but also allowed each of us to individually craft our specific parts to the highest possible standard.

Looking back, as a trio we are pleased with our work, as we strictly followed our initial weekly plan to best achieve our goals. This was at times difficult, but as a group we knew it was important to positively encourage each other to engage with the work consistently across the trimester. From the first meeting, where we devised the initial concept of the themes, to the final export of the work, it has been an engaging, informative and fun process to collaborate on a project with our peers. Organic Resonance is a culmination of three years' work between Brendan, James and Remi, and we hope you enjoy our audio-visual composition exploring the contemporary juxtaposition between Nature and Artificial Intelligence.
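The notes do not detail how the Touch Designer network was wired, but audio-driven colour manipulation of the kind described is often scripted with a CHOP Execute DAT. The callback below is a minimal sketch of that pattern, assuming a CHOP named 'level1' carrying a normalised audio level and a Constant MAT named 'constant1' colouring the particle material; both operator names and the two colour palettes are placeholders, not the group's actual network.

```python
# Hypothetical TouchDesigner CHOP Execute DAT callback (runs inside
# TouchDesigner, so no imports are needed). Assumes the DAT watches a
# CHOP named 'level1' carrying a normalised audio level, and that a
# Constant MAT named 'constant1' colours the particle material.
# Operator names and palettes are placeholders.

NATURE_RGB = (0.2, 0.8, 0.3)   # green tint for the organic sections
AI_RGB = (0.1, 0.4, 1.0)       # blue tint for the AI-driven sections

def onValueChange(channel, sampleIndex, val, prev):
    # Blend between the two palettes as the audio level rises and falls.
    level = max(0.0, min(1.0, val))
    mat = op('constant1')
    mat.par.colorr = NATURE_RGB[0] + (AI_RGB[0] - NATURE_RGB[0]) * level
    mat.par.colorg = NATURE_RGB[1] + (AI_RGB[1] - NATURE_RGB[1]) * level
    mat.par.colorb = NATURE_RGB[2] + (AI_RGB[2] - NATURE_RGB[2]) * level
    return
```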

Clowns

Liam Brown, Cameron Bryer

Clowns is a rule-breaking live performance that aims to take the audience on a "bender" (informally: a wild drinking spree). Pre-programmed beats, real-time MIDI instruments, and audio-reactive visuals will be used to convey intense, euphoric, and comedic emotions. The piece links back to tonight's theme of catharsis by providing the audience a release through comedic relief. In addition, Artificial Intelligence (AI) is used creatively and carefully throughout the performance in both the audio and the visuals.

Going into production for this performance, Cameron Bryer and Liam Brown were inspired by the band Clown Core. Structurally, this made the approach look less like one cohesive piece and more like a series of short moments that flow (or sometimes don't flow) into each other. For the musical elements, the pair was heavily influenced by Drum and Bass artists such as Skrillex and Crankdat. These Drum and Bass sections feature predominantly throughout the piece as the main climactic element. More euphoric parts of the piece were heavily inspired by movies such as Interstellar and Planet Earth II (both scored by Hans Zimmer). These sections feature creative uses of sampling, granular synthesis, and acid rhythm generation. Throughout the performance, there are also subtle nods to artists such as TheFatRat and 100 gecs.

From a technical standpoint, most of the sounds used in the performance were created by the pair using synthesis, sampling, and audio manipulation. Throughout the performance, Liam will be using an Ableton Push to play samples and synths and to perform granular synthesis. Cameron will be using a MIDI keyboard to add improvised elements over parts of the piece, and will also be using a Novation Launchpad with pre-programmed lights for some parts of the performance.

For the visuals, techniques such as AI image and video generation, 3D animation and audio reactivity were used. AI was used with caution throughout the piece, as a stylistic choice rather than a "cheat" to make the process easier. RunwayML was used to generate AI images and also to animate them; these animated clips appear in some sections of the piece to add movement or take the viewer on a journey. AI was also used with Touch Designer to create an evolving landscape of art during some of the euphoric sections of the piece. Touch Designer was also used to create scenes for some of the builds and drops; these visuals rely heavily on audio reactivity and element animation. Furthermore, there are instances of 3D modelling and animation used as comedic elements throughout the performance. Most of these elements were stitched together in Final Cut Pro to create a video that the pair will perform to.

Cameron and Liam would like to take the opportunity to thank you for coming to tonight's concert. Any donations are welcome and would be greatly appreciated.
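Granular synthesis is named above but not explained: the technique rebuilds a sound from many short, overlapping, windowed "grains" taken from (often random) positions in a source sample. The NumPy sketch below is a generic illustration of that idea, not the pair's actual Ableton patch; the source tone, grain length and output file name are all assumptions.

```python
# Minimal granular synthesis sketch: scatter short, windowed grains taken
# from random positions in a source sound across an output buffer.
# A generic illustration of the technique, not the performers' patch.
# Requires `pip install numpy soundfile`.
import numpy as np
import soundfile as sf

SR = 44100
GRAIN_LEN = int(0.08 * SR)      # 80 ms grains
NUM_GRAINS = 400
OUT_LEN = 8 * SR                # 8 seconds of output

rng = np.random.default_rng(0)

# Source material: a decaying square-wave tone standing in for a sample.
t = np.arange(4 * SR) / SR
source = 0.5 * np.sign(np.sin(2 * np.pi * 110 * t)) * np.exp(-0.2 * t)

window = np.hanning(GRAIN_LEN)          # fade each grain in and out
output = np.zeros(OUT_LEN)

for _ in range(NUM_GRAINS):
    src_start = rng.integers(0, len(source) - GRAIN_LEN)
    out_start = rng.integers(0, OUT_LEN - GRAIN_LEN)
    grain = source[src_start:src_start + GRAIN_LEN] * window
    output[out_start:out_start + GRAIN_LEN] += grain

output /= np.max(np.abs(output)) + 1e-9  # normalise to avoid clipping
sf.write("granular_sketch.wav", output, SR)
```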

Bloodline

Ainsley Shiri

Bloodline is an audio-visual experience created by Ainsley Takondwa Shiri, or simply T.A.S. This artwork takes you through the journey of his psyche, delving deep into his emotions and feelings about earlier parts of his life and how they have shaped him into the man he is today. Highlighting a few different mediums (music, vocals, and vision), Bloodline will creatively, technically, and aesthetically show you just how diverse T.A.S is, both as an artist and as a music technologist.

Aesthetically, this piece aims to create an engaging story from start to finish, with twists and turns that will have you hooked from the moment it starts until the end. It does this through a mix of creative visuals that give each section a very distinct look and feel, and an engaging sonic story that blends sounds and rhythms from different cultures into a unified sound that is true to the artist.

Technically, I aimed to push myself as far as I could. Creating the visuals was one of the most time-consuming parts, as it was the first time I was using 3D programs; however, I dedicated my time to researching tirelessly and letting my creativity run free. Diving deeper into the visuals, T.A.S has utilized several programs: Runway.ml to turn the original photographs into video at the beginning; Unreal Engine for the sequence featuring an animated depiction of the performer, created using FaceGen to turn a real-life photo into a detailed digital model; Blender to create a 3D skull; the MetaHuman plugin to combine the previous two and create a head-to-toe model of T.A.S; and finally Unreal Engine's Live Link app to capture and animate the facial movement. On top of this, the visuals also feature a unique blend of material created in Premiere Pro and Touch Designer, which helps to visually enhance the storytelling in the main performative element: a rap riddled with intricate rhythms that will put you in a trance and hook you into the story.

Sonically, you'll find a little bit of everything. Sticking to the overall genre of hip hop, there are elements of different cultures sandwiched within, with some Asian instruments and some Indian drum grooves chopped up to fit the rhythmic landscape.

The whole experience will be run through Ableton Live, and I will also be using an Ableton Push and an instrument created as part of a second-year course, named after its creator, T.A.S.
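As a rough illustration of the kind of beat "chopping" described above (the exact Ableton workflow is not specified in the notes), the sketch below slices a drum-loop file into beat-length segments and reorders them into a new groove; the file name, tempo and slice order are placeholder assumptions.

```python
# Hypothetical beat-chopping sketch: slice a drum loop into beat-length
# segments and reorder them to build a new groove. The file name, tempo
# and slice order are placeholders, not the actual material used in
# Bloodline. Requires `pip install numpy soundfile`.
import numpy as np
import soundfile as sf

LOOP_FILE = "tabla_loop.wav"            # assumed one-bar drum loop
BPM = 90                                # assumed tempo of the loop
NEW_ORDER = [0, 2, 1, 3, 0, 3, 2, 1]    # which beat slices to play, in order

audio, sr = sf.read(LOOP_FILE)
if audio.ndim > 1:
    audio = audio.mean(axis=1)          # mono keeps the slicing simple

beat_len = int(sr * 60 / BPM)                      # samples per beat
num_beats = len(audio) // beat_len
slices = [audio[i * beat_len:(i + 1) * beat_len] for i in range(num_beats)]

# Rebuild the groove from the reordered slices (wrap if the loop is short).
chopped = np.concatenate([slices[i % num_beats] for i in NEW_ORDER])
sf.write("tabla_chopped.wav", chopped, sr)
```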

