
1

CYMATIC

HOW VISUALISATION AND SONIFICATION CAN HELP

CHANGING LANGUAGE

VISUAL ART EXTENDED REALITY SYNESTHESIA PHOTOGRAPHY VIDEO INTERFACE INTERACTION DATA VISUALISATION MUSIC DATA SONIFICATION SCIENCE


CHANGING LANGUAGE TO SEE THE MUSIC MUSIC ANIMATION MACHINE

2

https://prodigiesmusic.com/music-animation-machine/

Bach, Fugue in C minor, BWV 546

Stephen Malinowski is an incredibly innovative musician who has developed a unique platform for viewing graphical music scores and turning those scores into visually engaging YouTube videos. Malinowski’s invention is called the Music Animation Machine. What is it? From his website: The Music Animation Machine display is a score without any measures or clefs, in which information about the music’s structure is conveyed with bars of color representing the notes. These bars scroll across the screen as the music plays. Their position on the screen tells you their pitch and their timing in relation to each other. Different colors denote different instruments or voices, thematic material, or tonality. And each note lights up at the exact moment it sounds, so you can’t lose your place. MAM notation is extremely simple:

Bach, Prelude in C major, WTC I, BWV 846

Each note is represented by a colored bar. The bars scroll across the screen from right to left as the piece plays, and each bar lights up as its note sounds (so you can’t lose your place). The length of each bar corresponds exactly to the duration of its note as performed (not in any way “quantized”). The vertical position of the bar corresponds to the pitch: higher notes are higher on the screen, lower notes are lower. The horizontal position indicates the note’s timing in relation to the other notes.

The Music Animation Machine is an engaging way to immerse yourself in a musical score. Young children will especially be enamored with the way the scores move and the notes “come alive.” Similar to Preschool Prodigies’ scrolling sheet music, the music moves across the screen, reinforcing the concept of viewing music geo-spatially, with time being horizontal and pitch being vertical. While these animations are not necessarily on a staff, they reinforce seeing music on a staff.
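The mapping the MAM uses (time to horizontal position, pitch to vertical position, duration to bar length) can be sketched in a few lines. This is an illustrative reconstruction, not Malinowski’s actual code; the pixel scales and screen height are arbitrary assumptions.

```python
# Hypothetical sketch of the Music Animation Machine's core mapping:
# each note becomes a rectangle whose x-position encodes timing,
# y-position encodes pitch, and width encodes duration.

def note_to_bar(pitch, start, duration,
                px_per_sec=100, px_per_semitone=6, screen_height=480):
    """Map one note to an (x, y, width, height) rectangle."""
    x = start * px_per_sec                       # time scrolls horizontally
    y = screen_height - pitch * px_per_semitone  # higher pitch sits higher on screen
    width = duration * px_per_sec                # bar length = note duration
    return (x, y, width, px_per_semitone)

# A short C-major arpeggio: (MIDI pitch, start in seconds, duration in seconds)
notes = [(60, 0.0, 0.5), (64, 0.5, 0.5), (67, 1.0, 1.0)]
bars = [note_to_bar(*n) for n in notes]
```

Rendering the rectangles with any drawing library, and scrolling them leftward as playback advances, reproduces the scrolling-score effect described above.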

Bach, Fugue in B-flat minor, WTC I, BWV 867

3


CHANGING LANGUAGE TO SEE THE MUSIC RAINER WEHINGER’S VISUAL LISTENING SCORE TO ACCOMPANY GYÖRGY LIGETI’S ARTIKULATION

4

http://www.openculture.com/2018/01/watch-gyorgy-ligetis-electronic-masterpiece-artikulation-get-brought-to-life-by-rainer-wehingers-brilliant-visual-score.html https://www.youtube.com/watch?v=71hNl_skTZQ&feature=emb_title


Even if you don’t know the name György Ligeti, you probably already associate his music with a set of mesmerizing visions. The work of that Hungarian composer of 20th-century classical music appealed mightily to Stanley Kubrick, so much so that he used four of Ligeti’s pieces to score 2001: A Space Odyssey. One of them, 1962’s Aventures, plays over the final scenes in an electronically altered form, which drew a lawsuit from the composer, who’d been unaware of the modification. But he didn’t do it out of purism: though he wrote, over his long career, almost entirely for traditional instruments, he’d made a couple of forays into electronic music himself a decade earlier. Ligeti fled Hungary for Vienna in 1956, soon afterward making his way to Cologne, where he met the electronically innovative likes of Karlheinz Stockhausen and Gottfried Michael Koenig and worked in West German Radio’s Studio for Electronic Music. There he produced 1957’s Glissandi and 1958’s Artikulation, the latter of which lasts just under four minutes but, in the words of The Guardian’s Tom Service, “packs a lot of drama in its diminutive electronic frame.” Ligeti himself “imagined the sounds of Artikulation conjuring up images and ideas of labyrinths, texts, dialogues, insects, catastrophes, transformations, disappearances,” which you can see visualized in shape and color in the “listening score” in the video above. Created in 1970 by graphic designer Rainer Wehinger of the State University of Music and Performing Arts Stuttgart, and approved by Ligeti himself, the score’s “visuals are beautiful to watch in tandem with Ligeti’s music; there’s an especially arresting sonic and visual pile-up, about 3 mins 15 secs into the piece. This isn’t electronic music as postwar utopia, a la Stockhausen, it’s electronics as human, humorous drama,” writes Service. Have a watch and a listen, or a couple of them, and you’ll get a feel for how Wehinger’s visual choices reflect the nature of Ligeti’s sounds. Just as 2001 still launches sci-fi buffs into an experience like nothing else in the genre, those sounds will still strike a fair few self-described electronic music fans of the 21st century as strange and new, especially when they can see them at the same time.

5


CHANGING LANGUAGE TO SEE THE MUSIC EXPERIMENTAL MUSIC NOTATIONS AND UNUSUAL SYSTEMS

6

BOGUSŁAW SCHAEFFER

https://llllllll.co/t/experimental-music-notation-resources/149


LEON SCHIDLOVSKY

GIOVANNI LONGO

7


CHANGING LANGUAGE TO SEE THE MUSIC GEORGE CRUMB’S SCORE NOTES AND POSTER

8

http://andrewhearst.com/blog/2006/02/the_amazing_music_scores_of_the_avant_garde_composer_george_crumb


9


CHANGING LANGUAGE TO SEE THE MUSIC MUSIC FOR THE DEAF - CYMASPACE

At CymaSpace, we are working to make the performing arts more accessible and inclusive for the Deaf & Hard-of-Hearing. Deaf users give valuable feedback on our technology and on the Deaf experience of seeing sound.

10

interactive music furniture

https://www.cymaspace.org/audiolux/


AUDIOLUX is an open-source digital lighting system that allows the Deaf & Hard-of-Hearing to see music & alerts, built on the Arduino hardware/software platform & modern individually addressable digital LEDs. Source code is provided under an open-source license for anyone interested in developing their own system. CymaSpace also offers pre-built Audiolux lighting systems for rental at events both large and small.

Deaf Want Music Too

1 in 8 Americans & 360 million people globally live in partial or complete silence due to hearing loss. Cultural events (performing arts & social gatherings) are a powerful way to bring communities together; however, the Deaf & Hard-of-Hearing (DHH) are often excluded from their benefits. When such events rely on sound to deliver their experience or message, organizations & artists often overlook equal access, as they lack the resources or awareness to ensure their event activities are accessible to those who cannot hear.

To address this need, CymaSpace serves the community, providing advocacy for DHH patrons as well as access to ASL interpreters, accessible technologies and Deaf awareness training to art organizations, venues, festivals and individual artists.

Through the CymaSpace Technology Program, we are developing hardware & software solutions to intelligently translate audio into light & vibration, so that sound can be seen & felt, not just heard. When applied to the performing arts, DHH & hearing audiences respond positively, reporting new appreciation for art mediums previously thought inaccessible. Many state that these developments encourage them to attend events & socialize more. DHH artists could follow rhythms & perform alongside hearing peers. Others express a desire to use the technology at home in their entertainment systems or even alert/safety systems. The AUDIOLUX Lighting System represents a significant first step in providing equal access to sound & music during performing arts & social events. Our open-source, sound-reactive LED lighting system is affordable to purchase, rent or build, and can be used in a variety of applications, such as adding visually compelling lighting to performance stages, or being integrated into architecture and even furniture. Incorporating AUDIOLUX Lighting into your events has been shown to increase attendance by the Deaf & Hard-of-Hearing community and is a great way to improve equal access.

11


CHANGING LANGUAGE TO HELP UNDERSTAND MUSIC FROM SARS-COV-2

One of the distinguishing features of SARS-CoV-2 is the crown, or corona, of spikes on its surface. Zoom in to those infinitesimal spikes further and they’re made up of chains of proteins, looping and folding over one another. In an attempt to understand this new pathogen better, musician and engineer Markus Buehler and his colleagues at the Massachusetts Institute of Technology have assigned each protein and structural form a musical equivalent. The result, generated by artificial intelligence, is a surprisingly soothing musical score that Professor Buehler said revealed detail that microscopes couldn’t. “Our brains are great at processing sound. In one sweep, our ears pick up all of its hierarchical features: pitch, timbre, volume, melody, rhythm, and chords,” he said.

12

“We would need a high-powered microscope to see the equivalent detail in an image, and we could never see it all at once. Sound is such an elegant way to access the information stored in a protein.” The SARS-CoV-2 virus spike is a particularly complex assembly: it involves three protein chains folded together in an intricate pattern. The volume, duration and rhythm of notes in the score reflect how the amino acids that make up the proteins are arranged, and the entangled chains are rendered as intersecting melodies.
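To make the idea of “amino acids arranged as notes” concrete, here is a deliberately simplified sketch. It is not Professor Buehler’s actual model (which uses vibrational analysis and AI); it only shows the basic principle of assigning each residue type a fixed pitch so that chain order becomes melody order. The base pitch and the example fragment are illustrative assumptions.

```python
# Illustrative sketch (not the MIT team's actual method): give each of the
# 20 standard amino acids a fixed pitch, so a protein chain reads as a melody.

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard one-letter codes
BASE_PITCH = 48                        # an assumed low C anchoring the range

def residue_to_pitch(residue):
    """Assign each amino acid a fixed pitch offset above the base note."""
    return BASE_PITCH + AMINO_ACIDS.index(residue)

def sonify_chain(sequence):
    """Turn a protein chain (one-letter codes) into a pitch sequence."""
    return [residue_to_pitch(r) for r in sequence]

melody = sonify_chain("MFVFLVLLPLVSS")  # an illustrative spike-like fragment
```

A fuller version would also map residue properties (hydrophobicity, secondary structure) onto volume, duration and rhythm, as the article describes.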

“These structures are too small for the eye to see, but they can be heard,” Professor Buehler said.

It’s pretty, but what’s the point? There’s more to this piece of music than a pretty melody with a nerdy origin. Translating proteins into sound gives scientists another tool to understand and manipulate them, Professor Buehler said. “Even a small mutation can limit or enhance the pathogenic power of SARS-CoV-2.”


It’s the spike protein on the outside of the virus that fits to the receptors inside our bodies and allows the virus to infect us. In creating the music, Professor Buehler and his team analysed the “vibrational structure” of this protein. “Understanding these vibrational patterns is critical for drug design and much more,” he said. “Vibrations may change as temperatures warm, for example, and they may also tell us why the SARS-CoV-2 spike gravitates toward human cells more than other viruses.” Taking a musical approach could also be used to design drugs to attack the virus, he said. For example, scientists could search for a new protein that matches the melody and rhythm of antibodies that could interfere with the virus’s ability to infect.

A MUSICAL ANALOGY

So yes, this music has function. But it’s also a product of creativity and artistic expression. And, just like any art form, it highlights aspects of the world around us that we might otherwise miss. “As you listen, you may be surprised by the pleasant, even relaxing, tone of the music,” Professor Buehler said. “But it tricks our ear in the same way the virus tricks our cells. It’s an invader disguised as a friendly visitor. Through music, we can see the SARS-CoV-2 spike from a new angle, and appreciate the urgent need to learn the language of proteins.”

13


COMPLEX DATA EASIER TO UNDERSTAND MT. ETNA SONIFICATION

Data sonification experiments have been carried out by the Musica Inaudita sound laboratory of the University of Salerno (Italy), in collaboration with the Catania INFN Section and the TRAC (Technologies and Research for Contemporary Arts) project. The sonified data (provided by the INGV) were geophysical signals of Mt. Etna volcano activity, collected by a digital seismograph placed on the volcano near Catania (Italy). The sonification package was written in Java and run on INFN GRID. Data sonification is becoming one of the most versatile and valuable diagnostic tools in several fields: data analysis, support for visual data inspection, and education. The sonification architecture developed is open and flexible. Its design includes a powerful and customizable audio synthesis engine, and a Digital Fourier Transform (DFT) algorithm which writes its results to disk as an ASCII file, ready to be plotted, processed with an automatic parser or simply collected into a database.

14

Data sonification and the sciences

Data sonification is the representation of data by means of sound signals; it is thus the analog of scientific visualization, where we deal with auditory instead of visual images. Generally speaking, any sonification procedure is a mathematical mapping from a certain data set (numbers, strings, images, ...) to a sound string. Data sonification is currently used in several fields and for different purposes — science and engineering, education and training — since it provides a quick and effective data analysis and interpretation tool.

http://grid.ct.infn.it/etnasound/page4/page6/page6.html

Although most data analysis techniques are exclusively visual in nature (i.e. based on the possibility of looking at graphical representations), data presentation and exploration systems could benefit greatly from the addition of sonification capabilities. Because sounds can convey significant amounts of information, sonification has the potential to increase the bandwidth of the human/computer interface. Nevertheless, its use in scientific computing received limited attention until recent years because of the intensive computation usually required to produce sound. Digital audio usually deals with very high sampling rates; the standard value for CD-quality audio is 44100 Hz, so producing one second of audio requires computing 44100 values. One minute takes 60 x 44100 = 2646000 calculated samples, just to give a snapshot of the sonification procedure from a computing point of view. On the other hand, sonification can be a very precious aid, since the ear has a very high power of discrimination. As a consequence, one can use very small frequency steps (even smaller than a quarter tone) to take into account any tiny variation in the data. An equally discriminating power is available where timbre is concerned. Finally, in the scientific domain, sound is an extremely interesting tool (and one of the most readily usable) for identifying regularities in the time domain, both at the level of microstructures and on large scales. In addition, sound can immediately make transitions between random states and periodic phenomena clear and recognizable. Most auditory processes are indeed based on the detection of regular patterns: periodic repetitions which turn into understandable qualities like pitch and timbre. Moreover, sonic representations are particularly useful when dealing with complex, high-dimensional data, or in data-monitoring tasks where visual inspection is practically impossible. The most interesting and intriguing aspects of data sonification concern the possibility of describing, through sound, patterns or trends which would hardly be perceivable otherwise.
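The “mathematical mapping from a data set to a sound string” can be sketched very simply: scale the data linearly into an audible frequency band. This is a generic illustration of the principle, not the Etna project’s Java/DFT pipeline; the 220–880 Hz band and the example values are assumptions. It also checks the sample-count arithmetic quoted in the text.

```python
# Minimal sketch of the sonification mapping idea: scale a data series
# linearly into an audible frequency band (assumed here: 220-880 Hz).

SAMPLE_RATE = 44100  # samples per second, the CD-quality standard

def map_to_frequency(values, f_min=220.0, f_max=880.0):
    """Linearly map arbitrary data values into an audible frequency range."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    return [f_min + (v - lo) / span * (f_max - f_min) for v in values]

freqs = map_to_frequency([0.0, 2.5, 5.0, 10.0])  # e.g. seismic amplitudes
samples_per_minute = 60 * SAMPLE_RATE            # 2,646,000, as in the text
```

Feeding each frequency to an oscillator for a fixed duration would then turn the series into sound; finer frequency steps (smaller than a quarter tone, as noted above) are just a matter of the mapping’s resolution.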

15


COMPLEX DATA EASIER TO UNDERSTAND HOTEL KILAUEA VOLCANO SONIFICATION AND VISUALISATION

Hotel Kilauea tells the story of volcanic unrest over 10 years, compressed into three minutes: a synthesis of the natural rhythms and melodies at an active volcano.

KILAUEA VOLCANO AND RAW DATA

16

Kīlauea is a restless volcano on the Big Island of Hawai’i. The data associated with this recording and animation tell a story of volcanic re-awakening over 10 years. After decades of reduced activity, between 2003 and 2007 the supply of magma rising from Earth’s mantle within the Hawai’ian hotspot dramatically increased. This deep influx of magma to Kīlauea volcano culminated first in a shallow intrusive episode where magma came close to but did not reach the surface on June 16-17, 2007 (the “Father’s Day” intrusion), and then caused the opening of the Halema’uma’u crater on March 19, 2008. This volcanic vent was open (and hosted a lava lake!) until the eruption of May 2018, when the lava lake drained and the crater collapsed catastrophically. The Hawaiian Volcano Observatory on Kīlauea, established in 1912, recorded a variety of signals that documented this dramatic series of events. We merged three data streams taken in the time interval 2000-2010 for this audio-visual depiction of the action. Global Positioning System (GPS) measurements taken across the summit of the volcano record a long-baseline swelling of the ground (Figure 1A). Tiltmeter records of days-long deformation episodes (called “Deflation-Inflation” events, Figure 1B) show more localized, transient underground magma motions in the shallow subsurface. Measurement of sulfur dioxide (SO2) gas flux at the surface (Figure 1C) tracks the eruption rate of magma at the surface.

https://www.volcanolisteningproject.org/page/page-4/

Figure 1. Data from the Hawaiian Volcano Observatory, taken at Kīlauea volcano between 2000-2010 during a period of increased magma flux, which culminated in the opening of Halema’uma’u crater in 2008. The top axis shows the score time for the multimedia piece “Hotel Kilauea”. The vertical blue dashed line is the Father’s Day intrusion, while the red dashed line is the opening of Halema’uma’u crater. (A) GPS distance between two stations on opposite sides of the summit crater, giving a “line length” that varies as the volcano swells up. (B) Occurrence rate of Deflation-Inflation ground deformation events near the summit of the volcano. (C) Measured SO2 gas emissions near the summit, a proxy for the volume of magma erupted.



17


VISUALISATION

A final element of “Hotel Kilauea” brings the sonified, and then musically abstracted, volcanic data one step further, by animating both the datasets and the musical interpretation. Digital artist Zack Marlow-McCarthy used a 3D digital elevation model of the Kīlauea volcano as the basis for a pointillistic expression of the data, rendering different visual elements for each of the three data streams. Rather than map the data directly onto an image, Zack took some artistic license that retains key elements of the data. The GPS data enters through long-wavelength destabilizations of the volcano, while the Deflation-Inflation events enter as discrete bursts of color, and SO2 gas emissions rise out of the Halema’uma’u crater as beams of red light. To close the loop between representation of the data and artistic interpretation, Zack also included a visual element associated with the music. Subtle deformation of the volcano occurs in response to the improvised music: a blurring of the difference between reality and interpretation, or a nod to the imprecise and ambiguous reality of scientific storytelling.

SONIFICATION

The representation in Figure 1 is a “traditional” way to present data: numerical values associated with signals recorded at the volcano are plotted on a graph, and we can observe relationships between the signals. For example, notice that the GPS data record a deflation coincident with the Father’s Day magma intrusion, and SO2 emissions spike up around the opening of Halema’uma’u when magma reached the surface as a lava lake.

18

Sonification of the Kīlauea data provides a different way to understand these patterns, by representing the volcanic data with sounds that faithfully represent the quantitative information recorded by scientific instruments. Ben Holtzman at Columbia University and I had to come up with some new techniques to do this (credit for these techniques goes almost entirely to Ben; I merely provided support and tested things out). One might say that the music was composed by the volcano, but arranged by us!

We used a different approach to sonification for each data stream. For the GPS data, we developed a “weighted chord” sonification that uses an octatonic or diminished scale (a very symmetric and jazzy-sounding scale) to map GPS line length onto pitch clusters in a low frequency range. This gives a sonically broad, fuzzy picture of the rising and falling of the volcano surface, representing slow deformation that occurs over a long time relatively deep beneath the surface. For the Deflation-Inflation deformation episodes, we used the onset of events and their magnitude to create sound “bursts” with loudness and pitch associated with the data values. For the SO2 flux, we mapped the timeseries (in tonnes per day) of gas coming out of the ground directly to frequency, using a high register to emulate the more rapid movement of magma flowing at or near the surface. To make things interesting, we sped up all of the data by a factor of more than a million: ten years in three minutes!

https://www.volcanolisteningproject.org/page/page-4/

DIRECTED MUSIC IMPROVISATION

The sonified data alone is an interesting scientific product. Just like a graph, we can listen for patterns and relationships between the datasets. We can develop a story to explain the patterns and hopefully understand the natural world a bit better. However, as artists as well as scientists, we also recognize and value the aesthetics of volcanism. We created a musical interpretation of the data (Adam Roszkiewicz playing a ‘prepared’ guitar with erasers and paper clips attached to the strings, John Mailander and Leif Karlstrom playing violin, and Ethan Jodziewicz playing a ‘prepared’ bass with paper stuffed between the strings). We worked with recording engineer Daniel Rice at his studio in Nashville, Tennessee, the four performers crowded in a room around a stereo pair of microphones. The sonified data was piped into headphones and we played spontaneously in response to the sounds as well as each other: no rehearsal, no key or meter, just free improvisation of the kind made popular by jazz artists like Ornette Coleman or John Zorn.
You can hear that the playing takes on a life of its own: we react at different rates and different ways, and don’t stop playing quite in time with the data. But, we engage with the patterns of sound created by the volcanic data and this generates musical moments.
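The “weighted chord” idea described above can be loosely sketched in code: quantize a slowly varying GPS line-length onto an octatonic (diminished) scale in a low register. The scale steps, register and cluster size here are illustrative assumptions, not the Volcano Listening Project’s actual parameters.

```python
# Loose sketch of a "weighted chord" sonification: a normalized GPS reading
# picks a degree of the octatonic scale, and neighbouring scale degrees are
# stacked above it to form a low pitch cluster.

OCTATONIC = [0, 2, 3, 5, 6, 8, 9, 11]  # alternating whole/half steps

def gps_to_chord(value, v_min, v_max, base_midi=36, n_notes=3):
    """Map a GPS line-length reading to a cluster of low octatonic pitches."""
    frac = (value - v_min) / (v_max - v_min)       # normalize to 0..1
    root_step = int(frac * (len(OCTATONIC) - 1))   # choose a scale degree
    # stack octatonic neighbours above the root to form the cluster
    steps = [OCTATONIC[(root_step + i) % len(OCTATONIC)] for i in range(n_notes)]
    return [base_midi + s for s in steps]

chord = gps_to_chord(5.0, v_min=0.0, v_max=10.0)   # mid-range swelling
```

The Deflation-Inflation events and SO2 flux would each get their own mapping (event bursts and a direct high-register frequency line, as the text describes), played back a million times faster than the data were recorded.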


19


COMPLEX DATA EASIER TO UNDERSTAND THE COLORS OF BOB DYLAN

20

https://www.behance.net/gallery/64507955/The-Colors-of-Bob-Dylan


Each piece of music has a key: the note on which the whole harmonic composition is built. How can we use this information to visually represent the complexity of a musician’s output over time? This data visualization project transforms each note into a color to tell the unique story of Bob Dylan’s work through the 442 songs published in his studio record releases. The visualization is enhanced with other information that tells us more about each song: its duration and its type (originals versus covers, blues, minor-key songs), as well as the chance to rearrange the songs in different orders (chronological, record-based or key-based). The project was developed as a digital interactive application and as a series of printed posters.
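A hypothetical version of the project’s central idea can be written in a few lines: assign each of the twelve keys a fixed hue on the color wheel, so every song’s key becomes a color. The 30-degree hue spacing is an assumption for illustration; the actual palette of “The Colors of Bob Dylan” may differ.

```python
# Illustrative key-to-color mapping: twelve keys spread evenly around the
# 360-degree hue wheel, 30 degrees per semitone.

KEYS = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def key_to_hue(key):
    """Map a musical key to a hue angle (degrees) on the color wheel."""
    return KEYS.index(key) * 360 // 12

hues = {key: key_to_hue(key) for key in KEYS}  # e.g. for a song legend
```

Plotting each of the 442 songs as a swatch in its key’s hue, sized by duration and sortable chronologically or by record, would reproduce the structure the project describes.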

21


COMPLEX DATA EASIER TO UNDERSTAND TEMPERATURE ANOMALIES VISUALISATION

Perhaps more in the realm of climate-change maps, this well-crafted map displays temperature anomalies for specific date ranges. When you select a time range, the Temperature Anomalies map is certainly a stunning way to see how our climate has been impacted over time.

22

https://svs.gsfc.nasa.gov//cgi-bin/details.cgi?aid=4030 https://gisgeography.com/global-weather-maps/

Using NASA’s five-year global temperature anomalies data from 1880 to 2012, you can see the dire trend in Earth’s climate.


23


INTERFACE

DATA VISUALISATION

INTERACTION

VJING

IMAGINATION/UNREAL

LIGETI’S ARTIKULATION VISUAL LISTENING ANIMATION

4-5

MARY HALLOCK GREENEWALT GENERATIVE ART

PITCH

EXPERIMENTAL MUSIC NOTATIONS AND UNUSUAL SYSTEMS

6-7

TEMPO

MUSIC ANIMATION MACHINE

2-3

CHANGING LANGUAGE TO UNDERSTAND THE COLORS OF BOB DYLAN

20-21

COMPLEX DATA

22-23 NASA TEMPERATURE ANOMALIES

MUSIC VISUALISATION, SCENES AND SARS-COV-2 MUSIC FROM STRUCTURE

12-13

24

10-11

MT ETNA VOLCANO SONIFICATION

14-15

HOTEL KILAUEA VOLCANO SONIFICATION AND VISUALISATION

16-19

VIDEO

PHOTOGRAPHY

EXTENDED REALITY


MUSIC

DATA SONIFICATION

SCIENCE

REAL WORLD SURROUNDING

4’33’’

KANDINSKY PAINTINGS

ALEXANDER LAUTERWASSER

COMMUNICATION MUSICIAN

CAN HELP AUDIENCE VISUAL ARTIST

CYMATIC SCIENCE EXPERIMENTS MUSIC VIDEO

FILLING THE GAP IN COMMUNICATION ANALEMA GROUP

FURNITURE FOR DEAF

VISUALISATIONS APP

?

TELEPRESENCE INTERFACE

SYNESTHESIA

VISUAL ART

CYMATIC

25

