RHAPSODY IN BLUE, GREEN, YELLOW …
PROJECT METAMORPHOSIS TAPS MACHINE LEARNING TO TRANSFORM CLASSICAL MUSIC INTO VISUAL ART.
Arpeggios spiral. Brushstrokes in blue, green, yellow and red bend, sway and swoop. High notes glimmer and deep bass tones bloom. Without a sound, the music of Mozart, Bach and Debussy unfolds as images on a screen.
It’s all part of an innovative collaboration between Chapman University prize-winning concert pianist Grace Fong, Associate Professor of Computer Science Erik Linstead and graduate student Rao Ali ’18 (Ph.D. ’22).
The project translates iconic piano compositions into swirling, colorful and dynamic “moving paintings.”
The cross-disciplinary Project Metamorphosis takes a deep dive into machine learning, early 20th-century music theory and an algorithm called Perlin Noise that’s used in commercial films and video games for natural-looking effects.
The result: video versions of familiar classical works.
“In the future, this system could be used to interpret music into an aesthetic experience for people with hearing impairments and hearing loss,” says Ali, a Chapman Ph.D. candidate in computational and data sciences. The system could take the form of a computer app that renders recordings for small screens, or it could drive large video displays at live concerts, he says.
“When collaboration works like this, it’s quite magical,” says Fong, a professor and director of Piano Studies in Chapman’s College of Performing Arts, Hall-Musco Conservatory of Music. Her previous collaborations have involved dancers, painters, film directors, even Michelin-star chefs, but never computer scientists.
“This has shifted my view to understand the harmony where machines are extensions of humans through reciprocal communication,” she says. “Chapman is the type of place that provides the support and resources to make it possible for these types of projects to materialize.”
Fong, Linstead and Ali published a paper in August 2021 in the leading arts and technology journal Leonardo outlining their shared journey. “I have no background in music, though I do enjoy listening to classical pieces,” says Ali. “Working with Dr. Fong helped me get closer and closer to an aesthetic interpretation of music as visual art.”
In addition to drawing on advanced computational techniques, Ali reached back to a circa-1919 theory called Marcotones that assigns colors to musical notes, such as red for the note C and green for F sharp.
It was just the starting point. Ali worked closely with Fong and Linstead to add nuance, such as making colors darker for passages in minor keys and lighter for major keys. When Fong said she imagines arpeggios as spirals, Ali wrote code that interprets these rippling, broken chords as swirls of color. He used the Perlin Noise algorithm to produce curving, naturalistic brushstrokes, too.
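The mapping the team built on might look something like the following sketch. Only two anchors come from the article — red for C and green for F sharp, plus darker shades for minor keys and lighter for major — while everything else (the 20-degree-per-semitone hue step, the lightness values, the function names) is a hypothetical illustration, not the project's actual code:

```python
import colorsys

# Hue per pitch class, in degrees. The article confirms only two anchors:
# C -> red (0 degrees) and F sharp -> green (120 degrees); the even
# 20-degree step between semitones is an interpolation for illustration.
PITCH_CLASSES = ["C", "C#", "D", "D#", "E", "F",
                 "F#", "G", "G#", "A", "A#", "B"]
NOTE_HUES = {name: i * 20 for i, name in enumerate(PITCH_CLASSES)}

def note_color(note, minor_key=False):
    """Return an (r, g, b) tuple in 0..255 for a pitch-class name.

    Passages in minor keys are shaded darker and major keys lighter,
    mirroring the nuance Fong and Ali layered onto the raw mapping.
    """
    hue = NOTE_HUES[note] / 360.0
    lightness = 0.35 if minor_key else 0.65  # darker vs. lighter shades
    r, g, b = colorsys.hls_to_rgb(hue, lightness, 1.0)
    return tuple(round(c * 255) for c in (r, g, b))
```

For example, `note_color("C")` yields a light red and `note_color("C", minor_key=True)` a darker one, so the same pitch reads differently depending on the key of the passage.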
“I didn’t want jagged, straight lines,” he says.
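That preference for smooth curves is exactly what Perlin Noise provides. A minimal one-dimensional gradient-noise sketch in the spirit of the technique is below; the `brushstroke` helper and all of its parameters are hypothetical stand-ins for illustration, not the project's implementation:

```python
import math
import random

def fade(t):
    # Perlin's smoothing curve 6t^5 - 15t^4 + 10t^3: its first and second
    # derivatives vanish at t = 0 and t = 1, so segments join seamlessly.
    return t * t * t * (t * (t * 6 - 15) + 10)

def perlin1d(x, grads):
    """Classic 1D gradient noise over a table of random slopes."""
    i = int(math.floor(x)) % len(grads)
    t = x - math.floor(x)
    g0, g1 = grads[i], grads[(i + 1) % len(grads)]
    # In 1D the gradient dot products reduce to slope * offset;
    # blend the two contributions with the fade curve.
    f = fade(t)
    return (1 - f) * g0 * t + f * g1 * (t - 1)

def brushstroke(n_points=200, wobble=30.0, seed=42):
    """Trace a horizontal stroke whose vertical offset undulates with
    gradient noise -- curving and natural rather than jagged and straight."""
    rng = random.Random(seed)
    grads = [rng.uniform(-1, 1) for _ in range(256)]
    return [(i, wobble * perlin1d(i * 0.05, grads)) for i in range(n_points)]
```

Because the noise value changes gradually between lattice points, neighboring points on the stroke stay close together, which is what makes the result read as a brushstroke rather than static.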
Ali is now working on a new project that uses similar techniques to translate paintings into music.
“It’s one thing to take a piano composition and just produce colors on a screen, another to make it systematic and meaningful as Rao has done,” says Linstead, associate dean at the Fowler School of Engineering and principal investigator of Chapman’s Machine Learning and Affiliated Technologies (MLAT) Lab. “We wanted a mathematical, computational and musical foundation for the project so that it could be used to produce visuals of any number of musical compositions.”
Every step of the way, when the machine learning team explored something new through algorithms, they would send the work to Fong for her artistic input and advice.
“The best machine learning is done when you bring together people from different disciplines like this,” says Linstead.