AV Waves


Islam Shabana

AV Waves: The EEG Audio/Visual Platform

Generated Parametric Image/ Own Graphic

KEYWORDS:

Audio/Visual, Sound, Visuals, EEG, Processing, Digital Platform, Generative Art

ABSTRACT:

AV Waves is a project aiming to develop a platform capable of rendering audio/visual experiences through the use of brainwaves. "AV" is an abbreviation for audio/visual, while "Waves" refers to brainwaves. In this project, we will explore how such an experience can be developed by engaging and analyzing the effect of sound on the human brain. This effect will be measured using data collected by a NeuroSky EEG (electroencephalograph) headset. The information is used to create a visual system that reacts in real time to changes in the EEG readings, thereby producing a visual representation of the sound in real time. AV Waves aims to become a leading platform capable of fully exploring the potential of A/V experiences, and hopes to usher in further development in live performance technologies. This combination of sound and visual imagery can yield profound implications for VJs and those working in A/V performance fields. The AV Waves project itself is divided into two parts. The first phase entails experimenting to determine how dramatically sound can affect the human brain. The second involves building a platform that can visually generate the effects the sound has on the mind. Thus, the user will be able to guide the generated visuals whilst being exposed to sound/music, allowing the human brain to function as a physical medium capable of driving the A/V experience.



OBJECTIVE:

"AV Waves" is a technical project with an experimental background that aims to create a digital platform which utilizes EEG (electroencephalography) to render an audio/visual experience. The experiment's goal is to visually show the relation between sound and brainwaves. By utilizing generative-art algorithms, the "AV Waves" platform will be able to visually generate what the user is listening to using their unique EEG readings. This open-source platform is easy to use and can be extended by artists and designers who wish to produce distinct generative visual results, using EEG readings to lead an audio/visual performance. The experiment will use the designed platform to show the change in the brain's activity while it is exposed to sound. This raises many questions regarding the nature of sound and its effect on our brains: does sound affect the majority of people in the same way, or is it an individualistic experience? Is it possible to generate images using brainwaves that can be read digitally? If so, to what extent will these visuals react when the sound changes?

STATE OF THE ART:

The inspiration behind "AV Waves" originally came from the field of cymatics, the visual representation of sound vibrations through physical materials, and from the "Lissajous figures", experiments conducted by Jules Antoine Lissajous (1822-1880) that showed very similar visual representations of sound vibration, but through light. It is known that different sound frequencies can displace particles, liquids, or light rays in arranged patterns. We can establish mathematical relationships between an audio frequency and the geometrical displacement of the material or light rays exposed to the sound. These mathematical relations will be our main reference in designing the algorithm for the visual representation in our experiment. The mathematics behind these experiments reveals an interesting intersection between sound, vibration, and physical reality. Through the experiment, we aspire to engage the human brain with sound and explore the ability to translate brainwaves into visual imagery. Whilst researching projects that use brainwaves as a basis for visual production, we found few conceptual projects that focus on visual generation or manipulation. The projects reviewed, and found to be a good base of study, were "Mandala Alhambra" by Andreas Borg and "VALENCE", a collaboration between EMRG, IMEC and the Holst Centre. In other projects, however, the control or generation of sound/music was the focus instead.

Lissajous Figures/ Creative Applications (Internet)

VALENCE: affective visualisation using EEG/ Photography © Ludivine Lechat

This led to our motivation to develop "AV Waves" as a leading platform combining sound and visualization. "AV Waves" can be considered a development point for audio/visual experimentation, and contributes especially to the fields of digital performance and VJing. Other projects that render visual representations of sound are also interesting to us, such as Reify, which translates sound visually and through 3D printing: users of Reify turn whatever sounds/music they like into very personal 3D-printed objects. "Mental Fabrication" is another project, which produces 3D printouts of geometric fabric landscapes using EEG readings of what the artist calls the user's "mental or emotional landscape".


METHOD: "AV Waves" is a combination of different digital technological tools utilized to create a digital platform. First, we will be using a commercially affordable EEG scanner (the NeuroSky MindSet), which can be easily purchased online. Secondly, a program will be written to generate visual imagery initiated by a set of parameters collected from the audio input; this information is then manipulated and controlled by another set of parameters collected from the EEG readings. The user will be exposed to sound/music, which will initially trigger a visual base, such as a particle system. Using the data collected from the EEG headset, which represents the user's brain activity in real time, we will then manipulate the visuals with a different set of parameters. The data collected from the audio input and the EEG values will then drive the mathematical expressions demonstrated in the cymatics studies and/or the Lissajous figures experiments.
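The Lissajous relation mentioned above can be sketched in a few lines. The following is an illustrative sketch, not the project's actual code: it maps two frequencies (here standing in for an audio frequency and a driving parameter) and a phase offset to points on a Lissajous curve, x = sin(a·t + δ), y = sin(b·t). All names are our own assumptions.

```java
// Minimal sketch (assumption, not from the project source) of the Lissajous
// parametric relation: x = sin(a*t + delta), y = sin(b*t).
public class Lissajous {

    // Returns the point {x, y} on the curve for parameter t,
    // frequency pair a:b and phase offset delta.
    public static double[] point(double a, double b, double delta, double t) {
        return new double[] { Math.sin(a * t + delta), Math.sin(b * t) };
    }

    public static void main(String[] args) {
        // With a:b = 1:1 and delta = PI/2 the curve is a circle
        // (x = cos t, y = sin t); sample a few points of it.
        for (int i = 0; i <= 4; i++) {
            double t = i * Math.PI / 8;
            double[] p = point(1, 1, Math.PI / 2, t);
            System.out.printf("t=%.2f x=%.3f y=%.3f%n", t, p[0], p[1]);
        }
    }
}
```

In a Processing sketch the same `point` function would be evaluated each frame and drawn to the canvas; changing `a`, `b`, or `delta` from the audio or EEG parameters reshapes the figure.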

TECHNOLOGY: Foremost, the algorithm used will be based on changing parametric values in semi-randomly generated visuals. The visuals will first be triggered by the audio input. The first set of parameters will be collected through an audio line-in and categorized by the range of sound frequencies being played in the piece. This set will be termed the "Sound Parameters"; it will generate a simple visual representation, such as a particle system, which then follows a mathematical expression according to the range of audio frequencies being played.

EEG frequency bands and related brain states/ Brain Wave Signal (EEG) of NeuroSky, Inc.
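One possible shape for the "Sound Parameters" step is a simple bucketing of the incoming frequency into coarse ranges. The class, method, and cut-off values below are hypothetical illustrations, not the project's actual design:

```java
// Illustrative sketch (names and cut-offs are assumptions): bucket an
// incoming audio frequency into a coarse range that could drive one
// "Sound Parameter" of the generated visual.
public class SoundParameters {

    public enum Range { BASS, MID, HIGH }

    // Rough, hypothetical cut-offs in Hz.
    public static Range classify(double freqHz) {
        if (freqHz < 250) return Range.BASS;
        if (freqHz < 4000) return Range.MID;
        return Range.HIGH;
    }

    public static void main(String[] args) {
        System.out.println(classify(60));   // a kick drum sits in the bass range
        System.out.println(classify(8000)); // a hi-hat sits in the high range
    }
}
```

In practice the line-in signal would first pass through an FFT (for example via a Processing audio library) and each frame's dominant bin would be classified this way.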

The EEG brainwave reader will, at the same time, collect the values of the brain signals affected by the sound the user is hearing. The collected EEG data is then categorized under the brain's seven main types of brainwave signals: Delta, Theta, Alpha, Low Beta, Midrange Beta, High Beta, and Gamma. Each signal is associated with a mental state the brain experiences. For example, Theta might represent an intuitive, creative, nostalgic, phantasmic, imaginary or dream state, while High Beta could represent alertness or agitation. The second set of parameters will be categorized based on the ranges of EEG data collected; we can designate them the "Brain Parameters". This set of parameters will be responsible for changing the mathematical variables in the expressions generated by the "Sound Parameters".

Rough visual plan/ Own Graphic

After trying the NeuroSky MindSet, initial manual recordings showed that the EEG values changed while the user was exposed to different sounds/music. For example, alertness values (High Beta) appeared in the presence of fast and repetitive sounds, like beats or kicks, while values usually attributed to meditation or relaxation (between the Delta and Alpha ranges) were associated with slower ambient sounds and synths. The sample we tried consisted of only a few people; we will carry on recording more data digitally and sort it into a database and visual graphs for study. Secondly, Processing will be our development tool, due to its excellent quality and the availability of libraries that work with audio inputs and EEG headsets. One of the problems we encountered while researching the compatibility of Processing libraries for EEG readings was data instability: the EEG headset was apt to collect rapid, flickering changes in data readings, which could lead to unclear value read-outs. After some technical research, we found "data filtering" to be a common solution to the problem. We have come to realize it is preferable to use the average values of each brainwave to filter out which wave represents the user's state at that moment. This facilitates the desired parametric changes.
The data filtering will ensure that the right range of values will guide the “Brain Parameters” to manipulate the final visual results.
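The averaging-based filtering described above can be sketched as follows. This is a minimal illustration under our own assumptions (class name, window size, and band labels are hypothetical): each band's raw readings are smoothed with a running mean over a small window, and the band with the highest smoothed value is taken as the user's dominant state for driving the "Brain Parameters".

```java
import java.util.ArrayDeque;
import java.util.Comparator;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

// Sketch (assumption, not the project's actual code) of the data-filtering
// step: smooth each EEG band with a running mean, then pick the band whose
// smoothed value is currently highest as the user's dominant state.
public class EegFilter {

    private final Map<String, Deque<Double>> windows = new HashMap<>();
    private final int size;

    public EegFilter(int windowSize) {
        this.size = windowSize;
    }

    // Feed one raw reading for a band; returns the current running mean,
    // which damps the rapid flickering of the raw headset values.
    public double update(String band, double value) {
        Deque<Double> w = windows.computeIfAbsent(band, k -> new ArrayDeque<>());
        w.addLast(value);
        if (w.size() > size) w.removeFirst();
        return w.stream().mapToDouble(Double::doubleValue).average().orElse(0);
    }

    // The band with the highest smoothed value drives the "Brain Parameters".
    public String dominantBand() {
        return windows.entrySet().stream()
            .max(Comparator.comparingDouble((Map.Entry<String, Deque<Double>> e) ->
                e.getValue().stream().mapToDouble(Double::doubleValue).average().orElse(0)))
            .map(Map.Entry::getKey)
            .orElse("none");
    }

    public static void main(String[] args) {
        EegFilter f = new EegFilter(3);
        f.update("alpha", 1.0);
        f.update("alpha", 2.0);
        f.update("highBeta", 10.0);
        System.out.println(f.dominantBand()); // highBeta dominates here
    }
}
```

A larger window gives steadier parameters at the cost of slower reaction to state changes; the window size would be tuned against the headset's update rate.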


TIMELINE:

October 2015
– Data recording
– Algorithm design

December 2015
– First program prototype
– Test and develop second prototype

March 2016
– Final prototype
– Design a performance
– Documentation

April 2016
– Final presentation

Resources:
– MacTutor History of Mathematics archive, "Lissajous Curves", School of Mathematics and Statistics, University of St Andrews, Scotland, JOC/EFR/BS, January 1997. Web.
– Tom De Smedt, Lieven Menschaert, "VALENCE: affective visualisation using EEG", Digital Creativity, Volume 23, Issue 3-4, 2012. Web.
– Sami Emory, "Artists Turn Songs into 3D-Printed Sculptures", The Creators Project, 2015. Web.
– Dan Cooper, "This Machine Turns Your Mental Map Into An Architectural Structure", The Creators Project, 2014. Web.
– Andreas Borg, "Drawing with brainwaves (Granada/Dubai)", Alhambra Mandala, 2012, Crea.tion.to. Web.
– Filip Visnjic, "The Harmonic Series, Device that explores musical and visual harmony", Creative Applications, 2013. Web.
– NeuroSky, Inc., "Brain Wave Signal (EEG) of NeuroSky, Inc.", December 15, 2009. Web.

