

TAP BOX 2.0


CARLA MOLINS PITARCH


This case study is a personal exploration of the connection between Computationally Generated Images (P5.js) and Haptic Interfaces (Arduino + capacitive sensing), with one core shared element: sound. The goal is to create a physical musical interface that produces both sounds and computer-generated visuals.


Tap Box 2.0 is a haptic box system that translates a tactile musical interaction into a visual pattern resulting from the generated sequence. This project takes place within a broader exploration of the physicality and materiality of data. In this case, the data comes from the different sounds triggered by the capacitive sensors, and the data represented corresponds to the number of times each note has been played.



While pursuing this exploration, I asked myself several questions:

How do the materiality and interface of an instrument affect its affordances and user experience? How does the visualization of the data change the musical experience? My answer was to create a programmable musical instrument that invites its users to create sound systems with simple input and simple real-time generated visuals.


“Simplicity is about subtracting the obvious, and adding the meaningful.” John Maeda.


Following this principle from John Maeda, I want to keep simplicity as another core element of this piece. Both the interaction and the generated image are composed of simple elements built on one base geometry: the circle.


Several key components make up this project:

Arduino + capacitive sensors


Each tack works as a capacitive sensor, providing the opportunity to play a different note. Sounds can be repositioned simply by unplugging the wires and plugging them back in a completely different arrangement. I prepared a set of 8 metallic tacks and used them as capacitive sensors, each triggering a different sound. I'm using Adafruit's library for the CAP1188 to read eight different wires, and pitches.h includes the frequencies of the notes I'm playing. Below is a sample of how each sensor triggers its sound just by indexing the proper element in the array. The system allows touching multiple elements at the same time, playing the corresponding tones in sequence while visualizing the different generated sounds in real time.
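A minimal sketch along these lines, where the speaker pin and the C-major note mapping are illustrative choices rather than the project's exact values:

    #include <Wire.h>
    #include <Adafruit_CAP1188.h>
    #include "pitches.h"

    Adafruit_CAP1188 cap = Adafruit_CAP1188();

    // One note per pad; this C-major scale is an assumed mapping for illustration.
    int notes[8] = {NOTE_C4, NOTE_D4, NOTE_E4, NOTE_F4,
                    NOTE_G4, NOTE_A4, NOTE_B4, NOTE_C5};

    const int SPEAKER_PIN = 8;   // assumed speaker pin

    void setup() {
      Serial.begin(9600);
      if (!cap.begin()) {        // CAP1188 over I2C at its default address
        Serial.println("CAP1188 not found");
        while (1);
      }
    }

    void loop() {
      uint8_t touched = cap.touched();        // one bit per capacitive pad
      for (uint8_t i = 0; i < 8; i++) {
        if (touched & (1 << i)) {
          tone(SPEAKER_PIN, notes[i], 200);   // play the note mapped to this pad
          Serial.println(i + 1);              // send the sensor number (1-8) to P5.js
          delay(200);                         // let the tone finish before rescanning
        }
      }
    }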


Materiality

Simple and cheap materials make up the physical system: the idea is to experiment with what we have at hand instead of expensive and inaccessible components.

Tacks + Bubble Wrap + Cardboard + Acrylic + Rubber band


Serialcontrol.js

The conjunction of P5.js and Serialcontrol.js is the keystone of the project. Serialcontrol.js is a GUI application for running and monitoring p5.serialserver, which enables connectivity between a local serial device and a web application via the p5.serialport P5.js library. This connection makes it possible to access the data coming from the Arduino serial port, read the values sent in P5.js, and finally transform the data into visuals.
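A minimal setup for this connection, assuming Serialcontrol.js (p5.serialserver) is running locally; the port name is a placeholder to replace with your own, and addTap is a helper defined in the drawing sketch below:

    let serial;

    function setup() {
      createCanvas(windowWidth, windowHeight);
      serial = new p5.SerialPort();            // connects to p5.serialserver
      serial.list();                           // log the available ports to the console
      serial.open('/dev/tty.usbmodem14101');   // placeholder port name; replace with yours
      serial.on('data', serialEvent);          // runs whenever new serial data arrives
    }

    function serialEvent() {
      let inString = serial.readLine().trim(); // Arduino prints one sensor number per line
      if (inString.length > 0) {
        addTap(Number(inString));              // a value from 1 to 8; drawing helper below
      }
    }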


P5.js

P5.js receives through serial data the number of the sensor touched by the user. Each sensor sends a value from 1 to 8 corresponding to the frequency of its sound, from low to high. This same range of values is translated into the radius of the circle drawn after each tap.
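Continuing the sketch above, the drawing half could look like this; the random placement of each circle and the 20-160 pixel radius range are my illustrative assumptions, not values stated in the project:

    let taps = [];                             // one {x, y, r} entry per note played

    function draw() {
      background(255);
      noFill();
      for (let t of taps) {
        ellipse(t.x, t.y, t.r * 2);            // redraw every circle played so far
      }
    }

    // Called from serialEvent() with the sensor value read from Arduino.
    function addTap(sensorValue) {
      let r = map(sensorValue, 1, 8, 20, 160); // low note = small circle, high = large
      taps.push({ x: random(width), y: random(height), r: r });
    }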


Display

The final setup of this project includes a mobile device to display the visuals. The device uses the computer's IP address to connect remotely to the real-time visuals generated in P5.js.


Taking everything into consideration, this project could take different directions: exploring the properties of the materials even further, building more complex sound systems, and representing more detailed streams of data.


