PROCESSING EXPERIMENTS Emmanuel BARINIA DES09124 – Prototyping Interactive Experiences
For this module, we were introduced to two new pieces of software and had to learn them. One of them is called Processing. Processing is a code-based tool used for a wide variety of purposes; the aspect we were particularly interested in is its use for art. Depending on the code, Processing can be used to create animation that is either generative and self-running, or interactive, responding to the user through the mouse, keyboard and more.
Our first big task using Processing was to draw a 3D cube. We drew two squares: one is static, and the other follows the location of the mouse (using mouseX for the horizontal position and mouseY for the vertical). The lines of the cube are attached to the corners of the two squares, with their positions also changing with mouseX and mouseY to match the second square.
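A rough reconstruction of that sketch, in Processing (the sizes and positions are my assumptions; the original code is not shown, and in it the second square's size may also have changed with the mouse):

```processing
void setup() {
  size(400, 400);
}

void draw() {
  background(255);
  noFill();
  // static front square
  rect(100, 100, 150, 150);
  // back square follows the mouse (mouseX = horizontal, mouseY = vertical)
  rect(mouseX, mouseY, 150, 150);
  // connect the matching corners to suggest a cube
  line(100, 100, mouseX, mouseY);
  line(250, 100, mouseX + 150, mouseY);
  line(100, 250, mouseX, mouseY + 150);
  line(250, 250, mouseX + 150, mouseY + 150);
}
```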
This task introduced us to two new concepts: the if/else statement and the mousePressed variable. The if/else statement checks whether certain conditions are met. If they are, colour the boxes green; if they are not (else), colour them white. The second is a condition on mousePressed, so that whenever the mouse button is pressed (mousePressed == true), the box is coloured green.
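A minimal sketch of the idea (the box size and position are my assumptions, not the original code):

```processing
void setup() {
  size(400, 400);
}

void draw() {
  background(255);
  if (mousePressed == true) {
    fill(0, 255, 0);   // condition met: colour the box green
  } else {
    fill(255);         // otherwise (else): colour it white
  }
  rect(150, 150, 100, 100);
}
```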
After filming 50 examples of interactions we came across, we had to represent five of them in Processing. My first attempt was to represent the flash of a camera, using an if mousePressed condition and a delay, so that when the mouse is pressed, an ellipse is displayed two seconds later, representing the triggered flash.
The second example tries to reproduce the touchscreen technology of devices such as smartphones and tablets. With the mousePressed condition standing in for your hand in a real-life situation, wherever you click inside the square a smaller one appears, showing where you pressed.
The third example is my attempt to represent a vehicle moving through the city with its headlights on. The idea is to leave a light trail behind it that shows where it has been. The Processing code draws an ellipse at the mouseX and mouseY position, and the circle slowly fades away, leaving a small trail.
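One common way to get this fading-trail effect in Processing, which is how I would sketch it (the colours and fade rate are my assumptions), is to paint a translucent layer over the canvas each frame instead of clearing it:

```processing
void setup() {
  size(400, 400);
  background(0);
  noStroke();
}

void draw() {
  // a translucent black layer slowly erases old circles,
  // leaving a short glowing trail behind the "headlight"
  fill(0, 20);
  rect(0, 0, width, height);
  fill(255, 255, 150);
  ellipse(mouseX, mouseY, 30, 30);
}
```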
The fourth example is a representation of the lights of a city going on and off during the night. It was also my first time experimenting with the random() function, which allows a lot of room for further experimentation. The Processing representation draws ellipses at random positions.
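A small sketch of the idea (the number, size and tint of the "lights" are my assumptions):

```processing
void setup() {
  size(400, 400);
  noStroke();
}

void draw() {
  background(0);
  // draw a handful of "city lights" at random positions each frame,
  // so they appear to flicker on and off
  for (int i = 0; i < 20; i++) {
    fill(255, 255, random(100, 255));
    ellipse(random(width), random(height), 8, 8);
  }
}
```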
The fifth and last example of interaction is a traffic light representation in Processing. Using an if mousePressed condition and an else, the code displays the red circle; when the mouse is pressed, it disappears and the green circle appears instead.
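The logic is the same if/else pattern as before; a minimal version might look like this (positions and sizes are my assumptions):

```processing
void setup() {
  size(200, 400);
}

void draw() {
  background(50);
  if (mousePressed) {
    fill(0, 255, 0);           // mouse pressed: the green light appears
    ellipse(100, 280, 80, 80);
  } else {
    fill(255, 0, 0);           // otherwise: the red light is displayed
    ellipse(100, 120, 80, 80);
  }
}
```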
This task was about adding pictures to Processing and playing with them. To add a picture we need a few lines: declare a PImage variable (PImage a;) and load the file into it (a = loadImage("imagename.jpg");). For Processing to find the pictures, we need to create a data folder inside the project folder and put the pictures there, making sure we know the format of the image file (whether it is jpg or png).
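The full pattern in a sketch looks roughly like this (the file name is a placeholder, not one of the actual images used):

```processing
PImage a;

void setup() {
  size(400, 400);
  // the image file must sit in the sketch's "data" folder
  a = loadImage("myPicture.jpg");
}

void draw() {
  // draw the loaded picture stretched to fill the window
  image(a, 0, 0, width, height);
}
```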
For this task we were split into groups of three people and asked to conduct some research to identify every kind of technology involving interaction with users within the university campus.
The next part of this task was to pick one interaction from the ones we found during our research and rethink it by changing the way the object is used. I decided to go with one of the elevators on campus. Elevators are usually operated by pressing a button: when people want to use the elevator, they press the button, wait for the elevator to arrive, get in, press another button depending on which floor they want, and wait again. But watching people who actually use the elevator, we can see strange human behaviour: when people are in a rush, they press the button many times, perhaps expecting the elevator to arrive quicker (the more you press it, the quicker it is).
My idea was to change the interaction between the users and the elevator. Instead of pushing a button, people would wave at a sensor camera to call the elevator. The interesting part is that waving faster or longer would not make the elevator come quicker, but it would make the doors open faster. For the Processing representation of my idea, I used pictures of an elevator taken at different stages of the door-opening process, and added a mousePressed condition that changes the image from closed doors to open doors. I also added a picture of a hand attached to the X and Y position of the mouse, to fake the hand waving at the elevator.
This task introduced us to integers. Integers are variables we can use for many purposes (switches, setting up normalisations…). We were then asked to make a ball bounce from left to right. Using an if condition, we could set up the code so that whenever the ellipse reaches a certain position, it goes the other way at a certain speed, and vice versa.
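A minimal bouncing-ball sketch along those lines (the speed and window size are my assumptions):

```processing
int x = 0;      // integer tracking the ball's horizontal position
int speed = 3;  // integer used as the current direction and speed

void setup() {
  size(400, 200);
}

void draw() {
  background(255);
  x = x + speed;
  // when the ellipse reaches either edge, reverse its direction
  if (x > width || x < 0) {
    speed = -speed;
  }
  ellipse(x, height/2, 30, 30);
}
```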
One of our last tasks was to add sound to Processing. For that we had to import the Minim library into the code to let Processing play audio files. Just as with images, the sound files need to be in the data folder. On the right side is our first experiment with sound: we had to draw four equal squares in a box and set up four players, each ready to play a sound file whenever asked. Using an if mousePressed condition, each square starts playing a different song and pauses the one that was playing. It is a Processing jukebox!
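A sketch of how that jukebox could be wired up with Minim (the file names, colours and quadrant maths are my assumptions, not our exact code):

```processing
import ddf.minim.*;

Minim minim;
AudioPlayer[] players = new AudioPlayer[4];

void setup() {
  size(400, 400);
  minim = new Minim(this);
  // four sound files in the data folder (names are placeholders)
  for (int i = 0; i < 4; i++) {
    players[i] = minim.loadFile("song" + i + ".mp3");
  }
}

void draw() {
  // four equal squares in a box
  fill(200); rect(0, 0, 200, 200);
  fill(150); rect(200, 0, 200, 200);
  fill(100); rect(0, 200, 200, 200);
  fill(50);  rect(200, 200, 200, 200);
}

void mousePressed() {
  // work out which quadrant was clicked (0–3)
  int clicked = (mouseX < 200 ? 0 : 1) + (mouseY < 200 ? 0 : 2);
  for (int i = 0; i < 4; i++) {
    if (i == clicked) players[i].play();   // start the chosen song
    else players[i].pause();               // pause whatever was playing
  }
}
```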
This is me experimenting with both sound and images in Processing. It is a simple code displaying an image of a cat, with an if mousePressed condition that plays a cat sound whenever the mouse is pressed.
PROJECT 1 GENERATIVE ART Emmanuel BARINIA DES09124 – Prototyping Interactive Experiences
BRIEF & SITE ANALYSIS The brief was to design a projected generative art installation, built in Processing, for a given location within the campus. The given site is a corridor of Napier's Merchiston campus (represented in the plan on the right), often referred to as "the glass box". The corridor is especially busy during rush hours because of the students (mostly creative and design students) going in and out of their classes. And because it is busy, it is noisy as well.
BRIEF & SITE ANALYSIS As previously stated, the chosen space is a corridor in a university, a space students generally use for moving from one room to another or getting to other parts of the campus. It is meant to be a temporary, transitional space, hence the reason it is generally poorly designed and lit. Because it gives access to different rooms and spaces within the campus, it is used by a significant number of people throughout the day (especially in the morning).
Because of that, I used to think corridors were relatively boring, too crowded and noisy. But what if we took those negative terms and turned them into positive ones, so that the crowd could interact with the space through the noise people make? Now that would be interesting…
INITIAL INVESTIGATION Before getting into the development of ideas, I did a quick investigation around the campus trying to find what type of interaction was already installed. Most of them involved touching (buttons)
While others use sensors (proximity, cards, energy, barcodes) to work.
So what kind of spatial interaction are we missing in the campus?
SITE INVESTIGATION
The rough sketches above represent the traffic flow in the design corridor during the morning rush hour, as students try to get to their classes. The first sketch represents traffic at 9.45, the second at 10.00. The last sketch represents traffic direction during the morning rush hour. My chosen projection area was the white wall in front of the glass box. The downside is that it isn't fully white, and there is a name tag at the end of the wall. As for the projection, it would come either from the top or facing upward with a mirror.
CONCEPT DEVELOPMENT So my idea was to make use of the noise generated by students passing through the corridor in order to enhance the experience, while adding a little touch of interactivity. Of course, this kind of interactivity between sound and art already exists, in forms such as oscilloscopes, spectrum analysers and other kinds of equalizers. After all, these can be considered generative art as well.
CONCEPT DEVELOPMENT The first concept was a code that uses the microphone to move a line depending on the intensity of the sound perceived, just like an oscilloscope for a music player. I started experimenting with the code, producing multiple lines and effects, but then I realised I couldn't do much more with it, as it was too simple.
The second idea was another code, one that measures the frequencies and the intensity of the sound picked up by the microphone. I found this one better suited to the needs of the project, so I started experimenting with it.
CONCEPT DEVELOPMENT After some tweaking of the code, I removed the scales on the sides, changed the colours of the lines and background to fit the University's colours, and added the Napier logo.
FINAL RESULT
ON-SITE PROJECTION
ON-SITE PROJECTION
ARDUINO EXPERIMENTS Emmanuel BARINIA DES09124 – Prototyping Interactive Experiences
For the second half of the semester, we were introduced to a new software/hardware platform called Arduino. Arduino is an open-source, code-based environment that sends instructions to a hardware board (very much like a small computer), which in turn controls other hardware such as motors and LEDs.
Our first task with the Arduino was to set up a blinking LED using the Arduino UNO board and a breadboard. Connected to the breadboard were wires powered from the board, one LED and a resistor. Another wire was connected to pin 11 of the UNO board, and using the Blink example from the Arduino library we could get the LED to blink and change its blinking speed.
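The classic Blink example, adapted to pin 11 as in our wiring (the 500 ms timing is my assumption; it is the value you change to alter the blink speed):

```cpp
int led = 11;   // the LED is wired to pin 11 through a resistor

void setup() {
  pinMode(led, OUTPUT);
}

void loop() {
  digitalWrite(led, HIGH);  // LED on
  delay(500);               // change these values to alter the blink speed
  digitalWrite(led, LOW);   // LED off
  delay(500);
}
```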
Our second task using Arduino was to write our name in Morse code, using the blinking LED we had learned before and changing the pace of the blinks so that they can be read as Morse code. https://www.youtube.com/watch?v=z-I0ipo9PIk
int led = 13;

void setup() {
  pinMode(led, OUTPUT);
}

void loop() {
  // E  (dot: LED on 1 s; then a 3 s letter gap)
  digitalWrite(led, HIGH); delay(1000);
  digitalWrite(led, LOW);  delay(3000);
  // M  (dash, dash)
  digitalWrite(led, HIGH); delay(3000);
  digitalWrite(led, LOW);  delay(1000);
  digitalWrite(led, HIGH); delay(3000);
  digitalWrite(led, LOW);  delay(3000);
  // M  (dash, dash)
  digitalWrite(led, HIGH); delay(3000);
  digitalWrite(led, LOW);  delay(1000);
  digitalWrite(led, HIGH); delay(3000);
  digitalWrite(led, LOW);  delay(3000);
  // A  (dot, dash)
  digitalWrite(led, HIGH); delay(1000);
  digitalWrite(led, LOW);  delay(1000);
  digitalWrite(led, HIGH); delay(3000);
  digitalWrite(led, LOW);  delay(3000);
  // N  (dash, dot)
  digitalWrite(led, HIGH); delay(3000);
  digitalWrite(led, LOW);  delay(1000);
  digitalWrite(led, HIGH); delay(1000);
  digitalWrite(led, LOW);  delay(3000);
  // U  (dot, dot, dash)
  digitalWrite(led, HIGH); delay(1000);
  digitalWrite(led, LOW);  delay(1000);
  digitalWrite(led, HIGH); delay(1000);
  digitalWrite(led, LOW);  delay(1000);
  digitalWrite(led, HIGH); delay(3000);
  digitalWrite(led, LOW);  delay(3000);
  // E  (dot)
  digitalWrite(led, HIGH); delay(1000);
  digitalWrite(led, LOW);  delay(3000);
  // L  (dot, dash, dot, dot)
  digitalWrite(led, HIGH); delay(1000);
  digitalWrite(led, LOW);  delay(1000);
  digitalWrite(led, HIGH); delay(3000);
  digitalWrite(led, LOW);  delay(1000);
  digitalWrite(led, HIGH); delay(1000);
  digitalWrite(led, LOW);  delay(1000);
  digitalWrite(led, HIGH); delay(1000);
  digitalWrite(led, LOW);  delay(3000);
}
The above picture shows two potentiometers. Potentiometers are generally used to send a value (usually from 0 to 1023) to the Arduino board depending on their rotation angle. Potentiometers are really accurate and easy to work with: we can tell Arduino to do different things each time the value goes beyond a certain point, which leaves lots of room for different settings and customisation.
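A minimal sketch of that idea (the analog pin, threshold and serial messages are my assumptions):

```cpp
// reading a potentiometer wired to analog pin A0
int potPin = A0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int val = analogRead(potPin);   // 0–1023 depending on the rotation angle
  if (val > 512) {
    // do one thing once the value goes beyond the chosen point...
    Serial.println("upper half");
  } else {
    // ...and another thing below it
    Serial.println("lower half");
  }
  delay(100);
}
```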
The above picture is an ultrasonic sensor. An ultrasonic sensor sends out a sonic pulse that travels until it hits a surface, and then senses the sound bouncing back to the device. From the time the echo takes to return, it can measure the distance between the device and the closest surface. This measurement comes into Arduino as data, which can then be translated into other scales of measurement (cm, mm…). Ultrasonic experimentation: https://www.youtube.com/watch?v=TW9bsoKpqfM
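The typical pattern for an HC-SR04-style sensor looks like this (the pin numbers are my assumptions; the distance maths converts the echo time as described above):

```cpp
int trigPin = 9;
int echoPin = 10;

void setup() {
  Serial.begin(9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {
  // send a short ultrasonic pulse
  digitalWrite(trigPin, LOW);  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH); delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  // measure how long the echo takes to come back (in microseconds)
  long duration = pulseIn(echoPin, HIGH);
  // sound travels ~0.034 cm per microsecond; divide by 2 for the round trip
  float distanceCm = duration * 0.034 / 2;
  Serial.println(distanceCm);
  delay(100);
}
```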
The last sensor we got to try during class is a light sensor. It uses roughly the same technology as a solar panel, but instead of storing the light and transforming it into energy, the sensor measures the amount of light in the close area around it. The value received can go from 0 to more than 1000, depending on the light sources and the environment.
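Reading it is the same analogRead() pattern as the potentiometer (the pin and wiring as a voltage divider are my assumptions):

```cpp
// a photoresistor voltage divider wired to A0
int lightPin = A0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  // roughly 0 in darkness, 1000+ under a bright light source
  int light = analogRead(lightPin);
  Serial.println(light);
  delay(200);
}
```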
PROJECT 2 TANGIBLE TIME Emmanuel BARINIA, Salim AHMED, Caitlin Bain DES09124 – Prototyping Interactive Experiences
Aims • A clear and basic understanding of the fundamentals of prototyping. • Create thought-provoking experiences that are also interactive for the user. • To create a space that is both uncomfortable and challenges the human condition.
Branding
• Uncomfortable factor
• Abstract
• Quite challenging to decipher/illusive
• Thought-provoking
Logo Our initial idea started with the Oreo biscuit design. It's quite an intricate and abstract design, which influenced us. We manipulated the concept to convey our idea.
Influences
Our very first group task was to create an object that uses both the servo motor and Arduino.
SPACE The project presentation and exhibition would take place in the reception room of Merchiston Campus. The room is very bright, with lots of natural light. It is a quiet room with views over the campus and the main road outside.
CONCEPT The brief of the project was to develop an interactive object or design that responds to its users. The use of both Processing and Arduino was required, and the project needed to be related to time.
Time is an interesting and relative subject. Time is perceived differently depending on people, their location, age, ethnicity… so many factors that time is a universal term, yet unique at the same time. Our idea was to explore this idea that the conception of time is unique, and to see whether people are aware of how temporary life is. Time is the "scale" we humans use to measure duration. We are all slaves of time: time tells us when to go to uni, when to eat, when to go to sleep. It regulates our lives. But would people behave differently if they knew how much time they have left to live? Our idea was to create something that could tell people how many years they have left, and then compare it to animals, so that they know what those animals would have done with the time they have left.
We all agreed to make our design in the form of a booth. The idea was to create a booth where people could go in, scan their fingerprint on a device and then get a card telling them how many years they had left. The idea of people getting a card at the end was a crucial part of the design, because it would help them remember the experience. If we had only displayed the remaining years on screen, it would have been a temporary experience lasting mere hours; with a physical token, we hoped the experience would last longer. We made a first draft of the booth out of cardboard, and decided to use an iPad with a hand-scanner app for the fingerprint recognition. The iPad would sit on a platform with a mouse hidden inside it, mounted on springs: any time a hand rested against the iPad, the hidden mouse would be pressed, telling the laptop that the animation should begin.
Above is a sketch of our initial card machine.
Making the cards in the studio. We made 50 different card designs, for a total of more than 200 cards ready for the exhibition.
For our Processing animation, we wanted it to look like an algorithm calculating, while still staying within our reach and without the code being too complicated. We decided to use pictures of numbers falling down until all the green numbers sit on the same level. To do that we used float variables, which allow the pictures to move smoothly. On the left are the float posX variables, which determine the position of each image along the X axis.
The second picture shows the use of the float posY variables, which determine how far up each picture starts at the top right corner. The float gravity determines how fast the pictures fall (the value we chose here was 0.00005). float floorNum is the variable that tells each picture to stop once the chosen value is reached, so that they all stop on the same level.
We also added two integers to the code to be used as switches. int run is the switch that starts the rolling-number animation once the mouse is released (using the mouseReleased() function). int servo is the switch that sends the information to Arduino once and then stands by.
mouseReleased() will also reset the animation when the mouse is released again, by setting the posY values, gravity and floorNum again at the end of the mouseReleased() function.
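A stripped-down version of the mechanism for a single picture, using the variable names described above (the file name, sizes and start positions are my assumptions; with our gravity value of 0.00005 the fall is very slow, so timings may need tuning):

```processing
PImage num;
float posX = 300;
float posY = -200;        // start above the top of the screen
float gravity = 0.00005;  // the value from our final code
float speedY = 0;
float floorNum = 200;     // where the picture should stop
int run = 0;              // switch: 1 while the animation is rolling

void setup() {
  size(600, 400);
  num = loadImage("number.png");  // placeholder file name
}

void draw() {
  background(0);
  if (run == 1 && posY < floorNum) {
    speedY = speedY + gravity;    // gravity accelerates the picture...
    posY = posY + speedY;         // ...which falls until it reaches floorNum
  }
  image(num, posX, posY);
}

void mouseReleased() {
  // start (or reset) the animation when the hand leaves the scanner
  run = 1;
  posY = -200;
  speedY = 0;
}
```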
When we start the presentation, the logo of our group appears.
Once the mouse is pressed/the hand is scanning, another picture appears.
After the scan on the iPad is finished, people remove their hands from the iPad, which releases the click and triggers the last part of the animation: the falling numbers.
The next step was to set up Arduino to use the servo motor. We initially experimented with controlling the servo with buttons (two buttons: the first rotates the servo to the left, the second to the right). Unfortunately, we ran into an issue: the servo provided with the Arduino only rotates through 180°, whereas we needed at least a 360° servo for the cylinder card machine to work properly. The first picture on the right is the original 180° servo, while the one underneath is a continuous-rotation servo. Continuous servos behave quite differently from a "normal" servo. The 180° servo has a limiter, which is also used to track the position (or angle) of the motor. This means we can tell the servo, through Arduino, to rotate to a certain angle (say 40°), so we have control over both the angle of rotation and its speed. A continuous servo is different: these motors are made for continuous rotation, in applications such as aircraft and boats. Because it is not limited to a certain angle, there is no way of telling its current angle or of rotating it to a desired angle. The only things we can control are the direction of rotation (clockwise/counter-clockwise) and its speed. To set the servo at its midpoint (usually when using it for the first time), I had to turn a little screw under the servo until it stopped spinning, to define its centre.
Video links of servo experimentation: https://www.youtube.com/watch?v=dOhL4H-Iozw http://www.youtube.com/watch?v=cAtSV855kU&feature=youtu.be
Servo code explanation The Serial.begin(9600); line starts the communication with Processing; it is like choosing the same frequency on two walkie-talkies to get them working together. The myservo.writeMicroseconds(1500); line sets the servo to its middle position without twitching. The if (val == 'H') condition triggers the servo: 'H' is the signal sent from Processing to Arduino. Once the signal is sent, Arduino checks whether it is available; if it is, the servo rotation begins. delay(8000); means the servo waits 8 seconds after 'H' is received before doing anything. myservo.attach(9); sends data to the servo through the wire on pin 9. myservo.write(100); makes the servo rotate anti-clockwise (90 being the servo at midpoint: below 90 is clockwise rotation, above 90 is anti-clockwise). delay(4000); means the rotation lasts for 4 seconds. val = 0; resets the value after the command is done, so that it doesn't loop and the servo doesn't keep rotating indefinitely once the sequence is finished. Videos of me experimenting with the SimpleWrite example from Processing to Arduino: https://www.youtube.com/watch?v=EfVQTuti3ks
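Putting the lines described above together, the Arduino side looked roughly like this (reconstructed from the explanation; the ordering and the stop value of 90 are my assumptions):

```cpp
#include <Servo.h>

Servo myservo;
int val = 0;

void setup() {
  Serial.begin(9600);                // same "frequency" as Processing
  myservo.attach(9);                 // servo data wire on pin 9
  myservo.writeMicroseconds(1500);   // hold the continuous servo at its midpoint
}

void loop() {
  if (Serial.available() > 0) {
    val = Serial.read();             // read the signal sent from Processing
  }
  if (val == 'H') {                  // 'H' triggers the card dispense
    delay(8000);                     // wait 8 s after 'H' before doing anything
    myservo.write(100);              // above 90 = slow anti-clockwise rotation
    delay(4000);                     // rotate for 4 s: one card comes out
    myservo.write(90);               // back to midpoint: stop
    val = 0;                         // reset so the servo doesn't loop forever
  }
}
```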
Not long before the project presentation, we got our hands on a new device to replace our former card machine. The ticket machine was an already-built dispenser that needed a motor to push the tickets out. It was the most stressful part of this project, because we had to change a lot of our design: the cards needed to be cut smaller, because the device can't hold traditional business-card sizes, and we had to rethink how the new device would fit into our final project. To get the device working, we had to buy a gear that would fit the servo, and then fix the servo to the ticket machine right next to the machine's own gear. Each rotation of the servo would set the gear mechanism in motion, and a card would slowly come out, ready to be taken by the user. To make the cards come out, we had to set up Arduino to rotate the servo at a certain speed for around four seconds (a lot of experimenting went into perfecting this). Unfortunately, we ran into another issue: because our cards were so thin, the machine would sometimes dispense two cards at once, so we had to tighten the rotary mechanism that slips the cards out to reduce the chance of multiple cards (it still happened once in a while, but not often). This card machine was really challenging, but I think it was a good experience, because we learned to work around problems and deal with stressful last-minute issues.