Digital Ecologies: Archive at Tantallon Castle: Interactive Installation
When thinking about, hypothetically or otherwise, subjecting a site to an intervention we must first consider its past. Any site will have many layers of history; one or many structures may have come and gone, and as a result the location's context is in a perpetual state of flux. It is possible to think of these variations of the site as disruptions. A once uninterrupted piece of ground is subjected to the vision of a designer, reshaping the earth, uprooting trees and altering the skyline. Time remains constant amid these iterations, but the history is updated and a mark has been made on the site's ever-expanding timeline. I am interested in cataloguing disruptions: how human intervention marks a place. These marks can last centuries or even millennia, or be ephemeral, even momentary. Given the timescale of this semester, I focused on the latter: fleeting moments.
The brief of my Architectural Design unit, 'In Proximity', called for an intervention of bold proportions at a site of historic significance: a National Archive with at least 3,600 m² of storage floor space situated at Tantallon Castle. I began to think about how the public might respond to this vast scale and whether one might feel intimidated by the quantity of exhibits. I wrote a brief to design a device that would enhance a visitor's excursion by guiding them towards exhibits likely to be of more interest to them. I devised a mechanism that would monitor a visitor's behavior amongst the displays and use this information to subtly guide them in the direction of related exhibits. These subtle gestures would be achieved by way of a matrix of LEDs installed along the walls and in the floors; a sketch of the idea follows.
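This guidance mechanism was never built, but a minimal sketch of the logic I had in mind might read as follows; the exhibit catalogue, the dwell-time threshold and the light_path helper are all hypothetical:

```python
# Hypothetical sketch of the exhibit-guidance idea (never built).
# An exhibit the visitor lingers at is taken as a signal of
# interest; the LED matrix then hints at a related, unvisited one.

DWELL_THRESHOLD = 20.0  # seconds; illustrative value

# Hypothetical catalogue: exhibit id -> related exhibit ids
RELATED = {
    "charter_1375": ["seal_douglas", "map_lothian"],
    "seal_douglas": ["charter_1375"],
}

def suggest_next(dwell_times, visited):
    """Pick the unvisited exhibit most related to wherever the
    visitor has dwelt longest."""
    interests = [e for e, t in dwell_times.items() if t > DWELL_THRESHOLD]
    for exhibit in sorted(interests, key=dwell_times.get, reverse=True):
        for related in RELATED.get(exhibit, []):
            if related not in visited:
                return related
    return None

# An installation would then light the LED run between the
# visitor's position and the suggestion, e.g.:
# light_path(current_position, suggest_next(dwell_times, visited))
```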
However, my concept for the Archive at Tantallon shifted away from public exhibits of the archive's contents, which made the original brief less appropriate. I had already been working with interactive components in my Architectural Design course, interpreting movement and sound to create drawings, so I decided my project would lean towards an interactive installation that would interpret live human data, either taking the form of a temporary installation or built into the fabric of the Archive. ʻReflexʼ, ʻStudy of Youʼ and ʻMirrorsʼ, all by the design practice rAndom International, are just a few of many examples of walls and panels that interact with human behavior. These works abstractly represent the disruptions caused by human presence in a provocative way. I was inspired to create an interactive wall that would obscure the view out to Bass Rock by translating movement and sensing proximity.
For the purposes of this project I began by simulating movement with a multidirectional slider in Grasshopper; however, the ambition of the project was to engage with real-life data, so I quickly moved to interfacing my model with the 'Rhino Grasshopper App'. This ingenious application streams live data from your smartphone's sensors to Grasshopper across a Wi-Fi network. I used the Grasshopper plug-in gHowl to receive the data from my smartphone; the stream is then split up and selected using a List Item component. Items 8 and 9 were the touch sensor's x and y coordinates, which I used in place of the multidirectional slider, performing the same function with a more human quality. Unfortunately, the speed at which the data was sent and processed caused a lot of lag and crashed Rhino with much more than 10 blocks.
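Inside Grasshopper this selection is just a List Item component, but the same step can be sketched in a GhPython component; here `stream` stands in for the flat list of sensor values arriving from gHowl, with items 8 and 9 assumed to hold the touch coordinates:

```python
# GhPython sketch: extract the touch-screen coordinates from the
# smartphone sensor stream received via gHowl. `stream` is the
# component's input: a flat list of sensor values, with items 8
# and 9 assumed to hold the touch x and y.

def touch_xy(stream):
    if stream is None or len(stream) < 10:
        return None  # incomplete packet; ignore this update
    return float(stream[8]), float(stream[9])

point = touch_xy(stream)
if point:
    x, y = point  # outputs driving the louver definition
```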
Firefly is a plug-in for Grasshopper that strives to bridge the gap between computational design in Grasshopper and physical hardware such as the Arduino microcontroller board. I decided to build a circuit using two potentiometers and an Arduino board to create a 'sensor' that could be used with ease. Because this simple device relayed its information via USB, the whole operation was much faster.
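Firefly handles the serial link inside Grasshopper itself, but as a standalone illustration of what that link does, a minimal pyserial sketch reading the two potentiometer values might look like this; the port name and the "x,y" line format are assumptions, not Firefly's actual protocol:

```python
# Standalone illustration of the Arduino 'sensor': read two
# potentiometer values over USB serial and map them to a point on
# the wall. Port name and "x,y" line format are assumptions.

import serial

PORT = "/dev/ttyUSB0"       # assumed; e.g. "COM3" on Windows
WALL_W, WALL_H = 10.0, 3.0  # illustrative wall dimensions in metres

def read_point(link):
    line = link.readline().decode(errors="ignore").strip()
    try:
        raw_x, raw_y = (int(v) for v in line.split(","))
    except ValueError:
        return None  # malformed line; skip it
    # Arduino analogRead() returns 0-1023; remap to wall coordinates.
    return (raw_x / 1023.0 * WALL_W, raw_y / 1023.0 * WALL_H)

with serial.Serial(PORT, 9600, timeout=1) as link:
    while True:
        point = read_point(link)
        if point:
            print("viewer at", point)
```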
The installation took the form of a broad wall made up of a grid of rotating louvers that begin to spin when targeted and stop spinning when no longer targeted. The haptic device I built is used to simulate the position of the viewer, so that as they walk past they disrupt the wall.
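The logic of this first version was binary; a minimal sketch of the per-louver test, with an illustrative targeting radius, might read:

```python
# First iteration: a louver spins at full speed while the viewer's
# point is within a fixed radius of it, and stops otherwise.

RADIUS = 1.5  # metres; illustrative targeting range

def louver_speed(louver_pos, viewer_pos):
    dx = louver_pos[0] - viewer_pos[0]
    dy = louver_pos[1] - viewer_pos[1]
    inside = (dx * dx + dy * dy) ** 0.5 <= RADIUS
    return 1.0 if inside else 0.0  # normalised rotation speed
```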
The previous variation felt too static and perhaps boring, so I went back to Grasshopper and rethought the way in which the input data could be used. I decided that the louvers would spin according to their distance from the point (the viewer). Rather than simply stopping louvers outside a determined range, the distance reduces the speed at which they turn. This becomes more interesting when you view the piece and cannot see the moving point: the wall reveals the point's whereabouts. A final point of development was to intensify this notion of revealing. I copied the wall, adding depth to the machine and a more human quality as well. The Arduino device simulates an ambling viewer whose location is constantly assessed; the louvers closest to the person spin the fastest, revealing where they are.
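A sketch of the revised mapping, with a smooth falloff in place of the on/off test above; the falloff constant is illustrative:

```python
# Second iteration: rotation speed decays continuously with the
# louver's distance from the viewer, so distant louvers turn
# slowly rather than stopping dead.

FALLOFF = 2.0  # metres; distance at which speed halves (illustrative)

def louver_speed(louver_pos, viewer_pos):
    dx = louver_pos[0] - viewer_pos[0]
    dy = louver_pos[1] - viewer_pos[1]
    distance = (dx * dx + dy * dy) ** 0.5
    # Inverse falloff: 1.0 at the viewer, 0.5 at FALLOFF, -> 0 far away.
    return 1.0 / (1.0 + distance / FALLOFF)
```

Applied across the whole grid, the fastest-spinning louvers trace the viewer's position, which is what lets the wall reveal an unseen point.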
http://www.youtube.com/watch?v=ZpOBYBKQG3Q&feature=youtu.be