W.A.V.E
THE UNDERGROUND AADRL V.12
Contents

1. Design as Research ..... 5
01.1 Introduction to Fractals ..... 6
01.2 Algorithmic Architecture ..... 18
01.3 Mastering Material ..... 24
01.4 Information is 'Alive' ..... 30
01.5 Singularity ..... 36
01.6 Responsive Environments ..... 42
01.7 Principles ..... 48
01.8 Team as Design Strategy ..... 54

2. Research Lab ..... 61
02.0 Studio Brief ..... 63
02.1.1 L-System Symmetrical Behaviours ..... 64
02.1.2 L-System Asymmetrical Behaviours ..... 66
02.1.3 L-System Behaviours in Space ..... 72
02.1.4 L-System Circular Diagrams ..... 76
02.1.5 Multiple L-Systems ..... 78
02.1.6 L-System + Lofting ..... 79
02.1.7 Aggregation using L-System ..... 80
02.1.8 Spatial Behaviour of L-System based Cubes ..... 84
02.1.9 Translation of L-System Cubes into Components ..... 85
02.2.1 Quadtree Algorithm ..... 86
02.2.2 Quadtree Algorithm - Modification ..... 87
02.2.3 Quadtree Experimentation ..... 88
02.2.4 Quadtree Algorithm - Architectural Speculation ..... 92
02.2.5 Quadtree Data Manipulation ..... 96
02.3.1 Octree Algorithm ..... 98
02.3.2 Octree Experimentation ..... 100
02.3.3 Breaking Symmetry with Data Manipulation ..... 101
02.3.4 Octree Algorithm - Analysing Form and Space ..... 102
02.3.5 Breaking Symmetry of Koch Geometry ..... 103
02.3.6 Natural Selection ..... 104
02.4.1 Koch Curve ..... 108
02.4.2 2D Koch & 3D Koch ..... 110
02.4.3 Lofted Koch Curve ..... 114
02.5.1 Aperiodic Tiling - Penrose ..... 116
02.5.2 Penrose Deployment ..... 117
02.5.3 Penrose + Koch ..... 120
02.6.1 Materialization ..... 122
02.6.2 Water Dynamics on Folded Geometry ..... 123
02.6.3 Folding Behaviours on Populated Patterns ..... 124
02.6.4 Components from Penrose Geometry ..... 127
02.6.5 Aggregation of Components on Koch Curve ..... 128
02.6.6 Cutting Logic ..... 129
02.6.7 Overlapping ..... 132
02.6.8 Pinching ..... 133
02.7.1 L-System based Cubes under Dynamic Forces ..... 135
02.7.2 Penrose Tiling Pattern Embodied in Kinetic System ..... 136
02.7.3 Penrose Kinetic Structure Bounded by Koch Curve ..... 137
02.7.4 Surface Apertures ..... 139

3. Design Intervention ..... 141
03.1 Context ..... 143
03.2 Site Deployment ..... 144
03.3 Hierarchical Model ..... 146
03.4 Pattern Deformation ..... 147
03.5 Topography ..... 149
03.6 Water Dynamics ..... 151
03.7 Sedimentation Rates around the World ..... 153
03.8 Sedimentation on Mudflat Zone ..... 154
03.9 Pattern Elimination ..... 161
03.10 Topography Generation ..... 168
03.11 Touchdown Points - Extrusion ..... 172
03.12 Enclosure Deformation ..... 178
03.13 Structural System ..... 182
03.14 Apertures ..... 186
03.15 Water Breaker ..... 196
03.16 Section - Mudflat Zone ..... 200
03.17 Perspectives ..... 210
03.18 Ecology ..... 212
03.19 Water Channels ..... 218
03.20 Sedimentation on River Link ..... 222
03.20 Section: River Link ..... 232
03.21 Landscape Profile ..... 236
03.22 Fabrication System ..... 240
03.23 Construction System ..... 242
03.24 Perspectives ..... 244
03.25 Touchdown and Anchoring Points ..... 250
03.26 Orders of Triangle ..... 253
03.27 Ecosystem ..... 254

4. Physical Models ..... 257

5. Acknowledgement ..... 273
01 Essays
Design as Research
Introduction to Fractals
01.1. Introduction to fractals
By: Pebyloka Pratama
Background of Fractals
People always try to define space, to define forms and the spaces they produce. Space, as an intangible variable, provides a great sense of freedom and potential. In order to discover those potentials, people often try to make the intangible variable into a tangible one. Space thereby becomes place, a tamer creature that man can more easily interact with, despite losing some degree of the sense of freedom that it had.

The method of defining space has been evolving in the same rhythm as the evolution of technology. Technology, as a means of helping man understand and simulate a system, opens up new possibilities for how man can observe space. From the most basic and primitive technology to a complex computation system, the evolution of technology greatly influences how people think about forms and the spaces that they produce. The evolution of technology helps man to understand and tame the intangible "space" in order to explore its potentials.

The first recorded effort to define space dates back to the Greek era. The earliest discussion about geometry was sparked by the Greek mathematician Euclid of Alexandria with his book Euclid's Elements. It has been one of the most influential books in history, as much for its method as for its mathematical content. The method consists of assuming a small set of intuitively appealing axioms, and then proving many other propositions (theorems) from those axioms. Although many of Euclid's results had been stated by earlier mathematicians, Euclid was the first to show how these propositions could be fit together into a comprehensive deductive and logical system (http://en.wikipedia.org/wiki/Euclidean_geometry).

This effort to discover geometry as one way of defining space and form is also triggered by the kind of technology that humans possess at the time. With mathematics as the ground knowledge and basic tools such as scale and compass, man begins to understand space by measuring it and forming it into simple shapes.

Euclidean and Non-Euclidean Geometry

Euclidean geometry, a term addressing the basic forms that Euclid of Alexandria discovered, is so powerful that it stood untouched for a very long time. The basic geometry that Euclid's Elements uncovers has been used everywhere for centuries as a main principle of defining space, and it is still in use even in modern-day architecture.

For over two thousand years, the adjective "Euclidean" was unnecessary because no other sort of geometry had been conceived. Euclid's axioms seemed so intuitively obvious that any theorem proved from them was deemed true in an absolute sense. Today, however, many other self-consistent non-Euclidean geometries are known, the first ones having been discovered in the early 19th century. It is also no longer taken for granted that Euclidean geometry describes physical space. An implication of Einstein's theory of general relativity is that Euclidean geometry is a good approximation to the properties of physical space only if the gravitational field is not too strong (http://en.wikipedia.org/wiki/Euclidean_geometry).
The way humans understand space and form is constantly changing because of the evolution of technology and civilization. New technology forces people to redefine and re-criticize the way they initially defined space and form. Complex forms have coexisted with daily human life throughout history, whether as an artificial pattern like the traditional Indian knitting pattern or as a natural composition like a geographic formation caused by the abrasion of sea waves; yet the understanding of these complex forms is constantly changing and evolving.

Euclidean geometry, which emerged with the help of basic analogue tools and technology like scale and compass, evolved with the discovery of the theories of relativity and gravity. The evolution from analogue machines to digital machines will likewise influence how we see geometry. With the invention of digital computation machines, the way man understands form, space, and pattern will also evolve. Computers that can solve highly complex computation problems inspired a lot of innovations; one of them is the discovery of fractal geometry.

Julia Set

The root of fractal geometry was actually being studied decades before any complex computation machine was invented. In Paris, in 1917, a mathematician named Gaston Julia published papers that studied the so-called complex numbers (Arthur Clarke, Fractals: The Colours of Infinity). These mathematical studies have come to be known as the Julia set.
Greek mathematician performing a geometric construction http://en.wikipedia.org/wiki/File:Sanzio_01_ Euclid.jpg
Euclidean geometry began to evolve when Einstein's theory of general relativity and Newton's theory of gravity were discovered. Geometries influenced by these theories began to emerge: non-Euclidean geometry, which is contrasted with Euclidean geometry. The essential difference between Euclidean and non-Euclidean geometry is the nature of parallel lines. Euclid's fifth postulate, the parallel postulate, states that within a two-dimensional plane, for any given line l and a point A which is not on l, there is exactly one line through A that does not intersect l (http://en.wikipedia.org/wiki/Non-Euclidean_geometry). By contrast, non-Euclidean geometry allows the possibility that such lines intersect. The condition that defies Euclid's parallel postulate is called the elliptic behaviour, and the opposite of this behaviour is the hyperbolic behaviour. Non-Euclidean geometries at this time were considered to be transformations of Euclidean geometry affected by the laws of gravity and Einstein's relativity theory.
…In complex dynamics, the Julia set, J(f), of holomorphic functions (holomorphic functions are the central object of study of complex analysis; they are functions defined on an open subset of the complex number plane C with values in C that are complex-differentiable at every point) f informally consists of those points whose long-time behaviour under repeated iteration of f can change drastically under arbitrarily small perturbations (bifurcation locus)… http://en.wikipedia.org/wiki/Julia_set
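The definition quoted above can be made concrete with a short escape-time sketch. Python is used here purely for illustration (the studio's own scripts are written in RhinoScript); for the quadratic family f(z) = z² + c, a point is treated as lying outside the filled Julia set once its orbit leaves a fixed escape radius.

```python
# Escape-time test for the quadratic family f(z) = z^2 + c.
# Points whose orbit under repeated iteration stays bounded belong to
# the filled Julia set of f; points whose orbit escapes do not.
def escapes(z, c, max_iter=100, radius=2.0):
    """Return True if the orbit of z under f(z) = z^2 + c escapes."""
    for _ in range(max_iter):
        if abs(z) > radius:
            return True
        z = z * z + c
    return False

# For c = 0 the filled Julia set is the closed unit disk:
# z = 0.5 stays bounded, z = 1.5 escapes.
print(escapes(0.5 + 0j, 0j))   # False
print(escapes(1.5 + 0j, 0j))   # True
```

The "drastic change under arbitrarily small perturbations" in the quotation corresponds to points near the boundary, where bounded and escaping orbits sit arbitrarily close together.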
Julia Set http://www.fordham.edu/lewis/fferm/julia.png
Brownian motion
Gaston Julia and Pierre Fatou, French mathematicians, initiated the theory of complex dynamics in the early 20th century. They could already predict the meaning of iteration in a mathematical algorithm and what it could do in the realm of geometry, although they never saw what the Julia set they discovered looks like. Only at the advent of modern computation (around 1970) could the Julia set be realized as a graphic image. Meanwhile, a series of geometric studies trying to pry away from the notion of traditional Euclidean geometry happened elsewhere: people started to look at the irregular patterns of nature and tried to understand them.

Brownian Motion

Robert Brown, a Scottish scientist acknowledged as the leading botanist in an expedition to Australia during the first half of the 19th century (http://en.wikipedia.org/wiki/Robert_Brown_(botanist)), studied a pattern of movement that came to be known as Brownian motion. Brownian motion is the seemingly random movement of particles suspended in a fluid, or the mathematical model used to describe such random movements, often called a particle theory (http://en.wikipedia.org/wiki/Brownian_motion). Brownian motion is among the simplest of the continuous-time stochastic processes, and it is a limit of both simpler and more complicated stochastic processes. Later on, Brownian motion was discovered to have several real-world applications; an often quoted example is stock market fluctuations.

Brownian motion was later studied by Jean Perrin, a French physicist, and became one of the first and simplest stochastic motions to be described. Physical Brownian motion is described in Perrin 1909 as follows: "In a fluid mass in equilibrium, such as water in a glass, all the parts appear completely motionless. If we put into it an object of greater density, it falls. The fall, it is true, is the slower the smaller the object; but a visible object always ends at the bottom of the vessel and does not tend again to rise. However it would be difficult to examine for long a preparation of very fine particles in a liquid without observing a perfectly irregular motion. They go, stop, start again, mount, descend, mount again, without in the least tending toward immobility" (Benoit Mandelbrot, 'Theme' and 'The Irregular and Fragmented in Nature', The Fractal Geometry of Nature, New York, 1977).

Koch, Peano, and Sierpinski

There are many other examples of scientific studies that tried to explore, or even innovate, a new family of geometry apart from the classic Euclidean one. Some of them are what we know as the Sierpinski triangle, the Koch snowflake (also known as the Koch star), and Peano's space-filling curve. All of these studies already embody the principles that we now know as fractal geometry, although the term "fractal" itself had not yet been coined by the time these studies took place.

The Koch snowflake is a mathematical curve and one of the earliest fractal curves to have been described. It appeared for the first time in 1904, in a paper by the Swedish mathematician Helge von Koch titled "On a continuous curve without tangents, constructible from elementary geometry" (http://en.wikipedia.org/wiki/Koch_curve).

Koch snowflake fractals
http://www.emeraldinsight.com/fig/0670340109062.png

The Peano space-filling curve is a fractal system discovered by Giuseppe Peano, an Italian mathematician, in the early 20th century. In mathematical analysis, a space-filling curve is a curve whose range contains the entire 2-dimensional unit square (or the 3-dimensional unit cube) (http://en.wikipedia.org/wiki/Space-filling_curve).

The Sierpinski triangle, also known as the Sierpinski gasket, is a set of fractal triangles discovered by the Polish mathematician Waclaw Sierpinski in 1915. Originally constructed as a curve, it is one of the basic examples of self-similar sets (later we will see that self-similarity is one of the basic characteristics of fractal geometry). The Sierpinski triangle is a mathematically generated pattern that is reproducible at any magnification or reduction (http://en.wikipedia.org/wiki/Sierpinski_triangle). The Sierpinski triangle has a quite simple algorithm: a triangle is recursively divided into four triangles, and the central, inverted one is removed. This subdivision can happen infinitely, creating a range of ever smaller triangle sets. Because of this subdividing characteristic, the figure only grows inward, never outside the main triangle boundary.

L-System as a Representation Tool

These systems were later studied and represented, both geometrically and algorithmically, by the Lindenmayer system. Lindenmayer systems (or L-systems for short) were conceived as a mathematical theory of plant development by Aristid Lindenmayer, a biologist, in 1968. The central concept of the L-system is that of rewriting. In general, rewriting is a technique for defining complex objects by successively replacing parts of a simple initial object using a set of rewriting rules or productions (Przemyslaw Prusinkiewicz & Aristid Lindenmayer, The Algorithmic Beauty of Plants, New York, 1990). The Koch curve, the Sierpinski triangle, and Peano's space-filling curve can all be simulated using the L-system.

If we abstract the Lindenmayer system into its most basic definition, the L-system is a system that rewrites codes using a set of rules. The initial code that starts the algorithm, better known as the initiator, is rewritten by a set of rules, known as the generator, at each iteration, and the result is fed back into the original algorithm. For example, the Koch snowflake can be drawn using the L-system with a fairly simple procedure. Mandelbrot restates this construction as follows: "One begins with two shapes, an initiator, and a generator. The latter is an oriented broken line made up of N equal sides of length r. Thus each stage of construction begins with a broken line and consists in replacing each straight interval with a copy of the generator, reduced and displaced so as to have the same end points as those of the interval being replaced." (Przemyslaw Prusinkiewicz & Aristid Lindenmayer, The Algorithmic Beauty of Plants, New York, 1990)
Peano’s space filling curve http://www.apprendre-en-ligne.net/blog/images/ Peano_curve.png
Sierpinski triangle http://ecademy.agnesscott.edu/~lriddle/ifskit/gallery/trianglefractals/SierpinskiTriangle.gif
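The recursive subdivision of the Sierpinski triangle described above is easy to verify numerically. A minimal sketch (Python here for illustration, since the studio scripts are in RhinoScript):

```python
# Count the filled triangles of a Sierpinski gasket after n subdivisions.
# Each subdivision splits a filled triangle into four, of which the
# central (inverted) one is removed, so three filled copies remain.
def sierpinski_count(n):
    if n == 0:
        return 1              # the initial triangle
    return 3 * sierpinski_count(n - 1)

print(sierpinski_count(1))    # 3
print(sierpinski_count(5))    # 243
```

The count grows as 3^n while each triangle's side halves, which is where the gasket's fractal (Hausdorff) dimension log 3 / log 2 ≈ 1.585 comes from.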
Another example of a Koch curve simulated using the L-system is the quadratic Koch island. It is drawn with the following rules:

F = move forward a step of length d
+ = turn left by angle alpha
- = turn right by angle alpha

With n (generations) = 2 and alpha = 90:

Initiator: F-F-F-F
Generator: F turns into F+FF-FF-FF+F+FF-F-F+F+FF+FF-F

The result is shown in the images on the left.
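The rewriting step itself can be sketched independently of any CAD package. A minimal Python version of the expansion (illustrative only, not the studio's Rhino script), using the initiator and generator quoted above; the turtle interpretation of F, + and - is left out:

```python
# One rewriting step of the L-system: every F in the current string is
# replaced by the generator; the turn symbols + and - are copied as-is.
INITIATOR = "F-F-F-F"
GENERATOR = "F+FF-FF-FF+F+FF-F-F+F+FF+FF-F"

def rewrite(s, rule=GENERATOR):
    return "".join(rule if ch == "F" else ch for ch in s)

def expand(n):
    s = INITIATOR
    for _ in range(n):
        s = rewrite(s)
    return s

# After one iteration each of the 4 F's becomes 18 F's: 72 segments.
print(expand(1).count("F"))   # 72
```

Feeding the output of each iteration back in as the next input is exactly the initiator/generator loop Mandelbrot describes: the string, like the curve, grows by a fixed factor per generation.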
Koch island (Przemyslaw Prusinkiewicz & Aristid Lindenmayer, The Algorithmic Beauty of Plants, New York, 1990)

Below is the Visual Basic script for Rhinoceros that simulates the Koch curve using the L-system logic, so that these algorithms can be explored more easily.

Option Explicit
'Script written by peby
'Script copyrighted by the underground
'Script version 17 February 2009 17:58:10

Call Main()

Sub Main()
	'-----defining the initiator and generator strings and the number of iterations-----
	'tokens are comma-separated so that Rhino.Strtok can split them into "F", "+F" and "-F"
	Dim initiator : initiator = "F,-F,-F,-F"
	Dim generator : generator = "F,+F,F,-F,F,-F,F,+F,+F,F,-F,-F,+F,+F,F,+F,F,-F"
	Dim iterations : iterations = 2

	'-----defining the variables-----
	Dim strObj, arrObj, arrObjTemp
	Dim dblAng01, dblAng02, dblDiv
	Dim i, k
	Dim arrPtTemp, arrVecTemp, strObjTemp
	Dim arrTokens
	'i = loop number every iteration
	'k = loop number every token
	Dim m, n
	'm = loop number for every object in arrObj
	'n = number that increments on every object loop

	'-----defining the first and second angle and also the subdivision-----
	dblAng01 = 225
	dblAng02 = 225
	dblDiv = 1/6

	'-----creating the first object (line)-----
	strObjTemp = Rhino.AddLine(Array(0,0,0), Array(0,120,0))

	'-----redim-ing arrObj as an array of objects based on the time the object was created-----
	ReDim arrObj(0)
	arrObj(0) = Rhino.FirstObject
	arrPtTemp = Rhino.PolylineVertices(strObjTemp)
	arrVecTemp = Rhino.VectorCreate(arrPtTemp(0), arrPtTemp(1))
	Rhino.EnableRedraw False

	'-----defining the tokens of the initiator and generator strings for every iteration-----
	For i = 0 To iterations
		If i = 0 Then
			arrTokens = Rhino.Strtok(initiator, ",")
		Else
			arrTokens = Rhino.Strtok(generator, ",")
		End If

		For m = 0 To Ubound(arrObj)
			n = 0
			strObj = arrObj(m)
			For k = 0 To Ubound(arrTokens)
				'-----select case for each token
				'-----each token will be identified as an individual function
				Select Case arrTokens(k)
					Case "F"
						Call StraightLine(strObj, dblDiv, n)
						strObj = Rhino.FirstObject
					Case "+F"
						Call DiagonalRight(strObj, dblDiv, dblAng01, dblAng02, n, arrVecTemp)
						strObj = Rhino.FirstObject
					Case "-F"
						Call DiagonalLeft(strObj, dblDiv, dblAng01, dblAng02, n, arrVecTemp)
						strObj = Rhino.FirstObject
				End Select
			Next
		Next

		'-----redim-ing arrObj as the array of objects resulting from the loop above-----
		arrObjTemp = Rhino.AllObjects
		ReDim arrObj(Ubound(arrObjTemp))
		arrObj = arrObjTemp
	Next

	Rhino.EnableRedraw True
End Sub
'writing the code for the functions
'function StraightLine corresponds to token "F": it scales the input object by dblDiv
'while maintaining a straight direction

Function StraightLine(strObj, dblDiv, n)
	Dim arrPts01A, arrPts01B, arrPts02A
	Dim strNewObj
	Dim arrVec, arrVec01

	'create points from strObj for the scale starting point and for creating a vector
	arrPts01A = Rhino.CurveStartPoint(strObj)
	arrPts01B = Rhino.CurveEndPoint(strObj)

	'create vector from strObj to scale without x,y,z array scaling
	arrVec = Rhino.VectorCreate(arrPts01B, arrPts01A)
	arrVec01 = Rhino.VectorScale(arrVec, dblDiv)

	'action for the 1st token
	If n = 0 Then
		n = n + 1
		'1. delete the strObj that will be replaced by a new line
		Rhino.DeleteObject strObj
		'2. locate the end point of the new line:
		'   add the x,y,z of arrVec01 to the x,y,z of the strObj starting point
		arrPts02A = Array(arrPts01A(0)+arrVec01(0), arrPts01A(1)+arrVec01(1), arrPts01A(2)+arrVec01(2))
		'3. create a line from the starting point of strObj to arrPts02A
		Rhino.AddLine arrPts01A, arrPts02A
		strNewObj = Rhino.FirstObject
	Else
		'action for the tokens starting from the 2nd
		n = n + 1
		'1. locate the end point of the new line:
		'   add the x,y,z of arrVec to the x,y,z of the strObj end point
		arrPts02A = Array(arrPts01B(0)+arrVec(0), arrPts01B(1)+arrVec(1), arrPts01B(2)+arrVec(2))
		'2. create a line from the end point of strObj to arrPts02A
		Rhino.AddLine arrPts01B, arrPts02A
		strNewObj = Rhino.FirstObject
	End If
End Function

Function DiagonalLeft(strObj, dblDiv, dblAng01, dblAng02, n, arrVecTemp)
	Dim arrPts01A, arrPts01B, arrPts02A
	Dim strNewObj
	Dim arrPtsTemp01
	Dim arrVec, arrVec01
	'create points from strObj for the scale starting point and for creating a vector
	arrPts01A = Rhino.CurveStartPoint(strObj)
	arrPts01B = Rhino.CurveEndPoint(strObj)

	'create vector from strObj to scale without x,y,z array scaling
	arrVec = Rhino.VectorCreate(arrPts01B, arrPts01A)
	arrVec01 = Rhino.VectorScale(arrVec, dblDiv)

	'action for the 1st token
	If n = 0 Then
		n = n + 1
		'1. delete the strObj that will be replaced by a new line
		Rhino.DeleteObject strObj
		'2. locate the end point of the new line:
		'   add the x,y,z of arrVec01 to the x,y,z of the strObj starting point
		arrPts02A = Array(arrPts01A(0)+arrVec01(0), arrPts01A(1)+arrVec01(1), arrPts01A(2)+arrVec01(2))
		'3. create a line from the starting point of strObj to arrPts02A
		Rhino.AddLine arrPts01A, arrPts02A
		strNewObj = Rhino.FirstObject
		arrPtsTemp01 = Rhino.CurveStartPoint(strNewObj)
		'4. rotate strNewObj in the CPlane by -dblAng01
		Rhino.RotateObject strNewObj, arrPtsTemp01, -dblAng01, , False
		'5. rotate strNewObj around arrVecTemp by -dblAng02
		Rhino.RotateObject strNewObj, arrPtsTemp01, -dblAng02, arrVecTemp, False
	Else
		'action for the tokens starting from the 2nd
		n = n + 1
		'1. locate the end point of the new line:
		'   add the x,y,z of arrVec to the x,y,z of the strObj end point
		arrPts02A = Array(arrPts01B(0)+arrVec(0), arrPts01B(1)+arrVec(1), arrPts01B(2)+arrVec(2))
		'2. create a line from the end point of strObj to arrPts02A
		Rhino.AddLine arrPts01B, arrPts02A
		strNewObj = Rhino.FirstObject
		arrPtsTemp01 = Rhino.CurveStartPoint(strNewObj)
		'3. rotate strNewObj in the CPlane by -dblAng01
		Rhino.RotateObject strNewObj, arrPtsTemp01, -dblAng01, , False
		'4. rotate strNewObj around arrVecTemp by -dblAng02
		Rhino.RotateObject strNewObj, arrPtsTemp01, -dblAng02, arrVecTemp, False
	End If
End Function

Function DiagonalRight(strObj, dblDiv, dblAng01, dblAng02, n, arrVecTemp)
	Dim arrPts01A, arrPts01B, arrPts02A
	Dim strNewObj
	Dim arrPtsTemp01
	Dim arrVec, arrVec01

	'create points from strObj for the scale starting point and for creating a vector
	arrPts01A = Rhino.CurveStartPoint(strObj)
	arrPts01B = Rhino.CurveEndPoint(strObj)

	'create vector from strObj to scale without x,y,z array scaling
	arrVec = Rhino.VectorCreate(arrPts01B, arrPts01A)
	arrVec01 = Rhino.VectorScale(arrVec, dblDiv)

	'action for the 1st token
	If n = 0 Then
		n = n + 1
		'1. delete the strObj that will be replaced by a new line
		Rhino.DeleteObject strObj
		'2. locate the end point of the new line:
		'   add the x,y,z of arrVec01 to the x,y,z of the strObj starting point
		arrPts02A = Array(arrPts01A(0)+arrVec01(0), arrPts01A(1)+arrVec01(1), arrPts01A(2)+arrVec01(2))
		'3. create a line from the starting point of strObj to arrPts02A
		Rhino.AddLine arrPts01A, arrPts02A
		strNewObj = Rhino.FirstObject
		arrPtsTemp01 = Rhino.CurveStartPoint(strNewObj)
		'4. rotate strNewObj in the CPlane by dblAng01
		Rhino.RotateObject strNewObj, arrPtsTemp01, dblAng01, , False
		'5. rotate strNewObj around arrVecTemp by dblAng02
		Rhino.RotateObject strNewObj, arrPtsTemp01, dblAng02, arrVecTemp, False
	Else
		'action for the tokens starting from the 2nd
		n = n + 1
		'1. locate the end point of the new line:
		'   add the x,y,z of arrVec to the x,y,z of the strObj end point
		arrPts02A = Array(arrPts01B(0)+arrVec(0), arrPts01B(1)+arrVec(1), arrPts01B(2)+arrVec(2))
		'2. create a line from the end point of strObj to arrPts02A
		Rhino.AddLine arrPts01B, arrPts02A
		strNewObj = Rhino.FirstObject
		arrPtsTemp01 = Rhino.CurveStartPoint(strNewObj)
		'3. rotate strNewObj in the CPlane by dblAng01
		Rhino.RotateObject strNewObj, arrPtsTemp01, dblAng01, , False
		'4. rotate strNewObj around arrVecTemp by dblAng02
		Rhino.RotateObject strNewObj, arrPtsTemp01, dblAng02, arrVecTemp, False
	End If
End Function
The studies of fractals as a geometry, not just a mathematical equation, began in the early 1970s, when modern computational machines started to heavily influence scientific research. Benoit Mandelbrot, a French mathematician, was the one who started the research on fractals as a new family of geometry besides classical Euclidean geometry and non-Euclidean geometry (which is actually an extension of Euclidean geometry influenced by relativity theory and gravity).

The studies of fractals started with a question: why is geometry often described as "cold" and "dry"? (Benoit Mandelbrot, 'Theme' and 'The Irregular and Fragmented in Nature', The Fractal Geometry of Nature, New York, 1977). The inability of classic Euclidean geometry to explain the forms of nature sparked this question. Mountains are not triangles. Clouds are not circles. So Mandelbrot speculated that there must be some way to explain the formation of forms in nature better than classical Euclidean geometry.

Mandelbrot claims that many patterns of nature are so irregular and fragmented compared to Euclidean geometry that nature exhibits not simply a higher degree but an altogether different level of complexity (Benoit Mandelbrot, 'Theme' and 'The Irregular and Fragmented in Nature', The Fractal Geometry of Nature, New York, 1977).

Fractal geometry is actually a geometrical family that we see and know all of our lives. It is the expansion of the classical Euclidean geometry discovered during the Greek era. Even Mandelbrot himself said, in a documentary created by Arthur Clarke, that "fractals are the shapes that we are extraordinarily used to in our subconscious, unorganized lives…" (Arthur Clarke, Fractals: The Colours of Infinity). The simplest example is the coastline of Britain, whose shape initially inspired Mandelbrot. On a large-scale map, the coastline of Britain looks like simple geometry, lines or rectangles, without any detail at all. But if we zoom in, more details come out: self-similar shapes emerge to complement the detail of the coastline. If we zoom in further, other self-similar shapes emerge as the detail of the detail of the previous scale, until the boundary of water and land becomes blurred at actual size.

From there, Mandelbrot tried to put together a geometry based on many known striated facts in mathematics, many striated facts in our daily experience, and many facts that scientists had produced in their past research. Using those known facts as a brief, he then tried to put up a new creature, a new geometry family, known as fractals (Arthur Clarke, Fractals: The Colours of Infinity).

The word "fractal" itself was coined by Mandelbrot in 1975, when he was studying the Julia set with modern computation as a means. It was derived from the Latin fractus, meaning "broken" or "fractured" (http://en.wikipedia.org/wiki/Fractal_geometry).

Freeman John Dyson, an American theoretical physicist and mathematician famous for his work in quantum field theory, solid-state physics, and nuclear engineering (http://en.wikipedia.org/wiki/Freeman_Dyson), gave an eloquent summary of Mandelbrot's theme of fractals: "Fractal is a word invented by Mandelbrot to bring together under one heading a large class of objects that have (played)… an historical role… in the great development in pure mathematics. A great revolution of ideas separates the classical mathematics of the 19th century from the modern mathematics of the 20th. Classical mathematics had its roots in the regular geometric structures of Euclid and the continuously evolving dynamics of Newton. Modern mathematics began with Cantor's set theory and Peano's space-filling curve. Historically, the revolution was forced by the discovery of mathematical structures that did not fit the patterns of Euclid and Newton…" (Benoit Mandelbrot, 'Theme' and 'The Irregular and Fragmented in Nature', The Fractal Geometry of Nature, New York, 1977).

Mandelbrot Set

On the 1st of March 1980, Mandelbrot discovered the Mandelbrot set, the most famous set of fractal geometry. The special thing about the Mandelbrot set is that from a very simple algorithm a range of complex shapes can emerge that continue to infinity. The Mandelbrot set is so simple that it only needs addition and multiplication to compute, but its algorithm needs to be iterated millions, even billions, of times to create the complete set; that is why it was not discovered until the era of modern computation.

The initial formula from the Mandelbrot set
The initial shape from the Mandelbrot set
The details from the Mandelbrot set that can continue until infinity

The initial algorithm of the Mandelbrot set is shown in the image on the left. The symbol that looks like a two-way arrow is the symbol of iteration: the Z that is the output of one iteration becomes the input for the next. It goes around like a dog biting its own tail, or like the Ouroboros, an ancient Greek symbol depicting a serpent swallowing its own tail and forming a circle. From this simple algorithm emerges a pattern so complex that the detail of the form goes on to infinity. If we keep magnifying or zooming in on the arm of the 'bug' (as Mandelbrot called the creature), the details keep going until infinity. Even when the initial set is as big as the whole universe, similar shapes will still emerge from the set.

From these studies, fractal geometry emerges. The main characteristics of a fractal geometry are:
1. It has a fine structure at arbitrarily small scales.
2. It is too irregular to be easily described in traditional Euclidean geometric language.
3. It is self-similar (at least approximately or stochastically).
4. It has a Hausdorff dimension greater than its topological dimension (although this requirement is not met by space-filling curves).
5. It has a simple and recursive definition.
(http://en.wikipedia.org/wiki/Fractal_geometry)
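The iteration behind the set, z → z² + c starting from z = 0, can be sketched in a few lines (Python here for illustration; the studio's own scripts are RhinoScript). A point c is kept in the set while its orbit stays within radius 2, since any orbit that passes |z| = 2 is known to diverge.

```python
# Escape-time iteration for the Mandelbrot set: start from z = 0 and
# repeatedly apply z -> z^2 + c ("the dog biting its own tail"); c is
# treated as a member while the orbit stays within radius 2.
def in_mandelbrot(c, max_iter=200):
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return False
    return True

print(in_mandelbrot(0j))         # True  (the orbit stays at 0)
print(in_mandelbrot(-1 + 0j))    # True  (orbit cycles 0, -1, 0, -1, ...)
print(in_mandelbrot(1 + 0j))     # False (0, 1, 2, 5, ... diverges)
```

Only addition and multiplication are involved, exactly as the text notes; the complexity comes entirely from repeating this loop for millions of sample points.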
Reference

Arthur Clarke (Producer & Director), Fractals: The Colours of Infinity [Motion Picture]
Benoit Mandelbrot, 'Theme' and 'The Irregular and Fragmented in Nature', The Fractal Geometry of Nature, New York, 1977
Przemyslaw Prusinkiewicz & Aristid Lindenmayer, The Algorithmic Beauty of Plants, New York, 1990
Euclidean Geometry (n.d.), retrieved April 24, 2009, from http://en.wikipedia.org/wiki/Euclidean_geometry
Fractal Geometry (n.d.), retrieved April 24, 2009, from http://en.wikipedia.org/wiki/Fractal_geometry
Freeman Dyson (n.d.), retrieved April 23, 2009, from http://en.wikipedia.org/wiki/Freeman_Dyson
Sierpinski Triangle (n.d.), retrieved April 22, 2009, from http://en.wikipedia.org/wiki/Sierpinski_triangle
Space Filling Curve (n.d.), retrieved April 21, 2009, from http://en.wikipedia.org/wiki/Space-filling_curve
Koch Curve (n.d.), retrieved April 21, 2009, from http://en.wikipedia.org/wiki/Koch_curve
Brownian Motion (n.d.), retrieved April 2, 2009, from http://en.wikipedia.org/wiki/Brownian_motion
01.2. Algorithmic Architecture

ALGORITHMIC ARCHITECTURE (Computerization vs. Computation and an Algorithmic Approach for Design)

by: Mehmet Akif Cinar

"Homo faber, homo fabricatus. We make a tool and the tool makes us."

History witnesses the transformations and translational effects of technological advances across the whole range of human practices. Since the computer, that marvellous digital-mechanical invention, entered our daily lives in the 1940s, it has without doubt had a radical impact on the socio-cultural life of human beings and on our perception of the world, and, to come to this short article's concern, on technical applications and research-based matters. How this transformative machine, which is "bending matter to its ends" (1), has played a role in the realm of architectural design, as distinct from other fields of application, is a fundamental issue. An immediate question can be posed here: has the transformative use of the computer been successful enough up to this point?

To begin with, according to Michael Meredith, while other design fields, such as the automotive and aeronautical industries, were radically transformed in their generative bases and capacities from the 1980s, when computers stepped in, the architectural field was unfortunately still debating tautological ideological issues far from production (2). The following text presents an alternative approach, one that differs from the traditional use of the computer and that can change the architectural design method.

Before jumping into architecture directly, it is worth taking a brief look at the history of the computer, to understand the issue from its basics. The writers of the programming book, Mr and Mrs Kingsley-Hughes, trace the beginnings of this technology back to the 1820s. Charles Babbage, an important multidisciplinary scientist (philosopher, mathematician, logician, inventor and engineer), developed the concept of the "difference engine", a steam-powered machine intended to avoid the miscalculations in the astronomical, tidal and navigational charts of his time. Those errors, caused mainly by human mistakes, had led many ships to lose their way at sea. After completing one small working part of this engine, Babbage realized he could design the "Analytical Engine", a machine capable of solving any mathematical problem, instead of his first single-purpose difference machine. This new design consisted of many components analogous to those of the modern computer and was even programmable with simple punch cards. Babbage thus became the first person in history known to have envisioned the computer, being significantly ahead of his time, which led to his being called "the father of the computer" (3). He devoted the rest of his life to building this machine but could not complete it, mainly because of financial shortcomings and technological insufficiency. He died in 1871, leaving behind the drawings of his designs; built in the last century from those drawings, they worked exactly as intended. After this first effort to build a computing mechanism, the second came from America, driven by the need for computing systems for the 1890 census. There was a need to count and store information on the tremendously growing population of America, swelled by immigration from Europe (4). The old-fashioned system, humans marking papers, could not keep up with this change, and
a competition was staged by the census office to find a new system for computing and delivering data-processing equipment. A machine developed by Herman Hollerith, using punch cards, made the process roughly ten times faster than the simpler hand-based mechanisms and won the contest. Within only six weeks, over 60 million people were counted and tabulated by a simple electrical system. The speed of information processing thus increased significantly, a huge step toward the later development of computers (4). During the late 19th century and the first half of the 20th, there were several other notable attempts to propose new mechanical or electrical analogue computer models (5). Finally, an American researcher, George Robert Stibitz, developed a system based on binary "Boolean logic", completed an electromechanical relay-based computer, and is now named the father of the modern digital computer (6). Boolean logic, developed by George Boole in the 19th century, was another of the first successful steps toward the modern computer. It had the power to translate logical operations into linguistic form: it offered a representational approach to logic, bringing a symbolic framework to language terms such as "equal to", "greater than", "less than" and "not equal to", and converting complicated statements into simple, readable instructions (6-1). Thus, one century after the discovery of Boolean logic, George Stibitz used exactly the same operational approach, naming it the "binary system", which is the fundamental principle of the functioning of the digital computer.

After this breakthrough, things moved at breakneck speed as programmability became more sophisticated. New programming languages emerged, taking the uses of computation beyond imagination. To make a long story short, by the 1980s computers were sufficiently small, highly reliable and much faster. Today personal computers are commonplace, helped by the spread of internet technology, and small programmable computers are everywhere humankind can reach. As stated above, the very first attempts to create this incredibly handy tool were driven, to put it simply, first by the wish to eliminate human error and second by the need for a system that surpasses human ability in a certain field. At the same time, it was the translation of a logical system into a mechanism that made the realization of the computing machine possible.

After this background, Meredith's claim mentioned in the introduction of this text, that architecture has struggled in idle, repetitive debates instead of transforming itself into a fundamentally different conception, can now be substantiated. Although the use of the computer in several professional fields has run parallel to these founding ideas and needs, in the architectural realm this parallelism is quite questionable (2). The question of the "scope of the involvement of the computer" in architectural design, introduced in the 1950s, could be the guiding inquiry for us in exploring this issue (8). In the last decade, however, this question was simply forgotten, as Prof. Picon informs us. The progress in the visualization power of computers, giving birth to strange forms and blob architecture, fascinated architects and clouded their view of this complicated issue, which contains both technical and philosophical aspects.

According to Prof. Kostas Terzidis, the dominant use of computers in today's architecture is limited and fallacious, mostly based on simple mouse manipulations of 3D computer models. He introduces two illuminating concepts, "computerization" and "computation", and compares them to untangle this tricky subject. Computerization comprises the essential activities of entering, operating on and storing information in a computer system; it consists mainly of the digitization of entities or ideas preconceived in the human mind. Computation, by contrast, involves processes of calculation that define entities through logical or mathematical systems, and so it explores the elusive and unclear nature of things. By this description, the dominant use of the computer obviously falls into the first category. Most architects and designers conceive of computers as advanced tools into which they enter, and in which they manipulate and organize, their preconceived and conceptualized ideas; in this way they simply lose the benefits of the power of computation that the first inventors were in search of.
The research behind and creation of design software evidently involves computational methods, even fundamental mathematical and philosophical solutions. However, when the software serves the designer's purposes, a simple 3D transformation of, say, a NURBS surface (which contains computational aspects in its programmed nature) obviously does not mean the designer is taking advantage of computing. An algorithmic approach, one that directly engages the most fundamental component of the computer Babbage came up with more than a hundred years ago, the facility of "programming", could be the answer we are looking for to overcome this complicated issue. Programming, meaning "the ability to talk to the machine in a language it can understand and using grammar and syntax that it can follow to get it to perform useful tasks" (9), is the alternative strategy, in contrast with the mainstream use of computers in the design world. Moreover, 3D software packages such as Rhino, Maya and 3ds Max contain scripting languages (RhinoScript, MEL, MaxScript) that build on essential base programming languages such as Visual Basic and Java, and can in this way save designers from the burdens of hardcore programming. The factory-set limits of 3D software, and the imprisonment of mouse-based operations, can be broken with these scripting languages. Thus the computational power of the computer is incorporated, allowing designers and architects to expand their human intellect. This computation can offer a systematic approach with a generative capacity for space and form in architectural design. An internal, rule-based logical method, departing from human language itself, is attached to design through the programming approach. This design method can be named "algorithmic architecture"; the term "algotecture" was coined by Kostas Terzidis to signify the use of algorithms in architecture.

"In mathematics, computing, linguistics, and related subjects, an algorithm is a finite sequence of instructions, an explicit, step-by-step procedure for solving a problem, often used for calculation and data processing" (10). It is a method for formally articulating a well-defined strategic plan for solving a known problem or completing a certain task; further, by containing a randomness factor, it can serve to search the probable solutions of a partially known problem (10). The logical, linguistic quality of the algorithm makes the translation, or reflection, of human thought possible. Articulating the components of a problem, or describing the steps of its solution, is the way to communicate and interact with the computer. Contrary to using the computer merely as a tool (computerization, the tendency of mainstream designers), a synergetic relationship between the nature of the machine and the human agent arises through the algorithmic strategy. Computation introduces huge quantities of calculation, complicated analyses, recursive processes and random applications that a human mind could not have gone through before. This power of the computer, carrying aspects both alien (otherness) and counterpart to the human being, extends the limits of the human intellect and reveals potential methodologies the human mind had not conceived because of its own limitations. The machine, once invented, affects and changes its own creator. In the field of design, algorithms
serve by systematizing, and also rationalizing, the thought of the designer. An unclear problem is transformed into a traceable pattern. A recipe-like solution method emerges from composing a logical system out of grammatical or syntactical rules. In architecture especially, defining the specific problem, whether it concerns aesthetics, planning considerations, formal issues or construction, is hardly possible for the human designer. However, the logical, quantitative method of computation, which differs from traditional human thinking, can make the problem addressable, and by addressing it can suggest different solutions, alternative ways of seeing the problem, or at the very least reliable hints. So the algorithm is not only used to arrange steps or to solve the problem; it also serves as a guide for exploring what the problem involves. The following practice-based example presents a true use of the power of the algorithmic approach. The Advanced Geometry Unit (AGU) at Arup, founded by Cecil Balmond, includes a multidisciplinary group of architects, engineers and scientists who came together with an interest in creating a new kind of practice. The unit intends to find a new approach that differs from the traditional method of architecture, in which drawings of plans and sections are patched together to make the building. Their way is to search the internal dynamics of forms and to understand the systems' own logic, rather than mimicking software-based models and extruded drawings. The AGU realized the daring constructions of the Serpentine Pavilions in London. The group's research approach for these projects was to take arbitrary starting points and find a rule-based system.
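The rule-based, generative character of an algorithm described above can be illustrated with a minimal Lindenmayer (L-)system rewriter, sketched here in Python. The axiom and rules are illustrative examples, not drawn from any project discussed in this text:

```python
# Minimal Lindenmayer (L-)system rewriter: a finite set of instructions
# that generates growing complexity from a simple seed (illustrative).

def l_system(axiom, rules, generations):
    """Rewrite every symbol of the string `generations` times."""
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Lindenmayer's classic "algae" system: A -> AB, B -> A
print(l_system("A", {"A": "AB", "B": "A"}, 4))  # ABAABABA
```

The same rewriting mechanism underlies the L-system studies catalogued in the contents of this volume.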
a.1- The conceptual project Toyo Ito designed: a box defined by complex, disordered horizontal lines and straight vertical lines representing the columns and beams.
b.1- The algorithmic approach AGU came up with: a square representing the borders of the proposal, with two adjacent sides connected from the midpoints of those sides.
01.2. Algorithmic Architecture
The conceptual Serpentine Gallery project of Toyo Ito, a box defined by complex, disordered horizontal lines and straight vertical lines representing the columns and beams (a.1), was given a definable order through a simple geometric algorithm. The research group took a square and connected two adjacent sides, each from the midpoint of its length, putting aside the traditional Cartesian grid of parallel intersecting lines (b.1). Repetition of this connection, however, created an obvious reference back to the initial square. So instead of always connecting one midpoint to another, the connection was made from one third of the length of one side to the half of the next side, and a trimmed square was the outcome. This 1/2-to-1/3 subdivision rule was applied seven times recursively, producing a spiral of trimmed squares at different scales that refer only partially to the first square (b.2). This simple repetitive subdivision algorithm formed an intricate, rich pattern in the end (b.3). After this point, all the end points of the trimmed squares were extended to the edges of the first, large square, and the extensions were then folded perpendicular to the four sides to create the box scheme of Toyo Ito's first proposal. The structural solution was achieved simply by extruding these geometric lines 55 cm perpendicular to the surface.
b.2- Second, the connection was made from one third of the length of one side to the half of the next side. This 1/2-to-1/3 subdivision rule was repeated seven times recursively, and a spiral of trimmed squares at different scales came out.
b.3- This simple repetitive subdivision algorithm formed an intricate, rich pattern as an outcome.
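The recursive subdivision described above can be sketched in a few lines of Python. This is a simplified reading of the rule (each new square is produced by taking a point a fixed fraction along every side of the previous one), not AGU's exact trimmed-square construction:

```python
# Recursive subdivision sketch: each pass connects a point a fraction t
# along every side of the current square, yielding a spiral of rotated,
# shrinking squares (a simplification of the 1/2-to-1/3 rule).

def lerp(p, q, t):
    """Point a fraction t of the way from p to q."""
    return (p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1]))

def subdivide(square, t, depth):
    """Return the original square plus `depth` recursive inner squares."""
    squares = [square]
    for _ in range(depth):
        a, b, c, d = square
        square = [lerp(a, b, t), lerp(b, c, t), lerp(c, d, t), lerp(d, a, t)]
        squares.append(square)
    return squares

# Seven recursive passes, as in the pavilion geometry
spiral = subdivide([(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)], 1 / 3, 7)
print(len(spiral))  # 8: the starting square plus seven generations
```

Plotting the resulting corner lists reproduces the family of nested, rotated squares seen in b.2 and b.3.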
c.1-The interweaving squares created by the process determine geometrical and structural hierarchies.
d.1- The first attempt at rationalizing the project was extremely successful and, at the same time, it turned into the discovery of a new, unexpected aesthetics.
As this practical application shows, the generative algorithm invented by AGU articulated the arbitrary, chaotic nature of the first design proposal; in this way the aesthetic aspect of the geometry was achieved and, more importantly, the structure, and hence the construction, was rationalized. The interweaving squares created by the process determine geometrical and structural hierarchies (c.1). The first generated square consists of the primary structural elements, and this square supports the inner spiral of squares, the secondary elements. Likewise, a hierarchy of thicknesses appears: the beam sections thicken from the central square out to the perimeter of the pavilion, where the first square lies. The algorithmic approach also made it possible to arrange a sub-component system that made fabrication, transportation and erection easier and faster. In the end, the cladding of the pavilion was generated by a basic binary representation of the first geometric rule, which settled the distribution of solid and transparent elements. An intricate, beautiful geometric pattern was achieved by a rule-based mathematical method, and the complexity resulted from the power of computation, which resolved the problems of the initial random, chaotic approach with a simple strategic touch. A new holistic architectural product is the outcome of a process in which structure, tectonic nature and ornamental quality are interdependent and indistinct. Eventually, the first simple attempt at rationalizing the project was extremely successful and, at the same time, it turned into the discovery of a new, unexpected aesthetics (d.1).
References:

Berlinski, D. (2000). The advent of the algorithm. San Diego.
Sakamoto, T., Ferré, A., & Kubo, M. (Eds.). (2008). From control to design: parametric/algorithmic architecture. Barcelona: Actar.
Hughes, A., & Hughes, K. K. (2005). Beginning Programming. Indianapolis: Wiley Publishing.
Sellers, D. (Producer & Director). (2005). Modern marvels, Thinking machines: the creation of the computer [Motion picture]. New York: Hearst Entertainment.
History of computer hardware. (n.d.). Retrieved April 28, 2009, from http://en.wikipedia.org/wiki/History_of_computer_hardware
George Stibitz. (n.d.). Retrieved April 28, 2009, from http://en.wikipedia.org/wiki/George_Stibitz
The above example, only one of several genuine experiences, sufficiently shows what the computational perspective offers the architectural world: most importantly, its strong potential for discovering a new way of conceiving the whole architectural enterprise. The use of the computer goes beyond computerization, i.e. imitating, storing or organizing what is already understood in the designer's mind; it becomes a medium for exploring what is not yet understood. The transfer of human thinking processes to a machine is carried out through programming. The computer thereby becomes a reflector of the human mind, mirroring its mental processes and showing the human designer a way to map out his or her own thinking. The computer makes it possible to see the knowledge we cannot see through habitual thinking, and helps designers and architects take matters into their own hands. With the help of computation and algorithms, designers are now able to extend their thoughts into an unknown and once unthinkable complex world. The algorithm serves to define new problems, helps in grasping a problem, points to possible solutions, and gives birth to a hybrid human-computer designer. Consequently, at its simplest, the emergence of a new architecture comes out of this hybrid genesis, transforming all the related fields. Once again, the tool humans first created helps them surpass their own existence.
George Boole. (n.d.). Retrieved April 28, 2009, from http://en.wikipedia.org/wiki/George_Boole
Terzidis, K. (2006). Algorithmic architecture. Oxford: Architectural Press.
Algorithm. (n.d.). Retrieved April 28, 2009, from http://en.wikipedia.org/wiki/Algorithm
Matering material
In search of material based aesthetic in digital architecture
01.3. Matering Material
by: Ardes Perdhana
Abstract
I was triggered by two opposing texts from two different media outlets. They discuss the architect's role in light of different directions of practice. This piece tries to explain why it is unproductive to comment on contemporary architecture without relating it to its architectural thinking and theory. It then tries to convince the reader that material-based digital innovations are the true advantage of our time, and that we should use them as our starting point. By critiquing some related projects, which explain more about "what is not", I try to define an abstract area that can still be explored: promoting the material itself, into which layers of values can be injected so that it can adapt and interact with its context. This, I hope, leaves a new periphery in the reader's mind worth investigating.
The debate
An interesting text was published by The New York Times in its weekly architecture column, titled 'Architecture: Until the Money Ran Out', by Ouroussoff (2008). He questioned the role of architects in solving our real problems. It fascinates me because it can be considered strong evidence of public opinion. If a mainstream outlet like The New York Times is interested in bringing up such a topic, it is clear that the architect is being doubted as a profession, perhaps even charged as guilty of causing the problems faced by much of the world's society.
Ouroussoff stated that architects, who once promised much with their visionary design formulas and statements and were celebrated as heroes of the cultural field, tend to serve only a narrow segment of people with their projects. He related this to the economic crisis the world is now facing: contemporary architecture ignores the social agendas that architects advocated in the modernist era. He argues: "…But somewhere along the way that fantasy took a wrong turn. As commissions multiplied for luxury residential high-rises, high-end boutiques and corporate offices in cities like London, Tokyo and Dubai, more socially conscious projects rarely materialized. Public housing, a staple of 20th-century Modernism, was nowhere on the agenda…" (Ouroussoff, 2008).

Opposed to Ouroussoff's text, there was another from the Architecture for Humanity website, written by Sinclair and Stohr (2008). They tried to show that not all architects practice in the way Ouroussoff described, working on attention-grabbing projects. They introduced architects such as MMA Architects, Samuel Mockbee, Hassan Fathy and Buckminster Fuller as the new architectural revolution. These architects run socially oriented practices, trying to solve problems in areas damaged by natural disaster or poverty. They encouraged Ouroussoff to trust these kinds of architects to lead the new architectural revolution, rather than big-name architects like Rem Koolhaas, Zaha Hadid, Herzog & de Meuron, Frank Gehry, and so on. Sinclair and Stohr argue pointedly:
Scientific theory, in its true purpose, gives us the logical direction for not repeating something that has already been done, so that we can stay focused on making innovations in the most effective way. That is the aim of the system.
Science-based critique method
Re-interpretation of the Vitruvian classic definition of architecture
At this stage, I start to think about how we as architects should deal with it. The facts mentioned above lead to the question that is my true point in this piece: how can we fairly judge and evaluate the contemporary debates and problems of our latest architectural scene? Can we solve those problems with the latest potentials of our architectural findings, while still using the logical systems of science? I agree that some contemporary architecture projects highlighted by the media are not solving the problems of our real world. Cities dominated by skyscrapers have proved economically and ecologically inefficient. It is sadly true! But I do not agree that we should focus on the choice of direction in practice as the solution to this particular condition. I do agree that the architects mentioned by Sinclair and Stohr, working in socially oriented practices, are noble, and I admire their sincerity. But it misses the whole point, because the reaction did not start from the realm of theory, our logical base and system for critique, but from pragmatic reasons such as changing our type of practice.
There are many things that can still be explored in this manner. The fact that innovations in digital technology are no longer something new, and that people all over the world, from all kinds of backgrounds (cultural, economic, etc.), are already familiar with and affected by them (the internet, digital cameras, cell phones, games, etc.), is a genuine potential that we should push forward in our explorations. It is powerful because, for example, we can finally simulate thousands of phenomena within architecture using the computer, in a shorter time than ever before, producing more progressive research findings to be applied. So let me introduce some projects, from inside and outside digital architecture, that are relevant to discuss. For each project I will explain its objectives and offer critiques based on the three key aspects that matter in digital architecture: context, interactivity/adaptivity, and matter/material. Why are these three criteria considered the key aspects of contemporary architecture? I agree that there is no such
thing as new in architecture, only chains of refinement built upon previously recorded, comprehensive design thinking. And I agree with what Stephen Gage confirmed: these three aspects are the relevant interpretations of Vitruvius's classic definition of architecture. As Gage explains, they are an updated reading of the English poet Sir Henry Wotton's commentary on Vitruvius's Ten Books of Architecture, on the three most basic qualities a project should display to be considered architecture: firmness (context), commodity (matter), and aesthetic/delight (interactivity/adaptivity). He goes further in describing the relation of these three key aspects in the context of an environment:
“... but why call on designers who spent the better part of their careers building ever-competing, energyconsuming, sky-piercing structures, when you could hire any of a myriad of qualified (if less well-known) firms already experienced and engaged in rethinking the built environment?...” Sinclair and Stohr (2008)
"…It is worth examining these attributes more closely. When we ascribe the quality of firmness to an object, we do this in terms of our understanding of the environment in which it sits. When we ascribe the quality of commodity to the same object, we extend the description of the environment to include our understanding of the behaviour of people. When we go further and ascribe the quality of delight to an object, we can only do this in terms of our own understanding of the understanding of others." (Gage, 2008, quoted in Sheil, 2008)

Regarding this new interpretation, let me start by examining MMA Architects' project, which I think repeats something already done in the modernist era of architecture.
"Low-cost single-family dwelling, Indaba, South Africa" (Mpahlwa, L., Morojele, M. 2008. Curry Stone Design Prize Website [online]. [Accessed 16th January 2009]. Available from World Wide Web: <http://currystonedesignprize.com/?page_id=208>)
“Simple Nested Hierarchy and Complex Hierarchy” (Reiser, Umemoto. 2006. Atlas of Novel Tectonics. New York: Princeton Architectural Press)
They designed low-income housing for a Cape Town shantytown, winner of the Inaugural International Design Award for Humanitarian Innovation, given by the Curry Stone Design Prize. MMA Architects used indigenous mud-and-wattle building techniques. The building uses a brick-and-mortar foundation to support a two-storey frame of timber with sandbag infill construction. They argue it reduces energy consumption because it uses no electricity, and that it is efficient because it requires no skilled labor to construct. This project uses a method that Reiser and Umemoto call a collaging technique: strictly embedding one material with only one value or function in a building. In its attitude towards material, and in how it relates material to the structural aspect of the building, this is an excluding approach that ignores many other values. Each material embeds only one function, reducing its many other potentials for the problems it could solve. I would like to bring up a quotation from Sir E. H. Gombrich's book titled "Norm and Form", which can be found in Reiser and Umemoto's Atlas of Novel Tectonics. He explains: "It will be remembered that the principle of exclusion is a very simple, not to say primitive, principle that denies the values it opposes. The principle of sacrifice admits and indeed implies a multiplicity of values. What is sacrificed is acknowledged to be a value even though it has to yield to another value which commands priority." (Gombrich, 1971, quoted in Reiser and Umemoto, 2006)
I think we should start to focus on finding possibilities for facilitating a plurality of functions within the material. One way to achieve that goal is a method Stephen Gage introduced as a self-controlling system: a system that allows an object to continuously evaluate the conditions of the place where it sits and to take certain internal actions in order to adapt and survive. Gage states that this ambition of having layers of aspects and functions within a material began when time was considered in architecture. Since then, people have been interested in exploring cybernetics, which offers an autonomous control system, and in applying it to architecture. One of the reasons it began to be explored is a shift of paradigm, from "man as the centre of the world" to an equal position with the other creatures of nature, as explained by Usman Haque in commenting on Gordon Pask's Conversation Theory. He argues: "…Now at the beginning of the 21st century, Pask's Conversation Theory seems particularly important because it suggests how, in the growing field of ubiquitous computing, humans, devices and their shared environments might coexist in a mutually constructive relationship…" (Haque, 2008, quoted in Bullivant). At this stage I want to introduce some projects made with this attitude of research. One of them is research carried out by Niall McLaughlin Architects.
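Before turning to the projects, the self-controlling system described above, an object continuously evaluating its surroundings and adjusting itself to survive, can be sketched as a simple sense-evaluate-act loop. Everything here (the facade aperture, the light readings, the target and gain values) is a hypothetical Python illustration, not taken from any project cited in this text:

```python
# Hypothetical sense-evaluate-act loop for a self-regulating facade
# aperture (0.0 = fully closed, 1.0 = fully open).

def regulate(aperture, light_level, target=0.5, gain=0.1):
    """Nudge the aperture toward the target interior light level."""
    error = target - light_level                # evaluate the environment
    aperture += gain * error                    # act on the internal state
    return min(1.0, max(0.0, aperture))         # respect physical limits

aperture = 0.2
for reading in [0.9, 0.8, 0.7, 0.6]:            # sensed light, per time step
    aperture = regulate(aperture, reading)      # too bright -> close a little
```

Run continuously, such a loop lets the object converse with its context rather than merely occupy it.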
They concentrated on the reactions between everyday found objects and nature itself in order to give a sense of the aesthetic. Their projects demonstrate material-based research that tries to create responses to natural phenomena (physical, chemical, kinetic) from a material's natural potentials. The first piece of evidence results from a collaboration with an artist, Martin Richman, creating a self-illuminating and scented field. Gardening equipment made from plastics, scattered over a detergent-covered area of the floor of the RIBA building, absorbs UV light from the sun during the day. When night comes, it starts spreading light, lit by the physical interactions between UV light, plastic and detergent, along with a strong scent, across the whole floor of the building. This finding is worth exploring for its ability to provide light without any electricity: it demonstrates an interaction between material and nature through physical reactions. The second is a canopy made from computer-etched copper sheet. The two layers of copper were pre-patinated with different chemical processes, so that over time the canopy takes on different colors from its designed geometric pattern. It offers evidence of a self-continuing chemical reaction between material and nature. A further project demonstrates a relation between material and nature that can produce movement through its chemical and kinetic potentials: a creature made by Theo Jansen, which is moved by wind and sea water, the stimulators of its internal mechanical operations, in order to survive on the beach. I advocate these projects as evidence of systems embedding several layers of values. This kind of system allows communication between the material and its given context in order to adapt. It holds big opportunities for further development, since these works cannot yet be defined as complete architectural projects.

"Bloom, RIBA, London" (McLaughlin, N. 2000. Screens. Protoarchitecture: Analogue and Digital Hybrids. Vol 78 (No 4). pp. 72)
In search of a material-based aesthetic in digital architecture

A critical point I can still raise about those projects is their unspecific kind of aesthetic. Somehow I feel it is not something I can consider an architectural aesthetic. I am still trying to find a distinct formula that can define an aesthetic clearly different from those of the other fields of art and of previous architectural thinking (i.e. the postmodern era). I agree with the importance of the user of an architectural object, as explained by Edward Winters in his book titled "Aesthetics and Architecture". An architectural object always has to have a relation to the public: the user plays an important role in defining an aesthetic of architecture. Winters categorizes architecture as a public art, and stresses that the architect, as its creator and artist, has to relate their works to the public where they sit. He argues:
“Delay, Soane Street Canopy, London” (McLaughlin, N. 2007. Screens. Protoarchitecture: Analogue and Digital Hybrids. Vol 78 (No 4). pp. 78)
“Animaris Percipiere (Sea Foam)” (Jansen, T. 2005. Strandbeests. Protoarchitecture: Analogue and Digital Hybrids. Vol 78 (No 4). pp. 72)
01.3. Matering Material
“Anamorphic Tectonics, Rome” (Shafiei, S. 2007. Convoluted Flesh. Protoarchitecture: Analogue and Digital Hybrids. Vol 78 (No 4). pp. 43)
“Convoluted Tectonics, Rome” (Williams, J. 2007. Convoluted Flesh. Protoarchitecture: Analogue and Digital Hybrids. Vol 78 (No 4). pp. 42)
“…These artists could not have produced the work they have without concerning themselves with the particular publics for whom the works were made. Architecture just is a public art. Its works are placed in public spaces and together they form our towns and our cities. Art, when it went public, had to address the issues arising from its newly found publicity…” Winters (2007)
With such considerations, I start questioning the perception that people can achieve from an architectural object. What kind of values within architecture can be understood by the public? How can we achieve that in the project?

Understanding how a brain works in order to form a perception is important at this stage. Stephen Gage argues that an architectural object should first be learnt by its spectators; it is the natural way. By the time they understand it, it gives the sensation of delight. Gage then brings us to an explanation that I consider an opportunity for exploration in the digital realm. He questions what happens next, once the object is already understood by its spectators. Is it going to lose its sensation again? How can we keep those sensations, regardless of how many times the spectators come back to the same object?

Gage introduces something he calls continuous epiphany. He explains that variety should be contained in architecture in order to achieve a particular sensation of the aesthetic. He argues:

“…Perhaps if the variety is great enough the observer will always learn new things – a continuous epiphany…” (Gage, 2008, quoted in Sheil)

At this stage it is quite clear that variety plays an important role in giving the continuous delight that is needed to make a distinguished aesthetic. The more the merrier! And there has been no better time to achieve more variety than right now, since digital innovation allows us to simulate millions of arrays of variation probabilities.

But I am still questioning the common values that the public share as their communal perception of an architectural object. Is there any such thing? Let me introduce the distinction Peter Eisenman drew between the conceptual aspect and the perceptual aspect of the aesthetic in his “Notes on Conceptual Architecture: Towards a Definition”. He argues that the conceptual aspect of architecture is something that is already processed as layers of meanings inside the brain of every observer, while the perceptual aspect of architecture is formed by the physical appearance of the object itself.

He takes the word symbol as an example of how observers react to a word. If we were looking at the letter “X” and an observer said that it is the letter “X”, they would be using the conceptual aspect: it is a commonly agreed meaning. But if they said that the letter “X” is “something that is centralized”, they would be reacting through the perceptual aspect: the object appears with its own natural and physical meaning. With this definition, I promote the perceptual aspect as the one that can define which perception should be advocated.
Conclusion

Material has to be placed at the top priority of further investigation. The evidence and arguments built above, based on real problems and theoretical problems, direct us to explore the connection between material, context, and interactivity/adaptability. This has the potential for innovation in terms of theoretical logic. And since it all started from real problems, hopefully in the end it will give us a problem-solving kind of architecture.
This is coherent with the set of arguments I explained before, that the physical potentials of a material should consist of layers of values. If we were implementing conceptual aspects instead, we would be repeating something already done in the postmodern era, because meaning, which is a seed form of a culture, can be formed by those physical potentials within the material in architecture. In this sense it is worth glancing at the design investigations of Marcos Cruz and Marjan Colletti at the Bartlett School of Architecture. They try to explore the possibility of implementing a typological approach in the topological realm. I consider this to be repeating something already done in the postmodern era of architecture: the typological approach implements certain shapes or symbols that already exist in a society and represent certain meanings in its history as a community. I object to it on the grounds that culture and history can be formed by architecture, not the other way around. Reiser and Umemoto state this particular argument clearly in their Atlas of Novel Tectonics.
Reference List

Curry, C. 2008. Curry Stone Design Prize Website [online]. [Accessed 16th January 2009]. Available from World Wide Web: <http://currystonedesignprize.com/?page_id=208>

Eisenman, P. 1971. Notes Towards a Conceptual Architecture: Towards a Definition. Casabella. No 359, pp. 49-57.

Sheil, B. 2008. The Wonder of Trivial Machines. Protoarchitecture: Analogue and Digital Hybrids. Vol 78 (No 4), pp. 17, 19.

Haque, U. 2007. The Architectural Relevance of Gordon Pask. In: Bullivant, L. (ed). 4D Social: Interactive Design Environments. Vol 77 (No 4), p. 55.
Ouroussoff, N. 2008. ARCHITECTURE: It Was Fun Till the Money Ran Out. New York Times Website [online]. [Accessed 14th January 2009]. Available from World Wide Web: <http://query.nytimes.com/gst/fullpage.html?res=9504E4DD153BF932A15751C1A96E9C8B63&sec=&spon=&pagewanted=1>

Reiser, J. and Umemoto, N. 2006. Atlas of Novel Tectonics. New York: Princeton Architectural Press.

Sinclair, C. and Stohr, K. 2008. Architecture for Humanity Website [online]. [Accessed 16th January 2009]. Available from World Wide Web: <http://www.architectureforhumanity.org/updates/2008-12-21-a-letter-to-the-new-york-times>

Winters, E. 2007. Aesthetics and Architecture. London: Continuum International Publishing Group.
Information is ‘Alive’
Influence of Data on Built Space
01.4. Information is ‘Alive’
by: Muhammed Shameel
Recording information is not new. It can be dated back to cave men, who used cave walls to record information. Since then recording has evolved into different forms: writing on leaves, clay tablets, books, floppy disks, hard disks and so on. Data can be seen in two ways: first, material information, which consists of definite recorded information such as history and scientific documents; second, immaterial information, such as personal memories, songs, dances, rituals, etc. The latter has a tendency to be modified when passed on from one person to another.

Earlier ways of recording information followed a strict linear system. Data was arranged on the basis of a hierarchy or grid, and it also had to store information about the persons and processes it represented. This facilitated easy access to information as well as better control.

The introduction of the binary number system by Gottfried Leibniz in the 17th century was a breakthrough for data storage and computing. Leibniz, sometimes called the first computer scientist, began to develop machines that could execute arithmetical operations using the binary numbering system. Information came to be encoded into 1s and 0s and, much later, stored on magnetic tapes.

The modern digital database no longer functions in a linear fashion. Data is accessible through complex networks of linking technologies, facilitating a totally random and non-linear system that resembles the fuzzy logic seen in the case of immaterial information. Accessing information in such a system can be compared to finding a needle in a haystack. Complex, intelligent search engines are designed to link all the related data even when the relation is not defined by the user. These systems are so powerful that they can map out which products or books a given individual would prefer, and even suggest family members and friends the individual is related to. Social networks like Facebook, Twitter and Orkut, and services like Google and Amazon, use these systems to become more interactive. The database is capable of restructuring itself as and when new information enters the system, making the primitive linear data system more intelligent, adaptable, complex and dynamic. In addition, the internet plays a significant role in increasing the domain of the database exponentially.

The interactivity of the database also depends on the type of data. Data such as historical accounts or ancient sacred texts has different implications from data related to the individual. The latter leads to the behavioural aspects of the individual: this information can predict the individual's future behaviour and can be a great input in various fields. For example, consumer companies use this information to decide what sort of product should be released into the market depending on the trend. The database of an individual can consist of social security numbers, credit cards and bank accounts, contact numbers, driving licence, membership details and so on; the list goes on. These data make up one's identity. Having this set of data places the individual in the virtual network, which keeps the individual under a constant scanner, yielding information on his whereabouts and what he is doing. This explains why information related to the individual is of more importance than historical accounts.
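The binary encoding Leibniz introduced, the reduction of any record to a sequence of 1s and 0s, can be sketched in a few lines. This is an illustrative sketch added here, not part of the original essay; the function names are invented:

```python
# Any recorded information can be reduced to 1s and 0s and recovered again.
def to_binary(text):
    """Encode a string as the 8-bit binary digits of its character codes."""
    return " ".join(format(ord(ch), "08b") for ch in text)

def from_binary(bits):
    """Recover the original string from its binary encoding."""
    return "".join(chr(int(b, 2)) for b in bits.split())

encoded = to_binary("data")
print(encoded)               # 01100100 01100001 01110100 01100001
print(from_binary(encoded))  # data
```

The round trip is lossless, which is exactly what makes the binary system a universal medium for the "material" and "immaterial" information discussed above.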
Michel Foucault writes: ‘For a long time ordinary individuality - the everyday individuality of everybody - remained below the threshold of description. To be looked at, observed, described in detail, followed from day to day by an uninterrupted writing was a privilege... The disciplinary methods reversed this relation, lowered the threshold of describable individuality and made of this description a means of control and a method of domination. [What is archived] is no longer a monument for future memory, but a document for possible use. And this new describability is all the more marked in that the disciplinary framework is a strict one: the child, the patient, the madman, the prisoner, were to become... the object of individual descriptions and biographical accounts. This turning of real lives into writing is no longer a procedure of heroization; it functions as a procedure of objectification and subjection.’ (Discipline and Punish: The Birth of the Prison, 1979, Vintage Books)

Earlier ways of recording information were based on events that had taken place. Only information related to important events, or to people holding high or prestigious posts, was recorded; this did not include the common man. Perhaps the closest one could get at that point in time was tagging a group of individuals under an organization. The world's population today is estimated at 6.77 billion. With information technology advancing at a rocket pace, bringing more and more rural towns, villages and individuals into the virtual network, it won't be long before each human on earth is linked and monitored or examined.

Manuel DeLanda expands on Foucault's argument that ‘it was the variety of new ways of examining the individuals that governed the information accumulation process’ (Brouwer, Joke (2003) Information is Alive, ‘The Archive Before and After Foucault’, pp. 8-13, V2_/NAi Publishers, Netherlands). He points to basic everyday procedures, such as the examination of a patient, the tests administered to students to measure the degree of their learning, and the questionnaires given to soldiers to be recruited or workers to be hired, whose codes of conduct changed in the 17th and 18th centuries. Previously a physician's visual inspection was quick and irregular; it has since become an extended process of inspection in which regular records are kept in addition to more sophisticated medical tests. Examinations at schools have become instruments for determining, assessing and comparing individual aptitudes.

At the moment what we have is giant networking organizations and search engines that hold information related to the individual in their databases. They have established themselves by developing nodes that can connect to different entities and individuals. These interfaces are also designed in a fashion that encourages the participation of the individual by placing his or her data on the web. So at the start we have an independent node; this node grows as its database grows through the participation of individuals. The next level of this virtual network is the interconnection of these different nodes comprising independent organizations. Tim Berners-Lee calls this ‘Linked Data’ (2009, ‘The next Web of open, linked data’, TED 2009 Conference, California, U.S.A.). He extends the idea of data to relationships, whereby a particular piece of data can lead to other data. To achieve this he proposes linking these huge databanks and making them accessible to every individual in the world. Linking databases can result in totally new information, almost like chemical reactions in which new chemicals result from the reaction between two or more. When this is extended to the entire world, it offers a huge domain of permutations and combinations that can give a wide spectrum of possibilities. The more connected it is, the more powerful it becomes.
Two sets of data linked to get a new result or purpose
A Network of Linked Data interacting with each other to produce a spectrum of outputs.
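The linked-data idea pictured above, two independent datasets joined through a shared key to yield information neither holds alone, can be sketched as follows. This is a hypothetical illustration; the datasets and names are invented for this example:

```python
# Two independent "databanks": one about people, one about cities.
people = {"alice": {"city": "Delft"}, "bob": {"city": "London"}}
cities = {"Delft": {"country": "Netherlands"}, "London": {"country": "UK"}}

def link(person):
    """Follow the link from a person's record into the city databank."""
    record = dict(people[person])          # copy the person's data
    record.update(cities[record["city"]])  # merge in the linked city data
    return record

print(link("alice"))  # {'city': 'Delft', 'country': 'Netherlands'}
```

Neither databank alone knows Alice's country; the link between them produces it, which is the "chemical reaction" between datasets described in the text.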
Various social networks and organizations should share their data.
Growth of a Network
According to George Dyson, for an entity to be considered alive it has to perform two actions: first, develop a system that supports growth; second, the system should be able to reproduce itself (Brouwer, Joke (2003) Information is Alive, ‘On the Loose: Interview with George Dyson’, pp. 70-80, V2_/NAi Publishers, Netherlands). At the beginnings of life, however, you do not need reproduction but a system that can replicate itself. The difference is that replication produces an exact copy, while reproduction produces similar ones; in a biological context, genes replicate themselves and the organism reproduces.

At the moment what we have is a giant network with the capacity to grow continuously. This becomes the metabolic system for the sustenance of life. The network of data has not, however, started to reproduce itself, though one can say that the stage is set and that primitive operations of replicating codes and bits can be seen. It is hard to see data as ‘alive’: one may agree that the growing virtual network driven by data and information is a metabolic system, but cannot consider such a virtual entity to reproduce. Reproduction in biological organisms is achieved through cell division and sex, and something like it can be seen in the virtual network as well: while ‘sending’ or ‘transmitting’ information in a network, we replicate a copy of the code at a remote location, leaving the parent code intact at the host location. The present global network is sustained through ‘organs’ such as computers, routers and servers, and the web of connections between several nodes. The code that forms the building blocks of this network is rapidly becoming multicellular by running on many processors at the same time. The network has now been extended into a primitive ecosystem of which we as humans have become a part. Either one can say that these data-driven virtual networks have become part of us, or that together we have become part of life. Data is ‘Alive’.

What has been established is that data has become part of, or is the result of, every aspect of our life. In such a context it is quite significant that our built structures should be designed in a fashion that addresses the existence of such a system as well as accommodating it. At the moment we have statistical data in the forefront, influencing built structures and upcoming design prototypes. The D-Tower, by NOX at Doetinchem, Netherlands, is a good example of statistically driven design. It is a hybrid of digital data and material construct, made possible through a biomorphic tower, a website and a questionnaire. It transforms these into an interactive system of relationships in which the intensive (feelings, qualities) and the extensive (space, quantities) start exchanging roles, as human action, colour, money, value and feeling become network entities. Through the online questionnaire about daily emotions provided to the participating residents of Doetinchem, the system evaluates the prevailing emotional state of the city's residents, and the tower changes its colour accordingly: hate, love, happiness and fear are represented by four colours, green, red, blue and yellow.

Another example of data-driven design is a recent development called ‘Sixth Sense’, developed at Pattie Maes' lab at MIT and spearheaded by Pranav Mistry. This design is associated with the individual on a one-to-one basis and addresses the examination of the individual discussed earlier.
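The D-Tower's statistical mapping as described, questionnaire responses aggregated into a prevailing emotion that selects the tower's colour, could be sketched like this. This is a speculative reconstruction for illustration, not NOX's actual software:

```python
from collections import Counter

# The four emotion-to-colour pairings stated in the text.
COLOURS = {"hate": "green", "love": "red", "happiness": "blue", "fear": "yellow"}

def tower_colour(responses):
    """Return the colour for the most frequently reported emotion."""
    prevailing, _count = Counter(responses).most_common(1)[0]
    return COLOURS[prevailing]

print(tower_colour(["love", "fear", "love", "happiness"]))  # red
```

The point is how little machinery is needed to let statistical data drive a built artefact: a tally, a lookup, a light.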
It consists of a wearable device with a projector that can retrieve and project meta-information on literally any surface. The main driving force of the design is access to meta-information that allows the individual to make the right decisions about whatever he or she comes across. The prototype comes in handy in various instances, to name a few: at a supermarket, you walk up to a particular section and look at the wide range of brands available in one product category; the device identifies the product and brand through image recognition or marker technology and accesses the information available online about the product. Depending on personal preferences, relevant information about that particular product is projected, suggesting whether that particular brand is suitable or not. When we meet a person, all the information or data related to that individual is retrieved from blogs and personal webpages and represented in the form of a web cloud. It should be noted that these devices not only project information but are also capable of inputting information into the database cloud. As said earlier, the cloud restructures itself every time new information enters the system. Some other examples where statistical data plays a significant role and influences built space are stock markets, political analysis, urban scenarios of traffic and circulation, urban planning and design (space syntax by Bill Hillier and Julienne Hanson), global information systems, surveillance and supervision; the list goes on.

Apart from feeding in information, archiving it and using it for future purposes, data can also be used to reinterpret information in different ways. Today we are also looking at multi-dimensional data that contains tons of information within it. Visualization, hearing and exploration of this multi-dimensional data offer breakthroughs in areas of science and engineering. Such data is so complex that specially designed computers are required to handle it, and the decoding of this information and the interaction of specific decoded data can give us interesting and useful results. The Allosphere, developed by researchers at the Center for Research in Electronic Art Technology (CREATE) at UC Santa Barbara, provides a platform where a group of scientists can be completely immersed in data. It is a three-storey metal sphere inside an echo-free chamber, resembling a large, dynamically varying digital microscope attached to a supercomputer. The team consists of visual artists, scientists and computer engineers. The visual artists have remapped complex mathematical algorithms that unfold in time and space, both visually and sonically; the scientists look into the new patterns that the interaction of the multi-dimensional data produces; and twelve special supercomputers were designed by the computer engineers to handle data exploration at this scale. The Allosphere provides a big opportunity for people in various fields to study their data in depth. For example, a surgeon can fly into the brain, look at the tissue landscapes generated visually and hear the blood density levels as
The D-Tower, by NOX at Doetinchem, Netherlands
The Sixth Sense gadget
The Allosphere at UC Santa Barbara : Allo Brain at Allosphere
Stacking of portable hard disks to serve as future brick bats
music at the same time. Such reinterpretation of data has led to interesting discoveries, and in turn to further data.

Almost every user linked to the virtual network today operates through a computer, personal digital assistant or mobile phone. He or she has a certain amount of digital space in the virtual network, measured in mega-, giga- and terabytes. This, however, takes up physical space in our built structures in the form of giant server rooms, which represent the nodes in the network. With data driving most of our lives and the exponential growth of the metabolic network system, we should think of spaces that would accommodate the system beyond server rooms, some of which have been discussed earlier. Extending data from the organizational level to the individual would blow up the server space exponentially. Space today is limited and is designed to serve multiple functions at the same time. In such a scenario, how can we extend the server room into a multi-purpose program? This will have to be answered in the near future.

Portable storage devices such as external hard disks and memory sticks used to store personal data have been reduced to convenient dimensions today. If we analyze them, it is quite evident that the dimensions of these devices have stayed relatively constant for the last decade; what has changed is their storage capacity. One possible way of attaching a multi-purpose utility to these devices is to use them as brick bats. A hard disk probably does not perform as well as a brick bat, but it would transform the server room into screen walls. Altering the material casing of the hard disk into a stronger, more performative component comparable to a conventional wall would make it thermally, visually and acoustically sound. The conventional wall has material physics and is in constant conversation with the environment; incorporating the digital domain would make it ‘super intelligent’.

Another possible scenario we should consider is the recycling of data. A huge amount of physical energy is consumed in the creation and maintenance of virtual data, and means have to be explored whereby this virtual data can be recycled into physical energy. This might look very ambitious but could become a significant requirement in the near future.

Reference:
Brouwer, Joke (2003) Information is Alive, V2_/NAi Publishers, Netherlands
Mistry, Pranav (n.d.) Sixth Sense - Integrating Information with the Real World. Retrieved April 18, 2009, from http://www.pranavmistry.com/projects/sixthsense/
Spuybroek, Lars (2004) NOX: Machining Architecture, Thames & Hudson
The Allosphere Research Facility (n.d.). Retrieved April 17, 2009, from http://www.allosphere.ucsb.edu/
Tim Berners-Lee (2009) ‘The next Web of open, linked data’, TED 2009 Conference, California, U.S.A.

Bibliography:
www.wikipedia.org/wiki/Gottfried_Leibniz
http://www.ted.com/index.php/talks/pattie_maes_demos_the_sixth_sense.html
http://www.ted.com/index.php/talks/tim_berners_lee_on_the_next_web.html
http://www.ted.com/index.php/talks/joann_kuchera_morin_tours_the_allosphere.html
http://en.wikipedia.org/wiki/Space_syntax
Fernandez Per, Aurora (2007) D book: density, data, diagrams, dwellings, a+t ediciones
Singularity in Technology and Concept Realization in Comic Books
01.5. Singularity
Case Study: Ghost in the Shell and Astro Boy
by: Pebyloka Pratama
Singularity

The word “singularity” seems a bit foreign to our ears; it sounds a lot like science fiction, far away from our everyday lives. Few of us realize that singularity has been the inspiration for much research and many works of art. From non-fiction essays, novels, movies and television series to popular art and modern entertainment such as computer games, many media have tried to understand the meaning of singularity. Singularity has been the theme of many multimedia works within our daily lives. Some of them try to explore singularity in a very rigorous manner; others just use singularity for entertainment purposes. They range from those that use singularity as a mere backdrop, to those trying to expose the condition of singularity within everyday life, to deep research about singularity based on the facts of today, to attempts to analyze the impact of singularity on our psychological state.

This essay will try to explain the definition of singularity and to explore singularity in a specific form of popular art: comic books and manga. Some of these comics even date from before people tried to study singularity in a scientific manner. These ideas of singularity in comic books are like predictions or assumptions about what the future will be like, with all the changes technology brings to the norms and values of our daily lives.

Based on etymology, the word singularity means:
1. something that is singular: as a: a separate unit, b: unusual or distinctive manner or behaviour: peculiarity
2. the quality or state of being singular
3. a point at which the derivative of a given function of a complex variable does not exist but every neighbourhood of which contains points for which the derivative does exist
4. a point or region of infinite mass density at which space and time are infinitely distorted by gravitational forces and which is held to be the final state of matter falling into a black hole (http://www.merriam-webster.com/dictionary/singularity)

In science, singularity can refer to a variety of concepts: there is singularity in mathematics, technological singularity, gravitational singularity, mechanical singularity, etc. There are also individual theories that try to analyze and explain what singularity really means in a very specific environment or field of science. In mathematics, a singularity in general is a point at which a given mathematical object is not defined, or a point of an exceptional set where it fails to be well-behaved in some particular way, such as differentiability (http://en.wikipedia.org/wiki/Mathematical_singularity). In physics, singularity can mean the threshold at which a substance transforms into another substance with different properties. For example, when water turns to ice there is a point at which the substance is not water but also not yet ice; it is indefinable, since it behaves neither like water nor like ice. People tend to relate singularity to a point of evolution: a specific moment when a creature morphs into another creature. It is a point where logic and assumptions no longer work as they used to.
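The mathematical sense quoted above can be made concrete with a standard textbook example (added here as a worked illustration, not from the original essay). Consider the reciprocal function:

$$f(z) = \frac{1}{z}, \qquad f'(z) = -\frac{1}{z^{2}}$$

Both are defined at every point except $z = 0$, yet every neighbourhood of $0$ contains points where the derivative does exist. The point $z = 0$ is therefore a singularity of $f$ in exactly the dictionary sense: the one place where the otherwise well-behaved rules of the function fail.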
For the purpose of narrowing down the topic, this essay will dwell only on the theory of singularity in the technological sense. Artificial intelligence will also be discussed a little, given the strong connection it has with technological singularity itself. Aside from the exposition of technological singularity, this essay will also discuss the role and influence of comic books in modern society.

As we all know, the improvement and rapid development of technology, whether consciously or subconsciously, is affecting our everyday lives and norms. People are re-defining their world. New families of geometry, like fractal geometry, are born through modern computation machines. The source of knowledge and information has shifted from books in the library to our own living rooms through the use of the internet. Social activities are now conducted by people from different parts of the world over the internet. The easy accessibility that information technologies provide makes our world expand and at the same time shrink.

Science and technology are changing the world at an ever-changing rate. This paradigm shift caused by the development of technology is predicted to come to some sort of breaking point where all the assumptions and rules that worked before fail. Paul Budnik, an American scientist known by his alias Mountain Math Software, posted a video on this subject, titled Surviving the Singularity and based on the popular non-fiction book by Ray Kurzweil, The Singularity is Near. He said that “…the futurist and artificial intelligence researcher Ray Kurzweil, in The Singularity is Near, argues that we are approaching a change so dramatic that it qualifies as a singularity. That is a point at which all the rules and assumptions that worked before fail. This will occur when we are able to project human intelligence into a computer.” (http://mtnmath.com/movies/index.html)

The technological singularity is a theoretical future point that takes place during a period of accelerating change, sometime after the creation of a superintelligence (http://en.wikipedia.org/wiki/Technological_singularity). Superintelligence itself is defined as an artificially enhanced human brain, or a computer program or device that is much smarter, more creative and wiser than any current or past human brain. The uniqueness of superintelligence involves a scenario in which the superintelligence continues to enhance its own capability and intelligence (http://en.wikipedia.org/wiki/Superintelligence).

Theoretically, the point of singularity in human life will emerge when man is able to implant an artificial intelligence that is smarter than the human brain itself. (Artificial intelligence, the intelligence of machines and the branch of computer science which aims to create it, is a term coined by John McCarthy, an American computer scientist, in 1956; he defines it as “the science and engineering of making intelligent machines”.) At that point, the artificial
intelligence can evolve to enhance its own intelligence and make decisions without any interference from its human maker. At that point, any norms, values and logic that have been the basis for our assumptions and speculations will fail.

Irving John Good, a British statistical researcher who worked as a cryptographer at Bletchley Park, wrote about an “intelligence explosion”, suggesting that if machines could even slightly surpass human intellect, they would improve their own designs in ways unforeseen by their designers, and thus recursively augment themselves into far greater intelligences (http://en.wikipedia.org/wiki/Technological_singularity).

Raymond Kurzweil, an inventor and futurist who has been a pioneer in the fields of optical character recognition, text-to-speech synthesis, speech recognition technology and electronic keyboard instruments (http://en.wikipedia.org/wiki/Raymond_Kurzweil), argues in his book The Singularity is Near that the singularity point in human civilization is not as far away as it seems. In his lecture at The Singularity Summit at Stanford he emphasized that the exponential growth of information technology and artificial intelligence will be what facilitates the emergence of singularity in human civilization. He argues that by 2030 there will be a supercomputer sophisticated and intelligent enough to mimic the operating processes of a human brain.
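Kurzweil's exponential-growth argument is, at its core, compound doubling. A minimal sketch of the arithmetic (the doubling period and starting value are illustrative assumptions, not figures from the text):

```python
# Exponential ("accelerating returns") growth: a capability that doubles
# every 2 years grows roughly 32,000-fold over 30 years.
def capability(years, doubling_period=2.0, start=1.0):
    """Capability after `years`, doubling once per `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

print(capability(30))  # 32768.0, i.e. 2**15
```

The counter-intuitive force of the singularity argument lies in this curve: the growth over the final doubling period equals all the growth that came before it combined.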
01.5. Singularity
Singularity in Technology
Ray Kurzweil’s graphic showing the exponential growth of information technology (http://www. hodgeslab.org/2006/05/a_singular_experience_1. html).
In Ray Kurzweil's previous two books, The Age of Intelligent Machines (1990) and The Age of Spiritual Machines (1999), he tried to predict what would happen to information technologies in the near future based on the combination of four postulates:
1. That a technological evolutionary point known as "the singularity" exists as an achievable goal for humanity.
2. That through a law of accelerating returns, technology is progressing toward the singularity at an exponential rate.
3. That the functionality of the human brain is quantifiable in terms of technology that we can build in the near future.
4. That medical advancements could keep a significant number of his generation alive long enough for the exponential growth of technology to intersect and surpass the processing of the human brain (http://en.wikipedia.org/wiki/The_Singularity_is_Near).
Comic Books and Manga as a Medium of Popular Culture
This singularity theory has been the inspiration for many cultural works of art across media. The first and most famous entertainment medium to use singularity as a main inspiration is the cinema: many movies that exploit this concept have emerged, such as Terminator or Blade Runner. But even before mainstream culture like movies and cinema explored the singularity concept, a lot of sub-culture popular art had already explored it from many viewpoints. One of those pop-art forms is the comic book, or manga (the Japanese comic book).
Comics (derived from the Greek word komikos, pertaining to comedy) is a graphic medium in which images are utilized to convey a sequential narrative; the term, derived from the medium's massive early use for comic themes, came to be applied to all uses of the medium, including those which are far from comic. It is the sequential nature of the pictures, and the predominance of pictures over words, that distinguishes comics from picture books, though there is some overlap between the two media (http://en.wikipedia.org/wiki/Comic). Manga are comics and print cartoons in the Japanese language, conforming to the style developed in Japan in the late 20th century. In their modern form, manga date from shortly after World War II, but they have a long, complex pre-history in earlier Japanese art (http://en.wikipedia.org/wiki/Manga).
Comics and manga have been an influential part of popular art in modern life. A two-way feedback has been happening between comic books and lifestyle in general: comic books and manga not only portray and parody human life, but also influence trends and general interests in many aspects of it. The ideas and issues that comic books bring up have been material for contemplation and self-introspection on the norms and values that humans hold in their everyday lives. These materials have also influenced and inspired many events and innovations humans have come up with. Although most people think of comic books as a purely visual entertainment medium, comic books actually try to stimulate not only our visual sense but every other sense as well. Scott McCloud, an American comic artist and theorist on comics as a distinct literary and artistic medium, wrote the book Understanding Comics in 1993, a wide-ranging exploration of the definition, history, vocabulary, and methods of the medium of comics, itself in comics form (http://en.wikipedia.org/wiki/Scott_McCloud). In his presentation at TED (short for Technology, Entertainment, Design, an annual conference that covers a broad set of topics including science, arts and design, politics, education, culture, business, global issues, technology and development, and entertainment) in 2005, he emphasized that comic books are not created for visual stimulation only. He said, "... comic books are a visual medium that tries to embrace all the senses within it ...". He states that all the different elements, like pictures, words, symbols, and everything in between, are funnelled through a single conduit, which is vision. This input to our vision is a facility to represent sound and to understand the common properties and common heritage of this aspect, like the texture and characteristics of sound. He also mentions the balance of the visible and the invisible in comics, where the comic artist gives the reader something to see inside the comic panels, and something to imagine in between the panels. There is also another important sense that comics' visuals represent, which is time: sequences of comic panels indicate the flow of time.
There are a lot of comic books that embody the theme of technological singularity and try to explore the impact of singularity on our lives, whether its effects are physical or psychological. Some of these comics and manga are so revolutionary and influential that they deserve to be called classics. Because of the limitations of the essay, this essay will only analyze two manga titles that are classified as classics: Astro Boy and Ghost in the Shell.
Astro Boy (Tetsuwan Atom)
Astro Boy, also known as Tetsuwan Atom (first published in 1952), by Osamu Tezuka, a man who is often credited as the "Father of Anime" and is considered the Japanese equivalent of Walt Disney (http://en.wikipedia.org/wiki/Osamu_Tezuka), is a fiction about the adventures of a robot named Astro Boy. Astro Boy is one of the first famous androids (synthetic organisms designed to look and act like a real human being), and has received cult iconic status in Asia. The Astro Boy manga was inspired by Metropolis, a 1927 silent science fiction film written by Fritz Lang, an Austrian-German-American filmmaker, screenwriter, and film producer, and his wife Thea Gabriele von Harbou (http://en.wikipedia.org/wiki/Metropolis_(film)). Metropolis, the world's most expensive silent film at the time of its release, is a story set in a futuristic urban dystopia that examines the social crisis between workers and owners in capitalism. In 1965, Osamu Tezuka received an invitation from Stanley Kubrick, an influential American filmmaker, screenwriter, producer, and photographer, to be the art director of his newest film, 2001: A Space Odyssey; Kubrick had become interested in Tezuka's art style after watching Astro Boy. Unfortunately Tezuka had to decline the offer because of his own studio's schedule (http://en.wikipedia.org/wiki/Osamu_Tezuka).
Astro Boy tells a story, set in a fictional futuristic city, about an android named Astro. Astro was built by a scientist named Dr. Tenma to replace his son Tobio, who passed away in a car accident. Dr. Tenma built Astro and took care of him lovingly as if he were his real dead son, but soon Dr. Tenma realized that Astro could not replace his dead son, because as an android Astro cannot grow old or express human aesthetics. Dr. Tenma then rejected Astro and sold him to a circus owner, Hamegg. Astro was then found by Professor Ochanomizu, the head of the Ministry of Science in the fictional world, who took Astro in as his adopted son. Prof. Ochanomizu treated Astro warmly, and soon discovered that Astro was gifted with superior powers and skills, as well as the ability to experience human emotions. The Astro Boy manga was later remade by Naoki Urasawa, the acclaimed author of the manga series Monster, under the title Pluto. Pluto tells the same story as Tezuka's original Astro Boy from a different perspective and with a far darker, more realistic tone. Pluto also manages to dig deeper into the story and explore its psychological side in a more mature way. Instead of telling the story from Astro's point of view, Pluto tells it from that of a detective named Gesicht, who is one of the most powerful androids in the world (Astro, or Atom, is also one of them).
Movie poster from the 1927 silent science fiction film Metropolis (http://en.wikipedia.org/wiki/File:Metropolisposter.jpg)
The original Astro Boy, by Osamu Tezuka
The story feels a little strange at first, when the reader cannot use the normal values usually used to analyze a story, but gradually the story presents its own set of values, which is in some way correlated to normal human values. This story also raised difficult moral questions that later inspired many science fiction blockbusters such as Steven Spielberg's A.I. or Minority Report. For example, initially we cannot understand why a humanoid robot would cry on seeing a pile of machine junk; but once we grasp the notion the writer offers, that a robot, although an artificial being, is also a kind of life form that can evolve on its own and experience emotion, we can understand the situation by correlating the pile of machine junk with a pile of human bodies from a genocide in Africa. The setting of Astro Boy and Pluto clearly signifies the point of technological singularity, where machines have been programmed so sophisticatedly that they can evaluate and evolve themselves. Emotion, which has been one of the few differences between humans and machines, becomes blurred when machines start to feel emotion through experience. Reading this manga, the reader clearly understands the shift of norms and values that singularity would cause within civilization.
Ghost in the Shell
Ghost in the Shell is a Japanese post-cyberpunk manga created by Masamune Shirow and first published in 1989 (http://en.wikipedia.org/wiki/Ghost_in_the_shell#cite_note-3). Ghost in the Shell gained its international popularity when it was released as a theatrical anime film in 1995, directed by Mamoru Oshii. It also has a sequel, Ghost in the Shell 2: Innocence, produced in 2004. A third film, Ghost in the Shell: S.A.C. Solid State Society, was also created, although it is a sequel to the anime series storyline (Ghost in the Shell: Stand Alone Complex) and has no story continuity with the first two films. Ghost in the Shell has influenced many people from different professions. The Wachowski brothers, who created and directed the Matrix trilogy, mentioned Ghost in the Shell as their inspiration (http://www.warnervideo.com/matrixevents/wachowski.html). Producer Joel Silver also stated in an interview on the Animatrix, a collection of nine animated short films set in the fictional universe of The Matrix series, that he was shown the Ghost in the Shell movie during a pitch from the Wachowski brothers to indicate the style and look of the film they wanted for The Matrix (http://en.wikipedia.org/wiki/Ghost_in_the_shell#Films).
Ghost in the Shell is a story set in a future where android and bio-machine technologies have an important role in human civilization. Humans can implant their consciousness in machines and androids, or upgrade their brains so that their consciousness is connected to the Net (the world wide web). With the increasing popularity of free-access social networking websites like facebook.com, we can already predict that the world of Ghost in the Shell is not that far from reality.
Ghost in the Shell tells a story about a fictional intelligence department under the Japanese Ministry of Home Affairs called Public Security Section 9. Section 9 is led by an android with a consciousness, Major Motoko Kusanagi, who is the main character of Ghost in the Shell. The storyline and logic of the Ghost in the Shell universe are very convoluted, as the author questions the scientific meaning of consciousness and soul in a human. Reality is split into two: the reality of the world that humans live in, and the reality of the Net, where souls, or Ghosts, can be implemented. These two realities influence each other. The author brings up issues that combine two very opposite aspects of life: science, where what we know and what we can ascertain form the basis of what we believe, and metaphysical factors. One example is when the main character, who is an android, questions her own existence and her soul. Another example is when an incredibly powerful and evolved artificial intelligence can simulate and create a soul and consciousness for itself, and declares itself an independent life form. This comic book raises many fundamental questions, such as what is life, what is emotion, and what is consciousness, at the time when singularity occurs: when artificial intelligence is so developed that it can declare itself a new life form, and when soul and consciousness can live on through the Net even after the body dies.
References
Mathematical Singularity (n.d.), retrieved 26 April 2008, from http://en.wikipedia.org/wiki/Mathematical_singularity
Paul Budnik (Producer & Director), Surviving Singularity [Motion Picture], retrieved 25 April 2008, from http://mtnmath.com/movies/index.html
Technological Singularity (n.d.), retrieved 25 April 2008, from http://en.wikipedia.org/wiki/Technological_singularity
Superintelligence (n.d.), retrieved 25 April 2008, from http://en.wikipedia.org/wiki/Superintelligence
Raymond Kurzweil (n.d.), retrieved 25 April 2008, from http://en.wikipedia.org/wiki/Raymond_Kurzweil
Ray Kurzweil, The Singularity is Near, Viking Adult, New York, 2005
Naoki Urasawa's Pluto, a remake of the classic Astro Boy
Comic (n.d.), retrieved 25 April 2008, from http://en.wikipedia.org/wiki/Comic
Manga (n.d.), retrieved 25 April 2008, from http://en.wikipedia.org/wiki/Manga
Osamu Tezuka, Astro Boy, Japan, 1952
Urasawa Naoki, Pluto, Japan, 2002
Masamune Shirow, Ghost in the Shell, Japan, 1989
Mamoru Oshii, Masamune Shirow (Producer & Director), Ghost in the Shell [Motion Picture], Japan, 1995
Mamoru Oshii, Masamune Shirow (Producer & Director), Ghost in the Shell 2: Innocence [Motion Picture], Japan, 2004
Kenji Kamiyama, Production I.G, Masamune Shirow (Producer & Director), Ghost in the Shell: Stand Alone Complex, Production I.G, Japan, 2002
Ghost in the Shell movie poster
An Attempt to Seek for “Responsive Environments”
01.6. Responsive Environments
by: Mehmet Akif Cinar
42
"With this short paper the author first tries to relate the issues to himself."
Please excuse me for my humble complaint, dear "regular" architect. Without touching on any philosophical problematics of the issue, I simply want to say: "I am in a room now; me, my body and my belongings, the furniture, the walls and windows I am facing, the ground I am on; the beloved architectural elements, the architectonics as your colleagues like to name them, look like they physically exist in this space; then could you please, as a guide, enlighten me why the 'me' part is feeling so alienated, and why it looks to me that the relationships between these different classes, the thing I call 'me', those related paraphernalia, and the artefacts we humans are building, are extremely weak?" It is quite sure that there is a fundamental philosophical aspect I can sense, though questioning this part of the problem in detail, and the repetitive talk the old generation used to like, would somehow not reveal the responsibility and make me satisfied, dear architect. In the meantime this profession, at this certain time, looks to me too loose and impotent; continuing with the traditional design methods will not deliver enough. The poetical and mystical explanations of arbitrarily designed entities are not fulfilling. Dear architect, unfortunately, because I too am an architect (being named a semi-architect would be preferred in this case), and because I am studying in a "design research studio" to escape from the design approach I learned, or had an idea of, at a traditional school and in traditional late-modern architectural offices, having some honest answers to the question is more important for my own concern than questioning you as an architect anymore. You can return to playing with forms loosely and be the strongest critic ever of the ones designed by the "avant-garde" famous architects. Only one thing could be told to you before you leave: the thing you are playing with does not only have an appearance and a material aspect, but can have some behaviour towards the humans you are designing for.
This paper will try to retrace the above-mentioned issue through the readings of the "performative" title, one of the seminars the design research studio introduced. The occurrence of computers is directly related to the above-mentioned argument. In the article "The Digital Bureaucrat", David Berlinski indicates that humans' bending of material to its limits by inventing the computer radically changed the flux of the human as a life form (2000). The computer became more than a tool helping humans surpass their own limits. The interactive capacity of the computer is the most basic fact, and would be the most important criterion in terms of the interrelation of a nonliving thing to living entities (Haque, 2007). Before coming to present-day expressions: Nicholas Negroponte already found the computer to be a powerful potential mediator, in terms of its providing responsiveness and individuality, back in the early 1970s. He makes a point important for this article clearly by writing:
"I am interested in the rather singular goal of making the built environment responsive to you, individually, a right I consider as important as the right to good education."
From this audacious detection, the distinction between the notions of effective and affective spatial dispositions can be a guide to us towards a "performative"
as well as responsive approach in architecture, as Dr Kolarevic uses this classification, quoting John Rajchman, at the beginning of his article (2005). Rajchman explains by comparing these two characteristics of space: while effective space predetermines and captures the movements or activities in a setup, the second tries to unloose the movements or bodies from a predetermined setup; the undetermined relations become more important, and this results in letting the body go on unpredictable paths. Dr Kolarevic traces the emergence of "performance" in the humanities and other research fields back to the 1950s. The shift in the definition of culture, from a fixed, inactive collection of artefacts to a "web of interactions" and an active network of intertwined processes, changed the perception of culture's fixed structure and meaning. Hence the performative approach emerged in contemporary culture through the concept of a continuous cultural and social transformation. Likewise, in the architectural realm, performative architecture can be described as a continuously self-reformatting system that becomes "responsive" to the transforming social, cultural and technological contexts, and besides becomes a medium that affects this cultural pattern. Instead of a fixed programmatic spatial condition, a multiple and ambiguous space is created by the effects of the dynamics of sociocultural and technological change. As in the case of performative theory in its cultural aspect, there occurs a complex dynamic network of culture, technology and space affecting each other simultaneously, which forms performative architecture. In this way an indeterminate space condition replaces the former predetermined and pre-programmed approach.
After the above expressions, Mr Kolarevic gives the D-Tower project by Lars Spuybroek (NOX) as a literal example of performative architecture. The D-Tower, located in Doetinchem, Netherlands, carries both digital and material qualities: it works with a so-called biomorphic structure in the material world, while on the digital side a website and a questionnaire try to capture the collective feelings of the inhabitants of Doetinchem. It is claimed that these three parts are related to each other interactively. Spuybroek defines the process by mentioning that "the intensive (feelings, qualities) and extensive (space, quantities) start exchanging roles, where human action, colour, money, value, feelings all become networked entities" (n.d.). The continuous biomorphic epoxy surface of the 12 m high tower, which becomes both the structure and the skin, changes its illuminated colour depending on the "state of mind" of the residents of the city. The dominant emotion of the day is calculated by an algorithm from the online questionnaire on the website, and is mapped out on the "skin" as four symbolic colours (green, red, blue, yellow) representing the daily emotions (hate, love, happiness and fear, respectively). Apart from this project's being "performative" and a poetic monumental entity through its direct relation to human feelings, the interactivity exhibited is not strong enough to argue for. In this project there is a one-sided reaction to the input, which is the collection of the emotional states of the residents; the mutual reaction needed to call this an interaction between these agents is missing. A pre-programmed mechanical behaviour is presented as a result, which in the end does not give the indeterminate spatial output Kolarevic seeks through his article. It can simply be paraphrased as: there are four predetermined colour states changing on an amorphous "multi-performative" structure, activated by one uncertain "collective" input. Moreover, as Negroponte criticized the responsive-architecture speculations already in the 1970s, stating "when we look at responses that have been suggested (in the literature) for architectural behaviour, we find the most banal illustrations, reminiscent of second-rate light show" (Negroponte, 1972), using the change of colour as the only architectural output can easily end in misjudgement.
Besides the above project, there are some other young architects dealing with the creation of interactive spaces; instead of taking the traditional approach of designing and constructing, they choose to merge artistic and architectural skills. A look at these other architects' work, in this case the design of an installation, may give us a clearer picture of the possibility of creating a responsive system. The projects these architects design have in common an experimental approach to humans' bodily
D-Tower: the dominant emotion is mapped out on the "skin" as four symbolic colours (green, red, blue, yellow), representing the daily emotions (hate, love, happiness and fear, respectively). (http://www.nox-art-architecture.com/)
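The D-Tower's day-to-colour rule, as described in the text, can be sketched as a simple tally over questionnaire responses. This is a minimal illustration, not NOX's actual code: the emotion-to-colour table follows the project description, while the function name, the input format, and the tie-breaking behaviour are assumptions made here.

```python
from collections import Counter

# The four D-Tower colours and the emotions they stand for,
# as described in the project text above.
EMOTION_COLOURS = {"hate": "green", "love": "red",
                   "happiness": "blue", "fear": "yellow"}

def dominant_colour(responses):
    """Return the skin colour for the day's dominant emotion.

    `responses` is a list of emotion names collected from the online
    questionnaire (a stand-in for the project's real data feed, whose
    format is not documented here).
    """
    tally = Counter(r for r in responses if r in EMOTION_COLOURS)
    if not tally:
        return None  # no valid votes: no colour to display
    emotion, _ = tally.most_common(1)[0]
    return EMOTION_COLOURS[emotion]

print(dominant_colour(["love", "fear", "love", "hate"]))  # prints: red
```

The sketch makes the essay's critique concrete: the mapping is a fixed lookup driven by one aggregate input, with no feedback path from the tower back to the residents.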
DUNE 4.0: the landscape he created becomes “an extension of the humans’ activities” producing these different “moods”. (http://www.studioroosegaarde.net/downloads/ Dune_%20Daan%20Roosegaarde.pdf)
sensations and desires on a very basic existential level. Such an experience presents a phenomenological aspect, suggesting a very direct personal perception to the body. The method is mostly trial and error, using relatively new technologies, and carries the promise of these technologies having a stronger and closer connection to our bodies and senses (Bullivant, 2006). In 4dsocial, one of the issues of AD magazine, Lucy Bullivant points out the resemblance of a design by one Dutch architect to the narratives of Alice in Wonderland, to argue how that particular design takes the user into a totally different world by allowing him or her to participate in it (2007). The architect, Daan Roosegaarde, presents an audacious initial idea for his interactive landscape design Dune 4.0 by claiming, "We want it to learn how to behave and to become more sensitive towards the visitor". For the installation, two long lines of swaying reed-like fibre "bushes", containing microphones and presence sensors, take their place on both sides of one corridor of the Netherlands Media Art Institute. The landscape turns itself off when there is no visitor (the so-called sleeping mode), and when a visitor enters the scene, light begins to appear where he or she walks. The elements begin to produce lightning crashes when dense noise is present. Roosegaarde asserts that the landscape he created becomes "an extension of the humans' activities", producing these different "moods". "Responsive environments" are explained by Bullivant as spaces that, "by definition, interact with the people who use them, pass through them or by them", and that "have in a very short space of time become ubiquitous" (Bullivant, 2006). Beside this explanation, the work presented and similar interactive projects primarily focus on the direct physical perception of the visitors-users, instead of the semiotics of forms as the mainstream architectural world does. The occurrence of an emotion-based relationship of individuals towards artefacts takes place in the example. In the Dune 4.0 project both sides affect each other simultaneously: the sound and motion inputs coming from the human visitors are translated into changes in the density of the light of the "landscape" and the movement of the fibre bushes, which in turn become input for the visitors to change their behaviour and have another effect on the nature-like object. Thus these two entities continually transform each other in an interactive way.
Other than analyzing contemporary architectural examples, looking back at some historical models can also be helpful, as in the case of Cedric Price's "Generator" project, acknowledged by the architectural media of its time, in 1979, as the first "intelligent building". The project was initiated in 1976 for the client, the American paper company Gilman Paper Corporation. Price's design philosophy was in favour of an impermanent architecture minding users' participation. The proposal consisted of a grid of foundation bases and a linear track, on which a mobile crane could move and place, in multiple arrangements, a kit of parts comprised of cubical module enclosures and infill components (i.e. timber frames to be filled with modular components ranging from movable cladding wall panels to furniture, services and fittings), screens, decks and circulation. The design focus was on the details and interrelationships of all the above components, and its being a menu for facilitating change was significant. This way the users would
have been a part of the organization of the building, and thus the building would be able to satisfy the shifting needs of the client by having this rearrangement capability. In the later stage of the project, John and Julia Frazer were invited as computer system consultants, and the Frazers' research moved the proposal in a more intelligent direction. The Frazers' focus on "user-machine" relationships introduced the idea of a "building that could learn from the interaction with the user". A computer program was written by them to organize the distribution of the elements in response to the changing requirements of the users; in addition, every component carried a microprocessor, allowing the system to control its organization and then learn from the different configurations (Frazer, 1992). In addition to this user-building interaction, Frazer borrows the "boredom" of the machine concept from Gordon Pask's MusiColour, which listens to the musical performer and gives coloured light outputs according to the rhythms and frequencies, yet if the inputs are too static and consistent the machine gets bored and listens to frequencies other than the performer's (Haque, 2007). Likewise, if the configurations of the components are too stable, the Generator becomes "bored" and suggests alternative arrangements based on its own experience. This approach includes a learning capability, which brings the potential of finding the best-performing organization for its user. There happens to be a mutual relationship between the different agents (i.e. site, building and the user): the inputs become outputs, and they affect each other simultaneously. In addition, different levels of interaction and relationship emerge, as Gonçalo M. Furtado points out in his research examining the notes and letters of Frazer and Price (2007). Firstly, Frazer mentions that the process includes an 'interactive' relationship between 'architect/machine', assisting the drawings and additional information for the parallel developments. Secondly, there occurs an 'interactive/semi-automatic' relationship of 'user/machine', carrying out instructions to provide operative drawings to the operators, and then a scheduling and inventory package for the Factor, enabling it to act as a functional critic. The next level of relationship comes from the previous one, making the activities of the users possible in this system; a further step is taking those activities and configuring the components under a set of rules to carry the changing requirements. Lastly, an overall behaviour arising from all those relationships emerges, as Frazer states: "to generate unsolicited plans, improvements and modifications in response to users' comments, records of activities, or even by building in a boredom concept so that the site starts to make proposals about rearrangements of itself if no changes are made. The program could be heuristic and improve its own strategies for site organization on the basis of experience and feedback of user response" (2007). The whole process becomes a challenge to the history and the future of architecture: a building which has its own life and intelligence, making conscious
Generator model (1978-80): plastic, metal, plastic-coated wires, and self-adhesive paper dots (10.8 x 78.7 x 52.1 cm) (http://www.moma.org/collection/browse_results.php?criteria=O%3AAD%3AE%3A7986&page_number=4&template_id=1&sort_order=1)
Initial Generator Model (1978-80) (http://www.activesocialplastic.com/2007/08/cedric_prices_generator.html)
decisions by being both interactive and responsive to its users and the site. Hence a conscious, life-like entity is created, and Frazer's use of the creative title "An Evolutionary Architecture" sounds convincing. Several different relations and interactions overlay and make up the complex architectural proposal in this example.
The above projects imply that the responsive-environment issue takes us to another level: "intelligent spaces". When Negroponte writes about intelligent environments, he gives the metaphor of a new member of the house in the family. A context-less regulatory control system, receiving some signals with its sensors and giving some responses, is not enough to generate a responsive system, as the so-called interactive/responsive projects on offer nowadays show. The physical environment also needs its own intentions and purposes, like one of the family members. The space can be an intelligent system which can also learn how to behave from the interactions of its users, as the Generator project offers. Lucy Bullivant advises us by saying, "if intelligent spaces were truly intelligent, we might not like them, because we want them to be intelligent but acquiescent", and for this reason she prefers to present projects taking place between the not "purely reactive" and the not "entirely predetermined" (2006). On the contrary, as an individual, and also as an architect, I am looking forward to seeing the limitless, indeterminate, even wild responsive environment, and to living in it. A built creature-like entity having its own mind and character, responding not only to humans but to anything it can sense, is much more promising than an algorithm-driven, fake-reactive, so-called responsive system.
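The Generator's "boredom" rule discussed above, where a layout that stays unchanged for too long triggers an unsolicited proposal, can be sketched in a few lines. This is a toy reconstruction under stated assumptions, not the Frazers' actual program: the threshold, the polling interface and the random swap proposal are all invented here for illustration.

```python
import random

# Number of polling cycles without change before boredom sets in
# (an assumed value; the original system's timing is not documented here).
BOREDOM_THRESHOLD = 3

class Generator:
    """Toy model of a building that proposes its own rearrangements."""

    def __init__(self, layout):
        self.layout = list(layout)   # positions of the cubical modules
        self.unchanged_cycles = 0

    def poll(self, observed_layout):
        """Called once per cycle with the currently observed layout.

        Returns None while the users keep changing things, or a
        proposed alternative layout once the system gets "bored".
        """
        if list(observed_layout) == self.layout:
            self.unchanged_cycles += 1
        else:
            self.layout = list(observed_layout)
            self.unchanged_cycles = 0
        if self.unchanged_cycles >= BOREDOM_THRESHOLD:
            return self.propose()
        return None

    def propose(self):
        """Bored: suggest swapping two modules as an unsolicited plan."""
        proposal = self.layout[:]
        if len(proposal) < 2:
            return proposal  # nothing meaningful to rearrange
        i, j = random.sample(range(len(proposal)), 2)
        proposal[i], proposal[j] = proposal[j], proposal[i]
        return proposal
```

Polled three times with an identical layout, the model returns a shuffled alternative; any change by the users resets the boredom counter, mirroring the mutual input-output relationship the essay describes.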
References:
Berlinski, D. (2000). The Advent of the Algorithm. San Diego
Negroponte, N. (1975). Soft Architecture Machines. MIT Press
Frazer, J. (1995). An Evolutionary Architecture. London: AA Press
Kolarevic, B. (2005). Performative Architecture: Beyond Instrumentality. Spon Press
Bullivant, L. (2006). Responsive Environments. London: V&A Publications
Haque, U. (2007). The architectural relevance of Gordon Pask. AD: Architectural Design. London: Wiley Academy
Bullivant, L. (2007). Alice in technoland. AD: Architectural Design. London: Wiley Academy
Furtado, G. (2008). Cedric Price's Generator and the Frazers' systems research. Technoetic Arts: A Journal of Speculative Research. Intellect Ltd
D-tower (n.d.). Retrieved May 28, 2009, from http://www.arcspace.com/architects/nox/d_tower/
Roosegaarde, D. (2006). Dune 4.0. Retrieved May 28, 2009, from http://www.studioroosegaarde.nl/index.php?project_name=Dune
01.6. Responsive Environments
Principles
Translating statistical data into architecture with algorithms
01.7. Principles
by: Ardes Perdhana
Abstract

This writing is driven by the need for criteria for implementing certain algorithms in architecture and urban design. What are the principles that can be used in order to take a certain algorithm from statistical data into certain aspects of architecture?

Introduction

There is a stage where we have already understood the concept of the algorithm and the theory explaining its relation to several aspects of nature and of our life, but still haven't found a way to implement these ideas in the design process of architecture. The question is actually simple: how can we place and arrange these algorithms in the most suitable way in architecture, according to the basic principles of those two different but connected realms? What are the principles connecting the two? This writing is an observation that tries to find these principles in two main source books: Critical Mass: How One Thing Leads to Another, written by Phillip Ball, and Philosophy of Science: Contemporary Readings, edited by Yuri Balashov and Alex Rosenberg. Some additional source books are used to support arguments taken from the two main ones. The first reading explains the history of the Physics of Society. Along Phillip Ball's explanation of the history of this particular discipline, there are some implicitly stated principles, usually used by the scientists who are categorized into the Physics of Society. The observation actually focuses
on how these scientists connect one fact to another, in this case from statistical data to the graphic of a scientific model (an algorithm), in their research. At the end, this writing suggests using the principles found, not to connect statistical data to a certain algorithmic model again, like those scientists answering their scientific enquiries, but as the basis for connecting a certain algorithmic model to aspects of architecture. This is the mission: to continue the quest.
(above) area of discussion; (below) suggestion
At this stage, we can ask the next questions: what is the difference between science and philosophy in terms of their areas of discussion? What are the questions that cannot be answered by science, but can be satisfyingly answered by philosophy? Yuri Balashov and Alex Rosenberg, in the introduction to Philosophy of Science: Contemporary Readings, explain two types of questions that are in the domain of philosophy, in order to have a clear, separate definition from science: "…First, the questions that the sciences - physical, biological, social, behavioral - cannot answer now and perhaps may never be able to answer. Second, the questions about why the sciences cannot answer the first lot of questions."
Retrospective observation on the philosophy of science

A reflection on the history of the philosophy of science becomes important at this stage, because it records conditions similar to those at the main core of this writing: finding the arguments for connecting two or more scientific facts. It will give us the reasons that we cannot obtain from the realm of science alone.

This definition of philosophy's domain is brought into their book when the editors explain a philosophical movement commonly known as Logical Positivism in the history of the Philosophy of Science. This movement was a response to a degradation of philosophy's contribution at that time in comparison with other knowledge, especially science. One of the reasons for this degradation is that there was no awareness in understanding the nature of science and philosophy. Philosophers tended arrogantly to refuse to accept that they were being negative by neglecting their competitor in finding the truth: science. It meant that the mapping of the relation between science and philosophy was not firm. Logical Positivism finally created some basic understanding that defined a new relation between philosophy and science. One important text that explains this movement can be found in the same book edited by Balashov and Rosenberg: Moritz Schlick's The Future of Philosophy. Schlick was trying to define the difference between science and philosophy, as part of a suggestion about what philosophy should do in the future in order to survive. He explains,
Along the observation of these principles from the first main source book, some doubts and questions arise regarding their lack of persistence and precision as scientific principles. They feel too "relaxed" and too "pragmatic" as a scientific basis. This is because they are already outside the domain of science itself. Philosophy of Science, the second main source book, explains this condition of needing reasons to answer certain questions in a research, questions that cannot be answered by facts from the science field alone. More arguments from the discipline of philosophy are needed as the basis for evaluating these "relaxed" and "pragmatic" principles. In respect to this condition, it is better for us to discuss some important statements from the history of the Philosophy of Science before turning to the Physics of Society. This sequence will make the topic easier to understand. Hopefully we will then have certainty in implementing algorithms in architecture.
"…In fact, before I go any farther, let me state shortly and clearly that I believe Science should be defined as the pursuit of truth and Philosophy as the pursuit of meaning." Further on the separation between philosophy and science, Schlick explained that scientific questions can usually be answered by propositions. When we want to know the meaning of a word, we usually try to find it in a dictionary. And if there is another word in that definition that we still do not understand, we just keep looking in the dictionary, turning to other pages, and so on. The pages in this case are an analogy for propositions. A philosophical question, even though it is correct according to English grammatical rules, cannot be considered a genuine question. How can we provide a proposition to answer a question like "Is blue
more identical than music?" We cannot. This is because philosophy is actually a mental activity; it means we need some time to finally experience (be shown) it for ourselves. In the end, he suggested that philosophy should be kept to a specific task, namely within scientific research. The philosopher of the future has to help a scientific research whenever it has difficulties in connecting its facts, by giving some logical and conceptual analysis. In this spirit, he named his philosophical movement Logical Positivism. Another interesting text from Philosophy of Science is The Pragmatics of Explanation, written by Bas van Fraassen. It is important because it brings in the arguments, from the realm of philosophy, for using hypotheses in research. Even though we now consider that philosophy can be connected to scientific research, in order to give it a specific task rather than an endless debate on what philosophy is, Bas van Fraassen argues that it is impossible to have a complete explanation of a scientific research itself. There must be some part of the scenario of a scientific research that cannot be explained with an already existing theory. On this matter in particular, he argues,
"…When is something explained? As a foil to the above three ideas, let me propose the simple answer: when we have a theory which explains. Note first that have is not have on the books; I cannot claim to have such a theory without implying that this theory is acceptable all told."

He calls this type of explanation a false theory. It is quite common among the big names of science to explain research by using a so-called false theory, or hypothesis. Van Fraassen also quotes a statement from Darwin on this matter:

"…It can hardly be supposed that a false theory would explain, in so satisfactory a manner as does the theory of natural selection, the several large classes of facts above specified."

With this understanding, we realize that it is impossible to keep philosophical questions out of scientific research, because they truly exist. That is why we should be able to recognize philosophical questions. Without this ability, we will struggle in doing research, setting up the scenario of the research all over again from the beginning, over and over, triggered by the need to answer such questions. What we should do with these philosophical questions is either postpone them, because we probably will not have the answer until we finally have the experience; verify them using logical and conceptual analysis, which is the area of the philosophers of science mentioned above; or simply use assumptions (hypotheses). The third attitude is the one we will find quite often in the observation of the history of the Physics of Society in the next chapters.

Retrospective observation on the history of the physics of society

Phillip Ball, in his book Critical Mass: How One Thing Leads to Another, explains the history of the Physics of Society. Thomas Hobbes, who wanted to have a law that governs social behavior, started it. His aim was to build a political theory on the same mechanical worldview that Newton used in his theory of gravity. He wrote his book, Leviathan, in response to a condition of social government that neglected the nature of the human. The quest was to define simple laws of human behavior with the simplicity of Pythagoras' law in geometry. It has been continued by several generations of scientists, up to the present day, when we finally consider disciplines such as quantum physics. Throughout Phillip Ball's account of the journey of this particular discipline, there are some implicit principles that the scientists use in connecting one fact to another in order to complete the quest described above. This writing tries to bring out some of the selected principles found in his book that are worth investigating and that, finally, can be used in bringing algorithms into architectural research.

Description, not prescription

Each algorithm has its own character. So does architecture: it has its own rules. The importance of understanding these two systems first, before finally connecting them in a design process, is that it makes the next stage of the process easier. This is because we do the exploration with a basic understanding that allows us to control the systems much better, still in a coherent way according to their natures. Ball, in explaining the history of the physics of society, mentions this:

"…Political theorists tend to concern themselves with what they think ought to be; scientists concentrate on the way things are. The same is true of the new physics of society: it seeks to find descriptions of observed social phenomena and to understand how they might arise from simple assumptions."

"…Science provides descriptions, not prescriptions."

From the quotations above, Ball points to the difference between the political theorist and the scientist when Hobbes started it in the 17th century by bringing atomic theory from Greek philosophy and Newton's theory of gravity into a social utopia. The political theorist strongly forced his ideal condition on the society of the time, disregarding the nature of that society itself. The scientist, on the other hand, always tries to understand and to describe the nature of things, not to prescribe. This fact underlines the importance of having a system that is not too rigid and still leaves room for assumptions when bringing facts or data into one's own field of work. When we are dealing with technical problems in the scripting field, there are many aspects to negotiate. In respect to this principle, we should steer the exploration by giving first priority to the simplest definition of the algorithmic model we use. One example of this negotiation is the following: if we wanted to use the L-system, we should pay most attention to its passing of values, because that is the nature and the simplest definition of the system. When we have a series of assumptions to negotiate, this aspect should take first priority. This principle can suggest the proper attitude toward the algorithmic model we are going to use.

Ignore the detail

Through the history of the Physics of Society, we can find the importance of statistical data. It is the parameter for deciding which aspects are similar before we finally connect them. In using statistical data in many new applied fields, a common mistake is to focus too much on the details, that is, on the numbers shown. The focus should instead be on the overall trend of the behavior. Stephen Wolfram, in his book A New Kind of Science, mentions this when explaining the tendency to use the simplest system or model built from the featured effects relevant to the process:

"…And in my experience by far the best first step in assessing a model is not to look at numbers or other details, but rather just to use one's eyes, and to compare overall pictures of a system with pictures from the model."

Imagine we have two systems we want to connect. What we should do with their statistical data is to categorize or highlight certain data representing

Two different kinds of phase transition. In a critical (second-order) transition, a property falls gradually to zero as a 'control parameter' (such as temperature) is changed (a). In a first-order transition (such as freezing or boiling), such a property (here the density of a fluid) changes abruptly at the transition point (b).
The Ising model of magnetism (a) assumes that the 'magnetic needles' can point in only one of two opposed directions. (b) There are two types of grid site in the model: occupied by a particle (liquid) and unoccupied (gas). Each is shown above and below the critical temperature.
Picture (a) "Vortex motion in cells of the slime mould Dictyostelium discoideum". (b) "Some fish swim in the same kind of vortex pattern". (Ball, P., 2004, Critical Mass, p. 150)
A fractal growth model developed by Michael Batty and Paul Longley can give a reasonable imitation of the shape of the Welsh city Cardiff, confined by rivers and sea. (a) shows the real city, (b) is the computer simulation 'grown' from the model. (Ball, P., 2004, Critical Mass, p. 187)
“Bifurcation Diagram” (Ball, P., 2004, Critical Mass, p. 132)
important data in a graphic, and see whether there is any similarity in their shapes in the sketch. A further argument that can be brought in as a basis for this principle is Ball's explanation of the concept of universality in science. It is the underlying argument for claiming a similarity, out of a comparative study of behavior, between two different properties when they undergo a so-called phase transition from one stage to another. He argues:

"…Physicists call this universality - a term which aims to convey that there are some processes in the world for which the details simply do not matter."

"…What this suggests is that phase transitions are a generic phenomenon: they happen in the same way for a wide range of apparently different systems."

In short, we have to pay attention to the overall behavior shown by the statistical data. Take some important data that describe the simplest definition of the algorithmic model and sketch it as a graphic. Compare it with other important data given the same treatment. If they look similar, then they are similar; do not mind details such as their difference in scale. The overall similarity is what we want from the graphic comparison study.

Asymmetry, the stable condition of a growing living thing

There has been a consideration of the similarity in pattern between two living organisms at different scales. We can take, for example, the similarity between the pattern of a population of fish swimming in the sea and that of a population of cells in a slime mold. These two living organisms are at very different scales, but their behavior patterns are the same. Physicists, unlike scientists from other disciplines who focus too much on their own field, have been trying to seek out the interactions between things. In respect to the similarity in patterns described above, physicists have been asking how these things connect and relate to each other. Later on we consider the statement that the interactions happening in the local environment of a living thing are actually what give the impact to its global-scale behavior. This gives us the understanding that if we want to control a big-scheme problem in our everyday life, we have to control it from the local interactions related to that particular big-scheme problem. And there is an interesting fact from the pattern investigation of cellular organisms: all living things actually have an asymmetric pattern when they are stable. Ever since, physicists have been trying to investigate this symmetry-breaking approach in their research on interactions between things through their graphic models. They want to know which parameters finally matter in a particular problem, the ones they should control in the local interactions. From the previous chapter, we took a way to relate one aspect to another through the similarity of their graphics of highlighted statistical data. At this stage, with the principle described in this chapter, we should find the parameters that matter in breaking the symmetry of a model interaction between two local aspects in the particular problem of our research agenda.
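The "ignore the detail" principle above can be put in code. The sketch below is a minimal, hypothetical illustration (the function name, bin count and sample numbers are all assumptions, not from Ball or Wolfram): each data series is rescaled to ignore its absolute magnitude, then reduced to a coarse rise-and-fall signature, so two systems at wildly different scales can be compared by overall shape alone.

```python
def coarse_trend(series, bins=4):
    """Reduce a data series to a coarse up/down signature, ignoring scale."""
    lo, hi = min(series), max(series)
    span = (hi - lo) or 1.0
    normalized = [(v - lo) / span for v in series]  # rescale to 0..1
    # resample into a few bins so only the overall shape survives
    step = len(normalized) / bins
    coarse = [normalized[int(i * step)] for i in range(bins)]
    # signature: does the curve rise (+1) or fall (-1) between bins?
    return [1 if b > a else -1 for a, b in zip(coarse, coarse[1:])]

# two hypothetical 'systems' at very different scales,
# e.g. fish-school counts versus slime-mold cell counts
fish = [12, 40, 95, 120, 80, 30]
cells = [1500, 5200, 11800, 15400, 9800, 4100]

# the numbers differ by orders of magnitude, but the overall behavior matches
print(coarse_trend(fish) == coarse_trend(cells))  # True: same rise-and-fall shape
```

"If they look similar, then they are similar": both series reduce to the same signature [rise, rise, fall], which is the kind of overall-picture comparison the quoted passage from Wolfram recommends over inspecting the raw numbers.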
Conclusion

One of the difficulties in research is recognizing which questions are worth investigating and which are not. This is critical because it decides the effectiveness of the research itself. In our case, there have been many questions about ways to connect a certain algorithmic model to aspects of architecture, questions that make us set everything up all over again from the beginning.
The retrospective of the history of the Philosophy of Science gives us a method to classify the questions in a scientific research so as to avoid the problem mentioned above. And the retrospective of the history of the Physics of Society gives us some examples of hypotheses which we can use in connecting algorithms to architecture, and which have led to the well-developed state of that discipline today. In this manner, hopefully the actual work in a research studio of architecture will be more productive and have a clear destination.
Reference List
Balashov, Y., & Rosenberg, A. (2007). Philosophy of Science: Contemporary Readings. New York: Routledge
Ball, P. (2005). Critical Mass: How One Thing Leads to Another. London: Arrow Books
Wolfram, S. (2002). A New Kind of Science. Canada: Wolfram Media Incorporated
‘Team’ as a design strategy
01.8. 'Team' as a design strategy
by: Muhammed Shameel
The designer has always looked to various subjects as sources of inspiration for ideas, theories and innovations. Most of these subjects have been associated with mankind all this while; nevertheless, they have not been, and will not be, exhausted in educating us. The growth of education and knowledge, as well as constant research and development, has helped us understand these subjects better. As a result, design has evolved, producing various prototypes and innovations. Apart from the constant update of information, the various perceptions of the same subject, with the same data and tools available, have also helped designers come up with innovative designs. One of the basic design tools taught in undergraduate programs of architecture is how to draw perspective drawings. The linear technique is based on factors like eye level, vanishing points, the picture plane and foreshortening. It imitates the movement of light between the viewer's eye and the picture plane, based on the rules of optics. From the placement of these factors on the drawing board, one can arrive at a perspective drawing of the object placed in the scene. The two-dimensional image arrived at only shows the details of the object that are visible from the position of the viewer's eye. When the position changes, the detail changes too. In other words, the information conveyed is limited compared to the three-dimensional object. This is valid in the physical environment as well: one can easily be deceived if the complete three-dimensional information is not conveyed. This can be tackled easily by walking around the object, rotating the object, or even by placing mirrors with respect to
the viewer's position. The first would consume time, whereas the latter setups would deliver the required information at a given instant. Today, advanced visualizing software can be used to obtain this information. Specific software can also retrieve three-dimensional information about the object at a given time when the object is subjected to transformations over a period of time. Depicting the subject from multiple viewpoints was the main characteristic of the early 20th-century avant-garde movement Cubism, in which the subjects are broken up, analyzed and then reassembled in an abstract form. The result was the summation of distinct instances on the canvas. Pablo Picasso, who pioneered this movement, called this the "sum of destructions". (Alexander Boguslawski. 1998-2005. Cubism. Retrieved December 25th 2008, from Rollins College. Website: http://www.rollins.edu/Foreign_Lang/Russian/cubism.html) The cubist work therefore has a greater context. The surfaces depicting the object intersect at random angles and merge into the background, reducing the sense of depth and creating an ambiguous space. The Cubism movement was perhaps a reaction against the formlessness of Impressionism, which was mainly characterized by depicting realistic scenes of modern life. (Impressionism, Cubism. (n.d.). Retrieved December 25th 2008, from http://en.wikipedia.org/wiki/Impressionism and http://en.wikipedia.org/wiki/Cubism) The artists of the movement believed that their role was to give a symbolic representation of the idea and not to depict the short-lived appearance of things. As a result, the subjects were broken down and represented by basic shapes and elements.
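The linear perspective construction described above reduces to a few lines of code. The sketch below is a minimal, hypothetical pinhole projection (the function name, eye position at the origin, and picture-plane distance are assumptions for illustration): a 3D point is mapped onto the 2D picture plane, and foreshortening falls out of the division by depth.

```python
def project(point, eye_distance=1.0):
    """Project a 3D point (x, y, z) onto a picture plane at z = eye_distance.

    The viewer's eye sits at the origin looking down +z; dividing by depth
    makes farther objects draw smaller, which is foreshortening.
    """
    x, y, z = point
    if z <= 0:
        raise ValueError("point must be in front of the viewer")
    scale = eye_distance / z
    return (x * scale, y * scale)

# the same 1-unit-offset point, near and far: the far one is foreshortened
near = project((1.0, 0.0, 2.0))   # -> (0.5, 0.0)
far = project((1.0, 0.0, 10.0))   # -> (0.1, 0.0)
```

Points on lines receding parallel to the viewing direction converge toward (0, 0) as z grows, which is the vanishing point of the construction; the projected image depends entirely on the assumed eye position, which is the limitation of the single viewpoint discussed here.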
Though cubist works are abstract in form, perspective guidelines are followed in most of the paintings. When a perspective drawing is looked at, the viewer becomes part of a visual field that extends out of the drawing. Multiple viewpoints would therefore lead to multiple imaginary fields, creating a partial three-dimensional space. This makes it dynamic and interactive. The emphasis here is not on the method but on the context and the information provided at an instance. A visual effect often used in movies, referred to as 'Bullet Time', demonstrates this idea in a more detailed manner. The effect is achieved by extreme permutation of time and space on different scales simultaneously. Though the concept was first observed in the movie Blade (1998), directed by Stephen Norrington, the true Bullet Time effect was introduced through the Wachowski brothers' movie The Matrix. One of the exceptional scenes in the movie is when the lead character, Neo, dodges bullets fired at him by an agent. The scene setup consists of a line-up of more than 400 computer-controlled SLR cameras along a predetermined path. The cameras are set to shoot at an instant or programmed to click at regular close intervals. When the scene is in action, each of these cameras generates the images required for an animation with a higher number of frames per second. Special morphing software is then used to morph the images from the different cameras into one animation sequence. In addition to this, the frames-per-second aspect of various sections of a scene is varied to control the speed of the action. The combined result is a unique experience where the natural rules of gravity and nature are defied by setting the camera path to move in a real-time setting while the speed of the action is varied. These reference points can also be termed 'vantage points'. A general definition would be 'a place from which something can be viewed'. (Definition of Vantage Point. (n.d.). Retrieved December 27, 2008, from www.wordnet.princeton.edu) With reference to animations and movies such as Vantage Point (2008), directed by Pete Travis, it can also be defined as 'the frame of reference of the narrator in a story'. (Definition of Vantage Point. (n.d.). Retrieved December 27, 2008, from www.pearsoned.ca/text/flachmann4/gloss_iframe.html) The movie revolves around an assassination plot against the President of the United States. The actual time length of the story is 23 minutes, but it is retold six times, each time from the viewpoint of a different character. The viewer's idea of the plot keeps changing as each character's viewpoint is shown. Each of these stories integrates towards the end, ultimately revealing the actual culprit and what actually occurred. The context and the information thus conveyed are very rich and complete. The film's poster gives a graphical idea of the movie, where each of the vantage points contributes to the unknown culprit. It is not necessary that one should adopt multiple viewpoints to retrieve more information.
subjects. A common example of this methodology is the assumptions adopted while proving a theorem or a formula. Without these assumptions, the results can be different or might not be arrived at. The following image is the logo of an Indian television channel named Kairali. There is more than one way of interpreting this image, each of which emerges as you read the basic construction lines. The sequence in which the lines are read plays a role in the various interpretations acquired from this image. A similar approach of multiple interpretation can be seen even in terms like 'architect' and 'design'. Webster's comprehensive dictionary defines 'architect' as 'one whose profession is to design and draw up the plans for buildings, etc., and supervise their construction'. However, 'architect' is used in many other ways. A software engineer can be termed the architect of the software that he designs. Similarly, a military officer who decides on the deployment of troops or comes up with a strategy for a military operation is often called an 'architect'. Another example of multiple interpretation is illustrated by Deleuze in the text 'Lines of Work: On Diagrams and Drawings', which states that the link between lines and diagrams should be understood as what it does not represent. (Gilles Deleuze & Felix Guattari. (1987). A Thousand Plateaus: Capitalism & Schizophrenia. Transl. & foreword by Brian Massumi) This definition actually rules out all other possible interpretations of what the diagram represents, and is perhaps better than the conventional straightforward definition it could take. These definitions cannot be disagreed upon, but one
tends to choose the closest definition, or the one most appropriate to the background one adopts. The background is built on the information and data available at the time. With advances in information and knowledge, these opinions often take a different direction, thus influencing the end result. The result may vary to the extent that it is totally contradictory to what was believed before. For example, recent studies have shown that moderate consumers of alcohol have less chance of heart attack and a better probability of longevity than abstainers or heavy drinkers, while it was believed earlier that any consumption was harmful. (Alcohol and your health. (n.d.). Retrieved December 26, 2008, from http://www.healthchecksystems.com/alcohol.htm) With constant research and development in various fields, the understanding, and therefore the interpretation, of these subjects is bound to be different in the future. The factor that we are perhaps in control of is the time we take to change our opinion. These interpretations, however, are not arrived at at one particular time; they happen over a period of time. Most of the subjects that we study today are ones that were born along with mankind. Subjects like ants, bees, trees, nature and the universe still offer a lot to learn from, as is evident from the constant discoveries made today. The latest prototype from BMW, called GINA, brings us to an important discussion. GINA is an acronym for 'Geometry and Functions in "n" Adaptations'. (GINA Light Visionary Model. (n.d.). Retrieved January 2nd, 2009, from http://www.bmwusa.com/Standard/Content/AllBMWs/ConceptVehicles/GINA/Default.aspx)
During its development, designers questioned the predefined rules of the conventional model of a car from as many perspectives as possible. The prototype's highlight is that it has a basic chassis, a minimal metal framework, around which a fabric is wrapped to become the skin of the car. The GINA has a total of seven pieces of fabric that are manually stitched to form the skin, and it takes about two hours to put it onto the frame of the car. The conventional skin of an automobile consists of static steel sheets and plastic. The skin, or interface, of the GINA model, along with the hydraulically enabled metal frames, makes the prototype dynamic, adapting to various user requirements and reflecting human emotions. However, a similar idea could have been inspired by a car covered with a plastic-based fabric to protect it from dust when not in use. The question here is not about the concept, but about why BMW has taken all this time over the development of the GINA, when we have been looking at cars covered with fabric probably since their invention. One would agree that there were technical limitations to the production of this concept; even so, the idea as such has only been developed lately. The design tools of today, consisting of advanced visualizing software and innovative research techniques along with information technology, have helped us reduce the time taken to come up with innovative concepts. To name a few, Autodesk's Maya, 3D Studio Max, and scripting and modeling software like Processing and Rhinoceros have been at the forefront among visualizing tools. In the research and development area, innovative methodologies have contributed to achieving results at a quicker pace, keeping multiple interpretations in mind. Among these methodologies, one significant strategy, and probably the most successful one, is to work on a team-based profile. Bringing professionals and expertise from different fields into one design problem has facilitated multiple opinions and has led to better production and innovativeness. Each of the members constituting the team would have multiple opinions themselves. These opinions or perspectives, acting as 'vantage points', multiply with an increase in team members. This can probably lead to chaos. However, as Bruno Latour suggests in his book Laboratory Life: The Construction of Scientific Facts, one would agree that design results from chaotic and disorganized ways. (Brett Steele. (2003). Disappearance and Distribution: The architect as machinic interface, Hunch: The Berlage Institute report No. 6/7. (pp. 422-436). Rotterdam.) This, however, does not suggest that more members would result in a more productive design. It is vital that the team members are comfortable with each other and that a common platform, in other words an interface, is set up to take advantage of a team-based setup. For technical and communication requirements, computers, software and information technology meet the basic requirements of the interface.
57
01.8. ‘Team’ as a design stratergy 58
Each of the Vantage points would be based on various backgrounds, which can be because of professionals from various fields, working together. In some cases lifestyle and culture also plays a significant role in contributing to opinions. Vantage points, together as a whole, contribute to the identity of the architectural studio. The design outcome through such a team based setup would lead to emergent behavior which otherwise have a greater probability of a linear outcome. The distribution of work with regards to the necessary skills and knowledge needed to operate various design tools as well better management, are other advantages that the team based setup offers. A Team based setup might not be a new strategy but the objectives of such a setup are what are innovative. Today along with globalization, teams span over countries and no more limited to local geographical boundaries. The Teams take a decentralized network profile working simultaneously without placing themselves on a hierarchical platform. This was one of the initial objectives of the Team EXTASIA who came together for the competition project, BLUR pavilion of EXPO’01, Switzerland. (Elizabeth, Diller. (2002) Blur : The making of nothing. New York. Harry N. Abrams.) Many organizations have already adopted the network model of distributed office, beginning from the 90’s. Each node of the network can be assigned to take a particular role. At the same time it is not necessary that every node of the network that is established has to participate in a design problem. The nodes can be selected based on the project requirement. The distributed offices are also setup on the basis of project sites, clients and commissioning bodies that now spread around the globe.
With design problems and requirements becoming more complex by the fusion of different functions along with the increase in the scale of the design, it has become difficult for an individual to address it all by himself. Teams have therefore become more of a requirement. The role of the ‘Designer’ as well as the definition should perhaps be redefined to refer to the team rather than the individual.
Bibliography
http://cghs.dadeschools.net/african-american/twentieth_century/cubism.htm
http://www.rollins.edu/Foreign_Lang/Russian/cubism.html
The Matrix. (1999). Village Roadshow Films (B.V.) Ltd / Warner Brothers. Clips from "What is Bullet Time" documentary, The Matrix DVD & Video.
Astor, Robert. "Cubism." SIRS Renaissance, Fall 1996, SIRS, Inc.
Collier's Encyclopedia. New York: Collier's Incorporated, 1891. Vol. C, pp. 546-547.
http://www.healthchecksystems.com/alcohol.htm
Diller, Elizabeth. (2002). Blur: The Making of Nothing. New York: Harry N. Abrams.
www.autoblog.com
www.bmwusa.com
http://www.designboom.com/weblog/cat/16/view/3084/bmw-gina-light-visionary-model-concept-car.html
http://en.wikipedia.org/wiki/Cubism
http://en.wikipedia.org/wiki/Impressionism
http://en.wikipedia.org/wiki/Post-Impressionism
02 Phase 1
Research Lab
02.0 Studio Brief
Wetware is interested in methodologies that can narrow the gap between matter and information. Such designed systems can learn and adapt to complex host conditions, making them efficient and self-sustaining. Large populations and multiplicities are the key to this: agents with simple instructions are interlinked through micro-transactions in a vast field. This intelligence has the ability to rewrite codes within the system, leading to sustainable design solutions. The gene bank is deployed in order to be tested as poly-scalar coastal infrastructure within the high-pressure flooding zones of the London Estuary. WETWARE performs as a highly articulated coastal infrastructure tasked with proliferating modulated, networked formations of constructed or modified land in flooding zones. At the micro scale, it performs as a convergence of novel material, structural, organisational and aesthetic behaviours.
L-System Symmetrical Behaviours
02.1.1 L-System
A series of fractal catalogues is constructed from line segments using rules specified as drawing commands. Starting with an initial string, the axiom, transformation rules are applied a specified number of times, i.e. iterations, to produce the final command string, which is used to draw the image. Six algorithmic growth phases are shown in these catalogues, the 1st, 2nd, 3rd, 4th, 6th and 9th generations, to witness how complex forms emerge from sets of simple rules and to understand the nature of L-system growth logics. An apparent symmetrical behaviour emerges from the rule used in these first studies, which makes the branches grow at the same angle in two opposite directions on the 2D plane.
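The string-rewriting core behind these catalogues can be sketched in a few lines of Python; the rule F → F[+F][-F], which branches at equal and opposite angles, is an illustrative symmetrical rule rather than the exact one used in the catalogues:

```python
def expand(axiom, rules, iterations):
    """Apply the L-system production rules to the command string."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Symmetrical branching: "+" and "-" turn the drawing cursor left and
# right by the same angle; "[" and "]" push and pop its state.
rules = {"F": "F[+F][-F]"}
print(expand("F", rules, 2))
# -> F[+F][-F][+F[+F][-F]][-F[+F][-F]]
```

Each generation triples the number of drawn segments, which is why the later-generation catalogues become so dense.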
L-System Asymmetrical Behaviours
02.1.1 L-System
A rich variety of asymmetrical behaviour is shown in the following catalogues through a simple tweak of the rule sets. Instead of showing the growth steps as in the previous studies, 5th and 7th generations are compiled here to witness the remarkable evolution of the patterning. The L-system growth rule is shaped by one of the two branches remaining passive at a fixed angle while the second changes its angle gradually on the 2D plane.
L-System Behaviours in Space 3-Dimensional L-System Branching Geometry as Subject
02.1.2 L-System
The 3rd dimension is introduced into the studies by allowing growth to occur along the Z axis in addition to the former X and Y axes. A variety of 3D growth behaviours and a set of intricate geometrical outcomes are shown in the following catalogues. Five algorithmic growth phases are presented, including the 1st, 2nd, 3rd, 4th and 6th generations, along with details of the fabric of the geometrical fields. Several L-system growth rules and various angles are tested and noted on the left side of the compiled samples.
02.1.3 L-System
L-System Circular Diagrams
The outcome patterns of the L-system are also diagrammed using translucent bubbles on both 2D and 3D platforms, which makes it possible to read different aspects of these patterns and to see the potential uses of these fractal-based products. The hierarchical character and the highly connected networks they produce are the most specific qualities of these rule-based systems. The following collection of diagrams clearly displays the intricate nature of L-systems.
L-System Line Based String Information
L-System Volumetric Geometry
02.1.4 L-System
Multiple L-Systems Initial String : L-L
Initial String : L-R
Initial String : R-R
Initial String : R-L
Initial String : R-R
Initial String : L-R
Initial String : S-S
Initial String : R-L
Iteration :1
Iteration :2
Iteration :3
Iteration :1
Iteration :2
Iteration :3
The introduction of more than one L-system brings relations such as adaptability and survival into the field. In this setup a minimum distance is maintained between two L-systems. The initial position and direction, along with the string information, control the way each L-system grows within the system.
02.1.5 L-System
L-System + Lofting
A simple lofting exercise is carried out to help understand the array of lines that results from the L-system. In this case the L-system studied is a symmetrical one, and the lofting is therefore done between symmetrical members. Various geometrical patterns are arrived at from the same basic L-system. The string information that generates the L-system is retained in the final output, and the lofting is then performed between geometries carrying the same string information. The colour of each lofted geometry corresponds to its respective string. The result is a rich, fabric-like pattern of intertwining frills.
02.1.6 L-System
Initiator String : L
Initiator String : R
Initiator String : S
Overlay of all generated geometry
Side Elevation
Perspective
Aggregation using L - System
Cube
The attempt here is to understand aggregation through L-systems. The basic L-system is reinterpreted by replacing the two-dimensional geometry with an L-shaped volume. The geometry copies itself to specific positions based on the string operations of the L-system. In this setup the iteration and the L-system rules are kept constant while the initial string operation is varied. Though the results show similar directional growth, the pattern and length of the resultant vary. No performance criterion is considered here. The geometry generated by each initial string operation can be associated with a specific function, with the intersections of the overlaid geometries read as another function.
String operation : L
String operation : R
Regular Tetrahedron
String operation : B
String operation : S
Regular Tetrahedron Iteration : 20
Rule : L --> R R --> S S --> B B --> L
String Initiator : L
String Initiator : R
String Initiator : B
String Initiator : S
The tetrahedron is employed in the L-system with simple operations such as copying alongside its faces and edges. The generated geometry is found to take the form of a cylindrical drum with a varying tilt depending on the initial string operation. Here the iteration and the L-system rule are kept constant.
Iteration : 50
String operation : L
String operation : R
String operation : B Copy to edge
String operation : S Copy to edge
String operation : L
String operation : R
String operation : B Copy to edge and scale by .8
String operation : S Copy to edge and scale by .8
Scaling and copying to edges are introduced in the string operations. The resultant geometry varies from regular to irregular.
Iteration 10
Perspective
Top View
Truncated Tetrahedron
Perspective Overlay of generated geometries with varying string initiator.
The truncated tetrahedron provides better spatial qualities than the regular tetrahedron under the same L-system rules. When scaling operations are introduced, the generated geometry takes on fractal qualities.
String operation : L
String operation : R
String operation : B
String operation : S
Iteration 10
Perspective
Truncated Tetrahedron
String operation : L
Perspective Overlay of generated geometries with varying string initiator.
String operation : R
String operation : B Copy to edge and scale by .8
String operation : S Copy to edge and scale by .8
L-System Aggregation : Truncated Tetrahedron
02.1.7 L-System
Spatial Behaviours of L-System Based Cubes
The method described above, applying L-system rules to geometrical elements, is materialized using primitive wooden cubes and box tape in a simple set. Analysis of the spatial possibilities and geometrical behaviours allows us to categorize the behaviours of the various configurations under four basic titles: longest span, tallest height, largest cavity and most compact organization. Three different generations are used to observe the organizations created through the various configurations. The number of possible organizations grows exponentially, from 6 to 720 in the first iteration and from 720 to 87 billion in the second.
02.1.8 L-System
Translation of L-System Cubes into Components
The experiments done with the hinged-cube technique allow us to go further and arrange the set of different spatial and geometrical behaviours demonstrated in the catalogues above according to water dynamics and the proposed programme needs. The arrangement of multiple configurations is achieved through a simplified genetic algorithm. Following this process, 3D components are developed that show 24 different spatial geometries when combined in different positions. These components then take the place of the original primitive cubic elements. In this way strong spatial qualities and a complex growth system emerge.
02.2.1 Quadtree
QuadTree Algorithm for Tiling Strategy and Data Analysis
A quadtree is a tree data structure in which each internal node has up to four children. Quadtrees are most often used to partition a two-dimensional space by recursively subdividing it into four quadrants or regions. The regions may be square or rectangular, or may have arbitrary shapes. This data structure was named a quadtree by Raphael Finkel and J. L. Bentley in 1974 (http://en.wikipedia.org/wiki/Quadtree, extracted on 14 Sept 2009).
A similar partitioning is also known as a Q-tree. All forms of quadtrees share some common features:
>> They decompose space into adaptable cells.
>> Each cell (or bucket) has a maximum capacity. When the maximum capacity is reached, the bucket splits.
>> The tree directory follows the spatial decomposition of the quadtree.
(http://en.wikipedia.org/wiki/Quadtree, extracted on 14 Sept 2009)
Data tree structure of QuadTree algorithm, where one node splits into four if it detects some category applied by the user
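A minimal point-region quadtree, written directly from the bucket-splitting description above, might look like this in Python (class and parameter names are illustrative; the studio scripts themselves were written in RhinoScript):

```python
class Quadtree:
    """Point quadtree: a cell splits when it exceeds its capacity."""

    def __init__(self, x, y, size, capacity=1):
        self.x, self.y, self.size = x, y, size
        self.capacity = capacity
        self.points = []
        self.children = None  # four sub-quadrants once split

    def insert(self, px, py):
        if not (self.x <= px < self.x + self.size and
                self.y <= py < self.y + self.size):
            return False
        if self.children is None:
            self.points.append((px, py))
            if len(self.points) > self.capacity:
                self._split()
            return True
        return any(c.insert(px, py) for c in self.children)

    def _split(self):
        h = self.size / 2
        self.children = [Quadtree(self.x + dx, self.y + dy, h, self.capacity)
                         for dx in (0, h) for dy in (0, h)]
        for px, py in self.points:  # redistribute the stored points
            any(c.insert(px, py) for c in self.children)
        self.points = []

    def depth(self):
        if self.children is None:
            return 1
        return 1 + max(c.depth() for c in self.children)

qt = Quadtree(0, 0, 16)
for p in [(1, 1), (2, 2), (3, 3), (12, 12)]:
    qt.insert(*p)
# the tree subdivides deepest where the points cluster
```

The adaptive quality named in the feature list comes from this recursion: density of content, not a fixed grid, decides where the cells become small.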
with Fractal Subdivision Theory
The quadtree uses a recursive subdivision logic to split itself into smaller copies of itself. With this logic, the quadtree can be modified to incorporate different kinds of geometry that share a similar recursive subdivision logic. This shape-shifting and form-finding experiment helps us understand the potential use of the quadtree algorithm in architectural design and problem solving.
The experiment was to use different families of forms that can be recursively subdivided to facilitate the quadtree algorithm. The simplest Euclidean form is the triangle. With the Sierpinski triangle algorithm applied to the quadtree script, a new form and its modifications are born using the same logic.
02.2.2 Quadtree
QuadTree Algorithm Modification
Sierpinski triangle algorithm diagram (http://www.kokkugia.com/wiki/index. php5?title=RhinoScript_fractals_and_recursive_subdivision, extracted on 14 Sept 2009)
Rectangular recursive subdivision that is normally used for QuadTree algorithm.
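The Sierpinski replacement of the quadtree's four-way split by a three-way split can be sketched as follows (a Python sketch of the recursion, not the studio script itself):

```python
def sierpinski(tri, depth):
    """Recursively subdivide a triangle into its three corner triangles,
    discarding the central one (the Sierpinski logic)."""
    if depth == 0:
        return [tri]
    a, b, c = tri
    mid = lambda p, q: ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)
    ab, bc, ca = mid(a, b), mid(b, c), mid(c, a)
    return (sierpinski((a, ab, ca), depth - 1) +
            sierpinski((ab, b, bc), depth - 1) +
            sierpinski((ca, bc, c), depth - 1))

tris = sierpinski(((0, 0), (1, 0), (0.5, 1)), 3)
# 3 children per node, so 3**3 = 27 triangles after 3 subdivisions
```

This is the reduction from four divisions per node to three that the catalogue describes: the data tree thins out while the recursive, self-similar character is preserved.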
02.2.3 Quadtree
L-system Iteration : 1
L-system Iteration : 2
L-system Iteration : 3
L-system Iteration : 4
QuadTree Experimentation on L-System Branching Geometry
Quadtrees are usually used for spatial indexing and form recognition. These catalogues show how the quadtree recognizes the geometry of a simple branching system, evolves with the L-system generation, and improves its degree of detail with the quadtree iterations.
Quadtree Iteration : 5
Quadtree Iteration :4
Quadtree Iteration :3
Quadtree Iteration :2
Quadtree Iteration :1
L-system Iteration : 5
L-system Iteration : 1
L-system Iteration : 2
L-system Iteration : 3
L-system Iteration : 4
Quadtree Iteration : 5
Quadtree Iteration :4
Quadtree Iteration :3
Quadtree Iteration :2
Quadtree Iteration :1
L-system Iteration : 5
QuadTree Experimentation [direct geometry modification] on L-System Branching Geometry
A different family of forms is created using the same logic as the quadtree experimentation above. Altering the script changes the quadtree geometry from boxes to curves. This experiment shows that the quadtree logic can be applied not only to one kind of geometry but to other kinds as well.
QuadTree Algorithm Modification Catalogue
on L-System Branching Geometry
Using the Sierpinski triangle as a modification to the quadtree algorithm creates a new subdivision logic within the algorithm. This step reduces the quadtree's data tree from four divisions per node to three.
QuadTree Algorithm Modification Catalogue [02] with Variations to Splines
Modifying the triangle QuadTree to create curves and other kinds of self-similar forms.
02.2.4 Quadtree
Area-Tagged Transformation: Parametric Tiling Strategy
The simplest way of creating a parametric system from the two algorithms (L-system and quadtree) working together is to tag the geometry they produce. From here, with a simple script, an aperiodic 3D tiling system is produced. This parametric tiling system is based on the basic parameters of the two algorithms: the L-system string, the L-system generation, and the quadtree iterations.
QuadTree Algorithm: Architectural Speculation. Parametric Tiling Strategy
An early speculation on how the quadtree algorithm can be used in architectural design. By tagging the quadtree iteration count and the area of the geometry the algorithm produces, a parametric system of tiling and geometry-making is produced.
QuadTree Parametric Tiling Catalogue
The first experiment in creating a quadtree parametric tiling system. In this catalogue, the size of each block created by the quadtree is tagged. Extruding all blocks to the same height across their various sizes creates a different sloping angle for each block size: the smaller the block, the steeper the angle, and vice versa.
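The size-to-slope rule behind this catalogue reduces to simple trigonometry; in this hypothetical sketch the constant extrusion height of 2.0 units is an assumed value:

```python
import math

def slope_angle(block_size, extrusion_height):
    """Slope of the face joining a block's base edge to its extruded
    top edge: the same height over a smaller base is steeper."""
    return math.degrees(math.atan2(extrusion_height, block_size))

# Halving the block size steepens the face, as in the catalogue.
for size in (8, 4, 2, 1):
    print(size, round(slope_angle(size, 2.0), 1))
```
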
Simple L-System Branching Geometry as Subject
QuadTree Parametric Tiling Catalogue [modified]
Simple L-System Branching Geometry as Subject
A similar logic is applied with the modified quadtree algorithm to create a different kind of expression within the pattern.
QuadTree Parametric Surface Catalogue [modified]
An extrusion logic based on the area of each quadtree component. The curved geometry moves vertically, controlled by the size of each component, and is then lofted, creating 'towers' of various heights and sizes. The smaller the triangle, the higher the curved component moves. The proportion between height and triangle size is important in this experiment.
Simple L-System Branching Geometry as Subject
QuadTree Data Manipulation
02.2.5 Quadtree
Breaking the Symmetry Using the L-System String as Data Manipulator
The symmetry of a quadtree pattern is broken by not only identifying the geometry of the L-system but also recognizing the string codes behind the geometry. This experiment is meant to find ways of increasing the level of interaction between the two algorithms. Here a simple, symmetric L-system was used, with only two strings of similar shape.
Diagram of a simple L-system data string born from the rules below:
initiator >> b
generator >> b turns to a; a turns to ab
<< octree a >> 5 generations b >> +2 generations
<< circle packing a >> 5 generations b >> +2 generations
The result of the experiment is a new pattern with different degrees of intricacy in different parts. The symmetry of the L-system geometry is broken through the identification of the codes behind it.
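The rewriting of the data string in the diagram above (initiator b; b → a, a → ab) is the classic Lindenmayer example and can be reproduced directly; the lengths of the successive strings grow as the Fibonacci sequence:

```python
def rewrite(axiom, rules, generations):
    """Apply the data-string rules a given number of times."""
    for _ in range(generations):
        axiom = "".join(rules.get(c, c) for c in axiom)
    return axiom

rules = {"b": "a", "a": "ab"}
strings = [rewrite("b", rules, n) for n in range(6)]
print(strings)
# string lengths: 1, 1, 2, 3, 5, 8 — the Fibonacci sequence
```

It is this uneven but lawful distribution of a's and b's that the quadtree reads to break the symmetry of the pattern.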
Angle : 0
Angle : 60
Angle : 15
Angle : 30
Angle : 75
Angle : 45
Angle : 90
Taking a step further, the data-manipulation experiment is run on the same L-system with different angles. The circle packing transforms and subdivides into ever smaller cells, which also demonstrate contraction and expansion. The information that results from this setup is a total reinterpretation of the parent L-system. These patterns have filtering characters and spatial qualities that can be extended into the architectural domain.
02.3.1 Octree
Octree Algorithm for Space Forming and Geometry Analysis
An octree is a tree data structure in which each internal node has up to eight children. Octrees are most often used to partition a three-dimensional space by recursively subdividing it into eight octants. Octrees are the three-dimensional analog of quadtrees (http://en.wikipedia.org/wiki/Octree, taken on 14 Sept 2009).
Each node in an octree subdivides the space it represents into eight octants. In a point-region (PR) octree, the node stores an explicit 3-dimensional point, which is the "center" of the subdivision for that node; the point defines one of the corners of each of the eight children (http://en.wikipedia.org/wiki/Octree, taken on 14 Sept 2009).
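The octant recursion can be sketched in Python as follows; the occupancy predicate here simply tests against sample points, standing in for the L-system geometry used in the studies:

```python
def subdivide(origin, size):
    """Split a cube into its eight octants around its centre point."""
    h = size / 2
    return [((origin[0] + dx * h, origin[1] + dy * h, origin[2] + dz * h), h)
            for dx in (0, 1) for dy in (0, 1) for dz in (0, 1)]

def octree_cells(origin, size, occupied, depth):
    """Recursively subdivide only the octants the predicate marks as
    occupied, wrapping the geometry at increasing resolution."""
    if not occupied(origin, size):
        return []
    if depth == 0:
        return [(origin, size)]
    cells = []
    for o, s in subdivide(origin, size):
        cells.extend(octree_cells(o, s, occupied, depth - 1))
    return cells

# Occupancy here is simply "contains one of these sample points".
pts = [(0.5, 0.5, 0.5), (7.5, 7.5, 7.5)]
def occupied(origin, size):
    return any(all(origin[i] <= p[i] < origin[i] + size for i in range(3))
               for p in pts)

cells = octree_cells((0, 0, 0), 8, occupied, 3)
# only the two occupied corners survive to the smallest cell size
```
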
Node Attractors as Interference: Proximity-Based Scale Operation
One of the modifications uses node attractors. The attractors are deployed at each generation of the L-system and play an important part in distorting the basic octree algorithm. Operations such as scaling, whether 2-dimensional or 3-dimensional, or shifting can be generated. The proximity between the attractors and each individual octree component is used as a proportion regulator for each transformation.
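The text does not fix the exact regulator function; one plausible sketch, with an assumed falloff constant, maps distance to a scale factor monotonically:

```python
import math

def proximity_scale(component_centre, attractor, falloff=4.0):
    """Proportion regulator: components near an attractor keep (near)
    full size, distant ones shrink. The falloff constant is an
    assumed value, not one taken from the studio scripts."""
    d = math.dist(component_centre, attractor)
    return 1.0 / (1.0 + d / falloff)

print(proximity_scale((0, 0, 0), (0, 0, 0)))  # -> 1.0 at the attractor
print(proximity_scale((4, 0, 0), (0, 0, 0)))  # -> 0.5 at one falloff away
```
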
L-System Coding as Interference: Mapping the Coding in L-System Geometry
Another kind of variation on the algorithm is geometry distortion driven by the L-system string. As with the quadtree modification, instead of only reading the actual geometry, the octree algorithm also reads the code of the L-system. By reading the codes, many operations can be applied: iteration differentiation and geometry distortion are applied parametrically to the octree, carving spaces and breaking the symmetry in the process.
OctTree Experimentation
02.3.2 Octree
Symmetrical Branching L-System as a Subject
The first experiment uses a regular octree on a simple L-system. This study identifies the spaces the L-system naturally carves. These spaces are divided by the octree iterations, with the last iteration actually wrapping around the L-system geometry.
Breaking the Symmetry with Data Manipulation L-System Coding as Distortion Means
02.3.3 Octree
In an attempt to break the symmetry and carve a more differentiated spatial character, the octree reads the string codes of the L-system. The modification results in more dynamic spaces flowing in between the L-system branches.
OctTree Algorithm as a Means of Analyzing Form and Space 3 Dimensional Koch Curve as a Study Case
02.3.4 Octree
A catalogue of an octree acting on a 3-dimensional Koch curve. The octree identifies the spaces and solid geometry of the seemingly abstract, modified 3D Koch curve.
Breaking the Symmetry of Koch Geometry and Space Proximity from the Nodes of Koch as a New Variable
02.3.5 Octree
This catalogue pictures the modification of the octree using attractor nodes that are spread at every generation of the L-system. Proportional scaling is used as a modifier of the geometry. This carves spaces in a very dynamic manner, with differentiated characteristics and dimensions.
Natural Selection
02.3.6 Octree
Fitness Category as an Evaluation Criterion
Inspired by the theory of natural selection by fitness criteria, and by the way an entity can mutate into another kind of entity because of stimulation from its environment, another modification to the octree algorithm was created. The L-system that serves as the starting point for the octree algorithm also functions as the basis for the fitness-evaluation process. The goal is to arrive at the most effective configuration of the octree in terms of proximity and size against the L-system, which acts as environment and stimulator. These algorithmic experiments can later be applied to real architectural conditions to solve problems and find the logic of an effective form. The experiment tries to find the most suitable configuration of the octree based on proximity. After mapping the 3-dimensional Koch curve with the octree and applying the scaling modification by attractor, the algorithm tries to find which individual components are most important to the L-system based on proximity. The L-system, aside from spreading attractor nodes at each generation, also deploys proximity points as a basis for calculation. The algorithm then calculates the distance of each individual component to the proximity points, drawing lines to indicate the shortest distance from point to component. A deletion process based on proximity then begins: components that are considered unimportant, or too far from the L-system, are deleted. This happens individually at every octree iteration. The algorithm also counts the volume of each individual octree component, selecting the smallest at each iteration.
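The deletion step of this selection process can be sketched as a distance-based filter (the component centres, proximity points and threshold below are illustrative stand-ins for the studio data):

```python
import math

def natural_selection(components, proximity_pts, max_dist):
    """Delete components whose centre is farther than max_dist from
    every proximity point deployed by the L-system."""
    def fitness(centre):
        return min(math.dist(centre, p) for p in proximity_pts)
    return [c for c in components if fitness(c) <= max_dist]

components = [(0, 0, 0), (1, 1, 1), (9, 9, 9)]
survivors = natural_selection(components, [(0, 0, 0)], max_dist=2.0)
# -> the distant component (9, 9, 9) is deleted
```

Running this filter once per octree iteration, as described above, progressively strips the configuration down to the components that matter to the L-system.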
Natural Selection Process Catalogue Fitness Criteria as an Evaluation Category
The natural-selection process catalogue, showing the process step by step on a 3-dimensional modified Koch curve.
Interior spaces carved by the octree, with lines representing the smallest distance between the octree components and the proximity points deployed by the L-system.
The result of the experiment is a selected configuration of the octree that is most suitable for the Koch system. Octree components that are too far away, or considered too big, are deleted. The same logic can also be applied to select among different architectural agents to find the most suitable or most efficient composition in a specific environment.
Koch Curve
Beside are examples of geometries from the Koch curve algorithm. The image on the left shows a quadratic Koch island with parameters:
Angle : 90 degrees
Generation : 2
String : Initiator >> F-F-F-F
Generator >> F+FF-FF-F-F+F+FF-F-F+F+FF+FF-F
The image on the right shows a quadratic modification of the snowflake curve with parameters:
Angle : 90 degrees
Generation : 4
String : Initiator >> -F
Generator >> F+F-F-F+F
02.4.1 Koch Curve
String-Coded Fractal Composition
The Koch curve is a mathematical curve and one of the earliest fractal curves to have been described. It first appeared in a 1904 paper titled "On a continuous curve without tangents, constructible from elementary geometry" by the Swedish mathematician Helge von Koch (http://en.wikipedia.org/wiki/Koch_snowflake, taken on 20 Sept 2009). In order to generate the Koch curve on a computational platform, the logic of turtle graphics is used within the script. Turtle graphics is a term in computer graphics for a method of programming vector graphics using a relative cursor on a Cartesian plane. The image above shows the turtle interpretation of the symbols F, +, and -.
There are two kinds of strings in the generation of the Koch curve, called the initiator and the generator. The latter is an oriented broken line made up of N equal sides of length r. Each stage of construction begins with a broken line and consists of replacing each straight interval with a copy of the generator, reduced and displaced so as to have the same end points as the interval being replaced (Przemyslaw Prusinkiewicz & Aristid Lindenmayer, The Algorithmic Beauty of Plants, New York, 1990). Below are some examples of Koch curves with the same initiator but different generators. Example of interpretation of a string: the angle increment is equal to 90 degrees; initially the turtle faces up.
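The turtle interpretation described above can be reproduced in Python. The generator used here, F-F+F+FF-F-F+F with initiator F-F-F-F, is the classic quadratic Koch island from Prusinkiewicz & Lindenmayer; the longer generator strings in these catalogues work the same way:

```python
import math

def expand(initiator, generator, generations):
    """Replace every F of the string with the generator, recursively."""
    s = initiator
    for _ in range(generations):
        s = "".join(generator if c == "F" else c for c in s)
    return s

def turtle_points(commands, angle=90):
    """Interpret F as 'move forward', + as 'turn left', - as 'turn
    right'. Initially the turtle faces up."""
    x, y, heading = 0.0, 0.0, 90.0
    pts = [(x, y)]
    for c in commands:
        if c == "F":
            x += math.cos(math.radians(heading))
            y += math.sin(math.radians(heading))
            pts.append((x, y))
        elif c == "+":
            heading += angle
        elif c == "-":
            heading -= angle
    return pts

cmds = expand("F-F-F-F", "F-F+F+FF-F-F+F", 1)
pts = turtle_points(cmds)
# the island stays closed: the turtle returns to its starting point
```

Because the generator has zero net turning and a net displacement along its own axis, the closed square of the initiator remains closed at every generation.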
Angle : 90 degrees Generation : 4 Strings : Initiator >> F-F-F-F Generator >> FF-F-F-F-F-F+F
Angle : 90 degrees Generation : 4 Strings : Initiator >> F-F-F-F Generator >> FF-F-F-F-FF
String : FFF-FF-F-F+FF-F-FFF
Angle : 90 degrees Generation : 4 Strings : Initiator >> F-F-F-F Generator >>FF-F-F-F
Angle : 90 degrees Generation : 4 Strings : Initiator >> F-F-F-F Generator >> F-FF-F-F
String Modification and Angle Variation
In order to design using the Koch curve logic, several experiments with string modification and angle variation are carried out. With this method, various kinds of fields with different characteristics are formed. Implementing these fields in the realm of architectural design is the goal of these studies.
Script modification based on strings (initiator and generator) and generation count
On Basic Koch Curve
2d [modified] Koch Curve Catalogue With Angle Differentiation
02.4.2 Koch Curve
Catalogue of different kinds of Koch curves with the same strings but different angles. Different fractal growth patterns are created in these studies.
3d Koch Curve Modification: Space-Forming on a 3-Dimensional Platform
In order to create 3-dimensional structure and define 3D spaces, the string of the Koch curve has to be modified by adding one more rotation angle to each component of the string. To create continuous geometry, the number of components in each string also has to be modified: each string needs four more components than the regular 2-dimensional string.
3d [modified] Koch Curve Catalogue [01] With Angle Differentiation
Catalogue of different modified 3-dimensional Koch curves using the same strings but different angles. The first and second rotation angles of each string component are intentionally made equal to keep the number of variables low.
3d [modified] Koch Curve Catalogue [02]
With Angle Differentiation
02.4.3 Koch Curve
Materializing The Geometry of Koch Curve 3d Modified Koch Curve as Object Study
In order to find out the character of the surfaces or 3-dimensional solids formed by these Koch curves, another computational command is added to the script. Simple lofting with string recognition creates an undulating surface with a wrinkly quality over the Koch geometry. These surfaces accurately depict the movement and growth direction of the Koch curve.
Lofted Koch Curve Catalogue 3d Modified Koch Curve as Object Study
Catalogue of lofted Koch curves with the same string, various angles, and different generations. These forms show different levels of detail and wrinkling, regulated by the generation, and different movement, with the angles as the primary parameters.
02.5.1 Penrose
Aperiodic Tiling: Penrose
A given set of tiles, in the Euclidean plane or some other geometric setting, admits a tiling if non-overlapping copies of the tiles in the set can be fitted together to cover the entire space. A given set of tiles might admit periodic tilings, tilings that remain invariant after being shifted by a translation. An aperiodic set of tiles, however, admits only non-periodic tilings, an altogether more subtle phenomenon. The Penrose set is an example of aperiodic tiling; the kite and dart geometry belongs to the Penrose family.
72 A
108
72
36
Half Kite [A]
36
Mirror Operation A --> A + A' B --> B + B'
36
Half Dart [B]
B 2 Kite + 1 Dart A --> AAB
1 Kite + 1 Dart B --> AB
Penrose - Aggregation + Subdivision
Keeping the subdivision logic in mind, the Penrose geometry is made to aggregate. The aggregation is controlled so that it alternates between the half kite and the half dart. One unique property this geometry offers is the direction in which it can grow: unlike the equilateral triangle, the half kite and half dart together result in controlled directional growth. Along with aggregation of the geometry, subdivision rules are also applied: a half kite subdivides into a full kite and a half dart, while a half dart subdivides into one half kite and one half dart.
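Counting the triangles under the substitution rules from the diagrams, A → AAB and B → AB, shows why the tiling is aperiodic: the kite-to-dart ratio tends to the golden ratio, an irrational number, so no translational repeat is possible:

```python
def penrose_counts(half_kites, half_darts, generations):
    """Apply the subdivision rules A -> AAB (half kite) and
    B -> AB (half dart) and count the resulting triangles."""
    a, b = half_kites, half_darts
    for _ in range(generations):
        a, b = 2 * a + b, a + b
    return a, b

print(penrose_counts(1, 0, 3))  # -> (13, 8)
# successive counts follow the Fibonacci numbers, so a/b -> 1.618...
```
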
Addition Operation A --> A + B B --> B + A
Attractors
Sub-Division Activators
02.5.2 Penrose
Penrose - Deployment [2D]
To facilitate the growth of the Penrose pattern, a field of attractors and activators is deployed, positioned according to the design intervention or requirements. The attractors suggest the direction of growth, which has its own character due to the geometry employed. The side on which growth occurs is chosen on the basis of the shortest distance from each side to the attractor. The activators are linked to the subdivision process: on proximity, the base geometry subdivides. The iteration varies with proximity, as it is influenced by the growth of the base geometry. Subdivision can trigger, or be seen as a means to achieve, a different spatial setup compared to the non-subdivided base geometry. The field here is two-dimensional.
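The shortest-distance rule for choosing the growth side can be sketched as follows (the edge midpoints and attractor position are hypothetical values, not taken from the deployment diagrams):

```python
import math

def growth_side(edge_midpoints, attractor):
    """Choose the tile edge nearest the attractor; growth is then
    applied across that edge."""
    return min(edge_midpoints, key=lambda m: math.dist(m, attractor))

# Hypothetical edge midpoints of one tile and a single attractor:
sides = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0)]
print(growth_side(sides, (4.0, 1.0)))  # -> (5.0, 0.0)
```
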
Penrose - Deployment [3D]
To move from a two-dimensional growth logic to a three-dimensional one, additional operations are introduced. The activators also take on the role of repellents to maintain a distance threshold. Though the attractors and activators are placed in a two-dimensional field, the resultant geometry is three-dimensional. The script operates by identifying each attractor and growing toward each of them individually, based on the iteration. This leads to finger-like formations in the resultant geometry.
117
02.5.2 Penrose Setup 2: The growth is directed towards the first attractor for the set number of iterations and then towards the next attractor. The activators trigger subdivision as well as maintaining a distance threshold. The resultant geometry lacks the finger-like projections of Setup 1.
Setup 3:
118
Seeds are introduced along with the attractors and activators. The seeds mark the points of origin of the growth and are placed three-dimensionally. The growth operation addresses every seed individually, so the resulting growths bear no relation to one another.
02.5.2 Penrose Setup 4: The seeds are addressed all at once and so grow simultaneously. The attractors, too, are addressed at once, so each seed enters into a certain dialogue with the others. The resultant geometry can be read as a network of interlinking connections, and can set up a basic structure to which the overall form can be related.
Modified Koch Curve
Setup 5:
Repellants and Activators from Koch Curve
The Koch curve is introduced into the setup. Each junction of the Koch curve acts as a repellant, while the midpoints of segments of a desired length act as activators.
119
Top View
Koch + Penrose To take the Koch curve further, its geometry is replaced by Penrose triangles, in this case the half kite, achieved by controlling the angles. Trials are run in both two and three dimensions; in the third setup the half arrow is also introduced. The iteration is limited by the smallest favourable dimension. The shape of the Penrose geometry adopted in this system makes it easy to fuse in a component that has the Penrose geometry as its parent root.
In this operation both the half kite and the half arrow are introduced. Though it does not differ much from the previous operation, it is more intricate.
120
Perspective
02.5.3 Penrose
Iteration : 1
Iteration : 2
Iteration : 5
121
02.5.3 Penrose
Materialization
02.6.1 Materialization
The attempt to materialize the Penrose tiling, from a two-dimensional pattern into a three-dimensional shape, is made by projecting it onto a flat, foldable material. Some lines within the pattern are then turned into guidelines for three operations: folding lines (folding in and folding out), cutting lines, and connection lines (point and edge connections). This system of intervention creates geometry that can be evaluated against architectural criteria, such as spatial quality, structural performance and aesthetic value, at the next stages. All of these operations happen every time recursion (self-division) and population (growth) occur on the Penrose tiling pattern.
Folding Based on Angle Fixation Method A simple angle-fixation method is introduced after observing the initial materialized folding studies, which used spontaneous angles and therefore behaved uncontrollably. Wooden components (the black elements in the photos) carrying specific angles (15-30 degrees) fix the angle between folded adjacent triangles, so that clearer overall topologies emerge. The diagrams on the right analyse the behaviour of the folded surfaces on the projected plane (plan view). The areas they cover have specific relationships to the angles and generations: an increase in folding angle results in a decrease in the area the layers cover, while smaller folding angles (<15 degrees) result in an expansion of the covered area.
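The observed angle/area relationship can be checked against a first-order projection model: a panel tilted by the fold angle covers cos(angle) of its flat area in plan. This is our simplifying assumption, not a measurement from the studio's models.

```python
import math

def projected_coverage(flat_area, fold_angle_deg):
    """Plan-view area covered by a panel tilted by the fold angle.
    Projection onto the ground plane scales the flat area by
    cos(angle) -- a first-order model of the observed behaviour."""
    return flat_area * math.cos(math.radians(fold_angle_deg))

# The three angle sets used with the wooden fixation components.
for angle in (15, 30, 45):
    print(angle, round(projected_coverage(1.0, angle), 3))
```

The model reproduces the diagrammed trend: coverage falls monotonically as the folding angle grows, and stays near the flat area for angles below 15 degrees.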
122
Water dynamics on the folded geometry The folded geometry arrived at is placed into fluid engines to understand its behaviour under agencies such as water, acting as waves or currents. The aim is to find patterns in the water choreography, and to understand the geometry's properties as a possible sea barrier. In this test scenario the subdivision iteration value is 1 and the folding angle is 15 degrees.
02.6.2 Materialization
Top View
Perspective
Top View
Setup 2:
Perspective
The subdivision iteration value is 2 and the folding angle is maintained at 15 degrees. The vectors do not show any considerable outcomes. The shadow region has diminished because the geometry covers a smaller area, due to the increased fold.
123
02.6.3 Materialization
Folding Behaviours on Populated Patterns A specific global topological behaviour is observed after several populations and iterations are run on the various generated patterns through the developed scripts; the wooden components carrying set angles (15-30-45 degrees) made the movement of the surfaces more readable. Essentially, a spiralling global topology emerges, which tightens as the iterations of the Penrose triangles increase.
Point Connection
1st iteration
The first system of materialization uses point connections. These connections (purple circled lines) are placed at the top and bottom right corner of each triangular cell. The lines within the triangles are turned into folding lines: the green lines are folded inwards, while the yellow lines are folded outwards. The images on the right show how this affects the overall shape of the material, from a simple flat sheet into a three-dimensional shape with architectural aspects: the space underneath it, openings from the cutting lines, textures on the surface of the material.
2nd iteration
The negative aspect of this experiment is that the point connection system creates areas where bending moments occur.
3rd iteration
124
Edge Connection
Overlay
From the images and diagrams shown on the left, it can be seen that this type of connection refines the bending moment that occurred with the previous point connections, while still creating the same quality of architectural aspects mentioned before.
1st iteration
The next step is to find a population system using the module set up from three parallel triangles placed side by side with their top corners touching at the same point. A technique that populates these modules quite effectively is overlaying: by introducing new cutting lines (slits) in each module, it finds a way to grow horizontally and vertically by interlocking the modules at those slits. At this stage, a connection to the site starts to appear. By overlaying three different modules, we can begin to see its attempt to adapt to three different layers of water conditions (vertically) and to its growth from dry land to wet land (horizontally). The critical evaluation of the result at this stage is that the overall shape is still too wild (too spiky) to be accepted as a habitable space, and the circular movement of its growth is too wild to be controlled.
1st iteration
2nd iteration
02.6.3 Materialization
The second system of materialization uses edge connections. These connections (purple straight lines) are placed at the top and bottom corners, connecting each triangular cell with its neighbour. The lines within the triangles are again turned into folding lines, using the same colour rules as the previous point connections.
3rd iteration
2nd iteration
3rd iteration
Population: overlay technique with interlocking (slit) connections between the modules
125
02.6.3 Materialization
Pattern Logic
Based on the evaluation of the previous stages, the materialization system needs to be reconfigured to refine the connection between one aspect of implementation and another. The first aspect reconfigured is the module itself. Composing four triangles in a zigzag position as the smallest module gives a more flexible population system, which creates a continuous surface out of the same Penrose tiling. This module rule works for any basic pattern from different iterations; the diagrams above show this ability by combining a few different iterations.
126
1st iteration
2nd iteration
Components from the Penrose pattern The pattern arrived at in the first iteration is converted into a component by folding, resulting in two sets of components. These geometries interlock with themselves as well as with each other, and the combined geometry provides spatial qualities.
02.6.4 Materialization
Male Component - A
Female Component - B
Component A + B
Component A + A
Components interlocked with each other
Perspective Front View
In this case the resulting geometry achieves perfect interlocking, just as a kite and arrow would interlock.
Component B + B
Perspective
Front View
127
02.6.5 Materialization
Aggregation of Components on the Koch Curve The components are aggregated on the modified Koch curve. The length of each segment defines the scale of the component, while the direction of the line segment with respect to the origin decides its orientation. The first example uses a combination of components A and B; the second fuses component B with itself.
Modified Koch Curve 1
Modified Koch Curve 2
Perspective
128
Koch curve 2 is mirrored on the X-Z plane as well as the Y-Z plane, onto which component BB is aggregated.
Component AB aggregated on the Koch Curve
Component BB aggregated on the Koch Curve
Top View
Top View
Front View
Front View
Cutting logic
These cutting lines, together with the folding rules, create the global shape of the previously flat material; a change in these rules affects the overall curvature of the material's shape. By populating the module along the X and Y axes, we get a grid control system for the material's shape. After several experiments with folding rules in the modules, the lines lying between one module and another along the Y axis (vertically) prove quite effective in creating the overall curvature, while the other lines within the module can be used to maximize the openings.
02.6.6 Materialization
The importance of introducing cutting lines within this new module system is to maintain a continuous surface without any bending moment caused by the fold.
Folding logic
129
130
02.6.6 Materialization
Different cutting rules create different angles of curvature in the material.
Different folding rules at the lines between two modules (vertically) create the direction of the curvature (mountain and valley).
131
02.6.7 Materialization
Overlapping The intention in overlapping two layers with different cutting and folding rules is to create space. The images above show that the folding rules set the direction of the curvature (upwards or downwards) while the cutting rules define its angle. As a result, the first layer, which is relatively flatter, can act as the base, and the other, relatively more curved, as the roof. The overlapping area itself sits at the first and last stripes of the module grid, enhancing the material's structural performance by creating a more rigid surface.
Overlap Model
132
02.6.8 Materialization
overlapping
Pinching At this stage a pinching technique is introduced in an attempt to break the symmetry that occurred in the previous experiment. Distributing several pinching points across the surface creates a new control tool, making the curvature angulate not only two-dimensionally but also three-dimensionally, and enriching the spatial and structural performance of the system.
Deformed With the materialization rules set up, we also test whether they still work when the patterns are deformed on the digital side. It appears that the more recursion occurs in the pattern, the less it angulates the surface, while the more it is stretched in (shrinking), the stronger the structural performance created within the pattern.
133
02.6.8 Materialization
Final model
Conclusion The materialization experiments bring up several results that can be used as input for the digital exploration of the project. The rules that have been set up (module, growth system, cutting logic, folding logic, overlap logic), which use minimal-surface principles, give a system for extracting the generic conditions to be simulated in the digital part of the exploration. These generic conditions (stretched areas that are stronger than the rest, the relation between overlapping areas and folding rules that defines the spatial division between layers, etc.) can be embedded as design constraints, together with the site conditions, in the digital prototype system.
134
L-System Based Cubes Under Dynamic Forces
02.7.1 Kinematics
The initial cubic geometric translation of the L-system mentioned above is also exposed to the dynamic forces of the waterscape, and a self-organization method is achieved using the natural forces the test site offers. First- and second-generation L-system elements are hinged to each other in the Maya simulation environment using the same rule system as the material studies presented above. The dynamics of the water surface are generated in the simulation as the vector fields that waves would produce. The connected cubes are exposed to these vector fields, and a dialogue between the geometries and vectors is achieved. Time-based equilibrium configurations are demonstrated under various sets of vector fields and directionalities of the generated vectors.
135
02.7.2 Kinematics
Penrose Tiling Patterns Embodied in a Kinetic System A different materializing method is introduced through the translation of the generic Penrose tiling pattern into a kinetic frame structure using wooden sticks and rubber tubes as components. This transformable space system is created to respond to the dynamic forces nature produces, such as water currents, wave impacts and wind. As a test platform, the generic pattern is populated by a factor of 3 in the horizontal direction and a factor of 2 in the vertical direction. A space enclosure is then generated from the movement of the intersection points of the Penrose triangles relative to each other. The kinetic structure reads the natural forces and responds by the travel of the intersection pivots from one pivot to another; in this way several complex geometries and various spatial qualities are achieved.
136
02.7.3 Kinematics
Penrose Kinetic Structure Encircled by Koch Curves The Penrose stick-frame experiment manifested a highly uncontrolled behaviour; therefore the Koch curve, another algorithmic fractal system, is introduced to keep it under control. The frames were encircled by Koch curves reading the geometries the Penrose frames generate, and the intersection pivots were attached to the outer frame of the Koch fractals. Thus more definable spatial transformations occurred through the travel of the pivot points.
137
02.7.3 Kinematics
Spatial Behaviours the Penrose Kinetic Structure Generates A responsive spatial and structural system is displayed through these free Penrose stick-frame experiments and the Koch-curved frame studies. The vector fields generated by the surrounding ecologies are read digitally and translated into mechanical kinetic movements of the structural framework, and thus of the construction planes: the so-called floors, ceilings and separation walls. A set of spatial phases occurs, allowing users to experience the dialogue between architectural tectonics and natural systems. A highly complex geometrical behaviour would emerge through the population of this connected structural network of Penrose frames. Even so, the experiments were redirected towards folding techniques on continuous surfaces.
138
02.7.4 Kinematics
Surface Apertures
Effects: aperture
01
02
03
04
139
03 Phase 2
Design Intervention
141
142
03.1. Context The coastline is a dynamic phenomenon whose attributes change under the constant influence of many external agencies. What is critical here is the landmass, which in some cases is disappearing due to rising sea levels and in others is being extended by the formation of sandbars and mudflats. The topography of the landmass is the result of various agencies acting at various scales and quantities, from macro to micro. At the macro scale, agencies such as earth tectonics, temperature and surface winds are most active, while single-celled organisms such as bacteria and fungi act at the micro level. As a result, the land topography has the qualities of a poly-scalar terrain. What we are interested in is recreating this dramatic topography as an artificial landscape structure deployed in a dynamic environment such as the coastline. The design intervention is not limited to the structure itself, but extends to how it will interact with these external agencies and attain the qualities of a living structure.
143
03.2. Site
Rushley Island
The Thames Estuary is where the River Thames meets the waters of the North Sea. The tidal influence extends up to 89 kilometres upstream. The estuary extends from the head of Sea Reach to Canvey Island on the Essex shore as its western boundary; the eastern boundary is a line drawn from North Foreland in Kent via the Kentish Knock lighthouse to Harwich in Essex.
Havengore Island
Site
Mudflat
Thames Estuary
Samuels Corner
Mudflat
The estuary has the world's second largest tidal movement: on average it can rise up to 4 metres, moving at 8 miles per hour. The Thames Estuary is a major shipping route, as a result of which constant dredging happens in and around the area. The appellation Greater Thames Estuary applies to the coast and the low-lying lands bordering the estuary itself. The zone is characterized by salt marshes, mudflats and open beaches. Rising sea levels flood certain areas, taking pressure off the defences; man-made embankments are backed by reclaimed wetland. The Thames Estuary is part of the Thames Gateway, designated as one of the principal development areas in Southern England. The area has several proposed sites for a new airport, an additional flood barrier to assist the existing Thames Barrier, and a possible new National Park. The site we are interested in lies along the Thames Estuary where another small river meets the North Sea. The mudflat zone, called Maplin Sands, falls under a Designated Special Area of Conservation (SAC). The zone is critical to the ecosystem it inhabits: migratory birds often visit Maplin Sands, which brings ornithologists to the area. Salt marshes play an important role in maintaining the coastline, which would otherwise have been depleted by constant wave and tidal action. The mudflats are also habitats for species such as snails, crabs and amphipods, the main food sources for the migratory birds. A considerable amount of silt, brought down by the moving water along its course, is deposited at the river mouth; this leads to the formation of mudflats. The process is vital in maintaining the coastline. The sedimentation rate along the Thames Estuary is around 2-5 centimetres per year.
144
Thames Estuary
Migratory Birds
Maplin Sands
Salt Marsh Vegetation
Havengore Island
Snails / Crabs / Amphipods
03.2. Site
Fjord type
Section through a river mouth Salt wedge type
Image courtesy : http://www.whoi.edu/oceanus/viewImage.do?id=5537&aid=2486 Pressure
Membrane
Salt Water Slightly Stratified type
Fresh Water
Energy generation through Osmotic Pressure Global Potential 1600 - 1700 TWh
Vertically Mixed type
Different types of Estuaries Image courtesy: http://oceanservice.noaa.gov/education/kits/estuaries/estuaries05_circulation.html
The river mouth is an area where fresh water meets sea water. The manner in which they mix varies: fjord type, salt wedge type, slightly stratified and vertically mixed. Mostly the fresh water moves into the sea along the surface, while the salt water moves in closer to the river or sea bed. There is an opportunity to generate electricity through pressure-driven osmotic power plants. This method of generating electricity is prototypical and completely renewable; the infrastructural requirements are long pipelines and high-performance membranes.
145
03.3. Deployment The site has three distinct zones: the sea/river, the mudflat and the land. These can be reinterpreted as three different densities of matter. The site also provides possible access points, deployed on the basis of contour information and with flooding zones in mind, and suggests possible ways the L-system can grow. Seeds are deployed based on this information. Initially four seed points are deployed: one from the river side, a second from the sea side, and two others on land on either side of the river. The L-systems grow simultaneously, bridging these seed points alongside the access points. The density information plays a vital role: the density values are directly related to the distance from the seed points, and correspond to the minimum area that a branch at a point needs before it can branch into two or more other branches. The string behaviour is also re-written based on the density information, creating micro transactions that contribute to the macro organization.
Spatial requirements are fed into the later models. Isles are open-to-sky zones wherein birds have uninterrupted access to the mudflats. Water pools act as reservoirs for fresh and salt water; in these pools different ecosystems can be showcased.
Sedimentation cluster Isles Isles
Isles
Isles
Sedimentation cluster V.1.1
V.1.2
V.2.1
V.2.2 Pool Micro Ecosystems
Isles
Sedimentation cluster
Pool Micro Ecosystems
Isles
Isles
V.1.3
146
V.1.4
Sedimentation cluster
V.2.3
Sedimentation cluster
V.3.1
Sedimentation
Fresh Water Pools Salt and Fresh water Ecosystem Eco park Public Space
Brackish Water Outlets
03.4. Hierarchical Model
Inlets
No access - Shorter branch width
Access
Pedestrian Access
Isles Migratory Birds Open to sky
Sedimentation No access - Shorter branch width
Salt Water channels Inlets
Hierarchical Model The hierarchical model is closely related to the way a tree grows. At first there is a direction of growth, which in the natural model is vertically up from the ground; in the hierarchical model represented here this is called the axis branch. The axis branch splits into three branches, two of which become main branches while the third becomes the axis branch itself. Which one becomes the axis branch is determined by whichever is closest to the access point or attractor point, which in the natural model is the light source. A main branch splits into one main branch and two sub-branches whenever the density requirement is met.
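The axis-branch selection rule can be sketched as a nearest-point test over the three children of a split. This is a minimal sketch of the rule as described; the coordinates and function names are hypothetical.

```python
import math

def choose_axis(children, attractor):
    """Of the three child branch endpoints, the one closest to the
    access/attractor point continues as the axis branch; the other
    two become main branches (the light-seeking analogy in the text)."""
    d = [math.hypot(c[0] - attractor[0], c[1] - attractor[1])
         for c in children]
    axis = d.index(min(d))
    mains = [c for i, c in enumerate(children) if i != axis]
    return children[axis], mains

# Hypothetical split: three child endpoints, one attractor above.
children = [(1.0, 2.0), (0.0, 3.0), (-1.0, 2.0)]
attractor = (0.0, 5.0)
axis, mains = choose_axis(children, attractor)
print(axis)    # the endpoint nearest the attractor continues the axis
```

Applying this choice recursively at every split yields the directed hierarchy described: the axis chain tracks the attractor while the main branches fan out around it.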
147
03.4. Hierarchical Model
Access points
Isles Isles
Access points
The hierarchical model provides layers of information that decide the role each branch will play. This is transcoded into specific values that influence the geometry in the next set of design moves. The information retrieved decides spaces such as water pools, access pathways, water channels, structural anchoring points, sedimentation branches, etc.
Isles
Sedimentation Branch
Access
Fresh water Pools
Salt water Pools
Anchoring points
148
Water Pools
Brackish Water outlets
Fresh water Inlets
Salt Water inlets
Water Channels
Generic penrose pattern overlaid on L-system
Generic penrose pattern deformed
03.4. Pattern Deformation
Generic Penrose pattern overlaid on the L-system. The pattern is deformed using the vector information from the L-system, which thickens the material along the L-system. The thickening corresponds to structural requirements and can be read as structural beams.
The generic pattern adapts to the changing profiles of the L-system; the spatial behaviour varies with the profile of the L-system.
01
02
03
04
05
06
07
08
09
10
149
03.4. Pattern Deformation
01
02
03
04
05
06
The L-system also has three-dimensional information embedded within it. The generic Penrose pattern is deformed in its z value along with its x and y coordinates. This again defines spaces based on the role each L-system branch plays, derived from the hierarchical model.
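The z deformation can be sketched as a distance falloff from the nearest L-system branch. The exponential falloff, the amplitude values and the function names are our assumptions; the text only establishes that pattern vertices are displaced vertically according to L-system information.

```python
import math

def point_segment_dist(p, a, b):
    """Distance from point p to the segment a-b (all 2D tuples)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy)
                          / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def deform_z(vertex, branches, amplitude=3.0, falloff=2.0):
    """Hypothetical z-deformation: vertices close to an L-system branch
    are lifted, with an exponential falloff away from the branch."""
    d = min(point_segment_dist(vertex, a, b) for a, b in branches)
    return amplitude * math.exp(-d / falloff)

branches = [((0, 0), (10, 0))]
print(deform_z((5, 0), branches))   # on the branch: full amplitude
print(deform_z((5, 4), branches))   # 4 units away: strongly damped
```

Varying `amplitude` per branch according to its role in the hierarchical model (pool, pathway, channel) would reproduce the role-dependent spaces described.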
150
The continuous L-system branch-based deformations serve as paths with circulation potential for both humans and the waterscape. The topology of the artificial landscape sample emerges as a negotiation between mountain and valley tectonics. The pattern created offers a series of primary arteries and in-between terrains; the primary arteries are kept flat, with a maximum 8% slope.
03.5. Topography
Topographical Behaviour Through Folding
151
03.5. Topography
Topographical Layering Strategy
152
The sections taken from the topography generated through folding are abstracted into simple sine and cosine curves in order to find a strategy for creating a dialogue between layers: organizing each of them and overlapping them to generate closed spaces in between. The movement of the sectional topographies varies from flat continuous lines to curved lines of different degrees and widths. These behaviours are combined in different strategies to create several scales of inhabitable pockets that follow the needs of the tested programme. The diagrams on the right demonstrate some of the results of this method.
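The layering strategy can be sketched with a pair of sine sections and the clear height of the pocket between them. The particular amplitudes, wavelength and phase offset are hypothetical values chosen for illustration, not the studio's measurements.

```python
import math

def section(x, base, amplitude, wavelength, phase=0.0):
    """One sectional topography abstracted to a sine curve."""
    return base + amplitude * math.sin(2 * math.pi * x / wavelength + phase)

def pocket_height(x, roof, floor):
    """Clear height of the inhabitable pocket between two layers."""
    return max(0.0, roof(x) - floor(x))

# Hypothetical pair of layers: a flatter base and a wider-swinging
# roof, half a wavelength out of phase so the pocket opens and closes.
floor = lambda x: section(x, base=0.0, amplitude=0.5, wavelength=20.0)
roof = lambda x: section(x, base=3.0, amplitude=1.5, wavelength=20.0,
                         phase=math.pi)

for x in (0, 5, 10, 15):
    print(x, round(pocket_height(x, roof, floor), 2))
```

Sweeping the phase and amplitude of each layer is what produces the "several scales of inhabitable pockets": in-phase layers keep a near-constant gap, while opposed phases pinch and dilate the space.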
A set of sample patches is chosen to test the relationship between the water dynamics and the generated fabric. The following vector studies are generated through Maya simulations. The time-based shots of vector fields represent the water flows on the surface (outer skin) and the currents in the indoor spaces at different moments of the rising tide. On the edge conditions the folded tectonics intentionally take spiky, rock-like formations with openings of different scales, in order to break the water and also channel it in and out of the fabric. The apertures on the surface also serve as smaller doorways for water to fill in. Water passes through the fabric as it rises with the tidal movement; a natural sculptural phenomenon takes place, while also transforming the availability of the spaces for the tested programme.
Water level through the rising tide:
09:00: 12 m
10:00: 13 m
11:00: 13.5 m
12:00: 14 m
12:30: 15 m

03.6. Water Dynamics
Water Dynamics Through the Generated Fabric
153
Sedimentation Rates Around the Globe
03.7. Sedimentation
The investigation of sedimentation rates on different marshlands was pursued to understand this natural phenomenon properly, and contact with sedimentation researchers brought useful inputs into the project. Natural sedimentation takes place on marsh zones around the world at rates of 2 cm to 6 cm per year; on the project site this amount is 1 cm to 4.5 cm. We are projecting a rise of 10 cm to 15 cm by boosting the process through the systematic approach of the project.
154
Physical Sedimentation Tests on the Coast
Several physical sand sedimentation experiments were carried out on the site by the team. What interests us in this setup is the actual sand deposition around the placed obstacles, with the increase of surface area as well as the rich patterns given by this deposition. We developed a colour-based behavioural simulation of the sand sedimentation through the Maya fluid engine; the outputs of the engine are used to generate surface transformations.
03.7. Sedimentation
Physical Experiments & Simulation of Sand Sedimentation
Simulation Results of Sedimentation as Colormaps
155
Topography Generation through Time
03.7. Sedimentation
The outputs of the Maya simulation, the colormaps, become inputs that transform a planar surface into a three-dimensional one. Dark blue generates the lowest pits on the surface, while the yellow end of the scale generates the highest peaks. The colour chart on the right shows the transformation for the different colour scales.
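The colormap-to-height step amounts to a linear remapping of a normalized colour value onto a vertical displacement range. The particular range values below are assumptions for illustration; the blue-low/yellow-high convention is the one stated in the text.

```python
def height_from_color(value, low=-2.0, high=4.0):
    """Map a normalized colormap value (0 = dark blue, 1 = yellow)
    to a vertical displacement: dark blue digs the lowest pits,
    yellow raises the highest peaks. The range bounds are assumed."""
    return low + value * (high - low)

# Sample a row of the colormap (normalized 0..1 values).
row = [0.0, 0.2, 0.5, 0.9, 1.0]
print([round(height_from_color(v), 2) for v in row])
```

Applying this per pixel to the simulation's colormap and using the results as z offsets on the planar mesh reproduces the surface transformation described.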
156
Through time, the surface area of the land increases through the created undulations, allowing more species to settle in. The colormap and surface-transformation method projects an increase of 12 percent in ten years. This method also results in the emergence of shaded domains of different qualities for the inhabitants, and the designed latticed structure, as an additional habitat for species, supports the shading phenomenon. The topography is also choreographed so that the endangered marshland is sustained.
Simulation Result of Sedimentation-based Transformation
Simulated Vector Maps
03.7. Sedimentation
Generation of TouchDown Profile
157
Generation of TouchDown Point Densities
03.7. Sedimentation
The sedimentation process has a direct relationship inherited from the L-system. The density value we are interested in is one that gives minimal intervention on the land and maximum performance in terms of sand deposition. The touchdown points (TD) are generated on the L-system strings by distributing them at certain distances; four different levels of density are generated following this logic.
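Distributing touchdown points "by certain distances" along an L-system string is an arc-length resampling of a polyline. A minimal sketch, assuming straight segments between string vertices; the function name and sample polyline are ours.

```python
import math

def resample(polyline, spacing):
    """Distribute touchdown points along a polyline at a fixed spacing,
    measured along the curve (the distance-based logic described for
    the L-system strings)."""
    points = [polyline[0]]
    carried = 0.0          # arc length covered since the last point
    for a, b in zip(polyline, polyline[1:]):
        seg = math.hypot(b[0] - a[0], b[1] - a[1])
        t = spacing - carried
        while t <= seg:
            r = t / seg
            points.append((a[0] + r * (b[0] - a[0]),
                           a[1] + r * (b[1] - a[1])))
            t += spacing
        carried = (carried + seg) % spacing
    return points

# An L-shaped branch, touchdown points every 3 units of arc length.
pts = resample([(0, 0), (10, 0), (10, 5)], spacing=3.0)
print(len(pts), pts)
```

Running this with spacings between 3 m and 10 m would generate the density levels the fluid-engine tests then compare.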
158
Topography Generation on Different Density Values
03.7. Sedimentation
The different levels of density are run through the fluid engine. A certain optimum range was found to choreograph the sand topography, which also results in larger surface areas. The best performing density level results from spacing the touchdown points 3 to 10 metres apart.
159
160
03.7. Sedimentation
03.8. Sedimentation on Mudflat zone
Simulation Results on the MudFlat Zone
161
162
03.8. Sedimentation on Mudflat zone
To investigate our designed densities (3 m-5 m TD distance), we tested the density values fed into the hierarchical model through the simulation. According to the colormap results, it performs to the level of expectation in terms of surface increment and emerging patterns.
03.8. Sedimentation on Mudflat zone
Simulation Results on the MudFlat Zone
Test Patch on Sedimentation Zone
Topography
Heights
163
03.8. Sedimentation on Mudflat zone
Topography Generation based on Simulation Results on the MudFlat Zone
164
165
03.8. Sedimentation on Mudflat zone
03.8. Sedimentation on Mudflat zone
166
Emergence of a New Pattern on the Thames Estuary The sand deposition is initiated at the local level, but the deposits start to interfere with each other and eventually grow to the macro level, carving a new topographic pattern on the estuary coastline. Thus the symbiotic relationship between the artifact and nature is permanently reflected through time.
167
03.8. Sedimentation on Mudflat zone
03.9. Pattern Elimination
Frame 00
Frame 10
Frame 20
Frame 60
Frame 70
Frame 80
Frame 100
Frame 110
Frame 120
Pattern elimination The Penrose pattern is eliminated using the L-system information and its hierarchical system, creating different profiles, intensities and thickenings throughout the development to cater for programmatic issues.
168
Frame 40
Frame 50
Eliminated penrose pattern A sample of the eliminated pattern, based on the hierarchical information of the L-system.
03.9. Pattern Elimination
Frame 30
Frame 90
Frame 130
169
170
Deformed Pattern
03.9. Pattern Elimination
Water Pools
03.9. Pattern Elimination
Surface
Isles Open to sky
Open Pools Cut outs : open to sea
Water Pools Surface
Isles Open to sky
Eliminated Pattern
171
03.10. Topography 172
Frame 00
Frame 10
Frame 20
Frame 60
Frame 70
Frame 80
Frame 100
Frame 110
Frame 120
Frame 40
Frame 50
03.10. Topography
Frame 30
Topography creation Information inherited from the L-system hierarchy feeds into a 3D deformation algorithm, creating differentiated three-dimensional movement to fit the programmatic fitness criteria and forming the first layer of the topography.
3D pattern deformation: the first and main topography profile, fed by the inherited information of the L-system.
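The deformation step described above can be sketched minimally: each point carries the hierarchy level inherited from the L-system, and its vertical displacement grows with that level. The point format and the lift factor are assumptions for illustration.

```python
# Minimal sketch (names assumed) of feeding L-system hierarchy into a
# vertical deformation: each point is lifted in proportion to the
# hierarchical depth tagged on its parent tile.
def deform_heights(points, base_lift=1.5):
    """points: list of (x, y, level) -> list of (x, y, z)."""
    return [(x, y, base_lift * level) for x, y, level in points]

pts = [(0.0, 0.0, 0), (1.0, 0.0, 1), (0.5, 1.0, 3)]
print(deform_heights(pts))
# -> [(0.0, 0.0, 0.0), (1.0, 0.0, 1.5), (0.5, 1.0, 4.5)]
```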
[Simulation frames 00–110]
1st layer topography
Secondary topography movement: By using the L-system information at the local scale, another layer of topography with differentiated undulation emerges, creating varied local architectural qualities for programmatic purposes.
Additional layer topography
Plan · Isometric
Vertical pattern deformation applied to a regular rectangular grid

Plan · Isometric
Vertical pattern deformation applied to a distorted Penrose pattern

Section
Because of the nature of the distorted Penrose pattern, the geometry takes on a particular character that it would not have if the same algorithm were applied to a regular rectangular grid. The geometry carries a differentiation determined by the distortion of the Penrose pattern, which is in turn regulated by the hierarchical information of the L-system.
Section
This simple algorithm becomes adaptive once the user feeds all the information from the previous steps into it.
03.11. Touchdown Points
[Simulation frames 00–130]
Touchdown points extrusion: Using the density information from the L-system, a series of structural touchdown and anchoring points is formed. These elements function as the main structure of the landscape and are also intentionally designed to induce sand deposition on the site, enhancing the quality of the natural landscape.
Structural touchdown points
Touchdown points
Sedimentation touchdown points, responsible for creating sand sedimentation.
Natural sand sedimentation: the sedimentation level that will become a habitat for marine life.
Touchdown points are divided into two families: structural anchoring points and sedimentation touchdown points. Structural anchoring points occur at 50-metre intervals and function primarily as anchors supporting the infrastructure, while sedimentation touchdown points are moments where extrusion happens on the structure to create sand sedimentation. Sedimentation touchdown points don't puncture the ground deeply; in fact, they barely touch the surface of the sand, so as to create the maximum sedimentation effect.
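The two-family split can be sketched as a classification over positions along the structure. The 50 m anchoring interval comes from the text; the function name and the modulo rule are assumptions for illustration.

```python
# Hedged sketch of the two touchdown-point families: deep structural
# anchors every 50 m, shallow sedimentation touchdowns everywhere else.
def classify_touchdowns(positions, anchor_interval=50.0, tol=1e-6):
    """positions: distances along the structure, in metres."""
    anchors, sediment = [], []
    for s in positions:
        r = s % anchor_interval
        if r < tol or (anchor_interval - r) < tol:
            anchors.append(s)     # deep, load-bearing anchoring point
        else:
            sediment.append(s)    # barely touching the sand surface
    return anchors, sediment

a, s = classify_touchdowns([0.0, 12.5, 50.0, 62.0, 100.0])
print(a, s)  # -> [0.0, 50.0, 100.0] [12.5, 62.0]
```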
03.12. Enclosure
[Simulation frames 00–260]
Enclosure deformation
Hub enclosure
In particular areas, enclosures are created by differentiating the 3D distortion of the main pattern, to function as observation hubs and resting places.
The obscurity between structure, human habitation, enclosure, marine-life habitat, and the natural site itself is intended. Through this expression, the built environment merges seamlessly with the greyness of the natural mudflat, creating a distinctive atmosphere.
Human eye sketch
Bird's eye sketch
03.13. Structural System
[Simulation frames 00–110]

Hub solid structure
Selection by area determines which parts of the enclosure become solid surface and function as the main roof structure of the observation area.
Structure selection
Structural system
The structural investigation begins with the assumption that the accumulation of the small triangles deformed by the L-system can create stronger structural performance and can be used as the anchoring network for the entire surface.
Deformed areas as the structural spine
01
02: Load distribution, isometric analysis
Load distribution, section analysis
03: Structural spine + anchoring points
Zoomed in indicator
Subdivided opening · Opening apertures area · Continuous structural ribs

The L-system hierarchical model embeds various kinds of information into the distorted pattern. Differentiation can be tagged by picking out inherited information, which enables the user to retrieve information without having to trace back through the parent model.
In this case, the area information, whose values have been distorted by the L-system hierarchical model, becomes a set of inputs for the next step of the algorithm.
By tagging area information, the user can determine which areas are suitable to become structural ribs and surfaces and which are candidates for openings. The diagram above shows the two different areas, selected by area values, that will have two different performance criteria.
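The "selection by area" tagging can be sketched as a threshold over the area values carried by each tile. The threshold value and the tag names are assumptions for illustration, not the studio's actual parameters.

```python
# Minimal sketch of selection by area: tiles carry the area value
# distorted by the L-system hierarchy, and a threshold tags each tile
# as structural rib/surface or as an opening candidate.
def tag_by_area(tile_areas, threshold=2.0):
    """Map tile id -> 'structure' (large areas) or 'opening' (small)."""
    return {tid: ("structure" if area >= threshold else "opening")
            for tid, area in tile_areas.items()}

tags = tag_by_area({"t0": 3.1, "t1": 0.8, "t2": 2.0, "t3": 1.4})
print(tags)
# -> {'t0': 'structure', 't1': 'opening', 't2': 'structure', 't3': 'opening'}
```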
Zoomed-in view of the mesh subdivision that will be the datum for the opening apertures.
Interior view sketch
[Simulation frames 00–90]
Closure main structure, first subdivision structure: The pattern lying over the voids of the closure becomes struts for the first subdivision structure. These struts act as beams supporting the mesh openings.
[Simulation frames 00–110]
Substructure, second subdivision structure: To rationalize the dimensions of the openings, a second subdivision structure is created. This structure holds the opening panels.
03.14. Apertures
[Simulation frames 00–110]
Apertures, mesh opening: Another step of subdivision bridges the dimensions down to human scale. The apertures change according to the sun direction to achieve ideal internal illumination.
Geometry modification: By tweaking simple characteristics of the pattern geometry, other kinds of geometry are formed. This modification can only be applied at the last subdivision, since altering the geometry means losing the identity of the Penrose triangle, after which the subdivision cannot be applied again. The modified geometries are then analysed by their degree of openness, to determine which one best fits a given programmatic function.
01. Rounded triangle offset inside the original triangle. Void : solid proportion = 50 : 50. Used for the main structural geometry, since it has a neutral character.

02. Spline formed by the edges of the triangle. Void : solid proportion = 25 : 75. Used for the osmotic areas, since it tends toward a high degree of closure.

03. Midpoints of each side of the triangle connected to the centre point. Void : solid proportion = 75 : 25. Used for the observation spaces, since it has a high degree of openness.
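Matching each modified geometry to a programmatic target by its void ratio can be sketched as a nearest-match lookup. The three ratios come from the list above; the geometry names and the selection heuristic are assumptions.

```python
# Sketch of matching aperture geometries to programmatic targets by
# void:solid ratio (ratios from the text; nearest-match rule assumed).
GEOMETRIES = {
    "rounded_offset": 0.50,   # 01: neutral, main structure
    "edge_spline":    0.25,   # 02: closed, osmotic areas
    "midpoint_star":  0.75,   # 03: open, observation spaces
}

def pick_geometry(target_void_ratio):
    """Return the geometry whose void ratio is closest to the target."""
    return min(GEOMETRIES, key=lambda g: abs(GEOMETRIES[g] - target_void_ratio))

print(pick_geometry(0.7))   # -> midpoint_star
print(pick_geometry(0.45))  # -> rounded_offset
```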
Sample area
Sun orientation becomes another variable in creating the geometry of the apertures. The normal vector of each segment and the direction of the sun rays determine how wide each aperture opens. This way, the building can achieve optimum natural interior illumination based on its orientation toward the sun.
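One plausible reading of this rule is that the opening fraction follows the angle between each panel normal and the sun direction, via their dot product clamped to [0, 1]. Plain vector maths; no geometry kernel is assumed.

```python
# Hedged sketch of the sun-driven aperture rule: panels whose normal
# aligns with the sun direction open fully, panels facing away stay shut.
import math

def opening_fraction(normal, sun_dir):
    """Return a value in [0, 1]: cosine of the angle, clamped at zero."""
    nx, ny, nz = normal
    sx, sy, sz = sun_dir
    n_len = math.sqrt(nx*nx + ny*ny + nz*nz)
    s_len = math.sqrt(sx*sx + sy*sy + sz*sz)
    cos_angle = (nx*sx + ny*sy + nz*sz) / (n_len * s_len)
    return max(0.0, cos_angle)

print(opening_fraction((0, 0, 1), (0, 0, 1)))   # facing the sun -> 1.0
print(opening_fraction((0, 0, 1), (0, 0, -1)))  # facing away -> 0.0
```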
03.15. Water Breaker
[Simulation frames 00–90]
Outline detection, surface edge lines: Detecting the surface edge lines that directly touch the water, in order to create a water-breaker geometry that absorbs the water's force.
[Simulation frames 00–110]
Adaptive subdivision: an adaptive subdivision regulated by proximity becomes the base geometry of the water breaker.
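Proximity-regulated adaptive subdivision can be sketched in one dimension: a segment keeps splitting only while it lies close to a focus (the water edge, say). The recursion scheme, depth limits, and distances are illustrative assumptions.

```python
# Minimal sketch of proximity-regulated adaptive subdivision: segments
# near `focus` split recursively into finer pieces; distant segments
# stay coarse. All parameters are illustrative.
def subdivide(a, b, focus, min_len=1.0, radius=4.0):
    """Return segment endpoints; spacing is finer near `focus`."""
    gap = max(a - focus, focus - b, 0.0)  # distance from focus to [a, b]
    if (b - a) <= min_len or gap > radius:
        return [a, b]
    mid = (a + b) / 2.0
    left = subdivide(a, mid, focus, min_len, radius)
    right = subdivide(mid, b, focus, min_len, radius)
    return left[:-1] + right  # merge, dropping the duplicated midpoint

pts = subdivide(0.0, 16.0, focus=0.0)
print(pts)  # -> [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 8.0, 16.0]
```

The output shows the density gradient: unit-length segments near the focus, a single coarse segment far from it.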
Water breaker subdivision
Water breaker line Enclosure outline
Platform outline
Recognizing the water-breaker location: By analysing the outlines of the roof enclosure and the human-access platforms against the touchdown-structure outline, a set of lines emerges that defines where the solids directly touch the water. With these lines, the areas meant to break the water can be defined.
Water breaker line
Subdivision area
Enclosure/platform area
Zoomed in adaptive subdivision plan
Water breaker perspective sketch
[Simulation frames 00–110]
Water breaker geometry
Dynamic pattern extrusion, based on area and proximity to the edge, forms a water-breaker geometry whose intricacy absorbs the force of water collision.
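One way to read "based on area and proximity to the edge" is an extrusion height that grows with tile area and decays with distance to the water edge. The weighting below is an assumption for illustration only.

```python
# Hedged sketch of the extrusion rule: bigger tiles closer to the
# water edge extrude the most. The gain k and decay form are assumed.
def extrusion_height(area, dist_to_edge, k=2.0):
    """Height grows with area, decays with distance to the edge."""
    return k * area / (1.0 + dist_to_edge)

tiles = [(4.0, 0.0), (4.0, 3.0), (1.0, 0.0)]  # (area, distance to edge)
heights = [extrusion_height(a, d) for a, d in tiles]
print(heights)  # -> [8.0, 2.0, 2.0]
```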
Dynamic extrusion
Diagrammatic section
Zoomed in plan of the adaptive subdivision
Isometric view of the water-breaker geometry. The intricacy of the geometry disperses the water's force to protect the human-inhabitation area, while at the same time providing an opportunity to create new kinds of marine-species habitat.
03.16. Section - Mudflat Zone
Aperture mesh · Structural struts · Solid enclosure surface · Beach condition · Interconnection void · Water-breaker geometry
High tide level
Enclosure structural ribs
Mudflat level
Aperture mesh · Structural struts · Interconnection void · Outdoor beach condition · Water-breaker geometry · Habitable struts · High tide level
Mudflat level
03.17. Views
03.18. Ecology
Sedimentation touchdown points are designed specifically to alter the natural mudflat topography. Besides increasing the surface area of the natural topography, and with it the habitat for marine species, this phenomenon also brings a new pattern onto the mudflat, creating variations in shading conditions that enhance the quality of the habitat.
This artificial intervention will add to the variety of species inhabiting the Thames mudflat, which lacks solid natural features.
The convoluted structure, with its variable exposure to water over time, will become a new habitable environment for new kinds of marine species.
Before: The accumulated sand sedimentation will bury the structure over time. Parts of the structure will eventually dissolve into the natural landscape, becoming a relic that enriches the variety of habitable environments for marine species.
After
Sand sedimentation
Intertidal zonation
The structure's varying exposure to water over time, caused by the tidal movement of the sea, gives rise to a phenomenon called intertidal zonation: different kinds of species inhabit different areas of a structure, according to each area's degree of exposure to water.
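Intertidal zonation can be sketched as a mapping from tidal exposure to a habitat band. The band names and thresholds below are illustrative assumptions, not site measurements.

```python
# Small sketch of intertidal zonation: the fraction of time a point on
# the structure is submerged maps to a habitat band (thresholds assumed).
def zonation(submerged_fraction):
    if submerged_fraction >= 0.9:
        return "subtidal"        # almost always under water
    if submerged_fraction >= 0.5:
        return "low intertidal"
    if submerged_fraction >= 0.1:
        return "mid intertidal"
    return "splash zone"         # wetted only at the highest tides

print([zonation(f) for f in (0.95, 0.6, 0.2, 0.02)])
# -> ['subtidal', 'low intertidal', 'mid intertidal', 'splash zone']
```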
03.19. Water Channels
[Simulation frames 00–130]
Channel surface
Salt water channels
The first layer of the development, created from L-system information and differentiated by hierarchical information.
[Simulation frames 00–90]
Water flow: The topography is designed to channel salt water to the clear-water area exclusively, preventing the two states of water from mixing.
Salt water flow
[Simulation frames 00–90]
Irrigation system: Clear water is channelled through an open irrigation system that moves from pool to pool. This exposes the clear water to the sun, maintaining a low salinity level.
Clear water flow
[Simulation frames 00–90]
Accessible platform: The upper surface of the topography serves human accessibility and also acts as a cover over the salt-water channels, preventing the salt water from direct exposure to sunlight in order to maintain a high salinity level.
Platform surface
[Simulation frames 00–90]
Closures: Thickening and layering of the pattern occur where the two states of water collide, creating closures that function as observation areas and also house the osmotic energy plant, which uses the pressure generated when clear water and seawater merge to produce electricity.
Hub closures
03.20. Sedimentation on River Link
Topography Generation based on Simulation Results on the RiverLink Zone
Low Interference at the River Link Zone: The strategy for generating the touchdown points at the river link is to keep their number as low as possible. The information is taken from the structural input and fed into the TD generation script. The TDs are distributed at 50-metre intervals so as to support the built fabric above. This way the interference is very limited, and it is projected that the flow of the water would not be significantly disturbed.
03.20. Section - River Link
Clear water pool
Osmotic plant area Structural anchoring points High tide level Low tide level
Observation deck
Seabed level
Observation deck
Osmotic plant area Clear water reservoir Clear water pool Salt water channel Structural anchoring points High tide level Low tide level
Seabed level
Observation Deck
Accessible Platform
03.21. Landscape profiles
Water Pool
Low Tide
Salt Water Channels Observation Deck
Non Accessible During High Tide
Water Pool
High Tide
Salt Water Channels
Section A
Observation Hub Accessible Platform
Low Tide
Osmotic Plant
Salt Water Channels
Observation Hub Non Accessible Platform during High Tide
High Tide
Section B
Osmotic Plant
Salt Water Channels
Accessible Platform
Low Tide
Salt Water Channels
Accessible Platform
Non-accessible during High Tide
Salt Water Channels
Accessible Platform
High Tide
Section C
Low Tide Non Accessible During High Tide
Accessible Platform
Water Pools
Osmotic Plant
High Tide
Section D
03.22. Fabrication System
Fabrication system
A prototype demonstrating a flexible casting frame that can be adjusted via control panels on its two sides, creating the overall shape of a Penrose surface through a tension mechanism.
[01] configuration 01 [02] configuration 02 [03] control panel [04] tension mechanism
Flexible casting frame
03.23. Construction System
Isometric 01
Construction system: This investigation seeks a system for building the whole surface according to the structural properties and the panelized casting system from the previous stage. The deformation area, which is the main structural network (01), will be cast on site, while the rest of the surface, sitting between the structural networks (02), is more flexible in terms of casting location.
[01] Main structural network (the spine) [02] Panelized surface
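The cast-on-site versus prefabricated split can be sketched as a simple classification over panels, keyed by membership in the structural spine. Panel ids and the spine set are placeholders for illustration.

```python
# Minimal sketch of the construction split: panels belonging to the
# structural spine (deformation areas) are cast on site; the rest are
# prefabricated and can be cast elsewhere. Ids are placeholders.
def casting_plan(panels, spine_ids):
    return {p: ("cast on site" if p in spine_ids else "prefab panel")
            for p in panels}

plan = casting_plan(["p1", "p2", "p3", "p4"], spine_ids={"p2", "p4"})
print(plan)
# -> {'p1': 'prefab panel', 'p2': 'cast on site', 'p3': 'prefab panel', 'p4': 'cast on site'}
```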
Isometric 02
Plan
Section 01
Section 02
03.24. Views
The interconnected interior blurs the differentiation between levels and spaces inside the structure.
Interior view showing the void that connects the levels spatially.
03.25. Touchdown and Anchoring Points
Touchdown Point Distribution on the Site Based on Different Densities
Touchdown high-density points: perimeter check (6m -> 3m-3m)
Touchdown low-density points: perimeter check (25m -> 12.5m-12.5m)
Anchoring points: perimeter check (50m -> 25m-25m)
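One plausible reading of the "perimeter check" is a greedy minimum-separation filter: a candidate point is accepted only if no already-accepted point of the same family lies within that family's separation distance (6 m, 25 m, 50 m in the list above). The greedy scheme itself is an assumption.

```python
# Hedged sketch of a perimeter check as a greedy minimum-separation
# filter over candidate touchdown points in the plane.
import math

def perimeter_filter(candidates, min_sep):
    """Accept points in order, rejecting any closer than min_sep
    to a previously accepted point."""
    accepted = []
    for x, y in candidates:
        if all(math.hypot(x - px, y - py) >= min_sep for px, py in accepted):
            accepted.append((x, y))
    return accepted

pts = [(0, 0), (4, 0), (7, 0), (0, 8)]
print(perimeter_filter(pts, min_sep=6.0))  # -> [(0, 0), (7, 0), (0, 8)]
```

Running the same filter with 6 m, 25 m, and 50 m separations would yield the three point families of the diagram.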
03.26. Orders of Triangle
The prototypical system has different orders of Penrose triangles, each of which has specific roles to play. The images below show examples of architectural implementation arising from the different scales of Penrose triangles, produced by the iterations and deformations the methodology offers. There are two subdivision iterations and four levels of deformation in the triangular prototype. The first-order triangles, with no subdivision and level 0 or level 1 deformation, take their place in the performative system as circulation platforms, channelling structures, and species habitation, by generating large amounts of surface. The same triangle set at deformation levels 2 to 4 starts to behave as the structural spine of the designed fabric. The first subdivision of the Penrose triangle produces the sub-structural system, and where the fabric interacts with water, the extrusions of this level of triangles perform as wave breakers. The last subdivision generates the aperture mesh according to the sun direction, translating the system to the human scale.
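The role assignment above can be summarized as a lookup from (subdivision iteration, deformation level) to performative role. The mapping follows the paragraph; casting it as a function, and the role labels themselves, are assumptions for illustration.

```python
# Sketch of the orders-of-triangle role assignment described in the text.
def triangle_role(subdivision, deformation):
    """Map a triangle's (subdivision, deformation level) to its role."""
    if subdivision == 0:
        return ("circulation / channelling / habitat" if deformation <= 1
                else "structural spine")
    if subdivision == 1:
        return "sub-structure and wave breaker"
    return "aperture mesh (human scale)"   # last subdivision

print(triangle_role(0, 0))  # -> circulation / channelling / habitat
print(triangle_role(0, 3))  # -> structural spine
print(triangle_role(1, 2))  # -> sub-structure and wave breaker
print(triangle_role(2, 0))  # -> aperture mesh (human scale)
```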
Different Scales of Penrose Triangle towards Performance
03.27. Ecosystem
Projecting the Zonation of the Emerged Habitation
Artificial New Habitat through Fabric
04 Models
Physical Models
04.1. Model
04.2. Model
04.3. Model
04.4. Model
Ardes Perdhana Indonesia
272
Mehmet Akif Cinar Turkey
Pebyloka Pratama Indonesia
Muhammed Shameel India
Acknowledgement

We would like to thank our parents and families for their support and love. Alisa Andrasek, for guiding us and giving us the opportunity to work with her studio. The DRL Directors and Tutors, Theodore Spyropoulos, Yusuke Obuchi, Marta Male-Alemany, Rob Stuart Smith, Jeroen van Ameijde, and Shajay Bhooshan, for their constant input and critiques throughout the course. Thanks to all our colleagues at the AADRL and to our friends for their timely help. We also take this opportunity to thank all the AA support staff.