tarp
Architecture Manual
coding parameters
SPRING 2010

Pratt Institute School of Architecture
200 Willoughby Avenue, Brooklyn, NY 11205
tarp@pratt.edu

Annie Boccella, Tânia Branquinho, Manuel DeLanda, Erik Ghenoiu, Alpna Gupta, Mitchell Joachim, Léopold Lambert, Sarah Le Clerc, Marinelle Luna, Peter Macapia, Hannibal Newsom, Sarah Ruel-Bergeron, David Ruy, Scott Savage, suckerPUNCH, Erik Thorson, Kazys Varnelis, James Williams, Eleftheria Xanthouli



Contents

Sarah Le Clerc, Editorial, 3
Erik Ghenoiu, The View from the Cloisters, 4
David Ruy, Randomness and Irreducible Complexity, 6
suckerPUNCH, In the Ring: conversations with Roland Snooks [kokkugia], Alex Pincus and Peter Zuspan [Bureau V], Mike Szivos and Jose Gonzalez [SOFTlab], Brennan Buck [Freeland Buck], and Maxi Spina [msa], 10
James Williams, Urban Dynamics and Urban Form, 20
Mitchell Joachim, Rapid Re(f)use: Waste to Resource City 2120, 24
Annie Boccella, Spatial Textures, 28
Erik Thorson, Physical Intuition, 32
Manuel DeLanda, Deleuze and the Use of the Genetic Algorithm in Architecture, 36
Léopold Lambert, Computational Labyrinth: Towards a Borgesian Architecture, 41
Hannibal Newsom, The Digital Distraction, 47
Kazys Varnelis, On Games, Garages, and Complexity, 50
Tânia Branquinho & Eleftheria Xanthouli, Apomechanes, 52
Peter Macapia, State of the Art: For and Against Rationalism, 56
Scott Savage, Analog Form-Finding, 58
Sarah Ruel-Bergeron, Please GAUD, Put Me on the Express Train to Righteousness, 63
Contributors, 66



Editorial

The engagement of computational and digital practices in architecture seems to amaze and confound even its strongest proponents. We are obsessively immersed in everything computational, whether because of the continuous emergence of innovative tools or a latent desire for a utopian/dystopian influence on our present. It is precisely for these reasons that we have decided to focus this issue of tarp on computational methodologies and to pose the question, "Are these practices giving us the architecture that we are envisioning?" I believe our persistence in experimentation and critique is best described by what Kevin Kelly, co-founder of Wired magazine, outlines in his article "Extropy." He speculates that the actual power of computation is that:

1. Computation can describe all things.
2. All things can compute.
3. All computation is one.

Of course, this is a broad conjecture, but one that allows for seemingly problematic constraints to be temporarily resolved. If "computation can describe all things," then it is not just the solution we are seeking through computation, but also the problem and the process. It is with this unifying language that we will ultimately find the simple complexity at large.

Yes, these hypotheses are infinite, but nonetheless invigorating to dwell upon. This utilization of computation as 'source' opens us up to a tremendous range of possibilities. It is often claimed in passing that computation in design provides us with forms devoid of the designer's hand. Yet the scripting generation believes that these methods provide opportunities for implementing an evolutionary process capable of strategically adapting to environmental and/or structural changes. We only have to read science writer and past Nature editor Philip Ball's "Branches" to see the possibilities of complex order and organic patterning. It is here that we delve into the exploration and critique of what is truly viable in these algorithmic strategies. Ultimately I hope that through these various computational models, such as parametric systems, generative processes, and scripting, we are able to gain more insight into the complex systems that surround us. And although we are nowhere close to inhabiting a self-adapting amorphous structure, it is exciting to think that someday a generation possibly will. 

-Sarah Le Clerc



The View from the Cloisters Erik Ghenoiu

It's very easy to spend a decade as a scholar of architecture and never really come to terms with the advent of digital architecture and the computational generation of forms. I did it. The topic hardly ever even came up, to be honest. Only a few of the senior scholars in the field seem to have much interest in it, and the established writer-architects, most of whom were in practice before Pac-Man was released, often enough have disparaging remarks ready about how computer-generated architecture is no substitute for drawing and thinking. Last week, for instance, I heard Daniel Libeskind say that he's never been interested in the stuff, and, given the choice, he'd rather look at vernacular architecture in the Midwest.1 This isn't to say that it's not a topic a history/theory person might take up: it's just not quite as respectable for us as, say, Mies' early drawings, or as stylish as studying the 1950s. Up to around 2000, we historians could get away with a few solid quips for the occasional roundtable with Greg Lynn and otherwise leave well enough alone. There wasn't anything to say about digital architecture, because it wasn't yet a canon we could take apart. In other words, it wasn't yet modern architecture. I like to think of history and theory's relationship to architectural practice as analogous to the relationship between church and state: the tonsured few look after the spirit of the thing and slowly decide what really mattered ex post facto, but whatever clout we have derives from the fact that we relatively rarely get to meddle in policy.

And so it has been with computational architecture, mostly left to be interpreted on the fly by our studio colleagues and not in the slide shows (the students have already seen it) or close textual readings (no good texts to assign) of history and theory courses. As a result, for this question at least we're not yet entirely relevant, though being ourselves the professional critics, we tend to be immune to criticism for it (insulting the history and theory establishment in architecture is like breaking a mirror: seven years of bad press). It's not as if we don't ever think of technique. We have a pretty good handle on drawing and perspective by now, some pretty clever things to say about Christopher Wren's use of model making, and we could even tell you a thing or two about what in contemporary studio practice derives from Walter Gropius and what from the École des Beaux-Arts. But we haven't really digested the digital, for the obvious reason that the dish hasn't yet made it out of the kitchen. And so I've been glad to work with the editors of tarp in preparing this issue that aims to take stock of computational and digital architecture. Looking in from the outside, some basic principles emerge that will be more fully explored in the essays in this issue.

First, from a theoretical viewpoint, almost nothing that has been done in digital architecture requires a computer per se: the technology has simply enabled a series of paradigm changes, each of which can potentially be executed longhand, and in rudimentary form all of them were in evidence before CAD was invented. Of course, technological development has guided the direction of digital design by making certain kinds of processes very easy, and thus encouraging the exploration of those instead of countless other unfollowed paths, but the revolutionary change has been one of means and not principle. Second, I think it is useful to outline a few basic categories of work that have more or less succeeded each other as trends, but have all been at play from the beginning:

CAD as Drawing

Fundamentally, CAD begins as just another kind of drawing, more easily manipulable than the older kinds and lending itself to the easy representation of three dimensions. Just as the T-square made it easy to make straight lines, so freeform surface modeling made it easy to make them curvy. CAD also made it easier than ever to design abstract surfaces without immediate concern for traditional materials and construction, which was further enabled in practice by increasingly sophisticated engineering and new materials. This was also dangerous, in that it made it easy to design impressive sculpture without thinking too deeply about program and use, but fundamentally this was just the intensification of a process of which drawing was already capable, substituting complex curves and blobs for straight lines and the imitation of previous drawings.

Generation of Forms

Algorithm and scripting enacted a switch from representing forms directly produced by the designer (within the typical forms made easy by the software) to making the act of design the framing of a process that would itself generate the forms, with more or less oversight and intervention by the architect. The fact that this almost feels like natural evolution has contributed to a renewed interest in biomorphism in design, but like the "organic" whip-lines of the Art Nouveau, this can actually be opposite to the evolutionary process when it imitates forms directly rather than adapting to external needs. At the same time, this kind of computational design has served as a way to institutionalize the kind of iterative architecture of a rigorously developed formal system that characterized the early career of Peter Eisenman and the New York Five. At worst, however, it has been a way to crib Eisenman's kind of process without taking with it the theoretical framework that the process was originally meant to serve.

Evolutionary Algorithm

A third branch of computational design seems not to be concerned with the generation of forms at all, but rather with the use of carefully crafted controls, and often massive amounts of field data, precisely to adapt design to program, or sometimes spaces to use. This attitude was at first more typical of urban design and planning than of architecture proper, but it is increasingly to be met with everywhere. An early example of this is Bill Hillier's "space syntax" project in London, the success of which has been qualified by all the problems of any other intrusion of statistical social science into the arts: a deep faith in the truth of data realizes only the kinds of virtues that data can measure, and can never look hard enough at its own qualitative assumptions. The data-poor and statistically naïve approach of the artist and, by extension, the designer inevitably intervenes, and must intervene, on many levels that are not simultaneously susceptible to conscious analysis. Whether the art is in the framing of a process or the creation of a form, it probably remains necessary regardless of the sophistication of the means it employs.

Seen from the cloisters, design has always derived part of its importance from being our chance to play God, to take our own stab at the act of creation, and, for architecture in particular, the creation of a world. Perhaps in this light the idea of an evolutionary algorithm that eliminates the need for the designer is a close parallel to Darwin's evolution, which also did away with the need for a designer, or at least with any continuing need for one. But even if we succeed in the laboratory in developing a tool by which the designers have finally invented themselves out of the system, it still wouldn't alleviate the need for someone to navigate the larger apparatus of the design fields, which after all have a job to do. A perfectly adapted algorithmic architecture will still be faced with a global slum problem, existing neighborhood development practices, perpetually renegotiated land use policy, changing financial systems around mortgage and debt equity, unruly clients, crooked contractors, et cetera ad infinitum. We know that our new tools can help us generate and test fantastic forms on the screen or in an extremely controlled monumental project. Whether we can put them to work to help us negotiate the quotidian muck of the real remains to be seen, or finally done. 

NOTES
(1) Daniel Libeskind in conversation with Catherine Ingraham, Pratt Architecture Lecture Series, 5 April 2010.


Randomness and Irreducible Complexity David Ruy

In a recent book by an ex-CIA operative,1 a story is told in which an important al-Qaeda leader is located and subdued in Pakistan. Confirmation of the target's identity is a problem in the chaotic aftermath of the operation. No identifying documents are in the al-Qaeda leader's possession, and his face does not resemble photographs on file. The CIA operative is instructed by the field office to photograph the target's ear. Sending the digital image via cell phone, the target is positively identified almost immediately. It turns out that no two ears are alike; examining the form of the ear is a nearly foolproof way of verifying identity. Three conditions are essential to any biometric technique. First, the feature must be unique to an individual. Second, the feature needs to be universal (everybody has a fingerprint). Finally, the feature must be persistent (your fingerprint can't change over time). You may have also heard of using retinal patterns, dental structures, fingerprints, voice, DNA, and of course, the face itself. It is remarkable that variation in the natural world is so pervasive and specific. Nobody is exactly alike. It is quite strange if you think about it.

I recount this story to raise a question about a phenomenon so familiar and fundamental that we hardly ever think about it: the seemingly absolute condition of individuation and variation in the world around us. The physical universe is teeming with an outrageous degree of differentiation. Its evidence is everywhere and every time. The question in relationship to this phenomenon is simply, "Why?"

As architects, we intervene in the world influenced by underlying assumptions about the constitution of matter and its higher order constructions. Usually, we operate through inherited principles that we're not even conscious of (some of which are very old). Occasionally, we project architectural possibilities confronting our principles. We do so either as celebrations of architecture's inherent rationality and stability within an inhospitable world of change, or as quixotic attempts to mirror nature itself. But in the long view, architects manipulate matter. And that is profound. What principles are we assuming in this activity? And are those principles different from the principles operating in nature? Are there even principles operating in nature, or are we projecting progressively more refined rational models? Questions regarding the variability of the physical universe have always been foundational to our creativity and our ability to bring forth new things, but we have no satisfying answers, only uncomfortable hypotheses. Perhaps one day, maybe soon, we'll have some better answers, but for now, our ways of thinking about construction and part-to-whole constitution are exceptionally rational and remain in conflict with contradictory evidence everywhere. Rational models explaining the physical universe continue to be based on principles of sameness, though the world around us is always radiating the artifacts of a feral difference engine. The distance between the way we hypothesize construction and the actual constructions all around us is so great that for three millennia we have been tempted to conclude that the physical universe is simply filled with errors or random occurrences that are nothing but misleading, imperfect shadows of ideal forms. As Einstein proclaimed, God does not play dice.2 Of course this is true if you assume that we just don't see the hidden patterns when we see randomness. But why assume there is a pattern to be discovered? Clearly, there is an interest here with regard to what we perceive as random variation in the physical universe. If you assume that you can ultimately have a theory about everything, you do not believe there is such a thing as a random occurrence; what is perceived to be random just doesn't have a theory discovered yet. An unexpected insight into this discourse comes from algorithmic information theory. The word "random" is itself a distraction, because we associate the word with time and the unpredictability of the future. Flip the coin, and we don't know if it's going to be heads or tails. We have probabilistic models now that tell us it's a 50/50 chance, but ultimately, we don't know for sure what's going to come up next. We call this randomness. However, another way to think about it is in terms of the sequence of heads and tails that is formed if you flip the coin a thousand times. If you look at the sequence and discover an intrinsic pattern, you should be able to predict what comes next. The defining characteristic of randomness in this view is not the indeterminacy of the future but the informational content of the variation itself. A number of things are significant in this novel way of thinking about the coin flip:

1. Randomness is defined by the sequence and its lack of pattern.
2. The discovered pattern (if there is one) is a theory about how the sequence is generated.
3. The pattern that is detected can be simple or complex.

To elaborate point by point: first, we can ask about the informational content of the variation. The sequence of heads and tails generated by the coin flip can be encoded as a binary sequence of 0's and 1's, 0 for heads, 1 for tails. This sequence has a length and constitutes an informational content of a specific size. Let's say it's 1000 bits. Is it possible to compress this informational content to a smaller size? You would need an algorithm that looks for patterns so that the sequence can be stored in a smaller size. You know about zipping a file, I assume; what I'm describing here is not that different. Let's say you did this thousand-flip sequence three separate times, resulting in three separate sequences, and stored each as a binary file. When you zip-compress them, one shrinks to 200 bits, one to 400 bits, and one only a little, to 900 bits. In the definition of randomness in algorithmic information theory, we would say the first is the least random and the last the most random. The algorithm that determines the consistencies in the sequences and looks for a predictable pattern constitutes a theory about those sequences.
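This compression test can be run directly. The minimal sketch below is an illustration added for this issue, not part of Ruy's text; the off-the-shelf zlib compressor stands in for the ideal pattern-finding algorithm, a loose proxy, since true algorithmic (Kolmogorov) complexity is uncomputable.

    import random
    import zlib

    def bits_to_bytes(bits):
        # Pack a list of 0/1 coin flips into bytes so zlib can compress them.
        return bytes(int("".join(map(str, bits[i:i + 8])), 2)
                     for i in range(0, len(bits) - 7, 8))

    def compressed_size(bits):
        # Compressed size in bytes: a rough stand-in for informational content.
        return len(zlib.compress(bits_to_bytes(bits), 9))

    constant = [0] * 1000                                # one long run
    periodic = [0, 1] * 500                              # short repeating pattern
    fair = [random.randint(0, 1) for _ in range(1000)]   # fair coin flips

    for name, seq in (("constant", constant), ("periodic", periodic), ("fair", fair)):
        print(name, len(seq) // 8, "bytes raw ->", compressed_size(seq), "bytes zipped")

The constant and periodic sequences collapse to a few dozen bytes; the fair-coin sequence typically comes back no smaller than it went in, which is exactly the incompressibility discussed below.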


Second, the most astonishing thing about this scenario is that it is now largely accepted that the vast majority of possible mathematical objects (a sequence of numbers, for example) have no theory.3 The vast majority is random. For example, something as basic as the prime number sequence is still without a theory. At various corners of the world, supercomputers are hard at work factoring enormously large numbers to determine the next highest known prime. Every so often the discovery of the next prime number is a cause for celebration in the mathematical community (currently 2^43,112,609 - 1, discovered August 23, 2008). If we discovered a theory of primes, it would be a simple thing to generate as many prime numbers as you wish. If this sounds too geeky to be important, consider the fact that encryption schemes are based on these theory-less sequences, including the prime number sequence. Think about all the applications of encryption. However, beyond applications of this notion of randomness, it is truly astonishing that the simplest possible statements of particular sequences are the sequences themselves. Some binary sequences are incompressible. Your 1000-bit sequence of heads and tails, when zipped, didn't get a single bit smaller. And it's not a problem with the algorithm; in some cases, it is just not possible to compress it.

So finally, Gregory Chaitin is an important figure in this discussion. Chaitin is a mathematician at the IBM Watson Research Center and has made some remarkable contributions to algorithmic theory. Chaitin looks to an overlooked area of Leibniz's work considering the problem of stating a theory.4 The better theory is always the shorter one. Further, if the informational content of the theory is larger than the informational content of the phenomenon itself, it's a pretty useless theory; better would be the phenomenon itself as its own theory. That is, the simplest restatement of the random sequence is the sequence itself. Chaitin observes:

"My approach is completely different. It's based on measuring information and on showing that some mathematical facts have no redundancy and cannot be compressed into any mathematical theory because these facts are too complicated, in fact, infinitely complex. This new approach suggests that what Gödel originally discovered was just the tip of the iceberg and that the problem is much bigger than most people think."5

It is the idea of computation and this new understanding of randomness that is elevating these questions about the variability of the physical universe into a new territory of investigation, where the most enthusiastic proponents assert that computation is a paradigm that finally gives us access to the difference engine of physical existence.6 But it is important to point out that we're in the infancy of computational thought. A great deal of the current research remains radical and controversial.

To bring it back to the implications of all of this for architecture, we can look at the recent interest in computation and generative techniques in a different way. Considering the insights of algorithmic information theory, we can observe that the search for new computational methodologies is the search for a theory of variation. We are looking for the difference engine that will produce an architecture with all the richness and complexity of the physical universe. There are rarefied discussions about the technicalities of possible difference engines (scripting, simulated dynamics, associative models, etc.) and also some important discussions about why we want an architecture like nature in the first place (what's so bad, actually, about standing apart from nature heroically?). Over the last ten years, I've been fascinated by how designers sneak a randomization function7 into scripts, or deliberately compromise a simulation to get a particular pattern. It tells me that there is an aesthetic predisposition toward an irreducible expression with all the complexity, intricacy, and incompressibility of the natural world. I don't think it's an accident that all of this is taking place at a time when the question of architecture's relationship to nature is in crisis. In the end, I don't think these experiments are so esoteric; they are filled with charged consequences for our society. The importance of reporting some of the finer points of how randomness is theorized in algorithmic information theory is that we can argue that all of what is currently projected as generative techniques are essentially hypothetical theories about how to compress the random sequence. How do you produce the incompressible expression? These generative techniques are ultimately incredibly sophisticated means for projecting new idealities, idealities that are the most sophisticated and refined reflections we've ever had of the mystery at the heart of physical existence. The best designers in this domain ultimately rely on an age-old sensibility that has intuitive relationships to the world that changes. It is particularly human, all too human. I also participate in numerous conversations these days about digital techniques that I can characterize through a metaphor of driving a car. Do you want to be the mechanic or the driver? Scripting is akin to tinkering with the computational engine itself, and artistic management of computational tools is akin to driving the Formula One racecar. Ultimately, I am more fascinated by a third possibility, in which a conjuncture develops between the subjectivity of the designer and the computational tools such that it is hard to say where one ends and the other begins.


For example, I continue to be fascinated by those painters who seemed to achieve a magical conjuncture with the randomness of liquid paint. When Jackson Pollock proclaims, "I am nature," I wonder if there may be possibilities for working in such impersonal ways in architecture. Architecture has often been the speculative device for investigating ideal forms and has understandably been in opposition to the physical universe that is dripping with inconsistencies. It is one of the remarkable subtexts of the computational age that we now see a slight easing of this opposition between architecture and nature. The computer that seemed so antithetical to nature, that seemed to be the pinnacle of artifice, is unexpectedly revealing fresh insights into the workings of the physical universe. As always with deeper understandings of nature, the temptation is to determine the course of nature and our relationship to it. Already we see a split in the discourse concerning computer-based practices, with one side desiring an intimacy with nature through generative techniques of design, and the other side celebrating the computer as the best tool ever for optimizing our relationship to material constraints. As Karl Chu has pointed out, computation may be the Promethean fire of our time.8 What is still unclear are the implications of how we work with this fire. It is perhaps the responsibility of academies to stay ahead of the curve, if for no other reason than to mitigate the possible dangers just around the corner. 
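The point of note 7 below can be made concrete with a few lines of code (an added illustration, not Ruy's): the random() of most scripting environments is a deterministic recurrence, here a linear congruential generator using the constants published in Numerical Recipes. By the definition above, its output is not random at all, since the entire stream compresses to one formula plus a seed.

    def lcg(seed, a=1664525, c=1013904223, m=2**32):
        # Linear congruential generator: state -> (a*state + c) mod m.
        state = seed
        while True:
            state = (a * state + c) % m
            yield state / m  # scaled to [0, 1), like a scripting random()

    stream = lcg(seed=42)
    print([round(next(stream), 4) for _ in range(5)])
    # Re-seeding with 42 replays the identical "random" sequence.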

NOTES
(1) See John Kiriakou with Michael Ruby, The Reluctant Spy (Bantam Books, 2009), xix.
(2) Paraphrased from Einstein's letter to Max Born: "I, at any rate, am convinced that He does not throw dice." The comment refers to the validity of quantum mechanics, a theory indicating that at the smallest scales of matter there is a condition of indeterminacy. See Albert Einstein, trans. Irene Born, The Born-Einstein Letters (Walker and Company, 1971), letter of December 4, 1926.
(3) If you would like to investigate this history, consider the genealogy of Hilbert's statement of the Entscheidungsproblem (decision problem) and its subsequent redress by Gödel's Incompleteness Theorem, Turing's paper "On Computable Numbers, with an Application to the Entscheidungsproblem," and the recent work of Gregory Chaitin.
(4) See Gottfried Wilhelm Leibniz, Discourse on Metaphysics and Other Essays (Hackett Publishing Company, 1991).
(5) See Gregory Chaitin, "Irreducible Complexity in Pure Mathematics," http://arxiv.org/math/0411091 (2004).
(6) There are many references that can be cited here, but two stand out: the work of Edward Fredkin and his theories under the rubric of "digital philosophy," and the work of the ecologist Daniel Botkin. See Daniel Botkin, Discordant Harmonies: A New Ecology for the Twenty-First Century (Oxford University Press, 1990), 12-13.
(7) It should be noted that there is still no conclusive randomization function that is purely algorithmically based. Stephen Wolfram is notable for recently claiming to have discovered a way; this remains disputed.
(8) See Karl Chu, "The Metaphysics of Genetic Architecture and Computation," Perspecta 35, Building Codes (The MIT Press, 2004), 76.


In The Ring

Abigail Coover and Nate Hume editors of suckerPUNCH

The suckerpunch catches one off guard, but once it connects its impact is inescapable. There is no room for subtlety. No matter how sly and sophisticated the approach, it is hidden and essentially unimportant. All that counts is the blow. It can be a low blow, down and dirty, or elegantly swift, but most of all it needs to sting. Likewise, digital design has gone from an interest in the swing, through process, tools, and protocols, to an embrace of impact through the manifestation of complex forms, sensations, atmospheres, and effects. We recently sat down for lunch with five young architects and practices who know how to pack a punch. Maxi Spina, Brennan Buck, Roland Snooks, SOFTlab (Michael Szivos and Jose Gonzalez), and Bureau V (Alex Pincus and Peter Zuspan) are all exploring diverse facets of contemporary architecture by investigating and negotiating the resonance between the worlds of academia, representation, and construction. Their work differs wildly, but they share common concerns and ambitions in terms of intent, materiality, architectural ambition, precedent, and fabrication. This group of architects has absorbed the technique-driven designs of their 1990s predecessors and evolved them into robust manifestations including technicolor blooms, fibrous skins, and synthetic tectonics. Tool-dominated techniques are no longer the buzz, and today's concerns have moved past the fleeting trends of instantaneous snapshot animations and the smell of laser-cut vivisection models.

Everything Ornament Bureau V




Maxi Spina: It is the opposite of the 1990s, when new techniques were brought to our discipline. It was a lot more wild: just run this animation until it produces something digitally appealing. Fetishizing a technique, whether it is digital or not, will create an expression of the technique itself. The change that comes with this generation is a concern with how to territorialize those tools within the discipline in a constructive manner. It is important to challenge those techniques within architecture's constraints.

Brennan Buck: I have a skepticism towards an obsessive interest in technique. I think 90s technique fetishism in architecture originated as a way to get away from conventions of intuition and composition; the idea was that algorithmic technique could in some ways force or prompt new solutions. I still believe in the idea, but I think in the end it is about the end rather than the means. The end results matter once again.

This shift is in bed with a desire for intent and accountability to be explicit in the output rather than a result driven purely by recursive algorithms or processes. Idea and ambition are the new curators of the development and deployment of scripts and modeling techniques. Intent is back in the fight and ready to confront the ripe architectural territory between geometry and calculated effects.

Jujuy Redux Apartment Building, balcony sections. Maxi Spina (msa), Marcelo Spina and Georgina Huljich (P-A-T-T-E-R-N-S)


Alex Pincus: We are thinking about atmospheres and effects in a different way than they are conventionally spoken about, as just a by-product of digital formalism. Digital manipulations are meant to support a larger idea of what the atmosphere of the space could be, as opposed to the other way around. If you reverse-engineer atmosphere from a concept down to performance, versus performance up to concept, then you are going to have a greater possibility of getting interesting architecture. Whether it succeeds or fails, it will be an architecture that is more tied to intent.

Roland Snooks: The algorithm should emerge from an architectural problem rather than architecture emerging from the algorithm. That really is an issue of intent. Algorithms often work in a deterministic way, where there is a linear relationship between the input and output, and often that does not have the sophistication to enable you to embed any architectural concern in it. My interest in this work is how you take your intention as a designer and embed it in the procedural or algorithmic process. For us the ability to embed intent is critical for the design process. We have always been interested in formal characteristics which are generated by algorithmic design. More crucial to us than an interest in sensation and effects is an interest in designing tectonic systems. Architecture has always been highly tectonically articulated. Everything has been seen as a discrete system, and this has been particularly emphasized within classical modernism. We are beginning to become interested in how algorithmic work actually breaks down some of these hierarchies and effectively becomes nonlinear design. Instead of seeing either form following function or function following form simply as a duality, or looking at the way structure follows surface or ornament follows structure, we are looking at how these things can begin to interact in a non-hierarchical way. How does structure negotiate its relationship with ornament, for example? In doing this we are interested in the way normative hierarchies begin to dissolve, and as soon as they begin to dissolve, it makes for an entirely different reading of form and articulation, and that has certain resonance in terms of the effects it generates. So in a way it is an interest in effects, but it is actually not just looking at surface effects but the whole way tectonics begin to negotiate.

Stilettopolis SOFTlab

Taipei Performing Arts Center kokkugia

The rigorous pursuit of effects and new sensations has driven contemporary work from its vacuous black background proliferated with shiny surfaces to an engagement with materiality. Designs are no longer developed devoid of material and scale, but through a delicate dance of material realities and complex geometries.

Alex Pincus: There is a pretty strong interest in making real things and testing the ideas that we had in school in the real world. Everyone is more intuitively connected to the tools of production, from the modeling software all the way through to making things, as opposed to theoretically attached to those models... There is a difference between cutting a pattern in marble versus cutting a pattern in MDF. Most of the projects to date that have dealt with perforation or panelization have been built out of relatively cheap, straightforward materials. It is not that we are trying to cut them out of the most expensive materials, but we are trying to investigate this in terms of atmosphere and effects. What does it mean to use everything from conventional to futuristic materials in relationship to contemporary design techniques? A reclaimed-timber faceted wall is a lot different than an auto-body-painted faceted wall. We are trying to divorce some of the techniques from the materials they have been typically associated with. There is a cheesy 1999 vision of the future where everything is glossy and plastic. Clearly none of us are wearing glossy, plastic anything. There are alternate material selections that are informing the way we deal with pattern and texture.

An oscillation between the digital and its physical ramifications forces young practitioners to work at diverse scales, from the web to drawing to installation to building. These scales operate in discrete and integrated manifestations, blurring the boundaries between their influences. This scalar dialog has replaced the "script or die" attitude, as Michael Szivos would say, with one of interaction and generative diversity.

Michael Szivos: If you are working in Flash developing a square that can become an art gallery when you click on it, and then you move back to designing a table which is also a square, you are forced to question if it can do more. The goal is for the movement between the web and built work to be very horizontal and blurry. The web work we do is extremely influenced by architecture. We think about a website in an extremely modular way. Working in another environment, you learn things; you prove that ideas you had in one medium work in another environment. What is great about the web is that you can experiment without materials. It is the equivalent of doing a rendering or a drawing, but it is performative. At the same time, moving from the web to fabrication there is a chance to test it at full scale and advance the design. The choice in materials is a way to embrace the fact that it is a different medium. Since we do work digitally, it is important when we get to build something.


Taipei Performing Arts Center kokkugia

Fibrous Tower 2 kokkugia

Swarm Matter kokkugia


Jose Gonzalez: There is also the immediacy of the feedback with the scale of web and installation work. You can do it quickly. You can test the performance of it right away. We are more interested in shorter, more immediate time frames. Architectural time is glacial time. A project takes months and years, where these projects are expedited quickly. That is related to the economics as well. You see results quicker.

Trillion Dollar Mile Bureau V

Alongside fabrication and atmospheric concerns, interests in precedent and in fundamental architectural problems such as enclosure, aperture, and spatial organization are emerging in current digital and algorithmic design conversations. As a result of a rejection of novelty, contemporary designers are developing innovative projects to engage with ongoing architectural concerns. Curtain walls and railings are no longer annoying distractions, but opportunities for innovation.

Roland Snooks: Everything we do comes out of our understanding of architecture. Our work is about architecture design first and contemporary techniques second. Increasingly what we are interested in doing is looking at architectural problems. Part of this is our interest in tectonics and the relationship between structure and ornament. These are fundamental questions architects have always dealt with. I think when people are dealing with new techniques there is this desire to reject history and claim that everything is somehow new, and that is certainly not a position we want to put ourselves in.

Maxi Spina: I think precedent is important: to be able to delaminate the work of people who have traveled the ground before you and tackled similar issues. Not to fetishize their work or demystify who your idol is, but to reflect and extract the mechanisms and procedures that recur in their work. Once extracted, in their proper field of communication, the work becomes much more generic.
The problem with precedent has to deal with two things: it has to be seen as a hinge; there is a continuity in that precedent that you want to carry on, but also a rupture. That is the generation of any architectural work, or any creation. There is an acknowledgment of the past and there is a break or rupture with the past.

Alex Pincus: I think for the medium it's really important to get some fundamentals involved in the more sophisticated approaches to the kind of things people are trying to do. The balls-to-the-wall, everything-is-futuristic, everything-is-innovative approach is dangerous to a certain degree. People should have some understanding about what is innovative about what they are doing, and whether or not it is innovative.

Peter Zuspan: People are obsessed with making something new, for better or for worse. I understand the process-driven ideas, specifically in terms of jumping on the bandwagon of "let's devise this crazy futuristic machine that will make us think of something new." There is a lot of value to that in terms of being able to produce something you have not thought of, but at the same time there is a reason why poetry has rhyme. There is still a lot of interesting work that can come out of it.

While precedents are bringing productive fundamental concerns and concepts back into the fold, new questions of meaning are also resurfacing. Issues of meaning that go beyond materiality and atmosphere are back on the table. The superficial playground of complex surfaces and their effects now has the potential to be performative, not just through technology, but also in relationship to its meaning and cultural resonance.

Brennan Buck: I am interested in the border, or the overlap, between atmosphere and meaning. Robert Levit makes a convincing case, in an essay called "Contemporary Ornament," that some form of meaning is inescapable for architecture. A contemporary architecture heavy on pattern and articulation for him inherently implies things about nature and the individuality valued by our society. If you accept that some generalized set of associations will inherently be evoked, how do you think about that, not in a reductive postmodern way, but in a more open, multiplicitous way? There is a good case to be made that most of our media and a lot of our urban spaces already work this way. We're asked to judge politicians on whether their facial expressions seem upset and angry or friendly and optimistic, and we've been identifying with products based on how they make us feel for decades. Through the effects of form, surface, and lighting, architecture can be immersive and powerful in a similar way. Dave Hickey's essays about the political power of beauty and Nigel Thrift's writing on what he calls the 'spatiality of feeling' suggest that aesthetic decisions have social and political implications; they can be performative.

Shizuku SOFTlab

The struggle is not between shape and form, right angle and informe, or generative and algorithmic, but between knockout and pulled punch. Through a love of high and low culture this work can absorb contemporary sensations, from the shape-shifters of True Blood, to the genre switching of Nico Muhly, to the dissonant noise of black metal. These influences, complemented with the usual highbrow suspects such as Antonio Negri, Gilles Deleuze, and Alberti, lead to the development of hybridized aesthetics and methodologies which can reengage and reenergize fundamental architectural problems. A new round of trendy tools will be on the scene tomorrow, but the issues of enclosure, aperture, and spatial organization will forever be in the architectural mix. Tools and techniques exist to lubricate new atmospheres and effects rather than to dictate the direction of the work. A renewed interest in the end results of a project places value on what it is, rather than what made it. It is time to save the seduction for the punch and to let the swing fade into the ubiquitous black background. 



These studies of surface-based poché investigate the subtly evocative potential of figural structure. Three patterns of structural members were designed to evoke structural, natural, and monumental configurations (the latter inspired by Louis Sullivan's Midwest [...]) interior void. The resulting semi-solid mass infuses the enclosed space with both character and dynamic variation.

Surface Poche Studies Freeland Buck



Urban Dynamics and Urban Form James Williams

Architecture is at a critical moment right now, in terms of both the unprecedented global rate of urbanization and the powerful digital processes with which architects are experimenting. We now have potential tools to better understand and integrate rapidly proliferating urban systems. The use of computation allows for the integration of larger amounts of information and a faster way to think through multiple design solutions. That is, we have the potential to think more systematically about the context in which we are building, in the hope of producing more sophisticated and adaptable solutions to the way in which people inhabit cities. Nevertheless, we should be more questioning of the inputs we are working with, their ultimate relevance to the experience of the user, and how they are translated into a formal strategy.

One computational thesis, put forward by Patrik Schumacher, partner at Zaha Hadid Architects and founding director at the AA Design Research Lab, is to scale up parametric methodologies, which include animation, simulation, and form-finding tools, and parametric modeling and scripting tools such as GenerativeComponents (GC), Digital Project (DP), Mel-script, and Rhino-script. In general, Schumacher argues for 'parametricism', which he defines as a set of "shared design ambitions/problems" that utilize parametric design tools and scripts to develop "intricate correlations between elements and subsystems" and aim for "maximum emphasis on conspicuous differentiation and the visual amplification of differentiating logics."1 He further argues that parametricism is a new style that "succeeds Modernism as the next long wave of systematic innovation" and that only parametricism has the ability to truly "organize and articulate the increased complexity of our post-Fordist society."2 On a heuristic level he argues that architecture should avoid rigid geometric primitives, simple repetition, and the juxtaposition of unrelated elements or systems, in favor of more parametrically malleable forms, gradual differentiation, and systematic inflection and correlation. In addition, he outlines five agendas for this project: the interarticulation of subsystems, accentuated differentiation, figuration, responsive agency at different timescales, and the total integration of the built environment.3 As Schumacher himself claims, this is an operational definition, which he believes helps the designer create high-performance spaces that better articulate the intensity of relations between systems and subsystems and achieve a higher level of internal integration and external adaptation.4 While Schumacher is certainly capturing some of the guiding principles of current work in architecture schools and a handful of firms, his argument seems somewhat forced and also unquestioning of its philosophical underpinnings, which are not overtly discussed in his manifesto.

The ultimate goal of Schumacher's project is to work with these methodologies on an urban scale, or what he terms 'parametric urbanism.' Such an approach uses swarm logic5 as a basis for urban design, incorporating mass, spacing, and directionality as variables.6 More specifically, Schumacher wants to explore the dynamic relationships between "fabric modulation, street systems, (and) a system of open spaces."7 As a starting point, he looks to Zaha Hadid's master plan for Kartal-Pendik in Istanbul. The goal of this project is to redevelop an abandoned industrial site as a new city center. The design started with the laying out of a soft grid, or net, generated from multiple variables. Lateral lines that connect existing roads, a singular longitudinal axis, and topographical considerations form the basis of the grid. An operation of bundling using Maya's hair dynamics tool was applied to create areas of dense program cut through by streets, vertical build-up of towers where the net is pulled up, and open spaces.8 This results in a hybrid grid that exhibits properties of both Frei Otto's minimizing detour network and a deformed grid.9 Two building typologies, cross towers and perimeter blocks, were also incorporated into the system. The cross towers were placed on crossing points within the grid, and the perimeter blocks operate on an inverse relationship between height and parcel area. Schumacher argues that this creates an 'ordered complexity', in this particular case a "rhythm of urban peaks to index the rhythm of the widening and narrowing of the urban field", that is different from traditional master plans or the 'visual chaos' that results from unregulated growth.10

Two questions immediately arise from Schumacher's argument. First, does Hadid's project tie back to the idea of swarm intelligence, and what does such a process mean for architecture, urban design, and urban planning? Secondly, what is the general nature of form-making in relation to urban patterns? Neil Leach, Professor of Architectural Theory at the University of Brighton and a visiting professor at the University of Southern California, argues that the city itself, like other types of living networks composed of populations of smaller discrete elements, evidences swarm intelligence, or a "bottom-up collective intelligence that is more sophisticated than the behavior of its parts."11 However, as he points out, while fixed grids, either topological or geometric, cannot make "qualitative shifts in form and space" outside of their own design, a "genuinely bottom-up emergent system of swarm intelligence where individual agents with embedded intelligence respond to one another…offers behavioural translations of topology and geometry that can have radically varied outputs."12 While such a simulation is yet to be worked out, it points to the complexity of the matter and serves as a critique of the use of the concept of swarm intelligence in design. While it could perhaps be argued, as Leach points out, that Frei Otto's grid techniques represent a bottom-up form of material computation,13 it seems erroneous here to apply the concept of swarm intelligence to Hadid's project. Although the circulation paths were connected to the existing urban fabric, the system was essentially overlaid onto a parcel of land, making it more of a top-down process, much like any other master plan. Although the paths are embedded, to a certain extent, with their own logic of minimizing detours, would it not have been more useful to take local inputs such as infrastructure, environmental impacts, and local user patterns into consideration? Closer to Schumacher's manifesto, should not the circulatory patterns be developed more in real time as they are being built, in order to be more adaptable to multiple changing conditions?


Schumacher has argued that his project is more about urban design and the articulation of space as it is perceived by the user, as opposed to urban planning, politics, and the organization of space by engineers.14 However, the process of generating a system of circulatory paths and open spaces through computational techniques seems to collapse this distinction. I would argue that the parametricist project does have a place in this type of endeavor, where an artful use of variables in the design process can balance the necessary functional considerations with how users navigate and experience the street and open spaces. In terms of designing a collection of buildings on a large urban scale, however, the parametricist project becomes quite problematic. The bundling of paths in Hadid's soft grid produces a variety of densities and variability in building height, with the intention of producing an ordered complexity. However, the insistence on a visual order within the differentiation of the system gives too much immediate relevance to a desired urban aesthetic. Schumacher admits that this part of the project can only be realized through strict planning guidelines and political and private buy-in, and argues that "all constituencies need to be convinced that the individual restrictions placed upon all sites really deliver collective value: the unique character and coherent order of the urban field from which all players benefit…."15 While the differentiation of systems, and thus ordered complexity, within a building or a small set of buildings seems plausible, scaling up parametric design to an entire urban area, as it is described here, seems unfeasible and theoretically invalid. Similar to the overlaying of the grid, the imposition of a general form, in terms of building typologies and heights, goes against the idea of a bottom-up process. The parametricist dogma encourages the integration and adaptability of systems, while Hadid's plan imposes an overall formal logic from the outset. Although this formalism is adaptable to a certain degree, it is very rigid if one takes into consideration the vast number of variables that could have been used as a basis for design. We should also ask how such an ordered complexity would be experienced by the user, or whether it could really be perceived at all outside of a bird's-eye view. That is, this type of formal move is very far removed from the scale of the user. Given the rigidity and scalar issues of this plan, I would argue that Schumacher's agenda does not yet have a tenable formal strategy for dealing with the urban scale.

Similar to Leach, Manuel DeLanda, a philosopher, media artist, programmer, and software designer based in New York, has argued that the decision-making processes that generate the urban context need to be modeled before we start to form buildings. To achieve this computationally he proposes the modeling of "intelligent decision-making agents that can influence others and reflect upon their own decisions."16 Again, although such a modeling program has yet to be developed, this raises questions about what degree of complexity an architect should take into account, or what types of variables should be incorporated into a computational design. It seems that the parametricist agenda is less prepared to answer this question theoretically, in terms of the coordination of building typologies across an urban landscape in response to the user. While the use of computation in urban planning and design is a largely new field, yet to be worked out in theory or in practice, it can still be argued that we need to think about how to incorporate a wider range of variables in such projects, to more deeply reflect the rich dynamics inherent in the urban context.

To better understand these relationships, and how we arrive at formal strategies, the connections, on different timescales, between materiality and social processes need to be better mapped. As architects come to terms with computation and its relevance for urban design and planning, we are entering a realm where formal considerations should come towards the end of the process, and where ideas about the individual user, circulation, and land use should be incorporated from the beginning. 


NOTES
(1) Schumacher, Patrik. "Parametricism: A New Global Style for Architecture and Urban Design." Architectural Design 79: 4 (2009), 15-16.
(2) Schumacher, 15.
(3) Schumacher, 16-17.
(4) University of Southern California (2009). Intensive Fields - New Parametric Techniques in Urbanism, Session 3, Parametric Urbanism [Video]. Retrieved May 31, 2010, from http://arch-pubs.usc.edu/parasite/intensivefields/video-archive/.
(5) "This often refers to a form of 'swarm effect', where a grid is morphed parametrically using either digital tools or Frei Otto's 'wet grid' analogue technique." Leach, Neil. "Swarm Urbanism." Architectural Design 79: 4 (2009), 61.
(6) Schumacher, 17.
(7) Schumacher, 19.
(8) arcspace.com. "Zaha Hadid Architects Kartal - Pendik Masterplan." http://www.arcspace.com/architects/hadid/kartal_pendik/kp.html (Accessed March 2, 2010).
(9) Schumacher, 20.
(10) Schumacher, 20-21.
(11) Leach, 58.
(12) Leach, 61.
(13) Leach, 63.
(14) Intensive Fields - New Parametric Techniques in Urbanism, Session 3, Parametric Urbanism [Video].
(15) Schumacher, 21.
(16) Leach, Neil. "The Limits of Urban Simulation: An Interview with Manuel DeLanda." Architectural Design 79: 4 (2009), 55.


Rapid Re(f)use: Waste to Resource City 2120

One Day Hour Tower: a single 54-story skyscraper made from 24 hours of compacted waste produced in the city of New York.

Mitchell Joachim

New York City disposes of 38,000 tons of waste per day. Most of this discarded material ended up in the Fresh Kills landfill before it closed. The Rapid Re(f)use project supposes an extended New York reconstituted from its own landfill material. Our concept remakes the city by utilizing the trash at Fresh Kills. With our method, we can remake seven entirely new Manhattan islands at full scale. Automated robotic 3d printers are modified to process trash and complete this task within decades. These robots are based on existing techniques commonly found in industrial waste compaction devices. Instead of machines that crush objects into cubes, these devices have jaws that make simple shape grammars for assembly. Different materials serve specified purposes: plastic for fenestration, organic compounds for temporary scaffolds, metals for primary structures, etc. Eventually, the future city makes no distinction between waste and supply. 

Credits: Mitchell Joachim, PhD; Terreform ONE + Terrefuge, Emily Johnson, Maria Aiolova, Melanie Fessel, Zachary Aders, Webb Allen, Niloufar Karimzadegan, Lauren Sarafan, Philip Weller








Spatial Textures
Annie Boccella

A material study with mylar explores how two-dimensional linework can evolve into a surface yielding spatiality and growth. An original pattern is articulated through precise, repetitive cuts. A simple yet variable connection piece allows for multiple conditions of light transmission and surface articulation. As the system aggregates, complexity develops through gradual mutations such as rotation and scale. Qualities of finesse emerge as the system morphs delicately, exposing a system that is not stagnant but constantly responding to component behavior.







Grosses Schauspielhaus, Hans Poelzig, Berlin-Mitte, 1919



Physical Intuition
Erik Thorson

Over the last 20 years the toolset of the architect has radically changed, yet our authority and responsibility have remained the same. Our interaction with the physical aspects of a project has become limited since our profession came to rely on engineers; because of this, much of our responsibility and authority has been subdivided and distributed across multiple disciplines. If architects could develop informed conceptual designs for all aspects of a building, we would be able to offer a much more succinct design. This would allow the designer to incorporate all mechanical, structural, environmental and programmatic elements at the conceptual stage of a project. The projects that I present here offer insight into this possibility by incorporating FEM (Finite Element Method) analysis software into different stages of the design. By indicating the distribution of stresses and displacements, this software allows for a detailed visualization of where structures bend or twist.

Minimal Detours Network


In each project it provides a slightly different service. The base process for each of these projects is similar. A model is created in 3D modeling software; restraints and forces are then applied to the model to create a “problem set-up,” which is exported to an FEM solver, a computational tool that uses the finite element method to solve for the distribution of stresses across a material. The results are exported from the solver and read back into the modeler, and the model is then modified to accommodate the solution. By automating this process it becomes unnecessary to utilize additional modeling platforms, and the data is much more intuitive and accessible. By implementing an algorithm on behalf of the designer, solver results can automatically influence a design with various levels of agency. I will briefly describe each of the projects presented here and their different uses of FEM integration.

Minimal Deflection Optimization. This project utilizes FEM analysis in a very subtle way, but one that improves structural stability. The process begins with a simple tessellated roof structure, which is then subjected to restraints and gravity. The problem is solved in the FEM solver and the results are imported back into the modeler as nodal displacement data. The nodes of the structure are then moved in the z-direction to counter the displacement. This is a simple process with very little modification, but it results in a more structurally stable design. There were over three hundred iterations in this evolution, resulting in a minimally deflecting design. Three of these results can be seen in the image to the right.

Minimal Detours Network. This project is one in which the initial design is a direct result of principal stress vectors. The process begins with a double-curved surface which is restrained and subjected to the force of gravity. This results in a vector field which can be used to direct and trace points. The first two images in the sequence express the directionality of the principal stress lines. In order to gain agency as a designer I applied noise to this scenario, resulting in a series of crossing, overlapping polylines. The geometry is then condensed over a series of iterations using a detour algorithm.

Bridge Project. This project is unique in that it is iterative as well as interactive. The process is similar to that of the Minimal Deflection Optimization and Minimal Detours Network projects, but at the moment of adaptation the designer is allowed to make subtle changes to the way in which the form responds to the resulting stresses.
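The feedback loop these projects share can be sketched in a few lines. The following is a hypothetical illustration, not the author’s actual tool: the toy solver stands in for a real FEM package, and all names and parameters are my own assumptions.

```python
from dataclasses import dataclass

@dataclass
class Node:
    x: float
    y: float
    z: float

def solve_z_displacements(nodes):
    # Stand-in for the external FEM solver (assumption): returns the
    # deflected height of each node under gravity, here a toy parabolic
    # sag between the end supports, measured from the current z.
    span = max(n.x for n in nodes) or 1.0
    return [n.z - 0.01 * n.x * (span - n.x) for n in nodes]

def minimize_deflection(nodes, iterations=300, damping=0.5):
    # The Minimal Deflection Optimization loop: solve, then move each
    # node in the z-direction to counter its solved displacement.
    for _ in range(iterations):
        deflected = solve_z_displacements(nodes)
        for node, dz in zip(nodes, deflected):
            node.z -= damping * dz
    return nodes

roof = [Node(float(i), 0.0, 0.0) for i in range(11)]
minimize_deflection(roof)  # nodes end up pre-cambered against the sag
```

Under these assumptions the loop converges on a pre-cambered roof whose loaded shape is nearly flat, which is the spirit of the three hundred iterations described above.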

Minimal Deflection Optimization


Minimal Deflection Optimization

The algorithm is very simple, yet profoundly effective: at points of high stress the structure gets thicker, and at points of low stress it gets thinner. The amount of material added and subtracted can be modified at each step to create intentional spaces and forms that would not otherwise emerge. Changing the stress values alters the entire track of evolution, resulting in an emergent process. The image to the right displays some of the steps taken to evolve the bridge’s final form.

Each of these three projects uses FEM software in a very different way. The Minimal Detours Network utilizes it as an initial guide in determining structural organization, allowing the designer maximum agency. In the Minimal Deflection Optimization project the initial design is simply handed over to the iterative process, which returns the result upon completion. The tool shows its potential most in the Bridge Project, where the optimization algorithm is interactively modified, allowing the designer complete agency. It is this real-time interaction of user and algorithm that remains a relatively unexplored space, particularly in the field of architecture. Optimizing and informing designs with the engineer’s advice has always been an integral part of the design process, but never has it been so accessible. Through this use of FEM software, physical simulations can inform the concept and help to answer questions from the minutiae to the broader gestures of the project. In a time when sustainability is of paramount importance to architects, it is essential that we address each physical effect on a building, from the solar gain and radiant heat caused by the sun to relieving the lateral loads on fenestration. Integrating analysis software allows the designer to manipulate these systems, catering to performance criteria while maintaining control and coordination of aesthetic desires and programmatic requirements.

Integrating FEM software into our workflow can offer many possibilities to our profession, both through simple adjustments, which can have profound effects, and through creative applications providing insight into physical phenomena. This type of tool, not previously accessible, can provide the insight a designer needs to answer many of the questions that have been restricted to engineering specialists.
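To make the Bridge Project’s rule concrete, here is a minimal, hypothetical sketch of the thicken-at-high-stress, thin-at-low-stress adaptation, with the per-step gain standing in for the designer’s interactive adjustment; none of these names come from the project itself.

```python
def adapt_thickness(thicknesses, stresses, gain=0.1, t_min=0.02, t_max=0.5):
    # One adaptation step: members above the mean stress thicken,
    # members below it thin. `gain` is the dial the designer can turn
    # at each step to carve out intentional spaces and forms.
    mean = (sum(stresses) / len(stresses)) or 1.0
    adapted = []
    for t, s in zip(thicknesses, stresses):
        t *= 1.0 + gain * (s - mean) / mean
        adapted.append(min(max(t, t_min), t_max))
    return adapted
```

Because each step depends on the stresses the previous geometry produced, changing the gain at any step sends the whole evolution down a different track, which is what the text above means by an emergent process.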


Bridge Project


Deleuze and the Use of the Genetic Algorithm in Architecture1
Manuel DeLanda

The computer simulation of evolutionary processes is already a well-established technique for the study of biological dynamics. One can unleash within a digital environment a population of virtual plants or animals and keep track of the way in which these creatures change as they mate and pass their virtual genetic materials to their offspring. The hard work goes into defining the relation between the virtual genes and the virtual bodily traits that they generate; everything else – keeping track of who mated with whom, assigning fitness values to each new form, determining how a gene spreads through a population over many generations – is a task performed automatically by certain computer programs collectively known as “genetic algorithms.” The study of the formal and functional properties of this type of software has now become a field in itself, quite separate from the applications in biological research which these simulations may have. In this essay I will deal neither with the computer science aspects of genetic algorithms (as a special case of “search algorithms”) nor with their use in biology, but focus instead on the applications which these techniques may have as aids in artistic design. In a sense evolutionary simulations replace design, since artists can use this software to breed new forms rather than specifically design them. This is basically correct but, as I argue below, there is a part of the process in which deliberate design is still a crucial component. Although the software itself is relatively well known and easily available, so that users may get the impression that breeding new forms has become a matter of routine, the space of possible designs that the algorithm searches needs to be sufficiently rich for the evolutionary results to be truly surprising. As an aid in design these techniques would be quite useless if the designer could easily foresee what forms will be bred. Only if virtual evolution can be used to explore a space rich enough that all the possibilities cannot be considered in advance by the designer, only if what results shocks or at least surprises, can genetic algorithms be considered useful visualization tools. And in the task of designing rich search spaces certain philosophical ideas, which may be traced to the work of Gilles Deleuze, play a very important role. I will argue that the productive use of genetic algorithms implies the deployment of three forms of philosophical thinking (populational, intensive, and topological thinking) which were not invented by Deleuze but which he has brought together for the first time and made the basis for a brand new conception of the genesis of form.

To be able to apply the genetic algorithm at all, a particular field of art needs to first solve the problem of how to represent the final product (a painting, a song, a building) in terms of the process that generated it, and then of how to represent this process itself as a well-defined sequence of operations. It is this sequence, or rather the computer code that specifies it, that becomes the “genetic material” of the painting, song, or building in question.


In the case of architects using computer-aided design (CAD) this problem becomes greatly simplified, given that a CAD model of an architectural structure is already given by a series of operations. A round column, for example, is produced by a series such as this: 1) draw a line defining the profile of the column; 2) rotate this line to yield a surface of revolution; 3) perform a few “Boolean subtractions” to carve out some detail in the body of the column. Some software packages store this sequence and may even make available the actual computer code corresponding to it, so that this code now becomes the “virtual DNA” of the column. (A similar procedure is followed to create each of the other structural and ornamental elements of a building.)

At this point we need to bring in one of the philosophical resources I mentioned earlier to understand what happens next: population thinking. This style of reasoning was created in the 1930s by the biologists who brought together Darwin’s and Mendel’s theories and synthesized the modern version of evolutionary theory. In a nutshell, what characterizes this style may be phrased as “never think in terms of Adam and Eve but always in terms of larger reproductive communities.” More technically, the idea is that despite the fact that at any one time an evolved form is realized in individual organisms, the population, not the individual, is the matrix for the production of form. A given animal or plant architecture evolves slowly as genes propagate in a population, at different rates and at different times, so that the new form is slowly synthesized within the larger reproductive community.2 The lesson for computer design is simply that once the relationship between the virtual genes and the virtual bodily traits of a CAD building has been worked out, as I just described, an entire population of such buildings needs to be unleashed within the computer, not just a couple of them. The architect must add to the CAD sequence of operations points at which spontaneous mutations may occur (in the column example: the relative proportions of the initial line; the center of rotation; the shape with which the Boolean subtraction is performed) and then let these mutant instructions propagate and interact in a collectivity over many generations.
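A hedged sketch of what such “virtual DNA” might look like in code follows. The genome fields mirror the three mutation points named above, while the function names, parameter ranges, and selection scheme are illustrative assumptions rather than DeLanda’s (or any CAD package’s) actual machinery.

```python
import random

CUT_SHAPES = ["sphere", "box", "cone"]

def random_genome():
    # The column's "virtual DNA": the three mutation points named above.
    return {
        "profile_ratio": random.uniform(0.05, 0.30),   # proportions of the initial line
        "rotation_center": random.uniform(-0.5, 0.5),  # offset of the axis of revolution
        "cut_shape": random.choice(CUT_SHAPES),        # shape used for the Boolean subtraction
    }

def mutate(genome, rate=0.1):
    g = dict(genome)
    if random.random() < rate:
        g["profile_ratio"] *= random.uniform(0.8, 1.25)
    if random.random() < rate:
        g["rotation_center"] += random.gauss(0.0, 0.05)
    if random.random() < rate:
        g["cut_shape"] = random.choice(CUT_SHAPES)
    return g

def evolve(fitness, population_size=50, generations=100):
    # Population thinking: unleash a whole collectivity of columns,
    # not just a couple, and let mutant instructions propagate.
    population = [random_genome() for _ in range(population_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: population_size // 2]
        children = [mutate(random.choice(parents)) for _ in parents]
        population = parents + children
    return max(population, key=fitness)
```

Here `fitness` is whatever judgment is supplied as a function over genomes, which, as the essay goes on to argue, must combine the designer’s aesthetic selection with a structural viability test.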

To population thinking Deleuze adds another cognitive style which in its present form is derived from thermodynamics, but which, as he realizes, has roots as far back as late medieval philosophy: intensive thinking. The modern definition of an intensive quantity is given by contrast with its opposite, an extensive quantity. The latter refers to the magnitudes with which architects are most familiar: lengths, areas, volumes. These are defined as magnitudes which can be spatially subdivided: if one takes a volume of water, for example, and divides it in two halves, one ends up with two half volumes. The term “intensive,” on the other hand, refers to quantities like temperature, pressure or speed, which cannot be so subdivided: if one divides in two halves a volume of water at ninety degrees of temperature one does not end up with two half volumes at forty-five degrees of temperature, but with two halves at the original ninety degrees. Although for Deleuze this lack of divisibility is important, he also stresses another feature of intensive quantities: a difference of intensity spontaneously tends to cancel itself out and, in the process, drives fluxes of matter and energy. In other words, differences of intensity are productive differences, since they drive processes in which the diversity of actual forms is produced.3 For example, the process of embryogenesis, which produces a human body out of a fertilized egg, is a process driven by differences of intensity (differences of chemical concentration, of density, of surface tension).

What does this mean for the architect? That unless one brings into a CAD model the intensive elements of structural engineering, basically distributions of stress, a virtual building will not evolve as a building. In other words, if the column I described above is not linked to the rest of the building as a load-bearing element, by the third or fourth generation this column may be placed in such a way that it cannot perform its function of carrying loads in compression anymore. The only way of making sure that structural elements do not lose their function, and hence that the overall building does not lose viability as a stable structure, is to somehow represent the distribution of stresses, as well as what types of stress concentration endanger a structure’s integrity, as part of the process which translates virtual genes into bodies. In the case of real organisms, if a developing embryo becomes structurally unviable it won’t even get to reproductive age to be sorted out by natural selection: it gets selected out prior to that.
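In computational terms, this pre-selection might be sketched as a filter that runs before any aesthetic judging, along these lines (the stress analysis is again a stand-in, and the threshold is an assumed parameter):

```python
def viable(genome, max_stress, stress_limit=1.0):
    # Intensive filter: discard structurally unviable individuals before
    # they ever reach the designer, like the embryo that never reaches
    # reproductive age. `max_stress` stands in for a structural analysis.
    return max_stress(genome) <= stress_limit

def select_parents(population, max_stress, judge_aesthetics):
    survivors = [g for g in population if viable(g, max_stress)]
    # Only the viable forms are ranked by aesthetic fitness.
    return sorted(survivors, key=judge_aesthetics, reverse=True)
```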


A similar process would have to be simulated in the computer to make sure that the products of virtual evolution are viable in terms of structural engineering prior to being selected by the designer in terms of their “aesthetic fitness.”

Now, let’s assume that these requirements have indeed been met, perhaps by an architect-hacker who takes existing software (a CAD package and a structural engineering package) and writes some code to bring the two together. If he or she now sets out to use virtual evolution as a design tool, the fact that the only role left for a human is to be the judge of aesthetic fitness in every generation (that is, to let die buildings that do not look esthetically promising and let mate those that do) may be disappointing. The role of design has now been transformed into (some would say degraded down to) the equivalent of a prize-dog or racehorse breeder. There clearly is an aesthetic component in the latter two activities; one is in a way “sculpting” dogs or horses, but hardly with the kind of creativity that one identifies with the development of a personal artistic style. Although today slogans about the “death of the author” and attitudes against the “romantic view of the genius” are in vogue, I expect this to be a fad and questions of personal style to return to the spotlight. Will these future authors be satisfied with the role of breeders of virtual forms? Not that the process so far is routine in any sense. After all, the original CAD model must be endowed with mutation points at just the right places (and this involves design decisions), and much creativity will need to be exercised to link ornamental and structural elements in just the right way. But still, this seems a far cry from a design process where one can develop a unique style.

There is, however, another part of the process where stylistic questions are still crucial, although in a different sense than in ordinary design. Explaining this involves bringing in the third element in Deleuze’s philosophy of the genesis of form: topological thinking. One way to introduce this other style of thinking is by contrasting the results which artists have so far obtained with the genetic algorithm and those achieved by biological evolution. When one looks at current artistic results, the most striking fact is that, once a few interesting forms have been generated, the evolutionary process seems to run out of possibilities. New forms do continue to emerge, but they seem too close to the original ones, as if the space of possible designs which the process explores had been exhausted.4 This is in sharp contrast with the incredible combinatorial productivity of natural forms, like the thousands of original architectural “designs” exhibited by vertebrate or insect bodies. Although biologists do not have a full explanation of this fact, one possible way of approaching the question is through the notion of a “body plan.”

As vertebrates, the architecture of our bodies (which combines bones bearing loads in compression and muscles bearing them in tension) makes us part of the phylum “chordata.” The term “phylum” refers to a branch in the evolutionary tree (the first bifurcation after animal and plant “kingdoms”), but it also carries the idea of a shared body-plan, a kind of “abstract vertebrate” which, if folded and curled in particular sequences during embryogenesis, yields an elephant, twisted and stretched in another sequence yields a giraffe, and in yet other sequences of intensive operations yields snakes, eagles, sharks, and humans. To put this differently, there are “abstract vertebrate” design elements, such as the tetrapod limb, which may be realized in structures as different as the single-digit limb of a horse, the wing of a bird, or the hand with opposing thumb of a human. Given that the proportions of each of these limbs, as well as the number and shape of digits, are variable, their common body plan cannot include any of these details. In other words, while the form of the final product (an actual horse, bird, or human) does have specific lengths, areas, and volumes, the body-plan cannot possibly be defined in these terms but must be abstract enough to be compatible with a myriad of combinations of these extensive quantities. Deleuze uses the term “abstract diagram” (or “virtual multiplicity”) to refer to entities like the vertebrate body plan, but his concept also includes the “body plans” of non-organic entities like clouds or mountains.5

What kind of theoretical resources do we need to think about these abstract diagrams? In mathematics, the kind of spaces in which terms like “length” or “area” are fundamental notions are called “metric spaces,” the familiar Euclidean geometry being one example of this class.


(Non-Euclidean geometries, using curved instead of flat spaces, are also metric.) On the other hand, there are geometries where these notions are not basic, since these geometries possess operations which do not preserve lengths or areas unchanged. Architects are familiar with at least one of these geometries: projective geometry (as in perspective projections). In this case the operation “to project” may lengthen or shrink lengths and areas, so these cannot be basic notions. In turn, those properties which do remain fixed under projections may not be preserved under yet other forms of geometry, such as differential geometry or topology. The operations allowed in the latter, such as stretching without tearing and folding without gluing, leave invariant only a set of very abstract properties. These topological invariants (such as the dimensionality of a space, or its connectivity) are precisely the elements we need to think about body plans (or, more generally, abstract diagrams). It is clear that the kind of spatial structure defining a body plan cannot be metric, since embryological operations can produce a large variety of finished bodies, each with a different metric structure. Therefore body plans must be topological.

To return to the genetic algorithm: if evolved architectural structures are to enjoy the same degree of combinatorial productivity as biological ones, they must also begin with an adequate diagram, an “abstract building” corresponding to the “abstract vertebrate.” And it is at this point that design goes beyond mere breeding, with different artists designing different topological diagrams bearing their signature. The design process, however, will be quite different from the traditional one which operates within metric spaces. It is indeed too early to say just what kind of design methodologies will be necessary when one cannot use fixed lengths or even fixed proportions as aesthetic elements and must instead rely on pure connectivities (and other topological invariants). But what is clear is that without this the space of possibilities which virtual evolution blindly searches will be too impoverished to be of any use. Thus, architects wishing to use this new tool must not only become hackers (so that they can create the code needed to bring extensive and intensive aspects together) but also be able “to hack” biology, thermodynamics, mathematics, and other areas of science to tap into the necessary resources. As fascinating as the idea of breeding buildings inside a computer may be, it is clear that mere digital technology without populational, intensive, and topological thinking will never be enough.

NOTES
(1) This essay first appeared in 2001 and has been published in various forms elsewhere.
(2) “First....the forms do not preexist the population, they are more like statistical results. The more a population assumes divergent forms, the more its multiplicity divides into multiplicities of a different nature.... the more efficiently it distributes itself in the milieu, or divides up the milieu....Second, simultaneously and under the same conditions....degrees are no longer measured in terms of increasing perfection....but in terms of differential relations and coefficients such as selection pressure, catalytic action, speed of propagation, rate of growth, evolution, mutation....Darwinism’s two fundamental contributions move in the direction of a science of multiplicities: the substitution of populations for types, and the substitution of rates or differential relations for degrees.” Gilles Deleuze and Felix Guattari. A Thousand Plateaus (University of Minnesota Press, 1987), 48.
(3) “Difference is not diversity. Diversity is given, but difference is that by which the given is given...Difference is not phenomenon but the noumenon closest to the phenomenon...Every phenomenon refers to an inequality by which it is conditioned...Everything which happens and everything which appears is correlated with orders of differences: differences of level, temperature, pressure, tension, potential, difference of intensity.” Gilles Deleuze. Difference and Repetition (Columbia University Press, 1994), 222.
(4) See for example: Stephen Todd and William Latham. Evolutionary Art and Computers (Academic Press, 1992).
(5) “An abstract machine in itself is not physical or corporeal, any more than it is semiotic; it is diagrammatic (it knows nothing of the distinctions between the artificial and the natural either). It operates by matter, not by substance; by function, not by form...The abstract machine is pure Matter-Function - a diagram independent of the forms and substances, expressions and contents it will distribute.” Gilles Deleuze and Felix Guattari. A Thousand Plateaus (University of Minnesota Press, 1987), 141.



Computational Labyrinth: Towards a Borgesian Architecture
Léopold Lambert

Through the years, a man peoples a space with images of provinces, kingdoms, mountains, bays, ships, islands, fishes, rooms, tools, stars, horses, and people. Shortly before his death, he discovers that the patient labyrinth of lines traces the image of his own face. [Jorge Luis Borges]

It has been several years now since computation took hold in a group of international architecture schools in the Western world. However, and this is something I regret, computational architecture too often stands as a self-contained discipline. Increasing the limits of the field of possibilities is definitely a laudable aim; however, this achievement seems relatively meaningless if it is not pursued with serious consideration for the human dimension in architecture. Based on this statement, I will elaborate with a short study of how computation allows one to design what I would call a ‘Borgesian’ architecture. Jorge Luis Borges’ work indeed involves very evocative spatial dimensions, and I will try to focus here on what may be his two most famous short stories: The Lottery in Babylon and The Library of Babel.

The Lottery in Babylon dramatizes a city whose integral human behaviors and functions are systematically subordinated to chance. It is very important to understand

that the notion of lottery in this short story is not characterized by an arbitrary distribution of more or less valuable prizes, but rather by a random determination of every citizen’s acts and fate, whether desirable or dreadful. The whole frenzy – not to say idolatry – of this lottery actually comes from this existence of danger and loss of control. The notion of loss of control is primordial because it is what brings us to the creation and origins of architecture and to the ability we now have to design with computational methods. In the same way that the Borgesian Babylon ceases to depend on the causal judgment of a transcendental morality, architecture can now tend towards an emancipation from the omnipotence of the architect by partially delegating a power of decision to chance. Actually, the Babylonians’ lives and computational architecture still depend on a transcendence; however, the latter no longer arises from a direct subjectivity but rather from an illegible disorder triggered by this subjectivity. In this regard I would suggest that randomness is able to bring an important dose of irrationality and illegibility, which I am personally interested in studying. If the hyper-rationalization of an architecture tends to make it more controllable by an institutional power, breaking with this process could thus be considered a form of resistance towards such a power. As an homage to Borges, I would propose calling



a labyrinth any ‘out of control’ architecture that inserts at its core a decent amount of resistance to rationality.

The other short story that seems appropriate to evoke in this short study is The Library of Babel. This story is a conscientious description of the Library as “a sphere whose exact center is any one of its hexagons and whose circumference is inaccessible,” and which hosts the totality of books composed of all possible letter combinations. The Library thus questions the notion of the infinite and its paradoxical spatial application. I intentionally write ‘paradoxical’ because the infinite seems to me to illustrate a conflict between mathematics and physics. The latter can only suggest the infinite without actually describing it, whereas mathematics is a language based on the idea of the infinite. Returning to our field of study: architecture originally belongs to the universe of physics; computation tends to insert mathematics into it, and therefore the notion of the infinite. The only limit to an architecture generated by mathematics is the finite character of its generator, the computer. However, simply relating architecture to one or several equations allows it to acquire an infinite dimension. Such an idea obviously tackles the issue of its physicality and therefore allows architecture to exist through means other than the finite amount of the physical world’s particles. In the same way Borges succeeded in creating an infinite world thanks to words and to the reader’s imagination, computation allows the creation of an infinite architecture through its relation to mathematics.

In 1949, Jorge Luis Borges published Ficciones, a collection of labyrinthine short stories including the two studied here, and thus proved once again that some of the richest architectures were not necessarily designed by traditional means. Sixty years later, computation, another untraditional means, allows such scenarios to be visualized. It seems appropriate here to evoke very briefly the creation of the hyperlink, which elaborates protocols for the infinite narrative arborescence of another short story from Ficciones, The Garden of Forking Paths. Computation now allows architecture to reach a new dimension, be it poetic, political, mathematical or even metaphysical, and thus seems to justify the use of these new tools. The architect now needs to strike a perfect balance between, on one hand, the amount of control he gives up in order to improve his design, and on the other, the amount of control he actually needs to tame the tool so as not to fall into idolatry.



NOTES

(1) “However unlikely it might seem, no one had tried out before then a general theory of chance. Babylonians are not very speculative. They revere the judgments of fate, they deliver to them their lives, their hopes, their panic, but it does not occur to them to investigate fate’s labyrinthine laws nor the gyratory spheres which reveal it. Nevertheless, the unofficial declaration that I have mentioned inspired many discussions of judicial-mathematical character. From some one of them the following conjecture was born: If the lottery is an intensification of chance, a periodical infusion of chaos in the cosmos, would it not be right for chance to intervene in all stages of the drawing and not in one alone?” Jorge Luis Borges. “The Lottery in Babylon.” In Ficciones (1949; Rayo, 2008).
(2) “The universe (which others call the Library) is composed of an indefinite and perhaps infinite number of hexagonal galleries, with vast air shafts between, surrounded by very low railings. From any of the hexagons one can see, interminably, the upper and lower floors. The distribution of the galleries is invariable. Twenty shelves, five long shelves per side, cover all the sides except two; their height, which is the distance from floor to ceiling, scarcely exceeds that of a normal bookcase. One of the free sides leads to a narrow hallway which opens onto another gallery, identical to the first and to all the rest. To the left and right of the hallway there are two very small closets. In the first, one may sleep standing up; in the other, satisfy one’s fecal necessities. Also through here passes a spiral stairway, which sinks abysmally and soars upwards to remote distances.” Ibid. “The Library of Babel.” In Ficciones (1949; Rayo, 2008).
Credits: etchings by Erik Desmazières for The Library of Babel by Jorge Luis Borges (David R. Godine, 2000).



The Digital Distraction
Hannibal Newsom

Chaos theory. One-dimensional cellular automata. Systematic wholes greater than the sum of their constituent parts. Generative systems. These are just some of the ideas embodied in computational architecture which make up the foundation of the curricula of some contemporary architecture schools. Every year these concepts, to which many individuals have dedicated entire careers, are simultaneously unloaded onto a host of unsuspecting architecture students. Not only are these ideas complex, but the tools used to understand and explore them are extremely complicated. It is no wonder, then, that at institutions with such a strong pedagogical focus on computational architecture, the excitement generated by exposure to these powerful and complicated ideas is often overshadowed by the inherent difficulty of understanding and manipulating all the accompanying complex software. These media have the capability to impress, to astonish, and to bewilder, regardless of the specific methodology behind their use. As a result, much of our current implementation of the digital, as students, places emphasis on the tools themselves over the art. In essence, we become distracted by the digital.

There are several causes behind the mystique surrounding digital media. When these avenues were first being explored, access to computers was primarily restricted to universities or large research institutions

like IBM. Limited access and limited resources constrained the field of study to only a few individuals. This restriction, combined with the complexity of the mathematical formulae being studied and the growing awe of computers in general, generated great interest in and fascination with the work being done. It was all very mysterious and exciting. However, the proliferation of the personal computer in recent years, to the extent that the majority of schools now have mandatory laptop requirements, has effectively opened the door to anyone even remotely interested in digital architecture. This increased access helped us to better understand the computer as a tangible entity, but did little to further our understanding of its capabilities. Access was no longer restricted, but manipulation was no less complicated.

Part of the difficulty in comprehending the intricacies of computational media is that much of the work takes place inside a computer or similar digital device. We may interact with the various interfaces at our disposal, or even program the interface ourselves. Still, there is a degree of disconnection in the design process between our keystrokes and the final output that is not present in the movement of the hand and the appearance of a line when drawing with a pencil. Further, the ability to successfully interact with digital media and computers is embodied knowledge. It cannot be easily taught in lectures or through print, but requires hands-on experience.


No matter how eloquently (or not) I may explain the intricacies of digital surface modeling, the act of manipulating a primitive polygon, let’s say a cube – subdividing, pushing, and pulling on various vertices, edges, and faces – until it resembles the human body or an automobile will not truly make sense until it is attempted firsthand. Even then, these programs have become so advanced that the amount of time it would take to master any one of them is practically unrealistic. Even if mastery were a realistic goal, the speed at which these programs are constantly updated and overhauled makes it nearly impossible to remain current with all of the new versions. It is far easier to specialize, and we see this as individual students often begin to identify with specific modeling software and processes.

What further differentiates the computer from the traditional tools of design is its quality as an active medium. It participates in the creative process, whether running scripts or performing the simultaneous calculations of an algorithm or other complex code, and often produces unexpected and astonishing results. Of course, before it can do anything it first must be given explicit rules in a language it can comprehend. Still, this participation has had an inflammatory effect on the architectural community. After the introduction of ‘paperless studios’ in a number of architecture schools in the early nineties, there was a wide range of reactions to the use of computers in design. Experimentation with various rudimentary digital design programs originally intended for the design of automobiles opened the door to an array of new possibilities, possibilities beyond just production. But the name ‘paperless studio’ for many implied that computers had somehow replaced the traditional tools of the architect, to the consternation and outrage of many in the ‘old guard.’ The fact that computers have the ability to reliably and sensationally perform as design agents, combined with their ubiquitous presence, led to the reification of the digital design process, resulting in the false impression that the computer was somehow replacing the architect. Of course computers replaced nothing; they were simply an addition to the architect’s toolbox. The computer may offer a release from tedium; however, the elimination of hard work and creativity has never been a property of any digital medium.

If we look to ‘A Panel on Architecture,’ hosted by Charlie Rose with a number of leading architects at Peter Eisenman’s Aronoff Center for Design and Art in Cincinnati in 1996, we will see that the question is not a new one:

Sanford Kwinter: I think it’s worth pointing out the difference between Greg [Lynn]’s point and Peter [Eisenman]’s. Earlier in the show it was asked what made this building experimental and I think primarily what makes it experimental is the degree to which so much of the design agency was surrendered to non-human and non-personal forces. One of the dangers that that brings up... is the possibility that offices are being run by software today. And I think that it really depends on the kind of way you grasp this new reality which is more important than the media reality.

Eisenman: I don’t believe that there’s any office, even Greg Lynn’s office, that’s being run by software. I think when we get to that point we’re in big trouble. Because if we start to get things that look like Alias [modeling software] buildings as opposed to FormZ [modeling software] buildings then they are being run by software. I don’t think that’s the case.

Lynn: I think you just have to come to terms with the role that the technology has in the design and figure out how to direct it. But I think you can’t avoid it.1

This notion of direction still holds today. With the digital-computational science of architecture the primary focus of a number of architectural institutions, as architects we cannot afford to be distracted by the awesome power of digital technology. Maxi Spina, in his interview with suckerPUNCH (page 12), makes the salient point that the fetishization of a technique ultimately leads to the expression of that technique, and that the new paradigm in this generation is the constructive territorialization of these techniques within the larger discipline. Perhaps this is the case, but I would go a step further. I see this phase of the fetishization of digital techniques not as a universal phase in architecture, but as a personal one for architects; a phase that, by definition, most students are still experiencing.


We are too recently exposed to this technology not to get carried away every time we discover a new capability or fancy trick. Excitement is a natural response to such incredible and visually stimulating technology, and to some degree obsessive manipulation is necessary to come to a full understanding of it. But if, in the attempt, we focus on moving beyond the simple expression of the tools, we will be better prepared to further investigate the complex ideologies behind them.

NOTES
(1) Charlie Rose, “A Panel on Architecture,” November 8, 1996. http://www.charlierose.com/view/interview/5865
(2) A. Michael Noll. “The Digital Computer as a Creative Medium.” IEEE Spectrum 4:10 (1967), 89-95.


On Games, Garages, and Complexity
Kazys Varnelis

With the dual markers of the end of the housing bubble and the end of the first decade of the century, it’s become apparent that network culture – the form of life dominating the last decade and a half – is mutating into some aberrant, unforeseen form. As at the end of modernism, when the rigid structures of modern architecture had become so associated with the rigid organizational regimes of Fordist capital that both architectural and bureaucratic complexes fell at once, yet again architecture and capital have become over-identified. After all, it was the virtualization of architecture, its ready conversion into complex and abstract mathematical systems, its departure from its familiar slow-moving nature, that was to blame for the collapse. If the architectural press has put the blame on archetypical middle Americans living in ever-expanding McMansions littering exurbia, architects are hardly innocent either. Swift to take advantage of the financialization of the building industry, they undertook vast construction projects in exotic locations like Dubai, Ordos, and Manhattan. The most advanced design of the day served as an indexical pointer for global investors, indicating not only that a city was delighted to accommodate the creative class, but also the degree of creativity with which bankers and government officials could fabricate financial fictions.

So now that parametric design is not only not new but is fatally associated with a holistic approach to building involving extreme mathematical feats of financial speculation, what are we to do with these tools? How do we continue in a sociocultural landscape so vastly changed? Like the followers of Buckminster Fuller who went off into the Colorado desert to build Drop City, like the best-trained products of European design schools who formed radical architecture groups like Archizoom, Superstudio, Coop Himmelblau, Utopie Group, and Haus-Rucker-Co, this generation of highly trained designers will not leave technology behind for a mythic world of authenticity. For a generation born digital, that will not stand. On the contrary, they will bring the new technologies with them, and will take them over and make them their own. At Drop City Steve Baer built deformed geodesic domes that he called “zomes,” to be experienced while under the influence of psychedelic drugs. So, too, today’s graduates of architecture schools are likely to build partly for this world, partly for others, constructing experiences as much physical as virtual. It’s hard to imagine that the worlds of experience design, museum design, and online game design won’t be reshaped by a generation of architects for whom designing while maintaining five simultaneous instant-messaging chats is an everyday


experience. Imagine massive games ranging across cities, produced by individuals for whom gaming and the production of space are merely aspects of a design continuum.

Or take another scenario. Imagine, if you will, a migration of young architects to the suburbs. It’s the least probable of all possibilities. But architects are geniuses at finding places nobody wants to live; think of George Maciunas and Gordon Matta-Clark in SoHo in the 1960s and early 1970s. If it seems obvious now, it hardly was back then. Renting recently foreclosed houses, they will build communal fabrication shops in garages, adapting the skills they have learned in schools to what Chris Anderson calls the next industrial revolution: custom-fabricated DIY product design and manufacture.

Still others will apply their architectural skills to a new understanding of the city. Yoking parametric modeling skills together with GIS, architects will be among the best equipped not only to design in, but to understand, the highly complex conditions of the contemporary city. If sociologists, engineers, and planners are increasingly understanding processes within cities as emergent systems, architects can make new sense of such readings, lending them visual form.

But let’s shift gears completely, doing the intellectual equivalent of going from 5th into reverse on the highway. In The Collapse of Complex Societies, archaeologist Joseph Tainter uncovers complexity as the downfall of societies. As societies mature, he observes, they become more complex, highly differentiated, and highly linked. To manage my affairs, I have to wrangle myriad bureaucratic agents such as bank managers, insurance auditors, credit card representatives, accountants, real estate agents, airline agents, delivery services, script-reading hardware support personnel, and lawyers, in combination with non-human actors like my phone, my computer operating system, my car, the train, and so on. This is typical of the bureaucratized nature of complex societies.

If, in a charitable reading, we produce such bureaucratic entities in hopes of making the world a better place, keeping each other honest and making things work smoothly, in reality they rub up against each other, exhibiting cascading failure effects. In Tainter’s reading, such complexity requires greater and greater amounts of energy until, at a certain point, the advantages of the structures it creates are outweighed by diminishing marginal returns. The result is collapse, which Tainter defines as a greatly diminished level of complexity. Just as rigidity was the failure point for Fordism, complexity is the failure point for post-Fordist society. If Tainter concludes that the only hope to forestall the collapse of a complex society is technological advance, this is where my optimism rubs up against my nagging feeling that what we are dreaming up is too little, too late. Technology itself is all but unmanageable in everyday life, and adding greater layers of complexity is hardly a solution. We should have taken our lumps when the dot-com boom collapsed and retrenched for five or six years. Instead we added that much more complexity – take the debt and what is required to maintain it, or the impossible war on terror, or the climate – and now our options are greatly limited. But as Tainter points out, most of the people who experience collapse don’t mind it too much. Many of them seem happy enough to just walk away from the failing world around them, much like owners of foreclosed homes do today. Eventually a new civilization springs up and, with it, new kinds of designers.

Is it too much to believe that out of all this, the future of design lies somewhere between these scenarios? Is it too much to believe that the keenest intelligences in design will uncover these and run with them?



Apo Mechanes

Athens, Greece – Bios Gallery
Tânia Branquinho & Eleftheria Xanthouli

Apo Mechanes was an intensive three-week computational design studio and fabrication workshop led by Ezio Blasetti, Dave Pigram, Roland Snooks and Loulietta Zindrou in Athens, Greece. Using Rhino scripting, the studio focused on the potential of computation and algorithms to generate spatial qualities and self-organization techniques in the design process. The component and its connections were carefully designed to produce a wide range of aggregation configurations and organizational responses based on each rule-set variation. The final composition selected was the result of eight recursive generations that considered, along with its scale, the engagement of the viewer in the ambiguity between the architectural and the objectile.

Hera: Plywood, 320 cm x 190 cm
116 connection pieces
59 components, 75 cm x 60 cm (118 pieces)
1392 zip ties
3 wire cables

The script included external attractors which could override the original rule set and instigate the propagation of the component as a morphological reaction, on a local and global scale, based on proximity. The installation features three formal deviations of the component, which respond to the distribution of weight loads, considering the structural integrity of the aggregation and its feasibility as a hanging installation. For fabrication, the component was efficiently nested, resulting in minimal material waste. Prior to individual assembly at the Bios Gallery, the component and its connections were easily compiled and stacked for shipping in three 50 cm x 100 cm boxes.
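As a rough, runnable sketch of the recursive logic described above (the studio worked in Rhino scripting; here components are reduced to 2D positions, and the attractor coordinates, step directions, radius, and generation count are all illustrative assumptions):

```python
import math

ATTRACTORS = [(8.0, 2.0), (-5.0, 6.0)]   # assumed external attractors

def propagate(position, generation, out, max_generations=8):
    # Eight recursive generations: each component spawns children along
    # fixed connection directions unless an attractor overrides the rule.
    out.append(position)
    if generation >= max_generations:
        return out
    x, y = position
    for angle in (0.0, 2.0, 4.0):          # three connections per component
        dx, dy = math.cos(angle), math.sin(angle)
        ax, ay = min(ATTRACTORS, key=lambda a: math.hypot(a[0] - x, a[1] - y))
        dist = math.hypot(ax - x, ay - y)
        if 0 < dist < 4.0:                 # proximity-based override
            dx, dy = (dx + (ax - x) / dist) / 2, (dy + (ay - y) / dist) / 2
        propagate((x + dx, y + dy), generation + 1, out, max_generations)
    return out

aggregation = propagate((0.0, 0.0), generation=1, out=[])
```

Each rule-set variation (directions, radius, number of generations) yields a different aggregation, which is the sense in which the component’s behavior, not a fixed form, drives the composition.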

52






[Diagram: the component and its recursive generations – each component’s orientation is defined by mapping reference points (1, 2, 3) onto target points (4, 5, 6, 7, 8, 9).]


State of the Art: For and Against Rationalism
Peter Macapia

State of the art usually implies a rationalization. It implies a fixed ratio between means and ends that can’t get any more optimized. It applies to technology, techniques, and fields. And it definitely applies to our array of software. With the various forms of computation at our disposal, analytical and generative, we continuously live in a state of the art. There’s no doubt that it will change, transform again, or become something else. But what is the art in state of the art? Or what makes it art before its instrumentality becomes part of the state apparatus in state of the art? For some reason this brings to mind Descartes and the rationalist method of the Meditations. But if you’ve read the Meditations closely, and if in particular you’ve read the second Meditation, you might get the sense that there is something a bit odd. A renowned thinker of tremendous mathematical and scientific insight decides to contemplate a humble piece of wax. And this encounter with an amorphous and somewhat indecisive matter becomes the pivot for the entire cogito. I want to look at this Second Meditation a bit more closely to understand something about the nature of the “thought experiment.”

“Take, for example, this piece of wax: it is quite fresh, having been but recently taken from the beehive; it has not yet lost the sweetness of the honey it contained; it still retains somewhat of the odor of the flowers from which it was gathered; its color, figure, size, are apparent

(to the sight); it is hard, cold, easily handled; and sounds when struck upon with the finger. In fine, all that contributes to make a body as distinctly known as possible, is found in the one before us. But, while I am speaking, let it be placed near the fire -- what remained of the taste exhales, the smell evaporates, the color changes, its figure is destroyed, its size increases, it becomes liquid, it grows hot, it can hardly be handled, and, although struck upon, it emits no sound. Does the same wax still remain after this change? It must be admitted that it does remain; no one doubts it, or judges otherwise. What, then, was it I knew with so much distinctness in the piece of wax? Assuredly, it could be nothing of all that I observed by means of the senses, since all the things that fell under taste, smell, sight, touch, and hearing are changed, and yet the same wax remains.”

Everything changes. And since everything changes, it is contingent. Certainly there is the wax, but its identity is anything but stable. And so what remains the same? The fact that Descartes, regardless of all forms of doubt, regardless of all forms of experience of change, is himself Descartes actually thinking. That he thinks, he cannot doubt, though he can doubt everything else about the wax and, by extension, anything material or perceptual. Here the mind grasps itself in the movements of thought as if to fix with certainty its existence: that, insofar as I think, I exist. The moment of Reason realizing itself is astonishing.


And yet, how do we get there? Let’s say we take Descartes’ method literally. Is there a coherent step-by-step process? No. The process could be arranged in any order. Is there a logical exhaustion of the results? Not really. There is an infinity of possible outcomes with the wax; it’s a continuous phenomenon. Does it matter that it is wax, as opposed to, say, dirt? Probably not. Any matter seems capable of transformation. Does the wax have to exist as such? No, we can simply imagine it, so whether or not it is real isn’t an issue. So what then is the wax? Let’s call it a search space: a medium in which the goal, as it were, is to test out various possibilities, with either a goal or a stopping point. Because, and here’s the point, there isn’t anything literally rational, or even reasonable, about Descartes’ process, except at the moment Reason becomes instrumental and makes of itself an object. In other words, the way in which Descartes plays with the wax is neither scientific nor methodical per se. It is exploratory. (A historical note: parallel to Descartes’ writing, his colleague and friend Christiaan Huygens was working on the curvature of glass lenses and the perfection of the telescope.) And yet, it is a meditation after all, the medium of thought being wax, or the “thinking about wax.”

I’ve often thought about this problem of method and result in design. We have methods (sometimes we call this process) and generally the idea is that experimental methods and processes will yield novel results. Right now the tradition, and what could be called the soul of architecture, that is, geometry, is butting up against an entirely different set of techniques which we could call algorithmic. These techniques do not start with geometry as a foundation. The goal is to produce geometry, but from a place that I would say is actually irrational since, as in the labyrinth, the movements are in fact blind. It is what we could call a search space. Anything that is a computational search space is by definition blind. In our case, it is used to produce a geometry that we cannot see, and yet to make what we eventually do see introduce new situations that we can still call architectural. There really isn’t any Reason in these operations, and it would be difficult to say that there is any precise method other than the search for novelty. We’re just getting some sort of feedback from this system with which we are playing. And yet amazingly, in this process, there is a point at which something rational or logical begins to emerge. And yet how do we know this? Reason, logic, coherence, and decidability: are these processes or results? Reason guarantees nothing in the

success of design, and yet everything in its final outcome once the process, like scaffolding, has been kicked away. Every process, therefore, can be called a process of rationalization. But I would argue that rationalization in and of itself yields nothing interesting about design. Process should remain unhitched from Reason.

If one thinks about it, Descartes used wax as a kind of search space, a medium in which he could ask questions and get answers. It is actually nothing less than a kind of computation, which, it turns out, is not at all rational, nor in fact scientific. It is exploratory; it is a search space. It is in fact blind. And if we push the analysis further, it is labyrinthine -- it is irrational. I like this about our processes in algorithm and design. And I think that in many ways it’s not different from Turing’s answer to the Entscheidungsproblem. An algorithm doesn’t have to justify the answer, or provide a proof. It is interesting to note, then, that the invention of the modern computer, based on ones and zeros (the binary system already anticipated by Leibniz, a colleague of Huygens as well), and for which history owes credit to Turing (who wasn’t trying to make a computer, but to solve a mathematical question posed by Hilbert on decidability), came out of a thought experiment in a way not much different from Descartes’, except that it doesn’t require a transcendental Reason: “We may compare a man in the process of computing a real number to a machine which is only capable of a finite number of conditions . . .”
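Turing’s “machine capable of only a finite number of conditions” is easy to make concrete. The toy below, a binary increment machine, is my own illustration rather than Turing’s construction: a finite rule table, a tape, and no justification of its answer beyond the running itself.

```python
def run_turing_machine(tape, rules, state="start", head=0):
    # A finite number of conditions: (state, symbol) -> (write, move, state).
    cells = dict(enumerate(tape))
    while state != "halt":
        symbol = cells.get(head, "_")
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    span = range(min(cells), max(cells) + 1)
    return "".join(cells.get(i, "_") for i in span).strip("_")

RULES = {
    ("start", "0"): ("0", "R", "start"),   # scan right to the end...
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),   # ...then add one with carry.
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "L", "halt"),
    ("carry", "_"): ("1", "L", "halt"),
}

print(run_turing_machine("1011", RULES))   # "1100": 11 + 1 = 12 in binary
```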



Analog Form-Finding Scott Savage

In the design of Analog Form-Finding the challenge was to find a formula or rule set that would generate seemingly random pairings while providing an underlying structure for fabrication. Thinking forward to the resulting component-based system, it was crucial that the identity of each individual be surpassed in the whole. It was also requisite that each cube be unique among its peers. From early studies, a rule base for making unique geometries that could exist together, and therefore be larger than the sum of the individual parts, evolved from a simple analog computational device of permutation and combination. In short, I am exploring how many unique combinations are possible from x number of possibilities. Mathematically the process is fairly simple and can be described as unique combinations of four cuts pulled from eleven possibilities.

The requirement for unique individuals puts an interesting pressure on fabrication. This conflict between fabrication processes and concept became the main driver for design decisions. The decision to use analog methods allowed for direct translation into the final model and considerations for craft. This also led to unexpected results in the field condition. By front-loading the design process into the physical model, other subsequent actions were streamlined as the project was carried out. As opposed to facing the difficulty of translating a digital model, which exists independent of materiality and fabrication, I allowed the materiality of the analog process to produce the form by carrying it through to completion.
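The arithmetic is easy to verify. Assuming the four cuts are distinct and their order doesn't matter (the essay leaves this implicit, so read it as one plausible interpretation), eleven possibilities yield 11! / (4! · 7!) = 330 unique cubes, which a few lines of Python can confirm:

```python
from math import comb
from itertools import combinations

cuts = range(11)                                # the eleven possible cuts
unique_cubes = list(combinations(cuts, 4))      # every unordered set of four

assert len(unique_cubes) == comb(11, 4) == 330  # 11! / (4! * 7!) = 330
```

Three hundred thirty distinct individuals is more than enough for a field in which no cube repeats, which is what lets the whole exceed the sum of its parts.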


Orientation A: Sides with similar cuts are placed in the same orientation relative to the field



Orientation B: Cubes rotated through four possible orientations








Dear GAUD, Please Put Me on the Express Train to Righteousness (1) Sarah Ruel-Bergeron

Rumor has it that digital fabrication and computer programs have been created to maximize productivity and help designers attack present-day design challenges by allowing us to design in a way not previously possible. This sounds ideal. But I'm not sure what is productive about an entire weekend spent moving individual vertices (20 glorious hours) to achieve my masterpiece: a perfectly flat plane. Or about my sleepless nights taken over by attempting to resolve issues of normals (2). I've become so obsessed with the dark spot on the modeling file that signifies immediate game-over when rushing to turn in my 3-d print file that I'm starting to question my own level of normalcy. At this point the baseline is a frighteningly distant memory. But I'm getting away from the point I'm trying to make, which is that the digitally naïve, like myself, are being eaten alive by the very digital media that are supposed to usher us along the path to starchitecture. Yes, there are still people (myself included) for whom finding the "power" button on the computer is not an entirely obvious matter. How those of us found our way into a school that considers itself at the forefront of digital design is a question that puzzles me on those sleepless nights.

Didn't you review my portfolio? I ask, as I turn my desk drawer inside out hunting for a couple of toothpicks to hold my eyelids peeled apart. There were NO computer-generated images, remember? Only two basic diagrams in a 25-page portfolio (3). How can you expect me to model, render and fabricate when I've just spent an hour troubleshooting on Google and finally phoning the help desk in tears to figure out how to get my toolbox back on the screen after accidentally deleting it? It's usually around this time that I become convinced that someone is having a nice laugh at my expense. But then a third-year student kindly reminds me that I'm in school, that this is all about learning and is most of all supposed to be a raging fun ride. Has fun been redefined to mean being so concentrated on getting the damn model to sit at just the right angle that one fails to notice the pool of drool that's gathered on one's keyboard, right around where the built-in mouse pad sits? (4)

Now, my understanding is that students are accepted into this M.Arch. program from any variety of undergraduate backgrounds -- from gymnastics to Ukrainian literature. Which is why I am troubled by my professor's expectation that we have mastered the (apparently) basic architectural terminology (5). Most notably, "emergence" seems to be the newest buzzword. It is also that which causes me the most distress. After repeated inquiries, I have concluded that no one can explain this term simply enough for me to grasp. Thus, the rational part of my brain (obviously taking up too much space) has sorted it out and come up with the following conclusion: Pratt expects that the computer and I will merge into one creative being, a reinterpretation of myself that oozes script from my nose, has the printscreen function incorporated directly into my right eyeball and can be conveniently plugged into any wall socket to regain energy at any time.

I think I'm mostly just frustrated with the lack of tangibility when modeling on the computer, which we're told is where its "advantage" lies. "No limits!" they tell us. Well, I find that re-learning how to pan every time I open a program is limiting. Especially when getting a decent image of even my genius creation requires me to create lines in AutoCAD, bring them into Max to loft the whole thing along a spline, go into Paracloud to populate its surface, bring it back into Max for MentalRay rendering and switch over to Rhino to extract linework... Only to bring the rendered image back into Photoshop to delete its background, then take it into Illustrator to superimpose both images, tweak its opacity and line weights until I've got it looking just the way I want so that I can finally take it into InDesign for the board layout. To someone who is at a beginner level in each one of these seven programs this task can only be completed with a solid 24 hours of drooling. Only to hear, the day before the review, that the angle one degree away from this particular vantage point would actually be more revealing. My left eye (the non-re-programmed one) begins to twitch as I respond "show me," while opening the Max file. The camera angle hasn't changed so my professor points and says "see, right over here would be better." I move most of my upper body towards her pointing finger, forgetting, shamefully, the two-dimensional properties of the computer screen. I scramble to try and cover up my blushing by hitting ctrl + left click and my entire model flies out of sight... THIS is the level of productivity that I want to thank digital modeling programs for. Because I didn't use to have a twitch in my left eye, or a drooling problem, but even more distressing is that I've developed chronic architectural verbal diarrhea which is having some serious implications on my game (6).

Come to think of it, our attraction to the computer is not unlike a new crush (who conveniently doesn't mind the digital rants). It is a strange yet UN-deniable attraction. When you're not with Dell you think of him. Is he ok where I last left him? Has anyone else been moving in on him in my absence? Does he miss me and want to see me? ...Cause I want to see YOU, Dell. And learn more about you, and share moments with you and tell you about the past experiences I've had with others like you (but don't worry, I like you SO much better). We play games with each other where Dell crashes programs when I haven't saved for at least eight hours and hides the recovery files. In return I make him jealous by replacing parts of our relationship with my walkman, iPhone or even lab computers, making him fully aware of his deficiencies. But when it's going well, Dell protects me by being of such a massive size that I can use the ball and chain I devised from his 15-pound charger as a weapon against any aggressor on my daily late-night commutes to and from Bed-Stuy. But when we finally get home and in bed he's just the right size, fitting perfectly next to me in the place of any possible suitor, his comforting "sleep" light snoring away next to me. I must admit we do occasionally fight. He's just so stubborn and refuses to do what I'm telepathically screaming at him to do (the heavy clicking seems to have no effect on him either). He always wins. But it's his way or I'm back to hand drawings and paper models (7). I'm striving towards a power equality of 50-50, because right now we're looking at 90-10 in his favor. In retrospect, perhaps this balance is what digital modeling is all about.

...Shortly after "completing" my last deadline of my first semester at Pratt I dragged myself all the way back to my folks' home in DC and slept, ate and tried to regain my former abilities of socializing "normally" with non-architectural primates. After a month of "heaven" disguised as complete boredom I find that my brain has glossed over the traumatic last four months and left me (to my complete horror) wanting more. I've spent some time thinking (insert drooling) about the last semester and find that it is interspersed with strange flashbacks of a time when I thought of architecture as inseparable from concept, programmatic depth and clear reasoning. Instead of revisiting projects in my mind for progression, I'm having a recurring nightmare that all of the programs I've just learned have updated to completely unrecognizable interfaces. I probably should have guessed that when I wasn't skipping sleep to hang out with Dell he'd come after the one thing I've cherished the most over break -- the face-plant into my pillow for more than 5 hours EVERY night. What I'm concluding is that more than anything, to survive our forward-thinking, digitally-minded world I'll need to develop the memory of an elephant to remember all of the hotkeys. The only thing saving me is this false sense of hope that I have a better understanding of the challenge ahead. So I happily stock up on disposable toothbrushes, start quoting prices for inflatable mattresses for in-studio naptime, and fill my freezer with enough food to eat sporadically for the next four months. All the while I continue to flirt with my Dell, who's been giving me an attitude because I've been prioritizing printed books. But he shouldn't worry: not only is my childhood dream of playing with cooler Legos itching stronger than ever but I've promised him a romantic getaway to Rome all summer. As I grudgingly go off to my last decent shower before spring break the impending doom takes over me with the skipping beat of Dell powering on.

(1) Righteousness is an attribute that implies that a person's actions are justified, and can have the connotation that the person has been "judged" as leading a life that is pleasing to GAUD. (paraphrased from Wikipedia's definition of Righteousness)
(2) Thank you, 3D Max.
(3) Which would have looked nicer if I had cut felt out with a pair of dull scissors and written on it with puffy paint.
(4) Does anyone else find it highly suspicious how perfectly located the pad is for drool-pooling?
(5) Did I miss an email attachment or something?
(6) You know, in the unfathomable situation where I try to branch out of studio life.
(7) Not an option if I plan on revolutionizing the built world as we know it.



Contributors

Annie Boccella will graduate from Pratt Institute with a Master of Architecture in May 2010. Previously, she studied Economic Sociology and Latin American Studies at Cornell University and worked in finance in San Francisco before mustering the courage to quit her job and enter the world of architecture. In the future, Annie looks forward to exploring innovative and productive ways of transforming the urban fabric, both abroad and in her own neighborhood. Tânia Branquinho is completing her last semester in the Master of Architecture program at Pratt Institute. She attended Parsons School of Design for Fine Arts and has a Bachelor of Fine Arts from the New York School of Interior Design, where she has been an adjunct professor for the last five years. She was recently one of the recipients of the Eleanor Allwork Scholarship awarded by the American Institute of Architects.

Manuel DeLanda is the author of five philosophy books: "War in the Age of Intelligent Machines" (1991), "A Thousand Years of Nonlinear History" (1997), "Intensive Science and Virtual Philosophy" (2002), "A New Philosophy of Society" (2006), and "The Emergence of Synthetic Reason" (forthcoming). He teaches two seminars at the University of Pennsylvania, Department of Architecture: "Philosophy of History: Theories of Self-Organization and Urban Dynamics" and "Philosophy of Science: Thinking about Structures and Materials". He also teaches at Pratt Institute in Brooklyn, and holds the Gilles Deleuze chair at the European Graduate School in Switzerland. Erik Ghenoiu teaches history and theory in Graduate Architecture and Urban Design at Pratt. He has also taught at Parsons and the University of Wisconsin-Madison and is affiliated with several research institutes in Berlin. He has a Ph.D. from Harvard and is at work on a book about "tradition" as a kind of modernism in German architecture and urbanism around 1910.

Alpna Gupta is currently a second year graduate student at Pratt Institute. She received her B.F.A. in visual studies from the Columbus College of Art and Design. Her installations and mixed-media sculpture have been exhibited in various galleries and juried exhibitions throughout Ohio. She hopes to integrate her artistic practice with the technical and functional parameters of architecture.

Mitchell Joachim is a leader in ecological design and urbanism. He is a co-founder of Terreform ONE + Terrefuge. He earned a Ph.D. from MIT, an MAUD from Harvard, and an M.Arch. from Columbia. He is on the faculty at Columbia and Parsons, and was formerly an architect at Gehry Partners and Pei Cobb Freed. He has been awarded multiple fellowships, including TED2010, Moshe Safdie, and the MIT Martin Society. He won the History Channel Award and Time Magazine's Best Invention of the Year. His project Fab Tree Hab has been exhibited at MoMA and widely published. He was chosen by Wired magazine for "The 2008 Smart List: 15 People the Next President Should Listen To". Rolling Stone honored Mitchell in "The 100 People Who Are Changing America". He was selected as the Frank Gehry International Chair at the University of Toronto.

Léopold Lambert is a young French architect who graduated from the Ecole Spéciale d'Architecture (Paris) and is currently pursuing his post-professional master's degree at Pratt Institute. He has worked for R&Sie(n) in Paris and for Serie in Mumbai. He is also the editor of a daily internet journal entitled boiteaoutils that attempts to approach architecture through the filters of politics and narration. He is currently writing his master's thesis, which investigates the systematic political "weaponization" of architecture. Sarah Le Clerc, editor of tarp, is in her final semester as a graduate student at Pratt Institute. She received her B.S. in Architecture at The Ohio State University. Originally from Ireland, she lived throughout Europe and the U.S. before arriving in New York. She has worked for numerous architecture firms and on her own projects in Milwaukee, Cleveland and New York. In 2008 she attended a two-month workshop at the Technische Fachhochschule Berlin studying sustainability as part of an effort to incorporate German sustainability methodologies into her own practice.


Marinelle Luna is a graduate student at Pratt Institute and holds a B.Arch. from the University of Texas, where she was awarded the AIA Texas chapter scholarship and was part of the Tau Sigma Delta Honor Society. She is also currently a freelance designer. Peter Macapia is the principal and director of labDORA and teaches at Pratt Institute and the Southern California Institute of Architecture (SCI-Arc). He is currently working on algorithmic and topological design research and an essay titled Cartesian Wax, which is the basis of the essay here.

Hannibal Newsom is a graduate student in the M.Arch. program of Graduate Architecture and Urban Design (GAUD) at Pratt Institute. He received a B.S. in Architectural Studies from the University of Illinois in 2005, where he spent a year at the Ecole d'Architecture de Versailles, and attended the Ecole d'Architecture de Paris-Belleville in the fall of 2005. Some of his designs have been executed in public buildings in Chicago. He now lives in Brooklyn, where he can be found in studio.

Sarah Ruel-Bergeron is a graduate student at Pratt Institute's School of Architecture. She is particularly interested in low-income housing in developing countries and has spent time working in Haiti, Mexico, and Guatemala since 2005. After completing the master's program she plans to continue exploring these issues in depth through doctoral research in Architecture.

David Ruy is an architect, theorist, and co-director of Ruy Klein, an experimental design office in New York City. The work of Ruy Klein has been widely published and exhibited, and is recognized as one of the most respected speculative practices in architecture today. Focused on the emergence of a synthetic sublime in recent design culture, David's work and research examine contemporary design problems at the intersection of architecture, nature, and technology. David is currently an Associate Professor at Pratt GAUD, where he is also the director of The Network for Emerging Architectural Research (NEAR). Scott Savage will graduate from the Master of Architecture program at Pratt Institute in May of 2010. He also received a BFA from Utah State University with a dual emphasis in Graphic Design and Printmaking. Before going back to school he lived in Brazil, traveled Europe, worked in construction and graphic design, and pursued his interest in rock climbing. He is interested in good food and how to grow it on his roof. suckerPUNCH: Abigail Coover and Nathan Hume are co-creators and editors of the website suckerPUNCH. They are graduates of the Yale School of Architecture and have an architectural design firm, Hume Coover Studio, located in Brooklyn, NY. Erik Thorson graduated from the University of Colorado at Boulder and received a Master of Architecture with an emphasis in digital tectonics from IaaC. Erik is a member of the group Workshops Factory and currently lives and works in New York City while completing his M.Arch. at Pratt. His emphasis is on the integration of computer science and architectural design. Kazys Varnelis is the Director of the Network Architecture Lab at the Columbia University Graduate School of Architecture, Planning, and Preservation. He edited The Infrastructural City: Networked Ecologies in Los Angeles and Networked Publics, co-wrote Blue Monday: Stories of Absurd Realities and Natural Philosophies, and is currently working on a book on network culture. James Williams is a second-year M.Arch. student at Pratt. He received a B.A. from Vassar College with a focus in religion, sociology, and set design. He hopes to integrate his interest in urbanism and sociology with current architectural discourse and practice.



