PLAT
1.5
Fall 2011
PLAT is a student-directed journal published out of the Rice School of Architecture. Questions, comments and donations can be directed to:

Rice School of Architecture
PLAT Journal
MS-50
Houston, Texas 77004

PLATjournal.com
editor@PLATjournal.com
ISSN 2162-4305

Editors-in-Chief
Joseph Scherer, Eileen Witte

Managing Editor
Erin Baer

Design Editor
Melissa McDonnell

Art Editor
Renee Reder

Web Designer
Ian Searcy

Publishing Director
Sean Billy Kizy

Grant Director
Kelly Barlow

Distribution Director
Amanda Crawley

Copy Editors
Marti Gottsch, Patricia Bacalao, Ethan Feuer, David Dewane

Staff
Matthew Austin, Rebecca Sibley, Brianna Rogers, Louie Weiss, Jenny Zhan, Alex Gregor, Timmie Chan, Nicholas Weiss, Andrew Daley

Printer
The Prolific Group | Printed in Canada

Acknowledgments
The production and publication of PLAT would not have been possible without the talents and generosity of:

Sarah Whiting, Dean, Rice School of Architecture
Lars Lerup, Dean Emeritus, Rice School of Architecture
Farès el Dahdah, Associate Professor, Director of Graduate Studies, Rice School of Architecture
Scott Colman, Senior Lecturer, Rice School of Architecture
Neeraj Bhatia, Wortham Fellow, Rice School of Architecture
Nana Last, Associate Professor, University of Virginia School of Architecture
Rice School of Architecture, Faculty and Staff
The Architecture Society @ Rice
Rice University Graduate Students Association
Rice University
Lynn Stekas and John Daley
James and Molly Crownover
Nonya Grenader
JDMiner Systems LLC
Raymond Brochstein
Joujou Zebdaoui
The Henry Luce Foundation, Inc.
Architecture Center of Houston Foundation
The Rice Design Alliance

Special Thanks to
Seanna Walsh, Tracy Bremer, Jessica Tankard, Jessica Cronstein, Tsvetelina Zdraveva, Sue Biolsi, Justin Brammer, Chimaobi Izeogu, Mary Casper, Sara Hieb, Jia Tolentino, Matthew Faega, Lauren Ajamie
India Mittag, Director of Development, Rice School of Architecture
Linda L. Sylvan, Executive Director, Rice Design Alliance
Raj Mankad, Editor of Cite, Rice Design Alliance
IN CONVERSATION WITH JOHN MAY

Joseph Scherer: Do you think there’s a relationship between the development of technologies that permit new ways of seeing nature and the consistent emergence of ways of claiming attachment to those forms?

John May: Absolutely. Although I would want to modify your question somewhat, first by divorcing it from the language of “development” and “permission.” Let’s say instead that the relationship is between technical methods of representation and the particular conceptions of nature associated with those methods. For example, we can point to a specific shift that took place, just prior to and during World War II, in which the concept of environment was restructured, in substantial and very particular ways, through the tethering of that concept to the technical parameters of electronic visualization. If you look closely at those technical parameters – at their history and design – it becomes clear that this restructuring had the much more subtle effect of reorienting our understanding of the environment around a kind of statistical reasoning implicit in the design of electronic instrumentation.

In other words, through the details of that instrumentation, the conceptual and discursive foundations of “the environment” – its very essence as a scientific object – were reformulated around the limitations and assumptions of probabilistic reasoning. As that view became increasingly concretized in statistical machines like the digital computer, we began to see the emergence of this new conception of the environment. And that’s what we’re working with today: a condition in which our perception of the world has been thoroughly reoriented around a particular set of technical arrangements and their associated concepts. When we are presented with scientific and bureaucratic depictions of the ‘natural environment’ today, we are confronted with a condition in which “the natural” is essentially defined as that which can be represented as information (or ‘data’) in the form of discrete electronic charges. Only by acknowledging that can we begin to work through what it might mean for architects to engage ‘more ecological’ practices.

Scherer: It sounds like that’s kind of a feedback loop then: there is a conceptual framework for understanding nature, but then the tools are invented in support of that (or they produce that concept). I’m trying to untangle the relationships there – not to say that there needs to be a completely hierarchical understanding of cause and effect here – but I wonder if you’ve seen consistent relationships between these phenomena in your research of this history.

May: Whether or not the instruments we’re using are documenting “the world as it exists” – whatever that would mean – or whether they are realizing (let’s say “making-ontological”) a particular interpretation of the world – this is a question that runs through the heart of modernity. In my view, it becomes especially important after 1945, a year which we can use as a kind of rough pivot point from progressive to reflexive modernity. From that point onward – to use Beck’s logic – modernization began to concern itself far more with the management of internally-produced risks: radioactivity, pollution, toxicity, etc.

The relationship between representation and intervention is one that has to be interrogated with a certain intensity, because although it involves a seemingly timeless recursivity, the specific details – historical, technical, political, etcetera – of that relationship are under constant revision. We are constantly finding new ways of opening up the world, of making it available for adjustment and restructuring. Once upon a time it was possible to frame those adjustments entirely within the language of progress, but it would require tremendous naiveté to do so today. There’s no doubt that the statistical conception of the environment is an effective way of seeing the world – it contains tremendous managerial capacities; exponential gains in control and efficiency have been realized through the signalization of environment. But it may have much more to do with revealing potentials than discovering essences, and in any case we’re clearly paying a very steep price for our newly found prowess. Some of the easiest examples are found in the use of geographic information systems (GIS), which is a tremendously powerful instrument for the spatial management of data. Does GIS technology document existing conditions or does it generate real conditions? It does both. That is, it postulates an abstract model – a statistical model – and then it poses modes of intervention in order to realize that model. So it belongs to what Sloterdijk has called the “ongoing explication of space” that defines modern managerial techniques. At this point we are so adrift among these instruments that, to my mind, the question is neither one of causality, nor one of technological determinism: it’s a matter of struggling to understand the resonances
IN CONVERSATION WITH JOHN MAY

Joseph Scherer: In relation to this idea that architecture or architectural theory is failing in some way, I’m thinking about projects that were radical in the 60s and 70s… Archigram and stuff like that. There was a desire to actually have those things become real projects. There’s a time when Banham actually expects that he’s going to open the door and there’ll be a circus out there, and everyone will be free. That faith seems like a very unique moment when theory and design were bridged, yet it was still unsuccessful. So: if architecture is doing the best it can, and if theory is failing, and together they didn’t really get anywhere, doesn’t that suggest a complete inability to change the situation?

Eileen Witte: Another question comes to mind when we compare the current state of architectural theory to that of the 60s and 70s. Archigram and others of that time were inadvertently critiquing the notion that architectural representation – in particular, the plan – could order a society through planning and the creation of fixed spaces. Their eccentric use of representation suggested that the plan, as an architectural tool, was too deterministic. And so I’m interested to know if you think that today we need (or have already developed) reactionary forms of representation that critique the statistical project.

John May: I think any project that aims to discover or reveal something buried within the rote technical processes that shape daily practice might count. We should also remember that theory constitutes a form of representation, and in this historical moment it might be more in need of attention than other forms, which I realize is not a very satisfying answer. You can avoid becoming despondent by keeping a longer timescale in mind. It’s not your job to solve the numerous crises of habitation we face collectively. It’s simply your job to begin to sort through the reality of our technological lives, with the hope that a patient description might teach us how to not repeat our mistakes, or at least help teach us how to live – not merely survive – amidst the decaying fabric of modernity. In that sense, the notion that there is a ‘solution’ to our current predicament reveals an already instrumentalized conception of thought, which imagines life first and foremost as a set of problems in need of planning and management: a truly negative conception of life. Instead, what has to take place is a much longer historical-philosophical project that dives beneath that psychology. The analogy that I usually draw is with feudalism. Feudalism was not ‘solved.’ It was slowly dissolved, over several hundred years, and ultimately replaced with an entirely different mode of existence. That process required the invention of countless concepts like ‘rights’ and ‘democracy,’ which previously had not existed, and the patient discrediting of other ideas, ‘divine right’ and so forth. Part of the problem right now is that we aren’t doing any of that. Instead, what we’re doing is passively receiving representational regimes from scientific and technological discourses, assuming that those will somehow quickly get us out of our predicament while still preserving our precious lifestyles. I don’t see that happening.

Scherer: Okay, take the Nolli Plan of Rome – that’s a fairly subjective data set, but it allows for the visualization of data in a way that might not otherwise be immediately comprehensible to the public. Then today, we’ve got Venturi Scott Brown, and let’s say that they’re designing in a way that’s related to that visual framework and that dataset. But we can imagine that, with other datasets and other tools, we might have different projects emerging from it.

May: I think in part what you’re suggesting is that all datasets are subjective, which is why it becomes so important to understand the features of statistical reasoning at work on a particular set of information. We haven’t discussed it yet, but a significant factor – alongside the wartime desire for telemetry – behind the becoming-electronic of environment has been a drive towards automation. So in a way you can say, “yes, life became electronic.” But that wasn’t some sort of natural or geological process. It was motivated by an intense, one might even say oneiric, obsession with automation. Automation is a dream buried so deeply in the modern psyche that we can no longer see its horizon. In any case, certain kinds of statistical-electrical processes can be automated to an extent that mechanization never allowed. Take scripting, for example, which is tremendously powerful. It’s radically different from older forms of mechanical representation that generated something like the Nolli plan.

As architects, we have to understand that our disciplinary subjectivity has radically changed over the last three decades – especially over the past decade. Nearly all the decisions that we used to make mechanically are now electronically automated. That’s not a pejorative statement, and certainly mechanization contains its own forms of automation. But electronic automation is predicated on entirely different control loops, and arguably has more drastic implications as the proposed scope of intervention expands. As that happens, the automation of decisions around datasets becomes much more pertinent. As soon as one begins to speak in the language of landscapes and territories and populations, one can no longer remain naïve to the basic fact that we are presently abrogating our agency to automated processes whose full reality we haven’t bothered to understand. We are very facile with those processes, and we possess tremendous technical acumen, but we don’t understand them. I’ve not yet seen an adequate history of scripting in design, much less one that manages to reconstitute some plausible theory of ‘automatic agency.’ That phrase is, in some sense, historically contradictory. If today it is simply an empirical fact – which I believe it is – we must find ways to reinscribe the concept of the subject within this radically altered technical field.

Images by: Brantley Highfill