.Pneumà




.pneumà_ non-standard & interactive architecture /computational design bio-strategies

Harold Woods, Master of Architecture (Biodigital Architecture)



pneuma1 /ˈnjuːmə/ noun (PHILOSOPHY); plural noun: pneumas. 1. (in Stoic thought) the vital spirit, soul, or creative force of a person.

1 Pneuma, in the context of this thesis, is to be understood and interpreted as described above, but as the vital spirit, soul, psyche and creative force of nature rather than of a person.


This document is submitted in partial fulfillment of the requirements for the degree Master of Architecture (Biodigital Architecture) at ESARQ-UIC, Universitat Internacional de Catalunya, Barcelona, Catalonia, Spain. 30th September 2014

© Harold Woods


DECLARATION.

I, Harold Woods, 081684, am a student registered for the course Master of Architecture [Biodigital Architecture] in the year 2013-14. I hereby declare the following: I am aware that plagiarism [the use of someone else's work without permission and/or without acknowledging the original sources] is wrong. I confirm that the work submitted for assessment for the above course is my own unaided work except where I have stated explicitly otherwise. I have followed the required conventions in referencing thoughts, ideas, and visual materials of others. For this purpose, I have referred to the ESARQ-UIC School of Architecture style guide. I understand that the Universitat Internacional de Catalunya may take disciplinary action against me if there is a belief that this is not my unaided work or that I have failed to acknowledge the source of the ideas or words in my own work.

_________________________ Harold Woods 15th August 2014


DEDICATION. I dedicate this thesis to my parents, Samartha and Norris Ndagi; to my family, thank you for the constant, tireless, unwavering support and for standing by me throughout this thesis and the whole nine yards. Without you, none of this would be possible. There aren't enough words in this world to describe my gratitude and how blessed I feel to have you. Thank you for always believing in me and for the endless support; words cannot express it. I love you all endlessly and am eternally grateful; and to Maria Carmen Ruiz Marlin.


ABSTRACT. The nature of interactivity is not a product, but a process. Nature is full of elegance. Sometimes that elegance, to human eyes, also appears as great beauty. But being beautiful is not (except perhaps when attracting mates) the purpose of that elegance. Nature is elegant in its designs because that elegance has a function. Observations of phase transitions in materials, and of the irregularity that is characteristic of a phase transition in nature, suggest an interesting mathematical approach: natural processes that succeed in modelling the relation of order and randomness in natural systems would be useful if imitated in architecture and design. Life forms evolve constantly, and so should our approaches to the different aspects of the design process; otherwise all current solutions and approaches would eventually become fallacious. Perhaps this leads us to reflect on the issues to be addressed by contemporary architecture. Should a building be a static being, or rather an articulated system, capable of constantly collaborating with its surroundings, by one means or another retrieving data which it then reprocesses in order to react to this impulse in a procedure of self-adjustment? Or should a building rather be an open, dynamic, "living" structure or apparatus?


ACKNOWLEDGEMENT

I realize and am constantly reminded that like a good design, an architect/designer is not created by one person alone. It takes a team of dedicated, hardworking, thoughtful and creative people. People who understand that the end result is more important than the difficulties experienced in the process, and more importantly that this end result is worthwhile. I am one, and whole, but incomplete.

I would also like to thank the following people for their guidance and thought-provoking critique throughout the duration of my studies and thesis:

Dr. Prof. Alberto Estévez (Universitat Internacional de Catalunya, ESARQ-UIC)
Prof. Dennis Dollens (Universitat Internacional de Catalunya, ESARQ-UIC)
Ezio Blasetti (Columbia University, GSAPP; Maeta Design LLC)
Daniel Wunsch (Universitat Internacional de Catalunya, ESARQ-UIC School of Architecture)
Pablo Baquero (Universitat Internacional de Catalunya, ESARQ-UIC School of Architecture)
Prof. Aranzazu Balfagón (Universitat Internacional de Catalunya)
Prof. Agustí Fontarnau (Universitat Internacional de Catalunya, Medicine & Health Sciences)

Anastasia Fedonkina, thank you for the daily smiles and laughter. I always drew on these whenever exhausted and stressed; our daily chats and Skype calls kept me in a good mood to stay focused. Carmen Ruiz Marin, thank you for everything and anything; the patience and the endless yet undeserved support for as long as I can recall. Words cannot express my gratitude. Sarah Winkler, thank you for always being honest with me, for giving me a kick when I needed it, and for being as equally passionate and bossy about the projects as I was. But most importantly, thank you for the long, memorable daily walks to the metro.

And to all the other gems of the M.Arch. (Biodigital Arch.) Class of 2013-14, you rock! You have been my other family for nearly a year and my gratitude for all your help is immeasurable. And I wish you all the best. I saved the best for last. Each and every person mentioned here made a valuable contribution. Thank you for your input and ever-open-minded mentorship and friendship; to friends both inside and outside the architectural world.



INTRODUCTION. Since the early days of modernism, the biological theme has enriched architectural discourse with a set of insightful references concerning design premises and the means by which these can be achieved. Such a broad reservoir of influences is connected with processes directed to actions carrying out specific tasks, further suggesting functional and utilitarian connotations (Hensel, 2009). An updating of the analogies between biology and architecture goes along with the introduction of digitally based techniques. Computation has aided the development of new methodologies concerning data registration, analysis and manipulation towards the production of variations of form, and also of its material manifestation, in more dynamic manners.

One of the main applications of computation in the design process is to define, and likewise to evolve, systems capable of collaborating with different factors at optimal levels. Such a task may be described under a general understanding of performance. Performance may aid in the development and assessment of complex systems whose behavior is comparable to that of organic nature. A system's desired properties, such as agility, compliance and overall pertinence in delivering compound operations, may be examined with advanced digital tools simulating its behavior.

There is no doubt that nature offers an exceptionally rich reservoir of knowledge to explore; but after years of study, even of fascination, with biological references, and after related research that has often been limited to skin-deep understandings of organic themes, we have reached a point where it is important to respond to key questions about biology's relevance to architecture, such as:
- What are the properties inherent to the biological model that evoke themes of main architectural interest, and how may such a transference occur in more meaningful ways?
- How has the biological model helped to readdress critical assumptions about architecture, especially with the introduction of digital tools to design?

In this respect, this thesis assesses biology's main references along with an updated account of performance, and then summarizes their influence on architecture, as this has largely been supported by the use of digital means. Recent architecture has found itself having to cope with new social, cultural and natural complexities that call for organized frameworks that are time-relative, reconfigurable and evolutionary, and for a corresponding model of building design characterized as adaptive systems. Since the phase-out of the Modernist Movement, design in architecture has shown a solid enthusiasm for positivist project methodologies. From one perspective, the investigation of intricate and dynamic frameworks has renewed interest in the study of systems, "bottom-up" methodologies, adaptive systems, generative systems, genetics and the automatic creation of form as the pillars of a new generation of design. The persistent emergence of digital innovations in the world of design has made it possible to adopt these new patterns of thought.


This is the premise of my exploration through project-based research focusing on biodigital methodologies. Integral to this research is the idea of an architecture that looks towards designing systems which seek to accomplish higher-order objectives, emerging through an intimate correlation of material and computational interaction, as it constructs a generative view of space and structure and explores behavior-based models of living through patterns found in nature's biological systems. This biodigital architecture thesis investigates, through theory and practice, how biological algorithms can shape design and architecture, starting from a somewhat non-conventional point of view, and aims to define new platforms for design innovation that fuse digital and physical research within the rapidly evolving fields of material fabrication, computation, advanced building systems, design and immersive media relative to natural biological systems. The focus is explicitly on the fusion of biological systems and technology as it applies to the present and future of design, computation, media, digital fabrication, robotic manufacturing, sustainable engineering, material ecologies and novel tectonics: strategies established in the methodologies of natural systems, simulated through the principle-determined and rationale-based stages of algorithmic and computational design approaches. The strategies proposed draw upon collective intelligence and social programming to create a design association between natural systems, the human designer and the computational process. Interest is focused on observations of the way in which organic living beings derive emergent structures, from simple components to complex, intricate systems. The structures and forms generated by natural systems can be analyzed and understood as hierarchical organizations of very simple components (a "bottom-up" system), in which the emergent properties are more than the aggregate of the parts. The increasingly rapid breakdown of natural systems that humanity is currently witnessing has brought home the dire need for another methodology towards society. Modern technologies born of the Industrial and Information Ages have brought prosperity undreamt of by previous civilisations; this, coupled with lessons learned from nature, creates endless possibilities. Experimentation has observed the evolution of genetic architecture, natural systems and their translation into digital organisms, and the hereditary complexity of those organisms, demonstrating an evolution from simple systems to more intricate, complex genetic architecture: design that proceeds from innovative technologies and from the wish to develop skills and pursue knowledge in design research located in new production paradigms, focused on the concepts and convergent interdisciplinary effects of emergence on design and production technologies, and on developing these as creative inputs to new architectural design processes. The instruments of analysis and design in this regard are both biological and computational processes. Life forms evolve constantly, and so should our approaches to the different aspects of the design process; otherwise all current solutions and approaches would eventually become fallacious. Maybe this leads us to reflect on the issues to be addressed by contemporary architecture.
Should a building be a static being, or rather an articulated system, capable of constantly collaborating with its surroundings, by one means or another retrieving data which it then reprocesses in order to react to this impulse in a procedure of self-adjustment? Or should a building rather be an open, dynamic, "living" structure or apparatus?

Diversity of forms, applied patterns and articulation in geometry is the signifier of the signified aspects that gain their meaning as space from the multiple layering and inscription of information. Through the ordering and categorizing of these fragmented semiotic sub-systems the complete semiotic system emerges, proposing an information-rich and articulated social campus design. The nature of interactivity is not a product, but a process. Nature is full of elegance. Sometimes that elegance, to human eyes, also appears as great beauty. But being beautiful is not (except perhaps when attracting mates) the purpose of that elegance. Nature is elegant in its designs because that elegance has a function.

Throughout the evolution of architecture in history, the act of addressing the increasing complexity that coalesced from the fields of technology, material systems, spatial sensibility, social structure and culture involved information processing, intended as its organization and transmission across the growing number of subjects and levels involved in the design and construction process. In the field of architecture, the relations among building practice and design/planning have grown increasingly articulated and intricate, with mutual interrelations and rich information flows among design, construction and communication, where the know-how of each process stage was embedded into the subjects involved, thus forming a seamless nexus that converged in the emergent figure of the architect-master builder. The recording and transmission of information from one step to the other followed the same necessity of organization beyond the brain's own capacity. Such complexity and abstraction started to be addressed through representation: building up from the projective technology that we, as humans, developed as symbiotic with the projection mechanism of our camera-eye vision since the Nazca pictorials, the aim of representation is to both unfold and compress a conceptual topological n-dimensional manifold into a sub-space of lesser dimension (which is the medium; typically in architecture the projection goes from 4D to 3D to 2D) through a likeness or an image.

How then is it possible to bridge the actual divide between design and making? As simulation allowed a more dynamic and open-ended exploration, it is crucial to understand this paradigm shift and its implications at its very core: as Manuel De Landa said, this entails topological, intensive and population thinking. Surely the definition of sets of relations becomes of primary importance, but beyond that it is even more important to define the objects of such relations and how both are modeled. Modeling itself is an operation that has to do with the specialization of information. If we come to the understanding that matter itself is information-based, and that ultimately its constructive aspects derive from its own relations between behavior and form, then it also becomes clear why programming is the most important paradigm of our present days. Deriving from object-oriented programming, object-oriented philosophy considers the possibility of a reality of objects in constant interaction, in which relations, gradients and intensities are objects themselves and can therefore be programmed in their constitutive characters and behaviors.


The most radical consequence of this paradigm shift, then, is a change from the definition of a global, superimposed, non-changeable and non-negotiable form and shape to the programming of the constitutive characters and behaviors of interacting objects. Material and geometric properties and environmental fields can then be programmed as the initial rules of a system, which can then be simulated and its results evaluated and interrogated. Another consequence of the increasing resolution with which our models and theories describe the fabric of reality, as well as of the non-linear feedback typical of complex systems, is that performance has recently been propelled into a leading role as an evaluation criterion in several branches of computational design research (especially biomimetically inspired ones, pioneered in sectors such as aerospace engineering and automotive). Such a variable is not new in the design field, yet its role in creative disciplines should be very different from the one it occupies in science and engineering. The latter fields are goal-oriented: performance optimization is the minimization of a function within precise conditions, where the space of possible configurations admits one universal optimum. This is the case when finding the shortest path between two points in a given medium; in other disciplines it could be the task of shaping a tuning fork to produce a 440 Hz sound wave, or the work made to tune all the gear components of a speed skier, where the goal is to go as fast as possible along a straight slope. Taking these same paradigms into creative disciplines, performance can instead be aimed towards facilitating the search for multiple optima in the space of possibilities: a guitarist has his own preferences in terms of instrument (sometimes dictating specifics on how his instrument should be made), amplifier and effects; a snowboarder tunes his own gear in order to ease the creation of new tricks. In both cases, the idea is subject-dependent and proactive (it anticipates events rather than being their reflection), with an initial push and an open-ended goal.
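As a loose, hypothetical illustration of this object-and-behavior paradigm (the names Agent, step and attraction_field are invented for this sketch and are not taken from the thesis), the Python fragment below programs only local rules, an attraction field and damping, and lets the global configuration emerge from simulating the interacting objects rather than from a superimposed form.

```python
import random

class Agent:
    """A hypothetical design 'object' whose behavior, not its final form, is programmed."""
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.vx, self.vy = 0.0, 0.0

    def step(self, field, damping=0.9, dt=0.1):
        # The environmental field acts as another object: it returns a force at a point.
        fx, fy = field(self.x, self.y)
        self.vx = (self.vx + fx * dt) * damping
        self.vy = (self.vy + fy * dt) * damping
        self.x += self.vx * dt
        self.y += self.vy * dt

def attraction_field(x, y, cx=0.0, cy=0.0, strength=1.0):
    """A simple gradient pulling agents towards a centre (cx, cy)."""
    return strength * (cx - x), strength * (cy - y)

agents = [Agent(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(20)]
for _ in range(200):                      # simulate; the 'form' is read off afterwards
    for a in agents:
        a.step(attraction_field)
print([(round(a.x, 2), round(a.y, 2)) for a in agents])
```

Swapping the field function or the damping value changes the emergent configuration without ever describing the final form explicitly, which is the point of the paradigm described above.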



BIODIGITAL ARCHITECTURE Alberto Estévez Studio_

CHAPTER 1 | ARCHITECTURE & GENETICS

1.1. Introduction to Genetics

Natural structures, or rather biological forms, are self-assembled progressively, as systems within systems. They assemble themselves under the load of gravity and accumulate material and energy from nature. A vital property of self-organized structures in natural systems, both living and non-living, is that small basic segments and components assemble into larger structures that have emergent behavior and properties because of the interactions between the parts making up the whole. A notable emergent property of human tissues, for instance, is their mechanical behavior. This feature is known as linear stiffening, and it is evident even in single molecules such as DNA.1 Charles Darwin had to deal with the ramifications of blending inheritance in his theory of evolution. He was compelled to perceive blending as not vital, or at any rate not the real principle, and suggested that the science of the mid-1800s had not yet found the right answer to its questions. That answer came from a contemporary, Gregor Mendel, although Darwin was apparently unaware of Mendel's work.

1.1.1. The Gene
Historically, a gene is defined as a heritable unit of phenotypic variation. From a molecular standpoint, a gene is the linear DNA sequence required to produce a functional RNA molecule, or a single transcriptional unit.2 Genes can be assigned to one of two broad functional categories, structural genes and regulatory genes; it is the function of the end product of a gene that distinguishes the two.
» Structural genes code for polypeptides or RNAs needed for the ordinary metabolic activities of the cell, e.g. enzymes, structural proteins, transporters and receptors, among others.
» Regulatory genes code for proteins whose function is to control the expression of structural genes; with respect to molecular organization, however, both classes of genes are similar.

1 Overview from Weinstock, M., "On Self-Organization: Stress-Driven Form-Finding in Architecture", in Genetic Architectures II: Digital Tools & Organic Forms, ESARQ/SITES Books, Barcelona, 2005.
2 Pearson, H. 2006. "Genetics: What is a gene?" Nature 441: 398-401.



A gene typically occupies a defined location on a chromosome, of which there are 46 in every human cell and which contain the whole human genome. The exact chromosomal gene location is defined by specific sequences marking the beginning and end of its transcription. Every gene has a particular effect and function within the organism's morphology or physiology, can be mutated, and can recombine with other genes. It is a store of information (as a nucleotide base sequence); consequently it does not initiate activity itself but is acted upon, e.g. during the process of gene expression.

Fig.1: Gene expression. A general structural arrangement of the different components making up a eukaryotic gene.3

The complete set of genes of an organism, its hereditary constitution, is known as the genotype. The human genome, for instance, contains an estimated 25 000 protein-coding genes. The physical manifestation, or expression, of the genotype is the phenotype: the organism's morphology and physiology. If a particular characteristic, such as brown eye colour, is part of an organism's phenotype, one can conclude that the individual carries the gene(s) for that characteristic. If, however, a particular characteristic is not expressed, one cannot conclude that the particular gene is absent, because expression of that gene might be repressed. Different variants of the same gene, resulting in different phenotypic characteristics, are called alleles.4

3 Brandenberg, O., Dhlamini, Z., Sensi, A., Ghosh, K. & Sonnino, A. 2011. Introduction to Molecular Biology and Genetic Engineering.
4 Griffiths, A.J.F., Wessler, S.R., Lewontin, R.C. & Carroll, S.B. 2007. Introduction to Genetic Analysis. 9th edition. Palgrave Macmillan.



1.1.2. Genes and Heredity
The study of genes and heredity is called genetics. Heredity phenomena have been of interest to people for much longer than the underlying principles have been scientifically researched and understood. Ancient peoples were improving plant crops and domesticating animals by selecting desirable individuals for breeding. Genetics as a set of scientific principles and systematic techniques developed in the 1860s, when the Augustinian friar Gregor Mendel performed a set of experiments that revealed the existence of biological "components" in charge of transmitting traits from generation to generation. These components were later called genes, after the discovery of chromosomes and genetic linkage in the early twentieth century. Up to that point genetics regarded genes as abstract concepts that by one means or another control hereditary traits. Through genetic analyses the inheritance of different genes was studied, yet the physical and biochemical nature of the gene remained obscure. Further work revealed that chromosomes consist of DNA and protein, and subsequent studies permitted the conclusion that DNA is, in fact, the genetic material.5

DNA was thought to be a simple molecule, and thus many researchers did not accept that it really carried and stored the information for a whole organism. How can such enormous amounts of information be contained and passed on from one generation to the next? Clearly, the genetic material must have both the capability to encode particular information and the ability to duplicate that same information precisely during each cell division. What kind of molecular structure could permit such complex capacities?

1.1.3. The structure of DNA
Although the exact DNA structure was unknown until 1953, its essential building blocks had long been known. It had been demonstrated that DNA is composed of four fundamental molecules called nucleotides, which are identical except that each one contains a different nitrogen-containing base. Every nucleotide is made up of a phosphate group, a sugar (of the deoxyribose type), and one of the four bases.6 In 1953, Francis Crick and James Watson were the first to succeed in assembling the building blocks into an accompanying sensible DNA structure. They used DNA X-ray diffraction patterns produced by Rosalind Franklin and Maurice Wilkins, and information from Erwin Chargaff. The X-ray data showed the DNA molecule to be long, thin and helical (spiral-winding) in shape.

5 Morange, M. & Cobb, M. 2000. A History of Molecular Biology. 1st edition. Cambridge (MA), Harvard University Press.
6 Miller, F.P., Vandome, A.F. & McBrewster, J. 2009. Central Dogma of Molecular Biology: History of Molecular Biology, Primary Structure, DNA Replication, Transcription (Genetics). Alphascript Publishing.



Fig.2: The four bases are adenine (A) and guanine (G) (the purines), and cytosine (C) and thymine (T) (the pyrimidines).7 In part (A), the four bases, the pairing of the bases and the connection of the bases through the sugar-phosphate backbone are delineated. Note that the two DNA strands are held together through base pairing and run in opposite directions, marked 3' and 5' end respectively (read: three prime and five prime). In part (B), a schematic drawing of the real DNA double-helix structure is portrayed, containing the same components in simplified form and labelling as in (A).

1.1.4. The flow of genetic information: the central dogma
In the early 1950s, Francis Crick suggested that there is a unidirectional flow of genetic information from DNA through ribonucleic acid (RNA) to protein, that is, "DNA makes RNA makes protein". This is known as the central dogma of molecular biology, since it was proposed without much tangible proof for the individual steps. Presently, these steps are known in detail: DNA is transcribed to an RNA molecule (messenger RNA [mRNA]) that contains the same sequence information as the template DNA, and subsequently this RNA message is translated into a protein sequence according to the genetic code.8
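As a minimal sketch of the first of these steps (hypothetical helper names, not code from the thesis), the Python fragment below transcribes a DNA template strand into mRNA by complementing each base and substituting uracil for thymine:

```python
# Transcription sketch: template DNA strand -> mRNA (thymine pairs with adenine,
# and adenine on the template is read as uracil in the RNA message).
DNA_TO_RNA = {"A": "U", "T": "A", "G": "C", "C": "G"}

def transcribe(template_strand: str) -> str:
    """Return the mRNA complementary to a template DNA strand."""
    return "".join(DNA_TO_RNA[base] for base in template_strand)

print(transcribe("TACGGCATT"))  # -> AUGCCGUAA
```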

7 Brandenberg, O., Dhlamini, Z., Sensi, A., Ghosh, K. & Sonnino, A. 2011. Introduction to Molecular Biology and Genetic Engineering.
8 Miller, F.P., Vandome, A.F. & McBrewster, J. 2009. Central Dogma of Molecular Biology: History of Molecular Biology, Primary Structure, DNA Replication, Transcription (Genetics). Alphascript Publishing.



1.1.5. The Genetic Code
The fundamental building blocks of DNA are the four nucleotides; the essential building blocks of proteins are the amino acids, of which there are 22 that naturally occur in proteins, the so-called proteinogenic amino acids. The genetic code is the correspondence between the sequence of the four bases in nucleic acids and the sequence of the 22 amino acids in proteins. It has been demonstrated that the code is a triplet code, in which three nucleotides (one codon) encode one amino acid.
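Continuing that sketch, the triplet code can be illustrated with a toy codon table; only a handful of the 64 codons are included, purely for illustration, and the helper names are invented:

```python
# A deliberately tiny subset of the standard codon table, for illustration only.
CODON_TABLE = {
    "AUG": "Met",  # also the start codon
    "CCG": "Pro",
    "GGC": "Gly",
    "UUU": "Phe",
    "UAA": "STOP", "UAG": "STOP", "UGA": "STOP",
}

def translate(mrna: str) -> list[str]:
    """Read the mRNA three bases (one codon) at a time until a stop codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino_acid = CODON_TABLE.get(mrna[i:i + 3], "???")
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

print(translate("AUGCCGUAA"))  # -> ['Met', 'Pro']
```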

Fig.3: The Gene, Transcription and Translation.9

In the nucleus, DNA is transcribed to a pre-mRNA molecule by RNA polymerase. The pre-mRNA is processed, e.g. by intron excision, to the mature mRNA. The mRNA is exported to the cytoplasm and translated into protein, which is accomplished by ribosomes and tRNA that together decode the genetic code into an amino acid sequence. Following translation, the synthesized protein adopts its correct three-dimensional shape and is ready to perform its cellular function. Long story short, the genetic code is for all intents and purposes universal; it is the same in all organisms inhabiting this planet.

9 Brandenberg, O., Dhlamini, Z., Sensi, A., Ghosh, K. & Sonnino, A. 2011. Introduction to Molecular Biology and Genetic Engineering.



Genes taken from plants may be decoded by animal cells, prokaryotic genes can be decoded by eukaryotic systems, and indeed the other way around. Without such a universal nature of the genetic code, genetic manipulation, engineering and development would be considerably more troublesome.10

1.2. CELLS

1.2.1. Enacted Progression
In a biological framework, each cell has its own vitality. Vitality is a consequence of the interaction between the individual cell and the external chemical and physical conditions. In a dynamic framework, interaction happens among all the particles and throughout the system. When an extrinsic impetus is applied to a given molecule, other particles react simultaneously, forming a continuously dynamic shape, reshaping or deformation. Particles gain momentum and move in distinct directions because of simultaneous outside forces, leading to a unique shape that cannot be formed by a single purposeful force. As Michael Hensel notes in Emergence, the process of activation is essential.

1.2.2. Introduction to Phytological Topologies

Architecture's noteworthy qualities lie within our ability to hypothesize upon organic mimesis as another pattern for both material and programmatic behavior. As it were, world history has reached a radical stage where the very fate of life as we know it can now be adjusted by reconfiguring the computational logics of natural selection.11 A new design language emerges out of the contemporary exploratory methodologies of biological systems. Research in the field of biological morphologies, as a point of inception and a better understanding of architectural concerns informed by nature's biological qualities, is like a glimpse over the brim of endless, yet unexplored, solutions. Understanding nature is an elucidation: an explanation that makes something clear. The design language tries to discover a specialised dialect, with particular terminology and syntax, in light of this branch of biological science. The analysis of growth, reproduction, metabolism, development, disease, ecology and evolution can inspire remedies and resolutions for spatial issues and design concerns, from the inception of design, through development, to larger-scale tasks and new breeds of material systems. The procedure and approach from theory to spatial conditions ought to have an exploratory methodology like the one used within biology today. Machine programming and algorithms are among the new fundamental tools used to stimulate and mimic nature and its imperceptible biological courses of action for the making of architectonical substances.

10 Voet, D. & Voet, J.G. 2004. Biochemistry. 3rd edition. Wiley & Sons.
11 Genetic Architectures II: Digital Tools & Organic Forms, ESARQ/SITES Books, Barcelona, 2005, page 127.



With L-systems, for instance, organic development and blooming mechanisms can be simulated and portrayed from a modelling point of view. Additionally, a connection to physiological processes seen in nature can be uncovered, and this involves different strategies from general mathematics.12
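As a minimal sketch of how such a rewriting system works (using Lindenmayer's classic algae axiom and rules rather than an example from the thesis), each generation is produced by rewriting every symbol of the current string in parallel according to fixed production rules:

```python
# Minimal L-system (Lindenmayer system) string rewriting.
# Axiom and rules are Lindenmayer's classic algae example.
AXIOM = "A"
RULES = {"A": "AB", "B": "A"}

def lsystem(axiom: str, rules: dict[str, str], generations: int) -> str:
    state = axiom
    for _ in range(generations):
        # Rewrite every symbol in parallel; symbols without a rule are kept as-is.
        state = "".join(rules.get(symbol, symbol) for symbol in state)
    return state

for g in range(5):
    print(g, lsystem(AXIOM, RULES, g))
# 0 A
# 1 AB
# 2 ABA
# 3 ABAAB
# 4 ABAABABA
```

Interpreting the resulting string as turtle-graphics commands (e.g. one symbol for "draw forward", others for turning) is what yields branching, fractal-like figures such as those shown in Fig.4.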

Fig.4: L-system (Lindenmayer system); several examples of the generation of fractals with dimension between 1 and 2.13

Within this domain the studio is intrigued to move beyond any structural imitation of biological organisms. The undertaking is to approach surfaces with a methodology different from the smooth, consistent and flawless structures normally made by machine designers, and to uncover a surface geometry with an external and internal rationale and with a perspective on the relationship of whole to part and its inverse. The exploration of the endless conceivable, in-depth articulations of the surface could offer novel insights into potential outcomes of spatial separation, form and structure, among others. Architectural prototype models ought to incorporate spatial conditions approaching those in biological forms. For this task, a mixed bag of parametric and topological programming formed the design instruments. By creating material for assessing and comprehending the criticality of this aspect, in relation to a conceivable assembly methodology, we could re-examine the starting geometry.

12 Otherwise known as Parallel Rewriting Systems, after the scientist Aristid Lindenmayer, who devised them.
13 Retrieved from http://mathworld.wolfram.com/LindenmayerSystem.html, 16/9/14.



1.2.3. Taxonomies in Phytology

Strains and classification
A life form, organisms in particular, is for the most part classified on the basis of phylogeny: evolutionary relationship to other species. In this regard, a species is a taxonomic rank beneath the rank of genus. Species in the same genus share a more recent common ancestor than do species in other taxa. Taxa [Greek tasso, to arrange, classify] are groups of life forms or organisms that fill a specific category of classification.14 Linnaeus, in his book Systema Naturae, published in 1735, basically used floral part differences to assign plants to these classes: species, genus, order, and class. Presently, we use a list of seven compulsory categories, each subdivided into three additional sub-categories, which makes more than thirty ranks of classification in a chain of ascending importance. Taxonomy and the ordering of things are part of the wider field of systematics15, whose objective is to reconstruct phylogeny16, the evolutionary history of a group of organisms.

Fig.5: Realistic delineation of Linnaeus' so-called "sexual system", illustrated by Georg Dionysius Ehret in 1736. The title reads "Linnaeus' sexual method for plants, described in the book Systema Naturae."17

Natural History Museum (London).18

14 Structural differences like shape, size, colour, etc.
15 Greek systema, an orderly arrangement.
16 Greek phyle, tribe; Latin genitus, producing.
17 Retrieved from http://www.imagejuicy.com/images/plants/c/convolvulus/7/, 13/08/14.
18 Retrieved from http://commons.wikimedia.org/wiki/File:Syst-sex.jpg, 18/08/14.



A phylogenetic tree, or evolutionary tree, is a tree demonstrating the evolutionary connections among different biological species or other entities known to have a common ancestor. In a phylogenetic tree, every node with descendants represents the most recent common ancestor of those descendants, and in some trees the edge lengths correspond to time estimates. Every node is known as a taxonomic unit. Internal nodes are generally called hypothetical taxonomic units (HTUs), as they cannot be directly observed and are instead inferred.19
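A minimal data-structure sketch of such a tree (Python; the node class, field names and toy species are invented for illustration): each node stores its children and the branch length leading to it, and the most recent common ancestor of two leaves can be recovered by comparing their paths from the root.

```python
class Node:
    """One taxonomic unit in a rooted phylogenetic tree."""
    def __init__(self, name, branch_length=0.0, children=None):
        self.name = name
        self.branch_length = branch_length      # edge length to the parent (e.g. a time estimate)
        self.children = children or []

def path_to(root, name, path=None):
    """Return the list of nodes from the root down to the named node, or None."""
    path = (path or []) + [root]
    if root.name == name:
        return path
    for child in root.children:
        found = path_to(child, name, path)
        if found:
            return found
    return None

def most_recent_common_ancestor(root, a, b):
    pa, pb = path_to(root, a), path_to(root, b)
    mrca = root
    for x, y in zip(pa, pb):
        if x is y:
            mrca = x
    return mrca.name

# A toy rooted tree: ((human, chimp), mouse)
tree = Node("root", children=[
    Node("primates", 1.0, [Node("human", 0.5), Node("chimp", 0.5)]),
    Node("mouse", 1.5),
])
print(most_recent_common_ancestor(tree, "human", "chimp"))  # -> primates
print(most_recent_common_ancestor(tree, "human", "mouse"))  # -> root
```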

Fig.6: A hypothetical rooted tree for RNA genes.20

Phylogenies symbolize the evolution of species: mutation, crossover and natural selection govern this evolution. These concepts give a manual for the development of synthetic and advanced devices capable of mimicking natural mechanisms and bringing into existence new properties and qualities. Evolutionary design has its grounding in computer science, design and evolutionary biology. It is an extension of evolutionary computation, which broadens and joins computer-aided design with analytical programming. It does not hesitate to borrow concepts from natural evolution.

19 wiki/phylogenetic tree, retrieved 20/08/14.
20 Retrieved from http://galleryhip.com/simple-phylogenetic-tree-of-life.html, 5/8/14.



1.3. Interaction Facets

CAD systems have taken on a role beyond the simple generation of production drawings; they now fundamentally influence the structures, forms and habitats that an architect designs. The capability to describe topological correlations in advanced digital environments has an immediate impact on the ways in which architects consider design: "Topological structure corresponds to quantitative difference and, hence, to service; topological form corresponds to qualitative difference, and hence, to surface."21 Among many illustrations of the application of surface modelling methods to the creation of new building forms is the Kunsthaus Graz by Peter Cook and Colin Fournier. The project not only investigates dynamic responses to the surrounding environment, it additionally interacts with it: a low-resolution, machine-controlled exterior displays events occurring inside the building to the outside. Novak's Paracube, in which parametric correlations between surfaces in 4-D space were used to define a 3-D space-frame structure, likewise shows how ideas such as multidimensional or trans-Euclidean space may best be understood and explored with advanced digital innovation.22

Fig.7: Marcos Novak, Paracube, 1997-98

1.4. Seed & Sequel: Cause & Effect

Before going further into our present biology and the design strategies of synthetically built systems, it is important to demonstrate certain mathematical facets of the morphology and the fundamental dynamical principles of organic and inorganic matter that depict the concordance and balance of the world, since everything is apparently bound equally by physical and mathematical laws, communicated by number and defined by natural order. We ought to begin by enquiring: what has emerged, what does it emerge out of, and how is emergence produced?23

21 Szalapaj, P. 2005. Contemporary Architecture and the Digital Design Process. Elsevier, Oxford, page 108.
22 Szalapaj, P. 2005. Contemporary Architecture and the Digital Design Process. Elsevier, Oxford, page 106.
23 Hensel, M., Menges, A. & Weinstock, M. 2004. Emergence: Morphogenetic Design Strategies. AD, Volume 74, No. 3, May/June 2004, Wiley-Academy, London, page 11.



1.5. EVOLUTION
1.5.1. Evolution: from Zero to Hero
Phylogenies symbolize the evolution of species: mutation, crossover and natural selection govern this evolution. These concepts give a manual for the development of synthetic and advanced devices capable of mimicking natural mechanisms and bringing into existence new properties and qualities. Evolutionary design has its grounding in computer science, design and evolutionary biology. It is an extension of evolutionary computation, which broadens and joins computer-aided design with analytical programming. It does not hesitate to borrow concepts from natural evolution. A closer look at morphogenetic systems applied to architectural design makes us perceive the notion that evolutionary systems could help to guide architects on an imaginative course. We were not searching for a natural, organic or genetic structure, but for computerised tools inspired by genetic systems and capable of helping and supporting design ventures. It is conceivable that such devices could be propitious for design innovation, among other aspects considering that one is centred, specifically, on the inception of the design process. In his hypothesis emphasizing the relationship between growth and form, D'Arcy Thompson comments: "Organic form itself is found, mathematically speaking, to be a function of time. [...] We might call the form of an organism an event in space-time, and not merely a configuration in space."24 This is an idea Hallé, Oldeman and Tomlinson echoed to accentuate the relationship: "The idea of the form implicitly contains also the history of such a form."25 Observations of phase transitions in materials, and of the discontinuity that is characteristic of a phase transition, suggest an alternative, fascinating mathematical approach that may be effective in simulating the connection of order and randomness in the processes of natural systems, and that may be beneficial to fuse into stress-driven advanced structure or form-finding approaches.26

1.5.2. Biology of an Evolutionary Nature
Evolutionary science, evolutionary biology in particular, explores the origin, the change and the multiplication of species over time. Out of the tremendous set of conceivable prospective genetic sequences, the desired "results" are the consequences of profoundly pertinent organisms that are capable of survival and replication within their domain or habitat. This is a certainty that lends evolutionary biology the capacity to be an engaging wellspring of ingenuity for tending to complex computational concerns that oblige searching through an immense number of possible outcomes. Also, evolution may be understood as a hugely parallel search strategy: rather than chipping away at one species at a time over a given period, evolution tests and changes entire populations of species at once, as a whole.

24 Thompson, D'Arcy W. 1917. On Growth and Form. University of Chicago Press, 2nd edition (1948), page 79.

25 Prusinkiewicz, P. & Lindenmayer, A. 1996. The Algorithmic Beauty of Plants. Springer-Verlag, New York, preface.

26 Genetic Architectures II: Digital Tools & Organic Forms, ESARQ/SITES Books, Barcelona, 2005, page 105.



The natural methodology of life's evolution and the methods used within evolutionary biology have influenced numerous other disciplines that use evolutionary algorithms to solve complicated problems, including architecture.27 A specific class of these computational algorithms is genetic algorithms. In his book A Thousand Years of Nonlinear History, De Landa touches on evolution. He elaborates on abstract machines, with evolution being the third abstract machine. This is more readily acceptable as an abstract machine in view of its well-known non-Darwinian instantiations, for instance evolutionary algorithms. The flow of genes through replication is undoubtedly just a part of what life really is in its entirety; the other part is constituted by the flow of biomass.28

Thus evolution is not the substance of life, but rather a stage, a temporary stage, that is neither permanent nor bound by a time lapse, in light of the fact that artificial things advance in life too, which could be referred to as evolution as well. One could even speculate and argue that artificial life is in fact not artificial at all, as once contended by Dennis Dollens. More reasonably, everything artificial, to the core of the term itself as we understand it, is indeed a product of an evolutionary process from a biological organism that has undergone evolution, lived or died, but either way eventually ended up as a material or raw material in a production system. This goes to show that artificial (non-living) things are actually living things.29 This 'artificial life' is an extended form of natural life. So it is all life, thus alive, and contending principles like evolution still apply to both, except in different ways we probably have not yet begun to comprehend.

1.5.3. Natural Selection
1.5.3.1. The Nature of Natural Selection: Definitions of Natural Selection

It is critical to perceive that "natural selection" is not synonymous with "evolution". Evolution can happen by processes other than natural selection, particularly genetic drift. What is more, natural selection can occur without any evolutionary change, as when natural selection maintains the existing state of affairs by excluding variants that deviate from the ideal phenotype.

27 Though this is recent in architecture as opposed to other fields of industry, it is effective and progressive.
28 De Landa, Manuel. 2000. A Thousand Years of Nonlinear History. The MIT Press, pp. 138-139.
29 Dennis Dollens' overview on the matter of artificial life.



Fig.8: Pseudocopulatory pollination. (Left) Ophrys apifera, one of the "bee orchids", uses pheromones to attract male bees and is shaped such that, in attempting to copulate with the flower, pollen adheres to the insect's body.30 (Right) A long-horned bee (Eucera longicornis) attempts to mate with an Ophrys scolopax flower. A yellow pollen mass adheres to the bee's head.31

The fitness, frequently called the reproductive success, of a biological entity is its average per capita rate of increase in numbers. When assessing natural selection among genotypes or life forms, the components of fitness for the most part comprise:
- the average number of offspring (e.g. eggs, seeds) produced via female function;
- the probability of survival to the various reproductive ages;
- the average number of offspring produced via male function.
"Reproductive success" has the same components, since survival is a prerequisite for reproduction. In nature, there is a struggle for existence. Natural selection explains both evolution and adaptation, and it can be directional, stabilizing, or disruptive.
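As a rough numerical sketch of these components (the life-table figures are invented, purely illustrative, and the function name is hypothetical), lifetime reproductive success can be approximated by summing, over the reproductive ages, the probability of surviving to each age multiplied by the expected offspring produced at that age:

```python
# Toy life table: age -> (probability of surviving to that age, expected offspring at that age).
LIFE_TABLE = {
    1: (0.50, 0.0),   # half survive to age 1, no offspring yet
    2: (0.30, 2.0),   # 30% survive to age 2 and produce 2 offspring on average
    3: (0.10, 4.0),
}

def reproductive_success(life_table) -> float:
    """Expected lifetime offspring per individual: sum of survival x fecundity over ages."""
    return sum(survival * fecundity for survival, fecundity in life_table.values())

print(reproductive_success(LIFE_TABLE))  # -> 1.0  (0.5*0 + 0.3*2 + 0.1*4)
```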

30 © E. A. Janes/Photolibrary.com, retrieved 18/6/14.
31 © Perennou Nuridsany/Photo Researchers, Inc., retrieved 18/6/14.


Fig.9: Adapting an adaptation. Ralph A. Clevenger/Photolibrary.com

Nudibranchs such as Flabellina iodinea are marine gastropod molluscs that lack shells. Many nudibranchs are unpalatable or dangerous because of stinging nematocysts they acquire by feeding on coral tissue and storing the noxious structures in their own bodies as a defense against predators. Bright "warning coloration" like this individual's is adaptive in toxic animal species: a signal to would-be predators that consuming this particular prey is not a good idea.

1.5.3.2. Natural Selection & Adaptation

An adaptation is a characteristic that improves the survival or reproduction of the organisms bearing it, relative to alternative character states, especially the ancestral condition in the population in which the adaptation evolved. Natural selection is the only known mechanism that causes the evolution of adaptations; as such, many scientists would simply define an adaptation as a characteristic that has evolved by natural selection. The word "adaptation" also refers to the process whereby the members of a population become better suited to some feature of their environment through change in a characteristic that affects their survival or reproduction. These definitions, in any case, do not completely address the complex issue of exactly how adaptation (or the process of adaptation) ought to be classified or measured.32

Image (below): Conflict between group and individual selection. Genotypes that have lower reproductive rates may nonetheless benefit their group. If so, then the species as a whole might evolve altruism through the greater survival of groups of altruistic individuals, even though individual selection within each group would act in the opposite direction.

32 Dennett, Daniel C. 1995. Darwin's Dangerous Idea: Evolution and the Meanings of Life. Simon & Schuster.



1.5.3.3. Design and Mechanism

The complexity and evident functionality of organisms' adaptations cannot conceivably emerge from the random action of physical forces. For many years, it appeared that adaptive design could only be explained by an intelligent designer; indeed, this argument from design was viewed as one of the strongest proofs of the existence of God. For example, the Reverend William Paley wrote in Natural Theology (1802) that, just as the intricate design of a watch implies an intelligent, purposeful watchmaker, so every detail of nature, such as the human eye, displays "every indication of contrivance, every manifestation of design, which exists in the watch", and must, moreover, have had a designer.

Fig.10: Weaver ants (Oecophylla) constructing a nest. Chains of workers, each seizing another's waist with her mandibles, pull leaves together. (Photo from Hölldobler and Wilson 1983, courtesy of Bert Hölldobler)



1.6. GENETIC ALGORITHMS
1.6.1. Prologue to Genetic Algorithms
Genetic algorithms were initially conceived by John Holland in the 1960s, and since then they have been used as stochastic routines for dealing with optimization and search problems, working on a population of possible solutions. According to Charles Darwin's theory of evolution, the repeated application of these mechanisms modifies an initial species into different other species, where only the stronger species predominate. Genetic algorithms perform the same operations on a population of candidate solutions, with only those that fit the result better surviving. Although there is no formal definition of genetic algorithms, every one of them comprises four elements. The first is the population of chromosomes, which encodes the possible solutions to the problem. Selection is second, and it refers to the part of the population that will pass on to the next generation. Selection is performed based on a fitness function that determines how suitable a solution is.
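A minimal sketch of the first two of these elements, a population of chromosomes and fitness-based selection (Python; the bit-string encoding, the "one-max" fitness function and all names are invented placeholders rather than anything prescribed by the text):

```python
import random

# Each chromosome is a fixed-length bit string; fitness here is simply the number of ones
# (the classic "one-max" toy problem), standing in for any design evaluation.
def random_chromosome(length=16):
    return [random.randint(0, 1) for _ in range(length)]

def fitness(chromosome):
    return sum(chromosome)

def roulette_selection(population, k):
    """Fitness-proportionate selection: fitter chromosomes are more likely to be picked."""
    weights = [fitness(c) for c in population]
    return random.choices(population, weights=weights, k=k)

population = [random_chromosome() for _ in range(30)]
parents = roulette_selection(population, k=10)
print(sorted(fitness(c) for c in parents))
```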

Fig.11: Genetic operators: crossover.33

The selection process is applied to every single generation created. Crossover refers to the fusion or exchange of attributes between two members of the elite group defined by selection, by which offspring are created. There are different sorts of crossover, but the most frequently used are the one-point crossover, in which the parents are cut at a particular point and the head of the first is attached to the tail of the second or the other way around, and the two-point crossover, in which a segment of one parent is taken and exchanged with the segment that lies in the same location in the other parent.

33 Retrieved from https://wiki.ece.cmu.edu/ddl/index.php/Genetic_algorithms, 17/9/14.



The pseudocode reads: "First a population P(m) is created. Then the loop begins with the first generation, G. The population is evaluated by E(P(m)) to form the evaluated population EP(m). This evaluated population is then refined by another evaluation function to form the fit population FP(m), which consists of the individuals fit enough to reproduce. A new population is formed by using a number of genetic operators, GO(FP(m)). This population is then put back through the loop. The termination criterion in this pseudocode is the number of generations. However, it can also be a different function, such as reaching a threshold fitness value."34

    P = P(m)
    for G = 1:100
        EP(m) = E(P(m))
        FP(m) = E(EP(m))
        P'(m) = GO(FP(m))
        P(m) = P'(m)
    end

The application of this is well evident in architecture, in coding and scripting.35
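Translating that quoted pseudocode into a runnable sketch (Python; the bit-string encoding, "one-max" fitness, elitist truncation and parameter values are assumptions made for illustration, not taken from the cited source), with one-point crossover and mutation standing in for the genetic operators GO:

```python
import random

CHROMOSOME_LENGTH, POPULATION_SIZE, GENERATIONS = 20, 40, 100

def fitness(c):                        # E: toy "one-max" evaluation, stands in for any design metric
    return sum(c)

def select_fit(population, keep=0.5):  # refine the evaluated population into the fit population FP(m)
    ranked = sorted(population, key=fitness, reverse=True)
    return ranked[: int(len(ranked) * keep)]

def crossover(parent_a, parent_b):     # one-point crossover: head of one parent, tail of the other
    point = random.randint(1, CHROMOSOME_LENGTH - 1)
    return parent_a[:point] + parent_b[point:]

def mutate(c, rate=0.01):              # occasional bit flips keep variation in the population
    return [1 - bit if random.random() < rate else bit for bit in c]

# P = P(m): the initial population of random chromosomes
population = [[random.randint(0, 1) for _ in range(CHROMOSOME_LENGTH)]
              for _ in range(POPULATION_SIZE)]

for generation in range(GENERATIONS):            # for G = 1:100
    fit_population = select_fit(population)      # FP(m) = E(EP(m))
    population = [mutate(crossover(*random.sample(fit_population, 2)))   # GO(FP(m))
                  for _ in range(POPULATION_SIZE)]

print(max(fitness(c) for c in population))       # best individual after the final generation
```

In a design setting the fitness function would instead score a candidate geometry against the chosen performance criteria, which is where the architectural judgment discussed in the following sections remains with the author.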

1.6.2. The Allure of Evolution
Let us pose a question: why use evolution as inspiration for resolving computational issues? To researchers of evolution and computation, the mechanisms of evolution appear well suited to some of the most pressing computational problems in numerous fields, including architecture. Many computational problems require searching through countless possible solutions. One example is searching for a set of rules or mathematical laws that will forecast the ups and downs of a financial sector, for example the fall and rise of shares, currencies and investments. Such search problems can regularly profit from an effective use of parallelism, in which many distinct potential possibilities are investigated and explored simultaneously. What is required is both computational parallelism, with numerous processors assessing candidates simultaneously, and an intelligent methodology for selecting the sequence in which they are assessed and analyzed. Many computational problems require a set of rules (as later seen in scripting and coding languages) to be adaptive and to keep performing well in a constantly changing, unpredictable environment. Lastly, many computational problems require complex solutions that are hard to program by hand. An obvious illustration is the problem of creating artificial intelligence (AI). Initially, artificial intelligence experts presumed that it would be straightforward to encode the rules that would confer intelligence on a program; expert systems were one outcome of this early optimism. Presently, however, many artificial intelligence researchers have concluded that the underlying rules of intelligence are too intricate to encode by hand in a "top-down" approach.

34 Retrieved from https://wiki.ece.cmu.edu/ddl/index.php/Genetic_algorithms, 16/9/14.
35 Refer to the Ezio Blasetti Studio Project in the subsequent chapter.



1.6.3. Genetic Algorithms and Architecture
Whilst other disciplines have adopted computational devices based on the principles of evolutionary biology, evolutionary processes have not been comprehensively adopted in architectural design and leave a considerable amount to be desired. Only in the last decade has there been an observable shift in the way architects design, through exploration and experimentation with such procedures to address complex issues. Genetic algorithms offer a compelling resolution for this by tackling optimization and search problems, operating on a population of possible results. In architecture, genetic algorithms work in two distinct ways: as optimization instruments and as form-generation tools. In the optimization line, they address well-defined building problems, for example structural, mechanical and thermal comfort performance, while in the latter they are used under the scope of the idea of Emergence.

1.6.4. Author and Design Process
The shift in the design process and in the role of the designer is one of the most essential ramifications of the use of genetic algorithms to date. Where algorithms are used as an optimization system, those changes are not that paramount. Even if at first it may appear that those strategies lead to the removal of the architect or creator from the design loop, this is not really the case. In reality, the designer is still the author: the person who decides what will be optimised, setting the variables and the functions of the problem, whilst the optimisation systems are only one more instrument at the architect's disposal, organizing the design process and performing the required calculations. On the other hand, when genetic algorithms are used as a form-generation device these implications are more significant, since they are used during the conceptual design stage. Evolutionary simulations supplant the customary design techniques, and the creator or designer is, in a sense, counterbalanced or nullified and made insignificant. This is on the grounds that a large portion of architects use genetic algorithms to breed new forms instead of merely designing them. As entrancing as the thought of breeding buildings inside a computer may appear, for some designers it is clear that merely using advanced digital innovation without productive structural, functional and topological intuition will never be sufficient for genuine architecture. Consequently, the primary role of the designer, architect or author is to be the judge of aesthetic vigour. As Steadman noted: "Just as Darwin inverted the argument from design, and 'stole away' God as designer, to replace Him with natural selection, so the Darwinian analogy in technical evolution removes the human designer and replaces him with the 'selective forces' in the 'functional environment' of the designed object."36

36 Steadman, Philip. 1979. The Evolution of Designs: Biological Analogy in Architecture and the Applied Arts. New York, Cambridge University Press, page 189.



1.6.5. Selection versus Output: Favour and Totality
The selection of the final output is yet another issue that is determined by the use of genetic algorithms. When driven by necessity, the target is fairly well defined and thus the result selection is centred on an ideal or superlative result. Why should designers follow such a strict, consistent and objective methodology if, in the end, they base their decision on speculative and subjective ornamental criteria? Why do they not design something similar to the final, already preconceived and desired result from the very beginning? Is it because some architects just use genetic algorithms as a device to surpass the constraints of their imagination, and deduce that the usage of genetic algorithms catalogues their design process, furnishing it with a hypothetical and conceptual substance?

1.6.6. Genetic Algorithms in Architecture: Exigency or Drift?
Genetic algorithms, which can be characterized as a computational strategy based on the principles of evolution, have recently been adopted in architecture to rectify and resolve complex issues in the form and function of design undertakings. Whilst there has been growing enthusiasm for the use of genetic algorithms in architecture, there has not yet been systematic research on the performance of these algorithms and their praxis in architecture: whether they can be used to satisfy particular needs of architecture and the design industry, or only to shoulder imaginative and complex formations, and hence whether they serve reality or utopia. It is a question of materiality versus bliss and Arcady. To answer these inquiries, the operations of genetic algorithms in disciplines besides architecture, their application in architecture, and the ramifications of their application in architecture are explored. Undoubtedly there are numerous challenges in the application of genetic algorithms in architecture, at least presently. That, however, does not dismiss the fact that genetic algorithms can play a more viable part in the future of architecture. On one side there is the architectural problem, which has a "rationale" and a fundamental "facet". The 'rationale'37 is the escalating amount of information and the surging level of complexity contained within architecture today. The 'facet'38 is the different building performances, including spatial, practical, structural, functional, feedback, energy and lighting performance; on the other side there is the capability of genetic algorithms. Genetic algorithms are excellent techniques for addressing problems of ever-increasing complexity, discovering suitable design remedies from uncertain search spaces constrained by different data facets. If design authors try to use computational design to confront engineering and architectural issues, generative algorithms could be one conceivable device to this end: they can fathom the 'rationale' whilst considering the essential 'facets'. Nonetheless, for genetic algorithms to be infused within architecture, a few things must be carried out.

37 To be interpreted in its definition as reason, with a logical basis.
38 To be understood in the context of aspect, feature and standpoint, as in philosophy.



Equally, to uncover the optimal form, reducing costs in one regard while augmenting sunlight in another, in the following stage of the process another performance must be figured in and new functionalities introduced, e.g. thermal performance and building comfort, among others. A conceivable remedy for this is the coordination of generative devices with optimisation tools: the combined, double usage of genetic algorithms as both a present-day pattern and a need. The form then arises from the concurrent computation of a large portion of the performance-driven functions. This procedure manifests itself as a sustained effort to accomplish the design objectives, discovering an adequate fit between form and milieu, characterising the milieu as "anything in the world that makes demands of the form"39 - including meanings, aesthetics, environment and function - as Christopher Alexander noted. Indeed, for Alexander, "the form is the solution to the problem; the context defines the problem."40
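A minimal sketch of this double usage, assuming a toy problem in which a genome of louvre angles is evolved against an invented fitness that rewards a daylight proxy and penalises a material-cost proxy; the functions `daylight_gain` and `material_cost` are illustrative stand-ins, not the performance models of any real project.

```python
import random

# Toy "performance" models: illustrative stand-ins, not real simulations.
def daylight_gain(genome):
    # pretend louvre angles near 45 degrees admit the most useful daylight
    return -sum((a - 45.0) ** 2 for a in genome)

def material_cost(genome):
    # pretend steeper louvres use more material
    return sum(abs(a) for a in genome)

def fitness(genome, w_light=1.0, w_cost=0.5):
    # concurrent computation of several performance-driven functions
    return w_light * daylight_gain(genome) - w_cost * material_cost(genome)

def evolve(pop_size=40, genes=8, generations=100, mut_rate=0.1):
    pop = [[random.uniform(0, 90) for _ in range(genes)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]           # selection
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, genes)        # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(genes):                  # mutation
                if random.random() < mut_rate:
                    child[i] = random.uniform(0, 90)
            children.append(child)
        pop = children
    return max(pop, key=fitness)

best = evolve()
print("best louvre angles:", [round(a, 1) for a in best])
```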

1.6.7. Conclusion
One of the issues that contemporary architecture has so far neglected to manage, and needs to manage, is the amount of data and the emerging multifaceted nature of a large portion of its projects. Only recently have architects taken the initiative to use genetic algorithms to address such issues. As discussed above, this exhibits the double operation of genetic algorithms in architecture and design: as optimisation devices and as form-generation apparatuses, addressed as a need and as a pattern respectively. Likewise shown are the ramifications of genetic-algorithm applications, ranging from the substitution of the conventional design process by evolutionary simulations, the changed balance and partial rescission of the architect's role, the subjective criteria of the final chosen option, and the local usage of genetic algorithms in only a portion of the design stages. Be that as it may, genetic algorithms can assume a more effective agenda in subsequent architecture and design, answering architectural concerns with the methodologies of the design borne in mind, if legitimately applied. A conceivable answer to this issue is the coordination of generative tools with optimisation devices to accomplish a concurrent computation of performance and a global evaluation of the design that emerges from the performance-driven capacities.41

39 Christopher Alexander, Notes on the Synthesis of Form (Cambridge, Harvard University Press: 1964), 15.
40 Ibid.
41 Philip Steadman, The Evolution of Designs: Biological Analogy in Architecture and the Applied Arts (New York, Cambridge University Press: 1979), 189.



1.7. THE DESIGN APPROACH:
1.7.1. A Computational Modus Operandi
Modus operandi (MO).42 When thinking about complex frameworks, we are accustomed to imagining nested systems, for instance systems within systems. Obviously a chain of command is also a sort of system: a hierarchy. The more horizontally organised systems that we consider as complex frameworks De Landa calls meshworks, and they constitute his second abstract machine. "...there are three elements in this diagram. First, a set of heterogeneous elements is brought together via … an interconnection of diverse but overlapping elements. … Second, a special class of operators, or intercalary elements, is needed to effect these interconnections."43 Contemporary understandings of nonlinear urban dynamics show us that, by and large, friction - delays, bottlenecks, conflict and the uneven distribution of resources - plays a critical part in creating self-organisation. Consequently, eradicating this from our models, for instance by assuming an optimising rationale, naturally obliterates the likelihood of capturing any factual dynamical effect.44 Systems are not basic instantiations of one or the other of these abstractions, either hierarchy (rankings of importance) or meshwork, but are really intricate: "Self-organized meshworks of diverse elements, versus hierarchies of uniform elements…..meshworks and hierarchies not only coexist and intermingle, they constantly give rise to one another…hybrid form[s]: a hierarchy of meshworks - a meshwork of hierarchies."45

1.8. PARADIGM

1.8.1. The Syntactic Role
Design to defect, or imperfection, is a distinct strategy of biological frameworks that depends on a high number of weak components arranged in redundant patterns, permitting adaptive response to a range of evolving conditions, predictable failure modes and shock absorption, at the cost of a more complex design phase that is strenuous to compute. Such systems operate through states of dynamic equilibrium, and the complex patterns and structures that emerge from the material level upwards through successive scales are an evolutionary response to development within continually acting energy fields. Considering that material in nature is expensive while form is cheap - producing new material implies a cost in matter and energy, while organising material as it is formed carries almost no extra cost - coherence in the self-organisation of material patterns is a key criterion for selection, although it is not the only criterion propelling that selection.

42 A particular way or method of doing something, usually distinctive.
43 DE LANDA, Manuel. A thousand years of non-linear history, The MIT Press, 2000, p41.
44 DE LANDA, Manuel. A thousand years of non-linear history, The MIT Press, 2000, p41.
45 Ibid.



Another key feature of biological systems, however, is that they work through the integration and multi-performance of parts and subsystems, as opposed to an assembly of mono-optimised components, targeting a global dynamic coherence instead of global optimisation.

An elaborate illustration is given by the scrutiny of two structural morphologies: the Fuller dome and the honey bees' honeycomb. Whilst the dome is a unit-based subdivision design steered by the structural optimisation of a single repeated triangular component, dealing with a regular surface of consistent curvature and managed in two distinct stages, design and construction, the honeycomb is a condition-based, developing structure in which the quintessential hexagonal cell form emerges as a consequence of a closest-packing material response for systematic load distribution throughout the system, while presenting an adequate thermal state for larval growth.

Whilst the first is tied to global properties such as unit geometry and size, the second works effectively through the complex interrelation between local and global properties, and is therefore capable of condition-based development over time and of expressing an amplified diversity and intricacy of cases well beyond traditional Euclidean geometry. Since material and self-organisation are still at work within the discernible constraints, multi-performance coherence is an indicator that logic lies behind even the most complex shape.



Images (above): Subdivision pattern of a mono-optimized Fuller geodesic dome and a multi-performative natural honeycomb structure.46

Fig.12: Comparison to the fullerene molecules, which are becoming basic tools of nanotechnology.47

“The Fullerenes are a recently-discovered family of carbon allotropes. They are molecules composed entirely of carbon, in the form of a hollow sphere, ellipsoid, or tube. Spherical fullerenes are sometimes called buckyballs, and cylindrical fullerenes are called buckytubes. Buckminsterfullerene (C60) was named after Richard Buckminster Fuller, a noted architect who popularized the geodesic dome.”48

46 Retrieved from http://b2dymaxionhouse.blogspot.com.es/p/about.html, 14/8/14.
47 Retrieved from http://www.miqel.com/random_images/diversity_of_images_3.html, 16/8/14.
48 Retrieved from http://en.wikipedia.org/wiki/Fullerene, 12/7/14.



How, then, does a multi-performance system deal with these conflicting demands? At the level of condensed matter, curvature plays a critical part: shape determines capability, and the energetics of that capability prescribes the ideal structure required.49 In organic systems, disparate demands of space, function and structure are negotiated through a principle of regulation rather than exclusion: one interest does not prohibit or eliminate the others; rather, the non-dominant demands do not cease to exist but undergo self-moderation and mitigation to serve the predominant ones, so that their realm of expression is constrained by the paths followed in the domain of the predominant interest. For instance, form-finding strategies, which are driven by the tendency of energy to distribute itself uniformly through a system while minimising the effort needed, are continually at work serving other, predominant demands such as the system's shape and its functions; they direct neither the general morphogenetic process nor its objective, the minimisation of a functional objective, for example the concession and pliability by which the aggregate resilience, or elasticity of energy, of a system in a given realm lends itself to the system as a whole.

The capability of a system to maintain its internal conditions in a stable state by regulating its internal environment is called homeostasis.50 Multi-performance system states are thus in dynamic equilibrium; they display the classic self-regulatory dynamics of morphogenetic processes that flourish on matter's capacity for self-organisation, through which it displays hierarchy and coordination; patterns are an expression of non-linear yet rational system differentiation and articulation.
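As a rough illustration of homeostasis as negative feedback, the sketch below regulates a single internal variable toward a setpoint under periodic disturbance; the setpoint, gain and disturbance values are arbitrary assumptions, not measurements of any real system.

```python
# Minimal negative-feedback sketch of homeostasis: an internal variable is
# nudged back toward a setpoint in proportion to the sensed error.
def regulate(setpoint=22.0, gain=0.3, steps=50):
    state = 30.0                                     # internal variable, initially disturbed
    history = []
    for t in range(steps):
        disturbance = 0.5 if t % 10 == 0 else 0.0    # periodic external load
        error = setpoint - state                     # sensed deviation
        state += gain * error + disturbance          # proportional correction
        history.append(round(state, 2))
    return history

print(regulate()[-5:])   # settles near the setpoint: dynamic equilibrium
```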

1.8.2. Method: from project to process
Advances in computational power and the accelerating ability to mimic complex system dynamics have, over recent years, provided the devices to pursue complexity to a level where 'design to defect' is now an available design approach to follow. The capacities to merge exogenous conditions, to execute and control system articulation as variations in an adaptive response, and thus to control information from inception through simulation and optimisation to materialisation, are now executable. Information flow and control are achievable through a framework that merges several softwares and can be perceived as a strategic tool, one that translates thought into execution. Since software is not everlasting, but continually undergoing an evolutionary process in which each outdated version gives way to a new, more progressive one, reasoning about and learning from successor perspectives is of the utmost significance.

49 Hyde, 1996.
50 Cannon, 1929.



1.8.3. A Biomimetic MO - Multi-performance Systems

In its most elementary form, biomimetics is an introduction to the ways in which organisms have evolved form, materials and structures in response to varied functions and environments, followed by an account of the engineering design principles that have been abstracted from nature in industry and material science. A comprehension of natural systems - general form, anatomy, energy flows, behaviour and so on - together with an exploration of their interrelations and an abstraction of engineering principles is pursued. Biomimetics51 refers to the investigation of life processes in biological systems and their interpretation in numerous fields and industries, the most thriving and beneficial being engineering first, and architecture and design thereafter. Morphogenetic methodologies, encompassing both morphology and material-system studies, and their association with performance assumed an essential importance once it became clear that many adaptive strategies of such systems functioned best through morphological feedback to endogenous or exogenous conditions. Of the contrasting design methodologies found in industrial techniques and in biological systems, one might be portrayed as "design to perfection" and the other as "design to imperfection".52 Design to perfection involves the optimisation of every component towards the single optimum found within the prevailing conditions through the minimisation of an objective function. Until the recent affirmation of a complexity paradigm, the inclination towards a standard remedy judged diverse or complex outcomes as unrefined simply because they were unorthodox, confusing them with intricacy, if not treating them as a consequence of procedural lapses. Minimisation of the objective function implies the quest for the best performance within the conditions given, which prompts a single-performance optimisation. The industrial model infers its objectives and processes of action from a linear narrative wrapped around the possibility of decomposing complex phenomena into rudimentary parts and the elementary assertion that all material processes are optimisation processes.53

1.9. PARAMETRICS

1.9.1. Retrospect of Parametricism
'Parametric design depends greatly on a manipulation of form that might appear visually interesting, but is often superficial.'54 In retrospect of this view, parametricism is taken to depend enormously on a manipulation of form that may appear visually intriguing yet is regularly peripheral and insignificant. However, with time always comes change, because change is inescapable. This is apparent in the different fields of industry around us, and architecture and design are no exception. As Woodbury once noted, "Design is change, parametric modelling represents change."55

51 This will be discussed in further detail in subsequent chapters of this thesis.
52 Vincent, 2005.
53 De Landa, 2007.
54 Leach, Neil, Parametrics, 2009.
55 Woodbury, R., 2013.



Whilst some firmly embrace it as the new and better approach to design, others unequivocally refute it, perceiving it as mere sketching and rendering. Designers utilising parametrics ought to be mindful of its merits and downsides to ensure that they are designing with the utmost efficiency. Parametricism is an affirmation of exploration that dispenses a new medium for exploring, experimenting with, representing and communicating forms. New processes of designing lead to new processes of thinking, which can be innovatively liberating for the architect or author of a design. Experimentation is freed from preconceived thought, and ideas spring into outcomes and design objectives explored in a manner devoid of the conventional design approach. Computation furnishes a more robust and systematic tool of analysis than an individual.56 Explorations can be effectively and distinctly visualised and catalogued at any stage of the design process, making analysis richer and more diverse, and the analytical capabilities of the system can be used to make the collation and production of design options more efficient. Designs are updated and redesigned by their parameters.57 New models do not need to be reconstructed or redrawn from scratch with each change.58 Parameters may be set to ensure that only significant design solutions are delivered, lessening the time wasted exploring unviable alternatives (a minimal sketch of such parameter-driven filtering follows below). As with all other design approaches, exploration, innovation and imagination can occur before a brief is available; what the brief adds are the restrictions within which the designer controls subsequent changes to the design. Authorship is upheld.
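A minimal sketch of such parameter-driven generation and filtering, assuming a hypothetical tower whose only parameters are floor count and overall twist, and whose 'viability' test is an arbitrary placeholder constraint rather than a real brief.

```python
from itertools import product

def generate_options(floor_counts, twist_angles):
    # every combination of parameters is a candidate design variant
    for floors, twist in product(floor_counts, twist_angles):
        yield {"floors": floors, "twist_deg": twist,
               "height_m": floors * 4.0,              # derived value
               "twist_per_floor": twist / floors}     # derived value

def viable(option, max_height=220.0, max_twist_per_floor=3.0):
    # parameters set so that only significant design solutions are delivered
    return (option["height_m"] <= max_height and
            option["twist_per_floor"] <= max_twist_per_floor)

options = [o for o in generate_options(range(30, 61, 5), range(0, 181, 30)) if viable(o)]
print(len(options), "viable variants, e.g.:", options[0])
```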

Fig.13: parametric workshop project, Tsinghua University, China, 2010

56 Peters, B., & de Kestelier, Computation Works: The Building of Algorithmic Thought, 2013.
57 Woodbury, R., Elements of Parametric Design, 2010.
58 Ibid.



Parametricism is pushing innovation in design and engineering technologies and inspiring collaborations across multidisciplinary industries. With novel forms being created by architects and designers, a much closer collaboration between engineer, manufacturer and designer is obliged to emerge in order to make these forms a reality; although, to remain even-handed, this could of course also be seen as a weakness from one perspective. The traits of fields and of complex systems coincide in a cogent relationship. This profound relationship permits an eloquent differentiation that can be architecturally deciphered and executed in both tangible and intangible, incorporeal ways; a case in point is a semiological parametric framework developed from the global scale (master plan) to the local scale (interior space) using parametric instruments such as path optimisation and vector fields.59 Diversity of forms, applied patterns and articulation in shell geometry is the parametric signifier of the signified that attains its relevance as space from the layering and inscription of data. As Werner notes, "Through the requesting and classifying of these divided semiotic sub-frameworks the complete semiotic framework rises, proposing a data rich and enunciated social outline."60 "Abstraction is a procedure by which higher ideas are derived from the use and classification of literal, 'factual' ideas, from first principles, or by alternative techniques. [… ] In software engineering, abstraction is a procedure by which data and programs are characterised with language and logic. Abstraction tries to reduce and set aside subtle details so that the developer can concentrate on a few ideas at a time."61 "Representation is the use of signs that stand in for and replace something else. It is through representation that people order the world and reality through the act of 'christening' its elements. Signs are arranged to form semantic constructions and convey relations."62 A representation is a kind of transcription in which the sensory data describing a physical entity is rendered in a medium. The degree to which an artistic representation resembles the entity it stands for is a matter of judgement and does not define the meaning of the term.63

59 Werner, 2013.
60 Ibid.
61 http://en.wikipedia.org/wiki/Abstraction.
62 Mitchell, W. 1995, "Representation", in F. Lentricchia & T. McLaughlin (eds), Critical Terms for Literary Study, 2nd edition, University of Chicago Press, Chicago.
63 http://en.wikipedia.org/wiki/Representation_(arts).



1.10. DESIGN TOOLS64
1.10.1. NURBS: in Rhino
Rhino is a NURBS (Non-Uniform Rational B-Spline) modelling package produced by McNeel.65 Modelling with NURBS is well suited to describing complex geometries for fabrication, as curved forms are not approximated by faces or segments as they are when modelling with meshes/polygons (as in 3ds Max or Maya) or splines (as in AutoCAD and similar programs). Instead, curves and curved surfaces are described by interpolating between 'control points'. This characteristic of NURBS geometry is most evident in the fact that no matter how far you zoom in on a NURBS curve or surface, it will never appear flat or faceted. Consequently, the designer has a great deal of control over the parameters of objects in Rhino and can use them to initiate and relate the geometries of other objects: curves and surfaces can be divided into exactly equal segments, simplified accurately, used to precisely measure angles and curvature, and so on.
1.10.2. Grasshopper
Grasshopper is a plugin for Rhino that makes the history of all modelling operations and geometric properties explicit to the designer. It encompasses a multitude of further plugins that permit an extensive degree of manipulation and exploration, from Kangaroo to math surfaces and beyond. This stream of geometry and information is represented as a graph of connected nodes and is commonly cited as a Visual Programming Interface. Modifying a node in the graph causes the change to flow through to all connected nodes downstream, updating the output of each of them and consequently permitting designers to retrospectively modify parameters and operations at any time and promptly66 discern their impact on the final design or output, without the necessity of redrawing the geometry. Grasshopper, and other scripted design processes, permit representation driven by the use of data.
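The dataflow idea behind such node graphs can be sketched independently of Grasshopper itself; the classes below are a toy stand-in, not the Grasshopper or Rhino API, showing how editing one upstream parameter re-evaluates only the nodes downstream of it.

```python
class Node:
    """A tiny dataflow node: recomputes when an upstream input changes."""
    def __init__(self, func, *inputs):
        self.func, self.inputs, self.listeners = func, list(inputs), []
        for node in self.inputs:
            node.listeners.append(self)
        self.value = None
        self.update()

    def update(self):
        self.value = self.func(*[n.value for n in self.inputs])
        for downstream in self.listeners:           # push the change downstream
            downstream.update()

class Slider(Node):
    """A parameter with no inputs, like a Grasshopper number slider."""
    def __init__(self, value):
        self.func, self.inputs, self.listeners = None, [], []
        self.value = value

    def set(self, value):
        self.value = value
        for downstream in self.listeners:
            downstream.update()

radius = Slider(5.0)
count = Slider(8)
circumference = Node(lambda r: 2 * 3.141592653589793 * r, radius)
segment = Node(lambda c, n: c / n, circumference, count)

print(round(segment.value, 3))    # 3.927
radius.set(10.0)                  # edit one parameter...
print(round(segment.value, 3))    # ...downstream values update: 7.854
```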

1.11. EXPRESSION OF INTEREST
1.11.1. Design Geometry
Diversity of forms, applied patterns and articulation in geometry is the signifier of the signified aspects that gain their meaning as space from the multiple layering and inscription of information; through the ordering and categorising of these fragmented semiotic sub-systems the complete semiotic system emerges, proposing an information-rich and articulated design. Architecture such as LAVA's Green Void (2008), Smart Geometry's Gridshell installation and IBA's Canton Tower provides established precedents for the successful parametric application of minimal surfaces and geodesics; all three examples have utilised parametric modelling to explore and realise the aesthetic and structural opportunities of their projects, minimising waste in all aspects of the design while producing a strong aesthetic presence.

64 A piece of software that carries out a particular function, typically creation or modification.
65 Robert McNeel & Associates, 2013.
66 Not always; there is no "one size fits all". Grasshopper creates a plethora of traps for beginners to fall into, especially when generating polysurfaces or calculating intersections.



The outcome is three very different projects; however, they all adopt similar strategies. The Green Void and Canton Tower projects, while completely different in scale and function, both look to nature's structural principles in their architectural practice. "IBA strives towards coherence, the kind normally only found in nature."67

Fig.15 (left): The Green Void, LAVA, 2008.68
Fig.15 (right): Gridshell installation, Smart Geometry, 2012.69

“LAVA explores frontiers that merge future technologies with the patterns of organisation found in biological organisms and believes this will result in a smarter, friendlier, more socially and environmentally responsible future.”70 Geometry such as ruled surfaces, paraboloids, minimal surfaces, geodesics and booleans provides a very compelling and valid origin point for parametric exploration. Exploration of geometry through techniques such as geodesics and minimal surfaces offers an opportunity to optimise, and is an extremely relevant element of current architectural discourse as social values favour 'environmentally friendly' or 'green' architecture. Similarly, Smart Geometry used the bending parameters of the timber to drive the form and structure of their design. These examples provide evidence that producing architecture from parametrically modelled geometries is a very viable and interesting approach to designing in a way that meets contemporary social and environmental demands. Parametric modelling of geometry has underpinned some of architecture's most recent developments in design, performance, materiality and construction through the adaptation and optimisation of lessons learnt from nature as a biological system. Natural forms have been self-defining through evolution.

67 Retrieved from http://www.iba-bv.com/, 14/07/2014.
68 Retrieved from http://designalmic.com/green-void-installation-lava/green-void-installation-by-lava-section02/, 8/7/14.
69 Retrieved from http://matsysdesign.com/2012/04/13/sg2012-gridshell/, 8/7/14.
70 Retrieved from http://www.l-a-v-a.net/about-lava/, 14/07/2014.



Through explorations, experimentation and innovation in digital and analogue forms of computation, in pursuit of systemic design applications that are scenario- and time-based, and whilst considering control systems as open acts of design experimentation, our studios scrutinised production processes as active agents in the development of Proto-Design bio-systems.

1.11.2. Proto-types; Behavioural Analysis and Beyond
As a step further, off the screen, prototypes are perceived as an apparatus used to generate architectural solutions, enriched with data deriving from site analysis and the various building programmes. The use and interpretation of the prototype is not always consistent, however, and there is no "one size fits all". Prototypes in the projects conducted in our studios relied on the specific conditions of each project, as well as on the personal experience and interest of each designer (in the case of individual projects), translating them into structural systems, typologies, form, organisation of programmes and detailed optimal solutions. This approach advances the exploration of prototypical design systems, investigating architecture as an instrument engaging both material and social forms of interaction based on the information gathered. Social scenarios are coupled with material life-cycles as a way of hypothesising on how we live, adapt and interact, and on the role architecture plays or can play. Behavioural, parametric and generative methodologies of computational design are merged with physical computing and analogue experiments to create dynamic and reflexive feedback processes. New and possible forms of spatial organisation are explored that are neither type- nor site-dependent, but rather evolve as ecologies and environments seeking adaptive and hyper-specific features or properties. The projects shown here do not claim in any way to be perfectly resolved architectural proposals or prototypes. Rather, they emulate the possibilities opened up by emerging design and manufacturing techniques as a means for architectural design. They raise questions, inviting the recipient into a critical dialogue about the existence (or not) of architectural qualities, in an attempt to advance from the elation and anticipation surrounding these new tools towards discoveries and explorations of what they are really capable of achieving. This performance-driven approach seeks to develop novel design proposals concerned with the habitual, with our routines. The iterative methodologies focus on investigations of spatial, structural and material organisation, engaging in contemporary discourses of architecture and design.

1.11.3. Emergence
Emergence has been an important concept in biology, mathematics, artificial intelligence, information theory and computer science, in the newer domains of climatic modelling and other complex-systems analysis and simulation, in the mathematics of evolution and embryological development, in the data structures and processes of the genome, and in population dynamics and pressures. Its applications to architectural design are explored in the generative design experiments within the various studios conducted, which concluded with the detailed modelling and analysis of the set of evolved forms, surfaces and structures.



Emergence is the ability of low-level components to self-organise into a higher-level system of sophistication and intelligence, as Johnson notes. This self-organisation arises from the bottom up rather than being directed by an external controlling element; Johnson gives the examples of feedback, self-organisation and adaptive learning. Although his book concentrates on the Web, placing it in its historical and biological context, emergence can likewise be connected to architecture, building design and planning disciplines, making it a remarkably relevant and enthralling read. The author's theory of emergence states that within an arrangement that appears to be disorder there are underlying principles that govern a pattern of order and behaviour and carry order out of confusion and chaos. It introduces us to the stunning universe of ant colonies, mould, neighbourhoods, neurons and software elements. From brain science to programming, from cities to ant colonies, Johnson shows that "emergence" is all around us; to comprehend emergence is to comprehend something essential about how the planet as a whole functions.71 To become aware of events in our own context, Johnson proposes that you have to look at a larger perspective than the singular life form or individual organism. Ants cannot see the social order of the societies that they are part of; in the same way that we humans may have an understanding of the local neighbourhood we live in and of ourselves, we have to step outside (or above) the city to trace its capacities and how it functions. A city, like an ant colony, does not have controls imposed from the top; rather, there are rules that every occupant complies with, and it is these rules that bring order to the disarray and make the resulting neighbourhood act as an organic entity as a whole. The principle of emergence does not precisely overturn our model of the universe, but it does imply that we have to reconsider our comprehension of how the universe is arranged: according to this principle, the intricate generalisations we know as the laws of motion in large systems develop, or "emerge", from the straightforward, fundamental rules of singular particles.
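A minimal illustration of emergence from simple local rules, using Conway's Game of Life rather than Johnson's ant colonies: the rule below concerns only a cell and its eight neighbours, yet a coherent pattern (the 'glider') travels across the grid at the level of the whole.

```python
def step(alive):
    """One generation of Conway's Game of Life on an unbounded grid.
    `alive` is a set of (x, y) cells; the rule is purely local."""
    neighbour_counts = {}
    for (x, y) in alive:
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if dx or dy:
                    cell = (x + dx, y + dy)
                    neighbour_counts[cell] = neighbour_counts.get(cell, 0) + 1
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in alive)}

# A "glider": five cells whose collective behaviour (diagonal travel)
# is nowhere written in the rule for a single cell.
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    cells = step(cells)
print(sorted(cells))   # the same shape, displaced by one cell diagonally
```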

1.11.4. Adaptation and simulation
The advent of simulation on a computer initially had the effect of atomising scientific models and disintegrating them. Then, more drastically, it permitted a liberation from universal, precedent mathematical modelling: starting from a more modest, atomised scale, it became discernible that simulation could indeed depart from individualised, locally confined rules, no longer subsumable, even approximately, under a universal mathematical gesture contrived in advance and then superimposed. These rules, in their diversity, today contribute to a declassifying and localisation of mathematics in the sciences of conception, for they bring about a preparatory de-mathematisation of the formalisms of modelling; through them, mathesis no longer essentially tries to be universal, uniform and all-inclusive.72

71 Johnson, Steven. Emergence: The Connected Lives of Ants, Brains, Cities, and Software, Penguin Books Ltd, 2001, p66.



This is because these rules possess parameters which can be re-regulated as a function of environment and milieu. We then speak of generative methods and machine simulation, and no longer merely of digital simulation. These strategies are called generative because they use the computer as a driving force for induction, and not as a deductive force regulating formal calculations. The computer becomes an ecosystem of abstract formalisms.73 Customarily, simulation substitutes an abstraction with an observation and a measure.

Fig.16: Voronoi polygons: Norfolk County, Massachusetts, USA - GIS Mapping Strategies, 2013.74

Additionally, with more intuitive programming accessible to inexperienced users of mathematical modelling, the machine becomes a place of interaction between heterogeneous ideograms, between ideograms not modulated into a single stereotypical language nor self-evident in a uniform way. As a consequence these symbols, roughly symbolic or typical, can have very contrasting semantic statuses: they can originate from several disciplines, several discourses and several different, irreconcilable languages. A case in point is a poly- or pluri-formalisation75 made possible by the computational turn: the computer intertwines these heterogeneous symbols step by step. These formalisms, as sub-models with confined impact, refer back to modes of meaning and reference, whether to the real, the fictive or the regulative, each altogether distinct: none is any longer bound to conform to the behaviour of a variable describing a universal physical, mechanical or thermal character.

72 F. Varenne, 2009a, p. 150.
73 F. Varenne, 2003, p. 310.
74 Retrieved from https://smathermather.wordpress.com/tag/voronoi/, 11/8/14.
75 Ibid.



Their symbols can denote either local physical properties, with heightened subtlety and resolution,76 or entities, values and rules of decision that symbolise and render operational variables outside the physical and mechanical components: aesthetic rules, organisational rules relating to flow (human or other), values of adaptation to a particular context, and sustainable environmental constraints, optimised little by little, can hence be finely woven together with more traditional physical constraints. Despite their heterogeneity, it is the computational ecosystem which allows them to compete on equal terms with more classic symbols and sub-models that operationalise, for their part, the physical constraints. Following from all this, the existing trend of information and knowledge exchange within and across diverse disciplines escalates the complexity of exploration, extending the interest in creating systems that fulfil the functions of architectural design. It aspires to architectural settings as communicators that define, initiate and pursue ongoing interactions within a given setting. The qualities of fields and of complex systems subsist in cogent correlations; this profound relationship permits an articulated differentiation that can be translated and enacted in both tactile and impalpable architectural ways.

76 A. Andrasek, 2009, p. 54.



PROJECT PheroβONE
Theory meets experimentation, exploration, application and practice through applied design. During the Alberto Estevez studio, Genetics & Architecture, such an opportunity for research and experimentation with precepts learned from nature emerged in a real-time design competition for a College of Architecture, Applied Arts and Design at Qatar University, Doha, Qatar.
1.12.1. The Brief:
The nature of the project brief did in fact itself provide for explorations and experimentations to commence, as it sought an unprecedented design and a strictly nature-inspired approach, all of which gave direction to nature as the source of inspiration for all successive works and developments of the project. The brief precisely mentioned some of the required spaces, the basics, and left the designer free to incorporate whatever else was deemed appropriate. The basic spatial preconditions accommodated (but, as already mentioned, were not limited to):

UNIVERSITY OF QATAR COLLEGE OF ARCHITECTURE, APPLIED ARTS AND DESIGN

Faculties

Architecture

Fine Arts and Media Art

Departments

Architectural Design 1 (Zaha Hadid)
Architectural Design 2 (Greg Lynn)
Architectural Design 3 (Hani Rashid)
Building Design
Structural Design
Energy Design
History and Theory of Architecture
Digital Fabrication
Art & Science
Stage and Film Design
Digital Art
Photography
Graphics and Printmaking
Landscape Art
Painting
Painting and Animated Film
TransArts - Transdisciplinary Art
Transmedia Art
Media Theory
Industrial Design
Graphic Design
Graphics and Advertising
Fashion


Design

Theory and History of Design
Computer Studio
Video Studio

Art and Technology

Art Sciences and Art Education

Life Drawing
Book Art
Geometry
Ceramics Studio
Wood Technology
Metal Technology
Technical Chemistry
Art & Science Visualization
Textile Technology
Design, Architecture and Environment for Art Education
Art and Communication Practices
Textiles - Free, Applied and Experimental Artistic Design
Specialist Didactics
Cultural Studies
Art History
Philosophy
Creative Writing

PheroβONE Tower, Doha, QATAR
Client: Qatar University, College of Architecture, Applied Arts & Design, Doha, QATAR
Architect: Harold Woods, c/o Genetic Architectures Office, Barcelona, March - May 2014

Pheroβone as a project is a design system based on the theory of emergent systems, seeking to generate an interaction between form, structure and materials, where the sum of individual components and their logical proliferation generates more efficient and better structural and spatial solutions adapted to their environment. It is an urban desert proposal based on morphological examination through self-generative algorithms. The new college is an engaging and inviting centre outside downtown Doha, Qatar. Celebrating its prominent location on the Qatar University campus, one of the city's research and education cores, the building's architectural design formally and philosophically extends out into the community, welcoming visitors of all ages and backgrounds to experience it first-hand. Crafted of painted zinc, high-performance glazing and stainless steel, the building has a timeless appearance and extraordinary durability in the Doha climate. Transparent tinted glazing planes and reflective metal surfaces animate the building, exposing the activities within and engaging


people and art at multiple levels on both the interior and the exterior. Selected to reflect Doha's dramatic weather patterns and the extreme contrast between the long hot days of summer and the cold desert nights, these materials create a dynamic quality that allows the building to transform along with its natural surroundings with enhanced building performance. Not only does the building change throughout the day, it changes from season to season; more static building materials would not allow for this type of ephemeral connection between the building and the site. The design reinvents the public spaces through a continuous stainless steel surface that moves lithely through the interior and exterior spaces. Wall and ceiling become one fluid surface which captures the spatial volume while guiding the occupants through entry points, wrapping event and gathering spaces, and leading on to the galleries. The two languages of mass and curvilinear form define an inviting rhythm of destination and path in a unique way-finding experience for occupants, students and visitors. The green design of this vertical campus adapts it to current demands: it harvests energy on its photovoltaic tinted glazing; it is an intelligent building that gives feedback on its own performance; as a vertical campus it draws fresh, breathable air at higher altitudes; it incorporates automated sun-shading and cooling systems, heat sensors, intelligent materials and a water-harvesting system for self-irrigation; and it amplifies wind speeds with its twisted shape to facilitate wind harvesting, making it energy efficient. The building is an icon of its domain and a depiction of precepts learned from nature.

Fig.17 – Streetscape: Harold Woods, photo-realistic renderings, 2014.



1.12.2. PRECEPTS FROM NATURE

1.12.3. The Spinal Cord
The human spinal cord (part of the central nervous system) is of interest due to its ability to withstand stress and weight, its structure, its flexibility in bending (in all directions) and torsion/twisting, and its ability to accommodate both tension and compression while transferring feedback information up and down the body. The spinal cord is connected to the brain and is about the diameter of a human finger. From the brain the spinal cord descends down the middle of the back, where it is surrounded and protected by the bony vertebral column and bathed in a clear fluid called cerebrospinal fluid (CSF), which acts as a cushion to protect the delicate nerve tissue against damage from banging against the inside of the vertebrae.77 The spinal cord itself consists of millions of nerve fibres which transmit electrical information to and from the limbs, trunk and organs of the body, back to and from the brain; this inspires feedback systems within the building design, for example to monitor the performance of the building during its lifetime. The nerves which exit the spinal cord in the upper section, the neck, control breathing and the arms, while the nerves which exit the spinal cord in the mid and lower sections of the back control the trunk and legs, as well as bladder, bowel and sexual function; this informs the placement of corresponding building components in the areas where they are most efficient. Motor neurones carry information from the brain to the muscles, whereas sensory neurones carry information from the body back to the brain.78 Sensory neurones carry information to the brain about skin temperature, touch, pain and joint position, suggesting feedback-loop information on building functionality, damage and performance, including under-performance.79 The brain and spinal cord together are referred to as the central nervous system. The nerves within the spinal cord are grouped together in different bundles: ascending tracts carry sensory information from the body upwards to the brain, such as touch, skin temperature, pain and joint position, while descending tracts carry information from the brain downwards to initiate movement and control body functions.

77 C. M. Bono, V. W. Lin, Spinal Cord Medicine: Principles and Practice, Demos Medical Publishing, LLC, 2010.
78 Ibid.
79 Thomas N. Byrne, Edward C. Benzel, Stephen G. Waxman, Diseases of the Spine and Spinal Cord, Oxford University Press, 2000.



Image.1: Anatomy of the spinal cord vertebrae.80
Image.2: Four main regions of the spine: cervical, thoracic, lumbar and sacral.81
Image.3: Schematic dorsal and lateral view of the spinal cord and four cross sections from cervical, thoracic, lumbar and sacral levels, respectively.82
Image.4: Infographic of spinal cord injury awareness and charities in the UK.83

80 Retrieved from http://www.iscoliosis.com/anatomy.html, 20/4/14.
81 Retrieved from http://www.spinesurgeon.com.au/Neurological_Conditions/Spinal-Surgery.htm, 21/4/14.
82 Retrieved from http://neuroscience.uth.tmc.edu/s2/chapter03.html, 20/4/14.
83 Retrieved from http://www.apparelyzed.com/spinal-injury-awareness.html, 21/4/14.



Image.5: 3D model of a female skeletal system (vertebrae).84
Image.6: The spinal cord arterial circulation.85
Image.7: The relationship between spinal nerve roots and vertebrae.86

1.12.4. Ants/Termites
This reference study focuses on the movement and feeding patterns of ants and termites. The hollow yet elongated patterns and openings made by ants/termites, for example in wood, serve mainly for ventilation and easier circulation, as is the case with tunnelling in mounds and heaps of earth fill and/or anthills. Ants use pheromones to mark their trail and guide following ants: they mark the path as they go along and leave tiny messages. If the trail is successful and more and more ants follow it, the guidance becomes more intense and denser, whereas other trails fade out.87 In nature, ants scavenge for food as a swarm. They choose their paths from the nest based on pheromone density, favouring paths with higher concentrations of pheromone. Individuals lay pheromone trails upon finding food, allowing other members of their colony to passively follow them to the food source. Because paths leading to food have higher pheromone density, increasing numbers of ants choose these successful paths. As a result, ant trails transition over time from random paths to streamlined routes. This organisation and appearance of patterns over time is an example of emergence, the process by which complex systems and patterns arise from a multiplicity of simple systems.88

84 Retrieved from http://www.3danatomy.jp/3d-anatomy-products/female-skeleton-system.php, 12/5/14.
85 Retrieved from http://neuroscience.uth.tmc.edu/s2/chapter03.html, 21/4/14.
86 Retrieved from http://www.apparelyzed.com/spinalcord.html, 18/5/14.
87 Maurice Maeterlinck, The Life of the Ant, University Press of the Pacific, 2001.
88 Michael V. Brian, Production Ecology of Ants and Termites, Cambridge University Press, 1978.



Ant feeding patterns lend themselves to simulation with agent-based models, which simulate the actions and interactions of autonomous agents to assess their effects on the system as a whole (a minimal sketch of such a model follows below). This informs the building study, moving from a simple to a complex system by simulation, which can be applied to numerous building systems from design and construction through to use, for example circulation and ventilation. The patterns left behind can be understood as a form of strip morphology. The strips contoured around the generated form react to the solar position as well as to the gradient map developed in Ecotect. The definition is to be used as an example of ways in which elements might be tied between Grasshopper and Ecotect (or other energy-modelling software).
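A minimal agent-based sketch of the pheromone mechanism described above, assuming a made-up set of candidate paths, one of which leads to food; the deposit and evaporation rates are arbitrary, and the model is an illustration of emergence rather than a calibrated ant simulation.

```python
import random

def forage(n_paths=5, ants=200, deposit=1.0, evaporation=0.05, food_path=3):
    """Each ant picks a path in proportion to its pheromone level; the path
    that reaches food is reinforced, all trails evaporate, and a dominant
    route emerges from initially uniform random choices."""
    pheromone = [1.0] * n_paths
    for _ in range(ants):
        total = sum(pheromone)
        r, cumulative, choice = random.uniform(0, total), 0.0, 0
        for i, level in enumerate(pheromone):
            cumulative += level
            if r <= cumulative:
                choice = i
                break
        if choice == food_path:                                   # successful trail is marked
            pheromone[choice] += deposit
        pheromone = [p * (1 - evaporation) for p in pheromone]    # trails fade
    return [round(p, 2) for p in pheromone]

print(forage())   # pheromone concentrates on the food path over time
```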

Image.8: Smooth passages, or "galleries", created by carpenter ants in wood.89
Image.9: "Foot paths" left behind by ants; amazing ant nest structures.90
Image.10: Ragged passages, or "galleries", created by termites in wood.91
Image.11: Signs of ant and termite damage patterns.92
Image.12: Signs of ant and termite damage patterns.93
Image.13: Signs of ant and termite damage patterns.94

89 Retrieved from http://www.islandbasementsystems.com/foundation-repair/repair-wood-damage/termitedamage.html, 8/5/14.
90 Retrieved from http://biomimicrykth.blogspot.com.es/2012/05/amazing-ant-nest-structures-andtalking.html, 8/5/14.
91 Retrieved from https://plus.google.com/114051213125497692985/about, 8/5/14.
92 Retrieved from http://www.amcoranger.com/amco_termite_control/inspecting-termites/, 8/5/14.
93 Retrieved from http://www.buellinspections.com/moisture-ants-the-home-inspectors-little-helper/, 9/5/14.
94 Retrieved from http://365pest.com/bugs/carpenter-ants.html, 12/5/14.



1.12.5. Bones The study of the bone, informs the exo-skeleton structure of the building designed. Bone, the material that makes vertebrates distinct from other animals, has evolved over several hundred million years to become a remarkable tissue. Bone is a material that has the same strength as cast iron, but achieves this while remaining as light as wood. The front leg of a horse can withstand the loads generated while this 1500-pound animal travels at 30 miles per hour. The upper arm is able to keep birds aloft through entire migrations, sometimes over 10,000 miles without landing. The antlers of deer, used as weapons in territorial clashes with other deer, undergo tremendous impacts without fracturing, ready to fight another day. At some point, unfortunately, forces of impact exceed even bone's ability to hold up. Falling on the ice, suffering a collision in a car or a tumble on the ski slopes can cause the bone to fail. While fractures are disastrous, bone - because it is a live tissue - almost instantly begins a healing process. Without question, bone is the ultimate biomaterial. It is light, strong, can adapt to its functional demands, and repair itself. Bone architecture - there are two major kinds of bone, trabecular (spongy) and cortical. Trabecular bone gives supporting strength to the ends of the weight-bearing bone. The cortical (solid) bone on the outside forms the shaft of the long bone.95 This x-ray of a femur shows the thick cortical bone, and the trabecular bone which is arranged to withstand the stresses from usual standing and walking. Compressive stresses are those of the body weight pushing the bone down, and tensile stresses are from the muscles, pulling the bone apart. Mineral reservoir - in addition to its mechanical functions, the bone is a reservoir for minerals (a "metabolic" function). The bone stores 99% of the body's calcium and 85% of the phosphorus. It is very important to keep the blood level of calcium within a narrow range. If blood calcium gets too high or too low, the muscles and nerves will not function. Support - the skeleton is the framework of the body, it supports the softer tissues and provides points of attachment for most skeletal muscles.96

Protection - The skeleton provides mechanical protection for many of the body's internal organs, reducing risk of injury to them. For example, cranial bones protect the brain, vertebrae protect the spinal cord, and the ribcage protects the heart and lungs. Movement - skeletal muscles are attached to bones, therefore when the associated muscles contract they cause bones to move.

Storage of Minerals - bone tissues store several minerals, including calcium (Ca) and phosphorus (P). When required, bone releases minerals into the blood, facilitating the balance of minerals in the body. Production of Blood Cells - the red bone marrow inside some of the larger bones is where blood cells are produced: red blood cells, white blood cells and platelets.

95 John P. Bilezikian, Lawrence G. Raisz, T. John Martin, Principles of Bone Biology, 2nd ed., Elsevier Academic Press, 2008.
96 Tim D. White, Pieter A. Folkens, The Human Bone Manual, Elsevier Academic Press, 2005.



Storage of Chemical Energy - with increasing age some bone marrow changes from 'red bone marrow' to 'yellow bone marrow'. Yellow bone marrow consists mainly of adipose cells and a few blood cells, and is an important chemical energy reserve.

Image.14: Synovial joints showing a space or "synovial cavity" in the joint.97
Image.15: Comparison of a normal versus an osteoporosis-affected bone structure.98
Image.16: Destructive lesion in the vertebral body of the 7th thoracic vertebra.99
Image.17: Osteoporosis in a broken male hip bone.100
Image.18: Illustration of how to improve functionality and joint force in bones.101
Image.19: Osteoporosis and metabolic bone disease.102
Image.20: Pathological changes in the right femoral head.103

97 Retrieved from https://www.boundless.com/biology/the-musculoskeletal-system/joints-and-skeletalmovement/classification-of-joints-on-the-basis-of-structure-and-function/, 15/5/14.
98 Retrieved from http://www.osteoporosis.ca/osteoporosis-and-you/what-is-osteoporosis/, 18/5/14.
99 Retrieved from http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0090924, 18/5/14.
100 Retrieved from http://www.parathyroid.com/osteoporosis.htm, 15/5/14.
101 Retrieved from http://www.apiterapia.co/2014/02/como-mejorar-funcionalidad-y-fuerza.html, 18/5/14.
102 Retrieved from http://endocrinology.medicine.ubc.ca/research-page/osteoporosis-and-metabolic-bonedisease/.
103 Retrieved from http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0090924, 15/5/14.



1.13. PRECEDENT STUDY
1.13.1. LOCATION_
- Site: An unoccupied portion of the existing university
- Site Context: Urban/sub-urban and coastal; coast of the Persian Gulf, Doha – Qatar
- Site Greater Context: Urban, densely populated

Fig.18-19: Site: Harold Woods, project size with context, 2014.
LOCATION STUDIES_
- Terrain: mostly flat and barren desert covered with loose sand and gravel
- Solar: desert, thus hot and humid - high solar gain
- Qatar temperature: average high temperatures surpass 40-45 °C (104-113 °F)
- Doha temperature: average high temperatures surpass 38 °C (100 °F)
- Landscape: relatively flat, pure urban
- Precipitation: 0-20 mm all year, except September 70-80 mm



Fig.20: Precedent: Harold Woods, Site Environment Studies, 2014.104

1.13.2. DESIGN EVOLUTION:

FORM STUDIES_
Concept Models_ Form-finding exploration from a singular unit to a massing form, and further to strip-morphology possibility explorations using matchsticks.

Fig.21: Matchstick model: Harold Woods, Form Simulation Studies, 2014.
Computational Form Models_ Computational form-finding exploration from a singular unit to a massing form, and further to strip-morphology possibility explorations. Form exploration was undertaken to find a suitable or optimal way to adapt the building to the already existing natural conditions of its would-be environments.

104 Image composition by author, but original individual images retrieved from http://solargis.info/, 1/6/14.



The aim was to be as sustainable as possible in all aspects of all building components, from singular elements to larger assemblies such as automated systems.

Fig.22-25: Inception: Harold Woods, Computational form Simulation stages 1-4, 2014.105

105 Stages and figures are from top (Fig.24) to bottom (Fig.27).



Fig.26-28: Inception: Harold Woods, Computational form Simulation stages 5-6, 2014.106

106 Stages and figures are from top (Fig.28) to bottom (Fig.30).



1.13.3. SKETCHES_ Exploration approach of design and form through sketches and models/skin/façade from the inception of the project

Fig.29: Concept Sketch: Harold Woods, Analogue Conceptual Development, 2014.107
FORM, SKIN, FAÇADE: The skin is a biomimicry of the ant/termite trails and patterns left on surfaces such as wood for circulation and ventilation, along with the pheromone left for other ants to follow. This has been interpreted as a feedback system within the building skin: strip morphologies that twist upwards and downwards depending on the environmental conditions, for enhanced solar harvesting on the photovoltaic glazing system behind them and for the transfer of wind from one panel/blade to another until it reaches the collection point or the sky parks/courtyards; the skin also serves as a secondary structural component, an exoskeleton.

Fig.30: Tripod: Harold Woods, Particle System, 2014.108
Particles are simulated to react to attractors, velocity, cohesion and gravity within a field, aggregating with minimum and maximum 'strut' lengths for an exo-skeletal structural and façade system.

107 Site imposition, rotation effect, and sketch pencil impression of ant patterns in wood.
108 Exploration of the particle system - skin and façade elaborations.
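As a rough sketch of the particle behaviour described for Fig.30 (and not the actual Kangaroo/Grasshopper definition used in the project), the code below integrates particles under gravity, an attractor pull and a cohesion force, then records pairs falling within assumed minimum and maximum 'strut' lengths as candidate members of the exoskeletal aggregate; all constants are invented.

```python
import random

GRAVITY = (0.0, -0.01)
ATTRACTOR = (0.0, 0.0)
MIN_STRUT, MAX_STRUT = 0.2, 1.0

def simulate(n=30, steps=200, attract=0.002, cohesion=0.001, damping=0.98):
    pos = [[random.uniform(-5, 5), random.uniform(-5, 5)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    for _ in range(steps):
        cx = sum(p[0] for p in pos) / n          # centroid, used for cohesion
        cy = sum(p[1] for p in pos) / n
        for p, v in zip(pos, vel):
            ax = GRAVITY[0] + attract * (ATTRACTOR[0] - p[0]) + cohesion * (cx - p[0])
            ay = GRAVITY[1] + attract * (ATTRACTOR[1] - p[1]) + cohesion * (cy - p[1])
            v[0] = (v[0] + ax) * damping
            v[1] = (v[1] + ay) * damping
            p[0] += v[0]
            p[1] += v[1]
    # pairs within the allowed strut length become candidate members of the mesh
    struts = []
    for i in range(n):
        for j in range(i + 1, n):
            d = ((pos[i][0] - pos[j][0]) ** 2 + (pos[i][1] - pos[j][1]) ** 2) ** 0.5
            if MIN_STRUT <= d <= MAX_STRUT:
                struts.append((i, j))
    return struts

print(len(simulate()), "candidate struts")
```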



STRUCTURE_ The building is enveloped in an exo-skeleton that dominates its façade, inspired by the nature and functionality of bone structure within the body, and given a twist to enhance wind speeds for collection while reducing building sway, alongside water harvesting, cooling systems and automated systems such as solar tracking.

Fig.31; Images 1-4: Tripod: Harold Woods, Particle System, 2014. The building's evolutionary process during design, in ascending order from 1 to 4.

Fig.32; Image 5: One skin/façade panel (straightened) showing the strip morphologies/"louvers" and the "vortices"/openings for courtyards/sky parks.


The panel illustration below is laid out horizontally; however, the actual installation on the building is vertical, not horizontal. The building's structure and orientation derive their ability from an understanding of the spinal cord (and the nervous system as a whole): its flexibility, agility, structure and ability to withstand different types of stress and load. The twist in the building can be seen as a form of pre-stressing and thus reduces the building's movement in moments of strong wind (as experienced in desert areas like Qatar) that would otherwise sway the building, causing discomfort for occupants.

The twist further amplifies the speed of the wind that hits the building and works with it to the building's advantage, directing it for wind harvesting instead of resisting the wind load, which adds to the building's performance. This is illustrated below:

Fig.33: Tripod Layout: Harold Woods, Figure-Ground, 2014.

PLANS; Typical Adaptive Floor Plan: The floor plans utilise an adaptive-reuse approach which adapts a particular space to the needs of its inhabitants at a given time in the building's lifetime. The colour code represents the different programmes within a space that can co-exist and/or shuffle or alternate: library, studios, cafeteria, auditoriums, lecture halls, storage space, offices, research facilities, galleries, theatres, faculty offices and recreational areas, among others.



Sky parks/Courtyards:

The three "vortices" at three different levels serve as sky parks/open courtyards at altitude, providing recreation at intermediate levels through the building for all occupants, yet with minimal distance travelled - convenience.

Fig.34: Plans: Harold Woods, Adaptive Layout, 2014.



1.14. RENDERINGS & VISUALISATION:

Fig.35: Photo-realistic renderings: Harold Woods, Streetscape SW°, 2014.



Fig.36: Photo-realistic renderings: Harold Woods, Streetscape Aerial, 2014. (Top left)
Fig.37: Photo-realistic renderings: Harold Woods, Streetscape NW°, 2014. (Top right)
Fig.38: Photo-realistic renderings: Harold Woods, Streetscape W°, 2014. (Bottom right)



1.15. Conclusion
This digital architecture faces heavy criticism. It is often blamed for undertaking, at high cost, formal experiments that disregard human needs such as efficient use of floor area and easy-to-understand spaces and structures. Not all architects and designers therefore engage in such formal endeavours; many prefer to follow the steps of modernism, with its rational language and its desire to transform the world into a better place. They try to overcome the criticisms of modernism - its sometimes totalitarian ambition (e.g. Le Corbusier's Plan Voisin), its elitist language and its sometimes inhuman metaphors (the building as a "machine for living") - without changing its basic formal vocabulary. This is why much of today's architectural debate about the digital revolution centres on the question of "blob or box". We all have a picture of a future in mind. We can imagine cities with multi-layered mobility highways, clusters of buildings integrating a world of uses, probably with landscapes across different floors, walls that talk to us, projections of our wishes in space. Everything is well designed; we can smell the new way of living by gazing upon the imaginable. However, truth be told, even though nature is imperfect, we have a lot to learn from it, from the molecular scale to the macro scale. As Werner notes in her observations titled The Myth about Biodigital Architecture, "Investigating in bio-digital architecture first and foremost allows us to look into basic strategies of how biology computes in order to understand the main principles of action, reaction, interaction, input, mutation, growth, relation and conversation",109 and this furnishes an understanding that the qualities of the biological models of most interest to architecture are those related to live organisms viewed as systems, because an organism's (or, if you wish, an architecture's) survival depends on generative processes of adaptation and evolution as part of a larger ecosystem, evolution being apparent and ceaseless.

109 WERNER, L. C. 2013. Lecture: The Architecture of Architecture. Lecture ed. Pittsburgh, unpublished: Carnegie Mellon University.




ENCODED MATTER Ezio Blasetti Studio_

CHAPTER 2 | ENCODED MATTER
This chapter is a pure reference for lessons learned from the norms of genetic algorithms and their application in architecture and design. Thus it embarks purely upon generative methodologies in digital design and algorithmic approaches in architectural design.

2.1. Objectives

The studio aimed at offering a unique, hands-on experience of research and experimentation in the field of generative tools and Computational Design applied to architecture, and explored concepts by setting up adaptable design. Innovation, digital design and real fabrication were the topics that the studio concerned itself with, with the aim of advancing theoretical research as well as potential practical applications of algorithmic design in architecture and design.

2.2. Collective Intelligence

The inquiry into how organisms yield meaningful data from this persistent stream of information has been of interest to numerous researchers, especially in the field of neurology. An adaptive model, the human brain is subject to evolution and development. “Synaptic connections are point to point, […] [and] the points of connection are continually changing.”110 As is explained by Tierney; “the constant data flow exchanged between organism and environment is continually being mapped and remapped in the brain. Neuronal structuring is based on self-organisation.”111 Generative Systems design is a process in which the material and medium are algorithmic in nature. Algorithms manifest dynamic and emergent behaviour. As Castillo denotes, “Generative Systems propose to shift the focus from static models towards a computational logic, in what Bruce Sterling calls “processuality”.112 Processuality is a postmodern change in design perception, inspired by the craft of software. The world is rich with processuality – the growth of plants, boiling liquid, chemical patterns – processuality is all around us. These Generative Systems allow us to establish relations between patterns, structures, processes, forms and to model evolutionary behaviours.” The construct of the neurological system may be termed a ‘relational architecture’;113 such ‘relational architecture’ has been of interest to the Physical Language Workshop (PLW) at MIT Media Lab, directed by John Maeda. Their work is concerned with ‘collective cognition’ and ‘social software’. Amid his exploration Maeda created OpenStudio, an information-sharing system in which “user-generated content can be leveraged as creative capital”.114 In addition, the work at PLW investigates the linkages between programming, innovation and learning. These strategies make it possible to simulate real-world phenomena. The models that we propose are asserted and communicated around diverse portrayals of the system idea. The first, Static Models, works under the precision of a closed framework: “An isolated system having no interaction with an environment […] a System whose Behaviour is entirely explainable from within, a system without Input.”115 The second model we propose, Algorithms as a Model, is

110 Tierney 2006, p. 36
111 Tierney 2006, p. 39
112 Marius Watz, 2005
113 Tierney 2006, p. 39
114 Tierney 2006, p. 42
115 Marius Watz, 2005


explained around the depiction of an Open System: “A group of interacting, interrelated, or interdependent elements forming a complex whole.”116 Both models are portrayed utilising diverse modelling techniques: Static Models (Top-Down Modelling) versus Algorithm as a Model (Bottom-Up Approach). Static Models are shaped through Top-Down Modelling techniques. In this paradigm, we presume that we comprehend how a system functions or behaves. This model shows a few attributes: the system portrayal is static, we have a decent understanding of the framework in its totality and aggregate rather than partially, and we also understand exactly how the system components interact with each other. Marvin Minsky famously wrote: “You have to distinguish between writing a program that helps you to test your theory or analyse your result, and writing a program that is your theory.”117 This distinction is equally relevant to architecture, as you begin “writing a program that is your design.”118 Interestingly, on the other hand, as Nicolik notes, “the ‘Algorithm as a Model’ is shaped through the Generative Science Paradigm. This paradigm builds understanding from the bottom up. Phenomena can be described in terms of interconnected networks of simple Agents; an Agent is a persistent thing which has some state we find worth representing, and which interacts with other agents and serves to mutually modify each other's states.”119 (Nicolik, I)
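As a minimal illustration of the bottom-up ‘Algorithm as a Model’ paradigm, the sketch below (my own illustrative Python, not code from the studio, with hypothetical names throughout) defines simple agents whose only behaviour is to nudge their state towards that of their neighbours; a shared global state emerges without any top-down description of the whole system.

import random

class Agent:
    """A persistent thing with a state, interacting with other agents."""
    def __init__(self, state):
        self.state = state

    def interact(self, neighbours):
        # nudge this agent's state towards the mean state of its neighbours
        mean = sum(n.state for n in neighbours) / len(neighbours)
        self.state += 0.5 * (mean - self.state)

agents = [Agent(random.random()) for _ in range(20)]
for step in range(50):
    for i, a in enumerate(agents):
        # neighbours on a ring: the agents immediately to the left and right
        a.interact([agents[i - 1], agents[(i + 1) % len(agents)]])

print([round(a.state, 3) for a in agents])   # the states converge towards a shared value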

2.3. THE RATIONALE OF SCRIPTING

“Architectural concepts are expressed as generative rules so that their evolution may be accelerated and tested.” – John Frazer, 1966

2.3.1. Systems; denotation
The ‘systems view of the world’ was first coined by Ludwig von Bertalanffy. System theory relies on holism and networks,120 as such a theory “seeks to identify the common patterns of organisation across disparate systems”.121 One view of this in Architecture is concerned with the material and the organisational, as explained by Coates; “the ‘systems view’ of space and form. […] derived from experimentation in mathematics and computer science [...]. These new ways of seeing form and spatial organisation all show the phenomenon of ‘self-organisation morphologies’, often referred to by the general rubric of ‘emergence’.”122 Both Biology and Architecture are focused around morphology; hence biology and systems theory have begun to embed themselves within Architecture. “Recent bio-theories on complex adaptive systems and especially the phenomena of emergence have begun to open up territory that architecture can no longer ignore if it is to have any relevance, and indeed resilience, in the future.”123 Concurrently, paradigmatic shifts within the sciences, mathematics and society have defined the development of human comprehension, especially in the data age. The flow of information and the ability to ‘encode, recode and decode’124 has been made conceivable by the founding operations of the Turing Machine. Biological systems and embryonic development are

116 Ibid.
117 Minsky, cited in Coates 2010, p. 26
118 Coates 2010, p. 26
119 Marius Watz, 2005
120 Thacker 2004, p. 148
121 Thacker 2004, p. 144
122 Coates 2010, p. 1
123 Wiscombe 2005, p. 1
124 Thacker 2004, p. 16


shaping the language and procedures inside a design pursuit for a more logically augmented design.

2.3.2. Scripting, the Interpretation
The expression "scripting" in the context of the computer originates in programming language. Scripting languages are a style of programming which relies on a library of prior components written in different languages. At its base, a script contains a series of commands; it connects together components and habitually is embedded in the application it controls.125 Classified as a subset of machine programming, the combined properties of this kind of language permit it to function as the immediate interface between the user operation and the machine application. In a design context, and particularly for the purposes of this thesis proposition, this command of language identifies with the ‘codification of design intention’126 amid the conception of design. Such ‘design codes’127 allow for the simulation of procedures and create the means by which to describe evolutionary searches. This strategy has grown in popularity as designers consistently ask how we can incorporate intelligence into computational geometric setups. Moreover, ‘scripting’ requires designers to develop a literacy of programming and adds another level of design thinking, and of design intuition, into the design process.
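By way of a hedged example (mine, not drawn from the thesis project files), the ‘codification of design intention’ can be as small as a single rule written once as a function and evaluated many times; the figures below are purely illustrative.

def setback(plot_depth, ratio=0.15, minimum=3.0):
    """Design intention as code: the setback is a ratio of the plot depth,
    but never less than a fixed minimum (all values hypothetical)."""
    return max(minimum, ratio * plot_depth)

for depth in range(10, 60, 10):
    print(depth, "->", setback(depth))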

2.4. Algorithmic Logic

The shift in perspective on human thought came with the cognitive science approach.128 Electronics and data systems, especially information theory, which allows for the quantification of the flow of information, have provided benchmarks of human cognitive performance. Newell, Simon and Shaw [1958] developed the General Problem Solver (GPS), which worked on the premise that rudimentary transformations of information could account for the generation of good solutions to complex issues. Through utilisation of systems which exhibited behaviours likened to the human characteristics of ‘purpose’ and ‘insight’, this work has gone some way to de-mystify cognitive thought processes.129 GPS demonstrated that such behaviours are crucial properties of an algorithmic methodology that mimics human patterns of intuition, cognition and invention. Such approaches give a model by which to translate design intentionality into the computer system, in an effort to generate a design. As Terzidis puts it; “algorithmic logic is about the articulation of thoughts and a vague struggle to explore possibilities of existential emergence”.130 Such algorithms are dependent upon the style of programming used inside the computer.

125 Ousterhout 1998, p. 24
126 Terzidis 2006, p. xii
127 Pisca 2008, p. 1
128 Lawson, 2006
129 Lawson 2006, p. 135
130 Terzidis 2006, p. 40


2.5. Generative Paradigm of Evolution

A generally acknowledged model, which exhibits non-linearity, has been created by John Frazer [1995]. His model gives rise to the utilisation of Genetic Algorithms inside the design process. Addressing the generative system, Frazer insists that; “It is further recommended that the concept is process-driven; that is, by form-generating rules which consist not of components, but of processes.”131 As Frazer affirms, in order to generate new designs, combinatoric or procedural models need sufficient comprehensiveness to be fruitful. Rather, rules are a necessity which stipulate the methodology and process of biological development and evolution.132 Frazer’s work is widely accepted to form the basis of a non-linear approach to architectural design.
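A compact way to see ‘form-generating rules which consist not of components, but of processes’ is string rewriting: the Python sketch below (a generic illustration, not Frazer’s own code) applies one production rule repeatedly, and complexity accumulates from the process rather than from any library of parts.

# hypothetical production rules, applied as a process over successive generations
rules = {"A": "AB", "B": "A"}

def rewrite(axiom, generations):
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

for g in range(6):
    print(g, rewrite("A", g))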

2.5.1. The Generative; within Architecture and Art
It has been proposed that generative design could be more extensively defined as design utilising a generative system.133 William Mitchell, in his book Computer-Aided Architectural Design (1977),134 discusses the generative in terms of a system. The term generative systems enables Mitchell to align his definition with properties innate to systems theory, which is concerned with giving systems a description. The product of a generative system according to Mitchell is a “representation, model or design for an object”,135 which then must be translated into reality. Mitchell traces back the origins of such generative systems to Aristotle’s Politics, section 1290. He clarifies how Aristotle utilises a biological analogy for the constitution of the city. Like the varieties of organs composing a creature, the conceivable combinations of such organs will, according to Aristotle, bring about a variety of differing animals. Here, Aristotle makes the assumption of a finite number of components, with a finite number of combinations. His analogy of nature is that of a ‘kit of parts’.136 Mitchell looks to establish the ‘role of the computer’ in the design process. He describes designing as an activity of problem-solving; in terms of a process, he defines this as “a process of searching through alternative states of representation in order to discover a state that meets specified criteria”.137 A problem-solving methodology is hence dependent on the representation of the system; a representation is needed as we generally cannot deal directly with the physical site. To Mitchell, a problem statement may be defined in terms of a representation; additionally, such a representation includes the description of tools, operations and objectives. Mitchell establishes that problems start with objectives; accordingly, the achievement of such objectives may be said to be objective-oriented behaviour, the action of which is the problem-solving process. From this we may formulate that a generative system, in Mitchell’s view, consists of architectural elements which belong to a certain vocabulary, arranged in different combinations to generate form, pattern or organisation. Such an approach is dependent upon the designer's literacy of

131 Frazer 1995, p. 65
132 Frazer, 2010
133 Arida 2004, p. 9
134 Mitchell, 1977
135 Mitchell 1977, p. 38
136 Frazer 1995, p. 14
137 Mitchell 1977, p. 27


variables; “The concepts of a design variable, and of the association of the design variables into data structures, are central to an understanding of the process of design by use of symbolic generative system”.138
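Mitchell's description of designing as a search through alternative states of representation for one that meets specified criteria can be paraphrased, loosely and with hypothetical numbers, as a brute-force search over a few design variables held in simple data structures:

import itertools

# hypothetical design variables and their discrete alternatives
widths = [6, 8, 10]          # bay width in metres
depths = [12, 15, 18]        # floor depth in metres
floors = [3, 4, 5, 6]

def meets_criteria(w, d, n):
    area = w * d * n
    return 400 <= area <= 600 and d / w <= 2.0   # illustrative criteria only

solutions = [(w, d, n)
             for w, d, n in itertools.product(widths, depths, floors)
             if meets_criteria(w, d, n)]
print(solutions)   # the states of representation that satisfy the criteria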

Fig.39: Digital Grotesque, Michael Hansmeyer and Benjamin Dillenburger, 2013 139

Similarly, in Art numerous ideas and meanings of the generative exist. A particularly well-established concept is generative art, in which the work of art is identified by the employment of a system-based method. Generative art is a subset of digital art, or computer art, or graphics, the output of which might be seen as visual structures, patterns or shapes which exhibit a high degree of complexity.140 Generative artists use procedural models which abide by the principles of computational logic; consequently their craft can develop past the restrictions of human intuition and capacity. Such art is imbued with chaos and a search for novel constructs, often based on aesthetic appeal.141

Fig.40: Emergence; Veronika Schmidt, illustration of the concept of emergence and the power of generative art 142

138 Mitchell 1977, p. 40
139 Retrieved from http://www.michaelhansmeyer.com/projects/digital_grotesque_info.html?screenSize=1&color=1#undefined, 3/9/14
140 Sims 1991, p. 319
141 Bentley n.d., p. 1
142 Retrieved from http://nodebox.net/code/index.php/Introduction, 18/8/14


Fig.41: Verlan Dress, Francis Bitonti, 3D-printed dress debuted at New York Fashion Week, 2013 143

2.6. Design; Process-driven

The foundation of Frazer's work emanates from the conviction that customary human-centric routines of intuition fail to offer the capacity to organise the expanding, multifaceted, intricate nature of design issues. He expresses that essential to an evolutionary model is a form of generative technique. Contrary to Mitchell's generative systems, Frazer argues that “Aristotle’s description of nature in terms of a kit of parts”144 is inaccurate; rather, nature is not constrained by such combinatoric inventions. Frazer clarifies that such objective-oriented methodologies require particular criteria which are difficult to describe compositionally. Besides, such an exhaustive search will result in an unmanageable quantity of permutations.
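The ‘unmanageable quantity of permutations’ is easy to make concrete; even a modest kit of parts explodes combinatorially (the figures below are purely illustrative and not taken from Frazer):

import math

print(math.factorial(12))    # 479001600 orderings of just 12 distinct parts
print(len(str(20 ** 30)))    # an assembly of 30 slots with 20 choices per slot
                             # already yields a 40-digit number of variants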

143 Retrieved from http://www.core77.com/blog/digital_fabrication/new_skins_computational_design_for_fashion_workshop__the_premise_and_process_behind_the_verlan_3d-printed_dress_25482.asp, 15/8/14
144 Frazer 1995, p. 14


Michael Weinstock, in The Architecture of Emergence [2010], concerning the connections between complexity, evolution and genetically coded information, wrote; “All the forms of life on the surface of the earth, including humans, have also emerged from the process of complex systems that are coupled to the transmission of biologically encoded information over time. […] All living forms are composed of cells, and each cell carries within it the information for the development of the whole form. The genome encodes the programme for the self-assembly of descendants identical to itself, and so it is passed down through time from one generation to the next. […] Every living form emerges from two strongly coupled processes, […] the differential development of cells in the growth of an embryo to an adult form, and the evolution diversification of forms over time. Information is the critical vector between them. Complexity builds over time by a sequence of modifications to existing forms, and from small and simple forms to ever larger and more complex forms”145 and am inclined to agree.

Fig.42: Swarm Intelligence: Architecture of Multi-agent Systems, Neil Leach & Roland Snooks, 2010

145 Weinstock 2010, p. 247
146 Retrieved from http://chemoton.files.wordpress.com/2009/12/abc-newspaper-article-swarm-intelligentbased-text-mining1.jpg, 5/6/14


2.7. The Fabric of the Model

As indicated by Frazer, to secure the evolutionary model it is important to define; “a genetic codescript, rules for the development of the code, mapping of the code to a virtual model, the nature of the environment for the development of the model and, most importantly, the criteria for selection.”147 In The Genetic Language of Design [1994], Frazer parallels his hypothesis to the medium of weave in textile design. He depicts how a conventional intuition of the forms and structures of nature is constantly supplanted by a stress on the inner logic and data coding of DNA and the development from a cell to a life form. Drawing on specific parallels in the patterns of Persian carpets, their evolution, he contends, is analogous to natural evolution (in this case mutation). Frazer further expounds on ‘the blind evolutionary tactics’ of nature and their relationship to time. Such tactics are capable of delivering extraordinary aesthetics and complexity by a “process of profligate prototyping and ruthless selection.”148 Frazer cites Jones; “Natural selection has superb tactics, but no strategy – but tactics if pursued without thought for the cost and for long enough, can get to places which no strategist would dream of”.149 To Frazer, the tactics are the rule-sets employed in the process-driven description of his systems, emulating those found in natural systems. Significant here is his reference to an important constituent; time. Weinstock also draws upon the importance of time, stating “time is a significant dimension for all systems”.150 This establishes Frazer’s view of the advantages of the computer; he sees it as an “evolutionary accelerator”151 where time and space are compacted.
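The five ingredients Frazer lists (a code-script, rules for its development, a mapping to a virtual model, an environment and selection criteria) can be caricatured as a toy genetic algorithm; in the sketch below, which is my own and not Frazer's implementation, the ‘environment’ is reduced to a single hypothetical target value.

import random

TARGET = 0.75                              # hypothetical environmental optimum

def develop(genome):
    # development rule: map the bit-string code-script to a 'virtual model',
    # here simply the fraction of ones in the genome
    return sum(genome) / len(genome)

def fitness(genome):
    return -abs(develop(genome) - TARGET)  # selection criterion

population = [[random.randint(0, 1) for _ in range(16)] for _ in range(30)]
for generation in range(40):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]              # ruthless selection
    offspring = []
    while len(offspring) < 30:
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, 16)
        child = a[:cut] + b[cut:]          # crossover
        if random.random() < 0.1:          # occasional mutation
            i = random.randrange(16)
            child[i] = 1 - child[i]
        offspring.append(child)
    population = offspring

print(round(develop(max(population, key=fitness)), 3))   # approaches the target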

2.8. The Design Genesis; Cellular Automata

The premise for Frazer's beginning point or "genesis" is the model of 'cellular automata', conceived by John von Neumann in the 1950s. Von Neumann was inspired by Stanislaw Ulam's application of abstract mathematics to investigations of crystal growth. “Von Neumann, [...] set out explicitly to create a theory which would encompass both natural and artificial biologies, starting from the premise that the basis of life was information”152 A Cellular Automaton (CA) works as an array of cells, each with its own state (1 or 0); this state is dependent upon the current state of the cell itself, the states of its neighbours, and a certain transition or movement ruleset which changes the state of a cell. This system of simple rules can produce surprisingly complex results. Such complexity is, however, dependent on the global ‘observer’.153 “Von Neumann recognized that life depends upon reaching this critical level of complexity”154 This provided justification for Frazer, who wrote [1995, p. 20]: “Life indeed exists on the edges of chaos, and this is the point of departure for our new model of architecture”.

The contention that “the complexity of the universe could be clearly understood in terms of simple programs”155 is introduced by Stephen Wolfram. His exploration of Cellular Automata began in

147 Frazer 1995, p. 65
148 Frazer 1994, p. 77
149 Ibid.
150 Frazer 2010, p. 37
151 Frazer 1995, p. 10
152 Frazer 1995, p. 13
153 Coates 2009, p. 43
154 Frazer 1995, p. 20
155 Wolfram 2006, p. 34


the early 1980s. CAs are particularly useful as their process can be observed globally. Wolfram demonstrated that programs with simple rules can produce complex results. “[...] [T]he behaviour that the system shows can nevertheless be highly complex. I argue that it is this basic phenomenon that is ultimately responsible for most of the complexity we see in nature”156 To Coates, although CAs have the same rules, each automaton is subject to a different environment (of neighbours); thus the process exhibits complex behaviour. Moreover, “if the output, is much larger than the rule, then one is justified in claiming that the total system is complex rather than just complicated.”157

Fig.43: Cellular Automaton, Wolfram MathWorld, rule 30 158
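An elementary cellular automaton such as rule 30 (Fig.43) takes only a few lines of Python; the sketch below, offered purely to make the ‘simple rules, complex results’ claim tangible, prints successive rows of a wrapped one-dimensional CA.

RULE = 30                                # Wolfram's rule number
WIDTH, STEPS = 63, 24

row = [0] * WIDTH
row[WIDTH // 2] = 1                      # a single live cell in the middle

for _ in range(STEPS):
    print("".join("#" if c else "." for c in row))
    # each cell's next state is the rule bit indexed by its (left, self, right) neighbourhood
    row = [(RULE >> (row[(i - 1) % WIDTH] * 4 + row[i] * 2 + row[(i + 1) % WIDTH])) & 1
           for i in range(WIDTH)]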

Our studio project relied entirely on Python scripting (in Rhino 5.0); at a later stage, however, Processing was also explored. To further elaborate, cellular automata were our chosen basis for the duration of this studio for further experimentation and understanding of the grounding principles. My team and I worked in particular on spatial frames by creating defined environments within which to operate the Cellular Automata procedures, and simulating their behaviour in Rhino with a mesh. In a multi-objective aim concerning data, geometry and statics, we developed algorithms to relate lengths within the “environment” to a given database of measures, and then to the performance of the turmites (refer to CAs). The precise description of the details is as illustrated and explained in our subsequent group project;

156 Wolfram 2006, p. 35
157 Coates 2010, p. 43
158 Retrieved from http://mathworld.wolfram.com/CellularAutomaton.html, 17/5/14


2.9. Programming Languages

As Kalay clarifies; “The objective of programming language is to support the (human) programmer in writing correct code for some task”.159 Artificial intelligence researchers adapted such languages, and declarative programming languages followed. Here they defined only the rules and functions, with no instructions on the operations. Such languages operated on a form of pattern matching. The introduction of object-oriented programming (OOP) languages provided objects whose composition was predefined. Such independent objects, drawn from libraries written by other programmers, amended once again the programmer’s dialogue with the computer. The methodology of writing a programme was, once the user had some grasp of the language, an undertaking of building definitions and explanations. Scripting languages can be considered a subset of object-oriented programming, examples of which include MEL Script, MaxScript and Python. The languages and grammars provide sufficient abstraction and representation to allow creators to define, or rather begin describing, their intentions. Text-based languages furnish us with an open system, which empowers creators to formalise non-linear strategies in their systems. The capacity of the Turing Machine and of algorithms to address the broadest scope of issues leads to the question of how we ought to begin defining such algorithms so that they are convenient for the designer’s application in the real world. What is the role of these strategies (algorithms) inside the design process? And how do architects and designers engage with such a methodology eventually to produce a decent design resolution to an occurring complex problem? Notwithstanding, it is comprehensible that to engage in this dialogue, designers ought to demonstrate a certain level of competency in ‘the art of computer programming’. As Coates states; “while the use of the computer in architecture can begin and end with just using the medium, a much more powerful set of experiences can be encouraged if the designer can enter a dialogue with the machine at the level of designing the medium itself, by being able to recast the design intentions and goals in terms that extend the capabilities of the software in front of them.”160

2.9.1. Python (in Rhino 5)
Python offers exciting new potential for programming in Rhino, with object-oriented functionality, simple syntax, access to the .NET framework and a vast number of user-built libraries to extend Rhino’s functionality. Programming also offers a new language with which to communicate with the world, because almost every discipline, from the Sciences and Engineering to Art, utilises code as a progressive new medium – and this primer gave us a relatively easy introduction into this powerful language for communicating with the world. As with the previous and other primers, we had the advantage of using geometric and visual examples to help understand programming. In many traditional scenarios, programming is approached with non-visual examples and difficult-to-understand foundational engineering, architectural and design problems. For this reason, as well as Python's easy-to-read syntax, everyone was able to understand and write simple programs to help automate and design within Rhino.
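A hedged taste of what this looks like in practice: the fragment below assumes Rhino 5's bundled rhinoscriptsyntax module and therefore runs only inside Rhino's Python editor, and the growth rule itself is an arbitrary example of mine rather than anything from the primer.

import math
import rhinoscriptsyntax as rs   # available only inside Rhino's Python environment

points = []
for i in range(100):
    t = i * 0.2
    # a simple generative rule: a helix whose radius grows with height
    r = 1.0 + 0.1 * t
    points.append((r * math.cos(t), r * math.sin(t), 0.3 * t))

rs.AddInterpCurve(points)        # draw an interpolated curve through the points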

159 Kalay 2004, p. 50
160 Coates 2010, p. 51


2.9.2. Processing
The open source platform Processing was developed by Ben Fry and Casey Reas whilst at PLW. Processing expands upon the principal foundations of OpenStudio and demonstrates an open system which further incorporates the pedagogy of programming as a tool for learning. The platform is a comprehensive synthesis of the fundamental values attributed to the PLW. “It integrates a programming language, development environment, and teaching methodology into a unified system.”161 As an artist in his own right, Reas expresses a system-based view of programming; “Writing software has been the foundation of my work as an artist for the last four years. During this time, I’ve come to think more generally about software, discussing it in terms of processes and systems rather than computers and programming languages”.162 Processing is based on a simple syntax. Furthermore, “the creative potential is greatly expanded with Processing as designers and artists learn to become proficient coders as well.”163 To achieve a cognitive transparency, Processing employs an open system with which code is both visible and adjustable. This unified platform functions in a meshwork fashion: “open systems enable sharing, cycling and continual innovation through collective actions.”164 The novel approach of Processing is the degree of universality this open system provides, both in terms of a cognitive accessibility and a collective distribution. The true benefit of open systems is summarised by Tierney: “These examples operate as dynamic social ecologies that are structured to both reflect and extend our cognitive and social engagement with the world.”165

Fig.44: Microbial Ecologies; Fran Castillo, Peter Malaga, Yogesh Karekar and Priyanka Narula, Research 2013 [Communication Protocol (Digital Simulation - Fabrication System- KUKA 6 Axis Robot)]

Fig.45: Microbial Ecologies; Fran Castillo, Peter Malaga, Yogesh Karekar and Priyanka Narula Research, 2013 [Model]

161 Reas and Fry 2007, p. 1
162 Reas, cited in Tierney 2006, p. 43
163 Tierney 2006, p. 45
164 Ibid.
165 Ibid.


PROJECT
2.10.1. thi(n)GMO - contact tropism

Design Team: Harold Woods, Sarah Winkler & Ernesto Arias

In a prologue to the project – thi(n)gmo, based on Cellular Automata – a ruleset generator was created and enforced onto the script. It was run over more than 8,000 steps or runs to attain different results; the result is different in each run and along with it comes a different ruleset to describe the portrayed result. A single desired ruleset out of the more than 8,000 possibilities was advanced to the next stage to explore the possibilities it held. The thi(n)gmo project is thus an exploratory journey of this one rule out of the 8,000 rulesets (1/8000).
CATALOGUE 1

catalog title: sample references of cellular automata run rulesets and various possibilities index description: examples of results (rulesets and patterns of movement)



thi(n)gmo. Barcelona, SPAIN
Project: thi(n)gmo. Theme: Biodigital Architecture | Encoded Matter
Design team: Harold Woods, Sarah Winkler & Ernesto Arias
Instructor: Ezio Blasetti
May, 2014

Description: Tropism is the growth of a plant in response to a stimulus that acts with greater intensity from one direction than another…the key is directional stimulus. This trait makes it possible for plants to optimize their position in space for better usage of environmental resources. Thigmotropism is essentially contact tropism as demonstrated by rhizomes, plant tendrils, carnivorous plants, and certain cell behavior. Can thigmotropism be the basis for a biodigital exploration of the “inherent potential of computation to generate space and of algorithmic procedures to engage self-organisation in the design process”? In this project, we “engaged with encoded process in order to develop an aesthetic and intuition of complexity”. Using Python scripting and Rhinoceros 3D 5.0, we explored, organized and cataloged our results. Three Python scripts were exercised. In all three our “agent(s)” were embedded with a specific logic to achieve a multiplication of effect. The scripts are based on simple rules but these lead to complex and emergent behavior. The first script is an application of the 2-D Langton’s Ant with the added capability for the ant to have a color that can change in addition to the environment (Turmite = Turing Machine Termites). The second script was interwoven/looped with the Turmite and applied to both a 2D mesh plane and a primitive closed 3d mesh. The agents (termites) were allowed to work through a certain number of steps and then a subdivision routine was activated. Tropism (directional growth) is achieved through termite contact with the mesh. The third script was an “agent on mesh” algorithm. User-specified agents act on the mesh and leave curve trails in their wake as they grow towards user-specified attractors.
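For orientation, the classic 2-D Langton's Ant, the one-colour turmite that the first script extends, fits in a few lines of plain Python; this is a generic reference sketch of the rule, not the studio script itself.

# Langton's Ant on a wrapped grid: on one colour the ant rotates one way,
# on the other colour it rotates the other way, flips the cell, then steps forward
N, STEPS = 80, 11000
grid = [[0] * N for _ in range(N)]
x = y = N // 2
dx, dy = 0, -1                            # initial heading

for _ in range(STEPS):
    if grid[y][x] == 0:
        dx, dy = -dy, dx                  # rotate 90 degrees one way
    else:
        dx, dy = dy, -dx                  # rotate 90 degrees the other way
    grid[y][x] ^= 1                       # flip the cell's colour
    x, y = (x + dx) % N, (y + dy) % N     # step forward

print(sum(map(sum, grid)), "cells currently flipped after", STEPS, "steps")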



catalog title: biological reference images for tropism index description: reference numbers (see below), all reference web sites accessed 22-05-2014

0.1 cortical microtubule cytoskeleton acquires organization and this organization in turn functions to guide patterns of cell growth and division 166
0.2 drosera capensis (cape sundew) is carnivorous through thigmotropism 167
0.3 aspen trees; subgrade rhizomes have horizontal growth through negative thigmotropism, avoidance of soil 168
0.4 Aldrovanda vesiculosa: the cartwheel 169
0.5 dionaea muscipula (venus flytrap) carnivorous through thigmotropism 170
0.6 phalaenopsis orchid. orchids are rhizomes ... growth through thigmotropism 171
0.7 thigmotropic tendril 172
0.8 drosera regia carnivorous plant 173
0.9 panax ginseng is a thigmotropic rhizome 174
0.10 x-ray showing lumbo-sacral facet tropism 175
0.11 lilium tigrinum (tiger lilies) are thigmotropic rhizomes 176
0.12 thigmotropic tree roots. boca raton, florida 177

166 Retrieved from https://deepgreen.dpb.carnegiescience.edu/
167 Retrieved from http://brilliantbotany.com/post/76552842654/biocanvas-sticky-glands-from-a-cape-sundew
168 Retrieved from http://heritageandfamily.blogspot.com.es/2013/01/aspen-tree-genealogy.html
169 Retrieved from http://www.plantworlds.com/carnivores3.html (accessed 22-05-2014)
170 Retrieved from http://imgarcade.com/1/thigmotropism-venus-fly-trap/
171 Retrieved from http://www.orchidcarelady.com/page/4/
172 Retrieved from http://newagaintoday.blogspot.com.es/2012/08/a-happy-day-in-neighborhood-our-yard-is.html
173 Retrieved from http://home.arcor.de/bart.w/Galerie/regia%20spirale.jpg
174 Retrieved from http://necesitodetodos.org/2014/04/plantas-medicinales-2/7-ginseng/
175 Retrieved from http://www.mainechiro.com/courses/low_ans.htm
176 Retrieved from http://1hdwallpapers.com/tiger_lilly-wallpaper.html
177 Retrieved from http://www.panoramio.com/photo/34225823, photo uploaded 10-04-2010 by Richard McNeil


CATALOGUE 2 Description: The group chose to explore the potential of computer-generated growth. The first step was to evaluate the application of the 2-D Langton’s Ant with the added capability for the ant to have a color that can change in addition to the environment (Turmite = Turing Machine Termites). The code was manipulated to include a random rule set generator in order to get a better idea of the cross-section of results that are possible with a limited number of runs. A 2d 50 x 50 grid mesh was used for the environment and one point assigned to be the Turmite. It was estimated that only 10% of all runs resulted in a pattern generation. The other 90% of runs resulted in the Turmite immediately “dying” and no pattern being produced.
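The random rule-set generation described above can be paraphrased in plain Python as below; the (state, colour) to (write colour, turn, next state) encoding is a generic turmite convention, and the 10% survival figure quoted above is the group's empirical observation, not something this sketch reproduces.

import random

STATES, COLOURS = 2, 2
TURNS = [-1, 0, 1, 2]      # left, straight on, right, U-turn (in quarter turns)

def random_rule_set():
    # one entry per (machine state, cell colour): colour to write, turn, next state
    return {(s, c): (random.randrange(COLOURS), random.choice(TURNS), random.randrange(STATES))
            for s in range(STATES) for c in range(COLOURS)}

for i in range(10):
    print(i, random_rule_set())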

catalogue title: random rule set generation & Turmite pattern growth on 2d mesh index description: a selection of forty eight 2d patterns resulting from 3,000 randomly generated rule sets



CATALOGUE 3 Description: This algorithm explores how turmites embedded with a specific logic navigate over an environment (mesh) based upon a selected rule set from the randomly generated rule sets of the previous exercise. In this case rule set 5, which had a linear pattern, was used as a hard input to the algorithm. The number of steps was set to 3,000, the mesh grid increased to 500 x 500 and the number of turmites increased in order to evaluate how the turmites might interact with each other.

catalogue title: turmite agent pattern over 2d 500 x 500 grid mesh (rule set pattern 5 as input) index description: agent influenced growth pattern over time, catalog number = “rule set step number”



CATALOGUE 4 Description: The second script was interwoven/looped with the Turmite script but with a manually selected and hard-programmed rule set. The resulting algorithm was applied first to a 2d mesh plane. The agents (termites) were allowed to work through a certain number of steps and then a subdivision routine was activated. The code had a “spring” feature intended for the 3d exercising of the code. For the 2d runs, this feature was disabled.

catalogue title: turmites & 2dmesh subdivision (pattern rule set 4 as input) index description: agent influenced growth over time, catalog number = “rule set.step number”



CATALOGUE 5 Description: We observed four general pattern types from the successful rulesets: Spiral, Linear, Chaotic and Stamp. We chose an example from each of the pattern types we observed in our initial cataloging of the standalone turmite algorithm to use their rulesets as input for the combined turmite/subdivision algorithm. The code was modified so that the trail left by the turmite would cause a subdivision, resulting in a growth of the mesh. After evaluating the results of running the modified algorithm with the four rulesets, we found that, when applied to a 3D mesh, each of the agents had a similar influence on the growth of Thi(n)gmo.

catalogue title: comparative study of turmite & 3dmesh subdivision (pattern rulesets 1, 2, 3, & 4 as input) index description: agent influenced growth, catalog number is “ruleset.step number”



CATALOGUE 6 Description: Evaluation of the results of the 3d turmite/subdivision routine for the four rulesets resulted in a group conclusion that the subdivision algorithm was dominating the agent algorithm. The formations generated after a similar number of iterations were quite similar even though the generation started from ruleset inputs that had created significantly different 2d patterns. The group made an intuitive decision to move forward and evaluate the effects of an additional growth program using a fifth rule set that was selected because it was a very linear pattern in an effort to see if the growth due to pattern would be more distinguishable.

catalogue title: turmite & 3dmesh subdivision (pattern ruleset 5 as input) index description: agent influenced growth over time, catalog number = “ruleset.step number”



CATALOGUE 7 Description: In this algorithm, parasitic agents have been embedded with a specific logic to navigate over the mesh towards attractors. The mesh object resulting from the turmite/subdivision script is used as a scaffold for navigation and habitation by agents. Agents leave a trail of curves behind. The catalog illustrates growth over time (3,000 steps of growth were recorded). This added encoded behaviour starts a process that, if continued and refined, could produce architectural apertures, openings, circulation and space organization.
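Stripped of the mesh, the attractor-seeking behaviour of this catalogue reduces to agents stepping towards their nearest attractor while recording a trail; the fragment below is a schematic paraphrase in plain Python with made-up coordinates, not the studio's "agent on mesh" script.

import math
import random

attractors = [(8.0, 2.0), (-5.0, 7.0)]            # hypothetical attractor points

def step_towards(p, target, size=0.5):
    dx, dy = target[0] - p[0], target[1] - p[1]
    d = math.hypot(dx, dy) or 1.0
    return (p[0] + size * dx / d, p[1] + size * dy / d)

agents = [(random.uniform(-10, 10), random.uniform(-10, 10)) for _ in range(5)]
trails = [[p] for p in agents]                    # each agent leaves a curve trail

for _ in range(40):
    for i, p in enumerate(agents):
        target = min(attractors, key=lambda a: math.hypot(a[0] - p[0], a[1] - p[1]))
        agents[i] = step_towards(p, target)
        trails[i].append(agents[i])

print([len(t) for t in trails])                   # 41 recorded trail points per agent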

catalogue title: agent tropism on mesh (pattern ruleset 5 mesh as input) index description: agent influenced growth over time, catalog number = “ruleset.step number”



RENDER/VISUALISATION; ONE POSSIBILITY



2.11. CONCLUSION
Evolution is an automatic search process. To De Landa, evolution searches “the space of possibilities”.178 He proposes that genetic algorithms can be seen as a visualisation tool for the generation of new forms. Aligning his discussion with the materialist Deleuze, he describes how the ‘genes tease out something of the matter’s morphogenetic potential’. Interpreting Deleuzian ideas on Materialisation, De Landa calls for a partnership between designers and materials. He argues: “It is not for designers to impose their will.”179 Computation therefore is not merely another technological innovation but ultimately a consequence of the metaphysical desire to uncover the Code of Life, and along with it, the invention and construction of abstract machines that could engender possibilities. It is this logic encoded within an internal principle, which constitutes the autonomy of the generative, that lies at the heart of computation. Artificial Intelligence enables the cultivation of non-linear processes, and parallelism is the source of most generative processes. Moreover, both designers, human and machine, in partnership have a cognitive reach that extends beyond the spectrum of either individual intellect. We have presented a novel methodology – the methodology of scripting – which is shown here to be fundamental to creativity and as relevant as drawing; programming, we argue, may be seen as a creative process in its own right.

178 De Landa, 2004
179 Ibid.




CHAPTER 3 | AUTOPOIETIC & GENERATIVE ARCHITECTURE Dennis Dollens Studio_
3.1 Definition

Autopoiesis is a term coined by Humberto Maturana (1980) to explain the process of living or cognising. It comes from the Greek “auto”, meaning “self”, and “poiesis”, meaning “creation or production”. The autopoietic (living) system is an autonomous, self-organizing system. It is operationally closed, containing within it all the elements necessary for its own reproduction and the maintenance of its organisation, but it is open to the flow of matter and energy and hence coupled to its environment. Life is defined by Maturana and Varela as a type of self-organization, thus implying that autopoiesis takes place in the physical space. This resembles the concept of metabolism, which itself is typically included in definitions of life. Three senses of metabolism are distinguished. This chapter takes a theoretical approach to architecture through autopoietic architecture, which presents the topic as a discipline with its own unique logic. Architecture's conception of itself is addressed, as well as its development within wider contemporary society. It is argued that the problems of emergence and the architecture of complexity can be solved by analysing the self-organizing evolution of complex systems. A generalised, distributed variation-selection model is proposed, in which internal and external aspects of selection and variation are contrasted. "Relational closure" is introduced as an internal selection criterion. A possible application of the theory, in the form of a pattern-directed computer system for supporting complex problem-solving, is sketched. The theory of autopoiesis challenges concepts familiar in biology and cognitive science. While its use of informational language is too restrictive, its use of cognitive language is too liberal: life does not imply cognition.

3.2 The Concept of Autopoiesis

The entire theory of autopoiesis – etymologically, self-making – has been developed by the physiologists Maturana and (later) Varela over more than thirty years. Its most sustained expression is in their book Autopoiesis and Cognition: The Realization of the Living, first published in Spanish in 1972 and translated a few years later (1980). The idea of autopoiesis is highly reminiscent of the strongest sense of metabolism. More precisely, since autopoiesis is a purely abstract concept, which Maturana and Varela apply to many different examples of self-organization, metabolism is close to the particular type of autopoiesis they regard as characteristic of life. In their terminology, this is “autopoiesis in the physical space”, implying that autopoiesis is effected in physical systems, by physical (metabolic) processes. Metabolism is not part of the definition of autopoiesis as such, since autopoiesis is a more general concept. But, to use their characteristically opaque vocabulary, it is constitutive of its material structure when autopoiesis is realised as a living thing.180 Maturana and Varela take this form of autopoiesis (metabolic self-organization) to be the real essence of life. They see it as logically and biologically prior to most of the other vital properties in

180 Maturana and Varela, 1980, p. 88


the “typical” list given in the Introduction, including reproduction, evolution, growth, responsiveness, and adaptation. Their approach has been adopted by some researchers in A-Life and cognitive science. Unlike many of their colleagues, these researchers, some of whom are involved in the computer modelling of biological processes, unequivocally deny the possibility of strong A-Life; because they see cognition as necessarily grounded in life, they also deny the possibility of strong artificial intelligence. Autopoietic systems in general are defined in terms of their organization, not of their components nor even the properties of their components. What is crucial is “the processes and relations between processes realised through components.”181 An autopoietic system, or “autopoietic machine,” is formally defined by Maturana and Varela as follows: “[...] a network of processes of production; transformation and destruction of components that produces the components which:
(i) through their interactions and transformations continuously regenerate the network of processes or relations that produced them; and
(ii) constitute the machine as a concrete unity in the space in which the components exist by specifying the topological domain of its realization as such a network.”182

The “machine,” here, is abstractly conceived, as is a Turing machine; and the “concrete” unity is not necessarily physical. The concept of autopoiesis as just defined can be applied, for instance, to inorganic chemistry, to architecture, to business organizations, or to whole societies. None of these is classed by Maturana and Varela as a living system, although the last two are higher-level systems whose lower-level components are living organisms. For life, as they define it, embodiment is required; “autopoiesis in the physical space is a necessary and sufficient condition for a system to be a living one”.183 And embodiment, in turn, involves more than mere physical existence. It requires the self-creation of a unitary physical system by the spontaneous formation of a physical boundary. Precisely, they claim that living things are of necessity physical, with a bodily fabric and boundary produced and maintained by them. The outstanding feature of living organisms is a form of self-organization termed autopoiesis. Some autopoietic systems have a self-maintained identity that does not exist in the physical space. A society, for instance – social order, so to speak – consists of organisms closely coupled not only by physical relations, but also by semantics; that is to say, linguistically grounded communications. The self-organization of a society is constituted by a self-coherent and self-sustaining set of social practices, within which there may be sub-systems of intercommunication having their own autopoietic unity. Different legal systems, for example, shape and maintain themselves within the specific communities concerned, and help establish equilibrium between various social and economic institutions.

181 Maturana and Varela, 1980, p. 75
182 Ibid., p. 79
183 Ibid., p. 84


Only human organisms can form part of a society, so defined. But for all living creatures, the very boundaries of the living system as a physical unity, as well as its bodily components, are continuously produced by its own activities. A human body, or a tree, is an autopoietic unity in the physical space. But they are higher-level autopoietic systems, made up of many such systems at a lower level.184 The basic phenomenon here is not the formation of a body with arms and legs, or leaves and boughs, but the self-organization of a single cell. The generation of the cell membrane both bounds and constitutes the cell as an autonomous vital entity, distinguishable from its environment. Explaining how this can happen is universally acknowledged to be one of the core problems of biology.185

3.3 Autopoiesis and A-Life

Neither Maturana and Varela, nor Zeleny himself, regard Zeleny's computer simulation of life as real life. It is indeed an autopoietic system, but the autopoiesis does not take place “in the physical space”. To be sure, the system is actually implemented in the electronic processes of the computer. But at that level, there is no autopoiesis. The self-organization results only within an abstract representation of chemical relationships like catalysis, that are modelled by, not realized in, the computer. The same applies to Varela's more recent simulation of autopoiesis by means of an artificial chemistry.186 In other words, the arguments given above to show that a simulation of metabolism is not really alive are paralleled by arguments showing that the type of autopoiesis characterizing some simulations is not the type of autopoiesis required for life. That is not to say that Maturana and Varela deny the possibility of any conceivable sort of artificial life. On the contrary, they explicitly agree that one could in principle “design” and “make” a living system.187 They further remark that we may, unwittingly, already have done so. In saying this, however, they are not thinking of virtual creatures in computer memory, but of self-maintaining biochemical systems. Nor are they thinking of some marvel of nanotechnology, whereby individual bio-molecules are directly arranged manually. Rather, they have in mind the human scientist's “creating the conditions under which autopoietic biochemical systems form themselves”.188 If, for instance, nanotechnology were necessary to produce a particular biochemical system, even one that thereafter was self-sustaining, that system would not count as autopoietic. But suppose that a nanotechnologist deliberately aided the construction of a self-maintaining system by making chemical changes that could, indeed would, eventually have happened naturally. In that case, I would want to say that the resulting system was an autopoietic one, even though its self-construction had been aided by human intervention. Maturana and Varela, however, might view this intervention as adulterating.

184 Ibid., pp. 107-9
185 Maynard-Smith & Szathmary, 1995, ch. 7
186 McMullin & Varela, 1997
187 Ibid., p. 114
188 Zeleny, 1977, p. 27


3.4 Autopoiesis, Cognition and Biological Systems

Autopoietic theory exemplifies biology as it could be. Indeed, since it already exists – the first publications appeared in the 1960s – one could assert that autopoiesis exemplifies biology as we know it. But this would be misleading. The vocabulary of autopoiesis is unfamiliar to most biologists and scientists, and a minority taste among those biologists who have encountered it. Moreover, there are a number of differences between the theory of autopoiesis and more orthodox biology, which prevent its wider acceptance. Some differences are arguably matters of emphasis, which need not drastically affect the choice of research questions. These include what the autopoietic approach says about the nature of death, and about the relations between life, reproduction, and evolution. Other differences are more likely to arouse genuine puzzlement, or even exasperated rejection. For Maturana and Varela have fundamental reservations about some concepts widely used in biology, psychology and cognitive science too. On the one hand, they reject theoretical terminology that is common in both biology and psychology. On the other hand, they speak of knowledge and cognition in many contexts where psychologists and most cognitive scientists would not. These more unorthodox aspects of autopoietic theory will be addressed in the following two sub-sections; “No information, or representation” and “All life is cognition”. Here, let us consider three aspects of the autopoietic concept of life that distinguish it from most other definitions; namely, its implications regarding death, reproduction, and evolution. No possibility of interrupted autopoiesis is granted by Maturana and Varela.189 This is not primarily because they believe, as most proponents of metabolism perhaps do too, that the physical processes concerned are, as a matter of fact, continuously dynamic. Rather, their conception of autopoiesis as the fundamental source of the unity of the living thing forbids any suggestion that it might be interrupted without thereby destroying the vital integrity of the system in question. As they phrase it; “in a living system, loss of autopoiesis is disintegration as a unity and loss of identity, that is, death”.190

3.5 Genetic Algorithms and the Autopoiesis of Architecture

The new digital approach to architectural design is based on computational concepts such as topological space, isomorphic surfaces and parametric design. Architecture is recasting itself, thus becoming partly an experimental investigation of topological geometries. Digital media is employed not as a representational tool for visualization, but as a generative tool for the derivation of form and its transformation; the digital morphogenesis. It explores the possibilities of form finding. Topological space opens up a universe where essentially curvilinear forms are not stable but may undergo variations, giving rise to new possibilities; the emergent form. In the task of designing rich search spaces, certain philosophical ideas, which may be traced to the work of Gilles Deleuze, play a very important role. Though not invented by Deleuze, he was the one who brought them together for the first time, making the basis for a brand new conception of the genesis of form.

189 Ibid., p. 98
190 Ibid., p. 112


According to Manuel De Landa in his essay “Deleuze and the Use of the Genetic Algorithm in Architecture,” the productive use of the Genetic Algorithm implies the deployment of three forms of philosophical thinking:
1 - Population thinking: the sequence of operations points at spontaneous and multiple mutations.
2 - Intensive thinking: without structural engineering and the distribution of stresses, a virtual building will not evolve as a building.
3 - Topological thinking: the forms obtained with the Genetic Algorithm must demonstrate an incredible combinatorial productivity, as in natural forms, with thousands of possibilities.
The employment of genetic design strategies develops autonomous architectural concepts, which replace the traditional hierarchical processes of production known as “cause and effect”; new organizational patterns and weavings and performative morphologies that can modulate and differentiate the environment. This morphogenetic process includes pattern, repetition and permutations. The tendency towards architectural autonomy might be understood as a moment of an overall societal process of differentiation. Traditionally, architecture and good design were inseparably connected with society and harmony. The new algorithmic evolutionary conditions give architecture an autopoiesis. The autopoietic system, as a complex, historically evolving system, always uses time and involves a series of events in its responses, so that simple and predictable one-to-one correlations between environmental impacts and system responses are out of the question. Recent developments in digital technology expose a degree of autonomy that architectural discourse has established by differentiating itself from the immediacy of everyday talk about buildings, and thus the complexity of the discursive detour, which mediates a particular impact or response, should grow with the overall complexity of society.

3.6 Emergence and the architecture of complexity

Realistically complex systems like organisms, societies, and ecologies are characterized by a multi-level structure. A classic explanation for this hierarchical "architecture" of complex systems was given by Simon (1962). His argument is based on a variation and selection view of both natural and artificial evolution: elements are connected and combined by natural interactions thus creating a variety of assemblies. Of these assemblies only those will "survive" which are sufficiently stable; the other assemblies will fall apart before they can undergo any further evolution. The stable assemblies, forming "naturally selected wholes", can then again function as building blocks, to be combined into higher order assemblies, and so the process can repeat itself at ever higher levels, forming a set of hierarchically structured complexes. Simon then uses this model in order to show why multi-level systems are more probable to emerge than two-level systems of comparable complexity: in a two-level system all the components must "fall into place" at once, otherwise the assembly will be unstable and fall apart before the missing components are added by the natural variation mechanisms. In a multi-level system, on the other hand, it suffices that small subsets of components would "fall into place" forming stable subassemblies ("modules"), which can then again be recursively combined in


small sets forming higher level modules. Clearly, the smaller the set of elements which must fall into place, the higher the probability that this will happen by random combination. However, Simon acknowledges that there are exceptions to the rule that non-hierarchical complex systems are highly improbable: for example, most polymers are formed by a very simple linear, two-level assembly of a large number of molecules. One of the important contributions of present-day self-organization models is that they can explain the emergence of such non-modular, two-level systems, which have nevertheless a very large number of elements. Such processes are usually characterized by non-linear, autocatalytic mechanisms, whereby the presence of a small stable assembly whose emergence is quite probable enhances the probability that other elements would join the assembly, thus making it grow and become even more stable. According to the formulation of Haken (1983); a stable mode "enslaves" the remaining unstable modes. No intermediate levels of modules are needed in such a process with positive feedback. The emergent stable configuration can be thought of as an "attractor" exerting a force on the configurations in its neighbourhood, so that the configurations which are close enough to the attractor will automatically move closer and closer towards this stable configuration. It is clear that both the hierarchical model of Simon and the "non-linear" models of self-organization only describe part of the features of emergence. A real complex system, for example the human body, has both hierarchical, multi-level aspects (such as the organelle being a subsystem of the cell, which is a subsystem of the organ, which is a subsystem of ...) and non-linear, two-level aspects (e.g. the system of blood vessels as a coordinated closed circuit consisting of billions of blood cells). However, in general, there is not just one global hierarchy or non-linear organization, but a multitude of inextricably entwined sub-organizations and subsystems. If we wish to understand the architecture of such complexity, we will need a more general, integrating theory of emergence and self-organization. The present text will propose some basic principles on which such a theory could be founded.
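Simon's probabilistic argument is easy to quantify with toy numbers (purely illustrative, not Simon's own figures): if each required component independently "falls into place" with probability p per attempt, and unfinished assemblies fall apart between attempts, then building stable modules first takes far fewer expected attempts than assembling everything at once.

p, n, k = 0.5, 16, 4       # 16 components, modules of 4, p = 0.5 per attempt

# two-level system: all n components must fall into place in a single attempt
attempts_flat = 1 / p ** n

# multi-level system: build n//k stable modules of k parts each,
# then combine those modules in one further assembly step
attempts_modular = (n // k) * (1 / p ** k) + 1 / p ** (n // k)

print(round(attempts_flat))      # about 65536 expected attempts
print(round(attempts_modular))   # about 80 expected attempts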

3.7 Emergence and self-organization

Emergence is a classical concept in systems theory, where it denotes the principle that the global properties defining higher-order systems or "wholes" (for instance, boundaries, organization, control, etc.) can in general not be reduced to the properties of the lower-order subsystems or parts. Such irreducible properties are called emergent. As yet there is no satisfactory theory explaining what characterizes emergent properties or what the conditions for their existence are. During this thesis, and in previous and subsequent chapters, I have tackled the question not from the traditional static viewpoint but from a dynamic, evolutionary viewpoint, replacing the question "How can a property be emergent?" by "How can a property become emergent? – how can it emerge?" This should also lead us to answer the question "Where do 'wholes' or 'systems' come from?" A promising approach to the problem of dynamical emergence is provided by the recently developed models of self-organization. Self-organization may be defined as a spontaneous process of organization – neither steered nor directed by an external system – that is, the development of an organized structure. The spontaneous creation of an "organized whole" out of

83


a "disordered" collection of interacting parts, as witnessed in self-organizing systems in physics, chemistry, biology, sociology and the likes is a basic part of dynamical emergence. However, another essential characteristic of emergence as it is understood in systems theory is its hierarchical or multi-level nature: an emergent whole at one level is merely a component of an emergent system at the next higher level. Until now, the most popular paradigms used for explaining self-organization; for instance, attractors, synergetics, catastrophes etc. are characterized by a mere two-level structure: the "microscopic" level where a multitude of building blocks or elements like molecules, and individual organisms interact, and the "macroscopic" level where these interactions lead to certain global patterns of organization; for example, a dissipative structure or a crystalline symmetry. The resulting systems as studied through these paradigms like a crystal, a regular pattern of fluid rolls in the Bénard phenomenon, or a trail of ants carrying food back to the nest are usually so simple in structure that it is not necessary to use a specifically systemic approach for understanding them.

3.8 Autopoiesis and the System

Lars Spuybroek wrote in 2004, in NOX: Machining Architecture, that he 'dreamed of a systems theory in architecture.'191 In 2011 Patrik Schumacher attempted to deliver this dream in the form of his controversial book, The Autopoiesis of Architecture. Its first volume opens with this statement: "The phenomenon of Architecture can be most adequately grasped if it is analysed as an autonomous network (an autopoietic system) of communications."192 Schumacher appropriates the term 'autopoiesis' from the biologist Humberto Maturana via the social scientist Niklas Luhmann. As Curtis pointed out in All Watched Over By Machines of Loving Grace, the work of biologists like Maturana, who have imposed cybernetic systems on nature, has proven highly contentious, often oversimplifying a complex natural system to fit a simplified cybernetic one. Thus Schumacher's methodology is certainly questionable. Just as questionable is his claim that this application of systems theory provides a new paradigm. Peter Buchanan challenges this particular idea in his critique of the book, likening it instead to Marshall McLuhan's 'sunset effect': a 'flare-up', and 'an exaggeration of the pathologies of modernity.' I would agree; the attitude, particularly towards technology, seems similar to that which we attributed to Modernism, particularly in how it invests the machine with creative agency. However, the agency has moved from the machine itself to the self-organising system, in which humans and machines are all equal nodes, endowing the system as a whole with creative agency.

191 Spuybroek, NOX: Machining Architecture, p. 5
192 Schumacher, p. 1



3.8.1 What is the aim of "The Autopoiesis of Architecture"? In his book The Autopoiesis of Architecture, Patrik Schumacher elaborates on autopoietic systems, presenting architecture as a discipline with its own unique logic. He explores how the various modes of communication comprising architecture depend upon each other, combine, and form a unique subsystem of society that co-evolves with other important autopoietic subsystems like art, science, politics and the economy. Generally, the book applies to architecture the concepts and methods of the German sociologist Niklas Luhmann (1927-1998) in its attempt to present an all-embracing, unified theory of architecture. Luhmann's many books analyse modern society as a set of autonomous functional systems, including law, economics and politics. This horizontal differentiation into functional systems distinguishes modern society from the previous era of vertical stratification into social classes, the vestiges of which persist. Each functional system constitutes a separate system of communications and is autopoietic in nature. Autopoiesis, still a somewhat controversial concept, is a term coined by the Chilean biologists Humberto Maturana and Francisco Varela in 1972, and means self-generating. Here it refers to the evolving dynamic of these functional systems, the autonomy of each of which, in line with the concept of autopoiesis, helps keep them evolving.

Fig.: Parametric urbanism, Patrik Schumacher, illustration of autopoiesis193

For the most part, the book thus explicates Luhmann's conceptual system while applying it to architecture. Although the book is clearly written and easily understood, these parts are echoed over and over again, emphasising the same points countless times. Yet in a few places elsewhere the opposite pertains: untenable assumptions appear in a single sentence, without the full explanation they demand. As Schumacher asserts, 'The theory of architectural autopoiesis is trying to think through the implications that follow when all the above mentioned options are rejected in order to embark upon a consistently anti-humanist, systemic and radically constructivist re-description and forward projection of architecture.'

193 Retrieved from http://www.dezeen.com/2011/02/04/competition-five-copies-of-the-autopoiesis-ofarchitecture-by-patrik-schumacher-to-be-won/, 13/8/14


Following Luhmann's systems approach, he defines architecture as a system of communications. As Schumacher states, the introduction of the concept of autopoiesis reflects the premise that the discipline of architecture can be theorized as a distinct system of communications. Autopoiesis means self-production. The concept was first introduced within biology to describe the essential characteristic of life as a circular organization that reproduces all of its most specific and necessary components out of its own life-process. This idea of living systems as self-making autonomous unities was transposed into the theory of social systems, understood as systems of communications that build up and reproduce all their necessary, specific communication structures within their own self-referentially closed process. It is this total network of architectural communications, a gigantic, self-referentially closed parallel process, that is referred to in the title of the book: the autopoiesis of architecture is this overall, evolving system of communications.194 The aim is a comprehensive theoretical system that offers itself to architecture as its comprehensive self-description, describing architecture from within architecture, in its internal constitution and in its relationship to its societal environment. The premise here is that architecture has always already constituted itself self-referentially, via its own autonomous, disciplinary discourse. The theory proposed here, the theory of architectural autopoiesis, focuses on architectural communications and "observes" these communications to detect their typical patterns. The theory analyses how individual communications depend upon and reproduce communication structures such as the key distinctions, concepts, values, styles, methods and media of the discipline.

3.9 Constructing Autopoiesis & Metaphors

We wanted the term because it signalled the connection between what we do and work being done in the fields of self-organization, autopoiesis, artificial life and consciousness studies. When crossing disciplinary domains, one always crashes head-first into the problem of metaphor: how can you connect ideas from distinct intellectual fields by means of a device analogous to what mathematicians recognize as an equals sign or a congruency sign? The concepts and principles introduced above should not remain purely theoretical speculations. With the advent of the new information technology, complex, qualitative mechanisms can now be implemented and tested on a computer in a relatively simple way. A general programming paradigm, pattern-directed systems, is emerging, which is directly applicable to the present type of approach. A pattern-directed system consists of a collection of modules or rules, which respond to messages or conditions characterized by a specific pattern (a set of variables or input channels structured in a specific way) by sending out new messages or actions that depend on the information received. The system is intrinsically parallel, since different modules can respond simultaneously to the same or even different message(s), but it is possible to simulate such mechanisms on sequential machines.
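As a minimal sketch of what such a pattern-directed system could look like, the code below lets a handful of modules watch a shared pool of messages and post new messages whenever a pattern matches, cycling until nothing new is produced. It is an illustration only: the Rule class, run_cycle function and the connector-flavoured example messages are my own assumptions, not a description of any existing library or of the thesis project.

```python
from dataclasses import dataclass
from typing import Callable, Set

@dataclass
class Rule:
    pattern: Callable[[str], bool]     # condition an incoming message must satisfy
    action: Callable[[str], Set[str]]  # messages emitted in response

def run_cycle(rules, messages, max_cycles=10):
    """Fire every rule whose pattern matches any current message, in parallel,
    until the message pool stops changing (a crude stand-in for 'closure')."""
    for _ in range(max_cycles):
        new = set()
        for rule in rules:
            for msg in messages:
                if rule.pattern(msg):
                    new |= rule.action(msg)
        if new <= messages:            # nothing new produced: a stable state
            break
        messages |= new
    return messages

if __name__ == "__main__":
    rules = [
        Rule(lambda m: m == "high wind load", lambda m: {"lock connectors"}),
        Rule(lambda m: m == "lock connectors", lambda m: {"signal: red"}),
    ]
    print(run_cycle(rules, {"high wind load"}))
```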

194 Schumacher, 2010


Examples of pattern-directed systems are production systems, classifier systems, object-oriented systems, and logical or relational programming. In our approach the modules can be likened to subsystems, and the messages to their input and output. Two modules can be said to be (temporarily) coupled if the output message of the one is accepted as input by the other. The general problem with pattern-directed systems is to specify the control structure: the set of rules which determines which module can send or accept messages to or from which other module. The generalized variation-selection dynamics, in combination with the closure concept, may provide an answer to this problem. The dynamics controlling the flow of messages must depend on two selection criteria: the external problem, to be specified by the user, and the internal closure of collections of coupled rules, leading to the self-organization and emergence of complex subsystems within the pattern-directed system.

In order to be effective the system should also have a variation mechanism. To start the problem-solving, or evolution, process, there must be an original variety of modules. This can be provided by the user, who could try to express the initial knowledge he has about the problem domain in the form of "if ... then ..." modules. Of course, this initial variety can always be expanded by the user during the problem-solving process: there is a continuous interaction between the computer system and the user, who plays the role of the external environment. Another source of variety can be provided by the computer system itself, which generates variations of the existing modules by internal changes or by combinations with different, external modules. Until now, typical problem-solving programs working according to the generate-and-test mechanism have used only internal variation: the state of the system is changed by replacing some of its intrinsic properties. However, we have seen that external variation is a more interesting process, in the sense that it can give rise to the emergence of a higher order of systems through closure. An example of an existing pattern-directed system evolving through variation-selection is formed by "classifier systems".195 Here the selection is basically external, but the variation is partly internal (mutation of classifiers) and partly of a mixed type (recombination of classifiers, in which part of one module, or classifier, is recombined with part of another module). There is no explicit closure mechanism. Moreover, the information contained in a module is fixed, so that there is no explicit mechanism for emergence, although complex "assemblies" of modules might implicitly develop.

Let us conclude by sketching how a pattern-directed implementation of the present theory of emergence and evolution might be applied to real-world problems. The main idea would be to design a generative autopoietic system for solving complex problems. A problem, as said, can be defined as a situation of non-optimal or non-satisfactory adaptation. The problem does not need to be well-structured (that is, to have an explicit goal, initial state and domain); it suffices that the actor experiencing the problem be capable of distinguishing satisfactory solutions from non-satisfactory ones - that he be able to carry out a selection between possibilities offered to him by an autopoietic system. The task of the support system would then be to provide the user with potential solutions with a relatively high probability of success.

Therefore the system must possess some intelligence: it must use the available knowledge (even if incomplete) in an efficient way by integrating the pieces of knowledge into stable, adaptive systems or complexes, and it must adapt itself rapidly to new input from the user.

195 Wilson, 1987


Moreover, the proposed potential (or partial) solutions should be meaningful to the user, i.e. easily recognizable as satisfactory or not. Therefore, the organization of the proposed system should be transparent and motivated. This demands an advanced interface for representing complex information. Such an interface may be provided with the aid of so-called "hypermedia", such as HyperCard on the Apple Macintosh, that is, the combination of multiple media (text, graphics, sound, programming, animation...) in a non-sequential but easily accessible network format. Furthermore, the system should continuously offer advice and explanations regarding the possible evolutions of the problem-solving process.
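Before moving on to the studio project, the variation-selection dynamic over rule modules described in this section can be sketched in a few lines. The code below is a loose illustration in the spirit of the classifier systems cited at footnote 195 (Wilson, 1987), not an implementation of them: a small population of rule-like bit strings is mutated and recombined, and an external criterion selects the fitter variants. The bit-string encoding, the target string and the population parameters are arbitrary assumptions.

```python
import random

TARGET = "10110"    # stands in for the external selection criterion (the "problem")

def fitness(rule):
    """External selection: how well a rule string matches the target pattern."""
    return sum(a == b for a, b in zip(rule, TARGET))

def mutate(rule, rate=0.1):
    """Internal variation: flip individual symbols at random."""
    return "".join(c if random.random() > rate else random.choice("01") for c in rule)

def recombine(a, b):
    """Mixed-type variation: part of one module is recombined with part of another."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def evolve(pop_size=20, generations=30):
    pop = ["".join(random.choice("01") for _ in TARGET) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                   # keep the fitter half
        children = [mutate(recombine(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

if __name__ == "__main__":
    print(evolve())   # converges towards TARGET
```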

3.10 THE PROJECT_BIG IDEA
PART 1: Connector
The objective of the first part of the studio was to design a connector. The connector was to be process-driven rather than product-driven, because an area or point of application was never defined at the beginning. We bore in mind that it would be used in a system, probably a shelter or canopy system, but the absence of a specific site of application inspired a universal connector for my project. Since the project connector had to borrow ideas learned from nature, and be practically useful and applicable in a real-world scenario, I looked to nature for ideas. Dragonflies proved to be a good field of study and experimentation. The natural vocabulary of the project echoed the need for flexibility, and in nature I found flexibility. Dragonflies have a segmented body (which allows the insect to bend flexibly along its body length); an exceptional wing and flight pattern unlike any other insect's (the wings sway up, down, forwards and backwards on two axes); a high level of sensory acuity; each wing operated by a separate muscle; and such a high level of control and stamina that they can even attack their prey while in flight. Most importantly, they keep a firm grip with their pairs of legs under all circumstances (hunting, stormy weather, etc.), qualities a good connector should possess. The lessons learned were applied to design a sensory connector that gives when the system is under stress or overload, controls movement and sway during high wind speeds, locks automatically under high stress and warns users through a colour-coded system, and stretches inwards and outwards to accommodate all sorts of movements; thus it has one length before use and a different one during usage. The following are images of the study:

Fig.47: Dragonfly wings, swinging effect and directions

196 Retrieved from http://en.wikipedia.org/wiki/Insect_wing, 12/2/14


Fig.48: (left) Dragonfly grip, in-flight stamina amidst strong winds and rain
Fig.49: (right) Dragonfly, body structure and details

Fig.50: (top left) Connector, Harold Woods, interior spring structure and detail, 2014
Fig.51: (top right) Dragonfly, Harold Woods, inside the spring, 2014

Fig.52: (top left) Connector, Harold Woods, skin, 2014
Fig.53: (top right) Connector sketches, Harold Woods, connector hooking system based on dragonfly legs



Fig.54: (top) Connector, Harold Woods, exterior/skin/insulation

PART 2 & 3: SURFACES & STRUCTURE
After the connectors, the second part was surfaces. As a class, we had a general idea of the brief but experimented individually and separately at first, through the study of leaf surfaces and patterns, advanced origami paper-folding techniques, and so on. The experimentation with leaf surfaces is what gave rise to the surfaces of the canopy, and the results were tested on a structural system developed by the class as a group. Excerpts are catalogued below:
Fig.55: Surface; catalogue of some leaf studies, Sarah Winkler, 2014



Fig.56: Surface; catalogue of some origami studies, Sarah Winkler, 2014



Fig.55: Structure; simulation, Dario Sanchez, 2014

Fig.56: Structure; simulation render, David Romero, 2014



3.11 Conclusion
We have seen that the concepts of autopoiesis and metabolism have each been explicitly identified with the origin of life. One or the other of these concepts, each of which focuses on the self-organization of the bodily fabric, should be recognized as the most fundamental feature of life. For without a self-maintaining bodily organism or system, however simple it may be, the other vital phenomena (growth, development, responsiveness, adaptation, reproduction, and evolution) would cease to emerge. But we have seen also that the two concepts (autopoiesis and metabolism) have different implications for the definition of life. In particular, reproduction and evolution are conceptually secondary to autopoiesis, but are typically listed on the same level as metabolism in non-autopoietic definitions of life. Accordingly, autopoietic theory implies substantive biological hypotheses that are not taken seriously, if they are considered at all, by most biologists. Specifically, it allows the possibility that the very earliest living things were incapable of reproduction, and therefore incapable of evolution too. The autopoietic approach is unusual also in its choice of theoretical vocabulary for describing behaviour. Maturana and Varela use autopoietic arguments to criticize several of the concepts familiar within cognitive science. Although admitting that they can be useful metaphors, they also see them as potentially misleading. They prefer to speak in literal terms of intimately coupled dynamical systems, connected in a continuous process of mutual perturbation. To take their philosophy of autopoiesis seriously, then, would be to undermine many concepts and theories familiar within cognitive science. In other words, the arguments given above to show that a simulation of metabolism is not really alive are paralleled by arguments showing that the type of autopoiesis characterizing some simulations is not the type of autopoiesis required for life. That is not to say that Maturana and Varela deny the possibility of any conceivable sort of artificial life. On the contrary, they explicitly agree that one could in principle "design" and "make" a living system.197 They further remark that we may, unwittingly, already have done so. In saying this, however, they are not thinking of virtual creatures in computer memory, but of self-maintaining biochemical systems. Nor are they thinking of some marvel of nanotechnology, whereby individual bio-molecules are directly arranged by hand. Rather, they have in mind the human scientist's "creating the conditions under which autopoietic biochemical systems form themselves".198 If, for instance, nanotechnology were necessary to produce a particular biochemical system, even one that was thereafter self-sustaining, that system would not count as autopoietic. But suppose that a nanotechnologist deliberately aided the construction of a self-maintaining system by making chemical changes that could, indeed would, eventually have happened naturally. In that case, I would want to say that the resulting system was an autopoietic one, even though its self-construction had been aided by human intervention. Maturana and Varela, however, might view this intervention as adulterating.

197 Ibid, p. 114
198 Zeleny, 1977, p. 27


CHAPTER 4 | METAPHYSICS AND COMPUTATION
Karl Chu Studio_

“All is algorithm!” Gregory Chaitin

1.1 Inception

Karl Chu describes the world in which we live as a series of transitions from one phase to another, where each fundamental shift is as natural as brushing your teeth. Not only are these 'phase changes' important in our world; they are to be expected. The Universal Turing Machine was only the beginning of a fundamental phase change into the information age, although this machine was years ahead of what we would consider a 'computer'. Yet throughout his essay, Chu stresses the idea that it is these forward thinkers who push us in new and almost unpredictable directions. Today, forward-thinking scientists and researchers are pushing the envelope in their respective fields. We now have cloud computing and teleconferencing and have mapped the human genome, where only decades before these ideas were mere fantasy. In nearly all aspects of our lives, we are headed towards a world that can quickly adapt to a wide variety of inputs and to our rapidly changing selves; no longer must we be flexible with the world we live in. Chu, however, makes a noted exception of architecture. As architects, we design buildings in a static fashion, built to a series of specifications and then left to sit. This has begun to change recently, with the production of 'smart' buildings that can adapt to various situations, such as climate conditions. However, Chu pushes the idea of a 'Genetic Architecture' that behaves more like a biological system. This type of architecture would be able to analyze, act and reproduce without external inputs or programming. In such a world, one starts to wonder at the future role of architects: what does design mean for a building that [re]designs itself? What happens when architecture can act on its own? Intelligent architecture. Machines. Does the architect of the future equate to a computer programmer of the present? Does he or she exist at all? While Chu's hypothesis may seem inconceivable right now, it may be prudent to look to the Universal Turing Machine or the visionaries behind the internet to begin to appreciate the validity of what he is saying. This chapter discusses the idea of genetic computation and, by extension, the possibility of genetic architecture. This can be seen as a form of self-generative computation that carries within itself, at each successive step, the information needed to regenerate and continue the process. As a basis for this discussion, the Universal Turing Machine is presented as the linkage between the physical world and computable mathematical operations. According to this view, all physical processes can essentially be broken down and described computationally. Recent technological developments in bioscience and digital computation have brought us to a point of convergence that may fundamentally alter our lives, a "new kind of biomachinic mutation of organic and inorganic substances."


Two architectural responses are presented that attempt to address these issues: the morphodynamical and the morphogenetic. The morphodynamical approach utilizes higher-order computation while staying within traditional architectural patterns, whereas the morphogenetic can be said to move beyond this and use digital computation to create an architecture that carries within it the framework of its own self-generation. Chu emphasises over and over again the idea of Monadology, an irreducible physical entity or concept, as a means of describing the fundamental building blocks of such a system. A morphogenetic architectural system, as I understand it, would consist of this monadal unit inscribed with a code that allows for its replication and that provides guidelines for its utilization. Overall, I find the idea of genetic architecture interesting, but I wonder what the effect of time would be on these systems. As is the case with all other forms of matter, computational tools degrade with age... How does this factor into a morphogenetic system? I also question the idea that all physical processes might be reduced to a computational form. I think much of what is good in our lives and environment is the result of whim, accident and human nature, and should be left as such. When Chu mentions the quest for the 'Universal Language', it feels as if there are consequences, and several things that are going to be lost and forgotten. As the genetic-evolutionary concept of life and architecture progresses, I feel that it is going to create an even larger socio-economic gap between cultures spread throughout the world. As we develop into architects and designers, I think it is important to reflect on and react to the principal meaning of being an architect: to fill the human need for spaces based on the interactions of people from one culture to the next. He relays the message that technology, networking and genetics have developed so much and so fast, but that the 'universe' has still not explored its limits. Genetic architecture is an exciting, promising and highly conceptual field that suggests we can bridge the gap between biology, artificial intelligence and architecture. Chu sees great potential for architecture to radically evolve along with its inhabitants and designers. He views genetic architecture as an extension of the human being, an extension of human genes combined with the power of technology, which he names the post-human. Genetic architecture would consist of self-assessing, self-healing and self-generating systems that are formed into spaces, designs and buildings. Perhaps genetic buildings could morph, react and adapt to their inhabitants by sensing the moods or health of their occupants and acting accordingly. Such buildings would be more than off-grid and sustainable; they would reach a much higher standard, which for now is unknown. While all this development and progression in building technology is advancing, I can't help but be a little cynical about this type of architectural revolution.
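Since the chapter leans on the Universal Turing Machine as the hinge between physical processes and computation, a minimal sketch may help fix the idea. The rule table below (a unary increment, chosen arbitrarily for illustration) and the function names are my own assumptions; nothing here is taken from Chu's formulation.

```python
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=100):
    """A bare-bones Turing machine: rules maps (state, symbol) to
    (new_state, symbol_to_write, head_move)."""
    cells = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, cells[head], move = rules[(state, symbol)]
        head += move
    return "".join(cells[i] for i in sorted(cells))

if __name__ == "__main__":
    # Walk right over the 1s, append one more 1, then halt: unary "n + 1".
    rules = {
        ("start", "1"): ("start", "1", +1),
        ("start", "_"): ("halt", "1", 0),
    }
    print(run_turing_machine(rules, "111"))   # -> "1111"
```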



1.2 The Metaphysics of Architecture

The term metaphysics means that which is beyond the physical. Many parts of human existence can be considered metaphysical: thoughts, feelings, memories, dreams, ideas, or anything else that goes beyond the physical world we live in. Humans have dealt with these intangible elements of life since the beginnings of consciousness. Many philosophers, including Martin Heidegger, were concerned with the metaphysical because it is a fundamental part of human beings and their reality. And it is this reality, or physical world, that interests architects; after all, it is what they work with. The physical world, and architecture as a part of it, provokes metaphysical reactions in the individual, such as feelings, memories and thinking. Thus, building becomes a very important matter in our existence and for our experience of the world. Heidegger wrote that to be a human being means to be on the Earth as a mortal; it means to dwell. He then adds to the explanation of the word bauen (building): if we listen to what language says in the word bauen we hear three things: a) building is really dwelling; b) dwelling is the manner in which mortals are on the Earth; c) building as dwelling unfolds into the building that cultivates growing things and the building that erects buildings.199 Thus, building is critical for human existence. Since the destiny of humanity is to dwell on the Earth, and since the way we interact with the planet is by constructing, any structure we erect is an expression of dwelling. Related to building as an expression is the understanding of the built object. How we express the way we dwell through a constructed entity, or built environment in this particular case, becomes part of the job of the designer. What the architect is able to achieve in his projects is what others will perceive. The architecture of Peter Zumthor is the object of this study. His thoughts and ideas are expressed in his buildings; they produce a metaphysical experience. By designing spaces that enhance the natural and the real world, Zumthor makes the users participate in a relation with the environment. He produces an effect in the person; he makes them wonder about the most basic components of life. He writes of architecture: Architecture has its own realm. It has a special physical relationship with life. I do not think of it primarily as either a message or a symbol, but as an envelope and background for life, which goes on, in and around it, a sensitive container for the rhythm of footsteps on the floor, for the concentration of work, for the silence of sleep.200 There is poetry behind these words. They are not just words that describe features of a particular edifice, but words that describe a sensitivity for what lies beneath the real world. When architecture becomes just an envelope and lets all the other components of human existence become more important, the metaphysical appears. Architecture is no longer a building; it is now the container of poetry, thoughts and dreams.

199 Heidegger, Martin: Poetry, Language and Thought, New York: Harper and Row, 1975.
200 Ibid.


Peter Zumthor achieves this in his designs, and what follows is how he does it. David describes his experience as he walked into one of the buildings designed by Zumthor: It was like entering a very different place. As soon as you step in, you can smell the wood, hear the music, see the different tonalities of light... it was amazing. It is through the interaction between the human body-mind and the physical elements of a building that the metaphysical experience is produced. Thus, what the designer does is to manipulate the effect that an intended object has on humans. In the case of the built environment, it has been the architect who shapes that effect. The intention of the creator becomes very important, since it is his idea that will guide the design of the future space to be experienced by others. In this sense, and in order to achieve reactions that go beyond the physical, Zumthor proposes: I thus appeal for a kind of architecture of common sense based on fundamentals that we still know, understand and feel. I carefully observe the concrete appearance of the world, and in my buildings I try to enhance what seems valuable, to correct what is disturbing, and to create anew what we feel is missing.201 It is reality, then, that is important in Zumthor's architecture. He tries to give back importance to the concrete appearance of the world; in other words, to enhance the natural and the real. By doing this he intends to cause an effect on people. He explains it and compares this effect with what a work of art may also produce: If a work of architecture consists of forms and contents which combine to create a strong fundamental mood that is powerful enough to affect us, it may possess the qualities of a work of art. This art has, however, nothing to do with interesting configurations or originality. It is concerned with insights and understandings, and above all with truth.202 It is the actual way in which he enhances the natural world that this essay explores: how he manipulates the components of his buildings in order to affect the human experience. In Zumthor's architecture there are six main elements that interact with each other to produce a complete metaphysical experience: the concept of archetype, nature, materials, light, the human body and the person's memory.

201 Zumthor, Peter: Thinking Architecture, Basel/Boston: Birkhäuser, 1999.
202 Ibid.


1.3 The metaphysics of space and time

In common sense, as this manifests itself in common language, one finds the earliest quasi-theory of space and time (here quasi-theory = a jungle of notions of varying trustworthiness). Looked into more closely, this quasi-theory turns out to be heavily metaphorical, and it is with this metaphorical theorization of space and time that I will find quarrel throughout this chapter. But even before this it should be conceded that metaphorization, i.e. the (implicit) likening of a thing to something else, is a philosophically inferior kind of explanation, because it tells you what something is like, not what it is. Absent any other way of explaining something - and such an absence has to be argued for - metaphors should be shunned in philosophy, however expedient and tempting they might be in the case of invisible, immaterial, non-concrete space and time. Now I take these three qualifications - invisible, immaterial, non-concrete - to be self-evident truths about space and time, or at least the default position, free of the initial burden of proof (not that I know of any alternative position in this respect). If one doubts that space and time are invisible, one might ponder what colour or shape they are; if one suspects they are material, one might ponder how much they weigh; or, finally, if one thinks they are concrete, like the ball or table I see, one might ponder how to locate them precisely in the world. One might then want to press me and say that, for instance, electromagnetic radiation ostensibly also qualifies on these three counts, so can I be more specific in defining space and time? But the differences between radiation and space/time should not be difficult to spot: radiation manifestly interacts with material objects; it has physical sources, and it is at least in principle traceable to (fairly) concrete objects, photons (one can shoot them off lasers one at a time). Prima facie (and far beyond), space and time are not like this.

1.4 SEIFERT SURFACES / TYPOLOGY

1.4.1 Topological Architecture
One interesting method of non-linear form transformation is topological architecture, and this studio was based on it for the development of proto-structures. Topology is defined as the study of intrinsic, qualitative properties of forms that are not normally affected by changes in size or shape and that remain invariant through continuous elastic deformation, such as stretching or twisting. Greg Lynn's essay (1993) on architectural curvilinearity is one of the first examples of the new topological approach to design, which moves away from the dominant deconstructivist logic of conflict and contradiction to develop a more fluid logic of connectivity, manifested by continuous, highly curvilinear surfaces. The shape of a NURBS curve or surface is controlled by manipulating the location of control points, weights and knots. NURBS make the heterogeneous yet coherent forms of topological space computationally possible. By changing the location of control points, weights and knots, any number of different curves and surfaces can be produced.
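To make the role of control points, weights and knots concrete, the following is a minimal sketch of how a single point on a NURBS curve is evaluated (Piegl and Tiller 1997 describe the full machinery). The quadratic example data at the bottom and the function names are arbitrary assumptions for illustration, not any particular CAD system's API.

```python
def basis(i, p, u, knots):
    """Cox-de Boor recursion for the B-spline basis function N_{i,p}(u)."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] != knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) * basis(i, p - 1, u, knots)
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * basis(i + 1, p - 1, u, knots))
    return left + right

def nurbs_point(u, degree, ctrl, weights, knots):
    """Rational (weighted) combination of the control points at parameter u."""
    num_x = num_y = den = 0.0
    for i, ((x, y), w) in enumerate(zip(ctrl, weights)):
        b = basis(i, degree, u, knots) * w
        num_x, num_y, den = num_x + b * x, num_y + b * y, den + b
    return num_x / den, num_y / den

if __name__ == "__main__":
    ctrl = [(0, 0), (1, 2), (3, 2), (4, 0)]   # moving a control point reshapes the curve locally
    weights = [1, 2, 2, 1]                    # heavier middle weights pull the curve towards them
    knots = [0, 0, 0, 0.5, 1, 1, 1]           # clamped quadratic knot vector
    for u in (0.0, 0.25, 0.5, 0.75, 0.99):    # u = 1 is excluded by the half-open basis convention
        print(u, nurbs_point(u, 2, ctrl, weights, knots))
```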


1.4.2 Homeomorphic figures (form transformation)
The defining element of topological architecture is its departure from the Euclidean geometry of discrete volumes represented in Cartesian space, and its extensive use of the topological, "rubber-sheet" geometry of continuous curves and surfaces, mathematically described as NURBS (Non-Uniform Rational B-Spline) curves and surfaces. In topological space, geometry is represented not by implicit equations but by parametric functions, which describe a range of possibilities.203 Scrolls, or ruled surfaces, are surfaces generated by straight lines, or rulings, governed by twisting and tapering. They stand out in the architecture of curvilinearity due to their striking shape and relative simplicity of construction.

1.4.3 Algorithmic Architecture (Scripting)
"Script" is derived from written dialogue in the performing arts, where actors are given directions to perform or interpret.204 Scripting languages are typically not technical but mathematical solutions that are defined by sets of rules and based on parameters. A scripting language is a programming language that controls a software application and is often treated as distinct from programs that execute independently of any other application. Scripting provides a number of advantages when applied as a tool for architectural design. It gives the power of recursion and allows repetitive tasks to be performed in a faster and more efficient way. Recursion provides speed when calculating large numbers of functions or operations of the same kind. It allows iterations to be automated, so that complex objects or methods can be constructed from smaller and simpler functions repeated several times.
1.4.4 A Brief History of Scripting in Architecture
The use of scripting in architecture draws on the theory of L-systems,205 in which the simulation of plant growth is experimented with and applied as a process to generate objects.206 The theory of L-systems has led to a well-established methodology for simulating the branching architecture of plants. Many current models provide insights into the mechanisms of plant development by incorporating physiological processes, and they have been an eye-opener for architects and designers to use in the form generation of buildings.
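As a pointer to what the L-system rewriting referenced above actually does, here is a minimal sketch. The axiom and rules are Lindenmayer's standard algae example (Lindenmayer, 1968), chosen for brevity rather than taken from this thesis; in architectural use the resulting string would then be interpreted as drawing or modelling commands.

```python
def lsystem(axiom, rules, generations):
    """Rewrite every symbol in parallel, once per generation."""
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

if __name__ == "__main__":
    # Lindenmayer's algae system: A -> AB, B -> A.
    print(lsystem("A", {"A": "AB", "B": "A"}, 5))   # -> "ABAABABAABAAB"
```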

203 Piegl and Tiller, 1997
204 Schnabel, 2007
205 Lindenmayer, 1968
206 Allen et al., 2004


Scripting in architecture denotes the use of algorithms to generate form and to solve, organize or explore problems computationally, using numeric data and variables. According to Terzidis (2006), the basic elements used in algorithms are constants, variables, procedures, classes and libraries, and the basic operations are arithmetical, logical, combinatorial, relational and classificatory, arranged under specific grammatical and syntactical rules. Theoretically, an algorithmic process serves as a step-by-step, sequential pattern, planned and mathematically calculated to accomplish a desired task. Its formal result cannot always be predicted and may end up better or worse than intended. In this process, the power of "accidental form generation" may lead to good, even better, solutions. The advantage of this process, however, is that the algorithm can be intelligently calculated and can serve as a pattern to understand the problem, address its possible solutions and sometimes act as a vehicle for defining new problems.
Basic Scripting Languages: MEL and MaxScript, Basic Form Transformation
Maya Embedded Language (MEL) is a scripting language used to simplify tasks in Autodesk's 3D graphics software Maya. MEL offers a method of speeding up complicated or repetitive tasks, as well as allowing users to share specific sets of useful commands with others. MaxScript is the scripting language of 3D Studio Max. It was developed to be used by artists as well as by technical directors and programmers. It provides relatively relaxed syntax rules and is in this respect similar to Maya's MEL. The internal structure of MaxScript has more similarities to LISP, and it is considered an expression-based language. Both MEL and MaxScript have embedded geometrical objects such as curves, surfaces and solids. In scripting, attributes are used to compose sub-elements, e.g. a polygon constructed from faces, edges and vertices, or a NURBS-based cube surface governed by isoparms and control points. Form transformations can be extracted through algorithmic transformations such as sequential transformations, multi-booleans, stochastic search, fractals, cellular automata and hybridization.207
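A hedged sketch of the "sequential transformation" idea listed above: a base square is repeatedly scaled and rotated, the kind of repetitive task the text says scripting automates. It is written in plain Python rather than MEL or MaxScript, and the scale and rotation values are arbitrary assumptions.

```python
import math

def transform(points, scale, angle_deg):
    """Scale and rotate a polygon about the origin."""
    a = math.radians(angle_deg)
    return [(scale * (x * math.cos(a) - y * math.sin(a)),
             scale * (x * math.sin(a) + y * math.cos(a))) for x, y in points]

def sequential_transform(base, scale, angle_deg, copies):
    """Return the base polygon plus `copies` successively transformed versions."""
    result = [base]
    for _ in range(copies):
        result.append(transform(result[-1], scale, angle_deg))
    return result

if __name__ == "__main__":
    square = [(1, 1), (-1, 1), (-1, -1), (1, -1)]
    for i, poly in enumerate(sequential_transform(square, 0.9, 15, 5)):
        print(i, [(round(x, 2), round(y, 2)) for x, y in poly])
```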

207 Terzidis, 2006


1.5 PROJECT SEI
1.5.1 TYPOLOGY: Knots & Rings
Introduction
Typological thought refers to the whole, to the manifold relationships among things, to the extreme and at the same time the harmonious. It is a way of thinking that does not refer to the age but to the place: a place at which borders and opposites melt together into an intellectual universal. Typology is the comparative study of physical or other characteristics of the built environment, sorted into distinct types. In this section, the historical transformation of the concepts of type and typology since the Enlightenment is examined in three developing stages, based on methodological and historical interpretation: the first conceptualization developed out of the rationalist philosophy of the Enlightenment, the second relates to the modernist ideology, and the last to Neo-Rationalism after the 1960s. The study aims to highlight the significance of the concepts of type and typology, which are so rich in tradition and so important for intellectual history, and which could aid in enhancing our understanding of architecture within its historical and socio-cultural contexts. A discussion of type and typology can promote a way of looking at the built environment that can not only help us recognize and discover basic types but also enhance our ability to see the differences as well as the similarities among architectural artefacts by recognizing the invisible connections between them. During the nineteenth century, a deliberate turn away from ideas of imitation and truth-to-nature towards concepts of abstraction or objectivity emerged and fundamentally altered the knowledge and practices of many disciplines. In architecture, this important shift resulted in theories of type and design methods based on typology, complementary concepts through which architecture as both a modern form of knowledge and knowledge of form was to be consolidated. In terms of architecture and its instrumentality, type and typology are unique as disciplinary frames through which broader socio-political, cultural and formal problems can be posed. To explore the sustained, or perhaps renewed, critical interest in the potential of type and typology, a number of academics and practitioners will discuss their relevance to contemporary architectural practice and research and their relationship to the problem of the historicity of disciplinary knowledge. When one thinks of how we make sense of our daily life, one can easily recognize the significance of the notion of type in understanding and clarifying the commonalities and differences between various phenomena within the immense world of existence. As Franck and Schneekloth say, "types and ways of typing are used to produce and reproduce the material world and to give meaning to our place in it". The notion of type underlies all logical inferences that help one to classify phenomena, to put them in groups based on their similarities, as well as to make distinctions between them. This act of classification enables multiplicity to turn into unity, which at the same time generates reasoning and knowledge.


The first period in which the notion of type gained its significance was the eighteenth century, also known as the Age of Enlightenment. During this period, the Enlightenment thinkers, inspired by Newton's revolution in physics, argued that systematic thinking could be applied to all forms of human activity. It is in this period that the first encyclopaedias in various disciplines were written, with the aim of classifying rational information. Some of the most important and influential writings of the Enlightenment were published during this time, including the following three main texts: the Encyclopédie (1751), edited by Denis Diderot and Jean le Rond d'Alembert and compiled by the group called the Encyclopédistes; Baron de La Brède et de Montesquieu's Esprit des lois (The Spirit of the Laws, 1748); and Jean-Jacques Rousseau's Discours sur l'inégalité (Discourse on the Origin and Foundation of Inequality Among Mankind, 1755). Within architectural discourse, the first typological approach developed out of the rationalist philosophy of the Enlightenment, as can be found with the French archaeologist and art writer Quatremère de Quincy in his Encyclopédie (1789). This corpus of work has since been influential and has become the subject of debate in the architectural discourse of the twentieth century. Within modernist architectural discourse, however, the concept of type suffered a loss of significance; in modernism the notion of type was reduced to the notion of stereotype. We then see a re-emergence of the significance of type and typology from the 1960s onwards, as reflected in the writings of Aldo Rossi, mainly The Architecture of the City (1966; English edition 1982).

1.5.2 PROTO STRUCTURES
Architecture is "the construction of possible worlds." We are entering a new era in which the current paradigm of architecture will no longer be valid or considered responsible. The new era will take advantage of planetary computation and be populated by "post-humans"; domestication will no longer be a legitimate goal of architecture, and what have up to now been perceived as clear distinctions between chaos and order will dissolve. What kind of architecture would this be? We will need a revolution in our conceptual understanding of architecture. Humanity will no longer be at the centre of our understanding of architecture. The best clues can be extrapolated from biology, genetics and mathematics. Studying topology in the architectural sense is a good start towards understanding a possible new concept of architecture. Mathematical topology is the consideration of the nature of space, investigating both its local structure and its global structure. Wolfram MathWorld states that topology is the mathematical study of properties preserved through deformations, twistings and stretchings of objects (no tearing) (http://topology.rz-a.com/). Knot theory is the study of mathematical knots and a good area of research when considering the construction of a possible world.208

208 Karl Chu, June 2014, lecture notes summary, Biodigital Master, ESARQ, UIC, Barcelona


Fig.: Inception, Harold Woods; Knots & Rings, screenshots of initial process, 2014

1.5.3 KNOT SEQUENCE
In mathematical terms, a knot is an embedding of a circle in 3-dimensional Euclidean space. Though its mathematical study began in the 19th century, knot theory is now being used to understand knotting phenomena in DNA, and the theory is central to proposals for topological quantum computers.209 Of most interest and potential to architecture are the spun 4D knots (in the third column of this page), which illustrate how volumes and spaces can be formed through a mutation of form and structure into complex dynamics... space that has the possibility to be "transformed into an interconnected, dense web of particularities and singularities better understood as substance or led space" (http://topology.rz-a.com/, quote referenced from "The Role of Mathematics in Virtual Architecture", Mathland). The standard Möbius strip has the unknot for a boundary but is not considered to be a Seifert surface for the unknot, because it is not orientable.210 The "checkerboard" colouring of the usual minimal-crossing projection of the trefoil knot gives a Möbius strip with three half-twists. As with the previous example, this is not a Seifert surface, as it is not orientable.

209 Retrieved from http://en.wikipedia.org/wiki/Knot_theory, 20/6/14
210 Retrieved from http://en.wikipedia.org/wiki/Seifert_surface, 20/6/14


Applying Seifert's algorithm to this diagram does, as expected, produce a Seifert surface; in this case it is a punctured torus of genus g = 1, with an associated Seifert matrix.
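For reference, and as an addition of mine rather than a value taken from the thesis, one common sign convention gives this genus-one Seifert surface of the trefoil the Seifert matrix

\[
V = \begin{pmatrix} -1 & 1 \\ 0 & -1 \end{pmatrix},
\]

from which the trefoil's Alexander polynomial can be recovered as \(\det(V - tV^{T}) = t^{2} - t + 1\); other sources use different sign conventions.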

Image: Knot sequence, Harold Woods, selected stages of the knot and rings experiment, 2014



Adaptation is about "interiority". In architecture, interiority refers to a building or structure that exists for itself, by itself and in itself. An object that shines on its own, without embellishment or the need to make a point, is a worthy goal in architecture: an architecture that is "laconic", elegant in its simplicity and brevity... which then allows itself to be anything. The will of the architecture is no longer the will of the architect but simply the will of the architecture.211 The proto-structure below was generated from a set of manipulated parametric equations in Rhinoceros 3D, using mathematical equations and Grasshopper. Using the Rhino Math plug-in allowed for experimentation to determine the variation that would form the surface topology, with the potential to enclose and open up to space within its innate loop. The mathematically generated curves have an embedded logic that can serve as primary structure and as the structure's own utility conduit. The continuum of the knot allows the "thing" to relate to the ground below (foundation) and to the atmosphere. The embedded structural logic of the knot curve provides the potential for secondary structure, framed openings, spatial organization and circulation.

Image: Möbius, Harold Woods; Grasshopper and Möbius strip typology experimentation, 2014

Historically, architecture is understood to sit within the "solid" regime, while the geometrical organization of biological systems is characterized by the "liquid" regime. Architecture should strive to be in the liquid regime... this is where you find an organization that borders on chaos.212 The knot diagrams were created in Rhinoceros 3D using the Rhino Math plug-in and Grasshopper. The original knot parametric equations were retrieved from http://www.mi.sanu.ac.rs/vismath/taylorapril2011/Taylor.pdf.
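As a hedged sketch of generating knot-curve points from parametric equations, analogous to what the Rhino Math/Grasshopper definitions described above do, the code below samples a standard (p, q) torus-knot parametrization. The equations and radii are textbook defaults, not the specific equations used in the project.

```python
import math

def torus_knot_points(p=2, q=3, R=4.0, r=1.5, samples=200):
    """Sample a (p, q) torus knot; p = 2, q = 3 gives a trefoil."""
    pts = []
    for i in range(samples):
        t = 2 * math.pi * i / samples
        x = (R + r * math.cos(q * t)) * math.cos(p * t)
        y = (R + r * math.cos(q * t)) * math.sin(p * t)
        z = r * math.sin(q * t)
        pts.append((x, y, z))
    return pts

if __name__ == "__main__":
    # The point list could be fed to an interpolated-curve command in a CAD tool.
    for pt in torus_knot_points()[:5]:
        print(tuple(round(c, 3) for c in pt))
```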

211 Karl Chu, June 2014, lecture notes, Biodigital Master, ESARQ, UIC, Barcelona
212 Karl Chu, June 2014, lecture notes, Biodigital Master, ESARQ, UIC, Barcelona


The target was to design a "thing", not an "object", because a "thing" has no preconceived relationships. A "thing" will be more topological: complex folding, unfolding, intertwining, knotting, challenging the limits and barriers. Understanding the difference between "object" and "thing" helps one understand how not to project one's own repressive nature onto the design. Search for a "laconic" elegance; for "being" and capturing space. The diagrams below catalogue a brief study into the potential of knots to develop a proto-structure with an implicit and cohesive structural logic embedded in the surface. Internal logic gives rise to beauty and harmony. There are four regimes of matter: (1) solid, (2) liquid, (3) gaseous and (4) plasma.



Images: (above) Proto-Structures, Harold Woods, a selection of some "possible worlds" (1-4 of thousands) generated through a mutation/evolution process, 2014



1.6 CONCLUSION

To a great extent, evolutionary design theories and the analogy drawn from Deleuze have evoked a paradigmatic shift in the design process, moving away from traditional pen-and-paper-based architecture. Though curvilinearity can be seen in the work of Baroque architects, it is the impact of the application of digital technology that defines today's unique style of architecture. Today, as architecture enters the realm of the digital age, the use of digital technology is becoming mature. A number of digital avant-garde architects are emerging: non-Euclidean, digitally driven, topologically and curvilinearly minded thinkers. While the inquiry and exploration are still very dynamic, the use of these methodologies and emerging computational design tools to generate different forms is very useful. Complex forms and difficult spatial programming, which are often a hindrance to architects in the early stage of design, will become embodied in computer systems and will be essential to the design process in the future. With the advancement of technology, architects can breed new forms: forms that are not just for representation but that integrate the underlying logic of architecture, namely aesthetics, function and strength. Classifications are human constructs necessary to understand and clarify the commonalities and differences between various phenomena. Although one cannot disagree about the extent to which they are helpful to us, the proposed categories can sometimes become strict boundaries that limit our understanding. Similarly, one of the main criticisms of the notion of type and typology relates to the danger of type turning into stereotype. According to Ungers, for example, the form-follows-function slogan led, at the cost of architecture, to an all-discriminating pragmatism, the oppressive phenomenon of empirical optimism.213 De Carlo also based his criticism of the notion of type on the description of the stereotype as the rigid type that is repeated or reproduced without any variations, according to a pre-established conception, and bearing no distinctive signs or individual qualities.214 Furthermore, De Carlo suggested that types have stiffened to the point of giving the impression that the invention of alternatives is useless, as types do not accept variations, additions or alterations. Typology as such does not and cannot incorporate user participation, and therefore it is antithetical to participation. There are critics who recognize the contributions of the typological approach and believe in continuous debate, so that the understanding of the notion of type can flourish. Gregotti and Reichlin, for example, accept and support the recent use of the notion of type, as well as the focused attention on typology, but criticize their refusal of the significance of function altogether. Gregotti describes this understanding of type as becoming a "stone-hard value of laws independent from any heteronymous situation".215 He suggests that this separation from the particular and the individual reduces architectural design's capacity to find in reality the necessary confrontation and ideas.
Instead, Gregotti directs attention to the organic relation between the functions, the necessity of the project, the reason for an idea, and the construction process.216 Reichlin, furthermore, emphasizes the fact that an architectural work is a structurally complex material, at the same time a tool subject to factual and cultural use, and a plastic and spatial artefact that is the object of a symbolic and aesthetic fruition.

213 Ungers, O. M., "Ten Opinions on the Type", Casabella, 509-510: 93-95, 1985.
214 De Carlo, G., "Notes on the Uncontrollable Ascent of Typology", Casabella, 509-510: 46-52, 1985.
215 Gregotti, V., "The Grounds of Typology", Casabella, 509-510: 4-8, 1985.
216 Ibid.



He questions how many of these dimensions have been considered in typological approaches, and whether they are considered as a system or not. Reichlin criticizes the application of the typological approach in design schools as having problems similar to those of the inductive method. He also cautions us against losing structural and functional attention to the architectural object, and against architecture becoming a repetition of models. Oriol Bohigas is another contemporary theoretician who, on the one hand, recognizes type as "one of the enlivening elements" of recent architectural debate and, on the other hand, criticizes the instrumentalization of type: type conceived as a means of supplying certain final-model forms.217 According to Bohigas, the instrumentalization of type has caused a crisis in the historical process of modern architecture. The use of 'type' as a tool in the design process, similar to what Gregotti says, has led to a 'typification of the type', that is, the tendency to discourage the emergence of new formal structures because of the belief that historically formulated types could provide the answers to new functions and production systems. Moreover, according to Bohigas, this attitude created the appearance of a "formal frozen repertory" that is very easy to repeat exactly as it is, without any new cultural value. Instead, he offers the idea of type as the first hypothesis in the design process. We then need to recognize the real structure of the historical experience, not just its stylistic appearance. This historical experience needs to be examined, and this can only be of value through typology. His approach is based on having a hypothesis, testing the fitness of this hypothesis against the concrete facts of the scheme, and continuously re-proposing a new hypothesis until that fit is obtained. The constructive aspects of type, as well as the vitality of typological thinking for creative thought in general, are well recognized within the architectural community. As Reichlin summarizes, "the idea of type promotes a census of knowledge, a re-ordering of experience around the discipline of architecture, and, consequently, a re-conquest of intelligibility".218 However, typological thinking should not be condemned to be only a practical tool used for the development of types, basic patterns or concepts. Typological thinking "defines a way of thinking in basic all-encompassing contingencies, of having a universal view of the world of ideas, as well as that of reality".219 In other words, typological thinking might facilitate a way of looking at life that promotes thinking in transformations, a way of thinking that combines opposites in a morphological continuum. To conclude, typological thoughts and actions presuppose two things: first, the ability to recognize and discover basic types; secondly, the ability to see things in complementary relationships. As Ungers suggests, "thinking of manifold possibilities corresponds to thinking in morphological transformations of things and states, be they the material of nature or culture".220 This way of looking might in fact be instrumental in creating a more appreciative, grateful and sensible way of seeing differences, by putting them in a continuum and recognizing the invisible connections between them, not only within the architectural discourse but in all aspects of life in general.

217 Bohigas, O., "Ten Opinions on the Type" Casabella, 509-510: 93, 1985.
218 Reichlin, B., "Type and Tradition of the Modern" Casabella, 509-510: 32-39, 1985.
219 Ungers, O. M., "Ten Opinions on the Type" Casabella, 509-510: 93-95, 1985.
220 Ibid.




REFERENCES

Prusinkiewicz, P. and Hanan, J. Lindenmayer Systems, Fractals, and Plants. New York: Springer-Verlag, 1989.
Stevens, R. T. Fractal Programming in C. New York: Holt, 1989.
Wagon, S. "Recursion via String Rewriting." §6.2 in Mathematica in Action. New York: W. H. Freeman, pp. 190-196, 1991.
Madan, M. L. (2005). Animal biotechnology: applications and economic implications in developing countries. Rev. Sci. Tech. 24(1): 127-139.
Mattick, J. S. (1994). Introns: evolution and function. Current Opinion in Genetics and Development 4: 823-831.
McEvoy, T. G., Alink, F. M., Moreira, V. C., Watt, R. G. & Powell, K. A. (2006). Embryo technologies and animal health - consequences for the animal following ovum pick-up, in vitro embryo production and somatic cell nuclear transfer. Theriogenology 65(5): 926-942.
Alexander, Christopher. Notes on the Synthesis of Form. Cambridge: Harvard University Press, 1964, p. 15.
Lethaby, William. Architecture: An Introduction to the History of the Art of Building. T. Butterworth Ltd (rev. ed.), 1911, p. 25.
Luhmann, N. (1982). The world society as a social system. Int. J. Gen. Syst. 8, 131-138.
Luhmann, N. (1986). The autopoiesis of social systems. In Geyer, F., and van der Zouwen, J. (eds.), Sociocybernetic Paradoxes, Sage, London.
Maturana, H. M. (1970). Biology of Cognition, Biol. Comp. Lab. Res. Rep. 9.0, University of Illinois, Urbana, pp. 75-102. [Reprinted in Maturana, H. M., and Varela, F. G. (eds.) (1980). Autopoiesis and Cognition: The Realization of the Living, Reidel, Dordrecht. Part of this report is reproduced as "Neurophysiology of cognition" in Garvin, P. (ed.) (1970). Cognition: A Multiple View, Spartan Books, Washington, D.C.]
Maturana, H. M. (1974). Cognitive strategies. In Von Foerster, H. (ed.), Cybernetics of Cybernetics, Biological Computer Laboratory, University of Illinois, Urbana.
Maturana, H. M. (1975a). The organization of the living: A theory of the living organization. Int. J. Man-Machine Stud. 7, 313-332.
Maturana, H. M. (1975b). Communication and representation functions. In Piaget, J. (ed.), Encyclopédie de la Pléiade, Gallimard, Paris.


Maturana, H. M. (1978). Biology of language: The epistemology of reality. In Miller, G., and Lenneberg, E. (eds.), Psychology and Biology of Language and Thought: Essays in Honour of Eric Lenneberg, Academic Press, New York.
Maturana, H. M. (1980a). Autopoiesis: Reproduction, heredity and evolution. In Zeleny, M. (ed.), Autopoiesis, Dissipative Structures and Spontaneous Social Orders, AAAS Selected Symposium 55, Westview Press, Boulder, Colo.
Maturana, H. M. (1980b). Man and society. In Benseler, F., Hejl, P., and Kock, W. (eds.), Autopoietic Systems in the Social Sciences, Campus Verlag, Frankfurt.
Maturana, H. M. (1981). Autopoiesis. In Zeleny, M. (ed.), Autopoiesis: A Theory of Living Organization, Elsevier North-Holland, New York.
Maturana, H. M., and Varela, F. G. (1975). Autopoietic Systems, Biol. Comp. Lab. Res. Rep. 9.4, University of Illinois, Urbana. [Reprinted in Maturana, H. M., and Varela, F. G. (eds.) (1980). Autopoiesis and Cognition: The Realization of the Living, Reidel, Dordrecht.]
Maturana, H. M., and Varela, F. G. (1980). Autopoiesis and Cognition: The Realization of the Living, Reidel, Dordrecht.
Maturana, H. M., and Varela, F. G. (1987). The Tree of Knowledge, Shambhala, Boston.
Mead, C. (1989). Analog VLSI and Neural Systems, Addison-Wesley, New York.
Miller, J. G. (1978). Living Systems, McGraw-Hill, New York.
Selfridge, O. (1959). Pandemonium: A paradigm for learning. In Blake, D. V., & Uttley, A. M. (eds.), Proceedings of the Symposium on Mechanization of Thought Processes, pp. 511-529. London: H. M. Stationery Office.
Teubner, G. (forthcoming). The evolution of legal systems. Paper given at the British Academy, April 1999. To appear in Wheeler, M., & Ziman, J. (eds.), The Evolution of Cultural Artefacts (provisional title).
Thelen, E., & Smith, L. B. (1993). A Dynamic Systems Approach to the Development of Cognition and Action. Cambridge, Mass.: MIT Press.
Uexküll, J. von (1957). A stroll through the worlds of animals and men. In Schiller, C. H. (ed.), Instinctive Behavior: The Development of a Modern Concept, pp. 5-82. New York: International Universities Press.
Varela, F. G., Maturana, H. R., & Uribe, R. B. (1974). Autopoiesis: The organisation of living systems, its characterisation and a model. Biosystems 5(4), 187-196.
Varela, F. G., Thompson, E., & Rosch, E. (1991). The Embodied Mind: Cognitive Science and Human Experience. Cambridge, Mass.: MIT Press.
Wheeler, M. (1996). From robots to Rothko. In Boden, M. A. (ed.), The Philosophy of Artificial Life, pp. 209-236. Oxford: Oxford University Press.


Zeleny, M. (1977). Self-organisation of living systems: A formal model of autopoiesis. International Journal of General Systems, 4, 13-22.
Chu, K. S. (2002). The Unconscious Destiny of Capital (Architecture in Vitro/Machinic in Vivo). In: Leach, N. et al. (eds.) Designing for a Digital World. London, Wiley & Sons. pp. 127-133.
Chu, K. S. (2005). Metaphysics of genetic architecture and computation. In: Estévez, A. T., Truco, J., and Felipe, S. (eds.) Genetic Architectures II. Barcelona, Sites Books / ESARQ (UIC). pp. 158-180.
Tierney, T. (2006). Collective Cognition: Neural Fabrics and Social Software. In: Collective Intelligence in Design. Architectural Design, Vol 76, No 5. West Sussex, England, John Wiley & Sons. pp. 36-45.
Weinstock, M. (2005). On self-organisation: Stress driven form-finding in architecture. In: Estévez, A. T., et al. (eds.) Genetic Architectures II. Barcelona, Sites Books / ESARQ (UIC). pp. 100-106.
Weinstock, M. (2010). The Architecture of Emergence: The Evolution of Form in Nature and Civilisation. West Sussex, England, John Wiley & Sons.
Williams, J. K. C. and Kontovourkis, O. (2008). Practical emergence. In: Littlefield, D. (ed.) Space Craft: Developments in Architectural Computing. London, RIBA Publishing. pp. 68-81.
Wolfram, S. (2006). How Do Simple Programs Behave? In: Programming Cultures. Architectural Design, Vol 76, No 4. West Sussex, England, John Wiley & Sons. pp. 34-37.
Zellner, P. (2008). De-Tooling. In: Architecture and Urbanism, No 455. Tokyo, A+U Publishing Co. Ltd. pp. 130-135.
Zellner, P. (2009). Pretensions of Form: A Conversation. In: Observations on Architecture and the Contemporary City. LOG, No 17. New York, Anyone Corporation. pp. 63-76.
Hartz, G., and Cover, J. (1988). Space and Time in the Leibnizian Metaphysics. Noûs, Vol. 22, No. 4, pp. 493-519.
Heidegger, M. (1927). Being and Time. Trans. Macquarrie, J., and Robinson, E. (2001). Oxford: Blackwell Publishers Ltd.
Husserl, E. (1973). Ding und Raum: Vorlesungen 1907. The Hague: Martinus Nijhoff.
McGuire, J. E., and Slowik, E. (2012). Newton's Ontology of Omnipresence and Infinite Space. In: Oxford Studies in Early Modern Philosophy, Vol. 6, pp. 280-308.
Findlay, J. (1975). "Husserl's Analysis of the Inner Time-Consciousness". The Monist, Vol. 59, No. 1, pp. 3-20.
Laudan, L. (1981). A Confutation of Convergent Realism. Philosophy of Science, Vol. 48, No. 1, pp. 19-49.
Heidegger, Martin (1975). Poetry, Language and Thought. New York: Harper and Row.


Heschong, Lisa (1979). Thermal Delight in Architecture. Cambridge, Mass.: MIT Press.
Zumthor, Peter (1999). Thinking Architecture. Basel: Birkhäuser.
Rynasiewicz, R. (2012). "Newton's Views on Space, Time, and Motion". The Stanford Encyclopedia of Philosophy (Winter 2012 Edition), Edward N. Zalta (ed.), URL = http://plato.stanford.edu/archives/win2012/entries/newton-stm/


