BlueSci Issue 20 - Lent 2011


The Cambridge University science magazine
www.bluesci.co.uk

ISSN 1748-6920

Lent 2011 Issue 20

FOCUS Life in the Universe

Test Tube Babies . Space Elevator . Music Therapy . Einstein’s Life . Science of Significance . Triangulation of India


You have online access to Cambridge Journals, from your desk, your library or your mobile device. Thanks to an agreement with the Cambridge University Library, all staff and students of the University of Cambridge have online access to over 250 peer reviewed academic journals and over 180 journal archives published by Cambridge University Press.

To access Cambridge Journals please visit:

journals.cambridge.org



Lent 2011 Issue 20

Contents

Features
6   Cell Talk
    Rhea Chatterjea explores the medical frontiers of gap junction research
8   The Blue Screen of Death
    Wing Yung Chan traces the imperfect path to the perfect program
10  The Science of Significance
    Annabelle Painter shows how the amygdala may be key to culture, spirituality and identity
12  Eradicating Rinderpest
    Paul Simpson looks at the history of a quietly devastating disease
14  Climbing Space
    Mark Nicholson discusses the science behind the fiction of the space elevator
16  FOCUS: Life Will Find a Way
    BlueSci explores the past, present and future of the search for life in the Universe

Regulars
3   On the Cover
4   News
5   Book Reviews
22  Behind the Science
    Ian Fyfe uncovers the personal life of Albert Einstein
24  Perspective
    Sara Lejon gives her perspective on Nobel prize winning in vitro fertilisation technology
26  Arts and Reviews
    Lindsey Nield discovers the hidden power of music
28  History
    Tim Middleton explores how India was mapped and the world’s tallest mountain was named
30  Technology
    Tom Ash looks into computer systems that can receive commands directly from the brain
31  Away from the Bench
    Rosie Robison recounts her experience at the Parliamentary Office of Science and Technology
32  Weird and Wonderful

About Us...

BlueSci was established in 2004 to provide a student forum for science communication. As the longest running science magazine in Cambridge, BlueSci publishes the best science writing from across the University each term. We combine high quality writing with stunning images to provide fascinating yet accessible science to everyone. But BlueSci does not stop there. At www.bluesci.co.uk, we have extra articles, regular news stories and science films to inform and entertain between print issues. Produced entirely by students of the University, the diversity of expertise and talent combine to produce a unique science experience.

Committee
President: Tim Middleton (president@bluesci.co.uk)
Managing Editor: Stephanie Glaser (managing-editor@bluesci.co.uk)
Secretary: Jessica Robinson (enquiries@bluesci.co.uk)
Treasurer: Wendy Mak (membership@bluesci.co.uk)
Film Manager: Sita Dinanauth (film@bluesci.co.uk)
Webmaster: Joshua Keeler (webmaster@bluesci.co.uk)
Advertising Manager: Richard Thomson (advertising@bluesci.co.uk)
Publicity Officer: Helen Gaffney (submissions@bluesci.co.uk)
News: news@bluesci.co.uk
Submissions: submissions@bluesci.co.uk




Issue 20: Lent 2011
Editor: Taylor Burns
Managing Editor: Stephanie Glaser
Business Manager: Michael Derringer
Second Editors: Amy Beeken, Robert Jones, Jonathan Lam, Luke Maishman, Kirsten Purcell, Ilia Rushkin, Sandra Schneider, Nicola Stead, Raliza Stoyanova, Vivek Thacker
News Editor: Imogen Ogilvie
News Team: Robert Jones, Ayesha Sengupta, Katy Wei
Book Reviews: Catherine Moir, Anders Aufderhorst-Roberts, Nicola Stead
Focus Editors: Wing Ying Chow and Natalie Lawrence
Focus Team: Yvonne Collins, Letizia Diamante, Amelia Penny
Weird & Wonderful Editor: Nicola Stead
Weird & Wonderful Team: Thomas Gizbert, Tim Middleton, Helen Parker
Pictures Editor: Wendy Mak
Pictures Team: Wing Ying Chow, Yvonne Collins, Amelia Penny, Jessica Robinson, Paul Simpson
Production Team: Wing Ying Chow, Alex Hyatt, Tim Middleton, Kirsten Purcell, Sandra Schneider
Cartoonist: Alex Hahn
Cover Image: Tim Middleton

ISSN 1748-6920

Varsity Publications Ltd
Old Examination Hall, Free School Lane
Cambridge, CB2 3RF
Tel: 01223 337575
www.varsity.co.uk
business@varsity.co.uk

BlueSci is published by Varsity Publications Ltd and printed by The Burlington Press. All copyright is the exclusive property of Varsity Publications Ltd. No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, without the prior permission of the publisher.

Editorial

Celebrating Issue 20

Success seems to be measured in tens.

20 issues, 800 years, 1000 signatures—all milestones, all arbitrary. So at moments like these, when commemoration is long due yet trivially assigned, how does one measure achievement? It’s here (in my North American-ness) that I take a cue from Frank Capra and say that success is defined by community. And community, if anything, is what BlueSci has built. Starting in 2004 as a small extra-curricular project amongst a handful of friends, the publication has blossomed into a Cambridge institution. In the two years I’ve spent with the magazine, every issue has surpassed previous records in contributions, volunteers and submissions. I’ve seen our traditional venues replaced as meeting numbers grew well beyond fire safety limits. From graphic design to Nature, from national newspapers to high profile scientific awareness campaigns, BlueSci alumni are building their presence in public science. And, in recent years, BlueSci has inspired sister publications at other universities in the UK. If community is our metric, then commemoration is certainly due.

Such surging interest in science writing is a positive in a year marked by negatives in the UK. Fuelled by debt, politicians have questioned the ‘use’ of certain sciences, trying to place an immediate value on a pursuit that is defined longitudinally, often with initial underappreciation (e.g. Tim Berners-Lee’s “vague but exciting” proposal for the World Wide Web, which certainly would not have wooed many policy-makers, or Max Born winning the Nobel Prize in physics for a footnote). Assessing value in science, though necessary, is complex. Sometimes this doesn’t translate. It’s the role of this and other organisations to translate.

It’s perhaps fitting, then, that we’ve considered many ‘big’ ideas in the 20th issue. What’s the possible future of space exploration? What do we actually know about life in the Universe? How are art and technology interacting with the foundations of our minds? These are some of the many questions we, in small part, address. Built on the backs of extraordinary volunteers, BlueSci is a sample of the best science writing from across Cambridge, one of the richest scientific environments in the world. It stands to inspire an understanding of science, create an awareness of its importance within culture, and, most of all, to enlighten. If we’ve captured a mere shimmer of the wonder, imagination and artistry of the scientific pursuit, then we’ve done our job.

Taylor Burns

Issue 20 Editor


The first issue of BlueSci, Michaelmas 2004


Of Minerals and Meteorites

Richard Thomson looks into the story behind this issue’s cover image


Meteorites have fascinated humans throughout the ages. They still have an odd ability to amaze us, but, more than that, they actually give us insight into the birth of our solar system. Many of the meteorites we see today were created at the same time as the Sun, Earth and other seven planets (eight if you can’t bear to let Pluto go) were formed from the accretion disc of gas and dust that condensed to form the early solar system. They keep a record of their rich history of geological processes and are therefore a portal into time as well as space. There are a number of classes of meteorite, but the vast majority recovered on Earth are known as chondrites, owing to the presence of spherical mineral inclusions known as chondrules. These inclusions are spherical because they formed in zero gravity, melting and reforming during their elliptical orbit around part of the accretion disc of a newborn star. The chondrite class is a subclass of the stony meteorites formed from the early solar system. They show no evidence of modification due to melting or differentiation of the parent body from which they fragmented (which, in the case of chondrites, is commonly an asteroid). These meteorites can be dated by geologists by analysing the radioactive isotopes of elements with known half-lives and comparing them to their respective daughter isotopes. This leads to the conclusion that our solar system is 4.6 billion years old (give or take a few million years, but who’s counting?).

The second subclass of stony meteorites is known as the achondrites and, as the name suggests, these are free from chondrules. They are derived from parent structures that have undergone serious geological processes since formation, such as planets or other bodies such as the Moon. Although these cannot give any information about the origin of our solar system, they are an interesting tool for research into extraterrestrial activity, and the presence of chiral amino acids on these meteorites has strengthened the argument for life outside our planet.

The cover image is a thin section of the chondritic Bjurbole meteorite viewed under an optical microscope through crossed polars, giving an array of birefringent colours from the various minerals present. The thin section shows a chondrule primarily composed of the silicate minerals olivine, clinopyroxene and orthopyroxene, which is very common. These minerals are held in a very fine-grained matrix which consists largely of iron and nickel as well as finer-grained dust. The meteorite belongs to Albert Galy and is used in the second-year teaching labs of the Earth Sciences Department. The Bjurbole meteorite was broken into many pieces when it collided with sea ice in the Gulf of Finland on 12 March 1899, and is considered unusual due to the large size of the chondrules present.
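For readers curious about the arithmetic behind that 4.6-billion-year figure, here is a simplified sketch of radiometric dating, assuming a single parent isotope decaying to a stable daughter and no daughter atoms present when the rock formed:

\[
N_D = N_P\left(e^{\lambda t} - 1\right)
\quad\Longrightarrow\quad
t = \frac{1}{\lambda}\ln\!\left(1 + \frac{N_D}{N_P}\right),
\qquad
\lambda = \frac{\ln 2}{t_{1/2}},
\]

where \(N_P\) and \(N_D\) are the numbers of parent and daughter atoms measured in the sample today, \(\lambda\) is the decay constant and \(t_{1/2}\) the half-life. Measuring the parent-to-daughter ratio in a chondrule therefore gives its age directly; in practice geologists use isochron methods to correct for any daughter isotope that was present initially.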



News

Benefits to weaker immune system


Check out www.bluesci.co.uk, or BlueSci on Twitter (http://twitter.com/BlueSci), for regular science news and updates


Animals continue to demonstrate a range of immune responses to disease. This raises the question of why natural selection has not eliminated susceptibility to disease, which has been found to have a genetic basis. An 11-year study on wild Scottish sheep, published in Science, showed significantly different levels of antibodies in the blood of individual sheep. Antibodies are proteins that identify and generate immune responses against foreign objects. Scientists collected blood plasma samples and assayed them to measure antibody concentrations. Researchers hypothesised that, although stronger immune responses lead to better survival, they also place an energetic strain on the body, and can even cause autoimmune disorders. After further study, they discovered that sheep with greater antibody levels lived longer, but there’s a catch. They found that males with higher antibody concentrations were less likely to have sired offspring during their previous rut, and females less likely to have produced a lamb. With respect to species evolution, both susceptible and resistant strategies are thus equally successful. While sheep with better immune responses have more time to produce offspring, those with weaker immune responses are more fertile, producing about the same number of progeny in their lifetime. Immunity can thus have either a positive or a negative effect on evolutionary success, depending on the circumstances—a highly significant finding. as

Cell membranes – liquid or solid?

it has long been known to scientists that the lipid sheets that form the structural basis of all cellular membranes exhibit viscous fluid behaviour. However, new research at the University of Oregon has revealed that, under certain conditions, this material can also behave like a solid. The fluid nature of lipid bilayers in cell membranes is known to be crucial in allowing cell surface molecules to move and interact, but it has now been shown that these lipid structures are in fact viscoelastic. They do not flow like a Newtonian fluid regardless of the stress encountered—above a critical frequency or speed of perturbation they react like solids. This kind of behaviour has been recognised before in biological materials such as mucus and tears, but it is the first time that it has been identified in lipid bilayers. This new discovery, published in the Proceedings of the National Academy of Sciences, could affect our understanding of a host of mechanical processes in cell membranes: from the cellular response to external forces to the movement and manipulation of proteins in the membrane itself. The currently mysterious failure of many of these mechanical membrane processes has been linked to various diseases, so this new research could play a key role in illuminating why cell membranes function, or fail to function, the way they do. kw

Curious excitation of ‘magic’ isotope


The isotopes of tin (Sn) provide a perfect laboratory for studying a variety of nuclear properties at the limits of particle stability. The 100Sn isotope is particularly important as it has a so-called ‘doubly magic’ closed-shell nucleus, with 50 protons and 50 neutrons, which has helped physicists to develop the Nuclear Shell Model. Under this model it is expected that the ground-state spins of the semi-magic isotopes 101,103,105Sn will be identical, and dependent on which single-particle orbital has the lowest energy. Experimental data for known isotopes in this range have previously indicated no exceptions, but researchers at the Oak Ridge National Laboratory in the United States have found an unexpected result.

By measuring energy spectra in xenon-tellurium-tin alpha-decay chains, the team found that the spins of the ground state and the first excited state of 101Sn were reversed with respect to the heavier isotopes. The authors of the study, published in Physical Review Letters, explain that the inversion results from unusually strong pairing interactions between neutrons in the outer orbital and relatively small energy splitting between orbitals. This behaviour makes the proton-rich nuclei above 100Sn unique. Characterising their nature is essential for calibrating theoretical models and for predicting the properties of unmeasured nuclei. rj


Book Reviews

Discoveries of the Census of Marine Life: Making Ocean Life Count

CUP, 2010, £27.99

The ocean covers 71 per cent of the surface of the Earth, yet very little is known about the species it contains. The Census of Marine Life, an international initiative, was established in 2000 to address the “diversity, distribution and abundance” of marine species and involves a variety of projects. Paul Snelgrove, who oversaw the final synthesis phase of this ten-year endeavour, has written this book to bring its discoveries to a wide audience. In a highly readable manner, he outlines both the various inventive methods employed and the discoveries made, and the book is illustrated throughout with photographs of fascinating species. This work was, however, not carried out purely for the sake of curiosity, as a loss of ocean biodiversity could have serious consequences for the environment. Yet Snelgrove ends with a positive message about the future of our oceans. The more information we possess, the better we can act to sustain marine resources, and the Census is a vital starting point for the expansion of our knowledge. This book conveys the enthusiasm the author feels for his subject, and is well worth reading for anyone curious about the life that exists beneath the surface of the ocean. CM

Sudden Genius: The Gradual Path to Creative Breakthroughs

OUP, 2010, £18.99

“Eureka!” exclaimed Archimedes upon witnessing water displacement as he got into his bath—or so history would have us believe. But do geniuses really have these so-called ‘eureka’ moments of clear inspiration? Moreover, can you define a genius empirically? Andrew Robinson addresses these and other questions in his highly recommended new book, Sudden Genius. He opens with an exploration of the scientific study of such exceptional creativity, looking at the roles that intelligence, talent, unconscious thought and mental illnesses may play. Throughout, Robinson uses plenty of examples and anecdotes that prevent it from becoming too academic. The majority of this book, however, focuses on the lives of ten remarkable individuals from both artistic and scientific domains, prior to, and during, their major breakthroughs or discoveries. These include Einstein’s special theory of relativity, Mozart’s Marriage of Figaro and Champollion’s decipherment of Egyptian hieroglyphs. Robinson’s varied choice of geniuses illustrates the large diversity of characteristics, family backgrounds and educations found amongst exceptionally creative people. Although there doesn’t seem to be a set pattern amongst the ten geniuses, Robinson concludes with the interesting observation that almost all major breakthroughs required immersion in the chosen subject for about ten years prior to the discovery. NS

Pathfinders: The Golden Age of Arabic Science

Allen Lane, 2010, £25.00


“THE INK OF A SCHOLAR IS MORE SACRED THAN THE BLOOD OF A MARTYR.” Jim Al-Khalili opens his account of the history of science in the Islamic world with this powerful quote. It’s an apt choice because most of this book challenges widely held misconceptions about the influence that Muslim nations have had on science. Not that this is in any way a book about Islam: Al-Khalili is careful to point out that the phrase “Muslim Science” is a misnomer just as “Jewish Science” was in Nazi Germany. Most people view scientific achievement as ending with the Greeks and beginning again during the Renaissance. We like to fashion the in-between as the ‘Dark Ages’. But while this was true in Europe, it was contrasted by a golden age of discovery in the Middle East in areas such as chemistry, physics, medicine and astronomy. Al-Khalili shows that scientific discovery is a continuous process which has passed between different cultures across the centuries. He also discusses, ultimately, how the Arabic world lost its influence, to the benefit of the western world. For the most part he succeeds in this. The chapters are concise and paint a clean continuous narrative, although the pictures seem hastily assembled and out of line with the text. But this doesn’t detract too much from the overall message: that so-called modern science originated in the east during an incredible era. Al-Khalili makes it clear that this is an era which can and should return. AAR



Cell Talk

Rhea Chatterjea explores the medical frontiers of gap junction research


Cells communicate via close contacts (left) using membrane-spanning channels (middle). Gap junctions are involved in embryonic development (right)

In a gap junction the cytoplasm of two cells is directly connected, allowing molecules to diffuse freely between them

Whether communicating sensations of pain or the need for insulin release from the pancreas, cell to cell communication is vital for our survival. Just as we communicate our desires, emotions and situations through a variety of modes such as speech, facial expression and gesticulations, cells have evolved a multitude of ways to communicate with each other. Of these methods, juxtacrine signalling—communication between adjacent cells—is the most effective means of cellular signalling in fundamental processes in the body such as the beating of our hearts. In addition, it has formed the basis of a novel means of cancer treatment and has been implicated in embryological development. Juxtacrine cell signalling, unlike other signalling methods, requires the communicating cells to be in physical contact with each other. These regions of physical contact between the cell membranes are known as gap junctions. They allow cells to communicate their intracellular condition to adjacent cells by the presence of membrane-spanning channels which connect the cells’ interiors. This allows the regulated exchange of intracellular peptides, ions and other chemical components that are present in the internal environment of the cell. In this way, an external signal affecting the internal environment of a single cell can be transmitted across tissue. One of the best examples of the body utilising this method of signalling is in cardiac tissue. Special ‘pacemaker’ cells in the heart contain channels allowing the constant leak of positively-charged sodium ions into the cell. This increases the charge inside the cell and, beyond a certain threshold, this elicits the generation of an electrical signal known as an action potential. Action potentials travel within the cell cytoplasm as waves of positive ions. Since cardiac muscle cells are connected to each other via gap junctions, these waves of positive ions can cross cell membranes and are transmitted across all the cardiac muscle cells. This increased positive charge within the cardiac muscle cells causes them to contract. The result: a synchronised heartbeat. The conduction of these action potentials throughout the cardiac muscle tissue can be monitored in an electrocardiogram (ECG), which is a common test performed on patients complaining of chest pains, palpitations and respiratory distress. An ECG test can help cardiologists detect irregular heart rhythms caused by faulty conductance of electrical signals through the heart, and even identify patients at risk of a heart attack. As well as allowing synchronisation of cellular reactions, juxtacrine signalling also provides speed in signalling. Since gap junction-mediated signalling relies on electrical signal transmission via the flow of ions, the speed of signal transduction far exceeds other means of signalling which rely on bulk flow through the circulation or passive diffusion through intercellular space. In addition, electrical signal transmission via gap junctions rules out the possibility of chemical fatigue, a problem that is faced in nerve to nerve transmission via chemicals. Nerves rely on neurotransmitters which need to be manufactured from chemical subcomponents. As a result, repeated
stimulation of nerves can cause the rate of release of neurotransmitters to exceed the rate of their production. The weakening of signals transmitted due to insufficient neurotransmitters is known as chemical fatigue. With gap junctions, ions are constantly transported across all cell membranes via ion channels, such as the sodium pump, and any ionic gradients naturally dissipate. This ensures that any localised depletion of ions is rapidly replenished and baseline conditions are quickly restored, avoiding any form of fatigue. The characteristics of speed and lack of fatigue could have given the transmembrane connexin proteins, which form the gap junction channels, an evolutionary selection advantage. As described by Dr Robert Malenka, in his book on intercellular communication, many species of fish, amphibians and crustaceans use gap junction—mediated cell signalling to initiate what is known as the escape response. In crustaceans, such as crayfish and lobsters, the escape response involves a ‘tail flip’ which requires the coordinated movement of the segments of the tail. Each segment receives signals from a single neuron and these neurons are coupled together via gap junctions. Thus, the neurons effectively function as a single unit that enables the quick transmission of electrical impulses through the tail. The speed of signal transmission provided by gap junctions thus enables a rapid ‘tail flip’ response that is crucial in escaping from predators and other dangers. While gap junctions can be said to be indispensable, the full extent of their roles in cell signalling is still being unravelled. Recent research has implicated gap junctions in embryogenesis and early cell differentiation. Findings suggest that gap junctions may provide cells within the embryo with a quick and effective means of communication before the circulatory system has been established. Crucial left—right patterning in embryos that directs the development of the heart in the left side of the chest may also be communicated to cells via juxtacrine signalling. Recently, scientists have started to investigate the use of gap junctions as part of a novel tumour treatment strategy. During typical melanoma progression, the production of the ubiquitous transmembrane connexin 43—a protein involved in the formation of gap junctions—has been observed to drop significantly. Infection of mouse melanoma cells with diarrhoeainducing Salmonella typhimurium caused production of the connexin 43 protein to be upregulated, allowing these melanoma cells to form gap junctions with dendritic cells of the body’s immune system. The research team found that these gap junctions allowed melanoma-specific peptides to enter the dendritic cells. The experiments showed that the dendritic cells exposed these peptides by transferring them to their cell surface in a process known as

cross-presentation. This allowed recognition of the melanoma peptides by helper T-cells of the adaptive immune system, which can activate pathways to destroy cells with those specific melanoma peptides. The team hopes to devise a vaccination scheme in the coming months. The vaccination scheme will involve withdrawing melanoma cells, dendritic cells and T-cells from individual patients. By first infecting the melanoma cells with Salmonella typhimurium to reinduce connexin 43 production, and then placing these cells together with the patient’s immune cells, the patient’s T-cells will be ‘trained’ to identify the peptides specific to the patient’s melanoma. These patient-melanoma-peptide-specific T-cells will then be reinjected back into the individual patients to initiate an immune response against the melanoma cells. Like many other novel therapies, this type of cancer therapy focuses on individual patient characteristics. By using the patient’s own melanoma cells and immune cells to induce gap junction-mediated recruitment of the immune system, the team hopes to have found a potent method of targeting and destroying cancer cells. “We are inducing an immune response to that ‘fingerprint’ which is specific for the tumour,” said Dr Maria Rescigno, an immunologist at the European Institute of Oncology in Milan who was involved in the study. If approved, clinical vaccination trials will start by June 2011. The role of gap junctions in immune responses has only recently come to light, with research revealing that gap junctions are heavily involved in cross-presentation. This suggests the frequent involvement of juxtacrine signalling in the protection of the body against viruses, bacteria and even primordial tumours. As such, further research into gap junction-mediated recruitment of the immune system seems to promise new options for individualised treatment strategies as well as mass-spectrum therapies against viruses like influenza and the common cold.

Rhea Chatterjea is a 2nd year undergraduate in the Department of Medicine




The Blue Screen of Death

Wing Yung Chan traces the imperfect path to the perfect program

It is 1947 and computer scientists studying the Harvard Mark II computer reach a dead end. The theory is correct, the logic is sound, but the machine still isn’t working. On 9 September, operators investigating the new high-speed electromagnetic relays discovered a moth stuck inside, and famously recorded it as a ‘bug’. Since then, computer science has adopted the phrase into its vernacular. The blue screen of death, known for its garish blue colour and scary messages, is an error screen caused by a fatal system crash. Encountered by professional and amateur users alike, the screen has become synonymous with both failure and despair. This makes it a useful measure of progress, as it can be caused by hardware failure, faulty software, a virus or a mixture of all three. Exploring these three key avenues provides insight into just how complex the problem of ‘program perfection’ really is. Keeping hardware working consistently has required borrowing solutions from electrical engineers and physicists. Whilst computer scientists wait to receive hardware that never crashes in any situation, they have been implementing systems that are ready in the event of failure. Early systems had multiple processors doing exactly the same thing, so that if one crashed, there was another one ready to carry on.

Interactive syntax highlighting is used by programmers to help spot mistakes when writing code




Engineers had been using the term ‘bug’ or ‘glitch’ to refer specifically to electronic or mechanical problems and, in the early stages of computer science, it was assumed that ‘bugs’ would also be confined to hardware. Software was seen to be virtual or intangible and considered immune. Yet time and time again, when scientists hit unforeseen barriers, rather than being limited by them, they discover a new field of study. On 6 May 1949, the EDSAC (Electronic Delay Storage Automatic Calculator) computer began operation and correctly generated tables of squares. It was a triumph in computing as it was the first practical, working ‘stored-program’ computer—one which uses electronic memory to store program instructions. However, three days later, it hit a glitch. The program that had been written to enumerate prime numbers was not giving the right results. On further inspection, it was found that the code was incorrect. This mistake led to the exploration of what we now call software development, and led Sir Maurice Wilkes, founder of the Computer Laboratory in Cambridge and creator of EDSAC, to famously realise that “a large part of my life from then on was going to be spent in finding mistakes in my own programs.” There have been many notable examples where mistakes in software have led to catastrophic results. Our inability to write down all the digits of an irrational number means that computers are also unable to store numbers with infinite precision and instead use ‘floating-point’ numbers (conceptually similar to standard integers, but typically represented in this form: significant digits × base exponent). Each of these numbers is allocated storage in memory, and when one of the numbers exceeds its storage space, this can lead to catastrophe. This type of problem caused the Ariane 5 Space Rocket to self-destruct just 37 seconds after launch, costing the European Space Agency close to 1 billion USD.
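To make the overflow hazard concrete, here is a minimal sketch in Python; the scenario and the variable names are illustrative only, not the actual Ariane 5 code (which was written in Ada). A measurement that fits comfortably in a 64-bit floating-point number is squeezed into a 16-bit signed integer, which can only hold values from -32,768 to 32,767.

```python
import ctypes
import struct

velocity = 40000.0  # fits easily in a 64-bit float...

# ...but a signed 16-bit integer cannot represent it.
# ctypes performs the conversion with no overflow check, so the value
# silently wraps around, much like an unchecked narrowing conversion:
wrapped = ctypes.c_int16(int(velocity)).value
print(wrapped)  # -25536: a nonsense value, and no error is raised

# struct, by contrast, refuses the out-of-range value and raises
# struct.error, the software equivalent of catching the fault early:
try:
    struct.pack("<h", int(velocity))
except struct.error as err:
    print("conversion rejected:", err)
```

Whether the conversion wraps silently or fails loudly depends entirely on how the programmer (or the language) chooses to handle it, which is precisely the kind of decision that separates an inconvenient crash from a self-destructing rocket.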




Bugs in software also lead to loopholes or vulnerabilities that allow hackers and viruses to infiltrate. In April 2010, the QAKBOT virus was discovered on over one thousand NHS computers. The virus had been secretly uploading sensitive information onto servers. Viruses exploit vulnerabilities in code at various levels, often at the interaction between programs or at weaknesses in protocols. These flaws cost businesses billions of pounds each year. Though it may be costly to make mistakes, it is also very difficult to avoid them. In the traditional software design paradigm, development follows the ‘V model’ —the project moves from general objectives to detailed functional requirements and then a functional design is produced. From design, the project can begin its coding or implementation phase, after which it is tested and then released. Most of the difficulties are experienced in translation, that is, moving between the levels. It is very challenging to make sure that a design completely fulfils the stated requirements, and even harder to implement the design in a congruent way. As computers increased in memory and speed, complexity grew exponentially and layer upon layer of abstraction was added, until it became incredibly difficult to be confident that a solution would be glitch-free. Even if assured of perfection at a given level of abstraction, it would be extremely difficult to confirm that functionality at lower levels was similarly effective. Businesses now depend on increasing levels of IT infrastructure, so whilst from a scientific perspective it seems rather unsatisfying to knowingly release a flawed product, it is currently the only commercially viable solution. If your favourite game console crashes, it might be a little inconvenient, but if an aircraft’s computer system fails 40,000 feet in the air, it becomes much more serious. The realisation that some errors are more acceptable than others is part of risk assessment and management. The current situation is to deal with bugs pragmatically, so instead of aiming for perfection, software houses catalogue bugs that cannot be fixed in time and instead release patches gradually to fix those deemed critical. In mathematics, the study of sets—collections of mathematical or physical objects—is known as set theory. Because computer programming relies on natural expression, such as “and…if…or…not,” it in a sense uses what mathematicians call ‘intuitive’ or ‘naïve’ set theory. This is a non—formalised theory that uses natural language (as opposed to precise mathematical language) to describe and study sets. But the language of naïve set theory often lacks rigorous definition. The programming input and grammar may then be relatively ambiguous, requiring more interpretation by the computer and making it easier to write incorrect code. Yet perhaps there is hope after all. There are other potential systems that examine objects, groupings and collections. One such system is type theory, which

could become crucial for computer scientists in proving the correctness of programs before coding even begins. This enables detection and elimination of potential bugs before any problems can occur. Web pages have data validations for email addresses that only accept input if it has an ‘@’ symbol. Doing this reduces the chance of sending an email to a nonexistent email address. Similarly, ‘type systems’ assign types to variables and functions with the aim of stopping impossible behaviour such as trying to numerically add the word ‘hello’ and the number 5. A more subtle result is that by having very strict rules on types, it is actually hard to write incorrect code that will run. Such is the promise of type theory that in 1989, the EU began heavily funding projects that looked at developing type systems and ways to prove a program’s correctness. One of the projects that emerged from this investment was Mobius, which sought to incorporate so-called ‘proof-carrying code’, allowing code to be certified as bug-free. Although poetry and humour benefit much from the ambiguity of languages, beauty in mathematics and science is most commonly seen as simplicity and clarity. The two most popular languages, JAVA and C (based on October 2010 Figures by TIOBE Ranking Index) have a lot of ambiguity which makes programming quicker but also leads to bugs. Both of these languages have what is known as an ambiguous grammar, which means that there are statements that are valid but could have more than one meaning. It is up to the compiler software to decide what it really means. Understandably, this lack of precision means that it is very hard to prove a code is correct. Languages do exist that are free of ambiguous grammar. One is called SPARK, which sets out to be free of ambiguity, meaning that any compiler of the language will give the same result each time. SPARK is grounded on mathematical logic and formalism, and while traditional coders may not welcome the intensity of mathematical rigour, it is precisely this painstaking rigour that may once and forever banish the blue screen of death into obscurity. Wing Yung Chan is a 1st year undergraduate in the Computer Laboratory
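As a rough illustration of what a type system buys you (a toy sketch, not the SPARK or Mobius machinery described above; the function add_numbers and the file name are made up for this example): in dynamically typed Python, the nonsensical call below is only discovered when the program runs, whereas a static checker such as mypy, or the compiler of a statically typed language, rejects it before execution, which is the whole point of catching ‘impossible behaviour’ early.

```python
def add_numbers(a: int, b: int) -> int:
    """Add two integers; the annotations state the intended types."""
    return a + b

print(add_numbers(2, 3))        # fine: prints 5

# The mistake described in the article: adding the word 'hello' and the number 5.
try:
    add_numbers("hello", 5)     # only fails at run time, with a TypeError...
except TypeError as err:
    print("caught at run time:", err)

# ...but a static checker inspects the annotations without running anything,
# reporting something like:
#   $ mypy example.py
#   error: Argument 1 to "add_numbers" has incompatible type "str"; expected "int"
```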

Mathematical developments may one day lead to computer software that is error-proof




The Science of Significance

Annabelle Painter shows how the amygdala may be key to culture, spirituality and identity


Autistic individuals can find it difficult to judge emotions from looking at eyes, and one suggested cause is impairment of the amygdala

How do we know our mum is our mum? What does it mean to have ‘déjà vu’ and what makes us gaze in wonder at our surrounding universe? The answer may lie in a small part of the brain called the amygdala. An almond-shaped region sitting close to the hippocampus, the amygdala has numerous vital neurological roles, which include interpreting social signals and controlling emotional responses. It is also thought to be involved in providing a sense of self. Such correlative examples merely skim the surface of the amygdala’s functions, many of which we still do not know. When primates’ amygdalas were experimentally severed, the animals were no longer able to respond to social cues, became withdrawn and, critically, lost their previously held positions in the social group hierarchy. Just like monkeys, human society operates through social hierarchies. As group sizes grew in our ancestral apes, so did the evolutionary drive for social behaviours and thus the development of associated brain regions, including the amygdala. One of the vital traits that resulted from this development was the ability to empathise. The ability to imagine oneself in another’s place is generally thought to be unique to humans and requires complex neurological processes. This includes the ability to associate significance to the behaviour of others, which means, for example, being able to appreciate instinctively what makes a laugh nervous, a glance stern, a gaze loving or a smile fake and to understand the driving force behind why another displays such behaviour. It then requires the association of an adequate response to that behaviour, such as suspicion, fear, anger or laughter. The part of the brain correlated with these functions is once again the amygdala.


Research into human sociability has shown links between the amygdala and autism, one of our most well-known social syndromes. Experiments, such as that conducted by Cambridge professor Simon Baron-Cohen, demonstrate that autistic individuals find it difficult to judge emotions from looking at eyes, a task which neurotypicals find generally easy. These experiments suggest that an impairment of the amygdala could be a cause of the social difficulties characteristic of autism. However, what if someone wasn’t born with a defective amygdala but instead it was damaged later in life? This can happen as a result of a stroke, causing victims to develop some startling symptoms. For example, sufferers of Capgras syndrome often become convinced that their loved ones have been replaced by impostors or robots, or abducted and replaced by aliens. The emotional ‘glow’ which normally flares in the brain at the sight of a loved one is suddenly absent. In order to resolve the cognitive dissonance, the brain concludes, quite incredibly, that the person is not truly their loved one but an impostor. This has had tragic results, such as one man who murdered and cut open the skull of his own father to try and find the robotic computer chip he was certain would be found there. Cotard’s syndrome is a similar disorder where sufferers become convinced that they are dead and sometimes even claim to be able to smell their own rotting flesh. This may be because they no longer experience any emotional significance, due to a dysfunctional amygdala, and the phantom scent may be because the amygdala is intimately involved in smell. Without the ability to associate significance to anything, patients live in a dull, emotionless world and thus become convinced they must be dead. While it may seem hard to imagine the experiences of Capgras and Cotard’s syndrome sufferers, the amygdala is also associated with déjà vu, a familiar feeling to most human beings. Déjà vu is thought to occur when the connection between adding and recalling significance from the past gets



mixed up with the present, giving us the feeling of re-experiencing a familiar situation. ‘Jamais vu’ is the opposite—although much rarer—experience in which a familiar situation suddenly seems alien as the normal ‘significance’ messages fail to flair. Such experiences give us a small insight into what it might feel like to experience amygdala malfunction. An individual who constantly observes significance in every aspect of life, who has heightened emotions, is deeply religious and possesses an increased sense of self importance may have a type of epilepsy that is once again connected to a dysfunctional amygdala. The most widely recognised form of epilepsy occurs when neurones in the brain start firing uncontrollably, causing a mass of excitatory behaviour through the brain, leading to whole bodily convulsions (a grand mal seizure). However, seizures can sometimes stay localised in a particular brain region and for these individuals the seizures affect the temporal lobe. This is referred to as temporal lobe epilepsy (TLE). Such seizures originate in the amygdala before spreading to other areas of the brain. When TLE sufferers have seizures, they often describe having highly spiritual experiences, saying such things as “the world makes sense now” and “I finally understand the true nature of the cosmos.” Each time they have a seizure that spreads to a certain area of the brain, the threshold of electrical activity needed to activate that area becomes a little lower. If it gets low enough, a person will find themselves consistently having the experiences associated with that brain region. So, if a person’s seizures include a heightened spiritual experience, they may then start to see spirituality all around them, all the time. Indeed, even healthy people whose temporal lobes are stimulated using a

The amygdala (pictured in red) lies deep within the temporal lobes, adjacent to the hippocampus and medial to the hypothalamus

transcranial magnetic stimulator often claim to have a spiritual experience. Consumer demand has blossomed as result, with individuals being sold gear such as the ‘eight wheel shakti’ and ‘god helmet’ which controversially claim to offer purchasers spiritual experiences. But what is it about the temporal lobe that causes such spiritual phenomena? Well, as the amygdala evolved to be responsible for attaching emotional significance to people, actions and events, it is easy to imagine that, as a preadaptative offshoot, it started projecting significance onto other things such as the weather, geographical landscapes, animals and trees. Indeed, humans have been fascinated by these aspects of our world for thousands of years, often anthropomorphising them. Temporal lobe epileptics simply experience such attribution of significance in excess and begin to see symbolism in everything. Perhaps what we know as spiritualism and what we see as our imperative to understand and explain the world may have been catalysed by our evolutionary need to understand each other. It is this development that may have been the driving force that allowed ancestral humans to dominate over the other contemporary bipedal hominids. Although Neanderthals may have had bigger brains than humans, they did not develop cultural practices and rituals of comparable complexity. It is speculated by some that they may not have had, for instance, the ability to associate complex meaning to art, music, birth, or death. It was perhaps this ability to attribute significance to our lives that helped humans become spiritual, insightful and innovative, and consequently survive, while Neanderthal populations became extinct. Sociability, spirituality, culture and the maintenance of a sense of self—correlated with such an array of functions, the importance of the amygdala and its role in the attribution of significance is evident. With so many of these factors associated with human intellect and cultural sophistication, it seems that studying the amygdala will be crucial in our search for what it means to be human. Annabelle Painter is a 2nd year undergraduate in the School of the Biological Sciences





Eradicating Rinderpest

Paul Simpson looks at the history of a quietly devastating disease

African rinderpest pandemic of the 19th century



Perhaps you have never heard of rinderpest before. Unlike many of our pathogenic foes, such as AIDS, tuberculosis and cholera, rinderpest does not often hit the headlines, but this pathogen has had an astonishing influence on humankind, and a huge effort has been spent combating it. This year the eradication of rinderpest, or cattle plague as it is also known, will be officially declared, making it only the second disease in history to be systematically eliminated by human intervention. Rinderpest has been a significant catalyst for the birth and development of modern veterinary science and its eradication is a monumental victory for the discipline. Rinderpest has haunted human civilisations for millennia, with the earliest historical reference dating back to around 3000 BC in Egypt. Domestic cattle, yaks and water buffalo are particularly susceptible to this virus, which severely damages its victim’s digestive system, leaving the host weak and dehydrated. Due to the high mortality rate, famine often followed a rinderpest outbreak as people were dependent on their cattle for food, transport and skins. Rinderpest has been a scourge throughout Asia, Africa and Europe, and has often been spread by trade and war. Notable pandemics were triggered across Europe by the invasions of the Hun and then Mongol armies in the 4th and 13th centuries. However, the pandemic of Africa in the 19th century probably represents the disease at its most destructive. The epidemic started in 1887 in Northeastern Africa. Rinderpest was brought

to the continent by cattle imported from either Yemen or India to feed the Italian army based there. Over the subsequent ten years cattle plague tore through the continent killing 80-90 per cent of susceptible animals, both wild and domestic, a death toll that numbered millions. In 1893 the explorer Fredrick Lugard observed the scale of the rinderpest pandemic in Maasailand. He remarked: “Never before in the memory of man, or by the voice of tradition, have the cattle died in such numbers; never before has the wild game suffered.” A combination of smallpox and famine due to rinderpest brought about a huge reduction in the population of native Africans. This especially affected the Maasai people. The devastation in the Engaruka Basin in Tanzania led one Maasai man to exclaim that dead bodies were “so many and so close that the vultures had forgotten to fly.” Despite its devastating history, rinderpest also has a positive legacy. Efforts to counteract cattle plague have inspired many developments in modern veterinary science. In the 18th century Giovanni Maria Lancisi, the physician of Pope Clement XI, was instructed to deal with a rinderpest outbreak that had killed over 26,000 papal cattle. Lancisi recognised that the plague ‘‘was caused by exceedingly fine and pernicious particles that pass from one body to another.’’ His solution was to control its spread by slaughtering ill and suspect animals, burying carcases in lime, controlling the movement of cattle and inspecting meat. The success of Lancisi’s method of controlling cattle plague led to the creation of the first veterinary school in Lyon. The school was established in 1762 and specialists were trained to combat threats such as rinderpest. Lancisi’s methods are still used today, for example to control the UK foot-andmouth disease outbreak in 2001. Strict control of animal movement during outbreaks was important for reducing the threat of rinderpest across Europe, but it was the development of vaccines that set the wheels in motion for removing the virus globally. An early vaccine developed in the late 1890s by Robert Koch involved immunising cattle with injections Lent 2011



of bile from an infected animal. At the same time two South African scientists, Arnold Theiler and Herbert Watkins-Pitchford, worked on a more effective vaccine where cattle were injected with a mixture of infected blood and immune serum from a recovered animal. The passive immunity granted by injecting uninfected cattle with immune serum protected the host animals. At the same time their own immune system mounted a response against the virus in the infected blood, providing life-long immunity. Serum-simultaneous vaccination was used throughout India and Africa. This method was also independently developed and used to eradicate rinderpest in Western Russia by 1928. Further improvements in vaccination were achieved using attenuated viruses. These were developed by serially growing the rinderpest virus in live animals such as goats, rabbits or pigs that could propagate the virus but did not develop the disease. Although early vaccines could be used to provide long-term immunity against rinderpest, there were drawbacks. One example is the inherent problem of transmitting other diseases in vaccines derived from live animals. In the late 1950s, Dr Walter Plowright developed a laboratory-grown attenuated virus that could be used as a vaccine. The Tissue Culture Rinderpest Vaccine (TCRV) was, for the first time, a safe method of providing life-long immunity after a single inoculation. However, one disadvantage was that the vaccine was inactivated at ambient temperatures, making it problematic to use in remote areas of developing countries. Improvements in virus growth conditions and freeze drying methods made it possible to generate a thermostable TCRV. With vaccines in hand, the stage was set for the eradication of rinderpest. Vast countries such as China were able to banish the virus from within their borders using mass vaccination programmes. In the second half of the 20th century several international campaigns were able to reduce the global threat of the disease. Unfortunately, they ultimately failed to Lent 2011

Domestic cattle, relied on for food and transport, were particularly susceptible to the rinderpest virus

eradicate it completely and new epidemics repeatedly appeared. However, the creation of the Global Rinderpest Eradication Programme (GREP) in 1994, which coordinated vaccination and surveillance efforts, provided the momentum to finally eradicate cattle plague. An important key to GREP’s success was identifying hidden reservoirs of the virus that were repeatedly seeding new epidemics. These reservoirs were in isolated herds in countries like South Sudan where armed conflict made vaccination programmes particularly difficult. Ultimately it was the efforts of trained community animal health workers using TCRV that enabled the eradication of rinderpest from its last strongholds. In October 2010, GREP announced that it had successfully completed its objectives and had stopped field operations, paving the way for a declaration of global eradication of rinderpest. The economic and humanitarian benefits of eradicating rinderpest have been astonishing. India alone is estimated to have benefited from $289 billion worth of additional agricultural production from 1965 to 1998 and it is believed Africa benefits from an extra $1 billion annually. The humanitarian benefit is difficult to quantify. However, considering that as recently as the 1980s nearly 100 million cattle were lost due to an African rinderpest pandemic, it is not an exaggeration to say that billions of domestic and wild animals have been saved. As a result of this the lives of millions of people must have been improved. Dr Plowright was awarded the World Food Prize in 1999 in recognition of his contribution to eradicating rinderpest and the last recorded case of rinderpest was in Kenya in 2001. Like smallpox before it, the eradication of rinderpest stands as a testament to what can be achieved by international co-operation and science. Paul Simpson is a postdoctoral researcher at the MRC Laboratory of Molecular Biology Eradicating Rinderpest 13



Climbing Space

Mark Nicholson discusses the science behind the fiction of the space elevator



Carbon nanotubes are a possible material for the space elevator, given that their theoretical strength is hundreds of times greater than that of steel

between 1958 and 1969, one of the most peaceful and productive conflicts in history raged. The so-called ‘space race’ pitted two superpowers against each other in frantic attempts to put the first man on the Moon, giving rise to iconic images of rocket launches and space shuttles. But why should today’s space exploration be based on such out-of-date technology? Rockets are exceptionally inefficient, with fuel-to-payload ratios exceeding 10:1. Each kilogram propelled into space costs tens of thousands of pounds, so an average astronaut takes close to a million pounds just to reach orbit— without any equipment. So how can we get out of our technological rut? Science fiction has been contemplating a myriad of possibilities for decades, but one stands out, so much so that NASA has started studying its feasibility. First popularised by Sir Arthur C. Clarke in The Fountains of Paradise, the concept of a space elevator is delightfully simple: a satellite extends a rope, commonly called a tether, all the way down to the ground, where a machine (the ‘climber’) grabs onto it and starts the long haul all the way to the top. Although this sounds suspiciously easy, the challenges involved are staggering: the satellite would have to be in geostationary orbit, 36,000 kilometres above the surface of the Earth, which makes for a rather lengthy cable. Additionally, the tether cannot just be lowered to the ground. A counterweight thousands of miles beyond the geostationary orbit must be used to keep the construction in equilibrium and stop the satellite from

falling out of the sky. The tether must be able to support its own weight, which leads to the biggest hurdle. No practical material exists that is strong enough to bridge such a distance and support its own weight without snapping—at least, not yet. Scientists are investigating novel materials, called ‘carbon nanotubes’, which have a theoretical strength hundreds of times greater than steel with only a fraction of the density. If these could be manipulated, mass produced and woven together into a continuous fibre, the cable would no longer be a physical impossibility. However, it is not just the tether that imposes technical constraints. Due to the huge distances involved, the climbers would have to travel at high speeds to make the journey in a reasonable amount of time. For example, for one week’s journey, the pod must average 215 kilometres per hour, travelling vertically upwards, and maintain this speed against the effects of gravity. This requires large amounts of energy, but carrying a power source into orbit creates exactly the same problem as launching rockets: a huge fuel-topayload ratio. One intriguing possibility is the use of so-called ‘beamed power’. This uses a high-intensity Earth-based laser to illuminate a modified solar panel on the climber. Accurately illuminating a spot a few metres across from a distance of 36,000 kilometres may be tricky, especially when the spot is moving at hundreds of metres a second. Fortunately, NASA is on the case, offering a $2 million prize in a competition to do exactly that, albeit on a slightly smaller scale. In their contest, the cable is lowered from a helicopter hovering a kilometre above the ground, and the climbers must ascend to the helicopter in under three minutes. Smaller prizes were offered for breaking easier time barriers at four and five minutes. As of 2009, a time of 3 minutes and 49 seconds has been achieved, winning 900,000 USD, but the top prize remains up for grabs. The mini-elevator used in this competition was Lent 2011


elementary to build: hanging a steel wire from a helicopter is not exactly a miracle of engineering. By contrast, constructing a full-scale working space elevator would be quite an impressive feat. A cable of this length would be expected to weigh thousands of tonnes (even if it is made of materials such as nanotubes) and cost billions to lift into space with rockets. The cable would have to be fragmented—no single rocket could lift something that heavy. There is, however, a much more sensible alternative. Once the elevator is in place, it can be used to send material into orbit at very low cost—so why start big? A rocket could be used to send a 22—tonne ‘seed’ cable into space. This would be full length, stretching all the way back down to Earth, but so thin that it would barely be visible. An extremely light climber, whose only payload is another cable, could then climb this ‘seed’. When spliced to the first, the resulting, stronger cable could be climbed in turn by a larger climber, carrying a heavier cable to add to the first two. Hundreds or even thousands of iterations later, the elevator would be hefty enough to take a real payload—a satellite or even a pod carrying astronauts. Once we are capable of lifting pods with humans in, the possibilities of the space elevator become boundless. Quite apart from offering users an extraordinary cost advantage over those still stuck with classical rockets, an elevator can very simply be scaled up until it can take hundreds of tonnes at a time.
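The figures quoted above lend themselves to a quick sanity check. The short Python sketch below is a back-of-the-envelope illustration rather than anything definitive: the 70-kilogram astronaut, the £15,000-per-kilogram launch price, the one-micron laser wavelength and the ten-metre transmitting mirror are assumptions made for the example, not numbers taken from the article.

# A few back-of-the-envelope checks on the figures quoted above.
# The astronaut mass, launch price, laser wavelength and mirror size
# below are illustrative assumptions, not numbers from the article.

GEO_ALTITUDE_KM = 36_000                 # geostationary altitude quoted in the text

# 1. Rough cost of putting one astronaut into orbit by rocket.
astronaut_mass_kg = 70                   # assumed
launch_price_gbp_per_kg = 15_000         # assumed: 'tens of thousands of pounds'
print(f"Rocket launch, per astronaut: about £{astronaut_mass_kg * launch_price_gbp_per_kg:,}")

# 2. Average speed a climber needs to reach geostationary orbit in one week.
hours_per_week = 7 * 24
speed_km_per_h = GEO_ALTITUDE_KM / hours_per_week
print(f"One-week climb: {speed_km_per_h:.0f} km/h, or {speed_km_per_h / 3.6:.0f} m/s")

# 3. How tightly a ground-based laser can focus at that range
#    (diffraction-limited spot diameter ~ 2.44 * wavelength * range / aperture).
wavelength_m = 1e-6                      # assumed: ~1 micron infrared laser
mirror_diameter_m = 10                   # assumed transmitting mirror
spot_diameter_m = 2.44 * wavelength_m * (GEO_ALTITUDE_KM * 1e3) / mirror_diameter_m
print(f"Diffraction-limited spot at GEO: roughly {spot_diameter_m:.0f} m across")

Even with a generous ten-metre mirror, the beam cannot be focused much more tightly than a few metres at that distance, which is why keeping it centred on a moving climber is such a delicate job.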


Space tourism takes on an entirely new meaning when, rather than experiencing the cramped confines of an all-too-brief rocket trip, individuals are offered a luxury two-week tour in a large capsule with panoramic views. Satellites could also be designed on smaller budgets and readily lofted into space for any scientific or commercial end. The advent of rockets brought us satellite TV, GPS and the iPhone. Perhaps the space elevator will usher in a new wave of innovations, with as-yet unforeseen consequences. A few rather sizeable hurdles still stand in our way. Although the technical challenges are considerable, economic issues and public opinion pose just as great a threat. Any government that stepped into the fray would struggle to justify the costs to its voters, despite the potential benefits. The international ramifications of building such a structure, with its huge military advantages yet equally large military vulnerabilities, would lead to a political quagmire. An extremely brave, and possibly foolhardy, politician would be required to head the project. However, if all of this could be overcome, the space elevator could open the door to a space-faring future. When might this be? As the late Sir Arthur C. Clarke used to say, the first space elevator “will be built 50 years after everyone has stopped laughing.” Mark Nicholson is a 3rd year undergraduate in the Department of Chemistry



Life will find a way BlueSci explores the past, present and future of the search for life in the Universe


Worlds on worlds are rolling ever From creation to decay, Like the bubbles on a river Sparkling, bursting, borne away. —P.B. Shelley, “Hellas”, 1821


Humans don’t want to be alone. Scientific and philosophical speculation about life in the Universe is one of our most ancient and frequently sensationalised pursuits. As early as the 5th century BC, Democritus hypothesised that there are ‘innumerable worlds’, some colliding, some declining, others flourishing, and many, he controversially claimed, harbouring life itself. Our most recent century has taken this curiosity to both extremes. On the one hand, mass cults, billion-dollar investments in a so-far fruitless search for intelligence, and even the development of ‘new’ psychopathologies (such as the abduction phenomenon). On the other, religious denial of cosmic pluralism and the assertion of human uniqueness. All of these phenomena have, to a certain extent, used extraterrestrial life as their canvas. Indeed, 2010 could act as a case study in extraterrestrial hyperbole. The discovery of Gliese 581 g (the so-called ‘Goldilocks planet’) and arsenic-eating bacteria in Mono Lake, California, both hit international headlines for weeks, with the latter subject to a highly cryptic, hype-building marketing campaign prior to the press conference. The explosive reaction to both announcements—and even the claims of the findings themselves—later came under heavy scrutiny from science writers and field specialists, yet it barely registered. The public demand for results reigned supreme.




The theory of panspermia suggests that ‘spores of life’ travelled to Earth, here depicted by an artist


Not many individuals have figured more prominently in public discussions of astronomy and the search for extraterrestrial life than Lord Martin Rees. As president of the Royal Society and master of Trinity College, he writes and gives passionate public lectures on issues of human and non-human life in the universe, with venues ranging from TED to Charlie Rose. At once a proponent of space exploration and a realist about the limits of current human understanding, Rees describes the search for life as “the grandest of our environmental sciences”. “It is one of the big questions which fascinates the public far beyond those who are interested in other kinds of science,” says Rees. “But there is another issue, which is how would we react, because it would make the galaxy much, much more interesting. On the other hand, if there isn’t any other life out there, then it has a compensation in that it allows us to feel more important, though we are very small in the cosmic context. But we could conceivably be the only place in the galaxy where there is interesting and advanced life.” “When we understand that, it will allow us to decide how likely it was. Was it just a very rare fluke, like shuffling a pack of cards and getting a perfect set? Or was it something we would expect to happen in any environment like that which prevailed on the young Earth?” Our ability to probe our solar system for possible life signs is accelerating rapidly with technological developments in robotics and remote control. With increasing observational and computational modelling power, we are even getting a better idea of what might lie in the wider Universe. Earth-based projects hope to detect signals from intelligent life or investigate the extremes in which life can exist. But will this satisfy public curiosity? How safely can we make assumptions about life in the universe with our limited data? Will ‘interesting’ life be found? “I don’t know if there is much chance of success,” says Rees, “but it will be so important if they succeed, that even though I’d assess the chance as much less than one per cent, it’s still good that people are doing it.”

The Universe is certainly an inhospitable place, given the freezing temperatures, colossal impacts and high-energy radiation. Hopes of finding extraterrestrial life may rest partly on ‘Goldilocks planets’ where conditions are ‘just right’ for life to thrive. Of course, the assumption that life on other planets will be similar to life on Earth may be flawed. Could we be missing out on a panoply of planets with their own specialised fauna, radically different to ours? Maybe. But we know of no alternative biochemistry as versatile as our own carbon-based one, and with no evidence to build on, our attempts to predict other life systems are purely speculative. With so many potential planets out there, it makes sense to prioritise those which look most hospitable to the kind of life we already know. So, what are these planets like, and how common are they? Conditions on Earth are finely balanced. The Earth receives enough solar energy to melt ice, but not so much that water is vaporised. It is massive enough to retain an atmosphere and maintain water in a liquid form, but not so dense that gravity becomes a crushing force. It has a magnetic field which deflects radiation that can damage genetic material. The ocean and atmosphere further provide a suitable ‘reaction flask’, where a host of chemical reactions that enable and sustain life can occur. Very small changes in conditions can cause environmental chaos. The Rare Earth hypothesis, popularised by Peter Ward and Donald Brownlee, argues that habitable planets are rare precisely because these conditions for habitability are so restrictive. Ward and Brownlee further argue that the positioning of habitable planets is very limited. Like stars, galaxies have habitable zones around them. Too close to the galactic centre, intense radiation, supernovae and collisions would quickly irradiate and pulverise life. Too far from the centre, the availability of heavy elements required for planet formation falls. Stars move around their galaxies over hundreds of millions of years, with their planets in turn orbiting around them. If a star leaves the habitable zone at any point, life on its planets



will be extinguished. Arguably, the emergence and survival of life depend on the precise path which its planet takes through the galaxy. Of course, Rare Earth has its critics. Though only one supposed ‘Goldilocks planet’ has been found, the tens of billions of exoplanets in our galaxy give a high chance that other Earth-like planets do exist. Critics also maintain that there is no reason why conditions on Earth could not be replicated elsewhere. In the same line of argument, should the evolution of life not be just as commonplace? The Kepler mission, launched by NASA in 2009, investigates these questions by examining stars with potential planet systems outside our solar system. It uses enormously sensitive equipment to detect the shadows of planets moving in front of their star. Whether any of them support life will not be known for some time, but these surveys do at least give us some idea of whether Earth is especially unusual, or if we might be just one planet of many where life could exist. Even if there are other habitable planets, what are the chances that life will have evolved? If it has, will it be intelligent? We do not know if the event that produced the first organism was an isolated incident or an inevitability on any hospitable planet given sufficient time. One line of argument is that the evolution of intelligent life must be quite unexceptional. After all, it happened here, and intelligence has proved so useful on Earth that it must be an evolutionary inevitability—intelligence might evolve wherever life exists. Others consider the moments in Earth’s history where our entire ecosystem was threatened with extinction—by meteor impacts or sudden climate change, for example—and claim it’s a miracle that intelligent life survived at all. The Search for Extraterrestrial Intelligence (SETI) project hopes to detect radio signals from any intelligent life. As radio signals from space can penetrate Earth’s atmosphere, they can be collected by radio telescopes on the ground using two strategies. The first is to carry out a sky survey, sweeping a large telescope over vast areas


looking for strong signals. The second strategy is to target the search, by pointing a telescope at a particular star for a long period of time, increasing sensitivity to weaker signals. The Allen Telescope Array (ATA), currently in development, will boost these efforts considerably. Its unique design allows continuous use, which will speed up the search and allow up to a million nearby stars to be studied. For the first time, SETI will be able to survey a significant proportion of the Universe. Should it so happen that any life is emitting radio frequency signals, the ATA will considerably increase the probability of detecting them. Despite these efforts, no signals have yet been detected, lending credence to one description of SETI as “searching for a gold nugget buried in a field”. “Now, if that is the case,” says Rees, “you might think that it would indicate life to be a sort of trivial feature in the Universe. But that would be wrong, because there’s another thing that we will learn from astronomy, which perhaps people outside of astronomy are not so aware of, and that is that the future is longer than the past. Life could become important even if we only have the initial spark of it.”
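As an aside on the Kepler approach mentioned above—watching for the tiny dip in a star’s brightness as a planet crosses its face—the size of that dip is just the square of the ratio of the planet’s radius to the star’s. The short Python sketch below is illustrative only; the Earth and Sun radii are standard values rather than figures from the article.

# Fraction of starlight blocked during a transit: (planet radius / star radius) squared.

R_EARTH_KM = 6_371       # mean radius of the Earth
R_SUN_KM = 695_700       # mean radius of the Sun

def transit_depth(planet_radius_km: float, star_radius_km: float) -> float:
    """Fractional drop in a star's brightness during a central transit."""
    return (planet_radius_km / star_radius_km) ** 2

depth = transit_depth(R_EARTH_KM, R_SUN_KM)
print(f"An Earth-sized planet crossing a Sun-like star dims it by {depth:.1e}")
print(f"That is about {depth * 1e6:.0f} parts per million")

A dip of less than one part in ten thousand, lasting only a few hours, gives a feel for why the mission’s photometers have to be so stable.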

Mono Lake, California, was the site of one of the year’s most popularised science headlines (middle). An artist’s rendition of the surface of Titan (left). Recent astrobiological studies of Mars (right) have been providing the most compelling evidence for possible life

our search for life in the Universe naturally starts within our own solar system. The proximity is certainly an advantage: we can send probes and rovers relatively easily, which enables us to directly examine the physical characteristics of planets, moons and meteoroids. One possible place we might find signs of life is Mars. This planet is apparently inhospitable: dry, dusty, with a thin and slowly dwindling atmosphere. Its atmospheric pressure is so low that water cannot exist in a liquid state. However, Mars used to be more Earth-like, with a thicker atmosphere allowing liquid water to exist. Microbial life may therefore have existed—or may even still exist—on the planet. To explore this, the state of ancient Mars is being studied using meteorites found on its surface. In early 2010, findings from ‘Spirit’, one of NASA’s rovers,




An artist’s impression of Vulcanoid asteroids, hypothesised to orbit very close to the sun. If they exist, they may give us insights into the formation of planets and the origin of life on Earth


further hinted at a warm, wet climate on Mars some four billion years ago, and since then the discovery of certain carbonate rocks has provided further evidence for the possibility of living organisms. Future missions to Mars are also planned to explore dry riverbeds, ice and rock types that only form when water is present, which should provide new insights. Europa, one of the moons of Jupiter, is seen as one of the most promising habitats for extraterrestrial life in our solar system, despite its dissimilarity to Earth. The Galileo space probe orbited Jupiter from 1995 to 2003 and showed that Europa’s surface is made entirely of ice, approximately 150 kilometres thick. Although Europa is too small to retain a dense atmosphere, traces of oxygen and small amounts of ozone have been detected; these are most likely held in the surface ice, but could also contribute to an extremely fragile atmosphere. Also, there is the enticing possibility of a vast underlying salty ocean where organisms might flourish. While future study of Europa is planned, comparison with organisms on Earth can help to explore possibilities for life in Europa’s potential oceans. In Antarctic sea ice, where temperatures can fall to -18 degrees centigrade, cold-tolerant microbial communities do exist. If similar organisms exist on Europa, they might lie dormant in subsurface ice for long periods, becoming active during brief episodes of local heating generated by tidal energy. This heating may also lead to the formation of hydrothermal vents similar to those found on Earth, which could offer a feasible habitat for chemosynthetic organisms like those that thrive around deep-sea vents here. Investigations at Lake Vostok, lying beneath the ice of the Antarctic Plateau, may give an insight into what life could be present on Europa. Vostok is believed to contain microbial life which has been isolated for 18 million years and has survived just the kind of harsh conditions likely to be found on Jupiter’s moon. In contrast, Titan, a moon of Saturn, is thought to

be one of the most Earth-like worlds found to date because it possesses a hydrocarbon-based hydrological system and a dense atmosphere. Saturn and its dozens of satellites have been explored by the Cassini-Huygens mission, a cooperative effort between the United States and Europe using robotic spacecraft. This mission revealed Titan’s rich atmospheric chemistry, which may be a lead into pre-biotic chemical pathways capable of kick-starting life. There may also be seasonal effects, suggested by variations in relative concentrations of gases. On the other hand, Titan was found to have extremely turbulent weather: high wind speeds, methane and ethane-based storm clouds, as well as temperatures plummeting to -180 degrees centigrade.

The search for another independent origin of life highlights our ignorance about the origin of life on our own planet. Since we have yet to synthesise an artificial cell system starting from basic chemical components, there is not one definite explanation, but rather many conflicting theories. One theory suggests that life originated elsewhere in the Universe, travelled through space and reached the Earth. This theory is called panspermia, meaning ‘seeds everywhere’, and arose from the discovery of possible organic molecules inside meteorites. Life is thought to have originated within about 500 million years of the Earth’s formation, when the planet was still very young. Supporters of the panspermia hypothesis speculate that life could not have originated at such a rapid pace in geological terms. However, panspermia does not solve the mystery of the origin of life on Earth; it only shifts it to another place in the Universe. It may be difficult to imagine living microorganisms travelling through space, withstanding radiation and landing safely on Earth. On the other hand, many microorganisms on Earth are very good at surviving incredibly harsh environments. They have adapted to thrive at extreme temperatures, in corrosive environments, and possibly even under high pressure and high salt concentrations. Bacteria have been known to thrive in radioactive waste and water boiling


out of deep-sea volcanoes. Some propose that life began in these places, where carbon dioxide-rich water came into contact with hydrogen-rich fluids rising from below the sea floor. An American initiative, the Deep Carbon Observatory, was launched in August 2009 and attempts to find the deepest forms of life on Earth by sampling rocks and microbes from deep within the crust. They are also building devices to simulate the interior conditions of the Earth in the lab. The observatory aims to investigate if the biochemistry deep in the structure of the Earth played a role in the origin of life. We are, as the saying goes, ‘all made of star dust’. But how did organic molecules arise from inanimate matter? In the 1950s, Stanley Miller and Harold Urey produced amino acids from a simple set of chemicals thought to have been present in the early Earth’s atmosphere. It seems probable that basic building blocks can be formed thanks to the properties of carbon. However, for more complex biomolecules, the question is still open. There are two models that try to address this question. Some scientists hypothesise that the formation of basic molecules was catalysed on the surface of iron sulphide minerals, leading to a very primitive form of metabolism. Others suggest a very basic life form based on the self-replicating capability of RNA. This can both store genetic information like DNA, and catalyse chemical reactions like proteins. However, experiments are still under way to investigate the sequence of chemical events that led to the synthesis of the first molecule of RNA. “Physics and chemistry are much easier subjects to answer than biology,” says Rees. “We understand how from simple life, monocellular life developed, then multicellular life, etc., but no one understands how the very first reproducing organisms formed. And that’s obviously a question that’s key in biology, irrespective of any interest in what happens beyond the Earth. And I would say—I’m not an expert in that subject—but I think many biologists would hope to have clearer ideas in the coming decades about


how life got started on the Earth.”

What drives the search for extraterrestrial life? Curiosity about our surroundings is an integral part of our psychology, and key to human progress. Nevertheless, we have invested more time, energy and speculation into investigating other planets than we have the inaccessible areas of our own planet. We discovered communities living on deep-sea hydrothermal vents only after we had already been to the Moon. We can model the state of the Universe to within one second of its beginning, nearly 14 billion years ago, yet have not fathomed how life began here less than four billion years ago. Though we know only a fraction of the species on Earth, we scour space for the tiniest signs of life. One could argue that, in the midst of a rapid biological catastrophe, we should focus on protecting what we have left before gallivanting around the solar system in search of extraterrestrial microbes. But there are ways in which this search could change our views of Earth and the life on it. We might find that life can only start on Edenic, Earth-like ‘Goldilocks planets’, where everything is ‘just right’. Then, the imperative to preserve the Earth would be redoubled. If we find nothing else out there, the implications would be even greater. Along with feelings of isolation, this would leave us the reluctant custodians of something much more fragile and special than we have understood. The only home we’ve ever known.

An impact crater on Europa, showing rafts of ice (left). Callisto is the second largest of Jupiter’s 63 known moons (middle). Aerogel is used in space missions to capture ‘stardust’ from comets (right)

Yvonne Collins is a PhD student in the Mitochondrial Biology Unit Wing Ying Chow is a PhD student in the Department of Chemistry Letizia Diamante is a PhD student in the Department of Biochemistry Natalie Lawrence is an MSc student in the Department of History and Philosophy of Science Amelia Penny is a 2nd year undergraduate in the Department of Earth Sciences


An Ordinary Genius


Ian Fyfe uncovers the personal life of Albert Einstein

The earliest photo of Albert Einstein


albert einstein did not set out to change the world. He was just a man in awe of the Universe, who wanted to understand and explain it. Superficially, his success and legacy resulted from his Universe-shattering theories, but underneath lies an intricate interplay of his complex character with personal struggles and the turbulence of the early 20th century. Einstein’s fascination with the physical world was unrelenting from childhood. He was just five when the movement of a compass needle set his mind in motion: “Something deeply hidden had to be behind things.” The awe of young Einstein grew into a “rapturous amazement at the harmony of natural law,” equivalent in his mind to religiosity. To him, the idea of a personal, prayer-answering God was “naïve,” but as a Jew in the early 20th century, he was not without religious inclination. Together with his awe and reverence for the universe, this manifested itself in a secular, “cosmic religious sense” that motivated his science and carried him to success that his independent, free-thinking character may otherwise have precluded. Einstein’s character underpinned the success of relativity. Although he was unknown in the academic world, he dared to publish, along with three other ground-breaking papers in the same year, a theory that required a “modification of the theory of space and time” and dismissed Newtonian physics. There is no better demonstration of his fearless independence of thought and disregard for received wisdom that characterised so much of his life before and after his success. His natural individualism was augmented by early experiences. Already a lonesome child, Einstein’s isolation was reinforced by attending a Catholic school as a Jew. This nurtured his tendency to rebel and trust only himself, an attitude cemented in his mind by religious disillusionment. As a schoolboy, he observed Jewish strictures, but also read much science. When he realised, through his scientific studies, that much of the Bible could not be true, he vehemently rejected religion. “Youth is intentionally being deceived by the state,” he explained in later life, “Suspicion against every kind of authority grew out of this experience.” Einstein’s dissident approach caused him difficulty at every stage of his education. He thought the teaching at his school obstructed creative thinking, and that rote drills and questioning in class

accustomed the pupils to military-style “mechanical discipline.” After leaving school early, through a mixture of his own disquiet and his teachers’ irritation, he required extra studies to gain his place at the Zurich Polytechnic. Once there, he continued to please himself; he often missed physics practical classes, and even when there, he often carried out the experiment as he thought best. On graduating, Einstein struggled to get an academic job. No-one would provide him with a good reference, and his teaching style was considered unacceptable—he did not teach facts, but encouraged independent thinking, an approach that later made him a popular lecturer. Again, his character had created a stumbling block, but one that may ultimately have led to his success. When Einstein did find a job, it was not as an academic. He became a patent examiner, a job that suited his imagination and critical mind. Importantly, it allowed him to continue with his own thought experiments in his own time. This freedom was exactly what he craved. Without the restrictions of academia, his free thinking and rejection of received wisdom thrived, and he could comfortably ignore Newton and the physics establishment to follow his own train of thought and turn physics on its head. Einstein’s cosmic religious sense could also lift him above the “merely personal” and when he struggled with the emotional side of life, science was his haven. This became clear when he separated from his first wife and faced the prospect of little contact with his sons. “No wonder that the love of science thrives under these circumstances,” he said, “for it lifts me impersonally, and without railing and wailing, from the vale of tears into peaceful spheres.” During the throes of this intense personal struggle, he perfected and published the most important theory in physics: general relativity.

Despite this apparent coldness and a necessity for solitude, Einstein’s relationships reveal an emotional and sometimes passionate man. While at Zurich, he fell in love with Mileva Marić, a fellow student who later became his first wife. Their relationship was founded on a shared passion for physics; they read together and discussed science in their letters. Yet underneath, there was a deeper connection: “Without you I have no self-confidence, no passion for work, no enjoyment of life—in short, my life is a void.” They had two sons, with whom Einstein’s relationship fluctuated. But their marriage broke down and when Marić moved away with their children, he was reduced to tears that betrayed his true attachment to them. “The thought of leaving the children stabbed me like a dagger every morning when I awoke.” Einstein married again, to his cousin Elsa, a relationship that caused the split with Marić. Although he stayed with Elsa until her death, his infidelity continued with a string of illicit relationships that reveal his often unmentioned magnetism and appeal to women. More subtle glimpses beneath the detached façade hint at Einstein’s depth. He was a talented violinist, often playing with others, and even celebrated his theory of general relativity by buying a new violin. As an old man, he endeared himself to his neighbours with recitals and serenades. He helped children with their schoolwork and displayed an eccentric wit. Once, while in a convertible as it began to rain, Einstein removed his hat, put it under his coat and turned to his puzzled companion: “You see, my hair has withstood water many times before,” he said, “but I don’t know how many times my hat can.” After general relativity was published and proved by the observation of deflected light, Einstein’s life changed forever. He became a celebrity across the globe. As a largely solitary man, the constant demand for his attention annoyed him, yet he did not avoid it and, as he became used to it, even relished it. His eminence, however, made him a target for the growing anti-Semitism in Germany between the wars. His work was attacked and his unorthodox thinking and radical theories used to exemplify the ‘menace’ of Jews. Although he had no interest in the religion itself, he embraced his Jewish heritage and spoke out in defence of the community. But after the assassination of Walther Rathenau, a Jewish friend in the German government, Einstein knew he was no longer safe in Germany. He fled, ultimately settling in America, having felt at home on previous visits. He appreciated the tolerance for creativity and free speech, a great contrast to Germany. He used this liberty and his public identity to advocate pacifism and call for an international authority to govern military power. He became a US citizen, his fourth change of citizenship, but his allegiance to his adopted state left an eternal blemish on the world.

Physicist Leó Szilárd, a friend of Einstein, told him of the potential for uranium to produce powerful explosives. In 1939, just after the outbreak of war in Europe and as America began to rearm, Einstein wrote to President Roosevelt warning him of a possible nuclear threat from the Nazis. His letter triggered the Manhattan Project. Although Einstein’s action appeared to contradict his pacifism, he stuck by his principles. He no longer considered pacifism a viable stance in the face of the Nazi threat, and approved nuclear weapons as a justified defence. But once it was clear that Germany had no such weapons, he opposed continuation of the project and condemned the use of the bomb. Later on, he made this clear: “Had I known that the Germans would not succeed in producing an atomic bomb, I never would have lifted a finger.” After the war, Einstein used the nuclear threat to bolster his argument for an international authority. He became the chairman of the newly formed Emergency Committee of Atomic Scientists, dedicated to the control of nuclear weapons. Although Einstein was considered naïve in his ideals and politics, he was now motivated not only by pacifism, but also, according to his own admission, by guilt. He dedicated the last years of his life to this cause. Since his death, studies of Einstein’s brain have not fully revealed the secret of his genius. But looking behind the science, it is clear that he was simply a man, albeit one who was captivated by the ways of the world and fearless in voicing his own observations, be it the scientific, the political, the social or the philosophical. He was cremated on the afternoon of his death, with only 12 people at his funeral. His ashes were scattered before news of his death reached the world. These modest final wishes were a poignant conclusion to the life of the very humble and very human man who changed the world.

“My life is a simple thing that would interest no one. It is a known fact that I was born, and that is all that is necessary.”

The Levitt Building, 270 Broadway, New York, where the Manhattan Project was launched

Ian Fyfe is a PhD student in the Department of Pharmacology


Test Tube Babies


On 25 July 1978, a heavily pregnant woman is waiting in Oldham hospital outside Manchester for her baby to be delivered by a planned Caesarean section. She has been admitted under a false name, and only a handful of the staff know who she really is. The responsible doctor left the ward earlier that day, ploughing his way through the throng of reporters and news crews waiting outside. In the evening, he returns surreptitiously via a side entrance. The police are called to the hospital, and the journalists disperse. The birth is planned to take place at night to avoid the fierce mass media attention. At 11:47 pm, a healthy baby girl is brought into the world. As she takes her first breaths, a decade of embattled scientific research culminates in its final success: the first baby conceived by in vitro fertilisation (IVF) is born. Fast-forward 32 years: it’s October 2010 and Louise Joy Brown is now a grown woman with a child of her own. The IVF baby count has reached well over four million individuals world-wide and, for his work leading to the development of IVF, retired Cambridge scientist Robert Edwards receives the Nobel Prize in Physiology or Medicine. After having started their collaboration in the late 1960s, Robert Edwards and his colleague Patrick Steptoe soon realised that their research into assisted human fertilisation would not go unnoticed. Their early publications involved the surgical removal of human egg cells using laparoscopy (nowadays known as ‘key-hole surgery’) and the development of protocols for growing embryos from human reproductive cells. This work quickly made its way into the media spotlight: “Test-tube time bomb ticking away” read the blazing headlines of the day, and a BBC-produced TV programme about cell fusion and in vitro fertilisation opened with images of the Hiroshima bomb. Religious and political leaders lined up to denounce or criticise the developing technology. Even scientific titans joined in the chorus, as James Watson (of DNA structure fame) stated during a Washington conference in 1971 that the technology


By October 2010, more than four million babies had been born with the help of in vitro fertilisation

would “necessitate infanticide”. During the first half of the 1970s not a single woman became pregnant through IVF and public scepticism remained high. An ectopic pregnancy—in which the embryo implants outside of the uterus—was achieved in 1975, but it was not until 10 December 1977 (coincidentally, the day Nobel prizes are given out) that Steptoe and Edwards could confirm the very first in utero growth of an in vitro fertilised human egg. The breakthrough was an instant media hit. News reporters started camping out in the hospital grounds, telephoto lens cameras were constantly aimed at the staff and expectant mothers, recording devices were smuggled in, private information and progress reports were smuggled out, and reporters used bribes and disguised themselves as cleaners to gain access to the ward. There was even a bomb threat to the maternity ward in early July 1978. Such frantic scenes are rarities nowadays, as IVF is an established therapy in most parts of the world. And given its success over the last 30 years, it is not difficult to characterise the development of IVF as one of the greatest scientific achievements in modern times. Its impact on the treatment of infertile couples since its initial implementation has been tremendous. In the UK alone, over 120,000 babies—roughly the population of Cambridge—have been born through the use of techniques now collectively termed ‘assisted reproduction’. In the decades after the birth of Louise Brown, IVF therapy has grown into a regulated global industry. In the UK, the



Sara Lejon gives her perspective on Nobel prize winning in vitro fertilisation technology




Conceiving a child is a human right, but are we ready to handle this privilege?


NHS offers up to three cycles of IVF to eligible couples. Some who fail continue their efforts abroad, paying thousands of pounds to private clinics. The success rate for IVF decreases dramatically with the woman’s age, dropping from nearly 30 per cent for under-35s to less than 1 per cent for women over 44. From a financial perspective, IVF is still hopelessly and increasingly ineffective. And as the average age of women undergoing IVF in the NHS continues to creep up—from 33 years in 1992 to 36 years in 2007— IVF clinics will find themselves confronted with women seeking treatment in their 40s or even 50s, for whom the success rate will be vanishingly low. Many people may balk at the thought of women in their 50s conceiving children, but considering that life expectancy is on the rise in many parts of the world, it may well become an accepted reality within the next few generations. Although many of the doomsday voices raised in its early days have since been appeased, IVF is by no means an uncontroversial therapy. Today the techniques used continue to raise ethical concerns and stoke philosophical debate. In the same vein, as assisted reproduction techniques move further into unchartered territory, there is a continual requirement for re-evaluation of current moral standpoints. How do we judge the morality of new techniques when frontline science is moving rapidly? One may take the same path as the Catholic Church and assume the relatively uncluttered viewpoint that “… artificial insemination and fertilization…remain morally unacceptable”. For the more discerning individual, each technique will likely present its specific moral problem. For example, in the case of egg donation, should egg donors receive payment (as sperm donors do), and to what extent will such a practice ultimately lead us to put a price on fertility? Or even on a human life? And what are the moral concerns in the case of preimplantation genetic testing? As of yet, only a few hundred women undertake such testing each year in the UK, but it may well become more broadly available as techniques are made less costly and time-consuming. In the case of testing for monogenic diseases such as cystic fibrosis and sickle cell disease, the dilemma may not be a particularly difficult one. But what about testing for late onset diseases such as Huntington’s disease or cancer, or diseases for which the results may well come out in terms of a risk rather than as a qualitative yes/no answer? And even with a certain answer, what right does a couple have to knowingly bring a baby with a disability or a predisposition to a life-shortening disease into the world, only because they desire a child? Or even a perfectly normal baby, for that matter?

As awe-inspiring and boundary-challenging as IVF and its derivative techniques are, they will bring us back to the same point over and over again: a couple desperately trying to conceive. The right to have a child is given without question to every person on the planet, and any attempts to curtail this right will inevitably raise questions of immorality. But, no matter how a child is conceived—naturally or by assisted methods— having a child should be regarded as a privilege, not a right. The decision to bring a new person into this world should never be taken lightly. It may be a bit much to ask prospective parents to consider complex issues such as overpopulation and the creation of homogenous gene pools, but since external factors may well force us all to make such decisions in a not too distant future, we may have to get used to the idea. Robert Edwards received his Nobel Prize in December 2010 as the scientific father of a small country of children, one that started with the seminal birth of Louise Brown. His and Patrick Steptoe’s achievements are unquestionably extraordinary. The future is thus set for further scientific advances, and it is anyone’s guess what impact it will have on generations to come—or what moral choices it may eventually force them to make.

Sara Lejon is a postdoctoral researcher in the Department of Biochemistry


This is Your Brain on Mozart Lindsey Nield discovers the hidden power of music ‘brain injury’ is a simple term with complex implications. It describes a variety of causes that can lead, among other effects, to reduced physical function, problems with language comprehension and expression, and memory impairment, possibly putting survivors at greater risk of depression. Maximising a person’s abilities after brain injury is vital to improve their quality of life and many innovative therapies have been developed to help restore lost functions and aid in treating depression. One technique which is gathering support is music therapy. The ability of music to affect our mood, at turns both relaxing and uplifting, has long been recognised. It is clear that, to appreciate this art form, our brains conduct a detailed analysis. The introduction of imaging techniques such as functional magnetic resonance imaging (fMRI) has enabled researchers to map how music is processed in a living brain.They found that several disparate areas of the brain are stimulated when listening to music, and yet more are activated whilst playing it. This makes music an ideal medium with which to access the brain during rehabilitation. Processing sounds, such as music, starts with the auditory portion of the inner ear (cochlea), which sorts complex sounds into their constituent elementary frequencies. This information is then relayed along the auditory nerve, finally reaching the auditory cortex in the brain, which analyses the music in terms of volume, pitch, melody and rhythm. A vast network of connections then activates regions all over the brain that perform diverse operations. The cerebrum keeps music in working memory and remembers if a tune is familiar. It is also the location of the motor cortex, which helps to control body movements. The cerebellum, a vital


The dopamine pathway is triggered and gives rise to a sense of enjoyment when we listen to music

control centre for reflex actions, balance, rhythm and coordinating muscle movement, creates smooth and integrated motion when hearing or playing music. Broca’s area additionally supports the timing and sequencing essential to music, speech and movement, and the frontal lobe is involved in the planning and coordination needed to play an instrument. Our emotional response to music, such as the feelings of joy, sadness and excitement, comes from the limbic system, deep inside the brain. Music primarily stimulates a structure called the ventral tegmental area, which is linked to feelings of pleasure and is also activated by eating, sex and drugs. An area called the amygdala, which is linked to negative emotions such as fear, is normally inhibited when listening to music. Music therapy was initially introduced in order to inhibit such negative emotion, while promoting emotional expression and support. For patients with impaired cognitive function, frustration often comes from the inability to express feelings. Listening to songs which reflect these emotions can lessen the isolation and, by substituting their own words for those of popular songs, patients can begin to express themselves. As cognitive ability improves, it may become possible for entirely new lyrics to be written, allowing better communication. Different types of music have varying uses during therapy. Music with a strong rhythm can stimulate brainwaves to resonate in sync with the beat, with faster tempos bringing sharper concentration and more alert thinking, and a slower one promoting a calm, meditative state. In this way, using an up-beat song can help brain injury survivors to focus on a task and improve their attention. These songs can also promote music making with the therapist or within groups to help build personal relationships and improve behaviour through feelings of inclusion. This technique has even shown some promise in treating autistic children who find social interaction difficult. Alterations in brainwaves can also change other bodily functions. Listening to music with a slower tempo can slow breathing and reduce heart rate, helping to prevent stress and promoting relaxation. There is some indication that music can also affect levels of various hormones and trigger the release of pain-relieving neurotransmitters. Amongst other benefits, this can lower blood pressure, which can



Playing music in groups improves personal relationships and enhances feelings of inclusion


reduce the risk of stroke and other health problems such as cardiovascular disease. It may also explain why premature babies are less stressed when music plays in the neo—natal ward, aiding their development. Music therapy can also be beneficial to sufferers of neurodegenerative diseases such as Alzheimer’s. Apart from the calming effects and social interaction already discussed, music can trigger memory. When the brain recognises a familiar song it may stimulate a pathway to memories that are otherwise inaccessible, helping patients to reminisce and connect with friends and family. In rehabilitation from brain injury, music can be used as a mnemonic device to access, aid and sometimes rebuild memory function. In recovering functions such as memory it appears that the brain is able to create new connections to bypass damaged areas. When we learn, neurons are activated and networks are created between different brain regions. If these networks are disrupted following neurological trauma, the brain is capable of plasticity—utilising the remaining tissue and reorganising its function to develop new pathways. Exposure and experience reinforce these new connections so that learning and training can help rewire the injured brain and recover as much ability as possible. As we have seen, music connects many areas in the brain and


can help drive reorganisation. This discovery has led to the use of music in rehabilitation of motor control. Due to the connections between the auditory and motor systems in the brain, musical rhythms can be used to help prime the motor system and drive timing during movement. Auditory cues act as an external timer to which patients try to synchronise their actions. By following such cues, recovering stroke patients are able to walk faster and with better control, reinforcing the improvements with long-term training. In a study that worked on arm movement in stroke patients, imaging showed that therapy had triggered brain plasticity, with additional regions being activated. Music and rhythm have also proved beneficial for Parkinson’s patients, quickening their movements and acting as a trigger and preventing the sudden halt of motion frequently seen in Parkinson’s sufferers. In addition to loss of motor control, damage to the left side of the brain can leave stroke patients with aphasia—the inability to speak. Since music utilises so many areas of the brain, patients are able to bypass injured speech centres by singing, which relies mainly on the right side of the brain, rather than saying what they want to convey. By gradually removing the melody some patients are able to retrain the brain and regain their speech. The true power and beauty of music is revealed in its ability to heal the brain. More than melody, more than expression, it is an enjoyment hardwired into our consciousness—an instrument whose value we may have yet to fully appreciate. Lindsey Nield is a PhD student in the Department of Physics



The Great Trigonometrical Survey
Tim Middleton explores how India was mapped and the world’s tallest mountain named
It was July 1819 and the monsoon was due. Lieutenant George Everest was in the middle of the Indian jungle between the Godavari and Kristna rivers with a team of 150 men. These jungles were home to numerous menacing creatures: humpbacked boars and tigers patrolled the forest floor; boa constrictors and bird-sized spiders lived in the trees; and hornbills ruled the air. Then the monsoon struck. Vegetation sprouted from every crevice in the cracked earth and paths soon became choked with plants. Insects swarmed everywhere. Dry riverbeds became raging torrents. Suddenly, Everest found himself cut off from his supplies by the river Musi, a tributary of the Kristna. His next goal, the hill of Sarangapalle, was on the other side of the Kristna itself. Everest’s elephants, carriers of his prized surveying instruments, refused to cross the swollen river. So Everest and 12 of his trusty men took to the water in a coracle, a small round boat woven from palms and covered in animal hides, which he had found on the river bank. Evening was drawing in, but they pressed on to Sarangapalle, a further 12 miles. By the time they reached the hills the heavens had opened. They spent the night in a raging thunderstorm with no food or tents, their Harris tweeds doing little to keep out the rain. Within a few weeks, the whole of Everest’s party had succumbed to fever. Such was the life of a surveyor in India. George Everest had taken leadership of The Great Trigonometrical Survey in 1823 upon the death of his predecessor, Colonel William Lambton, who established it in 1802. It was one of a number of surveys that were set up as part of the colonial enterprise; a way for the British to stamp their authority on Indian territories. It was essential for the British rulers “to have a complete geographical knowledge of the country for their revenue and administrative purposes”. The job of this survey was, quite simply, to determine where places were. Up until this time the only known method for locating a point on the Earth’s surface was to make astronomical observations, which took months if not years to complete. Lambton recognised that surveying by triangulation was the only sensible way to make a map of India. Triangulation involves measuring angles. If the distance between two locations is known and the angle from each of these locations to a third location

George Everest, who painstakingly mapped much of the Indian subcontinent

Measuring the width of a river by triangulation


is measured, then the rest of the distances and angles in the triangle can be calculated using trigonometry. Therefore, the job of a surveyor was to measure the distance along a baseline between two initial locations as accurately as possible and then to keep measuring angles and performing calculations in order to construct a whole network of triangles. However, because of the Earth’s curvature these triangles are actually being measured on the surface of a sphere. In spherical geometry, the angles in a triangle add up to slightly more than 180 degrees. This ‘spherical excess’ had to be taken into account. To make matters worse, the Earth is slightly fatter round the equator than round the poles so it isn’t actually a perfect sphere in the first place. Surveyors had, in other words, to be exceedingly confident in their mathematical ability. The angles were measured using an instrument called a theodolite, which was essentially a fancy telescope. The telescope was mounted so that it could rotate in both horizontal and vertical planes. Microscopes were mounted on the instrument for reading off the angle in each of these planes. For measuring distances, surveyors used a carefully calibrated chain, which was painstakingly manoeuvred along a baseline in order to measure its entire length. The surveyors were so meticulous that they even carried out experiments to check how the length of their chains varied with temperature. One such experiment concluded that a temperature change of one degree made a difference of seven thousandths of an inch over the length of a 100-foot chain. After his ordeal in the Indian jungle, Everest was plagued by fever for the next few years and was forced to return to England in 1825. But Everest used his time in England wisely, spending five years improving his instruments. He liaised with Troughton & Simms instrument makers in London to devise a new, smaller, lighter and cheaper theodolite. He also decided to make use of compensation bars, a pair of iron and brass bars strapped together, for measuring his distances. Since the two metals expand to different extents when heated, the effect of temperature on the length of the bars is compensated for. He also adopted a new surveying strategy. He realised that he didn’t need to cover the whole country with triangles but could instead create a gridiron pattern: an array of north-south and east-west traverses across the country that intersected at right angles.
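The trigonometry at the heart of the survey is compact enough to sketch in a few lines. In the Python illustration below, the one-kilometre baseline and the two theodolite angles are invented for the example rather than drawn from the survey’s records, and the small spherical excess described above is ignored, as it would be for a triangle this size (for a spherical triangle it is roughly the area divided by the square of the Earth’s radius, in radians).

import math

# One step of a triangulation: the baseline A-B has been measured with the chain,
# and a theodolite has measured the angles at A and at B towards a new point C.
# The law of sines then gives the two unknown sides, fixing C's position without
# ever pacing out the distance to it.  All numbers here are invented examples.

baseline_ab_m = 1_000.0      # assumed measured baseline
angle_a_deg = 62.0           # assumed angle at A between B and C
angle_b_deg = 48.0           # assumed angle at B between A and C

angle_c_deg = 180.0 - angle_a_deg - angle_b_deg   # angles of a plane triangle

def sin_deg(angle_deg: float) -> float:
    return math.sin(math.radians(angle_deg))

# Law of sines: each side is proportional to the sine of the opposite angle.
ac_m = baseline_ab_m * sin_deg(angle_b_deg) / sin_deg(angle_c_deg)
bc_m = baseline_ab_m * sin_deg(angle_a_deg) / sin_deg(angle_c_deg)

print(f"Angle at C: {angle_c_deg:.1f} degrees")
print(f"A to C: {ac_m:.1f} m,  B to C: {bc_m:.1f} m")

# Either new side can serve as the baseline of the next triangle, which is how
# the survey marched its network of triangles up the length of India.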



North Face of Mount Everest as seen from the path to base camp

Finally, in 1830 Everest returned to India. He started by triangulating from Dehra Dun to Sironj, a distance of 400 miles across the plains of northern India. The land was so flat that Everest had to design and construct 50 foot high masonry towers on which to mount his theodolites. Sometimes the air was too hazy to make measurements during the day so Everest had the idea of using powerful lanterns, which were visible from 30 miles away, for surveying by night. With all this new equipment, Everest’s party was now 700 men strong. The entourage included four elephants for the principals to ride on, thirty horses for the military officers and forty-two camels for carrying supplies. By the time Everest retired from the survey in 1843 most of the job was done. The survey’s line of triangles up the spine of India covered an area of 56,997 square miles, from Cape Comorin in the south to the Himalayas in the north. What is utterly remarkable is the accuracy they achieved. From time to time, they would measure the length of a baseline at the far end of a line of triangles to check their results. One such measurement of a 7.19 mile baseline differed only 3.7 inches from the value calculated by triangulation. The

Modern World, Modern Art
Ian Fyfe explores the way in which science and technology have revolutionised fine art
The latest of the weird and wonderful exhibits at the Tate Modern may not appear to have any connection to science. Neither may the masterpieces of Warhol, Dali, Picasso and Monet. But without scientific innovation, we would have had none of these. The 19th century saw the birth of modern science, with a surge of technological progress, revolutions in thinking and the founding of the scientific method. It is no coincidence that the same period saw the birth of modern art. Prior to the mid-1800s, art was used to produce realistic depictions of scenes. Subjects were almost always religious or mythical scenes, historical events or portraits of eminent people. Artwork was usually commissioned, and artists painted what their wealthy customers wanted. But within the last 150 years, modern science has changed the place of art in the world forever. Photography is undoubtedly the technology that had the most obvious impact on art. By 1840, glass lenses, photosensitive silver compounds and fixing solutions had been combined to produce the first glass negatives. It was not long before photography was a cheaper and more accurate means than painting of producing realistic pictures of people, places and events. One of the purposes of art had been undermined. At a similar time, in 1851, the Great Exhibition at Crystal Palace was the first of many international expositions to bring the latest industrial and technological advances to the public. They created enthusiasm for machinery, industry and the future; it was a new world, a new technological era. With photography threatening the value of art and the public being swept away with science and technology, it would take a revolution to prevent art from being left behind. The first step in this revolution was the emergence of Impressionism in the 1860s. The Impressionists,

With Impressionism, modern technology became the subject of art (right). Simultaneous contrast causes the same colour to appear different according to the colour it is next to: the central square is the same colour in both cases (above).

exemplified by Claude Monet, departed from conventional subject matter and, inspired by photography, captured moments from the new technologically-driven life; city street scenes, train stations, bridges and boats. But they did not set out to reproduce the scene accurately. Instead, they aimed to recreate the experience of a passing moment. The representation of the light was more important than the subject itself; a major departure from artistic convention and one which was triggered by the integration of technology into daily life. The impressionists’ techniques were equally unconventional and also relied on recent scientific progress. The use of colour in Impressionism was influenced by the colour theories of Michel Eugène Chevreul. As professor of chemistry at the Lycée Charlemagne, with expertise in dye compounds, Chevreul became director of the Gobelins tapestry works in Paris. During his work there, he noticed that the colour of a particular yarn appeared to change according to the colour it was immediately next to. He realised that this was due to an alteration in our perception of the first colour caused by the second, and published his theory of simultaneous contrast in 1839. The Impressionists incorporated his theories into their work to achieve the desired effects of light and shadow. Chevreul had discovered a perceptual oddity that forever changed the use of colour in art. Also key to the Impressionists’ success were discoveries and new manufacturing techniques that changed their materials. Science was applied to the development of paints. New pigments based on the recently discovered elements of chromium, cadmium, zinc and cobalt provided brighter colours, while the manufacture of synthetic pigments added completely new colours. More significant than the paints themselves was the collapsible paint tube. Before the 1840s, artists purchased pigments to grind and mix themselves and stored them in pigs’ bladders in their studios. But new manufacturing techniques allowed tin to be rolled thinly and pressed, leading to the invention of the squeezable tube by James G. Rand in 1841. The

CLAUDE MONET


Also key to the Impressionists’ success were discoveries and new manufacturing techniques that changed their materials. Science was applied to the development of paints: new pigments based on the recently discovered elements chromium, cadmium, zinc and cobalt provided brighter colours, while the manufacture of synthetic pigments added completely new colours. More significant than the paints themselves was the collapsible paint tube. Before the 1840s, artists purchased pigments to grind and mix themselves and stored them in pigs’ bladders in their studios. But new manufacturing techniques allowed tin to be rolled thinly and pressed, leading to the invention of the squeezable tube by John Goffe Rand in 1841. The tube was refined to incorporate a screw cap, allowing paint to be stored without drying.

The paint tube liberated the Impressionists and allowed them to work outside, since the paint was contained and easily transported; their choice of subject was unlimited. The new paints also contained paraffin wax and animal fat, resulting in a consistency that allowed thicker application. Because paints in tubes could be stored without drying, artists could afford a greater range of colours, and the new vibrant pigments helped them to recreate light effects. Pierre-Auguste Renoir, an eminent Impressionist, said that “without tubes of paint, there would have been no Impressionism”. And without Impressionism, there may have been none of what followed in the art world; the next generation of artists built on the revolution of the Impressionists and moved art forward with science.

By the early 20th century, science was changing the way people viewed the physical world, both literally and conceptually. Passenger steam trains were in common use and mass production of cars began in the early 1900s. Motorised transport carried people through their daily lives at speed; the world flashed by in flickers of light, and familiar forms blurred together. Meanwhile, Einstein was changing the way we thought of space and time, raising new questions about the nature of the world and our experience of it. The impacts on art were profound.

No longer restricted by artistic conventions of subject and technique, the artists of the early 20th century explored this new world with radical approaches. Traditional perspective, form and colour were discarded entirely, and rather than depicting scenes at all, paintings were used to convey concepts. The Futurists attempted to represent the movement and dynamism of the modern world, rejecting every artistic convention and embracing the triumph of science and technology as their subject. Picasso and the Cubists explored the experience of seeing, and how our perceptions of objects are constructed from continually changing perspectives. They captured this by including multiple views of the subject in one picture. Their work developed to include no recognisable subject at all, instead becoming metaphors for relativity and our visual experience of the world. Abstract art had been born.

The early 20th century also saw the first attempts to explain human behaviour scientifically. Sigmund Freud in particular was a major influence on the Surrealists. These artists, including Salvador Dali, created dream-like scenes with strange motifs, objects that merged into one another, and often sexual undertones in line with much of Freud’s work. They explored concepts of the mind, often turning to episodes or fears from their own lives for inspiration; this was completely new ground for art.

The invention of the paint tube gave artists more freedom

IAN FYFE


In the 1950s, the introduction of television and the expansion of print media and advertising, together with the mass production of consumer goods, created popular culture; this in turn spawned Pop Art. Personified by Andy Warhol, Pop Art took mass-produced symbols of popular culture and presented them as fine art. Mechanical techniques produced several identical pieces of artwork, challenging the concept of art itself.

In the modern digital era, art is still changing. Having provided an important trigger for the development of modern art, photography is now a major art form in itself. Digital cameras and sophisticated editing software allow the creation of almost any visual effect, and combining traditional materials and techniques with digital editing widens the scope of art further.

In a similar way to photography, the recent explosion in mass media and the internet may well have provided a new trigger for changes in art. They provide a continual bombardment of images, meaning that fine art can show us little new on a visual level. Instead, works such as Tracey Emin’s My Bed and Doris Salcedo’s Shibboleth (the crack in the floor of the Tate Modern) have come to the fore. This kind of conceptual art does not try to impress visually, but instead presents familiar images in unfamiliar ways, hoping to affect how we think.

Modern art such as Doris Salcedo’s Shibboleth attempts to make us think rather than impress visually

The art of today is often scorned; it seems wacky, obscure and overpriced. But the same could have been said of art at any stage of the last century and a half. Looking back, we can see that modern art is an attempt to represent and understand a rapidly changing, technological world. Modern science has provided new material, both physically and conceptually, to drive a gradual progression of art that has brought us to the modern day. There is no doubt that future science will continue to drive the evolution of art.

Ian Fyfe is a PhD student in the Department of Pharmacology

People are unsure whether Everest ever laid eyes on the mountain that bears his name. It was his successor, Andrew Waugh, who extended the triangulation network into the Himalayas and named the mountain after him. Waugh wanted “to perpetuate the memory of that illustrious master of geographical research…Everest.” Unfortunately, Waugh wasn’t aware of the local Tibetan name of Chomolungma; perhaps another mark of the nature of British colonialism. But it is nevertheless fitting that the tallest mountain in the world is at least in part named after one of the giants of surveying.

Today, the Ordnance Survey in the UK employs a combination of aerial photographs and GPS (the Global Positioning System) for its map making, meaning it can largely be done at a desk. However, the dedication of 19th century surveyors such as Everest in the face of so many adversities is commendable. Indeed, the Great Trigonometrical Survey of India has, quite appropriately, been described as “one of the most stupendous works in the whole history of science.”

Tim Middleton is a 3rd year undergraduate in the Department of Earth Sciences

MORE SCIENCE, MORE OFTEN

twitter.com/bluesci bluesci.co.uk



Mind Games
Tom Ash looks into the development of computer systems that can receive commands directly from the brain

Computer game controller design has seen a recent move away from buttons and joysticks towards more naturalistic input using motion sensors and voice recognition. The next step could be even more revolutionary and would not require any movement at all: simply an active brain.

The technology behind this control scheme, electroencephalography (EEG), is not new. Electrical activity in the brains of animals was first reported in the 19th century by the Liverpudlian Richard Caton. Since then, the science has developed steadily. Present-day EEG kits use scalp electrodes connected to a laptop. These electrodes detect the electrical activity of neurons in the brain, giving researchers feedback on when, where and how the brain is working.

A recent trend in EEG research has been the creation of brain-computer interfaces (BCIs). BCIs use the brain’s electrical activity to carry out simple tasks such as controlling a cursor on a screen, spelling out a word or moving a paddle in a game of Pong. The development of these schemes requires a training phase during which a computer learns to recognise the patterns that the brain produces as an individual thinks about different things. Alternatively, users can train themselves to change their brain states in ways the computer can interpret: for example, increasing their concentration to make an avatar rise on the screen and relaxing to make it drop again.
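As a rough sketch of that training phase, the snippet below simulates labelled ‘focus’ and ‘relax’ epochs, summarises each with a simple power feature and assigns new epochs to the nearest learned state. Everything in it (the simulated signals, channel count, features and nearest-mean classifier) is an assumption chosen to keep the example short and runnable, not a description of how any particular research system or consumer headset works.

# Minimal, illustrative BCI training loop: all numbers and the nearest-mean
# classifier are assumptions for the sake of a short, runnable example.
import numpy as np

rng = np.random.default_rng(0)
N_CHANNELS, N_SAMPLES = 4, 256          # one-second epochs at an assumed 256 Hz

def fake_epoch(state):
    """Simulate one EEG epoch; 'focus' epochs carry higher-amplitude activity."""
    gain = 2.0 if state == "focus" else 1.0
    return gain * rng.standard_normal((N_CHANNELS, N_SAMPLES))

def features(epoch):
    """Log signal power per channel: a crude stand-in for band-power features."""
    return np.log(np.var(epoch, axis=1))

# Training phase: record labelled epochs while the user deliberately
# concentrates or relaxes, then store the mean feature vector for each state.
training = {state: np.mean([features(fake_epoch(state)) for _ in range(50)], axis=0)
            for state in ("focus", "relax")}

def classify(epoch):
    """Assign a new epoch to whichever learned state its features are nearest."""
    f = features(epoch)
    return min(training, key=lambda s: np.linalg.norm(f - training[s]))

# Online use: a 'focus' decision might raise an on-screen avatar, 'relax' lower it.
print([classify(fake_epoch(s)) for s in ("focus", "relax", "focus")])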


Engineers and artists exploring the possibilities of brain-computer interfaces in Arizona, USA


In recent years, this technology has been brought out of the lab and into the consumer market by dedicated producers such as Emotiv, NeuroSky and OCZ. In order to make them affordable and easy to use, these companies have stripped down EEG research systems to their essentials. Unlike research sets, these do not require extensive scalp and electrode preparation, and they use fewer electrodes. They are also considerably cheaper, with some sets selling for less than 100 pounds. It is thus no surprise that game manufacturers are starting to look on with interest.

Although games using this technology are already on the market, they are not nearly as complex as modern console releases. Mindflex invites users to steer a ball around an obstacle course with their mind, while the Star Wars Force Trainer allows users to emulate Jedi ‘force’ tricks by using their minds to control a fan whose motion causes a ball to lift. Recently, household names have invested in developing the technology further. Microsoft Research is working on bringing low-cost brain-computer interfaces to the masses, and Square Enix, makers of the popular Final Fantasy series of computer games, have started to develop a game with EEG controls.

Not everyone is convinced, however, that these devices will be a gaming success. EEG experts point out that these cheap devices are not as capable of filtering out noise as research set-ups. In particular, electrical signals from scalp and facial muscle movements are likely to drown out true brain signals. Even if this problem is solved, researchers suggest that to make these systems work reliably, users have to be prepared to put a lot of time into optimising electrode placement and usage, and into training both computers and their own brains to ‘communicate’ effectively. For a generation of computer gamers accustomed to quick hits and instant gratification, this may be too much work.

Instead, it may be that the greatest beneficiaries of the consumer-EEG movement will be those most motivated to put time and effort into making them work. For patients with locked-in syndrome or those with motor disabilities, the ability to communicate with their mind or to control a wheelchair or prosthetic could immeasurably improve their quality of life. The time and intellectual investment required to train one’s brain to communicate with a computer may prove too much for the casual gaming market. However, gaming’s flirtation with EEG will leave a legacy of cheaper, easier-to-access brain-control technologies for those who really need them. No matter how many of us end up brain-gaming, it will not be a wasted venture.

Tom Ash is a PhD student in the Department of Clinical Neurosciences
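The noise problem described above is usually tackled by filtering the recording before trying to interpret it. The sketch below is a generic illustration rather than any device’s actual pipeline: it keeps a frequency band commonly used in motor-imagery research and suppresses slower drifts and faster muscle activity, with the sampling rate and band edges chosen as assumptions.

# Illustrative pre-processing step: band-pass filter an EEG-like signal so that
# slow drifts and high-frequency muscle noise are attenuated. Sampling rate and
# band edges are assumptions, not taken from any particular headset.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256                                  # assumed sampling rate in Hz

def bandpass(signal, low=8.0, high=30.0, order=4):
    """Zero-phase Butterworth band-pass filter."""
    b, a = butter(order, [low, high], btype="bandpass", fs=FS)
    return filtfilt(b, a, signal)

# Toy signal: a 12 Hz rhythm buried in broadband, muscle-like noise.
t = np.arange(0, 2, 1 / FS)
raw = np.sin(2 * np.pi * 12 * t) + 2.0 * np.random.default_rng(1).standard_normal(t.size)
clean = bandpass(raw)
print(f"variance before: {raw.var():.2f}, after filtering: {clean.var():.2f}")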


In the Driving Seat
Rosie Robison recounts her experience working at the Parliamentary Office of Science and Technology

Audi ‘e-tron’ electric car

Is your local MP interested in science? If you voted in Cambridge, the answer is a resounding yes. Dr Julian Huppert was a researcher at the Cavendish Laboratory before being elected to the House of Commons. As an MP with scientific training to degree level, however, he is very much in the minority. Despite this, our politicians are increasingly required to debate high-level scientific and technical issues. How should the NHS choose which cancer treatments to fund? Is current internet regulation adequate? And, one of the questions I grappled with this summer: could the use of electric vehicles really reduce the UK’s carbon emissions?

In May 2010, a few days after the general election, I began a three-month fellowship at the Parliamentary Office of Science and Technology (POST) under a scheme open to PhD students. POST’s purpose is to inform MPs and peers on scientific issues which are of relevance to policy. This is primarily achieved through the publication of four-page briefing documents on particular topics, called POSTnotes. My topic was Electric Vehicles, and my POSTnote was published in October.

The quickest way to become an ‘expert’ on a totally new topic (my PhD has nothing to do with electric vehicles) is to talk to the people who are. So, amongst others, I interviewed representatives from the automotive and energy industries, professors of engineering, and civil servants from the Department for Transport. I tried to guess the first questions an MP would ask about electric vehicles. They would probably be more interested in immediate practicalities (‘what can the technology do now?’) than in the wide range of future possibilities being considered by researchers. Other areas of direct relevance to policy, which, coming from a scientific perspective, I had not at first considered, include current consumer opinion and the potential impact of electric vehicles upon the UK automotive industry.

MPs are expected to appear well-informed when quizzed on just about anything. A POSTnote therefore aims to give concise background information on a topic and then highlight for parliamentarians the key areas which will need to be considered when policy is being made. POST has an impressive back catalogue of notes available on its website, from ‘CO2 Capture, Transport and Storage’ to ‘Diagnosing Dementia’.

Of course, science is not and should not be the only consideration for policy makers.

THOMAS WOLF

The level of social acceptance arguably plays a greater role in illegal drugs policy than the scientific evidence of their relative harms; whether this is right or not remains a matter for debate. However, in our increasingly technological world, science can make a real contribution to better policy making. It can help politicians foresee future problems, such as the effects of climate change, as well as future economic opportunities. Technology can aid better policy implementation; it will play a critical role in the 2012 Olympics. Finally, scientific knowledge is used to directly improve people’s lives in healthcare.

MPs are not the only non-specialists who benefit from accessible science. However, academic scientists tend to write primarily for other academic scientists. In part, this reflects the time and dedication required to understand scientific findings. There may be a feeling that too much would be lost in writing for the lay person. I myself wondered this summer whether it was misleading to give someone the impression that they had covered all there was to know about a subject in four pages. However, working at POST has convinced me that it is possible to get to the point in plain English, while still highlighting areas of uncertainty and complexity. Hopefully I went some way towards achieving this.

You can read Rosie’s briefing on electric vehicles at http://www.parliament.uk/post, where you will also find further details on POST’s fellowship schemes.

Rosie Robison is a PhD student in the Department of Applied Mathematics and Theoretical Physics


Weird and Wonderful
A selection of the wackiest research in the world of science

ALEX HAHN

Whale snot collected by helicopter

Collecting whale snot isn’t easy. Previously, snot had only been obtained from stranded whales or whales in captivity. Recently, however, researchers from the Zoological Society of London have come up with an ingenious method for collecting snot in the wild. Whales exhale through a blowhole in the top of their head, so the team hired a trained pilot to fly a remote-controlled helicopter into these exhaled ‘blows’, using binoculars to keep track of proceedings from a safe distance away. They collected their samples in two sterile Petri dishes, strapped to the landing gear of the helicopter with cable ties. When the helicopter returned from its five-minute flight, they retrieved the snot, wearing safety masks and disposable gloves to avoid contamination. Back in the lab, molecular biology techniques were used to analyse the snot for disease-causing bacteria from the whales’ respiratory system. Infectious diseases are currently considered a serious threat to wildlife conservation, and the researchers hope that their pioneering new technique will help conservationists to monitor whale diseases. tm

Oh F*@#!

Whether stubbing your toe, touching a hot stove or hitting your thumb with a hammer, we’re all guilty of uttering some expletives. However, scientists have shown that swearing really does help lessen pain. Ig Nobel prize winners from Keele University recruited students to plunge their hands into cold water whilst reciting either a swear word of their choice or a commonplace word used to describe a table. While repeating the swear word, the volunteers were able to withstand the pain for longer and perceived the pain as being considerably less. Why or how swearing has this effect is not known, but in the study the swearers also showed an increase in heart rate, which could indicate a general discharge of the sympathetic nervous system: the physiological alarm reaction known as the ‘fight or flight’ response. This fundamental response allows our body to deal with imminent stress. When faced with pain, swearing may also raise levels of aggression, thus reducing feebleness and enabling us to better deal with what could be a dangerous situation. But do not overdo it! Excessive casual swearing can result in the words losing their emotional attachment, meaning that next time you walk into that table corner there will be no expletive help at all. ns

ALEX HAHN

No luck in dating? Change your shirt

A surprising amount has been written about the psychological effects that the colours you wear have on others. For example, red clothing seems to increase success in competitive sports, whilst darker jackets contribute to perceived competence among female job applicants. A recent study from the University of Liverpool set out to test whether the colour of one’s clothes affects how attractive one seems to others. The researchers found that subjects rated photographs of members of the opposite sex as more attractive if they wore black or red shirts, and less attractive if they wore white or yellow shirts, with green and blue somewhere in the middle.

And bad news if you think you can just cover up that yellow shirt with a jacket. The study also showed that even when the shirts in the photographs were hidden from view, subjects still rated the red and black shirt wearers as more attractive than the yellow and white shirt wearers, suggesting that even the wearers were not immune to the psychological effect of their shirt colour. The moral? Wear red or black next time you’re trying to pick someone up. Or, if you can’t, at least try to make them believe that you are. tg



Write for BlueSci
Email complete articles or ideas to submissions@bluesci.co.uk

See your article in print...


Feature articles for BlueSci magazine can be on any scientific topic. They should be 1200 words and written for a wide audience.


Deadline for next issue is 11 February 2011


...or online
We need writers of feature articles and regulars for our new, improved website. Submissions can be of any length, and submitted at any time. For more information, visit www.bluesci.co.uk.

Contact editor@bluesci.co.uk to get involved with editing, graphics or production.


Centre for Science and Policy
The Sciences and Technology in the Service of Society

The Centre for Science and Policy is a networking organisation dedicated to building links between policy makers and experts in the sciences and engineering. Launched in 2009, it has already established itself as a unique and effective channel in the dialogue between scientific research and public policy.

Policy Workshops

Our Centre Interest Groups bring together scientists and policy makers interested in specific issues from biodiversity to behaviour change, and from economic and social innovation to genomic medicine. The Policy Workshops that they convene provide a platform for developing insight and mutual understanding.

Professional Development

Our Professional Development Seminars introduce early career researchers to the possibilities and realities of engaging with policy. They are a first step in developing the skills needed to understand and communicate the implications of their research to policy professionals.

Policy Fellowships

Our Policy Fellowships bring policy professionals to the University for one-on-one meetings with researchers, giving them fresh insight into their policy specialisms, and helping them build a network on whom they can call for advice. They remain Policy Fellows for two years, engaging with the work of the Centre.

Network of Associates

The CSaP network is our most valuable asset, connecting together policy makers and researchers across disciplines and departments, and creating lasting links between them. Everything we do, including our lectures and seminars, is designed to use and reinforce the network.

The Centre welcomes offers of assistance from those who would like to gain experience in the operation of the Centre. To learn more about the work of the Centre for Science and Policy, please contact enquiries@csap.cam.ac.uk, call 01223 768392, or visit www.csap.cam.ac.uk where you can also sign up for our newsletter.

Read about one internship experience with CSaP at www.bluesci.co.uk

