BlueSci Issue 06 - Easter 2006


Cambridge’s Science Magazine produced by

Issue 6

Easter 2006

in association with

www.bluesci.org

The Energy Crisis: What are our options?

Mobile Dangers: Are phones really a health risk?

Opinion: Views from Cambridge

• Drugs in the Sewage • Quantum Cryptography • Time Truck • Gaia • Pharmacogenomics •




contents

Features

Does Urban Sewage Have a Drug Problem? .......... 11
Oliver Jones tackles the problem of contaminated sewage

For Your Eyes Only .......... 12
Tristan Farrow explores the mysterious world of quantum communications

The Gaia Hypothesis: Yet Another Greek Tragedy? .......... 14
Is the Earth a living organism? Louise Woodley reviews the debate

The Future of Medicine .......... 16
James Pickett investigates how genome sequencing could revolutionize medicine

Hold the Phone .......... 18
Gemma Simpson explores the myths and realities of mobile phone use

Plus, on the web: Evi Goloni interviews Roger Penrose; Tess Leslie investigates bovine methane output; Neil Singh looks into recent research on metabolic syndrome

Regulars

Editorial .......... 03
Cambridge News .......... 04
Focus .......... 06
Opinion .......... 10
A Day in the Life of... .......... 20
Away from the Bench .......... 22
Initiatives .......... 23
History .......... 24
Arts and Reviews .......... 26
Dr Hypothesis .......... 28

Front cover. Main image: Equinox Graphics, www.e-nox.net. Earth from space: NASA Headquarters – Greatest Images of NASA (NASA-HQ-GRIN)




AVMG

From The Editor

This term, BlueSci discusses some pressing issues that might help put those looming exams into perspective. The FOCUS section reviews the energy crisis and considers what we can do to reduce our dependence on fossil fuels. Our new OPINION page gives an opportunity for you, the BlueSci reader, to share your views on science-related matters.

Our features this issue cover a wide variety of topics. Following last issue's article on quantum computers, FOR YOUR EYES ONLY investigates how quantum cryptography can keep data safe from prying eyes by employing the laws of physics. On the theme of science and the environment, THE GAIA HYPOTHESIS takes a look at the history and future of James Lovelock's controversial theory; and DOES URBAN SEWAGE HAVE A DRUG PROBLEM? brings to light the surprising levels of prescription drugs excreted into the water system. As if there wasn't enough to worry about already, HOLD THE PHONE analyses whether mobile phones are frying our brains. Finally, perhaps one day we will carry around a card containing our genome; could this be the FUTURE OF MEDICINE?

Make sure you also check www.bluesci.org for material complementary to BlueSci in print. Look online for weekly news updates, an interview with the world-famous mathematician Roger Penrose, a story on how to stop cows belching, and much more. Soon we'll be adding podcasts to the line-up too, so put us in your favourites and check back frequently.

As always, if you'd like to contribute to the magazine, or help with editing, production, graphic design or our new, enhanced website, don't hesitate to get in contact with us. I hope you enjoy the sixth issue of BlueSci.

Tom Walters
issue-editor@bluesci.org

From The Managing Editor

Welcome to another issue of BlueSci. As usual, there is much progress to report and a few small changes to our basic format. The first, which you may notice in the print edition, is that we have changed ON THE COVER to the OPINION page, the aim being to invite more discussion of hot topics in Cambridge science research. If you wish to respond to a topic or suggest one of your own, please contact us at submissions@bluesci.org.

Secondly, our news and events team have been doing an excellent job of reporting Cambridge-related science news every week on the website www.bluesci.org. Furthermore, Mike Marshall, our news editor, is also writing an accompanying editorial—often expressing strong opinions! If you feel provoked to respond to these or to join the news team, please contact news@bluesci.org; we'd love to hear how many of you are finding this service useful.

Finally, this term Cambridge University Science Productions (CUSP), our parent society, has been podcasting interviews with the lecturers involved with the prestigious Darwin Lectures. You can download these at www.cusp.org.uk/podcast. We are hoping to use the skills that this team have learnt to create a weekly BlueSci podcast to accompany the news page. Watch this space and the website!

We hope that you enjoy a BlueScied summer.

Louise Woodley
managing-editor@bluesci.org

Editor: Tom Walters
Managing Editor: Louise Woodley
Submissions Editor: Ewan Smith
Business Manager: Chris Adams

Design and Production
Production Manager: Ryan Roark
Pictures Editor: Sasha Krol
Production Team: Sheena Gordon, Nichola Hodges, Helen Stimpson, Jonathan Zwart

Section Editors
News Editor: Michael Marshall
News Team: Richard Van Noorden, James Pickett, Lucy Heady, Gemma Simpson, David Jones, Gurman Kaur, Laura Blackburn, Anne Hinton, Lizzie Fellowes-Freeman
Focus: Helen Stimpson, Michael Marshall, Jon Heras, Sheena Gordon, Tom Walters
Features: Anne Hinton, Margaret Olszewski, Helen Stimpson
Opinion: Sheena Gordon
A Day in the Life of…: Maya Tzur
Away from the Bench and Initiatives: Kathryn Holt
History: Varsha Jagadesham
Arts and Reviews: Owain Vaughan
Dr Hypothesis: Rob Young
CUSP Chairman: Björn Haßler
ISSN 1748–6920
enquiries@bluesci.org

Varsity Publications Ltd 11/12 Trumpington Street Cambridge, CB2 1QA Tel: 01223 353422 Fax: 01223 352913 www.varsity.co.uk business@varsity.co.uk BlueSci is published by Varsity Publications Ltd and printed by Cambridge Printing Park. All copyright is the exclusive property of Varsity Publications Ltd. No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, without the prior permission of the publisher.

Next issue: 6 October 2006 Submissions Deadline: 10 July 2006 www.bluesci.org

Produced by CUSP & Published by Varsity Publications Ltd



Cambridge News

Gene Variants Linked to Depression

Natural variations in the serotonin transporter gene lead to altered behaviour. Serotonin is a chemical produced in the brain that is important in many aspects of behaviour, including mood, emotion and appetite. Once it has been released, serotonin transporters act to recover it, ensuring that its effects are not too pronounced. A team of Cambridge researchers now suggests that natural variations in the serotonin transporter gene can alter people's response to certain chemical conditions in the brain.

Each individual has two copies of every gene in their genome—one from each parent. These two copies are called alleles and are not always identical. The gene that codes for the serotonin transporter, 5HTT, is one such variable gene. Sometimes the gene contains an extra 40 bases, or 'letters', of DNA. The different alleles code for slightly different proteins: a long and a short form. It has already been shown that people with the short form are more likely to develop depression.

The new research, led by Dr Jonathan Roiser and published in the journal Neuropsychopharmacology, suggests that a person's alleles can also change their response to reduced levels of a key amino acid, tryptophan. Reducing the body's levels of tryptophan causes reduced serotonin production and is already known to cause people who have recovered from depression to relapse temporarily. Volunteers were given a test in which the chance of getting a reward changed throughout, in order to see how their

Papers for Ghana?

Why are Monkeys Colour-Blind?

Aidworld

A team of software developers from the Cambridge NGO Aidworld left for Ghana in late March. Their aim is to produce software that will improve access to online research resources for academic communities in the developing world. There are several initiatives, for example from the World Health Organization (WHO) and the UN, to make research widely available. Nevertheless, many academics still do not have access.

Academic papers are typically stored online as PDF files and can be very large, often hundreds of kilobytes. Searching for and downloading such files over a poor connection is very time-consuming and incurs enormous cost. Taking into account living costs and average salaries, an Internet user in Ethiopia pays over one hundred times the cost of the same connection in the US. "The result of this," explains Dominic Vergine, CEO of Aidworld, "is that researchers struggle to access up-to-date information." This limited access works both ways: "There is a lot of unique research being done in developing countries which isn't getting fed back into the international academic community."

While in Ghana, the Aidworld team will work with the UN and the WHO to address the three main problems with online materials: slow speed, poor bandwidth management and intermittent connectivity. They will build on their 'Loband' project, a website simplification service developed last year which serves 800 text-only web pages per day to people in developing countries and accelerates Internet access by about 10 times. LH

www.aidworld.org

Aidworld representative Tariq Khokhar (right) talks with Eric Acquaye (left), a network administrator at the Council for Scientific and Industrial Research, Ghana


performance changed with increasing chance of a reward. Normally, people will try hardest when they have a high chance of success. People with two copies of the long allele performed the same, or better, when they were experiencing reduced tryptophan levels than when their tryptophan levels were normal. However, people with two copies of the short allele showed the opposite result, performing worse when their tryptophan levels were low. The researchers suggest that this tendency towards low motivation may help explain why the short allele is associated with depression.

The work was a collaborative effort between several departments on Cambridge's Addenbrooke's site, and researchers in London and California. DJ

Humans, apes and some monkeys see in full colour, but most New World monkeys display some form of colour-blindness. According to a Cambridge scientist, the evolutionary advantages of full colour vision are not simply due to being able to spot fruit against a background of leaves, as has previously been suggested.

Mauricio Talebi, of the Department of Biological Anthropology, together with colleagues in the US, has recently investigated the peculiar colour vision of the woolly spider monkey, or muriqui, an endangered species found in Brazil. Humans see in 'full colour' because the retina contains three types of light-sensitive photoreceptor. These respond most strongly to different colours of light: one to blue, another to green and the third to red. We share this distinction with only a few other mammals, one of which, the howler monkey, is closely related to the muriqui.

But the muriqui, like many of its relatives, shows variable colour vision: some females see in full colour, whilst males and other females are missing a photoreceptor and so cannot distinguish reds and greens. Also, individual monkeys have subtle differences in their photoreceptors. Talebi's team showed that, in the case of the muriqui, these differences are caused by variations in the gene encoding the photoreceptor.

One theory is that full colour vision enabled monkeys to better pick out red fruit or green leaves from a background of trees. However, according to Talebi, writing in the journal Molecular Ecology, this does not explain the complex colour vision differences between closely-related monkey species with similar diets. An alternative theory has recently been put forward by an American team. Dr Mark Changizi and colleagues, of Caltech, writing in Biology Letters, suggest instead that full colour vision may have evolved to detect blushing. Primates with bare faces would have an advantage if they could detect changes in skin tone, which typically signal emotional states like aggression or arousal. Changizi points out that colour-blind monkey species typically have their faces and rumps covered in fur, so little skin is visible; in these cases, seeing in full colour would provide little advantage. RVN



A new material that could one day be used in the cooling of electronic chips and domestic refrigerators has recently been identified by scientists at the University. Their results, published in Science in March, demonstrate that the removal of an electric field from thin films of a specially prepared metal oxide caused a large decrease in the material's temperature.

The phenomenon, known as the Electrocaloric Effect, was first identified in the 1970s. Alex Mischenko, lead author on the paper and a third-year PhD student in the Department of Materials Science, explains: "You apply an electric field, the material gets hotter, and then you release some part of the heat to the environment. You then switch the field off and take some heat from the thing you want to cool down."

When the Electrocaloric Effect was first observed, the changes in temperature were very small, because of the materials being used. Mischenko's material, a perovskite oxide containing a mixture of lead, zirconium and titanium, produced a cooling effect a hundred times greater than previously measured. This offers the possibility of producing temperature changes large enough to be used commercially, for example in microchips. However, in the current experiments, the best cooling effect was only observed once the material had first been heated to above 200°C. Mischenko hopes that, by altering

Professor Gerry Gilmore, the Cambridge astronomer, predicts that ground-based astronomy could be wiped out by 2050 as global warming causes increased cloud cover. Clouds and aircraft condensation trails, 'contrails', have always irritated the astronomical community. Global warming makes matters worse by increasing the amount of water vapour in the atmosphere through evaporation, contributing to overall cloudiness.

Professor Gilmore chaired a study looking into how global warming and rising air traffic will influence the impending 100-metre-wide Overwhelmingly Large Telescope (OWL) in Chile. The OWL is intended to probe planets around nearby stars and look for extremely faint objects in the night sky. It is just one of a proposed series of large telescopes designed to observe the skies with unparalleled precision. According to Gilmore, such studies would instead have to be carried out in space.

Critics of the study argue that it will be impossible to fill the entire sky with clouds. Speaking to reporters in London, Professor Gilmore said: "There are places where you get relatively fewer clouds—that's where we put our telescopes—but there is nowhere on Earth that you don't get clouds and aeroplanes." Some of the most important terrestrial telescopes are located in Hawaii, the Canary Islands and South America—all popular holiday destinations and therefore likely to see the largest rises in air traffic. Professor Gilmore warned: "You either give up your cheap trips to Majorca, or you give up astronomy. You can't do both." GS

Laura Singh

My Telescope is Foggy

Alex Mischenko in the Department of Materials Science

the composition of the material, he will be able to obtain a similar effect at room temperature. Further work is also required to make the material less toxic.

This discovery comes at a time when industry is showing strong interest in new cooling technologies. Computer circuits are becoming ever more complex and require more advanced cooling systems. Meanwhile, current methods of domestic refrigeration are notoriously inefficient, requiring large amounts of energy. The search is therefore on for greener methods of cooling. JP
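The field-on/field-off cycle Mischenko describes can be put in textbook form. The relation below is the standard thermodynamic estimate of an electrocaloric temperature change (not taken from the paper itself): here $P$ is the material's polarization, $E$ the applied field, $\rho$ its density and $c_E$ its specific heat.

```latex
% Maxwell relation linking entropy change to the pyroelectric response:
%   (dS/dE)_T = (dP/dT)_E
% Integrating over the field change gives the adiabatic temperature change:
\Delta T \;=\; -\int_{E_1}^{E_2} \frac{T}{\rho\, c_E}
      \left( \frac{\partial P}{\partial T} \right)_{\!E} \mathrm{d}E
```

A large $(\partial P/\partial T)_E$, as found near a ferroelectric transition, is what makes thin films of materials like Mischenko's attractive: the same field change produces a much bigger $\Delta T$.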


Keeping Cool at 200°C

The Scientific Society had a successful Lent Term, with a suitably varied mix of talks. Their Founders' Dinner this term will feature Professor Lord Robert Winston, and is selling out fast.

Their first talk was by Dr Paul Digard and discussed avian flu, which at the time was receiving massive media exposure. Given that these talks are arranged months in advance, this was an impressive bit of prescience. In another topical talk, Professor Peter Crane, Director of Kew Gardens, described his work developing workable strategies for conserving biodiversity. More esoterically, Dr Allan McRobie described his work applying mathematics to nude art. There is a field of mathematics, Catastrophe Theory, which has previously been used mostly for the study of, er, catastrophes, such as ships capsizing. It operates by creating mathematical landscapes, and therefore can also be used to describe the human body.

Perhaps the most provocative talk of the term was given by Mark Cantley, of the EC's Directorate for Life Sciences. Dr Cantley is a proponent of biotechnology, arguing that biotechnology-derived crops lead to reduced pesticide use and that GM crops are safer than conventional crops. Given the current public antipathy towards such technologies, it is clearly essential that alternative views are heard.

Cong Cong Bo (SciSoc)

CU Scientific Society: Catastrophes, Biotechnology and Robert Winston

Dr Martin Bucher of l’Université Paris-Sud discusses dark energy

The newly-elected Co-President of SciSoc, Adrian Slusarczyk, said: "Our goal is for the Scientific Society to be the society for everyone in Cambridge interested in science—and there's a lot of us!" MM

Some of the events for Easter Term have already been announced, including: Professor Sir Sydney Brenner, winner of the 2002 Nobel Prize in Physiology or Medicine, on 'Biological Computation: The Machinery of the Cell', and Professor Edward Hillhouse on 'Stress: Birth, Survival and Death'. Talks are normally held in the Pharmacology Lecture Theatre on Tennis Court Road. All details can be found at www.scisoc.com.

news@bluesci.org



Department of Biochemistry

Data: M. Imhoff, NASA GSFC and C. Elvidge, NOAA NGDC. Image: C. Mayhew and R. Simmon, NASA GSFC.

Focus

FOCUS: The Energy Crisis
Is Nuclear Power Our Only Option?

Energy – the Changing Climate

Sir Tom Blundell discusses the challenge of reducing energy consumption

The affordable motor car, cheap flights, warm housing, modern manufacturing and intensive agriculture underlie our entire way of life in the West. All depend on access to abundant and instantly available energy. But some 2.5 billion people currently have no access to modern energy services, and many who have limited access seek more. It is not surprising that world energy consumption is expected to grow at the rate of 2% a year in the future.

All forms of energy production have substantial effects on the environment: damaging air pollutants from fossil fuels, intrusive wind farms in upland scenery, radioactive emissions from the reprocessing of spent nuclear fuel, and destruction of woodlands to supply cooking and heating fuels. However, the most serious damage will be done by carbon dioxide from the burning of fossil fuels, the largest single source—accounting for 75%—of greenhouse gas emissions from human activities, and thus the largest cause of global warming.

The Royal Commission on Environmental Pollution was one of many to make this point in its report Energy—the Changing Climate, published in June 2000. The response in the 2003 white paper followed many of our recommendations. In particular, the Commission were delighted that our recommendation to reduce carbon dioxide emissions in the


UK by 60% by 2050 from the current level has been taken up by the government. This recommendation was derived from an analysis of the trends in energy consumption and an allocation of emission rights to nations on a per capita basis—enshrining the idea that every human is entitled to release into the atmosphere the same quantity of greenhouse gases (Meyer, 2001).

This is not just a challenge for the future; much must be done now. There are many opportunities for large efficiency improvements in the use of energy by manufacturing industry, commercial and public services, households and transport. A particular problem is civil aircraft, which fly in the tropopause, where nitrogen oxides, water vapour and particles all contribute to radiative forcing, increasing the greenhouse effect. The impact is a factor of nearly three over the carbon emissions alone.

On the energy supply side, there is limited potential for further hydropower in the UK, as environmental concerns rule out major new schemes. The growing of energy crops could make a much larger contribution, at the same time increasing biodiversity and improving farmland landscapes. The proportion of electricity supplied by wind, waves, tides and sunshine can increase substantially, but the intermittency of these sources will pose growing problems in matching supply with demand, and a need to develop large new energy stores or novel


energy carriers. Hydrogen is one such energy storage medium and carrier, which can also make energy use more efficient.

In general, government policies focus too much on electricity and ignore heat. Combined heat and power (CHP) units, both small domestic and larger industrial ones, will ensure that we use low-grade heat properly and do not throw it to the sky, as we do now in many electricity-generating plants. Policies of the past have favoured the generation of electricity in ways that waste vast quantities of heat—heat that could be used to warm buildings.

Further sources of base-load electricity production will be required. The big challenge for any scenario, assuming 60% reductions of carbon dioxide emissions by 2050, is deciding how these cuts will be made. Reductions could come from the use of nuclear power, which is carbon free, although the energy costs of establishing the system must be taken into account. The alternative is the use of fossil-fuel-burning power stations with carbon dioxide capture and isolation, either by pumping the carbon dioxide back where we have extracted natural gas, or into rock strata under the sea. The latter has until recently been largely ignored by industry and government alike. Thus we need to solve the problems of either nuclear waste or carbon dioxide storage.

A major problem in costing energy is discounting. Upfront costs include build-




UK government targets aim to increase the amount of energy produced by renewable sources from 2.7% in 2003 to 20% by 2020. Renewable energy will without doubt play a significant role in meeting our future energy needs, but problems with consistency of supply mean that managing this energy is no straightforward matter. If we are successfully to decrease carbon dioxide emissions over the coming decades, there is a clear need for the development of alternative technologies. Here, we explore three approaches which could contribute to a significant reduction in greenhouse gases.

Why Nuclear Power May Not Be the Answer


Meyer, Aubrey. Contraction & Convergence: The Global Solution to Climate Change, Green Books (2001) Sir Tom Blundell is Professor of Biochemistry and Chair of the School of Biological Sciences, and was Chairman of the Royal Commission on Environmental Pollution, 1998–2005

www.bluesci.org

Michael Marshall believes the dangers and costs of nuclear power are too great

The British government has set a target of a 60% cut in carbon dioxide emissions by 2050, and increased use of nuclear fission is a major part of proposals to achieve this. Nuclear fission is the most controversial of all our energy sources, yet it has two crucial advantages. Firstly, it does not release carbon dioxide, and so does not contribute to climate change. Secondly, a small number of nuclear power plants can reliably provide large amounts of energy, unlike renewable energy sources which are, by their nature, variable in output.

Nevertheless, there are formidable arguments against the use of nuclear fission. Firstly, there are suggestions that the ability of nuclear power to tackle carbon emissions may not be as great as proposed. According to the Sustainable Development Commission (SDC), in their 2006 report Nuclear Power in a Low Carbon Economy, "even if the UK's existing nuclear capacity was doubled, it would only give an 8% cut on CO2 emissions by 2035".

In 2005, the Tyndall Centre for Climate Change Research produced a report, Decarbonising the UK: Energy for a Climate Conscious Future, which describes five future scenarios, each of which achieves a 60% cut in carbon dioxide emissions. One scenario achieves high economic growth and energy demand without nuclear power, using a range of technologies including biofuels, renewables and carbon capture. The report concludes that "a high consumption future need not have a high reliance on nuclear technology".

A key point in the Tyndall report is that aviation is the fastest-growing emissions source. The report claims that "without a step change in technology, aviation is likely to become the single most important emission sector by 2050." It seems unlikely that nuclear power can resolve this problem.

In addition to worries about the efficacy of nuclear power, there are practical concerns. Britain currently has 16 nuclear power stations. Half of these are due to close by 2010, and all but one will close by 2023. To cut carbon emissions using nuclear fission we must build a new generation of power stations. According to the SDC report, "the economics of nuclear new-build are highly uncertain". The companies which would be responsible for building new plants are British Nuclear Fuels and British Energy, and in theory they should bear the costs. In practice, though, it is likely that government funds would become involved if expenses spiralled.

There is also the consideration of safety. This problem is two-fold: firstly, there is the possibility (albeit a remote one) of a serious accident. Secondly, there is the problem of radioactive waste, which remains dangerous for thousands of years and must be safely disposed of. New generations of nuclear power stations would significantly increase our nuclear waste legacy.

Lastly, we must remember that climate change is a global issue. Even if increased use of nuclear power meets UK energy needs and cuts our carbon emissions, this alone is not the answer to the world's climate change problems.

Michael Marshall is a Scientific Literature Curator at the Sanger Institute

All images on pages 07-08 are by Equinox Graphics

ing power stations, barriers across rivers or buying machinery for carbon sequestration, but downstream costs like the disposal of nuclear waste are discounted heavily and so do not impact on costs. We tend to be left with nuclear waste with no one willing to take responsibility for the costs.

All new developments in energy production require further funding of research, as the present government recognises. But government funding decreased by 81% between 1987 and 1998, while privatisation and reorganisation of the industry also led to a significant decrease in research investment. So we have started from a very weak point in terms of implementing new technologies and encouraging innovation.

In summary, there remain huge challenges, both in decreasing demand and in increasing efficiency. We need much more sophisticated management of energy used for heating and cooling—increased use of CHP and a large deployment of alternative energy sources. We need to solve the problem of nuclear waste or carbon sequestration if we are going to move down either of the two routes for base-load electricity production. Most importantly, though, we need diversity in production for reasons of security as well as efficiency. Nuclear power may have become necessary now, especially in view of the slowness of governments in the twentieth century to invest in renewables in the UK, but it is not, and must not become, the only solution!




Bio-fuels for Road Transport

Robert Skelton reviews the use of fuels derived from plants

With concerns over carbon emissions, there is considerable pressure to increase the use of bio-fuels across the EU, and a recent directive states that by 2010 at least 5.75% of road transport fuel should be of bio origin. Though the technology does exist to achieve this target, much will depend on the level of subsidies, which at the moment vary widely across the EU. So what exactly are bio-fuels, and how are they used?

There are two forms of bio-fuel for road transport: bio-ethanol, a petrol substitute/extender, and bio-diesel. Bio-ethanol is the best established technology, and Brazil, which has the largest sugar cane crop in the world, has been fuelling a high percentage of its road vehicles with bio-ethanol since the 1980s. The fuel is produced from sugar cane using fermentation and distillation. The processes are not dissimilar from those used to produce spirits for drinking, though it is necessary to remove almost all water from the fuel.

Use of bio-ethanol requires some modification of engine timing, though about 10% can be added to petrol without making alterations. With growing interest in the EU and USA, some manufacturers, including Ford and Saab, now produce dual-fuel vehicles which can run on petrol or bio-ethanol. A drop in sugar beet subsidies and increasing grain surpluses have kindled interest in the UK, and British Sugar have received planning permission to build the UK's first bio-ethanol plant at Wissington, close to King's Lynn.

An alternative to bio-ethanol is bio-diesel. The term bio-diesel is generally applied to methyl esters of naturally occurring glycerides in the form of vegetable oils. The product is often referred to as Fatty Acid Methyl Ester (FAME). This technology is not as mature as bio-ethanol, though there are a number of large-scale plants in Europe. Manufacture involves reacting vegetable oil with methanol in the presence of a suitable catalyst, usually sodium or potassium hydroxide. The product is

the ester, with glycerol as a by-product. The two are separated and the ester is subjected to several purification steps. Provided that the product meets tight European standards, it can be used without engine modification, though there may be seal problems with older engines, as the ester attacks all forms of rubber. It also has detergent properties and will dislodge deposits in the fuel system, resulting in filter blockage. Hence the use of 100% bio-diesel is recommended only for new vehicles. Like bio-ethanol, bio-diesel is mostly sold only as a 5–10% blend.

The Department of Chemical Engineering have developed a novel process for the production of bio-diesel which has now been sold to a venture capital company. It is hoped that a commercial plant will be operational by the end of 2006. www.cheng.cam.ac.uk/research/groups/polymer/OFM/biodiesel.html

Robert Skelton is a retired Senior Lecturer in the Department of Chemical Engineering
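The transesterification step described above has a standard overall scheme, which can be sketched as follows (a textbook summary, not the Department's specific process; R denotes the fatty acid chains of the oil):

```latex
% Base-catalysed transesterification of a vegetable oil (triglyceride)
% with methanol, giving fatty acid methyl esters (FAME) and glycerol:
\text{triglyceride} \;+\; 3\,\mathrm{CH_3OH}
  \;\xrightarrow{\ \mathrm{NaOH\ or\ KOH}\ }\;
  3\,\mathrm{RCOOCH_3}\;(\text{FAME}) \;+\; \mathrm{C_3H_5(OH)_3}\;(\text{glycerol})
```

The glycerol phase is denser than the ester and separates out, which is why the process naturally yields the two streams the article mentions.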

Carbon Capture Paul Fennell and Stuart Scott discuss underground storage of CO2 In the UK we use 50 million tonnes of coal a year for electricity generation. This figure is dwarfed by use in China, where 1582 million tonnes of coal are burnt. Global reserves of coal exceed those of oil and gas combined, but if all the coal in the world were burnt it would be impossible to stabilise atmospheric carbon dioxide at an acceptable level. If we are to cut carbon dioxide emissions we must limit emissions from fossil fuels long before they run out. The development of carbon capture technology—to which both the EU and China are committed—is a key factor in limiting future emissions. Carbon capture aims to recover the carbon dioxide released from burning fossil fuels, ready for storage underground. Suggested storage sites include saline aquifers (vast underground water-containing rock formations) and depleted oil and gas wells. The Intergovernmental Panel on Climate Change suggests that full carbon capture facilities could decrease the emissions of a typical modern coal-burning power plant by up to 90%, making development of this technology an attractive


prospect. The sequestration of CO2 does, however, have difficulties: it requires the identification of stable areas for disposal on a geological time-frame, and has substantial infrastructure costs. Additionally, there are concerns over the possibility of leakage back into the atmosphere and of unpredictable environmental effects. In the UK we could sequester large quantities of CO2 in depleted oil and gas wells using the existing infrastructure. Injecting CO2 into these fields can also increase the amount of oil recovered. Our research in the Combustion Group at the Department of Chemical Engineering focuses on two technologies which could be used to produce pure carbon dioxide, ready for sequestration. One technology utilizes limestone (CaCO3) rocks to shuttle carbon dioxide from a dilute stream of carbon dioxide in a power station exhaust to a separate reactor where pure carbon dioxide is driven off. The solid CaO is recycled to react with the exhaust stream, and carbon dioxide is produced at a pressure and concentration suitable for sequestration. An alternative uses a solid carrier such as iron to transfer oxygen from the


air to react with a fuel: this technology produces a pure stream of CO2 because the oxygen reacting with the fuel has been separated from the air to start with. The time-scale over which these technologies could become commercially viable is, in essence, set by the price put on CO2 emissions. Enhanced oil recovery via CO2 injection is already a reality. Given reasonable political will (and sufficient funding), the technologies we are investigating could be working at full scale within 10 years. Other technologies which are able to produce a pure stream of CO2 are closer to market, for example oxy-fuel combustion, where the fuel is combusted in pure (and expensive) oxygen rather than air, and the scrubbing of CO2 from the gases produced by a power station using an amine. However, these latter two methods reduce power station efficiency, and so development of the full potential of carbon sequestration will require continued research. Paul Fennell is a Post-Doctoral Researcher at the Department of Chemical Engineering Stuart Scott is a Lecturer at the Department of Engineering
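The limestone shuttle described above can be sketched as a simple mass balance (a rough illustration using textbook molar masses; the capture fraction is an assumed parameter, since real sorbents lose activity over repeated cycles):

```python
# Rough mass balance for the limestone shuttle: CaO + CO2 -> CaCO3 in the
# capture step, then CaCO3 -> CaO + CO2 in the separate reactor where pure
# carbon dioxide is driven off. Molar masses are textbook values.

M_CAO = 56.08  # g/mol, calcium oxide
M_CO2 = 44.01  # g/mol, carbon dioxide

def co2_captured_per_tonne_cao(capture_fraction=1.0):
    """Tonnes of CO2 bound per tonne of circulating CaO, given the
    fraction of the CaO that actually carbonates each cycle."""
    return capture_fraction * M_CO2 / M_CAO

ideal = co2_captured_per_tonne_cao()       # a little under 0.8 t CO2 per t CaO
decayed = co2_captured_per_tonne_cao(0.2)  # a heavily cycled sorbent
```

The attraction of the scheme is visible even in this crude form: each tonne of cheap, abundant limestone-derived CaO can, in principle, shuttle a substantial fraction of its own weight in CO2 on every pass.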

Easter 2006


Focus

Images: UKAEA, EFDA-JET

Fusion Future Tom Walters talks to Chris Llewellyn-Smith about harnessing the reactions that power the stars With its potential as a non-polluting and safe source of power, nuclear fusion has been a dream of energy researchers for decades. If full-scale fusion reactors become a reality, they could provide plentiful electricity with no carbon emissions, no risk of catastrophic failure and very low running costs. With all this behind it, fusion is definitely the acceptable face of nuclear energy. So when could full-scale fusion become possible? I spoke to Chris Llewellyn-Smith, Director of UKAEA Culham, a major centre for fusion research worldwide, to find out his views on meeting the energy demands of the world. Culham is home to JET, the Joint European Torus, a collaborative fusion experiment involving researchers from across the continent. Design of JET began in 1973 and it is now coming to the end of its life, but the next big step in fusion research, ITER, is already planned. It is hoped that it will be the stepping-stone to a working prototype fusion reactor. ITER is a collaboration between the EU, Japan, the US, Russia, China, South Korea and India. “These countries are home to half the world’s population. This is a global response to a global problem,” says Llewellyn-Smith. “It’s expensive, and it’s very long-term.” ITER will take 10 years just to build, and eight or nine years to digest the results. After that it could take about eight years to build a working prototype. Llewellyn-Smith believes that with good funding and good will, there could be fusion power stations in about 40 years.

How Fusion Works Fusion is the reaction that powers the stars. Energy is liberated when nuclei of the lightest elements fuse together at high temperatures and pressures. The easiest fusion reaction to perform on Earth involves heating a gas of deuterium and tritium, two isotopes of hydrogen, to 100,000,000°C. The ensuing reaction releases 10 million times as much energy as an elementary chemical reaction. There is one deuterium atom for every 6,500 hydrogen atoms on the planet and tritium can easily be made from lithium, so both reactants are readily available in large quantities.
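The “10 million times” figure can be checked with an order-of-magnitude estimate (the 17.6 MeV yield of the D-T reaction and the few-eV scale of a chemical reaction are standard textbook values, used here purely for illustration):

```python
# Order-of-magnitude check on the "10 million times" figure. The reaction
# D + T -> helium-4 + neutron releases about 17.6 MeV; a representative
# chemical reaction releases a few eV per molecule.

E_FUSION_EV = 17.6e6  # energy of one D-T fusion event, in electronvolts
E_CHEM_EV = 2.0       # assumed energy of one chemical reaction, in eV

ratio = E_FUSION_EV / E_CHEM_EV  # close to ten million, as quoted
```

The exact ratio depends on which chemical reaction you pick for comparison, but any reasonable choice lands within a factor of a few of the figure quoted in the article.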

www.bluesci.org

To the observer, fusion may seem to have taken a long time to progress, but Llewellyn-Smith is quick to defend its development. “In the early 1950s people knew the basic physics of fusion but didn’t know about the basic physics of plasmas, and they grossly underestimated the difficulties. Sorting out the basic science took the 1950s and 1960s.” The technical challenges involved were, and still are, immense. The first step in making fusion work was to find the best configuration of magnets to hold several thousand cubic metres of gas that is ten times hotter than the centre of


the sun. “It wasn’t until the end of the 1960s that the basic configuration you need to do this was worked out, and from then it’s been a systematic process of scaling up in size”, says Llewellyn-Smith. This scaling up is a complex task, but he seems confident that it is not insurmountable: “things that would have seemed inconceivably complex 50 years ago, like a jumbo jet, do work routinely and we’ve learned a lot about technology, so I’m personally rather confident that the complexity problem will be mastered.” So what about the intervening period before fusion’s promises become reality? Should we invest in new fission reactors or in renewable energy? On this point, his response is simple: “probably both, actually”. “There’s a rather false debate going on at the moment: ‘do we want nuclear or renewables?’ The fact is that when you look at the size of the energy challenge, we probably need everything we can get, especially when you look on a global scale at the anticipated increase in energy use. A huge increase is not only projected but needed to lift billions of people out of poverty in the developing world.” It is unlikely that any one renewable energy source could provide more than 5% of the world’s energy needs. The only exception to this is hydroelectric power,


which already stands at 6%. “But”, he suggests, “we should take all those 5 percents and add them together.” Renewables are inherently variable in output, and with current technology, there is no efficient way of storing power. “To complement renewables, you need steady, base-load power around the clock. At the moment, the only options are to go on burning coal, oil and gas, or to use fission. We really need another option”, says Llewellyn-Smith. In the long term, fusion could fill this gap, but what about the intervening period? Should we build another generation of fission power stations in the UK? Llewellyn-Smith is pragmatic. “This country can easily afford to build one more generation of power stations without increasing the nuclear legacy enormously. Fission has, around the world, a pretty good safety record, it’s rather reliable and the costs are looking pretty reasonable, but I would be nervous about the world going onto a big expansion of fission forever,” he says. Llewellyn-Smith is realistic about the timescales involved in development of fusion: “With a blank cheque we could go out and build a fusion power station and it might produce a few kilowatts, but it would be very unreliable.” He believes that the real question is when it will be possible to build 20 reactors, at high reliability and low cost. This, he thinks, is still around 40 years away, “but we could get much surer about that date with more investment”. On the issue of energy research and development funding, Llewellyn-Smith is animated. “The level is pathetic”, he says. “Worldwide, the funding of energy R&D has halved in real terms since 1980. I find that absolutely staggering. In 1980 we didn’t have a problem—while we’ve learned that we have an energy problem, the funding of energy R&D has halved.” The statistics he gives are telling. “Public funding of energy R&D—which is the major source for long-term work—is less than 0.3% of the [worldwide energy] market. 
I don’t know of any high-tech business with a looming crisis where less than 0.3% is going into R&D. We need bigger energy R&D budgets.” www.fusion.org.uk Tom Walters is a PhD student in the Department of Physiology, Development and Neuroscience



Opinion

A Cambridge View-Point BlueSci readers share their thoughts on all matters scientific BlueSci’s new opinion page gives you the opportunity to voice your opinion on all matters scientific. Whether you want to increase awareness of a scientific issue or comment on a BlueSci article, we want to hear from you! Email opinion@bluesci.org

Space? No Thanks, We’re British When asked as a child what I wanted to be when I grew up, my reply would always be, “An astronaut, of course!” I thought that in time I would grow out of my asthma and short-sightedness. But there was one obstacle I would never grow out of: I was British. The UK has never been involved in manned space flight. I had great hopes in 1991 when Helen Sharman became the first Briton in space, but this proved to be a one-off commercial venture. The only other ‘Briton’ in space has been Michael Foale, who flies with NASA and has dual US/UK citizenship. Despite the disappointment felt by Britain’s budding astronauts, many regard this as a shrewd move. The International Space Station (ISS) has cost nearly $100 billion and has returned little by way of interesting science. The ISS has come to represent to UK policymakers all that is wrong with manned space flight. They dismiss manned missions as expensive, prestigious projects with little scientific reward. Enter the Royal Astronomical Society (RAS). In October 2005, the RAS commissioned astronomer Ken Pounds, environmental scientist John Dudeney and particle physicist Frank Close to investigate whether the UK should send humans, as opposed to robots, into space. Dudeney concluded “that the capabilities of robots have been vastly overestimated” and that “we need the imagination and adaptability of human beings.”

Saving Time Are you aware that we live in the Holocene epoch? Is it important for you to know this? The answer will depend on your research, and how connected it is to the Geological Time Scale (GTS). The GTS defines intervals of time that share a particular set of environmental conditions. An interval can be assigned, for example, based on the magnetic polarity being in the same direction (it hasn’t always pointed north). Global agreement of the precise dates for a time interval in the GTS is rare. Rather, upper and lower boundaries are set based on stratigraphic sections that clearly illustrate the environmental conditions of the time interval under consideration. The hierarchy and nomenclature of the GTS are modified from time to time. In recent years, the Tertiary Period, famous for the extinction of the dinosaurs at its boundary with the Cretaceous, has been relegated to a sub-era. The Primary and Secondary Periods, which predated the Tertiary, were long since done away with, so it seemed logical to remove the Tertiary. In the literature, there is little reference to sub-eras, with preference for discussing periods or epochs, which are smaller divisions of time, or eras, which are larger. We live in the Cenozoic Era, which encompasses both the Tertiary and Quaternary sub-eras. A recent and very controversial proposed change to the GTS has targeted the Quaternary Period. Whilst this might seem a logical continuation, the Quaternary is ‘our’ period; the time within which humans appeared on Earth. Referred to in the early part of the nineteenth century as the ‘Diluvial’ period (as many of the stratigraphic sequences were attributed to the biblical flood), the Quaternary marks the start of cyclical phases of glaciation on a global scale. The famous geologist Sir Charles Lyell advocated the principle that the present is the key to the past: unravelling past climate change will improve our understanding of climate change both now and in the future. 
This, combined with the fact that more


The RAS investigation highlights that human exploration of the moon or Mars could help answer fundamental questions. The moon’s thin atmosphere allows objects from space to collide with its surface, which means that the moon’s surface can be regarded as a four-billion-year-old record of the evolution of the solar system. The possible unveiling of the mystery of life on Mars may also hold clues as to the origins of life on Earth. The investigators concede that the scientific case alone does not justify spending 5% of the budget of all UK research councils: £150 million per annum. Dudeney claims that “the payback to industry would be huge, as would the inspirational effect on children.” At a time of crisis in UK science education, this argument is compelling. It is not clear, however, that manned missions specifically have the ability to inspire. Indeed, the robotic Beagle 2 mission to find life on Mars fired the public imagination far beyond many of the earlier manned moon missions. In December 2005, Lord Sainsbury announced the UK’s commitment to the first, robotic, stage of Aurora, Europe’s Mars mission. Aurora’s final stage is a manned mission in 2030. The decision to be involved with this mission must be made by 2010. How the debate in the UK plays out over the next few years will be pivotal in determining whether the next generation of Britons will have a shot at becoming astronauts, or whether they, like me, will have their ambitions thwarted by the mere fact of nationality. Lucy Heady is a postdoc in the Cavendish Laboratory

recent time periods provide more detailed evidence of climate change, makes the Quaternary Period one of great importance. The importance of the Quaternary is reinforced by the titles of a number of leading journals, such as Journal of Quaternary Science, Quaternary Science Reviews and Quaternary Research. In addition, the International Quaternary Association holds conferences attended by hundreds of scientists. 
Dr Amos Salvador of the University of Texas commented that the “Quaternary is probably the stratigraphic term most frequently used in the geologic literature.” An earlier proposal to include the Holocene, which marks the presence of man recorded in a stratigraphic section, within the Pleistocene, was rejected on the basis that it “would run counter to history and to an immense literature, and would serve no great purpose” (Harland et al., 1989). A similar argument would seem to apply to keeping the Quaternary as a Period in the GTS. It will be interesting to see how the debate about the Quaternary will be resolved. The resolution of this issue will have a considerable effect on a large number of different scientific fields, including the 40 or so Cambridge Quaternary researchers and the students currently studying for their MPhil in Quaternary Science! A.C. Hinton is an Affiliated Lecturer in the Department of Geography Salvador, A., Bulletin of the American Association of Petroleum Geologists (2006) Harland et al., A Geologic Time Scale, Cambridge University Press (1989) For more on this issue, including a tribute to Professor Sir Nicholas Shackleton of the Department of Earth Sciences, who was one of the world’s leading Quaternary scientists, visit www.bluesci.org




Does Urban Sewage Have a Drug Problem?

Oliver Jones tackles the problem of contaminated sewage It’s the morning after a night of indulgent festivities, and Jack’s head hurts. Groggily, he stumbles to the medicine cabinet in the bathroom and reaches for the packet of paracetamol he keeps for just such emergencies. After taking two tablets he crawls back to bed. Later he’ll excrete a large proportion of that dose and send it heading for the nearest sewage treatment works. All over the country the previous night’s revellers are following the same ritual. Other people are taking medicines for a variety of reasons including arthritis, epilepsy, high blood pressure, infection—the list seems endless. To many people, once something has gone down the toilet, it’s out of sight and out of mind. You have probably never thought about it, and you are not alone. Until recently nobody had given it a second thought, despite the detection of chemicals in treated and untreated sewage throughout the world for the past 30 years. The population of most developed countries swallows an extraordinary amount and variety of pharmaceuticals every day that eventually end up in the sewage. These range from familiar compounds like antibiotics, anti-epileptics, and anti-asthma medications to more exotic groups such as synthetic hormones, antidepressants, and psychiatric drugs. Because people continue to overlook


this problem, many questions remain unanswered. Are drugs degraded in transit to the sewage system? Can they pass through into the wider environment? And, perhaps most importantly, can they get into our water supplies? Most drugs that have been detected so far have been present at nanogram per litre concentrations and occasionally at microgram per litre levels. This is roughly equivalent to a sugar cube dissolved in an Olympic-sized swimming pool. To detect such low concentrations scientists use a


combination of gas chromatography and mass spectrometry. Chromatography is used to separate out the target compounds from the variety of other substances often present in environmental samples, whereas mass spectrometry is better for identifying and quantifying relative amounts of pollutants. The signals produced by the instruments can then be compared to those in accepted spectral libraries, allowing unknown substances to be identified.
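The sugar-cube analogy above can be sanity-checked with a line of arithmetic (the cube mass and pool volume are assumed round figures, about 4 g and 2,500,000 litres respectively):

```python
# Sanity check on the sugar-cube analogy: one cube dissolved in an
# Olympic-sized pool, expressed in the units used for drug detection.

CUBE_MASS_G = 4.0        # assumed mass of a sugar cube
POOL_VOLUME_L = 2_500_000  # assumed volume of an Olympic pool

conc_ug_per_l = CUBE_MASS_G * 1e6 / POOL_VOLUME_L  # micrograms per litre
conc_ng_per_l = conc_ug_per_l * 1000               # nanograms per litre
```

The cube works out at the microgram-per-litre end of the range quoted, so the analogy fits the occasional higher detections; the routine nanogram-per-litre findings are a further thousand times more dilute.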


The big question is whether or not low-scale doping of the environment matters, either to humans or to wildlife. Two of the most pressing concerns are the spread of antibiotic resistance in bacterial populations and the feminization of male fish after exposure to endocrine disrupters. Other effects of drugs in the environment have also been observed. For instance, the painkiller ibuprofen prevents the growth of some bacteria, and the antibiotic streptomycin interferes with the growth of some plants, whilst paracetamol has been shown to be quite effective in controlling pest populations of brown tree snakes. While the risks associated with exposure to drugs are probably greatest with regard to the natural environment, the public’s concern is understandably often more focused on human exposure. This is especially important in areas that practise indirect water reuse—where sewage effluent is released to streams and rivers, which are in turn used as a source of raw drinking water for communities living downstream. Drinking water provides a direct route to the human body for any drug compound that may be present. Other pathways such as ingestion (eating crops irrigated with contaminated effluent or grown on soil contaminated with sewage sludge) or bodily interaction (bathing or showering in waters containing effluent) can also put the body in contact with pharmaceuticals. While modern treatment technologies can remove the majority of drugs from water, they are by no means routinely employed. In developing countries water rarely receives any kind of treatment. In addition, some drugs are resistant even to advanced forms of water treatment. For instance, a 2003 study found detectable concentrations of both the antiepileptic drug carbamazepine and the blood lipid regulator gemfibrozil in four out of ten Canadian cities tested. All used modern treatment technologies to treat their drinking water. 
Although this report was not published in peer-reviewed literature, but rather commissioned by two Canadian news broadcasters (CTV News and The Globe and Mail newspaper), the results make interesting reading. While only three of the 440 analyte-sample combinations gave detectable levels for carbamazepine, and only one gave detectable levels for gemfibrozil, the results show there is a clear possibility for drug compounds to pass through even modern advanced water treatment facilities. This may sound discouraging, but keep in mind that the concentrations in question are very small. In part, the latest findings simply reflect the increasing ability of researchers to spot pollutants in water, so there is no need to stop taking your medication just yet. But next time you reach for a pill, for whatever reason, think about where it could end up—you might be surprised. Oliver Jones is a postdoc in the Department of Biochemistry



For Your Eyes Only Tristan Farrow explores the mysterious world of quantum communication When Tony Blair and George Bush talk on the phone, you can be sure they don’t use your regular BT landline. In fact, they have a hotline that joins Downing Street to the White House that allows them to speak in complete secrecy. Try eavesdropping on their conversation and all that you hear is a meaningless scramble that could make a Pollock painting seem like a study in perspective. To extract any useful information from that, you would need to know the recipe, known as a ‘key’, which produced the scramble. Only then could you perform the reverse operation that would let you extract the hidden message. That recipe is a random sequence of numbers produced before each hotline session, which is used to mix up the order of the information in the message. Since the numbers in the key are random, so is the order of the information in the scrambled message. There is no way of making sense of it without knowing the exact string of numbers used and, since they are random, there is no possibility of deducing them. But a chink in the armour opens here: the key itself has to be passed on to the intended recipient so that he can unscramble the information. In principle at least, there is nothing preventing it from falling into the wrong hands, where it could be used to pry open the message. Governments are traditional sorts of institutions, so they prefer to fly these keys physically in briefcases back and forth across the Atlantic under armed guard. However, standard modern cryptographic methods allow keys to be transmitted on the internet without much compromise to their security.
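The hotline scheme just described is essentially a one-time pad. A common way to implement it digitally is to XOR each message byte with a random key byte, rather than literally reordering the message; this is a minimal sketch of that standard construction, not the actual hotline mechanism:

```python
import secrets

def make_key(n):
    """A fresh, truly random key as long as the message (one-time pad)."""
    return secrets.token_bytes(n)

def xor_bytes(data, key):
    """XOR each message byte with the matching key byte; because XOR is
    its own inverse, applying the same key twice recovers the original."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at dawn"
key = make_key(len(message))
scrambled = xor_bytes(message, key)    # meaningless without the key
recovered = xor_bytes(scrambled, key)  # the reverse operation
```

The security rests entirely on the key: it must be truly random, as long as the message, and never reused, which is exactly why distributing it safely is the hard part.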

The secret key is produced by encoding binary values on a stream of particles of light

so time-consuming as to make it unrealistic and out of reach for even very powerful computers. Yet this security could be washed away overnight if quantum computers became a reality, as was highlighted in the last issue of BlueSci. Because the problem of factorisation is relatively straightforward for a quantum computer, the foundations underwriting the security of today’s systems could vanish, with far-reaching consequences for an economy that relies heavily on factorisation-based encryption. With governments around the world pumping millions into research to produce such a device, and George Bush announcing that quantum computing, together with a manned mission to Mars, are to be the prestige US science goals for the next 20 years, the nightmare of cryptographers could well come true. A step change in the sophistication of the art of information concealment is taking place as we speak. Researchers have invented a cryptography system based on the laws of quantum mechanics, marking the start of an era when secrecy can be guaranteed absolutely. Unlike all previous techniques, quantum cryptography is flawless. The laws of quantum physics underwrite its security. They are deployed in such a way that to breach the system’s security would require you to break the laws of physics first. In quantum cryptography, the secret key used to scramble a message is produced by randomly encoding binary values, zeroes and ones, on a stream of particles of light, or photons. When photons travel they vibrate, and the orientation of that oscillation is called the polarization. Indeed, it is possible to produce photons with a polarization of our choosing; a property exploited in an important type of quantum cryptographic system. In such polarization-based schemes, the sender, let

[Diagram: the polarizer used determines the photon’s polarization]

People have been busy sending coded messages and concealing information since they could speak, but the sophistication attained by modern encryption, or coding, is bewildering. It combines cutting-edge technology with the ingenuity of a barrage of mathematicians, information theorists, and computer scientists. However, despite the fact that modern cryptographic systems can be extremely safe, they are not invulnerable. Their security relies on the extreme inefficiency of traditional computers at factorising large numbers. Cracking a code thus becomes


us christen her Alice, sends photons with four possible polarizations using a device called a polarizer. Actually, she must use two polarizers, which give two different polarizations each. One gives photons with oscillations angled at either 0º or 90º, while the other produces them at either 45º or 135º. We can now decree that a binary ‘zero’ is represented by a photon polarized at 0º or 45º, and a ‘one’ by 90º or 135º. These two double sets of angles are called polarization ‘bases’, which we will soon discover to be mutually exclusive.

The foundations of today’s security systems could vanish

Crucially, the choice of which polarizer and angle Alice uses to produce her string of zeroes and ones must be completely random—although she does keep a careful record of the random sequence in which she used her two polarizers, and of the angle she gave each photon. The string is sent through an optical fibre to the recipient, Bob, who now has to read it by measuring the polarization on each photon, but without any knowledge of which basis Alice used for each encoding. Bob has no choice but to pick his reading polarizers completely at random for each photon he receives. He only gets one shot at it, for once a photon has passed through his polarizer, it irretrievably loses the orientation that Alice gave it—unless, of course, Bob happens to use the correct polarizer. Then, the photon preserves its original angle and passes unchanged. So, if Bob happens to use the right polarizer, say the one producing photons either at 0º or 90º, he will correctly identify whether the

Polarization Angle    Binary Value
0°                    0
90°                   1
45°                   0
135°                  1


[Figure: Eve can’t escape detection because of the extra errors she introduces into the sequence Bob receives. The cryptographic key is 110110011. Image: Tom Walters, Jonathan Zwart and Tristan Farrow]

photon represents a binary zero or one. And, since there are two polarizers to choose from, statistics say that Bob’s measurements will always be only half wrong. Here comes the crux: no matter how clever a measurement is made on Alice’s photons, quantum mechanics will prevent anyone from deducing which of the two polarizers she used to encode them. Thus Bob, or any possible eavesdropper, has no way of knowing from his measurements which of the two bases, or polarizers, he should choose to perform his own measurements. In the textbooks, this is known as Heisenberg’s uncertainty principle, a law of nature that is widely applied to several other mutually exclusive or simultaneously immeasurable pairs of physical properties of sub-atomic particles—such as the speed and position of a moving electron. Now that Bob has a string of zeroes and ones that is half wrong, Alice tells him exactly which polarizer she used to encode which photon. Significantly, she says nothing about the polarization angle, since that would amount to revealing the secret key. Their discussion can take place on the pages of BlueSci for all they care, because that information does nothing to threaten the system’s secrecy. Because Bob now knows which of his measurements were made with the right polarizer, he can finally say which of his zeroes and ones are right, while throwing away all the rest—that is, half of his string of numbers. Now comes the moment of truth: Alice and Bob compare random snippets of their key, say over a public Tannoy if they have a dramatic bent, to check that there are no errors in Bob’s sequence. If all is well and the random snippets of their binary strings match, they can be absolutely certain that no one could have copied their key without their knowledge. 
Their random binary sequence is now ready to be used by Alice to scramble her private message before she sends it to Bob on the internet or otherwise. Thankfully, computers do all the number crunching in actual quantum cryptographic systems.
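The exchange described above can be simulated in a few lines (a toy model of the polarization scheme, with bases written ‘+’ for 0°/90° and ‘x’ for 45°/135°; the basis-matching logic is the idealized textbook behaviour, not real optics):

```python
import random

random.seed(6)  # reproducible toy run

def bb84_sift(n):
    """Toy model of the key exchange described above, with no eavesdropper."""
    alice_bits = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.choice("+x") for _ in range(n)]
    bob_bases = [random.choice("+x") for _ in range(n)]
    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if a_basis == b_basis:
            bob_bits.append(bit)  # right polarizer: Alice's bit survives
        else:
            bob_bits.append(random.randint(0, 1))  # wrong basis: random outcome
    # Sifting: Alice reveals only which basis she used for each photon;
    # both sides keep the positions where their bases happened to agree.
    keep = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
    key_alice = [alice_bits[i] for i in keep]
    key_bob = [bob_bits[i] for i in keep]
    return key_alice, key_bob

key_alice, key_bob = bb84_sift(64)  # without Eve, the sifted keys always agree
```

Roughly half of the positions survive sifting, which is why Bob throws away about half of his string, yet the surviving bits match Alice’s exactly.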


Imagine though that somewhere along the optical fibre between Alice and Bob a technologically advanced eavesdropper, called Eve, hatched a plot to siphon off some of the photons. Like Bob, Eve would have to measure each photon with one of the two polarizers chosen at random. Crucially, just by the act of diverting and measuring the photons, she would necessarily introduce additional errors into the sequence Bob receives. This would show up, because now Bob’s string would be more than half wrong. However inventive Eve might be, she cannot prevent this from happening since, like Bob, she is stopped by quantum mechanics from knowing which polarizer to choose for each measurement. And that isn’t all; she is also prevented from copying the photons before measuring them, which she might otherwise have done to avoid alerting Bob to the fact that some of the bits in his string had been wrongly measured. This is a consequence of another quantum mechanical law called the ‘no-cloning theorem’, which forbids the cloning of quantum objects like photons. So, Eve can never go unnoticed by Alice and Bob, who can immediately break off their communication and switch channels as soon as they detect the tell-tale additional errors when their number sequences mismatch. There is one real danger, however, which arises when the light source that Alice uses produces too many photons in each pulse. Since all the photons in the burst acquire the same angle when they pass through her polarizer, a futuristic Eve could conceivably measure only a small sample of photons in each pulse, while allowing the rest to race on. Bob would then be none the wiser about having a spy on the line, as his string would contain no more errors than usual. To deal with this problem, real systems use dim light-emitting laser diodes that produce very few photons in each pulse, while researchers are focusing their efforts on developing single-photon sources. 
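Eve’s intercept-and-resend attack, and the tell-tale errors it leaves behind, can be added to the same toy picture (again a sketch with idealized measurements; theory predicts errors in about a quarter of the sifted bits):

```python
import random

random.seed(1)  # reproducible toy run

def intercept_resend_error_rate(n):
    """Toy model of Eve's intercept-and-resend attack. Eve measures each
    photon in a random basis and re-sends it; wherever her basis was wrong,
    the bit Bob reads is randomised even when his own basis matches Alice's,
    which is what gives her away."""
    errors = trials = 0
    for _ in range(n):
        bit = random.randint(0, 1)
        alice_basis = random.choice("+x")
        eve_basis = random.choice("+x")
        # Eve's result, and the state she re-sends on to Bob:
        eve_bit = bit if eve_basis == alice_basis else random.randint(0, 1)
        bob_basis = random.choice("+x")
        if bob_basis != alice_basis:
            continue  # these positions are thrown away during sifting anyway
        bob_bit = eve_bit if bob_basis == eve_basis else random.randint(0, 1)
        trials += 1
        errors += bob_bit != bit
    return errors / trials

rate = intercept_resend_error_rate(20_000)  # theory predicts about 0.25
```

Half the time Eve guesses the wrong basis, and half of those bits then come out flipped for Bob, so about a quarter of the sifted key is corrupted: comfortably above the 11% threshold mentioned below which the channel counts as secure.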
Intricate mathematical proofs have been used to calculate error thresholds below which quantum cryptography guarantees absolute security even for multi-photon pulses. In the famous scheme named BB84, after Charles Bennett of IBM and Gilles Brassard of the Université de Montréal, who devised it in 1984, the acceptable error threshold, called the bit error rate, for absolute security is calculated to be 11%. Above that, the channel is no longer secure.
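Eve’s effect on the error rate can be seen in a toy simulation of BB84. This is a classical sketch of the quantum statistics, with basis and bit choices modelled as coin flips: without Eve the sifted key agrees perfectly, while a simple intercept-and-resend attack pushes the bit error rate to around 25%, far above the 11% threshold.

```python
import random

def bb84_error_rate(n_photons, eavesdrop, rng):
    """Simulate BB84 sifting and return the bit error rate.

    A measurement in the wrong basis yields a random bit; only the
    positions where Alice's and Bob's bases happen to match are kept
    (the 'sifting' step, done over a public channel).
    """
    errors = kept = 0
    for _ in range(n_photons):
        alice_bit = rng.randint(0, 1)
        alice_basis = rng.randint(0, 1)
        bit, basis = alice_bit, alice_basis   # the photon in flight
        if eavesdrop:
            eve_basis = rng.randint(0, 1)     # Eve must also guess
            if eve_basis != basis:
                bit = rng.randint(0, 1)       # wrong basis randomizes the bit
            basis = eve_basis                 # photon is resent in Eve's basis
        bob_basis = rng.randint(0, 1)
        bob_bit = bit if bob_basis == basis else rng.randint(0, 1)
        if bob_basis == alice_basis:          # bases compared publicly
            kept += 1
            errors += bob_bit != alice_bit
    return errors / kept

rng = random.Random(42)
print(bb84_error_rate(100_000, eavesdrop=False, rng=rng))  # → 0.0
print(bb84_error_rate(100_000, eavesdrop=True, rng=rng))   # roughly 0.25
```

Half the time Eve guesses the wrong basis, and half of those resent photons flip in Bob’s (correct) measurement, giving the 25% signature that betrays her.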

To breach the system’s security would require you to break the laws of physics first

Quantum cryptography systems are already in operation in a handful of physics labs around the world, and momentum is gathering, with three such devices testing the commercial winds. Compared with classical cryptography, however, present-day quantum cryptographic systems are constrained by the distance over which communication can take place. As photons are pumped down an optical fibre linking two parties, some of them never make it through, owing to losses in the fibre. Toshiba’s Cambridge Research Laboratory holds the current record of 120 kilometres, with a guarantee of absolute security for the first 50 kilometres. At that distance, quantum cryptography becomes suitable for large local networks between neighbouring cities. Predictably, banks and government institutions are eyeing the new technology closely, and are expected to be its first users in a market that is forecast to capture a large chunk of the encryption business, itself worth long strings of dollars.

For a demonstration of the BB84 protocol see http://monet.mercersburg.edu/henle/bb84

Tristan Farrow is a PhD student in the Cavendish Laboratory

Equinox Graphics

The Gaia Hypothesis: Yet Another Greek Tragedy?

Is the Earth a living organism? Louise Woodley reviews the debate

Nothing causes a more impassioned scientific debate than those who mix even mild hints of spirituality or ‘new-worldliness’ with scientific ideas—whether it is the benefits of acupuncture, the existence of individual human auras and energies, or the advantages of medicinal herbs and food supplements. One such idea, the Gaia hypothesis, was proposed in the mid-1970s and even now is brought into debates on global warming, nuclear power and evolution, though not without controversy. So what is Gaia, and how can a Mother Earth myth help to explain the relationship between human life and our planet’s environment? The idea of Mother Earth is not new. Throughout history various civilisations have had their own symbols, names and obvious spiritual respect for our planet


and the woman that they believe created it. The Greek goddess is called Gaia; she was the first parthenogenetic (requiring only one parent) daughter of Chaos. She was a highly fertile female and her numerous offspring included Uranus, with whom she then conceived the rest of creation, as well as Titans and Cyclopes. When Gaia later suspected Uranus of preventing her from having yet more children, she chopped off his genitals and continued to reproduce without him.

In 1975 an English atmospheric scientist, James Lovelock, used the term Gaia in a New Scientist article in which he proposed that the Earth functions as a single organism: “a complex entity involving the Earth’s biosphere, atmosphere, oceans and soil; the totality constituting a feedback or cybernetic system which seeks an optimal physical


and chemical environment for life on this planet”. In simple terms this implies that the Earth, rather like the feisty Greek Gaia, is capable of regulating its own environment, including the temperature and chemistry of the atmosphere, so that favourable conditions for life are maintained.

The co-author of the original Gaia hypothesis was Lynn Margulis, the microbiologist responsible for the Endosymbiotic Theory, which states that organelles, such as mitochondria, inside the cells of higher organisms, including plants and humans, are actually ancient bacteria. Her version of the Gaia hypothesis carries slightly more overt implications than Lovelock’s: “there is no special tendency of biospheres to preserve their current inhabitants and so Earth is not a living organism which can live and die all at once, but rather a kind of community of trust which can exist at many discrete levels of integration.” She makes more explicit the idea that if Earth is indeed composed of many processes and reactions that are all interlinked, then changing any one of these will have knock-on effects elsewhere in the system. This might include deforestation, which reduces the Earth’s ability to recycle carbon dioxide, or pollution of the oceans, which kills the phytoplankton that release a compound needed for the aggregation of water droplets into clouds, hence altering the climate.

The Gaia hypothesis provoked instant opposition, not least from the evolutionary biologist Richard Dawkins. He complained that the original hypothesis was teleological: that it claimed that all things have a predetermined purpose, an idea in obvious contradiction with natural selection. Lovelock countered that he had never explicitly made this claim.
Another criticism of the theory is that to describe Earth as a living thing, whether a cell or organism, implies that it possesses the fundamental characteristics of life, such as being self-organizing, being far from thermodynamic equilibrium and having a metabolism with chemical reactions that are stabilised by feedback networks. While these claims were implicit in the original hypothesis, they were not scientifically demonstrated. Furthermore, to be truly a living organism, the Earth should be capable of reproducing. This has been countered by claims that man is the mechanism through which Gaia could reproduce onto other planets, apparently reflected in our obsession with space travel and science fiction, although this enters the realms of the fanciful. Finally, in the ultimate rebuff, the hypothesis was accused of being unscientific: its claims could neither be proved nor disproved, so it is impossible to verify according to traditional scientific philosophy.

Lovelock’s original hypothesis was based on several observations, at least one of which he formulated during his time working for NASA on a project to determine whether there is life on other planets. He claimed that there must be a global system to control temperature, atmospheric composition and ocean salinity, as all three of these have remained constant. Since the appearance of life on Earth, the energy provided by the sun has been estimated to have increased by 25–30%, although the temperature of Earth has not been affected to the same extent. In the case of the Earth’s atmosphere, approximately 20% consists of oxygen, the second most reactive gas after fluorine. That the level of oxygen has remained stable when it is so reactive must be due to its recycling by living organisms. This contrasts with Mars and Venus, both of which are lacking in oxygen and also life.

Following the original criticisms of his hypothesis, in 1983 James Lovelock and Andrew Watson developed a series of computer simulations, termed Daisyworld. These model an imaginary planet where there are two species of daisies, one of which is white and the other black, thus differing in their albedos, or reflectivity. Various equations are used to demonstrate that feedback mechanisms to regulate environmental conditions could exist on Earth. The idea is that the white daisies are able to reflect radiative energy and so will proliferate when there is more incoming solar radiation, in order to cool the planet down. However, when incoming solar radiation falls below a certain level, black daisies are favoured over their white cousins, as they absorb the incoming radiation and warm the planet up. This idea of a biological feedback mechanism is termed homeostasis and relies upon the detection of alterations in conditions and a subsequent response to those changes that brings the conditions back to equilibrium, rather like the way a thermostat acts in a central heating system to hold the temperature of your house constant. Importantly for silencing the critics, Daisyworld also demonstrated that this kind of large-scale homeostasis can operate on Earth without any input of conscious control.

The heated debate and scientific interest in the original Gaia hypothesis resulted in a conference in 1988 to discuss Gaia theory. As a result, Lovelock proposed a refined hypothesis that appealed more widely to the scientific community: the Earth is homeostatic and life can influence the climate and environment. All accusations of teleology were then dropped. James Kirchner, a philosopher and physicist, proposed that there is a spectrum of at least five subtly different Gaia hypotheses ranging from ‘weak’ to ‘strong’. The weak hypotheses merely claim that life influences planetary processes, while the strong, and less popular, hypotheses claim that life actually controls processes on Earth. In 2000 there was a second Gaia conference that focused less on the semantics of the hypothesis and more on scientific concepts: what exactly is Gaia and how has it changed over time? How can we measure what effects it has on climate, and how long lasting any effects might be? Gaia was clearly more accepted across the sciences, from climatology and ecology to geophysics.

The Gaia hypothesis was arguably one of the first attempts to show that the Earth is not merely a series of unrelated ecosystems or chemical reactions. It also demonstrated the effects that man can have, and is having, on precise ecological balances. The inspiration that this provided for environmentalists and philosophers alike is undisputed—from the formation of the radical Green group of ‘Gaians’ to an exploratory text, Scientists on Gaia, by Stephen Schneider and Penelope Boston, in which there are 44 different discussions by scientists on the various ‘philosophical, empirical and theoretical foundations of Gaia’. Unsurprisingly, politically Lovelock is strongly Green, although his predictions on global warming, particularly in his latest book The Revenge of Gaia, released early this year, have made him a strong supporter of nuclear power as the only viable way to meet our energy demands without further damaging the planet.

In this bleak book, Lovelock argues that we have pushed our relationship with Earth too far and this will now inevitably result in global warming and not just the death of humans, who are the chief cause of the imbalance, but of other species too. We have betrayed the ‘community of trust’ that Margulis described. To return to the Daisyworld example, Gaia has no reason to favour humanity over any other form of life. If global warming results in the extinction, or at least a massive decline in the number, of humans on the planet, then it will also presumably result in a reduction in the main cause of global warming. Just how Gaia would then react and reset the thermostat to maintain another form or forms of life is an interesting debate. Whether or not scientists agree on the political suggestions, such as adopting nuclear power, proposed as a result of the Gaia hypothesis, it is certain that the hypothesis itself is becoming more widely accepted. While we may not yet be able to say with certainty whether our various actions are having positive or negative effects on the whole system, we can no longer claim that they are isolated events, without further consequence. What is more, the holistic view of our planet that the Gaia hypothesis proposes sits much more comfortably in recent times with the wider appreciation that many scientific disciplines are indeed interconnected. The Dalai Lama claims: “the universe we inhabit can be understood in terms of a living organism where each cell works in balanced cooperation with every other cell to sustain the whole. If, then, just one of these cells is harmed, as when disease strikes, that balance is harmed and there is danger to the whole. This in turn suggests that our individual wellbeing is intimately connected both with that of all others and with the environment within which we live. It also becomes apparent that our every action, our every deed, word and thought, has an implication not only for


ourselves, but for all others too.” This is the essence of the Gaia hypothesis. It will be interesting to see whether such ideas do motivate more definitive action to counter controversial problems, such as pollution and global warming.

Louise Woodley is a PhD student in the Department of Biochemistry

For further information, including a link to an interactive Daisyworld simulation, visit www.bluesci.org
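The Daisyworld feedback described above fits in a few lines of code. The sketch below follows the spirit of the Watson and Lovelock model, using the commonly quoted constants (solar flux scale 917, albedos 0.25/0.5/0.75, death rate 0.3, and the standard parabolic growth curve peaking at 22.5°C); treat it as an illustration rather than a faithful reproduction of their paper. As solar output varies, the mix of black and white daisies shifts and the planet’s temperature stays far steadier than a bare planet’s would.

```python
SIGMA = 5.67e-8                               # Stefan-Boltzmann constant
FLUX = 917.0                                  # solar flux scale (W/m^2)
A_BARE, A_WHITE, A_BLACK = 0.5, 0.75, 0.25    # albedos of ground and daisies
Q = 20.0                                      # local-temperature offset scale (K)
DEATH = 0.3                                   # daisy death rate

def daisyworld(luminosity, steps=20_000, dt=0.02):
    """Integrate daisy cover to equilibrium; return (planet temp K, white, black)."""
    white = black = 0.01
    temp = 0.0
    for _ in range(steps):
        bare = max(1.0 - white - black, 0.0)
        albedo = bare * A_BARE + white * A_WHITE + black * A_BLACK
        temp = (FLUX * luminosity * (1.0 - albedo) / SIGMA) ** 0.25
        growth = {}
        for name, a in (("white", A_WHITE), ("black", A_BLACK)):
            local = Q * (albedo - a) + temp    # white patches sit cooler, black warmer
            growth[name] = max(1.0 - 0.003265 * (295.5 - local) ** 2, 0.0)
        white += white * (bare * growth["white"] - DEATH) * dt
        black += black * (bare * growth["black"] - DEATH) * dt
        white, black = max(white, 0.001), max(black, 0.001)  # keep a seed population
    return temp, white, black

for lum in (0.8, 1.0, 1.2):
    bare_temp = (FLUX * lum * (1.0 - A_BARE) / SIGMA) ** 0.25
    temp, w, b = daisyworld(lum)
    # the daisied planet stays near the daisies' optimum (~295 K)
    # while the bare planet swings by roughly 30 K over this range
    print(f"L={lum}: bare planet {bare_temp:.0f} K, with daisies {temp:.0f} K")
```

At low luminosity black daisies dominate and warm the planet; at high luminosity white daisies take over and cool it, with no conscious control anywhere in the loop.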



The Future of Medicine

Tom Walters

James Pickett investigates how genome sequencing could revolutionize medicine

You’ve been unwell for some time and so you visit your doctor. At the surgery you hand over a card containing a record of your entire genetic make-up. The doctor can diagnose your condition and consider relevant medication. By consulting your genetic information, the doctor can predict the success of any drug they prescribe, the dose that you will need in order to respond, and the probability that you will experience undesirable side effects. Based on this knowledge, they can administer personalized medicine. This could be a future visit to your GP if the potential of a new branch of medicine, known as pharmacogenomics, is realized.

Your DNA is at least 99.9% similar to that of any other human individual. This is not really surprising, since most of the processes the human body performs are identical across the population. Despite this, it is clear that small differences in the genomic sequence contribute to diversity in human behaviour, disease susceptibility and drug response. The goal of pharmacogenomics is to develop a better understanding of how differences in DNA influence these factors. This, in turn, may lead to better, safer medication. Advances in the technology used to sequence whole genomes are, for the first time, making a pharmacogenomic approach to drugs a possibility.

DNA has been described as the ‘blueprint for life’ because it contains the information necessary to make every protein in an organism. The building blocks of DNA are known as nucleotides, of which there are four major types. It is the type and order of the nucleotides that determine the structure of a protein. The human genome is composed of approximately six billion nucleotides, and genetic sequencing has shown that variation between individuals occurs at roughly one in every 1000 nucleotides. Sometimes a nucleotide may be inserted or deleted,


but the most common variation is a substitution of one nucleotide for another: such sites are known as Single Nucleotide Polymorphisms (SNPs, pronounced ‘snips’). This may cause a change in the protein produced and so have functional consequences. Understanding the implications of SNPs is considered by many to be central to comprehending diversity in human responses, and it is this that will make pharmacogenomics a possibility.

The determination of SNPs in the genome has become a major focus for researchers. By 2001 a two-year project by the SNP Consortium, a collaboration between drug companies, had validated over 1.8 million SNPs. A second, ongoing project, called HapMap, is profiling the genomes of 270 individuals from diverse cultural groups. To date, over five million individual SNPs have been discovered, and it is estimated there are another five million to be identified. The HapMap project is particularly ambitious because it hopes to show that people differ in sets of SNPs, known as haplotypes. This could mean that rather than having to sequence the 10 million or so SNPs an individual carries, one could obtain the same information by profiling only a small number of them. Thus, in time the determination of each individual’s complete set of SNPs should become possible. It is this that could allow the development of personalized medicine.

Current drug prescribing practices are hampered because many drugs do not work for all patients and may also cause unwanted side effects. The real problem is that there is no way of knowing which patients will respond well to any particular drug. Prozac, for example, is the world’s most prescribed antidepressant despite the fact that at least 40% of depressed patients do not improve with Prozac treatment. Not only is Prozac treatment not guaranteed to be a success, but the side effects experienced by many people, varying from nausea and vomiting to impotence, are sometimes intolerable. Many of the pharmaceutical industry’s most successful drugs have similar stories: moderate success rates and long, diverse lists of unpleasant side effects. To many of us this information may seem surprising, but the problem is well known to the pharmaceutical industry. In 2003 the vice-president of genetics at GlaxoSmithKline made the eye-raising statement: “The majority of drugs—more than 90%—only work in 30–50% of people.” The burden placed upon the budget of the NHS in prescribing drugs to which the patient may not respond is enormous. Pharmacogenomics should allow a better prediction of how an individual will respond to a particular drug regime, replacing the trial and error approach currently used in the treatment of many medical disorders.

With current approaches to pharmaceuticals, drugs which could potentially
benefit a large number of patients have had to be withdrawn because of a few severe reactions in a small subset of the population. Recently there was a high-profile case involving Vioxx, a drug used in the treatment of rheumatoid arthritis. In 2004 Vioxx was being prescribed to approximately 400,000 patients on the NHS with beneficial effects. But the drug was withdrawn because clinical trials showed that long-term use increased the risk of suffering a heart attack. As there is currently no way of predicting which patients are susceptible to these effects, the majority of patients are deprived of the beneficial effects of Vioxx because of a small but unpredictable risk. If the promise of genomics is realized, predicting the compatibility of a patient with a particular drug may become possible. This may mean that fewer drugs fail in clinical trials.

Some people believe that pharmacogenomics will never completely allow the prediction of some diseases, as the diseases are too complex and influenced by too many genes. Despite this, some SNPs have already been linked to an increase in disease susceptibility. An example is a protein known as COMT, which is present in the brain and plays a role in breaking down a neurotransmitter called dopamine. An SNP in the gene that codes for COMT leads to two different forms of the enzyme, which metabolize dopamine at different rates. People with the less active form of the enzyme are more likely to develop schizophrenia or other psychotic illnesses. This increased probability, however, is small, and schizophrenia is not confined only to people with the less active enzyme. A challenge of pharmacogenomics will be to understand how a multitude of genes can be linked to a particular disease.

Although we have not yet reached a stage where pharmacogenomics is practiced, the study of how variation in a single gene can affect drug responses—known as pharmacogenetics—is already used in hospital medicine. One such application is in calculating the initial dose of a drug to be administered. This is important as most medications only produce therapeutic benefits over a small range of concentrations, and the dose that must be administered to obtain this concentration varies between individuals. Existing methods to predict the ideal concentration use crude measures, like body mass, as indicators. This technique, however, does not account for subtle individual differences in the way drugs are metabolized. Most drugs are broken down in the liver by the actions of enzymes known collectively as cytochrome P450 enzymes. Varying forms of these enzymes have been identified in individuals and may have either enhanced or reduced activity in the metabolism of specific drugs. In 2004 the drug company Roche released a diagnostic tool that, using a simple blood sample, allows doctors to rapidly profile a number of key liver enzymes expressed by the patient. This information can be used to predict how long a number of commonly prescribed drugs will stay active in an individual. This allows better calculation of the required dosage for a drug and thus reduces the incidence of the unwanted effects of overdosing.

Another area in which pharmacogenetics is being applied clinically is in genetic testing for disease: for example, a simple test can reveal if a breast cancer expresses a protein known as HER2. Only cancers which are HER2 positive can be treated with the drug Herceptin. Currently, in the USA over 900 genetic tests for various diseases are available, and the prediction is that this number will continue to increase in the foreseeable future.

With the possibility that pharmacogenomics could be used to predict the future onset of disease come difficult moral questions that must be addressed. What would be the impact on society if we could predict which individuals would go on to develop psychotic illnesses, or other social disorders such as alcoholism? Similarly, what help should be offered to someone who suffers from a chronic illness, but is genetically predisposed not to respond to any of the currently licensed medications? It is important, if we are to avoid future unease about pharmacogenomics, that such issues are discussed by both the scientific community and the general public before the technology is available.

So, is all this information really available in the genome? There are caveats to this idea, such as the observation that identical twins do not always develop the same diseases, although the probability for most conditions is increased substantially. Additional social and environmental factors, for example smoking, also influence disease onset and progression. In time, though, we may come to understand how these factors exert their effects through interactions with the genome. It may then become possible to incorporate these factors into a pharmacogenomic approach to medicine. Development in pharmacogenomics will most likely increase the amount of genetic testing in clinical medicine, but whether further promises will be fulfilled is as yet unclear.

Timeline (1952–2050): Watson and Crick solve the structure of DNA, announcing it to be ‘the secret of life’ • Herceptin, a drug used in some breast cancers, is approved in the US • The first complete sequence of the human genome is published • The SNP consortium identifies 1.8 million single nucleotide polymorphisms in the human genome • BiDil, the first drug to be licensed for a specific racial population, is approved • Genetic testing for a large number of major illnesses becomes frequent • The prediction of a drug’s success and side effects becomes possible through advances in rapid DNA sequencing and a better understanding of the genetic variations which make us individual • Liver enzyme profiles are routinely screened by doctors to help calculate dosage of many medicines

James Pickett is a PhD student in the Department of Pharmacology
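The dose-adjustment idea can be made concrete with a toy lookup. The metabolizer categories below are standard terminology for cytochrome P450 enzymes such as CYP2D6, but the multipliers and the 100 mg ‘standard dose’ are invented for illustration, not clinical guidance.

```python
# Illustrative only: the phenotype names are real terminology for
# cytochrome P450 enzymes (e.g. CYP2D6), but these multipliers are
# invented numbers, not clinical dosing recommendations.
DOSE_FACTOR = {
    "poor":         0.5,   # drug cleared slowly: lower dose avoids overdose
    "intermediate": 0.75,
    "extensive":    1.0,   # the 'normal' metabolizer a standard dose assumes
    "ultrarapid":   1.5,   # drug cleared quickly: a higher dose may be needed
}

def adjusted_dose(standard_dose_mg, phenotype):
    """Scale a standard dose by the patient's enzyme-profile phenotype."""
    try:
        return standard_dose_mg * DOSE_FACTOR[phenotype]
    except KeyError:
        raise ValueError(f"unknown metabolizer phenotype: {phenotype!r}")

print(adjusted_dose(100, "poor"))        # → 50.0
print(adjusted_dose(100, "ultrarapid"))  # → 150.0
```

A blood-sample profile of the kind Roche released would, in effect, tell the doctor which row of such a table applies to the patient.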


Equinox Graphics

Hold the Phone

Gemma Simpson explores the myths and realities of mobile phone use

You probably use one every day, but is it putting your health at risk? There are rumours that your mobile could be frying your brain with every call, that children living near mobile masts have an increased risk of cancer and, sorry boys, that leaving your mobile in your trouser pocket could be damaging your vital organs. No one knows what sort of damage a mobile phone does to the human body, and few mobile phone users seem to care. But if the rumours are true the implications could be horrific for the nearly two billion phone users worldwide. Unfortunately, rumours are all we have at the moment, as most scientific studies are drawing a blank on the issue. The problem in determining whether mobile phones are a risk to health lies in the difficulty of simulating and replicating the conditions in which they are used. Add to this the fact that mobile phones have only been in widespread use for the past decade, and finding the long-term effects becomes even more complicated.

So what are we worrying about? One of the most notorious links is that using a mobile phone might increase one’s risk of developing a brain tumour, and many studies are looking into this possibility. But so far, the evidence suggests that mobile phone use is not putting your health at risk. The Interphone study is an international project coordinated by the European Union and the International Union Against Cancer, involving some 13 countries. The study has been examining data from 1998 onwards to ascertain whether there is a link between brain tumours and regular mobile phone use. Last month British researchers issued their findings as part of the Interphone study and suggested that regular mobile phone users were at no greater risk of developing brain tumours than less frequent users.

Another worrying hypothesis is that the radiation emitted from mobiles can cause DNA strands to break and genotoxic effects to occur. These effects cause genetic mutation and might contribute to the development of tumours. An earlier international study called REFLEX suggested that radiation coming from your mobile phone could damage the DNA in cells. The original study involved 12 groups from seven European countries, all supposedly carrying out identical experiments. But experts believe the results were far from conclusive, as the different groups were not completely standardised due to financial constraints. The study found that animal and human cells exposed to electromagnetic fields showed a significant increase in DNA damage that could not always be repaired by cell repair mechanisms. This genotoxic effect was observed in some cell lines, but not in others. “In my opinion, the results are rather unusual for a genotoxic agent and might be due to an indirect effect,” states Professor Guenter Speit, who is currently trying to replicate the REFLEX findings.

Since REFLEX, many other groups have been trying to determine whether or not our DNA really is deteriorating with every phone call we make. The good news is that no one seems able to find evidence of disintegrating DNA. Professor Guenter Speit and his colleagues at the University of Ulm in Germany have, thus far, been unable to find conclusive evidence that genotoxic effects are occurring. “At present the problem of the conflicting results is not solved,” states Professor Speit. The team is currently working with Professor Rüdiger, of the original REFLEX study, in Vienna, to try and solve this prickly problem.

Does your ear feel warm after a call? Don’t worry; this is probably just due to the battery heating up as the phone is used. But there is some concern that microwave radiation emitted from a mobile phone might be causing heating effects that are less favourable inside the body. High-intensity microwave fields are well known for their ability to heat up biological materials—this is how a microwave oven works. For temperature increases above 1°C damaging thermal effects will occur, but there is a grey area for temperature changes around 0.5°C. Mobile phones are low-intensity microwave sources and only cause increases in temperature of between 0.1°C and 0.3°C. “These small temperature differences are generally dismissed as insignificant, and any apparent effects of exposure are described as ‘non-thermal’,” states Dr David de Pomerai of the University of Nottingham. Dr de Pomerai was the first to credibly report that microwave heating could possibly induce cells to produce heat shock proteins, another genotoxic effect. But even his results have turned out not to be what they originally seemed. “Unfortunately, our ‘convincing’ work on microwave radiation causing cellular damage turns out to have been a rather subtle thermal artefact… [that] undermines any effect of microwave fields whatsoever,” states Dr de Pomerai. Other researchers think that it might be possible that ‘hotspots’ are forming in the brain’s tissue. Although it seems pretty unlikely, researchers at the Mobile Telecommunications and Health Research Programme (MTHR) are currently investigating the possibility.

But even if low levels of radiation from mobile phones can damage DNA, does it actually make a difference to your health? Research suggests that it does not. But for a better idea of what exactly is going on we need the remaining Interphone study results. As one of the largest studies on brain tumours ever performed, it is hoped that the Interphone study will at last provide some conclusive results on mobile phone safety. The study’s conclusions should be released in the next couple of months.

As both research efforts and bickering amongst scientists continue, there is hope that a massive international study, organized by the World Health Organisation, will help answer queries about the potential health hazards of mobile phones. The study is due to begin this year and will monitor around 200,000 mobile phone users, both infrequent and frequent, hopefully for the rest of their lives. What makes this study different is that, unlike Interphone and other studies, it does not rely on the human subjects having to remember their mobile phone use. With memories being prone to bias, it is hoped this study will give more representative results. Unfortunately, it could be years before results emerge.

One of the real concerns is that children are at more risk from the adverse health effects of using mobile phones. A child’s brain is still developing and their skull is thinner, so radiation produced by a mobile phone handset can penetrate with more ease. Additionally, if they start using mobile phones at an early age then their cumulative lifetime use will be higher than for new adult users. “Chronic diseases such as cancer have a long latency period,” states Professor Franz Adlkofer, who was involved with the original Interphone study. “While adult people may die before the outbreak of the disease, this may not be the case with children.” Consequently, children are advised to use mobile phones only in an emergency. Advice from the MTHR recommends that children under nine years of age should not be given a mobile phone at all.

But what about mobile phone masts; do the children living nearby really have an increased chance of cancer? Not necessarily. A mobile mast is intended to serve areas that are at least 50 metres away. Therefore, if you do live very close to a mast, the powerful main beam will just pass directly over your head. If you live 50–200 metres away then you may be exposed to the beam, but the intensity of the radiation will be greatly reduced. “I fail to see how the much lower exposures from mobile phone masts (usually 0.1–1.0% of the dose received from a handset) could possibly be causing the wide range of adverse health effects that anti-mast campaigners claim,” states Dr de Pomerai. Although conclusive evidence has not yet been produced, it seems that the telecommunications companies are concerned. The mobile phone industry has spent millions on investigating the safety of its handsets, and it is still resolute that there is no link between the use of mobile phones and any adverse health effects.

It seems that no one can make up their mind about whether the mobile phone really is a threat to health. “Uncertainty is the major concern,” comments Professor Adlkofer. “We do not know much about the effects of mobile phone radiation and whether they might constitute a risk to the health of people or not. The actual problem is that no scientist is at present in the situation to exclude such a risk with certainty.” So what do you think about mobile phones? A fantastic advancing technology, or an unknown danger to our health? “Reluctantly I’m coming to the conclusion that this is a huge non-issue, generating far more hot air than light in the process,” states Dr de Pomerai. Most other experts seem inclined to agree. “Nobody can predict the future. Findings from studies have been reviewed together by expert committees and these reviews concluded generally that there is no evidence for risk to date,” states Dr Minouk Schoemaker, of the Institute of Cancer Research, who is heading the British part of the Interphone study. “Longer-term studies are needed… [and] it is important that all evidence is reviewed when evaluating health risks of mobile phones, rather than selectively focusing on a few studies.”
With mobile phone sales continuing to rise, it seems most people consider the benefits of using a mobile outweigh any potential health hazards.And by the time we know if there is a health risk, will any of us be able to cope without our mobile phones? Gemma Simpson is a PhD student in the Cavendish Laboratory



A Day in the Life of…

A Cambridge Mathematics Professor

DPMMS

Anne Hinton strolls with Tom Körner along the Champs Elysées of Mathematics

The Centre for Mathematical Sciences is rich in personalities. One of them is Tom Körner, Professor in the Department of Pure Mathematics, Teaching Fellow at Trinity Hall and author of several books—“some popular, some less so”—which convey his enthusiasm for mathematics, history and life in general. “The odd interests me. I can remember jokes and very little else!”

How did you become a mathematician?
At school, I was a rather erratic student of mathematics. Sometimes I did really well, sometimes really badly. My friends went into science at A-level, so I chose mathematics over history and have never regretted my choice. The famous Russian mathematician Kolmogorov did historical research in his youth. He analysed tax records and, by checking whether the sums involved were whole numbers or fractions, was able to discover which taxes were paid by individuals and which by communities. When he presented his work in Moscow, a senior historian stood up and said, ‘Young man, in mathematics one proof suffices, but in history we require five!’ The truths of mathematics may not be as important as other truths, but they are as close to absolute truths as human beings can get. They can be firmly established. In mathematics someone can be instantly convinced that they are mistaken!

And how did you become an academic mathematician at Cambridge?
Having got onto the academic escalator, mathematics was the natural thing to do at university. I enjoyed all parts of mathematics. I started Part III (the final year before research) thinking that I was an algebraist, but after a couple of weeks I realised that I wanted to do analysis. Algebraists deal with rigid structures, but in analysis you have much more freedom to tinker with the objects of study. My Director of Studies felt that it was more important for a research student to have the right supervisor than the right subject. He phoned me at home at Christmas to say that a mathematician called Varopoulos was in Cambridge for 24 hours and I should come at once. England was blanketed in snow so travelling conditions were chaotic, but I managed to make it to Cambridge and Varopoulos agreed to take me on. It was a great stroke of luck because he was a marvellous supervisor, bubbling with mathematical problems of all sorts. In my second year of research I followed him to Sweden, and then in my third year to the University of Orsay in Paris. Trinity Hall gave me a research fellowship that they then extended for a year in the hope that I would get a university post, which I did. In those days of university expansion it was almost true that everyone who got a PhD could expect to get an academic job.

What skills are needed to become a mathematical researcher?
It is useful to be very hard working, but not necessary. It is useful to be very clever, but not necessary. Mathematicians can be very stupid! Some mathematicians are very good at spotting patterns, some have a very good intuition about randomness, some are good at solving concrete problems by placing them in an abstract setting, and some at solving abstract problems by using concrete instances. Each mathematician has a different mixture of skills and abilities. What is needed is lots of determination!

Is the approach to mathematical research different from that of other scientific subjects?
Yes. If a mathematician has a problem, all he or she can do is think about it. When Newton was asked how he solved such difficult problems he replied, “by constantly thinking about them”. Although there have been exceptions (both up and down), most mathematical researchers in the past have published only 50–100 papers in their lifetime. It takes a lot of effort to write a mathematical paper and almost as much to read one. Many papers are single author ones, including most of mine. There is an increasing trend towards collaboration (email helps with this), but never 40 authors to a paper.

Are you working on any particularly interesting or exciting topics?
Whatever one is thinking about seems to be important. At present I am working on a new proof of a result first proved about 40 years ago. The result states that something can happen in spite of certain constraints. I think I can show that it can happen ‘as fast as the constraints allow’, and I think that the new proof is easier than the old. You will need to read A Quantitative Version of a Theorem of Rudin when (and if) it appears to see if you agree.

What happens on a typical day?
To the outside observer it must look like an Andy Warhol movie. Nothing much happens. One may read, think, calculate or have coffee with colleagues and gossip. Sometimes I just sits and thinks and sometimes I just sits. By inclination I am a night worker; it is hard to get long uninterrupted periods of work during the day.

How much time do you spend on your own research?
I generally do research outside term time. It can take a week to get up to speed on a problem, so I like to keep long intervals of time completely clear. I try to push my administration and teaching into term time, but sometimes things leak. You can get ideas at any time. Very often these sudden illuminations turn out to be false. Checking ideas requires concentrated work, sitting at a desk for a long time examining each step of the reasoning.

When did you start writing books?
I wrote my first book, Fourier Analysis, in my early thirties. It was a very happy time for me. I spent four winter months in Ann Arbor whilst writing the first few chapters and composed most of the rest whilst I was courting my wife Wendy. I think the happiness comes through in the writing. I included a lot of historical and physical background beside the hard technical mathematics. Some years later I attended a lecture on the future of mathematics in British schools that angered and dismayed me. I decided that it was better to light a candle than to curse the darkness, and wrote The Pleasures of Counting to show the beauty and utility of relatively simple mathematics and its involvement in history and everyday life. Just as you cannot choose your children’s friends, so you cannot choose those of your books. The book did not reach as many of its intended audience of schoolchildren as I had hoped, but proved very popular with people like engineers who use mathematics and enjoy extending their horizons. Recently, A Companion to Analysis: A Second First and a First Second Course in Analysis appeared. This is addressed to advanced second and third year undergraduates. The foundations of analysis are difficult to learn; they are both technical and philosophically profound. The usual approach to the subject is to make light of the difficulties, but I show that although there are difficulties one can get around them. The second half of the book addresses more advanced topics.

What is the satisfaction element for you in writing books?
Mathematicians, I think correctly, value the production of new mathematics above all else. However, there is great pleasure in writing out and contemplating other people’s clever ideas. Mathematicians value the composer above the player, but the player derives great pleasure from performing the work of the composer. The business of getting a book published—which seems to take as long as the writing—is not as pleasurable: think of the endless proof-reading.

What does your college work at Trinity Hall involve?
I have been Director of Studies for 30 years. The largest and best part of my duties involves taking pairs of students through their week’s work. It is a marvellous occasion for them and for me when they realise that what was impossible at the beginning of a year is now obvious and routine. I am also Vice-Master for the next four years. This is an important position if things go wrong, but the college is working well and I hope to remain a gaily painted fifth wheel during my entire period of office.

What are the best aspects of your job?
It is like working in a chocolate factory. I am paid to do what I enjoy doing. Teaching is fun when syllabus and students match, as they do in Cambridge. It is a thrill to realise that your first year audience may contain a future Fields medallist. Professionally, Cambridge University is like the Champs Elysées: eventually everyone who is anybody in mathematics strolls by. It is nice to be surrounded by people who are cleverer than oneself—really, really good at their subject. The principle is: you should always play chess with someone who is better than yourself.

What are the worst aspects of your job?
Very few. As a university teacher I have lived through a golden age and I suspect my successors will not be quite so fortunate. Compared with when I started, I think that university administrations are very much more eager to offer help and advice when you do not need it and very much less able to give it when you do. There is one intrinsic drawback that I suppose is common to all creative endeavours. After the completion of a piece of research, you always wonder if you will ever produce anything again. But it is better to have enjoyed the thrill of discovery and regret that it lies in the past than never to have enjoyed it at all. And you always have the pleasure of learning and teaching the discoveries of others. Luck plays a great equalizing role in mathematics. Although great mathematics is done by great mathematicians, anyone may suddenly come across a seam of pure gold. We cannot teach students how to have a good idea; we can only teach them how to exploit it.

Anne Hinton is an Affiliated Lecturer in the Department of Geography

www.bluesci.org

Next Issue: 6 October 2006

Article Submissions
By 5pm 10 July 2006
See website for guidelines
submissions@bluesci.org

BlueSci Online
www.bluesci.org
Updated weekly with news and events
New science podcasts from CUSP
Extra articles only online
Full, searchable archive of past issues

Subscriptions
Direct to your door for £12 per year
subscriptions@bluesci.org

BlueSci is produced by CUSP, a society dedicated to science communication. Film. Podcast. Webcast. Writing. Get involved. www.cusp.org.uk


Away from the Bench

Seismic Scientists

James Jackson and colleagues study earthquakes around the world, Anne Hinton reports

Unlike the common laboratory-bound scientist, James Jackson, Professor of Active Tectonics in the Department of Earth Sciences, conducts much of his research in remote areas such as Sefidabeh in Iran. Sefidabeh is located in the middle of the desert; one must travel more than 100 kilometres in any direction to find another community. Professor Jackson visits such isolated areas to study fault movements and their association with natural hazards like earthquakes.

An important question being addressed by Professor Jackson is why the number of earthquakes that claim more than 10,000 lives is on the increase. Between AD 1000 and 1600 such earthquakes occurred once every 20 years, while between AD 1600 and 1900 the frequency increased to one every five years. This trend has continued over the last two centuries, such that we now experience almost one devastating earthquake a year. Of the 118 earthquakes that have killed more than 10,000 people, 34 occurred in the last 100 years.

In 1994 an earthquake occurred at Sefidabeh. From his investigations, ranging from analysing false-colour satellite images to field work in the desert, Professor Jackson was able to explain why. At Sefidabeh there is a ‘blind’ fault, one that does not reach the surface but is expressed as a fold in the landscape. As the fault moves, it grinds the sediment into clay particles, creating an impermeable layer of clay against which water ponds up. This makes the water easier for people to access and, ironically, because of this the fault makes life possible in the desert.

As one can imagine, visiting the site of an earthquake is not usually a pleasant experience. In 2003 Bam, also in Iran, experienced an earthquake. Professor Jackson comments, “I was in Bam three days after the earthquake, and of course it was shocking. 40,000 dead out of 100,000; a city the same area and population as Cambridge. Just head-high rubble as far as you could see.” The high mountains and deserts around Bam are uninhabitable, so the people live around the edge of the mountains—precisely where the faults are. Small settlements on trade routes have grown to be mega-cities, with populations of 5–10 million people. Tehran, the capital of Iran, is also bordered by a fault. It has been destroyed several times, once in the fourth century BC and in AD 855, 958, 1177 and 1830. It wasn’t a very populated city in those days, but is now home to 10–12 million people. Recently, a tower and a hospital have been built on top of a growing fold—the worst possible place in the event of an earthquake. Although Professor Jackson’s work is concerned with the faults themselves, it has implications for engineering geologists and the development of strategies to minimise earthquake damage.

The developed world is currently much better at protection against earthquake damage, and much of this is due to building standards. For example, the Northridge, California earthquake in 1994, which measured 6.8 on the Richter scale, killed 50 people; while in Bam, the 2003 earthquake of the same magnitude killed 40,000 people. Professor Jackson has found that “people in Iran are starting to ask, ‘Why can’t we be like California?’”

Professor Jackson is currently on sabbatical at Caltech, where he has been working with colleagues, including Professor Kerry Sieh, who are investigating the 2004 Boxing Day tsunami. Here, an earthquake along a 1,600 km fault line in the Indian Ocean caused built-up strain, associated with gradual deformation of the earth’s crust, to be released. The power of the earthquake caused parts of Sumatra to sink by two metres and raised Simeulue Island more than a metre out of the sea. “Even though the tsunami had raged across the reef, there was scant evidence of any breakage of the delicate whorls and dendritic corals that crunched beneath our feet. But a fishing boat in the trees beyond the shoreline and an overturned, two-ton umbrella-shaped Porites coral head were testimony to the power of the tsunami,” wrote Professor Sieh, on his visit to Simeulue Island in January 2005.

This is a process that has been occurring in cycles over a long period of time; field evidence for this is found in tree and coral growth. When the coral is raised out of the sea co-seismically (during an earthquake), it dies, providing an opportunity for trees to grow on it. However, when aseismic submergence occurs, the trees die as they cannot live in salty water. Similar aseismic deformation is evident further south of the fault rupture, opposite Padang in Sumatra, which has a population of about 800,000 people. The scientists were uneasy about spending much time in this location themselves, since the area is now “loaded and ready to go”, as Professor Jackson put it, making it a prime location for the next big tsunami.

What can be done to lessen the impacts of these natural hazards? Professor Jackson advocated P-M-E: preparation, mitigation and education, in presenting this work in the Darwin lecture series on Survival. Let’s hope that the world takes note.

Kerry Sieh

Podcast interview with James Jackson: www.cusp.org.uk/podcast

Jackson, J., Times Higher Education Supplement, 17 February 2006, pp 18–19

Children on the uplifted coral reef, Simeulue Island, Sumatra, January 2005


Anne Hinton is an Affiliated Lecturer in the Department of Geography



I n i t i at i ve s

What on Earth?

Geology can be a bit of an odd science. On the one hand there are researchers who spend their entire lives looking at fossilized remains of long-dead animals; on the other are those who spend their careers modelling the dynamics of modern climate change. However, if you ask most children what they think geology is about, you’ll get one word in response: “Dinosaurs”. Time Truck aims to build on this passion, shared by so many children, by introducing them to the variety of ideas and subjects which geology encompasses.

Each year a group of enthusiastic students and staff from the Department of Earth Sciences and Sedgwick Museum descends upon local primary schools in the last week of Lent term to teach children about the varied and exciting subject of geology. As the name suggests, Time Truck is centred around a truck—a seven-tonne one, to be precise—in which children can discover exciting posters and models explaining some of the fundamental aspects of geology. The centrepiece is the geological clock, which compacts all 4.6 billion years of Earth’s history into just 24 hours. There are also four posters in the truck that cover some key aspects of Earth Sciences: the evolution of life, plate tectonics, climates, and geological hazards. These topics are brought to life both by the interactive posters and by accompanying displays, including a jigsaw showing how the continents have moved over the past 200 million years, a model volcano that erupts, and a 3D model of Great Britain showing which parts of the country will be submerged as sea levels rise (watch out, Cambridgeshire!).

The children are highly entertained by the exploding volcano and the flooding, and while having so much fun, they learn some memorable facts. They also have the unique opportunity to handle rocks and fossils from the Sedgwick Museum. It’s always great fun to watch them try to work out how you can find desert rocks on top of Scottish hills, or why seashells can be found at the tops of mountains. They even get to handle dinosaur bones. A great time is had by children and volunteers alike. The only problem is that there’s never enough time to see everything.

Time Truck also has two open days, allowing a wider audience to experience hands-on geology and talk to volunteers about all aspects of earth sciences. In February Time Truck runs handling sessions as part of the Sedgwick Museum’s Fabulous Fossiliferous Family Fun Day. Time Truck also runs an open day on the final Saturday of Lent term in the Department of Earth Sciences as part of the Cambridge Science Festival. All the posters and models from the truck are featured, along with a few extra ones, with volunteers on hand to answer questions.
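The geological clock described above is a simple linear scaling, and the arithmetic can be sketched in a few lines. This is purely an illustration (the function name and layout are invented here, not part of the Time Truck displays), assuming the 4.6-billion-year Earth age quoted above:

```python
def clock_time(mya, earth_age_mya=4600, day_hours=24):
    """Map an age in millions of years ago (mya) onto a 24-hour
    'geological clock': 00:00 is the Earth's formation, 24:00 is today."""
    elapsed = (earth_age_mya - mya) / earth_age_mya  # fraction of Earth's history already elapsed
    total_minutes = round(elapsed * day_hours * 60)
    hours, minutes = divmod(total_minutes, 60)
    return f"{hours:02d}:{minutes:02d}"

# The dinosaurs died out roughly 66 million years ago, which on this
# clock is only about 20 minutes before midnight.
print(clock_time(66))  # -> 23:39
```

On this scale the whole of recorded human history occupies a small fraction of the day's final second.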

initiatives@bluesci.org

Susan Conway

Michelle Pope and Sebastian Watt explain how teaching about geology is truckloads of fun

School visit 2004:Volunteers Corin Hughes (left) and Daniel Hobley show the Time Truck clock to schoolchildren in the Truck


Families can handle the fossils and rocks, and the dinosaurs prove very popular once again. This year the Time Truck also featured its first field trip—through geological time. Children were taken on a walk through the last 630 million years of the Earth’s history (along an 18-metre timeline board set out in the Downing Site car park) to find out when some of the most important events occurred, including the evolution of those ever-popular dinosaurs.

In addition to these annual events, 2006 saw Time Truck go national! The Truck attended the launch of the Fforest Fawr Geopark in the Brecon Beacons, allowing visitors to enjoy all the displays and specimens available at the Cambridge Science Festival. This was a wonderful opportunity for volunteers to enthuse and educate a group of children that they normally would not be able to reach.


Time Truck has been going strong for eight years now. It depends increasingly on sponsorship from both national and local businesses and organizations, including the donation of models and children’s books. However, most of the displays and all the posters are constructed by the volunteers themselves. The success of Time Truck relies entirely on the group of student volunteers from the Department of Earth Sciences and the staff of the Sedgwick Museum of Earth Sciences, who provide many hours of advice and support. Time Truck has been a great success since it began in 1998 and will hopefully continue to grow long into the future.

www.timetruck.co.uk

Sebastian Watt and Michelle Pope are Time Truck Co-ordinators for 2005 and 2006



History

Cambridge and the Computer

Tim Wogan looks at the birth of the computer and Cambridge’s role in its development

The first Cantabrigian to contribute to the development of the computer was Alan Turing. Turing came up to Cambridge in 1931 to study mathematics. He was fascinated by mathematical logic, in particular Kurt Gödel’s notion of undecidability. This stated that mathematical logic was essentially incomplete, because some conjectures could be neither proved nor disproved. In response, mathematicians tried to identify such problems, set them aside, and thus leave mathematics essentially complete in its architecture. Turing imagined a series of machines, each programmed to perform a particular operation such as multiplication or division. He then conceived of a single machine whose operation could be altered to make it perform any function. This machine would be able to answer any question that could conceivably be answered. Turing then showed that it would be unable to arbitrate on the undecidability of a question: thus, Gödel’s problem could not be contained. In thinking about mathematical logic, Turing had conceived of a machine unlike others of its time. In the 1930s electromechanical machines worked by storing data on punched paper, with the algorithm (what the machine did with the data) fixed by the physical set-up of the machine. They were unable to perform any other operation on the data provided. With his ‘Universal Turing Machine’, Turing had conceived of the first re-programmable computer. The technology did not exist then to make the idea a physical reality, but it was a crucial theoretical step in the development of the modern-day computer. Computers today, for example, are able to run word processors, as well as a myriad of other programs.

A twist of fate led Turing to pursue the re-programmable computer. Around the start of the Second World War he was recruited by the Government Code and Cypher School (GC&CS) at Bletchley Park to work on code-breaking. It was here that he famously spearheaded the cracking of the German Enigma code. Swift deciphering of messages meant a need for mechanisation. He thus came into contact with Colossus, a machine based on his Universal Turing Machine and designed by Max Newman. This was technically the world’s first programmable computer, although re-programming meant re-wiring. However, being top secret, it was destroyed after the war, and credit for the first programmable computer went to the American ENIAC (Electronic Numerical Integrator and Computer) in 1945.

Turing found the inspiration to set about turning the Universal Turing Machine into a reality. He taught himself electrical engineering and designed a computer. However, because no one knew of the work done at Bletchley, Turing was seen as something of a fantasist when he showcased his idea; and with Britain in a state of near bankruptcy after the war, his ambitious schemes were not realised. Turing died in tragic circumstances; he committed suicide after being publicly disgraced and prosecuted for homosexuality. He was 42 years of age. The future would see him redeemed both academically and socially.

It was left to John von Neumann to advocate the advantages of the stored program computer. Von Neumann was a pioneering mathematician, quantum physicist and game theorist, and the Americans wanted his expertise at Los Alamos, developing the atomic bomb. The work there was far removed from the clean, beautiful work theoreticians had done in academia. If a solution to how a blast wave propagated from an atomic bomb could not be worked out mathematically, a numerical one would have to be found. This meant performing massive numbers of calculations, a task that could not possibly be done by hand. Mechanical calculators were the researchers’ indispensable tool for performing these repetitive calculations.

Maurice Wilkes (left) with EDSAC I

© Computer Laboratory, University of Cambridge. Reproduced by permission



In 1943 von Neumann learned of ENIAC and, in 1944, he joined its designers, Eckert and Mauchly, at the University of Pennsylvania to work on its successor, to be called EDVAC, the Electronic Discrete Variable Automatic Computer. Eckert and Mauchly were discussing how a program might be stored in a computer before von Neumann joined them. However, it was von Neumann who, in the summer of 1945, wrote “First Draft of a Report on the EDVAC”, one of the most famous papers in computing history. In this, he outlined the characteristics that a computer should have. Among these were a memory, which could store data and instructions in such a form that each could alter the other; a calculating unit, capable of performing both arithmetic and logical operations on the data; and a control unit, which could interpret instructions from the memory and treat them differently depending on the results of previous operations. Perhaps unjustly, von Neumann received almost all the credit for the theory behind the programmable computer. By 1946, the conceptual framework for the EDVAC was in place. However, the British would once again be the first with the product, and this time they would put their machine in the public eye rather than the incinerator.

The Computer Laboratory at Cambridge, then called the Mathematical Laboratory, was founded in 1937, with John Lennard-Jones as Director. The only staff member was Maurice Wilkes, a physicist and University demonstrator. The laboratory was housed in the former Anatomy School at the New Museums Site. However, before it could be opened, the laboratory was commandeered by the Ministry of Supply for the war. Wilkes’ knowledge of radio waves proved useful in the development of radar, while Lennard-Jones became Chief Scientist at the Ministry of Supply. In 1945 the laboratory was returned to civilian hands and Wilkes became Director. In May 1946 he read von Neumann’s “First Draft” in one night. Realizing its importance, he jumped at the chance to attend a series of lectures in Pennsylvania entitled Theory and Techniques for Design of Electronic Digital Computers. The principal instructors were Mauchly and Eckert, who gave a detailed description of the proposals for the EDVAC.

On his return, Wilkes was determined to build a stored program computer. Construction began in the old dissecting room, chosen partly for its large goods lift that had once carried cadavers. The process benefited from a staff drawn from varied academic fields. Maurice Wilkes, now an Emeritus Professor, recalls, “We had some new things to learn… What would lead to a harmless flash on a television screen could lead to a serious error in a computer.” The machine, christened EDSAC (Electronic Delay Storage Automatic Calculator), was completed in 1949, and the first logged program, computing the squares of the numbers 0–99, ran on 6 May that year. While the first internal memory computer had been built the previous year in Manchester, EDSAC was the first fully functional stored program computer. It found itself in such heavy demand by X-ray molecular biologists, radio astronomers, theoretical chemists and others that in 1953 a priority committee was established. The American EDVAC was not unveiled until 1952.

The stored program computer had been conceived in Cambridge by Turing. Owing to Wilkes and his associates it was born in Cambridge; it would, in part, be raised here too. EDSAC 2, which replaced EDSAC in 1958, was the first full-scale machine to use a simplified form of logic called microprogramming. Developments continue at the Computer Laboratory (so called since 1970) to this day. Robin Stokes, an early user of EDSAC, wrote on the fiftieth anniversary of EDSAC: “I did not imagine that 50 years later I would be using one of EDSAC’s descendants, about a million fold faster and more powerful, to send birthday good wishes from my home on the other side of the world.” Who knows what changes the next 50 years will bring?

For references, see www.bluesci.org

Abinand Rangesh

The current Computer Laboratory building on the West Cambridge Site


Tim Wogan is a recent graduate in Natural Sciences, specializing in Physics and History and Philosophy of Science



Arts & Reviews

Drawing Breath

Owain Vaughan talks to artist Helen Gilbart about art, science and the origins of life

The muse has been four billion years in the making. The artist, Helen Gilbart, has been in residence at the Sedgwick Museum and Department of Earth Sciences since 2001. Her work from this project, a rare collision between art and science, has recently culminated in The Sedgwick Tapes, an exhibition of her drawings and paintings.

Set in the modern surroundings of New Hall, The Sedgwick Tapes explores the evolution of life from its very origins. It is a show both about what we know and, more importantly, about what we do not know concerning the Earth’s prehistory. It is about creation, destruction and the delicacy of existence on the planet over inconceivably long periods of time. In turn, it offers us the opportunity to contemplate our own ephemerality.

“Where was life’s trigger, where was the start?” The exhibition opens with one of a series of drawings that signify the artist’s struggle with these very issues. Breath, an abstract piece in dark tones, depicts a lone box in which something undefined appears captured. This is, according to Gilbart, a representation of her attempt to come to grips with a moment we know very little about: the moment when all the necessary constituents for life were in place, “when all the circumstances collided and life emerged”. Similarly, Chink, a glimpse of colour from within an almost concrete blandness, concerns knowledge and interpretations of it. This is a painting about the limited evidence that we have at our disposal to unravel the past or, as Gilbart puts it, about the way in which “we see knowledge through tunnel vision”.

But the show is about more than just the unknown. Although traces of vast evolutionary changes are sparse, we do have some evidence, largely derived from fossil records. Working with the renowned fossil collection at the Sedgwick Museum and scientists at the Department of Earth Sciences, Gilbart has been inspired to explore her own interpretations of these expressions of early life. Whispers I and II, for example, are fragile renditions in pale blues and yellows of fossilized stromatolites, which Gilbart describes as “algae that oxygenated the Earth’s atmosphere and were thus key to the course of a largely oxygen-dependent evolution”. Similarly, the series of 14 canvas blocks, Field, concerns the development of the eye and its various early forms. It includes images inspired by dissections of fossilized trilobites and the primitive calcite lenses they possessed, as well as the large eyes of the deep ocean squid, around 25 centimetres in diameter.

Where did it all begin for Gilbart? Her life’s work has been primarily concerned with the landscape of the Earth—its structure, history, composition. After studying fine art and geography at the University of Lancaster, where she was inspired by the unruly landscape of the northwest of England, she returned to her native London to work in an East End warehouse on the Thames. Driven by her love of all things geographical, she went on to spend a year working in the very heart of her subject, first in the mountains of Cyprus and subsequently in the southeast of Spain. She now lives in Suffolk. It was here that she first became involved with fossils, working with a local fossil collection. It was this discovery that led her to the Sedgwick Museum, and “the extraordinary collaboration” that emerged.

It was a collaboration that she found difficult at first: “It took me a long time to find a language in which to respond to the scientific concepts,” she recalls. Gilbart was concerned that her art should not mimic the science—“it would just be a very inadequate copy”. Instead she sought something less literal, and ultimately more personal. For in the end, she is an artist, not a scientist: “I’d like to think that the images stand for themselves, that they stand independently from all the science. The pieces have to work on a fine art level.”

Informed by what she has learnt from the residency in Cambridge, Gilbart now plans to work closer to home, documenting the haunting landscape of Suffolk. She is excited by the prospect of studying the coastline, which, she notes, will be subject to massive changes over the next few decades. Although Gilbart stresses that she does not intend her paintings to be literal representations or to bombard the viewer with messages about today’s environmental issues, she cannot help but be saddened by the lack of will to look after the natural world properly: “I cannot understand the lack of urgency,” she says. But for Gilbart it is less the planet that is in danger than the people who populate it: “The story goes on and on, it’s just that perhaps we won’t be here to interpret it.” For the moment, however, Gilbart’s paintings provide us with a thoughtful and subtle interpretation of the story so far.

Helen Gilbart

Breath, mixed media on handmade rag, 2004

www.helengilbart.co.uk

Owain Vaughan is a PhD student in the Department of Chemistry




Please email your queries to drhypothesis@bluesci.org for your chance to win a £10 book voucher

Dear Dr Hypothesis,
I have recently been asked to participate in an experiment looking at the effect of coffee on the brain. Keen though I am to help the advance of science, part of this would involve being subjected to a Magnetic Resonance Imaging (MRI) scan. I haven’t been in a hospital for many years and I don’t like the idea of any chemicals being injected into my body. What exactly is involved in an MRI scan, and how are the images produced from it?
Caffeinated Carl

DR HYPOTHESIS SAYS: Carl, there’s no need to worry—MRI is a technique carried out in hospitals every day. The machine applies a strong magnetic field along the body, causing the majority of the protons in the body’s hydrogen atoms to line up with this field. A radio pulse is then applied, tipping these spins out of alignment. Once the pulse is removed, the spins gradually return to their previous state, releasing energy as they do so. The pattern of this energy release differs between tissues, and it is this signal that the computer builds into an image, with no invasive procedure and no nasty chemicals injected into your body.

Dear Dr Hypothesis,
I must confess that I ate one too many Easter eggs over the holidays and, with the stress of exam term coming on, I’ve let my diet slip over the last couple of weeks. It was when I clambered into the shower and felt the curtain brush up against me that I first noticed how much more space I was taking up. Could you please tell me why the shower curtain gets sucked inwards when I turn on the shower? Is there any way to prevent this daily reminder of all that chocolate?
Clean Carmella

DR HYPOTHESIS SAYS: The curtain is dragged in towards you because the air pressure inside the shower is lower than outside. This is caused by the fast downward flow of air inside the shower as it is dragged along with the water. Computer modelling has shown that this forms a mini-cyclone, just as is seen (admittedly on a larger scale) in weather patterns. I’m afraid, Carmella, that there isn’t much you can do to avoid this. So, if it upsets you, it’s probably time to get on that treadmill.
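A small aside on the MRI answer above: the radio pulse only works at one particular frequency, which is set by the strength of the magnet (the Larmor relation). A minimal sketch of the arithmetic, using the standard tabulated proton constant; the function name is my own:

```python
# Larmor relation: hydrogen nuclei (protons) resonate at a radio
# frequency proportional to the applied magnetic field strength.
GAMMA_BAR_MHZ_PER_T = 42.577  # proton gyromagnetic ratio / 2*pi, MHz per tesla

def larmor_frequency_mhz(field_tesla: float) -> float:
    """Radio frequency (in MHz) at which protons resonate in a given field."""
    return GAMMA_BAR_MHZ_PER_T * field_tesla

# Typical clinical scanners:
print(larmor_frequency_mhz(1.5))  # ~63.9 MHz
print(larmor_frequency_mhz(3.0))  # ~127.7 MHz
```

So a stronger magnet simply shifts the radio pulse to a higher frequency; the imaging principle is unchanged.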

Lizzie Phillips

Dr Hypothesis

Dear Dr Hypothesis,
I have only just returned home from a rather lengthy stay in hospital after an experiment I conducted during the winter went badly wrong. I was interested in finding out why salt is spread onto icy roads. To investigate this, I used a number of different household products to try to melt the ice on my garden path. Needless to say, none of them were successful and I slipped, leading to my trip to the hospital. Do you know why salt, specifically, is used on icy roads?
Limping Lorna

DR HYPOTHESIS SAYS: I’d first like to warn my readers of the dangers of experimenting alone—I’m sure Lorna would agree! As for the question: salt is spread on roads because, when it dissolves in any water present, it lowers the freezing point of the resulting brine to as low as –9°C, preventing ice from forming. The decision to use salt, rather than another soluble compound with the same effect, is likely to be based on cost; glycol, for example, costs up to 20 times as much as salt and so is only rarely used, usually on bridges, where its non-corrosive nature is important.

A list of Dr Hypothesis’ resources can be found online at www.bluesci.org
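For readers who want to see where a figure like –9°C comes from, the textbook ideal-solution formula for freezing-point depression is ΔT = i·Kf·m. A rough sketch (the function name is my own, and the ideal formula is only an approximation for concentrated brine):

```python
# Ideal-solution freezing-point depression: dT = i * Kf * m
KF_WATER = 1.86          # cryoscopic constant of water, K·kg/mol
I_NACL = 2.0             # van 't Hoff factor: NaCl dissociates into Na+ and Cl-
M_NACL_G_PER_MOL = 58.44 # molar mass of NaCl

def brine_freezing_point_c(grams_salt_per_kg_water: float) -> float:
    """Approximate freezing point (°C) of water with dissolved NaCl."""
    molality = grams_salt_per_kg_water / M_NACL_G_PER_MOL  # mol salt per kg water
    return -I_NACL * KF_WATER * molality

# Roughly 140 g of salt per kg of water gets you to about -9 °C:
print(brine_freezing_point_c(140))  # ~ -8.9
```

In practice real brine deviates from this ideal estimate at high concentrations, but the calculation shows why a soluble, cheap compound like salt is so effective.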

Dr Hypothesis asked: “Why do the geographic and magnetic north poles not strictly coincide?”

One of our readers answered: The geographic north pole, or true north, is the most northerly point where the Earth’s axis of rotation meets its surface. The magnetic north pole, on the other hand, is the point where the Earth’s magnetic field points vertically downwards, and its position drifts over time. The two poles can be hundreds of miles apart, which is why you have to correct your compass bearings when reading a map.

The Good, the Bad and the Mathematician

Dr Hypothesis’ research assistant Tom Pugh challenges you to try your hand at this mathematical mystery:

After a terrifying journey, three fearless gunslingers have reached their goal: the now unearthed stash of gold. The only way to decide who will receive this prize is a three-way fight to the death. The order of the three gunmen will be determined by drawing straws; each will then take it in turns to fire one shot until only one man remains.

The Good, being the sharpest draw in the West, hits his target every time. The Bad, with a less impressive but still formidable aim, hits 80% of the time. The Mathematician is truly fearless, although unfortunately not the best shot, hitting only 50% of the time. No wild shot intended for one man will hit any other. The Good and the Bad will each adopt the strategy that gives them the highest chance of victory: shooting at each other until one is dead, then shooting at the Mathematician.

What is the best strategy for the Mathematician? Furthermore, what are the chances of victory for each man? Visit www.bluesci.org for the answer.
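If you would rather experiment than reason it out on paper, the puzzle lends itself to Monte Carlo simulation: encode the rules, pick a candidate strategy for the Mathematician, and estimate everyone’s chances over many random fights. A sketch (the names G/B/M, the `simulate` function and the two strategy labels are my own inventions, not part of the puzzle):

```python
import random

# Monte Carlo sketch of the three-way gunfight.
# G = the Good (hits 100%), B = the Bad (80%), M = the Mathematician (50%).
ACCURACY = {"G": 1.0, "B": 0.8, "M": 0.5}

def simulate(m_strategy: str, trials: int = 100_000, seed: int = 1) -> dict:
    """Estimate each man's probability of victory.

    m_strategy: "best" -> M always aims at the best shot still standing;
                "miss" -> M fires into the air while both rivals live,
                          then aims at the survivor.
    """
    rng = random.Random(seed)
    wins = {"G": 0, "B": 0, "M": 0}
    for _ in range(trials):
        order = ["G", "B", "M"]
        rng.shuffle(order)              # straws drawn at random
        alive = set(order)
        turn = 0
        while len(alive) > 1:
            shooter = order[turn % 3]
            turn += 1
            if shooter not in alive:
                continue
            rivals = alive - {shooter}
            if shooter == "M" and m_strategy == "miss" and len(rivals) == 2:
                continue                # deliberate miss
            # G and B shoot at each other first, i.e. at the best rival.
            target = max(rivals, key=lambda p: ACCURACY[p])
            if rng.random() < ACCURACY[shooter]:
                alive.remove(target)
        wins[alive.pop()] += 1
    return {p: n / trials for p, n in wins.items()}
```

Comparing `simulate("best")` with `simulate("miss")` lets you test candidate strategies for yourself before checking the answer on www.bluesci.org.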

Think you know better than Dr Hypothesis? He challenges you with this puzzle: What colour is molten gold? Why? Please email him with your answers, the best of which will be printed in the next issue of BlueSci.






Looking for a great job?

ScienceCareers.org is the leading careers resource for scientists. And now it offers even more. In addition to a brand new website with easier navigation, ScienceCareers.org now includes Next Wave.

Next Wave is the essential online career magazine, packed with features and articles to help advance your science career. Surf to www.ScienceCareers.org.

Get the experts behind you Visit www.ScienceCareers.org

• Hundreds of job postings
• Career tools and Next Wave
• Grant information
• Resume/CV Database
• Career Forum

