Michaelmas 2014 Issue 31 www.bluesci.org
Cambridge University science magazine
FOCUS
GM Crops: Feeding the Nine Billion
Measles . Voyager 1 . Colours . Bioelectricity . Nature vs Nurture
Michaelmas 2014 Issue 31
Cambridge University science magazine

Contents

Features
6   Our Colourful History
    Rhian Holvey explores how colour has shaped our history
8   Voyager 1: Breaching the Final Frontier of the Solar System
    Simon Watson describes Voyager 1's journey of a lifetime
10  I Heal the Body Electric
    Joy Thompson uncovers the importance of bioelectricity in medicine
12  Measles: The Return of an Eliminated Virus
    Sarah Smith investigates the return of measles
14  You are your Genes and your Environment
    Alex O'Bryan-Tear discusses the longstanding nature versus nurture debate
16  FOCUS: GM Crops
    BlueSci reveals the origins of the debate surrounding GM crops and discusses whether current legislation hinders progress

Regulars
3   On The Cover
4   News
5   Reviews
30  Perspective
    Matthew Dunstan explores the changes YouTube is making to education
32  Science and Policy
    Robin Lamboll investigates current alcohol pricing and the future of alcohol tax
34  Arts and Science
    Joanna-Marie Howes talks to Christopher Riley about his latest documentary
36  Away from the Bench
38  Technology
    Verena Brucklacher-Waldert describes advances in single-cell profiling
40  Weird and Wonderful

Anniversary Specials
22  Where Are They Now?
24  Nobel Prizes and Ig Prizes
26  Great Discoveries
28  Headline Grabbers
39  Pavilion
    Jonathan Lawson discusses this year's FameLab competition

About Us...
BlueSci was established in 2004 to provide a student forum for science communication. As the longest running science magazine in Cambridge, BlueSci publishes the best science writing from across the University each term. We combine high quality writing with stunning images to provide fascinating yet accessible science to everyone. But BlueSci does not stop there. At www.bluesci.co.uk, we have extra articles, regular news stories, podcasts and science films to inform and entertain between print issues. Produced entirely by members of the University, the diversity of expertise and talent combines to produce a unique science experience.

Committee
President: Nathan Smith (president@bluesci.co.uk)
Managing Editor: Sarah Smith (managing-editor@bluesci.co.uk)
Secretary: Robin Lamboll (enquiries@bluesci.co.uk)
Treasurer: Chris Wan (membership@bluesci.co.uk)
Film Editors: Shayan Ali (film@bluesci.co.uk)
Radio: Hinal Tanna (radio@bluesci.co.uk)
Webmaster: James Stevens (webmaster@bluesci.co.uk)
Advertising Manager: Sophie Harrington (advertising@bluesci.co.uk)
Events & Publicity Officer: Ornela de Gasperin (events@bluesci.co.uk)
News Editor: Joanna-Marie Howes (news@bluesci.co.uk)
Web Editor: Camilla d'Angelo (web-editor@bluesci.co.uk)
Giant's shoulders

Issue 31: Michaelmas 2014
Editor: Greg Mellers
Managing Editor: Sarah Smith
Second Editors: Sarah Smith, Nathan Smith, Irene Marco-Rius, Alison Mackintosh, Ana Duarte, Ornela De Gasperin Quintero, Robin Lamboll, Daisy Hessenberger, Caitlin McCormack, Camilla D'Angelo, Greg Mellers, Carol O'Brien
Copy Editors: Sarah Smith, Simon Watson, Caitlin McCormack, Robin Lamboll, Zaria Gorvett, Greg Mellers
News Editor: Joanna-Marie Howes
News Team: Madeline Kavanagh, Paula Siemek
Reviews: Nathan Smith, Dhiren Mistry, Jannis Meents
Focus Team: Greg Mellers, Emily Bailes, Nathan Smith, Daisy Hessenberger
Weird and Wonderful: Brandon Bedford, Rachel Harvey, Ellen Rugg
Production Team: Sarah Smith, Simon Watson, Caitlin McCormack, Nathan Smith, Robin Lamboll, Greg Mellers, Sophie Harrington, Arporn Wangwiwatsin
Illustrators: Matt Cotten, Alison Mackintosh, Sue Smith, Emily Pycroft, Daisy Hessenberger, Alex Hahn
Cover Image: Dhiren Mistry

ISSN 1748-6920

This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License (unless marked by a ©, in which case the copyright remains with the original rights holder). To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-nd/3.0/ or send a letter to Creative Commons, 444 Castro Street, Suite 900, Mountain View, California, 94041, USA.

BlueSci first went to print ten years ago, when the scientific world was undergoing some important changes. The Human Genome Project, which completed its primary goal of sequencing the human genome in 2003, had required vast amounts of time and resources to achieve. As a result, it became clear that improvements in technology were required, and hence the National Human Genome Research Institute (NHGRI) began a funding programme which aimed to reduce the cost of sequencing a human genome to $1000 within ten years. This fuelled a boom in so-called 'Next-Generation' sequencing technologies, which led to Illumina's 'HiSeq X Ten' machine achieving the goal of low-cost genome sequencing in January 2014. The huge swathes of data being produced by such technology will soon permit population-level studies of higher organisms and vastly improve our investigative powers, likely revolutionising health care.

Graphene had only just been produced, but its strength and efficient thermal and electrical conductance have since generated a $9 million industry. Though commercial applications have yet to be unlocked, its predicted influence in electronics and bioengineering is set to revolutionise how we interact with technology. Finally, physicist Peter Higgs had recently been awarded the Wolf Prize in Physics in recognition of his pioneering work on mass generation. In this past decade, scientists have gone on to confirm his theoretical particle, which won him a Nobel Prize last year. More importantly, the distinction between these three fields has become increasingly blurred as interdisciplinary problems require combined research efforts to solve. For example, prosthetic eyes now consist of encoders and transducers that mimic the natural retinal output of the eye, requiring research input from scientists of every discipline.

Throughout these ten years of influential changes, BlueSci has continued to explore and report upon this ever-changing landscape, and this issue's theme reminisces about the great scientists who have gone before us. We look back at the history of our relationship with colour and how the development of synthetic dyes changed the medical world forever. The entrance of a man-made object into interstellar space after a thirty-six-year journey is truly a feat of human engineering, but we will have to wait another fourteen thousand years for it to leave our Solar System. Though the discovery of electricity revolutionised society forever, it is yet to relinquish all of its secrets, as we discover its emerging role in controlling biological development. However, our predecessors are not without fault; we discuss how the eradicable measles virus has recently made a frightening resurgence following the MMR vaccine controversy. In our Focus article, we examine the early legislative decisions surrounding the consumption and cultivation of GM crops and ask if they have stifled subsequent research efforts.

To celebrate our tenth anniversary, some of the previous contributing members of BlueSci provide an insight into life after leaving the magazine. We detail some of the most important scientific discoveries since 2004 and reflect on those that grabbed public attention in the media and those that achieved the highest scientific accolade – a Nobel Prize.

This magazine would not have continued through these past ten years without the persistence and perseverance of countless individuals. The authors, illustrators, editors and producers of this magazine may change year on year, but the ultimate goal is always to strive for an interesting and engrossing read. I hope that this ten-year anniversary issue inspires you to get involved with BlueSci and that you join me in celebrating the tireless effort of those who have gone before me to make this such a wonderful publication.

Greg Mellers
Issue 31 Editor
Turbulent encounters
Dhiren Mistry discusses the turbulence we encounter every day
Honey has a much higher viscosity than water
Most of us experience turbulence through stomach-churning drops in an aircraft. Fortunately, the turbulence that is widely researched by engineers and scientists is much less nauseating, yet it remains an intimidating subject. Research on turbulence concerns the chaotic, swirling behaviour of fluids, which comprise both liquids and gases. We can find turbulence in many naturally occurring and man-made flows, such as the mushroom clouds created by volcanoes and nuclear bombs, and the swirling flows in tornadoes and behind propellers. In fact, it is difficult to study most fluid phenomena without taking into account the effects of turbulence. Unfortunately, even after almost 175 years of research this chaotic fluid behaviour is not yet understood, illustrating the unforgiving complexity of the topic. However, the chaotic nature of turbulence is precisely what allows fluids to mix so well. Without turbulence, exhaust gases ejected from cars, aircraft and smoke-stacks would simply not disperse effectively into the surrounding air.

A simple explanation categorises fluids into two different flow states, laminar or turbulent; which of these states a fluid adopts depends on both the viscosity and the inertia of the fluid. Viscosity, the 'stickiness' of the fluid, is best explained with an example: golden syrup is very viscous, whereas olive oil is much less so. Try stirring a bowl of golden syrup and you will notice that it does not mix very easily, because its viscosity restricts the mixing motion. The inertia of a fluid describes its resistance to changes in motion; for example, the fast-moving exhaust gases leaving the nozzle of a rocket have a lot of inertia. Laminar flows are steady and predictable because they are stabilised by viscosity, which is relatively strong compared to the inertia of the fluid. Fluids become turbulent when the effects of viscosity become very small compared with the inertia of the fluid motion. This can result from a fluid with low viscosity (e.g. oil) or from mixing the fluid very quickly to increase its inertia (e.g. mixing golden syrup with a blender).

An interesting feature of turbulence is the prevalence of swirling motions. Move a teaspoon in a straight line through a cup of tea and you will notice swirling patterns in the wake behind the spoon. This is the effect of 'shear' on fluids, and shearing motion is unstable. In laminar flows the viscosity inhibits this unstable shearing motion, but in turbulence the motion is allowed to roll the fluid up into the swirling patterns that we observe. This explains the shape of the mushroom clouds from volcanoes and nuclear explosions: the sudden upwards movement of air from the volcano or explosion is dragged, or sheared, against the relatively motionless surrounding atmosphere whilst the inner fluid of the cloud moves at greater speed. This shearing between slow movement on the outside of the cloud and fast movement on the inside causes the cloud of fluid to roll up into a turbulent mushroom shape.

Turbulence exists over a range of scales: there are very large, random, swirling motions, and within them moderately sized swirling motions, and within those even smaller ones, like a set of Russian dolls. While it might seem easy to dismiss the smallest motions in turbulence, they are in fact responsible for many important features of turbulent flows, such as the mixing between exhaust gases and the air, and the skin-friction drag from the airflow over cars and aircraft. Much turbulence research is therefore directed towards understanding small-scale turbulence, which is very difficult to predict, in relation to the larger scales of the flow, which are easier to predict.

The image on the front cover of Issue 31 shows the remnants of turbulent flow structures following an experiment. The aim of this experiment was to investigate the behaviour of turbulence at a small scale, on the order of a few millimetres. The turbulence was generated by injecting a jet of dyed water into a tank of still water (think of spraying a water pistol that is submerged in a bathtub). The bright green patterns are visualised using a fluorescent dye and a powerful laser. The laser beam is formed into a thin sheet of light, which means the cover image represents a two-dimensional slice through the fluid. Note the mushroom-shaped patterns, similar to those from a volcanic eruption. These patterns were caused by some of the dyed fluid being pushed outwards into the still, ambient region. The difference in fluid speed between the slow outer region and the fast inner region of the dyed fluid resulted in shear-produced swirling motions in the shape of mushrooms. In this regard, the scalable nature of turbulence means that we are able to study large, natural phenomena in a laboratory setting. Researchers ultimately hope to find some coherence in this chaos to help explain the intricate and entrancing patterns that turbulence produces.

Dhiren Mistry is a 3rd year PhD student in the Department of Engineering

Dhiren's image was shortlisted in the University of Cambridge's Department of Engineering Photography Competition 2013. The competition, sponsored by Zeiss, aimed to show that engineering is not only about fixing machines and building bridges, but involves everything from studying objects and processes in microscopic detail, to building towering structures.
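The viscosity-versus-inertia balance described above is conventionally captured by a single dimensionless quantity, the Reynolds number (not named in the article): Re = rho * U * L / mu, the ratio of inertial to viscous forces. A minimal back-of-envelope sketch in Python; the stirring speed, length scale and fluid properties below are illustrative assumptions, not measurements from the cover experiment.

    def reynolds(rho, speed, length, mu):
        """Reynolds number: ratio of inertial to viscous forces.
        rho in kg/m^3, speed in m/s, length in m, mu in Pa*s."""
        return rho * speed * length / mu

    # Illustrative values: stirring a ~5 cm cup at ~0.1 m/s (assumptions).
    print(f"water:        Re ~ {reynolds(1000, 0.1, 0.05, 1e-3):,.0f}")  # ~5,000 -> turbulent
    print(f"golden syrup: Re ~ {reynolds(1400, 0.1, 0.05, 50):.2f}")     # ~0.1  -> laminar

Low Reynolds numbers give the steady, viscosity-dominated laminar regime; high Reynolds numbers give turbulence, which is why water swirls in a stirred cup while syrup barely mixes.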
News

Addiction, Addicti-off

Recent studies published by Professor Scott Steffensen and his collaborators at Brigham Young University suggest that addiction is a chronic brain disease, not just a result of bad choices or rebellious behaviour, and that it could be treated and possibly cured like any other disease. The team strongly believes that in the near future, medicine will be able to help addicts return to a relatively normal state by dealing with the changes in their brains. The studies show that there is a common mechanism behind addiction to nicotine, alcohol and drugs, which is comparable to a driver overcorrecting a vehicle.

So, how does it work? The presence of these substances in the body triggers the release of unnaturally high levels of the neurotransmitter dopamine in the brain's pleasure system. That, in turn, leads to oxidative stress in the brain. The body tries to deal with, and compensate for, the unnatural levels of dopamine by producing a protein called brain-derived neurotrophic factor (BDNF), which suppresses the brain's normal production of dopamine. The only problem is that the correction occurs long after the person comes down from a high. The body's delayed reaction thus results in a lack of dopamine, which has unpleasant consequences commonly known as withdrawal symptoms: anxiety, distress and pain.

Steffensen and his collaborators emphasise how misunderstood the concept of addiction is in our society, and hope their studies will help to better explain it as well as the mechanisms behind it. These studies could be a milestone in helping to eliminate the stigma that haunts addicts and to help them return to society. DOI: 10.1016/j.biopsych.2013.08.033 ps

Gendercide of Mosquitoes

Genetically engineering a population of mosquitoes to produce only male offspring could provide a novel means of eliminating malaria. Researchers have developed a method of selectively destroying the X-chromosome in mosquito sperm, so that only the male-producing Y-chromosome is available during fertilisation. The number of offspring produced remained unaltered, but more than 95 per cent were male.

Malaria is a potentially lethal disease caused by the parasite Plasmodium falciparum, which only reproduces in the gut of the female Anopheles mosquito. Female mosquitoes are also solely responsible for transmitting malaria to people as they feed on blood; male Anopheles mosquitoes are vegetarian. Altering the sex ratio in mosquito offspring has two benefits for malaria control. Firstly, the number of females available for reproduction and transmission of malaria is reduced. Secondly, over a number of generations, the reproductive capacity of the mosquitoes diminishes as the proportion of females available for breeding becomes unsustainable. Because mosquitoes have a lifespan of only one to four weeks, the impact of changing population dynamics is rapidly observable.

Unlike other methods of controlling mosquito populations, this technique specifically targets Anopheles mosquitoes. It does not affect other insect populations, or require environmental engineering or interventions by the local human community. There are significant social and political barriers to overcome before the system can be implemented outside the lab, though given the numerous benefits, approval will hopefully be forthcoming. DOI: 10.1038/ncomms4977 mk
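As a rough illustration of why a 95 per cent male bias can crash a population, here is a toy model; every number in it is assumed for illustration (the study itself used caged-population experiments, not this model). It assumes discrete generations and that density limits each female to about two surviving adult offspring, so only the sex ratio changes.

    # Toy model (all values assumed): each female leaves ~2 surviving adult
    # offspring per generation; with the drive, only ~5% of them are female.
    females = 1000.0
    for gen in range(1, 6):
        survivors = 2.0 * females       # density-limited adult offspring
        females = 0.05 * survivors      # ~95% of survivors are male
        print(f"generation {gen}: ~{females:.0f} breeding females")

Under these assumptions the breeding female count falls tenfold each generation (1000 to 100 to 10 to 1), which is the collapse the researchers describe.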
Are we comfortable with our innermost thoughts?

How far would you go to escape the queasy feeling that often accompanies self-reflection? As far as electrocuting yourself? It might sound surprising, but research by a team at the University of Virginia indicates that 67 per cent of males and 25 per cent of females would rather self-administer a small electric shock than sit alone with their thoughts for as little as 15 minutes.

The team wanted to determine how easily people were able to disengage from the distractions of modern life and how pleasant they found the experience. Students were asked to sit for between six and fifteen minutes in either a distraction-free lab or home environment and 'just think'. About 50 per cent of participants found the experience uncomfortable, rating it at or below the midpoint on an enjoyment scale. The majority complained of feeling unable to concentrate and that their mind wandered. Prompting the participants to think about a specific subject or upcoming holiday did not improve task enjoyment.

Researchers then tried to determine whether experiencing an unpleasant sensation, such as an electric shock, was preferable to doing nothing at all. The result, for men at least, was a resounding yes. The observed gender difference was attributed to the 'sensation-seeking' behaviour of men and the lower pain threshold of women.

Inward thinking is a uniquely human trait thought to be important in creativity, and engaging in mindfulness on a regular basis is linked with improved mental health. However, despite these clear benefits, it is now clear that 'the untutored mind does not like to be alone with itself'. DOI: 10.1126/science.1250830 mk

Check out www.bluesci.org or @BlueSci on Twitter for regular science news and updates
Reviews

Experiment Eleven: Deceit and Betrayal in the Discovery of the Cure for Tuberculosis - Peter Pringle
Bloomsbury, 2013, £8.99

In Experiment Eleven, Peter Pringle tells the incredible story of Dr Albert Schatz and his discovery of streptomycin, the antibiotic that would be the first cure for tuberculosis. He tells us how the then PhD student Schatz, working in the lab of Professor Selman Waksman, came across a microbial culture that produced the invaluable drug. This discovery led to a storm of public praise for – well, for Waksman. Painting a picture of an egocentric believer in hierarchy, Pringle describes how Waksman used his seniority to nudge Schatz out of the picture, taking all the credit for himself and eventually winning the Nobel Prize for 'his' discovery. The story is told in a fast and thrilling way, making it difficult for the reader to put the book aside. Occasionally Pringle gets lost in side stories and, ironically, he also tends to make the same mistake as Waksman: he attributes the credit to one person alone, in this case to Schatz, almost denying Waksman any part in the discovery. While this makes the story human and relatable, his sarcastic tone towards Waksman casts a shadow on Pringle's objectivity as an author. Dedicated to "the researchers in science who did the hard work, and never reaped the glory", Experiment Eleven addresses important questions about ethics in science and who should be given credit for discoveries. It is a fantastic read which will hopefully help straighten the record about the discovery of the cure for tuberculosis. jm

WEEDS: The Story of Outlaw Plants - Richard Mabey
Profile Books, 2012, £8.99

Richard Mabey's 'Weeds: The Story of Outlaw Plants', released in 2010, is a brilliant book about the unwanted plants around us. It examines our relationship with 'weeds', past and present, with poetic whimsy, switching from anecdote to anecdote, scientific experiment to poetry, and adding botanical examples galore. Mabey is clearly passionate about and well-versed in the subject, and his argument for a more nuanced view of weeds is convincing. Each chapter is titled after a different plant, be it real or mythological, and each explores weeds in a different way. For example, 'Love-in-idleness' explores weeds in literature, whereas 'Self Heal' explores the cultural history of today's weeds as medicine. Each challenges our modern-day perception of 'weeds' and confronts the reader to reconsider our attitude towards them, from the prickly burdock to the blistering giant hogweed. Whilst somewhat heavy at times, the book remains a testament to the lesser-appreciated beauty of nature. Indeed, perhaps much of its sentiment is best summed up in the verse by Gerard Manley Hopkins: "What would the world be, once bereft / Of wet and of wildness? Let them be left, / O let them be left, wildness and wet; / Long live the weeds and the wilderness yet". ns

Project Sunshine - Steve McKevitt and Tony Ryan
Icon Books, 2013, £16.99

The future of humans on Earth looks bleak: relentless population growth coupled with an ever greater demand for water, food and electricity. Is it even possible to meet these demands? Project Sunshine addresses this question in a to-do list for the human race. Setting the stage with a thorough review of the food and energy requirements for the coming decades that leaves the reader with overwhelming anxiety for the future, McKevitt and Ryan systematically evaluate a range of potential technologies and solutions, including genetically modified crops, solar power, wind turbines and nuclear power. The technical writing style in these central chapters is at times dry and uninspiring. This, however, simply reflects the reality of scientific progress that is known all too well by most students in research. In this regard, it is refreshing to have an evaluation of scientific concepts and technologies that has not been glorified by popular media. As hinted in the title, McKevitt and Ryan's solution is a globally coordinated investment in photovoltaic energy, which holds the greatest potential for scalability and cost-effectiveness. Project Sunshine is a compelling read that hopes to inspire the reader to tackle the significant global issues that will soon affect our planet. dm
Our Colourful History
Rhian Holvey explores how colour has shaped our history

We live in a colourful world, from the colours on this page right up to our blue and green planet. Their ubiquitous presence makes it easy to forget that colours are one of the most human of concepts. The classification of colours exhibits regional variation between cultures: Russian has separate words for dark blue (siniy) and light blue (goluboy), while Welsh has only one word to describe shades of blue, green and grey (glas). Defining colours remains subjective even within the same language, with two people describing the same shade differently. We use colours to describe our emotions, and they even modify our behaviour: consider the calming effect of pale blues and greens in hospitals, or how sports teams in red are statistically more likely to win. All these features of colour come from describing the same physical property, that of different wavelengths of light.

In 1671, Isaac Newton presented his research on the properties of light to the Royal Society. Prisms generate a rainbow-patterned arrangement of colours when light passes through one of their faces. It was originally thought that the prism itself added these colours. However, Newton's key finding was that prisms in fact split 'white' light into the colour spectrum. Each colour represents a particular wavelength of electromagnetic radiation in the spectrum visible to humans, ranging from the shortest wavelengths at ~400 nanometres (violet) to the longest at ~700 nanometres (red). We perceive colour when an object absorbs some wavelengths of light while reflecting others. Thus, if an object appears red, it absorbs all wavelengths of light except red.

Newton discovered how prisms 'split' white light

Within the eye there are different light-sensitive cells dispersed on the retina: rods and cones. Rods are active in dim light and are insensitive to colour. Rod cells are deactivated upon exposure to bright light and require time to be restored, which explains the time delay required to regain night vision after turning off a light. Cone cells work better in bright light and are responsible for our colour vision. Most humans have three different types of cone cell, sensitive to short, medium and long wavelengths of light respectively. The changing stimulation that these cells experience from particular wavelengths is interpreted by the brain as different colours.

Though the wavelengths of 'coloured light' are fixed, colour perception is not universal. Most humans have trichromatic vision because they have three types of cone cell in their retina. However, some people have fewer cone types, or none at all, which results in colour blindness. The most common form is red-green colour blindness, which arises from a shortage of cones sensitive to medium or long wavelengths; people with this condition have difficulty distinguishing between red, yellow and green. Similarly, there is a lot of variation in colour vision across the animal kingdom. Marine mammals have only one type of cone cell and thus have monochromatic vision. Contrary to popular belief, dogs, like most land mammals, have two types of cone cell and do not see only in black and white; in this regard, their vision is similar to that of a red-green colour blind person. Many other animals, such as birds, reptiles and fish, are tetrachromats: they have four different colour receptors and can distinguish more colours than humans. The animal with the most is the mantis shrimp, which has twelve types of cone cell and an ability to distinguish four shades of ultraviolet.

Humans have been fascinated with colour from our earliest history. There is evidence of industries formed for the production of single-colour dyes dating back over 5000 years. Indigo, for example, was the British American colonies' biggest agricultural export alongside rice during the 18th century. Even today, indigo is widely used to add the blue colour to jeans. Unlike dyes, which are soluble and fix to the substance being dyed, pigments are insoluble and dry on top of the target substance. Pigments were originally obtained from natural sources such as minerals, plants and animals, often at great cost. For example, ultramarine was extracted from lapis lazuli, a deep blue semi-precious stone that was worth more than gold in medieval times.

This reliance on natural pigments changed in 1856 through a serendipitous discovery by an 18-year-old British chemist, William Perkin. Perkin was attempting to synthesise the anti-malarial compound quinine from coal tar, a waste product of coal gas production. One day, while washing away a black residue from an apparently failed reaction, he noticed that the ethanol washings had a brilliant purple colour. Curious, he dipped some silk into the solution, which became dyed and remained strongly coloured even after washing. Despite his youth, Perkin immediately saw the potential to produce the world's first synthetic dye. Perkin patented his 'Aniline purple' or 'Tyrian purple', named after the notoriously expensive dye extracted from sea snails in ancient times, and set up a factory for mass production.

The initial uptake of Perkin's synthetic dye was slow, but the colour leapt to fame in 1857 thanks to Empress Eugénie of France. The empress's dominance in fashion soon popularised the French name for the colour, mauve, and the dye subsequently became known as 'mauveine'. A year later Queen Victoria wore a mauve dress to her daughter's wedding, and suddenly women on both sides of the Channel wanted to wear the colour, with the magazine Punch dubbing it the 'mauve measles'. With the wealth Perkin made from his discovery, he continued to pursue his chemical interests, which led him to discover and synthesise more dyes and compounds, including coumarin and alizarin: respectively the first synthetic perfume component and the first natural pigment replicated synthetically.

The success of 'mauveine' had a much greater impact on organic chemistry than on Perkin's own fortune, though. Prior to Perkin's discovery, chemistry was regarded as having no practical importance. But after the advent of 'mauveine' and other synthetic dyes, the industrial potential of chemistry was quickly realised and dye factories sprang up all over Europe. Amongst the most successful were those in Germany, and one of the most famous, Bayer, would be partially responsible for the rise of modern medicine.

In the 1870s, cousins Karl Weigert and Paul Ehrlich were the first researchers to stain biological samples using dyes. What they found was surprising: different dyes would distinctly stain specific cell types and bacteria. Weigert and Ehrlich's methods became crucial for studying human cells and diagnosing bacterial infections. Ehrlich was captivated by these results and wondered whether they could be used for more than diagnosis. He reasoned that if a dye could stain a bacterium without touching human cells, then this property could be exploited to direct a toxin and specifically kill the bacteria without damaging human tissue. The theory was neatly summarised by his term 'magic bullets', and was one of the earliest proposals for chemotherapy. The belief at that time was that all chemicals were too toxic to be used internally, though many were widely used externally as antiseptics. Magic bullets were proved plausible in 1909, when Ehrlich discovered the bright yellow Salvarsan, a reasonably effective drug to treat syphilis with fewer side effects than the traditional mercury-based treatments.

Ehrlich discovered the first chemotherapeutic drug, Salvarsan, used for treating syphilis

The success of Salvarsan inspired the work of chemists and pathologists at Bayer, particularly Gerhard Domagk, who in 1932 helped discover the world's first therapeutic antibiotic: Prontosil, derived from a red dye. For the first time in human history, the bacterial infections caused by Streptococcus (one of the most common and lethal bacteria), such as childbed fever, scarlet fever and rheumatic fever, could be treated. Though the active part of Prontosil was later shown to be the colourless sulphanilamide, a class of antibiotics still used today, it was the former dye pursued by Bayer that led not only to the discovery of the world's first therapeutic antibiotic, but to the establishment of chemotherapy and medicinal chemistry, both cornerstones of modern medicine. Worth considering the next time you put on your favourite coloured top!

Rhian Holvey is a post-doctoral researcher in the Department of Chemistry
Voyager 1: Breaching the Final Frontier of the Solar System
Simon Watson describes Voyager 1's journey of a lifetime

In 1977, NASA launched two unmanned probes into space to take a 'Planetary Grand Tour' of the outer Solar System. Voyager 1 was to explore both the gas giants Jupiter and Saturn, while Voyager 2 would further explore Uranus and Neptune. The timing of the launches was planned to take advantage of a rare alignment of the outer planets that would not occur again for another 175 years. This alignment allowed the probes to exploit the gravitational pull of each planet to alter their trajectory and speed, slingshotting towards the next planet. The probes only needed to carry enough fuel to get to Jupiter, where they received a drastic speed boost, which significantly reduced the mission's costs and flight time. The speed increase was considerable: by slingshotting around Jupiter, the Voyager probes sped up by approximately 35,700 mph. As dictated by the law of momentum conservation, Jupiter slowed in response, by about one foot per trillion years.

The Voyager probes passed Jupiter in 1979, taking photos of its Great Red Spot

Voyager 1's flyby of Jupiter provided the first evidence of an extraterrestrial active volcano, with the probes photographing volcanic plumes extending to over 190 miles above the surface of Jupiter's fourth-largest moon, Io. This geological activity is due to friction from the stretching and squeezing of Io's crust as it orbits around Jupiter. Voyager 1 was also the first spacecraft to photograph a thin, dusty ring surrounding Jupiter, which at the time made Jupiter the third planet known to have a ring system, after Saturn and Uranus.

Because Voyager 1 had been directed to a close flyby of Titan, Saturn's largest moon, its trajectory was deflected such that it was heading out of the orbital plane of the planets, and it was unable to continue further planetary exploration. However, its journey was far from over: having completed its primary mission, it was given a second, extended mission to explore past the outer planets of the Solar System, beyond the Sun's protective heliosphere, and out into interstellar space.

Voyager 1 has the accolade of being the most distant man-made object from Earth. As of June 2014, it had travelled over 12 billion miles, making it over three times further away than Neptune, the outermost planet in our Solar System. Despite this vast distance, NASA engineers are still in constant communication with the spacecraft, both to receive the measurements it takes and to provide course corrections and software updates to its onboard computer. At such a distance, each radio signal from Voyager 1 takes over 17 hours to reach Earth and is so weak when it arrives that receiving it requires a worldwide network of giant radio antennae up to 70 metres across, as well as orbiting satellites, called the Deep Space Network.

Voyager 1 was not the first probe to be sent to outer space: Pioneer 10 and 11 were launched in 1972 and 1973 respectively, and Voyager 2 was launched two weeks earlier than its sister probe. However, because Voyager 1's flyby of Titan brought it much closer to Saturn than Voyager 2, it gained a greater increase in speed from its slingshot. As a result, Voyager 1 is travelling at around 38,000 mph, compared to around 34,500 mph for Voyager 2, and much faster than the Pioneer probes. So in 1998, despite being launched five years later, Voyager 1 overtook Pioneer 10 to become the most distant man-made object, an achievement that will not be surpassed in the foreseeable future.

Voyager 1 left the Sun's heliosphere and entered interstellar space in August 2012

On 12th September 2013, NASA declared that the Voyager 1 probe had entered interstellar space, 36 years after starting its journey. This was determined from the density of the ionised gas, called plasma, which is present in the space between the stars; within the protective 'bubble' of the heliosphere, there is a much lower density of interstellar plasma. As Voyager 1 doesn't have an active plasma sensor, it had to determine this density indirectly, through a massive solar flare in March 2012 that ejected an enormous amount of plasma into the Solar System. When this ejection reached Voyager 1 in April 2013, it made the surrounding plasma vibrate. By comparing the pitch of these oscillations to previously recorded pitches, NASA scientists determined that Voyager 1 was travelling in plasma over 40 times denser than that within the outer boundary of the heliosphere. By looking back at previous measurements, they further found that the change in density indicative of passing through the heliopause (the heliosphere's boundary, where plasma from the Sun's solar winds meets the stellar winds from surrounding stars) and entering interstellar space had occurred back in August 2012. Sound files of these oscillations can be found on NASA's Voyager website.

Contrary to proclamations in the media, Voyager 1 has not officially left the Solar System. The boundaries of the Solar System are said to extend to where the Sun's gravity is no longer a dominant force, which is difficult to define. However, the interstellar space beyond the heliosphere is still well within the Sun's gravitational influence, which is considered to extend all the way out to the vast Oort cloud. This hypothetical cloud of icy matter is thought to be the source of all long-period comets, and lies between 2,000 and 50,000 AU from the Sun (1 AU is the average distance between the Earth and the Sun, approximately 93 million miles, used by astronomers to measure distances at the scale of the Solar System). Given Voyager 1's current location and speed, it is not expected to reach the Oort cloud for another 200 years, and not expected to pass through it for another 14,000 years!

In 14,000 years Voyager 1 will pass through the Oort cloud and leave the Solar System

The Voyager probes only have enough power to last until approximately 2026, so we will unfortunately not get any indication of when they pass into the Oort cloud. However, they may still have a grander purpose for humankind: in the absence of air resistance, the probes will continue to drift through the vacuum of space at their current velocity until they either hit an object or are found by an intelligent species. In the vast expanses of space, hitting another object is unlikely, even when passing through the Oort cloud. Therefore, placed aboard the spacecraft are gold-plated audio-visual discs carrying photographs of Earth and its life forms; sounds of whale calls, babies crying and waves breaking on a shore; music from various cultures and eras; and many mathematical and physical quantities. These Golden Records are humankind's 'message in a bottle', released on the minuscule chance that an intelligent space-faring life form from another planetary system comes across them, letting them know that they're not alone in the Universe.

The Golden Records placed aboard the probes are humanity's 'message in a bottle'

Simon Watson is a post-doctoral researcher at the Wellcome Trust Sanger Institute.
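Two of the figures above are easy to sanity-check with one-line physics: the momentum-conservation claim about Jupiter, and the 17-hour signal delay. A back-of-envelope sketch in Python; Voyager 1's approximate 825 kg launch mass and Jupiter's mass are assumptions not given in the article.

    FOOT, MILE, MPH = 0.3048, 1609.34, 0.44704   # metres, metres, (m/s) per mph

    # Momentum conservation during the slingshot:
    # m_probe * dv_probe = M_jupiter * dv_jupiter
    m_probe = 825.0                    # kg, approximate launch mass (assumption)
    M_jupiter = 1.898e27               # kg
    dv_probe = 35_700 * MPH            # the article's ~35,700 mph boost
    dv_jupiter = m_probe * dv_probe / M_jupiter
    trillion_years = 1e12 * 365.25 * 24 * 3600   # seconds
    print(f"Jupiter slows ~{dv_jupiter * trillion_years / FOOT:.1f} ft per trillion years")

    # One-way light-travel time from 12 billion miles:
    print(f"signal delay: ~{12e9 * MILE / 2.998e8 / 3600:.1f} hours")

Running this gives roughly 0.7 feet per trillion years and about 17.9 hours, consistent with the article's "about one foot per trillion years" and "over 17 hours".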
I heal the body electric
Joy Thompson uncovers the importance of bioelectricity in medicine

Michael Levin's research group at Tufts University can make tadpoles grow eyes in their gut. This might sound like a bizarre drug side effect, but the reality is simpler, and more profound. All that the researchers in Levin's group did was alter the electrical properties of gut precursor cells. Such cells were previously thought to develop only into the gut, and to be unable to form other tissues. Amazingly though, the 'electrified' gut cells proceeded to develop into eyes unaided. Levin's results are the first conclusive proof that electrical signals can be a master control switch in organ formation. They have also stimulated renewed interest in the use of electricity to treat disease.

The idea that electricity is fundamental to life and health has haunted the scientific and popular imagination since at least the 18th century. The story of bioelectricity, as we now call it, began with an Italian physician, Galvani. He discovered that the legs of a dissected frog would kick as though alive when an electrical spark was applied. From the 19th century onwards, electrotherapeutic devices were widespread in Western countries, and they are still used in the beauty industry, though not always regulated or scientifically justified. The 19th century also produced one of the most memorable occurrences of bioelectricity in fiction: the scientist Frankenstein, who galvanised his creature into life using an electric current.

Electricity can move a frog's legs after death or change how their body develops

So what is bioelectricity, exactly? We now know that all the cells in our bodies have a voltage, or difference in charge, across their outer membranes; in other words, cells are tiny batteries. This voltage is called the 'membrane potential' and it is generated from differing concentrations of ions inside and outside the cell, usually sodium, potassium and chloride. Only living cells have a membrane potential: when a cell dies, the difference in charge promptly collapses, showing that electricity really is a crucial part of life!

The membrane potential has two main functions: to set cell identity and to help cells communicate. Cells within a tissue or organ have a specific value for their membrane potential. For instance, in Levin's tadpoles the membrane potential of eye precursor cells is normally greater than that of gut precursor cells. The gut precursor cells could only form eyes when their membrane potential was artificially increased to an 'eye-like' level. The most powerful example of cell communication is our nervous system, which transmits information around the body using electrical impulses. These signals are in fact rapid and transient changes in the membrane potential that propagate down our nerve cells to their targets.

Neurons fire based on electric potential

Electrical potential can be observed using Tesla coils

Given bioelectricity's importance in the nervous system, it is not surprising that its first therapeutic use in modern medicine began with the brain. Some initial approaches, particularly early electroconvulsive therapy (ECT), were crude. ECT was used to treat psychological disorders and involved passing an electric current across the hemispheres of the brain. It is now just as much a part of the literary imagination as Frankenstein and his galvanism: the experiences of ECT patients in the early to mid-20th century have inspired several novels (think The Bell Jar and, of course, One Flew over the Cuckoo's Nest). Despite their grim past, therapeutic approaches like ECT are still used today. ECT is now done under anaesthetic, using small and carefully controlled electrical currents. For many patients with severe depression which does not respond to drugs, it is the only treatment that works. Similarly, deep brain stimulation is a treatment which delivers electrical pulses to a specific brain area through an electrode. It is one of the only treatment options for Parkinson's disease patients whose symptoms do not respond to drugs.

Electric shock therapy has come a long way since treating shell-shock in the First World War

We are beginning to understand how a simple electric current can have such dramatic effects on our health, from re-setting a person's mood to stopping muscle tremors: many cells switch genes on or off in response to electrical signals. Still, the interaction between gene activity and bioelectricity is complex and often unpredictable; we are a long way from a complete understanding. Since bioelectricity interacts with basic cellular functions like gene expression, it is important for maintaining all the body's systems, not just the brain and nervous system. As the biomedical community has started to accept this idea, therapeutic applications for bioelectricity have increased.

Electricity has been used to treat rheumatoid arthritis, a painful condition where the immune cells that normally protect us from infection are overactive and attack healthy tissue, in this case the limb joints. The disease can be practically reversed by electrical stimulation of the vagus nerve. This might seem strange given that this nerve sends signals to internal organs like the gut and spleen. However, the spleen is also one of the training centres for immune cells, and is responsible for priming immune cells to attack specific targets. It is thought that slight changes in the electrical signal carried by the vagus nerve were enough to help the rheumatoid patients' spleens function normally again. This experimental treatment was so successful that it generated a start-up company, SetPoint, and has progressed to an equally successful clinical trial.

Bioelectricity has even been implicated in cancer and regenerative medicine. Researchers are finding that the membrane potential in cancer cells is abnormal. This is generating interest in 'electroceuticals': prospective drugs and diagnostics that target the electrical properties of cancer cells. Dyes that mark changes in membrane potential within living tissues are already available for laboratory use, and indeed were used in Levin's ground-breaking tadpole experiments. Similar techniques could be developed to visualise tumours in cancer patients. Other work, again in tadpoles, has shown that we can control limb regeneration just by manipulating or blocking specific electrical currents. Perhaps in the future, electrical stimulation could be used to build simple tissues or organs in culture for transplant patients.

The story of bioelectricity has traced a long and colourful path through history, science and literature. Levin's tadpoles mark a new chapter in this story. They are helping show us the way to innovative electrical therapies which, if successful, could revolutionise many areas of modern medicine. This is electrifying stuff indeed; both Galvani and Frankenstein would probably have been proud.

Joy Thompson is a 1st year PhD student in the Department of Physiology, Development and Neuroscience.
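How the ion concentration differences described above produce a membrane voltage is conventionally quantified by the Nernst equation (not named in the article): E = (RT/zF) ln([ion]outside / [ion]inside). A minimal sketch, using textbook-typical mammalian ion concentrations as assumptions rather than figures from the article:

    import math

    def nernst_mV(conc_out_mM, conc_in_mM, z=1, temp_K=310.0):
        """Equilibrium potential (mV) for one ion species across a membrane."""
        R, F = 8.314, 96485.0        # gas constant, Faraday constant (SI units)
        return 1000 * (R * temp_K) / (z * F) * math.log(conc_out_mM / conc_in_mM)

    # Typical mammalian concentrations (assumptions): K+ high inside, Na+ high outside.
    print(f"K+:  {nernst_mV(5, 140):+.0f} mV")    # ~ -89 mV
    print(f"Na+: {nernst_mV(145, 12):+.0f} mV")   # ~ +67 mV

A resting cell's membrane is mostly permeable to potassium, which is why resting membrane potentials sit near the strongly negative potassium value; shifting which ions can cross the membrane is, in essence, what 'electrifying' Levin's gut precursor cells changed.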
Measles: The return of an 'eliminated' virus
Sarah Smith investigates the return of measles and developments in vaccines

The only infectious disease to have been eradicated world-wide is smallpox. It took an enormous international effort and a highly effective vaccine. Virus eradication is always the ultimate goal for any vaccination programme, as it is the only way to ensure a virus doesn't re-emerge. Measles is a respiratory infection caused by a morbillivirus and, like many viruses, it is spread by coughing and sneezing, and is highly contagious. Infection causes a very high fever and a distinctive spotted rash that covers the whole body. Complications during infection can occur in many children: one in 10 get an ear infection, one in 20 get pneumonia, and one in 1000 die. However, according to scientists, measles is another potentially eradicable viral infection.

Smallpox is the only virus to have been eradicated world-wide

The first measles vaccine was introduced in 1963 by John Enders and colleagues; before then, almost everyone contracted measles before the age of 15 and the virus killed nearly 500 Americans every year. Subsequently, a very effective combined measles, mumps and rubella (MMR) vaccine was introduced. By 1999, the virus was considered to be eliminated from the United States. Fifteen years later, however, the Centers for Disease Control and Prevention (CDC) reported that the US has seen its highest number of measles cases (514) since the elimination, and that's only those reported in the first six months of this year! Measles is mainly considered a childhood disease, but 25 per cent of these reported cases were in 5-19 year olds and 52 per cent were in people over 20 years of age. Although no measles-related deaths have been reported so far this year, several patients have required hospitalisation due to complications.

Successful vaccination programmes need not rely on 100 per cent uptake: 'herd' or 'community' immunity occurs when a significant proportion of the population is vaccinated against a disease, so that the few who are unvaccinated remain uninfected. This is because chains of infection cannot be maintained through the population. Herd immunity protects people who cannot be vaccinated, for example because of egg allergies (many vaccine strains of viruses are grown in chicken eggs). This immunity can only be maintained if more than 85 per cent of the population is vaccinated, to decrease the likelihood of these chains of infection.

Herd immunity can protect those people who cannot be vaccinated

Of the measles cases in the US this year, 363 occurred as part of an ongoing outbreak in an unvaccinated Amish community in Ohio. The contagiousness of the virus, coupled with the lack of herd immunity, means that infections in this group are very likely. However, isolated communities in the United States are not alone in these measles outbreaks: in 2012 there were more than 2000 cases of measles reported in England and Wales, with the largest epidemic centred in Swansea.

Many Amish communities don't believe in vaccinating children
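The 'more than 85 per cent' figure above is the low end of what epidemiology predicts for measles. The standard rule of thumb (not spelled out in the article) derives the herd immunity threshold from the basic reproduction number R0, the average number of people one case infects in a fully susceptible population; for measles, R0 is usually quoted in the 12-18 range. A minimal sketch:

    def herd_immunity_threshold(r0):
        """Fraction of the population that must be immune so that each case
        infects fewer than one other person on average: 1 - 1/R0."""
        return 1 - 1 / r0

    # R0 ~ 12-18 is the commonly quoted range for measles; 6 shown for contrast.
    for r0 in (6, 12, 18):
        print(f"R0 = {r0:2d}: ~{herd_immunity_threshold(r0):.0%} immunity needed")

This gives roughly 92-94 per cent for measles, which is why measles vaccination targets are usually set around 95 per cent coverage, comfortably above the article's 85 per cent floor.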
However, herd immunity does not protect the unvaccinated if they leave the herd. Although the MMR vaccine is not immediately brought to mind when considering travel vaccinations, 97 per cent of the recorded infections in the US were linked to foreign travel, particularly to the Philippines. In 2013, 6497 measles cases were reported in the Philippines, 23 of which were fatal. Although some vaccinations are missed due to ignorance or poor health care, some parents opt to excuse their children from vaccination on the grounds of 'philosophical exemption', that is, they do not agree with vaccinations. Children who have not been vaccinated are relatively safe whilst still in Western countries, but once they embark upon an ever-popular gap year and travel to developing countries, their risk of contracting measles increases dramatically.

Measles virus infection causes a distinctive red rash

The philosophical exemption argument against the MMR vaccination was stirred to life in 1998 by Andrew Wakefield, a physician and researcher. He published an article in The Lancet medical journal linking the administration of the MMR vaccine with the onset of behavioural problems, including autism, in 8 of 12 children. Wakefield actually suggested the use of single vaccines instead of the combination, although much of the press coverage failed to report this accurately. Subsequently, much larger studies have been carried out to try to replicate Wakefield's findings. One of the largest followed over 537,000 Danish children, 82 per cent of whom had been vaccinated. Although 316 children had a diagnosis of autistic disorder, the authors found no association between vaccination and the development of autism. Following this study, and many others, Wakefield has been struck off the UK medical register and The Lancet fully retracted the paper in 2010. Even with all this new evidence, there is still a stigma associated with the combined MMR vaccine; a seed of doubt had already been sown.

Contrary to popular opinion, the anti-vaccination movement is not a new one. After Edward Jenner showed that exposure to cowpox could immunise children against infection by the related but more dangerous smallpox virus, vaccination of babies by the age of three months was decreed as law in the UK under the Vaccination Act of 1840. From its introduction, people opposed the law, and the Anti-Compulsory Vaccination League was founded in 1867. A huge anti-vaccination demonstration was held in Leicester in 1885, which attracted 100,000 people and led to an amendment of the law in 1898. The 'cumulative penalties' sub-clause was removed and a 'conscience' clause was introduced. This allowed parents who did not believe vaccination was efficacious or safe to obtain a certificate of exemption; this act introduced the concept of the 'conscientious objector' into English law.

Vaccination of babies against smallpox by the age of three months was decreed as law in 1840

Perhaps these 19th-century conscientious objectors were justified in their fears; controlled clinical trials had not taken place and the vaccine had not been thoroughly tested. However, today's anti-vaccination campaigners do not have the same arguments to fall back on: vaccines are among the most thoroughly scrutinised medicines developed nowadays. They undergo four phases of clinical trials over several years, the last of which involves several thousand participants. Some anti-vaccine campaigners argue that the number of vaccines given to children in such a short time is problematic, and that they should be spaced out so as not to 'overwhelm' the baby's immune system. However, the antigens presented to a child by vaccinations are a tiny minority compared to the number of bacteria and viruses that a child is exposed to from its parents, a family pet and the environment. The immune system's job is to respond to foreign material, and some studies suggest we make over a million specific antibodies in our lifetime. In the UK, it is advised to immunise your child against eleven different diseases in the first thirteen months, including measles, mumps and rubella. Vaccines can have some mild side effects, such as swelling or tenderness at the site of injection, but not having a vaccine is far more frightening, especially if you're about to embark upon a gap year abroad.

Sarah Smith is a 4th year PhD student at the Wellcome Trust Sanger Institute.
DAISY HESSENBERGER
You are your genes and your environment
Alex O’Bryan-Tear discusses the long-standing nature versus nurture debate the question of how much nature and nurture
contribute to an individual’s characteristics exerts a special fascination for people. A generation ago, the question was expressed as a debate between the ‘nativist’ school of thought, arguing that the environment is all but unable to influence the destiny written into our genes, and the alternative ‘tabula rasa’ idea that each child is born as a blank slate, waiting to have their identity stamped on them by external forces. Today, research in the field has become more nuanced than this, assuming that both nature and nurture play an important role and aiming to establish how much each mechanism contributes to a particular trait. However, even this attitude, as reflected in popular imagination, misses the point by assuming that nature and nurture make distinct and separable contributions to a given trait. The truth is that an individual is wholly determined by their genes and wholly determined by their environment, at the same time. Let’s start with a typical line of research in this field: a twin study, aimed to establish how much
IQ is determined by genes and how much by the environment. A group of monozygotic, or identical, twins is compared with a group of dizygotic, or fraternal, twins. Monozygotic twins share 100 per cent of their genes, whereas dizygotic twins are likely to share only as much of their genes as any pair of siblings – an average of 50 per cent. By comparing the level of variability in IQ across these two groups, researchers hope to establish how much of this variation can be attributed to the shared genes and how much must be attributed to the environment.

Twin studies have been used to try to establish links between genes and intelligence

Of course, there are many much-discussed flaws with this approach. For example, twins are also likely to share large and difficult-to-quantify amounts of their environment with one another, such as growing up under the same roof with the same parents. There are ways of solving some of these problems: if you happen to be one of those rare sets of twins adopted into different families, you're likely to be very popular with genetic researchers. Methodology aside, however, the more important question is whether such a distinction between nature and nurture can be quantified in the first place.

Let's say we have a person with an IQ of 120. This is 20 points higher than we would expect – 20 points that need to be attributed to either nature or nurture. Let's also say the heritability of IQ has been found to be 0.75; that is, 75 per cent of the variance in IQ can be attributed to genes (a similar figure to that found in many studies). This may seem to paint the picture that 75 per cent of an individual's variability from the norm in IQ is the result of their genes, and the remaining 25 per cent the result of their environment. Perhaps, then, our case-study individual gained 15 points of their IQ from their genes, and the remaining 5 points from, say, reading the intellectual books scattered around the house, listening to Mozart in their sleep and their top-drawer education. This intuition – that nature
and nurture combine additively to produce the individual variation – is what lies at the heart of the nature/nurture divide. But nature and nurture simply can't be divided into separate factors in the neat way the additive model implies, as a look at gene-environment interaction shows us.

There are many ways that a genetic trait can interact with the environment it's placed within. One simple example in intelligence is that a genetic predisposition towards intelligence may reinforce itself by selecting environments that boost it further. In other words, a slightly more intelligent than average child will, over time, become exposed to opportunities to increase their intelligence. Praise, encouragement and, of course, streaming and a selective education system all contribute to exaggerating small variations in intelligence until they become much bigger. Another example is that genetic predispositions towards other traits, such as good concentration skills or a desire to impress authority figures, may set an individual up to thrive in the educational environment, leading to a boost in IQ.

These kinds of gene-environment interactions don't necessarily undermine our whole understanding of the nature-nurture divide. Although the route from genes to intelligence has become muddied, the basic notion of genes coding for intelligence remains intact. A more far-reaching effect of gene-environment interaction was posited by Ellis and Boyce upon discovering that some genes seem to dictate the extent to which an individual responds to their environment at all. This is a pattern that has been identified in intelligence as well as in other psychological traits such as optimism, alcoholism and propensity to mental illness. To illustrate it, Ellis and Boyce made a distinction between two types of children: 'orchid' and 'dandelion' children. Orchid children are those genetically predisposed to be sensitive to their environment: placed in optimal conditions they thrive, but anywhere else they suffer. Dandelion children, meanwhile, do pretty well in any environment they are placed in; their conditions don't matter as much.

Children can be categorised as either 'orchid' or 'dandelion' children
Genes don't dictate traits in themselves; they dictate how the environment should mould those traits. The two factors combine interactively, not additively. It's a simple point, but one that shakes the assumption that the influences of genes and environment can be separated meaningfully. To return to our earlier case study, let's imagine going back and eliminating all the positive environmental influences on intelligence: the books, the schooling, the positive role models. The additive model, assuming a heritability of 0.75, would predict that this child's IQ would come out at 115 instead of the earlier 120: the environmental component is gone, but the genetic component is left intact. The interactive model, on the other hand, suggests that the effect on this child depends on whether they are an orchid or a dandelion. If they are an orchid, their IQ may plummet by far more than 5 points; if a dandelion, it might remain unchanged. The additive model of genes and environment simply doesn't stack up.

Any gene requires interaction with a certain kind of environment in order to manifest itself. For instance, a baby with the genes for an intelligence of 95, a height of 180 cm and a penchant for French indie films won't acquire any of those traits if it's born on the surface of Mars. We need to throw out this assumption of a linear stacking effect of genes and environment, and replace it with a multiplicative one.

Genes for an interest in French film won't necessarily translate to that trait in all environments

Under this new framework, the question of whether it is nature or nurture that contributes more to an individual's traits stops making sense. In the words of Donald Hebb, this question is equivalent to asking "Which contributes more to the area of a rectangle, its length or its width?" The answer is that the area of a rectangle is wholly dependent on its length and wholly dependent on its width. Both are essential for the rectangle to have any area at all, and without knowledge of both, it's impossible to guess what the final product will be. In the same way, it's impossible to examine nature and nurture separately; both are essential to the development of an individual.
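For readers who like to see the argument made concrete, the contrast between the two models can be captured in a few lines of code. This is a purely illustrative sketch: the function names, the numbers and the 0-to-1 'environment quality' scale are invented for this example, not taken from any real study.

# Toy comparison of the additive and interactive models of IQ discussed above.
# All figures are illustrative, not estimates from real twin studies.

BASELINE_IQ = 100

def additive_iq(genetic_points, environmental_points):
    # Additive model: genes and environment stack linearly. With a
    # heritability of 0.75, a +20-point deviation splits into +15
    # genetic and +5 environmental points.
    return BASELINE_IQ + genetic_points + environmental_points

def interactive_iq(genetic_potential, environment_quality, sensitivity):
    # Interactive model: the environment scales how much genetic potential
    # is expressed. environment_quality runs from 0 (deprived) to 1
    # (optimal); sensitivity is high for 'orchids', low for 'dandelions'.
    expressed = genetic_potential * ((1 - sensitivity) + sensitivity * environment_quality)
    return BASELINE_IQ + expressed

# Additive: stripping away the enriched environment costs exactly 5 points.
print(additive_iq(15, 5))  # 120 in the enriched home
print(additive_iq(15, 0))  # 115 with the environmental boost removed

# Interactive: the same change devastates an orchid but barely moves a dandelion.
for child, sensitivity in [('orchid', 0.9), ('dandelion', 0.1)]:
    print(child, interactive_iq(20, 1.0, sensitivity), interactive_iq(20, 0.0, sensitivity))
# The orchid falls from 120 to 102; the dandelion only from 120 to 118.

The sketch makes the article's point directly: once sensitivity enters the equation, no single 'per cent from genes' figure describes any individual child.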
Alex O’Bryan-Tear is a 3rd year PhD student in the Department of Psychiatry.
FOCUS
GM Crops: Feeding the Nine Billion
BlueSci explores the debate surrounding GM crops and whether current legislation hinders progress

The human population is predicted to rise to around nine billion people by 2050. The current global estimate of malnutrition has already reached 8-12 per cent, with some sub-Saharan regions reaching levels as high as 25 per cent (UNICEF, WHO). This population growth alone will place a major strain on resources, yet even conservative predictions of climate change suggest a drastic impact on the productivity of arable land. Water availability will become more disparate, with arid regions becoming even drier and unsuitable for agriculture, whilst temperate regions will face an increased risk of flooding. Similarly, the yield of many crops is reduced under sustained temperatures above 30 degrees centigrade, a situation likely to become common in some regions.

The combination of a rising population and a dwindling availability of arable land is worrying in itself, but damage by herbivory is also a major present-day problem. If such biotic stress could be combated, less land and water would be required to sustain the global population, as yields would rise. There is therefore a demand for stress-tolerant crops that can cope with low water availability and high temperatures whilst also resisting biotic stress. Genetic modification (GM) is one way to create more tolerant crop varieties, but these techniques are currently perceived negatively by the public and are thus strictly regulated. Such broad-scale restriction on GM limits progressive research; here we ask, is this really the safest solution, or should each new GM crop be assessed individually for safety, in a process-agnostic manner?

GM is defined as the intentional modification of the characteristics of an organism by manipulation of its genetic material. Fundamentally, this means that DNA from a donor organism is inserted into the DNA of a recipient organism, conferring the ability to produce a protein which the recipient could not previously make. This protein then either goes on to create a useful substance or to carry out a function that the recipient organism could not previously achieve. Research in the late 1980s focused mainly on the possible pharmaceutical applications of such modifications; however, the controversy surrounding GM crops really took hold in Europe in 1996, following the arrival of GM soya beans from the US.

Genetic modification, in the form of artificial pollination, has been practised for hundreds of years

The soya shipments that sparked the controversy were mixtures of several varieties, including up to 2 per cent of a GM variety named 'Roundup Ready'. The alteration of this soya variety renders it resistant to glyphosate, the active ingredient in a herbicide called 'Roundup'; the modification therefore allows farmers to kill weeds with 'Roundup' without killing their crops. Soya is mainly processed to generate vegetable oil, a by-product of which is high-protein animal feed. Member countries of the European Union (EU) are far from self-sufficient in the generation of high-protein animal feed, and hence around 85 per cent of it is now GM labelled; according to the Food Standards Agency (FSA), such labelling is a legal requirement for imported feed containing more than 0.9 per cent GM content.

Approval for the use of GM soya in food and livestock feed in the UK was gained in 1995 and, less than a year later, an application was submitted to the European Commission and approved. With the green light given for GM soya, companies producing it in the US sought to branch out into the European market. The approval came amidst significant disagreements between EU member states about the labelling of products. As a result of these disagreements, legislation requiring rigorous proof of safety and appropriate GM labelling did not come into force until January 1997, with the 'Novel Food Regulation'. This legislation relates to foods that do not have a significant history of consumption or that are generated by methods not previously employed for food. However, the soya arriving in the autumn of 1996 preceded this regulation and hence was legally sellable without being labelled as a GM crop.

It was arguably this mixed shipment of soya that subsequently shifted the public perception of GM in the UK so drastically. During the period of application in the UK there was no official parliamentary debate, and only fifteen articles were published in the press. When it emerged that GM soya was being shipped mixed in with traditionally-derived crops, several organisations actively tried to prevent the shipments from reaching the European market.
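As a minimal illustration of how such a labelling threshold operates in practice, consider the sketch below. Only the 0.9 per cent figure comes from the regulation cited above; the function name and the example blends are hypothetical.

# Hypothetical sketch of the 0.9 per cent GM-content labelling threshold
# for imported feed described above; the example figures are invented.

GM_LABEL_THRESHOLD_PERCENT = 0.9

def requires_gm_label(gm_content_percent):
    # Labelling is required when GM content exceeds the threshold.
    return gm_content_percent > GM_LABEL_THRESHOLD_PERCENT

# A mixed shipment containing 2 per cent 'Roundup Ready' soya must be
# labelled; a trace-level blend need not be.
print(requires_gm_label(2.0))  # True
print(requires_gm_label(0.5))  # False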
Controversy over US-imported GM soya in animal feed led to restrictive GM legislation in the EU
These organisations claimed that clear labelling of GM crops, in line with public opinion, would not be possible given the mixture of GM and traditional varieties in the shipment. Part of the issue arose because the perception of risk is more influential than empirical risk: involuntary risk is perceived by consumers as more threatening than voluntary risk, where labelling applies, even if the genuine risk is identical. Hence the lack of clear labelling of these crops as GM was perceived as threatening regardless of the genuine risk of the product itself. This in turn generated vast amounts of media attention, with the safety and environmental impact of GM crops being brought into question in over a hundred articles. Public confidence in government regulation of food safety had already been damaged by the Bovine Spongiform Encephalopathy (BSE) or 'mad cow disease' epidemic, and hence greater attention was brought to bear on food standards.

Initially, the industry sought to allay fears regarding the safety of 'Roundup Ready' soya. Nevertheless, the discord quickly snowballed into concerns about the morality of all biotechnology, with allegations of "playing God" stifling the debate rather than focusing it on product safety. At this point the industry changed tack and instead sought to champion GM as the solution to the future problems of food security discussed above. Unfortunately, the damage was already done: this incident planted a seed of doubt about GM crops within the UK which would grow into the highly restrictive policy environment we still have today.

Despite this negative media attention, subsequent legislation has permitted the import of various GM crops into the EU, including cotton, maize and sugar beet. However, cultivation of GM crops is still strictly regulated in the EU, with only one crop approved for cultivation as of July 2014 (the maize MON810, which is modified to kill herbivorous pests). Several crops, on the other hand, have been permitted for commercial cultivation in other countries, with generally positive impacts. As it stands, over 12 per cent of global arable land is used to cultivate GM crops.

The benefits of GM crops can be exemplified by case studies of GM cotton and maize. Both species have been genetically modified to generate varieties containing one or more of a group of crystal proteins derived from Bacillus thuringiensis (Bt). These proteins act as natural pesticides and have been used in agriculture for around ninety years via direct spraying, either of the protein itself or of spores of the bacterium, which subsequently produce these crystalline proteins, toxic only to insects. During the 1950s the protein was isolated and, in a move away from synthetic pesticides, the use of Bt proteins became popular amongst farmers. The protein is also permissible in organic farming, as it is naturally derived and the stipulation for organic crops is the exclusion of 'synthetic' pesticides and herbicides. Hence GM crops containing this protein are backed by a wealth of safety data predating the GM crops themselves, and the only GM crop approved for cultivation in the EU, maize MON810, also carries this protein. Indeed, the European Food Safety Authority said of Bt-modified maize in 2009 that "…MON810 is as safe as its conventional counterpart with respect to potential effects on human and animal health."

Cotton plants have been modified to produce a natural toxin from Bacillus thuringiensis

At present, more than 50 million hectares of land globally are cultivated with Bt-modified crops, including both maize and cotton. In areas with previously severe pest problems, the introduction of Bt crops has drastically reduced insecticide use. In Arizona, Bt cotton was introduced alongside an integrated pest management system that uses knowledge of pest life cycles to improve the monitoring of arising problems, focusing particularly on cotton bollworm, the major pest threat. Between 1996 and 2008 this Bt cotton led to a 70 per cent reduction in insecticide use, with an estimated total saving of $200 million. However, the use of Bt cotton reduces competition upon other pests that also target the crop, and in other introductions this has presented problems. Use in China did lead to increased yield, and initially insecticide use was reduced, but an integrated pest management system was lacking. The lower insecticide spraying reduced the environmental pressure on mirids, insects that also target cotton crops, and insecticide use subsequently rose again, to only 17 per cent below pre-introduction levels. Nevertheless, this smaller reduction still lowered farmers' exposure to insecticides and positively impacted insect biodiversity, along with increasing yields and farmers' profits. It is also important to note that growing Bt cotton simultaneously reduces pest outbreaks on neighbouring farms growing non-GM crops, consequently raising their yields too. Hence, with effective pest management, GM crop introductions can provide large benefits to farmers and their environment.

Bt-modified crops give protection against cotton bollworm, a major agricultural pest

Yet the introduction of Bt cotton highlights the need to assess critically all the impacts of any novel crop brought to market; the environmental, economic and societal impacts of GM must all be taken into account before the crop is cultivated. A more rigorous integrated pest management programme in China would likely have produced benefits similar to those observed in Arizona; instead, the lack of continued management eventually led to reduced benefits and to farmers' dissatisfaction with the product.
Therefore, a GM crop introduction must also be sensitive to the particular ecosystem into which it is to be introduced and, more importantly, to existing agricultural practices.

Much of the focus on GM crops surrounds the idea that they are innately harmful, with traditionally-bred crops perceived as completely safe. However, traditional breeding is not without its pitfalls. In the late 1960s, breeders sought to create a potato with a high starch content, better suited to making crisps, resulting in the 'Lenape' variety. The variety was initially approved for cultivation in the US, but the approval was withdrawn seven years later: in achieving high starch, the traditional breeding method had also inadvertently more than quadrupled the production of a natural toxin called solanine (from 8 to 35 mg per 100 g). This is the same toxin produced when potatoes turn green, and it causes severe nausea, diarrhoea and, in extreme cases, even unconsciousness. In another, more recent case, severe skin rashes were observed in southern Israel on the hands of farm workers who had been harvesting traditionally-bred celery. It transpired that the variety carried twice the concentration of toxins called psoralens compared with other varieties, and these had induced the rashes. In light of such incidents, it is perhaps counter-intuitive that extensive, rigorous testing and control is not applied to all crop products, regardless of whether they originate from GM or from 'traditional' breeding.

At present, the process used to generate crop varieties is precisely where the regulatory distinction resides. Crop varieties may be produced by traditional breeding practices, in which farmers select favourable characteristics over several generations. This process can take a large number of generations to combine the beneficial traits of both original parents, and can result in the carry-over of undesirable genes, as in the examples above. Once grown, the new variety undergoes some minimal testing to ensure it consistently provides the proclaimed benefits, but its safety is generally considered equivalent to that of the original parent plants.

In contrast, the commonest method for the genetic modification of plants is transformation using the bacterium Agrobacterium tumefaciens. In nature, A. tumefaciens causes crown gall disease by inserting sections of bacterial DNA (known as T-DNA) into the host plant's genome, inducing massive growth and proliferation of the plant's cells. During the infection, the bacterium absorbs sugars
synthesised by the host plant. In the lab, this special ability to transfer DNA across taxonomic domains allows modified bacterial T-DNA to carry a particular gene of interest, for example the Bt gene discussed previously, which is then inserted into the plant genome. A. tumefaciens is therefore a vessel that facilitates the transfer of genetic material into a plant, producing a new variety relatively quickly and with no unintended gene carry-over.

Genetic modification of corn is achieved through the use of a bacterial intermediary

However, once any novel GM crop has been created, an exceptionally rigorous and strategic assessment must be made of the benefit it confers and of the safety of the new variety, both judged against standards significantly more stringent than those for traditionally-bred varieties. The steps taken depend on the intended use of the GM crop; for example, a crop that will not enter the human food chain need not be tested for food safety. Most importantly, proof of an added benefit over the conventional crop is needed. To this end, extensive field trials are usually required to provide statistically significant evidence that the GM technology delivers a reproducible benefit over other varieties. Along with the final legal registration of the GM crop within the EU, field trials represent one of the largest costs in commercialising a GM technology. Food and feed safety assessments are usually performed using laboratory techniques. Most transgenic plants contain a novel protein, and this introduced protein is the target of safety checks: acute oral toxicity studies are conducted, usually in mice, to check for allergenic or toxic risks to animals and humans, alongside longer-term feeding experiments to check the growth characteristics of animals fed the GM crop. As a result, the cost of taking a crop through all of these safety checks to market is prohibitive for all but the big agricultural companies. This perpetually stifles innovation in the field, because it prevents smaller firms from developing new crop varieties.

Currently, many governments struggle to keep pace with the rapid evolution of GM research. Policy-makers are elected by the public, and this is reflected in the science-policy legislation they produce. Communication between members of parliament and the public goes both ways, however: not only should science policy be based on public opinion, but policy should also have the opportunity to shape the very opinions that influence it so strongly. Rather than GM policy being purely reactive to both technologies and public opinion, policy-makers should strive to set an example for the future they hope to achieve. For instance, GM and GM-free labelling supports the view that transgenic and natural crops are different entities and should be treated as such. At present, we have a negative labelling system, in which the absence of GM constituents or processes is indicated (amongst other things) through the 'Organic' label, in contrast to a positive labelling system in which products would be specifically labelled as 'GM'. Although there may be arguments for labelling GM foodstuffs, the impact such labelling and differentiation has on public opinion should also be taken into account.

Unfortunately, negative opinions of GM crops, and the actions of environmental and consumer groups, have contributed to creating a 'wicked problem' for the introduction of GM crop technology in many countries: the expense and delay created by policy towards new GM technologies not only mean that these regions are unable to benefit from the advantages of GM crops, but also hold back technological advances. This is especially true for small biotech start-ups, which drive many of the advances in other fields, such as cancer research or drug delivery. The main deterrent to the spread of GM crops and GM biotechnology is consequently the policy choices made worldwide, rather than the crops' empirical safety.

Opponents of GM crops generally focus on food safety, driven by the persistent perception that these foods are 'unnatural' and therefore dangerous. This goes back to the earliest controversy in 1996, in which the perceived dangers of GM soya first arose. Some recent studies have reported correlations between cancer and GM crop intake; though initially retracted, these have since been republished. Predictably, such studies have made a huge impact on public opinion, despite their general renouncement by the scientific community for poor experimental design and a lack of appropriate statistical analyses. Conversely, there is a plethora of evidence, from long-term ingestion studies and from actual human consumption of GM crop products since 1996, showing that the food safety risks posed by GM crops are no higher than those posed by non-GM equivalents; yet this seems to have failed to reach the public consciousness. The problem therefore lies in how to improve public understanding of GM crop introductions in light of existing perceptions of risk, which lead to broad public rejection of all introductions and in turn restrict policy decisions.

The most important question is whether to treat GM crops any differently from other crops. Non-GM crops are also highly regulated, and if GM crops are assumed to be equivalent, they should undergo the same treatment. The American Association for the Advancement of Science, in keeping with the general scientific consensus, stated in 2012 that "foods containing ingredients from GM crops pose no greater risk than the same foods made from crops modified by conventional plant breeding". Hence one would
expect GM crops to undergo the same treatment as other, non-GM crops. In some countries, such as the US, the truth is close to this expectation. As usual, however, novel technology brings novel suspicion, especially when it is not fully understood by the general public, and so the unknown or unpredicted dangers of GM crops dictate policy decisions in most countries.

For policy to address existing GM crop fears adequately, it may be beneficial to separate out the different sectors in which GM policy can act: intellectual property rights, biosafety, trade, food safety and consumer choice, and public research investment. Once the different areas are defined, policy-makers in each country can decide which avenue to take in each sector to maximise benefits for their own country's requirements, whether promotional, permissive, precautionary or preventative. Ultimately, these policy decisions essentially determine how difficult and expensive it is to bring a GM crop to market globally.

Unlike the US, the EU has very strict GM regulations, stifling technological innovation

The policy climate in Europe is probably the most restrictive in the world. To overcome the long-standing policy deadlock in the EU, a safeguard clause was included in legislation in 2001 that allows Member States to restrict or prohibit the use and/or sale of GM organisms in their territory (Article 23, Directive 2001/18/EC). Since the clause was introduced, eight countries have banned the cultivation of the maize MON810 discussed previously, and member states have been allowed to impose even more prohibitive GM policy choices. Under co-existence regulations, they may also designate GM-free zones, effectively allowing them to ban GM crops without even invoking the safeguard clause. Consequently, these policy disagreements between European member states lead to a highly fractured legal framework at the EU level. Sir David Baulcombe, Regius Professor of Botany at Cambridge, believes that "Food and agriculture are too important to be held up behind a red flag. We need minds that are open to new technology in agriculture and an end to the Punch and Judy of GM. It's time to start talking seriously about a new way forward for crops and how we grow them." Unfortunately, current EU policy means that many member states are folding under social pressure, which in turn negatively impacts broader EU policy. The limitations imposed, and the resultant costs of commercialisation, prevent development by small biotechnology companies, leaving GM crop production the preserve of big business. This stifles progress in the field because it reduces global commitment to developing the technology. The restrictive legislation further compounds negative public perception: by exhibiting legislative reservation in accepting these crops, it reaffirms continued concerns over their societal, environmental and health impacts.

GM crop policy should be sensitive to the environment, both natural and figurative, into which this emerging technology is to be introduced. To that end, the policy decisions made by a country should take into account its own agricultural challenges, economic aims, research capacity and public opinion. Developing countries, which likely face a stronger imperative to alleviate agricultural challenges whilst having lower research and regulatory capacity, should not seek to fit either the US 'promotional' model or the EU 'restrictive' model. Instead, for each sector of GM crop regulation, countries should decide carefully where along this spectrum they lie, in order to benefit safely from the advances being made by scientists. More importantly, just as each sector of GM crop policy should be treated separately, each GM product should be treated as a singular case. Rather than "Yes to GM" and "No to GM", we should look at each case individually for the safety and benefits that the final product provides. Rather than judging the method by which something is produced, we should have the choice between "Yes to THAT variety" and "No to THIS variety". Hence, rather than the 'process-focused' policy that currently prevails, we should be moving towards 'process-agnostic' regulation. As global GM policy is brought up to the same rigour as the scientific testing it demands, public opinion may begin to change.

Emily Bailes is a 2nd year PhD student in the Department of Plant Sciences. Daisy Hessenberger is a 4th year PhD student in the Department of Plant Sciences. Greg Mellers is a 2nd year PhD student in the Department of Plant Sciences. Nathan Smith is a 3rd year undergraduate in the Department of Plant Sciences.
Where Are They Now?
Since BlueSci's first issue ten years ago, countless members have contributed to the ongoing
success of the magazine. Although they played many different roles, they all had one thing in common: their passion for science communication. To celebrate BlueSci's tenth anniversary, six past BlueSci members share their stories with our readers, recounting their experiences of working for BlueSci and what they have done since.

Rachel Mundy; first Managing Editor
Rachel was the first Managing Editor of BlueSci
After joining Cambridge University Science Productions (CUSP) – a Cambridge University society dedicated to promoting science through the media – Rachel was appointed Head of Writing in 2003. In this role, in collaboration with CUSP Chairman Björn Haßler and committee member Helen Stimpson, she conceived the idea of a popular science magazine to make Cambridge-based scientific achievements accessible to all. "I recognised that fantastic, internationally and historically renowned scientific discoveries at Cambridge were not widely known throughout the university, so I wanted to create a magazine product that promoted the understanding and awareness of science in an engaging format whilst providing a breeding ground for the next generation of science writers." Rachel set about drawing up a business plan to move the project forward and, crucially, recruited a new and dedicated founding committee who worked tirelessly to bring the project to realisation, producing the first issue; a second issue quickly followed with new committee members. After leaving Cambridge, Rachel went on to study science communication at Royal Holloway, University of London, and then launched her career as a freelance science journalist. Devising the concept of BlueSci and working on the very early stages of its existence helped Rachel consolidate her passion for science in the media and opened up opportunities to pursue this as a career.
Lou Woodley; Co-founder of BlueSci
As a member of the BlueSci founding team, Lou worked as the magazine's Managing Editor for two and a half years. She still sees BlueSci as one of the most enjoyable projects she has ever worked on, and it proved crucial in deciding what to do after Cambridge. The 360° view she got of science communication by being involved in all stages of producing a magazine – from event management and house style
discussions to recruiting new team members and sponsors – made her realise that she was better suited to a science communication job than to lab-based research. So she swapped her lab coat for a laptop and joined Nature Publishing Group (NPG) as an intern in their Web Publishing department, working with new tools and technologies for communicating science. The internship eventually became a full-time position that lasted five years, during which she worked on community-focused projects such as Nature Network, the nature.com blogs and the SpotOn series of events in London and New York. Her time at NPG gave her enough experience to go freelance and combine paid work with her own projects, such as MySciCareer – a science careers website – and her blog 'Social in Silico', where she shares her thoughts about online communities and the science of online communication. Lou's career advice would be: "Don't be scared to create the job you want by combining the things you're passionate about. And don't worry if it doesn't seem to exist yet; many of the tools I use and the questions I'm thinking about on a daily basis weren't at all mainstream five years ago."
Tom Walters; first Production Manager
Tom joined the BlueSci team before the first issue came out, after seeing an advert for a new science magazine that was being founded. Things were just getting started, and he remembers being impressed by the energy and enthusiasm of everyone at the first meeting. Since he had previously worked on production and design for Varsity, he ended up as the main design and production manager for the first few issues – or at least that was his official role. In fact, at that stage everything was so new that everyone did a bit of everything. He remembers that, while brainstorming the design of the first issue, most of the team ended up on a field trip to Borders bookshop to work out how other magazines presented themselves and to note down design ideas. The fun and adrenaline rush of finally getting the first issue to the printers totally hooked him, and he stuck around for another couple of years, ending up with a stint as editor for issue six.
During all this – and the much-procrastinated-over PhD work – he landed an internship at Google, where he found a very similar scrappy, "everyone does a bit of everything to get the job done" culture to the one he had enjoyed so much while working for BlueSci. He is still at Google now, working as a Research Scientist in the Zurich office. He says his experience with BlueSci definitely helped his career: in the short term, it gave him a whole bunch of useful talking points for his initial Google interviews, and, more importantly, it showed him just how much fun it is to work in a small, well-motivated team doing something for the very first time.
Laura Blackburn; news writer and editor
Research didn't quite work out for Laura, and writing became her Plan B. She joined BlueSci after contributing an article to the first issue, first as the one-person news team, then as news editor and features editor. A series of fortunate events after her PhD led her to a six-month news internship at Science, and then to a year's maternity cover as a journal news and views editor. Her current job as a scientific communication officer at the Cancer Research UK Cambridge Institute covers many things: on the science communications side, writing for and editing reports, booklets and websites, and answering public enquiries by e-mail and phone; on the science administration side, organising conferences and seminars, student support, and helping to develop policies and processes, the most recent being how best to teach students about plagiarism. Her two key pieces of career advice are: "If you're interested in science communications jobs you should: 1) follow the money, as research funders always need good communicators, and 2) join the psci-com e-mail list (www.jiscmail.ac.uk/psci-com)."

Laura became a communications officer for Cancer Research UK

Jon Heras; illustrator, producer and president
Jon was involved in BlueSci issues three to nineteen as illustrator, producer and copy editor. He was also in the video team and served as president for one year. After graduating he founded Equinox Graphics, and now makes a living producing science and engineering artwork, a mixture of stills and animation. He says the most rewarding thing about having been involved with BlueSci was seeing his artwork in print, which gave him a real buzz. He learned how to be creative to a deadline – "The magazine is going to press, so you'd better be ready!" – and to work in a team: "It's hard when you know you're right, but compromise and listening always gets the best result." Working for BlueSci also gave him the opportunity to develop skills and learn from lots of talented people, and this experience opened up options for him when he graduated. He admits it hasn't been an easy path: "I've made a lot of mistakes along the way, and it's taken a while to find my niche, to develop my skills and portfolio, and to find enough clients to generate a steady stream of work. But it's more rewarding and creative than just taking a run-of-the-mill job. Every day is different, and I'm continuously learning about science, art and visual effects."

Ewan St. John Smith; FOCUS and submissions editor
Ewan joined as editor for several BlueSci issues. After his PhD he moved to Berlin, where he spent more than five years investigating the weird and wonderful world of the naked mole-rat, identifying the molecular basis for these animals' insensitivity to a noxious stimulus. He then spent a year in New York at the Skirball Institute of Biomolecular Medicine, examining CO2-mediated neuronal activation in the nematode Caenorhabditis elegans. In 2013 he took up a University lectureship in the Department of Pharmacology at the University of Cambridge, where his group's research focuses on understanding the effect of CO2 on neuronal activity and the neural basis of pain. Besides his very successful career as a researcher, he still enjoys public outreach work, having taken part in Berlin's Long Night of Science and the Cambridge Science Festival; he has also made podcasts for Science, The Physiological Society and The Naked Scientists.

Ewan is still involved in academia, having taken up a lectureship at the University of Cambridge in 2013

Christine Weber is a BlueSci alumna currently at King's College London.
Ten Years of Nobels
For over 100 years, Nobel Prizes have recognised those conferring "the greatest benefit on mankind". Meanwhile, the Ig Nobel Prizes – described in Nature as "the highlight of the scientific calendar" – are for research that makes you laugh, and then makes you think. In honour of BlueSci's tenth anniversary, we've picked ten prizes which showcase the principles and peculiarities of the Nobel and Ig Nobel Prizes of the past ten years.
Expansion of the Universe
The Universe is expanding at an accelerating rate
The discovery that the Universe is expanding at an accelerating rate won the 2011 Nobel Prize in Physics. The recipients had competed for years: Saul Perlmutter led the Supernova Cosmology Project, while Brian P. Schmidt and Adam G. Riess led the High-Z Supernova Research Team. The two groups raced to find Type Ia supernovae, formed from the explosion of white dwarfs. Small areas of sky were repeatedly imaged and carefully searched for tiny changes. Intending to show how the expansion of the Universe was slowing in the wake of the Big Bang, they compared the light curves of these supernovae with others at known distances. Between them, the two groups found 50 distant supernovae whose light was weaker than expected, revealing that the expansion of the Universe is in fact not slowing down, but speeding up.
Snapping spaghetti
Spaghetti is flexible but brittle, and always snaps into more than two pieces
The 2006 Ig Nobel Prize for Physics went to Basile Audoly and Sebastien Neukirch for their high-speed photography analysis of why it is impossible to snap dry spaghetti into just two pieces. Spaghetti is flexible but brittle, breaking only when its elastic strain limit is exceeded. When snapped, the two broken ends are released and bend back in the opposite direction. The researchers' photographs revealed that the rod will, at some points, ping back even further than the original curve; at the site of maximal curvature, it is highly likely to break. So, when a piece of spaghetti is snapped in two, the stress will break off at least one smaller piece.

Tagging proteins with fluorescent markers
The 2008 Nobel Prize in Chemistry was awarded to Osamu Shimomura, Martin Chalfie and Roger Tsien for the discovery and development of green fluorescent protein, or GFP. Osamu Shimomura first isolated GFP from the jellyfish Aequorea victoria, which glows green when agitated. He discovered that the protein would also glow under ultraviolet light. Martin Chalfie then connected the GFP-producing
gene to a gene for another protein, tagging that protein with a glowing green colour. This made it possible to track the protein and see which cell populations it inhabited. Roger Tsien further developed the marker by producing a wide colour palette of fluorescing proteins. One exciting use of this technique was the creation of a multi-coloured mouse brain to show how individual nerve cells are woven together, which could help further our understanding of neurodegenerative diseases.

Human digestive system versus unchewed shrew
Brian Crandall and Peter Stahl were awarded an Ig Nobel Prize in 2013 for persuading a volunteer to swallow a parboiled shrew. The animal was skinned, disembowelled, briefly boiled and cut into three portions, which were swallowed by the participant without chewing. For three days afterwards, his faecal matter was collected and examined with a hand lens. Although the skull and jaw sustained heavy damage from digestive acids, the researchers observed that a considerable number of shrew teeth and bones emerged relatively intact. Their investigation has better enabled archaeologists to draw conclusions about the diets of past societies from mammalian remains.

Stem cells for genetic modification
The 2007 Nobel Prize in Physiology or Medicine was awarded to Mario Capecchi, Sir Martin Evans and Oliver Smithies for their work on genetic modification using embryonic stem cells. Independently, Mario Capecchi and Oliver Smithies demonstrated that genes within a cell can exchange information with each other, as occurs every time an egg is fertilised and genetic material from two parents is combined. Martin Evans applied this finding, observing that if embryonic stem cells were altered, they would pass on that change to all cells which developed from them, regardless of how they specialised. These discoveries have allowed us to introduce gene modifications into whole organisms and examine the roles of individual genes.
Artificial synthesis of medical molecules
The 2010 Nobel Prize in Chemistry was awarded to Richard Heck, Ei-ichi Negishi and Akira Suzuki for their use of palladium compounds to catalyse cross-couplings in organic synthesis. Carbon is the basis for all known life, and the synthesis of carbon-based molecules is crucial to drug development. However, carbon is very stable and its atoms do not react easily with one another. Forcing them to join using reactive substances generated unwanted by-products, and isolation from natural sources could never yield sufficient quantities for widespread use. The researchers discovered that palladium compounds could act as catalysts, drawing carbon atoms close enough together to begin to react. This has enabled the synthesis of carbon molecules that are vitally important in the treatment of conditions including colon cancer, HIV and MRSA.

Five-second rule
Jillian Clarke, a high-school student on a summer internship at the University of Illinois, received a Public Health Ig Nobel in 2004. She investigated the validity of the five-second rule, which claims that dropped food can be safely eaten if it is on the floor for less than five seconds. To simulate the dropping of food on a dirty floor, she inoculated tiles with E. coli and placed food on them. Environmental scanning electron microscopy (SEM) revealed that five seconds was more than sufficient for the bacteria to transfer to the items. But Clarke then swabbed the University floors looking for bacteria. To her surprise, she found very few, probably because floors are too dry for most pathogens. So the five-second rule may be wrong – but you can still eat your dropped food!
Graphene: a super-strong substance peeled off with sticky tape
In 2010, Andre Geim and Konstantin Novoselov were awarded the Nobel Prize in Physics for their groundbreaking experiments with graphene, a two-dimensional form of graphite. Graphite comprises many layers of hexagonally-structured carbon. After years of failed attempts to isolate one of these layers, the researchers tried simply peeling them off a graphite block using Scotch tape. Ten to twenty applications of the tape successfully reduced a flake to a single sheet of atoms. This graphene is stronger than steel, highly stretchable, transparent and a good conductor. Practical applications may include touch screens and flexible electronics smaller than silicon chips. Graphene was not Geim's first honour for slightly unorthodox research; in 2000, he received an Ig Nobel Prize for levitating a frog to demonstrate that water can be levitated in a magnetic field.

Graphene is the two-dimensional form of graphite and is stronger than steel

Fingernails on a blackboard
Lynn Halpern, Randolph Blake and James Hillenbrand won the Acoustics Ig Nobel Prize in 2006 for investigating the unpleasant sound of fingernails scraping on a blackboard. Participants rated the unpleasantness of a recording of a gardening fork scraping across slate. The researchers expected to find that the disagreeable quality lay in the high frequencies; however, when they filtered out frequencies above 3 kHz, there was no impact on participants' ratings. Against expectations, the middle frequencies (2-3 kHz) were found to make the sound unpleasant. The researchers were unable to suggest why this should be the case, but their own repeated exposure to the noise did allow them to report that the experience became more tolerable over time.

The unpleasant sound of fingernails on a blackboard is due to the middle-frequency soundwaves
Tequila diamonds
In 2009, Javier Morales, Miguel Apátiga and Victor Castaño were honoured with an Ig Nobel Prize for growing diamonds from tequila. They discovered that high-quality diamond films could be produced from ethanol diluted in water. Having determined that the ideal concentration of ethanol was 40%, they repeated the experiment with a $3 bottle of tequila. The tequila was heated to remove impurities and break it down into its constituent parts. The resulting carbon atoms were deposited to form a thin, uniform film which is only visible under an electron microscope. Hard and heat-resistant, this diamond film may provide an alternative to silicon in computer chips and ultra-fine medical cutting instruments.
Javier Morales won the 2009 Ig Nobel prize for making diamonds from Tequila
Sarah Regan is an MPhil student in the Department of Psychiatry.
Top Ten Scientific Discoveries of the Past Ten Years
In celebration of BlueSci’s tenth anniversary, Joanna-Marie Howes revisits the past ten years of influential scientific stories
1. Discovery of exoplanets
Exoplanets are planets outside our solar system
Exoplanets are planets that are not located within our solar system. Although such planets had long been thought to exist, they remained elusive to detection for centuries. The first confirmed exoplanets were discovered in 1992, in orbit around the pulsar PSR B1257+12. Since then, around 2,000 more have been identified, sparking renewed interest in the search for extraterrestrial life. In order to support living organisms, an exoplanet needs to be in the habitable zone of its star, allowing liquid water to exist on its surface. In February 2014, development of the statistical technique 'verification by multiplicity' allowed NASA to identify 715 new exoplanets using the Kepler Space Telescope, four of which were in the habitable zones of their central stars.
2. Publication of the human genome
The human genome project was completed in 2004
The Human Genome Project (HGP) began in 1984. Its purpose was to 'understand the human genome' and gain 'knowledge of the human [which] is as necessary for the continuing progress of medicine and other health sciences as knowledge of human anatomy has been for the present state of medicine'. The project was brought into existence through the collaboration of several workshops within the US Department of Energy. In 1990, after much debate and deliberation, the $3 billion project was officially funded by the US Department of Energy and the National Institutes of Health, and was envisaged to take 15 years to complete. With the assistance of multiple other facilities around the world, the 3.3 billion base-pair human genetic code was methodically broken down into chunks, amplified in bacteria and sequenced. In the project's later stages, a privately funded rival quest was launched in 1998 by Celera Genomics. The firm tried to patent its findings, but an announcement in 2000 by President Bill Clinton forbidding this meant that the information from both groups was freely available to all researchers. The drafts were published in 2001, with both Celera and the HGP credited, and in 2004 the project was completed. Now, the analysis of variations in DNA continues, in order to identify the roles of these changes in disease.
3. Isolation of graphene
Graphene is a one-atom-thick layer of graphite, a form of carbon, and looks a little like molecular chicken wire. Unlike the graphite 'lead' in your pencil, however, a graphene monolayer is extremely strong, light and almost transparent. It is also a very good conductor of both heat and electricity. This makes graphene extremely useful for a wide range of applications, such as flexible display screens, ethanol distillation and solar cells, to name but a few. Graphene is very difficult to produce, however; following its isolation in 2004, it was for a time one of the most expensive materials on Earth. But this has not stopped the European Union making a €1 billion grant to fund research into potential graphene applications.

4. 3D printing
Additive manufacturing, or 3D printing, is the creation of solid three-dimensional objects from a digital template. During this process, the object is built up layer by layer to give the final shape. Currently, 3D printing is commonly used for making prototype objects in the engineering industry. However, it is now spreading into the fashion world, with designers using the technique to design shoes, bikinis and dresses. Sports manufacturers Nike and New Balance are even using 3D manufacturing to custom-fit their shoes.

5. Optogenetics
Optogenetics is the use of light to control and monitor neural activity
Optogenetics involves the use of light to selectively and precisely control neural activity. A gene coding for a light-sensitive algal protein is inserted into specific neurons in the brain. These neurons can then be activated by light, firing an electrical impulse in response. Using this technique, the activities and roles of individual neurons can be studied in real time. In 2010 optogenetics was chosen as the 'Method of the Year' across all fields of science by the interdisciplinary research journal Nature Methods, and it was also featured in the 'Breakthroughs of the Decade' in the scientific research journal Science.
6. Nanobatteries
Unlike traditional lithium-ion (Li-ion) batteries, which are extremely limited in the amount of power they can retain, nanobatteries store energy at the molecular scale and so are much more compact. Li-ion technology uses materials such as cobalt-oxide or manganese-oxide particles that range in size between five and twenty micrometres. In contrast, nanobattery particles measure fewer than 100 nanometres (up to 200 times smaller). If a Li-ion battery is charged too quickly, the lithium moving through the electrolyte liquid causes a 'bottleneck' as it moves from the negative electrode to the positive. Under slow charging conditions this does not cause a problem, but it limits the rate at which batteries – for example, in your mobile phone – can be recharged. Toshiba is one of several companies currently researching nanobattery technology, announcing the development of a Li-ion battery with a nanostructured lattice at the cathode and anode that recharges eighty times faster than its conventional Li-ion counterpart.

7. Discovery of the 'ageing gene'
The cells in our body are constantly dividing and replacing themselves to keep us ticking over. Each time a cell divides, the chromosomes within it are replicated to pass on their genetic information. At the end of each duplicated chromosome is a telomere – a repetitive nucleotide sequence which protects the end of the chromosome. During cell replication, DNA duplication cannot continue all the way to the end of a chromosome, and so over time the telomere becomes shorter. Eventually replication becomes impossible and the cell dies. This is biological ageing, and some people are more prone to it than others, their telomeres shortening faster over time. In 2010, it was discovered that the telomerase RNA component (TERC) gene determines not only how long the telomeres are when someone is born but also how quickly they shorten. The effect of this gene is considerable in those with the variant, equivalent to between three and four years of biological ageing as measured by telomere length loss.

Telomeres are the end sections of chromosomes
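To make the mechanism concrete, here is a toy model of telomere attrition. All base-pair figures are invented for illustration; real telomere lengths and loss rates vary between people and cell types, which is precisely the variation the TERC finding addresses.

# Toy model of biological ageing by telomere attrition, as described above.
# The starting length, loss per division and critical length are illustrative.

def divisions_until_senescence(telomere_bp, loss_per_division_bp, critical_bp):
    # Count divisions until the telomere is too short to permit replication.
    divisions = 0
    while telomere_bp - loss_per_division_bp >= critical_bp:
        telomere_bp -= loss_per_division_bp
        divisions += 1
    return divisions

# A faster-shortening variant (a hypothetical TERC-like effect) exhausts
# the same starting telomere in fewer divisions:
print(divisions_until_senescence(10000, 70, 4000))   # 85 divisions
print(divisions_until_senescence(10000, 100, 4000))  # 60 divisions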
8. RNA interference
RNA interference (RNAi) refers to the biological mechanism by which small ribonucleic acid (RNA) molecules inhibit the expression of genes. RNAi is initiated by the introduction of short double-stranded RNA molecules into cells and can be applied to specifically inhibit genes of interest. RNAi shows promise in a number of fields: clinical trials are underway in the treatment of macular degeneration and respiratory viral infection, and RNAi research is also focusing on the reversal of liver failure and the silencing of genes that promote the growth of cancerous tumours. In vivo delivery of RNAi to tissues still eludes science – especially to tissues deep within the body – and is currently limited to surface applications (for example, the eye and respiratory tract), where it is applied in direct contact with the tissue. For deeper applications, the RNAi must be targeted and protected from degradation. Higher doses of RNAi have been trialled to combat this, but have resulted in toxic side effects.

9. Light observed from the 'Big Bang'
The 'Big Bang' theory is a widely accepted cosmological model for the early development of the Universe and marks its birth, thought to be in the region of 13.798 ± 0.037 billion years ago. Now, the £515 million Planck Space Telescope has captured a 'map' of light originating from the dawn of time, revealing patterns which support the Big Bang theory. The maps were compiled over a period of 15 months by the European Space Agency, by focusing on the faint glow of microwave radiation found in space known as the Cosmic Microwave Background (CMB). The findings suggest that the Universe contains slightly more matter than expected and a little less dark energy, the force thought to propel the expansion of the Universe.
10. The Large Hadron Collider and the discovery of the Higgs boson
Named after Peter Higgs, an Edinburgh University physicist, the Higgs boson or 'God particle' is an elementary particle first theorised in 1964, and its discovery was pivotal to the Standard Model of particle physics. Determining the existence of the Higgs would allow physicists to explain why some particles have mass when they should be 'massless', and why the 'weak force' has a much shorter range than the 'electromagnetic force'. Proving the existence of the Higgs would finally allow physicists to validate the Standard Model, a quest so important that it resulted in the assembly of the Large Hadron Collider (LHC), the world's largest and most powerful particle collider. Spanning the Franco-Swiss border near Geneva, the 27-kilometre subterranean LHC was a collaboration of over 10,000 scientists and engineers from over 100 countries. Operated by the European Organization for Nuclear Research (otherwise known as CERN), the LHC finally enabled the discovery of the Higgs, announced in 2012. The analysis of the by-products of high-energy particle collisions has been invaluable for the discovery of new and theorised particles that would be impossible to study in other ways.

The Large Hadron Collider is the world's largest and most powerful particle accelerator
6. nanobatteries
Unlike traditional lithium ion (Li-ion) batteries, which are limited in the amount of energy they can store, nanobatteries store energy at the molecular scale and so are much more compact. Li-ion technology uses materials such as cobalt oxide or manganese oxide particles that range in size between five and twenty micrometres. In contrast, nanobattery particles measure less than 100 nanometres–up to 200 times smaller. If a Li-ion battery is charged too quickly, the lithium moving through the electrolyte liquid causes a 'bottleneck' as it moves from the negative electrode to the positive. Under slow charging conditions this does not cause a problem, but it limits the rate at which batteries–for example, in your mobile phone–can be recharged. Toshiba is one of several companies currently researching nanobattery technology, announcing the development of a Li-ion battery with a nanostructured lattice at the cathode and anode that recharges eighty times faster than its conventional counterpart.
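The quoted size ratio is easy to check; a two-line Python aside, using only the figures above:

```python
# A quick sanity check on the particle sizes quoted above (all in metres).
li_ion_low, li_ion_high = 5e-6, 20e-6   # 5-20 micrometre oxide particles
nano = 100e-9                           # nanobattery particles: under 100 nm

print(li_ion_low / nano)    # 50.0  -> 50 times smaller at the low end
print(li_ion_high / nano)   # 200.0 -> up to 200 times smaller
```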
Telomeres are the end sections of chromosomes
The Large Hadron Collider is the world's largest and most powerful particle accelerator
Top Headline Grabbers of the Past Ten Years
cloning fraud
Dr Hwang Woo-suk claimed to have cloned human embryos, which turned out to be a fraud
The cloning of embryos for the extraction of stem cells has long been a contentious subject, receiving copious media attention. Embryonic stem cells are useful as they can differentiate into any type of cell in the body, making new heart muscle, bone or brain tissue. In 2004, Dr Hwang Woo-suk was hailed as an international pioneer in stem cell research when he published a paper in the journal Science claiming that he had created the world's first cloned human embryos, extracted stem cells from them and genetically matched them to specific patients. However, scientists and bloggers began posting evidence of fabricated data, leading the journal to retract his papers. In 2006, Dr Hwang was stripped of his licence to carry out stem cell research, and he was formally accused of fraud, misusing state funds and violating bioethics laws. Hwang was later convicted of embezzling over $705,000 of funding and of illegally buying human eggs for his research. His former colleagues were also either fined or given suspended prison terms for fraud.
foot and mouth
The UK suffered a major outbreak of foot-and-mouth disease which resulted in the mass slaughter of many animals
Foot and mouth is a highly infectious and sometimes fatal viral disease affecting cloven-hoofed animals (for example cattle, sheep, pigs, goats and deer). Symptoms include fever followed by blisters in the mouth and on the feet. The lameness caused by the disease usually worsens until the animal can only hobble, and a loss of condition is noticeable, partly due to fever but also because the mouth is so painful that the animal doesn't want to eat. The virus is present in great quantity in the fluid from the blisters, but can also spread by airborne means. The UK suffered major outbreaks in 2001 and 2007, resulting in the mass slaughter of many animals and devastating the farming community. Infection by the virus also renders the animal's immune system vulnerable to further attack.
severe acute respiratory syndrome (sars)
Although still relatively rare, the highly contagious nature of the SARS virus resulted in a major outbreak originating in southern China between 2002 and 2003. The contagion rapidly spread to other countries and reached the public spotlight in February 2003, when an American businessman travelling from China became afflicted with pneumonia-like symptoms and later died in a hospital in Vietnam, where the subsequent spread of the epidemic among the hospital staff alarmed health authorities. The epidemic continued through until 2004. SARS is a serious form of pneumonia, and symptoms of the disease resemble those of a cold or influenza but are accompanied by a fever of over 38°C. SARS has no cure, and the development of a vaccine approved for human use remains a global priority.
blood clot risk of the contraceptive pill
Over a million women take the combined oral contraceptive pill each day. In 2014, however, the Daily Mail online reported on the 'Deadly risk of pill used by 1m women: Every GP in Britain told to warn about threat from popular contraceptive'. In reality, although the combined pill contains a synthetic form of the hormone oestrogen that is associated with a raised risk of blood clots, the true risk of suffering a blood clot as a direct consequence of taking the combined pill is only around 12 per 10,000 women per year.
mmr vaccine controversy
The controversy surrounding the combined measles, mumps and rubella (MMR) vaccine originates from a 1998 publication in the medical journal The Lancet that supported the now discredited theory that colitis and autism disorders could be caused by the jab. The media was heavily criticised for its naïve reporting and for lending undue credibility to the fraudulent research. It was later revealed that the leader of the research, Andrew Wakefield, had undeclared conflicts of interest and had manipulated evidence. Most of Wakefield's co-authors then withdrew their support for the study. The Lancet paper was partially retracted in 2004 and fully retracted in 2010, after the British General Medical Council (GMC) conducted an inquiry into allegations of Wakefield's misconduct. In the years that followed, other researchers failed to reproduce Wakefield's findings or to confirm his hypothesis of a relationship between childhood gastrointestinal disorders and autism.
confirmation of water on mars
Over the years, imaging of Mars' topography has suggested that liquid water played a role in the planet's history, yet confirmation of its current existence on the planet has remained elusive. In 2004, however, the Mars Express satellite confirmed that the Martian polar caps contained water ice, and in 2005 the European Space Agency announced the existence of a crater containing frozen water. In July 2008, NASA's Phoenix Lander made the first ever direct observation of water ice at the surface. Two years later, the Mars Reconnaissance Orbiter determined that the total volume of water ice in the north polar cap is equal to 30 per cent of the Earth's Greenland ice sheet–enough to cover the surface of Mars to a depth of 5.6 metres. These discoveries have heightened interest in the search for life on Mars, both past and present, and in 2014 NASA reported that the Curiosity and Opportunity rovers had made the search for evidence of habitability and organic carbon on Mars a primary objective.

The discovery that Mars had water has generated interest in the search for life on the 'red planet'
hydrogen-fuelled cars
Our search for clean, renewable energy sources has led to the research and development of new power sources for our currently fossil fuel-hungry vehicles. Hydrogen-powered cars run on a fuel cell, which converts hydrogen to electricity, giving off only heat and water as by-products and which, according to the United States Department of Energy, could 'reduce greenhouse gas emissions by 60 per cent'. However, hydrogen is not a fuel source that occurs naturally on Earth, and fuel cells are expensive to produce. In 2009, Fortune magazine estimated the cost of producing the Honda Clarity at $300,000 per car. There are still many challenges facing the use of hydrogen in vehicles, including production, storage, transport and distribution. Several hydrogen cars now exist, but most of them are concept cars, including the Chevrolet Equinox, the BMW 745h and the Honda FCX.

Hydrogen-fuelled cars convert hydrogen into electricity
first trial of human embryonic stem cells
In 2009, approval was given for the transplantation of oligodendrocytes (a cell type of the brain and spinal cord) derived from human embryonic stem cells into spinal cord-injured individuals, marking the first human embryonic stem cell trial. The research behind this scientific advance was conducted by Hans Keirstead and was supported by the Geron Corporation. Keirstead had previously shown an improvement in movement in spinal cord-injured rats following the transplantation of human embryonic stem cell-derived oligodendrocytes. The trial was approved in 2009 but, due to concerns regarding the formation of microscopic cysts, was postponed until 2010. Geron left the field of stem cell research following poor results of the trial, but in 2013 BioTime acquired all of Geron's stem cell assets, stating their intent to restart Geron's embryonic stem cell-based clinical trial for spinal cord injury in the future.

Joanna-Marie Howes is a post-doctoral researcher in the Department of Biochemistry.
Solitary cell confinement
Verena Brucklacher-Waldert discusses the challenges and benefits of chips that analyse individual body cells separately

Each of us is made up of about 37.2 trillion cells. They all originate from one single cell, the zygote, which divides multiple times to build a collection of cells that then gives rise to a functional organism. Yet none of them are alike. Each single cell of our body is unique, and so are we. We celebrate our individuality by personalising gifts, mobile devices and even the designs of debit cards. But what about personalised medical treatment? Would it not be a logical consequence of our uniqueness to be treated with tailored medication, in the right dose at the right time, to be cured quickly and avoid any possible adverse effects? In order to create such a truly personalised medicine we have to understand the human body extremely well, at the single cell level. This ambitious goal is the aim of single cell analysis.
Single cell analysis embraces the uniqueness of each individual cell in an organism. Cell types, such as neurons or immune cells, differ in their structure and anatomical location within the body. These differences are often genetically determined, by switching certain genes on and off during the development of an organism. Notably, even cells of one type, sharing a common structure and location, are not alike over time. This cellular variation is partially to blame for the diverse range of responses to treatment of a disease observed across individuals.
One cell splits up into trillions, all of them subtly different
Microfluidics is used to control single cells in lab-on-a-chip technology
The current inability to predict an individual's response to treatment for most diseases means that clinicians must follow a 'trial-and-error' approach. A patient with high blood pressure, for example, is placed on a number of medications until the optimal medication is found to reduce this to a healthy level. Nowadays, clinical diagnostics and research assess the characteristics of cells in blood or tissue samples by bulk measurements. These data are therefore an averaged response from a mass of cells, and can mask valuable information. Bulk screening of a patient's blood sample, for example, may not reveal a mutation of the p53 gene (a gene involved in cancer development), whereas single cell analysis of the same blood sample might identify some cells carrying this mutation. This information would enable clinicians to react preventatively, and more efficiently, against possible cancer development, as cells with a mutated p53 gene can eventually become tumours.
Analysing single cells is challenging if one considers the size of these cells (typically about 100 times smaller than a poppy seed), the number of cells (for example, 5 million red blood cells in just 1 mm³ of blood) and the diversity of cell characteristics to examine (for example genes, proteins, lipids). A technology that enables single cell analysis must therefore be able to isolate single cells from a sample; assess the unique information of each of these single cells; and analyse, integrate and interpret the acquired data.
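A toy calculation makes the averaging problem concrete. In the sketch below (Python; the marker levels and the five-cell mutant subpopulation are invented purely for illustration, not real assay data), the bulk average looks completely normal while a cell-by-cell view finds every mutant:

```python
# Toy illustration of why bulk averages can hide rare cells.
import random

random.seed(0)
healthy = [random.gauss(1.0, 0.1) for _ in range(9_995)]  # normal marker level
mutant = [random.gauss(10.0, 0.5) for _ in range(5)]      # 5 rare mutant cells
cells = healthy + mutant

bulk_average = sum(cells) / len(cells)
flagged = [level for level in cells if level > 5.0]       # the single cell view

print(f"bulk average: {bulk_average:.2f}")          # ~1.00 -> sample looks normal
print(f"cells flagged one by one: {len(flagged)}")  # 5 -> the mutants stand out
```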
The sensors can fit on a chip, but analysing all the data can be computationally challenging
Ideally, these steps should be performed as quickly as possible and take up as little work space as possible–criteria currently best met by the revolutionary 'lab-on-a-chip' technology. These chips are generally about five centimetres by seven centimetres and are made up of a system of channels connected to syringes. Cells in suspension are introduced to the chip at a rate of several million cells per minute and float through channels just a few micrometres in diameter. Depending on which specific system is used, cells are compartmentalised into wells or droplets. Chips using the well system trap single cells as they pass through the channels into wells which match the shape and size of a single cell of interest. In a droplet system, cells are trapped in tiny aqueous droplets which flow within an inert carrier oil through these channels. Each well or droplet functions as an independent laboratory that allows the investigation of a single cell through analytical techniques miniaturised to fit on a chip.
This 'lab-on-a-chip' can assess several features of a cell, ranging from its shape to its '-omics'. 'Omics' is the large-scale study of genes (genomics, epigenomics), RNA transcripts (transcriptomics), proteins (proteomics), lipids (lipidomics), metabolites (metabolomics) and the interactions between all of these (interactomics). In order to assess all of these features, several different methods need to be employed. While some of these methods have already been successfully established for the 'lab-on-a-chip' technology, the future challenge is to increase the number of methods performed simultaneously per well or droplet. Only such a combined analysis will unravel the interconnected chemical and biological processes occurring in our body. Moreover, the 'lab-on-a-chip' technology offers biologists the chance to study these dynamic processes over time rather than via a snapshot-style recording, which does not represent living systems accurately. This culminates in the collection of multiple data points for each cell, whereupon each recorded parameter can increase the complexity of the data set very rapidly. This 'curse of dimensionality' is a major challenge for analysing and interpreting single cell data.
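To see how fast each extra parameter bites, consider a crude back-of-envelope (Python; the 'ten levels per parameter' discretisation is an assumption chosen purely for illustration):

```python
# Back-of-envelope view of the 'curse of dimensionality': if each measured
# parameter is crudely split into 10 levels, the number of distinct cell
# states grows as 10**d, quickly dwarfing any realistic cell count.
for d in (2, 5, 10, 20):
    print(f"{d:>2} parameters -> {10**d:.0e} possible states")

cells_profiled = 1_000_000
print(f"cells per state at 10 parameters: {cells_profiled / 10**10}")  # 0.0001
```

Even a million profiled cells leave a ten-parameter space almost entirely empty, which is why interpreting such data needs dedicated statistical methods rather than simple binning.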
At the end of this process, the interpretation of single cell data can result in either an individualised clinical diagnosis for the patient or the identification of interesting target cells and their products. In the case of interesting targets, viable cells can be retrieved from within the chip and, with appropriate manipulation, subsequently used for the production of hormones, vaccines or antibodies, all specific to that individual donor patient.
Multidisciplinary research fields comprising biology, informatics and physics all work on the development of new techniques that enable single cell analysis. The Nature Publishing Group selected single cell sequencing as its 'Method of the Year' for 2013, a distinction previously awarded to techniques such as super-resolution microscopy (2008). Several companies have recently been working towards large-scale single cell analysis. For example, Fluidigm, located in San Francisco (US), has launched single cell devices for sequencing that have contributed to the discovery of previously unrecognised immune cells, whilst Sphere Fluidics in Cambridge (UK) has patented a novel technology platform enabling ultra-high throughput analysis of single cells in picodroplets.
The Nobel Prize winner Sydney Brenner once stated that 'progress of science depends on new techniques, new discoveries and new ideas, probably in that order'. The new technologies that allow us to study our body at the single cell level will unravel crucial molecular and cellular mechanisms which will be instrumental in the development of novel therapeutic concepts. Inevitably, several moral and ethical questions will arise, concerning matters such as intellectual property, regulatory oversight, reimbursement and confidentiality. Nevertheless, the potential for revolutionising medicine is immense. Single cell analysis may just herald the era of personalised treatment for disease.

Verena Brucklacher-Waldert is a post-doctoral researcher at the Babraham Institute.
Textbooks or 'tubes: YouChoose
Matthew Dunstan investigates the changing role of YouTube and online video in education

If I were to ask you what the most popular category of videos on YouTube is, you might guess something to do with cats or other fluffy animals. While you would have been correct in the past, a new trend has arisen on the website–educational videos, particularly those involving mathematics and science. With more than double the number of weekly views of videos in the pets and animals category, the behemoth video platform is now being used for much more than the occasional chuckle. In stark contrast to the venerable giants of science communication from the past–greats such as Richard Feynman or Carl Sagan–the new generation of educators is young, hip and surprisingly adept at condensing complex scientific concepts into bite-sized, entertaining portions. Embracing a wide variety of formats, from the more traditional talking heads to stick figures drawn with markers and filmed in stop-motion, these YouTubers are making science interesting and tangible for anyone with an Internet connection. What is more, what they are doing is incredibly popular, with some of the largest channels averaging several million views a week.
Are books now an archaic form of learning?
Students are now able to learn by using wide varieties of media on the Internet
It might be easy to dismiss this phenomenon, and its effect on the traditional science education model, by assuming that its popularity derives from the dumbing down of scientific content to make it palatable for the average person. Choosing specific, popular science ideas that make for entertaining videos may have little use in the real pursuit of scientific knowledge or discovery, after all. But you only have to look as far as channels such as 'Minute Physics' by Henry Reich, a former physics masters student, or 'Crash Course Chemistry' by Hank Green (his brother, John Green, also hosts a channel called 'Crash Course History') to see that content focused directly at the high school syllabus can still be wildly popular. The impact on educational practices cannot be overstated–YouTube and other online video content offers virtually limitless, up-to-date information for students and will challenge some core assumptions about how students should learn mathematics and science.
In addition to YouTube, other organisations such as the Khan Academy host vast online repositories of video lectures and interactive exercises, numbering in the thousands and covering a wide variety of subjects, all of which are free to access for students, staff and parents. The availability of online materials, as well as the ability for teachers to easily upload their lectures, has even led to a revolutionary educational concept known as the 'Flipped Classroom', a concept that was arguably pioneered by Harvard physics professor Eric Mazur in the early 1990s. The idea is for teachers to set homework that involves students watching a number of short lectures online, and then to work with students through activities related to the lectures during the school day, under their supervision. This switching of the normal cycle–information delivery during the school day and revision after school–allows students to work through the lecture material at their own pace and then make use of the teacher's advice when trying to apply the new knowledge to a problem. This approach has been implemented in a number of schools, most notably at Clintondale High School in Michigan. In 2010, Clintondale was rated among the worst 5 per cent of schools in the state, with ninth grade mathematics and science failure rates of 44 per cent and 41 per cent respectively. In one year, after introducing a new paradigm focused on the Flipped Classroom, these same failure rates dropped dramatically to 13 per cent and 19 per cent respectively. Such improvements would be impossible without free and easy access to online video tutorials.
Online educational content also has a much wider reach than just the classroom. According to Mitchell Moffit and Gregory Brown, creators of the YouTube channel AsapSCIENCE, their average viewer is a 35-year-old male. Their channel covers topics ranging from the outcome of sleep deprivation to the science of hangovers, via cartoons on a whiteboard. These videos are therefore not solely the domain of high school students trying to pass their exams, nor do they only attract viewers of the same mid-20s demographic as many of the channels' hosts and creators. The instigators of this new wave are able to captivate audiences who would otherwise be uninterested in the subject matter, with short, witty videos that entertain and educate in a way no other form can. This can only help improve the overall scientific literacy of society, something that is sorely needed given the increasingly scientifically-complex problems the world faces.
Richard Feynman is regarded as one of the best lecturers of all time, but he was still only one man with a limited reach. What would he have made of the technology we have at our keyboards today? Perhaps the most disruptive effect of online video is the ease with which one person, given time and the right idea and presentation, can influence millions. Take the example of Salman Khan, who started his YouTube channel in 2006, uploading videos of himself giving tutorials in mathematics to meet the demands of friends and family who wanted his help. Following the popularity of his videos amongst students and teachers, he quit his job as a financial analyst to found the Khan Academy, whose online lectures and exercises are now used by millions of students all around the world. Khan is a talented lecturer, and had the foresight to record his lessons, but it was the transformative technology available to him that made this possible.
Despite the promise of this Brave New World of learning, there is still a major caveat. Internet access at home is far from assured, especially in developing countries, and can contribute to a digital divide between students who can access content at their own pace in the privacy of their own rooms (a factor that Khan himself explicitly mentions as one of the key benefits of his form of lecturing) and those who cannot. With better access for lower income households this disparity might be overcome, but that is still a long way off. We must be careful not to allow this revolution to become another source of inequality.
Despite this, it is a fantastic time to be an educator, and especially a science communicator. Never before have passionate and talented scientists, especially younger scientists, been able to follow their interests with such freedom. Furthermore, through programmes such as the YouTube partnership scheme, this passion can turn into a viable and even lucrative career. Even better, we all get to enjoy their creativity, learn from their wisdom and laugh along with them, without spending a single pound.
Salman Khan, the founder of Khan Academy
YouTube: Not just for cute cats
Matthew Dunstan is a 3rd year PhD student in the Department of Chemistry.
The Pint is Right
Robin Lamboll debates whether the price of alcohol should be raised

There's a drug that causes over a million admissions to hospital in the UK every year, and two million visits to accident and emergency clinics. It's linked to about half of all violent crime in the UK, and it's not just a UK problem–the World Health Organisation estimates that this drug is to blame for about 6 per cent of all deaths worldwide. This drug is legal in the majority of countries. This drug is alcohol.
With such a track record, it might seem extraordinary that this drug is legal when others aren't. Prohibition of alcohol was attempted in the USA for over a decade, but is widely regarded as unsuccessful. Statistics on the use of banned substances might generally be untrustworthy (after all, people are trying to hide the substances), but by following the levels of alcohol-related arrests, deaths, cirrhosis (liver damage) and alcoholic psychosis, we can determine the general trend. This suggests an initial collapse in drinking levels immediately after the introduction of Prohibition in 1920, followed by a fairly rapid return to about 60-70 per cent of the normal level, through a combination of black-market activity and exploitable loopholes. Not a complete failure on its own terms then, but hardly a success.
Ultimately, any policy aimed at limiting the harm of alcohol needs to accept that people like drinking it. Firstly, this raises questions over whether or not it's right to penalise people for what many regard as an enjoyable, social activity.
Prohibition is largely seen as an ineffective measure to control alcohol consumption
Alcohol is responsible for about six per cent of deaths worldwide and UK consumption is rising
But for social scientists, it means that people will pay to drink, regardless of legality. Analysis of the figures from Prohibition shows that the fall in drinking was likely due to cost rather than any social changes–the increased difficulty of hiding and smuggling alcohol meant that prices were as much as three times higher in 1930 than pre-Prohibition. In economics, some goods are considered 'elastic', meaning that when prices rise, people consume less of them. Alcohol is considered a very inelastic good, meaning that changes in price do surprisingly little to change the amount people consume. Such a low elasticity is also evidence that the other effects of Prohibition can't have had much impact at all. This inelasticity means that taxing alcohol doesn't have as large an impact on consumption as might be thought, but it also means that such taxes are good revenue earners. The price of a pint includes about a pound of taxes, half each from VAT and alcohol duty. The costs of whiskey and vodka are both about two thirds tax. And yet, in spite of raising revenues of over £9 billion per year on alcohol duty alone, there are still major questions over whether this is enough. It certainly covers the bills for the NHS, which estimates put at about £3 billion, but for society as a whole there are much larger costs–adding in factors like alcohol-related crime, unemployment and lost productivity, government sources estimate costs of around £20 billion. There's a fundamental agreement that something must be done to prevent this, but a divide as to what, partially because so many of us enjoy drinking and don't want too much interference in how we do it.
Studies tend to show that government regulation of how alcohol is sold has little impact on its effects. For instance, changes in pub opening times change when alcohol-related violence happens, but have little impact on the number of violent incidents. The number of places where people can buy alcohol appears to have some correlation with increased harm, but here correlation and causation are difficult to disentangle. The largest effect on reducing alcohol-related problems is seen following an increase in the price of alcohol. There are two options for how to do this: a minimum price or higher taxes.
NICE, the national advisory body for healthcare, advises a minimum price per unit of alcohol. Any alcohol served in pubs, and most alcohol for home consumption, already costs more than this, so the effect of this change would be concentrated on those after alcohol without frills: binge-drinkers and alcoholics. On the other hand, it would also be concentrated on the less well-off, who typically buy cheaper brands. NICE argues that the cost savings from lower rates of alcohol-related diseases and social problems would more than offset the reduction in government revenue. The Institute for Fiscal Studies (IFS) is not so sure, and worries that any difference between the sales price and the previous cost of alcohol goes straight into the pockets of the supermarkets or off-licences. This money might well be used to advertise the sale of more alcohol. Studies suggest these adverts are particularly effective at increasing drinking among the young, although banning adverts targeting the young does not necessarily help, since they still respond to non-targeted adverts.
The IFS therefore suggests an increase in taxes. Currently, alcohol duties are complicated, meaning that a unit of alcohol is much cheaper to buy in cider than in beer or wine. All types of alcohol are historically cheap relative to earnings. The IFS suggests both raising and rationalising taxation, bringing alcohol costs into line with earnings. A constant tax per unit of alcohol would discourage binge drinking and definitely raise money for the treasury. This tax rise would affect all drinkers but, assuming that supermarkets passed all the costs on, would raise the price of cheaper alcohol more. In spite of lower average alcohol consumption among the poor, this change would proportionally affect them more than the rich, though less so than minimum pricing. The low elasticity of alcohol means that although the amount of alcohol sold would decrease, government revenue would still increase from the tax alone, with savings on healthcare and crime offering a bonus. The IFS predicts that this 'rational tax' would result in similar increases in expenditure for the public as the proposed minimum pricing, but the money goes to the government rather than retailers.

Minimum pricing and a constant tax per unit of alcohol are two possibilities being explored to reduce alcohol consumption

These strategies have both been tested. In most of Canada, regional authorities have a monopoly on the sale of alcohol, meaning that they still end up with the increased income from minimum pricing. Data show that a 10 per cent increase in the minimum price of alcohol resulted in only a 3 per cent decrease in alcohol consumption, but a 9 per cent reduction in hospital admissions for alcohol-related problems. These data support models developed by the University of Sheffield, suggesting that minimum pricing has a slightly more focused effect on reducing crime and public health problems than increases in taxation.
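The revenue logic is simple enough to sketch in a few lines (Python; a stylised calculation, treating the Canadian figures above as a constant elasticity of roughly -0.3):

```python
# Stylised arithmetic behind 'inelastic goods raise revenue when taxed'.
# The elasticity of about -0.3 is inferred from the Canadian figures above
# (a 10 per cent price rise cutting consumption by roughly 3 per cent).
elasticity = -0.3
price_rise = 0.10

quantity_change = elasticity * price_rise            # -0.03
revenue_factor = (1 + price_rise) * (1 + quantity_change)

print(f"consumption change: {quantity_change:+.0%}")  # -3%
print(f"revenue change: {revenue_factor - 1:+.1%}")   # +6.7%
```

Because consumption falls more slowly than the price rises, total spending goes up–which is exactly why a tax captures that extra money for the treasury while a bare minimum price hands it to retailers.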
But should we raise prices at all? Won't people just turn to illegal sources, like during Prohibition? Certainly people will seek alternative options–either purchasing illegally or buying more alcohol from abroad–but history suggests consumption will still go down. Law-abiding, non-violent drunks can argue that their drinking pays more into society than their health problems take out. What is the payoff between their freedom and the risk of doing harm to those around them? Do we limit the freedom of supermarkets to sell alcohol at any price, but give them more money to spend on advertising? Or should we let them tempt customers with below-cost alcohol if they wish, but ensure they give the government enough money to deal with the consequences? Or a combination of the two? Grab a pint and join the discussion.

Robin Lamboll is a 2nd year PhD student in the Department of Physics.
The Dolphin Whisperer
Joanna-Marie Howes talks with director Christopher Riley about his latest documentary

John Cunningham Lilly was a neurophysician, scientist and writer, inventing the sensory deprivation tank and working closely with the American Navy and NASA. Yet he died disgraced among the shreds of his work. What went wrong?
Lilly theorised that if humans could teach dolphins to understand the English language, then we could learn how to communicate with extraterrestrial life. Rooted in idealism, the experiments eventually ended amidst rumours of animal abuse and drug-fuelled scandal. Until now, those involved have not spoken publicly about the experiments. Recently, however, producer and director Christopher Riley was given the opportunity to speak to the participants in this ground-breaking, and ultimately career-breaking, experiment. I talked to him about his documentary 'The Girl who Talked to Dolphins' to find out how an attempt to reach into the cosmos proved to be the rise and fall of John Lilly.
In the 1950s, Lilly began experimenting on dolphins. His wife noticed that the animals appeared to be mimicking the human voice. Lilly was convinced that the dolphins were not only mimicking, but trying to speak to the research team.
Christopher Riley set about redressing the misinformation on the study
Christopher says, "John Lilly's idea of building an interspecies communication bridge was rooted in the dream of enriching our culture with that of another species. It was a vision which Lilly extrapolated as far as a Cetacean Chair at the United Nations, where dolphins and whales could share their perspective on the planet with us. It might seem fanciful today, but then so does landing on the Moon, which we did manage to accomplish in the 1960s and early 1970s!"
Lilly believed that for the first time an equally intelligent species was trying to make contact. His findings gained worldwide attention, including that of a team of astronomers from the 'Search for Extraterrestrial Intelligence' (SETI). At the height of the space race, Lilly's ideas on interspecies communication interested astronomers anticipating language barriers posed by an intelligent extra-terrestrial race. NASA provided Lilly with financial support, enabling him to establish the Communication Research Institute, or 'The Dolphin House'. The Dolphin House was built to provide the best conditions possible for its animal inhabitants, brought straight from Marine Studios in Miami, with a pool directly linked to the sea to keep the water fresh. Lilly had gathered a notable research team to help him in his task, including the Cambridge-educated anthropologist Gregory Bateson, and a resident vet to ensure the dolphins' health.
In 1964, a young woman, Margaret Howe, came to investigate the rumours surrounding this strange, isolated building. She was granted access by Bateson, who quizzed her on the animals' behaviour. He was impressed with Margaret's perceptiveness and allowed her free admission to the facility. With Lilly travelling, much of the work was carried out by the rest of the team. Dolphins are quick to develop friendships with humans, and those at the Dolphin House were no strangers to people, having been in the original version of Flipper. Margaret was given the job of training Peter, who was chosen because he'd had no previous language 'tuition'.

Margaret Howe was given the job of training the dolphin
Aside from communicating with each other underwater through clicks and squeaks, dolphins can make sounds by opening and closing 'lips' on their blowholes in the open air. Peter's lessons included vowels, select words and counting to three. Though sometimes reluctant, he began to mimic Margaret's voice. She asked him to repeat 'One, two, three....' with a raised intonation on 'three'. Peter recognised the change in pitch and copied her words, although the work was monotonous and frustrating on both sides. Then Margaret had an idea: instead of leaving the dolphins alone every night when the researchers went home, she'd live with Peter. Lilly approved whole-heartedly, and the Dolphin House underwent a major overhaul to create multi-level watertight accommodation where dolphin and human could coexist. Throughout the following months, Margaret taught Peter as if he were a human child. Peter made progress, but was restricted in the sounds that his blowhole could physically make. Although Lilly was happy with Margaret's progress, other members of the team were sceptical of the experiment's validity. NASA too, after sending Carl Sagan to the facility, was dubious about the work's merits. While Peter was copying sounds, he did not appear to actually understand the meaning of conversations. At this stage Bateson and NASA thought that inter-dolphin communication could tell them more than Lilly's approach. Yet Lilly stuck to his agenda, and Margaret continued her work.
However, other problems were developing. Lilly routinely injected himself with LSD, convinced that it provided new insight into the workings of the brain. Though it may seem extreme today, as Christopher describes, "Through the 1950s and into the early 1960s… US-government sponsored programs used it on both human and animal subjects to better understand its potential benefits." Now though, Lilly's aim was to inject the dolphins with the drug in order to 'free' the mind and so facilitate his perception of the dolphins' communication. Margaret wanted no part of these experiments and stalled Lilly's plans. Though worried about his escalating drug use, this was not her only concern. Peter was a young male dolphin, and was becoming increasingly sexually precocious with Margaret. To relieve his urges, Peter was allowed access to the female dolphins. However, transporting him on the facility's lift became more and more impractical. Margaret decided to take the unprecedented approach of manually relieving Peter herself. She stated, "It just became part of what was going on... Like an itch – we'll scratch it and then we'll be done... Just move on."
Although Margaret still believes that her time with Peter was too short to achieve any substantial results, by 1965 the experiment's lack of progress had its funding bodies concerned and they began to withdraw their support. Lilly was now desperate for results and injected the dolphins with LSD. Against his expectations, it appeared to have no effect on them. Lilly's obsession eventually overrode scientific ethics and, in a frantic and cruel attempt to provoke a response, he took a jackhammer and started to drill into the floor of the Dolphin House; a serious assault on the animals' super-sensitive hearing. For the rest of his team, this was the last straw. By then, Lilly's money was gone and he was running up huge debts. As his interest in drugs grew, his interest in the dolphins waned. The animals were transported to another research facility, where they were kept in tiny plastic tanks with rancid water and no natural light. Dolphins don't breathe involuntarily but consciously choose to take each breath. Eventually, the stress grew too much for Peter, and he simply stopped breathing.
"I'm pleased that I was able to help Margaret to redress all the misinformation and misreporting of exactly what went on at the Dolphin House; it's important that there's an accurate record of these events," said Christopher in the interview. In the following years, Lilly sank deeper into drug culture. LSD and ketamine took their toll on his mind, leading him to believe in cosmic entities which he dubbed the Earth Coincidence Control Office. However, he released his dolphins and campaigned for their wellbeing. Though his experiments are marred by scandal, and viewed as pure mimicry rather than inter-species conversation, it is partly as a result of Lilly's efforts that dolphins are now protected, helping to cement a mutually respectful relationship with these fascinating creatures.

In part due to Lilly's efforts, dolphins are now protected animals
Joanna-Marie Howes is a post-doctoral researcher in the Department of Biochemistry.
Absolutely FameLab-ulous
Jonathan Lawson reminisces on this year's FameLab, a competition for science communication

In June 2005, Cheltenham Science Festival played host to the very first FameLab competition, a talent show-like science communication contest which challenges participants to explain a scientific topic to a general audience within three minutes. Just two years later, FameLab became an international competition, and now in its ninth year, FameLab 2014 brought together 23 national champions from 25 different countries. The FIFA World Cup was around for 50 years before it had that number of countries. The hopeful contestants represented the full diversity of young scientists from around the world, ranging from undergraduate students to lecturers, and were discussing the full spectrum of scientific research. Over two intense semi-final evenings, the 23 hopefuls were whittled down to just ten by select panels of expert judges. Their three-minute talks covered assorted topics, from the psychology of smiling to rivers and quantum mechanics. The following night the lucky ten returned with new performances to vie for the FameLab International title.
You'd be forgiven for expecting FameLab International to have a tense, competitive atmosphere, with everyone's eyes firmly on the prize. However, this gathering of science communicators was a surprisingly relaxed and enjoyable affair. Each one of the competitors had already won the national FameLab competition in their own country. As far as they were concerned they'd won their prize, and a trip to Cheltenham, complete with a master class from FameLab's public speaking expert, Malcolm Love, was simply part of the reward. All had come together to learn from each other and share a mutual enthusiasm for science, more than to compete.
Caroline Shenton-Taylor won the UK competition
The grand prize was awarded to Pádraic Flood, representing the combined Benelux states of the Netherlands, Belgium and Luxembourg, for his talk on the possibility of developing more efficient methods of photosynthesis.
The UK is the host nation and one of the front runners in science communication, and FameLab here is a great challenge for anyone over 21 who wants to take a shot at talking about science. You need at least two three-minute speeches prepared, possibly with props. You'll need these for the regional competitions held around the country, including here in Cambridge. The winners, and one wildcard runner-up, are given a special master class in communication and are invited to the national final in London. This year, the lucky wildcard UK finalist was BlueSci's own Robin Lamboll, who won his place in the final with an astonishing performance of a three-minute poem explaining ocean ecology. The eventual UK champion was physicist Caroline Shenton-Taylor on 'How a Cup of Tea Stirred up Science', an exploration of how we can learn about atomic structure using nuclear magnetic resonance by stirring our samples. Caroline went on to win the FameLab International alumni award alongside Joanna Bagniewska of FameLab Poland, who showed how to train bees to detect explosives.
FameLab 2015 starts soon, with more countries than ever before. Whether you love talking about science, or have yet to give it a go, you can find out more and register to get involved online at famelab.org

Jonathan Lawson is a 4th year PhD student in the Department of Genetics.
References:
Our Colourful History – Simon Garfield, Mauve: How One Man Invented a Colour that Changed the World
Voyager 1 – http://voyager.jpl.nasa.gov/spacecraft/goldenrec.html
Bioelectricity – http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3243095/
Measles – http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1123944/
Nature vs Nurture – Ellis and Boyce (2008), Current Directions in Psychological Science, 17(3), 183-187
Lab on a Chip – Chattopadhyay et al. (2014), Single-cell technologies for monitoring immune systems. Nature Immunology, 15(2), 128-35
YouTube – https://www.khanacademy.org/
Alcohol Pricing – https://www.nice.org.uk/Guidance/Lifestyle-and-wellbeing/Alcohol
FameLab – http://famelab.org/uk/
Pavilion
In the course of evolution, insects have developed a variety of strategies to reduce surface contamination and thus to avoid inhibition of physiological functions. For example, ants regularly clean their antennae with a sophisticated cleaning structure located on their forelegs. During a cleaning movement, the ant's antenna is clamped into a two-piece cleaning device and then pulled through unidirectionally. In a second step, the ant cleans these structures with its mouthparts in order to allow reuse. The antenna cleaner in ants is equipped with tiny hairs, which form comb- or brush-like arrays and vary in shape, size or spacing. Having different hair configurations on one cleaning structure enables the insects to pick up surface contaminants of different sizes with a single cleaning stroke. This colourised Scanning Electron Microscopy (SEM) image shows a 10 µm polystyrene particle (ten times smaller than the diameter of a human hair), itself covered with smaller particles, attached to a cleaning hair after its removal from a Camponotus rufifemur ant's antenna. So far, nothing is known about the forces acting between the cleaning hairs and the dirt particles, and this is the first time that an SEM image has shown adhesion between a single cleaning hair and a contaminant. Understanding the underlying principles of the cleaning mechanisms of insects might enable us to develop artificial devices for surface cleaning of sensitive systems on a micro- or nano-scale.

Alexander Hackmann is a 3rd year PhD student in the Department of Zoology.
Weird and Wonderful
A selection of the wackiest research in the world of science which, in this issue, comes from a competition for sixth-form students

Eau de Malaria
It was proposed in the past that people and animals infected with malaria had a higher chance of attracting mosquitoes, but why this occurred was unknown. Research conducted in Switzerland and the US has revealed that malaria could be changing the odours emitted by a host by altering the mixture of volatiles produced. Volatiles released by both infected and healthy mice over the duration of a malarial infection were collected, and gas chromatography and mass spectrometry were used to identify the compounds that could augment the mice's attractiveness to mosquitoes. Interestingly, it was discovered that mosquitoes were only attracted to the infected mice during the contagious stage of malaria. It is suggested that changing the host's odour is essential for the parasite to complete its life cycle. Following infection, this consists of reproducing in the liver, spreading to the blood and then being consumed by mosquitoes (Anopheles), in which the parasites multiply in preparation for transmission to another host. Gametocytes (the blood-stage parasites) are thought to be responsible for changing odour profiles. If this also holds for humans infected with malaria, it may be possible to recognise malarial infections in people who are yet to show symptoms. RH
Turbines Perturb Hurricanes
Mark Jacobson, a Professor of Civil and Environmental Engineering at Stanford, has been researching hurricanes and environmental engineering, assessing what effect a large wind farm would have on a hurricane's power. As part of his research, he developed a model which simulates hurricanes. He then developed the model further and simulated what might happen if a hurricane encountered an enormous wind farm stretching for many miles offshore and along the coast. Incredibly, he found that the wind turbines could disrupt a hurricane enough to reduce wind speeds by up to 92 miles per hour and decrease storm surge by up to 79 per cent. Applying the model to the case of Hurricane Katrina, he found that 78,000 wind turbines off the coast of New Orleans would have weakened the hurricane: in the computer model, by the time Katrina reached land, its wind speeds had decreased by 36-44 metres per second and the storm surge had decreased by up to 79 per cent. It's safe to say wind turbines save money and lives! BB
Starch Contrast
Counterfeit money is becoming a growing problem in today's society, the majority of it down to fake bank notes found in circulation. There are many different ways that a fake bank note can be identified; one of the most instantaneous and effective tests is to use a Counterfeit Banknote Detection Pen. The chemistry seems fairly complicated, but the test is very simple to carry out and easily explained. The ink of the pen contains iodine, which reacts with the starch in the wood pulp used to make conventional paper, leaving a blue-black stain. The paper used to make bank notes, however, contains no starch-based wood pulp and is instead made from fibre, so no stain appears when a line is drawn. When the starch reacts with the iodine, a starch–iodide complex is formed; charge transfer between the iodide ions and the starch changes how the complex absorbs light, turning the original orange colour of the iodine blue-black. This reaction reveals the presence of starch on a banknote, identifying whether it is genuine or not. ER
Illustrations by www.alexhahnillustrator.com
Write for BlueSci
Feature articles for the magazine can be on any scientific topic and should be aimed at a wide audience. The deadline for the next issue is 24th October 2014. Email complete articles or ideas to submissions@bluesci.co.uk
We need writers of news, feature articles and reviews for our website. For more information, visit www.bluesci.org
For their generous contributions, BlueSci would like to thank CSaP and Churchill College. If your institution would like to support BlueSci, please contact enquiries@bluesci.co.uk