Synapse Science Magazine Issue #5

SYNAPSE - THE SCIENCE MAGAZINE WRITTEN BY STUDENTS FOR STUDENTS

ISSUE 5 - June 2013 - FREE

On the cover:
HIV: 30 Year Anniversary Review
Mosasaurs: Giant Marine Carnivores of the Late Cretaceous
Another Climate Conundrum
Venom Overkill


EDITORIAL

A Message from the Editor In Chief

Hello! Welcome to issue 5 of Synapse Science Magazine. This issue includes a diverse range of topics, including climate change, sleep control and even the science behind ice cream! Be sure to share this magazine with your friends and colleagues and check out our blog. We hope you enjoy the read. If you have any comments or wish to join our magazine please contact synapse.scimag@gmail.com

The Synapse Team

Tom Stubbs, Editor In Chief
Alicja Jedrzejewska, Senior Editor and Vice President
Felicity Russell, Senior Editor and Secretary
Oliver Ford, Senior Editor and Treasurer
Daniel Ward, Senior Editor and Graphic Designer
Molly Hawes, Managing Editor
Gemma Hallam, Events Manager
Louisa Cockbill, Senior Editor and Publicity Officer
Mary Melville, Fundraising Officer
Felix Kennedy, Katherine MacInnes, Saraansh Dave, Alex Pavlides, Senior Editors

Article Editors

Hannah Bruce Macdonald, Juliette Curtis Hayward, Rachel Greenwood, Frances Cartwright, Georgina Maguire, Georgina Winney, Alfred Omachar, Jacob Hutchings, Eira Fomicheva, Erik Müürsepp, Matthew Cole, Naomi Farren, Shakir Misbah, Ryan Hamnett, Natalie Parker, Julia Walton, Ione Bingley, Cher Bachar, Tom Ridler, Jo Sadler

CONTENTS

On the cover
4. Venom Overkill
8. Another Climate Change Conundrum
12. HIV: 30 Year Anniversary Review
20. Mosasaurs: Giant Marine Carnivores of the Late Cretaceous

Articles
4. Venom Overkill
6. The Science of Ice Cream
7. Alcoholic Primates?
8. Another Climate Change Conundrum
15. Captivating Creatures
17. Sleep Control
18. Can Crops Combat Climate Change
20. Mosasaurs: Giant Marine Carnivores of the Late Cretaceous

Opinions
10. Time In Physics
16. Cherry Picking Medicine Marvels

Features
12. HIV: 30 Year Anniversary Review
22. A Storm On Our Horizon

Join us online!
www.synapsebristol.co.uk
@synapsebristol




ARTICLES

Venom Overkill
Why Are Snake Bites So Deadly?

When it comes to venomous animals, we have a bizarre curiosity to discover which species holds the title of ‘the worst’. This macabre competition often sees distantly related groups compared, be it sea wasps, stonefish or venomous snakes. It is not uncommon to see the danger potential of venomous animals measured in potential human casualties should the animal in question go on a killing spree, as improbable as that is. These figures range from tens to hundreds for the most venomous species, and while the comparative levels of toxicity are interesting, I find the question of why they are so high more interesting still. Looking in particular at snakes, if one bite could take down forty people, why not tone it down to a safe bet of two or three? Surely such lethality is overkill, and wasteful of metabolically costly materials, especially considering the size of their prey. The clearest indication that this is not the case is the fact that many species worldwide possess that lethality: if there were no selection pressure on them to produce an apparent surfeit of venom, they would quickly lose it.

Snake venom serves the dual purpose of food acquisition and defence. Although important to survival, defence is arguably the secondary of the two. Most snakes avoid confrontation by relocating or using threat displays, and even when biting, many species are unwilling to use venom defensively. These include Belcher’s sea snake, which, despite possessing one of the most devastating venoms of all, will only bite after repeated physical assault, and 75% of the time will give a ‘dry’ first bite as a warning before using any of its precious venom on a second. (Some aggressive species will use venom in most or all defensive bites, including the black mamba, whose bite is known as ‘the kiss of death’ and, left untreated, has a fatality rate of close to 100%.) Bearing defence in mind, it starts to make more sense that snakes make lots of potent venom: they sometimes have to take down enemies many times their size and strength, and the faster they do it, the better their chances. It has been shown that snakes can modulate the volume injected depending on the size of the target, injecting more for big creatures like humans.

However, this is not the whole story. We also need to look at the primary use of venom: the immobilisation of prey. First of all, the prey must be killed or incapacitated quickly. The faster this happens, the lower the chance the prey will escape to die beyond the hunter’s reach, or fight back and cause damage to the hunter. Secondly, the venom must overcome the prey’s biological resistance. Just as we are now struggling against bacterial resistance to antibiotics, snakes have been struggling with resistance to their venoms for millions of years. Resistance and even immunity to venoms has been observed in many species, including those that prey on venomous snakes (e.g. mongooses or grasshopper mice) and those that are preyed upon (e.g. dormice and ground squirrels). Resistance to venom can even be gained by humans through repeated low-level exposure, an example of Mithridatism. Venom is therefore a case study of the Red Queen hypothesis: both predator and prey must evolve as fast as they can just to stay in the same place. Indeed, studies have shown that genes which code for venom have high rates of evolutionary change, and favour novel combinations that give rise to new arsenals of toxins. Snake venoms also employ toxin cocktails in order to get around any resistance the prey may have to particular proteins, just as we can use antibiotic cocktails.

Returning to my original question, it is no longer hard to see why venom has such an apparent overkill effect on humans. Relative to body size, more effective and more copious venom is required for prey, so when we are exposed the results are often disastrous. Venom puts a much smaller selection pressure on humans than on prey, so we have not gained any resistance to it. With that said, is it now understandable that these snakes are so hated and feared in many countries around the world? Well, no; what is required is respect. The risk posed by snakes can easily be neutralised with education and common sense on our part. The aggressive, angry snakes portrayed by so many TV shows fuel the myth that they want you dead; however, these individuals are often teased in order to elicit a reaction. There is a reason why UK adder bites are almost exclusively on the hands of teenage boys. All the snake wants is for you to leave it alone; why else would snakes have evolved all the bright colours, hoods and rattles to keep you away?

Cormac Kinsella
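The ‘potential human casualties’ figures quoted for venomous snakes boil down to simple arithmetic: venom yield divided by the median lethal dose scaled to human body mass. A minimal sketch of that calculation, with purely illustrative numbers (the yield, LD50 and body mass below are assumptions, not data for any real species):

```python
# Back-of-envelope estimate of "human lethal doses per bite".
# All input values are illustrative assumptions, not measured data.

def lethal_doses_per_bite(venom_yield_mg, ld50_mg_per_kg, body_mass_kg=70):
    """Number of human median lethal doses delivered in one full envenomation."""
    lethal_dose_mg = ld50_mg_per_kg * body_mass_kg  # LD50 scaled to body mass
    return venom_yield_mg / lethal_dose_mg

# Hypothetical snake: 50 mg average yield, LD50 of 0.03 mg/kg
print(round(lethal_doses_per_bite(50, 0.03), 1))  # prints 23.8
```

Even with modest assumed values, one bite carries tens of human lethal doses, which is the ‘overkill’ the article sets out to explain.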




ARTICLES

The Science of Ice Cream
Toby Benham

Dating back over 2000 years, ice cream is now a popular treat worldwide. It first existed in the form of a milk and rice mixture frozen by packing in snow. Legend has it that Roman emperors would send slaves to mountaintops to bring back fresh snow, which would subsequently be flavoured. Following this theme, during the Persian Empire it was common for people to pour grape juice over snow to fashion their ice cream. Nowadays, thicker ice creams are favoured for general consumption – for example Ben and Jerry’s or Häagen-Dazs. Ice cream became much more widely consumed during the 20th century with the advance of modern refrigeration and the freezer becoming a regular feature at home.

Ice cream consists of molecules of fat, microscopically dispersed throughout a water-sugar-ice structure with air bubbles present. Having one substance dispersed in another like this is called a colloid; other examples include milk and mayonnaise. These would all further be classified as emulsions – mixtures of two or more liquids that are normally immiscible. A number of chemical processes can be used to create an emulsion. Stabilisers and emulsifiers are commonly employed to stabilise an emulsion by increasing its kinetic stability; they also act to hold the mixture together and maintain texture. Lecithin is a ubiquitous stabiliser, found naturally in a range of foods such as egg yolk. This is the reason eggs are added to many recipes: to stabilise the cooking mixture and any emulsion that forms, which is important when fat is being added.

Allowing ice cream to melt before being refrozen changes its chemical properties. It tends to separate out after melting because the emulsifier loses the even distribution it had when the ice cream was first made. This causes the water content to rise to the top, as it is less dense than the rest of the mixture, and can give the ice cream a gritty texture due to the layer of crunchy ice that forms on top.

During the manufacturing process, flavourings are added. All manner of ice cream flavours now exist, from the delicious to the bizarre. There is even one ice cream parlour in Brighton I once visited offering bacon-flavoured ice cream. One slightly peculiar flavouring found in ice cream is skatole. Produced in the human digestive tract by the breakdown of the amino acid tryptophan, skatole is what gives human excrement its distinctive smell. Yet at small concentrations this molecule actually smells quite pleasant and is even used in perfumes!

Ice cream is something that many people do not stop to think about in detail, but the science behind it is quite involved. After a long journey, ice cream has been thoroughly developed into a product that millions of people now consume worldwide, every day.

Alcoholic Primates?
Alcohol Points to Development in Primate Family Tree

While the body’s ability to break down ethanol might not be the most interesting pub conversation, it has been useful for determining our evolutionary progression from extinct primates. Alcohol dehydrogenase is the enzyme responsible for catalysing the breakdown of ethanol on entry to the body. The gene encoding this enzyme has been sequenced, and analyses can infer the evolutionary changes to its genetic code, estimating what the DNA sequence may have looked like at points of evolutionary branching.

Most primate ancestors would not have been capable of digesting ethanol, but the evolved ability has been linked to gorillas and chimps taking up life on the ground. Living on the ground, these animals would have been exposed to more fermenting, ethanol-producing fruit than would be found higher up in the branches, creating an evolutionary advantage for those species with more active ethanol-degrading enzymes. The inferred DNA sequences for extinct primate species code for an enzyme that is 50 times less effective than our current one. This points to a branching from a common ancestor around 10 million years ago into tree-dwelling and ground-dwelling primates, with their differing ethanol exposures driving the genetic divergence.

This type of estimation of what a genome once looked like is very useful in cases like these, where the history is not well recorded in fossil data. More research and evidence will shed further light on the development of the primate family tree.

Hannah Bruce Macdonald
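The ancestral-sequence inference described for alcohol dehydrogenase can be illustrated with Fitch’s maximum-parsimony algorithm on a toy tree. This is only a sketch of the general idea: the tree shape and nucleotide states below are invented, and real studies use far richer statistical models than parsimony.

```python
# Toy sketch of parsimony-style ancestral state inference at one DNA site.
# Species arrangement and states are hypothetical illustrations.

def fitch_sets(left, right):
    """Fitch's rule for one site: intersect child state sets, or union if disjoint."""
    common = left & right
    return common if common else left | right

def ancestral_site_sets(tree):
    """Post-order pass returning the candidate-state set at the root of `tree`.
    A tree is either a 1-letter string (a tip) or a (left, right) pair."""
    if isinstance(tree, str):
        return {tree}
    left, right = tree
    return fitch_sets(ancestral_site_sets(left), ancestral_site_sets(right))

# One alignment column for a made-up tree ((species1, species2), species3):
tree = (("A", "A"), "G")
inner = ancestral_site_sets(("A", "A"))   # ancestor of the two ground-dwellers
root = ancestral_site_sets(tree)          # deeper common ancestor
print(sorted(inner), sorted(root))        # prints ['A'] ['A', 'G']
```

The ambiguous root set shows why such reconstructions are estimates: the data alone cannot always decide which state the common ancestor carried.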




ARTICLES

ANOTHER CLIMATE CHANGE CONUNDRUM
Does Climate Change Affect Parasite Epidemiology?

In recent months climate change has been at the forefront of the world’s media. Facts and figures are being thrown around about the level of destruction we humans have apparently brought to our little planet. This was highlighted in the BBC One programme Africa, which aired on the 6th February. At the end of the series we saw the illustrious Sir David Attenborough playfully engage with a blind baby rhinoceros, before approaching the true reality of our problem. According to the BBC, the average temperature in Africa has increased by 3.5ºC in the past century, but just where did they get these figures? The widely trusted organisation has since apologised for erroneously exaggerating the warming statistics.

Despite this, there is still evidence of a changing climate closer to home. The climate in the UK has been highly variable in recent years, but with a general trend towards warmer, wetter winters and considerably warmer summers. The national average monthly temperature increased by 1ºC between 1961 and 2004, and going forward there is the potential for temperatures to rise by between 2ºC and 3.5ºC by 2080. At a global level, temperatures are predicted to rise by between 1.1ºC and 6.4ºC by the end of the 21st century, depending on the success of schemes to reduce greenhouse gas emissions.

Parasitic nematodes of vertebrates are very successful at adapting to niche and extreme environments because of their exceptionally high reproductive capacity. This ability has enabled nematodes to inhabit a diverse array of environments ranging from polar regions to the tropics. Even more remarkable is their ability to adapt simultaneously to both the external climate and the internal environment of their host. Whilst the environment within the host is relatively constant, the free-living larval stages need to withstand major fluctuations in daily temperature, soil moisture, humidity and stochastic weather events, as all these factors impact on development and parasite viability. It is predicted that climate change will have profound effects on the patterns and prevalence of parasitic diseases, particularly for parasites whose lifecycle involves a significant amount of time outside of a host. The free-living lifecycle stages are highly sensitive to the selection pressures of temperature, humidity and levels of rainfall. For example, Fasciola hepatica is a flatworm parasite that primarily affects sheep and cattle. The incidence of this liver fluke is directly correlated with rainfall, and it is significantly more prevalent in years when summer rainfall is high.

The two most interesting examples of parasite adaptation to climate change are the northerly spread of Haemonchus contortus and the southerly spread of Nematodirus battus. H. contortus, being relatively thermophilic, thrives at higher temperatures; as such, its lower temperature threshold for development is higher, as is its optimal temperature range for development, compared to other nematode parasites of sheep. Despite these higher thresholds, its importance as a parasite is increasing in northern Europe. One might hypothesise that northern and southern European populations of H. contortus have differences in temperature dependence and, as such, different thresholds for development. However, studies have found that the temperature dependence of this species is unaltered. Rather than altering its optimal threshold for development, it appears that northern European populations of H. contortus have adapted to colder conditions through increased use of hypobiosis, a form of arrested development in the host, as an overwintering strategy.

The second example of nematode adaptation to warming is seen in Nematodirus battus, a gastrointestinal nematode of lambs and goats. Historically, N. battus outbreaks were most commonly seen in spring when, following a period of chill over winter, ambient temperatures begin to rise. Once temperatures rise above 10ºC, N. battus larvae begin to develop. Following research carried out by Van Dijk and Morgan and reports from livestock producers, it is now evident that many farms in the UK are suffering both spring and autumnal outbreaks of nematodirosis in sheep, in complete contrast to published data on hatching requirements. The existence of a second autumnal peak in larval abundance was first described in 1987; however, little advancement was made towards understanding the determining factor behind this second outbreak. It could be hypothesised that the increase in mean monthly temperatures, combined with warmer, wetter winters, has led to adaptive selection resulting in multiple annual outbreaks. This geographic expansion is a clear indicator of changing patterns of parasite development and lifecycle characteristics brought about by climate warming. This could consequently impact livestock farmers, who may face increased costs of treating parasite infection, or a direct financial loss if herd mortality is raised.

Owen Gethings
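The 10ºC development threshold reported for N. battus lends itself to a simple worked example: given a series of daily mean temperatures, count the days on which larval development could begin. The temperature series below is invented for illustration; real epidemiological models also account for chilling requirements, moisture and accumulated degree-days.

```python
# Minimal sketch of a temperature-threshold check for larval development.
# The threshold follows the 10 ºC figure in the article; the daily
# temperature series is hypothetical.

DEVELOPMENT_THRESHOLD_C = 10.0  # N. battus larvae develop above this

def development_days(daily_mean_temps_c, threshold=DEVELOPMENT_THRESHOLD_C):
    """Count days on which the daily mean temperature exceeds the threshold."""
    return sum(1 for t in daily_mean_temps_c if t > threshold)

# Hypothetical week of daily means (ºC) as winter chill gives way to spring:
week = [6.5, 8.0, 9.5, 10.5, 11.0, 9.0, 12.5]
print(development_days(week))  # prints 3
```

Warmer, wetter winters shift more days above the threshold, which is one way the multiple annual outbreaks described above could arise.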




OPINIONS

TIME IN PHYSICS Leslie Bicknell

“It is impossible to meditate on time and the mystery of the creative passage of nature without an overwhelming emotion at the limitations of human intelligence.”
A. N. Whitehead, The Concept of Nature, Chapter III

What is time, really? Time rules our lives, yet it is surprisingly easy to lose track of and to ignore, even in physics, the most fundamental science. Professor J. L. Synge once wrote of the importance of measurements of time in physics that “the theory underlying these measurements is the most basic theory of all.” But this is not reflected in the physics of today, where time is often ignored altogether. Post-Renaissance physicists prefer to deal with ‘space-like’ concepts alone, striving to express all relations in the form of laws, including physical laws which are constant, uniform, and supposed to hold true at all points in time: past, present and future. Our judgements concerning time, and events in time, themselves appear to be ‘in’ time; because of this, it has been questioned whether time exists outside of us at all.

10 | SYNAPSE

Thermodynamics presents us with a paradox. Was there a beginning of time? If there were no beginning, the universe would have had an infinite amount of time to reach thermal equilibrium. However, the universe is not in thermal equilibrium, so this cannot be the case. If there was a beginning of time, this implies something came from nothing. Yet this cannot be true either, as it would violate the conservation of energy. Thus the first and second laws of thermodynamics (energy conservation and thermal equilibrium) appear to contradict one another. One solution is outlined in the No-Boundary Proposal, developed by the physicists James Hartle and Stephen Hawking. The proposal suggests there was no time before the big bang, only space. Conservation of energy would then not be violated, since it was not space that had a beginning, but time itself.

Einstein’s theory of relativity encourages us to think of time and space on an equal footing. Time is relative and, like space, is influenced by the presence of matter. But unlike space, many details about time are difficult to grasp. Space seems to be presented to us all of a piece, whereas time comes to us only bit by bit. Time, unlike space, is asymmetrical: bending space or manipulating relative motion changes how quickly relative time flows, but it still flows in the same direction. In quantum mechanics, time is treated more as a backdrop onto which a physical system is projected, whereas in relativity, time is incorporated into the framework of the theory. Physicists are currently formulating a theory of quantum gravity, no easy task, as it requires reconciling two of the biggest fields in physics: quantum mechanics and relativity. Developing a theory of time may not only help reconcile these conflicting theories but also shed light on entirely new ideas in physics. Einstein once suggested that time is an illusion. But if it is, what does that mean for us and for physics? Time is fundamental, underlying our notions of cause and effect, and it sits at the root of many of our physical assumptions. Yet little about time is understood, and it deserves more attention.
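The claim that motion changes how quickly time flows, but never its direction, can be made concrete with the standard time-dilation formula of special relativity (a textbook result added here for illustration; it does not appear in the article itself). For a clock moving at speed $v$, the proper time $\Delta\tau$ it records over a coordinate time $\Delta t$ is

```latex
\Delta\tau = \Delta t \,\sqrt{1 - \frac{v^{2}}{c^{2}}}
```

For any speed $v < c$ the square-root factor lies between 0 and 1, so a moving clock ticks more slowly, but $\Delta\tau$ remains positive: relative motion rescales time, it never reverses it.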

SYNAPSE | 11




FEATURE

HIV:

30 Year Anniversary Review

The Human Immunodeficiency Virus was first isolated in 1983 from a lymphadenopathy patient and identified as a retrovirus (a member of a family of RNA viruses whose replication cycles require reverse transcriptase (RT) enzymes to generate DNA from RNA templates). In the following 30 years it has claimed over 25 million lives, and around 35 million people are currently living with the infection. While vast amounts of research have greatly increased our knowledge of this infamous virus, perceptions of HIV infection remain bleak, and not without reason. This article provides a brief overview of the background of HIV while outlining the progress and achievements made in both research and clinical therapy since its discovery 30 years ago.

12 | SYNAPSE

The immune deficiency condition characteristic of HIV infection was first observed in 1981 among young homosexual men in various cities in the USA. They exhibited rare skin cancers (Kaposi’s sarcoma) or rare forms of pneumonia caused by the fungus Pneumocystis carinii, both associated with an immunocompromised state. It was soon found that depletion of a subset of immune cells (CD4+ T helper cells) was a distinctive feature of the condition, and that risk groups other than homosexual men existed, suggesting that an infectious agent was responsible. The condition was therefore named ‘acquired immune deficiency syndrome’, although HIV itself was not so named until 1986. HIV primarily infects by binding to CD4 receptors on the surface of CD4+ T cells, but has also been shown to utilize a variety of other receptors and to infect other cells of the immune system. The virus itself is an enveloped spherical virus

Acquired immune deficiency virus mortality in Africa, Asia and the United States 1982–2006. (Weiss et al., 2008)

containing two copies of single-stranded, positive-sense RNA. This is reverse transcribed into DNA by viral RT upon entering the host cell, and may then be incorporated into the host cell’s DNA within the nucleus. This allows the virus to establish a latent infection and avoid recognition by the host’s immune system. Eventually the virus may become active, transcribing its DNA to produce new RNA molecules and ultimately functional HIV proteins. Both are then packaged into new viral particles, which bud from the cell, killing it in the process. This active, or acute, replicative phase may stimulate cells of the immune system, causing them to selectively kill HIV-infected cells. This dynamic switching between viral replication and latency, along with the natural programmed

cell death (apoptosis) that can be triggered by viral infection, results in the gradual decline of CD4+ T cells and ultimately exhausts the immune system, which works to replenish the lost cells. A state of immunosuppression follows, imparting a predisposition to the opportunistic infections and conditions that are the frequent cause of fatality. The development of diagnostic methods soon followed the discovery of HIV, with kits produced in 1985 capable of detecting antibodies raised against HIV in blood samples. This hugely aided the prevention of transmission by, for example, allowing blood donors to be screened. Tests were also developed to detect the presence of HIV in infected individuals who had not yet raised antibodies to the virus (and therefore would not present a positive result

with earlier tests) but were still infectious. These tests detect various components of the HIV viral particle, allowing accurate measurement of viral load. Markers for drug resistance were also discovered, which greatly assisted the coordination of the anti-viral drug combinations developed later. Of the many drug treatments created since the discovery of HIV, the first to undergo clinical trials was zidovudine, a nucleoside analogue that inhibits the extension of DNA by viral RT, and therefore the viral replication cycle. Since RT is not found in human cells, zidovudine is a selective inhibitor: it does not affect human cell replication. Although it showed promising results at first by reducing the viral load in patients, zidovudine-resistant HIV mutants soon emerged. Since then, many other drugs have been developed to inhibit both RT and other HIV proteins, such as the HIV protease. In 1996 the concept of combining three or more of these drugs was devised, leading to highly active anti-retroviral therapy (HAART). This significantly improved disease outcome, reducing mortality by almost 70%. However, resistance may still emerge, especially if patients stop taking their drugs (which occurs all too frequently). Most importantly, those most in need of drug therapy are often in developing countries and/or remote locations, making logistics

SYNAPSE | 13




and therefore treatment incredibly difficult to arrange. Lastly, despite many trials, no effective vaccine against HIV has yet been produced. This, along with the difficulty of mounting an efficient immune response and the emergence of drug-resistant mutants, is largely due to the high mutation rate of HIV RT, which is around 50 times higher than that of E. coli DNA replication; the virus is capable of generating a mistake at virtually every nucleotide of its genome every day. The genetic diversity of the various HIV clades and quasispecies adds further complexity to an already challenging task. It is therefore extremely difficult to generate or provoke an immune response that can keep up with so many changes and variations. Work has instead focussed on alternatives, such as ‘therapeutic vaccines’ (to improve the immune response alongside drug treatment) and immunising infected individuals with modified HIV antigens (to enhance the immune response towards potential emerging mutants). Although much progress has been made over 30 years in research, diagnosis and management, HIV remains a pathogen that is not only lethal and easily transmissible but remarkably adept at rendering almost every candidate cure or vaccine futile. The search continues for effective treatments and vaccines. Exceptional cases of recovery provide hope, however small, that cures and permanent prevention will one day become reality.

Sophia Ho

ARTICLES

Captivating Creatures Contributing To Developmental Biology

Felicity Russell

How single cells develop and specialise to form specific tissues and organs is essential to understanding the adult form. Knowledge of developmental biology can contribute to our understanding of ‘what went wrong’ when faced with human abnormalities or certain diseases. Many fascinating creatures are being studied to unravel evolutionary developmental processes. Here are just a few astonishing examples:

Did you know? Could the post-night-out stubble be more than just a reminder that you’ve overslept? Alcohol is known to cause vasodilatation, which is why feeling and looking flushed is a common side-effect of drinking. Dilated peripheral blood vessels increase the nutrient and oxygen supply to hair follicles, encouraging growth. The vasodilatation caused by minoxidil, historically used as a blood pressure treatment, is thought to be responsible for the increased hair growth experienced by patients prescribed the drug. Because of this side-effect, minoxidil is no longer recommended as an antihypertensive, but it can now be purchased at most pharmacies as a treatment for hair loss. Many men complain that their stubble is noticeably longer the morning after a heavy night of drinking. Could this be why?

Rachel Cole

14 | SYNAPSE

Axolotl

The axolotl, a salamander that retains its larval form, is handy for studying limb development. When a limb is lost, axolotls can regenerate the whole of it from the remaining trunk tissue and spinal cord, bones included. They can even regenerate an amputated tail. Like newts and frogs they can regenerate the lens of the eye, although less efficiently. Salamanders can also regenerate kidney, brain and heart tissue. This capacity lets these amphibians repair damage, most likely inflicted by predators, and gives us humans the chance to uncover the secrets of development and repair.

Sea Anemone

These primitive animals, relatives of jellyfish and corals, are found attached to rocks and shells along the British coastline. They have a soft polyp body and multiple stinging tentacles used for catching prey. Anemones have recently been used to demonstrate the processes contributing to embryonic tentacle development, crucial for understanding the formation of appendages and the diversification of body plans. Tentacles are now known to develop from discs of high-density cell patches called placodes, whose cells become thinner and flatter as the tentacle elongates.

Zebrafish

These clever fish can rapidly regenerate and replace lost heart tissue, the missing tissue being replaced by the growth of new heart muscle cells (cardiomyocytes). The main question is where these new cells come from. The British Heart Foundation is working with scientists to promote the discovery of drugs that encourage the growth of these new heart cells, which could one day be used to help heal human hearts.

SYNAPSE | 15




OPINIONS

ARTICLES

Cherry Picking Medicine

You hear about it all the time: the gross misrepresentation of miracle cures by The Daily Mail, the disastrous scandals from the likes of Andrew Wakefield, and the studies funded by money-grabbing companies liable to misconstrue their findings. Yet doctors and medical researchers cling to the notion of Evidence-Based Medicine (EBM) as a means of avoiding, or at least spotting, the pitfalls. According to the British Medical Journal, EBM is the ‘conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual patients’. The notion is endlessly signposted throughout medical practice; in fact, if you aren’t familiar with the term by the end of first-year medicine, then homeopathy may be more up your street. It is flagged as the ‘gold standard’, the essential tool for making sense of the array of evidence out there, and, applied correctly, it should ease every doctor’s conscience. In short, EBM becomes the secret to a doctor’s good night’s sleep. Whether this safety net is adequate, however, is the uncomfortable question. So we learn the ropes, learn the hierarchy of study types and their characteristics, in an attempt to pick out the beauties from the beasts, or, as the case may be, the double-blinded from the confounded. Along with scientists at large, we become fans of the multi-trial meta-analyses and systematic reviews. Yet what we are not told is what

16 | SYNAPSE

evidence we have the privilege of choosing from, and that is where Ben Goldacre (author of ‘Bad Science’) comes in. He, amongst many others, highlights that studies are not just biased but that half of all clinical studies are not accessible at all. So while the systematic collation of evidence may be celebrated by our modern-day research experts, this collation relies on a small and somewhat unrepresentative bunch. While doctors may try their best to identify what is flawed (and drug-company sponsored), they cannot get past the fact that much of the research successfully carried out and completed remains unpublished, or even unregistered. They may be excellent analysts, but without a full crop to cultivate from, they are fighting a losing battle. As a medical student, I find the future implications pretty hard to swallow. What hope do I have of one day answering the questions posed by confused patients if all the evidence is not available to dispel my own confusion? What happens when doctors prescribe a drug with supposedly ‘significantly positive’ effects but whose negative, potentially dangerous effects remain expertly hidden from view? The Daily Mail’s juicy headlines start to feel a lot closer to home. One hopes that the notion of doing what is best for the individual patient shall prevail, that the likes of Ben Goldacre continue to uncover the evil workings of drug companies, and that growing petitions such as ‘alltrials.net’ continue to raise awareness of this worrying reality. It seems that EBM may be great in theory but, in practice, has a long way to go. I for one hope that the fair publication and systematic collation of all completed studies becomes a reality, so that I may avoid becoming a ‘drug-company based’ doctor of the future and not make the avoidable mistake of doing more harm than good.

Georgina Maguire

Sleep Control

Reema Joshi

It is clear that we all need a good night’s sleep, but I often wish I could run efficiently on less than the recommended 8 hours. This idea has recently attracted researchers investigating technology-based methods of reducing the amount of sleep required without losing the cognitive and health benefits associated with 8 hours of sleep. Standard sleep consists of several cycles, each with four stages: stage 1, stage 2, slow-wave and rapid eye movement (REM) sleep. Stage 1 sleep lasts between 5 and 15 minutes and is the stage between sleep and wakefulness. Stage 2 sleep involves the production of rhythmic brain activity called sleep spindles, and is linked to the restoration of alertness and recovery from muscle fatigue. Slow-wave sleep consists of delta waves and is the most difficult stage to wake from. In the first cycle it lasts approximately 60 minutes, then decreases in duration with each cycle, with REM sleep eventually taking over. Slow-wave sleep is critical for the long-term consolidation of new memories and is also associated with other important mechanisms, including learning and plasticity processes, and the secretion of growth hormone, which stimulates bone and tissue repair. This knowledge has led researchers to suggest that, by employing brain stimulation techniques, it may be possible to reduce the amount of sleep required, while preventing the negative effects of sleep deprivation, by condensing sleep into its most important stages, such as slow-wave sleep. One technique, transcranial direct current stimulation (tDCS), which delivers continuous weak current to brain areas through electrodes, has made it possible to manipulate brain waves, allowing movement into deeper stages of sleep. Lisa Marshall and colleagues (2004) applied tDCS to frontolateral locations in healthy human subjects 4 minutes after they had entered stage 2 sleep and were able to manipulate the sleep stages experienced. Slow waves were increased and the lighter sleep stages were reduced; the induced slow-wave sleep appeared indistinguishable from normal sleep. Furthermore, reducing the lighter stages of sleep seemed to have no negative impact on memory tests. This research indicates that sleep stages can indeed be manipulated through brain stimulation. Another technique, transcranial magnetic stimulation (TMS), may allow even more control by letting the brain skip the less important early stages 1 and 2 altogether. TMS is a non-invasive technique which induces weak currents in brain areas using a magnetic field. Marcello Massimini and colleagues (2009) applied TMS to 15 subjects and immediately triggered slow waves matching those naturally occurring during slow-wave sleep. Perhaps one day TMS may allow us to skip the less important stages of sleep, reducing the amount of sleep we require while maintaining the cognitive and health benefits of the standard 8 hours. In conclusion, with further research and technological advances in brain stimulation, running efficiently on less sleep may become a real possibility, allowing us to pack a little more into our already limited waking hours.
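The cycle structure described above can be sketched numerically. Only the roughly 60-minute first-cycle slow-wave figure comes from the article; the 90-minute cycle length, the light-stage durations, and the halving rule below are illustrative placeholder assumptions, included just to show how slow-wave time shrinks while REM grows across successive cycles:

```python
# Toy model of how sleep-stage composition shifts across cycles.
# All numbers are illustrative assumptions, not measured values.

CYCLE_MINUTES = 90            # assumed length of one full sleep cycle
STAGE1, STAGE2 = 10, 20       # assumed light-sleep stage durations (min)

def cycle_composition(n_cycles=4, first_sws=60):
    """Return per-cycle (light, slow-wave, REM) minutes.

    Assumes slow-wave sleep roughly halves each cycle and REM
    takes up the remaining time in the 90-minute cycle.
    """
    cycles = []
    sws = first_sws
    for _ in range(n_cycles):
        light = STAGE1 + STAGE2
        sws_now = min(sws, CYCLE_MINUTES - light)
        rem = CYCLE_MINUTES - light - sws_now
        cycles.append((light, sws_now, rem))
        sws //= 2             # slow-wave sleep shrinks each cycle
    return cycles

for i, (light, sws, rem) in enumerate(cycle_composition(), start=1):
    print(f"cycle {i}: light={light}  slow-wave={sws}  REM={rem} min")
```

Under these assumptions the first cycle is dominated by slow-wave sleep and the later cycles by REM, which is the pattern the brain-stimulation work above tries to exploit by condensing sleep into the slow-wave-rich part.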

SYNAPSE | 17


OPINIONS


Cherry Picking Medicine

You hear about it all the time: the gross misrepresentation of miracle cures by The Daily Mail, the disastrous scandals from the likes of Andrew Wakefield, and the studies funded by money-grabbing companies liable to misconstrue their results. Yet doctors and medical researchers cling to their notion of Evidence-based Medicine (EBM) as a means of avoiding or spotting the pitfalls. According to the British Medical Journal, EBM is the ‘conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual patients’. The notion is endlessly signposted throughout medical practice; in fact, if you aren’t familiar with the term by the end of first-year medicine, then homeopathy may be more up your street. It is flagged as the ‘gold standard’, the essential tool for making sense of the array of evidence out there, and if applied correctly it should successfully ease every doctor’s conscience. In short, EBM becomes the secret to a doctor’s good night’s sleep. The knee-jerk question, however, is whether this safety net is adequate. So we learn the ropes, learn the hierarchy of study types and characteristics, in an attempt to pick out the beauties from the beasts, or as the case may be, the double-blinded from the confounded. Along with scientists at large, we become fans of the multi-trial meta-analyses and systematic reviews. Yet, what we are not told is what

16 | SYNAPSE

evidence we have the privilege of choosing from, and that is where Ben Goldacre (author of ‘Bad Science’) comes along. He (amongst many others) highlights that studies are not just biased, but that half of all clinical studies are not accessible at all. So while the systematic collaboration of evidence may be celebrated by our modern-day research experts, this collaboration relies on a small and somewhat unrepresentative bunch. Therefore, while doctors may try their best to identify what is flawed (and drug-company sponsored), they cannot get past the fact that much of the research successfully carried out and completed remains unpublished or even unregistered. They may be excellent analysts, but without a full crop to cultivate from, they are fighting a losing battle. As a medical student, the future implications become pretty hard to swallow. What hope do I have of one day answering the questions posed by confused patients if all the evidence is not available to dispel my own confusion? What happens when doctors prescribe a drug with supposedly ‘significantly positive’ effects but whose negative, potentially dangerous effects remain expertly hidden from view? The Daily Mail’s juicy headlines start to feel a lot closer to home. One hopes that the notion of doing what is best for the individual patient shall prevail, that the likes of Ben Goldacre continue to uncover the evil workings of drug companies, and that growing petitions such as ‘alltrials.net’ continue to raise awareness and draw attention to this worrying reality. It seems that EBM may be great in theory but, in practice, has a long way to go. I for one hope that the fair publication and systematic collaboration of all completed studies becomes a reality, so that I may avoid becoming a ‘drug-company based’ doctor of the future and making the avoidable mistake of doing more harm than good.

Georgina Maguire
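The publication-bias problem Goldacre describes can be made concrete with a toy simulation (all numbers here are hypothetical, chosen only for illustration): if only trials with impressive results reach the journals, a review of the published literature overestimates a drug's true effect.

```python
import random
import statistics

random.seed(1)

def run_trial(true_effect=0.2, n=40, sigma=1.0):
    """Simulate one small clinical trial: the measured mean benefit."""
    samples = [random.gauss(true_effect, sigma) for _ in range(n)]
    return statistics.mean(samples)

# Run many independent trials of a drug with a modest true effect.
all_trials = [run_trial() for _ in range(2000)]

# Suppose only 'impressive' results (measured effect > 0.3) get published.
published = [e for e in all_trials if e > 0.3]

print(f"True effect:              0.20")
print(f"Average over ALL trials:  {statistics.mean(all_trials):.2f}")
print(f"Average over published:   {statistics.mean(published):.2f}")
```

The average over all trials recovers the true effect, but the average over the published subset is inflated well above it, which is exactly why registries like alltrials.net matter to a future prescriber.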

ARTICLES

Sleep Control

Reema Joshi

It is clear that we all need a good night’s sleep, but I often wish I could run efficiently on less than the recommended 8 hours. This idea has recently become of interest to researchers who are investigating technology-based methods of reducing the amount of sleep required without losing the cognitive and health benefits associated with 8 hours of sleep.

Standard sleep consists of several cycles, each with 4 stages: stage 1, stage 2, slow-wave and rapid eye movement (REM) sleep. Stage 1 sleep lasts between 5 and 15 minutes and is the transition between wakefulness and sleep. Stage 2 sleep involves the production of bursts of rhythmic brain activity called sleep spindles, and is linked to the restoration of alertness and recovery from muscle fatigue. Slow-wave sleep consists of delta waves and is the most difficult stage to wake from. In the first cycle it lasts approximately 60 minutes and then decreases in duration with each cycle, with REM sleep eventually taking over. Slow-wave sleep is critical for the long-term consolidation of new memories and is also associated with other important mechanisms, including learning and plasticity processes and the secretion of growth hormones that stimulate bone and tissue repair.

This knowledge has led researchers to suggest that, by employing brain stimulation techniques, it may be possible to reduce the amount of sleep required, and prevent the negative effects of sleep deprivation, by condensing sleep into the most important stages, such as slow-wave sleep. One technique, transcranial direct current stimulation (tDCS), which delivers a continuous weak current to brain areas through electrodes, has made it possible to manipulate brain waves, pushing the brain into deeper stages of sleep. Lisa Marshall and colleagues (2004) applied tDCS at frontolateral locations in healthy human subjects 4 minutes after they had entered stage 2 sleep, and were able to manipulate the sleep stages experienced. Slow waves were increased and the lighter sleep stages were reduced; the induced slow-wave sleep appeared indistinguishable from normal sleep. Furthermore, reducing the lighter stages of sleep seemed to have no negative impact on memory tests. This research indicates that sleep stages can indeed be manipulated through brain stimulation.

Another technique, transcranial magnetic stimulation (TMS), may allow even more control by prompting the brain to skip the less important early sleep stages 1 and 2 altogether. TMS is a non-invasive technique that induces weak currents in brain areas using a magnetic field. Marcello Massimini and colleagues (2009) applied TMS to 15 subjects and immediately triggered slow waves matching those that occur naturally during slow-wave sleep, suggesting that one day TMS may allow us to skip the less important stages of sleep and reduce the amount of sleep we require, whilst maintaining the cognitive and health benefits associated with the standard 8 hours.

In conclusion, with further research and technological advancements in brain stimulation techniques, running efficiently on less sleep may actually become a possibility, allowing us to pack a little more into our already limited waking hours.
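The cycle structure described above can be sketched as a toy schedule. All durations are illustrative round numbers rather than clinical values, but they capture the pattern the article describes: slow-wave sleep dominates early cycles and shrinks through the night, while REM grows.

```python
# Toy model of sleep architecture: a list of cycles, each mapping
# stage name to duration in minutes. Numbers are illustrative only.
def night_of_sleep(cycles=5):
    schedule = []
    for c in range(cycles):
        sws = max(60 - 15 * c, 5)   # slow-wave: ~60 min at first, shrinking
        rem = 10 + 15 * c           # REM: grows as the night goes on
        schedule.append({
            "cycle": c + 1,
            "stage 1": 10,          # light, transitional sleep
            "stage 2": 20,          # sleep spindles
            "slow-wave": sws,
            "REM": rem,
        })
    return schedule

for cycle in night_of_sleep():
    total = sum(v for k, v in cycle.items() if k != "cycle")
    print(cycle, f"(cycle length: {total} min)")
```

In this caricature, a stimulation technique that "condenses" sleep would amount to trimming the stage 1 and stage 2 entries while preserving the slow-wave and REM totals.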

SYNAPSE | 17


ARTICLES

CAN CROPS COMBAT CLIMATE CHANGE?

Frances Cartwright

THE PROBLEM:

The environmental problems associated with climate change are becoming hard to deny, whilst an ever-increasing amount of carbon dioxide continues to be released into the atmosphere by industry. This increase in greenhouse gas is intensifying the naturally occurring greenhouse effect, whereby radiation from the sun absorbed by the Earth’s surface and re-radiated is partly trapped within the atmosphere by greenhouse gases, while the rest passes back up through the atmosphere and is lost to space. For the temperature on Earth to remain constant there must be a balance between the radiation being trapped and that being lost. Increasing greenhouse gas in the atmosphere is trapping more radiation, causing global warming and its associated problems, such as sea-level rise, severe floods and droughts, and extreme weather events.

THE NEED FOR GEOENGINEERING:

To minimise the damage that rising temperatures will do, carbon dioxide emissions must be reduced on a large scale across the globe. Current predictions are that temperatures will rise by between 2°C and 6°C (on average across the globe) over the next century, depending on how severely carbon emissions are cut. To limit this, scientists are trying to come up with ways of reducing the warming effect whilst emission reductions are made. This is called ‘geoengineering’: the artificial manipulation of the climate system in order to alleviate the effects of climate change. It is not a fix for climate change, but rather a stopgap. Schemes work either by increasing reflection of the sun’s radiation, or by removing carbon dioxide gas directly from the atmosphere. Extreme proposals for geoengineering schemes include placing massive mirrors (1000 km across) in space to reflect sunlight, or fertilising the oceans to encourage the growth of algae that remove carbon dioxide from the air by photosynthesis. These sorts of schemes would be very costly and could have large, unpredictable effects on the Earth’s ecosystems.

A THEORETICAL SOLUTION:

An alternative proposal, suggested by researchers in the schools of Biology and Geography at the University of Bristol, is to use crop plants to mitigate global warming. The Earth has many different surface types (ocean, desert, ice, urban and agricultural land, etc.), all of which naturally reflect different amounts of the sun’s radiation, varying from ~85% for snow to ~1% for dark wet soil. Crop plants cover about 11% of the Earth’s land area. Computer climate models have shown that a relatively small increase in the reflectance of crops such as wheat and barley, of about 4%, could mean a 1°C reduction in local summertime temperatures (when the crops are fully out in leaf). This would be particularly beneficial in areas of North America, Europe and South East Asia that practise intensive agriculture. Crop geoengineering would be much cheaper and quicker to implement than the aforementioned large-scale schemes, and would be easily reversible if negative climatic side effects were experienced.

IS CROP GEOENGINEERING FEASIBLE?

Research in the Biology Department is investigating whether crop geoengineering is feasible at a practical level. It is important to quantify differences in reflectance between varieties of commonly grown crop plants, to see if the 4% increase can be achieved using natural variation within species. Measurements of reflectance are made using specialist equipment, a spectroradiometer in conjunction with an integrating sphere, to measure the reflection of light across a wide spectrum of wavelengths, from ultraviolet up to infrared. In addition, research is being carried out to understand the interaction of light with the complex plant surface. Properties of the waxy outer layer, the presence of hairs and other outgrowths on the leaf, microscopic wax crystals, and leaf pigments all influence the reflection of light in different ways. These must be understood in order to effectively select crop varieties that will reflect more light. Measuring the reflectance of a wide range of leaves allows us to build a picture of the properties that increase reflectance. Plants that grow in high-light environments, such as mountainous areas and deserts, have naturally evolved features that make them more reflective. Understanding what these features are and how they affect reflectance can help in selecting varieties of crops that could, one day, successfully mitigate global warming.
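A rough back-of-envelope sketch shows how much extra energy a ~4% reflectance boost would send back to space over cropland. The solar flux and canopy reflectance used here are illustrative assumptions, not the actual inputs of the Bristol climate models.

```python
# Back-of-envelope shortwave budget for a cropped region.
# All numbers are illustrative round figures (assumptions), not model inputs.
incoming = 340.0      # W/m^2, an approximate global-mean solar flux
crop_albedo = 0.20    # assumed typical reflectance of a green crop canopy
boost = 0.04          # the ~4% reflectance increase discussed above

reflected_before = incoming * crop_albedo
reflected_after = incoming * (crop_albedo + boost)
extra = reflected_after - reflected_before

print(f"Extra energy reflected over cropland: {extra:.1f} W/m^2")
```

Even a small change in surface reflectance, applied over the ~11% of land under crops, redirects a non-trivial amount of energy, which is why the models predict a measurable local summertime cooling.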

18 | SYNAPSE

SYNAPSE | 19




ARTICLES

Mosasaurs: Giant Marine Carnivores of the Late Cretaceous

Tom Stubbs

The Mesozoic era, between 252 and 65 million years ago, is usually referred to as the ‘Age of Dinosaurs’. However, during this time an exceptionally diverse range of reptiles also lived and prospered in the oceans. These sea-going reptiles are collectively called Mesozoic marine reptiles. Here we check out just one of these groups, the mosasaurs, which appeared at the very end of the Mesozoic.

Age: Mosasaurs were one of the most successful reptilian groups during the Late Cretaceous, between 98 and 65 million years ago. They became extinct during a mass extinction at the end of the Cretaceous period. During this catastrophic event many other major animal groups also became extinct or suffered a considerable loss of diversity.

20 | SYNAPSE

Movement: Mosasaurs moved through the ocean using two methods. Some were anguilliform swimmers, meaning they generated propulsive force with lateral undulations of the entire body. Others evolved a more advanced method of movement called carangiform locomotion, which involved using the back end of the body, mainly the tail, to produce thrust. Some suggest that carangiform locomotion may have been an adaptation for cruising in open ocean environments.

Appearance and origins: Mosasaurs were specialised marine reptiles related to lizards and snakes. They are sometimes called “sea-going monitor lizards”, as they bore some resemblance to modern monitors such as the Komodo dragon. Mosasaurs resembled large lizards, but they had a number of key adaptations, including large size, robust skulls with massive teeth, paddle-like limbs and the ability to give birth to live young. The largest mosasaurs were around 13 metres in length, roughly twice the length of an average killer whale. Primitive mosasaurs could be as small as 1 metre.

Diet: Mosasaurs fed upon a diverse range of foods. Some ate hard shelled ammonites, while others fed upon other large marine reptiles, such as plesiosaurs. Giant mosasaurs were at the top of the food chain during their existence. Evidence for predation comes from fossils with bite marks, including bones and shells.

SYNAPSE | 21




Marvels

The Storm On Our Horizon: A Case Of Survival Of The Weakest

Cormac Kinsella

The immune system is a true marvel: without one, the mildest infections can easily turn fatal. This is why AIDS (acquired immunodeficiency syndrome) kills around two million people a year. However, there are instances where a strong immune system can be your downfall, thanks to one of its components: the cytokine. Cytokines are proteins, similar to hormones, which are secreted inside the body and used in cell communication. They play a key role in immune responses, summoning immune cells to an infection and then ordering those cells to make more cytokines: a positive feedback loop. Some diseases cause this feedback loop to go into overdrive, producing a ‘cytokine storm’. This runaway response is biological ‘friendly fire’, with massive numbers of immune cells causing tissue damage, blocking off airways and potentially leading to death. Strong immune systems are a weakness here, as they produce larger responses. This helps explain why the Spanish flu pandemic of 1918 killed up to one hundred million people, with a disproportionate effect on the young and healthy. It is also partly why a human-to-human avian flu mutant is so concerning: humanity today is in the calm before the (possibly) impending storm.
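The positive feedback loop above can be caricatured in a few lines (the gain values are purely illustrative): when each round of cytokine signalling recruits cells that secrete less than they received, the response dies away; when each round recruits more, the signal explodes into a storm.

```python
# Toy model of cytokine positive feedback. Each round, recruited immune
# cells multiply the cytokine signal by `gain`. Parameters are illustrative.
def immune_response(gain, steps=10, signal=1.0):
    history = [signal]
    for _ in range(steps):
        signal *= gain
        history.append(signal)
    return history

regulated = immune_response(gain=0.9)  # damped loop: infection cleared calmly
storm = immune_response(gain=1.8)      # runaway loop: a 'cytokine storm'

print(f"Regulated response after 10 rounds: {regulated[-1]:.2f}")
print(f"Storm response after 10 rounds:     {storm[-1]:.2f}")
```

The stronger the immune system, the higher the effective gain, which is the sketch's version of why the young and healthy fared worst in 1918.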

MARVELS

STUDENT INSPIRATION AWARD: EMILY MILODOWSKI

Congratulations to Bristol University student Emily Milodowski, who on 9th March was chosen as the winner of the Student Inspiration Award at this year’s Crufts Dog Show. This award recognises and rewards students who are making an impact on the health and wellbeing of dogs and transforming our understanding of human diseases. Emily's major interests include canine gastroenterology, as well as bacteria and their important changing features, such as the development of antibiotic resistance. Her research started with a Summer Research Scholarship from the BBSRC (Biotechnology and Biological Sciences Research Council), looking at bacterial involvement in inflammatory bowel disease (IBD) in dogs. The role of bacteria in this disease has previously been disputed, with many believing that IBD results from activation of the immune response in the gut for unknown reasons, and that bacteria might then take advantage of damage that is already there. Her findings suggest an association between Campylobacter and a diagnosis of inflammation. This work on the prevalence and distribution of bacteria in the canine intestine has led to Emily being awarded the £10,000 prize to fund her future work.

Did you know?

Did you know that the voices in your head are real? Don't worry, you haven't finally lost it (probably). When silently reading a book, your brain imagines direct speech (for example, "I played football earlier") as being spoken by a 'voice'. Using MRI scans of people reading, researchers have shown that, as well as activating the visual cortex, reading speech or quotations stimulates the auditory regions of the brain. You actually 'hear' the voices in your head! Just don't listen if they start telling you to do things...

Sarah Jose

22 | SYNAPSE

Bristol University student Emily Milodowski with Dr Elaine Ostrander and Dr Gus Aguirre

SYNAPSE | 23




