FREE
SYNAPSE THE SCIENCE MAGAZINE WRITTEN BY STUDENTS FOR STUDENTS
ISSUE 13 - Winter 2017
University of Bristol
Is the Moon a Planet? The Male Contraceptive Pill
Neutrinos: Subatomic Ghosts
EDITORIAL
The Team
George Thomas - Editor-in-Chief
Amy Newman - Vice President
Jessica Towne - Secretary & Senior Editor
Mutanu Malinda - Treasurer & Senior Editor
Gabriel Penn - Chief Graphic Designer
Melissa Levy - Managing Editor
Nina Smith - Managing Editor
Rachel Baxter - Media Director
Marta Plaszczyk - Senior Editor
Kieran McLaverty - Senior Editor
Sophie Groenhof - Senior Editor
David Morris, James Portman, Fangyu Wan - Editors

Welcome to SYNAPSE Science Magazine
Welcome to the 13th issue of Synapse Magazine, the University of Bristol’s science magazine written by students. This issue features a wide range of articles, with topics ranging from the sound barrier to contraceptives, and addressing issues such as coral bleaching and the controversy surrounding homeopathy. There are many ways you can get involved with the magazine, and it’s never too late to contribute! Take a look at our blog at synapsebristol.blogspot.co.uk for more information, or you can join our society via the SU website.

E: synapsebristol@gmail.com
Cover image kindly supplied by Monika Jakinowicz
Synapse wishes to acknowledge the support of the School of Biochemistry.
CONTENTS
4 - Bleach clothes, not coral
6 - Neutrinos: Subatomic ghosts
9 - Animal electricity
10 - Is the Moon a planet?
12 - Homeopathy: Medicine or placebo?
14 - Fat chance!
16 - Chasing the Sound Barrier
18 - The Male Contraceptive Pill: Coming soon?
20 - Alzheimer’s: the diabetic brain?
22 - Synapse Strips
On the cover: Light microscope image of a synthesised Prussian blue pigment on glass slide. In nanoparticle form, the pigment could have uses in medical imaging.
ARTICLE
BLEACH CLOTHES, NOT CORAL
Photo credit: XL Catlin Seaview Survey
We have all become familiar with the detrimental effects of climate change over the past couple of decades. An area which seems to have been sorely overlooked, however, is that of coral bleaching. Coral bleaching is the process by which coral loses its symbiotic relationship with the dinoflagellates (a group of single-celled algae) which live within it. This causes a colour change, as the symbiotic dinoflagellates are what provide the reef with its vibrant colours, leaving the coral white – hence the name ‘coral bleaching’. The white calcareous skeleton left behind serves as a reminder of the once diverse ocean habitat. To put the severity of the issue into context, coral covers a mere 1% of our oceans yet is home to 25% of all the marine wildlife known to man. In the last few decades we have contributed to the destruction of 40% of coral around the globe. This phenomenon is mainly caused by increases in water temperatures, as coral has a limited temperature range in which it can function. When the temperature gets too high the coral becomes stressed and the organisms living within it are expelled. If these are not re-absorbed soon after, the coral will die as it cannot survive without them. The consequence of this is a regime shift, in which there is a rapid expansion of macroalgae such as seaweed, whose numbers would otherwise be limited, subsequently leading to a change in the diversity of fish in the surrounding area. Coral is an essential part of tropical and sub-tropical marine ecosystems, and bleaching therefore has a profound effect not only on the coral and its symbionts but on the vast majority of species living in the surrounding areas, even us! This is because coral has numerous roles, including providing approximately 25% of marine species with shelter, protecting shorelines, supporting fishing industries, and attracting large amounts of tourism. Although reefs are very susceptible,
coral bleaching is not a death sentence; some resilient reefs are able to bounce back after a bleaching event. A study carried out in the Seychelles Islands found that out of the 21 reefs studied, 12 did not undergo a regime change following a mass bleaching event in 1998. This is encouraging, as coral reefs are seen as one of the marine environments most vulnerable to climate change. The researchers also found that less structurally complex reefs found in shallower waters were more at risk of undergoing a regime change. So, this study may suggest that some types of reef are able to adapt at a sufficient rate to cope with climate change. Funding and research can give us an insight into how marine ecosystems respond to a bleaching event, and into the detrimental effects of anthropogenic factors such as pollution and overfishing on coral reefs. Identifying the main factors in the survival of coral reef structures may also allow us to devise
a strategy to protect them from climate change, prioritising more structurally complex reefs at greater depths, which are more likely to cope and recover if a mass bleaching event occurs. It would be beneficial to also study the abundance, diversity and conservational status of the species that are associated with different corals as a basis for prioritizing the conservation of reefs. You can reduce your impact on coral simply by diminishing your carbon footprint and supporting tree conservation - trees store carbon and reduce agricultural chemical runoff into watercourses, which could inevitably end up in the ocean as pollution. The numerous ecological and economic benefits of conserving coral reefs makes them an important and cost-effective conservational priority, which must be addressed if we are to protect the vast and beautiful array of marine wildlife found around them.
Nick Comben
ARTICLE
Neutrinos: Subatomic Ghosts
Neutrinos are some of the most elusive of nature’s fundamental particles. They are all around us — in fact, around a hundred trillion of them have passed through your body in the last second — and yet they are incredibly hard to detect. These ghostly particles are so reluctant to interact with other matter that they can fly straight through the Earth’s core virtually unimpeded. To most of us, the neutrinos passing through us all the time might as well not exist, and the feeling’s mutual: to the average neutrino, passing through the matter that makes up our bodies and everything around us is as easy as walking through a light mist. But despite their aloofness, neutrino physics has proven vital to our understanding of many subjects, from nuclear physics to astronomy, and remains a very active area of particle physics research.
The existence of the neutrino was first postulated in 1930 by Wolfgang Pauli. Physicists had been grappling with the problem of beta decay, the
process by which a radioactive atomic nucleus releases energy by emitting an electron (known here as a beta particle), thus changing its atomic number. Based on theory at the time, beta decay appeared to violate the law of conservation of energy: the energy of the emitted electrons should have been fixed at a certain value, but observations showed that it could be anywhere between that value and zero. This seemed to imply that energy was disappearing — a troubling prospect. Pauli proposed that this energy was instead being carried away by a previously unknown particle with some very particular properties: it must be electrically neutral to satisfy charge conservation, have zero (or extremely small) mass to fit with the observed range of beta particle energies, and interact exceedingly weakly with other matter to explain why it had never been observed. As this idea started to gain traction in the scientific community, Enrico Fermi popularised the name ‘neutrino’, Italian for ‘little neutron’, for Pauli’s particle.
Today, we know that the neutrino does indeed exist. It comes in three ‘flavours’ and makes up the electrically neutral half of the family of particles known as the leptons. Each of these flavours corresponds to one of the electrically charged leptons (the electron and its heavier siblings, the muon and tau), and each is twinned with an anti-particle (straightforwardly known as an antineutrino). Of the four fundamental forces, the neutrino interacts only via gravity (which is so weak that it is negligible on the subatomic scale) and the weak nuclear force. It is immune to the influence of the strong and electromagnetic forces, and this is what makes its interactions with other matter so weak. As difficult as it is to pin neutrinos down, it can be done. The very small probability of an individual neutrino interacting can be offset by the huge numbers in which they are produced in nuclear reactions (such as those taking place in the heart of the Sun, which produce the vast majority of the neutrinos speeding by at any given moment), and by building a large enough detector to capture neutrinos at an appreciable rate. This usually involves filling a huge tank with ton upon ton of neutrino-sensitive liquid, and placing it at the bottom of a mine to shield it from the barrage of cosmic rays that would
otherwise drown out the comparatively feeble signs of neutrino detection. This method was pioneered by Raymond Davis and John Bahcall starting in 1970, using the Homestake Gold Mine in South Dakota. Davis and Bahcall hoped to confirm theoretical models of nuclear fusion in the Sun by measuring the resulting flux of solar neutrinos. But they hit an unexpected hitch: no matter how much Davis improved their experiment and Bahcall his calculations, they consistently detected only one third of the number of neutrinos predicted. As more and more neutrino experiments sprang up across the world, they all found the same thing, which became known as the solar neutrino problem. It was not until 2013 that the explanation for this discrepancy was confirmed: neutrinos oscillate, changing from one type to another. Davis and Bahcall were expecting only electron-neutrinos, but some of these had transformed into muon- or tau-neutrinos over the course of their journey from the Sun. This was a momentous discovery, since quantum mechanics tells us that it implies
neutrinos do have mass (albeit very little) and thus contradicts the Standard Model of particle physics. The 2015 Nobel Prize in Physics was awarded to Takaaki Kajita of Super-Kamiokande (Super-K) and Arthur McDonald of the Sudbury Neutrino Observatory (SNO) for their respective teams’ work in confirming neutrino oscillations. There are many questions that neutrino physics has yet to answer. The neutrino masses, and other mathematical parameters that govern oscillation, have yet to be pinned down. There is also potential for new physics that goes beyond the Standard Model. One possibility is that the neutrino and anti-neutrino are one and the same: a Majorana particle. Another is that a ‘sterile’ neutrino exists, one that cannot even interact via the weak force, making it yet more intangible than the ones we already know of — and providing a long-sought-after candidate for dark matter. Whatever their true nature may be, it’s clear that these ghosts of the subatomic world have yet to give up all of their secrets.
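The oscillation behaviour described above is often summarised with the standard two-flavour approximation, in which the probability that a neutrino is still detected as its original flavour depends on the mixing angle, the mass-squared difference, and the distance travelled per unit energy. A minimal sketch in the conventional units (Δm² in eV², L in km, E in GeV); the parameter values below are purely illustrative, not figures from the article:

```python
import math

def survival_prob(L_km, E_GeV, sin2_2theta, dm2_eV2):
    """Two-flavour survival probability P(nu -> same flavour).

    P = 1 - sin^2(2*theta) * sin^2(1.27 * dm^2 * L / E)
    with dm^2 in eV^2, L in km and E in GeV (the usual convention).
    """
    return 1.0 - sin2_2theta * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

# Illustrative parameters only: a few-MeV neutrino over a long baseline
p = survival_prob(L_km=180.0, E_GeV=0.004, sin2_2theta=0.85, dm2_eV2=7.5e-5)
```

At zero distance the survival probability is exactly 1, and it oscillates with L/E thereafter, which is why experiments at different baselines and energies see different flavour fractions.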
Gabriel Penn
It was around 1780. The lifeless corpse of a frog lay on the table with a brass hook sticking grotesquely out of its leg, holding it in place. The scientist lowered an iron bar, about to attach it to the hook. As the contact was made between metals, the leg twitched! How could the frog show any sign of life in its deceased state? The phenomenon became temporarily known as ‘bioelectrogenesis’, or ‘animal electricity’. The scientist, Luigi Galvani, believed that a naturally-occurring kind of electricity was generated in the frog’s muscular tissue, activating the limbs as it passed through the metal skewer. Galvani concluded that the phenomenon originated from a fluid secreted by the animal’s brain, flowing through the nerves to the muscles, causing their movement. He believed that the nerve carriers were insulated by an invisible layer called myelin. However, approximately 10 years later, a well-known physicist, Alessandro Volta, showed that the animal’s legs were twitching not because of the electricity inherently contained within the living organism, but because of the specific metals used in Galvani’s experiments. He proved that electric current resulted from the contact between two different metals within a moist environment. This phenomenon would later become known as galvanism.
ARTICLE
Animal Electricity
Despite the criticism, Galvani persevered with his research. He was able to show that muscle contractions were possible without the use of any metals, but rather, by adjoining exposed nerves with exposed muscles. It took two centuries to find out that muscle contractions are indeed initiated by electrical impulses carried by nerves. These arise because of differing concentrations of potassium ions inside and outside of a nerve cell, resulting in electrical potential energy being maintained across the cell’s membrane. The membrane itself is semi-permeable. When ion channels within the nerve open and the charge differences across the membrane are briefly reversed, the potential energy is released as a contraction-triggering impulse. Despite his initial misconceptions regarding ‘animal electricity’, Galvani rightly deserves the credit for being the first to observe the phenomenon of galvanism and establishing the basis of modern neuroscience by discovering the nerve impulse. After decades of further research, it is now common knowledge that nerve impulses are a physical phenomenon rather than a subject of metaphysics, forming the groundwork for human sensations, movements, thoughts and emotions.
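The potassium gradient described above can be quantified with the Nernst equation, which gives the membrane voltage at which an ion's electrical and concentration gradients balance. A minimal sketch, using typical textbook concentrations for a mammalian neuron (assumed values, not figures from the article):

```python
import math

def nernst_mV(conc_out, conc_in, z=1, temp_K=310.0):
    """Nernst equilibrium potential in millivolts.

    E = (R*T / (z*F)) * ln([ion]_out / [ion]_in)
    """
    R = 8.314    # gas constant, J/(mol*K)
    F = 96485.0  # Faraday constant, C/mol
    return 1000.0 * (R * temp_K) / (z * F) * math.log(conc_out / conc_in)

# Typical potassium concentrations (mmol/L) for a mammalian neuron:
# ~5 outside the cell, ~140 inside, giving roughly -90 mV
E_K = nernst_mV(conc_out=5.0, conc_in=140.0)
```

Because potassium is far more concentrated inside the cell, the logarithm is negative and the resting membrane sits at a negative potential; briefly reversing that potential is the nerve impulse the article describes.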
Marta Plaszczyk
ARTICLE
Is the Moon a Planet?
A quick Google search of “is the Moon a…” will offer the seemingly hilarious suggestion “is the Moon a planet” as an autocomplete. But is this question as misinformed as it sounds? Neither Mercury nor Venus has moons, and the moons of Mars are just small captured asteroids; nothing like ours. So why does the Earth enjoy the company of a large lunar neighbour, and what is it that makes it a moon, and not a planet? The most commonly accepted definition of a planet is the International Astronomical Union’s definition, which is that an astronomical body is a planet if it orbits the Sun, is massive enough to have reached hydrostatic equilibrium (is roundish in shape), and has cleared its orbit of debris (such as asteroids). This is the definition which famously removed Pluto’s planetary status, and is highly debated. Only the first of these requirements is clear-cut, and it excludes all planets outside our solar system. The second requirement is flawed too; it is hard to define the point at which a body becomes “round”, and this can be largely dependent upon the materials which comprise the body, rather than the body’s size or mass. For example, Neptune’s moon Proteus is an irregular shape, despite being more massive than Saturn’s spheroidal moon Mimas.
However, some progress has been made with the last requirement. Jean-Luc Margot from the University of California recently proposed a formula to clearly define whether a body is capable of clearing its orbit under a given time frame. Although the selection of this length of time is fairly arbitrary, the formula does offer a clear cut-off for which objects would be able to fulfil the last of the IAU’s criteria based solely upon their mass, distance of orbit, and the mass of their host star. And, surprisingly, the Moon is massive enough to satisfy this requirement! So, if there was no Earth and just a Moon on the same orbit, the Moon would be classified as a planet. In his paper detailing this formula, Margot offers a new planetary definition, which may be a vast improvement upon the current one. However, this proposed definition demands nothing of a body to orbit a star alone, which calls into question whether a system of two bodies, like ours, is a system of a planet and moon, or actually two planets. Imagine if the Moon was about the same size as the Earth. Although we would still have originally called it the Moon, we would have eventually realised it was a planet, and that we were in a binary planet system. What if we had evolved on an Earth that orbited a gas giant, which in turn orbited a star? Perhaps we would
classify ourselves differently to the other moons that the gas giant would undoubtedly harbour. So when should an object be considered a moon? Can systems where two planets of roughly the same size share an orbit around their host star even exist? Some research has been conducted into this scenario by researchers at the California Institute of Technology, and the answer is yes! Given the right mass and the right collision, two planets can form together, orbiting each other as they progress around their host star. If such a binary planet system is possible, there must be a point at which it is possible to distinguish between a binary planet system and a planet-moon system. It is thought that our Moon formed from a collision between the Earth and a protoplanet, however under conditions which meant that the Earth retained most of the mass, so the Earth and Moon are different in size. This is why the Earth has a relatively large moon, but the other inner planets do not. Pluto’s moon Charon is actually very similar in size to Pluto, which has led to many suggestions that the pair should be classified as a dwarf-planet binary system. It is unlikely that the definition of a planet will change to encompass the Moon any time soon. For one thing, Margot’s definition would also include Jupiter’s largest moon, Ganymede, and possibly its second largest, Callisto, as planets. This seems inappropriate given the large difference in size between these two moons and Jupiter. Ideally, a new definition is needed which solves the planet/moon distinction, but at least allows for similar mass binary planets. A possible definition might use the barycentre: the centre of mass around which two bodies orbit. Alternatively, a new class involving moons large enough to be planets could be devised. Either way, the classification of objects is not the most important part of the process, but understanding how planetary systems form, and how the objects within them can interact and vary, is essential in our understanding of the Universe.
Photo credit: NASA
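The barycentre test suggested above is easy to make concrete. The sketch below uses rounded published masses and separations (assumed values, not from the article): the Earth-Moon barycentre lies inside the Earth, while the Pluto-Charon barycentre lies outside Pluto, which is one reason the latter pair is so often called a binary system.

```python
def barycentre_km(m1, m2, separation_km):
    """Distance of the two-body barycentre from the centre of body 1."""
    return separation_km * m2 / (m1 + m2)

# Masses in kg and separations in km (rounded published values)
earth_moon = barycentre_km(5.97e24, 7.35e22, 384_400)   # ~4,700 km from Earth's centre
pluto_charon = barycentre_km(1.30e22, 1.59e21, 19_600)  # ~2,100 km from Pluto's centre

# Earth's radius is ~6,371 km, so the Earth-Moon barycentre is inside the Earth;
# Pluto's radius is ~1,188 km, so the Pluto-Charon barycentre is in open space.
```

Under a barycentre-based definition, a companion whose shared centre of mass lies inside the primary would be a moon, and one whose barycentre lies in open space would make the pair a binary.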
George Thomas
ARTICLE
Homeopathy: a medical solution, or just a placebo?
Homeopathy has had worldwide use as a form of natural health care for over 200 years. However, it remains a highly controversial issue, with concerns over the effectiveness of the various remedies. Some are strong believers in this alternative form of medicine whilst others disregard its use based on the highly diluted nature of the active ingredient within the preparations. The dilution may be so great that, in some cases, no molecules of the active ingredient remain. For this reason, some have dubbed the remedies nothing more than ‘sugar pills’: any results are due to a placebo effect. Homeopathy was given its name in 1796, by the physician Samuel Hahnemann. The various remedies are composed of an active ingredient (usually toxic) that is subjected to repeated rounds of dilution with either water or alcohol. The final preparation contains dramatically low levels of the active ingredient or none at all. Snake venom and arsenic are among the vast range of starting ingredients for creating different remedies; the idea behind using harmful ingredients is that by diluting them to such a low level, the negative effects of the ingredient are eliminated whilst the healing benefits remain. The premise behind the creation of these
homeopathic remedies is based upon two principles held by Hahnemann, sometimes referred to as ‘laws’. The first of these is the ‘law of similars’, which claims that a substance that causes symptoms in a healthy person similar to those experienced by a sick person can be used to cure these sick individuals. This is often referred to as ‘like cures like’. The second is the ‘law of infinitesimals’, whereby the more diluted a mixture is, the more effective it will be. When preparations contain no molecules of the starting active ingredient it is believed that the diluting medium will have a ‘memory’ of the active substance. This point receives much well-deserved scepticism. Arnica is one of the most widely known homeopathic remedies in Britain and is fairly commonplace in household cabinets. It is derived from the bright yellow flowers of the plant Arnica montana. The herb itself is toxic and if eaten can be fatal. However, homeopathic pills or
creams contain a highly diluted form of this molecule which renders it safe for use. Despite its effectiveness being unconfirmed, it is widely used to treat aches, pains and bruising, and to reduce swelling. The people who advocate its use range from mums treating playground bumps, to cosmetic surgeons for the relief of post-procedure pain and professional athletes as a means to reduce muscle ache. It is also sometimes used as a type of pre-trauma aid, prior to dental treatment. Some swear by its effectiveness. However, positive effects of this homeopathic remedy are largely based on anecdotal evidence. Positive cases related to diminished bruising or a reduction in swelling could merely be a result of the natural healing process which occurs over a period of time and wrongly credited towards Arnica use. Alternatively, arnica pills may help people suffering trauma solely as a result of the placebo effect: it is known that just believing something is good for you can do wonders and even accelerate the healing process. Clinical trials are really what is needed and these are few and far between. In a double-blind trial, the effects of taking Arnica pills of
different doses (30C and 6C) were tested against a placebo. Participants in the trial were all undergoing wrist surgery for carpal tunnel syndrome; 64 patients took the pills from 7 days pre-surgery until 14 days post-surgery. The results showed that there was no significant difference between any of the treatment groups in terms of post-operative pain, bruising or swelling, and thus it was concluded that Arnica was no more beneficial than a placebo. To conclude, there is a lack of evidence to suggest that homeopathy has any real medical benefit, despite its widespread use. Selling homeopathic ‘medicine’ in pharmacies may be misleading to some consumers, as not all are aware of the ‘laws’ these remedies are based on and the lack of scientific evidence demonstrating effectiveness. Although homeopathic remedies are not considered to be damaging to health, a danger comes into play if people decide to delay the use of conventional medicine for a serious problem. Consumers of homeopathic remedies should have readily available information regarding homeopathy with appropriate labelling in pharmacies. People can then make an informed decision as to whether they wish to take the remedies as a form of treatment.
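The ‘C’ potencies mentioned above are successive 1:100 dilutions, so simple arithmetic shows why highly diluted remedies are unlikely to contain any active molecules at all. A sketch, assuming a starting solution containing one mole of active ingredient:

```python
# Each "C" step on the homeopathic potency scale is a 1:100 dilution,
# so an nC remedy dilutes the starting material by a factor of 100**n.
AVOGADRO = 6.022e23  # molecules per mole

def molecules_remaining(moles_start, c_potency):
    """Expected number of solute molecules left after a C-scale dilution."""
    return moles_start * AVOGADRO / 100.0 ** c_potency

one_mole_at_12C = molecules_remaining(1.0, 12)  # already less than one molecule
one_mole_at_30C = molecules_remaining(1.0, 30)  # effectively zero
```

Even starting from a full mole of active ingredient, the expected count drops below a single molecule at about 12C; a 30C preparation like the Arnica pills in the trial is, statistically, pure diluent.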
Roisin McDonough
ARTICLE
Fat Chance!
I’m sure you’ve heard countless times throughout your life that fatty foods are bad for you, but perhaps fats haven’t been given enough of a chance. Fats are nature’s best energy resource. At 9 kcal/g, they contain more energy than the empty calories in alcohol (~7 kcal/g) or macronutrients like proteins or carbohydrates (~4 kcal/g). Due to this high energy density, excess consumption of fatty foods can very easily lead to weight gain. Scientists have linked obesity to diseases such as cardiovascular disease, type 2 diabetes and many cancers, just to name a few. There’s also a big debate on whether supplementing ourselves with synthetic vitamins is as beneficial as consuming the natural forms, such as those found in fatty foods. Foods such as cereals, milk and spreads aren’t fortified effectively with bioavailable forms of vitamins, especially vitamin D. However, it’s not all bad news. Fats are essential for healthy bodily function; subcutaneous fats provide insulation and visceral fats protect the vital organs. The fat-soluble vitamins present in fats are also hugely beneficial. Making the correct choice of fats can replenish your serum levels of these essential micronutrients.
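The energy densities quoted above make for a quick back-of-the-envelope comparison of how little fat it takes to supply a given amount of energy. A sketch (the 500 kcal target is purely illustrative, not a figure from the article):

```python
# Approximate energy densities from the article, in kcal per gram
KCAL_PER_G = {"fat": 9.0, "alcohol": 7.0, "protein": 4.0, "carbohydrate": 4.0}

def grams_for(kcal, macronutrient):
    """Grams of a macronutrient needed to supply a given energy intake."""
    return kcal / KCAL_PER_G[macronutrient]

fat_g = grams_for(500, "fat")            # ~56 g of fat
carb_g = grams_for(500, "carbohydrate")  # 125 g of carbohydrate
```

The same 500 kcal takes well over twice the mass of carbohydrate as of fat, which is exactly why fatty foods make overconsumption so easy.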
Each year, thousands of children worldwide are blinded by vitamin A deficiency. ‘Golden Rice’ is a genetically-engineered rice plant that contains high concentrations of carotenoids such as beta-carotene, precursors of vitamin A (retinol). Since 2000 it has been grown in countries that rely heavily on a rice-based diet to help solve this problem. Vitamin A is also found in fatty meats, vegetables (carrots and sweet potatoes) and whole milk. Vitamin D is actually a steroid hormone, not a vitamin as its name suggests. It is involved in regulating the transcription of over 2000 genes in the body. Vitamin D deficiency has shown a clear correlation with various diseases such as osteoporosis, Alzheimer’s, Crohn’s disease, multiple sclerosis and multiple cancers. Vitamin D2 (ergocalciferol) is used much less efficiently by the body than vitamin D3. Its benefits may therefore be limited, so think twice before buying synthetic vitamin D2 supplements. Vitamin D3 (cholecalciferol) is the more bioavailable form, making it the better option. Our bodily concentration of vitamin D depends mostly on the length of our exposure to solar UVB radiation. Vitamin D is
synthesized in unprotected skin when exposed to UVB radiation, converting 7-dehydrocholesterol to cholecalciferol. Therefore, vitamin D serum concentrations correlate with seasonal fluctuations of UVB. In the UK, the body is unable to produce enough vitamin D from sunshine from October to April. This can lead to a deficiency of vitamin D3, especially if this hormone is not supplemented through our diets. Sun exposure stimulates nitric oxide synthesis, a potent vasodilator that helps to prevent cardiovascular disease. You may have seen yoghurt adverts that promote vitamin D consumption for efficient calcium absorption. This is true, in part, as Vitamin D3 regulates intestinal calcium absorption from the gut lumen through the intestinal plasma membrane pump. However, other molecules, like vitamin K2, are also required for efficient calcium utilisation in the body. Vitamin D may also be of interest to some people because of its role in sexual function. When bound to a cytosolic protein called “sex hormone binding globulin” (SHBG), testosterone is unable to act on androgen receptors. Vitamin D reduces the serum concentration of SHBG, increasing levels of free testosterone in the cell. This activates steroid hormone-dependent signalling pathways, so eating foods rich in vitamin D before a night out could potentially aid your performance by promoting sexual
vigour (if you’re lucky). It’s not called a sex hormone for nothing! There are eight types of vitamin E which are present in a lot of plant oils and soy. However, due to recent controversy with the levels of oestrogen in soybeans, better sources of vitamin E are nuts, avocado, sunflower seeds, apricots, kale and spinach. Vitamin K2 is required alongside vitamin D to translocate calcium from the blood into cells after calcium absorption from the gut. This reduces free calcium in the blood, lowering the risk of arterial calcification and atherosclerosis. Therefore, the benefits of vitamin D are only reaped alongside consumption of sufficient vitamin K2. Western diets tend to contain ten times more vitamin K1 than K2. As the body doesn’t convert K1 to K2 effectively, eating eggs and grass-fed dairy products, such as blue cheese, is required to maintain sufficient levels of K2, helping to restore a healthy K1:K2 ratio. Fats are therefore just as essential as any other source of energy, as long as they are acquired through natural, bioactive food sources.
Lydia Melville & Olli Walker
ARTICLE
Chasing the Sound Barrier
One of the most important breakthroughs in the history of aviation and our understanding of aerodynamics was the breaking of the sound barrier. You have probably experienced the sound barrier for yourself without even realising: the sound of a whip cracking is actually the result of its tip moving faster than the speed of sound, creating a miniature sonic boom. Since the time of Isaac Newton people have attempted to work out how quickly sound moves through air and whether we could “overtake” it, as it undoubtedly moves very quickly, but nowhere near as quickly as light. You can observe this during a lightning storm; the further you are from a lightning strike, the longer it takes for the accompanying thunder to reach you. By noting the time delay you can crudely estimate the speed of sound. This speed varies depending on atmospheric conditions, but generally it sits around 760-770 mph. The concept of a sound barrier became an issue in aviation during the First World War. It was noted that, despite early biplanes barely exceeding 140 mph, their propeller blade tips sometimes exceeded the speed of sound. This generated an alarming buzzing sound and, more importantly, reduced performance due to turbulence generated in the air by tiny shockwaves produced by crossing the barrier. In the Second World War, a number of reports were made of high-performance planes breaking the sound barrier during attack dives, resulting in violent shaking and, in more extreme cases, stalling or even breaking up. Some early jet fighters also experienced “control reversal” which could be fixed if they kept accelerating afterwards, indicating that when the sound barrier was exceeded some aircraft could actually perform more or less normally.
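The lightning-storm estimate described above can be sketched in a few lines. Assuming a nominal speed of sound of 343 m/s (which works out to roughly the 760-770 mph quoted), the thunder's travel time gives the distance to the strike, since the light of the flash arrives almost instantly:

```python
SPEED_OF_SOUND_MPS = 343.0  # dry air at ~20 °C; varies with conditions

def strike_distance_km(delay_s):
    """Distance to a lightning strike from the flash-to-thunder delay.

    Light covers the distance almost instantly, so
    distance ~= speed of sound * delay.
    """
    return SPEED_OF_SOUND_MPS * delay_s / 1000.0

d = strike_distance_km(5.0)  # a 5-second delay puts the strike ~1.7 km away
```

The familiar rule of thumb follows directly: every three seconds of delay is roughly another kilometre of distance.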
A common misconception is that the sound barrier is an actual physical barrier that aircraft were unable to cross. The fact that some did it “by accident” proves this to be false. The sound barrier is more of a transition from one type of air movement to another, and so to exceed the speed of sound safely an aircraft must be shaped and propelled in ways that allow it to adjust to the change in conditions. At high subsonic speeds, as airflow over a wing is faster than underneath it, a pocket of supersonic air forms on top, which terminates in a shockwave causing downward pressure on the wing surface. As the sound barrier is crossed a conical shockwave forms in front of the aircraft (the bow wave) and “trailing edge” shockwaves form behind. These shockwaves are extremely important because they are essentially boundaries between bodies of air with very different temperatures and pressures, which can significantly affect an aircraft’s performance if they are interfered with. For example, if the wings are long enough to contact the bow wave then significant drag may occur, which is why swept wings (wings that are angled from their root) are used so often. Britain and Germany both experimented with supersonic aircraft designs during the war, but neither achieved it. However, some of the features needed to achieve supersonic flight were identified as a result: small, thin wings; a jet or rocket engine to meet the high power requirement; an “all-moving” tail to counter control issues; and, for
jets, a shock cone to slow down air entering the engine (as jets require subsonic air to function). In 1947, the Bell Company tested an aircraft based on these features and more. Engineered by a team of scientists led by John Stack, the Bell X-1 was flown by Captain Chuck Yeager at high altitude after being released from the bomb bay of a larger aircraft. On its 50th flight, the X-1 became the first manned aircraft to fly at supersonic speeds. The first intentionally created sonic boom from an aircraft cracked through the atmosphere. But why, returning to the beginning of this article, do sonic booms happen? Again you have almost certainly experienced a much reduced version of the effect for yourself when you hear ambulances, police cars and any other moving vehicles with sirens. The pitch of an approaching siren is higher than when it is moving away, which is due to the Doppler Effect. Soundwaves moving out in front of a sound-making moving object are compressed, whilst those moving backwards are stretched. If an aircraft is moving faster than the sound it produces, it creates a wave of enormous compression just behind it, which results in an extremely loud, explosion-like noise.
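The siren example above can be made quantitative with the standard Doppler formula for a source moving directly toward (or away from) a stationary observer. The 700 Hz pitch and 30 m/s speed below are illustrative values, not figures from the article:

```python
def observed_frequency(f_source, v_source, v_sound=343.0):
    """Doppler-shifted frequency heard by a stationary observer.

    f' = f * v_sound / (v_sound - v_source), with v_source positive
    for a source approaching the observer and negative for one receding.
    """
    return f_source * v_sound / (v_sound - v_source)

approaching = observed_frequency(700.0, 30.0)   # pitch rises
receding = observed_frequency(700.0, -30.0)     # pitch falls
```

Note that the denominator shrinks toward zero as the source speed approaches the speed of sound: the compressed wavefronts pile up on top of one another, which is exactly the wave of enormous compression behind a supersonic aircraft that we hear as a sonic boom.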
James Ormiston
ARTICLE
The Male Contraceptive Pill — Coming Soon?
While the female contraceptive pill has been around for over 50 years, the range of effective male birth control methods available has lagged far behind. However, a widely available male contraceptive pill is promisingly close. Unfortunately, making sperm unable to fertilise an egg is much trickier than making the female reproductive system unreceptive to sperm. It's no wonder when you consider that over 1,500 sperm can be released in a second, and that for a male contraceptive to be effective, every single one must be unable to fertilise an egg! Along with the need for safety and reversibility, this has meant that progress in developing a male pill has been somewhat slow. At the moment, there are two main avenues of interest in the field of male contraception: hormonal and non-hormonal. Non-hormonal methods currently being researched involve the vas deferens, the duct which carries sperm from the testes to the penis and beyond. These have included the injection of a chemical that kills sperm, and the insertion of
a plug that blocks their passage out of the body. While initial results from both have proven promising, a simple non-invasive pill is much more likely to be accepted and used by the public. Much of the research into a possible male pill has focussed on ways of halting the production of the hormone testosterone, as it triggers the production of new sperm cells in the testes. However, testosterone levels must be retained to some degree, so that users don't experience unwanted side effects such as a lowered sex drive. Scientists have tested different combinations of two hormones, synthetic testosterone and a progestogen (a synthetic female sex hormone), and these have proven very effective. Phase III trials, the last stage of testing before drugs are awarded clinical licences, are even taking place in some countries. The male contraceptive pill could be closer to hitting the shelves than we'd previously imagined! While the focus so far has been mainly down this hormonal avenue, another clever approach currently
in the early stages of investigation involves using particular molecules to block the signals carried by others. For example, two receptor proteins, alpha1A-adrenoceptor and P2X1-purinoceptor, have been identified as important in male fertility. They form part of the signalling pathway needed to transport sperm through the vas deferens, and therefore out of the body. Studies found that mice lacking these molecules were completely infertile, without suffering negative side effects such as a lowered sex drive. Drugs targeting adrenoceptors are already used in humans to treat conditions such as high blood pressure, so it is hoped that similar drugs can be developed and trialled specifically for contraceptive purposes. A 2015 poll conducted by the Telegraph newspaper showed that over half of the 80,000 men who responded were willing to use a male contraceptive pill. Such large-scale uptake of the male pill would increase the options available to individuals and couples alike, empowering both sexes to take even better control of their fertility.
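The scale of the challenge quoted earlier (over 1,500 sperm released in a second) is easier to appreciate with a quick back-of-the-envelope calculation. A rough Python sketch, assuming purely for illustration that the rate is sustained continuously:

```python
# Rough scale check: the article quotes ~1,500 sperm released per second.
# Treating that rate as continuous is an assumption for illustration only.
SPERM_PER_SECOND = 1_500
SECONDS_PER_DAY = 60 * 60 * 24

per_day = SPERM_PER_SECOND * SECONDS_PER_DAY
print(f"{per_day:,} sperm per day")  # 129,600,000 sperm per day
```

Roughly 130 million per day: a male contraceptive must neutralise every one of them, which is why reliability is such a high bar.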
Amy Newman
ARTICLE
Alzheimer’s
The Diabetic Brain
According to the World Health Organisation (WHO), there are 47.5 million people living with dementia worldwide, so it is likely that everyone has come across the term 'Alzheimer's' at some point, having either known or heard of someone with the condition. Alzheimer's disease, the most common form of dementia, is a chronic neurodegenerative disease in which patients suffer an ongoing loss of function and death of neurons within the brain, causing long-term deterioration of a person's memory and cognition (their ability to process thoughts). It has an insidious onset, meaning that by the time symptoms of cognitive dysfunction first appear, the disease has already been present for a number of years, and it will continue to worsen as time goes on. There is also currently no treatment available to stop its progression. Alzheimer's not only causes memory loss but is one of the main causes of dependency among older people worldwide, with sufferers slowly withdrawing from family and society due to problems with language, self-care and mood. The WHO found that Alzheimer's accounted for around 60–70% of all dementia cases in 2015, with the prevalence increasing rapidly
and estimated to exceed 120 million by 2040. It should be noted that even though the disease predominantly affects the elderly, it is not a normal part of the ageing process. While the causes of Alzheimer's disease remain poorly understood, recent research has shifted towards an idea linking diabetes and Alzheimer's disease, suggesting Alzheimer's is actually the presentation of a new form of diabetes, termed 'type 3' diabetes. Diabetes mellitus is a relatively common and well-studied metabolic disease, currently divided into two categories, type 1 and type 2. The characteristic common to both forms is that the body cannot produce or use insulin adequately. Type 2 diabetes is usually caused by obesity or a sedentary lifestyle: body cells become insulin-resistant (desensitised to insulin) after being exposed to unnaturally high levels of insulin for a prolonged period. An appreciation of the pathology behind type 2 diabetes is crucial to understanding the potential link between diabetes and Alzheimer's disease. Up until the late 1970s, the brain
was considered an insulin-insensitive organ, because insulin has no effect on the uptake of glucose into brain cells, in contrast with the strong stimulatory effect it has on muscle, fat and liver cells. Recently, however, researchers at Brown Medical School and Rhode Island Hospital in the US have shown that the brain is in fact sensitive to insulin, having found high levels of the hormone in post-mortem brain samples. This has led to the discovery that insulin is not only made by beta cells within the pancreas but is also produced within our brains. In the brain, insulin acts as a neurotrophic agent (regulating healthy brain development), exhibits neuromodulatory effects (controlling the cell signalling between neurons) and has been shown to be a potent neuroprotective agent (preserving the structure and function of neurons) through strong anti-inflammatory effects. Researchers have now found that a specific reduction in this 'brain insulin' plays a key part in the development of Alzheimer's disease. Brain cells enter an insulin-resistant state, and this leads to lower levels of insulin and its receptors being present in the brain during the
initial stages of the disease, which continue to decline as the condition progresses. In light of these findings, the cognitive dysfunction experienced by individuals with Alzheimer's can be viewed as linked to insulin-signalling abnormalities similar to those experienced by a person with type 2 diabetes, only this time localised to the brain. The concept of Alzheimer's as a 'diabetes of the brain' means scientists are now testing medications currently used to treat type 2 diabetes as potential treatments for Alzheimer's. Recently, the diabetes drug liraglutide was found to reduce the damage caused by Alzheimer's in the brains of mice, and Imperial College London is now trialling the drug in people in the early stages of the condition. If successful, the drug could reverse the damage caused by Alzheimer's disease, potentially the biggest breakthrough in treatment of the condition for decades.
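The prevalence figures quoted above imply a steady underlying growth rate that is easy to check. A rough Python sketch, assuming the WHO's 47.5 million figure refers to 2015 and the 120 million estimate to 2040 (both reference years are assumptions about the quoted figures):

```python
# Implied average annual growth in dementia prevalence from the
# figures quoted in the article (reference years are assumed).
start_cases = 47.5e6   # 47.5 million cases (2015, assumed)
end_cases = 120e6      # estimated 120 million cases (2040)
years = 2040 - 2015

# Compound annual growth rate: (end/start)^(1/years) - 1
annual_growth = (end_cases / start_cases) ** (1 / years) - 1
print(f"Implied growth: {annual_growth:.1%} per year")  # Implied growth: 3.8% per year
```

A sustained rise of nearly 4% a year, far above population growth, underlines why new treatments are so urgently sought.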
Sumiya Hussain
Synapse Strips by
James Ormiston
SYNAPSE THE SCIENCE MAGAZINE WRITTEN BY STUDENTS FOR STUDENTS
Join the society as a writer, editor, artist, blogger or designer.
News & Updates Find Synapse on Facebook for news on upcoming events, opportunities and progress updates!
The SYNAPSE blog • Digital magazines • Online-only articles, global and Bristol science news
E: synapsebristol@gmail.com
synapsebristol.blogspot.co.uk
Did you know? Cooling liquid helium below -269 °C transforms it into a superfluid, allowing it to flow without friction. Superfluid helium displays unusual behaviours, such as climbing the walls of its container, remaining motionless while the container is spinning, and leaking through microscopic cracks. Because all atoms in the superfluid occupy the same quantum state, studies of superfluid helium are believed to have great potential for aiding our understanding of quantum mechanics.
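For readers more used to absolute temperatures, the quoted figure converts simply to kelvin. A quick sketch (the -269 °C value is the rounded figure above; the superfluid transition of helium-4 actually occurs at about 2.17 K):

```python
# Convert the quoted superfluid transition temperature to kelvin.
# -269 °C is the rounded figure from the text; helium-4 becomes
# superfluid below about 2.17 K (the "lambda point").
celsius = -269.0
kelvin = celsius + 273.15
print(f"{kelvin:.2f} K")  # 4.15 K
```

Just a few kelvin above absolute zero: cold enough for quantum effects to dominate the liquid's behaviour.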
Marta Plaszczyk
Synapse thanks our sponsors: