I, Science Issue 40 (Summer 2018)


THE SCIENCE MAGAZINE OF IMPERIAL COLLEGE

“YOU’RE ALL IDIOTS” Reflections on a flat earth convention

www.isciencemag.co.uk

THE LAST BREATH

The physical and ethical end of life

ENDINGS SUMMER 2018

ESCAPE VELOCITY Redefining systems of belief



Editors-in-Chief: Poppy-Jayne Morgan, Christopher Richardson
Magazine Editor: Claudia Cannon
Pictures Editor: Taryn Kalish
Web Editor: Pedro Ferreira
Online Features Editor: Joe Hincks
News Editors: Jordan Hindson, Meesha Patel
Radio Editor: Hannah Fisher
TV Editor: Lina Kabbadj
Business Manager: Luke Cridland
Marketing and Social Media: Rachael Smith
Events Manager: Josh Sucher
Sub-Editors: Hilary Guite, Rachel Kahn, Bridie Kennerley, Sarah Leach, Jonathan Neasham, Mariam Shafei-Sabet
Cover Illustrator: Pod Hughes

Hello and welcome to our summer term issue. The end of the academic year is here. For some, that may signal the end of exams, classes, or projects, but for others it may mean the end of your time at Imperial. Whether you are feeling wistful or joyful, we hope you have enjoyed the past year and look forward to the future. With this bittersweet feeling, we have explored Endings. We are surrounded by many endings, both great and small. We constantly seek to understand endings in our lives and we hope that this thought-provoking issue may provide some answers.

As we wave goodbye to what the year has held, we’re excited to bring you our new content in this magazine, as well as online and on air. Once again, we wish to thank our friends at the Royal College of Art and Central Saint Martins for their stunning contributions in this issue. We have some captivating articles for you to explore and we wish all of our contributors and readers the best in their next steps. We hope you enjoy the issue – in fact, we hope you read to the end.

Chris and Poppy-Jayne, Editors-in-Chief


Endings come in many guises. They can be personal, spiritual, physical and practical. For an individual, an end may come in the form of death or heartbreak. History has shown us the end of the dinosaurs and of ancient civilisations. We ponder the end of our species, our planet and of the Universe. Some endings we seek – to end ageing or publishing pressure.


But so often, endings simply mark new beginnings.


For further information on the artists and artwork featured in this issue, please visit: isciencemag.co.uk/features/iscience-issue-40-artwork

We’re always on the lookout for new contributors for both the magazine and the website. If you would like to get involved as a writer or illustrator please don’t hesitate to get in contact. You can email us at i.science@imperial.ac.uk, tweet us @i_science_mag or contact us directly through our website: www.isciencemag.co.uk

I, Science, c/o Liam Watson, Level 3, Sherfield Building, Imperial College London, London SW7 2AZ
Email: i.science@imperial.ac.uk
Printed by: Leaflet Frog, 38 Britannia Way, Bolton BL2 2HH


Interested in advertising with us? Contact us at: iscience-business@imperial.ac.uk

I, Science is a publication of the Science Communication Unit, Centre for Languages, Culture and Communication, Imperial College London. However, it is a student publication, and as such the views expressed in I, Science do not reflect the views of the Unit, Centre or College.



CONTENTS

4   “You’re all idiots”: Reflections on a flat Earth convention
6   The last breath: The physical and ethical end of life
8   Bans and burnings: Natural disasters and human choices as constructive forces
10  This is the end: The Earth | The Universe
12  Under pressure: Nothing personal, it’s just business | Don’t give in to peer pressure
14  You’ve got some nerve: Following the journey of nerve agents through the body
15  Ahead of the game: In celebration of innovations from civilisations past
16  Tripping: End of life visions and hallucinations | The psychedelic treatment revolution
18  Tipping point: Past and future wipeouts
20  To infinity and beyond: Get your head around the mythical ‘lazy eight’
22  Don’t go breakin’ my heart: The psychological responses to destructive emotions
24  The end of ageing: How to grow a 1000 year old head
26  Escape velocity: Redefining systems of belief
28  Seeking closure: Why we just can’t live without an ending
30  Upgrade available now: Archaic and innovative tech

Images: Shannon Bono | Pod Hughes



“YOU’RE ALL IDIOTS”: REFLECTIONS ON A FLAT EARTH CONVENTION
AUTHOR: CHRIS RICHARDSON | IMAGE: OLGA SUCHANOVA

What is reality? It’s the state of things that actually exist, although that in itself is a nebulous concept. When several minds come together we often tell elaborate stories and create intersubjective realities, things that do not exist but which are useful. A state is just an organisation with a monopoly on violence within a defined geographical area, its borders but arbitrary lines in the sand. Money is only as valuable as it is collectively agreed to be. There are elements of our “reality” so well ingrained that we rarely bother to question them at all. Yet as scientists – and as journalists – that’s our job. To scrutinise, challenge, and ask questions about the reality we are presented with.

With that in mind, I began speaking with a group of people questioning something that most people take for granted – the curvature of the Earth – to understand more about their notions of reality. I should begin by making this clear: these are not the garden-variety internet trolls dominating online comment threads. These are committed believers in a flat Earth model, many of whom had travelled from “around” the world to spend a weekend of their time exchanging ideas with kindred spirits at the debut UK Flat Earth Convention. The delegates were some of the most friendly, welcoming people I’ve ever encountered. “Peace and love” was a core message of the convention, which resonated with me and is difficult to fault. Off to a good start.

Yet things shifted gears as I began to explore their beliefs in more detail. What is interesting about the community is that there is absolutely no consensus among them, other than the fact that the Earth is flat. Beyond that, anything goes. Some say that gravity is an illusion arising from Earth’s status as a perpetually-rising cosmic disc. Others say that Antarctica is an ice wall occupying the entire circumference of the Earth, beyond which sits the government and/or alien civilisations. And, depending on Biblical inclinations, others say that the moon is a luminary body, emitting its own light rather than reflecting light from the sun.

Conversations typically began with a discussion of the flat Earth, yet would rapidly segue into lengthy, meandering monologues in which other ideas became entangled. The government is remotely controlling your brain through fluoride toothpaste and vaccines. The round Earth model and evolution are myths peddled by Zionists to make you feel small and insignificant. And, of course, 9/11 was an inside job.

At this point you are probably asking yourself why people would believe in such things. I think that it offers some solace, some respite from the fact that we are, in fact, small and insignificant within the endless cosmos. Alternative views offer shelter from a cold, harsh truth. Rather than confronting reality, believers are told they are special, and not to let the proverbial “them” make you think otherwise. And, more practically, belief offers a community of friendly people with whom to bond.

Routes into this community are as interesting as the beliefs themselves. One woman spoke about her reality shattering when she walked in on her husband in bed with another woman. At this point she realised she had been “living a lie” and decided to investigate other potential lies in her life. Taking a ruler along for her evening walk on the beach, she held it up to the horizon only to discover that there was, in fact, no curve. Others spoke fondly of their experiences with psychedelics – notably the so-called “spirit molecule” dimethyltryptamine – in awakening them to things not always being what they seem. The commonality seems to be a significant “awakening” event, with a dash of psychological vulnerability. And with this awakening comes a sense of superiority. “I was once a scientist like you” was parroted by many delegates, implying higher status, wisdom to be attained.

Commonalities may be rare within the community, but measurement and one’s own senses are generally held in high regard. Measurement is particularly interesting, as there must be a certain level of mental gymnastics involved in applying the scientific method while simultaneously demonising it as a flawed tool developed by liars. For what it’s worth, many flat-earthers are at least attempting to make measurements and engage with data, rather than accepting ideas on faith. Some of these experiments illustrate a serious dedication, such as FECORE’s recent experiments on Hungary’s Lake Balaton. The team braved days of blistering cold to fire lasers across the frozen lake in an attempt to prove the flat Earth model, which they claim to have achieved. Belief in one’s own senses is also interesting: they may be attuned to aid survival, but the human eye has to be one of the most unreliable, fallible pieces of equipment with which to make inferences about reality. Trust your senses? A simple optical illusion demonstrates just how easy it is to dupe the eyes.

I was asked to bring along a team of “mainstream” physicists and facilitate a debate against a panel of flat-earthers to conclude the weekend’s proceedings. Matching the overall mood of the convention, the debate was friendly and good-natured, although we were under no illusion we might change any minds. Despite claiming “healthy scepticism”, many people I spoke with confirmed that no amount of evidence would convince them that the Earth is not flat. This is denialism, something entirely different to scepticism, and it highlights a seemingly-impossible obstacle.

While entertaining, the debate itself became a frustrating sort of ping-pong in which specific pet ideas were thrown at the physicists. They provided “mainstream” scientific explanations for said ideas, which were then immediately dismissed. As well as highlighting a disdain for academic credentials, the debate illustrated a fundamental misunderstanding of the scientific method. For the flat-earthers, the replacement of old ideas with new shows a never-ending chain of “bullshit” in which scientists “get it wrong”. Yet this represents progress, something in which we should take pride, and something that clearly needs to be better communicated to the world.

I’ve never been to outer space, and probably never will. Is the Earth flat? Maybe, but probably not. The convention was an eye-opening experience and something that should concern us all. Beyond the vague and troublesome anti-Semitic undertones, the fundamental misunderstandings of science and truth on display extend much further. And the ramifications are legion. What is reality? Nowadays, it’s whatever one decides. You have your truth, I’ll have mine, and let’s leave it at that. The rise of the flat-earther hasn’t happened in a vacuum: it’s part of a much wider problem we’re facing as a society in which there are no truths, only perspectives and opinions. And as we descend further into our personal journeys and political echo chambers, this is only set to worsen. As scientists and science communicators, we certainly have our work cut out. And we need to start developing concrete strategies to combat these ideas, and pull us back out of this weird, post-truth rabbit hole.




The last breath
The physical and ethical end of life

The physical process of death

More than half the people who died in your living room this year were probably shot, stabbed, or killed themselves, and a surprising number died a supernatural death. Watching television is the way most people first encounter death, and it is often our closest encounter with it until our parents die. Research into the content of popular programmes shows that portrayals of natural deaths, which occur 92% of the time in real life, are the exception on television. Our living room experience of death is the ultimate fake media. There, people die of the wrong things in unrealistic ways. This article will aim to correct that balance. I will cover the top five physical signs that show that death will occur within three days, discuss if a “wet” or a “dry” death is preferable, and explain the death smile and the death rattle.

How people die is changing. Deaths from cancer and dementia now account for 4 out of every 10 deaths in England, and many others die from multiple long-term conditions. Sudden deaths from heart attacks and strokes are, meanwhile, decreasing. So how do people die if their hearts keep on beating and their lungs keep breathing? For most people entering the terminal phase of their lives, the desire to eat and drink decreases. Most people will die within 10 days of stopping drinking, and begin to fall in and out of consciousness within three days. There is concern that people dying in this way are suffering intense thirst and pain. This concern leads some practitioners to turn the “dry” death into a “wet” death. Hydration and nutrition are added via drips or PEG tubes inserted through the abdomen directly into the stomach.



These measures in the last days of life may not extend it and may risk inhalation pneumonia, pressure sores and bloating. On the other hand, people choosing to bring forward their own deaths by voluntarily stopping eating and drinking describe euphoria and pleasant lightheadedness. It appears that dehydration reduces pain and the death rattle – a noise made when breath passes over the secretions that gather at the back of the throat and in the lungs once the cough reflex is no longer active.

The death rattle is one of the first of the five signs that occur in the last days of life. In a rare study into the physical changes at the time of death, Morita in 1998 documented that the death rattle occurred, on average, at 57 hours before death. At 7.6 hours before death, mandibular breathing appeared – sometimes called “fish breathing” as the patient juts their jaw forward and gasps for breath. This is a natural part of death and not thought to be distressing to the patient. Next, at 5.1 hours, the peripheral circulation shuts down, leading to a bluish tinge on the skin. This is followed by loss of the radial pulse in the arm at 2.1 hours. The last sign is usually Cheyne-Stokes respiration – intermittent breathing with long gaps of 30-40 seconds between breaths, making relatives think the end has come. At the true end, some people have a spike in brain activity that can lead to them opening their eyes, smiling, sometimes saying something or seeing things. Most people witnessing this describe a profound sense of peace.

Euthanasia and Physician Assisted Suicide

“You wouldn’t let a dog die like this.” The 30-year-old wife of my patient dying from non-Hodgkin’s lymphoma was crying. I was crying too. There was nothing I could do legally as his doctor to help him die. Ask any of us (well, 80% of us according to the UK national survey body NatCen), and we think doctors should help people to die if they have a painful and incurable illness that will kill them. This is called voluntary active euthanasia and usually requires the administration of a lethal dose of barbiturate. This method is almost certainly preferable to the nausea-inducing “euthanasia rollercoaster” designed by Lithuanian artist Julijonas Urbonas to create a deathly 10G force for one minute, followed by seven loop-de-loops.

Personally, I would prefer eternal entry into San Junipero, featured in season three of Black Mirror, inhabiting my younger body in an endless “life” of love and partying. Failing that, I think that voluntarily stopping eating and drinking with some opiate support might just be okay. The real debate comes when we discuss assisted dying for conditions that are not curable and are painful and/or make us dependent on others, but will not kill us. In most people’s eyes this is assisted suicide rather than euthanasia, for which national support falls to 50%. The distinction might seem semantic, but most nations agree that this situation is ethically distinct from euthanasia. In most countries, it is still illegal to assist someone to die under these circumstances.

Beneath our ambivalence to active Physician Assisted Suicide is a war between three types of ethics. All have important points to make, but all are inconsistent with each other. The first ethical position is what most of us turn to day-to-day: a kind of “do as you would be done by” rule-based ethics arising from Immanuel Kant. It goes: “I would like control of how and when I die and I guess you would too, so it must be alright”. But you may think very differently to me, so who decides which of us is right?

Aristotle had the answer for who decides. He thought that a “good man does what a good man does” – I would update him by saying a good person. He believed that a person who has spent a lifetime reflecting and trying to improve their own virtue will come to the most ethically correct decision because of this lifetime endeavour at self-improvement. But can we trust doctors, nurses or judges to be that good person? There have been enough scandals in the medical and legal professions to mean that few of us would ultimately trust them to decide if we live or die.

Probably the scariest ethics of all, in this case, is utilitarianism – the greatest good for the greatest number. Vulnerable people may feel it’s better for everyone if they die. People could feel under pressure, either directly or indirectly, to request death to avoid being a burden.

There are other arguments against euthanasia. It fundamentally changes the relationship between the patient and their doctor. Once Physician Assisted Suicide is legal, how will you know your physician has your best interests at heart? Those running hospice care think that Physician Assisted Suicide could become the preferred, cheaper option. Who wants to be in the position where, to die free of pain, you have to kill yourself, when good hospice care could give you a peaceful death?

Author: Dr Hilary Guite | Image: Yvonne Yen




Bans and burnings
Natural disasters and human choices as constructive forces

Author: Issy Stafford

A truckload of rubbish enters our oceans every minute and the majority of this is plastic waste. Plastic has been identified in every marine ecosystem and is known to impact at least 700 species in the ocean. Even humans are at risk, with the average seafood eater consuming around 11,000 fragments of plastic each year. However, following the success of Blue Planet II, a heightened consciousness amongst its 14 million viewers about the severity of the marine plastic issue has led to increased pressure on worldwide governing bodies. In January 2018, the UK announced the strongest ban on microbeads globally, whereby all products containing microbeads will be removed from our shelves by 30th June 2018. Trillions of these non-biodegradable beads flush out of UK households every day from our cosmetic products. Their small size and resilience mean most travel into our oceans, where they are ingested by at least 280 different marine species. A domino effect is now occurring globally as other countries, including the United States, New Zealand and parts of Europe, are also enforcing a ban on the bead.

This isn’t the only significant ban surrounding plastic that came into effect in 2018. As of the 1st January, China banned 24 types of waste imports, including household plastic. Since the 1980s China has been the world’s largest importer of plastic waste, with the UK alone exporting 2.7 million tonnes to China and Hong Kong since 2012. This ban has received much negative press as UK recycling companies warn of stockpiling and more incineration of plastic waste. However, the ban is ultimately a highly sustainable and positive enforcement. Much of the waste that ends up in China isn’t suitable for recycling and stockpiles in landfills, from which it leaches into China’s watersheds. 95% of ocean plastic comes from just 10 rivers, and it is China that hosts the world’s most polluted, the Yangtze River. According to a recent study, the Yangtze deposits an estimated 333,000 tonnes of plastic waste a year, which is 55% of the world’s total plastic waste from rivers. A ban on waste imports will reduce pressure on Chinese recycling systems and allow for the start of a mass clean-up process. In turn, we could see serious reductions in pollution levels from both the Yangtze River and China as a whole.

In the UK, we should also see the positives from the end of this destructive practice. Although the demand for single-use plastic continues to rise, a build-up of pressure in waste management systems is likely to be a game changer for developed countries. With no alternatives, countries must quickly devise sustainable waste-management solutions and curb the demand for single-use plastic. In the UK, almost 70 consumer goods companies have signed a plastic pact, whereby all plastic packaging will be 100% recyclable, compostable or reusable by 2025. Within the next year we could also see the ban of other harmful plastics such as straws and cotton buds, as the war on plastic proliferates.



Author: Hannah Fisher

Volcanoes are some of nature’s most destructive forces. Vesuvius, Krakatoa, Mount St. Helens, Eyjafjallajökull – all are famous examples of simultaneously fascinating and terrifying volcanic eruptions. Such eruptions cause havoc and destruction – in January 2018, Mount Mayon in the Philippines erupted multiple times over two days with huge fountains of lava, leading to mass disruption and 50,000 people being evacuated. In May 2018, Kilauea in Hawaii (which has been erupting continuously over the past 30 years) became explosive, causing damage to homes and transport routes.

But volcanoes are also one of the most constructive forces, central to the very existence of life on Earth. They are a source of building materials, they made farming a possibility, and they even helped to form our atmosphere. Volcanic eruptions around 201.4 million years ago led to the rise of the dinosaurs after toxic fumes from the eruptions made many other species extinct. After the dinosaurs’ own extinction, mammals – and eventually Homo sapiens – rose to prominence, leading to the age of human dominance over Earth. Volcanic eruptions also bring long-term benefits to their surrounding areas. The soil surrounding volcanoes is incredibly fertile due to the breakdown of volcanic ash, which often contains beneficial minerals for plants. This means many populations are based around volcanoes because of the farming opportunities.

Forest wildfires are another natural phenomenon that we often attempt to prevent. However, while appearing to devastate the environment, wildfires actually remove debris from forest floors, opening them up to sunlight, and nourish the soil. This helps to regenerate and revitalise forests to make them stronger and healthier, ultimately aiding the existence of diverse life on Earth.



Wildfires naturally clear the forest floor by removing low-growing underbrush and debris, allowing more sunlight to penetrate the forest cover to the ground below. Fire ensures trees undergo a ‘natural selection’, as it were – ridding the forest of the weaker trees and allowing those remaining to become stronger and healthier. This increased space on the ground allows for a greater flow of water, helping to provide habitat, and fire can clear out widespread diseases affecting forests. In fact, after a series of destructive fires in Yellowstone National Park in July 1988, which burned almost 800,000 acres, the park launched a fire management plan which included an appreciation of the importance of natural fire in wilderness and national park areas. Ultimately, some areas depend for their survival on forest fires occurring every 3-25 years. Some trees require fire to provide the heat which opens their cones, allowing them to germinate, and some even have leaves that contain flammable resin to ensure they maximise any chance of a potential forest fire.

While we should take measures to mitigate impacts on human life, we must remember that these natural processes would occur whether or not we were on this planet. Both volcanoes and wildfires can be hugely destructive forces, but they can also create incredible chances for life. Nature has a remarkable way of creating new life out of so-called ‘disasters’ and apparent endings.

Image: Sophie Moates



This is the end
Author: Sarah Leach | Image: Nicolas Baird

Nothing lasts forever – not even the Earth we stand upon, the Sun that rises in the east every morning, the stars that shine in the sky, or even, perhaps, the Universe itself. How will these constants cease to be constant? And what will that mean for the future of life?

The Earth

Science can already chart the future that lies ahead for our home. The Sun is a star, and, because there are plenty of stars in the Universe to observe, the basic physics that defines their lives is understood, even if some details are still being worked out. Counterintuitively, as stars like our Sun age, they become brighter, hotter, and larger. In fact, the Sun has been becoming more luminous for its entire life. This has gone on fairly steadily for almost five billion years, and will continue for roughly another five billion. Then, things will change. The Sun creates most of its energy by fusing hydrogen in its core. That hydrogen, however, is a finite resource, and will run out. When it does, the core will collapse inwards, and its outer layers will expand outwards. The surface of the Sun will expand to reach roughly the orbit of the Earth. It is most likely the Earth will be swallowed by the Sun and destroyed, merging into the star it has orbited for so long.

The Earth will likely be unlivable long before then, though. The increased heat from the Sun may cause dramatic climatic changes. It is likely our planet will lose all its oceans in the next few billion years. With the loss of this water, plate tectonics could cease, ending a crucial piece of the carbon cycle which life depends upon. Before that happens, the increased heat could change how minerals on Earth react with the environment, taking carbon dioxide out of the air and killing off plants. Whenever it happens, the future is clear: life is doomed if it remains solely on Earth.

Leaving home is a rite of passage. The question of when we leave our home on Earth is typically framed as one of exploration, or of escape from a planet we have ruined. However, even without the drive to expand our reach or a failure to live within our means, eventually we will have no choice. Life will not just have to leave our home planet, but likely our home solar system. At that point, interstellar travel will become a matter of survival. Humans have always been migrants. We have a deeply inborn desire to explore, likely because our ancestors were the ones who journeyed into the unknown. That desire will probably drive us to leave the Earth long before we must. People are already lining up for one-way trips to Mars, and probably will for much longer trips too. But, eventually, there will be no home for any life to linger in, even for the homebodies. One day, in order to survive, all life will have to set out on the great adventure across the cosmic seas.



The Universe

The end of the Universe is harder to predict than the end of the Earth. We do not have many universes at different phases of their lives to observe, as we do with stars, so our ideas are theoretical, and likely to be wrong. We know that the Universe is expanding, and that this expansion has been speeding up over time. Whether that continues comes down to a rather esoteric question: what is the shape of the Universe? When scientists talk about this, “shape” means something different from what might usually come to mind. This is the shape of space and time themselves. As such, it is something difficult to observe from our vantage point.

This shape matters a lot. If that shape is curved inward, like a sphere, then the Universe will eventually cease expanding and contract backwards instead, into something occasionally called the ‘Big Crunch’. However, if space is curved negatively, like a saddle, it would expand forever. The last possibility, a flat Universe, lies in between these two curvatures. In a flat Universe expansion slows down, eventually becoming so slow the Universe may as well be a stable size. So, what shape is our Universe? The observations so far point to it being flat, but the acceleration driven by dark energy means it may never stop expanding.

What does this mean for the Universe, especially for life? It’s not good. As the Universe expands it gets colder and galaxies get further apart. This is especially bad news for future astronomers, although we still have some time left. In about 150 billion years we may be unable to see galaxies outside our Local Group, a cluster of galaxies bound together by gravity. Because light takes time to travel, losing the ability to observe these far-off places would mean that life would no longer be able to make observations about the early history of the Universe. Eventually, stars will stop forming, and the skies will slowly go dark. Black holes may come to dominate the Universe, eating up free matter. It’s possible that the expansion will cause everything from planets to particles to split apart. This is, obviously, not good for life in the end. However, it is trillions of years off. In the meantime, it might be a golden age for computers. As things become colder, computers will become more efficient, and there are theories that civilisations based upon artificial intelligence may be out there, hibernating, waiting for the Universe to cool off. Perhaps one day we will replace ourselves with computers running efficiently, still exploring, although under darkening skies.
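For readers who want the three cases in one line, they can be summed up with the standard Friedmann equation from textbook cosmology – a sketch in conventional notation, not a formula quoted in this article:

\[ H^2 = \left(\frac{\dot{a}}{a}\right)^2 = \frac{8\pi G}{3}\rho \;-\; \frac{k c^2}{a^2} \;+\; \frac{\Lambda c^2}{3} \]

Here a is the scale factor (the “size” of the Universe), ρ is the density of matter and energy, k is the curvature and Λ is the dark energy term. Positive k (the sphere) can drag the expansion to a halt and reverse it; negative k (the saddle) cannot; k = 0 is the flat, in-between case; and a positive Λ keeps the expansion accelerating regardless of the rest.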



UNDER PRESSURE
AUTHOR: LUKE CRIDLAND | IMAGE: ROSE ZHOU

NOTHING PERSONAL, IT’S JUST BUSINESS

“This is not a university anymore but a business.” “Professors are really like small business owners.” “It is a prostitution of science.” These are powerful, damning words, and they aren’t quotes sourced from a student protest. They are instead from the highest of sources within the academic system itself: Alice Gast, the President of Imperial College; David Colquhoun, Emeritus Professor of Pharmacology at UCL; and Stefan Grimm, a Chair in Toxicology at Imperial College London who committed suicide in 2014, supposedly because of pressures he faced meeting funding expectations.

Over recent years, universities have been slowly morphing from one form to another, moving away from the tuition-fee-free institutions they once were towards more profit-driven entities, and with profit come defined targets and results. Academia is by its nature defined by uncertainty, a push into the unknown, but this new culture is at odds with that founding ideal. A change was occurring, and in recent years the very real impact of this new guise that universities have taken has started to be recognised.



To the public, this change is most visible in the rise of tuition fees, with costs now being up to £9,250 per year for undergraduates. However, academics have also been facing the stark reality of this business-driven approach, to a damaging degree. One such example was the threat to pension schemes for academic staff this year. Strikes were held in protest at the proposed amendments and the second term for many universities was disrupted as a result. Unions have, for now, called off strikes because an expert group has been called in to review the proposed changes, but that does not mean the issue is resolved.

Academics spend much time on their work, seeing the ‘greater good’ ideal as an incentive for the hours consumed, but where this worldview was once a luxury they had dominion over, it has now become an expected norm. What makes matters worse is that it may not be the work produced in these innumerable hours that provides the measure of their success, but instead the financial recognition it gains in grants. This leads us to the dark reality we are in now. Academics are driven to punishing workloads in pursuit of a goal whose achievement can’t be promised, but whose results will guarantee their tenure. These are unsustainable conditions for academia, and if there’s anything to be taken from Stefan Grimm’s tragic death, it is that it should be a wake-up call for those responsible for inducing this environment.

Some signs indicate this may be starting to occur. Nature recently surveyed over 3,200 scientists concerning their well-being. This level of questioning is unprecedented, and can hopefully provide more transparency and scrutiny going forward. This is a crucial step in trying to amend the patterns of the present, but it needs to be the first of many if we are to see real change.



DON’T GIVE IN TO PEER PRESSURE

Whether rightly or not, grants determine the success of a scientist, but what determines a grant being given to one academic over another? It is the metric system of citations and impact factors facilitated by scientific publishing houses that has this sway. These metrics have become powerful tools within the scientific community because they offer a quick and easy shortcut for a funding body’s decision-making process by quantifying academic success. Consequently, a publisher’s decision to publish a paper can have a great impact on the work’s success and the future success of the research team behind it.
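To make the arithmetic concrete, the two-year journal impact factor that these decisions lean on is conventionally calculated as follows – this is the standard definition used by citation databases, sketched here for illustration (the year 2018 is just an example), not a formula taken from any one publisher:

\[ \mathrm{JIF}_{2018} \;=\; \frac{\text{citations received in 2018 by items the journal published in 2016 and 2017}}{\text{number of citable items the journal published in 2016 and 2017}} \]

In other words, a journal’s score is driven entirely by how often its recent papers are cited, which is exactly the loop described in what follows.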


The reason publishers have such influence over metrics is that it is a journal’s reputation which directly determines them. The more citations the published papers of a journal receive, the higher its impact factor. The higher the impact factor, the greater the prestige of a journal, and monopolies start to form. However, this system should be questioned, because publishing is flawed and is allowed to remain so precisely because of these monopolies.

A major flaw to address is the peer review system. This is what publishing houses use to determine whether a paper is of good enough quality to publish in their journal. An editor will send a paper off to be reviewed by experts in the paper’s field, and whether it is rejected or accepted is based upon the reviews submitted by these experts. However, it is not as clear-cut as this. Firstly, two or three reviewers aren’t a large enough pool of scrutiny, regardless of their skill-set. Secondly, this entire process goes on behind closed doors, with many of the merits and flaws of a paper never made available to the public. Most importantly, reviewers are only human. Usually, they are busy academics being asked to review a paper for free. The incentive is one towards the community, not the individual, and expecting every scientist never to be selfish when reviewing is a hard ask. Consequently, whether intentionally or not, selfishness can seep into the time actually spent reading a paper and into how well personal biases are left to one side.

Although reviewers remain anonymous, authors do not. Niche areas of science are small communities, a figurative village of expertise. Everyone knows everyone, and it’s only human nature that not everyone in a community will get along, with rivalries forming. What’s to stop such a grudge from clouding the content of a review if it’s a rival’s paper?

Recently, changes have started to occur to stamp these biases out. Many journals have adopted a double-blind peer review system, which makes both the authors of the paper and the reviewer anonymous. Furthermore, Nature Communications has introduced a transparent peer review system, where authors can opt in to publish the reviewers’ comments alongside their paper. Although a start, a lot more work is needed to truly improve peer review and publishing, but so long as transparency is at the heart of any change, we can be hopeful.



YOU’VE GOT SOME NERVE
FOLLOWING THE JOURNEY OF NERVE AGENTS THROUGH THE BODY

Images of officials in full hazmat suits and gas masks are not something that we expect to see outside of horror films. Since the nerve gas attack in Salisbury earlier this year, these scenes have become more familiar on the news, causing much distress to the British public. Nerve agents were first discovered in the ‘30s, but have recently been thrown into the limelight following the attack on former Russian spy, Sergei Skripal, and his daughter Yulia. Although discovered by accident by researchers attempting to find insecticide alternatives, they were found to be highly toxic to humans and were later weaponised by Nazi scientists. The first nerve agents were tabun and sarin, and even though the Nazis developed plans to manufacture them, the plant was (thankfully) not operational by the time the Third Reich fell. From there, UK research into insecticides took similar steps, with VX being created in 1952 and handed over to the UK’s Porton Down Chemical Weapons Research Centre. When the UK renounced chemical weapons, VX was handed over to the US, where over 3,000 sheep grazing 27 miles from the base holding the agent were accidentally killed. In the ‘70s and ‘80s, Russian scientists developed a series of advanced nerve agents: Novichok (meaning newcomer). These are the most potent chemicals of this type currently known.

Nerve agents, like many weapons, come in different strengths and types, but all act by shutting down the nervous system with potentially fatal outcomes. They fall into three main groups: Novichok (including A-230), G-agents (including sarin) and V-agents (including VX), and can come in the form of gas, powder or liquid. It is incredibly easy for any of the agents to get into your body as they can be breathed in or absorbed through your skin, and while they might each have their own characteristics, all cause horrifying and painful effects. Nerve agents are often regarded as “possibly some of the most dangerous things that humans have ever made, after the atom bomb”. But what happens once one gets under the skin?

All of these chemicals block the nerve signals from the brain to the organs by stopping a particular enzyme at the junctions that relay messages. This enzyme, acetylcholinesterase, normally works to regulate acetylcholine, a neurotransmitter controlling the autonomic nervous system. Overall, acetylcholinesterase acts as an off/on switch at these junctions, or synapses. If a nerve agent comes along and blocks acetylcholinesterase, then the body loses control and the synapses are always on. This overstimulation of the synapses is what results in the physical symptoms.



First, this will cause uncontrollable salivation and fluid production, leading to the foaming at the mouth seen in multiple sarin attacks. Then the pupils become constricted, hindering the victim’s sense of sight, and, due to every nerve synapse in the body firing repeatedly, the chest will tighten and breathing will become increasingly difficult. The over-firing causes systems to malfunction and stop understanding the signals, meaning that bodily functions fall out of the control of the brain. There are convulsions, rapid heartbeat, sweating, vomiting and diarrhoea, increasing the risk of dehydration. If a patient exhibits convulsions, it is a sign that the dose was high enough to be fatal. Fatal doses can kill within 10 minutes.

Is there a treatment? Removing the nerve agent is of utmost importance in these cases. If it is removed, the brain has a chance to regain control of the bodily systems. One of the drugs administered is atropine, which blocks the receptors that acetylcholine acts on, in an attempt to limit the firing. However, recovery rates in nerve agent attacks vary and depend on what agent is used and how soon help is found. Overall, control over your nerve endings is hugely important, and it is clear to see why using such chemicals is illegal in war. In an ideal world, we would hope to look to a future where nerve agents are neither used nor created.

Author: Rachael Smith | Image: Olga Suchanova



AHEAD OF THE GAME
IN CELEBRATION OF INNOVATIONS FROM CIVILISATIONS PAST

Author: Sadie Sweetland | Image: Pod Hughes (detail)

Today’s civilisations believe that they are more innovative than ever before. Technology has a large role to play in this. It is often associated with a civilisation’s success, and, with the rise of computers and artificial intelligence, we appear leagues above ancient civilisations. Yet, if we go back more than 2,000 years, we can see innovations and inventions that were ahead of their time. The Ancient Romans, for example, produced advances in technology and engineering that were unrivalled for centuries: they perfected aqueducts to transport water into their cities, created a remarkably durable concrete allowing structures like the Colosseum to still stand today, invented the book (a stack of bound pages known as a codex), invented surgical tools such as forceps, specula and bone drills, pioneered caesarean sections and even disinfected instruments in hot water before use.

Prodigious advances in understanding and technology did not begin with the Romans. The Ancient Greeks created water mills, perfected cartography to create the first map of the world, invented odometers to measure distances, and made discoveries in science and medicine that have contributed to our knowledge ever since. Greek mathematicians laid the basis of geometry as we are taught it today. Greek scientists proposed that the Earth revolved around the Sun, that the Earth was a globe, and that the symptoms of a disease were caused by the body’s reaction to its progression, not by the gods punishing humans.

It doesn’t stop here. We can go back even further. Ancient Egyptians created water clocks, locks and the engineering feats needed to build the pyramids. The Sumerians are believed to have created the first wheeled vehicles. The Minoans built palaces with sophisticated drainage and water systems, making them extremely hygienic for their time. Across the centuries, civilisations have been constantly innovating. Although many of their inventions have been replaced by modern technological developments, they created a completely different way of thinking and living, and a basis for developing inventions further.

However, every great civilisation falls in some way. Some are incorporated into other civilisations, as when Ancient Egypt was absorbed into the Hellenistic Greek world, but others collapse into a simpler form, as occurred in the Dark Ages. When civilisations fall, we can lose some of the progress that they have made. When the Roman Empire fell, a period known as the Dark Ages arose in Western Europe. During this time, many of the advances made during the Roman Empire were lost. This was not because people became less intelligent. Rather, as the original infrastructure fell apart amid political division and warfare from invaders, people had more important things to worry about than building aqueducts or learning to read.

We have lost important knowledge and technology through civilisations ending. Some of it reappears later in history, as is seen with the steam engine. The steam engine was first invented by Hero of Alexandria in Egypt in the first century AD but was never pursued further. It was forgotten until 1577, when the Turkish scientist Taqi al-Din reinvented it. We can also see this with the Antikythera mechanism. It was found amongst the remains of a shipwreck in 1902 and was dated to the Ancient Greeks. The mechanism could be manipulated to predict the astronomical positions of the sun, moon and certain planets decades in advance. It could also track the four-year cycle of the athletic games. Its complex gear mechanism, which could calculate lunar phases and solar years, makes it the world’s first analogue computer. Yet, somehow, the knowledge of this technology was lost and did not appear again until the 14th century with the development of mechanical clocks.

Other knowledge is lost forever. One of the biggest losses in history was the destruction of the Library of Alexandria. The Library of Alexandria was founded in Ancient Egypt around 300 BC and was the first attempt to gather all of the world’s knowledge in one place. When it was destroyed, so too was some of the knowledge and technology of the time. Another, similar, event was the destruction of the House of Wisdom in Baghdad. This was created during the Islamic Golden Age as a major intellectual centre housing work from scholars in medicine, astronomy, mathematics and philosophy. This was a time when science, philosophy and culture were flourishing in the Islamic world through the preservation of knowledge from earlier civilisations, along with additional improvements and innovations of their own. That knowledge was lost with the destruction of Baghdad and the House of Wisdom by invading Mongols.

Throughout history, the progression of knowledge has been slowed down or stopped completely by the fall of civilisations. This raises the question: if these civilisations had never ended, would we be more advanced than we are today? Maybe, if we had not spent time reeling from the fall of empires, our progression would not have slowed, and we would have been texting people, driving cars and watching TV back in the 18th century. Or maybe, if these civilisations had never fallen, technology would have progressed in a completely different way. How technology appears today is the culmination of centuries of innovation and invention and, if history had not played out how it has, there might have been a different basis for it. We may never have decided to burn coal for energy, build the rail network, or create machines that can think for us. Some academics believe that civilisations today will fall through a ‘clash of civilisations’. If they do, will we also lose or halt the progression of knowledge as has happened in the past? Years down the line, maybe someone will be writing an article like this about the mobile phone, a strange device that was leagues ahead of its time.



End of life visions and hallucinations
Author: Catherine Webb

What will you see when you are dying? You might imagine your life flashing before your eyes, a light at the end of a long tunnel, an out of body experience or simply darkness. What you may not be expecting is for an elephant to walk nonchalantly past your hospital bed, or your long dead grandma to come by and hold your hand. Hallucinations often go underreported but are common among end of life patients in hospice settings. ‘Deathbed visions’ of loved ones or angelic presences are associated with peaceful deaths, but around 20% of hallucinations cause distress. Some see these phenomena as proof of spirit communication or the existence of an afterlife. However, there are many biomedical theories as to why a dying person may be hallucinating. A ready culprit is the painkiller morphine. With morphine, the classic hallucination is of large animals, and that’s useful to know, says end of life (palliative) care specialist Dr Kate Crossland, from St Joseph’s Hospice, London. “If a patient tells you there is a leopard on the ward, it’s a good sign that you maybe need to change their pain medication around.”

Other palliative care drugs, as well as oxygen depletion and conditions such as dementia, can be associated with hallucinations. A further theory is that some cancers may release hallucinogens into the bloodstream. Dr Crossland suggests that some may simply be the result of fanciful thinking and deteriorating eyesight. The upshot of all this is that we don’t always know why a particular patient might be hallucinating. What is more interesting than these biomedical explanations, says Dr Crossland, is the cultural overlay – in other words, what are people getting out of these visions? What stories are they telling themselves about their impending deaths? For instance, in African-Caribbean culture, seeing someone who has already died is considered to be a sign that death is imminent. The departed are thought to be coming to help the patient into the afterlife and this is considered an entirely natural part of the dying process.

Palliative care is undergoing a culture shift in how end of life hallucinations are treated. In biomedical settings they were often considered as symptoms which must be treated with antipsychotic drugs – and when a patient is distressed, that is still the case. However, if a patient is using the hallucinations to resolve difficult psychological issues, or if they are deriving comfort from them, there is no need to start administering antipsychotics. Dr Crossland stresses the importance of communication. “I would say to a patient ‘sometimes people at your stage see things that aren’t there. We call them hallucinations, has that happened to you?’ And if they say yes then I’ll ask them, ‘how does that make you feel?’ And if they say they find it comforting then it doesn’t need treating.” The aim is to respect the patient and to help them to have a good death, as free from physical and psychological pain as possible.

Image: Maritina Keleri




The psychedelic treatment revolution
Author: Rachel Ditchfield

Research in the 1950s showed promising results for psychedelics facilitating talking therapies. But this progress was brought to a screeching halt in the 1960s with strict controls on the use of such drugs, including on research into their therapeutic potential. However, functional MRI (fMRI) scans, which show which areas of the brain are working in response to different stimuli, have generated pioneering research into the action of psychedelic drugs in the brain, justifying once again the medical study of these drugs in humans. For those suffering from treatment-resistant depression (TRD), there is mounting evidence that psychedelics such as psilocybin could be the way forward where available treatments have failed. The nature of TRD means that patients have endured an arduous process of trying various unsuccessful courses of treatment, often with unpleasant side effects. Conventional treatments have thus far led to dead ends. Can the recent resurgence of psychedelic research provide new insights?

Psilocybin, the active ingredient in magic mushrooms, induces feelings of euphoria, hallucinations, and mind-altering experiences. It bears a striking structural similarity to serotonin, a mood regulator implicated in depression, and acts in a similar way on the brain. People who have taken psilocybin show increased interconnectivity across the whole brain.


fMRI findings correspond to patient experiences of increased fluidity of thought and openness of mind. Patients participating in one study to treat TRD described feeling a change from disconnection to connection and from avoidance to acceptance. Some patients who reported these emotional breakthroughs with psilocybin said they had not been able to achieve them through conventional medication or short-term talking therapies. Enduring changes in mood can occur with just one or two doses, but why are these changes occurring? A team at Imperial College London are investigating changes in the brain activity of patients with TRD, before and after taking psilocybin. They think the drug acts by ‘resetting’ activity in regions of the brain that are highly interconnected and consistently active when the brain is at rest – known as the default-mode network. This ‘resetting’ is similar to the mechanism by which the controversial treatment electroconvulsive therapy (ECT) is believed to work.

Is this a real avenue for new treatment options or are psychedelics just having a moment? Time will tell. We will not be seeing psilocybin prescribed for depression anytime soon: it is still an experimental treatment and more research is required to determine its effectiveness, safety, and side effects. Psilocybin may end up as an aid to talking therapies, rather than a stand-alone treatment. In clinical trials in which psilocybin was taken, a psychiatrist or clinical psychologist needed to spend a lot of time building trust before the session, then facilitating conversation throughout the patient’s experience. The results so far seem promising and, for those suffering from TRD, the prospect of new potential treatments must come as good news. This could be the end of dead ends and the beginning of openness of mind, taking a holistic view of treating depression which values the patient’s own experiences and treats the mind and brain as one.



Tipping point
Past and future wipeouts

65 million years ago, the world saw a global mass extinction. The Cretaceous period was the final chapter for the non-avian dinosaurs, the mosasaurs and plesiosaurs of the seas and the pterosaurs of the skies. But many survived. From mammals to crocodiles, snails to starfish – the death of the dinosaurs was not the end of life. Dated to 65 million years ago is a layer of the metal iridium. This layer is found globally across land and sea, marking the Cretaceous-Tertiary boundary – known as the KT boundary after the German for chalk, ‘Kreide’. Although iridium is rare on Earth, it is found in meteorites and in the Earth’s core, making the dinosaur demise debate largely rest on volcanic activity or an asteroid impact as the main culprit, or culprits. The KT boundary has iridium levels 30 times higher than average and so is viewed as marking the end of the dinosaurs. The aftermath of volcanoes or an extra-terrestrial impact would have had similar ramifications. Large amounts of dust, an obscured sun, changing weather patterns, and dramatic temperature fluctuations – all would wreak havoc on ecosystems, leading to the mass extinctions we see in the fossil record. But dinosaurs began to disappear long before the asteroid impact date and volcanic activity, and continued to gradually disappear afterwards. There are many weird and wonderful theories that attempt to explain why the dinosaurs were wiped out.

In 1928, Harry Marshall thought the dinosaurs died from rickets due to a dust-covered sun. Stanley Flanders proposed an overwhelming tide of caterpillars – with no natural predators, the caterpillars could have eaten plant life quicker than the herbivorous dinosaurs, starving them and then their carnivorous dino-predators. In 2008, Chinese researchers suggested wildfires led to dramatic deforestation, creating a perfect environment for fungi to take over – with spores overcoming dinosaur immune systems. More recently, evolutionary psychologist Professor Gordon Gallup has proposed taste aversion weakness – he believes that the psychology of the dinosaurs led to their downfall. Gallup argues that the rise of flowering and toxic plants was changing the ecosystem. For us, and many other animals, foods have different tastes: bitter, sour, salty. By identifying these tastes, and learning what is edible or not, we avoid rotten and unripe food, as well as poisons. Modern relatives of the dinosaurs, including birds and crocodiles, do not have this taste aversion to toxic substances – they cannot learn from taste about a food’s edibility. The large herbivorous dinosaurs may have consumed such large quantities of the new plants that they reached lethal doses of their toxins – death by gastrointestinal distress. Regardless of the final trigger, the end of the dinosaurs opened the door for mammals to diversify and evolve into new niches, leading to the eventual rise of humans. But another catastrophe could spell the end once more…

Author: Poppy-Jayne Morgan

Image: Olga Suchanova



In the early hours of September 26, 1983, alarms sounded at the Soviet missile detection command post. Five nuclear missiles had (mistakenly) been detected inbound to the Soviet Union from the United States. Protocol was to launch an immediate counteroffensive: a single push of a button. But the officer on duty made a monumental decision – he did nothing. Stanislav Petrov may have single-handedly prevented World War 3, but this tale is far from a relic. Humanity appears to find peering down the barrel of self-annihilation a tempting proposal, and the likelihood of human extinction seems to increase in step with our technological progress. 15,000 nuclear weapons currently exist and, in a toxic political climate, it’s easy to find this fact unsettling. In spite of colossal explosions, thermonuclear war would likely end humanity with more of a fizzle than a bang. When a nuclear explosion hits, smoke and soot flood into the atmosphere. Aided by intense heat from the blast, particulates reach great altitudes, reducing the sunlight reaching the surface. Extended across the globe, this would result in a planet-wide blackout. A predicted 12-20°C temperature drop in core farming regions and little sunlight would decimate agriculture, leading to a global nuclear famine.

Even if humans avoid nuclear conflict, an interspecies war waged for millennia may soon topple us.

WWW.ISCIENCEMAG.CO.UK www.isciencemag.co.uk

century, and the plethora of antibiotics since isolated or synthesised, able to quash almost every assault from our eukaryotic aggressors. However, upon excessive exposure to these drugs, resistant bacteria will proliferate. An alarming number of ‘superbugs’ now exist, thanks to our misuse and abuse of antibiotics. 490,000 people worldwide developed multidrugresistant tuberculosis in 2016 and this is set to rise annually. A sufficiently contagious, resistant, lethal strain would put us in grave danger. Rather than suffer an unplanned, catastrophic demise, could Homo sapiens depart with dignity? At a microscopic level, the operation of our remarkable brains boils down to billions of individual neurons, performing simple actions. This mechanical perspective places the origins of consciousness firmly in the realms of physics. Inspired by brain wiring, artificial intelligences can perform increasingly complex tasks with surprising efficiency, and in silico minds would be able to outlast many of the calamities facing carbon-based organisms. Uploading our minds into synthetic frameworks would equal nearimmortality, or we may create superior beings worthy descendants of humanity. Conceding the torch of consciousness to artificial intelligence may be a proud conclusion to our species’ chapter in the universe. Author: Abdul Zafar



To infinity… and beyond!


Get your head around the mystical ‘lazy eight’

“Yes, it is times infinity”. Case closed, mic dropped, debate over. Using this strategy, I soundly defeated my sister in many disagreements during our childhood. I had found the loophole. No argument, regardless of its logic, could stand up to an attack of sheer volume. My reign was glorious and indisputable; it was always my turn to pick the movie yet never to change our brother’s diaper. The last slice of cake consistently found its way to my plate until the fateful day when she blindsided me with her counter-offensive and dealt my strategy a fatal blow. “No, it isn’t, it’s times infinity plus one”. The cake was hers, the diaper my consolation prize. Suffice it to say, neither of us understood the intricacies of debate or the mathematical concept of infinity and, over the last two decades, that hasn’t changed much. I recently became curious as to how legitimate our technique was. We all know ‘if you say it more times, you win even if it’s nonsense’ holds true – we’ve all seen a political campaign. Yet what about my sister’s ‘plus one’ retort? Does it make any sense? Did I concede to a flawed logic? What even is infinity?

The Oxford English Dictionary (OED) defines infinity as “a number greater than any assignable quantity or countable number”, and it’s a concept that has been around since Ancient Greece. It is represented by a symbol, ∞, known as a ‘lazy eight’ – or a lemniscate, for the intellectuals. This concept of something limitless, and without end, was originally conceived in a philosophical context, and known as apeiron. The first attestable mathematical use of infinity was by Zeno of Elea, notorious for his mind-bending paradoxes designed to explore the nature of infinity. Aristotle later expanded on apeiron by classifying two types of infinity – actual and potential. Wikipedia tells me that actual infinity is a completed infinite whole, like the set of all real numbers, while potential infinity is a never-ending list made by forever adding one to the last term. My argument was actual infinity; Kitty’s was potential infinity. It wasn’t just the Greeks who had definitions of infinity. Jain mathematicians in Ancient India, who were among the first to use zero, were also fascinated by its exact opposite. Their interest in large numbers led them to create their own definitions of infinity – five, no less – though these, too, boiled down to bounded and unbounded sets. Unfortunately for me, Aristotle rejected actual infinity, stating it was impossible to encounter infinity within our universe, the divine notwithstanding. Round one to my little sister.

So maybe I’m on diaper duty forever, but what has become clear is that infinity is complex and nebulous to grasp in nature. Consequently, it is often best understood in thought experiments like Zeno’s paradoxes or the Hilbert Hotel: a hotel with infinite guests in infinite rooms that always has room for one more. It also induces levels of mind-bending seen in the final sequences of Nolan’s 2010 film Inception and with facts like we’re only almost sure that infinite monkeys with infinite typewriters would eventually write Shakespeare. Infinity also includes the possibility of them pressing the adjacent keys ‘asdf’ forever…. Yet we cannot be certain. What I am certain of is I’ve got the perfect comeback for the next time Kitty and I get into it: infinity plus two.

So, Aristotle didn’t have my back, but, though his definitions of infinity persist, two new definitions are more commonly used today – countable and uncountable infinity. Sorry OED. Countable infinity is easy enough to explain; it’s an infinity where each element can be assigned a natural number; they can be counted individually. One, two, three, 4 trillion: each number is a step away from the next number. Uncountable is the opposite and can be understood through the infinitely small. 0.9 is a number between zero and one. Intuitively the next step is one. However, 0.99 is between 0.9 and 1. As is 0.999 and 0.9999 and so on ad infinitum. Thus, it is not possible to count the infinity between one and zero. This concept of infinitely small numbers led Isaac Newton to discover calculus, one of the cornerstones of mathematics that is crucial in studying change and difference covering everything from predicting the weather to facial recognition.

Society, however, is made by people. Finding infinity in the world around us is proving to be much more difficult. Aristotle may have been right. Infinity, like maths if my father is to be believed, may be of the divine. Physicists make assumptions – that the density at the heart of a black hole is infinite, for example – but we have no evidence that infinity exists in the natural world. That doesn’t mean it’s not there, though. Infinity plays a role in explaining the answer to one of life’s big mysteries – the nature of our universe. The most popular extension of Big Bang theory is cosmic inflation, which could create an infinite volume of space by stretching indefinitely. And it’s not only the size of the universe that may be limitless. It’s agreed that the universe has a finite past, but does it have an end? Is its age infinite in one direction – one of the five Jain definitions of infinity – expanding for eternity? It begs the question: what is it expanding into? What’s bigger than infinity? Well, it turns out there are different sizes of infinity, as demonstrated by Georg Cantor in the 19th century using a powerful mathematical tool called proof by contradiction: you assume something is true, then show it leads to a contradiction, proving the opposite. Recall countable and uncountable infinity. Cantor showed that countable infinities are all the same size, but that uncountable infinity is larger. This means that though the even numbers are a subset of the natural numbers, the two sets are the same size of infinity. I had assumed Kitty was wrong, but Cantor showed otherwise. I should have listened to Buzz Lightyear when he said, “To infinity… and beyond!”.
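Cantor’s proof by contradiction can be sketched very loosely (again, the notation is mine): suppose you could list every number between zero and one. Now build a new number whose nth decimal digit differs from the nth digit of the nth entry. The new number disagrees with every entry, so the list was never complete – meaning this infinity is strictly bigger than the countable infinity of any list.

\[
\begin{aligned}
x_1 &= 0.\,\mathbf{d_{11}}\,d_{12}\,d_{13}\dots \\
x_2 &= 0.\,d_{21}\,\mathbf{d_{22}}\,d_{23}\dots \\
x_3 &= 0.\,d_{31}\,d_{32}\,\mathbf{d_{33}}\dots \\
y &= 0.\,e_1\,e_2\,e_3\dots, \quad e_n \neq d_{nn} \;\Rightarrow\; y \neq x_n \text{ for every } n.
\end{aligned}
\]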

Author: Oluwalogbon Akinnola Image: Riko Yasumiya



DON’T GO BREAK MY 22



The physiological responses to destructive emotions

All endings come with their own particular set of emotions, whether positive, negative, or a bit of both. But such emotions can go far beyond simple feelings; they can have a much deeper and more physical impact than you might expect. When we talk about events that threaten our feelings of social connection – breakups, exclusion or bereavement – our language tends to include allusions to physical pain: heartbreak or heartache. And research suggests that the links between emotional and physical pain can go well beyond mere phrasing. Social pain actually involves the same brain pathways and neurochemicals that are involved in processing physical pain. Studies have even suggested that taking paracetamol can ease not only a hurt body, but also hurt feelings.

KIN’ This connection between physical and emotional pain might have something to do with the endogenous opioid system. This neurochemical system regulates physical pain and might also be involved in mediating social attachment. Indeed, at low doses, opioids appear to be beneficial for relieving both types of pain and in contrast, opioid receptor antagonists (which essentially have the opposite effect to opioids) seem to worsen physical pain, while increasing “distress vocalisations” of animals that are isolated from their companions. One reason for the strong link between emotional pain and physical pain might be that in the past, our ancestors were totally reliant on their social groups for survival. Without other humans around, you were unlikely to be able to feed or shelter yourself, there would be nobody to keep guard as you slept, and the chances of you managing to have children and pass on your genes would be low. Having a strong and painful reaction to social isolation would be essential to your continuing survival and evolutionary fitness. Interestingly, people seem to respond to different types of emotional loss in different ways. Being explicitly rejected can unhelpfully lead to behaviours that prevent that kind of pain again, but don’t help in the long run – like withdrawing

www.isciencemag.co.uk

and becoming more introverted. Being ignored, however, can make you more likely to try harder to engage with others, with hopefully more positive long-term results. The social environment around you can impact both your feelings and your physical pain levels. Someone accidentally hurting you actually seems to physically hurt less than some intentionally doing so, suggesting a difference between experiencing just physical pain and experiencing “social pain” at the same time. On the other hand, having social support seems to be not only the best relief for sadness, but is also surprisingly effective at helping to reduce physical pain. When it comes to emotional endings, our physical responses can go far beyond simply pain. Those who are grieving, or heartbroken, can experience a range of symptoms that shouldn’t be dismissed as mere histrionics. Those suffering the loss of a relationship or a loved one can struggle with controlling increased intrusive thoughts or insomnia, and even experience decreased immune functioning. That’s right – a bad breakup really can make you ill. Some people even suffer from broken heart syndrome, a condition in which patients feel intense chest pains, similar to the sensation of a heart attack. It’s actually due to a real heart condition, takotsubo cardiomyopathy, in which the muscular portion of the heart suddenly and temporarily weakens. In more than 85% of cases, this is linked directly to an emotionally stressful situation. The mechanism behind such extreme responses to loss has been suggested as simply the loss of a person who acts as a “social regulator”, providing the ideal amount of stimulation and arousal (no, not that kind). Being in love actually seems to increase the amount of dopamine in your brain’s “reward pathway”, and the area of the brain that is associated with painful emotions seems to reduce activity when people are shown pictures of a loved one. Your brain becomes accustomed to these feelings, and to being neurochemically boosted by the presence of certain people. When those people are suddenly gone, the brain struggles to adapt to a new reality. There’s also a lot of mental confusion caused by aligning what were positive memories with negative emotions around grief, loss or anger and this can seriously undermine your sense of self. Beyond this, a breakup, especially if it’s messy, can mean a damaged social identity, going back to our feelings of pain around social isolation. This might not seem like the most hopeful of concepts to have in your mind, particularly if you’re currently embarking on a bright new relationship. But it’s not all doom and gloom. If you’re working through a painful split right now, there’s always hope. Studies have suggested that specifically focusing on positive outcomes (even if they’re difficult to see) means a quicker recovery and more personal growth. And remember that close social relationships, whether family, romantic or friendly, are vital to our mental and physical well-being, even in the modern day. Author: Bridie Kennerley Image: Shannon Bono



The end of ageing

Our chronological age is inextricably linked to the passage of time, measured by the clock and calendar. But biological age, and the ageing process, follow a different metric. In our first decades, the molecular machinery of our cells and tissues is pristine. The developmental processes responsible for increasing the body’s size, strength and complexity are executed with the utmost fidelity to our genetic instructions. The result is a harmonious body: from the whole individual down to the molecular scale, the system is balanced, responsive and efficient. If blessed with circumstances that promote good health, we have many years to enjoy our bodies working like precise, finely-tuned machines. But as we move through our thirties and into our forties, we begin to experience a decline. The cause of this decline is the accumulation of metabolic damage within our cells and tissues. Although our bodies work in a highly regulated manner, the environment around us puts them under stress. Erroneous molecules accumulate and mistakes creep into our DNA, escaping the processes of correction. These defects injure the structure of tissues. Over time, the damage builds and passes a threshold beyond which the symptoms of ageing arise: slower kidneys, a weaker immune system, stiff joints, poor vision, cataracts, cancer and Alzheimer’s, to name but a few.


Ageing is an umbrella term that encompasses a host of pathologies, some recognised as just part and parcel of the broad regression in our physicality, and some recognised as distinct diseases. Nonetheless, they can all be traced back to seven types of molecular or cellular damage. (1) Mutations in nuclear DNA, when associated with certain genes, manifest themselves in cancerous behaviours, and (2) mutations in mitochondrial DNA can lessen a cell’s ability to function. Unwanted molecules that have escaped degradation, or molecules that are the dead end of detoxification, exist as (3) extracellular or (4) intracellular ‘junk’, interfering with other reactions. (5) Cross-links between cells increase, reducing a tissue’s elasticity. (6) Cells become senescent – essentially ‘sleeping’ – ignoring self-destruct signals. And (7) the healthy and normal death of cells becomes problematic as the body loses its ability to replace them. Due to these processes, whether as a direct consequence or owing to an ensuing snowball effect, our bodies become old. The evidence is in our molecular structure; the manifestation is outwardly visible and inwardly tangible.

But with knowledge comes power. The Strategies for Engineered Negligible Senescence (SENS) Research Foundation is developing regenerative therapies, known as ‘rejuvenation technologies’, that aim to reverse the effects of ageing at the level at which they occur – the cellular and molecular. The foundation is taking an engineer’s approach: fix the damage, even if the underlying metabolic process is not understood. So how to counter cell loss? Use stem cells. To remove intracellular junk? Use novel hydrolytic enzymes to break the molecules down. To counter the effect of mutations in mitochondrial DNA? Place back-up copies of the genes into the nucleus, so that they can continue to be expressed away from the harm of mitochondrial free radicals. This targeted rejuvenation is a far cry from the suite of cures and strategies we currently apply to the vast range of age-associated diseases and their symptoms. Rather than improve our ability to treat the symptoms of ageing, rejuvenation technologies propose we do away with the diseases and the ageing process altogether – divorcing our biological age even further from our chronological one. By targeting each of the seven damages, we could maintain the integrity of our tissues at a level we understand as belonging to, say, a 30-year-old. The crux of it is: we need not age at all.

Rejuvenation technologies currently exist at varying levels of development, and investment is sought to bring these biotechnologies through their remaining stages and onto the market. The scientists involved in this research are firm in their belief that rejuvenation technologies are not pie-in-the-sky thinking but a near reality. With this belief, one prominent figure of the SENS Research Foundation, Dr. Aubrey de Grey, suggests that within a decade we will be able to achieve a life extension of 30 years. But he goes much further. Technology improves, and by the nature of these rejuvenation technologies, those who benefit are more likely to be around to reap the rewards of the next increment of progress. The difference in lifespan between a newborn and a 10-year-old could be vast; indeed, de Grey believes the first 1000-year-old human is already alive today.

What would happen to societal structures and norms is an open question. Understandably, there are those who oppose the development of rejuvenation technologies on the grounds of unpredictable and unprecedented social change. There is further opposition to be found in the diversion of a natural process – one shared by nearly every living thing on Earth. Scientists behind the rejuvenation research bat away these concerns. It wouldn’t be the first time human endeavour set society on an unknown trajectory – the industrial revolution is a case in point, though the social, cultural and environmental consequences of that paradigm shift in human development are still felt today. They would also argue that it is hypocritical to put ageing on the pedestal of sacred nature when we already try our utmost to fight cancer and eradicate certain infectious diseases.

Rejuvenation technologies might not be our next frontier in medicine, but if they do come to fruition, ageing will be but a side-effect of being alive – one which we have the same power to contain and even cure.

Author: Poppy Lambert Image: Rose Zhou



Redefining systems of belief

ESCAPE VELOCITY

Are you the same person you were a year ago? Perhaps your knowledge of a subject area has improved, your sense of self-worth has increased, or you’ve become more appreciative of a family member’s awful music taste? We are in a constant state of change, not only physically but also emotionally and mentally. While we cling to our beliefs as central to our identity, these views are not static. The history of science demonstrates that humanity’s ideas are in constant motion. Had you been born in Classical Greece, you would have been laughed at if you did not agree that all matter was composed of the classical elements – air, earth, fire and water. This idea changed in 1789, when Antoine Lavoisier’s Elements of Chemistry listed the first of the modern chemical elements. In 2016, the elements tennessine, nihonium and moscovium were added to this list. To dig in our heels because such a development threatens our belief that only 115 elements exist in the Periodic Table would be intellectual suicide today. Old ideas can wither and die, but they can also sprout new ones.

I have experienced multiple endings and beginnings of my own beliefs. During my teens, I rolled the dice by turning my back on my upbringing in an infamous cult to begin a life where I could freely question my ideas. Putting a bullet in the head of my old belief-system was the result of much critical thinking, a long-dormant faculty which had slowly been awakened after a variety of experiences had challenged my deep-set indoctrination. Turmoil engulfed my family when my brother was born with congenital heart disease and desperately needed a blood transfusion. The ministers of my congregation demanded the procedure be refused, threatening my parents that if my brother received blood he would have no promise of an afterlife. Thankfully, the surgeon was able to perform a lifesaving intraoperative blood salvage procedure which satisfied my ministers’ wishes. Further, the cult would often silence victims of child abuse, something which is now well documented in the media. After my dad became aware of this, he was deemed an ‘apostate’ and ostracised, as a silencing tactic. Soon afterwards my dad became ill, and it was only upon his death that my family was reinstated. As victims of cult mind control, we bizarrely accepted this as rational.

Though I was very young and deeply indoctrinated, these experiences profoundly shook my conscience. I started reading science magazines, which gradually stimulated my hunger to pursue a university education. However, the cult mandated that while regular schooling was permissible to acquire basic skills, higher education was prohibited because it corrupted your mind with the philosophies of a ‘sinful world’. Instead, I was encouraged to feed upon the knowledge in the cult’s literature and dedicate my life to their cause. But the idea of suppressing a mind that is hardwired for learning, stopping it from indulging in an education, felt illogical and only seeded more questions in my mind. I decided to research the ‘apostate lies’ about my organisation, which caused my entire belief-system to come crashing down. I had finally ‘woken up’. I realised that everything I had been taught was a lie. At this time, my sister was being reprimanded by my ministers for fleeing her violent husband because, in doing so, she had desecrated the cult’s perceived sanctity of marriage. They demanded that she return to her husband. Finally broken from the chains of the cult myself, I encouraged my sister and other family members to leave the faith, so that we could build a new life. Though the reality the cult had provided me with had been all that I had known, I was prepared to reconstruct everything I knew about the world, because I had realised that I could no longer uphold my previous belief-system.

Killing my old beliefs changed my life completely. After escaping the cult, I embarked upon a university education where I studied science in more depth. Here I learned that debate and discussion were far more intellectually satisfying than mechanically swallowing whatever viewpoint was fed to you. I was finally free to build upon my beliefs as I saw fit, coming to view science as a purely objective and flawless representation of reality. However, another reformation of my beliefs was yet to happen. Studying MSc Science Communication at Imperial College London has made me appreciate that scientific knowledge is not waiting to be unearthed from nature by the golden hands of a scientist devoid of all bias and error. Instead, scientists unwittingly bring their preconceptions to their research, colouring knowledge with biases rooted in language and culture. For instance, Himba tribespeople distinguish between light green and dark green but see green and blue as inseparable. It is intriguing to imagine how the scientific canon might differ had the Himba people overseen its creation. Scientific research also falls victim to many irrational leaps which deviate from its method. In 1969, the physicist Joseph Weber claimed to have discovered large amounts of gravitational radiation, but other scientists obtained conflicting results when they repeated his experiments, and an influential researcher performed a dubious experiment of his own to catapult criticism onto Weber’s findings and promote acceptance of the negative results. This is one of many examples which made me question science as complete, objective ‘truth’.

For some, beliefs are like a pair of shoes. You start wearing one pair and find they work best for you, but soon frailties appear, and their lustre fades. One day you buy a new pair which now feels more appropriate, and the cycle continues. I have thrown out multiple pairs of ‘shoes’, going from a doomsday cultist to a Richard Dawkins zealot to, now, someone who appreciates the social aspects of science. Such experiences demonstrate that for some of us, beliefs are not so static. They can end, but they can also begin new ways of seeing. Author & image: Jonathan Neasham



Images: Sophie Moates



Seeking closure

Author: Jordan Hindson

Why we just can’t live without an ending

In 2006, HBO made a psychologically catastrophic decision. Deadwood, its rakish, highly popular Western series, was to be cancelled after just three seasons. Plot lines were unresolved, characters’ fates unknown. Ending a story with a cliffhanger is an ancient storytelling device that has renewed relevance in the age of the endless TV serial – see Cumberbatch’s Sherlock diving from atop St. Bartholomew’s Hospital for an almost literal example of the term. A cliffhanger usually implies a follow-up, a resolution. It creates expectation and, when this is not met, viewers unite in anger. Deadwood’s abrupt termination meant that, for millions of viewers, the loop of narrative expectation was never closed. Even waiting too long for a resolution can induce a flurry of indignation: in the 19th century, Charles Dickens’ readers rioted at the New York docks in feverish anticipation of Little Nell’s fate in The Old Curiosity Shop, the final instalment of which was steaming towards them by ship.

Our craving for closure is intimately bound up with our instinctive sense of narrative shape. In his 1967 book, The Sense of an Ending, literary scholar Frank Kermode tried to formalise this observation. Fiction, he argued, is like the ticking of a clock. A stark tick–tick, filtered by the nervous system, becomes a tick–tock. A beginning and an ending are imposed, and each depends on the other. Fiction, on this reading, is an intricate elaboration of this inescapable pattern. Modern science is now treading similar terrain. Twentieth-century physics taught us that time is a peculiar, supple substance. Now, modern neuroscience reveals that our own perception of time comes as much from within – from neural circuitries and synaptic connections – as it does from the space–time without. Our brains create meaning from sensory chaos, and stories graft shape onto a shapeless world.

But why do we need stories in the first place? We have been regaling each other with tales for many thousands of years, and explanations as to why abound. Multiple studies suggest that reading fiction (and, by implication, absorbing the experiences of others) can enhance the gift of seeing the world through another’s perspective, which has evident evolutionary benefits. Equally, storytelling is a memory-aiding tool: studies show that we are more likely to remember facts if they are presented as part of an engaging narrative. But memory formation catalysed by narrative structure is less about accuracy than it is about coherence – the web of links and associations between facts or statements enhances memorability. This desire to both tell and hear stories, fictional and non-fictional, is universal, but the stories themselves also tend to share common characteristics and patterns. Scholars have identified archetypal plots and recurring character templates that disproportionately constitute the stories we invent. In the early 20th century, Vladimir Propp identified 31 narrative elements in Russian folk tales. Subsequent scholars have expanded Propp’s work into other genres.


On this reading, the literary canon is composed of small variations on recurring themes. Less, however, is known about why such narrative commonalities exist. It could be that our brains are evolutionarily primed to respond to the characteristics shared by the narrative arts, but some critics are unconvinced by explanations that derive from this ‘Darwinian’ evolutionary psychology. Such theories, they argue, are unscientific, overly convenient, and fail to appreciate the seismic impact of culture and society on the stories we tell. This debate over the origin of storytelling and the common forms of stories reflects a broader argument, the tension of which is revolutionising our view of the roots of human behaviour. Pitting a cemented, glacially changing core of human nature, fashioned by evolutionary pressures over millions of years, on one side against a growing appreciation for the strong, fluid roles of culture and context on the other, this schism divides large swathes of the field of psychology today.

Whatever the source of recurring elements in stories, neuroscience is tentatively discovering that the empathy-inducing power of stories has a measurable physical basis in the brain. In 2006, researchers in Spain used functional MRI scans to measure neural activity and observed that participants exposed to words associated with smell (‘perfume’, ‘coffee’, etc.) exhibited activity in the primary olfactory cortex. Later research built upon this finding, demonstrating that sentences containing bodily actions (e.g., “John grasped the object”) elicited activity in the motor cortex and related areas. Past research on split-brain patients, in whom the two hemispheres are no longer able to communicate with each other, demonstrated the existence of a ‘left-brain interpreter’, which constructs stories, often false, to explain events. (More recent work, however, acknowledges that this binary model of the brain is hopelessly simplistic.) In other words, the effects of language and narrative on the brain are not limited to so-called ‘language’ areas; rather, they are more complex and diffuse. These are preliminary studies, baby steps. They suggest that the brain treats a story much as if the individual were undergoing the experience themselves, and studying the impact of stories on the brain may elucidate why so many of them possess similar structures.

A core feature of our commonplace perception of stories is the presence of endings and finales. The language of ‘closure’ pervades many of our conversational topics, from bereavement to break-ups to the season finale of Game of Thrones. Modern neuroscience and psychology are starting to sniff out exactly why so many stories, fictional and non-fictional, have a similar shape. The neurological craving for the tock of the ending imbues the beginning’s tick with significance, and the need for narrative closure is physical. And, as we all now know, the secret of good storytelling is



Upgrade available now

Author: Meesha Patel | Image: Tere Chadwick

Technology is a constantly changing industry. Each piece of software or hardware goes through so many iterations that it can be difficult to keep up. From generation to generation we choose something that will hopefully stick. But customers now have to weigh durability against novelty – nothing lasts forever, and as each item becomes superseded we have to move on. Here we explore two archaic technologies, and two pieces of current tech that recent innovations may soon make obsolete.

Storage: floppy disks
Providing just over a megabyte of storage at its peak, the 8-inch floppy disk was invented by IBM. With time it became more compact and gained more capacity, and most PCs were using the 3.5-inch version by the 1990s. The death of the floppy disk was partly down to rewritable CDs, which became more popular thanks to their far greater storage capacity. Sony officially stopped manufacturing floppy disks in 2011, but some claim the floppy is ‘still a thing’, buying from sites like floppydisk.com and using the disks in retro artwork. In the technology industry, though, floppy disks are a relic. As everyone creates content online, storage methods need to be smaller, faster and able to hold more data. The floppy disk just couldn’t keep up.

Organisation: the personal digital assistant
During the ‘80s and ‘90s the personal digital assistant (PDA) was born: a handheld device that could connect to the internet. Its arrival heralded a new world of constantly-checked diaries and emails. The first devices consisted of a screen with a full qwerty keyboard, but the term ‘PDA’ was not actually used until 1992, when Apple’s CEO John Sculley described the Apple Newton. What really revolutionised the device was the creation of the first PDA-telephone hybrid in 1994, arguably the first smartphone. PDAs also used applications to organise the user’s life, so rather than going extinct, they were integrated into developing tech to become the smartphone of today.

Usability: the computer mouse
With the rise of laptops, desktop computers are being phased out. We want fast, mobile and convenient products. Every time you visit a website there is a 15% chance that you accessed it using a touchscreen rather than a mouse, a share that has risen significantly with the spread of touchscreens on computers and smartphones. As gesture-control technology advances, user interfaces on touchscreen devices will make it easier to browse the internet with swipes than with a mouse.

Finance: credit and debit cards
While cards used to be a way of accessing cash, advancing contactless technologies mean that being cashless is the new trend. There’s less worry about physically losing them, plus you never need to check you’ve got the right amount of change. Now the use of mobile or virtual wallets is on the rise, and for shoppers, added loyalty incentives (like Yoyo payments at Imperial) make spending money feel more like a goal. In a world full of Apple Watches and payment on the go, cash and cards are becoming obsolete. Who knows which new technologies we will find fundamental to our lives in the future? What we do know is that change is inevitable – but can you keep up?






