Issue 9: Science in a Post-truth Society


ISSUE 9 | AUTUMN 2018
Best Specialist Publication 2016 & 2018
The Guardian Student Media Awards: Best Website 2015

BIOHACKING: DIY GENE THERAPY
THE AGE OF UNENLIGHTENMENT
SHARK ATTACKS, ICE CREAMS, AND THE RANDOMISED TRIAL

SCIENCE IN A POST-TRUTH SOCIETY
ARE WE FUELLING THE MOVEMENT THAT FAVOURS FEELING OVER FACT?


WELCOME

EDITORIAL

"FACT AND FEELING" In our last issue, we spoke about 'alternative facts', and the growing controversy they bring to the world of science and research. Sadly, this looks like a trend to continue for the forseeable future, as we see the rise of the 'post­truth' society, where your ability to sway public opinion and emotion is just as important as the facts behind the story. In this issue, we dive into some new developments, and hope our readers will gain insight into how new technologies could even be driving this change. First, biohacking: ever seen a TV scientist at work and thought, "I could do that"? Sonya Frazier guides us through the dangers of do­it­yourself gene therapy. Stephanie Cumberworth covers the risks of anti­vaxxing on page 12. Sara Cameron talks correlation and causation on page 10, and how to avoid the pitfall that tabloids love to exploit. 2018 has been a successful year at theGIST, as we were awarded 'Best Specialist Publication' by the Student Publication Association for the second time in three years. To quote the announcer, "They've done it before, they've done it again!". We want to congratulate all the other publications that received awards, including a few others based in Glasgow; we're all doing our city proud on the national stage. Speaking of representing Glasgow, we're very excited to announce that we're expanding to include Glasgow Caledonian University. After working with both the University of Glasgow

and the University of Strathclyde since our launch, we want to continue to grow and support more budding science journalists across the city.

As usual, we would never have been able to achieve any of this without our contributors; thank you for your continued work, dedication, and inspired writing. Massive thanks go to our hard-working board; theGIST has always been a collective effort, and each board member brings their own skills and talents that continue to make us proud of what we do. Finally, of course, we want to thank you for picking up this copy of theGIST. We hope you love it as much as we do!

When we tried something different and put out a call for themed articles, we worried we might limit the scope of our contributors' writing. However, what they've shown us is just how far-reaching the post-truth movement is. We're very excited for you all to read this issue; we have a fantastic collection of articles for you to see.

We will be holding an EGM in the coming weeks to find some new board members, including for our Head of Glasgow position: Gabriela will be completing the final year of her PhD and so will be passing on the role once Freshers' events are over. You may still see her involved in other GIST projects, however! Thank you to our other members who have moved on over the summer, and those leaving us before the next EGM; we'll do you proud.

Much love,
Gabriela De Sousa & Katrina Wesencraft

Editors-in-Chief: Gabriela De Sousa and Katrina Wesencraft
Submission Editors: Miruna Costreie and Sonya Frazier
Head of Copy-Editing: Kirsten Woollcott
Layout: Katrina Wesencraft, Gabriela De Sousa, and Caitlin Duncan
Art: Lana Woolford (cover), George Bell, Cully Robertson (Cullor Illustration), Roxanna Munir



SCIENCE NEWS UPDATE

… is set to become one of the first UK universities to benefit from a …. Twelve talented engineering students will receive funding as part of a …. The university joins the University of Cambridge in signing a formal partnership with the CFG Foundation.

The University of Strathclyde is to lead a STEM equality initiative, …, in partnership with other groups such as construction firm BAM Nuttall Ltd and the Scottish Research Partnership in Engineering. The initiative, led by Strathclyde Vice-Principal Professor Scott MacGregor, aims to reduce the underrepresentation of women in universities, particularly in STEM fields. It will do so by funding changes such as workshops to promote best practice to key industry partners, and facilities such as a free flexible crèche to allow women on leave to attend research meetings.


Glasgow researchers have contributed to two breakthroughs concerning the Higgs boson this summer. The researchers are part of a collaboration that has been studying the Higgs boson since it was first discovered at the Large Hadron Collider in 2012. Now, for the first time, the Higgs boson has been observed to decay to two particles (called bottom quarks), and to be produced alongside two particles (called top quarks). Both discoveries were expected, and add credence to our current understanding of particle physics. Glasgow's physicists will continue to investigate the Higgs boson, in particular looking out for any unexpected behaviour.

Meanwhile, climate change is causing the ice in and around the Arctic Ocean to melt rapidly. A challenge on this scale requires a major response, and researchers from Strathclyde are joining partners from 31 other institutions in a programme run by the UK's Natural Environment Research Council and Germany's Federal Ministry of Education and Research.



The … is helping to make research impact a top priority with a "Research Impact: Making a Difference" course beginning this October. The course aims to highlight the effect that research can have in areas like the economy, policy and education, and …. It is becoming increasingly important for researchers to be able to engage with stakeholders, including industry and the government. During the three-week course, research students and staff will have the chance to develop skills that will help them translate their work into real-world impact and collaborate across sectors. Partners in this endeavour include …. If you're interested in learning how to make a difference with your research, more information can be found at ….

Glasgow Caledonian University will lead a … to protect a historic fishing village in Aberdeenshire. Catterline, famous for inspiring the paintings of artist Joan Eardley, will be protected from coastal erosion, flooding and landslides. The village will be … that will be involved in the conservation study.

A professor at the University of Strathclyde has received a … from the Royal Society of Edinburgh. Professor Michael Heath, of Strathclyde's Department of Mathematics and Statistics, has been awarded the …. Professor Heath has researched a wide range of marine ecological issues and is a leading authority on the impacts of fishing and climate change on marine food webs.




… held a "STEM social" followed by the regional heat of …. Contestants had just three minutes to thrill the audience, and the judges, with their favourite STEM topic. The event was hosted by Jamie Gallagher, former GISTer and runner-up of the UK competition in 2012. Congratulations to the Glasgow winners Marc Vives Enwich, Mahmoud Beshir, Erika Robinson and Hannah Bialic!

Researchers at the University of Glasgow have discovered a genetic link between disrupted circadian rhythms and mood disorders. In the largest study of its kind, the activity data and genetic information of 71,500 participants were used to identify two areas of the human genome with genetic variants that carried an increased likelihood of disrupted circadian rhythms. These natural cycles progress throughout the day and dictate essential functions like sleep patterns. One of the areas identified contained a gene that binds to the protein product of …. This suggests there may be a genetic link between disrupted sleep-activity patterns and serious mood disorders.

The Chancellor has announced that the city of Glasgow will receive a share of £80 million in new funding for quantum technology. After meeting with researchers at the University of Strathclyde, he said, "The UK is a world leader in Quantum technologies… The £80 million in new funding, that I have announced today, will ensure that we remain at the forefront of this exciting technological revolution". Quantum imaging could be used to collect vital visual information during events when it can be hard to see, such as wildfires or snowstorms.





LIFE SCIENCES

IDENTITY CRISIS

It's Friday night; you've had a long week in the lab, and your colleagues decide that it's finally time to take off your goggles and let your hair down. There's drink to be drunk! At the bar, one by one the bouncer allows everyone to pass... until it's your turn. "Sorry, can I see your ID?". Of course you don't have it, so you retreat, dejected, back to your own flat, left wondering why the bouncer couldn't see the maturity in your eyes. Soon, however, they may be able to.

Once exclusive to government agencies and camp crime dramas, facial recognition technology has become so cheap and readily available that it's not unusual to hear of high street chains (7-Eleven, Walmart, Amazon, et al.) and certain stadiums, shops and venues implementing these systems, each with their own motivation for doing so. Previously, security was the most commonly cited reason for utilising such technology. The concept of being able to remotely pick a dangerous face out of a crowd is hugely appealing to organisations charged with keeping us safe, from the police to national security services. More recently, however, "end-user experience" is likely to be credited. Facebook recently applied to patent a new version of this technology1 to instantly allow retail companies access to Facebook's personality profile of a given customer, as well as alerting the establishment to a customer's mood and intent in order to deliver tailored customer service. While anything that prevents pushy upselling when you're not in the mood for it can't be all bad, many worry about the precedent this sets with respect to privacy: shouldn't it be an individual's choice to wear their heart on their sleeve?

So how did we get here? One major aspect of this technology is the ability to recognise "biometrics", which, in this case, refers to the positioning of facial features. This retrieved data is then cross-referenced against a pre-collected database to determine who the particular biometric data corresponds to.

Far from being a modern breakthrough, this technology was pioneered by Woodrow Wilson Bledsoe in the 1960s2. Bledsoe established coordinate points on facial features using simple forward-facing photographs and then compared this data to previously analysed subjects. These positionings, such as the shape of the jaw, the distance between the eyes, the thickness of the lips and the length of the nose, vary enough from subject to subject that the fidelity of the comparisons met an acceptable standard. However, due to the computer memory restrictions of the era, and the fact that this data could not be gathered autonomously but required manual specification and selection of the relevant features, the process was slow and methodical, with much room for improvement.

Nowadays, thanks to modern machine learning algorithms, these data points can even be created on moving images. Digital cameras now have much higher resolutions, and the prevalence of mobile technology and social media accounts creates enormous (normally user-generated) photographic libraries of faces that are already inherently connected to a myriad of other personal data. When combined, these factors greatly increase the potential of what could be achieved using the same basic principles discovered by Bledsoe.
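To make the principle concrete, here is a minimal sketch in Python of the kind of nearest-neighbour matching Bledsoe performed by hand. The feature names, measurements and enrolled subjects are all invented for illustration:

```python
import math

# Hypothetical database: each enrolled face reduced to a few normalised
# measurements (eye separation, nose length, lip thickness, jaw width),
# much as Bledsoe recorded coordinates by hand.
DATABASE = {
    "subject_A": (0.42, 0.31, 0.08, 0.55),
    "subject_B": (0.39, 0.35, 0.11, 0.50),
    "subject_C": (0.45, 0.28, 0.07, 0.58),
}

def euclidean(a, b):
    """Straight-line distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_match(probe, database):
    """Return the enrolled identity whose measurements sit closest to the probe."""
    return min(database, key=lambda name: euclidean(probe, database[name]))

probe = (0.41, 0.32, 0.09, 0.54)    # measurements taken from a new photograph
print(best_match(probe, DATABASE))  # -> subject_A
```

Modern systems swap the hand-measured coordinates for features learned by neural networks, but the compare-against-a-database step survives intact.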


These advances do have a dark side. Worryingly, there have been modern studies claiming to be able to discern criminality from the shape of a subject's face using these algorithms, echoing the obsolete and disproven field of Victorian phrenology3. It is vital that those developing this technology consider the potential cultural setbacks its adoption may cause when subject to the inherent biases of those designing and implementing these systems. Many have already had uncomfortable experiences with early speech-recognition technology being unable to determine what has been said unless the user mimics a standard English or American accent, and similar setbacks and bias are apparent in the facial recognition field. These problems have led some to claim that this technology has been developed by white men, for white men, and is severely lacking in accuracy when it comes to the recognition of gender and other qualities in non-white groups. On top of these issues is the very real problem of in-built racism that sadly occurs when a machine learns from interactions in online communities4,5.

Private sector implementation of this technology has already garnered controversy. A recent example is patrons feeling targeted at Madison Square Garden after it was discovered that they'd been scanning all attendees to run their faces against an in-house security database and collect data on their ages, genders and ethnicities for market research purposes6. More worrying still, certain governments like China's are being criticised for ethnically discriminating against entire communities by warning those who aren't recognised on CCTV as fitting a certain description that they're "in the wrong neighbourhood"7.

This article was written by Luke Prentice, specialist edited by Gabriela De Sousa, and copy edited by Kirsten Woollcott.

Such is the pushback against having individual personal information readily available to any business with a camera that certain subcultures have begun developing means to confuse or otherwise spoof these algorithms. Introducing many fake faces elsewhere on one's person, or obscuring one's features with specially designed make-up or facewear, creates false data points that incite false positives and outright errors8,9,10. Some simply stick to the tried and tested method of using a mask. However, even these need to be specially designed, as surveillance systems are now capable of seeing past face coverings over 50% of the time11,12.

Undoubtedly, the implementation of these new technologies could bring welcome advantages to our day-to-day lives in the form of major convenience, saving us from fishing out tickets and IDs whilst passing through security gates or allowing us to pay for our items in the supermarket without the need for interaction. However, having your entire online identity available to the highest bidder, and your location potentially tracked by mysterious third parties, opens us up to exploitation and an increasing number of potential civil issues. With this kind of data collection becoming more commonplace, it is imperative that we all become more conscious of the potential impacts such technology will have. Of course, you could choose to ignore it. But you will be seen doing so.



POST-TRUTH SOCIETY

PICS OR IT DIDN'T HAPPEN

Jamie Tarlton believes new AI-generated "deepfake" videos will be used to spread political misinformation.

We are understandably quick to applaud when new research suggests new treatment strategies for Alzheimer's disease, and who doesn't appreciate the ability to contact their family and friends around the world within minutes? However, it is also important to consider the role of science and technology in the apparent "post-truth" society we now occupy. While scientific discovery may depend on objectivity, technology can be abused to spread disinformation and uncertainty.

One such example is the ability to digitally manipulate video to replace one face with any other. The technique has been termed "deepfake" after the Reddit user "deepfakes", who posted prominently about, and apparently developed, the basic artificial intelligence (AI) algorithm. The process of creating a deepfake video depends on collecting large numbers of images of target subjects. Their faces are then algorithmically isolated from the background of each image. This library can then be used to replace the face of the video performer with that of your chosen target. At points where their faces have similar expressions, the images are matched and superimposed1.
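The matching step lends itself to a simple nearest-neighbour idea. The sketch below, in Python with randomly generated stand-ins for real facial landmarks, shows how each video frame might be paired with the library image whose expression is closest; it illustrates the principle only, not any actual deepfake implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical library: 68 (x, y) facial landmarks extracted from each of
# 500 photographs of the target subject.
library = rng.random((500, 68, 2))

def closest_library_image(frame_landmarks, library):
    """Pick the library image whose expression (landmark layout) best
    matches the current video frame."""
    diffs = library - frame_landmarks               # broadcast over all images
    distances = np.linalg.norm(diffs, axis=(1, 2))  # one distance per image
    return int(np.argmin(distances))

frame = rng.random((68, 2))  # landmarks detected in one frame of the video
print(f"superimpose library image {closest_library_image(frame, library)}")
```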

Development of AI facial recognition can of course be useful for law enforcement, and technology allowing easy face-swapping is used in popular apps like Snapchat. However, the abuse of deepfakes can change the nature of reporting and truth in the news media. The original "proof of principle" videos mainly focussed on replacing porn actors' faces with those of prominent actresses, prompting concerns over consent and the potential for deepfakes to be used for "revenge porn". More recently, the technology has been used to add the face of Adolf Hitler to Argentinian politician Mauricio Macri2. While I can now confirm Angela Merkel does not look better with a Donald Trump visage3, the abuse of deepfake videos in politics has the potential to divide our society even further. Social media can act to filter our news sources and awareness of the wider world, leading to reinforcement of our own beliefs; this may be exacerbated once deepfakes become well known and anything that conflicts with our personal worldview can be dismissed. Recently, deepfake technology has been developed into a desktop app that removes the need for programming experience to create the controversial videos. The only limitation to the true democratisation of deepfake technology is the computing power necessary. However, this would be no issue for anyone with motivation. The threat of this technology is apparent in our modern political environment, where calling fake news on inconvenient facts is a weekly occurrence. Journalism has always required an element of trust in the reporter from the reader, but video included in media brings an authenticity to a report that goes largely undoubted. When the provenance of every source of information in the media can be reasonably doubted, what does this mean for news reporting?


One example of a news story where video evidence was particularly damning is the case of Sam Allardyce and his short time as the England football team manager. Allardyce left his job after a video was captured of him explaining to supposed businessmen how to avoid rules imposed by his employers regarding the ownership of footballer registrations. The role of the video was critical in this story as, frankly, the football industry is well known for corruption, and there would have been little interest in the story among readers were it not for the footage of Mr Allardyce and his pint of wine. The fact that a news story needs a sensational element to gain media traction is an issue in itself, but what happens if Sam Allardyce could plausibly suggest the video was a deepfake? There is little doubt the tribalism of football fans could lead to the creation of fake, compromising videos produced by those with a grudge against him. The normalisation of deepfake technology may lead to easy deflection by people in "real" compromising videos, and this could be a greater issue than how convincing the deepfakes will be.

Identifying a deepfake is currently reasonably simple, as the actor's and the subject's skin tones still don't match perfectly. Facial movement is also technically very difficult to synchronise to sound, and the algorithm produces black frames if there are insufficient library images matching the facial features in the "real" video. This offers scant comfort, as the technology will only get better, and a casual, unsuspecting viewer (or, more pertinently, a viewer experiencing confirmation bias) would not


immediately notice the deepfake. The only question that remains is: how much more severe will the political divide become following the deepfake assault on truth?

This article was written by Jamie Tarlton, a PhD student at Glasgow Caledonian University. Specialist editing, copy editing, and layout design by Katrina Wesencraft.


POST-TRUTH SOCIETY

SHARK ATTACKS, ICE CREAMS AND THE RANDOMISED TRIAL SARA CAMERON EXPLAINS WHY WE SHOULD READ BETWEEN THE HEADLINES WITH A CRITICAL LENS TO LEARN THE TRUTH BEHIND CAUSE AND EFFECT CLAIMS IN SCIENCE.

When ice cream consumption rises in summer, so does the number of shark attacks on the coastal beaches of Australia1. Perhaps this sheds intriguing light on the effects of diet on humans' appeal to sharks. Could it be that the high fat content of ice cream is a precursor to weight gain, which could make you more attractive to sharks, increasing your odds of being attacked? Or that, with their highly developed sense of smell and sensitive electrical receptors, the sharks are able to sense recently eaten ice cream in the stomach? It seems as though junk food can potentially shorten your lifespan in more obscure ways than you thought.

Of course, this seems absurd, and I'll bet your immediate response is 'no, no, no: hot days make people visit the beach'. At the beach, you are more likely both to buy an ice cream and to get in the sea and be attacked by a shark. The correlation between these two variables can actually be explained by a third: hot, sunny days cause both. And here lies the problem: just because two factors are correlated does not necessarily mean one is directly causing the other to change. Correlation does not imply causation. So why is it that we can find reason in inferring causation in one case, but not in the other? We are instinctively drawn to 'plausible mechanisms', rational explanations that tie observations together, based on our understanding of the factors and behaviours involved. We can tell ourselves a believable story about how increased rainfall could cause increased umbrella sales, for example.
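That intuition is easy to test in a short simulation. In the Python sketch below (all numbers invented for illustration), temperature drives both ice cream sales and shark attacks: the two correlate strongly with each other, yet the correlation all but vanishes once we compare only days of similar temperature:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000

temperature = rng.normal(22, 6, n)                        # daily temperature (°C)
ice_creams = 50 + 8 * temperature + rng.normal(0, 20, n)  # heat drives sales
shark_attacks = 0.1 * temperature + rng.normal(0, 1, n)   # heat drives beach crowds

# Naive correlation between ice creams and shark attacks: strong (~0.5)
print(np.corrcoef(ice_creams, shark_attacks)[0, 1])

# Hold the confounder roughly constant: only days close to 22 °C
mask = np.abs(temperature - 22) < 1
print(np.corrcoef(ice_creams[mask], shark_attacks[mask])[0, 1])  # close to zero
```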

This is by no means a trivial concept. Practical assumptions about causality have been a catalyst for connecting the dots in science, engineering, and medical research for thousands of years. But, like eating your ice cream in the sea, it can sometimes be dangerous and lead to incorrect and potentially harmful conclusions. Although the above examples are silly, correlation is very often mistaken for causation in ways that are not immediately obvious in the real world. Take the infamous fallacy 'vaccines cause autism', which quickly gained publicity and led to the campaign dubbed 'The Vaccine War' by anti-vaccine advocates. The number of vaccines given to young children has risen substantially over the past two decades. During the same period, the number of children diagnosed with autism has increased considerably. This observation led many frenzied parents to choose not to vaccinate their children against harmful diseases like measles,



mumps, rubella, and polio. The concept was splashed across the media and unmoderated internet sources. Despite this observed correlation, there exist many possible explanations for the exponential increase in autism diagnoses, including cultural progression, improved screening methods, better awareness, and redefinition of the autistic spectrum.

In order to establish cause and effect, we need to go beyond observations and gather separate evidence through carefully planned experiments. The field of experimental design is an area of statistics which helps researchers to plan and interpret experiments. The design depends on what variables you wish to control for and which mechanisms you want to test. The gold standard for uncovering cause-and-effect relationships is a double-blind randomised controlled trial (RCT). These trials are thought to provide the highest form of evidence for causation, and their results are frequently used to guide experts' advice on what to eat, how to teach, which medical treatment to choose, whether to worry about pesticides, and so on. In an RCT, all subjects are kept under exactly the same conditions, except that they are randomly allocated either to the group receiving the variable under question or to a control group. This allows the researcher (who does not know which group the subjects have been assigned to) to study the direct effects of the variable of interest, whilst all other variables remain the same.

But even the interpretation of meticulous RCTs comes with a warning label. A 2013 study investigating the long-suspected health benefits of a Mediterranean diet concluded that eating this diet supplemented with olive oil led to a 30 percent lower risk of heart attacks, stroke, and death from cardiovascular disease compared with a low-fat diet2. However, when the study design was reviewed, this landmark RCT was retracted on the basis of methodological misconduct. The study was supposed to randomly assign participants to one of three groups: a Mediterranean diet with a minimum of four extra tablespoons of olive oil a day, the same diet but with at least an ounce of mixed nuts, or a low-fat diet. But it was found that of the approximately 7,500 participants

in the study, 14% had not actually been randomly assigned. Instead, many married couples were assigned to the same group on the basis of convenience. In one particularly troubling case, a field researcher assigned an entire village to a single group because some residents were complaining that their neighbours were getting free olive oil. The end result is that the study's overall findings are still accurate in one sense: there is a correlation between the Mediterranean diet and better health outcomes. But in another sense, the paper was entirely wrong: the Mediterranean diet cannot be said to cause better health outcomes.
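The value of genuine randomisation, and the damage done by "convenience" assignment, can be mimicked in a toy simulation. In the Python sketch below (all numbers invented), the diet has no true effect at all, yet assigning whole villages to a single group manufactures an apparent one:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 7500

# A hidden confounder: baseline health, which happens to differ by village
village = rng.integers(0, 50, n)
baseline_health = rng.normal(0, 1, n) + 0.5 * (village % 2)

def outcomes(assignment):
    """Health outcomes; the assignment is deliberately ignored because,
    in this simulation, the diet does nothing."""
    return baseline_health + rng.normal(0, 1, n)

# Proper randomisation: assignment is independent of everything else
random_group = rng.integers(0, 2, n)
y = outcomes(random_group)
print(y[random_group == 1].mean() - y[random_group == 0].mean())    # ~0

# "Convenience" assignment: whole villages placed in one group
village_group = village % 2
y = outcomes(village_group)
print(y[village_group == 1].mean() - y[village_group == 0].mean())  # ~0.5, spurious
```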

Another issue exists in the correlation vs. causation debate. Even with the best-designed trial, ruling out all possible explanations and interactions can require an astronomical number of experimental subjects, and it is often not practical or possible to rule everything out. And even with enough data, since the interpretations are based on statistics, there is always a chance, however slim, that your results arose purely by chance. So in reality, there is no way to definitively claim causation. But if we were too sceptical of the logical reasoning formed by the well-trained minds of expert researchers, we would risk halting progress altogether. If someone suggested that DNA wasn't a major cause of inherited traits because we can't truly prove causality, we'd be pretty stumped! The same goes for questioning whether smoking causes lung cancer, massive blood loss causes death, and a number of other


scientifically supported causal relationships. So, even though causation always comes with a bit of uncertainty, science usually (with the odd exception) does a pretty good job of discovering causal relationships that make sense and work in day-to-day life.

To really understand and interpret study results, great care has to be taken to understand exactly what the data implies and, more importantly, what it doesn't imply. Unfortunately, analysing research methods, statistics, probabilities, and risks is not an everyday skill set wired into human intuition, and it is all too easy to be led astray, even for seasoned researchers. In an era of 'fake news' and flashy media headlines, in which abstract findings are taken at face value, we need to cast an ever more critical eye on claims and look deeper into the underlying data and experimental design. We must constantly resist the temptation to see meaning in chance and to confuse correlation with causation in this new post-truth world. During research for this article, I came across probably my favourite causation conspiracy: that as the number of pirates in the world has decreased over the past 130 years, global warming has gotten steadily worse3. Clearly, if you truly want to stop global warming, the most impactful thing to do is to become a pirate. Perhaps this is the study that slipped by Donald Trump?

This article was written by Sara Cameron. It was copy edited by Kirstin Leslie. Specialist editing and layout by Katrina Wesencraft.


POST-TRUTH SOCIETY

#VACCINESWORK WITH MORE ‘SHOTS’ FIRED AT ANTI-VAXXERS IN AUSTRALIA, STEPHANIE CUMBERWORTH TAKES A JAB AT EXPLAINING THE BENEFITS OF VACCINATION.

Growing up in the 90s, my parents, like many others, made the choice on my behalf not to receive the MMR (measles, mumps and rubella) vaccine, fearing it may cause autism. A lot has changed in those twenty-something years. Firstly, I'm now an adult making my own informed medical decisions, and I am up to date with my vaccinations. More importantly, the original research claiming a link between the MMR vaccine and autism has been widely refuted, and the lead scientist behind it, Andrew Wakefield, was discredited and banned from practising medicine. Despite this, twenty years after the fraudulent paper that gave birth to the infamous autism/MMR myth, support for anti-vaccine movements is still unwavering.

Shots, jabs or jags: whatever you like to call vaccinations, they are typically an injection of a cocktail of immune-priming substances that protect the body from a specific disease-causing agent (or pathogen), such as certain bacteria and viruses. The main ingredient used to train the immune system to intercept the pathogen belongs to the pathogen itself. The key is that the pathogen is adapted so that it can make your body respond

to the threat without causing the disease itself. Nifty, right? This ingredient comes in one of four varieties: a small bit of the 'bug' (subunit vaccine), a small dose of the toxin it makes (toxoid vaccine), a 'dead' version (inactivated vaccine) or an extremely weakened version (live attenuated vaccine). Other substances may be added to some vaccines, including aluminium salts, lipids (fat-like substances), proteins and sugars. These substances have one of two roles: 1) to stabilise the vaccine so that it can be stored until needed and still work, or 2) to boost the immune response towards the vaccine (substances with this role are known as adjuvants). Put all these ingredients together (plus 10+ years of rigorous testing) and you have a vaccine.

But how does it work? Vaccines train your immune system to recognise a pathogen before you have encountered it, similar to a scenario where you have seen a photograph of someone before meeting them in person. By training the immune system to recognise a pathogen through vaccination, it develops a faster reaction time to recognise and destroy an invading pathogen when it comes head to head with the real deal.

If vaccines help us, why do some people choose not to get them? Some people's faith in medicine has faltered. In an age of instant information, it's easy to get lost in the pool of knowledge out there, regardless of whether it's true or false. The blame cannot be pinned entirely on that one false study published twenty years ago. However, celebrity endorsements and the misleading 2016 'documentary' Vaxxed (incidentally directed by, produced by, and starring the author of the fraudulent study) certainly don't help matters. Arguments from some anti-vaxxers (people against vaccinations) range from mistrust of health officials and governing bodies to safety concerns about vaccine components, though the two are not mutually exclusive.

‘Vaccines cause serious health problems, e.g. autism’

I won't dwell on this any longer; it is simply not true. Twenty years of extensive studies by scientists across the globe have discredited this claim1.

‘Vaccines contain dangerous substances, such as formaldehyde and aluminium’

Formaldehyde: While it is true that formaldehyde can be one way to inactivate or 'kill' the pathogen to prepare it for use in a vaccine, the vaccine itself contains only trace amounts of this chemical, which is not enough to cause harm. In fact, the cells in our own body produce formaldehyde, and the body is equipped to remove the tiny amounts that are introduced by vaccination2.

Aluminium: You might be wondering what aluminium, a component of drinks cans and foil, is doing in vaccines. Aluminium-based products, such as aluminium salts, are used as adjuvants to help boost the immune response of the recipient to the vaccine. This is very different to what goes into making drinks cans. Aluminium salts in vaccines are harmless and are present in very low quantities, lower even than what we encounter on a daily basis: aluminium occurs naturally in soil, is used in antiperspirants, and is even in the food you eat3.

All in all, vaccines are extensively tested before widespread human use. Though side effects may occur, they are better than the alternative of experiencing the disease in full. The bottom line is, vaccine components are safe.

‘It is my child and my choice; it doesn’t affect anybody else if I choose not to vaccinate my child’

I've seen similar statements plastered over the internet from members of the anti-vax community, and these are most likely not isolated cases. It is not a matter of one or two unvaccinated individuals in a community; there are many. This is where the problem lies. Unvaccinated people are at risk of contracting potentially lethal pathogens. Even if there is no disease progression, that person may act more efficiently as a 'carrier' of pathogens (that's not to say that vaccinated people cannot also act as carriers of certain pathogens). However, the extra time that it would take for an unvaccinated individual to destroy an invading pathogen (if at all) increases the time in which they can come into contact

with another unvaccinated person or the general public and allow the pathogen to spread. This is problematic, as not everybody can be vaccinated: for instance, those who are too young or too old to be vaccinated, as well as people with weaker immune systems, such as cancer patients and those with immune-related disorders. If these vulnerable individuals come into contact with a pathogen through interactions with carriers, the disease could be life-threatening. Herd immunity can prevent this4. This is when a high enough percentage of vaccinated people protects the whole population by acting as a barrier to the transmission of a pathogen. The higher the percentage of electively unvaccinated people, the higher the risk of herd immunity failing and of pathogens spreading to vulnerable individuals. Recent measles outbreaks across Europe (including in the UK) and in pockets of the United States (21 of 50 states at the time of writing) are classic examples of herd immunity failing due to a lack of vaccine uptake. Higher percentages of unvaccinated people are a risk not only to the vulnerable but to the population as a whole, by acting as a

'reservoir' of infection. A reservoir is a population where a pathogen can exist and spread without the transmission chain being cut off. The longer a pathogen 'grows' and spreads, the higher the chance that the pathogen adapts to better suit its host through mutations. For pathogens such as viruses, this can be an extremely quick process. Eventually, these mutations could cause infections to spill over into vaccinated populations, causing an outbreak, meaning everybody is at risk of developing the disease. By having enough people in the population vaccinated to reach the threshold for herd immunity, we decrease the chances of such events occurring.
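There is a simple back-of-the-envelope formula behind this threshold: if one case infects R0 others in a fully susceptible population, sustained spread stalls once more than 1 - 1/R0 of people are immune. A quick Python sketch, using ballpark textbook values for R0:

```python
def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune to halt sustained spread."""
    return 1 - 1 / r0

# Approximate basic reproduction numbers (ballpark literature values)
for disease, r0 in [("measles", 15), ("mumps", 5), ("polio", 6)]:
    print(f"{disease}: R0 = {r0} -> {herd_immunity_threshold(r0):.0%} immune")
# measles: R0 = 15 -> 93% immune, which is why even small drops in uptake matter
```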

The Australian government has linked childcare benefit payments to vaccinations since 1998. Over the years, the government has clamped down on conscientious objection to vaccination. The most recent action


saw child tax benefit reductions change from an annual reduction to a fortnightly one, to serve as a vaccination reminder5. In addition to 'no jab, no pay' legislation, unvaccinated children are being turned away from childcare facilities in a 'no jab, no play' movement. Benefit cuts might not be the answer. Some reports claim there is a rise in affluent anti-vaxxers; targeting these groups will therefore require more than cuts to a benefit that they are unlikely to receive6. While government intervention to encourage vaccination seems like it might be the way forward, it may be hurting the situation in cases where mistrust of the government factors into objection to vaccination. In fact, some anti-vaxxers in South Australia have established their own social services, with anti-vax-friendly health practitioners, in order to navigate the current laws there. One way to help increase the support for and uptake of vaccinations is to improve how we communicate with people about them. This includes being transparent about their components, their production, and most importantly, their public health benefits.

#VaccinesWork is both the title of this article and a mantra of pro-vaccine tweeps (that is, Twitter users, for those not immersed in the Twitter-verse), and there is no denying it. The eradication of smallpox, and the fact that my generation (in the UK) first learned of polio through textbooks and not in day-to-day life, is a testament to the power of vaccination. Sadly, there are cracks in the woodwork, with reports of vaccine-preventable diseases appearing more frequently. So my parting message is this: run with the herd, because #VaccinesWork.

This article was written by Stephanie Cumberworth. It was specialist edited by Ricardo Sanchez and copy edited by Lavanya Sundar. Layout design by Katrina Wesencraft.


POST-TRUTH SOCIETY

THE AGE OF UNENLIGHTENMENT

Is technology fuelling a movement that favours fiction over science? Maisie Keogh investigates how we can burst the filter bubble and begin to think critically about the news.

The Enlightenment period of the late 17th and 18th centuries was an era that saw the rise of the scientific method, reason, liberty, and progress. Building on the cultural and scientific revolutions of the Renaissance, Enlightenment thinkers felt compelled to observe and analyse rather than simply relying on accepted wisdom or intuition.

Rational thought and scepticism were key concepts of the Enlightenment period, with notable figures such as Isaac Newton, Adam Smith and Immanuel Kant giving us the tools to teach ourselves how to think critically1. Gradually, there was a paradigm shift as more and more individuals began to subscribe to more secular concepts of learning and to the scientific method. It was hoped by many Enlightenment philosophers that this would propel us into an age of reason, where rational thought would help us break free from irrationality and ignorance.

Fast forward some three hundred years, and we are living in a world where the news we receive straight to our smartphones has been tailored to what we 'want' to see. A year ago, Google introduced a new personalised news stream which allowed you to choose what content you wanted to see based on your interests. It also began to draw information from your previous searches to make inferences about the kind of news you want to be told about. From just one app, we can view news that always agrees with our opinions and never challenges us. It doesn't require us to think.

When we look at the causes of this reversal in reasoning, we see that there are a number of complicated components at play. From one-sided media coverage of global events leading to radicalisation, to the rise of right-wing populism, a convoluted network exists that has been perpetuated by technology. Despite this, such technology is a powerful tool with many innovative uses and the ability to increase communication and connectedness. But how exactly has it begun to foster a culture of ignorance?

When the internet was first introduced, no one could have predicted just how vital it would become to our economy, infrastructure and, ultimately, our lives. According to the International Telecommunication Union, a specialised agency of the United Nations, an estimated 47% of the world's population uses the internet. Of these more than 3.3 billion people, 81% live in developed countries2. The internet has a vastness that reaches across our planet, and yet the online 'bubble' that we as individuals inhabit is incredibly small.

These micro-universes have been subtly created over time by industry giants, such as Google and Facebook, leaving us in a permanent state of intellectual isolation. In 2011, Eli Pariser, an author and internet activist, wrote "The Filter Bubble: What the Internet is Hiding from You", in which he first coined the term 'filter bubble'.
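At heart, the mechanism Pariser describes is a feedback loop, and a toy version fits in a few lines of Python. In the sketch below (topics and numbers invented), a feed serves whichever topic its model predicts you like most and nudges that estimate upward on every click; an early tie-break quickly hardens into a bubble:

```python
import random
from collections import Counter

random.seed(1)
interest = {"politics": 1.0, "science": 1.0, "sport": 1.0}  # the feed's model of you
shown_counts = Counter()

for _ in range(200):
    shown = max(interest, key=interest.get)  # serve the predicted favourite
    shown_counts[shown] += 1
    if random.random() < 0.6:                # you click on some of what you see...
        interest[shown] *= 1.1               # ...and the model doubles down on it

print(shown_counts)  # one topic served 200 times; the others are never shown again
```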


A filter bubble is a term used to describe the algorithmic bias that exists when using social media and search engines. Despite the aim of personalised feeds being to show you content that is relevant to you, they have instead propelled us to a point of cerebral solitude, with Pariser characterising it thus: "Personalization filters serve a kind of invisible autopropaganda, indoctrinating us with our own ideas…"3. Every social media post we 'like' and every link we follow allows our personal preferences to be further refined, and as we do so, we further insulate ourselves from reality and from the world as seen from the perspective of others. We now exist in a digital domain where no two people will get the same results from the same search, and where industrious background bots are diligently curating a selection of the things they think we want to see. What's revealed to us are things that agree completely with how we think the world is and that do not challenge our assumptions in any way.

In 2016, The Wall Street Journal (WSJ) conducted an experiment designed to try to burst the so-called filter bubble with their 'Blue Feed, Red Feed' feature4. The intrepid news outlet attempted to point out our biases by showing the same news from two different perspectives: material that appeared in the red feed was deemed deeply conservative, while the blue feed was profoundly liberal, and by showing the two side by side, the WSJ highlighted the disparities between the two points of view. The tool was designed for users who may be inquisitive about opposing viewpoints but who are uneasy about 'liking' these stories on platforms such as Facebook and Twitter. In an interview with Quartz in 2017, Bill Gates commented that "Technologies such as social media lets you go off with like-minded people, so you're not mixing and sharing and understanding other points of view…"5.

Gates accurately pinpointed one of the fundamental issues we are facing with social media. We are increasingly using the platforms as intermediaries, mindlessly consuming irrelevant content. All the while, invisible judgements are constantly being made by algorithms about what kind of personalised utopia we should be living in.

Social media has a surprising way of luring you in and will work hard to keep you on its sites. What was supposed to be a couple of minutes on Facebook turns into 20 minutes of scrolling through your feed, clicking on a link that sees you completing a personality quiz and, when that is done, feeling compelled to swipe through a slideshow entitled "20 Hollywood actors and how they really look now". Since its inception into mainstream society, social media has been slowly training us to accept what we see at face value. In feeds where anyone's thoughts can become immortally digitised and news has been carefully selected to mirror our own moral compasses, we don't have to scroll down far to have our opinions validated and our view on current affairs confirmed.

Despite its faults, the internet can be a positive asset that allows for advancements that would be impossible otherwise, but it is becoming increasingly difficult to find a corner of it where truth has not been distorted and the line between fact and fiction blurred beyond recognition. So, how can we combat the culture of ignorance? How can we pop the filter bubble? Challenging our own assumptions and reflecting on personal biases can help us view the information we receive in a completely different way. In 1988, a study was conducted at Princeton University by Charles G. Lord, Elizabeth Preston and Mark R. Lepper, who was at Stanford University at the time. The aim of this experiment was to find a practical way to fight confirmation bias and to develop a method of correcting our reduced reasoning. They asked participants who had strong existing opinions, both for and against the death penalty, to review information that either confirmed or opposed their views.

The researchers found that no matter whether the subjects were presented with confirmatory or conflicting data, they remained firm in their judgements, in what is known as biased assimilation. However, when asked to 'consider the opposite'6, they were able to avoid the bias as they began to think about how they were processing the information. This study and its results are still incredibly relevant today, particularly when trying to look beyond our own filter bubbles and assess the news we read critically. By challenging our own assumptions, being curious, and actively seeking out views that oppose our own, we can finally begin to dispel our culture of fake news and enter a new age of reason.

This article was written by Maisie Keogh, a postgraduate student studying biofluid mechanics at the University of Strathclyde. It was specialist edited by Anna Henschel and copy edited by Kirsten Woollcott. Layout design by Katrina Wesencraft.



POST-TRUTH SOCIETY

THE BABY POWDER BACKLASH

A SERIES OF LAWSUITS HAS BEEN BROUGHT AGAINST JOHNSON & JOHNSON BY WOMEN ALLEGING THAT TALCUM POWDER CAUSED THEIR OVARIAN CANCER, A SUGGESTION THAT CURRENT SCIENCE DOES NOT SUPPORT. KATRINA WESENCRAFT TAKES A LOOK AT THE EVIDENCE TO SEE IF THERE IS A GREATER CONSPIRACY AT PLAY.

American pharmaceutical giant Johnson & Johnson is currently fighting more than 9,000 lawsuits involving their talc-based products. Many of these are based on the allegation that their popular Baby Powder has led to users developing ovarian cancer. Talcum powder is a refined form of talc, a clay mineral containing magnesium, silicon and oxygen. Its use dates back to ancient Egypt, where it was used to create cosmetic eye kohl. Today, talc is still one of the most common ingredients in cosmetic powders; however, it is best known for its absorptive properties. Johnson's Baby Powder has been on our shelves for over 100 years, and has been dusted onto thousands of bottoms around the world, minimising friction and preventing nappy rash. Johnson's "best for baby, best for you" branding was extremely popular with mothers, leading the company to market Baby Powder as a feminine hygiene product. In an interview for The New York Times Magazine in the 1980s, the company revealed that 70% of Baby Powder users were actually adults1. The potential link with ovarian cancer has been causing controversy since the '80s, but no warning label has been added to the products, and a recent spate of lawsuits has seen Johnson &

Johnson ordered to pay out billions of dollars in compensation.

Concern surrounding the potential for talc to cause cancer was originally raised because some naturally occurring talc is mined in close proximity to asbestos, a substance proven to cause cancer. Microscopic asbestos fibres have no smell or taste and, once inhaled, can cause a variety of cancers of the lung and the surrounding pleura. Despite chemical similarities between talc and asbestos, they are morphologically distinct and belong to different mineral subgroups. It is well documented that asbestos' carcinogenic properties are due to its fibrous morphology, which enables the substance to accumulate in body tissues, causing chronic inflammation and cellular damage over time. As it's rare for talc to occur in a fibrous form, many researchers doubt that talc is a carcinogen. However, it is listed as "possibly carcinogenic" by the International Agency for Research on Cancer (IARC), the World Health Organisation's specialised cancer agency.


Since 1982, several case-control studies have been undertaken to investigate the association between talcum powder and ovarian cancer. Researchers were able to observe a link, suggesting that dusting talcum powder on the perineum led to an increased risk of developing ovarian cancer. This may sound like solid proof that talc is carcinogenic, but a number of concerns have been raised about the findings. While the studies all reached the same conclusion, they used varied methodologies, and there was great disparity in the size of the risk determined. In addition, the lack of a dose-response curve has led to criticism. The dose-response curve is one of the most fundamental concepts in pharmacology, and describes the relationship between exposure to different concentrations of a substance and the effect on an individual. Not being able to reproduce this data makes determining the presence (or absence) of a cause-effect relationship extremely difficult.
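For context, a genuine dose-response relationship is often summarised by a curve such as the Hill equation, under which effect climbs reproducibly with exposure; it is the absence of any such reproducible pattern for talc that has drawn criticism. A generic Python sketch with invented parameters:

```python
def hill(dose: float, e_max: float = 1.0, ec50: float = 10.0, n: float = 2.0) -> float:
    """Fraction of the maximal effect produced at a given dose (Hill equation)."""
    return e_max * dose ** n / (ec50 ** n + dose ** n)

# For a genuinely causal exposure, effect should rise consistently with dose:
for dose in [0, 1, 5, 10, 20, 50]:
    print(f"dose {dose:>2} -> effect {hill(dose):.2f}")
```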


On top of this, case-control study design can be problematic. These studies are inherently prone to bias: subjects are not randomised into exposed or unexposed groups, but are observed by researchers who can deduce both their exposure to a cancer-causing substance and the outcome. Case-control studies provide less evidence for causal inference than randomised controlled trials; however, despite the latter being the gold standard, conducting this type of trial here would be unethical, as researchers would be required to deliberately expose women to the suspected carcinogen. Instead, researchers can draw conclusions about inference from three factors: the strength of the proposed association, the consistency of findings across multiple studies, and the proposed biological mechanism underlying development of the disease. Looking at these factors across multiple studies, several research groups believe that the association observed between talc and ovarian cancer is statistically weak2. Inaccurate reporting across studies may contribute to the varied results observed, as participants had to remember their talc usage over time. A meta-analysis of 16 case-control studies that found an association between the use of talc on the perineum and ovarian cancer concluded that the relative risk is 1.3. In other words, habitual users of

perineal talc would have a 30% increased risk (compared to non-users) of contracting this already rare cancer. This is difficult to visualise, but a relative risk below 1.5 is considered to be fairly small3. This data analysis has been interpreted as proof that talc has a causal role in the development of ovarian cancer. However, as it's difficult to control for the brand and quantity of talc used, in addition to the frequency and location of use over a number of years, comparing results across studies is complex. Many of the case-control studies investigated only compared one or two of those factors.
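To see why, it helps to translate the relative risk into absolute terms. In the sketch below, the baseline lifetime risk of ovarian cancer is an assumed figure of roughly 1.3%, used purely for illustration:

```python
baseline_risk = 0.013   # assumed lifetime risk of ovarian cancer (illustrative)
relative_risk = 1.3     # figure from the meta-analysis of case-control studies

exposed_risk = baseline_risk * relative_risk
print(f"non-users: {baseline_risk:.1%}, habitual users: {exposed_risk:.1%}")
# non-users: 1.3%, habitual users: 1.7%: a 30% relative rise, but less than
# half a percentage point in absolute terms
```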

Jacqueline Fox blamed Johnson & Johnson’s products for causing her aggressive ovarian cancer; she had been habitually dusting Baby Powder in her underwear since she was a teenager. She passed away in 2015, and a jury awarded her family $72 million in damages after finding Johnson & Johnson liable for


negligence, conspiracy, and for failing to warn customers of the potential risk of dusting Baby Powder in the genital area. The foreman of the jury said they reached this verdict after a series of internal memos, presented during the trial, gave the impression that the company were “hiding something”. Dr Roberta Ness, former Dean of the University of Texas School of Public Health and former President of the American Epidemiological Society, testified as an expert witness in the trial. She criticised the studies which found no link between perineal talc use and ovarian cancer, claiming that only five such studies measured both frequency and duration of talc use (although similar concerns have been raised about studies that did find a link). Johnson & Johnson felt they did not have a responsibility to add warning labels to products as they could not conclusively prove that talc dusted on the genitals can reach the ovaries. Dr Ness made it clear that she believes this does not excuse the company from their duty to warn Baby Powder users of any known


association with health risks. Following the verdict, 17,000 people contacted Fox's attorneys with claims of their own. This verdict was eventually reversed on appeal, not because of any new evidence or distorted science, but because the trial took place in Missouri instead of the plaintiff's home state of Alabama4. Fundamentally, lawyers and expert witnesses in the case seemed to realise that the science supporting a link between perineal talc dusting and ovarian cancer is disputed. However controversial it may be, it is still peer-reviewed scientific evidence, and the jury believed that Johnson & Johnson should have warned their customers that this evidence existed. In July of this year, again in Missouri, a judge ordered Johnson & Johnson to pay $4.7 billion in damages to 22 women alleging that their talcum powder caused their ovarian cancer. This figure is one of the highest ever awarded in punitive damages in a product liability case.

Perhaps taking the scientific evidence (and, more likely, previous successful appeals) into consideration, this most recent lawsuit is the first to suggest that the ovarian cancer was caused not directly by talc, but by talc contaminated with asbestos. The IARC states that ovarian cancer is one of only four cancer types that can definitely be caused by asbestos exposure. This conclusion was reached after researchers found asbestos accumulated in the ovaries of women who had died from ovarian cancer, but the way it reaches the ovaries has still not been proven. Some more unusual theories suggest that asbestos might reach the ovaries through the blood or the lymphatic system5. However, for contaminated talc dusted on the genitals, the most likely route would be through the reproductive tract. This is still an impressive feat, as in order for the microscopic fibres to reach the ovaries they would have to travel a distance of approximately 18 cm. In the US, FDA safety regulations

are in place to ensure that talcum powder products are asbestos-free. These rules were introduced in the mid-1970s, when the link between asbestos and cancer was first suspected. However, as asbestos-related cancers can take years, if not decades, to develop6, cases from before the regulations were introduced are still being diagnosed today. Yet many were shocked by the verdict of this latest trial, as several of the plaintiffs began using Baby Powder after the FDA regulations were brought in. During the suit, lawyers representing the women alleged that Johnson & Johnson were aware that their Baby Powder products were contaminated with asbestos and have been systematically covering it up since the '70s. Witnesses in the trial claimed that talc becomes contaminated with asbestos during the mining process and that it is impossible to separate the two substances. Johnson & Johnson refute this allegation, stating that they have rigorous purification and

testing protocols in place. In response to persistent public concern, a year-long study commissioned by the FDA in 2009 investigated the asbestos content of an array of talc products, including Johnson's Baby Powder. None of the samples tested by researchers was found to contain any asbestos fibres7. Despite the FDA findings, it was claimed during the trial that many of the women had talc and asbestos present in their ovaries. Johnson & Johnson are appealing the verdict. It is possible that further scientific study will reveal a causal link between genital talc dusting and ovarian cancer, but for now, the National Cancer Institute states that the "weight of evidence" does not support this8. This may not be enough for Johnson & Johnson, as lawyers have acknowledged that they are not interested in the science. Plaintiffs in Baby Powder cases are suing in small groups to maximise the impact on the jury. Mark Lanier, a lawyer representing the group of 22 women, stated, "It's


easier to get justice in small groups. In small groups, people have names, but in large groups, they're numbers"9. Many of the expert witnesses in the trials have admitted that the science is ambiguous and that more study is necessary. However, this was not what stuck with the jury. They weren't interested in study design or statistics; what they saw was that there was some evidence of a connection between talcum powder and ovarian cancer. At best, Johnson & Johnson failed to warn their customers; at worst, they covered it up.

Article and layout design by Katrina Wesencraft, a PhD student with OPTIMA CDT at the University of Strathclyde. Specialist editing by Miruna Costreie and copy editing by Kirsten Woollcott.


OPINION

REFORM THE EXTRACTIVE NATURE OF SCIENTIFIC PRACTICE Researchers must be accountable to the public that provide their funding. Danko Antolovic proposes steps to improve that accountability.

Science is everywhere in our modern world. We can look at any news outlet and chances are that we will run into some exciting bit of new science, perhaps even a breakthrough: a new elementary particle; yet another planet circling a distant star; another promising development in the fight against cancer... But then we look around and we see that our daily lives aren't undergoing any corresponding dizzying changes: our transportation relies on the gasoline engine that has been around for over a century; chemicals invented a lifetime ago support our food production; our good health still derives largely from vaccines, antibiotics and surgery, techniques which originated about a century ago.

Our foundational technologies, those that support the basics of modern life, rest on old science. There are many welcome refinements and improvements, to be sure, but little that is fundamentally new: cars are more efficient, but for the most part they still run on hydrocarbon chemistry; surgery is safer and less traumatic, but hopes of non-intrusive body repair at the cellular level remain elusive. Even our endlessly elaborate electronic gadgets are based on the transistor, an invention that is more than half a century old. And we still die of cancer and heart disease... Despite all the feverish research, why is tangible, beneficial progress so slow? Furthermore, amid the stream of exciting science news we find darker snippets: students and junior researchers suffering from depression and burnout; some rising academic star going down in flames when it is discovered that his career was built on fraudulent research; fraudulent journals offering to put the stamp of "peer review" on anything, for a fee1. What is really happening with science today?

In order to understand, we must look at how scientific research is paid for. For the most part, practitioners of science, either individuals or research groups, compete for grants that the wider society funds with taxes. Their competition, however, is not a race in which all runners compete on equal footing. Rather, it resembles a Monopoly game or an actual market competition, in which the participants' wealth increases the likelihood and size of the winnings2. The wealth in this game is not the grant money itself, which must be spent on scientific work, but professional prestige. Prestige is measured in publication counts, titles, previous grant awards and the like. Participants use what prestige they have to extract what grant resources they can, and they convert these resources into further prestige by means of doing scientific research. This prestige is then re-invested into acquiring additional grants. Participants must maintain this cycle if they want to stay in the game, and the only way to survive is by increasing one's store of prestige: the advancement of science is subservient to this goal. This is not to impute wholesale dishonesty to the scientific community, but the present system of rewards and penalties dictates the priorities, and the participating scientists must follow them, regardless of personal preferences. We see right away that this game demands risk-avoidance. Researchers must generate the quota of prestige required to continue the cycle, and will naturally choose safer, more tractable


investigations which will reliably yield the necessary publication or two in time for the next grant application. It is possible to make incremental progress in this manner, but daring, far-reaching investigations are discouraged from the start because they are unlikely to survive long enough to bear fruit. On the contrary, since significant discoveries are naturally rare, ever more frivolous findings are exploited to fulfil the demand for prestige.

When the competitive pressure becomes intense enough, it tempts researchers into darker practices: the same unit of work may be chopped up into several publications or published twice, all to generate maximum prestige from a given investigation. Then there's selective reporting of "best" results, data massaging, and the occasional outright fabrication. But even research that is nominally above reproach serves only as a means to an end, yielding results that are superficial, fragmented, and often of no consequence3. This bazaar-like practice of science has prevailed for the last half-century or so, with dubious results. It has filled libraries with scientific publications to the point of bursting, but the larger failure is there for all to see: one after another, the exciting headlines fade from the news, there is little follow-up in the years to come, and the great volume of research runs off into the sand. We may reasonably ask whether the fault lies in the conversion of scientific results into usable technology. This conversion has its own problems, but it is mostly kept focused by the profit motive and market dynamics. There are inevitable implementation difficulties, as well as popular resistance to certain technologies, such as genetic engineering, but

technology development shows no signs of the fragmentation and drift inherent in the research funding model. The taxpaying public has neither the expertise nor a venue through which to assert its interest in the bargain, and cannot correct the extractive, self-serving nature of current scientific practice. In so many words, the public is expected to provide monetary support and be satisfied with whatever it receives in return. It is not surprising that it doesn't receive much.

This one-sided deal is not sustainable and will, in the end, only encourage anti-scientific attitudes that are already present in modern societies. As the world grapples with large-scale problems that will require all the scientific ability we can muster, it will not be helpful if the public perceives the science community as a parasitic class, squandering money on frivolous things. Serious reforms are needed. First of all, the directionless, wasteful science bazaar must come to an end and be replaced with a sustained commitment to fewer, but worthwhile, research projects. The public will have to give its consent, by direct vote or through its representatives, that a proposed project is of sufficient public benefit, and it will be entitled to an estimate of the likelihood of success, which the scientific community will in turn provide after assessing whether the goal is realistic.


This will not be an easy reform to undertake, but other publicly funded activities, such as infrastructure and education, operate under a reasonable degree of public consent and accountability; there is no compelling reason for science to be exempt from this. Secondly, sustained, long-term research will have to be managed by people who are actual managers. This idea is anathema to academic scientists, but a good case can be made for it. Large technical projects, such as space exploration missions and particle accelerators, are often praised as inspiring examples of "big science". These projects are, in reality, feats of engineering rather than science proper, but they have an impressive record of success and are always formally managed. Good management serves the vital purpose of preventing the project from falling apart due to the conflicting self-interests of the participants. And thirdly, participating researchers must be given reasonable leeway when proposing and carrying out their specific part of the investigation – it is difficult to see how research could be done otherwise. However, individual contributions must be evaluated against relevant, tangible criteria and with the advancement of the project in mind, not in the current, increasingly meaningless terms of accumulated prestige. Some readers will no doubt object that such an approach would stifle science with rules and regulations, depriving it of spontaneity and creative insight. But let us be honest: free creative innocence existed only in the early, precarious days of scientific endeavour.


Contemporary scientists are not autonomous agents but employees of larger institutions, keenly preoccupied with their standing in the professional hierarchy. Meanwhile, scientific research and education are nationally important undertakings of all industrial countries. The supporting public is entitled to an honest assessment of what it is getting in return for its support, and how soon. Making it clear what the goals, obligations and expectations are can only be beneficial to everyone involved. Lastly, let us retain our confidence in the better side of human nature. There are many sincere practitioners of science, often young researchers who wish to do good work and who chafe at unreasonable requirements and the wasteful, self-serving hierarchy of the scientific enterprise. They deserve a more decent consideration than the one their chosen profession gives them now. Science has been a great source of good to humanity, but there is no guarantee that it will remain great and good automatically, without our conscious effort. A course correction is needed; we are convinced that a more cohesive atmosphere, steady institutional support, and commitment to larger common goals will once again bring out the creative best in those who devote themselves to science.

This article was written by Danko Antolovic, a scientist and technologist whose publications cover research in quantum chemistry and computational modeling of molecules, research in solar energy for space applications, design of systems for image analysis and robotic vision, and development of wireless communication technology. He is the author of "Whither Science?" and other writings about the nature of scientific inquiry. It was copy edited by Dzachary Zainudden. Specialist editing and layout design by Katrina Wesencraft.


BOOK REVIEW

REVIEW:

EVERYBODY LIES: BIG DATA, NEW DATA, AND WHAT THE INTERNET CAN TELL US ABOUT WHO WE REALLY ARE – SETH STEPHENS-DAVIDOWITZ

Does the internet know you better than you know yourself? Quite possibly, according to Seth Stephens-Davidowitz, who makes an entertaining case for this in his book “Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are”. It knows what movie you are likely to watch next, your sexual orientation, even whether you’re going to make it to the last chapter.

Stephens-Davidowitz, a former Google data scientist with a PhD in economics from Harvard, argues that big data is providing an unprecedented insight into the human condition. And he backs up his claim with style, bombarding us with case studies drawn heavily from the holy trinity of internet data: Google, Facebook, and PornHub. The revelations are by turns comforting, unsettling, and hilarious. Some are expected, like the fact that people exaggerate how much sex they are having; some less so, like the unnerving frequency of incest-related Google searches.

First, let’s take a step back. What is big data? Here, it is a loose term used to describe the sheer volume, diversity, and quality of data that modern technology allows us to acquire. Stephens-Davidowitz is concerned with what we can do with this data. He presents what he considers to be the “four powers of big data”, which are revolutionising data science: the data is honest, novel, abundant, and suitable for randomised controlled trials. The biggest game-changer is honest data. Stephens-Davidowitz administers his “digital truth serum” in the juiciest portion of the book, which demonstrates that, in contrast to common databases that rely on self-reporting, like polls, new sources of data, such as Google searches, offer an incentive to tell the truth. Google may well be the only reliable source for some information, such as the frequency of self-administered abortions. Stephens-Davidowitz uses Google search data to dispel myths (it turns out social media is less of an “echo chamber” than real life), confirm stereotypes (the working title of this book was “How big is my penis?”) and confront the reader with some uncomfortable truths...

Just before Obama’s re-election in 2012, reports based on polls and surveys indicated that, by and large, race did not influence the voting intention of Americans. However, a study performed by the author revealed a striking correlation between areas with high rates of racist Google searches and areas where Obama underperformed in the election by an amount that could not be explained by other factors. The same pattern was found for Trump’s better-than-expected performance in the 2016 Republican primaries, with racist searches proving to be the factor most strongly correlated with his popularity. This is a hugely relevant piece of information missing from the debate surrounding race relations in the United States. Google is not only a source of honest data; it is also the only place some confessions see the light of day.

While Stephens-Davidowitz dreams of big data being used to satisfy intellectual curiosity and improve society, he notes that profit-makers have already begun to capitalise on this new resource, exploiting new data in order to manipulate consumers. How do they find the data that is useful (read: honest) to them? Through the relentless application of online A/B tests – experiments in which two random groups of customers are exposed to two different options (A or B), allowing the company to identify


the option that produces the most desired response. Why would a social media company ask its users which version of a new feature they prefer – and risk them choosing the option they would use less – when an A/B test would reveal which is the more enticing and therefore profitable? These tests are employed by almost every large online presence you can think of, from Netflix to news outlets; we are all participating in this whether we like it or not. A/B tests are, in fact, large-scale, online versions of the randomised controlled trials carried out by scientific researchers every day. Randomised controlled trials are a type of experiment in which a number of people are randomly assigned to one of two groups. One group is exposed to an “intervention” – this could be a drug, an exercise or anything that is hypothesised to have a measurable effect on the participants – and the other group is not. In medical research, these trials are the gold standard for demonstrating the success of a treatment because they can show a causal link between treatment and patient outcome. With big data comes the power to infer causation from correlation more cheaply and easily than ever before. New types of data and huge amounts of data – providing sufficient granularity for one to “zoom in” on subsets of the population – complete the four powers. Together, they allow us to answer all kinds of questions, old and new. To demonstrate this, Stephens-Davidowitz recounts an excellent study on the relationship between violent crime and violence in the media, which makes use of copious amounts of box office and crime data. In contrast to the oft-quoted results of lab experiments, which measure increased levels of aggression after exposure to violent films, this study found the opposite


effect: a reduction in crime during the weekends when a popular violent movie opened in cinemas. Real-world data is influenced by factors a lab could never account for; from society’s point of view, this study probably has the more ecologically valid result. Perhaps the most consequential piece of debris left by the big data explosion is the ability to use correlations to predict the behaviour and condition of individuals with increasing accuracy and precision. Pool the Google search histories of people who went on to be diagnosed with pancreatic cancer, and you’ve got yourself a new diagnostic tool. Stephens-Davidowitz believes that “collecting rich data on the world’s problems is the first step towards fixing them”, and the Human Rights Campaign and child protective service agencies are already keen to enlist his help. He has also, of course, been approached about predicting the stock market. Motivated by the various hazards of this newfound predictive power, the book closes with a warning: with big data comes big responsibility. Correlations are easy to find. However, correlation need not imply causation, and the more you look the more likely you are to stumble upon a coincidental correlation linking two entirely independent variables (as a team who attempted to predict the stock market using Twitter data found out to their dismay [1]). There are dangers associated with making decisions based on even the most reliable predictors of behaviour. While most people would support Google’s choice to display the Samaritans helpline to people who search the phrase “suicidal thoughts”, I doubt a policy of arresting predicted criminals before they have broken

the law would appeal to many. Another issue is the rigour, or lack thereof, of big data science. Stephens-Davidowitz appears to take Google search data more or less at face value. He does acknowledge that Google can bring out a different side of people; that searches “skew towards the forbidden”. But is it possible to quantify this effect? I, for one, have expressed many a questionable sentiment in the search box, tempted by an outrageous phrase suggested by the autocomplete, and frequently use Google as an outlet for my morbid curiosity. Nevertheless, the author argues convincingly for the potential uses and likely ubiquity of big data science in the future. Personally, I find this ubiquity – “everything is data” – to be the biggest concern. Stephens-Davidowitz touches upon it, and concedes that “Big Data does not eliminate the need for all the other ways humans have developed over the millennia to understand the world.” He gives some specific examples of where big data can fail. But what about the long-term repercussions? As more people begin to realise the rewards that can be reaped from big data, there is a danger of reducing almost all aspects of life to a data science. After all, we as a species are primitive data scientists. The instinct to go to the side of the sick in order to care for them helped our species survive for millennia. The idea of isolating some sick people came later, after the observation and analysis of many outbreaks of contagious diseases. In this case we have overridden an instinct for our own good. What would society look like if we had more reason to go against our gut? In some areas,


such as medicine, this is likely a positive thing. In other areas, it could lead to denying help to individuals who need it because of predicted outcomes, pressure for reduced privacy, and less emphasis being placed on in-depth understanding. What about a society where companies can continually manipulate people’s behaviour unchecked? There is no coherent set of laws to prevent this, and many would struggle to put their finger on exactly why they are uncomfortable with it; it’s not explicit control, but a more subtle form, with consequences including social media addiction. I fear we are seeing the effects already. We have never witnessed the evolution of a species with this power at its disposal. Now, more than ever, we may need to keep a close eye on our own critical thinking and the values we aspire to as a society. At the same time, big data science is providing us with unique opportunities – to expand our knowledge and improve quality of life – outside the domain of traditional science. In addition to offering some tentative hope for the future, “Everybody Lies” is an excellent read if you are interested in data science, behavioural psychology or are just nosy.
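For the data-curious, the A/B testing described above boils down to surprisingly little code. Here is a minimal sketch in Python; the user count, conversion rates, and function name are invented for illustration, not taken from the book.

```python
import random

def simulate_ab_test(n_users=10_000, rate_a=0.10, rate_b=0.12, seed=1):
    """Randomly assign each user to option A or B, then compare
    conversion rates between the two groups. The 'true' rates are
    made up here; a real platform would measure clicks, sign-ups, etc."""
    rng = random.Random(seed)
    shown = {"A": 0, "B": 0}
    converted = {"A": 0, "B": 0}
    for _ in range(n_users):
        group = rng.choice("AB")  # random assignment is the key ingredient
        shown[group] += 1
        true_rate = rate_a if group == "A" else rate_b
        if rng.random() < true_rate:  # did this user convert?
            converted[group] += 1
    for g in ("A", "B"):
        print(f"Option {g}: {converted[g]}/{shown[g]} "
              f"= {converted[g] / shown[g]:.1%} conversion")

simulate_ab_test()
```

Because assignment is random, any sizeable difference between the groups can be attributed to the option itself – the same logic as the randomised controlled trials described earlier. A real deployment would also run a significance test before declaring a winner, so that a fluke difference is not shipped to millions of users.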

This review was written by Anna Duncan. It was specialist edited by Anna Henschel and copy­edited by Kirsten Woollcott. Layout by Gabriela De Sousa.




LIFE SCIENCES

BIOHACKING: DIY GENE THERAPY SONYA DIVES INTO THE WORLD OF GENE THERAPY, EXPLORING A NEW TREND IN BIOHACKING, DIY GENE THERAPY, AND ITS IMPLICATIONS

Biohacking has been a longstanding theme throughout science fiction, from Mary Shelley’s classic novel Frankenstein to Rupert Sanders’ recent action-filled movie Ghost in the Shell. But biohacking, or ‘do-it-yourself (DIY) science’, is no longer an abstract or futuristic concept. In fact, it is a growing trend that has garnered increasing attention.

Encapsulated in its name, biohacking is broadly defined as using science and/or technology to hack living organisms. It encompasses two main spheres: hacking non-human organisms and hacking humans. Unconventional and innovative projects include genetically modifying plants to glow in the dark and using pigment-producing bacteria to develop biodegradable ink. With regards to biohacking oneself, this can range from altering one’s diet to making direct changes to our genome. Recently, several individuals have shared videos on social media showing themselves self-administering gene therapies, thereby propelling biohacking into a whole new arena – DIY gene therapy. While enthusiasts cite their right to self-experimentation, health care professionals, academics, and government agencies have been outspoken critics, and are working to highlight the potential dangers. If DIY gene therapy continues to grow as a movement, learning about the field, exploring different perspectives, and discussing its implications will be crucial to promote understanding of a concept that has now become reality.

Gene therapy is a technique that involves introducing genetic material into cells, aiming to compensate for missing or defective genes as a treatment for disease. While this may sound relatively straightforward – swap a disease-causing gene for a new, healthy one – in practice it is remarkably complex. The first issue, and a fundamental requirement of gene therapy, is that genetic material must enter a cell’s nucleus for it to work. Therefore, it must be able to withstand and penetrate the physical barriers within our body, such as enzymes in our blood that readily degrade genetic material and the cell membrane which surrounds every cell in our body. Hence, gene-carrying vehicles, known as ‘vectors’, are required to facilitate the transfer of genetic material into cells. Vectors fall into two main categories: non-viral and viral delivery systems. Viruses are an ideal delivery vehicle for genes, since these microscopic agents possess unique mechanisms to infect or ‘transduce’ cells. Viral vectors, derived from viruses such as the adenovirus, retrovirus, and herpes simplex virus (HSV), are genetically modified to reduce their toxicity while retaining their ability to get inside cells. As a result, viral gene therapies benefit from relatively high gene transduction efficiency compared to non-viral gene therapies. Non-viral vectors include chemically synthesised agents that form a complex with DNA, and plasmids. Plasmids are circular, double-stranded DNA molecules that can replicate ‘automatically’. They occur naturally in bacteria, from which they can be isolated and manufactured to carry a gene. Although non-viral vectors carry fewer safety risks than viral vectors, off-target effects and toxicity are a ubiquitous concern in gene therapy. This highlights a further issue: genetic material should ideally be targeted only to cells relevant to the disease. Finally, once the gene is inside the nucleus of its target cells, it must function well enough, and for long enough, to have the desired


clinical outcome. Failure to address, or poor understanding of, any of the aforementioned factors can lead to ineffective treatments. More importantly, it can also lead to dangerous and even fatal outcomes, both of which have occurred in clinical trials. Overall, these factors highlight the challenges in developing a successful gene therapy, as well as the gravity of clinical testing.

In 1990, 4-year-old Ashanthi DeSilva became the first person to take part in a gene therapy trial1. She was born with a rare genetic disorder called adenosine deaminase severe combined immunodeficiency (ADA-SCID), which arises from inheriting two faulty copies of the gene that encodes the enzyme ADA. With all other treatment options exhausted, she entered the trial and received a genetically modified retroviral vector carrying a functioning version of the gene. Today, Ashanthi still requires low-dose enzyme replacement therapy, but as the first of its kind the trial was arguably a success. However, not all gene therapy trials have shared this outcome. In 1999, Jesse Gelsinger, who suffered from a rare metabolic disorder, ornithine transcarbamylase (OTC) deficiency, became the first person to die in a gene therapy clinical trial2. Within hours of receiving an adenovirus vector carrying the OTC gene, Jesse developed an immune reaction, and he passed away four days later. In another trial, four out of nine infants treated with a retrovirus-mediated gene therapy for SCID developed leukaemia, and the trial was halted in 2002, after two years. With gene therapy trials facing backlash and rising safety concerns, it is of little surprise that the first gene therapy, Glybera, was not approved until 2012. Since the approval of Glybera, the number of gene therapy clinical trials has been steadily increasing, with six gene therapies now approved in the western world. As scientists have gained a deeper understanding of how vectors interact with the human body and taken a more cautious approach to pre-clinical and clinical testing, significant progress has been made in gene therapy. The field has expanded from rare single-gene (also known as ‘monogenic’) disorders to the treatment of complex disorders, which arise from an array of genetic

and environmental factors. In line with this, gene therapy trials are being conducted in a variety of complex disorders, including cancer, cardiovascular, infectious, and neurological diseases. In fact, two of the three gene therapies approved in the USA are indicated for different types of blood and skin cancer. With a total of 2,597 ongoing gene therapy clinical trials in 2017, the number of approved therapies is expected to rise3. However, it is important to note that the majority of trials remain in the early phases of clinical development, with only an estimated 4% at later stages. This highlights the rigorous testing that gene therapies undergo during development. Yet this carefully regulated process has now been challenged by the recent rise of DIY gene therapy experiments circulating on social media.

In October 2017, Josiah Zayner became an online sensation after injecting the gene-editing tool CRISPR into his forearm at a synthetic biology conference in San Francisco. In his experiment, Zayner used CRISPR to cut the gene myostatin, with the aim of promoting muscle growth in his body. In a recent interview, the self-proclaimed social activist defended his live-streamed event, stating: ‘One of my big problems with academic and medical science is you read lots of these papers. Lots of stuff, we cured X or we did X, but it won’t be available to the general public for 10, 20, 30, 40 years. To me, that seems ridiculous. How do you expect this technology go forward if they aren’t testing, playing around it?’4. The intention of his self-experimentation was not to share a major scientific breakthrough or demonstrate the potential therapeutic value of gene therapy. Rather, Zayner hoped that knocking down his own gene in a public forum would break down barriers surrounding scientific research. The aim was to show the public that they can take science into their own hands and that gene-editing tools should be available to them, not regulated by research institutions or pharmaceutical corporations. He believes individuals should be entitled to edit their own genes, so it is perhaps unsurprising that he is CEO of The ODIN, a company that sells DIY gene therapy products and kits online to the public. The freedom to explore biology and the right to self-experimentation are concepts rooted in the ideology of biohacking. Through scientific investigation, enthusiasts seek to gain

WWW.THE-GIST.ORG

a deeper understanding of human biology and, in turn, hack the system with self-improvement in sight. Self-experimentation is not explicitly illegal and in fact has a history in the field of medical research, with eight scientists awarded Nobel Prizes for work related to self-experimentation. It harnesses the potential to advance research and prompt scientific breakthroughs. However, Zayner’s experiment and the cases that followed his stunt highlight how DIY gene therapy and its public accessibility challenge these perspectives. A few weeks after Zayner posted his video, Tristan Roberts, a 28-year-old computer programmer, live-streamed himself injecting a plasmid carrying an antibody into his stomach, a product prepared by the company Ascendance Biomedical5. Roberts expressed hope of potentially curing HIV, which he had been diagnosed with six years earlier. In February 2018, the CEO of Ascendance, 28-year-old Aaron Traywick, injected himself with an experimental herpes treatment containing a modified version of HSV at a biohacking conference in Austin, Texas. Roberts and Traywick, who unlike Zayner have little or no medical experience, hoped their experiments could drive research forward or even cure their diseases, potentially saving patients around the world. Though luckily neither was seriously harmed in the immediate aftermath, their experiments did not produce the results they had wished for. But could their results have driven research forward and truly had the impact they believed? And to what extent were these experiments socially responsible? These questions are particularly important given that Traywick passed away three months later, in May 2018 – though it is equally important to note that his cause of death remains publicly unknown.

In terms of scientific value, it is difficult to draw conclusions from the data of a single person, arguably undermining any results, and self-experimentation is also prone to bias. Furthermore, any treatment requires approval before it can be prescribed, meaning that adherence to the drug development process, with its corresponding feasibility and safety studies, is necessary for a treatment to reach patients. In addition, live-streaming such experiments on social media and the



A basic guide to gene therapy: the introduction of genetic material into cells can compensate for abnormal genes to restore cell function as a treatment for disease.

resulting disproportionate hype can hinder public understanding. This puts in jeopardy the delicate relationship between regular citizens and scientists – a relationship which the research enterprise, through public engagement, has paradoxically been seeking to build. Finally, providing inaccurate information and false hope can obscure the dangers involved in DIY gene therapy, potentially putting people’s safety at risk. After watching Traywick’s self-administered herpes treatment, Zayner echoed this sentiment in a Facebook post, stating: “Looking at my actions in the past, which unfortunately did include a public injection in a semi-ridiculous manner, I want to apologize, in that I could have inspired people to think I was doing things on a whim when I was not… All of this is not to say I am against self-experimentation or treatment… What I am against are biohackers and sketchy companies misleading people into believing they have created cures for diseases or that cures could be created so easily”6. Furthermore, shortly after his own self-experimentation, Roberts announced: “...after having my optimism thoroughly crushed and trampled, it's with an unburdened heart that I am announcing that I am dissociating myself from Ascendance Biomedical at least until the CEO is removed. Aaron Traywick is, by most definitions, a scam artist”7. As evidenced by the fatal outcomes of past clinical trials, self-experimenting with untested gene therapies should not be taken lightly: an important message at a time when such therapies are

accessible to the public. Though the DIY gene therapy products and kits supplied by The ODIN require certain expertise, as well as access to facilities that the majority of people do not have, they are still readily available online. Moreover, Ascendance Biomedical offers gene therapy research compounds to anyone who signs up online, with no restrictions in place. With items labelled ‘not for human consumption’ and ‘non-pathogenic’, regulatory agencies have been put in a difficult situation, and it remains a grey area whether self-experimenting with such unapproved treatments is punishable by law. Nonetheless, the FDA has issued a warning that the sale of DIY gene therapy products and kits is against the law, citing concern over the safety risks involved. The German government has also issued a statement that ordering these kits and conducting experiments outwith a licensed genetic engineering facility can lead to a €50,000 fine or up to three years’ imprisonment. However, it remains to be seen whether, and what, legal action will be taken, with Ascendance Biomedical citing April 2018 as the earliest release date for its products.

While the recent social media stunts bring the potential dangers of DIY gene therapy to the forefront, it is important to remember that these dangers are not necessarily shared throughout the vast field of biohacking. Biohacking does not equate to taking life-threatening risks; the mission of ‘establishing a vibrant, productive and safe community’ lies at its core8. This is reflected in the recent development of a draft code of ethics, devised to act as a framework to help


guide DIY biologists and future research. However, DIY gene therapy highlights the new ethical and legal questions that biohacking may raise, as the movement seeking to democratise research is only just beginning to grow.

This article was written by Sonya Frazier, copy­edited by Kirsten Woollcott, and specialist edited by Emily May Armstrong. Layout by Gabriela De Sousa.


SOCIAL SCIENCE

KEEP DANCING WHILE THE MUSIC IS PLAYING Paul Lavery examines the current state of the private equity market.

In 2010, after returning cash to investors and struggling to find a buyer, Candover – one of Europe’s largest private equity firms – went into liquidation. The collapse followed a series of failed investments made at the height of the frenzied private equity activity of 2007/08, and the company, once seen as the gold standard of European private equity, was wound up. Today, the private equity industry is showing signs that it is in a perilously similar condition to 2007 – a condition which last time ended in collapse.

But first, what is private equity investment? It is, simply put, giving money to a firm, which pools money from different investors into a ‘fund’ and then uses this fund to improve and develop private companies. As the companies in which it has invested begin to grow and develop, you (the investor) earn a return on your initial investment, while paying the firm a fee for its services. The warning signs in the private equity market are there for all to see, and have been for a while. Private equity firms have been receiving record amounts of money from investors who are desperate to earn a return. This has allowed firms to raise record-sized funds: last summer, the US firm Apollo Global Management opened a $25bn fund, the largest private equity fund ever raised. The European record was also broken in 2017, with a €16bn fund being raised1. Not only are private equity firms being given more and more money to invest, they are being given it quicker than ever. While normally a firm may spend around a year speaking to potential investors, trying to convince them to invest

money, they are now spending only around nine months doing so – highlighting the surge in demand for their services. But surely more money and more demand in a financial market is a good thing? Not always. Given that private equity firms have record amounts of money to invest, they are under serious pressure to invest that money in profitable companies and earn their investors a return. Just think – if you


gave someone money who promised to earn you a profit before returning it to you, you would want to see them actively working to earn that return, wouldn’t you? Well, that isn’t quite happening just now. While plenty of money is undoubtedly being invested, private equity firms currently have more uninvested money on their books than ever before. This money, known as ‘dry powder’, essentially gathers dust while it waits to be invested. Too much dry powder is bad for business; nobody invests money and pays a professional investment manager lofty fees to watch it gather dust. After all, uninvested money won’t earn a return, defeating the purpose of the initial investment. The reason so much money is being left uninvested is simple: competition. Today, there are over 3,000 private equity firms in the UK and over 7,000 in the US. As mentioned, they are all receiving record amounts of money from investors in record time. Moreover, they are all looking for profitable companies to invest that money in. However, not all companies meet their criteria, and not all necessarily want a private equity firm investing in them (since private equity firms often have a large say in the day-to-day running of the



business). As a result, there is intense competition between a huge number of private equity firms for a finite number of profitable investment opportunities, which has resulted in a lot of money being left on the shelf. The laws of supply and demand in economics tell us that when we have so many buyers and only a finite number of items to sell (private, profitable companies, in this case), rising demand causes the price of those goods to rise. Likewise, as there is high demand and competition for companies in the private equity market, firms are currently paying more for their assets (companies) than they were in 2007/08, before the market eventually collapsed. And wide-scale overpricing, commonly referred to as a bubble, is rarely a good thing in any financial market. All of this leads to a major concern: sloppy investments. If a private equity firm comes under serious pressure to invest money, there is a good chance that it may decide to invest in a company that it wouldn’t have touched under normal circumstances. This can potentially have a dangerous impact on the economy. If money is being invested in an unprofitable company that is in a weak financial position, it may ultimately default on its debt, leading to redundancies or liquidations – in short, the investment will go badly wrong. It may seem far-fetched, but this can become problematic, particularly if a large number of investments start to go wrong at the same time. This problem was in part fuelled by low interest rates over the last few years (interest rates were reduced to help stimulate the economy after the global financial crisis). When interest rates are low, investors find it harder to earn a large return on traditional financial instruments, so they push more money into alternative investments, such as private equity. What’s more, this low interest rate environment makes bank loans cheaper, allowing private equity firms to take on record amounts of debt and make larger transactions than ever before. In any area of finance, more debt translates into more risk. As was the case in 2007, increasing risk and pressure due to inflated prices can lead to dangerously rash investments.

In the age of ever-advancing financial technology, investment vehicles – such as hedge funds and mutual funds – are now turning to machine learning and artificial intelligence (AI) to aid in selecting the most profitable investments. Essentially, algorithms select where to invest money, as opposed to a human researching different opportunities and selecting one. In 2016, the hedge fund industry recorded its sixth year-on-year increase in investors in quantitative funds – funds which depend on advanced algorithmic software to make investment selections – many of which depend on machine learning and AI. The private equity industry is lagging behind in this respect. Nevertheless, some leading Asian private equity firms are turning to AI in a bid to improve their investment screening process2. In particular, they are using algorithms to filter through hundreds of companies’ financial data and accounts, in order to quickly select those which meet their initial investment criteria3. Despite the increased efficiency and profitability they can bring, these technologies can have severe adverse effects on financial markets. We need only remind ourselves of the 2010 ‘flash crash’, where the whole of the US stock market was inaccessible for over half an hour (a painfully long time for traders) and the market collapsed in a matter of seconds before quickly rebounding, causing huge widespread losses. As it happened, advanced computer-driven algorithms caused stock prices to plummet inside a few seconds, and panic in global markets ensued. While technology can undoubtedly bring benefits to the financial industry, it can also bring unprecedented dangers.

What should private equity firms do? Could advanced technologies like AI and machine learning be the future of private equity investing? It remains to be seen whether the industry will embrace these advances to the same extent as other financial markets. What’s important to know now is this: private equity firms are taking on more risk and paying more for assets than they have ever previously done, due to the intense pressure they face from investors. This increases the likelihood of sloppy investments. Altogether, these ought to be alarming signs. Only a decade ago, many private equity firms went bust after a period of similarly frenzied activity. In the meantime, it would make sense for firms to take on less money from investors and reduce their dry powder (money waiting to be invested). This would relieve some of the pressure on them to find profitable companies to invest in. Unfortunately, regardless of the financial market in which they operate, saying ‘no’ to more money as an investment manager is easier said than done. Memories can be short in financial markets; they simply keep on dancing while the music is playing.

This article was written by Paul Lavery, a PhD student in corporate finance who has previously worked in financial technology and private equity. Specialist editing by Chirsty McFadyen. Copy editing and layout design by Katrina Wesencraft.


LIFE SCIENCES

A CONTRACEPTIVE PERSPECTIVE

IS IT TIME TO LEAVE OUR CONTRACEPTION PRECONCEPTIONS AT THE DOOR? KIRSTIN LESLIE INVESTIGATES THE BAD PRESS AROUND CONTRACEPTIVES.

There is no doubt that the introduction of effective contraception has had an important role in empowering women; however, it doesn't necessarily feel like you are making an empowered decision when understanding the myriad options available can be a minefield. Not to mention the conflicting information portrayed in the media: if I had wanted to write an article proving that the combined pill causes cancer, I would have been able to find news sources to back it up; but equally, if I had wanted to prove that it actually reduces cancer risk, there are enough sources to back that up as well. Confused? Me too. And, to add to these contraception misconceptions, we often perpetuate misinformation around our friendship groups by passing off anecdotal evidence as fact. That’s not to deny that there can be real side-effects, but it is important to understand that something that works for one person may not for another. Moreover, our understanding of the risks involved can often be clouded by these judgements. So let's delve into some of the headlines, based on both facts and myths, and see if we can make some sense of it all.

There is a huge range of hormonal contraceptives, but the combined oral contraceptive (COC) pill is the most common. It contains the synthetic hormones oestrogen and progestogen, which prevent ovulation. Progestogen also creates a mucus block at the cervix to prevent sperm from entering, and thins the lining of the uterus to prevent any eggs that are released and fertilised from implanting. It is 99% effective if taken correctly, making it one of the most reliable methods of birth control. Hormonal contraceptives have been getting a fair bit of bad press over the years, so let's rattle through some of the most worrying claims made against them and try to clarify what is known.

Infertility: There is zero evidence that the pill can make you infertile, and most women will quickly become fertile again as soon as they stop taking it.

Blood clots: Yes, the pill can put you at risk of blood clots, but this is well documented and your GP will not prescribe it if they think you could be at risk. This is why you have your


blood pressure checked every 6-12 months before getting a new prescription, and if you have any additional risk factors, such as migraines with aura (visual disturbances), then you will not be able to take the COC. In this case, progestogen-only methods such as the implant or mini-pill might be options.

Cancer: Research is still ongoing. The current consensus is that the pill can give you a slightly elevated risk of breast cancer, but it reduces the risk of other cancers, such as ovarian or colon cancer, and 10 years after you stop taking the pill your cancer risk returns to normal1.

Depression: This one is slightly more complicated, as there is conflicting research. Certainly, as it influences your hormone levels, it can contribute to mood swings when you first begin taking it, though this should settle after a few cycles. But can it cause depression? One large study that has underpinned many of the media claims only found a link with antidepressant prescribing in the first few months of taking COCs, after which the risk declined2. It’s also important to note that correlation is not causation, and the findings from this study do need further investigation. If you have a history of anxiety or depression, then discuss this with


your GP before starting any hormonal contraceptive, and if you are on the pill and feel it is affecting you, then don't be afraid to consider trying an alternative.

Not being an expert on this subject, I have to admit that my original reaction was not to trust these apps as far as I could throw them (or as far as I could throw the phone they're downloaded onto). I still remember being told in school that people can get pregnant even on their period, so I had a healthy cynicism when these apps became popular. However, last year the Natural Cycles app was approved as a certified contraceptive in the EU. The company claims that the app has an efficacy of 93% with typical use (i.e. used imperfectly, but still to a typical standard), based on a study of 22,785 women3. It works by monitoring daily body temperature measurements, using these to estimate when you are ovulating and might be at risk of becoming pregnant. It seems attractive as a cheap and side-effect-free method – though note that while 93% sounds high, that's 7 in every 100 women having an unwanted pregnancy. The NHS do not recommend the app: they state that further research should be carried out to confirm the results of the initial study, as it was a retrospective data analysis, which can be an imperfect study design because it does not always give a sensitive enough level of information4. Notably, the study relied on app users inputting information about sexual intercourse. However, it is clear that some data was missing, as almost half of all pregnancies recorded during the study followed a cycle with no intercourse recorded. This compromises the study design, as efficacy studies of contraceptives are required to exclude months without sex from their analysis, since the risk of pregnancy is then null by default. Other flaws with the study include conflicts of interest (the lead authors were also the app developers, and the study was funded by the company), a lack of clarity around differences in efficacy between those with regular and irregular menstrual cycles, and a high participant drop-out rate (34%). More insight is needed before Natural Cycles can be said to be anywhere near as reliable as the

alternatives. Recently, it has also been under investigation in Sweden following a string of unwanted pregnancies5. However, the app may have value for tracking your cycle generally, or for working out when you are ovulating if you are trying to get pregnant. If you cannot use other contraceptives then it is a helpful option to have, but do read up on it and be aware that you could still be at risk of getting pregnant. Also, if you struggle with irregular periods or conditions such as Polycystic Ovarian Syndrome, then the app may not be able to track your cycles effectively6.
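To see in numbers why excluding months without sex matters, here is a toy calculation in Python of a failure rate per 100 woman-years (a simplified Pearl Index). All the figures are invented for illustration; they are not the Natural Cycles study's data.

```python
def failure_rate(pregnancies, months_at_risk):
    """Failure rate per 100 woman-years: a simplified Pearl Index."""
    woman_years = months_at_risk / 12
    return 100 * pregnancies / woman_years

# Invented example: 600 pregnancies over 120,000 recorded cycles (months),
# of which only 90,000 involved any intercourse.
naive = failure_rate(600, 120_000)     # counts every recorded month
corrected = failure_rate(600, 90_000)  # counts only months at risk

print(f"naive:     {naive:.1f} pregnancies per 100 woman-years")      # 6.0
print(f"corrected: {corrected:.1f} pregnancies per 100 woman-years")  # 8.0
```

Months carrying no pregnancy risk inflate the denominator, so leaving them in makes a method look more effective than it really is – which is exactly the objection raised against the study above.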

The coil, or intrauterine device, is inserted into the uterus via the cervix and prevents pregnancy by creating a physical block and releasing copper, a natural spermicide. The thought of getting a coil put in might be a little daunting, but it is possible to have local anaesthetic during the procedure, so don't let that part put you off. It's one of the most effective birth control methods and can last 5-10 years7. When it is first put in, the copper coil can lead to heavier or prolonged periods, but for most this does settle, and it is a hormone-free method. If you have heavier periods then the Mirena coil, which releases a progestogen, is a similar method – though it lasts 3 or 5 years depending on the specific brand.

You may have heard over the past few years that male contraceptives are on the horizon. In 2016, an injected, hormone-based contraceptive (like female contraceptives, it also uses progestogen) was trialled and found to be highly effective, with 75% of the men in the trial saying it was something they would be happy to continue using were it available8. However, side-effects including mood disorders and changes to libido, similar to those of other hormonal contraceptives, were found. As a result, the trial was halted until formulations could be improved to reduce these. More recently, a male pill, dimethandrolone undecanoate (DMAU), has shown promise in early trials, with fewer side-effects reported. However, it has several phases of testing to go through before it could come to market.


Other innovations include an injectable gel that blocks the ducts carrying sperm out of the testicles, which may be available as soon as 2020.

This one is not a myth to debunk but rather a shout-out to latex (or latex-free alternatives for those with allergies). Condoms may not be glam, but they are one of the only ways to prevent sexually transmitted infections, so, no matter your birth control, they are the safest way to go if you are with a new sexual partner. So after all that, what have we learned? Some contraceptives do have real side-effects, and it is important to be aware of these risks so that you know what's right for you. Of course, there are legitimate concerns when starting any new medication, so if you are new to contraceptives, or considering a switch, make an appointment with your family planning clinic to discuss any concerns if you are in doubt. Information is empowerment, and contraceptives allow us a degree of control over our fertility that not all women have access to. Be sceptical of scaremongering headlines, and don't rule something out just because it didn't work for someone you know.

This article was written by Kirstin Leslie, a PhD student at the University of Glasgow. The specialist editor was Tuuli Hietamies and the copy editor was Kirsty Callan. Layout design by Katrina Wesencraft and artwork by Cully Robertson (Cullor Illustration).


LIFE SCIENCES

THE

PSYCHEDELIC RENAISSANCE

For many people a ‘psychedelic experience’ doesn’t mean much more than seeing colourful shapes and listening to Pink Floyd while taking ‘acid’ or ‘shrooms’. However, for some scientific researchers these experiences serve as a unique tool to explore human consciousness.

The word ‘psychedelic’ is derived from the Greek words ‘psyche’ and ‘delein’, meaning ‘the soul’ and ‘to reveal’, respectively. The term is used to describe substances which can trigger mind-altering experiences. Such drugs range from naturally occurring substances, such as psilocybin (the active compound of magic mushrooms), to the chemically synthesised lysergic acid diethylamide (LSD). It was precisely the synthesis of LSD, and the discovery of its psychoactive properties by Albert Hofmann in 1943, which sparked the initial interest in using psychedelic drugs in both scientific research and psychoanalysis. From the 1960s until the early 1970s, scientists explored the use of psychedelic drugs in various settings and patient groups, including those suffering from mental disorders, such as schizophrenia, or affective disorders, such as anxiety and depression. While psychedelics only worsened the psychotic symptoms (delusions, paranoia, hallucinations) of patients with schizophrenia, patients with obsessive compulsive disorder, addiction or affective disorders experienced notable improvements1. Even though the experimental standards of the time wouldn’t meet current criteria in terms of reporting, control groups and replication, psychedelics still showed promise as a treatment. This was in spite of possible side-effects such as distorted perception or the recurrence of frightening images (though these are rare and usually occur in unsupervised settings). So, what happened? Why can’t you buy psilocybin in the pharmacy?

Before researchers had the chance to further explore the action of psychedelic drugs, conducting psychedelic research became rather complicated due to the UN Convention on Psychotropic Substances in 1971. In this legally binding international agreement, psychedelic drugs were reclassified as Schedule 1 – or, in the UK, Class A – drugs: the class carrying the harshest punishments for drug possession or dealing2. As adherence is a requirement of UN membership, this limited access to psychedelic drugs internationally, affecting psychedelic research on a global scale. Several decades later, researchers still face a number of practical and financial difficulties in conducting their research. Prescription of psychedelics not only requires a licence which costs £3,000, but the drugs can only be administered in licensed institutions – and in the UK there are only four such institutions. Moreover, only one pharmaceutical company produces psilocybin of the required quality – and sells it at the rate of £100,000 per gram. The total cost of psychedelic drug research is therefore 5-10 times higher than that of research on other classes of drugs, such as heroin3. Such high costs largely limit the number of participants and control groups, restricting researchers’ freedom to explore the action of psychedelics in large trials. On top of that, because of the controversial nature of the research, most research groups depend on private funding and donations, which are not always reliable sources of income. Nevertheless, there are some exceptions.

One of the fortunate groups is the Psychedelic Research Group at Imperial College London, led by Dr Carhart-Harris. Thanks to Amanda Feilding, founder and director of the Beckley Foundation, which supports research into psychedelics, the group has received substantial funding to study the impact psychedelics have on the human brain.


With a background in psychoanalysis and psychology, Dr Carhart-Harris became interested in the possibility of using LSD and psilocybin to understand vague concepts, such as the ‘self’ and ‘consciousness’, by relating them to brain activity with the help of brain-imaging techniques. As part of his PhD research he approached Professor David J Nutt, at that time the Head of the Psychopharmacology Unit at Bristol University and currently a member of the Psychedelic Research Group at Imperial. Notably, Professor Nutt also held a position on the Advisory Council on the Misuse of Drugs, but was dismissed after presenting statistical data on the harmful effects of drug abuse, with the unfavourable conclusion that alcohol or horse riding is more harmful than taking LSD4. Over the course of their collaboration, Dr Carhart-Harris and Professor Nutt administered psilocybin to healthy volunteers. Using functional magnetic resonance imaging (fMRI), they noted significant differences in brain activity and connectivity between waking consciousness (or the resting state) and the psychedelic state. Specifically, they observed decreased activity of the default mode network (DMN) – a network


LIFE SCIENCES composed of many interacting brain regions. The DMN is thought to be involved in processes such as the perception of the self or others or thinking about past and future events, as its increased activity is commonly seen in ‘mind­wandering’. Moreover, the less coordinated activity of the DMN in the participants was associated with increased sense of ego dissolution and magical thinking (wishful thinking, paranoia, creative thinking), suggesting it plays a role in shaping and controlling the state of our normal waking consciousness. All of their observations led Dr Carhart­Harris to a new, interesting idea ­ the entropic brain hypothesis.

For those not familiar with entropy, the term is a measure of how disordered a ‘system’ is - and a system can be anything from your room, to the universe, to the brain. For example, if you are busy with uni work and your room is a complete mess, you could say your room has high entropy. Conversely, if your parents are visiting and you clean up, your room now has low entropy. Entropy can also be defined in terms of molecules and atoms, and the concept is commonly used in chemistry and physics - so, could it be applicable in neuroscience? According to Dr Carhart-Harris, some networks and connections of the DMN evolved to suppress the naturally high-entropic state of the brain, most likely to prevent humans from losing focus when they should be trying to survive, find food or reproduce. However, this suppression may also limit creativity.
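To make ‘entropy’ slightly more concrete: in the entropic brain literature, the disorder of a system is usually quantified with Shannon entropy, which measures the uncertainty over the states a system can occupy (the exact measure applied to brain data varies between studies, so take this as illustrative rather than definitive):

$$ H = -\sum_{i} p_i \log_2 p_i $$

where $p_i$ is the probability of finding the system in state $i$. A fair coin (two equally likely states) has $H = 1$ bit; a coin that always lands heads has $H = 0$. A brain that flexibly visits many different activity states scores high; one locked into a few stereotyped states scores low.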

During childhood, your brain networks are still undergoing dynamic changes due to synaptic pruning and maturation, so your overall network connectivity and brain entropy are relatively high (which is why some forms of daydreaming and ‘magical thinking’ are deemed acceptable, or even cute). With increasing age, the entropy of your brain decreases as a consequence of this reduced connectivity. So, how can you access this high-entropic state again, being the boring, non-magical adult you are? As you might have guessed, the answer is psychedelic drugs. Although it is still not entirely clear how psychedelics produce their effects, we do know that at the cellular level they bind to specific types of neurons, making them more excitable and thus more likely to communicate with other neurons. Interestingly, psychedelics activate these neurons differently depending on their location in the brain. Although the theory is still in its infancy, and we don’t yet have a specific ‘biomarker’ of entropy in the brain, it seems to make sense5. You could even stretch the theory a little further and ask whether people with mental disorders, such as schizophrenia, had their networks affected during development, leaving their brains with more entropy than they should have - potentially explaining how hallucinations and other dysfunctions arise. Which brings me to another area of psychedelic research - the treatment of depression.

Although some advances have been made in the treatment of depression, the options are far from ideal, with many patients experiencing side-effects or not responding at all. After other research groups explored treating anxiety in late-stage cancer patients with MDMA or psilocybin, with enough success to launch trials6, the Psychedelic Research Group applied a similar approach to treatment-resistant depression (TRD). The so-called ‘psilocybin with psychological support’ (PwPS) approach involves giving patients a low dose of psilocybin under controlled conditions, in the company of an experienced clinical psychologist and the principal investigator. Throughout the psychedelic experience, they monitor the patients to reduce the chance of a ‘bad trip’ (surprisingly, even patients who experienced a bad trip still reported decreased depressive symptoms) and to help them explore the root of their depression. The whole experience thus resembles an intense version of a psychoanalytic therapy session, with psilocybin acting as a ‘catalyst’ for recovery. So far, the group has achieved remarkable results: many of their patients have reported an alleviation of all depressive symptoms for up to six months after a single session, describing the experience as a ‘reset’7.

While researchers still don’t know the mechanisms behind these dramatic improvements, the findings are well worth further investigation: psychedelic research could potentially improve the lives of millions of people suffering from depression. As with any new scientific finding, it’s important to be cautious - the group puts a lot of emphasis on the psychological support given to the patients (which alone might account for much of the improvement) and does not advise people to self-medicate. Psychedelic drugs can be harmful, no one denies that, but why not allow scientists access to such a potentially powerful resource? What else could psychedelics teach us? Hopefully, we’ll find out by the end of the psychedelic renaissance.

This article was written by Katarina Moravkov, a neuroscience student at the University of Glasgow. It was specialist edited by Miruna Costreie and copy edited by Kirsten Woollcott. Layout design and artwork by Roxanna Munir.


LIFE SCIENCES

PETase: THE EARTH'S SPRING CLEAN

Can society live without plastic? It’s a question that remains difficult to answer, because plastic has become so ingrained in the way we live. Toothpaste caps, plastic straws, cups, clothes, plugs, sandwich bags… look around and it’s difficult to find something that doesn’t contain any plastic - go on, try it. We can, and should, be impressed by the sheer versatility of this material and how it has improved the way we live. Considering that the mass production of plastics only truly began in the 1940s and ’50s, the impact it has had on the modern world is quite remarkable1. However, as Newton’s Third Law tells us, every action has an equal and opposite reaction. In this case, the convenience that plastic has brought to packaging, products and electronics comes at a cost - to us and to everything else we share the planet with. With plastic becoming the hot topic of environmental conversation, it finally feels as if the magnitude of the problem is hitting home.

We are a wasteful generation, and our waste is becoming the natural world’s poison. Scientists have quantified the effects of plastic on the animal kingdom: a 2015 study found that at least 690 species had encountered marine debris, 92% of which was related to plastic waste2. In another study, UK researchers examined 34 species of seabird - surveying colonies in Northern Europe, Russia, Scandinavia, Greenland, the Faroes and Iceland - and discovered that 74% had ingested plastic3. Further investigation revealed that algae growing on the plastic secrete a compound that smells like krill, the prey these birds typically consume. The birds starve to death, feeling full while getting no nutrition from the waste they swallow. It’s affecting us too, in many different ways. Rivers in Indonesia are blocked by ‘plastic-bergs’, preventing transport and essential fresh water from reaching towns; even the army has been called in to help with the problem4. Closer to home, plastic is building up in UK recycling plants as China bans imports, leaving councils struggling with a problem that is not going to disappear anytime soon without drastic changes to waste policy.

Plastics are made up of long molecules of repeating units, called polymers - in essence, the DNA of plastic. The polymer chains bind to one another, increasing the strength of the plastic and giving it the properties that make it so useful to us. Man-made polymers, and the plastics built from them, start with the fractional distillation of crude oil; the fraction used for manufacturing plastic is naphtha. Polyethylene terephthalate (PET), one of the most commonly produced plastics, is made by reacting two chemicals derived from the naphtha fraction: terephthalic acid and ethylene glycol. Polymers also exist in nature - in plant cellulose and starch, and in the proteins in our food. Unfortunately, the biggest difference between naturally occurring polymers and synthetic ones is their biodegradability.
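For the chemically inclined, the reaction is a condensation polymerisation: each acid group bonds to an alcohol group, releasing a water molecule. A simplified overall scheme (drawn from standard textbook chemistry rather than from the article’s sources):

$$ n\,\mathrm{HOOC{-}C_6H_4{-}COOH} + n\,\mathrm{HO{-}CH_2CH_2{-}OH} \longrightarrow \left[\mathrm{{-}OC{-}C_6H_4{-}CO{-}O{-}CH_2CH_2{-}O{-}}\right]_n + 2n\,\mathrm{H_2O} $$

It is these ester linkages, repeated along the chain, that the enzyme introduced below knows how to cut.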



The latest from the frontline of ‘The War on Plastic’ is that the UK throws away 8.5 billion plastic straws a year. With the average plastic straw weighing 0.2 – 0.5 grams, that works out at roughly 1,700 – 4,250 tonnes of plastic straws annually. It sounds substantial, and it is exceptionally wasteful, but in the grand scheme of the problem we face this single battle will certainly not win the war. Historically, 5.2 billion tonnes of plastic have been discarded as waste. That is the true scale of the problem. To put it in perspective, the combined weight of every human on Earth is about 316 million tonnes, so we have thrown away more than 16 times our own weight in plastic over the years5. With this in mind, assuming we are able to overcome the wily foe that is plastic straws, what do we do with the other 5.2 billion tonnes, give or take, that will still exist as waste at the end of the year?
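As a quick sanity check on those figures (using the per-straw weights quoted above; the exact total depends on which weight you assume):

$$ 8.5\times10^{9}\ \text{straws} \times (0.2\text{ – }0.5\ \mathrm{g}) \approx 1.7\text{ – }4.25\times10^{9}\ \mathrm{g} = 1{,}700\text{ – }4{,}250\ \text{tonnes} $$

$$ \frac{5.2\times10^{9}\ \text{tonnes of plastic waste}}{3.16\times10^{8}\ \text{tonnes of humanity}} \approx 16.5 $$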

Reduce, Reuse, Recycle – you have probably heard this a lot in the last 10 years, but none of it suggests that converting plastic back to its raw materials is an option. Fear not: scientists have stumbled upon exactly that process for turning PET back into terephthalic acid and ethylene glycol. Introducing Ideonella sakaiensis, a bacterium discovered by a team of microbiologists from Kyoto Institute of Technology and Keio University6. The bacterium, found in the wastewater in and around recycling sites in Japan, has the uncanny ability to consume PET. It uses two separate enzymes, PETase and MHETase, to break the plastic down into an energy source for its growth.
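Sketched as a reaction pathway (the intermediate, MHET - mono(2-hydroxyethyl) terephthalate - is not named in the article, but this two-step breakdown is how the discovery was originally described):

$$ \mathrm{PET} \xrightarrow{\ \text{PETase}\ } \mathrm{MHET} \xrightarrow{\ \text{MHETase}\ } \text{terephthalic acid} + \text{ethylene glycol} $$

In other words, one enzyme snips the polymer into bite-sized pieces, and the second digests those pieces into the two raw materials PET was made from in the first place.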

Something so curious deserved further investigation, and a team from the University of Portsmouth and the US Department of Energy’s National Renewable Energy Laboratory collaborated on follow-up research. While working out the structure of the enzymes, they accidentally engineered an enzyme even more efficient than the one nature had already provided. This mutant PETase has an added appetite for an additional plastic, PEF7. The hope is that by continuing to optimise the protein as far as possible, the millions of tonnes of PET waste already in our environment can, over time, be broken back down into its raw materials. This would allow legitimate recycling of the very products required to manufacture PET in the first place, and prevent the growing risk of microplastics building up in our environment. Currently, only PET and PEF have been identified as ‘consumable’ plastics, but through the continued study of these fascinating proteins and enzymes - those that exist, and those yet to come - scientists hope to identify pathways to break down all the plastic waste on our planet. Years into the future, we could be looking back at this discovery, made from sludge at a recycling centre, as one of the most important in the fight against plastic waste. The next stage is to investigate whether an industrial process can be created at a sufficient and efficient scale to overcome the surplus of plastic we create and use at the moment. What would it look like? Would countries even buy into such a solution? The only certainty is that we must reduce the amount of plastic we currently generate. As fossil fuel resources become ever scarcer, and the plastic waste pile continues to grow, the potentially irreversible damage they cause becomes ever more evident.

This article was written by Callum Maxwell. It was specialist edited by Chirsty McFadyen and copy edited by Kirsty Callan. Layout design by Katrina Wesencraft.


NEWS

WHAT'S NEW WITH THEGIST?

It's been a busy six months since our last issue. We've had to expand our board to handle the sheer volume of submissions we've been getting - a nice problem to have! We have also welcomed some new board members to take over our social media and to support our snippets, submissions, and multimedia teams. Some board members will be stepping down this term as they move on to bigger and better adventures, so let us know if you'd like to get more involved with the day-to-day running of theGIST. As always, no experience necessary! Just drop an email to editor@the-GIST.org

Having been run jointly by students at the universities of Glasgow and Strathclyde for a number of years, we are excited to finally launch theGIST at Glasgow Caledonian University! As we continue to grow and expand, reaching out to GCU seemed like the obvious next step. We can now access a whole new pool of SciComm enthusiasts and look forward to welcoming them to the fold. A big shout-out goes to Jamie Tarlton and Roxanna Munir for being our first GCU contributors!

In April, we attended the Student Publication Association National Conference and awards ceremony. We were nominated for a total of five awards and were absolutely thrilled to win 'Best Specialist Publication' for the second time! This is a credit to all the hard-working members of our board, but most importantly to our wonderful contributors.

Over the coming months we will be providing media coverage for a range of science events around Glasgow. Keep an eye out for us at the Science Centre and at Glasgow Skeptics meetings. Get in touch if you'd like to help out or if you'd like us to cover one of your events!

Every year it is a struggle to ensure the financial security of our beloved print magazine, and we cannot thank Anna Duncan enough for the hours she has spent filling in grant applications and seeking sponsorship. Without her, this issue would not have come to fruition. We'd like to thank the Union at the University of Strathclyde for providing financial support and advice for the future, as well as the university's Alumni Fund. We're also grateful to the University of Glasgow MVLS for the grant from the Excellence With Impact Fund.

MEET THE TEAM


