YOUR AD HERE
BlueSci is published at the start of each university term and 3000 copies are circulated to all Cambridge University colleges and departments, and to our paid subscribers. However, this doesn’t come cheap, and we rely on our partners and paid subscribers to keep us going.
We are also seeking new partnerships. If you would like to sponsor a student-run society teaching key science communications skills with reach across Cambridge University and beyond, we would love to hear from you!
Contact finance@bluesci.co.uk to subscribe or to secure your advertising space in the next issue.
An Interview with Dr Catherine Galloway
Merissa
Why Plant-Based Just Makes Sense
Pauline Kerekes
How the Electricity Grid Is Susceptible to Global Hazards and What Can Be Done to Improve It
Lizzie Knight
BlueSci was established in 2004 to provide a student forum for science communication. As the longest running science magazine in Cambridge, BlueSci publishes the best science writing from across the University each term. We combine high quality writing with stunning images to provide fascinating yet accessible science to everyone. But BlueSci does not stop there. At www.bluesci.co.uk, we have extra articles, regular news stories, podcasts and science films to inform and entertain between print issues. Produced entirely by members of the University, the diversity of expertise and talent combine to produce a unique science experience.
President: Adiyant Lamba..............................................................................president@bluesci.co.uk
Managing Editor: Rachel McKeown..............................................managing-editor@bluesci.co.uk
Secretary: Adam Dray...................................................................................enquiries@bluesci.co.uk
Finance Officers: Amelie Lam, Katie O’Flaherty...........................................finance@bluesci.co.uk
Subject Editors: Bethan Charles, Elizabeth English.............................subject-editor@bluesci.co.uk
Podcast Editor: Georgia Nixon......................................................................podcast@bluesci.co.uk
News Editor: Yan-Yi Lee....................................................................................news@bluesci.co.uk
Webmaster: Clifford Sia............................................................................webmaster@bluesci.co.uk
Social Media and Publicity Officer: Andrew Smith...........................communications@bluesci.co.uk
Social Secretary: Merissa Hickman.....................................................communications@bluesci.co.uk
Art Editor: Pauline Kerekes........................................................................art-editor@bluesci.co.uk
Issue 56: Lent 2023
Issue Editor: Merissa Elizabeth Hickman
Managing Editor: Rachel McKeown
First Editors: Andrew Smith, Bartek Witek, Bridget Eburne, Daniel Lim, Devahuti Chaliha, Ella Plevin, Laura Chilver, Lauren Lee, Merissa Elizabeth Hickman, Mia Wroe, Raina Jia, Roberta Cacioppo, Shuyan Zhang, Tee Lee
Second Editors: Andrew Smith, Bridget Eburne, Daniel Lim, Devahuti Chaliha, Ella Plevin, Laura Chilver, Lauren Lee, Merissa Elizabeth Hickman, Raina Jia, Roberta Cacioppo, Shuyan Zhang, Tee Lee
Art Editor: Pauline Kerekes
News Team: Yan-Yi Lee, Anne Thomas, Chisom Ifeobu
Reviews: Libby Brown, Laura Chilver, Raina Jia
Feature Writers: Merissa Elizabeth Hickman, Konstanze Schichl, William Smith, Hayoung Choi, Anna Pujol, Emily Birt, Lizzie Knight, Xaviour Wang, Sheryas Iyer
FOCUS Writers: Merissa Elizabeth Hickman, Rachel Duke
Pavilion: Pauline Kerekes
Weird and Wonderful: Libby Brown, Holly Smith, Tasmin Wood
Production Team: Merissa Elizabeth Hickman, Rachel McKeown, Adiyant Lamba
Copy Editors: Andrew Smith, Bridget Eburne, Laura Chilver, Maddie McGinnis, Merissa Elizabeth Hickman, Tee Lee
Illustrators: Caroline Reid, Sumit Sen, Pauline Kerekes, Biliana Todorova, Barbara Neto-Bradley, Rosanna Rann, Marida Ianni-Ravn, Josh Langfield
Cover Image: Sumit Sen
Science and Society
Scientific advancements have the power to change our everyday lives, yet they raise concerns amongst society itself. Many scientific innovations are deeply embedded in our everyday lives and inadvertently affect us as individuals: from worldwide pandemics to the financial economy, science has a major influence over society. Despite this, there seem to be major errors in the way science is currently communicated to the general public. In the age of social media, the public are often misled, which creates a crisis of trust between scientists and society.
Now, more than ever, it is vital to achieve a greater integration of science and society. This issue of BlueSci intends to explore recent, and potentially the most ethically challenging, scientific advancements to prompt our readers to engage in the world of science. We will investigate the impact of recent advancements on society and everyday individuals, as well as the ethics of their potential applications.
Starting with the broader point of view, with the help of Dr Catherine Galloway from the Kavli Centre for Ethics, Science, and the Public, I explore how the new centre pursues the admirable aim of creating social change by involving and engaging the public in pioneering ways. Opening up the ethical debate, Konstanze Schichl then discusses the immortalised HeLa cell line and how it has shaped science and society. William Smith then summarises the benefits of a plant-based diet and the national response from university unions rallying support for it. Continuing the discussion of our diets, Hayoung Choi shares the journey of genetic modification in food crops.
The FOCUS piece homes in on perhaps the most ambitious healthcare initiatives of our time. Rachel Duke and I explore the promises and impact of genomic sequencing, from the first complete human genome assembly to the latest Newborn Genomes Programme, which has given rise to a nationwide conflict of opinions.
Moving on to our Pavilion piece, Pauline Kerekes interviews Keira Tucker from Ascus, exploring a platform where art and science can meet.
Emily Birt then discusses the clinical applications of Botox and its potential uses for mental health. The next part of the issue moves towards the physical nature of scientific advancements. Lizzie Knight considers how the electricity grid is susceptible to global hazards and describes how we can improve its resilience. Xaviour Wang then goes on to discuss the promises and challenges of fusion energy. To finish off our features, Sheryas Iyer introduces the remarkable prospect of a quantum internet.
Science communication and engagement are key to integrating science and society and to rebuilding trust between scientists and the public. It is our hope that by exploring recent advancements spanning a broad range of scientific fields, you, the reader, will be inspired to engage further with science, even if it initially seems inaccessible. We hope that you are encouraged to share your thoughts, ideas, and concerns regarding the ethical implications of scientific advancements with your friends, family, and colleagues.
This work is licensed under the Creative Commons
Attribution-NonCommercial-NoDerivs 3.0 Unported
License (unless marked by a ©, in which case the copyright remains with the original rights holder). To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-nd/3.0/ or send a letter to Creative Commons, 444 Castro Street, Suite 900, Mountain View, California, 94041, USA.
Merissa Elizabeth Hickman
Issue Editor #56

On the Cover
The illustration on the cover is my way of celebrating the scientific advancements that we have achieved in past decades. It depicts recent advancements in diverse fields of science, from genetics and neurobiology to quantum computing, and their impact on society and human life. All of these have been manifested in different abstract art forms.
Focusing on the centre, I have used geometric abstraction to showcase how genome sequencing has progressed our understanding of human biology, presented here by simple vector art of the Vitruvian Man. This has been further extended in the illustration for the FOCUS article. To the left, the asymmetric mandala-like art exhibits the origin of HeLa cells and how Henrietta Lacks has been immortalised through her immense contribution to the scientific community. On the right, I have made a simple illustration of how neuronal cells are cultured on a plastic dish to mimic the brain and study how it functions.
At the back, I have tried to mix abstract expressionism and geometric art styles to show how quantum computation works: the overlapping concentric rings on two edges depict quantum properties like superposition and entanglement, which are utilised in quantum computing to send qubits, shown here as solid circles and rings pointing in different directions. I have tried to assemble all of these in a comprehensive yet aesthetic manner on the cover of the magazine.
Sumit Sen

Access to Electricity: A Growing Issue in 2023 and Beyond
This year will be the first year in over two decades that there is a global increase in the number of people without access to electricity. Access to electricity is crucial as it directly affects the living standard of individuals as well as the economic development of nations. In the global quest to ensure a sustainable future for all, universal access to electricity must be achieved.
To do this, there needs to be a full understanding of what is meant by access to electricity. One might be tempted to assume that having electricity access means having a source of electricity to power basic appliances. However, the current threshold for electricity access is having enough electricity to power a lightbulb, a mobile phone, and a radio for four hours per day. This is considered ‘Tier 1’ access according to the Multi-Tier Framework (MTF) approach to electrification specified by the World Bank. The framework specifies four higher tiers with increasing consumption levels. This classification could inform more strategic planning for access to electricity, especially at a time when resources are worryingly scarce. CI
Original article: https://www.iea.org/commentaries/for-the-first-time-in-decades-the-number-of-people-without-access-to-electricity-is-set-to-increase-in-2022
Uplifting Plants: Hebes in New Zealand’s Southern Alps
Mountains all over the world are known as biodiversity hotspots. New Zealand’s largest group of endemic plants (those found nowhere else), flowering shrubs called hebes (genus Veronica), has over 120 species. Most species live in mountain habitats in New Zealand’s Southern Alps. DNA evidence suggests the group is only around 6 million years old — relatively young on an evolutionary timescale — but hebes have surprisingly diverse forms. They range from small trees with long, narrow leaves, to dense shrubs, to cushion plants that only grow in the high alpine zone. Can their preferred mountain habitats explain how hebes evolved so much diversity in so little time?
Although the estimated origins of the hebes predate the uplifting of the Southern Alps (also quite young mountains), models simulating how species may have evolved and migrated over time suggest that once uplifting began, a few ancestral lowland species colonised the growing mountains. Once there, populations encountered barriers created by rocky cliffs and fluctuating glaciers, as well as opportunities to adapt to new alpine environments by taking on new forms. Forces like this may have encouraged the burst of diversification that gave rise to the species living there today. But without a time machine, we’ll never know for sure exactly what went on in the Southern Alps during those millions of years! AT
Rapidx: A Rapid Disease Testing Startup at Cambridge University Using Plasmonics
At least 1.4 billion people around the globe have been diagnosed with microbial infections such as urinary tract infections (UTIs), sexually transmitted diseases (STDs), and chronic respiratory diseases, among whom millions die every year. In the two minutes you spend reading this article, around fifty deaths will have occurred due to late diagnosis. A Cambridge-based startup — Rapidx — is aiming to change this.
The venture was initiated by Nipun Sawhney, a Cambridge PhD candidate in Physics with experience volunteering in epidemiology in India during the COVID-19 pandemic, along with Dr Shuler Xu, a medical doctor trained at University College London. While polymerase chain reaction (PCR) testing itself is by no means novel, the team adopts plasmonics — a photonics-inspired technology using lasers that allows the rapid heating and cooling of DNA — to detect diseases more accurately. This eliminates the need for the external heating used in current technology, such as Peltier heating. Plasmonic heating also allows PCR tests to be performed within a much shorter window, without compromising sensitivity or specificity. The project is currently in its prototyping stage, and is between a year and a half and three years away from testing with the public. YL
Original article: https://link.springer.com/article/10.1007/s00345-021-03913-0
Check out our website at www.bluesci.co.uk, our Facebook page, or @BlueSci on Twitter for regular science news and updates
Reviews
Coughs and Sneezes or Attention Seekers?
Have you ever heard a ‘snough’?
Gorillas at Zoo Atlanta have been seen to open their mouths wide and let out a loud theatrical noise, somewhere between a sneeze and a cough. In a recent study, anthropologist Roberta Salmi recorded the response of eight gorillas in three different settings to observe this phenomenon. The first setting, a bucket of grapes outside the enclosure; the second, a keeper in place of the grapes; and the third, both the keeper and the grapes. The gorillas ‘snoughed’ most when both the keeper and grapes were present. “Coughing and sneezing are signs of a cold, which are signals that caregivers pay specific attention to”, says Salmi, who leads the primate behavioural ecology lab at the University of Georgia. In this instance, the gorillas seem to have noticed that a ‘snough’ in particular grabs the attention of the keeper.
‘Snoughing’ has never been observed in the wild, and this is the first time that ‘complex vocal learning’, where novel sounds are learnt through imitation, has been observed in captive gorillas. Significantly, if a greater understanding of primate communication can be gained, it might help to explain how human language emerged in the first place. LB
Birdsongs alleviate anxiety and paranoia in healthy participants
Anthropogenic noise pollution in urban environments has been shown to mask bird signals and create selection pressures driving the evolution of birdsong. Over the last couple of decades, multiple studies have shown that urbanisation negatively impacts mental health, with symptoms such as increased depression, paranoia, and schizophrenia. Similarly, many studies have shown the mental and physical health benefits correlated with access to nature and rural environments. However, little attention has been paid to the effects of sound.
A recent study by Stobbe et al. looked at how birdsong can affect those in urban environments filled with noise pollution. They found that birdsong can improve our mental health, and that these effects are stronger in areas with more severe noise pollution. In this study, subjects listened to recordings of high- and low-traffic soundscapes with different vehicle diversity, as well as high and low birdsong soundscapes with varying diversity. Participants’ psychosis liability, cognition, mood, and paranoid symptoms were measured with several standardised scale indices before and after listening to each soundscape. Results showed that although cognition did not appear to be affected by the stimuli, noise pollution had a negative effect and birdsong had an exclusively positive effect on the other studied aspects. LC
Original article: Stobbe, E., Sundermann, J., Ascone, L. et al. Birdsongs alleviate anxiety and paranoia in healthy participants. Sci Rep 12, 16414 (2022). https://doi.org/10.1038/s41598-022-20841
Modifying the genetics of humanity? What we do not know is what matters
Would you like your baby to be genetically modified? With in vitro fertilisation (IVF) and gene-editing technologies now widely known, the prospect of being able to select the genetic makeup of our future generations is looming. If you wonder what choice you might make, a new book, “The End of Genetics: Designing Humanity’s DNA” by the renowned geneticist David Goldstein, may offer some clues.
When faced with decisions concerning genetic modification in humans, mixed feelings may arise depending on the context: should we eliminate known genetic variations that cause fatal or debilitating diseases? Many parents may be inclined to say yes. However, thoughts and emotions can get complicated when it comes to non-life-threatening characteristics, such as height and intelligence, and this raises many more questions than answers. Would we create a healthier humanity or relapse into the horrors of eugenics prevalent merely a few decades ago? Would there still be a sense of biological belonging to a family if parents decide not to have their children inherit their genetic quirks? After all, do we really know the potential consequences of applying reproductive genome editing en masse?
By taking readers through the history of human genetics – from the realisation of heredity as a concept, all the way up to the latest advancements in reproductive gene editing technology – Goldstein lays out what we do and do not yet know about the ways in which genetic variations in the population may affect our lives and well-being. Goldstein stresses that for better or worse, as with any technological progress in human history, what can be done will be done; so now is the time for scientists to inform the public more about the uncertainty and less about the hype around this potential avenue of reproductive genome design. RJ
An Interview with Dr Catherine Galloway: Translation and Innovation Lead at the Kavli Centre for Ethics, Science, and the Public
Merissa Hickman explores the pioneering ways that the Kavli Centre for Ethics, Science and the Public involves and engages the general public in science
The Kavli Centre for Ethics, Science, and the Public (KCESP) was launched in December 2021 and is a unique collaboration between the Faculty of Education at the University of Cambridge and Wellcome Connecting Science, funded by The Kavli Foundation. Prof Anna Middleton (director), Dr Richard Milne (deputy director), and Dr Catherine Galloway (translation and innovation lead) head the centre.
I had the pleasure of interviewing Dr Galloway regarding her role at Kavli and exploring the goals for the centre. The KCESP focuses on taking ‘the scary out of science’, says Galloway. It aims to invite the public into conversations and discussions around the innovations which shape our futures. The centre pursues the admirable aim of creating social change through involving and engaging the public in pioneering ways, by connecting them to scientists working in the areas of genetics, big data, and AI.
Q: What is your role at the KCESP?
A: So, the short description of what I do is the Department of Crazy Ideas. I'm the only non-scientist in the centre. The ability to engage with all sorts of publics, and make potentially difficult things understandable and accessible, are what they [the Kavli Foundation, our funders] were interested in. So the translation part is taking the cutting-edge science and translating it into a format, or a variety of formats, that are accessible for people who are non-scientists. The innovation part is the department of crazy ideas.
Galloway believes that her role at the centre is being ‘the voice of the general public’. She is particularly interested in ‘solutions journalism’, an evidence-based mode of reporting on responses to social problems, a model she wishes to implement at the Kavli Centre. A recent debut of one of Dr Galloway’s ideas took place at the Cambridge Alumni Festival: the Hopes and Fears Lab, run by the KCESP on 24 September 2022. This gave alumni the chance to sit down with scientists working on cutting-edge developments, to hear what excites them and what makes them stop and think. It also provided an opportunity to share the public’s hopes and fears in return. Dr Galloway reported that the event was a ‘conversation experiment’ which resulted in excellent communication between members of the public and scientists, which was ‘magical’ to see.
Q: What are the biggest challenges the KCESP faces?
A: At a moment when scientists and the public face so many pressing and urgent concerns, our biggest challenge is to show how our key questions about science and society are also relevant and need to be addressed now, for the benefit of us all, whether in the context of future medicine, tech development, or food and farming. We are asking people to consider, right now, 'Is this the type of science we want? Is this creating a world that we want?' And we want to do this in ways that are as cutting edge as the science itself. Doing things very differently to how they have been done before is challenging. But it's also what makes the work really exciting.
The centre recognises that, given current issues in science such as the spread of misinformation, science involvement needs to take place ‘where people are’. Galloway boldly states that ‘Science communication as it stands isn't working for people who aren’t walking towards the science already; more and more people are saying they feel left out, disengaged.’ The KCESP takes necessary action to proactively engage the public in non-traditional ways. It aims to work with people outside the scientific community, to make science relevant, appealing, and accessible to the public. Galloway wants to develop ‘creative partnerships’ with those outside the science world. An exciting project of the KCESP, called ‘Only Human’, is a ‘socialising the genome’ project. Galloway revealed that this is a creative partnership with ‘top filmmakers’, advertising agencies, and musicians. The centre realises the value of using these renowned storytellers, and the important role they will play in moving large groups of people - the public - towards genetics, big data, and AI.
"sCienCe CoMMuniCaTion, as iT sTands, isn'T WorKinG for PeoPle Who aren'T alreadY WalKinG ToWards The sCienCe... More and More PeoPle are saYinG TheY feel lefT ouT, disenGaGed."
Q: How do you intend to reach groups which are often excluded from science engagement, such as those with low socio-economic status, and ethnic minorities?
A: The methods we are most excited about using aren't conventionally part of the science engagement repertoire. Instead, we're building on partnerships with community groups in all sorts of different areas, finding out what matters to them and then working on a way to build science into that. Our director, Anna Middleton, has a background in genetic counselling, and this idea of listening as much as talking - and using dialogue-based activities rather than a straight 'lecture' style - is very much part of how we work. And finally, being located at the Faculty of Education is especially helpful as colleagues there have done a lot of fascinating work on how to reach disengaged or disadvantaged groups. We are rooted in the belief that a global conversation about science and society needs everyone, and everyone has a right to be part of it. Or, as Jesse Jackson says, "When everyone is included, everyone wins." We have that up in the office.
Q: What areas of science, other than genetics, big data, and AI, do you plan to expand to?
A: That's going to depend, in part, on what the public tells us they want. Dr Richard Milne is going to do his big global survey, which is currently being called the Kavli Global Survey on Public Attitudes to Science. If that is coming back really strongly [saying] we feel very disengaged or disempowered in this certain area [and] we're very worried by it, that is where we'll push.
The centre does not intend to be ‘limited’ by its ‘three big areas’, Galloway states. The KCESP plans to create purposeful connections between scientists and the public at early stages of research. This differs from the traditional scientific journey, in which the science is often shared with the public only when results are known, at the end of the journey.
Q: How can scientists here at Cambridge engage with the KCESP?
A: Get in touch! Our door is always open for anyone, at any level, interested in working with us to reflect on the implications of their own science, develop new skills, and think creatively about the future we are building and about how to connect with the public in a way that is beneficial to all sides.
The Kavli Centre for Ethics, Science, and the Public is eager for engagement from scientists and the public alike. The work of the centre is proving essential in a world overwhelmed by constantly developing science. Scientific advancements have the power to change our everyday lives, yet they raise concerns amongst society itself. The centre ambitiously tackles this by creating a science-society bridge, creatively answering how to bring global public audiences into discussions on scientific breakthroughs and taking a multidisciplinary approach to the ethical issues raised by cutting-edge science. For more information, exciting opportunities, and updates, find the centre at https://kcesp.ac.uk
HeLa: The Immortal Cell Line Shaping Science and Society
Konstanze Schichl is a first-year PhD student studying human papillomavirus and cervical cancer at Peterhouse, University of Cambridge. Konstanze is particularly interested in gender equality in research.
THE HISTORY OF THE HELA CELL LINE
In 1951 at Johns Hopkins Hospital in Maryland, USA, a 31-year-old African American woman named Henrietta Lacks was diagnosed with and treated for cervical adenocarcinoma. In the first half of the 20th century, cervical cancer was the fourth deadliest type of cancer in women, accounting for almost 14% of female cancer deaths in the USA. Dr George Otto Gey, a cell biologist from Johns Hopkins Hospital, started growing cells from her cervical tumour in a laboratory mere months before Lacks passed away, leaving behind five children. He named the isolated cells HeLa, after Henrietta Lacks.
THE UNIQUE PROPERTIES OF HELA CELLS
Replicability is the foundation of scientific studies as it enables verification. The use of HeLa cells in biomedical research allowed researchers to replicate experiments using genetically identical cells anywhere, anytime. This eliminates potential unknown effects arising from genetic differences. HeLa cells are an immortalised cell line: a cell population isolated from a multicellular organism that can divide indefinitely in laboratory environments outside of the organism. Normally, animal cells can only grow in their native environments, with appropriate intercellular contacts and signalling from neighbouring cells, orchestrated by self-regulating networks as part of a whole organism. Such mechanisms eliminate mutated cells to prevent them from uncontrollably proliferating and causing cancer in animals. Typically, the number of divisions from a cell is also limited
by a consumable resource — telomeres, which gradually deplete and lead to cell death. Telomeres are repetitive DNA sequences protecting the structural integrity of the ends of the chromosomes, which shorten with every cell cycle. However, a series of mutations enabled HeLa cells to evade these regulations and grow independently from a human body for prolonged periods while constantly regenerating telomeres. In fact, this inspired the modern approach to creating immortal cell lines from any isolated cells by telomere regeneration.
Fun fact: Evolutionary biologist Leigh Van Valen proposed in his article “HeLa, a New Microbial Species” to classify HeLa cells as a new organism because of their non-human chromosome number and their ability to replicate without a host.
HOW HELA CELLS SHAPED SCIENCE
HeLa is the oldest immortal cell line and is widely used in cancer, virology, and genetics research. The cell line was used in developing the polio and COVID-19 vaccines, as well as in the study of mumps, measles, Ebola, and HIV. HeLa cells were the first human cells to be cloned and were even sent to space to explore the effects of radiation on astronauts. More than 110,000 scientific studies using HeLa cells have been published, with a yearly increase of over 6,000 new articles in 2015 and 2016, according to data compiled by the National Institutes of Health.
Konstanze Schichl discusses the legacy of Henrietta Lacks with a focus on the history and modern-day implications of the use of the HeLa cell line
Illustration by Caroline Walker
The HeLa cell line was essential in the studies linking cervical cancer to human papillomavirus (HPV) infection, leading to Harald zur Hausen winning the Nobel Prize in Physiology or Medicine in 2008. He was one of the first to link certain types of human cancer to viral infections. This discovery paved the way for the development of the HPV vaccine Gardasil, which was FDA approved in 2006. This was followed by Cervarix and Gardasil 9, which have since decreased cervical cancer incidence by almost 90% in women vaccinated under the age of 13, according to a study funded by Cancer Research UK. However, the characteristics of HeLa cells are not always helpful to science. The accumulation of genetic mutations means that HeLa cells are very different from normal human cells. For example, their genetic material comprises over 80 abnormally structured chromosomes, instead of the standard 46. Thus, inferring knowledge about the human body from HeLa cell experiments can be controversial. Additionally, contamination of other cell culture experiments by HeLa cells is very common because of their ubiquitous and highly proliferative nature. According to a report by Science (2015), over 7,000 peer-reviewed publications had unknowingly used other human cells contaminated by HeLa.
Not-so-fun fact: The contamination of other samples by HeLa cells contributed to increased tensions between the USA and the USSR during the Cold War, as the USSR scientists thought that internationally exchanged samples were deliberately contaminated.
HOW HELA SHAPED SOCIETY
HeLa cells helped make important advances in biomedical research, leading to tremendous societal benefits. For example, with the development of the Gardasil vaccine in combination with regular cervical screenings, the World Health Organisation (WHO) set a goal towards the elimination of cervical cancer. Canfell et al. predicted that the triple-intervention strategy by the WHO, including upscaling cervical cancer treatment, vaccination, and screening, will reduce the mortality of the disease by 99% by 2120.
However, the story of Henrietta Lacks has sparked controversy in recent years, as new discussions have opened about ethics in scientific research. We can only speculate whether Lacks’ sex and race were reasons for her exploitation. Although it was not common practice to ask patients for consent when extracting samples in the early 20th century, African American individuals were exploited especially brutally for scientific research. From the 1930s to the 1970s, the Tuskegee Syphilis Study was carried out on over 400 African American men with syphilis, intentionally leaving them untreated to study the disease’s effects, leading to over 100 deaths.
Equally controversial is the ongoing lawsuit the Lacks family filed in October 2021 against Thermo Fisher, one of the most prominent and rapidly growing biotechnology companies, with an annual revenue of $44 billion in 2022. The Lacks family are seeking compensation in royalties from the company for profiting from the commercialisation of HeLa cells decades after the consent issue was first raised. While there is widespread support for the Lacks family, others dismiss the claims on the grounds that the time limit for initiating legal proceedings has passed. Despite the controversy, the Lacks family are still fighting for more ethical medical research.
ETHICS OF USING HUMAN SAMPLES IN SCIENTIFIC RESEARCH
In the USA, human research subjects are protected by the Common Rule, adopted in 1991, which states that voluntary, informed consent and the protection of vulnerable groups are essential for ethical research. Under today’s jurisdiction, Lacks would be protected as a member of a vulnerable group and research using her cells could only be done with her consent. This law does not, however, confer protection retrospectively: since Lacks’ death, HeLa cells have been excluded from the definition of a human subject. My work as an HPV researcher involves handling human samples and practising the relevant ethical code. In England, Wales, and Northern Ireland, all research involving the use, removal, storage, or disposal of human bodies, organs, and tissue must follow the regulations of the Human Tissue Act 2004. Human cells and anything containing genetic material are classified as ‘Relevant Material’. Legal and ethical complications arise because many research samples are collected in other jurisdictions. For instance, my research uses human samples collected in South Africa but analysed in the UK. As the two countries have different ethical codes and legal regulations surrounding human sample collection and handling, does this work remain ethical as long as the procedures conducted in each jurisdiction follow local ethical codes, even if they violate the code of another participating jurisdiction? While there are regulations in place for international collaborations, the processes can be long and poorly understood. Do legal regulations equate to ethics? On a broader level, does societal benefit ever outweigh the personal welfare of vulnerable patients, and is it even valid to weigh objective benefits to society against subjective human experience? Such contemplations may never have definite answers but are worth pondering for both the researcher and the everyday person who benefits from debatable research practices.
Lacks’ family wanted her legacy to live on, with a focus on the societal progress that arose from the creation of the HeLa cell line. Her legacy lives on through articles, statues, and even New York Times bestselling books. We should thank her, unfortunately only posthumously, for shaping the scientific and ethical progress of modern medical research.
Why Plant-Based Just Makes Sense
William Smith presents the environmental benefits of a plant-based diet
Climate change probably isn’t news to you. Perhaps you have also heard that plant-based food is better for the environment than animal products in some way. But does that make sense? Surely local meat is better for the environment than some vegan meat made of soya beans grown halfway around the world. Here, we’re going to look at these concerns and see why plant-based does indeed just make sense. Understanding the impact of animal products on the environment is key to tackling the climate crisis. This is the motivation behind the Plant-Based Universities campaign; it urges universities to be leaders in making this transition. We will learn more about this campaign later.
Firstly, we should address what exactly we mean by ‘plant-based’. Plant-based simply means derived completely from plants, i.e., containing no ingredients sourced from animals. When it comes to food, ‘plant-based’ is more or less synonymous with the older term ‘vegan’. Neither term has a definitive definition, but the main difference, if there is one, is that plant-based refers only to food, whereas vegan describes someone’s beliefs about how animals should be treated.
So why are plant-based diets better for the environment?
Human activities often have a negative environmental impact. The most common is greenhouse gas emissions — the main drivers of climate change. However, there are other ways that human activities negatively impact the environment, such as land use, water use, and chemical pollution derived from the use of pesticides and fertilisers. Plant-based foods tend to perform better in all of these categories.
All life on Earth ultimately gets its energy from the sun. Plants capture it directly by photosynthesis; animals consume plants or other animals to obtain energy. Most of the energy an organism takes in is used simply to keep it alive, so only a small fraction can be obtained by consuming that organism. It is therefore most efficient to consume plants directly, rather than consuming animals that have consumed plants. This means that growing plants to feed to animals that we then eat is a grossly inefficient way of feeding the world, resulting in higher use of land, water, fertilisers, and pesticides, and leading to higher emissions.
So now that we understand this conceptually, what does this mean in the real world? Currently, 50% of all habitable land is used for agriculture. Of course we need this to live off, but could we use it more efficiently? Of that agricultural land, 77% is used for animal agriculture, but this only provides 18% of the calories and 37% of protein supplied globally. This makes plant-based agriculture 15 times more efficient in terms of calories and 6 times more efficient in terms of protein than animal agriculture.
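As a rough sanity check, those two efficiency figures follow directly from the percentages quoted above. Here is a minimal sketch in Python; the shares are the ones stated in this paragraph, and the calculation is purely illustrative:

```python
# Shares quoted above: animal agriculture uses 77% of farmland
# but supplies only 18% of calories and 37% of protein.
animal_land, plant_land = 0.77, 0.23
animal_cal, plant_cal = 0.18, 0.82
animal_pro, plant_pro = 0.37, 0.63

# Output per unit of land, plants relative to animals
cal_ratio = (plant_cal / plant_land) / (animal_cal / animal_land)
pro_ratio = (plant_pro / plant_land) / (animal_pro / animal_land)

print(round(cal_ratio))  # 15 — plant agriculture ~15x more calorie-efficient
print(round(pro_ratio))  # 6  — and ~6x more protein-efficient
```

The ratios simply compare how many calories (or grams of protein) each hectare yields under the two systems, which is where the "15 times" and "6 times" figures come from.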
Land use for agriculture is a primary driver of habitat loss around the world, causing mass biodiversity loss and Earth’s sixth mass extinction, meaning we need to reduce the land we use while also feeding the world. The most effective way to do this is by switching to a plant-based food system. Beef, for example, accounts for 41% of deforestation worldwide. Additionally, some 24,000 endangered species, 85% of all endangered species, are threatened by agriculture.
By switching to an entirely plant-based food system, the vast majority of the land currently used for animal agriculture would be freed up and could be rewilded into its natural state, reducing the threat to endangered species. This land would also act as a natural carbon sink, offsetting the equivalent of 16 years’ worth of fossil fuel emissions. Animal agriculture is also a much larger source of emissions than plant-based agriculture. Cows and sheep are ruminants, meaning they produce lots of methane during digestion. Methane is a much more potent greenhouse gas than CO2: its warming potential is 84 times greater over 20 years. This is why beef, lamb and mutton, and cheese have the highest emissions of any food. The larger amount of land required for animal products also means that more fertiliser is required, resulting in more emissions from its production. A common misconception about plant-based diets is that they must have worse emissions due to the greater distances
that foods like chickpeas and soybeans have to travel. This is understandable, as food miles are a much more tangible form of emissions than many of the other types of emissions associated with our diets. However, a study in the US found that eating plant-based just one day a week reduces your dietary emissions more than sourcing all your food locally does. This is because for most food, especially animal products, farming practices make up a much larger proportion of emissions than transport (Figure 1). It is often the case that even animals raised locally in the UK are fed crops grown outside the UK, such as soya beans from Brazil. In fact, 77% of all soy produced globally is fed to livestock and only 20% is consumed directly by humans, mostly as oil.
There are many things that we need to do in order to prevent catastrophic climate change. We need to completely phase out fossil fuels, reduce our dependence on aviation and develop some form of carbon capture technology. Switching to a plant-based food system is also an essential step in ensuring the future of our planet. It also has major advantages
over other methods: it doesn’t require the invention or scaling up of any new technologies nor does it require any large-scale changes in infrastructure. Arguably, its greatest advantage is that anyone can play a role in this change every time they decide what food to buy.
Another way is by showing support for campaigns such as Plant-Based Universities (PBU), which was started at UCL in 2021. This is an international campaign to persuade universities to switch to 100% plant-based catering. Its aim is to hold universities to account over the role they should play in educating about and tackling the climate crisis; it is universities, after all, that produce the research showing how important a step this is. This year the campaign was launched in Cambridge and has already received a large amount of support. The next step is to hold a vote at the Students’ Union on whether to make it an official campaign of the SU. The more support the campaign receives from students, the more likely this is to pass. The aim is then to make all catering run by the University 100% plant-based. So make sure you vote!
GM brought us Golden Rice, but is it a Golden Solution?
Hayoung Choi explores whether GM crops are a solution to the food crisis
‘Genetic engineering’ is a broad term, covering everything from modifying existing genes to introducing a gene derived from another species. The term emerged as scientists began to recognise the ‘gene’ as a unit of inheritance in the early 20th century. If we understood the role of each gene and could modulate them, some imagined, we could design an organism. Genetic engineering promised to fulfil this dream: improving crop quality and yield. From the late 20th century, the emergence of genetic engineering brought genetically modified (GM) crops to the market, and with them, anti-GM sentiment sometimes as extreme as vandalism and lawsuits. Can we be confident in GM crops? Which societal and scientific challenges should we be mindful of?
The benefits that GM crops bring us are solid. Herbicide and pest resistance, the modifications first engineered into crops, are still widely used. Alongside chemical herbicides and pesticides, they aided the massive increase in crop yields during the so-called ‘Green Revolution’ of 1950–70. In 2020, Kenyan farmers lost their crops to locusts, a problem insecticidal GM crops could solve. Increased food production generally leads to a better standard of living: in the 1950s, UK households spent about 33% of their income on food, whereas today the figure is around 8%, leaving more spare money to spend. Furthermore, by modulating the relevant genes, we can produce larger fruits, delay ripening, and change colour or shape. Significantly, a GM rice variety that produces a precursor of vitamin A, called ‘Golden Rice’, reduces the incidence of childhood blindness due to malnutrition in underprivileged regions. Such beneficial genes can be homologs of genes identified in closely related crops. There are also efforts to diversify the genetic pool, which has gone through a bottleneck due to selective breeding. As such, GM technology can help prevent disease caused by malnutrition, increase food security, and help those at greater risk of food poverty.
Moreover, the future of genetically engineered crops looks bright. The recently developed gene editing tool CRISPR-Cas9 (clustered regularly interspaced short palindromic repeats) is sophisticated yet convenient enough that labs can simply order a kit to insert, delete, or mutate a single gene locus or multiple loci. This greatly lowers the barrier to genetic engineering, as deep expertise is no longer required; CRISPR has effectively revolutionised genetic engineering and broadened the potential of GM crops.
However, some alleged advantages of herbicide- and insect-resistant phenotypes are controversial. Pesticides secreted directly by pest-resistant plants will be more effective than
applied pesticides, which will be washed away. However, GM crops have also introduced some chemicals to our food: for instance, growing herbicide-resistant crops, replacing tilling with herbicide treatment, encourages careless use of herbicides, among which the most widely used is glyphosate. Problematically, the International Agency for Research on Cancer (IARC) classified glyphosate as a probable human carcinogen, finding “strong” evidence of genotoxicity. Glyphosate is used on some 80% of GM crops, which means that GM technology is indirectly leading to widespread exposure to a potential carcinogen.
One of the earliest GM crops was an insect-resistant variety producing Bt toxin, which acts specifically on insect gut receptors but not on humans or on animals preying upon the insects. The Bt toxin gene has been inserted into a variety of crops and is especially widely used in maize and cotton. Bt toxin has now been used extensively for pest control for nearly a century and is approved for organic farming, and the safety of Bt crops has been comprehensively verified. However, this is not the case for the majority of GM crops, as the industry churns out variety upon variety. Some GM crops may bear unknown health risks that have not yet manifested. There are perils in rushing into new technologies that have not been extensively tested for safety.
In her book Silent Spring, Rachel Carson highlighted the chronic toxic effects of DDT, which accumulates in predators and manifests only in the long term. DDT was used excessively, believed to be safe on the basis of short-term experiments that showed no significant effects. Indeed, we need careful preclinical studies and robust regulatory science to ensure the healthy development of GM technologies. This is especially the case given the complex regulation of gene expression, which we do not fully understand. Scientists are now aware that ‘genes’ are not discrete and that phenotypes are regulated through a complex web of molecular interactions. Even CRISPR, a recent tool also called “genetic scissors” for its specificity in recognising target sites, carries a risk of off-target effects: unintended mutations can be made at similar sites, or the modulated region may participate in regulating the expression of other genes.
"We should also consider the potential societal consequences of commercialised GM crops"
We should also consider the potential societal consequences of commercialised GM crops. Large agricultural corporations, such as Monsanto and Syngenta, are the main patent owners of GM crop varieties and are often exclusive in their ownership. While farmers have traditionally saved seeds to sow the following year, many GM seeds are sterile or sold under licences that prohibit seed saving, so they must be purchased every year. In an industry with marginal profits for farmers, having to pay for seeds every year poses yet another financial burden. Corporations benefit at the expense of farmers; at worst, the weakening of middle-class farmers may destabilise the economic structure of an agriculture-based country. For instance, the influence of the US-based agricultural corporation Monsanto on Indian agriculture has been controversial. Commercialised in 2002, Monsanto’s GM cotton replaced local cotton varieties at a high rate, homogenising the vegetation. The seed monopoly added to the financial burden on farmers, who had to pay the company a fee every year, and Monsanto was blamed for the raised suicide rate among Indian farmers.
So how can we guard against the potential dangers of genetic engineering and provide a social safety net for individual farmers? Before commercialisation, preliminary studies are required by the governmental institutions of each country, and these should take place in a closely monitored environment. Novel genes or technologies are often contested. For instance, a massive field trial in India suggested that a novel gene-use restriction technology (GURT) may be defective, causing the death of livestock that consumed the variety. This event elicited public outrage, leading to campaigns against GMOs in general in India in the late 1990s. Ensuring safety requires another social aspect of science, regulatory science, which asks the fundamental question: how much evidence is sufficient to believe in a product? After all, the acceptance of a technology comes down to whether the public chooses to consume it. While this question is mainly addressed by experts in modern society, we need a social consensus on ‘safety’ so that each individual can be aware of what they eat.
Here lies another dilemma: contemporary science is a powerful tool for investigating natural processes, but it is also authoritative, complex, and a point of collision for various interests. Scientific research funded and published by companies tends to be biased towards those companies’ interests: in the 1950s and 60s, chemical companies lobbied regulatory authorities to undermine the evidence that DDT is carcinogenic. We should ensure that fair and in-depth scientific analysis can be done through robust regulatory frameworks, such as the UK’s Genetically Modified Organisms (Contained Use) Regulations (GMO(CU)). On the other hand, in an attempt to solve societal issues, several social initiatives develop genetically modified crops for the benefit of citizens. The International Crops Research Institute for the Semi-Arid Tropics (ICRISAT) is one such nonprofit organisation, taking molecular approaches to develop crops suitable for semi-arid tropical farms. There are risks in using GM crops, both known and unknown.
Historic environmental disasters such as that caused by DDT should remind us to be cautious of letting our desire for technological progress override our sceptical scrutiny of new technologies, particularly those with far-reaching environmental consequences. We must also face an easily neglected issue in food security: we waste an estimated 30% of food each year worldwide. Perhaps we should tackle this social justice issue as much as we invest in better GM varieties. There is no doubt that science is a powerful tool that can improve our lives, but our belief in science often blinds us to the elephant in the room. Scientific development should always go together with societal efforts, since science and technology can sometimes bear health and environmental risks.
Hayoung Choi is a second-year undergraduate studying biochemistry & molecular biology, plant & microbial sciences, and history & philosophy of science at the University of Cambridge. They have broad academic interests and are currently Workshop team lead at Cambridge University Science & Policy Exchange (CUSPE). Artwork by Biliana Todorova.
"There are risks using GM crops, both known and unknown..."
Brains in a Dish - Models for the Study of the Human Brain
Anna Pujol explores the different models used to study the brain and its related pathologies
Have you ever lived in a city for so long that you witnessed it changing over the years? Streets got damaged and repaired, old shops went out of business and new ones opened up. I bet you can even picture that run-down cafe on the corner being replaced by a fancy bar with queues of youngsters lining up at the weekends. Now imagine this dynamic city inside an opaque snow globe and ask yourself: how could we study and understand those changes? Some may suggest cracking it open, and some may suggest finding a way to make the globe transparent. But what would you suggest if the globe is someone’s brain and the intricate network of the city is made up of cells and tissue? Over the years, doctors and neuroscientists have tried to tackle this problem. Brain injuries and related pathologies affect millions of people every year, and yet the brain is still a great enigma to us, partially due to its inaccessibility.
LIVE BRAINS | In some rare situations, doctors have been able to directly study functioning human brains. For instance, Dr Penfield applied local anaesthetic to his epileptic patients, removed part of the skull and stimulated certain areas while the patients stayed awake. This allowed the patients to report what they were experiencing when the local stimulations were applied. Through this method, Dr Penfield was able to understand the role of some brain regions and how they modulate our experiences and actions.
Alternatively, scientists can study post-mortem brains. Nevertheless, these are often extremely damaged and only provide a static picture of the last stage of a disease. What is really interesting to scientists is the ability to monitor the processes and events that are happening in the brain in real-time. Being able to study the developmental processes of the brain, and how neural networks change over time, is critical for scientists to understand the mysteries that lead to the many functions and dysfunctions of the brain.
FROM MICE TO FISH TO HUMANS | The development of animal models has also allowed us to peer into the brain. We have all heard about lab mice, but in fact a wide range of animals has been used in brain research, from pigs and monkeys to flies (Drosophila), fish (D. rerio), and worms (C. elegans). Each of these models has its advantages and disadvantages depending on the goal of the research. These models can be used to study the molecular mechanisms that drive fundamental processes in the brain, such as synapse formation and network plasticity. They are also used in behavioural studies, in which we can see how animals react to certain cues and environments, and then link their behaviours to brain activity and molecular pathways. We can also genetically modify animals to replicate human pathology, by introducing specific mutations associated with a human disease into the animal’s genes. Nevertheless, we have hardly ever been able to faithfully replicate human diseases. This is because replication of a disease in an animal model relies on the extent of our knowledge of the disease and its causes. In many cases, pharmaceuticals developed to treat a disease show great efficacy in animal models but fail in humans, because either the animal protein or gene targeted by the drug does not exist in humans, or the animal drug target is distinct from the human version. Indeed, we must not forget that there are evolutionary differences between us and other animals. In fact, the best way to study a human brain is by studying a human brain.
GOING BACK IN TIME | This brings us back to the original question of how to study a human brain. While it is certainly not ethical to remove someone’s brain to study, new technologies have found a way around this issue. What if the brain did not belong to anyone, and was in fact grown in a laboratory? Although we are not quite there yet, this field of research has been growing in recent years. Welcome to the 21st century!
Nowadays, scientists can collect cells from people’s skin and reprogram them to a pluripotent state. Pluripotency is the ability of some cells to give rise to any other cell type found in the body. Normally we only find these cells in early embryonic states, and we lose most of them later on. So for scientists to obtain them, they have to take some adult cells (normally skin cells), put them in a time machine, and send them back to their embryonic state. This ‘time machine’ is, in fact, a group of proteins called the Yamanaka factors, which activate certain genes needed for cells to become pluripotent. The resulting cells are called induced pluripotent stem cells (iPSCs). We can then culture iPSCs in different media (any liquid that supports cell growth) and differentiate them into any type of cell in the body. For example, if we want to study Parkinson’s disease, which affects the dopaminergic neurons of the substantia nigra region of the brain, we can obtain skin cells from patients, revert them to iPSCs, and then differentiate them into that specific type of neuron. This allows us to have millions of neurons growing in 2D in a flask, behaving in a similar way as they do in the brain. A great advantage of this approach is that patient-derived iPSCs maintain the genetic background of the patient of origin, and therefore better mimic the individual characteristics of the disease in that patient (such as its degree of aggressiveness).
Despite these advantages, this approach has some limitations. For instance, these cells can only be kept in culture for a limited amount of time. Furthermore, they don’t recapitulate the complex structure and cellular heterogeneity of the brain. In other words, we are trying to mimic a 3D world on a 2D platform. Thankfully, 3D cultures have been developed to overcome some of these issues. Spheroids are small iPSC-derived 3D cultures. Owing to their 3D morphology, these cultures can mimic the brain’s architecture better. Inside the spheroid, we can obtain diverse regions and layers with different cell types, just as we would have in a human brain. Furthermore, the cells in a spheroid are surrounded by other cells in all directions, which helps to enhance adhesion and communication and therefore better mimics the reality of a human brain. Finally, another advantage of 3D cultures is that they can be kept for a longer time, allowing the cells to form more connections of higher complexity and allowing us to study them for longer.
Another emerging interest in recent years has been the extracellular matrix (ECM). The ECM is the space in between cells, the non-cellular component present in all organs. It is formed by an intricate network of molecules linked together into a structurally stable composite. In the past, researchers thought that the ECM was a passive structure, but it is now evident that it plays a very active role, not only in contributing to the mechanical properties of a tissue but also in cell attachment, migration, and communication. Embedding 3D cultures in such matrices could allow us to model the human brain environment. Some of these cultures have already been used to study diseases such as brain cancer, epilepsy, and bipolar disorder. Within the matrix, we might be able to find novel drug targets that allow the development of treatments for these pathologies.
Despite the recent advances in the field, there is still a lot to be done. One limitation of these models is the non-existence of a vascular system within the 3D culture. In fact, once the brain spheroids reach a certain size, there will not be sufficient nutrients and oxygen flux inside the sphere, and so the core will become necrotic.
In recent studies, many researchers have started to come up with different ideas to surpass this limitation, but further improvements are needed. Currently, we have a broad range of tools and models that allow us to study the brain in many ways. This is a fast-developing field filled with new and exciting discoveries. There is no doubt that with every new implementation that goes beyond the current limitations, we will be one step closer to obtaining brain cultures that better resemble the human brain. With this, we are one step closer to understanding this enigmatic organ and having better platforms to test drugs for brain-related diseases.
Anna Pujol Castiblanque is a Darwin College PhD Student studying Glioblastoma heterogeneity at the Mair lab and Markaki lab. Prior to coming to Cambridge, she completed her undergraduate in Biotechnology at the University of Barcelona and her MSc in Brain and Mind Sciences at UCL and Sorbonne. Anna is particularly interested in patient derived human models for the study and modelling of brain pathologies. Illustration by Josh Langfield.
Genome Sequencing: its Promises, its Impact
Merissa Hickman and Rachel Duke discuss the societal implications of advancements in genomic medicine
A research group at Stanford University has sequenced the human genome in a record-breaking time of under six hours. Unbelievably, this is twice as fast as the previous record. Advancements in genomics are changing healthcare as we know it, for better or for worse. Welcome to the Genomic Revolution.
WHAT IS GENOME SEQUENCING?
Genome sequencing allows medical specialists to view the complete set of genetic material in an individual. Now obtainable in as little as a few hours, genomic data is widely accessible. The information from the genome is instrumental in identifying rare inherited disorders, characterising mutations that drive cancer progression, and, as evidenced by the recent pandemic, tracking disease outbreaks. The interpretation of genetic data requires high-throughput computational analysis, which is often time-consuming, expensive, and subjective. Advancements in genomics have been critical for advancing scientific policy and have increased support for the open sharing of scientific data. However, this has raised many ethical issues surrounding the use of genetic data, particularly in medicine. Who should have access to this data? What could the data incidentally reveal? And how do we communicate this information to patients?
A BRIEF HISTORY OF SEQUENCING
As incredible as it may seem, genome sequencing is a relatively new technology. It all began in 1977, when Frederick Sanger developed the first sequencing method and was awarded a Nobel Prize for his revolutionary work. The first DNA-based genome was sequenced in the same year: the genome of a bacteriophage that infects E. coli.
Jumping to the year 2000, the full genomic sequence of the fruit fly was completed. In the same year, the first plant genome was also sequenced.
Launched in 1990, the Human Genome Project set out to sequence the entire human genome, and 13 years later established the first ‘complete’ human genome.
Following this, the HapMap Project in 2005 aimed to describe common patterns of human sequence variation, a major stepping stone for large-scale human genome projects such as the 1000 Genomes Project.
2009 saw an explosion of new computational tools: the increasing affordability and accessibility of genome sequencing meant that new tools were required to accommodate the demands of large-scale genomic projects.
The 100,000 Genome Project was the next major stepping stone, a UK initiative to sequence and study the roles of genes in health and disease. The research and analysis of this initiative is
still ongoing. Most recently, the Newborn Genomes Programme was introduced. It aims to explore how, and when, offering whole genome sequencing to newborns is appropriate.
Here, we will focus on two important examples of genomic sequences by looking back and reflecting on the aims and outcomes of the Human Genome Project, and looking forward to see how the Newborn Genomes Programme may change the future of healthcare.
GENOME SEQUENCING: ITS PROMISES
THE HUMAN GENOME PROJECT
It is difficult to attend any biological lecture or read any textbook without the Human Genome Project being mentioned in some capacity. In many ways, it has allowed us to gain a complete understanding of the human genome, remarkably reducing the basis of our existence to a sequence of letters on a page. But is our understanding really complete? Although labs all over the world have access to the human genome at their fingertips, each breakthrough presents us with further unanswered questions and reveals an additional layer of complexity.
The Human Genome Project was set up in 1990 with the initial promise of decoding the human genome down to its sequence of base pairs and annotating the genes. It is clear that the scientists involved in the project, and those who funded it, believed that its success would be a momentous step in modern biology and would have an impact on many disciplines, from physiology and medicine to pharmaceutical design. There was also a great amount of excitement from the media and general public about how we would be able to read the secrets of our own lives in an accessible way. Many thought that in decoding the base sequence of our genome the project promised to easily identify all disease-causing genes, find the genes that set humans apart from other great apes, and explain every aspect of our phenotype. It is difficult in hindsight to distinguish between what the scientists ‘promised’ and what the media and general public expected.
The Human Genome Project published a first draft of the genome in 2001, with the completed sequence following in 2003. For many scientists working in the field today, having access to the human genome is as vital to the lab as test tubes and petri dishes. Because of the Human Genome Project, geneticists have a reference genome that they can compare to the genomes of other humans to identify areas of variation, and use to find similarities and differences between our own DNA and the DNA
of other species. The Human Genome Project was a landmark project in many other ways, not least for its public data publishing and the collaboration between multiple labs working towards a common goal.
The data collected from the Human Genome Project has had a significant impact, in the field of genetics and beyond. This is seen both in its direct use and indirectly in subsequent projects that have used similar or adapted experimental techniques. The Human Genome Project has revealed a lot in terms of our past evolution and relationships to other organisms in the evolutionary tree of life. We have been able to disprove suggestions that the more DNA an organism has, the more complex it is — this is the concept of the so-called ‘C-value enigma’. For example, humans and mice have a similar number of genes while some plants, like corn, have many more genes than we do. Through careful analysis of the Human Genome Project data and the 1000 Genomes initiative, we have been able to identify SNPs, or single nucleotide polymorphisms. These are single bases that differ between people: for example, 80% of a population may have an A nucleotide at a specific position while the other 20% have a C.
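The idea of a SNP can be made concrete with a small illustrative sketch (not part of any project mentioned here; the function name and sequences are invented for demonstration): given two aligned sequences of equal length, a SNP candidate is simply a position where the bases differ.

```python
def find_snps(seq_a: str, seq_b: str) -> list[tuple[int, str, str]]:
    """Return (position, base_a, base_b) for every site where two
    aligned, equal-length DNA sequences differ by a single base."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned and equal length"
    return [
        (i, a, b)
        for i, (a, b) in enumerate(zip(seq_a, seq_b))
        if a != b
    ]

reference = "ATGCGTAC"
sample    = "ATGCGTCC"   # differs from the reference at one position
print(find_snps(reference, sample))  # [(6, 'A', 'C')]
```

Real variant calling is far more involved (alignment, sequencing error, insertions and deletions), but the core comparison is this simple.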
Have the Promises of the Human Genome Project Been Fulfilled?
The Human Genome Project made many claims during its initial stages about its possible future impact on the biosciences and many of these have stood the test of time. However, almost 20 years on from its completion, we must reflect and ask if all these promises have been met. It is very clear that to know the nucleotide sequence of the human genome is to know the human blueprint. But, in the same way a script can only tell us so much about the final film, the base sequence is not the whole story.
The genome is not simply a string of letters to be read chronologically. In reality, DNA is highly compacted through the involvement of histone proteins and nuclear scaffolds. Some parts are highly repressed and are not actively transcribed while other parts are more accessible to the transcription machinery. The more accessible a gene is, the higher the rate of transcription will be. It should be noted that, for a large proportion of the genome, its state of compaction is dynamic and dependent on the immediate needs of the cell. In contrast, genes that are not needed by a cell (for example, liver enzymes in a muscle cell) may be entirely and almost irreversibly repressed. The fact that a neuron, epithelial cell, and muscle cell are so disparate in structure and role while containing the same genetic information proves that knowing the underlying blueprint is limited in its ability to determine how a cell and organism will look and function.
Epigenetics is currently a very fashionable area of research and one that, arguably, is deserving of the attention it attracts. Epigenetic modifications refer to heritable changes to the genome that do not alter the underlying base sequence (and were therefore not sequenced as part of the Human Genome Project). This could be the addition of chemical groups, such as an acetyl
group, to histone proteins, or the addition of a methyl group to cytosine nucleotides in the DNA sequence. These modifications can alter how accessible the DNA is and therefore alter gene expression. When we ask seemingly simple questions like ‘why do genetically identical twins who grew up in the same environment develop different diseases?’, we confirm the nucleotide sequence cannot possibly be the whole story. It is probable that epigenetic divergence plays a part in the different phenotypes of genetically identical individuals.
Even if we briefly ignore the issues that gene expression presents, why is it that we have not been able to clearly identify all the genes in the genome and match gene sequence to function? It is clear this process is not as simple as one might have predicted when the Human Genome Project began. Less than 1.5% of the human genome codes for proteins, meaning genes must be searched for amongst a desert of non-coding DNA. Introns are non-coding regions of genes that are removed from mRNA transcripts by splicing. They appear frequently and can be up to 200 kb long. To add a further layer of complexity, the term ‘gene’ is actually very difficult to define. Genes can be nested in other genes by being located in an intron. In addition, a ‘gene’ can encode different proteins if the transcript is post-transcriptionally modified in different ways — this is known as alternative splicing. Simply knowing the sequence of A, T, G, and C nucleotides is of limited value without an understanding of the proteins they encode and the regulation of their transcription. However, there have been many advancements in transcriptomics (the analysis of the mRNA transcribed from the DNA sequence) through experimental techniques such as microarrays, whose development depended on the sequencing of the human genome. It cannot be denied that sequencing the human genome was a crucial step in understanding our genetics and was a necessary foundation for subsequent advancements in both knowledge and technology.
We must be aware when discussing the ‘success’ of the original Human Genome Project that the media greatly exaggerated the claims of the scientists involved. The project was fundamentally a success: it completed its primary goal — to sequence the human genome. The notion that the Human Genome Project would revolutionise the biosciences did not come from geneticists but rather the journalists of the time. The difficulty of understanding the complexities of the decoded genome could not have been predicted at the outset of the project. Perhaps presenting unanswered questions and lines of further research is as valuable as giving direct insights.
Newborn Genomes Programme
Fast-forward to 2021, when a public dialogue on the use of whole genome sequencing in newborn screening was released. The dialogue’s main findings are summarised in Box 1. The recent Newborn Genomes Programme aims to use whole genome sequencing in newborns to expand screening from the nine conditions currently covered to many more rare diseases.
Box 1: A Summary of the Public Dialogue from the Newborn Genomes Programme
1) It would be acceptable to use whole genome sequencing (WGS) to identify a wider set of conditions than current NHS newborn screening programmes, provided that: the condition impacts infants in early childhood, and treatments and/or interventions are available to cure, prevent, or slow progression of the condition.
2) Genetic counselling and mental health support must be available to those who receive a diagnosis.
3) A comprehensive genetic database should be established so that people from ethnic minority backgrounds are not disadvantaged by receiving more uncertain or less accurate diagnoses.
4) The complexities of WGS must be recognised during the design of the consent process, including:
- Implications of WGS for the wider family.
- That parents consent on behalf of children, but the children may have different views when they grow up.
- The screening tests have the potential to look for more conditions than current newborn screening.
Professional guidance and advice on whole genome sequencing of seriously ill infants is conflicting, creating widespread controversy over whether the new screening programme is acceptable. The British Society for Genetic Medicine believes that testing the whole genome is unlikely to be controversial when testing aids immediate medical management. Conversely, the European Society of Human Genetics believes it is preferable to use a targeted screening approach (such as targeting specific genes) to avoid unsolicited or uninterpretable findings. If a reliable method exists to detect a serious genetic disorder earlier, is it not our responsibility to use it? The professional moral imperative of beneficence — doing good to others, including moral obligation — would have you believe so. However, overwhelming patients with complicated genetic results, riddled with uncertainties, may conflict with the professional duty of nonmaleficence — to do no harm. The European Society of Human Genetics raises valid concerns regarding potential incidental findings. Whole genome sequencing of newborn babies may reveal unexpected abnormalities, raising questions about how, or even whether, we report these findings. Despite initial reservations, the design of Genomics England’s Newborn Genomes Programme is currently underway.
The programme broadly aims to identify rare diseases in babies; to understand how genomic data could be used to improve knowledge and treatments; and to explore the potential risks and benefits of storing an individual's genome over their lifetime. If successful, the programme could provide early diagnosis for childhood-onset rare genetic conditions. In theory, this is a
no-brainer. Improving diagnosis and immediate care for infants whilst building a comprehensive genomic database to inform research and knowledge: what could go wrong? In reality, our understanding of the genome is far from perfect, and whole genome sequencing in newborns raises many questions. Who will be affected by whole genome sequencing? What information will be shared with patients and with big data sources? And what are the implications for wider society?
GENOME SEQUENCING: ITS IMPACT
The Human Genome Project
Having a reference genome available and being able to sequence an individual’s personalised genome is now very important in healthcare, particularly in treating conditions such as cancer. For example, certain breast and ovarian cancers carry a mutation in the BRCA1 gene, and these patients respond particularly well to specific drugs. In individuals with a mutation in a different gene, this targeted therapy will not work. Introducing more targeted gene therapies for conditions like cancer, where alternative treatments are highly invasive and have many side effects, would be a significant step for cancer research and biomedicine as a whole.
Genetic sequencing technologies have been extended in recent years to more commercial ventures such as home testing DNA kits made by companies like Ancestry or 23andMe. This allows people to understand and explore their heritage in more detail than we could have imagined 50 years ago. The same goes for DNA sequencing tests that allow people to be reunited with family or confirm their biological relatives. This personalised genome sequencing can have a huge impact on an individual’s life and means we now live in a society where relationships can be scientifically confirmed.
Sequencing an individual’s genome has never been quicker or cheaper. In the not-too-distant future, it seems genomes will be sequenced at birth, and personalised medicine tailored to each individual’s genes may become the norm. This could help us identify people who are at risk of, or predisposed to, certain diseases or conditions, and potentially offer preventative medicines, advice, or more regular check-ups. However, we must be wary of advancements of this kind. Will this information be available to health insurance providers and healthcare professionals? It seems that in the digital age we live in today, discrimination and prejudice based on a person’s genome could become a devastating reality. Will biological information, like other forms of private data, be protected? The issue of privacy and data protection is central to any discussion of genome sequencing and is already a cause for concern with commercial DNA testing kits. As technology becomes more advanced and valuable data is stored at such a large scale, can we ever be sure this intimate information will stay private?
To go one step further, editing the genome seems like the stuff of science fiction, but the technology is readily available to scientists today. Nobel Prize-winning CRISPR gene-editing technology has
been used in experiments for many years now and has proved to be extremely useful. But will the widespread use of similar technology on humans be a good thing? Genetic therapies could prevent fatal and debilitating conditions being passed on to the next generation but, despite the potential benefits, we seem to be at risk of tumbling down a steep slope of ethical concerns surrounding the commercial use of this technology. We must not disregard the very real risk that genomic data, misinterpreted or in the wrong hands, could have catastrophic repercussions throughout society, including a resurgence of eugenics. Such outcomes cannot be predicted at this stage. We may look back on these early stages of genome editing in hindsight and wonder how we did not see these problems coming. Would you want your genome sequenced?
Newborn Genomes Programme
The Newborn Genomes Programme may result in whole genome sequencing being routinely available at birth. Although titled the ‘Newborn Genomes Programme’, the initiative’s results will not affect newborns alone. The programme may bring immediate health benefits to seriously ill newborns, such as earlier detection and improved care management; however, potential harms should also be considered. Due to the complexity of classifying whether a variant is harmful or not, false positives may arise through whole genome sequencing. Another aspect to consider is whether information regarding future disease risk is revealed. The so-called ‘Angelina Jolie effect’ significantly increased testing for BRCA1 and BRCA2 mutations in women. This resulted in more women opting for preventative treatments to reduce their risk of developing breast cancer. However, the difference between these women and newborns is blatant: these women chose to be tested. If the same mutations are detected in newborns, a child’s right to make their own choices about accessing this information must be considered. A child’s right to an open future should be factored into the design of the programme. Using whole genome sequencing to look opportunistically for a broad range of conditions is generally considered unacceptable in the medical community, but incidental findings may be unavoidable.
Whole genome sequencing also affects parents. Parents may feel entitled to the genetic information of their child, particularly if it reveals information relevant to their own health. The complexity of interpreting genetic information may lead to many uncertainties which, if reported, may overwhelm parents. This may also influence how the child is raised. Other family members may also be interested in knowing information relevant to their health. This creates another question for healthcare professionals — what information from whole genome sequencing should be disclosed to parents of newborns?
Grey areas exist in determining who should have access to genetic information. Genetic information has recently been classified as belonging to the family rather than the individual — yet patient confidentiality still applies. Knowing when to share information, even beyond the wishes of the patient, is not standardised and is assessed on a case-by-case basis. The new programme may also over-burden healthcare professionals. Another potential harm is that not all healthcare professionals are trained in genomics: they might not understand the limitations of whole genome sequencing, nor be able to adequately interpret or deliver results.
There are many potential benefits and costs of whole genome sequencing in newborns. Increased uptake of genome sequencing could lead to the creation of a more balanced, population-wide genome database. Due to the historical imbalance in funding across many scientific disciplines, the genomic databases currently available to researchers are almost entirely representative of the genomes of White Europeans. Whole genome sequencing at birth may help to bridge the gap for minority populations, creating a more comprehensive database. Tackling the discrimination and exclusion within current databases may lead to increased diagnosis in minority ethnic populations. Something to consider is the potential effect on public attitudes towards genetic variation. Iceland’s recent near-elimination of Down Syndrome births has attracted major scrutiny: since prenatal testing was introduced, close to 100% of women who received a positive test for Down Syndrome terminated their pregnancy. It has been suggested that population genome screening should not be approved until we have tackled the negative societal attitudes experienced by those with genetic conditions.
This brief overview of the ethical issues of genome sequencing in infants is not exhaustive; it presents only a handful of important implications to consider. The potential benefits of the Newborn Genomes Programme have also been briefly explored. Although the major benefits of the programme are evident, the potential issues raised are challenging and not easily tackled. Whether the design team for the Newborn Genomes Programme will address all of these issues is yet to be determined.
The sequenced human genome is now a vital tool of any genetic or epigenetic research lab. While the Human Genome Project may not have provided us with an answer to every question, it did provide us with a necessary starting point for further experiments and progress in understanding the secrets of the genome. Whole genome sequencing has an incredible potential to improve healthcare for everyone, hence why sequencing genomes at birth is being explored. However the societal and ethical consequences of the Newborn Genomes Programme are not yet known. How genomic data will be stored, shared, and utilised requires further public consideration. There is an ongoing debate around whether genome sequencing at birth is a step in the right direction and the benefits and harms of the screening programme should be continuously weighed. What is clear, however, is that the Genomic Revolution is well and truly underway.
Rachel Duke is a second-year Biological Natural Sciences student at St Catharine's College. Her interests lie in developmental biology and how genetic and epigenetic systems influence the physiology of an organism. For this article she focuses on the Human Genome Project.
Merissa Hickman is an MPhil student in Genomic Medicine at Murray Edwards College, University of Cambridge. She is particularly interested in the ethical implications of genomic medicine; here she focuses on the Newborn Genomes Programme. Artwork by Sumit Sen.
ASCUS: Society at the Heart of Science
Pauline Kerekes talks to Keira Tucker, General Manager at ASCUS Art & Science, which runs the only publicly accessible lab in the UK and aims to strengthen the necessary cross-talk between society and science.
Q: What is ASCUS Art & Science?
A: ASCUS Art & Science is a non-profit organisation dedicated to bridging the gap between art and the sciences. It started in 2008. At that time, it was an association where researchers and artists could come together and mingle ideas, and in 2015, through a People Award from the Wellcome Trust, we began work on establishing the ASCUS Lab. We wanted to challenge the notion that ‘a lab is somewhere you can’t go unless you’re a scientist’, through our belief that a laboratory should be accessible to everyone. It is the only publicly accessible lab in the UK. This space is a mix between a research lab and an art studio: the great thing about that is that scientists and artists are seen as equal – the stereotype that scientists are smarter or “better” doesn’t exist here. Everyone is valued. We encourage conversations, explorations, the development of new skills, and satisfying curious minds.
Q: Can anyone come to the lab?
A: The lab is publicly accessible, so yes, anyone and everyone can use our facilities for independent projects or as part of the workshops we run from time to time. The only limitation is that we are a biosafety level 1 lab, so we can only work with what are considered safe microorganisms. We also avoid working with harmful chemicals that could lead to dangerous reactions, like concentrated acids, as we sometimes have people coming to the lab who have no scientific background at all, and we try to make the space as user-friendly as possible. Despite the limitations, we encourage people to enquire regardless, as there might be ways to work around certain issues, or alternatives we could implement to reduce certain risks.
Q: What is your background?
A: My background is in microbiology, so I’m mainly a scientist. I completed my PhD a couple of years ago, but I’ve always been a creative person. Initially I wanted to become a film composer, and the only reason I didn’t go down the creative route was because the university lost my application! I also wanted to do research to create solutions to help people, but I felt in the end the only people who would read or understand my work would be other scientists or people who would pay for access to journals. So, after my PhD I wanted to take a break from academia and reassess what path I wanted to follow, and that was when I stumbled across ASCUS, fell in love with their values, and was lucky enough to join the team.
Q: Can you give examples of past projects at ASCUS?
A: Yes, for instance the G-lands project. A university researcher, Dr Elaine Emmerson, joined up with the artist Emily Fong and asked us to mediate this collaboration. Radiotherapy is used to treat head and neck cancer patients; however, in the process it can damage the patients’ salivary glands. As a result of this damage, these patients struggle to speak, swallow, and eat. Dr Emmerson’s research focuses on the regeneration of salivary gland function to improve the quality of life of those affected by head and neck cancer. She is really interested in getting feedback from patients to make sure she’s researching in the right direction. The artist came in and learnt about the science behind Dr Emmerson’s work through discussions with scientists, medical doctors, surgeons, pathologists, and patients. She then created illustrations and sculptures to represent her observations. One of ASCUS’ roles in this project was to set up a workshop with different hands-on experiments for people with lived experience of head and neck cancer. These included extracting DNA from saliva and looking at salivary glands under the microscope, allowing patients to approach the biology behind their disease, create artwork about it, and reflect together on their own journey. It was a very moving experience: despite the difficulties of the treatment, patients remain so adaptable and able to find a way around it. Another programme at ASCUS was designed for younger people who struggle in a normal school environment, having to sit behind a desk and look at a chalkboard. This programme is run in collaboration with a community project officer who is all about green spaces, biodiversity, and ecology. The kids collect samples and take them back to the lab to look at under the microscope. We follow their curiosity and intuition: they choose the items found in nature they would like to look at, and then they decide the questions they would like to ask and which experiment to do. The idea is that I’m not a teacher showing them things; we are all equal on the playing field. Art is necessary for science as it promotes creativity.
Q: Conversely, would you have an example of experiments that create art material?
A: One that comes to mind is using slime mould, which is a yellow slimy organism that is neither a fungus nor a bacterium. It finds its sources of food through chemotaxis, and if you place food sources at different places in a petri dish, the slime mould finds the most efficient route to the food sources, creating a network that connects them all. It has been used in city planning after researchers showed that the network created by this unicellular organism is so well-structured that it closely resembles the underground rail system surrounding Tokyo (for more details, see https://www.wired.com/2010/01/slime-mold-grows-network-just-like-tokyo-rail-system/). We have slime mould at ASCUS and have used it in a similar way, placing a printed-out map of a local park underneath a petri dish, placing food sources on parts of the map we would like to explore, and letting the slime mould tell us which route to take to get there. Some of the artists that use our facilities are also exploring the patterns slime mould makes on textiles.
Q: And maybe a project involving a form of art one could consider far from science?
A: Most people think of fermented tea with health benefits when they hear “kombucha”. However, a fashion design student came to ASCUS to grow kombucha and used the part of it called SCOBY, the symbiotic colony of bacteria and yeast, to create a sustainable eco dress! She now has her own company, Mykko, where she uses mycelium to create eco-leather.

Science serves society, but the communication between the two is often indirect, and sometimes even absent. Through collaborations between art and science, ASCUS Art & Science builds a necessary two-way bridge: patients, who are the ones experiencing the pain, the doubts, and the fears of a disease, inform the science. Furthermore, at ASCUS, science comes to creative people of all ages and backgrounds through inspiring new ways of teaching and obtaining knowledge.
Pauline Kerekes is a postdoc in neuroscience at the Department of Physiology, Development and Neuroscience at Cambridge who helps coordinate the art behind BlueSci. Pictures provided by Emily Fong and the ASCUS team.
Botox: From Anti-Wrinkle to Anti-Depressant
We are all aware that toxins cause damage to our body, though many people regularly subject themselves to toxic substances such as alcohol and nicotine. Despite this, it is perhaps surprising that one of the most potent poisons known to science is regularly used by beauticians and prescribed in hospitals across the world.
Clostridium botulinum bacteria produce a set of toxins which affect nerve cells, or neurons. These so-called neurotoxins cause paralysis by blocking the signals neurons release onto muscle cells. Specifically, they cut protein complexes which are responsible for the release of neurotransmitters, the chemicals which signal to the muscle to contract. The resulting weakness affects the muscles controlling respiratory function and may prove fatal if medical help is not provided. Despite its foreboding nature, this toxin is often used in a clinical setting and is known under a different moniker: Botox.
The negative effects of botulinum intoxication have not marred its reputation in the cosmetics sector. The Aesthetic Society reports that Botox injections have remained the most popular non-invasive cosmetic practice since 1999, with over 1.7 million cosmetic injection procedures completed in 2019 in the US alone. Intramuscular injection ensures the toxin stays local to the site of injection. The resulting muscle weakness lasts between three and six months – a temporary effect which can have favourable, rather than deadly, consequences. Used as an aesthetic aid, Botox is heralded as creating a youthful appearance, smoothing wrinkles in the face caused by muscle contractions.
Aside from its prominent cosmetic use, Botox has also proven useful in treating a number of medical conditions. In fact, the use of Botox in a clinical setting began in the 1980s, 20 years before cosmetic
application became widespread. It can alleviate a variety of symptoms, including management of muscle hyperactivity disorders, misalignment of the eyes and excessive sweating. Botox has also been shown to be effective against chronic headaches and migraines. While our understanding of migraines is not complete, the pain they cause is thought to be due to inflammation around nociceptive (pain-sensing) neurons. Stimulation of these neurons by inflammatory substances causes increased neurotransmitter release from these nerve terminals. This causes sensitisation of neurons, increasing the transmission of nociceptive signals from the periphery to central systems. Botox impedes this sensitisation by reducing the release of neurotransmitters from peripheral neurons.
The effects of Botox extend beyond the physical and neurological. It is well established that facial Botox recipients have reduced control over their emotional expression. Recently, research has shown Botox may also affect the perception, as well as the portrayal, of emotion. This is suggested by studies in which recipients of Botox injections had a reduced ability to identify facial expressions in photos presented to them. This is thought to be due to interference with a process known as emotional proprioception. Proprioception is our sense of limb location and body position. For example, you know if your arm is raised in the air without having to physically see it above your head. This is mediated by feedback from our muscles to the brain. We assume our emotions dictate our facial expression: you smile when happy and frown when sad. However, by smiling or frowning, your muscles also indicate to the brain that you are happy or sad. In this way, your expression can affect your emotion. A forced smile, regardless of genuine happiness, improves how entertaining a viewer rates a film or TV programme. This bi-directional interaction between muscles and emotional state is referred to as the facial feedback hypothesis. Facial feedback helps us to understand our own emotions, but may also help in the perception of the emotions of others. When we interact with someone, we subtly mimic their facial expression on our own face. The combination of facial mimicry and facial feedback means that feedback from our muscles can be used to comprehend the emotions of the people we speak to. If this hypothesis is true, the paralytic effects of Botox will disrupt this feedback.
The phenomenon of facial feedback is now being tested as a potential therapy for people with severe mental health disorders. Major Depressive Disorder is an extremely prevalent illness, affecting millions of people around the world. Several studies suggest that when Botox is injected into the glabellar region between the eyebrows of depressed patients, they show significant improvements in depression scores compared to pretreatment values when tested around 6-8 weeks following injection. The paralysis of the corrugator muscle is thought to reduce the proprioception of negative facial expressions, and therefore reduce perception of negative emotion. Although research is still at a preliminary stage, a recent study has suggested Botox may also be an effective treatment against untreated bipolar disorder.
Arguments remain as to whether Botox is actually effective in psychiatric treatment, a product of our limited understanding of facial feedback and of the interaction between emotion and mental health. This is exacerbated by a myriad of studies with weak results, low sample sizes, and authors with conflicts of interest. As with many mental health medications, the story is not black and white. Proponents of Botox dispute naysayers, arguing for its use on the basis that we should care whether something works, rather than how it works. Similar doubts surrounding Botox could be applied to current mental health medications. For example, selective serotonin reuptake inhibitors, a common class of antidepressant, are still subject to intense scrutiny, with studies suggesting they have limited effectiveness against depression, cause poor withdrawal responses, and produce an assortment of side effects. As we are yet to fully understand the aetiology, or causes, of complex mental health disorders, it is perhaps foolish to expect a full understanding of their remedies. Compared to other treatments, Botox is relatively cheap and causes very few unpleasant side effects; there are few safety concerns associated with its use. It may be used in combinatorial approaches to reduce the dosage of medication patients take, or in cases where people have not responded to other medications and therapies. However, it is clear that larger and more in-depth patient studies need to be completed to further determine the effectiveness of treatment.
Lauded by some as the ‘miracle toxin’, Botox is often seen in a scornful light by society at large. Although the native toxin can be deadly, this unsavoury opinion pertains mainly to Botox’s cosmetic applications, perceived as reserved for the rich and vain. This unfortunate reputation does a disservice to its benefits and ignores its true value. It is a sad truth that people feel it necessary to undergo procedures to appear more youthful, but this should be seen as a reflection of the unrealistic beauty standards imposed upon us. Even setting aside the potential benefits to self-esteem, Botox has much else to offer. While it is overambitious to claim the discovery of a ‘miracle’ cure, the clinical use of Botox in neuromuscular conditions alone should merit a more favourable stance towards it. Additional interest in its use as an alleviant of chronic pain and as part of combinatorial mental health therapies indicates that a reappraisal of our view of Botox is long overdue.
Emily Birt is a third year Natural Sciences student at Girton College, University of Cambridge. She is specialising in Neuroscience with a project focusing on the neuroendocrine regulation of reproduction and fertility. She is the social media officer for the Girton College Natural Sciences Society and secretary for Cambridge University Show Choir. Illustration by Marida Ianni-Radi.
Power to the People: Creating a Resilient Electricity Grid in the Face of Global Hazards
Though electricity accounts for only around 20% of current global energy consumption, it is an indispensable resource in our increasingly digital society. As the world transitions away from fossil fuels like coal and natural gas, renewable electricity – which can be generated from climate-friendly sources – will become a vital source of energy for both industry and domestic use. However, recent events like the Texas blackout of 2021 have demonstrated the vulnerability of the electricity grid in the face of hazards – an issue that will be aggravated as we approach net-zero and society becomes even more reliant on electrical power.
HOW IS ELECTRICITY GENERATED AND SUPPLIED? | Electricity is generated when mechanical energy is converted into electrical energy, via a turbine generator. The turbine can be powered by different energy sources: in 2021, 40% of the UK’s electricity was generated from natural gas, 40% from renewable sources, and the remainder from nuclear power and other fossil fuels. Electricity is then distributed to consumers through the national electricity grid. To keep the grid operating successfully, the supply and demand of electricity must be carefully balanced, and the electrical current kept at a stable frequency (around 50 Hz in the UK). Power blackouts can be caused by physical or digital failures within the electricity grid, or if the system operator deliberately disconnects the power if electricity supply is expected to be lower than demand.
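The balancing act described above can be illustrated with a toy dispatch check. This is purely a sketch: the function, thresholds, and reserve margin are invented for illustration and are not drawn from any real grid operator's rules.

```python
def grid_status(supply_gw: float, demand_gw: float, margin: float = 0.05) -> str:
    """Toy supply/demand check. A real operator balances the grid
    continuously to hold frequency near 50 Hz, shedding load only
    as a last resort when demand outstrips available supply."""
    if demand_gw > supply_gw:
        return "blackout risk: shed load"
    if demand_gw > supply_gw * (1 - margin):
        return "warning: reserve margin low"
    return "stable"

print(grid_status(45.0, 40.0))  # stable
print(grid_status(45.0, 46.0))  # blackout risk: shed load
```

In the Texas blackout discussed below, it was exactly this last branch that was reached: demand exceeded supply, and the operator was forced to shed load deliberately.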
HAZARDS | Electricity grids are made up of a myriad of components and operating systems, all of which are susceptible to environmental and socio-economic hazards. Firstly, poor weather conditions can adversely affect the energy supply chain and transmission system: strong winds bring down power lines; heatwaves reduce the efficiency of electrical equipment; and flooding can damage equipment throughout the supply chain. In February 2021, Storm Uri overwhelmed the Texas power grid by plunging the state into freezing temperatures. Electricity generators of all types broke down because power plants were not equipped for the harsh winter conditions. Electricity demand from consumers far surpassed the available supply, forcing the regional electricity operator
ERCOT to order an intentional blackout across much of the state. This decision inadvertently cut power to several natural gas producers, which exacerbated the problem as the producers couldn’t supply enough gas to the electricity plants. Ultimately, 5 million Texans lost power to their homes and businesses for several days, and nearly 250 deaths were attributed to the blackout.
As society becomes increasingly digitised and online, a report from the Energy Research Partnership has identified cyber warfare as a key threat to national power grids. The Ukrainian electricity grid was hit by a cyber attack in 2015, attributed to a Russian hacking group. The event left nearly 1.5 million people without electricity for several hours; power was quickly restored, but through physical interventions rather than the recovery of the weakened digital systems. Ukraine suffered several more attacks throughout 2016, and as the Russia-Ukraine conflict continues, the likelihood of further state-sponsored attacks only increases.
If terrestrial hazards weren’t enough, another serious consideration is the effect of “space weather”. The sun emits solar energy and plasma which interacts with the Earth’s atmosphere and magnetic field. On their own, space weather events can induce stray currents within the electricity grid, damaging equipment and causing power fluctuations. If the grid is already near peak demand due to storms or heat waves, a space weather event can push the grid beyond its capacity. This was the case for 6 million residents of Quebec in 1989, when the grid was hit by both a snowstorm and two solar plasma ejections within the space of a week, leading to a regional blackout lasting nearly half a day. Scientists are particularly concerned about the impacts of space weather as we approach a ‘solar maximum’ in 2025: a period of intense solar activity which occurs every 11 years.
CLIMATE CHANGE | When discussing potential hazards to the electricity grid, the elephant in the room is climate change, and unfortunately the issue here is two-fold. For nations to reach their net-zero emissions targets, fossil fuel
Lizzie Knight discusses how the electricity grid is susceptible to global hazards and what can be done to improve its resilience
usage must be dramatically reduced over the next few decades. As electricity can be generated from renewable net-zero energy sources like wind and solar power, we will see increasing ‘electrification’ of industrial and domestic processes, displacing the need for fossil fuels. Transitioning to a net-zero society will create huge structural change within the energy industry, as power grids must both adapt to new energy sources and expand to service the increasing demand for electricity.
As we saw above, power grids are susceptible to adverse weather conditions, and climate change is making extreme weather events increasingly common. As detailed in BlueSci Issue 48, a warmer climate increases the likelihood of heatwaves, flash floods, hurricanes, and wildfires. Data from the US Department of Energy shows that power outages in the US linked to severe weather events have risen from around 50 per year in the early 2000s to more than 100 per year on average over the last five years.
CREATING A MORE RESILIENT GRID | As society moves away from fossil fuels, the electricity grid must ensure the uninterrupted availability of electrical power, withstanding and quickly recovering from any disturbances. Fortunately, there are many organisations and stakeholders working to improve the resilience of the electricity grid.
The Convergence Hub for the Exploration of Space Science (CHESS) is a US-based research project that aims to improve society’s resilience to space weather. As well as predicting hazardous space weather events, CHESS investigates the potential impacts of space weather on the electricity grid. In April 2022, CHESS ran a workshop for researchers, policymakers and grid operators, where they simulated a large solar plasma ejection. They determined how the event might affect the power grid in the northeastern US and mapped out communication lines between different institutions. This community-wide exercise served not only to improve the future resilience of the grid, but also to expand interdisciplinary connections within the entire ‘sun-to-power-grid’ system. Finally, the workshop made several recommendations to policymakers, such as conducting
further research into how the power grid relies on telecommunication systems (which themselves might be damaged during space weather events).
In the UK, the UKRI-funded ARIES project (Adaptation and Resilience in Energy Systems) will investigate how climate change may affect the UK’s energy systems. ARIES will model the impacts of changing weather conditions on current and emerging energy generation technologies, and how the energy grid might withstand these emerging hazards. ARIES will also investigate how climate change might affect the availability of weather-dependent energy sources like wind power.
ENERGY SECURITY FOR THE FUTURE | The UK National Grid has unveiled a new ‘Whole System’ approach to energy production, encouraging collaboration between industries to build a resilient, fair, and affordable energy system for all consumers. This was followed by the recent announcement of a ‘Future Systems Operator’, a new government body which will oversee the UK’s energy system. As the size and complexity of the energy grid increases, collaboration and interdisciplinary discussion are vital. But as helpful as these new strategies may be, policymakers must put actions behind their words if we are to ensure energy security as we head towards net-zero. Society’s dependence on electricity is far greater than when the grid was first developed, and future hazards affecting the electricity grid could have a far more devastating impact than they would today. We must ensure that new infrastructure is built with the resilience to meet future energy needs.
Lizzie Knight is a 4th-year PhD student in Earth Sciences at Fitzwilliam College. She is interested in science policy, the energy transition, and 'net-zero' solutions. When not thinking about science, Lizzie can normally be found in a rowing boat. Illustration by Sumit Sen.
Clearing the Confusion about Nuclear Fusion
Xavior Wang discusses the promises and challenges of fusion energy
What comes to mind when you see the word “nuclear”? Is it a post-apocalyptic hellscape engulfed by the flames of war, or an advanced civilisation fuelled by limitless energy? What, then, about the phrase “nuclear fusion”? Does that repaint the image in deeper shades of red or green? Nuclear energy has been heavily stigmatised in the wake of the Cold War-era nuclear arms race and the haunting tales of Fukushima, Chernobyl, and Three Mile Island. This has not only hindered the development of nuclear plants, but also cast a shroud of mystery, fear, and misinformation over adjacent technologies – technologies like nuclear fusion. It is more important now than ever to dispel the myths surrounding fusion, for it will eventually form the bedrock of our energy landscape.
THE DEFINITION OF FUSION | Nuclear energy, as people have come to know and fear it, most commonly refers to nuclear fission – the splitting of heavy elements into lighter ones to produce energy. In contrast, nuclear fusion refers to the binding of light elements into heavier ones, which also releases energy. Superficially, fusion and fission seem to be two peas in a pod with pedantic differences. But in practice, they could not be more different.
Fusion is a process typically associated with stars like our Sun, and the endeavour to build fusion reactors is nothing less ambitious than to create a miniature Sun. In the Sun, hydrogen nuclei collide with astounding speed and under immense pressure, while in fusion experiments, hydrogen isotopes (the same element with a different number of neutrons) are subjected to temperatures 7 times hotter than the Sun’s core. The result is a helium nucleus and an inconceivable amount of energy. Here’s a useful analogy: picture a rock sitting precariously atop a hill. Push it, and it will roll down the slope, slamming into a tree. Likewise, the conversion of multiple hydrogen nuclei into one helium nucleus is a downhill journey, and the sorry state of the tree reflects the associated energy release.
However, this process doesn’t occur spontaneously. Instead, there is an intricate interplay between two fundamental forces of nature. The electromagnetic force keeps two
positively charged nuclei apart, and its effect extends ad infinitum. Conversely, the strong nuclear force binds nucleons tightly together, but only works over very short distances. Hence, only when the nuclei come very close together, under extreme temperature or pressure, will they fuse. This is like having a wall between the rock and the valley – some work needs to be done to raise the rock over the wall for it to begin its descent.
THE ULTIMATE SOLUTION | Compared to conventional energy sources, the benefits of fusion sound almost fictional. Fusion is the most fuel-efficient process humanity can harness yet. One kilogram of fusion fuel is equivalent to ten million kilograms of fossil fuels, or four kilograms of fission fuel. Fusion reactors also utilise abundant resources. The major ingredient of fusion is deuterium, a hydrogen isotope which can be obtained by purifying seawater. The other reactant is tritium, a different isotope that exists in trace amounts in nature, but research is underway to integrate its production into the fusion process. Conversely, non-renewable sources like coal, natural gas, and oil are estimated to deplete within the next hundred-odd years. Perhaps the greatest advantage of fusion is the clean and safe waste. A fusion reaction produces none of the pollutants and greenhouse gases plaguing our Earth today, but only helium – an inert and harmless gas. The only radioactive material involved is tritium, with a half-life of 12.3 years, as opposed to fission reactions, which leave a cocktail of radioactive wastes lasting up to billions of years for future generations to inherit.
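The contrast in waste lifetimes can be made concrete with the standard exponential decay law. A minimal sketch (the 12.3-year half-life is from the text above; the function and the time points chosen are our own illustration):

```python
def remaining_fraction(years: float, half_life: float = 12.3) -> float:
    """Fraction of a radioactive sample left after a given time,
    following N/N0 = (1/2)^(t / half_life)."""
    return 0.5 ** (years / half_life)

# After ten half-lives (about 123 years), less than 0.1% of tritium remains
print(remaining_fraction(123))  # ~0.00098
```

On this timescale, tritium waste effectively vanishes within a couple of human lifetimes, whereas long-lived fission products would require the same calculation with half-lives measured in millennia.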
THE CURRENT SITUATION | As amazing as it sounds, fusion is still a work in progress. There are currently two prominent methods to achieve fusion: magnetic confinement, where fuels are injected and circulated within a doughnut-shaped magnet as white-hot plasma, and inertial confinement, where powerful lasers focus their beams onto a small fuel pellet to cause an implosion. A crucial threshold to measure the success of fusion reactors is the breakeven point, where the useful energy produced matches the energy required to run the reactor. This is quantified by the Q-value, such that exceeding Q=1 means more energy output than input. Thus far, the best result comes from the National Ignition Facility (NIF) in the USA, reporting a Q=0.70 from its inertial confinement reactor.
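The gain factor Q described above is simply the ratio of useful energy out to energy in. A minimal sketch (the 1.4 MJ and 2.0 MJ figures are illustrative round numbers chosen to land near NIF's reported Q, not official experimental data):

```python
def q_value(energy_out_mj: float, energy_in_mj: float) -> float:
    """Fusion gain Q: useful energy produced per unit of energy
    supplied to run the reactor. Q > 1 means net energy gain."""
    return energy_out_mj / energy_in_mj

q = q_value(1.4, 2.0)
print(q)       # 0.7 -- below the breakeven point
print(q > 1)   # False
```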
Looking forward, the highly anticipated International Thermonuclear Experimental Reactor (ITER), a magnetic confinement reactor currently being built in France by an international collaboration, is scheduled for completion in 2025 and could potentially achieve a staggering Q=10. Developments for commercial fusion are also underway: DEMO, a class of prototype reactors, is planned to be operational around the world by 2050; the UK has plans to build one in Nottinghamshire by 2040; and ARC, a compact reactor developed by MIT, could begin construction as early as 2025. Experts believe that with an optimistic outlook in funding and technological advancement, the second half of the century will witness the rise of fusion power in electricity generation.
CAVEATS AND COMPLICATIONS |
The possibility of fusion energy manifesting within our lifetime is a great cause for celebration. However, the road ahead is fraught not only with difficult scientific and technical problems, but also with a myriad of political, social, and economic challenges. Scientists and engineers face off against the four horsemen of fusion:
1. Attaining and regulating temperatures over 100 million °C.
2. Developing materials capable of withstanding the steep temperature gradient and merciless particle bombardment.
3. Breeding and handling the scarce tritium fuel.
4. Maintaining the reactor remotely with robots.
Therein lies the crux of the matter: fusion is expensive – really expensive. The ITER collaboration, with an initial budget of €6 billion, is currently rocking a price tag of €22 billion. For perspective, the James Webb Space Telescope costs $10 billion, while CERN’s Large Hadron Collider goes for a measly $4.75 billion.
The hardest part is convincing people that fusion is worthwhile and safe. This requires dissociating fusion from the nuclear anxiety that stems from the history of nuclear meltdowns and the fear of nuclear warfare, especially in knee-jerk reaction to Putin’s nuclear threats amidst the ongoing Russo-Ukrainian War. Indeed, the debate on nuclear power is on an unfortunate trajectory, deviating from evidence-based reasoning and driven by the fear-mongering of anti-nuclear organisations. But the facts, if only they were rationally considered, are difficult to contest: fusion plants will not melt down, and cannot be weaponised. A fusion reactor does not operate on chain reactions, and any slight disruption will halt the reactor in seconds. Furthermore, a fusion-powered warhead cannot be made with the limited fuel present in a reactor, and would also require an additional fission bomb to detonate.
THE FUSION CONVERSATION | At present, nuclear fusion remains an obscure subject that most people know little about, and it is certainly not a major topic on the government agendas. But as scientific progress accelerates in the coming decades, political and social conversations ought to keep up. And when the technology is ready for the world, the world needs to be ready for it. There is a running joke that fusion energy is 30 years away and always will be, but let its delay be the fault of scientists and engineers, not politicians and protesters.
Xavior Wang is a third year Physics student at St Edmund's College. His interests span the spectrum of length scales, from nuclear fusion and quantum computation to astrophysics and cosmology. Illustration by Pauline Kerekes.
Quantum Internet
An Entangled World
NOBEL PRIZE | Last October, the Nobel Prize in Physics was awarded to Alain Aspect, John Clauser and Anton Zeilinger, who independently conducted ground-breaking experiments on quantum entanglement with photons – the particles that make up light. This recognition is significant, as their work underpins the field of quantum information, which is rapidly gaining momentum.
We are so reliant on the internet that if the internet were to break down, it would have a devastating impact. Security and reliable transmission of messages are at the heart of the internet. The modern internet encodes messages into light signals, which are transferred from one location to another via fibre optic cables. These messages are often encrypted so that if the wrong person intercepts the message, the classical computers that we use today won’t be able to decrypt the message to figure out what it says.
A fundamental issue with classical cryptography is that it is often based on encryptions that are very difficult for a ‘classical computer’ to decrypt. Unfortunately, with the advent of quantum computers, the performance and efficiency of calculations will be so incredibly high that it will become very easy to crack such encryptions using quantum algorithms. This is where a quantum internet which also harnesses the principles of quantum mechanics becomes very important.
CRASH COURSE ON QUANTUM MECHANICS | Superposition, collapse under measurement and entanglement are three key principles of quantum mechanics that we can utilise. Entanglement will be the primary focus of this article, but to understand entanglement, we need to understand the other two principles first.
There are many physical quantities we can measure about a system or object - its position, spin or energy, just to list a few. For classical objects, these properties are well-defined given some initial conditions - for example, once we kick a football in the air, it will spin in air and there is a velocity at which it will land on the ground. We can use this information to completely describe the state of the ball when it lands. However, a quantum object can be ‘undecided’ on its properties, meaning that there will be a probability distribution over the values it can take for a physical quantity; the presence of these probability distributions is called a superposition.
Let’s consider a special case where a quantum object can take one of only two measurement outcomes with a particular probability. Such a quantum object is called a qubit. For example, the vertical component of an electron’s spin can only be either ‘up’ or ‘down’, each with a particular probability. But after measuring it, its vertical spin will be instantaneously ‘decided’ and the measurement outcome will define its evolution from that point. So from the point at which the measurement is made, its value is determined and hence the range of values it can take ‘collapses under measurement’.
Now that we’ve grasped superposition and collapse under measurement, it becomes slightly easier to understand what entanglement is. Entanglement is a way that distant photons can interact with each other, to make quantum communication possible between them. Quantum objects are said to be ‘entangled’ to each other if a different measurement outcome of a quantity of any one of these objects will result in a different overall state of the others, i.e. there is a correlation between the values that each object takes. Importantly, this interaction is instantaneous. Zeilinger and colleagues reported in 2015 that they were able to create entanglement between photons that were separated by 143 km between two Canary islands. This work was a major step in demonstrating that quantum entanglement could be
Shreyas Iyer introduces the reader to the strange and remarkable implications of a quantum internet and its applications in secure communication
used for long-distance quantum communications. Since then, China has launched its famous Micius satellite, the world’s first quantum communications satellite, which was used to create entanglement between ground stations more than 1,120 km apart.
WHAT QUANTUM COMMUNICATION ISN'T |
We have seen that the overall state of entangled objects miles away can seemingly change instantaneously when we perform a measurement on a local particle. Does this mean we can have faster-than-light communication? This turns out not to be the case; let’s see why.
Let us introduce Alice (A) and Bob (B), who are in distant locations and share an entangled state of two photons that can take two possible polarisations ‘0’ or ‘1’ with equal probability (giving their superpositions). They are entangled so that if A’s photon is measured to have some polarisation, then B’s photon must also have that same polarisation, and vice versa. For example, if A measures her photon to be 0 then when B measures his photon he must also find that it’s 0.
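The perfect correlation just described can be mimicked with a toy simulation. Note the heavy caveat: this is a classical shared-randomness stand-in that reproduces only the agreement between Alice's and Bob's outcomes; the Nobel-winning Bell-test experiments showed that no classical model of this kind can reproduce the full statistics of entanglement. The function name and structure are our own invention for illustration.

```python
import random

def measure_entangled_pair(rng=random):
    """Toy stand-in for the perfectly correlated pair described above:
    each polarisation outcome is 0 or 1 with equal probability, and
    Alice's and Bob's outcomes always agree."""
    outcome = rng.choice([0, 1])   # the pair 'decides' only upon measurement
    return outcome, outcome        # (Alice's result, Bob's result)

for _ in range(5):
    alice, bob = measure_entangled_pair()
    assert alice == bob            # outcomes always match
```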
Interestingly, there is a ‘no-signalling principle’, which states that Alice cannot convey any information to Bob by performing only local operations; Alice and Bob must use some other form of communication to tell each other what is going on. Bob therefore cannot observe any change due to Alice’s actions faster than the speed of light, since he cannot be sure what she did or what she observed. This protects the principle that no information can travel faster than light, but it raises the question of what quantum communication is even good for, if it cannot work independently of other forms of communication. In the next section, you’ll see how the quantum internet will in fact pick up where our current internet has left off. To do these things, it becomes especially important to fully utilise measurement collapse.
ENTANGLEMENT SWAPPING AND ERROR CORRECTION |
The classical internet uses fibre optic cables to transfer information between locations. However, over long distances, the light signal decays as it passes through the cable. As a result, we use intermediate ‘repeaters’ to amplify the signal and allow for long-distance transmission. Fibre optics can similarly be used for the quantum internet, but in the quantum setting there is a ‘no-cloning theorem’, which forbids such amplification: a quantum signal cannot be copied exactly. Instead, ‘quantum repeaters’ use a process called “entanglement swapping”, which we will look at in more detail.
Suppose now that Alice and Bob each have their own entangled photon pair, and they want a shared entangled photon state between them to be able to communicate, but they are too far apart (so A cannot just send one of her photons to B, as it will deteriorate). We can introduce a third party, the Repeater R, directly halfway between A and B. Now A and B can
each send one of their photons to R, and these photons will have a much greater chance of arriving, as they travel half the distance. R will conduct some local operations, including an intermediary measurement, and this will cause the leftover photons of A and B to become entangled! R is a quantum repeater, and we can continue to slice the transmission distance in this way, introducing more quantum repeaters until each hop is of a reasonable length. This process of ‘entanglement swapping’ is also how Zeilinger and colleagues were able to create entanglement over such an immense distance.
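Why does halving the distance help so much? Photon loss in optical fibre grows exponentially with distance (roughly 0.2 dB per km for standard telecom fibre), so each halving improves a photon's survival odds dramatically. A simplified sketch of this standard loss model (it ignores the repeater's own overheads, and the distances are chosen only for illustration):

```python
def transmission_prob(km: float, loss_db_per_km: float = 0.2) -> float:
    """Probability that a photon survives a fibre link of the given length,
    under the standard exponential-loss model: 10^(-loss * km / 10)."""
    return 10 ** (-loss_db_per_km * km / 10)

print(transmission_prob(200))  # 0.0001 -- a direct 200 km link
print(transmission_prob(100))  # 0.01   -- each 100 km half-link via a repeater
```

A photon is a hundred times more likely to survive each 100 km half-link than the full 200 km span, which is exactly why chaining repeaters makes long-distance entanglement practical.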
While using quantum repeaters means we may not have to worry about our photons reaching their destinations, there will be some inevitable accumulation of noise. Thus, we’ll have to conduct some error correction to remove this noise, which involves adding some redundancy to our encoded data. Making error correction scalable over long distances is an area of ongoing research, as this currently requires a lot of computing power and for us to produce a large entangled state of photons.
QUANTUM KEY CRYPTOGRAPHY | Quantum key cryptography improves upon classical cryptography by combining classical communication with quantum mechanics to produce ‘keys’ that encrypt and decrypt messages, and which cannot be figured out by eavesdroppers equipped with quantum computers. An eavesdropper Eve (E), trying to decipher the communication between Alice and Bob, will have to take some measurement to get information from their messages. But with measurement comes a collapse that Alice and Bob may be able to notice, and this is the basis for all quantum cryptosystems. For example, quantum key distribution protocols detect eavesdropping, and as long as it remains below a certain level, A and B can correct for it and generate a secret key that can be used for provably secure communication. Quantum key cryptography generally uses a lot of resources: it requires many qubits to be sent to create a key, and each key can be used only once.
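The collapse-based detection described above can be sketched with a simplified BB84-style simulation. This is a toy model, not a faithful protocol implementation: an intercept-resend eavesdropper must guess measurement bases, and each wrong guess collapses the state, corrupting about a quarter of the bits Alice and Bob later compare.

```python
import random

def bb84_error_rate(n: int = 2000, eavesdrop: bool = False, rng=random) -> float:
    """Simplified BB84-style sketch. Alice encodes random bits in random
    bases; Bob measures in random bases; only matching-basis rounds are
    kept (the 'sifted key'). An intercept-resend eavesdropper collapses
    states she measures in the wrong basis, causing ~25% sifted errors."""
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.randint(0, 1) for _ in range(n)]
    bob_bases   = [rng.randint(0, 1) for _ in range(n)]

    errors = kept = 0
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        sent_bit, sent_basis = bit, a_basis
        if eavesdrop:
            e_basis = rng.randint(0, 1)
            if e_basis != sent_basis:          # wrong basis: collapse to random
                sent_bit = rng.randint(0, 1)
            sent_basis = e_basis               # Eve resends in her own basis
        bob_bit = sent_bit if b_basis == sent_basis else rng.randint(0, 1)
        if b_basis == a_basis:                 # sifting: keep matching bases
            kept += 1
            errors += (bob_bit != bit)
    return errors / kept

print(bb84_error_rate(eavesdrop=False))  # 0.0
print(bb84_error_rate(eavesdrop=True))   # ~0.25
```

A clean channel shows no errors in the sifted key, while Eve's presence pushes the error rate to around 25% — a tell-tale signature Alice and Bob can check for before trusting the key.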
CLOSING REMARKS | We’ve seen that quantum communication and classical communication have clear differences. While quantum mechanics imposes some theoretical limitations on quantum communication, by combining it with classical communication we can improve the security of our messages. Yet there are still practical limits: we cannot yet create enough qubits to use error correction for communication over long distances, and for quantum information processing to work we’ll need quantum memory storage devices. However, once these issues are solved, quantum communication will fundamentally change how our devices interact, and the quantum internet will be born.
Shreyas Iyer is a Master's student studying Mathematics at St Catharine's College. Shreyas continues to explore the field of quantum computing as a part of his degree, and believes it has the potential to revolutionise the world. Illustration by Josh Langfield.
Weird and Wonderful
Shaking Up Scientific Publishing: eLife Announces Controversial New Publishing Plan
In October 2022, the nonprofit life sciences journal eLife announced a radical new model for scientific publishing. In conventional journals, only the final version of an accepted research article sees the light of day. Peer review – the essential process of vetting by other experts in the field to maintain the quality and credibility of scientific practice – is mostly kept behind the scenes.
In their new model, eLife will no longer ‘accept’ or ‘reject’ articles which have been through peer review, instead publishing every article sent for review along with reviewers’ comments. Authors can choose to address these concerns – or not. This move sparked controversy amongst the scientific community. Many researchers praised the shift towards open science and transparent publishing. Others expressed concerns over biases imposed by the journal’s editors who would now be the sole gatekeepers of work selected for publication, or the possibility that bad science could make it to the public eye with the endorsement of being called ‘peer-reviewed’.
According to eLife’s Editor-in-Chief Michael Eisen, “the future of science publishing is author directed publishing (preprints) combined with multifaceted, ongoing, public post-publication peer review.” Scientific publishing is a huge yet flawed industry, and although progress is being made towards a fairer system for scientists, only time will tell whether eLife’s bold experiment will drive science in the right direction. HS
Social Scientists: Our Favourite Online Influencers
When you picture an Instagram influencer, a lab-coated, safety-spec-wearing scientist may not come to mind. However, the outbreak of COVID-19 and subsequent lockdowns triggered an increase in the use of “distance learning” and social media for both education and communication. Instagram, with over one billion users, provides the perfect platform to communicate science and inspire the next generation of researchers. Here are three of our favourite science Instagram accounts to keep you occupied on your morning commute. Best for…
PhD motivation: @Paigeinscience
Paige White provides helpful tips on how to navigate life as a new PhD student. If you want to avoid academic burnout, perfect your scientific writing or improve productivity, this is the account for you.
Career advice: @Soph.talks.science
Dr Sophie Milbourne is a former stem cell biologist turned science communicator. Her account Soph.talks.science features a weekly roundup of the latest science communication opportunities and courses – great for any aspiring science communication professional.
Photography: @natgeo
With over 200 million followers, National Geographic is the world’s top non-celebrity Instagram account. Whether it be the first view of an eclipse or a tiger bathing in a river, @natgeo features an abundance of breathtaking imagery and information about the natural world. Science Instagram accounts can be both informative and inspiring. Why not incorporate Instagram into your daily life as a scientist? Whether you’re snapping a colourful experiment, taking a selfie on a field trip or recording a timelapse of your day in the lab…you could be the next Influencer! LB
Weird and Wonderful Authors:
HS - Holly Smith
LB - Libby Brown
TW - Tasmin Wood
Artwork by Josh Langfield.
Diabolically Ironclad
Beetles may be small but some are much tougher than they look. The spooky-looking diabolical ironclad beetle can even survive being run over by a car. These impressive insects are part of the sub-family of beetles known as Zopherinae and their distinctive wing casings, called elytra, can resist up to 149 Newtons of force. Their strength has even influenced the ways we humans create materials and engineer joints in complex structures.
The sutures of these beetles are of particular interest. Sutures are found on the inner edges of both wing cases and are shaped like stamp perforations, which interlock to join the two elytra together. These sutures dissipate external pressure across the wing casings, so that no region of the elytra is particularly prone to stress. The wing casings can therefore resist much higher forces before fracturing.
In addition to linkages, the elytra consist of laminated structures. Under tension these structures delaminate, dissipating energy. So, if pressure on the wing cases is great enough to break them, then the laminated microstructures reduce further impact by preventing fissures at the edges. Instead of the elytra disconnecting, their connecting sutures swell and lock them together. This way, when ironclad beetles do break, their elytral connection is reinforced and protects more vulnerable internal structures. Sounds like we puny humans have a lot to learn. TW