Research Outreach Issue 107


Connecting science with society ISSN 2517-7028 ISSUE 107

FEATURING RESEARCH FROM:

University of Catania and National Institute for Nuclear Physics; Japan Aerospace Exploration Agency; University of Cambridge; University of Illinois; University of Edinburgh; National Institute of Advanced Industrial Science and Technology; Stony Brook University; Spelman College; Kaetsu University; Kyoto University of Advanced Science; Frankfurt University of Applied Sciences; INSEAD and Collegio Carlo Alberto; Kyoto University; Harvard Medical School; Purdue University; University of Freiburg; Savannah River National Laboratory, Clemson University, the University of South Carolina, and Savannah River Consulting; Universidad Católica del Norte of Chile; University of Guelph; USDA Forest Service; Invicro; University of Pennsylvania; University of Lynchburg and University of Georgia Costa Rica; University of New Mexico; Baylor College of Medicine; The Quaich Inc.; Erasmus Medical Centre; National Prostate Cancer Register; Moffitt Cancer Center, Polish Academy of Sciences and the Helmholtz Centre for Infection Research; The Ministry of Health of Russia and IlmixGroup; Avera Cancer Institute; Seoul National University Bundang Hospital; Institute of Biomedical Sciences, Academia Sinica.


Serving society by making science open, inclusive, and accessible.


WELCOME TO ISSUE 107

Research Outreach continues to showcase research from a broad range of institutions around the globe. In this issue, we are excited to celebrate some of the pioneers driving the latest discoveries and innovations in research.

We speak to Michael Matlosz, President of grassroots association EuroScience, which believes that science forms the basis for the continued health and prosperity of Europe and its citizens. Michael explains how EuroScience supports, encourages and represents European scientists, and discusses some of the challenges facing European scientists today.

The Global Footprint Network is an international research organisation that is changing the way the world uses its natural renewable resources by publishing simple, scalable data for everybody. Co-founder Mathis Wackernagel tells us how Global Footprint Network’s initiatives are helping advance and communicate the science of sustainability.

We hope you enjoy this issue of Research Outreach as much as we’ve enjoyed putting it together.

Rachel Perrin, Editor

Please feel free to comment or join the debate. Follow us on Twitter @ResOutreach or find us on Facebook: https://www.facebook.com/ResearchOutreach/

500 Women Scientists – www.500womenscientists.org – Twitter/Facebook: @500WomenSci

THIS ISSUE
Published by: Research Outreach
Founder: Simon Jones simon@researchoutreach.org
Editorial Director: Emma Feloy emma@researchoutreach.org
Operations Director: Alastair Cook audience@researchoutreach.org
Senior Editor: Hannah Fraser hannah@researchoutreach.org
Editor: Rachel Perrin rachel@researchoutreach.org
Designers: Craig Turl, Sarah Sharpe, Carlton Hibbert
Global Project Director: Julian Barrett julian@researchoutreach.org
Project Managers: Tobias Jones tobias@researchoutreach.org, James Harwood james@researchoutreach.org
Contributors: Olivia Azouaghe, Rachael Baker, Leonardo Bernasconi, Tim Davies, Jessica Dean, Christopher Evans, Siobhan Fairgreaves, Sara Firman, Emma Green, Natasha Hancock, Rebecca Ingle, Sam Jarman, Matt Jarvis, Gillian Livesey, Kate McAllister, Helena Parsons, Emily Porter, Kate Porter, Jyoti Sood, Niall Taylor, Laura Turpeinen, Sam Wainwright, Polly Wells, Stuart Wilson.
/ResearchOutreach /ResOutreach

Copyright © and ™ 2019 Research Outreach

CC BY This work is licensed under the Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ or send a letter to Creative Commons, PO Box 1866, Mountain View, CA 94042, USA.



6

NUMEN PROJECT: EXPLORING KEY ASPECTS OF NEUTRINOLESS DOUBLE BETA DECAY BY NUCLEAR REACTIONS Professor Francesco Cappuzzello and Dr Clementina Agodi Using cutting-edge technology to enhance our understanding of fundamental physics.

10

AKATSUKI: PIONEERING THE PLANETARY METEOROLOGY OF VENUS Professor Takehiko Satoh Understanding the cause and maintenance mechanism of the Venus environment.

14

RECONSTRUCTING ASTRONOMICAL IMAGES WITH MACHINE LEARNING Edward Higson Bayesian inference and machine learning methods applied to astrophysics and cosmology.

18

22

26

REFINING THE SEARCH FOR THE LARGEST GRAVITATIONAL WAVES A. Miguel Holgado Exploring gravitational wave signatures of black holes, neutron stars and white dwarfs.

THE MATHEMATICS OF LIFE: METABOLIC CONTROL IN LIVING CELLS Dr Diego A. Oyarzún Using mathematics to understand the function of natural networks.

30

THE MATERIALS MAKING POTASSIUM-ION BATTERIES POSSIBLE Dr Titus Masese Exploring high performance next-generation rechargeable battery systems.

34

38

42

58

SUSTAINABLE WATER PURIFICATION USING BIOMASS Dr Benjamin S. Hsiao Using nanocellulose-enabled membranes and adsorbents for water purification.

INSTITUTIONAL INVESTORS AND INFORMATION ACQUISITION: IMPLICATIONS FOR ASSET PRICES AND INFORMATIONAL EFFICIENCY Adrian Buss Investigating the impact of financial frictions and institutional investors on asset prices and financial-market efficiency.

62

82

INQUIRY LEARNING: EMPOWERING AFRICAN AMERICAN WOMEN IN STEM Dr Leyte Winfield Teaching chemistry through authentic, culturally relevant learning experiences.

HONEYCOMB WINGS CREATED BY NATURE’S MECHANICS Dr Kaoru Sugimura Exploring the physical principles behind the regulation of cell mechanics.

66

DROSOPHILA FEZF FOUND TO BE ESSENTIAL IN NEURAL CIRCUIT FORMATION Dr Matt Pecot Understanding how the nervous system is assembled during development.

86

MEASURE WHAT YOU TREASURE Global Footprint Network Education initiatives to reduce our ecological deficit.

70

46

INTEGER-DIMENSIONAL FRACTALS OF NONLINEAR DYNAMICS Dr Zonglu He Exploring the control mechanisms of nonlinear dynamics.

50

54

WILL AN OLD PROBLEM YIELD A NEW INSIGHT? Jim Tilley Finding alternative solutions to one of the most famous mathematical conundrums, the 4-colour problem.

MEASURING SHARED KNOWLEDGE WITH GROUP FALSE MEMORY Dr Yoshiko Arima Estimating the complexities of our collective intelligence through shared knowledge structure.

THE ROLE OF SOCIAL CAPITAL IN THE HEALTH DEVELOPMENT OF CHILDREN Andreas Klocke and Sven Stadtmüller Is there a causal effect of social capital on the health of children?

74

78

PREDICTING PROTEIN FUNCTION AND ANNOTATING COMPLEX PATHWAYS WITH MACHINE LEARNING Dr Daisuke Kihara Developing new techniques for computational protein function prediction.

MACHINE METAPHORS: ETHICAL AND PHILOSOPHICAL CHALLENGES IN SYNTHETIC BIOLOGY Dr Joachim Boldt Using machine metaphors to describe biological systems, and how language affects our approach towards synthetic biology research.

IN-SITU MONITORING OF MICROBIAL CIRCUITRY Dr Charles Turick Electrochemical techniques to monitor microbial metabolic activity.

Our goal is to engage people by helping them fall in love with the most significant and intriguing challenges we are facing – how to thrive within the means of our planet.


MATHIS WACKERNAGEL, CO-FOUNDER OF THE GLOBAL FOOTPRINT NETWORK Page 43 Nikifor Todorov/Shutterstock.com

www.researchoutreach.org


CONTENTS

106

THE ACIDIC BRINE LAKES OF CHILE: A SURPRISING MICROBIAL COMMUNITY Drs Cecilia Demergasso and Guillermo Chong Characterising the microbiota of an acidic brine lake in Chile.

SECRETS OF THE STALK: REGULATING PLANT TEMPERATURE FROM THE INSIDE OUT Dr Peter Kevan How do plant stems help to regulate temperature?

90

110

ARE WILDFIRES FOLLOWING BARK BEETLES MORE SEVERE? Carolyn Sieg, Rod Linn, Chad Hoffman, Joel McMillin, L. Scott Baggett, Judy Winterkamp, Francois Pimont Computer-based models to better understand the complex interplay of factors behind forest fires.

94

ADVANCES IN CNS DRUG DEVELOPMENT Dr Eugenii (Ilan) Rabiner Molecular and functional imaging to understand brain pathophysiology and facilitate drug development.

98

GOING WITH THE FLOW: WATER QUALITY AND COMMUNITY HEALTH IN COSTA RICA Dr Thomas Shahady Understanding the relationship between water quality, community health and water use in Costa Rica.

Physical Sciences

Informatics & Technology

EUROSCIENCE: SUPPORTING SCIENTISTS ACROSS EUROPE TO WORK TOGETHER FOR A BRIGHTER EUROPEAN FUTURE EuroScience Enhancing the contribution of science, technology and innovation.

114

118

FRUIT FLIES HELP SHED LIGHT ON DRUG DISCOVERY FOR ALS Drs Nancy Bonini and Leeanne McGurk Promising new molecules could help lead to new treatments for amyotrophic lateral sclerosis (ALS).

102

DRILLING FOR KNOWLEDGE: A COLLABORATIVE APPROACH TO OCEAN DRILLING IN THE ARCTIC Dr Lindsay Worthington Subsurface imaging of the Earth’s crust to understand geologic processes.

HOMELESSNESS IN WESTERN SOCIETY: THE DARK SIDE OF THE MOON Dr Fabrizia Faustinella Exploring the root causes of homelessness and the challenges of street life.

A FRAMEWORK FOR GLOBAL HEALTH PROMOTION: THE CIRCLE OF HEALTH Patsy Beattie-Huggan, Professor Dr. med. Kirsten Steinhausen & Stefanie Harsch Supporting knowledge translation and innovation in applications of the Circle of Health© (COH).

122

USING EHEALTH TO MONITOR IDIOPATHIC PULMONARY FIBROSIS Dr Marlies Wijsenbeek and Dr Karen Moor Evaluating a new home monitoring programme for idiopathic pulmonary fibrosis.

126

NPCR: NEXT GENERATION CANCER REGISTER The National Prostate Cancer Register A tool for quality assurance and quality improvement of health care for all men with prostate cancer of all ages in Sweden.

Arts & Humanities

Behavioural Sciences

Health & Medicine

130

TOWARDS A QUANTITATIVE PERSONALISED ONCOLOGY Dr Heiko Enderling Integrating mathematical, biological and clinical sciences to model, simulate and predict treatment response for individual patients.

134

EXCITING ADVANCEMENTS IN OVARIAN CANCER TREATMENT Lev Ashraphyan, Vsevolod Kiselev, Ekaterina Muyzhnek, Gennady Sukhikh Fresh insights into the biology and progression of ovarian cancer may lead to new therapies.

138

FLUORESCENT IMAGING SHEDS NEW LIGHT ON APOPTOSIS Dr Nandini Dey Novel tools to explore the physical and biochemical processes behind apoptosis.

142

RNA SEQUENCING REVEALS SECRETS OF SKIN AGING Dr Jeong-Sun Seo Signs of skin aging are directly linked to over-exposure of UV light.

146

ADHD AND ITS COMORBIDITIES: IMPLICATIONS FOR MANAGEMENT Dr Wen-Harn Pan Investigating risks for attention deficit hyperactivity disorder and implications for its management.

150

APOLLO: FIFTY YEARS ON

Biology

Earth & Environment



Physical Sciences ︱ Professor Francesco Cappuzzello and Dr Clementina Agodi

NUMEN Project: Exploring key aspects of neutrinoless double beta decay by nuclear reactions

The MAGNEX spectrometer at INFN-LNS

The Standard Model of particle physics may represent our most advanced understanding yet of the universe’s fundamental building blocks, but many physicists believe it is incomplete. One of the most enticing prospects for updating the model lies with ‘neutrino-less double beta decay’ – a process which has been theorised for many decades, but has yet to be observed. Professor Francesco Cappuzzello at the University of Catania and Dr Clementina Agodi at the National Institute for Nuclear Physics (Laboratori Nazionali del Sud) are the lead researchers of the NUMEN Project. Representing the efforts of a global team of physicists, they ultimately aim to unveil the key nuclear aspects of neutrino-less double beta decay for the first time.

Beta decay is one of the most well-understood processes in subatomic physics. It occurs when an atomic nucleus is composed of particular arrangements of protons and neutrons which make it unstable, causing it to either release a beta particle (an electron) from one of its neutrons, turning it into a proton, or an anti-beta particle (a positron) from one of its protons, transforming it into a neutron. In some very rare cases, double beta decay can occur, where two neutrons will simultaneously transform into protons, or vice versa, releasing either two positrons or two electrons.

In our current understanding of particle physics, defined by a set of rules called the Standard Model, there are several fundamental values which need to be conserved in every process, one of which is the ‘lepton number’. Leptons are a family of fundamental particles which include positrons and electrons, along with chargeless and extremely light particles named neutrinos. Since electrons are ‘matter’ and positrons ‘antimatter’, the Standard Model dictates that during beta decay, they must be released alongside either an anti-neutrino or a neutrino, respectively. However, another type of beta decay has been predicted theoretically where this is not necessarily the case. In ‘neutrino-less double beta’ (0νββ) decay, two beta particles are emitted, but no neutrinos. On the surface, this process would appear to violate lepton number conservation, but for several decades, physicists have had reason to believe it may still be possible. Observing the process experimentally would therefore have profound consequences for our current understanding of fundamental physics.

BREAKING THE STANDARD MODEL
In 1937, Italian physicist Ettore Majorana made an intriguing proposal: that it could be possible for some particles to be their own antiparticles. We know for a fact that this can’t be the case for the majority of fundamental particles, since one particle cannot have both positive and negative charges, for example. However, based on our currently limited understanding of neutrinos, the possibility remains that they could be so-called ‘Majorana particles’. Since 0νββ-decay emits no neutrinos, its observation would confirm for the first time that elementary Majorana particles can exist.

“0νββ-decay is potentially the best resource to probe if neutrinos are their own antiparticles, as predicted by Majorana, and to extract their effective masses,” explains Professor Cappuzzello. “Moreover, if observed, 0νββ-decay will signal that the exact balance between matter and antimatter can be violated in nature.” The enormous disparity between the amounts of matter and antimatter in the universe is currently one of the biggest problems faced by physicists. Direct confirmation that 0νββ-decay can occur would, therefore, be a big deal for particle physics, nuclear physics and cosmology, potentially giving researchers the tools they need to re-draw the Standard Model to more accurately describe the fundamental constants of the universe. “Presently, this physics case is among the most important research into ‘beyond the Standard Model of Particle Physics’,” Professor Cappuzzello continues. “It might guide the way toward a Grand Unified Theory of fundamental forces (electromagnetic, weak and strong interactions) and unveil the source of matter-antimatter asymmetry observed in the Universe.”

THE RACE FOR 0νββ DETECTION
Shaking up our understanding of the most fundamental building blocks of the universe is, understandably, an enticing prospect for physicists. As Dr Agodi describes, this has meant that the hunt for 0νββ-decay has ramped up in recent years. “Despite the fact that the process has never been observed, there is a kind of worldwide race, involving large scientific collaborations and international laboratories in the fields of neutrinos, nuclear and particle physics, in order to detect it,” she says. Theoretical physicists have proposed a variety of methods to infer the extremely long decay time of the 0νββ process (more than 10¹⁶ times longer than the life of the universe), stemming from the rules which must be obeyed by atomic nuclei in all double-beta decay (ββ) processes.

“Since the ββ-decay process involves transitions in atomic nuclei, nuclear structure issues must also be accounted for, in order to describe it,” Professor Cappuzzello continues. “In particular, the key quantities are the so-called Nuclear Matrix Elements (NMEs), which express the probability that the parent nucleus can spontaneously convert into a daughter nucleus, as a result of ββ-decay.”

Detail of the Superconducting Cyclotron accelerator at INFN-LNS

Based on state-of-the-art theoretical calculations, physicists in worldwide studies are now aiming to evaluate the NMEs which control the 0νββ-decay time specifically. However, since it would be incredibly difficult to detect the process occurring directly, these calculations are faced with significant challenges in providing the meticulous levels of precision which physicists require in their experiments. “Despite the tremendous efforts and improvements achieved by nuclear structure studies, the ambiguities in the present models are still too large to provide NMEs with the necessary accuracy,” says Dr Agodi. In their research, Professor Cappuzzello, Dr Agodi and their colleagues are aiming to realise this level of accuracy for the first time, with the help of cutting-edge facilities based at the INFN Laboratori Nazionali del Sud in Catania.
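Schematically, the NME’s role in setting the decay time can be written down using a standard relation from the 0νββ literature (see, for example, the Vergados et al. review in the references below); this is textbook material rather than a formula quoted by the NUMEN team.

```latex
% 0\nu\beta\beta decay rate: phase-space factor G^{0\nu}, nuclear matrix
% element M^{0\nu}, effective Majorana neutrino mass m_{\beta\beta},
% and electron mass m_e.
\left[ T_{1/2}^{0\nu} \right]^{-1} = G^{0\nu}\, \bigl| M^{0\nu} \bigr|^{2} \left( \frac{m_{\beta\beta}}{m_e} \right)^{2}
```

This is why accurate NMEs matter so much: any uncertainty in M^{0ν} feeds directly into the neutrino mass extracted from a measured half-life. The quoted timescale is also easy to sanity-check: experimental lower bounds on the half-life are of order 10²⁶ years, while the universe is roughly 1.4 × 10¹⁰ years old – a ratio of around 10¹⁶, as stated above.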



OBSERVING A CLOSE ANALOGY
ββ-decays are not the only processes in physics where two charges are exchanged simultaneously. As Professor Cappuzzello explains, other ‘Double Charge Exchange’ (DCE) processes would be far easier to observe directly than 0νββ-decay. “In this scenario, the experimental study of DCE reactions, which are processes promoting analogous nuclear transitions as ββ-decay, could provide important information,” he says. “The advantage is that DCEs can be studied under controlled laboratory conditions.”

Professor Cappuzzello and Dr Agodi’s team take advantage of the fact that the processes underlying 0νββ-decay are comparable with those governing easier-to-observe DCE reactions. Therefore, the physicists suggest that 0νββ-decay NMEs could be inferred by observing DCEs occurring in one particular controllable environment: collisions between heavy, fast-moving ions. The team have attempted to do just this in previous studies. So far, however, their efforts have been hindered by the tiny probabilities by which the products of their heavy ion collisions were scattered – a value named the collision ‘cross-section’. With such imperceptible cross-sections, it has been incredibly hard for the physicists to know where to look for the collision products, where DCEs are occurring. But since these setbacks, new upgrades have been made to the INFN-LNS facility of Catania, owned by Italy’s National Institute for Nuclear Physics, which could make their task far easier.

With the new equipment, Professor Cappuzzello and Dr Agodi’s team now hope to make cross-section measurements of heavy ion collisions far more easily. “Recently, we established an innovative experimental approach at the INFN-LNS laboratory in Catania, which allows us to largely overcome such experimental challenges, and consequently, to extract quantitative information on NMEs from DCE reactions,” recounts Professor Cappuzzello.

INTRODUCING: THE NUMEN PROJECT
In 2014, Professor Cappuzzello, Dr Agodi and their colleagues unveiled their plans for the ‘NUclear Matrix Elements for Neutrino-less double beta decay’ (NUMEN) Project. “The aim of NUMEN is to investigate the nuclear response to DCE reactions for all the isotopes explored by present and future studies of 0νββ-decay,” says Dr Agodi. “Several aspects of the project require the development of innovative techniques, for both experiment set-up and theoretical interpretation of the collected data.”

At the heart of the new capabilities of the NUMEN Project are two state-of-the-art installations at the INFN-LNS laboratory in Catania. The first piece of equipment is named the K800 Superconducting Cyclotron – a particle accelerator capable of accelerating heavy ions to high speeds, while ensuring precise, head-on collisions between ions. Secondly, the MAGNEX spectrometer will analyse the products of these collisions to incredibly high degrees of precision. “While K800 accelerates the heavy-ion beams with the required high resolution and low emittance, MAGNEX detects the reaction products with large acceptance and high resolution in mass, energy and angle,” Professor Cappuzzello explains.

With the help of these facilities, along with new cutting-edge theoretical calculations which have been developed in view of the NUMEN results, the researchers soon hope to embark on the most extensive search for 0νββ NMEs ever undertaken. “The INFN-LNS facility is today unique for this research in a worldwide context, and will likely be in view of a major upgrade of the whole research infrastructure, which is going to start in the coming months,” says Professor Cappuzzello.

A PROMISING FUTURE
The NUMEN Project represents the coming together of minds encompassing a diverse range of fields in physics. As Professor Cappuzzello and Dr Agodi conclude, the researchers at NUMEN believe this is exactly what is needed to make a major new discovery. “NUMEN is a challenging project at the intersection of nuclear and neutrino physics, driven by an important physics case and opening interesting scientific scenarios and potential technological fallout,” they say. In the coming years, the researchers involved with the project look set to be at the forefront of the global race to unveil 0νββ-decay experimentally, and provide the first clear evidence for the existence of Majorana particles. If they are successful, a long-awaited shake-up to the Standard Model could finally be realised, allowing for an unprecedented opportunity to explore the most fundamental building blocks of the universe.

Behind the Research

The INFN-LNS laboratory in Catania

Professor Francesco Cappuzzello
E: cappuzzello@lns.infn.it T: +39 095542384 (Office) +39 3281646400 (Mobile) W: www.dfa.unict.it/docenti/francesco.cappuzzello W: https://www.lns.infn.it/ W: https://web.infn.it/NUMEN/index.php/en/

Dr Clementina Agodi
E: agodi@lns.infn.it T: +39 095542642 (Office) +39 3283780365 (Mobile) W: www.researchgate.net/profile/Clementina_Agodi W: https://www.lns.infn.it/ W: https://web.infn.it/NUMEN/index.php/en/

Research Objectives
The NUclear Matrix Elements for Neutrino-less double beta decay (NUMEN) Project is an innovative project based at the INFN-Laboratori Nazionali del Sud. Using cutting-edge technology, it aims to enhance our knowledge and understanding of fundamental physics.

Detail
Professor Francesco Cappuzzello and Dr Clementina Agodi, Via S. Sofia n. 62, 95123 Catania, Italy

Bio
Clementina Agodi is an experimental nuclear physicist working in the field of reactions and structure of atomic nuclei. She holds a degree in Physics from Catania University and is currently a senior researcher at INFN Laboratori Nazionali del Sud. In 2007–2015 she was a member of the Nuclear Physics board of INFN. Dr Agodi is Spokesperson of the NUMEN Project.

Francesco Cappuzzello is Associate Professor of Experimental Nuclear Physics at Catania University and Associate Researcher of the Istituto Nazionale di Fisica Nucleare. He was awarded his doctorate from Catania University in 1999, discussing a thesis about the MAGNEX spectrometer project. He is Spokesperson of the NUMEN Project.

Funding
Istituto Nazionale di Fisica Nucleare (INFN), Università di Catania

Collaborators
• Dr Manuela Cavallaro, INFN-LNS
• Dr Diana Carbone, INFN-LNS
• The NUMEN Collaboration

References
Cappuzzello, F., Agodi, C., Cavallaro, M., Carbone, D., Tudisco, S., Presti, D.L., Oliveira, J.R.B., Finocchiaro, P., Colonna, M., Rifuggiato, D., Calabretta, L. et al. (2018). The NUMEN project: NUclear Matrix Elements for Neutrinoless double beta decay. The European Physical Journal A, 54(5), 72.
Cappuzzello, F., Agodi, C., Carbone, D. and Cavallaro, M. (2016). The MAGNEX spectrometer: Results and perspectives. The European Physical Journal A, 52(6), 67.
Vergados, J.D., Ejiri, H. and Šimkovic, F. (2012). Theory of neutrinoless double-beta decay. Reports on Progress in Physics, 75(10), 106301.
Dell’Oro, S., Marcocci, S., Viel, M. and Vissani, F. (2016). Neutrinoless double beta decay: 2015 review. Advances in High Energy Physics, (10), 1-37.

Personal Response
The NUMEN Project is a truly global collaboration. Can you give us a bit of the history behind how this collaboration was formed?
We proposed NUMEN in 2014 within the initiative “WHAT NEXT?”, in which INFN questioned itself on possible future avenues in the physics of fundamental interactions. The project was based on a pioneering experiment, DoCET, proposed by our group (spokespersons: M. Cavallaro and F. Cappuzzello). NUMEN was considered worthy of the INFN challenge and was reviewed by a prestigious international evaluation committee, which strongly supported it. Initially, we were only a few researchers of Catania, but before long NUMEN attracted several colleagues, allowing us to cover all the aspects of the challenge. Today, we are about 100 researchers from 35 institutions in 15 countries.



Physical Sciences ︱ Professor Takehiko Satoh

Akatsuki: Pioneering the planetary meteorology of Venus

The launch of the Akatsuki probe in May 2010. The team at JAXA after the successful orbit insertion on 7 Dec 2015.

Image of Venus © ISAS/JAXA; image of Akatsuki probe by Akihiro Ikeshita, courtesy of NASA, public domain.

Venus may be the closest known planet to Earth in both size and distance from the Sun, but the atmosphere of our nearest neighbour is so thick that much of its dynamics remain shrouded in mystery. Professor Takehiko Satoh at the Japan Aerospace Exploration Agency (JAXA) is exploring the Venusian atmosphere. His research utilises the orbiting probe, Akatsuki. Named after the Japanese for ‘dawn’, Akatsuki has transformed our understanding of the diverse, often violent dynamics which play out in our neighbour’s atmosphere, and is also helping us to understand more about our own weather.

With a radius just 5% smaller than our Earth’s, and an orbital path 28% closer to the Sun than Earth’s, astronomers often refer to Venus as Earth’s ‘twin sister’. However, within the dense clouds of our twin, the environment could hardly be more different. “The planet is completely shrouded by a thick cloud layer composed of sulfuric acid, its dense atmosphere is mostly CO2, the surface pressure is 90 times stronger than Earth’s, and the temperature is around 460°C due to the extremely strong greenhouse effect,” explains Professor Takehiko Satoh, a researcher at JAXA’s Institute of Space and Astronautical Science. With a climate vastly more hostile than any to be found on Earth, it isn’t surprising that the weather patterns astronomers have observed on Venus so far are entirely unlike our own.

But the local weather patterns aren’t the only mystery our twin holds; there is also a significant, large-scale disparity between the incredibly slow ‘retrograde’ rotation of the planet itself and the far more rapid rotation of its upper atmosphere. “The atmospheric dynamics of Venus are represented by its ‘super-rotation’,” Professor Satoh continues. “While the solid body completes one rotation every 243 Earth days, the atmosphere encircles the planet in only 4 Earth days at the cloud-top level, 60 times faster than the more massive body.”

AKATSUKI ENTERS ORBIT
In 2001, Professor Satoh and a group of his colleagues at several institutes in Japan began work on designing the first spacecraft ever to orbit Venus in its equatorial plane, aiming to discover the causes of a wide variety of properties observed in the planet’s atmosphere. After almost a decade of careful planning, Akatsuki was launched from JAXA’s Tanegashima Space Centre in May 2010. A mishap in manoeuvring (December 2010) meant that Akatsuki orbited the Sun for five years with only a little data acquisition, but thanks to expert recovery efforts from the Akatsuki team, the spacecraft finally began its orbit of Venus in December 2015. “Akatsuki has now been in Venus orbit for more than three Earth years and has transmitted valuable data about the atmospheric dynamics of Venus back to Earth,” Professor Satoh says.

The Akatsuki probe has been orbiting Venus since 2015.

Akatsuki collects this data using five onboard cameras, each of which observes Venus using different wavelengths, ranging from ultraviolet to infrared. Since these wavelengths can penetrate to different depths, each infrared or ultraviolet image of Venus which Akatsuki sends back to Earth can tell us something different about the dynamics of Venus’ atmosphere.

In addition, Akatsuki is carrying out a ‘radio occultation’ experiment, in which it fires a radio beam straight through Venus’ atmosphere, back to an antenna station on Earth. By measuring changes in these radio waves, astronomers can explore the Venusian atmosphere right down to its thick lower depths. So far, a comprehensive analysis of these images has yielded three major discoveries about Venus’ unique atmosphere.

STUBBORN GRAVITY WAVES
Just three hours after its arrival at Venus, Akatsuki made its first major discovery. The orbiter’s mid-infrared and ultraviolet imagers revealed a vast, stationary bow-shaped wave in Venus’ sulfuric acid cloud tops, 65 kilometres above its solid surface. At 10,000 kilometres across, the feature appeared to be in a fixed position, despite the surrounding, smaller-scale features of the super-rotating atmosphere moving at speeds of around 100 metres per second.

The Akatsuki team believe that this stubborn feature is a ‘gravity wave’. Not to be confused with gravitational waves, these features are created when two different fluids, each with a different density, meet at an interface. Waves can subsequently form at this interface as the force of gravity acts to restore equilibrium in the overall system. The phenomenon is common on Earth, particularly at the interface between the air and the ocean. However, gravity waves can never be found on this scale on our home planet. In this case, the astronomers propose that the vast gravity wave observed by Akatsuki was caused by mountain ranges on Venus’ surface pushing its dense lower atmosphere to higher altitudes, where gases are far less dense. “This is a striking discovery, since we have seen features extending horizontally for thousands of kilometres that are not super-rotating but are stationary, or fixed to the highlands of the slowly-rotating solid planet,” comments Professor Satoh. “This would hint at how strongly the solid body and the atmosphere are coupled, and how this works to the production and maintenance of super-rotation.”

Mysteries still remain over how such a gravity wave could have formed, given our current knowledge of the conditions of Venus’ atmosphere just above its surface. If Professor Satoh and his colleagues are correct about the cause of the feature, these surface conditions could be far more dynamic than astronomers currently realise.

VENUS’ EQUATORIAL JET
Akatsuki’s next discovery was evidence of a fast-moving jet encircling Venus’ equator in its mid-to-low atmosphere (50 to 60 km altitudes). The Akatsuki team made the discovery after analysing the planet’s ‘atmospheric window’ – a band of relatively weak CO2 absorption in the near-infrared region, through which radiation from deeper levels can escape to space. The astronomers calculated wind speeds at different altitudes by tracking the silhouettes of clouds in Venus’ mid-to-low atmosphere, against the backdrop of the atmospheric window. They found that at these altitudes, cloud speeds increased the closer they were to Venus’ equator, suggesting the presence of an equatorial jet encircling the planet. Their finding was particularly unusual, as atmospheric physics predicts that wind speeds will be lower close to the equator – a rule that is obeyed by Venus’ super-rotating upper atmosphere. “It is not easy to accelerate the atmosphere in the equator,” explains Professor Satoh. “If conservation of angular momentum is considered, acceleration at mid- or high-latitudes could naturally be produced, but this generally cannot happen at the equator.”
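Professor Satoh’s remark can be unpacked with the textbook expression for the axial angular momentum of an air parcel (standard atmospheric dynamics, not a formula from the article):

```latex
% Angular momentum per unit mass of an air parcel at latitude \varphi on a
% planet of radius a rotating at rate \Omega, carrying zonal (east-west) wind u:
M = \left( u + \Omega a \cos\varphi \right) a \cos\varphi
```

A parcel drifting poleward (with cos φ decreasing) while conserving M must increase its zonal wind u, so fast winds arise naturally at mid- and high latitudes. At the equator, where a cos φ is already at its maximum, no such conservation argument can spin up the flow – which is exactly what makes the observed equatorial jet so puzzling.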


The astronomers hope to uncover the mechanisms causing this jet through further research. Professor Satoh believes that these studies could further unravel the mystery of why Venus’ upper atmosphere is rotating so much faster than the planet itself. “Solving this problem would also be a key to understand the mechanism to accelerate the entire atmosphere to the super-rotation,” he says.

WAVES, JETS, AND THERMAL TIDES
Akatsuki’s observations of Venus’ night-time atmospheric window also indicated unusual patterns of streaky clouds extending some thousands of kilometres across the planet. To explain their formation, the Akatsuki team proposed that Venus’ atmosphere was flowing strongly downwards in localised regions; an idea they tested using computer simulations. “The numerical simulation group successfully reproduced the large streaky features seen in Venus’ atmosphere,” Professor Satoh says. “Surprisingly enough, these large-scale streaks seem to be produced by the interaction of atmospheric waves and mid-latitude jets that are commonly seen in our Earth’s atmosphere as well.” The success in reproducing such phenomena may imply that the numerical simulation of Venusian atmospheric dynamics has entered a new stage: not only simulating observed phenomena, but revealing the mechanisms underlying what we see.

The Akatsuki team also found increased wind speeds in the early half of the night near the equatorial region, indicative of the aftermath of solar heating in the cloud layer. “This may tell us how solar heating at the upper cloud layer drives the dynamics and effects to lower levels,” says Professor Satoh. These heating dynamics are known as ‘thermal tides’: caused by the periodic localised heating of the upper atmosphere by the Sun, they are strongest at the sub-solar point, an effect which can be seen on Earth to an extent. Instabilities in lower layers of the atmosphere subsequently cause this heat to move around vertically, to lower altitudes. The astronomers believe that thermal tides could at least partially be a cause of Venus’ mysterious super-rotation.

As Professor Satoh explains, this would be the case if thermal tides were acting to slow Venus down in its direction of rotation, while boosting the speed of its upper atmosphere. “One hypothesis for super-rotation is the heating of clouds and the generation of thermal tides,” he says. “This ultimately transfers ‘eastward’ momentum to the solid planet, leaving ‘westward’ momentum in the atmosphere, which accumulates to the super-rotation.”

GROWING OUR KNOWLEDGE OF VENUS
From the insights provided by these three discoveries, Professor Satoh and his colleagues are now working on improving their computer simulations to more accurately reproduce Akatsuki’s observations. These simulations are an important supplement to the limited amount of data Akatsuki is able to provide. “For the Earth, we have many meteorological satellites and a huge number of ground stations to monitor the weather,” Professor Satoh explains. “For Venus, in contrast, we have only one satellite, Akatsuki. Therefore, numerical simulation of the observed data is quite important. Data assimilation, a modern and powerful technique in Earth’s meteorology, will help us better understand the dynamical phenomena observed by Akatsuki.”

The work of the Akatsuki team has already greatly enhanced our understanding of Venus’ unique atmosphere, and the fascinating mechanisms which drive its dynamics. Professor Satoh believes that by learning more about how Venus’ atmosphere differs from the Earth’s, we can understand more about our own weather. As he concludes, “the Akatsuki mission is unveiling Venus’ meteorology, and by enhancing our understanding of meteorology in our sister planet, ultimately benefits our lives on the Earth.”

Behind the Research

The team conduct the final check before the probe is loaded onto the launch vehicle.

Professor Takehiko Satoh
E: satoh@stp.isas.jaxa.jp T: +81 50 3362 3838 W: www.isas.jaxa.jp/about/researchers/takehiko_sato.html W: www.isas.jaxa.jp/en/missions/spacecraft/current/akatsuki.html

Research Objectives
Professor Satoh’s research links with previous and on-going Venus missions, with a specific focus on understanding the cause and maintenance mechanism of the Venus environment.

Detail
Professor Takehiko Satoh, Department of Solar System Sciences, Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency, 3-1-1 Yoshinodai, Chuo-ku, Sagamihara, Kanagawa 252-5210, Japan

Bio
Takehiko Satoh is Professor at the Department of Solar System Sciences, Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency (ISAS/JAXA). He has been with Japan’s Venus orbiter mission, Akatsuki, since 2001, serving as PI of the IR2 near-infrared camera and Project Scientist of the mission.

Collaborators
All Akatsuki team members (science, engineering, modelling, and secretaries), and people in industry.

References
Fukuhara, T., Futaguchi, M., Hashimoto, G.L., Horinouchi, T., Imamura, T., Iwagami, N., Kouyama, T., Murakami, S.Y., Nakamura, M., Ogohara, K. and Sato, M. (2017). Large stationary gravity wave in the atmosphere of Venus. Nature Geoscience, 10(2), 85.
Horinouchi, T., Murakami, S.Y., Satoh, T., Peralta, J., Ogohara, K., Kouyama, T., Imamura, T., Kashimura, H., Limaye, S.S., McGouldrick, K. and Nakamura, M. (2017). Equatorial jet in the lower to middle cloud layer of Venus revealed by Akatsuki. Nature Geoscience, 10(9), 646.
Peralta, J., Muto, K., Hueso, R., Horinouchi, T., Sánchez-Lavega, A., Murakami, S.Y., Machado, P., Young, E.F., Lee, Y.J., Kouyama, T. and Sagawa, H. (2018). Nightside Winds at the Lower Clouds of Venus with Akatsuki/IR2: Longitudinal, Local Time, and Decadal Variations from Comparison with Previous Measurements. The Astrophysical Journal Supplement Series, 239(2), 29.
Kashimura, H., Sugimoto, N., Takagi, M., Matsuda, Y., Ohfuchi, W., Enomoto, T., Nakajima, K., Ishiwatari, M., Sato, T.M., Hashimoto, G.L. and Satoh, T. (2019). Planetary-scale streak structure reproduced in high-resolution simulations of the Venus atmosphere with a low-stability layer. Nature Communications, 10(1), 23.
Sugimoto, N., Yamazaki, A., Kouyama, T., Kashimura, H., Enomoto, T. and Takagi, M. (2017). Development of an ensemble Kalman filter data assimilation system for the Venusian atmosphere. Scientific Reports, 7, article number: 9321. DOI: 10.1038/s41598-017-09461-1.

Personal Response
Your research has significantly enhanced our understanding of the unique atmosphere of Venus. What’s next for your work?
Too many things remain “invisible” to us (what we have revealed are mere speculations). I love “Seeing is believing” and would love to visualise all we want to know. How the solid body pushes the atmosphere, how momentum is transferred, how waves/eddies decay to release the energy that they carry, where the source of molecules for enormous clouds in Venus is, etc. Even for our Earth, many of these are not yet visualised. But we need to know and understand through that which we see. Explorations of planets are to realise such dreams.


Physical Sciences ︱ Edward Higson

Reconstructing astronomical images with machine learning

Much of what we know about how our universe works has been learnt by analysing the astronomical signals captured from the sky. However, these signals will inevitably have some noise associated with them – so how can astronomers be sure that their observations of strange, unexpected signals reflect reality? Edward Higson at the University of Cambridge and his colleagues have made progress in dealing with this issue, by constructing machine learning algorithms which can process noisy astronomical images to reconstruct clear ones. Using Bayesian statistics, their software provides astronomers with useful tools for processing their observations.

To gain knowledge about the universe, astronomers use a wide variety of telescopes to gather electromagnetic radiation from across the sky. As telescopes improve, and researchers observe the sky in ever-increasing detail, intriguing observations which have yet to be explained by theoretical physicists continue to roll in. However, obstacles ranging from dust clouds to imperfections in telescopes mean that no observation can be perfect; inevitably, astronomical signals will have some degree of noise associated with them. Because of the complex nature of these signals, astronomers struggle to work out what their more unexpected signals should look like without any noise. In their research, Dr Higson and his colleagues have devised a cutting-edge solution to the issue involving image processing based on machine learning.

A LITTLE HELP FROM BAYES
When researchers need to make predictions about the likelihood of certain events occurring, they have a wide variety of statistical techniques to choose from. Many popular methods use Bayesian statistics – an approach named after the 18th-century English statistician Thomas Bayes, who considered how our confidence that a proposition is true changes as more information is gathered about it through experiments. Bayes modelled this statistical behaviour with an equation now known as ‘Bayes’ theorem’. Centuries later, Dr Higson and his colleagues are using his mathematics as a basis for neural networks which can successfully reconstruct noisy astronomical images.
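The article does not print the theorem itself, so for reference here is its standard form; the notation (parameters θ, data D) is generic textbook usage rather than anything taken from the researchers’ paper.

```latex
% Bayes' theorem: the posterior belief in parameters \theta after observing data D.
P(\theta \mid D) = \frac{P(D \mid \theta)\, P(\theta)}{P(D)}
% P(\theta)        : prior degree of belief before seeing the data
% P(D \mid \theta) : likelihood of the data given the parameters
% P(D)             : the 'evidence' Z, a normalising integral over all parameters
```

The evidence term in the denominator is precisely the hard-to-compute integral that the nested sampling technique described below is designed to estimate.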

“My research is focused on methods for using data to make inferences about the universe, including in cases where we do not have a good theoretical model for the signal we are observing,” Dr Higson explains. “I use Bayesian statistics, which defines probability as a ‘degree of belief’ in a proposition and can analyse non-repeatable events where our uncertainty is due to limited information.”

Using these methods to analyse inputs as complex as mysterious astronomical images is, understandably, a tall order. However, statistical techniques have advanced significantly since Bayes’ time, and are now allowing Dr Higson’s team to reliably reconstruct these images, based on the computing technique of ‘nested sampling’.

APPLYING NESTED SAMPLING
Nested sampling is an increasingly popular computational technique for performing Bayesian statistics calculations. The approach was invented by Cambridge astrophysicist John Skilling in 2004 and is capable of handling challenging data sets and high-dimensional probability distributions.
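To make the idea concrete, the sketch below shows the core nested sampling loop in Python. It is an illustrative toy rather than the team’s code: the example likelihood, the uniform prior and the brute-force rejection step are all placeholder assumptions (real implementations replace that step with sophisticated constrained sampling).

```python
import numpy as np

def nested_sampling(loglike, prior_sample, nlive=100, tol=1e-3, max_iter=10000):
    """Toy nested sampling loop (after Skilling 2004): estimate the log-evidence."""
    live = [prior_sample() for _ in range(nlive)]            # live points drawn from the prior
    live_logl = np.array([loglike(p) for p in live])
    logz = -np.inf                                           # running log-evidence
    logx = 0.0                                               # log prior volume still enclosed
    for i in range(max_iter):
        worst = int(np.argmin(live_logl))                    # lowest-likelihood live point 'dies'
        logx_new = -(i + 1) / nlive                          # expected geometric shrinkage
        logw = np.log(np.exp(logx) - np.exp(logx_new))       # prior volume assigned to the dead point
        logz = np.logaddexp(logz, logw + live_logl[worst])   # accumulate its evidence contribution
        logx = logx_new
        threshold = live_logl[worst]
        while True:                                          # brute-force constrained replacement
            candidate = prior_sample()
            if loglike(candidate) > threshold:
                break
        live[worst] = candidate
        live_logl[worst] = loglike(candidate)
        if np.max(live_logl) + logx < logz + np.log(tol):    # remaining mass is negligible
            break
    return logz

# Toy usage: standard normal likelihood with a uniform prior on [-5, 5].
rng = np.random.default_rng(0)
print(nested_sampling(lambda t: -0.5 * t ** 2, lambda: rng.uniform(-5.0, 5.0)))
```

The design point to note is that the sampler climbs through nested likelihood shells while the enclosed prior volume shrinks geometrically, which is what lets the method cope with the high-dimensional probability distributions mentioned above.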

Dr Higson is particularly interested in using nested sampling to reconstruct noisy astronomical images. The nested sampling software his team has developed can be used to analyse these images using Bayes’ mathematics. This allows a neural network to predict what a clear image should look like by trying to find a simple (‘sparse’) mathematical description of the image. “We implement ‘Bayesian sparse reconstruction’ using nested sampling – a popular method for Bayesian computation which is widely used in astrophysics,” Dr Higson continues. “Much of our research has focused on improving this technique to make it better able to handle Bayesian sparse reconstruction problems, and on understanding the uncertainties in these calculations.” To get to this point, Dr Higson and his colleagues have carried out extensive research into how nested sampling can be used for Bayesian statistics calculations while taking full advantage of the current capabilities of modern computers.

IMPROVING NESTED SAMPLING EFFICIENCY
Despite its usefulness, the rapid, repetitive calculations demanded by nested sampling-based machine learning can make the technique computationally intensive. In earlier research, Dr Higson’s team combated the issue by constructing algorithms which greatly reduced the computing power required to run nested sampling software. “‘Dynamic nested sampling’ is a generalisation of the nested sampling algorithm which can produce large improvements in computational efficiency compared to standard nested sampling,” explains Dr Higson.
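As a sketch of what ‘dynamic’ means here: a standard run uses a fixed number of live points throughout, while a dynamic run ranks where extra samples would most improve the results and adds them there. The function below conveys only that allocation idea; the name dynamic_goal echoes a parameter visible in the paper’s figure captions, but the weighting scheme itself is a simplified assumption, not the published algorithm.

```python
import numpy as np

def extra_sample_importance(logw, logl, dynamic_goal=1.0):
    """Rank the dead points of an initial run by how useful extra live points
    would be near them. dynamic_goal = 0 targets evidence accuracy, while
    dynamic_goal = 1 targets parameter estimation accuracy."""
    post = np.exp(logw + logl - np.max(logw + logl))   # posterior mass of each dead point
    post /= post.sum()
    evid = np.exp(logw - np.max(logw))                 # prior-volume (evidence) weight
    evid /= evid.sum()
    return (1.0 - dynamic_goal) * evid + dynamic_goal * post

# A dynamic run starts with a small initial run, computes this importance over its
# dead points, then repeatedly adds new batches of samples in the likelihood range
# where the importance peaks, until a fixed computational budget is spent.
```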

Using the updated algorithms, the researchers wrote software which could efficiently carry out Bayesian statistical analysis on large sets of data. As Dr Higson explains, dynamic nested sampling promises to become a powerful tool for processing the stream of data and images which modern telescopes produce. “Dynamic nested sampling has now been applied to a variety of problems in astrophysics,” he continues. “These increases in computational efficiency combined with advances in nested sampling software have the potential to allow the Bayesian sparse reconstruction framework to be applied to larger and more complex data sets in the future.”

However, the efficiency of their algorithms wasn’t the only difficulty in the implementation of nested sampling. To ensure that their reconstructed images were reliable, the researchers first needed to find ways to understand errors in the calculations their software performed and estimate how big the possible inaccuracies were.

CHECKING FOR ERRORS
Dynamic nested sampling grew from Dr Higson and his colleagues’ efforts to understand the sources of errors in their software. “We analysed the sampling errors in nested sampling parameter estimation and created a method for estimating them numerically for a single nested sampling calculation,” says Dr Higson. The team then used this as a basis for the dynamic nested sampling algorithm, which automatically identifies what their software should do to best improve calculation accuracy – automatically identifying and improving the parts of the calculation which were most likely to contain errors.

These efforts also led to another software package which provides error analysis for nested sampling calculations. “We developed diagnostic tests for detecting when software has not performed the nested sampling algorithm accurately,” Dr Higson continues. “The uncertainty estimates, and diagnostics, are implemented in the ‘nestcheck’ software package.”
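The flavour of that numerical error estimate can be conveyed as a bootstrap over the ‘threads’ (single-live-point sample sequences) making up one nested sampling run. Everything below is schematic: combine_threads is a hypothetical helper, and the real procedure is described in the nestcheck papers listed in the references at the end of this article.

```python
import numpy as np

def bootstrap_error(threads, combine_threads, estimator, nboot=200, seed=1):
    """Estimate the sampling error of `estimator` (e.g. a posterior mean or the
    evidence) from a single run by resampling its constituent threads."""
    rng = np.random.default_rng(seed)
    values = []
    for _ in range(nboot):
        picks = rng.integers(0, len(threads), size=len(threads))  # threads drawn with replacement
        run = combine_threads([threads[i] for i in picks])        # hypothetical: merge into one run
        values.append(estimator(run))
    return float(np.std(values, ddof=1))                          # spread = numerical uncertainty
```

The attraction of this design is that the uncertainty comes from the run’s own samples, so no repeat calculations are needed.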

Given these three noisy data sets (middle column), which visually look very similar, Bayesian sparse reconstruction can work out the differences between the underlying true signals (left column) and produce fits of the data (right column) which very closely match the true signals. The three rows show data from a single generalised Gaussian, from the sum of two generalised Gaussians, and from the sum of three generalised Gaussians. Reproduced from Edward Higson et al., Bayesian sparse reconstruction: a brute-force approach to astronomical imaging and machine learning, MNRAS (2019) 483(4), 4828–4846, with permission by Oxford University Press on behalf of the Royal Astronomical Society. This figure is not included under the Creative Commons licence of this publication. For permissions, please contact journals.permissions@oup.com.

By integrating nestcheck into their dynamic nested sampling algorithms, the researchers brought together all of their efforts to improve the efficiency and accuracy of their software. In their latest research, Dr Higson and his colleagues have used their advanced software to demonstrate their Bayesian sparse reconstruction techniques.

A BASIS FOR SIGNAL RECONSTRUCTION
In the 2019 study, Dr Higson’s team first developed methods to reconstruct signals by building up models of them through machine learning which seek to express the signal as simply as possible. The elemental building blocks of these models are called ‘basis functions’ – sets of mathematical functions which can be used to approximate the fundamental properties of astronomical signals. As Dr Higson explains, “we presented a principled Bayesian framework for signal reconstruction, in which the signal is modelled by basis functions whose number and form is determined by the data themselves”.
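The figures fit ‘generalised Gaussian’ basis functions, so a signal is represented as the sum of a variable number N of simple components. A minimal sketch of that signal model follows; the exact parameterisation used by Higson et al. may differ in detail, and the example parameter values are invented for illustration.

```python
import numpy as np

def generalised_gaussian(x, a, mu, sigma, beta):
    """One basis function: amplitude a, centre mu, width sigma, shape beta
    (beta = 2 gives an ordinary Gaussian; larger beta gives flatter peaks)."""
    return a * np.exp(-((np.abs(x - mu) / sigma) ** beta))

def signal_model(x, components):
    """A sparse reconstruction: the sum of N components, where N itself is a
    free parameter inferred from the data rather than being fixed in advance."""
    return sum(generalised_gaussian(x, *c) for c in components)

# e.g. a two-component signal evaluated on a grid of 100 points
x = np.linspace(0.0, 1.0, 100)
y = signal_model(x, [(1.0, 0.3, 0.05, 2.0), (0.5, 0.7, 0.10, 4.0)])
```

Because nested sampling computes the evidence for each candidate number of components, the method can report a full posterior distribution over N – the bar charts of P(N | L, π) in the original figures – rather than committing to a single model size.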

x2

Edward Higson

5

0.5 this framework, the team Using number of basis functions N 0.4 0.0 0.0 established a principled method for x1 x1 1 2 3 4 5 (a) Image of an irregularly shaped 0.2 galaxy. identifying clear signals in noisy data. number of basis functions N This first part of the study, therefore, 0.0 0.0 noisy data fit posterior distribution of N provided them with a basis on which x1 x1 1.0 galaxy. 1.0 1 2 3 4 5 (a) Image of an irregularly shaped they could test the performance of their vanilla number of basis functions N 0.8 algorithms in real situations.distribution of N adaptive noisy data fit posterior

continues. “The estimates, N =RECONSTRUCTION Nuncertainty = 1 11 but N Hubble =3 N = eXtreme 4 N =(Illingworth 5 Figure 13. As for Figure fitting 32 × 322 images from the Space Telescope Deep Field A NEW TOOLKIT In the 2019 study, Dr Higson’s team and diagnostics, are implemented each pixel has added normally distributed y-errors with σy = 0.2.

Behind the Research

Edward Higson

E: e.higson@mrao.cam.ac.uk

W: https://github.com/ejhigson/dyPolyChord W: https://github.com/ejhigson/nestcheck

Research Objectives

Edward Higson's research applies Bayesian inference and machine learning methods to astrophysics and cosmology.

References

Higson, E., Handley, W., Hobson, M. and Lasenby, A. (2019). Bayesian sparse reconstruction: a brute-force approach to astronomical imaging and machine learning. Monthly Notices of the Royal Astronomical Society, 483(4), 4828–4846.



Higson, E., Handley, W., Hobson, M. and Lasenby, A. (2019). nestcheck: diagnostic tests for nested sampling calculations. Monthly Notices of the Royal Astronomical Society, 483(2), 2044–2056.
Higson, E. (2018). nestcheck: error analysis, diagnostic tests and plots for nested sampling calculations. Journal of Open Source Software, 3(29), 916.
Higson, E., Handley, W., Hobson, M. and Lasenby, A. (2019). Dynamic nested sampling: an improved algorithm for parameter estimation and evidence calculation. Statistics and Computing, 1–23.
Higson, E. (2018). dyPolyChord: dynamic nested sampling with PolyChord. Journal of Open Source Software, 3(29), 965.
Higson, E., Handley, W., Hobson, M. and Lasenby, A. (2018). Sampling errors in nested sampling parameter estimation. Bayesian Analysis, 13(3), 873–896.

Bio
Edward Higson studied physics at Oxford University before obtaining a PhD in physics at Cambridge University. Since submitting his PhD thesis he has been working at Goldman Sachs.

Collaborators
• Dr Will Handley
• Professor Mike Hobson
• Professor Anthony Lasenby

Personal Response What particular images do you think will yield particularly interesting results after Bayesian sparse reconstruction?


The technique is particularly useful when noise and multiple overlapping sources make it difficult to determine how many astronomical objects are present in the image and what each looks like. A really exciting application is to the most distant photographs ever taken by the Hubble Space Telescope, which show a jumble of galaxies near the edge of the observable universe. Bayesian sparse reconstruction can be used to work out how many separate overlapping galaxies are present in such images, and what shape the individual galaxies are.

We implement ‘Bayesian Sparse Reconstruction’ using nested sampling - a popular method for Bayesian computation.



Physical Sciences ︱ A. Miguel Holgado

Refining the search for the largest gravitational waves Ask many astronomers, and they will tell you that gravitational waves are the greatest scientific discovery of the 21st century so far. In his research, Miguel Holgado at the University of Illinois studies the clever astronomical techniques which can be used to observe the very largest of these elusive ripples, originating from supermassive black holes as they orbit around each other. His results suggest that these binaries are likely far rarer than astronomers originally thought, but could also help them to refine their observation techniques.

In September 2015, the world of astronomy was changed forever, as ripples in spacetime, first theorised by Einstein over a century ago, were directly observed for the first time. The gravitational wave, discovered by a global team of researchers in the LIGO-Virgo collaboration, was created as two black holes spiralled into each other and merged, releasing colossal amounts of energy.

LIMITATIONS IN LIGO’S DETECTIONS So far, every gravitational wave LIGO has detected has originated from mergers of black holes and neutron stars with similar masses to our sun. However, astronomers predict that far larger waves could originate from ‘supermassive’ black holes – objects many millions of times heavier than the Sun, which are thought to occupy the centres of many galaxies.

Using kilometre-scale interferometers, accurate to within the widths of single atoms, physicists picked up the wave as it passed through Earth, minutely stretching and squeezing the fabric of spacetime as it went. Nearly a dozen more gravitational wave observations have been made since then, and again this year, astronomers are turning their attention to detecting mergers of black holes, neutron stars, and more.

As Holgado explains, these waves can form when two galaxies merge together, causing each of their supermassive black holes to orbit each other. 'Some of the loudest gravitational waves are expected from the orbital motion of supermassive black hole binaries that may form within the centres of merged galaxies,' he says. Yet despite their enormous size, these waves are notoriously difficult to detect. Whereas the ripple observed by LIGO-Virgo in 2015 dramatically completed a cycle in fractions of a millisecond, waves from supermassive black hole binaries have periods of years to decades, and can endure for thousands to millions of years.

An artist’s depiction of a blazar, which is thought to be an active supermassive black hole with a relativistic jet pointed along our line of sight.


'LIGO cannot detect gravitational waves from these sources because the frequencies from in-spiralling supermassive black hole binaries are expected to be of order nanohertz,' Holgado continues. 'LIGO is only sensitive to gravitational wave frequencies of order tens to thousands of hertz, which mostly correspond to the mergers of stellar-mass black hole binaries and neutron-star binaries.' Astronomers have thus

The Fermi Gamma-Ray Space Telescope’s view of the gamma-ray sky. The disk of gamma-ray emission is from our own Milky Way Galaxy. A significant fraction of the point-like gamma-ray sources above and below the Milky Way are gamma-ray emission from blazars.

been relying on a different technique to detect nanohertz gravitational waves by exploiting the most reliable timekeepers known to astronomy. RELIABLE ASTRONOMICAL TIMEKEEPERS Astronomers were baffled when they first discovered a number of mysterious objects which appeared to emit relentlessly regular flashes of high-energy radiation. The pulses kept time so accurately that some researchers were even drawn to extraterrestrial explanations of their origin at first. However, as Jocelyn Bell and Antony Hewish famously discovered in 1967, these signals have a perfectly natural explanation. The duo revealed that the pulses are actually beams of radiation, which are continually emitted from the poles of rapidly-rotating neutron stars – small, dense objects which were once the cores of massive stars. Once every rotation, one of these beams would point straight towards the Earth, resulting in a regular sequence of flashes from our perspective. The pulses are emitted with unwavering uniformity like a highly precise metronome, but as Holgado explains, this could appear to change from our perspective if a pulsar signal was affected by a gravitational wave. 'If gravitational waves pass through a pulsar

and the Earth, the observed pulse times-of-arrival are shifted due to the stretching and squeezing of spacetime,' he says. Astronomers are now using these theorised shifts to monitor changes in groups, or arrays, of these distorted pulsar signals, over periods of several years. 'In order to detect gravitational waves, we can time an array of pulsars and correlate the observed shifts of the pulse arrival times for each pulsar,' Holgado continues. 'A pulsar timing array (PTA) is a particular type of gravitational wave detector that is sensitive to nanohertz gravitational waves from supermassive black hole binaries.' A GROWING LIST OF BINARY CANDIDATES This technique is vital for detecting low-frequency gravitational waves that LIGO is not sensitive to. However, observations of PTA signal shifts have yet to yield concrete evidence of the low-frequency ripples. All the same, researchers have now begun the search in earnest, paying particular attention to the dense, bright clusters of gas which collect at the centres of some galaxies.
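The logic of a pulsar timing array can be sketched in a few lines of code. The toy example below (with made-up pulsar names and simplified statistics, not a real PTA pipeline) injects a common low-frequency signal into the noisy timing residuals of three pulsars: no single series looks like anything but noise, yet every pair of pulsars is correlated.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy timing residuals (in seconds) for three pulsars observed weekly
    # for ten years. A real PTA fits a detailed timing model per pulsar;
    # here each series is just noise plus a shared low-frequency signal.
    t = np.linspace(0, 10, 520)                    # years
    common = 2e-7 * np.sin(2 * np.pi * t / 7.0)    # nanohertz-scale signal

    residuals = {name: common + 3e-7 * rng.standard_normal(t.size)
                 for name in ("PSR A", "PSR B", "PSR C")}

    # Individually each series is noise-dominated, but a signal common to
    # all pulsars shows up as a correlation between every pair.
    names = list(residuals)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            r = np.corrcoef(residuals[a], residuals[b])[0, 1]
            print(f"{a} x {b}: correlation = {r:+.2f}")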

'Even though PTAs have yet to detect nanohertz gravitational waves, their upper limits are of astrophysical relevance and importance,' Holgado says. 'A number of telescopes have been finding and observing active galactic nuclei (AGN), which are the centres of galaxies that are observed to be very bright compared to normal galaxies due to a supermassive black hole feeding on hot gas.' In several of these bright galactic centres, astronomers have found that the light they emit dips and rises at regular intervals. As Holgado explains, 'some AGN have light curves that show some periodic behaviour, which is sometimes thought to come from the orbital motion of a supermassive black hole binary.' If the orbit of the black hole binary is inclined relative to our perspective, then each black hole would routinely move towards and away from us as they orbit each other. This motion relative to our line of sight would cause the bright material surrounding the black holes to become even brighter when the motion is towards us




and to become dimmer when the motion is away from us. Another possibility is that if the orbit is completely face-on, i.e., no motion towards nor away from us, then the gaseous material can still fall into the black holes in a manner driven by the orbital motion of the binary. The rate at which the material falls into the black holes can increase and decrease on orbital timescales, and as this accretion rate changes, so does the luminosity of the gas that is falling in. Astronomers can, therefore, refine their search for nanohertz-frequency gravitational waves by pointing their telescopes at those periodic AGN positioned behind PTAs.

LACK OF GRAVITATIONAL WAVES If astronomers were right about supermassive black hole binaries being the main cause of periodic AGN, we might expect to have a rapidly-growing database of nanohertz gravitational wave observations by now. Unfortunately, however, this is not the case. 'The growing number of candidates that the telescopes are finding imply a population of binaries that may be emitting gravitational waves, which pulsar timing arrays are sensitive to,' Holgado continues. 'However, pulsar timing arrays have not yet detected gravitational waves. This must mean that some fraction of the candidate supermassive black hole binaries in AGN may be false detections.'

If astronomers are wrong about supermassive black hole binaries being the sole cause of periodic AGN, determining the proportion which are responsible for them could be vital for gravitational wave observations in the future. In his research, Holgado is working to determine just how many time-variable AGN can be explained by black hole binaries. DETECTIONS IN BLAZARS When AGN are particularly bright, they will release vast amounts of energy through colossal jets, which point perpendicular to the galaxy's disc. AGN which display these jets are called 'quasars', and in special cases, they point straight towards Earth, making them ideal for our observation. The data gathered in observations of these special quasars, called 'blazars', is of particular interest to Holgado.

Behind the Research

A. Miguel Holgado

E: holgado2@illinois.edu W: https://amholgado.gitlab.io/

Research Objectives A. Miguel Holgado’s research focuses on the gravitational wave signatures of compact objects such as black holes, neutron stars, and white dwarfs.

Detail Miguel Holgado 1002 W. Green St., Urbana IL, 61801

An artist’s depiction of a pulsar timing array, where a set of pulsars are observed for evidence of gravitational waves from the cosmic population of supermassive black hole binaries with year-like orbital periods.

Bio A. Miguel Holgado completed his undergraduate studies at Texas A&M University. He is now a PhD Candidate at the University of Illinois at Urbana-Champaign researching gravitational-wave astrophysics. Miguel will be attending the 69th Lindau Nobel Laureate Meeting, which will focus on Physics.

'Blazars are a particular type of AGN whose jets are pointed directly towards us, making them particularly bright among AGN,' he explains. 'NASA's Fermi Gamma-Ray Space Telescope has obtained the most complete census of blazars out to distances farther than what pulsar timing arrays can observe.' Many blazars exhibit the light variations which caused astronomers to theorise supermassive black hole binaries in the first place, giving Holgado a highly-developed database of observations to work with. 'Blazars are notorious for showing quasi-periodic behaviour in their light curves at a variety of wavelengths, which is also sometimes thought to be due to the presence of a binary,' he continues. With this information, Holgado can use PTA upper limits on nanohertz gravitational waves to directly quantify the proportion of periodic blazars which may host binaries. LIMITED EVIDENCE FOR SUPERMASSIVE BINARIES So far, Holgado's results have revealed that only a small fraction of blazars can host gravitational wave emitting binaries that would be consistent with PTA upper limits.

Funding NANOGrav Physics Frontier Center, NSF PIRE, DOE Krell Institute

This suggests that supermassive black hole binaries are responsible for a far smaller proportion of periodic AGN than astronomers originally thought. Instead, Holgado theorises that a majority of periodic AGN must consist of uneven distributions of light-emitting gas orbiting individual supermassive black holes.

Collaborators • Alberto Sesana • Angela Sandrinelli • Stefano Covino • Aldo Treves • Xin Liu • Paul Ricker

'Using PTA upper limits, we have shown that only a small fraction of blazars, around 0.01 to 0.1%, may be binaries, which means that the quasi-periodicity often seen in blazars is likely to be due to periodic motions of the hot gas surrounding the supermassive black hole,' he explains. Such a low fraction may be somewhat disappointing to some astronomers, but the result could prove vital in ensuring potential future observations of gravitational waves originating from supermassive black hole binaries are genuine. The constraints on the binary fraction also provide some insights into how efficiently supermassive black hole binaries form in the centres of galaxies. In the future, Holgado's work could play an important role in our understanding of the dynamic ripples in spacetime which we now know to permeate our universe.

References
Holgado, A.M., Sesana, A., Sandrinelli, A., Covino, S., Treves, A., Liu, X. and Ricker, P. (2018). Pulsar timing constraints on the Fermi massive black hole binary blazar population. Monthly Notices of the Royal Astronomical Society: Letters, 481(1), L74–L78.
Sesana, A., Haiman, Z., Kocsis, B. and Kelley, L. (2018). Testing the Binary Hypothesis: Pulsar Timing Constraints on Supermassive Black Hole Binary Candidates. The Astrophysical Journal, 856(1).

Personal Response There is increased effort towards finding potential supermassive black hole binaries. With pulsar timing arrays improving their upper limits, you must be excited about the future in this field. Very much so! Even without a direct detection of nanohertz gravitational waves, we can still do multimessenger astrophysics, where electromagnetic observations from telescopes tell us about the possible presence of binaries from periodic AGN and gravitational-wave upper limits tell us what fraction of these binary candidates may be real. Once pulsar timing arrays finally detect nanohertz gravitational waves, we hope to gain a clearer picture of how supermassive black hole binaries form and how this is tied to the evolution of galaxies over cosmic time.



Physical Sciences ︱ Dr Diego A. Oyarzún

The mathematics of life: Metabolic control in living cells

Computational models allow a detailed understanding of metabolic networks in cells, which can be exploited for cutting-edge synthetic biology and healthcare technologies.

Cellular metabolism is a complex network of chemical processes that convert nutrients into energy and molecules for survival. Advances in experimental and mathematical techniques are paving the way for quantitative descriptions of how metabolism regulates itself and of how it can be artificially controlled for biotechnology. Dr Diego A. Oyarzún (University of Edinburgh) uses computational models to understand metabolism, and to exploit it in cutting-edge synthetic biology and healthcare technologies.

At its most fundamental level, life is chemistry. Billions of chemical reactions occur simultaneously in living cells every second, which can involve complicated macromolecules like proteins and DNA, as well as smaller and more mobile molecules that are free to roam within the cell and even cross cell membranes. These chemical reactions do not happen in isolation: in order for a cell to survive, thrive and reproduce, a high degree of control of how and when individual reactions take place is essential. Cellular metabolism is an extremely complex and robust network, in which individual subunits (proteins and metabolites) interact to keep the whole network working and to guarantee survival in unfavourable environmental conditions. Understanding the details of this enormously complicated machinery appears to be well beyond the grasp of traditional analytical approaches, but this is set to change.

A HOLISTIC VIEW OF LIFE Systems biology is a multidisciplinary field of research that applies mathematical and computational approaches to model complex biological systems. It has gained substantial importance in the last two decades as a powerful set of methods to study the interactions between the components of a biological system (like, for instance, proteins and metabolites in cells) and to define how these interactions determine the function of the system, its behaviour and its response to external perturbations. At its core, systems biology attempts to describe an emergent property of a biological system by integrating information about the interactions among its constituents. At variance with traditional reductionist approaches, which focus on defining and identifying the elementary constituents of large interacting systems, systems biology aims to provide a

rigorous framework for the interpretation of a system’s function and behaviour from the quantitative observation of multiple components simultaneously and from the integration of these data with mathematical models. QUANTITATIVE MODELS OF CELLULAR METABOLISM For decades, cellular metabolism has been largely seen as a process isolated from the rest of the cellular machinery. This traditional view has been challenged in recent years, in light of a number of studies that highlight the interplay between metabolism and other cellular functions. For example, it is now established that metabolic regulation plays an important role in disease. Conditions such as cardiovascular diseases and cancer have for instance been linked to metabolic misregulation, and pathogens can exploit their own metabolic regulation systems to evade drug treatments. Dr Oyarzún has been pioneering the application of systems biology to metabolic regulation in living cells. His work aims to understand the interplay between metabolism and gene regulation in natural systems. This is a cellular control strategy that is widely conserved across species and enables robust homeostatic adaptations to fluctuating environments. Various metabolic control systems can be addressed, in order to elucidate the role of the regulatory architecture on metabolic phenotypes, including pathways relevant for next-generation antibiotics and synthetic biology. The long term ambition of this work is to develop a predictive theory of how complex regulatory networks shape the metabolic response of cells to the environment. Such a theory is the key to understand how cells self-regulate in response to external perturbations,

including for instance the effects of pathogens or antibiotic treatments. ENGINEERING AND SYNTHETIC BIOLOGY Dr Oyarzún's research applies computational modelling to unravel the complexity of the metabolic machinery in cells and to understand the principles of how its function can adapt to changing environments. His approach is based on the integration of mathematics, engineering and biology and aims to provide a quantitative description of how life, as an emergent property of a complex biological network, sustains itself in diverse situations and under a variety of external stimuli. This knowledge can be exploited to make predictions on how genetic modifications influence the cell response, which can then in turn be used to control the cell behaviour and adapt it to serve human purposes, for instance to produce new therapeutic drugs. One of Dr Oyarzún's interests concerns the mechanisms of metabolic regulation in bacteria. Bacteria use their metabolism to consume nutrients from their environment, which provides energy and raw materials for synthesis of new molecules required for their survival and reproduction.


Dr Oyarzún’s research uses ideas from control engineering and automation to study living systems.


Interactions between theory and experiments can reveal the fundamental rules of biological processes.

Metabolism is an incredibly complex network of chemical reactions essential for the survival of cells.

Depending on the conditions of their environment, in particular on the availability of nutrients, bacteria need to adjust their metabolism. This is accomplished through a complex web of feedback mechanisms that detect changes in nutrients and modify metabolism accordingly. In order to shed light onto this intriguing feedback effect and to understand how bacteria self-adapt to changing environments, Dr Oyarzún has been pioneering the use of control theory, a discipline borrowed from the engineering world, whose objective is to develop algorithms to robustly control dynamical processes found in virtually every technology, from manufacturing to aircraft control and communication systems.

He applies this approach to understand how different regulatory architectures found in nature enable microbes to survive environmental shocks. The novelty of Dr Oyarzún's approach to the problem is in the use of mathematics to achieve what is virtually impossible to obtain from purely experimental approaches: a separation between the intertwined roles of regulatory architecture and regulatory parameters that control metabolism. The goal of this effort is to develop an understanding of how microbes self-adapt their metabolism to ensure their survival.
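To give a flavour of the kind of model this involves – a minimal sketch with assumed parameter values, not one of Dr Oyarzún's published models – the code below simulates a pathway in which the end product represses the synthesis of its own enzyme, and shows the system settling to a new steady state after a nutrient shift.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Two-variable model: enzyme E converts a nutrient into product P,
    # and P represses the synthesis of E (negative feedback on expression).
    def pathway(t, y, nutrient):
        E, P = y
        synthesis = 1.0 / (1.0 + (P / 0.5) ** 2)  # repression of E by P
        dE = synthesis - 0.1 * E                  # expression minus dilution
        dP = 2.0 * E * nutrient - 0.5 * P         # catalysis minus consumption
        return [dE, dP]

    # Let the cell settle in a poor medium, then apply a five-fold nutrient shift.
    before = solve_ivp(pathway, (0, 200), [0.5, 0.5], args=(0.2,))
    after = solve_ivp(pathway, (200, 400), before.y[:, -1], args=(1.0,))

    print("enzyme, product before shift:", np.round(before.y[:, -1], 2))
    print("enzyme, product after shift: ", np.round(after.y[:, -1], 2))
    # The feedback throttles enzyme synthesis after the shift, so the product
    # level rises far less than five-fold: a homeostatic adaptation.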


Behind the Research Dr Diego A. Oyarzún

E: d.oyarzun@ed.ac.uk T: +44 0131 651 1211 W: www.ed.ac.uk/profile/diego-oyarzun/ @doyarzunrod


BIOTECHNOLOGY Modelling cellular metabolism and its regulation is an ambitious and far-reaching programme from a basic science perspective, but it can also have important implications in rapidly developing fields like biotechnology. One of the current crucial challenges in biotechnology is how to achieve robust, predictable and economically sustainable processes for a variety of targets, like the synthesis of therapeutic drugs, the development of new materials and the production of food. The quantitative biology approaches developed by Dr Oyarzún and his collaborators are providing a key contribution towards achieving this goal. In particular, the availability of robust and quantitative models of cell metabolism can pave the way for the design of artificial biological systems, in which cell metabolism is reprogrammed to deliver custom functionalities. Promising applications of this technique are in microbial cell factories, which exploit microorganisms to produce therapeutic drugs and a variety of other chemicals.

Microbial metabolism can be controlled through genetic engineering to produce therapeutic drugs.

MATHS AND MEDICINE: PRECISION HEALTHCARE The mathematical approaches developed by Dr Oyarzún can have far-reaching impact even outside basic science and biotechnology. Precision medicine, whose holy grail is to deliver the right medicine, to the right patient and at the right time, is one field that is likely to benefit greatly from approaches based on the integration of systems biology, mathematics and engineering. Progress in screening technologies (like DNA sequencing) and image analysis algorithms now offers physicians the ability to distinguish between healthy and diseased individuals and to identify stages in disease progression.

Dr Oyarzún’s group develops computational methods to analyse biological networks in living cells. His team uses mathematics to understand the function of natural networks, as well as to design novel biomolecular systems for Biotechnology and Healthcare.

Detail Dr Diego Oyarzún, Lecturer in Computational Biology, School of Informatics & School of Biological Sciences, Informatics Forum, 10 Crichton St., EH8 9AB, UK. Funders • Human Frontier Science Program • EPSRC Centre for Mathematics of Precision Healthcare

In this context, Dr Oyarzún has been exploring the application of tools from network theory to analyse cancer omics datasets and identify biomarkers and new drug targets. This is part of multiple efforts devoted to developing the new mathematical tools for data integration and analysis that may transform precision medicine into a mainstream tool for wide segments of the world's population.

• Cancer Research UK
• Wellcome Trust

References
Oyarzún, D.A. (2019). Diego Oyarzún, University of Edinburgh. [online] Personal website. Available at: https://www.ed.ac.uk/profile/diego-oyarzun/ [Accessed 17 March 2019].
Liu, D., Mannan, A.A., Han, Y., Oyarzún, D.A. and Zhang, F. (2018). Dynamic metabolic control: towards precision engineering of metabolism. Journal of Industrial Microbiology and Biotechnology, 45, 535–543.

Computational models reveal how the growth of bacterial cells is governed by the way they allocate their limited resources to different cellular processes.

Tonn, M.K., Thomas, P., Barahona, M. and Oyarzún, D.A. (2019). Stochastic modelling reveals mechanisms of metabolic heterogeneity. Communications Biology, 1–9.
Chaves, M. and Oyarzún, D.A. (2019). Dynamics of complex feedback architectures in metabolic pathways. Automatica, 99, 323–332.


Beguerisse-Díaz, M., Bosque, G., Oyarzún, D., Picó, J. and Barahona, M. (2018). Flux-dependent graphs for metabolic networks. npj Systems Biology and Applications, 4, 32.


Oyarzún, D.A., Lugagne, J.-B. and Stan, G.-B.V. (2015). Noise propagation in synthetic gene circuits for metabolic control. ACS Synthetic Biology, 4(2), 116–125.

Bio
Dr Oyarzún leads the Biomolecular Control Group at the University of Edinburgh, with a joint appointment at the School of Informatics and School of Biological Sciences. Previously he was a Research Fellow in Biomathematics at Imperial College London and a Marie Curie Fellow at INRIA Sophia Antipolis. He obtained his PhD in 2010 from the Hamilton Institute, Maynooth University, Ireland. In 2016, Dr Oyarzún was appointed Global Future Council Fellow by the World Economic Forum and in 2017 he was selected as one of the 100 Young Global Changers by the Think 20 Summit, mandated by the G20 presidency.

Personal Response The ability to develop quantitative and predictive models of cell metabolism, especially in bacteria, is an extremely ambitious and sought after goal. What do you think are the fields that will benefit most from your work in the short term and what are the major remaining challenges for the application of the approaches you are developing to the field of biotechnology? The most direct benefits are in synthetic biology and metabolic engineering. Our work is paving the way for computer-aided design of cell factories, much like the way things are done in other engineering disciplines. A major challenge is the lack of methods to integrate various layers of data, such as metabolomic, proteomic and transcriptomic data, into tractable models. With the current big data revolution in biology, there is huge potential for machine learning and artificial intelligence to bridge this gap, so that we can harness the full potential of such molecular data. Another big challenge is the role of heterogeneity. Even genetically identical cells display different metabolic phenotypes, which not only negatively affects the performance of engineered cell factories, but is also thought to play key roles in bacterial responses to antibiotics, one of the most pressing challenges in global health.

Weisse, A. Y., Oyarzún, D. A., Danos, V. & Swain, P. S. (2015). Mechanistic links between cellular trade-offs, gene expression, and growth. Proceedings of the National Academy of Sciences 112, E1038–E1047.



Physical Sciences ︱ Jim Tilley

Will an old problem yield a new insight? Perhaps an elegant proof of the 4-colour theorem? The 4-colour problem is one of the most famous mathematical problems. It resisted proof for more than a hundred years before finally succumbing; in the end, there was a valid proof, but one that relied on more than a thousand hours of computer time. Jim Tilley’s research suggests that a dramatic simplification might ultimately be possible. He has discovered a property, Kempe-locking, that a minimum counterexample to the 4-colour theorem must exhibit and has formulated a conjecture, based on extensive investigations, that the Birkhoff diamond is the sole fundamental Kempe-locking configuration.

Figure 1. A proper colouring of the planar triangulation on 12 vertices that represents the icosahedron. The single red-blue Kempe chain is shown highlighted.


Graph colouring involves assigning labels, or colours, to the vertices of a mathematical object known as a graph. The 4-colour problem provided one of the original motivations for the development of algebraic graph theory. Graph colouring is used in many real-world problems, such as minimising conflicts when scheduling sports events, planning examination timetables and organising seating plans, even for CCTV camera placement in a building with many corners in order to minimise camera overlap. It also provides the foundation for Sudoku puzzles.

colouring of its vertices.) Only planar graphs that are triangulations need to be considered, as any graph that is not a triangulation can be turned into one by inserting edges. If the resulting triangulation is 4-colourable, then the original graph is also 4-colourable. It is highly useful to be able to restrict one’s scrutiny to triangulations.

THE 4-COLOUR PROBLEM Francis Guthrie, a student of the famous British mathematician and logician Augustus De Morgan, posed the 4-colour problem in 1852. He formulated the problem with respect to maps that satisfy certain conditions, such as not containing any holes and having every region (e.g. country or state) connected so that no region exists in two or more noncontiguous parts. Guthrie claimed that for such maps it would never take more than four colours to colour the map such that no two neighbouring regions were the same colour.

AN EARLY ATTEMPT AT PROOF Of those who offered a proof, Alfred Bray Kempe, an English barrister and amateur mathematician, was the person whose reputation was most tainted by his failed effort. He announced his success in 1879 and his 'proof' was published in the American Journal of Mathematics. Eleven years later, however, another English mathematician, Percy Heawood, created a map that he 4-coloured up to the final region in such a way that Kempe's method failed for that final region. Despite his failure, Kempe left a useful tool – a Kempe chain. It is a maximal, connected (every vertex reachable from any other by a path along edges) subgraph in which all vertices use exactly two colours. Kempe chains have proven instrumental in colouring and recolouring graphs. See Figure 1.
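The Kempe chain idea translates directly into code. The sketch below is illustrative only (a toy graph and hypothetical helper functions, not Tilley's software): it extracts the chain containing a chosen vertex and swaps its two colours, which always yields another proper colouring.

    # Minimal sketch of a Kempe chain and its colour-interchange, using a
    # hypothetical graph stored as an adjacency dict.
    def kempe_chain(graph, colouring, start, a, b):
        """Maximal connected set containing `start` whose vertices are
        coloured with exactly the two colours a and b."""
        chain, frontier = set(), [start]
        while frontier:
            v = frontier.pop()
            if v not in chain and colouring[v] in (a, b):
                chain.add(v)
                frontier.extend(graph[v])
        return chain

    def kempe_interchange(graph, colouring, start, a, b):
        """Swap colours a and b on the chain; because the chain is maximal,
        the result is still a proper colouring."""
        new = dict(colouring)
        for v in kempe_chain(graph, colouring, start, a, b):
            new[v] = b if new[v] == a else a
        return new

    # A small planar graph, properly coloured with three colours.
    graph = {0: [1, 3], 1: [0, 2, 3], 2: [1, 3], 3: [0, 1, 2]}
    colouring = {0: "red", 1: "blue", 2: "red", 3: "green"}
    print(kempe_interchange(graph, colouring, 0, "red", "blue"))
    # {0: 'blue', 1: 'red', 2: 'blue', 3: 'green'} - still proper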

Nowadays, the 4-colour problem is expressed in terms of graphs, and instead of colouring regions in maps, one colours vertices in graphs. All existing proofs of the 4-colour theorem demonstrate that a minimum counterexample cannot exist. (A minimum counterexample is the smallest planar graph that requires more than four colours for a proper

PROOF AT LAST The first valid proof was announced in 1976 by Kenneth Appel and Wolfgang Haken. It required over a thousand hours of computer time to verify particular aspects of their argument. This notion of relying on computer code, potentially containing human-induced errors (and their code did!), rather than a ‘human’ proof, has not satisfied a great part of the mathematical community. Appel and Haken’s proof was elegant in its overall structure, but the details were ugly.

Converting a map into a graph; regions become vertices.

The proof was refined in 1996 by a team of four mathematicians: Robertson, Sanders, Seymour, and Thomas, but they still relied on computer code to complete their proof. In 2010, Steinberger offered another variation. However, there is still no completely satisfying answer as to why the 4-colour theorem is true. (Once the conjecture was proved, it gained the status of a 'theorem.') THE SEARCH FOR AN ALTERNATIVE PROOF The 4-colour problem is so easy to articulate and comprehend that it has attracted the interest of many thousands of amateur mathematicians, all believing they can find a simple classical proof and thus become famous. Jim Tilley's father, a physicist and principal of Canada's Royal Military College from 1978 to 1984, was one such dreamer. Whenever he felt that he had made a breakthrough, he would ask his son, Jim, to check his work. Jim always found a flaw. Yet, despite his initial scepticism that his father's efforts would ever bear fruit, he became infected with an enthusiasm for the 4-colour problem and began to study it intensely. KEMPE-LOCKING Jim Tilley's research led to his discovering a new property that a minimum counterexample to the 4-colour theorem must exhibit. He named it 'Kempe-locking.' He realised that it was likely to be incompatible with another property that a minimum counterexample must exhibit – viz., how connected a graph is (how many vertices must be removed from a graph before it falls apart). Tilley's Kempe-locking is a property

associated with an edge in a triangulation. The notion starts with deleting an edge xy between adjacent vertices x and y. If for every 4-colouring of the resulting graph in which the colours of x and y are the same, there is no sequence of colour-interchanges on Kempe chains that causes the colour of x to differ from the colour of y, then the original triangulation is said to be Kempe-locked with respect to the edge xy. Tilley proved that a minimum counterexample to the 4-colour theorem has to be Kempe-locked with respect to every one of its edges; every edge in a minimum counterexample must have this colouring property.
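That definition can be checked by brute force on very small graphs. The sketch below is again purely illustrative – the search space explodes far too quickly for triangulations of the sizes Tilley studied – but it follows the definition literally: delete the edge xy, enumerate every proper 4-colouring in which x and y share a colour, and search all sequences of Kempe interchanges for one that separates them.

    from itertools import product

    def is_kempe_locked(graph, x, y, colours=("r", "g", "b", "y")):
        g = {v: [w for w in nbrs if {v, w} != {x, y}]   # delete edge xy
             for v, nbrs in graph.items()}
        verts = sorted(g)

        def proper(col):
            return all(col[v] != col[w] for v in verts for w in g[v])

        def interchanges(col):
            # Every colouring reachable by one Kempe colour-interchange.
            for v in verts:
                for a, b in product(colours, repeat=2):
                    if a == b or col[v] not in (a, b):
                        continue
                    chain, frontier = set(), [v]
                    while frontier:
                        u = frontier.pop()
                        if u not in chain and col[u] in (a, b):
                            chain.add(u)
                            frontier.extend(g[u])
                    new = dict(col)
                    for u in chain:
                        new[u] = b if new[u] == a else a
                    yield new

        for combo in product(colours, repeat=len(verts)):
            col = dict(zip(verts, combo))
            if not proper(col) or col[x] != col[y]:
                continue
            # Explore all colourings reachable by interchange sequences.
            seen, queue, escaped = set(), [col], False
            while queue and not escaped:
                cur = queue.pop()
                key = tuple(cur[v] for v in verts)
                if key in seen:
                    continue
                seen.add(key)
                escaped = cur[x] != cur[y]
                queue.extend(interchanges(cur))
            if escaped:
                return False    # this colouring can be 'unlocked'
        return True             # every qualifying colouring is stuck

    # K4 (the tetrahedron) is not Kempe-locked with respect to edge (0, 1).
    K4 = {0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2]}
    print(is_kempe_locked(K4, 0, 1))  # False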

It would be an astounding simplification if the Birkhoff diamond alone is the key to 4-colourability.

Kempe-locking is a particularly restrictive condition that becomes more difficult to satisfy as a triangulation gets larger. Tilley set out to discover if there is anything common to triangulations that have Kempe-locked edges. His earliest search for Kempe-locking led him to the Birkhoff diamond.

THE BIRKHOFF DIAMOND In 1913, G. D. Birkhoff discovered that a certain configuration of ten countries in a map (a boundary ring of six countries that encloses a set of four countries) has an important property. If that configuration is present in a map and if the submap with that configuration removed is 4-colourable, then the entire map will be 4-colourable. Thus, a minimum counterexample cannot contain that particular configuration. It has come to be known as the Birkhoff diamond.

Tilley found that a Kempe-locked edge xy seems to arise only when x and y are also the endpoints of the graph version of a Birkhoff diamond. See Figure 2. Not having encountered any Kempe-locking configuration without a Birkhoff diamond, he conjectured that the Birkhoff diamond is the only 'fundamental' Kempe-locking configuration, one that doesn't contain a smaller Kempe-locking configuration as a subgraph.

Alfred Bray Kempe, an English barrister and amateur mathematician.



However, Tilley found that he could not prove his critical conjecture. It was frustrating because, if true, the conjecture would directly prove the 4-colour theorem. (To be a minimum counterexample, a triangulation would have to contain a Birkhoff diamond subgraph, but if it did, it couldn't be a minimum counterexample.)

AN OVERWHELMING SUPPORTING CASE Instead of proving his conjecture, Tilley did the next best thing. He decided to play the role of an experimentalist and build an overwhelming case to support his conjecture. He divided all relevant planar triangulations into two classes: those in which at least four vertices have to be removed before the graphs fall apart (4-connected) and those in which at least five vertices have to be removed before the graphs fall apart (5-connected). See Figure 3. Helpful in Tilley's extensive search was that he had to examine only one member of each isomorphism class (graphs that are structurally identical).

Behind the Research

Jim Tilley

E: jimtilley@optonline.net

Kenneth Appel and Wolfgang Haken.

T: +1 914 242 9081

W: https://en.wikipedia.org/wiki/Jim_Tilley

W: https://www.researchgate.net/scientific-contributions/2085779401_James_A_Tilley W: www.jimtilley.net/

Research Objectives

Jim Tilley's primary mathematical research interest is graph colouring and, in particular, finding an alternative solution to the 4-colour problem that offers a compelling reason why the 4-colour theorem must be true.

References

Tilley, J. (2018). 'Using Kempe exchanges to disentangle Kempe chains.' The Mathematical Intelligencer, 40, 50–54.

Tilley, J. (2018). ‘Kempe-locking configurations.’ Mathematics, 6(12), 309. Tilley, J. (2017). ‘The a-graph coloring problem.’ Discrete Applied Mathematics, 217, 304–317.

Figure 2. The Birkhoff diamond configuration with endpoint vertices x and y.
Figure 3. Two planar graphs, one a 4-connected triangulation on 6 vertices and one a 5-connected triangulation on 17 vertices.

The conjecture is easily stated, understandable, and intriguing, and offers a compelling explanation for why all planar graphs are 4-colourable.

Tilley examined all 8,044 isomorphism classes of 4-connected planar triangulations on up to 15 vertices and all 9,733 isomorphism classes of 5-connected planar triangulations on up to 24 vertices. He found only three Kempe-locked triangulations.


Each discovered Kempe-locked edge featured a Birkhoff diamond and each occurred in a 4-connected triangulation. There were none at all among 5-connected triangulations. Tilley expanded his search among 4-connected triangulations by examining all 30,926 isomorphism classes on 16 vertices and all 158,428 isomorphism classes on 17 vertices. Computation-time limitations meant restricting his search to samples of 100,000 randomly generated non-isomorphic triangulations each for classes on 18, 19, and 20 vertices. The expanded search turned up 45 additional Kempe-locked triangulations, but exactly the same results as the original search: each Kempe-locked edge in a triangulation featured an associated Birkhoff diamond. WHERE FROM HERE? Tilley's extensive searches easily confirmed that the Birkhoff diamond

is a fundamental Kempe-locking configuration. He has thoroughly tested his conjecture, but it remains unproven. If (or when) Tilley’s conjecture is proved true, i.e., that the Birkhoff diamond alone is the key to 4-colourability, it would be an astounding simplification of the problem: a single configuration that explains it all – mathematical elegance. Proving the 4-colour conjecture required the efforts of many prominent mathematicians. Tilley’s conjecture that the Birkhoff diamond is the sole fundamental Kempe-locking configuration may be even more difficult to prove. However, the experimental evidence is strong. Theorists might say ‘why bother?’ Tilley’s answer is that insatiable curiosity will win out: “After all, the conjecture is easily stated, understandable, and intriguing, and offers a compelling explanation for why all planar graphs are 4-colourable.”

Tilley, J. (2017). ‘D-resolvability of vertices in planar graphs.’ Journal of Graph Algorithms and Applications, 21(4), 649661. Tilley, J. (2018). ‘The Birkhoff Diamond as Double Agent.’ Working paper at arXiv.

Personal Response

Detail 61 Meeting House Road Bedford Corners, New York 10549-4238 USA Bio Jim Tilley earned a doctorate in Physics from Harvard University. Since retiring in 2001 as Morgan Stanley’s Chief Information Officer, he has published original research on graph colouring. He has also published three books of poetry. His debut novel, Against the Wind, will be released in September 2019.

What initially prompted your enthusiasm for the 4-colour problem and led you to study it so intensely? It was my father’s obsession with the problem in his retirement and his desire to use me as the sounding board for his ideas. What are your plans for future research in this area? I have recently reviewed a complicated paper involving a wholly different approach to the 4-colour problem. As the paper stands, I believe it has flaws. Yet, it has promise. I might be tempted to explore a collaboration.



Engineering and Technology ︱ Dr Titus Masese

The materials making potassium-ion batteries possible You are probably familiar with lithium-ion batteries that can be found everywhere from inside our mobile phones to electric cars. However, lithium's larger brother potassium may soon find its way into the batteries that power our everyday lives. Dr Titus Masese at the National Institute of Advanced Industrial Science and Technology in Osaka, Japan, has been developing new materials for electrodes to help overcome some of the current limitations of potassium-ion battery technologies to allow them to reach their potential as a promising low-cost rechargeable battery material for energy storage.

An electrical battery is any device that stores energy that can be converted to electrical energy. A basic battery provides electricity, or a flow of electrons, by having a positively charged cathode and negatively charged anode at each end, with an electrolyte, a conductive solution, in between.

One of the most famous types of rechargeable batteries is the lithium-ion battery, where its high-power density makes it ideal for portable, energyhungry devices like smartphones. However, lithium-ion batteries are not the only option for rechargeable battery technologies.

When a battery is connected to an electrical circuit, chemical reactions within the battery cause electrons to start building up at the anode. Eventually, this pile-up of electrons becomes unstable, and the electrons flow around the circuit to the cathode, providing the necessary electrical power, while ions move through the electrolyte that separates the anode and cathode.

Dr Titus Masese at the National Institute of Advanced Industrial Science and Technology in Osaka, Japan, has been working on new, potassium-based materials for developing potassium-ion based rechargeable batteries. There are good motivations for doing this. Developments in higher-energy, longer-lifetime and lower-cost battery technologies are a key part of the necessary energy storage strategy required for a more sustainable future, and potassium-ion batteries may offer a lower-cost alternative, partly as potassium is over eight hundred times more abundant on Earth than lithium.

However, each of these chemical reactions depletes the stored potential energy in the battery. Rechargeable batteries try to overcome this by reversing the oxidation and reduction reactions that occur at the cathode and anode, this time converting electrical energy back into chemical energy.

There are multiple possible applications for potassium-ion batteries. These range from industry and renewable energy to domestic uses, electric vehicles and mobile phones.


A MATERIALS CHALLENGE Part of the reason that phones today don’t come with potassium-ion batteries is some of the technical challenges with their development that people like Dr Masese and his colleagues are working to overcome. The name of lithium or potassium-ion batteries comes from the type of chemical element

Schematic illustration of the operating mechanism of potassium-ion batteries. Akin to lithium-ion batteries, potassium ions shuttle back and forth through the electrolyte to the electrodes. A layered cathode and a graphite anode are shown for brevity. The cathode (highlighted in red) principally restricts the energy density of potassium-ion batteries and was the central focus of this study.

that is embedded in the electrodes. This ion, which is oppositely charged to the electrons produced in the battery, is released from the electrodes and moves in the opposite direction to the electrons. The movement of the ions is as crucial for the battery performance as the movement of the electrons: if the ions fail to move, the battery cannot discharge and fails to provide energy. What Dr Masese and his colleagues have been successfully able to do is create honeycomb layered cathode frameworks that incorporate potassium ions and are capable of sustaining very high voltages. One of the challenges for using potassium ions in rechargeable batteries is that their large size can make it difficult to incorporate them in the tight-packed lattice frameworks that make up the type of electrodes used in lithium-ion batteries. In an ideal battery, the ions would be fast to release from the framework, move through the electrolyte and then be reincorporated back into the frameworks as required.

The honeycomb structures that Dr Masese and his colleagues have pioneered currently show the largest voltage for any layered cathode material, meaning they could potentially offer the highest energy delivery as part of a battery. He has also been working to incorporate these into rechargeable potassium-ion batteries as a step towards more practical potassium-ion battery technologies.

DURABLE ELECTRODES The secret of the success of Dr Masese's honeycomb structures lies in their ability to facilitate reliable potassium-ion recombination, a significant challenge due to the large size of the potassium ions. Many electrode materials are made from highly organised, regular crystalline structures, so any gaps left by a departing ion will remain in the same place. However, recombination processes and unwanted chemical reactions between the electrolyte and electrode can lead to damaging and aging of the battery. This is why many phones undergo significant deterioration of battery performance even within a few years of manufacture. These honeycomb materials have been shown to be thermally stable and maintain the high voltages they are capable of producing, which bodes well for the

potential lifetimes of potassium-ion batteries based on these cathodes. It is not just the cathode though that is important for battery stability and durability. As the electrolyte can also play a role in detrimental chemical reactions, this too must be as benign as possible.

FUTURE OF ELECTROLYTES Dr Masese and his colleagues have been using ionic liquids as the electrolyte materials in the development of potassium-ion cells, or tellurate-based materials where a solid electrolyte is desirable.


Schematic showing the protocols employed in screening potential cathode materials for rechargeable potassium-ion batteries. Potassium-based compounds were screened using reliable computational processes after which potential candidates were selected. The selected material powders were assembled as electrode materials and their performance evaluated in coin cells using standard electrochemical measurement procedures.



Behind the Research

Dr Titus Masese

E: titus.masese@aist.go.jp T: +81-72-751-9224 W: www.aist.go.jp/index_en.html W: https://unit.aist.go.jp/riecen/index_en.html W: https://www.youtube.com/watch?v=HQl8AJxCAlw

Snapshot of the various novel potassium-based compounds (cathode candidates) synthesised over the course of this study.

Dr Masese’s materials are helping to herald in the post-lithium-ion battery age. exist as charged ions. Conventional electrolytes contain organic solvents, rendering them flammable. They also tend to decompose at high voltages and temperatures. Ionic liquids do not contain flammable organic solvents. Therefore, they are safe and stable at high voltages and high temperatures which are beyond the capacity of conventional electrolytes.

Solid electrolytes are also highly desirable to combat some of the safety concerns about rechargeable ion batteries. Liquid electrolytes can leak if the battery is damaged and are also highly flammable, which is the origin of many of the high-profile stories about mobile phone battery fires. The problem with solid electrolytes is that, although they are better from a safety perspective, they do not show the same efficiency as their liquid counterparts. However, Dr Masese has demonstrated that the tellurate-based materials do show high conductivity for the potassium ions and may well play an important role in the development of solid electrolytes for potassium-ion batteries of the future.

CRYSTAL STRUCTURE: P2-type layered structure (side view) and honeycomb structure (top view). The figure on the left shows the layered crystal structure. On the right is the same structure viewed from above.


Research Objectives

Working with the Advanced Battery Research Group at AIST, Dr Masese's research explores high-performance next-generation rechargeable battery systems.

References

Masese, T. et al., (2018), Rechargeable potassium-ion batteries with honeycomb-layered tellurates as high voltage cathodes and fast potassium-ion conductors, Nature Communications, 9, 3823.

Dr Masese at work in the lab.

Detail


Dr Titus Masese National Institute of Advanced Industrial Science and Technology (AIST) 1-8-31 Midorigaoka, Ikeda-shi, Osaka, Japan PO Box 563-8577 Kansai Center

LIGHTENING THE LOAD One of the key advantages of the potassium-ion batteries that Dr Masese is helping to realise is their very high voltage capability, which he has demonstrated through the performance of the electrode material. High-voltage cells reduce the number of cells needed within a battery pack, which in turn means smaller, cheaper and lighter battery packs, something that is essential for improving the performance of, for instance, electric vehicles. The 85 kWh battery pack in the Tesla Model S weighs in excess of an enormous 500 kg and accounts for nearly a quarter of the total weight of the car. Although 40 litres of petrol weighs around 30 kg, and there is the weight of the fuel tank and fuelling system to consider, finding a solution to the heavy weight and bulky size of electrical batteries would dramatically increase the feasibility of electric vehicles and significantly increase their efficiency.
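A back-of-the-envelope calculation using the figures quoted above makes the point; the only assumption added here is an approximate energy density for petrol of roughly 9 kWh per litre.

    # Back-of-the-envelope specific energy, using the figures quoted above.
    pack_energy_wh = 85_000                    # 85 kWh Tesla Model S pack
    pack_mass_kg = 500                         # quoted pack mass
    petrol_litres, petrol_mass_kg = 40, 30     # 40 litres weighs ~30 kg
    petrol_energy_wh = petrol_litres * 9_100   # assumed ~9.1 kWh per litre

    print(f"battery pack: {pack_energy_wh / pack_mass_kg:,.0f} Wh/kg")
    print(f"petrol:       {petrol_energy_wh / petrol_mass_kg:,.0f} Wh/kg")
    # ~170 Wh/kg versus ~12,000 Wh/kg: even allowing for the much higher
    # efficiency of electric drivetrains, the gap shows why lighter,
    # higher-voltage packs matter.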

Bio
Titus Masese hails from Kenya. He won a Japanese government scholarship after emerging amongst the top students in the KCSE 2002 national examinations. He obtained his BEng, MSc and PhD degrees from Kyoto University (BEng supervisor: Prof Haruyuki Inui; PhD supervisor: Prof Yoshiharu Uchimoto). He is currently a researcher at AIST.
Funding
• National Institute of Advanced Industrial Science and Technology (AIST)
• Japan Prize Foundation
Collaborators
• Dr Kazuki Yoshii
• Dr Yoichi Yamaguchi
• Dr Minami Kato
• Dr Satoshi Uchida
• Dr Toyoki Okumura
• Dr Keigo Kubota
• Dr Hiroshi Senoh
• Prof Zhen-Dong Huang (Nanjing University)
• Prof Yuki Orikasa (Ritsumeikan University)
• Dr Hajime Matsumoto
• Prof Martin Månsson (KTH Royal Institute of Technology)
• Many thanks to Ms Kumi Shiokawa for the relentless technical assistance

Masese, T. et al., (2019). A high voltage honeycomb layered cathode framework for rechargeable potassium-ion battery: P2-type K2/3Ni1/3Co1/3Te1/3O2. Chemical Communications, 55, 985.
Kato, M. et al., (2019). Organic positive-electrode material utilizing both an anion and cation: a benzoquinone-tetrathiafulvalene triad molecule, Q-TTF-Q, for rechargeable Li, Na, and K batteries. New Journal of Chemistry, 43, 1626.

Personal Response What are the next steps in your work to turn these honeycomb materials into a part of battery technologies? From a fundamental point of view, we have shown that high voltages are attainable with the honeycomb-layered tellurate materials. Although tellurium has been studied intensively from both fundamental and technological perspectives, its use may render these honeycomb cathode materials impractical. Part of our ongoing work is the design of tellurium-free related materials also demonstrating high voltages. While we note that there are other challenges that should ultimately be solved for the nascent potassium-ion technology to reach the market, the insights garnered in this study as well as other studies reported so far have identified many exciting and promising routes forward to designing a high-voltage battery prototype. We believe that by reaching out to the wide-spanning scientific community, we can bring the potassium-ion technology closer to reality.

Dr Masese’s materials are helping to usher in the post-lithium-ion battery age and demonstrate a significant advance in the feasibility of potassium-ion-based technologies.



Physical Sciences ︱ Dr Benjamin S. Hsiao

The village outside Ileret in Marsabit County near Lake Turkana, Kenya, one of the poorest regions on Earth.

Sustainable water purification using biomass Nanoscale cellulose materials obtained from the chemical treatment of biomass are very effective agents for the removal of toxic species from water, including heavy metal ions. Professor Benjamin S. Hsiao and his collaborators at Stony Brook University have developed a simple, inexpensive and environmentally friendly approach to preparing nanostructured cellulose for water purification, based on a nitro-oxidation reaction carried out on biomasses of diverse origins. In addition to providing cellulose with a superior affinity for dissolved toxic ions, this process yields nitrogen-rich salts as byproducts, which can be recovered and used as fertilizers.

The ability to remove pollutants quickly, efficiently and at low cost is a basic requisite for the human utilisation of water in a large variety of environments and situations. In many developing countries, clean water remains a rare and precious commodity, since the available (and often very limited) sources of water frequently contain human pathogens of bacterial origin, making them incompatible with human consumption. In industrialised societies, metal pollutants also pose a severe threat to health and the environment. Dr Hsiao’s team has been at the forefront of research on the chemical modification of nanostructured cellulose (nanocellulose) for water purification, and they have recently demonstrated a simple, innovative, and environmentally friendly approach to exploiting nanocellulose from virtually ubiquitous and low-cost natural resources.

HEAVY METAL CONTAMINANTS IN WATER Heavy metal ions are among the most common pollutants of drinking water in modern societies. Lead ions, for instance, are powerful neurotoxins, and they constitute the most prevalent form of heavy metal water pollution on a global scale: over a hundred thousand deaths were attributed to lead poisoning in 2016. Lead poisoning has also been linked to birth defects and to cancer. Lead is commonly used in water transportation and supply infrastructure around the world, and the amount of metal dissolved in drinking water increases with time due to the progressive corrosion of that infrastructure. Furthermore, the recent practice of adding chloramine for disinfection in water treatment facilities has led to even higher concentrations of lead ions in drinking water, because of the reaction of chlorine with lead in domestic pipes, which promotes the metal’s dissolution.

Another major pollutant is cadmium, which is extensively used in electronic circuits, batteries, solar cells, paints and pigments, and can enter water sources through industrial waste and run-offs. Consumption of water or food contaminated with cadmium can lead to severe gastrointestinal irritation and, potentially, to death. Increasing levels of cadmium contamination have been reported in recent studies in some areas of Africa, Asia and South America, although the problem is by no means limited to these regions. Uranium is also a common water contaminant. High levels of uranium salts are observed not only in nuclear waste but also in water sources from regions (including New Mexico, Australia, Austria, Kazakhstan, Canada, India and the Czech Republic) in which this element exists in large concentrations in the bedrock and in groundwater. Upon ingestion, uranium can rapidly enter the bloodstream and bind to red cells to form a uranyl-albumin complex, which can accumulate in the kidneys and in bones.

Removal of heavy metal ions such as these, or of bacterial pathogens, from drinking water can be accomplished by exploiting the ability of materials with suitable functionality (for example activated charcoal or synthetic polymers) to bind the pollutants whilst remaining insoluble in water. After binding, the resulting complexes are secondary contaminants and must themselves be removed. The floc formed from the interaction of nanocellulose with heavy metal ions (as well as bacterial pathogens) can be easily removed by gravity-driven filtration or decanting, thus avoiding costly additional steps. Nanocellulose is one of the most promising classes of materials for water purification, in view of its availability, abundance and low environmental impact, as it can be extracted from any biomass, such as trees, plants and weeds.

CARBOXYCELLULOSE: A FUNCTIONAL MATERIAL FOR WATER PURIFICATION Cellulose is the most abundant organic polymer on Earth, and it is a primary component of the cell walls of plants. It is composed of long chains of D-glucose units connected by bridging oxygen atoms. It is abundant in natural fibres (its content in cotton, for instance, is roughly 90%) and in wood (40-50%), and it is the raw material for the large-scale production of important materials including paper, cellophane and rayon. Cellulose derivatives obtained by chemical treatment of raw cellulose can also bind efficiently with metal ions in water. The work of Dr Hsiao has focused specifically on carboxycellulose, which is composed of cellulose chains that have been chemically modified to include carboxylate groups (-COO-) in their structure.

Nanostructured carboxycellulose has two key features which make it highly attractive as an ion-binding material. First, its nanoscale structure originates from the existence of building blocks (cellulose microfibrils) in the cell walls of raw biomass materials, rather than from the recombination of dissolved cellulose chains. The production of nanostructured carboxycellulose, therefore, does not necessarily require energy-intensive processes. Second, the chemical modification of the cellulose matrix (through processes like oxidation, carboxymethylation, phosphorylation, acetylation and silylation) introduces negative charges into the cellulose structure, which promote nanofiber dispersion in water and provide functional molecular sites for the adsorption of dissolved species. For example, carboxycellulose nanofibers offer very large surface areas and chemically active functional groups, which make them ideally suited for filtration membranes and adsorption media for water treatment.

Biomass is a vast source of nanoscale cellulose, which can be chemically modified to act as an effective water purification material.

NANOSTRUCTURED CARBOXYCELLULOSE FROM BIOMASS Carboxycellulose nanofibers can be obtained through several approaches. One of the most efficient is the TEMPO-mediated oxidation reaction, which converts -OH groups in the cellulose polymer into carboxylate groups under mild conditions. This promotes the fibrillation of large cellulose aggregates into nanofibers whilst maintaining long fibre lengths (submicrons to microns). However, this method is carried out as a sequence of several steps, and it requires speciality chemicals (e.g. sodium hypochlorite, sodium bromide and TEMPO reagents) that generate dangerous radical species. Its sustainability as a large-scale process for producing nanostructured carboxycellulose therefore remains limited. Alternative approaches have also been proposed, based on etherification, oxidation, esterification and carboxymethylation of cellulose; these are only effective for cellulose samples with small concentrations of lignin and hemicellulose, and they require a preliminary treatment with chemicals like alkali and bleaching agents, along with mechanical treatment, to fully fibrillate the cellulose matrix into nanofibers.

A hierarchical structure of cellulose fibres with different diameters in a plant cell wall (plant cell wall → cellulose fibre → cellulose microfibre → cellulose nanofibre).




NITRO-OXIDATION: A CLEAN ROUTE TO FUNCTIONAL NANOCELLULOSE Dr Hsiao and his co-workers have developed a simpler and far more sustainable approach to the production of carboxycellulose nanofibers from untreated biomass, based on the use of a mixture of nitric acid (HNO3) and sodium nitrite (NaNO2) as the only required chemicals. This process has been shown to work very efficiently for untreated (raw) biomasses of various origins (such as jute, spinifex grass and bamboo cellulose) and, crucially, to be a strictly single-step treatment. This considerably reduces the electrical energy and water consumption needs compared to other methods. It has been hypothesised that HNO3 in the mixture initiates the fibrillation process of the untreated biomass by removing non-cellulosic components. The reaction of HNO3 and NaNO2 generates nitrosonium ions (NO+) in the presence of excess acid, which can attack hydroxyl groups in cellulose and produce carboxylate groups. The resulting carboxycellulose fibres exhibit relatively low crystallinity, and substantially higher fibre length and aspect ratio than those of cellulose nanocrystals. Furthermore, the effluent obtained as a by-product can be neutralised efficiently using (inexpensive) sodium or potassium hydroxide, to give nitrogen-rich salts that can be used as fertilizers in agriculture.

Behind the Research

Dr Benjamin S. Hsiao

E: Benjamin.hsiao@stonybrook.edu T: +1 631 839 4402 W: www.hsiaoglobal.org

Various underutilised biomasses that are good resources for the extraction of nanocellulose.

Carboxycellulose nanofibres obtained using the nitro-oxidation method exhibit exceedingly high affinity for several common water contaminants. Cellulose nanofibers obtained by nitro-oxidation provide excellent adsorbent materials for the removal of heavy metal ions – including lead, cadmium, mercury, chromium, uranyl and arsenic – as well as bacteria from water. For heavy metal ions, adsorption capacities several times higher than those of the most effective adsorbents in the literature have been reported. The metal-adsorbed nanocellulose flocs can easily be removed using simple and inexpensive gravity-driven microfiltration or decanting.

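Capacities of this kind are typically extracted by fitting an adsorption isotherm, most commonly the Langmuir model q = q_max·K·C/(1 + K·C), to batch equilibrium data. The sketch below illustrates that fitting step with invented data points – they are not measurements from Dr Hsiao's studies – and assumes NumPy and SciPy are available.

# Sketch: estimating a maximum adsorption capacity q_max (mg of metal per g
# of adsorbent) by fitting the Langmuir isotherm to hypothetical batch data.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, q_max, K):
    """Adsorbed amount q (mg/g) at equilibrium concentration C (mg/L)."""
    return q_max * K * C / (1.0 + K * C)

C_eq = np.array([5.0, 10.0, 25.0, 50.0, 100.0, 200.0])          # mg/L (invented)
q_obs = np.array([180.0, 310.0, 560.0, 800.0, 1050.0, 1250.0])  # mg/g (invented)

(q_max, K), _ = curve_fit(langmuir, C_eq, q_obs, p0=(1000.0, 0.01))
print(f"Fitted q_max = {q_max:.0f} mg/g, K = {K:.3f} L/mg")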
Nanocellulose suspension acts as an effective adsorbent/flocculant capable of removing heavy metal ions, with maximum removal capacities significantly higher than those reported in the literature.

IMPACT Vast amounts of biomass sources, including agricultural waste, weeds and shrubs, are available at low or zero cost, and all of them provide ideal raw materials for the nitro-oxidation process. The approach developed by Professor Hsiao represents a tremendous step forward in the exploitation of underused resources like biomass to develop efficient processes for the removal of heavy metal ions and waterborne pathogens from drinking water, particularly in developing countries and off-the-grid communities.


References

Dr Hsiao and his collaborators are focusing their research on the use of nanocellulose-enabled membranes and adsorbents for water purification.

Hsiao, BS (2019). Benjamin S. Hsiao. [online]. Benjamin S. Hsiao Research Group. Available at https://www.hsiaoglobal.org/ [Accessed 27 February 2019].

Detail

Sharma, PR; Chattopadhyay, A; Sharma, SK; Hsiao, BS. (2017). ‘Efficient Removal of UO22+ from Water Using Carboxycellulose Nanofibers Prepared by the Nitro-Oxidation Method’. Industrial & Engineering Chemistry Research, 56, 13885-13893.

Chemistry Department Stony Brook University Stony Brook, NY 11794, USA



Research Objectives

Bio Dr Hsiao is a Distinguished Professor in Chemistry at Stony Brook University. He is also the Founding Director of the Center for Integrated Electric Energy Systems at Stony Brook University, whose mission is to enhance the development of advanced technologies for the innovative nexus of food, energy and water systems. Funding • The National Science Foundation (Division of Materials Research) • The Claire Friedlander Family Foundation Collaborators • Dr Priyanka R. Sharma • Dr Sunil K. Sharma

Sharma, PR; Joshi, R; Sharma, SK; Hsiao, BS. (2017). ‘A Simple Approach to Prepare Carboxycellulose Nanofibers from Untreated Biomass’. Biomacromolecules, 18 (8), 2333-2341. Sharma, PR; Chattopadhyay, A; Zhan, C; Sharma, SK; Geng, L; Hsiao, BS. (2018). ‘Lead removal from water using carboxycellulose nanofibers prepared by nitro-oxidation method’. Cellulose, 25, 1961-1973. Sharma, PR; Chattopadhyay, A; Sharma, SK; Amiralian, N; Martin, D; Hsiao, BS. (2018). ‘Nanocellulose from Spinifex as an Effective Adsorbent to Remove Cadmium(II) from Water’. ACS Sustainable Chemistry & Engineering, 6, 3279-3290.

Personal Response What are the key advantages of the nitro-oxidation method you have developed compared to existing approaches for the production of water purification agents, and what do you think are the most promising environments in which its application will have the largest impact? There are three key advantages of the nitro-oxidation method. First, the method greatly reduces the consumption of chemicals, energy and water. Second, the processing effluent can be efficaciously neutralised to produce plant fertilisers. Third, the method is effective in extracting nanostructured cellulose from underutilised raw biomass such as agricultural waste. The resulting nanocellulose is proven to be an efficient water purification material (membrane or adsorbent) that can treat a wide range of water pollution problems. The demonstrated technology represents an innovative means to enhance the nexus of food, energy, and water systems, and has many far-reaching impacts that can improve quality of life.

Education and Training ︱ Dr Leyte Winfield

Empowering African American women in STEM

By employing modern theories of learning such as metacognitive skillfulness, agency, and inquiry-based learning, Dr Leyte Winfield, former Chair of the Department of Chemistry & Biochemistry and current Chair of the Division of Natural Sciences and Mathematics at Spelman College in Atlanta, GA, is creating an environment where African American women can gain the critical thinking skills to thrive in Science, Technology, Engineering and Mathematics. Students learning chemistry at the college are being educated through authentic, culturally-relevant learning experiences.

African Americans make up almost 15% of the United States’ population. Despite this, in 2013, around 5% of PhD recipients in the US were African Americans, and fewer than 1% of PhDs were awarded to African American women. Whilst African American women are well-represented early on in higher education in Science, Technology, Engineering and Mathematics (STEM) subjects, the proportion of this demographic drops at each point along the ‘STEM pipeline’ – the journey through STEM education into the workforce. The under-representation of black women in academia may not come as a surprise, but these statistics reveal a startling injustice: African American women face significant barriers to progressing in STEM careers.

Research often highlights this issue, but there is little information on successful measures for improving retention of under-represented groups in academia and into the labour force. Measures for broadening the participation of African American women in STEM may be found amongst the many Historically Black Colleges and Universities (HBCUs) across the United States. These institutions were set up to provide higher education to African American people before the Civil Rights Act of 1964 outlawed racial segregation, at a time when most higher education institutes either prevented African Americans from attending or enforced quotas on enrolment. HBCUs are well-versed in providing an education for African Americans and facilitating their progression through academic science.

SPELMAN COLLEGE One such institution is Spelman College in Atlanta, Georgia. Spelman College has been educating African American (AA) women since 1881, and was the first institution created for this purpose. It is the top bachelor-degree-granting institution of origin for AA women who go on to earn STEM PhDs, and the second for AA individuals in general. More than half of the faculty members in Spelman’s STEM departments are female, 64% of whom are African American. One-third of those entering Spelman’s degree courses major in STEM subjects.

Dr Leyte Winfield is the current Chair of the Division of Natural Science and Mathematics at Spelman College. In her previous position as Chair of the Department of Chemistry & Biochemistry, she was able to redesign the organic chemistry curriculum in order to better engage the female students in her courses. More specifically, the measures were designed to increase the number of chemistry and biochemistry majors who persist in this course of study, as part of a broader goal of creating better equity for African American women in STEM higher education.

METACOGNITION AND AGENCY Metacognition, or more specifically metacognitive skillfulness, is an awareness of how learning occurs – or, more simply, ‘thinking about thinking’. It can be used to self-evaluate and influence learning. Agency describes someone’s ability to control their own actions. Learners with a sense of agency are more able to engage with and invest in learning. In combination, metacognitive skillfulness and agency allow learners to self-regulate by setting goals, employing effective learning techniques, and examining the results of their efforts.

COMMUNITY OF INQUIRY The type of self-regulated learning Dr Winfield is employing in the department is based on a framework for learning called Community of Inquiry (CoI). The CoI framework identifies that social, cognitive and teaching factors are all important in shaping how people learn. Combining these factors creates an environment where learning can occur through group work on problem-solving, with an emphasis on questioning and critical thinking. Dr Winfield’s research explores what happens when the CoI framework is used with students who have been taught theories of metacognition and agency.

SELF-GUIDED LEARNING Dr Winfield’s approach to curriculum design runs counter to the ‘chalk and talk’ approach common in university settings. She uses interactive engagement teaching strategies, which she hopes will benefit institutions with culturally and ethnically diverse populations, as well as contributing to the general trend away from lecture-based content delivery in higher education.

IMPLEMENTING THE FRAMEWORK Dr Winfield began to implement the CoI framework to find out whether this learning environment would improve students’ ability to learn and utilise key ideas in organic chemistry. Could a Community of Inquiry framework encourage students to engage actively with their own learning? Could self-regulated learning still ensure that students were able to learn the content? Organic chemistry is essential for advancement into many biological and health-related careers, and it is a popular course at Spelman College despite its difficulty. As a result, a high number of students struggle with its academic rigor. For Dr Winfield, this makes chemistry an excellent focal point for innovative teaching methods that could improve diversity across STEM subjects at higher levels. This framework for teaching manifests itself very clearly at Spelman College. For a start, the classrooms in the Albro-Falconer-Manley Science Center, where chemistry and biochemistry courses are taught, have been remodelled to facilitate active learning: they contain modular workstations for group work, holding data projectors and computers.




The structure of the chemistry course at Spelman College is loosely based on the Community of Inquiry (CoI) framework which connects elements of social presence, cognitive presence, and teaching presence.

Now more than half of the STEM faculty use digital teaching methods which facilitate active learning. Instead of using classroom time for organic chemistry lectures, students now watch narrated presentations on a digital platform, followed by an online quiz, utilising a technique known as ‘flipped learning’ – so-called because the impetus to learn is ‘flipped’ onto the student. The benefit? More classroom time is now free for face-to-face skills development, guided by worksheets. Dr Winfield hoped that this combination of flipped learning with classroom-based inquiry and digital learning would have a positive impact on her students.

The Community of Inquiry framework applied at Spelman College improves students’ ability to learn.



Behind the Research

Dr Leyte Winfield E: lwinfield@spelman.edu T: +1 404 270 5748

Organic chemistry lab courses at Spelman College place emphasis on inquiry and discovery.

After measuring performance over two consecutive year-groups, Dr Winfield had an answer: not only was the academic performance of the students comparable to that of students taught using traditional methods, but pre- and post-testing also showed that students remained motivated throughout the course and demonstrated more responsibility for their own learning. What’s more, students interacted more with their peers – a skill that Dr Winfield believes is important for student self-belief. IN THE CLASSROOM One challenging aspect of the chemistry curriculum is organic chemistry, which deals with large, complex molecules. In organic chemistry, high-level concepts are made even more difficult by the need to visualise chemical structures in three dimensions. Traditional pen-and-paper representations, limited to two dimensions, mean that many students struggle to make links between representations of molecules and the three-dimensional molecular world.

Peer-led learning is encouraged in the organic chemistry course at Spelman College.

Dr Winfield’s approach here is to turn to digital tools for inquiry-based and peer-led learning. In one organic chemistry course, students are given instruction on how to use the digital tools, and by visualising chemical structures on iPads and physical models, students work individually through a problem sheet. This inquiry-based approach is followed by small-group discussions where peer-led learning can occur, and an instructor is available to answer questions and confirm answers. Across the four years this method has been used and evaluated, 71% of students felt that using technology was valuable or extremely valuable, and 79% thought that the iPad app should be used in future teaching courses for the same material. Inquiry plays an important role in laboratory work too. Lab courses are being moulded into experiences which place an emphasis on inquiry and discovery. Prescriptive instructions are removed, and experimental outcomes are left to the students to discover.

EXPANSION More recently, Dr Winfield has been able to complete a five-year assessment of the flipped learning format. This evaluation indicates that students perform better under the flipped learning teaching style than students did previously. As a result, Winfield has expanded and honed the format for other organic chemistry courses: the courses are topped and tailed by assessments to check student progress, and include timed elements of individual work, group work and solutions given by the instructor. Evaluations of these workshops once again showed positive outcomes: students valued the workshops, reported higher confidence on completion of the worksheets, and demonstrated learning through the pre- and post-tests.

By closely mimicking research procedures, students learn problem-solving skills important for STEM careers.

IMPACT A major motivating factor in Dr Winfield’s curriculum design is the introduction of ‘culturally-relevant teaching’ – the idea that curricula should be designed with regard to the culture and language of the students undertaking the course. By utilising discovery learning methods, Dr Winfield has created a curriculum where students encounter rigorous and authentic problem-solving tasks which have culturally-relevant, real-world implications.


Dr Winfield feels that her work is not done yet – she highlights in her research that evaluation of the courses is ongoing. One thing can be said: by utilising contemporary teaching approaches, Dr Winfield is empowering many African American women with the critical thinking skills needed for careers in science.




Research Objectives

References

Dr Winfield is dedicated to creating culturally responsive initiatives and curricula that result in the productive engagement of minorities and women in various academic settings and in activities that promote gender equity in science careers.

Winfield, L.; Hibbard, L.; Jackson, K.; Johnson S.S. (2019) Cultivating Agency through the Chemistry and Biochemistry Curriculum at Spelman College. Broadening Participation in STEM, 152-181.

Detail
350 Spelman LN, SW
Box 231
Atlanta, GA 30314-4395, USA

Bio Leyte Winfield is the Division Chair for Natural Sciences and Mathematics at Spelman College. She directed departmental efforts to establish new strategies for structured curricular reform. In doing so, she led the department’s efforts to broaden the curriculum to reflect a liberal arts education while simultaneously providing students with resources that promote improved engagement and performance in chemistry and biochemistry courses. She is dedicated to creating culturally responsive initiatives and curricula that result in the productive engagement of minorities and women in various academic settings, and in activities that promote gender equity in science careers. Her work currently focuses on characterising agency in interactive and peer learning spaces.

Funding Funded in part by the National Science Foundation Historically Black Colleges and Universities Undergraduate Program (HBCU-UP) Targeted Infusion Project Award No. HRD-1332575 and the National Science Foundation Improving Undergraduate STEM Education (IUSE) Award No. 1626002

Collaborators
• Lisa Hibbard
• Shannon Sung
• Suazette Mooring
• Shanina Sanders

Winfield, L.; Jackson, K. (2014). ‘Realigning the Crooked Room: Spelman Claims a Space for African American Women in STEM’. Peer Review, 16(2), 9-12.
Hibbard, L.; Fullilove, F.; Winfield, L. (2016). ‘Engineering Course Success Through Interactive Engagement’. Teaching a New Generation of Students, A National Symposium. [online] Available at: https://facultyresourcenetwork.org/publications/teaching-a-new-generation-of-students/felicia-fullilove-lisa-hibbard-leyte-l-winfield-engineering-course-success-through-interactive-engagement/.
Winfield, L.; McCormack, K.; Shaw, T. (2019). ‘Using iSpartan To Support a Student-Centered Activity on Alkane Conformations’. Journal of Chemical Education, 96(1), 89-92.
Winfield, L. (2015). ‘Community-based Interactive Engagement in an Organic Chemistry Course’. In: Proceedings of the International Conference of Education, Research and Innovation (ICERI2015), 16th-18th November 2015, Seville, Spain. ISBN: 978-84-608-2657-6.

Personal Response Benefitting African American women is clearly a driving force in your research into inquiry learning. Why do you think these teaching approaches specifically benefit African American women in STEM over other demographics? I don’t believe these practices benefit African American women more than other demographics. I do believe that these teaching approaches benefit diverse populations in general as they are flexible. They speak to individuals with different learning styles and they place students at the centre of the learning activities. Active learning strategies inspire science identities by showing students that you have confidence in their ability to learn at a higher level, thinking critically and acting without prompting but with thoughtful facilitation from the instructor.



Thought Leader

WHAT IS YOUR ECOLOGICAL FOOTPRINT? IT TAKES 1.7 EARTHS TO SUPPORT HUMANITY’S DEMAND ON NATURE.

Measure what you treasure
How Global Footprint Network is using data to show people how to protect our planet

There is only one Earth, but humans are consuming biological resources as if there were more Earths available. Last year, Global Footprint Network marked ‘Earth Overshoot Day’ on 1st August; humanity had used our budget of resources for the year five months early. For over a decade, Global Footprint Network has been striving to reduce this ecological deficit through education initiatives on both an individual and national scale. The organisation is intent on helping forge a successful pathway to ‘one-planet prosperity.’

In 2008, a journalist suggested that you were now more likely to hear the term ‘footprint’ in relation to one’s impact on the planet than to the mark one’s foot makes in the sand (www.nytimes.com/2008/02/17/magazine/17wwlnsafire-t.html). This shift in thinking has been led and further developed by Mathis Wackernagel, co-creator of the Ecological Footprint concept and co-founder of Global Footprint Network. Global Footprint Network looks to change the way the world uses its natural renewable resources by publishing simple, scalable data for everybody. The Footprint Calculator – which assesses each individual’s Ecological Footprint based on their consumption and tells them how many planets we would need if everybody lived like them – is the perfect example of such a tool. Global Footprint Network’s President, Mathis, tells us more about his vision for the future, and how new digital resources, tools and data are helping people understand the grave predicament our one and only planet is in.

Hi Mathis! Can you tell us more about Global Footprint Network in terms of its mission, background and core principles? Our vision is a world where all can thrive within the means of one planet. Humanity is currently operating an ecological Ponzi scheme, in which we are vastly overusing our planet, thereby putting human well-being at risk. Global Footprint Network’s mission has always been to help end ecological overshoot by making ecological limits central to decision-making. Since 2003, we have pursued this mission by producing the National Footprint Accounts, communicating the science of sustainability through the Footprint Calculator and, for over a decade now, Earth Overshoot Day (www.overshootday.org).


We use more ecological resources and services than nature can regenerate through overfishing, over-harvesting forests, and emitting more carbon dioxide into the atmosphere than forests can sequester.

HOW MUCH DOES FOOD CONTRIBUTE TO YOUR ECOLOGICAL FOOTPRINT? Food makes up 26% of humanity’s total Ecological Footprint. [Chart: the Ecological Footprint of food versus the total Ecological Footprint, in global hectares per person, for the world, Africa, Asia, Latin America & the Caribbean, Europe, Oceania and North America.]

We reach far and wide by partnering with countries, cities and organisations around the world. We have strived, since our inception, to contribute as effectively as possible to making the human enterprise one-planet compatible.

Why did you set up Global Footprint Network, and what do you do on a daily basis? There was and still is no other metric available that comprehensively compares human demand with what Earth can regenerate.

As we work with people around the world, my day starts with early morning phone calls and often ends with late night calls. In between, I work with my colleagues to deliver on our mission. This means working on projects, designing new strategies, engaging with potential supporters, and so on. I have fabulous colleagues at Global Footprint Network, which makes doing the work extra rewarding and intriguing.

Our vision is a world where all can thrive within the means of one planet.




WHAT CAN WE DO TO REDUCE OUR FOOD FOOTPRINT? The way we eat is a fundamental agent of change towards sustainability. How? Increase the proportion of cereals, vegetables and fruits in our diets, and decrease food waste. If everyone in the world consumed world-average calories, reduced the footprint intensity of their diet and cut food waste in half, we would #MoveTheDate of Earth Overshoot Day. Move down the pyramids to #MoveTheDate.

Last year, Global Footprint Network published an article titled ‘Has humanity’s Ecological Footprint reached its peak?’ Do you think the tide has turned in favour of the planet in recent years? I wish we could report this. Carbon emissions are still going up, and even more quickly than in previous years. There is a lot of promising technology and many intriguing projects underway, but they are still not reaching the speed and scale necessary to turn the trend, let alone to reduce demand fast enough to avoid the predictable ecological damage. The technology is available, the financials add up, but society is still not willing.

Looking forward, what issues are high on Global Footprint Network’s priority list to tackle in terms of reducing human impact on the planet? I spend a lot of time thinking about and exploring what is holding us back most significantly. My current hypothesis is that society is committed to a false narrative. If we believe that ‘not being one-planet compatible is not a risk to our own success’, it is unlikely we will act commensurately. Yet much evidence demonstrates that humanity’s ecological overshoot, and each person’s unpreparedness vis-à-vis those trends, are a significant risk to their success. Hence, my priority is to make this more obvious.

Global Footprint Network has won a lot of global awards; what campaign or tool are you most proud of? The things I am most proud of were put in place by others. For instance, the concept of Earth Overshoot Day is to explain overshoot in more understandable terms. Now, we are generating three billion media impressions per year, on a shoe-string campaign. The ability to explain, with Earth Overshoot Day, the challenge of sustainability, even in quantitative terms, without using complicated concepts or words of more than two syllables, is invaluable: ‘From January to August 1, people have taken more from nature than Earth can renew in the entire year.’ All two-syllable words, except for January with three – but hopefully everyone knows that one!

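The arithmetic behind the date really is as plain as Wackernagel suggests: if humanity demands the regenerative capacity of 1.7 Earths, one Earth's annual budget is used up after roughly 1/1.7 of the year. The sketch below shows this back-of-the-envelope version; the officially announced date rests on the full National Footprint Accounts, so it will not match the toy estimate exactly.

# Back-of-the-envelope Earth Overshoot Day (illustrative only).
import datetime

earths_demanded = 1.7                      # humanity's demand, in Earths
fraction_of_year = 1.0 / earths_demanded   # share of the year the budget lasts

day_of_year = round(365 * fraction_of_year)
date = datetime.date(2018, 1, 1) + datetime.timedelta(days=day_of_year - 1)
print(f"Estimated Earth Overshoot Day: {date:%d %B}")   # early August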
The Barilla Center for Food and Nutrition double pyramid shows that foods recommended by nutritionists are also better for the planet. Move down the food pyramid for a healthier diet and a healthier planet.


#MoveTheDate is a great campaign – the idea that if we moved Earth Overshoot Day into the future by just five days every year, we would be back to one planet before 2050. We currently use the biological resources of 1.7 Earths. #MoveTheDate is positive – it shows we need to succeed collectively (it is not just about individual suffering), and it is about expanding our resource security.

The National Footprint and Biocapacity Accounts reveal interesting realities – check data.footprintnetwork.org for results. They use 15,000 data points per country and year, based on super basic and therefore robust science principles. We sometimes call it, pun intended, ‘pedestrian science’. Of course, the devil is in the detail with such large amounts of data. But the principle is simple: we add up all the human demands that compete for biologically productive space. Any high-school student could do this.

Global Footprint Network’s visual data tools appear to be reaching people globally; what do you think it is about presenting the information in this way that impacts people? Information must be accurate, relevant and empowering. These are the principles to follow. People must believe it is accurate – and to build trust the information needs to be transparent, and without apparent conflicts of interest. That is why we are now building a whole new initiative to make the National Footprint and Biocapacity Accounts an independent organisation. We still need to reach far more people though.

What do you see as the biggest barriers to reducing the human Ecological Footprint today? Modern society has a distorted belief system. Society would be better off with a fresh narrative, recognising its dependence on nature. Urbanites believe that nature is no longer important, while we humans have actually become increasingly dependent on material inputs.

We need to shift our narrative and shed our misconceptions about our dependence on nature.

Beyond individuals, has Global Footprint Network had positive experiences with organisations and governments who have employed it? How important are these collaborations to the Network? We called ourselves a Network because we realised that, as a small group of people, we can only have an impact in collaboration with others, including businesses and national governments. Obviously, the best would be if we could convince all policy advisors that eroding their resource security is a fundamental risk to their country’s or city’s long-term success. There are some timid recognitions, but we still need to scale this way up. And when I say ‘we’, I mean everybody: business schools (who often still teach a 20th-century understanding of economics), climate advocates, high school teachers, etc. – everybody.

Earth Overshoot Day fell on 1st August last year. What campaigns have you got planned to raise awareness for 2019? Our goal is to engage people by helping them fall in love with the most significant and intriguing challenge we are facing – how to thrive within the means of our planet. We want to partner with as many people as possible to engage in this journey.

If you would like to work out your own Ecological Footprint or find out more about Global Footprint Network and its work, you can visit their website at: www.footprintnetwork.org.

Global Footprint Network Oakland Office: 426 17th Street, Suite 700 Oakland, CA 94612, USA E: media@footprintnetwork.org T: +1 510 839 8879 W: www.footprintnetwork.org @EndOvershoot @GlobalFootprintNetwork



Informatics and Technology ︱ Dr Zonglu He

Integer-dimensional fractals of nonlinear dynamics

Fractal dimension was developed as a quantitative measure of complex, irregular objects, essentially because it had been accepted that fractals could not be characterised by integer dimensions. Dr Zonglu He, a Professor of Economics at Kaetsu University, is challenging this basic concept with her novel finding that fractals actually can occur in integer-dimensional space-times. Her research demonstrates the consistency of both fractional and topological dimensions. Moreover, she discovers the control mechanisms of fractal behaviour and explores the effects of nonlinear dynamics.

Fractals are never-ending patterns. They can be curves or geometric figures, but each part appears the same as the whole pattern, a property known as self-similarity. Fractals are created by the iterative repetition of a process or function; they are images of dynamic systems driven by recursion – the image of chaos. Fractal patterns can be found throughout nature, for example in snowflakes, seashells, flowers, trees, coastlines and galaxies, and abstract fractals can be generated by computing an equation over and over. Fractals can be used to model structures in which patterns recur at progressively smaller scales, and to describe random or chaotic phenomena such as crystal growth and fluid turbulence. Fractals also have applications in economics, such as modelling market price fluctuations or market risk.

FRACTAL DIMENSION Geometric shapes have dimensions: a point has a dimension of 0, a line has a dimension of 1, an area has a dimension of 2, and a volume has a dimension of 3; from these we can derive shapes such as cones and spheres. In the same way, fractals also have dimensions, which provide a measure of how complicated the fractal is. The fractal index, however, can take non-integer values: a curve with a fractal dimension of 1.1 behaves much like a one-dimensional line, while a curve with a fractal dimension of 1.9 twists through space almost like a two-dimensional surface.

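In practice, a fractal dimension is often estimated by box counting: cover the object with boxes of side eps, count how many boxes are occupied, and read the dimension off the slope of log N(eps) against log(1/eps). The sketch below illustrates the idea on a plain straight line, which should come out close to dimension 1; a genuinely fractal curve would give a non-integer slope.

# Sketch: box-counting estimate of fractal dimension for 2D point sets.
import numpy as np

def box_counting_dimension(points, epsilons):
    counts = []
    for eps in epsilons:
        # Assign each point to a grid cell of side eps; count distinct cells.
        cells = set(map(tuple, np.floor(points / eps).astype(int)))
        counts.append(len(cells))
    slope, _ = np.polyfit(np.log(1.0 / epsilons), np.log(counts), 1)
    return slope

t = np.random.rand(20000)
line = np.column_stack([t, 0.5 * t])            # points on a straight line
eps = np.array([0.1, 0.05, 0.02, 0.01, 0.005])
print(f"Estimated dimension: {box_counting_dimension(line, eps):.2f}")  # ~1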
RESEARCH BACKGROUND In the late 1990s, Professor He was initially interested in statistics, particularly stochastic (random) processes, and time series analysis. At that time, a major discovery was that large economic variations could result from the cumulative effects of noise (such as money demand-supply shocks – sudden temporary increases or decreases in demand and supply) caused by the so-called ‘unit root’ structure. (A unit root is a randomly determined trend in a time series; its occurrence shows an unpredictable systematic pattern.) This finding challenged the traditional view of stationary fluctuations about a deterministic trend, in which money demand-supply shocks were assumed to have no long-term impact on the economy. A structural break (an unexpected change in the economy prompted by a sudden event such as a war or a change in government policy) and long-range dependence (also known as long memory, a feature of statistical time series involving persistently strong autocorrelation between remote observations) were also recognised as displaying unit root behaviour. It was difficult to say whether either of these data features actually results in economic fluctuations. While a plethora of analytical literature exists on testing for these data features, little attention has been paid to their causes.

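The contrast between stationary fluctuations and a unit root is easy to see in two simulated first-order autoregressions. In the sketch below (illustrative only), a lag coefficient of exactly 1 lets shocks accumulate into large, wandering variations, while a coefficient below 1 lets them die out.

# Sketch: stationary AR(1) versus unit root (random walk).
import numpy as np

rng = np.random.default_rng(0)
n = 1000
shocks = rng.normal(size=n)

stationary = np.zeros(n)   # x_t = 0.5 * x_{t-1} + e_t : shocks decay
unit_root = np.zeros(n)    # x_t = 1.0 * x_{t-1} + e_t : shocks persist
for t in range(1, n):
    stationary[t] = 0.5 * stationary[t - 1] + shocks[t]
    unit_root[t] = unit_root[t - 1] + shocks[t]

print(f"Std of stationary series: {stationary.std():.1f}")  # stays small
print(f"Std of unit root series:  {unit_root.std():.1f}")   # grows with n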

Contrary to the underpinning concept of fractal dimension, Professor He’s groundbreaking findings challenge traditional thinking.

THE NONLINEAR AUTOREGRESSIVE INTEGRATED (NLARI) MODEL Professor He pointed out that non-integer fractal dimensions do not explain what creates fractal behaviour or controls fractal levels. She believed that the solution to these issues would be found in modelling the processes that generate the data in different situations on the basis of physical laws; otherwise, the model parameters would have no explicit physical meaning. She went on to highlight that an interdisciplinary, unified time series model was required in order to systematically understand the essential mechanisms of time series processes in the real world. This led Professor He to invent the nonlinear autoregressive integrated (NLARI) model.

Professor He explains that she created the NLARI model to “explore whether the data generating process for integer dimensional space could exhibit fractal behaviour over time, particularly in the parameter ranges of nonlinear stochastic and deterministic dynamics”. She derived the NLARI model by applying Newton’s second law (force = mass × acceleration) to stochastic self-restoring systems, in order to achieve a unified data-generating process of economic variables.

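The article does not reproduce the NLARI equation itself, but the ingredients Professor He names – Newton's second law acting on a stochastic self-restoring system – can be illustrated with a toy discrete-time model. In the sketch below (an illustrative stand-in, not the NLARI model), acceleration is approximated by the second difference of the series, and the balance of a restoring force, frictional resistance and random shocks yields a second-order stochastic difference equation that fluctuates around a stable fixed point.

# Toy Newton's-second-law system with restoration, resistance and noise.
# This is NOT the NLARI model, whose exact form is not given in the article.
import numpy as np

rng = np.random.default_rng(1)
n, mass, damping, restoring = 2000, 1.0, 0.3, 0.5
x = np.zeros(n)
for t in range(2, n):
    force = (-damping * (x[t-1] - x[t-2])   # resistance to motion
             - restoring * x[t-1]           # pull back toward equilibrium
             + rng.normal())                # external random shock
    accel = force / mass                    # Newton's second law: a = F/m
    x[t] = 2 * x[t-1] - x[t-2] + accel      # x'' ~ x_t - 2x_{t-1} + x_{t-2}

print(f"Mean {x.mean():.2f}, std {x.std():.2f}")  # oscillates about zero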
RESEARCH FINDINGS Using the NLARI framework, Professor He was able to explain the nature and causes of unit roots and trend breaks. Further advances in the model meant that Professor He could clarify the NLARI nonlinear dynamics, by obtaining the analytic solutions of its deterministic system, and develop the statistical method for NLARI parameter estimation and hypothesis testing. Using these, she was able to investigate integer-dimensional fractal behaviours and their control mechanisms, as well as the relationships between fractal and dynamic behaviours and the effects of stochastic disturbances on fractals.

Fractal patterns can be found throughout nature, including galaxies.



CONSISTENCY WITH TOPOLOGICAL DIMENSION This study reveals that it is possible for fractal dimension to be consistent with topological dimension. Professor He found that the properties of typical nonlinear dynamics – such as stable fixed points, periodic and aperiodic oscillations, and chaos – and those of typical fractal behaviours – including long-range dependence, self-similarity and power laws – were the same as the properties of the data-generating process in integer-dimensional space-times. (A power law is a functional relationship between two quantities in which, independent of the initial size of the quantities, one quantity varies as a power of the other.) Professor He discovered that long-range dependence and self-similarity with power law were determined by the comparative strength of external environmental and internal organisational influences. If the observation scale is large enough, or the frequency of data is low enough, it is possible to observe typical fractals. Further research with the NLARI model revealed that a stochastic (random) stable fixed point had both self-similarity and long-term memory properties, while a deterministic (non-random) stable fixed point usually showed only self-similarity. Both stochastic and deterministic period cycles, together with chaos, display only long-term memory. The length of the restorative delays also had a significant effect, with self-similarity being observed when the restorative delay was an even number; self-similarity did not occur when the restorative delay was an odd number.


Paired time series showing similar fluctuation patterns across disciplines. a and a’: precipitation (Philadelphia, 1820-01 to 1903-04) and heartbeats with ventricular tachycardia (0209.vt1). b and b’: peak streamflow (Wolf River at New London, 1914-01 to 1938-10) and brainwaves from a brainstorm project. c and c’: exchange rate (EXSZUS, 1971Q1 to 2010Q2) and heartbeats with ST segment alterations (s20641). d and d’: Changjiang flows (Han Kou, 1865-01 to 1948-04) and heartbeats with ventricular fibrillation (0217.vf1). e and e’: ozone thickness (Switzerland, 1926-01 to 1952-12) and heartbeats from an older adult (f1o04). f and f’: Saugeen River flows (Port Elgin, 1988-01-01 to 1991-11-01) and nerve impulses from a snail neuron.

EFFECTS OF NONLINEAR DYNAMICS Plotting the data generated by NLARI revealed that the fractal level of a stable fixed point is controlled by wave indicators reflecting the relative strength of external and internal forces: a larger gradient disclosed a higher positive dependence (i.e., longer memory), whereas a smaller amplitude indicator disclosed a higher level of self-similarity.

Non-integer fractal dimensions do not explain what creates fractal behaviour and controls fractal levels.


Behind the Research

Dr Zonglu He

E: zongluhe@kaetsu.ac.jp T: +81 42 446 3711 W: http://www.kaetsu.ac.jp/ W: https://www.researchgate.net/profile/Zonglu_He


Additionally, a larger amplitude indicator or an even restorative delay could make the sample autocorrelation function oscillate. Professor He observed that the fractal levels of period cycles and chaos relied on the intrinsic resistance, restoration, and regulative delays. Once the internal structure becomes so robust that the system can generate periodic cycles or chaos, however, the extrinsic disturbances can be ignored.

CONCLUSION Contrary to the underpinning concept of fractal dimension, Professor He’s groundbreaking findings challenge traditional thinking and suggest that fractals of self-regulating systems can in fact be measured by integer dimensions. Professor He also indicates that similar fractals and dynamics occur in different disciplines, and that these could also be characterised by the NLARI process that she has uncovered by applying Newton’s second law to self-regulating systems. The pervasiveness of similar dynamics, or fractals, in nonlinear models of phenomena such as rainfall, river streamflow, ozone thickness, brainwaves and nerve impulses could indicate that Newton’s second law underpins these phenomena in the real world.

Research Objectives

References

Professor He’s research focuses on control mechanisms of nonlinear dynamics.

He, Z. L. (2007). A new class of nonlinear integrated models. Far East Journal of Theoretical Statistics, 23, 31–50.

Detail

He, Z. L. (2013). Dynamics and stability of a new class of nonlinear integrated models with resilience mechanisms. Far East Journal of Dynamical Systems, 21, 1–32.

Faculty of Management and Economics Kaetsu University 2-8-4 Minami-cho, Hanakoganei, Kodaira-shi Tokyo 187-8578 Japan

He, Z. L. (2014). Estimation and tests of nonlinear autoregressive integrated models. Far East Journal of Theoretical Statistics, 49, 129–164.

Bio Dr Zonglu He is a professor of Economics at Kaetsu University. She received her PhD from Hiroshima University. Her current research interests focus on control mechanisms of nonlinear dynamics and fractals by modelling the data generating processes of different fields in integrating information, data, perspectives, concepts, background knowledge and theories.

He, Z. L. (2018). Integer-dimensional fractals of nonlinear dynamics, control mechanisms, and physical implications. Scientific Reports, 8. doi: 10.1038/s41598-018-28669-3. Available at https://www.nature.com/articles/s41598-018-28669-3 [Accessed 7th March 2019]

Personal Response What initially prompted your research into integer-dimensional fractals of nonlinear dynamics? Fractal dimension seems not to be a physical scale of complexity because it does not satisfy the uniqueness condition of a physical scale. If fractals model complex physical processes and dynamical systems, the nonlinear dynamics and the controlling physical laws for these fractal-dimension physical processes should result from fractal-dimension space-times. Alternatively, fractals may occur in integer-dimension space-times.

What are your plans for future developments with the NLARI model? I will use the NLARI model to extract depth information from temporal fluctuations – for example, to detect heartbeat parameter alterations from heartbeat series in order to assess cardiac, metabolic, and autonomic nervous functions and the risk of cardiovascular disease, and to explore information transmission in cortical circuits.



Behavioural Sciences ︱ Dr Yoshiko Arima

Measuring shared knowledge with group false memory

Research has shown that when we make decisions, we are influenced by biases relating to the way we structure knowledge in our brains. If these biases are shared by others, they can be exaggerated. This often results in false memories – the recall or recognition of phenomena that did not occur. Yoshiko Arima and colleagues at Kyoto University of Advanced Science investigated the conditions that can lead to false memories within a group process, and the relationship between false memories and shared knowledge structure.

Collaborative groups are known to achieve a higher level of memory accuracy than individuals. This is thought to be the result of a larger information pool, the opportunity to revise errors, and efficient decision-making. Despite this improved accuracy, collaborative groups still tend to make errors in recall tasks. Several studies have found that people are more confident about the accuracy of memories recalled by a group than those recalled by individuals, even when these memories are false. Taken together, this suggests that group false memory occurs when all group members fail to detect errors.

REACHING CONSENSUS IN GROUP DECISIONS There is wide variability in the time taken to reach consensus when making a decision through a group process. The two main factors which influence the likelihood of reaching agreement are within-group discrepancy (differences among group members) and between-group discrepancy (differences among groups). The former has been considered the basis for intragroup disagreements and the latter for intergroup disagreements. However, previous research has suggested that these behaviours can influence each other: studies have found that between-group discrepancies influence not only intergroup processes but also intragroup processes.

FACTORS INFLUENCING GROUP CONSENSUS Group polarization is the tendency for a group to make decisions that are more extreme than the initial inclination of its members (known as the mean tendency). Groups polarize their attitude in the direction of the mean tendency of the whole sample, even though they do not know which direction that is.


Scatter plots of the pre-test and post-test means and variances for the consensus and disagreement groups. Source: Y. Arima, R. Yukihiro & Y. Hattori, Scientific Reports 8, Article number: 10117 (2018), www.creativecommons.org/licenses/by/4.0/.

Using the group polarization paradigm, research conducted by Dr Yoshiko Arima at Kyoto University of Advanced Science explored the conditions that lead to intragroup disagreement, and attitude change following a disagreement, among 269 participants. Her results show that the probability of consensus was low when the group mean differed from the mean of the whole sample. When differences among group members were small, depolarization (attitude change in the direction opposite to the polarization) followed disagreement. This suggests that the groups which deviated most from the population tendency were the most likely to experience within-group disagreement, while within-group differences determined the direction of attitude change following a disagreement within the group.

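The quantities in this paradigm reduce to simple statistics of pre- and post-discussion attitude scores. The sketch below, with hypothetical scores rather than Dr Arima's data, shows one plausible way to measure whether a group polarized (moved further towards its initial pole) or depolarized.

# Sketch: classifying polarization vs depolarization (hypothetical data).
import numpy as np

sample_mean = 50.0                      # mean tendency of the whole sample
pre = np.array([62.0, 58.0, 66.0])      # one group's pre-discussion scores
post = np.array([70.0, 66.0, 71.0])     # the same group's post-discussion scores

direction = np.sign(pre.mean() - sample_mean)   # which pole the group starts on
shift = (post.mean() - pre.mean()) * direction  # >0 polarization, <0 depolarization
print(f"Shift score: {shift:+.1f} "
      f"({'polarization' if shift > 0 else 'depolarization'})")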

Whether groups reach a consensus can appear random; in fact, it is controlled by the variance of the whole society. There is an ‘invisible hand’ that moves public opinion from one pole to another. The basic assumption of Dr Arima’s study is that this invisible hand is our shared knowledge structure, which requires both identity and difference. The problem was: how to measure it?


Dr Arima investigated the effect of shared knowledge, manipulated using associated or randomly ordered word lists, on the association between group remembering and group polarisation. In one experiment, 159 university students answered a questionnaire about the common stereotype that blood type determines personality. Half were given lists of words consistent with this idea, and the other half were provided with randomly ordered word lists. After completing the questionnaire, students were tested on how many words they could recall from the lists that had appeared in the questionnaire. Tests were conducted in either a group or an individual setting. The results demonstrated that stereotype-consistency of the word list reduced the groups’ ability to detect incorrect answers, compared with the individual condition.

In another experiment, 131 high school and university students were divided into three groups: group members having the same blood type (low diversity), members with two different blood types (medium diversity), and members with three or more different blood types (high diversity). This condition induces three levels of variety of memories because of the self-reference effect – the finding that information relating to oneself is easier to recall than unrelated information.

because of the self-reference effect: information relating to oneself is easier to recall than unrelated information. The difference in belief in blood-type stereotypes before versus after the recall task is called the shift score and represents the extent of group polarisation. The results showed an association between the shift score and the total number of recalled words, including false memories, when the word list was consistent with the blood-type stereotype. This suggested that shared knowledge influenced group polarisation.

FALSE MEMORY VS ERRORS
To measure shared knowledge structure, Dr Arima and her colleagues investigated whether group collaboration increases false memories when a word list is arranged in a way that is consistent with pre-stored group knowledge. This was studied using the Deese-Roediger-McDermott (DRM) paradigm, in which participants first take a free recall test after learning a word list whose items are all associated with a certain word (known as the critical word) that is not itself presented in the list. After several of these recall tests, participants take a recognition test that includes the presented associative

words (known as targets), non-presented associative words (critical words, where endorsement indicates a false memory), and non-presented irrelevant words (known as distractors, where endorsement indicates an error). Usually, the proportion of false memories is much higher than that of errors. The activation-monitoring hypothesis predicts that false memory occurs when critical words are continually activated, eventually producing a memory. Evidence for this idea has been obtained from studies showing that false recognition increases, and accurate recognition decreases, with the number of targets in a word list. Based on the activation-monitoring hypothesis, Dr Arima predicted that false memory would depend on the strength of the critical word as a semantic cue at the moment of learning. 'Knowledge structure' represents the associative-semantic network related to specific cues; 'shared knowledge structure' is the intersection of group members' knowledge structures. Two hypotheses were tested: firstly, that group false memory would be larger for the consistent word list condition than for the randomised word list condition; and secondly, that after collaboration group false memory would increase more than errors, while the difference between the consistent and randomised word lists seen at the pre-test stage would remain the same.

There is wide variability in the time taken to reach consensus when making a decision in a group context.

[Figure: ALSCAL results for all data (S-stress = 0.13, squared correlation (RSQ) = 0.94). Dimension 1 (horizontal) represents pleasure versus non-pleasure; Dimension 2 (vertical) appears to discriminate between intellectual activities and rest. Recalled words are plotted by list, distinguishing correct from false recall. Source: Arima, Y., Psychological Reports, Vol 110, Issue 2, 2012. DOI: 10.2466/01.11.17.21.PR0.110.2.607-623.]
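To make the DRM scoring concrete, here is a minimal sketch of how recognition responses can be split into hits, false memories and errors. The word lists and responses are hypothetical illustrations (loosely echoing items in the figure above), not data from the study:

```python
# Minimal sketch of DRM recognition-test scoring. The item lists and the
# endorsed set below are hypothetical, not Dr Arima's materials or data.

targets = {"bed", "rest", "tired"}        # presented associative words
critical = {"sleep"}                      # non-presented critical word(s)
distractors = {"antenna", "shoes"}        # non-presented irrelevant words

# Words a participant (or group) endorsed as "old" in the recognition test.
endorsed = {"bed", "rest", "sleep", "antenna"}

hit_rate = len(endorsed & targets) / len(targets)             # correct recognition
false_memory_rate = len(endorsed & critical) / len(critical)  # critical-word endorsements
error_rate = len(endorsed & distractors) / len(distractors)   # distractor endorsements

print(f"hits: {hit_rate:.2f}, false memories: {false_memory_rate:.2f}, "
      f"errors: {error_rate:.2f}")
# In DRM studies the false-memory rate is typically much higher than the error rate.
```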

Using the DRM paradigm, they conducted the first experiment with 121 university students. Participants collaborated to select keywords, in a between-subjects design, from either a consistent or a randomised word list (between-subjects means that separate groups are compared with each other). The proportion of false responses was larger than that of error responses, particularly in the consistent word list condition, and group collaboration increased false recognition for both word lists, an effect that was not caused by social contagion.

A second study, with 119 university students, used a within-subjects design, in which differences within a group are compared, and confirmed the findings of the first experiment. False memories increased after group collaboration regardless of the meaning of the words, maintaining the difference between the consistent and randomised word lists. Dr Arima also measured the response latency for each character used in the word lists; the results demonstrated relationships between response latency and knowledge-structure scores derived from multidimensional scaling.

The activation-monitoring hypothesis predicts that false memory will occur when critical words become activated continually, and eventually produce a memory.
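To illustrate the kind of pre/post comparison used in the within-subjects design, here is a hedged sketch with made-up numbers (not the study's data or analysis code):

```python
# Toy sketch of the within-subjects comparison: does false recognition rise
# after group collaboration? All numbers are invented for illustration.
from scipy import stats

# Hypothetical false-recognition rates per group, before and after collaboration.
pre_false = [0.30, 0.25, 0.40, 0.35, 0.28, 0.33]
post_false = [0.45, 0.38, 0.52, 0.47, 0.36, 0.49]

# Paired test: each group serves as its own control.
t, p = stats.ttest_rel(post_false, pre_false)
mean_increase = sum(post_false) / len(post_false) - sum(pre_false) / len(pre_false)
print(f"mean increase = {mean_increase:.3f}, t = {t:.2f}, p = {p:.4f}")
```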

The results of the current study show that group false memory can serve as an index of both pre-stored and post-shared knowledge structures. False memories in consistent word lists can be used to understand which associations are shared in pre-stored knowledge, while randomised word lists can reveal the associations with which a group creates post-shared knowledge.

CONCLUSIONS
Dr Arima's findings demonstrate that group false memory is shaped by a shared knowledge structure, a process which can have both positive and negative consequences. With shared knowledge it is easy to exchange concepts, but this may mean increasing mistakes, resulting in group polarization. If a group has diverse knowledge structures among its members, the group will be able to detect these mistakes; however, within the group process there will be difficulties as a result of miscommunication. To overcome this problem, it is necessary to have a complex shared knowledge structure. There may, however, be a trade-off between shared knowledge and the amount of shared memory.

Behind the Research

Dr Yoshiko Arima
E: arima@arimalab.com T: +81-75-406-9230 W: https://www.linkedin.com/in/yoshikoarima/

Research Objectives
Dr Arima's research investigates shared knowledge structure with the aim of estimating the complexity of our collective intelligence.

Detail
Dr Yoshiko Arima, Department of Psychology, Kyoto University of Advanced Science, 18 Yamanouchi Gotanda Cho, Ukyo-ku, Kyoto 615-8577, Japan

Bio
Dr Yoshiko Arima obtained her undergraduate degree in the Faculty of Human Science, Osaka University, in 1980 before obtaining her PhD in Social Psychology. She is currently Professor at the Department of Psychology, Faculty of Humanities, Kyoto University of Advanced Science.

Collaborators
• Roji Yukihiro
• Yosuke Hattori

References
Arima, Y. (2012). Effect of group means on the probability of consensus. Psychological Reports, 110(2), 607-623.
Arima, Y. (2013). Effect of word-list consistency on the correlation between group memory and group polarization. Psychological Reports, 112(2), 375-389.
Arima, Y., Yukihiro, R., & Hattori, Y. (2018). Measuring shared knowledge with group false memory. Scientific Reports, 8(1), 10117.

Personal Response
What is the focus of your future research?
The focus of my future research is to integrate group process studies with collective intelligence studies. Even if a crowd deviates from the truth, expert crowds, or machines learning from experts' data, will outperform the best individual member, because crowds can integrate more variables than a single person can. However, there is no guarantee that we can understand the complexity behind the answers of collective intelligence. The common knowledge effect and cyber cascades polarize crowds, and machines may learn these biases. The study on measuring shared knowledge structure is an attempt to estimate the complexity of our knowledge in order to prevent polarization of our society.


Behavioural Sciences ︱ Andreas Klocke and Sven Stadtmüller

The role of social capital in the health development of children

Social capital refers to networks and ties that deliver support, information and trust to their members. Being part of such a network is your social capital, which in turn can improve health and wellbeing. Andreas Klocke and Sven Stadtmüller from Frankfurt University of Applied Sciences investigated the impact of social capital on the health of children during their development. The researchers aimed to determine whether there was a causal effect of social capital and whether this was consistent across socio-demographic groups.

Health status and behaviours are strongly linked with both wellbeing and quality of life, as documented in worldwide population studies. However, determining how to improve health in individuals and societies is more uncertain. It is questionable whether health is determined solely by income or wealth. Interestingly, research shows that not all wealthy and advanced countries rank at the top of the global league table of happiness. The Scandinavian countries tend to rank highly in this respect. These countries are characterised by a comparatively high level of wealth, but also a high level of life expectancy, low levels of corruption and a sense of belonging, which together facilitate greater trust and solidarity. It is likely that the degree of social capital also contributes to this sense of cohesion and connectedness, a concept which has become increasingly studied within the social sciences in recent years.

DEFINING SOCIAL CAPITAL
The concept of social capital became prominent through the work of Putnam and was initially applied in political science, pertaining to shared values and local networks. Later, the work of Bourdieu, who addresses social capital at the level of individuals, was rediscovered. Today, there is a lack of consensus about how social capital should be defined. Halpern proposed that social capital exists at three different levels: the micro level (family), the meso level (neighbourhood) and the macro level (nation). The key feature of social capital is its focus on the relationships among individuals. Unlike human capital (which focuses on individual abilities) and economic capital (which refers to possessions), social capital addresses the networks and ties in which individuals are woven. Being a member of a network gives the person an advantage through obtaining information, support, access and trust. These, in turn, improve life satisfaction and wellbeing.

Three forms of social capital have been defined by Putnam: bonding, bridging and linking. Bonding refers to strong direct links between individuals in a similar socio-demographic, socio-economic or socio-cultural environment. Bridging pertains to comparatively weak horizontal connections between different groups which originate from a similar social class. Linking refers to vertical links between privileged and less privileged groups. Although the concept of social capital is widely used in social science research, it has not frequently been used in the study of children. Andreas Klocke and Sven Stadtmüller addressed this gap in the literature by exploring whether an intrapersonal change in social capital affects children's health status and behaviour over time. They used longitudinal panel data (long-term data) rather than cross-sectional data (which looks at one point in time), so that they could determine whether the effect of social capital on health was causal. The researchers also investigated whether the results varied across different socio-demographic groups.

APPLYING SOCIAL CAPITAL TO CHILDREN
Despite not being commonly studied in children, all three types of social capital are applicable to this group, as long as adjustments are made for different ages. Family ties and friendships (bonding) amongst similar socio-demographic groups are common throughout childhood. However, children also make connections with those from different backgrounds when they join organisations and associations, such as schools and sports clubs, which is characteristic of bridging. The linking form of social capital is more challenging for children, as such links are inherently hard to establish and might even have a marginalising effect on those from less privileged backgrounds, because children can often decipher different social backgrounds. When applying social capital to children, it is most useful to consider the concept at the individual level, because the other levels are largely unrecognised in this age group. The individualistic approach can be conceptualised with Bourdieu's definition of social capital: 'Social capital is the aggregate of the actual or potential resources which are linked to possession of a durable network of more or less institutionalized relationships of mutual acquaintance and recognition.' For children particularly, the concept can be viewed as a trust-based network which can be accessed when social support is needed. Children cannot usually change their personal circumstances, so they must turn to trusted others, particularly adults.

[Box: An index of social capital in young people.
Quality of relationship to parents: easy to talk to father (stepfather); easy to talk to mother (stepmother).
Quality of school climate: most students in my class like being together; most students in my class are kind and helpful; the other students accept me as I am.
Quality of neighbourhood: people greet each other and speak to each other; smaller children can play outside during the day; one can trust people.]

It is questionable whether health is determined solely by income or wealth.

Based on the established link between social capital and health in adults, a similar relationship might be expected to exist in children. This was the focus of the work of Andreas Klocke and Sven Stadtmüller, who made three predictions with regard to this relationship. The first is that children with a high amount of social capital show better health status and health behaviour. Secondly, the researchers hypothesised that children who gain more social capital over time should see significantly improved health. The third prediction was that social capital is a causal protective factor in the health development of children, which holds true across different socio-demographic groups.

MEASURING SOCIAL CAPITAL IN CHILDREN
Klocke and Stadtmüller used three indicators to measure social capital in children: the quality of the relationship to parents, the quality of the school climate and the quality of the neighbourhood. These elements were selected because they were felt to be most strongly linked with trust in children. In childhood and adolescence, individuals usually gain trust and support from their parents, and thus this relationship is of high importance. School is also highly significant for children because it is where they spend most of their day and involves a high amount of socialisation; the quality of the school environment was measured through the perceived relationships with other pupils (mutual trust). The third element, the quality of the immediate neighbourhood, was considered important because it concerns security. Health was determined by asking children to rate their general health on a five-point scale. They were also asked about sleep, concentration levels, perceived fitness and wellbeing, and consumption of soft drinks and vegetables. Socio-demographic status was assessed by adopting the Family Affluence Scale, asking about family cars, holidays, whether the child had their own bedroom and whether there were books at home.
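As a toy illustration of how such an index can be built from questionnaire items like those in the box above, one might average standardised items within each domain and then across domains. The item names, scales and equal weighting here are assumptions for the sketch, not the authors' exact scoring procedure:

```python
# Toy sketch of building a social-capital index from questionnaire items.
# Item names, response scales and equal weighting are illustrative
# assumptions, not the authors' exact scoring.
import pandas as pd

# Hypothetical responses on 1-5 agreement scales for three children.
items = pd.DataFrame({
    "easy_talk_father": [4, 2, 5],
    "easy_talk_mother": [5, 3, 5],
    "class_likes_being_together": [4, 3, 2],
    "classmates_kind_helpful": [4, 2, 3],
    "students_accept_me": [5, 3, 4],
    "people_greet_and_speak": [3, 4, 2],
    "children_play_outside": [4, 4, 3],
    "people_can_be_trusted": [3, 3, 2],
})

domains = {
    "parents": ["easy_talk_father", "easy_talk_mother"],
    "school": ["class_likes_being_together", "classmates_kind_helpful",
               "students_accept_me"],
    "neighbourhood": ["people_greet_and_speak", "children_play_outside",
                      "people_can_be_trusted"],
}

# Standardise items, average within each domain, then average across domains.
z = (items - items.mean()) / items.std()
domain_scores = pd.DataFrame({d: z[cols].mean(axis=1) for d, cols in domains.items()})
social_capital = domain_scores.mean(axis=1)
print(social_capital)
```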



Behind the Research

Andreas Klocke
E: andreas.klocke@fzdw.de T: +49 69 15332188 W: fzdw.de

Sven Stadtmüller
E: sven.stadtmueller@fzdw.de T: +49 69 15333187 W: fzdw.de

EFFECTS OF SOCIAL CAPITAL AND FAS ON HEALTH
Fixed effects regression models:

                                        Social Capital          Family Affluence       Constant
                                        b       (se)     p      b       (se)     p     b      (se)
Sleep difficulties                      -.164   (.018)   ***    -.017   (.019)         2.94   (.202)
Could not concentrate well              -.202   (.016)   ***     .013   (.016)         2.62   (.172)
Feeling fit and comfortable              .355   (.018)   ***     .019   (.019)         1.94   (.196)
Consumption of Coke and lemonade        -.091   (.015)   ***    -.005   (.016)         2.01   (.170)
Consumption of salads and vegetables     .152   (.016)   ***     .107   (.017)         2.22   (.176)
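For readers who want to see what such a within-child fixed-effects model looks like in practice, here is a minimal sketch using the linearmodels package. The variable names and toy data are illustrative assumptions, not the GUS panel data or the authors' code:

```python
# Minimal sketch of a within-child fixed-effects regression, in the spirit
# of the models tabulated above. Toy data; not the study's actual panel.
import pandas as pd
from linearmodels.panel import PanelOLS

# Long-format panel: one row per child per survey wave.
df = pd.DataFrame({
    "child": [1, 1, 2, 2, 3, 3, 4, 4],
    "wave":  [1, 2, 1, 2, 1, 2, 1, 2],
    "sleep_difficulties": [3, 2, 4, 4, 2, 1, 3, 3],
    "social_capital":     [2.5, 3.1, 1.8, 1.9, 3.0, 3.6, 2.2, 2.4],
    "family_affluence":   [5, 5, 3, 4, 6, 6, 4, 4],
}).set_index(["child", "wave"])

# entity_effects=True absorbs each child's time-constant characteristics,
# so the coefficients reflect within-child change over time.
model = PanelOLS(
    df["sleep_difficulties"],
    df[["social_capital", "family_affluence"]],
    entity_effects=True,
)
print(model.fit().summary)
```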

Key findings:
• An increase (within children) in the volume of social capital reduces (mental) health problems and improves health behaviour.
• Changes in FAS hardly play any role.

SOCIAL CAPITAL IN THE HEALTH OF CHILDREN
Researchers found that social capital was associated with better health status, as indicated by higher perceived fitness and wellbeing and greater consumption of vegetables. In addition, higher social capital was linked with a lower likelihood of sleep difficulties, fewer problems with concentration and reduced consumption of soft drinks. With the exception of soft drink consumption, social capital was a greater predictor of health than family affluence. Overall, the amount of social capital was highly associated with health status, mental health and health behaviour, with the greatest effect for the different features of mental health (sleep and concentration, perceived fitness and wellbeing) and the smallest for physical health behaviour (drinking soft drinks and eating vegetables).

The study also found that a change in social capital was significantly associated with the health and health behaviour of a child. Greater social capital over time was linked with increased perceived fitness and wellbeing, fewer problems with concentration, a reduced likelihood of sleep difficulties, higher vegetable consumption and improved health status. These results held true regardless of family affluence or background, demonstrating that social capital is a much greater predictor of health than socio-demographic status and that this holds true over time. In addition, all three components of social capital (quality of relationship to parents, quality of school climate and quality of neighbourhood) were equally important in predicting health in children.

Children cannot usually change their personal circumstances so they must turn to trusted others, particularly adults.

THE FUTURE
Klocke and Stadtmüller have demonstrated that social capital is a powerful tool in the analysis of health and health behaviour in childhood. This is especially true as children are in the process of growing up. The research findings highlight that an intrapersonal change in social capital over time has a significant effect on the health and health behaviour of an individual. Social capital can thus be conceptualised as a causal factor affecting health development in children, one which operates independently of socio-demographic status. In order to improve the health of children in both the short and long term, it is thus important to establish and maintain social networks and resources (social capital) rather than merely providing financial assistance.

Research Objectives
Andreas Klocke and Sven Stadtmüller aim to determine whether there is a causal effect of social capital on the health of children and whether this is consistent across socio-demographic groups.

Detail
Frankfurt University of Applied Sciences, Nibelungenplatz 1, 60318 Frankfurt, Germany

Bio
Andreas Klocke is Professor in Sociology and head of the Research Centre on Demographic Change at Frankfurt University of Applied Sciences, Germany. His main areas of research are social inequalities, youth, health and demography.
Sven Stadtmüller is Senior Researcher at the Research Centre on Demographic Change at Frankfurt University of Applied Sciences and at GESIS – Leibniz Institute for the Social Sciences. His main areas of research are survey methodology, demography, and political attitudes.

Funding
German Social Accident Insurance (DGUV)

RESEARCH DESIGN
The database is the panel study "Health Behaviour and Injuries during School Age", which started on an annual cycle in the school year 2014/15 in Germany (for details see www.fzdw.de/gus). The study initially surveyed 5th-grade students (10-12 year olds), comprising 10,621 pupils, who are surveyed over six waves (years). Sampling was by classroom: 587 classes in 148 schools (wave one), stratified by federal state, school size and urbanity. The survey is self-administered and computer-assisted (tablets).

References
Klocke, A., & Stadtmüller, S. (2018). Social capital in the health development of children. Child Indicators Research, 1–19. https://doi.org/10.1007/s12187-018-9583-y

Personal Response
Does the relationship between social capital in children and health appear to be linear, or is there a cut-off point whereby health benefits are maximised?
No, there is no threshold which we have to clear before we see a positive effect of social capital on the health development of children. All improvements in social capital are powerful. Furthermore, all socio-demographic groups benefit from a higher volume of social capital, which makes it applicable to general health-promotion programs.


Arts & Humanities ︱ Adrian Buss

Institutional investors and information acquisition: Implications for asset prices and informational efficiency

The research of Dr Adrian Buss (INSEAD) and Matthijs Breugem (Collegio Carlo Alberto) explores how the growth of assets under management by institutional investors with relative performance concerns influences the efficiency of financial markets, asset prices and investors' portfolio returns. The team have developed a theoretical framework which illustrates, among other things, that 'benchmarking' distorts the informational efficiency of stock prices. Their work has important implications for the ability of financial markets to incorporate and provide information in the presence of institutional investors and passive investment management.

Today, roughly two-thirds of all U.S. stocks are held by institutional investors, such as banks, pension funds, mutual funds, insurance companies and hedge funds. Consequently, institutional investors are major players in financial markets and have a substantial impact on asset prices and financial-market efficiency. The performance of these institutional investors is usually evaluated relative to a benchmark (a stock-market index), a practice that has the potential to substantially alter the investment strategies of fund managers and, in turn, affect financial-market equilibrium. Understanding such effects is the objective of the work by Adrian Buss, Assistant Professor of Finance at INSEAD, and Matthijs Breugem of Collegio Carlo Alberto.

In a study recently published in The Review of Financial Studies, Breugem & Buss (2019) have developed a theoretical framework that seeks to explain how the rise of benchmarked institutional investors affects financial markets. Their model extends previous theories by explicitly accounting for both information choice and portfolio choice in the presence of institutional investors who are concerned about their performance relative to a benchmark. Notably, their findings suggest that benchmarking can cause considerable distortions in the value of private information and, hence, informational efficiency.

INSTITUTIONAL INVESTORS
Over recent decades, the importance of institutional investors has grown steadily. For example, the fraction of U.S. equity owned by institutional investors rose from about 7% in 1950 to 67% in 2010 (Stambaugh, 2014). Similarly, institutional investors nowadays account for a majority of transactions and trading volume (Griffin, Harris & Topaloglu, 2003).


Like other investors, institutional investors try to manage their funds so as to maximise the absolute value of these investments. However, their performance relative to a benchmark (or their peers) is also of great importance: there exists considerable academic evidence that funds that beat their benchmark attract more new capital in the future. Also, many institutional investors are paid more by their clients when they deliver a return that is higher than that of the benchmark. For example, Japan's Government Pension Investment Fund, the world's largest retirement fund, recently introduced a system whereby it pays all active managers a fee based on their relative return. The idea of such performance fees is to align the incentives of fund managers with those of their investors. In particular, a fund manager will only receive a high fee if his/her investment return is higher than the one investors could achieve by simply buying the benchmark.

PRIVATE INFORMATION AND INFORMATIONAL EFFICIENCY
Investors in financial markets consistently try to uncover information that other investors do not possess. For that purpose, they study financial statements, gather information about consumers' tastes, hire outside financial advisers, or subscribe to proprietary databases. Such private information is particularly valuable because, if not available to other investors, it allows an investor to generate trading profits by buying stocks that he/she has positive

information about and selling stocks with negative information.

However, whenever investors trade stocks based on private information, part of this information will be incorporated into the price. For example, if an investor has access to information that indicates a good performance of a firm in the future, he/she would buy shares of the respective firm to benefit from this information. This, in turn, will lead to an increase in the firm's stock price, which would then partially reveal the investor's positive information to the other investors. As a result, by studying stock-price fluctuations in financial markets, investors can learn about the private information of other investors.

Overall, this process implies that stock prices (imperfectly) reflect the information that the various investors in financial markets possess. The degree to which financial markets incorporate and reflect private information is denoted informational efficiency (Fama, 1970) and is of foremost importance for an economy because it determines the efficient allocation of capital.

RESEARCH OBJECTIVE
The main objective of the work by Breugem & Buss is to understand how the growth of assets under management by institutional investors with relative performance concerns influences the efficiency of financial markets, asset prices and investors' portfolio returns.

Research by Buss and Breugem demonstrates how the growth of assets under management by benchmarked institutional investors affects informational efficiency and asset prices.

RESEARCH DESIGN
The research team has developed a theoretical framework to address these research questions. The model differs from previous studies of the effects of institutional investors on financial markets (Cuoco & Kaniel, 2011; Basak & Pavlova, 2013; Buffa, Vayanos & Woolley, 2019) in that it allows institutional investors who are concerned about their relative performance not only to choose their optimal portfolio but also to determine how much time and capital they want to invest in the acquisition of private information. This novel modelling framework enables unique conclusions to be drawn regarding the impact of benchmarking on informational efficiency and asset prices. In the model there exist two types of investors: one group is concerned about its performance relative to the benchmark ("benchmarked investors"), while the other is not. To understand how the growth in assets



under management by institutional investors that took place in past years affects financial markets, the authors analyse how financial-market equilibrium changes as the proportion of benchmarked investors in the economy rises.

[Figure: Price informativeness. Stock-price informativeness, measured by the fraction of the variance of the payoff that is explained by the stock price (explained variance, R²), plotted as a function of the fraction of benchmarked investors (Γ). The graph is based on the framework described in Section 3.1 of Breugem & Buss (2019).]

MAIN FINDINGS
The research team documents that benchmarking has two distinct effects on the efficiency of financial markets. First, institutional investors with relative performance concerns acquire less private information. That is, to minimise the risk of underperforming their benchmark, benchmarked investors overweight the stocks that are part of the benchmark in their portfolio. In that regard, they diverge from the traditional stock-picking methods used by non-benchmarked funds. Intuitively, by partially replicating the benchmark, their performance becomes more similar to the benchmark and, hence, the risk of under-performance declines. However, such "hedging trades" are not based on information, as they simply require the purchase of the stocks in the benchmark. Hence, part of benchmarked investors' portfolios will not benefit from private information and, as a result, these investors will reduce their acquisition of private information. Second, benchmarked funds are less aggressive in using their private information. For example, in the case of positive information about a firm, a benchmarked institutional investor will acquire fewer shares of the respective firm than a comparable investor without relative performance concerns. Consequently, the firm's stock price reacts less and, hence, less of the benchmarked investors' information will be reflected in the price.

Both effects imply a reduction in the informational efficiency of financial markets, but the underlying economic mechanisms are quite distinct. The first mechanism implies that there is less private information available in financial markets, owing to the decline in the information-production activities of benchmarked investors; prices can only reflect this lower amount of information. In contrast, the second mechanism implies that less of the available information is incorporated into prices, because benchmarked investors are less aggressive in their trading. Ultimately, less information is known about the stocks in the index. This prediction is consistent with the empirical evidence that an increase in ETF ownership is associated with less-informative security prices (Israeli, Lee and Sridharan, 2017).

[Their model] allows unique conclusions to be drawn regarding the impact of institutional investors on financial markets.

The reduction in the informational efficiency of financial markets has important implications for asset prices and fund managers' returns. For example, the prices of stocks in the benchmark should fluctuate more than those of stocks not in the benchmark. Intuitively, because less information is reflected in their price, the arrival of any piece of news leads to a stronger reaction in the stock price and, hence, more pronounced fluctuations. Another important prediction of the model is that investors who are not (or are less) concerned about their performance relative to an index should outperform (more) benchmarked investors. Intuitively, less-benchmarked funds which gather more private information are better placed to make correct investment decisions, which in turn means that they outperform their benchmarked rivals.

SUMMARY
The research by Buss and Breugem demonstrates that the growth of assets under management by benchmarked institutional funds can have a substantial impact on financial markets. Benchmarking reduces informational efficiency because benchmarked investors acquire less information and trade less aggressively on their available information. As a result, the returns of stocks in the benchmark become more volatile. Moreover, fund managers whose performance is not tied to a benchmark outperform benchmarked investors, increasingly so as the fraction of benchmarked investors in the economy rises. These results are of significance not only to investment professionals, corporate decision makers and regulators but also to retail investors.
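To make the informativeness measure used in the figure concrete, the following is a stylised toy simulation of my own construction, not the authors' equilibrium model: the price loads partly on the payoff-relevant signal and partly on noise, and informativeness is the R² from regressing the payoff on the price. Less aggressive use of information puts less weight on the payoff and thus lowers R²:

```python
# Toy illustration of "price informativeness" as explained variance (R^2):
# the share of payoff variance that the stock price explains. This is a
# stylised simulation, not the Breugem & Buss equilibrium model.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

payoff = rng.normal(0.0, 1.0, n)   # the asset's final payoff
noise = rng.normal(0.0, 1.0, n)    # e.g., noise-trader demand

def informativeness(weight_on_information: float) -> float:
    """R^2 of a regression of payoff on price.

    weight_on_information proxies how aggressively informed investors
    trade: lower aggressiveness puts less information into the price.
    """
    price = weight_on_information * payoff + (1 - weight_on_information) * noise
    corr = np.corrcoef(payoff, price)[0, 1]
    return corr ** 2  # with a single regressor, R^2 equals corr^2

# Less aggressive use of information -> less informative prices.
for w in (0.8, 0.5, 0.2):
    print(f"weight {w:.1f}: R^2 = {informativeness(w):.2f}")
```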

Behind the Research

Dr Adrian Buss
E: adrian.buss@insead.edu T: +33 1 60 72 44 84 W: www.insead.edu/faculty-research/faculty/adrian-buss

Research Objectives

The research of Adrian Buss investigates the impact of financial frictions and institutional investors on asset prices and financial-market efficiency.

Detail
Dr Adrian Buss, INSEAD, Boulevard de Constance, 77305 Fontainebleau, France

Bio
Adrian Buss is an Assistant Professor of Finance at INSEAD. He holds a PhD in Finance from Goethe University Frankfurt and Masters degrees in Mathematics and Business Informatics from the University of Mannheim. His research has been published in leading academic journals (Journal of Finance, Review of Financial Studies, Journal of Monetary Economics).

Funding
This work has received financial support from the Europlace Institute of Finance, the Labex Louis Bachelier and the 'Asset Management Academy – An initiative by Paris Dauphine House of Finance, EIF and Lyxor International Asset Management'.

Collaborators
• Matthijs Breugem, Collegio Carlo Alberto

References
Basak, S., and A. Pavlova (2013) "Asset Prices and Institutional Investors", American Economic Review, vol. 103, pp. 1728-1758.
Breugem, M., and A. Buss (2019) "Institutional Investors and Information Acquisition: Implications for Asset Prices and Informational Efficiency", Review of Financial Studies, https://doi.org/10.1093/rfs/hhy103
Buffa, A., D. Vayanos, and P. Woolley (2019) "Asset Management Contracts and Equilibrium Prices", CEPR Discussion Papers.
Cuoco, D., and R. Kaniel (2011) "Equilibrium Prices in the Presence of Delegated Portfolio Management", Journal of Financial Economics, vol. 101, pp. 264-296.
Fama, E.F. (1970) "Efficient Capital Markets: A Review of Theory and Empirical Work", Journal of Finance, vol. 25, pp. 383-417.
Griffin, J.M., J.H. Harris, and S. Topaloglu (2003) "The Dynamics of Institutional and Individual Trading", Journal of Finance, vol. 58, pp. 2285-2320.
Israeli, D., C.M. Lee, and S.A. Sridharan (2017) "Is There a Dark Side to Exchange Traded Funds (ETFs)? An Information Perspective", Review of Accounting Studies, vol. 22, pp. 1048-1083.
Stambaugh, R.F. (2014) "Investment Noise and Trends", Journal of Finance, vol. 69, pp. 1415-1453.

Personal Response
Does your research suggest that benchmarking should be regulated?
No. Benchmarking plays an essential role in financial markets. It aligns the incentives of retail investors with those of their fund managers. Hence, in general, it should lead to better portfolio returns for retail investors. However, our research highlights a novel tension between benchmarking as a tool to align incentives and its adverse effects on informational efficiency and benchmarked investors' portfolio returns. Also, our research suggests that non-benchmarked fund managers outperform their benchmarked peers, so there might be a natural limit to the size of benchmarked (passive) funds.


Biology ︱ Dr Kaoru Sugimura

Honeycomb wings created by nature’s mechanics

Female adult fruit fly, Drosophila melanogaster.


Forces shape the world we live in, including the bodies we grow into. Tiny motors in our cells push and pull to create different shapes; surrounding cells respond and react to create diverse patterns of tissue, resulting in a world full of unique plants and animals. Drs Sugimura and Ikawa, from Kyoto University, studied the Drosophila fly wing to uncover the components involved in resisting and responding to external force, resulting in a beautiful cellular honeycomb arrangement.

You probably don't think all that much about forces. You probably didn't have to think about whether the chair you are sitting in would support you or whether the building you work in would keep its shape. But a mechanic or engineer has put a lot of thought into designing these things. Each part has to be carefully assembled from the correct material to ensure you don't topple off your chair and the building doesn't cave in. It's easy to think about forces in something you might build, but what about the forces inside your body? You and I, and all the other living things on the planet, are made up of cells. When a human infant develops, forces act between tens of trillions of cells to shape our bodies. If you think of an animal off the top of your head, the likelihood is that it is a very different shape and size to you. We live in a world full of diverse animals and plants; to understand how these different forms arise, we need to understand the forces underlying development. We need to look at the mechanics of nature.

Imagine two judo players in a wrestling match. They push and pull one another and use force to try and win. They also respond to each other: a pull from one player may result in a push from the other. Just like the judo players, cells push and pull during development to create shapes and patterns. Within the cell, proteins are hard at work generating forces, while cells communicate with each other through mechanical movements. Cells also sense and respond to their environment, just like the judo players.

The understanding of force generation within cells and tissues has evolved over the last decade. However, much less is known about how cells respond to forces during development. This mysterious area of research has been pursued by Dr Kaoru Sugimura and her colleagues, Dr Keisuke Ikawa from Kyoto University and Dr Shuji Ishihara from the University of Tokyo, who have heroically uncovered the developmental story of wing cells in Drosophila flies.

HONEYCOMB WINGS
When Drosophila are in the pupal stage of development, the larva is changing (metamorphosing) into an adult fly. During metamorphosis, the cells rearrange their positions to create adult fly tissue. Mechanical forces push and pull cells, and chemical signals trigger cell movement and changes in configuration. Cells respond in many different ways to mechanical force, including elongation, increasing elasticity and directional cell rearrangement, in which cells rearrange by changing the contacts they have with the cells around them. In the Drosophila wing, directional cell rearrangement causes the cells in the wing to align in hexagons, forming a beautiful honeycomb structure.

[Image: A snapshot of the left wing of a Drosophila pupa. Cells are labelled with the GFP-tagged cell adhesion molecule E-cadherin. The vertical and horizontal directions are aligned with the anterior-posterior (AP) and proximal-distal (PD) axes, respectively.]
[Image: Pupa of Drosophila melanogaster.]

…these forces cause the cells in the wing to align in hexagons, forming a beautiful honeycomb structure…

As the Drosophila wing deforms in concert with the hinge (where the wing connects to the body), the tension of the tissue increases, causing the wing cells to elongate and stretch out horizontally. This tissue tension pulls the cells into contact with surrounding cells; the contact point with these other cells is known as a cell junction. At the cell junction, forces act to counterbalance the tension of the tissue. If this tension were not resisted, the cells would be pulled apart. Stabilising the balance of external tension and internal forces allows the cells to comfortably rearrange into a hexagonal shape. Sugimura and

her colleague, Ishihara, neatly unravelled all these factors at play, but there is more to be understood: what generates these resisting forces in the cell? Sugimura dug deep into the cell to find the individual molecules involved in this process.

ACTIN
Actin is a protein found in most cells of the body. It's important for muscle contraction, for the movement and division of cells, and for giving cells their shape. Actin senses and resists forces, and can react and respond to the mechanical environment by altering its elasticity or changing its structure. Actin exists in two forms: fibrous and globular. The small globular actin proteins come together to form a chain, or fibre, and in its chain form it's useful for all sorts of things, including shaping and moving cells! Actin could also be involved in the organisation of many cells within a tissue, because actin is connected between cells through special junctions called adherens junctions. Actin can only change its structure with the help of other proteins called actin-



binding proteins, like AIP1 and cofilin. Because of actin's ability to sense and resist forces, Sugimura and Ikawa studied whether these proteins could be involved in cell rearrangement in Drosophila wing tissue. Other studies have shown that AIP1 and cofilin are important for changing actin's shape and behaviour in vitro, so these proteins were a great place to start exploring further.

[Figure: Top: Still images from time-lapse recording of pupal wing development. Orange rectangles indicate the elongation and shrinkage of wing tissue along the PD and AP axes, respectively; the forces generated in the proximal region of the body stretch the wing along the PD axis. Bottom: In response to tissue tension, cells change their relative positions along the PD axis by remodelling cell junctions.]

ACTIN'S COLLEAGUES
Sugimura and Ikawa started by identifying AIP1 in the wing cells, using a fluorescent protein tagged onto AIP1 so they could see exactly where it was in the cells. They found it was mainly located at the vertical sides of the elongated cells (picture a cell being stretched like a horizontal baguette, and imagine the two short curved ends of the baguette: these are the vertical sides, also known as the AP junctions). As the wing cells are being stretched, it seems to make sense that forces resisting the tension during cell rearrangement would originate from the AP junctions. However, Sugimura and Ikawa had previously found other forces at play at the elongated horizontal sides resisting the tension. Further research into AIP1 revealed it was only present when cell rearrangement was taking place. All of this points to the importance of AIP1 in cell rearrangement, but the gold-standard test is disrupting the gene's function and seeing what happens! The researchers disrupted the AIP1 gene so that it wasn't functioning, and they found a disturbance in hexagonal cell packing and a defect in directional cell rearrangement. All of these results point to the importance of AIP1 in cell rearrangement.

…AIP1 and cofilin are like peanut butter and jelly – they work better together.

AIP1 has a partner in crime: cofilin. Without cofilin, AIP1 binds only weakly to actin, having little effect on its behaviour. Sugimura and Ikawa found that when cofilin was disrupted, AIP1 was no longer localised to the AP junctions, and when AIP1 was disrupted, cofilin changed its localisation. It seems that AIP1 and cofilin are like peanut butter and jelly – they work better together.

MYOSIN AND CANOE/AFADIN
It turns out that it's not just AIP1 and cofilin that are involved in resisting the tissue tension; other proteins are vital to prevent the cells being ripped in two. Sugimura and Ikawa investigated myosin, a motor protein previously shown to be involved in regulating cell rearrangement. Actin and myosin often work so closely together that even their names have merged: actomyosin is the name given to networks of actin and myosin that work together to facilitate, amongst other things, cell movement and stability. The researchers here identified an important role for actomyosin in cell rearrangement: they found that actomyosin requires AIP1 to function correctly. When AIP1 was disrupted, myosin fibres were disfigured into circles, which did not occur when AIP1 was present. The researchers also identified Canoe/Afadin as an important protein for connecting actomyosin to the adherens junction, so that when actomyosin moves it pulls the cell membrane with it. Without any one of these proteins the cells don't rearrange in the right way, leading to mismatched shapes and irregular hexagons in the wings.

BRINGING IT TOGETHER
Building on their findings, Sugimura and Ikawa have put together a chain of events suggesting how cells respond to tissue tension, resulting in cell rearrangement. Tissue tension causes actin to twist at the AP junctions, and this twisting allows cofilin to bind to actin. Cofilin brings AIP1 to the AP junctions, and the combination of AIP1 and cofilin (peanut butter and jelly) results in actin turnover, maintaining the Canoe/Afadin link between actomyosin and the adherens junction. This resists the tissue tension that would otherwise break the cells in two. Now that these mechanisms have been uncovered, Sugimura and Ikawa are keen to conduct further experiments to reveal their importance in other developmental contexts.
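As a toy illustration of how one might quantify the honeycomb order discussed here (this is not the authors' analysis pipeline, and the cell positions are simulated), one can count each cell's neighbours in a Voronoi tiling of cell centroids and report the fraction with exactly six:

```python
# Toy sketch: quantify how "honeycomb-like" an epithelial sheet is by
# counting each cell's neighbours in a Voronoi tiling. Cell centroids are
# simulated; this is not the authors' image-analysis pipeline.
import numpy as np
from collections import Counter
from scipy.spatial import Voronoi

rng = np.random.default_rng(0)
rows, cols = 12, 12

# Hypothetical cell centroids: a hexagonal lattice with a little positional noise.
pts = np.array([(x + 0.5 * (y % 2), y * np.sqrt(3) / 2)
                for y in range(rows) for x in range(cols)])
pts = pts + rng.normal(scale=0.05, size=pts.shape)

# Each Voronoi ridge separates two neighbouring cells.
vor = Voronoi(pts)
neighbour_count = Counter()
for p, q in vor.ridge_points:
    neighbour_count[p] += 1
    neighbour_count[q] += 1

# Only score interior cells, where neighbour counts are meaningful.
interior = [y * cols + x for y in range(2, rows - 2) for x in range(2, cols - 2)]
frac_hex = np.mean([neighbour_count[i] == 6 for i in interior])
print(f"fraction of 6-sided (hexagonal) cells: {frac_hex:.2f}")
```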

Research in Dr Sugimura’s lab combines mechanical and genetic perturbations with live imaging and Bayesian force-inference to explore the physical principles underlying regulation of epithelial morphogenesis.

Detail

…AIP1 and cofilin are like peanut butter and jelly – they work better together. The researchers disrupted the AIP1 gene so it wasn’t functioning and they found a disturbance in hexagonal cell packing and a defect in directional cell arrangement. All of these results point to the importance of AIP1 in cell rearrangement.

References

Bio Kaoru Sugimura received her PhD from the Graduate School of Science, Kyoto University in 2006. She is a currently an associate professor at WPI-iCeMS, Kyoto University. Sugimura uses a wide range of techniques, including live imaging, Bayesian statistics, and physical modelling, to understand the mechanisms by which mechanical forces shape the body. Funding JSPS KAKENHI Grants The JST PRESTO Program The AMED PRIME Program

Sugimura K., Ishihara S. (2013). The mechanical anisotropy in a tissue promotes ordering in hexagonal cell packing. Development 140, 4091-4101

Personal Response Will understanding directional cell rearrangement be useful in understanding aetiology of diseases? I would say yes. Directional cell rearrangement is essential for various developmental processes such as neural tube closure and hearing tube formation (Nishimura et al. Cell, 2012; Kidokoro et al. Development, 2018). Their failure leads to serious birth defects in humans, including anencephaly. Some disease-related proteins also function in directional cell rearrangement. For instance, Pten, a tumor suppressor, controls the elongation of newly generated adherens junctions (Barder et al. Dev. Cell, 2013). Understanding directional cell rearrangement would help us understand how serious birth defects and fatal diseases are caused.

Collaborators • Dr Keisuke Ikawa

Omelchenko/Shutterstock.com

binding proteins, like AIP1 and cofilin. Because of actin’s ability to sense and resist forces, Sugimura and Ikawa studied whether these proteins could be involved in cell rearrangement in the Drosophila fly wing tissue. Other studies have shown that AIP1 and cofilin are important for causing changes in actins shape and behaviour in vitro, so was a great place to start exploring further.

Research Objectives

www.researchoutreach.org

65


Health and Medicine ︱ Dr Matt Pecot

Drosophila Fezf found to be essential in neural circuit formation

Billions of neurons are wired up into highly organised neural circuits in the brain. The creation of these circuits is a complex process, and it is essential that neurons find the correct partners. Matt Pecot and colleagues from Harvard Medical School are interested in understanding this neural circuit formation. In their studies, using Drosophila, they have identified the gene dFezf as an important part of the puzzle.

Brains are grey, squishy and wrinkly; they are not all that exciting to look at. But if you use a microscope it gets a whole lot more interesting: you can see long, thin, wire-like neurons that are important for pretty much everything you do – thinking, planning, feeling, seeing, typing, understanding, reading this article, and the list goes on and on. Neurons connect like wires at points called synapses to send signals and messages across the brain. When you are young your brain is making loads of synapses because it is learning lots of new things – how to understand what you see and hear, how to remember a name, how to recognise family, how to speak, how to walk, and so on. So, it's important that these neurons connect correctly. But how do they do it? There are so many neurons – how do they know which ones to partner up with? Imagine I gave you a box of wires and asked you to connect them all up to make a robot. This sounds tricky, but I wonder if there are some instructions somewhere? Instructions in living things are found in the genes in our DNA. Matt Pecot and his research group are interested in finding the genes that help neurons find the right connections in the brain.

WHY THE FLY?
Do you fancy donating a portion of your brain for research? Most likely not. This is why researchers use models to help figure out what is going on in humans; by studying a model, researchers can gain insights into the biology of humans. Matt Pecot and his researchers use Drosophila melanogaster, the fruit fly, to study how neurons connect to one another. One of the main reasons for using the fly is its similarity with the human brain: both the Drosophila and the human brain have complex layered circuits in the optic lobe (the part of the brain that interprets what you see). Examining this area also allows researchers to see and study individual neurons – another great reason to use Drosophila! Matt's group is interested in how the layered circuits are formed in Drosophila.

TRAFFIC INTO THE M3
Confusingly, the optic lobe is divided up into smaller parts – we just need to know about two of those, the lamina and the medulla. Neurons from the lamina, helpfully named lamina neurons, synapse in the medulla. In an adult Drosophila, the medulla is clearly organised into distinct layers, M1–M10, formed by the synapsing of lamina neurons. But how are these layers formed? It's all down to Earmuff! Let me explain further: L3 is a lamina neuron that synapses in the medulla. In the adult Drosophila, L3 neurons synapse in the M3 layer. The research group previously identified a gene called Drosophila Fezf (dFezf for short, aka Earmuff) in L3 neurons and were interested to see if it had a role in targeting L3 to the M3 layer.

Anatomy of the Drosophila visual system (adapted from Fischbach and Dittrich, 1989). This figure was previously published in Millard, S. S., & Pecot, M. Y. (2018). Strategies for assembling columns and layers in the Drosophila visual system. Neural Development, 13(1). doi:10.1186/s13064-018-0106-9, under the Creative Commons Licence CC BY 4.0.


Outer layers develop in a stepwise manner from broad domains (h APF = hours after puparium formation). (a) A representation of the adult morphologies of lamina neuron axons L1-L5; the arborizations of lamina neuron axons help define specific outer medulla layers. (b) A drawing of lamina neuron growth cones L1-L5 in early pupal development; prior to arborizing in discrete layers, lamina growth cones terminate in distal or proximal domains within the outer medulla. This figure was previously published in Millard, S. S., & Pecot, M. Y. (2018). Strategies for assembling columns and layers in the Drosophila visual system. Neural Development, 13(1). doi:10.1186/s13064-018-0106-9, under the Creative Commons Licence CC BY 4.0.

BYE-BYE EARMUFF
One way that researchers try to understand the importance of a gene is to get rid of it! When the gene is gone, or knocked down, you can see what effect this has on the neuron. In this case, the scientists were interested in whether the L3 neurons still synapsed in M3 without dFezf. Matt's group used flies that were still developing neuronal connections to see the importance of dFezf. The chosen flies were in the pupal stage of development, in which the larva is metamorphosing into an adult fly. Here the researchers encountered a problem: they found that the mature M1-10 layers don't exist in pupal flies!

Matt suggests this mechanism for assembling circuits may have remained unchanged throughout evolution, so it will be found in multiple species.

[Figure: DFezf coordinates the formation of laminar-specific connections. (A) Early in medulla development, dFezf promotes the targeting of L3 growth cones to the proximal versus distal domain of the outer medulla; dFezf may regulate this step by controlling a program of dpr gene expression. (B) L3 growth cones segregate into the developing M3 layer and secrete Netrin, which regulates the attachment of R8 growth cones within the layer; dFezf also regulates this step by activating the expression of Netrin in L3 neurons. (C) Within the M3 layer, L3 and R8 axons synapse onto Tm9 dendrites. When dFezf function is lost in L3 neurons, L3 and R8 axons innervate inappropriate layers while Tm9 dendrites innervate the M3 layer normally; as a result, connectivity with Tm9 neurons is disrupted. This figure was previously published in Pecot, M. Y. et al. (2018), eLife, 7:e33962, https://doi.org/10.7554/eLife.33962, under the Creative Commons Licence CC BY 4.0.]

At this stage, the layers in the medulla haven't fully formed, but the researchers observed two broader regions: the distal and proximal domains. They conducted the knockdown experiments and saw that in pupal flies without dFezf, L3 neurons stopped growing in the distal domain of the medulla. However, in control pupal flies with functioning dFezf, the researchers found L3 synapsed in the proximal domain. It seems that dFezf is important in targeting L3 to the proximal domain during development. The researchers continued to study the flies throughout development. They found that in flies without dFezf, the mature layers, M1-10, weren't distinguished and the neurons displayed abnormal


architecture. However, in the flies with dFezf, they found that L3 neurons had integrated into the developing M3 layer. By knocking down dFezf the researchers were able to answer their question: dFezf is important for targeting neurons to the correct layer during development! The team dug down even further to figure out exactly what dFezf was doing to ensure that L3 neurons synapse in the M3. They wanted to see whether dFezf had direct (instructional) control over where L3 synapsed, or whether a third factor might be involved. To do this they expressed dFezf in two other lamina neurons that also synapse in the medulla. L5 neurons normally synapse in the proximal domain, so expressing dFezf had no effect on the synapsing location: L5 still synapsed in the proximal domain. L2 neurons, however, usually synapse in the distal domain, so if dFezf is instructional, you would expect L2 to synapse in the proximal domain when dFezf is expressed. Matt and his group found exactly this! This means that dFezf has an important role in directing the growth of neurons!

does dFezf actually do? Does it yell instructions like a pirate to the crew on a ship? Probably not. In fact, Matt’s research group have some compelling evidence demonstrating that some other more favourable mechanisms of communication may be at play. They had a good look at the levels of expression of the genes in neurons where dFezf was removed and where it was functioning correctly. They

initial thoughts that dFezf is required for NetB expression. They also found that NetB is important in supporting other communications as well including regulating the synapsing of another type of neuron called R8 into the M3. It seems that dFezf is quite busy, not only does it regulate the expression of genes that target L3 neurons to the M3 but it also regulates NetB expression to recruit other cells to the M3 layer! Remember there are nine other layers so it is likely that similar genes coordinate their assembly like dFezf coordinates targeting of different cells to the M3 layer.

dFezf has an important role in directing the growth of neurons.

WHAT’S GOING ON? We know that dFezf is important in targeting L3 neurons to M3. But what

found lots of genes were affected by the lack of dFezf, including genes involved in regulating the cell surface and release of molecules. It’s possible that dFezf could be a master regulator gene that initiates layer formation through regulating the expression of other genes. Faced with altering levels of multiple genes, the research group had to choose their best candidate to study in more detail. The lucky winner this time was Netrin. NETRIN Netrin or Net for short exists in two forms – A and B. The researchers here focused on NetB as there is more of it in L3 neurons. They found that disrupting expression of dFezf in L3 neurons caused a reduction in NetB confirming their

Researchers from other groups have also identified Fezf as being important for assembling these layered circuits in mammals (mice). Matt's group propose that this mechanism for assembling circuits is evolutionarily conserved – it has remained unchanged throughout evolution, so it will be found in multiple species. Before this can be said for sure, more research is required. The group would like to study the other genes that dFezf regulates to pin down exactly what dFezf does to assemble these layered circuits.

Behind the Research

Dr Matt Pecot

E: matthew_pecot@hms.harvard.edu T: +1 617 432 5157 W: http://www.Pecotlab.com

Research Objectives

References

Matt’s laboratory at Harvard Medical School seeks to understand how the nervous system is assembled during development.

Millard S. & Pecot M. (2018). ‘Strategies for assembling columns and layers in the Drosophila visual system’. Neural Development, 13:11.

Detail

Pecot M., Chen Y., Akin O., Chen Z., Tsui C., Zipursky L. (2014). ‘Sequential Axon-derived Signals Couple Target Survival and Layer Specificity in the Drosophila Visual System’. Neuron, 82(2): 320–333.

Matt Pecot 220 Longwood Ave, Armenise 203, Boston, MA 02446 USA Bio For his PhD, Matt worked with Vivek Malhotra at UCSD investigating the biogenesis of the Golgi apparatus, and he performed post-doctoral studies with Larry Zipursky at UCLA studying neural circuit formation. Matt’s laboratory at Harvard Medical School seeks to understand how the nervous system is assembled during development. Funding • NIH K01NS094545 • NIH-NINDS R01 NS103905 01A • NIH-NINDS R01 NS110713 01

Peng J., Santiago I., Ahn C., Gur B., Tsui C., Su Z., Xu C., Karakhanyan A., Silies M., Pecot M. (2018). ‘Drosophila Fezf coordinates laminar-specific connectivity through cell-intrinsic and cell-extrinsic mechanisms’. eLife, 7:e33962. Available at: https://doi.org/10.7554/eLife.33962.

Personal Response Do you think studying neural circuit formation will provide insights into aetiology or cures for diseases that affect the brain? Absolutely. It is becoming clear that developmental defects in neural connectivity are causal to neurological disorders. Thus it’s imperative to understand the fundamental molecular principles underlying how neural connections are established during development. Using these principles as a guide will allow us to develop therapeutic strategies to restore proper circuitry in diseased individuals.


Collaborators Marion Silies Johannes-Gutenberg-Universität Mainz iDN: Institute for Development and Neurobiology Hanns-Dieter-Hüsch Weg 15 55099 Mainz, Germany

Pecot M., Tadros W., Nern A., Bader M., Chen Y., Zipursky L. (2013). ‘Multiple Interactions Control Synaptic Layer Specificity in the Drosophila Visual System’. Neuron, 77(2): 299–310.



Biology ︱ Dr Daisuke Kihara

Predicting protein function and annotating complex pathways with machine learning

Dr Daisuke Kihara’s team at Purdue University have created novel computational approaches for predicting protein functions. Instead of following a one-protein-one-function approach, their algorithms can predict the functional relationships of entire groups of proteins related to a specific biological process. The team has also expanded into mining overlooked or previously unknown proteins that have multiple, independent functions. The team’s methods challenge the logic behind conventional protein function studies and propose tools that may better capture the complicated nature of protein interactions in biological processes.



Proteins are the main working units of biology. Identifying and understanding what proteins do is crucial for biologists hoping to unravel the complex interactions and systems that drive cellular processes. Although protein function ultimately needs to be validated by hand in the wet lab, researchers first need a hypothesis in order to design assays, which can then define the probable function of a protein.

BIOINFORMATICS FOR PREDICTING PROTEIN FUNCTION

Biologists can build such hypotheses of gene function with computers. As genome sequencing becomes routine in experimental laboratories, computational gene function prediction has become increasingly important. Computational methods are well suited to function prediction because the function of a gene can be inferred from a database search that identifies similarity between the gene and known proteins or experimental data. The Basic Local Alignment Search Tool (BLAST) is one such method: it searches a query sequence against all previously recorded sequences and suggests a scored list of possible roles.

PROBLEMS WITH PREVIOUS COMPUTATIONAL METHODS

However, existing bioinformatic tools can’t always predict protein function accurately, and often end up incorrectly annotating proteins within a biological system. Traditional protein function prediction tools like BLAST are usually reliable when a high sequence similarity is detected, but their accuracy falls quickly for sequences with lower similarities. For example, enzyme functions differ immensely when similarity scores fall below a certain level. Moreover, in many cases traditional methods do not annotate any function if highly similar sequences are not found, leaving many genes unannotated. In principle, other metrics, such as similarity in three-dimensional structure, gene expression, or interaction data, could also be used. However, each of these metrics is often missing for the proteins under investigation, and so has limited applicability.

NEW TOOLS FOR BETTER ACCURACY

Recently, several new protein annotation methods have been developed to improve overall prediction accuracy. One such developer is Dr Daisuke Kihara from Purdue University, who designs function prediction methods with new logical frameworks. In 2009, his team created an automated predictive algorithm, called the extended similarity group (ESG) method, which runs a chain of comparisons instead of a single search. From each sequence found in the first inquiry, the ESG algorithm runs a second search through the database. By combining results from this multi-levelled tactic, the ESG method significantly improves functional scoring for query proteins and outperforms previous function prediction algorithms.
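To give a concrete flavour of this multi-levelled idea, here is a small, self-contained sketch. Everything in it – the three-entry mini-database, the k-mer similarity measure standing in for BLAST, and the level weights – is invented for illustration; it is not the published ESG implementation.

```python
# A toy, self-contained illustration of multi-level similarity scoring in
# the spirit of ESG. The mini-database, the k-mer similarity measure, and
# the level weights are all invented for demonstration; the real ESG
# method works on full sequence databases with BLAST-style statistics.
from collections import defaultdict

DB = {  # id -> (sequence, annotated functions)
    "p1": ("MKTAYIAKQR", {"kinase"}),
    "p2": ("MKTAYLAKQW", {"kinase", "transport"}),
    "p3": ("GGGSWEELLQ", {"protease"}),
}

def similarity(a, b, k=3):
    """Crude k-mer overlap similarity in [0, 1] (stand-in for BLAST)."""
    ka = {a[i:i+k] for i in range(len(a) - k + 1)}
    kb = {b[i:i+k] for i in range(len(b) - k + 1)}
    return len(ka & kb) / max(len(ka | kb), 1)

def search(seq, cutoff=0.1):
    """Return (id, similarity, functions) for database hits above cutoff."""
    return [(pid, similarity(seq, s), funcs)
            for pid, (s, funcs) in DB.items()
            if similarity(seq, s) >= cutoff]

def esg_like_scores(query, w1=0.7, w2=0.3):
    scores = defaultdict(float)
    for pid, sim1, funcs in search(query):       # level 1: direct hits
        for f in funcs:
            scores[f] += w1 * sim1
        seq1 = DB[pid][0]
        for _, sim2, funcs2 in search(seq1):     # level 2: hits of hits
            for f in funcs2:
                scores[f] += w2 * sim1 * sim2
    total = sum(scores.values()) or 1.0          # normalise for comparability
    return {f: round(s / total, 3) for f, s in scores.items()}

print(esg_like_scores("MKTAYIAKQW"))  # e.g. {'kinase': ..., 'transport': ...}
```

Even in this toy form, the second round of searching lets weak, indirect evidence accumulate for a function that a single search would rank poorly.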



Yet the team did not stop there. In a 2019 paper, they combined phylogenetic tree construction with traditional sequence-based prediction in a method called Phylo-PFP. They first confirmed that the similarity of two protein sequences does not always reflect the proteins’ distance on a phylogenetic tree. By folding these phylogenetic distances into the sequence homology score, the ranking of database hits for a query became more reliable, and hits could be more accurately linked to their gene source. The study established that Phylo-PFP significantly improved function prediction accuracy over existing methods.
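The core re-ranking idea can be sketched in a few lines. In this hedged illustration, the example hits, the distance values and the 50/50 weighting are all invented; the published Phylo-PFP method computes distances from real phylogenetic trees and uses its own scoring.

```python
# Hedged sketch of Phylo-PFP-style re-ranking: hits from a sequence search
# are re-scored by blending sequence similarity with phylogenetic distance.
# The example hits, the distances, and the 50/50 weighting are invented.

# (hit_id, sequence_similarity in [0, 1], phylogenetic_distance >= 0)
hits = [
    ("geneA", 0.92, 1.8),   # similar sequence but distant on the tree
    ("geneB", 0.85, 0.2),   # slightly less similar, very close relative
    ("geneC", 0.60, 0.4),
]

def combined_score(sim, dist, alpha=0.5):
    """Blend sequence similarity with closeness on the tree (1/(1+dist))."""
    return alpha * sim + (1 - alpha) * (1.0 / (1.0 + dist))

reranked = sorted(hits, key=lambda h: combined_score(h[1], h[2]), reverse=True)
for hit_id, sim, dist in reranked:
    print(f"{hit_id}: {combined_score(sim, dist):.3f}")
# geneB overtakes geneA once phylogenetic distance is taken into account.
```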

PROTEIN GROUP FUNCTION ANNOTATION

Protein function annotation is typically run on a one-protein-one-function basis, yet this mindset can grossly oversimplify the protein function universe. In fact, most experiments find dozens of interacting proteins related to a single biological event. To understand the role of an entire protein set, function should be determined for the group as a whole, even when the function of each individual protein is unknown. This is no simple task. Dr Kihara’s team therefore focused on a new computational approach for annotating the functions of protein groups. In 2019, they proposed an iterative Group Function Prediction (iGFP) method, which holds a completely new logical framework at its core. The iGFP algorithm takes a set of proteins as input and predicts the function of the entire group, as well as of its individual members. The algorithm blends sequence data from multiple sources and builds a complementary network, then separates the proteins into functionally relevant clusters and compares them based on functional and interaction relationships.

Moreover, the system automatically assumes that some proteins are unknown and uses a range of other comparative features to make an accurate prediction. During this scan, the algorithm considers protein–protein interactions, phylogenetic profile similarity, gene co-expression, large-scale pathway similarity, and gene ontology similarity. This type of comprehensive group function prediction could be an altogether better reflection of the real mechanisms at work in, for example, developmental or disease-causing pathways.

Dr Daisuke Kihara from Purdue University develops function prediction methods with new logical frameworks.

The iGFP algorithm iteratively assigns functions to protein groups and to individual proteins in the groups.




Aashish Jain and Dr Kihara discuss functions assigned to a metabolic pathway.

IDENTIFYING PROTEINS WITH MULTIPLE FUNCTIONS

In addition to analysing protein groups, the Kihara team has taken another step away from the one-protein-one-function scheme by studying multi-functional proteins. Most bioinformatic tools do not take into account that proteins, enzymes in particular, can be multi-functional. The Kihara lab has thus aimed to predict whether a query protein is a moonlighting protein – one that has multiple autonomous and often unrelated functions. These proteins are difficult to annotate, since their functions are not genome or protein family specific, nor linked to other indicators, such as a shared switching mechanism. Yet these proteins play key roles in cellular disease states such as cancers, and so identifying them is important. To solve the problem, Dr Kihara’s team has developed a new systematic approach to studying moonlighting proteins. In 2016, the team proposed an automated prediction framework which uses several types of non-sequence-based data to identify moonlighting proteins. They used machine learning classifiers to predict multi-functional proteins, after which they cross-validated the results against existing databases. Dr Kihara’s team could predict moonlighting proteins that had prior gene sequence data with 98% accuracy. Even when no sequence data was available, the system showed an impressive 75% accuracy. Furthermore, in a 2018 paper the team used deep learning to sniff out moonlighting proteins from previously published literature. Their text mining tool DextMP could work out whether a protein had multiple functions based on information from journal publications and functional descriptions in protein databases. Using systematic literature processing tools, the researchers could significantly reduce the time needed to annotate moonlighting proteins and move closer to clarifying the complex interplay of proteins within the cell.
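As a flavour of the machine-learning-and-cross-validation workflow described above, the sketch below trains a random forest classifier on synthetic stand-in features; the feature matrix and labels are randomly generated placeholders, not the team’s curated protein association data or published pipeline.

```python
# Minimal sketch of training and cross-validating a moonlighting-protein
# classifier. All data here is synthetic stand-in material; the real
# framework uses curated protein association features, not random numbers.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# 200 proteins x 5 hypothetical association features
# (e.g. interaction degree, co-expression, ontology/pathway scores).
X = rng.normal(size=(200, 5))
y = rng.integers(0, 2, size=200)           # 1 = moonlighting, 0 = not

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
print(f"mean CV accuracy: {scores.mean():.2f}")
```

With random labels the accuracy hovers around chance, which is exactly the point of cross-validation: only genuinely informative features, like the protein association data the team used, lift it above that baseline.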

The iGFP algorithm considers a set of proteins as input and predicts the function of the entire group, as well as its individual proteins.

IMPROVEMENTS AND FUTURE PREDICTIONS

Computational biology desperately needs new ways to accurately reflect the true nature of biological processes. Dr Kihara’s team has made innovative strides to step away from the traditional one-protein-one-function effort and has identified functions for entire protein groups. Their algorithms outperform previous sequence-based methods by layering multiple protein characteristics and taking into account evolutionary relationships, which can be better indicators of shared functions than the amino acid sequence alone. Further, the team’s machine learning methods can predict whether a protein serves a double role, and whether such proteins have unknowingly been described in previous literature. Despite these promising developments, bioinformatic prediction tools are only as intelligent as their design, and there is still a way to go towards fully automated, AI-driven research in protein function annotation. Overall, Dr Kihara’s team suggests that combining previous methods with emerging ones from omics experiments and evolutionary distance analysis will further solidify functional prediction accuracy in the future.

Behind the Research

Dr Daisuke Kihara

E: dkihara@purdue.edu T: +1 765 496 2284 W: http://kiharalab.org

Research Objectives

References

Dr Kihara’s work focuses on developing new techniques for computational protein function prediction.

Chitale, M., Hawkins, T., Park, C., & Kihara, D. (2009). ESG: extended similarity group method for automated protein function prediction. Bioinformatics, 25(14), 1739–1745.

Detail

Jain, A., & Kihara, D. (2019). Phylo-PFP: improved automated protein function prediction using phylogenetic distance of distantly related sequences. Bioinformatics. 35(5):753-759.

Department of Biological Sciences Department of Computer Science Purdue University 249 S. Martin Jischke Dr West Lafayette, IN 47907, USA Bio Dr Kihara is a full professor in the Department of Biological Sciences and the Department of Computer Science at Purdue University, West Lafayette, Indiana. He received a BS degree from the University of Tokyo, Japan in 1994, and a PhD degree from Kyoto University, Japan in 1999. After studying as a postdoctoral researcher with Prof Jeffrey Skolnick, he joined Purdue University in 2003 and was promoted to full professor in 2014. Since 2018, he has held an adjunct professor position at the Department of Pediatrics, University of Cincinnati. He has worked on various topics in protein bioinformatics. His current research projects include the development of algorithms for protein-protein docking, protein tertiary structure prediction, structure modelling from low-resolution image data, structure- and sequence-based protein function prediction, and computational drug design. He has published over 150 research papers and book chapters. His research projects have been supported by funding from the National Institutes of Health, the National Science Foundation, the Office of the Director of National Intelligence, and industry. He has served on the program committee of various bioinformatics conferences including Intelligent Systems for Molecular Biology (ISMB), where he is a track chair in 2019. In 2013, he was named a University Faculty Scholar by Purdue University.

Jain, A., Gali, H., & Kihara, D. (2018). Identification of Moonlighting Proteins in Genomes Using Text Mining Techniques. Proteomics, 18(21–22), 1800083.

Khan, I. K., & Kihara, D. (2016). Genome-scale prediction of moonlighting proteins using diverse protein association information. Bioinformatics, 32(15), 2281–2288.

Khan, I. K., Jain, A., Rawi, R., Bensmail, H., & Kihara, D. (2019). Prediction of protein group function by iterative classification on functional relevance network. Bioinformatics, 8, 1388–1394.

Personal Response What kind of role will machine learning play in protein function prediction and understanding biological processes? Machine learning has already been playing a big role in protein function prediction, and more widely, in bioinformatics. It is particularly effective in identifying subtle signatures that are easily overlooked by humans in input data including protein sequences that are relevant to particular functions. It is also very suitable for integrating many different types of data together to make predictions.


Funding • National Science Foundation (NSF) • National Institutes of Health (NIH) Collaborators • Aashish Jain • Ishita Khan



Biology ︱ Dr Charles Turick

In-situ monitoring of microbial circuitry

Microbial metabolisms are valuable tools in industrial biotechnology. The ability to monitor and measure the productivity of microbes is essential, but many standard techniques are limited by their labour- and time-intensity. With funding from the Department of Energy’s Bioenergy Technologies Office, Office of Science, and Environmental Management Program, as well as the Department of Defense’s Defense Threat Reduction Agency, Dr Charles (Chuck) Turick at the Savannah River National Laboratory and collaborators from Clemson University, the University of South Carolina, and Savannah River Consulting have applied electrochemical techniques to enable in-situ monitoring of microbial metabolic activity, unhindered by the limitations of standard technologies.


Microorganisms are incredibly valuable tools in industrial biotechnology. The plasticity and repertoire of microbial metabolisms are further increased through genetic engineering, giving rise to a vast array of potential uses for microbes. They can clean oil spills from oceans, yield fuel from plants, and produce pharmaceuticals such as penicillin. The biotechnology industry therefore forms a significant part of the economy (known as the bioeconomy), with US revenues totalling $350 billion, or 2.5% of GDP, in 2012. For quality control purposes and to maximise process efficiency, the activity of the microbes in a bioprocess must be monitored. Slight changes in oxygen concentration, pH, temperature and other conditions can have large impacts on the metabolic activity of microorganisms, and therefore on product yields. By looking at changes in cell numbers and in a range of metabolites within a microbial culture, scientists can gain insight into the metabolic status of the bioprocess and how it can be manipulated and optimised.

MEASURING MICROBES

However, many methods to sample and analyse the conditions of microbial cultures are time-consuming, labour-intensive and costly. This is particularly evident when microbes must be grown in the absence of oxygen or under harsh pressures, making sampling a much more complex procedure. The time and labour taken when samples are manually collected and analysed (termed offline monitoring) limit the amount of data that may be collected, as well as the opportunity to detect issues in the process as they develop and, therefore, the capacity to optimise the bioprocess. The discontinuous nature of offline monitoring also leaves gaps in the data, which can lead to misinterpretation. To avoid such issues, automated sensors can be submerged within a microbial culture to facilitate continuous, real-time monitoring of conditions (so-called online in-situ monitoring). However, because the sensor remains in-situ, microbes can accumulate on the sensor itself and reduce the accuracy of its measurements (known as biofouling). Costly and time-consuming cleaning and maintenance of the sensor is then required.

A typical microbial growth curve (green circles) can be defined electrochemically (below) and modelled mathematically (black circles) using an equivalent circuit (inset) to provide physiological data, relative to abiotic controls (white circles).

The ideal technology to monitor bioprocesses should navigate around these limitations whilst providing accurate and useful data. Dr Chuck Turick at the Savannah River National Laboratory, with his colleagues Ms Ariane L. Martin and Dr J. Michael Henson at Clemson University, Dr Sirivatch Shimpalee, Dr John Weidner, Mr Pongsarun Satjaritanun and Mr Blake A. Devivo at the University of South Carolina, and Dr Scott Greenway at Savannah River Consulting, has developed an automated, non-invasive approach to monitoring microbial activity with real-time data acquisition that does just this. Part of this work has also resulted in a patent for monitoring microbial growth on surfaces (biofilms).

Microorganisms are traditionally considered as sources of biochemical catalysts (enzymes) and are typically analysed using chemical methods, such as chromatography and mass spectrometry. As in all chemical reactions, those occurring within microbes are driven by the flow of electrons. For example, the microbial cell surface is negatively charged, which attracts positive ions, resulting in the formation of a double layer on the cell membrane. Dr Chuck Turick and his collaborators therefore began to consider microbes as electric circuits as well as chemical catalysts. Any disturbance to the bioprocess may be detected by looking at changes in both electron flow and chemical composition. The combination of these methods gives rise to the growing fields of electrochemical monitoring and electromicrobiology.

WHAT IS ELECTROMICROBIOLOGY?

Electromicrobiology uses electrochemical techniques, such as cyclic voltammetry and electrochemical impedance spectroscopy, to study microbial activity. Dr Chuck Turick has been using such technologies for over a decade to study how microbes can sequester uranium in contaminated soils. More recently, he has developed electrochemical techniques to monitor microbial growth and metabolic activity in-situ, offering an attractive alternative to standard analytical methods.

Dr Chuck Turick and his collaborators began to consider microbes as electric circuits as well as chemical catalysts.

HOW DOES IT WORK?

Cyclic voltammetry works by measuring the current that is generated as a range of potentials is applied to a redox-active system, such as a microbial culture. A redox-active metabolite can switch between oxidised and reduced forms through the loss or gain of one or more electrons. As increasing potentials are applied to an electrode submerged in a microbial culture, the current of the electrode increases over time as it gains more electrons.

Electrochemical monitoring is being developed for use in bioreactors at various scales. Future applications of this technology include monitoring biopharmaceutical processes in conjunction with studies in Dr Sarah Harcum’s Cell Culture and Fermentation Optimization Laboratory at Clemson University.



The electrode eventually reaches the reduction potential of a metabolite of interest – the potential at which the metabolite switches between reduced and oxidised forms, also known as the peak current or charge density – and electrons pass from the electrode to the metabolite via extracellular electron transfer. This reduces the metabolite and oxidises the electrode, resulting in a detectable drop in the electrode’s current. The current is then reversed and the redox reaction reverses, repeating in a cycle. When microbial cultures are monitored in this way, the charge density of the reduction peaks is measured as an average across the entire culture. Changes in the charge density of a peak can correspond to changes in the ratio of living to dead cells within a culture, as well as to changes in the metabolic status of the cells. Electrochemical impedance spectroscopy (EIS) is used to gain insight into the composition of cells within a culture. In simple electrical circuits, resistance is a measure of the circuit’s ability to resist current flow. In complex circuits, such as microbial cultures, resistance becomes a more complex parameter and is instead described by impedance. The impedance of a microbial culture is measured by applying a potential to an in-situ electrode and monitoring the change in current as it passes through the culture. By using a range of frequencies, the impedance values can be analysed to understand how the composition of a culture varies as cells grow and their metabolisms change over time.
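As a hedged illustration of the equivalent-circuit idea mentioned in the growth-curve figure (not the team’s actual fitted model), the sketch below computes the complex impedance of a simple resistor–capacitor circuit across a frequency sweep; all component values are invented.

```python
# Illustrative EIS calculation: impedance of a simple equivalent circuit
# (solution resistance Rs in series with a parallel Rct || C element).
# Component values are invented for demonstration and are not fitted to
# any real microbial culture data.
import numpy as np

Rs = 50.0      # solution resistance (ohms) -- assumed
Rct = 500.0    # charge-transfer resistance (ohms) -- assumed
C = 1e-5       # double-layer capacitance (farads) -- assumed

freqs = np.logspace(-1, 5, 7)                 # 0.1 Hz to 100 kHz
omega = 2 * np.pi * freqs
Z = Rs + 1.0 / (1.0 / Rct + 1j * omega * C)   # complex impedance

for f, z in zip(freqs, Z):
    print(f"{f:10.1f} Hz  |Z| = {abs(z):7.1f} ohm, "
          f"phase = {np.degrees(np.angle(z)):6.1f} deg")
# At low frequency |Z| approaches Rs + Rct; at high frequency it falls
# to Rs, so changes in Rct or C as cells grow shift the whole spectrum.
```

Fitting measured spectra to circuit elements like these is what lets changes in a culture’s composition be read off as changes in impedance.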

Behind the Research

Dr Charles (Chuck) Turick

E: Charles.Turick@srnl.doe.gov T: +1 803 507 2714 T: +1 803 819 8407 W: https://amb-express.springeropen.com/articles/10.1186/s13568-018-0692-2 W: www.youtube.com/watch?v=7pjtjsj2Wu0&list=PLcR_7p639ABWvSLYpr6IXgxtjvkUDlQHu&index=12&t=199s W: www.srs.gov/general/srnl/tech_transfer/tech_briefs/SRNL_TechBriefs_AutoElectrochemicalTechnique.pdf

Electrochemical techniques are used to follow and define electron flow in microbes during growth. Studies incorporate conventional microbial methods. This approach provides abundant data related to microbial activity.

Electrochemical techniques reduce the labour and time-intensity of data collection [and] avoid the limitations posed by biofouling.

In Dr Chuck Turick and his collaborators’ latest publication, cyclic voltammetry and EIS were demonstrated to accurately and effectively monitor microbial growth. The techniques show that both impedance and the charge density of reduction peaks correspond to the number of cells in a culture and their growth status. Impedance, now a patented monitoring technology, was also shown to change as cellular growth slowed, indicating shifts in metabolic activity.

The key advantages of electrochemical techniques include their capacity for real-time data acquisition and their ability to monitor in-situ, thereby minimising disruption to the microbial culture and reducing the labour and time-intensity of data collection. In addition, cyclic voltammetry was demonstrated to effectively prevent microbes from accumulating on electrodes submerged in cultures. This suggests the technology can be used to clean electrodes without removing them from the bioprocess, avoiding the limitations posed by biofouling. The field of electromicrobiological research has gained great interest in the decade since Dr Turick organised the first international symposium on the subject in 2008. Electrochemistry is already a well-established technology amongst physical scientists but was rarely used beyond traditional electric circuits. Its novel application in microbiology has now inspired greater collaboration between scientists and engineers from different backgrounds to continue developing and expanding the field of electromicrobiology, changing how microbial metabolisms are considered, monitored, and even manipulated.


Research Objectives

References

Dr Turick studies electron transfer mechanisms in microbes incorporating electrochemical techniques to define microbes as tiny electrical circuits for applications in industrial biotechnology.

Carlson, R. (2014). ‘The U.S. Bioeconomy in 2012’. [online] Synthesis. Available at: http://www.synthesis.cc/synthesis/2014/01/the_us_bioeconomy_in_2012?rq=bioeconomy%20in%202012 [Accessed 13/01/2019].

Detail

Martin, A. L., Satjaritanun, P., Shimpalee, S., Devivo, B. A., Weidner, J., Greenway, S., Henson, J. M., Turick, C. E. (2018). ‘In-situ electrochemical analysis of microbial activity’. AMB Express, 8:162.

Environmental Science and Biotechnology Savannah River National Laboratory BLDG 999W Aiken, SC 29808 USA Bio Chuck Turick received his BS in biology from California University of Pennsylvania, MS and PhD in microbiology from West Virginia University and the University of New Hampshire, respectively. Funding • Savannah River National Laboratory Directed Research and Development Program • Department of Energy (DOE)/ Office of Energy Efficiency & Renewable Energy - Bioenergy Technologies Office • DOE/Office of Science • DOE/Environmental Management Program • DoD/Defense Threat Reduction Agency (DTRA) Collaborators • Ariane L. Martin, J. Michael Henson Department of Biological Sciences, Life Sciences Facility, Clemson University, Clemson, SC, USA. • Sirivatch Shimpalee, John Weidner, Pongsarun Satjaritanun, Blake A. Devivo Department of Chemical Engineering and Computing, University of South Carolina, 541 Main Street, Columbia, SC, USA • Scott Greenway Savannah River Consulting, 301 Gateway Drive, Aiken, SC, USA

Turick, C. E., Beliaev, A. S., Zakrajsek, B. A., Reardon, C. L., Lowy, D. A., Poppy, T. E., Maloney, A., Ekechukwu, A. A. (2009). ‘The role of 4-hydroxyphenylpyruvate dioxygenase in enhancement of solid-phase electron transfer by Shewanella oneidensis MR-1’. FEMS Microbiol Ecol, 68, 223–235.

Turick, C. E., Ekechukwu, A. A., Milliken, C. E., Casadevall, A., Dadachova, E. (2011). ‘Gamma radiation interacts with melanin to alter its oxidation-reduction potential and results in electric current production’. Bioelectrochemistry, 82, 69–73.

Personal Response Can electrochemical techniques replace standard methods such as measuring optical density of microbial cell cultures? At this stage of the technology, we are interested in providing additional information to fill the gaps between conventional analyses or to detect sudden unexpected changes between conventional sampling intervals. As this technology matures, it may very well replace conventional techniques.



Biology ︱ Drs Cecilia Demergasso and Guillermo Chong

The acidic brine lakes of Chile: A surprising microbial community

The Andean salt flats of Chile are well known for their vast natural beauty. Interspersed within their white salt crusts, small lakes, springs and creeks can be found, alongside the unique microbial communities that inhabit them. Drs Cecilia Demergasso and Guillermo Chong, along with a multidisciplinary research team from the Universidad Católica del Norte of Chile, are currently investigating the inhabitants of the rare acidic brine lakes found in Chile, with the aim of learning more about the extreme microorganisms surviving in these hostile environments and how they contribute to the complex local environment through biogeochemical cycling.

The Andean salt flats are the remnants of former saline lakes of Neogene/Quaternary age that dried up over geological and recent time. Originally, and over different spans of time, large bodies of water sat within basins of the mountainous regions of the Andes, before severe and cyclic climatic changes gradually caused evaporation from these lakes to exceed the rate of precipitation. The basins were closed, with no outlet to drain, so evaporation left behind thick crusts of salt and, in several smaller cases, brine lakes fed mainly by underground recharge from the High Andes, which remained as the larger basins dried up.

WHY SO ACIDIC?

Typically, the brines of salt lakes are neutral or alkaline. However, among the many brines in the salt lakes of the region, two rarer acidic brine lake systems in Chile – Salares Gorbea and Ignorado (salar means salt flat in Spanish) – have become the focus of research for Demergasso and Chong’s team. A large variety of hydrated minerals found on Mars using spectrometry has been shown to be potentially genetically linked to volcanic activity and the occurrence of acid sulphate solutions like those observed in Gorbea. The combination of a highly saline and acidic environment means that only the toughest and most resilient microorganisms can survive. The ability of microorganisms to inhabit and persist in this kind of saline ecosystem has prompted researchers to study the terrain as a potential Mars analogue.

SALAR DE GORBEA

Acidic brine lakes have previously been studied outside of Chile, in places like Australia. Salar de Gorbea (the subject of Demergasso and Chong’s team’s investigations) could therefore provide a useful site for comparing differences in microbial populations under contrasting geological and environmental conditions. The acidic brine lake is located in the high Andes (4,000 m above sea level) with a salinity of 28% – eight times saltier than average sea water.

Demergasso and Chong’s research hopes to shine a light on the diverse microbial communities living within the acidic brine lakes, to further improve our understanding of our planet’s unique microbial diversity – which could have wide-ranging implications for conservation and potential applications in biotechnology.

Flanked by the Cerro Bayo complex – a volcano found on the Chile–Argentine border – the lakes have a local acidity (pH 2–4) that geochemical studies suggest may have arisen for several reasons: the presence of volcanic native sulphur; the oxidation of that sulphur releasing sulphuric acid, which seeped into the lakes; and a change in the buffering capacity of the local rocks (their ability to help maintain and control pH change in the system). Demergasso and Chong were keen to find out whether microorganisms known to play a relevant role in acid production by using sulphur as their energy source, such as the bacterium Acidithiobacillus thiooxidans, were contributing to the acidity of the system, as similar microbes have been detected in other acidic environments such as hot springs and acid mine/rock drainage systems. To find out, several sampling expeditions across five zones of the Salar de Gorbea were carried out over a six-year period. Sediment and brine samples collected in the 2009 and 2011 campaigns were then selected for further analysis. Microbiological analysis was first carried out by extracting DNA from the brine and sediment samples. The DNA was then subjected to Denaturing Gradient Gel Electrophoresis (DGGE) – a technique used in molecular microbial ecology to separate out different fragments of DNA – creating a fingerprint of the complex microbial community living within the samples. More recently, metagenomic analyses have been performed, and phylogenetic analysis and bioinformatics tools have been used to determine the species found and to characterise the functional content of the genes discovered.
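To give a sense of the kind of community-level summary such sequencing data supports, here is a small sketch computing relative abundances and the standard Shannon diversity index; the read counts are invented placeholders, not the team’s data (though the genus names come from this article).

```python
# Toy community summary from taxon read counts. The counts below are
# invented placeholders, not data from Salar de Gorbea; the Shannon
# index H = -sum(p_i * ln(p_i)) is a standard diversity measure.
import math

counts = {
    "Acidithiobacillus": 420,
    "Alicyclobacillus": 180,
    "Sulfobacillus": 90,
    "Mycobacterium": 310,
}

total = sum(counts.values())
proportions = {taxon: n / total for taxon, n in counts.items()}

shannon = -sum(p * math.log(p) for p in proportions.values() if p > 0)

for taxon, p in sorted(proportions.items(), key=lambda kv: -kv[1]):
    print(f"{taxon:20s} {p:6.1%}")
print(f"Shannon diversity H = {shannon:.2f}")
```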

A THIOTROPHIC COMMUNITY

Microorganisms are usually grouped together according to their metabolic activity, which enables the basic functions of life – growth, repair and reproduction. Unlike us, who require oxygen and energy from sugars to survive, microorganisms can exploit a range of other elements found in nature to sustain their activities, including making their own energy sources from inorganic compounds. Typically, in salty environments such as salt lakes, heterotrophic microbes that need salt to survive (halophilic) or can grow under saline conditions (halotolerant) predominate. However, analyses of sediment and water samples taken from the Salar de Gorbea did not detect any of these species, suggesting that oligotrophy – the ability of organisms to live in an environment with very low nutrient levels – is an important feature of survival in such an extreme ecosystem. Instead, Demergasso and Chong found an unusual microbiota characterised by living on the energy of sulphur.

The combination of a highly saline and acidic environment means that only a specialised and resilient microbiota can survive.



A smaller range of sulphur-metabolising bacteria from other groups was also detected, including sequences related to Alicyclobacillus, Sulfobacillus and Leptospirillum – chemolithotrophic microorganisms that produce energy by the oxidation of sulphur. Based on the metagenome analysis performed by Demergasso and Chong’s team, many of the organisms described were found to be potentially capable of carbon fixation and sulphur oxidation. Carbon fixation is a process whereby inorganic carbon (such as carbon dioxide) is converted to organic compounds; perhaps the best-known example of organisms capable of carbon fixation are plants, which convert CO2 into sugars exclusively using photosynthesis (autotrophs). Yet in their studies, Demergasso and Chong’s team found that carbon fixation capability appears to be part of a mixotrophic model that microbes in the acidic brine lakes exploit: their primary production is through the oxidation of reduced sulphur compounds, which allows them to efficiently conserve energy for fixing carbon when necessary. Demergasso and Chong have discovered a specialised microbiota supported by volcanic sulphur in a salty environment with low iron and organic matter content: a thiotrophic community.

One of the most surprising findings was the detection of an abundant Mycobacteria population living in the acidic brine lakes.

NON-TUBERCULOUS MYCOBACTERIA

One of the most surprising findings from the analysis was the detection of Mycobacteria living in the acidic brine lakes. The best-known species, Mycobacterium tuberculosis and Mycobacterium leprae, have been well studied due to the pathogenic role they play in causing the diseases tuberculosis and leprosy.

Above: Native sulphur at one of the active solfataras in Lastarria Volcano.

Behind the Research

Yet there are many other, less well-known species belonging to the genus; these are known as non-tuberculous forms and are found in an array of different ecosystems, including engineered and natural water systems, soils and even swamps.

Cecilia Susana Demergasso E: cdemerga@ucn.cl T: +56 553 55622 W: www.ucn.cl W: www.cbar.cl


E: gchong@ucn.cl T: +56 55 2355745 W: www.ucn.cl W: www.cbar.cl

This is the first time that Mycobacterium has been found living in an acidic brine system, and given that they are representatives of the discovered specialised microbiota, Demergasso and Chong’s team hope to analyse the genomes of the Mycobacteria isolated from Salar de Gorbea to better understand the reason for their abundance in this system.

Research Objectives

Detail

References

A MICROBIAL PLANET

Microorganisms inhabited the Earth long before animals, and their ability to adapt and survive in a range of different habitats has made them the most abundant organisms on the planet. Microorganisms’ involvement in biogeochemical cycling makes them important research subjects, helping us understand how ecosystems respond to environmental change. By learning more about the microbiota that can survive in these extreme environments, Demergasso and Chong’s team hope that their data can be used to inform conservation that respects and protects our microbial planet.

Biotechnology Center, Universidad Católica del Norte Avda Angamos 0610, 1240000 Antofagasta, Chile

Escudero, L., Oetiker, N., Gallardo, K., et al. (2018). ‘A thiotrophic microbial community in an acidic brine lake in Northern Chile’. Antonie van Leeuwenhoek, 111:1967.

Drs Cecilia Demergasso and Guillermo Chong Díaz’s research aims to characterise the microbiota of an acidic brine lake in Chile to improve understanding of these complex environments.

Bio Cecilia Susana Demergasso is the Director of the Center of Biotechnology, Universidad Católica del Norte, as well as an associate researcher in the Centro de Investigación Científica y Tecnológica para la Minería. Cecilia conducts research in i) microbiology, where her current project is ‘Microbial diversity in the saline domain in Northern Chile’, and ii) biotechnology, mainly on the subject of biomining. Guillermo Chong Díaz is a Chilean geologist and Professor at the Universidad Católica del Norte at Antofagasta, Northern Chile. The mineral chongite is named after him. Guillermo’s main research interest is the “Saline Domain” of Northern Chile, including climatic/palaeoclimatic evolution in Cenozoic times; salt flats and their geomicrobiological environments; the economic settings of salt flats, including lithium/potash deposits; and the unique world around nitrate/iodine deposits. Funding • BHP Minerals Americas Project 32002137 • Fondecyt Project 1100795 Collaborators • Lorena Escudero González • Cinthya Tebes-Cayo • Juan José Pueyo Mur • Karen Gallardo • Nia Oetiker • Paulina Cortéz-Rivera • Mayra Cortés Cabrera • Alex Echeverría-Vega • Roberto Véliz Cruz


Guillermo Chong Diaz

Personal Response

How can an understanding of acidic brine lakes inform conservation in the region?

Salars and saline lakes are extremely fragile environments in the Andean framework. Over the last 20–30 years, they have been systematically exploited for their industrial minerals (B, Li, K), but mostly for water extraction. This situation, in some cases, compromises their conservation and their flora, fauna and microbiota. However, some existing evaporitic deposits are practically intact, such as salars with acid brines. Publications describing salars in terms of their geology, microbiota and other characteristics directly emphasise their importance as a landscape, their unique content of microorganisms, and their biochemical potential, as well as their status as a habitat unique worldwide.

What effects do you think a changing climate may have on the unique lake systems?

The ages of salars and saline lakes are measured not in historical time but in geological time. These deposits have already endured climate changes, which can be observed in the diminished size of the original saline basins, in paleocoasts, isolated terraces, and fossils of organic structures (stromatolites) indicating a previous period of development during a more humid climate.




Biology ︱ Dr Peter Kevan

Secrets of the stalk: Regulating plant temperature from the inside out

Have you ever questioned the reasoning behind hollow stems in some plants? When you think about it, a surprising number of plants have this feature. A common misconception is that the hollow stem exists solely for nutrient or water transport. In fact, the reality has much more to do with temperature regulation. Dr Peter Kevan from the University of Guelph is one of the first in the field to pursue an understanding of this mechanism, in the hope that it will provide economic, environmental, and horticultural benefits to both nature and society.

The plant stem is a complex structure, providing a transport system, mechanical support, and also the newly explored function of thermal regulation. To this day, the physiology of the inside of stems has rarely been examined with respect to heat transfer, a process normally associated with leaves and flowers. To fill this research gap, Dr Peter Kevan from the University of Guelph is studying differences in plant stem types, and how effectively they regulate temperature.

LUMEN MICROMETEOROLOGY

The temperature inside the lumen (hollow centre) of a stem depends on the amount of solar radiation (energy from the sun) it receives. When solar radiation hits the stem wall, some is reflected, some is absorbed into the stem tissue itself, and some is transmitted through the stem wall into the lumen. The energy absorbed by the ambient gases in the lumen is what maintains the temperature of hollow stems; these gases circulate heat through convection, conduction, and re-radiation.

This rather intriguing line of study, known as micrometeorology, is literally what the name describes – studying ‘weather’ on a miniature scale. Factors that normally contribute to air temperature on a large scale are simply applied to the inside of a hollow stem. There are numerous factors that can affect this, both biotic (with respect to the plant) and abiotic (the external conditions). Dr Kevan wanted to examine a variety of plant species with different hollow stem traits to understand the relationship between climate and plant stem physiology.

This entire process has been referred to as the ‘microgreenhouse effect’, as it replicates the greenhouse effect on a tiny scale inside the plant stem. Previously, this has been studied with reference to flowers – an increase in temperature encourages growth of the sexual organs, which in turn helps the flower develop quickly and present more prominently for enhanced attraction of pollinators. This, however, is one of the first times it has been considered with respect to the stem and stem growth.

Over half of the plant species collected had hollow stems.


The lumen also loses energy by radiating back to the outside air. Some energy is also absorbed into the liquid transport system, and some is used up during photosynthesis and plant metabolism (growth). By understanding these physical properties, Dr Kevan will be able to decipher why particular stem traits have evolved for certain climatic conditions. For example, a translucent stem may allow more energy through; however, it may also mean much is lost – and a weak stem tends to droop – so a compromise is required.
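As a rough, back-of-the-envelope illustration of such an energy balance (not Dr Kevan’s model), the sketch below solves for the steady-state lumen temperature at which absorbed sunlight balances radiative and convective losses; every coefficient is an assumed value.

```python
# Toy steady-state energy balance for a sunlit hollow stem:
# absorbed solar input = radiative loss + convective loss.
# All parameter values are assumed for illustration only.
SIGMA = 5.67e-8      # Stefan-Boltzmann constant (W m^-2 K^-4)

S = 600.0            # solar irradiance reaching the stem (W/m^2) -- assumed
absorptance = 0.4    # fraction of sunlight retained by the lumen -- assumed
emissivity = 0.95    # thermal emissivity of the stem surface -- assumed
h = 15.0             # convective heat transfer coefficient (W m^-2 K^-1)
T_air = 288.15       # ambient air temperature (K), i.e. 15 degrees C

def net_flux(T):
    """Positive when the lumen gains more energy than it loses."""
    gain = absorptance * S
    loss = emissivity * SIGMA * (T**4 - T_air**4) + h * (T - T_air)
    return gain - loss

# Simple bisection between ambient and ambient + 30 K.
lo, hi = T_air, T_air + 30.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if net_flux(mid) > 0 else (lo, mid)

print(f"lumen runs ~{0.5 * (lo + hi) - T_air:.1f} K above ambient")
```

With these invented numbers the lumen settles several degrees above ambient; shading the stem (reducing S) drops it back towards, or slightly below, air temperature, mirroring the observations described below.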


A PRELIMINARY STUDY

To start answering these questions, Dr Kevan and his team began collecting and examining plants from across temperate regions, including Canada and the UK. Notably, over half of the plant species collected had hollow stems, even discounting those with a pith-filled lumen. Among these there was a large variety in lumen diameter, stem wall width, and stem wall texture. Temperature probes were inserted into the lumens and the results compared with the ambient air temperature and the presence of light. Generally, in the presence of sunlight, the temperature in the lumen was above that of the ambient air. In particular, the hollow stem of a dandelion was up to 8°C warmer than the ambient air in sunlight. However, lumen temperature normally dropped slightly below ambient when shaded. This certainly reinforces the theory that there is a distinct microclimate within the stem, which changes depending on abiotic and biotic conditions. In Magadan, far northeastern Russia, studies on oat, dandelion, thistle and sneezewort plants displayed the same trend of warm stems, from 1°C to 7°C above ambient, depending on species, location, weather and other factors. This large temperature difference could go some way to suggesting a high metabolic rate, high growth and high tolerance – traits often seen in weed species. Pubescent (hairy) stems were also recorded as having a higher temperature: hairs on the stem trap a layer of air that acts as insulation. But why is it important to maintain a warmer stem microclimate?

BENEFITS OF STAYING WARM

Slicing plant stems open reveals the hollow lumen inside. This lumen allows the plant to maintain a distinct internal microclimate.

There have certainly been many studies showing that keeping a high enough temperature aids the success of a plant. Enzyme activity in cells increases with temperature up to a certain denaturation threshold. Maintaining this increased activity is important for optimum growth, with the hollow stem acting as a kind of ‘thermos flask’ for the air inside. Although it is not the primary site of photosynthesis, the stem is the primary growth point of the plant with respect to height, so ensuring a high metabolic rate here encourages improved presentation of sporophylls (sexual organs) and increases the plant’s chance of reproduction. Similarly, the stem contains much of the fluid transport system, including the xylem and phloem tissues that move nutrients and water around the plant. A higher energy in this area further improves these natural physical and chemical processes. It is also possible that the previously mentioned ‘microgreenhouse’ effect can protect plants from temperature shock, or from changes in temperature over time.

Many plants in temperate climates, including the daffodil, have hollow lumens.





Behind the Research

Dr Peter Kevan

E: pkevan@uoguelph.ca T: +1 519 824 4120 Ext: 52479 W: www.uoguelph.ca/~cabrnet/Peter%20G%20Kevan.html W: https://www.uoguelph.ca/ses/people/peter-kevan

Milk thistle grows in Europe and Southern Russia.

Dr Kevan’s study sites ranged across Canada and the northern UK, two geographic regions prone to seasonal weather changes.

When sunlight hits leaves, it is used for photosynthesis. However, the warmth from sunlight also contributes to the microclimate within the stem.

This research reinforces the theory that there is a distinct microclimate within the stem which changes depending on abiotic and biotic conditions.

The high number of hollow stems found in these plants may reflect an adaptation geared towards surviving in these conditions.

PLANTING THE SEED

It is surprising that something seemingly so important has been academically neglected until now, when one begins to think of the economic and agricultural benefits of this research. As soon as we understand the physical properties underlying improved plant fitness, we can begin to examine these traits genetically. With this knowledge, we could use basic gene-editing techniques to greatly improve plant strength, yield, and resilience. Could the creation of a ‘microgreenhouse’ inside every plant put an end to actual greenhouses (which require far more resources and produce waste)? Perhaps we could double the yield of certain crops, or grow plants where previously the land was deemed unsuitable. Ornamental flower growers and horticulturalists also benefit from increased knowledge about regulating plant growth, as flowering, seed setting, and floral and seed presentation all require growth and development that is stimulated by heat. Thus, understanding the micrometeorology within plant stems and other plant organs is an area of study that will greatly contribute to the viability and marketability of ornamental and crop plants. Dr Kevan’s research plants the seed for a whole new avenue of research, stemming from thermodynamics all the way through to genetic engineering and agriculture. These traits could also be harnessed by horticultural specialists to grow more resilient plants for recreational/ornamental use.

References

Dr Peter Kevan’s research examines how plant stems help regulate temperature.

Kevan, P.G., Nunes-Silva, P. & Sudarsan, R. Int J Biometeorol (2018) 62: 2057. https://doi.org/10.1007/s00484-018-1602-7

Detail

Kevan, P. G., Tikhmenev, E. A. & Nunes-Silva, P. (2019). ‘Temperatures within flowers and stems: Possible roles in plant reproduction in the north’. Vestnik of the Far-Eastern Science Centre DVO RAN, No. 1, pp 38–47.

(Prof) Peter G Kevan, PhD University of Guelph 50 Stone Road East Guelph, Ontario, N1G 2W1 Canada


Flowers of the sneezewort, one of the plants collected by Dr Kevan.

Research Objectives

THE NEXT STUDY

In order to further this work and present these ideas to the next generation of researchers, Dr Kevan is about to embark on a new multi-year study looking into hollow stem physiology in much more detail. This study will include a detailed database of horticulturally important plants, recording their anatomical and microscopic morphologies. The microscopic focus will hopefully shed light on some of the important cellular and physical structures which dictate the heat transfer properties of the plant stem.

Bio Peter G. Kevan was Scientific Director at the Canadian Pollination Initiative Strategic Network and is University Professor Emeritus at the School of Environmental Sciences, University of Guelph. He has published over 200 peer-reviewed papers, book chapters, encyclopaedia entries and many popular articles. He is a Fellow of the Royal Society of Canada. Funding COHA, NSERC, Russian Academy of Sciences

Personal Response In your opinion, what is the most exciting aspect of this research? The most exciting aspect of this research is the novelty of the phenomenon, the fact that no one seems to have investigated it before, and its likely importance in understanding the rapidity of plant growth to both basic and applied plant science.

Collaborators • Dr Patrícia Nunes-Silva • Research associate: Charlotte Coates • Dr Rangarajan Sudarsan • E. A. Tikhmenev

The investigation will examine the microclimate inside the stems of these plants, and how that relates to their particular morphologies. The outcomes of these observations will then be used to create, and systematically refine, biophysical mathematical models. These will be able to simulate how various stem types manage heat transfer, allowing scientists to project the morphologies of ‘optimum plants’ for a specific geography, climate, and purpose. This knowledge will go a long way to help improve commercial and recreational/ ornamental plant production worldwide.




Biology ︱ Carolyn Sieg

Are wildfires following bark beetles more severe?

Bark beetles are responsible for large numbers of dead trees in ponderosa pine forests in the United States. The relationship between tree mortality caused by bark beetles and increasingly severe wildfires has been analysed by Carolyn Sieg and colleagues using a detailed physics-based fire behaviour model. The team seeks to understand whether fires that follow beetle outbreaks are more severe – that is, whether they leave less green foliage – or whether the tiny beetles have more complex interactions with subsequent wildfires than previously understood.

Bark beetles (Ips species).

Throughout western North America, native bark beetles (Dendroctonus, Ips spp.) are thriving. Unfortunately, their success contributes to widespread tree mortality. The impact of tree mortality caused by bark beetles is one of many factors studied to better understand the causes of severe wildfire in ponderosa pine forests in the USA. Whether extensive beetle-caused tree mortality leads to more severe fires has been debated among scientists, and conflicting results have led to much confusion in the press. Research by Carolyn Sieg, of the USDA Forest Service, and her team presents a new method of analysing the complex interplay of factors involved in forest fires. The team uses a physics-based fire model, developed by Rod Linn of the Los Alamos National Laboratory in collaboration with the Rocky Mountain Research Station, to explore how fires burn in bark beetle-impacted forests. Their work contributes to the scientific understanding of the relationship between tree mortality caused by the tiny bark beetles and the severity of subsequent wildfires.

Forest wildfires can kill large swaths of trees and are notoriously difficult to predict, control and prevent. Many factors affect the severity of fires burning through trees killed by bark beetles, including forest type, the amount and stage of tree mortality, and weather conditions. Because so many factors interact, it would be virtually impossible to replicate the conditions of a forest fire in order to analyse a single variable in an experimental scenario. Consequently, computer-based simulations can be a useful tool for researchers studying forest fires.

COMPUTER MODELS PREDICTING FOREST FIRE BEHAVIOUR

Carolyn Sieg currently works as a research ecologist with the USDA Forest Service Rocky Mountain Research Station and has studied the effects of wildfires for many years. To conduct her latest research, Sieg managed a team of people with diverse research backgrounds to maximise the benefit of shared knowledge and experience. This combination of skills allowed a physics-based model to be applied to predict fire behaviour.

FIRE BEHAVIOUR SIMULATIONS

The multidisciplinary team of researchers was led by Sieg to explore the relationship between forest fires and bark beetles. The team included researchers from national laboratories in the US (Rod Linn of Los Alamos National Laboratory) and France (Francois Pimont of Ecologie des Forêts Méditerranéennes), universities (Chad Hoffman of Colorado State University), plus other Forest Service staff (Joel McMillin of Idaho and L. Scott Baggett of Colorado). Research for the project was conducted in the USA, where forest fires occur frequently and their severity in some forest types continues to grow. As eliminating too many factors would not do justice to the complexity of the issue, the research team was keen to use computer modelling software to analyse and predict patterns in the data while acknowledging the numerous factors at play. Though studying the factors independently is important, the physics-based model used by the team allowed even more of the complex relationships between factors to be considered.

RUNNING THE SIMULATIONS

The team used three-dimensional physics-based modelling software to build large virtual forests based on actual field data. The team gathered extensive field data about forest characteristics in a ponderosa pine forest with varying levels of tree mortality caused by bark beetles. They then used the data to create a virtual forest where they could explore fire behaviour and severity at different stages of mortality, with different levels of mortality, and under different wind levels. When bark beetles attack a tree, its needles change in colour from green to red due to the loss of fuel moisture. In the next stage, the needles fall to the ground, which decreases the amount of fuel in the tree canopies and increases the amount of fuel on the ground. Wind speed affects the ability of a fire to bridge gaps in the tree canopy. The influence of all these changes in fuel moistures, fuel loadings, and wind speed on how fires burn can be captured in the physics-based model.
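To build intuition for how wind and fuel moisture interact (and only for intuition – this toy is nothing like the detailed physics-based simulator the team uses), here is a simple cellular-automaton fire spread sketch with invented probabilities.

```python
# A toy cellular-automaton fire spread model, for intuition only -- it is
# far simpler than the physics-based model used by Sieg's team. Ignition
# probability is boosted for downwind neighbours and reduced by fuel
# moisture; all numbers are invented.
import random

random.seed(1)
N = 40                    # grid size
MOISTURE = 0.3            # fuel moisture, 0 (dry) to 1 (wet) -- assumed
WIND = (0, 1)             # wind blowing toward +x (east) -- assumed
BASE_P, WIND_BOOST = 0.35, 0.3

# 0 = unburnt fuel, 1 = burning, 2 = burnt out
grid = [[0] * N for _ in range(N)]
grid[N // 2][2] = 1       # ignition point near the west edge

def step(grid):
    new = [row[:] for row in grid]
    for y in range(N):
        for x in range(N):
            if grid[y][x] != 1:
                continue
            new[y][x] = 2                     # burning cell burns out
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < N and 0 <= nx < N and grid[ny][nx] == 0:
                    p = BASE_P * (1 - MOISTURE)   # wetter fuel burns less
                    if (dy, dx) == WIND:          # downwind spread is easier
                        p += WIND_BOOST
                    if random.random() < p:
                        new[ny][nx] = 1
    return new

for _ in range(80):
    grid = step(grid)
burnt = sum(row.count(2) for row in grid)
print(f"burnt cells: {burnt} of {N * N}")
```

Raising MOISTURE (as when dead needles have dropped and dried fuels are redistributed) or lowering the wind boost changes the burnt area dramatically, which is exactly why the full model must treat these factors together rather than one at a time.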

Fire behaviour simulations: field data was used to create an analogous virtual forest, and four levels of beetle-caused tree mortality were simulated (no mortality, 20%, 58%, and 100%).

Research by Carolyn Sieg, of the USDA Forest Service, and her team presents a new method of analysing the complex interplay of factors involved in forest fires.

Ponderosa pine forest, post-fire. Linda Wadleigh (left) and Mary Lata (right), from the USFS, carrying out post-fire measurements.

The model simulations showed a range of results. Some simulations showed that the presence of bark beetles can contribute to the amplification or dampening of forest fire severity; other simulations showed that the presence of bark beetles had no significant effect. One factor which played an important role in the analysis of the model data was wind speed. At low or medium wind speeds, fire severity was greater if bark beetles had already caused some level of tree death but the pine needles remained on the trees. In contrast, if beetles had caused moderate levels of tree death, resulting in large quantities of needles that had already fallen to the ground, then any subsequent fire was less severe under these same wind conditions.



Behind the Research

Ponderosa pine trees, with many dying due to bark beetle kill. But does this set forests up for more risk from fire?

Bark beetle activity on >1.2 million trees in Arizona during 2002.

Carolyn Sieg

Area affected by bark beetle activity in ponderosa and pinyon pine trees in Arizona (2003 Aerial Detection Survey). Courtesy of USFS.

IMPROVING FIRE MANAGEMENT AND PUBLIC UNDERSTANDING OF BARK BEETLES There are wide-ranging consequences of this research. Broadly, any action which can be taken to better understand forest fires is important progress and will benefit land managers. Forest fires in pine forests are particularly difficult to predict due to the many factors which can affect their spread and their severity. The activity of the bark beetle is just one of the exacerbating factors which may intensify a fire's effects. More specifically, the research by Sieg and her team is important to the scientific community as it helps to clarify some of the confusion concerning the role of the bark beetle in fire behaviour.

This research emphasises the complexity of forest fire analysis and highlights the reality that one factor should not be attributed undue weight.

Not only did the study reveal that the influence of bark beetle-induced mortality depended on the stage and amount of tree mortality, but it also revealed that unexpectedly high fire severity can occur at low wind speeds and in forests with only a small amount of mortality. Also of interest to land managers, the simulations suggest that bark beetle-induced tree mortality may actually decrease the fire severity experienced by remaining trees once the dead needles have fallen to the ground.

Ultimately, the team has developed model simulations which highlight the complex interactions between tree mortality caused by bark beetles and any subsequent fires. Their research makes clear why current reporting, and research, can often appear confused or conflicting: the impact of bark beetle-induced tree mortality is not a simple calculation, and the models developed by this team will continue to provide novel insights into the behaviour of wildfires. Most importantly, this research emphasises the complexity of forest fire analysis and highlights the reality that one factor should not be attributed undue weight. The complex interplay of factors which produce forest fires will continue to be studied for many years to come, with the tools for analysis provided by Sieg and her team a useful addition to the resources available.


Chad Hoffman

E: csieg@fs.fed.us T: +1 928-380-5422

Joel McMillin


Rod Linn

L. Scott Baggett

Judy Winterkamp

Francois Pimont

Research Objectives

Carolyn Sieg's research aims to understand how to better manage forests impacted by severe wildfires, whether beetle-infested areas present serious fire threats, and strategies for lessening the impact of exotic invasive species.

References

Sieg, C.H., Linn, R.R., Pimont, F., Hoffman, C.M., McMillin, J.D., Winterkamp, J. and Baggett, L.S., (2017). Fires following bark beetles: Factors controlling severity and disturbance interactions in ponderosa pine. Fire Ecology, 13(3), pp.1-23.

Hoffman, C.M., Linn, R., Parsons, R., Sieg, C. and Winterkamp, J., (2015). Modeling spatial and temporal dynamics of wind flow and potential fire behavior following a mountain pine beetle outbreak in a lodgepole pine forest. Agricultural and Forest Meteorology, 204, pp.79-93. Hoffman, C.M., Sieg, C.H., Morgan, P., Mell, W.R., Linn, R., Stevens-Rumann, C., McMillin, J., Parsons, R. and Maffei, H., (2013). Progress in understanding bark beetle effects on fire behavior using physics-based models. Technical Brief CFRI-TB-1301. Fort Collins, CO: Colorado State University, Colorado Forest Restoration Institute. 10 pp. Linn, R.R., Sieg, C.H., Hoffman, C.M., Winterkamp, J.L. and McMillin, J.D., (2013). Modeling wind fields and fire propagation following bark beetle outbreaks in spatially heterogeneous pinyon-juniper woodland fuel complexes. Agricultural and Forest Meteorology, 173, pp.139-153.

Detail

USDA Forest Service Rocky Mountain Research Station 2500 S. Pine Knoll Dr. Flagstaff, AZ 86001 USA Bio Carolyn Sieg is a research ecologist with the USDA Forest Service Rocky Mountain Research Station, where she studies the role of changing disturbance regimes in forests and woodlands in the Intermountain West, USA. Her research aims to quantify the effects of wildfires and their interaction with other disturbances such as bark beetles. Funding USDA Forest Service (FS) Research Rocky Mountain Research Station and Washington Office National Fire Plan, and USDA FS Forest Health Monitoring; Los Alamos National Laboratory's Institutional Computing Program provided computational resources. Collaborators • Rod Linn • Chad Hoffman • Francois Pimont • Joel McMillin • L. Scott Baggett


Personal Response Which of the potential benefits of your research has been the biggest source of motivation throughout this project? Using the physics-based model has allowed us to understand how spatial variations in forest fuels interact with the fire and wind flow. We gained insights that could not be captured in simpler models that cannot account for how changes in fuel moisture, amounts, and arrangements influence fire-wind interactions following bark beetle-induced tree mortality.



Biology ︱ Dr Eugenii (Ilan) Rabiner

Advances in CNS drug development


The global prevalence of diseases affecting the central nervous system (CNS) demands the development of efficacious therapies for these unmet needs. However, drug development for CNS diseases is complicated by a limited ability to measure whether a drug candidate is accessing and affecting the human brain, particularly in early-stage human trials. Research by Dr Eugenii (Ilan) Rabiner and his colleagues at Invicro, Dr Roger Gunn and Dr Jan Passchier, highlights the unique capacity of translational imaging technologies, such as PET and MRI scanning, to provide this information. These advances in early-stage drug development have the potential to dramatically reduce the costs of developing a drug and help deliver effective new medications.

Over one billion people suffer from diseases of the central nervous system (CNS) globally. Whilst pharmaceutical treatments for many diseases have made huge strides in the past decades, the development of pharmaceuticals for diseases of the CNS has been relatively stunted. This is, in part, because of our limited ability to determine how effective pharmaceutical compounds are in the early stages of drug development. Instead, millions of dollars are spent researching and testing compounds that are found to be ineffective during late-stage development, where large-scale human trials are long and expensive. Translational imaging methods such as Positron Emission Tomography (PET) and Magnetic Resonance Imaging (MRI) are non-invasive technologies capable of providing direct and quantitative data on drug distribution, drug interaction with its intended target, and the resulting drug efficacy in the human body.

Such information can be used to stop the development of compounds that will not be successful and to define the dose range for testing promising compounds in later phase clinical trials. This allows resources to be concentrated on developing only the most promising pharmaceutical compounds, thereby reducing the size and cost of later phase drug development. Dr Eugenii Rabiner at Invicro, London, together with Dr Roger Gunn and Dr Jan Passchier, has focused his research on such translational imaging technologies to advance drug development for CNS disorders.

DRUG DEVELOPMENT PROCESS The development of pharmaceutical drugs is a long and costly process, spanning nearly a decade and costing over $2 billion for each new drug brought to market. Numerous stages of testing must be performed before a compound may be deemed safe and effective for use in patients. Firstly, a specific molecular target within the body is identified and many compounds are screened for appropriate properties to allow interaction with this target. If this yields a promising lead compound, a library of analogues is created and screened to determine the most effective. The identified candidate compounds initially undergo in vitro and in vivo preclinical testing to understand whether they might be effective and whether any toxicity issues manifest themselves. In vitro testing will typically make use of isolated cells and tissues, while in vivo testing will be performed in animal models. Whilst the use of animals in drug testing is controversial, it remains a necessary step in the process of drug development. Efforts to reduce animal testing (the so-called 3Rs – Replace, Reduce, Refine) benefit from access to non-invasive imaging methods, as they allow repeated testing in a smaller number of subjects and provide more physiologically relevant data. If the candidate compound is found to be effective and no unacceptable safety signals are seen, it may then be trialled in humans. Human trials have four phases designed to demonstrate the safety and efficacy of new medications, with each phase growing in size, resource intensity, and cost. Phase III studies can cost hundreds of millions of dollars and all too often provide the stage for candidate failure. There is, therefore, a desperate need to improve the characterisation of drug candidates in early Phase I and II clinical trials to increase the probability of success when progressing them to long and expensive Phase III studies. THE THREE PILLARS OF DRUG DEVELOPMENT The Three Pillars approach has been formulated to define the information that can be obtained in early phase development to significantly enhance the probability of drug candidate success.


Study volunteer undergoing a brain PET scan.

The Three Pillars are (1) evidence of tissue exposure (the extent to which the drug penetrates the target tissue of interest), (2) evidence of molecular target engagement (the extent to which the drug interacts with the biological target), and (3) evidence of pharmacological activity (the ability of the drug to modulate its molecular target and subsequent pathways). Whilst many drug effects can be monitored by measuring drug concentration in blood, monitoring the effects of CNS drugs is more difficult due to the presence of the blood-brain barrier, which prevents many drugs from entering the brain. PET imaging is uniquely placed to monitor drug tissue distribution and target engagement, and can provide useful information on drug pharmacological activity in combination with a variety of MRI techniques.

Substitution of a carbon or a fluorine atom in a candidate compound with a positron-emitting isotope (11C or 18F) does not change its pharmacological or physicochemical properties and allows the visualisation of the compound's distribution in the body. The absolute concentration of the drug in body tissues, such as the brain, can be quantified accurately and compared to that measured in plasma to understand its tissue distribution.

The richness of information from occupancy studies also allows estimates to be made [that are] typically not possible until later phase human trials.

A radio-labelled tool compound, or radio-ligand, that is specific for a molecular target allows quantification of the density of that molecular target. Comparing the density of the available target (such as a particular receptor or an enzyme) in the brain at baseline with that following the administration of a dose of the candidate drug allows calculation of the proportion of the total target that is occupied by a particular dose of the drug. This information enables confirmation of drug-target engagement. This approach, often termed an 'occupancy study', is preferred to the drug distribution study described above, as it allows the quantitative evaluation of drug target engagement in addition to brain entry.
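As a worked illustration (this formulation is standard in the PET occupancy literature rather than spelled out in the article, and the symbols here are our own), the occupancy achieved by a given dose can be estimated from the binding potential (BP), a measure of available target density, measured at baseline and after dosing:

Occupancy = (BP_baseline − BP_drug) / BP_baseline × 100%

For example, if a radio-ligand scan gives BP_baseline = 2.0 before dosing and BP_drug = 0.5 afterwards, the estimated target occupancy is (2.0 − 0.5) / 2.0 = 75%.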



The rich information obtainable from such a study enables the prediction of the levels of target engagement produced by repeat dose studies in patients in later phases of development. The use of appropriate experimental design and analysis methods means that such information can be available following a first-in-human, single ascending dose study in Phase I. The performance of an occupancy study requires the availability of a specific radio-ligand for the particular molecular target. Unless already available (true for around 40 CNS targets), a new radio-ligand has to be developed, a complex process that requires 12 months or longer, and one that should be planned early in the course of a drug development programme. The introduction of in silico biomathematical modelling in recent years has accelerated the process of biomarker development by optimising the choice of candidate molecules from a starting series to develop the novel radio-ligand. Techniques such as blood oxygen level dependent (BOLD) and arterial spin labelling (ASL) functional Magnetic Resonance Imaging (fMRI) can provide non-invasive information on the changes in brain activity following drug administration. fMRI can be combined fruitfully with PET to provide information on both target engagement and resulting pharmacological modulation by the drug in the same study. Alternative imaging avenues to evaluate target modulation or downstream effects include measuring the change in the density of a protein involved with disease progression (such as the misfolded proteins β-amyloid or τ-protein characteristically found aggregated in the brains of Alzheimer's patients). Other approaches include the measurement of the direct or indirect (downstream) effect of drug candidates on the proliferation of brain immune cells, microglia, as a sign of brain inflammation, or on changes in the release of neurotransmitter molecules such as dopamine and serotonin across synapses (the gaps between brain cells) to indicate changes in brain function. Detection of target modulation or downstream effects provides confidence that the novel compound is pharmacologically active, and in combination with data acquired in an occupancy study can allow the determination of the relationship between the dose administered and the resulting pharmacological effects.

Behind the Research

Dr Eugenii (Ilan) Rabiner

E: ilan.rabiner@invicro.co.uk T: +44 7717801552 W: www.invicro.com/ www.linkedin.com/in/eugenii-a-ilan-rabiner-a953866/

Examples of PET occupancy data demonstrating target engagement by CNS active compounds.


References


Ereshefsky, L., Evans, R., Sood, R., Williamson, D. and English, B. A. (2015). Venturing into a new era of CNS drug development to improve success. Available at: https://www.parexel.com/ files/4314/4113/4032/Venturing_Into_a_New_Era_of_CNS_ Drug_Development_to_Improve_Success.pdf [Accessed 07/02/2019]. Gunn, R.N. and Rabiner, E.A., (2017). Imaging in central nervous system drug discovery. Seminars in Nuclear Medicine, 47(1), 89-98.

Detail Invicro, Burlington Danes Building, Hammersmith Hospital, Du Cane Road, London, W12 0NN UK Bio Dr Rabiner trained at the University of the Witwatersrand, the University of Cape Town and the MRC Cyclotron Unit, Hammersmith Hospital, London. From 2001-2011, he was the Head of Clinical Imaging Applications at the GlaxoSmithKline Clinical Imaging Centre, London. He is now the Executive VP and Global Head of Translational Applications at Invicro. Collaborators • Roger N Gunn, Invicro • Jan Passchier, Invicro

Imaging modalities and applications across the drug development pipeline.


Personal Response With such advances, how much of a reduction in the time and money spent developing drugs for CNS diseases can be expected? The intelligent use of translational biomarkers will accelerate the process of drug development by allowing a better choice of doses to be tested in clinical trials. That may lead to a saving of one to two years in the life of a drug. However, the main benefit of these technologies is not in accelerating the development of individual compounds, but by allowing us to discard unsuccessful compounds very early in the development process and reallocate the resources to more promising candidates – thus increasing the total number of drugs being developed.

Applications of PET and MRI technologies in drug development


PET imaging is uniquely capable of quantifying each of the three pillars of CNS drug development.


Research Objectives

Dr Rabiner's research focuses on molecular and functional imaging to understand brain pathophysiology and facilitate drug development.

Dr Eugenii Rabiner and Dr Roger Gunn's 2017 paper describes the current state of the art in the application of translational imaging in support of CNS drug development. With a growing number of biomarkers being developed, the amount of information that occupancy studies and PET imaging can provide will rapidly expand. This has huge significance for accelerating the field of CNS drug development, reducing the money and resources spent researching ineffective pharmaceutical compounds, and concentrating efforts on developing effective treatments for CNS diseases.



Biology ︱ Drs Nancy Bonini and Leeanne McGurk

Fruit flies help shed light on drug discovery for ALS


Amyotrophic lateral sclerosis (ALS) is a devastating and incurable neurodegenerative disease that affects people in adulthood. It leads to the death of neurons involved in muscle control, eventually affecting almost all facets of the body, including walking, swallowing and breathing. Drs Nancy Bonini and Leeanne McGurk at the University of Pennsylvania are using fruit flies, mammalian cellular systems like neurons, and in vitro protein preparations to investigate promising new molecules that could open avenues to new treatments for this devastating condition.


Amyotrophic lateral sclerosis (ALS) – also known as motor neuron(e) disease – is a severe and incurable disease that has a devastating effect on the quality of life of people living with the condition, as well as their carers and families. ALS most resonates with the general public in relation to high-profile personalities who have experienced the condition, including American baseball player Lou Gehrig – hence, the disorder is sometimes referred to as Lou Gehrig's disease – and the world-renowned British physicist Stephen Hawking. The first signs and symptoms of ALS might include tripping, leg weakness and difficulty controlling hand movements, or what people might first dismiss as simple clumsiness. Diagnosis is usually aided by imaging and scanning techniques that can rule out other causes of nerve damage and lack of motor control, including brain tumours and neuropathy. Over time, symptoms of ALS gradually become worse and affect almost all areas of the body, causing muscle weakness, stiffness and paralysis. As the disease advances, people living with ALS find their breathing becomes more difficult and eating becomes impossible as the muscles associated with swallowing are affected. Almost all people with ALS will eventually find speaking frustratingly difficult, leading some to rely on technology to communicate – early versions of this technology are perhaps best represented by Stephen Hawking's famous computerised voice. The most common cause of death for people with ALS is respiratory failure, as the disease interferes with patients' ability to breathe or cough. Many people with ALS die within two to five years of their diagnosis, many of them still in middle age.

Risk factors are not well understood but include older age, with most people diagnosed in middle to older adulthood. A small proportion of cases are inherited and run in families, whilst most cases are apparently sporadic in nature. Although several genes have been linked to ALS, no single gene discovered so far can fully account for the disease. Awareness of the urgent need for more research into ALS has grown following high-profile campaigning by advocacy and patient groups. A very successful 'Ice Bucket Challenge' online campaign firmly cemented ALS in the public consciousness in recent years and raised over $100 million to support vital research and services for people living with the disease. Despite increased awareness and funding, there is a paucity of effective treatments available for people living with the condition. After decades of research, there are currently no cures for people living with ALS. Existing treatments are limited; they can help patients manage some symptoms of the condition but only extend life by up to three months. Clinical trials for desperately-needed treatments are few and far between, and relatively little is known about the molecular processes responsible for the disease's widespread effects on the central nervous system. PROMISING NEW HORIZONS Dr Leeanne McGurk, a postdoctoral researcher based in Dr Nancy Bonini's laboratory in the Department of Biology at the University of Pennsylvania, has committed her working life thus far to understanding more about the molecular mechanisms underlying ALS, in the hope of uncovering new avenues in the quest for successful drug treatments. Together, McGurk and Bonini have made important strides to pinpoint molecules that could help tackle some of the processes that go awry in the condition.

In healthy individuals, TDP-43 is in the nucleus; in ALS and related disorders, TDP-43 leaves the nucleus and accumulates in the cytoplasm.

In cells undergoing a stress response, TDP-43 accumulates in the cytoplasm. McGurk and Bonini found that under stress, disease-like TDP-43 accumulations form near structures called stress granules. Photo credit: Dr Leeanne McGurk.

As part of their research tool kit, Drs McGurk and Bonini have made some of their most fundamental discoveries using a small but powerful model of neurodegenerative disease – common fruit flies. They extend their findings from the tiny fly to other systems and approaches in studies with their team of collaborators. FRUIT FLIES TO MODEL ALS Drosophila melanogaster – more commonly known as fruit flies – are surprisingly more similar to humans than they may first appear and have been used in scientific studies of disease for over 100 years. Fruit flies share many of the genes seen in humans and display more complex behaviours than might be expected, from movement to social interactions. The fact that fruit flies have a smaller genome than other laboratory animals, such as mice, means that their DNA can be studied more easily. Their shorter life spans, measured in days rather than months, make it easier, quicker and less expensive for researchers to track disease-related changes that occur over a lifetime.

A number of molecules have been tested in these tiny creatures; some have even ended up in clinical trials for a range of diseases. TDP-43 – A VITAL INSIGHT INTO ALS In many patients with ALS, cells show a build-up of a protein known as TDP-43. In healthy human cells, this protein is contained in the heart of the cell, known as the nucleus. However, in ALS, TDP-43 seeps out of this cell core and accumulates in clumps elsewhere in the cell. Protein clumping is seen in other neurodegenerative diseases, including Alzheimer's disease, characterised by clumping of proteins known as amyloid and tau, and frontotemporal degeneration, which also involves the TDP-43 protein in approximately half of all cases.


As an early postdoctoral researcher, Dr McGurk, along with other members of the Bonini lab, modelled ALS in fruit flies by manipulating cells in the nervous system to express human TDP-43. Flies expressing this human protein showed both neurodegeneration and a shortened life span, features reflective of ALS. This advance was a vital milestone as it offered the team a model for efficiently studying ALS at the molecular level. Further studies with fruit flies shed light on the behaviour of TDP-43, focusing on protein clumping, TDP-43 transportation throughout the cell, and the response of the protein under conditions of stress, such as those seen in the disease.


TDP-43 TARGETED THERAPIES In exciting follow-up experiments, Bonini and McGurk's team found that the damage caused by expressing TDP-43 in the fly can be rescued.

After decades of research, there are currently no cures for people living with ALS.

Rat spinal cord cultures: healthy, TDP-43, and TDP-43 + PARP inhibitor. Healthy rodent nerve cells can be grown in a dish (left panel). Nerve cells infected with a virus engineered to make TDP-43 protein mimic disease by showing neural loss (middle panel). McGurk and Bonini found that treating the nerve cells with an inhibitor of PARP-1/2 reduces the nerve cell loss caused by expression of TDP-43. This figure was previously published in L. McGurk, et al., 'Nuclear Poly(ADP-Ribose) Activity Is a Therapeutic Target in Amyotrophic Lateral Sclerosis' (2018), Acta Neuropathologica Communications, BioMed Central, 6:84, and is under the Creative Commons Licence CC BY 4.0.



Behind the Research

Dr Nancy Bonini

Dr Leeanne McGurk

E: nbonini@sas.upenn.edu T: +1 215 573 9267 W: www.bio.upenn.edu/people/nancy-bonini W: http://web.sas.upenn.edu/bonini-lab/

E: lmcgurk@sas.upenn.edu W: https://leeannemcgurk.wordpress.com/ @LeeanneMcgurk

Research Objectives

Drs Nancy Bonini and Leeanne McGurk are using fruit flies, mammalian systems and in vitro protein preparations to investigate promising new molecules that could help lead to new treatments for amyotrophic lateral sclerosis (ALS).

They discovered that reducing levels of an enzyme known as PARP-5 protected flies expressing TDP-43 from early death. The researchers believe that PARP enzymes could be key to understanding why proteins accumulate in the cell cytoplasm outside of the nucleus and, further, could shed light on ways to reduce the build-up of TDP-43 in this part of the cell. When McGurk and Bonini investigated this with their collaborators in human spinal cord cells, they found that PAR – a chain of molecules tagged onto target proteins by PARP, which controls the toxicity of TDP-43 – was present at high levels in the nucleus of motor nerve cells. This suggested that levels of PARP activity in the human spinal cord were elevated in the presence of ALS.

The ALS ice bucket challenge was an internet phenomenon, raising over 100 million US dollars for resources and research into ALS.

To test this theory further, the researchers tested a drug that reduces the activity of PARP-1/2 in a rat nerve cell model of ALS expressing human TDP-43. Promisingly, the drug – originally developed as a cancer treatment – was shown to reduce the neural death caused by TDP-43.

Together, McGurk and Bonini have made important strides to pinpoint molecules that could help tackle some of the processes that go awry in the condition.

In ALS patients, McGurk and Bonini saw high levels of PAR in the nucleus (arrow) of the nerve cells that degenerate in ALS. This suggests that the enzymes that make PAR are active in these cells in disease. This figure was previously published in L. McGurk, et al., 'Nuclear Poly(ADP-Ribose) Activity Is a Therapeutic Target in Amyotrophic Lateral Sclerosis' (2018) and is under the Creative Commons Licence CC BY 4.0.



Several well-known individuals have lived with and died of ALS, including the baseball player Lou Gehrig and the physicist Stephen Hawking.

Patient motor neurons: healthy and ALS.

The findings suggest that PARP enzymes could be a vital target for investigating new drug treatments in diseases linked to TDP-43 clumping. FUTURE STEPS There is still much to learn about PARP enzymes before they could be used to treat ALS patients, including how the subtypes of PARP enzymes work together and how they affect the transport of TDP-43 throughout the cell. However, these findings shed light on the molecular mechanisms underlying the disease and are a beacon of hope in a drug discovery landscape blighted by a drought of advances in recent years. The findings could also shed light on treatments for disorders that share some of the same molecular characteristics, including frontotemporal degeneration. Whilst there is some way to go before it can be determined whether PARP-based therapies could work for ALS, scientists such as Drs McGurk and Bonini continue to strive towards understanding ALS at the molecular level, offering hope to those living with this devastating condition.

Detail

204G Lynch Laboratory, Department of Biology, University of Pennsylvania, Philadelphia, PA 19104, USA

Bio Bonini received her PhD from the University of Wisconsin-Madison, then performed postdoctoral research at Caltech. She then launched her laboratory at Penn, where she developed Drosophila as a model for human neurodegenerative disease. She is an elected member of the National Academy of Sciences, National Academy of Medicine and American Academy of Arts and Sciences. McGurk received her PhD from the University of Edinburgh and joined Bonini at Penn as a National Ataxia Foundation Investigator. As a post-doctoral researcher, she uncovered that the post-translational modification poly(ADP-ribosylation) may play a role in mediating amyotrophic lateral sclerosis. This coming Fall, she will join the Division of Cell & Developmental Biology in the School of Life Sciences at the University of Dundee, as an independent investigator. Funding NIH/NINDS Collaborators Investigators at Penn: • Dr Vivianna Van Deerlin • Dr Edward B Lee • Dr Virginia M-Y Lee • Dr John Q. Trojanowski • Dr James Shorter An investigator at Northwestern: • Dr Robert G. Kalb

References

McGurk, L. et al (2015). 'Drosophila as an In Vivo Model for Human Neurodegenerative Disease'. Genetics, 201(2): 377-402.


McGurk, L. et al (2018). ‘Poly(ADP-Ribose) Prevents Pathological Phase Separation of TDP-43 by Promoting Liquid Demixing and Stress Granule Localization’. Molecular Cell, 71(5):703-717. McGurk, L. et al. (2018). ‘Nuclear poly(ADP-ribose) activity is a therapeutic target in amyotrophic lateral sclerosis’. Acta Neuropathol Commun, 6(1):84. McGurk, L. et al (2018). ‘Poly(ADP-ribose) engages the TDP-43 nuclear-localization sequence to regulate granulo-filamentous aggregation’. Biochemistry, 57 (51), 6923-6926.

Personal Response What steps would have to be taken before PARP enzyme-based therapies could be tested in a clinical trial? ALS is a heterogeneous disease, meaning that there are many different genes that when mutated give rise to the disease. We need to understand if PARP inhibitors are beneficial to all, or only some of these disease gene situations using our model systems of cells and animals. These types of studies will inform if certain patients may be better suited to these compounds, should they prove promising. Many of the developed PARP inhibitors have very specific modes of action and some are better at crossing the blood-brain-barrier than others. We need to first understand if there are differences in these PARP inhibitors in treating ALS-associated toxicity in cells and in neurons so that we can select the best compound or compounds for further testing. Traditionally, drugs are tested in rodent models of disease before they can move forward into a clinical setting. The field has made much progress in rodent models that recapitulate aspects of the disease process. Once we know which inhibitors are most promising in our settings, the next step will be to advance and test in a rodent system.



Earth and Environment ︱ Dr Thomas Shahady

Going with the flow: Water quality and community health in Costa Rica

Water quality is one of the most pressing issues facing the world, with solutions to combat poor water quality remaining elusive. This is what makes the work of Dr Shahady and fellow researchers at the University of Lynchburg and the University of Georgia Costa Rica so important. Their pioneering work in promoting community engagement with a macroinvertebrate index for effective water resource management has the potential to drive governmental agencies to react to water pollution. This work in Costa Rica has the potential to ensure both river and community health in the future.

Researcher José Montero discusses water quality observations with Dr Shahady.


Water quality is one of the most important issues facing Latin America today. In Costa Rica, issues such as agricultural development, pollutant runoff, inadequate sanitation and inadequate legislation all contribute to water pollution. These problems are compounded by seasonal patterns in rainfall, challenging terrain, a lack of effective water quality monitoring and ineffective water resource management. Solutions to these problems remain elusive due to centralised governmental control of decision making. Often, information about local water bodies is scarce, the needed environmental testing complex, changes across landscapes rapid, and local control minimal. However, the need to find solutions to poor water quality is paramount. Climate change is expected to significantly alter water quantity and quality in the future as a result of its effect on environmental services, agricultural activities and biodiversity.

With a background in understanding the relationship between water quality, community health and water use and funded by the University of Georgia Costa Rica, the University of Lynchburg and Bosqueterno S.A, Dr Shahady is well placed to conduct this research.

Dr Shahady of the University of Lynchburg and fellow researchers at the University of Georgia are at the forefront of researching solutions to these problems.

To make matters worse, the physical terrain of Costa Rica and its seasonal rains also contribute significantly to the ineffective management of water quality. Steep and rugged slopes often limit accessibility for stream monitoring. Heavy seasonal rains wash away existing roads and prevent access to waterways unless individuals have first-hand knowledge of local infrastructure. Wet and dry seasons limit the consistent application of water quality indices. Without effective water monitoring, water quality problems cannot be pinpointed. To further compound the issue, equipment, adequate laboratory space and analytical knowledge are also limited. As a result, local communities often rely on anecdotal assumptions about their local water quality, which leads to poor decision making concerning local sources of water.

A BACKGROUND TO WATER QUALITY ISSUES IN COSTA RICA While the availability of potable water in Costa Rica is good, serving over 99% of the population, wastewater treatment is highly lacking. Concern is mounting that septic tanks and agricultural waste contribute significantly to water pollution in the country. Although on paper water quality is managed by the central government in San Jose through five government agencies, disparities in regulatory authority and unclear governmental responsibility mean that wastewater and greywater generally remain unregulated.

Measuring discharge and water depth in a stream.

Dr Shahady and his research team suggest that their trialled macroinvertebrate index is a vehicle to this awareness and change.

USING INSECTS TO MEASURE WATER QUALITY To solve this problem, a clear, usable, affordable and easily measurable methodology is needed for water quality prediction, one that takes into consideration the concerns and abilities of local communities. One possible solution is the use of a macroinvertebrate index: a methodology whereby the ecological condition of water bodies is inferred from the insects that live there. This is already a well-established technique for assessing the quality of natural water bodies, and the government of Costa Rica implemented such an index in 2007. Known as the BMWP-CR Index, this methodology standardises the use of aquatic insects for water quality detection. However, there are issues associated with using this index: it requires particular ways of collecting insects and demands expertise in insect identification. Dr Shahady and his team therefore set about testing whether an alternative, more appropriate macroinvertebrate index (known as the PMA Index), requiring less expertise and knowledge, could be used by communities. Dr Shahady and fellow researchers set out to compare these two indices by conducting research as part of a larger water quality monitoring project in the Bellbird Biological Corridor in Costa Rica. They tested 16 sampling sites between 2015 and 2016. The sampling sites traversed different elevations and different land-use types. At each site, water quality was measured in three ways: using a water quality meter for chemical analysis, and collecting insects for the BMWP-CR index and the simpler PMA index simultaneously. The simpler PMA index was used by volunteers from local areas, who followed a worksheet explaining insect identification based on descriptions and drawings. The simpler PMA index was found to be better at predicting water contamination than the more complex BMWP-CR index, suggesting that the simpler index, which locals can use, is practical as a predictor of water quality. This further suggests that community volunteers can provide water quality index calculations similar to more sophisticated scientific studies.
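The scoring logic behind such indices can be shown with a minimal sketch (the taxa, tolerance values and threshold below are invented for illustration; they are not the actual BMWP-CR or PMA tables). Each insect family found at a site contributes a pollution-tolerance score, and the summed score classifies the water:

# Conceptual sketch of a macroinvertebrate biotic index (illustrative values only).
TOLERANCE = {
    "stonefly": 10,   # pollution-sensitive taxa score high
    "mayfly": 9,
    "caddisfly": 8,
    "dragonfly": 6,
    "midge": 2,       # pollution-tolerant taxa score low
    "aquatic worm": 1,
}

def biotic_index(taxa_found):
    """Sum the scores of taxa present; higher totals suggest cleaner water."""
    return sum(TOLERANCE[taxon] for taxon in taxa_found if taxon in TOLERANCE)

score = biotic_index(["stonefly", "mayfly", "midge"])  # 10 + 9 + 2 = 21
print("likely clean" if score > 15 else "possible contamination")  # illustrative threshold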

CONTAMINATED WATER AND THE FUTURE OF COMMUNITY HEALTH During their study, Dr Shahady's team found that several of the sites they tested were contaminated with bacteria such as E. coli, especially during the wet season, likely as a result of pollution from the inadequate collection and treatment of wastewater. Inadequate sanitation burdens health care systems, but solutions to this problem remain elusive. Unfortunately, building and maintaining wastewater treatment plants is not a solution.

Water Quality Laboratory on Campus of UGA – Costa Rica.



Behind the Research

Dr Thomas Shahady

E: shahady@lynchburg.edu T: +1 434 944 5684 T: +1 434 544 8545 W: https://www.lynchburg.edu/academics/academiccommunity-centers/center-for-water-quality/stream-ecology-management/costa-rica/ W: https://costarica.uga.edu/ W: https://ugacostaricablog.com/2016/10/14/research-spotlight-video-water-quality-with-darixa-hernandez/ www.facebook.com/ugacostarica/

Using nets to collect aquatic macroinvertebrates for the index.

Those already in place in Costa Rica are not well maintained, and new construction is very expensive. They often have unrepairable structural damage and are blocked by build-ups of sewage sludge. To make matters worse, no specific government agency, programme or law is directly assigned to protect rivers from inadequate sanitation, despite strong evidence that improved water quality leads to reductions in disease occurrence. Compounding the issue, climate change is expected to have a significant impact on water quality and river health in Costa Rica in the future. Models suggest that predicted increases and variability in precipitation will alter the ways rivers flow, meaning that water bodies currently considered healthy may become contaminated in the future. The extent to which contaminated surface water exchanges with groundwater is unknown. There is a real possibility that drinking water contamination may occur in Costa Rica in the next few decades. The bottom line is that communities need engagement with water issues to drive local and collective governments to react in order to protect public health. Dr Shahady and his research team think that community science and their simplified index for water quality monitoring will do just that.

Using the meter to measure water chemistry in the streams.

CITIZEN SCIENCE TO ENSURE COMMUNITY HEALTH In Costa Rica, one method in place to protect water quality is payments for environmental services (PSA). These are incentives offered to landowners in exchange for managing their land to provide some sort of environmental service. PSA programmes focus on maintaining adequate supplies of water for purposes valued by the community, such as hydroelectric power. Decisions to create PSA programmes are locally made and driven by economic factors that make sense to local landowners. But indirectly, these programmes provide improved water quality management for entire communities. People who live in areas with PSA programmes are more likely to think positively about the environment, and therefore to expect better water quality. However, outside of PSA areas, locals have very different ideas about water quality. While they may have concerns, they perceive government programmes as being responsible for water quality maintenance. With this perception, sources of water pollution are generally ignored, and disease risk significantly increased. However, if these pollutants were monitored, water quality concerns could be heightened, and residents would become more concerned about the water they are actually drinking.

Community volunteers are capable of providing water quality index calculations similar to more sophisticated scientific studies.


If locals are aware of the benefits to be gained from maintaining good water quality, it is likely that they would make decisions to benefit themselves, just as in areas with PSA programmes. Dr Shahady and his research team suggest that their trialled macroinvertebrate index is a vehicle to this awareness and change. Using their tested simplified index, a community group would be able to provide information otherwise unavailable to residents, which may make them care more about their water quality and make local decisions about how to deal with water pollution. Areas of contamination could be identified, and this information could be used to begin building an improved legal framework for river protection. Locals involved could begin to establish long-term monitoring stations and organise efforts to improve water quality. THE FUTURE Although drinking water in Costa Rica is in adequate supply, sanitation management, particularly in rural areas, is highly lacking. Rivers carry high levels of pollutants and could be a significant source of disease. A simple index to measure water quality, as trialled by Dr Shahady and his team, may be the catalyst to change this. By monitoring their own water, Dr Shahady and his team hope that communities will work together to install management practices for agricultural and sanitary wastewater, so that water quality can be improved. They believe that engagement with, and support for, water monitoring practices is the only way forward in ensuring the future of water health in Costa Rica.

Research Objectives

Dr Shahady's research focus centres on understanding the relationship between water quality, community health and water use in Costa Rica.

References

Arias, A. (2010). 'Situación de Potabilización y Saneamiento en Costa Rica'. En: Decimosexto Informe Estado de la Nación en Desarrollo Humano Sostenible, pp. 36. Bower, K. (2014). 'Water supply and sanitation of Costa Rica'. Environmental Earth Sciences, Vol. 71, pp. 107-123. Cairns, J. & Pratt, J. (1993). 'A history of biological monitoring using benthic macroinvertebrates', in Rosenburg, D. & Resh, V. (eds.) Freshwater biomonitoring and benthic macroinvertebrates. New York: Chapman and Hall. Karmalkar, A., Bradley, R. & Diaz, H. (2011). 'Climate change in Central America and Mexico: regional climate model validation and climate change projections'. Climate Dynamics, Vol. 37, pp. 605-629. Kuzdak, C. & Wiek, A. (2014). 'Governance scenarios for addressing water conflicts and climate change impacts'. Environmental Science and Pollution Research, Vol. 42, pp. 181-196. Schwarzenbach, R., Egli, T., Hofstetter, T., von Gunten, U. & Wehrli, B. (2010). 'Global water pollution and human health'. Annual Review of Environment and Resources, Vol. 35, pp. 109-136. Wood, M., Sheridan, R., Feagin, R., Castro, J. & Lacher, T. (2017). 'Comparison of land use change in payments for environmental services and National Biological Corridor programs'. Land Use Policy, Vol. 63, pp. 440-449.

Detail 1501 Lakeside Drive Lynchburg VA 24501, USA Bio Dr Shahady is an Environmental Scientist and Researcher. He earned his PhD in Zoology from North Carolina State University and his Master of Science in Public Health and Engineering from UNC – Chapel Hill. Funding • University of Georgia Costa Rica for Logistical and Matching Support • University of Lynchburg for Direct Funding of Research • Two BESA (Bosqueterno S.A.) grants for direct financial support Collaborators • José Montero – Researcher UGA-CR • Martha Garro – Researcher UGA-CR • Fabricio Camacho – Director UGA-CR • Scott Connelly – Professor UGA • Veronica Sheehan – Sustainability Center


Personal Response What made you so interested in researching and analysing water quality in Costa Rica in particular? Teaching students of all ages and the discovery of needs in local communities. The people of Costa Rica are wonderful and have helped me teach, learn and enjoy everything about Costa Rica. I wanted to give something back and use my expertise to help them solve problems. The streams also piqued my interest in tropical stream ecology and patterns of worldwide insect distribution. Discoveries in nature, ecological response to both natural and human disturbance, and teaching about living sustainably with these amazing natural resources are some of my greatest interests.



Earth and Environment ︱ Dr Lindsay Worthington

Drilling for knowledge: A collaborative approach to ocean drilling in the Arctic

The Northern Pacific, Bering Sea and Western Arctic regions contain important tectonic data, paleoclimatic records (data concerning the Earth's past climatic changes) and paleoceanographic records (data concerning the history of the oceans in the geologic past with regard to circulation, chemistry, biology, geology and patterns of sedimentation and biological productivity). These records can be used to obtain a better understanding of modern-day climate change as well as geologic hazards in the area, including volcanism, tsunamis and the Alaska-Aleutian margin, a boundary between two tectonic plates that has been associated with several large-magnitude earthquakes in the last decade alone.


These regions remain seriously under-sampled by scientific ocean drilling, however, and such drilling is necessary for obtaining these tectonic, paleoclimatic and paleoceanographic records. Only one International Ocean Discovery Program (IODP) drilling expedition has been north of the Aleutian Islands (a volcanic island chain belonging to both Russia and the

USA in the North Pacific) since 1971, and only one expedition has recovered cores from the Arctic Ocean. The entire eastern Gulf of Alaska has never been drilled. Tectonic, paleoclimatic and paleoceanographic records can be collected via the coring and drilling of sea-floor sediments and sea ice using the US National Science Foundation (NSF)-funded drill ship, the R/V JOIDES Resolution, which serves the NSF International Ocean Discovery Program (IODP). This ship is scheduled to operate in the North Pacific in 2023, presenting a tremendous opportunity for planning and achieving high-priority science objectives in this region. To ensure that this ship is used to its best advantage, it is necessary that drilling expeditions are proposed before then. Well-conceived and coordinated drilling expeditions are needed, both to expand scientific understanding and to make effective use of the NSF's drill ship. This is why, in September 2018, 76 scientists from nine different countries participated in the workshop 'Scientific Exploration of the Arctic and North Pacific: Community-driven Priorities for Ocean Drilling'.

The Bering Strait on the western edge of Alaska.


The Arctic contains important records of tectonic and geological oceanic history, yet is a vastly under-sampled area. However, a better knowledge of the past climate and tectonic history of this region is crucial to understanding contemporary climate change and modern geological hazards. This is what makes the research of Dr Lindsay Worthington at the University of New Mexico and fellow researchers so important. Their collaborative work to develop new proposals and reinvigorate existing proposals for scientific ocean drilling in the region will combat the issue of under-sampling in the Arctic.

Convened by Dr Lindsay Worthington from the University of New Mexico, along with Dr Kristen St. John from James Madison University, Dr Bernard Coakley from the University of Alaska, Dr Juliane Mueller from the Alfred Wegener Institute and Dr Matthias Forwick from the University of Tromsø, the workshop aimed to develop new proposals and reinvigorate existing proposals for scientific ocean drilling in the area. Funded by the International Ocean Discovery Program, the US National Science Foundation, the US Bureau of Ocean and Energy Management and the Norwegian Petroleum Directorate, the project and its associated workshop focused on regional coordination, encouraging the development of new coordinated drilling strategies. Successful drilling proposals usually take four years from initial submission. The rationale for the workshop was therefore to form proponent groups to submit proposals for scientific drilling to IODP, in order to take full advantage of the planned availability of the JOIDES Resolution. WHAT WILL OCEAN DRILLING ACHIEVE? Ocean drilling is necessary in order to explore and study the composition and structure of the Earth's oceanic basins.


Ocean drilling can enable access to marine sediments, which contain important archives for better constraining past climate and tectonic history. This information is critical to understanding modern climatic changes as well as geologic hazards. In the North Pacific, the Bering Sea and the Arctic Ocean region, previously limited drilling is hampering scientific understanding of fundamental Earth processes, such as the impacts of oceanic exchange between the Arctic and North Pacific Oceans on long-term climate evolution.

Variations in climate can be seen in marine sediment cores, which enable reconstructions of changes in sea level, ocean salinity, temperature, sea-ice extent, and the response of marine ecosystems. This paleoclimatic data can be used to inform understanding of contemporary climate change. It is therefore pertinent that these historical records are accessed via ocean drilling. Sea-ice records accessed via ocean drilling can also shed light on how ice and sea levels respond to warming and cooling; information such as this is also important in expanding knowledge of climate change today.

This is just one such proposal that has the potential to extraordinarily advance knowledge on modern climate change and geologic hazards.

Not only can ocean drilling inform current knowledge of contemporary climate change, it can also expand knowledge of Alaskan earthquakes and other tectonic activity in the area. By finding out more about past tectonic activity through data on the evolution of the mantle magmatic source in the region, understanding of subduction and tectonics in the North Pacific can be increased. Further paleoceanographic data concerning patterns of subduction zone earthquakes, submarine landslides and tsunamis can also be accessed through ocean drilling. Increasing knowledge of these fundamental issues has great potential to improve modern understanding of tectonic activity and climate change through the use of paleoceanographic reconstructions.



Behind the Research

Dr Lindsay Worthington

E: lworthington@unm.edu T: +1 505-277-3821

Paleoceanographic data concerning patterns of subduction zone earthquakes and tsunamis can be accessed through ocean drilling.

UNDERSTANDING MEGA-THRUST EARTHQUAKES One such proposal addressed the lack of knowledge surrounding the variability of mega-thrust earthquakes. Mega-thrust earthquakes occur when one tectonic plate is pushed under another. These earthquakes are the planet's most powerful and can cause intense damage. Currently, the variability of these mega-thrust earthquakes is thought to be caused by a number of factors, such as the location of splay faults, smaller fault lines that branch off from the main tectonic fault.

During the workshop, a total of 15 proposal ideas were recognised as priorities, and science and data leads were identified. These proposals spanned the geographic region of interest, and included three for the Bering Sea, four for the Arctic and eight for the North Pacific. These proposals were written by proponents at the meeting as well as by other researchers in the broader community.

Through the study of higher-resolution paleoseismic records (records of past tectonic and seismic activity), it is likely that more can be learned about the factors that contribute to the intensity of mega-thrust earthquakes. Southern Alaska presents a great opportunity to investigate these factors, due to conditions thought to have preserved excellent high-resolution records. Through ocean drilling, more could be learnt about how mega-thrust earthquakes progress.


Research Objectives

Dr Lindsay Worthington's research focuses on subsurface imaging of the Earth's crust to understand geologic processes such as mountain-building and feedback between surface processes and tectonics.


Detail

University of New Mexico Dept of Earth and Planetary Sciences 1 University Place – MSC0203 Albuquerque, NM 87131 Bio Dr Worthington is an assistant professor at the University of New Mexico. She uses seismology and geophysics to study Earth structure, mountain-building, glacial dynamics and plate boundary fault processes. Funding International Ocean Discovery Program, US National Science Foundation, US Bureau of Ocean and Energy Management, Norwegian Petroleum Directorate


Collaborators • Kristen St. John • Bernard Coakley • Juliane Mueller • Matthias Forwick

WORKSHOP SUMMARY AND THE FUTURE OF OCEAN DRILLING FOR RESEARCH IN THE ARCTIC A particular focus of the Scientific Exploration of the Arctic and North Pacific workshop was to get early-career scientists involved with the drilling programme, in order to ensure longevity of research. To support this goal, a pre-workshop session discussed the process of preparing, submitting, and revising drilling proposals, as well as what happens after a proposal is selected for scheduling. Notably, the workshop was successful in engaging early-career scientists, and it was also crucial in developing new collaborations and strengthening existing partnerships, in order to create proposals that have the potential to advance scientific knowledge in the region.



It is safe to say that the Scientific Exploration of the Arctic and North Pacific project has made a significant contribution to future knowledge.

WORKSHOP SUMMARY AND THE FUTURE OF OCEAN DRILLING FOR RESEARCH IN THE ARCTIC A particular focus of the Scientific Exploration of the Arctic and North Pacific workshop was to get early-career scientists involved with the drilling program, in order to ensure longevity of research. To support this goal, a pre-workshop session discussed the process of preparing, submitting, and revising drilling proposals, as well as what happens after a proposal is selected for scheduling. Notably, the workshop was successful in engaging early-career scientists, but was also crucial in developing new collaborations, as well as strengthening existing partnerships in order to create proposals that have the potential to advance scientific knowledge in the region.

Research Objectives
Dr Lindsay Worthington's research focuses on subsurface imaging of the Earth's crust to understand geologic processes such as mountain-building and feedback between surface processes and tectonics.

References
Bufe, C., Nishenko, S., and Varnes, D. (1994) Seismicity trends and potential for large earthquakes in the Alaska-Aleutian region. Pure and Applied Geophysics, Volume 142, pp. 83-99.
Coakley, B., & Stein, R. (2010) Arctic Ocean Scientific Drilling: The Next Frontier. Scientific Drilling, Volume 9, pp. 45-49.
Finn, S., Liberty, L., Haeussler, P., & Pratt, T. (2015) Landslides and megathrust splay faults captured by the late Holocene sediment record of eastern Prince William Sound, Alaska. Bulletin of the Seismological Society of America, Volume 105(5), pp. 2343–2353.
Liberty, L., Finn, S., Haeussler, P., Pratt, T. & Peterson, A. (2013) Megathrust splay faults at the focus of the Prince William Sound asperity, Alaska. Journal of Geophysical Research: Solid Earth, Volume 118.
Worthington, L., St. John, K. and Coakley, B. (2019) The future of scientific drilling in the North Pacific and Arctic. Eos, Volume 100.

Personal Response
Other than the NSF-funded drill ship, what made you and your collaborators realise the need for a workshop to address the lack of ocean drilling in the North Pacific, the Bering Sea and the Arctic Ocean region?
The Arctic region is one of the last frontiers in oceanography and tectonics and has essentially operated as a 'black box' for many questions addressing global and hemisphere-wide processes. Further, climate change at these high latitudes presents a scientific challenge with far-reaching societal implications. Accessing archives of past climate can help us understand what to expect as the Arctic warms in the future. In Alaska, multiple large scientific initiatives in recent years have contributed significant insight into Earth structure. Ocean drilling is the next step in ground-truthing observations.





Dr Lindsay Worthington



Thought Leader

EuroScience: Reception offered by EU Commissioner Carlos Moedas at ESOF2018.

Supporting scientists across Europe to work together for a brighter European future
EuroScience is a non-profit grassroots organisation that believes that science is an essential part of Europe's cultural heritage and that it forms the basis for the continued health and prosperity of the continent and its citizens. By supporting European scientists at all levels of their careers and promoting research and development, the organisation has, since 1997, been enhancing the contribution of science, technology and innovation to people's wellbeing and culture.

Whether they are into physical or biological sciences, social sciences or humanities, work for a public sector institution, university, research institute or the business sector, or are early career or top of their field, EuroScience aims to support, encourage and represent all European scientists. In an interview with EuroScience's President, Michael Matlosz, we found out how the organisation has become an anchor point for anyone who wants to interact with the European science community, a platform that policy-makers can approach for advice and expertise, and an open space for discussion, debate and creativity for thousands of members in nearly 80 countries.

ESOF2018 sculpture.

Hi Michael! Can you tell us more about your role and responsibilities as the President of EuroScience?
My role as President is to provide direction and leadership for the

development of the action and visibility of EuroScience. EuroScience is a unique pan-European organisation representing the voice of individual researchers and science professionals in both the public and private sectors, regardless of their institutional affiliation and across all geographic and disciplinary boundaries. The role of the President of EuroScience is particularly rewarding for me as a university researcher and educator, having spent the majority of my scientific career in Europe with strong involvement in collaborative action on a pan-European level. Providing a voice for researchers and science professionals as individuals is the unique role of EuroScience as a contributor to the creation of a dynamic European Research Area.


Plenary session ESOF2018, Toulouse, France.

Past and present EuroScience Presidents (from left) Jean-Patrick Connerade, Enric Banda, Michael Matlosz, and Lauritz Holm-Nielsen.

Can you tell us more about EuroScience's history, mission and core principles?
EuroScience was founded in 1997 at a public meeting open to scientists from all of Europe. The basic idea was to create a platform for individual European scientists to meet, exchange views, have their voices heard in European policy discussions, work on proposals to enhance their careers, discuss how science impacts society and, in general, advance science to the benefit of Europe. The mission of EuroScience is to contribute to an integrated European continent of science and to the development of prosperity and mutual understanding among all Europeans through the advancement of scientific research.

Are there any specific scientific fields you would personally like to see moved forward and invested in within Europe in the next few years?
The future of European science does not depend on any specific field, but rather on our collective ability to attract the best and brightest scientific talents in all fields to Europe. The priority for Europe, not only on the supranational level of the European Union but also on the level of all European member states and associated states, should be the promotion of attractive scientific careers for those bright talents who wish to remain in Europe or to come to Europe to live and work. EuroScience promotes geographic and professional mobility and equal opportunity for women and men, regardless of their national origins.

EuroScience is a unique pan-European organisation representing the voice of individual researchers and science professionals.

What do you feel are the most troubling political blocks facing European scientists today? How is EuroScience trying to help overcome these?
Among the issues of preoccupation that European scientists face today, I would cite: continued weakening of public trust in scientific expertise; increasing disregard for academic freedom; unnecessary

opposition between short-term innovation and long-term research in public science policy; inequalities and insufficient investment and capacity building in research and innovation across Europe; and rapidly changing career opportunities and working conditions. EuroScience is working actively to address these and other major issues. In particular, we are collaborating with several organisations representing early-career researchers, such as the Marie Curie Alumni Association and EuroDoc, and we have also developed contacts with informal movements such as those created in Southern European countries in the wake of the austerity policies implemented after the financial crisis. The EuroScience Open Forum (ESOF) is a major platform for debate and discussion on joint action. ESOF's career programme is an important opportunity, and the EuroScience magazine EuroScientist provides a powerful vector for expression. EuroScience also regularly issues joint policy papers and statements with sister organisations throughout the world. One such statement was a joint declaration on the importance of public sector support for long-term research, presented jointly to European Commissioner Carlos Moedas at the ESOF conference in Manchester in 2016.




ESOF Party at Cité de l’Espace, Toulouse: Ariane 5 rocket.

Plenary session discussion ESOF2018.



ESOF2016 Manchester, UK.


What do you feel are the biggest challenges facing European scientists today?
There are many challenges facing European scientists, among which I would like to mention the challenges to: contribute to the solutions required to face global societal concerns, but without sacrificing the quality of highly original scientific research; maintain focus on true frontier research and disruptive innovation, despite cumbersome organizational structures; engage much more strongly with the public at large; and interact with policymakers to ensure that scientific evidence is taken into account.

EuroScience aims to influence scientific policy change in Europe – are there any policies in particular that the organisation has contributed to in the last few years that you are particularly proud of?
EuroScience contributed significantly to the development of the European Charter for Researchers and the Code of Conduct for the Recruitment of Researchers, and as a founder and contributor to the Initiative for Science in Europe, we were also a major factor in the effort to mobilise the European scientific community to promote the creation of the European Research Council.

Opening Ceremony ESOF2018.

In 2017, EuroScience and the Royal Institution organised a high-profile debate on the impact of Brexit on science in the United Kingdom and the EU-27. The debate contributed to a much better understanding of what is at stake and what options are available for moving forward. EuroScience is also strongly involved in the Open Science agenda, including support for initiatives in Open Access, Open Data and Citizen Science. The increasing importance of digital technologies in scientific practice and scientific communication is changing the exercise of the scientific profession in many significant and profound ways, and it is important that the voice of active scientists in this context be heard, independently of their institutional affiliation.

How does EuroScience raise awareness of scientific research in society in general?
Our most successful visible action is ESOF and associated actions such as the Science in the City events. The European Cities of Science title is a very visible celebration of science and its relation to society. An ESOF event is organised once every two years in a major European city, and the event has been growing constantly in participation and visibility since its first edition (Stockholm 2004). The most recent edition (Toulouse 2018) was a major success with over 4,000 registered participants. The 400-500 media representatives present spread the message, and the Science in the City component of ESOF goes well beyond the conference itself to bring science to the public and promote engagement. EuroScience has also granted several European Science Writers Awards at the conference in recent years, underlining how important science journalists are in connecting science to society.

EuroScience promotes geographic and professional mobility and equal opportunity for women and men, regardless of their national origins.

EuroScience gives out awards each year (Rammal Award, European Young Researchers Award, European Science Writers Award) – why do you feel it is important to recognise researchers in this way?
For researchers themselves, awards are important for their careers. They encourage them to persevere on their path to break new ground. Awards also make visible to society as a whole that science is a vital pillar for progress, and that individual creativity is crucial for success. The Rammal Award recognises not just scientific achievements but also contributions to building bridges between societies in the wide Mediterranean area torn apart by history, politics and religion.

Are there any movements within the organisation to connect with scientists/institutions outside of Europe?
Since 2012, several global general science organisations have established a Liaison Group. They share common interests in scientific endeavour and in advancing science to the benefit of mankind. In some cases, we have common sessions at conferences, and sometimes – such as the example of the Manchester declaration – we publish joint statements. The current

member organisations of the International Liaison Group are the American Association for the Advancement of Science; the Brazilian Society for the Advancement of Science; the Chinese Association for Science and Technology; the Korean Society for the Promotion of Creativity and Science; and the Japanese Agency for Science and Technology. Organisations from India and Australia have participated in meetings of the Liaison Group as well. We are also working closely with the Department of Science and Technology in South Africa, and the Programme Committee of the Next Einstein Forum (NEF) in Africa.

There appears to be a clear focus on young and new researchers within EuroScience programmes – why are young researchers so important to the organisation?
As in all professional sectors, organisations involved in research and innovation, whether they be universities, public research institutes or commercial enterprises, need to create the best conditions for the renewal of their talents and to prepare the path for the generations to come. Doing so requires not only high-quality education, but also working conditions that allow future researchers and science professionals to be successful in their careers and to combine that professional success with a positive and rewarding personal and family life.

If you would like to find out more, please visit the EuroScience website at https://www.EuroScience.org/.

EuroScience
1, Quai Lezay-Marnesia
67000 Strasbourg, France
E: office@euroscience.org T: +33 (0) 3 88 24 11 50
W: www.euroscience.org
@EuroScience
www.facebook.com/euroscience.association



Behavioural Sciences ︱ Dr Fabrizia Faustinella

Homelessness in Western Society: The Dark Side of The Moon
Homelessness is a growing phenomenon in western nations, despite decades of improvement across economic and healthcare indices. Dr Fabrizia Faustinella, Associate Professor of Medicine at Baylor College of Medicine, Houston, Texas, has crafted a compelling documentary on homelessness, exploring both its causes and consequences and giving a voice to those living on the streets. The documentary, entitled The Dark Side of The Moon, sheds light on this often unspoken and much-maligned aspect of society and on those facing its challenges on a daily basis.

Almost all of us living in the urban west encounter homelessness on a nearly daily basis. The sight of people living on the streets is very frequent, and yet we barely acknowledge the problem or give it a second thought.

The fact that there is no internationally agreed definition of homelessness makes it difficult to compare homelessness rates across nations. The term 'literally homeless' refers to those staying in shelters or abandoned buildings or living on the streets. This group is further characterized as 'sheltered homeless' or 'unsheltered homeless,' depending on whether or not they are using some form of accommodation. Furthermore, those who stay with family or friends because they have no alternative lodging arrangements are termed 'precariously housed.'

Despite the complexity of measuring homelessness, however, available data paint a striking picture of its prevalence. A 2007 study, for example, shows that 1 in 13 people in the UK have been homeless at some point in their lives, and the number is now increasing. According to official UK Government statistics, which do not even include people sleeping in a shelter or an overnight hostel, the number of homeless in England doubled over the period 2012-2018. Some figures suggest that rates in the United States are decreasing, with 100,000 fewer people sleeping in shelters and on the streets over the past nine years. Other figures, however, suggest that homelessness has in fact been steadily increasing since the 1970s, in smaller, rural townships as well as in America's biggest cities. More than 550,000 people, including about 36,000 unaccompanied youth under the age of 25, are homeless in the US on any given night, according to data from 2018, and homelessness remains a widespread phenomenon. According to The National Center on Family Homelessness, a staggering 2.5 million children are now homeless in the US. Why, despite the obvious wealth of developed western nations, is the prevalence of homelessness so high?


THE DARK SIDE OF THE MOON
Dr Fabrizia Faustinella, Associate Professor in the Department of Family and Community Medicine, Baylor College of Medicine, Houston, Texas, takes a holistic view of homelessness. Trained as a physician and with a PhD in genetics, Dr Faustinella has created a documentary to shine a light on homelessness, its root causes, its effects, and ultimately, possible solutions.


What makes people vulnerable to homelessness? What are the links between drug use, mental illness, crime and homelessness? These are some of the questions that she addresses in her film, evocatively named The Dark Side of The Moon.

WHY DO PEOPLE BECOME HOMELESS? THE HARDSHIP OF STREET LIFE
There are many reasons why people become homeless: loss of income or insufficient income, lack of affordable housing, poverty, marriage breakdown, domestic violence, and mental and physical illness are, among others, common causes. Many people talk of a spiral sparked by one event, for example the loss of a job, resulting in financial instability and eviction, which, combined with family breakdown, lack of support and lack of needed services, eventually leads to homelessness. In the United States, racial and ethnic minorities are disproportionately affected by this problem. When compared to the total population, those who are homeless are more likely to be adult, male, African American, unaccompanied/alone, and disabled. More than 10% of adults living on the streets in the US are veterans.

Dr Faustinella has created a documentary film to shine a light on homelessness, its root causes, its effects, and ultimately, possible solutions.

Many people experiencing homelessness have serious mental illness and issues with drug and alcohol abuse, likely to be both a cause and a consequence of their condition. While up to 6% of Americans are affected by a mental health condition that can severely impact behaviour and quality of life, more than 20% of homeless people fall under this category. Furthermore, about 45% of the homeless population in the US has a history of mental health diagnoses. This number does not include those living with mental health problems that have not been diagnosed or addressed by healthcare professionals.

It is easy to see why mental illness could make someone vulnerable to becoming homeless. Severe mental health issues, such as major depression, bipolar disorders, schizophrenia and psychosis, often lead to chaotic living and strained relationships, in addition to impacting a person's ability to work and provide for themselves financially. Mental illness can also impact an individual's ability to care for their own health, leaving them more susceptible to other disease processes. All these factors increase the likelihood that a person may end up living on the streets.

As explained in The Dark Side of The Moon, many of those living on the streets come from abusive families and have suffered neglectful, traumatizing experiences as children. This can have long-lasting, negative effects on their neuro-biological development, leaving them at increased risk of behavioural problems and mental illness. Sadly, these fractured family dynamics also mean that these individuals are less likely to have family support when they fall on




hard times, again compounding their risk of homelessness. While there have been great strides in trying to improve understanding and treatment of mental illness in recent years, most wealthy nations’ healthcare systems still do not provide adequate resources and funding for much needed mental health services. As a result, mental health issues are not properly addressed and once people are living on the streets, the chances of being treated appropriately are even lower or non-existent. Early intervention is key to supporting those at highest risk and yet many services are insufficient and woefully underfunded.

During the film, Alexis, a young homeless woman, speaks of her experiences of being ignored and of the issue of homelessness being unseen, likening it to the 'dark side of the moon'. This metaphor is woven throughout the film – as well as being its title – as those living on the streets talk about their daily experiences of being overlooked, or even feared, by their fellow citizens. Studies show, however, that people who are homeless are more likely to be victims of crime than perpetrators. Sleeping on the street with all their worldly possessions makes them extremely vulnerable to attacks. Rates of rape among homeless women stand at approximately 30%. Those with serious mental health conditions are even more likely to suffer abuse. Unfortunately, particularly in the US, these individuals are also at higher risk of incarceration.

Behind the Research
E: fab7faust@gmail.com E: Fabrizia.Faustinella@bcm.edu T: +1 281 450 2449 W: www.bcm.edu/people/view/fabrizia-faustinella-m-d-ph-d-facp/5cc974e8-abd6-11e4-8d53005056b104be


A powerful aspect of The Dark Side of the Moon is the amount of on-screen time dedicated to giving the homeless their own voice, allowing them to explain their own situation and talk about their own personal experiences. Many speak of living in a vicious cycle of struggling to get formal identification documents, therefore being denied access to support services, which creates insurmountable obstacles when trying to come out of homelessness. Their accounts tell a tale of life on the streets being a demeaning, humiliating and, at times, dehumanising experience. Clearly, living without material comforts is only one part of their plight. The mental struggle caused by isolation and abuse is often an even more difficult burden to bear.

Dr Fabrizia Faustinella

Many people experiencing homelessness are affected by serious mental illness, likely to be both cause and consequence of this tragic human condition.

In fact, across the United States, people with serious mental health issues are more likely to be jailed than to be hospitalized. It has been found that 17.3% of prison inmates with severe mental illness were homeless prior to being arrested, and 40% were homeless at some point in their lives, compared to 6% of inmates without such a diagnosis. To complicate matters, homeless people are often targeted by a number of city ordinances which prohibit particular behaviours such as obstructing sidewalks, loitering, pan-handling, trespassing, camping, being in particular places after hours, sitting or lying in particular areas, wearing blankets, sleeping in public, storing belongings in public places, and so forth. Under these laws, homeless people are regularly cycled through prisons and jails, which perpetuates abuse and discrimination. Homelessness and incarceration each increase the risk of the other, creating a cycle of hardship and uncertainty.

WHERE TO GO FROM HERE?
As emphasized in the documentary, there is no easy, clear-cut solution for how to deal with homelessness. It's apparent that a viable approach from policy-makers needs to address multiple issues at the same time: early housing, affordable housing, job training, access

to healthcare and rehabilitation services, and access to social services, in addition to engaging the wider community. Because homeless people don't vote and many of us remain oblivious to their struggles, there is little pressure on politicians to tackle the issue. What is obvious from available data is that doing nothing to alleviate the problem is costlier than intervening. The chronically homeless population uses expensive public services at very high rates. As cities deal with increasing emergency department visits and hospitalisations, shelters, mental health services, incarceration, court time and so forth, the cost of preventing homelessness and helping people come out of it pales in comparison.

In the film, a social worker credits the 'grace of God' as the sole reason he has a home and a comfortable life while others are on the streets. His narrative, like many other testimonials, conveys how quickly any one of us could become homeless due to a dramatic life change, compelling viewers to re-evaluate their own personal attitudes and prejudices towards homeless people, to understand the complexity of the human condition and to find within themselves elements of shared humanity.

Research Objectives


Dr Fabrizia Faustinella’s documentary, The Dark Side of the Moon, explores the root causes of homelessness and the challenges of street life, as long as societal biases and prejudices towards homeless people. It also focuses on how a difficult upbringing, amidst abuse and neglect, can have a devastating impact on the cognitive and physical development of a child.


The documentary discusses possible solutions to the widespread and devastating problem of homelessness.

Detail
Dr Fabrizia Faustinella
3718 South MacGregor Way
Houston, TX 77021, USA
Bio
Dr Faustinella is currently associate professor of medicine at Baylor College of Medicine, Houston, TX. She's also a filmmaker and has recently written, directed and produced a documentary on homelessness, entitled The Dark Side of the Moon, which compels viewers to re-evaluate their own personal attitudes and prejudices towards homeless people, to understand the complexity of the human condition and to find within themselves elements of shared humanity.

References
Toro, P. A. et al. (2007). 'Homelessness in Europe and the United States: A comparison of prevalence and public opinion'. Journal of Social Issues, 63(3), 505-524.
Criminalizing Homelessness Violates Basic Human Rights: https://www.hrw.org/news/2018/07/05/criminalizing-homelessness-violates-basic-human-rights
Homelessness. Available at: https://ourworldindata.org/homelessness
Mental Illness and Homelessness: Facts and Figures: http://www.hcs.harvard.edu/~hcht/blog/homelessness-and-mental-health-facts
The National Law Center on Homelessness and Poverty (2015). Homelessness in America: Overview of Data and Causes: https://www.nlchp.org/documents/Homeless_Stats_Fact_Sheet

Personal Response
What else do you think needs to be done to raise awareness and tackle the issue of homelessness?
Sooner rather than later, political leaders will have to recognize that it's far less expensive to prevent homelessness and help people come out of homelessness than to maintain the status quo. In fact, there will never be a shortage of homeless people if the conditions leading to homelessness are not addressed and solved. Homelessness is only the tip of the iceberg: it is the end result of many social ills including poverty, unaffordable housing, unemployment, untreated mental illness, lack of social safety nets, incarceration and family disintegration, among others. It is therefore of paramount importance to educate the public and spread awareness of this problem, its causes and its solutions. The documentary also reminds us that this is an issue our society cannot afford to ignore, if for no other reason than that it is the humane thing to do. Many of us choose to look away and ignore the homeless because it's easier than accepting a painful reality that doesn't align with our own, or because we harbour prejudices against them and don't see them as worthwhile human beings. Understanding the complexity of the problem will lead to a positive shift in attitudes and intentions toward homelessness, eventually resulting in increased support for changes in social, political and economic policies.




Health and Medicine ︱ Patsy Beattie-Huggan BN, MScN, Professor Dr. med. Kirsten Steinhausen & Stefanie Harsch, MA Health Education

A framework for global health promotion: The Circle of Health
The Circle of Health (COH) is an interactive and exemplary framework that bridges gaps in health promotion and draws together both the external and internal factors that drive our health. Spearheaded by Patsy Beattie-Huggan, BN, MScN, founder and president of The Quaich Inc., this novel tool was developed, refined and evaluated through broad consultation in Prince Edward Island (PEI), Canada. With collaborators Kirsten Steinhausen and Stefanie Harsch, the team is now focusing on how it can be used to identify translational gaps in WHO declarations and be disseminated more broadly.


An individual's health is a combination of many driving internal and external factors. To protect and promote health, various interventions have been developed, but they often overlook many of these factors and therefore lack sustainability. The Circle of Health (COH) is a translational tool and value-based framework that links together all these health factors to support effective education, planning, collaboration and evaluation in health promotion. By simplifying the complexity of health into a usable tool, it plays an important role in helping to bridge the gap between lifestyle-focused and determinants-of-health approaches to health promotion and disease prevention. The COH integrates the concepts required to plan for health at an individual, family, community, system and societal level.

“The Quaich” is a Scots rendering of the Gaelic word meaning “cup”. They date back centuries, and are a symbol of friendship and community.

The COH can also be used to re-analyse findings and provide the whole picture of the dynamics of health. Its particular strength compared to other models is its relevance. Taking a holistic approach, the framework incorporates a complete understanding of health, health promotion strategies, the determinants of health and six key values, such as caring and justice. It is written in plain language and arranged as a multi-coloured tool with moveable rings, reminiscent of a compass; visual and kinaesthetic learners find it easy to use. The COH is designed to overcome barriers of low literacy, and is structured to provide a holistic, 'health in all settings' approach. Today, the COH is used worldwide and has been validated and internationally evaluated. The COH is also delivered in a workshop format, is available in six different languages and is transferable to diverse settings in 20 countries.

The COH was developed in 1996 during health reforms in Prince Edward Island, Canada. Originally created to generate a shared understanding of health promotion amongst many sectors, the COH grew out of a consultative, community-based, community-development process: a collaboration involving individuals from government and marginalised sectors. Funding came via a partnership including Prince Edward Island (PEI) Health and Community Services, the Canadian Public Health Association, Health Promotion Network Atlantic and the PEI Women's Network.

Patsy Beattie-Huggan, founder and President of The Quaich Inc., has been at the forefront of the development, use and teaching of the COH since the outset. Together with collaborators Kirsten Steinhausen and Stefanie Harsch, and with the support of Luis Saboga-Nunes, the team is now focused on using the COH in different environments and for diverse target groups, to bridge the gap in the release of World Health Organization (WHO) declarations where translational strategies are missing.

UPTAKE AND EVALUATION
Soon after the launch of the COH, a spontaneous uptake of the tool was observed in Canada and other countries. This prompted an evaluation in 1997, which reported that users in multiple settings, including education, research and planning, rated both the concept and design highly, at 91%. Patsy Beattie-Huggan explains: "The unanticipated uptake of the COH internationally speaks to a need for tools that focus on holistic health and empower stakeholders in multiple sectors to work together." In 2004, international reviewers from the UK, USA, Australia and Canada provided positive feedback on the relevancy of the COH. With further work, a website, online workshops, a facilitator manual and supplemental knowledge translation tools were developed.

The Quaich started delivering online workshops in 2008. In 2014, the evaluation of the online workshops showed that 100% of participants learned from the online workshops; 91% regarded the COH as useful for engaging others; and 83% found it beneficial for addressing social justice and health equity. In 2018 the online workshops were adapted and delivered as credit courses for students at Furtwangen University, Germany. Positive evaluations were received from students, and it was concluded that implementing online training is an ideal way to train facilitators in different places.

As a shared framework in addressing complex organisational and global issues, [the COH] could fill the gap.

As of 2018, over 10,000 English copies and 700 copies in other languages of the COH have been distributed. Workshops have also been held internationally and many facilitators trained. Researchers, practitioners and academics have found it to be clear, easy to use and all-inclusive. Some are introducing it to students and encouraging them to use it as a planning and assessment tool. The tool has been utilized in contexts where students' first language is not English.





Behind the Research

Circle of Health Workshop - EUPHA Conference 2018, Ljubljana, Slovenia.

Patsy Beattie-Huggan BN, MScN

Professor Dr. med. Kirsten Steinhausen

Stefanie Harsch, MA

E: patsy@thequaich.pe.ca T: +1 902-894-3399 W: www.thequaich.pe.ca E: stefanie.harsch@ph-freiburg.de E: kirsten.steinhausen@hs-furtwangen.de

Research Objectives


To support knowledge translation and innovation in applications of the Circle of Health© (COH). One of the leading consulting tools in health promotion, the COH aims to analyse, plan and implement programs and identify gaps.


Detail
The Quaich Inc., Charlottetown, PEI, Canada
www.thequaich.pe.ca
www.circleofhealth.net

A 2018 study conducted by researchers in Freiburg, Germany, analysed existing health interventions for refugees in Germany and assessed the feasibility of the COH as an analysing tool. Over 150 interventions were identified and analysed. Notably, the COH helped to identify and systematically organise the broad field of interventions, analyse the approaches in general, explore gaps, raise awareness of unaddressed areas (e.g. critical health literacy) and of underlying principles, and inspire the development of new interventions.

BRIDGING THE GAPS: WHO DECLARATIONS
It is vital to analyse a country's health promotion approaches comprehensively, considering its most relevant factors and how they link together. Because it is holistic, the COH is ideally placed to bridge the theory-to-practice gap in health promotion. Importantly, the COH can also assist policy makers and community leaders in taking steps to promote health at a system level. A gap in the release of the WHO declarations (from the Ottawa Charter to the Shanghai Declaration) highlights that translational strategies have been missing. There are a number of reasons why the COH can plug these gaps: it is already used worldwide and is a translational tool for health promotion.


The continued relevance of the COH demonstrates that tools based on values, validated with multiple stakeholders, with a holistic approach resonate with a wide range of users and can be applied across different languages and cultures.

The COH promotes the inclusion of values, the Ottawa Charter, the determinants of health and the Aboriginal Medicine Wheel. It has been validated by multiple stakeholders and demonstrates a holistic approach to areas such as culture and literacy. As a shared framework in addressing complex organisational and global issues, it could fill the gap.

VALIDATION AND REDESIGN
Multiple stakeholders have validated the COH concept, including Aboriginal elders, self-help groups and people with intellectual disabilities. They recommended adding the Aboriginal Medicine Wheel, determinants of gender and culture, and plain-language materials to the original prototype design. As a model of best practice, the COH has the global potential to be the 'people's framework'. It appeals to a range of different learning styles and levels of education, and has the ability to inspire new approaches to health promotion.

CONCLUSION AND LESSONS LEARNED
The continued relevance of the COH demonstrates that tools that employ a holistic approach and are based on values resonate with a wide range of users. The COH has been successfully applied across different languages and cultures. It is relevant to practitioners, educators, researchers, policy makers and the general public. Patsy Beattie-Huggan explains: "This speaks to its global potential as a useful framework and tool for collaboration to address complex issues." Because of these qualities, the COH is well placed to meet the demand for translational tools to implement WHO declarations.

Bio
Patsy Beattie-Huggan BN, MScN, founder and President of The Quaich Inc., has provided leadership in nursing education, health system redesign, health promotion and the development of the Circle of Health©. She holds a Bachelor of Nursing from the University of New Brunswick and an MSc in Nursing and Health Studies from the University of Edinburgh.
Professor Dr. med. Kirsten Steinhausen holds the position of Professor for Health Sciences and Vice Dean at the Faculty of Health, Safety and Security, Furtwangen University. She has experience of hospital project management, where she has implemented evidence-based guidelines. Her research interests include public health, health policy, health economics, evidence-based practice and health promotion in the workplace.
Stefanie Harsch, MA, PhD student, works as a research associate at the University of Education Freiburg, Germany, in a research project supported by the German Federal Ministry of Education and Research focusing on promoting the health literacy of people with migration and refugee histories in German-as-a-second-language courses.
Funding
Funding for the development of the Circle of Health came via a partnership including the PEI Government, Canadian Public Health Association, PEI Women's Network, and Health Promotion Network Atlantic.

Beattie-Huggan, P. (2016). Circle of Health – A holistic and systematic approach to health promotion research, education and practice. SpiritualitéSanté, Vol. 9, No. 1. (Published in French; available in English in Health Promotion Preliminary Research Report: Looking Back ... Moving Forward, pp. 33-38, Alliance for Health Promotion, Versoix, Switzerland, 2016.)
Beattie-Huggan, P. (2018). Circle of Health – a unique knowledge translational tool with global potential. European Journal of Public Health, Volume 28, Issue suppl_4, November 2018, cky214.185. https://doi.org/10.1093/eurpub/cky214.185
Harsch, S., Beattie-Huggan, P., Steinhausen, K., Saboga-Nunes, L., & Bittlingmayer, U. H. (2018). Applying the Circle of Health to analyse health interventions for people with refugee experiences in Germany. European Journal of Public Health, 28(suppl_4), cky213-823.
Mitchell, T., Beattie-Huggan, P. (2006). Bridging the distance between lifestyle and determinants of health approaches: The Circle of Health as a synthesis tool. International Journal of Health Promotion & Education, Volume 44(2), pp. 78-82.
PEI Health and Community Services Agency. (1996). Circle of Health©, Charlottetown (PE, Canada).
Rocha, D., Beattie-Huggan, P. (2016). Potential of distance learning using a shared framework amongst a community of dispersed learners across large geographic areas for implementing a local, national, regional, and global strategy. Presented at the 22nd IUHPE Conference, 2016, Curitiba, Brazil.

Personal Response
The Circle of Health has gone from strength to strength. What is next for the tool?
The potential for the Circle of Health to be used in additional settings, such as those working with children, refugees and vulnerable populations, is endless. The hope is to expand its use internationally and to share stories of its use, to facilitate a shift to a more holistic approach to promoting global health.

Collaborators Gratitude is expressed to the many people who contributed to the development, evaluation and dissemination of the Circle of Health, with special thanks to Luis Saboga-Nunes, President, EUPHA Health Promotion Section for more recently encouraging its application in the European context.



Health & Medicine ︱ Dr Marlies Wijsenbeek and Dr Karen Moor


Using eHealth to monitor idiopathic pulmonary fibrosis

Dr Marlies Wijsenbeek and Dr Karen Moor from the Erasmus Medical Centre are developing novel eHealth tools, in collaboration with patients, to enable home monitoring of patients with idiopathic pulmonary fibrosis (IPF). Patients can use the eHealth tool to find information about IPF, record their symptoms and conduct consultations via video or email. Furthermore, the patient can take regular real-time spirometry measurements which can be immediately assessed by healthcare providers. This alerts the patient and healthcare provider to changes in the patient's condition, leading to potential treatment adjustments.

Idiopathic pulmonary fibrosis (IPF) is a devastating progressive lung disease with a median survival of only three to five years. IPF typically occurs more often in men than women and generally affects elderly patients, aged 50 years and above. IPF has a highly variable disease course: some patients experience slow disease progression, whereas others suffer a rapid decline in lung function. The incidence of IPF is steadily increasing and, according to the British Lung Foundation, over 6,000 people are diagnosed with IPF every year in the UK.

IPF is characterised by an accumulation of scar tissue (fibrosis), which reduces lung elasticity and impairs oxygen uptake. The term 'idiopathic' means that the cause of the fibrosis is unknown. However, it is thought that scar tissue accumulates due to lung damage from acid reflux, viruses, or environmental factors such as breathing in certain kinds of dangerous dust. Furthermore, some patients may be more genetically prone to developing IPF when their lungs are damaged.

Patients experience symptoms such as breathlessness, coughing and fatigue. These symptoms greatly lower the patient's quality of life, affecting even simple tasks such as going to the shops. Patients lose their independence and often have to rely on friends and family for support. Currently, there is no cure for IPF. In very rare cases, patients may be given a lung transplant. In general, patients are given medication and may also attend therapy to help manage their symptoms: pulmonary rehabilitation, for example, consists of a specific exercise programme which helps the patient cope with feeling short of breath. Two anti-fibrotic drugs, pirfenidone and nintedanib, are used specifically to slow down lung scarring; however, these drugs can have bothersome side-effects such as nausea, fatigue and diarrhoea.

The uptake of oxygen via the lungs is key for good health. The scarring in IPF reduces the lung's elasticity and makes this process less effective.

[Figure: the cycle of personalised treatment in IPF. An individual patient profile combines biology and environment (genome, biomarkers, microbiome, lifestyle, comorbidities, co-medication, environmental factors) with the patient perspective (patient needs, patient experiences, home monitoring, patient-reported outcomes); this profile informs personalised treatment, followed by monitoring on intervention (patient-reported and patient-collected outcomes, physiological parameters, imaging, biology), treatment adjustment and reassessment during the disease course.] This diagram illustrates the numerous elements involved in personalised treatment, as well as the reassessments and adjustments needed.

PERSONALISED MEDICINE
Lately, there has been increasing interest in the use of personalised medicine to treat and monitor IPF. Until now, personalised medicine has mainly focused on understanding the molecular mechanisms underpinning IPF. Little is understood about the influence of interactions between environmental, molecular and genetic mechanisms, and how biological factors affect disease progression and influence the effectiveness of different treatment options. Improving our knowledge of these mechanisms could lead to the identification of specific biomarkers which could be used to develop targeted therapy.

However, Dr Wijsenbeek and Dr Moor believe that patient factors, such as lifestyle, comorbidities, preferences and experiences, should also play a role in personalised medicine. Patient engagement is also an important aspect: for example, it is vital to assess the perspectives of patients before treatment. In randomised controlled trials, medication may show beneficial results at a group level; however, for some individuals, certain drugs may prove ineffective or the side effects may outweigh the benefits of treatment. Currently, over-use and under-use of medication is fairly common in IPF, which might not be the case if patients' preferences were considered. These preferences may alter as the disease progresses, so personalised treatment plans must be regularly evaluated.

Dr Wijsenbeek and Dr Moor recognise the importance of personalising treatment for IPF, taking into consideration the patient's perspective, physiology and lifestyle. They believe that eHealth could contribute to personalised medicine by allowing for frequent home monitoring, evaluation of treatment response and potential treatment adjustments. In fact, a previous study in IPF showed that home-based measurements predicted disease decline better than less frequent hospital-based measurements of lung function. Moreover, home monitoring could be invaluable for IPF patients, as they often struggle to attend frequent hospital visits due to reduced mobility and breathlessness. As a result, the team developed an award-winning eHealth tool for patients to improve their understanding of their health and become actively involved in managing their disease.

A personalised approach to IPF requires frequent monitoring, due to the highly variable nature of IPF and the variation in response to therapy between patients. Dr Wijsenbeek and Dr Moor from the Erasmus Medical Centre were inspired by the possibility of using eHealth tools to monitor the patient more frequently at home, with a low burden for patients, and aimed to investigate whether this could improve patient quality of life.

eHEALTH AND HOME MONITORING
eHealth involves the use of information and communication technologies to exchange health-related data between the patient and healthcare provider. Dr Wijsenbeek and Dr Moor developed a novel eHealth tool, IPF Online, in collaboration with patients. This innovative eHealth tool won two Dutch prizes: the Patient Participation Prize in 2018 and the Lung Foundation public prize in 2016. IPF Online has many features for patients to use, including an information library; the possibility of e-consultations and video consultations; information about medication and possible side-effects; an area for patients to record their symptoms and any side-effects they may be experiencing; and other patient-reported outcomes, such as quality-of-life questionnaires. Patient-reported outcomes are reports of the patient's quality of life, health or functional status that come directly from the patient, without, for example, interpretation by clinicians.

IPF Online is integrated with real-time wireless spirometry, a simple test used to monitor lung conditions in which a spirometer measures how much air is breathed out in one forced breath (the forced vital capacity, or FVC). Interestingly, the team believed that a major challenge would be patient engagement, as many elderly people might be hesitant to use online tools. However, the majority of IPF patients who were asked to participate wanted to use IPF Online. By using IPF Online, the patient can gain insight into their own health condition and become involved in the management of their disease. The team performed multiple studies to determine the feasibility of the home monitoring programme/eHealth tool and patient satisfaction.


Patients can see a daily overview of their lung function results (FVC) in IPF Online. Healthcare providers can also see these results and receive an alert if FVC declines >10% for three consecutive days.




IPF patients found real-time spirometry very useful and would recommend it to others.


PILOT STUDY
Ten IPF patients from the Erasmus Medical Centre were involved in the pilot study with home spirometry. Patients were asked to test IPF Online for one month using a tablet. During this time, patients performed daily home spirometry and completed online patient-reported outcomes at baseline and after four weeks. The spirometry data are transmitted in real time and are directly available for analysis by both patients and healthcare providers. Additionally, automated email alerts are sent if the patient reports troublesome side-effects or if the FVC declines by more than 10% for three consecutive days.

Overall, the results revealed that home spirometry correlated highly with hospital spirometry, showing that home spirometry is a reliable test. Furthermore, 80% of patients considered daily spirometry easy and 90% thought that the process was not burdensome at all. All patients found real-time spirometry very useful and would recommend it to other IPF sufferers.
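The alert logic itself is simple to state. The following Python sketch is only an illustration of the decline rule described above – the article does not publish IPF Online's actual code, and the function and parameter names here are invented:

def fvc_alert(daily_fvc, baseline, threshold=0.10, run_length=3):
    """Illustrative rule: flag when FVC sits more than `threshold`
    below `baseline` for `run_length` consecutive days."""
    consecutive = 0
    for value in daily_fvc:
        if value < baseline * (1 - threshold):
            consecutive += 1
            if consecutive >= run_length:
                return True  # would trigger the automated email alert
        else:
            consecutive = 0  # decline not sustained; reset the counter
    return False

# Example: baseline 3.0 L; readings stay below 2.7 L for three straight days
print(fvc_alert([2.9, 2.6, 2.65, 2.6], baseline=3.0))  # True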

Patients and hospital staff in the study group did identify several potential issues with home monitoring and spirometry, and recommendations were made. For example, a handheld spirometer may be difficult for some patients to use, so the team suggested that patients should be provided with clear training before they start the programme. Furthermore, some patients had no internet access; these patients could be supplied with a 4G SIM card to guarantee internet access. Patients who had never used the internet before were able to use the tablet with ease and perform spirometry, thanks to the simple design. Additionally, some patients may not comply with taking measurements daily; in these situations, patients could be sent email reminders.

A limitation of the pilot study is that it was performed at a single centre with a relatively small sample size. Although it was effective at demonstrating reliability and patient satisfaction and at identifying potential barriers, a larger-scale, multicentre study is needed to see whether the tool improves patients' quality of life in the long term. The study team is currently addressing this by performing a multicentre randomised clinical trial with three other centres in the Netherlands, following patients for six months to investigate whether the home monitoring programme/eHealth tool improves quality of life compared to standard care.

FUTURE PERSPECTIVES
Dr Wijsenbeek and Dr Moor, in collaboration with patients and healthcare staff, have designed a novel eHealth tool that enables personalised, individually-tailored therapy. Patients can understand their own condition and have a say in managing their own disease. Furthermore, because real-time spirometry measurements are immediately available to healthcare providers, changes in a patient's condition are quickly identified and the right treatment can be given. eHealth tools could revolutionise how we treat chronic, long-term conditions. In the future, eHealth tools could be integrated into clinical practice to truly embrace personalised medicine.

Behind the Research

Dr Marlies Wijsenbeek

Dr Karen Moor

E: m.wijsenbeek-lourens@erasmusmc.nl E: c.moor@erasmusmc.nl T: +31650031750 W: www.ipfonline.nl

Research Objectives Drs Wijsenbeek and Moor evaluated a new home monitoring programme with real-time wireless home spirometry in idiopathic pulmonary fibrosis.

Detail
Erasmus Medical Center Rotterdam
Department of Respiratory Diseases
Dr. Molewaterplein 40
3015 GD Rotterdam
Bio
Marlies Wijsenbeek is a pulmonary physician and associate professor at the Erasmus MC in Rotterdam. She is chair of the Erasmus MC multidisciplinary interstitial lung disease (ILD) centre, secretary of the Idiopathic Pulmonary Pneumonia Group of the European Respiratory Society and the Dutch National ILD section, and a board member of the Netherlands Respiratory Society. Her research interests include eHealth and other patient-centred outcome measures in interstitial lung diseases, and new therapies in IPF and sarcoidosis.
Karen Moor is a PhD candidate at the Respiratory Department of the Erasmus MC in Rotterdam. The aim of her research is to improve clinical outcome measures and quality of life for patients with IPF, with a special focus on the development and use of eHealth tools.
Funding
ZonMw, Roche, Boehringer Ingelheim, Erasmus MC Thorax Foundation
Collaborators
Currently a randomised controlled trial is being carried out together with the hospitals Zuyderland MC, St. Antonius hospital and OLVG in the Netherlands.


References
Moor, C.C., Heukels, P., Kool, M. and Wijsenbeek, M.S. (2017). Integrating patient perspectives into personalized medicine in idiopathic pulmonary fibrosis. Frontiers in Medicine, 4, p.226.
Moor, C.C., van Manen, M.J., Tak, N.C., van Noort, E. and Wijsenbeek, M.S. (2018). Development and feasibility of an eHealth tool for idiopathic pulmonary fibrosis. European Respiratory Journal, 51(3), p.1702508.
Moor, C.C., Wapenaar, M., Miedema, J.R., Geelhoed, J.J.M., Chandoesing, P.P. and Wijsenbeek, M.S. (2018). A home monitoring program including real-time wireless home spirometry in idiopathic pulmonary fibrosis: a pilot study on experiences and barriers. Respiratory Research, 19(1), p.105.
Idiopathic Pulmonary Fibrosis. British Lung Foundation. Available at: https://www.blf.org.uk/support-for-you/idiopathic-pulmonary-fibrosis-ipf [Accessed 12/02/2019]


Personal Response How could eHealth tools, such as IPF Online, improve patients’ quality of life in the long term? eHealth tools enable frequent monitoring at home at a low burden for patients. This is especially important in a chronic progressive disease such as IPF, which has a huge impact on patient quality of life and a high symptom burden. Our home monitoring programme IPF Online has the potential to improve quality of life through patient engagement, better medication use, low-threshold communication, stimulation of self-management and earlier detection of disease deterioration.




Health and Medicine ︱ The National Prostate Cancer Register

NPCR: Next generation cancer register
Knowledge is power and, in this modern age, data is often our most valuable commodity. Nowhere is this truer than in healthcare, where patient data is revolutionizing the way we treat diseases such as cancer. To that end, the National Prostate Cancer Register (NPCR) provides a valuable resource for physicians, patients and researchers, and it has already contributed to several improvements to prostate cancer care in Sweden.

The National Prostate Cancer Register (NPCR) of Sweden is a clinical cancer register which serves as a tool for quality assurance and research. The ultimate goal of the NPCR is to ensure that every man diagnosed with prostate cancer receives optimal treatment. The NPCR enables this by collecting information on men diagnosed with prostate cancer, their treatments and the outcomes of those treatments. Information in NPCR is used in a number of ways by patients, physicians and researchers to improve patient outcomes. The forerunner to the NPCR was the Southeast Regional Prostate Cancer Registry, set up in 1987. Other healthcare regions joined, and by 1998 all six of Sweden's healthcare regions were part of NPCR. Since 1998, NPCR has captured information on 98% of all newly diagnosed cases of prostate cancer, as compared to the National Cancer Registry, to which reporting is mandated by law. As of March 2018, a total of 181,660 cases of prostate cancer had been registered.

Figure 1. Linkages and number of cross-linked men in the Prostate Cancer data Base Sweden (PCBaSe) 4.0.

The Prostate Cancer data Base Sweden (PCBaSe) 4.0 links NPCR (185,729 cases, each matched with five controls) to the Swedish Cancer Register (39,039 cases), the Cause of Death Register (80,381 cases), the Multi-Generation Register (61,719 men with brothers; 97,170 men with fathers), the LISA database (160,121 cases), the Prescribed Drug Register (185,714 cases) and the National Patient Register (184,416 cases).

Management and operation of the register is overseen by its steering committee, consisting of one urologist

and one oncologist from each of the six healthcare regions in Sweden, as well as two patient representatives. The Swedish Association of Local Authorities and Regions is a major funding body of NPCR.

Using the unique Swedish person identity number, the NPCR has been linked with a number of other registries, such as the National Patient Registry, the Cause of Death Registry and the Prescribed Drug Registry, in the Prostate Cancer data Base Sweden (Fig 1). This means that researchers now have access to a database of close to 200,000 prostate cancer cases and one million prostate cancer-free men to use as controls, creating an invaluable resource for research; a minimal sketch of this kind of linkage is shown below. The usefulness of PCBaSe as a tool for researchers is evident from the fact that it has formed the basis of more than 130 peer-reviewed research articles since its inception in 2010.

RAISING STANDARDS OF CARE
The reports generated from data in NPCR allow both patients and physicians to access information on the quality of care in all departments in Sweden and their adherence to the national treatment guidelines. The first of these reports, termed 'What's Going On', is a dashboard panel that enables healthcare professionals to access data on patient treatment in real time on a secured server (Fig 2). Data is collected, collated, and reported promptly, and the system is designed to provide information in a user-friendly format that enables efficient quality control of patient care. The service allows physicians to select specific groups of patients and compare the care provided by their department with national standards and other departments in Sweden.
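At its core, the cross-linkage described above is a database join on the (pseudonymised) person identity number. A minimal sketch of the idea using pandas; the table contents and column names here are invented for illustration and do not reflect PCBaSe's actual schema.

```python
import pandas as pd

# Toy stand-ins for register extracts, keyed on a pseudonymised person
# identifier (the real registers are keyed on the Swedish personal
# identity number before pseudonymisation).
npcr = pd.DataFrame({
    "pid": [1, 2, 3],
    "diagnosis_year": [2010, 2012, 2015],
    "risk_category": ["low", "high", "intermediate"],
})
drug_register = pd.DataFrame({
    "pid": [1, 1, 3],
    "drug": ["bicalutamide", "sildenafil", "goserelin"],
    "dispense_year": [2011, 2013, 2016],
})

# A left join keeps every NPCR case and attaches any dispensed drugs,
# mirroring how cases are cross-linked to the other registers.
linked = npcr.merge(drug_register, on="pid", how="left")
print(linked)
```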

Figure 2. 'What's Going On' online report on the performance on 10 quality indicators at each urological department, based on data reported to the National Prostate Cancer Register (NPCR) of Sweden. The definition of each indicator is given below. In the far-left column, a colour-coded indicator shows the performance level achieved by the clinic: red = 0 points, lowest category of performance; yellow = 1 point, intermediate level; green = 2 points, highest level. The number of cases for whom the goal was reached (the numerator) and all patients eligible for the specific indicator (the denominator) are displayed in the third column. Horizontal bars show the proportion of men for whom the goal was reached, for the reporting clinic in blue and the national mean in grey; vertical lines mark the cut-offs for the three target levels. A further column shows the clinic's performance over the preceding four years.

Indicator definitions:
1. Proportion of men who were reported to NPCR within 30 days of their cancer diagnosis.
2. Proportion of men who have a named navigator nurse.
3. Proportion of men who had a first outpatient visit due to suspicion of prostate cancer within 50 days of referral.
4. Proportion of men who were informed of their cancer diagnosis within 18 days after prostate biopsy.
5. Proportion of men up to 80 years of age with high-risk cancer who were investigated for bone metastases. High-risk cancer: T1-2 and Gleason 8-10 or PSA 20-50 ng/ml, or T3 and PSA <50 ng/ml.
6. Proportion of men with very-low-risk cancer who were started on active surveillance. Very-low-risk cancer: T1c, Gleason 6 or lower, PSA below 10 ng/ml, PSA density below 0.15 ng/ml, no more than four cores with cancer, no more than 8 mm of cancer in total at biopsy.
7. Proportion of men with high-risk cancer for whom curative treatment can be considered who were discussed at a multidisciplinary team meeting.
8. Proportion of men not older than 75 years with localized high-risk cancer who received curative treatment. Localized high-risk cancer: T1-2 and Gleason 8-10 and/or PSA 20-50 ng/ml, and no metastases.
9. Proportion of men for whom nerve-sparing intention at radical prostatectomy was documented preoperatively.
10. Proportion of men diagnosed with pT2 cancer at prostatectomy who had negative margins, i.e. radically removed tumour.

The dashboard report presents results for ten indicators of the quality of care, taken from the Swedish National Prostate Cancer Guidelines.
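Conceptually, each indicator reduces to a numerator/denominator proportion compared against two cut-offs, with the result colour-coded red, yellow or green as described in the Figure 2 caption. A small illustrative sketch of that scoring logic; the example numbers echo indicator 1 in the figure (45 of 99 patients), while the cut-off values are assumptions for the example, not NPCR's published targets.

```python
def score_indicator(numerator, denominator, low_cutoff, high_cutoff):
    """Return the proportion and a traffic-light score.

    0 (red)    = below the lower cut-off
    1 (yellow) = between the two cut-offs
    2 (green)  = at or above the upper cut-off
    """
    proportion = numerator / denominator
    if proportion >= high_cutoff:
        points = 2
    elif proportion >= low_cutoff:
        points = 1
    else:
        points = 0
    return proportion, points

# Indicator 1, "Reported to NPCR within 30 days": 45 of 99 patients,
# with hypothetical cut-offs at 10% and 70%.
prop, points = score_indicator(45, 99, 0.10, 0.70)
print(f"{prop:.0%} -> {points} point(s)")  # 45% -> 1 point (yellow)
```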

Another public reporting system has recently been set up. RATTEN (www.npcr.se/RATTEN) is an online portal where patients and other stakeholders can access a database and create reports according to their own requirements (Fig 3). At the RATTEN home page, the viewer is presented with a short report and some key data; they can then select the healthcare region, year of diagnosis, age and cancer risk category they desire in order to create a report for their

desired population. The data can then be presented as a table, figure, or heatmap, and users can export the results as a spreadsheet.

The steering committee of the NPCR strongly promotes the use of the What's Going On and RATTEN reporting systems, as they firmly believe that use of these systems by healthcare professionals and patients will improve standards of care in the treatment of prostate cancer. Indeed, improvements have already been seen as a result of the work of NPCR. For example, in its annual report the NPCR has repeatedly drawn attention to the number of men with low-risk prostate cancer who received bone scans at each department.

Figure 3a. Median waiting times (days) from referral to first visit for 2018 diagnoses. Data source: www.npcr.se/RATTEN.

Figure 3b. Diagnostic median waiting time (days) per region for 2018 diagnoses. Data source: www.npcr.se/RATTEN.

In 2008 and 2009, the proportion of patients receiving these unnecessary investigations was as low as 3%, down from 45% in 1998. The use of active surveillance, the recommended management strategy for low-risk cancer, increased from 40% to 74% between 2009 and 2014. Although this is also the recommended approach in many other countries, no other register has reported such high use of active surveillance. A randomized clinical trial showed the superiority of radiotherapy plus androgen deprivation therapy (ADT), as compared to ADT alone, in decreasing mortality. Accordingly, in men aged 70–80 with high-risk, non-metastatic prostate cancer, use of radiotherapy plus ADT increased from 10% in 2001 to almost 50% in 2012 (https://statistik.incanet.se/npcr/; Fig 3c). The steering committee believes the NPCR shares the credit for bringing about these positive changes.

Figure 3c. Radiotherapy treatment in men aged 70-80 years with high risk prostate cancer. Data source: www.npcr.se/RATTEN.




NPCR AS A RESEARCH TOOL
Post-authorisation safety studies (PASS) are investigations of rare adverse events from drugs that are in common use. To perform such studies, data on drug use and outcomes from very large populations are required, and PCBaSe is an ideal basis for PASS. In a paper published in JAMA in 2015, Loeb and colleagues used information from PCBaSe to investigate a purported link between one of the drugs most commonly used to treat erectile dysfunction (Viagra, a phosphodiesterase type 5 inhibitor, PDE5i) and an increased risk of malignant melanoma. They found no clinically meaningful increase in risk: the comprehensive data in PCBaSe, combined with data from the Melanoma Register, showed that men taking Viagra differed from other men in ways that raised their expected risk of melanoma. For example, men taking Viagra had higher levels of education and income, which are associated with more sun exposure and a higher risk of melanoma.

In a more recent study, Robinson et al found little support for an association, reported in some previous publications, between Alzheimer's dementia and the use of GnRH agonists, a class of drugs used by men with advanced prostate cancer. These studies show the value of a comprehensive and readily accessible clinical database with complete data on drug exposure, cancer outcome and other relevant factors such as socioeconomic status.

FUTURE OUTLOOK
The NPCR and PCBaSe continue to be improved. For example, NPCR now distributes electronic patient-reported outcome (PROM) questionnaires before radical radiotherapy and prostatectomy, and at 3 months and 1, 3 and 5 years after treatment. This enables quality control of the results of radical treatments that carry an important risk of erectile dysfunction and urinary incontinence (Fig 4). NPCR has also initiated the Patient-overview Prostate Cancer (PPC), which longitudinally records data on men with metastatic prostate cancer; an invaluable resource, since there are now several drugs that can increase survival and quality of life in these men. At the other end of the disease spectrum, the diagnosis of prostate cancer has changed drastically with the advent of improved magnetic resonance imaging. The effect of this new diagnostic must be documented and assessed, and here too NPCR has an important role to play. NPCR continues to strive towards its goal that every man diagnosed with prostate cancer receives optimal treatment: maximal cancer treatment with minimal side effects. For more information about NPCR, visit http://npcr.se/


Behind the Research

The National Prostate Cancer Register


E: npcr@npcr.se T: +46 18 611 46 61 W: www.npcr.se


Research Objectives

The National Prostate Cancer Register (NPCR) is a tool for quality assurance and quality improvement of health care for men of all ages with prostate cancer in Sweden.

Detail


Pär Stattin, register holder, professor and consultant in urology, Department of Surgical Sciences, Uppsala University Hospital, Entrance 70, 751 85 Uppsala, Sweden.

Bio
NPCR is a nationwide, population-based clinical register that has captured 98% of all newly diagnosed prostate cancer cases in Sweden since 1998. Through the use of the unique Swedish person identity number, NPCR has been linked to other healthcare registers and demographic databases in the Prostate Cancer data Base Sweden (PCBaSe).



Figure 4. NPCR’s electronic Patient Reported Outcome Questionnaire (PROM) allows quality control of results from treatments that carry a risk of erectile dysfunction. This data is fictive.

1) No ED (IIEF-5 score 22-25) 2) Mild ED (IIEF-5 score 17-21) 3) Mild to moderate ED (IIEF-5 score 12-16) 4) Moderate ED (IIEF-5 score 8-11) 5) Severe ED (IIEF-5 score 5-7)
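The legend above maps IIEF-5 scores to five erectile dysfunction (ED) categories, and translates directly into code. A sketch of that binning; the handling of scores outside the 5–25 range is our assumption.

```python
def iief5_category(score: int) -> str:
    """Map an IIEF-5 score (5-25) to the ED category used in Figure 4."""
    if score >= 22:
        return "No ED"
    if score >= 17:
        return "Mild ED"
    if score >= 12:
        return "Mild to moderate ED"
    if score >= 8:
        return "Moderate ED"
    if score >= 5:
        return "Severe ED"
    raise ValueError("IIEF-5 scores range from 5 to 25")

print(iief5_category(19))  # 'Mild ED'
```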





Funding
Swedish Association of Local Authorities and Regions, Prostate Cancer Patient Association, Swedish Cancer Society, Swedish Research Council, Uppsala University

Collaborators
The National Prostate Cancer Register of Sweden (NPCR) steering group: • Pär Stattin (register holder) • Ingela Franck Lissbrant • Anders Widmark • Maria Nyberg • Camilla Thellenberg Karlsson • Ola Bratt • Olof Ståhl • Ove Andrén • Olof Akre • Lennart Åström • Per Fransson • Magnus Törnblom • Eva Johansson • Stefan Carlsson • Mats Lambe • Marie Hjälm-Eriksson • Fredrik Sandin • David Robinson • Karin Hellström • Mats Andén • Calle Waller • Jonas Hugosson • Gert Malmberg.

References
Loeb S, Folkvaljon Y, Lambe M, Robinson D, Garmo H, Ingvar C, Stattin P. (2015). Use of phosphodiesterase type 5 inhibitors for erectile dysfunction and risk of malignant melanoma. JAMA, 313(24):2449-55.
Stattin P, Sandin F, Sandbäck T, Damber JE, Franck Lissbrant I, Robinson D, Bratt O, Lambe M. (2016). Dashboard report on performance on select quality indicators to cancer care providers. Scand J Urol, 50(1):21-8.
Van Hemelrijck M, Garmo H, Wigertz A, Nilsson P, Stattin P. (2016). Cohort profile update: The National Prostate Cancer Register of Sweden and Prostate Cancer data Base – a refined prostate cancer trajectory. Int J Epidemiol, 45(1):73-82.
Stattin P, Sandin F, Hellström K, Franck Lissbrant I. (2017). The National Prostate Cancer Register of Sweden, basis for quality assessment, quality improvement, and research. Tijdschr Urol, 7:50.
Stattin P, Sandin F, Loeb S, Robinson D, Lissbrant IF, Lambe M. (2018). Public online reporting from a nationwide population-based clinical prostate cancer register. BJU Int, 122(1):8-10.
Robinson D, Garmo H, Van Hemelrijck M, Damber JE, Bratt O, Holmberg L, Wahlund LO, Stattin P, Adolfsson J. (2019). Androgen deprivation therapy for prostate cancer and risk of dementia. BJU Int, ePub 2019 Jan 13.

Personal Response
What does the NPCR steering committee see as the important trends in the future of prostate cancer treatment, and how will the NPCR and PCBaSe evolve to meet those needs?
The NPCR steering committee is focusing on the reporting of patient-reported outcome (PROM) questionnaires, as this is a crucial part of the evaluation of the quality of care. PROM data are collected both in conjunction with radical therapy and longitudinally in men with advanced prostate cancer.



Health and Medicine ︱ Dr Heiko Enderling

Towards a quantitative personalised oncology
Dr Heiko Enderling from Moffitt Cancer Center, together with researchers from the Polish Academy of Sciences and the Helmholtz Centre for Infection Research, focuses on mathematical oncology, and radiotherapy in particular, exploring the interconnectivity of metastatic disease through a patient's immune system. Studies show that different radiation doses induce anti-tumour immunity at different strengths. This prompted the research team to develop a variety of mathematical models to calculate the best radiation protocols and the optimum timing of treatment, in their quest to maximally synergise radiation-induced cell death and the subsequent waves of immune responses.

Dr Enderling and a colleague examine the results from their work.


More than half of cancer patients receive radiation treatment at some point during their care. Radiation therapy is prescribed more often than any other oncology treatment.

CURRENT RADIATION PROTOCOLS
Currently, radiation treatment involves delivering the maximum tolerable doses to patients. This protocol was established after many dose-escalation trials aimed at increasing average survival rates. Unfortunately, side effects for patients can include severe radiation toxicities. Radiation, however, is also thought to boost the immune system to help fight cancer. Research suggests that the success of radiotherapy could well be a combination of killing tumour cells by direct cytotoxicity and, perhaps even more importantly, its anti-tumour immunity effect. Current radiation protocols do not focus on enhancing these immune responses. Research studies have shown that various radiation doses stimulate anti-tumour immunity

at different rates. This has led researchers to search for the best possible radiation dose and dose fractionation to optimise the combination of cell death by direct radiation cytotoxicity and the ensuing anti-tumour immune responses. Dr Heiko Enderling and Dr Rachel Howard from the Department of Integrated Mathematical Oncology at H. Lee Moffitt Cancer Center & Research Institute, Florida, together with Dr Jan Poleszczuk from the Polish Academy of Sciences, Poland, and Dr Juan Carlos Lopez Alfonso from the Helmholtz Centre for Infection Research, Germany, are integrating mathematical, biological, and clinical sciences to model, simulate, and predict treatment response for individual patients. They have developed a variety of mathematical models to calculate the best radiation protocols and optimum timing of treatment. Their work is at the forefront of 'virtual trials' to personalise cancer treatment in order to provide optimal adaptive cancer therapy for each patient.

METASTATIC CANCER
The researchers have observed patients with metastatic cancer. This is where the cancer cells have spread from the place where they first formed to another part of the body. The cancer cells break away from the initial tumour and travel through the blood or lymph system, forming a new tumour in another location. This new, metastatic tumour is the same type of cancer as the original tumour, although it is located in a different place. Dr Enderling and his team have studied the response of tumours outside of the radiation field, known as the abscopal effect: in patients with metastatic disease, the irradiation of a single tumour site causes regression of tumours elsewhere in the patient's body as well. As Dr Enderling explains: "This highlights the interconnectivity of metastatic disease through the patient's immune system, and local therapy such as radiotherapy, or even surgery, changes the number of immune cells within the treatment field. Some of the immune cells are killed while others are stimulated. These changes will manifest systemically as the immune system is body-wide."

Dr Enderling engages in teaching as well as research activities.

These exciting observations have prompted the research team to depart from the concept of 'local therapy' and to develop their understanding of radiation as a biological agent that acts on the entire body. The researchers' work shows that in metastatic disease, tumours in different parts of the body participate differently in immune surveillance. Most intriguingly, the interdependence of different cancers in a patient's body through the immune system is challenging our current understanding of cancer metastases. The traditional belief that tumours metastasise when they are big may not always be accurate. Rather, some tumours may be big because they have metastasised. If tumours with a strong immune response shed metastases throughout the body, some cells of the immune system will be re-routed to the distant metastases. This divide-and-conquer strategy allows escape from immune-mediated tumour control. Dr Enderling and his team therefore need to identify the best radiation treatment target so as to induce systemic responses and kill the tumours that were not directly targeted by radiation.

MATHEMATICAL MODELLING APPROACH
Radiation is often prescribed with other therapies such as surgery, chemotherapy, targeted agents and immunotherapy. The vast number of possible radiation doses, together with the different combinations of various therapies, in various orders, at various times, for patients with metastatic disease, makes it impossible to experimentally and clinically test all the possible permutations. The researchers' mathematical modelling approaches use the available preclinical data and outcomes from clinical studies to simulate all possible permutations of treatment protocols. Dr Enderling and his team are using machine learning and optimisation theory concepts to help identify those treatment approaches with the highest likelihood of success. These treatments can then be put forward for subsequent experimental and clinical validation.

The research team have developed a mathematical model that simulates radiation response, radiation-induced immune activation, immune checkpoint blockade therapy and the inter-exchange of activated T cells (a type of white blood cell that kills cancer cells) between tumour sites.

This model differs from the theoretical constructs of predator-prey systems that have gone before. The new model is first compared to experimental data. Based on the experimental setting, tumour sites are modelled at different spatially separated locations, and each one is characterised by a time-dependent volume. At each site, the researchers add data on four co-existing and interacting populations: (1) viable cancer cells, (2) cancer cells dying in a non-immunogenic manner, (3) cancer cells dying in an immunogenic manner and (4) activated tumour-specific cytotoxic T cells (effector cells).
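The team's published models are considerably richer, but the flavour of such a site-level system can be conveyed with a toy set of ordinary differential equations: logistic tumour growth, immune-mediated kill, and effector T cells recruited by dying cells. In this sketch the two dying-cell compartments are collapsed into one, and every parameter value and functional form is an illustrative assumption, not the team's calibrated model.

```python
from scipy.integrate import solve_ivp

def tumour_immune(t, y, r=0.3, K=1000.0, kill=0.05, recruit=0.02, decay=0.1):
    """Toy site-level dynamics.

    y = [viable cancer cells V, dying cells D, effector T cells E],
    all in arbitrary units.
    """
    V, D, E = y
    dV = r * V * (1 - V / K) - kill * E * V  # logistic growth minus immune kill
    dD = kill * E * V - 0.5 * D              # killed cells enter a dying pool, then clear
    dE = recruit * D - decay * E             # dying cells recruit effectors; effectors decay
    return [dV, dD, dE]

# Simulate 100 time units from a small tumour and a modest effector pool.
sol = solve_ivp(tumour_immune, (0.0, 100.0), [10.0, 0.0, 1.0])
V_end, D_end, E_end = sol.y[:, -1]
print(f"t=100: viable={V_end:.1f}, dying={D_end:.2f}, effectors={E_end:.2f}")
```

Fitting such a system to measured tumour volumes is what the data-fitting step described next amounts to: choosing the parameter values that make the simulated curves match the experiments.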

Schematic illustrating how tumours are connected through the patient's blood system: antigen-presenting cells from each tumour reach the lymph nodes, and effector T cells activated against each tumour circulate through the GI tract, spleen, liver, lungs and other organs.


This data-fitting procedure allows the researchers to estimate the model parameters, which are then used to predict responses to dosages that have not already been considered in the experimental setting. Complex interactions occur between tumours and the patient's immune system, the outcomes of which can range from tumour eradication to tumour-immune co-existence (or dormancy) and rapid outgrowth of the cancer cell population. This issue becomes more complex when we learn that cytotoxic T cells can move through the patient's circulatory system. Dr Enderling and his team have coupled mathematical models of local tumour-immune dynamics and systemic T cell trafficking in order to simulate the evolution of tumour and immune cell populations following local radiation and to model the immune interconnectivity. Their results suggest that the presence of a second tumour may either inhibit or promote the growth of the original tumour, depending on their capacity for immune recruitment and the resulting systemic redistribution of T cells. The researchers are able to model the effects of treatments such as surgical resection and radiotherapy to estimate both the decrease in the original tumour volume and the change in the overall tumour burden. The model provided qualitatively similar responses to those reported in the clinical setting. It also suggests that, if metastatic sites are interconnected through the patient's immune system, truly local therapy does not exist.

MATHEMATICAL ONCOLOGY
The numerous clinically feasible radiation doses and dose fractionations render an exhaustive pre-clinical evaluation impossible. Progress in integrated mathematical oncology, however, may make such analyses possible. Dr Enderling and his research team are introducing novel mathematical models, calibrated with experimental data, to make inroads into deciphering the complexity of radiation and immune system synergy. The model estimates the optimal number of radiation fractions and the radiation dose per fraction to obtain the most beneficial systemic immune-mediated tumour responses for clinically relevant total and biologically effective radiation doses (a toy version of this kind of search is sketched below). This will enable a departure from the current protocol of giving patients the maximum tolerable doses, a one-size-fits-all approach, and instead provide patient-specific precision radiation therapy, moving towards personalised medicine.

DISCUSSION
The ground-breaking model simulations are trained on specific experimental datasets in order to highlight the treatment that will have the highest likelihood of success. Importantly, this may contribute to the eradication of the tumour targeted by the radiation, as well as of those tumours and individual cancer cells outside the radiation field, through activation of systemic immunity. The team's model simulations predict that conventional radiation schemas may not be able to produce strong immune-mediated tumour responses. It is plausible that systemic responses would be especially important for metastatic patients with lymph node involvement, circulating tumour cells, or subclinical or undiagnosed metastatic deposits. This will add a further layer of complexity to the modelling, so that the researchers can balance the different biological consequences of radiation therapy in order to induce maximum cell killing while sparing immune cells and maximising subsequent radiation-induced immunity. The researchers are aware that their results to date may be biased by the experimental data used for model calibration. To further develop the model, they will examine more complex metastatic disease distributions, comprising combinations of tumours in different organs, using the proposed framework. Nevertheless, their work so far provides the quantitative foundations to evaluate radiation fractionation protocols that induce immune-mediated systemic anti-tumour responses. With the continuously increasing number of clinical trials combining radiation and various forms of immunotherapy, this model could become an invaluable tool for evaluating clinical responses. Excitingly, it could also be employed to help design subsequent clinical protocols and, eventually, support individual patient treatment planning, providing quantitative personalised oncology.
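The fractionation search referred to above can be illustrated in a few lines: enumerate (fractions, dose-per-fraction) pairs near a target biologically effective dose, using the standard radiobiology formula BED = n·d·(1 + d/(α/β)), and rank them with an immune-response score. The scoring function below is a deliberately crude stand-in for the team's model, and all numerical choices are assumptions for the example.

```python
# Enumerate (fractions, dose-per-fraction) schedules near a target BED
# and rank them with a placeholder immune-response score.
ALPHA_BETA = 10.0  # Gy; a common assumption for tumour tissue

def bed(n_fractions, dose_per_fraction):
    """Biologically effective dose: n * d * (1 + d / (alpha/beta))."""
    return n_fractions * dose_per_fraction * (1 + dose_per_fraction / ALPHA_BETA)

def immune_score(dose_per_fraction):
    """Crude stand-in: immunogenicity peaks at an intermediate dose."""
    return dose_per_fraction * max(0.0, 1 - dose_per_fraction / 20.0)

target_bed, tolerance = 72.0, 3.0
candidates = []
for n in range(1, 41):              # 1 to 40 fractions
    for tenth_gy in range(10, 201): # 1.0 to 20.0 Gy in 0.1 Gy steps
        d = tenth_gy / 10
        if abs(bed(n, d) - target_bed) <= tolerance:
            candidates.append((immune_score(d), n, d))

# Print the three schedules with the highest (toy) immune score.
for score, n, d in sorted(candidates, reverse=True)[:3]:
    print(f"{n} x {d:.1f} Gy  (BED={bed(n, d):.1f} Gy, score={score:.2f})")
```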

The team takes a collaborative approach to their work.


Behind the Research


Dr Heiko Enderling

T: +1.813.745.3562 W: http://labpages2.moffitt.org/enderling/

Research Objectives Positioned at the forefront of ‘virtual’ trials to personalise cancer treatment, the Quantitative Personalised Oncology Lab, housed within Moffitt Cancer Center, integrates mathematical, biological and clinical sciences to model, simulate and predict treatment response for individual patients.

Detail
H. Lee Moffitt Cancer Center & Research Institute, 12902 USF Magnolia Dr., Tampa, FL 33612, USA

Bio
Dr Enderling received his PhD in Mathematical Biology from the University of Dundee before taking up positions as a postdoctoral researcher at Dundee and then postdoctoral fellow at Tufts University School of Medicine. He is currently an Associate Member of the Integrated Mathematical Oncology Department at Moffitt Cancer Center, where his research interests lie in developing patient-specific optimal cancer treatments.

References
Lopez Alfonso, J.C., Poleszczuk, J., Walker, R., Kim, S., Pilon-Thomas, S., Conejo-Garcia, J.J., Soliman, H., Czerniecki, B., Harrison, L.B., Enderling, H. (2019). On the immunological consequences of sequencing cancer radiotherapy and surgery. JCO Clin. Cancer Inform. 3:1-16.
Poleszczuk, J., Enderling, H. (2018). The optimal radiation dose to induce robust systemic anti-tumor immunity. Int J Mol Sci. 19(11).
Walker, R., Poleszczuk, J., Pilon-Thomas, S., Anderson, A., Czerniecki, B., Harrison, L., Moros, E., Enderling, H. (2018). Immune interconnectivity of anatomically distant tumors as a potential mediator of systemic responses to local therapy. Sci. Rep. 8(1), 9474.
Poleszczuk, J., Luddy, K., Chen, L., Lee, J.K., Harrison, L.B., Czerniecki, B.J., Soliman, H., Enderling, H. (2017). Neoadjuvant radiotherapy of early-stage breast cancer and long-term disease-free survival. Breast Cancer Res. 19, 75.
Poleszczuk, J., Luddy, K.A., Prokopiou, S., Robertson-Tessi, M., Moros, E.G., Fishman, M., Djeu, J.Y., Finkelstein, S.E., Enderling, H. (2016). Abscopal benefits of localized radiotherapy depend on activated T cell trafficking and distribution between metastatic lesions. Cancer Res. 76, 1009–1018.

Collaborators • Dr Jan Poleszczuk (Polish Academy of Sciences, Poland) • Dr Rachel Howard (Moffitt Cancer Center) • Dr Juan Carlos Lopez Alfonso (Helmholtz Centre for Infection Research, Germany)

Personal Response What is your next move towards providing optimal adaptive cancer therapy for individual patients? Our mathematical models generate a number of very exciting hypotheses. The next step is to validate mathematical predictions experimentally and ultimately clinically. We have designed intriguing experiments together with our colleagues in cell biology, cancer immunology, radiation biology, and radiation oncology to fully decipher the complex, adaptive dynamics of cancers when confronted with an immune response before, during and after radiation therapy. We have begun to amend ongoing clinical trials to specifically collect biological specimens from biopsies, surgery and blood draws during the long treatment course. These samples will be analysed for the role of all participating players in the tumour-immune ecosystem, and how such roles and interactions might change over time. Then, we might be able to prescribe radiation not only to kill as many cancer cells as possible, but also with the intent to activate robust immune responses to help fight the tumour.



Health and Medicine ︱ Ekaterina Muyzhnek

Exciting advancements in ovarian cancer treatment


Ovarian cancer is treacherous and challenging to treat. With standard treatment unchanged since the 1990s, and newer, costly treatments failing to restrain the tumour's growth over long periods of time, an overhaul in anti-cancer treatment is urgently needed. With fresh insights into the biology and progression of ovarian cancer, Lev Ashraphyan and Vsevolod Kiselev from the Institute of Gynecologic Oncology and Mammology of the Acad. V.I. Kulakov National Medical Research Centre of Obstetrics, Gynecology and Perinatology, the Ministry of Health of Russia, Ekaterina Muyzhnek from the pharmaceutical company IlmixGroup, and Gennady Sukhikh from the Acad. V.I. Kulakov National Medical Research Centre of Obstetrics, Gynecology and Perinatology, the Ministry of Health of Russia, have uncovered a promising new treatment.


Ovarian cancer is treacherous and crafty. Lying low and silently spreading, the tumour thrives undetected. When it is finally identified, the tumour is often advanced and a formidable challenge to treat. Its aggressiveness leaves women with a five-year survival rate of anywhere between 12% and 42%, and it takes 150,000 lives annually worldwide. For this, it is known as the 'Silent Killer'. A promising new therapy for this cruel cancer has been developed by clinicians and scientists including Academician of the Russian Academy of Sciences Prof Lev Ashraphyan and Associate Academician of the Russian Academy of Sciences Prof Vsevolod Kiselev from the Institute of Gynecologic Oncology and Mammology of the Acad. V.I. Kulakov National Medical Research Centre of Obstetrics, Gynecology and Perinatology, the Ministry of Health of Russia; Dr Ekaterina Muyzhnek from the pharmaceutical company IlmixGroup; and Academician of the Russian Academy of Sciences Prof

Gennady Sukhikh from the Acad. V.I. Kulakov National Medical Research Centre of Obstetrics, Gynecology and Perinatology, the Ministry of Health of Russia.

TENACIOUS TUMOURS
Treatment for ovarian cancer has changed little since the late 1990s. The standard treatment involves surgery to remove as much of the tumour as possible (maximal cytoreductive surgery), followed by a cocktail of platinum-based drugs and taxanes (chemotherapy) aimed at killing cancer cells. One would think that attacking the tumour from all angles would wipe it out for good, yet this is not the case. Within 6–24 months after treatment, 60–80% of patients relapse and further cycles of chemotherapy are required. Unfortunately, chemotherapy doesn't kill all the cancer cells: some are resistant to the drugs. When a drug attempts to kill a cancer cell, it has no effect on resistant cancer cells, and they stay alive. Once the resistant cells start multiplying, they

form chemoresistant tumours, which can't be destroyed, leading to medical complications and early deaths. The need for more effective treatments has never been more urgent. In recent years, targeted anti-tumour drugs known as 'Breakthrough Therapies' have been made widely available. With a catchphrase like that, we should all be jumping for joy. For some, targeted therapy has helped extend patients' lives for months; this is viewed as a success, as success is often measured in days. This truly sounds like a breakthrough, so why is there growing scepticism about targeted therapies amongst researchers and clinicians?

Figure 1: Chemical structures of indole-3-carbinol, 3,3'-diindolylmethane, and epigallocatechin-3-gallate.

Imagine this: you have a rubbish bin in your house, but you don't know the rubbish collection days, so you keep filling it with rubbish. As the pile gets bigger you squash it down and pack it in, trying to contain it, knowing that if you don't figure out the collection day the rubbish will one day take over your house. Now imagine the rubbish bin is a tumour and the human trying to control the waste is the targeted therapy. According to the original vision, targeted therapies aimed to decrease tumour size and, eventually, eliminate the tumour. In reality, it turns out that targeted therapies are unable to get rid of the tumour. The main positive effect of targeted drugs is usually less about decreasing tumour size than about achieving prolonged stabilisation of the tumour. Like the human who doesn't know the collection days, if scientists don't figure out how to get rid of the tumour, or rather how to remove the tumour's source (its root system) instead of merely reducing its size, the patient could relapse. Another drawback is that, over time, tumours develop resistance to mono-targeted drugs, just like their predecessors, so the positive effects don't last long. Targeted therapies cannot be used alone, or even in combination with conventional chemotherapy, to effectively fight ovarian cancer. For the next generation of cancer therapies, a better understanding of chemoresistant and recurrent ovarian tumours is needed to help promote patients' survival.

THE NEXT CHAPTER
It's not all doom and gloom. In recent years there has been a huge effort to understand tumour recurrence and resistance. A recent scientific breakthrough in cancer research involves the discovery of a population of hardy cancer cells, known as cancer stem cells, or CSCs. Research into this rare population of immortal cells suggests they are responsible for chemoresistance and recurrent tumours in many different cancers, including ovarian cancer. This discovery has opened up a whole new opportunity for cancer research scientists who, over the last decade, have been searching for and developing new drugs to destroy CSCs. Astonishing findings reveal that combining conventional chemotherapy with ovarian CSC inhibitors is essential for tackling the tumour. In the first round of chemotherapy, the bulk of ovarian tumour cells are eliminated, but hardy, chemoresistant CSCs survive the treatment. While the patient's symptoms disappear, CSCs silently grow into tumours. At recurrence, the patients have large tumours full of CSCs that cannot be destroyed with chemotherapy.

There are also lots of CSCs in ascites (the abnormal build-up of fluid in the abdomen), which commonly occurs in advanced ovarian cancer and at relapse. However, if patients are treated with CSC inhibitors during chemotherapy, this may reduce or prevent recurrence and even promote patient survival. It's about time anti-cancer therapy started a new chapter. Equipped with a greater understanding of tumour chemoresistance and recurrence, scientists and clinicians


can begin to uncover more effective anti-cancer treatments.

BACK TO NATURE
Today, natural agents play a dominant role in the discovery of leads for the development of drugs against many human diseases, especially cancer. Currently, over 60% of anti-cancer drugs are derived in one way or another from natural sources, with the rest being artificially synthesised. In a recent study led by Academicians Lev Ashraphyan and Vsevolod Kiselev and their colleagues, compounds of natural origin were used in a clinical trial to identify their effectiveness in treating




advanced ovarian cancer. The indoles indole-3-carbinol (I3C) and 3,3'-diindolylmethane (DIM), and the flavonoid epigallocatechin-3-gallate (EGCG), are the rock stars of anti-cancer treatment, widely called 'a therapeutic marvel'. Their humble origins in cruciferous vegetables, including sprouts, and green tea have not set them back. These natural substances have been comprehensively studied in various cancers, including ovarian cancer, and have unique anti-cancer properties. In contrast to mono-targeted drugs that hit only one molecular target of the tumour for some time, these compounds possess multiple anti-tumour activities and attack cancers simultaneously from multiple angles. They stop tumour cells from growing and dividing and can even instruct cells to die. They fight off molecules important for fuelling cancer cell growth and spread, including aggressive estrogens, inflammatory molecules, oxidants and growth factors. Incredibly, they are also capable of inhibiting CSCs by inducing their death and blocking key pathways responsible for their aggressiveness and chemoresistance. All these activities, including the development of chemoresistance, are implemented by reversible epigenetic mechanisms that change gene expression, and all three natural substances have demonstrated epigenetic anti-tumour activity. In the study, these substances were used in the form of the medical drug Indinol® Forto and the dietary supplement Promisan® (both MiraxBioPharma, Joint-Stock Company, Russia), manufactured under a special technology according to current GMP standards. Unique properties like these make the agents excellent contenders for the next generation of anti-cancer treatments.

LIGHT IN THE DARK
In a first-of-its-kind clinical trial conducted by the research team, 284 women with untreated advanced ovarian cancer were assigned maintenance therapy with orally administered I3C, as well as I3C with EGCG, as the Indinol® Forto and Promisan® pharmaceutical agents. Maintenance therapy was administered before, during, and for five years after combined treatment (chemotherapy and surgery to remove the tumour). The results were striking. Maintenance therapy with I3C and EGCG in advanced ovarian cancer dramatically increased median overall survival by almost one and a half times over five years, and 5-year overall survival increased from 37% to 67%. What's more, the researchers found patients were living longer without the tumour getting worse, and there was a dramatic reduction in relapses with ascites. A huge benefit was identified from taking agents based on I3C and EGCG along with chemotherapy prior to surgery; in these patients, surgeons were able to completely remove all visible tumours. This was not the case in the control patients, who hadn't received the maintenance treatment. The scientists believe that this radical tumour removal, which is important for improving treatment outcomes, is due to the anti-tumour effects of I3C and EGCG, including their ability to kill CSCs. Prolonged treatment with I3C and EGCG showed a measurable improvement in the patients' quality of life and daily activities. In addition, these drugs do not demonstrate any significant toxicity to patients: they are safe and have no additional side effects.

This safe approach to treating ovarian cancer is also inexpensive and affordable. Current anti-cancer therapies can cost anywhere from $10,000 to $30,000 per month, a huge problem for patients. Oncologists worry about the rising costs of cancer treatment; in the USA there has been a 100-fold increase in cost per patient over the last 50 years. The recent clinical trial with agents based on I3C and EGCG shows huge promise in tackling this problem, allowing people to afford effective anti-cancer treatments. A fresh understanding of the biology of ovarian tumours has allowed Academicians Lev Ashraphyan and Vsevolod Kiselev and their colleagues to develop a promising, effective, safe and affordable treatment to tackle the formidable ovarian cancer. In the future, it's possible these results may be used to develop more effective and safer treatment approaches for other difficult-to-treat cancers.

Behind the Research

Lev Ashraphyan

Vsevolod Kiselev
Ekaterina Muyzhnek
Gennady Sukhikh

E: vkis10@mail.ru E: MuyzhnekEL@ilmixgroup.ru
T: +7 495 765-87-87 (19-74)
W: http://ncagp.ru/ W: http://ncagp.ru/index.php?_t8=537 W: www.ilmixgroup.ru

Figure 2: Multiple anti-cancer activity of indole-3-carbinol (I3C), 3,3’-diindolylmethane (DIM), and epigallocatechin-3-gallate (EGCG)


Research Objectives

The research focuses on an effective and safe approach to maintenance therapy for advanced ovarian cancer that is both inexpensive and affordable.

Detail

Bio
Lev Ashraphyan: A highly regarded gynaecologic oncologist and surgeon in gynaecologic cancers, Honoured Doctor of the Russian Federation. Director, Institute of Gynecologic Oncology and Mammology of the Acad. V.I. Kulakov National Medical Research Centre of Obstetrics, Gynecology and Perinatology, the Ministry of Health of Russia (Moscow). Academician of the Russian Academy of Sciences, Professor, MD. A member of Russian and foreign associations of gynecologists and gynecologic oncologists, and President of the Russian Association of specialists in the treatment of female reproductive system tumours.

Vsevolod Kiselev: A highly regarded clinician-scientist, specialist in molecular biology, molecular medicine and biotechnology. Deputy Director, Institute of Gynaecologic Oncology and Mammology of the Acad. V.I. Kulakov National Medical Research Centre of Obstetrics, Gynaecology and Perinatology, the Ministry of Health of Russia (Moscow). Associate Academician of the Russian Academy of Sciences, Professor, D. Biol. Sci. Twice winner of the Russian Government Award in Science and Engineering, a laureate of the international Prix Galien (Prix Galien Russia), and a winner of the National Prize to the best doctors of Russia, 'Vocation'. Ekaterina Muyzhnek: Chief Scientific Officer, Joint-Stock Company IlmixGroup, Moscow, Russia. PhD (Biochemistry, Biological Faculty, M.V. Lomonosov Moscow State University). A winner of the Russian Academy of Medical Sciences Prize.

References Kiselev V., Ashrafyan L., Muyzhnek E., Gerfanova E., Antonova I., Aleshikova O. & Sarkar F. (2018). A new promising way of maintenance therapy in advanced ovarian cancer: a comparative clinical study. BMC Cancer 18:904.

Personal Response
How long will it take for this treatment to be available to all people with ovarian cancer?
We believe it depends on two points: the speed at which available information on this new treatment approach is distributed in the public space, and the ability of oncologists (who are known as the most conservative clinicians) to accept current ideas and knowledge, and their desire to apply them in clinical practice. Importantly, though, the safety, affordability, simple oral administration, and clinically proven efficacy of the preparations used in the trial could give hope for a long, happy life to a large number of oncology patients and their relatives, even without the participation of physicians, if physicians are not yet ready.

Gennady Sukhikh: Honoured Scientist of the Russian Federation, Director, Acad. V.I. Kulakov National Medical Research Centre of Obstetrics, Gynaecology and Perinatology, the Ministry of Health of Russia (Moscow). Academician of the Russian Academy of Sciences, Professor, MD.

Collaborators
• Evgeniya Gerfanova • Irina Antonova • Olga Aleshikova (doctors who participated in this study)



Health and Medicine ︱ Dr Nandini Dey

Fluorescent imaging sheds new light on apoptosis
Researchers at the Avera Cancer Institute are using state-of-the-art fluorescent imaging to shed new light on the physical and biochemical processes behind apoptosis, also known as 'programmed cell death'. The novel, cost-effective and user-friendly method uses three fluorescent stains, enabling the real-time, structure-function-based identification of apoptosis in live, apoptotic tumour cells. This unique protocol, developed by Dr Nandini Dey in collaboration with Dr Pradip De and with the help of Mrs Jennifer Carlson Aske, is enabling the translational oncology research team to explore the efficacy of anti-cancer drugs in tumour models.


Every second, approximately one million cells in our body are destroyed by apoptosis and replaced with new ones. This process of 'cellular suicide' is critical for maintaining cellular homeostasis and bodily health, and plays an important role in embryological, physiological and pathological conditions. In pathological conditions, cells bypass apoptosis, essentially living past their 'use-by date', and can become tumourigenic or cancerous by virtue of oncogenic alterations. In these cases, chemotherapies and anti-cancer drugs are used to induce apoptosis, preventing further tumour proliferation, with the aim of curing patients. Apoptosis is thus a therapeutic goal of anti-cancer drug treatment. To ensure these drugs are working properly, it's vital that researchers and health practitioners can accurately identify and quantify apoptotic cells. While several methods currently exist, most require expensive and sophisticated equipment and procedures such as cryo-microscopy, incredibly high-resolution microscopy or specialised confocal microscopy. In addition, these methods use processed cell samples rather than live cultures, and cannot show the drugs' mechanisms of action in inducing apoptosis, event by event, in real time, in a standard laboratory set-up.

Changing this are Dr Nandini Dey, Dr Pradip De and Mrs Jennifer Carlson Aske, based at the Avera Cancer Institute, South Dakota. The research group have spent many years studying anti-cancer drugs and their mode of action; in particular, their interest lies in the behaviour of tumour cells in response to treatment, and apoptosis is one of the main areas of focus in their laboratory. The method established by the group presents an event-based identification of the phenomenon of apoptosis in live cells, enabling them to monitor apoptotic cells following treatment with anti-cancer drugs, including conventional chemotherapy drugs, targeted therapy drugs and immunotherapy drugs.

Setting out to improve the identification of apoptosis in live cells, the Avera Cancer Institute team have developed a cost-effective and user-friendly method to visualise the physical and biochemical processes behind apoptosis in live cells in real time. Recently published in Scientific Reports, the team show that this unique tool allows the efficacy of anti-cancer drugs to be determined and has the potential to identify new drug targets. The novelty of this approach is that it identifies three sequential cardinal biochemical, enzymatic, and morphological events of apoptosis in a laboratory-friendly way.

WHAT IS APOPTOSIS?
In a time of crisis, would you sacrifice one to save many? This is what our cells do: they are genetically programmed to 'self-destruct' if they suffer irreparable DNA damage, protecting the body from further harm. This 'programmed cell death' is called apoptosis, and evading it is a hallmark of cancer. Apoptosis involves certain sequential morphological, biochemical, and physiological steps controlled by the dying cell. It is characterised by physical signs of cell shrinkage, rounding, nuclear shrinkage and loss of nuclear membrane integrity, plasma membrane blebbing, nuclear fragmentation and, finally, cell fragmentation.

Live triple-fluorescence in OVK18 cells treated with paclitaxel plus BKM120. Live OVK18 cells were stained with MitoView Blue + NucView488 Casp3 substrate +CF 594 Annexin V (A–D). Insets represent non-apoptotic and apoptotic cells with various combination of staining as mentioned within each inset (a-d for non-treated cells and e-h for treated cells). In separate experiments, the validation of the effect of paclitaxel (2.5 nM) plus BKM120 (1 μM) was tested by the changes in real-time proliferation (E), mitochondrial potential (F), expression of apoptotic markers by Western blot (G) and apoptosis by flow cytometry (H).

Apoptotic cells also undergo a series of biochemical changes, including executioner caspase activation; mitochondrial membrane alterations and release of a protein called cytochrome C; externalisation of phosphatidylserine (an 'eat me' signal for scavenging macrophages) to the cell surface; poly(ADP-ribose) polymerase (PARP) cleavage; and nuclear DNA fragmentation. By tracking these well-regulated and sequential processes, researchers can observe the apoptotic effects of anti-cancer drug treatments on tumour cells and potentially identify new drug targets. This is where the research of Dr Dey and her group comes in.

APOPTOSIS IN A NEW LIGHT
Using a fluorescence microscope and three fluorescent dyes (fluorophores), the Avera Cancer Institute team have coupled three sequentially occurring, rate-limiting features of apoptosis in live cells. In their 'triple-fluorescence staining', they can observe the enzymatic activity of executioner caspase 3 (green fluorophore), the external presentation of phosphatidylserine (red fluorophore) and altered mitochondrial function (blue fluorophore) in live ovarian cancer cells.
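The readout of the triple stain amounts to a per-cell decision over three channels. A simplified sketch of how normalised intensities might be classified; the dye names follow the figure caption, but the threshold and the rule set (including the 'early'/'late' labels) are illustrative assumptions rather than the published protocol.

```python
def classify_cell(blue, green, red, threshold=0.5):
    """Classify one cell from normalised channel intensities (0-1).

    blue  = MitoView Blue (intact mitochondrial potential)
    green = NucView 488 caspase 3 substrate (executioner caspase 3 activity)
    red   = CF 594 Annexin V (externalised phosphatidylserine, PS)
    """
    caspase_active = green >= threshold
    ps_exposed = red >= threshold
    mito_intact = blue >= threshold

    if not caspase_active and not ps_exposed and mito_intact:
        return "non-apoptotic"
    if caspase_active and ps_exposed:
        return "apoptotic (caspase 3 active, PS exposed)"
    if caspase_active:
        return "early apoptotic (caspase 3 active)"
    if ps_exposed:
        return "late apoptotic (PS exposed)"
    return "indeterminate"

print(classify_cell(blue=0.8, green=0.1, red=0.1))  # non-apoptotic
print(classify_cell(blue=0.2, green=0.7, red=0.6))  # apoptotic (...)
```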

To induce apoptosis, the team treated two ovarian cancer cell lines (OVK18 and A2780) with the anti-cancer drugs paclitaxel (a conventional chemotherapy drug) and BKM120 (a targeted pan-PI3K inhibitor), either alone or in combination. The cells' fluorescent patterns were then compared with those of untreated cells.

Upon examination under the microscope, the researchers found that most untreated cells fluoresced bright blue, since their mitochondria were still intact. In contrast, treated cells fluoresced green and/or red. This was expected, given that caspase 3 activation and phosphatidylserine presentation only occur during apoptosis.

VALIDATING THE CONCEPT
To ensure that the anti-cancer drugs were responsible for these apoptotic effects, the researchers treated the ovarian cancer cells with different doses of paclitaxel, or of paclitaxel plus BKM120, and imaged them at different magnifications and at different time points.


Figure: Ki67 and cleaved Caspase 3 double staining of fixed MDA-MB-468 cells, control and treated (ABT-888 + carboplatin + GDC-0980) for 72 hrs. Treated and control cells were fixed and double-stained for Ki67 (Alexa Fluor 647) and cleaved Caspase 3 (Alexa Fluor 488). Confocal pictures were captured from three-dimensional Z-sections (XYZ coordinates are included in the picture), and projected images were reconstituted into a 3D movie. Actin (Alexa Fluor 555 Phalloidin) was used as a counterstain in the immunofluorescence images.



Unsurprisingly, the proliferation of paclitaxel-treated OVK18 cells decreased or stopped, depending on the dose given, with apoptosis setting in after 48 hours alongside the simultaneous expression of apoptotic markers. Furthermore, these samples had a larger number of dead cells overall, compared to the untreated samples. Similar results were seen in OVK18 cells treated with paclitaxel plus BKM120, with apoptosis setting in at 48 hours; this time, however, changes in mitochondrial activity first appeared at 24 hours. Paclitaxel-treated A2780 cells took the longest to reach an apoptotic state, with the process occurring 48 and 72 hours after treatment, while changes in mitochondrial activity occurred 24 and 48 hours post-treatment. This sample also had a larger number of dead cells than the untreated samples, confirming that paclitaxel causes cell death and thereby verifying the microscopy results. Thus, using this protocol, the group successfully identified the temporal, event-by-event pattern of drug action in inducing apoptosis.

IDENTIFYING LIVE AND DEAD CELLS
In order to identify living and dead cells, Dr Dey and her group treated their samples with a new set of fluorophores. Dead cells fluoresced red while live cells were green. The mitochondrial potential was measured by flow cytometry, while live and dead cells were read under the fluorescence microscope. Fluorescent imaging comparisons using the 'live and dead cell assay' showed no overlap between red (dead) cells and green (live) cells in treated and untreated samples. In contrast, fluorescent imaging comparisons using the 'triple-fluorescence staining assay' showed overlap of red (Annexin V-positive apoptotic) cells and green (active caspase 3-positive apoptotic) cells in treated and untreated samples; this is because the green cells and red cells were at different stages of apoptosis. No overlap was observed between blue and red and/or green cells, as is to be expected, since blue cells with active mitochondria were less likely to exhibit apoptosis, while red and/or green cells were less likely to exhibit functional blue mitochondria. Interestingly, however, a few drug-treated cells still displayed mitochondrial activity (blue fluorescence).

WHY IS THIS IMPORTANT?
While not all of these biochemical processes are specific to apoptosis, and apoptotic events are stage-, time- and stimuli-dependent, the strength of the triple-fluorescence staining method lies in its ease of use and its ability to simultaneously stain three critical morphological and biochemical apoptotic steps, making it much easier to accurately identify apoptotic versus non-apoptotic cells. In comparison, most existing techniques rely on one apoptotic (but not necessarily apoptosis-specific) characteristic to identify and quantify these cells. Furthermore, triple-fluorescence staining can show apoptotic processes in live cells in real time, allowing researchers to see anti-cancer drugs' mechanisms of action and their efficacy. This laboratory-friendly method can be standardised with minimal laboratory equipment and resources, making it relatively cheap, accessible and applicable to any relevant drug treatment. Notably, it has already been used to observe the effects of different drug combinations on other tumour cell lines. The strength of this assay lies in its ability to identify the modes of action of pro-apoptotic drugs in an event-based manner.

Schematic showing salient features of apoptosis used in triple-fluorescence staining of live cells: activation of initiator caspases; activation of executioner caspases; activated caspase and its products; membrane asymmetry due to flipped phosphatidylserine; loss of mitochondrial potential; loss of cellular esterase activity; and damaged membranes allowing EthD-1 to bind to nucleic acids.

Behind the Research

Dr Nandini Dey



E: Nandini.Dey@avera.org T: USA-605-322-3298

Damaged Membranes & EthD-1 Binds to Buclecic Acids

Schematic showing salient features of apoptosis used in triple-fluorescence staining of live cells.

From left to right: Dr Pradip De, Dr Nandini Dey, Mrs Jennifer Carlson Aske

136

Behind the Research

Activation of Initiator Caspases

still displayed mitochondrial activity (blue fluorescence). WHY IS THIS IMPORTANT? While not all of these biochemical processes are specific to apoptosis and apoptotic events are stage-, timeand stimuli-dependent, the strength of the triple-fluorescence staining method lies in its ease of use and its ability to simultaneously stain three critical morphological and biochemical apoptotic steps – making it much easier to accurately identify apoptotic versus non-apoptotic cells. In comparison, most existing techniques rely on one apoptotic (but not necessarily apoptosisspecific) characteristic to identify and quantify these cells. Furthermore, triple-fluorescence staining can show apoptotic processes in live cells in real time – allowing researchers to see anticancer drugs’ mechanisms of action, and their efficacy. This laboratory friendly method can be standardized with minimalistic laboratory equipment and resources, making it a relatively cheap and accessible method that can be applied to any relevant drug treatment. Notably, it has already been used to observe the effects of different drug combinations on other tumour cell lines. The strength of this assay lies in its ability to identify modes of action of pro-apoptotic drugs in an eventbased manner.

Dr Pradip De
E: Pradip.De@avera.org T: USA-605-322-3297

Jennifer Carlson Aske
E: Jennifer.Aske@avera.org T: USA-605-322-3289

Research Objectives
The Avera Cancer Institute team studies anti-cancer drugs and their modes of action in the context of tumour cell behaviour, including cell death by apoptosis.

Detail
Dr Nandini Dey, Translational Oncology Laboratory, Avera Cancer Institute, 1000 E 23rd Street, Sioux Falls, SD 57105, USA (3rd Floor of Prairie Center, Suite #3611, Room #3610) & Department of Internal Medicine, SSOM, University of South Dakota, Sioux Falls, SD 57105, USA

Dr Pradip De, Translational Oncology Laboratory, Avera Cancer Institute, 1000 E 23rd Street, Sioux Falls, SD 57105, USA (3rd Floor of Prairie Center, Suite #3611, Room #3609) & Department of Internal Medicine, SSOM, University of South Dakota, Sioux Falls, SD 57105, USA & Consultant, Viviphi, Greenwood Village, CO 80112, USA

Jennifer Carlson Aske, Translational Oncology Laboratory, Genomic Oncology Institute, 1301 South Cliff Avenue, Plaza 2, Suite 703, Sioux Falls, SD 57105, USA

Bio
Dr Nandini Dey, M.S., PhD is a Senior Scientist and Director of the Translational Oncology Laboratory, Avera Cancer Institute, USA. She studies the biology of tumour cells in solid tumours, with a special interest in cell phenotypes. Dr Dey is a member of the Royal Society of Medicine. Dr Pradip De, M.S., PhD is a Senior Scientist in the laboratory, with 25 years' experience in signal transduction biology. Together, Dey and De have over 50 years' experience in translational research and have authored or co-authored over 100 publications. Mrs Jennifer Carlson Aske, M.S. is the Research Laboratory Supervisor, with 10 years of research experience.

References
De, P., Carlson, J., Leyland-Jones, B., Williams, C. and Dey, N. (2018). Triple fluorescence staining to evaluate mechanism-based apoptosis following chemotherapeutic and targeted anti-cancer drugs in live tumour cells. Sci Rep, 8, 13192-13202. Available at: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6123436/ [Accessed 18.03.2019]

Ulukaya, E., Acilan, C. and Yilmaz, Y. (2011). Apoptosis: why and how does it occur in biology? Cell Biochem and Function, 29(6), 468-480. Available at: https://onlinelibrary.wiley.com/doi/abs/10.1002/cbf.1774 [Accessed 18.03.2019]

Personal Response
Why would some cells show red and blue staining in the 'live and dead cell assay'?
The 'live and dead cell assay' was carried out to show live cells as green and dead cells as red; there is no blue colour in that assay. During 'triple-fluorescence staining', however, a few cells rarely showed red and blue staining under treatment conditions because, during the initial stage of apoptosis, the mitochondria are still functional (blue) while the cell membrane has just started to show apoptotic features (red). These are rare events and depend on the drug used to induce apoptosis. In culture conditions, not all cells are at the same stage of apoptosis, and the cells were not synchronised for their stages of the cell cycle.

What do you see for the future of this research?
Considering the laboratory-friendly nature of 'triple-fluorescence staining' in live cells, we expect that this method will become a commonly used laboratory tool to study drug effects in cancer research. Although the protocol is currently validated and used in translational oncology, we think that it will be meaningfully adapted for studies of apoptosis in other fields. For example, apoptosis is a critical event in developmental biology, regenerative medicine, and stem cell research. We envision that studies of Parkinson's disease, Alzheimer's disease, myocardial ischaemia, myocardial infarction, and trauma-induced apoptosis will have the opportunity to use this protocol effectively.

Funding Avera Cancer Institute, Sioux Falls, SD, USA



Health and Medicine | Dr Jeong-Sun Seo

RNA sequencing reveals secrets of skin aging

As we age, so does our skin. One of the factors that affects how quickly skin appears to age is exposure to UV light. Dr Jeong-Sun Seo of Seoul National University Bundang Hospital, Republic of Korea, is investigating how UV light affects skin aging at the genetic level. Using RNA sequencing data, Dr Seo and his team demonstrate that some signs of skin aging, such as the development of wrinkles, are directly linked to over-exposure to UV light.

Our skin has to tolerate a great deal. Exposure to sun, wind and rain, harsh soaps, pore-clogging cosmetics, a poor diet; these are just a few of the challenges our skin might endure throughout a lifetime. Our skin also accounts for a large part of our appearance and is, for many people, where the most obvious signs of aging can be seen. Our skin condition reflects our apparent age; no wonder, then, that skin care is such a huge market.

Many factors are known to affect the way that skin ages. Some of these factors, such as genetic qualities, are intrinsic to the body, while others are extrinsic. Of all the extrinsic factors, perhaps the one that has received the most publicity is excessive exposure to sunlight. The harmful part of sunlight is ultraviolet (UV) light, which is also the type of light used in tanning beds and lamps. Exposure to UV light is not only an important factor in skin aging: excessive exposure to UV also increases the risk of skin cancer. The dangers of sunburn and the importance of sunscreen are well known, but the exact ways in which UV damages skin are still under investigation.

Skin offers a unique opportunity to investigate the effects of UV light on aging. iconogenic/Shutterstock.com


UV AND SKIN AGING
Previous research has shown that UV light affects the aging process of skin by causing changes in gene expression (when and how a gene produces a protein). Exactly how this happens, however, is not well understood. Using state-of-the-art technology, Dr Seo and his team can explore genetic information to investigate how skin ages, with the aim of reaching a better understanding of this process. In particular, the team looked at the various effects of UV light on skin, a process known as photo-aging.

ANALYSING THE TRANSCRIPTOME
Although there has been some previous research on the effect of UV light on skin aging, it has focussed on the behaviour of just a few genes. While this is useful, genes are known to interact with one another in myriad ways. Focusing on a few individual genes therefore risks missing important genetic pathways that could influence the aging process. To avoid this problem, Dr Seo utilised a much broader type of genetic information to investigate gene expression changes in skin. Ribonucleic acid (RNA) is produced from DNA during the process of protein synthesis. The complete set of RNA transcripts produced by the genome of a particular species – in this case, humans – is known as the transcriptome. Analysing

the transcriptome therefore allows researchers to characterise the genetic activity of a tissue, such as skin, as a whole; it shows which genes are active (and how active they are) and which are not.

SKIN CHANGES AND UV EXPOSURE
Skin offers a unique opportunity to investigate the effects of UV light on aging. Some parts of our skin are frequently exposed to sunlight, while other parts are usually kept covered. In their study, Dr Seo and his colleagues compared UV-exposed skin from the lower leg with UV-protected skin from the suprapubic region (the area between the navel and the pubic bone).
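To give a flavour of the kind of exposed-versus-protected comparison this enables, the sketch below flags genes as up- or down-regulated from per-sample expression values. It is a minimal illustration rather than the team's actual analysis pipeline: the expression values are randomly generated placeholders, and only the gene names are borrowed from this article (the specific MMP member shown is hypothetical).

```python
# Minimal sketch (not the study's actual pipeline): flag up-/down-regulated
# genes between UV-exposed and UV-protected samples. Expression values are
# random placeholders; gene names are borrowed from the article.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
genes = ["MMP1", "LIPN", "LIPK", "SMPD3"]  # the MMP member shown is illustrative
exposed = rng.lognormal(mean=2.0, sigma=0.3, size=(len(genes), 300))
protected = rng.lognormal(mean=1.8, sigma=0.3, size=(len(genes), 300))

for i, gene in enumerate(genes):
    # Fold change of mean expression, and a t-test on log-transformed values
    log2fc = np.log2(exposed[i].mean() / protected[i].mean())
    _, p = stats.ttest_ind(np.log2(exposed[i]), np.log2(protected[i]))
    direction = "up" if log2fc > 0 else "down"
    print(f"{gene}: log2FC {log2fc:+.2f} ({direction}-regulated), p = {p:.1e}")
```

A real transcriptome analysis would, among other things, correct for testing many thousands of genes at once rather than four.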

Dr Seo's team used transcriptomic RNA sequencing data to examine the behaviour of a very large number of different genes at the same time. In total, the researchers assessed genetic data from almost 600 samples of skin. This in-depth investigation allowed the team to pinpoint the actual biological changes caused by exposure of skin to UV light. Their results showed a clear difference between the transcriptomes of the lower-leg and suprapubic skin samples, suggesting that gene activity is notably different in these two areas of skin. This difference could be due to variation in exposure to UV light.

ANGIOGENESIS AND WRINKLE FORMATION
Wrinkles are probably the most obvious sign of aging in skin. Angiogenesis – the formation of new blood vessels – is known to play a significant role in the development of wrinkles. In this study, the researchers found that certain genes involved in angiogenesis were up-regulated (became more active) in photo-damaged skin.




In fact, there was a significant difference in the activity of these genes in the skin from the lower leg when compared with skin from the suprapubic region. Greater angiogenesis means a higher rate of development of new blood vessels, so this result suggests that exposure to UV light directly influences wrinkle formation.

Aged skin loses elasticity compared to younger skin, due to broken elastic fibres and decreased collagen and hyaluronic acid.

As skin ages, it tends to lose elasticity, which can contribute to the development of wrinkles. This is partly due to a falling amount of collagen, an important connective protein, in the skin. Dr Seo and his colleagues were able to show that a certain class of genes, known as MMP genes, are up-regulated in skin damaged by exposure to UV light. In normal cells, MMP genes are known to play a role in breaking down collagen. When MMP genes become more active, collagen is broken down faster, leading to reduced elasticity in the skin.

AGE-RELATED SKIN CHANGES
Some skin changes occur with aging whether or not the skin has been exposed to excessive sunlight. Previous work has shown that wound healing can be between 20% and 60% slower in older skin. Dr Seo found that the rate of wound healing decreased with aging in both types of skin. A set of genes known to be involved in wound healing were down-regulated (became less active) in both UV-exposed and UV-protected skin. This result suggests that slower healing is an inevitable consequence of aging, whether or not skin has been exposed to a lot of sunlight.

Dr Seo's team used transcriptomic RNA-sequencing data to investigate the genetic pathways of aging.

UV LEAVES SKIN VULNERABLE
Lipid metabolism, the process by which fats are either made or broken down in cells, is one of the factors vital to maintaining the function of the skin as a protective barrier. When lipid metabolism is impaired, the barrier function becomes less effective, leaving the skin more vulnerable to damage. Dr Seo found that lipid metabolism is dramatically reduced by photo-aging, suggesting that over-exposure to UV light can leave skin lacking protection. Three lipid metabolism-related genes in particular (LIPN, LIPK and SMPD3) were found to be important in this process. Similarly, the regulation of water loss in skin was found to decrease with age. This could adversely affect both the condition and the appearance of the skin.

THE FUTURE
In many parts of the world, life expectancies have been steadily increasing. This has triggered increased research interest in the various aspects of aging. In the field of skin aging research, Dr Seo's team is the first to use transcriptomic RNA-sequencing data to investigate the genetic pathways of aging. Photo-aging due to excessive exposure to sunlight can only be studied in the skin; it is therefore crucial that the consequences of photo-damage to skin are properly understood. This work is an important step in that direction. In future, the knowledge gained from Dr Seo's work could be used to develop innovative treatments for skin aging, increasing health and well-being in old age. In the meantime, his work confirms a vital message, one that is well known but bears repeating: wear sunscreen.

Behind the Research

A leading scientist in the field of genomics, Dr Jeong-Sun Seo has broadened our understanding of human variation in cancer and rare diseases.

Dr Jeong-Sun Seo
E: jeongsunseo@gmail.com T: +82 31 600 3011

Research Objectives
Dr Seo's team uses transcriptomic RNA-sequencing data to investigate the genetic pathways of skin aging.

Detail
Precision Medicine Center, Seoul National University Bundang Hospital, Dolma-ro 172, Seongnam Bundang-gu, Gyeonggi-do 13605, Republic of Korea

Bio
Dr Jeong-Sun Seo is a Distinguished Professor at Seoul National University Bundang Hospital, where he is also Director of the Precision Medicine Center. He established Macrogen Inc., one of the first biotechnology companies in Korea, 22 years ago. Recently, his team built the most contiguous reference genome, AK1, using de novo assembly and phasing, taking another step forward for precision medicine in Northeast Asia.

References
Cho, B, Yoo, S & Seo, J. (2018). Signatures of photo-aging and intrinsic aging in skin were revealed by transcriptome network analysis. Aging, 10, 1-14.

Personal Response
How can transcriptomic RNA sequencing be used in other areas of aging research?
With advanced medical care and technology, the elderly population has greatly increased compared to several decades ago. Although aging is not a pathological process, it still influences some diseases. If the transcriptome of each organ is fully analysed, we may be able to predict the occurrence of diseases by assessing levels of gene expression, or use it as a tool to find a therapeutic target for each disease.



Health & Medicine ︱ Dr Wen-Harn Pan

ADHD and its comorbidities: Implications for management

Dr Wen-Harn Pan of the Institute of Biomedical Sciences, Academia Sinica, is interested in the relationship between nutrition and health. One of her most recent projects, published last year, investigated attention deficit hyperactivity disorder, asking whether risk factors such as allergy and anaemia could be modified to improve the ways in which the disorder is both identified and treated.

Allergies and asthma may be associated with ADHD, as the immune responses resulting from allergic diseases may affect the central nervous system.

Attention deficit hyperactivity disorder (ADHD) appears to be becoming more prevalent in children. Although it is not clear whether this is due to an increase in the incidence of the disorder or whether better diagnostic tools simply detect more cases, ADHD can have substantial impacts on children and their families. The work of Dr Wen-Harn Pan at the Institute of Biomedical Sciences, Academia Sinica, explores some of the risk factors associated with the disorder, with the ultimate aim of improving its diagnosis, prevention and management.

ADHD is one of the most commonly diagnosed neuropsychiatric disorders in children and is characterised by a lack of impulse control, inattention and hyperactivity. It can have a significant impact on a child's performance at school, on family relationships, and on social interactions between the child and their friends and family. Interestingly, ADHD is associated with several comorbidities, defined as additional conditions occurring alongside the main disorder, including allergic disease, anaemia (low levels of haemoglobin, the red pigment in red blood cells that carries oxygen), and low levels of serotonin (a neurotransmitter sometimes called the 'happy chemical'). If this association can be better understood, then there may be potential to modify some of these risk factors, and in turn to improve quality of life for children diagnosed with ADHD.

The presence of four biochemical factors was associated with a 6-7-fold increase in the risk of having ADHD.

UNDERSTANDING ADHD FURTHER
Dr Pan and her colleagues recruited 216 children diagnosed with ADHD, and 216 children without ADHD who were similar in terms of age, sex, height and weight, from 31 schools in Taiwan. The students were aged between 8 and 10 years old, and a diagnosis of ADHD was confirmed by a child psychologist. The International Study of Asthma and Allergies in Childhood (ISAAC)


questionnaire was used to measure signs of allergy, and blood samples were analysed for white blood cell counts, immunoglobulin E (IgE) levels and serotonin levels. The researchers measured IgE because it is an antibody that is often elevated in allergic diseases. In addition to IgE, a type of white blood cell, the eosinophil, is often found in allergic diseases, as eosinophils respond to tissue damage and inflammation. The most common cause of anaemia, defined as a reduction in haemoglobin concentration, is a lack of iron. Indeed, low levels of iron have previously been associated with behavioural problems in children. Understanding the role of anaemia is also important in treating ADHD, as previous studies have shown that iron deficiency may reduce the effectiveness of drugs used to treat the condition.

Serotonin was included in Dr Pan's study because mutations in the serotonin gene have been shown to influence impulsivity in humans, and because there is a link between serotonin levels and immune responses, potentially connecting serotonin, allergy and ADHD. Previous studies, however, have not been able to show an association between ADHD and serotonin levels; this study, with its larger sample size, therefore provides valuable evidence on that relationship. Other neurotransmitters, such as dopamine or norepinephrine, may also be involved in ADHD but were not measured in this study. The research team wanted to investigate whether common comorbidities or biochemical factors are singly or additively associated with an increased risk of ADHD, as there may be a cumulative effect of risk factors.

Dr Pan's study showed that children with signs of allergy were twice as likely to have ADHD.

FINDINGS FROM THE STUDY
Blood samples from the students were analysed and interpreted in combination with information on family education

levels, family income, lifestyle, and the disease history of students and their parents, collected using a questionnaire. Following statistical analysis of the results, Dr Pan and her team identified four indicators – haemoglobin, eosinophil count, IgE and serotonin – as being associated with ADHD. The researchers found that children with signs of allergy were twice as likely to have ADHD, and that those with ADHD often had low levels of haemoglobin, a marker of anaemia, as well as lower levels of serotonin but high IgE and eosinophil levels.

One of the reasons that allergies such as allergic rhinitis (hay fever-like symptoms), atopic dermatitis (eczema) and asthma may be associated with ADHD is that the immune responses resulting from allergic diseases may affect the central nervous system. As a result, children may be more likely to develop or aggravate neurological disorders.

[Figure: Effects of the number of presenting biochemical risk factors – allergic disease, anaemia, low serotonin, high IgE and high eosinophil count – on the risk ratio of ADHD. Risk ratios rise from 1 (reference) through 1.87, 2.90* and 4.47*** to 6.53***. Numbers denoted above the bars are the exact risk ratios of the corresponding groups; *p < 0.05, **p < 0.01, ***p < 0.001.]
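For readers curious about the arithmetic behind such a chart, the snippet below works through a generic risk-ratio calculation with a 95% confidence interval. The counts are invented purely for illustration (chosen to give a ratio of about two, echoing the allergy finding); they are not the study's data, and the study's own statistical model may differ.

```python
# Hypothetical worked example of a risk-ratio calculation of the kind
# summarised in the figure. All counts are invented for illustration.
import math

def risk_ratio(a: int, b: int, c: int, d: int):
    """a/b: ADHD / no ADHD with the factor; c/d: ADHD / no ADHD without it."""
    rr = (a / (a + b)) / (c / (c + d))
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))  # standard error of log(RR)
    lo, hi = (math.exp(math.log(rr) + s * 1.96 * se) for s in (-1, 1))
    return rr, lo, hi

# Invented counts: 180 children with allergic signs, 612 without
rr, lo, hi = risk_ratio(a=80, b=100, c=136, d=476)
print(f"risk ratio {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Running this prints 'risk ratio 2.00 (95% CI 1.60-2.49)'; in other words, in this made-up example, children presenting the factor are about twice as likely to have ADHD.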




In addition, the treatment of allergies has been associated with an improvement in ADHD symptoms. Overall, the greater the number of comorbidities a child had, the greater the chance that they would be diagnosed with ADHD: the presence of four biochemical factors was associated with a 6-7-fold increase in the risk of having ADHD.

The statistical analysis also considered social factors. The study revealed that children with ADHD were more likely to have parents with lower education levels and unstable incomes from manual jobs (potentially leading to high levels of stress), to have siblings with ADHD, and to have had health problems during or after birth. In addition, children with ADHD tended to have mothers who were current smokers, who smoked during pregnancy, or who had experienced miscarriage symptoms in the first trimester of pregnancy. This is interesting, as it suggests there may be sociological as well as biological influences on the occurrence of ADHD.

Whilst it is important to remember that these relationships may not be causal – allergy does not cause ADHD, or vice versa – the findings of Dr Pan's research increase our understanding of the underlying mechanisms of ADHD and offer potential novel therapeutic and treatment options.

WHAT DO THESE FINDINGS MEAN?
This study is one of the first to examine the relationship between ADHD and allergies in combination with indicators of inflammation and immune responses. The findings of Dr Pan and her team suggest that the causes of ADHD are multidimensional, and that other conditions, such as allergy, anaemia, inflammation and low levels of serotonin, may have underlying causes similar to those of ADHD. Children with ADHD had more allergic conditions than children without, although the causal relationship between the two is not currently clear. Dr Pan proposes that allergic reactions result in inflammation, and that the immune reactions may stimulate a neuroimmune mechanism in the brain and affect emotional and behavioural control. Iron is also involved in neurotransmitter production, so iron deficiency could be the link between anaemia and ADHD. Whilst the blood samples from children with ADHD did not reach the threshold criteria for anaemia, a trend towards lower levels of haemoglobin was observed.

Although this study was much larger than previous studies, the finding that lower levels of serotonin were associated with ADHD remains controversial, suggesting that further work is needed in this area to confirm the relationship. In particular, more longitudinal studies, observing subjects repeatedly over a longer time period, and intervention studies are needed to determine whether ADHD risk could be modified by managing allergic symptoms, haemoglobin levels and serotonin levels. Limitations of the study include the lack of causal relationships and the oversimplification of ADHD and allergy diagnoses; the findings are, however, another piece in the puzzle of ADHD's causal pies.

Routine screening for these risk factors is straightforward and can be easily performed; the findings of Dr Pan's study can therefore play an important role in devising comprehensive diagnostic and treatment strategies for ADHD in the future.

Behind the Research

Dr Pan's research increases our understanding of the underlying mechanisms of ADHD and may offer novel therapy and treatment options.

Dr Wen-Harn Pan
E: pan@ibms.sinica.edu.tw T: +886 922423324

Research Objectives
Dr Pan's research is focused on nutritional epidemiology, food-disease omics, dietary therapy for geriatric diseases, and worksite health promotion programs.

Detail
N141, Institute of Biomedical Sciences, Academia Sinica, No.128, Sec. 2, Academia Rd., Taipei, Taiwan 11529

References
Wang, LJ, Yu, YH, Fu, ML, Yeh, WT, Hsu, JL, Yang, YH, Chen, WJ, Chiang, BL & Pan, WH. (2018). Attention deficit-hyperactivity disorder is associated with allergic symptoms and low levels of hemoglobin and serotonin. Scientific Reports, 8:10229. DOI: 10.1038/s41598-018-28702-5.

Bio
Dr Wen-Harn Pan is currently a Distinguished Research Professor at the Institute of Biomedical Sciences, Academia Sinica, Taiwan. She is a nutrition epidemiologist and has led the Nutrition and Health Survey in Taiwan since 1992. Dr Pan won the Lifetime Achievement Award from the Asia Pacific Society of Clinical Nutrition and is a Fellow of the American Heart Association (FAHA).

Personal Response
How will you take the findings of this study forward, and do you have other studies planned to further investigate the association between ADHD and other clinical indicators?
My colleagues and I are undertaking a prospective study to investigate whether children with allergic diseases will develop ADHD. Furthermore, since poor quality of diet has been linked to allergic disease and poor school performance, I am investigating the association between ADHD and dietary and nutritional factors.

Funding
Academia Sinica, Taiwan; National Science Council, Taiwan

Collaborators
• Liang-Jen Wang, MD
• Min-Lin Fu, PhD




COMMUNICATION

Apollo: Fifty years on

This year marks the 50th anniversary of the first historic Moon landings. Not only did the groundbreaking innovations that made Apollo 11 such a success leave a legacy of beneficial spin-off technologies that are used in our daily lives, the iconic mission also reshaped how we see the Earth.

July 20, 1969 saw NASA's famous achievement, the moment that Neil Armstrong and Edwin 'Buzz' Aldrin first set foot on our very own Moon. Five decades after Neil Armstrong took his giant leap for mankind, people are still in orbit. Launched in 1998, the International Space Station is still going strong. And excitingly, we're seeing a new era of growth in the commercial space sector, pushing technological advances ever more rapidly to enable more people to get to space.

LUNAR EXPLORATION
Cosmonaut Yuri Gagarin became the first man in space in April 1961, closely followed four months later by fellow Russian Gherman Titov. The 20th-century 'space race' saw the USA's Apollo 8 astronauts orbit the Moon in '68, before the iconic Apollo 11 mission. Apollo 17's Harrison Schmitt was the last man on the Moon, 47 years ago. Over time, space endeavours became less competitive: the first joint US-Soviet spaceflight, 'Apollo-Soyuz' in 1975, paved the way for the International Space Station programme.

MOONSTRUCK
Around 240,000 miles away from Earth, our very own satellite is all too familiar a presence. We take it for granted. But life would be quite different without it. Here on Earth we're closely attuned to the Moon's phases: our calendar months are roughly the same length of time as it takes to go from one full moon to the next (29.5 days). Our planet is subject to the pull of the Moon's gravity, causing tides – the predictable rises and falls in sea levels. Many argue that life on Earth would never have come about if it weren't for the Moon.

ONE GIANT LEAP
We're all familiar with Neil Armstrong's 'One Small Step' speech. But not so many are aware of the unpractised first words of the second man on the Moon. As Buzz Aldrin stepped off the lunar lander, he radioed back to mission control: 'Beautiful, beautiful. Magnificent desolation.' One commentator at the time concluded that, "on the way to the Moon we'd discovered the Earth."

BLUE MARBLE
Space travel has changed the way we think about the planet we call home. The famous image of Earth taken in 1972 by Apollo 17's crew on the last manned lunar mission is one of the most reproduced images in history. Dubbed the 'Blue Marble', it captures how fragile, precious

and beautiful our world is. A reminder that actually, in such a harsh and punishing cosmos, we ain’t got it bad. Summing it up nicely is Michael Collins, Apollo 11’s third astronaut who stayed in orbit while Armstrong and Aldrin explored the lunar surface: “There seems to be two Moons now, the one I see in my backyard and the one I remember from up close. Intellectually, I know they are one and the same, but emotionally they are separate entities. The small Moon, the one I have known all my life, remains unchanged, except that I now know it is three days away. The new one, the big one, I remember primarily for its vivid contrast with the Earth. I really didn’t appreciate the first planet until I saw the second one.”

Space travel has changed the way we think about the planet we call home. The iconic Apollo 11 mission reshaped how we see the Earth.

CROWDFUNDING FOR GIRLS’ & WOMEN’S EMPOWERMENT

@WomensW4

WomensWorldWideWeb

www.W4.org


Partnership enquiries: simon@researchoutreach.org Careers and guest contributions: emma@researchoutreach.org www.researchoutreach.org

