Research Outreach Issue 106

Page 1

The outreach quarterly connecting science with society ISSN 2517-701X ISSUE 106

FEATURING RESEARCH FROM:

Brown University and the National Marine Mammal Foundation; Sugar Research Australia; Technical University of Crete; New Mexico Tech; Tokyo University of Marine Science and Technology; Research Origin for Dressed Photon; Collini Holding AG; Yokohama National University; University of Indianapolis; Hungarian Academy of Science; Swiss Federal Institute of Technology; Virginia Commonwealth University; University of Oklahoma; Geneva Institute of Landscape, Engineering and Architecture; Tel Aviv University; International University Centre in Arezzo; Konkuk University; Radboud University Medical Centre; Sheffield Hallam University; University of Liverpool; University of Utah School of Medicine, IBM Watson Health & Novartis Pharmaceuticals Corporation; Yale University School of Medicine; NOVA University of Lisbon; University of Denver; Case Western Reserve University; Hungarian Academy of Sciences; National Institutes of Health; Amsterdam UMC, location VUmc; Western University


500 Women Scientists Serving society by making science open, inclusive, and accessible.

Twitter @500WomenSci Facebook @500WomenSci

www.500womenscientists.org


WELCOME TO ISSUE 106

The Women in Engineering at Rochester Institute of Technology (WE@RIT) programme was initiated by Professor of Mechanical Engineering Margaret Bailey in 2003. Since then, it has become a bastion of support for female students in an academic field that remains, even now, vastly male-dominated. Dr Bailey shares her experiences of leading the programme and her hopes for the future.

Global non-profit Iridescent challenges the negative myths surrounding AI and uses technology to inspire underserved children to become innovators. We speak to founder and CEO Tara Chklovski about how Iridescent's goals have become a reality.

We hope you enjoy this issue of Research Outreach – we've certainly enjoyed putting it together.

Emma Feloy
Editor

Please feel free to comment or join the debate. Follow us on Twitter @ResOutreach or find us on Facebook at https://www.facebook.com/ResearchOutreach/


The fields of scientific research are varied and diverse, and becoming increasingly global and collaborative. In this issue of Research Outreach, we feature researchers from across the globe, including Europe, the US, Australia and Japan. We also speak to some of the people at the forefront of ensuring that the face of research truly reflects the society it serves.


THIS ISSUE
Published by: Research Outreach
Founder: Simon Jones simon@researchoutreach.org
Editorial Director: Emma Feloy emma@researchoutreach.org
Operations Director: Alastair Cook audience@researchoutreach.org
Editor: Hannah Fraser hannah@researchoutreach.org
Designer: Craig Turl
Global Project Director: Julian Barrett julian@researchoutreach.org
Project Managers: Tobias Jones tobias@researchoutreach.org; James Harwood james@researchoutreach.org
Contributors: Rachael Baker, Leonardo Bernasconi, Tim Davies, Siobhan Fairgreaves, Sara Firman, Emma Green, Natasha Hancock, Rebecca Ingle, Sam Jarman, Matt Jarvis, Efstratios Koutris, Gillian Livesey, Helena Parsons, Rachel Perrin, Emily Porter, Kate Porter, Niall Taylor, Laura Turpeinen, Stuart Wilson
/ResearchOutreach /ResOutreach

Copyright © and ™ 2019 Research Outreach

CC BY This work is licensed under the Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ or send a letter to Creative Commons, PO Box 1866, Mountain View, CA 94042, USA.

www.researchoutreach.org



CONTENTS

6 READING BETWEEN THE CLICKS: A NEW APPROACH TO ECHOLOCATION
Dr Alyssa Accomando
Understanding the precise sound patterns produced by bats and dolphins.

10 A RESEARCH MODEL FOR CARBON-PARTITIONING IN SUGARCANE
Dr Frederik (Frikkie) Botha
Examining Yellow Canopy Syndrome in sugarcane to better understand its impact on plant health.

14 STRIPPING PAINTINGS OF THEIR SECRETS WITH HYPERSPECTRAL IMAGING
Professor Costas Balas
Using and developing hyperspectral imaging technology in art conservation.

18 THE POWER OF LIGHT: PRODUCTION OF SOLAR FUELS
Professor Michael D. Heagy
Work on novel nanostructures aiming to improve the efficiency of solar fuel production.

22 FORECASTING TSUNAMIS USING SHIP NAVIGATION RECORDS
Dr Daisuke Inazu
Developing effective ways of monitoring and forecasting tsunamis.

26 THE DRESSED PHOTON: SHINING LIGHT ON THE UNKNOWN USING THE UNCONVENTIONAL AREA OF OFF-SHELL SCIENCE
Professor Motoichi Ohtsu
Using the dressed photon in a new method to create silicon-based light-emitting diodes and lasers.

30 SETTING NEW HORIZONS IN ELECTROPLATING OF ZINC DIE CASTINGS
Valeriia Reveko
Investigating how to make zinc alloys more sustainable.

34 EVALUATING STUDENTS' PERCEPTIONS OF THE ROLES OF MATHEMATICS IN SOCIETY
Professor Toshikazu Ikeda
Developing an analytical tool to evaluate changes in students' perceptions of the roles of mathematics in society.

38 FRANÇOIS VIÈTE'S REVOLUTION IN ALGEBRA
Professor Jeffrey Oaks
Discovering the innovations of François Viète that underpin modern algebra.

42 GENERALISING THE ENTROPY FORMULA THROUGH MASTER EQUATIONS
Dr Tamás Biró
Using statistics to study problems as diverse as the formation of hadrons, changes in biodiversity, and patterns in popularity on Facebook.

46 NMR: A HIGHLY ACCURATE APPROACH TO PROTEIN–LIGAND BINDING
Dr Julien Orts
Developing multidisciplinary approaches to study protein–small molecule complexes using NMR spectroscopy, X-ray crystallography and computational methods.

50 LASER ABLATION IN LIQUID: A POWERFUL ROUTE TO NEW NANOPARTICLE CATALYSTS
Dr Katharine Tibbetts
Developing a novel approach for the synthesis of metal nanoparticles, based on a reactive laser ablation in liquid technique.

54 ASSESSING PERFORMANCES OF COMPUTER-AIDED DIAGNOSIS OF BREAST CANCER
Professor Bin Zheng
Harnessing the power of Computer-Aided Diagnosis technology.

58 DEVELOPING THE GENEVA SOLAR CADASTER: A DECISION SUPPORT TOOL FOR SUSTAINABLE ENERGY MANAGEMENT IN URBAN AREAS
Professor Gilles Desthieux
Developing a tool for modelling solar radiation and energy production from building rooftops and facades.

62 ENGINEERING THE FUTURE
Dr Margaret Bailey
How the WE@RIT programme is encouraging more women into the engineering sector.

66 GENDER INEQUALITY: OCCUPATIONAL DEVALUATION AND PAY GAPS
Professor Hadas Mandel
Addressing the analytical and methodological distinctions between structural and individual aspects of gender inequality.

70 THE DAWN OF MEMORY MODULATION AND SELF-PRESCRIBED FORGETTING – A MORAL DILEMMA
Dr Andrea Lavazza
Examining the ethics around memory modulation and erasing.

74 EXTRACELLULAR VESICLE DNA: A PROMISING CANCER BIOMARKER
Professors Kye Young Lee and Jae Young Hur
Studying and developing cancer detection methods using liquid biopsy with extracellular vesicles.

78 VALUES AND EVIDENCE MEET: APPROPRIATE HEALTHCARE ASSESSMENT FOR VULNERABLE PATIENTS
Gert Jan van der Wilt
Exploring the ethical and social implications of health care technologies.

82 COUNTING THE COSTS OF ANKYLOSING SPONDYLITIS
Dr Jessica Walsh, Dr Xue Song, Dr Gilwan Kim and Dr Yujin Park
Unveiling the direct costs of healthcare for patients with ankylosing spondylitis.

86 HOW DATA IS IMPROVING DRIVING POLICIES FOR EPILEPSY PATIENTS
Dr Laura Bonnett
Using prediction modelling to inform driving regulations for people with seizures and epilepsy.

90 NEW INNOVATIONS IN TRAUMATIC BRAIN INJURY RESEARCH
Dr Lynne Ann Barker
Focusing on the effects of brain injury on neural structures, cognition and behaviour.

94 MACBRAINRESOURCE: VIRTUAL ACCESS TO DECADES-OLD PRIMATE BRAINS
Drs Lynn Selemon and Alvaro Duque
MacBrainResource is an online repository of macaque brain material available to researchers.

98 THE CAROTID BODY
Professor Silvia Conde
Examining the carotid body as a candidate for regaining glucose tolerance in Type 2 diabetes.

102 A POTENTIAL NEW TREATMENT FOR BRAIN INJURY
Dr Daniel Linseman
Exploring a prospective nutritional supplement, Immunocal®, for enhancing resilience and improving recovery following traumatic brain injury.

106 PLANT PHYTOCHEMICALS
Dr Sanjay Gupta
Identifying and developing cost-effective, minimally toxic bioactive agents as cancer preventatives for long-term use and as adjuvants in various therapies, with a focus on epigenetic research.

110 ADIPOSE STEM CELLS MAY PROMOTE CANCER PROGRESSION
Dr Robert Katona
Using a mouse adipose stem cell model system to study cancer and cancer stem cell development.

114 THE BIOLOGY OF AGEING
Dr Nan-ping Weng
Understanding the mechanisms of age-related changes in immune function.

118 DREAM TEAM: IMPROVING HEARTS AND BONES WITH VITAMINS D AND K
Dr Hanne van Ballegooijen
Dr Hanne van Ballegooijen has been working on vitamin D for almost 10 years. Her work suggests that vitamin D status alone is not strongly related to cardiovascular disease and that combining vitamin D with vitamin K might improve its efficacy.

122 EVOLUTIONARY ARMS RACE: A 400 MILLION-YEAR-OLD BATTLE BETWEEN HIV AND ANCIENT GENES, HERC5 AND HERC6
Dr Stephen Barr
Illuminating the evolution of the HIV virus and the family of HERC genes that inhibit HIV.

126 IRIDESCENT: DISRUPTING THE CLASSROOM FOR THE BETTER
Tara Chklovski, CEO of Iridescent
Revolutionising education by providing programmes that empower underserved children through technology and engineering. "For 12 years, we have been introducing children worldwide to [...] new scientific advances."

130 ACCIDENTAL SCIENCE!

RESEARCH AREAS
Biology
Physical Sciences
Engineering & Technology
Behavioural Sciences
Health & Medicine


Biology ︱ Dr Alyssa Accomando

Reading between the clicks: A new approach to echolocation

Biologists are increasingly appreciating the importance of bioacoustics in conservation. By understanding the soundscape of an environment, we can learn far more than previously thought. The same is true of animal behaviour; a classic example is echolocation in bats and dolphins. Dr Alyssa Accomando, of Brown University and the National Marine Mammal Foundation, is studying echolocation to determine how these animals navigate the environment around them and what we can learn from this process.

Echolocation is one of nature's great superpowers. It is a type of sonar: sound projected by an animal travels through the environment, interacting with objects along the way. These objects reflect the sound, producing echoes, and the echoes heard by the animal provide information on each object's size, distance and shape. Although we commonly associate echolocation with bats and dolphins, it is also used by orcas, sperm whales and even some humans. There are also studies of echolocation jamming by tiger moths, which disrupt the sonar emitted by bats to avoid detection.

But why is all of this important? First, it matters to the animals themselves, because echolocation is how they navigate their environment and find food. Unravelling the code of echolocation will allow us to better understand the behaviour of animals such as bats, and any differences in echolocation techniques that exist between species. This can help conservationists improve environmental design in areas such as reserves and zoos, or improve location choice for species reintroduction programmes. It also helps zoologists understand a species' dietary preferences, communication, predator avoidance and prey capture. Secondly, we can use the principles of echolocation in our everyday lives and in technological advancement. For example, some fishing boats use sonar to locate shoals of fish, and the same technology can be used to help people who are blind to 'see'. Drones, too, are used in conservation efforts, for example searching for dolphin populations or studying rainforest canopy cover, so improvements to this technology that come from nature feed back into helping nature itself – quite fitting, really.

BAT ECHOLOCATION
Using ultrasonic microphones, scientists can record and analyse bat echolocation sounds. Dr Alyssa Accomando of Brown University and the National Marine Mammal Foundation is doing just this. She is looking statistically at the relatedness of patterns of bat echolocation pulses in a way that has not been tried before.
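The sonar geometry behind all of this is simple to state: an echo that returns after a given delay comes from an object half the round-trip distance away, and an echo arriving after the next pulse has already been emitted cannot easily be matched to its pulse. The sketch below is illustrative only, using standard textbook relations and a nominal speed of sound; it is not taken from Dr Accomando's work.

```python
def echo_range(delay_s, c=343.0):
    """Distance (m) to a reflecting object, from the round-trip echo delay.

    c: speed of sound in m/s (roughly 343 in air, about 1500 in seawater).
    The sound travels out and back, hence the division by two.
    """
    return c * delay_s / 2.0

def max_unambiguous_range(ipi_s, c=343.0):
    """Farthest object whose echo returns before the next pulse is emitted.

    Echoes from beyond this range arrive after the next pulse, so they
    cannot be matched to the pulse that produced them without extra cues.
    """
    return c * ipi_s / 2.0
```

Shortening the interval between pulses, as a hunting bat does, therefore shrinks the unambiguous range, which is exactly the trade-off at the heart of the cluttered-corridor experiments described below.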

Bats like big brown bats, Eptesicus fuscus, use echolocation to navigate and locate food.


Dr Accomando first had to understand how bats echolocate while performing different behaviours. The usual way to measure echolocation patterns is to look at the time between pulses, the inter-pulse interval (IPI). Differences in IPIs can then be used to group echolocation signals into different functions. For example, when hunting, a bat emits increasingly frequent pulses (shorter IPIs) to home in on prey, then a rapid burst just before the catch; dolphins show the same behaviour. However, these patterns can change when the surrounding environment is cluttered and contains many obstacles. In these very 'busy' environments, bats emit 'doublets': two echolocation pulses, one just after the other. Each pulse produces echoes that reflect back to the bat from near and far objects. This creates a challenge, because it can be difficult to tell which sound is the first echo from the second pulse and which is the second echo from the first pulse. The same problem arises in many technological sonar applications.

The frequency of bats' sonar pulses changes as they home in on their prey.

We are now coming to terms with the fact that sounds in nature hold a huge amount of information and can be valuable indicators of ecosystem health, change, and mechanism.

A CORRIDOR OF CHAINS
To understand how bats solve this problem, Dr Accomando set up an experiment: a corridor lined with chains hanging from the ceiling, acting as obstacles. The bats flew through the corridor, and she designed three corridors of decreasing width to provide different degrees of challenge. The species she used was the big brown bat, Eptesicus fuscus. Bats not only produce doublets of echolocation signals but can also, less frequently, produce singles, triplets and quadruplets. Dr Accomando's first task was to deduce which pulses belonged to which group and how many were produced under the different corridor conditions. Bats in the narrower corridors emitted fewer doublets, and more singlets, triplets and quadruplets, than in uncluttered environments. The IPI also decreased with corridor width, giving a more rapid stream of sound. This suggests it is valuable for a bat to receive more echoes, even at the price of deciphering the ambiguous echo sources that come with a faster echolocation signal production rate. However, it could also be that the alternation of IPIs has something to do with how the bats ignore these echoes. The fact that the IPI structure changed from primarily doublets to a greater mixture of pulse groups suggests that, by changing the number of pulses per group, the bats were able to more easily identify the source of echoes; exactly how this is done remains a mystery for now.

Still photographs taken from infrared video of bats flying in the obstacle array. Obstacles were plastic chains hanging from the ceiling of the flight room.

REMEMBERING PATHWAYS
Using Spike Train SIMilarity Space (SSIMS) analysis, Dr Accomando spatially mapped the different pulse patterns and compared their similarity. SSIMS is a mathematical

Sonar uses the same concept as bats – by emitting pulses and then analysing the reflected echoes, the location, distance and size of an object can be detected.



Dolphins also use echolocation to locate and catch prey. Inset: the tiger moth disrupts bats' sonar to avoid detection.

method used to analyse neural pulses, but it can also be applied to echolocation. It works by comparing two patterns of pulses and attributing a 'cost' every time a pulse has to be moved, inserted or removed in order to make the two patterns identical. First, Dr Accomando changed the shape of the corridors (straight, S path and reverse-S path) to see how an added level of difficulty would affect the bats' echolocation pulse patterns. Second, bats were tested multiple times with the corridor shape either kept consistent across trials or randomly changed, to investigate whether bats remembered their environment from trial to trial. Interestingly, the bats hardly changed their echolocation patterns in the curved paths, whether random or fixed. However, when the straight path was changed, the bats' echolocation pattern differed from when the straight path was maintained, suggesting they can treat a space differently when it is new compared to when it is a path that was recently seen. Further study will involve testing different species of bat, answering whether alternative behaviours have evolved in different species to deal with obstacles, and how these species' techniques might relate to their specific environments.

… [bats] can treat a space differently when it is new compared to when it is a path that was recently seen.

TAKING THINGS UNDERWATER
Dr Accomando also wants to apply this knowledge underwater, to bottlenose dolphins (Tursiops truncatus). Rather than obstacle avoidance, which is not really an issue for dolphins in the ocean, she wants to see whether the click patterns of dolphins (their echolocation sounds are commonly called 'clicks') can predict prey-catching behaviour. Little is known about how the changing ocean environment affects the ability of dolphins to catch fish. Dolphins, much like bats, emit a 'terminal buzz' just before capturing prey. A more rapid burst of clicks, called a 'burst-pulse', is emitted just after prey capture, but some dolphins have been recorded emitting this early, pre-empting a successful capture.
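The SSIMS 'cost' computation described above is an edit distance between pulse trains, closely related to the classical Victor–Purpura spike-train metric. The following is a minimal illustrative sketch of such a cost, computed by dynamic programming; it is an assumption-laden toy, not the authors' actual implementation.

```python
def pulse_train_cost(a, b, q=1.0):
    """Edit cost between two pulse-time sequences (times in seconds).

    Inserting or removing a pulse costs 1; moving a pulse by dt costs q*dt.
    A minimal Victor-Purpura-style distance, for illustration only.
    """
    n, m = len(a), len(b)
    # cost[i][j] = cheapest way to turn the first i pulses of `a`
    # into the first j pulses of `b`
    cost = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        cost[i][0] = float(i)                  # delete all i pulses
    for j in range(1, m + 1):
        cost[0][j] = float(j)                  # insert all j pulses
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost[i][j] = min(
                cost[i - 1][j] + 1.0,          # remove pulse a[i-1]
                cost[i][j - 1] + 1.0,          # insert pulse b[j-1]
                cost[i - 1][j - 1]             # move a[i-1] onto b[j-1]
                + q * abs(a[i - 1] - b[j - 1]),
            )
    return cost[n][m]
```

Identical trains cost zero, and a small timing shift is cheaper than a delete-plus-insert, so trains with similar temporal patterning end up close together; pairwise costs of this kind are what a similarity-space analysis then maps into a low-dimensional space for comparison.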


Using the same SSIMS analysis as with the bats, Dr Accomando can compare the inter-click intervals (ICIs) in the series of clicks produced by each dolphin, called the click train, and spatially separate different pre- and post-prey-capture echolocation patterns. The experiment would involve using a GoPro camera to confirm normal fish captures, and ultrasonic hydrophones (underwater microphones) to listen to echolocation clicks. Dolphins would be trained and rewarded for retrieving a target placed in the water by an animal trainer; on occasion, the target would be randomly removed just before capture. The goal of the study is to determine whether dolphins produce different sound patterns when the prey escapes compared to when it is captured.

Burst-pulses are communication sounds produced by dolphins, and SSIMS analysis might also help quantify sounds that are not used for echolocation but are similar in the way they are produced. For example, new research suggests that burst-pulses have timing patterns that might be used to identify individual narwhals in the wild (Blackwell et al., 2018).

Understanding echolocation behaviour is an important step towards a fuller picture of environmental bioacoustics. Previous ecological and zoological studies, which are primarily terrestrial, have focused heavily on the use of vision and olfaction in animals' interactions with their environment, with relatively little emphasis on bioacoustics. We are now coming to terms with the fact that sounds in nature contain a huge amount of information relevant to animals and can be valuable indicators of ecosystem health, change, and mechanism. The work of ecologists and behavioural neuroscientists such as Dr Alyssa Accomando will not only help us better understand how animals use sound to obtain information about and interact with their environment, but will also help improve sonar technology and facilitate wildlife conservation based on bioacoustic information.


Behind the Research Dr Alyssa Accomando

E: alyssa.accomando@nmmf.org T: +1 877-360-5527 W: www.linkedin.com/in/alyssaaccomando-phd-aa431146 W: www.nmmf.org/

Research Objectives Dr Accomando specialises in understanding the precise sound patterns produced by bats and dolphins.

Detail
2240 Shelter Island Dr., Ste 200, San Diego, CA 92106, USA

Bio
Alyssa Accomando is a research scientist at the National Marine Mammal Foundation, where she conducts neuroscientific research with bottlenose dolphins. She earned her PhD studying big brown bat biosonar at Brown University. Her research aims to understand how dolphins and bats process acoustic signals resulting from echolocation.

Funding
• Office of Naval Research (N00014-14-1-05880, awarded to James A. Simmons)
• NINDS-Javits (NS025074, awarded to John Donoghue)
• Katie Samson Foundation grant (awarded to John Donoghue)

Collaborators
• Carlos E. Vargas-Irwin (coauthor)
• James A. Simmons (coauthor)
• Ikuo Matsuo (collaborator)

References
Accomando AW, Vargas-Irwin CE and Simmons JA (2018). Neural spike train similarity algorithm detects effects of obstacle proximity and experience on temporal patterning of bat biosonar. Frontiers in Behavioral Neuroscience. https://doi.org/10.3389/fnbeh.2018.00013
Wheeler AR, Fulton KA, Gaudette JE, Simmons RA, Matsuo I and Simmons JA (2016). Echolocating big brown bats, Eptesicus fuscus, modulate pulse intervals to overcome range ambiguity in cluttered surroundings. Frontiers in Behavioral Neuroscience, 10:125. doi: 10.3389/fnbeh.2016.00125
Blackwell SB, et al (2018). Burst-pulses in East Greenland narwhals: Further evidence for unique, individual-specific vocalizations. Paper presented at the 176th Meeting of the Acoustical Society of America, Victoria, BC, Canada, 9 November 2018.

Personal Response
Why do you think the bats' echolocation pattern differed on variable straight paths rather than the curved paths?

I think that, since they had the most experience flying the straight path, combined with fewer physical demands like turning in a narrow space, this was the least difficult for them to navigate. However, when the bats flew down a straight path after having just flown in more difficult curved paths only a minute beforehand, that element of uncertainty about what they would face when released into the flight room led them to treat the relatively easier straight path more similarly to the curved paths. This makes sense because bats could collide with obstacles, and emitting more sound groups gives them more information to avoid that.



Biology ︱ Dr Frederik (Frikkie) Botha

A research model for carbon-partitioning in sugarcane

Yellow Canopy Syndrome (YCS), first observed in 2012, is an undiagnosed condition affecting Australian sugarcane. It causes mid-canopy leaves to turn yellow, decreasing crop sugar yields. Dr Frederik Botha oversees YCS research for Sugar Research Australia (SRA). Focused on gene expression and protein and metabolite levels, this research seeks molecular targets to improve genetic tolerance to YCS and enhance sugarcane productivity in general. In particular, a model of how sucrose build-up regulates leaf metabolism through feedback control has been developed. Understanding what elevates leaf sucrose levels as YCS develops could provide insights into mechanisms underpinning sugarcane diseases and physiological disorders.

Commercial sugarcane (a hybrid of Saccharum officinarum and S. spontaneum) produces a higher biomass yield than the other major world crops: rice, wheat and maize. However, sugarcane yields worldwide have not improved significantly over the past three decades. Good crop yields depend on ensuring that, at each stage of plant growth, the supply of assimilates from the 'source' (leaves) to the 'sink' (growing or filling tissues) is optimal. Although sugarcane is one of the most efficient crops in converting solar energy into biomass, commercial yields remain half the experimental potential.

There are several reasons for inefficient conversion of solar energy into biomass. Of particular interest in sugarcane are reduced photosynthetic rates in the leaves and slowed biomass gain in the culms, due to feedback control of the plant's metabolism by high levels of sucrose and other sugars in the leaves. It is difficult to manipulate sugar levels experimentally without changing light input or damaging leaf and culm tissues. Since in YCS leaf sucrose exceeds normal physiological levels, discovering what causes this could give clues to improving productivity.

Sugarcane turns yellow for various reasons that can now be distinguished from YCS, including herbicide application, nutrition and known diseases. Indications are that the syndrome is a combination of abiotic and biotic factors leading to a physiological disorder. Dr Botha and colleagues have found that YCS is especially associated with altered carbon-partitioning in the leaf. Disruption of the sink–source relationship causes sugars to accumulate in leaves, and when sugar exceeds a critical level it induces senescence. High levels of sucrose in sugarcane leaves are therefore an indicator of compromised crop health.

A field of sugarcane affected by yellow canopy syndrome.

THE SOURCE–SINK SYSTEM
How well a plant grows depends on acquiring raw material (carbon fixation and mineral uptake), distributing it through the plant's organs and coping with environmental stresses. The process known as carbon-partitioning is critical for distributing the energy captured by plants through photosynthesis. In C4 plants like sugarcane, CO2 is converted into four-carbon compounds, which then enter chemical reactions that take place in chloroplasts, the plant cell organelles conducting photosynthesis. Carbon fixed during photosynthesis and converted into sugar in 'source' cells is distributed to 'sink' cells. Phloem is the tissue that transports the soluble organic compounds made during photosynthesis (mainly sucrose), known as photosynthates, to wherever they are needed in the plant. The sugars are imported into sink tissues for consumption (providing energy for plant functions) or storage. Some stored sugars provide structural biomass as cellulose, hemicelluloses and lignin.

Sucrose synthesis in source tissue, its translocation and its partitioning between storage, respiration and biosynthesis are systemically coordinated in plants. Not only is sucrose the primary product of photosynthesis and the building block for biomass accumulation, it also serves as a sensitive metabolic switch controlling photosynthesis and carbon-partitioning in the plant. A model of the biochemical process of carbon-partitioning in sugarcane is being developed through research on YCS.

Sugarcane has a unique source–sink system. Stem-sinks store photosynthates as soluble sucrose, which can reach exceptionally high concentrations in commercial sugarcane varieties. Most other plant stems store carbon as insoluble polysaccharides (such as starch or cellulose) with low concentrations of sucrose, and in many plants sucrose is stored (after conversion to insoluble starch) in terminal sink organs such as tubers, grains or fruits, rather than in the stem. Valuable sucrose from sugarcane culms is extracted and purified for use in the food industry or fermented to produce ethanol.

During development, sucrose synthesised in sugarcane leaves is translocated via the phloem to internodes (the stem sections between leaf-carrying nodes), the storage sink. Sucrose accumulates inside and outside the cell membranes, in the symplast and apoplast respectively. Immature sugarcane tissues partition carbon into protein and fibre, whereas mature culms mainly partition it to sucrose storage. During maturation of commercial sugarcane cultivars, leaf photosynthetic activity decreases as culm sucrose content increases. Thus, sink regulation of source capacity is taking place.

Sucrose serves as a sensitive metabolic switch controlling photosynthesis and carbon-partitioning in sugarcane.

SUCROSE ACCUMULATION IN SUGARCANE
In YCS, leaf yellowing occurs in the late stage of sucrose accumulation, when senescence is induced and tissue death begins. Normal diurnal changes in sucrose concentration (low in the morning, high at the end of the day) are absent in YCS-affected plants, even before yellowing. So significant metabolic changes occur well before visual signs. Studies at SRA reveal that these changes include an increase in soluble sugars, a decrease in photosynthetic rate, decreased internal leaf CO2, decreased conductance through stomata (pores in leaves and stems for gas exchange), uncoupling of the photosynthetic electron transport (PET) chain and altered carbon-partitioning.

The mysterious yellow canopy syndrome (YCS) of sugarcane.

The excessive increase in sucrose suggests disruption of phloem transport. Sugar is loaded into the phloem but not exported from the leaf, since the highest levels are found in the midrib and sheath. Expression levels of genes for sucrose transporters and SWEET proteins (not previously characterised in sugarcane) are also greatest in these plant parts. The sucrose accumulation could be caused by physical blockage of the phloem (for which there is currently no evidence), or it could arise because the sink is not using transported sugar fast enough, creating an overflow into the surrounding leaf blade, midrib, dewlap and sheath. Increased sucrose also leads to elevated glucose, fructose and trehalose, sugars that play major roles in metabolic signalling. Furthermore, sucrose synthesis slows down, probably lowering the available inorganic phosphate (Pi) within chloroplasts. A feedback signalling mechanism involving sucrose in the symplast could result from chronic cellular



[Figure: Model of leaf responses to increasing sucrose, showing light input to PSII and PSI, sucrose pools (100 mM and 200 mM), NADPH/ATP, CO2 fixation to hexose-P, respiration, leaf supply (R1), culm demand (R2) and biomass. Responses annotated at high and intermediate sucrose levels include: over-reduction of the PET chain; down-regulated transcription and translation of the major components of PSII; reduced physiological 'fitness' of PET and inactivation of photosynthetic reaction centres (more heat production and less reductant and ATP produced); decreased water splitting around PSII as a result of changes in gene expression; initiation of accelerated senescence; stimulated breakdown of chloroplast and total protein; major upregulation of components of the ubiquitin pathway; upregulation of cyclic electron flow to protect PSI; carbon partitioned to the phenylpropanoid pathway to act as antioxidants; carbon partitioned to β-glucan to reduce sucrose accumulation; accumulation of amino acids and amines; down-regulation of the C4 and C3 carbon-fixation pathways; an increase in sugars associated with cell-wall degradation; major changes in both nuclear and plastid gene expression; accumulation of heat-shock and other stress proteins; upregulation of chlorophyll breakdown; partial stomatal closure to reduce photosynthesis; visible yellowing of the leaf; respiration maintained to facilitate the senescence process; and carotenoid retention.]

Leaf health is determined by the sucrose level in the photosynthetic mesophyll and bundle sheath cells. The sucrose level is determined by the difference between production (R1) and utilisation in the culm (R2). Daily fluctuation between 20 and 100 mM is normal, resulting from variation in photosynthetic rate. Between 100 and 200 mM sucrose (intermediate levels), a series of events is triggered, aimed at protecting the photosynthetic electron transport chain, reducing carbon fixation and creating an alternative sink for the reduced carbon. Prolonged levels above 200 mM (high) trigger accelerated senescence, collapse of the electron transport system, chlorophyll breakdown and cell death.
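For readers who like to see the numbers, the thresholds in this supply-and-demand model can be expressed as a short illustrative sketch. The function names, the simple linear supply-minus-demand rule and the example rates below are ours, for illustration only; the threshold values are those given in the caption.

```python
def leaf_status(sucrose_mM):
    """Classify leaf state from mesophyll/bundle-sheath sucrose level,
    using the thresholds described in the model (illustrative only)."""
    if sucrose_mM <= 100:
        return "normal"        # 20-100 mM: normal daily fluctuation
    elif sucrose_mM <= 200:
        return "intermediate"  # protective responses triggered
    return "high"              # prolonged: accelerated senescence

def sucrose_after(hours, start_mM, r1_mM_per_h, r2_mM_per_h):
    """Net leaf sucrose reflects supply minus demand (R1 - R2): when culm
    demand (R2) lags production (R1), sucrose accumulates over time."""
    return max(0.0, start_mM + (r1_mM_per_h - r2_mM_per_h) * hours)

print(leaf_status(60))                           # normal
print(leaf_status(sucrose_after(48, 60, 4, 1)))  # 60 + 3*48 = 204 mM -> high
```

The point of the sketch is simply that leaf health depends on the balance of two rates, not on photosynthesis alone: the same production rate R1 can be benign or damaging depending on culm demand R2.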

Pi limitation. Research shows that raised sucrose also alters gene expression of key photosynthetic proteins in leaf cells. From the model developed so far, YCS symptoms appear to be caused by downregulation of photosynthesis through Pi limitation, leading to a chronic inability to export reductant away from the PET chain during cellular sugar accumulation. Down-regulation of genes encoding Photosystem (PS) II and I, cytochrome and CP12 (an essential regulatory protein) results in decreased synthesis of these proteins, which then limits photosynthesis.

ADVANCING GENETIC STUDIES OF SUGARCANE
The sugarcane genome has only recently been mapped, owing to sugarcane's complexity: high polyploidy (more than two paired sets of chromosomes); aneuploidy (varied numbers of chromosomes); bispecific origin of chromosomes; and structural differences and interspecific chromosome recombinants. A reference genome is now available for researchers. DNA sequencing, the development of gene-expression technologies and improved genetic/genomics resources for Saccharum are enabling the regulatory networks of carbon-partitioning to be further elucidated.

Metabolome (low-molecular-weight metabolites produced during metabolism) and transcriptome (messenger RNA molecules expressed from the genes) analyses of the metabolic pathways in the leaves and sink tissues of sugarcane are helping researchers to identify reactions that lead to YCS. Comparing leaf transcriptomes of symptomatic and asymptomatic plants confirms that a complex network of changes in gene expression underpins the observed changes in the metabolome. Fluorescence and gene-expression data from YCS studies indicate that PS II is the sensitive process/component, linked to reduced electron flow producing reduced co-enzyme. The early change in photosynthetic rate is accompanied by changes in the expression of phosphoenolpyruvate carboxylase (PEPC). NADP-malic dehydrogenase expression is more sensitive to the accumulation of sucrose than are NAD-malic dehydrogenase and PEPC. This demonstrates that chloroplast metabolism is down-regulated when sucrose levels rise.

Furthermore, genes in the shikimate and phenylpropanoid metabolic pathways are upregulated in early response to elevated sucrose. This increases caffeoylquinic acids and quinate, compounds that provide antioxidants to buffer free-radical production in the chloroplast as a result of decreased electron flow to the terminal electron acceptors of PS I. Upregulation of the phenylpropanoid pathway probably shifts carbon-partitioning towards lignins, flavonoids and anthocyanins.

In the early stages of sucrose accumulation, several other changes also occur: significant levels of metabolites indicative of microorganisms that associate with injured tissue, especially where there are significant available carbohydrates; significant increases in caffeoyl/chlorogenic-type compounds indicative of wounding and activation of plant defence systems; and increases in amino acids and metabolites indicative of stress metabolism and of disruption of the electron transport system, which depends on fast turnover of protein components.

A model for the biochemical process of carbon-partitioning in sugarcane is being developed through research on YCS.

A genomic approach is now being pursued for YCS in sugarcane, using next-generation RNA sequencing to compare and analyse genetic data for affected and unaffected plants from diverse field locations. Genetic exploration of how different tissue samples express different proteins continues to provide clues to the cause of YCS and to understanding sugarcane metabolism in general.
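The transcriptome comparison between symptomatic and asymptomatic leaves rests on fold changes in gene expression. A toy sketch of that arithmetic follows; the gene labels and counts are invented for illustration, and a real RNA-seq analysis involves normalisation and statistical testing rather than raw counts.

```python
import math

# Invented expression counts for a few genes in symptomatic vs
# asymptomatic leaves, chosen to mirror the trends described in the text
# (photosynthesis genes down, phenylpropanoid pathway up).
symptomatic  = {"PSII_core": 40,  "CP12": 25,  "phenylprop_gene": 900}
asymptomatic = {"PSII_core": 320, "CP12": 210, "phenylprop_gene": 110}

def log2_fold_change(gene):
    """log2(symptomatic / asymptomatic): negative means down-regulated in YCS."""
    return math.log2(symptomatic[gene] / asymptomatic[gene])

for gene in symptomatic:
    direction = "down" if log2_fold_change(gene) < 0 else "up"
    print(f"{gene}: {log2_fold_change(gene):+.2f} ({direction}-regulated)")
```

Listing genes by the sign and size of this ratio is the simplest way such comparisons flag candidate pathways for closer study.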


Behind the Research Dr Frederik (Frikkie) Botha

E: fbotha@sugarresearch.com.au T: +61 488400074 / +61 7 33313318 W: https://sugarresearch.com.au/team/dr-frikkie-botha/

Research Objectives

References

Dr Botha’s work examines leaf sucrose levels in sugarcane, among other plants, and their impact on overall plant health.

Botha, F. (2015) Metabolic Changes Associated with the Development of Yellow Canopy Syndrome in Sugarcane. International Plant and Animal Genome Conference XXIII.

Detail
50 Meiers Road, Indooroopilly, Queensland 4078, Australia

Bio
Frederik (Frikkie) Botha is the Executive Manager Strategic Initiatives at Sugar Research Australia and Honorary Professor at the University of Queensland, Australia. His research focus is on the genetic and molecular control of carbon partitioning in the culm and leaves of sugarcane, which is the driver of biomass composition and yield. The research aims to understand the control of carbon partitioning between the cell wall components, respiration and sucrose accumulation in the culm and the impact of this on sink strength. An early switch to sucrose accumulation reduces biomass accumulation and reduces sink strength. The limited capacity to buffer leaf sucrose through partition of carbon to starch requires maintenance of a strong sink demand to prevent induction of premature senescence in the canopy.

Funding
• Sugar Research Australia
• Australian Research Council
• University of Queensland

Co-Authors
• Annelie Marquardt, Sugar Research Australia (SRA)
• Gerard Scalia, Sugar Research Australia (SRA)
• Kate Wathen-Dunn, Sugar Research Australia (SRA)
• Robert Henry, Queensland Alliance for Agriculture and Food Innovation, The University of Queensland (UQ)

www.researchgate.net/publication/268116827_Metabolic_Changes_Associated_with_the_Development_of_Yellow_Canopy_Syndrome_in_Sugarcane

Marquardt, A., Scalia, G., Wathen-Dunn, K., Botha, F.C. (2017) Yellow Canopy Syndrome (YCS) in sugarcane is associated with altered carbon partitioning in the leaf. Sugar Tech, 19, 647–655.

Marquardt, A., Scalia, G., Joyce, P., Basnayake, J., Botha, F.C. (2016) Changes in photosynthesis and carbohydrate metabolism in sugarcane during the development of Yellow Canopy Syndrome (YCS). Functional Plant Biology, 43, 523–533.

Wang, J., Nayak, S., Koch, K., Ming, R. (2013) Carbon partitioning in sugarcane (Saccharum species). Frontiers in Plant Science, 4, 201.

Sugar Research Australia. Yellow Canopy Syndrome. https://sugarresearch.com.au/growers-and-millers/pests-and-diseases/yellow-canopy-syndrome/ [Accessed January 2019].

Personal Response

What impact do you hope this research will have over the next five years?

Conventional and genetic manipulation studies have shown that accumulation of sucrose leads to biomass penalties consistent with sucrose feedback control on photosynthesis. We need a better understanding of why sucrose accumulates in the leaves of sugarcane during stress and what its impact is on leaf metabolism and crop yield. This will contribute to finding management solutions for physiological disorders and biotic stress that lead to sucrose accumulation. More importantly, however, it could lead to genetic targets that provide an opportunity to break out of the current yield plateau that has frustrated sugarcane breeders for the past three decades.



Physical Sciences ︱ Professor Costas Balas

Stripping paintings of their secrets with hyperspectral imaging What lies beneath hundreds of years of paint? Professor Costas Balas at the Technical University of Crete, Greece is privy to some of the secrets of the great Renaissance masters with his hyperspectral imaging devices. Such devices can be used to identify not just what chemicals are present in pigments on the surface layers of paintings, but also what lies underneath.

Art restoration is a difficult business. A single painting will use numerous pigments to create a wealth of different colours; while those colours might be indistinguishable by eye, each of the pigments may in fact have a different chemical composition. Even worse, different pigments or regions of the painting may have been treated with different binding materials and varnishes, which also affect the types of solvents and materials that can be used for cleaning or touching up certain areas. If colour alone is not a reliable indicator of the pigments and chemicals used in a painting, how can art experts and scientists identify what materials have been used with a high degree of accuracy, all in a non-destructive manner?

The answer lies in tools typically used in analytical chemistry. Most of these involve shining non-visible wavelengths of light through the painting, including X-rays and infra-red radiation, as these are not completely absorbed by the surface paint layers, unlike visible light. Unfortunately, such spectroscopic techniques are very expensive and provide only point-by-point information. This means that each section of the painting must be meticulously scanned with the incoming X-rays or infra-red to record all the chemical information on different sections of the painting. It may also be necessary to tune the wavelengths of the incident light to see different elements. Given how time-consuming point-by-point scanning is, one alternative

Fluorescence imaging combining spectral images of El Greco's St Francis of Assisi reveals invisible conservation interventions. Materials used for conservation are similar to the background colour, but their different chemical composition is evidenced by the different fluorescence emission pattern.



approach is to combine spectroscopic methods with imaging methods. Spectroscopic methods are ideal for accurately determining which chemical elements are present in a particular region, while imaging methods have very high spatial resolution and can therefore disentangle differences in even small, detailed areas of the painting. This combination is known as hyperspectral imaging, and Professor Costas Balas at the Technical University of Crete, Greece and his research group are experts in improving and developing hyperspectral imaging technologies for both medical and art conservation applications.

HYPERSPECTRAL DEVICES
Professor Balas's interest in hyperspectral imaging was originally motivated by his work in the field of cancer diagnosis and biomedicine. Here, there is a need for devices that can successfully and accurately profile and image tumours within the body as part of the diagnostic process, but a number of features of the technique mean it can be applied to a diverse range of applications, including art restoration and identification.

A hyperspectral imaging device works by simultaneously recording spectral and spatial images to make three-dimensional data sets. A spatial image is like a photograph that shows the location of all the different objects in the frame. However, as spectral information can only be obtained for a limited number of wavelengths per spatial image, only certain coloured objects are recorded in a single image; it would be like taking a photograph that showed only green objects. To make up a full three-dimensional image, another photograph would be taken that showed only blue objects, then red objects, and so forth, until all the wavelengths of interest had been covered and the full scene could be created. In practice, as the human body and paint absorb visible light, the imaging ranges of hyperspectral imaging devices span from the ultraviolet to

Hyperspectral analysis of the palimpsest (72 verso) (Mount Athos Monasteries, Greece). Regular colour photography (top) shows only the newer script, entitled "Mandona Anthems", which has been dated to the 13th century AD. There is an older, invisible script, however, dated to the 10th–11th century AD, entitled "Porphyrogennetus", which is revealed with hyperspectral imaging. The fusion of selected spectral images made the visualisation of both the new (vertical script) and the old (horizontal script) possible.

the infra-red. Using wavelengths of light in the infra-red allows Professor Balas to peel back the layers of paint to what lies on the canvas underneath. The flexibility of the wavelengths that can be used with hyperspectral imaging has many advantages for dealing with complex materials such as paintings. The use of visible light is important for identifying colours and pigments, combined with ultraviolet and infra-red detection, which provide more definitive chemical assignments and are sensitive not just to the surface layers.

The team found a number of surprising results that only hyperspectral imaging could have unearthed.

SEEING THE CANVAS
Professor Balas and his team have been able to use this technology to study one of Doménikos Theotokópoulos's, otherwise known as El Greco,

The hyperspectral camera, developed by Professor Balas and his team, (left) captures a series of monochromatic images across the spectrum, building up the so-called spectral cube (right). This data structure assigns a full spectrum to every point in a scene, giving information on its chemical composition.
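The spectral cube described in the caption can be sketched in a few lines of array code. This is a toy illustration with invented dimensions and wavelengths, not the team's software:

```python
import numpy as np

# A hyperspectral device records one monochromatic image per wavelength.
wavelengths = np.arange(400, 1001, 50)   # nm, visible into infra-red (toy values)
height, width = 64, 64

# Stand-ins for the captured frames (random here; real frames come from the camera).
frames = [np.random.rand(height, width) for _ in wavelengths]

# Stack the monochromatic frames into a 3-D "spectral cube":
# two spatial axes plus one spectral axis.
cube = np.stack(frames, axis=-1)         # shape (64, 64, 13)

# Every pixel now carries a full spectrum: a fingerprint that can be
# compared against reference spectra of known pigments.
pixel_spectrum = cube[10, 20, :]
print(cube.shape, pixel_spectrum.shape)  # (64, 64, 13) (13,)
```

The key design point is that the cube assigns a whole spectrum to every spatial point, which is what lets later analysis identify materials pixel by pixel rather than at isolated sampling spots.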



Legend: Malachite; Verdigris; Green Earth (1125); Green Earth (4081); Green Earth (4082). Hyperspectral analysis of 'El Espolio' by Jorge Manuel (El Greco's son). Spectral mapping of a detail, shown in the red frame, combined with machine learning methods, allows for the non-destructive identification of malachite pigment. Areas with malachite are artificially coloured red to facilitate visualisation of the pigment's distribution in the painting.
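Spectral classification of pigments can be sketched as a simple matching rule. The reference spectra below are invented, and the spectral-angle rule is just one common matching choice, not necessarily the method used in this project:

```python
import numpy as np

# Toy reference spectra (reflectance at a few bands) for three pigments.
# Values are invented for illustration; real work uses measured spectra
# and a trained classifier.
references = {
    "lapis lazuli": np.array([0.2, 0.3, 0.8, 0.9, 0.4]),
    "azurite":      np.array([0.1, 0.4, 0.7, 0.5, 0.2]),
    "malachite":    np.array([0.1, 0.8, 0.6, 0.2, 0.1]),
}

def classify_pixel(spectrum):
    """Assign the pigment whose reference spectrum makes the smallest
    spectral angle with the observed spectrum."""
    def angle(a, b):
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.arccos(np.clip(cos, -1.0, 1.0))
    return min(references, key=lambda name: angle(references[name], spectrum))

# A measured spectrum close in shape to the lapis lazuli reference:
print(classify_pixel(np.array([0.25, 0.28, 0.75, 0.85, 0.45])))  # lapis lazuli
```

Applied to every pixel of a spectral cube, a rule like this produces a pigment map of the whole painting far faster than point-by-point expert inspection.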

masterpieces, 'The Baptism of Christ'. Thought to have been painted in the early 1600s, the painting depicts a complex scene involving both the heavens and the earth, where Christ stands with St John the Baptist at his right. Strikingly, a number of figures are robed in bright greens, reds and yellows, giving a small indication of the complexity of the chemistry occurring on the canvas.

Professor Balas's interest in hyperspectral imaging was originally motivated by his work in the field of cancer diagnosis.

The goal of the project, coordinated by the Benaki Museum, was to provide technical information to art historians to allow them to date and authenticate the painting, and to determine the physical condition of the painting to aid conservation work. During this process, the team found a number of surprising results that only hyperspectral imaging could have unearthed, including the numerals 'MDLXVI', indicating the year 1566. This date had previously gone unnoticed as it had been overpainted, but the quality of the spatial resolution from the imaging made it possible to unambiguously resolve most of the date.

There was one small complication, though. Next to MDLXVI appeared to be another 'I', which would indicate the year to be 1567. Most other analysis techniques would have suggested the additional 'I'-like feature was intended to be part of the date, owing to its location and similarities in colour. However, the chemical information extracted from the hyperspectral imaging indicated this 'I' was not a numeral forming part of the date, but was actually intended to be part of a nearby scene.

While hyperspectral imaging may be more efficient than point-by-point spectroscopic methods, the size of the painting posed some problems for Professor Balas and his team. They were able to identify over ten different pigments used by El Greco and his contemporaries on the painting, including lapis lazuli and azurite, but needed a more efficient way of identifying them over the full scale of the canvas. To do this, they turned to machine learning methods that, once trained to identify the spectroscopic signatures of the various pigments, could automatically assign which pigments had been used to paint specific regions, speeding up the process immensely.

CAPTURING THE PAST
While not all paintings have dates conveniently hidden under layers of paint, hyperspectral imaging can be used to date paintings through a different approach. When chemists and artists alike found ways to create new pigments, this was often intimately related to the development of new artistic styles. Their


colours and the techniques used to paint them often provide art historians with many visual clues as to a painting's age, which can in turn be verified by a more in-depth chemical analysis of exactly which compounds were used where.

Hyperspectral imaging can be used not just for identification and dating but also for historical preservation. In cases where restoration is not possible due to the fragile nature of the object, imaging can be used to create digitised records of particularly frail manuscripts. One example is the Codex Sinaiticus, the earliest manuscript containing the complete New Testament, which dates from the mid-fourth century. The Codex Sinaiticus is currently kept in four different locations, so as well as historical preservation, imaging the whole text also served the purpose of allowing a reunification of the complete text. The reunification of the text through imaging gave some ideas as to how the book may have originally been bound, and researchers now have a wealth of information on the composition of the different coloured inks used in the text, all made possible by developments in hyperspectral imaging.


Behind the Research Professor Costas Balas

E: balas@electronics.tuc.gr T: +30 28210 37212 W: www.linkedin.com/in/costas-balas-b4473236/

Research Objectives

References

Professor Balas uses and develops hyperspectral imaging technology in art conservation and in biomedical diagnosis.

Balas, C., Epitropou G., Tsapras A. and Hadjinicolaou N., (2018) Hyperspectral imaging and spectral classification for pigment identification and mapping in paintings by El Greco and his workshop, Multimedia Tools and Applications, 77, 9737–9751.

Detail
Costas Balas, PhD
Professor | Director, Electronics Lab
School of Electrical & Computer Engineering
Technical University of Crete
University Campus Acroterion, 73100 Chania, Crete, Greece

Bio
Costas Balas is a full Professor at the Electrical and Computer Engineering department of the Technical University of Crete. He is a recognised expert and innovator in biophotonics and hyperspectral imaging. He has patented and published life-saving, FDA-approved photonic methods and imaging technologies for non-destructive analysis and for non-invasive diagnosis. He teaches both graduate and undergraduate courses.

Personal Response

What are the unique challenges of imaging art pieces?

Art pieces are very complex in nature, with multiple layers and chemicals contributing to image formation. When referring to historic art pieces, construction materials are often unknown, and it is therefore very difficult to develop material replicas to be used as a reference for the analysis. Innovative platforms integrating hyperspectral imaging and machine learning methods emerge as a powerful tool for addressing these challenging diagnostic/analytical tasks.

Collaborators The guidance and the data interpretation offered by the eminent Art Historian Professor Emeritus Nicos Hadjinicolaou and the experimental work conducted by my graduate students are gratefully acknowledged.

Technical University of Crete



Physical Sciences ︱ Professor Michael D. Heagy

The power of light: Production of solar fuels Harnessing the power of the sun as a source of energy is highly appealing for many reasons, but primarily because it is a renewable, clean source of energy. Plants are already incredibly proficient at not just converting light to useable energy but also at storing that energy in the form of glucose, which can be considered a solar fuel. Professor Michael D. Heagy at New Mexico Tech has taken inspiration from plant photosynthesis to develop new materials, designed to address some of the challenges involved in efficiently converting and storing solar energy.

When it comes to solar energy conversion, plants are the experts. Through photosynthesis, plants can use light energy absorbed from the sun, in combination with carbon dioxide and water, to create glucose and oxygen. Despite photosynthesis being a complex, multi-step process, plants are able to perform it with very high conversion efficiencies, owing to their highly adapted network of molecular machinery.

Glucose, the type of sugar produced during photosynthesis, is a very energy-dense fuel that the plant uses for two main purposes. Firstly, glucose provides the chemical building blocks for the plant

to be able to grow. Secondly, plants also exploit glucose as part of respiration, a process common to plants and humans, which involves converting the energy locked in the glucose fuel into a useable form.

Liquid solar fuels have an advantage not just in terms of their energy density but can also be used as chemical feedstocks.

Trying to mimic the incredible abilities of plants to convert and store the energy from sunlight has inspired the field of artificial photosynthesis. Artificial photosynthesis attempts to recreate a plant's abilities for light harvesting, conversion and storage by breaking the photosynthetic process into two key steps: water splitting, where the key goal is to oxidise water into hydrogen and oxygen, followed by the reduction of carbon dioxide to allow solar fuel synthesis. Professor Michael D. Heagy at New Mexico Tech and his research team are experts at designing materials tailored to promote the conversion of carbon dioxide to solar fuels using the power of light alone.

SOLAR FUELS
Recreating the energy conversion efficiency of plants in the laboratory with artificial photosynthesis is a tricky task, as is overcoming one of the other issues with solar energy: how to provide a continuous, uninterrupted energy supply from solar sources? The key to this lies in finding a way to store the solar energy, which is often done by using photo-induced chemical processes to create solar fuels. Probably the most famous example of a solar fuel, and the one created during the water-splitting stage of natural photosynthesis, is hydrogen. After using light-induced processes to form the hydrogen, this then acts as a fuel reservoir, as the hydrogen can later be burnt or converted to release the stored energy when required. However, while hydrogen is now being used directly as a fuel by hydrogen-powered cars, it is not an ideal solar fuel. For practical purposes, the hydrogen needs to be highly pressurised


Silver nanoparticles act as a plasmonic sensitiser in the Ag/Cu2O nanocomposite and augment solar-driven bicarbonate-to-formate conversion. Proposed mechanism for (A) the Cu2O semiconductor and (B) Ag/Cu2O. (Blue block arrow represents resonant energy transfer from metal to semiconductor.)

and is difficult to store safely owing to its flammability. This is why Professor Heagy is interested in using photochemical processes to create other kinds of solar fuels, including finding ways to efficiently produce formate, the negatively charged version of the simplest carboxylic acid, formic acid. Producing formate and other liquid fuels is not only advantageous in terms of their superior energy density; they can also be used as chemical feedstocks. Part of our global reliance on fossil fuels comes not just from using them as a source of energy, but also as chemical feedstocks for the manufacture of plastics and other chemicals. The idea of using methanol and dimethyl ether as alternatives to fossil fuels, for both energy and chemical synthesis, was championed by the Chemistry Nobel laureate George Olah, who was a strong advocate for the development of this 'methanol economy'.

MAKING MATERIALS
Professor Heagy's research focuses on designing photocatalysts and novel nanomaterials with specially designed structures for the reduction of bicarbonate to solar fuels. Bicarbonate exists in equilibrium with dissolved CO2 and is the predominant species at neutral pH. While plants have specific cellular and chemical architectures adapted for all the stages of the photosynthetic cycle, in the laboratory, driving the same processes requires the use of photocatalysts. Photocatalysts work by capturing the chemical reactants on their surface. Then, when they are illuminated, the photocatalyst helps to change the charge distribution of electrons between itself and the reacting molecules. This process results in a significant acceleration of the rate of chemical reactions, such as the reduction of carbon dioxide.
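Why bicarbonate predominates near neutral pH follows from the first acid-dissociation equilibrium of dissolved CO2 (CO2(aq) + H2O ⇌ H+ + HCO3−), with pKa1 of roughly 6.35, a commonly quoted value. A short illustrative sketch of the speciation arithmetic:

```python
# Henderson-Hasselbalch estimate of the dissolved-CO2/bicarbonate balance.
PKA1 = 6.35  # first dissociation constant of dissolved CO2 (typical textbook value)

def bicarbonate_fraction(pH):
    """Fraction of dissolved inorganic carbon present as HCO3-
    (ignoring carbonate, which only matters at high pH)."""
    ratio = 10 ** (pH - PKA1)      # [HCO3-] / [CO2(aq)]
    return ratio / (1 + ratio)

for pH in (5.0, 6.35, 7.0, 8.0):
    print(f"pH {pH}: {bicarbonate_fraction(pH):.0%} bicarbonate")
```

At pH 7 the estimate gives roughly four bicarbonate ions for every dissolved CO2 molecule, which is why targeting bicarbonate directly is a sensible design choice for photocatalysis in neutral water.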

One photocatalyst that Professor Heagy has been investigating for solar fuel production is iron oxide. In his team’s work on iron oxide nanostructures, they found that the photocatalytic activity for reducing bicarbonate to formate is strongly dependent on the shape and structure of the material. By using

Two crystal forms of ZnS were synthesised and evaluated for their photochemical properties. The wurtzite crystal form showed the highest AQE, of 0.9%, when using IPA as the hole scavenger. The AQE increased to 3.2% when the electron donor was changed to glycerol. Given the large gain in productivity and the potential renewable source, glycerol is a preferable solvent and positive-hole scavenger.
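Apparent quantum efficiency (AQE) compares the electrons stored in product with the photons supplied. A sketch of the arithmetic follows; the two-electron count for bicarbonate-to-formate reduction and the example quantities are our assumptions for illustration, and exact definitions vary between studies:

```python
def apparent_quantum_efficiency(product_mol, photons_mol, electrons_per_product=2):
    """AQE (%) = electrons consumed by product formation / incident photons * 100.
    Bicarbonate-to-formate is treated here as a two-electron reduction."""
    return 100 * electrons_per_product * product_mol / photons_mol

# Invented quantities chosen to reproduce the 3.2% figure quoted for glycerol:
# 1.6 micromol formate per 100 micromol incident photons.
print(apparent_quantum_efficiency(product_mol=1.6e-6, photons_mol=1.0e-4))  # 3.2
```

Read this way, the jump from 0.9% to 3.2% means the glycerol system turns over three to four times more photon energy into stored chemical energy for the same illumination.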

SEM image of (left) micron Cu2O, and (centre) nano Cu2O. (right) TEM images of nano Cu2O and Ag/Cu2O.

HR-TEM images of (left) micron Cu2O, (centre) nano Cu2O, and (right) Ag/Cu2O. (The inset shows magnified Ag nanoparticles of ~5nm surrounded by Cu2O. Scale of insert is 5 nm.)



Hierarchical nanoflowers were synthesised from earth-abundant ZnO and evaluated under solar AM 1.5 input as catalysts for the photochemical reduction of bicarbonate to value-added formic acid.

TEM and HR-TEM images of nP-Sphal (A, B) and nP-Wurtz (C, D).

Investigating how and where the carbon dioxide molecules bind to the surface will help to drive intelligent design of photocatalysts.

arrangements for the iron and oxygen atoms that gave the largest surface area for the bicarbonate to interact with, they could significantly increase the productivity of the catalysts for the production of formate. Iron oxide is also an appealing choice of material for this type of application as it is non-toxic and low-cost.

THE NEW ECONOMY
Iron oxide is not the only material that looks like a promising candidate for photoreducing carbon dioxide to formic acid. Professor Heagy's group has also been investigating zinc sulfide structures and copper oxides to see whether these would be more appealing photocatalysts for the production of formate, and how the mechanism of the reaction differs or is influenced by the shape of the particles, not just by changes in the surface area. His work on zinc oxides has shown that exotic nanostructures, such as nanorods, nanobelts and nanoflowers, seem to be better catalysts than amorphous, unstructured nanoparticles.

While Professor Heagy's group are developing new ways of controlling the miniature architectures of nanoparticles, a lot of their research involves trying to understand why and how the different earth-abundant photocatalysts work for these reduction reactions. Investigating how and where the dissolved carbon dioxide molecules bind to the surface will help to drive more intelligent, targeted design of nanoparticle structures. Alongside this, they are also investigating whether changing the solvent environment for the photocatalysis has an impact on the yield and rates of reaction.

This is important for such photocatalytic reactions as one of the goals is not just to have a practical, inexpensive reaction that is only driven by light, but to ensure all the reagents are non-toxic and as environmentally friendly as possible. While the reduction of carbon dioxide to organic compounds using metal oxides was first reported in 1979, it has become a very hot area of research recently. With the work of Professor Heagy and others improving our fundamental understanding of how some of the photocatalysts involved work and designing more efficient, effective materials that can be used for such processes, we are one step closer to realising the methanol economy.

Formate production in ppm with ZnO rods and flowers in 2-propanol and glycerol (left) and productivity (right).



Behind the Research Professor Michael D. Heagy

E: Michael.heagy@nmt.edu T: +1 575 835 6185 W: https://sites.google.com/nmt.edu/heagyhome-page/home

Research Objectives

References

Professor Michael D. Heagy’s work on novel nanostructures aims to improve the efficiency of solar fuel production.

Leonard D., Pan H. and Heagy M., (2015), Photocatalyzed Reduction of Bicarbonate to Formate: Effect of ZnS Crystal Structure and Positive Hole Scavenger, ACS Applied Materials and Interfaces, 7, 24543−24549

Detail
Dr Michael Heagy
NMT Department of Chemistry
801 Leroy Place, Daniel H. Lopez Chemistry Rm 115
Socorro, NM 87801, USA

Bio
Michael D. Heagy received an AB degree in Chemistry from Franklin & Marshall College, Lancaster, Pennsylvania, and a PhD under the direction of Nobel Laureate George A. Olah at the University of Southern California, Los Angeles. At the Massachusetts Institute of Technology, he conducted postdoctoral research with Prof Julius Rebek, Jr.

Funding
National Science Foundation

Collaborators
• Dr Hanqing Pan
• Dr Sanchari Chowdhury

Pan, H., Chowdhury, S., Premachandra, D., Olguin, S., Heagy, M. (2018) Semiconductor Photocatalysis of Bicarbonate to Solar Fuels: Formate Production from Copper(I) Oxide. ACS Sustainable Chemistry and Engineering, 6, 1872–1880.

Pan, H., Martindale, K., Heagy, M. (2018) Iron Oxide Nanostructures for the Reduction of Bicarbonate to Solar Fuels. Topics in Catalysis, 61, 601–609.

Pan, H., Risley, V., Martindale, K., Heagy, M.D. (2019) Hierarchical Zinc Oxide Nanostructures for the Photochemical Reduction of Bicarbonate to Solar Fuels. ACS Sustainable Chemistry and Engineering, 7, 1210–1219.

Personal Response

What are the remaining impediments to implementing the methanol economy?

Carbon dioxide capture remains a significant science and engineering challenge. While CO2 exists at high concentrations at sources such as fossil-fuel power plants, and often accompanies methane at oil-drilling operations, automobile exhaust and other dilute sources are difficult to capture. Given the hydrophilic nature of methanol, if no inhibitors are used it can be corrosive to certain metals such as aluminium and zinc, and existing pipelines designed for petroleum products cannot handle methanol. Until new pipeline infrastructure can be built, or existing pipelines are retrofitted for methanol transport, methanol requires shipment at higher energy cost in trucks and trains.

www.researchoutreach.org

21


Physical Sciences ︱ Dr Daisuke Inazu

Forecasting tsunamis using ship navigation records In recent years, the Earth has been rocked by devastating tsunamis. The ability to predict how intense they will be is crucial in deciding how to mitigate their effects. This is what makes the work of Daisuke Inazu at Tokyo University of Marine Science and Technology and fellow researchers so important. Their pioneering work in using the Automatic Identification Systems (AIS) on ships to monitor tsunami currents has huge potential to help developing countries forecast tsunamis so that they may reduce loss of life and infrastructure damage.

Tsunamis are among the most devastating and destructive natural occurrences, capable of causing mass mortality and infrastructure damage. Coastal cities have to find ways to reduce these losses. Although some losses can be mitigated through coastal defences, the immediate actions of individuals and organisations after a tsunami occurs are critical for reducing the loss of life that can often directly result. This is why it is so important to be able to tell when a tsunami is occurring, and how intense it is likely to be. Real-time forecasting of earthquakes and tsunamis is useful when making decisions for disaster mitigation. This is what makes the work of Daisuke Inazu at Tokyo University of Marine Science and Technology and fellow researchers so important. Inazu is no stranger to researching ways to predict the intensity of natural disasters. His research centres on using physics to understand and forecast oceanic events such as tsunamis, tides and storm surges. Funded by the Japan Society for the Promotion of Science and the Nippon Foundation, he is perfectly positioned to research the likelihood of tsunami occurrences.

Our university vessel "Umitaka-Maru" also transmits AIS messages, serving as one of the AIS-derived tsunami current meters.


Most current systems for detecting tsunamis use seismic wave observations. Seismic waves are usually generated by movements of the Earth's tectonic plates, but they can also be caused by volcanoes and landslides that occur underwater. Seismic wave detection technologies are useful for predicting the magnitude and intensity of an earthquake, but are not always reliable for forecasting tsunamis. This is because they estimate elastic deformation that may be only indirectly related to tsunamis, and are sometimes unable to accurately predict how a tsunami will form; tsunamis can also be generated by landslides that often accompany earthquakes and volcanic activity. Tsunami forecasting is therefore much more difficult than simply predicting the magnitude of an earthquake. It is better to use direct observations of offshore sea level to estimate the resultant size of tsunamis. Many systems use this method, such as the Deep-ocean Assessment and Reporting of Tsunamis (DART) buoy forecasting system. However, while these observatories are reliable, they are also incredibly expensive to maintain and require regular replacement, typically several decades after installation.


Ship distributions derived from terrestrial (orange) and satellite (blue) AIS. Adapted from Vesseltracker.com.

Since gigantic tsunamis typically occur only once in tens to hundreds of years at any given place, the cost of maintaining and replacing parts of these observatories adds up. Offshore tsunami forecasting needs to be sustainable as well as reliable, so it is important that other, more sustainable ways of monitoring and forecasting tsunamis are developed. Inazu and his fellow researchers have conducted pioneering research into solving this issue, looking at the possibility of using Automatic Identification System (AIS) data to predict the current and height of tsunamis.

WHAT IS AIS DATA?
AIS data is the navigation record of ships. It includes a ship's latitude and longitude, speed over ground (SOG), course over ground (COG) and ship heading (HDG). All of this information is used to determine the ship's position and where it is going. AIS data is collected under International Maritime Organization (IMO) regulations, which require all ships exceeding 300 gross tonnage, and all passenger ships, to send this data via very-high-frequency (VHF) radio transmission. AIS data from ships close to shore can be received by coastal stations, while data from ships further offshore can be received by low-Earth-orbit satellites. The number of ships with recognised AIS data is increasing annually with the growth of seaborne trade and the number of satellites. Real-time ship distributions

can be derived from AIS data on many websites. AIS data is readily accessible for a price, and there are ongoing efforts to make it more accessible in the future.
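As a rough illustration of what one AIS position report contains (the class and field names below are our own shorthand for this sketch, not an official AIS message schema), a single ship's record might be modelled like this:

```python
from dataclasses import dataclass

# Hypothetical sketch of a dynamic AIS position report. Real AIS messages
# are binary NMEA payloads, but they carry essentially these fields.
@dataclass
class AISPositionReport:
    mmsi: int         # unique ship identifier
    latitude: float   # degrees north
    longitude: float  # degrees east
    sog: float        # speed over ground, knots
    cog: float        # course over ground, degrees
    hdg: float        # ship heading, degrees

report = AISPositionReport(mmsi=431000000, latitude=35.62,
                           longitude=139.77, sog=12.3, cog=87.0, hdg=90.0)
print(report.sog)  # -> 12.3
```

Note that COG and HDG are recorded separately: it is precisely the difference between them that becomes interesting when a current pushes a ship sideways.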

HOW CAN AIS DATA HELP TO FORECAST TSUNAMIS?
While AIS data has not traditionally been used to forecast tsunamis, it is predicted to be useful because it reveals sudden changes in the position of ships, which could help to monitor tsunami currents and predict wave heights. Inazu and his research team are working to fill the gap in research on using AIS data to forecast tsunamis. Using data from the devastating 2011 Tohoku earthquake and the resultant tsunami, they have investigated the relationship between ships' horizontal drift and tsunami current. The 2011 Tohoku earthquake caused a tsunami that reached coasts with significant wave heights (greater than 10 m) a few tens of minutes after the tsunami was generated. Inazu and his research team used AIS data obtained from 16 different ships that were out at sea during this tsunami. They found that measurements obtained from AIS data during the 2011 Tohoku tsunami showed significant deviations under tsunami conditions. Through a quantitative investigation, these deviations were used as a proxy to accurately measure the tsunami current, and could help to predict the tsunami

An image of tsunami current detection.



Historical tsunami sources around the Pacific Ocean. Adapted from the International Tsunami Information Center.

source and wave height at the coast of the 2011 Tohoku tsunami. The AIS data obtained from these ships indicated that COG significantly deviates from HDG as the tsunami passes the ship. Using a mathematical model, they proved that ships immediately respond to tsunami current by moving in a diagonal direction with an equivalent velocity. This research shows that crowdsourcing AIS data will help to predict

Diagram labels: ship heading; course/speed over ground; ship velocity component in the heading-normal direction; tsunami current component in the heading-normal direction. The ship velocity component in the heading-normal direction is a clear proxy of the tsunami current in the same direction.
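The proxy described above can be sketched numerically. Assuming the ship's velocity vector has magnitude SOG along direction COG, its component perpendicular to the heading HDG is SOG·sin(COG − HDG). The function below is our own illustrative sketch of that geometry, not code from Inazu's study:

```python
import math

def heading_normal_component(sog_knots, cog_deg, hdg_deg):
    """Ship velocity component perpendicular to the heading, in m/s.

    A non-zero value means the course over ground is deflected away from
    the heading - the sideways drift used as a proxy for tsunami current.
    """
    sog_ms = sog_knots * 0.514444  # knots to metres per second
    return sog_ms * math.sin(math.radians(cog_deg - hdg_deg))

# With no deflection the component is zero; a 5-degree deflection at
# 10 knots corresponds to a sideways drift of roughly 0.45 m/s.
print(round(heading_normal_component(10.0, 90.0, 90.0), 2))  # -> 0.0
print(round(heading_normal_component(10.0, 95.0, 90.0), 2))  # -> 0.45
```

Even small angular deviations between COG and HDG therefore translate into measurable current estimates, which is why a fleet of ordinary ships can act as a distributed current meter.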


the source, magnitude and wave height of a tsunami. What is so brilliant about this way of measuring tsunamis is that it is possible within the current AIS framework. The number of ships recognised by the AIS is continually increasing, meaning more and more data can be used around the world to forecast and measure tsunamis. This novel method of crowd-sourcing data has huge potential to change the way that coastal populations forecast tsunamis. Although more could be done to increase AIS data reliability, Dr Inazu's research team is seeking to enhance the sensitivity of AIS data to tsunami currents by using real-time coastal and satellite AIS data. Using AIS data is a relatively cheap and promising way of measuring and forecasting tsunamis in the future.

THE FUTURE FOR THIS TECHNOLOGY
While countries such as Japan already employ offshore observatories to forecast and predict tsunamis, many developing countries do not have the economic resources needed to set these up. However, due to the low cost of obtaining AIS data, it is likely that coastal cities in Southeast Asia and South America will be able to use data obtained from ships to monitor and forecast tsunamis. Tsunami disaster mitigation is of paramount importance in the many coastal areas where large populations reside, which is what makes the work of Inazu and his fellow researchers so vital. The use of AIS data for tsunami current monitoring may be a sustainable way of mitigating tsunamis in areas where preventing loss of life and infrastructure damage matters so much.


Behind the Research Dr Daisuke Inazu

E: inazud@m.kaiyodai.ac.jp T: +81 3 5463 0417 W: https://sites.google.com/site/inazud4ocean/

Research Objectives

References

Dr Inazu and his fellow researchers have conducted pioneering research into developing effective ways of monitoring and forecasting tsunamis by looking at the possibility of using Automatic Identification System (AIS) data of offshore navigating ships.

Tang, L., Titov, VV., Moore, C., Wei, Y. (2016). ‘Real-time assessment of the 16 September 2015 Chile tsunami and implications for near-field forecast’. Pure Applied Geophysics: Volume 173, pages 369–387. https://doi.org/10.1007/s00024-015-1226-3

Detail
Department of Marine Resources and Energy, Tokyo University of Marine Science and Technology, 4-5-7 Konan, Minato, Tokyo 108-8477, Japan

Bio
Daisuke Inazu is an Associate Professor in the Department of Marine Resources and Energy at Tokyo University of Marine Science and Technology. He earned a PhD in Geophysics from Tohoku University, Japan, in 2007. His research has mainly been based on numerical simulation of ocean physics, including tsunamis, tides, and storm surges.

Funding
• Japan Society for the Promotion of Science (JSPS)
• The Nippon Foundation

Collaborators
• Tsuyoshi Ikeya, Tokyo University of Marine Science and Technology, Tokyo, Japan
• Takuji Waseda, Toshiyuki Hibiya, UTokyo Ocean Alliance, The University of Tokyo, Tokyo, Japan
• Yoshinori Shigihara, National Defense Academy, Kanagawa, Japan

Proud, R., Browning, P., Kocak, DM. (2016). 'AIS-based mobile satellite service expands opportunities for affordable global ocean observing and monitoring'. OCEANS 2016 MTS/IEEE Monterey. https://doi.org/10.1109/OCEANS.2016.7761069

Kong, Q., Allen, RM., Schreier, L., Kwon, Y-W. (2016). 'MyShake: A smartphone seismic network for earthquake early warning and beyond'. Science Advances: Volume 2, page e1501055. https://doi.org/10.1126/sciadv.1501055

Inazu, D., Ikeya, T., Waseda, T., Hibiya, T., Shigihara, Y. (2018). 'Measuring Offshore Tsunami Currents using Ship Navigation Records'. Progress in Earth and Planetary Science: Volume 5, page 38. https://doi.org/10.1186/s40645-018-0194-5

Personal Response
What first piqued your interest in finding effective ways to forecast tsunamis?
My research idea was actually inspired by the smartphone seismic network proposed by the University of California, Berkeley. They utilise the accelerometers embedded in our smartphones as crowd-sourced seismic monitors. Our smartphones are onshore – what about offshore? I speculated that tsunamis could be monitored using information from ships navigating the global ocean, and then decided to study AIS data in detail.


Physical Sciences ︱ Professor Motoichi Ohtsu

The Dressed Photon: Shining light on the unknown using the unconventional area of off-shell science The seas of science are unrelenting: researchers have spent lifetimes trying to attain the unattainable – and fallen overboard into obscurity when they were unsuccessful. In the disciplines of Quantum Field Theory and Materials Science, the ability to create light emitting devices from silicon has long been seen as a perpetual white whale. But, as Professor Motoichi Ohtsu, of Research Origin for Dressed Photon (Japan) has demonstrated, anything can be achieved if you choose the right tools. In his case, an exotic particle known as the Dressed Photon is exactly what was needed to conquer the unconquerable.

Back in 1981, when the first CDs were released, scientists inadvertently spawned an audio revolution. Digital audio had already been around for about a decade at that point, but it was now possible to play it from a storage device that could be held in the hand. Records and audio cassettes were relegated to the dusty cupboards of history or the pristine shelves of connoisseur collectors. In the mid-90s, a similar thing happened with video: VHS gave way to DVD, and we could now squeeze around four times the data that a record could hold onto a disk taking up less than 16% of the area. In 2004, when the first ever Blu-ray was released, the quality of our home entertainment catalogues rocketed, with a whopping 25GB of data now easily stored on that same small disk – that's 60 times the data that a record can contain. Have you ever wondered what was behind these leaps and bounds in home entertainment?

In this particular case, advances in laser technology are what made the difference. Data is etched onto such devices using lasers – and read in the same way. The size that one bit of data takes up on a CD, DVD or Blu-ray depends on one thing only: the wavelength of light used to read and write it. For CDs, a 780 nanometre (nm) laser was used (that's a laser producing light with a wavelength of 780 thousand-millionths of a metre!). DVDs and Blu-rays used 650nm and 405nm laser light respectively. So why don't they just create a laser with a wavelength of 1nm and be done with it?

Well, when it comes to light emitting diodes (LEDs) and lasers, scientists have long been struggling with the same problem: the wavelength of light emitted by a device is intrinsically linked to the underlying atomic structure of the materials used. In other words, scientists can't just decide which wavelengths they want to produce: they have to find materials, or engineer specific combinations of materials, that will give the results they are looking for. That's why advances in this technology come in apparently sporadic leaps and jumps.

Dispersion relations for on-shell and off-shell quantum fields.

Professor Motoichi Ohtsu, of Research Origin for Dressed Photon (Japan), has been able to make another leap forward in this field by creating the first high-powered silicon (Si) LED and laser


devices. He has achieved this by exploiting the peculiarities of the recently discovered Dressed Photon.

THE DRESSED PHOTON
To understand what a Dressed Photon is, you first have to understand that the physical world is largely made up of tiny building blocks of matter and energy. Atoms are the most widely known, but they too have their own constituent parts: neutrons, electrons and protons to name a few; and then there are their more exotic cousins such as quarks, muons, pions and bosons (amongst others). You might even have heard of the famous Higgs Boson, the so-called God particle, whose existence was finally proven at CERN in 2012. The fact is that our universe, which is bigger than we can possibly imagine, is absolutely teeming with tiny particles that are so small we can't even see them. A photon is simply a particle of light (when light chooses to behave like a particle, that is). Particles of all shapes and sizes interact with their environment in very specific ways. For example, photons can – and do – knock electrons out of their orbits on the outskirts of atoms, but only for a limited time: the ousted electron usually returns, sending the photon off with a swift kick and a burst of light. Scientists call this process excitation and emission. It is a phenomenon that has been exploited throughout science to produce many of the technological wonders we currently take for granted. How a particle interacts with its surroundings can usually be described in very precise, reliable ways.
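The energy a photon carries is fixed by its wavelength through the standard relation E = hc/λ. As a quick illustration (textbook physics, not part of Professor Ohtsu's own work), here are the photon energies for the CD and Blu-ray laser wavelengths mentioned earlier:

```python
# Photon energy E = h*c/lambda for the laser wavelengths in the article.
H = 6.62607015e-34    # Planck constant, J s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electronvolt

def photon_energy_ev(wavelength_nm):
    return H * C / (wavelength_nm * 1e-9) / EV

# Shorter wavelengths mean more energetic photons:
print(round(photon_energy_ev(780), 2))  # CD laser      -> 1.59 eV
print(round(photon_energy_ev(405), 2))  # Blu-ray laser -> 3.06 eV
```

This is why the choice of material matters so much: a device can only emit photons whose energies match transitions available in its atomic structure.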

Fabrication and operation of a Si-LED.

Top: Fabrication by the Dressed Photon-Phonon Assisted Annealing. Bottom: Operation, exhibiting Photon Breeding.

Sometimes, though, particles behave unexpectedly: they appear to break the laws of conventional physics, having more influence on their surroundings than their size, or other measurable qualities, indicates they should. Scientists call these Dressed Particles and, as Professor Ohtsu has demonstrated by exploiting a particularly special case, the Dressed Photon, they are beginning to lead us in completely new directions.

OFF-SHELL SCIENCE: SCOUTING THE HINTERLANDS OF QUANTUM FIELD THEORY
Physics can be divided into many different areas – classical physics, relativistic physics, quantum physics (to name a few) – and one of the great problems of our time is that these areas don't always play nicely with each other. Quantum Field Theory (QFT) was developed to try and bridge the gap between classical field theory, special relativity, and quantum mechanics, and is used to represent the world of sub-atomic particles. In QFT, when a particle obeys a set of equations from classical physics, we say that it is an example of on-shell science.

Optical output powers and spectra of a Si-LED and a Si-laser. Top left: optical output power of the Si-LED. Top right: spectral profile of the light emitted from the Si-LED. Bottom left: output optical power of the Si-laser of 15 mm length. Bottom right: output optical power of the Si-laser of 30 mm length. Open squares are copies of those in the left figure.



Dressed photon: creation (left) and detection (right).

The terminology comes from the fact that these specific equations relate to the particle's energy, momentum and what physicists like to call its mass-shell (but which can be thought of as plain old mass for our purposes). When a particle doesn't obey these equations of motion, it is an example of off-shell science – an area of physics we still have much to learn about. 'The Dressed Photon has a lot of unique features which have never been described by the conventional quantum field theories that treat only the phenomena of on-shell science,' explains Professor Ohtsu. Off-shell science is an extremely fertile area of physics and nobody really knows what secrets it might unlock in the future. Back in the year 1960, when a couple of unassuming physicists built the first laser, it didn't really have a purpose. Today, as we have already seen, it is at the heart of our home entertainment systems and so much more: laser technology is central to our current push towards nuclear fusion, which will unlock an essentially limitless energy supply. If achieved, it will change the course of history forever.

HIGH-POWERED LEDS AND LASERS USING SILICON CRYSTALS
Professor Ohtsu has used this fertile area of science to develop the world's first

high-powered LEDs and lasers from silicon crystals. This is important because, after oxygen, silicon is the most abundant material in the Earth’s crust: for more than half a century it has been the primary material used in electronics and, as a result, a worldwide infrastructure is already in place for creating silicon-based devices. In other words, it would be extremely cost effective to develop light emitting devices based on silicon. On the other hand, silicon is widely believed to have a low light emitting efficiency and, as a result,

Structure and photograph of a Si-laser

Cross-sectional structure (top) and a photograph (bottom) of a high-power Si laser of 15 mm device length.


is thought of as an extremely difficult material to work with when creating light emitting devices. Rather than be put off by such a seemingly unattainable challenge, scientists around the world have set it as their target – intent on catching the white whale of materials science. To create his devices, Professor Ohtsu has not engineered a new combination of materials but has developed a new method for engineering them. His new approach, which he calls Dressed Photon-Phonon Assisted Annealing, is a nanofabrication technique that uses off-shell science to produce light emitting materials in a novel way. Using a technique called Joule heating, and flooding the material with light during the process, he is able to lock in specific wavelengths in a way that has never been seen before. 'The unique feature of the fabricated device is that the wavelength of the emitted light is equivalent to that of the light irradiated during the annealing,' explains Professor Ohtsu. 'This surprising feature is named "Photon Breeding", which has never been observed in conventional LED and laser devices.' In one process, Professor Ohtsu has simultaneously achieved the apparently unachievable and unlocked new information about the world that we live in. Not only will his devices be extremely useful in many aspects of science and technology, but the surprising new science he has uncovered provides a new direction for researchers in their efforts to understand the unknown. Chipping away at the vast wall of the undefined and filling the small cracks with knowledge and understanding is how science progresses. Who knows what else might be uncovered by shining the light that Professor Ohtsu has created with his research?


Behind the Research Professor Motoichi Ohtsu E: ohtsu@rodrep.or.jp T: +81 90 1603 0562 W: http://rodrep.or.jp/

Research Objectives

References

Professor Motoichi Ohtsu uses the Dressed Photon in his new method to create silicon-based light emitting diodes and lasers.

Sakuma, H., Ojima, I. and Ohtsu, M. (2017). Dressed photons in a new paradigm of off-shell quantum fields. Progress in Quantum Electronics 55 (2017) 74-87.

Detail
Prof Dr Motoichi Ohtsu, (General Incorporated Association) Research Origin for Dressed Photon (RODreP), 3-13-19 Moriya-cho, Kanagawa-ku, Yokohama, Kanagawa 221-0022, Japan

Bio
Motoichi Ohtsu was awarded his Dr. Eng. from the Tokyo Institute of Technology, where he went on to become an Associate Professor and Professor. Later, he became a Professor at the University of Tokyo. He is now Professor Emeritus at both the University of Tokyo and the Tokyo Institute of Technology. Professor Ohtsu is also Director-in-chief of the research institute "Research Origin for Dressed Photon". He has published 560 papers and 80 books, received 87 patents, and been awarded 20 prizes.

Collaborators
• Dr I. Ojima (RODreP)
• Dr H. Sakuma (RODreP)
• Prof T. Kawazoe (Tokyo Denki Univ.)

Ohtsu, M. (2016). Silicon Light-Emitting Diodes and Lasers. Springer, Heidelberg, 2016.

Personal Response Your work takes place in an extremely multidisciplinary area, simultaneously uncovering new phenomena in science and having practical uses in technologies around the world – which do you find most interesting? It is most interesting to draw a precise physical picture of a dressed photon. Also, from the technical point of view, it is essential to investigate the mechanism of fabricating and operating Si-LEDs and Si-lasers. Do you feel that one area is more beneficial to the public? No, I do not think so because all the areas I have developed are based on off-shell science and are mutually correlated. Their progression in an organised manner will be effective to establish novel industries and markets, which may be more beneficial to the public. What do you think are the main differences between carrying out research within industry and academia? On-shell science has been almost saturated. Industry and academia should keep going in different directions until they get new products from the fertile field of off-shell science. They are: Academia: Focus on original basic research! Do not hop on the latest fashion! Industry: Produce novel technology! Encourage young engineers! Industry-academia collaboration is nothing more than a by-product of basic research.



Physical Sciences ︱ Valeriia Reveko

Setting new horizons in electroplating of zinc die castings Our supply of zinc, a metal abundant in our planet's crust, could run out by 2100 unless we begin to change the way we use it. Collini Holding AG is creating a step change by uncovering and addressing the core causes of common faults in plating on zinc die castings, and by investigating this material and more sustainable ways of finishing its surface.

With over 7 billion people on the planet, there will inevitably be increasing demand on the world's natural resources. Natural gas, oil and coal are some of the key natural resources in danger of being depleted. Yet there are other resources provided by our planet that, although integral to our daily lives, will eventually vanish unless we begin to change the way we use them. Zinc and its alloys are essential to manufacturing processes because of properties such as castability and performance, while also offering significant energy and cost savings. However, this metallic element is a profound example of a vanishing natural resource: its supply from our planet's crust is predicted to last only until 2100. This is exactly why

Collini Holding AG, the leading group of companies in coating metals and plastics in Europe, has been investigating and developing surface solutions to render zinc alloys more sustainable and efficient. Founded as a grinding shop over 120 years ago, Collini is now an industrial surface treatment company, creating surfaces for fittings, the electrical and automotive industries, machine, plant and building construction, and also for medicine and various consumer goods. The company has thirteen production sites in Austria, Germany, Italy, Russia, Mexico and Switzerland that specialise in a wide range of surface technologies, including electroplating, hot-dip galvanizing, anodizing and organic coating.

ZINC DIE CASTING
From the toy cars that children play with to the metal parts used in actual cars, a huge variety of consumer parts is made using the die casting process. In fact, die casting is a manufacturing process that has been around for more than 180 years and was initially invented to produce portable typewriters. Essentially, the process involves heating metal alloys until they become molten and then pressing them between steel moulds until they cool down and solidify in the required shape. Zinc alloys are among the most popular choices for die casting because they are easier to cast and solidify at a lower temperature than alternatives such as aluminium, making the die casting process cheaper and more efficient. Zinc is a hard, ductile, self-lubricating material with high thermal conductivity and dimensional stability. However, unalloyed zinc is brittle and weak, and it is susceptible to corrosion in acidic and/or



strong alkaline environments. Therefore certain elements, like aluminium or copper, are added to obtain zinc alloys with superior properties. However, adding those dissimilar elements opens the way to corrosion, a process of metal deterioration. Electroplated coatings are therefore a necessity in order to change the surface properties: metal cations are reduced from a solution with the help of an electric current, forming a protective metallic coating. Furthermore, despite the long use of electroplating as the surface treatment for it, zinc die casting persistently generates several challenges for plating, especially when it comes to decorative use (Reveko and Møller, 2018). Consequently, there are a variety of challenges associated with zinc die casting that have recently become the primary research focus of the Austrian surface treatment company Collini Holding AG in

SPECIAL ASPECTS OF ELECTRODEPOSITION ON ZINC DIE CASTINGS Zinc die casting part in cross section: difference in structure between the surface and the bulk.

By 2100, it is anticipated, the supply of most metals will run out.

conjunction with the Technical University of Denmark (DTU). Their work has resulted in a series of studies by Collini scientist Valeriia Reveko.

ZINC PLATING COLOUR CHANGE
Valeriia Reveko, a PhD candidate in the Department of Mechanical Engineering, DTU, has published several peer-reviewed research studies that highlight the distinct challenges associated with the electroplating of zinc die-cast parts. More specifically, Reveko has conducted thorough morphological and compositional tests and analyses to verify the performance and safety of zinc galvanic coatings on zinc die-cast items and to improve plating quality. For instance, as noted in Reveko, Lampert, Winther

and Møller (2018), one of the most common issues with zinc coating over zinc die-cast components is a distinct blue discolouration – blue areas on the surface of the electroplated zinc coatings. In this study, Reveko and co-workers suggest that aluminium from the substrate gradually diffuses through the coating and drives the appearance of these blue areas as a result of oxidation under the influence of ambient moisture and potential contaminants. Furthermore, and perhaps more importantly, the environment in which zinc plating occurs – an alkaline or acidic solution – directly influences the rate of aluminium diffusion: alkaline zinc demonstrates higher rates of diffusion than acidic zinc due to its


morphology. Hence, as suggested by the Collini team, a viable and attractive solution can be a double-layered zinc coating in which the inner layer is mildly acidic zinc and the outer layer is alkaline zinc.

INACCURATE ANALYSIS OF CHROMIUM-PASSIVATED ZINC SURFACES
In principle, trivalent chromium conversion coating is a very efficient way to prevent corrosion of zinc coatings while meeting environmental and safety requirements. However, some recent research studies have reported the presence of hexavalent chromium in such surface passivation layers.

Scanning electron microscope images with Energy-dispersive X-ray spectroscopy analysis showing Al presence in zinc plated, zinc die-casted component before (left) and after (right) blue discolouration.
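The electroplating described earlier – reducing metal cations from solution with an electric current – is governed by Faraday's law of electrolysis. The sketch below is a generic back-of-envelope calculation (assuming an idealised 100% current efficiency, which real plating baths do not reach), not a description of Collini's actual process parameters:

```python
# Faraday's law: mass deposited = M * Q / (n * F), with charge Q = I * t.
FARADAY = 96485.0  # coulombs per mole of electrons
M_ZINC = 65.38     # molar mass of zinc, g/mol
N = 2              # electrons per ion: Zn2+ + 2e- -> Zn

def zinc_deposited_grams(current_amps, seconds):
    charge = current_amps * seconds  # coulombs passed through the cell
    return M_ZINC * charge / (N * FARADAY)

# Two amps for one hour deposits roughly 2.44 g of zinc in the ideal case.
print(round(zinc_deposited_grams(2.0, 3600.0), 2))  # -> 2.44
```

In practice the deposited mass, and hence the coating thickness and quality, also depends on bath chemistry and current efficiency – which is exactly why the choice between acidic and alkaline zinc baths matters.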



The Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) regulation has banned hexavalent Cr coatings because they are carcinogenic. In addition, 4% of people in Europe and 5% in America are allergic to chromium, and hexavalent Cr is much more likely than the trivalent form to cause a skin reaction such as dermatitis. In general, the colourimetric 1,5-diphenylcarbazide (DPC)-based spot test is one of the principal methods used to identify hexavalent chromium on various metallic and leather surfaces. However, as noted in Reveko, Lampert, Din, Thyssen and Møller (2018), DPC testing on trivalent chromium

Zinc pressure die casting: a conventional hot chamber die casting machine.

Research at Collini intends to reveal and address the causes of common plating faults and support the sustainable use of materials.

passivated zinc surfaces can actually be misleading. X-ray photoelectron spectroscopy (XPS) measurements performed in this research study did not verify the presence of hexavalent chromium previously identified on such surfaces by DPC. The study suggests that unintended oxidation of the DPC indicator, in reaction with substances present

on the surface because of atmospheric corrosion can be the reason for these falsepositive results. The importance of this study is not limited to the impeccable level of quality control needed in the European metal industry, but also because these false-positive results can unavoidably result in the prescription of erroneous treatment to patients with contact dermatitis.

The zinc responsible usage cycle: abundant natural resources, sustainable production, responsible consumption, increased service life and efficient zinc recycling combine to reduce costs, improve profitability and decrease environmental impacts.

SUSTAINABLE SOLUTIONS
This research would be remiss if it did not consider present-day ecological sustainability challenges. Coated metal alloys consist of parts with multiple coating layers of different metals that are tricky to recycle, because separating metals such as chromium and nickel – metals traditionally used to coat zinc alloys – is an energy-intensive process. Researchers have therefore shifted their focus to coating zinc alloys with zinc coatings, because this reduces the overall number of metals used and so makes the recycling process easier. This is exactly why Collini has turned its attention towards improved approaches to the surface treatment of zinc die-cast parts. Using recycled zinc is energetically favourable: its energy requirements are less than 10% of those for producing primary zinc. Essentially, the research presented in this article is directly connected to the growing industrial need for recyclable zinc components.

CONCLUSIONS
This research conducted by Collini intends to identify and evaluate the core causes of common plating faults while supporting the sustainable use of materials. Valeriia Reveko’s study provides greater insight into the electroplating of zinc die castings while, at the same time, evaluating the hidden challenges of zinc-plated and zinc die-cast components and of trivalent chromium-passivated zinc surfaces.


Behind the Research Valeriia Reveko

E: vreveko@collini.eu T: +43 664 6105 729 W: collini.eu

www.linkedin.com/in/valeriia-reveko-10047a118/

Research Objectives

References

PhD candidate Valeriia Reveko from the Austrian electroplating company Collini has led a series of studies looking into how to make zinc alloys more sustainable.

Valeriia Reveko and Per Møller (2018). ‘Special Aspects of Electrodeposition on Zinc Die Castings’. NASF Surface Technology White Papers, 82 (8), pp. 1-9, Article Post: 3/19/2018.

Detail

Valeriia Reveko, Felix Lampert, Grethe Winther and Per Møller (2018). ‘Change of the Decorative Properties of Zinc-Plated Zinc Die Castings Over Time’. International Journal of Metalcasting, pp. 1-7, https://doi.org/10.1007/s40962-018-0237-0.

Collini GmbH, Schweizer Str. 59, 6845 Hohenems, Austria Bio Valeriia Reveko is a Product- and Process Development engineer at Collini GmbH, Austria. She holds an MSc degree in technical electrochemistry from Kiev Polytechnic Institute and is conducting a PhD project in materials science and surface technology at the Technical University of Denmark (DTU). Funding The research was funded by Collini GmbH, Hohenems. Collaborators • Professor Per Møller, Technical University of Denmark (DTU).

Valeriia Reveko, Felix Lampert, Rameez Ud Din, Jacob P. Thyssen, Per Møller (2018) ‘False-positive result when a diphenylcarbazide spot test is used on trivalent chromiumpassivated zinc surfaces.’ Contact Dermatitis, Volume 78: Issue 5, pp. 315-320, doi:10.1111/cod.12955.

Personal Response

Why do you think it’s important to find ways to recycle zinc?

It is my position that in today’s world, we cannot continue to use natural resources in such an exploitative manner as we currently do. Our planet has given us a great deal, and the least we can do in return is to use these resources wisely. Zinc, copper, nickel, steel, aluminium, plastic – before putting these materials on the market, we must carefully consider the whole product lifecycle, aiming for sustainable solutions; and the surface treatment industry can be a conductor who makes a real impact on this. ‘Applied surface intelligence’ is the motto at Collini, and we all work together with the aim of using materials more effectively.



Physical Sciences ︱ Professor Toshikazu Ikeda

Evaluating students’ perceptions of the roles of mathematics in society Toshikazu Ikeda, Professor of Mathematics Education at Yokohama National University, has found that while mathematical modelling is often evaluated with respect to mathematical attributes, little academic consideration has been given to the non-mathematical viewpoint. To fill this knowledge void, he has developed an analytical tool to evaluate the changes in students’ perceptions of the roles of mathematics in society following an experimental teaching program.

Having students recognise the roles that mathematics takes in society is a significant aim in the teaching and learning of mathematical modelling. This issue has been considered by academics since the 1980s. There have been several studies investigating students’ beliefs about the value of mathematics in society, but the many roles that mathematics plays in society have received little attention in the literature.

CLASSIFICATION OF MATHEMATICAL MODELS
Mathematical models are often classified according to their various attributes: whether they are concrete or abstract, descriptive or analytical, together with their underlying mathematical basis, such as algebra, geometry or statistics. While useful, these categories are essentially founded on mathematical perceptions. Toshikazu Ikeda, a Professor of Mathematics Education at Yokohama National University, considers the non-mathematical viewpoint in his study

and has developed an analytical tool to assess how students perceive the roles of mathematics in society.

METHODOLOGY
The Likert-scale method, where items are usually rated from ‘strongly agree’ to ‘strongly disagree’, is often used to assess students’ awareness of how useful mathematics is in the real world. While this is a simple and effective tool, it does not capture detailed information about students’ perceptions of the roles of mathematics in society. Professor Ikeda has developed four categories which are centred on why mathematical models are used in society. These combine three standpoints: personal-societal perspectives, clarity of role statements, and specific-general contexts.

CATEGORIES
In the first category, students see mathematical modelling only from a personal perspective. For example, they realise the usefulness of carrying out their personal financial calculations, but ignore the impact on society. The second category has students adopting some social perspective; however, their responses are not specific. For instance, they will acknowledge that mathematics is useful in society but not mention how it is useful or provide examples.

Experimental teaching: the number of years to double their savings.


In the third category, students assume a societal viewpoint and refer to a specific context, such as how mathematics can help a particular company reduce its manufacturing costs, but they do not include a general context.


Mathematical modelling example: night time in Norway. The real-world situation (‘Why is daytime so long? In Japan, it is pitch-dark at 10:40pm’) is translated into a concrete model, then a geometric model, then the symbolic model y = (2/15)·cos⁻¹{tan(23.4°)·tan(x°)}, where x is the north latitude and y the hours of night time, and finally a graphic model. From the mathematical model, we can understand that the hours of night time dramatically decrease in Norway. An example of using mathematics to explain and predict phenomena.
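The symbolic model from the night-time problem can be evaluated directly. The sketch below is our illustration of that formula; the clamping for latitudes above the Arctic circle (where the sun never sets at the solstice) is our addition:

```python
import math

# Hours of night at the summer solstice as a function of north latitude x
# (degrees), using the symbolic model from the night-time problem:
#   y = (2/15) * arccos(tan(23.4 deg) * tan(x deg)),  with arccos in degrees.
# The 2/15 factor converts the arc to hours: the Earth rotates 15 degrees
# per hour, and the night arc is traversed symmetrically on both sides.
def night_hours(latitude_deg: float) -> float:
    product = math.tan(math.radians(23.4)) * math.tan(math.radians(latitude_deg))
    product = max(-1.0, min(1.0, product))  # clamp: beyond ~66.6 deg, no night
    return (2.0 / 15.0) * math.degrees(math.acos(product))

print(night_hours(0.0))   # equator: 12 hours of night
print(night_hours(35.0))  # roughly Japan's latitude
print(night_hours(66.6))  # Arctic circle: night shrinks to (almost) nothing
```

Evaluating the model for increasing latitudes reproduces the behaviour described in the figure: the hours of night time fall away sharply towards northern Norway.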

Having students recognise the roles that mathematics takes in society is a significant aim in the teaching and learning of mathematical modelling.

The fourth category has students adopting a societal perspective and referencing a general context, such as using mathematics to explain phenomena, contribute to decision making, or assist with designing objects.

INVESTIGATION
Professor Ikeda conducted an experimental program with a ninth-grade class of approximately 30 students. The procedure was carried out over two years (2007–2008) at a junior high school in Japan with the same teacher during both years. A teaching program comprising nine lessons was devised. These lessons took place once a week from September to November each year and were carried out by the classroom teacher. The first eight lessons each lasted 100 minutes and the ninth lesson took 50 minutes.

THE EXPERIMENTAL TEACHING PROGRAM
The program of lessons concentrated on three roles of mathematics in society: understanding, making decisions, and designing. Eight modelling tasks were selected. The ‘understanding’ tasks involved investigating reflections of a face in a mirror and calculating probabilities in rock-paper-scissors. During the ‘making decisions’ tasks, students examined

a bank interest system, calculated the number of years to double their savings and explored the position of tennis serves. The ‘designing’ tasks included investigating the shapes of cans, exploring the structure of a bicycle reflector and designing a parking space.

The students were asked to write down their responses to the pre-program question: ‘How is mathematics useful when we examine real-world problems from various perspectives?’ before commencing the program of nine lessons. During the first seven lessons, the teacher conducted group discussions regarding the modelling process. These were followed by problem-solving periods where the teacher presented one of the modelling problems. The students worked on the problem in groups of 4 or 5 and

TO MAKE MONEY DOUBLE

X:  1   2   3   4   5   6   7   8   9   10
Y:  70  36  24  18  15  12  11  10  9   8
a:  70  72  72  72  75  72  77  80  81  80

Value of a – calculation of average: a = 75.1, giving y = 75.1/x.

A simple model to calculate the number of years to double their savings.
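The table above can be checked with a whole-year compound-interest calculation. This sketch assumes, as the lesson context suggests, that X is the annual interest rate in per cent and Y the number of whole years needed for savings to double:

```python
# Years needed to double savings at an annual interest rate of x per cent,
# counting whole years of compound interest. Reading X as the interest rate
# in per cent is our interpretation of the classroom task.
def years_to_double(rate_percent: float) -> int:
    balance, years = 1.0, 0
    while balance < 2.0:
        balance *= 1.0 + rate_percent / 100.0
        years += 1
    return years

observed_y = [70, 36, 24, 18, 15, 12, 11, 10, 9, 8]  # the Y row of the table
assert [years_to_double(x) for x in range(1, 11)] == observed_y

# The simplified classroom model y = 75.1/x uses the average of the
# products a = x * y from the table:
products = [x * y for x, y in zip(range(1, 11), observed_y)]
print(sum(products) / len(products))  # 75.1
```

The computed doubling times match the Y row exactly, and averaging the products x·y recovers the constant 75.1 used in the students’ simplified model.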



then came together to discuss it as a class. In the seventh and eighth lessons, three tasks were introduced and each group of students selected one to work on.

Experimental teaching: investigating reflections of a face in a mirror.

Throughout the lessons, emphasis was put on the reason for solving problems, and students were encouraged to reflect on the roles of mathematics in order to identify the significant aspects of the roles of mathematics in society. The students first reflected prior to the teaching program, then after each lesson, and finally after the teaching program was completed. During the ninth lesson, the students reviewed the eight modelling tasks. They reflected on the modelling processes and identified the common points in order to ascertain the roles of mathematics. The students discussed the problems they faced during the eight tasks and how they solved them. The students then broke into groups to discuss the post-program question: ‘How is mathematics useful when we examine real-world problems from various perspectives?’. The teacher then asked each student to write down their answer without any teacher-led discussion, so that the students could provide their own answers supported by their group discussion, but without any influence from the teacher.

ANALYSIS
A total of 57 students took part in the study: 31 students during the first year and 26 during the second. The students’ responses to the pre- and post-program questions regarding their perception of the roles of mathematics in society were coded and then analysed. Ikeda’s analytical tool revealed that students’ opinions concerning the roles of mathematics in society changed significantly over the course of the nine-week experimental teaching program.

Analysis of the students’ perceptions relating to personal-societal perspectives, clarity of role statements, and specific-general contexts was carried out. A few students continued to adhere only to their personal perspectives or provide vague statements, insisting that mathematics was not useful in their lives, even though they acknowledged that it might be useful for others. With respect to the personal-societal perspective, only a few students adopted a personal perspective without also developing a societal perspective. Regarding the clarity of role statements, several students still only used vague, general statements. From the specific-general contexts, the analytical tool revealed that some students were able to clearly identify several purposes of mathematical modelling.

Professor Ikeda found that by using the analytical tool he could start to distinguish the qualitative differences among the various students’ perceptions of the roles of mathematics in society, as well as whether students were able to appreciate the roles from both personal and societal perspectives or only a personal perspective. He also discovered that students who were able to perceive the roles of mathematics in society did not automatically appreciate the utility of mathematics.

Students’ opinions concerning the roles of mathematics in society changed significantly over the course of the nine-week experimental teaching program.


CONCLUSION
Professor Ikeda found strong evidence to suggest that this type of teaching program is significantly effective in developing students’ thinking and their appreciation of mathematical modelling. This was reinforced when he examined the work that students produced in the study, demonstrating their thoughts on solving the set tasks. Ikeda found that these written examples of students’ thinking consistently demonstrated a marked increase in the quality and depth of the students’ insight into the mathematical modelling process.

This study also revealed that Professor Ikeda’s analytical tool enabled the clarification of students’ perceptions of the roles of mathematics in society, both before and after the teaching program. This revealed how students’ perceptions regarding the roles and utilisation of mathematics in society changed significantly over the course of the nine-week program.

Professor Ikeda recommends that further attention be paid to both the validity and reliability of his analytical tool, particularly when setting up the pre- and post-program questions and interpreting the students’ responses. He suggests that samples of students be interviewed to check the validity of the results. Keeping these points in mind, this analytical model merits further use in future studies, particularly those involving students’ beliefs about the value of mathematics in society.


Behind the Research Professor Toshikazu Ikeda

E: toshi@ynu.ac.jp T: +81 45 339 3371 W: http://er-web.jmk.ynu.ac.jp/html/IKEDA_Toshikazu/en.html W: www.researchgate.net/profile/Toshikazu_Ikeda W: https://slideplayer.com/slide/8309101/ W: https://link.springer.com/article/10.1007/s11858-018-0927-3

Research Objectives

References

The work of Toshikazu Ikeda, Professor of Mathematics Education at Yokohama National University, focuses on teaching mathematical modelling and its applications. He has recently developed an analytical tool to evaluate the changes in students’ perceptions of the roles of mathematics in society following an experimental teaching program.

Ikeda, T. (2018). ‘Evaluating student perceptions of the roles of mathematics in society following an experimental teaching program’. ZDM Mathematics Education, 50(1-2), 259-271.

Detail Yokohama National University 40-8501 Kanagawa Prefecture, Yokohama, Hodogaya-ku, Japan Bio Toshikazu Ikeda is a Professor of Mathematics Education at Yokohama National University. He received his PhD in education at Waseda University in 2014. Since 1999, he has studied the teaching of mathematical modelling and applications, funded by a Grant-in-aid for Science Research in Japan. Funding Grant-in-aid for Science Research in Japan

Ikeda, T. and Stephens, M. (2010). ‘Three teaching principles for fostering students’ thinking about modelling: An experimental teaching program for 9th grade students in Japan’. Journal of Mathematical Modelling and Applications, 2(1), 49-59. Available at: http://proxy.furb.br/ojs/index.php/modelling [accessed 02/01/19]. Ikeda, T. (2009). ‘Didactical reflections on the teaching of mathematical modelling – Suggestions from concepts of ‘time’ and ‘place’’. In M. Blomhøj and S. Carreira (Eds.), Mathematical applications and modelling in the teaching and learning of mathematics, 217-228. Roskilde University, Department of Science, Systems and Models, IMFUFA tekst nr. 461. Ikeda, T., & Stephens, M. (2017). ‘Modelling as interactive translations among plural worlds: Experimental teaching using the night-time problem’. In G.A. Stillman, W. Blum, & G. Kaiser (Eds.), Mathematical modelling and applications – Crossing and researching boundaries in mathematics education (pp. 399-409). Springer International Publishing.

Personal Response

What initially prompted you to investigate students’ perceptions of the roles of mathematics in society?

In Japan, PISA results have shown a tendency for students not to realise the usefulness of mathematics in society. To overcome this tendency, mathematical modelling has gradually been emphasised in the teaching of mathematics. However, I am uncertain that students will realise the usefulness of mathematics simply by focusing on mathematical modelling in mathematics lessons. If students think that mathematics might not be used in their future lives, they might not realise its usefulness even though they realise that mathematics is applied in the real world. This study starts from this question.



Physical Sciences ︱ Professor Jeffrey Oaks

François Viète’s revolution in algebra François Viète is considered by many historians to be the founder of modern algebra, but his work has not received the academic attention it deserves. Professor Jeffrey Oaks from the University of Indianapolis seeks to redress this imbalance. Through his study of Medieval and Renaissance mathematics, Professor Oaks shows how Viète reestablished algebra on a geometrical foundation; and in the process created an entirely new notation. His work inspired Fermat’s and Descartes’ developments and led to algebra becoming the language of science.

The word algebra derives from the Arabic al-jabr, which means restoration, or the reunion of broken parts. Algebra can be traced back to ninth-century CE Arabic books on the topic, and prior to that, we find it practised in India, Greece, and even ancient Babylonia.

Algebra before 1500, whether in Arabic, Latin, or Italian, was used predominantly for numerical problem-solving by practitioners such as merchants, government secretaries and surveyors. Only a few mathematicians employed it for more ‘scientific’ exploits, such as Diophantus in the 3rd century CE, Omar Khayyam in the 11th century, and Jordanus de Nemore in the 13th century.

Algebra began to attract the attention of theoretically-minded mathematicians in sixteenth-century Italy. Mathematicians such as Scipione del Ferro, Niccolò Tartaglia, Girolamo Cardano and Rafael Bombelli had finally solved irreducible cubic and quartic equations, and in the process, they had begun to explore negative and complex numbers.

FRANÇOIS VIÈTE
François Viète (1540-1603), a French lawyer in the court of Henry IV, took algebra in a completely different direction


from his predecessors. Beginning in 1591, he published a series of short treatises in which his algebraic knowns and unknowns, which he calls ‘magnitudes’, possess dimension without limit, and, for the first time, arbitrary knowns are represented in notation. It is mainly because of his notational innovations that some historians have credited him as the founder of modern algebra.

A MISUNDERSTOOD MATHEMATICIAN
Despite Viète’s importance, and partly due to his own terse and sometimes confusing style, his work has been misunderstood and has not received the serious attention it warrants. For starters, just what are these capital letters he employs in his new algebra? Jeffrey Oaks, Professor of Mathematics at the University of Indianapolis, is redressing this. Nearly two decades ago he decided to combine his two main interests, mathematics and history, in the study of medieval Arabic mathematics. Professor Oaks enlisted the help of a Palestinian colleague to teach him Arabic and embarked on the study of Arabic algebra. His early work exposed the conceptual differences between medieval and modern algebra, and those studies laid the foundation for his later work on Viète.


A geometric polynomial from Viète’s De Recognitione Aequationum (1615), which can be translated into our notation as A⁴ + 2B·A³ + B²·A². Among other differences, note the preposition “in” for multiplication, and the lack of the coefficient “1” before the “A quad. quad.”

PREMODERN POLYNOMIALS
Oaks discovered that algebraists before Viète conceived of the objects of their study – monomials, polynomials, and equations – differently than we do today. A premodern polynomial was considered to be a collection of different kinds of numbers or powers, without any operations present. Where our x² + 3x, for example, is constructed from the operations of exponentiation, scalar multiplication, and addition, the medieval equivalent ‘a māl and three things’ (here translated from Arabic) was simply a collection of four items of two kinds, like saying ‘an apple and three bananas’. This interpretation lay behind the algebra in ancient Greek, medieval Arabic, Latin, and Italian, and even in the algebra of sixteenth-century Europe.

A NEW ALGEBRA FOR GEOMETRY
Prior to Viète, the knowns and unknowns in algebra were positive numbers. Viète diverged from this norm, but in a way that had not been properly analysed before. Professor Oaks has reviewed the whole of Viète’s output, along with an extensive range of mathematical literature from the period, and has determined that Viète’s letters, standing for his knowns and unknowns, instead represent geometric magnitudes such as lines and surfaces. More specifically, they represent the relative sizes that geometric magnitudes have with respect to one another, without any regard to possible numerical measures. In other words, Viète created an algebra for classical geometry.

Viète was also the first mathematician to explore beyond the third dimension in geometry.

What drove Viète was his interest in producing accurate astronomical tables. He was faithful to the Greek tradition exemplified in Ptolemy’s Almagest (2nd century CE) by regarding geometry as providing the theoretical footing for calculations in astronomy. (Even if magnitudes have no intrinsic numerical measure, one can assign numerical measures to them.) Ptolemy had not used algebra to express his theorems or to perform his calculations, but Viète, through his investigations in trigonometry, found a way to adapt the numerical algebra of his time to a geometrical setting. By working abstractly with higher-dimensional magnitudes and by resolving proportions into equations, he laid the foundation for a new algebra.

This new algebra, which he called logistice speciosa, wasn’t just another step towards modern algebra. It was a complete overhaul of the very foundation of the art. It inspired Fermat’s and Descartes’ developments, which ultimately led to the replacement of Euclidean geometry with algebra as the standard way to express scientific results.

A RADICALLY NEW CONCEPT OF POLYNOMIAL, AND A NEW NOTATION TO GO WITH IT
One natural consequence of the shift from an arithmetical to a geometrical foundation is that Viète’s polynomials were understood in an entirely new way. Where premodern polynomials were simply aggregations of the powers, Viète’s polynomials are modern in the sense that they are now constructed from

Portrait of Viète from Savérien’s 1773 Histoire des Philosophes.

operations. Before Viète, the powers of the unknown in algebra were considered to be different types of numbers and were given individual names. For instance, in 1575, Xylander called the first-degree unknown “numerus” and the second-degree unknown “quadratum”, which he abbreviated as “N” and “Q”. In one problem, for example, he wrote “1Q+6N+36” for what would be our x² + 6x + 36. While Xylander’s notation may look modern, the letters function differently than our powers of x. The “Q” is a denomination or type (like “euro”), and only with a coefficient (here a “1”) does it assume a value (like “1 euro”). This is how all the various algebras preceding Viète functioned, both rhetorically and in notation.

The notation of Viète’s logistice speciosa performs differently from its premodern counterpart. Viète expressed Xylander’s polynomial as “A quadratum, + B in A, + B quadrato”, or translated into English,



Title page to Vasset’s 1630 French translation of two of Viète’s works, showing Viète on the right.

A polynomial from Michael Stifel’s book Arithmetica Integra (1544), showing premodern notation. We would write it as 150x − √(4500x²) + x². Note the coefficient of “1” on the last term. Compare with the notation on the preceding page.

“A squared + B (multiplied) by A + B squared”. While Viète’s notation may look a little less symbolic, his letters were the first in algebra to denote values, thus the lack of a “1” before the “A quadratum”. This term represents the size of a square relative to other magnitudes. This reconception opened the door to operations in algebraic expressions beyond polynomials that had been absent before. Further, because Viète’s algebra is founded in geometry, his coefficients are necessarily arbitrary geometric magnitudes (here “B” and “B quadrato” instead of “6” and “36”). This enabled the structure of solutions to be depicted

in a simplified equation, or formula; and because Viète’s goal was ultimately numerical calculation, this formula could be reused, substituting different knowns to generate tables.

BEYOND THE THIRD DIMENSION
Prior to Oaks, the only serious study of the ontology of Viète’s logistice

Viète’s geometrical algebra, built on a new foundation, would eventually oust the old premodern algebra.

René Descartes, whose 1637 La Geometrie built on Viète’s new algebra.


speciosa was a 1936 article by the German-trained philosopher Jacob Klein. Klein, searching for the origins of modern, axiom-based mathematics, saw the objects of Viète’s algebra not as geometric magnitudes, nor as numbers, but as abstract entities that transcend the two. Klein’s thesis gained traction with its translation into English in 1968. Although not universally accepted, it has remained until now the only serious study of the ontology underlying Viète’s algebra.

According to Oaks, Klein probably went astray largely because he (and other historians as well) failed to notice that Viète worked with four-dimensional geometric magnitudes in two of his propositions. No mathematician before Viète had gone beyond the third dimension. Viète made this leap, not by some deep insight into the nature of geometry, but simply because it gives correct values when applied to numerical calculation. Like other impossible objects of his century, such as negative and complex numbers, higher dimensions in geometry were admitted because they proved to be useful.

IMPACT
Viète’s new geometrical algebra would eventually oust the old algebra. His concept of polynomial, together with his novel notation, was taken up in modified form in Descartes’ 1637 La Geometrie. Descartes presumed an intrinsic numerical measure for his magnitudes, and thus re-introduced numbers to algebra. He also preferred the lower-case x and y, which we still use today, to Viète’s capital A, E, etc. It is the algebra of Descartes that became, and remains today, the standard mode of expressing mathematics, physics, and other fields. With Viète’s work, what had been a practical technique of merchants and surveyors was on its way to becoming the language of science.
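The conceptual contrast Oaks draws – a premodern polynomial as a collection of items of different kinds, versus a modern polynomial built from operations – can be sketched in code. The representation below is purely our illustration, not notation from Viète or Oaks:

```python
from collections import Counter

# Premodern view: a polynomial is a collection of items of different kinds,
# like 'a mal and three things' -- no operations, just counts per denomination.
premodern = Counter({"mal": 1, "thing": 3})  # the medieval reading of x^2 + 3x

# Modern (post-Viete) view: the same polynomial is an expression built
# from operations (exponentiation, scaling, addition).
def modern(x: float) -> float:
    return x**2 + 3 * x

# The premodern 'collection' only acquires a value once each denomination
# is assigned one -- the coefficient turns a type into a quantity, the way
# '1 euro' is a value but 'euro' is merely a denomination.
def evaluate(collection: Counter, x: float) -> float:
    values = {"mal": x**2, "thing": x}
    return sum(count * values[kind] for kind, count in collection.items())

assert evaluate(premodern, 5.0) == modern(5.0) == 40.0
```

Both views produce the same numbers, which is why the difference is easy to miss; it lies in what a polynomial *is* (a multiset of kinds versus an operation tree), not in what it computes.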


Behind the Research Professor Jeffrey Oaks

E: oaks@uindy.edu T: 1 317 788 3454 W: https://www.uindy.edu/cas/mathematics/oaks/ W: https://www.uindy.edu/cas/mathematics/faculty W: https://www.researchgate.net/profile/Jeff_Oaks

Research Objectives

References

Professor Jeffrey Oaks’ study of Medieval Arabic Algebra at the University of Indianapolis has led to his discovery of the innovations of François Viète that underpin modern algebra.

Oaks, J.A. (2018). ‘François Viète’s revolution in algebra’. Archive for History of Exact Sciences, 72 (3), 245-302.

Detail Department of Mathematical Sciences University of Indianapolis 1400 E. Hanna Ave. Indianapolis, IN 46227 USA Bio Jeffrey Oaks received his PhD in Mathematics from the University of Rochester in 1991. Since 1992, he has been a Mathematics Professor at the University of Indianapolis. Oaks originally worked in differential geometry, but switched to history of mathematics in 1999. Since then, he has focused mainly on medieval Arabic algebra.

Oaks, J.A. (2017). ‘Irrational ‘coefficients’ in Renaissance algebra’. Science in Context, 30, 141–172. Klein, J. (1968). ‘Greek Mathematical Thought and the Origin of Algebra’ (trans: Brann, Eva). Cambridge: MIT. Klein, J. (1936). ‘Die griechische Logistik und die Entstehung der Algebra. II. Teil’. Quellen und Studien zur Geschichte der Mathematik, Astronomie und Physik, 3(2), 122–235.

Personal Response What initially prompted your research into medieval Arabic mathematics? I knew even as an undergraduate student that Arabic mathematics is as important as it is understudied. While many people are working on, say, eighteenth-century mathematics, very few are reading the Arabic manuscripts. I am currently one of the few people in the world working on Arabic algebra. What are your plans for future research in this area? At the moment, I am working on a translation and commentary of the Arithmetica of Diophantus of Alexandria with a co-author, Jean Christianidis. I am also planning other studies on Arabic mathematics, and I will eventually look beyond Viète to investigate the algebra of 17th c. Europe.



Physical Sciences ︱ Dr Tamás Biró

Generalising the Entropy Formula through Master Equations Entropy is one of the most important and most widely studied quantities in physics, and for centuries, its value has been robustly described using simple mathematical relationships. Yet however elegant, Tamás Biró at the Hungarian Academy of Sciences believes the formula is hiding a more complex array of relationships. Through constructing ‘master equations’ to describe these relationships, Biró and his colleagues are using statistics to study problems as diverse as the formation of hadrons, changes in biodiversity, and patterns in popularity on Facebook.

Every system around us has a different degree of chaos associated with it. While atoms within metals are arranged in neat, highly ordered lattices, the movements of gas molecules in the air are far more chaotic and unpredictable. Since the 19th century, physicists have described this chaos using a quantity known as ‘entropy’. The changes in entropy which take place during physical processes form the basis of one of the most fundamental laws of physics: the Second Law of Thermodynamics. The law states that in any system closed off from the outside, entropy must not decrease over time; as such, the universe becomes more chaotic as a whole.

As Dr Biró describes, this inevitable increase in entropy represents a decrease in the number of possible leftover arrangements of particles, or ‘states’, that a system can find itself in as the one-way passage of time progresses. “Macroscopically, entropy has proved never to decrease spontaneously; an idea related to the ‘arrow of time’, or irreversibility,” he explains. “Many processes in nature end up only

[Figure: Energy, entropy and the Second Law of Thermodynamics. High randomness means high entropy and high disorder; low randomness means low entropy and low disorder.]

in some selected states while they may begin in any of a vastly larger number of possible states. All microscopic changes in a complicated closed system conspire in a way that the total entropy never decreases." To quantify entropy, physicists ultimately need to use statistics to study how particles move around into different arrangements over time. "Kinetic theory and its follower, statistical physics, depend on our knowledge about entropy," Dr Biró continues. "They can be viewed as given mathematical formulae based on the probability of being in a given state. All random, or 'stochastic', models are now checked for satisfying such a theorem, accounting for random forces in their calculations." Even after well over a century of scrutiny, these mathematical descriptions have proved time and again to robustly describe real physical systems. However, in their research, Dr Biró and his colleagues now want to take the theory a step further.

GENERALISING BOLTZMANN'S FORMULA
In the 1870s, Austrian physicist Ludwig Boltzmann was able to reduce the quantity of entropy down to a remarkably simple, elegant formula. He proposed that its value simply has a logarithmic relationship with the number of possible states a system can have under its current conditions. Yet although the formula is so effective in mathematically predicting changes in entropy, Dr Biró and colleagues believe it cannot completely account for the more complex physical processes which underlie it. The researchers aim to generalise Boltzmann's simple equation, identifying mathematical


relationships which build up to the overall logarithmic relationship. "Our research couples to the novel efforts to generalize Boltzmann's original formula, containing the logarithm – a neat function mapping products to sums," says Dr Biró. "What are the more general dynamical processes which make it unavoidable to use Boltzmann's logarithm?" Through several studies, the researchers have used real-world examples of entropy changes to study how Boltzmann's logarithm can be generalised in this way. The problems they have analysed so far range from fundamental questions of particle physics to patterns of interaction on social media.

SOLVING THE HADRONIZATION PROBLEM
In the dynamic first moments of the universe, fundamental particles known as quarks and gluons existed as freely-flowing elements of a gas-like soup, or plasma. However, these particles ultimately cannot exist by themselves; within octillionths of a second after the Big Bang, they grouped together to form stable structures named hadrons, such as those made from three quarks bound together by gluons. Yet the process of 'hadronization' throws up a problem for physicists: since hadrons appear to be far more ordered than a quark-gluon plasma, how could the overall entropy of the universe have increased?
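Boltzmann's logarithm is precisely what makes entropy additive: combining two independent systems multiplies their state counts while simply adding their entropies. The following minimal sketch (illustrative only; the state counts are invented, not from the article) shows this 'products to sums' property numerically.

```python
import math

# Boltzmann's entropy formula: S = k * ln(W), where W is the number of
# microscopic arrangements ('microstates') available to the system.
K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value since 2019)

def boltzmann_entropy(num_states):
    """Entropy of a system with the given number of microstates."""
    return K_B * math.log(num_states)

# Two independent systems: the combined system has w1 * w2 microstates,
# but its entropy is just the sum of the individual entropies.
w1, w2 = 1_000, 2_500
combined = boltzmann_entropy(w1 * w2)
separate = boltzmann_entropy(w1) + boltzmann_entropy(w2)
assert math.isclose(combined, separate)  # additivity comes from the logarithm
```

This additivity is the 'neat' property Dr Biró refers to, and it is exactly what more general, non-logarithmic entropy formulas give up.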

All microscopic changes in a complicated closed system conspire in a way that the total entropy never decreases.

In their research, Dr Biró and colleagues have proposed a new way of looking at the problem, based on the earlier work of Brazilian physicist Constantino Tsallis. In the 1980s, Tsallis drew up a new set of statistical parameters to generalise Boltzmann's formula, resulting in a new branch of statistics for the specific case of entropy. In particular, his equations resulted in new sets of probability distributions – statistical formulae which describe how likely it is for individual

particles in a system to be in a certain place. The probability distributions derived by Tsallis could describe the entropy of physical systems far more realistically than previous models.
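Tsallis' generalisation introduces a parameter q and an entropy of the form S_q = (1 - Σ p_i^q)/(q - 1), which recovers the standard Boltzmann-Gibbs (Shannon) form as q approaches 1. A small sketch of that limit, using an invented three-state distribution (illustrative only, not data from the research):

```python
import math

# Tsallis entropy S_q reduces to the Boltzmann-Gibbs-Shannon entropy
# S = -sum(p_i * ln p_i) in the limit q -> 1.

def tsallis_entropy(probs, q):
    return (1.0 - sum(p**q for p in probs)) / (q - 1.0)

def shannon_entropy(probs):
    return -sum(p * math.log(p) for p in probs if p > 0)

p = [0.5, 0.3, 0.2]  # made-up probabilities of three states
# For q very close to 1, the two formulas agree to high precision.
assert math.isclose(tsallis_entropy(p, 1.0001), shannon_entropy(p), rel_tol=1e-3)
```

For q away from 1, S_q is no longer additive over independent systems, which is why Tsallis statistics suits strongly correlated systems such as the ones Dr Biró's team studies.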

[Figure: number dynamics versus probability dynamics.]

Based on Tsallis’ work, Dr Biró’s team could construct a set of ‘master equations’ – mathematical relationships which build up to Boltzmann’s formula collectively but better describe the behaviour of physical systems individually.

Schematic representation of the coarse-grained random growth model. Previously published in Zoltán Néda et al., https://doi.org/10.1371/journal.pone.0179656, under the Creative Commons licence CC BY 4.0.



Rescaled distribution of the citation (share) numbers. Previously published in Zoltán Néda et al., https://doi.org/10.1371/journal.pone.0179656, under the Creative Commons licence CC BY 4.0.

This allowed the researchers to approach the hadronization problem from a new angle, providing new insights into how entropy could have increased as a whole in the earliest moments of the universe. Dr Biró and colleagues also applied their mathematics to other situations, testing the effectiveness of their model on systems including patterns of popularity on social media, distributions of income and wealth, and size distributions of settlements and ecosystems.

A SET OF MASTER EQUATIONS
Through these studies, Dr Biró and colleagues have made strides towards generalising Boltzmann's original formula, using mathematics adapted from the principles of statistical physics which Tsallis first generalised for the case of entropy. As Dr Biró explains, "We found that nonlinear master equations (dynamical equations treating the probabilities of systems being in given states not linearly, but via some more complicated function) lead to the use of non-logarithmic formulas for the entropy." Using such sophisticated mathematics, the researchers could

construct entropic probability distributions which more closely reflect those observed in nature. Ultimately, these master equations allowed Dr Biró and colleagues to model how the probability distributions of systems evolve as time progresses and entropy increases. Over time, an increase in entropy means the probability distribution associated with a system can be expected to shift. "Our way of showing this relied on the use of an 'entropic divergence': a non-symmetric distance formula between two probability distributions," Dr Biró continues. "Stable nonlinear master equation models describe a change of probabilities always towards a no-more-changing, stationary distribution." As such, the team's master equations could model the changes in entropy of real systems over time. Through this work, Dr Biró and colleagues could construct a 'non-extensive' entropy formula, which accounted for how the probability distribution of a system after a process

Dr Biró and colleagues have made strides towards generalizing Boltzmann’s original formula.


Left: scheme for processes with local changes (e.g. diffusion). Mid: scheme for locally one-way processes with resets to the ground state (e.g. popularity). Right: scheme for general processes with long jumps (e.g. stress in earthquakes).

has played out will be independent of the initial probability distribution for most materials. Since this entropic distance relationship more closely reflected reality, the master equations could be used to construct a highly generalised formula for calculating entropy. "A number of physical systems are described by stochastic master equations," explains Dr Biró. "Their corresponding stationary distributions belong to maximal entropy; therefore, entropy can also be defined as the entropic divergence from the uniform distribution."

APPLYING THE MODEL
The mathematics described by Dr Biró and colleagues can already be used to model the entropy changes which unfold in a wide variety of different systems. "We apply our mathematical model to various physical systems, including complex networks of popularity like citations or Facebook likes, city size distributions, produced hadron energies in big collider experiments, and student exchange networks between universities, among others," Dr Biró concludes. Yet the team's mathematical description is still far from complete. In future work, the researchers will study more complex nonlinear master equations, enabling an even greater understanding of how entropy works.
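The relaxation picture quoted above (probabilities evolving towards a stationary distribution, with an entropic divergence that only shrinks) can be sketched numerically. The three-state system and transition rates below are invented for illustration, the dynamics is a simple linear master equation rather than the team's nonlinear ones, and the divergence used is the ordinary Kullback-Leibler form instead of their generalised one.

```python
import math

# Toy master equation: probability hops between three states at fixed rates.
RATES = {(0, 1): 1.0, (1, 0): 0.5,   # rate of the transition src -> dst
         (1, 2): 0.8, (2, 1): 0.4,
         (0, 2): 0.2, (2, 0): 0.1}

def step(p, dt=0.01):
    """One explicit Euler step of dp_i/dt = sum_j (w_ji p_j - w_ij p_i)."""
    dp = [0.0, 0.0, 0.0]
    for (src, dst), w in RATES.items():
        flow = w * p[src] * dt
        dp[src] -= flow
        dp[dst] += flow
    return [pi + di for pi, di in zip(p, dp)]

def kl(p, q):
    """Kullback-Leibler divergence, a non-symmetric entropic distance."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Relax for a long time to get a numerical stationary distribution.
p = [1.0, 0.0, 0.0]
for _ in range(20000):
    p = step(p)
stationary = p

# Restart far from equilibrium: the divergence shrinks monotonically.
p, divergences = [1.0, 0.0, 0.0], []
for _ in range(500):
    divergences.append(kl(p, stationary))
    p = step(p)
assert all(a >= b for a, b in zip(divergences, divergences[1:]))
```

The monotone decay of the divergence is the linear-dynamics analogue of the H-theorem; the team's insight is that nonlinear master equations enforce the same approach to stationarity only for suitably generalised, non-logarithmic entropy formulas.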


Behind the Research Dr Tamás Biró

E: Biro.Tamas@wigner.mta.hu T: +36 20 435 1283

W: www.rmki.kfki.hu/~tsbiro

Research Objectives

References

Biró and his colleagues are using statistics to study problems as diverse as the formation of hadrons, changes in biodiversity, and patterns in popularity on Facebook.

Biró, T.S., Barnaföldi, G.G., Biró, G., Shen, K.M. (2017). ‘Near and Far from Equilibrium Power-Law Statistics’. Journal of Physics: Conference Series, 779(1), 012081.

Detail

Biró, T.S., Schram, Z., Jenkovszky, L. (2018). ‘Entropy production during hadronization of a quark-gluon plasma’. The European Physical Journal A, 54(2), 17.

H-2081 Piliscsaba, Tó sétány 5A, Hungary

Bio
Tamás studied physics at Eötvös University from 1975 until 1980, receiving an MSc in physics and biophysics. He later received his PhD degree at the same university. Tamás is the vice director of the Institute for Particle and Nuclear Physics, one half of the Wigner Research Centre for Physics in Budapest.

Funding
• Ministry for Innovation and Technology
• National Research Development and Innovation Office (both in Hungary)

Collaborators
• Zoltán Néda (UBB Cluj, Romania)
• András Telcs (Wigner RCP, Budapest, Hungary)

Biró, T.S., Néda, Z. (2018). 'Unidirectional random growth with resetting'. Physica A: Statistical Mechanics and its Applications.

Biró, T.S., Telcs, A., Néda, Z. (2018). 'Entropic Distance for Nonlinear Master Equation'. Universe, 4(1), 10.

Personal Response

What future research studies do you have planned?
So far, we have discovered that whenever a dynamical master equation uses non-linear functions of probabilities, the general way of constructing the entropic divergence, and through this the entropy-probability relation, leads to new formulae. On the other hand, more trivially, transition rates depending on the state properties (including linear preference rates) also determine a stationary distribution with a non-exponential, non-Boltzmannian form. Our plans include studying systems where both the initial and the target state influence the micro-transition rate between them. Here the increase of entropy is a highly nontrivial question, and no definite answer has been found so far, except for cases restricted by the detailed balance condition. Another future development is to study processes where no detailed balance is possible, due to asymmetric big jumps occurring as easily as small transitions. Such physical systems can show avalanches or earthquakes. For these questions, we concentrate not on the final distribution of stress (which describes a state), but on the distribution of the size of the jumps in the dynamics.



Physical Sciences ︱ Dr Julien Orts

NMR2: A highly accurate approach to protein-ligand binding

A novel method to determine, accurately and efficiently, the structure of receptor binding sites in protein-ligand complexes promises to revolutionize drug discovery. Dr Julien Orts and his collaborators at the Swiss Federal Institute of Technology are developing a powerful and general technique, based on solution-state nuclear magnetic resonance (NMR), to shed light on the details of how proteins interact with drugs.

Proteins are the fundamental building blocks of all living matter, from microscopic viruses and bacteria to highly evolved multicellular organisms, plants, animals and humans. They play a role in virtually all phenomena associated with life, including providing structural support to individual cells and tissues, enabling motion in complex organisms, producing energy and regulating signalling between cells in the body. Frequently, the physiological function of proteins is modulated by the interaction with small molecules (ligands), which can bind to specific receptor sites in a protein and trigger a response, e.g. in the form of a structural change or of a chemical reaction. For instance, ligands like hormones can promote cell growth by increasing the rate of protein production, or they can induce relaxation in muscle tissue. Similarly, cofactors are ligands that play an essential role in a number of complex chemical reactions involved in the metabolic processes that keep an organism alive. Most drugs can themselves be classified as artificial ligands. They work in the same way as natural ligands, by binding to specific sites in proteins and modifying the protein's function. A famous example of this is penicillin (an antibiotic, and one of the first medications found to be effective against many bacterial infections caused by staphylococci and streptococci), discovered by Alexander Fleming in 1928. Penicillin acts by binding to receptor sites in bacterial cell membrane proteins, which drastically affects the bacterial cell wall growth and eventually leads to its degradation and to bacterial death. Some anti-cancer drugs act in a similar way, by blocking the proteins responsible for DNA synthesis and hindering the growth of new cancerous tissue. Understanding how proteins involved in bacterial infections or oncogenesis are affected by specific drugs, and how we can optimise these drugs to maximise their effects, is currently one of the crucial and most far-reaching issues in pharmacological research and drug discovery.

Understanding the nature of the interactions between proteins and ligands is a crucial issue in drug discovery.

PROTEIN STRUCTURE DETERMINATION
Proteins are typically large and complex macromolecules, composed of thousands of atoms, which are arranged in chains of subunits, called amino acids. Proteins


fold into 3D structures, which are characteristic of each protein and are connected at a fundamental level to the protein’s function. Accurate knowledge of a protein’s structure is the first step toward understanding how the protein works and how the chemico-physical processes that it carries out can be influenced by means of drugs. Nowadays, drug discovery typically starts by screening large databases of molecules or molecular fragments as potential ligands for a selected drug target. This approach is very error-prone and it requires validation, e.g. by comparison to refined 3D complex structures. The gold standard in 3D protein structure determination is currently provided by X-ray crystallography, a powerful


Method

The challenge is to derive, rapidly and reliably, protein-ligand complex structures:
1. Without any protein assignment
2. Without knowing the location of the binding site
3. With receptor (side chains & backbone) and ligand flexibility

Experimental inter-molecular NOEs measured on a high field spectrometer: Top: 13C,15N-filtered 2D-[1H,1H]-NOESY spectrum showing inter-molecular NOEs between the ligand and unknown methyl groups of HDMX. Bottom: Ligand 1H magnetization auto-relaxation curves (left) and inter-molecular cross-peak build-up curves (right) versus the mixing time of the filtered NOESY experiments.

approach that is frequently used to work out the structure of protein-ligand complexes, essentially at an atomic level. This technique requires crystalline samples, which can be expensive and time-consuming to obtain. X-ray crystallography can also run into problems with specific classes of proteins, including membrane proteins and flexible receptors. The efficiency of X-ray crystallography can be improved substantially using the Molecular Replacement (MR) method, which relies on the existence of a previously resolved structure that is similar to the protein complex under study. However, some classes of proteins (like membrane proteins and flexible receptors) remain beyond the capabilities of X-rays, and a radically different technique, nuclear magnetic resonance (NMR), has been proposed as a potential alternative for these cases. NMR uses magnetic fields to gain information about the environment of each individual atom in a protein, for example its chemical shift and dipole-dipole interactions. The main limitation of NMR is that this technique requires long measurements and extensive data analysis. A recently proposed and fast-developing technique, cryo-electron microscopy, has also been shown to have great potential in drug discovery for large systems. Currently, however, X-ray crystallography and, to a lesser extent, NMR are by far the most widely used approaches to structural determination, and hundreds of thousands of protein structures have been resolved to

date using these two complementary techniques. The new method developed by Dr Orts and his collaborators aims to bridge the gap between X-rays and NMR, in order to expand the power and applicability of NMR and to make it a robust tool for drug discovery.

NMR2: A ZOOM ON THE BINDING SITES
The protein-ligand binding arises from very local interactions: ligands only bind to very specific sites in a protein, which typically involve only a relatively small number of atoms compared to the whole protein. Binding sites are just like pockets in the protein, which a ligand can enter and in which it can be stabilised by steric (i.e. depending on the shape


and size of the pocket), electrostatic or chemical interactions. The global structure of the protein is initially largely unaffected by the ligand binding. Based on this observation, Dr Orts and his collaborators have developed a powerful protocol to work out in great detail the structure of the protein-ligand binding site, using existing information about the global structure of the protein obtained in separate X-ray diffraction or liquid NMR measurements. This method, Nuclear Magnetic Resonance Molecular Replacement (NMR2), is based on solution-state NMR structure determination, but it focusses on the structure of the binding site alone, rather than on the full protein, and therefore it circumvents the lengthy and tedious analysis required to resolve

[Figure: binding, intermolecular communication and allosteric signal transduction.] From drug design to intermolecular communication and biological function via signal transduction.



the whole protein structure from NMR data. By using previously determined protein structures, NMR2 drastically reduces the time and effort required to obtain an atomically resolved structure of a binding pocket, from a couple of months to a couple of days. It can also be partially automated, and therefore it provides a natural tool for high-throughput workflows, which can be used to screen thousands of potential new drugs in sequence and to analyse the nature of their interaction with a target protein. A typical NMR2 protein-ligand structure determination requires a preliminary preparation of the sample, in which either the protein or the ligand is isotopically substituted (13C, 15N) or selectively labelled (e.g. isoleucine, leucine and valine methyl labelling). NMR experiments are then used to measure intra-molecular (ligand) and inter-molecular (ligand-protein) atomic distances, which in turn provide a model of the ligand structure in the binding pocket. To understand the exact nature of the ligand-protein interaction, protein structures from existing databases (obtained from X-ray or NMR measurements) are then used as

Binding Site Opening

HDMX binds to p53 and inhibits the transactivation of p53 target genes. HDMX binds to ligands via a mechanism where disordered regions become structured. Left: Protein binding pocket (circled in red) opening upon p53-peptidomimetic binding. Right: Effect on the protein NMR spectrum upon ligand binding (blue = protein bound, red = free protein).

input information. The structures selected can be either that of the protein in the absence of the ligand or those of similar (homologous) proteins. The NMR2 program then screens all possible assignment groups in the protein and calculates the protein-ligand complex structure for all options. At this stage, it is essential to reduce as much as possible the number of configurations to screen. This can be achieved by initially restricting the assignment groups in the protein to only three or four relevant ones. False assignments

The NMR2 methods reduce the time required to determine protein-ligand structures from months to a few days.

[Figure: panels a) to d), with RMSD values 0.9 Å, 0.9 Å, 1.1 Å and 1.8 Å; 21 intermolecular distances.] NMR structures of HDMX in complex with cmpd2 under different starting scenarios, using either a) the 3D structure of the native holo-protein (3fea), b) the X-ray crystallography structure of HDMX in complex with a different ligand (3fe7), c) a homology model of an X-ray crystallography structure of HDM2 bound to a different ligand (2axi), or d) a homology model derived from the NMR structure of the homologous protein HDM2 (1z1m) in its ligand-free state. The reference X-ray structure of the HDMX-cmpd2 complex is coloured green. The NMR2-derived structures are colour-coded red for the ligand and blue for the protein, with the exception of the protein structure in a), because the structure is identical to the reference structure.


can be ruled out using geometric considerations, based on the knowledge of the input structures. This substantially reduces the calculation time. At the end of this procedure, the resulting complex structures have to be analysed carefully, to detect potential errors arising from the unconstrained relaxation of the protein backbone during the refinement procedure. It is important that a sufficient number of inter-molecular distances are taken into account, typically at least 12 to 15. A high signal-to-noise ratio in the NMR spectra and a good signal resolution are also crucial.

A NEW TOOL FOR DRUG DISCOVERY
The NMR2 method has been applied successfully to the resolution of various classes of protein-ligand complexes. Several structures containing strong binders or small ligands have been determined with an accuracy of 0.9 to 1.5 angstroms relative to the reference structure. The applicability of NMR2 to complexes with ligands in fast exchange or with weak binding affinity has also been demonstrated. In the case of the weak-affinity complex HDM2-#845, a complex never observed before was characterised. The efficiency of NMR2 is well represented by the complex structure of SJ212-MDMX, which could be resolved at 1.35-angstrom accuracy within a day using a desktop computer. These are a few initial examples of the great potential of NMR2 in the study of protein-ligand interactions and protein function, and they pave the way for its application as a fast, reliable and accurate protocol for drug discovery.
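The combinatorial screening of assignment groups described above can be illustrated with a toy example. Everything here (coordinates, distances, the least-squares score) is invented, and the real NMR2 calculation is far more sophisticated, but the sketch shows the core idea: rank each possible assignment of unassigned methyl groups by how well it reproduces the measured ligand-to-protein distances.

```python
import math
from itertools import permutations

# Hypothetical methyl-group coordinates from a candidate protein structure.
methyl_coords = {"M1": (0.0, 0.0, 0.0), "M2": (4.0, 0.0, 0.0),
                 "M3": (0.0, 5.0, 0.0)}
ligand_atom = (1.0, 1.0, 0.0)
# Invented 'measured' distances (as NOE build-ups would give) from the
# ligand atom to three methyls whose identities are unknown.
measured = [1.41, 3.16, 4.12]

def score(assignment):
    """Sum of squared deviations between measured and model distances."""
    return sum((m - math.dist(ligand_atom, methyl_coords[name]))**2
               for m, name in zip(measured, assignment))

# Screen every possible assignment; keep the one most consistent with
# the distance data (false assignments score far worse).
best = min(permutations(methyl_coords), key=score)
print(best)
```

In the real method, each surviving assignment seeds a full structure calculation, and geometric filters prune the combinatorial space before scoring, which is what keeps the computation tractable.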


Behind the Research Dr Julien Orts

E: julien.orts@phys.chem.ethz.ch T: +41(0) 44 632 28 64

@JulienOrts

Research Objectives

References

Dr Orts and his collaborators have developed multidisciplinary approaches to study protein-small molecules complexes using NMR spectroscopy, X-ray crystallography and computational methods.

Orts, J (2018). Research profile. [online]. Available at: https://twitter.com/JulienOrts and http://n.ethz.ch/~ortsj/JulienOrts/Interests.html [Accessed 28 Jan 2019].

Detail
Dr Julien Orts
Laboratory of Physical Chemistry
Swiss Federal Institute of Technology
ETH HCI F217, Vladimir-Prelog-Weg 2
8093 Zürich, Switzerland

Bio
Julien Orts graduated in Physics and Biophysics jointly from the Max Planck Institute and the European Molecular Biology Laboratory. He is currently a junior group leader at ETH Zürich. He develops methods for structure-based drug design and demonstrated for the first time that 3D structure determination of a protein-ligand complex can be achieved fully automatically from solution NMR data.

Collaborators
• Peter Güntert
• May Marsh
• Roland Riek
• Dean Strotz
• Felix Torres
• Marielle Aulikki Wälti

Orts, J., Wälti, M.A., Marsh, M., Vera, L., Gossert, A.D., Güntert, P., Riek, R. (2016). 'NMR-Based Determination of the 3D Structure of the Ligand-Protein Interaction Site without Protein Resonance Assignment'. J. Am. Chem. Soc., 138, 4393-4400.

Wälti, M.A., Riek, R., Orts, J. (2017). 'Fast NMR-Based Determination of the 3D Structure of the Binding Site of Protein-Ligand Complexes with Weak Affinity Binders'. Angew. Chem. Int. Ed., 56, 5208-5211.

Wälti, M.A., Orts, J. (2018). 'The NMR2 Method to Determine Rapidly the Structure of the Binding Pocket of a Protein-Ligand Complex with High Accuracy'. Magnetochemistry, 4, 12.

Personal Response

The accuracy and efficiency of the NMR2 methods have been documented in a number of protein-ligand complexes. What are the remaining challenges of NMR2 that need to be addressed in order to make your method a robust and easy-to-use routine tool for high-throughput molecular screening in drug discovery?
Fragment-based drug discovery is becoming a major approach in both pharmaceutical companies and academic laboratories. Fragment-based methods need fewer compounds to be screened and synthesised, and the fragment hits usually show high ligand efficiencies (potency per atom). Currently, no fast and robust NMR method can handle small fragments, due to a lack of proton 'probes'. We need to develop the current NMR2 method into a new approach that can automatically and simultaneously determine multiple structures of fragment-protein complexes. Having access to the structure of the binding site for each binder allows investigating chemical scaffolds that would otherwise be discarded, broadening the chemical knowledge as well as the druggability of the receptors.



Physical Sciences ︱ Dr Katharine Tibbetts

Laser ablation in liquid: A powerful route to new nanoparticle catalysts

Dr Katharine Tibbetts (Virginia Commonwealth University) has been developing a novel approach for the synthesis of metal nanoparticles, based on a reactive laser ablation in liquid technique. She uses ultra-short laser pulses to ionise water molecules and generate a highly energetic plasma of electrons capable of reducing soluble metal ions to neutral atoms, which then coalesce to generate a nanoparticle suspension. Unlike traditional chemical approaches based on strong reducing agents and surfactants, the laser ablation technique is fast and environmentally friendly, and it can be used to control the nanoparticle size and properties.

Nanomaterials exhibit characteristic structures containing particles of sizes between 1 and 100 nm (1 nm = 10⁻⁹ m), in an unbound state or in the form of aggregates. Examples of natural nanomaterials include the capsid structures of viruses, some butterfly wing scale patterns, and colloidal fluids such as milk and blood. Synthetic nanomaterials, on the other hand, are engineered and manufactured to have well-defined mechanical, physical or chemical properties, which makes them suitable for use in technology and industrial processes. In particular, metal nanomaterials and metal oxide nanocomposites have important applications in optoelectronics, sensing, drug delivery and catalysis. Bimetallic nanoparticles have, for instance, been shown to be promising catalysts for processes including the water-splitting reaction and the reduction of CO2, and they represent a highly promising path towards sustainable energy generation, storage, transmission and utilisation

on a global scale. The enhanced activity of bimetallic nanoparticles has been attributed to synergistic effects between the metals and defect sites in nanocrystal structures. In order for nanomaterials to be chemically effective, the size of their constituent nanoparticles has to be tightly controlled. For instance, in catalytic processes, smaller nanoparticles yield better catalytic turnover, because the ratio between their surface area (where chemical reactions occur) and their volume is large. Traditional approaches to metal nanomaterial synthesis typically use strong reducing agents to produce nanoparticle seeds and surfactants to prevent excessive growth. Surfactants, however, are toxic and they can have a substantial environmental impact. Furthermore, surfactant molecules can hinder catalytic reactions by binding strongly to a nanoparticle’s active sites and making them inaccessible to potential reactant molecules. Finally, chemical

Schematic illustration of the femtosecond reactive laser ablation in liquid method. [Figure: solvent, metal salt(s) and additives; solid target; fs laser; plasma reactions; nanostructured products.]




Au nanoparticles synthesised in 10 mM isopropyl alcohol.

methods are slow, and they require hours, or even days, to grow nanoparticles of the desired size.

PLASMA SYNTHESIS OF NANOPARTICLES WITH LASER LIGHT
The reactive laser ablation in liquid method pioneered by Dr Tibbetts and her collaborators offers a powerful and flexible alternative to the purely chemical approaches to nanomaterial synthesis currently in use. It is a fast and "green" method, which generates nanomaterials within minutes, does not require toxic reducing agents and surfactants, and enables the formation of nanostructures with novel morphologies and bonding environments. The key idea of this method is that laser light, rather than a chemical agent, can be used to provide the electrons required to reduce soluble metal ions to their neutral metal atom counterparts, which then assemble themselves to form a nanoparticle phase. A typical synthesis uses water solutions of metal ions (e.g. Au3+ or Ag+), and it can include a solid target, such as a silicon wafer. Laser light is applied in the form of very short (ca. 30 fs, or 30 quadrillionths of a second) and intense pulses, which exchange enough energy with the water molecules to induce the emission of electrons from them. These electrons are highly energetic and form a dense plasma with transient temperatures as high as 5000-7000 K, comparable to the surface of the Sun. These free electrons reduce the Au3+ and Ag+ ions in solution to insoluble Au or Ag atoms, which assemble to form the nanoparticles. The rate of metal-ion reduction is

Au nanoparticles stabilised by amorphous silica.

Au nanoparticles synthesised with 532 nm nanosecond laser pulses.

directly proportional to the production of electrons in the laser-induced plasma, as has been demonstrated in the case of the reduction of Au3+ in [AuCl4]- in water solution. Furthermore, the rapid cooling following the reduction, caused by the solution mixing, can determine the formation of novel metastable nanoparticle structures which are not observed in chemical synthesis carried out in milder conditions.

Plasma synthesis operates in highly nonequilibrium conditions and affords the controlled and efficient creation of novel nanoparticle phases.

CONTROLLING NANOPARTICLE SIZE AND COMPOSITION
In addition to creating the reducing electron plasma, the laser radiation also forms hydroxyl radicals (OH•) by breaking apart water molecules.

Ag nanoparticles synthesised in 0.25 mM ammonia.

OH• radicals react with each other to produce hydrogen peroxide (H2O2), which can act as a reducing agent for species like Au3+ (thereby further promoting the formation of Au nanoparticles), but also as an oxidising agent for reduced metal atoms like Ag. As a consequence of the formation of H2O2, fast growth of Au nanoparticles is typically observed, whereas no Ag nanoparticles form at all, because the Ag atoms are quickly back-oxidised to Ag+ ions. To prevent the formation of H2O2, OH• scavengers, which include ammonia, sodium acetate and isopropyl alcohol, can be added to the solution. This enables superior control over the formation of both Au and Ag

Ag nanoparticles synthesised in 1 mM ammonia.

www.researchoutreach.org

51


Current understanding of reaction mechanisms forming stabilised nanoparticles from femtosecond reactive laser ablation in liquid.

nanoparticles. For instance, Dr Tibbetts and collaborators have shown that adding isopropyl alcohol reduces the size of Au nanoparticles from 4.9 to 3.8 nm and makes them significantly more monodisperse (uniform in size). This has been attributed to the reduced production of H2O2 in the presence of isopropyl alcohol, which prevents the formation of large particles. For Ag, different morphologies are observed depending on the amount of ammonia added to the solution. The presence of the silicon wafer can also be exploited to reduce the rate of back-oxidation, as the laser pulses cause the photo-ejection of silicon atoms and clusters from the wafer, which can both react with the forming OH radicals and stabilise the incipient nanoparticles by attaching to them.
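Why do these modest size reductions matter for catalysis? For a spherical particle, the surface-to-volume ratio scales as the inverse of the diameter, so even shrinking particles from 4.9 to 3.8 nm meaningfully increases the fraction of catalytically active surface. A quick back-of-the-envelope check (the diameters are those reported above; the rest is elementary geometry):

```python
import math

def surface_to_volume(diameter_nm):
    """Surface area / volume of a sphere; works out to 3/r, i.e. 6/d."""
    r = diameter_nm / 2.0
    return (4.0 * math.pi * r**2) / ((4.0 / 3.0) * math.pi * r**3)

# Ratio of surface-to-volume with vs without the isopropyl-alcohol scavenger.
gain = surface_to_volume(3.8) / surface_to_volume(4.9)
print(f"{gain:.2f}x more surface per unit volume")  # about 1.29x
```

This 1/d scaling is the geometric reason the article links smaller, more monodisperse particles to better catalytic turnover.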

obscure. The team lead by Dr Tibbetts has been pioneering the use of in-situ UVvis spectroscopy measurements carried out during plasma synthesis to unravel the complexity of the chemical reactions involved in the process and the dynamics of nanoparticle growth. Understanding the real-time dynamics of the species generated during laser ablation and the kinetics of the metal-ion conversion to nanoparticles, as well as establishing a correlation between the observed reaction rates and the properties

their chemical activity deteriorates over time, owing to metal leaching and particle agglomeration. A promising approach to hinder the occurrence of these adverse processes is to support the nanoparticles on thermally stable and chemically inert oxides, such as silica (SiO2). Chemical approaches have been proposed to this aim, which use commercially available silica supports. These methods do not use surfactants, but they require a separate precursor synthesis and they are relatively slow (typically their duration is at least two days). Recently, Dr Tibbetts’s group has demonstrated that the reactive laser ablation in liquid technique can be used to promote the formation of Au nanoparticles with diameters as small as 1.9 nm supported on an amorphous silica matrix. This is the first report of laser-synthesised supported sub-3 nm metal nanoparticles, which paves the way for the development of a clean, fast and highly tunable approach for the synthesis of metal oxide nanocomposites with enhanced catalytic activity. Plasma synthesis can also be extended to the creation of catalytically active multimetallic nanostructured materials with morphologies and physico-chemical properties not accessible to other methods. The work initiated by Dr Tibbetts will provide a more detailed and insightful picture of the plasma synthesis processes in a variety of conditions and enable better control of the properties of the resulting nanomaterials.

Nanostructured catalysts exhibit exceptional activity in several chemical processes for energy production and storage, including water splitting and CO2 reduction.

CHALLENGES AND OPPORTUNITIES OF PLASMA SYNTHESIS Although a general model of the physical and chemical phenomena occurring during laser ablation in liquid has been developed, based on the chemical and morphological characterisation of the synthesis products along with in situ measurements (optical emission spectroscopy, cavitation bubbles with shadowgraphy and small-angle X-ray scattering) and quantum mechanical (DFT) simulations, a number of details concerning the kinetics of the reactions induced by laser irradiation remain largely

52

www.researchoutreach.org

of the products are important steps toward the optimisation of the synthetic conditions for generating nanoparticles with wide varieties of compositions, sizes and chemical reactivities. In particular, the ability to synthesise sub-5 nm nanoparticles composed of earthabundant metals (Cu, Fe and Ni), in place of the noble metals Au and Ag, is an ambitious and far-reaching goal, which has the ability to transform laser ablation in liquid from a cutting-edge technique for the synthesis of metal nanoparticles into a practical, general and sustainable approach to develop new catalytic species for large (potentially industrial) scale chemical processes. A current significant limitation of bimetallic nanoparticle catalysts is that
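The kinetics probed by such in-situ UV-vis measurements are often summarised by an effective rate constant for the disappearance of the metal-ion precursor. As a hedged illustration only (the function name and the pseudo-first-order model are assumptions for this sketch, not the team's actual analysis), a decaying absorbance trace could be fitted like this:

```python
import math

def fit_first_order_rate(times_s, absorbance):
    """Estimate a pseudo-first-order rate constant k, assuming the precursor
    band decays as A(t) = A0 * exp(-k t): least-squares line through (t, ln A)."""
    n = len(times_s)
    xs = list(times_s)
    ys = [math.log(a) for a in absorbance]
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return -slope, math.exp(intercept)  # (k in 1/s, A0)

# Synthetic trace with A0 = 0.8 and k = 0.05 s^-1 (illustrative numbers only)
times = [0, 5, 10, 20, 40, 60]
trace = [0.8 * math.exp(-0.05 * t) for t in times]
k, a0 = fit_first_order_rate(times, trace)
```

The log-linear fit assumes a clean single-exponential decay; real traces with induction periods or overlapping bands would need nonlinear or multi-step models.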


Behind the Research Dr Katharine Tibbetts

E: kmtibbetts@vcu.edu T: +1 804 828 7515 W: https://chemistry.vcu.edu/people/faculty/tibbetts.html

Research Objectives

References

Dr Tibbetts' current research focuses on (1) synthesising metal-based nanoparticles from metal salt precursors and elucidating the reaction mechanisms of these transformations in the condensed phase, and (2) probing and controlling the dissociation dynamics of polyatomic radical cations in the gas phase.

Tibbetts, K (2018). Research profile. [online] College of Humanities & Sciences, Chemistry – Virginia Commonwealth University. Available at: https://chemistry.vcu.edu/people/ faculty/tibbetts.html [Accessed 21 Jan 2019].

Detail

Rodrigues, CJ; Bobb, JA; John, MG; Fisenko, SP; El-Shall, MS; Moore; Tibbetts, KM. (2018). ‘Nucleation and growth of gold nanoparticles initiated by nanosecond and femtosecond laser irradiation of aqueous [AuCl4]-’. Physical Chemistry Chemical Physics, 20, 28465-28475.

Department of Chemistry, Virginia Commonwealth University, 1001 W. Main St., Richmond, VA 23284, USA
Bio
Katharine Tibbetts received her PhD in Chemistry at Princeton University with Herschel Rabitz. She was a postdoctoral fellow at the Center for Advanced Photonics Research at Temple University with Robert Levis. She began her independent career at Virginia Commonwealth University (VCU) in 2015.
Funding
American Chemical Society Petroleum Research Fund; US Army Research Office
Collaborators
• M. Samy El-Shall, VCU Chemistry
• Puru Jena, VCU Physics
• Gennady Gutsev, Florida A&M Physics

John, MG; Tibbetts, KM. (2019). ‘One-step femtosecond laser ablation synthesis of sub-3 nm gold nanoparticles stabilized by silica’. Applied Surface Science, 475, 1048-1057.

Meader, VK; John, MG; Rodrigues, C; Tibbetts, KM. (2017). ‘Roles of Free Electrons and H2O2 in the Optical Breakdown-Induced Photochemical Reduction of Aqueous [AuCl4]-’. The Journal of Physical Chemistry A, 121, 6742-6754.

Meader, VK; John, MG; Frias Batista, LM; Ahsan, S; Tibbetts, KM. (2018). ‘Radical Chemistry in a Femtosecond Laser Plasma: Photochemical Reduction of Ag+ in Liquid Ammonia Solution’. Molecules, 23, 532.

Personal Response
How do you expect your method to evolve in the near future, and do you expect to see it applied to industrial or commercial processes in the short term?
As we learn how to manipulate the concentrations of reactive species in the plasma, we expect to gain greater control over nanoparticle sizes, compositions, oxidation states, and surface properties, as well as expand the scope of accessible metals. For instance, if we can’t remove enough hydroxyl radicals to make copper or iron nanoparticles in water, we could switch to a different solvent such as methanol. Right now, our method can’t make sufficient nanoparticle quantities for commercial applications, but it is only a matter of time before a laser ablation in liquid method achieves this. A group in Germany recently reported production of gold nanoparticles at a rate of 4 g/hr by ablating gold foil with a specialised high-repetition-rate laser.



Engineering and Technology ︱ Professor Bin Zheng

Assessing the performance of Computer-Aided Diagnosis of breast cancer

In under two decades, the techniques used to image, classify and diagnose breast cancer have significantly improved with the help of rapidly advancing computer-based digital image processing and machine learning technologies. Since the early days of Computer-Aided Diagnosis technology in the 1990s, Professor Bin Zheng, at the University of Pittsburgh and then the University of Oklahoma, has dedicated his research to assessing its capabilities. His work has enabled advances ranging from confirmation of some of the first detections of cancer from digitised mammograms to the use of MRI in assessing the responses of breast tumours to chemotherapy.

Since the late 1990s, Computer-Aided Detection and/or Diagnosis (CAD) has become an indispensable tool for clinicians as they attempt to identify and diagnose cancer in their patients. By processing and analysing medical images from methods including X-ray and MRI scans, CAD can now identify and quantify image patterns that are highly associated with cancer risk, characteristics and prognosis, in many cases more effectively than the human eye. Breast cancer is one area where CAD has become particularly important; in recent years, the technology has made it far easier to identify dangerous growths in digital mammograms and MRI scans. Since its emergence, Professor Zheng and his colleagues have been at the forefront of assessing the capabilities of CAD mammography.

THE POTENTIAL OF COMPUTER-AIDED DETECTION AND DIAGNOSIS (CAD) In the past, doctors (in particular radiologists) have faced many difficulties in reading and interpreting mammograms to detect suspicious breast lesions, and to

A GUI of the short-term breast cancer risk prediction model.


classify the difference between malignant and benign lesions, relying solely on their trained intuition. When it first emerged in the 1990s, CAD showed promising potential to provide radiologists with useful decision-support tools to reduce errors in cancer detection and diagnosis, but it was far from certain whether the technology would help radiologists identify cancers more accurately (or distinguish between malignant and benign lesions). In his research, Professor Zheng has strived to develop new ways to assess how effectively CAD can be used to diagnose and treat breast cancer. Through rigorous testing involving the mammograms of real breast cancer patients, his work has provided crucial assessments of where CAD technologies are working, and has identified areas where they need to be improved. In earlier studies, this involved evaluating the performance of CAD in identifying breast cancer from mammograms taken with rapidly improving imaging techniques. Building on this earlier work, Professor Zheng is now assessing the ability of CAD to compute quantitative cancer image markers. These markers would allow radiologists both to predict short-term breast cancer risk, improving the efficacy of cancer screening, and to assess cancer prognosis, improving the efficacy of cancer treatment.

ASSESSING THE LIMITATIONS OF EARLY CAD
Professor Zheng and his colleagues carried out their first significant study in 2001 [1], as CAD was emerging as an important new diagnostic tool. In the study, the researchers investigated the potential clinical utility of using CAD to improve radiologists’ performance in detecting breast cancer on mammograms. To do this,


the team carried out a multi-mode observer performance study involving screening mammograms acquired from 209 women. While some of these patients had tested positive for breast cancer according to several reputable radiologists, the rest were confirmed to be cancer-free. For each screening, Professor Zheng’s team programmed the CAD scheme to four different levels of detection sensitivity and specificity (or false-positive rate) in detecting breast lesions, represented by both soft-tissue masses and micro-calcification clusters. Eight radiologists then independently read and interpreted this set of mammograms five times: first without using CAD, and then using CAD at each of the four performance levels. The researchers then quantified the rates of correct diagnoses by comparing the radiologists’ conclusions across the five reading modes.

The team found that when using high-performance CAD schemes with both high sensitivity and low false-positive rates, the radiologists’ performance in detecting breast cancer from screening mammograms was significantly improved. At the same time, using CAD at lower performance levels, with either low sensitivity or higher false-positive rates, actually reduced radiologists’ performance. This laboratory-based finding was appraised in an editorial by an expert in mammography [2], and has since been confirmed by a number of clinical studies (e.g., [3]). Thus, Professor Zheng’s study revealed the importance of reducing false-positive rates when CAD is used in clinical practice.

ADAPTING TO ADVANCING CAD TECHNOLOGY
This finding prompted great research interest in exploring and developing new technologies and approaches in the CAD research field. For example, Professor Zheng and his collaborators have converted the CAD scheme from the previous screen-film based digitised mammograms to full-field digital mammograms (FFDM) [4].
The team has also been working towards developing and evaluating a number of novel approaches, which include but are not limited to multi-image based CAD schemes [5] and adaptive CAD cueing methods [6].
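The per-mode comparison in observer studies of this kind boils down to two numbers per reading mode: sensitivity (the fraction of cancer cases flagged) and false-positive rate (the fraction of cancer-free cases flagged). A minimal sketch, using hypothetical toy readings rather than the study's actual data:

```python
def detection_rates(decisions, truth):
    """Sensitivity and false-positive rate for one reading mode.
    decisions/truth are parallel booleans: reader flagged the case /
    the case was cancer-positive."""
    tp = sum(d and t for d, t in zip(decisions, truth))       # true positives
    fp = sum(d and not t for d, t in zip(decisions, truth))   # false positives
    n_pos = sum(truth)
    n_neg = len(truth) - n_pos
    return tp / n_pos, fp / n_neg

# Toy reading: four cancer cases followed by four cancer-free cases
truth    = [True, True, True, True, False, False, False, False]
with_cad = [True, True, True, False, True, False, False, False]
sensitivity, false_positive_rate = detection_rates(with_cad, truth)
```

Repeating this over the five reading modes and eight readers gives the paired performance figures that such a study compares statistically.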


Showing intermediate results of image processing steps including (A) the computed breast density maps, (B) detected focal density regions, (C) local density fluctuation maps, and (D) image maps generated using Gaussian bandpass filtering. Colour bars show volumetric density levels of the pixel values.

Professor Zheng hopes that clinicians will soon be able to use CAD-generated markers to predict the risk of breast cancer in individual patients.

First, in developing a multi-view CAD scheme, the team proposed and applied a new narrow matching strip method, which significantly improved accuracy when matching two suspicious lesions detected in two-view mammograms. As a result, the new CAD scheme enables radiologists to detect more lesions on two views without increasing false-positive rates [5]. Second, the team developed a new case-based CAD scheme to predict the risk of a given case being positive for cancer, then fused the case-based risk score with the conventional region

(or lesion) based score. This produced a new CAD cueing method, namely an adaptive cueing method, which helps detect more difficult or subtle cancers without increasing the overall false-positive rate [6]. In addition to detecting suspicious lesions, Professor Zheng and his team also developed CAD schemes to classify between malignant and benign breast lesions. For example, the team has performed pioneering work in applying content-based image retrieval (CBIR)

A multilayer topographic region growing algorithm is implemented to segment lesions on CEDM images.



technology to develop CAD schemes using a large, diverse and balanced reference dataset containing more than 4,000 regions of interest (ROIs) that depict biopsy-verified malignant and benign lesions. This CBIR-based CAD scheme not only achieves higher classification performance, but also pursues higher visual similarity, aiming to increase radiologists’ confidence in accepting CAD-generated classification results [7].
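At its core, a CBIR scheme ranks the reference ROIs by similarity to the queried lesion in a feature space, then derives a decision score from the retrieved biopsy-verified neighbours. A simplified sketch (the feature values, field names and Euclidean distance metric are illustrative assumptions, not the published scheme):

```python
import math

def retrieve_similar(query_features, reference_set, k=3):
    """Return the k reference ROIs most similar to the query, ranked by
    Euclidean distance in feature space (smaller = more similar)."""
    def dist(features):
        return math.sqrt(sum((a - b) ** 2
                             for a, b in zip(query_features, features)))
    ranked = sorted(reference_set, key=lambda roi: dist(roi["features"]))
    return ranked[:k]

def malignancy_score(neighbours):
    """Fraction of retrieved neighbours that are biopsy-verified malignant."""
    return sum(r["malignant"] for r in neighbours) / len(neighbours)

# Tiny hypothetical reference set of ROI feature vectors
reference = [
    {"id": 1, "features": (0.90, 0.80), "malignant": True},
    {"id": 2, "features": (0.85, 0.75), "malignant": True},
    {"id": 3, "features": (0.20, 0.10), "malignant": False},
    {"id": 4, "features": (0.15, 0.20), "malignant": False},
]
top = retrieve_similar((0.88, 0.79), reference, k=2)
score = malignancy_score(top)
```

Showing the retrieved, visually similar ROIs alongside the score is what lets radiologists judge whether to trust the classification.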


Recently, Professor Zheng and his team identified and investigated a new short-term breast cancer risk prediction model based on the quantitative analysis of bilateral mammographic density and tissue asymmetry. Several studies (e.g., [8]) have demonstrated that this new model has the potential to increase the discriminatory power of short-term breast cancer risk prediction, and thus to help improve the efficacy of mammography screening.

TRANSITION TO A NEW IMAGING MODALITY
Recently, contrast-enhanced digital mammography (CEDM) has emerged as a promising new breast cancer imaging modality. The technique combines the high spatial resolution of FFDM with the physiologically functional information of breast magnetic resonance imaging (MRI), at significantly faster scanning times and lower cost. However, the possibility of developing CAD schemes for CEDM to assist radiologists in more accurately diagnosing suspicious lesions had not been investigated before. Professor Zheng and his collaborators at Mayo Clinic Arizona and Arizona State University have been working to develop new CAD schemes for CEDM. The work reported in the Annals of Biomedical Engineering [9] presents the first fully automated CAD scheme for CEDM to classify between malignant and benign breast lesions. The study results showed, firstly, that the segmentation of lesions from dual-energy subtraction (DES) images was much

(a) Workflow of CEDM imaging and (b) four images, from left to right: the high-energy image, the low-energy image, and the dual-energy subtraction image displayed at the original and at an adjusted window and level for improved visibility, respectively.

easier and more accurate when lesions are enhanced. Secondly, however, DES loses lesion density heterogeneity information, which may be a disadvantage. Finally, CAD yielded significantly improved performance by mapping the better lesion segmentation results from DES images onto low-energy FFDM images. If these results can be further validated in future large-scale studies, CEDM and CAD could play an important role in reducing unnecessary biopsies in future breast cancer diagnoses.

GENERATING NEW CAD-BASED IMAGE MARKERS
Currently, Professor Zheng and a team of researchers are developing CAD for breast MR images, aiming to identify new image markers for assessing the response of breast lesions to neoadjuvant chemotherapy. For example, in a 2016 study, Professor Zheng and his colleagues used CAD to assess MR images of 151 women undergoing neoadjuvant

Professor Zheng’s study revealed that, even in the first years of its development, CAD already showed significant potential to improve rates of successful breast cancer diagnosis.


chemotherapy for breast cancer; some showed a complete response to the treatment, while others displayed only a partial response. This time, CAD generated image markers by checking for bilateral asymmetry of dynamic contrast-enhancement signals between the left and right breasts of each patient. These markers were computed and then selected to build a multi-feature, fusion-based machine learning model to distinguish between complete and partial responses to neoadjuvant chemotherapy, using MRI acquired before therapy [10]. The study demonstrated that even with a large number of MR images, CAD remained a reliable technique, not only for identifying time-varying cancers in the images but also for classifying different responses of breast cancers to chemotherapy. This result reveals a promising potential for CAD to generate quantitative cancer image markers in patients undergoing chemotherapy. With the insights provided by this latest development, Professor Zheng hopes that clinicians will soon be able to use CAD-generated markers to predict the risk of breast cancer in individual patients, to classify between malignant and benign tumours, and to assess patient responses to chemotherapy more effectively than ever before.
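Bilateral asymmetry markers of the kind described above quantify how much matched measurements differ between the left and right breasts. A toy sketch (the feature values and the two summary statistics are illustrative assumptions, not those of the published model):

```python
def asymmetry_features(left, right):
    """Simple bilateral asymmetry descriptors computed from matched feature
    values (e.g., contrast-enhancement statistics) for the left and right
    breast images of one patient."""
    diffs = [abs(l - r) for l, r in zip(left, right)]
    return {
        "mean_abs_diff": sum(diffs) / len(diffs),  # overall asymmetry
        "max_abs_diff": max(diffs),                # strongest single asymmetry
    }

# Hypothetical matched enhancement features for one patient
left_feats  = [0.42, 0.55, 0.31]
right_feats = [0.40, 0.71, 0.30]
feats = asymmetry_features(left_feats, right_feats)
```

In a fusion-based model, descriptors like these from many feature pairs would be selected and combined by a classifier trained on complete versus partial responders.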


Behind the Research Professor Bin Zheng

E: bin.zheng-1@ou.edu T: +1 405 325 3597 W: www.Medical-imaging.rccc.ou.edu/zheng www.linkedin.com/in/bin-zheng-49ba3232/ W: https://scholar.google.com/citations?user=FnKrwz0AAAAJ&hl=en W: www.ou.edu/coe/ece/faculty_directory/dr_zheng

Research Objectives
Professor Zheng’s research aims to provide clinicians with “visual-aided” tools for cancer diagnosis, developing and validating computerised biomarkers extracted from biomedical images and electrical signals in order to help improve the accuracy and reliability of predicting cancer risk, classifying suspicious lesions, and assessing cancer prognosis and treatment efficacy.

Detail
101 David L. Boren Blvd, Suite 1001, Norman, OK 73072, USA
Bio
Bin Zheng received his PhD from the University of Delaware. He is currently a professor of Electrical and Computer Engineering and an Oklahoma TSET cancer research scholar at the University of Oklahoma. His research interest is to develop and evaluate novel medical imaging informatics tools for disease risk prediction, early detection and prognosis assessment.
Funding
Current work is supported in part by Grant R01 CA197150 from the National Cancer Institute, National Institutes of Health, USA, as well as the Oklahoma Tobacco Settlement Endowment Trust, Peggy and Charles Stephenson Cancer Center, University of Oklahoma.
Collaborators
• Dr Bhavika Patel, Mayo Clinic Arizona
• Drs Teresa Wu and Jing Li, Arizona State University
Co-PIs of grant R01 CA197150:
• Dr Alan Hollingsworth, Mercy Health Center
• Dr Hong Liu, the University of Oklahoma

Personal Response
What are your future plans for research in this area?
Developing novel quantitative image markers using CAD technology has demonstrated its potential to help clinicians (i.e., radiologists, pathologists, oncologists and surgeons) make more accurate and consistent decisions in cancer diagnosis and treatment. Thus, Professor Zheng and his research team will continue their research effort to explore and identify new image features from different imaging modalities, as well as develop and test new machine learning models that can optimally fuse multiple image features to produce new image markers that achieve higher discriminatory power in predicting cancer risk, diagnosing suspicious lesions, and assessing cancer prognosis and treatment efficacy.

References
[1] Zheng, B, Ganott, MA, Britton, CA, Hakim, CM, Hardesty, LA, Chang, TS, Rockette, HE, Gur, D, (2001). ‘Soft-copy mammographic readings with different computer-assisted detection cuing environments: preliminary findings’. Radiology, 221(3), 633-640.
[2] D’Orsi, C, (2001). ‘Computer-aided detection: There is no free lunch’. Radiology, 221:585-586.
[3] Fenton, J, Abraham, L, Taplin, S, Geller, B, Carney, P, D’Orsi, C, Elmore, J, Barlow, W, (2011). ‘Effectiveness of computer-aided detection in community mammography practice’. Journal of the National Cancer Institute, 103:1152-1161.
[4] Zheng, B, Sumkin, JH, Zuley, M, Lederman, D, Wang, X, Gur, D, (2012). ‘Computer-aided detection of breast masses depicted on full-field digital mammograms: a performance assessment’. British Journal of Radiology, 85:e153-161.
[5] Zheng, B, Leader, JK, Abrams, GS, Lu, AH, Wallace, LP, Maitz, GS, Gur, D, (2006). ‘Multiview-based computer-aided detection scheme for breast masses’. Medical Physics, 33:3135-3143.
[6] Wang, X, Li, L, Xu, W, Liu, W, Lederman, D, Zheng, B, (2012). ‘Improving performance of computer-aided detection of subtle breast masses using an adaptive cueing method’. Physics in Medicine and Biology, 57:561-575.
[7] Zheng, B, Lu, A, Hardesty, LA, Sumkin, JH, Hakim, CM, Ganott, MA, Gur, D, (2006). ‘A method to improve visual similarity of breast masses for an interactive computer-aided diagnosis environment’. Medical Physics, 33:111-117.
[8] Wang, X, Li, L, Liu, W, Xu, W, Lederman, D, Zheng, B, (2012). ‘An interactive system for computer-aided diagnosis of breast masses’. Journal of Digital Imaging, 25:570-579.
[9] Danala, G, Patel, B, Aghaei, F, Heidari, M, Li, J, Wu, T, Zheng, B, (2018). ‘Classification of breast masses using a computer-aided diagnosis scheme of contrast enhanced digital mammograms’. Annals of Biomedical Engineering, 46:1419-1431.
[10] Aghaei, F, Tan, M, Hollingsworth, AB, Zheng, B, (2016). ‘Applying a new quantitative global breast MRI feature analysis scheme to assess tumor response to chemotherapy’. Journal of Magnetic Resonance Imaging, 44(5), 1099-1106.



Engineering and Technology ︱ Professor Gilles Desthieux

Developing the Geneva Solar Cadaster: A decision support tool for sustainable energy management in urban areas Gilles Desthieux, Associate Professor at the Geneva Institute of Landscape, Engineering and Architecture and a Consultant in Urban Energy Planning with Amstein+Walthert Geneva, leads a team of researchers who have developed the Geneva Solar Cadaster, a tool for modelling solar radiation and energy production from building rooftops and facades.


Recent urban studies have shown that our cities play a significant role in environmental issues and, in particular, in the energy transition. One of the targets of the EU 2030 Framework for Climate and Energy is to reach a minimum of 27% renewable energy consumption and to reduce carbon emissions. To achieve this target, cities will have to accurately evaluate their renewable energy sources.

SOLAR PANELS
The use of solar panels on building rooftops is now widespread. In built-up areas, however, incoming sunlight is limited, which restricts the deployment of solar power installations. Recent improvements in solar panel efficiency, together with innovative concepts such as Nearly Zero Energy Buildings and Building Integrated PhotoVoltaics, have enabled the exploration

of other usable surfaces, including highway roofs and walls, for potential energy production. Building facades are particularly appealing for producing solar energy during the winter months when the sun is lower in the sky.

MODELLING POTENTIAL SOLAR ENERGY
Modelling the accessibility of solar energy within the fabric of the built environment can be carried out using available 3D information about cities and image processing. Facade modelling for solar analysis, however, has received less consideration than rooftop modelling, as it requires much more complex tools based on 3D GIS data. Gilles Desthieux, Associate Professor at the Geneva Institute of Landscape, Engineering and Architecture and a Consultant in Urban Energy Planning

with Amstein+Walthert Geneva, leads a team of researchers who have developed a tool for modelling solar radiation and energy production on inner-city building rooftops and vertical facades. This integrated tool uses Light Detection and Ranging (LiDAR) data together with



2D and 3D cadastral data. (In Switzerland, plots of land are registered in a cadaster, or cadastre, and their geometric details, structures, current situation and ownership are recorded.) Around ten public and private stakeholders collaborated on this R&D project, which was supported by several funders: the State of Geneva and the Geneva energy company SIG (in the framework of Geneva’s solar cadaster); the Swiss Federal Agency for the Promotion of Innovation Based on Science (the iCeBOUND project, a Cloud-Based Design Support System for Urban Numeric Data); and the University of Applied Sciences Western Switzerland, HES-SO (joint research programme Energy District 2050).

AIMS
The project aims to design and develop a Decision Support System, based on the use of 3D digital urban data, that facilitates environmental analyses in large built-up areas, such as the assessment of solar energy potential. The project targets building rooftops and facades in urban areas so as to provide relevant indicators for decision-makers in city planning and energy management and to boost solar energy production. Two key goals are to allow residents to check the solar energy potential of their building and to promote the installation of PhotoVoltaics, encouraging the energy transition away from nuclear power.

METHODOLOGY
The methodology is underpinned by the researchers’ previous work on ‘Solar Energy Potential’, in which they evaluated the potential of urban buildings’ roofs and facades to produce solar energy using both thermal and photovoltaic technology. The solar energy potential of a particular building is computed by summing the estimated solar radiation in kWh for every hour over a given period of time. The input data comprises location, meteorological data (statistics or measurements), average solar radiation parameters and the 3D surrounding relief and landscape.
A Shading Algorithm, based on image processing techniques, is used to calculate the hourly shading and the ratio of visible sky, also known as the Sky View Factor.
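A shadow-casting pass of this kind can be pictured as ray-marching over the height raster: from each point, step toward the sun and test whether any cell rises above the line of sight. The sketch below is a simplified illustration, not the published algorithm (the grid conventions, one-cell step size and single-elevation Sky View Factor are all assumptions made for brevity):

```python
import math

def sun_visible(dsm, x, y, cell_size, sun_azimuth_deg, sun_elevation_deg):
    """Ray-march from cell (x, y) of a height raster toward the sun; the
    point is shaded if any cell along the ray rises above the sight line.
    Toy convention: azimuth 0 deg points along +y, 90 deg along +x."""
    az = math.radians(sun_azimuth_deg)
    tan_elev = math.tan(math.radians(sun_elevation_deg))
    dx, dy = math.sin(az), math.cos(az)
    h0 = dsm[y][x]
    step = 1
    while True:
        px = int(round(x + dx * step))
        py = int(round(y + dy * step))
        if not (0 <= py < len(dsm) and 0 <= px < len(dsm[0])):
            return True  # reached the raster edge without obstruction
        sight_line = h0 + step * cell_size * tan_elev
        if dsm[py][px] > sight_line:
            return False  # a taller cell blocks the sun
        step += 1

def sky_view_factor(dsm, x, y, cell_size, n_azimuths=16, elevation_deg=45):
    """Crude Sky View Factor: fraction of azimuth directions with open sky
    at one fixed elevation (a real SVF integrates over many elevations)."""
    open_dirs = sum(
        sun_visible(dsm, x, y, cell_size, i * 360 / n_azimuths, elevation_deg)
        for i in range(n_azimuths))
    return open_dirs / n_azimuths
```

Repeating the visibility test for every raster cell and every hourly sun position yields the hourly shading masks that feed the radiation calculation.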

3D representation of the solar radiation outputs (in kWh/m2/year) on existing building facades and roofs in the “Meyrin Cité” neighbourhood (Geneva). This figure was previously published in Desthieux G, et al. (2018). ‘Solar Energy Potential Assessment on Rooftops and Facades in Large Built Environments Based on LiDAR Data, Image Processing, and Cloud Computing. Methodological Background, Application, and Validation in Geneva (Solar Cadaster)’. Front. Built Environ. 4:14. doi 10.3389/ fbuil.2018.00014 and is under the Creative Commons Attribution License (CC BY 4.0).

The data is stored in a georeferenced TIFF raster format and is made up of a Digital Urban Surface Model, a slope matrix and an orientation matrix. The Digital Urban Surface Model is constructed from LiDAR data and represents the structure of the city, such as buildings and houses. The slope matrix describes the slope of each point using values between 0 and 90°, and the orientation matrix describes the orientation of each point with values between 0 and 360°. These two matrices are produced using common GIS software. The researchers calculate solar radiation by adding its three main components and their related shading indices: direct radiation, which is directly proportional to sun visibility; diffuse radiation, derived from sky visibility; and reflected radiation from the ground and nearby objects. Solar maps can then be produced, enabling the accurate evaluation of urban areas for the installation of renewable energy technologies, such as PhotoVoltaics and thermal solar panels, on building rooftops.
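The composition step just described can be sketched in a few lines. The functions below are a simplified illustration only; in particular, the reflected-radiation term is a common textbook approximation (albedo times global radiation times the ground-view fraction), not necessarily the exact formulation used in the cadaster:

```python
def hourly_radiation(direct, diffuse, albedo, sun_visibility, svf):
    """Combine the three components for one surface point and one hour
    (irradiances in kWh/m2): direct radiation scaled by sun visibility
    (0, 1, or a fraction), diffuse radiation scaled by the Sky View
    Factor, plus an assumed ground/object reflection term."""
    reflected = albedo * (direct + diffuse) * (1 - svf)
    return direct * sun_visibility + diffuse * svf + reflected

def solar_potential(hourly_values):
    """Potential over a period = sum of the hourly estimates (kWh/m2)."""
    return sum(hourly_values)

# Illustrative numbers, not measured Geneva data:
noon = hourly_radiation(0.5, 0.1, 0.2, 1.0, 1.0)    # unshaded, open sky
shaded = hourly_radiation(0.5, 0.1, 0.2, 0.0, 0.5)  # shaded, half-open sky
```

Summing these hourly values over a year for every rooftop and facade point is what produces the kWh/m2/year maps shown in the figures.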

Web-based tool developed by arx under the mandate of SITG; an example of the display of main rooftop characteristics. Reproduced from SITG, https://sitg-lab.ch/solaire (2017).

FACADE COMPONENT
The Digital Urban Surface Model does not permit the modelling of building facades, so the researchers have developed a facade component. This generates a grid of facade ‘hyperpoints’ at 1-metre intervals using building outlines in the form of 2D vector data. The shading algorithm is then applied to calculate hourly shading and the Sky View Factor in a similar way to the rooftops.

APPLICATION TO THE SOLAR CADASTER FOR GENEVA
In 2011, Gilles Desthieux and the research team started working on a solar cadaster for the Canton (or State) of Geneva, an area of about 300 km2, to support the growth of solar energy production (in accordance with both the Geneva Energy Law and the Master Energy Plan of Geneva). The new solar cadaster, highlighting raw solar radiation on building roofs, was made available to the public. During the second phase of the project, in 2014, the researchers developed indicators relating to solar

3D representation of the solar radiation outputs (intensity in kWh/m2/year) on building facades in a sector of the Municipality of Carouge, mixing new tall building developments and existing industrial areas. This figure was previously published in Desthieux G, et al. Front. Built Environ. (2018). (CC BY 4.0).



energy production, environmental assessment, economic investment and payback. In 2016, during the third phase of the Geneva solar cadaster project, the research team updated the 2011 solar cadaster using new LiDAR data and 3D urban models. They also refined the solar modelling algorithms and improved the computational models, using cloud computing, in order to provide decision-makers with relevant indicators for city planning and energy management. The outputs from the revised Geneva solar cadaster have been scientifically validated by an international expert on solar modelling.

CLOUD COMPUTING
The majority of urban solar cadasters available in Europe cover restricted areas, usually less than 100 km2. Geneva’s solar cadaster, however, covers the whole Canton of Geneva, not just the city, extending to around 300 km2. This brought challenges for Desthieux and the team with regard to computation time: the original Geneva solar cadaster calculation in 2011 was computed on a single server machine and took approximately 2,000 hours. To reduce the computation time of solar analysis over large urban areas, the researchers opted to use cloud computing in the framework of the iCeBOUND project mentioned above. This reduced the overall computation time for the whole state of Geneva to about 350 hours.

RESULTS
The solar cadaster of Geneva has shown that a potential solar electricity production of 700 MWe could be achieved if PhotoVoltaics were installed on all the

The principle of the shadow casting algorithm, adapted with permission from ©2018 Front. Built Environ. (Desthieux et al., 2018).

Sharing this data and information is essential to promote an open energy transition strategy in order to move away from nuclear power. suitable rooftops of the Canton. At the end of 2016, Geneva solar electricity production was approximately 45 MWe. Desthieux mentions that while achieving 100% of this potential solar electricity production might not be realistic, it remains a significant target. DISSEMINATION Results are communicated via the Geneva official geoportal (SITG – Geneva Territorial Information System) and a Web-based interface that has been customised for public use. The geoportal, aimed at professionals working with solar energy, allows users to extract the whole solar database on any perimeter or group of buildings, supporting solar energy planning on various scales. Interactive maps together with information identifying suitable rooftops are published on the public web interface for various end-users, such as energy planning authorities, energy companies’ marketing and investing departments and the general public, to promote awareness

and encourage the installation of PhotoVoltaics and thermal solar panels. Gilles Desthieux states that sharing this data and information is essential to promote an open energy transition strategy in order to move away from nuclear power to green technologies. The public web interface is used between 80 and 100 times per week. This is in line with the number of applications for public funds, particularly the Geneva funding program for the renovation of buildings. PERSPECTIVES The researchers are planning to develop workshops, seminars and training sessions for potential stakeholders as well as targeted information on the Web to boost the solar market. They also intend to extend the solar cadaster to cover the whole agglomeration of Geneva, including French municipalities (new project “G2 Solaire” supported by the European program Interreg V for French-Swiss collaborationships). It is anticipated that the computation time can be further reduced through the use of a Graphics Processing Unit which accelerates computing and is adapted to matrix processing.

Generation of facades’ “hyperpoints” using 2D vector data of building outlines. This figure was previously published in Desthieux G, et al. Front. Built Environ. (2018). (CC BY 4.0).
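The hyperpoint idea described above – sampling facade points at 1-metre intervals from 2D building outlines – can be sketched in a few lines of Python. This is a minimal illustration only, not the authors’ implementation: the function name and the simple vertical stacking up to a single building height are assumptions.

```python
import math

def facade_hyperpoints(outline, height, spacing=1.0):
    """Sample a regular grid of facade 'hyperpoints' along a 2D
    building outline (a list of (x, y) vertices in metres), stacked
    vertically every `spacing` metres up to the building height."""
    points = []
    for (x1, y1), (x2, y2) in zip(outline, outline[1:]):
        seg_len = math.hypot(x2 - x1, y2 - y1)
        if seg_len == 0:
            continue  # skip degenerate segments
        n_steps = int(seg_len // spacing)
        for i in range(n_steps + 1):        # walk along the wall
            t = i * spacing / seg_len
            x = x1 + t * (x2 - x1)
            y = y1 + t * (y2 - y1)
            z = 0.0
            while z <= height:              # stack points up the facade
                points.append((x, y, z))
                z += spacing
    return points

# Example: a 10 m x 5 m rectangular building, 6 m tall.
grid = facade_hyperpoints([(0, 0), (10, 0), (10, 5), (0, 5), (0, 0)], 6.0)
```

For the rectangular example this yields 238 points (11 and 6 samples along the long and short walls respectively, each with seven height levels). In the real cadaster, each hyperpoint would then be passed to the shading algorithm to compute hourly shading and the Sky View Factor.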


The research team are planning the next version of the solar cadaster and aim to produce a collaborative platform for everyone involved with solar energy in order to increase the number of installations and boost the solar market.


Behind the Research

Professor Gilles Desthieux

E: gilles.desthieux@hesge.ch T: +41 22 546 26 80 W: www.hesge.ch/hepia/ W: https://sitg-lab.ch/solaire/

Research Objectives

References

Gilles Desthieux’s current research and consultancy activities deal with integrated urban and energy planning at district and municipal levels; the development of GIS tools for energy mapping and planning; 3D urban modelling for environmental assessment, covering solar energy (applied to the solar cadaster in Geneva) and flood risks; platforms to support eco-neighbourhood development; and collaborative urban planning support based on geographical indicator systems.

Desthieux G. et al. (2018). ‘Solar Cadaster of Geneva: A Decision Support System for Sustainable Energy Management’. From Science to Society, Springer, 129-137.

Detail
Geneva Institute of Engineering, Architecture and Landscape (HEPIA), Prairie Street 4, CH-1202 Geneva, Switzerland

Bio
Gilles Desthieux, PhD and MSc from EPFL, is an associate professor at the Geneva Institute of Landscape, Engineering and Architecture and a consultant in urban energy planning at Amstein+Walthert Geneva. His research and consultancy activities deal with urban and energy planning, GIS tools for energy mapping, and collaborative urban planning support.

Funding
• State of Geneva (Energy, Topographic, Geomatics agencies)
• Geneva energy service agency (SIG)
• Swiss Innovation Agency

Collaborators
• Claudio Carneiro (State of Vaud, Switzerland)
• Reto Camponovo (HES-SO/hepia, Switzerland)
• Peter Gallinelli (HES-SO/hepia, Switzerland)
• Nabil Abdennadher (HES-SO/hepia, Switzerland)
• Anthony Boulmier (HES-SO/hepia, Switzerland)
• Eugenio Morello (Politecnico di Milano, Italy)
• Phelan Leverington (State of Geneva, Switzerland)
• Alberto Susini (State of Geneva, Switzerland)
• Christelle Anthoine-Bourgeois (Geneva energy service agency, Switzerland)
• David Beni (arx iT, Switzerland)

Desthieux G. et al. (2018). ‘Solar Energy Potential Assessment on Rooftops and Facades in Large Built Environments Based on LiDAR Data, Image Processing, and Cloud Computing. Methodological Background, Application, and Validation in Geneva (Solar Cadaster)’. Frontiers in Built Environment, 4, 14. Available at: https://doi.org/10.3389/fbuil.2018.00014 [Accessed 7/12/18].

Personal Response

How much will installing photovoltaics on building facades in Geneva increase potential solar electricity production?

Considering the potential installation of solar PV panels on a small part of the existing building facades in Geneva (say, 5% of the total facade area), this would represent a potential of about 250 MWe in addition to the potential on the roofs, which is significant. However, solar installations on facades are easier to plan for new building projects. Therefore, in some of the new planned districts analysed, it was estimated that solar PV on facades could cover up to 50% of the electricity needs in the district.

Can this model be applied to other urban areas outside of Switzerland?

In this article, the application of the solar potential tool was illustrated through the solar cadaster in Geneva. However, this experience can be reproduced without any problem in other cities, provided 3D digital and cadastral data are available.



Thought Leader

Engineering the future: How the WE@RIT programme is encouraging more women into the sector

The Kate Gleason College of Engineering at Rochester Institute of Technology (RIT) is the only engineering college in the United States to be named solely after a woman engineer. Inspired by Kate Gleason’s legacy as an innovative and entrepreneurial scholar, the Women in Engineering at RIT (WE@RIT) programme was initiated by Professor of Mechanical Engineering Margaret Bailey in 2003. Since then, it has become a bastion of support for female students in an academic field that remains, even now, vastly male-dominated.

Since its inception in 2003, the Women in Engineering at Rochester Institute of Technology (WE@RIT) programme has directly led to an increase in female students at the Kate Gleason College of Engineering at the institution, and to the setting up of hugely successful projects such as Establishing the Foundation for Future Organizational Reform and Transformation (EFFORT@RIT) and Advancement of Women Faculty (AdvanceRIT). WE@RIT continues to act today as an example for other science, technology, engineering and mathematics (STEM) departments across the United States and beyond, proving that it is possible to change campus culture through well-planned and well-led strategic policies. We caught up with founder of WE@RIT Professor Margaret Bailey to find out a bit more.

Hi Margaret! Can you tell us a little bit more about WE@RIT, its core mission and heritage, and your previous leadership role?

I led the creation of the programme WE@RIT, which is dedicated to expanding the representation of women engineers and leaders within the engineering profession. In support of this mission, WE@RIT provides opportunities for girls and young women to explore engineering, create an engineering community and lead within an engineering environment. I served as the Founding Executive Director from 2003 to 2011, during which time WE@RIT received the WEPAN 2008 Women in Engineering Program Award (http://www.wepan.org/). During my leadership, the Kate Gleason College witnessed a three-fold increase in the number of incoming female students annually (from approximately 50 to 150). In addition, external funding for the organisation reached an annual level of $400K. As the Executive Director, I advised the Dean on issues related to gender diversity within the college; created strategies with Admissions to improve recruitment of women engineering students; managed programme staff, including a full-time programme manager; oversaw financial activities; created and maintained a governance body; established key partnerships; prepared successful funding proposals; and created a thorough programme evaluation system.

Can you explain your current roles as the Senior Faculty Associate to the Provost for ADVANCE and Principal Investigator (PI) and co-chair for the President’s Commission on Women?

I serve as the PI of the NSF ADVANCE Institutional Transformation project at RIT titled ‘Creating Opportunity Networks for Engagement and Collective Transformation: Increasing the Representation and Advancement of Women Faculty at RIT’, or AdvanceRIT (http://nsfadvance.rit.edu/). The other positions you mention closely align with this effort. The AdvanceRIT project aims to increase the representation and advancement of female STEM faculty at RIT through a set of strategic initiatives focused on refining campus culture, improving career navigation and creating new institutional structures. Complementary social science research efforts adapt interventions to address the needs of key sub-populations, including women of colour and deaf and hard-of-hearing women faculty.

What are your personal achievements and highlights at RIT?

Significant grassroots engagement in AdvanceRIT’s work over the past years has added positive and amplifying energy to the programme, helping to demonstrate the ongoing necessity and value of the work while engaging a wider group of RIT faculty. Policy and practice changes in support of managing work-life integration have also occurred. AdvanceRIT has been a model of the organisational agility that is part of the university’s strategic plan.

The AdvanceRIT project involves initiatives aimed at refining university structures and practices. In the spring of 2018, AdvanceRIT hosted a salary workshop for RIT faculty titled ‘Let’s Talk about Money: Understanding RIT Pay Practices’. Participants explored salary-related resources which could help to shape future thinking and discussions regarding salary. They also discussed what RIT has learned from past studies regarding faculty salary. All the survey respondents agreed with the following: ‘This session enhanced the way I think about the issues/topics discussed.’

The AdvanceRIT project involves workshops focused on cultural change. Here, Dr Bailey is facilitating a bystander awareness and action workshop in the fall of 2016, which was attended by a large audience of women and men on the RIT faculty and staff. Over 80% of participants asked for more workshops to continue to learn how to become an effective and active bystander. A survey two weeks after the event revealed that 100% of respondents agreed with the following: ‘I think I can make a difference in making the campus more inclusive by being an active bystander.’



What challenges might women face in STEM education and careers? Would you say that these challenges have dramatically changed since you were studying and starting out in your career?

Isolation, cultural and climate-related issues, and challenges around work-life integration are rife. In academic fields, going for tenure occurs at the same time as we are having children, and women remain the primary care-giver in most cases. In addition, some fields are still male-dominated, so there is a lack of role models for both students and faculty. When I was an undergraduate at Pennsylvania State, there were no women faculty in my STEM classes. The Society of Women Engineers (SWE) had recently started, but the programming was minimal. We actually thought it was strange that they had programmes for us; back then, people did not talk about it. Currently, 25% of the faculty in my department are women. The numbers have changed dramatically, and so have programmes and initiatives focused on work-life integration, reducing isolation and supporting the building of community networks for women faculty and students.

Dr Margaret Bailey has been on the faculty at RIT in Mechanical Engineering since 2003.

What has your personal experience been as a woman in a leading role?

It has been wonderful to work with a team of predominantly women leaders, and many energised male leaders on campus, around issues focused on inclusivity. Leading an institutional transformation project is challenging. Some of the things I have had to learn along the way include financial and managerial skills and knowledge, but they also include things that I find exciting and very complex: things like cultural change and changing structures within the university.

Prior to joining RIT, you were an Assistant and Associate Professor at the United States Military Academy at West Point, New York. You created the first student section of the SWE to exist at West Point and served as its faculty advisor – can you tell us about your time there?

I was at West Point for five years and the students were great to work with. It was very male-dominated: there were about ten civilian women on the whole faculty, and being a young civilian woman, I would get a lot of attention. I sometimes felt as if I was in the spotlight. West Point was created, designed and refined over the years to be a place where leadership development happens. A vital part of leadership development is the presence of role models around students. I looked around myself, to the students, who were amazing young people, and to faculty colleagues who had been in the military for many years, and I found many role models to watch and learn from. I think I spent those five years transforming who I was, and what that model of leadership looked like for me.

You have received many awards in recognition of leadership and significant contributions in supporting gender diversity initiatives, such as the Maria Mitchell Women in Science Award, the Edwina Award for Gender Diversity and, most recently, the Isaac Jordan Award for Inclusion and Pluralism – how do these awards and events help celebrate and promote women in science?

They are very visible; symbolically and politically they highlight accomplishments and achievements, and often encourage other women and men to do this kind of work. The simple exercise of putting together packets for these awards requires a great deal of effort and helps the nominee, or the person doing the nomination, to tell their own story clearly. This helps with self-awareness and with learning how to talk about the impacts that have been made.

Can you tell us about the NSF Pathways Project and the research investigations in relation to the topic of gender within STEM? What did you establish with this project and how did it create an impact?

Since 2008, I have served as the RIT lead researcher on a cross-university effort to investigate the relationship between undergraduate engineering student participation in cooperative (co-op) educational experiences and self-efficacy development. The NSF Pathways Project involves researchers from RIT, Northeastern University (lead), Virginia Tech and the University of Wyoming. The overarching model for the study proposes that self-efficacy is based on the impact of students’ demographic characteristics, the effect of work experience and the contextual support provided by the university. This research has resulted in several award-winning publications, and the findings verify the pathways model. Academic self-efficacy and contextual support in all time periods are found to be critical to retention. Contextual support is found to be particularly important to women.

Over the past five years, the AdvanceRIT project has hosted over 30 workshops focused on cultural change through unconscious bias education. Many of these sessions have used interactive theatre as the delivery approach. In the academic year 2017/2018, nearly 400 participants from the RIT faculty and staff attended one of the workshops.

How important were both the Establishing the Foundation for Future Organizational Reform and Transformation (EFFORT@RIT) and Advancement of Women Faculty (AdvanceRIT) projects at RIT?

Extremely important! The EFFORT grant was the catalyst for RIT obtaining the much larger institutional transformation grant. It involved collecting objective data, working with human resources to gather those data, creating the climate survey to collect and analyse job-satisfaction data, and benchmarking how RIT’s practices and policies compared with those of other schools. Without the EFFORT grant, AdvanceRIT may not have been possible.

Here, RIT faculty participate in a mentoring workshop in the spring of 2018, which Dr Bailey co-facilitated. Participants explored different models of mentoring and learned how to design and implement a networked mentoring programme.

What needs to be done to ensure that women continue to enter STEM education and careers?

Efforts need to continue as long as women are under-represented in these areas: efforts focused on reducing isolation and promoting the growth of networks and cultural change for all members of the campus community. Out of all of those, cultural change is the most important and the most difficult. Cultural change efforts often challenge people to be reflective, sometimes a bit vulnerable, and open to the possibility of alternative models of behaviour. This type of organisational development is challenging to create and administer; however, the results can be well worth the effort.

To find out more about WE@RIT, please visit www.rit.edu/kgcoe/women/.

Rochester Institute of Technology, One Lomb Memorial Drive, Rochester, NY 14623-5603, USA E: Margaret.Bailey@rit.edu T: +1 585 475 2960 W: www.rit.edu/ W: www.rit.edu/kgcoe/staff/margaret-bailey W: https://nsfadvance.rit.edu/



Behavioural Sciences ︱ Professor Hadas Mandel

Gender inequality: occupational devaluation and pay gaps

Comparative research on long-term trends in gender inequality has largely neglected structural mechanisms. As more women reach positions of power, structural elements will become more significant. Despite the growing body of literature in this area, the long-term effect of the changing gender composition of occupations on their relative pay has been largely neglected. Hadas Mandel, an Associate Professor in the Department of Sociology and Anthropology at Tel Aviv University, has addressed this gap in the literature by exploring the negative effect of occupational feminisation on occupational pay in the US and the mechanisms underlying these trends.

The theoretical argument for understanding long-term trends in the association between feminisation and occupational pay rests on a distinction between two processes related to gender inequality that have occurred in recent decades in the labour markets of most western democracies. The first process, which relates to women as individuals, addresses the upward occupational mobility of women, meaning that women are incrementally entering higher rungs of the occupational hierarchy. The second process – conceptualised as a structural process – refers to the criteria for rewarding occupations. The question in this regard is whether gender is one of those criteria and, if so, whether occupations are devalued following the entry of women. The literature documenting long-term trends in gender inequality has tended to focus heavily on the former, i.e. the upward occupational mobility of women on the occupational ladder. The latter, i.e. the structural implications, is largely overlooked. However, the two are inherently connected, as a notable consequence of the growing occupational attainments of women over recent decades is evident in the way occupational feminisation affects the pay level of occupations. Although a considerable amount of research has highlighted the negative association between the percentage of women in occupations and their rewards, most of these studies have focused on the causal mechanisms of the process rather than on the dynamics over an extended period of time.

Prof Mandel sought to address this lacuna in the literature by examining trends in the effect of occupational feminisation on occupational pay over several decades in the US and exploring the mechanisms underlying these trends. Using integrated data on individuals and occupations from the US Census (1960-2010) and the ACS surveys (2001-2015), her findings show, similarly to previous studies, that in recent decades, and especially from 1980 onwards, a growing number of women in the US have approached the head of the occupational ladder (see the first figure below). This shift has been fuelled by women’s growing educational attainments, which, together with the rising economic premium to education, have greatly contributed to the decline in gender wage gaps. Furthermore, based on these changes, the negative association between female percentage in occupations and occupational pay levels declines over time (see Model 1 in the second figure below). This decline is most apparent from 1980 onward, a period in which US women witnessed a significant improvement in their occupational standing, and in which occupations requiring higher education enjoyed a large wage premium. However, when examining the effect of the gender composition of the occupation after accounting for women’s higher education and for the level of education in occupations, the trend is reversed: the negative net effect of female percentage on occupational pay intensifies over time (see Model 2 in the second figure below). These two opposite processes reflect the upward occupational mobility of women, on the one hand, and its gendered consequences, on the other.

Trends in female proportion in occupations by levels of average occupational pay, 1960-2015 (low-, mid- and high-pay tertiles).

Trends in the effect of gender composition (% female) on the average pay of occupations, before (Model 1) and after (Model 2) accounting for education, 1960-2015.

The major role education plays in explaining the divergent trends is twofold. The entry of women into occupations requiring higher education, and the growing economic reward to high education and to occupations with higher educational requirements, may both conceal the trend in the devaluation effect, as they contribute to weakening the correlation between the percentage of women and pay across occupations over the course of time. Thus, the intensification of the devaluation effect is revealed only after controlling for education (at both the occupational and individual level), because the growing educational level of women, and the growing rewards to education, are processes that run counter to devaluation and thus conceal its intensification.

Consider the example of industrial engineers and electrical engineers. Both occupations demand high education (more than 70% of incumbent workers in 2010 had an academic degree), and both enjoyed a wage premium during the period studied. However, while in both occupations the percentage of women in 1960 was negligible (2% and 1%, respectively), 50 years later only 10% of electrical engineers were women, compared to 19% – almost double – of industrial engineers. As both occupations enjoyed wage premiums, the devaluation effect may not be observed, because the process of feminisation was not followed by an absolute wage reduction. Rather, feminisation is associated with a smaller wage premium relative to comparable highly educated occupations. Thus, controlling for education is essential for revealing the devaluation process. Indeed, we see that while electrical engineers enjoyed a premium of 25% during the period studied, industrial engineers enjoyed a premium of less than 19%.

The findings demonstrate the interrelationship between two opposing gendered processes and provide concrete evidence that gender stratification operates differently at the individual and at the structural/occupational level. The split between individual and occupational forms of gender in/equality, and the divergent trend of each, are crucial for our understanding of gender inequality in theory as well as in practice. This is because structural mechanisms are not directed at any specific individual and thus are more ambiguous and more difficult to track empirically. The danger is therefore that the importance of gender as a determinant of economic inequality in the labour market will be less visible, less amenable to empirical assessment, and not sufficiently acknowledged.
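The reversal of the feminisation effect once education is controlled for can be illustrated with a toy regression. The numbers below are invented purely for illustration (this is not Mandel’s data or model): pay is constructed to reward education and to penalise female share, yet because female share rises with education, the gross effect of % female on pay appears positive; the negative effect is revealed only when education enters the model.

```python
import numpy as np

# Stylised occupations: education level, female share and mean pay.
# All numbers are invented for illustration. Pay is built to reward
# education (+2 per level) and to penalise feminisation (-1 per unit
# of female share), while female share rises with education.
educ   = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
female = np.array([0.1, 0.3, 0.3, 0.5, 0.6, 0.8])
pay    = 2.0 * educ - 1.0 * female

def slopes(y, *xs):
    """OLS slope coefficients (intercept included, then dropped)."""
    X = np.column_stack([np.ones_like(y), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

gross, = slopes(pay, female)        # Model 1: % female only
net, _ = slopes(pay, female, educ)  # Model 2: controlling for education

print(f"gross effect of % female: {gross:+.2f}")  # positive: hidden by education
print(f"net effect of % female:   {net:+.2f}")    # about -1: devaluation revealed
```

The suppression at work here mirrors the electrical/industrial engineer example: both highly educated occupations gain pay over time, so the penalty to feminisation shows up only as a smaller premium, not an absolute decline.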

GENDER AND RACIAL PAY GAPS

A different topic that Prof Mandel investigates deals with the intersection between race and gender in earnings inequality. Segregation and earnings disparities between blacks and whites observed at the turn of the 21st century in the US are considerably lower than those documented in the middle of the 20th century. The rate of decline in racial pay disparities was fairly rapid following the enactment of the Civil Rights Act but has slowed in recent decades. Despite the wide consensus that racial economic disparities are declining, researchers do not fully agree on the sources, causes and trajectory of the decline, or whether the trends are similar for both men and women. Although the literature on racial earnings disparities has grown, most studies on the topic focus on the male population.

To address this issue, Prof Mandel, in collaboration with Moshe Semyonov, used IPUMS data between 1970 and 2010 to examine the trends and sources of the racial pay gap among men and women in the US labour force. The findings highlighted the significance of the intersection between gender and race. They found that gender differences in the racial pay gaps were so pronounced that it was not possible to reach conclusions regarding racial pay gaps based on data for only one of the two gender groups. Racial pay gaps were substantially larger among men than among women at all time points. Although this demonstrates the significance of the intersection of gender and race, the findings do not support a double-disadvantage hypothesis. Instead, they show that black men, not black women, are the prime target of economic discrimination in the US labour market. However, the findings also show that earnings inequality is more gendered than racialised: women of both races have an economic disadvantage in comparison with men. The ‘racial advantage’ of black women compared with black men should thus be understood within their overall gender disadvantage.

Average weekly wage by race and gender, 1970-2010 (white men, black men, white women, black women).

Despite these differences between the genders, the trend over time was very similar for both: racial gaps sharply declined between 1970 and 1980 and continued to decline at a slower rate until 2000. However, at the turn of the millennium, the trend reversed for both gender groups. Given that earnings inequality was found to be ‘more gendered than racialised’, the similarity of the trends is intriguing. The post-1970 decline in the gap can be understood as resulting from the enactment of the Civil Rights Act and the implementation of affirmative action policies. The reversal of the trend at the turn of the millennium may have several explanations.

One possible explanation relates the widening racial gaps to the sharp increase in overall income inequality during the 2000s. The findings showed that, in the case of men (but not women), once variations in income inequality across decades were statistically controlled, the increase in the racial pay gap during the 2000s was much smaller. This implies that, in addition to changes directly related to race, shifts in income distribution during the first decade of the millennium were more detrimental to the earnings of blacks, particularly black men, than to those of whites. Some evidence also implies an increase in market discrimination against blacks at the beginning of the new millennium. During the 2000s, rewards to an academic degree increased more for whites than for blacks, so whites benefited more from higher education relative to blacks. This was followed by stagnation in the process of occupational desegregation, which may also indirectly indicate a rise in economic discrimination against blacks. Although different pay rewards of higher education for blacks and whites, as well as stagnation in desegregation, are only implicit indicators of discrimination, the simultaneous changes in the two trends – i.e. the reversal of the trend among the two gender groups, coupled with our knowledge of government reforms during this very period – may point to growing discrimination.


Behind the Research

Professor Hadas Mandel

E: hadasm@tauex.tau.ac.il T: +972 3 640 7922 W: http://people.socsci.tau.ac.il/mu/hadasm/ W: https://scholar.google.co.il/citations?user=GUGugqEAAAAJ&hl=en&oi=ao W: www.researchgate.net/scientific-contributions/2009968734_Hadas_Mandel

Research Objectives Professor Mandel’s research addresses the analytical and methodological distinctions between structural and individual aspects of gender inequality, underpinning the development of gender inequality over time and across societies.

Detail
The Department of Sociology and Anthropology, Faculty of Social Science, Naftali Building, Tel Aviv University, Ramat Aviv, Israel

Bio
Hadas Mandel is an Associate Professor in the Department of Sociology and Anthropology at Tel Aviv University. Her research focuses on the intersection between gender, class and race, and the complex implications of welfare state policies for women’s economic attainments. Since 2017, she has been the Principal Investigator of the ERC-funded ‘Structural vs. Individual’ project, addressing the analytical and methodological distinction between structural and individual aspects of gender inequality underpinning the development of gender inequality over time and across societies.

Funding
The European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (Grant Agreement No. 724351).

Collaborators
• Amit Lazarus
• Assaf Rotman
• Adi Moreno

References
Mandel, H. (2013). ‘Up the Down Staircase: Women’s Upward Mobility and the Wage Penalty for Occupational Feminization, 1970-2007’. Social Forces, 91(4), 1183-1207.
Mandel, H. (2016). ‘The role of occupational attributes in gender earnings inequality, 1970–2010’. Social Science Research, 55, 122-138.
Mandel, H., & Semyonov, M. (2016). ‘Going back in time? Gender differences in trends and sources of the racial pay gap, 1970 to 2010’. American Sociological Review, 81(5), 1039-1068.
Mandel, H. (2018). ‘A Second Look at the Process of Occupational Feminization and Pay Reduction in Occupations’. Demography, 55(2), 669-690.

Personal Response
What is the focus of your future research in this area?
I expect the dynamic of the structural vs. individual processes to vary between countries and also by class. I thus seek to examine the processes in four representative countries – Sweden, Germany, Spain and the United States – that differ in many of the institutional aspects that affect gender inequality, including the provision of welfare, gender ideology, wage structure, and political economy factors. Therefore, gender in/equality processes in these countries are expected to take different forms in both structural and individual appearances. That said, in all countries I expect gender in/equality processes to vary by class, and I thus seek to examine the processes after distinguishing between classes of workers. I expect gender equality to be more pronounced and rapid for advantaged women. At the structural level, however, the rapid upward occupational mobility of skilled and educated women may expose highly rewarded occupations to devaluation and pay reduction more than others.

www.researchoutreach.org

69


Behavioural Sciences ︱ Dr Andrea Lavazza

The dawn of memory modulation and self-prescribed forgetting – a moral dilemma
Human memory is an incredible feat of the brain, storing our fondest memories alongside our greatest heartaches, nightmares and, frankly, memories we might rather do without. What was once an idea bound to science fiction may now be a possibility, at least theoretically: advances in neuroscience and psychology have raised the prospect of physically and intentionally altering memory and inducing forgetting. Andrea Lavazza, a research fellow at the International University Centre in Arezzo, Italy, focuses on bringing these ideas to light to allow proper debate on a powerful and controversial topic.

Humans often pride ourselves on being the ‘top’ species: more creative, intelligent and sophisticated than even our closest relatives. We possess a number of seemingly unique, complex and powerful abilities: imagination, will, language, consciousness and morality. Perhaps the most integral is memory. Memory, whilst shared by many species, is particularly important to humans – it truly holds the key to individuality and to the development of “The Self”. You are an amalgamation of experiences in time: thousands, even millions of moments, some trivial, others life-changing, all stored by the brain. Your fondest moments can be replayed at will (for most of us at least), bringing a smile to your face. By the same token, your worst moments can do the same, with quite different results. For many, childhood experiences and memories shape later life and identity. For those fortunate enough to have a pleasant upbringing, this can provide stability and security. For others, however, where early experiences were acutely negative and traumatic, it can create difficult problems to navigate. The impact of painful memories on later life and the self can be seen in Post-Traumatic Stress Disorder (PTSD): an acutely traumatic memory (resulting from

an accident, an aggression, or a war episode) can cause anguish long after the event. People can spend many hours and a lot of money on psychotherapies addressing negative memories in an attempt to resolve them. The value of these approaches is unquestioned, but perhaps newer science can offer an alternative.

SCIENCE FICTION MATERIALISES
Altering memory has been a common trope of science fiction films for decades (think of “Total Recall”, “Strange Days” or “Eternal Sunshine of the Spotless Mind”). For most, it remains a fictional idea rather than a possible reality. But not for all. Andrea Lavazza of the Centro Universitario Internazionale (Arezzo, Italy) specialises in neuroethics – a new field that examines the wider implications of neuroscience research. His recent work has focused on the question of memory modulation: is it possible? Should we do it? What might the implications be?

CAN WE?
Lavazza highlights the interesting finding that the drug propranolol has been found to modulate the emotional response to certain memories. Propranolol is a beta blocker typically prescribed for hypertension and anxiety. Studies have found that if it is taken within six hours of an emotionally salient event, the emotional response to the memory of that event is mitigated (Brunet et al, 2008). The drug works by inhibiting or reducing the effect of stress hormones such as adrenaline and cortisol. This, by proxy, means the negative response associated with the memory of the event is reduced. The memory itself is no less detailed; instead, it is the evoked emotional or stress response which is lessened.


Post-Traumatic Stress Disorder can be more prevalent in service personnel due to their experiences of conflict.

Recent work with animal models has provided more powerful examples of memory modulation. For instance, rats have been conditioned to associate a particular stimulus, such as a cage, with pain. The rat’s brain is then altered pharmacologically to stop the consolidation and storage of that memory; researchers have even switched the valence (positive or negative) of a memory using optogenetics (Redondo et al, 2014). As with all animal model studies, transferring the findings to humans is neither direct nor always advisable. However, the possibility of performing such modifications in humans is certainly real. So to the question “Is memory modulation possible?”, the answer is yes – although the area currently lacks finesse and certainty.

Post-Traumatic Stress Disorder was first widely recognised as ‘shell shock’ in soldiers in WWI.

SHOULD WE?
Moving beyond the practical ability to modulate or delete memories, the real question Lavazza deals with is: should we? Assume the existence of perfectly accurate and reliable methods of memory modulation and deletion. We could then, in an extreme example, remove the traumatic memories that cause PTSD. On the surface, this sounds like a positive use of a potential technology. However, the possible impact of memory deletion on a person’s mind, self and identity must be considered.

CAN THE SELF SURVIVE UNNATURAL CHANGE?
Lavazza raises the issue of how memories of all kinds, including the negatively charged, shape each person. If we begin to alter how we remember negative events, this can lead to a cascade of subsequent decisions that would not have occurred without that initial alteration. Lavazza alludes to a thought experiment suggested by Erler (2011), in which a girl named Elisabeth encounters her former childhood bullies. Ordinarily, most of us as adults would not want to spend much time with anyone who caused us harm, even if it was a long time ago. However, imagine Elisabeth can now alter her memory of the bullies, or at least reduce its associated emotional impact. She can now interact positively with them, perhaps even ‘forgive’ them. This seems like a positive outcome, but it raises a question: was it really Elisabeth making that decision? Wouldn’t she have asked the bullies to repent, or confronted them, had her memories not been altered?

Bullying can also leave children with negative and disruptive memories.

Our genetic makeup and life experiences shape the



self and to alter these can possibly change our self. Therefore, Elisabeth, by altering her ‘natural’ response to negative experiences, has altered Elisabeth as a person. Another hypothetical situation deserves mention: imagine exploited or ill-treated people were induced to take drugs that alleviated their negative memories. They would suffer less, but they might also lose the urge to challenge injustice. Not only would they “betray” their true selves, but the whole of society could be damaged. Lavazza acknowledges that while a single memory may not impact a person’s core self, the possibility that it might do so remains open.

Problematic memories may make up integral parts of our ‘Self’.

The impact of memory modulation and deletion at present depends on one’s position on what ‘The Self’ is: a hotly debated topic. Supporters of certain narrative-based theories of self argue that memory is crucial to the self because it allows the construction of one’s narrative within the realms of reality (see Schechtman, 2014). This means an accurate and truthful representation of reality is integral, and hence sensitive to change. Under this perspective, altering any memory can damage the ‘natural’ or truthful reality that shapes the formation of the self. Narrative theories suggest that any natural changes to the self are gradual, with room for foresight from the individual: they can predict how the changes will take effect, preserving the continuity of the self. Following this reasoning, any attempt to introduce immediate modulation or literal erasure of memories is a very dangerous game to play. Lavazza refers to this kind of formulation of the self as ‘Rigid Identity’. Constructionist viewpoints, on the other hand, view the self as a more fluid, emergent property which will not be lost by changes to memory alone. In fact, the alteration of unpleasant memories could be seen as a form of self-improvement, akin to any other means of achieving it. The possible damage of memory deletion and modulation is far less pronounced if this viewpoint – called ‘Extended Identity’ – is held.

WHAT NEXT?
Lavazza’s work highlights that the current position of the debates surrounding memory modulation and deletion depends on how the self is defined. At present, popular theories are rooted in ‘normative’ conceptualisations of the self, rather than being based on empirical research. Generally, the consensus is that memory modulation could lead either to self-improvement or to the opposite, self-depletion. As is often the case with newly emerging fields, the progression of technology and scientific research will be decisive here. A world where negative memories and experiences are eradicated may seem attractive to the imagination, but it is dubious that anyone would seriously consider it a desirable reality. Learning from negative experiences is natural and crucial to human development. Empathy, compassion and many complex human emotions could be lost without natural responses to negative stimuli. This is truly a fascinating topic, one that will require deliberate debate and sound science to investigate the implications fully.


Behind the Research Dr Andrea Lavazza

E: lavazza67@gmail.com T: 0039 02 67 38 29 34 W: www.cui.org/andrea-lavazza/ W: www.researchgate.net/profile/Andrea_Lavazza2

Research Objectives
Dr Lavazza’s work examines the ethics around memory modulation and erasure.

Detail
Centro Universitario Internazionale, via A. Garbasso, 42 - 52100 Arezzo (I) Italy

Bio
Andrea Lavazza is a senior research fellow at the Centro Universitario Internazionale, Arezzo, Italy. He specialises in moral philosophy and neuroethics. His interests focus on the social and legal implications of science and technology. His publications include the book “Frontiers in Neuroethics: Conceptual and Empirical Advancements” (Cambridge Scholars Publishing).

Collaborators
• Silvia Inglese, Fondazione IRCCS Ca’ Granda Ospedale Maggiore Policlinico, Milan, Italy

Personal Response
How accurate do you think memory deletion can actually be? Can a single, specific memory be deleted?
We must be clear in saying that today there is no safe and effective way to modulate memories at will. But research in this field is taking rapid steps forward. Moreover, a single memory, closely embedded in a semantic network of other memories, seems almost impossible to remove completely at the current state of knowledge.

What one potential benefit of memory modulation would you highlight to the general public?
There are situations of suffering related to psychological trauma that can paralyse a person and literally prevent them from living. The ability to modulate negative memories – making them neutral – could reduce many pains that today have no real cure.

What are the potential risks of unregulated memory modulation?
One can think a duty to remember exists. A witness to a crime should keep her memories in good shape in order to testify in the courtroom. But should an individual who participated in a historical event remember and tell other people what she saw, even though the event was shocking? Think of survivors of the Nazi lagers. Should they remember the facts in order to transmit the vivid story of the Shoah, with its capacity to mobilise people to prevent a similar tragedy from happening again? Or do they have the right to forget and live a happier life?

References
• Lavazza A (2018). ‘Memory-Modulation: Self-Improvement or Self-Depletion?’ Front. Psychol. 9:469. doi: 10.3389/fpsyg.2018.00469
• Lavazza A and Inglese S (2013). Manipolare la memoria. Scienza ed etica della rimozione dei ricordi [Manipulating memory: the science and ethics of memory removal]. Milano: Mondadori Università.
• Muravieva EV, Alberini CM (2010). ‘Limited efficacy of propranolol on the reconsolidation of fear memories’. Learn Mem. 17:306–13.
• Lavazza A (2015). ‘Erasing traumatic memories: when context and social interests can outweigh personal autonomy’. Philos. Ethics Humanit. Med. 10:3. doi: 10.1186/s13010-014-0021-6
• Brunet A, Orr SP, Tremblay J, Robertson K, Nader K, Pitman RK (2008). ‘Effect of post-retrieval propranolol on psychophysiologic responding during subsequent script-driven traumatic imagery in post-traumatic stress disorder’. J Psychiatr Res. 42:503–6.
• Redondo RL, Kim J, Arons AL, Ramirez S, Liu X, Tonegawa S (2014). ‘Bidirectional switch of the valence associated with a hippocampal contextual memory engram’. Nature. 513:426–30.
• Lavazza A (2017). ‘Moral Bioenhancement Through Memory-editing: A Risk for Identity and Authenticity?’ Topoi. doi: 10.1007/s11245-017-9465-9
• Erler A (2011). ‘Does memory modification threaten our authenticity?’ Neuroethics, 4(3), 235-249.
• Schechtman M (2014). Staying Alive: Personal Identity, Practical Concerns, and the Unity of a Life. New York, NY: Oxford University Press.



Health and Medicine ︱ Professors Kye Young Lee and Jae Young Hur

Extracellular vesicle DNA: A promising cancer biomarker Lung cancer patients could one day receive faster, cheaper and more accurate diagnoses thanks to extracellular vesicle DNA found in liquid biopsies. These were the findings of a research team led by Professors Kye Young Lee and Jae Young Hur of Konkuk University’s School of Medicine. Their work offers an alternative to invasive tissue biopsies which are currently used to detect cancer and determine treatments.

Lung cancer is the biggest tumour-related killer worldwide, accounting for 1.6 million deaths per year. Currently, doctors rely on biopsies – invasive procedures that involve taking tissue samples to determine the presence and extent of disease – to diagnose cancers and define the best course of treatment. In fact, patients are increasingly having multiple biopsies during treatment as doctors try to prevent or detect cancers’ resistance to targeted drugs and immunotherapies. However, while demand for biopsies increases, the chances of obtaining adequate tissue samples in later stages decrease, as patients become more susceptible to infection and other diseases, preventing them from having the procedure. Furthermore, the asymptomatic nature of lung cancer


and lack of efficient diagnostic methods means the disease often reaches its later stages before a patient sees a doctor for their first biopsy, leading to poor prognosis. As a result, researchers have turned to liquid biopsies, such as blood tests, to detect cancerous DNA. The more efficient and less invasive nature of these tests means that cancers can be detected with minimal risk to the patient and – once optimised – they could even become a routine measure to help doctors detect cancers sooner and improve patient outcomes. However, finding enough cancerous DNA in blood to achieve truly efficient and accurate testing is challenging. Initially, researchers focused on circulating tumour DNA (ctDNA); however, there is only a very small amount of ctDNA in the blood, making it difficult to detect. And while some studies have shown relatively high rates of detection – or sensitivity – ranging from 66% to 78%, others have only achieved sensitivities of 28.8% to 46%, rendering it unsuitable for diagnostic use. Now, a team of researchers led by Professors Kye Young Lee and Jae Young Hur at Konkuk University’s School of Medicine is looking to extracellular vesicle DNA (EV DNA) for faster, cheaper, more accurate and more reliable cancer diagnosis.

WHAT ARE EXTRACELLULAR VESICLES?
Extracellular vesicles (EVs) are nanoparticles secreted by cells into body fluids such as blood, urine or saliva. They contain bioactive molecules including DNA, RNA, proteins and lipids (fats). Fortunately for researchers, doctors and patients, cancer cells secrete more EVs than healthy cells, making them – and their cancerous DNA – more abundant in body fluids. This, combined with the protection EVs offer DNA from degradation, makes EV DNA easier to detect and a better target than cell-free DNA (cfDNA).

TESTING IN NON-SMALL CELL LUNG CANCER PATIENTS
To see if EV DNA lived up to its promise, Professors Lee and Hur decided to explore it in non-small cell lung cancer (NSCLC), which accounts for 85% of lung cancer cases. Here, EVs were obtained through liquid biopsies of plasma and bronchoalveolar lavage fluid (BALF). (A bronchoalveolar lavage is a saline wash of the bronchial and alveolar spaces in the lungs: the saline displaces the EVs and any other extracellular material, and the resulting mixture is drawn up for examination. A bronchoscopy is done at the same time using a fibre-optic cable so that pulmonologists and surgeons can see any diseased tissue.) Once isolated, the EVs were broken up and the cancerous DNA retrieved for sequencing. In particular, the researchers wanted to check that the sequences of the epidermal growth factor receptor gene (EGFR) found in plasma and BALF EV DNA matched those found in existing tissue samples. They chose the EGFR gene because its encoded protein helps control cell growth, division and survival, and because mutations in the gene have been associated with lung cancer.

EGFR genotyping results: the first lane shows the primary tumour tissue biopsy result; the following lanes compare tissue rebiopsy results of patients with acquired resistance against liquid biopsies using BALF cfDNA and BALF EV DNA (p.T790M detected in 2/9 (22.2%) by rebiopsy tissue, 3/9 (33.3%) by BALF cfDNA, and 5/9 (55.5%) by BALF EV DNA). Abbreviations – E19 del: exon 19 deletion; WT: wild type.

The hope was that the three sample types would be in complete agreement for each patient and show that liquid biopsies could definitively replace solid tissue biopsies. They found that BALF liquid biopsies could. In fact, EGFR sequences in 23 BALF EV DNA samples matched 100% of those obtained from tissue biopsies from the same patients. In contrast, cfDNA EGFR sequences only matched 71.4% of those from tissue biopsies. Meanwhile, EGFR sequences from 20 plasma EV DNA samples matched 55%


Electron microscope image of EVs in BALF sample.

The red arrow indicates double-stranded DNA (dsDNA) inside an EV collected from BALF. DsDNA-specific antibodies were used to perform immuno-EM.



of sequences from corresponding tissue biopsies. While this is significantly lower than the BALF results, it is still higher than the rate achieved with plasma cfDNA, which matched only 30% of EGFR sequences in corresponding tissue biopsies. Professor Lee and his team put plasma EV DNA’s reduced sensitivity down to sample contamination. In their 2018 paper, they explain that lipoproteins present in the plasma are very similar to those of EVs, preventing them from completely isolating the EVs and interfering with analysis. They also suggest that distance could play a role in a sample’s ability to demonstrate cancer status: BALF EV DNA’s close proximity to the lung cancer helped it display tumour status better than circulating EVs and DNA.

DETECTING ACQUIRED DRUG RESISTANCE
With such remarkable results, Professors Lee and Hur wanted to see whether BALF EV DNA could also be used to detect acquired drug resistance. EGFR-tyrosine kinase inhibitors (TKIs) – anticancer drugs that prevent an enzyme called tyrosine kinase from transmitting signals that result in cell growth and division, and thus prohibit further growth or spread of the cancer – can often be used to treat NSCLC patients successfully. However, after about a year of use, patients with EGFR mutations such as exon 19 deletion and p.L858R often acquire resistance to EGFR-TKIs and need alternative treatments. To ensure the next set of treatments will be effective, doctors need to confirm that the patient’s resistance is caused by the p.T790M mutation.

Diagram of a tumour-derived EV containing RNA (mRNA and miRNA); DNA, including mutant DNA; and proteins, enclosed by a phospholipid bilayer with membrane proteins.


As a result, the researchers surveyed biopsies from nine patients with acquired resistance to EGFR-TKIs for the p.T790M mutation. Of these, six were tested using BALF EV DNA and cfDNA obtained from liquid biopsies as well as a conventional tissue biopsy; for the remaining three patients, only the BALF EV DNA and cfDNA samples were available, as tissue samples were unattainable for various reasons. Here, BALF EV DNA revealed that five of the nine patients had the p.T790M mutation, while cfDNA detected it in only three of the nine. Both tests were more sensitive than tissue biopsy, which exposed the mutation in only two patients – both of whom were also identified by the BALF EV DNA and cfDNA tests. Furthermore, two of the patients newly identified as p.T790M carriers by liquid biopsy responded positively to subsequent treatment with osimertinib, a third-generation TKI designed to target the p.T790M mutation.

THE FUTURE
Despite the small sample size, these early findings demonstrate BALF EV DNA’s increased sensitivity to the p.T790M mutation compared with the tissue biopsies normally used to detect it. One day, BALF EV DNA tests could help doctors optimise treatment plans sooner and significantly improve patient prognosis. Finally, while further studies of EV DNA are needed to confirm its ability to detect other cancers and mutations, Professors Lee and Hur believe their research could lead to the identification of new biomarkers and help advance precision medicine.
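As a quick arithmetic check, the detection rates quoted above follow directly from the patient counts reported in the study. The snippet below simply recomputes them from those counts; it is an illustration, not the authors' analysis code.

```python
# p.T790M detection among the nine patients with acquired EGFR-TKI
# resistance, using the counts reported in the article.
patients = 9
detected = {
    "rebiopsy tissue": 2,  # tissue biopsy found the mutation in 2 patients
    "BALF cfDNA": 3,       # cell-free DNA from lavage fluid: 3 patients
    "BALF EV DNA": 5,      # extracellular vesicle DNA: 5 patients
}

for method, n in detected.items():
    # Rate as a percentage of the nine surveyed patients.
    print(f"{method}: {n}/{patients} = {100 * n / patients:.1f}%")
```

Running this reproduces the ordering reported in the article: BALF EV DNA detects the mutation in more of the nine patients than BALF cfDNA, which in turn outperforms the tissue rebiopsy.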


Behind the Research Professor Kye Young Lee

Professor Jae Young Hur

E: 20050690@kuh.ac.kr E: 20160475@kuh.ac.kr T: +82 2 2030 7521 W: www.kuh.ac.kr/medical/dept/centerDoctor.do?dept_cd=100011

Research Objectives
The research team of Professors Kye Young Lee and Jae Young Hur focuses on studying and developing cancer detection methods using liquid biopsy with extracellular vesicles.

Detail
Konkuk University Medical Center, 120-1 Neungdong-ro, Gwangjin-gu, Seoul 05030, Korea

Bio
Kye Young Lee MD PhD is a pulmonologist and chief of the Precision Medicine Lung Cancer Center at Konkuk University Medical Center. He received his MD and PhD from Seoul National University Medical School.
Jae Young Hur PhD runs the Liquid Biopsy lab in the Department of Pathology and the Precision Medicine Lung Cancer Center at Konkuk University Medical Center. He received his PhD from Seoul National University.

Funding
AstraZeneca Korea Ltd

Collaborators
• Hee Joung Kim at Konkuk University Medical Center
• Chang-Min Choi at Asan Medical Center
• Jae Cheol Lee at Asan Medical Center
• Min Kyo Jung at Korea Brain Research Institute
• Wan Seup Kim at Konkuk University Medical Center

Personal Response
How long do you think it will take before liquid biopsies overtake tissue biopsies in cancer diagnosis and treatment?
At present, liquid biopsies require advanced technology not widely available in hospitals and medical labs. Most liquid biopsies are performed with blood, as it is the most conveniently drawn and frequently tested body fluid. However, because of its complex composition, results are often inconclusive or complicated. Liquid biopsy techniques therefore need to reduce costs, solve the problem of interpreting complicated results, and explore other body fluids to widen their use in cancer diagnosis. Moving from the discovery of a new technique to its application in the field often takes years, but liquid biopsy seems to have reached a level applicable to undiagnosed patients. There is currently a lot of excitement in the field of liquid biopsy, and we are sure this could lead to further research and advances in the early detection and treatment of cancer.

References
Hartmann, J., Haap, M., Kopp, H. and Lipp, H. (2009). ‘Tyrosine kinase inhibitors – a review on pharmacology, metabolism and side effects’. Current Drug Metabolism, [online] 10(5), 470-481. Available at: www.ncbi.nlm.nih.gov/pubmed/19689244 [Accessed 20/01/2019].
Hur, J., Kim, H., Lee, J., Choi, C., et al. (2018). ‘Extracellular vesicle-derived DNA for performing EGFR genotyping of NSCLC patients’. Molecular Cancer, [online] 17(1), 15-20. Available at: www.ncbi.nlm.nih.gov/pubmed/29374476 [Accessed 20/01/2019].
Jiang, T., Su, C., Ren, S., et al. (2018). ‘A consensus on the role of osimertinib in non-small cell lung cancer from the AME Lung Cancer Collaborative Group’. Journal of Thoracic Disease, [online] 10(7), 3909-3921. Available at: www.ncbi.nlm.nih.gov/pmc/articles/PMC6106007/ [Accessed 20/01/2019].
Lee, J., Hur, J., Kim, I., Choi, C., et al. (2018). ‘Liquid biopsy using the supernatant of a pleural effusion for EGFR genotyping in pulmonary adenocarcinoma patients: a comparison between cell-free DNA and extracellular vesicle-derived DNA’. BMC Cancer, [online] 18, 1236-1243. Available at: bmccancer.biomedcentral.com/articles/10.1186/s12885-018-5138-3 [Accessed 20/01/2019].
Paul, M. and Mukhopadhyay, A. (2004). ‘Tyrosine kinase – Role and significance in cancer’. International Journal of Medical Sciences, [online] 1(2), 101-115. Available at: www.ncbi.nlm.nih.gov/pmc/articles/PMC1074718/ [Accessed 20/01/2019].
Radha, S., Afroz, T., Prasad, S. and Ravindra, N. (2014). ‘Diagnostic utility of bronchoalveolar lavage’. Journal of Cytology, [online] 31(3), 136-138. Available at: www.ncbi.nlm.nih.gov/pmc/articles/PMC4274523/ [Accessed 20/01/2019].
Reck, M. and Rabe, K. (2017). ‘Precision Diagnosis and Treatment for Advanced Non–Small-Cell Lung Cancer’. The New England Journal of Medicine, [online] 377, 849-861. Available at: www.nejm.org/doi/full/10.1056/NEJMra1703413 [Accessed 20/01/2019].
US National Library of Medicine. (2019). EGFR gene. [online] Genetics Home Reference. Available at: https://ghr.nlm.nih.gov/gene/EGFR#resources [Accessed 20/01/2019].
Williams, E. Amino acid single letter code. [online] UW Hematology. Available at: http://williams.medicine.wisc.edu/aminoacidcodes.pdf [Accessed 20/01/2019].



Health and Medicine ︱ Gert Jan van der Wilt

Values and evidence meet:

Appropriate healthcare assessment for vulnerable patients. Healthcare technology assessment is about discovering how healthcare technologies enable us to create value. A key factor is clinical ethics, yet historically this has been overlooked. Researchers Gert Jan van der Wilt, Herbert Rolden, Janneke Grutters and Angela Maas at Radboud University Medical Centre explore the ethical and social implications of health care technologies. Their research aims to develop concepts and methods that enable a patient-centred, comprehensive approach to help inform decision-making, both in the introduction of new technologies and for the appropriate use of existing technologies.


In recent years, the number and variety of health technologies developed have increased at a rapid rate. These include new medications, diagnostic tests, devices, surgical methods, medical procedures and systems: all are developed to solve a health problem and improve our quality of life. We often take it for granted that these technologies are effective and benefit the patient, but to be adopted they also need to represent value for money. The effectiveness and cost of health technologies are assessed by a rigorous process. Clinical evidence is obtained to show how well the technology works – the health benefits. The evidence includes the impact on quality of life (for example, pain or disability), as well as the likely effects on mortality. Economic evidence shows how well the technology works in relation to how much it costs and whether it represents value for money. Health technology assessment (HTA) is a comprehensive evaluation framework that generates evidence of the value of health technologies. In a nutshell, HTA is about discovering how healthcare technologies enable us to create value.

It is a systematic, evidence-based mechanism that evaluates and prioritises new technologies from economic, social and ethical perspectives.

VALUE OF INFORMATION ANALYSIS
Economic evaluations are increasingly used to inform decisions in healthcare; however, decisions remain uncertain when they are not based on adequate evidence. Value of Information analysis is a systematic approach to measuring decision uncertainty and assessing whether there is sufficient evidence to support new health technologies. Essentially, it is a decision support tool for the allocation of resources to scientific research.

ADDRESSING CLINICAL ETHICS
Ethics concerns what is right and wrong and the reasons that we give for our choices and actions. Clinical ethics refers to the study of ethical issues and promotes making the ‘right’ choices and decisions in the delivery of healthcare. It concerns basic ethical principles such as autonomy (the right of individuals to make choices about what happens to them), beneficence (the desire to do good), non-maleficence (the duty to prevent harm), and justice (fairness).
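To make the Value of Information idea above concrete, here is a minimal Monte Carlo sketch of the expected value of perfect information (EVPI), a standard quantity in this kind of analysis. All numbers (willingness to pay, effect and cost distributions) are invented for illustration and are not taken from the Radboud team's work.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo draws over the uncertain parameters

# Toy decision: adopt a new technology vs. current care.
wtp = 20_000                        # willingness to pay per QALY (assumed)
effect = rng.normal(0.10, 0.05, n)  # uncertain incremental QALYs
cost = rng.normal(1_500, 300, n)    # uncertain incremental cost

nb_current_care = np.zeros(n)       # baseline net benefit (reference)
nb_new_tech = wtp * effect - cost   # incremental net monetary benefit

# Decide with current evidence: pick the option with the highest mean NB.
current_info = max(nb_current_care.mean(), nb_new_tech.mean())
# With perfect information, we could pick the best option per realisation.
perfect_info = np.maximum(nb_current_care, nb_new_tech).mean()

evpi = perfect_info - current_info  # expected value of perfect information
print(f"EVPI per patient: {evpi:.0f}")
```

A positive EVPI means that eliminating uncertainty would, on average, change the decision often enough to be worth something; comparing the EVPI with the cost of further research is what makes this a decision support tool for allocating research resources.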


ETHICAL DILEMMAS IN CLINICAL RESEARCH Given the tremendous rate of development of health innovations, it’s important to carefully consider the research questions that will be addressed: how, where, when and by whom? These questions pose a challenge when the patients participating in those studies are vulnerable.

patients in clinical research presents ethical dilemmas.

Patients may be vulnerable for a variety of reasons. Firstly, because they are unable to fully appreciate the implications of their participation (or non-participation) in a clinical study (e.g., children and elderly people with compromised cognitive abilities). Secondly, patients may be vulnerable because they are more likely to sustain adverse outcomes, for instance, because of comorbidities, such as diabetes or heart failure. Thirdly, their participation may be considered as exploiting some sort of disadvantage (e.g., they being poor or dependent). Conversely, despite the importance of these considerations, not providing vulnerable patients with the possibility to participate in clinical research results in the uncertainty of treatments being safe and beneficial for them. The participation of vulnerable

This is where the work of researchers Gert Jan van der Wilt, Janneke Grutters, Angela Maas and Herbert Rolden comes in. This dedicated team of researchers has worked together to investigate methods for exploring the ethical and social implications of healthcare technologies.

SPECIFYING NORMS
Henry Richardson of Georgetown University, Washington DC (1) established specifying norms as a method of moral argumentation – a way of resolving ethical dilemmas in a transparent and systematic way. Richardson's framework recognises that generic moral principles can be specified in multiple ways, and it offers several rules for preserving the moral import of the original, unspecified principle. Giving the example of a severely malformed newborn child, whose parents wish that their child be allowed to die, Richardson proposes three moral norms facing the healthcare provider:

Norm 1. It is wrong to kill innocent people.
Norm 2. We should respect the reasonable choices of parents regarding their children (i.e., respecting autonomy).
Norm 3. We should act in the best interest of persons who have been entrusted to our care.

Richardson points out that there is always a gap between general moral norms (such as respecting autonomy) and judgments as to what follows from our commitment to such norms in concrete situations.

One way to bridge this gap is by making the norm more specific. For example, norm 1 can be conceived as a specification of a still more general norm expressing respect for persons: Norm 1. It is wrong to kill innocent people who have attained self-consciousness or who have the potential to develop self-consciousness over time.

www.researchoutreach.org



As Prof van der Wilt explains: "According to this framework, when we find ourselves in a dilemma, the key task is to develop alternative specifications of the various moral principles. In other words, we need to find out what follows from our commitment to a particular moral principle in a specific situation, taking into account that one or more other moral principles should be observed, too." Building on Richardson's work, the team present a framework that makes explicit the moral principles that guide decisions in a concrete situation and that are the cause of the ethical dilemma at hand. Professor van der Wilt says: "These could include, for example, our desire to respect patients' autonomy, our desire to be able to act truthfully, our desire to treat all people fairly, and our desire to spend resources wisely. We then need to realise that such principles are quite abstract and generic and that, in order to decide what follows from our commitment to these norms in concrete situations, we need to specify them."

A NEW FRAMEWORK
The team at Radboud University neatly illustrate the application of the framework in a recent publication. They used the case study of pre-menopausal women with atrial fibrillation (AF), posing the question of whether these women should be invited to participate in clinical studies of a new type of blood thinners, novel oral anticoagulants (NOACs). These women could be considered vulnerable since they are at increased risk of substantial bleedings that are difficult to control and that may have serious consequences. Because of their non-participation in key trials, it is uncertain for this cohort whether the risks associated with these drugs are outweighed by the advantages, compared with conventional treatment. The team therefore addressed the question of whether research of this new class of drugs for these women would be appropriate, from both an ethical and an economic perspective.

SHOULD PREMENOPAUSAL WOMEN WITH AF HAVE BEEN INCLUDED IN TRIALS OF ANTICOAGULANTS IN THE FIRST PLACE?
Using Richardson's method of specifying norms as a wider framework, the team proposed how the apparent ethical dilemma may be resolved, weighing patients' considerations against the need to spend resources for clinical research wisely. The team concluded that, in fact, inclusion of premenopausal women with AF in trials of NOACs would have been the ideal option.

Whether research of NOACs in premenopausal women with atrial fibrillation can also be justified on economic grounds was determined using value of information analysis. The research team concluded that further clinical research on NOACs in premenopausal women with atrial fibrillation is justified – on both ethical and economic grounds.

The team elegantly demonstrate that addressing apparent ethical dilemmas with a method such as specifying norms can improve the quality of public practical reasoning. Their work shows that the method has substantial value in informing health policymakers with solid scientific evidence on the medical, social, economic and ethical implications of investments in health care. In addition to assessing safety, clinical benefits and cost-effectiveness, it is important to consider social and ethical factors to guide decision-making.


Behind the Research Gert Jan van der Wilt

Herbert Rolden

Janneke Grutters

Angela Maas

E: Gertjan.vanderwilt@radboudumc.nl T: +31 24 361 3126 W: www.radboudumc.nl/en/people/gert-jan-van-der-wilt/healthtechnology-assessment W: www.validatehta.eu W: www.narcis.nl/person/RecordID/PRS1272404/Language/en W: www.researcherid.com/rid/H-8120-2014 W: www.integrate-hta.eu W: www.htaplus.nl/what-is-hta-/
E: hj.rolden@raadrvs.nl T: +31 6 1503 5377 W: www.linkedin.com/in/herbert-rolden/ W: www.raadrvs.nl/over-de-rvs/medewerkers/medewerkers/dr.-h.j.-herbert-rolden
E: janneke.grutters@radboudumc.nl T: +31 (0)24 361 69 22 W: www.radboudumc.nl/en/people/janneke-grutters @JannekeGrutters W: https://nl.linkedin.com/in/janneke-grutters-60a1a29
E: angela.maas@radboudumc.nl T: +31 6 51585435 W: www.hartvoorvrouwen.nl @maasangela W: www.linkedin.com/in/angela-maas-54984413/

Research Objectives

References

A team of researchers at Radboud University Medical Centre explore the ethical and social implications of health care technologies. Their research aims to develop concepts and methods that enable a patient-centred, comprehensive approach to help inform decision-making, both in the introduction of new technologies and for the appropriate use of existing technologies.

Richardson, HS (1990). 'Specifying Norms as a Way to Resolve Concrete Ethical Problems'. Philosophy & Public Affairs, 19(4), 279-310.

Detail

Rolden et al. (2017). ‘Uncertainty on the effectiveness and safety of rivaroxaban in premenopausal women with atrial fibrillation: empirical evidence needed’. BMC Cardiovascular Disorders; 17:260. DOI 10.1186/s12872-017-0692-1.

Bio Gert Jan van der Wilt (Researcher ID: H-8120-2014) is professor and head of Health Technology Assessment (HTA) at Radboud University Nijmegen Medical Centre. Herbert Rolden is currently a policy advisor for The Council for Health and Society in The Netherlands. Janneke Grutters works as associate professor and junior principal investigator at Radboudumc. Angela Maas is a clinical cardiologist and currently one of the most influential female doctors in Dutch healthcare. Funding The Netherlands Organisation for Health Research and Development (ZonMw)

van der Wilt, G., et al. (2018). 'Combining value of information analysis and ethical argumentation in decisions on participation of vulnerable patients in clinical research'. BMC Medical Ethics, 19:5. https://doi.org/10.1186/s12910-018-0245-x

Personal Response
In your view, how much importance should the perspective of the patient have in the evaluation of healthcare technology?
The patient perspective plays a key role. HTA is about collaboratively exploring how health technologies enable us to better realise particular social and ethical values. This requires carefully integrating empirical analysis and normative inquiry. All stakeholders, and notably patients, should be able to recognise and endorse the choices that are made in the context of an HTA and agree with the interpretation of its findings.



Health & Medicine ︱ Dr Jessica Walsh, Dr Xue Song, Dr Gilwan Kim and Dr Yujin Park

Counting the costs of ankylosing spondylitis
Ankylosing spondylitis (AS) is a chronic rheumatic disease. A debilitating condition, it carries large patient and societal burdens. However, the financial impact of the disease is not fully understood. Aiming to change this are Dr Jessica Walsh from the University of Utah School of Medicine, Drs Xue Song and Gilwan Kim from IBM Watson Health, and Dr Yujin Park from Novartis Pharmaceuticals Corporation. Through a retrospective analysis of recent administrative claims data, the team comprehensively reviewed all-cause and AS-specific costs of disease for US patients. Their findings provide insight into the direct medical costs associated with the healthcare utilisation of patients with AS.

Ankylosing spondylitis (AS) is a painful and progressive form of inflammatory arthritis that affects 0.1–1% of the US population. A chronic condition, AS mainly affects the spine ('spondylo'), which becomes inflamed, causing chronic and often severe pain and stiffness, and can result in extreme tiredness. Over time, the inflammation can lead to ankylosis, where new bone formation causes sections of the spine to fuse in a fixed, immobile position. In extreme cases, patients become severely disabled; such functional disabilities have a large impact on the ability to carry out everyday tasks and may result in the inability to work. AS can also cause inflammation, pain, swelling and stiffness in other areas of the body: inflammation commonly occurs at the sites where ligaments or tendons attach to the bone (the entheses), and the hips, ribs, heels, and other joints can also be affected.

Symptoms tend to develop gradually, usually over several months or years, and patients may have periods of remission followed by relapse. Unfortunately, there is no cure for AS; however, treatment regimens can relieve pain and halt worsening of the condition. Common treatments include a combination of medication, exercise, and physical therapy. Consequently, the impact of AS is substantial, particularly since it develops relatively early in life, tending to first appear between the late teens and 40 years of age; onset after 50 years of age is unusual. Because of the often invisible nature of the disease, AS may go undiagnosed for many years – in some cases more than ten – leading to diagnosis at an older age.

COMMON COMORBIDITIES
Along with inflammation of the spine, joints, and entheses, patients with AS often present with peripheral arthritis, psoriasis, and inflammatory bowel disease. Studies also show that patients with AS have significantly more comorbidities than matched controls in the general population, including cardiovascular disease, diabetes, malignancies, and depression [1-3]. In addition to the considerable burden faced by AS patients due to chronic pain and disability, the financial burdens associated with AS are substantial. Despite this, there are limited data on the direct costs of AS.

A team of researchers set out to gain a better understanding of the financial burden of AS: rheumatology specialist Jessica Walsh, MD, University of Utah School of Medicine; Xue Song, PhD, an Outcomes Research Practice Leader at IBM Watson Health; Gilwan Kim, PharmD, MS, Analyst Manager at IBM Watson Health; and Yujin Park, PharmD, Associate Director in Health Economics and Outcomes Research and Medical Access at Novartis Pharmaceuticals Corporation.

The gaps between vertebrae provide much of our ability to move. When sections of the spine fuse together, this mobility is lost.

Their research, supported by Novartis Pharmaceuticals Corporation, provides insight into the direct medical costs associated with AS. The team noted that the majority of research into AS-related costs has been undertaken outside the United States. Recognising a gap in the literature, the team investigated all-cause and AS-specific direct costs in US patients with AS to better understand the financial impact of the disease. Their study is the first comprehensive analysis evaluating healthcare utilisation and direct costs in US patients with AS compared with matched controls.

COUNTING THE COSTS OF AS
The research team undertook a retrospective analysis of recent administrative healthcare claims data from US patients, using the IBM Watson Health MarketScan® Commercial Claims and Encounters (Commercial) database and the Medicare Supplemental (Medicare) database. Both large databases provide longitudinal information on healthcare services (including inpatient and outpatient services, long-term care, and prescription drug claims) for patients insured under various health plans. Their study was recently published in Rheumatology and Therapy.

The study included 6,679 patients aged 18 years and older with at least one inpatient or at least two outpatient medical claims for AS between January 1, 2012, and December 31, 2014. Patients were matched (by age, geographic location, index calendar year, and sex) with controls without AS at a ratio of up to 1:5 (19,951 controls). All-cause and AS-specific healthcare utilisation and associated direct costs were compared between the two groups over the course of 12 months.

Physical therapy can help alleviate symptoms of AS.

Their results were striking. Patients with AS had substantially higher healthcare utilisation and direct costs than matched controls, with significantly higher rates of total all-cause inpatient admissions (12% vs 6%), emergency department visits (23% vs 15%), non-hospital-based outpatient visits (100% vs 84%), hospital-based outpatient visits (68% vs 46%), other outpatient services (97% vs 81%), and medication use (97% vs 82%).
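The 1:5 matched-control design can be sketched in a few lines of Python. This is a simplified illustration with invented records and illustrative field names — not the MarketScan schema or the study's actual matching code (which would, for instance, sample controls without replacement across cases):

```python
from collections import defaultdict

# Invented example records; the study drew its data from claims
# databases. Matching keys follow the paper's description -- age,
# geographic region, index calendar year, and sex -- at up to 1:5.
cases = [
    {"id": 1, "age": 45, "region": "West", "year": 2013, "sex": "F"},
    {"id": 2, "age": 52, "region": "South", "year": 2014, "sex": "M"},
]
candidates = (
    [{"id": n, "age": 45, "region": "West", "year": 2013, "sex": "F"}
     for n in range(100, 110)]
    + [{"id": n, "age": 52, "region": "South", "year": 2014, "sex": "M"}
       for n in range(200, 203)]
)

def key(person):
    """Matching key: exact agreement on all four variables."""
    return (person["age"], person["region"], person["year"], person["sex"])

# Group candidate controls into strata sharing the same key.
pool = defaultdict(list)
for cand in candidates:
    pool[key(cand)].append(cand)

# Each case takes at most five controls from its matching stratum.
matches = {case["id"]: pool[key(case)][:5] for case in cases}

print({cid: len(ctrls) for cid, ctrls in matches.items()})  # → {1: 5, 2: 3}
```

Note how the second case finds only three eligible controls — which is why a "ratio of up to 1:5" yields fewer than five controls per case on average (19,951 controls for 6,679 patients).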




Notably, patients with AS had a tenfold higher median total all-cause healthcare cost than matched controls ($24,978 vs $2,139 per patient per year). The higher costs associated with AS were largely a result of increased medical outpatient services (mean cost $13,220 vs $4,602 per patient per year) and outpatient pharmacy costs (mean cost $14,074 vs $1,737 per patient per year), as illustrated in the figure opposite. Outpatient pharmacy costs associated with AS included use of biologic therapies, other AS-related medications (e.g. anti-inflammatory drugs, antirheumatic drugs), antihypertensives, and antidepressants.

A (top): Mean direct healthcare costs per patient per year over a 12-month follow-up period: all-cause healthcare costs for patients with AS and matched controls. Note how the cost for patients with AS is consistently higher than for the matched controls. B (bottom): AS-related healthcare costs for patients with AS.

The research team suggests that the presence of common comorbidities contributes to the increased utilisation and costs observed in patients with AS compared with matched controls. Their analysis showed that patients with AS had significantly higher rates of cardiovascular disease, depression, malignancies, osteoporosis, sleep apnoea, and spinal fracture, as well as inflammatory bowel disease and psoriasis, than matched controls. These comorbidities require additional medications and treatments and complicate AS management, leading to increased costs. However, as the authors note: "Further research is needed to fully determine the potential cause and effect relationships between AS and comorbidities and the role of inflammation in the development of comorbidities in patients with AS." The team also noted that the effects of disease severity and other risk factors (e.g. obesity and smoking) on healthcare utilisation and costs could not be determined in the current study, since this information is not available from administrative claims data.

Symptoms, such as inflammation and chronic pain, increase over time for many patients.

CONCLUSION
The research of this team of colleagues fills a gap in the body of knowledge on AS comorbidities and costs of care for US patients with AS. Their work helps better represent the overall costs associated with AS: in addition to its physical, psychological and social burden, the current study highlights the high economic burden of the healthcare needs of AS patients. Findings from their descriptive analysis using large administrative claims databases provide key insights into the direct medical costs associated with healthcare utilisation in patients with AS across the United States. The total financial burden of AS could not be determined in the current study because indirect costs related to AS – such as disability, loss of work productivity, and caregiver costs – were not measured. Further research aimed at quantifying the total financial burden of AS from both the individual patient and societal perspectives would be beneficial.


Behind the Research Dr Jessica A. Walsh E: Jessica.walsh@hsc.utah.edu

Dr Xue Song

E: songx@us.ibm.com

Dr Gilwan Kim E: kimgi@us.ibm.com

Dr Yujin Park

E: Jina.park@novartis.com

Research Objectives The team’s work has unveiled the direct costs of healthcare for patients with ankylosing spondylitis.

Detail Jessica A. Walsh University of Utah School of Medicine and Salt Lake City Veterans Affairs Medical Center, Salt Lake City, UT Xue Song IBM Watson Health, Cambridge, MA Gilwan Kim IBM Watson Health, Cambridge, MA Yujin Park Novartis Pharmaceuticals Corporation, East Hanover, NJ Bio Jessica Walsh, MD, is an Instructor at the University of Utah School of Medicine and George E. Wahlen Veteran Affairs Medical Center. As a rheumatologist, her clinical interests include spondyloarthritis and psoriatic diseases. Xue Song, PhD, is an Outcomes Research Practice Leader at IBM Watson Health. Gilwan Kim, PharmD, MS, is an Analyst Manager at IBM Watson Health. Yujin Park, PharmD, is an Associate Director in Health Economics and Outcomes Research and Medical Access at Novartis Pharmaceuticals Corporation. Funding This study was sponsored by Novartis Pharmaceuticals Corporation, East Hanover, NJ. Dr Walsh is a consultant for Novartis Pharmaceuticals Corporation. Dr Song and Dr Kim are employees of IBM Watson Health. Dr Park is an employee of Novartis Pharmaceuticals Corporation.

References
1. Walsh JA, Song X, Kim G, Park Y. (2018). Healthcare utilization and direct costs in patients with ankylosing spondylitis using a large US administrative claims database. Rheumatol Ther, 5(2), 463-474. doi: 10.1007/s40744-018-0124-4.
2. Walsh JA, Song X, Kim G, Park Y. (2018). Evaluation of the comorbidity burden in patients with ankylosing spondylitis using a large US administrative claims data set. Clin Rheumatol, 37(7), 1869-1878. doi: 10.1007/s10067-018-4086-2.
3. Walsh JA, Song X, Kim G, Park Y. (2018). Evaluation of the comorbidity burden in patients with ankylosing spondylitis treated with tumour necrosis factor inhibitors using a large administrative claims data set. J Pharm Health Serv Res, 9(2), 115-121. doi: 10.1111/jphs.12212.

Personal Response
Your research provides insight into the cost of the healthcare needs of patients with AS. What's next for your work?
This study looked at direct costs associated with AS, but studies that evaluate societal and indirect costs, such as work productivity, disability, and caregiver costs, will provide a more complete picture of the economic burden. This is especially important for patients in the US healthcare system because previous studies have shown indirect costs associated with AS to be higher than direct costs. In addition, as the diagnostic delay of AS is one of the biggest challenges in optimising care, further research quantifying the cost of delayed AS diagnosis from payer and patient perspectives would help to raise the importance of this issue.



Health and Medicine ︱ Dr Laura Bonnett

How data is improving driving policies for epilepsy patients
Until recently, UK epilepsy patients' quality of life and the public's safety rested on driving policies informed by neurologists' 'expert opinions'. However, Dr Laura Bonnett of the University of Liverpool and her team have found hard evidence to back up and improve these policies. By determining the risk of recurring seizures in first-time seizure patients, those on antiepileptic drug withdrawal programmes and patients suffering from breakthrough seizures, Dr Bonnett and her team have helped epilepsy patients regain their licences (and their freedom) sooner while ensuring that their risk of seizure falls under the Driver and Vehicle Licensing Agency's threshold of 20%.

Seizures are sudden surges of electrical activity in the brain. Often, they are accompanied by muscle spasms, lapses in consciousness or awareness, staring spells, or cognitive or emotional symptoms, such as fear, anxiety or déjà vu, and can last anywhere from a couple of seconds to several minutes. According to Dr Bonnett, one in 20 people will experience a one-off seizure in their lifetime, while one in 120 people in the United Kingdom (UK) will have epilepsy – a neurological and physical condition marked by recurring seizures. Although some people are prevented from driving due to their medical condition, road users – such as drivers, cyclists and pedestrians – may still encounter drivers with epilepsy or one-off seizures.

Importantly, the UK Driver and Vehicle Licensing Agency (DVLA) only reinstates patients' car licences after their risk of having a seizure in the next year falls to 20% – the same as the risk of a newly qualified driver having an accident in their first year behind the wheel. However, until recently, patients relied on the 'expert opinions' of neurologists to determine when they had reached this threshold – often forcing them to wait a year before they could regain their licence, provided they didn't have another seizure in the meantime. Now, research by Dr Laura Bonnett of the University of Liverpool and her team is helping to pinpoint the time at which the risk of seizure falls to 20% or below, getting patients


back on the road sooner and ensuring the safety of all road users.

RISK OF SEIZURE RECURRENCE: First-time seizure patients
In her 2010 study, Dr Bonnett found that people with a first-ever seizure were more likely to have a second seizure if they had a known cause for the first seizure, such as a head injury or brain infection; had a parent with epilepsy; were asleep when the first seizure occurred; or were not treated straight after the first seizure. Patients were also at higher risk if they had abnormal results from a head scan or electroencephalogram (EEG). In her Award Lecture for the British Science Festival, Dr Bonnett likened an EEG to getting your hair dyed at a salon: instead of covering their hair in foil and ending up with a bold new colour, patients have receivers stuck to their head and obtain a chart showing their brain activity.

An electroencephalogram (EEG) allows doctors to study a patient's brain waves.

For this study, Dr Bonnett and her team used data from a randomised clinical trial of 637 patients, aged 16 years and over, who had had a first-time seizure. Of these, 317 received treatment immediately after the first seizure while 320 did not. Fortunately for the experts and road users, statistical analysis showed that if patients were to regain their licence a year after the initial seizure, the risk of them having a seizure in the next 12 months was 7% if they had treatment and 10% if they didn't. Moreover, when Dr Bonnett and her team accounted for patient variability, such as age and sex – by calculating 95% confidence intervals – the figures ranged between 4% and 11%, and between 6% and 15%, for the groups with and without treatment respectively. (A confidence interval is a range of numbers in which researchers can be 95% certain the true statistic lies.) All well below the 20% threshold.

However, when the researchers looked at the risk of seizure in the year following the six months after the first seizure, they found it to be 14% for the treated group and 18% for the untreated group – still below the 20% threshold. These results led the DVLA to reduce the licence penalty from 12 months to six months, even though the upper limit of the untreated group's confidence interval was 23%. Here, the DVLA chose to focus on the point estimates rather than the confidence intervals, as the high volume of applications prevents them from calculating the risk on a case-by-case basis.

After antiepileptic drug withdrawal
Understanding the importance of her work, Dr Bonnett extended her research to explore the risk of seizure recurrence after antiepileptic drug (AED) withdrawal. When her study was published in 2011, the DVLA advised that patients undergoing AED withdrawal should not drive for six months after the last dose had been taken. Furthermore, under the Road Traffic Act, if patients were to restart treatment, they had to be seizure free for a year before they

would be allowed to drive again. The European Union (EU) standard, meanwhile, stated that patients experiencing seizures during a physician-advised change or withdrawal of medication needed three months off driving if the previously effective treatment was reinstated. To determine the suitability of these recommendations, Dr Bonnett and her team reviewed data from the Medical Research Council's AED withdrawal study. Here, they found that patients who had a seizure recurrence during or following AED withdrawal, recommenced treatment, and were seizure-free for six months after treatment was restarted had an 18% chance of having a seizure in the next 12 months. However, the small sample size prevented the team from establishing a risk definitively below 20% (the confidence interval being 10% to 27%). If patients were seizure free for twelve months after restarting treatment, the risk was 17% (8% to 27%), while at three months the risk was 26% (17% to 35%). In her paper, Dr Bonnett expressed concerns – based on the point estimates – that UK legislation was too conservative and the EU standard too liberal, and she has conveyed these concerns to policy makers.

A patient undergoing electroencephalography.

Newly qualified drivers have a 20% risk of having an accident in their first year behind the wheel.

After a breakthrough seizure
According to Dr Bonnett, a breakthrough seizure is the first seizure after at least 12 months' seizure freedom while on treatment. Under the DVLA regulations, patients who have had a breakthrough seizure cannot drive for a year. To check whether the risk of recurring seizures following a breakthrough seizure falls under the 20% threshold by this time, Dr Bonnett and her team chose to reanalyse data from the standard versus new antiepileptic drugs (SANAD) study.

Epilepsy is not the only condition that increases the likelihood of having an accident.


This involved taking data from a randomised clinical trial in which they observed 339 eligible patients aged 16 years and over. While the initial study separated these patients into 'Arm A' and 'Arm B' based on the medications they received, Dr Bonnett and her team pooled the data to explore the risk of recurring seizures regardless of treatment.

Overall, the team found the risk of recurring seizures at one year to be 17% (15% to 19%), while the risk at six months was 32% (28% to 36%), supporting the DVLA's decision to have patients wait a year to retrieve their licence. However, the study found that some subgroups needed at least 15 months off before they reached the 20% threshold.

THE NEED FOR EVIDENCE-BASED POLICY
According to Dr Bonnett, the number of people killed or seriously injured in UK road accidents is relatively low by international standards. She believes this is partly due to policies preventing people with certain medical conditions from driving. However, many of these policies are informed by 'expert opinions' rather than hard evidence – just as those around driving with epilepsy were prior to Dr Bonnett's work. This can significantly affect the public's safety and patients' quality of life.

This is clearly shown through the three studies discussed in this article, where policies were initially too harsh (in the case of first-time seizure patients) or too lenient (in the EU's policies around AED withdrawal). Furthermore, studies have shown that people with medical conditions other than epilepsy are 26 times more likely to have an accident than those with epilepsy. Thus, there is a greater need for research into those medical conditions and their impact on driving ability in order to inform policies and improve road safety. As a result, Dr Bonnett's next mission is to explore driving policies around diabetes.
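The threshold logic running through these studies can be sketched as a short calculation. The counts below are invented for illustration — the published analyses used survival methods on the trial data, and only the 32% and 17% point estimates echo figures quoted in this article. The sketch computes a normal-approximation 95% confidence interval for the 12-month recurrence risk at each seizure-free duration and flags when the point estimate reaches the DVLA's 20% threshold:

```python
import math

def risk_ci(events, n, z=1.96):
    """Point estimate and normal-approximation 95% CI for a proportion."""
    p = events / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# Invented counts of 12-month recurrences among patients who had been
# seizure-free for a given number of months (not the published data).
counts = {6: (96, 300), 9: (72, 300), 12: (51, 300), 15: (39, 300)}

THRESHOLD = 0.20  # DVLA: annual seizure risk must be 20% or below

for months in sorted(counts):
    events, n = counts[months]
    p, lo, hi = risk_ci(events, n)
    flag = "OK" if p <= THRESHOLD else "too high"
    print(f"{months:>2} months seizure-free: "
          f"{p:.0%} (95% CI {lo:.0%}-{hi:.0%}) {flag}")
```

With these made-up counts the point estimate first drops below 20% at twelve months, even though the interval's upper limit still sits above the threshold — the same tension between point estimates and confidence intervals that the DVLA faced in the decisions described above.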


Behind the Research Dr Laura Bonnett

E: L.J.Bonnett@liverpool.ac.uk T: 0151 795 9686 W: www.liverpool.ac.uk/translational-medicine/ staff/laura-bonnett/ @ljbcmshe

Research Objectives

References

Dr Bonnett’s work focuses on prediction modelling within chronic conditions such as epilepsy. Her work has been used by the UK-based Driver and Vehicle Licensing Agency to modify their driving regulations for people with seizures and epilepsy.

Bonnett L, Shukralla A, Tudur-Smith C, et al. (2011). Seizure recurrence after antiepileptic drug withdrawal and the implications for driving: further results from the MRC Antiepileptic Drug Withdrawal Study and a systematic review. Journal of Neurology, Neurosurgery & Psychiatry, 82(12), 1-6.

Detail

Bonnett L, Tudur-Smith C, Williamson P, and Marson A. (2010). Risk of recurrence after a first seizure and implications for driving: further analysis of the Multicentre study of early Epilepsy and Single Seizures. BMJ, [online] 341, 1-8. Available at: www.bmj.com/content/341/bmj.c6477 [Accessed 04.12.18].

Department of Biostatistics, Waterhouse Building Block F, 1-5 Brownlow Street, University of Liverpool, L69 3GL, UK
Bio
Dr Bonnett is a medical statistician primarily interested in the development and validation of prognostic (prediction) models. She researches methods to model recurrent events such as seizures. This work will inform patient counselling and improve quality of life for people living with long-term conditions such as epilepsy and asthma.
Funding
National Institute for Health Research
Collaborators
• L J Bonnett • A G Marson • G A Powell

• A Shukralla • C Tudur-Smith • P R Williamson

Personal Response

Bonnett L, Powell G, Tudur-Smith C, Marson A. (2017). ‘Risk of a seizure recurrence after a breakthrough seizure and the implications for driving: further analysis of the standard versus new antiepileptic drugs (SANAD) randomised controlled trial’. BMJ Open, [online] 7:e015868, 1-6. Available at: https://bmjopen.bmj.com/content/bmjopen/7/7/e015868.full.pdf [Accessed 04.12.18].
D’Ambrosio R, Miller J. (2010). ‘What Is an Epileptic Seizure? Unifying Definitions in Clinical Practice and Animal Research to Develop Novel Treatments’. Epilepsy Currents, [online] 10(3), 61-66. Available at: https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1535-7511.2010.01358.x [Accessed 04.12.18].
Mayo Clinic Staff. Seizures. [online] Mayo Clinic. Available at: www.mayoclinic.org/diseases-conditions/seizure/symptoms-causes/syc-20365711 [Accessed 04.12.18].
Naik PA, et al. (2015). ‘Do drivers with epilepsy have higher rates of motor vehicle accidents than those without epilepsy?’. Epilepsy & Behavior, 47, 111-114.
Schachter S, Shafer P, Sirven J. (2014). What is a seizure? [online] Epilepsy Foundation. Available at: www.epilepsy.com/learn/about-epilepsy-basics/what-seizure [Accessed 04.12.18].

What inspired you to explore driving policies around epilepsy, and why have you chosen to focus on diabetes in your next project?
As an applied statistician my main aim is to make a difference to people using numbers. My colleague (Prof Marson) is a member of the DVLA’s Drivers Medical Group. In 2008 this group discussed the differing driving guidelines between the EU and the UK and the need for evidence to support their decision making. As a statistician (and a PhD student) I was in the perfect position to provide the much-needed evidence. The rest, as they say, is history! Diabetes offers a new but related challenge: driving guidelines have to consider hypos (episodes of low blood sugar) and other outcomes such as eye problems and heart complications.

www.researchoutreach.org

89


Health and Medicine ︱ Dr Lynne Ann Barker

New innovations in traumatic brain injury research
Lynne Ann Barker is a pioneer in the field of traumatic brain injury (TBI). Her work aims to understand how trauma to the brain leads to changes in behaviour and cognitive ability in those affected. She has also worked to develop new experimental methods for assessing cognitive ability, including computerised simulations that incorporate real-world tasks such as cooking, with the aim of increasing the speed and efficiency of diagnosis of these life-changing injuries.

The brain is an organ of incredible complexity. Contained in a space no larger than a football are 100 billion cells that connect in unimaginably complex ways. From this complexity arise all the behaviours that make us human: from tacit (non-conscious) processes that occur without us even being aware, to executive functions, the higher-order processes associated with goal-directed behaviours such as planning and target-seeking. The brain might be the most intricate and complicated organ in the body, but it is also the most precious and vulnerable. Damage to the brain can be disastrous to an organism, leading to cognitive impairments that can result in a lifetime of debilitating symptoms. Lynne Ann Barker and her colleagues are working to understand how damage to the brain results in cognitive impairment and to develop new techniques to diagnose and treat those with traumatic brain injury.

YOUNG AT HEART, DELICATE OF MIND
One of the questions Barker has been keen to address is whether age plays a role in the brain’s response to injury. The so-called ‘latent deficit hypothesis’ suggests that injury sustained earlier in life (from 17-25 years old) may produce greater lasting cognitive impairments than injury sustained later (28 years and older). In 2010, Barker and colleagues tackled this question with a study of people who had sustained head injuries resulting in lesions to the frontal brain regions. They compared groups of young and older participants using a range of tests assessing their executive and implicit (unconscious) cognitive abilities. Interestingly, although there were no differences in executive function, the young cohort showed greater deficits in implicit cognitive function than those who sustained their injuries later in life. Dr Barker thinks the differences in response to TBI between young and old might be due to the crucial period of brain development that occurs in early adulthood. She writes: “We are only now beginning to fully appreciate that swift morphological change to the brain between ages 17-25 confers vulnerability to greater effects of traumatic brain injury on cognition and an increased likelihood of the first incidence of mood disorder,



Partial screenshot of the cooking item in the Cog-LAB task.

Brain in coronal plane showing lacunar infarcts to mid-brain structures resulting in constant pain for the patient.

psychosis and eating disorder during this age range”. The exact nature of the morphological changes that occur during this period of adolescent neurodevelopment, and how they lead to the vulnerability to brain injury, is still under investigation, but it is hoped that continued work in this area will yield further insight.

THE SUBCONSCIOUS MIND
It was interesting for Dr Barker and her group to note that the adolescents showed deficits in implicit as well as executive functions. In neuropsychology, implicit (also referred to as tacit or unconscious) brain functions describe responses to stimuli that are subliminal or go unnoticed. It had previously been thought that these types of cognition were less affected by brain injury than their executive counterparts. In fact, Dr Barker’s work has been instrumental in highlighting the importance of changes to implicit cognition following TBI. Her 2006 study in Neuropsychologia was the first empirically rigorous group study to show that tacit cognition is impaired in cases of head injury. Her group compared 20 patients with head injuries with a cohort of IQ- and age-matched controls. The two groups underwent MRI scans and a battery of four tests of implicit cognitive ability: a serial reaction time (SRT) task, a mere exposure effect task, automatic stereotype activation and hidden co-variation detection. The impact of the injuries was clear: those with brain trauma performed

worse in three out of four tests of implicit cognition. Interestingly, performance on the SRT task correlated with a composite measure of executive function, suggesting a relationship between these two aspects of cognition. Indeed, the authors suggest that implicit and explicit cognition may interact to produce the dysfunctional behaviour observed after TBI. The field seems to be taking note: following the publication of this work there has been a renewed focus on the role subconscious impairments can play in the behavioural deficits of people with TBI. Dr Barker recognises that with the greater emphasis on implicit impairments in TBI comes a need for better ways to test for these deficits. To this end, she has programmed a number of variants of the SRT task which run in PSYSCOPE, a computer program that allows researchers to design and run psychology experiments. According to Dr Barker, the importance of implicit deficits is often overlooked: “We toiled over the SRT task, trialling and refining it, to ensure that it reliably captured tacit processes”. This hard work seems to have paid off; almost every one of the 150 people with brain

injury tested using the improved SRT task demonstrated impairment compared to healthy controls. These studies were included in the Brain Sciences special issue The Brain Knows More than It Admits: The Control of Cognition and Emotion by Non-Conscious Processes, a collection highlighting the best experimental methods for capturing tacit cognitive processes. However, one shouldn’t think Dr Barker is interested only in the subconscious; she has been a leading voice in an ongoing discussion in cognitive neuroscience about the study of executive function. She argues for a unified approach across disciplines, a refinement of techniques and a broader consensus on how to close the gap between theory and therapy, to ensure sufferers of brain injury receive the best possible treatment. Interested readers may seek out her editorial on the subject in Frontiers in Behavioural Neuroscience, titled: ‘Executive Function(s): Conductor, Orchestra or Symphony? Towards a Trans-Disciplinary Unification of Theory and Practice Across Development, in Normal and Atypical Groups’.



COOKING UP SOMETHING GREAT
In addition to her pioneering work on the underlying biology of brain injury, Dr Barker has been leading the development of innovative new ways of diagnosing cognitive impairment. She was one of the first to recognise that modern technological advances could be used to design more sophisticated neurocognitive tests. Currently, diagnosis of cognitive deficits is expensive and time-consuming; she realised that modern computing offered the chance to design automated tests that could speed up this process. Dr Barker wanted to use the opportunity of the emerging technology to design tests that incorporate simulations of real-world tasks. She notes: “In real life, complex tasks such as cooking are often significantly affected when a person sustains a brain injury or early on in the course of a neurodegenerative disorder because they require the coordination of multiple cognitive functions.” With this in mind, she has designed a task-based test for cognitive impairment that centres around a cooking simulation. This cooking task (CT) prototype, named Cog-LAB, has the ability to measure multiple different aspects of cognitive function at once.

In a 2015 study in Frontiers in Behavioural Neuroscience, Dr Barker and colleagues generated preliminary data on the efficacy of Cog-LAB. The aim was to see how the CT stacks up against more conventional cognitive tests. Forty-six non-impaired participants took part in the CT as well as various sub-tests from standardised neuropsychological tasks. The CT performed well, providing proof-of-principle that Cog-LAB could be used as a novel approach for determining cognitive impairment in head injury patients. Following the publication of the preliminary study, Dr Barker’s work was awarded the MEDIPEX award for NHS innovation in 2016 and a subsequent MRC Confidence in Concept grant to develop Cog-LAB.

BRAIN INJURY IS IN THE EYE OF THE BEHOLDER
Another innovative approach that


shows great promise involves using scanning techniques to look for changes in the eye which could represent an early warning sign of brain injury. In a recently published study (www.dovepress.com/investigating-possible-retinal-biomarkers-of-head-trauma-in-olympic-bo-peer-reviewed-article-EB), Dr Barker and colleagues compared Olympic boxers with controls to see if they could detect alterations to structures within the eye. Using a technique known as optical coherence tomography (OCT), the group found that the boxers had thinning of the macula and retinal nerve fibre layer (RNFL). This is an exciting proof-of-principle for a technique that could provide a much more efficient way to diagnose brain trauma, as well as an early warning sign of damage that has not yet reached the threshold of producing cognitive impairment detectable with current methods.

THE FUTURE
Readers can now find out more about the eye-scanning study, which has recently been published in the journal Eye and Brain. As for Cog-LAB, its future lies in a planned clinical trial assessing how well it performs in patients with various neurological impairments, namely TBI, stroke and early dementia. The goal of the study is to assess the usability, reliability and sensitivity of the test compared with conventional diagnostic approaches; given the promise shown by the work so far, the results are eagerly awaited.


Behind the Research Dr Lynne Ann Barker

E: l.barker@shu.ac.uk T: +44 (0)114 225 5379
W: www.frontiersin.org/research-topics/1270/executive-functions-conductor-orchestra-or-symphony-towards-a-trans-disciplinary-unification-of-theo
W: www.lidsen.com/journals/neurobiology/neurobiology-special-issues/new-develop-brain-injury
W: www.shu.ac.uk/about-us/our-people/staff-profiles/lynne-barker
W: www.linkedin.com/in/lynne-barker-19521a53/?originalSubdomain=uk
W: www.shu.ac.uk/research/specialisms/centre-for-behavioural-science-and-applied-psychology

Research Objectives

References

Dr Barker’s work focuses on the effects of brain injury on neural structures, cognition and behaviour; on brain and cognitive maturation in young adults; and on the effects of diet on brain injury recovery.

Barker LA, Andrade J, Morton N, Romanowski CA, Bowles DP. (2010). ‘Investigating the ‘latent’ deficit hypothesis: age at time of head injury, implicit and executive functions and behavioural insight’. Neuropsychologia, 48(9):2550-63.

Detail

Barker LA, Andrade J, Romanowski CA, Morton N, Wasti A. (2006). ‘Implicit cognition is impaired and dissociable in a head-injured group with executive deficits’. Neuropsychologia, 44(8):1413-24.

Bio
Dr Lynne Ann Barker is a Reader in Cognitive Neuroscience and conducts research on the functional effects of brain trauma, clinical functional measurement, biomarkers of injury and developmental brain morphology. She is currently writing a neuroscience textbook and teaches postgraduate students on the Applied Clinical Cognitive Neuroscience course. She supervises several PhD students and is the Postgraduate Research Tutor for all PhD students in the department. She is a founder member of the Cognition and Neuroscience (CaN) Research Group. She was recently awarded the Medipex NHS Innovation Prize.
Funding
Cog-LAB study funded by MRC Confidence in Concept competitive round 5.
Collaborators
• Dr Nick Morton • Prof Charmaine Childs • Prof Annalena Venneri • Dr Sophie Taylor • Leanne Greene • Professor John Reidy • Professor Brian McGuire • Rebecca Dennis • Dr Catherine Day • Dr Paul Richardson • Professor Jackie Andrade • Dr Charles Romanowski • Dr Jenny Drabble • Dr David Bowles • Dr James Taylor • Dr Lisa Reidy • Dr Sue McHale • Holly Ashton • Dr Alistair Atherton

• HIVE Information Technology, Sheffield, UK (specifically Sooze, Jamie and Allen) • Dr Richard Grunewald • Professor David Sanders • Carolyn Taylor • Dr Caroline Jordan • Dr Bernard Corfe • Dr Caroline Dalton • Dr Mike Loosemore MBE • Alex Gage • Dr Lambros Lazuras • Dr Martin Thirkettle

Barker LA and Morton N. (2018). ‘Editorial: Executive Function(s): Conductor, Orchestra or Symphony? Towards a Trans-Disciplinary Unification of Theory and Practice Across Development, in Normal and Atypical Groups’. Front Behav Neurosci, 12:85.
Barker LA. (2012). ‘Defining the Parameters of Incidental Learning on a Serial Reaction Time (SRT) Task: Do Conscious Rules Apply?’ Brain Sci, 2(4):769-789.
Childs C, Barker LA, Gage A, Loosemore M. (2018). ‘Investigating possible retinal biomarkers of head trauma in Olympic boxers using optical coherence tomography’. Eye and Brain, 10:1-10.
Doherty TA, Barker LA, Denniss R, Jalil A, and Beer MD. (2015). ‘The cooking task: making a meal of executive functions’. Front Behav Neurosci, 9:22.

Personal Response
How does your range of experimental versions of the Serial Reaction Time Task help in the elucidation of the role of tacit cognition in deficits following TBI?
Tacit processes are what happens whilst we are thinking about other things. Often overlooked, they play a vital role in reading and responding to others’ intentions and gestures, facilitating many aspects of social cognition. Social functions are disrupted by brain trauma. We toiled over the SRT task, trialling and refining it, to ensure that it reliably captured tacit processes. Almost every single person tested on this task with a brain injury (>150) showed impairment compared with non-brain-injured people: a fundamental function is lost. Presently, one of my PhD students (Leanne Greene) is investigating how this loss contributes to social deficits.



Health and Medicine ︱ Drs Lynn Selemon and Alvaro Duque

MacBrainResource: Virtual access to decades-old primate brains
MacBrainResource is a vast collection of macaque brain slides and electron microscope (EM) blocks generated in the laboratories of Dr Pasko Rakic and the late Dr Patricia Goldman-Rakic. Drs Lynn Selemon and Alvaro Duque at Yale University School of Medicine are committed to making these valuable brain collections available to researchers both on-site and remotely via website access (macbrainresource.org). MacBrainResource represents a 21st-century solution for researchers faced with mounting obstacles to primate research, and one that does not require the sacrifice of any additional animals.

The human brain is incredibly complicated. Decades of research have been devoted to elucidating the structure, function and development of the human brain and to identifying abnormalities associated with neurologic and neuropsychiatric disorders, such as stroke, epilepsy, and Alzheimer’s, Huntington’s and Parkinson’s diseases, to name only a few. Moreover, recent evidence suggesting that diseases like schizophrenia and autism are linked to prenatal disturbances highlights the importance of gaining a full grasp of early human brain development. Direct study of the human brain has unquestionably provided the basic foundation for our knowledge of human neuroanatomy and neural development. However, human studies are necessarily limited to non-invasive examination of normal architecture. Animal models provide a means of circumventing this limitation. Systemic and intracranial injections allow researchers to explore developmental mechanisms in greater depth and to elucidate brain connectivity. Experimental manipulations enable greater understanding of the impact of lesions on connected brain areas and of prenatal perturbation on brain development. Non-human primates, most prominently macaques (Macaca mulatta), have been used extensively in neuroscience research, as their brains are very similar to the human brain and their brain development is similarly protracted. Over the past few decades, the enormous cost of primate housing and care, tightened regulations on primate research and changing attitudes towards primate use in medical research have led to a reduction


in primate research across the globe. These challenges are major deterrents to primate brain research. Drs Lynn Selemon and Alvaro Duque in the School of Medicine at Yale University have created a unique online resource known as Macaque Brain Resource (MacBrainResource), containing five distinct collections of macaque brain tissue. The mission of MacBrainResource is to foster new primate brain research using brain tissues generated over the past 50 years. MacBrainResource is derived from years of experimental research in Dr Pasko Rakic’s and the late Dr Patricia Goldman-Rakic’s laboratories. Drs Selemon and Duque, who have worked with Drs Rakic and Goldman-Rakic, are now at the centre of the pioneering effort to provide worldwide access to these collections. With funding from the National Institute of Mental Health, histologic slides and EM images will be digitised on request; these materials will be freely available to researchers for remote access and analysis, so they can be used to make new discoveries with no animals being harmed in the process.

NOSE TO THE GRINDSTONE
Unravelling the mysteries of brain development, structure and function was the passion of the late Dr Patricia Goldman-Rakic and continues to be a life-long endeavour for Dr Rakic. Early in their careers they chose the non-human primate brain as a model due to its similarity to the human brain. Developmental studies in the non-human primate are particularly challenging, as they require a breeding colony with specialised facilities for pregnant, infant and juvenile animals. Whilst in the Department of Neuroscience at Yale, the Rakic/Goldman-Rakic laboratories overcame these challenges and in doing so established one of the


few non-human primate breeding colonies in the world. Decades of research from many scientists in the Rakic/Goldman-Rakic laboratories generated an immense collection of over 7,000 brain slides and EM blocks that now comprise the five collections of MacBrainResource. Duplicating the scale of this collection today would be extremely difficult, making MacBrainResource an incredibly important resource for scientists.

UNIQUE COLLECTIONS
As a young scientist, Dr Rakic asked the question ‘where and when are neurons made during brain development?’ To answer this conundrum, the Rakic laboratory used a technique that was novel at the time: injection of a compound called tritiated thymidine into pregnant monkeys. Radioactive tritiated thymidine is incorporated into the DNA of cells that are undergoing cell division. Cells undergoing their final division, the one that gives rise to newly generated neurons, will be most densely labelled. Quantifying the number of densely radioactive neurons present in a specific brain structure following injections at various foetal ages reveals when the neurons in that structure were born. The brain tissue from these experiments was collected throughout the 70s and 80s and makes up Collection 1. This collection is currently being used to investigate contemporary questions in brain development.

For Collection 2 we introduce one of the most influential women in neuroscience, Dr Goldman-Rakic. Her research focused on understanding the prefrontal cortex, the part of the brain that resides under your forehead. Her research group mastered a technique that was relatively new in the 70s, known as tracer autoradiography. By labelling amino acids, the building blocks of proteins, the Goldman-Rakic lab was able to visualise neurons connecting different regions of the brain. Dr Goldman-Rakic, her postdoctoral associate Dr Selemon and others in the group found multiple channels of communication across brain regions, including connections between the prefrontal cortex and the hippocampus, a region involved

Figure 1. Slides available on the MacBrainResource website appear as low-magnification thumbnails. Approved investigators will be granted access to a database of slides that can be zoomed to high magnification, enabling viewing of labelling in single cells. Top row: sagittal views of the brain stem and cerebellum in a macaque brain harvested at postnatal day 76 (P76) following thymidine injection at embryonic day 30 (E30). Middle row: coronal sections through an embryonic day 60 (E60) brain following thymidine injection one week earlier (E53). Bottom row: coronal sections through a P75 brain that has been immunocytochemically processed for neuropeptide Y.

in the formation of memories. After Dr Goldman-Rakic joined Dr Rakic at Yale, she continued to study brain connectivity and accumulated a large collection of brain material revealing different connecting networks across the brain, constituting Collection 2.

Collection 3 represents the culmination of the collaboration between Drs Rakic and Goldman-Rakic at Yale. With their combined prowess as accomplished surgeons, they conducted surgery on prenatal animals, either resecting or injecting brain areas at various developmental stages. To do this, they removed the foetus from the mother, performed the experimental procedure, and then replaced the foetus in the mother’s womb. Yale is one of the only places in the world where these surgeries were conducted, largely owing to Dr Rakic and Dr Goldman-Rakic’s expertise. Postnatal lesioned cases are also included in Collection 3.

Drs Rakic and Goldman-Rakic turned their attention towards schizophrenia for Collection 4. This complex disorder affects thoughts, emotions and behaviour, and it can lead to inappropriate actions, false perceptions, fantasy and delusion. Dr Selemon’s studies in cortical brain samples from patients with schizophrenia had indicated that the prefrontal cortex lacks connectivity to the rest of the brain. With this in mind, researchers in the Rakic/Goldman-Rakic labs set out to create a non-human primate model to test the role of brain development




in schizophrenia. They used prenatal exposure to X-irradiation to curtail the generation of neurons during foetal development. Many of the animals were raised to adulthood, at which time their brain tissue was collected and processed in a manner compatible with stereologic analysis. These celloidin-embedded, thick brain sections comprise Collection 4. The collection provides a unique opportunity to study, in the adult brain, the effects of disturbing foetal brain development.

Collection 5 takes a closer look at the connections between neurons, known as synapses. Researchers from the Rakic/Goldman-Rakic groups studied synaptogenesis, the formation of synapses, in multiple regions of the non-human primate brain. High-resolution electron microscopy was used to reveal structures of neurons that cannot be seen with a light microscope. In one prominent study, Drs Rakic, Goldman-Rakic and collaborators analysed 25 non-human primates ranging from foetal stages to adults of 20 years. They found that in the non-human primate cortex synapses are overproduced during development and subsequently reduced during adulthood, concurring with the pattern described in the human cortex.

GOLD MINE
Decades later, these five collections are still being used to generate data. Researchers from universities around the world, including Illinois, California and Sydney, have visited the School of Medicine at Yale to analyse regions of the brains that had not been examined by the Rakic/Goldman-Rakic labs. Drs Lynn Selemon and Alvaro Duque hope that, through digitisation, MacBrainResource will make all five collections more widely accessible to researchers not affiliated with Yale. Drs Selemon and Duque have utilised the collections in some of their own studies and are passionate that other neuroscience researchers get the opportunity to do the same. Collection 1 has been utilised by Dr Duque,

Figure 2. Collection 4 is comprised of fetally irradiated and control brains that were processed for stereologic analysis. This figure illustrates that images suitable for stereologic analysis can be generated for remote access. A low-magnification view of the striatum of an adult animal exposed to fetal irradiation is shown (top inset); the box indicates the location of high-magnification views spanning the 15 µm, z-axis focal plane. Arrows indicate cells in focus at focal plane 0 µm (top), -7.5 µm (middle), and -15 µm (bottom).

who compared a more contemporary method of identifying new neurons, called BrdU, with the tritiated thymidine-injected non-human primates in Collection 1. He found that neurons taking up BrdU weren’t as healthy. This has important implications for researchers using BrdU as a marker of new neurons, so important in fact that Dr



Duque was asked to contribute a chapter to a technical book series. Collections 2-5 haven’t been utilised as extensively. In one recent publication, Dr Selemon and her colleagues compared a new tracing method, dynamic programming in conjunction with diffusion tensor imaging, with the tracer autoradiography found in Collection 2 to corroborate findings. Dr Selemon, Dr Duque and other researchers at Yale believe that making MacBrainResource available to the larger neuroscience community will facilitate primate brain research going forward.


Behind the Research Dr Lynn Selemon

Dr Alvaro Duque

E: macbrainresource@yale.edu T: +1 203 785 4323 W: macbrainresource.org

Detail

Research Objectives

Department of Neuroscience Yale University School of Medicine New Haven, CT 06510

MacBrainResource is an online repository of macaque brain material available for the use of researchers.

Bio Dr Lynn Selemon, a Research Scientist in the Department of Neuroscience, earned a doctorate in neuroscience from the University of Rochester. As a postdoctoral associate of Dr Goldman-Rakic at Yale, she conducted tract tracing experiments to elucidate cortical pathways in the brain. Her more recent research has focused on neuroanatomic abnormalities associated with schizophrenia and PTSD.

References

Dr Alvaro Duque, a Research Scientist in the Department of Neuroscience, earned his doctoral degree in neuroscience from Rutgers University in Dr Laszlo Zaborszky’s lab. He was a postdoctoral associate at Yale in the labs of Drs Patricia Goldman-Rakic and David McCormick. Currently, he is working with Dr Pasko Rakic on studies of primate development.

Selemon LD, Rajkowska G, Goldman-Rakic PS (1995) Abnormally high neuronal density in the schizophrenic cortex. A morphometric analysis of prefrontal area 9 and occipital area 17. Arch Gen Psychiatry, 52(10), 805-818.

Funding
NIMH (1RO1MH113257)
Collaborators
Dr Alvaro Duque and Dr Lynn Selemon are co-PIs on this project. They would like to acknowledge the following collaborators: • Dr Pasko Rakic, consultant • Philip Barello, computer systems manager • Mary Pappy, histology technician • Yuri Morozov, EM technician • Tayor Spadory, current undergraduate intern • Aviva Rabin-Court, past undergraduate intern

Personal Response
Using animal models for research is presenting more challenges now than it has done in the past. How do you see neuroscientists conducting research in the future?
Today neuroscientists who wish to address scientific questions in the non-human primate face enormous hurdles in terms of cost, regulation and an increasingly unfavourable public attitude towards animal research. MacBrainResource provides a 21st-century solution that circumvents these obstacles by allowing researchers worldwide to conduct new research without having to bear the exorbitant costs of primate research or sacrifice any animals. It is our mission to maximise usage of these valuable primate brain materials and by doing so promote primate research for the benefit and advancement of science.

MacBrainResource, Yale School of Medicine. Available at: macbrainresource.org and https://medicine.yale.edu/neuroscience/macbrain/
SFN 2018 abstract and poster: ‘MacBrainResource: archived macaque brains available for neuroanatomical and neurodevelopmental studies’.

Selemon LD, Zecevic N (2015) Schizophrenia: a tale of two critical periods for prefrontal cortical development. Transl Psychiatry, 5, e623.
Ratnanather JT, Lal RM, An M, Poynton CB, Li M, Jiang H, Oishi K, Selemon LD, Mori S, Miller MI (2013) Cortico-cortical, cortico-striatal and cortico-thalamic white matter fiber tracts generated in the macaque brain via dynamic programming. Brain Connect, 3(5), 475-490.
Duque A, Rakic P (2011) Different effects of BrdU and 3H-thymidine incorporation into DNA on cell proliferation, position and fate. J Neurosci, 31(42), 15205-15217.
Duque A, Rakic P (2015) Identification of proliferating and migrating cells by BrdU and other thymidine analogues: benefits and limitations. In: Merighi A, Lossi L (eds) Immunocytochemistry and Related Techniques. Neuromethods Vol 101, pp 123-139. Springer Science and Business Media.
Duque A, Krsnik Z, Kostovic I, Rakic P (2016) Secondary expansion of the transient subplate zone in the developing cerebrum of human and nonhuman primates. PNAS, 113, 9892-9897.
Duque A, Selemon L (2018) MacBrainResource: sharing primate specimens around the world. Research Features Magazine, Health and Medicine, 1 May 2018. https://researchfeatures.com/2018/05/01/macbrainresource-sharing-primate-specimens/
Rash BG, Duque A, Morozov YM, Arellano J, Micali N, Rakic P (2019) Origins and dynamics of gliogenesis in the outer subventricular zone of the developing primate cerebrum. PNAS, in press.

www.researchoutreach.org

97


Health & Medicine ︱ Professor Silvia Conde

The carotid body: a candidate for regaining glucose tolerance in Type 2 diabetes

Professor Silvia Conde and her team from the NOVA Medical School, NOVA University of Lisbon, have proposed a new strategy for treating metabolic diseases. Previously, they showed that an over-active homeostasis sensor called the carotid body could cause insulin resistance and disrupt glucose tolerance, both common in Type 2 diabetes. In their new study, Prof Conde's team applied a kilohertz frequency alternating current to the carotid sinus nerve of diabetic animal models, which resulted in a significant recovery. This type of bioelectric neuromodulation could be developed into an alternative therapy and clinical diagnostic tool for a range of metabolic disorders in the future.

There's no sugar coating it – how and what we eat has a huge impact on our bodies. A high-fat, high-sugar diet can upset the body's mechanisms for dealing with the excess. This upset can cause resistance to the sugar-regulating hormone insulin and failure of the insulin-secreting pancreatic beta cells, leading to a deterioration of overall glucose metabolism. Once diet and exercise can no longer contain the condition, patients need external glucose-lowering agents to control their blood sugar levels.

Despite a number of treatment options, Type 2 diabetes is one of the fastest growing health threats in the world, with a predicted 500 million patients globally by 2040. Many patients maintain poor glucose control despite insulin treatment and combination therapy, which suggests these strategies may not address the root causes of disease onset. In fact, patients with obstructive sleep apnoea and polycystic ovary syndrome have also shown resistance to insulin. If distorted glucose tolerance can occur independently of obesity, some unknown biological triggers could be at the root of these metabolic disorders.

The carotid body is located by the fork in the carotid artery – shown here in red.


CAROTID BODIES
One possible trigger candidate for Type 2 diabetes is the carotid body. Carotid bodies are chemoreceptors located at the bifurcation of the carotid arteries, which run along both sides of the throat. Their main purpose is to detect changes in oxygen, carbon dioxide and pH levels in the blood. When faced with dangerous levels, carotid bodies increase the frequency of impulses, known as 'action potentials', in their sensory nerve, the carotid sinus nerve.

This nerve signals the central nervous system to turn on a range of responses to calm the situation. These responses restore homeostasis – the balance of the internal environment. In addition, carotid bodies can sense insulin and may even be able to detect blood sugar levels. Due to their sensing abilities, carotid bodies are likely to be important metabolic sensors for controlling energy homeostasis. Yet how carotid bodies are linked to insulin resistance and rising metabolic disorders has remained critically unclear.

A NEW THERAPEUTIC TARGET
Professor Silvia Conde and her team thus began to investigate the link between carotid bodies and diabetes. In 2013, they showed that carotid bodies were over-active in insulin-resistant, glucose-intolerant and hypertensive animal models. Over-stimulated carotid bodies were shown to flood downstream neural signalling, causing symptoms present in Type 2 diabetes, metabolic syndrome and obstructive sleep apnoea.

Prof Conde's research team saw the carotid body as a new therapeutic target for treating metabolic diseases. In 2017, they performed a chronic bilateral surgical resection (partial removal) of the carotid sinus nerve, disconnecting its signalling to the brain in diabetic rat models. Astonishingly, they found that resection of the sinus nerve restored normal insulin sensitivity and glucose homeostasis for up to 11 weeks. Prof Conde's study showed that an over-active carotid body could therefore be key to understanding the pathophysiology of Type 2 diabetes and other metabolic diseases.


Being overweight and leading a sedentary lifestyle are both risk factors for Type 2 diabetes.

Monitoring blood sugar levels can help to manage diabetes.

However, Prof Conde notes that the surgical procedure is an unlikely therapy for diabetic patients. Resection of the carotid body is not only irreversible and invasive, but could have lasting side effects, such as impaired responses to oxygen and carbon dioxide levels, and loss of adaptation to exercise and of blood pressure regulation. A safer way of decreasing carotid body function was therefore still missing.

BIOELECTRIC NEUROMODULATION
Prof Conde's team embarked on the hunt for an alternative solution. In partnership with Galvani Bioelectronics (formerly GlaxoSmithKline Bioelectronics), her 2018 study proposed a new strategy for rebooting over-active carotid bodies – electricity. The team wanted to find out whether bioelectric modulation of the carotid sinus nerve could restore insulin sensitivity

and glucose tolerance in Type 2 diabetic animal models. A group of rats was fed a high-fat, high-sugar diet for 14 weeks, after which cuff electrodes were implanted bilaterally onto the carotid sinus nerves of the diabetic group. Similar electrodes were also implanted in control rats fed a normal diet. Additional sensors were placed in the rat diaphragm to record electrical activity of the skeletal and heart muscles. A kilohertz-frequency alternating current (KHFAC) of 50 kHz was then run through the electrodes into the carotid sinus nerve.

After nine weeks of bioelectrical treatment, the rats were examined for insulin sensitivity and glucose tolerance by measuring biomarkers such as plasma insulin, glucagon, C-peptide and lipid profiles. The results were remarkable. The team found that the bioelectric treatment significantly increased insulin sensitivity and glucose tolerance after one week. These effects continued throughout the rest of the treatment period but reversed back to diabetic levels after the KHFAC was stopped. Bioelectric treatment

was as effective as resection of the carotid sinus nerve. However, bioelectric modulation was fully reversible and had no significant side effects.

Sugar in the diet becomes sugar in the bloodstream; the carotid body can become tolerant to blood glucose levels.

FUNCTIONAL DIAGNOSIS
Knowing that the carotid body chemoreceptors are involved in metabolic disease progression has vast clinical relevance. Since carotid body function and glucose homeostasis are connected, evaluating the organ's activity could open a promising door to unravelling the disease phenotypes of a range of metabolic diseases. Prof Conde highlights that currently no medical device links carotid body function to neuroendocrine status, nor are there

Due to their insulin-sensing ability, carotid bodies are likely to be important metabolic sensors for controlling energy homeostasis.



Bioelectric treatment of the carotid body could help modulate the body’s response to glucose.

any clinical guidelines focusing on how the organ balances glucose and lipid homeostasis. She notes that functional diagnosis of the carotid body would also have predictive value, as it could screen for patients developing respiratory and metabolic disorders.

OVERCOMING SIDE EFFECTS
Despite these prospects, neuromodulation of the carotid sinus nerve is still far from a viable therapy. Bioelectric modulation could cause similar side effects to surgical resection, since KHFAC treatment may affect other functions of the carotid sinus nerve, such as accurately adjusting blood pressure. Although these effects did not appear in Conde's 2018 study, a continuous electrical current into the nerve could impair blood pressure fine-tuning, exercise tolerance and carbon dioxide sensitivity in humans.

A healthy lifestyle can reverse prediabetes.

Prof Conde proposes bioelectric modulation of the carotid sinus nerve for controlling carotid body activity in metabolic diseases.

Prof Conde's team is currently focused on minimising the off-target effects of bioelectric neuromodulation. One way to do this is to identify and characterise the specific neural circuits related to over-activity in metabolic diseases. The team hypothesises that different fibres within the nerve respond to different stimuli, and that mapping out these neural pathways could enable selective therapeutic modulation.

Prof Conde’s group use a diabetic rat model to study the impact of modulation of the carotid sinus nerve in this disease.


Prof Conde states that promising uses for continuous high frequency blocking of the sinus nerve would be “to resynchronize action potential firing disease patterns in specific fibres” or perform an “intermittent block of the carotid sinus nerve”. These approaches could lead to bioelectric treatments that

discriminate between glucose- and insulin-mediating pathways without affecting others, such as oxygen and carbon dioxide chemosensitivity.

THOUGHTS FOR THE FUTURE
Overall, Prof Conde's research shows how little we still understand about our basic energy metabolism. Her work takes us a step closer to the root causes of common disorders like Type 2 diabetes. The proposed bioelectric treatment of the carotid body is a tuneable and reversible strategy that could have minimal interference with a patient's daily activities. However, as Prof Conde points out, the next step is to find a bioelectric method for selective modulation with no off-target effects.


Behind the Research
Professor Silvia Conde

E: silvia.conde@nms.unl.pt T: +351 918974400 W: http://cedoc.unl.pt/neuronal-controlmetabolic-disturbances/ W: www.facebook.com/groups/1118454071556934/

Research Objectives
Prof Conde's work focuses on the carotid body and its role in metabolic diseases.

Detail
CEDOC, NOVA Medical School, Rua Câmara Pestana, nº6, Edificio 2, piso 3, Lisbon, Portugal

Bio
Silvia Conde graduated in Biochemistry in 2000 and was awarded her PhD in 2007 jointly by NOVA University of Lisbon and the University of Valladolid. She is a Principal Investigator at CEDOC and assistant professor at NOVA Medical School. In 2009 she was awarded the Portuguese L'Oréal Medal of Honor for Women in Science.

Funding
• Galvani Bioelectronics
• Portuguese Foundation for Science and Technology

Collaborators
• The CEDOC team: Maria P. Guarino, Joana F. Sacramento, Maria J. Ribeiro, Bernardete Melo
• The Galvani team: Daniel Chew, Sonal Patel, Nishan Ramnarain, Victor Pikov, Kristoffer Famm

References
Conde, S., & Guarino, M. (2018). Targeting bioelectronically the carotid sinus nerve in Type 2 diabetes: strengths, drawbacks and challenges for the future. Bioelectronics in Medicine, 1(3), 167-170.
Ribeiro, M., Sacramento, J., Gonzalez, C., Guarino, M., Monteiro, E., & Conde, S. (2013). Carotid body denervation prevents the development of insulin resistance and hypertension induced by hypercaloric diets. Diabetes, 62(8), 2905-2916.
Sacramento, J., Ribeiro, M., Rodrigues, T., Olea, E., Melo, B., Guarino, M., et al. (2016). Functional abolition of carotid body activity restores insulin action and glucose homeostasis in rats: key roles for visceral adipose tissue and the liver. Diabetologia, 60(1), 158-168.
Sacramento, J., Chew, D., Melo, B., Donegá, M., Dopson, W., Guarino, M., et al. (2018). Bioelectronic modulation of carotid sinus nerve activity in the rat: a potential therapeutic approach for type 2 diabetes. Diabetologia, 61(3), 700-710.

Personal Response
How did you first get interested in the possibility of bioelectric neuromodulation of the carotid body?
After our 2013 publication in Diabetes (Ribeiro et al., 2013), in which we described how the carotid body is involved in the genesis of peripheral insulin resistance and showed that carotid body dysfunction is present in animal models of metabolic dysfunction, we were contacted by the then recently formed GSK bioelectronics unit. They were searching for diseases controlled by the peripheral nervous system that could be targeted bioelectronically. We started a collaboration with them that resulted in the present findings – that bioelectronic modulation of the carotid sinus nerve can restore insulin sensitivity and glucose homeostasis in animals with Type 2 diabetes.



Health & Medicine ︱ Dr Daniel Linseman

A potential new treatment for brain injury

Injuries to the brain can have dire consequences, leading to debilitating symptoms and an increased risk of long-term degenerative diseases such as Alzheimer's disease and amyotrophic lateral sclerosis (ALS). Despite the impact these injuries have on the lives of millions, there are no good treatments or preventative therapies. Associate Professor Daniel Linseman, of the Department of Biological Sciences and the Knoebel Institute for Healthy Aging at the University of Denver, is working to remedy that. His research focuses on understanding the mechanisms that underlie brain injury and dysfunction. Interestingly, his recent work on the dietary supplement Immunocal® shows it to be a potential option for mitigating the deleterious effects of traumatic brain injuries (TBIs), such as concussion, and diseases like ALS.

The brain is an incredibly complex organ. From that complexity comes the ability to think, imagine, and perform equally complex tasks. The brain is our species' most precious asset: it makes us who we are as humans and, arguably, the most successful animal on the planet. Unfortunately, it is also particularly susceptible to injury.

There are 1.7 million incidents of traumatic brain injury (TBI) each year in the USA alone. The consequences can be severe: one in three injury-related deaths is due to a TBI. Those who survive can be burdened with motor or cognitive impairments that severely reduce their quality of life. There can also be long-term implications; people who have suffered a TBI have an increased risk of developing neurodegenerative diseases such as Alzheimer's disease and amyotrophic lateral sclerosis (ALS).

Despite the severe impact on individuals and society, there are no approved treatment options. That is not to say that researchers are not actively pursuing their development; Dr Linseman is one such researcher, interested in understanding the mechanisms that could explain what happens in the brain when it is diseased or injured. He sees one aspect as being particularly important: oxidative stress. His work on reducing the effects of oxidative stress has shown that the dietary supplement Immunocal® could be useful for reducing the severity of symptoms in disorders where oxidative stress causes problems, such as TBI or ALS.



[Figure: a healthy brain versus a concussed brain after multiple mild TBIs, with changes in Aβ/p-Tau, IL-10/TNFα and BDNF leading to AD tauopathy, and therapeutic intervention with Immunocal® (a GSH precursor). How Immunocal® can relieve oxidative stress in traumatic brain injury.]

IMMUNOCAL®, AN OXIDATIVE STRESS RELIEVER IN TBI
To generate energy, our cells take oxygen from the air we breathe and use it to release energy stored in the food we eat. This process is essential to our survival. By-products of this process are potentially harmful molecules known as reactive oxygen species (ROS). These molecules are harmful because, if left unchecked, they can damage important cellular components such as DNA and mitochondria (the energy-producing parts of a cell). However, some production of ROS is normal, and cells come equipped to deal with them. Cells contain scavengers – antioxidant molecules such as glutathione – that can react with ROS and convert them to harmless waste products. However, research has shown that this process is disrupted during injury and disease. Injuries such as TBI can result in an increase in the generation of ROS. The scavenging system becomes over-burdened and ROS start to damage parts of the cell such as the mitochondria. This damage can result in cell death, producing a worsening of symptoms.

Dr Linseman's approach is to minimise the damage caused by ROS in the injured brain. To do this, he is looking into methods to increase the ability of the brain to neutralise ROS, such as boosting levels of the ROS scavenger glutathione. To that end, his group has been testing a dietary supplement derived from whey protein called Immunocal®. Immunocal® contains high levels of a molecule called cysteine, an important precursor from which glutathione can be produced. The hope is that by providing precursors such as cysteine, the amount of glutathione will increase, and cells will build up a more robust defence against harmful ROS following injury. A 2018 study by Dr Linseman's team demonstrated that this is indeed the case.

Dr Linseman's team used a mouse model to test the effects of Immunocal®.



His group used a mouse model of TBI to determine if supplementation with Immunocal® could improve symptoms.

In ALS, motor neuron function is impaired leading to a loss of control over movement.

They gave mice the supplement twice daily for 28 days prior to administration of a moderate TBI. The first question the group had was whether feeding the mice Immunocal® would increase glutathione levels. They were pleased to see that the Immunocal®-treated group showed a significant increase in brain glutathione levels compared to untreated mice subjected to TBI.

Next, they looked to see if symptoms of the injury were improved. Excitingly, mice that received Immunocal® showed less impairment in their motor and cognitive abilities than controls, demonstrating that Immunocal® has the potential to improve the outcome of serious injury to the brain.

The group then looked in more detail at the mouse brains to try to understand how they changed in response to injury. Although the initial damage to the brain seemed to be similar between treatment and control groups, the researchers noticed differences in how the brain degenerated over time. The supplemented mice had less degeneration of their neurons (the cells responsible for transmitting information in the brain), and the connections between neurons in different parts of the brain remained more intact. Interestingly, the supplemented mice had reduced lipid peroxidation in the brain, a sign that they suffered lower levels of oxidative stress than controls.

NEURODEGENERATION
There is ample evidence that oxidative stress plays a key role in the pathology of various neurodegenerative diseases such as Alzheimer's disease and ALS.

In ALS, motor neurons that carry signals from the brain to various parts of the body begin to die. As the cell death worsens, people with the condition progressively lose control over their movement. It is a devastating disease with a lack of effective treatments, and although the cause of the neuronal death in ALS is not known, it is suspected that oxidative damage plays a key role.

Dr Linseman therefore wanted to know if Immunocal® could improve the symptoms of ALS as it had done in the TBI model. In a 2014 paper in the journal Antioxidants, his group used a mouse model of ALS. The mice they used have a mutation in the gene superoxide dismutase 1 (SOD1), which causes them to develop a condition similar to human ALS. They wanted

The effects of oxidative stress

Reactive oxygen species or ‘free radicals’ can damage cells, even causing cell death.


to know if supplementing these mice with Immunocal® could improve their ALS symptoms compared to mice that did not receive supplementation. They were pleased to see that mice supplemented with Immunocal® had a modest delay in the age at which their symptoms presented: 98.5 ± 1.1 days, compared to 91.6 ± 0.9 days for the untreated ALS mice. What is more, their symptoms also improved; mice that received Immunocal® showed less decline in grip strength (a measure of the loss of muscle control that occurs as the disease progresses). Finally, measurements of glutathione confirmed that, as in the TBI model, Immunocal® supplementation bolstered levels in the ALS mice, both in the blood and in spinal cord tissue.

THE FUTURE
A wealth of research suggests that oxidative stress leads to increased damage and a worsening of symptoms in a number of brain disorders, including TBI and ALS, and yet a good means of combating these diseases is yet to be found. From Dr Linseman's preclinical work in mice, Immunocal® shows great promise. The next step will be testing the effects of Immunocal® in humans and, although the road from animal studies to a treatment for humans can be a long one, we wish him great success.


Behind the Research
Dr Daniel Linseman

E: daniel.linseman@du.edu T: +1 (303) 359 5905 T: +1 (303) 871 4663

Research Objectives
Dr Linseman and his lab explore a prospective nutritional supplement, Immunocal®, for enhancing resilience and improving recovery following traumatic brain injury, which could be taken as a preventative measure by high-risk populations.

Detail
Daniel Linseman
Knoebel Institute for Healthy Aging
University of Denver
2155 E. Wesley Ave.
Denver, CO 80208, USA

Bio
Dr Linseman's research is focused on mechanisms of neuronal death in degenerative disorders and episodes of neurotrauma, with a particular emphasis on ALS, Alzheimer's, and traumatic brain injury. Dr Linseman is an Associate Professor in the Department of Biological Sciences and the Knoebel Institute for Healthy Aging at the University of Denver.

Funding
Immunotec, Inc. (Quebec, Canada)

Collaborators
• Natalie Serkova, PhD, University of Colorado Anschutz Medical Campus

References
Ignowski, E., Winter, A. N., Duval, N., Fleming, H., Wallace, T., Manning, E., Koza, L., Huber, K., Serkova, N. J., & Linseman, D. A. (2018). The cysteine-rich whey protein supplement, Immunocal®, preserves brain glutathione and improves cognitive, motor, and histopathological indices of traumatic brain injury in a mouse model of controlled cortical impact. Free Radical Biology & Medicine, 124, 328-341.
Ross, E. K., Gray, J. J., Winter, A. N., & Linseman, D. A. (2012). Immunocal® and preservation of glutathione as a novel neuroprotective strategy for degenerative disorders of the nervous system. Recent Patents on CNS Drug Discovery, 7(3), 230-235.
Ross, E. K., Winter, A. N., Wilkins, H. M., Sumner, W. A., Duval, N., Patterson, D., & Linseman, D. A. (2014). A cystine-rich whey supplement (Immunocal®) delays disease onset and prevents spinal cord glutathione depletion in the hSOD1(G93A) mouse model of amyotrophic lateral sclerosis. Antioxidants (Basel, Switzerland), 3(4), 843-865.
Winter, A. N., Ross, E. K., Daliparthi, V., Sumner, W. A., Kirchhof, D. M., Manning, E., Wilkins, H. M., & Linseman, D. A. (2017). A cystine-rich whey supplement (Immunocal®) provides neuroprotection from diverse oxidative stress-inducing agents in vitro by preserving cellular glutathione. Oxidative Medicine and Cellular Longevity, 2017, 3103272.

Personal Response
Could you tell us about any planned or current studies with Immunocal® in humans?
Immunotec, Inc. (Quebec, Canada) is currently performing a pilot clinical study in patients with mild cognitive impairment (MCI): "Nutritional Intervention With the Dietary Supplement, Immunocal® in MCI Patients: Promotion of Brain Health". The Principal Investigator for this study is Dr Hyman Schipper at the Memory Clinic, Jewish General Hospital, in Montréal, Quebec, Canada.



Health & Medicine ︱ Dr Sanjay Gupta

Plant phytochemicals: a new cancer chemopreventative?

Dr Sanjay Gupta from Case Western Reserve University is exploring the chemopreventative properties of phytochemicals, such as polyphenols and flavonoids, found in plants. The team aims to enhance our understanding of the epigenetic mechanisms that are influenced by phytochemicals, resulting in cancer prevention. By manipulating gene expression without changing the genetic code, phytochemicals can inhibit both oxidative stress and the effects of cancer-inducing proteins. Dietary agents could provide a cost-effective, non-toxic therapeutic tool to prevent cancer and improve the quality of life of patients.

Phytochemicals are derived from plants that we eat as part of a healthy diet.


Cancer is a leading cause of death worldwide and is responsible for one in six deaths. Furthermore, around 70% of cancer deaths in 2008 occurred in developing countries. Cancer treatments are extremely expensive and the effectiveness of these drugs is limited. Consequently, many people cannot afford treatment, especially patients from low-income countries, resulting in suffering and increased mortality. However, around 30% of all cancer deaths could be prevented by a change in lifestyle and diet.

It is well known that phytochemicals, derived from plants eaten as part of our diet, are powerful bioactive compounds with an important role in antioxidation and cancer chemoprevention. This inspired Dr Gupta and his team to conduct extensive research exploring natural cancer prevention using dietary agents. In particular, Dr Gupta is focused on the role of phytochemicals as epigenetic modifiers in cancer.

WHAT IS EPIGENETICS?
Epigenetics is the study of chemical changes to the genome that are heritable, reversible and affect gene expression without alteration of the genetic code itself. One essential epigenetic mechanism is DNA methylation. This is where methyl groups are added to

the DNA molecule, most commonly at the 5-carbon of the cytosine ring. The addition of the methyl group inhibits a process known as transcription, resulting in reduced gene expression. The removal of these methyl groups, or 'DNA demethylation', can therefore enhance gene expression.

Another important epigenetic mechanism is histone modification. Histones are proteins with a role in DNA packaging: DNA wraps itself around groups of eight histone proteins to form nucleosomes, the building blocks of chromosomes. Modification of histones by processes such as acetylation can affect gene expression. Addition of acetyl groups to the histone proteins weakens the interaction between the histones and the DNA, producing a more relaxed chromatin structure, which is associated with increased levels of gene transcription. Conversely, the removal of acetyl groups from the histone proteins (deacetylation) results in reduced levels of gene expression.

By investigating the impacts of phytochemicals on epigenetics, Dr Gupta aims to use cost-effective, minimally toxic dietary agents as a cancer preventative or to improve the quality of life of suffering patients.

GREEN TEA POLYPHENOLS
In a recent study, Dr Gupta and his colleagues showed that polyphenols found in green tea can delay breast cancer progression and invasion. The molecular pathway regulating tumour development is extremely complex. One of the earliest steps in the pathway is extracellular matrix (ECM) remodelling. The ECM is a network of macromolecules, such as glycoproteins and enzymes, that provide support to the surrounding cells. In tumour progression, matrix metalloproteinases (MMPs) degrade parts of the ECM, generating matrikines. These are active molecules


Many people, particularly in low-income countries, cannot afford expensive cancer treatment.

Dr Gupta has shown that polyphenols, found in green tea, can be used to delay breast cancer progression and invasion.

that enhance tumour progression and tumour cell invasion. MMP activity is regulated by 'tissue inhibitors of matrix metalloproteinases' (TIMPs). These are natural inhibitors of MMPs, and it has been hypothesised that upregulation of TIMPs in cancer cells could inhibit invasion and metastasis. Interestingly, studies have shown that green tea polyphenols (GTPs) such as epigallocatechin-3-gallate (EGCG) suppress the activity of MMPs in breast and prostate cancer cell lines.

Dr Gupta and his team decided to explore this further by investigating the effect of EGCG on TIMP regulation and the potential epigenetic mechanisms underpinning this interaction. The team treated two different lines of breast and prostate cancer cells with EGCG for 72 hours and found that TIMP-3 mRNA and protein levels increased significantly. Additional investigation revealed that this was due to epigenetic regulation: EGCG treatment increases histone acetylation, resulting in increased levels of TIMP-3. Furthermore, the team found reduced trimethylation of H3K27, a repressive histone mark, at the TIMP-3 promoter site, which enhanced gene expression.

CHEMOPREVENTION OF PROSTATE CANCER
Prostate cancer is the most common cancer in men and the second leading cause of cancer-associated mortality among them. In the UK, 129 men are diagnosed with prostate cancer every day. Clearly, there is an urgent need to develop novel therapies to combat prostate cancer. Dr Gupta and his colleagues explored the molecular pathway underpinning prostate cancer and investigated the epigenetic

impacts of GTP exposure. Epigenetic silencing of glutathione S-transferase pi (GSTP1) is a key feature of prostate cancer. GSTP1 is an enzyme with an important role in the detoxification of potent compounds such as carcinogens. In prostate cancer, GSTP1 gene expression is repressed by DNA methylation at its promoter site. The team performed

PROSTATE CANCER

Tumours in the prostate compress the urethra making urination difficult and uncomfortable.



a study in which human prostate cancer cells were exposed to GTP for 1–7 days. Results revealed that GSTP1 was re-expressed and DNA methyltransferase (the enzyme responsible for DNA methylation) was inhibited, reducing the repressive effect of DNA methylation at the GSTP1 promoter site. Furthermore, chromatin immunoprecipitation assays revealed that prostate cancer cells treated with GTP have remodelled chromatin, resulting in increased transcriptional activation of the GSTP1 gene and increased gene expression. Overall, these results show that GTP has the potential to alter both DNA methylation and chromatin remodelling – two essential epigenetic mechanisms – highlighting the value of GTP as a chemopreventative tool for prostate cancer.

ANTIOXIDANT PROPERTIES OF APIGENIN
Apigenin is a flavone, a subclass of flavonoid, found in many herbs, fruits and vegetables such as parsley, onions, oranges and chamomile, and is a common component of our diet.

Apigenin is found in oranges, as well as many other fruits, vegetables and herbs.

Studies have shown that

108

www.researchoutreach.org

Current treatment options for prostate cancer include radiation therapy.

Apigenin accumulates in the nuclear matrix and binds to DNA, reducing DNA damage via oxidative stress. apigenin possesses a variety of beneficial properties such as tumour growth inhibition and antioxidant activity. Dr Gupta and his team aimed to research the effect of apigenin on prostate cancer cells. Human prostate cancer is extremely vulnerable to oxidative stress in which free radicals react with proteins and DNA, often causing extensive damage. DNA mutations caused by oxidative stress can result in malignant changes, enhancing the risk of prostate cancer development. The team performed an in-depth study, exploring the cellular uptake of apigenin in prostate cancer cells. Results revealed, for the first time, that apigenin accumulates in the nuclear matrix and binds to DNA, reducing DNA damage via oxidative stress. Interestingly, apigenin preferentially accumulates in cells which have androgen receptors. Apigenin binds to androgen receptors and inhibits its expression in prostate

cancer cells. Dr Gupta suggests that apigenin could possess androgen receptor inhibition properties, interfering with androgen signalling, reducing oxidative stress. However, more research is needed to clarify this hypothesis. FUTURE RESEARCH Dr Gupta’s ground-breaking research has shown the value of dietary phytochemicals as cancer preventative agents. The role of phytochemicals as epigenetic modifiers has been supported by the studies of Dr Gupta and his team. However, our knowledge about the exact molecular mechanisms underpinning epigenetic alterations is still limited. Furthermore, more clinical research must be performed. For example, an important next step is determining the most effective dose of phytochemicals required for optimal beneficial effects in humans. Overall, these cost-effective, low toxic dietary agents have the potential to be used as preventatives for people at risk of developing cancer or as treatments to improve the quality of life of those suffering.


Behind the Research Dr Sanjay Gupta

E: sanjay.gupta@case.edu T: +1 216 368 6162 W: http://casemed.case.edu/dept/urology/biogupta.cfm W: https://cwru.pure.elsevier.com/en/persons/sanjay-gupta/network/ W: www.youtube.com/watch?v=KKlfg8XvXac W: www.epibeat.com/aging-environment-disease/diet-cancer-phytochemicals-epigenetic-modifiers/5329/ W: www.youtube.com/watch?time_continue=4&v=21fFd1JnZHY

Research Objectives
Dr Gupta's mission is to identify and develop cost-effective, minimally toxic bioactive agents as cancer preventative agents for long-term use and as adjuvants in various therapies, with a focus on epigenetic research.

Detail
Dr Sanjay Gupta, Case Western Reserve University, Department of Urology, 2109 Adelbert Road, Wood Research Tower, RTG01, Cleveland, Ohio 44106, USA

Bio
Dr Gupta was awarded his PhD by Avadh University in 1992 and his research was conducted at the Industrial Toxicology Research Center and King George's Medical College, Lucknow, India. He completed his postdoctoral fellowship in the Department of Dermatology at Case Western Reserve University.

Funding
• National Cancer Institute (NCI) • National Center for Complementary and Integrative Health (NCCIH) • National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK) • Cancer Research Foundation • Department of Defense (DoD) • Ohio Board of Regents-Presidential Research Initiative • Gateway for Cancer Research • United States Department of Veterans Affairs (VA)

Collaborators
• Cleveland State University, USA • Cleveland Clinic Foundation, USA • Purdue University, USA • University of Louisville, USA • University of Oslo, Norway • University of Ioannina, Greece • Central University of Punjab, India • Indian Institute of Technology, India • Institute of Life Sciences, India • North-Eastern Hill University, India • Second Hospital of Lanzhou University, China

References
Deb, G., Thakur, V.S., Limaye, A.M. and Gupta, S. (2015). Epigenetic induction of tissue inhibitor of matrix metalloproteinase-3 by green tea polyphenols in breast cancer cells. Molecular Carcinogenesis, 54(6), 485–499.
Kanwal, R., Datt, M., Liu, X. and Gupta, S. (2016). Dietary flavones as dual inhibitors of DNA methyltransferases and histone methyltransferases. PLoS ONE, 11(9), e0162956.
Pandey, M., Shukla, S. and Gupta, S. (2010). Promoter demethylation and chromatin remodeling by green tea polyphenols leads to re-expression of GSTP1 in human prostate cancer cells. International Journal of Cancer, 126(11), 2520–2533.
Shankar, E., Kanwal, R., Candamo, M. and Gupta, S. (2016). Dietary phytochemicals as epigenetic modifiers in cancer: promise and challenges. Seminars in Cancer Biology, 40, 82–99.
Sharma, H., Kanwal, R., Bhaskaran, N. and Gupta, S. (2014). Plant flavone apigenin binds to nucleic acid bases and reduces oxidative DNA damage in prostate epithelial cells. PLoS ONE, 9(3), e91588.
Cancer: Key Statistics. World Health Organization. Available at: www.who.int/cancer/resources/keyfacts/en/ [Accessed 05/01/2018]
About Prostate Cancer. Prostate Cancer UK. Available at: https://prostatecanceruk.org/prostate-information/about-prostate-cancer [Accessed 05/01/2019]

Personal Response
Where do you see your research focus in five years' time?
In the next five years, our plan is to move our research from the bench to the clinic by testing the most promising agents in clinical trials.



Health and Medicine ︱ Dr Robert Katona

Adipose stem cells may promote cancer progression Adipose stem cells have been considered ideal for use in regenerative medicine due to their ‘safe and reliable’ qualities and ability to be easily and repeatedly harvested from humans. However, new research led by Dr Robert Katona of the Biological Research Centre of the Hungarian Academy of Sciences’ Institute of Genetics calls their safety into question, suggesting that the stem cells may promote cancer development.

Stem cell-based therapies have carried the hopes of researchers, physicians and patients for over thirty years, showing great promise in their ability to restore damaged and diseased organs and tissues. Most stem cell therapies rely on adult stem cells called mesenchymal stem cells (MSCs), which can differentiate into cartilage, bone, connective tissue, muscle or adipose (fat) tissue. MSCs can be obtained from almost any organ or tissue (bone marrow, lung, spleen, liver, fat, etc.). MSCs derived from fat tissue are called adipose stem cells (ASCs), and these can be isolated with minimal ethical conflict. Until recently, bone marrow served as the main source of MSCs. However, adipose tissue became much more attractive as researchers realised that ASCs were abundantly available and could be easily and repeatedly sampled with minimally invasive procedures. Furthermore, adipose stem cells have demonstrated low immunogenicity (i.e., they are unlikely to provoke an immune reaction) and some immunosuppressive properties which prevent the body from rejecting them – making them safe for therapeutic use. As a result, ASCs have been used worldwide in several clinical trials to treat conditions such as diabetes mellitus, liver disease, traumatic injuries and corneal lesions. However, further research into human

and mouse ASCs has revealed that they are prone to chromosomal instability – they may lose or gain chromosomes. These kinds of mutations have been associated with cancer and, while no cancerous transformation of ASCs has been observed yet, ASCs have been shown to incorporate into tumours and promote their growth, raising concerns about the true safety of ASC therapies. Now, research led by Dr Robert Katona of the Biological Research Centre of the Hungarian Academy of Sciences' Institute of Genetics has shown that ASCs with an abnormal number of chromosomes may promote cancer development and act as tumour support structures, known as stroma.

INVESTIGATING ASC BEHAVIOUR IN MICE
In order to get a better understanding of ASC behaviour, Dr Katona and his team extracted the visceral fat – the deep abdominal fat that surrounds organs – from mice and cultured the visceral adipose stem cells (vASCs) in vitro for several months. The changes were almost immediate: cell growth and division slowed down, with cell numbers decreasing in the first few weeks. Soon after, the cells began to deteriorate, becoming enlarged and flattened, with most stopping growth and proliferation altogether. Surprisingly, however, some continued to divide, and after 50 days the cell culture had grown beyond the initial sample size. By now, the cells were showing signs of increased DNA content, with most cells having four or more copies of at least one chromosome, rather than



Fifty thousand vASCs were plated and cultured in triplicate samples. Cells were passaged when the culture reached confluency, and the number of living vASCs was determined by trypan blue staining and counting with a BioRad TC10 counter. The graph shows the average ± SD of living cell numbers in three parallel samples. The x-axis indicates days in culture from the initial plating, and the measuring points are referred to as p1 to p13. A representative of three independent experiments is shown.

The DNA content of vASCs at passage numbers 3, 6 and 10 and of ASC.B6 cell line was determined by propidium-iodide staining and flow cytometric analysis.

the usual two, causing some cells to have up to 163 chromosomes in total, rather than the usual 40. This was confirmed using flow cytometry – where a liquid cell culture is poured down a tube and passed through one or more lasers to determine the physical and chemical properties of the cells – and microscopy. In both cases, DNA within the cells was stained so that it would fluoresce when hit with a laser and become visible to the researchers.

This result is significant as multiple DNA copies can lead to altered gene expression and change the cells' behaviour. To see how the cells' gene expression and behaviour had changed, the researchers developed an immortalised ASC cell line containing the abnormal DNA configuration, which they called ASC.B6, and compared its activity to the original vASCs. While the ASC.B6 cells were still able to differentiate into mature cells as normal, transcriptome analysis – study of the total sum of messenger RNA (an intermediary between DNA and proteins) present in the cell – showed that 2,395 genes were being expressed at least twice as much or half as much as normal. In particular, genes and gene products responsible for cell growth, division and movement were overrepresented in ASC.B6, suggesting that these cells might be involved in cancer development. This was backed by increased expression of the cancer stem cell markers Sca-1, CD29 and Krüppel-like factor 4 (Klf4). Interestingly, expression of another cancer stem cell marker, Nestin, actually decreased.

New research led by Dr Robert Katona has shown that ASCs ... may promote cancer development and act as tumour support structures.

ASCs CAN PROMOTE TUMOUR CELL GROWTH
To demonstrate this, Dr Katona and his team co-cultured the 4T1 murine breast cancer cell line with vASCs or ASC.B6. Both ASC cultures enhanced proliferation of 4T1, although ASC.B6 did so far more than the vASCs. Surprisingly, most of the tumour-promoting growth factors were expressed at similar levels in both vASCs and ASC.B6. However, expression of one tumour-promoting growth factor, insulin-like growth factor 1 (IGF1), increased in ASC.B6 compared to vASCs (IGF1 production in vASCs was only initiated when they started to show increased DNA content), and IGF1 was found in the culture medium surrounding both cell types. Moreover, treatment with an IGF1 antibody significantly decreased tumour growth, suggesting that IGF1 production was at least partly responsible for ASC.B6's tumour-growth-promoting effect.

WHY IGF1?
IGF1 is responsible for promoting mammary terminal end bud, ductal and gland formation in healthy mammals. While it is mostly produced in the liver, it is also expressed in mammary stromal

Metaphase chromosome spreads were made from colchicine-blocked vASCs at passage numbers 4 and 7, and ASC.B6. Chromosomes were DAPI stained and counted using a fluorescent microscope.



Cell surface markers of vASCs at passage 3 and of ASC.B6 cells were detected by flow cytometry and the percentage of positive cells was determined. The bars show the mean ± SD from three independent experiments; statistical analysis was by t-test, with *P < 0.05 and **P < 0.01.

cells. As a result, overexpression of IGF1 can lead to breast cancer development, progression and metastasis. This is reflected in Dr Katona's results, where he exposed vASCs and ASC.B6 to 4T1 murine breast cancer cells. The tumour-promoting effect of IGF1 is often heightened when it binds to its receptor, IGF1R. This makes it a promising target for human breast cancer therapies, and several researchers have already run clinical trials treating breast cancer with IGF1R antibodies or receptor tyrosine kinase inhibitors. Unfortunately, these clinical trials were stopped, as IGF1R antibody therapy caused hyperglycaemia


Functional enrichment analysis performed with IPA to identify the biological functions and diseases most strongly represented by the differentially expressed genes (categories shown: cancer; death and survival; cell cycle; cellular growth and proliferation; cellular movement). Only carcinogenesis-associated categories are presented, all of which are highly affected. A right-tailed Fisher's exact test was used to calculate p-values, and false discovery rates (FDR) were generated from the Benjamini–Hochberg corrected p-values.

By now, the cells were showing signs of increased DNA content, with... cells having up to 163 chromosomes in total, rather than the usual 40.

– increased blood sugar – and metabolic syndrome in patients, while receptor tyrosine kinase inhibitors led to patients developing metabolic toxicities. Currently, researchers are exploring alternative anti-IGF1 antibodies, with some successfully slowing or even inhibiting cancer proliferation in late-stage cancer patients. While this sounds

vASCs at passage numbers 3, 6 and 10 and ASC.B6 cell line were stained for SA-βgal activity for 16 h and then the blue staining (senescence) was detected with an inverted light microscope.

Mammalian stem cells dividing.


like a lot of effort, finding effective therapeutic targets against tumour stroma, such as ASCs, is critical, as their secretion of IGF1 can lead to cancer drug resistance, minimising the effectiveness of treatments and leading to poorer patient prognosis.

CAN THE MOUSE MODEL BE APPLIED TO PEOPLE?
Dr Katona believes it can. Mouse models are used due to their high similarity with humans (we share 85% of our protein-coding DNA). As a result, cancerous activities observed in mice would most likely reflect those in humans. Just as in mouse ASCs, ageing and deteriorating human ASC cultures have shown altered chromosome numbers. Dr Katona expects this would also be the case in the body, and could be due to higher concentrations of reactive oxygen species or inflammatory mediators which promote chronic inflammation in the tumour microenvironment. While Dr Katona hopes that his mouse ASC model could be used for further research into the genes responsible for cancer development and potential treatments targeting tumour stroma, his findings could also influence the future use of therapeutic ASCs.


Behind the Research Dr Robert Katona

E: katona.robert@brc.mta.hu T: +3662433397 W: www.brc.hu/gen_acstem.php

Research Objectives
Dr Katona's work uses a mouse adipose stem cell model system to study cancer and cancer stem cell development.

Detail
Institute of Genetics, Biological Research Centre, Hungarian Academy of Sciences, 6726 Szeged, Temesvari krt. 62., Hungary

Bio
Dr Katona is Principal Investigator at the Artificial Chromosome and Stem Cell Research Laboratory. He has 28 years' experience in basic research and R&D. His fields of expertise are molecular biology, biochemistry, cell biology, transgenics, stem cell biology, nanotechnology, cytology and industrial protein production.

Funding
This work was funded by the GINOP-2.3.2-15-2016-00001 and GINOP-2.3.2-15-2016-00039 grants of the National Research, Development and Innovation Office.

Collaborators
• Roberta Fajka-Boja • Annamaria Marton • Anna Toth • Peter Blazso • Vilmos Tubak • Balazs Balint • Istvan Nagy • Zoltan Hegedus • Csaba Vizler

References
Fajka-Boja, R., Marton, A., Tóth, A., Blazso, P., et al. (2018). Increased insulin-like growth factor 1 production by polyploid adipose stem cells promotes growth of breast cancer cells. BMC Cancer, 18, 872–884. Available at: https://bmccancer.biomedcentral.com/articles/10.1186/s12885-018-4781-z [Accessed 22/01/2019].
Frese, L., Dijkman, P. and Hoerstrup, S. (2016). Adipose Tissue-Derived Stem Cells in Regenerative Medicine. Transfusion Medicine and Hemotherapy, 43(4), 268–274. Available at: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5040903/ [Accessed 22/01/2019].
Karanes, C., Nelson, G., Chitphakdithai, P., Agura, E., et al. (2008). Twenty Years of Unrelated Donor Hematopoietic Cell Transplantation for Adult Recipients Facilitated by the National Marrow Donor Program. Biology of Blood and Marrow Transplantation, 14(9), 8–15. Available at: https://www.bbmt.org/article/S1083-8791(08)00249-8/fulltext [Accessed 22/01/2019].
Miana, V. and González, E. (2018). Adipose tissue stem cells in regenerative medicine. ecancermedicalscience, 12, 822–836. Available at: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5880231/ [Accessed 22/01/2019].
Robertson, S. (2018). What is flow cytometry? News Medical Life Sciences. Available at: https://www.news-medical.net/life-sciences/What-is-Flow-Cytometry.aspx [Accessed 22/01/2019].
Why mouse matters. (2010). National Human Genome Research Institute. Available at: https://www.genome.gov/10001345/importance-of-mouse-genome/ [Accessed 22/01/2019].

Personal Response
Given your results, would you suggest people reconsider the use of therapeutic ASCs and, if so, under what circumstances?
Yes, but people should receive ASC therapy only if all the safety issues have been addressed by double-blind, regulated and controlled clinical trials. Any clinically unproven procedures should be strongly avoided.



Biology ︱ Dr Nan-ping Weng

The biology of ageing What causes ageing? Is it inevitable? Could it be slowed or even reversed? Humans have wanted to know the answer to these questions ever since we became aware of our own mortality. Despite the tremendous advances that have been made in our scientific understanding over the past century, ageing remains one of the greatest mysteries in biological science. But scientists, like Dr Nan-ping Weng of the National Institutes of Health, are slowly uncovering its secrets.

Dr Nan-ping Weng thinks the best way to study human ageing is through a longitudinal approach. A 'cross-sectional' approach, comparing groups of young individuals to older groups, is commonly used and has yielded rich information regarding the differences between young and old groups. But this approach has issues, according to Weng: "Substantial differences in environmental experiences exist between young and old subjects, it is sometimes difficult to distinguish those true biological changes with age, from different life experiences". With this in mind, he utilises an approach that tracks the changes that occur to

impacts their susceptibility to disease. In Dr Weng's group's 2016 study published in Immunity & Ageing, Dr Weng wanted to understand two things: firstly, was there a change in the different types of immune cells in individuals as they age, and secondly, could a pattern be determined by comparing individuals? To do this, they looked at four different types of immune cells: B cells (cells that produce antibodies to fight infection), natural killer (NK) cells (part of the inbuilt or 'innate' system, which seek out and kill infected cells) and two types of T cells (part of the immune system that can adapt following infection). T cells come in two

Humans have wanted to know the answer to these questions ever since we became aware of our own mortality.

people over time. These follow-ups can come after 5 years, 10 years, 20 years, or even a lifetime. Although more time-consuming and expensive than the cross-sectional approach, so-called longitudinal studies allow researchers to minimise the variables that might otherwise confuse their conclusions.

IMMUNE SYSTEM DECLINE
It is well known that as people age, they become more susceptible to infection and other diseases; this is one of the reasons we vaccinate the elderly against influenza. In his longitudinal studies of ageing, Dr Weng is trying to understand what changes in the immune system as people age, and how this

flavours: CD4+ cells, which secrete signals to aid other immune cells in the fight, and CD8+ cells, which, like NK cells, seek out and kill infected cells. These cells are known collectively as lymphocytes. Dr Weng and colleagues tracked the changes to immune cell numbers in 165 subjects aged from 24 to 90 years old. The results showed a large degree of variability from person to person: some subjects showed a decline, some no change and some an increase. To try and understand this further, the group looked to see if the rate of change varied as people got older. They were interested to see that the rate of change stayed quite consistent over time. This means that although two people could display very different changes to their immune system as they age, each individual's pattern of change did not seem to vary over time. Interestingly, different immune cells seem to vary to different extents. Most variable



WHAT WE LOSE WITH AGE: telomeres – the end caps that protect the chromosome – shorten as cells divide over time, and eventually cell division stops.

were CD4+ T cells, followed by NK cells, CD8+ T cells and B cells. The group wondered if these differences might be due to infection with viruses such as cytomegalovirus (CMV), but this proved not to be the case. However, they did find that levels of T cells correlated with levels of the cytokine (a soluble signal that controls immune function) IL-15, which is known to cause these cells to expand in number. Similarly, the number of B cells was found to correlate with the signal TNF-RI, which causes a similar expansion in these cells. Therefore, differences in the levels of these soluble signals may have produced this variability.

TELOMERES AND AGEING
But what might cause the levels of immune cells to change? An explanation for why the levels of certain immune cells drop in some people as they age may lie in something called a telomere. Each time a cell divides in two, the cell needs to make an identical duplicate of its DNA (the template that instructs a cell how to function) so that each daughter cell receives a copy. On the end of each DNA strand is a sequence called a telomere. Each time the cell divides, the telomere shortens until eventually it becomes so short that the cell can no longer divide. Why the DNA is capped with these telomere sequences is not known for sure, but some have put

forward the idea that they may act as a safety mechanism to prevent cells from dividing out of control, which could lead to cancer. In order to fulfil their function, the lymphocytes of the immune system are required to divide many times. This has led scientists to wonder if they could be susceptible to telomere shortening. To address this question, Dr Weng published two studies, in the journals Clinical Science (2015) and Frontiers in Immunology (2017), that aimed to track the length of immune cell telomeres as people age. The first study followed 216 people aged between 20 and 90 years, with measurements at 0, 5 and 12 years, and the second study was performed

in 465 people aged between 21 and 88 years over a period of 13 years. Examining a pool of immune cells that includes T, B and NK cells, the group found that telomere length decreased as the participants aged. Interestingly, the rate of change among T cells, B cells and monocytes varied, which may be a result of the different functions they fulfil. Despite the telomere shortening with each division, cells do have a tool they can use to lengthen their telomeres once more: an enzyme (protein catalyst) known as telomerase. The levels of this enzyme are usually tightly controlled, probably due to the risk of cancer. The study found that a substantial amount of the variation seen in T cell telomere length could be accounted for by


Inflammaging refers to the chronic, low-grade inflammation that characterises ageing. In young adults, the tissue microenvironment is healthy, whereas cumulative insults from stressors and diseases over a lifetime result in damage to the microenvironment and increased levels of inflammation-related cytokines (IL-1β, IL-6, TNF-α) in circulation. This figure is modified from Franceschi et al., Seminars in Immunology, 2018.



different levels of telomerase activity in cells, in combination with other factors including the number of immature T cells and physiological features such as blood glucose level. Notable was the fact that, as with the group's 2017 study, the differences observed varied substantially between participants. Most noticeably, age-related trajectories of telomere attrition, elevated circulating inflammatory cytokines, and anti-CMV IgG are independent, and ageing individuals do not show a uniform pattern of change in these variables (Figure 3), highlighting the importance of studying changes in individuals, rather than averaging whole groups together.

EXPLAINING IMMUNE SYSTEM DECLINE
If the telomere length of immune cells decreases as people age, could there be a link between the length of a person's telomeres and the ability of their immune system to defend against infection? To tackle this question, Dr Weng and colleagues performed a study published in 2015 in The Journal of Infectious Diseases. In the paper, the group studied 22 healthy, older individuals whose immune cells had either particularly short or long telomeres, and determined the strength of their immune response to an influenza vaccine. As the body mounts an immune response to an invading virus such as influenza, B cells produce large amounts of antibodies. Antibodies are proteins that stick to the virus, rendering it unable to infect cells. Antibodies are highly specific, so they must be generated for each new strain of influenza virus that a person encounters. As the vaccine contains an inactive influenza virus, the researchers could measure the levels of influenza antibodies produced after participants received the shot and use this to determine the strength of their immune response. Dr Weng's group were interested to see that those with a strong antibody response to influenza infection had

INDIVIDUALIZED AGE-ASSOCIATED CHANGES IN IMMUNE SYSTEM
Ageing of the immune system measured by three independent biomarkers: telomere length, inflammation-related cytokines (IL-1β, IL-6, etc.), and anti-CMV IgG titer. In young adults, all three parameters are favourable; with advancing age, older adults display different degrees of change in these three biomarkers.

longer B cell telomeres than those with a weak response. They also found that telomere length in influenza-specific CD8+ T cells positively correlated with the cells' ability to undergo division. This raises the possibility that the shortening of telomeres in immune cells could contribute to the decline in a person's ability to fight off infection as they age.

THE FUTURE
The changes that occur during ageing, and how these result in declining health, are an area of interest for society and scientists alike. By tracking changes in subjects over time, researchers such

When looking at Dr Weng’s studies, one thing becomes clear, the way in which individuals age varies greatly.


as Dr Weng are helping to shed light on these questions. When looking at Dr Weng's studies, one thing becomes clear: the way in which individuals age varies greatly. This heterogeneity highlights the importance of his approach; if we simply average together groups of participants, we may lose important information in the noise. A proper understanding of the processes that underlie ageing is crucial if we are to improve our health in old age. This includes bolstering our ability to fight off infection as we get older and enhancing the function of the immune system when it gets "old", such as in the case of reduced ability to fight off influenza infection. We therefore await the results of Dr Weng's future studies with great anticipation.


Behind the Research Dr Nan-ping Weng

E: Wengn@mail.nih.gov T: +1 410 558 8341 W: https://irp.nih.gov/pi/nan-ping-weng W: www.nia.nih.gov/research/labs/lmbi/lymphocyte-differentiation-section W: www.researchgate.net/profile/Nan-ping_Weng

Research Objectives
Dr Nan-ping Weng's research is focused on understanding the mechanisms of age-related changes in immune function, with emphasis on the role of telomeres in T cell function, TCR repertoires, and naïve and memory T cell maintenance, using a human longitudinal study (BLSA) and mouse models.

Detail
251 Bayview Blvd., Baltimore, MD 21224, USA

Bio
Dr Weng received his MD from Fudan University Shanghai Medical College (formerly Shanghai First Medical College), China, and his PhD in Immunology from Baylor College of Medicine. He obtained his postdoctoral training at NCI, NIH. He joined the National Institute on Aging (NIA) as a tenure-track investigator and is now a tenured senior investigator at the Laboratory of Molecular Biology and Immunology, NIA.

Funding
This research was supported entirely by the Intramural Research Program of the NIH, National Institute on Aging.

Collaborators
• Richard Hodes • Luigi Ferrucci

References
Lin, Y., Damjanovic, A., Metter, E.J., Nguyen, H., Truong, T., Najarro, K., Morris, C., Longo, D.L., Zhan, M., Ferrucci, L., Hodes, R.J. and Weng, N.P. (2015). 'Age-associated telomere attrition of lymphocytes in vivo is co-ordinated with changes in telomerase activity, composition of lymphocyte subsets and health conditions'. Clin Sci (Lond), 128(6), 367-77. doi: 10.1042/CS20140481.
Lin, Y., Kim, J., Metter, E.J., Nguyen, H., Truong, T., Lustig, A., Ferrucci, L. and Weng, N.P. (2016). 'Changes in blood lymphocyte numbers with age in vivo and their association with the levels of cytokines/cytokine receptors'. Immunity & Ageing, 13, 24.
Lustig, A., Liu, H.B., Metter, E.J., An, Y., Swaby, M.A., Elango, P., Ferrucci, L., Hodes, R.J. and Weng, N.P. (2017). 'Telomere Shortening, Inflammatory Cytokines, and Anti-Cytomegalovirus Antibody Follow Distinct Age-Associated Trajectories in Humans'. Front Immunol, 8, 1027.
Najarro, K., Nguyen, H., Chen, G., Xu, M., Alcorta, S., Yao, X., Zukley, L., Metter, E.J., Truong, T., Lin, Y., Li, H., Oelke, M., Xu, X., Ling, S.M., Longo, D.L., Schneck, J., Leng, S., Ferrucci, L. and Weng, N.P. (2015). 'Telomere Length as an Indicator of the Robustness of B- and T-Cell Response to Influenza in Older Adults'. J Infect Dis, 212(8), 1261-9.

Personal Response
What future studies do you have planned?
We are currently focusing on unravelling the underlying mechanisms of lymphocyte ageing, from single cells to cell populations to their collective function. We hope to be able to determine cellular ageing at the individual cell level and to apply a multi-parameter approach to measuring immune system age. Such in-depth measurements of an individual will allow us to determine general and specific changes in lymphocytes and to develop new tools for clinical applications in precision medicine.

www.researchoutreach.org

117


Health & Medicine ︱ Dr Hanne van Ballegooijen

Dream team: Improving hearts and bones with vitamins D and K
Vitamins are essential nutrients our bodies require to perform important functions. Although they can be obtained through a healthy lifestyle, many people are deficient in vitamins such as vitamin D and vitamin K. Dr Hanne van Ballegooijen and her team from Amsterdam UMC, location VUmc, work to understand the consequences of such deficiencies for human health. Future research may focus on combinations of specific vitamins and the long-term impact of supplement use, a popular worldwide trend.

Vitamins are organic compounds essential for our bodies to perform a range of specific functions. As we cannot produce them ourselves, we must obtain them through our daily diets or lifestyles. Two important fat-soluble vitamins are vitamin D and vitamin K.

WHY DO VITAMINS MATTER?
Vitamin D is responsible for controlling the amount of calcium and phosphate in our bodies. Without vitamin D we cannot absorb calcium, which means our bones are at risk of becoming soft or brittle. Vitamin D is often associated with getting enough sunlight, as the majority of our vitamin D comes from exposing bare skin to sunlight; it is very difficult to get sufficient vitamin D from diet alone. Less commonly understood than vitamin D is vitamin K. Unlike vitamin D, vitamin K is predominantly obtained through diet; sources include leafy green vegetables, fish, fermented dairy and eggs. Vitamin K is an umbrella term for a group of compounds (vitamin K1, vitamin K2 and so on, classified by their chemical structure and side chain) which contribute to the body’s ability to clot blood. Vitamin K can only be stored in small amounts and should be consumed daily.

Understandably, anyone concerned about vitamin D deficiency may choose to top up their levels with supplements. The use of supplements is widespread throughout the world, with tablet forms often available in supermarkets at low prices. However, over-the-counter supplements containing single vitamins or nutrients may not take into account unknown long-term consequences or the importance of interactions between different vitamins. Vitamins D and K may work together to ensure calcium is correctly distributed in our bodies: whilst vitamin D plays an overarching role in stabilising calcium levels in our blood, vitamin K helps to ensure that calcium is distributed to our bones as well. Research by Dr Hanne van Ballegooijen of Amsterdam UMC, location VUmc, is some of the first to investigate the health benefits of this combination of vitamins.

HOW CAN VITAMINS D AND K AFFECT OUR HEALTH?
Dr van Ballegooijen’s research forms part of 10 years of study into the function of vitamin D in our bodies, with a particular focus on cardiovascular health. Specifically, her latest research considers the potential mechanism behind the interaction of vitamins D and K and explores the benefits for both cardiovascular and bone health. Cardiovascular health problems affect many people around the world and, according to the World Health Organization, account for a third of deaths worldwide. The consequences are devastating. Bone health problems are less often directly culpable as a cause of death, but are



GOOD NUTRITION PROMOTES VITAMIN K

NATURAL SUNLIGHT PROMOTES VITAMIN D

HEALTHY CARDIOVASCULAR SYSTEM

HIGHER BONE DENSITY

When vitamin K and vitamin D are received in the correct amounts, there are benefits to the cardiovascular system and to bone density. Both of these factors are important for a healthy, active lifestyle.

particularly common among older adults. Unfortunately, this demographic is also at increased risk of cardiovascular difficulties.

The specific aim of Dr Hanne van Ballegooijen’s research was to uncover the potential synergy between vitamins D and K and so better understand their combined functionality. That began by unravelling the complex interaction between the two vitamins to understand their impact on each other. One hypothesis is that vitamin D stimulates the creation of vitamin K-dependent proteins within the body. Once created, these proteins need vitamin K for optimal functioning; in this way, the body ensures that calcium is always well distributed. Without vitamin K, these proteins cannot be carboxylated and so cannot function properly. But what if these vitamins are not supplied in optimal amounts because only one of them is supplemented? The team suggests that widespread vitamin D supplementation may therefore be part of the reason for the vitamin K deficiencies identified in humans.

FINDINGS FROM POPULATION STUDIES
The team hypothesised that vitamins D and K can only interact correctly when both are present in the right amounts. As previously mentioned, many people choose to take supplements. However, the long-term consequences of even low-dose vitamin D supplements have not received much research attention.

In studies conducted on Dutch population cohorts, some groups were found to be particularly at risk. Older adults who were deficient in both vitamins had a greater risk of high blood pressure, also known as hypertension. The study used data from the Longitudinal Aging Study to research the incidence of high blood pressure and hypertension in people aged 55 to 65 years. When the participants were followed up approximately six years later, the team found that 62% of the group with insufficient vitamin D and K had developed hypertension. Participants with low levels of both vitamins also showed increased systolic blood pressure compared with the group with higher levels of both vitamins. The team concluded that the combination of low levels of vitamins D and K led to increased systolic blood pressure and a greater risk of incident hypertension.

The consumption of a well-balanced diet is most important for the prevention of chronic diseases.

STAGES OF ATHEROSCLEROSIS

Atherosclerosis is the most common type of arteriosclerosis. Fatty plaques build up in the blood vessels and eventually restrict and even block the flow of blood.



A second study, conducted among kidney transplant recipients, elucidated the role of low levels of both vitamins D and K in long-term outcomes. Though transplant recipients are significantly more likely to survive than dialysis patients, their risk of graft failure and associated mortality remains high. Kidney transplant recipients tended to have low levels of a range of micronutrients, including vitamins D and K. The recipients were enrolled in the study between 2002 and 2003, approximately six years after receiving their transplants. When followed up almost 10 years later, more than a quarter of the cohort had died and a further 10% had developed death-censored graft failure. The team concluded that kidney transplant recipients who are deficient in both vitamins D and K have a higher risk of graft failure and a higher risk of mortality. A further analysis showed that patients who were treated with vitamin D and had low vitamin K levels had a greater risk of death or transplant failure than those who weren’t. Nowadays, kidney patients are routinely given vitamin D treatment, but the long-term consequences are poorly understood. The team originally hypothesised that members of “at-risk” groups who used vitamin D supplements and had low vitamin K levels would be at greater risk of disease. Their research showed that those undertaking vitamin D therapy, or taking supplements, with low vitamin K status did indeed have a much higher risk of mortality and graft failure. The next step is to unravel whether vitamin D combined with vitamin K-rich foods or vitamin K supplements could improve bone and cardiovascular health.

LOWER RISK OF FRACTURES LESS ARTERIOSCLEROSIS HIGHER BONE DENSITY
Eating a healthy, balanced diet rich in vitamin K has many health benefits for the bones and cardiovascular system.

The long-term consequences of vitamin D supplementation on clinical outcomes are poorly understood.

RESEARCH IN THE FUTURE
Having already researched vitamins D and K extensively, Dr Hanne van Ballegooijen still has plans to discover more about the impact and consequences of vitamin D for human health. Future research may consider the consequences of long-term vitamin D supplement use. This would form an important part of future health knowledge, as the side effects of long-term vitamin D supplement use remain poorly understood. Time will tell whether vitamin D supplementation, used appropriately in combination with vitamin K, could prove an inexpensive, preventative way of promoting cardiovascular health.

In the right proportion, vitamins K and D can contribute to a healthy cardiovascular system.



Behind the Research Dr Hanne van Ballegooijen

E: aj.vanballegooijen@vumc.nl T: +31 20 44 43127 W: www.researchgate.net/profile/Hanne_Van_Ballegooijen2

Research Objectives

References

Dr Hanne van Ballegooijen has been working on vitamin D for almost 10 years. Her work suggests that vitamin D status alone is not strongly related to cardiovascular disease and that combining vitamin D with vitamin K might improve its efficacy.

Nair R and Maseeh A, 2012. Vitamin D: The “sunshine” vitamin. Journal of pharmacology & pharmacotherapeutics, 3(2), p.118.

Detail
Dr A.J. (Hanne) van Ballegooijen, PhD
Amsterdam UMC, location VUmc
De Boelelaan 1117
1081 HV Amsterdam
The Netherlands
Bio
Dr van Ballegooijen is an assistant professor at Amsterdam UMC, location VUmc, in the departments of Nephrology and Epidemiology & Biostatistics, Amsterdam. Her research is focused on nutrition and lifestyle determinants in the etiology of cardiovascular diseases and chronic kidney disease. A better understanding of these relationships may help to promote cardiovascular health. She is also involved in teaching and supervising students in the field of nutritional epidemiology and cardiovascular and kidney disease, activities she enjoys for encouraging critical thinking and stimulating others to outperform themselves.
Funding
Niels Stensen Fellowship, Dutch Kidney Foundation, ZonMw
Collaborators
Amsterdam: Prof J.W.J. Beulens and Dr M.G. Vervloet
Maastricht: Prof L. Schurgers
Graz, Austria: Dr S. Pilz, Dr N. Verheyen
Seattle, USA: Prof B. Kestenbaum, Prof I. de Boer

US Department of Health and Human Services, 2004. Bone health and osteoporosis: a report of the Surgeon General. Rockville, MD: US Department of Health and Human Services, Office of the Surgeon General, 87.
van Ballegooijen AJ, Cepelis A, Visser M, Brouwer IA, van Schoor NM and Beulens JW, 2017. Joint association of low vitamin D and vitamin K status with blood pressure and hypertension. Hypertension, 69:1165-1172.
van Ballegooijen AJ, Pilz S, Tomaschitz A, Grübler MR and Verheyen N, 2017. The synergistic interplay between vitamins D and K for bone and cardiovascular health: a narrative review. International Journal of Endocrinology, 2017.
van Ballegooijen AJ, Beulens JWJ, Keyzer CA, Navis GJ, Berger SP, de Borst MH, Vervloet MG and Bakker SJL, 2019. Joint association of vitamin D and K status with long-term outcomes in stable kidney transplant recipients. Nephrology Dialysis Transplantation.

Personal Response
Considering modern lifestyles, is it realistic to expect the majority of the population to obtain sufficient amounts of vitamin D or vitamin K without supplements, or do we need to improve supplements instead?
More clinical data about the potential interplay between vitamin D and vitamin K metabolism are urgently needed before broader treatment recommendations can be given. The consumption of a well-balanced diet is key for population-based primary prevention of chronic diseases. As more is discovered about the powerful combination of vitamins D and K, there is renewed reason to eat a healthy diet including a variety of foods, such as vegetables and fermented dairy, for bone and cardiovascular health.



Health & Medicine ︱ Dr Stephen Barr

Evolutionary arms race
A 400 million-year-old battle between HIV and the ancient genes HERC5 and HERC6
Two human genes, HERC5 and HERC6, have protective effects against HIV and other viruses. Dr Stephen D. Barr from Western University, Canada, traces the evolution of these genes throughout human history and investigates their protective effects. His work has shown that these genes are involved in an evolutionary arms race, with host defence mechanisms and the virus ever trying to outwit one another. Increased understanding of the viral mechanisms involved may present novel targets for antiviral treatments.

The HIV virus. Some human genes have protective effects against the virus.

The immune system is part of the host’s defence against potentially dangerous pathogens, such as bacteria and viruses. However, many viruses have evolved ways to avoid elimination by the cells of the immune system. These adaptations include avoiding detection or hiding from immune cells, interfering with host processes, or preventing recruitment of immune cells to the site of infection. HIV is one such virus: it evades the immune response by preventing antibodies from binding to it and by destroying immune cells, amongst other strategies. Understanding more about the mechanisms viruses use to avoid antiviral activity is vital for the future development of antiviral drugs.

THE HERC GENE FAMILY AND HIV
Several proteins induced by signalling molecules called interferons have previously been identified as exhibiting antiviral activity towards HIV. These include proteins encoded by a family of genes called HERC genes. There are six HERC members, classified into ‘large HERC’ proteins (HERC1 and HERC2) and ‘small HERC’ proteins (HERC3, HERC4, HERC5 and HERC6). The small HERC proteins are highly similar, and of these only HERC5 and HERC6 are interferon-induced in humans. Although the functions of the small HERC family members are not fully understood, they do have antiviral effects. Dr Stephen D. Barr and his team at Western University, Canada, previously discovered that the human protein HERC5 potently inhibited the replication of human immunodeficiency virus (HIV), identifying it as a new candidate for HIV therapy. In contrast, whilst HERC6 also inhibited HIV replication, it did so to a much lesser extent than HERC5.

Based on this work, the Barr lab was also interested in the origin of the HERC5 gene and how its antiviral activity has evolved in vertebrates. To investigate, they searched genome databases to find the earliest emergence of the small HERC gene members and tracked their evolution through vertebrates over time. More recent, or distantly related, versions of genes can be recognised by matching similarities in their genetic sequences to their ancestral genes. In this way a family tree of genes can be made, which identifies the original genes and follows their changes and divergence over time and between species.

Dr Barr’s team showed that the oldest member of the small HERC gene family is HERC4. HERC4 is present in one of the few surviving lineages of jawless fish, the sea lampreys, which originated almost 600 million years ago. HERC4 has since duplicated three times at different evolutionary timepoints in the tree of life. The last expansion of the HERC family happened after the divergence of the ray-finned fish, such as eels, salmon and sturgeon, around 400 million years ago, and this probably resulted in a duplication of the HERC6 gene which gave rise to HERC5. The earliest vertebrate identified to possess HERC5 was an ancient marine organism called the coelacanth, a type of fish that emerged over 400 million years ago.

To better understand how HERC5 and the other HERC proteins evolved in vertebrates, Dr Barr’s team compared


Three dimensional models of evolutionarily similar HERC5 proteins.

dozens of evolutionarily divergent HERC proteins, looking for similarities and differences that might give clues about how the antiviral function of HERC5 emerged. One program used for this complex analysis is Structural Alignment of Multiple Proteins (STAMP), which aligns proteins based on their three-dimensional structures. The STAMP analysis showed that a region of the HERC5 protein that is key for antiviral activity, the RCC1-like domain, was highly similar across the other HERC proteins and across HERC proteins from evolutionarily distant species, such as coelacanth and human. This highlighted the possibility that HERC5 in other vertebrates, and perhaps other HERC proteins, may possess antiviral activity, and that this activity has an ancient origin.

AN EVOLUTIONARY ARMS RACE
In order to avoid detection and destruction by host immune defences, viruses continually evolve mechanisms to escape detection. To combat this, host defences must also evolve to keep pace with the viral adaptations. This evolutionary arms race, sometimes called the Red Queen hypothesis, is the driving force behind the battle to survive in a constantly evolving environment. This dynamic coevolution is driven by a positive feedback loop: a gain of fitness, or ability to survive, in one organism must be met by an increase in fitness in the competing organism. HERC genes originated in marine species; however, not all vertebrates possess HERC genes. This suggests that different forms of the genes emerged at different points during evolution, when they would confer a survival advantage to the host. Two of the more recent HERC genes, HERC5 and HERC6, were observed to have been involved in an evolutionary arms race with viruses for hundreds of millions of years. Because of this battle, these genes have developed sophisticated ways to put up a shield in cells to block viruses. Using sequence analysis, the Barr lab showed that positive selection pressure is causing both HERC5 and HERC6 to evolve. For example, they have previously demonstrated that part of the RCC1-like domain in HERC5 has been evolving in this way for over 100 million years, providing further evidence that these genes are inextricably linked with viral evolution. Interestingly, HERC3 and HERC4, which also possess the RCC1-like domain, do not seem to be evolving under selective pressure despite being much older than HERC5 and HERC6.

ANTIVIRAL EFFECTS AGAINST OTHER VIRUSES
In order to assess whether these antiviral effects extend to related viruses, Dr Barr and his team tested their activity towards a non-human virus, the monkey counterpart of HIV called simian immunodeficiency virus (SIV). SIV is thought to be at least 32,000 years older than HIV, so it could be hypothesised that primate immune responses are better adapted to target SIV than HIV. The team tested HERC5 and HERC6, as these exhibited the strongest and the weakest antiviral effects in human cells, respectively.

Human cells expressing increasing concentrations of coelacanth HERC5, human HERC5 or human HERC6 were infected with either SIV or HIV. As the team predicted, human HERC5 showed the greatest protective effect against

A coelacanth, the earliest vertebrate identified to possess HERC5.



Salmon (large image), sea lamprey (top inset) and sturgeon (bottom inset) all carry versions of the HERC genes.

HIV, with coelacanth HERC5 and human HERC6 showing little inhibition of HIV. Surprisingly, all HERCs potently inhibited SIV replication. This suggests that the protective responses of HERC5 and HERC6 exhibit species- and virus-specific activity.

Photo Credit: Crystal Mackay, Schulich School of Medicine & Dentistry, Western University

Given that the RCC1-like domain is important in the immune response against viral infection, Dr Barr asked why HERC5 and HERC6 varied in their antiviral activities, with particular focus on this domain. Indeed, his team showed that if part of the RCC1-like domain from HERC5 was transferred into HERC6, HIV particle production was inhibited to a level similar to that of HERC5. Dr Barr notes that “additional structure-function studies are needed to understand exactly how this part of the RCC1-like domain exerts its antiviral activity.” Since viruses such as HIV have been in battle for so long, they have had time to learn ways to get around the host defence shields and become smarter. Consequently, this new level of sophistication can allow these viruses to jump the species barrier

Dr Barr with Ms Ermela Paparisto, the first author of the 2018 study and a PhD student in Dr Barr’s laboratory.


to establish new infections in humans, a process called zoonosis. An interesting comparison would be to infect monkey cells with the viruses to see whether the observations from human cells also hold in another species.

DEVELOPMENT OF NOVEL THERAPIES
Dr Barr’s work investigating the arms race between genes and viruses has provided new insights into how both the immune system and viruses have evolved. The work by the Barr lab has highlighted that the HERC family is likely to have an important role in intrinsic immunity. Dr Barr’s ultimate goal is to discover more about the mechanisms viruses use to inactivate HERCs and other similar antiviral proteins. In doing so, he provides a window of opportunity by which this knowledge can be exploited for the development of novel antiviral drugs. If these mechanisms can be specifically targeted, antiviral genes such as the HERCs will remain active during infection, increasing viral clearance and therefore improving the outcome for the patient.


Behind the Research Dr Stephen Barr

E: stephen.barr@uwo.ca T: +1 519 661 3438 W: http://publish.uwo.ca/~sbarr9/ www.youtube.com/watch?list=PLA0E847DD01B0FA01&v=3PcfQhaQNK8

Research Objectives Dr Stephen Barr’s work aims to illuminate the evolution of the HIV virus and the family of HERC genes that inhibit HIV.

Detail
Western University
Schulich School of Medicine & Dentistry
Department of Microbiology & Immunology
Dental Sciences Building, Room 3007
London, Ontario N6A 5C1
Canada
Bio
Dr Barr obtained his PhD in Molecular Biology from the University of Calgary under the supervision of Dr Lashitew Gedamu, studying a family of antioxidant proteins called peroxiredoxins and how they protect cells during infection by Leishmania. He then studied various aspects of HIV biology as a Postdoctoral Fellow with Dr Rick Bushman at the University of Pennsylvania, and continued his Fellowship with Dr James Smiley at the University of Alberta, where he studied the host interferon response towards HIV. In 2008, Dr Barr became an independent investigator, and his laboratory continues to study the host interferon response towards viruses such as HIV and Ebola virus.
Funding
• Natural Sciences and Engineering Research Council of Canada (NSERC)
• Canadian Institutes of Health Research (CIHR)
• CWRU/UH Center for AIDS Research (CFAR)
• National Institute of Allergy and Infectious Diseases (NIAID)
Collaborators
• Dr Eric Arts
• Dr Yong Gao

@BarrLab

References
Paparisto E, Woods MW, Coleman MD, Moghadasi SA, Kochar DS, Tom SK, Kohio HP, Gibson RM, Rohringer TJ, Hunt NR, Di Gravio EJ, Zhang JY, Tian M, Gao Y, Arts EJ, Barr SD. (2018). ‘Evolution-guided structural and functional analyses of the HERC family reveal an ancient marine origin and determinants of antiviral activity’. J Virol, 92:e00528-18. Available at: https://doi.org/10.1128/JVI.00528-18 [Accessed 6 February 2019].
Woods MW, Tong JG, Tom SK, Szabo PA, Cavanagh C, Dikeakos JD, Haeryfar SMM, Barr SD. (2014). ‘Interferon-induced HERC5 is evolving under positive selection and inhibits HIV-1 particle production by a novel mechanism targeting Rev/RRE-dependent RNA nuclear export’. Retrovirology, 11:27. Available at: http://www.retrovirology.com/content/11/1/27 [Accessed 6 February 2019].
Woods MW, Kelly JN, Hattlmann CJ, Tong JGK, Xu LS, Coleman MD, Quest GR, Smiley JR, Barr SD. (2011). ‘Human HERC5 restricts an early stage of HIV-1 assembly by a mechanism correlating with the ISGylation of Gag’. Retrovirology, 8:95. Available at: http://www.retrovirology.com/content/8/1/95 [Accessed 6 February 2019].

Personal Response
How do you plan to further develop your understanding of the antiviral effects of the HERC proteins?
We are currently comparing the antiviral activities of evolutionarily diverse HERC proteins towards HIV and other viruses, such as Ebola virus, to better understand how broad the antiviral activity of HERCs is, and to discover how viruses evade these HERC proteins for survival. We also hope to test these effects in animal models that are more biologically relevant to the human immune system.



Thought Leader

Iridescent:

Disrupting the classroom for the better
The education system has struggled to keep up with the constant evolution of technology. Iridescent, a global non-profit organisation, is revolutionising education by providing programmes that empower underserved children through technology and engineering. With a special focus on underrepresented young girls, Iridescent challenges the negative myths surrounding AI and uses technology to inspire children to become innovators. Research Outreach spoke with Tara Chklovski, founder and CEO of Iridescent, discussing how Iridescent’s goals have become a reality.

It is crucial that children have the technological knowledge and understanding to succeed in today’s digital world. Unfortunately, many education systems fall short in preparing children for the workforce, especially those from underprivileged backgrounds. Iridescent has identified the need to incorporate STEM education, technology and AI into children’s lives at an early stage through cutting-edge programmes and technology. Through these methods, Iridescent hopes to inspire the next generation to become innovators. Iridescent has experienced incredible growth since its inception in 2006; the non-profit now has a global reach. Iridescent’s success is attributable to its clear strategy, its collaboration with industry experts and its ability to stay abreast of technological evolution.


Hi Tara! Can you tell us more about Iridescent in terms of its background, history and core mission?
Iridescent is a global non-profit organisation that provides cutting-edge STEM education to underserved children and their families through two programmes: Technovation and Curiosity Machine. We use technology- and engineering-based programmes to empower the world’s underrepresented young people (especially girls) to become innovators and leaders. Our organisation was founded in 2006 from a desire to help the education system, which is often slow to evolve relative to the rapidly changing technology industry. Since Iridescent’s founding, the non-profit has grown tremendously; we now operate in more than 115 countries and deliver STEM curriculum to more than 100,000 children and their families worldwide. Iridescent is proud to be the first organisation to help young girls in underserved communities learn coding and app-development skills, through our Technovation Challenge. The Technovation Challenge programme introduces the girls to technological concepts in an engaging way. We recently introduced the Artificial Intelligence (AI) Family Challenge as a new part of our popular Curiosity Machine programme.

Tell us about your journey into Artificial Intelligence. What led you to launch an AI-focused programme?
For 12 years, we have been introducing children worldwide to cool new technologies and scientific advances including nanotechnology, mobile computing and robotics. We are attuned to technological trends as well as skilled at identifying the new skillsets which


children, and their parents, will need in order to compete and succeed in today’s evolving workforce. The renowned fear surrounding Artificial Intelligence (AI) made AI a natural next step for our programmes and curriculum. AI is rapidly changing our world and the nature of people’s work. The primary and secondary education systems are not equipped with the knowledge, information or resources to respond quickly enough to prepare children, or even adults, for the workforce. It is this problem that Iridescent is working to solve. We established the AI Family Challenge with the goal of empowering and equipping children from underserved communities with the knowledge they need to play a key role in innovation. Our goal is to show children that this field is accessible and that they can play a pivotal role in the industry from the outset. Another of our aims is to show these children that innovation is a viable and fulfilling career path.

Left: A father and daughter in New Orleans build circuits as part of their self-driving car game, learning basic artificial intelligence concepts. Right: Children in Somalia get familiar with materials they’ll use as part of Iridescent’s AI Family Challenge.

A group of mothers and daughters work together on a robotic arm at a Phoenix area school.

What was Iridescent’s impact in 2017, and what are your goals for this year? Is Iridescent on track to complete the goals set in the 2015-2019 strategic plan and Theory of Change?
In 2017 alone we reached five of the six goals outlined in our 2015-2019 strategic plan. Notably, we increased participant reach by nearly 40% to 35,000 annual programme participants, and we partnered with more than 7,000 organisations worldwide. We were very proud of effectively scaling for participant and partner growth whilst simultaneously decreasing our average costs to $10 per contact hour. Our organisational growth and increased global reach in 2017 demonstrate that families, community partners, and corporate funders continue to turn to Iridescent as both a provider and a partner. Stakeholders such as these trust Iridescent to deliver high-quality STEM education programmes that not only teach but also excite underserved youth.

In 2018 we are making greater strides towards our 2015-2019 strategic plan. We are taking steps that will help us measure, as well as share, the impact of our programmes on participants, mentors, partners, and educators more effectively. Furthermore, we have committed to increasing the transparency of our impact reporting. Our proudest achievement in 2018 was the launch of the AI Family Challenge. The programme received a wonderful response from families, educators, professionals, and funders.

Can you tell us more about the two programmes, Curiosity Machine and Technovation? What impact have they had so far?
Our two primary programmes, Technovation and Curiosity Machine, introduce underserved communities to cutting-edge technologies. Through our detailed yet accessible curriculum, we equip and empower kids, their families, and those mentoring or coaching them to apply the knowledge they have learned to solve real-world problems. This year we launched a new initiative, the AI Family Challenge. The curriculum teaches children aged 8-15, and their parents, about AI technologies. Families are guided through the process

www.researchoutreach.org

127


A mentor works with Technovation students at the annual Technovation World Pitch, where girls from around the world pitch apps they develop to solve community problems.

of creating an AI-based product that solves problems that face society including transportation, health, the environment and education. Engineering and AI-knowledgeable mentors support participating families. These mentors are rigorously trained to ensure participants have a positive experience that is both informative and engaging. Technovation Challenge encourages entrepreneurship in girls (aged 13-18)

and evaluating insights from our programme participants, mentors and educators are critical to understanding our programmes’ effectiveness. The evaluation process has also helped us to identify ways to improve and enhance our programmes. We have learned many things across our programmes, most notably we’ve discovered: • Children report having a better understanding of science and engineering (74.8%) and are more

Iridescent has some very well-known supporters and partners – why are they so important to Iridescent’s success? We are incredibly fortunate to have a wide variety of partners who are as passionate about our mission to bring STEM education to underserved children and families through Iridescent. While our corporate partnerships take many forms, we have found some of the best relationships are designed around skills-based volunteering opportunities. Through these types of partnerships, corporations encourage

Our aim is to ensure that we consistently deliver interesting and relevant information that is both educational and engaging. by challenging them to identify a problem in their community. The girls are then tasked with developing a mobile app and a start-up business that solves the identified problem. The girls collaborate with mentors to learn the skills necessary to bring their idea to life. Our cutting-edge curricula, teaching strategies, lesson plans, and mentor training materials for both programmes are freely available to educators, parents and mentors worldwide. What has your research shown on how have these programmes failed and succeeded? We have found that gathering

128

www.researchoutreach.org

interested in science at school (74.8%); • Parents report having a better understanding of science and engineering (77.8%) and that they will read more science books with their children (88.9%); and • College-level student instructors reported learning practical skills like critical thinking, creativity, public speaking, and collaboration. We are excited about the impact our programmes have on our audiences. We are committed to applying a datadriven approach to continually improve our operations. Our aim is to ensure that we consistently deliver interesting and relevant information that is both educational and engaging.

tech-savvy employees to share their knowledge by educating youth about topics ranging from self-driving cars, to robotics and biomechanics. These mentors play a crucial role in helping families learn about complex technology concepts in engaging and accessible ways. We’ve also found that this method of volunteering is a powerful tool to help engineering companies retain employees, particularly women. What are the major challenges in making AI more accessible to local communities? How do you overcome these challenges? There are multiple factors that impact people’s acceptance of AI. People’s perceptions differ and views can range


Thought Leader

Families in Ethiopia gather to learn about artificial intelligence through Iridescent’s AI Family Challenge.

from confusion about what AI is, how AI can fit into our daily lives and how AI will disrupt the workforce. There is fear that AI will take away many jobs, especially in underserved communities. We often talk to parents who are concerned about how they can provide their children with the necessary tools and skills to succeed in today’s digital world. For underserved families, in particular, there is often limited access to STEM education beyond what they find in the classroom. In fact, according to a recent study we commissioned, only 36% of children receive technology education outside of their schooling. We want to help children and their parents feel confident and optimistic about their family’s future in a world filled with new technologies. We aim to build their confidence by giving families a handson experience with AI to demystify the technology and remove the negative perceptions around it. How can scientists and engineers get involved with your programs and outreach? There are many ways scientists and engineers can involve themselves in our programmes either individually, or through their employer. We regularly work with technology professionals to invent challenges for students based on the professional’s line of work. This could include anything from driverless cars to robotics. Another method we use is encouraging professionals to share their story via video

such as teaching families about a topic within their area of expertise or to inspire children and families to tackle challenges within our programs. Professionals can also make a difference by acting as mentors, either online or within the community. Mentors often express to us that their mentoring not only changes a child’s life but also can have a transformative effect on their own life. For example, the experience can teach the mentor how to communicate complex concepts more effectively whilst working in their own professional capacity. Iridescent is a 501(c)3 registered nonprofit organisation, and you clearly value transparency by openly publishing your financial documents on your website – has this received a positive reaction?

many other industries in multiple ways. We are curious about how Virtual Reality combined with AI will result in unique learning journeys for students. Iridescent has the potential to completely disrupt learning as we know it. In five years, I hope Iridescent will be the leader in AI-education for both children and parents. Iridescent will continue to introduce collaborative platforms to families worldwide. Through our platform, families can create disruptive products while embarking on exciting learning journeys in technology. To find out more about Iridescent’s mission and their programmes to help young people develop, please visit their website at http://iridescentlearning.org.

We believe transparency is very important and we consistently receive positive reactions across all our stakeholders for our openness. In fact, for 2018, we changed our reporting policies so that we are now analysing and sharing our key learnings and programmes impact data on a quarterly basis. What does the future hold for Iridescent? Where do you see Iridescent in the next five years? We are always evaluating new technology trends and identifying ways to connect underserved families with industry experts who are changing the ways we live, work and play. One of the fields we observe very closely is the world of gaming. The gaming world is usually ahead of

For more information about Iridescent and its programmes, contact info@iridescentlearning.org



COMMUNICATION

Accidental Science!
Many of the world’s most incredible discoveries came about when someone found something they weren’t actually looking for. They weren’t quite true accidents: each finding was made by a discerning individual who followed through on their observations, turning the unexpected into something useful. But chance certainly played a role. Let’s take a look at some of these serendipitous innovations.

In 1879, the artificial sweetener saccharin was discovered by a Russian chemist who forgot to wash his hands. Constantin Fahlberg of Johns Hopkins University was trying to discover new uses for coal tar. After a long day in the lab, Fahlberg sat down to eat his sandwiches. Having forgotten to wash his hands, he noticed that they tasted incredibly sweet. Heading back to the lab, he (bravely) tasted some of the chemicals he’d been working with. An experiment combining o-sulfobenzoic acid with phosphorus chloride and ammonia had produced the sweet substance. Naming it saccharin, he patented it five years later, and mass production began. Around 400 times sweeter than sugar, the artificial sweetener became widespread when sugar was rationed during World War I. Research later showed that the body can’t metabolise saccharin. The ultimate low-calorie sweetener, today it’s also a popular sugar substitute for diabetics.

THE MIGHTY POST-IT NOTE
The wonder that is the handy sticky note exists because 3M research engineer Dr Spencer Silver failed to make a strong, tough adhesive. What he actually discovered were sticky microspheres – strong enough to hold papers together, but weak enough to peel apart easily. Recounting his finding, Silver said: ‘At that time we wanted to develop bigger, stronger, tougher adhesives. This was none of those.’ No application for this light adhesive was apparent until several years later, when Silver’s colleague Arthur Fry used it to hold bookmarks in his hymn book while singing in his church choir. Partnering with Silver, the pair developed the product. Launched in 1980, it was an immediate success. Today, more than 50 billion Post-it Note products are sold every year.

CHANGING MEDICINE FOREVER
Arguably the most famous accidental finding of all, Sir Alexander Fleming’s discovery of penicillin changed medicine forever. Experimenting with the influenza virus in 1928, Fleming left his laboratory at St Mary’s Hospital, London, for a two-week vacation. He returned to find that a mould had developed on an accidentally contaminated Staphylococcus bacterial culture plate. On closer inspection, he noticed that the mould prevented the growth of the Staphylococcus. The result? The discovery of the first naturally occurring antibiotic. Severe and life-threatening illnesses like meningitis and pneumonia became treatable. Fleming later said: ‘One sometimes finds what one is not looking for. When I woke up just after dawn on Sept. 28, 1928, I certainly didn’t plan to revolutionise all medicine by discovering the world’s first antibiotic, or bacteria killer. But I guess that was exactly what I did.’ Alexander Fleming, Howard Florey, and Ernst Boris Chain were jointly awarded the Nobel Prize in Physiology or Medicine in 1945 for the development of penicillin.

MAGNIFICENT MAGNETRON
In 1945, Raytheon Corporation engineer Percy Spencer was experimenting with a new vacuum tube called a magnetron. During his experiments, he discovered that the chocolate bar in his pocket was melting. Intrigued, Spencer investigated further, aiming the magnetron at other items, including popcorn kernels. When they began to pop, he immediately clocked its potential. He concluded that the heat generated was due to microwave energy, and the microwave oven was born.

Of course, none of these ‘accidents’ would have become the world-changing discoveries they are without the right person there to recognise their value. But what they do show is that innovations can be born out of the unexpected.

Rachel Perrin, PhD, is a science communication writer based in Bristol, UK.




Partnership enquiries: simon@researchoutreach.org Careers and guest contributions: emma@researchoutreach.org www.researchoutreach.org

