The outreach quarterly connecting science with society ISSN 2517-7028 ISSUE 104
FEATURING RESEARCH FROM:
North Carolina State University, American Chemical Society, University of Glasgow, Lancaster University, University of Birmingham and King’s College London, Heriot-Watt University, University of Notre Dame, University of Minnesota, Arizona State University, School of Electrical and Electronic Engineering at Manchester University, Canadian Public Health Association, Dalhousie University, University of Alberta, The Rockefeller University, University of Toronto, Oregon State University, University of Alaska Fairbanks, Des Moines University, Smithsonian Astrophysical Observatory, British Antarctic Survey, Louisiana State University, Christian-Albrechts-University, University of California.
COLLABORATE DISSEMINATE ENGAGE
Research Publishing International offer a completely barrier-free publishing portal. We have a multi-media presence and readership, through both digital and physical print copies of Research Outreach magazine, and provide online hosting of research articles through feature webpages and downloadable PDF documents. We abide by the Creative Commons (CC) license terms to ensure widespread, open-access dissemination of all the work featured across our various platforms.
An important factor in assisting research teams to maximise their exposure is the use of modern social media techniques. Combined with traditional digital and physical distribution of our publications, we engage heavily with the wider community through the use of various social media channels. RPI has over 30 years of collective expertise in science communications. Our know-how ensures that we work efficiently and cost-effectively, boosting the impact of your research globally.
www.researchpublishinginternational.com
Partnership enquiries contact: simon@researchoutreach.org
Careers and guest contributions contact: emma@researchoutreach.org
RESEARCH OUTREACH ISSUE 104
The public outreach magazine for the research community
WELCOME TO ISSUE 104
Across the globe, a wealth of valuable research is conducted every day. Sadly though, this vital work isn’t always visible to the public.

Dr Donna Nelson of the American Chemical Society has made it her mission to change the public perception and appreciation of science. Whether as a leading figure in the research community or as scientific advisor to Breaking Bad, Dr Nelson’s commitment to increasing awareness of the value of science is clear.

As Chair of the Canadian Public Health Association (CPHA), Dr Suzanne Jackson champions a public health perspective in important public policy discussions that affect the health and well-being of Canadians. She talks to us about the institution and highlights the key challenges to come.

Professor Mike Meredith is tackling a significant global challenge head on. As science leader at the British Antarctic Survey, he investigates the role oceans play in slowing climate change by absorbing carbon and heat. A key tool for his work? The underwater vessel named Boaty McBoatface by the British public.

Covering work from researchers in fields as diverse as Physical Sciences, Health & Medicine, Earth & Environment and Biology, we’re sure this issue of Research Outreach has something to interest everyone.
THIS ISSUE
Published by: Research Publishing International Ltd
Publisher: Simon Jones simon@researchoutreach.org
Editorial Director: Emma Feloy emma@researchoutreach.org
Operations Director: Alastair Cook audience@researchoutreach.org
Editor: Hannah Fraser hannah@researchoutreach.org
Designer: Craig Turl
Project Managers: Kate Cooper (Senior) kate@researchoutreach.org, Tobias Jones tobias@researchoutreach.org, Ben Phillips ben@researchoutreach.org, James Harwood james@researchoutreach.org
Contributors: Charlotte Adams, Alex Davey, Ingrid Fadelli, Kate Feloy, Sarah Henton de Angelis, Rebecca Ingle, Barney Leeke, Owen Leigh, Karen O’Hanlon Cohrt, Kate Porter, Ila Sivarajah, Paul Smith, Jacek Krywko
/ResearchOutreach /ResOutreach
Copyright © and ™ 2018 by Research Publishing International Ltd
Please feel free to comment or join the debate. Follow us on Twitter @ResOutreach or find us on Facebook: https://www.facebook.com/ResearchOutreach/
CC BY This work is licensed under the Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ or send a letter to Creative Commons, PO Box 1866, Mountain View, CA 94042, USA.
CONTENTS

06 UNDERSTANDING WHY MATERIALS FAIL
Dr Srikanth Patala
The impact of ‘grain boundaries’ on the structural integrity of many materials.

10 ACS: TAKING CHEMISTRY TO HOLLYWOOD
Dr Donna Nelson
Aiming to change the public perception and appreciation of science, particularly chemistry.

14 DESIGNING CATALYSTS BIT BY BIT
Prof Roy L. Johnston and Dr Francesca Baletto
Using novel computational approaches to understand and design nanocatalysts.

18 CLEANER PATHWAYS TO CHEMICAL SYNTHESIS VIA NEW GENERATION CATALYSTS
Professor Mark Keane
Developing methods for clean chemical production from renewable feedstocks.

22 SEEING THE STRUCTURES OF MOLECULES: INSIGHTS FROM NMR AND INDUSTRY
Professor Anthony Serianni
Experimental approaches to understand and identify the 3D structures of compounds.

26 CONFIGURING NEW BONDS BETWEEN FIRST-ROW TRANSITION METALS
Professor Connie Lu
Synthesising the first mixed-metal complex containing only first-row transition metals.

30 IN SEARCH OF SUPERMASSIVE BLACK HOLE FEEDBACK
Professor Evan Scannapieco
Investigating why the largest galaxies in the universe have become dormant.

34 OFFSHORE WIND POWER’S BIG BREAK
Professor Mike Barnes
Designing novel circuit breakers for offshore power networks.

38 MAKING THE INTERNET A SAFER PLACE
Dimitrios Pezaros & David Hutchison
Developing new technologies to fight against cyber threats.

42 CPHA: THE HEART OF CANADIAN PUBLIC HEALTH
Dr Suzanne Jackson
Advocating for public health at all levels of government.

46 A NEW CLASS OF ANTIBIOTIC DRUGS
Professor Chris McMaster
Developing these drugs could save lives and reduce healthcare costs.

50 PREVENTING AND TREATING COMPLICATIONS OF HEART FAILURE AND FABRY DISEASE
Dr Gavin Y Oudit
Developing ground-breaking therapies to treat genetic and non-genetic cardiovascular disorders.

54 RAISING ANTIBODIES AGAINST PROTEIN COMPLEXES
Dr John LaCava
Identifying interactions between transcription factors and other macromolecules.

58 IMPAIRED THEORY OF MIND ASSOCIATED WITH VERY PRETERM BIRTH – AN INVISIBLE HANDICAP
Dr Margot Taylor
Investigating how very preterm (VPT) birth impacts social cognitive function.

62 UNRAVELLING THE SIGNALLING CUES CONTROLLING VERTEBRATE REPRODUCTIVE BEHAVIOUR
Dr Karen Maruska
Investigating how animals process and translate multisensory social cues into context-specific behaviours.

66 EXTINCT GIANTS, A NEW WOLF AND THE KEY TO UNDERSTANDING CLIMATE CHANGE
Dr Julie Meachen
Re-opening excavations at Natural Trap Cave (NTC) in North America.

70 SUPERVOLCANO FORENSICS: UNRAVELLING THE MYSTERIES OF THE EARTH’S BIGGEST NATURAL CATASTROPHE
Professor Shanaka de Silva
Revealing the secrets of ‘supervolcanoes’ and supereruptions.

74 THE SUN’S CROWNING GLORY: OBSERVING THE CORONA
Edward E. DeLuca
Exploring the outer atmosphere of the Sun during the Great American Eclipse.

78 BAS: INVESTIGATING ICY WATERS WITH BOATY MCBOATFACE
Professor Mike Meredith
Understanding how oceans store carbon and heat to slow down climate change.

82 ONE OCEAN, MANY MINDS: COLLABORATIVE SCIENCE IN THE ARCTIC
Professor Igor Polyakov
Monitoring climatic changes in the Arctic Ocean.

86 ENERGISING LIFE ON EARTH: THE THIRD WAY
Dr Kirstin Gutekunst
Discovering a new pathway of carbohydrate breakdown in cyanobacteria and plants.

90 UNLOCKING THE CHEMICAL SECRETS OF MICROBIAL CONVERSATIONS
Dr Matt Traxler
Developing a method to study microbes under their natural conditions.

94 COMMUNICATION
Why science must combat sensationalism

“I do not think that people realise how much science, especially chemistry, impacts their lives” – DR DONNA NELSON, Page 10
RESEARCH AREAS
Physical Sciences
Health & Medicine
Earth & Environment
Biology
Physical Sciences ︱ Dr Srikanth Patala
Understanding why materials fail

After receiving a Young Investigator Program award from the US Air Force Office of Scientific Research, Dr Srikanth Patala and his research team at North Carolina State University have studied the impact of ‘grain boundaries’ on the structural integrity of many materials used in critical applications. This includes materials commonly found in jet engine turbine blades, nuclear power stations and internal combustion engines.
Society’s reliance on the properties of key components in critical structures – made up of metal alloys and ceramic materials – is without question. Of course, many of these materials perform well but require scheduled maintenance to detect defects, such as cracks or heat damage, before they become serious or create a situation where a failure would threaten life. The materials used are often polycrystalline, meaning they comprise a structure of many regularly shaped crystals joined together.
For example, metals used in aircraft engine turbine blades rely on a polycrystalline nickel alloy. Cracks in these components can form at high temperature but their rate of formation remains unclear. Evidence suggests that the interfaces between crystals of the material – the grain boundaries – influence how materials will fail. This area has been the subject of Dr Srikanth Patala’s research.

CRYSTALLOGRAPHIES
Dr Patala and his team investigated how crystals come together to form a material and how its structure influences properties like rate of diffusion, corrosion resistance, conductivity, inter-granular cracking, resistance to failure and the impact of extreme environments. The geometry, or crystallography, of the interfaces between the crystals is complex – it is described by three parameters for the misorientation between the crystals and two additional parameters for the orientation of the interface plane. This makes a total of five dimensions needed to define the structure of each grain boundary rigorously. Dr Patala’s team developed tools to visualise the grain boundary misorientations and to represent how the properties of interfaces vary as the geometrical parameters are changed in this five-dimensional space.

UNDERSTANDING STRUCTURE–PROPERTY RELATIONSHIPS
In materials science, predicting how a material behaves requires an understanding of the underlying structure – i.e., how atoms pack together to form the material.
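For the computationally minded reader, the five-parameter description in the Crystallographies section above can be encoded in a few lines. The sketch below is illustrative only (it is not Dr Patala’s software) and assumes SciPy’s rotation utilities; the Σ3 twin used as the example is a standard textbook case.

```python
# Illustrative sketch (not Dr Patala's code): encoding the five macroscopic
# degrees of freedom of a grain boundary -- three for the misorientation
# between the two crystals, two for the boundary-plane normal.
import numpy as np
from scipy.spatial.transform import Rotation

def grain_boundary_descriptor(axis, angle_deg, plane_normal):
    """Return a 5-parameter description: (3 misorientation, 2 plane)."""
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    # Misorientation as a rotation vector (axis * angle): 3 parameters.
    misorientation = Rotation.from_rotvec(np.radians(angle_deg) * axis)
    rx, ry, rz = misorientation.as_rotvec()

    # A unit normal has only 2 free parameters; use spherical angles.
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    theta = np.arccos(n[2])          # polar angle
    phi = np.arctan2(n[1], n[0])     # azimuthal angle
    return np.array([rx, ry, rz, theta, phi])

# Example: a 60-degree rotation about [1 1 1] with a (1 1 1) boundary plane
# (the coherent twin boundary, Sigma-3 in coincidence-site-lattice notation).
print(grain_boundary_descriptor([1, 1, 1], 60.0, [1, 1, 1]))
```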
Above: The polyhedral unit model of the Σ3(1 0 1̄) asymmetric tilt grain boundary is illustrated. (a, b) The atomistic structure and the polyhedral units, along two different views (the tilt axis and the plane normal), are shown. The polyhedral units corresponding to the smallest asymmetric unit (the unit-cell) are highlighted. (c) The unit-cell polyhedral units, along with the overlapping voids and the clusters of atoms that correspond to the polyhedral units, are shown. (d) The unique GB units are the 9-atom unit and the 5-atom dual-tetrahedron. The dual-tetrahedron is split further into two seed tetrahedra. The 9-atom GB unit best fits a 9-atom seed cluster in the data-base with RMSD error of ~ 0.38. The error is calculated using the point-pattern matching algorithm.
“Metals used in aircraft engine turbine blades rely on a polycrystalline nickel alloy. Cracks in these components can form at high temperature but their rate of formation remains unclear”

Computing the atomistic structure of grain boundaries represents a significant challenge, but one that would, if achieved, show how a collection of grain boundaries affects the properties of a material. This was the challenge that Dr Patala set himself and his team: to create a reduced-order mathematical model that predicts how polycrystalline materials will perform, and to discover how grain boundaries impact their ultimate strength, toughness and performance. To overcome the complexity of grain boundary structures, Dr Patala’s team plans to adopt machine-learning algorithms that make use of large, pre-existing databases of grain boundary structure–property relationships.

MODELLING USING THREE-DIMENSIONAL GEOMETRY
An important step in quantifying and understanding how atoms pack at grain boundaries has been made with an algorithm recently developed by Dr Patala’s team. The algorithm describes the structure as a packing of three-dimensional polyhedra (many-sided three-dimensional shapes with flat faces – a cube is a polyhedron, but they can have many more sides). Creating arrangements of 3D polyhedra to model grain boundaries from a disordered set of atoms is complex. Dr Patala chose to use the mathematical properties of the well-established Voronoi network, a method of partitioning space, to automatically identify the network of three-dimensional polyhedra (whose vertices, or corners, represent the atoms) present in the structure of a grain boundary.
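The Voronoi-based identification step can be illustrated in miniature. The sketch below is a simplification, not the published algorithm: it uses the Delaunay triangulation (the geometric dual of the Voronoi network) from SciPy to join atoms into elementary tetrahedral units, whereas the real method goes further and merges these into larger grain-boundary polyhedra.

```python
# Toy illustration (a simplification of the published algorithm): use the
# Delaunay triangulation -- the geometric dual of the Voronoi network -- to
# decide which atoms are joined into elementary polyhedral (tetrahedral) units.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
atoms = rng.random((30, 3))          # stand-in for atomic coordinates

tri = Delaunay(atoms)
print(f"{len(tri.simplices)} tetrahedral units found")

# Each simplex lists the indices of the 4 atoms forming one tetrahedron.
# Larger grain-boundary polyhedra would be built by merging tetrahedra that
# share faces (not shown here).
for tet in tri.simplices[:3]:
    verts = atoms[tet]
    # Volume of a tetrahedron: |det(v1 - v0, v2 - v0, v3 - v0)| / 6
    vol = abs(np.linalg.det(verts[1:] - verts[0])) / 6.0
    print(tet, f"volume = {vol:.4f}")
```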
Dr Patala’s team also developed a pattern-matching technique that allows the polyhedra found in the grain boundaries to be compared to a pre-existing database of hard-sphere packings (the way that atoms arrange themselves in a model material system). This allows for the classification of the types of polyhedra observed in grain boundaries and comparison of their structures.

While Dr Patala’s research has focused on the analysis of grain boundaries in aluminium, his findings are applicable to most metals, ionic solids and some ceramics. However, they are not applicable to organic materials, which have directional bonds that define their material structure. This is a focus of future research in Dr Patala’s research group.

VOIDS MATTER
Using the polyhedral geometries allows for the identification of voids – the space that is unoccupied by atoms – in the grain boundary region. Evidence suggests that the structure of voids, or the free volume, at grain boundaries influences material properties. This is similar to observations in amorphous metals, or metallic glasses – those with a more disorganised structure at the atomic scale. Grain boundaries are disordered, and Dr Patala anticipated that a polyhedral unit model that analysed the void content at grain boundaries would prove beneficial, offering practical applications.
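For readers curious how the comparison with reference packings described above might work in practice, the following minimal sketch computes a best-fit RMSD with the Kabsch algorithm, one standard way to superimpose two point sets. The published point-pattern matching additionally optimises over atom-to-atom correspondences, which is omitted here.

```python
# Minimal RMSD fit in the spirit of the pattern matching described above:
# superimpose an observed atom cluster onto a reference hard-sphere packing
# with the Kabsch algorithm. The correspondence between points is assumed
# known here; the published method also searches over correspondences.
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between point sets P and Q after optimal rotation/translation."""
    P = P - P.mean(axis=0)           # centre both clusters
    Q = Q - Q.mean(axis=0)
    U, S, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T                        # optimal rotation of P onto Q
    diff = (R @ P.T).T - Q
    return np.sqrt((diff ** 2).sum() / len(P))

reference = np.random.default_rng(1).random((9, 3))   # 9-atom seed cluster
rot90 = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]])  # a pure rotation
observed = reference @ rot90.T + 0.5                  # rotated + translated copy
print(f"RMSD = {kabsch_rmsd(observed, reference):.3e}")  # ~0, as expected
```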
“Dr Patala and his team were then able to analyse the voids in the grain boundaries using the three-dimensional polyhedral geometries”

For example, identifying the voids in the grain boundary would help understand how small solute atoms interpose themselves within the grain boundary, influencing the ultimate strength and toughness of structural materials (e.g., through hydrogen embrittlement).

WHY THE RESEARCH MATTERS
Dr Patala’s research has shown that it is possible to generate a coarse-grained geometric description of the structure at grain boundaries, and especially of the atomic packing. The team expect their model to replace the traditional structural unit model currently in use. Grain boundaries that are similar in terms of their crystallography can now be compared. The rigorous mathematical approach, and the comparison of the results obtained with a database of rigid hard-sphere packings, provides a robust basis for
classifying grain boundary structures. It will be possible to use the results in autonomous learning algorithms to discover a fundamental set of polyhedral units at grain boundaries, improving understanding and allowing prediction of properties such as resistance to failure. Void structures can also be classified using Dr Patala’s research outcomes, and it is anticipated that this information will be utilised to identify potential segregation sites for solute atoms at grain boundaries. The developed mathematical model highlights a link between the grain boundary structures analysed and those observed in metallic glasses, paving the way to evaluate grain boundaries using the theories proposed for amorphous materials.
Below: The polyhedral unit model of the Σ11(1 1̄ 3) grain boundary is illustrated. (a, b) The atomistic structure and the polyhedral units (capped trigonal prisms (CTP)) of the symmetric-tilt boundary, along two different views (the tilt axis and the plane normal), are shown. The CTP corresponding to the smallest asymmetric unit (the unit-cell) is highlighted. (c) The CTP and the overlapping voids, along with the clusters of atoms that correspond to the CTP, are shown. Also illustrated is a comparison between the observed and a perfect CTP unit obtained using a point-pattern matching algorithm.
Behind the Bench Dr Srikanth Patala
E: spatala@ncsu.edu T: +1 919 515 3039 W: http://research.mse.ncsu.edu/patala/
Research Objectives
Dr Patala’s research interests include the structure of materials and the analysis of defects, particularly focusing on their interactions in polycrystalline materials, and on design principles that improve the performance of structural and functional materials.

Funding
• National Science Foundation USA (NSF USA), Faculty Early Career Development Program, DMR #1554270
• AFOSR Young Investigator Program, Aerospace Materials for Extreme Environments, Contract #FA9550-17-1-0145
Q&A
Your research focuses on the structure of aluminium and you suggest that it is applicable to other metals, ionic solids and some ceramics. Can you explain why this is so and expand on the types of ceramics where your research is applicable?
The current limitation of the technique arises from the nature of the bonding in materials. Primarily, bonding can be metallic (found in metals), ionic (found in most ceramics) or covalent (in covalent ceramics and polymers). Metallic and ionic bonds are non-directional, i.e. the energy does not depend on the direction or the orientation of the bond. In covalent solids, the bond energy depends on the angles between the bonds (directionality) and hence the geometries are constrained. Therefore, a polyhedral model might not work very well for covalent solids (like carbon, silicon etc.). We are currently developing tools for describing such structures.

You mention machine learning as a potential next stage for your research. Could you elaborate on this?
Representing the grain boundary structure as a combination of polyhedra is one step in understanding the properties of materials.
Collaborators
• Christopher A. Schuh, Danae and Vasilis Salapatas Professor of Metallurgy, Massachusetts Institute of Technology, USA.
• Eric R. Homer, Assistant Professor, Mechanical Engineering, Brigham Young University, Utah, USA.
• Arash D. Banadaki, Postdoctoral Researcher, Materials Science and Engineering, North Carolina State University, North Carolina, USA.

Bio
Dr Patala was awarded his PhD by Massachusetts Institute of Technology (MIT) in 2011, where he studied Materials Science and Engineering.
The next step is to relate the structure to properties (such as diffusion, corrosion resistance, conductivity, etc.) of interfaces. To accomplish this, we are building a database of properties using high-throughput simulations of interfacial phenomena. By using the structural features of the grain boundaries and the data generated, robust structure–property relationships can be constructed using machine learning algorithms.

The mathematics of your model is complex, mirroring the complexity of grain boundary types. Can you explain why you selected the Voronoi network and Delaunay triangulation, and whether any other approach was considered?
I would actually say that the technique of using the Voronoi network for identifying polyhedral units is pretty simple. Voronoi analysis is a technique taught in most undergraduate mathematics classes. We use the properties of Voronoi networks to identify which atoms should be joined together to form the polyhedral units. The simplicity of the technique is also what makes it powerful and applicable to many different grain boundary types and material systems.
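As an illustration of the structure-to-property learning step described here, a hypothetical pipeline might look like the sketch below. The feature names, data and model choice are placeholder assumptions for illustration, not the group’s actual workflow or dataset.

```python
# Hypothetical sketch of the structure-to-property learning step: features
# might be fractions of each polyhedral unit type plus void content; the
# target, a simulated interface property. All data here are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_boundaries = 200
X = rng.random((n_boundaries, 6))   # e.g., unit-type fractions, free volume
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.1 * rng.standard_normal(n_boundaries)

model = RandomForestRegressor(n_estimators=300, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f} +/- {scores.std():.2f}")
```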
He was also given awards for Outstanding PhD Thesis Research by MIT and the James Clerk Maxwell Young Writers Prize by Philosophical Magazine & Philosophical Magazine Letters, both in the same year.

Contact
Dr Srikanth Patala
Materials Science & Engineering
North Carolina State University
3028C Engineering Building I
911 Partners Way, Campus Box 7907
Raleigh, NC 27695-7907
USA
How do you see the results of your work assisting your main sponsor, the US Air Force?
The Air Force Office of Scientific Research is interested in understanding the performance of materials under extremely high temperatures. In these environments, it is the grain boundaries that tend to fail first. Therefore, developing quantitative structure–property relationships of interfaces will not only help us understand how materials fail but will also help design materials that can withstand extreme environments that are of interest to the US Air Force.

It seems that you have improved science’s understanding of grain boundaries in materials. How do you see the research progressing from here?
As I mentioned earlier, computing the structure is simply the first step in understanding material properties. We are developing novel ways to describe this structure that can be used as input to machine learning algorithms. These algorithms will then be combined with high-throughput simulations and experimental measurements to develop the relationships between the structure of grain boundaries and their properties.
Thought Leader
ACS: Taking chemistry to Hollywood

Science is everywhere. From your sofa to your car, pretty much any product you can think of would not exist without the work of science, and chemistry in particular. This fact is often under-appreciated by the public, but Dr Donna Nelson of the American Chemical Society has made it her mission to change this. Following her work as the science advisor to Breaking Bad, she now hopes to continue her Hollywood adventure in the hope of changing the public perception and appreciation of science. We at Research Outreach recently spoke to her about this and much more.
Breaking Bad is regarded by many as one of the best American TV drama series ever. The numerous awards it has picked up over the years are testament to this. For those who have not seen it, the show centres on a school chemistry teacher who turns to drug manufacture in order to fund his own cancer treatment. The science underpinning each episode, especially related to the process behind the production of crystal meth, is fundamental to the show’s success. And it is here that Dr Donna Nelson’s work as the show’s scientific advisor has proved pivotal.

Away from her Hollywood duties, Dr Nelson is also the Immediate Past President of the American Chemical Society. She recently sat down with us at Research Outreach to discuss her role further, outlining why changing the public perception and appreciation of science is so important.

Hello Donna! What does your role as Immediate Past President of the American Chemical Society (ACS) involve?
There are three of us in presidential succession simultaneously, and part of our collective role is to represent the ACS to the public. It is a voluntary position, so the members very much want to hear from us, because we represent them. Within that, each president gets to choose their own projects, related to something they are particularly interested in, or knowledgeable about. For me, I am very interested in communicating science to the public, which is also one of the reasons why I became the science adviser to Breaking Bad.

What impact do you think Breaking Bad has had on the public, with regards to chemistry?
I think it has moved everybody’s knowledge of chemistry forward. I have given hundreds of talks about the show, and at the end there is a Q&A period. Each time there are people in the audience who know more about Breaking Bad than I do, because they became so enthralled with it. One of the things that really amazed me about it was the number of students who would tell me they had become much more interested in science because of it. Some even phoned me up wanting quotations for their own science blogs, which of course I am always happy to provide. It is amazing how many times young people have said that Breaking Bad is what inspired them to start these blogs in the first place, so I am absolutely positive that it has influenced a great number of young people.

I think a lot of people may not have known about the Drug Enforcement Administration (DEA) beforehand either. For instance, the show had assistance from the DEA on what the equipment in illicit meth labs should look like. They also advised which steps to omit from being shown on TV – they did not want it to be a cookbook for making illicit drugs.

“I do not think that people realise how much science, especially chemistry, impacts their lives”

Why is it important to you to improve the presentation of science to the public, and ensure its accuracy on TV?
The main reason is to influence the public to appreciate science. I found that it was Hollywood, the television and the movies, that had a real impact on the public’s perception of science. People watch TV series every week from their own living rooms, so I decided it was Hollywood that we needed to reach out to – but at the time it seemed impossible. That was until I read an article written about the show’s creator, Vince Gilligan, in Chemical and Engineering News – the American Chemical Society’s weekly magazine. He was saying that neither he, nor any of his writers, had any formal science background or education. They were having to research their science content through the web. It was really important to Vince to get the science right, and it was proving to be really difficult. I read that and decided to volunteer my expertise. Breaking Bad was good because it helped to get science out there, but we still have a long way to go. It definitely benefitted me though, because it gave me a peek into how Hollywood operates.

What was your input to the show?
I gave input when they contacted me saying they were putting something into the script involving science. They would send me pages out of the script, asking me to draw chemical structures that they would put on the blackboard, or help them with the pronunciation of certain words or dialogue. I made set visits and was able to actually meet the actors and answer their questions about how scientists talk to each other, how they talk to their students, and what type of person becomes a scientist.

“I am very interested in communicating science to the public, which is also one of the reasons why I became the science adviser to Breaking Bad”

Dr Donna Nelson with Bryan Cranston (left) and Aaron Paul, during a Breaking Bad set visit.

Can you tell us more about the ACS’s background and what the aims of the society are?
The ACS has a four-part strategic plan. The first part relates to its publication unit, which provides the very best chemistry-related information and knowledge-based solutions. A good example of this is the Chemical Abstracts Service (CAS), which assigns CAS numbers to chemical substances. The service is a division of the ACS, and was developed to overcome the limitations of other chemical naming systems. This is important for everybody who manufactures almost anything, because of the CAS number that is used to identify each chemical. Even attorneys and business people know that the CAS number is used to identify chemicals, but they may not know that the number actually comes from the ACS.

The second part is about helping members advance their careers. I am particularly happy with the way that the ACS gives under-represented groups opportunities that one might not be able to get in academia or in the industry where they work, enabling them to get training and learn about leadership.

The third part relates to education. ACS have their own self-certified degree, which students only receive if they take certain courses and fulfil certain requirements.

The fourth goal is to communicate chemistry’s value to the public and to policy makers, including members of Congress. This is very important to me, because when I speak to people I will often say that I do not think the public appreciates scientists or science enough. Most science organisations and the government have done a pretty good job now of increasing the education aspect of science – there is a lot of science built into TV shows and even in our schools, etc. – but I do not think there is necessarily an appreciation for science. The general public are certainly becoming more science literate, and the students that come into my class now are so much smarter than they were 20 years ago, but I do not think that people realise how much science, especially chemistry, impacts their lives. You cannot name anything that does not contain chemicals, except a vacuum. Every single thing – car parts, furniture, carpeting, clothing – contains chemicals and people do not really fully appreciate that. That is why we have that as one of the ACS’s goals – to make the public not just understand science, but genuinely appreciate it.

What impact do you think the ACS has had on advancing the broader chemistry enterprise since it was first established 140 years ago, and are there any accomplishments you are particularly proud of?
I think the ACS has had a huge impact on every single scientific development, but particularly in two broad categories – the Chemical Abstracts Service and SciFinder. SciFinder, also produced by CAS, is a comprehensive database of chemical literature and is a core research tool. A lot of chemists do not understand what has gone into that, but it has been an immense effort.

What developmental goals have you chosen for the ACS while you serve as the society’s primary spokesperson and representative?
When I ran, I told the membership that I would try to accomplish whatever goals they wanted. I also sent emails out to every single member asking them what concerns they had. There were two main ones. One was about jobs and the other was about the public perception of science – which is my area of interest. To combat these, at the next ACS National Meeting we are going to have a symposium on how chemistry and science are presented in Hollywood, on TV and in the movies. For the jobs issue, we created a task force of people to look into it. I am currently in the process of writing up the results.

The American Chemical Society Building, in Washington DC

Under the National Historic Chemical Landmarks program, the ACS grants landmark status to seminal achievements in the history of the chemical sciences and provides a record of their contributions to chemistry and society in the US. For you, which landmark has had the greatest impact?
I really enjoyed visiting the birthplace of the American Chemical Society in Pennsylvania. It all began in a house that once belonged to Joseph Priestley, who discovered oxygen in 1774. It is still there and it has been converted into a museum. Amazingly, his lab is still there as well. On the hundredth anniversary of the discovery of oxygen, a lot of chemists met at his house and it was through a discussion at that anniversary celebration that they decided they needed a chemical society for the US. So, that is where the ACS was actually born – at that meeting and at that house.

From a more personal perspective, your research into organic chemistry and long-term commitment to chemical research has seen you win numerous awards over the years. What does winning these awards mean to you?
It means a great deal. One of the awards which I won early on was a Guggenheim Award, which is very prestigious. I am very proud of that and I think that these things help you in your career. I remember when I was nominated for ACS Fellow I kept telling myself it was okay if I did not win, because I did not want to be disappointed. But then when I was elected to ACS Fellow, it meant everything. I think awards generally are very important – they certainly help one’s career, as they give you credibility in your work.
ACS President Donna Nelson
American Chemical Society 1155 Sixteenth Street, NW Washington, DC 20036 USA E: service@acs.org
W: http://www.acs.org @AmerChemSociety /AmericanChemicalSociety
Physical Sciences ︱ Prof Roy L. Johnston and Dr Francesca Baletto
Designing catalysts bit by bit

Catalysis, using reagents to speed up chemical reactions, is big business. From synthesising new chemicals to cleaning up the exhaust fumes from cars and sustainable energy devices such as hydrogen fuel cells, catalysts play a huge role in our daily lives. Prof Roy L. Johnston and Dr Francesca Baletto, at the University of Birmingham and King’s College London respectively, are using novel computational approaches not just to understand the intricate mechanisms of how such catalysts work but to use this knowledge to design new, more efficient nanocatalysts for particular applications.
Catalysts speed up chemical reactions by lowering the energy required for the reaction to occur. To achieve this, the catalyst forms an intermediate complex with the chemical reactants, to provide a lower energy route to reaction. These intermediate complexes are typically short-lived as, to allow the catalytic cycle to continue, the catalyst needs to be ‘refreshed’ so it can be used in future reactions.

There are many types of catalysts made up of a whole variety of different chemical elements. Some of the most commonly used elements are transition metals because they can interact with a wide variety of chemical species. These elements can also be arranged in a variety of ways to make catalysts with different shapes and structures, which can have a dramatic effect on their efficiency and properties.
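A back-of-the-envelope calculation shows why lowering the energy barrier matters so much. The sketch below applies the standard Arrhenius relation with an illustrative 20 kJ/mol barrier reduction; the numbers are assumed example values, not taken from the research.

```python
# Back-of-the-envelope illustration of why lowering the energy barrier
# matters, using the Arrhenius equation k = A * exp(-Ea / (R*T)).
# The 20 kJ/mol barrier reduction below is an arbitrary example value.
import math

R = 8.314          # gas constant, J/(mol K)
T = 300.0          # temperature, K
delta_Ea = 20e3    # barrier lowered by 20 kJ/mol (illustrative)

speedup = math.exp(delta_Ea / (R * T))   # pre-factor A cancels in the ratio
print(f"rate enhancement: ~{speedup:,.0f}x")   # roughly 3,000-fold
```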
As catalysts play a key role in processes such as the removal of toxic carbon monoxide from exhaust fumes, there is a huge need to develop new catalytic species that are efficient, robust and cost-effective. However, intelligently designing new catalysts, or predicting which materials are likely to be successful in the lab, relies on a deep understanding of the intricacies of catalytic mechanisms – another very challenging problem in itself. Prof Roy L. Johnston and Dr Francesca Baletto, at the University of Birmingham and King’s College London, with their collaborators, are developing and applying computational techniques for exactly this problem, via the design and tailoring of nanomaterials for catalysis.

“Johnston and Baletto aim to tune catalysis at the nanoscale using state-of-the-art numerical and computational techniques”

CATALYSIS BIT BY BIT
Modelling catalytic materials computationally is no mean feat. While computational modelling can be used to predict the outcomes of simple gas phase chemical reactions very accurately, modelling solids, and particularly those containing heavy atoms like metals, offers a number of additional challenges. Prof Johnston and Dr Baletto are also particularly interested in the properties of more unusual materials, known as nanoalloys, which have numerous desirable and enhanced catalytic properties. These nanoalloys are nanoparticles made up of two or more metallic elements.

Computational modelling of a catalytic reaction requires an understanding of the structure of the catalyst itself as well as of any substrates it interacts with. The nanoalloys that Prof Johnston and Dr Baletto study are particularly complex as they come in a variety of sizes. This means that the computational models need to trial a huge number of different configurations and particle sizes to explore which combination of structural features will give rise to the most promising catalysts (see the sketch after this section).

Despite the challenges in constructing accurate models, the advantage of the methods used by Prof Johnston and Dr Baletto is that the calculations provide a level of detail about the catalytic mechanism that it is simply not possible to capture in laboratory experiments. For example, it was experimentally known that introducing additional strain in the layers of atoms at the surface of a catalyst could result in chemical properties different to those of the unstrained surface of the same elements. However, the mechanism underpinning this observation was not understood until the corresponding computational modelling was performed.

Even greater levels of complexity are introduced into the computational models by potential transformations of the structure of the catalyst during a reaction event. While such changes in the shape of the catalyst may be rare events, it is necessary to account for these possibilities in the computational modelling to accurately reproduce experiments. As well as the development of new catalytic materials, this is also driving the development of new computational methods, as part of the TOUCAN project (‘Towards an Understanding of Catalysis on Nanoalloys’), a collaborative network of projects focused on understanding how nanoparticles and nanoalloys can speed up chemical reactions.
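The scale of the configuration-search problem mentioned above is easy to demonstrate. The sketch below counts ‘homotops’, the distinct arrangements of two metals over a fixed set of cluster sites; the term and the combinatorics are standard in the nanoalloy literature, though the 55-atom example and the compositions chosen are purely illustrative.

```python
# Why searching nanoalloy configurations is hard: even at fixed size and
# composition, the number of ways to arrange two metals over N cluster
# sites ("homotops") grows combinatorially.
from math import comb

N = 55                       # atoms in an icosahedral cluster (a "magic" size)
for n_metal_A in (5, 13, 27):
    print(f"{n_metal_A} A atoms in {N} sites: {comb(N, n_metal_A):,} homotops")
# 27 A atoms among 55 sites already gives ~4e15 arrangements -- far too many
# to enumerate, which is why global-optimisation searches are needed.
```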
GREENER ENERGY
One of the key reasons that Prof Johnston and Dr Baletto are so interested in nanoalloys, over traditional substrate-supported monometallic catalysts, is the potential impact they could have in the field of sustainable energy and clean technology. They have studied the impact of nanoalloys on the catalysis of a variety of reactions, including the oxygen reduction reaction (ORR), which is of importance in electrochemical fuel cells, and the oxidation of carbon monoxide (CO). As well as being a toxic pollutant, CO is one of the main poisons of fuel cell catalysts; it decreases the efficiency of the platinum electrodes, which are incredibly costly to replace due to the increasing scarcity of platinum. Unfortunately, there are many existing catalysts for which platinum is an essential component and it has been challenging to find chemical elements that can act as a suitable replacement.
Rational design of nanocatalysts implies finding the best shape for a target reaction. Sampling the energy landscape of nanoparticles in the gas phase (A) and supported nanoparticles (B) and extracting possible structures; (C) calculating the binding energies of a reagent onto the nanoparticles. Here we illustrate the adsorption of O2 onto Pt clusters of ~55 atoms, displaying a cuboctahedral (CO) or decahedral (Dh) shape, which have been deposited onto pristine MgO. Atom colour code: Red refers to oxygen, blue to platinum and green to Mg.
“Johnston and Baletto are searching for descriptors to relate the structures of metallic nanoparticles to their catalytic performance”

However, platinum nanoalloys offer a route to making catalysts that are just as efficient but require much less platinum, by replacing some of the platinum content with a cheaper metal. Prof Johnston and Dr Baletto have been modelling the effects of the degree of substitution in the nanoalloys, for a range of different metals, to see if they can identify which metals, and how much of each one, are required to design novel, highly efficient and selective catalysts for reactions such as CO2 capture and reduction, and O2 reduction.

The work of Prof Johnston and Dr Baletto is helping to provide an unparalleled insight into exactly how catalysts work, as well as leading the development of new, general
computational approaches that can be used to predict the properties of a variety of catalytic species. All of this is helping to drive ‘intelligent’ catalyst design. This will help identify and pre-screen species that are likely to be successful candidates – essential not just as a cost-saving exercise but to accelerate the development of a new generation of catalysts in clean technologies.
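For a flavour of the energy-landscape sampling shown in the figure above, the toy sketch below runs basin hopping on a small Lennard-Jones cluster. This is an assumed stand-in for illustration only: real nanoalloy searches use bespoke genetic algorithms or basin hopping with proper metal potentials or density functional theory, not this simple pair potential.

```python
# Toy energy-landscape search: basin hopping on a 7-atom Lennard-Jones
# cluster (reduced units). For LJ7 the known global minimum is a pentagonal
# bipyramid with energy -16.505; a short run usually finds it.
import numpy as np
from scipy.optimize import basinhopping

def lj_energy(flat_coords):
    """Total Lennard-Jones energy of an N-atom cluster."""
    pos = flat_coords.reshape(-1, 3)
    energy = 0.0
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(pos[i] - pos[j])
            energy += 4.0 * (r**-12 - r**-6)   # pairwise LJ interaction
    return energy

rng = np.random.default_rng(3)
x0 = rng.random(7 * 3) * 2.0                  # random starting geometry
result = basinhopping(lj_energy, x0, niter=200, seed=1)
print(f"lowest energy found: {result.fun:.3f}")
```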
Behind the Bench Professor Roy Johnston
E: r.l.johnston@bham.ac.uk T: +44 121 414 7477

Research Objectives
Prof Johnston and Dr Baletto use computational techniques to design and tailor nanomaterials, focusing on nanoalloys and their catalytic properties.

Funding
EPSRC (Critical Mass Grants EP/J010804/1 and EP/J010812/1)

Collaborators
Other PIs on the TOUCAN Project: Prof Gábor Csányi, Prof Chris Pickard, Prof David Wales (University of Cambridge); Dr Jonathan Doye (University of Oxford). For external collaborators visit: toucan.bham.ac.uk

Bio
Roy Johnston is Professor of Computational Chemistry at the University of Birmingham, UK.
Q&A
What do you think are some of the biggest unsolved problems in catalysis today?
• Developing cheaper catalysts, by replacing or reducing scarce metals such as Pt and other precious metals.
• Knowledge of the atomistic mechanisms, including reaction kinetics and intermediate formation and the effect of the surrounding environment. This will allow the correlation of nanoshapes with catalytic properties, leading to new design rules.
• Modelling nanocatalysis as a dynamic process occurring at high temperatures and pressures, in the presence of various reactants and products. This will be facilitated by the development of new theoretical methods which are able to capture and analyse various elementary steps including the structural, and hence electronic, evolution of the nanosystems.

Do you think nanocatalysts will be the dominant type of catalyst in the future?
Yes, since nanoparticles allow the tailoring of catalytic properties by changing the elemental composition and ordering, shape and morphology, which in turn tunes the electronic properties of the nanoparticle. The global market for nanocatalysts is continually expanding and this trend is forecast to continue for the near and medium-term future. Metallic, multimetallic (mixing precious and abundant metals) and hybrid (e.g., metal-metal oxide and metal-biomolecule) nanoparticles will play an increasingly important role, once the manufacture of highly efficient catalysts, controllable at the molecular scale, becomes feasible at a reasonable cost.
Dr Francesca Baletto
E: francesca.baletto@kcl.ac.uk T: +44 2078482152
He has published approximately 250 journal articles, reviews, books and book chapters. Francesca Baletto is a Senior Lecturer in Physics at King’s College London, UK. She is author of more than 45 journal articles, reviews and book chapters.

Contact
Professor Roy Johnston
School of Chemistry
University of Birmingham
Edgbaston, Birmingham B15 2TT, UK

Dr Francesca Baletto
Physics Department
King’s College London
WC2R 2LS, UK
The combination of numerical and experimental tools is mandatory to achieve the required control.

What do you think are some of the most exciting developments in catalysis that have arisen from using computational modelling in this area?
The most exciting development is the growing idea that chemistry can be tuned by modifying nanoparticle shape, leading to a completely new approach to catalysis which is only feasible at the nanoscale. In this regard, a hot topic is the search for descriptors, which are measurable quantities that link nanoparticle shape and chemical composition to their catalytic activity and selectivity. Finding robust descriptors will open the possibility of developing a dynamical model for nanocatalysis, where the substrate is also mobile. Numerical tools have begun to be used to elucidate the role of the surrounding environment, at least as far as it stabilises particular shapes and chemical ordering of nanoparticles.

What are the main challenges to be overcome for computational methods to achieve ‘chemical accuracy’ in describing catalytic processes?
• To include all aspects of the environment of the catalyst, including the support (usually a metal oxide material), temperature, pressure, solvent (for solution phase catalysis) and any surface-bound but non-reactive ligand species.
• Atomistic methods in combination with machine learning tools can achieve the desired accuracy needed to shape nanocatalysts with the most promising characteristics, based on the use of descriptors which will define proper design rules.
More info: • toucan.bham.ac.uk • www.chem-res.bham.ac.uk/johnston • http://bit.ly/2wgY9f5 • www.balettogroup.weebly.com • www-wales.ch.cam.ac.uk/CCD.html www.ucl.ac.uk/klmc/Hive/ External Collaborators: Professor Dr Rolf Schaefer (Technical University Darmstadt, Germany); Prof Micha Polak (Ben Gurion University of the Negev, Beer Sheba, Israel); Prof Marcela Beltran (UNAM, Mexico); Dr Laurent Piccolo (CNRS, Lyon, France); Prof Roberto D’Agosta (ETSF-Nanobio, EHU/UPV, San Sebastian, Spain); Dr Caetano Miranda (USP, Sao Paolo, Brazil); Dr L. Oliver Paz-Borbon (UNAM, Mexico)
• As density functional theory is at the core of catalysis modelling, the choice of pseudopotentials with the right level of accuracy is important, with an eye to size scalability.
• A dynamical model for understanding chemical reactions in a realistic environment, including atomic mobility within each nanoparticle.
• New numerical tools are needed to understand the assembly of nanoparticles into super-architectures and the influence this can have on catalysis.

Do you think it will be possible to deduce general intuitive chemical models for catalyst design from your work?
Yes, the search for and definition of geometrical descriptors in conjunction with experimental data – e.g., electron microscopy – will open a new way to design nanocatalysts in an intuitive fashion. The use of machine learning is also likely to shed new insights on how catalytic activity depends on nanoparticle shape, size and chemical ordering, leading to new and fascinating routes for nanocatalyst design. Existing optimisation tools can be adapted to explore the pool of candidates which maximise/minimise the geometrical quantities of interest, eventually providing guidelines for the synthesis of these nanoparticles.
Physical Sciences ︱ Professor Mark Keane
Cleaner pathways to chemical synthesis via new generation catalysts

The synthesis and manufacture of chemicals, and chemical products, has typically relied on petroleum-based feedstocks. Even hydrogen, often hailed as a ‘clean fuel’ and an important component in many chemical reactions, is sourced from such feedstocks. However, Professor Mark Keane’s research at Heriot-Watt University is developing methods for clean chemical production from alternative, renewable feedstocks. To achieve this, he is designing and developing new generation catalysts that not only open up numerous possibilities in chemical synthesis but also use alternative sources of hydrogen for such reactions.
Many modern materials and chemicals start their lives as fossil fuels. From plastics to petrol, fossil fuels, such as crude oil and natural gas, are still widely used for the extraction and synthesis of many of the chemical compounds we rely on in our daily lives. The problems with this are obvious. Fossil fuels are finite resources
and there are environmental penalties associated with their extraction and processing. Potentially increasing fuel prices may make it prohibitively expensive to extract the chemical species that are the essential starting ingredients for many synthetic procedures, such as the manufacture of drugs or polymer materials.
Coupled dehydrogenation (of Reactant 1) to give Product 1 with hydrogenation of Reactant 2 to Product 2
Perhaps one of the most surprising compounds to be sourced from fossil fuels is hydrogen. The majority of hydrogen is produced from natural gas in a process requiring very high temperatures that also produces carbon dioxide. Despite the seeming abundance of hydrogen atoms as part of water molecules and hydrocarbons, molecular hydrogen (H2) is actually very rare on Earth as the majority escapes from the atmosphere. However, hydrogen is a key component in one of the most important types of chemical reaction, hydrogenation. Hydrogenation reactions involve the addition of molecular hydrogen to another compound and are usually
facilitated by the presence of a catalyst. These reactions are ubiquitous in industrial synthesis; they are how vegetable oils are processed into fats, such as margarines, or how paraffins and naphthenes are made for use as fuels.

“The majority of hydrogen is produced from natural gas in a process requiring very high temperatures that also produces carbon dioxide”

With few alternative methods for hydrogen production that are not reliant on the use of fossil fuels, Professor Mark Keane at Heriot-Watt
2-butanol dehydrogenation to 2-butanone over nano-scale supported Cu with hydrogen release and use in the selective hydrogenation of furfural to furfuryl alcohol over nano-scale supported Au
University has been finding ways to modify the catalysts used for hydrogenation reactions. The aim is both to reduce, or eliminate, the need for molecular hydrogen in these reactions and to find new ways of exploiting alternative sources of hydrogen that have a much lower environmental impact.

BIOMASS FEEDSTOCKS
One reaction that Prof Keane has been particularly interested in is the transformation of furfural to furfuryl alcohol. Furfural is a molecule derived from biomass, such as corncob and sugar cane, that can be selectively hydrogenated to form highly valuable furfuryl alcohol, used in manufacturing resins, rubbers and adhesives and as a building block for drug synthesis. There are also many chemical derivatives that have numerous applications in polymer synthesis. The ability to obtain such building-block chemicals from non-petroleum sources is of pressing urgency, but one of the problems with furfural hydrogenation and processing has been the use of very environmentally damaging copper chromite catalysts. Prof Keane has developed an alternative approach for this reaction, making use of a specially designed catalyst that not only poses less environmental risk but also reduces hydrogen usage in the process.
Figure: Use of biomass feedstock in coupled dehydrogenation/hydrogenation for the production of an array of high value products. Inputs (wood wastes, food crop residues) supply biomass-derived compounds such as furfural and 2-butanol; dehydrogenation over Cu is coupled with hydrogenation over Au to valorise these into high value outputs (furfuryl alcohol, 2-butanone) for drug synthesis, polymer resins and solvents.

HYDROGEN-FREE SELECTIVE HYDROGENATION
An ideal catalyst is one that selectively promotes formation of the desired reaction product, does so with good efficiency and has the longest possible lifetime before degradation of the catalytic material occurs. For furfural hydrogenation, gold-based catalysts have been used with some success as they have very high selectivity for the desired hydrogenation product.

The major drawback with gold catalysts has been the relatively slow reaction rates, so commercial use of such catalysts also involves using an excess of pressurised hydrogen to compensate for this. The use of highly pressurised gases comes with many safety concerns, and a large proportion of the hydrogen used in this way is essentially wasted as it remains unreacted. Prof Keane has been designing so-called ‘coupled catalysts’ that use a combination of metals, such as gold and copper, as a solution to this problem. These have increased hydrogen utilisation by several orders of magnitude in comparison to the exclusively gold catalysts.

The reason these coupled catalysts are so efficient is that they combine complementary hydrogenation and dehydrogenation reactions on the two metals present. At the gold site, the normal hydrogenation reaction occurs, but at the copper site, a complementary dehydrogenation reaction occurs to produce the hydrogen required to fuel the reactions at the gold sites. All that is required is a chemical that can act as a suitable feedstock for the complementary reaction.

FUTURE OF INDUSTRIAL SYNTHESIS
Already, Prof Keane and his group have been able to successfully extend this approach to developing new generation catalysts for other important reactions to generate amines and imines, which have myriad uses in chemical synthesis. Such coupled reactions offer not just a safer route to large-scale chemical synthesis but also one that allows far more efficient use of hydrogen. Recently, Prof Keane has also been able to demonstrate the use of formic acid, a side product of many biorefinery processes, as a source of hydrogen for such reactions.

Prof Keane’s approach is easily scalable to large-scale, continuous flow chemistry, which is a far more efficient approach to chemical synthesis than the conventional production of single batches. His future work will continue to involve extensive collaboration with industry, tackling chemical synthesis problems of global importance with catalysts that offer routes to far cleaner and greener chemical synthesis and a reduction of our reliance on petroleum feedstocks.
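The rate-matching idea behind the coupling can be reduced to simple arithmetic. In the sketch below, the 1:1 hydrogen stoichiometry follows from the two reactions named above (each 2-butanol dehydrogenation releases one H2; each furfural hydrogenation consumes one); the feed rates themselves are arbitrary example numbers.

```python
# Simple hydrogen balance for the coupling described above (illustrative
# numbers): each 2-butanol -> 2-butanone event releases one H2, and each
# furfural -> furfuryl alcohol event consumes one H2, so full hydrogen
# utilisation requires the two molar feed rates to match 1:1.
donor_feed = 10.0      # mol/h of 2-butanol (arbitrary example rate)
h2_released = donor_feed * 1.0          # 1 mol H2 per mol 2-butanol

furfural_feed = 8.0    # mol/h of furfural
h2_consumed = furfural_feed * 1.0       # 1 mol H2 per mol furfural

utilisation = min(h2_released, h2_consumed) / h2_released
print(f"H2 utilisation: {utilisation:.0%}")   # 80% -- excess donor is wasted
```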
Behind the Bench Professor Mark Keane E: m.a.keane@hw.ac.uk T: +44 (0)131 451 4719 W: https://www.hw.ac.uk/staff/uk/eps/mak.htm
Research Objectives
Professor Keane’s work is directed at clean chemical production using bio-derived feedstocks, with a particular focus on alternative sources of hydrogen for reaction.

Funding
EPSRC
Q&A
How easy are these coupled catalysts to synthesise?
The catalysts for reaction coupling contain a dehydrogenation (e.g., Cu) and a hydrogenation (e.g., Au) metal component on an oxide carrier or support. Catalyst preparation involves standard chemical impregnation or precipitation techniques that are widely employed in a range of applications. Metal loading is controlled by varying metal precursor concentration and pH. Metal particle size and surface composition are tuned by altering activation temperature. Support acid-base and redox properties are important surface characteristics that can determine overall performance.

Do you think this approach will be applicable to all industrial hydrogenation and dehydrogenation reactions?
Coupling endothermic dehydrogenation with exothermic hydrogenation presents critical advantages in terms of thermal efficiency, lowering capital and operational costs. In addition to circumventing use of external pressurised hydrogen gas, the hydrogen generated from dehydrogenation is more reactive, leading to higher rates and increased product throughput. Our work has focussed on a continuous coupling of 2-butanol dehydrogenation (to 2-butanone) with furfural hydrogenation (to furfuryl alcohol) where the two products are readily separated by standard distillation. This is an example of advanced process intensification directed at enhanced energy efficiency and is applicable to a range of industrially relevant dehydrogenation/hydrogenation combinations.

Can you easily predict which metals are likely to be good candidates for such coupled catalysts?
Gold is inactive in dehydrogenation but highly selective in hydrogenation. Copper shows high dehydrogenation activity. In designing coupling catalysts, the first approach is to keep both metals segregated on the oxide carrier with effective transfer of reactive hydrogen from copper to gold sites where hydrogenation proceeds. There are other transition metal combinations where this can apply but this requires parallel testing to ensure conditions where the rate of dehydrogenation (hydrogen supply) matches hydrogen utilisation. One interesting feature is the possibility of supported alloy formation at the nanoscale, which may disrupt or enhance coupling. This is currently under investigation.

What are some of the remaining challenges for using biomass as a feedstock?
Our results establish that reaction coupling harnessing hydrogen from dehydrogenation results in greater hydrogenation rates and full hydrogen utilisation for model reactants derived from biomass. Reaction coupling is still at the developmental stage with fundamental kinetic data required to establish the range of applicable reactants. Any increase in oil prices will drive the demand for biomass-derived platform chemicals. One potential bottleneck is the cost of effective chemical extraction from biomass, which is being addressed in biorefineries. Drawing on technology used in the pulping industry, furfural can now be produced from a variety of cellulosic feedstocks.

Do you think it will be possible to make very general coupled catalysts for a wide variety of chemical reactions?
It is striking that the number of published studies on coupled heterogeneous catalytic dehydrogenation/hydrogenation systems is still so limited. This may be due to the requirement for a catalyst that is active for both dehydrogenation and hydrogenation. Differences in reactivity of the two reactants, the possibility of cross-reaction between the target products and/or reactants and ease of product separation are crucial considerations. At this juncture, we are some way from considering the possibility of a single catalyst for a variety of chemical reactions/products. Work must first focus on bespoke catalysts for particular reactant(s) combinations where catalyst selectivity is paramount.

Bio
Mark A. Keane is Professor of Chemical Engineering at Heriot-Watt University. He was previously Professor of Chemical & Materials Engineering at the University of Kentucky and Senior Lecturer at the University of Leeds. Author of over 190 publications, his research deals with the synthesis/characterisation of new generation catalysts with application in cleaner processing and environmental catalysis.

Contact
Prof Mark A Keane
Room NM2 John Coulson Building
Heriot-Watt University
Edinburgh
Scotland EH14 4AS
UK
Physical Sciences ︱ Professor Anthony Serianni
Seeing the structures of molecules: insights from NMR and industry Many of the essential chemical constituents of life, including carbohydrates, are made up of complex atomic arrangements. Knowing the particular structure of a compound is important not only for identification purposes, but also for understanding how biologically relevant compounds react. Through this knowledge, their biological functions can ultimately be deciphered. Professor Anthony Serianni at the University of Notre Dame has developed a wealth of experimental approaches to understand the 3D structures of such systems, creating widely applicable techniques for structural identification.
Identifying the structure of molecules remains a key challenge within modern chemistry. In many cases, a molecule’s structure has a huge influence on both its chemical and biological activities. For example, carvone, a molecule found in many essential oils and fragrances, can either smell like caraway seeds or spearmint, depending on the orientation of one of the chemical substituents in the molecule. While the two forms of carvone are identical in their atomic composition, it is the physical orientation of one of the groups in the molecule that determines which receptors interact within the human nose and, ultimately, how we perceive the smell.
There are still very few experimental techniques capable of identifying molecular structures, particularly in the solution phase. X-ray crystallography is probably the most famous structural identification technique, with the 1964 and 1962 Nobel Prizes in Chemistry both being awarded to work determining the structure of biochemical substances, including vitamin B12 and globular proteins, respectively. However, X-ray crystallography relies on forming a solid, crystalline sample, which is often technically challenging and not always representative of the environment in which a compound exists in vivo. It was not until advancements in nuclear magnetic resonance (NMR) in the 1970s that it was possible to capture information on the structure of molecules in solution. The underlying physics of NMR spectroscopy is more widely appreciated in the form of magnetic resonance imaging (MRI), a technique used in hospitals to image organs and diagnose diseases. In chemistry, rather than identifying the type of tissue, NMR can be used to identify the chemical groups found in a molecule and their relative connectivities. NMR signals can be translated into chemical structures for a wide variety of molecules, from small molecules to large proteins, either as solids or liquids. Professor Anthony Serianni at the University of Notre Dame is an expert in utilising modern NMR techniques and computational approaches to identify the structures and reactivities of carbohydrates and nucleic acids. One of his approaches is to use ‘isotopic labelling’ to make it easier to differentiate between different regions of these large complex biomolecules. This technique has been so successful that it has been commercialised by his company, Omicron Biochemicals Inc., which provides services to researchers around the world.
(A) Structure of a Man6 hexasaccharide containing eight sites of 13C enrichment (♦) chosen to optimise the measurement of redundant J-couplings across its constituent O-glycosidic linkages. (B) Structure of a mature high-mannose N-glycan, Man9GlcNAc2, appended to protein. The subfragment highlighted in red corresponds to the structure in (A).
SWEET STRUCTURES Saccharides are a large family of molecules, including sugars, starch and cellulose, made mostly from carbon, oxygen and hydrogen. Saccharides are also known as carbohydrates and are ubiquitous in the biological world. Understanding their chemical structures is often key to understanding how they will bind and interact with the receptors in the human body, shaped in such a way that only molecules with complementary structures are able to fit. The highly complex structures of many saccharides make standard NMR techniques very challenging to use. Professor Serianni’s group has found a way to overcome this limitation by developing a range of new methods capable of introducing isotopic labels
into specific sites in the compounds. An isotope is a version of a chemical element that has a different number of neutrons that, in an NMR experiment, acts as a unique flag for the labelled element in the compound. This is very powerful when looking at chemical reactions, as it is possible to follow the position of the label during the reaction to identify the structures of intermediates and end-products. BIG BUSINESS While introducing isotopic labels is a powerful tool to gain more structural information on molecules by NMR, introducing isotopes is a challenging problem, spanning both synthetic and biological chemistry. Fortunately, Professor Serianni and his team have developed a suite of methods that
have transformed the possibilities for the types of compounds that can be precisely isotopically labelled at single sites, multiple sites, or uniformly labelled. These approaches are currently being used to tackle fundamental chemical problems in the Notre Dame laboratory – the same research location that led to the development and expansion of Omicron Biochemicals, Inc. The work done in the Omicron research facility is somewhat different though, instead focusing exclusively on saccharide isotopic synthesis using chemical, biochemical and biological methods. This work has had an enormous impact
worldwide, enabling other researchers to purchase isotopically-labelled compounds for use in their own work. The labs at both Notre Dame and Omicron now work in parallel, utilising different approaches to push the limits of isotopic labelling and the applications of labelled saccharides to address chemical, biochemical and biomedical problems. Without this synergistic effort, tackling fundamental problems encountered in saccharide isotope labelling would have been difficult, and the core technologies underpinning Omicron would never have been developed or would have been developed more slowly.
In the future, Professor Serianni is optimistic that the innovative approaches pioneered in the Notre Dame Lab will lead to new spin-out companies able to exploit these findings and apply them to solve specific problems that impact human health and well-being.
Behind the Bench Professor Anthony Serianni E: Anthony.S.Serianni.1@nd.edu T: +1 574 631 7807 W: http://www.nd.edu/~aseriann
Research Objectives Prof Serianni’s research interests include methods development for site-specific labelling of carbohydrates, conformational studies of simple and complex carbohydrates related to the N-glycans of human glycoproteins by nuclear magnetic resonance, applications of molecular orbital theory to aid in the interpretation of NMR parameters, and structure-function studies of non-enzymic protein glycation. Funding National Science Foundation USA (NSF USA)
Collaborators • Ian Carmichael (University of Notre Dame) • Wenhui Zhang (University of Notre Dame) • Allen G Oliver (University of Notre Dame) • Jaroslav Zajicek (University of Notre Dame) • John G Duman (University of Notre Dame) • Robert Woods (University of Georgia) • Paul Bondo (Omicron) • Shikai Zhao (Omicron) • Qingfeng Pan (Omicron)
Bio Anthony Serianni received a BS in Biochemistry from Albright College, PA. He pursued graduate studies at Michigan State University, earning a PhD in 1980. He later moved to Cornell University for postdoctoral training, and in 1982, joined the University of Notre Dame, where he is currently Professor of Chemistry and Biochemistry. Contact Anthony S Serianni Professor of Chemistry & Biochemistry Department of Chemistry and Biochemistry University of Notre Dame, Notre Dame IN 46556 – 5670, USA
Q&A
What are the most important molecular structures you have been able to elucidate using your techniques? We have focused our attention on the multiple structural elements in saccharides that collectively comprise their structures. The motivation is that, if we can develop more quantitative experimental tools to characterise these elements in solution, these tools can be applied generally to any structure, that is, we seek general applicability and thus greater impact. The research questions are fundamental so that their potential impact can be broad. Specifically, we aim to develop more detailed NMR-based models of many of the key mobile conformational elements found in saccharides, which include: (a) O-glycosidic linkages in oligosaccharides; (b) exocyclic hydroxymethyl group conformation; (c) N-acetyl side-chain conformation; (d) O-acetyl side-chain conformation; (e) hydroxyl group conformation; and (f) furanose and pyranose ring conformation. We have been frustrated with conventional NOE-based and simple J-based methods to characterise these behaviours because they often lead to generally unsatisfying solutions. This situation has led to an over-reliance on MD simulation and related methods to assess these behaviours, yet experimental validation of MD is weak. We contend that a more holistic treatment of NMR J-couplings, which are highly abundant in saccharides, is a potential solution to this problem, and recent studies using circular statistics in conjunction with DFT appear to provide conformational models that can be compared directly to those derived from MD. One of our core research goals is to parameterise all biologically relevant O-glycosidic linkages to enable studies of their conformational behaviours in simple and complex structures. This work hinges on the ability to label target molecules with 13C at one or more sites to allow measurements of JCC values. Recent expansions of this approach to studies
of furanosyl rings, such as those found in DNA and RNA, appear promising; this work transcends typical JHH analyses, as first applied by Altona, Sundaralingam and others, and offers the potential to characterise their conformational equilibria in greater detail and with greater reliability than has been possible for the past 40 years. What are the big challenges ahead in 3D structural determination? The assembly of larger oligosaccharides containing site-specific or multiple labelling with 13C and/or other biologically relevant isotopes presents significant challenges (in contrast, uniform 13C labelling is currently reasonably easy to accomplish using biological methods but often leads to NMR spectral complexity that resists analysis). Unlike oligopeptide and oligonucleotide synthesis, an automated instrument does not yet exist in the commercial sector to achieve assembly in high yields, although people such as Seeberger, Wong, Demchenko and others have been working to solve this problem. In addition, having the ability to incorporate labelled oligosaccharides into proteins (the latter either labelled or unlabelled) to generate chemically pure glycoproteins is still a major challenge; this problem needs to be solved if we want to fully exploit heteronuclear NMR and other methods in conjunction with stable isotopes to interrogate their structures, conformational features, time-dependent motions (dynamics), and biological functions. What are some of the advantages of using NMR for structural determination? Nowadays, techniques such as mass spectrometry (MS) have taken some of the wind out of NMR, being far more sensitive and thus amenable to determinations of primary saccharide structure when sample amounts are limited. Unlike for proteins, X-ray crystallography has not proven as generally useful for saccharide structure work because obtaining high quality crystals of saccharides, especially oligosaccharides, has proven to be challenging and unpredictable.
A hypersurface showing the torsional dependencies of the trans-glycoside J-coupling, 3JH1’,C2, as determined by DFT for an αMan-(1→2)-αMan linkage like that between residues 5’ and 6’ in the hexasaccharide shown in Scheme 1 (A). This 3JCOCH value depends primarily on the H1’–C1’–O1’–C2 torsion angle (φ), with only a minor dependence on the C1’–O1’–C2–H2 torsion angle (ψ).
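The torsional dependence in the figure is the kind of behaviour conventionally captured by a Karplus-type relation. The general form below is textbook NMR; the coefficients A, B and C are pathway-specific and left unspecified here, since in this work they are derived from DFT rather than assumed:

```latex
{}^{3}J_{\mathrm{COCH}}(\phi) \;=\; A\cos^{2}\phi \;+\; B\cos\phi \;+\; C
```

Measuring several redundant J-couplings across the same linkage, as in the 13C-labelled hexasaccharide shown earlier, constrains the torsion angles far more tightly than any single coupling could.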
Neither MS nor crystallography, however, provides information on saccharide solution behaviour, the biologically relevant state in many cases, and in this regard NMR still reigns supreme. Information on solution conformational equilibria and dynamics evolves from quantitative studies of different types of NMR parameters, giving, for example, a wealth of data on motions occurring on different time scales. These equilibria and dynamics properties are often closely associated with biological function. Did you face any difficulties creating this spin-out company from your fundamental university-based research? Omicron Biochemicals Inc. was founded in 1982 at a time when faculty-inspired companies were less common and less accepted in academic circles than they are today. In practice, founding and sustaining a company while functioning largely as a faculty member has proven challenging not in a technical sense, but rather in a social-psychological sense – the age-old question of which master do you serve? It is important to answer this question clearly, to be self-aware and cognizant of the importance of answering this question honestly. Omicron was founded for two main reasons: (1) there was a void in the scientific community that the company filled by widening the range of compounds that could be labelled in an affordable manner and on scales that enabled diverse applications; and (2) the company serves as a key research resource for academic studies at Notre Dame that require access to labelled compounds, without which the projects would be difficult if not impossible to undertake in a timely manner. Any success I may have achieved in academic research I attribute in significant measure to the unique opportunities offered by Omicron as a partner in the work. Do you think it will become more common in the future to see spin-out companies developing from university research groups? Yes, I do. Science, or specifically the conduct of science, is becoming increasingly codified. There is significant outsourcing of scientific work in most labs, and this trend is likely to continue over time. If you need to express a protein, you hire a company to do it. Scientific “kits” can be purchased to expedite specific kinds of lab work that would have taken considerable time and treasure previously. Perhaps equally important is that doing basic research might be less costly in a company environment than in an academic one, indirect costs being what they are. In the future, I would not be surprised if federal funding agencies increase their research support for companies engaging not only in applied research, but also in research at the fundamental level. Should this come to pass, academic institutions will need to redefine their relationships and regulations with respect to faculty entrepreneurs to ensure that the core values of both entities are protected and potential negative impacts on these values minimised.
Physical Sciences ︱ Professor Connie Lu
Configuring new bonds between first-row transition metals Transition metals are some of the most important elements in the Periodic Table for their wealth of applications, spanning catalysis to biology. The rich chemistry of the transition metals arises from their remarkable ability to form multiple chemical bonds, a process that is still not fully understood and remains a major challenge in fundamental chemistry. Professor Connie Lu at the University of Minnesota has been tackling this question by synthesising the first mixed-metal complex containing only the first-row transition metals to explore the fundamental nature of a chemical bond.
The transition metals are a series of elements mostly located in the middle of the Periodic Table. This includes elements such as cobalt, iron, and manganese. What makes transition metals so interesting, and gives rise to the thousands of different ways in which they can react, is their electronic structure, or the arrangement of the electrons in the element. As chemical bonding is all about electron sharing between two different chemical species, the electronic structure of an element dictates how an element will react with other species and how many and what type of chemical bonds it can form. Transition metals typically have a large number of electrons available for chemical bonding, but also space to receive electrons from possible ligands, molecules which bind to the central metal atom to form the final metal complex. These ligands can vary in size, from single atoms, to highly complex
molecules composed of ten or more atoms. Ligands can be diverse not just in their chemical structure, but also in how they attach to the metal centre. Some ligands just bind directly to the metal whereas others bind through multiple sites on the ligand, or in the case of metal complexes with more than one metal centre, the ligands can act as a bridge between the centres by binding the two metals together. It is the variety and the complexity of this chemical behaviour that makes Professor Connie Lu at the University of Minnesota so fascinated by transition metals and their associated ligands. With her research group, she has designed over fifty new metal-ligand complexes, with a variety of different metal centres. Now, she wants to use this knowledge not just to address fundamental questions in chemistry, but to design next-generation catalysts for chemical
synthesis and carbon dioxide activation, an important step in the development of sustainable energy technologies.
Building metal-metal bonds can be like building blocks – chemical bonding to chromium (Cr) can change from quintuple to single by varying the other light transition metal partner: Mn–Cr has a bond order of 5, Fe–Cr 3, Co–Cr 2 and Ni–Cr 1.
DOUBLE-DECKER METALS Professor Lu’s research predominantly focuses on the lighter transition metals. At present, many industrial processes that rely on catalysts make use of the heavier transition metal elements. However, the heavier transition metals are significantly less abundant on Earth and there is the imminent danger of shortages of many of these elements.
Bimetallic complexes can also deliver "ready-made" bimetallic active sites onto a solid support, as shown for the metal-organic framework material, NU-1000.
This is why Professor Lu wants to find ways to exploit the properties of the more common light transition metals, to replace their heavier, scarcer counterparts. One series of complexes that Professor Lu’s group have been making are dicobalt complexes, two cobalt centres joined together by some unusual ligands, also designed by her group, called ‘double-decker’ ligands. These ligands are named as such because they act as scaffolds for stacking together two metal centres, allowing the metal centres to form multiple bonds with each other. The more bonds formed between two species, the shorter and stronger the bond, but such multiple bonding is very rare between two metals of different kinds. The design and properties of these ligands have allowed Professor Lu to achieve several world-firsts by making a molecule with triple bonds between two different light transition metals, iron and chromium, and a quintuple bond between manganese and chromium. While quintuple bonds are rare in themselves, the manganese-chromium complex remains the only example of
a quintuple bond between two different metal centres. THE NITROGEN PROBLEM Making bonds between metal centres of different transition metals is a powerful tool for exploring the chemistry and bonding of transition metals, and forming an extensive library of such compounds is important to further our understanding of catalysis design and synthesis. However, such compounds are not so commonly found in nature, whereas bimetallic complexes, where the two metal centres are the same transition metal, are found in many enzymes, nature’s catalysts for biological reactions. Professor Lu has found that one of the dicobalt species her group has synthesised is an incredibly efficient catalyst in turning nitrogen (N2) into N(SiMe3)3, a silylamine. Nitrogen is an incredibly inert species that makes up most of our atmosphere, so getting it to react to form new compounds is very difficult. However, amines are very valuable chemicals, with a global market worth billions of dollars.
The difficult conversion of nitrogen to silylamine, when catalysed by the dicobalt complex, can occur at room temperature and at ambient pressure of nitrogen. Theoretical calculations predict that the hardest step is the formation of the second N-Si bond, which is shown on the right.
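For orientation, the overall transformation behind this caption is usually written with an external reductant supplying electrons and trimethylsilyl chloride supplying the silyl groups. The stoichiometry below is the generic one for catalytic N2 silylation, not an equation taken from Professor Lu’s papers:

```latex
\mathrm{N_2} \;+\; 6\,e^{-} \;+\; 6\,\mathrm{Me_3SiCl}
\;\xrightarrow{\;\text{dicobalt catalyst}\;}\;
2\,\mathrm{N(SiMe_3)_3} \;+\; 6\,\mathrm{Cl^{-}}
```

Each N2 molecule must accept six electrons and form six N–Si bonds in total, which helps explain why one of those bond-forming steps ends up being the computational bottleneck.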
They are widely used in the pharmaceutical industry for drug manufacture, as well as for agricultural uses and in water purification. The possibility of being able to use nitrogen, rather than more polluting chemical synthesis routes, as a way of manufacturing these chemicals would be a huge development towards more sustainable chemical synthesis. LOOKING AHEAD Given the success of Professor Lu’s group at making many metal-metal compounds with a variety of transition metal species, the plan is to find other exciting applications of this chemistry. The bimetallic molecules made by the group can be used to install the metals
in solid supports that can later be used in catalysis. By using the bimetallic molecules as a delivery vehicle for the metal centres, it should be possible to make a greater variety of catalytic supports and develop new systems that make use of the more abundant lighter transition metals. By bringing together chemical identification techniques used in other areas of chemistry, Professor Lu also hopes to further characterise the unique bonding in the compounds synthesised in her group and develop models to further our understanding of how different metals bond and interact.
Behind the Bench Professor Connie Lu
E: clu@umn.edu T: +1 612 625 6983 W: http://www1.chem.umn.edu/groups/lu/
Research Objectives Professor Lu’s lab seeks to develop homogeneous catalysts for converting abundant small molecules, such as N2 and CO2, into useful chemical feedstocks, such as ammonia and methanol, respectively. The group is interested in creating, understanding and exploiting new chemical bonding.
Collaborators • Professor Laura Gagliardi, University of Minnesota • Dr Eckhard Bill, Max Planck Institute for Chemical Energy Conversion • Collaborators in the Inorganometallic Catalyst Design Center, University of Minnesota, http://www1.chem.umn. edu/icdc/
Funding • National Science Foundation (NSF) • US Department of Energy, Basic Energy Science
Bio Connie Lu received a BS in Chemistry from MIT and a PhD from Caltech with Jonas Peters. She was a Humboldt postdoctoral fellow at the Max Planck Institute for Bioinorganic Chemistry with Karl Wieghardt. In 2009, she began independent research at the University of Minnesota, where she is currently an Associate Professor.
Contact Professor Connie Lu Laboratory of Inorganic Synthesis and Catalysis University of Minnesota 207 Pleasant St SE, Minneapolis, MN 55405 USA
Q&A
What allows the double decker ligands to bring together two different metal centres? Transition metals have certain preferred geometries for binding ligands. We take advantage of the well-known fact that a single metal centre often prefers to form a five-membered ring with ligands. Our double-decker ligand design scaffolds just the right number of donor atoms such that both metals are each part of three approximate pentagons. As I write this, I think it would be fun to ask grade school children to see how many irregular pentagons they can find in one of our molecules. Note: These are irregular pentagons because the sides and angles are not all equal.
Do you think these ligands can be used for all transition metals and pairings? We can use these ligands to study other pairings, for example between
transition metals and main group elements, which are to the right on the Periodic table. This is a new, exciting direction for us because we have found that nickel-gallium pairings make great catalysts for hydrogenating CO2. We are curious about some of the heaviest elements, the rare earth metals, which reside at the very bottom of the Periodic table. While our current ligands are too small for these metals, we have designed new ligands which we hope will allow us to explore bonding in the nether regions of the Periodic table.
Do you think your bimetallic catalysts will be scalable for industrial processes? The bimetallic catalysts contain specialised ligands, which will be expensive to make for an industrial process. Rather, these catalysts are wonderful demonstrations of what light transition metals can achieve together. To be scalable, we will need cheaper methods to place two metals together on a solid support and to find a support that will preserve the unique bimetallic reactivity.
Do you think that it will be possible to replace all heavy transition metal catalysts with lighter metal catalysts? Absolutely, it is just a matter of time. We all want sustainable catalysts, and researchers all over the world are working to replace heavy, precious metals with lighter, earth-abundant ones for all types of catalytic applications. As a community, we are generating knowledge about how to better control the properties of the light transition metals.
What’s the greatest number of chemical bonds you’ve been able to create between two metals? Five. Quintuple bonds are still quite rare and are only known for transition metals. My collaborator Laura Gagliardi has predicted that the maximum number of bonds between two metals is six. Isolating a molecule with a sextuple bond would be spectacular, and it remains an exciting challenge for synthetic chemists.
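For readers keeping score, the bond orders quoted throughout this article are formal molecular-orbital bond orders, a textbook bookkeeping device rather than anything unique to this work:

```latex
\text{bond order} \;=\; \tfrac{1}{2}\left(n_{\text{bonding}} - n_{\text{antibonding}}\right)
```

Two metal centres that pair electrons in all five d-based bonding combinations reach a quintuple bond; adding an s-based sigma interaction on top is what would, in principle, give the sextuple bond predicted as the maximum.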
Physical Sciences ︱ Professor Evan Scannapieco
In search of supermassive black hole feedback
Professor Evan Scannapieco and his team at Arizona State University have been investigating the puzzle of why the largest galaxies in the universe, once the most active, have become dormant and ceased to produce stars. They were one of the first to propose a mechanism that involved colossal feedback from supermassive black holes at the centre of the galaxies and they set out to prove the theory using the Cosmic Microwave Background radiation as a tool.
Astrophysicists deal with distances that are so vast and timescales that are so long that they are difficult for humans to comprehend. The enormity of these scales first came into view roughly 90 years ago through the work of Henrietta Leavitt and Edwin Hubble. Leavitt analysed special stars called Cepheids and discovered that their luminosities were directly related to the periods of their pulsations. Hubble then used this relation and the observed brightness of Cepheids to measure the distances to galaxies outside the Milky Way. He found that they are so distant that their light takes millions of years to reach the Earth. Now we know of galaxies whose light takes over 10 billion years to reach us, revealing them as they appeared in the early universe.
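The logic of the Leavitt–Hubble method can be condensed into the standard distance-modulus relation, quoted here purely for orientation:

```latex
m - M \;=\; 5\,\log_{10}\!\left(\frac{d}{10\,\mathrm{pc}}\right)
```

The pulsation period gives a Cepheid’s intrinsic luminosity, and hence its absolute magnitude M; comparing that with the apparent magnitude m observed from Earth yields the distance d.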
A MATTER OF SCALE The billions of galaxies that fill the universe can be broadly grouped into moderate-sized disks, like our own galaxy the Milky Way, and giant elliptical galaxies with typical stellar masses that are more than ten times greater. By comparing the more distant galaxies, which are observed as they appeared long in the past, with the more nearby galaxies, which are observed as they appear closer to the present, astronomers are able to examine how these types of galaxies evolved over time. To compare to these observations, theoretical models were developed that predicted hierarchical galaxy formation – in which gravity collects material together to form stars and solar systems, like our own. However, some parts of the universe did not fit the existing models: the largest galaxies, once the most active in star formation, had become dormant, with older stars dying without replacements being created. In 2004, Professor Scannapieco and his team set out to determine why this was the case. They proposed a physical model, backed by rigorous mathematics, that suggested an intergalactic feedback mechanism was operating, associated with the supermassive black holes discovered at the centres of elliptical galaxies.
THE AGN FEEDBACK THEORY Nearly fourteen billion years have passed since the Big Bang, the formation point of the universe. During the intervening time, galaxies have been created by the accretion of material from the Intergalactic Medium (IGM), the tenuous material that lies between galaxies. Innumerable galaxies have been formed, each containing countless stars. Some galaxies have merged and formed Active Galactic Nuclei (AGN) at their cores, which result in higher than expected galaxy luminosities that do not come from the stars.
Theory suggests that each of these galaxies contains a supermassive black hole surrounded by a hot disk that is accumulating material from the galaxy to be devoured by the black hole. Not all the material reaches the event horizon – some is ejected perpendicularly, in both directions, from the core and out of the galaxy. These outpourings are of gargantuan proportions and images from radio telescopes show trails larger than the galaxies that host them.
Professor Scannapieco and his team used a mathematical model to propose that the shock of the ejections from the hot disks surrounding supermassive black holes can be felt in the Intergalactic Medium. His team’s research suggested that the gasses and other material within the IGM may be heated such that they are no longer capable of coalescing under the influence of gravity to form stars. Professor Scannapieco refers to this effect as ‘AGN feedback’.
In fact, the team’s equations focus on an increase in temperature that changes the entropy of the IGM, a thermodynamic measure of the availability of a system’s energy to do work. Their findings predict that the AGN feedback mechanism will increase the entropy above a critical value, making it impossible for affected galaxies to cool and form stars by gravitational accretion. This matches the observed universe, as has now been shown by many research groups, some of which have built on the 2004 model, and others of which have employed independently-developed models in which AGN feedback operates more gradually.
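In this context, ‘entropy’ is usually shorthand for the gas-entropy parameter used throughout the galaxy-formation literature; the definition below is the standard convention, not a formula quoted from the team’s papers:

```latex
K \;=\; \frac{k_{B}\,T_{e}}{n_{e}^{2/3}}
```

Heating at fixed density raises K, and once K sits above the critical value the gas can no longer radiate energy away fast enough to cool, collapse and form stars.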
COSMIC MICROWAVES With this variety of AGN feedback models fitting the observations, Professor Scannapieco and his team worked to develop a method to
distinguish between them. The key difference between these possible models is their predictions for the increase in IGM temperature, caused by supermassive black hole feedback. Innovatively, the team decided to use the Cosmic Microwave Background (CMB) radiation, a remnant from the Big Bang, to measure the impact of AGN feedback in its heating of the IGM. The measurements relied upon an established mechanism called the Thermal Sunyaev–Zel’dovich effect. When photons of electromagnetic radiation, like rays of light or microwaves, pass through a region containing higher temperature gasses, some of the photons interact with the gaseous atoms and cause electrons to gain energy and emit photons of a higher frequency. This results in a shift in the wavelength of the CMB over what is expected. It is these changes that can be measured, compared with the ‘normal’ CMB and then, using a mathematical model, converted into estimates of the thermal energy of the IGM. One way that was proposed to do this was to examine the Thermal Sunyaev– Zel’dovich effect around active AGNs. But this was problematic – AGNs are rare and are so bright at many wavelengths that their light contaminates measurements of the microwave background. Instead, the team elected to ‘stack’ many measurements of more common galaxies to enhance the tiny signal and reduce extraneous noise. Professor Scannapieco chose to examine massive elliptical galaxies that were dormant because of the AGN feedback effect the theory suggested. Two separate categories were chosen at two cosmic epochs corresponding to different distances from the observers. For the selected galaxies, data was used from two telescopes: the South Pole Telescope in Antarctica, and the Atacama Cosmology Telescope in Chile.
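For reference, the strength of the Thermal Sunyaev–Zel’dovich distortion is conventionally expressed through the Compton y-parameter. These are standard textbook relations rather than the team’s specific pipeline:

```latex
y \;=\; \frac{\sigma_{T}}{m_{e}c^{2}} \int n_{e}\,k_{B}T_{e}\;\mathrm{d}l,
\qquad
\frac{\Delta T}{T_{\mathrm{CMB}}} \;=\; f(x)\,y,
\qquad
f(x) \;=\; x\coth\!\left(\frac{x}{2}\right) - 4,
\quad x = \frac{h\nu}{k_{B}T_{\mathrm{CMB}}}
```

Because y integrates the electron pressure along the line of sight, stacking many galaxies effectively measures the total thermal energy injected into the surrounding gas, which is exactly the quantity on which the competing feedback models disagree.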
Map of the logarithm of the Thermal Sunyaev-Zel’dovich effect. The large panel is 1.1° on a side and is generated from a simulation including AGN feedback. The smaller panel is 0.55° on a side and is generated from a simulation without AGN feedback. Both maps are for measurements with frequencies well below 160 GHz. The AGN feedback simulation has a higher average Thermal Sunyaev-Zel’dovich effect signal, which is also smoothed out on the smallest scales. Originally published in ‘Measuring AGN Feedback with the Sunyaev-Zel'dovich Effect’, The Astrophysical Journal, Volume 678, Issue 2, pp. 674–685 (2008) https://doi.org/10.1086/528948 © AAS. Reproduced with permission.
Combining measurements from thousands of elliptical galaxies, and correcting the measurements to allow for contamination from other sources of light, delivered a result that was somewhat higher than suggested by hierarchical models that neglected the effect of AGN feedback. These initial measurements made with existing telescopes suggest that AGN feedback
may be at work. Professor Scannapieco and his group are now working with the team building the TolTEC Camera for the Large Millimeter Telescope to make higher resolution measurements, which they hope will prove to be definitive.
Behind the Bench Professor Evan Scannapieco E: evan.scannapieco@asu.edu T: +1 480 727 6788 W: http://scannapieco.asu.edu/
Research Objectives Professor Scannapieco’s research focuses on better understanding feedback processes in galaxy formation and the evolution of the elements across cosmic time. Funding • National Science Foundation USA (NSF USA) Collaborators The original papers we wrote on the topic were in collaboration with: • Prof S. Peng Oh at the University of California, Santa Barbara • Prof Robert J. Thacker at Saint Mary’s University in Canada • Prof Hugh Couchman at McMaster University in Canada. The papers in which we measure the feedback were led by ASU graduate student Alex Spacek, in collaboration with myself, Dr Seth Cohen, Bhavin Joshi, and Prof Phillip Mauskopf. The PI of the TolTEC Camera is Grant Wilson at UMass Amherst, and the ASU lead is Prof Phillip Mauskopf.
Bio Professor Evan Scannapieco studied at Harvard University and the University of California, Berkeley and then worked as a postdoc at Arcetri Observatory in Italy and the Kavli Institute of Theoretical Physics in Santa Barbara. He moved to Arizona State University (ASU) in 2007 and is now an Associate Professor in the School of Earth and Space Exploration.
Contact Professor Evan Scannapieco Arizona State University School of Earth and Space Exploration PO Box 871404 Tempe, AZ 85287-1404 USA
Q&A
The feedback hypothesis was very innovative at the time it was proposed. How did you reach the conclusion that AGN feedback was a possible reason as to why giant elliptical galaxies might be dormant? We had two big hints. The first was that the most massive galaxies stopped forming stars first, a trend that is very hard to explain without a large energy source to suppress gas cooling. The second was that the number of stars that elliptical galaxies contain and the masses of their central black holes are closely related, suggesting that the central black holes might have a way of somehow “telling” their host galaxies when to stop forming stars.
The supermassive black hole feedback mechanism has gained traction since you and your team suggested it. What scientific progress has been made due to the interest you have triggered? Through our work and those of others, black hole feedback has become an essential component in our modern understanding of how galaxies form. At the same time, there are many different points of view on how this feedback may have played out in detail. Thermal Sunyaev–Zel’dovich effect measurements are one of the main tools we have in constraining this process.
Using the Thermal Sunyaev–Zel’dovich effect to measure the energy levels of the IGM has given credence to the ‘feedback’ theory. How will you confirm that your measurements are accurate? Our measurements are accurate but still not very precise, because these are difficult measurements at the limits of current instruments. Still, they are already placing interesting constraints on feedback models. We are closely working with groups building next generation instruments, like the TolTEC Camera, which will open up a new window on the history of gas heating around large galaxies.
Can anything be salvaged from the apparently defunct hierarchical model of the universe? The most massive gravitationally-bound structures in the universe do in fact form hierarchically. While the most massive galaxies were the first to stop forming stars, as more time went by they continued to group together into even larger structures called galaxy groups and galaxy clusters. Today, the largest galaxy clusters contain thousands of galaxies, swarming around each other like bees.
How has our understanding of the universe improved since the start of your research? As an astrophysicist, I am fortunate to be working in one of the most active areas of modern science. Some of the most exciting results since 2004 have included observations of distant galaxies that appear as they were during the universe’s infancy, discoveries of earth-mass planets around other stars, and the detection of gravity waves from merging neutron stars and black holes.
Physical Sciences ︱ Professor Mike Barnes
Offshore wind power’s big break
The World Energy Council states that the capacity of offshore wind generation installed globally was around 12,000 MW by the end of 2015, with over 92 percent of these installations located in European waters. This will require innovative solutions such as the work being done by Professor Mike Barnes and his team at the School of Electrical and Electronic Engineering at Manchester University, on novel circuit breaker design for offshore power networks. Ultimately, this research could help onshore power networks to supply increased amounts of electricity generated by offshore wind turbines and contribute to future low carbon electricity generation.
Wind turbines are anticipated to have a significant role to play in the future energy mix. Electricity generated from renewable sources currently accounts for around 25 percent of UK electricity demand. Current EU targets mean that by 2020, the UK must generate 30 percent of its electricity demand from renewable sources. Expansion of the UK’s offshore wind capacity has been proposed as one means of achieving this target. Development of new and improvement of existing technology is required to support this proposed growth. The research being undertaken by Professor Barnes and colleagues could help in achieving this target. They aim to improve circuit breakers for offshore wind installations that could facilitate greater flows of energy from increasing distances offshore whilst ensuring the protection of existing power networks onshore. Professor Barnes has been heavily involved in several large collaborative wind energy research
projects funded by the European Union since joining Manchester University in 1997. His current research into circuit breakers forms part of a three-year research project funded by the Engineering and Physical Sciences Research Council. HARNESSING WIND POWER A wind turbine can extract up to 60 percent of the power from the wind, but because wind-speed varies, it is necessary to convert the alternating current (AC) output from the wind generator to direct current (DC) before converting it back to AC at a frequency that matches the onshore
power network. Another advantage of this temporary conversion to DC is its potential for power transmission over greater distances with lower cost than is possible for an AC system. This, combined with the configuration of the power system, is an important consideration if turbines are to be located at greater distances offshore in future. Conventional “point-to-point” networks involve each offshore wind generator being connected directly to the power network onshore. With an increased number of installations, this method is anticipated to be prohibited by cost and technology limitations and therefore unsuitable to meet planned future expansion. The alternative option being explored relies upon the DC component of the network comprising a “meshed” grid configuration where connected wind farms share pathways to the AC converter onshore. If a fault develops in the DC meshed network, discrete parts of the network can be isolated whilst power flows are maintained in the remainder of the network.
Without a circuit breaker, a fault prevents the flow of current across the whole circuit. With a DC circuit breaker, a fault can be isolated, allowing current to flow from the remaining wind farms.
One of the key challenges to reliably operating this meshed configuration is how to stop current flowing in the event of a problem. Isolating DC with circuit breakers is more complex than for AC. AC changes direction cyclically, falling to zero twice during a cycle, which provides an ideal opportunity to interrupt current flow. However, DC flows in one direction only, so in a DC system currents must be quickly reduced to zero, which requires a more complex breaker. The pertinence of Professor Barnes’ research is clear: there is a pressing need to develop responsive and economic circuit breakers that quickly stop current flowing if a fault develops within a DC system without detriment to the wider network.
IMPROVING EXISTING PROVISION Should a fault develop within the DC power network, it is necessary to limit the flow of current as quickly as possible to prevent other parts of the network operating beyond their design parameters. This is currently achieved using AC circuit breakers which may isolate the entire DC system, leading to power loss. This would mean that if a future wind installation comprising a network of five wind farms was operating and a connection to one farm developed a fault, all five would have to be isolated and no power would flow from the four working systems. The UK power network operates to certain supply quality standards that state that the maximum power loss the network can bear is 1800 MW. Presently, any such
losses from existing offshore wind farms are within network operating limits, but with up to 30,000 MW of electricity potentially being sourced from offshore wind in future, isolating larger capacities could exceed this maximum. This would lead to severe impacts and, without mitigation technology, could potentially cause blackouts over parts of the UK electricity network. DC circuit breakers are commercially available and may be used to manage faults that develop in the DC network. However, to maintain power flows, these need to isolate the affected area within two to five milliseconds. Currently available technology cannot achieve the fastest of these constraints and so a new development is needed. APPROACH TO THIS CHALLENGE To improve circuit breaking within DC power systems, in addition to thoroughly reviewing the different types of DC circuit breaker currently available,
Professor Barnes’ research team will also investigate the potential of fault current limiters. These devices limit or slow the current increases that may occur if a fault develops but have not been widely used on DC networks. The team will also consider the fundamental principles of physics that underpin the design of existing systems and components while examining the materials from which current breakers are composed with the aim of identifying desirable and unwanted attributes of each. Circuit breaking within a DC power system may be achieved mechanically using switches. However, these are currently limited by their operating
time. Solid-state breakers that use semiconductor technology are an alternative option; they operate more quickly than mechanical breakers but have higher losses, which compromise system efficiency. Hybrid systems which use the best characteristics of mechanical and solid-state breakers show potential but are currently not sufficiently reactive. The outcome of this research will be to propose novel and improved methods for circuit breaking on future DC networks which could in turn support increased development of offshore wind.
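The two-to-five-millisecond window mentioned above can be put in perspective with a back-of-envelope calculation: in a DC link the fault current initially grows at roughly di/dt = V/L. The sketch below uses illustrative numbers, a 320 kV link and an assumed loop inductance, not parameters from Professor Barnes’ project:

```python
# Back-of-envelope: growth of a DC fault current before the breaker opens.
# The initial rate of rise is approximately di/dt = V / L, where V is the
# link voltage and L the inductance of the fault loop.
# Illustrative values only -- not figures from the EPSRC project.

V = 320e3  # link voltage in volts (a common HVDC voltage class)
L = 0.1    # assumed fault-loop inductance in henries

for t_ms in (2, 5):  # the breaker operating window quoted in the article
    current_kA = V / L * (t_ms * 1e-3) / 1e3
    print(f"after {t_ms} ms the fault current has grown by ~{current_kA:.0f} kA")
```

Even with these mild assumptions the breaker must interrupt thousands of amperes within milliseconds, which is why purely mechanical switches struggle and hybrid designs are attractive.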
Behind the Bench Professor Mike Barnes
E: mike.barnes@manchester.ac.uk T: +44 (0) 161 306 4798 W: http://www.research.manchester.ac.uk/portal/en/researchers/mike-barnes(9484b24e-0a22-44a0bb9f-02083d354ce0).html
Research Objectives Professor Barnes and his team are developing a smaller, faster current breaker for offshore use to aid the integration of offshore wind farms into the National Grid. Funding EPSRC Collaborators • GE Grid Solutions • Scottish and Southern Energy Networks
Q&A
How did you become interested in power networks? What modern electronics can do is amazing: from healthcare to entertainment to transportation. As anyone who has ever been stuck with a low battery on their smartphone knows, getting power to all this is fundamental. I just got fascinated by the challenge of trying to achieve this sustainably for future generations. Plus, you get to work with some really cool and amazing technology and people.
Can the UK meet its target to supply 30% of its electricity demand from renewable sources by 2020? In June this year, 50% of electrical power was produced from renewables for a short while. So the target of a high proportion of renewable energy is achievable – but it requires a long-term strategy. That in turn means clear and binding decisions by government on our future energy plans to allow energy companies to make the necessary long-term investment in the UK.
Bio Mike Barnes graduated with BEng (’93) and PhD (’98) degrees from the University of Warwick, Coventry, UK. He became a Lecturer at the University of Manchester Institute of Science and Technology (UMIST, now merged with The University of Manchester), Manchester, UK in 1997, where he is currently a Professor. His research covers the field of power electronics systems.
Contact Prof Mike Barnes School of Electrical & Electronic Engineering University of Manchester Oxford Rd Manchester M13 9PL UK
Europe is leading in the development of offshore wind – where else in the world do you see potential? China, Japan, India and North America are also active in this area and we expect to see rapid development of the technology by them.
Your research has led to patentable results. How close are these to being put into practice? Our ideas are being developed by people around the world. China, Korea and even NASA are working on some of the concepts, and our work is being used by people active in this area. However, we will only see these devices appearing once we see large high-voltage DC networks emerging, which will be a few years yet. However, before you put such equipment on the network, there are several years of writing guidelines and specifications to ensure effective and safe use of such installations – so the research has direct relevance even now.
What are the next steps for your research? We will continue to work with manufacturers and utilities in this area to look at the route to commercialisation and at the implications of such equipment on existing hardware.
Physical Sciences ︱ Dimitrios Pezaros & David Hutchison
Making the Internet a safer place
Back in May 2017, a huge cyberattack crippled several of the largest digital networks in the UK and US, paralysing over two hundred thousand computers. To combat such threats Dimitrios Pezaros, Senior Lecturer at the University of Glasgow, and David Hutchison, Distinguished Professor of Computing at Lancaster University, launched SAI2 (A Situation-Aware Information Infrastructure), a research project aimed at developing new technologies to fight against cyber threats.
Cyber attacks are becoming more and more common, finding their way into the headlines every couple of months. The incident in May 2017 was a fairly typical but high impact ransomware attack. Software called WannaCry infected several organisations’ internal computer networks, using the EternalBlue tool, or ‘exploit’, to install rogue software on unpatched and vulnerable computers. The vulnerability was also used to spread the WannaCry code from one computer to another. On each infected computer a ransom was demanded for putting the rightful users back in control. Another widely publicised cyberattack hit Dyn, a Domain Name System (DNS) provider, in 2016. A ‘botnet’ was formed by infecting large numbers of easily-hacked networked IoT (Internet of Things) devices such as IP cameras, printers and other everyday gadgets which were used to launch a Distributed Denial-of-Service (DDoS) attack on the Dyn servers. This led to many major Internet platforms that are dependent on Dyn becoming unavailable to huge numbers of their users. According to the Situation-Aware Information Infrastructure (SAI2)
investigators, such incidents could be better controlled if resilience management were in future deployed in networks, acting more intelligently to detect the onset of attacks, assisted by situation awareness information.
Situation Awareness added to our resilience framework can help the fight against damaging cyberattacks.
UNITED WE STAND; DIVIDED WE FALL The SAI2 researchers were concerned that cyber security was too reliant on the static defence of individual end devices. The focus should be on protecting the whole networked system, and constantly checking for intrusions. Think about the setup in a typical household – a desktop, a laptop, a tablet and a smartphone. All these devices can be fitted with protective technology like antivirus software. But in case of a threat, each will mind its own digital business. If a hacker fails to get past the security running on a desktop, it doesn’t mean he or she will be just as unlucky when attacking a laptop. One way around this is to use a home gateway equipped with a firewall that is supposed to protect the whole network. But the firewall’s rules – what it does and does not recognise as a threat – are usually static and thus can’t adapt intelligently to break-ins on the network or attacks it has not encountered before. When we multiply this by thousands of computers connected into one of the largest networks of the world, the problem gets significantly worse. That’s why the SAI2 team focused on building a security system that integrates different sources of information like operators’ warnings about cyber threats, social media news feeds, or indeed any relevant contextual information, as well as conventional network traffic packet traces. A computer network is frequently referred to as the information highway,
carrying bits of data travelling in both directions between connected devices at extremely high speeds, and as with a real highway there’s much that can be inferred by measuring the traffic. What the SAI2 team proposed was algorithms and tools for detecting anomalies in the measured traffic, and then reacting to such anomalies in short timescales. It sounds simple, but the real challenge lies in deciding how to respond to anomalies. The SAI2 researchers have been better able to spot when something suspicious is going on by analysing network data traffic patterns alongside global information feeds from social media (e.g., Twitter) and news (e.g., Reuters) sources. However, to accomplish this and at the same time feed any alerts back to the network infrastructure in short timescales, they decided to revisit the fundamental packet switching mechanisms of the network. Doing so for large networks that transmit data at rates of hundreds of Gigabits per second and multiplex traffic for millions of users over large, geo-distributed data centres poses significant technical challenges. So, the SAI2 team developed a novel, programmable switching architecture that can natively incorporate monitoring and adaptive control intelligence as part of the main packet forwarding operation of the network infrastructure. This way, Hutchison, Pezaros and their colleagues designed a cyber security system that is aware of what’s going on in the entirety of the network it protects, and can react according to the temporal operational conditions and incidents as and when they unfold. But knowing what’s going on inside the network was only the first step in building a situation-aware infrastructure. The next logical step was to equip the network with the ability to understand, to an appropriate extent, the outside world as well.
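The per-switch ‘normal behaviour profiling’ the team describes (see the Q&A that follows) can be illustrated with a minimal sketch: an exponentially weighted moving average (EWMA) of a traffic counter flags readings that stray too far from the learned baseline. The smoothing factor, threshold and warm-up length below are illustrative choices, not values from SAI2:

```python
# Minimal sketch of EWMA-based traffic anomaly detection.
# alpha, k and warmup are illustrative tuning choices, not SAI2 parameters.

def ewma_detector(samples, alpha=0.1, k=3.0, warmup=5):
    """Yield (sample, is_anomaly) for a stream of traffic counters."""
    mean, var = None, 0.0
    for i, x in enumerate(samples):
        if mean is None:
            mean = float(x)          # initialise the baseline on the first sample
            yield x, False
            continue
        sigma = var ** 0.5
        if i >= warmup and sigma > 0 and abs(x - mean) > k * sigma:
            yield x, True            # deviates more than k sigma from the baseline
            continue                 # skip the update so a flood cannot poison it
        diff = x - mean
        mean += alpha * diff                              # EWMA of the mean
        var = (1 - alpha) * (var + alpha * diff * diff)   # EWMA-style variance
        yield x, False

# Steady traffic of ~1000 packets/s, then a sudden flood:
for value, anomalous in ewma_detector([1000, 990, 1010, 1005, 995, 8000, 1002]):
    print(value, "ANOMALY" if anomalous else "ok")
```

Keeping the state to two numbers per counter is what makes this kind of profiling cheap enough to run inside the forwarding path, in the spirit of the programmable switching architecture described above.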
[Figure: A framework for resilience in networked systems – Defend, Detect, Remediate, Recover, Diagnose, Refine.]
Resilience management can use Situation Awareness to help make better remediation and recovery decisions.

LOOKING OUTSIDE
While the term 'cloud computing' sounds intangible, the reality is that our photos, emails, videos or medical records have to be physically stored on servers located somewhere. Clouds and any critical infrastructure will be subjected to challenges including natural disasters and a variety of operational failures, as well as cyber attacks. The SAI2 investigators apply a resilience management framework to protect such networked systems, assisted by situational awareness information from external sources including social media. Of all social media platforms, Twitter is certainly one of the most accommodating to researchers – its data is easily obtainable: how many people tweeted a particular message; how many used a given hashtag; when
and where those people did so, and so on. This is all invaluable information when it comes to assessing and dealing with crises. Suspicious activity appears in Twitter data patterns like ripples in water. So, the SAI2 team went on to build algorithms that incorporate such news feeds into their cyber security systems. In this way, computer networks of the future will know what is happening around them as well as inside the network, and can react accordingly. Does this mean they will become self-aware? No, they won't. But the idea is that future networks will exhibit other properties like self-management and self-adaptation, which will ultimately make them more resilient and reliable: properties which will benefit us all.
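As a rough illustration of the 'ripples' idea – again a hypothetical sketch rather than the project's actual algorithms – one could flag an interval as a burst when a keyword's tweet count becomes a statistical outlier against its own recent history:

```python
# Hypothetical sketch: flag bursts in a keyword's tweet volume by comparing
# each interval against the trailing window (a simple z-score test). A real
# deployment would consume a streaming feed; these counts are invented.

from statistics import mean, stdev

def burst_indices(counts, window=6, z_cut=3.0):
    """Return indices of intervals whose count is an outlier vs. recent history."""
    bursts = []
    for i in range(window, len(counts)):
        history = counts[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and (counts[i] - mu) / sigma > z_cut:
            bursts.append(i)
    return bursts

# Hourly counts for some hashtag of interest (fabricated numbers):
hourly = [12, 9, 15, 11, 13, 10, 14, 12, 11, 13, 96, 120, 30]
print(burst_indices(hourly))  # the jump at index 10 stands out
```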
Behind the Bench
Professor David Hutchison
Dr Dimitrios Pezaros
E: d.hutchison@lancaster.ac.uk
T: +44 (0)1524 510331
W: https://www.gla.ac.uk/schools/computing/staff/dimitriospezaros/
W: www.research.lancs.ac.uk/portal/en/people/david-hutchison

Research Objectives
Prof Hutchison and Dr Pezaros work on improving cybersecurity. In particular, their research aims to give networks better abilities to detect and respond to cyberattacks and other challenges.

Funding
• UK Engineering and Physical Sciences Research Council (EPSRC) TI3 Project EP/L026015/1: Situation Awareness.
• European Cooperation in Science and Technology (COST) Action CA15127: RECODIS on Resilient Communication Services.
Q&A
What first got you interested in cyber security and resilience?
The realisation that computer networks have increasingly become part of national critical infrastructures, and are therefore too important to fail. We have been working in the areas of network and service management for years, and we are bringing this know-how into an emerging area requiring holistic solutions to enable emergent properties such as reliability and resilience in large-scale networked systems. Actually, what we work on is the resilience of networked systems, which includes cybersecurity but goes beyond it: responding to cyberattacks (and other disruptive challenges such as natural disasters) and rapidly attempting to remediate and recover the normal operation of the system.

How can the Situation-Aware information infrastructure benefit us, the end users?
A Situation-Aware information infrastructure should aim to make the underlying interconnection medium even more transparent to end users, minimising down-time and rapidly re-engineering its operation at the onset of adversarial events, while offering uninterrupted services.
Collaborators
Lancaster University: Andreas Mauthe, Angelos Marnerides, Noor Shirazi, & Steven Simpson. University of Glasgow: Joemon Jose, Long Chen, & Simon Jouet

Bio
Dimitrios Pezaros, CEng, SMIEEE, SMACM, is Senior Lecturer (Associate Professor) at the School of Computing Science, University of Glasgow. His research focuses on the resilient and efficient operation of virtualised and software-defined networked infrastructures. He holds BSc and PhD degrees in Computer Science from Lancaster University, UK.

How does your Situation-Aware network architecture work?
Situation Awareness is facilitated through a novel network architecture that makes the network's main data-forwarding plane programmable. This is achieved through each switch in the network supporting a minimal, performance-bound instruction set. Based on this, centrally-controlled, minimal programs can be installed on the switches along the data path to enable high-speed, adaptive functionality alongside packet switching. Using this novel architecture, we have demonstrated several use-cases of monitoring and control functionality, such as exponentially weighted moving average (EWMA) computation on every switch along the data path for normal behaviour profiling (a prerequisite for enabling anomaly detection), as well as collaborative pushback for distributed denial of service attack remediation.

How can a cyberattack affect a regular person?
Cyberattacks can deprive users of access to their data on a physical machine or over the cloud, and of the always-on connectivity which is increasingly considered vital. Cyber incidents where attackers encrypt the victim's data and subsequently ask for ransom in order to decrypt it are becoming increasingly common. At the same time, attacks on the networked infrastructure can have wider and more costly effects.
David Hutchison is Distinguished Professor of Computing at Lancaster University and has been doing computer network research for more than 25 years. He now focuses largely on resilient and secure networking, with interests in the Future Internet and the protection of critical infrastructures including utilities and industrial control systems.

Contact
Prof David Hutchison
Lancaster University, InfoLab21
Lancaster, LA1 4WA
UK
Volume-based amplification attacks can take significant parts of the infrastructure offline for long periods of time, preventing users from accessing online services but also from running their own businesses over the cloud.

What, in your opinion, will perpetrators do to defeat future cybersecurity systems like those you're working on?
Over the years, there has been the typical cat-and-mouse game between perpetrators and defence systems, where the latter are developed or amended in response to a new attack, and new attacks are developed to exploit previously unknown vulnerabilities. The work in this project strives to break this endless cycle by making the infrastructure adaptive, able to learn from its own past behaviour, and to harness as much information as possible to try and predict the onset of adversarial events. This way, defence against cyber-attacks does not depend on static knowledge that can only protect against a certain set of vulnerabilities; rather, it evolves together with the operation of the networked infrastructure.
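The 'collaborative pushback' mentioned in the interview can be pictured with a toy model – our own simplified sketch, not the project's design: when a switch detects a flood towards a victim, it installs a rate limit locally and asks its upstream neighbours to do the same, pushing remediation closer to the attack's sources.

```python
# Toy model of collaborative pushback (hypothetical; not the SAI2 code).
# A switch that spots a flood caps traffic to the victim and propagates
# the same cap to upstream switches, throttling the attack near its sources.

class Switch:
    def __init__(self, name, upstream=()):
        self.name = name
        self.upstream = list(upstream)
        self.rate_limits = {}            # victim address -> packets/s cap

    def observe(self, victim, observed_pps, normal_pps):
        if observed_pps > 10 * normal_pps:   # crude volumetric detection rule
            self.pushback(victim, cap=normal_pps)

    def pushback(self, victim, cap):
        self.rate_limits[victim] = cap
        print(f"{self.name}: limiting traffic to {victim} at {cap} pps")
        for switch in self.upstream:          # propagate towards the sources
            switch.pushback(victim, cap)

core = Switch("core")
edge = Switch("edge", upstream=[core])
edge.observe(victim="10.0.0.7", observed_pps=50_000, normal_pps=400)
```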
Thought Leader
CPHA: The heart of Canadian public health Health related issues, such as diabetes, heart disease or obesity, can often be hugely influenced by public health initiatives. In her role as Chair of the Canadian Public Health Association (CPHA), Dr Suzanne Jackson champions a public health perspective on important public policy discussions that impact the health and well-being of Canadians. She spoke to us about CPHA’s influence since its foundation – both the positives and negatives – before highlighting where public health policy needs to focus its attention in the coming months and years.
Public health affects everybody – it's inescapable but paramount. Recent societal changes in the UK arising from public health policy include the five-a-day healthy eating initiative, the smoking ban and the sugar tax. In Canada, the Canadian Public Health Association (CPHA) is the independent national voice and trusted advocate for public health, speaking up for people and populations to all levels of government. Through its work, CPHA has operated untiringly to improve Canadian public health, having previously advocated for national health insurance in the late 1930s, fluoridation in the early 1970s, and the establishment of the first national HIV/AIDS education and awareness programme in the 1980s. Dr Suzanne Jackson has worked in the public health field since the early 1980s and is the current Chair of CPHA (2017–18). In her career, she has seen numerous positive global public health changes but is aware that more needs to be done. She sat down with us at Research Outreach to discuss this, and more, in further detail.

Hello Suzanne! Can you give us an overview of what the CPHA does?
CPHA is primarily a member-driven organisation. Its members represent
a diverse range of roles and professions in public health – nurses, physicians, inspectors, nutritionists, dentists, health promoters and researchers. The Board defined a strategic vision for CPHA in 2015 that represents an overview of what we do. Two goals are related to the organisation (engaged membership, financial stability), and the remaining four goals relate to our main role. These include:
1. National, independent, evidence-based voice for public health in Canada
2. Represent the public health community's interests in public health system renewal
3. Convenor of partners to identify solutions to public health issues
4. Inspire and motivate change in support of health equity
We run a big conference every year to exchange the latest information about what is going on in public health; we convene a national table for all provincial and territorial public health associations to meet several times a year (mostly by teleconference); we research policy issues of interest to our membership; and we also develop position statements which form the basis of our advocacy to government departments, media and other organisations.
CPHA also runs some projects under contract to government and others to create resources or training opportunities for public health workers. We serve as the home for the Canadian Journal of Public Health, and we communicate with our members regularly about events, new reports and publications, and jobs in the field of public health in Canada.

The CPHA marked its centenary in 2010. What impact do you think the organisation has had on Canadian public health since it was founded? Are there any achievements that really stand out for you?
I have always been proud of CPHA for taking positions on public health at the leading edge. As a worker in the system, it was great to see the leadership offered by taking a stand on issues ahead of what the rest of the field was doing. For example, we are making an important contribution to what to do about climate change. My involvement in the field goes back to the mid-80s, when CPHA co-sponsored the Ottawa Charter conference with WHO in 1986. Since then, this has led to a remarkable 30 years of global attention to health promotion following the same guidelines – no other field can claim such global consistency. Not only that, but, also in 1986, CPHA established
the first national HIV/AIDS education and awareness programme. CPHA also held positions on the dangers of smoking and second-hand smoke that added to the pressures for policy change felt by all levels of government. However, the way I remember it is that some very committed and brave Medical Officers of Health in Ottawa and Toronto – members of CPHA and leaders in public health in this country – persuaded their local Boards of Health to adopt innovative by-laws. The anti-smoking-in-public-places by-laws were a remarkable public health achievement. Among many other major public health milestones, CPHA notably advocated for national health insurance in 1939, abortion in 1972, water fluoridation in 1977, and against nuclear weapons in 1982.

I have always been proud of the Canadian Public Health Association for taking positions on public health at the leading edge

What is the importance of research in CPHA's work?
Careful policy research about the level of evidence in the literature, in relation to identifying the key components of a policy issue, is very important. CPHA prides itself on providing timely, evidence-informed public health guidance and perspectives to public health professionals and policy makers. It ensures that its positions and statements can be backed up by the best available evidence. We also re-evaluate our positions periodically to ensure that we do not become dogmatic and that we are informed by the most recent evidence. An evidence-based approach is important for us to be a credible voice for public health in Canada and to advocate for change to public policy to the federal government.

What involvement does CPHA have in the development of public health policy?
We have had some influence. For
example, the Chief Public Health Officer’s Report in 2015 focused on alcohol and our position paper was referred to several times. Our 2014 recommendations to the House of Commons’ Standing Committee on Health regarding e-cigarettes were repeated practically verbatim in the Committee’s report. These recommendations are echoed in the current Bill S-5 in the Senate. Our 2016 recommendations to the Task Force on the Legalisation and Regulation of Cannabis are clearly represented in the Task Force’s recommendations to government. Although we cannot be sure of the extent of our influence at other levels, I believe that CPHA papers and resources have been used by public health officers across the country to advocate for changing policies at the local public health unit
level. They have also been used in the preparation of media reports about health issues in Canada. Media exposure affects public opinion which, in turn, affects politicians and policy makers.

As a specialist in health promotion, can you explain what is meant by health promotion? What are the benefits of considering an issue from a health promotion perspective?
As per the Ottawa Charter, 1986, health promotion is "the process of enabling people to increase control over their health." I see four main 'hooks' coming from this definition, and these guide my teaching and practice.
1. Focus on determinants of health – the factors that affect people's health are broad and include peace, a clean environment, resilient ecosystems, education and income. This means that health promoters focus on changing policy to affect such broad factors. This was originally conceived as building healthy public policy, but it has now evolved into "whole of government" policies and "health in all policies". This also forces attention on "creating supportive environments" and "intersectoral collaboration".
2. Focus on the positive – health promotion focuses on achieving something positive – health – and is goal-oriented rather than problem- or disease-oriented. Instead of looking at people as bundles of problems and deficits, health promoters look at people as collections of strengths and assets. This means using a "situational analysis" rather than a "needs assessment" in programme planning, as well as focusing on achieving goals framed in a positive way.
3. Focus on participation – in order to enable people to increase control over their health, they need to be involved directly in the decisions that affect their health. This means using participatory approaches to planning, evaluation and research: listening at the individual level, using group consensus methods at the community level, and using community development approaches and participatory decision-making at the societal level. "Strengthening community action" was the original Ottawa Charter strategy.
4. Using multiple strategies at multiple levels – given the breadth of work
required in health promotion – from the individual to the family to the community to societal levels – many kinds of strategies are needed, from health education, community development and intersectoral collaboration to building healthy public policies. Health promotion is about effecting change at individual, community and societal levels, and there are many theories and strategies guiding this work. The benefit of considering an issue from a health promotion perspective is that one considers a range of causes of the causes – one looks at what can be achieved positively rather than at reducing a deficit – and a range of possible strategies are considered.

In June, CPHA is hosting the Public Health 2017 conference. Could you
tell us a bit more about this? How important is it for the public health community?
CPHA hosts Canada's largest annual public health conference to share current research, promote best practices and improve health and well-being. The conference is an important opportunity for public health workers and researchers to meet, network and exchange views on interventions and concepts that are working. CPHA holds the conference in a different part of the country each year, to make it easier for people in that area to come and to highlight achievements in different parts of Canada. This year it is being held in Halifax and next year it will be in Montreal. It keeps us connected to what is going on in all parts of the country and it is good for inspiring the next generation of public health workers.
I like the way public health goes beyond individual behaviour change to look at the conditions that create health at the population level

One of CPHA's goals has been to increase understanding that health is determined by many factors outside of the health care system – so, for example, living with violence, or in fear of violence. What are the advantages of bringing factors outside the health care system to bear on the understanding of health?
Realising that there are factors outside the health system that affect health helps to identify the policy changes needed to affect the greatest numbers of people. Inequities become clear, and this is also important to recognise in policy. Looking upstream keeps us from blaming people living in difficult circumstances for their health problems and steers us away from taking an individual behaviour change approach. When other sectors recognise the parts they play in creating unhealthy or healthy conditions, partnerships can emerge. So, for example, CPHA partners with the Canadian Produce Marketing Association to promote the consumption of vegetables and fruits.

As mentioned in the previous question, CPHA has drawn attention to the impact on health of living with violence, or in fear of violence. This has been most notable in the case of missing and murdered Indigenous women. If violence were clearly identified as a priority health issue in federal policy, what difference do you think that would make to violence against Indigenous women in Canada?
Violence in all its forms is a key determinant of mental illness. Making societies free from discriminatory practices, bullying and other forms of violence promotes mental wellness. If we don't address this issue, people will continue to suffer unnecessarily from mental stress and illness. If violence prevention is clearly identified as a priority health issue in federal policy, it sets a clear direction for other policies at the provincial and municipal levels. Given the federal government's direct involvement in Indigenous issues in Canada, a clear policy should have a direct influence on decreasing violence towards Indigenous women. The policy would have to be holistic and recognise the intergenerational trauma of residential schools. It would also need to put in place the other components required to reduce structural or system-level violence, such as dealing with housing, water and sanitation, food security, youth health, employment and income, and cultural connections. And, very importantly, the policy process should engage Indigenous peoples in its formulation, so that the further systemic violence of exclusion from decision-making is not perpetuated. CPHA is currently working with Indigenous leadership in Canada to work out how their perspectives can be included in CPHA advocacy positions and relationships.

There is currently a growing opioid crisis in Canada, which is resulting in large numbers of overdose deaths. What are CPHA's strategies for dealing with this issue?
As our position statement on the Opioid Crisis states, we believe that the current emphasis on changing prescribing practices and disrupting the availability of drugs represents a limited strategy. These are interventions aimed at the downstream impact of problematic substance use. CPHA recommends that Canadians address the underlying causes of problematic substance use, such as trauma, racism, colonialism, criminalisation and poverty. In addition, CPHA advocates for the involvement of people with lived experience of opioids in discussions about the best approach to take; taking a harm reduction approach (e.g., more safe consumption facilities in communities, and naloxone available over the counter); developing legislation to protect first responders to overdoses; and strengthening surveillance information to monitor the situation and evaluate progress.

Finally, can you tell us what it is about public health that interests you personally?
I really like that public health takes a population focus and tries to make sure everyone is reached using public health measures (e.g. immunisations, nutrition, school health, healthy babies, tobacco legislation). I like the systems or structural approach behind many public health successes (e.g. restaurant inspections, water monitoring, sanitary systems, tobacco by-laws, seatbelt legislation). I like the way public health looks at the causes of the causes, goes beyond individual behaviour change to look at the conditions that create health, and works at the policy level. I also like the values of equity, cultural sensitivity, focusing upstream, environmental concerns, participatory approaches, and the interest in evidence.
• For more information on CPHA – the Canadian Journal of Public Health, their conference or the Association itself – please visit their website at www.cpha.ca.
Dr Suzanne Jackson
Canadian Public Health Association
404-1525 Carling Ave, Ottawa
ON, K1Z 8R9, Canada
T: +1 (613) 725-3769
F: +1 (613) 725-9826
W: www.cpha.ca
Health & Medicine ︱ Professor Chris McMaster
A new class of antibiotic drugs Finding new classes of antibiotic drugs could not only save human lives, but also greatly reduce the healthcare costs related to tackling bacteria resistant to currently available drugs. Professor Chris McMaster, director of the Cheminformatics Drug Discovery Lab at Dalhousie University in Halifax, Canada, is developing a new class of antibiotics, while trying to overcome the many challenges associated with this particular field of research – namely, the limited sales of these drugs, the lack of funding, and the greater focus on other pharmaceuticals.
Multi-drug resistant bacteria, commonly known as superbugs, kill over 700,000 people annually. More worryingly, this number is predicted to rise to over ten million by the year 2050. This would have massive repercussions, both in terms of the number of human lives lost and for the world's economy, which would lose several trillion dollars to the sickness and death of patients infected by these bugs. Only a small handful of scientists are trying to develop new antibiotic drugs that could counteract these superbugs – and many challenges are currently slowing down progress.

THE NEED FOR NEW ANTIBIOTICS
Antibiotics are a type of pharmaceutical drug that destroy microorganisms or prevent them from growing. They are generally used to treat or prevent a variety of bacterial infections, including pneumonias, tuberculosis, several sexually transmitted diseases (STDs), and many others. Antibiotics are generally classified based on a number of factors, including their mechanism of action, the type of bacteria they target, and their chemical structure. While most doctors prescribe antibiotic drugs on a daily basis, some bacteria have become resistant to all existing drugs, causing severe illness and death in a considerable number of people. Most antibiotics currently developed to tackle these bacteria are based on late generations of existing antibiotics. However, resistance to these drugs tends to develop quickly, because individual bacteria already have underlying mechanisms of resistance to similar antibiotics. As Professor Chris McMaster puts it himself: “Every single antibiotic
currently in clinical trials, of which there are less than 40, is a ‘me-too’ version of a current antibiotic versus a known target.” This highlights the need to develop entirely new classes of antibiotic drugs, which would be able to treat infections caused by drug-resistant bacteria. To address this problem, Prof Chris McMaster and his team have developed a new class of antibiotics that could help treat such infections.

CHALLENGES OF THE FIELD
Despite the urgency and importance of developing new classes of antibiotics, the task does not come without challenges. Several factors impair the introduction of new antibiotics into the market, making it a tortuous and difficult process. “A new class of antibiotic will likely be the drug of last resort to treat patients with multi-drug resistant infections implying that there will be limited use, and hence limited sales,” said Prof McMaster. “Second, the payers have been able to pay pennies per treatment for antibiotics for decades, and the market feels there will be resistance to a highly priced antibiotic.” Because of these issues with sales and pricing, research into new antibiotics is often poorly funded. This is ironic because, in contrast with drugs that offer momentary relief from symptoms or that extend the lifespan of affected individuals, new classes of antibiotics could actually save lives by eradicating bacteria resistant to current drugs. “This conundrum is especially odd considering that in many cases antibiotics will cure an infection and
save a life,” said Prof McMaster. “Contrast this situation to anti-cancer drugs, where there are over 800 drugs currently in clinical trials, many of which will have price points in the tens to hundreds of thousands of dollars per patient, and many of these drugs will extend life-span but not offer a cure.”

New classes of antibiotics could save lives, entirely eradicating bacteria resistant to current drugs

TREATING DIABETIC INFECTIONS
To overcome the funding issues related to the assumption that new antibiotics will have low sales and a low price point, Prof McMaster chose to develop a new class of antibiotics that would meet the needs of at least 200,000 patients per year whose infections are hard to treat using current antibiotics.
He said: “We landed on early-stage diabetic foot infections (DFIs) as having a large patient base with a high unmet need, no currently approved antibiotic, and an expected high level of uptake into the clinic and market. There is also a dramatic increase in cost as DFIs become more severe allowing for a reasonable price point to be established for a new drug that could treat this condition.”

ANTIBIOTICS AND DIABETES
Diabetes is a disease that affects over 415 million people worldwide, caused
by an impaired ability of the body to produce the hormone insulin, resulting in abnormal metabolism of carbohydrates and high glucose levels in the blood. Each year, approximately 1.5% of diabetics develop what are termed diabetic foot ulcer infections (DFIs), which range from relatively mild to more problematic. For more severe infections, patients generally receive antibiotic therapy, yet approximately 50% of moderate DFIs progress to more serious stages. This brings substantial healthcare costs related to further treatment or surgical amputations. Currently, no drugs on the market have been proven to successfully decrease
[Figure: Acyl-ACP-dependent products, enzymes, and pathways in bacteria. Typical acyl chain lengths are shown in red, while AcpS, the enzyme of interest, is highlighted in blue.]
the bacterial load in DFIs, which would facilitate the healing process. Prof McMaster and his team have developed a new class of antibiotics designed to eliminate bacteria in DFIs, with an entirely new structure compared to current antibiotics. This would be the first class of antibiotic to target the bacterial enzyme holo-acyl carrier protein synthase (AcpS), which is essential for lipid metabolism in bacteria. Prof McMaster tested the new class of antibiotics in animal models of DFI and other skin infections, and found them to reduce the bacterial load in DFIs by over 99%. Further
tests suggested that the drug has all the properties required for it to be successfully used within clinical settings.

Prof McMaster hopes to be one of the first researchers in over 50 years to develop a new class of antibiotic drugs

A CRITICAL PROBLEM
The issue of drug-resistant bacteria is of great importance, with Dame Sally Davies, England's Chief Medical Officer, describing it as a “catastrophic threat to the human population.” Developing new classes of antibiotic drugs is crucial to effectively counteract infections that resist currently available treatments.
Prof McMaster hopes to be one of the first researchers in over 50 years to develop a new class of antibiotic drugs, in this case specifically designed to tackle diabetes-related foot infections. He and his team are also looking to develop a drug for Gram-negative infections.
Behind the Bench
Dr Christopher McMaster
E: Christopher.McMaster@Dal.Ca
T: +1 (902) 494 3430
W: https://medicine.dal.ca/departments/department-sites/pharmacology/our-people/faculty/christopher-mcmaster.html
Research Objectives
Dr McMaster's research focuses on metabolic regulation with an eye toward the development of therapies for unmet medical needs in the infectious and inherited disease spaces.

Funding
Canadian Institutes of Health Research (CIHR)
Q&A
Why is developing new classes of antibiotic drugs so important right now?
We are nearing a tipping point in the rise of multi-drug resistant bacteria. We can act now to head off this looming crisis, or risk returning to a pre-antibiotic era where infectious diseases were the top killers and average human lifespans were less than 50 years.

When and how did you first start being interested in devising new classes of antibiotics?
A decade ago I made an intentional shift in research focus toward drug discovery and development. That said, the very first research essay I ever wrote – at eight years of age – was on Alexander Fleming and his discovery of penicillin.

What would you say are the biggest challenges in your field of research and how can these be overcome?
There is work to be done between governments, regulators, and the private sector.
Bio
Dr Christopher McMaster received both his BSc and PhD from the University of Manitoba, later working as an Assistant Dean at Dalhousie University. He still works there today, as the Carnegie and Rockefeller Professor and Head of the Department of Pharmacology, and Director of the Cheminformatics Drug Discovery Lab, at Dalhousie University in Halifax, Canada.
Contact
Christopher McMaster, PhD
Carnegie and Rockefeller Professor and Head
Department of Pharmacology
Dalhousie University
Canada
The private sector looks to make medicines that can make a profit, which makes sense. Most antibiotics are expected to be priced low and used little, and are thus a low priority. Governments and regulators need to find a way to incentivise antibiotic development that works for both parties. Other fields, such as rare childhood (orphan) diseases, already have such mechanisms in place, so templates do exist. Hopefully, an agreement can be found before multi-drug resistant bacteria develop into a crisis.
What advantages do you expect the drugs you are currently testing would have once they are placed on the market?
We intentionally worked on an antibiotic programme that should have high uptake by clinicians, as DFIs are a disease for which no current antibiotics are effective, which has a large patient base, and which shows a rapid increase in cost as the disease progresses. This will enable a reasonable price point for a large unmet medical need, which we feel should be enticing with respect to moving a new first-in-class antibiotic into the clinic.
What are your plans for future research and investigation?
Our antibiotic class works as a broad-spectrum agent against Gram-positive bacteria. We are currently working on Gram-negative antibiotics, as well as drugs for rare inherited (orphan) diseases. In essence, therapies for infectious and inherited diseases.
We can act now to head off this looming crisis [of multi-drug resistant bacteria], or risk returning to a pre-antibiotic era where infectious diseases were the top killers
Health & Medicine ︱ Dr Gavin Y Oudit
Preventing and treating complications of heart failure and Fabry disease Dr Gavin Y Oudit, a distinguished cardiologist from the University of Alberta, focuses his research on developing ground-breaking therapies to treat both genetic and non-genetic cardiovascular disorders, and to prevent heart failure. His exciting work into the role of ACE2 has provided a fundamental scientific understanding used in the development of therapeutic options for cardiovascular, renal and lung diseases. He has also provided important insights into various cardiomyopathies including heart disease associated with Fabry disease.
Every year, approximately 610,000 people die from heart disease in the United States, with heart failure (HF) accounting for one in nine fatalities. Worryingly, according to the American Heart Association, the number of people diagnosed with HF in the United States is expected to increase rapidly, from six million to eight million by 2030. Not only does this threaten life expectancy and impact sufferers' quality of life, but the financial implications are also severe: as a result of this increase, the total medical costs required to treat the disease are estimated to rise from $14.3 billion to $29.2 billion.

WHAT IS HEART FAILURE?
Essentially, HF is a condition whereby the heart cannot pump sufficient blood to meet the needs of the body. This can be due to a weakening of the muscle itself, or a deficiency in the volume of blood that fills the heart's chambers. Alcohol and drug abuse, diabetes and obesity can greatly increase the risk of HF, but genetic defects are also a common contributor. HF can result from a number of different factors, including arrhythmia (irregular heart beat), heart attacks, hypertension and cardiomyopathies (disorders affecting the heart muscle). Currently, there is no cure for HF. However, a combination of simple lifestyle changes and medication can improve quality of life. Nevertheless, the prognosis can be poor for people who
develop acute HF, and these individuals often continue to suffer from the condition. As a result, Dr Oudit and his team are looking to develop novel HF therapeutic options. Dr Oudit's studies mainly focus on metabolic disorders associated with heart disease, neuromuscular dystrophy and Fabry disease – a genetic disorder that can cause cardiovascular complications. In fact, Dr Oudit's ground-breaking research has already revolutionised the way clinicians diagnose and treat Fabry disease, reducing the risk of sufferers developing HF. Crucially, he has also shown that the renin-angiotensin system (RAS) plays a significant role in HF, and that inhibiting its activity can act as a preventative measure.

THE ROLE OF THE RENIN-ANGIOTENSIN SYSTEM (RAS)
The RAS hormonal system is vital in regulating blood pressure, controlling homeostasis in both the cardiovascular and renal systems. Low blood pressure is detected by the kidney, triggering the release of renin into the blood. This enzyme cleaves the plasma protein angiotensinogen to form angiotensin I (Ang I). Another enzyme, angiotensin converting enzyme (ACE), transforms Ang I into angiotensin II (Ang II), a biologically active octapeptide. Ang II induces the release of the hormone aldosterone from the adrenal gland,
causing vasoconstriction (narrowing of blood vessels) and increasing blood pressure to optimal levels.

Dr Oudit's ground-breaking research has revolutionised the way clinicians diagnose and treat Fabry disease, thus reducing the risk of sufferers developing heart failure

However, Ang II can have an adverse effect on heart health by causing hypertension, ventricular dilation, hypertrophy (thickening of the ventricular wall) and alterations in chamber configuration. These effects result in two HF disorders: i) HF with preserved ejection fraction (HF-pEF), whereby the left ventricle is stiff, reducing relaxation during blood fill before the next beat; and ii) HF with reduced ejection fraction (HF-rEF), where the heart cannot pump adequately, expelling less oxygen-rich blood. In the past, researchers have attempted to inhibit the RAS system to reduce the adverse effects of Ang II: examples include ACE inhibitors and Ang II receptor blockers. These therapeutic approaches have effectively reduced hypertension, improved cardiac function and prevented ventricular remodelling. Although these medications have shown clinical promise in patients suffering from HF-rEF, they do not protect those afflicted with HF-pEF.

ACE2 – A NATURAL PREVENTION AGAINST RAS-INDUCED HF?
To address this issue, Dr Oudit and his team have investigated other strategies to inhibit the RAS system. The discovery of ACE2 could provide a solution. ACE2 is a monocarboxypeptidase that degrades both Ang I and Ang II, inhibiting the RAS system at two levels. In fact, Ang 1-7 (a product of Ang II degradation) behaves as a cardioprotective peptide, actively reducing the negative impact of Ang II by suppressing aldosterone secretion and inhibiting myocardial fibrosis. Excitingly, studies in mice confirmed the protective benefits of ACE2 in both HF-rEF and HF-pEF. Clearly, ACE2 could potentially be a revolutionary therapeutic tool, although more studies need to be conducted to achieve optimal RAS inhibition and cardioprotection.

FABRY DISEASE
Dr Oudit and his colleagues have also revolutionised the way clinicians treat and diagnose Fabry disease. This is an X-chromosome-linked recessive disease that affects around one in every 1,500 to 3,000 people. A mutation in the gene
[Figure: The renin-angiotensin system. Renin, released by the kidney, cleaves angiotensinogen from the liver to form angiotensin I; angiotensin converting enzyme (ACE) in the lungs converts this to angiotensin II, causing vasoconstriction and increased blood pressure, with effects on the heart.]
encoding α-galactosidase A (a hydrolase enzyme involved in fat metabolism) results in insufficient degradation of glycosphingolipids, causing fatty deposits to accumulate in cell storage organelles called lysosomes. Cell function is impaired, causing damage to a variety of tissues, including the kidneys, skin, eyes and heart. HF is the primary cause of death for individuals afflicted with Fabry disease, so early diagnosis is essential to prevent severe heart damage. Unfortunately, the symptoms of Fabry disease are subtle and are often associated with other more common disorders. Examples of these symptoms include hearing difficulties, kidney problems, fatigue, eye abnormalities and heart defects. Diagnosis is therefore very difficult, and many individuals are not diagnosed until later in life – Dr Oudit and his team aim to change this.

T1 MAPPING
He and his team at the University of Alberta have developed an innovative
diagnostic tool called T1 mapping. The term 'T1' refers to the 'relaxation time' of the cardiac muscle. Interestingly, in Fabry disease this T1 time is shortened, which is a rare phenomenon. T1 mapping is an MRI technique, but one significantly more powerful than ultrasound or regular MRI, as it can detect microscopic changes in the heart and determine damage severity early on. As a result, patients suffering from chest pain and hypertrophic cardiomyopathy can be screened for Fabry disease and, if it is detected, can undergo enzyme replacement therapy. This effective treatment uses an enzyme called agalsidase beta to degrade the fatty deposits in the cells.

Diagnosis of Fabry disease is very difficult, and many individuals are not diagnosed until later in life – Dr Oudit and his team aim to change this

SCREENING FOR FABRY DISEASE
Dr Oudit's team is currently screening
for Fabry disease in patients with undiagnosed hypertrophic cardiomyopathy and patients with chest pain syndrome with normal epicardial coronary arteries. The detection of new cases of Fabry disease is essential since the use of enzyme replacement therapy has important beneficial effects. Overall, the work of Dr Oudit and his colleagues is making an outstanding contribution to the field of cardiology. By developing novel diagnostic tools and therapies to treat heart failure, the team are drastically improving the quality of life of many people all over the world.
Behind the Bench
Dr Gavin Y Oudit
T: +1 780 407 8569
Research Objectives
Dr Gavin Oudit's research focuses on cardiomyopathies and heart failure. He and his team have been responsible for ground-breaking work investigating preventative measures for heart disease and Fabry disease. His research is largely focused on elucidating the molecular and cellular underpinnings of heart failure, but he also studies other areas, including vascular and kidney diseases.

Funding
Funding support was provided by the Canadian Institutes of Health Research (CIHR), Alberta Innovates-Health Solutions (AI-HS), the Heart and Stroke Foundation (HSF), and the University of Alberta Hospital Foundation.
Q&A
How does angiotensin II specifically cause heart failure?
Angiotensin II is a peptide hormone which is produced in the circulation and in heart tissue. It produces high blood pressure and causes progressive myocardial and renal damage, leading to heart failure.

ACE2 is an X-linked gene – can genetic mutations enhance the risk of developing heart failure, and if so, how can this be prevented?
Yes, ACE2 is X-linked, so women have two copies and men have one copy of the ACE2 gene.
Collaborators
• Maria B. Grant, USA
• Bart Vanhaesebroeck, England
• Josef M. Penninger, Austria
• Peter P. Liu, Canada
• Michael West, Canada

Bio
Following his academic career at the University of Toronto, Dr Gavin Oudit embarked on his training in Internal Medicine and the Clinician-Investigator Program of the Royal College of Physicians and Surgeons of Canada. Following this, he completed a postdoctoral fellowship in the molecular biology of heart failure before joining the University of Alberta, where he currently works as an Associate Professor, Cardiologist and Clinician-Scientist. He directs the Heart Function Clinic at the Mazankowski Alberta Heart Institute.
These differences may relate to sex-based differences in cardiovascular disease, with women having a better prognosis compared to men.

How long will it take for ACE2 to be used commercially to treat heart failure?
Between three and five years.

How does T1 mapping detect early heart damage in Fabry disease patients?
T1 mapping is a very sensitive and specific marker of Fabry cardiomyopathy and can be used without giving gadolinium contrast. It may serve as an important imaging biomarker to track the progression of Fabry cardiomyopathy.
Contact
Gavin Y Oudit, MD, PhD, FRCPC
Associate Professor, Department of Medicine, University of Alberta
Clinician-Scientist, Mazankowski Alberta Heart Institute
Canada Research Chair in Heart Failure
Director of the Heart Function Clinic
Division of Cardiology, 2C2 Walter Mackenzie Health Sciences Centre
Edmonton, Alberta, T6G 2B7
Canada
Are you researching any other ways to prevent heart failure?
Yes, we are developing novel apelin analogues as potential therapies for heart failure, and we are screening for cardiomyopathies including Fabry disease.
Health & Medicine ︱ Dr John LaCava
Raising antibodies against protein complexes Dr John LaCava of The Rockefeller University has identified a gap in the current availability of target-specific antibodies for the analysis of intracellular protein-protein interactions. Using the latest antibody production techniques, alongside immunoprecipitation and mass spectrometry, he aims to identify important interactions between transcription factors and other macromolecules which are implicated in disease.
Proteins are vital for the inner workings of cells. Complex networks of interactions form between protein molecules, and because these associations drive cellular activity, an accurate knowledge of them is vital for understanding cell biology and biochemistry. One way of gaining an insight into the protein interaction networks forming within a specific cellular population is through immunoprecipitation – using the molecules of the immune system to bind proteins and draw them out of solution so they can be studied.

GUILT BY ASSOCIATION
Immunoprecipitation is a molecular technique capable of providing an accurate picture of protein associations. The technique involves breaking open
living cells to access their contents, thus releasing a complex mixture of proteins, then uses antibodies to attach to a specific target protein, thereby capturing it and permitting it to be pulled away from the mixture. Importantly, the antibody must be able to bind to its target in the context of the physical associations the target forms with other proteins in the cell, and with minimum off-target binding. When done properly, immunoprecipitation permits groups of interacting proteins to be collectively purified from cells. Co-purifying proteins are said to be guilty by association. That is, if the biological function of one or more of the proteins in the purified group is known, the rest often also prove to be implicated in the same or related biological functions. With antibodies against every human protein, researchers could map the vast networks of protein associations responsible for life. When protein associations go wrong, the resulting altered interactions may lead to disease. The study of these changes therefore has high clinical value.

A CENSUS OF PROTEIN ASSOCIATIONS
Dr LaCava's work is important because, despite significant advances in genome characterisation and protein identification, the global networks of protein interactions that occur within cells (dubbed interactomes) remain poorly characterised. It is estimated that 10% of human protein interactions, or fewer, are currently mapped – and this figure does not include the disease-specific interactions which are arguably of most interest. As part of their collaboration with CDI Laboratories Inc., Dr LaCava's group is currently focused on identifying interactions involving transcription factors, proteins which are master regulators of gene expression and commonly implicated in cancer progression. Changes in these proteins are often responsible for the unregulated proliferation of tumours, so understanding their associations and activities in both the natural and disease states will assist with identifying potential targets for therapy.
NOTHING WORTHWHILE IS EVER EASY: DISCOVER, OPTIMISE, REPEAT
This task is made more challenging by the now widely recognised problem that many antibodies are not capable of reliably capturing their target protein and its associated interaction partners. Moreover, even otherwise reliable antibodies may not perform well under all experimental conditions, and the protein associations existing in cells are not all equally stable and analytically tractable once they are released from cells and subjected to immunoprecipitation. Therefore, each antibody and immunoprecipitation experiment must be subjected to procedural optimisations – a labour-intensive and often time-consuming process. Dr LaCava and his collaborators at CDI have therefore set about generating and evaluating a suite of new antibody candidates, as well as developing robust processes for using them in optimal conditions. The process is not entirely straightforward. Protein interactions within cells (in vivo) exist in a highly specific set of naturally occurring environmental parameters. These conditions are inevitably altered during immunoprecipitation, which requires the cells' contents to be transferred into artificial conditions within test tubes (in vitro) in order to mix them with the antibodies used for protein capture. An undesirable yet common side-effect of transferring proteins out of cells into an artificial environment is that interacting groups of proteins sensitive to the change will rapidly dissociate from one another – preventing their co-capture during immunoprecipitation. These protein associations therefore remain invisible to detection (false negatives). Similarly, when bona fide interactions dissociate, spurious interactions may form, wrongly implicating them in biological processes linked to the target of the immunoprecipitation (false positives).
[Figure 1 graphic: a modular pipeline running from CDI's antibodies against endogenous transcription factors, through interactome screening of cryomilled patient tissues and cryomilled human cell lines (protein complexes/working conditions), to quantitative characterisation of IP performance by mass spectrometry (relative abundance vs m/z spectrum), an endogenous-complex-IP-competent antibody brought to market, and a curated interactome.]
FIGURE 1. One implementation of the modular pipeline: The parameters of antibody performance are assayed via screening in model cell lines as well as clinical samples. The underlying process is described in greater detail in Hakhverdyan et al. Nature Methods (2015). Well-performing antibodies characterised in this way can be relied upon to effectively immunoprecipitate (IP) endogenous protein complexes when the discovered experimental parameters are employed. Curated, disease-related interactions and the antibodies targeting them may also be of diagnostic and/or therapeutic value – identifying, differentiating, and modulating disease states. The data contribute to a global human interactome map.
It is thought that as little as 10% of human protein interactions are currently mapped

Hence, different components of the interactome require different parameters to be in place during immunoprecipitation for the experiment to be robust and the results physiologically accurate. To overcome this, Dr LaCava and his colleagues at the National Center for Dynamic Interactome Research (NCDIR) developed a high-throughput screening method using mass spectrometry-based proteomic analyses, allowing precise in vitro conditions to be performance-classified. Their results reveal the optimal conditions for immunoprecipitation. Armed with these techniques, the team are now focusing their efforts on evaluating commercially available antibodies that target human transcription factors, which have been produced under the National Institutes of Health's (NIH) Protein Capture Reagents Program (PCRP). Their immediate aim is to characterise these antibodies for their ability to immunoprecipitate protein complexes formed with transcription factors within established cell lines. Ultimately, the team plans to use the same techniques to purify transcription factor protein complexes directly from
resected patient tumours – exploring compositional differences specific to cancerous states.

A NEW TOOLBOX FOR BIOMEDICAL RESEARCHERS
Using the building blocks of their screening techniques, specific antibodies, and identified optimum conditions, the team hope to be able to capture a range of complexes for the next stage of the programme. Presently, antibodies are typically generated on a case-by-case basis. In such a workflow, a protein of interest (such as a recombinant human transcription factor) is, for example, injected into a mouse, provoking an immune response. Antibody-producing B-cells are then harvested from the mice and cultured in the lab to provide a renewable source of those antibodies. In the hands of Dr LaCava and CDI, these antibodies are tested for their efficacy in immunoprecipitation, as described above. CDI has made a major advance in the field by developing a proprietary monoclonal antibody production pipeline, named Fast-MAb®. Overall, however, this remains an expensive, labour-intensive and time-consuming process.
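The screening logic described above can be caricatured in a few lines. In the hypothetical sketch below – invented data and function names, far simpler than the method described in Hakhverdyan et al. (2015) – each buffer condition is scored by how strongly the target and its known partners are recovered relative to everything else that co-purified:

```python
# Hypothetical sketch of ranking immunoprecipitation (IP) conditions from a
# screen: score each buffer condition by the mass-spec abundance of the
# target and its known partners relative to everything else that co-purified.
# Data and names are invented; the real screen is far more sophisticated.

def rank_conditions(screen, target_proteins):
    """screen: {condition: {protein: abundance}} -> conditions by signal/noise."""
    scores = {}
    for condition, abundances in screen.items():
        signal = sum(a for p, a in abundances.items() if p in target_proteins)
        noise = sum(a for p, a in abundances.items() if p not in target_proteins)
        scores[condition] = signal / noise if noise else float("inf")
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

# Two invented buffer conditions for one transcription factor "TF-X":
screen = {
    "150 mM NaCl + 1% Triton":  {"TF-X": 40, "partner-1": 22, "keratin": 55},
    "300 mM NaCl + 0.1% Tween": {"TF-X": 90, "partner-1": 60, "keratin": 12},
}
print(rank_conditions(screen, {"TF-X", "partner-1"}))  # high-salt condition wins
```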
What if this process could be sped up? In order to meet the ultimate goal of mapping the entire human interactome in health and disease, good antibodies against every human protein and variant are needed. To solve this problem, Dr LaCava and CDI have taken to injecting immunoprecipitated protein complexes, containing collections of physically and functionally linked proteins, into mice. In doing so, they simultaneously raise antibodies against numerous proteins present in the mixture; these antibodies are then validated in immunoprecipitation as above, and used to mine the interactome for new protein associations – generating a virtuous cycle. Dr LaCava hopes to exponentially expand the portfolio of antibodies useful for interactome studies and, likewise, rapidly increase the coverage of bona fide human protein-protein associations. This approach also has added value. When using intact, endogenously assembled protein complexes as immunogens, some of the antibodies generated may recognise variables that are part of the gamut of naturally occurring protein processing. These may include alternative isoforms and truncations, post-translational modifications, and interfaces formed only when proteins are associated together (referred to as the quaternary structure). Creating reagents that can distinguish bound and unbound proteins, and capture only those protein complexes in a given state of protein associations is a ‘holy grail’ of
the field. Another major benefit of this approach is that the resulting antibodies target endogenous protein forms.
Dr LaCava's approach is providing novel solutions to long-standing, under-articulated problems in protein biochemistry and affinity proteomic research

This alleviates the need to genetically modify cells, appending affinity tags to target proteins in order to purify protein complexes using an antibody against the tag. Affinity tagging is currently used in most interactome studies due to the sparsity of native antibodies available. However, this method is only widely applicable in model cell lines that can be easily genetically manipulated on a genome-wide scale, leaving more disease-relevant clinical samples off the table. Taken together, the drive for this research is to provide tools for expanding biomedical research capabilities and findings while also improving reproducibility.

UNIQUE APPROACH TO AN OLD PROBLEM
Dr LaCava's approach is therefore providing novel solutions to long-standing, under-articulated problems in protein biochemistry and affinity proteomic research. Leveraging CDI's proprietary monoclonal antibody production pipeline, which uses the largest-content, full-length human protein microarray in existence (able to screen antibodies against nearly
twenty-thousand human proteins), Dr LaCava's team is determined to further expand the current possibilities of immunoprecipitation techniques and bring this to the commercial marketplace themselves. The ultimate goal is of course to improve patient outcomes and develop new drugs capable of combatting cancer. Indeed, the team believe that the intact, purified protein complexes they obtain will provide an unparalleled opportunity to test drug candidates for their ability to modulate proteins as they are found within cells, and in doing so, treat disease. Hence, the benefits of this research are likely to be recognised across both diagnostics and therapeutics, as it becomes increasingly possible to quickly characterise the biochemical profiles of tumours and develop weapons against their aberrant activity.
Scale-up / inject in mice
Generate antibodies / HuProt screen
CDI’s Fast-MAb®
Protein Complexes /working conditions
Cryomilled human cell lines ectopically expressing affinitytagged protein complexes
www.researchoutreach.org
Interactome curated
Interactome screen & characterise new antibodies (Figure 1)
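The condition-screening idea at the heart of this pipeline lends itself to a compact illustration. The sketch below is ours, not the lab's software: the parameter grid and the scoring stub are invented placeholders for a real readout (for example, mass-spectrometry intensity of the captured complex over background).

import itertools
import random

random.seed(0)

# Hypothetical extraction-condition grid (all values invented for illustration).
salts = [50, 150, 300]                       # mM NaCl
detergents = ["Triton X-100", "CHAPS", "Tween-20"]
ph_values = [6.5, 7.4, 8.0]

def signal_to_noise(salt, detergent, ph):
    """Placeholder for a real measurement, e.g., target-complex yield
    over background in the immunoprecipitate. Random here."""
    return random.uniform(0.1, 10.0)

scores = {}
for condition in itertools.product(salts, detergents, ph_values):
    scores[condition] = signal_to_noise(*condition)

# 'Working conditions': those that clear an arbitrary quality threshold,
# ranked best-first and carried forward to interactome screening.
working = sorted((c for c, s in scores.items() if s > 3.0),
                 key=lambda c: -scores[c])
for salt, det, ph in working[:5]:
    print(f"{salt} mM NaCl + {det}, pH {ph}: S/N = {scores[(salt, det, ph)]:.1f}")

In practice the scoring function is the expensive laboratory step; the surrounding grid-and-rank logic is the part that scales to many conditions at once.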
Behind the Bench Dr John LaCava
E: jlacava@rockefeller.edu T: +1 212 327 8136 W: http://www.b13logy.com
CDI Laboratories Inc. http://www.cdi-lab.com/
National Center for Dynamic Interactome Research http://www.ncdir.org/

Research Objectives
Dr LaCava specialises in macromolecular interaction analyses. Working in collaboration with CDI Laboratories, he and his research team are developing monoclonal antibodies capable of binding to constituents of protein complexes found in established and clinic-derived cancer cells.

Collaborators
• National Center for Dynamic Interactome Research (NCDIR)
• CDI Laboratories Inc. (CDI)
Q&A
What are the advantages and disadvantages of the high-throughput screening approach?
Advantages: Our approach speeds up the discovery of conditions for successful immunoprecipitation and identifies multiple successful working conditions that typically reveal novel interactors of the protein of interest. This approach allows us to study how the signal and noise of the immunoprecipitation experiment change across many experimental parameters, revealing how in vitro conditions affect protein behaviours in complexes. Such knowledge has clear basic research and industrial applications.
Disadvantages: The technique does require training to master. It requires special equipment, and screening consumes a lot of material – although the material use is efficient on a 'per discovery' basis, since we find valuable, otherwise invisible interactions when we look through the lens provided by the screen.

How big is the market for these sorts of biochemical tools?
The global antibody market is in excess of $80 billion and continues to expand each year. The antibody market comprises three major sectors: therapeutic applications, diagnostic tests and research use. Scientific research institutes use immunoprecipitation technology for protein-target discovery and characterisation. The research-use-only antibody market generates between $2.2 billion and $2.7 billion per year and is growing (2015).1,2 Our market research indicates that the global proteomics economy is projected to be valued at over US$20 billion by 2021.

What makes you think that this technique will succeed where others have failed?
Firstly, others have not had the ability to screen antibodies for success in immunoprecipitation in such a comprehensive way. Secondly, others have lacked the array-based pre-screen of CDI to select for antibodies likely to be specific to begin with. Finally, to our knowledge, others have not been able to readily purify enough endogenous complexes from human cells to routinely inject them in mice for antibody production – a recent preparative 'trick', coupled with our already highly effective protocols, helped us make the leap.

How will this research impact on cancer diagnosis and therapy?
When aberrant molecular interactions are identified, they may prove to be diagnostic of cancer sub-types (or prognostic of outcomes), and rational approaches to intervene may be effectively employed as therapies. A therapeutic approach may seek to reverse the aberration by, e.g., stabilising a labile diseased interaction, or by destabilising a stable diseased interaction, or otherwise modulate diseased protein complexes to more greatly resemble and/or propagate the healthy state. A prominent example of the promise (and challenges) of antitumorigenic treatments resulting from 'drugging' protein-protein interactions is embodied by the development of nutlin, the first small-molecule inhibitor of the p53–MDM2 interaction3 – illustrating that, if we thoroughly mine disease-linked protein networks, diagnostic and therapeutic strategies will emerge. We aim to be among the vanguard of this global effort.

What is the most challenging aspect of this work?
There are so many challenging aspects of the work – but among the most challenging is assessing which proteins among those purified are true positives and which are false positives. While there are many potential indicators, and large amounts of public data to draw from, there's no foolproof method to score an interaction accurately without deep knowledge of the underlying biology. Analysis is a bottleneck, which B13LOGY LLC is hoping to address.

References:
1. Fung P. A. BioCompare Antibody Market Report. (2015).
2. Baker M. Antibody anarchy: A call to order. Nature. (2015).
3. Khoo et al. Nature Reviews Drug Discovery. (2014).

Bio
Dr LaCava is a research faculty member at The Rockefeller University and the New York University School of Medicine, Institute for Systems Genetics. He is a senior researcher at the NIH's National Center for Dynamic Interactome Research, serves as an R&D collaborator and scientific advisor for CDI Laboratories Inc., and has recently co-founded B13LOGY LLC.

Contact
John LaCava, PhD
Laboratory of Cellular and Structural Biology
The Rockefeller University
1230 York Avenue
New York, NY 10065
USA

Funding
National Institutes of Health (NIH)
Health & Medicine ︱ Dr Margot Taylor
Impaired theory of mind associated with very preterm birth – an invisible handicap
Dr Margot Taylor, Director of Functional Neuroimaging at the Hospital for Sick Children and Professor at the University of Toronto, is investigating how very preterm (VPT) birth impacts social cognitive function, including Theory of Mind (ToM) – a skill that enables us to appreciate that the perspectives and beliefs of others can differ from our own. By using neuroimaging techniques, Dr Taylor has explored the neural networks that underpin the association between VPT birth and ToM.
Everybody has a unique set of beliefs and principles, and the ability to appreciate these differing opinions is an essential part of social living. This phenomenon is known as 'Theory of Mind' (ToM) – termed a 'theory' because the mind cannot be directly observed. We can only interpret what others are thinking/feeling by their speech, body language and facial expressions. These cues allow us to visualise that person's perspective and anticipate their behaviour. As children develop, their ability to comprehend these emotive indicators improves alongside their emotional intelligence. ToM ability typically emerges between four and five years of age, and development continues through adolescence into adulthood. In some cases, however, children do not master ToM and struggle to understand perspectives different from their own. As a consequence, these children may have difficulties across a range of social skills, such as understanding what others mean during daily social interactions or having empathy for others. Individuals who suffer from autism spectrum disorder, attention deficit disorder and schizophrenia are particularly prone to deficient ToM. As a result, these children and adults often find social situations stressful and difficult to cope with. However, because deficits in ToM are only apparent in social situations, ToM impairment is often not recognised until children start school, where adaptive social functioning becomes crucial, not only to making friends and dealing with new and complex social situations but also in learning.
ToM IN VERY PRETERM-BORN CHILDREN
Research has shown that poor ToM also occurs in very preterm (VPT) born children: babies born at a gestational age of less than 32 weeks (of a typical 40-week term). In Canada alone, over 4000 VPT babies are born every year. Even though improvements in neonatal intensive care mean that fewer VPT infants suffer from serious health conditions, over half of VPT children experience difficulties such as academic underachievement and cognitive impairment, including ToM deficiency. Despite the prevalence of VPT births and the subsequent detrimental lifelong impact, little is known about how neural pathways are affected by early birth, leading to poor social-cognitive abilities, including poor ToM skills. By using innovative neuroimaging techniques, such as magnetoencephalography, coupled with classic 'false belief tests', Dr Margot Taylor and her colleagues, Sarah Mossad in particular, have been the first to investigate the impact of VPT birth on this critical social-cognitive skill.

FALSE BELIEF TASKS
A standard protocol used to assess ToM ability is the 'false belief task', whereby a person must realise that another individual holds a belief about a situation that is different from their own and from reality. The 'Jack and Jill' false belief task was used by Dr Taylor and her team to compare ToM abilities in VPT vs full term (FT) children. Participants saw a series of cartoon images projected in sequential pairs on a screen. In the first image, Jill sees Jack about to drop a ball into either a blue or red hat. In the second image, Jack
either switches the location of the ball or drops it into the same hat. During the second image in the pair, Jill is either present for the ball switching or absent. Therefore, if Jill does not witness the ball being switched, she has the false belief that the ball is still located in the original hat. Having seen the two cartoon images, participants were then asked the question: 'Where does Jill think the ball is?' and responded using a button box; 300 trials with both true and false belief examples were run. Results showed no significant difference in behavioural performance between the groups. When tested outside the scanner on various scenarios requiring ToM skills, however, VPT children performed significantly worse than FT children.
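The logic of the task is compact enough to state as code. This toy sketch (ours, not the researchers' stimulus software) enumerates the four trial types and flags the one in which Jill's belief diverges from reality:

def jill_thinks(first_hat, final_hat, jill_watching):
    # Jill's belief tracks reality only if she sees the second image;
    # otherwise she still believes the ball is in the original hat.
    return final_hat if jill_watching else first_hat

for switched in (False, True):
    for watching in (False, True):
        final = "red" if switched else "blue"   # ball starts in the blue hat
        belief = jill_thinks("blue", final, watching)
        kind = "FALSE belief" if belief != final else "true belief"
        print(f"switched={switched!s:5} Jill present={watching!s:5} -> "
              f"Jill thinks {belief}, ball in {final} ({kind})")

Only one of the four combinations – ball switched while Jill is absent – produces a false belief, which is exactly the condition the participant must reason about.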
MAGNETOENCEPHALOGRAPHY
During this task, Dr Taylor and her colleagues used the state-of-the-art brain imaging technology known as magnetoencephalography (MEG) to compare neural activation associated with ToM and false belief in both VPT and FT children. MEG measures brain activity by recording the magnetic fields produced by electrical currents, using sensitive magnetometers. This innovative technique can provide millisecond temporal information along with near-millimetre spatial resolution, giving a highly detailed map of the brain's activity.

Using innovative neuroimaging techniques, Dr Margot Taylor and her colleagues have been the first team to investigate the impact of very preterm birth on social cognitive development

Despite similar behavioural performance in the false belief task in MEG, Dr Taylor showed that neural processing is different in VPT compared to FT
children. Both groups engaged some of the regions involved in false belief processing, referred to as the ToM or 'mentalising' network. In a previous study, performed on adults, Dr Taylor highlighted this network using MEG: activation begins in the occipital regions, followed by the right inferior frontal gyrus and the temporal-parietal junction. Results showed that activation in all regions was reduced and often delayed in the VPT compared to the FT group.
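Group differences of this kind are typically quantified by averaging the MEG signal over the hundreds of trials to extract the evoked response, then comparing peak amplitude and latency. A minimal illustration on synthetic data (our invented numbers, not the team's recordings):

import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0.0, 0.5, 0.001)          # 0-500 ms sampled at 1 kHz

def simulate_trials(peak_s, amplitude, n_trials=300):
    """Synthetic sensor data: a Gaussian evoked bump buried in noise."""
    bump = amplitude * np.exp(-((t - peak_s) ** 2) / (2 * 0.03 ** 2))
    return bump + rng.normal(0.0, 1.0, size=(n_trials, t.size))

# Illustrative group parameters: the VPT response is weaker and later.
for group, (peak_s, amp) in {"FT": (0.20, 1.0), "VPT": (0.24, 0.6)}.items():
    evoked = simulate_trials(peak_s, amp).mean(axis=0)   # trial average
    i = int(np.argmax(evoked))
    print(f"{group}: peak {evoked[i]:.2f} (a.u.) at ~{1000 * t[i]:.0f} ms")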
Network of brain regions involved in Theory of Mind (ToM) processing in typically developing children and adults. This network includes the right temporal-parietal junction (large green sphere), a region known to be crucially involved in ToM processing. Compared to their full-term peers, children born very preterm show reduced and delayed recruitment of these regions during ToM tasks.
Key findings indicate the importance of the temporal-parietal junction in ToM and false belief processing: following the inhibition of one's own beliefs, in order to see from another person's perspective, the temporal-parietal junction is activated. MEG imaging showed that FT children recruit the right angular gyrus/temporal-parietal junction (rTPJ) more than VPT children during false belief trials. In contrast, the VPT children recruited primarily temporal brain regions to successfully complete the task, suggesting their use of alternative strategies and under-utilisation of the mentalising network, which may explain the impairments in ToM ability frequently observed in VPT children.
FUTURE RESEARCH
The brain is an extremely complex organ and a full understanding of the neural underpinnings of social cognition – both good and poor abilities – is difficult to achieve. However, Dr Taylor's groundbreaking research has highlighted some of the key neural differences underlying ToM in VPT and FT children. She and her colleagues are continuing their investigations into the cognitive consequences of VPT birth. They propose a series of longitudinal studies that focus on investigating how neural patterns change as both VPT and FT children develop, using cognitive and social-cognitive tasks.

Social-cognitive dysfunction is a common sequela of VPT birth, and disturbances in developing these skills have profound long-term consequences academically and socially

The impacts of impaired social cognition can be very detrimental and a lifelong handicap. Without effective intervention, individuals may have a reduced quality of life, as they struggle to navigate
social situations and forge relationships. In children, this in turn would impact their academic performance. Therefore, it is essential that we develop methods to identify ToM impairment early on. Dr Taylor’s team aim to establish predictive models that will identify children who are more vulnerable to developing social-cognitive difficulties. Targeted intervention techniques could then be used to teach these children ToM skills and help them to develop their social-cognitive abilities, greatly improving their quality of life.
Behind the Bench Dr Margot J. Taylor
E: margot.taylor@sickkids.ca T: +1 416 813 6321 W: http://www.sickkids.ca/Research/margot-taylor-lab/Index.html
Research Objectives
Dr Taylor's research utilises innovative brain imaging technology, such as magnetoencephalography (MEG), functional magnetic resonance imaging (fMRI), and MRI to understand the neural bases of cognitive development.

Funding
• Canadian Institutes of Health Research (CIHR)

Collaborators
Dr Evdokia Anagnostou, Dr Benjamin T. Dunkley, Dr Jason P. Lerch, Dr John G. Sled, Dr Charline Urbain, Dr Hilary E. Whyte, Dr Mary Lou Smith, Dr Elizabeth Pang and Sarah Mossad

Bio
Dr Taylor received her doctorate from McGill University in 1981 and came to SickKids in Neurology as Director of Evoked Potential Labs. She moved to Toulouse, France in 1998 as Directeur de Recherche, CNRS, before being recruited back to SickKids in 2004 as the Director of Functional Neuroimaging.

Contact
Dr Margot J. Taylor
Diagnostic Imaging
The Hospital for Sick Children
555 University Avenue
Toronto, Ontario M5G 1X8
Canada

Q&A
Why is Theory of Mind (ToM) so important?
Theory of Mind (ToM) allows us to understand the beliefs, emotions or mental states of others, and that they can be different from our own; this allows us to predict and understand others' behaviour and is thus critical to adaptive social interactions.

Why is impaired ToM prevalent in very preterm born children?
We believe that very preterm birth impacts the development of brain networks critical for social-cognitive abilities; our studies will determine the atypical brain function underlying this.

What are the benefits of using magnetoencephalography to examine neural networks underpinning ToM?
MEG is an exceptional neuroimaging technique as it provides information on both where and when in the brain processing is occurring. The speed of thoughts and where they are happening can change enormously with age and with clinical populations; being able to identify what the differences are allows us to understand why some groups are not able to master ToM skills adequately.

What are the main neurological differences between very preterm and full-term infants?
VPT infants have a much higher risk of brain injury, and can sustain white matter injury at birth. These effects can be seen throughout childhood, but do not relate directly to the social and cognitive deficits that this group often experiences. The deficits are due to atypical brain function, and that needs to be studied by techniques such as MEG. Some of these VPT children show amazing resiliency despite early brain injury; we want to understand what underlies that, to be able to foster it in all of these high-risk children.

How can we treat impaired ToM?
There are interventions (often developed for children with autism) that can teach children how to behave more appropriately in social circumstances, and better understand what others mean with language and expressions. The children also need to learn what impact some of their own behaviours have on others. These interventions are usually based on intensive social modelling and role-playing approaches.
Theory of Mind (ToM) allows us to understand the beliefs, emotions or mental states of others and that they can be different from our own
Earth and Environment ︱ Professor Shanaka (Shan) de Silva
Supervolcano Forensics: unravelling the mysteries of the Earth's biggest natural catastrophe
To many, if not most, the word 'forensics' invokes images of the very small – DNA, fingerprints, and the like – but for Professor Shanaka de Silva and his colleagues at the College of Earth, Ocean, and Atmospheric Sciences, Oregon State University, 'forensics' is being used to investigate something altogether bigger. The team is using a multidisciplinary approach to reveal the secrets of 'supervolcanoes' and supereruptions, calling on expertise in the fields of geophysics, geochronology, petrology, geochemistry, and numerical modelling.
Next to asteroid impacts, supervolcanoes are the most catastrophic natural hazard on Earth. On average, supereruptions have occurred on Earth approximately every 100,000 years, blanketing surrounding regions with thousands of cubic kilometres of volcanic material and affecting the global climate. During these eruptions, collapse of the magma chamber roof leaves a caldera (a crater tens of kilometres in diameter). In the tens of millennia that follow, the volcano recovers as magma readjusts to the disturbance (rather like the surface of water when something is dropped into it), causing the ground above to swell ('uplift') and deform – a process known as resurgence. Earthquakes, lake tsunamis and fresh eruptions characterise this recovery, posing significant and continuing hazards.
The Earth's largest volcanic lake, Toba (Sumatra, Indonesia), fills the 100-kilometre-long and 30-kilometre-wide Toba caldera that formed 74 ka in the Earth's largest recent supereruption. Samosir Island, in the centre, is the caldera floor that was uplifted almost a kilometre during post-supereruption resurgence. GeoCover Landsat 7 satellite image in infrared and visible light courtesy of NASA.
The potential impacts make understanding supervolcanoes a task of the utmost importance, and one that is being tackled by Professor Shanaka de Silva and his colleagues at the College of Earth, Ocean, and Atmospheric Sciences, Oregon State University. In particular, the group are addressing a number of questions:
1. Magma bodies that feed supereruptions are likely at least an order of magnitude larger than the calderas they form and develop over hundreds of thousands to millions of years. Questions remain as to how such large volumes of magma can accumulate in the crust and eventually erupt, rather than cool and solidify into a granite.
2. The very conditions that promote the growth of large magma bodies demote the likelihood of eruptions. Why then do these magma systems eventually fail and erupt?
3. After catastrophic supereruptions, the system recovers during the 'resurgence' and 'restlessness' stages (or, as Professor de Silva describes it, 'the afterparty after the big dance'). Why does this happen, and what are the driving mechanics and time scales? Since all currently active calderas (e.g., Yellowstone, Campi Flegrei, Long Valley, Toba) are resurgent and restless, how long will this last and what is the hazard posed?
4. Since many large calderas erupt repeatedly, going through cycles of eruption and recovery, what is the relationship between supereruptions and resurgence?

PIECES OF THE PUZZLE
Professor de Silva and his colleagues are gathering information using different scientific techniques – an approach they have termed Supervolcano Forensics – at calderas around the world. Students and postdoctoral researchers have conducted much of this groundbreaking work, examples of which include:
Pyroclastic flows from Sinabung volcano on 2nd August 2017. Sinabung lies just 20 miles to the north of Toba and shares geochemical and age characteristics with Toba. Credit: Shanaka de Silva
1. Geochronology (led by graduate students Casey Tierney, Chris Folkes, Jamie Kern, Jason Kaiser, Rodrigo Iriarte and collaborators Axel Schmitt and Martin Danišík), which uses the decay of radioactive isotopes in magmatic minerals (i.e., crystals within the magma – for example, zircon) to date volcanic processes. This work has focused on calderas in the Central Andes and has shown that: (a) crystals can form in the storage region several hundred thousand years before eruption; and (b) most magma in the storage region actually remains non-erupted.
2. Geochemistry and petrology (including work by graduate students Dale Burns, Stephanie Grocke, Chris Folkes and collaborator Jan Lindsay), which uses the chemistry and textures of both liquid magma and magmatic minerals to understand magma history (e.g., storage depths, temperature, water content, interaction with other magma, speed of ascent). The team has confirmed: (a) the multi-stage evolution of magma chambers, with distinct changes in volume, composition, and heterogeneity; and (b) that thermally and chemically homogeneous magmas reside in the storage region both before and after a supereruption, and drive resurgent activity. These magmas do not solidify owing to regular periodic injections of fresh, hot magma from depth.
3. Geophysics (led by collaborators from the PLUTONS project), including the use of seismic waves (i.e., waves produced by earthquakes and magma movement) to generate 3D images of the crust below calderas. This
work has confirmed the presence of large low-velocity zones (i.e., partially molten areas) that extend hundreds of kilometres across and tens of kilometres deep.
4. Numerical modelling (led by collaborator Patricia Gregg), which uses mathematical models to predict how systems will behave under given conditions. This work has shown that: (a) the rheology (whether brittle or ductile) of the surrounding rock is a controlling factor; (b) negative feedbacks between magmas' thermal energy, rock plasticity, internal pressurisation and likelihood of eruption promote growth rather than eruption; and (c) eventual failure of large magma chambers (i.e., eruption onset) is a function of roof rheology and geometry; once reservoir volumes reach 10⁴–10⁵ km³, the crust is unable to support them and the roof collapses, producing calderas of up to 10³–10⁴ km², consistent with the largest calderas on Earth.

The work of Professor de Silva's group, grounded firmly in field-based observation of the deposits and stratigraphy (the relative temporal and spatial relation of events), is showing that supervolcanoes are surface manifestations of crustal-scale magmatic activity.
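As a back-of-envelope consistency check on those figures (our own, assuming a brittle roof of order 10 km thick, a value we introduce for illustration), dividing reservoir volume by roof thickness reproduces the quoted caldera footprint:

\[
A \sim \frac{V}{d} \approx \frac{10^{4}\text{--}10^{5}\ \mathrm{km}^{3}}{10\ \mathrm{km}} = 10^{3}\text{--}10^{4}\ \mathrm{km}^{2}
\]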
Next to asteroid impacts, supereruptions are the most catastrophic natural hazard on Earth
The life cycle of a supervolcano like Toba caldera. Top – As magma accumulates, the crust of the Earth is upwarped and stretched. Faults propagate from the surface downwards. Magma is kept hot by continued intrusion from deep (red) (10⁵ to 10⁶ years). Middle – Supereruption. Eventually the faults intersect the magma and eruption initiates as the roof blocks collapse into the magma chamber, acting like a plunger to force the vesiculating magma out as ash, gas, and pumice at supersonic speed (days to weeks). Bottom – Resurgence. The caldera may fill with water. The magma system and crust readjust, uplifting the collapsed roof blocks and leaking small eruptions through the collapse faults (10⁴ to 10⁵ years).
The development and longevity of supervolcano magmatic systems depend on the interplay between heat transfer and the mechanical strength of the crust. Without this feedback, magma could not be stored in large volumes; it would erupt in small events, or solidify too early. This in turn controls the eventual size of the eruptions and calderas. As an integrative framework, and with an eye to hazard assessment, Professor de Silva and his colleagues are developing a simple model that frames caldera behaviour as a reaction to changes in the balance of forces in the crust and magma system. In this model, the caldera cycle is a continuous loop. An exciting possibility is that, since the temporal and spatial scales of deformation associated with pre-eruptive development of large magma systems are quite different from those associated with restlessness, the transition from resurgence and restlessness to pre-eruption buildup could, in principle, be detected. Part of the challenge is nailing down the temporal and spatial scales of the different stages and their surface representations.

NEW RESEARCH FOCUS
To specifically improve understanding of resurgence and restlessness, Professor de Silva and his team have now turned their attention to Toba, Indonesia. Approximately 74 ka (thousand years ago), Toba experienced the most catastrophic eruption of the last 100,000 years, during which at least 2800 km³ of magma was erupted (that is 28,000
times the amount erupted during the 1980 eruption of Mt St Helens!), forming a caldera 30 km wide and 100 km long. Since then, the caldera floor has experienced well over 1 km of vertical uplift, forming the island of Samosir.

Professor de Silva and his colleagues are gathering information using different scientific techniques – an approach they have termed Supervolcano Forensics

This project, which won the support of the National Science Foundation, aims to test the hypothesis that resurgence is fed by magma left over after the climactic eruption. So far, graduate student Adonara Mucek has used geochronology to date zircon
crystals and lake sediment deposits, revealing that resurgence began at least 30 ka and continued until at least 2.7 ka. Eruptions fed by remnant magma rejuvenated by fresh magma from deep continued for at least 20,000 years after the climactic eruption. New work by graduate students Katharine Solada and Jade Bowers is further constraining lake sediment history, and expanding our understanding of resurgent eruptions, including possible relationships with the actively erupting Sinabung volcano.
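Unpacking the arithmetic in the Toba comparison above (our reading of the stated numbers, not an independent estimate): the quoted ratio implies that a Mt St Helens-sized eruption is being taken as roughly 0.1 km³ of magma,

\[
\frac{2800\ \mathrm{km}^{3}}{28{,}000} = 0.1\ \mathrm{km}^{3}
\]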
Behind the Bench Professor Shanaka (Shan) de Silva
E: desilvas@geo.oregonstate.edu T: +1 541-737-1212 W: http://ceoas.oregonstate.edu/

Contact
College of Earth, Ocean, and Atmospheric Sciences
104 CEOAS Administration Building
Oregon State University, Corvallis OR 97331-5503 USA

Research Objectives
Professor Shanaka de Silva and his many collaborators are currently investigating the most devastating natural events on Earth: supereruptions. With this project, Prof de Silva and his team are working to gain a clear understanding of the processes and timescales of these supervolcanoes, as this is vital for our ability to address potential hazards in the future.

Bio
Shanaka de Silva is a Professor of Geology and Geophysics at Oregon State University. With fieldwork as a point of departure, Shan, students, and collaborators have adopted a "forensics" approach to understanding supereruptions and supervolcanoes in the Central Andes, Japan, China, Sumatra, New Zealand and the Italian Alps.

Collaborators
• Patricia Gregg, University of Illinois
• Axel Schmitt, University of Heidelberg, Germany
• Martin Danišík, Curtin University, Australia
• Ray Cas, Monash University, Australia
• Jan Lindsay, University of Auckland, New Zealand
• Graduate students: Adonara Mucek, Stephanie Grocke, Jason Kaiser, Dale Burns, Rodrigo Iriarte, Jamie Kern, Katharine Solada, Chris Folkes, Jade Bowers
• PLUTONS Team (various collaborators): http://plutons.science.oregonstate.edu

Funding
National Science Foundation (NSF), Geological Society of America Research Grants Program
Q&A
What new technologies and/or scientific advances will help us to better understand supervolcanoes?
We are still in our infancy when it comes to understanding volcanoes, not just supervolcanoes. Critical to understanding this is what is happening in the magma systems. While we are beginning to understand some of the signals volcanoes broadcast, the problem with supervolcanoes is that they operate on much longer time scales than normal volcanoes and over much larger spatial scales. This is a huge challenge, but advances are being made on several fronts: understanding how magma systems are built and evolve, the time scales over which these systems develop, the rates and time scales of magmatic processes, and what leads to eruption versus storage. While we are improving our ability to read signals from restless calderas, two as yet insurmountable challenges are to predict exactly when (if) an eruption will occur and how big it will be. Methods and technologies that help answer these two questions are critical. One exciting area of development is the use of satellites to measure signals associated with volcanoes. Given the global distribution of restless calderas and the long time and spatial scales over which they operate, constant surveillance and measurement by satellites may be a key direction in our effort to understand supervolcanoes.

Where (and when) will be the next supereruption?
The most likely place for a future supereruption is a location where there has been such an eruption in the past. We know that large calderas like Toba and Yellowstone have had multiple eruptions in essentially the same location. If the factors that led to their formation and evolution remain the same (largely controlled by plate tectonics), these calderas are the most likely location.
Current statistics suggest that the Earth experiences a supereruption (Magnitude M 8) approximately every 100,000 years. However, there have been at least two such eruptions in the last 74 ka, and it is likely that our inventory of Earth's supereruptions is incomplete. Calderas appear to be cyclic, but their periodicity varies rapidly. Our best strategy is to be vigilant at the currently active systems and pay attention to volcanic areas around the Earth that have shown this type of activity in the last two million years or so.

What would be some of the local, regional, and global impacts of a supereruption today?
Our understanding is that everything within a 100 km radius will be devastated by pyroclastic flows. Beyond this, depending on the prevailing winds, there could be continent-scale impacts on transportation, power infrastructure, water resources, and agriculture. Communication will almost certainly be limited, and air traffic restricted as airports become inoperable. Global impacts are debated, but it is commonly thought that significant cooling (due to atmospheric aerosols) for many years – adversely impacting agriculture, the food chain, and human activities – is the most likely global impact. Given the interconnectedness of the global economy, a supereruption in any part of the globe is likely to be a global "Black Swan" event.

What is the radius of total destruction for a supereruption?
About 100 km is a generally agreed value.

How far away could you be from a supereruption and still hear it?
The M 4.5 1883 eruption of Krakatau is often quoted as the loudest sound ever produced on Earth. It was apparently clearly heard up to 3,000 miles away, and the pressure spike (an acoustic wave) created by the eruption was recorded around the globe for about five days. So, technically, the "sound" of this eruption was heard around the world for several days. A supereruption is at least 1000 times as intense as the Krakatau eruption; therefore, the "sound" could be expected to be significantly "louder" and more intense.
Earth and Environment ︱ Professor Igor Polyakov
One ocean, many minds: collaborative science in the Arctic
The Arctic Ocean is undergoing a period of significant change. In collaboration with an international, multidisciplinary team of scientists, Professor Igor Polyakov from the University of Alaska Fairbanks is the lead scientist of an observational programme monitoring climatic changes in the Arctic Ocean. The data he and his team have collected is proving instrumental in understanding the ongoing and fundamental changes experienced in the Arctic.
The loss of sea ice in the Arctic Ocean is a key indicator of global climate change and is a fundamental issue for modern-day climatology in the polar regions. Data collected over the past few decades show remarkable changes in Arctic sea ice and indicate that the Arctic may be entering a transition period of significant and potentially irreversible change.

POLAR TRANSITION
Over the last few decades, the extent of summer sea ice in the Arctic Ocean has reduced at an average rate of 11% per decade. In recent years, this reduction has also been accompanied by a reduction in both thickness and maturity: the Arctic Ocean lost 42% of its multiyear ice (ice that survives at least one Arctic summer) between 2005 and 2008.

A COLLECTIVE EFFORT FOR OBSERVATIONAL DATA
In order to track these remarkable changes within the Arctic Ocean, scientists at the University of Alaska Fairbanks conceived an innovative and collaborative observational project, called the Nansen and Amundsen Basins Observational System (NABOS). Established in 2002, NABOS focuses on the Eurasian Basin (comprising the Nansen and Amundsen basins) in the eastern Arctic and exploits a diverse range of observational methods, such as mooring buoys and oceanographic surveys. As the lead scientist of NABOS, Professor Igor Polyakov collaborates with scientists from several different countries, collecting vital information to document and understand climatic changes in the Arctic Ocean. The programme also encompasses outreach and education through the support of multiple graduate students and shipboard summer schools.
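An 11% decline per decade compounds multiplicatively. A quick illustration using only the rate quoted above (our sketch; the real record is, of course, far noisier than any constant rate):

# Compound an 11%-per-decade decline in summer sea-ice extent.
rate_per_decade = 0.11
for decades in range(1, 5):
    remaining = (1.0 - rate_per_decade) ** decades
    print(f"after {decades} decade(s): {remaining:.0%} of the initial extent remains")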
NABOS technicians in preparation for mooring deployment.
FOCUSING ON CHANGE
The overarching goal of the NABOS project is to build a cohesive picture of climatic changes in the Arctic Ocean. In order to achieve this goal, the team at NABOS investigate multiple oceanographic research areas, including circulation patterns, changes in thermodynamic state, mixing rates, chemical composition and sea ice. The success of the NABOS programme relies primarily upon the effective collaboration of scientists from a wide range of countries, institutions and scientific disciplines. A collaborative approach has been key to obtaining repeated oceanographic sections and the long-term maintenance of mooring buoys. In response to the observed dramatic reduction of sea ice, one focus of the team at NABOS has been an investigation into the mechanisms behind it.
OBSERVING TRANSFORMATIONS
For the first eight summers following the programme's conception in 2002, the international research team at NABOS carried out Arctic cruises every year. However, since 2013, cruises have been run every other year. The team's primary monitoring tool consists of an array of moorings anchored to the seabed, beneath the active ice layer. The mooring locations have been carefully selected to capture key elements of the system, such as circulation patterns, the interactions between different water masses, and transport in the deep ocean basins and shallow shelf boundaries. Oceanographic surveys on board state-of-the-art research vessels complement the mooring-based observations and provide opportunities for retrieval and re-deployment of the moorings. Data collected throughout the programme is analysed to give information on a wide range of oceanographic and climatic variables, including ocean turbulence, greenhouse gases, temperature, salinity, alkalinity, and ocean currents. When combined, these observations provide a broad picture of ocean activity. This data, collected during NABOS cruises, is then made publicly accessible online and is available for analysis by researchers across the globe.

The success of the NABOS programme relies primarily upon the effective collaboration of scientists from a wide range of countries, institutions and scientific disciplines

The collection of oceanic data at high latitudes is notoriously challenging, with scientists often facing harsh working conditions such as snow, ice cover, poor visibility and sub-zero temperatures. Despite these challenges, the NABOS cruises have collected important data that have made significant contributions to polar climatology.

MAKING CONNECTIONS
Since the beginning of the programme, NABOS has made several major
findings that have greatly enhanced our understanding of recent transformations in the Arctic Ocean. For instance, NABOS observations were key to identifying a dramatic warming of the Arctic Ocean throughout the 2000s, which reached its peak between 2006 and 2008. In addition, NABOS observations collected between 2013 and 2015 have greatly increased our understanding of the mechanisms behind recent sea-ice reduction. Atmospheric forcing, through solar heating of surface waters, is accepted as a primary contributor to the loss of sea ice. However, recent NABOS data have revealed a complex secondary mechanism that is intensifying the reduction of Arctic sea ice. At intermediate depth, warm, saline water from the Atlantic Ocean flows north into the eastern Arctic Ocean. Heat from this Atlantic water is prevented from reaching the surface because the overlying stratified layer of cold water, the 'halocline' (~50–150 m depth), provides an effective insulating barrier. However, in recent years, the halocline layer has become significantly weaker, resulting in reduced stratification and prompting a transition to a new ocean state: without the presence of a strong halocline layer, heat can be transferred vertically from the underlying warm Atlantic water. This enhanced upward heat flux has substantially reduced rates of winter sea-ice formation and has contributed to the loss of sea-ice cover observed over recent years.
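The insulating strength of the halocline can be summarised by the buoyancy frequency, N² = (g/ρ₀)·Δρ/Δz in a simple two-layer picture: a weaker halocline means a smaller density step Δρ, a smaller N², and easier vertical heat transfer. A toy calculation with invented but plausible values (not NABOS data):

# Two-layer buoyancy frequency N^2 = (g / rho0) * d_rho / dz.
# Larger N^2 means stronger stratification - a better insulating barrier
# between the surface and the warm Atlantic layer below. Values illustrative.
g = 9.81        # gravitational acceleration, m s^-2
rho0 = 1025.0   # reference seawater density, kg m^-3
dz = 100.0      # halocline thickness, m (roughly the 50-150 m depth range)

for label, d_rho in (("strong halocline", 3.0), ("weakened halocline", 1.0)):
    n2 = (g / rho0) * d_rho / dz
    print(f"{label}: N^2 = {n2:.1e} s^-2")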
NABOS observations collected between 2002 and 2015 have greatly increased our understanding of the mechanisms behind recent sea-ice reduction
Above and right: Deployment and recovery of moorings in the harsh Arctic environment have always been a challenge.
Behind the Bench Professor Igor Polyakov
E: ivpolyakov@alaska.edu T: +1 907 474 2686 W: http://research.iarc.uaf.edu/NABOS2/index.php
Contact
408 G Akasofu Building
International Arctic Research Center
University of Alaska Fairbanks
1731 South Chandalar Drive
Fairbanks, AK 99775
USA

Research Objectives
Professor Polyakov dedicates his research to the observational programme NABOS (Nansen and Amundsen Basins Observational System), where he is the lead scientist, and is instrumental in improving understanding of the ongoing fundamental changes in the Arctic.
Q&A
Why is the eastern Arctic an important place to gather observational data?
Eddy Carmack, a prominent Arctic researcher, said that the Amerasian Basin is the king of storage whereas the Eurasian Basin is the king of flux. Climatologically, the Amerasian Basin indeed stores an impressive 12–20 m of domed liquid fresh water in the Beaufort Gyre – four to five times more freshwater than the Eurasian Basin. On the other hand, the eastern Eurasian Basin with its shelves is the transit area for the majority of Arctic riverine water. The Eurasian Basin provides the shortest pathway for Arctic fresh water to the sub-polar seas, where weak stratification leads to deep convection, a key part of global thermohaline circulation. The eastern Eurasian Basin acts as a switchgear, redistributing ice and fresh water between the eastern and western Arctic in response to atmospheric and oceanic forcings, and as a result has profound effects on climate.

What are the key questions that need to be addressed in future Arctic Ocean research?
How will the Arctic Ocean's physical, chemical and biological components respond to climate change? What impact will high-latitude climate change have on lower-latitude regions?

What is next for the NABOS programme?
There is hope that the programme will continue: only by continuous observations can important issues related to climate change in the Arctic be addressed.

How have technological advancements changed the way observational data is acquired at high latitudes?
There is great progress in observational techniques used in Arctic research. Continuous observations spanning decades are now available thanks to the progress in this area. Moreover, new technologies now allow observations which we were just dreaming of ten years back. Mixing in the ocean and the impact of physical processes on the state of biological species (to mention just a few) are areas where new technologies play a key role.

Funding
National Science Foundation (NSF)

Bio
Igor Polyakov is a Physical Oceanographer with more than 30 years of experience in Polar Science. He earned his first and advanced degrees in oceanography and mathematics in St Petersburg, Russia. He also has numerous publications related to high-latitude climate change and air-ice ocean interactions.

Collaborators
• Arctic and Antarctic Research Institute, St Petersburg, Russia – Vladimir Ivanov, Leonid Timokhov
• Applied Physics Laboratory, University of Washington – Matthew Alkire, Jamie Morison, John Guthrie
• Alfred Wegener Institute, Bremerhaven, Germany – Torsten Kanzow
• Norwegian Polar Institute, Tromso, Norway – Arild Sundfjord
• Institute of Ocean Sciences, BC Canada – Eddy Carmack
Only by continuous observations can important issues related to climate change in the Arctic be addressed
Earth and Environment ︱ Dr Julie Meachen
Extinct giants, a new wolf and the key to understanding climate change
Image by William Harris (Own work) [CC BY-SA 4.0 (http://creativecommons.org/licenses/by-sa/4.0)], via Wikimedia Commons
After its last excavation in the 1970s, a group of palaeontologists, genetics experts and cavers, led by vertebrate palaeontologist and mammalian carnivore specialist Dr Julie Meachen of Des Moines University, has re-opened excavations at Natural Trap Cave (NTC) in North America. Through this project, Dr Meachen hopes to uncover the secrets of the mass extinction at the end of the last ice age and give high school students the opportunity to get involved in palaeontology.
Holocene bison skull from 2016 Natural Trap Cave excavations. © Julie Meachen
70
www.researchoutreach.org
Dr Julie Meachen rappels into Natural Trap Cave © Justin Sipla

Before the withdrawal of the last ice age in the late Pleistocene, some 10,000 years ago, species diversity was much richer than it is today. Many Pleistocene species were bigger, more varied in appearance and roamed in greater numbers than those we see now. Megafauna – large animals typically weighing more than 44 kg and iconic to this time – such as mastodons, dire wolves and sabre-toothed cats, roamed the Earth. Now, these species are extinct, with climate change being one of the contributing factors in a complex extinction event that saw a dramatic loss of diversity. In an attempt to better understand the factors that influenced this mass extinction and how climate change affects animal populations, Dr Meachen, along with co-investigator Dr Alan Cooper and her collaborative team, has been studying the remains of megafauna excavated from Natural Trap Cave in Wyoming, North America. Today, we are experiencing a sixth mass extinction, with climate change and human influence believed to be at its heart. The research at Natural Trap Cave (NTC) could not only uncover information about the events leading to the loss of the ice age's megafauna, but may also help us to understand how climate change has played a role in the current extinction crisis.

A GLIMPSE INTO THE PAST
Natural Trap Cave is a pit cave in North America. During the Pleistocene, this natural deadfall trap would have been located south of the Cordilleran and Laurentide ice sheets – glaciers that were separated by a channel leading from Alaska into North America. NTC lies just south of where this corridor between the ice sheets would have ended. Previous research has revealed that prey and predator species would have used this channel as a migration route into new territories. During this migration, some animals would fall into the cave due to the sheer 85-foot drop. Any animals that managed to survive the fall would be unable to escape, leading
to an abundance of well-preserved Pleistocene fossils. NTC has protected the wealth of skeletons that litter its floor for thousands of years by keeping temperatures below 10°C and preventing weathering from wind and rain. As a result, the fossils recovered from the site are often of unusually high quality and many are almost fully intact, allowing for a unique look at the past ecosystems of America. Some of the most complete American Cheetah (Miracinonyx trumani) specimens have been recovered from this site, as well as an extinct species of musk-ox (Bootherium bombifrons) and several specimens of unclassified wolves.
The research at Natural Trap Cave (NTC) could help us to understand how climate change has played a role in the current extinction crisis

NTC was first revisited in 2014, with another two field sessions in 2015 and 2016 providing excellent data. Using DNA found in the bones and teeth of long-extinct animals, pollen samples and records from the site's last excavation, Dr Meachen and her collaborative team have been able to assess the anatomy and appearance of species, their diet and the environmental conditions during a time of rapid climate change. This information has helped to broaden understanding of how animals may have responded to environmental changes brought about by a warming climate, by revealing immigration patterns and changes in anatomy.

A DIFFICULT WOLF TO CLASSIFY
One of the key findings by the team at NTC has been that of Beringian wolf specimens. The recently classified Alaskan Beringian wolf had not previously been identified as far south as Wyoming. Researchers who studied NTC wolves in the past had trouble categorising the skeletons they uncovered because Beringian wolves are very similar to both extinct Dire wolves (Canis dirus) and today's Grey wolves (Canis lupus). However, by analysing DNA recovered from specimens at NTC, and measurements taken from jaw bones and teeth, Dr Meachen's collaborative team could establish that the wolves of NTC are distinct from Dire wolves.
Left: A view of Natural Trap Cave. © Justin Sipla Right: An Ice Age coyote jaw from 2016 Natural Trap Cave excavations. © Julie Meachen
A little smaller than a Dire wolf and with a higher bite force and longer snout than Grey wolves, Beringian wolves lie somewhere between the two species in appearance and are believed to be an ecomorph of Canis lupus. In other words, Beringian and Grey wolves were the same species, but due to exposure
to different conditions, such as prey availability, differences in temperature and precipitation rates, they developed separate, distinct characteristics.
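At its simplest, separating such closely related forms from skeletal measurements means comparing specimens with reference groups in measurement space. The toy nearest-centroid sketch below uses invented numbers and is not the team's method, which also drew on ancient DNA:

import numpy as np

# Invented (jaw length mm, carnassial tooth length mm) reference centroids;
# real analyses use many more measurements plus genetic evidence.
centroids = {"Dire wolf": np.array([190.0, 32.0]),
             "Grey wolf": np.array([165.0, 26.0])}

# Hypothetical NTC jaws: intermediate, but closer to Grey wolves.
specimens = [np.array([172.0, 29.0]), np.array([175.0, 30.0])]

for spec in specimens:
    dists = {name: float(np.linalg.norm(spec - c))
             for name, c in centroids.items()}
    nearest = min(dists, key=dists.get)
    print(f"specimen {spec.tolist()}: nearest = {nearest} "
          f"({dists[nearest]:.1f} units away)")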
Dr Meachen believes the migration of Beringian wolves into mid-continental America may offer insights into the dispersion of the Grey wolf – a species which once covered the entire Northern Hemisphere
Further study revealed that these animals were likely to be megafaunal prey specialists and would have arrived at NTC by following the migration of musk-oxen and bison. Dr Meachen believes the migration of Beringian wolves into mid-continental America may offer insights into the dispersion of the Grey wolf – a species which once covered the entire Northern Hemisphere. A final field session in 2017 promises to further explore how climate change in the late Pleistocene affected these megafaunal populations.

BRINGING THE ICE AGE TO SCHOOL
Barefoot is an educational outreach team looking to make science relatable for children. Founder James MacDiarid has dedicated a decade of his career to bringing accurate science to students through videos and video-telecasting. Barefoot's goal is to encourage children to 'Imagine, learn and connect' with science, using stories to explain complex content. In 2016, Barefoot were invited to join Dr Meachen and her collaborative team at NTC, where they held Skype sessions with students in Australia and England, bringing the excitement and wonder of the ice age directly into the classroom. It is the hope of both Barefoot and the NTC researchers that, by working together to engage with students, they can provide an insight into how an excavation is carried out and what it is like to uncover the bones of long-dead giants.
Behind the Bench Dr Julie Meachen
E: Julie.Meachen@dmu.edu T: +1 515 271 1568 W: www.dmu.edu/

Contact
Dr Julie Meachen
Assistant Professor
Des Moines University
3200 Grand Avenue, Des Moines, IA 50312
USA

Bio
Dr Julie Meachen completed her PhD in Ecology and Evolutionary Biology at the University of California in 2008 and is now a vertebrate palaeontologist and functional morphologist specialising in mammalian carnivores at Des Moines University.

Research Objectives
Dr Meachen and her team are currently working at Natural Trap Cave, where they are excavating Ice Age mammals to determine how climate change and the extinction events at the end of the Pleistocene (10,000 years ago) have affected the morphology and ecology of living and Pleistocene species. Additionally, they are using a microfaunal and pollen record to recreate the Pleistocene climate in mid-latitude North America.

Funding
National Science Foundation (NSF)

Collaborators
• Dr Alan Cooper – Australian Centre for Ancient DNA (ancient DNA analysis)
• Dr Pennilyn Higgins – U Rochester (stable isotope analysis)
• Dr Jenny McGuire – Georgia Tech (microfaunal analysis)
• Dr Cory Redman – Drake University (mammal ecology) – former postdoc
• Dr Susumu Tomiya – DMU (mammal morphology & ecology) – current postdoc
• Dr Tom Minckley – U Wyoming (pollen analysis)

Q&A
What first interested you in Natural Trap Cave (NTC)?
I knew about NTC, but never gave it much thought until I met my Co-PI Dr Alan Cooper at a group meeting at the National Evolutionary Synthesis Center in Durham, NC. He suggested we go back to NTC to get ancient DNA from the source. After that, it took us a few tries to get funding, but once we did, the project was off and running.

How does your work at NTC change our understanding of the current extinction crisis and climate change?
Our work explores the link between genetics, morphology, and the environment. So hopefully we will be able to answer questions about how climate affects genes, morphology, and the likelihood that a species will go extinct.

What can we learn about modern Grey wolf populations from this research?
Hopefully this work will give us insights into when Grey wolves came into North America, and it has opened the door to questions about how widespread the Beringian morph was in North America. It makes me wonder if there are any living Grey wolves out there that may carry some hidden Beringian alleles in their DNA.

Why do you think it is important to engage children in science?
Children are the next generation of adults, and right now, it is more important than ever to pique their curiosity about the natural world and encourage them to ask questions about how the world around them works. I would really love to inspire future generations of scientists to carry on important work on climate change and animal/habitat extinctions.

What do you hope to achieve with your next excavation in 2017?
We hoped to fill in some gaps in the microfauna fossil record that we had previously missed – and we collected about 40 bags of cave dirt to do just that. Hopefully we will get the microfauna we need! We also were looking for some good ancient DNA specimens of wolves, coyotes, bison, and horses – and I think we've also got some good candidates there!
Earth and Environment ︱ Edward E. DeLuca
The Sun's crowning glory: observing the corona
How do you observe a faint light next to a very bright one? Put simply, with great difficulty. Our knowledge of the outer atmosphere of the Sun – called its corona – is still fairly limited, due to difficulties observing this part of the Sun. Fortunately, solar eclipses (where the Moon and Sun perfectly overlap ‘in syzygy’) provide scientists, such as Dr Edward DeLuca at the Smithsonian Astrophysical Observatory, with an opportunity to explore this unknown phenomenon. During the “Great American Eclipse” in 2017, that’s exactly what he did.
The AIR-Spec on-board the GV aircraft.
One of science fiction's greatest and most-beloved movies – 2001: A Space Odyssey – opens by showing a scene where the Sun and the Moon line up perfectly to form a solar eclipse. In August 2017 in America, this rare phenomenon could actually be seen in person. Beyond its aesthetic appeal, a solar eclipse offers a rare opportunity for researchers to observe certain solar and atmospheric phenomena that cannot normally be seen because of the Sun's light. Having the Sun blocked out allows scientists to explore a relatively unknown aspect of the universe we live in – the Sun's corona.

M-M-M-MY SUN'S CORONA
The Sun's corona is simply its outer atmosphere. This comprises plasma that extends millions of kilometres into space. During a total eclipse, the Moon completely obscures much of the Sun's light, revealing the innermost part of the corona – a difficult thing to observe under ordinary circumstances. Although scientists now understand a lot about the makeup of the Sun, there are still many fundamental questions that have eluded viable explanation. For instance, we have very little understanding of the magnetic field of the corona, which is believed to hold the key to unravelling the mysteries of its many solar activities. Dr Edward DeLuca, from the Smithsonian Astrophysical Observatory (SAO), and his team of scientists have utilised this rare solar eclipse opportunity to plan and execute an experiment that will advance our knowledge of the Sun's corona.

WHAT GOES ON INSIDE THE CORONA?
Upon closer examination, the corona, composed of hot plasma, is a lustrous whorl of solar activity. It is a chaotic display of solar flares igniting and radiating randomly. Imagine hundreds of fireworks exploding constantly around the Sun. Energy flows from the photosphere to the corona in the form of waves, as well as heat caused by 'braiding' or tangling of magnetic fields in the corona. This heats the corona, which accelerates the solar wind. Occasionally, high-energy material hurtles across space, referred to as a coronal mass ejection or CME. These occur when magnetic energy stored in the corona becomes unstable. Hence, understanding the coronal magnetic field will lay the foundation to predict the volatile nature of the corona.
RESEARCHING THE SUN
A solar eclipse presents an opportunity to directly measure the magnetic fields emanating from the corona. Since magnetic fields in the corona control the origin of the solar wind and the stability of the active regions that result in flares and CMEs, magnetically sensitive infrared (IR) lines are the best candidates for making accurate measurements. IR is invisible radiation at wavelengths just beyond the red end of the visible spectrum. The measurement method proposed by Dr DeLuca and his team was to perform IR spectroscopy of the coronal fields. In simple terms, IR spectroscopy passes the IR radiation emitted from the corona through a device called a spectrometer, which separates the light into different wavelengths. The resulting signal, detected using a high-resolution camera, produces a spectrum that shows absorption and emission lines at particular IR wavelengths. These data can be analysed to infer detailed information about the corona.

THE AIR-SPEC PROJECT
Dr DeLuca and his team proposed the Airborne Infrared Spectrometer (AIR-Spec) project to maximise the window of opportunity created by the Great American Eclipse. The objective of this National Science Foundation (NSF)-funded project was to characterise five magnetically sensitive coronal emission lines, which are difficult to observe. Since the spectroscopic signals are very weak, the AIR-Spec system had to be extremely stable and sensitive. As such, the design and data acquisition instrumentation had to be meticulously constructed.
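Locating emission lines in a recorded spectrum is, at heart, a peak-finding problem. The following simplified sketch runs on synthetic data with invented line positions; it is not the AIR-Spec reduction pipeline:

import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(42)
wavelength = np.linspace(1.0, 4.0, 3000)        # microns (synthetic grid)
line_centres = [1.43, 1.75, 2.22, 2.84, 3.93]   # invented line positions

# Synthetic spectrum: flat continuum + Gaussian emission lines + noise.
spectrum = 1.0 + rng.normal(0.0, 0.02, wavelength.size)
for centre in line_centres:
    spectrum += 0.5 * np.exp(-((wavelength - centre) ** 2) / (2 * 0.005 ** 2))

# Find statistically prominent peaks and read off their wavelengths.
peaks, _ = find_peaks(spectrum, prominence=0.2)
print("detected lines near:", np.round(wavelength[peaks], 2), "microns")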
A large part of this responsibility fell to Ms Jenna Samra, a Harvard graduate student and pre-doctoral fellow in Dr DeLuca's team, who was instrumental in developing the apparatus. This kind of teamwork is something Dr DeLuca feels strongly adds to his team's success, and he cites Dr Phil Judge of the National Center for Atmospheric Research's (NCAR) High Altitude Observatory (HAO), a developer of IR coronal spectroscopy, as a key collaborator. At ground level, water vapour in the atmosphere significantly limits transmission at certain IR wavelengths. To overcome this obstacle, the experiment was devised to be carried out on board an aircraft flying at 45,000–50,000 feet. To achieve this, Dr DeLuca enlisted the NSF/NCAR Gulfstream-V High-performance Instrumented Airborne Platform for Environmental Research (GV) aircraft (see above image). Any sensitive instrument on board a moving plane experiences fluctuations. Such variations and other vibrations had to be tested and dampened to ensure satisfactory performance of the AIR-Spec. To extract the most data within the short time frame of the solar eclipse, the flight path of the GV was chosen such that the conditions were favourable for the measurements. To accommodate this objective, the instrument design included a GPS pointing and tracking system based on an ephemeris (the calculated positions of a celestial object at regular intervals), and AIR-Spec was designated to fly along a path of maximum eclipse duration. Two separate trial flights were piloted in December 2016 and July 2017, after which several modifications and fine-tuning were carried out to better stabilise the AIR-Spec. TAKING FLIGHT The AIR-Spec project took flight at the onset of the "Great American Eclipse" on 21st August 2017. With the spectrometer on board, the GV aircraft set forth tracing the path of totality of the eclipse. During the flight, measurements were performed in different parts of the corona. The GV completed the flight within 10–19 seconds of the projected time frame. Analysis of the spectroscopic data clearly showed the five coronal emission lines that were predicted, and a paper detailing the results of the AIR-Spec project is due to be published later this year.
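As a flavour of what ephemeris-based pointing involves – a sketch using the open-source astropy library rather than the AIR-Spec flight software, with an invented time and aircraft position – one can compute the angles at which a tracker must hold the Sun:

from astropy.coordinates import EarthLocation, AltAz, get_sun
from astropy.time import Time
import astropy.units as u

# Hypothetical aircraft state near the 21 August 2017 path of totality.
when = Time("2017-08-21 17:45:00")   # UTC
where = EarthLocation(lat=43.8 * u.deg, lon=-111.9 * u.deg,
                      height=14000 * u.m)   # roughly 45,000 feet

# The Sun's ephemeris position, transformed into local pointing angles.
sun = get_sun(when).transform_to(AltAz(obstime=when, location=where))
print(f"Point to altitude {sun.alt:.2f}, azimuth {sun.az:.2f}")

Recomputing such angles continuously as the aircraft moves, and feeding them to the stabilisation system, is the essence of this kind of tracking.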
Above: The GV aircraft. Below: AIR-Spec installed, ready for the flight.
LOOKING AHEAD Through AIR-Spec, Dr DeLuca and his team have built a highly stable and mobile spectroscopic system that can be deployed for various research efforts. The image stabilisation system is independent of the target, so it can be adapted for any remote-sensing application. Future eclipse observations are enabled by the GV's ability to travel anywhere. For example, the 2nd July 2019 eclipse
in the South Pacific off the coast of Chile is an obvious candidate for future flights. The emission lines observed by the AIR-Spec project will help guide future observing programmes. Through the work of Dr DeLuca and his collaborators, the AIR-Spec project has laid the foundations to explore the Sun's corona in more detail, thereby expanding our understanding of the Sun, its effects on Earth and, ultimately, the habitability of our planet.
Behind the Bench
Edward E. DeLuca
E: edeluca@cfa.harvard.edu
T: +1 617 496 7725
W: http://hea-www.harvard.edu/~deluca/HomePage.html
Smithsonian Astrophysical Observatory, MS 58, 60 Garden St, Cambridge, MA 02138, USA

Bio
Dr DeLuca is widely recognised for his theoretical work on the magnetic structure of solar active regions and magnetic activity in the Sun and stars, with work ranging from instabilities in hot magnetised plasmas to magnetic field production in rapidly rotating convective stars. He has served as the science lead for several missions and instrumentation projects. Dr DeLuca is active in the Heliophysics community, serving as chair of the NASA Heliophysics Division Roadmap committee and as a science editor for the Astrophysical Journal for five years. He has published more than 30 papers in the past five years.

Research Objectives
Dr DeLuca's recent project aimed to observe five magnetically sensitive emission lines in the Sun's corona by designing and flying a highly advanced imaging spectrometer to make measurements during the 2017 solar eclipse.

Funding
NSF Major Research Instrumentation grant, NSF AGS-1531549

Collaborators
Jenna Samra – Harvard University School of Engineering and Applied Science & SAO pre-doctoral fellow; Phil Judge – High Altitude Observatory, National Center for Atmospheric Research; Lou Lussier – Research Aviation Facility, National Center for Atmospheric Research; Peter Cheimets – CfA Project Engineer/PM; Vanessa Marquez – CfA Lead Mechanical Engineer; Alisha Vira – Smith College undergraduate, Summer Intern at SAO; Chad Madsen – Postdoc at SAO

Q&A
If the footprint of the spectroscopy/imaging apparatus were to be made more compact, would that allow for drone-type flights? And if so, would that enable better spectroscopic measurements?
The best option for an autonomous instrument would be on a high-altitude balloon. These can carry heavy instruments, fly at ~150,000 feet and have been successfully used for solar observations in the past. We are in discussions with colleagues at HAO about a future balloon project.

Are there other methods you could use to measure the spectrum instead of a spectrometer?
It's a bit of a tautology – if you are measuring the spectrum, it is a spectrometer. Our spectrometer uses a grating. There are other types of spectrometers that use filters or interferometric techniques. The NCAR ground-based NAI is a Fourier-transform IR spectrometer that uses interferometry.

Is a solar eclipse the only occasion when you can perform these coronal measurements? Why is that?
It is not the only occasion, but you need to block out the photospheric emission to about one part in 10^7. There are telescopes called coronagraphs that create artificial eclipses using an "occulting" disk to block the photospheric light. We can't use one in the GV because the window in the plane body would cause scattering at a level higher than this. The balloon experiment could support a coronagraph. Also, the new NSF-funded 4m Daniel K. Inouye Solar Telescope on Maui will be able to observe the solar corona in the IR without an occulting disk – that was a huge design challenge. The University of Hawaii has a prototype 0.5m telescope that has taken these measurements from Maui.

Is it safe to assume the technology of the instrumentation (camera/spectrometer/stability etc.) is more limiting than the vehicle? Would there be any benefits to mounting the AIR-Spec on a satellite or other space vehicles?
Balloon instrumentation and satellite instrumentation are much more expensive than airborne instruments. The conditions that they need to endure are unlike the lab setting or airplane cabin, and they need a software interface for operating remotely or via commanding from the ground. Once we have characterised the emission properties of these coronal lines with AIR-Spec, we can understand what would need to be built for a balloon or satellite and determine if those instruments are viable.

Are models from combustion spectroscopy/coherent anti-Stokes Raman spectroscopy useful in understanding coronal flares?
Probably not – the conditions in the corona are close to the best vacuums achievable in the lab. We can detect signals because of the very long path lengths through the coronal structures. The light emitted is not re-absorbed by gas in front of it, so we see a sum of all the light emitted along the "line of sight" – this can be hundreds of megametres in the corona. We have extensive modelling of coronal emissions that is constrained by laboratory experiments, atomic physics models and solar observations in the EUV and soft X-rays. The same tools can be applied to the IR coronal lines, but observations are needed to check the model predictions.
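For readers curious about the grating Dr DeLuca mentions, a short worked example – with illustrative numbers, not AIR-Spec's real specification – shows how the grating equation, d·sin(θ) = m·λ, sets the angle at which a given infrared wavelength emerges:

import math

grooves_per_mm = 100
d = 1e-3 / grooves_per_mm   # groove spacing in metres (10 microns)
wavelength = 1.43e-6        # a ~1.43-micron coronal IR line, for illustration
m = 1                       # first diffraction order

theta = math.degrees(math.asin(m * wavelength / d))
print(f"First-order diffraction angle: {theta:.2f} degrees")   # about 8.2

Nearby wavelengths emerge at slightly different angles, which is precisely how the spectrometer spreads the coronal light into a measurable spectrum.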
Thought Leader
BAS: Investigating icy waters with Boaty McBoatface Oceans are not only filled with many weird and wonderful creatures; they can also slow down climate change, storing human-produced carbon and heat in their depths. Understanding how this process happens is vital to predicting the impact climate change will have over the coming years. Professor Mike Meredith, science leader at the British Antarctic Survey, focuses on this area – investigating dense waters as they flow from Antarctica into the Atlantic Ocean, as a participant in the Dynamics of the Orkney Passage Outflow (DynOPO) project.
Back in 2014, construction began on a new polar research vessel for the British Antarctic Survey (BAS), to replace two existing ships – the RRS James Clark Ross and the RRS Ernest Shackleton. Fast forward two years, and the Natural Environment Research Council (NERC), the institution in charge of the construction, set up an online poll asking members of the public to suggest potential names for this replacement ship. The RRS Boaty McBoatface quickly became a firm favourite with the public, but ultimately a name was selected that honours Britain's much-loved naturalist, Sir David Attenborough. Nonetheless, due to the widespread publicity received from the naming campaign, the moniker Boaty McBoatface was given to one of the craft's underwater vehicles instead. And, in 2017, it embarked on its first mission.

Professor Mike Meredith is an oceanographer and science leader at the NERC's BAS. He recently spoke to us at Research Outreach to discuss his latest research venture – the DynOPO project – highlighting the impact Boaty McBoatface has had on improving public appreciation of polar research.

Can you explain what the DynOPO project was? What was the motivation for and background of the project, and what was its goal?
The oceans exert a huge influence on our planet's climate, by sucking down heat and carbon from the atmosphere, and storing them in the ocean depths for decades or even centuries. This does us humans a big favour, by slowing the rate of global warming – but we need to know more about how it works, so that we can predict it better. A particular focus for us is the waters that form close to Antarctica. These are made incredibly dense by interacting with the freezing atmosphere and ice, and they sink to the seabed and spread out to become the abyssal waters across most of the globe. These waters have warmed in recent decades, and we don't really know why – but we need to figure it out, so that we can better predict how they will change in future. This matters for several reasons, including the global heat budget and sea level rise. The DynOPO project was created by scientists at the University of Southampton, BAS and colleagues in the USA to study these dense waters as they flow northward from the Antarctic into the Atlantic Ocean, and what happens to them when they cross an underwater mountain chain called the South Scotia Ridge. We believe that the contorted pathways the water takes as it flows over and around these mountains lead to a lot of mixing, and that this mixing might change over time. We hope to find out exactly how and why this happens, and what it means for the role that these deep waters play in climate change.

You recently lived and worked on board the British Antarctic Survey (BAS) research ship James Clark Ross. Can you describe what life was like there? What did your average day entail? How long were you aboard?
Actually, I was the unlucky one – whilst I
am an Investigator on the project, I was not participating in the fieldwork myself (too many other responsibilities!). But I've sailed on the James Clark Ross many times, so I know what the field party will have gone through. It's actually a very comfortable ship, with all mod cons and some of the most advanced science equipment that marine scientists could wish for. Science expeditions to the Southern Ocean are hugely exciting of course – not just for the chance to make breakthroughs in the things we are studying, but also because of the environment around us: the scenery can be amongst the most spectacular in the world, and the richness of the wildlife is staggering.

Professor Mike Meredith of the British Antarctic Survey

Life on board typically settles into a routine quite quickly, and things tend to revolve very much around mealtimes. The food is normally very good, and plentiful – scientists often leave expeditions several pounds heavier than when they start! Work will have been full-on – the ship works around the clock, so the scientists split into shifts, with some working nights to ensure that data collection never stops. The ship collected data continuously, even when it was steaming along between target sites, but many of the key measurements required the ship to be stopped and equipment lowered into the ocean, sometimes down to a couple of miles or deeper. Water samples were collected and analysed in the ship's laboratories, and a great deal of computer-based work was carried out to make sense of all the data as it was collected. Expeditions on James Clark Ross are typically a few weeks long; the DynOPO expedition was a long one, at around seven weeks in total. This was excellent – it offered scope to collect a huge and unique dataset with which we can tackle the questions we are trying to answer.

What are the main challenges of carrying out research in the Antarctic?
Antarctic fieldwork in general is challenging because of the harshness of the environment, which must be treated with utmost respect. Ship-based fieldwork brings its own challenges – the seas around Antarctica can be some of the roughest in the world, so you are working in an environment that can make
you feel nauseous just by being there. It's also the case that you are working in close confines with your colleagues for several weeks, so a lot of tolerance and patience is required by all. And simply being away from family and loved ones for such a long period can be emotionally challenging. But typically, a camaraderie develops on board, and people enjoy working together in a team on problems that they are all interested in – so whilst the challenges are undoubtedly real, people usually deal with them very well.

The National Oceanography Centre's Autosub Long-Range (also known as Boaty McBoatface) being loaded onto the BAS vessel RRS James Clark Ross. Photo by Dr Povl Abrahamsen, BAS

Boaty McBoatface was one of the research tools at your disposal on this trip. Can you tell us a bit about Boaty's mission? And what impact has the publicity generated by Boaty had on the research mission?
Boaty was one of the key tools that DynOPO used – it was deployed into one of the key deep gaps in the underwater mountain chain through which the dense water flowed, and it completed missions in and around that gap to collect data on ocean temperature, how salty the water is, how much it is mixing, and so on. By being able to stay submerged for days or even weeks, it could build up datasets of a complexity and detail that has never been possible before – so it enables a real leap forward for the science. The publicity surrounding Boaty was wonderful – the way it caught the public's imagination gave us scientists the chance to engage
with a much wider section of the general community than we would otherwise have been able to, and explain the science we are doing to them, and why it matters.

Boaty McBoatface on the deck of the James Clark Ross, prior to the DynOPO expedition. Photo by Dr Povl Abrahamsen, BAS

Can you tell us about some of the other research tools and processes you used?
The workhorse of the science we conduct is called a "CTD" (Conductivity-Temperature-Depth instrument). It is basically an extremely advanced thermometer that is lowered on a wire from the ship down to the seabed. (It also measures salinity and a number of other things that we care about.) It collected samples of water that we brought on board and measured in the ship's labs. There are other instruments we used too, including free-fall probes for measuring mixing – these are nerve-wracking, because they aren't tethered to the ship, so each deployment is a heart-in-mouth experience. Luckily, they are normally well-trained about coming back when they should.
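To give a flavour of what happens to CTD readings afterwards, here is a minimal sketch assuming the open-source Python gsw (TEOS-10) package, with invented values rather than DynOPO data: practical salinity is derived from the conductivity, temperature and pressure the instrument records.

import gsw

C = 29.0    # conductivity in mS/cm (hypothetical deep-water reading)
t = 0.2     # in-situ temperature in degrees Celsius
p = 3000.0  # sea pressure in dbar, roughly 3 km down

SP = gsw.SP_from_C(C, t, p)   # practical salinity on the PSS-78 scale
print(f"Practical salinity: {SP:.3f}")

Salinity and temperature together determine the water's density, which is what makes these measurements so central to tracing dense Antarctic waters.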
What is the wider significance of understanding the complex physical processes occurring in the Southern Ocean?
The Southern Ocean is key to the functioning of all of Planet Earth. It is the main site globally where deep waters from 1–2 km down rise to the surface and can interact with the atmosphere and the ice; once they have done this, they sink back into the ocean interior for very long periods. This means that the Southern Ocean can draw down heat and carbon from the atmosphere much more effectively than other regions, and hence can slow the rate of climate change. This matters for societies in all parts of the planet, but we need to know more about how it works, so that we can improve how well we can predict it.

And finally, what initially triggered your interest in polar ocean research?
I originally trained as a physicist, which seemed to involve a lot of time spent working in darkened laboratories, but I have always been fascinated by extreme environments. Like most people, I was amazed by the early documentaries showing the wildlife and environment
of Antarctica – so having finished my physics degree, I jumped at the chance to study for a PhD in Antarctic oceanography. Soon after that, a job became available at the British Antarctic Survey, and the rest is history! • To find out more information about the DynOPO Project, or about the BAS in general, please visit their excellent website at www.bas.ac.uk.
Professor Mike Meredith British Antarctic Survey High Cross, Madingley Road Cambridge CB3 0ET United Kingdom E: information@bas.ac.uk T: +44 (0)1223 221400 W: www.bas.ac.uk
Biology ︱ Dr Karen Maruska
Unravelling the signalling cues controlling vertebrate reproductive behaviour How do vertebrate brains integrate information from external social cues and internal physiological states to produce appropriate behaviours? This is one of the big questions that Dr Karen Maruska and her research team at Louisiana State University (LSU) are striving to answer. Dr Maruska leads a research group that uses fish models to investigate how animals process and translate multisensory social cues into context-specific behaviours for reproduction and survival.
Animals, including humans, live in a multisensory world, using many sensory channels to communicate during crucial behavioural situations. External multisensory, or multimodal, signals include visual, chemosensory (smell and taste), auditory (sound), tactile (touch), and mechanosensory (e.g., pressure or vibration) cues. These signals convey crucial information about the sender's status, and must be integrated with the receiver's internal physiological state for translation into adaptive behaviours, e.g., those involved in courtship and reproduction.
THERE'S SOMETHING FISHY GOING ON AT LSU Dr Maruska's team uses the African cichlid fish Astatotilapia burtoni as a research model. Fish serve as an excellent model for this work because they are the largest and most diverse group of vertebrates, they have well-described social behaviours, and they are easily manipulated during experimentation. The availability of such a model provides an opportunity to study basic neuron and sensory function, and how these functions relate to proximate and ultimate behavioural mechanisms in comparative and evolutionary contexts. Specifically, the group studies how the cichlid brain processes unimodal and multimodal sensory information, how sensory systems contribute to behaviour, and how natural fluctuations in the cichlid's internal hormonal or nutritional state can influence neural function and behavioural outcomes. In order to uncover new insights into the mechanisms regulating animal reproductive behaviour, the team uses a combination of approaches, including hormonal assays, sound recordings, advanced microscopy, brain recordings, molecular techniques, and behavioural analysis.
Dominant Astatotilapia burtoni males (shown here) are brightly coloured and defend territories that they use as spawning sites. Males signal to females during courtship with visual displays, sounds, chemical cues, and water movements. These multimodal signals are then used by females to make behavioural decisions. Photo credit: Karen Maruska.
TO MATE OR NOT TO MATE? During the reproductive process, a dominant male cichlid fish becomes brightly coloured and performs courtship behaviours to tempt passing females into his territory to spawn. A sexually receptive (gravid) female enters the territory, lays eggs, and immediately picks them up in her mouth. Following this, the male quivers his body towards the female, which results in her nipping at his so-called egg spots, which look remarkably similar to female eggs. This act stimulates the male to release sperm, which then fertilises the eggs already present in the female's mouth. The fertilised eggs are then reared inside the female's mouth (mouth brooding), and the young are released after approximately two weeks.
ACOUSTICS IN CICHLID REPRODUCTION As a postdoctoral scholar at Stanford University, Dr Maruska used sound recording analysis to reveal that male cichlids deliberately produce courtship sounds when close to a gravid female, and that these sounds are spectrally compatible with the female’s hearing abilities. Females were more sensitive to these sounds when they were gravid, and this coincided with an increase in the levels of the primary female sex hormone estradiol, and an increase in levels of the enzyme that produces estradiol (aromatase) in several auditory processing and decision-making regions in the brain. Behavioural experiments revealed that gravid females were more attracted to male courtship sounds than to unspecific noise, highlighting the importance of acoustic cues in female mate choice.
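As a rough sketch of how such a comparison can be made – my illustration rather than the lab's actual analysis, with a hypothetical mono recording and an assumed sensitivity band – one can ask how much of a courtship sound's energy falls where the female hears best:

import numpy as np
from scipy.io import wavfile
from scipy.signal import welch

rate, sound = wavfile.read("courtship_call.wav")   # hypothetical recording
freqs, power = welch(sound.astype(float), fs=rate, nperseg=4096)

# Assume peak hearing sensitivity in a low-frequency band (illustrative).
band = (freqs >= 100) & (freqs <= 1000)
fraction = power[band].sum() / power.sum()
print(f"Energy in the 100-1000 Hz band: {fraction:.1%}")

A courtship sound whose energy is concentrated in the receiver's most sensitive band is, in this sense, "spectrally compatible" with her hearing.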
How do females use male courtship signals, and where in the female brain are these signals integrated with her internal state to produce appropriate behaviours? In other words, how does the female decide whether or not to mate with a certain male? These questions are at the centre of Dr Maruska’s work.
This work provided the first evidence for the importance of acoustic communication as part of a multimodal signalling repertoire during cichlid reproduction. It also demonstrated that perception of such acoustic information changes, depending on the receiver’s internal physiological state (i.e. the female hormonal state).
The role of visual signals such as colours and the male body quiver in cichlid mating has been intensively studied; yet researchers have never been able to explain the diversity of cichlids by visual communication alone. During courtship, female cichlids are exposed to a wide array of stimuli by dominant males, including colours, movements, sounds, and chemicals, and up until a few years ago, the role of other sensory channels, e.g., the auditory system, was virtually unexplored. Indeed, textbook discussions on the comparative neural control of communication during reproduction are essentially non-existent. The work of Dr Maruska and her group has made significant advances in this area.
THE ROLE OF CHEMOSENSORY SIGNALLING IN CICHLID SOCIAL INTERACTIONS Dr Maruska’s research also demonstrated that dominant male cichlids modulate their urine release in both reproductive and territorial situations, suggesting that urine might be an important social signal. Males released urine sooner and more frequently when visually exposed to gravid females, and the combination of visual and chemical signals resulted in ten times more courtship behaviour than visual cues alone.
Figure 1 – During courtship, males send and females receive different types of information via multiple sensory channels. Females must then integrate these signals with their own internal physiological state to make context-specific behavioural decisions.
Along with the work on acoustic communication, these findings further highlight the role of non-visual sensory modalities in reproductive behaviour. Remarkably, reproductive and territorial behaviours were enhanced when males were simultaneously exposed visually
and chemically to other fish, as opposed to visual exposure alone, demonstrating the power of multimodal signalling in regulating behaviour. This work is timely and will transform our current understanding of how sensory inputs, reproductive state, and behavioural circuits interact in the vertebrate brain. Recent work done by a PhD student in the lab, Karen Field, has shown that females also use chemosensory signalling in the presence of dominant males and mouth-brooding females (a sign of aggression), and that this coincides with activation of highly conserved social decision-making regions of the brain. A senior research associate in the group, Dr Alexandre Nikonov, is also recording from single neurons in these same brain regions to determine how this chemosensory information released by females is processed in the male's brain. Coupled with the previous work, these findings illustrate true chemosensory communication in both sexes of a single fish species, and reveal neural substrates (i.e., parts of the brain) that mediate sexual and aggressive social behaviours in females.

Figure 2 – The Maruska lab uses an integrative approach that includes behavioural assays, neural activation, and neural recording studies to investigate where and how unimodal and multimodal sensory information is processed in the fish brain to generate adaptive behaviours in different social contexts.

MAPPING THE BRAIN'S RESPONSE TO MULTIMODAL STIMULI Building on previous accomplishments, Dr Maruska now leads her team (PhD student Teisha King, and several
undergraduate researchers) on an ambitious National Science Foundation-funded project that will use behavioural, cellular, and molecular analyses to shed further light on how multimodal signals are represented in the brain of the female cichlid, as a model for all vertebrates. Specifically, this project aims to identify the neural substrates that mediate behavioural decisions based on the reception of multimodal signals, and whether neural activation patterns are influenced by the female reproductive state. Preliminary findings revealed clear differences in brain activation between females exposed to courting males and other females, allowing the group to identify parts of the brain involved in receiving and processing such signals. WHAT MIGHT THIS MEAN FOR SOCIETY? Dr Maruska's work has shown that communication in cichlid fish is multimodal and nonredundant in both sexes, whereby each sensory channel (visual, acoustic, or chemosensory) conveys a distinct message. The work also found that when visual and acoustic, or visual and chemosensory, information is conveyed simultaneously, visual information dominates. The group is also currently investigating how anthropogenic noise (noise caused by humans) might influence fish behaviour, physiology, and sensory abilities, a project led by PhD student Julie Butler. This work could have implications for how we consider the consequences of environmental noise, urbanisation, and climate change on reproduction and survival in fishes and other vertebrates.
Female cichlids are mouth brooders, and they must be able to rapidly change their eating behaviours during the reproductive cycle. Once brooding begins, they rapidly cease eating to protect the developing young; once the young are released, they resume eating to regain energy for subsequent breeding attempts. What controls these switches, and how exactly does the brain control the female urge to eat? These fascinating questions are another current NSF-funded focus in Dr Maruska's group (in collaboration with Dr Suzy Renn, Reed College), and the work will shed light on the neural basis of feeding and maternal care behaviours. The answers may even improve our understanding of human eating disorders. The long-term goal of Dr Maruska's research is to gather a complete picture of how a species communicates in reproductive and aggressive contexts using multisensory systems, and how this sensory processing and behaviour can be influenced by the animal's internal physiology, such as hormone levels, reproductive state, or social status. All animals live in a multisensory world, sending and receiving information in multiple sensory channels, yet many previous studies examine only a single sensory modality at a time. Accounting for multisensory signals and inputs better represents an animal's natural world, providing meaningful new data on how animals use this information for behavioural decisions. Deciphering how all of these processes work will significantly advance our understanding of how the vertebrate brain regulates social behaviours, and will likely overlap with other research disciplines, such as psychology, evolution, and cognitive neuroscience.
Behind the Bench Dr Karen Maruska
E: kmaruska@lsu.edu T: +1 225 578 1738 W: http://www.lsu.edu/science/biosci/faculty_and_staff/maruska.php W: http://www.kmaruska.biology.lsu.edu/ W: https://burtoniblog.wordpress.com/ W: https://vimeo.com/182046796 W: https://www.lsu.edu/science/biosci/
Research Objectives Dr Maruska and her team’s research aim is to use fish models in order to gain insight on the basic mechanisms of how the brain functions and adapts to an animal’s constantly changing external environment and internal physiological state. Funding National Science Foundation (NSF)
Collaborators
• Dr Alexandre Nikonov (Louisiana State University)
• Dr Suzy Renn (Reed College)

Bio
Karen Maruska is an Assistant Professor of Biological Sciences at Louisiana State University. She received her PhD from the University of Hawaii and was a postdoctoral scholar at Stanford University. Her research uses fishes as vertebrate models to study animal communication, sensory system plasticity, and the neural basis of social behaviours.

Contact
Karen P. Maruska, PhD
Assistant Professor
Department of Biological Sciences
Louisiana State University
A345 Life Sciences Annex Building, Baton Rouge, LA 70803, USA

Q&A
Why have you specifically chosen the African cichlid fish as a model organism for your research?
First, they are very social fish with now well-characterised territorial and reproductive behaviours. For our sensory work specifically, it's important that they use multisensory communication in different behavioural contexts. They are relatively easy to put in different types of social situations to examine neural and physiological correlates of behaviours. There is a lot of background information on this species from several different research labs, making it an important emerging model in behavioural neuroscience. Second, they have a sequenced genome, making molecular and genetic studies possible, as well as comparative and evolutionary research. Having resources available from whole-animal behaviour down to molecular-level analyses in a single species allows for significant advances in the fields of social behaviour and communication.

What has been the biggest technical challenge in your research to date?
One of the biggest challenges in sensory behaviour experiments is creating the right experimental conditions so that the fish behave normally while ensuring that the sensory exposures are correct. We perform lots of pilot studies before deciding on the appropriate experimental protocols to ensure we collect the most meaningful data possible. Another challenge is interpreting brain activation data in the fish, because their forebrains develop differently from those of higher vertebrates like mammals. This makes it difficult to discuss homologies across taxa when it comes to functions of specific brain regions, but one goal of our work is to make advances on this front as well.

To what extent do the behaviours observed in the controlled laboratory environment mimic real-life situations for these fish?
Our experiments are purposely designed to be close to the natural situations that these fish may encounter in the wild. Neural mechanisms of behaviour are only meaningful in the context of the behaving animal, so it is important to keep experimental variables as biologically relevant as possible. While there are always some limitations to conducting experiments in the lab, the reproductive and aggressive behaviours displayed by this species are similar in aquaria and in their natural habitat of Lake Tanganyika, Africa.

Do you have plans to explore the influence of epigenetics on behaviour in your model?
We have discussed the possibility of examining epigenetics, and there are likely epigenetic mechanisms involved in many aspects of this fish's behaviour and physiology. The Fernald Lab at Stanford University has done some work on epigenetics in this species, but there is certainly lots of opportunity to investigate this in the future.
What relevance, if any, can this research have for fish conservation practices?
Our work on sensory communication and behaviour can have important implications for fish conservation, management, and aquaculture. For example, environmental changes associated with climate change, urbanisation, and pollution can have detrimental effects on the ability of fishes to sense and react to prey, predators, and mates, all of which is crucial for survival and species persistence. A fundamental understanding of how fishes use different sensory channels for communication and survival is a necessary first step towards interpreting how they may be affected by environmental disturbances and how they may be able to adapt. This can contribute to research-based guidelines for species management and aquaculture practices to improve species survival.
Biology ︱ Dr Kirstin Gutekunst
Energising life on earth: the third way Almost all living organisms on earth get their energy, ultimately, from the sun. Energy is fixed in carbohydrates by plants and cyanobacteria during photosynthesis, then both animals and plants release it by breaking down those carbohydrates. Until now, only two main routes of carbohydrate breakdown were thought to be present in cyanobacteria and plants. However, Dr Kirstin Gutekunst, of Christian-Albrechts-University of Kiel, Germany, has found that a third pathway – the Entner-Doudoroff pathway – also plays a vital role in carbohydrate breakdown in cyanobacteria and plants.
The processes of fixing solar energy as carbohydrate by photosynthetic organisms (cyanobacteria, algae and plants), and its subsequent breakdown to release energy, water and carbon dioxide, are central to life on earth. They have been subject to great amounts of scientific study, and it was thought that the pathways and reactions of both had been well established. However, Dr Gutekunst's work has found that, at least in cyanobacteria and plants, one glycolytic route has been overlooked.

ENERGY FROM THE SUN All living organisms need two things to survive: a source of energy, and a source of organic carbon for building cells. Both of these are fixed by plants during photosynthesis – making this process essential to life on earth. During photosynthesis, energy from sunlight is used to combine water and atmospheric carbon dioxide into glucose sugars, a form of carbohydrate. The solar energy becomes stored as chemical energy in the bonds between carbon, hydrogen, and oxygen atoms in the glucose molecules. Glucose and its derivatives can go on to be built into larger molecules, such as starch, protein, fat, and even DNA. Alternatively, glucose can be broken down completely again – either within the plant or in an animal that has eaten it – releasing water, carbon dioxide and, crucially, the stored energy. The energy is released in a molecule known as 'adenosine triphosphate' (ATP), which is the ubiquitous energy currency of all cells.

Synechocystis sp. PCC 6803, showing the thylakoid membranes (where photosynthesis takes place), a carboxysome (where CO2 fixation takes place; only one enzyme of the Calvin-Benson cycle, Rubisco, is located in the carboxysomes), glycogen (the storage form of carbohydrates) and the cytoplasm (where the glycolytic routes, TCA cycle and the rest of the Calvin-Benson cycle are located). Life on Earth is essentially driven by a circuit of photosynthesis, which uses energy from sunlight to form carbohydrates, and carbohydrate oxidation, which releases the stored solar energy in the form of ATP.

ENERGY FROM SUGARS It has long been known that there are two different pathways by which animals, cyanobacteria and plants break down glucose: the Embden-Meyerhof-Parnas (EMP) pathway, also called simply 'glycolysis', and the oxidative pentose phosphate (OPP) pathway. However, simpler organisms such as bacteria and archaea are known to employ a variety of routes to release energy from glucose. One of these is the Entner-Doudoroff pathway. Dr Gutekunst's team has now found that the key enzyme of the Entner-Doudoroff pathway, known as KDPG aldolase, is in fact widespread amongst photosynthetic organisms such as cyanobacteria and plants, from mosses to higher plants including rice, barley, maize, banana, potato, spinach, soybean, cotton and tobacco. In barley, their analyses have shown that KDPG aldolase is functional during periods of active growth, such as in germinating seeds and developing roots, suggesting an operating Entner-Doudoroff pathway is present.

Using the photosynthetic cyanobacterium Synechocystis, Dr Gutekunst and colleagues have developed mutants in which each of the three pathways of glucose breakdown is disrupted. They found that growth in the presence of light and glucose was reduced most significantly in those mutants without a functional Entner-Doudoroff pathway. This indicates that the Entner-Doudoroff pathway is not only functional, but is a significant contributor to growth in Synechocystis. The team is now working with Dr Götz Hensel, from the Leibniz Institute of Plant Genetics and Crop Plant Research at Gatersleben, to develop similar 'knockout' mutants in barley to test the significance, behaviour and requirements of the pathway in higher plants.
Analysing Synechocystis cells under the transmission electron microscope can have some surprising results! The big black spots that look like eyes are cyanophycin accumulations (a storage form of nitrogen).
UNIQUE FEATURES So, how does the Entner-Doudoroff pathway differ from the other two pathways operating to break down glucose? There are two crucial differences. First, the Entner-Doudoroff pathway releases less energy from glucose: one molecule of the energy currency ATP per glucose molecule, compared to two in the EMP pathway. Although this might seem disadvantageous at first sight, the Entner-Doudoroff pathway has some advantages. The pathways
by which glucose is built up and broken down to some extent overlap, but with reactions occurring in opposite directions. Thus, during daylight, when photosynthesis is active, the action of the EMP or OPP pathways can undo the reactions occurring in photosynthesis, causing futile cycling between the two processes. Previously, it was thought that cyanobacteria compartmentalise their cellular processes chronologically, focusing on photosynthesis during daylight hours and respiration in the dark, thus avoiding this problem. The big advantage of the Entner-Doudoroff pathway is that it does not overlap with any of the reactions of photosynthesis, allowing cyanobacteria to break down glucose and release energy and cellular building blocks during daylight as well as at night, without risk of futile cycling. Daylight was exactly when Gutekunst's mutant studies showed that Synechocystis released significant amounts of energy from glucose via the Entner-Doudoroff pathway.
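The energy bookkeeping behind that trade-off can be made explicit. The tally below uses standard textbook stoichiometry rather than figures from the study itself: the ED pathway invests one ATP and recovers two, while the EMP pathway invests two and recovers four.

# Net ATP per glucose = ATP recovered - ATP invested (textbook values).
pathways = {
    "Embden-Meyerhof-Parnas (EMP)": 4 - 2,   # net 2 ATP
    "Entner-Doudoroff (ED)":        2 - 1,   # net 1 ATP
}
for name, net_atp in pathways.items():
    print(f"{name}: net {net_atp} ATP per glucose")

The lower yield is, in effect, the price the Entner-Doudoroff pathway pays for a set of reactions that never collides with the Calvin-Benson cycle.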
It has been known for a long time that cyanobacteria and plants possess both glycolysis and the oxidative pentose phosphate pathway as glycolytic routes. However, the Entner-Doudoroff pathway was previously overlooked. (Figure, after Chen et al., PNAS, 2016: cyanobacteria and plants possess three alternative glycolytic routes – the Embden-Meyerhof-Parnas pathway (EMP; glycolysis), the oxidative pentose phosphate pathway (OPP) and the Entner-Doudoroff pathway (ED).)

SUCCESSFUL SYMBIOSIS One of the outstanding questions surrounding the Entner-Doudoroff pathway is why it is found only in plants and bacteria, never in animals. An evolutionary analysis suggests that – like photosynthesis – the Entner-Doudoroff pathway made its way into plants from cyanobacteria via a process of 'endosymbiosis', in which one organism is engulfed by another and part of its genome
becomes permanently incorporated into the host. Interestingly, however, the pathway is not present in all plant species. Crucially, it appears to be missing from the ubiquitous ‘model’ plant species, Arabidopsis (thale cress), used for genetic, biochemical and physiological studies across the world. This may explain why the significance of the pathway has remained overlooked for so long. Having overturned the paradigm of two routes to glucose breakdown in photoautotrophs, Dr Gutekunst now wants to elucidate more fully
the physiology of this third route, integrating this into a full and complete revision of carbon metabolism in cyanobacteria and plants. With Prof Christoph Wittmann of Saarland University, she hopes to clarify the importance of the three different mechanisms of glucose breakdown and the relative carbon fluxes in each, under the full range of conditions that plants and cyanobacteria experience, from light to dark and nutrient-rich to nutrient‑limited. The implications of this research extend beyond fundamental knowledge to important applications, including the potential to manipulate plants and cyanobacteria for biotechnological uses, such as producing fuels including hydrogen (H2) as an energy source, pharmaceuticals or nutrients. It’s high time this overlooked metabolic pathway got a look-in!
Behind the Bench
Dr Kirstin Gutekunst
E: kgutekunst@bot.uni-kiel.de
T: +49 431 880 4237
W: http://www.researchgate.net/profile/Kirstin_Gutekunst
W: http://www.biotechnologie.uni-kiel.de/de/mitarbeiter/kirstin-gutekunst

Research Objectives
Dr Gutekunst's work focuses on the hydrogen and central carbon metabolism in cyanobacteria and plants, particularly the Entner-Doudoroff pathway – an overlooked glycolytic route in cyanobacteria and plants.

Funding
Deutsche Forschungsgemeinschaft (DFG); Bundesministerium für Bildung und Forschung (BMBF)

Collaborators
Co-authors: Xi Chen, Karoline Schreiber, Jens Appel, Alexander Makowka, Berit Fähnrich, Mayo Roettger, Mohammad R. Hajirezaei, Frank D. Sönnichsen, Peter Schönheit, William F. Martin. Collaborators: Alexander Makowka, Berit Bünger, Lars Nichelmann, Dr Götz Hensel, Prof Karin Krupinska, Prof Wolfgang Bilger, Prof Christoph Wittmann, Prof Karl Forchhammer.

Bio
Dr Kirstin Gutekunst is a senior scientist in the group of Prof Rüdiger Schulz, which aims to maximise hydrogen production in cyanobacteria. She was recently awarded a Forschungspreis from the BMBF for her own research, and she additionally holds two DFG grants to study the carbon metabolism in photoautotrophs. Dr Gutekunst is the mother of five children, two of them studying at university and three of them still attending school. Combining family life and science is her passion.

Contact
Dr Kirstin Gutekunst
Christian-Albrechts-Universität zu Kiel
Botanisches Institut und Botanischer Garten
Kiel, Germany

Q&A
Why do you think the Entner-Doudoroff pathway has been overlooked for so long?
Protein sequences from many plants and cyanobacteria have been available for a long time. However, pathways and key enzymes are not automatically eye-catching. You need to search for them. In eukaryotes (animals and plants), routes of glucose breakdown were first studied in animals, which lack this pathway. There was a paradigm that the Entner-Doudoroff pathway is restricted to prokaryotes (bacteria and archaea). It is furthermore missing in the most-studied model plant, Arabidopsis thaliana. I guess as soon as something is accepted to be completely understood, these things are naturally not questioned any longer. As long as we do not stumble upon inconsistencies, we can be collectively blind to misconceptions.

What drew you to study this pathway yourself?
We found the pathway accidentally. I was interested in studying the influence of glycolytic routes in cyanobacteria on their production of hydrogen, which is a gas that can be used as a source of energy in fuel cells. So we started to construct deletion mutants in which we knocked out all known glycolytic pathways. To our surprise, these mutants were still able to enhance their growth on glucose. This was completely contradictory to what we had expected. So we realised that something very basic was missing in the picture that we had of the central carbon metabolism in cyanobacteria.

What role do you think the pathway plays in nature…?
There is exciting work from Flamholz et al (2013), which states that this pathway is especially advantageous when bacteria and archaea are not limited in growth by the ATP yield of their glycolytic route but instead by protein costs. In photosynthetic organisms such as cyanobacteria and plants, it seems most likely that it is important when glucose needs to be broken down in parallel with photosynthesis. And this is due to the fact that only this pathway does not form a futile cycle with the Calvin-Benson cycle of CO2 fixation.

…and how could the pathway be harnessed by humans?
If photosynthetic organisms are exploited to use the energy of sunlight for the production of fuels, pharmaceutics, food additives and cosmetics, it is essential to understand the carbon flow in these organisms. This gives us the opportunity to manipulate the carbon flow in a desired manner. The Entner-Doudoroff pathway might be especially important as it can run in parallel with photosynthesis. So it might be advantageous to overexpress this pathway in order to enhance product yield.

Where do you see your research into the Entner-Doudoroff pathway going next?
I feel that it is absolutely essential to understand the central carbon metabolism of photosynthetic organisms, as basically all life on earth depends on it. We eat plants; drive our cars, aircraft and ships; and heat our houses with fossil fuels that are nothing else but fixed carbon from ancient plants. What I find most thrilling is to unravel the interplay of photosynthesis and the Entner-Doudoroff pathway. It seems that this pathway is physiologically most important under photosynthetic conditions. And what I really love is the hypothesis that photosynthesis, which is an autotrophic process, might be less independent than generally accepted. It might be that this process needs support from a constant supply of glucose breakdown. Future work will show if this view is true.
Biology ︱ Dr Matt Traxler
Unlocking the chemical secrets of microbial conversations Dr Matt Traxler of the University of California, Berkeley, is changing the way we study microbes. Gone are the days of thinking about a single species in a pure culture in the lab – Traxler and his team of graduate students and postdocs are developing a version of mass spectrometry which promises to allow single microbial cells, and their interactions with other species, to be studied under their natural conditions.
Actinomycetes are a group of bacteria with a wide variety of human uses, but they are best known for producing natural products, also known as specialised metabolites, such as antibiotics. These microorganisms also produce many antifungals and immunosuppressants, among other compounds. Not only are they important for use in human health and veterinary practices, their natural products are also useful in agriculture. Actinomycetes are found in a wide range of habitats including soils and marine waters, and in association with plant roots and insects. These microorganisms rarely exist in single-species communities.
High-magnification image of a Streptomyces coelicolor colony producing aerial hyphae (fuzzy texture) and secreting droplets containing the blue-pigmented antibiotic, actinorhodin. © Dr Vineetha Zacharia.
As a result, actinomycetes may use their range of chemical products to interact with other bacteria in their communities. A wide variety of gene clusters that would theoretically allow these microbes to produce compounds never seen in typical lab settings have been observed in their genomes; the majority of these potential products are unknown to us. Usually, in the laboratory these bacteria are studied one species at a time under strictly controlled conditions – conditions radically different from those found in natural settings. Dr Traxler and his colleagues see this as a gap in the study of such microbes: by using traditional means, we have been ignoring the importance of their interactions with the other microbes in the communities to which they belong. This was highlighted in a major study in which Dr Traxler and his colleagues found that the well-studied actinomycete Streptomyces coelicolor could produce an astounding array of compounds, but only when it interacted with other soil microbes. When studied in interactions with five other actinomycetes, S. coelicolor produced metabolites specific to each of the five other species that it does not produce in isolation. These findings suggest that interactions between microbes may be a rich new source of useful natural products. These metabolites were only observed in specific combinations of species, suggesting that microbes in situ could produce chemicals we have never seen in the lab. Since then, Dr Traxler has been working to develop an improved method of mass spectrometry imaging
that will allow us to study microbes in situ – that is, in the soil, tissues or plant roots where they are naturally found. A CLOSER LOOK Dr Traxler and his colleagues have brought an interdisciplinary approach to this issue, combining mass spectrometry, microbial ecology and microscopy, amongst others, to develop an improved methodology of high-resolution mass spectrometry imaging (HR-MSI) that will allow them to study microbes at an ecologically relevant scale. In September 2016, the National Science Foundation (NSF) awarded Dr Traxler a research grant of nearly $300,000 to fund the ongoing development of this new technology.
A Streptomyces coelicolor colony (above) responds to the presence of another actinomycete, Amycolatopsis sp. AA4, by making a red-pigmented antibiotic called prodiginine. This particular colony of Streptomyces coelicolor is deficient in its ability to produce a second pigmented compound called actinorhodin, which allows the full extent of prodiginine production to be easily seen. © Bailey Bonet.
Matrix-assisted laser desorption/ionisation time-of-flight (MALDI-TOF) mass spectrometry imaging (MSI) is a method that enables researchers to visualise the distribution of chemical compounds in a biological sample. In this method, a laser is used to ionise molecules from the sample surface. Moving the laser in a grid pattern allows researchers to create a profile of chemicals from each point across the sample. This information is then used to build images depicting the distribution of individual chemical species.
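To picture how such an image is assembled – a minimal sketch of the general idea, not the Traxler lab's code, with random numbers standing in for real spectra – each raster position contributes one mass spectrum, and mapping one m/z channel across all positions yields an 'ion image' of a single compound:

import numpy as np

rng = np.random.default_rng(0)
nx, ny, n_mz = 50, 50, 2000            # hypothetical 50x50 raster, 2000 m/z bins
spectra = rng.random((nx, ny, n_mz))   # stand-in for the MALDI-TOF spectra

target_bin = 1234   # m/z bin of a metabolite of interest (illustrative)
tol = 2             # +/- bins of mass tolerance
ion_image = spectra[:, :, target_bin - tol:target_bin + tol + 1].sum(axis=2)
print(ion_image.shape)   # a (50, 50) map of that compound's abundance

Shrinking the distance between raster positions from ten microns to three is what would bring individual bacterial cells into view.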
It is thought that most microbes outside of a lab are found in colonies of fewer than 100 cells, each of which is around one to five microns in size. A team of researchers in Dr Traxler's lab, including postdoctoral fellow Rita Pessotti, is working to refine the resolution of their MALDI-TOF apparatus from 10×10 microns to around three microns.
Dr Matt Traxler examines the alignment of an electrospray ionisation source. © Dr Scott Behie.
This improved resolution will allow single bacterial cells and the chemicals they produce to be studied, not in a large lab culture, but in the plant tissues, soils and even samples from the gastrointestinal tracts of mammals where they would normally occur. Other improvements are also being made to the methods used in this research, including the optimisation of biological sample preparation techniques to maximise the spatial resolution achieved by MSI. These improvements, once firmly established, will form the basis of a set of protocols that other researchers will be able to use when studying microbes in this way in the future. The ability to ‘see’ molecules produced by individual bacterial cells will open an exciting new window onto microbial life. This will advance our understanding of basic biological mechanisms and principles that govern the exchange of metabolites at the scale of single microbes in their microbiomes – the communities and environments in which they are found. The ultimate aim of this work in the Traxler laboratory is
to make the potentially transformative power of micro-scale HR-MSI feasible for any laboratory with an existing high-resolution mass spectrometer. CHANGING THE FUTURE OF MICROBIOLOGY The Traxler laboratory studies microbial interactions, with an emphasis on understanding how these interactions are mediated by natural products like antibiotics. Dr Traxler's lab group is an interdisciplinary team, including researchers with expertise in microbial genetics, ecology, natural products chemistry, and informatics analysis. They see that the future holds an exciting merger between the study of microbial interactions, chemistry, and microbiome function.
Dr Traxler seeks to learn why bacteria make compounds like antibiotics. He hopes that, by answering this question, new methods of compound discovery could be developed, helping scientists and doctors design new treatments to minimise the spread of antibiotic-resistant pathogens. Though centred on actinomycetes, the lessons learned from the work of Dr Traxler and his research team could be applied to countless other antibiotic-producing microbes. With an ever-increasing number of antibiotic-resistant pathogens, this innovative research could provide new sources of antibiotics – compounds that would otherwise have gone unnoticed – with the potential to save numerous lives.
Behind the Bench Dr Matt Traxler
E: mtrax@berkeley.edu T: +1 510 642 8058 W: http://plantandmicrobiology.berkeley.edu/ W: http://traxlerlab.berkeley.edu/ Research Objectives Dr Matt Traxler’s research seeks to understand how bacteria interact with one another through natural products like antibiotics. Projects in his lab range from natural products discovery to understanding how these molecules shape microbiomes. A key part of his research involves improving the resolution of imaging mass spectrometry techniques, with the aim of someday seeing molecules produced by individual bacterial cells in microbiome contexts. Funding National Science Foundation (NSF)
Q&A
Other than the discovery of new antibiotics, what benefits do you think will come of studying microbes at this level?
We know that healthy microbiomes are incredibly important to the fitness of all kinds of organisms, including humans. If we want to understand how microbiomes function, we need to understand how the microbes within them interact at the chemical level. Our efforts to push the limits of imaging mass spectrometry will hopefully provide a new way to see chemical interactions within microbiomes, which has implications for human, animal, and crop health.

Why is it important to understand the reasons bacteria make compounds like antibiotics?
Antibiotics are essential to our healthcare system, but their efficacy in the clinic is constantly being eroded by spreading pathogen resistance. Historically, resistant pathogens have usually been observed within a few years of a new antibiotic therapy being introduced. However, bacteria that make antibiotics, like actinomycetes, have been producing these compounds for millions of years! And the advantage of making these antibiotics is still robust, even after all that time. This suggests that actinomycetes may use these compounds in ways that we do not understand, and that may be profoundly different from the way we use them in the clinic. We have something to learn from these microbes about how to use antibiotics.

Why have these interaction-specific metabolites taken so long to find?
Like all researchers, we build on the work of those who came before us. For example, work in the lab of Kenji Ueda at Nihon University in Tokyo began looking at interactions between actinomycetes in the early 2000s, and that work continues to be a big inspiration for us. What we have done is bring new mass spectrometry techniques into the picture, and the chemical diversity we found was really surprising. I also want to give credit to the laboratory of Pieter Dorrestein at UCSD, where I learned many of these mass spectrometry techniques. Beyond this, I would say that discovering natural products from microbes in pure culture yielded an incredible bounty of therapeutics that sustained us for many decades. As discovering novel compounds has become more challenging, we must innovate, and looking at microbial interactions is one way we can do that.

Where would you like to take this research next?
The imaging mass spectrometry tools and protocols we are developing may have many applications in the study of microbiomes, and it will be very exciting to see what we can learn with these new tools. The people in my lab are doing incredibly exciting, multidisciplinary projects. For example, Rita Pessotti, Scott Behie, and Bridget Hansen are focused on understanding the roles of natural products in the microbiomes of plants and insects. Likewise, Bailey Bonet, Vineetha Zacharia, and Dylan McClung are focused on understanding the genetics of microbial interactions, and how we can leverage that information to discover novel compounds. It’s a dynamic time in microbiology: we are seeing an expansion from thinking about how bacteria function at the molecular level to thinking about how different bacteria function in communities. Our challenge is to bring these two worlds together, and I hope the tools we are working on will facilitate this new synthesis.

Do you think students at Berkeley benefit from involving this research in their classes?
I love working with undergraduate students! I teach the microbiology lab class for majors at UC Berkeley. In this class, the students isolate actinomycetes from their own soil samples and put them into interactions with each other. Actinomycetes are particularly interesting since they make an incredible array of compounds, but also because they have very interesting shapes and colours that change when they interact. The emphasis in the class is really on formulating hypotheses and designing experiments to test them. I think the students enjoy having the freedom to explore their own ideas, which is what science is all about!

Collaborators
• Eoin L. Brodie, Deputy Director, Climate and Ecosystem Sciences, Lawrence Berkeley National Laboratory
• Javier A. Ceja-Navarro, Research Scientist, Lawrence Berkeley National Laboratory (LBNL), Earth Sciences Division, Ecology Department
• Tom Bruns, Professor, Plant & Microbial Biology, University of California at Berkeley

Bio
Matt Traxler received his BS and PhD in microbiology from the University of Oklahoma, and was a postdoctoral fellow at Harvard Medical School. His research aims include integrating metabolomic and transcriptomic paradigms, with the ultimate goal of understanding the role of specialised metabolism in bacterial interactions and translating this knowledge into a platform for natural products discovery.

Contact
Matt F. Traxler, PhD
Assistant Professor
Dept. of Plant and Microbial Biology
UC Berkeley
311 Koshland Hall
Berkeley, California 94720-3102
USA
COMMUNICATION
Why science must combat sensationalism
I remember once, whilst flicking through a certain popular newspaper, noticing a headline claiming that ‘one glass of red wine a day prevents breast cancer’.
Naturally, like anyone, I became intrigued, and delved right into the content.
Now, I guess I’m different to most other readers, as I already have an academic background in science, so I more-or-less know how to smell rubbish when it is staring me in the face. The first half of the article in front of me carried on with the sensationalised theme of red wine as the miracle cure for breast cancer. Fantastic news if true, but highly doubtful – especially when tucked away on page 36.
It wasn’t until the very last paragraph that anything of any scientific merit was included, finally mentioning the actual research the claim had been based on. And boy, did the rubbish smell fresh. The research had actually been conducted on cells from the skin of a type of grape used to make a particular type of red wine. Nowhere in the research paper itself (yes, I checked) did it say anything about red wine or the influence drinking one glass a day would have. Nor did it specifically mention breast cancer.
Research actually shows that drinking one glass per day can be bad for health, and may even be a contributing cause of breast cancer – but that’s beside the point. The average reader of this extensively exaggerated, fabricated claim may not think to question the scientific validity behind it and will take it as fact. Research even suggests that people’s attention now lapses in as little as eight seconds, so a lot of readers wouldn’t even have reached the paragraph discussing the actual research itself. There are so many other examples of this – you only need to ask the person sitting next to you to hear some of the scientific claims that people have heard.
This is why you, as a researcher, need to ensure that your work is represented in a way that is accurate, true and accessible to the general public. With the growth of modern-day media streams – whether it be Facebook, Twitter, TV, radio, whatever – information is everywhere, and it is vital that people know what to believe. Together, scientists can stop sensationalism.
Social Media for Scientists
RSM was born out of multiple conversations with researchers who see a real benefit in connecting with a broad audience on an ongoing basis. Social media can now be considered one of the most prominent and important engagement tools of the modern era. We help you get the ball rolling and can even provide long-term social media management support.
Start your Social Media journey now: www.researchsocialmedia.com
Partnership enquiries: simon@researchoutreach.com
Careers and guest contributions: emma@researchoutreach.com