EU Research Summer 2019
REFORESTATION
the best climate change solution available?
DFG, A focus on Independent Junior research in Germany
Economics and Politics: Development, social change, and human empowerment
Disseminating the latest research from around Europe and Horizon 2020 Follow EU Research on www.twitter.com/EU_RESEARCH
Editor’s Note
The Horizon Europe programme is set to break new ground with funding for the next round, between the years 2021-2027. Whilst the European Commission proposed €94.1 billion, compared to €77 billion for Horizon 2020, the European Parliament would like to see the budget closer to €120 billion. This gigantic research budget is seen as the key to the European economy, the spearhead of both growth and efficiency, the driver of solutions and the best way of keeping Europeans safe, secure and ahead. Whilst the UK’s inclusion in the budget is now threatened, the EU’s remaining 27 members will have enormous opportunities for advancing knowledge and innovation.
On a global scale, never before in history has scientific knowledge and aspiration been so critical for us all. Food security, global climate, biodiversity, healthcare provision, pandemics, AI, quantum computing, surveillance, space exploration – there are so many enormously sensitive and important issues that we are going to deal with in the next few years. We are standing on the bridge between now and the future, and if we wrong-foot our path, there are some terrifying currents that can sweep us away. Having the means, the minds and the support to forge ahead with important science will give us all the chance to adapt to problems, beat the odds and create a future that is worth living in.
In this issue you can read about some of the incredible work that our finest analytical and creative minds are involved in, and many of these projects will go on to change our lives in different ways. We need to ensure the budgets for scientific endeavour are secure and match our ambition as a species. Without the money to drive science, so many possibilities would diminish into nothing.
Hope you enjoy the issue.
Richard Forsyth
Editor
As a seasoned editor and journalist, Richard Forsyth has been reporting on numerous aspects of European scientific research for over 10 years. He has written for many titles including ERCIM’s publication, CSP Today, Sustainable Development magazine, eStrategies magazine, and remains a prolific contributor to the UK business press. He also works in Public Relations for businesses to help them communicate their services effectively to industry and consumers.
Contents
4 Research News EU Research takes a look at the latest news in scientific research, highlighting new discoveries, major breakthroughs and new areas of investigation
10 TrueBOLD Current MRI methods are not particularly sensitive to the smaller blood vessels in which neural activation typically occurs. We spoke to Professor Klaus Scheffler about his work in developing improved methods to detect neural activity in the brain
12 P2X7 and P2X4 The P2X7 receptor is known to play an important role in the development of some forms of cancer, now researchers are seeking to explore its potential as a target for treatment, as Dr. Anna Junker, an Emmy Noether research group leader at the University of Münster, explains
13 REWARD The REWARD project is investigating a performance-based reward mechanism to encourage pharmaceutical innovation and help bring drugs to more of the people that need them, as Professor Thomas Pogge explains
16 Development of integrated continuous flow systems With pharmaceutical companies under pressure to both improve productivity and reduce manufacturing costs, scientists are investigating alternatives to the established batch processing methods, a topic at the heart of Dr. Janina Bahnemann’s research
18 Planctomycetes as a source of novel secondary metabolites We spoke to Dr. Nicolai Kallscheuer about his work in characterising the metabolome of different recently isolated planctomycetal species, which could hold important implications in terms of drug development
19 Development of heteropolyvanadate spin clusters Dr. Kirill Monakhov and his colleagues synthesise stimuli-responsive molecules, characterise them, and apply them on substrate surfaces, work which could open up new possibilities in highly sought after ‘More than Moore’ information technology
22 Importance Sampling of Chemical Compound Space A lot of attention in research is focused on moving towards a more structured approach to materials design. Dr. Tristan Bereau and his colleagues are using computer simulations to systematically explore chemical space and help accelerate compound discovery in soft matter
24 DiluteParaWater Para-water accounts for 25 percent of water at room temperature, while the remaining 75 percent is ortho-water, in which the proton spins in water molecules are symmetrical. We spoke to Professor Geoffrey Bodenhausen about his research into the properties of water
26 Reforestation Report The world’s forests have declined significantly in size over the last century, but now some countries are replanting trees to remove carbon dioxide from the atmosphere. How effective could planting more trees be in helping to mitigate the impact of climate change? Richard Forsyth reports
30 Conducting Assisted Evolution Coral reefs are among the longest-living ecosystems on Earth, yet they are struggling to keep pace with unprecedented rates of environmental change. Human intervention is essential to ensuring the long-term future of coral reefs, argues Dr. Hanna Koch
32 ICONOX Oxygen concentrations in the world’s oceans are decreasing, which is affecting nutrient recycling from sediments and thus the nutrient balance in the water column. We spoke to Dr. Florian Scholz about his research into iron cycling in marine sediments
34 INMARE The INMARE project brings together researchers from several different disciplines to discover new enzymes and bioactives from marine environments, which could help chemical and pharmaceutical companies work more effectively, as Professor Peter Golyshin explains
37 TropSOC A lot of the research into biogeochemical cycles over recent decades has been conducted in temperate zones, yet less is known about the underlying mechanisms in tropical Africa, where soils function very differently, a topic that Dr. Sebastian Doetterl and his colleagues are addressing
38 A new threat to the stratospheric ozone layer Increased levels of Very Short-Lived Halocarbons (VSLH) might represent a new threat to the ozone layer. We spoke to Dr. Susann Tegtmeier about her work in assessing the likely extent of future VSLH emissions and their impact on the atmosphere
40 Paleolandscape reconstructions of tectonically active regions Landscapes in tectonically active regions have changed dramatically over the course of human history. We spoke to Dr. Simon Kuebler about his work in reconstructing earlier landscapes, which will help researchers understand the factors that affected early settlement patterns
41 Differential processing of host plant toxins Plants are not only a source of food for insects, but also toxins that can provide protection against predators. Dr. Georg Petschenka and his colleagues are investigating the interactions between plants and insects, with the aim of building a deeper understanding of their co-evolution
42 TROCONVEX The earth’s outer core is composed of liquid metal, which flows in a highly turbulent manner. This flow is thought to be organised in such a way that the liquid metal moves around in a spiral-like fashion, a topic of great interest to Professor Rudie Kunnen
45 The power behind German research (DFG) The German Research Foundation (DFG) supports the country’s scientists, helping to strengthen its research base and lay the foundations for future technical developments. Richard Forsyth examines its history and looks forward to its future
48 The Cool Water Effect Cool water conditions were an important factor behind the Protestant West’s pioneering role in industrial history, believes Professor Christian Welzel, the Principal Investigator of the Cool Water Effect project
52 SIREN The Internet plays a central role in the modern economy, yet its communication infrastructure remains vulnerable to attack. Researchers are now exploring a new approach to securing the routing of Internet traffic, as Professor Michael Schapira explains
54 From Dark to Light The next generation of telescopes will allow scientists to look back even further into cosmic history, opening up new possibilities in research. We spoke to Dr. Benjamin Moster about his work in developing a new approach to modelling the formation and evolution of galaxies
57 A history of philosophy in global perspective We spoke to Professor Rolf Elberfeld about his work in developing a global framework on the history of philosophy, which will help pave the way for doing philosophy in a global perspective in future
60 A Scientific Theology?! On the surface the methods used in scientific research may seem to be very different to those applied in theology, where researchers look at the nature of God and investigate how religion shapes the world, yet Dr. Benedikt Göcke’s group brings these two topics together
63 Black Swans in public administration Major failures of public administration are rare in democratic states, yet they can have serious consequences when they do occur. We spoke to Professor Wolfgang Seibel about his work in identifying the causal mechanisms behind organisational failures
66 Propositionalism in linguistic semantics Language is understood not just through the words of a speaker, but the grammatical structures they use. Professor Thomas Ede Zimmermann is probing deeper into the semantic structure of language, using methods derived from mathematical logic
68 EUPLEX The policy issues that the European Union deals with have become progressively more complex over recent decades, raising the question of how proposals can be processed and implemented efficiently, a topic at the heart of Dr. Steffen Hurka’s research
EDITORIAL
Managing Editor Richard Forsyth info@euresearcher.com
Deputy Editor Patrick Truss patrick@euresearcher.com
Deputy Editor Richard Davey rich@euresearcher.com
Science Writer Holly Cave www.hollycave.co.uk
Acquisitions Editor Elizabeth Sparks info@euresearcher.com
PRODUCTION
Production Manager Jenny O’Neill jenny@euresearcher.com
Production Assistant Tim Smith info@euresearcher.com
Art Director Daniel Hall design@euresearcher.com
Design Manager David Patten design@euresearcher.com
Illustrator Martin Carr mary@twocatsintheyard.co.uk
PUBLISHING
Managing Director Edward Taberner etaberner@euresearcher.com
Scientific Director Dr Peter Taberner info@euresearcher.com
Office Manager Janis Beazley info@euresearcher.com
Finance Manager Adrian Hawthorne info@euresearcher.com
Account Manager Jane Tareen jane@euresearcher.com
EU Research Blazon Publishing and Media Ltd 131 Lydney Road, Bristol, BS10 5JR, United Kingdom T: +44 (0)207 193 9820 F: +44 (0)117 9244 022 E: info@euresearcher.com www.euresearcher.com © Blazon Publishing June 2010
Cert no. TT-COC-2200
RESEARCH NEWS
The EU Research team take a look at current events in the scientific news
Construction of TMT telescope held up
The construction of the proposed Thirty Meter Telescope (TMT) on the dormant volcano of Mauna Kea on Hawaii has been held up again following protests from the local community. The telescope itself is designed to help scientists probe deeper into a wide variety of astrophysical problems, yet its location has attracted criticism from local people. One of the five volcanoes that form the island of Hawaii, Mauna Kea is considered a sacred site by local people. However, it is also an ideal spot for astrophysical observations, rising to an altitude of 4,207 metres above sea level, and over the years a number of observatories have been established in the area, with the TMT the latest. The project itself is expected to provide a significant economic boost to the region, and astronomers are excited by the possibilities that it will open up. “We will, for the first time, be able to make measurements of the atmospheres of other Earth-sized planets in the habitable zone around other stars,” enthused Roy Gal, an Associate Astronomer at the University of Hawaii. This doesn’t assuage the concerns of others however, who view Mauna Kea as a temple, and want to protect it from further construction projects and help preserve the island’s heritage. “It contains some of our highest born and most revered ancestors. It is a symbol of peace and aloha,” said Kealoha Pisciotta, president of the Mauna Kea Anaina Hou. The governor and the local administration now face the task of striking the right balance between these different positions. While the local administration of course want to attract investment and boost the local economy, this cannot come at the cost of the island’s heritage and traditions.
Lofty ambitions: the Thirty Meter Telescope’s primary mirror is made up of 492 hexagonal segments, each about 1.44 m across. (Courtesy: TMT International Observatory)
A rendering of the Thirty Meter Telescope. The telescope will allow astronomers to see some 13 billion light years away from Earth. But some say that knowledge shouldn’t come at the expense of Hawaiian spirituality. (Courtesy TMT International Observatory)
Einstein’s still right, for now
Research by a team of scientists has suggested that Einstein’s general theory of relativity still holds up, but cracks are beginning to show. The scientists observed a star orbiting the supermassive black hole at the centre of our galaxy, aiming to detect an effect known as ‘gravitational redshift’, in which light is stretched to longer wavelengths as it climbs out of a strong gravitational field. The team found that Einstein’s ideas still stand up, even in this extreme environment, far removed from conditions on Earth. “Einstein’s right, at least for now,” said Andrea Ghez, a Professor of Physics and Astronomy at UCLA and one of the leaders of the study. “Our observations are consistent with Einstein’s general theory of relativity. However, his theory is definitely showing vulnerability.” A major point here is that the theory of general relativity cannot fully explain gravity inside a black hole, a challenge for which a new theoretical framework may be required. More immediately, Professor Ghez believes the team’s research will lead to a “transformational change in our understanding about not only the existence of supermassive black holes, but the physics and astrophysics of black holes.”
Star wars: France announces satellite plans
The French government has announced plans to develop satellites armed with laser weapons, which will help to protect its existing orbital satellites, as concern grows around the militarisation of space. China, the US and Russia in particular are investing significant resources in placing technology in space, now France is exploring its options. Only last year the French government accused Russia of eavesdropping on a communications satellite, in what is believed to be one of the first ever cases of space espionage. Now, the French are looking towards taking protective measures. “If our satellites are threatened, we intend to blind those of our adversaries,” said France’s Defence Minister Florence Parly. “We reserve the right and the means to be able to respond; that could imply the use of powerful lasers deployed from our satellites or from patrolling nano-satellites.”
Surveying the superbug threat
A drug-resistant superbug is spreading through Europe’s hospitals, researchers from the Sanger Institute in the UK have warned. The spread of Klebsiella pneumoniae is a cause of great concern, as some strains are resistant even to carbapenems, which are typically used as a drug of last resort in cases where an infection cannot be treated with anything else. The superbug is thought to have been responsible for over 2,000 deaths in Europe in 2015, a six-fold increase from the 2007 figure, highlighting the importance of further research. With concern growing, a lot of energy is being devoted to surveillance of resistant bacteria, so that resistant strains can be detected rapidly. Researchers analysed the genomes of nearly 2,000 Klebsiella pneumoniae samples taken from patients in 244 hospitals across 32 countries, and found that the bacterium is mainly transmitted within hospital settings. “Our findings imply hospitals are the key facilitator of transmission [and suggest that] the bacteria are spreading from person-to-person primarily within hospitals,” Dr Sophia David from the Sanger Institute told the BBC. “The fact that we see the same high-risk clones in many different hospitals around Europe also shows there’s something special about those strains.”
Tickle your way to a healthy old age
A new therapy has been developed that could help us protect our health as we grow older. Researchers at the University of Leeds have found that effectively ‘tickling’ the ear with a small electrical current helps to prevent some of the chronic diseases associated with aging, including heart disease and high blood pressure. The treatment is more formally known as transcutaneous vagus nerve stimulation (tVNS), and is based on delivering a small, painless electrical current to the ear, which transmits signals to the body’s nervous system. This effectively recalibrates the body’s internal control system, helping people to rebalance the autonomic nervous system, which controls many of the bodily functions which don’t require conscious thought, such as digestion and breathing. This therapy has proved effective in improving the balance of the autonomic nervous system in people under the age of 30, now researchers have tested it on adults over the age of 55, who are more likely to have imbalances in this respect. The therapy had a positive impact on the majority of the participants in the study, who experienced improved autonomic balance, while a number reported other benefits, including better sleep. The researchers now plan to continue their investigations in this area. “The ear is like a gateway through which we can tinker with the body’s metabolic balance, without the need for medication or expensive procedures,” explained Dr Alice Bretherton, lead author on the study. “We are excited to investigate further into the effects and potential long-term benefits of daily ear stimulation, as we have seen a great response to the treatment so far.”
New insights into autism
Research recently published in Molecular Psychiatry suggests that high levels of oestrogen sex hormones in the womb may heighten the likelihood of a child being autistic. The foundations of this work date back to 2015, when scientists measured the levels of four prenatal steroid hormones in the amniotic fluid in the womb. It was found that the levels of these hormones were higher in male foetuses who subsequently went on to develop autism; now researchers have built further on these findings, looking at another set of prenatal sex steroid hormones called oestrogens. Researchers have tested amniotic fluid samples from the Danish biobank, and the findings seem to back up earlier work. It was found that levels of all four of the oestrogens were significantly elevated in the male foetuses that went on to develop autism. This is an exciting finding, says Dr Alexa Pohl, part of the team at the University of Cambridge that conducted the study. “The role of oestrogen in autism has hardly been studied, and we hope that we can learn more about how they contribute to foetal brain development in further experiments. We still need to see whether the same result holds true in autistic females,” she explained.
Amputee gets his sense of touch back
An amputee from the US state of Utah is once again able to touch and feel, 17 years after losing his hand. A brain-controlled robotic prosthetic enables Keven Walgamott to peel fruit, put on his wedding ring and carry out a variety of other everyday tasks. One of seven participants in a study by the University of Utah, Mr Walgamott has helped to provide feedback and refine the technology, which is connected to a computer worn on the user’s belt, giving them freedom to wear it anywhere. The hand is controlled by an individual’s thoughts, and also sends back the appropriate signals to the brain, so that the user knows how to deal with the object in their hand. Nobody wants to crush an egg in their hands after all. “The most amazing thing for me was what the team were able to do,” commented Mr Walgamott. “They take a bunch of mechanical pieces and provide, through a computer, not only the ability to move all fingers and grasp things, but be able to feel again.”
Testing a robotic arm which can give amputees the sensation of touch through sensors in the hand, making it easier to pick up and hold objects (University of Utah/SWNS).
Implants that deliver drugs
Anti-retroviral drugs already play an important role in treating people with HIV; now researchers are looking towards prevention. The pharmaceutical company MSD has conducted a trial on 16 patients, investigating the use of an implant with an HIV prevention drug. This approach differs from the PrEP (pre-exposure prophylaxis) pills that are currently available in some areas. However, users have to remember to take a pill whenever it’s required, whereas an implant would be much simpler. It has been shown that certain PrEP drugs significantly reduce the risk of contracting HIV from an HIV-positive partner; now researchers are looking to provide them via an implant that slowly releases the drug in the body, removing the need to remember to take the pills. Researchers gave an implant with a dosage of a drug called islatravir to 12 healthy patients for 12 weeks, looking to assess how they tolerated the implant and to measure drug concentrations. Measurements were taken until four weeks after the implant was removed, with promising results. “The trial appears to offer a long-promised alternative delivery system for PrEP – which, all else equal, is a good thing,” commented Dr Timothy Hildebrandt from the London School of Economics. Cost is of course a critical issue in terms of the wider availability of these implants, and research is still at quite an early stage. However, scientists are encouraged by results so far, which will spur further research and development. “Further trials will examine implants with different doses and different makeup, and with plans to conduct evaluations in larger populations,” said Randy Matthews, a researcher at MSD.
LOCKR holds the key to personalised treatment
A team of researchers have developed the world’s first completely artificial protein switch, opening up new possibilities in personalised treatment. The team, led by Dr Hana El-Samad and Dr David Baker, have developed LOCKR (Latching, Orthogonal Cage/Key pRotein), which can be used to modify the internal circuitry inside a cell. The switch allows scientists to control the interactions of proteins with other cell components at unprecedented levels of precision, while LOCKR can also be used to effectively build new biological circuits that can sense the surrounding environment. This ushers in a new era in biology, believes Dr El-Samad. “In the same way that integrated circuits enabled the explosion of the computer chip industry, these versatile and dynamic biological switches could soon unlock precise control over the behaviour of living cells and, ultimately, our health,” she explained.
A rendering of LOCKR in closed (background) and open (foreground) states. A key, in black, unlocks the gray cage, revealing bioactive contents (in yellow). (Ian Haydon/Institute for Protein Design at UW)
Rising threat from fungal infections
The changing nature of the global climate is likely to expose humans to a higher risk of fungal infections, researchers have warned. While humans have historically been largely unaffected by fungal infections, primarily because our bodies are too warm for fungi to replicate, rising temperatures could heighten the threat to human health. Our average body temperature is around 37°C, which has proved too warm for fungal species to replicate, yet as they adapt to the warming climate the threat they pose is set to rise. “There are millions of fungal species out there,” warns Arturo Casadevall, of the Johns Hopkins Bloomberg School of Public Health. “As they adapt to a warmer climate, some of them will then have the capacity to breach our thermal defenses.” Evidence is already emerging that fungal species are adapting and proliferating. Pathogenic versions of the fungus Candida auris were found independently in Africa, Asia and South America between 2012 and 2015, while there have been nearly 700 confirmed cases of infections in 12 US states since 2016.
Ransomware disrupts Joburg power supply
Some residents of Johannesburg were left without power recently following a ransomware attack on the South African city’s main electricity provider. The attack shut down City Power’s IT systems, leaving customers unable to buy electricity through its pre-paid vending system. It is thought that more than a quarter of a million people could have been affected by the virus, illustrating the widespread impact that such attacks can have. “It has encrypted all our databases, applications, and network,” read a statement on City Power’s Twitter account. The majority of the IT systems are now back up and running, yet the incident has illustrated the vulnerability of utilities and other major organisations to ransomware attacks, heightening the pressure on companies to take strong protective measures.
Value of ivory
Researchers have found that the value of ivory has increased ten-fold since trading it was banned under the terms of the Convention on International Trade in Endangered Species (CITES) in 1989, reflecting continued high levels of demand. This provides a powerful financial incentive for poachers to continue hunting elephants, despite the provisions of the CITES agreement. The researchers analysed a large dataset of ivory prices covering the period between 1989 and 2017, established both by analysing the literature and making visits to ivory markets in Asia, Africa and Europe. Asian markets typically command the highest prices, while Africa has the lowest, important considerations in terms of understanding the wider demand for ivory. This research holds broader relevance to conservation efforts, believes Monique Sosnowski, a lead author on the study, based at the Bristol Veterinary School. “With poachers killing an estimated 100 elephants of the remaining 350,000 each day, we believe our findings are significant to global wildlife conservation policymaking,” she said. “We hope that a greater understanding of the factors that drive the price of ivory will lead to better informed policy interventions that lead to a more secure future for the long-term survival of elephants and other animals that suffer due to the ivory trade.”
Indian tigers beginning to roar again
The Indian Prime Minister Narendra Modi recently released a census of the nation’s tiger population to mark International Tiger Day. The census showed that the number of tigers in India has risen to almost 3,000, an increase of around 700 from the previous census in 2014. This represents a very welcome rise after years of decline. While it is thought there were around 40,000 tigers in India when the country gained independence in 1947, the population dropped markedly over the following decades, reaching a record low of 1,411 in 2006. Other countries around the world with tiger populations also saw sharp declines in numbers, leading to a concerted plan to take action. In 2010, India and 12 other countries reached an agreement to try and protect the big cats, with the aim of doubling numbers by 2022. Some 50 habitats across the country were reserved exclusively for tigers, helping to protect them from poachers, and limiting the possibility of human encroachment. This work is now starting to bear fruit, and Prime Minister Modi plans to continue the conservation efforts. “We reaffirm our commitment towards protecting the tiger,” he said when releasing the report. “Some 15 years ago, there was serious concern about the decline in the population of tigers. It was a big challenge for us, but with determination we have achieved our goals.”
Ethiopia sets tree-planting record
The people of Ethiopia planted more than 350 million trees in a little over 12 hours recently, part of efforts to restore the country’s forests and help to combat climate change. It is thought that around 30 percent of Ethiopia’s land was forested at the end of the 19th century, a figure which has since dropped to less than 4 percent, according to Farm Africa. This is one of the factors which has led to the degradation of land in the East African country, while the effects are exacerbated by recurrent floods, droughts and soil erosion. In a country where the vast majority of the population is dependent on the agricultural sector, these are worrying trends, prompting the government to take action. A number of government offices were shut down to allow civil servants to plant trees, while several major international organisations were also involved, including the UN, African Union, and staff from several foreign embassies. Overall, a total of 353,633,660 trees were planted across the country, a significant step towards the target of planting 4 billion trees over the rainy season. This is thought to be a new world record for the number of trees planted in a single day, surpassing the 66 million planted in India in 2017. Tree-planting on this scale could have a significant impact in terms of removing carbon from the atmosphere, according to a study by researchers from ETH Zurich in Switzerland. The study found that restoring degraded forests around the world would lead to the capture of around 205 billion tonnes of carbon. To put that in context, global carbon emissions are around 10 billion tonnes a year, so restoration on that scale would be equivalent to locking away roughly 20 years of emissions at current rates, underlining the potential impact.
The dinosaur that left a heavy trace
An enormous thigh bone thought to belong to a sauropod, a plant-eating dinosaur common in the late Jurassic era, has been discovered at an excavation site in south-west France. Measuring around 2 metres in length, the bone is extremely well preserved, say palaeontologists. “We can see the insertion of muscles and tendons, and scars,” said Ronan Allain of the Natural History Museum in Paris, in an interview with Le Parisien. “This is rare for big pieces, which tend to collapse in on themselves and fragment.” The site has yielded over 7,500 fossils from more than 40 species since 2010, including stegosauruses and a herd of ostrich dinosaurs, and researchers are continuing their work in the hope of gaining deeper insights.
Maxime Lasseron inspects the sauropod femur after it was discovered earlier in the week during excavations at the palaeontological site of Angeac-Charente, France. Photo: AFP
The wind is in off-shore energy’s sails
The North Sea Wind Power Hub is just one of several projects designed to harness the potential of offshore wind, part of the wider shift towards more sustainable methods of generating power. Located in the North Sea around Dogger Bank, the North Sea Wind Power Hub will be built on artificial islands located far away from European coastlines.
Many existing wind farms have been built close to shorelines, yet now companies are starting to explore the possibility of locating them further out to sea. The UK’s long coastline offers relatively large amounts of space for wind farms close to the coast, yet this is not the case for much of the rest of the continent. “We’re (the UK) only just starting to run out of space close to the shore. It’s particularly Germany and the Netherlands which have quite small coastlines,” points out Iain Staffell, a lecturer in sustainable energy at Imperial College London. This has prompted a focus on alternatives, in particular locating them further away from coastlines.
This brings significant benefits, as these wind farms have less of a visual impact on the landscape, which has been a major concern for people living near on-shore wind farms, and aren’t as disruptive to shipping. Winds are often stronger and steadier further out to sea, so these wind farms could provide a more reliable source of power.
It’s very much a long-term project, with nine European countries cooperating in plans to develop several wind farms over time using a modular, granular approach. This could generate large amounts of power, as much as 180GW by 2045, which would be a major step towards meeting the goals set out in the Paris Climate Agreement of 2015.
It’s not a cheap venture however, as building the artificial islands will be costly and getting the power back will be similarly expensive, so these wind farms will need to be substantial in size if they are to be economically viable.
New MR methods for functional imaging
Magnetic Resonance Imaging scanners have become an important tool in both medical diagnostics and research, yet current methods are not highly sensitive to the micro blood vessels around which neuronal activation typically occurs. We spoke to Professor Klaus Scheffler about his work in developing improved MR methods to detect neuronal activity in the brain.
A technology developed in the ‘70s, Magnetic Resonance Imaging (MRI) scanners have since become an important diagnostic and research tool, providing anatomical images of the body from which doctors can learn more about our organs and how they are affected by disease. As the Principal Investigator of the TrueBOLD project, Professor Klaus Scheffler now aims to develop improved magnetic resonance methods to detect neuronal activity in the brain. “Our focus is on functional imaging, to produce images showing which regions of the brain are working hardest at a particular point in time,” he explains. An image of the brain may allow researchers to see the different regions of the brain, but not necessarily which of those have been activated at a specific moment, an issue that Professor Scheffler is working to address. “I am trying to develop a new method that is more specific to neuronal activation than existing methods,” he says. These existing methods work primarily by detecting changes in water relaxation within a neurovascular network within the brain. There are essentially two sorts of blood in the brain, one of which is arterial blood, which has been oxygenated. “The oxygen in arterial blood goes to the nerve cells, where it is consumed. Then there is venous blood, which has a much lower oxygenation level,” explains Professor Scheffler. If a certain region of the brain needs to work more intensively, maybe because the individual is solving physics equations or focusing on another demanding task, then it will consume more oxygen; this is an important consideration in Professor Scheffler’s research. “When a region of the brain has a task there is a local change in blood oxygenation level,” continues Professor Scheffler. “We cannot directly measure blood oxygenation, but we can measure a secondary effect on an MRI image, which can then be interpreted further.”
TrueFISP
The current magnetic resonance methods are very sensitive to the larger draining veins in the brain, yet neuronal activation typically occurs near much smaller blood vessels, with diameters of between 5-10 micrometers. Researchers in the project have modified a fairly well-established image acquisition technique called TrueFISP, mostly used for rapid cardiac imaging, which has been found to actually be better suited to imaging neuronal activation than existing methods. “The reason for this is that it is less sensitive to larger veins,” outlines Professor Scheffler. The goal here is to enable a more specific signal of neuronal activation, more easily distinguishable from vascular effects. “The new method essentially does not see the medium-sized vessels, it’s not sensitive to them. It’s more sensitive to the micro-vessels with a diameter of between 5-10 micrometers,” says Professor Scheffler. This means the resulting signal or activation is much less easily confounded by vascular effects, so neuronal activation can be identified more clearly. While this is a significant attribute of the TrueFISP acquisition method, one disadvantage is that it is around 3-4 times slower in imaging acquisition speed than existing techniques, an issue which Professor Scheffler aims to address over the course of the project. “If the aim is to resolve rapid changes in the brain then this level of temporal resolution is a problem. That can be solved, and that’s what I will be working on intensively over the next few years,” he outlines. Neuronal processes take place over very short timescales and extremely rapid measurements are required to capture them, so this is an important aspect of Professor Scheffler’s research. “We want to develop a much more precise and higher spatially resolved method of mapping the working brain,” he says.
Functional brain imaging There are some significant technical challenges to deal with before this approach can be applied more widely in functional brain imaging, which is the wider goal over the long term. In particular, a very homogenous magnetic field is required in order to use this image acquisition method. “The field in conventional Magnetic Resonance systems is not really homogenous enough, that’s why we added dynamic shim arrays,” says Professor Scheffler. These dynamic shim arrays surround the brain, in order to make the field more homogenous and allow the application of the TrueBOLD technique, leading to a more distinct signal of neuronal activation. “For example, we want to make the red dots which indicate neural activation [see figure 1] more specific,” outlines Professor Scheffler. “If you have a more specific signal, more closely related to neural activation than vascular effects, then that opens up new possibilities in both radiology and neuroscience research.” A more detailed method of mapping the brain could in the long-term open up new avenues of investigation and help researchers build a deeper understanding of how it functions. While a lot of progress has been made in the neuroscience field over recent decades, there remains a lot to learn about the overall structure and function of the brain. “We don’t know whether certain paths work together and how different regions are controlled,” outlines Professor Scheffler. The project’s research holds important implications in these terms. “This is the only non-invasive method with such a high temporal and spatial resolution, and it is able to capture signals from the entire brain. We can put anybody into this scanner – it’s completely non-invasive – and look into their brain,” says Professor Scheffler. “This is about trying to answer very fundamental questions on how the brain functions.”
TrueBOLD
Detecting brain activity with TrueFISP
Project Objectives
TrueBOLD addresses the detection of neuronal activity in the human brain with magnetic resonance imaging based on an acquisition technique called TrueFISP or balanced SSFP at very high fields. Traditionally, blood oxygenation changes are detected with echo planar sequences (EPI) that are sensitive to the static spin dephasing around small and larger vessels filled with deoxygenated blood. EPI is not specific to a certain type of vessel architecture or size, it sometimes shows blurring and blooming around larger vessels, it shows significant spatial distortions and thus severe challenges in precise co-registration to submillimeter anatomical structures. The proposed detection of BOLD changes with pass-band balanced SSFP, TrueBOLD, in combination with localized and dynamic shim arrays and strategies to minimize physiological signal fluctuations has the potential to overcome these limitations.
Project Funding
German Research Foundation, DFG Reinhart Koselleck Project SCHE 658/12.
Contact Details
Project Coordinator, Prof. Dr. phil. nat. Klaus Scheffler, PhD
MRC Department, Max Planck Institute for Biological Cybernetics
Department of Biomedical Magnetic Resonance, Center for Integrative Neuroscience (CIN), University of Tübingen
Max-Planck-Ring 11, 72076 Tübingen, Germany
T: +49 (0)7071 601-700/701
E: klaus.scheffler@tuebingen.mpg.de
W: https://www.kyb.tuebingen.mpg.de/157539/dfg-reinhart-koselleck-project
W: http://www.kyb.mpg.de/de/mr
Professor Klaus Scheffler, PhD
Klaus Scheffler is Director of the Biomedical Magnetic Resonance Department at the University of Tübingen, where he also works as a Professor of Neuroimaging and Magnetic Resonance Physics. He has a PhD in Biophysical Chemistry, and held research positions at several institutions in Europe and America before taking up his current role.
A focus on P2X7 and P2X4 receptors
The P2X7 receptor, an ion channel, has been well-explored in terms of ligand development; now researchers are seeking to explore its potential as a target for the treatment of different cancers and inflammatory diseases. We spoke to Dr Anna Junker about her work in contributing to the further improvement of diagnostics and therapies against breast cancer.
The P2X7 receptor is known to play an important role in the development of some forms of cancer through stimulation of cell invasion and migration, and in inflammatory diseases through the release of interleukin 1ß (IL-1ß), a cytokine protein. As scientists have learned more about this specific receptor, part of the family of purinergic receptors, it has attracted increasing attention as a target for the treatment of these conditions. “We are always looking for new targets. P2X7R is quite well-explored in terms of ligand development – more than 50 patents have been filed focusing on the development of P2X7R ligands for the treatment of inflammatory diseases,” outlines Dr Anna Junker, an Emmy Noether research group leader at the University of Münster. This also represents a good starting point for developing PET tracers to assess the effectiveness of treatment, another important strand of Dr Junker’s research. “We develop tracers that can be used in in vivo settings for imaging,” she says. “When you have a mouse model of breast cancer and treat a mouse with an imaging agent targeting P2X7R, you can detect and monitor the progression or remission of the cancer.”
We develop imaging tracers that can be used in in vivo settings to improve future diagnosis of breast cancer.
Ligand design
There are a number of different areas in Dr Junker’s research, with a lot of energy devoted to developing and improving ligands for both bioimaging and targeted drug delivery. A library of compounds has been generated and tested in vitro on cells, aiming to assess their effectiveness in inhibiting the P2X7 receptor. “Based on that we can analyse structure-activity relationships, and look to identify which modifications are better in terms of receptor inhibition,” outlines Dr Junker. This represents a step towards a more systematic approach, enabling scientists to precisely design ligands for specific purposes. “We have different requirements for the ligand if we want it to work as a fluorescent imaging agent than we do for PET imaging,” points out Dr Junker. “We also use some of the ligands that we develop in targeted drug delivery systems. So we essentially hook our ligand onto a drug delivery system, and try to achieve enrichment of the ligand in the area that the ligand should be delivered to.” A particular compound may be known to be potent, yet not function effectively because it is not delivered to the right part of the body, while an effective compound does not necessarily provide a good imaging agent, for example due to its rapid metabolism or excretion. In this type of situation, researchers may need to modify the compound in some way. “For example, we might need to increase the hydrophilicity or the lipophilicity of a compound or improve its metabolic stability. Usually, we modify various parameters that influence absorption, distribution, metabolism, and excretion (ADME) properties,” explains Dr Junker.
The wider objective in this research is to develop more effective therapies against breast cancer and help minimize the side-effects of treatment. It is known that P2X7R plays an important role in the development of triple-negative breast cancer. Dr Junker and her colleagues aim to contribute to improved treatment. “We want to help improve the diagnostics and potential therapies against breast cancer,” she says. A key step towards this is developing P2X7R-targeting tracers, which can then be evaluated in animal models. “We usually work with mice at our institute. We have a PET imaging system for mice constructed in the same way as the conventional system,” explains Dr Junker. “We usually use our compounds on mice, and when we find that the toxicological profile and the bio-distribution look satisfying, then we look at the disease model. If we find that the compounds perform well in the disease mouse model, then they can go forward for further investigations.”
The students Clemens Dobelmann (Ph.D., right) and Maximilian Hagemann (MSc, left) analyzing the HPLC data.
Andreas Isaak (Ph.D.) performing synthesis.
Andreas Isaak (Ph.D.) preparing an HPLC sample.
Photographs taken by Peter Dziemba.
P2X7 and P2X4 Receptors
P2X7 and P2X4 Receptors in Cancer and Inflammation: Drug Development, Bioimaging, and Targeted Drug Delivery
Funded through the Emmy Noether Programme of the Deutsche Forschungsgemeinschaft (DFG).
Dr Anna Junker, European Institute for Molecular Imaging, Westfälische Wilhelms-Universität, Münster, Germany
T: +49 251 8333363
E: anna.junker@uni-muenster.de
W: https://www.uni-muenster.de/EIMI/de/research/junker-a.php
Dr Anna Junker is an Emmy Noether Group Leader at the European Institute for Molecular Imaging in Münster. Alongside the P2X4 and P2X7 receptor ligands, her main research interests are the ligands for P1 and P2 receptors, stereoselective synthesis and the development and application of novel synthetic methods in medicinal chemistry.
A new track towards drug development
The patent system gives pharmaceutical companies an exclusivity period on newly-developed drugs, yet this can mean that drugs are priced beyond the reach of poorer populations. The REWARD project is investigating a performance-based reward mechanism to encourage pharmaceutical innovation and help bring drugs to more of the people who need them, as Professor Thomas Pogge explains.
The pharmaceutical industry has an important role to play in helping to raise health standards across the world by developing new drugs, yet at the same time companies are of course keen to maximise profits. Pharmaceutical companies invest a lot of time, money and expertise in research, and it often takes a long time before a new drug reaches the market. “It takes several years before you see any return on the initial investment, while clinical trials are also extremely expensive,” points out Thomas Pogge, Professor of Political Philosophy at the Universities of Central Lancashire and Yale in the US. Under the existing Intellectual Property Rights (IPR) system, pharmaceutical companies get 20 years of exclusivity on a patented drug, which provides an incentive to invest in research and development, yet this is rarely targeted at those diseases that mainly affect the world’s poor. “Some diseases do a lot of damage but they are not lucrative areas of R & D, as the patients who have these diseases tend overwhelmingly to be poor,” explains Professor Pogge. “Another point is that profitability is not really well-correlated with urgency. For example, it may be very profitable for a company to create a drug to treat a disease for which a good drug is already available.”
REWARD project
These issues lie at the core of Professor Pogge’s work in the Health Impact Fund (HIF) and the REWARD project, an ERC-funded initiative which brings together researchers from several disciplines to investigate restructuring the way pharmaceutical companies are rewarded. The current IPR system helps companies recoup their initial investment in R & D through patent-protected high prices, yet this can mean drugs are priced beyond the reach of poorer populations, while there are also insufficient incentives to encourage companies to investigate those diseases which primarily affect these populations. It’s also important to consider the increasingly heavy burden that high prices place on healthcare systems in affluent countries; a new approach is required to close these gaps, believes Professor Pogge, and now he and his colleagues in the project are looking at a new payment mechanism. “We want to establish a second track, beside the usual monopoly-pricing track. On this second track, a pharmaceutical company can engage in research and development of a new medicine, and be rewarded in a wholly new way,” he says. “The company would sell the drug at the cost of production and distribution, and get a reward from public funds proportional to its health impact. We are looking at how to measure the health gains from the introduction of a new drug, for the purpose of rewarding the introducing company in proportion to the health impact.”
Partners in the REWARD project discuss ways of promoting community engagement in health planning, provision and monitoring.
There are two main components to this work, the first of which is the metric to be used in assessing the health impact of a new drug on the wider population. Thomas Pogge and his colleagues are using quality-adjusted life years (QALY) for this purpose, a measure that is commonly used in global burden of disease studies. “When a person’s life finishes, we can measure the QALYs that they had over their lifetime. We might find that while they lived to the age of 79, some of those years were in poor health, so we would adjust it accordingly,” he outlines. The QALY metric itself is fairly well-established, so the aim in research is not to revisit its underlying methodology, but to use it for measuring the health gains attributable to the introduction of a new medicine, much like it is currently being used by health insurers around the world to estimate health gains of patients in the context of making reimbursement decisions. “The idea is that we look at the state of the world as it would be without the introduction of these new medicines – and then compare the real world to that counterfactual state,” he explains. “So we look at how much better people’s health outcomes are than they would have been in the absence of these new medicines.” This work involves the extensive use of statistical sampling and statistical methods, which are essential to measuring a drug’s health impact. These measurements are by nature somewhat imprecise, which should not impede or distract innovators and pharmaceutical companies from focusing their energy on maximizing health impact. “As long as they cannot game the measurement in some way, then the best thing for them to do is to generate as much health impact as possible, which is exactly what we want,” he points out. Researchers in the project are designing a new mechanism by which the company behind a drug can be rewarded in proportion to its health impact. “The HIF would create a series of reward pools. Each annual pool will reward all participating medicines for the health impact that they have achieved in a given year,” he says. “The size of the reward will depend on the health impact that the participating drugs collectively produce. If a reward pool has $3 billion in it and there are ten drugs registered, then on average each of the companies will earn $300 million a year for example.”
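The reward-pool mechanics described above can be made concrete with a little arithmetic. The short Python sketch below is purely illustrative and is not the Health Impact Fund’s actual methodology: it assumes each registered drug’s health impact has already been measured in QALYs, and simply divides a fixed annual pool among the drugs in proportion to those measurements. The drug names and QALY figures are hypothetical.

# Illustrative sketch of proportional reward allocation (not the HIF's
# actual methodology). A fixed annual pool is split among registered
# drugs in proportion to their measured health impact in QALYs.

def allocate_rewards(pool, qalys_by_drug):
    """Divide a reward pool in proportion to each drug's measured QALYs."""
    total = sum(qalys_by_drug.values())
    return {drug: pool * q / total for drug, q in qalys_by_drug.items()}

# Hypothetical health-impact measurements for three registered drugs.
measured_impact = {"drug_A": 150_000, "drug_B": 100_000, "drug_C": 50_000}
rewards = allocate_rewards(3_000_000_000, measured_impact)  # $3 billion pool
for drug, reward in sorted(rewards.items()):
    print(f"{drug}: ${reward:,.0f}")
# drug_A: $1,500,000,000
# drug_B: $1,000,000,000
# drug_C: $500,000,000

Because the pool is fixed, a company can only increase its reward by generating more measured health impact relative to the other registered drugs; with ten drugs of equal impact drawing on a $3 billion pool, each would receive the $300 million average mentioned above.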
Kerala case study A pilot study has been conducted in rural parts of Kerala, a state in South India, to pioneer health data collection and lay the foundations for continued research. There has been some scepticism around whether a drug’s health impact can be measured, particularly in resource-poor settings, so Professor Pogge was keen to show that it can in fact be done. “We worked with ASHAs (Accredited Social Health Activists), women employed by state governments across India to aid with implementation of existing public health programmes (such as vaccination or maternal and child health), taking responsibility for a particular district, maybe 500 families or so,” he says. ASHAs in Kerala have been enabled to collect data on issues around drug provision using a hand-held tablet computer. “We’ve shown the feasibility of gathering data through ASHAs by digital means. Exploring whether
people are taking drugs as prescribed, we have found that adherence to cardiovascular medications is low and also that there are over 20 brands of atorvastatin, a generic drug, being used with wide variations in price. This work shows how, with a bit of thought and ingenuity, useful data can be collected quite cost-effectively,” he says. The wider goal in this research is to more closely align what’s good for the patient with what’s profitable for a pharmaceutical company. This is an issue not just in poorer countries, but also in mature economies. “Even in the richest markets there are mis-matches between the drug that’s best suited to a particular patient and the drug that a patient actually gets,” he explains. This might be because a doctor has a strong relationship with a particular pharmaceutical company, leading them to prescribe their drugs even if they are not expected to be effective; Thomas Pogge
REWARD researchers with local partners in Kerala, where the project conducted a pilot study.
EU Research
Professor Pogge says that the HIF system will help to prevent this. “If we are prescribed a drug that doesn’t work for us, under the HIF system then there’s no reward,” he points out. “In the HIF scheme, even the richest populations would get a drug at a very low price. So we would pay less for HIF-registered medicines and we would pay less for health insurance, insofar as our health insurance covers them. The health impact would be measured in similar ways, except that the baseline is likely to be higher in richer than in poorer countries.” This means that a new drug can have an especially large impact in poorer countries. While in mature economies there might be alternatives if a specific drug is not available, that tends not to be the case in areas like Kerala. “Often if this HIF-sponsored cheap medicine hadn’t come onto the market then the patient would have had nothing at all. So the medicine really makes a very big difference in poorer countries,” he stresses.

The project’s work has attracted strong support from both the Kerala state government and the Indian national government, and Professor Pogge is now looking towards the next steps. “We are hoping to gain support for a larger pilot, where we would ask pharmaceutical companies to make proposals on how they can achieve more health impact among poorer populations with existing medicines,” he continues. “This could be by creating a heat-stable version that makes the medicine more suitable for the tropics. Or by creating a paediatric formulation, which is often sorely needed in poorer countries, where the majority of the population is young.” The best proposals would be implemented over a three-year period, then a sum of money would be divided among the companies involved, in proportion to the health impact each achieved over those three years. This would not only test researchers’ ability to measure health impact, but also offer new insights into the power of money to incentivise pharmaceutical companies to go beyond their normal approach. “It would show what they would do, and how they would do it. It would also show how much health impact we can actually generate for that money,” he says. This might not incentivise the development of an entirely new drug, but it could encourage a company to develop a new version targeted at a low-income setting. “A company may have a drug sitting on the shelf which they haven’t put out yet, because they felt there was not enough profit in it. They might pull it off the shelf and say: ‘now that there is this extra reward money, we are willing to give this drug a try,’” outlines Professor Pogge.
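To make the payout rule concrete, here is a minimal sketch in Python of the proportional division described above, assuming a fixed reward pool and externally measured health-impact figures; the company names and all numbers are purely illustrative, not project data.

# A minimal sketch of the HIF-style payout rule: a fixed reward pool is
# divided among participating companies in proportion to the health impact
# each achieves (e.g., measured in quality-adjusted life years).
# All figures below are illustrative placeholders.

def allocate_rewards(pool: float, impacts: dict[str, float]) -> dict[str, float]:
    """Split a reward pool in proportion to measured health impact."""
    total = sum(impacts.values())
    if total == 0:
        return {company: 0.0 for company in impacts}
    return {company: pool * impact / total for company, impact in impacts.items()}

# Hypothetical three-year pilot: a pool of 100 million, three participants.
rewards = allocate_rewards(
    pool=100_000_000,
    impacts={"Company A": 120_000, "Company B": 45_000, "Company C": 35_000},
)
for company, reward in rewards.items():
    print(f"{company}: {reward:,.0f}")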
An ASHA monitors a patient’s blood pressure and collects information on drug intake.
REWARD
Project Objectives
REWARD has investigated how to measure the health gains produced by the introduction of any medicine for purposes of making reward payments to pharmaceutical innovators committed to selling their new drug at the cost of manufacture and distribution.
Project Funding
ERC Advanced Grant (ERC-AG): €1,922,338.40 for the years 2014-19.
Project Partners
• University of Central Lancashire (lead) • Delhi think tank RIS (Research and Information System for Developing Countries) • University of Calgary
Contact Details
Project Coordinator, Professor Thomas Pogge Yale Philosophy Department P.O. Box 208306 New Haven, CT 06520-8306 T: +1 203-432-2272 E: thomas.pogge@yale.edu W: https://www.uclan.ac.uk/research/explore/projects/reward.php W: https://campuspress.yale.edu/thomaspogge
Professor Thomas Pogge
Having received his PhD in philosophy from Harvard, Thomas Pogge holds professorships at the University of Central Lancashire and at Yale. He is a member of the Norwegian Academy of Science as well as co-founder of Academics Stand Against Poverty (ASAP) and Incentives for Global Health (IGH).
A continuous system for pharmaceutical processing Batch processing is a well-established method in the field of pharmaceutical processing— but in the face of mounting pressure to improve productivity and reduce manufacturing costs, researchers are increasingly looking for alternatives. We spoke with Dr. Janina Bahnemann about her work in developing an integrated continuous flow system. The pharmaceutical industry
is a major player in the European economy, and pharmaceutical companies are always keen to further refine and improve the efficiency of their production methods. The majority of biopharmaceuticals are currently manufactured using long-standing batch processing methods—but that may soon begin to change, with the rise of continuous bio-manufacturing systems. That topic lies at the heart of Dr. Janina Bahnemann’s research. “Our group is particularly interested in integrated continuous flow systems because we are aiming to develop more flexible methods of producing recombinant proteins for pharmaceutical processes,” she explains. While traditional batch cultivation and production processes are safe and well-established, they are also time-consuming and inflexible, and do not allow product quality controls - significant downsides which have spurred researchers to investigate alternatives in recent years. “Companies simply don’t want to invest months of work in establishing one single cell line for the production of a new antibody,” stresses Dr. Bahnemann.
Transient transfection Industrial-scale continuous cultivation and production systems might allow pharmaceutical companies to significantly trim back that lead time. On a smaller scale, researchers in Dr. Bahnemann’s group are currently using continuous transient transfection methods to rapidly deliver a gene of interest into a host cell, instead of establishing the stable transfection of cells. “Mammalian cells have the ability - using the gene of interest - to produce antibodies or proteins, or whatever target it is that the gene codes for,” she explains. This continuous transfection system could be integrated into a bioreactor, which would then enable scientists to manipulate cells over time and thereby greatly improve the efficiency of production. “Using this method, cells can produce a protein continuously because they are essentially being transfected over and over again,” notes Dr. Bahnemann. Dr. Bahnemann’s immediate aim is to achieve the efficient and reproducible transfection of mammalian cells. The next step will then be to develop a system for
incubating such cells along with the plasmids containing the gene of interest. “We are ultimately looking to develop a cell separation system in order to separate our cells from the old medium and the transfection buffer,” she says. “The goal will be to transfer the transfected cells into a fresh culture medium, to then continue with the production of our protein of interest.” This system would open up the possibility of producing several different proteins in parallel with one another. Researchers in Dr. Bahnemann’s group are already developing a small-scale microfluidic system through which several different proteins could potentially be produced at the same time when integrating the system into a bioreactor. “We could use different plasmids with different genes of interest, and achieve parallel cultivation and production,” she observes. The cultivation parameters would usually be identical, with Dr. Bahnemann and her colleagues also developing biosensors to act as an online monitoring system. “The biosensors will monitor the cultivation system and alert us to the presence of any contaminants, which is critical to ensuring the quality of the eventual product,” she stresses. These biosensors will enable scientists to quickly identify any shifts that take place within the system and assess in real time whether changes or modifications are required - for example, modifying key parameters such as the buffer, pH level, or nutrient supply. “Because we are receiving this information in real time, we can very quickly ascertain whether our system is working well or not. And if we see that something is starting to go wrong, we can step in and take a deeper look. It’s a very dynamic model,” says Dr. Bahnemann.

[Figure: schematic of the integrated continuous flow system – feed bioreactor with cells, transfection reagents delivered via a lab-on-a-chip (LOC) device, cell retention separating old medium from fresh medium, DNA input and a production bioreactor, with biosensors for monitoring. ©Marcel Kipke]

Additional biosensors can also be integrated into the microfluidic system to monitor the proteins that are produced, thereby allowing Dr. Bahnemann and her colleagues to assess the productivity and performance of the cells. “We can not only determine whether our protein of interest is being produced, but also identify the specific concentration of production,” she explains. Because perfecting adaptable and versatile biosensors that can consistently serve these purposes is a critical part of any continuous cultivation and production process, they are a central point of interest for Dr. Bahnemann’s team. “In the future, I envision that biosensors like the ones we are working on might be used not just in our cell cultivation system, but also on biomedical or even environmental samples,” she says. The ultimate aim is to develop a system for the production of recombinant proteins that is not only just as safe, efficient, and consistent as current methods, but also far more flexible. “We are currently working to develop a microfluidic system that achieves the efficient and reproducible transfection of mammalian cells, so that we can essentially insert the plasmid with the gene of interest and thus allow continuous and flexible protein production,” continues Dr. Bahnemann.

Personalised medicine
The system that Dr. Bahnemann envisions holds tremendous potential in the field of personalised medicine. Continuous bio-manufacturing systems could facilitate the safe and efficient development of medicines that are tailored directly to the needs of individual patients - which is especially useful in the case of ‘rare’ diseases, which, for economic reasons, are not a major priority for big players in the pharmaceutical industry. “When your focus is large-scale production, batch cultivation is desirable because it’s important to establish the stable transfection of cells. But if the focus is personalised medicine - for example, if you want to treat a disease with a low incidence rate that affects only a small proportion of the population - then it’s very useful to focus on smaller-scale production systems,” explains Dr. Bahnemann. “We can use such systems to produce small amounts of common proteins, or to produce very rare proteins, in a cost-efficient manner that simply isn’t possible using traditional production methods.”

A continuous cell transfection and cultivation system would also be of great interest to medical research departments. More efficient production of antibodies and proteins would help to facilitate research into their effectiveness and wider medical potential. “After a protein has actually been produced, researchers can check the bioactivity to see whether or not the product is actually efficacious,” says Dr. Bahnemann. Because many rare proteins are either extremely difficult to produce, or are not commercially viable to produce on an industrial scale, Dr. Bahnemann’s research into smaller-scale production methods holds tantalizing promise in this field as well. “We are currently working on producing our first proteins for research purposes. Because there’s such a great need in research applications for the production of certain types of growth factors, this is one of our top priorities at the moment,” she explains.
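As a rough illustration of the kind of real-time monitoring logic such biosensors could feed, the Python sketch below checks each reading against an acceptable range and raises alerts for out-of-range parameters; the parameter names and ranges are assumptions for illustration, not values from the project.

# Sketch of a real-time check on biosensor readings: parameters outside
# their acceptable range are flagged so an operator can intervene.
# Parameter names and ranges are illustrative assumptions.

ACCEPTABLE_RANGES = {
    "pH": (6.8, 7.4),
    "glucose_g_per_L": (1.0, 4.5),
    "product_mg_per_L": (0.1, float("inf")),  # production should stay above a floor
}

def check_reading(reading: dict[str, float]) -> list[str]:
    """Return alerts for every parameter outside its acceptable range."""
    alerts = []
    for name, value in reading.items():
        low, high = ACCEPTABLE_RANGES[name]
        if not (low <= value <= high):
            alerts.append(f"ALERT: {name}={value} outside [{low}, {high}]")
    return alerts

# One simulated reading from the cultivation system: the low pH is flagged.
print(check_reading({"pH": 6.5, "glucose_g_per_L": 2.2, "product_mg_per_L": 0.4}))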
Development of integrated continuous flow systems

Project Objectives

With the design, fabrication and integration of different functional microfluidic-based devices, this project pursues the construction of a novel controlled and continuous cell cultivation process for recombinant protein production and simultaneous analyte monitoring. Main features of the integrated system will be (1) a microfluidic LOC device for continuous gene delivery to the production cell line, (2) a continuous cultivation of the transfected cells in small perfusion culture reactors and (3) a novel electro-mechanical aptamer-based biosensor for specific monitoring of model target proteins and early detection of contamination. The final goal is the establishment of a parallel process for continuous gene delivery and production of various target proteins simultaneously.
Project Funding
This is a DFG Emmy Noether project. http://gepris.dfg.de/gepris/projekt/346772917?language=en
Project Partners
• Prof. Michael R. Hoffmann, California Institute of Technology (Caltech), Pasadena, USA • Prof. Ester Segal, Israel Institute of Technology (Technion), Haifa, Israel
Contact Details
Project Coordinator, Dr. Janina Bahnemann Emmy-Noether Fellow Institute of Technical Chemistry Gottfried-Wilhelm-Leibniz University Hannover Callinstr. 5, 30167 Hannover, Germany T: +49-511-762-2568 E: jbahnemann@iftc.uni-hannover.de W: https://www.tci.uni-hannover.de/ak_jbahnemann W: https://scholar.google.de/citations?user=S1HxpLkAAAAJ&hl=de&oi=ao
Dr. Janina Bahnemann
Dr. Janina Bahnemann is a Junior Research Group Leader for Cell Culture and Microsystems Technology at the Institute of Technical Chemistry at Leibniz University Hannover. Her main research interests lie in cell culture technology, bioprocess technology, microsystems technology, additive manufacturing/3D printing, and biosensors/bioanalytics.
Tapping into the potential of Planctomycetes Planctomycetes often hold a predominant position in their natural marine habitats, despite their relatively low growth rates. We spoke to Dr. Nicolai Kallscheuer about his work in characterising the metabolome of different recently isolated planctomycetal species, and its implications for the discovery of new bioactive small molecules with potential applications as antioxidants or antibiotics. Planctomycetes are uncommon Gram-negative bacteria, which are typically found on nutrient-rich surfaces within marine environments, such as algae and sponges. Species of the phylum Planctomycetes play an important role in the global carbon and nitrogen cycles. “Most parts of the ocean are oligotrophic. This means that nutrients are not present in high amounts. In contrast, surfaces of algae and kelp are quite nutrient-rich, compared to the water itself. This is why the bacteria like to grow on such surfaces,” explains Dr. Nicolai Kallscheuer. These nutrient-rich surfaces are highly competitive environments, so it is thought likely that competition between microorganisms essentially triggers the production of antimicrobials. This holds important implications in terms of the development of new drugs, an important motivating factor behind Dr. Kallscheuer’s work. Currently based in Christian Jogler’s lab at Radboud University in the Netherlands, Dr. Kallscheuer aims to build a deeper understanding of the secondary metabolism of different, as yet uncharacterized, Planctomycetes.
Planctomycetes as a source of novel secondary metabolites

Project Funding

This research was funded by the German Research Foundation (DFG), grant KA 4967/1-1, project number 405562673.

Contact Details

Dr. Nicolai Kallscheuer Radboud University, Institute for Water and Wetland Research, Department of Microbiology, Heyendaalseweg 135, 6525 AJ Nijmegen, The Netherlands E: N.Kallscheuer@science.ru.nl W: https://www.ru.nl/microbiology/department/people/nicolai-kallscheuer-dr/
Dr. Nicolai Kallscheuer
2008-2013: B.Sc. and M.Sc. at Aachen University of Applied Sciences, Campus Juelich, Germany; 2013-2018: PhD and Postdoc at Forschungszentrum Juelich / Heinrich-Heine-University Duesseldorf, Germany; since 2018: Postdoc at Radboud University Nijmegen, The Netherlands.
Scanning electron microscopy picture of the not yet described strain Poly41 isolated from plastic particles in the Baltic Sea close to Heiligendamm, Germany in October 2015.
“We’re looking at between 5-10 Planctomycetes that were previously harvested and isolated from marine environments. We chose those strains that looked most promising in terms of secondary metabolites, and focused our attention on those,” he says. The Streptomyces genus is responsible for producing the majority of natural antibiotics currently available; by comparing the physiology of Planctomycetes with Streptomyces, researchers hope to gain important insights. “It might be that certain Planctomycetes follow similar strategies for survival as Streptomycetes, including the production of bioactive molecules,” outlines Dr. Kallscheuer. Further analysis is required to harness their wider potential, however, and this forms a major part of Dr. Kallscheuer’s overall agenda. Researchers are using different approaches to investigate the metabolic diversity of different species. “One is through the co-cultivation of Planctomycetes with different natural competitors, after which we extract the superior performer, and then also analyse which compounds are produced. We compare it to a culture that was not triggered by the presence of a competitor, then we try and identify which specific molecules are only produced in the presence of that particular competitor,” says Dr. Kallscheuer.

Planctomycetes grow quite slowly, so it might be expected that they would be outcompeted by faster-growing bacteria; however, in their natural habitats this is in fact not the case, and researchers have found that Planctomycetes are dominant on such surfaces. “This seems somehow counter-intuitive. One possibility is that their survival is mediated by a set of bioactive, anti-microbial secondary metabolites,” explains Dr. Kallscheuer. A second approach involves using database predictions to assess which gene clusters of Planctomycetes are of interest with respect to the production of certain compounds. While the majority of antibiotic compounds derived from different streptomycetal species are known, Planctomycetes represent a completely
untapped source, so Dr. Kallscheuer says there is vast scope for investigation. “There are many interesting compounds to be found,” he stresses. This holds important implications in the context of growing concern about antibiotic resistance, and while research is still at a relatively early stage, Dr. Kallscheuer is very much aware of the wider picture. “Every anti-microbial compound that is identified could be important in future. Many strains are becoming resistant to existing antibiotics, so there’s an urgent need to look for new bioactive compounds,” he continues. “This is why we decided to look into a completely new phylum.”
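The logic of the co-cultivation comparison described above lends itself to a simple sketch: metabolite profiles from a co-culture and a mono-culture control are compared, and compounds detected only in the presence of the competitor are flagged as candidates. The Python below is a schematic illustration with placeholder compound identifiers, not an account of the group’s actual analysis pipeline.

# Compounds detected in the co-culture but not in the control are candidate
# competitor-induced metabolites. Identifiers are placeholders.

def competitor_induced(co_culture: set[str], control: set[str]) -> set[str]:
    """Return compounds present only when the competitor is present."""
    return co_culture - control

control_profile = {"compound_01", "compound_02", "compound_03"}
co_culture_profile = {"compound_01", "compound_02", "compound_03", "compound_07"}

print(competitor_induced(co_culture_profile, control_profile))  # {'compound_07'}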
[Figure: structures of the ‘V6’, ‘V18’ and ‘P2V3W15’ polyoxometalates (POMs).]
“More than Moore”: the interface between synthetic inorganic chemistry and condensed matter physics Molecular electronics is attracting a lot of attention as a means of enabling continued reductions in feature size in data storage and processing devices. Dr Kirill Monakhov and his colleagues synthesise stimuli-responsive molecules, characterise them, and apply them on substrate surfaces, work which could open up new possibilities in highly sought-after ‘More than Moore’ information technology. The microelectronics industry developed rapidly over the second half of the twentieth century and beyond, as continued down-scaling of CMOS devices opened up wider commercial opportunities. However, further miniaturisation down to the sub-10 nm regime (the so-called ‘quantum limit’) and below is growing ever more challenging, prompting researchers to explore other avenues; Dr Kirill Monakhov and his colleagues (Prof Rainer Waser in Jülich and Prof Bernd Abel in Leipzig) are investigating the field of molecular electronics. “We produce coordination compounds suitable for molecular deposition experiments and nanoscale imaging on surfaces. We try to design and then synthesise molecules that are likely to exhibit many discrete and thermodynamically stable oxidation states,” he explains. Among these molecules are biocompatible vanadium-oxo clusters (polyoxovanadates) and their heteropoly derivatives from a class of polyoxometalates (hereinafter referred to as POMs), which have beneficial structure–property characteristics. “For example, usually negatively charged polyoxovanadates (charge-balanced by e.g. quaternary ammonium cations or alkali
metal cations) feature a striking interplay of molecular charge, redox states and spin states,” says Dr Monakhov. “These can be manipulated by micro-spectroscopic means to address specific goals, not only in the domain of IT devices, but also in molecular biochemistry and biophysics.”
Molecular synthesis and surface studies Researchers at Dr Monakhov’s laboratory carry out multiple tasks, from preparing and characterising molecules to immobilising and electrically accessing them on substrate surfaces, work which could hold important implications for the future development of molecule-based computer memory cells in the electronics industry. The first step here is to produce the markedly different molecular structural motifs, including the development of conceptually new metal complexes and their supramolecular assemblies. “The molecules are designed to have a level of stability against air and moisture, solubility and a specific functionality. The latter could be a low molecular charge or the charge neutrality of a POM building block for example, or you might have organic ligation growth, that
provides a source of stabilisation on surfaces,” outlines Dr Monakhov. The next step after synthesis of these coordination compounds is to investigate their suitability for adsorption as intact molecules on surfaces, which is one of the crucial prerequisites for their applicability. “We usually explore deposition of our nonvolatile molecules in solution on different substrate surfaces under ultra-high vacuum – the surfaces can be conductive. Gold substrates are basically the first choice, due to the ease of handling,” continues Dr Monakhov. The team is also investigating molecular deposition and charge transport characteristics on semi-conductive surfaces, as they aim to demonstrate compatibility with CMOS devices. Different surface-sensitive methods are used here to elucidate the structure and properties of molecular adsorbates in the electrode environment. “We employ various different techniques for sub-molecular resolution imaging and the analysis of molecule–surface interfaces, from scanning tunnelling microscopy (STM) to grazing-incidence small-angle X-ray scattering (GISAXS),” explains Dr Monakhov. “First, we want to determine the adsorption type, the agglomeration tendency, the distribution and the oxidation state of deposited molecules.”
[Figure: two-terminal fundamental cell setup and three-terminal practical cell setup.]
By applying scanning tunnelling spectroscopy on individual molecules, Dr Monakhov and colleagues are able to detect electron transport. “We can look at how molecules are conductive and whether their molecular conductivity can be tracked as a function of individual metal redox states. Their reversible switching at room temperature and a low bias voltage is the major goal,” he says. This work is central to assessing the suitability and processability of these single molecules as components of multiple-state resistive (memristive) switching devices; one possible application is in brain-inspired neuromorphic computing, for example. “Different computer memory cells are currently available on the market, based for example on charge-based, magnetic or optical storage. We go beyond these von Neumann architecture-oriented concepts of constructing binary devices,” explains Dr Monakhov.
Resistive Random Access Memory (ReRAM) In collaboration with the group of Prof Waser at the Peter Grünberg-Institute (PGI-7) of the Forschungszentrum Jülich, Dr Monakhov’s lab is now investigating the concept of redoxbased resistive switching memories (see photograph opposite). These could bring some significant benefits over existing data storage and processing techniques in terms of low power consumption, higher scalability, non-volatility and lower switching time. The ReRAM cells are based on a relatively simple structure that helps to improve cost
efficiency. “In our model two-terminal cell setup we just have two electrodes – a surface as bottom and an STM-tip as top – and the molecules in between,” says Dr Monakhov. The long-term goal is to develop a molecule-based memristive switching device – a type of electrical component – operated by many discrete, stable and electrically accessible metal redox states, which could open up some extremely interesting possibilities. “Indeed, we want to implement such devices in the area of artificial intelligence,” says Dr Monakhov. “This would be for specific high-density, non-volatile and intelligent memories, which could realize data storage and processing in the same material through the additive character of available molecular redox states. This will allow us to substantially reduce internal data transfer, from which classical von Neumann architectures suffer, and furthermore improve the performance of IT systems.” One other possible application is in digital medicine. “We can also think about implementing the molecular memristive functionalities into neurostimulators. Thus, we would effectively replace a large number of the energy-intensive transistors by such memristive devices, ultimately improving efficiency and increasing storage capacity,” outlines Dr Monakhov. The microelectronics industry is currently dominated by metal-oxide-semiconductor structures, so shifting to molecular ‘More than Moore’ nanoelectronics could have far-reaching effects in this area. “In a memristive device, writing of information would occur at an increased potential in the range of a few volts, whereas reading of information would occur at significantly lower voltages, in the millivolt range. This is amazing, because the envisaged molecule-orchestrated read and write operations would enable us to sustainably increase the performance and energy efficiency of nanoelectronic devices, while keeping on down-scaling,” continues Dr Monakhov.
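To make the contrast with binary memory concrete, here is a speculative toy model in Python of a multi-state memristive cell as described above: one of several discrete redox states is written at a potential of a few volts, while readout at millivolt bias leaves the state untouched. The thresholds and the number of states are illustrative assumptions, not measured device parameters.

# Toy model of a multi-state (non-binary) memristive cell. Writing requires
# a few volts; reading at millivolt bias is non-destructive. All numbers
# are illustrative assumptions.

class MemristiveCell:
    WRITE_THRESHOLD_V = 2.0  # assumed write potential, a few volts
    READ_BIAS_V = 0.005      # assumed read bias, millivolt range

    def __init__(self, n_states: int = 4):
        self.n_states = n_states  # number of accessible redox states
        self.state = 0

    def write(self, target_state: int, applied_voltage: float) -> bool:
        """Switch the redox state only if the applied potential suffices."""
        if applied_voltage >= self.WRITE_THRESHOLD_V and 0 <= target_state < self.n_states:
            self.state = target_state
            return True
        return False

    def read(self) -> int:
        """Low-bias readout: returns the state without altering it."""
        return self.state

cell = MemristiveCell(n_states=4)
cell.write(3, applied_voltage=2.5)  # succeeds: potential above threshold
print(cell.read())                  # -> 3; four states store two bits per cell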
Moving fundamentals towards the application The long-term prospects are very exciting, yet there are still some technical issues to deal
with first, which are high on the agenda for Dr Monakhov and his colleagues. The current focus in research is on understanding how the coordination compounds behave on various conductive and semi-conductive surfaces, and how their structural and electronic properties respond to adsorption and further manipulation by the electric field of a scanning tunnelling microscope. “The next step beyond that will be to integrate single molecules into specific nano-gaps, in order to somehow electrically contact the molecules (see figure to left),” says Dr Monakhov. This remains one of the major challenges in the wider molecular electronics field, and Dr Monakhov says that it is central to the development of practical devices relevant to the needs of industry. “We need to find procedures by which we can address the stable and reproducible electrical contacts to single molecules that are wired to nano-gap electrodes,” he continues. “The usage of a three-terminal setup implying two electrodes and a contact substrate surface is moreover the key to determining the retention and switching time of the molecular metal redox states and the endurance of the potential single molecule-based memristive device.” However, one significant question that needs to be explored by chemists and physicists is how to conveniently embed the single molecules in nano-gaps, without ‘losing’ their inherent molecular orbital structure, as compared to their bulk state. “For this reason, not only the structure– property relationships of coordination
compounds should be fine-tuned, but also the nano-gap electrodes require optimisation by both chemical and physical means,” says Dr Monakhov. Molecular electronics brings together elements of many different fields, and while Dr Monakhov specialises in chemical synthesis and surface engineering, he says other disciplines also play an indispensable role in driving progress. “There are people from different backgrounds, including for example micro-spectroscopists, theoretical chemists, and materials scientists. Close collaboration with them is very important with respect to fundamental understanding and active control of molecular electron transport processes and for future industrial applications,” he outlines. This research is quite high-risk and exploratory in nature, yet this also means that the potential gains are correspondingly high. While the current focus in research is on the development of processable coordination compounds, the characterisation of their properties, the bottom-up surface modification and the electrical addressability of fabricated hybrid materials, Dr Monakhov is also fully aware of the wider picture. “We want to move all these underlying research questions and goals towards practical implementation, and we’re looking towards the next steps in this yet fundamental area. We’re looking for opportunities to do technology-focused research, outside the framework of our surface-oriented work,” he says.
SWITCHSPINPOV Development of Heteropolyvanadate Spin Clusters as Candidates for Future Redox-Based Memory Devices
Project Objectives
Project studies lie at the interface between synthetic inorganic chemistry and condensed matter physics. They target the fundamental understanding of multiple-state resistive (memristive) switching at the level of individual polyoxovanadates and their organically modified derivatives immobilised on conductive and semi-conductive surfaces. The far-reaching goal is to integrate the developed redox-active molecules into the industrially relevant “More than Moore” technology setups.
Project Funding
The project is supported by the Emmy Noether programme of the Deutsche Forschungsgemeinschaft (DFG).
Project Partners
• Professor Rainer Waser (JARA-FIT, Forschungszentrum Jülich und RWTH Aachen University)
Contact Details
Project spokesperson: Dr Kirill Monakhov Laboratory for Switchable Surfaces and Spintronics Leibniz Institute of Surface Engineering (IOM) Permoserstraße 15 04318 Leipzig Germany T: +49 (0)341 235 3364 E: kirill.monakhov@iom-leipzig.de W: https://www.iom-leipzig.de W: https://monakhovlab.jimdofree.com/ W: https://twitter.com/monakhovlab?lang=en
Kirill Monakhov
Maria Glöß (PhD in the group of Kirill Monakhov) and Dr Marco Moors (PostDoc in the group of Rainer Waser) perform a model STM experiment with polyoxovanadate molecules immobilised on a gold substrate surface at the UHV oxide cluster tool located at the Peter Grünberg Institute (PGI-7) of the FZ Jülich.
Kirill Monakhov currently holds a Group Leader position at the Leibniz Institute of Surface Engineering (IOM) in Leipzig. He received his Dr. rer. nat. degree at Heidelberg University in 2010 and then spent several years as a postdoctoral fellow at the University of Strasbourg and RWTH Aachen University. He was the recipient of the prestigious DFG Emmy Noether Fellowship in 2015 and of the Academia Europaea Burgen Scholarship in 2011.
A rational design of small molecules in specific biomolecular environments

[Figure: illustration of a small molecule (left) close to a lipid-membrane interface. Water is not represented.]
Researchers continue to search for new molecules with functions relevant to the development of new products, yet identifying the right molecules from the vast number available is a demanding task. We spoke to Dr Tristan Bereau about his work in using computer simulations to systematically explore chemical space and help accelerate compound discovery in soft matter. A lot of
attention in research is focused on moving towards a more structured approach to materials design, with scientists seeking to identify molecules with specific functionalities relevant for the development of new products. This includes using not just laboratory-based experiments, but also increasingly computer simulations, a topic at the heart of Dr Tristan Bereau’s research. “In my group we are trying to explore what we call chemical compound space, which is the ensemble of all stable molecules,” he outlines. “Most molecules have never been synthesized, so computer simulations can help us to discover interesting new compounds.” This work involves systematically exploring certain properties of these molecules, with the wider aim of moving towards a more rational approach to materials development, specifically for soft matter. Soft matter deals with materials that are structurally altered by thermal fluctuations - these include liquids, polymers, and colloids, as well as biological materials. A rational approach to materials design involves measuring the same quantity for many molecules, thereby establishing a relationship between chemical structure and function. Certain adjustments are required to make measurements for many molecules. Unlike in the laboratory, compounds don’t need to be synthesized, but the models need to be parameterized, to define how molecules interact with one another.
Optimizing these parameters has historically been a labour-intensive, manual task, but now machine learning can be used to help do this more efficiently. “We are effectively setting up a high-throughput screening experiment, using computer simulations. We can do this because we essentially parameterize every molecule automatically, rather than spending time on each compound,” continues Dr Bereau. The result of a high-throughput screening procedure is a database relating a large number of compounds to specific properties.

[Figure: artistic representation of a drug (yellow) interacting with a lipid membrane. Our scheme allows us to explore how varying the small molecule affects drug-membrane thermodynamics at high throughput.]
“We’re building databases of thermodynamic properties for small organic molecules in complex environments, like in a liquid,” continues Dr Bereau. “These properties are essential, yet we only have measurements for a small number of molecules.” Databases are often used to train machine learning models, which then predict compound properties for which there is no data, effectively filling the gaps for large portions of chemical space. Ultimately, such a computational approach can suggest compounds of interest to be tested experimentally in the laboratory, speeding up the compound discovery process.
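The gap-filling step can be sketched schematically: a regression model is trained on the compounds whose property is already in the database, then queried for compounds without data. The Python below uses synthetic placeholder descriptors and a generic scikit-learn regressor purely for illustration; it is not the group’s actual model.

# Train on compounds with known property values, predict for the rest.
# Features and property values are synthetic placeholders.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

known_features = rng.normal(size=(200, 8))   # descriptors of simulated compounds
known_property = known_features[:, 0] * 2.0 + rng.normal(scale=0.1, size=200)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(known_features, known_property)

unknown_features = rng.normal(size=(5, 8))   # compounds never simulated
print(model.predict(unknown_features))       # estimated properties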
Computer simulations This interest in developing computational high-throughput screening techniques to investigate thermodynamic properties is not new. However, so far work in this area has been limited by computational-power constraints. “Calculations of thermodynamic properties, such as free energies, are best obtained from molecular dynamics simulations, which unfortunately require significant computational investment,” explains Dr Bereau. This limits researchers to focusing on just a handful of compounds. To circumvent this barrier and reach much larger scales, Dr Bereau and his colleagues use so-called coarse-grained models, which simplify the representation by grouping several atoms together. “It’s a bit like looking at a Pointillism painting. If you look at these models from a
distance, they look like molecules, but if you look closer, you don’t see every atom,” he explains. Not only do the simulations converge much faster, but a single coarse-grained simulation also provides information about many compounds. Dr Bereau draws an analogy here with construction toy models. “If you’re asked to use Lego bricks to build two specific molecules that closely resemble each other, you might find that you don’t have enough resolution in your building blocks to tell them apart,” he explains. “So you would use the same set of bricks for the two compounds, as at this specific resolution they effectively look the same. This is something that we use to our advantage, in the sense that we only need a few simulations to screen a large part of chemical space. This makes coarse-graining an efficient strategy for high-throughput screening.”

[Figure: small organic molecule represented at an atomistic resolution (left) and coarse-grained resolution (right).]

A property that Dr Bereau and his colleagues have been studying is the propensity for a molecule to cross a cell membrane, an important quantity in drug development. “Before going into a cell, a drug has to permeate across the cell membrane, a soft architecture made primarily of phospholipids,” he explains. The researchers have been studying how likely a molecule is to cross a lipid membrane, and how quickly it does it. “In our last paper we predicted the permeability coefficient for several hundred thousand molecules, several orders of magnitude more than previously achieved from computer simulations,” outlines Dr Bereau. While the coarse-grained models are minimalistic, Dr Bereau says they are tailored to encode the relevant driving forces for this particular problem, making them extremely efficient. “We put a lot of the physical ingredients that we think are relevant into the model, making our calculations reliable,” he stresses.

Molecular design
This research not only holds important implications for the pharmaceutical sector, but is also of intrinsic scientific interest, enabling scientists to probe deeper into the physical chemistry relevant to a problem. “Once we have generated the data, we can establish a structure-property relationship,” outlines Dr Bereau. “This can help us in the design of new molecules. For example, if you want to design a molecule that can permeate through the lipid membrane easily, then the database can help researchers identify what type of chemical group is most relevant.” Researchers have already gathered a lot of data, and Dr Bereau hopes their work will help encourage further use of computer simulations for high-throughput screening. “This protocol could be adapted to different types of systems, materials, and environments,” he says. Researchers are also looking to analyse the results in greater depth. “There’s still a lot of information that we could tease out from the data. It will be very interesting to go back and use machine learning to gain further insights,” continues Dr Bereau.
[Figure: illustration of the reduction of chemical space – many structurally- and thermodynamically-similar molecules map to the same coarse-grained representation.]
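The resolution argument above can be mimicked in a few lines: if several molecules map to the same coarse-grained representation, one simulation per unique representation covers them all. The mapping and the simulate() stand-in below are assumptions for illustration only.

# Group molecules by their coarse-grained (CG) type and run one simulation
# per group; the result is shared by every member. Both functions are
# illustrative stand-ins.

from collections import defaultdict

def coarse_grain(molecule: str) -> str:
    # Placeholder for a real CG mapping that groups atoms into beads;
    # truncating the identifier mimics the loss of resolution.
    return molecule[:5]

def simulate(cg_type: str) -> float:
    # Deterministic stand-in for an expensive coarse-grained simulation.
    return float(sum(map(ord, cg_type)) % 100)

molecules = ["mol_A_variant1", "mol_A_variant2", "mol_B_variant1"]

groups = defaultdict(list)
for mol in molecules:
    groups[coarse_grain(mol)].append(mol)

predictions = {}
for cg_type, members in groups.items():
    value = simulate(cg_type)
    for mol in members:
        predictions[mol] = value

print(len(groups), "simulations cover", len(molecules), "molecules")  # 2 cover 3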
Importance sampling of chemical compound space: Thermodynamic properties from high-throughput coarse-grained simulations

Project Objectives

The Emmy Noether project “Importance Sampling in Chemical Space” aims at a systematic investigation and rational design of small molecules in specific biomolecular environments. This calls for the development of high-throughput computer simulations combined with data-driven techniques to generate and subsequently analyze large databases of thermodynamic properties.
Project Funding
Emmy Noether programme of the Deutsche Forschungsgemeinschaft (DFG)
Contact Details
Project Coordinator, Dr Tristan Bereau Max Planck Institute for Polymer Research Ackermannweg 10 55128 Mainz Germany T: +49 (0)6131 379 478 E: bereau@mpip-mainz.mpg.de W: http://www.mpip-mainz.mpg.de/~bereau/
Menichetti, Kanekal, Bereau, Drug–Membrane Permeability across Chemical Space, ACS Central Science, 5, 290-298 (2019); https://pubs.acs.org/doi/10.1021/acscentsci.8b00718
Menichetti, Kanekal, Kremer, Bereau, In silico screening of drug-membrane thermodynamics reveals linear relations between bulk partitioning and the potential of mean force, Journal of Chemical Physics, 147, 125101 (2017); https://doi.org/10.1063/1.4987012
Dr Tristan Bereau
Dr Tristan Bereau is an independent group leader of the Theory Group at the Max Planck Institute for Polymer Research in Mainz, Germany. His research focuses on the modeling of soft-matter and biomolecular systems using a combination of physics-based computer simulations and data-driven techniques.
Obtaining a novel type of contrast for Magnetic Resonance Imaging Although water is essential to life on earth, much remains to be learned about its fundamental properties. There are two protons in each water molecule and their magnetic properties can be configured in different ways; Professor Geoffrey Bodenhausen and his colleagues aim to gain deeper insights into the properties of water. Characterising the fundamental properties of water remains a challenging task. Some of these properties are related to the fact that the hydrogen atoms in water have a nucleus, which itself has a property called spin. “There are two protons in each water molecule, and the two spins talk to each other. It turns out that they can appear in different ways,” says Geoffrey Bodenhausen, Professor of Chemistry at the Ecole Normale Supérieure in Paris. In para-water, the two protons are arranged in what can be loosely described as an anti-parallel configuration. “That occurs when one proton is up while the other one is down, or vice-versa,” continues Bodenhausen. “This means that the proton spin is neither on the left side of the molecule nor on the right, but somehow in between, somehow de-localised.”
Para-water Para-water accounts for 25 percent of water at room temperature, the remaining 75 percent being ortho-water, in which the proton spins are symmetrical. As the Principal Investigator of the DiluteParaWater ERC project, Bodenhausen aims to look at water in which there is a deviation from this 25-75 balance, also known as a triplet-singlet imbalance (TSI). “The project is about trying to create or amplify this imbalance – to get more para-water than ortho-water beyond this 1:3 ratio, or the other way round,” he outlines. The first step is to prepare water in this particular state, then to isolate it. “We want to isolate this water somehow – in a solid, a liquid, or a matrix of some sort. From there we can then try to study its properties – to see if para-water has different properties from ortho-water,” says Bodenhausen. By studying the behaviour of para-water and ortho-water, researchers hope to make progress in understanding the overall mixture. However, preparing para-water itself is a challenging task; parallels can be drawn here with molecular hydrogen gas, which also has two protons. “It’s exactly like water, except that
there is no oxygen atom in between. From that point of view, hydrogen is analogous to water,” explains Bodenhausen. It is relatively easy to separate ortho-hydrogen from para-hydrogen, and researchers have found that they have very different properties. Yet separating the two forms of water is more difficult. “One of the difficulties is that the protons do not remain attached to the same oxygen atom. Water molecules tend to exchange their protons – so a proton can travel from one oxygen atom to another,” continues Bodenhausen. A pure para-water molecule loses its identity if a proton hops to a neighbouring oxygen atom in this way, which represents a challenge in terms of the project’s overall agenda. The idea of the project is to isolate water molecules by protecting them behind a layer of different molecules. “A solvent we have in mind is called dioxane. This would stick to the water molecule and prevent it from exchanging any protons with other water molecules,” says Bodenhausen. This would help preserve the TSI and turn it into a long-lived state (LLS). “The idea is that the water molecules in dioxane would be isolated, there would be few water molecules diluted among a larger number of dioxane molecules. In that
environment they would be protected and have a lifetime which we believe to be on the order of 20 seconds,” explains Bodenhausen. The imbalance between para-water and ortho-water is created by freezing ordinary water, protected by dilution in dioxane, at a temperature of around 1 Kelvin, which is about -272 °C. Researchers then apply a technique called dynamic nuclear polarisation (DNP) to bring the temperature of the spins down even lower, to about 50 millikelvin, around 20 times colder than the frozen sample. “We know from our calculations that there is an excess of the triplet state over the singlet state at that temperature – so that we have an excess of the ortho-water,” says Bodenhausen. The next question is how to study the imbalance at this very low temperature. “There’s no technique to visualise what we have done at very low temperatures. So we have to extract it from this very low temperature arrangement and transfer it to room temperature,” outlines Bodenhausen. This dissolution part of the procedure has to be performed very quickly, as the lifetime of the imbalance is limited to around 20 seconds. This is a major challenge for Bodenhausen and
his colleagues. “Currently we lose a lot of the imbalance. So the imbalance, instead of being massive, becomes very subtle, on the border of what we can observe,” he explains. A lot of energy in the project has been devoted to optimising the experimental procedures in order to preserve this imbalance in bulk water, which is proving challenging; however, researchers have made more progress in other avenues of investigation. “The concept can be extended to consider para- and ortho-drug molecules. Some drugs also have two protons, just like water has two protons. This has become a thriving line of research in its own right,” says Bodenhausen.
The potential here lies in the idea that a long-lived state can be used to improve the sensitivity of drug screening. Typically, drug screening is performed on a very large scale, and fairly high concentrations of both the drug and the target are required. “The target is typically a molecule in the human body, often a protein. A solution of purified target proteins has to be prepared, and that’s very expensive,” outlines Bodenhausen. The technique developed in the project enables drug researchers to reduce the concentration of their target, making this kind of experiment much cheaper. “It’s about good sensitivity for small amounts of purified proteins. We can work at much lower concentrations,” says Bodenhausen.

Transport phenomena
This technique also holds potential as a means of studying transport phenomena within fluids, such as flow and diffusion. While studying these phenomena in water has again proved challenging, Bodenhausen and his co-workers have gained important insights into other molecules with two protons. “We have been able to measure very slow diffusion, in particular the diffusion of very large molecules which – because of their size – is very slow. We have made quite a bit of progress in measuring the diffusion of large objects like large molecular assemblies,” he says. There are several potential avenues of investigation arising out of the project’s work; while fully aware of the wider
picture, Geoffrey Bodenhausen is keen to also pursue further research into water. “We’re still trying to improve the experimental procedures, as we still want to study water,” he stresses. The transfer of a TSI from an apparatus at very low temperature to a machine that works at room temperature is a key challenge in this respect. The aim is to essentially accelerate this process by transferring a solid pellet instead of a liquid, and so preserve the imbalance between para- and ortho-water more effectively. “We’ve made a lot of progress here, but the method is not yet functional. If we can make it work, that would be the ideal way of proving that we indeed have an imbalance, so that we can study the properties of triplet water, and the role of singlet water,” says Geoffrey Bodenhausen.
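As a brief aside, the 25:75 balance discussed above follows from elementary spin statistics; the LaTeX sketch below spells out this textbook argument (it is background material, not a formula taken from the project).

% Two proton spins (each spin-1/2) combine into a symmetric triplet
% (ortho, S = 1, three states) and an antisymmetric singlet (para, S = 0,
% one state). At room temperature all four states are essentially equally
% populated, giving
\[
\frac{n_{\mathrm{ortho}}}{n_{\mathrm{para}}}
  = \frac{2S_{\mathrm{triplet}} + 1}{2S_{\mathrm{singlet}} + 1}
  = \frac{3}{1},
\qquad
p_{\mathrm{para}} = \tfrac{1}{4}, \quad p_{\mathrm{ortho}} = \tfrac{3}{4}.
\]
% The triplet-singlet imbalance (TSI) then measures the deviation of the
% singlet population from its statistical value of 1/4.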
DILUTEPARAWATER DILUTEPARAWATER (Long-Lived Nuclear Magnetization in Dilute Para-Water) Project Objectives
The magnetization of hydrogen nuclei in H2O constitutes the basis of most applications of magnetic resonance imaging (MRI). Only ortho-water, where the two proton spins are in states that are symmetric with respect to permutation, features NMR-allowed transitions. Para-water is analogous to para-hydrogen, where the two proton spins are anti-symmetric with respect to permutation. The objective of this proposal is to render para-H2O accessible to observation. Several strategies will be developed for its preparation and observation in solids, liquids and the gas phase.
Project Funding
DILUTEPARAWATER is funded by the European Research Council. Project funding total is € 2,500,000.
Project Team
The research group at the Ecole Normale Supérieure in Paris with their equipment for demanding magnetic resonance experiments.
Contact Details
Project Coordinator, Prof. Geoffrey Bodenhausen Département de chimie Ecole Normale Supérieure 24 rue Lhomond, 75231 Paris cedex 05, France T: +33 1 44 32 34 02 W: http://paris-en-resonance.fr W: https://isic.epfl.ch/faculty-members/emeritus_professors/bodenhausen/ W: https://cordis.europa.eu/project/rcn/110296/factsheet/de
Professor Geoffrey Bodenhausen
(a) Carbon-13 NMR signals at 1.2 K in thermal equilibrium (blue) and amplified by DNP (red); (b) Boosting carbon-13 signals by repeated cross-polarization at 1.2 K; (c) Carbon-13 NMR signals after dissolution at 300 K in thermal equilibrium (blue) and amplified by DNP (red); (d) Slow decay of polarization at 300 K.
Geoffrey Bodenhausen is a Professor of Chemistry at ENS in Paris. He specialises in NMR and MRI, and played a pioneering role in the field of 2-dimensional Fourier transform NMR spectroscopy. He is a Fellow of the American Physical Society.
Are trees our best hope for fighting climate change?
Considering the range of innovations and ideas proposed to halt climate change, the humble tree may still be our most effective tool against carbon pollution. Photosynthesis removes carbon dioxide from the atmosphere naturally, and trees can store the carbon. Despite all the advisories not to destroy trees for this very reason, we are cutting down forests at an unprecedented pace. Is the startlingly simple idea of growing more trees, then, a solution that will work? By Richard Forsyth
Dense forests perform a vital duty in the checks and balances that allow the ‘machine’ of nature to perform as it should. Forests provide carbon sinks, which remove chemical compounds from the atmosphere through a process called sequestration. However, around 7.3 million hectares of forest are levelled every year, according to the United Nations Food and Agriculture Organization (FAO). To date, about half the world’s tropical forests have been levelled. The destruction of large forests – for agricultural land, resources or development – is known to be one of the most devastating human activities driving accelerated climate change. Despite an awareness of this process, policy makers in the many areas where the forests reside too often favour short-term economic development over the ‘less tangible’ climate advantages of leaving these regions alone. Case in point: the policies of Brazilian President Jair Bolsonaro have seen the rate of rainforest destruction soar in the Amazon region, with an increase of 88% compared to June 2018. This is worrying climate experts around the world, because the Amazon Basin is the world’s largest carbon sink after the ocean, absorbing 20% from our atmosphere. Ranching, building and mining are perceived as lucrative paths for development. But it’s not just the Amazon that’s being stripped away. Indonesia has the most deforestation. Since the last century, Indonesia has cleared close to 16 million hectares of forest land, fuelled by a global demand
for palm oil. Similarly, Bolivia has cleared land for its burgeoning soya industry and cattle ranching. In Peru, 80% of the destruction is illegal but nonetheless completely relentless. Other countries such as Thailand, the Democratic Republic of Congo, Russia, Mexico, Papua New Guinea, Sudan and Nigeria, among many others, have decimated forest lands in huge swathes. This is an indisputable global pattern of destruction, with no sign of abating. By one estimate, in only 100 years all the rainforests may be completely gone. There is currently a lot of research funding flowing into innovative technologies that focus on removing carbon dioxide from the atmosphere. Whilst some are deployed today, others are in pilot phases or on the drawing board. The idea is that these devices, like great carbon hoovers, suck the carbon straight from the air and store it underground. But a view is now gaining ground in the scientific community that the best solution is in fact the simplest: plant more trees, to compensate for those lost. Trees, in fact all plants, have always played a vital role in climate regulation. They use carbon dioxide to generate energy, and when they die some of the carbon is carried into the ground as they decay. Trees can be around 50 percent carbon by weight. With vast numbers of trees, and over time, this leads to a net reduction of carbon in the atmosphere.
Reforesting the planet Thomas Crowther at the Crowther Lab, ETH Zurich, recently co-authored a paper featured in the journal Science, claiming
that reforestation “…is so much more vastly powerful than anyone expected… By far, it’s the top climate change solution in terms of carbon potential.” Crowther and his team analysed satellite imagery to locate forests and work out where they could possibly take hold again. The study generated the first global map of where trees can naturally exist in today’s climate, and the team calculated how many trees could exist in those places and how much carbon they could store, excluding agricultural and urban areas. The quantitative map shows how it is possible to offset climate change. The globe-spanning assessment of satellite imagery focused on the potential for forestry to flourish in areas already cut down. The conclusion was that if saplings were allowed to grow in areas already cleared, this would increase forested regions by one-third and take out 205 billion metric tonnes of carbon of the 300 billion tonnes that humans have released into the atmosphere since the Industrial Revolution. Essentially, this means taking out two-thirds of the emissions put there by mankind since the 19th century. They had found a powerful argument in this solution by nature. So just how many more trees could we pack in, in this hypothetical massive reforestation scenario? They concluded that, taking into consideration current climate conditions, the planet could support 4.4 billion hectares of continuous cover – 1.6 billion more hectares than the current 2.8 billion. Of these 1.6 billion, 0.9
billion hectares fulfil the criterion of not being used by humans. To give this perspective – that’s an area around the size of the whole USA. It should be noted that these regions would not be grasslands; they are those places where trees could naturally proliferate into forest land. According to Prof. Thomas Crowther: “We all knew that restoring forests could play a part in tackling climate change, but we didn’t really know how big the impact would be. Our study shows clearly that forest restoration is the best climate change solution available today. But we must act quickly, as new forests will take decades to mature and achieve their full potential as a source of natural carbon storage.” The research indicates where in the world reforestation is best suited. The best six countries are: Russia (151 million hectares); the US (103 million hectares); Canada (78.4 million hectares); Australia (58 million hectares); Brazil (49.7 million hectares); and China (40.2 million hectares). Crowther’s claim that this is “…the best climate change solution proposed to date” has created renewed interest in the concept. However, despite the paper’s message of hope in an increasingly hopeless situation, such a solution would also require a drastic cut to emissions in parallel to be effective. Whilst this is a grand and some say unrealistic idea, the media traction the report has received shows a willingness, at least, to highlight ways out of our current climate crisis, at a time when we are facing ever more bleak assessments of our path into climate chaos.
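As a quick sanity check on these figures (a rough calculation using the numbers exactly as reported above, not new data):

# Rough arithmetic check on the reported figures.
restorable_carbon = 205   # billion tonnes the restored forests could store
human_emissions = 300     # billion tonnes emitted since the Industrial Revolution
print(f"{restorable_carbon / human_emissions:.0%}")   # ~68%, i.e. roughly two-thirds

current_cover = 2.8       # billion hectares of existing continuous tree cover
potential_cover = 4.4     # billion hectares the current climate could support
print(round(potential_cover - current_cover, 1))      # 1.6 billion hectares of room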
Seeds of hope The idea of reforestation is specifically to target areas that have been cleared, letting the trees regrow or replanting them systematically. This is fundamentally different to afforestation, which means planting trees in locations where there were previously none. Afforestation is also proposed by the Intergovernmental Panel on Climate Change (IPCC) as part of the overall solution for keeping global warming to 1.5°C above pre-industrial levels. However, afforestation is not nearly as effective as growing natural forests. Forests contain a wider range of plant species that occupy all heights and spaces, so there is more surface area for capturing sunlight for photosynthesis, through which they absorb CO2 to generate energy. Forests are also intended to be left alone, whereas planted timber will almost always be harvested systematically, leaving significant gaps in effectiveness as new trees have to mature all over again with each cycle. It cannot be emphasised enough that, despite all the quibbles over attempting either reforestation or afforestation as part of a larger global solution, growing trees will have a positive impact on the climate. Afforestation projects have in fact had a significant push in recent years, all around the world. In 2006 the United Nations Environment Programme (UNEP) launched the Billion Tree campaign, simply put, with the target to plant a billion trees by 2007. This was achieved, and with a sense of optimism around potential,
brought about new targets, starting with the 2008 campaign to grow 7 billion trees - which was also achieved. As of 2016, a total of 14.2 billion trees had been planted. China, India, Ethiopia and Pakistan each planted more than a billion trees in this endeavour, whilst Mexico, France, Turkey, Peru, Nigeria and other countries each planted hundreds of millions. It is interesting to see China lead this movement, as the world’s fast-tracked economic powerhouse that has also been widely blamed for much of the industrial pollution fuelling climate change. From 1990 to 2015 China planted more new forest than any other country in the world, equating to 79 million hectares and more than $100 billion in afforestation investment, and many farmers were encouraged to transform their farmland into forests. The initiative was called the Three-North Shelter Forest Program, but it is also known as the Great Green Wall. Importantly, this was not all done out of altruistic accountability to the world. The main reason for the effort had much to do with holding back encroaching desertification, where fertile land degrades into wasteland desert. The Gobi desert is expanding at an alarming rate, damaging agriculture significantly. The hope is that the planted trees will survive long enough to retain moisture and eventually stabilise the soil. So far, many of the trees have not survived. This is a difficult battle.
Any threats to food security and human habitation quickly become very practical and serious problems for populations, and that is when governments pay attention. Perhaps that is why it feels like we are at a different kind of crossroads in how climate change is, and will be, viewed. The effects of climate change are very much occurring today. The hypothetical arguments are no longer ideas but realities. Eventually, climate impact, land degradation, weather emergencies, food insecurity, mass migration - all the fallout from climate-related resource mismanagement - creates very tangible national security issues that affect countries’ economies and citizens in drastic ways. With this in mind, afforestation and reforestation will surely remain ‘on the table’ as straightforward counterbalances to climate effects, wherever countries stand to lose sustainable sources of food production or regions where people can live.
Does one person count? On a final note, there is an inspirational story worth sharing about reforestation. Whilst we can see how big data, governments and campaigns can potentially redefine landscapes and grow forests, even one determined person can create a huge impact. There is a wonderful story that gained traction around 2012, with help from documentary makers, about a man called Jadav Payeng, who single-handedly and successfully reforested an area
covering around 550 hectares. He tended what became the Mulai Reserve on Majuli Island in the Brahmaputra River near Kokilamukh in the Jorhat district of Assam, India. The island was suffering from severe soil erosion on its banks, and the speculation was that it would end up submerged; the government had all but given up on it as a lost land. However, Jadav began planting bamboo, followed by a variety of other plant species. Some thirty years on from the start of his one-man reforestation project, his fertile forest, rich in plant life, is now also home to Indian rhinoceros, reptiles, deer, rabbits, birds and even elephants. Jadav earned the nickname of the Forest Man of India, and showed that reforestation - with the ecosystems and animals that come as a result - just requires perseverance. Jadav’s efforts show us three things. Firstly, where forests have lived before, even when it looks unlikely, they can often thrive again. Secondly, anyone can do this kind of work: planting trees is not a science reserved for government bodies and scientific groups, it is simply gardening, and nature does most of the work for us. Lastly, he showed us that this works. Where there was nothing, there is now a healthy ecosystem - with benefits beyond the trees’ carbon absorption - where animals proliferate, plants co-exist and the environment has been successfully regenerated. He showed the world what we can do in terms of regenerating nature, with all its benefits. It just takes the will.
Putting corals on a sustainable footing Coral reefs are among the longest-living ecosystems on Earth, yet they are struggling to keep pace with unprecedented rates of environmental change, leading to a rapid worldwide decline in coverage. Human intervention is essential to ensuring the long-term future of coral reefs, a major motivating factor behind Dr. Hanna Koch’s work in crossbreeding corals.
Underwater coral outplant. Photograph by Conor Goulding.
The world’s coral
reefs are declining rapidly, with overall coverage estimated to have decreased by 80% in the Caribbean since the ‘80s, and by around 50% worldwide. Rising sea surface temperatures are a major factor in this decline. “When temperatures become too high the coral becomes stressed, and in response it expels its algal endosymbionts, which are responsible for the majority of the coral’s nutritional requirements,” explains Dr. Hanna Koch. The corals are essentially starving, and although they can recover from acute thermal events, these types of events are increasing in severity and duration, leading to repeated mass die-offs. “There’s just not enough time for the corals to recover in between such stressful events,” says Dr. Koch. Exacerbating the situation is increasing disease prevalence, especially in the Caribbean and Western Atlantic, where the rapid loss of corals to emerging diseases is unprecedented in the geological record. “Finding and utilizing endemic temperature-tolerant and disease-resistant genotypes is crucial for developing appropriate genetic and reproductive interventions,” adds Dr. Koch. As a DFG postdoctoral fellow and visiting research scientist at Mote Marine Laboratory in the Florida Keys, Dr. Koch is working to help protect these threatened coral species, restore natural populations and prepare them for future environmental change. Coral reef ecosystems constitute a mere fraction of the marine environment (< 1%) but have profound importance for humans and the natural world. They support more than 25% of all marine life, serve as biodiversity reservoirs and provide essential environmental and economic services including food, jobs, oxygen, tourism, coastline protection, and raw resources for novel medicines. However, the marine environment is
changing at an alarming rate, leaving these long-lived, slow-growing organisms unable to adapt at sufficient rates. “Some coral populations are now so small and degraded that sexual reproduction and gene flow, processes crucial for introducing new individuals with potentially novel adaptive alleles, are becoming less efficient or are disrupted altogether,” warns Dr. Koch.
The idea is to crossbreed corals with certain desirable traits for generating robust offspring populations with increased genetic diversity that can be used for resilience-based coral reef restoration.

As part of Dr. Koch’s team, intern Allyson DeMerlis helps raise and test the sexual recruits (baby corals).

Assisted evolution
A gap has emerged between the adaptability of reef-building corals and current rates of environmental change, which Dr. Koch aims to bridge by harnessing and accelerating, in a controlled laboratory setting, naturally-occurring evolutionary processes for building coral reef resilience - a concept referred to as assisted evolution. While there are different approaches to assisted evolution of corals, Dr. Koch is focusing on intraspecific selective breeding. “The idea is to crossbreed corals with different phenotypes and genotypes conferring resistance or resilience. We want to generate offspring populations with increased adaptive potential by crossing thermotolerant and disease resistant corals, for example, to produce individuals that have a combination of beneficial traits, while at the same time controlling for genetic diversity,” she explains. “In order for adaptive traits to be inherited across successive generations - an essential component of adaptive evolution - they must have a genetic basis; evaluating the heritability of these traits is therefore a necessary part of this work.” The process of coral sexual reproduction, cross fertilization and adaptation already happens in nature, but at rates feared to be too slow to keep up with surrounding environmental change, indicating that human intervention is necessary. With selective breeding, researchers first identify corals with certain desirable traits like increased thermotolerance, skeletal density and disease resistance. “After screening the different genotypes for resilience to a variety of environmental stressors, we tag and track the corals so we can collect gametes from them during annual mass spawning events, which we then take back to the lab for performing controlled crosses,” explains Dr. Koch. “In this way, we can raise very large numbers of offspring in the lab, most of which would not survive in nature.”
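Heritability, mentioned above, has a standard quantitative-genetics reading: the fraction of variation in a trait that is passed from parents to offspring. As a purely illustrative sketch of how it might be estimated from breeding data (the numbers below are simulated, not data from Dr. Koch’s programme), narrow-sense heritability can be read off the slope of an offspring-on-midparent regression:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated example: thermotolerance scores (arbitrary units) for 50 crosses.
true_h2 = 0.4                                  # heritability assumed for the simulation
midparent = rng.normal(10.0, 1.0, size=50)     # mean trait value of each parent pair
noise = rng.normal(0.0, 1.0, size=50)          # environmental and measurement variation
offspring = 10.0 + true_h2 * (midparent - 10.0) + noise

# The slope of offspring value regressed on midparent value estimates
# narrow-sense heritability (h^2).
slope, intercept = np.polyfit(midparent, offspring, 1)
print(f"Estimated h^2 ≈ {slope:.2f}")
```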
Mote Marine Laboratory’s Elizabeth Moore International Center for Coral Reef Research and Restoration in the Florida Keys.
CONDUCTING ASSISTED EVOLUTION
Conducting assisted evolution in a threatened coral species for promoting more resilient reef ecosystems

Project Objectives
This work explores novel genetic and reproductive interventions for generating resilient coral populations able to withstand rapidly changing environmental conditions associated with climate change.
The primary long-term goals of coral reef restoration are to increase coral cover and genetic diversity, so that essential ecosystem services are maintained or restored. “Increasing coral cover and population sizes can be achieved relatively quickly via coral gardening, a process of planting large numbers of laboratory- or field-raised coral fragments onto degraded reefs, but this technique relies on the asexual propagation of a limited number of top-performing genotypes,” explains Dr. Koch. “Sexual reproduction and genetic diversity are crucial for the long-term persistence of coral reefs, as genetic diversity promotes resilience by providing a buffer against novel environmental change and the flexibility to adapt,” she continues. “Coral reef restoration programmes should therefore strive to include sexually-reproducing corals - or get outplanted populations to a sexually-reproducing state as quickly as possible.” Another goal, under the umbrella of assisted evolution, is to generate enhanced coral stocks resilient to predicted future changes as well. “We have modelled projections for temperature and pH conditions for the year 2100, so we can use the land nursery and climate simulator system here at the International Center for Coral Reef Research and Restoration for running experiments to test the performance of sexually-produced offspring under those predicted future conditions,” Dr. Koch explains. “Once we identify corals resilient to projected conditions, we can store them in a coral gene bank for future selective breeding and restoration schemes.”
The research objectives are to 1) breed and raise corals with certain desirable traits in a controlled laboratory setting, 2) investigate the genetic basis (heritability) of relevant tolerance and resistance traits, 3) produce robust offspring populations with enhanced stress tolerance and increased adaptive potential, and 4) use these resilient stocks for coral reef restoration. This assisted evolution approach harnesses the power of sexual reproduction for generating novel genotypes and increasing genetic diversity while simultaneously increasing resilience within offspring populations.
Adult staghorn coral colony (2-year old outplant).
Gravid colony revealing pink gamete bundles developing inside.
Project Funding
Postdoctoral Fellow | Deutsche Forschungsgemeinschaft (DFG), Germany.
Contact Details
Project Coordinator, Dr. Hanna R. Koch, PhD
Visiting Research Scientist
International Center for Coral Reef Research and Restoration
Summerland Key, FL 33042
T: +1 (305) 745-2729 x711
E: hkoch@mote.org
W: https://hanna-koch.com

Dr. Hanna R. Koch, PhD
Coral larvae settlement tanks.
10-month old sexual recruit ready to be screened for resiliency.
Photograph by Cody Engelsma
Dr. Hanna R. Koch is a Postdoctoral Research Fellow with The German Research Foundation and a Visiting Scientist at Mote Marine Laboratory’s International Center for Coral Reef Research and Restoration in the Florida Keys. She is a marine and evolutionary biologist working towards advancing resilience-based coral reef restoration strategies through assisted evolution and selective breeding.
Performing a sexual maturity assessment on outplanted coral colonies.
Environmental change in the ocean: past and present Oxygen concentrations in the world’s oceans are decreasing, which is affecting nutrient recycling from sediments and thus the nutrient balance in the water column. We spoke to Dr Florian Scholz about his research into iron cycling in marine sediments, which has important implications for nutrient release to the water column and the health of marine ecosystems more generally. The sediments on
the ocean floor can act as both a source and a sink of nutrients. This depends largely on the chemical composition of the surrounding seawater, in particular the oxygen concentration and the pH level, which have changed significantly over recent years. “These parameters are changing because of anthropogenic perturbations, so nutrient fluxes from the sea floor into the water column – and vice-versa – are likely to change as well,” says Dr Florian Scholz. As the Principal Investigator of the ICONOX project, Dr Scholz aims to investigate the impact of these changes in seawater on nutrient fluxes from the ocean floor. “The major nutrients we’re looking at are phosphate, iron and other bio-essential trace metals like cobalt, zinc and copper,” he outlines. “We’re looking at what factors control whether sediments act as a source or a sink of nutrients. We’re also interested in what happens upon release very close to the ocean floor, in the near-bottom water.” Nutrients are mobilized in the surface sediment through the re-mineralization of organic material or the dissolution of terrigenous minerals. Some of these nutrients are released into the bottom water, while others are re-precipitated and buried. Metals released at the seafloor are partly re-precipitated in the bottom water. It is currently possible to sample around 10 centimetres of the bottom water overlying sediments, while the open water column can also be sampled at a safe distance, several metres above the seafloor. However, Dr Scholz says it is the layer in between the bottom water and the sediments that is crucial to building a more complete picture. “In a way it is the missing link. You’ve got to understand what’s happening in this layer in order to understand how fluxes from the sediments
to the bottom water translate into nutrient transport throughout the open water column,” he explains. Sampling this layer is a challenging task, as it’s difficult to generate meaningful trace metal data without contaminating the sample in some way. Researchers in the project have built a device, the Benthic Trace Profiler, to sample particles and water from this layer, which will enable Dr Scholz and his colleagues to gain deeper insights into how much of the sediment-derived micro-nutrients can be transported to the surface of the ocean. “The Benthic Trace Profiler is mainly used for collecting particles and water samples, which are then analysed in the laboratory,” he continues.
We need to understand how nutrient fluxes will respond to declining oxygen concentrations in order to predict how marine ecosystems that rely on these nutrients will be affected by global environmental change.

Sampling techniques
This device is being used in the project to gather samples from areas of the oceans in which oxygen concentrations in the water column are low, and where these sedimentary source-sink fluxes are particularly important, such as around the eastern boundary upwelling regions of the world’s oceans. A variety of other techniques are also being used in the project to take samples of sediment, pore water and bottom water. “We can recover sediment by sediment coring. The water in between the solid sediment particles is called pore water. The composition of this pore water provides information on water-solid interactions. Pore water is recovered from sediment cores using various techniques,” explains Dr Scholz. Nutrient fluxes from the sediment into the bottom water can be determined by placing an instrument called a benthic lander on the sea floor. The lander contains a glass chamber, which is driven into the surface sediment, thereby incubating a sample of bottom water and surface sediment. “We then take water samples from these chambers in regular time intervals,” outlines Dr Scholz. “By measuring the evolution of nutrient concentrations in these chambers over time, we can determine the flux of phosphate or iron from the sediments into the bottom water.” “We use concentration profiles in pore waters and benthic chamber incubations to quantify fluxes and to determine which environmental parameters and processes control the magnitude of these fluxes,” continues Dr Scholz.
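The benthic chamber measurement reduces to a simple calculation: the flux across the sediment-water interface is the rate of concentration change in the enclosed water, scaled by the chamber’s volume-to-area ratio. The sketch below illustrates this with made-up numbers; the chamber dimensions and concentrations are hypothetical, not ICONOX measurements.

```python
import numpy as np

# Hypothetical benthic chamber incubation: phosphate measured at regular
# intervals in the bottom water enclosed above the sediment.
time_h = np.array([0.0, 6.0, 12.0, 18.0, 24.0])       # hours since deployment
conc_umol_l = np.array([2.0, 2.3, 2.55, 2.9, 3.15])   # phosphate (µmol per litre)

chamber_volume_l = 20.0   # litres of bottom water enclosed by the chamber
chamber_area_m2 = 0.04    # sediment surface area covered by the chamber (m^2)

# The slope of a linear fit gives the rate of concentration increase.
rate_umol_per_l_h = np.polyfit(time_h, conc_umol_l, 1)[0]

# Flux across the sediment-water interface, in µmol per m^2 per hour.
flux = rate_umol_per_l_h * chamber_volume_l / chamber_area_m2
print(f"Phosphate flux ≈ {flux:.0f} µmol per m² per hour")   # ≈ 24 with these numbers
```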
Researchers are also investigating the extent to which the magnitude of fluxes between the sediments and the bottom water depends on parameters, like the concentration of oxygen or other oxidizing agents. Once a relationship has been established, Dr Scholz says it’s possible to look towards predicting how the iron flux will evolve in future. “The interesting thing here is that there are feedback connections. If oxygen concentrations in the ocean go down, more iron would be released from the sea floor into the water column,” he explains. “As iron is often the limiting nutrient for primary production, there is a chance of higher rates of primary production, and increased fluxes. This in turn leads to the export of more organic carbon, and to more oxygen consumption. In a way this closes the feedback loop.”
Recovery of a sediment core in the Gulf of California. The sedimentary record is used to reconstruct biogeochemical cycling in the Eastern Pacific over the last 10,000 years.
ICONOX
Iron cycling in continental margin sediments and the nutrient and oxygen balance of the ocean

Project Objectives
Iron and other trace metals are essential micronutrients for biological productivity in the ocean. Depending on environmental conditions such as oxygen concentration and pH, marine sediments can represent an important source or sink for these compounds. Within the ICONOX project, observational, experimental and numerical approaches are applied to constrain how changing micronutrient fluxes across the sediment-seawater interface affect biogeochemical cycling in the ocean.
Schematic sketch illustrating how oxygen-dependent nutrient fluxes across the sediment-water interface amplify primary production, organic matter export and oxygen consumption in the water column. Dr. Scholz and his collaborators use benthic landers and the Benthic Trace Profiler to quantify chemical fluxes at the seafloor.
There are other links between nutrient cycles to consider, and it is not clear how future changes will affect ocean chemistry and marine ecosystems. Analysis of past changes can help researchers assess how the oceans are likely to change in future, so there is also a paleoceanographic dimension to the project, in which Dr Scholz and his colleagues are investigating the geological past. “We’re looking at how those feedbacks may have affected environmental change in the ocean during past periods of global warming,” he outlines. This work involves not only using the seafloor as a record of past environmental change, but also investigating its role in driving that change, via nutrient release or burial in sediments. “Surface sediments are an active player in ocean biogeochemical dynamics,” continues Dr Scholz. “The sea floor acts as a kind of lynchpin.” The aim here is to investigate how past changes in sedimentary fluxes have affected oxygen concentrations in the oceans over longer timescales. Sedimentary records have been retrieved from the later part of the Cretaceous period (~93 million years ago), during which substantial areas of the ocean floor were covered by anoxic bottom water. Researchers are analysing these records to try and build a fuller picture of past changes. “We do detailed observational research in different ocean regions to understand processes on the sea floor and in the layer above it. We then extrapolate from our findings to look
at past periods of global warming, to try and understand the biogeochemical dynamics that took place during these periods,” outlines Dr Scholz. Models are also being used in the project to build a deeper picture. “We use global biogeochemical models to evaluate how sedimentary nutrient fluxes affect the dynamics of marine biogeochemical cycles, both today and in Earth’s history,” says Dr Scholz.
Ocean change
This research is being conducted against a backdrop of growing concern over how the oceans are changing. Ocean warming, acidification and de-oxygenation can all affect nutrient exchange between the seafloor and the water column. “As the ocean becomes warmer, it also gets more stratified. There is less transport of oxygenated water masses into the ocean interior, so less vertical exchange,” explains Dr Scholz. This has significant consequences for marine ecosystems, underlining the wider importance of this research. Samples have been taken from several different ocean regions, and Dr Scholz has further expeditions planned over the coming years, which will allow him to place the findings in a wider context. “We will participate in expeditions to the Benguela upwelling off Namibia this summer, and there will be a cruise in the Arabian Sea next year. We are trying to find recurring patterns in different areas,” he outlines.
Project Funding
The ICONOX project is funded by the Emmy Noether Program of the Deutsche Forschungsgemeinschaft.
Contact Details
Project Coordinator, Dr Florian Scholz
GEOMAR Helmholtz Centre for Ocean Research Kiel
Wischhofstr. 1-3
24148 Kiel | Germany
T: +49 (0)431 600 2113
E: fscholz@geomar.de
W: https://www.geomar.de/en/research/ongoing-projects/project-details/prj/99900085/
W: https://www.geomar.de/en/mitarbeiter/fb2/mg/fscholz/
Scholz, F., 2018. Identifying oxygen minimum zone-type biogeochemical cycling in Earth history using inorganic geochemical proxies. Earth-Science Reviews 184, 29-45.
Dr Florian Scholz
Florian Scholz is a sediment geochemist and paleobiogeochemist at GEOMAR Helmholtz Centre for Ocean Research Kiel. His research centers on geochemical processes and fluxes at the seafloor and their role in global biogeochemical cycles, today and through Earth’s history.
Oceans of enzyme possibilities
INMARE sampling. © Carla CCR de Carvalho
The world’s oceans are full of interesting microorganisms, yet their wider potential has not yet been fully explored. The INMARE project brought together researchers from several different disciplines to discover new enzymes and bioactives from marine environments, which will help chemical and pharmaceutical companies work more effectively, as Professor Peter Golyshin explains.

The world’s oceans
are home to an abundance of marine life that is uniquely adapted to living in sometimes very harsh environments. This is particularly true of marine microorganisms. Enzymes from microorganisms adapted to living in these extreme environments hold great potential for biotechnology. Significant parts of the oceans are inhabited by psychrophilic microbes, for example, which are well adapted to low temperatures, while many other ‘extremophiles’ live in extreme conditions in marine environments. “For example, there are thermophiles that live at high temperatures, in hot vents. Other organisms prefer alkaline conditions or elevated salt concentrations, and there are also organisms that thrive under conditions of multiple extremes,” outlines Peter Golyshin, Professor of Biology at Bangor University in the UK. Professor Golyshin and his colleagues in the INMARE project set out to streamline the enzyme discovery process. “Many people have invested a lot of time into improving enzymes, yet often it doesn’t lead anywhere. In this project, our aim was to screen enzymes from microbes that have already evolved to operate in extreme environments,” he explains. These extreme environments hold a great deal of scientific interest, as they can provide important insights into the origins of life; in addition, there is a strong applied dimension to the project’s work. Chemicals are often synthesised in harsh conditions, for example, so companies need enzymes that can cope. “For industry, it is important to have enzymes that can operate under the conditions of industrial processes,” stresses
Professor Golyshin. In the pharmaceutical and chemical industries, many substrates have very low solubility, so a reaction needs to be set up accordingly. “The reaction must be set up in the right conditions, e.g. with high pressure, temperature or high concentration of solvents, to enable solubility of substrates,” explains Professor Golyshin. “Microbes that thrive in similar environments may be capable of producing enzymes to do these reactions. That’s the rationale behind exploring these environments - we hope to find some microbes and their enzymes that have these corresponding features.”
We harvest DNA from the environment, then we perform shotgun DNA sequencing to identify which genes are present in the sample. Then we identify the relevant ones and express them in different microbial hosts, like yeasts, pseudomonads or bacilli.

Sampling sites
This work builds on the earlier MAMBA project, in which researchers gathered and stored large numbers of genetic resources from microorganisms in different marine environments, including deep hypersaline anoxic basins in the Eastern Mediterranean Sea, epibionts of marine invertebrates, and petroleum-degrading marine microbial communities. The project brings together 24 multi-disciplinary partners from across Europe, providing a solid foundation to identify interesting enzymes, characterise them and find ways to apply them in industry. “In MAMBA we gathered very large sets of genomic and metagenomic resources, which we screened for enzymatic activities. In INMARE we are carrying on with that work, but we also have some new sampling sites and new ways to identify enzymes and metabolites,” says Professor Golyshin. Samples have been gathered from many different environments. “For example, our Norwegian colleagues are taking samples of hydrothermal fields (‘black smokers’) in the northern part of the mid-Atlantic ridge. Microbial DNA is then extracted to establish genomic libraries, which are screened for activities. Alternatively, the DNA can be directly sequenced and then analysed for genes for enzyme candidates to be cloned and characterised,” outlines Professor Golyshin. “We also have other partners who take samples from sea floor brine lakes in the Mediterranean. There are also further under-explored sites, including those with shallow hot vents.” These environments are home to some unique microbes, whose wider potential Professor Golyshin and his colleagues sought to harness. However, it is very difficult to disentangle the activity of particular microbes and identify industrially relevant enzymes.
“There may be millions of different microbial species, most of which are not amenable to cultivation in the laboratory, and therefore their enzymes and metabolites remain very difficult for us to access,” explains Professor Golyshin. A technique called ‘metagenomics analysis’ helps researchers to gain access to industrially relevant enzymes from as yet uncultured microorganisms. “Essentially, we harvest DNA from the environment, then we perform shotgun DNA sequencing to identify which genes are present in the sample. Then we identify the relevant ones and express them in different microbial hosts, like yeasts, pseudomonads or bacilli, for example,” continues Professor Golyshin. “We also establish gene libraries, then screen those for the defined activity that we are interested in. In this way, we can discover genuinely new enzymes.” This could include substrate-promiscuous enzymes capable of catalysing different reactions, which could help chemical and pharmaceutical companies work more efficiently. Researchers in the project have established a large collection of esterases, which have been analysed against a similarly large quantity of ester substrates, and Professor Golyshin says the results so far are very interesting. “We’ve found that a large proportion of these enzymes can take not just one or two substrates, but many more. Many
of these enzymes can do different things and work with different reactions. We have some enzymes that catalyse reactions with 76 substrates,” he outlines. Some commonalities in the structure of the enzymes have also been found, which holds important implications in terms of the project’s wider agenda. “We are now able to predict promiscuity just by looking at the sequence of the protein. This is one of the most important discoveries we have made in the project,” continues Professor Golyshin.
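To give a flavour of what a first computational pass over such gene libraries can look like, the sketch below flags candidate esterases by searching translated sequences for the conserved G-X-S-X-G motif found in many serine hydrolases. This is a generic, hypothetical filter shown for illustration, not INMARE’s actual discovery pipeline, and the sequences are invented:

```python
import re

# Conserved catalytic motif of many serine hydrolases (esterases, lipases):
# glycine - any residue - serine - any residue - glycine.
MOTIF = re.compile(r"G.S.G")

# Invented translated ORFs standing in for a metagenomic library.
orfs = {
    "contig12_orf3": "MKLAVLGLSAGGHSAMVLAAQRPDLI",
    "contig47_orf1": "MSTNPKPQRKTKRNTNRRPQDVKFPNA",
}

# Keep only sequences containing the motif, as candidates for cloning
# and downstream activity screening.
candidates = {name: seq for name, seq in orfs.items() if MOTIF.search(seq)}
print(candidates)   # only contig12_orf3 passes
```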
Commercial sector
The project’s industrial partners play an important role in guiding this research and ensuring it is relevant to commercial needs, which is an important part of the wider agenda. While it is of course difficult to forecast the outcome of this kind of research, Professor Golyshin is keen to translate the project’s work into commercial development. “We try to deliver enzymes or compounds to the commercial sector that could be used in the future,” he stresses.
Sampling equipment being deployed from a research vessel off the coast of Messina, Italy. © CNR (Consiglio Nazionale delle Ricerche)
INMARE
Industrial Applications of Marine Enzymes: Innovative screening and expression platforms to discover and use the functional protein diversity from the sea
Project Objectives
The main objectives of INMARE were to:
1. Streamline and significantly shorten the pipelines of marine enzyme and bioactive compound discovery for industrial applications;
2. Develop marine enzyme collections with a high proportion of promising enzyme all-rounders;
3. Identify new lead products and deliver prototypes for new biocatalytic processes based on marine microbial enzyme resources for targeted production of fine chemicals, drugs and materials for use in environmental clean-up applications.
Project Funding
INMARE was funded by the European Union’s Horizon 2020 research and innovation programme, grant agreement No. 634486.
Project Partners
There are 24 INMARE Project Partners Please see website for full details: http://www.inmare-h2020.eu/participants.php.en
Contact Details
Professor Peter Golyshin, INMARE Coordinator
School of Natural Sciences
Bangor University, Gwynedd, LL57 2UW
T: +44 (0)1248 383 629
E: p.golyshin@bangor.ac.uk
W: http://www.inmare-h2020.eu/

Ferrer M, Martínez-Martínez M, Bargiela R, Streit WR, Golyshina OV & Golyshin PN. Estimating the success of enzyme bioprospecting through metagenomics: current status and future trends. Microb Biotechnol. 2016, 9(1), 22-34. DOI: 10.1111/1751-7915.12309.

All INMARE publications can be found here: http://www.inmare-h2020.eu/publications.php.en
Professor Peter Golyshin
One area in which these enzymes could be applied is the pharmaceutical industry, which could benefit from a more streamlined development process. “Currently it takes a long time to bring a new discovery from the laboratory to the market, as it has to go through pre-clinical and clinical trials. A lot of money and time is required,” points out Professor Golyshin. “One of our partners is developing anti-tumour drugs, and through the work of the project they have succeeded in finding a new compound, which they have already patented.” A further important dimension of the project’s work is the possibility of using new marine enzymes in environmental clean-up operations. Petroleum constituents can cause a lot of damage when they leak into marine environments. “Petroleum constituents dissolve the membranes of microbial cells,” explains Professor Golyshin. Certain microbes can help clean up marine environments which have been exposed to these types of pollutants. “There are microbes that can degrade petroleum hydrocarbons. They are naturally present in the sea, but normally they are found in very low numbers. We’ve found that when we take a sample of sea-water and add some petroleum, the number of these bacteria increases - they come to comprise up to 90 percent of the total microbial community. These bacteria are naturally associated with algae, or unicellular eukaryotes, which photosynthesise and produce fats, lipids, and other molecules similar to petroleum components,” continues Professor Golyshin. These bacteria can degrade petroleum effectively in the right conditions, when they have access to oxygen and nitrogen, while researchers are also working on enzymes capable of degrading plastic. This is a prominent issue at the moment, with concern rising over
the amount of plastic in the world’s oceans and its ecological impact, and Professor Golyshin says a lot of attention is being paid to this area. “The EU and different national funding agencies are looking to improve our basic knowledge, aiming to build a toolbox to degrade plastic,” he outlines. This will remain an important part of Professor Golyshin’s agenda in future, along with building on the progress that has been made over the course of the INMARE project and moving towards commercial applications. “A spin-off company has already been established, and a number of patents have been filed, which was a major part of the initial project application. The project has recently concluded, and we have produced more than 100 peer-reviewed papers and book chapters, 4 patents and one start-up company.” It would seem that INMARE’s ambitious voyage of discovery has certainly been a successful one.

Image of a 5 litre fermenter. © Carla CCR de Carvalho

INMARE Scientist Professor Michail Yakimov, Institute for Coastal Marine Environment (IAMC), CNR. © CNR (Consiglio Nazionale delle Ricerche)
Peter Golyshin is Professor of Biology at Bangor University, a role in which he investigates the environmental genomics of microorganisms. He performed the first omics-based studies in marine obligate hydrocarbonoclastic bacteria, which play a key role in marine oil degradation worldwide. He pioneered activity-based metagenomic enzyme discovery in the cow rumen, earthworm gut and deep-sea environments.
Understanding carbon cycling in tropical soil systems As the population surrounding the African great lakes and the Congo basin continues to grow, increasing amounts of land are being converted into agricultural use. A deeper understanding of how carbon is sequestered and released from soils in tropical regions will both inform models of earth system dynamics and help farmers in the region use land effectively, as Dr Sebastian Doetterl explains. Little is known
about biogeochemical cycling and the underlying mechanisms in tropical Africa, where soils function very differently from temperate zones. As the Principal Investigator of the DFG funded Tropsoc project, Dr Sebastian Doetterl aims to help build a deeper understanding of carbon sequestration and release in the area. “We’re looking at sites in Uganda, Rwanda and the Congo,” he says. While the study sites in these countries are broadly similar in terms of climate and population pressure, the way land is used and the geology are very different. “For example, in Rwanda there are a lot of sedimentary rocks and often terraced, very old cropland systems to limit erosion, while we find in our sites in the Congo an abundance of basalt and nearly unconsolidated cropland expansion into primary rain forest. The severity of the impact of land degradation on the fertility of cropland soils depends on the geochemistry of the bedrock, as some rocks can weather more quickly and release more nutrients than others,” explains Dr Doetterl.
Carbon stabilisation potential
The geology of an area is therefore an important issue in terms of the project’s overall agenda, as soil fertility and the carbon stabilisation potential of soils - which constrains greenhouse gas emissions from soils - are closely related to mineral availability. Carbon stabilisation is controlled by a very complex interplay between different factors, with Dr Doetterl and his colleagues focusing on topographical and geological drivers. “We are looking at systems that have a comparable amount of rainfall and a comparable topography,” he says. “We want to understand how much carbon is stabilized in soils under different environmental systems (i.e. cropland versus forests) along a geochemical and topographic gradient.” “Soil is mobilized on slopes in each of the areas we are looking at, leading to a lot of erosion, and later deposited in valley systems as sediments. The environmental conditions between slopes and valleys then drive the stability of the mobilized carbon.” This lateral transport of soils is an important factor with respect to carbon storage. Vast areas
TropSOC fieldwork in the Eastern Congo basin, a region troubled by conflict and violence, scarcity of fertile land and a fast-growing population.
of cropland suffer from degradation and erosion, which affects the properties of soil on these slopes. “That also drives differences in the carbon stabilisation potential,” explains Dr Doetterl. In this context, the researchers distinguish between stable and dynamic landscapes to gain deeper insights into carbon sequestration. “Plateaus are the stable landscapes, as there is basically no loss of soil. The dynamic part of the landscape is the slopes, where soil can be lost through erosion,” says Dr Doetterl. “As topsoil is lost through erosion, important plant nutrients and stabilized carbon are lost as well.” Changes in soil properties will also have a significant impact on forest vegetation. Even though erosion is much less pronounced than on cropland, it will change nutrient availability in some areas, a topic that Dr Doetterl and his colleagues are addressing. “In our sites we investigate functional traits of plants and changes in plant strategies. For example, we analyze whether plants invest more in roots in nutrient-depleted environments than they do in nutrient-enriched environments, and whether this can be tied to the geochemistry of soils on the one hand and to topography and erosion on the other,” he outlines. The wider aim here is to link the insights on soil carbon with the functional traits of plants in forests, in terms of carbon sequestration. “Hence, it’s not an exclusively soil-focused project. We also want to raise awareness of questions around tropical biogeochemical cycling and how it drives plant-soil interactions,” continues Dr Doetterl.
Impact of the study
This project aims to bring a wider relevance to modelling carbon in earth system dynamics by taking into account soil dynamics in less-studied areas across the globe. Many current models use observations on carbon dynamics gathered from temperate zones, which do not fully reflect soil functioning in tropical Africa. Dr Doetterl hopes to make an important contribution in this respect. “We aim to help reduce the uncertainties around tropical carbon release and stabilisation in large-scale models,” he outlines. The project’s research will also have an impact at the local level, helping farmers to use land more effectively. “We’re working with the International Institute of Tropical Agriculture (IITA), which is specifically interested in improving farming techniques and the livelihood of subsistence farmers in developing countries,” says Dr Doetterl. “We will provide information on crop yields and how they are affected by erosion, specifically in relation to mineralogical properties.”
TropSOC
Tropical soil organic carbon dynamics along erosional disturbance gradients in relation to variability in soil geochemistry and land use

Project Funding
TropSOC is funded by the German Research Foundation (DFG) as an Emmy Noether junior research group hosted at Augsburg University (Germany). Funding was approx. €1.7 million including overheads; an additional €300,000 came from structural funds of the hosting university.

Contact Details
T: +41 44 633 60 20
E: doetterl@geo.uni-augsburg.de
W: https://www.congobiogeochem.com/tropsoc
Dr Sebastian Doetterl is an Assistant Professor for Soil Resources at ETH Zurich’s Department of Environmental Systems Science. He holds a PhD in Geosciences and a Diploma in Physical Geography; his main research focuses on investigating geoecosystems at different temporal and spatial scales.
The numbers behind a new threat to the ozone layer The ozone layer is thought to be recovering after the production of CFCs was banned in the ‘80s, yet increased emissions of Very-Short Lived Halocarbons (VSLH) from treatment of industrial and ballast water could represent a new threat. We spoke to Dr Susann Tegtmeier about her work in assessing the likely extent of future VSLH emissions and their wider impact on the atmosphere. The depletion of
the ozone layer has been a major concern since the problem was brought to wider attention in the 1970s, leading eventually to the Montreal Protocol of 1987, which set limits on the production of chlorofluorocarbons (CFCs). While this has led to the onset of ozone recovery, increased levels of Very Short-Lived Halocarbons (VSLH) might represent a new threat to the ozone layer. “We find VSLH in coastal waters where there are a lot of macroalgae, for example,” says Dr Susann Tegtmeier. While VSLH have long been present in the oceans, forming part of the background state, anthropogenic activities such as the oxidative treatment of industrial water and ship ballast water are leading to greater emissions, a topic at the core of Dr Tegtmeier’s research. A ship usually takes on ballast water during cargo unloading in one harbour and discharges it during loading operations in another harbour. The bacteria and small living organisms in the ballast water can spread more widely in the new ecosystems after being released, potentially causing serious problems. “Some of these marine invasive species have proved to be very damaging. For instance, the Zebra mussel has invaded many parts of the Baltic Sea and is now causing large economic losses for electric power generators by blocking their water intake pipes,” outlines Dr Tegtmeier. In order to address this problem, the International Maritime Organisation (IMO) adopted an agreement on the management and cleaning of ballast water, which came into force in 2017. “We will see an increase in the cleaning of ballast water over the next four or five years,” continues Dr Tegtmeier. The treatment of industrial water represents another major source of halocarbons, and while the concentrations are lower than in ballast water, the overall quantity involved is higher. Nuclear power plants use large volumes of water for cooling, for example, which also needs to be treated before it can be released. In general, there are two different methods of cleaning industrial cooling and ballast water. “One involves adding oxidants, such as chlorine, to the
Figure 1: Schematic of natural and anthropogenic Very Short-Lived Halocarbons (VSLH) emissions and their impact on atmospheric processes.
water,” says Dr Tegtmeier. This method leads to a significant increase in the production of VSLH, a drawback that is avoided in the second method of treating water, based on the use of UV light. “The UV method will not lead to the production of halocarbons – only the use of oxidants like ozone and chlorine will have such an effect,” explains Dr Tegtmeier. Many shipping and industrial companies are likely to use the chlorination method however, as it’s cheaper than the UV method and allows them to treat huge volumes of water.

VSLH can be easily emitted from the ocean into the atmosphere. My research is about finding out what effects these gases have in the atmosphere once they are released from the treated water.

VSLH emissions
In her research, Dr Tegtmeier aims to assess the likely extent of future emissions of VSLH, and their wider impact on the atmosphere. “We want to quantify the amount of halocarbons that are going to be produced in industrial and ballast water, if we assume that the majority of companies will use the chlorination method,” she outlines. “VSLH can be easily emitted from the ocean into the atmosphere. My research is about finding out what effects these gases have in the atmosphere once they are released from the treated water.” The impact of VSLH from treated water depends to a large degree on where the water is released and the prevailing transport patterns. The ozone layer is located between 20-25 km above the Earth’s surface, in the stratosphere, and gases need to get into uplift regimes around the tropics in order to make it to this kind of altitude. “This is where the short-lived species are very different from the long-lived species, the CFCs,” says Dr Tegtmeier. While long-lived species often circulate for long enough in the atmosphere to eventually reach the tropics, from where they can get into the stratosphere, it’s different for the short-lived species. “After a while they are chemically destroyed, and if they haven’t been taken into the stratosphere by this point, they won’t affect the ozone layer,” explains Dr Tegtmeier. “But if they are emitted in the tropical oceans, the region where the convective uplift regimes are located, they can be transported into the stratosphere.” The location of these uplift regimes holds clear importance to assessing the likely impact of VSLHs on the ozone layer, which is a major priority for Dr Tegtmeier.
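The difference between the two classes of gases can be made concrete with a simple e-folding estimate: the fraction of a gas surviving a transport time t, given an atmospheric lifetime τ, is exp(-t/τ). The sketch below uses rough, illustrative values for the lifetimes and the transport time; none of them are project results.

```python
import math

transport_time_days = 60.0   # illustrative: time to reach the tropical uplift regions

# Rough, illustrative atmospheric lifetimes expressed in days.
lifetimes_days = {
    "CFC-11 (long-lived)": 52 * 365,   # around half a century
    "bromoform (a VSLH)": 24,          # a few weeks
}

for gas, tau in lifetimes_days.items():
    surviving = math.exp(-transport_time_days / tau)
    print(f"{gas}: {surviving:.1%} survives {transport_time_days:.0f} days of transport")
# The long-lived gas arrives essentially intact; only a small fraction of the
# short-lived gas does, which is why where it is emitted matters so much.
```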
Figure 2: Estimated annual ballast water discharge volume from each harbour in the modified world port ranking in Southeast Asia with the names of the largest 26 ports. Contours and black contour lines show climatological ocean surface velocities [cm s-1].

Figure 3: Annual mean surface (20 m) spread of disinfection by-products from discharge in Singapore relative to the total number of particles released. Contours show the area of the percentage of particles (30%, 50%, 70% and 90%) characterised by the highest density.

Above figures from: Maas, J., Tegtmeier, S., Quack, B., Biastoch, A., Durgadoo, J. V., Rühs, S., Gollasch, S., and David, M.: Simulating the spread of disinfection by-products and anthropogenic bromoform emissions from ballast water discharge in Southeast Asia, Ocean Sci., 15, 891-904, https://doi.org/10.5194/os-15-891-2019, 2019.
The core question here is the quantity of VSLHs that could reach the stratosphere. “We have to deal with large uncertainties, because we don’t really know how many ships and industrial companies will use chlorination and how many will use the UV method,” outlines Dr Tegtmeier. It is also important to be aware that the amount of VSLH produced in treated water can vary significantly, largely due to environmental conditions, such as the amount of organic material. It is however possible to look at various different scenarios, from which Dr Tegtmeier can then build a more complete picture. “If we assume a worst-case scenario, we find a much larger amount of halocarbons produced by industry than assumed so far,” she continues. “If this does prove to be the case, I think there would need to be some level of communication with the industry.”
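One way to picture this kind of scenario analysis is to propagate the uncertain inputs - the share of operators choosing chlorination and the halocarbon yield per cubic metre treated - through a simple Monte Carlo calculation. All numbers in the sketch below are hypothetical placeholders chosen for illustration, not estimates from Dr Tegtmeier’s work:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000   # number of sampled scenarios

# Hypothetical placeholder inputs (for illustration only):
treated_m3_per_yr = 3.0e9                        # ballast water treated per year (m^3)
chlorination_share = rng.uniform(0.3, 0.9, n)    # fraction of ships using chlorination
yield_g_per_m3 = rng.uniform(0.005, 0.05, n)     # bromoform produced per m^3 treated (g)

# Annual bromoform production, in tonnes, for each sampled scenario.
emissions_t = treated_m3_per_yr * chlorination_share * yield_g_per_m3 / 1e6

low, median, high = np.percentile(emissions_t, [5, 50, 95])
print(f"5th-95th percentile range: {low:.0f}-{high:.0f} t/yr (median {median:.0f})")
```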
Macroalgae farms
The growth of macroalgae farms is another major consideration in terms of assessing the likely future level of VSLH in the atmosphere. Macroalgae have very different halocarbon production rates, an issue that Dr Tegtmeier and her colleagues aim to understand in
greater depth. “What is the contribution of the macroalgae farms to the total amount of halocarbons coming from all macroalgae?” she outlines. The possibility of developing larger-scale macroalgae farms has attracted attention as a means of capturing and storing carbon; while this is a major global priority, Dr Tegtmeier believes it’s important to first assess the likely impact of these farms. “This is something we want to monitor and understand. Macroalgae farms on the open ocean might have a very big impact on ozone chemistry,” she cautions. This is not an immediate prospect however, and at this stage Dr Tegtmeier is focused more on investigating what happens in the ocean and the atmosphere after ballast or industrial water has been released. This is closely related to the question of how increased levels of VSLH will affect the ozone layer. “What will happen with the ozone layer in the future? What will be the impact of various anthropogenic activities? As scientists, we try to build a deeper understanding of the effects that increased emissions would have,” says Dr Tegtmeier. These questions will form a central part of Dr Tegtmeier’s research agenda over the coming years.
A NEW THREAT
A New Threat to the Stratospheric Ozone Layer from Anthropogenic Very Short-lived Halocarbons

Project Objectives
There is only an emerging awareness of the impending increase of atmospheric VSLH, which represents a new threat to ozone recovery. In addition to the damaging effect on the ozone layer, increased levels of VSLH will also impact the radiative forcing and the oxidizing capacity of the atmosphere - the capacity of the atmosphere to ultimately remove many species emitted from natural and anthropogenic sources. Assessing future emissions of anthropogenic VSLH, their multisided impact on atmospheric chemistry and physics, and in particular their potential to prolong stratospheric ozone depletion (Figure 1), is therefore a correspondingly high priority.
Project Funding
The project is supported by the Emmy Noether programme of the German Research Foundation (DFG).
Contact Details
Project Coordinator, Dr Susann Tegtmeier
GEOMAR Helmholtz-Zentrum für Ozeanforschung Kiel
Düsternbrooker Weg 20
24105 Kiel
T: +49 (0)431 600-4160
E: stegtmeier@geomar.de
W: https://www.geomar.de/index.php?id=stegtmeier
Dr Susann Tegtmeier
Dr Susann Tegtmeier is an Emmy Noether Research Group Leader at the Helmholtz Centre for Ocean Research (GEOMAR) in Kiel, Germany. Previously she held fellowships at institutions in Germany and Canada. She teaches courses on several different areas of meteorology, oceanography and climate dynamics, and has published 35 peer-reviewed papers. She will be starting a faculty position at the University of Saskatchewan in September 2019.
Picturing the landscape of human evolution Landscapes in tectonically active regions like the Rift Valley in East Africa have changed dramatically over the course of human history, helping to shape human evolution and dispersal patterns. We spoke to Dr Simon Kübler about his work in reconstructing earlier landscapes, which will help researchers understand the factors that affected early settlement patterns. The landscape of
the East African Rift Valley is actively deforming over geological timescales as a complex system of extensional faults in the lithosphere continues to develop, and evidence of both recent and earlier volcanic activity can also be found in the region. Similar geological features can be observed in the southern Oregon area of the US, providing a basis for researchers to draw comparisons. “There are lots of volcanic rocks and ash deposits from the two regions that can be compared,” says Dr Simon Kübler, a post-doctoral fellow in geology. Currently based at the University of Colorado, Boulder, Dr Kübler is investigating the nature of earlier landscapes, aiming to understand their role in driving human dispersal and evolution. “I want to understand how much these landscapes have changed since the time they have been inhabited,” he explains. “With southern Oregon, that goes back to something like 12,000-15,000 years before the present day. Whereas with the Rift Valley, we’re talking more about 500,000-1,500,000 years ago.”

Paleolandscape-reconstructions of tectonically active regions - a new tool to predict fossil site locations and discern patterns of hominin inhabitance

DFG Postdoctoral fellowship awarded to Simon Kübler (KU 3512/2-1), international host Professor Gregory Tucker, CU Boulder, USA.

Contact Details
Simon Kübler, Research Fellow
Cooperative Institute for Research in Environmental Sciences (CIRES)
Community Surface Dynamics Modeling System (CSDMS)
University of Colorado, Boulder
T: +1 720-569-1100
E: simkuebler81@gmail.com
W: http://gepris.dfg.de/gepris/projekt/408311491?language=en
Simon Kübler is a DFG Research Fellow at the Cooperative Institute for Research in Environmental Sciences, University of Colorado, Boulder, USA. He is interested in understanding the physical constraints of dynamic landscapes, earthquake processes and the interplay between geology, soil edaphics and human and animal nutrition.
Field team in southern Kenya November 2017. Left to right: Simon Tim (field guide, Ol Tepesi, Kenya), Maurice Obunga (soil scientist, National Museums of Kenya, Nairobi), Dr. Stephen Rucina (Palaeoecologist, National Museums of Kenya, Nairobi), Dr. Simon Kübler (Geologist, CU Boulder).
Geological data
This research is built on analysis of geological data, with researchers using existing fossil archives to establish time and environmental constraints on palaeolandscapes. By combining data from geological mapping with satellite-based remote sensing techniques, researchers can investigate how geological features have changed since the time that these regions have been inhabited. “This includes quantifying displacements of tectonic faults, lake level fluctuations, river erosion and other processes. We are trying to reconstruct the landscape at different points in time,” outlines Dr Kübler. The nature of the landscape also helps shape human dispersal patterns; over time, humans have tended to settle in areas where water is easily accessible, which Dr Kübler says is related to the presence of tectonic faults. “Areas characterised by tectonic faulting are more likely to have stable springs over a longer period of time,” he explains. “High nutrient levels in soils are also often related to the presence of volcanic rocks like basalt, which provide important phosphate- and calcium-bearing minerals in large amounts.”
The quality of the soils that grow on these rocks is likely to be quite high, another important factor in human dispersal patterns. The geology of an area is also a major driver behind animal migration patterns, such as the wildebeest migration in East Africa, in which millions of wildebeest move across the Serengeti in search of green pasture. “This is partly connected to the patchy soil nutrient properties of the region, which is directly linked to the underlying geology,” says Dr Kübler. Researchers are investigating how geological factors control soil quality and water availability, as well as how tectonics can constrain the movements of both humans and animals, questions which hold clear relevance to our understanding of human evolution. “How could humans effectively move through a landscape as complex as the Rift Valley? How could early humans or hominins have exploited a tectonically active landscape to predict animal migration routes?” continues Dr Kübler. This topic will continue to form a central part of Dr Kübler’s research agenda. While his approach so far has been largely field-based, Dr Kübler now aims to bring together his own knowledge with the modelling expertise of his colleagues at Boulder. “We want to visualise how landscapes may have changed over time. We’ve incorporated a combination of various processes, to make palaeolandscape models that are as realistic as possible,” he explains. This will enable scientists to show how a landscape may have been strategically exploited by early humans in the past, while Dr Kübler is also looking into whether this approach could be applied more widely. “If we can identify a certain set of features unique to these sites, then it may provide a new tool to identify new fossil sites elsewhere in the world,” he outlines. “By using this combination of active tectonics and other geological processes, we want to promote future fossil finds.”
Complex landscape resulting from tectonic faulting in basalt east of the Summer Lake basin, close to the Paisley cave site. The steep cliffs formed by extensional faulting create terrain that is difficult for migrating animals to cross.
A new perspective on co-evolution
Plants provide insects not only with food but also with toxins that can protect them against predators, a factor of clear importance to their evolutionary prospects. By investigating the interactions between plants and insects, Dr Georg Petschenka and his colleagues at the University of Giessen aim to build a deeper understanding of their co-evolution.

A wide variety of defensive chemical compounds have evolved in plants over time to ward off herbivorous parasites, yet these defences can almost invariably be overcome by adapted species of insects. Many insects also sequester these compounds, suggesting that they play a wider role. “This means that not only can these insects cope with these compounds, but they also store them in their bodies. A famous example is the monarch butterfly (Danaus plexippus), whose caterpillars acquire heart poisons from milkweed (Asclepias spp.); the caterpillar and the resulting butterfly are then protected against predators,” explains Dr Georg Petschenka. As an Emmy Noether Group Leader at the University of Giessen, Dr Petschenka aims to probe deeper into insect resistance to plant toxins, as well as other questions around the relationship between insects and plants, including the evolutionary implications. “We want to find out how insects overcome plant defence mechanisms and what this means for insect-plant co-evolution,” he outlines.
Co-evolution
The concept of co-evolution between plants and insects, which is classically thought of as bi-trophic, is central to this work. In this concept, an insect occupying a novel dietary niche might evolve a resistance trait to overcome its new host’s defences, after which it can then thrive on the available dietary resources. “This is the classic assumption. However, some of the insects we’re studying – especially the milkweed bugs, the Lygaeinae – seem to interact with specific plants only because toxins are there,” says Dr Petschenka. While these insects may feed on many plants, they also interact with individual plant species because they have particular compounds which they can then use for their own defence. “This uptake of compounds for use as a defence mechanism is called sequestration. We’ve found that this is an evolutionary driver, that makes an insect stick to a specific plant species in some cases,” continues Dr Petschenka. Researchers in the group are also studying a number of other topics in this area as part of their work in looking at sequestration on a multi-layered scale.
Many herbivorous insects use plants not only as a dietary resource, but also to sequester plant toxins as a defense against predators. Conceptually, this means that insects exploit plants in at least two ways, i.e. for nutrition and defense. While insects may gain food and toxins from the same plant in many cases, other species seem to use multiple plants for feeding but acquire toxins exclusively from specific plant species. In conclusion, sequestration of toxins for defense is an evolutionary driver selecting for specialized insect-plant interactions also in insects that are generalists from a nutritional perspective.
Different insects are studied in the laboratory, with Dr Petschenka and his colleagues looking at how they perform on different diets. “We investigate whether the inclusion of what we think are evolutionarily novel sources providing toxins leads to dietary benefits. We study natural histories and do lots of analytical chemistry, to figure out what insects actually store from plants, and also work in the predator-prey sphere,” he explains. The aim here is to assess whether the sequestration of toxins provides protection against predators. “We work with lacewings as insect predators, and also have a collaborator experimenting with birds. We are trying to get the full picture of co-evolution on different hierarchical levels,” says Dr Petschenka. This research holds important implications with respect to the management of insect populations. Just as insects can develop resistance to plant toxins, they can also evolve resistance to insecticides. “This is a big problem for insect and pest control,” stresses Dr Petschenka. The
group’s research has led to deeper insights into insect biology, which could prove important from a conservation perspective. “It may seem surprising that some species are ecologically fragile, as they are adaptable and have a high level of dietary flexibility. However, we have shown that they depend on specific plants to acquire certain toxins – if this plant is not present, then they may not thrive in the population,” continues Dr Petschenka. “So it’s clear that we need to understand the ecology of different species in much more detail if we are to protect them effectively.” The side-effects of insecticides are currently tested mainly against bees, yet as dietary generalists bees are already pre-adapted to many noxious compounds, so are unlikely to be affected as severely as more ecologically fragile species that depend on a single source of food. A more evolutionary perspective is required to accurately assess risk, believes Dr Petschenka. “We need to integrate other concepts into risk assessment strategies,” he says.
Differential processing of host plant toxins by insect herbivores as a driver of multi-trophic interactions
Dr. Georg Petschenka
Emmy Noether research group leader
Justus-Liebig-Universität Giessen
Institut für Insektenbiotechnologie
Heinrich-Buff-Ring 26-32
35392 Giessen, Germany
T: +49 641 99 37603
E: Georg.P.Petschenka@agrar.uni-giessen.de
W: http://www.uni-giessen.de/fbz/fb09/institute/iib/ento/ag/petschenka
Dr. Georg Petschenka studied Biology in Tübingen and Bayreuth. Since 2017 he has been leading an Emmy Noether research group (DFG) and a research group funded by the Hessen State Ministry of Higher Education, Research and the Arts (HMWK) via the ‘LOEWE Center for Insect Biotechnology and Bioresources’ at Justus Liebig University Giessen. His research focuses on insect resistance to plant toxins and sequestration of plant toxins by insects.
Modelling the heat flux: understanding convection in geophysics and astrophysics
Beneath our feet, liquid metal flows around in the earth’s outer core in a spiral-like manner, which generates the earth’s magnetic field. Researchers in the Troconvex project are using experiments and numerical simulations to gain new insights into heat and energy transfer in this kind of turbulent flow, as Professor Rudie Kunnen explains.

The earth’s outer
core is comprised of liquid, mainly iron, which flows in a highly turbulent manner. Researchers believe that the flow inside the earth’s outer core is organised in such a way that this liquid metal moves around in a spiral-like fashion, a topic of great interest to Rudie Kunnen, an Assistant Professor in Applied Physics at Eindhoven University of Technology. “This spiral-like flow then generates the earth’s magnetic field,” he outlines. As the Principal Investigator of the Troconvex project, Professor Kunnen aims to gain deeper insights into the nature of these kinds of turbulent flows, looking at the geostrophic regime. “Many different physical phenomena take part in a flow. The geostrophic regime is the part where the main force balance is between the Coriolis effect and the local pressure gradient, part of the Navier-Stokes equation, which expresses momentum conservation in fluids,” he says.
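Written out, the balance Professor Kunnen describes takes a standard textbook form (included here for orientation, not taken from the project’s own equations):

\[
2\,\boldsymbol{\Omega} \times \mathbf{u} \;=\; -\frac{1}{\rho}\,\nabla p,
\]

where \(\boldsymbol{\Omega}\) is the rotation vector, \(\mathbf{u}\) the fluid velocity, \(\rho\) the density and \(p\) the pressure. Buoyancy, viscosity and inertia still act, but in the geostrophic regime they enter only as smaller corrections to this leading-order balance.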
Geostrophic regime
A number of simulations have been developed to represent flow behaviour in the earth’s outer core, yet they have some significant limitations. The results generated so far do not match up with ground-based measurements of the earth’s magnetic field, which would suggest that current simulations are not big enough to show the entire picture. “We cannot fully resolve all the active turbulent scales, so we need to do some simplifications,” explains Professor Kunnen. By using a simpler system, researchers hope to build a fuller picture of some of the underlying processes involved. “With those simplified systems, we can see some hints of processes that we may not see in full earth simulations,” continues Professor Kunnen. “For example, we can see the formation of big helical flows, big vortices. These may then grow, and we can investigate them in depth.” The experimental setup that researchers are using in the project is relatively simple. Researchers are using a standard plastic cylinder, which is filled with water, then sealed off above and below with cooled and heated plates, respectively. “We can then apply thermal forcing. This whole arrangement
rotates around the vertical axis,” outlines Professor Kunnen. The plan is to take two kinds of measurements of flow behaviour in this cylinder. “The first is actually to look at the thermal properties of this kind of flow. We wrap the cylinder in insulation, to prevent any heat leaks and ensure that all of the heat put in from below will travel upwards through the water, to the top plate,” says Professor Kunnen. “This is an important property in terms of modelling large-scale systems, as it represents the amount of heat loss. At a given rotation rate, a specific amount of heat is lost.” A second measurement involves making part of the cylinder transparent, so that flow measurements can be taken. This can be done by adding tiny plastic particles to the water, which then passively follow the flow. “We use laser illumination to record the movement of these particles with cameras. From that, we can then look at what kinds of structures are developing in the flow,” says Professor Kunnen. This then directly reproduces the structures that may form inside large-scale natural flows; Professor Kunnen and his colleagues are breaking new ground in this area. “We are among the first to study these structures in the geostrophic regime. This is mainly because of the design of the experiment, as it’s pretty hard to get into this geostrophic regime. We are currently preparing the measurements,” he outlines.
Rayleigh number
The wider objective in this research is to investigate how much heat and energy is transferred in a flow. The Rayleigh number, a dimensionless number which essentially describes the strength of the thermal forcing, is a major consideration in this respect. “It’s related to temperature differences that occur. Typically we consider a system that’s heated from below – the fluid near the bottom plate will be heated, so it will become lighter and rise up. Then at the top the fluid is cooled, so it will become heavier and sink,” explains Professor Kunnen. The strength of this thermal forcing is denoted by the Rayleigh number; Professor Kunnen and his colleagues aim to probe
deeper in this area. “We hope to establish a relationship that, based on the value of the Rayleigh number and other parameters, enables us to predict the overall amount of heat that is transferred outwards,” he outlines. Researchers in the project are also using numerical simulations to complement the experimental work. While in the experimental setup researchers are essentially limited to investigating water, using numerical simulations opens up other possibilities. “We can look at different kinds of fluids in the simulations. We could put in a liquid metal, we could put in a gas, and see what happens,” says Professor Kunnen. Ideally, the results of the simulations will match up to those from the experiments, from which Professor Kunnen and his colleagues can then look to draw wider conclusions. “Our experiment will give us insights into certain trends. For example, we can predict the behaviour of the heat transfer as a function of this Rayleigh number,” he outlines. “We unfortunately can’t do experiments in the exact same conditions as in nature, but we can identify trends.” A further step could be to extrapolate from these trends to make predictions about natural flows, including not just the earth’s outer core, but also other geophysical and astrophysical flows. Rotation is needed in order to reach the geostrophic regime, something which Professor Kunnen says occurs in many large-scale flows. “For example, the liquid-metal core of the earth, but also the giant gaseous planets like Jupiter and Saturn, and also our own Sun to some degree. These systems are all very large and are all rotating – and based on estimates of the flow conditions, we expect them to be in the geostrophic regime,” he explains. The project’s work could also lead to new insights in atmospheric dynamics, although Professor Kunnen is cautious in this area. “Things like phase transitions from liquid to vapour, and the significant interaction of the shallow atmosphere with earth’s surface topography, make the atmosphere more complicated than those areas that we can directly contribute to,” he continues.
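For reference, the Rayleigh number for a layer of fluid of depth \(L\) heated from below has the standard definition (a textbook form, not the project’s specific notation):

\[
Ra \;=\; \frac{g\,\alpha\,\Delta T\,L^{3}}{\nu\,\kappa},
\]

where \(g\) is the gravitational acceleration, \(\alpha\) the thermal expansion coefficient of the fluid, \(\Delta T\) the temperature difference between the heated bottom plate and the cooled top plate, \(\nu\) the kinematic viscosity and \(\kappa\) the thermal diffusivity. The relationship the team is seeking would then express the measured heat transfer as a function of \(Ra\), the rotation rate and the fluid properties.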
TROCONVEX
Turbulent rotating convection to the extreme (TROCONVEX)
Project Objectives
The geostrophic regime of turbulent rotating convection is relevant for geo- and astrophysical flows. The flow behaviour in this regime displays significant and unexpected differences with the traditionally studied regime, making extrapolations impossible. We study heat transfer and flow structure with a revolutionary experiment and high-end numerical simulations.
Project Funding
TROCONVEX is funded by the European Research Council Starting Grant 2015, project no. 678634.
Contact Details
Project Coordinator, Professor Rudie Kunnen
Department of Applied Physics
P.O. Box 513
5600 MB Eindhoven, Netherlands
T: +31 40 247 3194
E: r.p.j.kunnen@tue.nl
W: https://www.tue.nl/en/research/researchers/rudie-kunnen/
Professor Rudie Kunnen
Rudie Kunnen is assistant professor at Eindhoven University of Technology (NL). His research focuses on the experimental investigation of turbulent flows, including rotating convection in the geophysical context in the ERC StG project TROCONVEX and the behavior of droplets in turbulence.
Temperature fluctuations from a simulation. Left: non-rotating convection displays very fine structures. Right: with rotation, the formation of a large vortex can be observed.
The Deutsche Forschungsgemeinschaft (DFG)
The DFG is a German institution whose investment, training and collaboration are driving new discoveries at an impressive pace. The institution has been the government’s scientific mouthpiece through both the high and low points of its history, and today it is respected as a world-leading voice in science and a source of pride in post-war Germany. By Richard Forsyth
In pursuit of scientific excellence
Forging ahead in science has never been a weak point for Germany, which is unafraid of investing in discovery, knowledge and innovation. The Deutsche Forschungsgemeinschaft, or DFG for short, is an association whose members include some of the truly great research organisations in Germany, such as the Max Planck Society, the Helmholtz Association and Fraunhofer, among many others. It boasts an annual budget of €3.2 billion, funded heavily by the German government, alongside EU funding and other donors. In turn, the DFG has been funding and supporting research for 80 years and is recognised on the international stage as a launchpad for ground-breaking scientific achievements. In 2017, the DFG funded approximately 32,500 research projects, a staggering investment in science and the advancement of knowledge. The institution also has the weighty task of acting as the link between scientific research, members of parliament and crucial decision-making organisations in Germany.
From a dark place to light
Its history is very much entwined with Germany’s, both the good and the bad. For there is a dark spot in the bright light that is this scientific institution, one that the DFG acknowledges and analyses to this day with typically scientific rigour, not hiding from a time when science was abused to validate extreme political views. Prior to and during the National Socialists’ rise to power in 1933, the institution, then known as Notgemeinschaft der
Wissenschaft (NG), aligned itself with Nazi research, including ethnographic research that would lead to policies linked with the Holocaust. Arguably, the NG did more than reinvent itself after the war as the DFG: it forged itself into a spearhead for world-leading science. Having such a past, the institution may be at an advantage in spotting dangerous times ahead. The President of the Deutsche Forschungsgemeinschaft, Prof. Dr. Peter Strohschneider, opened 2019 with an address on ‘the endangerment of social cohesion’, the shift from ‘solidarity to competition’, and how conditions including ‘the anarchy of the internet’ could promote authoritarian rather than democratic constitutions. The DFG thus has a unique perspective on the dangers of our present, learnt from the past. This matters because one of the most challenging and important duties of the DFG is to foster interest and engagement, and to encourage public involvement in political discourse on questions relating to science and research. Keeping science in public and governmental conversations is seen as of paramount importance, as is safeguarding the context of those interests. Today the DFG, whilst pioneering German-rooted science, has significant outreach and collaboration around the world, with a vested interest in recruiting young researchers from all around the globe. There is healthy international cooperation among researchers. This includes efforts to liaise with foreign embassies’ science departments to share information in exchange sessions for diplomats and international delegations.
The link between research and prosperity
The DFG has become a very powerful and recognised hub for scientific research. It runs nine strong funding programmes and celebrates 11 scientific prizes, including arguably the most important research funding award in Germany, the Gottfried Wilhelm Leibniz Prize. This prestigious prize has been awarded annually since 1986 to a maximum of 10 recipients, with each award worth €2.5 million. In 2019 Prof. Dr. Sami Haddadin received an award for his pioneering research in robotics, and Prof. Dr. Brenda Schulman for her work in biochemistry and structural biology on the molecular mechanisms of the ubiquitin system. The institution has, in fact, a diverse range of responsibilities. As well as managing prizes, advising the German government and helping train researchers, it also encourages collaboration between researchers and the private sector to ensure projects can mature into solutions for industrial and business sectors. To avoid the so-called ‘valley of death’, where research sometimes ends without the momentum to carry it into the world beyond academia, proactive efforts to connect research to industry are in effect. This way Germany will be fuelled by knowledge and innovation, and have ways to turn research into economic and industrial benefits, maintaining its highly competitive edge. The baton passed from research to industry is one loaded with power for economic strength. In the World Economic Forum’s Global Competitiveness Report in 2018, Germany topped the ranks as the most innovative economy, with a score of 87.5 out of 100 for innovation capability – one of the 12 key pillars used to gauge a country’s productivity.
It outperformed the US, and is tipped as a leading light in the unfolding Fourth Industrial Revolution. The report highlighted the ‘sheer number of ideas’ as what makes Germany an innovation powerhouse, and that surely owes something to the output of research institutions such as the DFG.
Investing in the future
At the heart of the DFG, though, is a commitment to pure science, and a steadfast dedication to investing in worthy research projects. Recently, the DFG announced it would establish 14 new collaborative research centres (CRCs) to support top-level research in German universities. These CRCs are to receive around €164 million in funding for an initial four-year period, in parallel to the Grants Committee approving an extension of 27 existing CRCs for an additional funding period. The new CRCs have ambitious aims that could have far-reaching impacts, especially in the healthcare sector: one project seeks to better understand how the gut and liver interact, another to form a deeper understanding of aortic diseases, and a third to investigate the skin as a sensor and effector orchestrating local and systemic immunity. Far from fringe science, these projects could have wide-reaching effects. CRCs receive funding for a maximum of 12 years, and from July 2019 the DFG is funding a total of 278 CRCs. The overall impact of these robust investments and ground-breaking research projects will ensure considerable advances in human knowledge.
The Cool Water Condition and Female Marriage Ages in 1800 CE
The Cool Water Condition and the Beginning of the Modern Fertility Drop
Putting Cool Water at the heart of industrial history
The Industrial Revolution began in the Protestant West, yet many of these countries had been technological backwaters for much of human history. Cool water conditions were an important factor behind the region’s pioneering role in industrial history, believes Professor Christian Welzel, the Principal Investigator of the Cool Water Effect project.

The countries of
the Protestant West were among the first to industrialise, developing new technologies and mechanised processes that revolutionised society and laid the foundations for the region’s economic prosperity. However, this prosperity was not pre-ordained, as for much of human history the countries of Northern Europe had been technological backwaters in comparison to other regions. “China was much more advanced than northwestern Europe before the Industrial Revolution, and within Europe, the South developed much earlier,” outlines Christian Welzel, Professor of Political Science at Leuphana University. As the Principal Investigator of the Cool Water Effect project, Professor Welzel is looking at the underlying reasons why countries in northwestern Europe industrialised first; emancipatory dynamics are a major factor in this. “We can look at the history of the West as a history of struggles for rights. As one group conquers a set of entitlements then the other groups who are still discriminated against take this as an example and feel encouraged to then step up for their rights,” he explains. A prime example is the struggle for the vote. Historically in the Protestant West the
franchise was limited to certain sections of the population, yet over time previously marginalised groups demanded the right to vote, which Professor Welzel believes was an important factor in the region’s economic development. “I believe there’s an intricate, reciprocal relationship between economic and political development,” he says. It is well established that there is a close correlation between the income level of a country and the strength of its democratic culture – with the exception of some resource-rich nations – yet there is still a degree of debate over the question of which came first and the precise nature of the relationship. “The standard interpretation is that in most cases income caused democracy more than the other way round. This relationship has been called into question however – some researchers argue that there is no causal link between the two and that there is another background factor that causes both to go together,” says Professor Welzel. “We think that cool water conditions were an important background factor.”

The Cool Water Effect
This means in particular cool summers and mildly cold winters combined with continuous rainfall over all seasons and permanently navigable waterways. These are all major factors in considering why it was the Protestant West that led the Industrial Revolution, starting with the nature of the climate. “If you are outdoors in a cold climate, you need to do something to keep yourself warm. Like working on a field, or building a house,” points out Professor Welzel.
The Timeless Condition and the Nuclear Family Configuration in 1800

Living patterns in areas with cold climates are also markedly different from those in warmer climates. “In colder climates we have a much stricter separation between private and public, because the walls that form our houses are a clear boundary. Inside is our private land, and outside is the public realm,” explains Professor Welzel. “Around the Mediterranean, people spend more time outdoors on squares and so on, so the public realm has more importance. That means that in these colder climates people become more individual in outlook, the focus is more on the nuclear family than the extended family. That also leads to a socialising pattern where kinship is not as important as in more southern climes.” The land-to-labour ratio in cool-water areas is another major point of interest to Professor Welzel. In cool-water areas it was possible to have a mixed approach to agriculture, where farmers not only grew crops, but also kept livestock. “This was land-intensive but not very labour-intensive. So you didn’t need an army of labourers to cultivate an acre of land in cool water conditions, which meant there was less need for cheap mass labour, including children,” explains Professor Welzel. This was one factor which reduced pressure on women to reproduce as early as possible; Professor Welzel says that another relates to the nature of the environment in cool water areas. “Child mortality was naturally lower in cool-water areas, as fresh water is usually not infested with microbes,” he outlines. “Also, some of the diseases that are prevalent in hotter areas do not occur in cool water areas. So that lowers child mortality – then you could afford lower fertility to maintain
the required workforce. If you have lower fertility pressure on women, that allows for later marriages with the consequence of more egalitarian gender relations.”

The Cool Water Condition and Female Marriage and Education in 2010

There are other significant differences between cool-water areas and warmer parts of the world. One important characteristic of areas with cold climates is that they have distinct seasons over the course of a year, so it’s important to prepare for periods when it’s not possible to grow crops. “That means that there is more emphasis on planning and food storage. There is also evidence that in colder climates people’s time orientation is different,” says Professor Welzel. As long-term planning is more important in these areas, people are more likely to prepare for the future, for example by developing new skills. “People think more about investments with delayed gratification – which is the case for instance with education, because you earn the reward after building up your skills. These are important factors,” continues Professor Welzel. “Another major consideration is access to water. When you have continuous rainfall throughout the seasons, everyone will have access to freshwater resources. No-one can constrain this.” This removed a vital means of control from elites. In more arid zones elites could take control over the water supply and so enforce control over the population in that way; however, this was not possible in rainy areas. “Water is a very diffuse, democratic resource, because it’s available for everyone,” stresses Professor Welzel. Relatively easy access to water meant people were more independent and less reliant on a central authority, while Professor Welzel
says the existence of permanently navigable waterways in the Protestant West also helped spur economic development and facilitate trade. “Western Europe is unique in terms of this capillary system of smaller rivers. Rather than having one central stream that dominates the continent, we have a system of smaller rivers connected to each other that creates a web of exchange,” he explains. “That means people could leave an area where a tyrant ruled the territory and settle somewhere else. People could also join forces more easily and pool their resources for private purposes.” The wider point here is that an autocratic, hierarchical and controlling centralised state was unlikely to emerge under cool water conditions, leaving the space for individual initiative and enterprise to flourish, both of which were essential to the Industrial Revolution. The structure of society in the Protestant West also militated against the emergence of an autocratic or despotic state. “Marriage was strictly consensual for example. The consent of the woman was needed to marry, it was not pre-arranged,” says Professor Welzel. The family is the fundamental social unit, from which bigger organisations then evolve, so the principle that alliances are formed by consent then filtered throughout wider society. “This consensual principle transplanted itself into bigger institutions that started building up, such as state bureaucracies, big companies, local assemblies, and city administrations,” explains Professor Welzel. “These institutions all worked on this consensual principle. This meant there was a contractual orientation early on.”
Contractual orientation
This also acted as protection against authoritarianism as the institutions of the state began to develop. A monetised economy is a key step towards building a nation state, as then taxes can be raised to build a bureaucracy, establish a civil service and raise an army. “In Europe, the barter economy lasted well into the 13th century. When money replaced the barter economy, rulers could start to think about building a state, a bureaucracy and an army,” says Professor Welzel. Raising taxes was essential to this, and city dwellers were a prime target; Professor Welzel believes that strong social organisations protected people against over-ambitious rulers. “There were local assemblies, city assemblies and associations that were built on a voluntary basis, and they were practised in self-organisation. They knew how to organise resistance against over-ambitious rulers,” he explains. “In Europe, the authorities were only able to build a state in return for representation – the principle was no taxation without representation.”
The same principle lies at the root of moves in the Protestant West towards universal education. A state is built on the consent of the people, so in order to maintain and strengthen their power base, rulers had to unlock the potential of the whole population, and not just narrow sections of it. “In order to gain power, rulers had to appeal to society. Rulers began to recognise that they had to unlock the intellectual potential of the population, and therefore education was essential. When nation states started to introduce universal schooling, people got an education, became literate, and started to think for themselves,” outlines Professor Welzel. Around this time women were also marrying later, as a result of lower fertility pressure, and having fewer children, which meant that they could pay more attention to their offspring. “If you don’t have a lot of offspring then you have more room and opportunity to invest in those children and their education and skills,” points out Professor Welzel.
The Cool Water Condition and Gender Equality today
Historical road-map
Researchers now aim to bring together these ideas and develop a kind of historical road-map. There are a lot of loose ends in the literature, which Professor Welzel believes are connected by the cool water idea. “Our achievement is to create a synthesis, an integration of all those loose ends, which gives us a more comprehensive and coherent understanding of what happened,” he says. A key part of the project’s work involves collecting supporting data from across the globe. “We are collecting data on households and families in pre-industrial societies for example. We’ve found evidence on households and families in North America with respect to the leaning towards nuclear families in cool water areas,” continues Professor Welzel. “However, we don’t want to be misunderstood as people who rigidly believe in geo-climatic determination. In one part of the project – that we’re still working on – we’ve shown that the explanatory power of this cool-water condition has loosened over the past 20-25 years, in parallel with rising globalization.”
The Declining Impact of Environmental Advantages on Economic Growth
The cool-water areas of the Protestant West still score well today in important developmental outcomes, such as per capita GDP, life expectancy, gender equality and levels of corruption. However, the explanatory power that the cool water condition has over these and other developmental outcomes has declined over the past 20-25 years. “It’s a fairly steady process that we can map and trace,” says Professor Welzel. This shows that societies can escape the determinative power of geography in today’s globalised world, where information flows more easily than ever before, believes Professor Welzel. “Cross-cultural learning is much easier under globalising conditions than it was in earlier periods of history,” he points out. “Societies can learn from each other, and so we get a degree of policy diffusion.” This research could help inform the wider process of policy diffusion, something Professor Welzel is keen to explore in the future. In particular, the project’s work holds important implications for water policy, and how water can be made universally accessible. “It also has implications for how we incentivise lower fertility among women and later marriage. Gender equality is another very important consideration, how do we encourage that?” says Professor Welzel.
THE COOL WATER EFFECT
The Cool Water Effect: Why Human Civilization Turned Towards Emancipation in Cold-Wet Regions
Project Objectives
This project examines the deep causes of this civilizational turn, analyzing the role of geography, genes, disease, agriculture, language, religion, statehood, colonialism, law traditions and other institutional factors, such as emerging democracy. The evidence shows that, among multiple possible paths towards human emancipation today, there is only one narrow route of significance. The very narrowness of this route explains why it took civilization so long to reach towards human emancipation.
Project Funding
German Science Foundation (DFG), €1 million
Contact Details
Principal Investigator, Professor Christian Welzel
Chair in Political Culture Research
Center for the Study of Democracy, Leuphana University
T: +49 1512 5240255
E: cwelzel@gmail.com
W: https://www.leuphana.de/en/university/staff-members/christian-welzel/cool-watereffect.html

Welzel, C. (2018). “Theories of Democratization.” In Haerpfer, C., P. Bernhagen, R. Inglehart & C. Welzel (eds.), Democratization. 2nd (fully revised) edition. Oxford: Oxford University Press, pp. 21-39.
Professor Christian Welzel, Principal Investigator
Welzel, C. (2014). “Evolution, Empowerment and Emancipation: How Societies Climb the Freedom Ladder.” World Development 64: 33-51.
Welzel, C. (2013). Freedom Rising: Human Empowerment and the Quest for Emancipation. New York: Cambridge University Press (chapter 11).
Professor Christian Welzel
Christian Welzel is the Political Culture Research Professor at Leuphana University in Lueneburg, Germany. He is also President (emer.) and Vice-President of the World Values Survey Association and Chief Foreign Director of the Laboratory for Comparative Social Research (LCSR) at the National Research University-Higher School of Economics in St. Petersburg and Moscow, Russia.
The Cool Water Project Team (from left to right): Lennart Brunkert, MA, Doctoral Researcher, Phuong Pham, BA, Research Assistant, Dr. Stefan Kruse, Postdoctoral Researcher, Le Cam Nhung, BA, Research Assistant.
Illustration of the routing of traffic through an attacker’s network in Belarus in 2013, due to an attack on the Border Gateway Protocol (BGP).
Strengthening the borders of the internet
Much of the core infrastructure underlying the Internet was designed decades ago, at a time when security was not always the foremost consideration. The Internet has since evolved significantly to play a central role in the economy, national security, and social interactions, yet its communication infrastructure remains alarmingly vulnerable to attack, as Professor Michael Schapira explains.

The basic mechanisms
that enable us to send data via the Internet from one point to another were devised decades ago. This was a very different era; the Internet was much smaller, and it was largely used for communication between trusted parties, yet it has since grown and evolved significantly. “People didn’t necessarily foresee today’s world, where tens of thousands of organizational networks owned by self-interested, often competing, entities are stitched together to make up the Internet,” says Professor Michael Schapira, the Principal Investigator of the SIREN Project. While security was not always the foremost consideration in the initial design of the Internet’s core infrastructure, it is a primary concern today, as Internet outages have a significant social and economic impact. Yet addressing this is a complex task in the current context, given how central the Internet is to everyday life. “Any solution to the Internet’s security vulnerabilities has to be backwards compatible with existing technologies,” says Professor Schapira.
SIREN project
This is a topic at the heart of the SIREN project, an ERC-backed initiative hosted by the Hebrew University of Jerusalem, which is exploring a new approach to securing the routing of Internet traffic. Currently, Internet traffic typically goes through several organizational networks along the way before it is eventually directed to the intended
destination. “There is a protocol that specifies how traffic should be forwarded towards its destinations,” outlines Professor Schapira. This is the Border Gateway Protocol (BGP), which was designed around 30 years ago, along with the rest of the Internet’s core communication infrastructure. “In a way, the BGP is the most important part of the Internet’s infrastructure, because it’s the glue that holds the organizational networks comprising the Internet together,” explains Professor Schapira. “While it’s proved effective, it’s also a security vulnerability.” The reason for BGP’s insecurity is rooted in the way the protocol itself works. A destination of traffic advertises to the world the Internet addresses that belong to it, and then this information is propagated onwards by the neighbouring parties. “It’s really about trust. An organization advertises its Internet addresses, and other parties trust it to do so correctly,” explains Professor Schapira. The routing system can be subverted, however, and it is not uncommon for major organizations to be disconnected from the Internet or attacked in some other way. “An entity can advertise Internet addresses that don’t belong to it – there are many reasons why they might want to do something like this. For example, it might want to prevent an organization from communicating with the world, or it may want to monitor traffic or eavesdrop on it,” says Professor Schapira. “There is currently no widely deployed mechanism to prevent an organization from advertising incorrect
Internet addresses, which represents a significant point of vulnerability.” A lot of energy has been focused on using heavyweight machinery to achieve very high levels of security and combat these types of attacks, yet it has proved difficult to develop an easily deployable solution. Professor Schapira and his colleagues in the project are exploring a different approach, developing a flat, decentralised design to achieve a very high level of security, while at the same time taking the wider economic context into account and considering what will incentivize organizations to change. “The cost of transition should be low, and it should also lead to some tangible benefits,” he says. It’s not practical to replace BGP entirely, so Professor Schapira and his colleagues in the project are rather looking to effectively jump-start security. “We do this through a combination of techniques, building on two high-level ideas. The first is about setting a very high bar for the attacker,” he says. One option is establishing ownership of Internet addresses by sending data packets to these addresses from many locations in the world, and requesting a response from the owner organization to each and every one. “This is essentially a way of checking de facto ownership of these Internet addresses,” says Professor Schapira. “It’s like calling you from many different locations in the world and expecting you to pick up the phone.” This is actually a very high bar for an attacker to deal with, as to prove ownership
of IP addresses which it doesn’t actually own, it would have to be able to respond rapidly to queries from all over the world. In order to achieve this, it would have to essentially intercept traffic from everywhere in the world leading to the intended addresses. “There are only two ways in which this can be done. One is if the attacker is already located between an organization and the rest of the world. Maybe an organization has a single Internet service provider (ISP), which is the interface between it and the rest of the world, and can passively monitor all of its traffic. Then, if something is sent to the organization, it can be intercepted,” outlines Professor Schapira. This is not the major concern, as it’s unusual for organizations to be attacked by their own ISPs. “The other kind of attack is when someone tries actively to hijack traffic,” says Professor Schapira. However, this makes the attack highly visible and the victim can be alerted.

The second high-level idea is not to set the bar too high in terms of desired security guarantees at the expense of non-trivial and error-prone mechanisms that are unlikely to be adopted. This requires characterizing the most dangerous and common attacks and identifying the minimum effort needed to protect against them, instead of trying to defend against all attacks by leveraging heavyweight machinery whose adoption faces significant obstacles. Professor Schapira quotes Sir Robert Alexander Watson-Watt, the British pioneer of radar technology, who said: “Give them the third best to go on with; the second best comes too late, the best never comes.”

Professor Schapira is also exploring whether the techniques developed during the course of the project are applicable to other contexts. One example is the Network Time Protocol (NTP), which deals with clock synchronisation, something crucial to the security of many Internet applications. “For example, for financial applications, it’s important to know that something was done before something else,” points out Professor Schapira. However, the NTP is also vulnerable to attacks. “If I can push your time settings five seconds forward or back, then that has consequences. It’s fairly easy to launch such attacks,” says Professor Schapira. “We’ve been working to develop a solution called Chronos that provides security and can be installed on your device – say your smartphone or laptop – without having to change other elements of the Internet infrastructure.”
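The intuition behind such client-side hardening can be illustrated with a toy sketch: query many time servers and let the majority outvote a few manipulated responses. This is a simplified illustration of the general idea only, not the actual Chronos algorithm; it assumes the third-party Python package ntplib, and the server list is purely illustrative.

```python
# Toy illustration of majority-based NTP hardening (NOT the actual
# Chronos algorithm): query several servers, discard the most extreme
# offsets, and only accept the result if the survivors agree.
import statistics
import ntplib  # third-party package: pip install ntplib

SERVERS = ["0.pool.ntp.org", "1.pool.ntp.org",
           "2.pool.ntp.org", "3.pool.ntp.org"]  # illustrative list

def robust_offset(servers=SERVERS, trim=1, max_spread=0.1):
    """Return a clock offset (seconds) that a few bad servers cannot skew."""
    client = ntplib.NTPClient()
    offsets = []
    for host in servers:
        try:
            offsets.append(client.request(host, version=3, timeout=2).offset)
        except (ntplib.NTPException, OSError):
            continue  # unreachable server: just skip it
    if len(offsets) < 3:
        raise RuntimeError("too few responses for a majority decision")
    offsets.sort()
    core = offsets[trim:-trim]  # drop the most extreme responses
    if max(core) - min(core) > max_spread:
        raise RuntimeError("surviving servers disagree; refusing to sync")
    return statistics.median(core)
```

The design choice mirrors the principle described above: rather than cryptographically authenticating every server, the client simply makes it expensive for an attacker, who would need to control or intercept most of the sampled servers to shift the clock unnoticed.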
Internet Engineering Task Force
This work holds clear importance for governments and other major organizations, who are keen to minimise the risk of attacks and improve the security of the Internet. Professor Schapira and his team are working with governance organizations with the aim not just of developing deployable solutions, but also influencing the way practitioners think about Internet security. “For this reason we’re actively working with cyber bureaus in Israel and abroad,” stresses Professor Schapira. Researchers in the project are also actively engaging with the Internet Engineering Task Force (IETF), the entity responsible for standardising Internet protocols. “There are working groups in the IETF that are tasked with standardising such solutions. That’s another arena where we can influence the debate,” says Professor Schapira.
SIREN
Securing Internet Routing from the Ground Up
Project Objectives
The aim of the planned research project is to put forth and explore a radically new paradigm for securing routing on the Internet. The proposed alternative roadmap for securing the Internet consists of two steps:
• Jumpstarting BGP security: A novel approach to routing security that bypasses existing obstacles. Specifically, the proposed design will be flat, decentralized, fully automated, avoid dependency on a single root-of-trust, and not require the modification/replacement of legacy BGP routers.
• A long-term vision for Internet routing: Leveraging the vast computational resources in modern datacenters, and research on Secure Multi-Party Computation, to outsource routing to a small number of entities while retaining flexibility, autonomy and privacy.
The belief is that, taken together, these two steps can lead to a more secure Internet in the short run, and outline a promising, as yet uncharted, new direction for the future of Internet routing.
Project Funding
European Research Council - Starting Grant
Contact Details
Project Coordinator, Professor Michael Schapira
School of Computer Science and Engineering, Hebrew University of Jerusalem
E: schapiram@cs.huji.ac.il
W: http://www.cs.huji.ac.il/~schapiram/
W: https://cordis.europa.eu/project/rcn/200038/factsheet/en
Professor Michael Schapira
Michael Schapira is an Associate Professor in the School of Computer Science and Engineering at the Hebrew University of Jerusalem. Previously he was a visiting scientist at Google in New York, and also held postdoctoral research positions at Princeton University, UC Berkeley, and Yale University.
Man-in-the-middle attacker: an NTP client synchronises time by communicating with a set of NTP servers.
How do galaxies form?
Scientists have long looked beyond our own planet to learn more about the origins of the Universe, and the next generation of telescopes will allow scientists to look back even further into cosmic history. Researchers at LMU in Munich are developing a new approach to modelling the formation and evolution of galaxies, as Dr Benjamin Moster explains.

The development of new facilities like the James Webb Space Telescope (JWST) and the Extremely Large Telescope (ELT) opens up exciting new possibilities in astrophysics research, enabling scientists to probe deeper into galaxy formation at even earlier periods in cosmic history. While observations from these telescopes provide a snapshot of galaxies and their properties at a given point in time, it is necessary to build models to investigate their formation and evolution, a topic at the heart of Dr Benjamin Moster’s research. “The aim in our research is to understand how galaxies form,” he outlines. Telescope observations are an important part of this, as they provide a picture of galaxies at a particular point in time; the next step then is to investigate how their properties evolved. “We want to try and understand how these observations can be explained with models. We try to build models that connect these observations to the underlying dark matter and explain what kind of physics happen in those galaxies,” explains Dr Moster.
A number of different approaches are available for modelling galaxy formation, the most commonly used of which is hydrodynamical simulations. With this approach, researchers take the Universe at a very early time in cosmic history, place some initial conditions on it - derived for example from the cosmic microwave background - and then simulate the physical processes, such as gravity, gas dynamics and cooling, and star formation. “Eventually large gravitationally bound structures will form, so-called dark matter haloes – gas will then collapse in these dark matter haloes and form stars,” says Dr Moster. This approach produces cosmological volumes with galaxies that closely resemble observed galaxies, yet it can be difficult to resolve the physical processes in these hydrodynamical simulations, while Dr Moster says there are also other challenges. “You need to run a simulation for several months on these big supercomputers, and a lot of computing time is required. At the end you then get only a single simulation adopting
one chosen model that you can study,” he outlines.
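To give a sense of what “simulating gravity” involves, here is a minimal, self-contained sketch of a direct-summation N-body integrator in Python with numpy. This is an illustration of the general technique only, not the project’s code; production cosmological codes use far more sophisticated solvers, but even this bare version shows why costs grow quickly, as direct summation alone is O(N²) per time step.

```python
import numpy as np

G = 1.0  # gravitational constant in code units

def accelerations(pos, mass, soft=0.01):
    """Pairwise gravitational acceleration with force softening."""
    diff = pos[None, :, :] - pos[:, None, :]          # (N, N, 3) separations
    dist2 = (diff**2).sum(-1) + soft**2               # softened squared distances
    np.fill_diagonal(dist2, np.inf)                   # no self-force
    return G * (mass[None, :, None] * diff / dist2[..., None]**1.5).sum(1)

def leapfrog(pos, vel, mass, dt, steps):
    """Standard kick-drift-kick time integration."""
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc          # half kick
        pos += dt * vel                # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc          # half kick
    return pos, vel

# Example: 100 random particles evolved for 1000 steps
rng = np.random.default_rng(0)
pos, vel = rng.normal(size=(100, 3)), np.zeros((100, 3))
pos, vel = leapfrog(pos, vel, rng.random(100), dt=1e-3, steps=1000)
```

Hydrodynamical codes add gas dynamics, cooling and star formation on top of this gravitational core, which is where most of the supercomputer time goes; gravity-only runs of the kind described next can skip all of that.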
Empirical galaxy formation models
As Emmy Noether Group Leader at LMU in Munich, Dr Moster is now developing a different approach to modelling the formation and evolution of individual galaxies, called empirical galaxy formation models. Rather than trying to model the physics, the aim here is to introduce empirical relations between observed galaxy properties and simulated halo properties. “Unlike the gas physics, we know exactly how gravity works. Since dark matter only interacts gravitationally, we can run gravity-only simulations, which always have the same result,” says Dr Moster. These gravity-only simulations can typically run much faster, because it’s not necessary to calculate any gas physics. “If you don’t have any gas in there then you don’t need to calculate any star formation or feedback from supernovae or black holes, you just focus on gravity,” points out Dr Moster. “What we
find in these models is that the Universe still evolves very similarly on large scales – matter again clumps in dark matter haloes.”

The large-scale structure of the Universe at early cosmic times (left) and today (right). The top panels show the distribution of dark matter, while the bottom panels show where galaxies are located. The side length of each map is 1.5 billion light years.

Hydrodynamic simulation of a dwarf disc galaxy as seen from above. The gas cools and forms molecular clouds where stars are born (red regions). When massive stars die, they explode as supernovae, leading to large bubbles in the gas (dark regions).

The next step is then to introduce empirical relations between the properties of these dark matter haloes and the observed properties of galaxies, for example between the mass of a halo and the mass of the galaxy at its centre. These relations are then adjusted until the model reproduces statistical observations, which could relate to how galaxies of a specific mass cluster with each other for example, or how many galaxies form stars in comparison to how many are passive and don’t form stars. “Our models are tuned to reproduce all these observations as accurately as possible,” says Dr Moster. A model can then be run for each dark matter halo and compared with galaxy observations, from which Dr Moster and his colleagues can then gain important insights. “So, which kind of model with which set of parameters best reproduces the observations?” he continues. “Over time we’ve experimented with different parameters and models, so we have a degree of knowledge on the levels of uncertainty, and the uncertainty of the predictions that can be made with them.”

This provides the foundations on which Dr Moster and his colleagues can then look to learn more about galaxy formation. With improved models, researchers can look at the
evolution of these dark matter haloes over time and address some unresolved questions in astrophysics. “For example, when do galaxies grow? In one of our papers we’ve also looked at how much mass forms in the galaxy itself, and how much comes in through satellite galaxies that merge with that galaxy,” outlines Dr Moster. Researchers have found that around 50 percent of the mass of large galaxies is formed in the galaxy itself, and the other 50 percent comes in through accreting or merging satellites; our own
galaxy is however not on this sort of scale. “The Milky Way is a fairly normal galaxy, with an average mass – only a few percent of its mass comes in through satellites,” says Dr Moster. “Another topic we are investigating is red galaxies. These are galaxies that have stopped forming stars.” The evidence suggests that these red galaxies are typically a bit more massive than blue galaxies in the same halo, now researchers are trying to understand how their properties evolved. This research holds important implications for the operation of the next generation of observational facilities such as JWST, a space telescope which is set to be launched at some point in 2021. “We can help in survey planning, so that the observers could have some initial idea of what they’re going to observe, and if it makes sense for them to look for specific galaxies,” explains Dr Moster. While ideally the observations and the model would match up, identifying any points where they differ gives Dr Moster and his colleagues valuable insights into the physics of galaxy formation and how the model could be improved. “To some extent it’s an iterative process. So as theorists we may have some good ideas of how to improve the models, but then we get interesting observational data, and we can also improve the model through that,” he says.
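The kind of empirical halo-galaxy relation described above is often written as a double power law linking stellar mass to halo mass, as in abundance-matching studies of the type Dr Moster has published. A minimal sketch follows; the functional form is the standard one from that literature, but the parameter values here are illustrative round numbers, not the project’s fitted values.

```python
import numpy as np

def stellar_mass(halo_mass, M1=10**11.6, N=0.035, beta=1.4, gamma=0.6):
    """Double power-law stellar-to-halo mass relation.

    halo_mass: halo mass in solar masses. The ratio of stellar to halo
    mass peaks near the characteristic mass M1 and falls off on both
    sides. Parameter values are illustrative, not fitted.
    """
    x = np.asarray(halo_mass) / M1
    return halo_mass * 2.0 * N / (x**-beta + x**gamma)

# Example: a Milky Way-sized halo of ~1e12 solar masses
print(f"{stellar_mass(1e12):.3e}")  # a few times 1e10 solar masses
```

Tuning parameters like M1, beta and gamma until simulated haloes reproduce the observed galaxy statistics is, in essence, the adjustment process the article describes.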
FROM DARK TO LIGHT
The connection between galaxies and their dark matter haloes through cosmic time
The Galaxies and Dark Matter research group at the University Observatory at the LMU Munich. Left-to-right: Benjamin Moster (group leader), Joseph O’Leary (PhD Student), Ulrich Steinwandel (PhD Student), Abhishek Malik (Master Student), Aura Obreja (Postdoc), Jannik Bach (Master Student), Alina Stephan (Bachelor Student), Jelena Petereit (Bachelor Student)
Project Objectives
The aim of our research is to study how galaxies form within dark matter haloes, and to better understand the various physical processes that shape their properties. For this we use empirical models, which relate the observed galaxy populations to the underlying dark matter distribution in a statistical manner that is as independent as possible of any model assumptions.
Project Funding
The Dark to Light project is funded through the Emmy Noether Programme of the Deutsche Forschungsgemeinschaft (DFG). The research is hosted at the University Observatory of the LMU Munich.
Project Partners
• Thorsten Naab
• Simon White
Max-Planck-Institute for Astrophysics, Garching bei München, Germany
Contact Details
Project Coordinator, Dr. Benjamin Moster
University Observatory Munich
Ludwig-Maximilians-University of Munich
Scheinerstraße 1
81679 Munich, Germany
T: +49-89-2180-9284
E: moster@usm.lmu.de
W: www.usm.lmu.de/people/moster
W: www.usm.lmu.de/GDM
https://ui.adsabs.harvard.edu/abs/2018MNRAS.477.1822M/abstract
Dr. Benjamin Moster
Dr. Benjamin Moster is an Emmy Noether Group leader at Ludwig Maximilians University (LMU) in Munich, where he conducts research into galaxy formation. He is also a guest researcher at the Max-Planck-Institute for Astrophysics in Garching. Previously he was a Kavli Fellow at the Institute of Astronomy at Cambridge University in the UK, while he has also held research positions in Germany and the US.
Astrophysics questions
A further important aspect of the project’s research is the possibility of comparing these empirical models with much more complex hydrodynamical simulations of a single galaxy. This enables researchers to assess whether the mass of a simulated galaxy is in the right range, for example, or if the star formation rate is too high or low. “It’s possible to essentially zoom in on a single galaxy at very high resolution in these hydrodynamical simulations. Then you can make a direct comparison,” explains Dr Moster. The aim here is to simulate a galaxy at very high resolution, then compare it to empirical predictions. “The empirical model effectively gives you observational constraints. We can then compare those to an individual galaxy, and see if the physics that we’ve simulated leads to the right result, or if it leads to something different,” continues Dr Moster. “If it’s the latter, then we should think about including more physical processes in the simulation that weren’t included previously.” This could help researchers understand the physical processes responsible for the galaxy properties shown in the empirical models, and from that learn more about the physics that drives the formation of those galaxies. This could mean looking at whether supernovae explosions play a major role in galaxy formation, or investigating the
importance of black holes. “We are looking at how black holes affect galaxy growth,” says Dr Moster.
Another major avenue of investigation in Dr Moster’s group is whether a change in the background cosmology away from that described in the lambda cold dark matter (ΛCDM) model would affect galaxy properties. “We ran these dark matter simulations not with cold dark matter, but with warm dark matter,” he outlines. “Warmer particles travel faster, which means it’s harder for them to form small structures, so small blocks of dark matter can’t really form. If we take this as the basis for the empirical model then we can ask – how do galaxy properties change?”
The development of ever-more sophisticated telescopes will allow scientists to probe deeper into these kinds of questions. For example the Euclid mission, which is set for launch in 2022, will accurately measure the positions of galaxies in the early Universe, from which Dr Moster and his colleagues could learn more about the nature of dark matter. “If we find that galaxies cluster somewhat differently in the warm dark matter simulation than in the CDM simulation, then we have a prediction, and can say: ‘ok, with this next generation telescope you could observe it and try to nail down whether dark matter is cold or warm’,” he says.
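Schematically, the effect of warm dark matter can be folded into such a model by suppressing the abundance of haloes below a “half-mode” mass scale set by the warm particles’ free streaming. The sketch below uses one fitting form of the kind quoted in the simulation literature; the functional form, exponent and half-mode mass here are illustrative assumptions, not the project’s own calibration.

```python
def wdm_suppression(M_halo, M_halfmode=1e9, alpha=1.16):
    """Illustrative ratio of warm- to cold-dark-matter halo abundance,
    n_WDM / n_CDM = (1 + M_hm / M)^-alpha, a fitting form of the kind
    used in the simulation literature (e.g. Schneider et al. 2012).
    M_halfmode depends on the assumed warm particle mass; both values
    here are placeholders, not the project's calibration."""
    return (1.0 + M_halfmode / M_halo) ** -alpha

# Low-mass haloes are strongly suppressed, massive ones barely affected -
# which is why galaxy clustering on small scales can, in principle,
# discriminate between cold and warm dark matter.
for M in (1e8, 1e9, 1e10, 1e12):
    print(f"M = {M:.0e} Msun: n_WDM/n_CDM = {wdm_suppression(M):.3f}")
```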
Hydrodynamic simulation of a dwarf disc galaxy as seen from its centre.
A global view on the histories of philosophy Since the 2018 World Congress of Philosophy it has become more evident than ever that the field of philosophy has transformed itself globally, yet this is not currently reflected within its historical framework. We spoke to Professor Rolf Elberfeld about his work in developing a global framework for the history of philosophy, which will help pave the way for doing philosophy in a global perspective in the future. While many nations
and cultures across the world have their own traditions of thinking, philosophy came to be conventionalised as a purely European project by the late 18th century. Now Professor Rolf Elberfeld and his colleagues in a new DFG-funded project aim to develop a more inclusive picture. “We aim to find a new framework for the history of philosophy, which includes different traditions,” he outlines. While European philosophers have made important contributions to the field, Professor Elberfeld believes it’s important not to neglect traditions from other parts of the world in the wider historiography. “One of the main aims of the project is to put the history of philosophy in global perspective,” he says.
Global perspective
This means moving away from the Eurocentric viewpoint that emerged around the middle of the 18th century. Johann Jakob Brucker’s Critical History of Philosophy,
written around 1730, documented the history of Persian, Egyptian and Chinese philosophy among several other traditions, yet the field later narrowed, partly in response to the ideas of Immanuel Kant. “Kant said: ‘within the history of philosophy, only one philosophy can be true, and represent the truth’,” outlines Professor Elberfeld. Other philosophers subsequently made important contributions to the history of the field, notably Georg W.F. Hegel. “In some respects, Hegel pluralised philosophy. He said: ‘there’s not just one true philosophy, but there was a development within the history of philosophy, which started with the Chinese’,” explains Professor Elberfeld. “But then he created a line of philosophical development in which European philosophy is the final and highest end of the history of thinking.” Against this linear process, the project wants to stress that the history of philosophy is inherently multi-polar and inter-connected. One example among many is the Arabic and Jewish traditions, which are integral parts of European philosophy. However, some scholars have taken a European-centred viewpoint, notably the 19th century philosopher Albert Schwegler, who argued that from classical antiquity onwards philosophy existed only in Europe. “Schwegler had a very narrow view on philosophy. He defined philosophy in connection to science,” says Professor Elberfeld. A key part of the project’s agenda centres around re-analysing the exclusion mechanisms through which philosophy was conventionalised as a purely European project, work which involves considering the nature of the subject itself. “It depends on the definition, on what is considered to be part of philosophy,” continues Professor Elberfeld. This is a complex question, and perceptions of what the subject involves have evolved over time. During his lifetime Isaac Newton was considered to be a natural philosopher for example, yet he is now thought of primarily as
哲学の庭, 哲学堂公園 “Garden of Philosophy” and “Hall of Philosophy Park” in Tokyo - © panoramio Author Carbonium. (original image vertically extended)
Christian Kortholt (1632–1694), Treatise on the origin and development of the ancient barbaric philosophy of the Chaldeans, Persians, Egyptians, Indians and Gauls, Jena 1660.
a physicist. Moreover, the importance of religion and theology within the field has been the subject of debate. “If philosophy is narrowly defined as a science, then topics related to religion will not belong to it,” says Professor Elberfeld. For his part, Professor Elberfeld aims to open up the debate by developing a global perspective on the history of philosophy. “We have a pluralised picture of the history of philosophy. We see many different traditions in China, Japan, India, Africa, Latin America, and other parts of the world,” he outlines. “Since the 2018 World Congress of Philosophy in Beijing it has become evident that Philosophy is already a global project which is taught at universities all over the world.” Researchers in the project now aim to collect materials from histories of philosophy written in several different languages to try and build a fuller picture. Language is an important window into a nation’s culture and its philosophical traditions. “Within a language, your way of life and the structure of
the world will be culturally biased,” Professor Elberfeld points out. While it’s clearly not possible to cover every global language, researchers are looking at materials and texts across a diverse range. “We’re looking at different histories of philosophy, written in Chinese, Japanese, Arabic, Latin, German, English, and other languages. What kinds of histories of philosophy do we have already in these languages?” asks Professor Elberfeld. “We will analyse the material, and look at what is excluded from these histories and what is included. In the first year of the project we want to establish an overview of the different materials.” A detailed analysis of these materials can reveal a great deal about what is considered to be important in certain philosophical traditions and what is not. Professor Elberfeld points to a very interesting example from Japan. “Some Japanese scholars wrote a history of the western tradition of philosophy, starting with Thales of Miletus – so pre-Socratic philosophy – and ending with Derrida. They did not include modern Japanese philosophy, even though it can be considered to some degree as an offshoot of European philosophy. Modern Japanese philosophers have built on those foundations to create new philosophy,” he explains. Chinese historians of philosophy by contrast typically take a different approach. “Chinese philosophers include their whole tradition within their history of philosophy. So philosophers from different countries take different things into account when writing a history of philosophy,” continues Professor Elberfeld.
Inter-cultural dialogue
A global perspective on the history of philosophy could support deeper dialogue between these different traditions and enrich cultural understanding. Dialogue on aesthetics could open up new perspectives on art for example. “If you look closely at paintings in Chinese art galleries, sometimes they relate to their own tradition of painting and the arts. That makes the paintings different,” says Professor Elberfeld. These works cannot be fully understood purely on the basis of European aesthetics. This underlines the importance of inter-cultural dialogue, which can also lead to new insights into more familiar artworks, Professor Elberfeld explains: “For example, we are starting to look at European artworks with aesthetic theories from China. It is quite fruitful to look at these artworks on the basis of Chinese aesthetic theories. It opens up new perspectives, and new ways of paying attention.” An aesthetic theory by nature makes distinctions, and those distinctions will draw the viewer’s attention to specific aspects of the painting. One very fundamental distinction is between nature and culture, and the nature of this distinction varies across different philosophical traditions. “In the European tradition, we consider everything
The Toledo School of Translators was the group of scholars who worked together in the city of Toledo during the 12th and 13th centuries to translate many philosophical and scientific works – previously translated from Greek into Arabic, or written originally in Arabic – from Classical Arabic into Latin.
that is made by human beings as culture,” outlines Professor Elberfeld. There is however a growing trend in urban architecture to design buildings to fit in with the landscape, which to a degree reflects the influence of Japanese aesthetics and ideas. “Many Japanese architects create buildings which essentially meld architecture with nature. For example, the Church of Light and the Church on the Water in Japan, which were both designed by a famous Japanese architect called Tadao Ando,” says Professor Elberfeld. A visitor to the Church on the Water is able to look out of the building over the adjacent pond directly into the natural world, so from this viewpoint it is nature itself that is holy. This represents a different perspective on religion to that commonly held in Europe, which reinforces the importance of intercultural dialogue. “We can only be aware of
these different ideas if we have a broad view of the history of philosophy,” points out Professor Elberfeld. Having spent part of his career abroad, Professor Elberfeld is well placed to develop this broader picture. “The first point in the project is to collect all the materials, in different languages and from different histories of philosophy,” he says. “We’ll also look at how philosophy is studied in different countries and universities. Students in Taiwan start with a full semester of philosophy on a general level for example, then they have to decide between studying the Western tradition or the Chinese tradition.” The hope is that the project’s work will encourage philosophy departments to extend their teaching programmes to different traditions and ideas. This does not mean neglecting a nation’s own philosophical traditions, but rather placing them within a wider context. “A global perspective means that you look from your own tradition into the plurality of different traditions, on a global scale,” explains Professor Elberfeld. This includes philosophical traditions that were excluded from the history of ideas due to social and political circumstances such as colonialism. “Particularly in India, Africa and Latin America there are strong movements to decolonise philosophy, that is to ‘provincialise Europe’, as the Indian scholar Dipesh Chakrabarty calls it, and to include indigenous philosophical traditions into the canon of philosophy,” outlines Professor Elberfeld. “The project would also like to include ideas and viewpoints from these philosophical traditions.”
This represents a paradigm shift within philosophy itself, towards a more entangled, globalised perspective. While the 2018 World Congress of Philosophy was the biggest ever, with around 3,000 papers presented, Professor Elberfeld says there was a recognition at the event that the field needs to evolve: “The President of the International Federation of Philosophical Societies, Luca Scarantino, said in his final speech at the Congress that the field had to change.” The project aims to make an important contribution to this change. “We would like to support, and to enhance, intercultural dialogue between different traditions – within epistemology, aesthetics, and different fields of knowledge,” continues Professor Elberfeld.
HISTORIES OF PHILOSOPHY A History of Philosophy in Global Perspective
Project Objectives
In the age of globalization it is necessary for the future of philosophy to draw up a history of philosophical thought in a global perspective, in which the Eurocentric narrowness of the field is overcome. The result of the project will be a new, globally oriented picture of the history of philosophy.
Project Funding
The project is funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) with €1.25 million for five years. (http://gepris.dfg.de/gepris/projekt/411880265?language=en)
Contact Details
Principal Investigator, Professor Rolf Elberfeld Institut für Philosophie Universität Hildesheim Universitätsplatz 1 31141 Hildesheim Germany T: +49 5121 88321104 E: elberfeld@uni-hildesheim.de W: https://www.uni-hildesheim.de/histories-of-philosophy/
Professor Rolf Elberfeld
Professor Rolf Elberfeld studied Philosophy, Japanology, Sinology and History of Religion in Wuerzburg, Bonn and Kyoto. He received his PhD from the University of Wuerzburg and did his Habilitation at the University of Wuppertal. He is Full Professor of Philosophy at the University of Hildesheim. His fields of research are Intercultural Philosophy, Phenomenology, Japanese Philosophy and the global History of Philosophy.
Evaluating the fundamental elements of a Scientific Theology While theology as a field is commonly thought of as part of the humanities, researchers in the field do employ scientific methods to investigate the nature of God. Now a research group aims to assess the arguments over whether theology can actually be thought of as a scientific discipline, and to bring some structure to debates in related fields, as Dr Benedikt Göcke explains.
Scientific research is the institutionalised attempt to provide true descriptions, good explanations, and correct predictions of our world, work which has led to many important discoveries, technical advances and material progress. On the surface this may seem very different from the field of theology, in which researchers look at the nature of God and investigate how religion shapes the world, yet the work of Dr Benedikt Göcke’s research group brings these two topics together. “We are analysing the different arguments against the possibility that theology, or confessional theology, is a scientific discipline,” he outlines. On the other hand, researchers are also looking at the arguments in favour of the proposition that theology is indeed a scientific discipline, work which leads into Dr Göcke’s wider goals. “We want to bring some structure into interesting discussions in science, theology and the philosophy of science,” he continues. “From this point, we hope to then reach reasonable conclusions about the possibility of a scientific Catholic theology.”
Scientific theology
This work is built on the analysis of recent debates and discussions in the philosophy of science, as well as in analytical philosophy and analytical theology. While the study of God and religion by nature involves a lot of reading, textual analysis and debate, theologians do actually employ scientific methods in research. “Theologians in different disciplines, like church law, moral theology, and Christian and social ethics, use scientific methods. For example, theologians use philological methods in the analysis of the Bible,” points out Dr Göcke. Other parallels can also be drawn in the foundations of theological and scientific research. For example, while scientific research is often thought of as a very rational activity, proceeding on the basis of observed data and accumulated knowledge, it is necessary to first make some fundamental assumptions about the nature of reality, just as in theology. “It’s not possible to engage in science without making certain metaphysical presuppositions,” stresses Dr Göcke.
It is in practice unrealistic for human beings not to make any metaphysical assumptions in the course of their daily lives, Dr Göcke believes, regardless of their religious perspective. A religious individual makes assumptions about the existence of God for example, while even committed atheists still make metaphysical assumptions about the nature of reality. “We all make metaphysical presuppositions,” says Dr Göcke. Theologians and scientists alike approach their work with certain metaphysical presuppositions but, Dr Göcke says, it is important that these presuppositions are questioned and reflected on as part of the search for objective truth. “In the humanities and the natural sciences, researchers need to reflect on the fundamental structure of what they see, which arguments can be formulated in favour of particular worldviews, and to identify possible inconsistencies,” he outlines. “The overall goal in research is to find objective truth. We engage in all these different activities in order to find it.”
https://www.peterlang.com/view/title/67511?format=HC
The members of the Vienna Circle, an eminent group of philosophers who were active in the 1920s and ‘30s, had argued that science had to be built from the ground up, without any metaphysical presuppositions. However, subsequent debate in the philosophy of science has shown that this is not actually how science progresses and how new discoveries are made, a topic that Dr Göcke and his colleagues are exploring. “Based on the research of my colleague Dr Jan G. Michel, we have started analysing what exactly makes a scientific discovery. Dr Michel is currently developing a
theory of the structure of scientific discoveries which will lead us to a better understanding of what processes are behind the discovery of, for instance, a new species in biology, or a new particle in physics,” he outlines. This may involve changing perspective on the nature of reality and reassessing certain metaphysical and other presuppositions. The path of scientific progress involves not just the accumulation of knowledge, but also sometimes radical shifts in perspective, for example the change from a geocentric view of the solar system to a heliocentric view in the Copernican Revolution. “There are sometimes points in scientific development where so much evidence accumulates against certain assumptions that it is essential to change perspective,” says Dr Göcke. The problem here is in identifying whether there are meta-theoretical principles that can guide such changes in perspective. “Science is always based on our common understanding of the world – our daily experience of being,” Dr Göcke continues. “When you see that the way you perceive the world doesn’t stand up any more, then you change your perspective.
It’s about searching for the right perspective to understand reality.” An individual’s faith may be an important factor in this, and many people perceive the world through the dogmas of their particular faith. With respect to the ‘scientificness’ of theology, one important question is whether an individual can perceive the world from a religious perspective without generating inconsistencies with what is known from physics, biology, philosophy, or other disciplines. “That is one of the most fundamental questions that each and every religion has to ask itself,”
https://www.aschendorff-buchverlag.de/detailview?no=11912
Dr Göcke emphasises. This would seem to imply that dogma, or the interpretation of it, should evolve in line with the advancement of knowledge, which indeed has historically been the case. “The interpretation of what a certain dogma means changes over the course of time. We have to decide on how to understand the dogma according to the circumstances in which we live,” continues Dr Göcke. “Due to fundamental changes in the overall worldviews of the believers, it’s hard to imagine that religious beliefs literally meant the same thing to people 2,000 years ago that they do today.”
Sharing perspectives
The world has of course changed significantly since then, yet researchers today remain preoccupied with the search for objective truth, both in theology and the natural sciences. While there can sometimes be a degree of mutual suspicion between the natural sciences and the humanities, Dr Göcke believes researchers in different disciplines can learn from each other. “We are trying to bring together scientists and theologians,” he explains. A good example is a recent seminar which brought together theology and geology students to discuss questions like how old the world is. “Some religions believe that the world is only 7,000 years old or so. We discussed this, so that the geology students got a better
A SCIENTIFIC THEOLOGY?! Naturalism and philosophy of science as challenges to Catholic theology
Project Objectives
Both analytic philosophy of science and current naturalistic thinking are full of arguments against the very possibility of a scientific Catholic theology. The research group intends to structure and evaluate these arguments by developing fundamental elements of a Catholic philosophy of science that is not only able to reject these arguments but at the same time can account for the possibility of science as such.
Project Funding
€1,600,000
Project Partners
The project works extensively with colleagues from Cambridge University, Oxford University, Lincoln University, and Warsaw University.
Contact Details
Project Coordinator, Professor Benedikt Paul Göcke Philosophy of Religion and Philosophy of Science, Catholic Faculty of Theology, Ruhr-University Bochum T: +49 234 32 29389 E: benedikt.goecke@rub.de W: http://www.kath.ruhr-uni-bochum.de/wissenschaftstheorie/index.html.de https://news.rub.de/presseinformationen/wissenschaft/2018-02-08-theologie-die-lehre-von-gott-als-wissenschaft https://www.tagesspiegel.de/wissen/theologie-als-wissenschaft-die-gottesfrage/21201240.html
understanding of how theologians approach certain philosophical and metaphysical questions. The philosophy and theology students also gained a better understanding of the methods that are used in the natural sciences,” says Dr Göcke. “For example, what methods are used to determine the age of the universe?” This type of interdisciplinary discussion and debate helps open up new perspectives on the nature and structure of scientific approaches to describing reality, which is
at the core of Dr Göcke’s research. From this point, he and his colleagues can then ask whether theology can be described as a scientific activity. “We are trying to show that confessional theology is a subject worthy of scientific reflection and is a scientific approach to reality itself,” says Dr Göcke. This research has led to the publication of a number of papers on the philosophy of science, as well as on theology and metaphysics, while Dr Göcke is also engaged in the wider debate about the importance of theology in the digital age. “How will the development of artificial intelligence and other new technologies affect religious worldviews? How will it change how we perceive the world ethically?” he asks. “For instance, should robots have rights? Can we reconcile the incredible possibilities of artificial intelligence with a Christian perspective on reality?”
These topics remain the subject of lively debate, and it’s important to consider them from different perspectives as technology continues to advance. For instance, while self-driving cars could help improve traffic flow, there are some important ethical questions to consider first. “Who’s responsible if there is a crash? We need to think about these issues, before we introduce them on the streets,” stresses Dr Göcke.
The Research Group
https://www.katholisch.de/aktuelles/aktuelleartikel/ist-theologie-eine-wissenschaft
Professor Benedikt Paul Göcke
Benedikt Paul Göcke is Professor for Philosophy of Religion and Philosophy of Science at the Catholic Faculty of Theology, Ruhr-University Bochum. He is also an associated member of the Faculty of Theology and Religion at the University of Oxford. He has published articles in The International Journal for Philosophy of Religion, Zygon, Sophia, The European Journal for Philosophy of Religion, and Theologie und Philosophie, among others.
Why do unlikely organisational failures happen? Major failures of public administration are rare in democratic states, yet they can have serious consequences when they do occur. We spoke to Professor Wolfgang Seibel about his work in identifying the causal mechanisms behind organisational failures across different areas of public administration, which could help inform improved prevention measures in future.
Serious failures of public administration are thankfully rare in democratic states, yet this rarity by nature makes them difficult to analyse and learn from. There is a tendency in the social sciences to focus attention on statistically significant events when investigating organisational failure, yet Professor Wolfgang Seibel believes it’s important not to neglect rarer incidents, which require a different approach. “There is insufficient acknowledgement in the social sciences that we need to apply a different kind of perspective to deal with these rare but serious events,” he outlines. This topic is at the heart of Professor Seibel’s work in the DFG-funded Black Swans in Public Administration project. “Public administrations should of course try to avoid serious failures. The question then is, how do we develop generalisable factors that enable
us to take preventative measures, on the basis of a relatively small number of cases and observations?” he asks.
Causal mechanisms
This methodological puzzle is a central part of the project’s work. Through case studies on specific instances of organisational failure, including the crowd disaster at the Love Parade music event in the German city of Duisburg in 2010, researchers aim to identify generalisable causal mechanisms. “We should be able to identify generalisable causal mechanisms, even though there are only a limited number of cases,” says Professor Seibel. When seeking to make generalisations it is essential to first precisely define the focus of investigation; Professor Seibel and his colleagues’ attention is centered on extremely serious cases of organisational
failure. “We’re not looking at failures in the sense of poor performance for example, but rather cases where the failure is crystal clear. In some of these cases the mismanagement of public administration resulted in the loss of human life,” he outlines. “The first point that can be generalised here is that it is highly unlikely that somebody would allow these organisational failures to happen just out of negligence.” There may nevertheless have been particular incentives in place that encouraged public officials to make compromises they would not normally consider. The result of this may be that the physical integrity of people is essentially made negotiable, putting safety at risk. “If we can identify those common incentives, then we can look at causal mechanisms that we may be able to generalise,” says Professor Seibel.
Hurricane Katrina was an extremely destructive and deadly Category 5 hurricane that made landfall on Florida and Louisiana in August 2005, causing catastrophic damage, particularly in the city of New Orleans and the surrounding areas. Subsequent flooding, caused largely as a result of fatal engineering flaws in the flood protection system of levees around the city of New Orleans, precipitated most of the loss of life.
Cars rest on the collapsed portion of the I-35W Mississippi River bridge after the August 1st, 2007 collapse. Photograph by Kevin Rofidal, United States Coast Guard [Public domain]
The aim here is to identify regular patterns of failure; this research builds on ideas from the field of welfare economics. “A business or firm produces goods for public consumption for example, yet it may also have negative externalities on the public, e.g. when it pollutes the air. The idea of welfare economics is that we need to strengthen the positive externalities and to reduce, or mitigate, the negative externalities,” explains Professor Seibel. “Both positive and negative externalities are related to the mechanism of internalisation, a mechanism that enables us to attribute particular external effects to particular causes.”
The general hypothesis in Professor Seibel’s research is that in cases of organisational failure there are weaknesses of internalisation that contribute to a higher likelihood of negative externalities. It is important in these terms that public officials are answerable for their decisions and that clear lines of accountability are in place. “If there are reliable mechanisms to hold those in public administration accountable for negative consequences, then it is more likely that negative externalities will be reduced to a minimum,” says Professor Seibel. The wider aim here is to ensure that officials are incentivised to ensure that human security is always the top priority for public organisations, even when they are under financial or political pressure, whether it be to run a major event or to keep the costs of maintaining public infrastructure down. “Resources may be scarce, or maybe accountability mechanisms are unenforced,” explains Professor Seibel.
Case studies
Researchers are now looking to gain deeper insights from case studies of organisational failure across various different areas of public administration, including mass events, transport and road safety, and infrastructure and construction. While the researchers cannot directly compel witnesses to provide their versions of events, Professor Seibel and his colleagues do have access to a lot of documents on these cases. “We have reports that are comprehensive enough to give us a good account of what happened,” he outlines. Researchers have trawled through official reports and other documentation on these cases, aiming to reconstruct the events and processes that led to organisational failure. “This information is the basis of our
research. It’s primarily written, documentary information, from parliamentary investigative committees for example, that we can rely on,” says Professor Seibel. The 2010 Love Parade event in Duisburg is one of the cases on which Professor Seibel and his colleagues are focusing their attention, with researchers looking at the events leading up to the failure and trying to identify the contributing causal mechanisms. The cancellation of the 2009 event had heightened pressure to run the 2010 edition, and while safety concerns were raised beforehand, it nevertheless went ahead. “Public officials made the deliberate decision to ignore the risks, for the sake of running a big, potentially prestigious mass event,” says Professor Seibel. This had fatal consequences. “21 people died unnecessarily, due to a massive failure of public authorities, as public officials could easily have been aware of the risks involved,” continues Professor Seibel. “We aim to help enhance the knowledge base necessary to manage these types of activities and events effectively.” A degree of complexity is unavoidable in this, so it’s important that lines of accountability are clear and the parties involved communicate effectively. It is not complexity as such that is dangerous, believes Professor Seibel, but rather the presence of counter-incentives that prompt public officials to ignore the inherent risks. “That is when complexity becomes dangerous,” he stresses. The trend towards outsourcing and collaborative governance is also leading to increased complexity in other areas of public administration. “It’s important to carefully consider the implications when outsourcing the delivery of public goods and services. You want to guarantee the same control and coherence of vision as when those goods and services remain in the immediate realm of public authorities,” says Professor Seibel. “When accountability structures are compromised, that increases the likelihood of loss of control.”
Ambivalence of pragmatism
The evidence gathered so far supports the externalisation hypothesis, yet no definitive conclusions have yet been made and research is ongoing. Alongside enhancing
the knowledge base required to help prevent serious administrative failures in future, Professor Seibel also hopes to contribute to the debate around the limits of pragmatism in decision-making. “On the one hand, public administration is dependent on flexibility and situation-specific decisionmaking. So, in this sense pragmatism means not just sticking rigidly to the rules, but considering the particular characteristics of a situation,” he explains. However, there are also situations in which rules and regulations must be strictly enforced. “We cannot accept
BLACK SWANS IN PUBLIC ADMINISTRATION
2013 Savar building collapse, Bangladesh. On Wednesday, 24 April 2013, an eight-storey commercial building named Rana Plaza collapsed in the Savar Upazila of Dhaka, Bangladesh. The search for the dead ended on 13 May 2013 with a death toll of 1,129.
Black Swans in Public Administration: Rare Organizational Failure with Severe Consequences
Project Objectives
The project analyses patterns of organisational failure in public administration with severe consequences for the physical integrity of humans, such as inadequate maintenance of public infrastructure causing buildings and bridges to collapse, organisational failure in disaster relief, and management deficits of child protection authorities leading to fatal child abuse.
By rijans - Flickr: Dhaka Savar Building Collapse, CC BY-SA 2.0, https://commons.wikimedia.org/w/index.php?curid=26051590
pragmatism when it comes to road safety, or the protection of young children,” points out Professor Seibel. “The question is – to what extent are staff at all levels of public administration aware of these limits and aware of the ambivalence of pragmatism?” A clear distinction can be drawn here between consequentialist and deontological ethics as decision-making frameworks. While in deontological theory the emphasis is on adhering to a set of pre-defined rules, consequentialism is more flexible. “Consequentialism means that you think in terms of what are the consequences of what you are doing. Can you justify those consequences according to a set of sort of normative, ethical yardsticks?” explains Professor Seibel. These yardsticks might be practical, financial or political, or based on some other consideration; many people
would approve of this approach. “Most of us would like public officials to think about the consequences of what they are doing and not just rigidly apply the rules,” says Professor Seibel. “The thing is that rules exist for good reason – and while it is unacceptable to stick rigidly to the rules in all circumstances, it is also unacceptable to bend and ignore them.” The leaders of public administrations have an important role to play here in identifying those areas in which people are justified in acting pragmatically, or where pragmatism is unacceptable. Allowing these boundaries to be blurred can have fatal consequences, as at Love Parade in Duisburg and the Grenfell Tower fire in London in 2017 for example. “There are limits to pragmatism, where public administrations have to rigidly enforce legislation,” stresses Professor Seibel.
Project Funding
The project is supported by a €500,000 grant within the Reinhart Koselleck programme of the German Research Foundation (DFG). According to the DFG definition, the programme “enables outstanding researchers with a proven scientific track record to pursue exceptionally innovative, higher-risk projects”.
Contact Details
Project Coordinator, Professor Wolfgang Seibel Department of Politics and Public Administration University of Konstanz Universitätsstraße 10, 78464 Konstanz, Germany T: +49 (0)7531 88-3684 E: wolfgang.seibel@uni-konstanz.de W: https://www.polver.uni-konstanz.de/en/ seibel/professors/prof-dr-wolfgang-seibel/
On 14 June 2017, a fire broke out in the 24-storey Grenfell Tower block of flats in North Kensington, West London, just before 1:00 am British Summer Time. The devastating fire caused 72 deaths, including those of two victims who later died in hospital. More than 70 others were injured and 223 people escaped the inferno. Photograph by Natalie Oxford [CC BY 4.0 (https://creativecommons.org/licenses/by/4.0)]
Professor Wolfgang Seibel
Wolfgang Seibel is a Full Professor of Politics and Public Administration at the University of Konstanz. He has also held roles at institutions in Europe, the US and South Africa. His recent work focuses on the theory of public administration and variants of drastic administrative failure and disasters.
Exploring meaning in grammar Language is understood not just through the words of a speaker, but the grammatical structures they use, as different types of information are imparted through different forms of expression. We spoke to Professor Thomas Ede Zimmermann about his work in a DFG project which brings together researchers from several different disciplines to analyse the propositionalist thesis. The meaning of a sentence is defined not just by the words out of which it is made up, but also by its grammatical structure. In order to understand a spoken sentence for example, the listener needs to not only understand what the words mean, but also how their meanings interact. “This interaction between the meaning of words is central to the meaning of a sentence,” says Thomas Ede Zimmermann, Professor of Linguistics at Goethe University Frankfurt. As the Principal Investigator of a new DFG-funded research project, Professor Zimmermann is probing deeper into the semantic structure of language, using methods derived from mathematical logic. “We can use grammatical structure to understand what the subject and the object are in a sentence, but this doesn’t tell us how the meanings of these components interact,” he explains. “This is where logical semantics comes in. With methods from formal logic, we can describe the interaction of word meanings.”
Linguistic semantics
This approach only entered linguistic semantics around the 1970s, and a lot of progress has been made over a comparatively short time. However, Professor Zimmermann
says that certain suppositions or ideological preferences have entered the field. “One of them is the idea of propositionalism,” he outlines. This can be thought of as a fairly general approach to linguistic semantics that is used by a majority of researchers in the field. However, over the last 10-15 years or so its foundations have been increasingly called into question. “People have started producing evidence against propositionalism,
an anti-propositionalist thesis – this was mainly in philosophy of language, not so much in linguistic semantics,” continues Professor Zimmermann. “One thing that philosophers of language have not really considered enough, to my mind, is that it’s highly non-trivial to give this general strategy of propositionalism a precise definition.” The divide between extensional and intensional grammatical constructions is central to understanding propositionalism. The simplest grammatical constructions are known as extensional, usually identified by what’s called a substitution test. “Take the sentence ‘John kissed Mary’, and let’s suppose that Mary is Harry’s mother. The name ‘Mary’ and the description ‘Harry’s mother’ are co-referential – that means they refer to the same person,” outlines Professor Zimmermann. Since the construction is extensional, the two expressions can be substituted for
each other without affecting the truth value of the sentence, which is not the case with intensional constructions. “Intensional constructions often relate to what people think or mean. The most common example is what’s called an attitude report – a sentence where a clause gets embedded or becomes the object of a verb, as in ‘Jane thinks that John kissed Mary’,” explains Professor Zimmermann. “In this context, replacing the name ‘Mary’ by the co-referential description
‘Harry’s mother’ may turn a true report into a false one: Jane need not know that Mary is Harry’s mother.” This substitution resistance is indicative of intensional constructions. “In general, in an intensional construction, co-referential expressions or sentences with the same truth values cannot replace each other without affecting the truth value of the entire report,” says Professor Zimmermann. Researchers in Professor Zimmermann’s group are also looking at a number of intensional constructions including intensional transitive verbs, which allow for so-called non-specific readings of their objects. “A classic example is the sentence ‘John is looking for a cheap restaurant’. Let’s imagine that cheap restaurants are specifically Italian restaurants, but John might not be looking for an Italian restaurant, maybe he dislikes Italian food,” Professor Zimmermann continues. “So, although John is looking for a cheap restaurant, it doesn’t necessarily follow that John is looking for an Italian restaurant. In this respect, the intensional verb ‘look for’ differs from an ordinary transitive verb like ‘enter’: in the circumstances indicated, the truth of ‘John enters a cheap restaurant’ guarantees the truth of ‘John enters an Italian restaurant’.”
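The substitution test lends itself to a small worked example. In a toy possible-worlds model, an extensional verb only looks at what names refer to in the world of evaluation, while an attitude verb quantifies over an agent’s belief worlds, where co-referential terms can come apart. The sketch below is our own illustration of this standard textbook picture, not the project’s formalism; all names and worlds are invented.

```python
# Toy possible-worlds model of the substitution test.
# Each world fixes who the terms refer to and who kissed whom.
worlds = {
    "w_actual": {"Mary": "m", "Harrys_mother": "m", "kissed": {("j", "m")}},
    # In one of Jane's belief worlds, Mary and Harry's mother differ:
    "w_belief": {"Mary": "m", "Harrys_mother": "x", "kissed": {("j", "m")}},
}

def kissed(world, subj, obj_term):
    """Extensional: truth depends only on reference in the given world."""
    w = worlds[world]
    return (subj, w[obj_term]) in w["kissed"]

def thinks_kissed(belief_worlds, subj, obj_term):
    """Intensional: true iff the embedded clause holds in every belief world."""
    return all(kissed(w, subj, obj_term) for w in belief_worlds)

# Extensional context: substituting co-referential terms preserves truth.
print(kissed("w_actual", "j", "Mary"))           # True
print(kissed("w_actual", "j", "Harrys_mother"))  # True - same referent

# Attitude report: Jane's belief worlds include w_belief, where the two
# terms pick out different people, so substitution flips the value.
janes_belief_worlds = ["w_actual", "w_belief"]
print(thinks_kissed(janes_belief_worlds, "j", "Mary"))           # True
print(thinks_kissed(janes_belief_worlds, "j", "Harrys_mother"))  # False
```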
The propositionalist thesis
The distinction between extensional and intensional constructions is central to developing a precise and systematic account of grammatical meaning, a topic at the heart of Professor Zimmermann’s research. The overall aim of the project is to develop a critical analysis of the propositionalist thesis, namely that intensionality only occurs when a clause is embedded. “In the first example of an intensional construction – ‘Jane thinks that John kissed Mary’ – we have a clausal embedding. With ‘John is looking for a cheap restaurant’, we don’t seem to have a clausal embedding,” says Professor Zimmermann. This latter example could however be reformulated in such a way that a clause is embedded, viz. by ‘John is making efforts to the effect that he find a cheap restaurant’. “The advocates of the propositionalist thesis suggest that as it’s an intensional construction, there will be some clausal embedding,” explains Professor Zimmermann. A key question is thus whether there is always a way to paraphrase or analyse an intensional construction in terms of clausal embedding. As part of their work in scrutinising the propositionalist thesis, Professor Zimmermann and his colleagues are investigating potential counter-examples. “We’ve
identified grammatical constructions that might turn out to be intensional, without being reducible to clausal embedding,” he outlines. One prominent counter-example involves the verb ‘fear’. “It has been argued that ‘fear’ is intensional,” continues Professor Zimmermann. “For example, take the sentences ‘Lex Luthor fears Superman’ and ‘Lex Luthor fears Clark Kent’. It seems that the first is true and the second is false, even though Superman and Clark Kent are the same person. This looks like a substitution failure, so the construction should be intensional. However, people have argued that it would be difficult to come up with a clausal embedding analysis of this structure.” This reasoning has been questioned on the basis of the observation that it runs the risk of trivialisation, another issue that Professor Zimmermann and his colleagues are exploring in the project. David Kaplan’s 1975 paper How to Russell a Frege-Church holds clear relevance in this respect. “Kaplan showed that Bertrand Russell’s arguably propositionalist picture is far less restrictive than it would appear,” says Professor Zimmermann. If propositionalism means what Kaplan took Russell to have meant, then the thesis of anti-propositionalism collapses; Professor Zimmermann and his colleagues will try to make an important contribution to this debate. “Part of the project is about developing a stable and precise definition of what’s behind propositionalism, and the thesis of anti-propositionalism,” he explains.
One longer-term aim is to find alternatives to propositionalism, yet researchers in the project are also pursuing more immediate objectives. One area in which Professor Zimmermann hopes to make tangible progress within the scope of the project is in the mathematical aspects of semantic theory. As a case in point, logical paraphrases are usually formulated in terms of a family of under-explored algebraic techniques known as type-shifting (also known as ‘type-coercion’). “I hope that we will elaborate some proposals and theories in the area of type-shifting,” says Professor Zimmermann. Over the longer term, Professor Zimmermann plans to devote a lot of his attention to the logical foundations of semantic analysis of natural language, and the role of propositionalism in particular. “Once we have a stable, viable definition of propositionalism which we can show is non-trivial, then we can proceed with potential counter-examples and look towards empirical questions. Can the account that we give be shown to be propositionalist? Under what circumstances will it be shown to be anti-propositionalist?”
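To give a flavour of the formal machinery mentioned here (a schematic illustration in standard Montague-style notation, not a statement of the project’s own proposals): type-shifting maps an expression from one semantic type to another, and the classic propositionalist-style treatment of ‘look for’ paraphrases it via the clause-embedding verb ‘try’.

```latex
% Partee-style LIFT: a name of type e is shifted to a generalised
% quantifier of type <<e,t>,t>:
\mathrm{LIFT}(j) \;=\; \lambda P.\, P(j)

% Montague-style decomposition of the intensional transitive verb
% 'look for' ('seek') via the clause-embedding verb 'try':
% "x looks for Q" is paraphrased as "x tries to find a Q".
\mathrm{look\_for}'(x, \mathcal{Q}) \;=\;
  \mathrm{try}'\!\left(x,\; {}^{\wedge}\mathcal{Q}\bigl(\lambda y.\, \mathrm{find}'(x, y)\bigr)\right)
```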
PROPOSITIONALISM IN LINGUISTIC SEMANTICS
Propositionalism in Linguistic Semantics
Project Objectives
How do humans understand linguistic content? Techniques from formal semantics allow researchers to investigate the role played by grammar in imparting meaning, a topic central to Professor Zimmermann’s research. The aim in the project is to investigate the underlying theoretical principles by means of a critical analysis of what is known as the ‘propositionalist hypothesis’. This assumes that any reference to linguistic content is ultimately based on clausal embedding in grammar. The competing intensionalist approaches are more liberal and build on the assumption that information content in principle corresponds to all types of expression. The ultimate aim in research is to find alternatives to propositionalism.
Project Funding
Funding through the Reinhart Koselleck Projects (DFG)
Contact Details
Professor Thomas Ede Zimmermann Principal Investigator Institut für Linguistik Goethe-Universität Frankfurt am Main Norbert-Wollheim-Platz 1 60629 Frankfurt am Main T: +49 (0)69 798 32394 E: T.E.Zimmermann@lingua.uni-frankfurt.de W: https://www.propositionalismus.de/en/about/
Professor Thomas Ede Zimmermann
Thomas Ede Zimmermann is a Professor in the Department of Linguistics at Goethe University Frankfurt, where he is also a member of the Department of Philosophy. His research interests centre on linguistic semantics, logical foundations of linguistics, the interface between semantics and pragmatics, and the philosophy of language.
Analysing the complexity of European policy
A large body of European Union regulations and directives has accumulated over time, to which new proposals are regularly added, while the content of policy is itself growing increasingly complex. The question then arises of how these proposals can be processed and implemented efficiently, a topic at the heart of Dr Steffen Hurka’s research.
The policy issues
that the European Union deals with have become progressively more complex over recent decades, as the organisation has expanded and assumed responsibility for a wider range of regulations. New proposals are being added to an already dense web of existing law, raising important questions about the functioning of political institutions. “How can these complex legal proposals be processed efficiently by existing political institutions?” asks Dr Steffen Hurka, an Assistant Professor of Politics based at the Ludwig-Maximilians-University (LMU)
EUPLEX
Coping with Policy Complexity in the European Union
Dr Steffen Hurka Ludwig-Maximilians-University Munich Geschwister-Scholl-Institute of Political Science Chair of Empirical Theory of Politics Oettingenstraße 67, 80538 Munich E: steffen.hurka@gsi.uni-muenchen.de W: https://www.steffenhurka.com T: +49 (0) 89 2180 9038
Steffen Hurka is an assistant professor of political science at LMU Munich. He earned his doctoral degree at the University of Konstanz in 2015. Hurka’s work mainly focuses on the legislative organization of the European Parliament and the dynamics of policy-making, both in the EU and at the national level.
in Munich. This question is at the heart of Dr Hurka’s work as the Principal Investigator of the EUPLEX project, funded by the German Research Foundation (DFG). “We’re interested in how the complexity of a policy proposal changes, once the European Commission has introduced it into the legislative process. How strongly do different EU institutions contribute to the complexity of EU law?” he outlines. A large data-set of pieces of legislation covering the period 2004 to 2019 is being assembled within the project, which will provide the foundation for Dr Hurka and his colleagues to draw comparisons between different EU regulations and directives, covering a wide range of areas. The legislative process in the EU is described by what is
called the ordinary legislative procedure, which is the focus of Dr Hurka’s research. “Usually the European Commission makes a policy proposal, then sends it to the European Parliament and the Council. These two institutions then negotiate with each other on amendments to the proposal, then they either adopt it or reject it,” he explains. The proposal itself may be extremely complex in nature, often relating to the integrity of the single market and the harmonisation of standards. “We’re looking at the question of how this complexity affects the political process, and the nature of negotiations,” says Dr Hurka. The European Commission itself is a vast bureaucracy home to great knowledge and technical expertise, yet its proposals must be passed by the Parliament and the Council before they become law. A new policy also interacts with existing laws, adding up to a complex picture. “Once you propose a new policy, that policy interacts with an increasing amount of already existing policies. So, we suspect that the amount of policy complexity has increased markedly over time,” outlines Dr Hurka. This is not necessarily a problem, as to some extent policy complexity is the result of technological and societal progress. “As societies progress, they make more complex legislation. The question is, do our political institutions match this? We aim to identify the optimal degree of policy complexity. So, the amount of complexity at which the political system functions as efficiently as possible,” continues Dr Hurka. This research is ongoing, with Dr Hurka and
his colleagues currently working on a paper on how the complexity of the Commission’s proposals affects the length of negotiations in the EU. Researchers have found that different types of complexity have very different effects on the efficiency of the legislative procedure, which could hold important lessons for how institutions function. “When the EC produces a certain policy proposal, we can make a good estimate of how long the negotiations will take,” explains Dr Hurka. In the long term this could help to improve efficiency and strengthen the democratic legitimacy of European institutions. “Institutions are better able to organise their resources once they know how complex a certain proposal is. That should enable European institutions – the Parliament, Commission and the Council – to organise themselves better,” says Dr Hurka.
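The article does not spell out EUPLEX’s concrete complexity indicators or statistical models. Purely as an illustration of the approach, the sketch below computes crude textual proxies for a proposal’s complexity and fits a linear model of negotiation duration on them; the proxies, the mini-corpus and the regression are all hypothetical stand-ins.

```python
import re
import numpy as np

def complexity_proxies(text):
    """Crude textual proxies for policy complexity (illustrative only):
    word count, number of articles, and mentions of other instruments
    ('Regulation', 'Directive') as a rough cross-reference count."""
    words = len(text.split())
    articles = len(re.findall(r"\bArticle\s+\d+", text))
    crossrefs = len(re.findall(r"\b(?:Regulation|Directive)\b", text))
    return [words, articles, crossrefs]

# Hypothetical mini-corpus: (proposal text, negotiation length in days).
past = [
    ("Article 1 sets labelling rules. Article 2 amends Regulation 10.", 300),
    ("Article 1 on market access.", 150),
    ("Article 1, Article 2 and Article 3 amend Directive 5 and Directive 6 "
     "and Regulation 7 on harmonised standards.", 700),
]

# Ordinary least squares with an intercept - a simple stand-in for the
# project's statistical models, which are not described in the article.
X = np.array([[1.0] + complexity_proxies(t) for t, _ in past])
y = np.array([d for _, d in past], dtype=float)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def estimated_days(text):
    """Predict negotiation duration for a new proposal text."""
    return float(coef @ np.array([1.0] + complexity_proxies(text)))

print(round(estimated_days("Article 1 amends Regulation 99."), 1))
```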