EU Research Autumn 2020
COVID-19 and the Impact on Science
The sunnier side of science: A closer look at the sun
ERC concern “Horizon Europe funding insufficient”
A focus on student engagement during the pandemic
UN says melting ice sheets on course for worst-case scenario
Follow EU Research on www.twitter.com/EU_RESEARCH
©NASA
Disseminating the latest research from around Europe and Horizon 2020
Editor’s Note
In July, the 27 EU member states agreed to cut the financial package proposed for Horizon Europe, the EU’s research and innovation programme which funds the European Research Council (ERC), under the renewed seven-year multiannual financial framework (MFF) stretching from 2021 to 2027. Considering the European Parliament originally wanted 120bn Euros invested in the Horizon budget, the figure settled on by the European Council, a core budget of 75.9bn Euros plus an additional ‘recovery’ budget of 5bn Euros, has come as a blow to the science community.
With every renewed budget to date, increased spending has been the norm, a signal of Europe pushing to be a world leader in innovation and research. Country priorities have shifted during the pandemic, however, and this critical budget has taken a knock. The new, reduced figure has whipped up protests from the scientific community. A petition signed by some 16,000 scientists, scholars and universities, organised via Friends of the ERC, argues that the revised budget would have disastrous consequences for Europe’s standing in science and for the quality and breadth of research and innovation looking ahead. It has even been speculated that it could lead to a ‘brain drain’, if the ERC fails to fund enough researchers.
As a seasoned editor and journalist, Richard Forsyth has been reporting on numerous aspects of European scientific research for over 10 years. He has written for many titles including ERCIM’s publication, CSP Today, Sustainable Development magazine and eStrategies magazine, and remains a regular contributor to the UK business press. He also works in Public Relations for businesses, helping them communicate their services effectively to industry and consumers.
Their open letter calls upon the EU Presidents and Government leaders in Europe to protect and secure the ERC’s funding, which is due to begin in January. The letter contains some convincing insights in support of the petition’s message, which are worth sharing: for example, that 75% of completed ERC projects led to breakthroughs or major advances. Work funded through the scheme has earned a range of awards, including seven Nobel Prizes, and it has supported over 70,000 team members, who have published 150,000 articles and filed 1,200 patent applications. In these testing times you might understand budget reductions, but you reap what you sow, and with science, funding is everything. We need to strive for a better world, a brighter world, and losing momentum will have far-reaching implications at a time when scientific research and invention in so many fields is really needed.
Hope you enjoy the issue.
Richard Forsyth Editor
Contents
4 Research News
EU Research takes a closer look at the latest news and technical breakthroughs from across the European research landscape
10 HOTZYMES
Current methods of producing pharmaceuticals and biocommodities are relatively inefficient. Researchers in the HOTZYMES project are investigating the potential of using magnetic nanoparticles to heat up enzymes and catalyse reactions more efficiently, as Dr. Valeria Grazú explains.
12 MEIOTIC TELOMERE
We spoke to Professor Hiroki Shibuya about his work in investigating the fundamental mechanisms involved in meiotic recombination and the role of BRCA2 in repairing double strand breaks of DNA through homologous recombination.
14 NextGen IO
Many lung tumours don’t respond to immunotherapy, prompting research into new approaches to treatment. Researchers in the NEXT GEN IO project aim to exploit the hypoxia response in T-cells and harness its wider therapeutic potential, as Dr Asis Palazón explains.
16 chemREPEAT
The same amino acids may be repeated within a certain region of a protein. Dr Pau Bernadó is developing new strategies which will help researchers analyse the structure and dynamics of low complexity regions in proteins.
19 Engineering ‘smart’ viral RNA structures for stable and targeted siRNA delivery
It’s difficult to deliver siRNAs to the locations where they are required. Dr. Alyssa Hill is drawing inspiration from nature by engineering ‘smart’ viral RNA structures into delivery vehicles, which could open up new possibilities in treatment.
20 THE INFLUENCE OF EPISODIC MEMORY ON VALUE-BASED DECISION MAKING
Memory exerts an important influence on decision-making. We spoke to Dr Sebastian Gluth about his group’s work on both investigating the cognitive and neural processes involved in decision-making, and developing mathematical models of how those decisions are reached.
22 WRINKLES AND WRINKLONS We spoke to Dr Sebastian Gliga about his work in creating functional magnetic materials that can be reconfigured through shape change, which could open up new possibilities for the manipulation of spin waves with the aim of creating logical devices.
24 ExclusiveHiggs We spoke to Professor Kostas Nikolopoulos about the work of the ExclusiveHiggs project in analysing particle collisions, research which could help elucidate the properties of the Higgs boson and guide the search for new physics.
26 DarkSPHERE Dark matter is the focus of a great deal of attention in research, yet mystery still surrounds its nature and structure. The DarkSPHERE project aims to shed new light on the topic, as Dr Ioannis Katsioulas explains.
27 Postponing Research Studies
The Covid-19 pandemic has thrown research off course in all manner of ways, but few things have hit so hard as the need to cull research mouse colonies. The pandemic has reinforced both the importance of animals in scientific research and the attachment researchers have to their colonies. Adam England reports.
31 UP-Drive
Wojciech Derendarz, of the UP-Drive project, is working out how autonomous vehicles can develop improved perception, relying on combined technologies to process urban environments, so self-driving cars can take a step closer to earning their driving license.
34 E-CAM We spoke to Ignacio Pagonabarraga, Sara Bonella, Jony Castagna and Donal MacKernan of the E-CAM project about their work in developing new software modules targeted at the needs of both academic and industrial end-users.
38 Uncovering policy designs: A training dataset for future automated text analysis
Algorithms capable of identifying the design characteristics of policies affecting the renewable energy sector would be an invaluable tool in formulating climate policy, as Dr Sebastian Sewerin and Dr Lynn Kaack explain.
39 ANICOLEVO Fossils provide evidence of ancient life, including the colour of insects, reptiles, birds and mammals as old as hundreds of millions of years, a topic at the heart of Dr Maria McNamara’s research in the ANICOLEVO project.
40 NATURVATION Nature-based solutions are an important part of efforts to combat climate change and can help cities address other sustainability challenges. The Naturvation project aims to help unlock the wider potential of nature-based solutions, as Professor Harriet Bulkeley explains.
43 FIThydro Hydropower plants represent a threat to the health of fish and the ecology of their habitats. We spoke to Professor Peter Rutschmann about the work of the FIThydro project in developing innovative solutions to mitigate the impact of hydropower plants.
46 RIVERS We spoke to Dr Lieselotte Viaene about the work of the RIVERS project in investigating how human rights and international law can move beyond their anthropocentric dogma by learning from the relationships that indigenous peoples have with water and nature.
48 Host-parasite interactions in hybridizing Daphnia, from correlations to experiments
There are over 200 species of Daphnia, some of which can hybridize with each other. Dr Piet Spaak and Dr Justyna Wolinska tell us about their investigations into the extent to which hybridization and parasitism influence major ecological processes.
50 The Sunnier Side of Science Europe’s largest solar telescope, GREGOR, has recently had an upgrade and has taken some spectacular close-up images of the sun. Here, we take a look at what those images showed and explain why sun science is important. By Richard Forsyth
54 WEIRD WITNESSES
Cultural differences in the way people report events can be a challenge for police investigators. The WEIRD Witnesses project is designing new, culturally sensitive guidelines that will help investigators get the information they need, as Dr Annelies Vredeveldt explains.
56 SIBA New nation states were established following the dissolution of the Ottoman Empire. Photo archives from the 1920s and ‘30s open a window into everyday life in the period, a topic at the heart of Professor Nataša Mišković’s research.
57 LOCAL AUTONOMY & LOCAL PUBLIC SECTOR REFORMS
Local governments provide a variety of services, but they are not immune to wider economic pressures. Professor Andreas Ladner and his colleagues are conducting comparative studies that will get to the heart of the issues facing local government.
60 SOUNDS OF ANTI-JEWISH PERSECUTION
The Jewish population of Eastern Europe were subjected to relentless persecution during the Second World War. What did this persecution sound like? Researchers are analysing references to sounds to build a deeper picture, as Professor Christian Gerlach explains.
62 Student Engagement Richard Forsyth talks to Dr. Joanne Tippett about Ketso Connect, a new way to increase engagement, bring structure and build community in online and blended learning.
64 GRAMADAPT We spoke to Professor Kaius Sinnemäki, Dr. Francesca Di Garbo, Dr. Eri Kashima and Dr. Ricardo Napoleão de Souza about the work of the GramAdapt project in investigating linguistic adaptation.
66 KITAB The KITAB project is harnessing the power of technology to detect examples of text reuse across a large corpus, helping to build a deeper picture of the relationship between different authors and their books, as Professor Sarah Bowen Savant explains.
68 PHYSIOLYTICS AT THE WORKPLACE
The use of physiolytics at the workplace could help companies monitor stress levels among staff and reduce absenteeism, yet this also raises ethical and legal concerns, issues at the heart of Professor Tobias Mettler’s research.
EDITORIAL
Managing Editor Richard Forsyth info@euresearcher.com
Deputy Editor Patrick Truss patrick@euresearcher.com
Deputy Editor Richard Davey rich@euresearcher.com
Junior Editor Adam England adam.england@outlook.com
Science Writer Holly Cave www.hollycave.co.uk
Acquisitions Editor Elizabeth Sparks info@euresearcher.com
PRODUCTION
Production Manager Jenny O’Neill jenny@euresearcher.com
Production Assistant Tim Smith info@euresearcher.com
Art Director Daniel Hall design@euresearcher.com
Design Manager David Patten design@euresearcher.com
Illustrator Martin Carr mary@twocatsintheyard.co.uk
PUBLISHING
Managing Director Edward Taberner etaberner@euresearcher.com
Scientific Director Dr Peter Taberner info@euresearcher.com
Office Manager Janis Beazley info@euresearcher.com
Finance Manager Adrian Hawthorne info@euresearcher.com
Account Manager Jane Tareen jane@euresearcher.com
EU Research
Blazon Publishing and Media Ltd
131 Lydney Road, Bristol, BS10 5JR, United Kingdom
T: +44 (0)207 193 9820
F: +44 (0)117 9244 022
E: info@euresearcher.com
www.euresearcher.com
© Blazon Publishing June 2010
Cert no. TT-COC-2200
RESEARCH NEWS
The EU Research team take a look at current events in the scientific news
Horizon Europe Funding Insufficient, Argues European Research Council President
Jean-Pierre Bourguignon is ‘furious’ with the funding for the 2021-2027 programme Horizon Europe.
Jean-Pierre Bourguignon, president of the European Research Council (ERC), is ‘furious’ with the funding formula of Horizon Europe, the next EU funding programme for research and innovation.
As the most senior scientist in Europe, he told a Brussels committee hearing that the ERC would need at least €14.7 billion under Horizon Europe between 2021 and 2027, and that to match the ambitions of the ERC’s founders, €19.9 billion would be required.
When the European Research Council was launched back in 2007, there were high hopes that it would become the “Champions League of research” on the continent, and Bourguignon is confident that the ERC has lived up to expectations thus far.
Horizon Europe is a seven-year programme that is set to replace Horizon 2020 and fund the ERC, but Bourguignon expects a “big reduction” after EU leaders agreed in July to reduce the budget to €80.9 billion, down from the European Commission’s €94.4 billion proposal and the European Parliament’s hopes for €120 billion.
An open letter to the European Commission was signed by around 16,000 academics worldwide, including 14 Nobel Prize winners, urging them to protect the ERC’s budget and showing the high regard in which it is held across the world. Bourguignon explained at the recent EU hearing that the ERC scheme is now “recognized as a global brand, which several countries outside Europe want to copy because of the extraordinary stimulation and opportunities it gives to the most ambitious and talented members of the research community.”
As a result, Bourguignon expected the budget to stagnate when Horizon Europe took off in January, even though it was slightly bigger than Horizon 2020’s budget in numerical terms. According to the ERC president, they may need to set a lower budget for 2021 than for 2020, mirroring the situation between 2014 and 2015 when the ERC scientific council “had to take the drastic (and unpopular) measure to introduce reapplication restrictions that stayed. We are furious because we know this approach is wrong.” To add to his concerns, nothing from the €5 billion recovery fund set aside for the new programme would be targeted at the area where bottom-up research is focused, so there could be a decrease in research funding as a result.
He also described the ERC as “the key tool to offer prospects to the next generation of scientists, something badly needed to give them hope that they can advance in Europe better than elsewhere in the world,” and as a result, “Europe must show confidence in its own future and invest in research and innovation trusting its best minds.”
Describing this as a “worst-case scenario”, Bourguignon suggested Horizon Europe should be rebalanced if it was to happen, while the “present erosion … has to be corrected to ensure researchers have enough room to develop their ideas and initiatives.”
First 3,200-Megapixel Images Snapped by Largest Digital Camera’s Sensors
The sensors of a record-breaking digital camera take 3,200-megapixel images, the largest ever.
The first 3,200-megapixel images ever have been taken by sensors set to be part of the world’s largest digital camera at the U.S. Department of Energy’s SLAC National Accelerator Laboratory, operated by Stanford University. For reference, 378 4K ultra-high-definition television screens would be needed to display one of the images in full, and the resolution is high enough to make out a golf ball from around fifteen miles away. The sensor array will be integrated into the camera, which will capture panoramic images of the sky once every few nights for the next ten years. The data collected will be used in the Rubin Observatory Legacy Survey of Space and Time (LSST), creating the largest astronomical film on record, and is expected to lead to breakthroughs in areas such as dark matter. Images of around 20 billion galaxies will be collected by the camera, data that will “improve our knowledge of how galaxies have evolved over time and will let us test our models of dark matter and dark energy more deeply and precisely than ever,” according to the University of California, Santa Cruz’s Steven Ritz, project scientist for the camera. The focal plane captures light reflected by or emitted from objects, and this is then converted into electrical signals, forming a digital
image. The LSST camera’s focal plane contains 189 individual sensors in total, each with 16 megapixels - around the resolution of most digital cameras today. Altogether it boasts 3.2 billion pixels, each around 10 microns in width, and the focal plane itself is extremely flat. At about two feet wide, it is much bigger than the 1.4-inch imaging sensors found in regular digital cameras. The imaging sensors will be able to detect objects 100 million times dimmer than those visible to the naked eye. Unsurprisingly, the process of completing the focal plane was difficult in parts. 25 rafts had to be inserted into the grid, and the spaces between sensors on adjacent rafts were less than five human hairs wide. With the sensors being as fragile as they are, it was crucial that this all went to plan - the team had a year’s practice first, inserting numerous practice rafts that wouldn’t make it into the final focal plane. The rafts also cost around $3 million each, making the situation even more delicate. Now work on the focal plane has been completed, it is being held inside a cryostat at below minus 100 degrees Celsius. The project was disrupted by the coronavirus pandemic, but plans are in place for the team to place the cryostat into the body of the camera and then add the lenses. Eventually, the camera will be transported to Chile, where it will be installed at the Vera C. Rubin Observatory to begin collecting images.
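As a rough plausibility check on the figures above, the arithmetic can be sketched in a few lines of Python; the per-sensor resolution is taken as the article’s quoted 16 megapixels, so the totals come out slightly below the headline numbers.

```python
# Rough sanity check of the LSST camera figures quoted above, using the article's own numbers.
sensors = 189        # individual sensors in the focal plane
mp_per_sensor = 16   # megapixels per sensor, as stated above (the true value is slightly higher)

total_mp = sensors * mp_per_sensor
print(f"~{total_mp / 1000:.2f} gigapixels")  # ~3.02 GP; the quoted 3.2 billion pixels implies ~16.9 MP per sensor

# A 4K UHD screen is 3840 x 2160 pixels, roughly 8.3 megapixels.
uhd_mp = 3840 * 2160 / 1e6
print(f"~{total_mp / uhd_mp:.0f} 4K screens per image")  # ~365, close to the 378 quoted above
```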
Melting Ice Sheets On Course to Meet UN’s ‘Worst-case Scenario’
Melting ice sheets are contributing to a global sea level rise of around four millimetres each year.
It’s no secret that ice sheets in Greenland and Antarctica are melting, but according to a study published in the journal Nature Climate Change they are melting at a rate on course to meet the United Nations’ worst-case scenario forecasts.
Researchers from the University of Leeds and the Danish Meteorological Institute found that melting ice sheets in Antarctica have raised sea levels across the globe by 7.2 millimetres since the 1990s, while Greenland has contributed even more, at 10.6 millimetres.
Thanks to melting ice sheets across the planet, oceans are rising by around four millimetres each year. Should they continue at this rate, sea levels could increase by 17 centimetres by the end of the 21st century. A very similar scenario was identified as the ‘worst case’ by the United Nations’ Intergovernmental Panel on Climate Change (IPCC).
Dr. Tom Slater, climate researcher at the Centre for Polar Observation and Modelling at the University of Leeds, and lead author of the study, explained that “Although we anticipated the ice sheets would lose increasing amounts of ice in response to the warming of the oceans and atmosphere, the rate at which they are melting has accelerated faster than we could have imagined.”
“The melting is overtaking the climate models we use to guide us, and we are in danger of being unprepared for the risks posed by sea level rise.”
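For readers who want to put these figures side by side, here is a minimal sketch using only the numbers quoted in the article; it simply sums the two stated ice-sheet contributions and compares them with the projected worst case.

```python
# Figures quoted in the article above (millimetres of global sea level rise).
antarctica_since_1990s = 7.2
greenland_since_1990s = 10.6
worst_case_by_2100 = 170.0  # mm (17 cm) by the end of the century

total_so_far = antarctica_since_1990s + greenland_since_1990s
print(f"Ice-sheet contribution since the 1990s: {total_so_far:.1f} mm")          # 17.8 mm
print(f"Worst case is roughly {worst_case_by_2100 / total_so_far:.0f}x that")    # ~10x
```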
Synthetic molecule could repair spinal cord and brain damage
A new synthetic molecule developed at the University of Cambridge could benefit people with conditions such as epilepsy and Alzheimer’s.
Scientists at the University of Cambridge have created a synthetic molecular bridge, CPTX, which has been used to repair functions in cells and mouse models. The molecule can help to recreate connections in the brain and spinal cord of mice with conditions including Alzheimer’s disease, spinal cord injury and cerebellar ataxia. The organiser molecule cerebellin-1 establishes communication networks by connecting neuronal cells and receiver cells at synapses. The researchers wanted to find out whether they could cut and paste elements from different organiser molecules to produce new molecules with different binding properties. CPTX was produced as a sort of synaptic ‘glue’, and proved effective in organising neuronal connections in mice. It’s hoped that this new molecule can be used both to connect other cell types and to remove connections, helping to treat conditions like epilepsy in humans. More research is of course needed to see whether the results in humans mirror those recorded in mice, but the findings are promising, showing that it’s possible to recreate damaged connections. According to Radu Aricescu of the Medical Research Council Laboratory of Molecular Biology at Cambridge, “Damage in the brain or spinal cord often involves loss of neuronal connections in
the first instance, which eventually leads to the death of neuronal cells. Prior to neuronal death, there is a window of opportunity when this process could be reversed in principle. We created a molecule that we believed would help repair or replace neuronal connections in a simple and efficient way.” Where neuronal degradation was evident in mouse models, improvements were seen as connections were restored, with the mice performing better in various tests. After a single injection into the site of a spinal cord injury, motor function was restored for at least seven to eight weeks. The molecule was less effective when injected into the brain, however, where positive results were observed for around a week. The team also tested the molecule on mice with cerebellar ataxia, a condition that can result from a number of diseases and leaves sufferers with problems affecting gait, balance and eye movements. Motor coordination was restored in these mice, as synapses were able to reconnect. The research also acts as a prototype for exploring how neuronal circuits can be repaired with structure-guided approaches, and how this can be applied in neuronal repair and circuit engineering and remodelling. Kunimichi Suzuki, part of Aricescu’s group, is working on the development of second-generation synaptic organisers, which will then be tested in a number of animal models.
Frogs Now Identified From Their Tadpoles, Say Scientists
Scientists are now able to identify frogs from their tadpoles, helping conservation efforts.
Many amphibians like the Asian horned frog are at risk of extinction as their habitat is destroyed in remote Vietnam, and the difficulty of telling tadpoles apart has made efforts to save them harder - it is more difficult to locate breeding sites as a result.
The threat of extinction is a major issue facing amphibians in Vietnam - there are over 250 species of frogs and toads in the Southeast Asian nation, and many of these are critically endangered due to a combination of the wildlife trade, loss of natural habitat, climate change and disease.
However, scientists are now able to identify frogs from their tadpoles, in a breakthrough that could help conservation efforts across the planet. Of course, tadpoles and frogs don’t look alike, so it was previously difficult to tell which tadpole became which frog, and consequently researchers couldn’t work out where key breeding sites were located.
In 2019, scientists claimed that a fungus - chytridiomycosis - is responsible for killing numerous amphibians as part of a sixth mass extinction; 501 species of amphibian are either extinct in the wild or have declined by 90% due to this disease.
By collecting data, taking measurements and photos of tadpoles, and comparing their DNA to samples from adults, the scientists could work out which tadpoles belonged to six species of Asian horned frog: the Jingdong horned frog, the Maoson horned frog, the Mount Fansipan horned frog, the Giant horned frog, the Annam horned frog, and the Hoang Lien horned frog. “These frogs occur in some of the most exploited forests on Earth and are suffering from rapid habitat loss and degradation,” explained Benjamin Tapley, curator of reptiles and amphibians at the Zoological Society of London, of the importance of this research. “It helps us detect the presence of a species, especially as adult frogs can be seasonally active and difficult to find, and allows us to identify which places might be important frog breeding sites that need protection.”
Co-author and scientific officer of the ATP/IMC in Hanoi, Luan Nguyen, explained that “During my time searching for frogs along streams, I have often been impressed by the unusual behaviours of some tadpoles feeding at the water surface. This has driven me to find answers to the questions ‘What species are they?’ and ‘How and why do they choose to feed in such a way?’ The description of the morphological characteristics of these tadpoles is helping to answer these questions and has moved us one step closer to understanding their evolution and natural history.”
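The matching step described above, comparing tadpole DNA to reference samples from adult frogs, can be pictured with a toy example. The sequences and the mismatch-counting approach below are invented purely for illustration; real identification relies on curated DNA barcode databases and proper alignment and distance models.

```python
# Toy illustration of matching a tadpole's DNA barcode to adult reference sequences.
# Sequences and the simple mismatch count are invented for illustration only.

adult_references = {
    "Jingdong horned frog":   "ACGTTGCAAGTC",
    "Maoson horned frog":     "ACGTTGCTAGTC",
    "Hoang Lien horned frog": "ACCTTGCAAGTA",
}

def mismatches(a: str, b: str) -> int:
    """Count positions where two equal-length sequences differ."""
    return sum(x != y for x, y in zip(a, b))

tadpole_sequence = "ACGTTGCTAGTC"

best_match = min(adult_references, key=lambda sp: mismatches(adult_references[sp], tadpole_sequence))
print(f"Closest reference: {best_match}")  # -> Maoson horned frog (0 mismatches)
```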
Artificial Pancreas Could Control Type 1 Diabetes in Children
A clinical trial has found that a new artificial pancreas can manage type 1 diabetes in children aged 6 and up.
Results from a trial funded by the National Institute of Diabetes and Digestive and Kidney Diseases, part of the National Institutes of Health, and published in the New England Journal of Medicine suggest that a new artificial pancreas, which monitors and regulates blood glucose levels automatically, can help to manage type 1 diabetes in children as young as six years old. The artificial pancreas, also known as closed-loop control, is a comprehensive system for managing type 1 diabetes built around a continuous glucose monitor (CGM) and an automatic insulin pump. Instead of testing via fingerstick and regulating insulin through injections or a manual pump, this system makes the process much simpler. 101 children aged from six to 13 took part in the study and were assigned to two different groups. The control group continued to use a regular CGM and insulin pump, while the experimental group used the new system. The study ran for around four months, with check-ins every other week - the children taking part went about their daily lives as normal during this time. Results showed that the experimental group saw a 7% improvement in regulating blood glucose in the day, and a 26% improvement at night. Controlling blood glucose levels at night is a real concern for children with type 1 diabetes, and their caregivers, as hypoglycemia can be fatal during sleep.
The protocol chair, a professor of pediatrics at the Barbara Davis Center for Childhood Diabetes at the University of Colorado, Aurora, said that “the improvement in blood glucose control in this study was impressive, especially during the overnight hours, letting parents and caregivers sleep better at night knowing their kids are safer. Artificial pancreas technology can mean fewer times children and their families have to stop everything to take care of their diabetes. Instead, kids can focus on being kids.” A small number of minor adverse events - 16 in total - occurred during the study, mostly down to issues with the insulin pump. No serious events occurred, so the team are optimistic about the future with regards to regulating type 1 diabetes in children. It’s not the only promising news for those with the condition either. In an unrelated study, researchers at the University of Pennsylvania were able to transplant insulin-producing cells into both mice and monkeys. They developed a combination of molecules to imitate the typical pancreatic environment, preventing the death of beta cells. The animals receiving this mixture, called Islet Viability Matrix (IVM), had more stable blood glucose levels within one day and kept healthier levels for months.
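The closed-loop idea described above can be pictured as a simple feedback cycle: read the continuous glucose monitor, compare the reading against a target, adjust insulin delivery, and repeat. The sketch below is a deliberately simplified, hypothetical illustration of such a loop; the thresholds, gain and function names are invented and bear no relation to the algorithm actually used in the trial.

```python
# Highly simplified illustration of a closed-loop ("artificial pancreas") control cycle.
# All numbers and names are invented for illustration only; real devices use far more
# sophisticated, safety-critical algorithms.

TARGET_GLUCOSE = 120.0  # mg/dL, hypothetical set point
GAIN = 0.01             # hypothetical proportional gain (insulin units per mg/dL of error)

def basal_adjustment(cgm_reading: float) -> float:
    """Return a small correction to basal insulin based on the latest CGM reading."""
    if cgm_reading < 80.0:               # approaching hypoglycaemia: suspend delivery
        return 0.0
    error = cgm_reading - TARGET_GLUCOSE
    return max(0.0, GAIN * error)        # deliver extra insulin only when above target

for reading in [75, 110, 140, 190]:      # example CGM readings taken every few minutes
    print(reading, "->", round(basal_adjustment(reading), 2), "units")
```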
First Przewalski’s Horse Cloned by Scientists
40-year-old genetic material used to clone endangered horse.
Przewalski’s horses are the last surviving wild horses, but they are rare and endangered - their natural home is the Mongolian steppe, but today a number of them live in zoos as part of breeding programs. They were at one point completely extinct in the wild, but have gradually been reintroduced to the wild in the vast East Asian nation. Considered ‘distant cousins’ of our domestic horses, they are stockier in build, with shorter hair on the tail and longer hooves. They also have 66 chromosomes, compared to the 64 found in other species of horse. In August, however, scientists were able to clone a Przewalski’s horse for the first time, when Kurt was born in Texas. Remarkably, Kurt was cloned using 40-year-old genetic material; it was cryopreserved four decades ago after being taken from another horse in captivity, about ten years after the last confirmed sighting of a Przewalski’s horse in the wild. Researchers had a population bottleneck to contend with, as the 2,000 horses alive today are all descendants of a breeding program of just 12 horses, making population recovery more difficult due to the lower genetic diversity.
Hence, in 1980, scientists at San Diego Zoo’s Frozen Zoo took a sample from Kuporovic, a Przewalski’s horse who lived from 1975 to 1998, as his genome featured unique ancestry, making him particularly suitable. Kurt, born over 20 years after Kuporovic’s death, is genetically identical to Kuporovic, and he was named after Kurt Benirschke, the Frozen Zoo’s founder, who died in 2018 at the age of 94. San Diego Zoo, along with wildlife conservationists Revive & Restore and the cloning business ViaGen Equine, were able to produce an embryo using the genetic material taken from Kuporovic, which was then implanted into a surrogate horse; a trouble-free pregnancy resulted in Kurt’s birth. Bob Wiese, Chief Life Sciences Officer at San Diego Zoo Global, said Kurt is expected to be “one of the most genetically important individuals of his species”, with the team “hopeful that he will bring back genetic variation important for the future of the Przewalski’s horse population”. Executive director of Revive & Restore, Ryan Phelan, said in a statement that “This birth expands the opportunity for genetic rescue of endangered wild species. Advanced reproductive technologies, including cloning, can save species by allowing us to restore genetic diversity that would have otherwise been lost to time.” This isn’t the only species the conservation group are working on, with another six species also in their sights. Their list includes the woolly mammoth, while the International Islamic University Malaysia has plans to use biotechnology to revive the Sumatran rhino, which recently went extinct in Malaysia.
Electronic Skin Developed That Can Sense Touch and Pain
An artificial, electronic skin developed at RMIT University reacts to touch, pain and heat.
Just as our skin responds to external stimuli, scientists at RMIT University in Melbourne have developed an electronic, artificial skin that does the same. Feedback from sensors in the skin is key to monitoring the health of the nervous system, for example using pin pricks to study the severity of nerve damage, so advances in artificial skin will have widespread implications in healthcare - it may even be used to replace damaged skin in future. Professor Madhu Bhaskaran, a researcher in the Functional Materials and Microsystems Research Group and the Micro Nano Research Facility at RMIT University and co-author of the research, explained that “Our pain-sensing prototype is a significant advance towards next-generation biomedical technologies and intelligent robotics.” “No electronic technologies have been able to realistically mimic that very human feeling of pain — until now … Our artificial skin reacts instantly when pressure, heat or cold reach a painful threshold … It’s a critical step forward in the future development of the sophisticated feedback systems that we need to deliver truly smart prosthetics and intelligent robotics.” Three primary technologies were used in the research. The first was stretchable electronics, which combine biocompatible silicone with oxide materials to produce thin, transparent, wearable and strong electronics. The second was temperature-reactive coatings: self-modifying coatings around 1,000 times thinner than a human hair, based on a material which transforms as it responds to heat. The third and final technology was brain-mimicking
memory, which involves electronic memory cells imitating the way the human brain uses memory to store and recall information. The prototype combines stretchable electronics and memory cells in the pressure sensor, temperature-reactive coatings and memory cells in the heat sensor, and all three technologies in the pain sensor. First author Md. Ataur Rahman describes it as “responsible for triggering a response when the pressure, heat or pain reached a set threshold.” “We’ve essentially created the first electronic somatosensors — replicating the key features of the body’s complex system of neurons, neural pathways and receptors that drive our perception of sensory stimuli,” he explained of the research, which was published in the journal Advanced Intelligent Systems. “While some existing technologies have used electrical signals to mimic different levels of pain, these new devices can react to real mechanical pressure, temperature and pain, and deliver the right electronic response. It means our artificial skin knows the difference between gently touching a pin with your finger or accidentally stabbing yourself with it — a critical distinction that has never been achieved before electronically.” The artificial skin could in future be used as a less invasive alternative to skin grafts, as well as playing a part in developments in prosthetics and robotics. The research, carried out at RMIT’s Micro Nano Research Facility, was supported by the Australian Research Council.
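As a loose illustration of the threshold behaviour described above, where the device stays quiet until a stimulus crosses a ‘painful’ level, consider the following sketch. The threshold values and function are invented for illustration and are not taken from the RMIT devices.

```python
# Toy model of a threshold-triggered "somatosensor": no pain output until a stimulus
# crosses a set threshold. All thresholds are invented for illustration only.

PRESSURE_PAIN_THRESHOLD = 50.0  # arbitrary pressure units
HEAT_PAIN_THRESHOLD = 45.0      # degrees Celsius

def skin_response(pressure: float, temperature: float) -> str:
    """Return a qualitative response, distinguishing gentle touch from a painful stimulus."""
    if pressure >= PRESSURE_PAIN_THRESHOLD or temperature >= HEAT_PAIN_THRESHOLD:
        return "pain signal"
    if pressure > 0:
        return "touch signal"
    return "no response"

print(skin_response(pressure=5, temperature=25))   # gentle touch -> "touch signal"
print(skin_response(pressure=80, temperature=25))  # hard press   -> "pain signal"
print(skin_response(pressure=0, temperature=50))   # hot surface  -> "pain signal"
```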
Fungal Leather Could Replace Animal and Synthetic Material
Cheaper and better for the environment, fungal leather could substitute for other versions, say scientists.
According to a new review paper from researchers at the University of Vienna, Imperial College London and RMIT University in Australia, fungal leather has plenty of potential, and could replace other types of leather as it is both more eco-friendly and cheaper. Leather made from fungi is completely biodegradable, uses fewer dangerous chemicals and releases less carbon, while feeling much the same as leather from animals, making it a strong substitute. Traditional leather is intertwined with animal farming, and the livestock sector is responsible for around 14% of all human-caused greenhouse emissions, just under two-thirds of which come from cattle rearing. Meanwhile, leather tanning takes up a lot of energy and resources and produces a lot of sludge waste, increasing its carbon footprint. Hence, it has a higher negative impact on the environment than most animal products. Although synthetic leather is often seen as ‘better’ from an ethical standpoint, it is usually made with polymers derived from fossil fuels, and so isn’t biodegradable. The impact of synthetic plastics on the environment has been well documented, and this leather is no exception. Professor Alexander Bismarck of the University of Vienna and Imperial’s Department of Chemical Engineering, and co-author
of the study, said that “We tend to think of synthetic leather, sometimes known as ‘vegan leather’, as being better for the environment. However, traditional leather might be ethically questionable, and both leather and plastic substitutes have issues with environmental sustainability … fungi-derived leather brings none of these issues to the table, and therefore has considerable potential to be one of the best leather substitutes in terms of sustainability and cost.” Fungal leather was first mooted around five years ago, when the American companies MycoWorks and Ecovative Design patented technologies which utilise mycelium, the root-like structure of fungi that contains a polymer also found in crab shells. As these root-like structures grow together, they begin to resemble leather, and because only the mycelium is needed, not the mushrooms themselves, this ‘leather’ can be grown almost anywhere. The process is relatively quick too, taking about two weeks to go from a spore to the fungal leather itself. Even when it is modified and treated with various dyes and chemicals, it still uses far fewer resources than the production of other types of leather, so the future looks promising. Though the prototypes released so far have been expensive, as fungal leather becomes more widespread it is expected to become more accessible in turn, so we could see it replacing animal and synthetic leather in the coming years.
INCOMPATIBILITY CHALLENGES
One-pot multi-enzymatic processes require the selection of suitable enzymes. It is crucial to coordinate the optimal temperatures of all the enzymes, as well as to avoid the negative impact of temperature on product and cofactor stability. Further, to make sure the process reaches its full potential, cross-reactivity should be circumvented by on-demand activation/inactivation of the enzyme(s). Unfortunately, the development of efficient, perfectly orchestrated and regulated cell-free metabolic systems is still an unmet need.
IMMOBILIZATION TO MNPs
To exert functional control over different enzymes, HOTZYMES uses magnetic heating on a molecular scale. Enzymes are immobilised on magnetic nanoparticles (MNPs), which are exposed to an alternating magnetic field (AMF). This causes magnetic energy to be transferred as heat, creating high temperature gradients at the location of the enzymes with respect to the bulk. This should allow unprecedented temporal control over each enzyme’s activity while the global temperature of the reaction bulk is not increased.
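One way to see why a local temperature rise at the nanoparticle surface can switch an immobilised enzyme on without heating the whole vessel is the Arrhenius dependence of reaction rates on temperature. The sketch below is a generic, illustrative calculation rather than anything used by the HOTZYMES consortium; the activation energy and temperatures are arbitrary example values.

```python
# Illustrative Arrhenius estimate: how much faster an enzyme-catalysed reaction runs if the
# enzyme's local environment is heated from 25 C to 45 C while the bulk stays cool.
# The activation energy and temperatures are arbitrary example values, not project data.
import math

R = 8.314    # J/(mol*K), gas constant
Ea = 50_000  # J/mol, example activation energy

def relative_rate(t_celsius: float, t_ref_celsius: float = 25.0) -> float:
    """Reaction rate at t_celsius relative to the rate at the reference temperature (Arrhenius law)."""
    T, T_ref = t_celsius + 273.15, t_ref_celsius + 273.15
    return math.exp(-Ea / (R * T)) / math.exp(-Ea / (R * T_ref))

print(f"Bulk at 25 C:           x{relative_rate(25):.1f}")  # 1.0, unchanged
print(f"Enzyme surface at 45 C: x{relative_rate(45):.1f}")  # roughly 3-4x faster
```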
Catalysing change in the chemical industry?
The current methods of producing pharmaceuticals and biocommodities are relatively inefficient and energy-intensive, leading researchers to investigate alternatives. Researchers in the HOTZYMES project are investigating the potential of using magnetic nanoparticles to heat up enzymes and catalyse reactions more efficiently, as Dr. Valeria Grazú explains.
The majority of the reactions used in the chemical industry to produce pharmaceuticals and biocommodities are accelerated by inorganic catalysts. While these catalysts speed up reactions, helping to reduce the costs of the eventual chemical products, there are also some disadvantages to this approach, as Dr. Valeria Grazú explains. “Usually pharmaceuticals and biocommodities are produced via a cascade of chemical reactions that occur sequentially. Since inorganic catalysts are not specific, using them to catalyse these reactions often generates unwanted by-products. This is a drawback of the existing approaches to industrial biotransformation, as these by-products have to be taken out by means of complex downstream processes, which decreases the yield of the product of interest and increases production costs,” explains Dr. Grazú. “A further problem with inorganic catalysts is that they need very high temperatures in order to work effectively, which requires a high input of energy.”
A more sustainable alternative approach could involve using enzyme cascades in a single pot to catalyse these reactions more efficiently, a topic that Dr. Grazú and her colleagues in the HOTZYMES project are exploring. Biological catalysts work at lower temperatures and are much more selective than inorganic catalysts, yet there are challenges to overcome in terms of using them in industry. “There are not many examples of enzymatic cascades being used in industry, because it’s difficult to get biological catalysts, designed by nature, to work effectively in very specific non-natural environments,” outlines Dr. Grazú. In nature, enzymes work together and catalyse chemical reactions in cells, yet Dr. Grazú says it is difficult to use these catalysts effectively in industrial settings. “From an industrial point of view, there is a process that could be improved. So we want to catalyse certain reactions, using
the right enzymes for each step. But we may find that they are not all effective in the same conditions. Ideally we would like to combine the enzymes so that they work efficiently together in one pot,” she says. This goal is a central part of the HOTZYMES project’s overall agenda, with researchers working to develop more efficient enzymatic cascades. The idea here is that the enzymes will catalyse a cascade of chemical reactions in the same vessel, with magnetic nanoparticles used to generate the ideal temperatures for each specific reaction. “Magnetic nanoparticles are very small and are superparamagnetic, which means that they don’t have magnetic memory. So if they become magnetised they act like magnets, but when you remove the field, they lose their memory and re-disperse,” explains Dr. Grazú. These magnetic nanoparticles are able to generate heat when placed in an alternating magnetic field. “That
heat is very localised. Heat is generated very close to the surface of the magnetic nanoparticles,” continues Dr. Grazú. “The idea in the project is to try to regulate the temperature of these nanoparticles in order to achieve a selective control of enzymes needed to carry out biotransformations of interest.” The project brings together partners from both academia and the commercial sector in pursuit of this goal, with researchers working on several different areas of technology. This includes the synthesis of the magnetic nanoparticles and the production of bioreactors for biocatalysis, which are central to the prospects of this approach being used in industry in the longer term. “There are several challenges that we need to face in order to transform this scientific curiosity into a practicable technology,” says Dr. Grazú. Magnetic heating is not an entirely new concept and is already applied in a number of areas, for example in cancer therapy, but the demands of industry are very different. “One of our project partners in the consortium (nanoScale Biomagnetics) has a lot of experience in the development of devices for the magnetic heating of nanoparticles. Their focus in this project is not on therapeutic applications, but industrial applications,” outlines Dr. Grazú.
Industrial interest
A number of different processes of industrial interest have been selected for attention in the project, some of which lead to the production of active pharmaceutical ingredients. Some of these processes can’t be carried out in the traditional way, because they are inefficient or too expensive, for example, so Dr. Grazú says the project’s research could open up new possibilities. “With one pot enzyme cascades and concurrent reactions, there is the opportunity to produce new chemicals that could be of interest, which at the moment cannot be produced efficiently,” she explains. The goal at this stage is not to develop a market-ready product, however, but rather to explore the feasibility of the idea. “The project is quite exploratory at this stage, but it could provide the basis to go further in future, if we have good results,” says Dr. Grazú. “It’s like the first step with a risky idea that could generate a breakthrough if successful. The next step then would be to go to the market.” The project’s work could have a significant impact on the commercial sector in the long term, however, both helping companies produce existing chemicals more efficiently, and also opening up wider potential in production. While the project is still at a fairly early stage, Dr. Grazú says the results so far are encouraging. “We have just finished the first year of the project, but we have had very promising results so far,” she says. Over the remainder of the funding term, Dr. Grazú and her colleagues will be working to both improve the technology and also address fundamental scientific questions. “This is a high-risk project, and there are a lot of basic science questions that we need to answer in order to achieve a technological breakthrough,” she continues. “We are essentially pursuing research in two directions at the same time – from the basic point of view, and from the technological point of view. This is essential if we are to achieve a breakthrough.”
HOTZYMES
The Development of efficient Enzymatic Cascades in well-coordinated One-Pot-Systems
Project Objectives
HOTZYMES aims to enhance multi-enzymatic processes for the biotechnological production of pharmaceuticals and biocommodities. To enable optimal temperature conditions for each reaction in a multi-step-scheme, HOTZYMES couples enzymes to magnetic nanoparticles that are controllable at nanoscale locally using magnetic heating. Also, a new generation of magnetic bioreactors for biocatalysis will be designed.
Project Funding
The EU-project HOTZYMES is funded in the frame of the H2020-FETOPEN-2018-2020 call. It consists of a multidisciplinary consortium of 7 partners from four European countries, who implement the project with a budget of 3 million Euro.
Project Partners
• For details of project partners, please visit: https://www.hotzymes.eu/consortium/
Contact Details
Principal Researcher of ACIB
Martin Walpot, MA
Head of Public Relations and Marketing
ACIB - Austrian Centre of Industrial Biotechnology
ACIB GmbH, Krenngasse 37, 8010 Graz
T: +43 316 873 9312
E: martinwalpot@acib.at
W: www.hotzymes.eu
W: www.acib.at
Dr. Valeria Grazú Project Coordinator
Dr. Valeria Grazú is a senior researcher at CSIC, a position she has held since 2016. She has deep experience in the biofunctionalization of nanostructured materials and has worked with a wide variety of biomolecules (proteins, enzymes, DNA, sugars, peptides, etc), looking at their use in biotechnological and biomedical applications.
A deeper picture of meiotic recombination
The BRCA2 gene is known to suppress cancer and also plays a major role in meiotic recombination. We spoke to Professor Hiroki Shibuya about his work in investigating the fundamental mechanisms involved in meiotic recombination and the role of BRCA2 in repairing double strand breaks of DNA through homologous recombination.
Our DNA is composed of two strands of genetic material which coil around each other in the double helix identified by Crick and Watson in 1953. When one strand is broken the damage can be repaired relatively easily, as the other strand remains intact, but a double strand break (DSB) is more difficult to repair. “With a DSB, both helices of DNA break simultaneously. Homologous DNA is required to repair the break precisely - repair takes place through homologous recombination,” explains Hiroki Shibuya, an Assistant Professor at the University of Gothenburg. There are two forms of recombination, mitotic and meiotic; Professor Shibuya is investigating the latter in a research project part-funded by the European Research Council and the Swedish Research Council. “A DNA break is intentionally induced in meiosis by an activity of endonuclease. In meiotic recombination homologous chromosomes, with one from the mother and one from the father, are used as a template to repair the break,” he continues.
DNA break
The DNA break induced in meiosis is repaired using BRCA2, a cancer suppressor gene first identified in 1995 which has since attracted a lot of attention in research. This process is not just about repairing the DNA, but also about connecting the homologous chromosomes through homologous recombination into what is called a crossover structure, also known as a chiasma. “A crossover is a structure which connects homologous chromosomes together, and it’s essential for the generation of genetic diversity by mixing maternal and paternal chromosomes, and also for chromosome segregation in meiosis,” explains Professor Shibuya. Aberrations in this step lead to chromosome mis-segregation in meiosis, called aneuploidy, which can cause Down’s syndrome or certain birth defects, underlining the wider relevance of Professor Shibuya’s research. “My research is focused on investigating the basic mechanisms involved in meiotic recombination. We are sure that our research will provide fundamental knowledge, which will help us to understand the causes of genetic abnormalities caused by meiotic errors, and will be useful for diagnosis or even treatment in the long run,” he outlines. Indeed, this research holds wider relevance in terms of understanding the root causes of certain other health problems. For example, a specific mutation of BRCA2 is known to leave an individual more susceptible to breast cancer or ovarian cancer. “This is because BRCA2 helps to repair DNA breaks through homologous recombination in somatic cells. So, if BRCA2 is mutated, then accidental DNA breaks are not repaired, predisposing individuals to cancer development,” explains Professor Shibuya.
Researchers have identified the key proteins involved in regulating BRCA2 during meiotic recombination, specifically MEILB2 (Meiotic localizer of BRCA2) and BRME1 (BRCA2 and MEILB2-associating protein 1), and now Professor Shibuya and his colleagues are looking to probe deeper into the molecular mechanisms involved. A number of different techniques are being used in this research, including in vitro biochemistry, to build a deeper picture of how these proteins function. “We purify the protein, and then we analyse its activity and function in vitro,” says Professor Shibuya. Researchers are also carrying out in vivo analysis of protein function in mice. “We can produce genetically modified knockout mice using CRISPR-Cas9, which is a genome editing tool,” explains Professor Shibuya. “We can then analyse any defects by carefully observing the individual germ cells or even individual chromosomes in the knockout mice, which will tell us the function of each protein.”
Novel proteins
Meiotic chromosomes (green) decorated with the DSB repair proteins (red and blue).
By combining these different approaches, researchers have been able to gain new insights into both the process of meiosis and also meiotic recombination. One important finding arising from Professor Shibuya’s research is that BRCA2 forms a previously unknown ternary protein complex, BRCA2-MEILB2-BRME1, in meiosis. “We have identified these two novel proteins that function with BRCA2 in meiosis. We have looked at its function in germ cells by generating the gene knockout mice,” he outlines. Without MEILB2 or BRME1, the BRCA2 function is attenuated and meiotic DSBs are not repaired properly, leading to defects in recombination and crossover formation, which then cause male sterility. Certain links have also been identified between aberrant expression of those genes and cancer development, and Professor Shibuya plans to continue his research in this area in future. “We will look deeper into the mechanism of those proteins and how they are involved in meiotic recombination. How do they impair mitotic recombination, when aberrantly expressed in somatic cells? How do they contribute to cancer development?” he says. A significant degree of progress has already been made in this respect.
Q & A
Dr Hiroki Shibuya is a researcher in the Department of Chemistry and Molecular Biology at the University of Gothenburg. We spoke to him about what inspired him to pursue a career in science, his academic experiences, and his hopes for the future.
EU Researcher: When did you become interested in molecular biology? Was it at school?
Dr Hiroki Shibuya: Right from an early age I was always very interested in the natural world. I remember collecting fossils and observing insects, I was fascinated by the variety and beauty of the organisms that are around us. When I went to university I was keen to pursue these interests. Molecular biology is a highly innovative part of the biology field, it’s very exciting to dissect living things at DNA and protein levels, and look at the underlying molecular interactions.
EUR: Have you been able to form networks and relationships with other researchers in different parts of Europe?
HS: I’ve met other PIs at various international conferences and that is the main way I have built research networks. For instance conferences are organised through EMBO, which is a good way to interact and form research networks.
EUR: Is your university a good environment to do that? Do you have lots of opportunities to collaborate with researchers at other institutes?
HS: I’m from Japan originally, and I spent some time at Harvard before I moved to Sweden. The culture is of course a bit different, but the good points about the research environment in Sweden are that it’s very relaxed, and there is a lot of academic freedom for researchers. It’s easy to collaborate with researchers from neighbouring European countries, and that is an advantage.
EUR: Would you like to stay in academia in future and continue your research?
HS: Yes, my ultimate hope is to continue research in whatever direction I feel is interesting. I want to study things that I find interesting in nature, I’m motivated by curiosity.
MEIOTIC TELOMERE
Study of telomere function in germ cells, relevant to the regulation of homologous recombination and telomere length maintenance across generations
Project Objectives
Breast cancer susceptibility gene 2 (BRCA2) was identified in 1995 as a potent cancer suppressor gene and has been the subject of intensive research over the past 25 years. Recently, Hiroki’s group identified novel BRCA2 partner proteins, MEILB2 and BRME1, which regulate BRCA2 function in normal germ cells and act as potential oncogenes when aberrantly expressed in somatic cells.
Project Funding
European Research Council (ERC) Starting Grant: €1 500 000
Contact Details
Project Coordinator, Hiroki Shibuya
Assistant Professor, Department of Chemistry and Molecular Biology, University of Gothenburg, Medicinaregatan 9E, SE-41390, GOTHENBURG, Sweden
E: hiroki.shibuya@gu.se
W: https://shibuyahiroki.com/research/
W: https://cmb.gu.se/english/about_us/staff?languageId=100001&userId=xshibh
Professor Hiroki Shibuya
Dr. Shibuya fossil hunting in Gotland, Sweden, July 2020, and a Trilobite fossil he found.
Researchers have found that MEILB2 and BRME1, which normally function only in germ cells, are commonly upregulated in certain human cancers. Experimentally, researchers found that when MEILB2 or BRME1 was overexpressed in somatic culture cells, the function of mitotic homologous recombination was disturbed. “A number of sporadic cancers show phenotypes similar to those seen in familial BRCA2-mutated cancers, even without a BRCA2 mutation. In such sporadic cancer cases, the overexpression of meiotic BRCA2 partner proteins can be a driving force for cancer development. In future this could be interesting as a route towards helping to diagnose these cancers at an earlier stage, or potentially as a way of identifying a new target of cancer therapy,” says Professor Shibuya. There are also several other avenues of research that Professor Shibuya plans to explore in future. “In order to faithfully transmit paternal and maternal chromosomes to the next generation, plenty of unique and
sophisticated events happen specifically in meiosis, such as reorganization of telomere-binding proteins, formation of chromosome axis structure, synapsis/recombination of homologous chromosomes, and then reductional segregation of homologous chromosomes. The molecular mechanisms underlying these processes are not fully understood, especially in mammalian model systems. It is very exciting to try and clear up these mysteries by discovering the key regulatory genes,” he continues. Researchers can gain a more complete picture of the molecular function of BRCA2, MEILB2 and BRME1 through locally constructing the steps involved, yet there is no clear path towards identifying new regulators of mammalian meiosis. This depends not just on scientific expertise, but also to a degree on serendipity. “It’s a bit of a fishing expedition,” acknowledges Professor Shibuya. This will form an important part of Professor Shibuya’s agenda in future.
Hiroki Shibuya is an Assistant Professor in the Department of Chemistry and Molecular Biology, University of Gothenburg, Sweden. He obtained his PhD at the University of Tokyo, Japan, in 2014, and then worked at Harvard Medical School, USA, as a Human Frontier Science Program Long-Term Fellowship postdoc.
Pointing the way to novel cancer immunotherapies Many lung tumours don’t currently respond to immunotherapy, so researchers are investigating new approaches to stimulate the immune system and combat the disease more effectively. Researchers in the NEXT GEN IO project aim to exploit the hypoxia response in T-cells and harness its wider therapeutic potential, as Dr Asis Palazón explains.
NEXT GEN IO project

CD8+ T-cells are an important part of the human immune system, with the ability to directly kill malignant cancer cells, while they also differentiate into memory cells that can offer long-term protection. However, T-cell infiltration is usually lower in hypoxic areas of a tumour, which represents a significant problem in terms of the effectiveness of the immune response. “The T-cells must infiltrate a tumour in order to kill the target cells,” explains Dr Asis Palazón. “On the other hand, when T-cells sense that there is no oxygen, this triggers an activation mechanism. So essentially T-cells are able to sense that tissues are not completely healthy when there is low oxygen availability. We are trying to boost this signalling pathway through therapeutic intervention.”
The hypoxia pathway forms the central focus of research in the NEXT GEN IO project, an ERC-backed initiative formed with the goal of developing new drugs. Sophisticated techniques are being applied in this research, including flow cytometry, which helps researchers gain deeper insights into specific cell populations. “The ERC funded the purchase of a very expensive flow cytometer, which we are using in our experiments. This equipment has really helped in the project,” says Dr Palazón, the project’s Principal Investigator. The focus in the project is primarily on lung cancer, which Dr Palazón says has some interesting features. “Solid tumours, and metastasis in
the lung, are very hypoxic, while outside, the lung is a very well-oxygenated organ,” he explains. This offers a contrast in oxygen availability between malignant and healthy tissue, from which researchers can look to gain deeper insights into the hypoxia pathway, which can then be applied in drug development. While the project’s aims centre around translational immunotherapy, Dr Palazón says this is built on a deep understanding of the mechanisms behind hypoxia. “The hypoxia pathway is covered by a type of transcription factor called hypoxia inducible factors, HIFs,” he outlines. “These HIFs can trigger the expression of different
genes that promote adaptation to this low-oxygenation environment. So, when HIF is triggered, the metabolism of the cells changes and angiogenesis is initiated. This vascularisation supplies oxygen and nutrients to solid tumours.” The researchers who discovered the hypoxia pathway won the Nobel Prize in Physiology or Medicine in 2019; now Dr Palazón aims to build on existing research foundations by developing new drugs to boost the immune response against lung cancer. A variety of different proteins are involved in oxygen sensing, some of them enzymes that can be targeted with drugs, which is a major area of interest in Dr Palazón’s lab. “One important oxygen-sensing enzyme is called factor-inhibiting HIF, which is the main focus of our project. This is an enzyme that acts as an oxygen sensor, and when oxygen is available, it inhibits the function of HIF,” he explains. “This mechanism exists in all our cells, but we are mostly focused on the hypoxia response in immune cells.” A technique called cell therapy is being used in the project, in which T-cells are taken from the patient, genetically manipulated, and then re-infused. This technique is also known as CAR T-cell therapy, and ensures that only the relevant
cells are targeted. “We can apply these drugs, or our manipulation of the hypoxia response, in the ex-vivo phase of the cell therapy approach. So that means that we will only modify the T-cells,” says Dr Palazón. The more advanced the metastasis, the more hypoxic the tumour, so Dr Palazón believes this approach would be most effective against advanced stages of solid tumours and metastasis. “This is mostly because the bigger the tumour, the more hypoxia you can find,” he explains. The long-term effectiveness of this approach is an important consideration in terms of future therapeutic potential. While the first step is for the T-cells to rapidly kill the tumour cells, some of these T-cells must then also persist in the same function. “The effect of hypoxia on this is a huge field of research,” says Dr Palazón. The main priority for Dr Palazón at this stage, however, is to discover targets in T-cells, and he says significant progress has been made over the course of the project. “It’s important to say that we have already done the initial screening, to get these inhibitors of our targets,” he outlines. “Now we have hits against the target, and we are starting to do in vivo experiments in order to demonstrate that the drugs we have are effective.” A safe toxicity profile is also essential if these drugs are to be used in treatment, which will be assessed in the project. In general, immunotherapies have less dramatic side-effects than treatments like chemotherapy, but the toxicity of the compounds developed has not yet been fully established. “We have some hints, based on transgenic mouse models, that they are safe, but we still have more to do in this respect,” says Dr Palazón. The aim in the project is to essentially de-risk these new drug candidates, so that they can then go forward to clinical trials, and researchers are collaborating with a pharmaceutical company in this work. “We essentially pay them to do some parts of the work, mostly focused on medicinal chemistry and drug screening,” continues Dr Palazón.
NEXTGEN IO Exploiting the hypoxia response in T cells for Next-Generation Immuno-Oncology
Project Objectives
The lab has a core focus on immuno-oncology, specifically on target discovery and drug development, to exploit several opportunities that the hypoxia pathway in T cells offers for the treatment of cancer. We employ a multi-disciplinary strategy to deliver several early-stage drug discovery outputs. Our main objective is the development of a novel small molecule inhibitor to modulate the hypoxic response in T cells. We also have a strong interest in therapeutic target discovery in T cells, and novel cell therapy approaches for the treatment of hypoxic solid tumors.
Project Funding
The NextGen IO project is funded by a European Research Council ERC Starting Grant.
Project Partners
• Domainex, United Kingdom domainex.co.uk
Contact Details
Asís Palazón Principal Investigator Cancer Immunology and Immunotherapy Lab Bizkaia Science and Technology Park, building 801A, Derio (Bizkaia) T: +4466 / 946 572 536 E: apalazon@cicbiogune.es W: https://www.cicbiogune.es/people/apalazon Asís Palazón
Hypoxia inducible factors can trigger the expression of different genes that promote adaptation to this low-oxygenation environment. So, when HIF is triggered, the metabolism of the cells changes and angiogenesis is initiated.
Drug development
This work is part of the early stages of the drug development process, and there are many stages ahead before these drugs can be applied in treatment. However, Dr Palazón is looking towards the potential commercialisation of this research in future. “We aim to finish the efficacy studies at some point within the next year. Then, when we have that proof-of-concept, it will be very important for us to file for a patent, covering the chemical structures and the effects and applications of these novel compounds,” he outlines. “The patent is very important, in order to enable the future licensing of these compounds into a pharmaceutical company, or maybe a spinoff company from our institution which could be established in future.”
Asís Palazón is a pharmacist and biochemist (University of Navarre). After his PhD thesis in Cancer Immunology and Immunotherapy (2012), he joined Prof. Randall Johnson’s lab at the University of Cambridge with the aim of studying the role of hypoxia on immune responses in cancer. He then worked in the pharmaceutical industry (MedImmune, Cambridge, UK). In early 2019, Asís joined CIC bioGUNE as a Principal Investigator, where he leads the Cancer Immunology and Immunotherapy lab supported by an ERC Starting grant (2018) and Ikerbasque.
Probing the structure of proteins with low sequence complexity The same amino acids may be repeated multiple times within a certain region of a protein, and it’s difficult to characterise these regions with traditional structural biology techniques. We spoke to Dr Pau Bernadó about his work in developing new strategies which will help researchers analyse the structure and dynamics of low complexity regions in proteins. A protein is
commonly thought of as a vibrantly coloured, rigid, 3-dimensional structure, yet not all proteins share these characteristics, and some in fact are not structured. A second common preconception, namely that the 20 amino acids that constitute proteins are evenly distributed amongst sequences, is also incorrect, as Dr Pau Bernadó explains. “There are sequences in which 1, 2, 3 or 4 amino acids are repeated within the same sequence; it’s what we call a low complexity sequence,” he outlines. As a chemist and structural biologist, Dr Bernadó has spent a lot of time analysing disordered proteins, those which don’t have a 3-dimensional structure. “This means that you cannot apply the classical, traditional methods to investigate these systems,” he continues. “You have to adapt your methods to reflect the fact that the protein is not rigid and that the amino acid sequence is highly repetitive.”
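For readers who want a feel for what “low complexity” means in practice, one simple, generic way to flag candidate regions is to slide a window along a sequence and score the compositional (Shannon) entropy of each window: a stretch dominated by a single residue scores near zero, while a mixed stretch scores much higher. This is an illustrative sketch only, not the methodology used in chemREPEAT, and the window size and threshold below are arbitrary.

```python
# Illustrative low-complexity scan (not the chemREPEAT method): flag windows
# whose amino-acid composition has low Shannon entropy.
from collections import Counter
from math import log2

def window_entropy(seq: str) -> float:
    """Shannon entropy (bits) of the amino-acid composition of a sequence window."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def low_complexity_windows(seq: str, window: int = 20, threshold: float = 2.0):
    """Yield (start, entropy) for windows whose compositional entropy falls below threshold."""
    for i in range(len(seq) - window + 1):
        h = window_entropy(seq[i:i + window])
        if h < threshold:
            yield i, h

# Toy example: a glutamine homo-repeat embedded in an otherwise mixed sequence.
toy = "MKTAYIAKQR" + "Q" * 20 + "LDSTWKAVEG"
for start, h in low_complexity_windows(toy):
    print(f"window at {start}: entropy = {h:.2f} bits")
```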
Low-complexity proteins
This is a topic central to Dr Bernadó’s work as the Principal Investigator of the chemREPEAT project, an initiative based at France’s National Institute of Health and Medical Research (INSERM) in which researchers are investigating the structure and dynamics of low complexity regions (LCRs) in proteins. These LCRs are normally parts of proteins that have no 3-dimensional structure, with an amino acid composition that differs from a globular, rigid protein. “If you look at the distribution of amino acids in a globular protein, you will see that it’s more or less standard, it reflects the proteome in general,” says Dr Bernadó. An LCR, by contrast, has a very different amino acid composition. “You may have a glutamine or a glycine which is repeated 20 times consecutively,” explains Dr Bernadó. “This is, essentially, the lowest level of complexity that we can find.” These proteins lacking a rigid structure are very difficult to characterise, as many conformations
co-exist at the same time, so researchers typically measure average properties of all those conformations. Methods have been developed to characterise disordered proteins, which are not LCRs. “If you have a disordered protein that has a normal (unbiased) sequence, it can be disordered and not be low-complexity,” says Dr Bernadó. A technique called Nuclear Magnetic Resonance (NMR) spectroscopy can then be applied here, together with other methods and tools, to investigate the shape and peculiarities of specific proteins. “For example, glycine-67 and glycine-85 have different isolated peaks in the NMR spectrum,” explains Dr Bernadó. “The fact that you can identify these peaks as glycine-67 and glycine-85 is because their chemical environments are different.” This could mean for instance that there is an alanine before the glycine-67, while there might be a proline before the glycine-85. The presence of these different amino acids induces changes that enable researchers to distinguish between glycine-67 and glycine-85. “They are
different because their chemical environment is different, because the sequence is different,” says Dr Bernadó. A lot of Dr Bernadó’s attention is devoted to analysing a specific type of LCR called homo-repeats, in particular in Huntingtin (Htt), a protein which is associated with Huntington’s disease. “There are homo-repeats of all kinds of amino acids. One of the most common homo-repeats in eukaryotes is polyglutamine,” he outlines. “In cases where you have 30 consecutive glutamines, it’s extremely difficult to distinguish between glutamine-15 and glutamine-25 for example.” The project aims to help overcome these limitations by essentially incorporating labelled amino acids within these homorepeats, from which more can then be learnt about their structure and dynamics. Normally, when a protein is produced for NMR analysis, isotopically labelled nitrogen-15 (15N) and carbon-13 (13C) are given to E. coli bacteria. “The bacteria eats this carbon and nitrogen and produces proteins that are fully labelled in 15N and 13C, which are NMR sensitive,” explains Dr Bernadó. This approach is not effective with Htt however, as all the glutamines would be labelled in the same way and so difficult to distinguish, so Dr Bernadó is developing a method called site-specific isotopic labelling (SSIL). “With SSIL, we essentially trick the system. We want to put the carbon and nitrogen isotopes in the places that we want,” he says.
tRNA suppression
A technique called tRNA suppression plays an important role in this respect. Three consecutive bases of mRNA, referred to as a codon, code for an amino acid. The ribosome, while reading the mRNA, appends these coded amino acids to synthesize the protein. “There are specific codons for each amino acid, so the ribosome knows that if it encounters CAG in the mRNA, then it corresponds to a glutamine for instance. A different combination of three bases corresponds to a serine,” outlines Dr Bernadó. “The link between the mRNA and the amino acids is made by a small molecule, called tRNA. The tRNA comes with an amino acid attached on one side, and a sequence that can recognise the codon on the other. In a way, the ribosome simultaneously binds the mRNA and the appropriate tRNA to keep building the protein with the right sequence. Certain sequences of three bases do not code for any amino acid however; these sequences are called stop codons. When the ribosome reaches and recognises a stop codon, it simply stops building and the protein is delivered.”
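The read-through logic described above can be sketched in a few lines. The toy model below is not the project’s cell-free protocol: it uses a deliberately tiny codon table, and the symbol Q* is an invented stand-in for the isotopically labelled glutamine delivered by the suppressor tRNA. It simply shows how reassigning one stop codon places a marker at exactly one chosen position.

```python
# Toy illustration of tRNA suppression: a stop codon normally terminates
# translation, but a suppressor tRNA loaded with a labelled amino acid lets the
# ribosome read through that single position.
CODON_TABLE = {  # tiny subset of the standard genetic code
    "AUG": "M", "CAG": "Q", "GGU": "G", "UCU": "S",
    "UAG": None,  # amber stop codon: normally terminates translation
}

def translate(mrna, suppressor=None):
    """Translate codon by codon; a suppressor maps a stop codon to an amino acid."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        codon = mrna[i:i + 3]
        aa = CODON_TABLE.get(codon)
        if aa is None:  # stop codon reached
            if suppressor and codon in suppressor:
                aa = suppressor[codon]  # read-through: insert the labelled residue
            else:
                break
        protein.append(aa)
    return "".join(protein)

mrna = "AUG" + "CAG" * 3 + "UAG" + "CAG" * 3 + "UCU"
print(translate(mrna))                 # stops at the amber codon: MQQQ
print(translate(mrna, {"UAG": "Q*"}))  # Q* marks the labelled glutamine: MQQQQ*QQQS
```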
NMR investigation of Htt with 16 consecutive glutamines. (top) overlay of the 13C and 15N-NMR spectra of all SSIL samples produced. Different colors indicate samples with individual labeled glutamines. This strategy enables the unambiguous assignment of the spectra. (middle and bottom panels) The analysis of the peak positions demonstrates the presence of several helical conformations of different length co-existing in solution.
The trick here involves externally synthesising, in vitro, a tRNA that can recognise these stop codons and with an amino acid attached. “We make use of a stop codon to synthesise the protein. It’s a tool to introduce whatever we want, wherever we want,” explains Dr Bernadó. This tRNA suppression method effectively represents a means of hacking the system, says Dr Bernadó. “This methodology allows
us to bring new chemistry into proteins. Some people introduce amino acids that are not natural, which are called non-canonical amino acids. The novel aspect of our research is that we introduce natural amino acids that are isotopically labelled, which doesn’t count as a chemical modification,” he explains. This approach allows researchers to highlight specific locations within a
Amino acid context of polyQ regions. a) Leucine and b) proline abundance around human polyQ regions. Dashed red lines refer to the background composition of the amino acid. These results suggest that the features found in Htt are very common in human glutamine-rich proteins.
chemREPEAT Structure and Dynamics of Low-Complexity Regions in Proteins: The Huntington Case
Project Objectives
The same amino-acid may be repeated multiple times in certain regions of proteins, which are known as low-complexity regions (LCRs). One particular sub-family of the LCRs is homorepeats, and a number of different pathologies are associated with unusually long repetitions of the same amino-acid within a protein. These LCRs are difficult to characterise, an issue that researchers in the chemREPEAT project are working to address. The aim in the project is to develop new strategies that will help researchers gain new insights into the structure and dynamics of homorepeats at the atomic level, which could eventually lead to the development of new therapies to treat diseases associated with LCRs.
Project Funding
Funded by an ERC Consolidator Grant. Total funding: € 1 999 844
Project Partners
• Juan Cortés (LAAS-Toulouse) • Miguel A. Andrade (JGU-Mainz)
Contact Details
Project Coordinator, Pau Bernadó Centre de Biochimie Structurale. INSERM, CNRS, Université de Montpellier 29, rue de Navacelles 34090-Montpellier (France) T: +33 4 67 41 77 15 E: pau.bernado@cbs.cnrs.fr W: http://www.cbs.cnrs.fr/index.php/en/ research-equipea2 Dr Pau Bernadó
Dr Pau Bernadó is an INSERM researcher and leader of the “Highly Flexible Proteins” group at the Centre de Biochimie Structurale (CBS) in Montpellier. His research is focused on the structure and dynamics of biomolecules and macromolecular complexes, in which he combines Nuclear Magnetic Resonance, small-angle X-ray scattering and computational methods.
Amino acid sequence of a sub-pathological version of Htt with 16 consecutive glutamines with a Green Fluorescent Protein (GFP) fused to Htt. The length of the poly-Q tract increases above 35 for pathological versions. (b and c) Schematic description of the strategy used for the site-specific isotopic labelling applied for the NMR investigation of Htt. This includes the in vitro loading of the tRNA and the cell-free reaction which included the transcription and translation of the protein and the tRNA suppression strategy.
protein, which opens up the possibility of gaining deeper insights into its structure and dynamics. The strategies developed in the project will be used to investigate the molecular basis of Huntington’s disease, a largely inherited neurodegenerative condition. “People fall ill with Huntington’s disease depending on the number of glutamines in Htt. What happens is that the enzyme that replicates DNA messes up when replicating genes containing several consecutive CAGs. Instead of copying the exact number of glutamines coded in the gene, it adds more CAG codons than in the original, in a process called DNA slippage. Then, when translated by the
ribosome, this produces a protein with more glutamines,” outlines Dr Bernadó. “This is not an immediate problem if the number of amino acids increases from say 23 to 25, but subsequent generations are more likely to experience problems as the number of repetitions increases.”

There are homo-repeats of all kinds of amino acids. One of the most common homo-repeats in eukaryotes is polyglutamine. In cases where you have 30 consecutive glutamines, it’s extremely difficult to distinguish between glutamine-15 and glutamine-25.

An individual with more than 35 consecutive glutamines will develop Huntington’s disease at some point, and the disease is likely to affect subsequent generations at an earlier stage of their lives as the number of repetitions increases,
in a process called ‘anticipation’. By using the strategies developed in the project to analyse Htt, Dr Bernadó hopes to shed new light on some important questions around Huntington’s disease. “Why does the protein become nasty when there are more than 35 glutamines in a homo-repeat? What is the difference between 34 and 37?” he asks. Researchers are addressing these questions by studying both pathological and non-pathological versions of the protein. “We now have the tools, and we are trying to address the difference between the pathological and non-pathological versions of the Htt protein at the atomic level,” continues Dr Bernadó.
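The repeat-counting logic behind these questions is easy to sketch. The toy script below is purely illustrative: it uses the roughly 35-glutamine threshold quoted in the article, but the five-repeat expansion per generation is an invented number used only to mimic the anticipation effect, not a measured rate.

```python
# Toy illustration of CAG-repeat expansion ("anticipation"): count the longest
# run of CAG codons and classify it against the ~35-glutamine threshold.
import re

def polyq_length(dna: str) -> int:
    """Length of the longest uninterrupted run of CAG codons in a coding sequence."""
    runs = re.findall(r"(?:CAG)+", dna)
    return max((len(r) // 3 for r in runs), default=0)

def classify(n_glutamines: int) -> str:
    return "pathological (>35 consecutive glutamines)" if n_glutamines > 35 else "sub-pathological"

gene = "ATG" + "CAG" * 23 + "CCGCCG"  # 23 CAG codons, as in a typical healthy allele
for generation in range(4):
    n = polyq_length(gene)
    print(f"generation {generation}: {n} glutamines -> {classify(n)}")
    gene = gene.replace("CAG" * n, "CAG" * (n + 5), 1)  # pretend DNA slippage adds 5 repeats
```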
A number of other diseases are linked to the expansion of the glutamine homorepeat, reinforcing the wider relevance of the project’s research, while Dr Bernadó is also looking at other LCRs within proteins. One important area of research involves looking at an amino acid called proline. “Just after the poly-glutamine in Htt there is also a poly-proline region. Although its length does not change, it seems to protect Htt from aggregating,” says Dr Bernadó. “We have been able to look at proline in the same way that we have looked at glutamine.”
Taking a page from the playbook of viruses Small interfering RNAs hold rich therapeutic potential as a way of silencing specific genes, yet they are extremely unstable in blood and inefficiently transported through the body, so it’s difficult to deliver them to the locations where they are required. Dr. Alyssa Hill is drawing inspiration from nature by engineering ‘smart’ viral RNA structures into delivery vehicles, which could open up new possibilities in treatment. A class of
short, double-stranded RNA molecules known as small interfering RNAs (siRNAs) hold rich therapeutic potential, and there is considerable interest in using them to treat a variety of conditions. These siRNAs co-opt the RNA interference (RNAi) pathway, which is a naturally-occurring mechanism of gene silencing in human cells. “Nobel Prize-winning research in the late 1990s showed that we can hijack the RNAi pathway with siRNAs and silence the expression of disease-causing genes. These might be cancer-promoting genes, or genes that encode for mutated proteins,” says Dr. Alyssa Hill. However, siRNAs are degraded rapidly by nucleases, which are defense enzymes in the blood. They also do not distribute widely through the body or passively enter cells, which makes it difficult to use siRNAs as drugs. Currently, only two siRNA therapies are approved for use in patients. “Even though they’re potent, getting siRNAs to the places where they’re needed – intact – is incredibly difficult,” explains Dr. Hill.
We’re using the viral RNA motif as a blueprint, changing the parts of it that we can while at the same time maintaining its overall fold.

‘Smart’ RNA structures
This issue is at the heart of Dr. Hill’s work as the Principal Investigator of a Swiss National Science Foundation (SNSF)-funded project in which the aim is to engineer ‘smart’ viral RNA structures for siRNA delivery. Over millennia, flaviviruses (e.g., Yellow fever virus, Zika virus) and other viruses have evolved RNA structures with clever ways of evading decay, which Dr. Hill and her project team now plan to use in siRNAs. “We aim to repurpose them, to export that stability into siRNAs,” she outlines. The aim here is to engineer a harmless but ultra-stable RNA structure produced by flaviviruses to accommodate an siRNA, focusing on a consensus motif used by the virus. “We’re using the viral RNA motif as a blueprint, changing the parts of it that we can while at the same time maintaining its overall fold,” continues Dr. Hill. “We’re also incorporating an aptamer, which is a device to drive uptake into specific cell types.” The goal is to develop an all-in-one platform that is stable in the blood, while also able to move itself into specific cells. Changes in the structure of an siRNA may lead to differences in the way it interacts with the RNAi machinery, however, which is an important consideration in the project. “Are we compromising the activity of the siRNA by trying to fuse it with another structure?” asks Dr. Hill. There are often tradeoffs between the stability of a molecule and its potency as a drug, an issue Dr. Hill is investigating. “We’ve looked at that by introducing unmodified siRNAs into cells grown in the lab and measuring their activity on a reporter gene,” she explains. “We also have introduced the engineered
molecules into cells and monitored their activity on the same reporter gene.” Evidence suggests that the engineered molecules are just as potent as unmodified siRNAs, and now Dr. Hill is looking further ahead. Once the stability and potency of the molecules has been established, the next step is to assess whether they can be targeted to disease-related cell types. “We’re considering a cell model of prostate cancer, which is a prevalent disease that has unmet clinical needs,” says Dr. Hill. The project itself is only a year in duration, yet Dr. Hill hopes it could provide a
springboard for further research. “It’s a very ambitious idea, and of course we want it to progress. This is a challenging field, but it would be exciting if our approach proves effective and we can move the idea forward.”
Engineering ‘smart’ viral RNA structures for stable and targeted siRNA delivery
Project Funding
SNSF grant number 190865
Contact Details
Alyssa C. Hill, Ph.D. ETH Zürich Institute of Pharmaceutical Sciences Vladimir-Prelog-Weg 1-5/10 8093 Zürich, Switzerland T: +41 44 633 74 15 E: alyssa.hill@pharma.ethz.ch @alyssa_hill Alyssa C. Hill, Ph.D.
Dr. Hill is a postdoctoral research associate in the Institute of Pharmaceutical Sciences at ETH Zürich. Her research has been supported by awards from the National Science Foundation (Alexandria, Virginia, U.S.), the Novartis Research Foundation (Basel, Switzerland), and the Swiss National Science Foundation (Bern, Switzerland).
Mathematical models of memory-based decisions Memory exerts an important influence on decision-making, as we use our recollections of past experiences and events to inform the choices we make. We spoke to Dr Sebastian Gluth about his group’s work on both investigating the cognitive and neural processes involved in decision-making, and developing mathematical models of how those decisions are reached. A clear distinction is drawn in decision sciences between value-based and perceptual decisions. With perceptual decisions, there is an objective criterion of what is right or wrong, so a traffic light could be red, amber or green for example, and a driver decides on whether to move accordingly. “If you think it’s red but it’s actually green, then you will make an error,” says Dr Sebastian Gluth, Head of the Center for Decision Neuroscience at the University of Basel. With a value-based decision, however, there is no single, objectively correct thing to do. “Some people like Coke, others like Pepsi for example,” continues Dr Gluth. “We can only infer what a person wants by observing that person’s behaviour. And based on that we might then be able to infer what is good for a person or not.”
Value-based decisions
As the Principal Investigator of a project backed by the Swiss National Science Foundation, Dr Gluth is investigating the basis on which people make value-based decisions, bringing together several strands of research. One aim in the project is to identify the cognitive and neural processes involved in making these decisions. “How do they take place? How can we set up mathematical models to explain why a person chooses a particular option, or predict the probability that they will choose that?” outlines Dr Gluth. Most of the earlier decision-making models focused on what the person was going to choose, but Dr Gluth and his team are now adding another dimension. “We are developing models that also make predictions about how fast we make decisions, and how confident we are about our decisions,” he explains. Evidence accumulation models are an important tool in this respect, enabling researchers to make predictions on how quickly an individual will make value-based decisions, such as what snack they might want to choose from a vending machine or what shirt to wear on a night out. The idea behind this is that individuals retrieve memories about the available options and thereby accumulate evidence for choosing one of them over time. “It develops over time, until you reach a certain
Picture of the Decision Neuroscience team during the annual meeting of the Society for Neuroeconomics in Philadelphia in 2018.
point of confidence in what you want. That’s when you make your decision,” says Dr Gluth. Analysis of electroencephalography (EEG) signals from the human brain and single-unit activity in the monkey brain for example suggests that once a certain threshold of accumulated neural activity is reached then a decision is made. “The choice is always made at the point when this EEG signal, or single-unit recording signal, reaches a specific level,” explains Dr Gluth. There are however certain so-called biases in our memory which are also an important consideration in the project. It might be
that an individual has a particularly strong recollection of one option for example, while their memory of a second option is less vivid, which Dr Gluth says will affect their decision-making. “Even if the value of the different options is pretty similar, then they will still have a clear preference towards the option that they remember more vividly. So the strength of memory drives your preferences,” he outlines. Researchers are probing how individuals make these decisions, using familiar stimuli. “We ask people how much they like different options and to rate them on a scale from 1-10, maybe they rate option A as 5 and B as 5. Then they are asked to choose between them,” says Dr Gluth. The study participants have to make the decision based on their memories of the different options, rather than visual evidence. While it might be expected that there would be a fairly equal spread, as they had previously given the same rating to the two options, Dr Gluth says that a fairly clear pattern emerges. “We see that they pick A more often. The reason is that they can remember A better than B,” he explains. This effect might be even more pronounced in elderly people, another area of interest to Dr Gluth and his colleagues in the project. “We know that elderly people experience a decline in their episodic memory, so this memory bias might be even stronger in elderly people compared to younger people,” he says.
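The accumulation-to-threshold idea described above can be illustrated with a toy simulation. The sketch below is not the group’s actual model; it is a minimal two-accumulator race in which option A is given a slightly higher drift rate to stand in for a more vivid memory, and all parameter values (drift rates, threshold, noise, time step) are invented for illustration.

```python
# Minimal two-accumulator "race" model: evidence for each option grows noisily
# over time, and the first accumulator to reach the threshold determines both
# the choice and the response time.
import random

def simulate_choice(drift_a=0.12, drift_b=0.10, threshold=1.0, noise=0.1, dt=0.01):
    """Return (choice, response_time) from a two-accumulator race to threshold."""
    evidence_a = evidence_b = 0.0
    t = 0.0
    while True:
        t += dt
        evidence_a += drift_a * dt + random.gauss(0.0, noise) * dt ** 0.5
        evidence_b += drift_b * dt + random.gauss(0.0, noise) * dt ** 0.5
        if evidence_a >= threshold:
            return "A", t
        if evidence_b >= threshold:
            return "B", t

trials = [simulate_choice() for _ in range(2000)]
p_a = sum(1 for choice, _ in trials if choice == "A") / len(trials)
mean_rt = sum(rt for _, rt in trials) / len(trials)
print(f"P(choose A) = {p_a:.2f}, mean decision time = {mean_rt:.2f} s")
```

Even with equal ratings, the option with the stronger (faster-accumulating) memory trace is chosen more often, mirroring the pattern described in the experiments.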
The left panel shows a heat map of eye movements when participants made decisions between two options shown within the green and red squares. Warm colors indicate more fixations. The goal of using eye tracking is to investigate how people distribute their attention while making decisions from memory. The right panel shows time-frequency plots of EEG data when participants made decisions from memory. Warm colors indicate increased spectral power. The decrease in spectral power at lower frequencies (~20 Hz) before the response reflects the preparation of movement. The goal of using EEG is to understand how memory and decision-making processes emerge and interact over time.
A shelf of tasty food snacks in the behavioral lab. Snacks are used as stimulus material to motivate participants to make decisions in line with their personal preferences.
THE INFLUENCE OF EPISODIC MEMORY ON VALUE-BASED DECISION MAKING
Project Objectives
Most of our everyday decisions require us to retrieve important information from our memories. The goal of the SNF project “The influence of episodic memory on value-based decision making” is therefore to uncover the psychological, computational and neural principles of how our memories shape our preferences and decisions. This research question is addressed by combining mathematical modeling of choice behavior with modern neuroimaging tools.
Mathematical models

How do value-based decisions take place? How can we set up mathematical models to explain why a person chooses a particular option, or predict the probability that they will choose that?

This research is part of the wider goal of developing a deeper understanding of the interplay between memory and decision-making. Alongside conducting experiments, Dr Gluth and his colleagues are also developing a mathematical model of how memory influences decision-making processes, building on previous research. “A lot has been written about these evidence accumulation models, which are mathematical models of decision making. There’s also a lot of literature on how memory should work, and mathematical models of memory. We try to combine these two aspects,” he says. One major question concerns whether information is retrieved from the memory strictly before a decision is initiated. “Do you first try to retrieve information from your memory, and after you have done this in your brain you start making your decision? Or does this happen in parallel?” continues Dr Gluth. The working hypothesis in the project is that information is retrieved more or less in parallel with decision-making, which will inform the ongoing development of the mathematical model. While research is largely fundamental in nature at this stage, Dr Gluth believes that it could bring wider benefits in future, for example in helping people who have experienced a decline in their memory to make good decisions. “If you understand the cognitive system better, then you can devise better technologies,” he explains. “For instance, insights into the visual system can help you to arrange the road system in a more optimal way. A famous example of this is that a lot of cars previously crashed at a road bend in Chicago because they were going too fast and failed to brake. Lines have since been painted on the street, and the distance between
them gets smaller to make drivers think they are accelerating, and therefore they hit the brake.” This can be thought of as an example of nudge theory in action, essentially the use of behavioural insights to influence individual behaviour. While the project’s work holds wider relevance in these terms, Dr Gluth says the primary focus is more on fundamental research at this stage. “We are in the process of finishing some projects, and we hope to finalise some more publications over the next few months. There are also other projects, like our work with EEG signals, where we’re still analysing the data,” he outlines.
Experimental set up for a combined EEG and eye-tracking study. The eye-tracker is situated just below the computer screen.
Project Funding
Funded by the Swiss National Science Foundation - SNF Project #100014_172761 / 1
Project Partners
• Jörg Rieskamp, Department of Psychology, University of Basel, Switzerland • Ian Krajbich, Department of Psychology and Department of Economics, The Ohio State University, USA
Contact Details
Project Coordinator, Sebastian Gluth, PhD Assistant Professor University of Basel Faculty of Psychology Decision Neuroscience Missionsstrasse 62a 4055 Basel Switzerland T: +41 61 207 06 06 E: sebastian.gluth@unibas.ch W: http://p3.snf.ch/Project-172761
Gluth, S., Sommer, T., Rieskamp, J., and Büchel, C. (2015). Effective connectivity between hippocampus and ventromedial prefrontal cortex controls preferential choices from memory. Neuron 86, 1078–1090. Weilbächer, R., and Gluth, S. (2017). The interplay of hippocampus and ventromedial prefrontal cortex in memory-based decision making. Brain Sciences 7, 4.
Sebastian Gluth, PhD
Sebastian Gluth, PhD is currently assistant professor and head of the Decision Neuroscience lab at the Department of Psychology of the University of Basel. He studied psychology at the Humboldt University Berlin and received his PhD from the University of Hamburg. Prof. Gluth’s work addresses the cognitive and neural foundation of decision making and learning processes.
Exploring the potential of graphene topography for nanomagnetism Magnetic materials offer a broad range of possibilities for the development of sensors, memories and logic elements. We spoke to Dr Sebastian Gliga about his work in creating functional magnetic materials that can be reconfigured through shape change, which could open up new possibilities for the manipulation of spin waves with the aim of creating logical devices. A material formed
of an atom-thick layer of carbon atoms, graphene has attracted a lot of attention in research; now scientists are seeking to harness its mechanical features in the development of functional magnetic nanomaterials. While a graphene sheet is commonly thought of as being perfectly flat, wrinkles as well as quasiparticles called wrinklons emerge when it is placed under stress. “We can think of wrinkles as sinusoidal variations in the height and width of a graphene sheet,” explains Dr Sebastian Gliga, a Scientist in the Microspectroscopy Group at the Paul Scherrer Institute. When a graphene sheet is suspended over a trench then wrinkles develop, in a way similar to the effect of pulling on a curtain. “We get wrinklons when wrinkles come together and merge with each other. Certain topological properties are associated with this,” says Dr Gliga. “I’m interested in this junction where two wrinkles meet.”
Spin waves
While Dr Gliga is working with graphene, this is primarily a medium to support thin magnetic films, which is his main area of expertise. “I’m investigating the possibility
of creating magnetic films whose topography can be changed. The aim is to create functional magnetic materials through shape change. I am looking to manipulate spin waves using these materials - spin waves are essentially collective excitations of the electronic spins,” he outlines. A number of proposals have been put forward for creating logic elements based on spin waves; Dr Gliga says reconfigurability is an important issue in this respect. “Magnetic systems have been
developed which can propagate spin waves and do interesting things, such as define logical gates through spin wave interference, but they’re not reconfigurable. Other systems are reconfigurable, but you have to change their magnetic state,” he says. An alternative approach would be to achieve functionality through shape change, a topic at the heart of Dr Gliga’s research. There has been a lot of research over the last few years on curved magnetic films, and it has been found that when a thin film is curved, novel properties emerge. “This means for example that a given type of domain wall in the film will have a favourable direction of propagation. Spin waves can also display asymmetric propagation, with waves traveling in opposite directions having different frequencies,” explains Dr Gliga.

The aim is to create functional magnetic materials through shape change. I am looking to manipulate spin waves using these materials - spin waves are essentially collective excitations of the electronic spins.

The intention is to build on these findings to create a material with properties that can be changed through bending, which Dr Gliga says has to be done on very small length scales, ideally of a few tens of nanometres. “These scales are associated
with what is called the exchange length of the material. It’s a characteristic length of a magnetic material which determines the extension of spin inhomogeneities in the magnetic structure, such as domain walls for example,” he continues. The wrinkles in graphene can extend from a few nanometres to a few hundred nanometres, in principle matching these scales, so they have the ideal dimensions to enable reconfigurability. The wrinkles appear when graphene is placed under stress, but they are also affected by the temperature of the graphene sheet. “Graphene has a negative thermal expansion coefficient. This means that when you heat up the graphene, together
with the substrate, the substrate will expand, while the graphene sheet will actually shrink, so the wrinkles disappear. As you cool down the graphene, the substrate will shrink but the graphene sheet will expand, thus creating wrinkles,” says Dr Gliga. This opens up interesting possibilities in terms of controlling these wrinkles. “One of my hopes is to control the size and shape of these wrinkles, through different thermal cooling and heating cycles,” continues Dr Gliga. A technique called scanning electron microscopy is used to assess the size of the wrinkles, with researchers investigating how this can be controlled. From this point, Dr Gliga can then investigate whether there is a relationship between the properties of the wrinkles and the magnetic properties of films grown on top of the graphene sheets. “This is the hard part of the project. The graphene is the underlayer, the mechanical support, and the idea is then to grow a magnetic layer on top,” he outlines. Researchers are now investigating how magnetic layers grow. “Ideally the magnetic layer will then take on the wrinkle shape of the graphene. Then, once we manipulate the graphene and the wrinkles change in size, this will hopefully be transferred to the magnetic film, and then we can really tune the magnetic film properties,” continues Dr Gliga. “However, I’m not at that point yet, and there is a danger that the magnetic film might just ‘crush’ these wrinkles.”
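The heating-and-cooling behaviour described above comes down to the mismatch in thermal expansion between graphene and its substrate. The sketch below is only a rough illustration of that sign argument: the expansion coefficients and temperature changes are placeholder values of roughly the right order of magnitude, not parameters measured in this project.

```python
# Illustrative thermal-mismatch argument: graphene has a negative thermal
# expansion coefficient, so heating stretches it relative to an expanding
# substrate (wrinkles flatten), while cooling compresses it relative to a
# shrinking substrate (wrinkles form). Coefficients below are placeholders.
ALPHA_GRAPHENE = -8e-6    # per kelvin (negative: graphene contracts on heating)
ALPHA_SUBSTRATE = 0.5e-6  # per kelvin (typical positive value for an oxide substrate)

def mismatch_strain(delta_t_kelvin: float) -> float:
    """Differential strain of the substrate relative to the graphene sheet."""
    return (ALPHA_SUBSTRATE - ALPHA_GRAPHENE) * delta_t_kelvin

for dT in (+100, -100):
    strain = mismatch_strain(dT)
    state = "graphene pulled taut (wrinkles flatten)" if strain > 0 else "graphene compressed (wrinkles form)"
    print(f"dT = {dT:+d} K -> mismatch strain = {strain:+.2e} ({state})")
```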
Size and shape
There are a number of other potential issues, so this is a high-risk, exploratory project at this stage, rather than one focused on specific applied goals. The primary aim of the project is to control the size and shape of these wrinkles, with respect to amplitude and wavelength. “The ideal shape would be something like a half circle. To get the properties I’m interested in, the width and the height would need to be proportional,”
says Dr Gliga. While an iron-nickel alloy called Permalloy is being used in the project, Dr Gliga expects other materials like nickel, cobalt or iron would be equally interesting. “One of the differences might be in the way the magnetic films grow on graphene, which is another area of debate,” he says. “The literature suggests that it’s not easy to grow magnetic thin films on graphene. A solution to this is to have a thin layer of a different material in-between the graphene and the magnetic film. This of course makes the system less flexible.” The project itself is relatively short in duration, but if the results are positive then this could provide the foundations for further research into spin wave manipulation. There are many ideas around about the possible applications of functional magnetic materials in logical devices; one of the main issues in this respect is the energy consumption of computing chips. “As they get smaller, they dissipate more energy, thus generating large amounts of heat,” explains Dr Gliga. One problem with conventional computers, which are based on the von Neumann architecture, is that the memory and processing units are separate, and data needs to constantly be transported between them; this is fundamentally inefficient, an issue that Dr Gliga believes these materials could address. “One application could be in neuromorphic computing. This is the idea that you would simultaneously store data and perform logical operations,” he outlines. “The beauty of this system is that it opens new possibilities to achieve that through its structural reconfigurability.” This hinges however on the ability to modify and control the size of these wrinkles, to move the wrinklons along the sheet, and to nucleate new wrinklons when required. The main priority at this stage for Dr Gliga is to achieve this level of control, yet he is very much aware of the wider possibilities. “If this turns out to be possible, then you would be able to define a material through which spin waves can propagate and be simultaneously processed in reprogrammable ways,” he says.
WRINKLES AND WRINKLONS Wrinkles and wrinklons: magnetic films with tuneable topographies
Project Objectives
The aim of this project is to grow magnetic thin films on top of wrinkled graphene sheets. A first objective is to determine if the graphene wrinkles can be used as a template to create wrinkled magnetic films. A second objective is to determine if, by actively changing the properties (e.g. size) of the graphene wrinkles, the topography of the deposited magnetic film can be modified. Concretely, this would mean that it is possible to actuate the magnetic film using the graphene underlayer.
Project Funding
Spark project funded by the Swiss National Science Foundation (SNF).
Contact Details
Project Coordinator, Dr Sebastian Gliga Paul Scherrer Institute WSLA/126 Forschungsstrasse 111 5232 Villigen PSI Switzerland T: +41 56 310 54 81 E: sebastian.gliga@psi.ch W: www.psi.ch/en/microspec/scientifichighlights/wrinkles W: http://p3.snf.ch/project-190736
Dr Sebastian Gliga
Dr Sebastian Gliga is a Scientist in the Microspectroscopy Group at the Paul Scherrer Institute. His research focuses on nanomagnetism, in particular the investigation of emergent phenomena in two- (2D) and three-dimensional (3D) artificial spin systems, aiming to create energy efficient functional materials. Sebastian Gliga has published over 60 peer-reviewed papers that have been cited more than 2300 times.
View of the open ATLAS detector. © CERN
How does the Higgs boson couple to lighter particles? The Higgs boson was discovered in 2012; now researchers in the ExclusiveHiggs project are looking at how it couples to light quarks. We spoke to Professor Kostas Nikolopoulos about their work in analysing particle collisions, research which could help elucidate the properties of the Higgs boson and guide the search for new physics. The Higgs boson
was discovered in 2012, confirming decades-old theoretical predictions of its existence, yet much remains to be learned about its properties. As the Principal Investigator of the ExclusiveHiggs project, Professor Kostas Nikolopoulos is investigating how the Higgs boson interacts with light quarks. “We are trying to understand the properties of the Higgs boson, and in particular how it interacts with lighter particles,” he explains. The heavier a particle, the more likely it is to interact with the Higgs boson, so investigating its interactions with lighter particles is a challenging goal. “The elementary matter particles are divided into leptons, like the electron, and quarks, like the up and down that are the constituents of the proton and the neutron. These particles are organised in a family, with a first, second and third generation,” outlines Professor Nikolopoulos. “The three generations are identical to each other. But as you go from
the first generation to the second, the particles become heavier, and as you go from the second to the third, the particles become heavier again.” Researchers have observed the Higgs boson interacting with some of the heavier particles, such as the top quark, the bottom quark, and the tau lepton. However, there is less evidence of its interactions with the four light quarks – up, down, charm and strange – which is where Professor Nikolopoulos and his colleagues in the project come in. “At the time of the discovery, there was really no prospect of studying these interactions. We are working at the Large Hadron Collider in the project, as it’s the only place that can produce a Higgs boson. We are using the ATLAS detector in a way that is beyond what it was designed for,” he says. Around 40 million particle collisions occur every second when the LHC operates, with researchers aiming to identify the
most interesting in terms of the project’s wider goals. “We have developed a selection mechanism to identify interesting collisions. ATLAS records roughly 1,000 collisions a second, but we use less than 1 percent to collect events for exclusive decay searches,” continues Professor Nikolopoulos. The aim in the project is to sift through these data to find evidence of the Higgs boson interacting with light quarks through observations of its decay products. The Higgs boson itself can decay into a quark and anti-quark pair, yet Professor Nikolopoulos says this is not easy to distinguish from other processes. “A lot of other processes give signals that look exactly like this, so we need something that is very distinctive,” he explains. This is why attention is focused on exclusive decays, for example of the Higgs boson decaying into a phi meson. “One of the strange and anti-strange quarks radiates a photon, and then together with the other
Phi meson candidates in the search for Higgs decays to a phi meson and a photon. Published in JHEP 07 (2018) 127 (arXiv:1712.02758).
they make a phi meson. That’s a signal that we can look for,” says Professor Nikolopoulos. “Another way is to look directly for the production of a W or Z boson together with a Higgs boson, that subsequently decays to a charm and an anti-charm quark. These travel away from their production point before they decay, and this creates an interesting experimental signature.” This latter approach has been used to observe Higgs boson decays to bottom quarks, yet it’s less sensitive for charm quarks that travel a shorter distance before they decay. While progress has been made in terms of identifying lighter particles, at this point no single approach seems able to provide all the answers researchers are seeking in the
project, so Professor Nikolopoulos and his colleagues are looking to combine different approaches. “We try to investigate as many of these ideas as possible,” he outlines. This may not necessarily provide the answers to all of the questions around different particle couplings, but Professor Nikolopoulos hopes it will help researchers map a path forward. “Even for the most successful of these methods, we will need a lot of data in order to probe the standard model, because the processes we are investigating are very rare,” he says. “We will need more data to consider observing these processes. The high luminosity upgrade of the LHC will help in this respect.” The upgrade of the LHC is expected to be operational in 2027, and will open up new possibilities in terms of observing rare decay processes. Beyond that, there is also an ongoing debate about the next generation of particle colliders - the European Strategy for Particle Physics was recently updated, looking decades ahead towards the needs of future researchers, and the project’s research holds wider relevance in these terms. “The outputs of our project will help to plan these future experiments to maximise the scientific output,” says Professor Nikolopoulos. More immediately, research continues into the interactions of lighter particles with the Higgs boson. “With these studies we cover new ground in our understanding of the Higgs boson and evaluate their future potential, at the same time we explore ideas that were not in the initial proposal,” continues Professor Nikolopoulos. “For example, we are investigating what is called associated production.”

We are trying to understand the properties of the Higgs boson, and in particular how it interacts with lighter matter particles.

Associated production
This involves looking for evidence of a Higgs boson being produced together with a charm quark. The initial collision here might involve a gluon and a charm quark, but in the final state a charm quark and a Higgs boson decay are observed in the detector. “These two are produced in association. We aim to gain new insights into the interaction of the Higgs boson with the charm quark,” explains Professor Nikolopoulos. Significant progress has been made over the course of the project, and Professor Nikolopoulos plans to investigate a couple of other ideas over the remainder of the funding term.
“We are also looking at how to apply these techniques in the search for new physics, for additional Higgs bosons,” he continues. “Some of the techniques that we have developed in the project can be used to search for additional Higgs bosons. For example, we have recently searched for possible Higgs Boson decays to a Z boson and a light scalar particle.” A large number of Higgs bosons need to be produced in the first place in order to pursue this research and observe these rare decays. For example, the decay of the Higgs boson to a phi meson and a photon is expected to happen only twice for every million bosons. “We need to produce huge numbers of Higgs bosons, because these are very rare processes,” stresses Professor Nikolopoulos. This is a technically challenging area of research, yet that is a large part of the attraction for Professor Nikolopoulos and his colleagues. “We want to learn as much as we can from the data that we have available, and then that will help pave the way for future experiments,” he says.
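To get a feel for the numbers involved, a rough back-of-the-envelope calculation can be made from the branching fraction quoted above. The sketch below ignores detector acceptance and efficiency, and the sample sizes are arbitrary illustrative values rather than figures from the experiment.

```python
# Rough illustration of why huge Higgs samples are needed: with a branching
# fraction of roughly 2 in a million for H -> phi + photon, the expected number
# of such decays scales linearly with the number of Higgs bosons produced.
branching_fraction = 2e-6  # ~2 decays per million Higgs bosons (figure quoted above)

for n_higgs in (1e6, 1e7, 1e8):  # arbitrary sample sizes for illustration
    expected = branching_fraction * n_higgs
    print(f"{n_higgs:.0e} Higgs bosons -> ~{expected:.0f} expected H -> phi gamma decays")
```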
ExclusiveHiggs Search for New Physics in First and Second Generation Quark Yukawa Couplings through Rare Exclusive Decays of the Observed Higgs Boson
Project Objectives
This project tackles for the first time, in a systematic and comprehensive way, the experimentally most unconstrained sector of the SM: the couplings of the light-quarks (up, down, charm and strange) to the Higgs boson, including possible flavour-violating interactions. Searches for the rare exclusive Higgs boson decays to a meson and a photon or Z boson, a novel and unique approach, are performed with the ATLAS detector at the CERN Large Hadron Collider. At the same time, an extensive set of measurements of analogous rare exclusive decays of the W and Z bosons is performed, further enhancing the scientific value of the proposed research programme. Moreover, the project scope has expanded to investigate further new and innovative approaches within this new field of study. These include direct searches for associated Higgs boson production with a W or Z boson, with subsequent Higgs boson decay to a charm—anti-charm quark pair, and associated production of a Higgs boson with a charm quark.
Project Funding
This project is funded by a European Research Council Starting Grant under H2020-EU.1.1. - EXCELLENT SCIENCE
Contact Details
Project Coordinator, Kostas Nikolopoulos Professor of Physics School of Physics and Astronomy University of Birmingham Edgbaston, Birmingham B15 2TT, United Kingdom T: +44 (0) 121 414 4627 E: k.nikolopoulos@bham.ac.uk W: http://exclusivehiggs.eu Professor Kostas Nikolopoulos
Kostas Nikolopoulos is Professor of Physics at the University of Birmingham. He is an experimental particle physicist, focusing on the study of electroweak symmetry breaking and the Higgs sector. Recently, his research interests have expanded to another major open question in physics: the nature of Dark Matter, through both direct and collider-based searches. Professor Nikolopoulos was recently honoured with the first ERC Public Engagement with Research award. One of three recipients of the inaugural award, Professor Nikolopoulos was honoured for his work in public outreach, which included art exhibitions, dance performances and workshops with students.
New light on dark matter Dark matter is the focus of a great deal of attention in research, yet mystery still surrounds its nature and structure. The DarkSPHERE project aims to shed new light on the topic through experiments with Spherical Proportional Counters, an innovative type of detector that will open up new possibilities in research, as Dr Ioannis Katsioulas explains. Evidence from astronomical
and astrophysical observations points to the existence of dark matter, which is thought to account for around 85% of the matter in the universe, yet its nature is still unknown. Based at the University of Birmingham, Dr Ioannis Katsioulas is leading the DarkSPHERE project, an initiative which aims to shed new light on dark matter, focusing on the 0.05-10 GeV (the proton has a mass of approximately 1 GeV) mass region. “Our aim is to tune our experiment to look in this region, specifically below 2 GeV,” he outlines. A novel type of detector called a Spherical Proportional Counter (SPC) will be used in this work. “It’s essentially a big sphere, with which we can read out the signals induced inside the gaseous volume of the detector,” explains Dr Katsioulas. “There is an anode at the centre of the sphere. Whatever interactions take place in the volume of the detector produce free electrons that drift towards the anode, creating a signal.”
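The drift-and-amplify behaviour described here follows from simple electrostatics: for an idealised spherical counter (ignoring the rod that supports the anode), the field between the small central anode and the grounded outer shell falls off as 1/r², so it is intense only within a fraction of a millimetre of the anode. The sketch below uses made-up voltage and radii, loosely inspired by a large SPC, purely to illustrate that scaling; it is not a description of the DarkSPHERE detector design.

```python
# Illustrative field profile for an idealised spherical counter: a small anode
# of radius r_anode at voltage v_anode inside a grounded sphere of radius
# r_cathode. The 1/r^2 fall-off means electrons drift gently through most of
# the volume and are amplified only very close to the anode.
def spc_field(r_m, v_anode=2000.0, r_anode=0.001, r_cathode=0.7):
    """Electric field (V/m) at radius r_m between concentric spheres."""
    return v_anode / (r_m ** 2 * (1.0 / r_anode - 1.0 / r_cathode))

for r in (0.5, 0.1, 0.01, 0.002, 0.001):
    print(f"r = {r * 100:5.1f} cm : E = {spc_field(r):12.1f} V/m")
```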
Dr Katsioulas and members of the University of Birmingham installing an SPC in the Boulby underground laboratory for neutron background measurements.
DarkSPHERE Search for light Dark Matter with a Spherical Proportional Counter
Project Funding
Funded under H2020-EU.1.3.2.
Contact Details
Marie Skłodowska-Curie Fellow, Dr Ioannis Katsioulas School of Physics and Astronomy University of Birmingham Edgbaston Campus, B15 2TT Birmingham, United Kingdom T: +44 0 121 4144623 E: i.katsioulas@bham.ac.uk W: http://darksphere.eu/
Dr Ioannis Katsioulas
Dr Ioannis Katsioulas is a Marie Skłodowska-Curie fellow at the University of Birmingham. He was awarded his undergraduate and doctoral degrees at the Aristotle University of Thessaloniki, Greece. He has spent three years at CEA Saclay, Paris, developing novel detector technologies. He is an experimental particle physicist with a particular interest in dark matter searches and neutrino physics, and a founding member of the NEWS-G experiment.
26
The NEWS-G 140 cm in diameter SPC during installation in SNOLAB, Canada.
Spherical Proportional Counter
The spherical shape allows the construction of large-volume, low-capacitance detectors. The resulting low electronic noise, together with the large amplification of the signal, allows for a single-electron detection threshold, making the SPC ideal for light dark matter searches. “In order to detect light dark matter, you need to be able to detect very faint electric signals, at the level of a single electron to a few electrons,” says Dr Katsioulas. The nature of dark matter means it cannot be observed directly, but researchers can find evidence of these particles through their wider effects. “We cannot really see the dark matter passing through the detector. If it passes without interacting with the normal matter then we won’t see anything,” explains Dr Katsioulas. “But the SPC is an ionisation detector. We can convert part of the kinetic energy of the dark matter to generate a signal, and this is done through ionisation.”
In order to detect light dark matter, you need to be able to detect very faint electrical signals, at the level of a single electron.
The detector itself is at SNOLAB in Canada, where the experiment will be conducted. SNOLAB is located deep underground, which helps to reduce background from cosmic radiation. “The basic goal with this type of experiment is to reduce the background as much as possible. We put the detectors under the surface of the earth to reduce any cosmic ray flux to a very minimal level,” explains Dr Katsioulas. The detector was assembled underground, from pure copper parts that were quickly moved there to reduce cosmogenic activation. On the inside surface of the copper hemispheres, a layer of ultra-pure copper was deposited using an innovative method called electroforming, to further reduce background. In the near future, a new detector will be put together entirely underground at SNOLAB, which Dr Katsioulas says will also help reduce the background further and improve sensitivity and discovery potential. “We will build an electroformed detector underground with ultra-pure copper, which contains minuscule amounts of radioactive isotopes,” he continues. “We’re also considering future upgrades of the detector’s capabilities, and how it can deal with high gas pressure underground.”
This is a highly ambitious project, with researchers looking to extend dark matter searches to an unexplored region, which would represent a major scientific breakthrough. Alongside this exploratory research, Dr Katsioulas and his colleagues also plan to utilise the detector for certain industrial applications. “We are investigating how to use this technology for neutron spectroscopy, for example, and to help understand the neutron background in rare event searches,” he says. Another potential application of the detector is in medicine, where the use of radiation treatments can lead to the production of neutrons. “The production of neutrons in a medical environment could be dangerous for the people working there, as well as patients. It would be fantastic if we could use the detector to measure these effects,” concludes Dr Katsioulas.
EU Research
Postponing Studies: Dealing With the Mice
When you can’t carry out the research you had planned, what do you do with the mice you’ve already got? The Covid-19 pandemic has thrown research off course in all manner of ways, but few blows have hit as hard as the need to cull research mouse colonies. It’s not been easy to handle, and the pandemic has reinforced both the importance of animals in scientific research and the attachment researchers have to their animals. By Adam England
www.euresearcher.com
27
Elizabeth McCullagh, Assistant Professor at the Integrative Biology Department of Oklahoma State University, photo credit Trevr Merchant. Carrying out research as a postdoc at the University of Colorado, Anschutz.
Many researchers have had to cull lab animals during the Covid-19 pandemic, as plans for myriad studies have been scuppered. A great deal of normal research has been unable to go ahead, so many labs have been left with an excess of rodents. As a result, a number of researchers have reported having to euthanize mice they would otherwise have used for research, had 2020 been a ‘normal’ year. “We were effectively told by the university that the colonies had to be reduced to avoid putting too much pressure on the animal technicians,” explains Dr Sophie Thomson, Postdoctoral Research Fellow at the University of Edinburgh, “because if we were in a position where only a percentage of staff could come in - either because of infection or having to isolate - there would be too many animals for the remaining technicians to effectively and safely manage.” As a result, genetically modified mouse colonies had to be reduced to the bare minimum. It’s not that euthanasia is particularly uncommon in this line of work: rodents are frequently euthanized once research has been completed, but this is seen as a necessary evil rather than something carried out en masse, as has often been required during lockdown. Most of the millions of mice and rats used in scientific research each year are euthanized via inhalation of carbon dioxide, though this has been shown to cause behavioural aversion in numerous species, stimulating the amygdala and prompting fear-like behaviours. This is followed by cervical dislocation, though the success of this method in guaranteeing death has been debated. However, it’s seen as the most humane method of euthanasia when it comes to rodents, and as researchers try to use as few rodents as possible, there is a definite aim for suffering to be minimised.
Humane euthanasia is the norm
Figures from Oxford University show that in 2019 a total of 222,206 mice were used in research procedures throughout the university.
28
2,388 mice, or 1.07% of the total, were used in ‘non-recovery’ procedures; that is, they had been placed under a general anaesthetic and then killed before regaining consciousness. In contrast, 128,725 mice, or 57.93% of the total, were used in ‘sub-threshold’ procedures - those procedures where no above-threshold pain or suffering occurs. As standard, most rodents are still humanely euthanized after the procedure - the aim is to minimize suffering while still carrying out whatever research is necessary. The principles of the 3Rs - replacement, reduction and refinement - have been the framework for animal research for the past 60 years after being outlined by W. M. S. Russell and R. L. Burch in 1959. Although these are generally adhered to, the need to cull mice during the pandemic means that, by definition, not all of the principles can be fulfilled. Instead of minimizing the number of animals used in research, excess animals are being euthanized. But then, despite an earlier death than would otherwise have been intended, are they suffering less on balance as a result? Perhaps. Although the scale of the culling would not have been predicted by researchers, it is easy to conclude that, as the mice were bred for research and would have been euthanized at some point regardless, it is theoretically a non-issue. These mice existed solely for medical research - nothing beyond that. However, the problems which have arisen since the beginning of lockdown go further than just having to cull mice prematurely.
One of many issues
Speaking to Elizabeth McCullagh, Assistant Professor at the Integrative Biology Department of Oklahoma State University, it becomes apparent there were issues around obtaining animals even before the pandemic. However, Covid-19 certainly hasn’t helped. “Things have been much slower than they would be normally
EU Research
because of remote work,” she explains, “mostly in terms of getting approvals to get animals and space to house them.” McCullagh hasn’t had to cull mice herself, as “I am actually just starting up my lab as a PI and have not been able to get mice in … I have not been able to be in the lab for any sustained period of time with students since March due to safety concerns for my family — I have two young children — and students.” In an area of science that is already controversial, the idea that animals may be culled simply because there’s no practical use for them is one that is unlikely to go down well with opponents of the use of animals in research. When discussing the moralities of animal research, it’s often seen as a necessary evil - with animal suffering an unfortunate byproduct of the research. So, if mice are culled before they’ve ‘fulfilled’ their role, is this more unethical? It’s a debate that has no easy answer, but the internal impact on researchers can’t be overstated. Even without the need to reduce colonies, the pandemic was taking its toll on researchers. “Everyone was very frustrated that we were on one hand being told by the university before lockdown that research wasn’t stopping but then also being told to wind everything down; there were mixed signals going on there,” explains Thomson. Anna Smajdor, Associate Professor of Philosophy at the University of Oslo, says that this is likely to make researchers feel more conflicted about the use of animals. “I think one way in which people try to make themselves feel better about working in professions that involve killing animals is to say - ‘it has to be done - it’s not as if we’re doing it for fun or
that there’s no point or purpose in it - it has a very important rationale behind it’. Of course, when you just end up with a huge excess of mice that no-one can use, that rationale isn’t there anymore,” she explains. “I think that’s quite hard for people to deal with because, as I say, it’s a very carefully navigated sort of moral tightrope that people walk in these areas so if you lose your main justification for what you’re doing - for something that’s morally contentious in itself - then it makes it harder psychologically. Harder morally, as well.”
Leaving an impact
And what of the human side of it? People working in animal research often suffer from heightened stress and other harms resulting from their work, including moral injury. Of course, many researchers will become attached to their animals, and culling mice is not a decision that is reached lightly. We know that researchers can often struggle with this aspect of the job, whether euthanizing animals after the research has been conducted or in a circumstance like the current pandemic. Talking about the effect that culling mice has on researchers, Smajdor compares it to the experiences of farmers during the outbreak of foot-and-mouth disease in the UK in 2001. “Farmers were really struggling with having to cull their animals, and these, of course, are farm animals that would be culled for meat anyway, but anything that takes the killing out of this very carefully controlled, predictable environment re-exposes the cognitive dissonance that goes on when human beings kill other animals.”
“For animals that were due to be used for studies and just had to be scrapped, that was pretty devastating.” Sophie Thomson
www.euresearcher.com
29
“That is, most of the time we think of ourselves as nice, civilised people; we don’t just go around killing things. What’s happening in the pandemic, I would say, is just exposing some of the underlying rawness and difficulty of working in one of those professions where you do have to do that.” Thomson described the need to cull animals as “awful”, as “no-one took it very well. For animals that were due to be used for studies and just had to be scrapped, that was pretty devastating.” Clearly, culling animals is something of a thankless task for those working in animal research, putting people in the position of deciding whether an animal will live or die. While this would be somewhat easier when the animal has already been used in research, culling animals that haven’t fulfilled their role can prove more difficult.
Moving forward
As researchers have had to adapt - McCullagh’s two children are aged 3 and 10 months, so she’s stayed away from the lab for safety reasons - could this lead to wider changes in how research is conducted post-Covid-19? Smajdor doesn’t think so, as she anticipates normal service being resumed as soon as possible. “If I were being an optimist, I’d like to say that the pandemic has shown that we are prepared to do research that cuts out some of the reliance on animal models. We’re willing to do that when it seems that the necessity is urgent enough, and of course, the pandemic presents us with some kind of human urgency and necessity,” explains Smajdor. She continues: “I think for people who campaign against the use of animals in research, it’s very hard to envisage a time when the public
Anna Smajdor, Associate Professor of Philosophy at the University of Oslo.
or scientists are going to see an urgent need to stop using mice in research. I suspect that what will happen is some people will say ‘hey, look - we’ve managed to make some progress without using animals’ and other people will say ‘thank goodness we don’t have to do that anymore, and we can go back to the way that we used to do things’.” Although we can’t predict the future for medical research, it’s clear that people have had to adapt and change the way they work. A future without the use of animals in research will not be arriving any time soon; although researchers have been able to improvise and adapt during the pandemic, it has not been ideal. For Thomson’s lab, some studies were allowed to continue as some animals had already been dosed as part of a long-term monitoring study, but the researchers weren’t allowed to start any new studies and so were limited in what they could do. As McCullagh explains, a number of issues have arisen, in part due to the pandemic, from difficulties in overseas students’ journeys to gathering preliminary data for grants. These are alongside the struggles in getting animals in, and of course the widespread need to cull mice. In short, the circumstances have been difficult for everyone - whether they’ve been directly involved in culling mice or not. Yet, when regular protocols can eventually resume, so will the need for a supply of animals. Despite there being at least some evidence that research can be carried out without animals, the consensus remains that animal research is sometimes a necessity, and this is predicted to continue. As Smajdor puts it: “It would be nice if one could think that something might change in the longer term, but I don’t think the pandemic is going to do that.”
“If you lose your main justification for what you’re doing - for something that’s morally contentious in itself - then it makes it harder psychologically.” Anna Smajdor 30
EU Research
City drive in an autonomous test vehicle. Safety driver has hands off the wheel but needs to keep their eyes on the road.
AI behind the wheel for city driving
Wojciech Derendarz, of the UP-Drive project, is working out how autonomous vehicles can develop improved perception, relying on combined technologies to process urban environments, so self-driving cars can take a step closer to earning their driving licence.
Taxi – take me home!
Wojciech Derendarz, Project Leader at Research & Development of Volkswagen Group, is a veteran of the autonomous car sector who has been sitting ‘hands off’ behind the wheel of self-driving prototypes for 13 years. His latest goal is to make vehicle AI better understand challenging urban environments. It sounds hard, and it is. To accomplish this feat, a car needs to adapt, accurately comprehend and react instantaneously in complex, changing environments. The original aim of UP-Drive was to develop a technology for self-driving cars, able to locate a parking space after the driver has been ‘dropped off’ and self-park. As the goal was to offer that technology anywhere in a city – not just on special premises like parking garages at airports – the project scope quickly evolved to tackling the myriad challenges posed by city driving, with parking being the simplest piece. With 70% of the global population predicted to be living in urban and suburban areas by 2050, having reliable, completely autonomous cars will have a significant impact and represent a huge benchmark in self-driving technology.
“It is important to realise there are two different approaches out there,” explained Derendarz. “Most car manufacturers take the evolutionary approach. They currently offer privately owned cars with driver assistance systems, meaning that the driver still needs to monitor the system at all times – and they gather data and optimise performance over time. Once the systems become good enough, they hope to enable the higher level of autonomy that will finally allow the driver to take their eyes off the road. “We decided to shift the UP-Drive project when we realised you need similar technology for someone’s private on-street valet parking function as you need for a robo taxi. It seems realistic that the systems we are developing will be seen first in robo taxis and robo shuttles, backed by companies willing to invest heavily in innovation. There is a stronger commercial case as they earn around the clock in the business model. A private car is parked most of the time and the cost for the technology may initially be prohibitive to car owners.”
www.euresearcher.com
So how would such a system work? A system capable of the higher levels of autonomy would need to have a strong capability of localisation and mapping, would need to master complete round-view perception of the vehicle’s environment, and be able to form a detailed understanding of complex scenes it encounters as well as predict what other traffic participants are up to. UP-Drive used and combined several sensing technologies such as camera, lidar and radar as a foundation for all those tasks.
Know where you are
“The first step is knowing where you are and where you want to go. You need an idea of where you are in the world, which for AI means relying on a navigation application or a map like the one in your smartphone or car – only much more detailed and precise.” It is useful, and currently seems necessary, to use a map in autonomous cars to provide information about the environment beyond what can be perceived with the sensors. “We have the information in maps but you also need to know precisely where you are in
31
In an urban environment an autonomous car needs to be able to detect and track plenty of objects: pedestrians, cyclists, cars and other obstacles. 3D geometric data is the foundation for that.
But 3D points and boxes alone are not enough; an autonomous car also needs to understand what it is seeing, just as we humans do.
the map. This is what we call a localisation task. An autonomous car needs to localise itself with great accuracy – to around 10 cm in position and 0.1 degrees in heading – to be able to utilise all the information in the map and to navigate through a narrow part of the road. We are approaching this level of accuracy.” Of all the key technologies, localisation is the closest to being ready for autonomy.
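To give a feel for why the angular figure matters as much as the 10 cm position figure, the short sketch below converts a constant heading error into the lateral drift it causes over a straight stretch of road. The distances and the 0.1 degree error are illustrative assumptions, not UP-Drive requirements.

import math

# Back-of-the-envelope sketch: how a small heading error grows into a lateral
# offset when driving straight ahead. The distances are assumed, purely illustrative.

def lateral_offset(heading_error_deg, distance_m):
    """Lateral drift (m) after travelling distance_m with a constant
    heading error of heading_error_deg degrees."""
    return distance_m * math.tan(math.radians(heading_error_deg))

if __name__ == "__main__":
    for d in (10, 25, 50, 100):
        off = lateral_offset(0.1, d)
        print(f"0.1 deg error over {d:3d} m -> {off * 100:5.1f} cm of lateral drift")
    # Over roughly 50 m, a 0.1 degree error already uses up most of a 10 cm lateral
    # budget, which is why both position and angular accuracy matter on narrow roads.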
Perceiving reality in complex scenes
Although crucial, mapping and localisation are not the biggest challenges in self-driving vehicles; perception is much harder. “We have different sensing items we can use for this task,” explains Derendarz. “We are transforming toward more human-like perception and this poses challenges. We humans rely very strongly on complex information. We classify the scene and everything we see in it.” People use top-down perception to ‘fill in’ gaps with their imagination. For example, we know that if we see a chair leg obscured by a
table that there is a whole chair there, but a computer may perceive a piece of chair leg in isolation. More relevantly, if there is a row of parked cars, we may judge the gaps between them without seeing the gaps, so we know that they are not one long object. This level of understanding can pose a challenge for artificial perception. “As a driver you will never have a situation where you have a car somewhere and the next second you will think that this car has disappeared and the next second the car is there again. It’s no good if your self-driving car is stopping every five minutes or worse, hitting things. Typically, we can position the things that we do not see directly, I can connect the dots in my mind. We create a very strong context-based understanding of what we see. It looks very different to what a machine sees. “Machines try to get the geometric information first. For example, we take a laser scanner and capture a 3D point cloud out of that and out of that point cloud we look for objects. We look for road surfaces and placed objects and everything else
Test vehicles used in the project are based on the fully electric VW e-Golf and have been equipped with many additional sensors – some of them on the roof of the car.
32
is what we can collide with. We try to separate those things from one another. In human contextual understanding we have no problem understanding when cars are close to one another, where one car ends and the next one starts – we see the wheels and the windows, the different colours of the cars and so it’s easy. To do the same task in a point cloud, it gets much more difficult, especially when you are trying to describe it in a rule-based algorithm that takes the gaps in the data or measurement noise into account. It’s difficult to find rules that do this task perfectly.” The project tackles the challenge by splicing the data from different sensing devices together into a holistic viewpoint which is rich with correlating points of information. This creates a more contextual, ‘fluid’ overview. “In UP-Drive we have taken an approach to build up on top of this geometrical way of looking at objects. We utilise geometry strongly and introduce this more contextual understanding of the scene to stabilise the perception. We do it by putting the camera images and point clouds into one representation. We project the points from the laser scanners onto the virtual image plane so that we can see where those points from the laser scanner would be in this image that is being captured by the camera. Then we do correlations, so we do a one-to-one correlation with the laser, camera and radar combined. “In the city you have a huge amount of objects and you cannot combine a high quota of data in traditional ways as it gets too cluttered. That’s why we decided to fuse the information on a much lower level. This is one of the more fundamental ideas we have proposed and explored in UP-Drive. This has led to very good results but still, we have a long way to go to do what a human can do.”
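A minimal sketch of the low-level fusion step described here is shown below: lidar points are transformed into the camera frame and projected onto the image plane with a pinhole model, so that each 3D point can be associated with a pixel. The calibration matrices and the example points are invented placeholders, not the calibration of the UP-Drive test vehicles.

import numpy as np

# Sketch of projecting lidar points into a camera's virtual image plane so that
# 3D geometry and image pixels can be correlated. All calibration values are
# made-up placeholders for illustration.

K = np.array([[1000.0,    0.0, 640.0],   # intrinsics: focal lengths and principal point
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])

R = np.eye(3)                            # rotation from lidar frame to camera frame
t = np.array([0.0, -0.2, 0.1])           # translation between the two sensors (m)

def project_lidar_to_image(points_lidar):
    """Return pixel coordinates (u, v) and depth for lidar points in front of the camera."""
    pts_cam = points_lidar @ R.T + t      # transform points into the camera frame
    in_front = pts_cam[:, 2] > 0.1        # keep points ahead of the image plane
    pts_cam = pts_cam[in_front]
    uvw = pts_cam @ K.T                   # apply the pinhole camera model
    uv = uvw[:, :2] / uvw[:, 2:3]         # perspective divide
    return uv, pts_cam[:, 2]

if __name__ == "__main__":
    cloud = np.array([[2.0, 0.0, 10.0],   # example 3D points (x, y, z) in metres
                      [-1.5, 0.5, 8.0],
                      [0.0, 0.0, -3.0]])  # behind the camera, will be dropped
    pixels, depth = project_lidar_to_image(cloud)
    print(pixels, depth)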
EU Research
The full picture emerges only after the perceived information has been combined with information stored in the digital maps.
UP-DRIVE
Automated Urban Parking and Driving
Project Objectives
The UP-Drive project focus is on advancing key technologies for autonomous driving: • Robust, general 360° object detection and tracking employing low-level spatiotemporal association, tracking and fusion mechanisms. • Accurate metric localization and distributed geometrically consistent mapping in largescale, semi-structured areas. • Scene understanding, starting from detection of semantic features, classification of objects, towards behavior analysis and intent prediction. • Behavior and motion planning for complex environments.
Are we nearly there yet?
The nuances of driving are becoming clearer for researchers trying to establish better AI in this sector. Even with perfect perception, there are still many challenges: the car needs not only to see what is happening right now but also to predict the behaviour of other traffic participants for the next couple of seconds. In UP-Drive this can be done with success if you assume the typical behaviour. However, predicting the atypical behaviour, like a pedestrian stepping onto the road to cross it even though they have no right of way, is what researchers are only starting to explore. Another aspect is that humans take more calculated risks than computers would in typical driving scenarios. When you overtake a car on a busy motorway with traffic ahead and behind, braking space is often compromised for the sake of driving efficiently. So, what is acceptable risk when many people will expect machines not to take any risks but still drive effectively like a human? This brings us to the two final challenges of autonomy: prediction and trajectory planning. This is where the car’s ‘decision making’ takes place. In the UP-Drive project the self-driving technology was initially trained and tested in the grounds of a factory facility which was large and diverse enough in scale and complexity to simulate a small city. The job was to interpret how the car viewed and interacted with the environment.
“It’s always an adventure to see how the car is solving the different situations, and there are many situations that don’t seem challenging at all but once you see it from the eyes of the car, sometimes it’s unsure of the better solution to choose. You take a different perspective on everyday situations with the traffic. And there are situations that might be difficult for us as humans but the car masters those without any problems. The car can navigate complex crossroads and change lanes with ease, for example. With mapping information, it was a simple optimisation problem, which lane to take for the least lane changes. In contrast, in the factory facility there are very slow, narrow vehicles about the width of half a car, with trailers behind them. For a driver it would be simple to overtake slowly, but the autonomous car seemed challenged by its conflicting policies of getting to its destination fast and keeping enough safety distance from other traffic participants. With light variations of lane width, it changes its mind and hesitates, so there are these challenges to overcome.”
For smooth and safe driving you have to have a sharp, reactive system in the car, so if something goes wrong, or the initial predictions are wrong, there is the ability to react fast when new information arrives. “We need to build the technology to a level where we have complete trust in it ourselves, where we can put our children in the car and we trust it will take them safely to where they need to go. Then we need to prove to users that this is reliable and stable in every situation. We’re working on it and there is a little way still to go yet.”
“We need to build the technology to a level where we have complete trust in it ourselves, where we can put our children in the car and we trust it will take them safely to where they need to go.”
www.euresearcher.com
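One way to picture the ‘conflicting policies’ Derendarz describes is as a weighted cost over candidate manoeuvres, trading progress towards the destination against the clearance kept from other road users; small changes in the weights or the measured gaps can flip the preferred option. The toy sketch below illustrates only that idea, with invented candidates and weights, and is not the planner used in UP-Drive.

# Toy sketch of the trade-off described above: score candidate manoeuvres by
# combining progress towards the goal with the clearance kept from other traffic.
# The candidates, numbers and weights are invented for illustration only.

CANDIDATES = {
    # manoeuvre: (progress gained in metres, closest clearance to others in metres)
    "stay behind slow vehicle": (2.0, 3.0),
    "overtake now":             (12.0, 0.8),
    "wait, then overtake":      (9.0, 1.8),
}

def cost(progress, clearance, w_progress=1.0, w_safety=6.0, min_clearance=1.0):
    """Lower is better: reward progress, penalise shrinking clearance,
    and add a large penalty once the clearance drops below a hard minimum."""
    penalty = 100.0 if clearance < min_clearance else 0.0
    return -w_progress * progress + w_safety / clearance + penalty

if __name__ == "__main__":
    ranked = sorted(CANDIDATES.items(), key=lambda kv: cost(*kv[1]))
    for name, (p, c) in ranked:
        print(f"{name:28s} cost = {cost(p, c):7.2f}")
    # Small changes in the weights or in the measured clearance can flip the
    # ranking, which is one source of the hesitation described in the article.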
Project Funding
Funded under Research and Innovation Action. Programme H2020-EU.2.1.1. - Information and Communication Technologies (ICT)
Project Partners
• Volkswagen AG, Germany • Swiss Federal Institute of Technology in Zurich (ETH Zürich), Switzerland • IBM Research GmbH, Switzerland • Technical University of Cluj-Napoca, Romania • Czech Technical University in Prague, Czech Republic
Contact Details
Wojciech Derendarz Innovation ADAS, Autonomous Driving Department Volkswagen car.SW Org Wolfsburg AG T: +49 5361 915662 E: wojciech.derendarz@volkswagen.de W: https://up-drive.eu Wojciech Derendarz
Wojciech Derendarz started his career in 2007 at the Technical University of Braunschweig, Germany, where he worked on “Caroline”, the robotized car that participated in the DARPA Urban Challenge finals. In 2008 he moved to Volkswagen Group, where he continued his research on self-driving cars. In 2011 he took on the role of project leader and has since been responsible for a number of research projects in the area of autonomous driving.
33
Software infrastructure for the computers of tomorrow New software infrastructure is required to fully exploit the potential of the emerging generation of high performance computers. We spoke to Ignacio Pagonabarraga, Sara Bonella, Jony Castagna and Donal MacKernan of the E-CAM project about their work in developing new software modules targeted at the needs of both academic and industrial end-users. The ongoing development
of high performance computers (HPC) is opening up new possibilities in research, helping scientists gain deeper insights into fundamental questions across all fields, from molecular dynamics, to electronic structure, to multiscale models. The E-CAM project has been established to support the ongoing development of Europe’s HPC infrastructure, as its technical manager Professor Ignacio Pagonabarraga explains. “E-CAM aims to create an infrastructure of software development, and to train users in the new HPC landscape. We liaise with industrialists, with the goal of enlarging the community,” he says. The focus of the project is on modelling materials and biological processes, work which holds wider importance to society and industry. There are three complementary themes to this work. “We develop, test, maintain and disseminate software modules targeted at enduser needs,” outlines Professor Pagonabarraga. “Second, we train academic and industrial researchers to exploit these capabilities. Third, we provide multi-disciplinary, applied consultancy to industrial end-users.”
CECAM This research builds on the network of CECAM, an organisation which aims to facilitate and catalyse international cooperation in computational science. The computational science research community was relatively small when CECAM was established in 1969, but the field has developed rapidly over the last 50 years, and CECAM’s overall mission has widened. “The community has grown, and the areas of expertise covered by CECAM have also increased accordingly,” says Professor Pagonabarraga. A number of workshops, schools and other events are organised by CECAM, activities which Professor Pagonabarraga says are central to its wider agenda. “We aim to bring people together to collaborate and work on different topics, for example questions around molecular simulations and modelling, and the development of new algorithms,” he continues. “CECAM today is organised as a network of 17 nodes distributed across Europe, and with Headquarters in Lausanne (Figure 1). These nodes also help to bring research initiatives closer to local communities.”
34
Figure 1: Map of the CECAM nodes and of E-CAM partners, showing how the E-CAM consortium is based around the CECAM’s distributed Network of nodes. E-CAM is coordinated at the École Polytechnique Fédérale de Lausanne (Switzerland), which is also where the Headquarters (HQ) of CECAM is located.
The wider aim of the E-CAM project is to develop the sophisticated software infrastructure required to fully exploit the capabilities of the emerging generation of supercomputing technologies. These machines are far more powerful than their predecessors, and their architecture is also different. “Therefore there is a need to develop this software infrastructure,” says Professor Pagonabarraga. Researchers are investigating how to model and simulate materials and biological processes on scales that have not previously been achievable. “There is no single, monolithic code that can be used with all the different types of materials and biological processes that we’re interested in. There are different types of
codes depending on the type of system, or the material properties,” explains Professor Pagonabarraga. “We are developing what we call modular software, which can be adapted to a particular type of code or family of codes. We have also created libraries that can interface with these codes to enable the effective exploitation of HPC resources.” This modular approach allows researchers to identify the specific pieces of software or workflows that may be of interest with respect to different codes. Another important activity in the project is the co-design of scientific codes to run on HPC machines. GPU-accelerated and scalable applications can help reduce the time-to-market of novel products (Figure 2).
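Strong scaling results like those in Figure 2 are commonly summarised as speedup and parallel efficiency for a fixed problem size; the helper below computes both from wall-clock times. The timings in the example are invented illustrations, not DL_MESO benchmark data.

# Sketch: strong scaling metrics for a fixed-size problem run on 1..N GPUs.
# speedup(N) = T(1 GPU) / T(N GPUs); efficiency(N) = speedup(N) / N.
# The runtimes below are made-up example values, not DL_MESO measurements.

def strong_scaling(runtimes):
    """runtimes: dict mapping GPU count -> wall-clock seconds for the same problem."""
    t1 = runtimes[1]                          # single-GPU reference time
    table = []
    for n in sorted(runtimes):
        speedup = t1 / runtimes[n]
        table.append((n, speedup, speedup / n))
    return table

if __name__ == "__main__":
    measured = {1: 3600.0, 2: 1850.0, 4: 960.0, 8: 510.0}   # invented timings (seconds)
    for n, s, e in strong_scaling(measured):
        print(f"{n:2d} GPUs: speedup {s:5.2f}, efficiency {e:6.1%}")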
EU Research
Figure 3: E-CAM developed a new method and dedicated software for designing control pulses to manipulate qubit systems based on the local control theory (LCT). The system is schematically represented above: two fixed frequency superconducting transmon qubits (Qubit 1 and Qubit 2) are coupled to a Tunable Qubit (TQ) whose frequency is controlled by an external magnetic field. Changing the frequency, the TQ behaves as a targeted quantum logic gate, effectively enabling an operation on the qubit states. M. Mališ et al., Phys. Rev. A 99, 052316 (2019)
Figure 2: Strong scaling efficiency of DL_MESO (E-CAM Meso and Multi-scale Modelling code) versus the number of GPUs for a simulation of a complex mixed phase system consisting of 1.8 billion atoms. J. Castagna et al., Comput. Phys. Commun. 251, 107159 (2020)
There are several workpackages within the project, each covering different challenges around simulation and modelling. “They involve different types of codes and systems descriptions at different scales,” says Professor Pagonabarraga. For example, one of the workpackages concerns the development of coarse-grained models and multi-scale modelling. “The general goal of coarse-graining is to identify - out of the whole detail of a system composed of atoms and molecules - how much detail is required and which atomic or molecular details we can effectively average out,” continues Professor Pagonabarraga. “There is no single answer to this question. There are different ways to approach coarse-graining, depending on the scientific or technological problem.”
of communicating with each other,” he points out. This topic is being addressed, for example, in a pilot project together with Michelin where the aim is to develop new codes to study polymer systems. “If you want to understand how polymers are connected to each other in a tyre, and the mechanical properties of a tyre, then you need to use a coarse-grain approach,” says Professor Pagonabarraga. “In another pilot project, an algorithm called GC-AdResS is being developed for both coarse-grained and more detailed descriptions of a material.” The different scientific workpackages in the project provide the flexibility to address a problem at different scales. This is an important attribute in terms of collaborating with industrial partners, as Dr Sara Bonella explains. “You might
You might find that answering a specific question posed by an industrial partner requires the use of multi-scale modelling. E-CAM covers modelling and simulations at different scales, from the atomic level right up to the coarse-grain models, providing the flexibility to address a problem
from different points of view.
Modelling and simulations
A further part of that workpackage involves multi-scale modelling, where researchers are developing software that can reproduce the detail of a system at different levels. For example it may be that one part of a particular system can be dealt with in a fairly approximate manner, while with another it may be important to capture the chemical specificity, which Professor Pagonabarraga says represents a significant technical challenge. “This is quite a complex task, as it requires different types of models and in some cases paradigms capable
www.euresearcher.com
find that answering a specific question posed by an industrial partner requires the use of multiscale modelling. E-CAM covers modelling and simulations at different scales, from the atomic level right up to the coarse-grain models, providing the flexibility to address a problem from different points of view”, she outlines. As the leader of the workpackage on quantum dynamics, Dr Bonella is working on the development of superconducting qubits for quantum computing, which again involves close collaboration with industry (Figure 3). “We’re looking at how laser pulses can be used to effectively tune the response of a group
of these qubits. By controlling this response, we hope to create different logical gates,” she continues. “This is a very ambitious objective, and we are looking at it in collaboration with a theory and simulation group at IBM.” This is a case in which the goal of the simulations is to provide indicators on experimental setups. The hope is that the set of indicators that have been gathered, in particular about what the laser should do to guide the behaviour of each qubit, can inform future experiments. The ability to model a system at multiple scales holds great relevance to the development of quantum computing. “A qubit is a quantum dynamical object, and its behaviour can be profoundly affected through something called decoherence. This is essentially the effect of the environment on the qubit, and it tends to destroy its quantum properties,” says Dr Bonella. “If these quantum properties are destroyed, then we lose the advantage to the hardware that is provided by the qubit.” A multi-scale description of a quantum system, in which the qubit is described at a quantum dynamical level, can help researchers build a fuller picture in this respect. This quantum dynamical description of the qubit can then be matched with a classical description of the environment in which it is embedded, which Dr Bonella says will help scientists understand how the environment affects its performance. “It’s another example of the importance of multi-scale modelling,” she outlines. The ability to control and tune the properties of a qubit is central to the wider goal of developing quantum computers, which could have significantly greater computational power than conventional machines. “There is an added richness that comes from the fact that you can have these linear combinations of states and entangled states and coherent transfer of information. These things exponentially multiply the computational capacity,” explains Dr Bonella. “But the development of these machines is still at a relatively early stage.”
35
Machine learning
The project’s overall agenda also includes research into machine learning and artificial intelligence, areas which have developed rapidly over the last decade as a new way of exploring science. One topic being addressed in this part of the project is building reliable models for interactions using neural-network-based approaches. “We aim to understand how to build neural networks which are correct, and which make reliable predictions on potential energy surfaces,” says Dr Jony Castagna. A neural network is based on a layered structure, where each layer is made of neurons, referred to in the project as perceptrons, which can be thought of as a mathematical model of a neuron. “We use graphics processing units (GPUs), which are able to calculate the necessary matrix-matrix multiplications very rapidly,” explains Dr Castagna. “So they really enhance the power dramatically. They effectively train the neural network, and then you can use it to make predictions.” Researchers are also considering the potential applications of these machine learning techniques. There are pilot projects within E-CAM looking to exploit machine learning for important problems in specific fields. “One of them is the modelling of metal ions in proteins. These are basically metal co-factors, and they play very important roles in certain areas of biology. In a pilot project we are using machine learning to effectively generate more accurate coarse-grained models capturing biochemically important details (Figure 4),” outlines Dr Donal Mackernan. Highly sophisticated algorithms are required to investigate changes in protein structure and function, a topic central to Dr Mackernan’s work in the project focused on the development of diagnostic protein sensors for diseases such as influenza and Covid-19. “A typical protein complex is part of a system consisting of millions of atoms. Proteins don’t exist by themselves, they’re usually in water, in salt, or another complex environment, so they are a formidable system to simulate,” he explains. A major factor here is the timescales over which biological processes take place. The timescales involved in the oscillations of hydrogen bonds in water are on the order of 10⁻¹⁵ seconds, which is very short,
36
yet the timescales at which real biological mechanisms take place are often on the order of milliseconds or seconds. “No supercomputer currently available can simulate a system to the required level of detail on that kind of timescale,” says Dr Mackernan. This is a problem that Dr Mackernan and his colleagues are working to overcome while developing a novel type of biosensor. “This biosensor is essentially a modular system, and is naturally coarse-grained,” he explains. “We can investigate how to build it and optimise it. But when you do that you’re immediately confronted by the problem that the devil is in the detail - very often you need to take atomistic effects into account.” A powerful set of techniques called rare event sampling methods is being applied to help develop the very detailed simulations that are required. “These methods are used to try and overcome this timescale problem, to go from say 10⁻¹⁵ seconds to milliseconds or seconds. A lot of software has been developed within E-CAM for that purpose. It
is about developing sophisticated statistical techniques to try and capture fleetingly short-lived events or configurations of the system critically important to the chemical or physical process,” outlines Dr Mackernan. Another pilot project, within the general activities of creating interaction potentials via neural networks, specifically focuses on using electronic structure calculations as the building blocks of the machine learning procedure. “The quantum system, specifically an electronic structure calculation, trains the neural network, so it explores and discovers the relevant parameters of a highly adjustable multi-component potential. That gives rise to an effective potential, which can be used to simulate a quantum system using classical simulation,” continues Dr Mackernan. The benefit of machine learning here is that it allows researchers to model the interactions at a small scale using quantum mechanics, from which point it’s then possible to simulate the system on much larger scales. Whereas other methods are relatively limited in terms of simulation size, Dr Mackernan says the use of machine learning opens up the possibility of simulating a system on much larger scales, and also over longer timescales. “There’s an enormous improvement, in terms of the realism of the modelling,” he says. This is not a purely academic exercise, and alongside research Dr Mackernan is keen to help potential users develop their skills through extended software development workshops. “Recently we’ve held training events on the use of machine learning to derive potentials to model quantum systems, for example,” he outlines. “Another important topic here is the extraction of reaction mechanisms. So when you try to simulate a large system, you want to understand some of the key properties of that system.”
Figure 4: Neural network potentials (NNP) with their capability to reproduce arbitrary potential energy surfaces can also be used to construct coarse-grained (CG) models. The image shows different coarse-graining levels of dendrimer-like DNA molecules (top), from a description retaining more detail (left) to a single-bead representation (right). This work is part of an E-CAM pilot project.
EU Research
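To make the neural-network-potential idea discussed above slightly more concrete, the sketch below runs a tiny multilayer perceptron that maps a vector of structural descriptors to a single predicted energy. The layer sizes, weights and descriptors are random placeholders; this is not E-CAM software, and a real potential would be trained on electronic-structure energies and forces.

import numpy as np

# Minimal sketch of a neural-network potential: a small multilayer perceptron
# maps structural descriptors of a configuration to an energy. Weights and
# inputs are random placeholders and the network is untrained.

rng = np.random.default_rng(0)

def init_mlp(sizes):
    """Create random weights and zero biases for a fully connected network."""
    return [(rng.normal(scale=0.5, size=(m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def predict_energy(descriptors, params):
    """Forward pass: tanh hidden layers, linear output (one energy value)."""
    x = descriptors
    for i, (w, b) in enumerate(params):
        x = x @ w + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return float(x[0])

if __name__ == "__main__":
    params = init_mlp([8, 16, 16, 1])      # 8 descriptors -> 1 energy
    config = rng.normal(size=8)            # stand-in for descriptors of one configuration
    print(f"predicted energy (arbitrary units): {predict_energy(config, params):+.3f}")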
Training
This commitment to providing training is an important aspect of E-CAM, and a recognition of the need to help the next generation of researchers develop their skills. Extended software development workshops organized in the project provide the opportunity for researchers to learn about software tools, simulations and modelling in today’s rapidly evolving
computing landscape (Figure 5). “Participants can work together on specific modules, on the development of software,” says Professor Pagonabarraga. Another type of training environment established within E-CAM is what Professor Pagonabarraga calls a scoping workshop. “Here we bring together industrialists, software developers and academic researchers, to discuss the challenges they face,” he continues. “It’s very important to have people with complementary areas of expertise and to have these discussions as open and broad as possible.” A number of dedicated training events are planned for investors as well, as new simulation and modelling techniques could bring great benefits to industry, helping the commercial sector work more effectively. A couple of events are planned for the next year, including one on meso-scale simulations, as well as another in collaboration with one of the project’s industrial partners Biki, an SME that develops software for drug design. “We want to work with industrial researchers, so we can train them on the use of software, and on the methodological novelties while learning from them about exciting new problems,” explains Dr Bonella.
E-CAM
An e-infrastructure for software, training and consultancy in simulation and modelling
Project Objectives
E-CAM aims at (1) developing software to solve important simulation and modelling problems in industry and academia, with applications from drug development to the design of new materials; (2) tuning those codes to run on HPC, through application co-design and the provision of HPC-oriented libraries and services; (3) training scientists from industry and academia; (4) supporting industrial end-users in their use of simulation and modelling, via workshops and direct discussions with experts in the CECAM community.
Project Funding
The E-CAM project is funded by the European Union’s Horizon 2020 research and innovation program under the grant agreement No. 676531
Project Partners
E-CAM is coordinated by CECAM at École Polytechnique Fédérale de Lausanne (EPFL), and is a partnership of 13 CECAM Nodes, 4 PRACE centres, 12 industrial partners and one Centre for Industrial Computing (Hartree Centre). Partners details can be found at: https://www.e-cam2020.eu/history/
Contact Details
Prof. Ignacio Pagonabarraga E-CAM Technical Manager CECAM Director Ecole Polytechnique Fédérale de Lausanne (EPFL) Batochime building BCH 3101 Avenue Forel 2 CH - 1015 Lausanne T: +41 21 693 79 23 E: ignacio.pagonabarraga@epfl.ch W: www.e-cam2020.eu W: www.cecam.org Left to right: Ignacio Pagonabarraga, Sara Bonella, Jony Castagna, and Donal Mackernan
Ignacio Pagonabarraga is the Director of CECAM, with headquarters at the École Polytechnique Fédérale de Lausanne, and Professor of Condensed Matter Physics at the University of Barcelona. Sara Bonella is the Deputy Director of CECAM and Principal Investigator in Computational Quantum Physics, at the École Polytechnique Fédérale de Lausanne. Jony Castagna is Computational Scientist at the Science and Technology Facilities Council, Hartree Centre. Donal Mackernan is Principal Investigator at the University College Dublin and the Director of the CECAM Irish Node.
Figure 5: E-CAM Extended Software Development Workshop at the Lorentz Center in Leiden.
www.euresearcher.com
37
Building the energy policy picture
Researchers at ETH Zurich are preparing a training dataset on design characteristics of policies which affect the deployment of renewable energy technologies. This could help in the development of algorithms to automatically identify the design characteristics of more of these policies, which would be an invaluable tool in formulating climate policy, as Dr Sebastian Sewerin and Dr Lynn Kaack explain.
A lot of attention and financial resources have been focused on the development of renewable sources of energy over recent years, part of the wider goal of mitigating climate change. Many governments have designed policies to encourage this transition and have committed themselves to ratcheting up their ambitions in line with the Paris Agreement, yet currently researchers lack a clear picture of what measures have been adopted. “We don’t really have a comprehensive understanding of what countries are doing in terms of their policy interventions,” explains Dr Sebastian Sewerin, a senior researcher and lecturer in the Energy Politics Group at ETH Zurich. This is an issue Dr Sewerin and his colleagues in a new interdisciplinary research project based at ETH Zurich are working to address. “Each country has hundreds of policies that are potentially relevant to the deployment of renewable energy technologies,” he says. “It’s important to have more data on the design characteristics of all of the relevant policies if we are to understand how effective these policies are in inducing technological change in the energy sector.” His colleague in this project, Dr Lynn Kaack, emphasizes, “Currently, students in our research group spend weeks annotating texts to create policy data. This doesn’t allow us to go much beyond analysing small selections of policies. With the help of computerized text analysis, we can speed up this process. This project is a first step in that direction.”
Uncovering policy designs: A training dataset for future automated text analysis
Sebastian Sewerin & Lynn Kaack, Energy Politics Group, Swiss Federal Institute of Technology (ETH) Zurich, Haldeneggsteig 4, 8092 Zürich, Switzerland. T: +41 44 632 47 22 E: sebastian.sewerin@gess.ethz.ch E: lynn.kaack@gess.ethz.ch W: https://epg.ethz.ch/
Sebastian Sewerin is a Senior Researcher and Lecturer at ETH Zürich’s Energy Politics Group. He is interested in policy design and in assessing long-term policy change in the energy sector. His research brings together theories and concepts from political science and innovation studies to understand the co-evolution of policy and technology. Lynn Kaack is a Postdoctoral Researcher at ETH Zürich’s Energy Politics Group, and a chair of the organization Climate Change AI. Her research applies methods from statistics and machine learning to inform climate mitigation policy. She obtained a PhD in Engineering and Public Policy and a Master’s in Machine Learning from Carnegie Mellon University.
38
Currently, students in our research group spend weeks annotating texts to create policy data. With the help of computerized text analysis, we can speed up this process.
Policy mix
The challenge here is to create relevant data from the large number of policies which affect the renewable energy sector. These policies could mean subsidies and pricing strategies designed to support renewable energy projects, for example, as well as more small-scale initiatives. “We’re looking at a very broad range of policy instruments, including things like complex feed-in tariffs or simple information campaigns. You will find different types of policies in a policy mix,” outlines Dr Sewerin. The first part of the project’s work centres around building a text corpus of energy policies. “We started with federal renewable energy policies in the US, and also include a selection of EU policies,” continues Dr Sewerin. “The next step then would be to label these individual policies according to the policy design characteristics that we’re interested in and start building up the training dataset.”
Design characteristics that are relevant for a policy’s effectiveness are, for example, an ambitious goal and clear implementation rules. A second set of indicators is more specifically related to the energy technologies themselves. “We’re also interested in how technology-specific certain policies are. Are they targeting renewable energy technology overall? Or are they targeted more towards specific renewable technologies, like solar PV or wind?” explains Dr Sewerin. A variety of policy design options are available to help stimulate the development of renewable energy, and Dr Sewerin believes they all have a role to play in terms of meeting the emissions reduction goals set out in the Paris Climate Agreement. “The idea is that policies should be well-designed. Unfortunately, political discussions often focus on the perceived merits of policy instrument types, such as carbon pricing. This is not helpful at all, we should be talking about good design,” he stresses. The wider aim in the project is to use the training dataset that has been gathered to develop algorithms that can then pick up these policy design characteristics in other texts. “If a future algorithm is able to pick up on these policy design characteristics, we will have a much better understanding of the differences between countries and over time. This would help us tremendously to identify those policy mixes that are effective and thus keep a chance to limit climate change to 1.5°C,” explains Dr Sewerin.
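As an illustration of the kind of automated text analysis such a training dataset could eventually support, the sketch below trains a simple bag-of-words classifier to flag one hypothetical design characteristic, namely whether a policy text is technology-specific. The example snippets, the labels and the choice of scikit-learn are assumptions made for illustration; the project's corpus and coding scheme are its own.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Illustrative sketch only: a tiny text classifier for one hypothetical design
# characteristic ("is this policy technology-specific?"). The snippets and labels
# below are invented; a real training dataset would be far larger and richer.

texts = [
    "Feed-in tariff guaranteeing a fixed price for solar PV generation for 20 years",
    "Investment grant scheme reserved for offshore wind demonstration projects",
    "Renewable portfolio standard requiring 30% renewable electricity by 2030",
    "Public information campaign promoting household energy efficiency",
]
labels = [1, 1, 0, 0]   # 1 = technology-specific, 0 = technology-neutral (toy labels)

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

new_policy = "Tax credit available only for rooftop solar photovoltaic installations"
print("technology-specific" if model.predict([new_policy])[0] else "technology-neutral")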
The 21st session of the UN Conference on Climate Change in Paris 2015.
EU Research
The colour of ancient life Fossils provide evidence of ancient life, including the colour of insects, reptiles, birds and mammals as old as hundreds of millions of years. Researchers in the ANICOLEVO project are applying high-tech methods to analyse fossils and gain new insights into how colour has evolved over deep time, as Dr Maria McNamara explains. Many of us
think of palaeontology as a descriptive discipline, in which researchers collect and describe fossils to build a deeper picture of ancient life. This is not the only way to study palaeontology, however, and as the Principal Investigator of the ANICOLEVO project, Dr Maria McNamara is applying sophisticated techniques to study both the microstructure and chemistry of fossils. “We aim to find evidence of colour in animals, and to understand how colour has evolved over deep time,” she outlines. This includes using synchrotron X-ray fluorescence and X-ray absorption spectroscopy to analyse metal distributions in fossils. A key focus of her group’s work is fossils that preserve evidence of melanin, a pigment best known for its role in determining skin colour. “We can map metals across the surfaces of fossils, and understand how the metals are bound to the organic material, which is effectively polymerized and degraded melanin. This chemical information helps us understand which metals are more likely to have been present in life, in turn shedding light on the functions of melanin long ago,” explains Dr McNamara.
Melanin This research is grounded in a comprehensive analysis of extant animals. Part of the project involves characterising melanin in modern vertebrates to provide a solid basis for comparison. “Melanin is synthesised by melanosomes, tiny cell organelles that we all have in our hair and skin,” says Dr McNamara. From a chemical point of view, there are two main types of melanin, eumelanin and phaeomelanin. “These two forms of melanin are produced in different ways and have different optical effects. They also have quite different effects on the body,” continues Dr McNamara. “Phaeomelanin actually seems to be quite damaging. There’s a debate about why vertebrates evolved phaeomelanin if it has negative effects. It’s useful for visual signalling, especially camouflage, but there’s clearly an evolutionary trade-off. There are lots of unanswered questions about when and why different types of melanin evolved.” The project team are analysing fossils dating back to the Carboniferous period
www.euresearcher.com
Fossil melanosomes - cell organelles rich in the pigment melanin.
Mapping trace elements associated with fossil melanosomes to identify melanosomes in the skin and internal organs using synchrotron-X-ray fluorescence. Fossil snake skin preserved with high ultrastructural fidelity in three dimensions, including details of preserved chromatophores (prominent globules in this image).
Modern analytical techniques can reveal chemical and microstructural information on pigments and other coloration mechanisms in fossils tens to hundreds of millions of years old, helping us understand the evolution of colour and its functions through deep time.
from a variety of localities, including ancient lake environments and river systems, to address these types of questions. One major finding relates to the colours of ancient organisms. “Most fossils don’t preserve the full complement of colour-producing cells in the skin. We have shown that minerals can form an accurate 3-D template of the original tissue, and in those cases you can preserve all colour-producing cells. That’s when you can
reconstruct fossil colour with confidence,” says Dr McNamara. Another key finding relates to the distribution of melanin in the body. “Surprisingly, we showed that internal melanosomes are actually very widespread. We found them in every single internal tissue and organ, in modern amphibians, reptiles, birds and mammals, plus in their fossil ancestors. Obviously melanin doesn’t just colour our integument, but also has some cryptic and pretty important physiological functions.” Dr McNamara and her colleagues have also used melanosomes to shed light on the affinities of enigmatic fossils. “For instance, there has been a lot of debate over the last few years about the classification of an ancient fossil called Tullimonstrum, which is about 300 million years old. It’s been interpreted as being everything from a mollusc, to a vertebrate, to an arthropod,” she outlines. “It contains layers of differently-shaped melanosomes in its eyes, which was thought to be a uniquely vertebrate characteristic. We found melanosomes, however, in the eyes of modern squid and their relatives. What’s more, the chemistry of Tullimonstrum’s eye melanosomes suggests a closer affinity to invertebrates after all.”
ANICOLEVO Animal coloration through deep time: evolutionary novelty, homology and taphonomy Dr Maria McNamara, Senior Lecturer School of Biological, Earth and Environmental Science University College Cork Distillery Fields, North Mall Cork T23 TK30, Ireland T: +353 21 490 4570 E: maria.mcnamara@ucc.ie W: http://mariamcnamara.ucc.ie W: www.ucc.ie/en/bees/
Dr Maria McNamara is a palaeobiologist in the School of Biological, Earth and Environmental Sciences at University College Cork.
Nature-based drainage systems can provide solutions to managing excessive rainwater in highly developed, urban areas. Credit: Rudmer Zwerver.
Nature-based solutions to sustainability challenges
Nature-based solutions are an important part of efforts to combat climate change and can help cities address other sustainability challenges, yet they have not been adopted on a widespread basis. The Naturvation project brings together researchers from several disciplines to unlock the wider potential of nature-based solutions, as Professor Harriet Bulkeley explains.
The issue of climate change is high on the agenda in many European cities, as they seek to reduce carbon emissions and protect the local environment. While climate change is a prominent issue, it is not the only challenge facing city authorities, who are also looking for effective ways to deal with a wide range of other concerns. “It's apparent that as cities are responding to climate change, they're trying to do it in a way that allows them to also meet other sustainability goals, such as those around economic regeneration, social inclusion, or health and wellbeing,” says Harriet Bulkeley, Professor of Geography at Durham University in the UK. Nature-based solutions (NBS) can be an effective way to address some of these challenges, a topic at the heart of Professor Bulkeley's work as the Principal Investigator of the Naturvation project. “The project is looking at how cities can realise multiple sustainable development goals,” she outlines.
Nature-based solutions
A well designed NBS can play an important role in this respect, for example through helping to create recreational areas and improve biodiversity. The idea of a NBS is not easy to define, but Professor Bulkeley says it can be broadly thought of as working with nature in cities in order to achieve sustainable development goals. “For example, if a local community is trying to restore a small park at the end of a street for leisure and recreation, and to create habitats for birds, butterflies and bees, then that's a NBS,” she explains. Researchers in the project aim to build a deeper picture of the NBS that are being developed in different cities across the world, part of the goal of encouraging their wider implementation. “We started off by surveying 100 different cities in Europe, and looking at the NBS that were being implemented in each of those cities,” continues Professor Bulkeley. “We were interested in looking at solutions on all sorts of different scales.”
This data is central to the development of the Urban Nature Navigator, a multi-criteria decision-making tool intended to help decision-makers identify which type of solution is best suited to their specific local priorities (a simple scoring sketch of this idea appears below). If the plan is to build blue infrastructure in a city, then that is likely to contribute to goals around biodiversity and climate change for example, but not necessarily others. “Most blue infrastructure projects, as far as we can tell, don't do very much for social inclusion or economic regeneration,” says Professor Bulkeley. The project's agenda also includes some work on the economics of NBS, which Professor Bulkeley says is an important consideration. “We think it's important to achieve an understanding of the economic value of NBS, because ultimately somebody has to decide on whether to pay for them,” she points out. “How would you design a business model? What business models have been successfully used with NBS?” An element of a business model might be about generating revenue, while another
could be about avoiding future costs, for example in reducing flood risk. The project's analysis shows that any one single business model on its own is unlikely to support NBS however; instead Professor Bulkeley talks about needing to 'stack' them together. “It's about bringing two or three business models together in order to actually realise NBS,” she says. The financial model of NBS is an important element of the project's work. “We've been looking at what it will take to mainstream NBS amongst investors and financial institutions. We're finding that there is great potential for NBS to be financed by the insurance sector, and also by institutional investors,” continues Professor Bulkeley. “The insurance sector has got a really key role to play in terms of the potential for NBS to help reduce risk, and therefore reduce premiums.” The insurance sector has not historically been involved in this area, so a lot of support will be required. However, insurance companies and institutional investors like pension funds tend to adopt a long-term outlook, so could be encouraged to invest in NBS. “I'm particularly interested in institutional investors who are also landholders, like the National Trust. Not only are they long-term in their investment planning, but they also hold assets which they could be investing in themselves,” outlines Professor Bulkeley. Alongside looking at the financial side, Professor Bulkeley is also investigating the importance of policy and regulation in making NBS more of a mainstream option, as well as the role of the urban development industry. “Somebody has to build a NBS. So you have to know what it means to put a green roof on your building, or to leave enough space in design plans for a neighbourhood to create an ecological corridor,” she says.
It's apparent that as cities are responding to climate change, they're trying to do it in a way that allows them to also meet other sustainability goals, such as those around economic regeneration, social inclusion, or health and wellbeing.
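The Urban Nature Navigator mentioned earlier is, at its core, a multi-criteria scoring exercise: candidate interventions are rated against the sustainability goals a city cares about and weighted by local priorities. The sketch below illustrates that general idea only; the interventions, scores and weights are invented and do not come from the project.

# Hypothetical multi-criteria ranking of NBS options against local priorities.
# Interventions, criterion scores (0-5) and weights are invented for illustration.
options = {
    "blue infrastructure": {"climate": 5, "biodiversity": 4, "social inclusion": 1, "regeneration": 1},
    "pocket park":         {"climate": 2, "biodiversity": 3, "social inclusion": 4, "regeneration": 3},
    "green roofs":         {"climate": 4, "biodiversity": 2, "social inclusion": 1, "regeneration": 2},
}

# A city that prioritises social inclusion and regeneration over climate alone.
weights = {"climate": 0.2, "biodiversity": 0.2, "social inclusion": 0.35, "regeneration": 0.25}

def weighted_score(scores: dict) -> float:
    # Weighted sum of the criterion scores for one intervention.
    return sum(weights[criterion] * score for criterion, score in scores.items())

for name, scores in sorted(options.items(), key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{name:20s} {weighted_score(scores):.2f}")

With the weights tilted towards social inclusion and regeneration, the blue infrastructure option no longer comes out on top, echoing the point Professor Bulkeley makes above about such projects serving some goals better than others.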
Implementation of NBS
A variety of different factors need to be considered in the implementation of a NBS, including the views of people in the communities affected. Transdisciplinary partnerships have been established in each of the cities that the project is working in, so that researchers can understand what NBS mean from the perspective of all the different actors involved. “We've analysed how to create the
governance arrangements that can support the implementation of NBS,” explains Professor Bulkeley. NBS are implemented primarily in partnerships, sometimes bringing together the public and private sectors, which can be a challenge for the organisations involved. “Organisations don't always work to the same timelines, and they may have different priorities. The private sector will sometimes want quick wins, while sometimes private companies are willing to bear unpopularity and ignore electoral cycles, in a way that local authorities aren't,” says Professor Bulkeley. The aim here is to understand the essential ingredients of good partnerships and identify where governance arrangements work well. Local authorities also need to consider the impact of NBS in areas like the property market, another topic of interest to Professor Bulkeley. “There's a lot of concern about 'green gentrification', the idea that if you bring NBS into a neighbourhood, the overall value of property will increase,” she outlines. This will affect the viability of NBS as a regeneration strategy. “A NBS may improve a community's air quality and give them access to recreational space, but if people can no longer afford to live there then they will not fully benefit,” says Professor Bulkeley. “Effectively you're making the situation more inequitable. That is a really big challenge for NBS – and we see plenty of evidence that that's happening.” A NBS would ideally not be imposed from above, but rather be based on the involvement of the local community. This may slow up
NATURVATION Nature Based Urban Innovation
Project Objectives
To unlock the potential of NBS for sustainable urban development, NATURVATION will take a transdisciplinary, internationally comparative approach to: • Advance assessment approaches to capture the multiple impacts & values of NBS to deliver a robust evidence base for decision-making; • Enable innovation to identify the most promising governance, business/finance and participation models and how to overcome the systemic conditions that currently limit their use to support systemic integration; • Generate momentum to realise the potential of NBS through co-design, co-development & co-implementation of new partnerships, knowledge, recommendations, processes and tools required to build capacity, enable replication and foster cultural change.
Project Funding
Funded by Research and Innovation action (RIA). Total funding: € 7 797 877.50
Project Partners
• Durham University • Central European University • Lund University • Universidad Autònoma de Barcelona • Utrecht University • Ecologic Institute • ICLEI Europe • Leibniz Institute for Regional Geography (IFL) • PBL Netherlands Environmental Assessment Agency • ENT environment and management • Centre for Economic and Regional Studies (Hungarian Academy of Sciences) • Newcastle City Council • City of Malmö • City of Utrecht
Contact Details
Project Coordinator, Professor Harriet Bulkeley Durham University, Department of Geography, Lower Mountjoy, South Road, Durham, DH1 3LE, United Kingdom T: +44 191 3341800 E: info@naturvation.eu W: https://www.naturvation.eu Professor Harriet Bulkeley
Harriet Bulkeley holds joint appointments as Professor in the Department of Geography, Durham University, and at the Copernicus Institute of Sustainable Development, Utrecht University. Her research focuses on environmental governance and the politics of climate change, energy and sustainable cities.
Green walls promote biodiversity, improve air quality and address heat stress in cities. Credit: Evannovostro.
the implementation process in some cases, particularly with larger projects, yet Professor Bulkeley says certain things can be done quite rapidly. “I’ve been fascinated to see growing interest in the idea of miniature forests – small, tightly packed forests which get communities involved and have fast-growing species,” she outlines. These small patches of green space could then potentially form part of more ambitious schemes in future. “For example, making a certain area a bit greener, even with temporary structures, could help create a wildlife corridor for bees or butterflies,” explains Professor Bulkeley. “Cities like Edinburgh have been looking at how to connect these existing green spaces by making small interventions which could then have a wider impact, because they help create ecological corridors.” There is also a lot of interest in using space for these solutions on a temporary basis, as a way of building knowledge in the community and demonstrating the impact of NBS. Communities in Barcelona and in Leipzig – both partner cities in the project – have been given the ability to develop temporary gardens on land that is not currently being used for regeneration. “Communities are able to use the land for a year and establish gardens,” says Professor Bulkeley. This land may eventually be used for its original intended purpose, but Professor Bulkeley nevertheless believes there is a lot of value in allowing temporary NBS, particularly as Europe recovers from the impact of Covid-19. “If we’ve got shops closing down on high streets, or an area where a housing development can no longer go ahead, what are we going to do with all that space? Could we think about allowing some form of temporary NBS in those areas?” she asks. The more immediate objective in the project is to further improve the Urban Nature Navigator,
while a visualisation tool is also being developed, which is intended to help urban planners imagine a particular area of the city when NBS have been implemented. Professor Bulkeley says the project is also engaged in modelling work. “Our partners are doing some work on ‘what if’ scenarios. So, what if half of the roofs in European cities were green roofs? What would that do for our level of flood risk?” she outlines. The wider aim in the project is to help shift the perception of NBS, to a point where they are seen as a natural option for dealing with sustainability challenges. “We’re interested in what it will take to make NBS more normal, for finance, regulation and urban development. We hope to release a series of reports designed to identify particular steps that policy-makers, business or the finance industry can take towards making NBS more normal,” says Professor Bulkeley.
Sustainable drainage systems can generate multiple benefits. Credit: Arina P Habich.
Mitigating the effects of hydropower plants
Hydropower is an important source of renewable energy, yet hydropower plants can also represent a significant threat to the health of fish and the ecology of their habitats. We spoke to Professor Peter Rutschmann about the work of the European research project FIThydro in developing innovative solutions to mitigate the impacts of hydropower plants.
A significant proportion of Europe's electricity is generated by hydropower plants, which provide a reliable, sustainable supply of energy to companies and consumers. While hydropower is a major source of renewable energy, it is also important to consider the impact of hydropower plants on the surrounding environment and aquatic life. “Many hydropower plants – especially the older ones – have serious effects on the environment,” says Professor Peter Rutschmann. This is an issue at the heart of the FIThydro project, an EU-funded project which brings together 26 science and industry partners from across Europe. “The aim of the project is to find solutions for mitigating the effects of hydropower plants in an economical and cost-effective way,” explains Professor Rutschmann, the project's Coordinator.
Hydropower plants
This encompasses the effects of a hydropower plant on both fish and the habitat in which they live. One major problem is that a bypass is often not available for fish migrating downstream past a hydropower plant, and if it is, fish may not use the bypass – or fishway – leaving them highly vulnerable. “Often fish don't use the fishways because there is relatively little water in them, compared to the water going through the turbines. If they travel through the turbine, then there is a likelihood of them being killed. The estimated damage or mortality caused by the turbine passage used to vary considerably, depending on the formula available for calculation. With our research, we can now predict this likelihood quite precisely,” says Professor
Visit at the hydraulic laboratory VAW at the ETH Zurich, Switzerland.
Rutschmann. Hydropower plants also affect the environment. “Most hydropower plants have a reservoir. The velocity of the water, the water depth, and also the granulometry of the reservoir bed, are completely different to what they were before,” continues Professor Rutschmann. The conditions for fish in rivers today are thus often quite different to those that their predecessors experienced, which makes it difficult for them to find or reach their spawning grounds. This is one of the more serious examples of the impact of hydropower on the environment, an issue Professor Rutschmann and his colleagues are addressing in the project. “We have to look at that, and to find solutions to improve the habitat,” he outlines. This includes
both enhancing existing solutions and developing new tools to mitigate the impact of hydropower plants and comply with the European Water Framework Directive. “We are focusing on rivers. The important point with rivers, in terms of the Directive, is that all aquatic species should be able to travel both upstream and downstream,” says Professor Rutschmann. A hydropower plant represents a significant barrier in this respect. Now researchers are investigating the behaviour of fish using data from different test case studies across Europe, part of the wider goal of helping to protect and sustain fish populations over the longer term. Fish have adapted to their environment over the course of evolutionary history, yet they have not adapted to the
The hydropower plant Freudenau on the Danube in Vienna, Austria, is one of the test cases in the FIThydro project. © Verbund
Additional fish habitats are created in this fish pass at the river Günz, Germany. © Mathias Schlagenhauser
The FIThydro Consortium.
To study fish movement, fish are tagged and then detected with a portable antenna at the test case Bragado, Portugal. © Hidroerg
At the test case Bannwil, Switzerland, routes and impacts of downstream migration are investigated. ©BKW Energie AG
Computational hydrodynamic model of the hydropower plant Bannwil. © VAW, ETH Zurich
Measurements of substrate composition at the test case Schiffmühle, Switzerland. © VAW, ETH Zurich
The innovative Barotrauma Detection System sensors measure the pressure fish experience during turbine passage. © Jeffrey Tuhtan
presence of hydropower plants. “If we could understand how they behave, then we could build much better hydropower plants,” says Professor Rutschmann. “We are conducting experiments to observe them, and from this we can see that different species behave differently. We've done tests with fish in the field at test case sites as well as in the laboratory, and we've learned that fish have individuality.” The evidence shows for example that some trout behave very differently to others, so while it is possible to statistically assess how certain species will behave, there will be exceptions. Once researchers have a deeper understanding of the behaviour of the fish, they can then look at measures to help them avoid hydropower plants and reduce mortality rates. A variety of different species may be found in a river, including catadromous and freshwater fish. One well known species investigated in the project is salmon, while researchers are also investigating other less well-known species, such as the bullhead, a protected fish on the red list of threatened species, which led to some interesting insights. “The bullhead is a very bad swimmer. However, in our experiment it survived all turbine passages, whereas we had high mortalities with trout and other strong swimmers,” Professor Rutschmann explains. Evidence showed there is a lower mortality rate with bad swimmers, which led researchers to look at the idea of electrofishing and to develop a protection device that can be installed in front of a turbine. “With this device the fish is quickly paralysed, and the mortality rate is reduced by a factor of 2. This is a relatively simple device, and it can be applied on existing turbines,” says Professor Rutschmann.
The aim of the FIThydro project is to mitigate the effects of hydropower plants on fish in an economical and cost-effective way.
The project is also involved in developing a number of other innovative devices, which can be used to more accurately assess the fish mortality rate around hydropower plants. The path of healthy fish through a turbine can be computed using complex, 3-dimensional tools, which can lead to deeper insights. “The turbine is modelled, the velocity and turbulences are modelled, and then a sort of virtual fish is passed through this computed flow field. You can see what pressure changes it will experience, and whether this virtual fish will be hit by the turbine blades,” outlines Professor Rutschmann. “From this data you can then assess whether the fish will survive or not. We have also tried to adapt this approach to different river conditions.” Such computations are important for hydropower operators to see if and how they might need to adapt the turbine operation mode to enable a safer turbine passage during migration periods.
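The "virtual fish" approach Professor Rutschmann describes can be pictured, in very simplified form, as advecting a passive particle through a pre-computed flow field and tracking the pressure it experiences along the way. The toy sketch below uses an invented two-dimensional flow and pressure field and an arbitrary decompression threshold; the project's actual tools are full three-dimensional CFD models, so none of the numbers here come from FIThydro.

# Toy "virtual fish" passage: advect a particle through a made-up 2-D flow
# field and check the rate of pressure drop it experiences. Illustrative only.
import numpy as np

def velocity(x, y):
    # Invented converging flow accelerating towards a "runner" plane at x = 10 m.
    return np.array([2.0 + 0.3 * x, -0.5 * y])

def pressure(x, y):
    # Invented pressure field [kPa] that falls steadily towards the runner.
    return 200.0 - 15.0 * x

dt = 0.01                      # time step [s]
pos = np.array([0.0, 1.0])     # starting position [m]
track = []

while pos[0] < 10.0:           # integrate until the virtual fish reaches the runner plane
    track.append(pressure(*pos))
    pos = pos + dt * velocity(*pos)   # forward-Euler advection of the virtual fish

pressures = np.array(track)
max_drop_rate = np.max(-np.diff(pressures) / dt)   # fastest decompression [kPa/s]

# Purely illustrative barotrauma criterion, not a FIThydro threshold.
print(f"fastest pressure drop experienced: {max_drop_rate:.1f} kPa/s")
print("barotrauma risk flagged" if max_drop_rate > 100.0 else "within illustrative limit")

In a real workflow the velocity and pressure fields would come from the turbine simulation itself, and the decompression history would be assessed alongside the probability of a blade strike, as described above.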
Test cases
There are 17 test cases within the project, located in four regions which are broadly representative of European river systems. Scandinavia is a very important region in terms of hydropower production, while test cases have also been established in the Alps, France/Belgium and the Iberian Peninsula. “In the Iberian Peninsula we have a very rare fish species, which is well-adapted to very dry Summer conditions,” says Professor Rutschmann. This species may however be adversely affected if water is redistributed due to hydropower operation and more generalist species are attracted to the area. “This original Iberian fish species may have difficulty in adapting to the presence of a generalist. So these changes in water distribution can lead to the loss of this very specific species,” continues Professor Rutschmann.
The impact of climate change is another important consideration in the project, with rising water temperatures affecting the distribution of different species. Fish tend to migrate in search of suitable conditions, so in some cases they may swim upstream to find areas with lower water temperatures, yet Professor Rutschmann says this can be challenging for the fish itself. “Upstream you have higher velocities, because of the slope. Some fish may not have the swimming ability to cope,” he points out. A lot of data has been gathered in the project on how different species swim, which will help researchers build a clearer picture in this respect. “We've looked at how fish swim upstream, and we've seen clear differences between species,” outlines Professor Rutschmann. “This will help to design better solutions such as fishways and migration habitats that enable different species to successfully migrate upstream past a hydropower plant.” The research results will feed into the development of a risk-based Decision Support System, to help operators, engineers and water authorities work effectively and mitigate the impact of a hydropower plant. Additionally, a Fish Population Hazard Index has been developed, through which the threat
posed to fish species by hydropower can be evaluated, while Professor Rutschmann and his colleagues are also working on a Cumulative Impact Assessment tool. “This will help us to understand what happens to fish which have to pass round a series of hydropower plants. For example, a salmon, when entering into a river system, may have to negotiate several hydropower plants,” he says. “We have developed a series of innovative solutions, tools and devices that can be employed for different aspects of hydropower impact mitigation.” A number of these decision-making tools are designed to be used during the refurbishment of hydropower plants. However, there is no single perfect solution for all cases, as Professor Rutschmann says hydropower is very site-specific. “The ideal solution for each site may be different,” he acknowledges. The cost of these solutions is another important consideration in terms of the project’s wider agenda. “There is a clear need to mitigate the effects of hydropower plants on ecology and fish. We sometimes see solutions that are effective in terms of mitigation, but which are not very economic. It is important in the project to find the right trade-off between costs and efficiency,” says Professor Rutschmann.
FIThydro Fishfriendly Innovative Technologies for Hydropower
Project Objectives
The project’s aim is to develop cost-effective environmental solutions, strategies and measures to avoid fish damage and increase the ecological compatibility of hydropower production. For this, technologies, methods, tools and devices are applied and enhanced at test sites across Europe. The results from the research are combined in different online accessible tools that help practitioners evaluate, plan and find solutions for fishfriendly hydropower.
Project Funding
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 727830.
Project Partners
• TUM • FVB.IGB • CNRS/PPRIME • IST-ID • SINTEF • ECOLOGIC INSTITUTE • ETHZ • UHULL • TALTECH • SJE • EV INBO • NTNU • GEA-ITAGRA • LEW • UNIPER • VOITH HYDRO • AF-CONSULT • BKW • LKW • HIDROERG • FLUSSBAU iC • FISHCONSULTING • SAVASA • STATKRAFT • VERBUND • SWECO • https://www.fithydro.eu/consortium/
Contact Details
Project Coordinator, Professor Peter Rutschmann Technical University of Munich Arcisstr. 21 80333 Munich Germany T: +49 89 289 23161 E: info@fithydro.eu W: https://www.fithydro.eu/ : @fithydroproject Professor Peter Rutschmann
Peter Rutschmann is a full professor at the Technical University of Munich. He has 40 years of experience in hydraulic engineering and expertise in physical and numerical as well as hybrid modelling. He has managed some 50 hydropower projects, 35 sediment and flood management projects and also a few eco-hydraulic projects. He is one of the inventors of the innovative TUM hydroshaft powerplant and owns 8 patent families. Peter Rutschmann is a member of IAHR and the coordinator of the FIThydro project.
The nature-like fishway at the hydropower plant Freudenau enables fish to migrate upstream. © Verbund
A new viewpoint on human rights: the more-than-human
Water is more than just an essential resource, it's also a living being that needs to be respected, according to indigenous peoples. We spoke to Dr Lieselotte Viaene about the work of the RIVERS project in investigating how human rights and international law can move beyond their anthropocentric dogma by learning from the different relationships that indigenous peoples have with water and nature.
The UN General Assembly recognised water as a human right in 2010, essentially stating that everybody should have access to drinkable water. The concept of water as a commodity, as a resource for humans to use and exploit, is inherent within this legal framework, but now researchers in the ERC-backed RIVERS project aim to take a critical, broader view. “The project wants to look further, to see water not only as a natural resource. We aim to understand better the plurality of ways of knowing and relating to water and nature, focussing on indigenous peoples,” outlines Dr Lieselotte Viaene, the project's Principal Investigator. This research touches on water conflicts, which are a concern both north and south of the equator. “Water controversies are not exclusive to the global South any more,” stresses Dr Viaene. “We want to see what the global North can learn from the global South and how the dominant legal and human rights framework can move beyond the human.”
RIVERS project
There are three levels to this research, which brings together legal anthropology and human rights. In one part of the project Dr Viaene and her colleagues are working at a local level in Asia and Latin America, investigating the relationship between indigenous communities and the water around them; many territories are affected by natural resource development projects. “The main goal of the project is to provide rich, new empirical data, at the level of the local indigenous communities through participatory legal anthropological research,” she says. A lot of the knowledge that indigenous communities hold about their environment is not necessarily written down however, or held in a form that western academics would recognise. “For many
A dying river in Alta Verapaz, Guatemala.
indigenous peoples, knowledge derived from dreams, or consultations of the mountains and fire ceremonies, is very important,” continues Dr Viaene. “But from a western point of view that’s not recognized as knowledge, and there is a huge degree of tension there. There are conflicts about what forms of knowledge exist, showing that there are many worlds and ontologies.”
Indigenous communities have been severely affected by hydroelectric dams and demand for water from extractive industries. We want to look at what we can learn from radically different visions and practices on water, territory and land.
A mountain to Western eyes might be simply a geological formation, but in some parts of the world indigenous peoples view them and other natural features as living beings worthy of dignity and respect. There is again a tension between this type of viewpoint and the neoliberalisation of nature, where the capitalist economic model has sought to harness the power of water and nature. “Indigenous communities have been disproportionally affected by hydroelectric dams and demand for water from extractive industries such as mining and oil extraction. We want to look at the many conflicts which arise when indigenous visions and practices on water, territory and land encounter dominant western legal theories and practices. Environmental law, for example, also sees nature and the environment as property, something external of us, as a resource,” points out Dr Viaene. This dominant legal positivist assumption of nature-environment has clear colonial roots, which needs to be questioned in the face of the environmental degradation the world is facing. Many countries across the world are considering granting rights to rivers and other elements of nature, as a way of protecting them, which however raises further complex questions. Historically efforts to protect nature and the environment have often been promoted on the basis of the benefits they will provide to humans in terms of health and wellbeing, without considering the intrinsic
value of the environment itself. These kinds of assumptions are being challenged by indigenous knowledges and ontologies, and also by this emerging legal movement to promote the rights of nature. “This movement promotes the idea that nature is part of us, and that we are part of nature. It draws on indigenous thinking and Western holism,” explains Dr Viaene. There are plans to conduct field research in Nepal, Guatemala, and Colombia during the project, which Dr Viaene hopes will help encourage dialogue between the global North and South, as well as within the global South. “The design of the project is
RIVERS Water/human rights beyond the human? Indigenous water ontologies, plurilegal encounters and interlegal translation Project Objectives
RIVERS engages with one of the most pressing questions of this century: the relationship between humans and “nature”. The project aims to produce ground-breaking knowledge, from an empirical, interdisciplinary and dialoguing perspective, about the contentions and challenges intrinsic to reconceptualising human rights with different ways of understanding and relating to water.
Project Funding
RIVERS is funded by an ERC Starting Grant (Grant agreement No. 80400) 2019-2024.
RIVERS research design
bottom-up and empirical. We want to critically contribute to those debates around the rights of rivers for example,” she says. This research holds wider relevance in terms of human rights and environmental law at the level of the state, another topic that is being addressed within the project. When their human rights are violated, indigenous peoples may choose to take their case to court. “The court may then be confronted with these concepts of water as a living being, or of dreams about the river as legal evidence,” says Dr Viaene. This may sound radically different from the mainstream western idea of a logical legal case, an issue which Dr Viaene says is central to the RIVERS project. “There you have an interlegal encounter between different worlds, and that's what the project would like to highlight. There is not just one world, there are many worlds - how can we understand those other worlds?” she continues. “We would also like to have an influence on the policy level and share our findings with policymakers. The then UN Special Rapporteur on the Rights of Indigenous Peoples, Vicky Tauli-Corpuz from the Philippines, and the Chair of the UN Permanent Forum on Indigenous Peoples, Anne Nuorgam from Norway, attended the launch of the RIVERS project in 2019, and participated in the launch seminar - Rights of Nature: legal revolution or ontological conflicts.”
Indigenous peoples
A RIVERS blog and YouTube channel with recordings of the seminars and workshops have been set up, and podcasts are planned to help bring the project's findings to a wider audience, while researchers are also working to amplify the voices of indigenous peoples, for example through a blog titled 'Nature and its Rights' (https://rivers-ercproject.eu/blog/). One outcome from the project will be a collaborative documentary, built on research in Nepal and Guatemala. “This will be a way to communicate what we try to convey in our academic writing, but towards
a broader audience such as lawyers, judges, students and NGOs,” outlines Dr Viaene. The Covid-19 pandemic has led to much of the field research being put on hold for the moment, but there are plans to resume it when restrictions are lifted, while Dr Viaene also plans to collaborate with the UN. “In November, there will be a session of the UN Expert Mechanism on the Rights of Indigenous Peoples, a subsidiary body of the UN Human Rights Council. They are currently doing a study on the right to land of indigenous peoples, within the international human rights framework,” she says. The RIVERS project submitted comments on the draft version of this study in June, which will be discussed in November. The Expert Mechanism brings together a broad range of indigenous representatives, and Dr Viaene is keen to play a role. “We want to collaborate with some of the mandates within the UN relating to indigenous peoples, and to participate in discussions between academics, indigenous peoples and also policy-makers,” she explains. For example, recently the RIVERS project co-organized - together with the Ethnic-Racial Commission of the Special Jurisdiction for Peace in Colombia - an online Word Circle with indigenous lawyers and leaders from Colombia, Guatemala, Ecuador and Brazil about Territory and Nature as a victim of human rights violations. This reflects an increased openness to other points of view in the global North, yet it's not easy to change an established mindset, as many of these other forms of knowledge have historically been excluded and marginalized. However, Dr Viaene believes it's important to reflect critically on established legal practices, create interlegal dialogues and question colonial origins of international law and human rights. “That's where we want to collaborate, through participatory research, and to see what we can learn from other forms of knowledge which are embedded in other ontologies or worlds related to nature and to water,” she says.
Contact Details
Lieselotte Viaene, Ph.D Professor Department of Social Sciences Principal Investigator Universidad Carlos III de Madrid Spain E: lviaene@clio.uc3m.es W: www.rivers-ercproject.eu : Rivers ERC project : Rivers ERC project Images opposite left page from the RIVERS International Launch (2019) Top left: Dambar Chemjong (Nepal), Belkis Izquierdo (Colombia), Anne Nuorgam (Norway) Top right: Guillermo Fernández-Maldonado (Colombia), Vicky Tauli-Corpuz (Philippines), Lieselotte Viaene (Spain) Lieselotte Viaene, Ph.D
Lieselotte is a Belgian legal anthropologist with a Ph.D in Law and her interests include indigenous ontologies, legal pluralism, indigenous peoples' rights, and transitional justice. Over the past fifteen years she has been collaborating as a human rights academic and practitioner with indigenous peoples in Central and South America.
How does hybridization affect ecological processes?
There are over 200 species of Daphnia (waterfleas), a type of crustacean found in a variety of aquatic ecosystems, and some of these species can hybridize with each other. We spoke to Dr Piet Spaak and Dr Justyna Wolinska about their work in investigating the extent to which hybridization and parasitism influence major ecological processes.
Daphnia are a type of small crustacean that live in the open water and are found in many aquatic ecosystems, including lakes and ponds across Europe. They eat algae and are themselves food for fish, which makes them crucial in aquatic food webs. There are over 200 different species of Daphnia, including Daphnia longispina and Daphnia galeata, and these species can hybridize with each other, a topic central to Dr Piet Spaak's research. “I've been interested in hybridization between different species since 1990. Around 20 years ago there was a lot of interest among the research community in the role of parasites in ecosystems. I thought: Is this something that influences the success of parental species in hybrids?” he outlines. This question is at the core of a research project backed by the Swiss and German National Science Foundations, in which Dr Spaak and Dr Wolinska are additionally investigating the impact of eutrophication on aquatic ecosystems. “We found that eutrophication influenced the Daphnia populations in lakes. We found hybrids in lakes, between two species,” he says. These crustaceans hold a great deal of interest for scientists, partly because they are able to reproduce very rapidly through both sexual and asexual reproduction. A female Daphnia is essentially able to clone herself and produce genetically identical babies. “You can very rapidly have thousands of genetically identical Daphnia. On top of that, Daphnia can also reproduce sexually, where genetic material from two lineages is combined,” says Dr Spaak. These fertilized sexual eggs are resting eggs that can survive in the sediment for many years. Researchers in the project are analysing Daphnia that
Daphnia galeata infected with protozoan parasite Caullerya mesnili. Multiple spore clusters are visible in the Daphnia gut (green). This parasite is very virulent to its host; it shuts down Daphnia reproduction and causes its earlier death. Picture: Kirstin Bittner, Eawag.
hatched from resting eggs of different ages. “We have Daphnia in our lab that hatched from 60 year-old resting eggs and therefore are genetically the same as the Daphnia that have been in the lake during those times. That gives us some very interesting research opportunities,” continues Dr Spaak.
Impact of eutrophication
This research is being conducted against a backdrop of continued concern about the impact of eutrophication, where aquatic ecosystems become excessively rich in nutrients, which then affects the ecological balance. Greifensee, a lake near Zurich, as well as almost all lakes in Western and Central Europe, went through a eutrophication peak around the early '80s, since when the level of nutrients has decreased, a major point of interest in the project. “We can use time-series data
Caullerya spore clusters
to look at how eutrophication levels affect biological processes,” explains Dr Justyna Wolinska, Research Group Leader at IGB in Berlin and Professor at the Freie Universität Berlin. “We can study a very large number of Daphnia generations in short timescales. We can look at changes in the same lake over time as it goes through different phases of eutrophication,” says Dr Wolinska. The project team is also studying the interactions of Daphnia with its parasites such as Caullerya mesnili, a highly virulent parasite that has caused disease in lakes across Europe. Researchers are taking a twopronged approach to this work, combining observations in the field with laboratory work. “First of all we observe patterns in nature. Here we use spatial studies, where we investigate several different lakes, which differ in environmental conditions,
Individual spores of Caullerya mesnili. Copyright: Lohr et al. 2010, Journal of Eukaryotic Microbiology.
HOST-PARASITE INTERACTIONS
Host-parasite interactions in hybridizing Daphnia, from correlations to experiments Project Objectives
for example in eutrophication level, or in water temperature,” outlines Dr Wolinska. The next step would be to investigate how these environmental conditions affect the frequency and extent of infection by parasites. “We are studying Greifensee in detail over time. We can also test to what extent the level of infection depends on seasonally changing environmental conditions,” continues Dr Wolinska. “If we see something interesting, we then want to verify that in the laboratory.” Researchers are able to precisely control the conditions in a lab environment for Daphnia clones, which helps scientists to investigate the impact of any specific environmental variable. Observations from the field can be tested in the lab, from which researchers can then look to build a deeper understanding. “For example, we observed in the field that if there were more cyanobacteria, there were more infections. From that you can formulate a hypothesis that higher levels of cyanobacteria make Daphnia more susceptible to infection. We then fed some Daphnia in the lab cyanobacteria, while we fed other Daphnia normal algae,” explains Dr Spaak. While cyanobacteria and algae both appear as green cells on the surface of a lake to the naked eye, Dr Spaak says they in fact are distinct from each other. “Cyanobacteria are not eukaryotes, they are really bacteria, that's the difference. These cyanobacteria can also produce toxins,” he continues.
We are studying Daphnia populations over time. We can test to what extent the level of infection depends on changing environmental conditions.
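The step from field pattern to lab test that Dr Spaak describes ultimately comes down to comparing infection rates between treatment groups. A minimal sketch of how such a comparison might be analysed, with invented counts rather than project data (the group sizes, counts and choice of test are all assumptions):

# Hypothetical comparison of infection counts in cyanobacteria-fed versus
# green-algae-fed Daphnia; the numbers are invented for illustration only.
from scipy.stats import fisher_exact

infected_cyano, healthy_cyano = 18, 12    # invented counts, cyanobacteria diet
infected_algae, healthy_algae = 7, 23     # invented counts, green-algae diet

table = [[infected_cyano, healthy_cyano],
         [infected_algae, healthy_algae]]

# One-sided test of whether infection is more frequent on the cyanobacteria diet.
odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.2f}, one-sided p = {p_value:.4f}")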
This SNF-DFG funded project aims to understand the influence of environmental factors on hybridization rate and host-parasite interactions in lake plankton. Specifically, we investigate how eutrophication affects hybridization rate and parasitic infections in Daphnia (water fleas). To do so, we combine field observations and experimental studies. This project, which results from a close collaboration between Swiss and German science teams, will further our understanding of the environmental conditions that drive the evolutionary processes in natural systems.
Project Funding
This project is funded by the Swiss National Science Foundation (SNF 166628) and the German Research Foundation (DFG WO 1587/6-1).
Project Partners
• Prof. Iñaki Ruiz-Trillo, Institut de Biologia Evolutiva (CSIC-Universitat Pompeu Fabra), Barcelona, Catalonia, Spain • Prof. Michael Monaghan, The Berlin Center for Genomics in Biodiversity Research (BeGenDiv), Berlin, Germany
Contact Details
Dr Piet Spaak E: Piet.Spaak@eawag.ch W: https://www.eawag.ch/de/abteilung/ eco/schwerpunkte/zooplankton-oekologieund-evolution/ Dr Justyna Wolinska E: wolinska@igb-berlin.de W: https://www.igb-berlin.de/en/wolinska Literature Lu et al. 2020 Mol. Phylogenet. Evol. 151, 106891. Turko et al. 2018 Evolution 72, 619-629. Dr Piet Spaak
Dr Justyna Wolinska
Cyanobacteria
This is not the only way in which cyanobacteria have a negative impact on aquatic ecosystems. Alongside producing toxins, they also have quite difficult, filamentous morphologies, which means they are not an ideal food source for zooplankton. “Daphnia are filter-feeders and they are not selective, and when they consume filamentous structures their feeding apparatus gets clogged. Cyanobacteria are also less nutritious than green algae,” outlines Dr Wolinska. The wider context here is ongoing concern about the health of aquatic ecosystems and the long-term consequences for biodiversity. “There are other factors that influence lakes too, such as climate change, so it's not only about the amount of nutrients,” says Dr Spaak. The general aim in the project is to study the extent to which hybridization and parasitism influence major ecological processes however, rather than more specific objectives around lake management. A lot of progress has been made on the molecular side of this research. “We have sequenced the genome of Daphnia galeata. We've also tried to sequence the genome of the parasite, yet that's a lot
more complex, because the parasite is intercellular,” says Dr Spaak. There is enormous scope for further research, and in future Dr Wolinska hopes to build a deeper mechanistic understanding of the processes that have been observed in the field. “For example, what kinds of molecular mechanisms are responsible for the processes we've observed? We would also like to look at the genomic side of host-parasite adaptations,” she says. A number of studies have also shown that the microbiome of an organism plays a role in the interaction with diseases, representing another potential avenue of investigation. A Daphnia acquires its microbiome in the gut from the environment. When they feed, bacteria come in and a population develops in their gut. “The question then is, does the diversity of the bacteria play a role in the success of the parasite?” asks Dr Spaak. “These are the types of mechanistic things, in terms of genes, genetics, and also biology, that we are looking at.”
Dr Piet Spaak is a senior scientist at Swiss Federal Institute of Aquatic Science and Technology (Eawag) and teaches at ETH-Zürich. His research focuses on understanding the interaction between aquatic organisms and their changing environment. He uses Daphnia as a model system for field and experimental work. Dr Justyna Wolinska is group leader at the Leibniz Institute of Freshwater Ecology and Inland Fisheries (IGB) and professor at the Freie Universität Berlin. She studies evolutionary and ecological processes mediated by parasitism in aquatic ecosystems. For example, she investigates how parasitism contributes to the maintenance of genetic diversity.
German Research Foundation
The Sunnier Side of Science
This image is a composite of 25 separate images spanning the period of April 16, 2012, to April 15, 2013. It uses the SDO AIA wavelength of 171 angstroms and reveals the zones on the sun where active regions are most common during this part of the solar cycle. Credit: NASA/SDO/AIA/S. Wiessinger
Top left Image: Europe’s largest solar telescope GREGOR reveals intricate structures of solar magnetic fields in very high resolution. The image was taken at the wavelength of 516 nm. CREDIT: KIS
Top Right Image: The GREGOR telescope on Tenerife, Spain. Right: The newly redesigned optical laboratory of GREGOR. CREDIT: L. Kleint, KIS
Lower Right Image: A sunspot observed in high resolution by the GREGOR telescope at the wavelength 430 nm. CREDIT: KIS
Europe's largest solar telescope GREGOR, operated by a German consortium and located at the Teide Observatory in Tenerife, in the Canary Islands, has recently had an upgrade and has taken some spectacular close-up images of the sun. Here, we take a look at what those images showed and explain why sun science is important. By Richard Forsyth
The GREGOR telescope, Europe's largest solar telescope, was inaugurated in 2012 and started a comprehensive upgrade in 2018, carried out by scientists and engineers from the Leibniz Institute for Solar Physics (KIS), improving everything from the optics, alignment and mechanics to vibration reduction and the control systems. The GREGOR telescope now allows scientists to resolve details as small as 50 km on the Sun, a tiny fraction of the solar diameter of 1.4 million km. This is as if one could see a needle on a soccer field, perfectly sharp, from a distance of one kilometer. “This was a very exciting, but also extremely challenging project. In only one year we completely redesigned the optics, mechanics, and electronics to achieve the best possible image quality,” said Dr. Lucia Kleint, who led the project.
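As a rough, independent cross-check of that 50 km figure (our own back-of-the-envelope arithmetic, not the KIS team's): the diffraction limit of a telescope is approximately 1.22 λ/D, and taking GREGOR's roughly 1.5-metre aperture, which is an assumption here, together with the 430 nm wavelength quoted for the sunspot image above, the smallest resolvable feature at the Earth-Sun distance comes out close to the quoted value.

# Back-of-the-envelope diffraction-limited resolution for GREGOR
# (1.5 m aperture assumed; 430 nm is the wavelength quoted for the sunspot image).
import math

wavelength = 430e-9          # observing wavelength [m]
aperture = 1.5               # primary mirror diameter [m] (assumed)
sun_distance = 1.496e11      # mean Earth-Sun distance [m]

theta = 1.22 * wavelength / aperture          # Rayleigh criterion [radians]
smallest_feature = theta * sun_distance       # corresponding linear size on the Sun [m]

print(f"angular resolution ~ {math.degrees(theta) * 3600:.3f} arcsec")
print(f"smallest resolvable feature ~ {smallest_feature / 1000:.0f} km")

Run as written, this gives roughly 0.07 arcsec, or about 52 km on the solar surface, consistent with the figure quoted above.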
A Complex Upgrade
The upgrade was only completed in March 2020, during lockdown. Snowstorms prevented observations at the time, but when Spain reopened in July the team obtained the highest-resolution images of the Sun ever taken by a European telescope, and the results were breathtaking.
The new optics of the telescope will allow scientists to study magnetic fields, convection, turbulence, solar eruptions and sunspots in fine detail. First light images obtained in July 2020 revealed the astonishing intricacies of sunspot evolution and the detailed structures in solar plasma. The workings of telescope optics are very complex systems involving mirrors, lenses, filters and other optical elements, and any error in fabrication will impact the performance. The GREGOR team found several issues that needed attention and calculated optics models to solve them. They had to replace, for example, two elements with so-called off-axis parabolic mirrors, which had to be polished to 6nm precision, which, for a sense of scale, is about 1/10,000 of the diameter of a hair. The result of this level of meticulous attention to detail resulted in a very sharp focus for the telescope. Prof. Dr. Svetlana Berdyugina, professor at the Albert-Ludwig University of Freiburg and Director of the Leibniz Institute for Solar Physics (KIS), said: “The project was rather risky because such telescope upgrades usually take years, but the great team work and meticulous planning have led to this success. Now we have a powerful instrument to solve puzzles on the sun.”
Why we Need to Study the Sun
There are many reasons to study the Sun. As a 4.5 billion year old star (it still has another 5 billion years' worth of fuel in it), our Sun has a profound influence on our world, the life on it and, of course, us. We rely on it as the keystone to life on Earth. When the Sun is brighter, at sunspot maximum, the greatest period of activity in an 11-year cycle, it can affect climate, but more research is needed on the process and effects. We do know the Sun affects weather patterns on Earth, and that the Sun drives the physical and biological processes in the world, in oceans, on the land and in the atmosphere. It affects everything from harvests to our sleep patterns and is a driver of Earth's climate, so knowing more about its impacts, cycles, trends and effects is beneficial, although the latest period of accelerating climate change is widely accepted to be an effect of our own industrial pace. Studying the magnetism of the Sun and how it influences Earth can also give us insights that help us minimise damage to satellites and our technological infrastructure. The Sun's surface can sometimes flare, giving off ultraviolet and X-rays. This solar 'weather' affects Earth's upper atmosphere. Solar winds blast from the Sun in all directions and consist of a flood of charged particles that shoots past the Earth at speeds of more than 500 km per second. This solar wind can disrupt the Earth's magnetic field, injecting energy into the radiation belts. It has the power to alter satellite orbits, damage satellites and put astronauts in dangerous situations. On Earth it can create tremendous surges in power lines, interfere with technological equipment and knock out power for large geographical
areas. Monitoring the Sun's surface and finding ways to predict solar storms could therefore have great benefits for us. We can, at present, predict the arrival of solar eruptions with a warning window of around two or three days, with the help of NASA satellites such as the Solar and Heliospheric Observatory (SOHO), but large observatories like GREGOR can still play an important part in protecting our technology from solar activity.
It is also important to understand how the Sun works scientifically, as a fusion reactor. We are researching fusion energy on Earth with a view to making it a major energy resource; currently ITER in France is leading this endeavour for Europe. Fusion energy is heralded as an efficient and powerful energy source with little waste. It's a very attractive idea for powering cities in the future, and the more we study the natural processes of the Sun the more we can learn.
We should also study the Sun because it is a fundamental building block of the Universe. Consider that it is only one of more than 100 billion stars in the Milky Way, in a Universe teeming with perhaps 2 trillion galaxies at the latest estimate. We need to know as much as we can about how stars function, purely for the advancement of science and our greater understanding of the cosmos, the place in which we live.
The GREGOR upgrade is a welcome improvement to a tool for studying our Sun that will have great value for European scientists. For a more technical description of the redesign, see the article 'GREGOR: Optics redesign and updates from 2018–2020', led by Dr. L. Kleint and recently published in the journal Astronomy & Astrophysics. European researchers have access to observations with the GREGOR telescope through national programs and a program funded by the European Commission, and new scientific observations started in September 2020.
https://www.iac.es/en/observatorios-de-canarias/telescopes-and-experiments/gregor-solar-telescope
SDO Captures Solar Flare
Visible in the lower left corner, the sun emitted an M6 solar flare on Nov. 12, 2012, which peaked at 9:04 pm EST. This image is a blend of two images captured by NASA’s Solar Dynamics Observatory (SDO), one showing the sun in the 304 Angstrom wavelength and one in the 193 Angstrom wavelength. Credit: NASA/SDO
SDO Captures X1.7 Class Solar Flare
The sun erupted with an X1.7-class solar flare on May 12, 2013. This is a blend of two images of the flare from NASA’s Solar Dynamics Observatory: One image shows light in the 171-angstrom wavelength, the other in 131 angstroms. This is the first X-class flare of 2013. Credit: NASA/SDO/AIA
“The project was rather risky because such telescope upgrades usually take years, but the great team work and meticulous planning have led to this success. Now we have a powerful instrument to solve puzzles on the sun.”
NASA's Solar Dynamics Observatory captured this image of the X1.2 class solar flare on May 14, 2013. The image shows light with a wavelength of 304 angstroms. Credit: NASA/SDO
Massive Coronal Hole on the Sun
The Sun and the Moon
NASA’s Solar Dynamics Observatory captured this picture of the sun on June 18, 2013, showing a huge coronal hole – seen here in dark blue - spread out over almost the entire upper left quadrant of the sun. Credit: NASA/SDO
A view of the sun captured by NASA’s Solar Dynamics Observatory on Oct. 7, 2010, while partially obscured by the moon. A model of the moon from NASA’s Lunar Reconnaissance Orbiter has been inserted into a picture, showing how perfectly the moon’s true topography fits into the shadow observed by SDO. Credit: NASA/SDO/LRO/GSFC
The eyewitness memory puzzle: is culture the missing piece?
Eyewitness evidence is a crucial part of a criminal investigation. Yet, there are cultural differences in the way that people report events, which can be a challenge for investigators. Researchers in the WEIRD Witnesses project aim to design new, culturally sensitive guidelines that will help investigators get the information they need, as Dr Annelies Vredeveldt explains.
Evidence from eyewitnesses plays a crucial role in criminal investigations, helping the police to reconstruct the events around a crime, identify a suspect and build a case. However, there are sometimes cultural differences in the way that witnesses report crimes, which can pose a significant challenge for investigators. “If a witness reports a rape in euphemistic language, for example, then the police officer is not getting the information they need to build a case,” points out Dr Annelies Vredeveldt, Associate Professor in the Department of Criminal Law and Criminology at VU University Amsterdam. Our memories are reconstructed rather than reproduced, and so inaccurate details can creep in, which is recognised in the scientific literature on eyewitness memory. One factor that is not really considered however is an individual's cultural background, an issue Dr Vredeveldt is addressing in the ERC-backed WEIRD Witnesses project, analysing recorded police interviews, testimony provided in international criminal cases and eyewitness experiments. “Current theories about eyewitness memory take account of things like how well you were able to perceive an event, and whether you talked to other people afterwards. We believe that some of those things interact with your cultural background to influence how your memory works,” she outlines.
WEIRD Witnesses project
Culture itself can mean many different things, so rather than trying to define it in a broad sense, Dr Vredeveldt and her colleagues are focusing on what culture means in the context of the specific project. “One example of a potentially relevant cultural dimension, identified by Dutch psychologist Dr Geert Hofstede, is power distance, which is about how you respond to authority. Another example is collectivism versus individualism: people from a collectivist culture may see themselves as a part of wider society, whereas in individualistic cultures, people tend to see themselves more as an independent being,” she outlines. Cultural differences such as these may affect the way people report an incident. “People from more collectivist cultures may report more about the relationships between people and events, whereas people from individualistic backgrounds may report more about their own role,” explains Dr Vredeveldt.
Current theories about eyewitness memory take account of things like how well you were able to perceive an event, and whether you talked to other people afterwards. We believe that some of those things interact with your cultural background to influence how your memory works.
The project aims to help interviewers ask questions in a culturally sensitive way that elicits the information they need, while also assessing what information a witness might be holding back. “What would then be the best way to get that information, without manipulating the witness?” continues Dr Vredeveldt. The language, tone and manner of the interviewer are all important considerations in this respect, but ultimately the most important factor is the content of the questions they ask, believes Dr Vredeveldt. “For example, imagine if an interviewer just asks a witness: 'what happened?' The witness might respond by using language that doesn't make it clear what actually happened, so maybe that question then needs to be followed up by more specific enquiries,” she says. This is an important issue for example in South Africa, a country noted for its cultural diversity. Yet, the police in South Africa receive only limited guidance – or even no guidance at all – on how to interview people from
“I previously worked in South Africa, where I collected 100 eyewitness interviews for analysis of a new interview technique which I was testing at the time,” says Dr Vredeveldt. “I noticed that there were many instances of cultural differences in how people report crime, while I also noticed cross-cultural communication issues between police interviewers and witnesses.” The first subproject entails a systematic analysis of these recorded police interviews. A second subproject is centred on international criminal cases, which often involve people from different cultural backgrounds. The focus here is on the International Criminal Tribunal for Rwanda (ICTR), established by the United Nations in 1994 to conduct trials of people accused of involvement in the genocide of that year, and eventually dissolved in 2015. The ICTR filled its ranks with staff from all corners of the world, with witnesses, judges and lawyers all approaching the process through their own cultural lenses, offering a unique opportunity to study how different cultures interact in the arena of international justice. “We’ll look at the testimony given at the tribunal,” says Dr Vredeveldt. In some of these cases witnesses were asked to describe events which had happened more than ten years earlier, and the passage of time can make precise recall difficult. “First of all, we simply forget things,” points out Dr Vredeveldt. “Second, there are lots of external influences, which can cause differences and errors to sneak into your memory. So, for instance, if you talk about the same incident with other people, you start reporting things that you hear from them as if you had seen them yourself. That’s called social contagion. Other things like news reports can also influence how you remember an event.”
In the third subproject, researchers will conduct experiments in controlled settings to look at how people from different cultural groups encode, store and retrieve memories. “We will ask people from sub-Saharan Africa seeking asylum in Europe to participate in eyewitness experiments and mock asylum interviews, and Western immigration officials to evaluate their statements,” explains Dr Vredeveldt. The way in which an asylum seeker describes their experiences in an interview can be a major factor in determining whether they are granted asylum. While people from some countries are highly likely to be granted asylum, other cases are less clear-cut, and the outcome may depend on the way their story is presented. “If you have gone through traumatic events, or are in serious danger of persecution, then you need to report that in a way that’s believable, detailed and accurate in order to be granted asylum,” says Dr Vredeveldt. The researchers will compare asylum seekers’ reports to those of a control group of Western participants, matched on demographic characteristics like age and gender. “We are going to ask questions that are similar to those asked in asylum interviews, but we’re not going to ask them about their actual asylum application or the persecution, for two reasons. One, for comparison to the control group, because they don’t have a story of persecution. The other is to remove any perception of risk for the participants,” continues Dr Vredeveldt. This will then inform the project’s work in developing a theory of eyewitness memory in which culture is taken into account. The second major goal is to design and test evidence-based interview guidelines for interviewers, which Dr Vredeveldt hopes will ultimately help investigators to obtain better evidence. “We hope to identify the best way to achieve that in the research project,” she outlines.
WEIRD WITNESSES
Beyond WEIRD Witnesses: Eyewitness Memory in Cross-Cultural Contexts. WEIRD Witnesses (witnesses from Western, Educated, Industrialized, Rich and Democratic societies).
Project Objectives
The project aims to provide a nuanced understanding of the role of culture in eyewitness memory. A second aim is to design and test evidence-based guidelines for investigative interviewers in cross-cultural settings.
Project Funding
European Research Council ERC Starting Grant
Contact Details
Project Coordinator, Dr. Annelies Vredeveldt, Associate Professor, Department of Criminal Law and Criminology, Faculty of Law, VU University Amsterdam
T: +31 20 59 83 153
E: a.vredeveldt@vu.nl
W: https://allp.nl/cases/eyewitnessmemory-in-cross-cultural-contexts/
W: www.annelies.vredeveldt.com
W: www.allp.nl
· Henrich, J., Heine, S. J., & Norenzayan, A. (2010). The weirdest people in the world? Behavioral and Brain Sciences, 33, 61–83. doi:10.1017/S0140525X0999152X
· Wang, Q. (2016). Why should we all be cultural psychologists? Lessons from the study of social cognition. Perspectives on Psychological Science, 11, 583–596. doi:10.1177/1745691616645552
· Combs, N. A. (2017). Grave crimes and weak evidence: a fact-finding evolution in international criminal law. Harvard International Law Journal, 58(1), 47–126.
· de Bruïne, G., Vredeveldt, A., & van Koppen, P. J. (2018). Cross-cultural differences in object recognition: comparing asylum seekers from sub-Saharan Africa and a matched Western European control group. Applied Cognitive Psychology, 32(4), 463–473. https://doi.org/10.1002/acp.3419
Dr Annelies Vredeveldt - Assoc. Prof. (left) Dr Laura Weiss - Postdoc. Researcher (right)
Gabi de Bruïne - PhD Candidate (left) Dylan Drenk - PhD Candidate (right)
New nations, shared Ottoman past
Cities across modern Serbia, Bosnia and Turkey spent many years as part of the Ottoman Empire, before new nation states were established in the aftermath of the First World War. Photo archives from the 1920s and ‘30s open a window into everyday life in the period, a topic at the heart of Professor Nataša Mišković’s research.
In the aftermath of the First World War, new nation states were established on the ruins of the Ottoman Empire, aiming to break fully with the past. Yet the Empire left a legacy of many shared cultural practices across southeastern Europe and western Asia. As the Principal Investigator of the SIBA project, Professor Nataša Mišković is exploring everyday life in four post-Ottoman cities during the 1920s and ‘30s through local press photography from the period. “The aim of the project is to look at the public urban space and explore to what extent it changed under the new national regimes,” she says. This research centres on photos of Sarajevo, Istanbul, Belgrade and Ankara, all cities which were formerly part of the Ottoman Empire. “We wanted to have imagery from local press photographers, and to look at how they perceived their city,” outlines Professor Mišković. “Like the internet or social media today, rotary printing was a totally new technology which required investment. Before WW1 there were almost no photographs in local newspapers in the region.”
SIBA
Exploring Post-Ottoman Cities Through the Photographers’ Lens. New Approaches to Lifeworld Research in Turkey and Yugoslavia, 1920s and 1930s (SIBA)
Professor Nataša Mišković, University of Basel | Department of Social Sciences | Rheinsprung 21 | 4051 Basel | Switzerland
T: +41 61 207 19 91
E: natasa.miskovic@unibas.ch
W: https://nahoststudien.philhist.unibas.ch/de/personen/natasa-miskovic/
W: https://nahoststudien.philhist.unibas.ch/de/forschung/forschungsprojekte/siba/
Nataša Mišković has been a research professor at the University of Basel’s Department of Social Sciences since 2013. Her research focuses on the shared history of the Balkans and the Middle East, on visual history and on the history of Yugoslavia. Her new project is on History and Film on Nation-building, Violence and Memory from the European Periphery.
Namık Görgüç/Selahattin Giz: Photographing a worker on the Beyazıt Mosque minaret, Istanbul 1936. Cengiz Kahraman Photography Collection (siba.1158).
Everyday life in post-Ottoman cities
A database of several thousand photos has been built up from archives, museums and private collections across the region, part of which is available to the public on the online database Visual Archive Southeastern Europe (VASE). While some areas of these four cities became closely associated with the new nation state, others retained their earlier character. “In each of the cities there are locations which are closely associated with the nation state. In Istanbul it’s Taksim square, which is in Beyoğlu, a modern part of the city,” explains Professor Mišković. There is also a social anthropological dimension to this research, as the photos reveal differences in the way people dressed. “Elites who supported the new regime presented themselves as being national and often dressed according to the latest international fashions. Whereas poor people, who didn’t have the money to buy new clothes, often stuck to the old ways. Uniforms were introduced for school children,” continues Professor Mišković. The photos themselves relate to five different aspects of everyday life, including images of the bazaar, where people from different sections of society mixed. While the rulers of the new Yugoslavia and Turkey were keen to build a new sense of national identity, people still drew on their cultural heritage in everyday life, and Professor Mišković says this is still in evidence today. “People from the Balkans who travel to Turkey generally do not have any difficulties in getting around, and vice-versa,” she says. The photos have been displayed through a travelling exhibition called Cities on the Move – post-Ottoman, which has been shown in seven European cities. Professor Mišković hopes it has inspired people from the region to reflect on their history, and others to perceive post-Ottoman cities as modern European cities. “We want to encourage people to take a wider view of their own past, not reduced so much to their own national view, and to encourage them to ask questions about enforced narratives,” she says.
This exhibition has had a very positive reception, attracting thousands of visitors in Istanbul, Belgrade and Sarajevo, and it has helped to challenge established perceptions of the nation’s past. “We use the photographs to give people a sensual, visual, emotional experience to help them dive into their own past. We want to show that the past is connected and shared,” continues Professor Mišković. This is a topic she explored further in her film Zeitgeist, produced from material in her database. “We used the pictures from the exhibition to do a 30-minute experimental film, based on digitally enhanced historical photographs from Turkey and Yugoslavia. Arranged in a visual narrative, you can see people in an urban scenery, but it’s not easy to see which city it is,” she says.
Local Autonomy, municipal size and effects
Local governments provide a variety of services, but they are not immune to wider economic pressures, and in some parts of Europe neighbouring municipal authorities have merged in search of improved efficiency. Professor Andreas Ladner and his colleagues are conducting comparative studies that will get to the heart of the issues facing local government.
The balance of responsibilities between local, regional and national government varies across Europe, with some nations relatively centralised, while others devolve a high degree of authority to the local level. Based at IDHEAP (Institut de hautes études en administration publique) at the University of Lausanne in Switzerland, Professor Andreas Ladner is the Principal Investigator of a research project which seeks to compare how different countries organise local administration. “The European Charter of Local Self-Government serves as a point of reference for the Council of Europe, which around 30 years ago started to promote local self-government. They are very much in favour of a strong bottom layer, where services are provided on the local level, and citizens control what kind of services are offered,” he outlines. Despite the importance of local government, there is a tendency towards centralisation in many countries, for example through the merging of municipalities or through a shift of competences to higher levels. “A municipality may realise that they are too small to do all the things they should do, or that they can provide the same things cheaper and at better quality, together with a larger, neighbouring municipality,” explains Professor Ladner.
Is small beautiful?
Mergers have occurred in several Northern European countries, for example Denmark and Sweden, and also in Germany, where municipalities are typically relatively large in size. However, the position is different in Switzerland, which has around 2,200 municipalities. “In Switzerland, we haven’t merged municipalities to the same degree as has been seen in Northern Europe,” says Professor Ladner. There may be powerful reasons for amalgamating municipalities, such as improving efficiency and boosting buying power, but this will affect the nature of local democracy, an issue Professor Ladner explored together with colleagues in another comparative project called “Size and Local Democracy”.
“Is big as good as is sometimes thought? Or is it sometimes more convenient to remain small?” he outlines. “This is related to the topic of democratic control and local accountability as well. There is the idea that small is beautiful, because citizens are closer to the people who make decisions that affect their lives. They can lobby them, and perhaps even participate in the decision-making process. On the other side, there is the argument that a bigger municipality benefits from economies of scale.”
However, the larger an organisation or municipality grows, the more bureaucracy is required to run it, which is another important consideration in the project. As Principal Investigator, Professor Ladner hopes the project will make a significant contribution to the ongoing debate around the structure of local administration. “We have data on a wide spectrum of cases and a lot of municipalities,” he says.
Local Autonomy Index (LAI), country ranking 2014
A final part of the research project involves looking at reform projects in Swiss municipalities, from which Professor Ladner hopes to gain new insights. “Switzerland is a very attractive location for this research, as it’s not a homogenous country. We have different political systems and cultural settings,” he explains. “In the French-speaking part of Switzerland the cantonal level – the intermediate tier of government – is very important, whereas in the German-speaking part the local level is more important. So, certain tasks are performed mainly on the local level in the German-speaking part, and more often on the cantonal level in the French-speaking part.” A prime example is schools, which in the French-speaking parts of Switzerland are typically administered at the cantonal level, while in the German-speaking parts they are much more the responsibility of local government. There are also further differences in the way that government is organised. “In the German-speaking parts there is a form of assembly democracy, where people essentially gather together and decide on political matters by a show of hands, whereas in the French-speaking parts this is the function of elected representatives in parliaments and local councils,” says Professor Ladner. The former is an example of a more direct form of democracy, where there are more possibilities for citizens to intervene directly in policy and to influence and shape decisions. “Is enabling the direct intervention of citizens, through voting in referendums for example, a good thing?” asks Professor Ladner. “It’s difficult to quantify this type of question, however, because it depends on your values and norms, but these municipalities usually manage quite well.”
Patterns of Local Autonomy
Researchers in the project aim to assess the effectiveness of these different approaches to local government, and provide a basis for further comparison and analysis. One major topic of interest to Professor Ladner is the importance of fiscal and organisational autonomy to municipalities. “Can we show that the more autonomous municipalities are more inclusive, or more successful? If they already have a high degree of autonomy, are they willing or interested in adapting their structures, so as to improve performance? That can be both in terms of efficiency, and also in terms of including citizens in their decision-making process,” he continues. The project’s results so far suggest that almost all successful countries, both economically and in terms of the quality of their political culture, afford a high degree of autonomy to municipal authorities. “We can think of GDP and growth rates as economic indicators. The quality of a democratic culture can be assessed in terms of electoral turnout rates, trust, or corruption levels,” continues Professor Ladner. The Scandinavian countries rate highly in these terms, while Switzerland, Germany and the Netherlands also combine economic success with high levels of political participation. One of the common denominators here is a high degree of municipal autonomy, and while this is not the sole factor, the correlation is noteworthy. “There are also other reasons why these countries are successful, but they do have autonomous municipalities, whereas municipalities in the newer European democracies have less autonomy,” says Professor Ladner. The wider aim in the project is to provide a more rigorous basis for comparison in this respect. “We want to provide the information required to dig deeper into the problem. We’ve established a database and made it accessible to the research community, which is an important part of our project,” explains Professor Ladner. “We want to produce usable data for further analysis, while we also plan to address several other research questions.” Interestingly, a more detailed concept of local autonomy shows that countries like Denmark and Switzerland achieve their top rankings through different patterns of local autonomy (see the two spider graphs).
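To illustrate the reasoning behind the spider graphs, the sketch below shows how a composite measure such as the Local Autonomy Index can give two countries similar overall rankings even though their underlying profiles differ. This is only an illustration: the dimension names are loosely inspired by the published LAI, and all scores and weights are invented, not project data.

```python
# Illustrative sketch (not the project's coding scheme): a composite autonomy
# index as a weighted sum of dimension scores. Dimension names, scores and
# weights are invented examples.

DIMENSIONS = ["institutional_depth", "policy_scope", "fiscal_autonomy",
              "organisational_autonomy", "legal_protection"]

# Hypothetical 0-4 scores per dimension for two countries.
profiles = {
    "Denmark":     {"institutional_depth": 4, "policy_scope": 4, "fiscal_autonomy": 3,
                    "organisational_autonomy": 3, "legal_protection": 2},
    "Switzerland": {"institutional_depth": 4, "policy_scope": 3, "fiscal_autonomy": 4,
                    "organisational_autonomy": 4, "legal_protection": 1},
}

def composite_index(profile, weights=None):
    """Weighted sum of dimension scores; equal weights by default."""
    weights = weights or {d: 1.0 for d in DIMENSIONS}
    return sum(weights[d] * profile[d] for d in DIMENSIONS)

for country, profile in profiles.items():
    print(country, composite_index(profile), profile)
# Both countries reach a similar total, but via different dimension profiles -
# the pattern the two spider graphs are meant to convey.
```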
Different research projects and their interconnections
LOCAL AUTONOMY
Local Autonomy and Local Public Sector Reforms
Project Objectives
Establishing the links between local autonomy, local democracy, local service provision and local public sector reforms.
Project Funding
Funded by the Swiss National Science Foundation (SNSF): €500,000.
Project Partners and Collaborators
• Prof. Andreas Ladner (IDHEAP), PI
• Prof. Reto Steiner (ZHAW)
• Dr. Nicolas Keuffer (IDHEAP)
• Dr. Laetitia Desfontaine Mathys (IDHEAP)
• Dr. Claire Kaiser (ZHAW)
• Alexander Haus (IDHEAP)
• Ada Amsellem (IDHEAP)
• Jana Machljankin (ZHAW)
This research also holds wider interest for the European Commission, which is looking to invest funds in the younger democracies in Eastern Europe so as to foster economic development. However, it’s important that effective democratic structures are in place first, so that these funds can be controlled by those for whom they are intended, which is an important consideration for the EC. “They are interested in knowing more about the autonomy afforded to different levels of government and what structures are in place in order to control financial investments in the public interest,” says Professor Ladner. The project has opened up new opportunities for Professor Ladner and his team to collaborate with researchers across Europe, which is helping them build a more detailed picture of local autonomy in different areas. “The idea here is about measuring local autonomy on a comparative basis,” he outlines. “There is a very strong international, comparative aspect to our research, which relies on cooperation. I think that’s the way science should work nowadays.” The historical context also matters here, as the structure of local government develops and evolves over time as circumstances change. Professor Ladner has long experience of monitoring Swiss municipalities, and has detailed information dating back to the end of the 1980s. “We do surveys every 6-7 years, and look at how municipalities develop. We look at what happens to them, what they change and what kinds of reforms they undertake,” he says. The results of this research can then inform policy development, not just in Switzerland but also more widely. “Our results form part of policy briefs and publications which have international visibility,” continues Professor Ladner. “The diversity of the Swiss cantons and municipalities makes them ideal for this kind of research.”
Contact Details
Project Coordinator, Professor Andreas Ladner, Responsable de l’unité Administration suisse et politiques institutionnelles, Institut de hautes études en administration publique (IDHEAP), Quartier UNIL Mouline, CH-1015 Lausanne
T: +41 21 692 68 60
E: andreas.ladner@unil.ch
W: http://local-autonomy.andreasladner.ch/
W: www.idheap.ch
W: www.andreasladner.ch
Swiss Municipalities Monitoring: Steiner, Reto, Andreas Ladner, Claire Kaiser, Alexander Haus, Ada Amsellem and Nicolas Keuffer (2020). Zustand und Entwicklung der Schweizer Gemeinden. Ergebnisse des nationalen Gemeindemonitorings 2017. Glarus: Somedia Buchverlag. ISBN: 978-3-7253-1072-2
Local Autonomy: Ladner, Andreas, Nicolas Keuffer, Harald Baldersheim, Nikos Hlepas, Pawel Swianiewicz, Kristof Steyvers and Carmen Navarro (2019). Patterns of Local Autonomy in Europe. London: Palgrave Macmillan. ISBN 978-3-319-95642-8
Size and Democracy: Denters, Bas, Michael Goldsmith, Andreas Ladner, Poul Erik Mouritzen and Larry Rose (2014). Size and Local Democracy. Cheltenham: Edward Elgar. ISBN 978-1-84376-672-8.
Professor Andreas Ladner
Countries with high and low degrees of local autonomy (LAI 2014)
Andreas Ladner is a full professor of political institutions and public administration at IDHEAP, the Swiss Graduate Institute of Public Administration, at the University of Lausanne. He is a political scientist with a background in sociology, media science and economics. His areas of research include the quality of democracy, federalism and local government.
Jewish New Year’s card depicting street singer Jankiel Herszkowicz. United States Holocaust Memorial Museum, courtesy of Joseph Wajsblat.
Listening to the sounds of persecution
Hearing Herszkowicz’s satirical and acerbic performances, referencing the political and everyday realities of life under Nazi occupation, was part of ghetto street life in Łódź. The songs he performed entered ghetto culture and were referenced in first-person accounts, the Ghetto’s official Chronicle and documents such as this postcard. USHMM Photograph Number: 59782 https://collections.ushmm.org/search/catalog/pa1148188
The Jewish population of Eastern Europe were subjected to relentless persecution during the Second World War. What did this persecution sound like? Researchers are analysing references to sounds in diaries, contemporary reports and post-war accounts from survivors to build a deeper picture, as Professor Christian Gerlach explains.
The Jewish population of Eastern Europe suffered terrible German-organised persecution in the Second World War, during which huge numbers of people were confined to ghettos and murdered in camps or shot. The sounds that marked this period of history can tell us much about the nature of this persecution, a topic central to Christian Gerlach’s work as the Principal Investigator of a new research project based at the University of Bern (SNSF project 172597). “The original, fairly simple question was: ‘what did this persecution sound like?’” he outlines. This represents a relatively neglected aspect of the historiography of the Holocaust, yet analysis of written evidence of the sounds heard at the time can help researchers uncover more detail about the nature of anti-Jewish persecution, believes Gerlach. “Sounds can tell us about things such as social relations, social order, and power hierarchies, as well as collective action, cultural difference and religious practice. They can help us learn about conflict, violence, and gender roles,” he explains.
Anti-Jewish persecution
By analysing texts, the group hopes to shed new light on the nature of anti-Jewish persecution during the conflict. Researchers in the project aim to identify references to specific sounds in written texts, focusing on three main types of documents. “We work on the basis of written documents. One type is diaries; here you find references to everyday sounds. Then you also have post-war survivor reports, in which the sounds of everyday life are not very prominent but which focus on dramatic scenes. The third group of texts we are looking at is contemporary reports, which mainly deal with public life,” says Gerlach. These accounts detail not only sounds associated with violence and war, but also how people communicated and behaved at a time when their lives often depended on their ability to avoid detection, for example if they were in hiding from the occupiers. “This is very much about being on the alert, about being suspicious and always listening carefully,” continues Gerlach. “In one of the sub-projects, Nikita Hock looks at hiding places.”
A hiding place may have only been used for a short period of time before people felt they had to move on, while in other cases people may have stayed in the same location for months or even years at a time. This in itself presented challenges, among them everyday bickering or often simply boredom. “Boredom is psychologically dangerous, because when you are bored you can start to do stupid things,” says Gerlach, summing up Hock’s findings. The nature of the hiding place also affected how people conducted themselves. “Underground shelters had special acoustic qualities – you couldn’t be heard very well from outside, you were pretty isolated. That meant you could have a relatively normal life, you could wash your dishes, talk, some people could even sing. On the other hand, you wouldn’t be able to hear when somebody approached,” says Gerlach. “The situation in an attic hiding place was completely different. There you were much more exposed, and every move that you made could give you away.” The main threat to those hiding in these locations was denunciation, so it was important that as few people as possible knew they were there. The situation in ghettoes, where huge numbers of people were confined together in small areas, is a major topic of interest in Janina Wurbs’ sub-project. “One of the things we’re interested in is singing. Singers are interesting because this tells us about tensions within the ghetto,” Gerlach explains.
Social history
Overall, this project is a scholarly undertaking, with Gerlach and his colleagues aiming to contribute to a social history of violence and persecution through sound history. One aspect of Gerlach’s research involves investigating the role of mass media in inciting ethnic, racial or religious hatred. “The radio is acoustically the most interesting. In my studies on Jewish survivors of Nazi persecution, I found very few mentions of radio propaganda. I also looked in a comparative sense into other situations of mass violence, such as the mass killings in Indonesia in the ‘60s, where the same is true,” he says. This approach could allow researchers to put their findings on Nazi Germany into a broader context, yet the primary focus of the project is on anti-Jewish persecution during the Second World War, with Christoph Dieckmann’s sub-project focusing on survivor reports from the post-war period. “This sub-project will be, more than the others, about issues of narrative and memory,” continues Gerlach. One of the key points to emerge from the project is that references to sounds become more frequent in the narrative during periods of transition or crisis, in particular the sound of steps. “The nature of these sounds is very interesting, because it tells you something about the system,” says Gerlach. German police and the SS devoted a lot of energy to searching buildings for people who were in hiding, but they often did so in pairs or even alone, so they had a high degree of autonomy. “On most occasions the police functioned as they were supposed to, pulling people out of their hiding places and bringing them to designated locations,” outlines Gerlach. “Only in a tiny number of cases did they pretend to overlook the people who they found. We can learn more about this autonomy of action, and how it was made use of, by enquiring into what survivors said about the sounds that they heard.”
SOUNDS OF ANTI-JEWISH PERSECUTION
Sound production and aural experience during the Holocaust
Project Objectives
The project aims to contribute to the history of the Holocaust by analysing written evidence of sounds from the period in diaries, reports and post-war accounts. Sound history is a relatively young field, and has not been widely applied in the study of persecution, so researchers hope that this approach will help shed new light on the period. Sounds reflect social conditions and relations, and so can open up new insights into how Jewish people and communities dealt with persecution, and the nature of the persecution they were subjected to.
Project Funding
Funded by the Swiss National Science Foundation
Contact Details
Project Coordinator, Christian Gerlach, Universität Bern, Historisches Institut, Länggassstrasse 49, 3012 Bern, Switzerland
T: +41 31 631 50 88
E: christian.gerlach@hist.unibe.ch
W: http://soundscapesoftheshoah.org/
W: https://www.hist.unibe.ch/forschung/forschungsprojekte/sounds_of_anti_jewish_persecution/index_ger.html
Christoph Dieckmann, Nikita Hock, Janina Wurbs © Manuel Miethe
The ghettoes in Warsaw, Łódź and other locations were places of extreme scarcity and hunger, as well as other social problems, issues that some singers touched on in their music. “Some of these songs were about the lack of food and hunger, but also about inequality,” continues Gerlach. “Wurbs’ sub-project is looking at certain social situations, for example street begging and soup kitchens, and the sounds associated with them. These big ghettoes were very crowded – for example, the Warsaw ghetto covered an area of around 4 km², with a population of about 400,000 people.” Evidence suggests the main sounds that were audible in the ghetto came from human voices and music, rather than technology. There is, however, relatively little evidence of collective action amongst the population, despite so many people being crammed together in a confined space. “There’s very little evidence of people marching, of funeral processions or political demonstrations,” says Gerlach. There is also little to suggest that collective acoustic alarm systems were established within ghettoes, to try and warn Jewish people of a specific threat. “Drums, trumpets, bells or other signals were not used to raise the alarm. Communities usually had not agreed on specific acoustic signals to raise the alarm. This didn’t really change much, even as the persecution progressed,” explains Gerlach. “The German occupying forces tried to prevent the use of such signals, while to some extent people were in denial about the threat that they faced.”
Dr Christoph Dieckmann taught Modern European History at Keele University, UK, and researched Yiddish historiography on the Russian Civil War at the Fritz Bauer Institut in Frankfurt am Main. His study of German occupation policy in Lithuania, 1941-1944, was awarded the Yad Vashem International Book Prize for Holocaust Research in 2012. Nikita Hock is a doctoral candidate at the University of Bern. He studied Jewish and Religious Studies at the Freie Universität Berlin, as well as Cultural Theory and History at the Humboldt-Universität zu Berlin. His doctoral project examines the experience and depiction of sound in Jewish wartime diaries. Janina Wurbs is a doctoral candidate at the University of Bern. She holds an MA in Jewish Studies and History; a specialist in Yiddish language and culture, she has published in major Yiddish newspapers and magazines as well as on Yiddish radio. Her book Generationenübergreifender Jiddischismus was published in 2018.
A bunker used by Jews for hiding during the Warsaw Ghetto uprising, April-May 1943. Yad Vashem Photo Archives 2807/3. From a report filed by General Jürgen Stroop, the SS officer who commanded the suppression of the Warsaw Ghetto uprising, April-May 1943. Copyright © 2020 Yad Vashem. The World Holocaust Remembrance Center.
A new way to address student engagement during the pandemic In response to the challenges of teaching during the pandemic, Dr. Joanne Tippett, a Lecturer in the School of Environment, Education and Development at The University of Manchester, has developed an innovation called Ketso Connect. This is a new way to increase engagement, bring structure and build community in online and blended learning. By Richard Forsyth
Dr. Tippett is an advocate of engaging the senses in learning, in particular through kinaesthetic approaches, or learning through active physical engagement. Her methods combine visual communication and the written and spoken word with the hands-on movement of ideas to create clusters and develop connections. With universities now defaulting to online delivery as the safest medium for much of their teaching during the pandemic, she believes something valuable may be forfeited in the learning process. When questioning students about remote online learning, Tippett found that it was easy for them to become distracted and disengaged; students admitted they would be checking mobile phones and losing focus. Drawing on her decades of experience in using hands-on tools to promote engagement in face-to-face teaching, she addressed this new challenge by putting together a kit that looks a little like a board game at first glance. Students use it in conjunction with their online lectures. “I was worried how I was going to keep students engaged in online lectures, and have been really pleased that student feedback from trials of this hands-on kit has been incredibly positive,” said Tippett. “It’s now getting orders and interest from universities in the UK, USA, Nigeria, France and Poland.” The novelty of the kit is that it is very much physical, keeping hands and minds focused and busy as a way to learn, alongside using the computer.
Growing ideas ‘physically’
The concept of Ketso is about structuring and growing ideas, and as a way of learning in group interactions it’s already in use in over half the UK’s universities. This new individual kit is designed to be laid out next to a computer. It has sticky, colour-coded ‘leaves’ that can be written on, and icons for prioritising ideas. It is based on a tree that the student can ‘grow’, ‘leaf by leaf’, on a felt workspace. They can add and rearrange ‘leaves’ to capture thoughts and ideas, and the ‘tree’ grows as the lecture or discussion goes on. The fact that everyone is using the same physical toolkit at the same time, even when they are in different places, increases a sense of belonging to a learning community. Universities have seen many ways it can benefit cohorts of students. Ohio State University in the USA is incorporating the kit into its teaching this semester. Shoshanah Inwood, Assistant Professor in the School of Environment and Natural Resources (Agriculture and Environmental Sciences), sees this ‘learning aid in a bag’ as a critical framework for active student engagement. “I hope that this innovation will enable students to do the background analysing and mapping and come to class with better mastery of the readings, issues and connections, and thereby facilitate more enriched learning and conversations that can go deeper than if we had to spend time working on the background only.” As with so many educators in the current environment, Dr. Susan O’Shea, of Manchester Metropolitan University, had been concerned that physical distancing, while essential for health and safety, can present barriers to communication for students and lecturers.
She has worked with her department to procure more than 500 of the kits for use by incoming students, both in online learning and in socially distanced face-to-face workshops, where each student will bring and use their own kit. “Personally, I like that idea of moving away from the screen, even when in an online session,” explained Dr. O’Shea. “Our attention span wanes after twenty minutes of staring at a screen. It may not be possible to see people’s faces in the group, so we sometimes lose the social cues we might have in physical classrooms. This kit allows students time to think and reflect, yet to stay connected to one another. Having a tool like this allows us to take some of the core elements of participatory education and quickly adapt them for blended learning contexts.” Whilst the toolkit was a response to the challenges of the Covid-19 pandemic, it has the potential to become a fixture in universities when the ‘dust has settled’. Lecturers around the world are rapidly innovating and finding new ways to deliver their learning in small chunks, followed by some form of engagement suited to the online learning environment. When universities are able to use teaching spaces at full occupancy again, some of these innovations are likely to stick. Dr. O’Shea saw this potential, adding: “This idea of collaborative learning, so often successfully used in the classroom, can be extended into private study spaces by using a kit like this. This is something we should strive to maintain well beyond the current situation. When we begin to move back to more traditional lecture halls, students can bring their kits with them and keep those collaborations going.” Having a tangible representation of ideas that can be moved and developed over time helps students make more connections between taught sessions, their reading and their own ideas.
Students will have their own personal kit to use throughout their academic career, and can use it to help them plan essays and projects, assess their skills and plan their careers.
Building student community
From the sudden multitude of orders coming in from around the globe, Dr. Tippett knows she is on to something that has struck a chord with educators. “All lecturers are concerned about the impact of social isolation on our students’ experience,” said Tippett. “Adding this hands-on learning aid to blended learning drives better engagement. Giving students something extra as support can only be a positive in these strange and disconnected times.” The need to forge a stronger learning community is a big driver for using the learning aids. In addition, it could help bridge the digital divide that is now, with the forced reliance on technology, more prevalent than ever before. As Assistant Professor Inwood put it: “I hope it will provide students with a low-cost resource that is not reliant on expensive subscription-only digital technology or dependent on high-speed broadband, that they can continue to use in both their academic and professional careers.” Cultural changes do happen in pandemics. Could we be witnessing one such change, where adding hands-on, tactile engagement to the disembodied online learning environment endures, making learning something you can touch and interact with more fully?
Ketso.com
Uncovering the world’s linguistic diversity
Do languages change systematically in certain environments? We spoke to Professor Kaius Sinnemäki, Dr. Francesca Di Garbo, Dr. Eri Kashima and Dr. Ricardo Napoleão de Souza about the work of the GramAdapt project in combining typological and sociolinguistic methods to investigate linguistic adaptation.
Around 7,000 different languages are currently spoken across the world, some of which are the mother tongue of millions of people in large countries, while others are spoken by relatively small populations in remote areas. Based at the University of Helsinki, Professor Kaius Sinnemäki is the Principal Investigator of the GramAdapt project, an ERC-backed initiative which aims to help build a deeper picture of this linguistic diversity. “Our work comes under the umbrella of language typology, the worldwide comparison of languages,” he outlines. “We aim to cover languages from many different language families. So this means not just those from the Indo-European family for instance, but also languages from the Uralic, Austronesian, Austroasiatic, and other families.”
GramAdapt project at the crossroads between typology and sociolinguistics
A combination of typological and sociolinguistic approaches is being applied in the project to explore the structure of language, with Professor Sinnemäki and his colleagues aiming to look at a group of 150 languages. The typology strand of this research involves mapping the strategies that are used in different languages to achieve certain functions or meanings. “For example, in English morphemes can be used to change a verb into a noun, like adding -ion to the verb ‘invent’ to change it to the noun ‘invention’,” explains Dr. Francesca Di Garbo. In other languages this change may be achieved using other means. “Typologists explore this variation of strategies and present it through geographic patterns, for instance,” continues Dr. Di Garbo. The strategies found in languages are often the result of inheritance, so sister languages tend to have similar structures. Very different structures are evident in other languages, however, and researchers in the project aim to draw comparisons. “Most of the work typologists do is about comparing the structure of languages, like how words are built up from smaller parts, how the sound system is structured, and what is the word order,” says Professor Sinnemäki. In English, the word order subject-verb-object is used in a simple sentence, but a different order is used in Turkish and Hindi, for example. “They put the verb at the end of the sentence. So instead of ‘John likes Mary’, they would say ‘John Mary likes’,” he outlines. “This may have significant repercussions elsewhere in the grammar.” This typological work is combined with sociolinguistic research in the project, in which the focus is on the relationship between languages and the groups of people that speak them, including languages spoken by relatively small numbers of people. During her PhD studies, Dr. Eri Kashima did fieldwork in a remote part of Papua New Guinea, where she studied a local language. “I was hosted by a local tribe, and they identified their language as Nmbo. It’s probably spoken by about 700 people, while other people in the area have learnt it as a second language,” she says. By analysing different parts of the language and its use by male and female, young and old speakers, Dr. Kashima was able to identify the core structure of the language and important aspects of its social variation.
Observational Units: Language Sets – 50 sets of languages chosen from 24 geographic-cultural areas.
Map created by Sakari Sarjakoski.
The village of Bevdvn in the Morehead District of Western Province, Papua New Guinea (taken by Eri Kashima in 2015).
“The local community really got involved and were very enthusiastic,” she continues. The descriptions of these types of languages are central to the wider goal of uncovering as much of the world’s linguistic diversity as possible. The aim in the project is to build a sample of 150 languages that is as representative as possible of global diversity. “In principle, we want to look at languages from all areas of the world,” says Dr. Di Garbo. This diversity will help ensure the project’s findings are robust, and not slanted towards one specific geographic region. “We know that most European languages work very similarly in many respects,” points out Dr. Ricardo Napoleão de Souza. “There are also geographical influences, as languages that are spoken in neighbouring areas for very long periods tend to become more similar over time because of multilingualism and mutual borrowing.”
A large number of words commonly used in modern English have their roots in other languages, for example entrepreneur and schadenfreude, while there are also other reasons why a language may have certain similarities with another. One is that these languages are related, so they have inherited features from a shared ancestor, like in biology, while there are also other factors to consider. “Different languages may also change in the same ways because humans process and learn languages largely in the same ways all over the world. So although they are not related, and not spoken in the same areas, they may still look alike in various ways,” explains Professor Sinnemäki. “There’s also simple chance. There are some similarities in the structure of words between Finnish and the Quechuan languages spoken in the Peruvian Andes, but that’s just by chance.”
Language structures and sociolinguistic environments
The data on these different languages provides the basis for researchers to investigate the structure of language, and whether it adapts to the sociolinguistic context in which it is spoken. Most of the languages being analysed are spoken by relatively small groups of people, and Professor Sinnemäki hopes the project’s work will help bring these languages to the fore. “We want to look at languages spoken by minorities more systematically. Most languages in human history have been spoken by small people groups, but we do not have a clear idea what happens in them,” he says. A questionnaire is being developed in the project to help researchers characterise how language is used in different social domains. “It is structured in a way aimed not only at a modern, Western society, but it also aims to capture how language is used in smaller communities, in principle anywhere in the world,” outlines Dr. Di Garbo. This is part of the wider goal of bringing the psychological and sociological aspects of linguistic research together. The aim here is to investigate whether languages change systematically in certain environments. “This means things like how language is used and how it changes when it’s being spoken by smaller groups of people, when it’s being spoken by larger groups, or in environments where there is a lot of multilingualism,” says Professor Sinnemäki. This research is grounded in empirical data on different languages from across the world, and Professor Sinnemäki believes their work also holds relevance beyond linguistics. “We want our research to be visible beyond the linguistics field, to anthropologists, social psychologists and cognitive scientists for instance,” he says.
GRAMADAPT
Linguistic Adaptation: Typological and Sociolinguistic Perspectives to Language Variation
Project Objectives
The ERC Starting Grant project GramAdapt aims to develop a synthesis of crosslinguistic and sociolinguistic approaches to language variation. This novel framework enables sociolinguistic data to be transformed into a typology of speech communities, and supports research into whether language structures adapt to the social environment in which they are and have been spoken.
Project Funding
European Research Council ERC Starting Grant project “Linguistic Adaptation”
Project Partners
• Dr. Francesca Di Garbo
• Dr. Eri Kashima
• Dr. Ricardo Napoleão de Souza
Contact Details
Principal Investigator, Kaius Sinnemäki, Assoc. Prof. of Quantitative and Comparative Linguistics, Area Editor, Linguistics Vanguard, Department of Languages, University of Helsinki
T: +358 50 4482065
E: kaius.sinnemaki@helsinki.fi
W: https://www.helsinki.fi/en/researchgroups/linguistic-adaptation
W: https://researchportal.helsinki.fi/en/persons/kaius-sinnemäki
Dr Kaius Sinnemäki, Dr Francesca Di Garbo, Dr Eri Kashima, Dr Ricardo Napoleão de Souza (from left to right). Photographs by Mika Federley
Dr Kaius Sinnemäki is Associate Professor of Quantitative and Comparative Linguistics at the University of Helsinki. He conducts research on language variation and universals and develops new tools for comparing how linguistic patterns interact among themselves and with social patterns across languages. In 2020 he was selected for membership of the Young Academy of Europe.
Variables: Linguistic and Socio-Cultural
Linguistic Variables (data source: published material – reference grammars and other sources)
• Word and meaning (lexico-semantic) – example: kinship terms
• Speech sounds (phonological) – example: syllable structure
• Word formation and grammar (morpho-syntactic) – example: grammatical number and other inflection
Socio-cultural Variables (data source: a questionnaire designed for the project)
• Language ecology (macrostructural and language-ecological) – example: societal multilingualism, language ideologies
• Network structures (network structures and linguistic diffusion) – example: social network density
• Interaction (socio-cognitive) – example: audience design, shared norms
• Cognitive processing (general cognition and language processing) – example: bi- and multilingualism
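To make the pairing of the two variable groups concrete, the sketch below shows one hypothetical way a language set could bundle typological codes taken from reference grammars with socio-cultural codes taken from the questionnaire. This is not the project's actual data model; the class, field names and all values are invented for illustration.

```python
# Hypothetical sketch of a "language set" record (not the project's data model):
# typological variables coded from published grammars alongside socio-cultural
# variables coded from the project questionnaire. All values are invented.

from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class LanguageSet:
    area: str                         # one of the 24 geographic-cultural areas
    languages: Tuple[str, ...]        # the languages making up the set
    linguistic: Dict[str, str] = field(default_factory=dict)      # lexico-semantic, phonological, morpho-syntactic codes
    socio_cultural: Dict[str, str] = field(default_factory=dict)  # ecology, network, interaction, cognition codes

example = LanguageSet(
    area="Southern New Guinea",                     # illustrative area label
    languages=("Nmbo", "a neighbouring language"),
    linguistic={"syllable_structure": "complex", "grammatical_number": "marked"},
    socio_cultural={"societal_multilingualism": "high", "social_network_density": "dense"},
)

# Cross-tabulating linguistic codes against socio-cultural codes across all 50
# sets is what would allow a test for systematic adaptation.
print(example.area, example.linguistic, example.socio_cultural)
```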
Uncovering connections between Arabic texts
The Arabic textual tradition is large and complex, and historically authors frequently made use of earlier texts in their own works. The KITAB project is harnessing the power of technology to detect examples of text reuse across a large corpus of material, helping to build a deeper picture of the relationship between different authors and their books, as Professor Sarah Bowen Savant explains.
The Arabic written tradition is enormously intertextual, with authors from different periods often using ideas and excerpts from other scholars in their own works. Researchers have historically identified relations between texts by essentially lining them up and reading them side by side, without the use of technology, as Professor Sarah Savant explains. “Previously you couldn’t do this in a modern, digital way,” she says. Based at the Aga Khan University in London, Professor Savant is the Principal Investigator of the KITAB project, an ERC-funded initiative which is developing tools to help researchers identify where authors have re-used material from other works. “The term that’s used is text reuse. It signifies repetition, in whole or in part, of one text in another, with or without citation,” she outlines. “It can include paraphrasing, while it can also be very literal. The ways in which it occurs affect how we are able to find it, and also the ways in which we are able to measure it.” An algorithm called passim is central to this, enabling researchers to identify cases where authors borrowed language and ideas from other works. Created by David Smith, a computer scientist at Northeastern University, it operates in ways broadly similar to the anti-plagiarism software that is used in many universities across the world, as it looks for common sequences. “The original idea with KITAB was that we can use software to find similarities in the machine-readable Arabic files widely available online today,” says Professor Savant. Adapting passim for Arabic and the large written tradition has had its challenges. “We have also had to work to tailor passim to show us meaningful patterns of text reuse,” acknowledges Professor Savant.
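The basic idea of sequence-based text reuse detection can be shown with a toy sketch. Passim itself is far more sophisticated, working at corpus scale with alignment and filtering, and the Arabic examples it handles are of course not the invented English sentences used here; this is only a minimal illustration of flagging shared word sequences between two texts.

```python
# Minimal illustrative sketch of sequence-based text reuse detection - the
# general idea behind tools such as passim, which is far more sophisticated.
# The sample "texts" below are invented for the example.

def shingles(text, n=5):
    """Return the set of word n-grams ("shingles") in a text."""
    words = text.split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def shared_passages(text_a, text_b, n=5):
    """Word n-grams occurring in both texts - candidate reuse sites."""
    return shingles(text_a, n) & shingles(text_b, n)

earlier_work = "in the year of the conquest the caliph ordered the army to march east"
later_work = "he relates that the caliph ordered the army to march east without delay"

for gram in sorted(shared_passages(earlier_work, later_work, n=5)):
    print(" ".join(gram))
# Matches such as "the caliph ordered the army" flag passages worth reading side by side.
```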
Manuscript Köprülü 01589 from the Süleymaniye Library in Istanbul. Members of the KITAB project are transcribing folios such as these.
KITAB project
A large corpus of material has been assembled in the project, dating from around the 8th century right up to the 20th century. Altogether, the corpus comes to approximately 1.5 billion words. By harnessing the power of technology, researchers aim to build a deeper picture of the relationship between different texts.
The KITAB project is developing analytical tools to study the relationships between texts. This Power BI dashboard shows the many books in the project’s corpus that align with a major anthology of the 14th century entitled The Ultimate Ambition in the Arts of Erudition by Shihāb al-Dīn al-Nuwayrī (d. 732/1332). A user can click on a dot in the top part of the visualization and an alignment between al-Nuwayrī’s book and an earlier book will appear in another panel.
“We’re building a large corpus, while we’re also looking to reveal connections between these texts,” says Professor Savant. The texts themselves address a wide variety of different topics. “There are lots of texts pertaining to the religious history of Islam, on theology and lots of commentaries on the Quran. An enormous quantity of the texts that we have in the corpus pertain to the Prophet,” continues Professor Savant. “What there’s not a lot of are works on groups that were not part of the main narratives of early Islamic history. These groups were not part of the major scholarly apparatus of Sunni or Shi’i Islam.” There is a degree of survival bias in the corpus, as it’s more difficult to obtain texts on the less prominent groups in early Islamic history. It’s important in the project to build a broadly representative corpus, so Professor Savant and her colleagues are working on ways to add texts that treat all branches of knowledge in Arabic. “Our aim is to look at as many texts as possible,” she says. A large number of cases where there is a relationship between two texts have already been identified, and the research team is now focusing its attention on the most interesting examples. One particular interest is isnads. “In Arabic books we commonly find isnads, which are chains of transmission. These isnads are also found in histories and works of geography; they’re all over the Arabic textual tradition,” explains Professor Savant. “The Islamic textual tradition involves a lot of citations and as an author, who you cite matters.” This does not mean that authors invariably cited all of the different scholars that influenced their own work, however. For example, Professor Savant is studying a historian from the 10th century who was generally known for making numerous citations, yet in a specific case did not credit a major influence. “He didn’t cite a scholar who lived about a generation earlier than him at all, even though he took a substantial portion of his work on the reign of one of the caliphs from him,” she says. This kind of issue is fairly common in book history. “On the other hand, sometimes the citations are so numerous and complex that we can’t figure out what they actually mean,” outlines Professor Savant. “Citation is a part of the work, while a lot of texts are also more distantly related, where authors are writing about the same topics and using common language, but it’s very difficult to say who borrowed from whom.” There may also be variations between versions of ostensibly the same works, another area of interest in the project, with researchers comparing differences mathematically. The corpus itself is regularly updated and published for the wider research community. “Other researchers can also use the corpus, for example researchers in computational linguistics,” says Professor Savant.
Each of the dots on this graph represents a book within the OpenITI corpus. They are plotted according to the century in which the author died (x-axis) and what percentage of the work consists of transmissive chains. The red line represents the middle two quartiles for all books (blue = histories only).
www.euresearcher.com
in computational linguistics,” says Professor Savant. The methods that have been developed in the project could also be applied to analyse texts in other languages, a possibility that Professor Savant is keen to explore in the future. “These methods, including the ways in which we built the corpus, the OCR and the text reuse detection methods, are re-usable for other languages,” she continues. “One goal is to do the same kinds of work with other languages used by Muslims, such as Persian and Urdu for example. We’ve also been asked if we could do similar work with French, and the answer is absolutely yes.” The project’s work with Arabic is still ongoing, and Professor Savant has been sharing their results at conferences and events within the wider research community. A number of articles are being written as part of the project, and a virtual reading environment is being developed, which will provide a forum for discussion and debate. “We will give users the ability to see our texts and text reuse data, and will use technology to showcase the results of team members. We’ll have research articles written by team members, and we’ll display them on our website,” says Professor Savant.
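The article mentions text reuse detection only in passing, and the project’s actual pipeline is far more sophisticated than anything that fits here. Purely as an illustration of the underlying idea, the following minimal Python sketch flags shared passages by splitting two texts into overlapping five-word “shingles” and measuring how much the two sets overlap. The shingle size, the function names and the sample sentences are all assumptions made for the example, not parameters or data from KITAB.

# Minimal sketch of n-gram based text reuse detection, for illustration only.
# It is NOT the KITAB/OpenITI pipeline; shingle size and sample texts are arbitrary.

def shingles(text: str, n: int = 5) -> set:
    """Return the set of overlapping n-word sequences ("shingles") in a text."""
    words = text.split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def reuse_score(text_a: str, text_b: str, n: int = 5) -> float:
    """Jaccard similarity of the two texts' shingle sets (0 = no overlap, 1 = identical)."""
    a, b = shingles(text_a, n), shingles(text_b, n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical example: a later author quoting an earlier passage with small changes.
earlier = "the caliph ordered the construction of the new city on the banks of the river"
later = "he relates that the caliph ordered the construction of the new city on the banks of the river that year"
print(f"reuse score: {reuse_score(earlier, later):.2f}")  # noticeably above zero for shared passages

At the scale of a multi-million-word corpus the same idea has to be combined with efficient indexing and with tolerance for orthographic variation, which is where the OCR and alignment work described above comes in.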
KITAB
Knowledge, Information Technology, and the Arabic Book / Studying the formation and development of the written Arabic tradition with digital methods

Project Objectives
The KITAB project uses the latest digital technology to investigate the history of the Arabic textual tradition, one of the most prolific in human history. Its chief interest lies in how authors assembled works out of earlier ones. The circulation of texts will show how cultural memory was negotiated and shaped in the medieval Islamic world (ca. 700-1500).
Project Funding
KITAB is an initiative funded by the European Research Council (ERC). Grant agreement ID: 772989. The project also has funding from the Qatar National Library and the Andrew W. Mellon Foundation.
Project Partners
• Qatar National Library
• Northeastern University
• University of Maryland
• University of Vienna
• Leipzig University
Contact Details
Professor Sarah Bowen Savant
KITAB
Aga Khan University (International) in the United Kingdom
Aga Khan Centre
10 Handyside Street
London N1C 4DN
United Kingdom
T: +44 207 380 3843
E: sarah.savant@aku.edu
W: http://kitab-project.org
Professor Sarah Bowen Savant is a cultural historian specialising in Iran and the Middle East, based at the Aga Khan University. She is the author of The New Muslims of Post-Conquest Iran: Tradition, Memory, and Conversion, which won the Saidi-Sirjani Book Award, given by the International Society for Iranian Studies on behalf of the Persian Heritage Foundation.
Fit for the connected workplace?

High levels of stress at the workplace can cause burn-out, and staff may need to take weeks, months, or even years off work to recover. The use of physiolytics at the workplace could help companies monitor stress levels among staff, yet this also raises ethical and legal concerns, issues at the heart of Professor Tobias Mettler’s research.

Many European economies see high levels of absenteeism due to stress or burn-out at work. A variety of factors may be behind high levels of stress, such as an individual’s professional workload, or the responsibility of taking difficult decisions. “It might be an organisational resource problem, where there are not enough experts available to deal with all the work for example,” outlines Tobias Mettler, Professor of Information Management at the University of Lausanne. As the Principal Investigator of a research project funded by the Swiss National Science Foundation, Professor Mettler is exploring issues around the introduction of technologies intended to reduce levels of absenteeism. “We’re particularly looking at the stress caused by suboptimal organisational resource management,” he explains. “If a department is severely understaffed, you could essentially measure elevated stress levels in the whole group.”
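Professor Mettler’s project is concerned with the organisational, ethical and legal questions rather than with any particular algorithm, but the idea of reading stress at the level of a whole department can be illustrated with a short, hypothetical Python sketch. It assumes, purely for illustration, that each employee’s wearable produces a daily stress score, that each person is compared against their own recent baseline, and that only the averaged result is examined; none of the names, numbers or thresholds below come from the project.

# Illustrative sketch only: a group-level stress indicator from per-person readings.
# The scoring, baseline window and sample values are assumptions, not the project's method.
from statistics import mean, stdev

def personal_zscore(history: list[float], today: float) -> float:
    """How unusual today's reading is relative to this person's own baseline."""
    if len(history) < 2 or stdev(history) == 0:
        return 0.0
    return (today - mean(history)) / stdev(history)

def department_stress(readings: dict[str, tuple[list[float], float]]) -> float:
    """Average of the per-person z-scores; values well above 0 suggest group-level strain."""
    scores = [personal_zscore(history, today) for history, today in readings.values()]
    return mean(scores) if scores else 0.0

# Hypothetical daily stress scores (e.g. derived from heart-rate variability).
team = {
    "employee_a": ([32, 35, 31, 33], 48),
    "employee_b": ([40, 42, 39, 41], 55),
    "employee_c": ([28, 30, 29, 27], 44),
}
print(f"department indicator: {department_stress(team):.1f}")

The point of such an indicator would be the group-level signal rather than the identification of individuals, which is precisely where the privacy questions discussed below begin.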
PHYSIOLYTICS AT THE WORKPLACE

Full Title
Physiolytics at the workplace: How to increase motivation of and create value for employees

Funding
The project is funded by the Swiss National Science Foundation, Grant No. 172740.

Project Coordinator
Professor Tobias Mettler
IDHEAP | Institut de hautes études en administration publique
Swiss Graduate School of Public Administration
Information Management Research Unit
Université de Lausanne
Bâtiment IDHEAP
CH-1015 Lausanne
T: +41 (0)21 692 69 50
E: tobias.mettler@unil.ch
W: www.unil.ch/idheap
Physiolytics technology

The use of biosensors and physiolytics devices could enable a company to monitor stress levels among employees, and then take appropriate action. However, this prospect does raise privacy concerns, an issue Professor Mettler is addressing in the project. “In many countries health is something that is very personal and private. People are not used to the idea that a company can monitor their health,” he points out. The challenge, then, is to motivate people to use these kinds of technologies.
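One kind of safeguard that could make such monitoring more acceptable, offered here only as an illustrative sketch and not as something the project prescribes, is to ensure that an employer never sees individual readings, only aggregates over groups large enough to preserve anonymity. The minimum group size of five in the snippet below is an arbitrary assumption.

# Illustrative privacy safeguard: release only group aggregates above a minimum size.
# The minimum group size of 5 is an arbitrary assumption made for this sketch.
from statistics import mean
from typing import Optional

MIN_GROUP_SIZE = 5

def reportable_average(individual_scores: list[float]) -> Optional[float]:
    """Return the group average only if the group is large enough to avoid
    identifying individuals; otherwise return nothing."""
    if len(individual_scores) < MIN_GROUP_SIZE:
        return None  # too few people: releasing a value could expose individuals
    return mean(individual_scores)

print(reportable_average([48, 55, 44]))           # None: group too small to report
print(reportable_average([48, 55, 44, 50, 52]))   # 49.8: safe to show an aggregate

Returning nothing at all for small groups, rather than a rounded or noisy value, keeps the rule easy to explain to employees, which matters for the questions of trust and motivation discussed in the article.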
“The first thing is to convince people; what’s in it for them? We have done a lot of research in this area,” continues Professor Mettler. “In certain jobs, like firefighting or policing, you need to have a certain level of physical fitness to do the job. In these examples, there may be a stronger case for compelling people to use these kinds of sensors. But in other jobs adoption will be voluntary, so we will need to convince people.”

This is a challenging task, as the impact of these technologies is not typically visible; it’s about prevention rather than cure. An individual may find that their sensor readings are entirely normal and believe that they don’t need to use it any more, another topic Professor Mettler is investigating. “We’re looking into the question of long-term motivation. How can we convince people to use a sensor on a regular basis, to make it a habit?” he asks. One approach would be through using nudge theory and behavioural insights, yet this raises ethical concerns. “Do we really want our employer to nudge us? What kinds of nudges are acceptable from an employee perspective? We work together with employees on these kinds of questions,” says Professor Mettler. “Another thing that we’ve looked at is gamification approaches. So how can we do it in a more fun way to encourage people to adopt these biosensors?”

A third major strand of research in the project centres around providing evidence about the utility of this technology and its impact on absenteeism, which is central to a company’s decision on whether to invest in it. For smaller companies, biosensors could be an attractive option in terms of taking care of their employees’ health, but the technology has to be effective. “They will only invest in this kind of technology if they know that it really has an effect,” stresses Professor Mettler. Cost is of course another important topic for companies, yet Professor Mettler says this technology is not expensive; the questions are more around ethical and legal considerations. “Surveillance technologies are already used in some workplaces, the difference is that in future it will be a lot more individual, more personalised,” he says. “The technology in the connected workplace is not expensive, but we need to think about ethical and legal safeguards.”

Tobias Mettler is Associate Professor of Information Management at the Swiss Graduate School of Public Administration. His work focuses on the socio-technical design of digital technologies for the public sector.