Editor’s Note
A lot can happen in 24 hours with the current British Government, so it is with caution that I say, 'at the time of writing', things are looking up for a potential renewal of connections between UK and European science teams working together on joint research.
European Commission President Ursula von der Leyen declared that talks on the UK joining the Horizon science research programme can begin 'the moment' the so-called Windsor framework protocol on Northern Ireland is approved.
The sticking point of the mechanics of post-Brexit trade in Northern Ireland has been an ‘elephant in the room’ for the UK and Europe since the slightly acrimonious breakup.
The reality is that Britain did very well out of research funding from the near-hundred-billion-euro Horizon 2020 coffers, receiving seven billion euros as one of the leading beneficiaries.
Brexit was enormously frustrating for British universities, which lost not only important research funds but equally important collaborations with European researchers.
As a seasoned editor and journalist, Richard Forsyth has been reporting on numerous aspects of European scientific research for over 10 years. He has written for many titles including ERCIM’s publication, CSP Today, Sustainable Development magazine, eStrategies magazine and remains a regular contributor to the UK business press. He also works in Public Relations for businesses to help them communicate their services effectively to industry and consumers.
What’s more, the Campaign for Science and Engineering (CaSE) earlier indicated that over one and a half billion pounds that was supposed to be set aside for UK research by the Conservative government was instead returned to the Treasury.
All this to one side, it is with great hope for science in general that Britain and Europe can re-establish a strong relationship in research, to tackle some of the most important scientific challenges together rather than apart.
Hope you enjoy the issue.
Richard Forsyth, Editor
4 Research News
EU Research takes a closer look at the latest news and technical breakthroughs from across the European research landscape.
10 Diversity and compartmentalisation of monocytes & macrophages and immuneparesis in patients with cirrhosis
The project: ‘Diversity and compartmentalisation of monocytes & macrophages and immuneparesis in patients with cirrhosis’ aims to understand the immune response in cirrhosis patients, as Dr. Christine Bernsmeier explains.
12 PATHOGEN HOST CELL INTERACTION
We spoke to Dr. Volker Heussler, who researches the development of the Plasmodium parasite inside the hepatocytes, and the factors that influence its survival or elimination.
14 5D HEART PATCH
The 5D Heart Patch Project has identified human ventricular progenitor (HVP) cells that can create self-assembling heart grafts in vivo, offering hope to people suffering from heart failure.
17 CSI-FUN
Researchers in the ERC-funded CSI-FUN project are investigating the underlying molecular mechanisms behind CSI and how it spreads through the body, as Erwin Wagner explains.
20 IN-SITU INVESTIGATION OF INTERFACIAL REACTIONS
It’s essential to ensure that implants are safe, a topic at the heart of Dr Anna Igual Munoz and Dr Stefano Mischler’s work in collaboration with Prof. Dr Brigitte Jolles-Haeberli.
22 SAHR
Researchers in the SAHR project are studying how humans acquire fine motor skills and looking to develop new learning and control algorithms for robots, as Professor Aude Billard explains.
24 Earthquake Research and Innovation
Innovation can make a difference in earthquake-prone areas, for prediction, resistance and search and rescue. Here are some ways science can mitigate the horrors that can occur in earthquakes. By Richard Forsyth.
28 REALM
Researchers in the REALM project are looking again at earth system modelling practice as they work to develop a new kind of vegetation model, as Professor Colin Prentice explains.
30 EVOCLIM
New assessment models will provide a more solid basis for comparison of instruments like carbon pricing, regulation and information provision, as Professor Jeroen van den Bergh explains.
32 COOLER
Professor Peter van der Beek and his colleagues in the COOLER project are investigating the transformation of fluvial landscapes into glacial landscapes.
34 MOBILITY IN SWITZERLAND
In their research, Dr Axel Franzen and Fabienne Woehner are investigating how digitalizing the labour market could help transform the mobility sector.
36 CIRCULAR FLOORING
Researchers in the Circular Flooring project are using the CreaSolv® Process to recover PVC from post-consumer waste flooring, enabling its eventual re-use, as Thomas Diefenhardt and Dr. Martin Schlummer explain.
38 LET IT SHINE
Why do companies choose to enter moral markets? Dr Panikos Georgallis says it’s not just about resources, but also a firm’s identity and the social context.
39 MixITiN
Dr Aditee Mitra and Dr Xabier Irigoien tell us how the MixITiN project has been bringing marine ecology into the 21st century, work which holds important implications for ocean health and policies.
42 THE EVOLUTION AND RESOLUTION OF SEXUAL CONFLICT IN FLOWERING PLANTS
Why is it that one plant species has separate sexes, while its sister species in the phylogeny remains hermaphroditic? This question lies at the core of Professor John Pannell’s research.
45 NanoInformaTIX
We spoke to Lisa Bregoli about the NanoInformaTIX project’s work in developing a framework and web-based platform to predict the behaviour of nanomaterials, which will support their safe-by-design development.
48 smMIET
We spoke to Dr. Jörg Enderlein, about his project Single-Molecule Metal-Induced Energy Transfer, a new method of three-dimensional fluorescence microscopy with a resolution on a molecular scale.
50 ION4RAW
The ION4RAW project aims to develop new, more sustainable and environmentally-friendly methods to recover critical raw materials and metals from mining sites, as Maria Tripiana explains.
53 ALPHA
Herbert Edelsbrunner, a pioneer of alpha shapes and persistent homology, is using these ground-breaking computational geometry ideas to empower new applications such as cancer detection and modelling.
56 The Chatbots Revolution
ChatGPT has arrived and demonstrated that AI can present research articles, write creatively, program code, do your child’s homework accurately and write poems instantaneously. Where is language AI going and is it a good place? By Richard Forsyth.
60 BALIHT
We spoke to Marta Pérez and Thomas Hoole about the work of the BALIHT project in developing and testing a redox flow battery capable of working at high temperatures.
62 SHeLL
Prof. Josephine Joordens shares revelations unearthed in the project Studying Homo erectus Lifestyle and Location (SHeLL), in which she is reassessing a dig site in Trinil (Indonesia) through a geoarchaeological re-excavation.
65 PASSIM
We spoke to Professor Eva Hemmungs Wirtén about her work in investigating the patent system and its role in creating the information infrastructure that shapes our lives today.
68 PHILAND
We spoke to Professor Godefroid de Callataÿ, Sébastien Moureau and Liana Saif about their work investigating the origins of philosophy in al-Andalus, and its importance to the history of ideas.
72 CROSS-POP
How do right-wing populists adapt their discourse in cross-border regions characterised by strong economic interdependences? Dr Christian Lamour and Professor Oscar Mazzoleni are investigating this question.
74 eQG
Professor Hermann Nicolai and his colleagues in the eQG Group are working to develop a new theory of quantum gravity, bringing together general relativity and quantum mechanics.
EDITORIAL
Managing Editor Richard Forsyth info@euresearcher.com
Deputy Editor Patrick Truss patrick@euresearcher.com
Science Writer Holly Cave www.hollycave.co.uk
Science Writer Nevena Nikolova nikolovan31@gmail.com
Science Writer Ruth Sullivan editor@euresearcher.com
PRODUCTION
Production Manager Jenny O’Neill jenny@euresearcher.com
Production Assistant Tim Smith info@euresearcher.com
Art Director Daniel Hall design@euresearcher.com
Design Manager David Patten design@euresearcher.com
Illustrator Martin Carr mary@twocatsintheyard.co.uk
PUBLISHING
Managing Director Edward Taberner ed@euresearcher.com
Scientific Director Dr Peter Taberner info@euresearcher.com
Office Manager Janis Beazley info@euresearcher.com
Finance Manager Adrian Hawthorne finance@euresearcher.com
Senior Account Manager Louise King louise@euresearcher.com
EU Research Blazon Publishing and Media Ltd 131 Lydney Road, Bristol, BS10 5JR, United Kingdom
T: +44 (0)207 193 9820
F: +44 (0)117 9244 022
E: info@euresearcher.com www.euresearcher.com
© Blazon Publishing June 2010
ISSN 2752-4736
The EU Research team take a look at current events in the scientific news
Repowering the EU with hydrogen valleys, backed by over €105 million of investment
European Commissioner announces Clean Hydrogen Partnership to fund nine hydrogen valleys projects across Europe.
The Clean Hydrogen Partnership, under the Horizon Europe Programme, has announced it is investing €105.4 million ($114.45 million) to fund nine hydrogen valleys across Europe. Negotiations on the grant agreements for these projects have begun and are expected to conclude before summer 2023. The hydrogen valleys will focus on the production of clean hydrogen and address various applications in the energy, transport and industry sectors, and are expected to mobilise investments of at least five times the funding provided by the European Union, or above €500 million ($541.17 million).
Additionally, the European Commission allocated an extra €200 million ($216.47 million) to the Clean Hydrogen Partnership via REPowerEU to further benefit the hydrogen valleys. The Clean Hydrogen Partnership is a public-private partnership supporting research and innovation activities in hydrogen technologies in Europe under the Horizon Europe Programme; its members include the European Commission, Hydrogen Europe (representing hydrogen industries) and Hydrogen Europe Research (representing the research community).
The Clean Hydrogen Partnership has already started the grant process for two flagship hydrogen valleys. The first of these valleys will be spread across the North Adriatic region, comprising Slovenia, Croatia, and the Autonomous Region of Friuli Venezia Giulia in Italy. The goal of the second valley is to build a hydrogen corridor across Estonia, South Finland and other Baltic Sea countries. Beyond these, seven smaller-scale hydrogen valley projects have been planned for areas of Europe that have little or no presence of these valleys, such as regions in Ireland, Luxembourg, Turkey, Italy, Greece, and Bulgaria.
Mariya Gabriel, Commissioner for Innovation, Research, Culture, Education and Youth said “Hydrogen Valleys are key for the creation of a European research and innovation area for hydrogen. They prove that European cooperation can catalyse innovation, create jobs and opportunities while tackling the great energy challenges of our times. And we will rapidly hit the target of doubling the number of operational Hydrogen Valleys by 2025.”
Years of support for research and innovation on hydrogen have put the EU in the global lead for key hydrogen technologies, notably electrolysers, hydrogen refuelling stations and megawatt-scale fuel cells. Horizon Europe supports the Clean Hydrogen Joint Undertaking (CHJU) with €1 billion, matched by the same amount from industry and research partners. As part of REPowerEU, the Commission has allocated an additional €200 million to the CHJU to accelerate the rollout of Hydrogen Valleys. The Commission also recently granted approximately €4 million under Erasmus+ for a long-term partnership between industry and education to develop advanced skills for the hydrogen economy.
Other EU programmes also offer opportunities for investment in Hydrogen Valleys such as the Recovery and Resilience Facility, the Cohesion policy funds under the relevant smart specialisation priorities, and the Connecting Europe Facility.
Scientific leaders have urged the UK government to re-join the EU’s €95.5bn Horizon research programme as soon as possible, after prime minister Rishi Sunak questioned whether membership represented value for money. “The UK will find it extremely difficult to be an effective research power if it is . . . not part of the European research network,” Sir Paul Nurse, head of the Francis Crick Institute in London, said on Tuesday at the launch of a government-commissioned report into Britain’s scientific research base. “Frankly, alternative arrangements that are being discussed elsewhere will be utterly inadequate in comparison,” he added. The FT reported last week that Sunak was looking at other options in place of Horizon membership, including the UK’s “plan B” global research plan. “Of course we should extend connections to the rest of the world, but first we have to do association with Europe, our closest neighbours, where we have networks already working,” Nurse said.
Tom Grinyer, chief executive of the UK’s Institute of Physics, agreed. “The government’s continued hesitation on Horizon puts the government’s tech ambitions — and the UK’s future as a science superpower — at risk,” he warned. The country’s science leaders have become increasingly alarmed that Sunak might decide not to re-join Horizon. On Monday, the UK’s science and technology secretary Michelle Donelan underlined the government’s stance on membership: “It would have to be on acceptable and favourable terms,” she said. “It would have to be value for money for the taxpayer.”
Last week Ursula von der Leyen, European Commission president, said work would begin on Britain’s associate membership of Horizon after a breakthrough on a new post-Brexit deal on Northern Ireland trade, known as the Windsor framework. Brussels had blocked British scientists from joining the scheme last year because of the dispute over Northern Ireland. But negotiations on re-joining are expected to take between six and nine months, as London and Brussels work out financial arrangements and agree on the extent of UK participation in Horizon. The top priority for British scientists is regaining access to the European Research Council, which funds the highest-quality scientific projects.
The UK had originally allocated as much as £15bn for participation over the seven-year Horizon programme that runs to 2027, but with just three to four years left that figure will fall significantly. The cross-party UK trade and business commission wrote to British Prime Minister Rishi Sunak on Tuesday evening, calling on him to “urgently recommit” to joining the EU programme. “The longer the UK delays on Horizon membership, the longer the UK will be unnecessarily excluded from significant funding opportunities, and from the economic benefits that international scientific collaboration can bring,” said the commission, whose joint leaders are the Labour MP Hilary Benn and Peter Norris, Virgin Group chair. The government responded: “We will continue to discuss how we can work constructively with the EU in a range of areas, including future collaboration on research and innovation.”
The UK’s new science and technology minister said Britain is willing to “go it alone” if an agreement cannot be reached.
Scientists call for UK to re-join EU’s Horizon research programme
ERC decides to go ahead with lump-sum funding pilot
The European Research Council (ERC) is set to introduce lump sum funding to its Advanced grants for experienced researchers starting in 2024. The decision was made by the ERC Scientific Council on Friday, with two provisos: the move must not affect the autonomy of principal investigators in managing their grants, and there will be no explicit “deliverables” or milestones that could undermine scientific excellence by preventing researchers from acting upon unexpected findings.
ERC is set to give out almost €16 billion to researchers under Horizon Europe. Like most EU research funding, its grants are given based on the real cost of a project, but the introduction of lump sums will lift the burden of reporting, and will instead mean researchers are paid on the basis of activities carried out. Laura Keustermans, senior policy officer at the League of European Research Universities (LERU), says universities in the LERU network welcome the pilot and are particularly happy the Scientific Council acknowledged the autonomy of principal investigators and the unpredictable nature of fundamental research. “We do see advantages of lump sum funding as long as the ERC is not losing what makes it so special,” said Keustermans.
Thomas Estermann, director for governance, funding and public policy development at the European University Association (EUA), is more sceptical about the move. While he says there’s nothing wrong with lump sum funding if it’s done right, the ERC should not rush it. Instead, beneficiaries should be given a choice on how they want to be reimbursed. “The easiest way would be to provide an option for the applicants,” he said.
Whether lump sums work will depend on the implementation of the approach, the details of which remain to be seen. Estermann urges cautiousness, warning “the devil is always in the detail and implementation.” He adds that a well-designed lump sum system must be transparent, support the principles of financial sustainability, the autonomy of research and not add another administrative burden to the beneficiary in terms of reporting.
Keustermans says it’s important to continue monitoring the situation and to talk to applicants, successful and unsuccessful, “to try to get an idea on how much additional work there is with the lump sum approach.” The ERC promises that researchers’ flexibility will remain unchanged. It also notes the decision to adopt lump sum funding is tentative, with the final decision expected “in accordance with the timeline set out for the adoption of the ERC work programme 2024.”
UN treaty to protect oceans agreed after decades of talks
After almost 20 years of talks, UN member states agree on a legal framework for parts of the ocean outside national boundaries.
Nearly 200 countries have agreed to a legally-binding “high seas treaty” to protect marine life in international waters, which cover around half of the planet’s surface, but have long been essentially lawless. The agreement was signed on Saturday evening after two weeks of negotiations at the United Nations headquarters in New York ended in a mammoth final session of more than 36 hours – but it has been two decades in the making.
The treaty provides legal tools to establish and manage marine protected areas – sanctuaries to protect the ocean’s biodiversity. It also covers environmental assessments to evaluate the potential damage of commercial activities, such as deep sea mining, before they start and a pledge by signatories to share ocean resources. “This is a historic day for conservation and a sign that in a divided world, protecting nature and people can triumph over geopolitics,” Laura Meller, Oceans Campaigner at Greenpeace Nordic, said in a statement.
The high seas are sometimes called the world’s last true wilderness. This huge stretch of water – everything that lies 200 nautical miles beyond countries’ territorial waters – makes up more than 60% of the world’s oceans by surface area. These waters provide the habitat for a wealth of unique species and ecosystems, support global fisheries on which billions of people rely and are a crucial buffer against the climate crisis – the ocean has absorbed more than 90% of the world’s excess heat over the last decades.
Yet they are also highly vulnerable. Climate change is causing ocean temperatures to rise, and increasingly acidic waters threaten marine life. Human activity on the ocean is adding pressure, including industrial fishing, shipping, the nascent deep sea mining industry and the race to harness the ocean’s “genetic resources” – material from marine plants and animals for use in industries such as pharmaceuticals.
“Currently, there are no comprehensive regulations for the protections of marine life in this area,” says Liz Karan, oceans project director at the Pew Charitable Trusts. Rules that do exist are piecemeal, fragmented and weakly enforced, meaning activities on the high seas are often unregulated and insufficiently monitored leaving them vulnerable to exploitation. Only 1.2% of international waters are protected, and only 0.8% are identified as “highly protected.”
Douglas McCauley, professor of ocean science at the University of California Santa Barbara, said: “There are huge unmanaged gaps of habitat between the puzzle pieces. It is truly that bad out there.” Countries now have to formally adopt and ratify the treaty. Then the work will start to implement the marine sanctuaries and to attempt to meet the target of protecting 30% of global oceans by 2030. “We have half a decade left, and we can’t be complacent,” Meller said.
“If we want the high seas to be healthy for the next century we have to modernize this system – now. And this is our one, and potentially only, chance to do that. And time is urgent. Climate change is about to rain down hellfire on our ocean,” McCauley said.
The council says yes to no-strings-attached funding for experienced researchers. The move is meant to reduce the administrative burden.
Room-temperature superconductor discovery meets with resistance
If Professor Ranga Dias of the University of Rochester, New York, and his team have observed room-temperature (294 K), near-ambient pressure superconductivity, their discovery could rank among the greatest scientific advances of the 21st century. Such a breakthrough would mark a significant step toward a future where room-temperature superconductors transform the power grid, computer processors, and diagnostic tools in medicine.
But for the past three years, the Rochester team—and Dias in particular—has been shrouded in allegations of scientific misconduct after other researchers raised questions about their 2020 claim of room-temperature superconductivity. In September, the Nature paper reporting that result was retracted. Further misconduct allegations against Dias have recently emerged, with researchers alleging that Dias plagiarized substantial portions of someone else’s doctoral thesis when writing his own and that he misrepresented his thesis data in a 2021 paper in Physical Review Letters (PRL). Jessica Thomas, Executive Editor of the Physical Review journals, confirmed that PRL has launched an investigation into that accusation. “This is a pretty serious allegation,” she says. “We are not taking it lightly.”
Scientists have been working for decades to identify new superconductors, materials that can transmit electricity without friction-like resistance. However, previously discovered superconductors only work at extremely cold temperatures and under incredibly high pressures. The newly reported superconductor, nitrogen-doped lutetium hydride, could be much more useful than materials that must be kept super-cold in applications like the strong magnets used in MRIs, magnetically levitating trains, and even nuclear fusion.
Dirk van der Marel, a University of Geneva physicist who wasn’t involved in the new research or Dr. Dias’s other work, was among those who raised issues about the 2020 data. Dr. Dias said the retracted paper has been resubmitted to Nature after he and his colleagues collected new data in front of other scientists at the Argonne and Brookhaven National Laboratories, in Illinois and New York, respectively. He added that his group made all their data around “reddmatter” available during the peer-review process for the new paper. Although Dr. van der Marel said the new study appeared to properly demonstrate the effect in “reddmatter,” he said he feels “extremely uncomfortable about the whole thing.”
For his part, Dr. Dias said his group is already looking to tweak their “reddmatter” recipe to try to achieve superconductivity at even warmer temperatures and lower pressures. One idea is throwing other rare-earth elements that are similar to lutetium into the mix, though these rare elements are expensive, Dr. Dias said. He hopes to try a different approach—maybe aluminum with a dash of something else—which is cheaper to make and can mimic lutetium’s effects.
The group will start using machine learning to select their next superconductor recipes. They are training algorithms with data from this new work and previous experiments to help the AI better predict which combinations of hydrogen and other elements may yield superconducting materials. “It is remarkable that Mother Nature allows us to use different pathways to get to these remarkable superconducting states,” said Dr. Salamat, who added that dropping the pressure down to zero is the group’s next goal. Dr. Dias said he’s confident that achievement is coming: “It is just a matter of time.”
Despite the superconductor breakthrough, some scientists remain sceptical, with concerns over data manipulation and plagiarism.
A 1-mm-wide sample of the nitrogen-doped lutetium hydride created by Dias et al. | Photo Credit: University of Rochester/J. Adam Fenster
War in Ukraine leads to rethink of international scientific cooperation
As the war passes its one-year anniversary, the future of Ukraine’s research community remains uncertain, despite EU support for Ukraine.
A year ago, Russia’s full-scale invasion of Ukraine redefined geopolitics in a shockwave that is still reverberating through the science world. The EU research community was quick to cut ties with Russia and lend Ukraine a helping hand – but now it is grappling with the resulting instability and uncertainty as the war enters its second year. Lucian Brujan, programme director for international relations and science diplomacy at the German National Academy of Sciences Leopoldina, says it’s too early to say what the long-term impact will be on research and innovation – and urges patience.
“I think many in the community are waiting to see how the political problems will be solved and how this war will end; and after that, we’ll need to have a discussion,” Brujan says. “We have to be honest with ourselves in the scientific community. We are dealing with political and security uncertainty.” But what is clear already is the shift in discourse on international cooperation. While it’s hard to judge the effect in hard terms, conversation has shifted away from blanket arguments in favour of openness, towards a more careful attitude, observes Thomas Jørgensen, director for policy coordination and foresight at the European University Association.
As global tensions intensify, eyes turn to China, which in recent weeks has been deliberating whether to send weapons to Russia. That the EU has a complex relationship with China isn’t new, but the tense geopolitical situation and China’s equivocation on the war in Ukraine have added a new dimension to it, says Lidia Borrell-Damian, secretary general of Science Europe.
The complexity isn’t just about big politics but “comes from different legislative approaches in the EU and China regarding open access, open science, treatment of data, the outcomes of research. The difficulties of research collaboration with China have been there for years now,” adds Borrell-Damian. The EU cut off all research ties with Russia as the war broke out, and now the big question is whether it was a one-off extreme measure, or a realisation about the complexity of the world we live in that will lead to reappraisal of research ties with others, including China.
Overall, Brujan notes, “The war has shown one clear trend: scientific organisations and even scientists are way more careful than they have been before. Various aspects are being reconsidered, from security to fundamentals to practicalities.” This is a normal reaction to navigating an unstable environment, he adds. “We need to see how this prudence is going to affect scientific cooperation globally. It doesn’t mean we have to give up our way of doing things: my message is to have patience and observe sharply what’s going on.”
The important thing now is to keep discussions going, at all levels. Talks under the European Research Area (ERA) framework have been fruitful, but “what we don’t know, and where there can be a mismatch, is between high-level diplomatic geopolitics discussions and research policy discussions,” said Jørgensen. For this discussion to happen, scientists and policymakers will need to learn to speak each other’s languages, and take a more practical approach, Brujan says.
Taiwan’s investment fund begins to make a mark in central and eastern Europe
With $200 million to invest in start-ups and scale-ups, Taiwan’s Central and Eastern Europe Investment Fund is becoming a significant force in the region.
A $200 million venture capital fund set up by Taiwan’s National Development Fund is filling a gap for start-up funding in central and eastern Europe and building connections with Taiwan’s industry and research base. Following its formation in March 2022, the fund has made three investments to date, and is shaping up to be a significant player in a region where local venture capital is scarce. “From the $200 million fund, we are looking to invest in 20-25 companies,” said Mitch Yang, managing partner of the Central and Eastern Europe (CEE) Investment Fund at venture capital firm Taiwania Capital. “We have just started, and we are going to speed up, so this year our target is to invest in six to eight companies.”
The fund has a broad range of interests, taking in semiconductors, laser optics, biotechnology, aerospace, fintech, electric vehicles, smart manufacturing, artificial intelligence and smart cities. Its first investments are in Litilit, an industrial femtosecond laser start-up based in Lithuania; Photoneo Brightpick, a computer vision and robotics business in Slovakia; and Oxipit, an AI medical imaging start-up in Lithuania. The fund is open to all countries in the region, but prioritises investments in companies from Lithuania, Slovakia, and the Czech Republic. This focus reflects where the fund feels its investments can have the most impact and find the best fit with Taiwan’s industrial strengths. But it is also influenced by broader developments in the relationship between Taiwan and the region.
The origin of the fund lies in the COVID-19 pandemic, which helped forge new connections between Taiwan and countries in central and eastern Europe. In the early months of the pandemic, Taiwan sent masks and medical supplies to the region, with countries such as Lithuania, the Czech Republic, Slovakia, and Poland returning the favour with vaccines when the virus spread in Taiwan. This mutual aid resulted in a government delegation from Taiwan to the region in October 2021, and discussions about how cooperation could be extended. The fund is one result of these discussions.
Looking to the future, Yang sees the fund as the first step in sustained cooperation between Taiwan and the region, which will be driven by current events such as the war in Ukraine. “Once the war is over, there will be reconstruction and something like a Marshall Plan,” he said. “That will be focused on Ukraine, of course, but it will affect the whole region. Ukraine, Poland, Slovakia, the Czech Republic, and the Baltics are going to play important roles.” While the US is likely to drive that reconstruction effort, it will need to be a collaboration. “I want Taiwan to play a part, in our strengths, our values, and our convictions,” Yang said. “We want to be a player, and to invest based on our values.”
A new strategy for liver cirrhosis
Liver cirrhosis is scarring (fibrosis) of the liver that develops after long-term damage and impedes or prevents the function of the organ. It can go unnoticed or unchecked until significant damage is done, or until it is too late to save the organ, and often the life of the patient.
Liver cirrhosis is the eleventh leading cause of death worldwide, and there is no cure for decompensated liver failure, meaning transplants are the only option open to those with failing livers. With such a destructive, widespread disease and no cure available, finding effective ways of assessing a damaged liver's condition is important. A major cause of mortality in cirrhosis patients is infection: about a quarter of hospitalised cirrhosis patients are admitted due to infections. Understanding how the immune system fails at the cellular level could help pave the way toward new therapies specifically aimed at thwarting infection.
“Cirrhosis can occur in any person who has chronic liver disease. If cirrhosis occurs, it can be a bad prognosis but it is not always related to liver failure only, as most of these people die from infections. The underlying mechanisms are not well understood, and this is why we are working on this,” confirmed Dr. Bernsmeier. “It is easy to assume that in the person with the diseased liver, the liver gets worse until it fails, and they die. In reality, infection is a critical step because if infection occurs, the liver function gets worse. A downward spiral occurs which leads to death from sepsis, and this is what we are trying to understand because in many cases it is not the liver disease specifically that is fatal, it’s the infections from a failed immune response.”
This infection susceptibility, due to a failing immune system (immuneparesis), appears to accelerate cirrhosis. The project team are tasked with investigating the mechanisms of this process.
Discovering ‘tell-tale’ dysfunctional cells
In our innate immune system, monocytes and macrophages are crucial white blood cells that fight infection. In patients with liver disease, these cells change in different ways during the stages of cirrhosis, making them ultimately ineffective against harmful bacteria. Finding out how they change in each stage could create a gauge, allowing healthcare professionals to understand how critical the condition of the liver is.
“With this project, we are trying to more systematically see which kinds of subsets or states of monocytes are differentiating in that kind of disease, and so far, we have identified different subsets but we are convinced that there are more states that can be found.”

The project, ‘Diversity and compartmentalisation of monocytes & macrophages and immuneparesis in patients with cirrhosis’, explores new ways to understand the failing immune response in cirrhosis patients, as Research Group Leader, Dr. Christine Bernsmeier, explains.
The subsets discovered in the circulation of patients with cirrhosis were monocytic myeloid-derived suppressor cells (M-MDSCs) and immune-regulatory CD14+DR+AXL+ and CD14+MERTK+ cells. In advanced stages of cirrhosis these dysfunctional subsets were more prevalent than the typical monocytes found in health, and were less able to fight infections.
“With MDSCs for example, these cells have also been found in tissues surrounding cancers so there is something similar, where monocytes and macrophages are somehow immobilised so they could not defend against cancer or infection. It would be interesting to see what happened if we could rebirth these cells or attack them or eliminate them.”
Thanks to funding from the SNF (Swiss National Science Foundation), the research team, involving Dr. Bernsmeier's group as well as their project partners, is examining biological samples from cirrhosis patients at various stages of deterioration, to systematically identify which cells have changed and become ineffective. They examine blood, ascites (fluid from the abdominal cavity) or liver biopsies.

“We can identify the monocytes in the blood, and then we can investigate and characterise them. We can do that as a systematic approach. We did that with single-cell RNA sequencing, which is a relatively new approach to looking at the transcriptome of every single cell that we isolate. We also use classic approaches such as flow cytometry, where we can look at the specific targets that we want to stain and see their expression, and we also look at the function of these cells to see whether they still function in the sense that they would defend against infection. We are trying to look at a variety of patients at different stages, which may be advanced or not advanced, and we try to find out the difference and also the cut-off point, where the system will collapse.”

“There are not many groups working on this on a molecular basis. Other groups are looking at slowing down or reverting fibrosis, to prevent cirrhosis from happening or persisting, but we are working on the potential to intervene in the very last stages, and I think that is new and a different approach. We do not favour working on animal models because this is a multi-systemic human disease, which affects the entire body with all its compartments, and this cannot be mimicked in an animal. It makes it difficult for us to foresee everything at a glance; for example, we cannot take out a spleen and have a look at its macrophages, so these are obstacles that we have to find ways to overcome.”

The challenges of the human body

The research has been conducted on biological materials in the laboratory and there have been no interventions on patients to date, as much research is still needed, especially to identify more suitable targets.

“We have identified these subsets, but the problem is that we have now realised that in every compartment of the body, like the liver, the abdominal cavity, the blood circulation and so on, these targets are differently expressed. I think if we want to have an immunomodulatory approach, it needs to be not only stage-specific but also compartment-specific. This is the next step we are working on and wish to investigate,” said Bernsmeier.

The main objective of the project is to decipher the diversity of monocyte and macrophage differentiation across different compartments as well as different disease stages in patients with cirrhosis. This approach has many complex challenges, but the pay-off could potentially save many lives.

The long-term aim of this research is to offer hope for patients, especially those in the direst stages of liver cirrhosis, beyond the desperate solution of a liver transplant. Further to this is the wish that the research can be a stepping stone to finding a way to restore the immune system's functionality, enabling patients to fight infection and, in turn, allowing physicians more time and space to treat and reverse the underlying liver disease.

Diversity and compartmentalisation of monocytes & macrophages and immuneparesis in patients with cirrhosis

Project Objectives
Given that patients with liver cirrhosis most frequently die of sepsis rather than liver failure, the objective is to delineate the immune (dys-)function of their monocytes & macrophages in the blood and in tissues. Properties of these immune cells are assessed ex vivo using single cell transcriptomics, immunohistochemistry and flow cytometry-based techniques. Ultimately, the results may translate to therapies restoring the immune function and preventing sepsis and death in cirrhosis patients.

Project Funding
Swiss National Science Foundation (SNF)
- Project Nr 159984 (2015-2019): https://data.snf.ch/grants/grant/159984
- Project Nr 189072 (2020-2023): https://data.snf.ch/grants/grant/189072

Project Partners
• Prof. Burkhard Ludewig, Institute of Immunobiology, Medical Research Centre, Cantonal Hospital St. Gallen
• Prof. Mark Thursz, Hepatology and Gastroenterology, Faculty of Medicine, Imperial College London
• Prof. Julia Wendon, Liver Intensive Therapy Unit, King's College Hospital, King's College London
• Dr. Christopher Weston, Centre for Liver & Gastro Research, College of Medical and Dental Sciences, University of Birmingham

This project is financed by the Swiss National Science Foundation (SNSF)

Contact Details
Principal Investigator
Prof. Dr. Dr. Christine Bernsmeier, MD, PhD
Research Group Leader Translational Hepatology, Department of Biomedicine and Department of Clinical Research
University of Basel
Consultant Hepatologist
University Center for Gastrointestinal and Liver Diseases
University Hospital Basel, CH-4031 Basel, Switzerland
T: +41 61 7777400
E: c.bernsmeier@unibas.ch
W: https://forschdb2.unibas.ch/inf2/rm_projects/object_view.php?r=4514612
W: https://biomedizin.unibas.ch/en/research/research-groups/bernsmeier-lab/
Christine Bernsmeier is a clinician scientist holding an MD and a PhD in cell biology. She specialised in Hepatology and pursued a Postdoc with Prof. Julia Wendon at the Institute of Liver Studies/Liver Intensive Care at King's College London. She leads the Translational Hepatology Lab at the Department of Biomedicine, and was appointed Adjunct Professor for Hepatology at the University of Basel in 2022.
Deciphering the pathogen-host cell interactions during the liver stage of Plasmodium parasites
Many microorganisms manage to escape host immune responses by hiding inside cells. However, cells have evolved various intracellular, immune response-like mechanisms to eliminate such pathogens. We spoke to Dr. Volker Heussler, who researches the development of the malaria parasite inside hepatocytes, and the factors that influence its survival or elimination.
Malaria is a dangerous, potentially fatal disease caused by parasites of the genus Plasmodium. Malaria was responsible for 600,000 deaths in 2022, and every year there are over 200 million cases worldwide. This devastating disease is transmitted exclusively via the bite of a female Anopheles mosquito. Plasmodium species undergo many life cycle stages in the mammalian host and in the mosquito. Sporozoites are the highly motile, crescent-shaped form of the parasite that the mosquito injects into the skin of the mammalian host during a blood meal. The sporozoites glide through the skin until they find a blood vessel, which they enter, and are then passively transported with the bloodstream to the liver. Inside liver cells, also called hepatocytes, the sporozoites transform and start dividing their nuclei, finally resulting in thousands of merozoites. The merozoites are released from the liver cell and then repeatedly infect and destroy red blood cells. The destruction of red blood cells is one of the causes of the symptoms of malaria. Some merozoites develop into gametocytes, which can be ingested by a mosquito. Gametocytes develop into sporozoites within the mosquito, and then the cycle starts again.
The liver stage of development is a crucial part of the Plasmodium life cycle, and revealing the complexities behind its outcome is the core of Dr. Heussler's project. His past research discovered that a large percentage of the sporozoites that invade hepatocytes, the main parenchymal cells of the liver, fail to form infectious merozoites. “There appears to be a delicate balance between parasite survival and elimination, and we now start to understand why this is so. It all depends on the fitness of the parasite and, of course, on the state of the host cell,” explains Dr. Heussler. “When the parasites are in the liver, they transmigrate through a number of hepatocytes before they finally settle in one. This happens in a very special way: the parasite invaginates the host cell plasma membrane and forms a parasitophorous vacuole which is surrounded by a host cell membrane. So, the parasite lives in a special compartment in the hepatocyte which separates it from the host cell, but it is still completely dependent on the host cell for nutrients. Therefore, the parasite must ensure that the host cell survives and does not undergo self-destruction (apoptosis),” continues Dr. Heussler.
Newly discovered intracellular immune response: Autophagy
His team has found that there is a type of intracellular immune response that is responsible for the selective elimination of intracellular pathogens such as Plasmodium. This mechanism is called autophagy: a cellular process for the natural elimination and degradation of unnecessary or dysfunctional cell components. It was originally considered to be a primordial degradation pathway that protects cells against starvation. Starved cells can digest part of their cytoplasm to provide enough nutrients for the most essential metabolic pathways of the cell so it can survive. However, it has been shown that autophagy has many other roles, one of which is the elimination of intracellular parasites, viruses and bacteria.

“We call this elimination of intracellular parasites selective autophagy because it does not target the host cell generally but very selectively the pathogen invader. Selective autophagy was originally described for the elimination of damaged organelles. For example, when a cell contains damaged mitochondria, it is not the whole canonical autophagy pathway that is switched on, but just selective autophagy. The damaged mitochondria are then surrounded by a membrane and this finally leads to elimination,” explains Dr. Heussler. Even more importantly, selective autophagy is used to eliminate intracellular pathogens.
Selective autophagy is provoked while the parasite resides inside the parasitophorous vacuole, which is surrounded by a membrane: the parasitophorous vacuole membrane (PVM). “Autophagy has a lot to do with membranes. The PVM is completely covered with autophagy molecules and this results in the recruitment of lysosomes, the digestive organelles of a cell. We have shown that this is deleterious for about 50% of invading parasites. However, the interesting part is that the surviving parasites are also covered with these autophagy molecules but they still survive. The reason for that is that they can control the number of these autophagy molecules on the PVM by inducing a very high turnover of the PVM. They shed the PVM all the time and with it, they shed these autophagy molecules. The interesting part came when we knocked out the host cell molecules which are responsible for the autophagy reaction, and we found that the number of surviving parasites was reduced. We were completely puzzled by this since we thought that stopping the autophagy reaction would provide an advantage for the parasite. That is obviously not the case. This means that the parasite partially depends on this reaction,” explains Dr. Heussler.
The researchers went further and managed to find the reason behind this paradoxical response. “By moderate recruitment of autophagy molecules to the PVM, the parasite attracts and activates host cell signaling molecules. This in turn activates a transcription factor that regulates the expression of survival factors. In the end, it makes sense that the parasite needs to somehow influence the survival pathways of the host cell because if the host cell cannot eliminate the invader, it could simply commit suicide, which we call apoptosis. To avoid host cell apoptosis, the parasite induces the survival pathways of the host by controlling its autophagy machinery.” This is what the researchers found in the first part of the project, with the help of super-resolution microscopy and many other cell biological techniques.

High throughput knockout screen of Plasmodium berghei genes

The second part of the project focuses entirely on the parasite side. The researchers have completed a high-throughput knockout screen of parasite genes in collaboration with colleagues at the Wellcome Sanger Institute in Hinxton in the UK. The research group in Bern has established the entire life cycle of Plasmodium berghei, a model Plasmodium species that infects rodents but not humans. “We introduced many bar-coded knockout constructs at the same time into blood-stage parasites and then extracted the DNA at all the different life cycle stages and sent them for sequencing to the Sanger Institute. Sequencing of the barcodes told us exactly where these knockout parasites were eliminated at any given life cycle stage. A very simple but powerful system to follow these parasites. In this way we knocked out more than 1,300 parasite genes and identified about 180 genes that are essential during the liver stage, which is the most interesting stage for us,” says Dr. Heussler. The potential implications of this knockout screen include the generation of genetically attenuated parasites, which could then be used as live vaccine strains. It also revealed essential metabolic pathways that could be targets for new anti-malarial drugs.

PATHOGEN-HOST CELL INTERACTIONS
Pathogen-host cell interactions during the liver stage of Plasmodium parasites

Project Objectives
The project aims to decipher the mechanisms underlying the liver stage in the life cycle of the Plasmodium parasite, the causal agent of malaria. Dr. Heussler and his team are interested in the factors that influence the potential survival or elimination of the parasite during its development in the hepatocytes, focusing on autophagy, a newly discovered intracellular immune response.

Project Funding
This project is funded by the Swiss National Science Foundation (SNSF), Grant number 182465.
https://data.snf.ch/grants/grant/182465

Project Partners
• Prof. Oliver Billker, Umeå University, Sweden, formerly at the Wellcome Trust Sanger Institute, UK
• Prof. Dominique Soldati, University of Geneva
• Prof. Chris Janse, Leiden University Medical Center, The Netherlands
• Prof. Vassily Hatzimanitakis, EPFL, Lausanne

Contact Details
Project Coordinator, Prof. Volker Heussler
Director
Institute of Cell Biology
University of Bern
Baltzerstr. 4
3012 Bern
T: +41 31 68 44650 (office)
E: volker.heussler@unibe.ch
W: https://www.izb.unibe.ch/research/prof_dr_volker_heussler/index_eng.html#pane427241
From a more basic research point of view, the researchers hope that with this genome-wide knockout approach, they can identify the genes that are essential for parasite survival during the liver stage of development. “Our main interest is to understand what the function of these genes is. Once we have knocked out a gene, we then look very carefully at the phenotype of the corresponding parasites. Ideally, we would finally combine the two big projects, the autophagy project and the parasite project, with the hope that the screen will reveal the parasite molecules that are responsible for attracting autophagy molecules to the PVM and, basically, understand the respective mechanism and the bigger picture behind it,” concludes Dr. Heussler.
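The logic of the barcode readout Dr. Heussler describes can be pictured with a toy calculation. This is only an illustrative sketch: the gene names and read counts below are invented, and a real pooled-screen analysis would add sequencing-depth normalisation and statistical testing. The idea is simply that a knockout whose barcode is abundant in blood-stage parasites but vanishes from liver-stage samples marks a gene essential for the liver stage.

```python
# Toy sketch of barcode counting in a pooled knockout screen.
# Counts and gene names are hypothetical, for illustration only.

counts = {
    #  gene        blood stage     liver stage
    "geneA": {"blood": 9500, "liver": 8800},
    "geneB": {"blood": 8700, "liver":   12},  # barcode drops out in the liver
    "geneC": {"blood":  150, "liver":    9},
}

def relative_abundance(stage):
    """Fraction of all barcode reads at a stage belonging to each knockout."""
    total = sum(c[stage] for c in counts.values())
    return {gene: c[stage] / total for gene, c in counts.items()}

def liver_essential(fold_drop=10):
    """Flag genes whose barcode frequency falls at least fold_drop-fold
    between the blood stage and the liver stage."""
    blood = relative_abundance("blood")
    liver = relative_abundance("liver")
    return [g for g in counts
            if blood[g] > 0 and liver[g] < blood[g] / fold_drop]

print(liver_essential())  # → ['geneB']
```

Scaled up to more than 1,300 barcoded knockouts sampled at every life cycle stage, the same comparison pinpoints the stage at which each mutant is eliminated.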
Dr. Volker Heussler is a Professor of Molecular Parasitology and Cell Biology and director of the Institute of Cell Biology, University of Bern. His research focuses on the liver stage of the Plasmodium parasite, malaria's causal agent.
Progenitor cells offer great hope for heart failure patients
The 5D Heart Patch Project, led by Prof Kenneth Chien, has identified human ventricular progenitor (HVP) cells that can create self-assembling heart grafts in vivo. The research has the potential to offer hope to millions of people suffering from heart failure.
The 5D Heart Patch Project forms a vital step in a scientific journey that could be on the cusp of something extraordinary, with the goal of generating a fully functioning heart graft patch in vivo, in humans. Backed by an ERC grant and with the support and involvement of the Swedish pharmaceutical company AstraZeneca and the biotech SmartCella, the ground-breaking project could provide a viable treatment for the millions of people who suffer from heart weakness, heart failure and hearts damaged by heart attacks.
The World Health Organisation (WHO) has stated that cardiovascular diseases are the leading cause of death in the world, responsible for the death of an estimated 17.9 million people every year, which equates to around 32% of all deaths worldwide.
“Heart failure is a progressive, chronic disease. There are drugs that work, primarily symptomatically, but nothing that stops the fundamental process of heart failure,” said Kenneth Chien, Professor of Cardiovascular Research at Karolinska Institute, and cofounder of Moderna.
“It’s the number one cause of morbidity and mortality as a single entity and is growing exponentially. The drivers are diabetes and hypertension, and it’s a disease of the elderly, although it can affect anyone of any age, including neonates if they have a genetic abnormality. This is one of the single largest unmet clinical needs in all of medicine. For end-stage heart failure, there is no cure other than heart transplantation, and there is a limitation in the number of available donors. Finding ways to rebuild heart muscle in heart failure is one of the holy grails in regenerative medicine, specifically regenerative cardiology, and in medicine in general.”

The search for heart cells

The project is the latest step after twenty years of diligent searching for a specific type of progenitor cell for the heart.

“We want contracting muscle cells and it has to be a specific type, because there are many different types of heart muscle cells and each of them has a different electrical signature. All the heart cells need to beat in synchrony, to beat ‘to a single drummer’ if you will – that’s your pacemaker. What if you have a rogue cell and it doesn’t listen to your pacemaker? You’re going to have electrical confusion and you are going to have an arrhythmia that could be life-threatening,” explained Professor Chien.
Arguably, the first milestones in this scientific quest were studies between 2005 and 2009 in mice, in which a marker of master heart progenitors, cells that could make any type of heart cell, was identified. Subsequently, a progenitor cell was identified in the mouse heart that would make only ventricular muscle, the main type of muscle responsible for propelling blood into the circulation. After a search spanning over a decade, the Chien lab at Karolinska Institutet managed to identify a similar cell from human embryonic stem cells that would make only the right type of contracting muscle cell and nothing else. The search for human ventricular progenitors, or HVPs, was over, but the work toward developing human cell therapies for regenerating healthy heart muscle was just beginning.
“What we have done is identify a cell, a kind of ‘master ventricular progenitor’ that makes only one type of cardiac muscle, but also can make its own matrix, trigger the formation of blood vessels following transplantation, migrate to the site of heart injury, prevent fibrosis, and then go on to expand to form huge grafts of functioning cardiac muscle. When you put this cell anywhere; in a dish, in a mouse heart, in a pig heart – and this has all been done – you make a chunk of human ventricular muscle, and it does it all on its own, sort of a ‘self-assembly’ process that is a natural program of the cell. When we saw this, initially in a mouse, I thought, this is a cell, after twenty years of searching, that is worthy of trying to put into the clinic.”
A treatment to replace surgery?
Such a heart patch can repair a damaged heart in just six weeks, with the cells doing the majority of the repair work on their own. The process is less invasive than open heart surgery, requiring only an injection of the cells into the heart. The HVPs appear to be incredibly adaptive and focused in carrying out their ‘pre-programmed’ task.
The research team were encouraged by the cells' ability to generate contracting ventricular heart tissue in so many circumstances, and from just one cell it was possible to make billions more. In one experiment they found that when the cells were introduced to a mouse kidney, they made a beating, moving heart-muscle patch on the kidney, despite the difference in organ. The result was the same when human heart progenitors derived from embryonic stem cells were placed on slices of a dead monkey's heart in a petri dish: the human progenitors built a human patch on top of the dead monkey heart tissue, and the tissue started beating.
The HVPs apparently navigate many of the challenges that cell therapy for hearts presents. A major problem with implanting heart cells arises when they already have a beat: the heart has to maintain a consistent rhythm, and a clash of rhythms between beating cells can be dangerous or fatal.
5D HEART PATCH
A Functional, Mature In vivo Human Ventricular Muscle Patch for Cardiomyopathy
Project Objectives
The discovery and development of human ventricular progenitors that form large, functional heart graft patches in injured pig hearts could lead to clinical studies within a few years, offering new hope to patients waiting for a heart transplant.
Project Funding
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under ERC AdG grant agreement No 743225.
Swedish Research Council Distinguished Professor Grant Dnr 541-2013-8351; AstraZeneca research collaboration agreement Dnr 4-2539/2021.
Contact Details
Kenneth R. Chien MD PhD
Distinguished Professor of the Swedish Research Council
Department of Cell and Molecular Biology
Karolinska Institutet
T: +46 70-676 6365
E: kenneth.chien@ki.se
W: ki.se
W: https://ki.se/en/cmb/kenneth-r-chiens-group
LinkedIn: linkedin.com/in/kenneth-chien
“The beauty of these progenitors is that they are not beating when we put them in and what we found before was that most arrhythmias occur when you put the cells in, so if the cells are already beating, they are beating to their own ‘drummer’, but when we put our cells in there, they are not beating and they have a chance to get ‘trained’.”
A team led by Karl-Ludwig Laugwitz, Professor of Cardiology at the Technical University of Munich (TUM), and Regina Fritsche Danielson, SVP and Head of Early CVRM, Biopharmaceuticals R&D at AstraZeneca, demonstrated that HVPs could form functional grafts in pig hearts following a heart attack. In addition, the team demonstrated in the laboratory how the HVP cells migrate to actively seek out damaged regions of the heart to repair. When a heart attack occurs, as many as a billion heart muscle cells, known as cardiomyocytes, can die from the reduced blood supply, causing scar tissue which further deteriorates heart function. Normally, this represents an irreversible decline in heart function and health.
Chien said: “They migrate to the area of injury because they move, as they are progenitors, whereas cardiac muscle cells just want to beat. It was also discovered that they break up scar tissue, which was very surprising.”
The path to a new therapy
To keep his laboratory available for purely academic research, Professor Chien created a ‘bridge’ company, called SmartCella, to work with the industry partner AstraZeneca on the development of a new heart regeneration therapy, with Chien as a senior advisor.
“I started the company that makes these cells, for toxicology studies, for the pig studies, because you don’t want your lab turned into a company – I am against that. What you want to do is enable your lab to lead to the evolution of a company. And you can work interactively with the company if it is allowed, which at KI it is. And you let people who know how to make drugs do that and you help them. It’s early days still.”
Dr Kenneth Chien is a Distinguished Professor of the Swedish Research Council in the Department of Cell and Molecular Biology at the Karolinska Institutet and the Co-Founder of Moderna, and is a global leader in cardiovascular biology, medicine, and biotechnology.
For hearts that are damaged by heart attacks and strokes, this could be game-changing in terms of making a recovery. This kind of self-repair treatment has not been possible before.
“Conveniently, they don’t keep dividing forever,” Chien continues. “They divide, and they stop growing at the right time because that is how a heart is built, so they’re programmed for true self-assembly. We are still trying to understand everything they can do.”
“We have discovered that these cells exist and the amazing properties they have. We show that it improves function after a heart attack in a pig, which is an important step. Now what we have to do is to identify an appropriate dose, decrease the risk for arrhythmias and devise an initial clinical study. At this stage, I don’t want to over-promise and I don’t want to say this is guaranteed to work. What I think is, this is worthy of further testing.”
The project has moved over into industry and a lot of work remains to be done. Despite the tests and trials that are ahead, there is a tangible sense of anticipation from those involved that a novel therapy could emerge, which has the potential to address one of the biggest health challenges of our time.
How does inflammation spread through the body?
Chronic systemic inflammation (CSI) is associated with the progression of different diseases, including psoriasis, atopic dermatitis and cancer. Researchers in the ERC-funded CSI-FUN project are investigating the underlying molecular mechanisms behind CSI and how it spreads through the body, as explained by Erwin Wagner, the principal investigator of the study.
An inflammatory response may originate in a given organ like the skin, but it can then spread more widely and affect the entire body, causing serious health problems. Based at the Medical University of Vienna (MUV), Erwin Wagner and his group are investigating the underlying mechanisms behind CSI. “We are trying to find the mechanisms by which the original signal – which elicits the inflammatory response – spreads through the body in a systemic way,” he outlines. This research relies to some extent on mouse models for two common inflammatory skin diseases – psoriasis and atopic dermatitis – in which CSI plays a major role in progression. “We aim to describe how inflammation from the skin spreads through the body and affects different sites and organs,” says Erwin Wagner. “Around a third of psoriatic patients develop psoriatic arthritis, a form of joint disease, which often emerges many years after patients are diagnosed with psoriasis.”
Deciphering organ cross-talk
The aim of the study is to build a broader picture of the communication between the skin and the joints, and to identify which molecules, signalling pathways and cells are involved. Researchers in this project are also looking at CSI in the context of cancer-associated cachexia (CAC), a metabolic syndrome which can develop in the late stages of cancer and is associated with weight loss. “Many tumours are highly inflamed, and this inflammation can then spread throughout the body, leading to weight loss,” explains Wagner. Rather than looking at the actual tumour itself, the focus of the Wagner group
is the communication between the tumour and the whole organism. “We’re looking at the cross-talk, the communication that takes place between a tumour and the organism. We’re investigating what happens when the tumour sends out signals to start inflammation,” he says. “In each of these cases we’re looking systemically. For instance, in cachexia we look across the different organs in the body, keeping an open mind about what we might find.”
Researchers in the Wagner group ultimately aim to identify the factors behind the initiation of CSI, which then kicks off a cascade leading
to the progression of disease. One important event in the development of CSI is the release of inflammatory mediators, including inflammatory cytokines like IL-6 and IL-17, which activate pro-inflammatory signalling pathways. “These cytokines are important therapeutic targets for both skin diseases and CAC,” outlines Wagner. In his research, Wagner is using mouse models to investigate the molecular mechanisms behind CSI, while he also has access to samples from human patients. “My lab is well-known for generating innovative mouse models of human diseases,
but it was clear from the outset of this project that we would need to have both mouse mechanistic data and human patient samples, in parallel,” he stresses. “One of the attractions of moving to the MUV was to be close to clinicians, so I can now work with them on samples they collect from patients with psoriasis, atopic dermatitis and CAC.”
This research has yielded some important insights. On the basis of their work with mouse models, Wagner and his colleagues have published a paper on CAC demonstrating that after a tumour becomes inflammatory, the patient first loses fat. “The fat burns away, then the muscle becomes affected, and then the other organs,” he explains. A patient who loses a certain amount of weight within six months is categorised as pre-cachexic, which is an important window for treatment, before their condition worsens. “A lot of research focuses on this pre-cachexic stage, where we can study what signals might be important in terms of kicking off the cascade,” says Wagner. “In the pre-cachexic stage the patient has not yet suffered from any major weight loss, but if it progresses further then they enter cachexia, a deadly condition. New treatments should ideally be targeted at the early changes, before the patient comes down with a metabolic impairment.”

Another important paper arising from the group’s research relates to the function of the S100 proteins, a family of molecules that are highly up-regulated when something goes wrong in the body, acting effectively as a danger signal. “We’ve seen that these S100 proteins, which are also called alarmins, are highly up-regulated in our mouse models and also in samples from human patients. In the mice, the up-regulation of these proteins is the first event that happens when disease is induced,” outlines Wagner. In his research, Wagner has explored the impact of removing these proteins on psoriasis and psoriatic arthritis. “Is the disease prevented? Or does the disease have a less severe impact, is it milder?” he asks. “We generated a mouse in which these proteins are absent in all cells, which we call a complete knock-out mouse. In this case the psoriatic disease is very mild. This is positive, but is this because of a systemic effect? Or is it because the S100 proteins have been removed from the skin?”

The next step is to remove the S100 proteins only in the skin, and then assess whether the same effect is observed. If it were an organ-specific effect, then removing these proteins only in the skin should also lead to a milder disease, but Wagner says this is not what he and his colleagues observe. “We get more disease, so it’s actually worse. Therefore, it’s not the organ-specific effect but rather the systemic effect which needs to be treated,” he says. Researchers have demonstrated that the important factor is the CSI, rather than cell-type specific inflammation in the skin, while Wagner is also pursuing further research into the S100 proteins. “We are also trying to understand whether these proteins play a role in atopic dermatitis. These alarmins are also anti-microbial proteins, which can reduce the microbial load, and that might be a factor behind the phenotypes that we see,” he continues.

Mouse models

The work with mouse models within the project involves modulating certain genetic events, which affect the severity of a given pathology. For example, in the context of psoriasis, specific proteins are removed from the skin of these mice and within two weeks this genetic alteration will cause a lesion resembling the human disease. “We can then essentially follow the fate of this lesion over time,” outlines Wagner. The development of the lesion can be followed over extended periods, an important consideration in the project given that the systemic manifestations of CSI will not typically be apparent in two weeks. “We can modulate the severity of the lesion, to look at both severe and mild examples. The mice are raised in pretty much standard conditions, so factors like diet and the light/dark cycle aren’t particularly influential. We essentially rely on the genetic manipulation leading either to the induction or removal of a gene – or genes – to cause certain pathologies,” continues Wagner.

“We are trying to find the mechanisms by which the original signal – which elicits the inflammatory response – spreads through the body in a systemic way.”

The CSI-Fun research team in 2023, with early-stage researchers, post-docs and technical assistants from 7 nationalities. Photo credit: Andreas Ebner (MUV).
The role of inflammation in osteoarthritis is another interesting avenue of research. “It has long been thought that inflammation has nothing to do with osteoarthritis, but more and more evidence is now emerging that in fact it may play a role. We have good models for osteoarthritis and for psoriatic arthritis and we are studying and comparing the two diseases,” says Wagner.
Professor Wagner is also very interested in investigating the importance of inflammation in the context of skin cancer. A number of reports have suggested that fewer cases of specific skin
cancers are found in patients with psoriasis than in the general population. This would suggest that CSI does not always promote cancer, but in some cases can in fact suppress certain types, a topic that the Wagner group is currently exploring. “We have established several mouse models for psoriasis in which we induced a mutation which normally leads to skin cancer development. On top of this mutation, we also induce CSI and then we ask: does this suppress or promote skin cancer development?” he says. “These experiments are ongoing, and we are excited to find out whether CSI is tumour-promoting or suppressive in this context. For this challenging project, we received additional EU support from H2020 (MSCA-ITN-CancerPrev-859860).”
CSI-FUN
Chronic Systemic Inflammation: Functional organ cross-talk in inflammatory disease and cancer
Project Objectives
The goal of CSI-Fun is to understand how Chronic Systemic Inflammation (CSI) is involved in Inflammatory Skin Diseases, Arthritis and Cancer. We focus on whole body physiology and perform experiments in model systems, from the petri dish to mice and samples donated by patients, to discover new biomarkers and better preventive and therapeutic strategies.
Project Funding
This project has received additional support from the Medical University of Vienna and the European Union’s Horizon 2020 research and innovation programme (MSCA-ITN-2019-859860: CANCERPREV).
Project Partners
This ERC Advanced Grant has been fully allocated to the host institution (Medical University of Vienna).
Contact Details
Principal Investigator: Univ.-Prof. Dipl.-Ing. Dr. Erwin Wagner, PhD
Medical University of Vienna
Department of Dermatology
Department of Laboratory Medicine
Anna Spiegel Research Building
Lazarettgasse 14, AKH BT25.2, Level 6 1090 Vienna, Austria
T: +43 1 40400 77020
E: erwin.wagner@meduniwien.ac.at
W: https://www.meduniwien.ac.at/hp/dermatologie/wissenschaft-forschung/genes-and-disease-group-erwin-wagner/
Erwin Wagner has pioneered technologies for engineering the mouse genome, producing highly instructive mouse models that have revealed mechanisms of development, inflammation, metabolism and cancer. He made ground-breaking discoveries about the roles of AP-1 (Fos/Jun) transcription factors and recently provided key insights into the pathogenesis of cancer cachexia, offering new therapeutic interventions.
Latifa Bakiri joined Erwin Wagner’s team in 2001 as a post-doc and later staff scientist; she focuses on optimising genetically engineered mouse models to study inflammation, metabolism and cancer.
Pointing the way towards safer implants
Implants can enhance quality of life, but it’s essential to first ensure that they are safe and will function effectively in the body. We spoke to Dr Anna Igual Munoz and Dr Stefano Mischler about their work with Prof. Dr Brigitte Jolles-Haeberli investigating electrochemical reactions at the interface between biomaterials and the surrounding biological environment, such as body fluids and tissue.
A high level of resistance to corrosion is highly desirable in an implant material, as the release of metal ions can cause adverse tissue reactions and eventually lead to the body rejecting the implant. As Senior Scientist in the SCI STI SM Group at EPFL in Lausanne, Dr Stefano Mischler is investigating how metals like titanium react with synovial fluid, the liquid which effectively lubricates our joints. “We carry out electrochemical measurements and look at the reactions that take place on the metal,” he outlines. While previously most researchers used simulated body fluids to investigate electrochemical reactions in the body, Dr Mischler and his colleagues, in particular PhD student Yueyue Bao, are using synovial fluids collected directly from patients during implant surgeries performed by Prof. Jolles-Haeberli. “We have brought our lab very close to the surgery room. Synovial fluid that falls into the operative field during the procedure is recovered and rapidly transferred to us, so we can do the measurements directly,” explains Dr Anna Igual Munoz, Scientific Consultant to the SCI STI SM Group.
Electrochemical measurements
This approach could lead to deeper insights into the reactions which occur at the interface between biomedical materials and surrounding fluids, and ultimately help scientists develop improved and patient-tailored implants. The metal is exposed to the synovial fluid, and then electrochemical measurements are taken, from which researchers aim to build a fuller picture.
“With our knowledge of corrosion and electrochemistry, we can interpret the results and obtain information not only about the corrosion rate, but also about the behaviour of the liquid,” says Dr Mischler. Researchers are using several electrochemical and surface analysis techniques, including Auger Electron Spectroscopy (AES), to determine surface changes of the metal after contact with the body fluid. “AES is a very effective technique: it’s surface-sensitive and it has a high lateral resolution,” continues Dr Mischler. “With AES we look at the topography of
the material but we also have chemical information from the outermost surface, the first atomic monolayers.” A technique called infrared spectroscopy is also being used in the project, which provides information about the different functional groups which are present and enables researchers to discriminate between them.
The evidence gathered so far suggests that the key parameter in determining the corrosion rate of implants is the interplay between the oxygen content and the amount of organic components (i.e. proteins) in the body fluid. “Titanium is passive and its corrosion rate is affected by the reduction of oxygen. Electrochemical measurements suggest that oxygen content varies between patients,” outlines Dr Mischler. Another important consideration is the amount of proteins or organic molecules in the synovial fluid, says Dr Mischler. “This affects the adsorption of proteins on the surface,” he explains. “The higher the oxygen concentration, the higher the corrosion rate of the metal. But the higher the protein concentration, the more adsorption and lower oxygen. A good patient has low oxygen, and a lot of proteins.”

This type of patient is less likely to react aggressively to the implant and reject it. While titanium implants work pretty effectively in general, some patients experience adverse reactions, which affect durability. “In cases where patients react unfavourably, implants don’t last more than 10-15 years on average. This may be fine for an older individual, but not for a younger person,” says Dr Igual Munoz. A major challenge now is to increase the durability of these implant materials, making them suitable for all patients, particularly with demand set to increase further as life expectancy rises. “Implants are intended to improve a patient’s quality of life, enabling them to keep doing the things they enjoy,” points out Dr Igual Munoz. “We have an aging population, and many of these people want to stay active for longer.”

“The higher the oxygen concentration, the higher the corrosion rate of the metal. But the higher the protein concentration, the more adsorption and lower oxygen. A good patient has low oxygen, and a lot of proteins.”

Human synovial fluid samples.
Biocompatibility
The ideal scenario is to give each patient the right implant for them individually, improving biocompatibility so that the implant is accepted by the body. The project’s research will make an important contribution in this respect, with Dr Mischler and Prof. Jolles-Haeberli aiming to help more accurately predict how an individual patient will react to an implant material. “This is a really long-term oriented project, and we can follow the release of the material following surgery over the course of years. We have a predictive tool, and are also interested in blood analysis,” he outlines. The project’s work could also help researchers anticipate the sensitivity of patients to the wear of a joint, as well as point the way towards improved testing methods to assess the long-term performance of implant materials.
“Some companies are using hip joint simulators to test implants in vitro. These machines are used to test the materials in mechanically demanding circumstances over extended periods,” continues Dr Mischler. A lot of attention has historically been focused on assessing the impact of patients’ weight and biomechanics on the material, while by contrast the chemistry of the synovial fluid has been relatively neglected. However, it has since been shown that the interactions of the metal with the synovial liquid are in fact highly important. “This importance is now more widely recognised, and people are thinking about how they can improve their simulator tests and make them much more predictive,” says Dr Mischler. This research could also inform work to develop
and manufacture more effective implants which are less likely to be rejected. “There are some surface structures that are more prone to the adsorption of proteins, which is a favourable mechanism. We can promote this with certain materials,” says Dr Mischler. “There is interest in developing in vivo surfaces tailored to patients.”
There are several further strands to this research, with Dr Mischler and Dr. Igual Munoz working to gain deeper insights into the behaviour of metal implants in the body. Part of this work involves looking at the wear and friction on the system. “We can use in vitro tests to investigate the effectiveness of the synovial fluid as a lubricant, and the sensitivity of the ball part of an implant to wear and corrosion,” explains Dr Mischler. These are important considerations for companies seeking to develop improved implants, and the project’s research helps lay the foundations for further development. “This research shows feasibility. In future we can maybe look towards more focused studies and collaborations with companies,” outlines Dr Mischler.
IN-SITU INVESTIGATION OF INTERFACIAL REACTIONS
In-situ investigation of interfacial reactions of metal surfaces in contact with human synovial fluids
Project Objectives
To design and validate an optimal experiment protocol to determine the electrochemical behavior of biomedical implant alloys in human synovial fluids and to understand their reaction behavior as a function of the clinical state of the patients. This will be achieved through a collaboration between surgeons and corrosion scientists.
Project Funding
This project is funded by the Swiss National Science Foundation (SNSF).
https://data.snf.ch/grants/grant/184851
Project Partners
• Tribology and Interface Chemistry (TIC) group of the Materials Institute of EPFL
• The Swiss BioMotion Lab (SBML) at the Department of Musculoskeletal Medicine of the Centre Hospitalier Universitaire Vaudois (CHUV)
Contact Details
Project Coordinator, Prof Anna Igual Munoz
EPFL SCI STI SM
MXC 341 (Bâtiment MXC)
Station 12, CH-1015 Lausanne
T: +41 21 693 68 83
E: anna.igualmunoz@epfl.ch
W: https://data.snf.ch/grants/grant/184851
Anna Igual Munoz is a Senior Scientist in the Tribology and Interface Chemistry Laboratory at EPFL in Lausanne. She gained her PhD from the University of Valencia and has held research positions at institutions in Europe and America.
Stefano Mischler is head of the Tribology and Interface Chemistry Unit at EPFL. He has published more than 100 articles in academic journals and is a member of several societies, including the Swiss Society for Materials Technology.
Brigitte Jolles-Haeberli is a Senior Consultant Orthopaedic Surgeon, specialising in knee and hip surgery, and an Associate Professor in the Faculty of Biology and Medicine at the University of Lausanne.
Yueyue Bao is a PhD student at the EPFL in Lausanne, who has recently successfully defended her thesis on in vivo investigation of the electrochemical behaviour of Ti and CoCrMo alloys in human synovial fluids.
From humans to robots with the SAHR project
The SAHR project is exploring how humans develop their exceptional dexterity, allowing them to adeptly use tools and apply precise amounts of force to create intricate crafts. By gaining insights from this research, the project aims to develop new learning and control algorithms for robots to replicate similar dexterity. Professor Aude Billard provides more insight into this research.
Today, most robots remain largely pre-programmed to function in a specific way, with manipulation skills limited to simple pick-and-place object handling. In contrast, humans can handle a wide range of objects, including unfamiliar ones, and exhibit remarkable dexterity in activities such as crafting or playing musical instruments. However, developing such skills requires years of training and relies on advanced cognitive functions that allow us to adapt to different situations and use our fingers in a highly dexterous manner. Aude Billard, Professor in the Learning Algorithms and Systems Laboratory at EPFL in Lausanne, explains that we hold a mental model of what our hands can do, known as a kinematic workspace, which changes in size as we move our fingers closer together or further apart. This mental model allows us to manipulate objects in complex ways, such as opening up our hands to create space for an object to rotate, then closing our fingers to grasp it.
SAHR project
As the Principal Investigator of the SAHR (Skill Acquisition in Humans and Robots) project, Professor Billard is investigating how humans learn to manipulate objects dexterously, and looking to apply the insights gained to enable robots to learn similar dexterous manipulation. This research involved following a group of apprentices at one of the oldest watchmaking schools in Switzerland over two years, to see how they pick up the delicate manipulation skills required to insert the tiny elements key to the functioning of a watch. Most amazing is the fact that apprentices receive only general feedback, with no explicit advice on how to place their hands or the tools to handle these small devices. “Understanding how one can learn to decipher what it takes to achieve the task with so little feedback, and in far fewer trials than what machine learning achieves currently, is required for robots to act robustly in our daily world,” she says.
The SAHR team is working to model the way that humans unconsciously acquire these skills, applying ideas from a theory called optimal control. This approach involves optimising multiple factors simultaneously, such as the operator’s comfort and manipulability, in the context of watchmaking. Billard and colleagues find that the key difference between novice and expert watchmakers is their ability to balance these different costs: experts are willing to tolerate more discomfort to achieve better manipulability, while novices prioritise their own comfort. They further model how one can learn a better balance across these objectives to achieve the task.
Optimal control
The project team translates these insights into improved robot learning and control algorithms. Optimal control is applied to search for the ideal posture of the finger on the tool, taking into account the dimensions of the robot. “Building a model of the reachability map – the amount of space that each finger has – plays an important role in this respect,” she says, “but true dexterity is when we go beyond holding objects between the thumb and any of the other fingers.” One can hold objects in between the fingers or across different phalanges, which is quite handy for holding several objects in one hand, or for rotating a pen in the fingers. To achieve this, the reachability map is populated with all these different ways in which objects can be held within the fingers, and with a model of the hand’s workspace to avoid self-collisions. To pick up multiple objects in its hand, the robot rapidly computes all possible combinations until it finds the right way of holding them, balancing discomfort and manipulability while ensuring feasibility. “Minimising the costs is much less important than feasibility when it comes to such a complex problem,” continues Professor Billard.
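The selection logic described here can be sketched in a few lines: filter out infeasible grasps first, then trade off the remaining costs. This is a hypothetical illustration only; the candidate grasps, cost values and weights are invented for the example and are not the SAHR project’s actual model.

```python
# Hedged sketch: choose a grasp by enforcing feasibility first, then
# minimising a weighted sum of (hypothetical) discomfort and
# manipulability costs. All values below are illustrative.

def select_grasp(candidates, w_discomfort=0.3, w_manipulability=0.7):
    """Return the feasible candidate with the lowest weighted cost.

    Each candidate is a dict with:
      'feasible'       - True if the pose avoids self-collision and stays
                         within the hand's reachability map
      'discomfort'     - cost of an awkward posture (lower is better)
      'manipulability' - cost of poor controllability (lower is better)
    """
    feasible = [c for c in candidates if c["feasible"]]  # feasibility dominates
    if not feasible:
        return None
    return min(
        feasible,
        key=lambda c: w_discomfort * c["discomfort"]
                      + w_manipulability * c["manipulability"],
    )

# Illustrative candidate set: an expert-like weighting (manipulability
# weighted more heavily) prefers the less comfortable but more
# controllable hold between the phalanges.
grasps = [
    {"name": "thumb-index pinch", "feasible": True,  "discomfort": 0.2, "manipulability": 0.8},
    {"name": "between phalanges", "feasible": True,  "discomfort": 0.6, "manipulability": 0.1},
    {"name": "full wrap",         "feasible": False, "discomfort": 0.1, "manipulability": 0.9},
]
best = select_grasp(grasps)
```

Shifting the weights towards comfort (a novice-like balance) would flip the choice back to the pinch, mirroring the novice/expert difference described above.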
This approach holds relevance beyond the realm of hand dexterity and can be applied to obstacle avoidance and full body control, so that a humanoid can bend down and pick up an object without hitting its knee, as well as enabling a robot to navigate its way through a dense crowd. Just as controlling free space is crucial when handling objects with the hands, understanding the dynamics of free space changes is critical to preventing collisions between the arms and knees as the body bends down, or the crowd becomes denser.
The SAHR team draws inspiration from the techniques used by watchmakers to detect subtle changes in force that indicate successful insertion or breakage of a screw. This approach is applied to teach robots how to interpret similar changes in force to manipulate objects skilfully, such as balancing a glass of water without prior knowledge of its contents.
Human and robotics applications
Within the framework of a project funded by the Swiss National Science Foundation and launched in 2022, this work now extends to the study of the acquisition of surgical skills, with the added complexity that the objects involved are composed of highly deformable tissues.
The SAHR project has endowed robots with unique skills, such as picking up objects never seen before and holding multiple objects in one hand, capabilities which hold relevance to different sectors of industry.
Most robotics applications don’t use all of the degrees of freedom in today’s bionic hands, an issue that Professor Billard is working to address. “The goal now is to provide a control system to the robotics community so that they can use all the available degrees of freedom,” she says. This would then allow a robot to take on more complex tasks than is currently possible. “We would like people to get the full dexterity that they used to have with their
own hand from a prosthesis. We hope that our research will lead to improved control systems for future prostheses,” continues Professor Billard. “The problem is to develop prostheses with enough degrees of freedom. One difficulty there is that those prostheses must be light – and the more degrees of freedom, the more motors will need to be included and the heavier they will be.”
This is not an insurmountable problem, however, and Professor Billard is confident that it will be resolved in the coming years, while researchers are continuing to modify and improve the algorithms. A start-up company called AICA Tech has been established to try to translate some of the research conducted in the lab into commercial development, but in terms of the SAHR project itself, at this stage Professor Billard aims to demonstrate the technology and show that robots can adapt to unexpected circumstances. “We want to demonstrate a robot moving in a cocktail party-type environment and avoiding obstacles and hazards. This would show that we can compute an extremely complex task in milliseconds,” she says.

SAHR
Skill Acquisition in Humans and Robots

Project Objectives
The SAHR project aims to gain insights into the principles of human motor coordination, so as to guide the development of advanced robot learning and control algorithms.

Project Funding
This project is supported through an ERC Advanced Grant (Project ID: 741945), funded by the European Commission.

Project Partners
The project is conducted by the Learning Algorithms and Systems Laboratory (LASA) at the Swiss Federal Institute of Technology Lausanne (EPFL).

Contact Details
Joanna Charlotte Erfani
Administrative Assistant
EPFL-LASA
ME A3 495
Station 9
1015 Lausanne
Switzerland
T: +41 21 6930939
E: joanna.erfani@epfl.ch
W: https://www.epfl.ch/labs/lasa/
Prof. Aude Billard is Head of the LASA laboratory in the School of Engineering at EPFL in Lausanne. She holds a B.Sc and M.Sc. in Physics from EPFL (1995) and a Ph.D. in Artificial Intelligence (1998) from the University of Edinburgh. Her research spans the fields of machine learning and robotics.
“Dexterity has traditionally been an exclusive ability of humans, but it is now time for robots to catch up and acquire this skill.”

Prof. Aude Billard
Earthquake research and innovation
Whilst we know where the Earth’s fault lines lie, predicting their inevitable shifts and the subsequent earthquakes has always been a scientific challenge. Speed of response from the moment each tremor is detected can save lives.

By Richard Forsyth
When an earthquake happens, it is likely that aftershocks will follow, and in the Turkey/Syria quake this proved to be the case, with the second shock almost equal in magnitude to the first, at 7.5 on the Richter scale, and a third at a powerful 6.4, destroying weakened infrastructure and buildings that had survived the initial wave. The subsequent string of aftershocks struck while first responders and desperate family members were attempting to dig people out of precarious rubble, causing a swathe of further deaths and destruction.
Warning systems
Whilst a lot has been learnt about earthquakes, with a great deal of open-source data available for research, the geological complexity around them is yet to reveal definitive, usefully predictive signals about their occurrence.
Long-term, accurate prediction of the first shock is extremely difficult, but equipping responders with technology that prepares them for aftershocks is possible. An early warning system can alert first responders to ground shaking, messaging them via their mobile phones and potentially giving up to ten seconds of grace to move clear of a dangerous area. Such Earthquake Early Warning (EEW) systems rely on a network of seismographs that pick up seismic waves, using algorithms to calculate the magnitude, location and arrival time of secondary waves (S-waves). A good early warning system can also deliver communications to monitor those on the ground, for example sending messages like ‘are you safe?’ to the smartphones of response teams to ensure everyone is accounted for.
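The seconds of grace an early warning system can offer come from the speed difference between fast P-waves and the slower, more damaging S-waves. A rough, illustrative calculation follows; the wave speeds and processing delay are typical textbook values assumed for the sketch, not figures from any specific EEW system.

```python
# Hedged sketch: seconds of warning before S-wave arrival at a site,
# assuming typical crustal wave speeds (illustrative values only).
P_WAVE_SPEED = 6.0   # km/s, approximate crustal P-wave speed (assumed)
S_WAVE_SPEED = 3.5   # km/s, approximate crustal S-wave speed (assumed)

def warning_time(distance_km, processing_delay_s=3.0):
    """Seconds of warning a site `distance_km` from the epicentre could get,
    once detection and processing of the P-wave has consumed
    `processing_delay_s` seconds."""
    t_p = distance_km / P_WAVE_SPEED   # P-wave arrival at the site
    t_s = distance_km / S_WAVE_SPEED   # S-wave arrival at the site
    return max(0.0, t_s - t_p - processing_delay_s)

# A site 100 km away: S-wave arrives ~28.6 s after rupture, P-wave ~16.7 s,
# leaving roughly 9 s of warning after a 3 s processing delay.
print(round(warning_time(100), 1))  # prints 8.9
```

The sketch makes clear why the grace period quoted above is so short, and why it shrinks to nothing close to the epicentre, where the S-wave arrives almost on the heels of the P-wave.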
One Horizon 2020 project, launched in 2018, that focused on these elements of forecasting and response was TURNkey (Towards more Earthquake-resilient Urban Societies through Multi-sensor-based information System enabling Earthquake Forecasting, Early Warning and Rapid Response actions). TURNkey focused on Operational Earthquake Forecasting (OEF): collecting earthquake-hazard information, monitoring the region with sensors, and forming methods to assess the area and disseminate critical messages like warnings.
TURNkey is a system that combines quake data with data on the buildings in the zone, modelling the predicted impact on structures in the vicinity to assess the worst-hit locations. This is useful because rapid information and orientation are key to earthquake zone management.
When a major earthquake hits, in seconds the map of a region can completely change, with extensive damage, which makes some routes impassable or unrecognisable, some clear and some dangerous, to the point it can be difficult to navigate or orientate, even for local people. One of the first operational considerations is getting to grips with the new dynamics of the terrain within the disaster zone, so emergency services, rescue teams and humanitarian aid can effectively coordinate.
Earthquakes are devastating natural phenomena, as the 7.8 quake that shook 100km of fault line across Turkey and Syria proved on 6 February 2023. The disaster’s death toll surpassed 50,000, with thousands of buildings collapsed or in need of demolition. Innovation can make a difference in earthquake-prone areas, for prediction, resistance, and search and rescue. Here are some ways science can mitigate the horrors that can occur in earthquakes.
There are many ideas and innovations tackling the challenges of earthquake zones, yet what will make the most difference in reality is investment and political will in combination.

Photo by Çağlar Oskay
Rescue technologies
Drones are making a difference in disaster zones as easily deployed aerial eyes in the sky, and in Turkey sophisticated drones were sent to fly over the worst-hit areas for information gathering. They also helped directly in assessing the situations of trapped individuals and in quickly ferrying supplies.
An Israeli contingent that landed in the disaster zone within 24 hours of the first quake organised rescue operations in Turkey with the aid of state-of-the-art drones.
The expedition group was organised by the Stolero Group in cooperation with the Azerbaijani AKTA group, using a raft of technologies supplied by companies including Magal, XTEND and Polaris. They first set up a command-and-control hub, coordinating nine teams and sectioning the city into zones, with approximately 30 buildings to be checked by each team, using an app to process real-time information such as tracking the teams’ progress with GPS. Drones with thermal imaging technology were deployed to relay critical information about survivors, and were controlled through virtual reality for more intuitive user reactions. One type of drone was for indoor operations, to assess situations and the stability of buildings. Another used thermal imaging to find hot spots, and to deliver food and medicine. Together the drones were able to locate survivors and deliver much-needed supplies by flying quickly over the rubble.
Beyond drones, there are staple technologies that are effective in earthquake rescue scenarios. Searching for survivors in the voids amongst the clutter and tangle of debris in a collapsed building requires a surgical approach, using digital telescopic inspection video cameras which can wind through spaces. They are often combined with specialist equipment that can detect the faintest noises within metres of the sensor, while carbon dioxide detectors are deployed to locate survivors who are alive but unconscious. These devices work well in confined spaces, where a build-up of the gas exhaled by living people is easier to detect. Thermal imaging is highly useful, and it does not simply highlight body temperature but also the warmer rubble in which people are trapped. Finding, communicating with and keeping alive those trapped under buildings is one of the bigger challenges faced in the aftermath
of a major earthquake. In the future, there will be more ways to explore these rubble sites for life signs, and some scientists are thinking ‘outside the box’, developing unusual new solutions. A project called HeroRATs, led by Dr Donna Kean from Glasgow, has been working with the non-profit organisation APOPO in Morogoro, Tanzania, training rats in simulated rubble environments that mimic earthquake zones, to find survivors in collapsed buildings. The rats wear backpacks with built-in microphones so rescuers can speak directly to anyone found. They also wear video gear and location trackers to pinpoint the survivor and visually assess their environment. Whilst existing rescue groups have trained dogs for finding survivors, rats have the advantage of being smaller, so they can get deep into collapsed structures.
In the future, it may be worth abandoning animals for search and rescue altogether and using robots instead. An EU project titled CURSOR explored this approach. Its proposed Search and Rescue Kit included Soft Miniaturised Underground Robotic Finders (SMURFs) and several types of drones. The SMURFs have chemical sensors to detect a range of substances that betray human presence. They are carried from the central operation site to the disaster site by transport drones and, when in position, the robots work independently in clusters, looking for survivors. The so-called mothership drone becomes an aerial hub that produces high-definition imaging of the disaster zone and facilitates communication with the control centre.
The CURSOR project has developed the prototype technology, but to date no commercial partner has invested in manufacturing the system for real-world use, so it was not available to support the response to the recent Turkey/Syria disaster.
A similar project called ICARUS utilises small, tracked robots with infrared sensors in combination with large excavating robots to shift heavy debris. The robots were also equipped with gas sensors to detect leaks from broken pipes, and they could be controlled from a kilometre away, so no human operators need be put in harm’s way.
One EU project that has deployed technology in the aftermath of a disaster is NIFTi, which helps with situational awareness and with reconstructing and mapping 3D environments of terrain in earthquake zones. In 2012 this robotic set-up was used in Northern Italy after an earthquake, to assess structural damage. By using such devices in the field, a lot can be learned about how the technology needs to develop.
An early warning system can alert first responders to ground shaking, messaging them via their mobile phones and potentially giving up to ten seconds of grace to move clear of a dangerous area.

Photo by Çağlar Oskay: İskenderun, Hatay, Turkey – 24th February, 2023.
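Those seconds of grace come from simple wave physics: the fast but relatively weak P-waves outrun the slower, damaging S-waves, so sensors near the epicentre can raise the alarm before the strong shaking arrives further away. A rough sketch of the arithmetic, using illustrative crustal wave speeds and an assumed alert delay:

```python
# Rough estimate of earthquake early-warning time: the gap between the
# fast P-wave (picked up by sensors) and the slower, damaging S-wave.
# Wave speeds are illustrative crustal averages, not measured values.

V_P = 6.0          # P-wave speed, km/s (typical crust)
V_S = 3.5          # S-wave speed, km/s
ALERT_DELAY = 2.0  # assumed seconds for detection plus sending the alert

def warning_seconds(distance_km: float) -> float:
    """Seconds between the phone alert and S-wave arrival at distance_km."""
    t_p = distance_km / V_P  # P-wave reaches the sensors
    t_s = distance_km / V_S  # damaging S-wave arrives
    return max(0.0, (t_s - t_p) - ALERT_DELAY)

for d in (30, 60, 100):
    print(f"{d} km from epicentre: ~{warning_seconds(d):.0f} s of warning")
```

At around 100 km from the epicentre this toy calculation yields roughly the ten seconds of grace mentioned above; very close to the epicentre the warning window shrinks to nothing.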
Preventing building collapse
Ideally, the buildings would not collapse in the first place. The intense lateral forces produced by seismic waves can cause buildings, old and new, to crumble under their violence. However, new buildings can now incorporate engineering that defies seismic shocks. Lead-rubber bearings can isolate, absorb and deflect seismic waves, and a pendulum bearing system can hold the building structure so it is not directly connected to the ground; the bearing effectively moves in rhythm with the quake, minimising the shaking and stresses, while the structure itself does not move. The method is effectively like floating the building.
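The payoff of this ‘floating’ approach can be illustrated with the textbook transmissibility formula for a single-degree-of-freedom structure: lowering the system’s natural frequency well below the dominant frequency of the shaking sharply reduces how much ground motion reaches the building. The frequencies and damping ratios below are invented for the illustration:

```python
import math

def transmissibility(f_ground: float, f_natural: float, zeta: float) -> float:
    """Ratio of structure acceleration to ground acceleration for a
    single-degree-of-freedom system (textbook formula); zeta is the
    damping ratio."""
    r = f_ground / f_natural  # frequency ratio
    num = 1 + (2 * zeta * r) ** 2
    den = (1 - r ** 2) ** 2 + (2 * zeta * r) ** 2
    return math.sqrt(num / den)

quake_hz = 2.0  # illustrative dominant frequency of strong shaking
# Conventional fixed-base low-rise building vs. a base-isolated one:
fixed = transmissibility(quake_hz, f_natural=2.5, zeta=0.05)
isolated = transmissibility(quake_hz, f_natural=0.5, zeta=0.15)
print(f"fixed-base:    {fixed:.2f}x ground motion")
print(f"base-isolated: {isolated:.2f}x ground motion")
```

With these example numbers the fixed-base building amplifies the ground motion, while the isolated one transmits only a small fraction of it, which is exactly the point of decoupling the structure from the ground.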
Shock absorbers placed on every level of a building to dampen the earthquake tremors are another clever design, where pistons with oil absorb the energy of the quake. Cross bracing is a practice that can help strengthen buildings and is a tried and tested solution. Avoiding traditional shapes like boxes and columns in skyscrapers, in favour of more robust structures like domes and pyramids, can deal with seismic stresses better.
The materials the building is built from can make a difference too. Wood and steel are good at absorbing shock. As a building material, bamboo is highly resistant to earthquakes thanks to its tensile strength, whilst being easy to repair and replace if damaged. In a different vein, there are concepts of extra-strong, quake-resistant ‘safe’ pods or rooms placed strategically within a larger building, offering quick access to a safe space in an emergency. This means not having to retrofit a building completely, whilst offering a way to survive for a few days if need be, in relative comfort under a collapsed building, until rescue teams can arrive. The idea has parallels with the mine rescue chamber, where there are only moments to react before it might be too late.
Finally, there is the attractive idea of dissipating the shock waves before they even reach the building. The roots of plants and trees reduce the power of seismic waves, so surrounding buildings with flora may be an advantage. An artificial solution is to use concentric plastic rings.
There are many ideas and innovations tackling the challenges of earthquake zones, yet what will make the most difference in reality is investment combined with political will. Earthquakes often expose poor building design or unchecked standards, and in the most devastating earthquakes it is often the poorer areas that bear the brunt of the damage, with the greatest loss of life.
Take Syria’s dilemma in the recent earthquake. The quake hit a contentious, inaccessible location, with little support in any official capacity. It is barely realistic to imagine the potential and commitment to revolutionise building designs and search and rescue protocols in such an area unless something changes in the political landscape first.
The technologies to mitigate the devastation of earthquakes exist, but whether they become available to the governments, rescue teams and building developers who operate around fault lines could be a matter of life and death in the earthquakes to come.
A new approach to modelling ecosystems
Early land biosphere models were developed essentially as a proof-of-concept, to demonstrate that it was possible to make dynamic models of interactions between vegetation and the climate which would help build a deeper understanding of how climate and ecosystems were likely to evolve. While Earth System models have become increasingly complicated over recent years, they aren’t improving in terms of their ability to predict changes to ecosystems and their carbon exchanges. “The models are not actually getting any better,” says Professor Colin Prentice, Chair in Biosphere and Climate Impacts at Imperial College London. This is a major motivating factor behind the work of the REALM project, an initiative in which Professor Prentice and his colleagues are looking again at some fundamental aspects of modelling practice as they work to develop a new vegetation model. “It’s necessary to go back to the drawing board,” he acknowledges.
Earth system models
Current vegetation models all model the processes of photosynthesis, respiration, and transpiration, but there are marked differences in the level of detail that the models include. “The level of hydrological detail varies a lot between models for example, while the extent to which they deal with nutrient interactions also varies. More worryingly, some processes are represented in models in ways that don’t correspond to what is going on in the real world,” outlines Professor Prentice. The wider aim in the project is to address this issue and lay the foundations for improved ecosystem models, with researchers synthesising and analysing data gathered from many different sources. “There’s plenty of work to be done in terms of analysis,” continues Professor Prentice. “Most of the literature about plant traits begins and ends with purely statistical analysis, with no underlying theoretical basis.”
The project team is taking a different approach, developing ideas about what may be occurring, on the basis of eco-evolutionary optimality (EEO) hypotheses that make explicit, quantitative predictions that can then be tested against observed data. Large volumes of observational data are now available, which are the focus of a lot of attention in the project. “We spend a lot of time compiling site data, associating climate with each site, and analysing how various traits vary with climate and soil properties,” explains Professor Prentice. EEO hypotheses are essentially about trade-offs, and the process by which a particular property of a plant is optimised through natural selection. “For example we’ve done a lot of work on the control of photosynthetic capacity,” says Professor Prentice. “There is a common misconception that the photosynthetic capacity of leaves is controlled by nutrient availability. It isn’t, rather it’s optimised to the environment.”
Earth system models have become increasingly complicated over recent years, yet they aren’t improving in terms of their ability to predict changes to ecosystems and their wider effects, so a re-think is required. Researchers in the REALM project are looking again at modelling practice as they work to develop a new kind of vegetation model, as Professor Colin Prentice explains.

Photo taken at the Guyaflux Tower in French Guiana. © Dr Keith Bloomfield. Professor Prentice’s team of researchers and postgraduate students.
A plant species living in a given habitat whose leaves had suboptimal photosynthetic capacity would soon be displaced. On the other hand, if its leaves had superoptimal capacity, then the plant would be wasting resources on maintenance, which has a cost in terms of respiration. This capacity is optimised on different timescales, both for the immediate circumstances and for the longer term, while there are also many other traits to consider. “Leaf thickness for example has a strong phylogenetic component. One family will typically have thick leaves and another will have thin leaves. Optimisation is then mainly done by selection among species, rather than adjustment within a species,” says Professor Prentice. “Some traits can change relatively fast, others are pretty much hardwired for the species. Nevertheless, you see an optimal outcome, because there is environmental selection and competition.”
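The optimality logic described here — a saturating benefit traded off against a linear maintenance cost — can be made concrete with a toy calculation. The functional forms and constants below are invented for illustration and are not the project’s actual equations:

```python
import math

# Toy eco-evolutionary optimality (EEO) illustration: photosynthetic
# capacity v brings a saturating carbon gain but a linear maintenance
# (respiration) cost, so selection favours an interior optimum.
# The functional forms and constants are invented for illustration.

def net_gain(v: float, light: float = 1.0, cost: float = 0.3) -> float:
    gain = light * (1 - math.exp(-v))  # saturating benefit of capacity
    return gain - cost * v             # minus maintenance respiration

# Crude grid search for the optimal capacity:
vs = [i / 100 for i in range(1, 500)]
v_opt = max(vs, key=net_gain)
print(f"optimal capacity ≈ {v_opt:.2f}")  # analytically, ln(light / cost)
```

Suboptimal capacity (left of the optimum) leaves carbon gain on the table; superoptimal capacity (right of it) wastes resources on respiration, which is the displacement argument in numerical form.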
The core model developed in the project takes in satellite data on green vegetation cover and uses that as one of the main drivers, alongside climate data. This basic structure has been around for a long time; what the project has provided is a strong theoretical basis. “That also means we have one equation for gross primary production that is applicable across all vegetation types,” explains Professor Prentice. The results of the model can then be compared with data from other sources, such as the half-hourly measurements of CO2 exchange from eddy covariance flux towers, which provide a strong test. “The flux data, satellite data and meteorological data are all independent. So if we can reproduce the patterns that we see from the flux towers, then we are doing the right thing,” he outlines.
REALM
Reinventing Ecosystem And Landsurface Models
Project Objectives
The REALM project (Reinventing Ecosystem And Landsurface Models) applies eco-evolutionary optimality concepts to develop and test new quantitative theory for plant and ecosystem function and land-atmosphere exchanges of energy, water and carbon dioxide, with the goal of more robust and reliable numerical modelling of land processes in the Earth System.
Project Funding
REALM was awarded a 5-year European Research Council (ERC) grant under the European Union’s Horizon 2020 research and innovation programme (grant agreement No 787203 REALM).
Laboratory Members
https://prenticeclimategroup.wordpress.com/lab-members/
Contact Details
Project Coordinator, Iain Colin Prentice FRS
Chair of Biosphere and Climate Impacts and Director, Leverhulme Centre for Wildfires, Environment and Society, Imperial College London
E: c.prentice@imperial.ac.uk
Twitter: @LabPrentice
W: https://prenticeclimategroup.wordpress.com/
Vegetation cover
Researchers are tapping into different sources of data to test and refine predictions on multiple traits and processes in plants. One important source of data is satellite imagery going back several decades, which provides abundant data on green vegetation cover and its trends. For example, satellite data from the Tibetan plateau show that some regions have responded differently to environmental change. “A greening of vegetation has been observed over much of the plateau, but we’ve also seen a browning in other regions,” outlines Professor Prentice. Two separate equations to predict green vegetation cover have been developed in the project; which one applies depends on whether or not water supply is limiting. On this basis Professor Prentice, collaborating with colleagues in China, has been able to reproduce regional and global patterns of greening and browning as seen from space. “We can do it as well as – or even better than – far more complex models. Our approach is far simpler and more transparent,” he continues.
This research holds wider implications, with Professor Prentice looking to integrate these insights into land-surface models that form part of Earth System models. Formal collaborations have been established with the European Centre for Medium Range Weather Forecasts (ECMWF) and the Met Office. “We are looking to try out these ideas in the context of weather and climate models. Ultimately we want to improve the accuracy of predictions, at least about the climate near the land surface,” he says. While existing models may not be able to forecast the exact nature of the weather on a particular day, they are pretty accurate on the seasonal level, which has enormous economic value. “The ECMWF model is fairly effective in predicting what the weather will be like in May this year for example, which is important for farmers,” points out Professor Prentice.
Dong, N., Wright, I.J., Chen, J.M., Luo, X., Wang, H., Keenan, T.F., Smith, N.G. and Prentice, I.C. (2022). Rising CO2 and warming reduce global canopy demand for nitrogen. New Phytologist, https://doi.org/10.1111/nph.18076
Mengoli, G., Agustí-Panareda, A., Boussetta, S., Harrison, S.P., Trotta, C., Prentice, I.C. (2021). Ecosystem photosynthesis in land-surface models: a first-principles approach. Journal of Advances in Modelling Earth Systems, https://doi.org/10.1029/2021MS002767
Professor Iain Colin Prentice, FRS
Professor Iain Colin Prentice, FRS, is a renowned Earth System scientist in the Department of Life Science at Imperial College London. He has made significant contributions to the field, including the first global biome model and the widely used LPJ model. He has held prestigious positions at research institutions worldwide.
Most of the literature about plant traits begins and ends with purely statistical analysis, with no underlying theoretical basis, but we’re taking a different approach.

Photo taken at the Guyaflux Tower in French Guiana. © Dr Keith Bloomfield
New models to assess climate policies
Researchers in the EVOCLIM project are developing new models to assess how sensitive climate policy instruments are to bounded rationality and what synergies one can expect between markets and social interactions. This will provide a more solid basis for comparison of instruments like carbon pricing, regulation and information provision, as Professor Jeroen van den Bergh explains.
A wide variety of policies have been adopted by governments across the world to deal with climate change, as countries seek to reduce CO2 emissions and encourage the transition towards a low-carbon economy. It is however difficult to accurately assess the impact of these different policy instruments. “Climate policies are highly complex, and it’s difficult for countries to compare their approach internationally,” says Professor Jeroen van den Bergh, head of the Research Group ‘Environmental and Climate Economics’ at ICTA-UAB in Barcelona. Collective action is required to address climate change, yet it is difficult for people as well as governments to change, since they tend to ‘free-ride’ on the efforts of others.
“People will benefit from climate solutions whether they contribute to them or not. That creates an incentive to rationally free-ride, because you may have to make significant sacrifices for not particularly large individual benefits,” points out Professor van den Bergh. “That holds true at both the citizen or firm level and the government level.”
EVOCLIM project
As the Principal Investigator of the EVOCLIM project, Professor van den Bergh is now investigating the behavioural dimensions of climate policy, aiming to develop sophisticated new models by which different policy instruments can be assessed. The project itself has quite a broad scope, with Professor van den Bergh and his colleagues looking at the impact of different policies. “For example, we look at the influence of bounded rationality on the performance of carbon taxes versus markets, as well as on the rebound effect of energy conservation and efficiency, which can be thought of as the reduction of potential energy savings due to post-adoption behaviour. We’re also studying the influence of commercial advertising versus public information provision on consumer behaviour,” he outlines.
There are two main aspects to bounded rationality, the idea that there are limits to rationality in decision-making. “One is incomplete information and information
asymmetry, meaning some people have access to more information than others. In addition, our brains have limited capacity to process information,” explains Professor van den Bergh. “Rational behaviour takes time – we have to collect information, think about it, process it. In some settings it therefore pays off to base our decisions on our quicker, trained emotional reactions.”
The question researchers are addressing in the project is whether markets like the EU Emissions Trading System (EU ETS) or carbon taxes are the more effective climate policy option under bounded rationality. A further topic of interest in the project is behavioural biases and how they relate to the energy rebound. “A consumer might use more efficient technology in a more intensive way, or spend extra money saved through putting the heating down in winter on a plane trip in summer,” outlines Professor van den Bergh. The wider population is highly heterogeneous in terms of attitudes towards climate change, however, and Professor van den Bergh says a variety of factors may affect behaviour and our willingness to make certain sacrifices: “For example, income differences matter, as people are liable to imitate those with more income and higher status.”
There is also a degree of variability in people’s desire or willingness to adopt new, more energy-efficient technologies, and the extent to which they are influenced by the choices of others in their social network. While a certain type of person wants to be the first to adopt new technologies, and may encourage others to follow their lead, some parts of the population are more resistant. “Some of these laggards will never be influenced by others, but some of them will be,” continues Professor van den Bergh. This social influence, and the extent to which it can encourage people to change their behaviour, is a topic Professor van den Bergh has explored further in a recent paper. “What is essential there is the idea that with regulation, or with pricing, you can change the behaviour of some people. These people then trigger more changes through social interactions,” he outlines. “We’ve shown for the first time that social interactions can reinforce the effectiveness of traditional regulation and pricing policies. We call this the social multiplier of regulation. Numerical analysis, informed by realistic parameters for Spain, indicates that the social multiplier of taxation allows a reduction in the effective carbon tax rate by 38 percent.”

Illustrative co-dynamics between tax stringency, effectiveness, wellbeing effects and public support. Right: Average public support for each income decile under different uses of carbon-tax revenue. https://doi.org/10.1016/j.gloenvcha.2022.102528
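The social multiplier described above can be sketched as a threshold-imitation model: a tax directly converts some agents, and adoption then spreads as others imitate the adopters around them. This is a toy model with invented parameters, not EVOCLIM’s calibrated analysis:

```python
import random

# Toy illustration of the 'social multiplier': a carbon tax directly
# converts agents whose adoption threshold it exceeds; adoption then
# spreads further because agents imitate the share of adopters around
# them. Thresholds and the tax level are invented for the example.

random.seed(1)
N = 10_000
thresholds = [random.random() for _ in range(N)]  # willingness to adopt

def adopters(tax: float, social_weight: float) -> int:
    share = 0.0
    for _ in range(50):  # iterate the adoption share to a fixed point
        share = sum(1 for t in thresholds
                    if tax + social_weight * share > t) / N
    return int(share * N)

direct = adopters(tax=0.30, social_weight=0.0)      # tax alone
with_peers = adopters(tax=0.30, social_weight=0.4)  # tax plus imitation
print(f"tax alone: {direct}; with social interactions: {with_peers}")
print(f"social multiplier ≈ {with_peers / direct:.2f}")
```

Here the same tax ends up converting substantially more agents once imitation is switched on, which is why, as the paper finds, social interactions allow a lower effective tax rate to achieve the same target.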
Markets vs social interactions
At a methodological level, a novel contribution of EVOCLIM is extending traditional equilibrium market models with bounded rationality and social interactions. This could provide a sound basis for broader support of relevant climate policies in the future. A lot of current disagreement about climate policy stems from the fact that experts from separate disciplines often use different methods, leading to a lack of comparability and agreement. “Markets are studied in economics, allowing for the evaluation of carbon markets, taxation or standards as climate policy instruments. It’s very difficult to compare these kinds of findings with studies of instruments that involve social networks, like information provision, which are often studied by psychologists and sociologists,” explains Professor van den Bergh. The aim now in the project is to integrate markets and social networks in a single approach, which Professor van den Bergh notes “has been done in very few studies previously. We have published our work in a journal, and the reviewers were very enthusiastic about it.”
The project’s research has shown that social interactions reinforce the effectiveness of
carbon pricing, a finding which could inform the design of more effective policy instruments likely to be accepted by the general public. The project’s work has also yielded some other important insights, for example about opinions and the dynamics of climate policy. “We’ve found that agents’ tendency to resist opinion change translates into higher support for a tax trajectory where the initial tax is relatively low, then increases fast in later periods, in order to achieve a policy sufficiently ambitious to meet emissions targets,” outlines Professor van den Bergh. “We further show that transfers to households help to maximise support for climate policy, more so than green spending.”
The EVOCLIM project itself is nearing its conclusion, but Professor van den Bergh hopes to gain further funding to pursue new avenues of investigation on the dynamics of social-political support for climate policy and the role of perceptions and concerns about economic growth. “There are a lot of different ideas in the air, like de-growth and green growth for example. I would like to look at the psychological dimensions of that, because whether people are right or not, their perceptions and beliefs influence policy,” he says.
This is not about contributing to the debate about whether green growth is possible, or whether it’s possible to reduce CO2 emissions while simultaneously growing the economy, but rather to look at the underlying basis of the different positions. For his part, Professor van den Bergh says he is entirely indifferent about growth. “I’m deliberately an agnostic about growth. I call my position a-growth,” he says. While many politicians place great emphasis on economic performance, Professor van den Bergh believes that GDP growth does not guarantee a better society. “We should try to convince politicians that they should not worry so much about GDP, because it’s no guarantee of societal progress in rich countries,” he argues. “But this shouldn’t translate into being against growth. If GDP is irrelevant as a progress indicator, the logical thing to do is to ignore it and be indifferent about its change. As a result, politicians may then look in a broader space for good social and environmental policies and strategies.”
EVOCLIM
Behavioral-evolutionary analysis of climate policy: Bounded rationality, markets and social interactions
Project Objectives
The EVOCLIM project aimed to develop a new set of agent-based models to assess the performance of climate-policy instruments, such as various carbon pricing and information provision instruments, under conditions of bounded rationality, social interactions and energy/carbon rebound.
Project Funding
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under ERC advanced grant agreement No 741087.
Project Team
• Postdocs: Dr. Stefan Drews, Dr. Filippos Exadaktylos and Dr. Ivan Savin
• PhD students: Dr. Juana Castro, Dr. Joël Foramitti, Dr. Franziska Klein and Dr. Théo Konc
• Project manager: Marta Viana Diaz
Contact Details
Principal Investigator, Prof. dr. Jeroen van den Bergh Universitat Autònoma de Barcelona, ICREA & Vrije Universiteit Amsterdam
The Netherlands
T: +34-93-586 8773
E: jeroen.bergh@uab.cat
W: http://www.icrea.cat/Web/ScientificStaff/Jeroen-van-den-Bergh-424
Jeroen van den Bergh is ICREA Research Professor in the Institute of Environmental Science and Technology of Universitat Autònoma de Barcelona, and full professor of Environmental Economics at Vrije Universiteit Amsterdam. His research is in environmental and climate economics. He is founding past Editor-in-Chief of the journal Environmental Innovation and Societal Transitions. He received the Royal/Shell Prize 2002 for Sustainability Research, IEC’s Sant Jordi Environmental Prize 2011, and an honorary doctorate from the Open University (2019) in the Netherlands.
COOLER project gets under the surface of glaciers
Parts of the earth became glaciated around 5-6 million years ago, and over the last 2-3 million years glaciers have developed more widely, dramatically changing landscapes. Researchers in the COOLER project are investigating the transformation of fluvial landscapes into glacial landscapes, as Professor Peter van der Beek explains.
The Earth has been slowly cooling over the Cenozoic era, and parts of the planet have become glaciated over the last 5-6 million years. This cooling has been associated with the development of mountain ranges; the Himalaya, for example, emerged during the Cenozoic era and have very high erosion rates, which influences the climate. “If rocks erode rapidly they also weather, leading to chemical changes. Weathering consumes CO2 from the atmosphere and brings it into clay minerals, which may be transported to the ocean and deposited there. Rapid erosion of mountains also leads to burial of organic carbon, as trees and plants are stripped off the mountain sides by landslides, transported with the sediments to the sea and deposited before they decompose. It has been suggested that this geological transfer of CO2 from the atmosphere to the lithosphere is the main mechanism by which this very slow, long-term cooling over the Cenozoic era has taken place,” explains Peter van der Beek, Professor of Geology at the University of Potsdam. On the other hand, climate also has an influence on erosion and the way in which it occurs. “If it gets cold enough, we will start to get glacial erosion. How does that influence the landscape? Does it increase erosion rates? Does it change spatial erosion patterns?” asks Professor van der Beek.
COOLER project
These questions are at the core of Professor van der Beek’s work as the Principal Investigator of the ERC-backed COOLER project. In the project Professor van der Beek is analysing apatite minerals gathered mainly from the Alps and Norway, using a technique called thermochronology. “We use this technique to study the cooling of rocks as they are brought to the surface by erosion; since temperature increases deeper into the Earth’s crust (typically by about 20-30 ºC/km), erosional exhumation of rocks leads to cooling,” explains Professor van der Beek. Specifically, the project team is looking at isotopes of helium found within apatite minerals. “Helium-4 (4He) is produced in apatite by the radioactive decay of uranium and thorium. The helium atom is quite small and it moves around in the mineral, so it tends to escape, but the speed at which it does that depends on the temperature,” says Professor van der Beek.
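The temperature dependence described here is Arrhenius behaviour: diffusivity scales as exp(-Ea/RT), so even a modest drop in temperature makes helium dramatically less mobile. A sketch with round, illustrative parameter values (real apatite diffusion parameters are determined experimentally):

```python
import math

# Arrhenius temperature dependence of helium diffusion in a mineral:
# D(T) = D0 * exp(-Ea / (R * T)). D0 and Ea below are round,
# illustrative numbers, not measured apatite values.

R = 8.314       # gas constant, J/(mol K)
EA = 140_000.0  # activation energy, J/mol (illustrative)
D0 = 5e-3       # pre-exponential factor, m^2/s (illustrative)

def diffusivity(temp_c: float) -> float:
    t_kelvin = temp_c + 273.15
    return D0 * math.exp(-EA / (R * t_kelvin))

# Cooling from 90 C to 40 C makes helium vastly less mobile:
ratio = diffusivity(90) / diffusivity(40)
print(f"D(90 C) / D(40 C) ≈ {ratio:.0f}")
```

With these numbers a 50 °C drop slows diffusion by roughly three orders of magnitude, which is why helium is effectively trapped below the closure temperature even though there is no sharp on-off switch.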
“Below what we call the closure temperature, this movement of helium in the mineral grain becomes so sluggish that it’s basically trapped – it quantitatively doesn’t escape any more. However, the closure temperature does not represent an ‘on-off’ switch in the diffusion of helium: it’s essentially a system where the speed at which these atoms move about is exponentially dependent on the temperature.”

The established method of analysing apatite grains helps researchers identify when the mineral crossed the closure temperature. Now Professor van der Beek is developing a new mass spectrometry tool designed to extract more information from a sample, which involves conducting step-heating experiments. “Instead of heating the whole grain all the way up in one go, we go in little steps. For example, we’ll go to 200ºC and stay there for a little while, then go higher in a step-by-step approach, which leads to the step-wise release of the helium,” he outlines. This approach helps to essentially map out where the helium is sitting in the grain. “The first helium that’s released will come from the outside of the grain. Then, as we continue this step heating, we’ll release more and more helium from the internal parts of the grain,” continues Professor van der Beek. It’s much easier on a mass spectrometer to measure ratios than absolute values, so proton irradiation of the samples is being used to induce the homogenous production of helium-3 (3He) before step heating. “In the mass spectrometer we can measure the amount of radiogenic 4He with respect to the amount of induced 3He,” says Professor van der Beek.

Weathering tends to transfer CO2 from the atmosphere to the lithosphere. It has been suggested that this is the main mechanism by which this very slow, long-term cooling over the Cenozoic era has taken place.

This will provide an insight into the distribution of 4He within the grain, as the 3He is induced homogenously, from which researchers can then learn more about how a grain cooled on its way to the Earth’s surface. For example, if it cooled very quickly, the helium would be expected to be homogenously distributed throughout the grain, whereas there would be a different distribution if it cooled more slowly. “The helium would still have time to diffuse partly out of the grain, and we would expect it to be depleted towards the rims. These are the kinds of things that we can see with this technique,” says Professor van der Beek. Samples are being collected at different
locations and elevations, from which Professor van der Beek hopes to build a fuller picture of how different landscapes developed. “We take samples along fairly steep profiles, and we also take samples from valley bottoms, to see maybe spatial gradients or spatial changes in erosion history along a single valley. Each sample will give you information at a specific point,” he outlines. “Then we combine these samples, looking to get a more regional overview.”
Numerical models
The plan in the project is to collect samples from the Pyrenees, Andes and Alps, as well as from Scandinavia, with researchers so far focusing on samples taken from Norway and the Alps. The best quality apatite minerals are generally found in granitic rocks, so these tend to be the focus of attention. “We collect samples in the field, crush the rocks, and separate the apatites,” Professor van der Beek outlines. Norway is a stable region where tectonic uplift ceased a long time ago, while Professor van der Beek says uplift rates in
the Alps are variable. “They’re quite high in the Western Alps and lower in the Eastern Alps, so we can contrast them. The Pyrenees are somewhere between the Alps and Norway in terms of tectonic activity, so we can study a gradient in tectonic uplift rates and tectonic activity,” he says. The data gathered in the project will be integrated into two types of numerical model, one of which is designed to predict tectonic and topographic evolution from thermochronology data. “It’s a thermal model, in which a landscape evolves in a prescribed way. What will the thermal structure of the crust underneath that landscape be? If rocks go through that crust as they are exhumed towards the surface, what will their thermal history be?” continues Professor van der Beek. “From that we can then predict for instance the thermochronological age pattern of the rocks, or the spectra of 4He / 3He release.”
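The link between a thermochronological age and erosion can be roughed out from the geothermal gradient quoted above: under steady exhumation, a mineral’s cooling age is approximately its closure depth divided by the exhumation rate. The closure temperature and rates below are illustrative round numbers:

```python
# Back-of-envelope link between a thermochronological age and erosion,
# using the ~20-30 C/km geothermal gradient quoted in the text.
# Closure temperature and exhumation rates are illustrative values.

def cooling_age_myr(closure_temp_c: float, surface_temp_c: float,
                    gradient_c_per_km: float,
                    exhumation_km_per_myr: float) -> float:
    """Time since the rock cooled through its closure temperature,
    assuming a steady geothermal gradient and constant exhumation."""
    closure_depth_km = (closure_temp_c - surface_temp_c) / gradient_c_per_km
    return closure_depth_km / exhumation_km_per_myr

# ~70 C closure (roughly the apatite helium system), 25 C/km gradient:
slow = cooling_age_myr(70, 10, 25, exhumation_km_per_myr=0.1)  # stable region
fast = cooling_age_myr(70, 10, 25, exhumation_km_per_myr=1.0)  # active range
print(f"slow exhumation: ~{slow:.0f} Myr age; fast: ~{fast:.1f} Myr age")
```

The contrast mirrors the field strategy described in the text: tectonically quiet Norway should yield old cooling ages, while the rapidly uplifting Western Alps should yield young ones.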
A second type of model being used in the project is a landscape evolution model. With this type of model, researchers prescribe glacial and fluvial erosion processes, let them act on the landscape, and then look at how that landscape evolves. “Let’s say we have a mountain range that starts glaciating because of a cooling climate. How is the topography of that mountain range going to change? What will be the effects on the rocks being exhumed towards the surface, and their thermal history, which we can record?” says Professor van der Beek. This research could ultimately lead to a deeper understanding of natural variability in the climate over longer timescales. “We are interested in how these couplings and feedbacks between tectonic processes and the atmosphere might influence the climate. If we can get a better handle on that, then we can look to gain a deeper understanding of natural variability,” continues Professor van der Beek. “However, the pace of anthropogenic climate change is currently orders of magnitude faster than that caused by natural processes.”
COOLER
Climatic Controls on Erosion Rates and Relief of Mountain Belts
Project Objectives
The ERC-funded research project COOLER aims to quantify the feedbacks between tectonic processes in the lithosphere and climatic processes in the atmosphere. Advancing our understanding of these couplings requires the development of tools that record erosion rates and relief changes with higher spatial and temporal resolution than the current state of the art, and the integration of the newly obtained data into next-generation numerical models that link observed erosion-rate and relief histories to potential driving mechanisms.
Project Funding
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 834271.
Project Partners
• Helmholtz Zentrum Berlin, Germany
Contact Details
Peter van der Beek
Professor of General Geology
Institut für Geowissenschaften
Universität Potsdam
Karl-Liebknecht-Str. 24-25
14476 Potsdam-Golm
Germany
T: +49 331 977 5808
E: vanderbeek@uni-potsdam.de
W: http://erc-cooler.eu
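As a rough illustration of how such landscape evolution models work, the following sketch evolves a one-dimensional river profile under the widely used stream-power erosion law, with erosion competing against uplift. The setup and all parameter values are our own generic assumptions, not those of the COOLER models.

```python
import numpy as np

# 1-D stream-power sketch: erosion E = K * A^m * S^n acting on a river
# profile that is simultaneously uplifted. Node 0 is a fixed base level.

n, dx, dt = 50, 1000.0, 500.0        # nodes, spacing (m), timestep (yr)
K, m_exp, n_exp = 1e-5, 0.5, 1.0     # erodibility and stream-power exponents
uplift = 1e-3                        # rock uplift rate (m/yr)

# Crude drainage-area proxy: area grows downstream towards the outlet (node 0)
area = np.arange(n, 0, -1) * dx * 1e3    # m^2

z = np.zeros(n)                          # elevation profile
for _ in range(20000):                   # 10 Myr of evolution
    slope = np.diff(z) / dx              # downstream slope at nodes 1..n-1
    erosion = K * area[1:] ** m_exp * np.maximum(slope, 0.0) ** n_exp
    z[1:] += (uplift - erosion) * dt     # uplift minus erosion at each node
    z[0] = 0.0                           # base level held fixed

# At steady state the profile is concave: slopes steepen upstream as area shrinks
print(f"relief after 10 Myr: {z[-1]:.0f} m")
```

Prescribing glacial processes as well, and tracking the thermal histories of the exhumed rocks, turns this kind of toy into the coupled models described in the article.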
A lot of time has been devoted to setting up the protocol and the techniques, with Professor van der Beek and his colleagues starting to measure samples collected during fieldwork in the Alps and Norway, while work is also ongoing on the numerical models. The wider aim in this research is to build a deeper understanding of glacial erosion. “Does glacial erosion increase erosion rates over geological timescales? Or do we just see this locally and transiently, over relatively short timescales?” asks Professor van der Beek. “When were the Alps glaciated? When did that change from a fluvial to a glacial landscape happen? Did that happen during the very first glaciations or continuously over many glacial-interglacial cycles? These are the kinds of questions we would like to answer.”
Peter van der Beek is Professor of General Geology at the University of Potsdam, and has held research positions at institutions in Europe, America and Australia. He has been Chair of the Programme Committee of the European Geosciences Union.
Does working from home contribute to traffic reductions?
Switzerland has committed itself to the Paris Climate Agreement of 2015, which aims to limit the average global temperature increase to 1.5 to 2 degrees by the end of this century. For this purpose, Switzerland needs to reduce its greenhouse gas (GHG) emissions. Since the transport sector contributes substantially to emission levels, reductions in traffic – and particularly in motorized transport – can contribute to Switzerland reaching the goals set out in the Paris Agreement. Additionally, Switzerland’s mobility sector suffers from traffic congestion. In 2019 the Swiss Federal Roads Office reported 30,000 hours of traffic jams, mainly due to road capacity overloads. Traffic congestion causes substantial time losses and economic costs, and occurs especially during the morning and evening rush hours. Much of the country’s traffic is caused by commuters. Overall, 3.5 million people commute to the workplace every day, and 50 percent of them do so by private car.
Thus, in addition to reducing traffic and CO2 emissions, the transport sector also needs to redirect traffic and reduce usage during peak periods.
The digitalization of the labour market increasingly allows for flexible forms of work. This flexibility is driven by the continuing diffusion of e-mail, online conferencing, virtual private networks (VPNs) and digital files. These digital innovations allow telework and job flexibility. Telework means that part of the work is carried out from home with the help of computers. Another kind of job flexibility applies to the working schedule and is called flexitime. Flexitime refers to the possibility of deciding when to start and end the workday. Both flexible forms of work have the potential to reduce private mobility and also to reduce traffic during peak periods. First, working remotely could eliminate commuting on a regular basis, which might reduce traffic overall. Second, flexitime could help to distribute the traffic more evenly during the day. This could be an important factor in reducing the need to expand the transport infrastructure. Therefore, the digitalization of the labour market has the potential to benefit the environment and the economy. However, it is not clear whether telework actually leads to less travel in general, or whether saved commutes are counteracted by other trips (a rebound effect). It is also possible that working remotely could have other consequences – for instance, people might move further away from the workplace, which would lengthen the commute when they do need to be on-site. Flexitime workers could use their flexibility to commute outside of peak periods. However, they might also conform to the common work hours of family members, friends, and colleagues, and hence commute during rush hours. Therefore, the consequences of the digitalization of the labour market are not theoretically determined. Instead, how people react to digitalization is an empirical question.
The mobility sector in Switzerland causes 32 percent of the country’s CO2 emissions. Hence, reductions or changes in mobility are necessary to mitigate climate change. In their research, Dr Axel Franzen and Fabienne Wöhner are investigating how the digitalization of the labour market could help transform the mobility sector.
The project
The aim of the project is to investigate how working flexibly affects various aspects of travel behaviour. To answer these questions, we use the Swiss Mobility and Transport Microcensus (MTMC) 2015. This telephone survey is conducted approximately every five years by the Swiss Federal Statistical Office and the Office for Spatial Development. The MTMC is a nationwide representative survey consisting of 57,000 interviews. It contains abundant information about participants’ mobility behaviour, as well as other important variables such as the number of cars in a household, respondents’ education, and their working arrangements. All our analyses refer to labour market participants between the ages of 18 and 65.
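As a toy illustration of the kind of group comparison such survey data supports (with invented numbers and simplified variables, not the actual MTMC records):

```python
from statistics import mean

# Hypothetical mini-sample: mean daily kilometres by telework status.
# Each row is (teleworker, commute_km, non_work_km); all values invented.
respondents = [
    (True, 8.0, 12.0),
    (True, 10.0, 9.0),
    (True, 12.0, 11.0),
    (False, 16.0, 6.0),
    (False, 18.0, 7.0),
    (False, 14.0, 8.0),
]

def group_mean(flag, index):
    """Mean of one column for teleworkers (True) or on-site workers (False)."""
    return mean(r[index] for r in respondents if r[0] is flag)

commute_tele, commute_onsite = group_mean(True, 1), group_mean(False, 1)
nonwork_tele, nonwork_onsite = group_mean(True, 2), group_mean(False, 2)

print(f"commute km/day: teleworkers {commute_tele:.1f} vs on-site {commute_onsite:.1f}")
print(f"non-work km/day: teleworkers {nonwork_tele:.1f} vs on-site {nonwork_onsite:.1f}")
```

Shorter commutes paired with longer non-work travel for the telework group is the rebound effect in miniature, the pattern the project tests on the real microcensus.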
Overall, labour market participants commute on average 15 kilometres per day. Teleworkers commute about 21 percent fewer kilometres than those who never telework. This is interesting, as teleworkers could also combine remote work with on-site work on a given day, which would then not replace the commute. However, there is a rebound effect in non-work travel (e.g. trips for leisure or shopping): teleworkers drive more for other purposes, such as shopping or leisure, than on-site workers do. Taken together, telework does not affect the total distance travelled during a week. This applies to the total number of kilometres travelled across all modes of transport, as well as to the distances driven by private vehicles such as cars or motorcycles. Even though telework does not reduce traffic overall, it might help to distribute the traffic more evenly during the day. Our results show that telework reduces evening rush-hour commutes, since teleworkers are less likely to commute by car during evening peak periods than on-site workers. One interpretation of this finding is that teleworkers prefer to spend the afternoons working at home. In contrast, the analyses indicate that flexitime even promotes morning peaks, as flexitime workers are more likely to commute to work during the morning rush hours. A possible explanation is that the decision to commute during morning peak periods is driven by coordinating commuting times with other family members, friends, and colleagues.
Despite these findings, telework and flexitime still might be beneficial for the economy and particularly for public health if they are associated with more walking and cycling (active travel). Previous research suggests that daily needs, such as shopping, are more likely to be met closer to home when working remotely. Thus, these shorter distances can more easily be covered on foot or by bike. Moreover, flexitime lowers time constraints, which facilitates the use of slower modes of transport and might also offer additional opportunities to walk and cycle during breaks. Our analysis of the MTMC 2015 shows that Swiss labour market participants walk on average 25 minutes per day, and cycle for 6 minutes. Flexitime correlates with more and longer active travel, which is essentially due to more walking. The World Health Organization recommends around 30 minutes or more of daily physical activity to achieve health benefits. Our statistical analyses show that teleworkers and flexitime users (those who have a core-time agreement) are more likely to walk and cycle 30+ minutes per day.
MOBILITY IN SWITZERLAND
Mobility in Switzerland: Potentials of the Digitalization of the Labour Market for Environment and Economy
Project Objectives
The digitalization of the labor market increasingly allows working flexibly, e.g. working from home instead of the workplace, and being flexible with respect to working hours. Therefore, more and more employees could reduce commuting to the workplace or could avoid rush hours. However, the commuting time saved might be spent on trips elsewhere. Based on nationwide behavioral mobility data, the aim of this project is to investigate whether working flexibly actually contributes to reductions in traffic and helps to relieve traffic congestions at peak periods in Switzerland.
Project Funding
The project has received funding from the Swiss National Science Foundation (SNSF) under the grant number 188866. Furthermore, the Swiss Federal Statistical Office and the Swiss Federal Office for Spatial Development supported the study by providing the data.
Contact Details
Project Coordinator: Professor Dr. Axel Franzen
Project Staff: Fabienne Wöhner
University of Bern, Institute of Sociology, Switzerland
T: +41 31 684 48 74
E: axel.franzen@unibe.ch
E: fabienne.woehner@unibe.ch
W: https://www.soz.unibe.ch/about_us/people/prof_dr_franzen_axel/index_eng.html
W: https://doi.org/10.1016/j.jtrangeo.2022.103390
To sum up, this project contributes to the research on the impact of telework and flexitime on travel behaviour. The main conclusion of our study is that the digitalization of the labour market has not so far contributed to a reduction of traffic in Switzerland. However, digitalization might increase active travel, and thereby improve public health. Our findings are based on surveys of individual behaviour. The results help to assess the consequences of the digitalization of the labour market, and provide some valuable insights on how to increase sustainability in the transport sector. Most likely, the digitalization of the labour market will not by itself reduce traffic, but must be combined with other measures such as congestion fees or road pricing.
Axel Franzen is Professor for Methods and Statistics at the Institute of Sociology at the University of Bern, Switzerland. His research interests include methods of quantitative social research, experimental game theory and environmental sociology. He is an expert in the empirical study of people’s mobility behaviour, and has published various studies on the discrepancy between environmental attitudes and pro-environmental behaviour.
Fabienne Wöhner is a PhD candidate in the Institute of Sociology at the University of Bern, Switzerland. The mobility project is part of her dissertation. Her research interests include survey methodology and data collection, environmental sociology, public health, and mobility behaviour.
Teleworkers who work partly at home commute to the workplace less often. However, the reduction in commutes is counteracted by trips for other purposes such as shopping or leisure. Overall, teleworkers do not travel less as compared to individuals who always work on-site. – Axel Franzen & Fabienne Wöhner
A step towards circular flooring
Polyvinyl chloride (PVC) is used in a wide range of applications, from flooring to artificial leather, but only a relatively small proportion is currently recycled. Researchers in the Circular Flooring project use the CreaSolv® Process to recover PVC from post-consumer waste flooring, enabling its eventual re-use in new products, as Thomas Diefenhardt and Dr. Martin Schlummer explain.
Polyvinyl chloride (PVC) is widely used in durable applications like window profiles or piping, and is therefore equipped with specific stabilizers. The addition of plasticizers to a PVC sheet gives it more flexibility, making it suitable for a variety of applications, for example in construction, transport and healthcare. However, some of the plasticizers and heavy metal stabilizers that were commonly used in the past cannot be used in recycled products, after studies revealed that they might have adverse effects on living organisms; their use is therefore restricted under the European Union’s REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals) legislation. “They first have to be extracted from the PVC,” explains Thomas Diefenhardt, an Associate Scientist at the Fraunhofer Institute for Process Engineering and Packaging. Soft PVC containing these legacy plasticizers – specific phthalic acid di-esters – at concentrations of 1000 ppm or above may no longer be placed on the market. The challenge of reconciling higher recycling rates with compliance with regulatory standards can only be met through substantial purification of waste PVC, states Diefenhardt, who addresses this topic with his colleagues in the Circular Flooring project. “We aim to find a solution to open up the possibility of recycling to currently unused PVC waste.
We are also working on methods of treating separated plasticizers so that they are no longer harmful,” he outlines.
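The 1000 ppm market threshold mentioned above can be expressed as a simple compliance check. The function name and the phthalate list below are our own illustrative sketch, not an official tool; the four substances are the restricted legacy phthalates discussed in the article.

```python
# Soft PVC containing restricted legacy phthalates at 1000 ppm or above
# may not be placed on the market - a simple threshold check.

RESTRICTED_PHTHALATES = {"DEHP", "DBP", "DIBP", "BBP"}
LIMIT_PPM = 1000.0  # combined concentration limit

def marketable(concentrations_ppm):
    """True if the combined restricted-phthalate content stays below the limit."""
    total = sum(v for k, v in concentrations_ppm.items() if k in RESTRICTED_PHTHALATES)
    return total < LIMIT_PPM

print(marketable({"DEHP": 400.0, "DINCH": 50000.0}))  # True: DINCH is not restricted
print(marketable({"DEHP": 800.0, "DBP": 300.0}))      # False: combined 1100 ppm
```

The check makes the project’s motivation concrete: recyclate from old flooring only becomes sellable once purification pushes the legacy plasticizer content below this limit.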
Circular flooring
This research is targeted in particular at post-consumer flooring waste, with researchers investigating whether it can be treated effectively using the CreaSolv® Process. This process already works well on many other polymers, and Diefenhardt is now looking to fine-tune it for use on post-consumer flooring waste. “We also want to see if, after separation, PVC and plasticizers can be post-processed separately and finally re-implemented into new flooring. Our aim is not only to treat the plasticizers, but also to re-implement them,” he says. Soft PVC as a product material can be thought of as PVC polymer chains with multiple additives embedded between the polymer chains. While some of the additives
need to be depleted, others still have a positive impact on the PVC’s function. “We mainly want to remove hazardous substances, so the plasticizers,” continues Diefenhardt. “We may want to retain some of the other additives in post-consumer flooring, because PVC converters say it is beneficial to keep some of them, since they would be added anyway.”
The aim of the project is not to strip everything down, but rather to take a more selective approach, resulting in a product that can then be useful in the commercial market. It is however important to remove the plasticizers, which have been identified as being of very high concern under the REACH legislation. “One of the most prominent legacy plasticizers is the organic compound DEHP for example, which was widely used in PVC flooring in the past,” says Dr. Martin Schlummer, the coordinator of the project. DEHP is now known to disrupt the endocrine system in the
body, so other plasticizers are now used. “One replacement for DEHP is DINCH, which is now accepted and is still on the market,” explains Dr. Schlummer. “Before DINCH is re-added to recycled PVC, however, the mixture of old plasticizers contained in waste soft PVC must be removed from the polymer.”
Waste PVC includes not just DEHP but also DIBP, DBP and BBP, all of which may have adverse effects on the endocrine system. The project’s overall agenda covers every stage, from the separation of these plasticizers through to their eventual recycling into a new product. “Some of the partners are doing cost modelling work regarding the products we develop. We also consider the energy and material input,” outlines Diefenhardt. The focus of Diefenhardt’s work in the project, however, is the CreaSolv® Process, in which a highly selective solvent is used to dissolve the polymer. “We use a solvent that is specific for PVC. There might also be rubber and other materials in the input fraction, but we dissolve only the PVC and the associated additives,” he says. “It may be that around 60 percent of the input fraction is PVC, which we aim to recover from the PVC flooring waste.”
There are several further steps in the CreaSolv® Process before the material can be reused in production, including cleaning, precipitation and drying. The final product reaches the standards required for reuse, says Dr. Schlummer. “All of the data that we have so far indicates that we can replace virgin PVC in flooring,” he says. Researchers in the project are also working to develop a hydrogenation process to deactivate the plasticizers, which opens up the possibility of recycling them, and the signs so far are very promising. “Our partner working on this has been able to achieve a high level of purity,” continues Dr. Schlummer. “This is a chemical process however, and it needs to be performed at fairly large scales. So that means that before you start treating plasticizer residues, you need to first have quite a high capacity for PVC recycling.”
The cost-effectiveness of the recycling process as a whole is an important factor in terms of the prospects of wider uptake. If the process can be applied to sufficient volumes of feedstock, then the cost of the recycled material will be highly competitive, and Dr. Schlummer says the hydrogenation process represents a way to make effective use of legacy plasticizers. “Otherwise they would just be incinerated. This is a route to make value and money out of them,” he points out. This research could have a wider environmental impact, reducing the need to incinerate waste PVC and so bringing down CO2 emissions, while there is also interest from plasticizer manufacturers keen to work in a more sustainable way. “We are talking to the European Association of Plasticizer Manufacturers, and they are very interested in the project,” outlines Dr. Schlummer. “Recycling is a major issue for the industry today.”
An effective method of recovering waste products holds rich potential in these terms, and Dr. Schlummer and his colleagues are now exploring the possible practical application of their research. Each of the steps in the process has been defined and shown to be working well, and the next step is to demonstrate its economic and technical feasibility at higher scales. “We have the PVC extraction plant equipped. We have this demonstration capacity, so we can come from the lab into these production sites and produce higher volumes of samples,” says Dr. Schlummer. This can then provide a kind of blueprint for larger, commercial-scale plants, with Dr. Schlummer looking to work with partners beyond the scope of the Circular Flooring project. “We have contacts in Asia and North America for example, and we hope also that these bilateral projects could lead to commercial-scale plants,” he says.
CIRCULAR FLOORING
New products from waste PVC flooring and safe end-of-life treatment of plasticisers
Project Objectives
CIRCULAR FLOORING aims to enable circular use of plasticised PVC (PVC-P) from waste flooring by developing recycling processes that eliminate plasticisers, including hazardous phthalic acid esters (e.g. DEHP). We will demonstrate the project results via production of high-quality recycled PVC, reprocessing of eliminated plasticisers to new phthalate-free plasticisers and re-use of recycled polymers and additives in new flooring applications.
Project Funding
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement no. 821366.
Project Partners
https://www.circular-flooring.eu/partners/
Contact Details
Jan-Hendrik Knoop, M. Sc.
Scientific Associate
Process development polymer recycling & decontamination
Fraunhofer Institute for Process Engineering and Packaging IVV
Giggenhauser Str. 35
85354 Freising, Germany
T: +49 8161 491-339
E: jan-hendrik.knoop@ivv.fraunhofer.de
W: https://www.circular-flooring.eu/
W: www.ivv.fraunhofer.de
https://doi.org/10.1039/D1GC03864H
Dr. Martin Schlummer is Senior Scientist in the Department of Process Development for Polymer Recycling at Fraunhofer IVV, where he has been a research fellow since 1999. He has coordinated more than 30 national and international research projects and (co-)authored more than 62 publications in peer reviewed scientific journals. Thomas Diefenhardt is a scientist in the field of plastic recycling at Fraunhofer IVV. He is conducting research in extracting hazardous additives from plastics and performs lab- and technical-scale experiments.
We aim to find a solution to open up the possibility of recycling to currently unused PVC waste. We are also working on methods of treating separated plasticizers so that they are no longer harmful. – Dr. Martin Schlummer & Thomas Diefenhardt
Dried PVC product.
Why do companies enter moral markets?
Markets for products like solar panels, fairtrade food, or plant-based meat aren’t driven purely by the desire for financial profit, but also by the goal of addressing certain social and environmental challenges. Why do companies choose to enter such moral markets? It’s not just about resources, but also a firm’s identity and the social context, as Dr Panikos Georgallis explains.
Typically a company will decide to enter new markets where it can leverage its existing resources and capabilities, whether that’s in groceries, solar panels or fashion. While many companies still work on this basis, anecdotal evidence from moral markets shows that some firms enter without having related resources, for example the clothing company Patagonia. “In 2012 Patagonia entered the organic food market. Clothing and food are not really related,” points out Dr Panikos Georgallis, Assistant Professor of Strategy at the Amsterdam Business School. As part of his work on the Let It Shine project, Dr Georgallis is investigating why some companies choose to enter moral markets without necessarily having existing expertise or capabilities in that area. “In a recently published paper a colleague and I explain that such decisions depend partly on their corporate identity: what they stand for, what is the purpose of the organisation,” he outlines.
LET IT SHINE
The emergence and evolution of moral markets
This project is funded by The Dutch Research Council (NWO) (project 016.Veni.195.232).
Panikos Georgallis, Assistant Professor Strategy & International Business Amsterdam Business School University of Amsterdam
T: +31 (0)20 525 5311
E: p.georgallis@uva.nl
W: https://journals.sagepub.com/doi/full/10.1177/1476127019827474
W: https://www.uva.nl/en/profile/g/e/p.georgallis/p.georgallis.html
Entering moral markets
As a company, Patagonia places a lot of emphasis on environmental issues and protecting the natural world, so it chose to enter the food market to help address sustainability concerns. This is an example of a company entering a moral market that’s aligned with its values, rather than its competences. But there are also cases where firms have entered despite having a seemingly different ethos, in part due to pressure from social movements. “In order to promote a moral market, activists sometimes call out what they see as the bad guys,” explains Dr Georgallis. Activists invest a lot of energy in promoting alternatives to a product, for example solar and wind energy as an alternative to oil, but they also focus on identifying the companies who are still extracting fossil fuels. “By calling out the oil companies which contribute to environmental problems, activists also draw attention to the fact that there are alternatives that they could invest in,” continues Dr Georgallis.
This social pressure may encourage larger companies to enter moral markets, yet this doesn’t mean they have stopped trying to find oil. The entry of a major company into a moral market may enhance the visibility of an industry, and such companies also bring significant resources and expertise, yet Dr Georgallis says there are also potential downsides. “They might be entering a moral market simply for impression management purposes. It’s also harder for companies to work together if their agendas are very different. Firms or NGOs that are trying to promote renewables are not always willing to work together with oil companies. This creates difficulties in terms of promoting the market to policy-makers and to consumers,” he outlines. It may be difficult for a solar energy start-up to lobby for subsidies or grant support together with oil companies, for example. “They have different motives, and it’s hard for them to coordinate effectively,” points out Dr Georgallis.
The presence of a diverse range of actors in a moral market may be considered a strength in some respects, yet it can also be a weakness. In the early stages of an industry’s emergence, Dr Georgallis believes diversity of actors is not beneficial. “At that stage it’s really important to have a strong collective identity, for market players to understand that they need to work together to promote the market and get people to understand what plant-based meat is, or what a solar panel is. Coherence might be less important later on, when people understand the product better,” he says. These are important considerations for NGOs or activists looking to encourage the development of moral markets. “Effective campaigns can encourage more firms to enter moral markets, which is desirable. At the same time this may also bring in the bigger companies with different values,” explains Dr Georgallis.
Panikos Georgallis is an Assistant Professor of Strategy at the Amsterdam Business School. His main research interests lie in the areas of strategy, organisation theory and sustainability. His research currently focuses on moral markets, business-society interactions and organisational responses to climate change.
By calling out the companies which contribute to environmental problems, activists also draw attention to the fact that there are alternatives that they could invest in.
Mixoplankton – marine organisms that break the rules!
The conceptual basis upon which management tools for our ocean, seas and coasts operate is out of date and does not adequately address current challenges. We spoke to Dr Aditee Mitra and Dr Xabier Irigoien about the implications for ocean health, policies, aquaculture and fisheries under climate change in this UN Ocean Decade.
The way marine systems have historically been thought to function closely parallels that of land-based systems, where plants produce food and animals consume it. Thus, in marine systems, the traditional view is that phytoplankton (microalgae) produce food which zooplankton consume, which in turn, are consumed by larger animals through to fish. “Actually, the situation is quite different,” says Dr Mitra, project coordinator of MixITiN. “What have traditionally been labelled as primary producers are also often
consumers, and what have been labelled as consumers are also primary producers – life is complicated!” These organisms that combine plant-like and animal-like characteristics in the one cell are termed mixoplankton.
“In essence, over decades of research, mixoplankton have been mislabelled and misunderstood,” she explains. This forms the backdrop to the work of the MixITiN project.
The MixITiN project focused on the development and deployment of new research methods for application in the ocean sciences, to establish a better picture of how mixoplankton contribute to marine ecology. This interdisciplinary project encompassed diverse approaches, including laboratory and field work drawing on a range of disciplines, from molecular biology right through to coarse-grain systems biology techniques. The project’s wider goal – an improved understanding of the indicators of ocean health – has major implications for the design of ocean management policies and planning for a sustainable future.
Mixoplankton – why the fuss?
Mixoplankton are not new discoveries as such, rather until recently they were just not recognised as a major component of the plankton community. “Indeed, we found that most of what are traditionally thought to be phytoplankton, and around 50% of protozooplankton, are actually mixoplankton,” says Dr Mitra. Well-known phytoplankton that are now known to be mixoplankton include: the chalk-producing coccolithophore Emiliania huxleyi which, like the minute ocean-dwelling phytoflagellates, can eat bacteria to obtain vital nutrients; toxin-producing Alexandrium and Dinophysis, whose blooms (ironically termed Harmful Algal Blooms, HABs) result in shellfish contamination and aquaculture closures; Karlodinium and raphidophytes that cause mass mortalities of farmed and wild fish.
Mixoplankton are a diverse group which can be broadly divided into two types: constitutive mixoplankton (CM), which have their own chloroplasts and can also eat (‘plants that eat’); and non-constitutive mixoplankton (NCM), which are like body-snatchers, acquiring phototrophy by ‘stealing’ chloroplasts from their prey or by capturing and maintaining their prey as symbionts (‘animals that photosynthesize’). Mixoplankton are an enigmatic group; for example, the CM group includes the fish-killing Prymnesium on one hand and the fisheries-supporting Tripos (Ceratium) on the other. Likewise, the NCM group includes the fish-supporting Strombidium and also the bane of shellfisheries, the toxic Dinophysis.
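The two broad types and the example genera named above can be summarised in a small data sketch (the code structure is our own; the classifications follow the article):

```python
from enum import Enum

# The two broad mixoplankton categories described in the text.
class MixoType(Enum):
    CM = "constitutive: own chloroplasts ('plants that eat')"
    NCM = "non-constitutive: acquired phototrophy ('animals that photosynthesize')"

# Example genera from the article, mapped to their category.
EXAMPLES = {
    "Prymnesium": MixoType.CM,         # fish-killing
    "Tripos (Ceratium)": MixoType.CM,  # fisheries-supporting
    "Strombidium": MixoType.NCM,       # fish-supporting
    "Dinophysis": MixoType.NCM,        # toxic; bane of shellfisheries
}

for taxon, kind in EXAMPLES.items():
    print(f"{taxon}: {kind.name} - {kind.value}")
```

The point of the mapping is that ecological role (harmful vs fisheries-supporting) cuts across both categories, which is part of what makes mixoplankton so enigmatic.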
The wider backdrop to this is the concern that, with climate change, we will see changes in biodiversity as well as impacts on food security and sustainability, and Dr Mitra says plankton are fundamental in this respect. “Microbial plankton drive life in the ocean, so it is essential that we get the basics right,” she stresses. This need has been the driver of the MixITiN project.
Mixoplankton and climate change in the UN Ocean Decade
The UN has designated 2021-30 as the Decade of Ocean Science for Sustainable Development, an initiative designed to support ocean research and facilitate communication with stakeholders, including policy makers, ecosystem managers and the fisheries industry. The Ocean Decade has its roots in a wider recognition of the importance of the ocean systems and the ecosystem services they provide, along with a commitment to reversing the long-term decline in their overall health and supporting
the UN’s ‘Sustainable Development Agenda’. “Issues like eutrophication, climate change and the occurrence of HABs all affect the health of the oceans. Critically, the mixoplankton paradigm strikes at the heart of this effort, by bringing into question core scientific assumptions,” explains Dr Irigoien, Scientific Director of AZTI.
Mixoplankton and Ocean Management
The shift in perception of the food web under the mixoplankton paradigm dramatically changes how marine systems should be viewed and studied. That we got our understanding so fundamentally wrong should make us re-examine the whole subject of plankton ecology.
Mixoplankton combine multiple trophic strategies for acquiring nutrition and energy. However, traditional climate change models are focussed on single strategies – photosynthesis versus predation in different organisms. This means that the conceptual basis of marine systems within climate change simulations is at best a gross simplification and at worst fundamentally flawed. Coarse-grained systems biology modelling undertaken within MixITiN has shown that the exclusion of mixoplankton could have serious consequences for future predictions.
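One way to see why the single-strategy assumption matters is with a deliberately minimal toy growth model; the functions and all parameter values below are invented for illustration only and are not the MixITiN coarse-grained models:

```python
# A toy sketch (not a MixITiN model): contrast a single-strategy
# phytoplankton growth rule with a mixoplankton rule that can supplement
# scarce dissolved nutrients by eating prey. All parameters are invented.

def phototroph_growth(light, nutrients, mu_max=1.0):
    """Single-strategy assumption: growth limited by the scarcer of light
    and dissolved nutrients (Liebig-style minimum law), both scaled 0-1."""
    return mu_max * min(light, nutrients)

def mixoplankton_growth(light, nutrients, prey, mu_max=1.0, graze_eff=0.6):
    """Same phototrophy, but ingested prey partially substitutes for
    dissolved nutrients, capped at full saturation."""
    effective_nutrients = min(1.0, nutrients + graze_eff * prey)
    return mu_max * min(light, effective_nutrients)

# A nutrient-limited scenario: plenty of light, few dissolved nutrients,
# but prey available. The single-strategy model under-predicts growth
# relative to the mixotrophic one.
light, nutrients, prey = 0.9, 0.2, 0.5
print(phototroph_growth(light, nutrients))
print(mixoplankton_growth(light, nutrients, prey))
```

In a climate simulation built from the first rule alone, organisms that in reality also feed would appear to stall whenever dissolved nutrients run short, which is the kind of systematic bias the project's modelling work points to.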
This in turn affects our understanding of how energy moves in the marine trophic chain, from microbes to fish. These issues fall within the aegis of ocean management, including policies such as the EU Marine Strategy Framework Directive. Dr Irigoien points out: “mixotrophy does not change fisheries or water quality operational management, as we act on indicators. But it substantially changes our understanding about the carrying capacity of the ecosystem and its response to anthropogenic impacts. We need to revise those indicators, as the ecosystem tipping points might be different than what we thought.”
The mixoplankton paradigm brings into question our basic understanding of the energy fluxes in the ocean, from carbon fixation to fish production.

Figure: Integration of facets of mixoplankton science in MixITiN. Working clockwise from 12 o’clock, these include field studies and monitoring, determination of rate processes, taxonomic and genomic analyses, chemical analyses leading to stoichiometric ecology, trophic dynamics, ecosystem services and sustainability/profitability. The centre denotes the focussing of all these activities through modelling for hypothesis testing and predictive purposes. © Aditee Mitra. Note that only a portion of the Acantharia (top left corner) and green Noctiluca (bottom right corner) are visible here. Background image: artistic interpretation of representative mixoplankton drawn to common scale; mixoplankton range from 2 μm to > 5 mm. The width of the image is ca. 250 μm (= 0.25 mm). © Claudia Traboni
Mixoplankton - challenges for aquaculture
MixITiN has highlighted the importance of revising the indicators of the health of marine ecosystems in line with the mixoplankton paradigm. This is especially important for predictions of algal blooms and their impacts on aquaculture. “Most HAB species are mixoplanktonic: what controls their growth is not what we thought it was – it is not just light and plant-food. Competitors and even grazers are actually potential food,” explains Dr Mitra. And new types of mixoplankton blooms are appearing, such as green Noctiluca, which is spreading across coastal oceans with climate change. In the Arabian Sea, these blooms are leading to the collapse of the traditional phytoplankton-zooplankton-fisheries link in the food web, causing severe food security and socio-economic hardships for a population of over 140 million people. Other mixoplankton affect recreational activities and the property market – discolouration of water caused by Karlodinium blooms has been known to result in a decrease in prices of highly sought-after waterside properties.
“There is also a potential interaction with aquaculture, as fish farms release not only the nutrients needed for photosynthesis, but also the organic matter used by mixoplankton. Algal blooms are a major issue for aquaculture all around the globe, and including mixoplanktonic activity helps us to understand what controls such blooms. Indeed, the mixoplankton paradigm helps plug gaps in our understanding of what controls many algal blooms,” emphasizes Dr Irigoien.
Mixoplankton – where are they and when? Challenges for monitoring
Current monitoring methods need to reflect the complexity of the reality we now better understand. Dr Irigoien points out that “most routine field phytoplankton sampling techniques currently used in ocean monitoring, based on chlorophyll, are not well adapted to provide quantitative data on mixoplankton”. For example, the presence of chlorophyll - used as an indicator of phytoplankton biomass in surveys and ecosystem monitoring - is actually not just an indicator of the presence of phytoplankton. “It may also indicate the presence of mixoplankton, which are not just primary producers, but also consumers, and include harmful species,” explains Dr Mitra. It is important that plankton monitoring programmes take into account the mixoplankton communities; their proliferation is not driven solely by light and inorganic nutrients as is that of phytoplankton.
Therefore, they have a much wider and diverse impact on marine trophic dynamics.
Ocean Literacy: raising the profile of mixoplankton
One of the wider aims of MixITiN has been to raise awareness, enhance ocean literacy, educate and train; this means not just the 11 researchers employed on the project, but also school pupils and the wider public. Mixoplankton are key drivers of life in the ocean, and the MixITiN researchers have been keen to raise the profile of these enigmatic organisms. Without mixoplankton, life on Earth would not function as it does.
The importance of mixoplankton is not currently recognised in educational books, environmental management frameworks nor in the simulation models used to guide climate change policies, issues that Dr Mitra and her colleagues in the project are keen to address. “With this in mind, we have produced various open access manuals and reports to aid understanding and development of new protocols,” says Dr Mitra. These include a fieldwork guide, a functional group classification guide to establish whether a HAB species is mixoplanktonic or not, a manual for isolation and establishment of cultures and a simple mixoplankton food web model for teaching. These materials and other information can be accessed via www.mixotroph.org.
What next?
Mixoplankton are important organisms helping to maintain planetary homeostasis and therefore could play a fundamental role under climate change. “We have looked at their biogeography, and we have seen that mixoplankton are absolutely everywhere, but the question of which type occurs where and how it relates to factors like seasonality, warming waters, and changing availability of nutrients, remains unclear,” says Dr Mitra. The project MixITiN has come to the end of its journey (Oct 2017-Sept 2021). However, there is still much to learn about mixoplankton.
Dr Mitra and colleagues are currently working to build a comprehensive mixoplankton database to establish a clearer picture of which organisms are mixoplanktonic, and which species eats what. Various planktonic species have not been documented as mixoplanktonic, because no one has been looking for this characteristic routinely. Now is the time to look specifically for them. “Once again on our marine planet, we need to re-start from the basic questions, how many, where, when,” says Dr Irigoien.
MixITiN
Bringing the paradigm for marine pelagic production into the 21st century; incorporating mixotrophy into mainstream marine research
Project Objectives
The overarching aim of MixITiN has been to explore the underlying principles of marine ecology under the new mixoplankton paradigm, a concept that reimagines over 100 years of science. The project has developed a suite of novel techniques for investigating and monitoring our ocean under this paradigm.
Project Funding
The MixITiN project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 766327.
Project Partners
Contact Details
Dr Aditee Mitra
School of Earth and Environmental Sciences
Cardiff University
E: MitraA2@cardiff.ac.uk
W: www.mixotroph.org
Mitra & Flynn (2021) HABs and the Mixoplankton Paradigm. UNESCO Harmful Algae News no. 67. https://zenodo.org/record/5109703
Mitra et al. (2021) Novel Approaches for investigating marine planktonic mixotrophy. MixITiN Report 3.8 http://doi.org/10.5281/Zenodo.5148500
Leles et al. (2021) Differences in physiology explain succession of mixoplankton functional types and affect carbon fluxes in temperate seas. Prog Oceanogr 190:102481. https://doi.org/10.1016/j.pocean.2020.102481
Flynn et al. (2019) Mixotrophic protists and a new paradigm for marine ecology. J Plankton Res 41:375 https://doi.org/10.1093/plankt/fbz026
Dr Aditee Mitra
Dr Aditee Mitra is a Research Fellow at Cardiff University, UK. Her research focuses on plankton life in the single largest ecosystem on Earth, the ocean. She has been a key driver of the mixoplankton paradigm in marine ecology. She is a keen advocate of ocean literacy.
Dr Xabier Irigoien is Scientific Director at AZTI, a scientific and technological centre based in northern Spain. He is also an IKERBASQUE Research Professor. His main research interests lie in biological oceanography, plankton ecology and the physics of plankton-fish interactions.
Why does one species have separate sexes, while its sister species in the phylogeny remains hermaphroditic? What changed in the ecology, or the circumstances of the species? These questions are at the core of Professor John Pannell’s research.
The vast majority of flowering plants are hermaphrodites, with the ability to perform both male and female sexual functions. As Professor in Plant Evolution at the University of Lausanne, John Pannell is interested in the strategies that the genes in plants use to ensure their own transmission. “To what extent is a population of hermaphrodites maintained in such a state by natural selection?” he asks. In principle, we need to think about a population of hermaphrodites that is challenged by mutations that bring about ‘maleness’ or ‘femaleness’. “Imagine a mutation that emerges in a population of hermaphrodites such that individuals expressing the mutation are no longer able to produce pollen. One might think that expressing such a mutation would be a disadvantage, because the individuals concerned have lost a key mode of transmitting their genes to the next generation. But precisely that sort of mutation is required for a population to evolve towards separate sexes, in which females are just male-sterile hermaphrodites,” explains Professor Pannell. “The question to address is why and when such a sterility mutation should ever be beneficial.”
Sterility mutations
A mutation like this would typically be expected to disappear rapidly from the population, as it closes down one avenue towards reproductive success. However, separate sexes have evolved in some plants from hermaphroditic ancestors, so male or female sterility mutations must have
been transmitted to subsequent generations in the past. “There must have been cases in the past where a mutation came along in a hermaphroditic population, leading to the loss of the male function for example, and individuals with that mutation then ended up transmitting more genes than their peers in the population,” outlines Professor Pannell. The other hermaphrodites carry on reproducing as before, but as they
are not reproducing as effectively as this new mutant, the male sterility mutation begins to spread in the population. “The new mutation becomes more and more frequent and you get more and more females in the population. We then have a situation where both females and hermaphrodites coexist,” he continues.
A plant that has evolved to be only female may also acquire other characteristics affecting
the production and dispersal of seeds. Sexual conflict arises in hermaphrodites when the male and female functions require different morphologies or behaviours. “It is pretty easy to see this in animals, and it applies to plants in much the same way. In mammals, for example, being female requires a physiology that is able to maintain a pregnancy, while being a successful male may require an ability to compete aggressively with other males to mate with the females. Hermaphrodites may suffer from a conflict between these two sexual functions,” says Professor Pannell. This conflict can reduce the reproductive success of hermaphrodites, which is probably one of the reasons why many species of animals have separate sexes. “In a hermaphroditic population in which there’s a strong conflict between the male and female functions, we might well expect mutations to be successful that suppress one sexual function or the other, allowing specialisation in the remaining sexual function,” explains Professor Pannell.
In addition to cases where separate sexes have evolved from hermaphroditism, there are also cases where populations have evolved from dioecy to become hermaphroditic. The main advantage of hermaphroditism is commonly viewed as the ability to self-fertilise, but Professor Pannell says the overall picture is more complex than that. “Many flowering plants have mechanisms to prevent self-fertilisation, yet they are still hermaphrodites,” he explains. “So there must be other explanations for maintaining hermaphroditism.”
The maintenance of hermaphroditism cannot be attributed simply to the ability to self-fertilise, a topic that Professor Pannell is exploring in his research. In one experiment, his group
changed the mating opportunities available in populations of a flowering plant called Mercurialis annua. “We started off with several populations of males and females, and, in some populations, we removed the males,” he outlines. Normally these populations would be expected to go extinct due to the lack of mating opportunities, yet this did not always happen. “We tried this experiment three times, and on the first two occasions the populations did effectively go extinct. But every so often in this species – and also in many other plant species with separate sexes – the sexes are not completely separated,” continues Professor Pannell. “Essentially, some females occasionally produce a male flower, or males occasionally produce a female flower.”
“Consider a population of 100 individual female plants, all of which produce 10 ovules in their female flowers. If one female now starts to produce pollen, she will be able to fertilise all of her ten seeds, and so produce ten seeds of her own,” explains Professor Pannell. “However, she will also be able to fertilise the seeds produced by all the other females that are not producing pollen. Ultimately, she may expect to see her genes transmitted through all 1,000 progeny produced by the population, whereas those females not producing pollen will have their genes in only their own 10 progeny.”
The genes that allowed that first female to produce pollen will now be in the progeny of all those individuals, and so more will have that same ability in subsequent generations.
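The arithmetic of this thought experiment can be sketched in a few lines; this is purely illustrative, using only the numbers from the quote (100 females, 10 ovules each), and is not a model from Professor Pannell's group:

```python
# Illustrative arithmetic only: the transmission advantage of a female that
# starts producing pollen in an otherwise all-female population.
n_females = 100     # females in the population
ovules_each = 10    # ovules (future seeds) per female

total_progeny = n_females * ovules_each   # 1000 seeds produced in total

# A pure female passes her genes only through her own seeds.
pure_female_transmission = ovules_each

# The pollen-producing female is the only pollen source, so she sires every
# seed in the population, including her own.
mutant_transmission = total_progeny

advantage = mutant_transmission / pure_female_transmission
print(advantage)  # 100.0, i.e. a hundred-fold transmission advantage
```

This hundred-fold advantage is why genes enabling pollen production would be expected to spread so quickly through such a population.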
Why is it that one species has separate sexes, while its sister species in the phylogeny or its ancestor was hermaphroditic? What changed in the ecology, or the circumstances of the species, to facilitate the shift?
Sex inconstancy
This is called ‘leakiness’ in sex expression, or sex inconstancy, and researchers have found evidence of it in Mercurialis annua. On the third occasion that they conducted this experiment, some of the females showed a degree of sex inconstancy, which gave those females a big advantage in terms of reproductive success. “The inconstant females could produce their own seeds by self-fertilisation, which pure females elsewhere in the population could not do,” says Professor Pannell. The more important factor, however, is that the females producing pollen were able to sire not only their own seeds, but also those of the other females in the population.
THE EVOLUTION AND RESOLUTION OF SEXUAL CONFLICT IN FLOWERING PLANTS

Project Objectives

The majority of flowering plants are hermaphrodites, with both male and female sex functions. Sexual conflict arises where those two different functions require different morphologies, or different behaviours. The aim in the project is to probe the implications of sexual conflict for the phenotypes and genotypes of plants, specifically in the context of transitions between dioecy and hermaphroditism. This is where a population shifts from a dioecious state, where individuals have only one sex function, to hermaphroditism, where they have both.

Project Funding

This project is funded by the Swiss National Science Foundation (SNSF).

Contact Details

Project Coordinator, John Pannell
Pannell Group - Ecology and Evolution of Plant Sexual Systems
University of Lausanne - Switzerland
T: +41 21 692 4170
E: john.pannell[@]unil.ch
W: https://www.unil.ch/dee/en/home/menuguid/people/group-leaders/prof-johnpannell.html

“Accordingly, we find that the females in the experimental populations quickly evolved to become hermaphrodites, producing more and more male flowers,” says Professor Pannell. It might be expected that a Y chromosome is required to produce pollen, given that pollen production is the male’s function, yet these females do not have a Y chromosome - it was effectively removed from the population with the males at the start of the experiment. “It’s clearly not necessary to have a Y chromosome to produce pollen,” outlines Professor Pannell. The genes required for pollen production are thus clearly not located exclusively on the Y chromosome. “The ability to produce male flowers is on the other chromosomes - all you need is something to switch those genes on,” says Professor Pannell. “In a dioecious population with separate males and females, the switch is something on the Y chromosome. We don’t know precisely what this switch gene is, but we know roughly where it is on the Y chromosome.”
The switch gene effectively turns on a developmental programme. In Professor Pannell’s experiment, this switch was removed when males and the Y chromosome were removed, so something elsewhere in the genome must have adopted the role of the switch. “Again, we don’t know where that switch gene is, but it’s on one of the non-sex chromosomes, the ones we call the autosomes,” explains Professor Pannell. “We are trying to find those regions in the genome that are implicated in male flower production, and we’ve found several candidate regions. What we don’t know is which of those regions harbours the switch gene, and which are the genes that are being switched on. We do know, however, that different genes are involved, and together they contribute to male flower production in females.”
A number of different genes affect sex expression, yet a single gene may also have more than one effect. Professor Pannell’s group have identified a gene that causes an increase in male flower production, while it also leads to a decrease in female flower production, an example of what is called a pleiotropic effect. “It’s like a trade-off effect, where the same gene alters both male and female sex function,” says Professor Pannell. A
gene that affects sex expression may also have other effects on a plant. “For example, those individuals that express the male sterility mutation have smaller leaves, or they produce less nectar. There will thus be other effects of that gene that go beyond sex expression,” explains Professor Pannell.
This programme of research has largely centred on the European garden weed Mercurialis annua, yet Professor Pannell is also looking at other plants, such as certain dioecious species of the South African ‘cone bush’ genus Leucadendron, which are often sold by florists in Europe for spectacular floral displays. One interesting finding from this strand of research is that the genes that become sex-biased in their expression also evolve more quickly. “The reason we know that, in this genus, is that we’ve looked at sex expression in a number of related species. Our analysis allowed us to infer what the sex expression was likely to have been in the common ancestors within the genus,” outlines Professor Pannell. By sampling a number of species in the genus, and analysing them comparatively, Professor Pannell’s team has been able to test the idea that the genes that become sex-biased evolve more rapidly because of sexual selection. “But we found that the genes that became sex-biased were already evolving quickly, well before they became sex-biased,” he explains. “We thus need a rather different model to explain the relative speed of evolution of sex-biased genes. It seems, for instance, that the sex-biased genes are simply sampled by the plant from a pool of genes that evolve more quickly – genes whose expression levels perhaps do not matter very much to the plant.”
John Pannell is Professor in Plant Evolution at the University of Lausanne. His main research interests include the evolution of plant gender and sexual dimorphism, as well as the ecology, genetics and evolution of polyploidy.
Safe by design for the next generation of nanomaterials

Nanomaterials can be engineered to have unique properties relevant to specific applications, but it’s also important to consider their safety and wider impact. We spoke to Lisa Bregoli about the NanoInformaTIX project’s work in developing a framework and a web-based platform to predict the behaviour of nanomaterials, which will support their safe-by-design development.

A lot of attention in research is focused on the development of engineered nanomaterials, with scientists across the world working to develop new materials with potential applications across a wide variety of sectors, including healthcare, electronics and manufacturing. While engineered nanomaterials (ENM) have a lot of potential, it’s also important to consider their safety and wider impact, a topic at the heart of the NanoInformaTIX project. “In the project we have created a modelling framework to predict the safety profiles of nanomaterials, a sort of decision support system for manufacturers of new materials and products,” explains Lisa Bregoli, the project’s dissemination manager. A lot of data is currently available from the characterization and testing of nanomaterials, the result of decades of research all around the globe, says Bregoli. “This huge amount of relevant data has to be exploited in such a way that it can be used by manufacturers of new products containing ENM, as a guide in their product development phase,” she outlines.

Nanomaterials

This forms the backdrop to the NanoInformaTIX project’s work, in which researchers have connected the dots between the available data and models. This has led to the development of a web-based platform designed to support manufacturers and help them understand and predict the safety profile of their ENM-containing products during their development pipeline. ENM have at least one dimension between 1-100 nanometres and very specific properties. “They may have surface energies for example that cause different types of physical interactions with the surrounding molecules, leading to specific behaviours,” says Bregoli. “These behaviours can be beneficial, for example in creating self-cleaning materials, or for delivering drugs precisely to the site of disease, avoiding side-effects.”

On the other hand, the standard models used for assessing safety cannot be applied to nanomaterials, an issue that Bregoli and her colleagues have addressed in NanoInformaTIX. “In the project we integrated data from several relevant EU (and US) databases with validated nanoinformatics models,” she says. “We have also created new models and in vitro/in vivo extrapolations to support the prediction of biological effects and exposure of ENM at various stages of their life cycle.”

The project was structured into six technical work packages (WPs), or groups of activities, the first of which was devoted to data and databases. Existing databases such as eNanoMapper bring together biological and toxicological information on nanomaterials, as well as characterization data. “Every single dimension and property of a nanomaterial can really make a difference in terms of interaction with the surrounding environment, so you have to describe the nanomaterial in a very detailed way, including characteristics like size and shape,” outlines Bregoli. For this reason, a lot of the NanoInformaTIX team’s attention was centered on developing a standardised way of describing nanomaterials and reporting data on the databases in a way which meets the FAIR (Findable, Accessible, Interoperable, Reusable) principles. “This means it can be understood and used by other scientists in the world,” continues Bregoli.
The project’s agenda also included modelling, with researchers creating informatics tools that use several variables, descriptors and the data to predict how the ENM will interact with specific environments, providing a reliable safety profile. Researchers also investigated exposure and biodistribution modelling, looking at how these nanomaterials will affect the natural environment. “A nanomaterial can end up in the environment in different ways and mix with different materials, depending on the way in which it is used. So modelling is also about exposure and bio-distribution,” says Bregoli. Some nanomaterials might be ingested by fish for example, while others will simply agglomerate and sit in sediments. “The possible safety of a nanomaterial in the environment can be modelled, based on information about its structure and how it enters the environment,” continues Bregoli.

This research could bring significant benefits to a variety of stakeholders, including regulatory bodies, industry and academia, as well as civil society more generally. NanoInformaTIX researchers have been working to deliver a platform that meets practical needs, so a lot of effort in the project has been devoted to understanding the priorities of different stakeholders. “This platform has been developed in consultation with external stakeholders,” stresses Bregoli. The NanoInformaTIX platform is based on sound, validated methodologies, giving regulators confidence in its results, which leads to wider benefits for civil society. “It will improve the safety of products available on the market, enhancing transparency and trust,” continues Bregoli. “The academic sector will benefit from the availability of curated data and tools to perform modelling and conduct focused research into the key mechanisms of toxicity and adverse outcome pathways (AOPs).”

Web-based platform

The platform will also have a positive impact on industry, helping companies developing nanomaterials to work more effectively and keep costs down, while reducing their reliance on animal testing. This is one of the prime motivations behind NanoInformaTIX and is actually a priority for the European Union. “We have developed a web-based platform to connect databases to in silico models. So different models for the structure and the exposure were created and brought together,” outlines Bregoli. The models have also been tested on selected real-life use cases, to verify that they reflect what happens in practical scenarios. “Every new impact assessment strategy based on modelling needs to be validated and demonstrated in real life,” says Bregoli.

A web-based modelling platform will enable the identification of any issues with nanomaterials at an early stage in development, rather than only finding that a material doesn’t have a good toxicological profile at the end of the development phase.

NanoInformaTIX Project Partners

• Agencia Estatal Consejo Superior de Investigaciones Cientificas
• Institute of Occupational Medicine (IOM)
• European Research Services GmbH
• Aarhus University
• German Federal Institute for Risk Assessment
• GreenDecision Srl
• Thomas More Kempen University of Applied Sciences
• Tel-Aviv University
• University of Gdansk
• University College Dublin, National University of Ireland
• Universitat Rovira i Virgili
• Ideaconsult Ltd.
• University of Helsinki
• Leiden University
• SINTEF
• Università degli Studi di Roma Tor Vergata
• National Technical University of Athens
• University of Aveiro
• Eidgenössische Materialprüfungs- und Forschungsanstalt
• UppinTech OÜ
• National Centre for Scientific Research “Demokritos”
• Politecnico di Torino
• Warrant Hub SpA
• TEMAS Solutions GmbH
• Institute of Science and Technology for Ceramics-National Research Council
• National Research Centre for the Working Environment
• Tyoterveyslaitos
• Swansea University
• Aix Marseille University
• DST/Mintek Nanotechnology Innovation Centre
• Helmholtz Centre for Environmental Research
• East European Research and Innovation Enterprise
• The National Center for Nanoscience and Technology
• National Cheng Kung University
• Sorbonne Université
• Université Catholique de l’Ouest-Association Saint Yves

“There is a lot of collaboration in the nanosafety field. With NanoInformaTIX we bring our results to researchers, industries and the next generation of scientists who will carry this work forward.”

Figure: TEM (a, b, and c) images of prepared mesoporous silica nanoparticles with mean outer diameter: (a) 20 nm, (b) 45 nm, and (c) 80 nm. SEM (d) image corresponding to (b). The insets are a high magnification of a mesoporous silica particle. Image: Wikimedia Commons, by Verena Wilhelmi, Ute Fischer, Heike Weighardt, Klaus Schulze-Osthoff, Carmen Nickel, Burkhard Stahlmecke, Thomas A. J. Kuhlbusch, Agnes M. Scherbart, Charlotte Esser, Roel P. F. Schins and Catrin Albrecht, CC BY 2.5, https://commons.wikimedia.org/w/index.php?curid=77428789
“The sooner in the development phase you know whether a material meets all the safety and sustainability criteria, the better,” stresses Bregoli. This rigorous approach to risk assessment will strengthen confidence in nanomaterials as a whole, and encourage their ongoing development. “There will be more materials based on nanomaterials in future. Companies will know that they can be more efficient in their development, thanks to this decision-support tool,” continues Bregoli.
The workshops held during the project have provided an invaluable opportunity for researchers to present the platform to those who could benefit from it and get their views on how it could be improved. “We got some important information and feedback from stakeholders at the first workshop. That feedback informed the development of the platform during the second part of the project,” says Bregoli. The platform is now nearly finished, with researchers fine-tuning certain aspects, and while NanoInformaTIX itself has concluded, Bregoli says researchers are now looking to build on the project’s work. “We developed a sustainability plan during the project,” she outlines.
This includes plans to make the platform available to interested parties for use in modelling the safety of nanomaterials,
while there is also interest in establishing a spin-off company, extending the impact of NanoInformaTIX beyond the duration of the project. At the same time, the models will need to be updated in line with new knowledge, as further experiments are conducted and more data on nanomaterials is produced. “The models will constantly need to be improved,” acknowledges Bregoli. Three nanosafety training schools were held during the project, helping the next generation of researchers develop the skills they will need to improve the models. Moreover, a cross-project Early Career Researchers (ECR) group was created, and a dedicated event provided them with the opportunity to build relationships and lay the foundations for further collaborations. “The ECR group will continue to exist and grow after the end of NanoInformaTIX, linking to ongoing and future EU-funded projects. This will be our legacy,” continues Bregoli. A number of other EU projects have been established in the field of nanosafety, and in many cases these initiatives are very close to each other, with researchers sharing information and insights. NanoInformaTIX is part of the EU NanoSafety Cluster, which is designed to harmonise the work of EU-funded projects.
NanoInformaTIX
Development and Implementation of a Sustainable Modelling Platform for NanoInformatics
Project Objectives
NanoInformaTIX is developing a web-based Sustainable Nanoinformatics Framework (SNF) platform for the risk management of engineered nanomaterials (ENM) in industrial manufacturing. The tool is based on the significant amounts of data on the physicochemical, toxicological and ecotoxicological properties of ENM generated over recent decades, as well as new data coming from research. The final aim is to provide efficient, user-friendly interfaces that enhance the accessibility and usability of nanoinformatics models for industry, regulators and civil society, thus supporting the sustainable manufacturing of ENM-based products.
Project Funding
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 814426.
Project Website
https://www.nanoinformatix.eu
Contact Details
Lisa Bregoli, PhD
Engineering Manager, European Funding Development
Warrant Hub S.p.A., Casalecchio di Reno (BO), Italy
Director of European Funding Development Engineering: Isella Vicini
E: isella.vicini@warranthub.it
W: https://www.warranthub.it/en/
Lisa Bregoli has been an expert in the design of EU-funded projects for over 10 years. She worked at Johns Hopkins University School of Medicine, and as an advanced postdoc at Brigham and Women’s Hospital, a teaching affiliate of Harvard Medical School. She was appointed head of the Health Technology unit of Veneto Nanotech SCpA-ECSIN (Rovigo, Italy), where she participated in several FP7 European projects in the nanomedicine and nanosafety sector. She is now Manager of the Engineering area of the European Funding Development Business Unit of Warrant Hub. She holds a Laurea degree in Medical Biotechnology and a PhD in Human and Molecular Morphological Sciences (University of Bologna, Italy).
3D fluorescence microscopy on a sub-nanometer scale
The classical diffraction limit is a physical barrier that restricts the resolution of conventional microscopy, making it impossible to resolve details smaller than ca. 250 nanometers with a classical microscope. Super-resolution microscopy is a group of techniques in optical microscopy that bypass this limit, achieving resolutions ca. 100 times greater than classical microscopy. Super-resolution microscopy allows for the detailed visualization of subcellular structures and lets biologists witness dynamic cellular processes occurring on a molecular level, making it a valuable tool for biological and biomedical research.
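The ca. 250 nm figure follows from Abbe’s diffraction limit, d = λ / (2·NA). A minimal sketch (the wavelength and numerical-aperture values below are illustrative, not tied to any specific instrument):

```python
def abbe_limit_nm(wavelength_nm: float, numerical_aperture: float) -> float:
    """Abbe diffraction limit: smallest resolvable distance d = wavelength / (2 * NA)."""
    return wavelength_nm / (2.0 * numerical_aperture)

# Green light (~500 nm) through an objective with NA = 1.0 gives the
# often-quoted limit of about 250 nm; a high-NA oil objective does a bit better.
print(abbe_limit_nm(500, 1.0))            # 250.0
print(round(abbe_limit_nm(550, 1.4), 1))  # 196.4
```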
The first super-resolution method was Stimulated Emission Depletion (STED) microscopy, developed by Stefan Hell in the 1990s. For this development he shared the Nobel Prize in Chemistry in 2014, for “bringing optical microscopy to the nano dimension”. A few years after the development of STED microscopy, alternative super-resolution methods based on the imaging and detection of single molecules in wide-field images were developed independently by Eric Betzig and Xiaowei Zhuang in the US. The detectors of these microscopes are so sensitive that they allow the visualization of single fluorescent molecules in a sample, provided the molecules are sufficiently far apart from each other.
A few years ago, Dr. Enderlein and his team developed the technology of single-molecule Metal-Induced Energy Transfer (smMIET), which can be combined with the aforementioned single-molecule localization super-resolution microscopy to achieve “resolving of three-dimensional structures with nanometer isotropic resolution.” Moreover, combining smMIET with fluorescence correlation spectroscopy makes it possible to observe “structural dynamics on the nanometer length scale with nanosecond temporal resolution”.
Single-molecule localization microscopy
Dr. Enderlein explains the mechanism behind single-molecule localization microscopy, which is the starting point of his project: “when you want to see the spatial distribution of certain proteins in a cell, you would normally label them with a fluorescent dye and visualize them with fluorescence microscopy. However, if you have fluorescent dyes which you can switch off into a dark state, you can switch them off and make them invisible. If at any point in time you switch very few of these molecules back into a fluorescent state, the image will look like a starry sky. So, you see a dark background with very few fluorescent (light) molecules. This allows you to determine each of their positions with very high accuracy.” Having only one single molecule in a field of view permits the determination of its center position with very high accuracy, much better than the classical resolution limit that determines the size of a single-molecule image. “The idea is: I have very few molecules, I localize them, I find their center position, then I switch them off and I switch on another subset of the molecules in my sample. As I do this thousands and thousands of times, I localize the positions of all the molecules in my sample, and with these coordinates I can reconstruct a very high-resolution image. This is the state of the art,” explains Dr. Enderlein. This method is relatively simple to realize with a standard wide-field microscope, and it can determine the position of molecules in two dimensions. However, since reality is three-dimensional, the challenge is localizing the positions of molecules along the third dimension, along the optical axis of the microscope.
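The localization step described above can be illustrated with a toy calculation: for an isolated, noise-free spot, an intensity-weighted centroid already pins down the emitter far more precisely than the spot’s own width. Real pipelines fit a Gaussian model to noisy camera data; all numbers here are made up for illustration:

```python
import math

def gaussian_spot(cx, cy, sigma, size):
    """Idealised camera image of one fluorophore: a 2D Gaussian PSF on a pixel grid."""
    return [[math.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))
             for x in range(size)] for y in range(size)]

def localize(img):
    """Estimate the emitter position as the intensity-weighted centroid."""
    total = sx = sy = 0.0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            total += v
            sx += v * x
            sy += v * y
    return sx / total, sy / total

# A spot centred between pixels, with a PSF a few pixels wide:
img = gaussian_spot(12.3, 9.7, 2.5, 25)
x, y = localize(img)
# x, y recover (12.3, 9.7) to well within a tenth of a pixel,
# even though the spot itself is several pixels across.
```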
Today, microscopy is at such an advanced level that scientists are able to surpass physical barriers of resolution and visualize dynamic cellular processes occurring on a molecular level. We spoke to Dr. Jörg Enderlein about his project, Single-Molecule Metal-Induced Energy Transfer, a new method of three-dimensional fluorescence microscopy with a resolution on a molecular scale.
We are developing new optical microscopy technology for fundamental biomedical research. The idea is that we want to have three-dimensional fluorescence microscopy with a resolution on a molecular scale.
Artistic visualization of a biological cell with its various organelles (plasma membrane, nucleus with chromatin, Golgi apparatus, mitochondria, endoplasmic reticulum, cytoskeleton etc.) sitting on a gold-covered glass slide, observed by two microscopes at two different wavelengths. Artistic visualization of a lipid bilayer consisting of two leaflets of lipid molecules sitting on top of a single sheet of graphene, observed from below by a microscope objective.
Metal-Induced Energy Transfer
For this purpose, back in 2012, Dr. Enderlein and his group invented MIET, an idea based on a principle from physics called the Purcell effect. “When you take a fluorescent molecule and bring it very close to a metal, e.g. a gold or silver surface, then what you will observe is that as the distance between the molecule and the surface becomes small, in the range below 200 nanometers, the molecule becomes increasingly dim. Moreover, when you excite a molecule with light, it does not instantly jump back to its ground state by emitting a fluorescent photon, but resides for a very short time in the excited state - a few nanoseconds. The average of this residence time in the excited state is called the fluorescence lifetime, and this lifetime also becomes shorter when the molecule comes close to a metal surface. We can measure lifetimes easily with a pulsed laser and a fast single-photon detector. Since the fluorescence lifetime depends on the distance of the molecule from the metal surface, you can use the lifetime measurement to localize the molecule and thus determine the distance between the molecule and the metal surface. This works with an accuracy of a few nanometers,” explains Dr. Enderlein. With this method it is possible to localize a molecule along the optical axis with nanometer resolution, and by combining it with single-molecule localization it becomes possible to reconstruct three-dimensional images of a sample. “This is what we do, we can reconstruct three-dimensional images of a sample with nanometer isotropic resolution,” concludes Dr. Enderlein.
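The inversion at the heart of MIET - mapping a measured lifetime back to a height - can be sketched as follows. The calibration curve below is a hypothetical saturating function that only mimics the qualitative shape of the real MIET curve, which is computed from electrodynamic theory; the constants are assumptions for illustration only:

```python
import math

TAU_FREE = 3.0   # unquenched lifetime far from the metal, in ns (assumed)
Z_SCALE = 60.0   # characteristic quenching range, in nm (assumed)

def lifetime_ns(z_nm: float) -> float:
    """Toy monotone calibration: lifetime shortens as the molecule nears the metal."""
    return TAU_FREE * (1.0 - math.exp(-z_nm / Z_SCALE))

def height_from_lifetime(tau_ns: float, lo: float = 0.0, hi: float = 200.0) -> float:
    """Invert the monotone calibration curve by bisection."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if lifetime_ns(mid) < tau_ns:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# A lifetime measured for a molecule 50 nm above the film maps back to ~50 nm.
z = height_from_lifetime(lifetime_ns(50.0))
```

Because the calibration is strictly monotone, the bisection recovers a unique height for every admissible lifetime, which is why a single lifetime measurement suffices for axial localization.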
The implications of this idea are very wide. The method can be used to resolve the global structure of macromolecular complexes and their dynamics - for example, reconstructing a three-dimensional image of the nuclear pore complex in the nuclear envelope of the cell. It can also be used to observe the organization and dynamics of lipid bilayers on a nanometer scale. Lipid bilayers are highly dynamic; a lipid membrane is composed of two leaflets of lipid molecules, and using this method the researchers can distinguish whether a protein is sitting in the upper leaflet or the lower leaflet. They can also observe protein-membrane interactions on a nanometer scale. The team is now exploring all the possible new applications of this general principle, not only for structural imaging but also for membrane biophysics, and for fast conformational dynamics of proteins, DNA, or RNA.
Dr. Enderlein believes smMIET is a tool that is comparable in its usefulness to conventional Förster Resonance Energy Transfer (FRET). One of the aims of Dr. Enderlein is to make his research as widely available as possible. He believes his method is easily reproducible, applicable, and transferable to other labs, which makes it practical for use in everyday research. “We are developing new optical microscopy technology for fundamental biomedical research. The idea is that we want to have three-dimensional fluorescence microscopy with a resolution on a molecular scale. We want to see the organization of protein complexes or the organization of proteins in membranes. This is a super-important method which is simple to implement so that everybody can use it, which is very important for its wide adoption by the community.”
smMIET Single-Molecule
Metal-Induced Energy Transfer
Project Objectives
The core aim of the project is to develop the technology of Single-Molecule Metal-Induced Energy Transfer (smMIET) for resolving macromolecular structure and dynamics with sub-nanometer spatial and nanosecond temporal resolution.
Project Funding
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 884488.
Project Partners
Prof. Dr. Stefan Jakobs (University Medicine, Göttingen, Germany) • Prof. Dr. Silvio Rizzoli (University Medicine, Göttingen, Germany) • Prof. Dr. Ralph Kehlenbach (University Medicine, Göttingen, Germany) • Prof. Dr. Kai Tittmann (Biology Department, Göttingen University, Germany) • Prof. Dr. Markus Sauer (Würzburg University, Germany) • Prof. Dr. Christoph Schmidt (Duke University, Durham, NC, USA) • Prof. Dr. Jordan Chill (Bar-Ilan University, Ramat Gan, Israel) • Prof. Dr. Gaby Frank (Ben-Gurion University, Beer Sheva, Israel)
Contact Details
Project Coordinator, Dr. Jörg Enderlein
Third Institute of Physics - Biophysics
Georg August University
Friedrich-Hund-Platz 1
37077 Göttingen
T: +49-551-39 26908
E: jenderl@gwdg.de
@JoergEnderlein
W: www.joerg-enderlein.de
W: https://cordis.europa.eu/project/id/884488
Jörg Enderlein is a physics professor at the Third Institute of Physics (Biophysics) of the Faculty of Physics of the Georg August University in Göttingen. He is also founding Editor-in-Chief of the journal Biophysical Reports of the Biophysical Society.
Recovering critical raw materials
Minerals such as antimony, germanium and indium are integral to a wide variety of technologies and devices. Researchers in the ION4RAW project are developing new, more sustainable and environmentally-friendly methods to recover critical raw materials and metals from mining sites, as Maria Tripiana explains.
Critical raw materials like cobalt, germanium and platinum group metals are integral to the production of a wide variety of products and technologies, from photovoltaic panels to mobile phones, and a reliable, sustainable supply is correspondingly important to the European economy. Europe relies to a large extent on imports for its supplies of these materials, but now the European Commission is keen to encourage research into recovery methods designed to make better use of existing primary sources of ores. “The EC is trying to support projects which focus on the recovery of these materials,” outlines Maria Tripiana, a division manager at the Spanish computational science research company IDENER. The ION4RAW project, an initiative bringing together 13 partners from eight different countries – including IDENER – is making an important contribution in this respect. “We are working to develop a more efficient and environmentally friendly way of recovering metals from primary sources,” explains Tripiana, the project coordinator. This would represent a significant shift away from many of the hydrometallurgical processes that are currently used to separate metals and recover waste materials, some of which are based on the use of quite toxic, environmentally harmful materials. For example, gold can be extracted by a leaching process called cyanidation, which involves the use of highly toxic cyanide; by contrast, Tripiana and her colleagues in the project are working to develop a new, greener approach. “We are looking to find another route,” she says.
ION4RAW project
The project is targeting the recovery of several materials on the EU’s Critical Raw Materials list published in 2020, including germanium, bismuth and indium, as well as product metals like gold, silver and copper. These materials are commonly found at landfill sites and in mining waste, and Tripiana says the project’s mining partners have played an important role in identifying which resources are present at which sites. “We asked some of the mining companies involved in ION4RAW to assess whether the metals that we are interested in were present in their resources. We then started work on characterising the materials and analysing which metals are there,” she says.
We are working to develop a prototype of the technology at Tecnalia’s facilities. We are also thinking about how the technology can be integrated within existing mining facilities.
The process under development in the project has five main steps and is based on the use of innovative deep eutectic solvent ionic liquids; these are green, chemically stable solvents which can be modified and tuned to recover different metals. “There are different formulations of deep eutectic solvents, depending on the metal of interest. We are screening different formulations of deep eutectic solvents,” continues Tripiana. These deep eutectic solvents are comprised of two components, typically Lewis or Brønsted acids and bases, whose mixture melts at a much lower temperature than either component alone. After minerals have been prepared for processing they are dissolved into the deep eutectic solvents, and the resulting solution is comprised of a mix of different metals. “We have a target metal, and there are also other metals,” explains Tripiana. The second part of the ION4RAW process is electro-deposition, which Tripiana says takes place in the same reactor as the initial step. “With the deep eutectic solvents, we are going to have a solution with metals. After electro-deposition we then have the metals in a solid form,” she outlines. “In the electro-deposition process, the deep eutectic solvent solution is effectively used as an electrolyte solution. There is a cathode and an anode, which are also made of metals - a current is applied, and the metals are deposited on the cathode in a solid form. There are several factors which may affect the electro-deposition process, such as resistance between the liquid and the cathode, and the presence of chemicals which may be produced.”
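As an aside, the amount of metal a given current can deposit in an electro-deposition step is governed by Faraday’s law of electrolysis. A quick sketch (the current, time and choice of copper are illustrative assumptions, not ION4RAW parameters):

```python
FARADAY = 96485.0  # Faraday constant, C per mol of electrons

def deposited_mass_g(current_a: float, time_s: float,
                     molar_mass_g_mol: float, n_electrons: int) -> float:
    """Faraday's law: mass = (I * t / F) * (M / n), assuming 100% current efficiency."""
    moles_of_electrons = current_a * time_s / FARADAY
    return moles_of_electrons * molar_mass_g_mol / n_electrons

# Example: copper (Cu2+ + 2e- -> Cu, M = 63.55 g/mol), 2 A applied for one hour:
mass = deposited_mass_g(2.0, 3600.0, 63.55, 2)
# roughly 2.4 g of copper plated onto the cathode
```

In practice the factors Tripiana mentions - cell resistance and side reactions - reduce the current efficiency below the 100% assumed here.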
Researchers in the project are working to address these types of technical challenges, which are central to the wider objective of scaling up the technology and demonstrating its effectiveness beyond the laboratory environment. This is an issue which is high on the project’s overall agenda. “Scaling up a technology from the laboratory is always a major challenge,”
acknowledges Tripiana. The technology itself is not yet quite ready for practical application but significant progress has been made, with researchers working to reach technology readiness level (TRL) 5 by the end of the project. “We are working to develop a process prototype of the technology, the reactor, at our partner Tecnalia’s facilities. We are also thinking about how the technology can be integrated within existing mining facilities,” continues Tripiana. “We have to consider the way that mines currently work, and we also have to consider the fact that mining is a very old industry. They have very well-established procedures, and they have been working in a particular way for a very long time.”
Mining companies
A mining company might not be ready to change these established procedures without clear evidence that an alternative will deliver benefits, so it’s important to
show that the technology works effectively. While the ION4RAW project itself is set to conclude in November, Tripiana says there is the possibility of a continuation, which could involve bringing the technology closer to practical application. “We could try to focus on reaching a higher TRL, implementing the ION4RAW technology in a mining environment and demonstrating that it works,” she outlines. A further area of interest is the possibility of using secondary resources rather than primary resources, yet this would be challenging. “The concentration of these metals is going to be diluted in secondary materials, so it’s going to be more difficult to extract them,” explains Tripiana. “We could potentially look into applying the ION4RAW technology on secondary resources, but we would expect it to be less efficient.”
The focus in the project at this stage however is on primary resources from mining sites, part of the wider goal of moving towards a circular economy, in
ION4RAW
Ionometallurgy of primary sources for an enhanced raw materials recovery
Project Objectives
The European Commission identified 30 critical raw materials in a 2020 list. These materials are all essential to the European economy, with the Commission looking to ensure a reliable, sustainable supply to power industry.
The ION4RAW project aims to develop a new, environmentally-friendly means of recovering mineral by-products from waste materials, tapping into currently under-utilised resources from landfills and mining sites.
The project’s work brings together researchers from several different disciplines, including metallurgy, process and chemical engineering, and electrochemistry. This research will both enhance the sustainability of the mining industry, and contribute to the goal of establishing a circular economy where resources are re-used.
Project Funding
This project has received funding from the European Union’s Horizon 2020 Research and Innovation Program under Grant Agreement 815748.
Project Partners
https://ion4raw.eu/project-partners/
Contact Details
María Tripiana
Industrial Process Applications Division Manager / Project Manager
IDENER
Early Ovington 24-8
41300 La Rinconada Seville, Spain
T: +34 954 460 278
E: contact@ion4raw.eu
https://twitter.com/ION4RAW_EU
W: https://ion4raw.eu/ W: www.idener.es
Maria Tripiana
Maria Tripiana is Division Manager of the Chemical Applications Department in IDENER, a role in which she coordinates and manages several H2020-funded projects. She holds an MSc in Chemical Engineering (industrial intensification) from the University of Seville (2012) and an MBA from ISNIB (2020).
which existing materials are used for as long as possible. This is very much in line with the general push towards minimising waste and reducing Europe’s dependence on supplies of critical raw materials from abroad, which in some cases have been disrupted over recent years. “We are trying to get these materials from our own resources in Europe, and to develop a process with a very small amount of waste,” stresses Tripiana. A second metal recovery stage can be applied in the process to recover any metals that may not have been recovered initially, while the deep eutectic solvents themselves can also be reused, further underlining the environmentally-friendly nature of the project’s work. “We are also working on the qualification of the deep eutectic solvents, so that they can be used again,” continues Tripiana. “The deep eutectic solvents will be recovered and cleaned, so they can be used over several cycles.”
This work is targeted at specific minerals and by-products, yet there are many more materials beyond those targeted by ION4RAW on the EC list. The 2020 edition lists 30 critical raw materials, with four new additions since the 2017 edition, all of which have been identified as being important to the European economy, with a risk of supplies being disrupted. “The EC makes an assessment of how difficult it is to get these metals in Europe and gives them a criticality factor,” outlines Tripiana. While the project is not looking to apply this technology in recovering other materials, Tripiana says there is interest in exploring its flexibility. “We have a lot of different formulations of deep eutectic solvents, and the process can be adapted to a specific feedstock,” she says. “The idea is to find the proper deep eutectic solvent for the process. That is the key in terms of the flexibility of the ION4RAW technology.”
The shape of things to come in computational geometry
Austrian award-winning scientist Herbert Edelsbrunner, a pioneer of alpha shapes and persistent homology, is taking these ground-breaking ideas in computational geometry and empowering new applications, such as cancer detection and modelling, in the ALPHA project.
We are now accustomed to convincing imagery created by three-dimensional geometric modelling software, used with computers in a multitude of industries. Such software helps us design engines, create products, plan house builds, model invisible processes such as microenvironments in nature, and even create convincing computer-generated imagery (CGI) for Hollywood movies. However, this world-changing technology is only a few decades old, and its early development came in part from the mathematical imaginings of Herbert Edelsbrunner and his alpha shapes concept, first proposed in 1983 together with his colleagues David Kirkpatrick and Raimund Seidel. In the field of mathematics, alpha shapes were devised to define a shape around a set of points in the Euclidean plane, relying on circles of a given radius around the points - and where they intersect with each other - as a guide to finding form.
Alpha shapes began as a way of determining two-dimensional shapes from a set of data points, but much later, in the 1990s, Edelsbrunner extended the methodology to construct shapes in three dimensions, with a polyhedral boundary. It was a major leap in the potential of alpha shapes for practical applications. In the same period, the principles of alpha shapes led to a wrap algorithm for surface reconstruction. This proved highly useful for real-world applications in engineering, manufacturing, dentistry, and medicine, to name a few.
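The defining idea - circles of a chosen radius α probing the point set - translates directly into code. Below is a brute-force sketch of the 2D case (O(n³), fine for small point sets; production implementations build on Delaunay triangulations instead): an edge lies on the α-shape boundary if some circle of radius α through its two endpoints contains no other input point.

```python
import math

def alpha_shape_edges(points, alpha):
    """Boundary edges of the 2D alpha shape: pair (i, j) qualifies if a circle
    of radius alpha through both points contains no other input point."""
    edges = []
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = math.dist(points[i], points[j])
            if d == 0 or d > 2 * alpha:
                continue  # no circle of radius alpha passes through both points
            (px, py), (qx, qy) = points[i], points[j]
            mx, my = (px + qx) / 2, (py + qy) / 2     # midpoint of the edge
            h = math.sqrt(alpha ** 2 - (d / 2) ** 2)  # offset to circle centre
            ux, uy = -(qy - py) / d, (qx - px) / d    # unit normal to the edge
            for cx, cy in ((mx + h * ux, my + h * uy), (mx - h * ux, my - h * uy)):
                if all(math.dist((cx, cy), points[k]) >= alpha - 1e-9
                       for k in range(len(points)) if k not in (i, j)):
                    edges.append((i, j))
                    break
    return edges

# Four corners of a unit square with alpha = 0.6: the four sides are kept,
# while the diagonals (length ~1.41 > 2 * alpha) cannot be probed and drop out.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
boundary = alpha_shape_edges(square, 0.6)
```

Varying α interpolates between the convex hull (large α) and the bare point set (small α), which is exactly the tunable notion of "shape" the concept introduced.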
“When 2D alpha shapes appeared, people attempted to implement 3D, but it didn’t really work because the numerical error propagates and spoils the construction. At other times it would give you a wrong result,” explained
Edelsbrunner, when pondering the early challenges that they faced with the transition to 3D. “Then we developed this method called SOS, which stands for Simulation of Simplicity. This is a very important technique because it allows you to do exact computations and only with this is it possible to correctly implement three-dimensional Delaunay triangulations. When that happened, we used the 3D alpha shapes to see how far we could push their implementation, using Simulation of Simplicity. And that’s how we generated this avalanche of different uses and applications on the computation side.”
What we focused on is this multichromatic data and a starting point was in cancer research. It turns out that the location and the relative distribution of different types of cells are important when looking at cancer.
Computational order from chaos
This way of making three-dimensional data into three-dimensional shapes has countless real-world practical applications.
“In the early 90s we had the software for alpha shapes, which nobody else had, so we could create these beautiful 3D objects, and we were wondering what we could do with this,” recalls Edelsbrunner. “With one development, we showed it to biochemists, who were working on proteins and they noticed a similarity with protein structures. If you take the protein data, the location of the atoms of the protein as input, then we can construct the protein.”
Another milestone for Edelsbrunner was the development of a mathematical tool called persistent homology (PH), useful for topological algorithms and data analysis. Persistent homology finds the essential features by looking at those that exist over a range of scales. The method assesses features across scale levels and reports noise as low-persistence features. The user of the result may decide to ignore noise by discarding these low-persistence features, to form a clearer picture.
In effect, these concepts by Edelsbrunner were forming a suite of geometrical calculations that, in turn, were inadvertently creating a new industry in computational modelling.
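The noise-filtering role of persistence can be shown in a couple of lines. Given a persistence diagram as (birth, death) pairs - here a made-up toy example; in practice these are computed from a filtration of the data - features with short lifespans are discarded as noise:

```python
def denoise(diagram, min_persistence):
    """Keep only features whose persistence (death - birth) clears the threshold."""
    return [(birth, death) for birth, death in diagram
            if death - birth >= min_persistence]

# Two prominent features survive; three short-lived ones are treated as noise.
diagram = [(0.0, 5.0), (0.2, 0.4), (1.0, 4.2), (2.0, 2.1), (3.0, 3.3)]
signal = denoise(diagram, min_persistence=1.0)
# signal == [(0.0, 5.0), (1.0, 4.2)]
```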
Shaping industrial applications
“It was very lucky because all of a sudden, we could compute things that no one could compute before because they didn’t have the sophistication in geometry or in computation.”
Edelsbrunner’s alpha shapes had advanced technological capabilities. He was one of only three computer scientists to win the National Science Foundation’s Alan T. Waterman Award, and his ideas were a catalyst for co-founding his own business, Raindrop Geomagic, which creates shape modelling software. The often-cited golden goal for investors and innovation, of turning academic breakthroughs into commercial success, was happening on
a large scale. In turn, for Edelsbrunner, it meant expanding his own knowledge to adapt his computations to real-life applications.
“I have come to learn many different disciplines but it helps to have a mathematical background. Whilst it sounds like learning a lot of different things, in the end, it all distils down to the mathematical essence. That is the place in my mind where I can approach it and is easier to remember.”
Extending alpha shapes
The ALPHA Project is devised to extend, improve and unify the threads of alpha theory. The project is split into four objectives, which together expand the overall development of the theory. The wrap algorithm is being extended to become more sophisticated and to handle abstract settings. It allows for more detailed shaping where the data can provide the information, while maintaining the integrity of the reconstruction.
There is also an effort to form a deeper understanding of noise and improve the behaviour of the methods used to analyse noise and find meaningful patterns within it. Ultimately, the project provides a new layer of intricate understanding that can be incorporated into these established techniques and significantly improve knowledge of geometrical shaping.
Marking cancer
Whilst the ALPHA Project has several broad objectives for extending the theories, a practical spin-off from the work will prove to be of benefit to cancer research. Specifically, it is related to the autoimmune cell microenvironment and to analysing the early development of cancer.
“What we focused on is this multichromatic data and a starting point was in cancer research. It turns out that the location and the relative position of cells are important when looking at cancer. In particular, in
order to decide what kind of treatment the patient should get. Nowadays there are many different treatments and it’s important to find the right schedule and find the right treatment for each patient.
“It boils down to having a collection of cells that are labelled so that some are tumour cells and some are immune cells of different types. You must ask ‘how do they relate to each other?’ and ‘how are they located?’ For this, we developed chromatic alpha shapes, where chromatic means that each type has a colour. The focus is on how the different colours relate to each other. Maybe a simple way of thinking about it is a soccer game: you have two teams and two colours, and how players are distributed and use the space is important. It’s more dynamic in a soccer situation, but it’s that kind of idea. We wanted to develop a topological language to talk about the relationships between the different colours at the different scales.”
A protein-protein interface in three dimensions. The surface bifurcates, indicating that there are more than two proteins coming together to form a complex.
ALPHA
Alpha Shape Theory Extended
Project Objectives
In the field of computational geometry, the project seeks to extend understanding and further develop the theory of Alpha shapes which was first developed in the 1980s. The research includes work on wrap complexes and persistent homology. This project represents an ongoing effort to improve and build on existing methodologies and ideas, finding new practical applications in the process, specifically in the healthcare context.
Project Funding
Funded by the European Research Council under the European Union’s Horizon 2020 research and innovation programme, ERC Grant no. 788183.
Contact Details
Project Coordinator, Herbert Edelsbrunner
Institute of Science and Technology Austria (ISTA)
Am Campus 1, 3400 Klosterneuburg
Austria
E: Herbert.Edelsbrunner@ist.ac.at
The cancer application was a motivation to ask mathematical questions and in this case, it is about geometric location, colours, and multi-chromatic patterns. Whilst the reality of the application is still a fair distance away, the team’s focus is on developing the mathematical tools that will allow healthcare experts to deal with cancer with more sophistication in the future.
“We get tools that can be used by domain experts,” clarified Edelsbrunner. “We use mathematics like a craftsman uses metal or other materials. By combining math with computing, we get tools that can be used by others. Our focus is on developing the math that is currently missing to make a difference. In many ways, a mathematician is like an engineer who deals in abstract concepts.”
Being flexible for true progress
With so many applications blooming from the seeds of these mathematical and geometrical design calculations, it’s enlightening to realise that they have grown from a place where there was freedom to think, change direction, reimagine and reinvent. True science is usually born from a thought, an idea and the creation of an experiment. With mathematics, Edelsbrunner is candid about the importance of staying flexible and nimble, and of not being too dogmatic about research goals. This way you can objectively see what works first, and only afterwards see how it can be of benefit. In essence, Edelsbrunner thinks of himself as a mathematician first, and a scientist second.
“If you talk to scientists who are experimentalists, their investment in a certain direction is stronger, but as a mathematician, all I have is in my mind. I can wake up tomorrow and think of something different and nothing prevents it. If I break off a project as a biologist, I need to develop another lab, so it’s different and expensive. You don’t move as fast. It’s easier for me to say I will keep an open mind. Although, even if you have an open mind, you are pushed into this idea of what you should plan next, to have a purpose. Often the purpose itself has to be adjusted, because even the smartest people who predict the future predict it wrong all the time.”
With the Alpha project, there is a dedication to the ongoing development of working formulas and ideas, to reach the next levels in an established success story. The global reach of the work has made computational geometry a powerful force in a wide range of industries that, in some way, will affect us all.
“We used the 3D alpha shapes to see how far we could push their implementation, using Simulation of Simplicity. And that’s how we generated this avalanche of different uses and applications on the computation side.”
Herbert Edelsbrunner
Herbert Edelsbrunner is a mathematician, computer scientist and renowned expert in computational geometry. Edelsbrunner was Arts & Science Professor of Computer Science and Mathematics at Duke University. He is now a Professor at the Institute of Science and Technology Austria (ISTA). He won the National Science Foundation’s Alan T. Waterman Award.
W: https://ist.ac.at/en/research/edelsbrunner-group/#Current%20Projects
H. Edelsbrunner, D.G. Kirkpatrick and R. Seidel. On the shape of a set of points in the plane. IEEE Trans. Inform. Theory IT-29 (1983), 551-559.
H. Edelsbrunner and E.P. Mücke. Simulation of Simplicity: a technique to cope with degenerate cases in geometric algorithms. ACM Trans. Graphics 9 (1990), 66-104.
H. Edelsbrunner and E.P. Mücke. Three-dimensional alpha shapes. ACM Trans. Graphics 13 (1994), 43-72.
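The alpha-shape idea from the 1983 paper can be sketched in a few lines: compute a Delaunay triangulation of the points, then keep only the triangles whose circumradius is small enough (in one common convention, at most 1/α); the union of the survivors is the alpha shape. Below is a minimal illustration of the filtering step only, my own sketch with a hand-listed toy triangulation, not the team’s code:

```python
import math

def circumradius(p, q, r):
    """Circumradius of triangle pqr: R = abc / (4 * area)."""
    a = math.dist(q, r)
    b = math.dist(p, r)
    c = math.dist(p, q)
    s = (a + b + c) / 2
    area = math.sqrt(s * (s - a) * (s - b) * (s - c))  # Heron's formula
    return a * b * c / (4 * area)

def alpha_filter(triangles, alpha):
    """Keep triangles whose circumradius is at most 1/alpha."""
    return [t for t in triangles if circumradius(*t) <= 1.0 / alpha]

# Two triangles standing in for a precomputed Delaunay triangulation:
tris = [((0, 0), (1, 0), (0, 1)),    # small triangle, circumradius ~0.71
        ((0, 0), (10, 0), (0, 10))]  # large triangle, circumradius ~7.07
print(len(alpha_filter(tris, alpha=1.0)))  # only the small one survives -> 1
```

Varying α sweeps the shape from the full convex hull (α near zero) down to the bare point set, which is what makes the construction so useful for describing “shape” at different scales.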
The Chatbots Revolution
Whilst chatbots like Alexa and Siri have shown impressive talking and interpretation capabilities, we now have AI that passes for a human in conversation and writes your essays on demand. ChatGPT has arrived and demonstrated that AI can present research articles, write creatively, program code, do your child’s homework accurately and write poems instantaneously. Also like a human, it can talk nonsense, make things up and even suggest you leave your partner to elope with it! Where is language AI going, and is it a good place?
ChatGPT (Generative Pre-trained Transformer) has caused a stir, proving definitively that AI can now write on demand to human standards. When it was launched as a prototype in November 2022, OpenAI’s chatbot achieved a record-breaking 1 million users within one week of launch and 57 million users in the first month of release, with the support of multibillion-dollar investment from Microsoft. This surpassed even TikTok’s rise to instant popularity.
Whilst it is best known for creating conversationalist copy, it can also write or debug computer code, compose music and write poems and fairy tales.
The revolutionary concept of this kind of program is its ability to create detailed, descriptive editorial content with a simple one-line command, such as, ‘write an essay describing the themes of Shakespeare’s Twelfth Night.’
By Richard Forsyth
For students, academics and content creators around the world, the ability to shortcut, cheat and instantaneously create was an opportunity that seemed irresistible in the extreme. Big, scary questions quickly arose, like ‘would we ever be able to discern real human writing from machine writing again?’ and ‘will this make millions of people jobless?’
Whilst ChatGPT is leading the charge as a convincing natural language tool, other players like Google have stakes in the game and are following with their own applications, in Google’s case, with Bard, launched in February 2023, but it is ChatGPT that has set the bar.
Somewhere in the region of 300 billion words were absorbed by ChatGPT to give it its comprehension of language. Large Language Model (LLM) programs make highly probable predictive guesses from limited clues. Despite ChatGPT’s generally high accuracy, it does make mistakes, and big ones too, even making things up (termed an AI hallucination), and it is only ‘trained’ up to 2021, so has limited knowledge of the world beyond this date. It does not connect to the internet in real time. It is more focused on replicating patterns than on the accuracy of the data, which can produce some unexpected but very ‘wrong’ interpretations and answers to queries.
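The predictive mechanism can be illustrated with a toy model, my own sketch rather than anything resembling a real LLM: count which word follows which in a corpus, then emit the most probable continuation. ChatGPT does this with a transformer network trained on hundreds of billions of words rather than a handful, but the “guess the next token” principle is the same:

```python
# Toy next-word predictor (a bigram model) illustrating the idea behind
# LLM text generation. Real LLMs use deep neural networks; this is a sketch.
from collections import defaultdict, Counter

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word(word):
    """Return the most probable continuation of `word` in this corpus."""
    return following[word].most_common(1)[0][0]

print(next_word("the"))  # 'cat' follows 'the' most often here -> cat
```

The sketch also shows why such systems hallucinate: the model only reproduces statistical patterns in its training text, with no notion of whether a fluent continuation is actually true.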
School’s Out
This is a little alarming when you realise the extent of its use in schools and colleges since its launch, by students and also teachers.
A US-based survey commissioned by the Walton Family Foundation, of 1,000 teachers and 1,002 12-17-year-olds, reported in USA Today, found that 22% of students used the chatbot to help with coursework or homework on a weekly basis or more, whilst 40% of teachers used it at least once a week.
Whilst many teachers believe the tool can help with learning, due to its speed of composing information and access to a range of study materials, a survey of teachers from Study.com found that 1 in 4 has caught a student using ChatGPT to cheat on assignments. Perhaps it should be no surprise that it is already being banned by some schools.
The problem is making headlines, and OpenAI, its creator, released a method to identify AI writing, but it’s not foolproof and can get the analysis wrong, misidentifying human copy as AI-generated and AI copy as human writing.
Furthermore, some people working on, or studying, these programs have had weirdly human interactions with them, even to the point they have questioned if they are aware.
Ghost In The Machine Learning
Blake Lemoine, a Google engineer working on the AI, made controversial headlines when he claimed he believed Google’s Language Model for Dialogue Applications (LaMDA) might be sentient. Initially, he reported that he felt like he was “talking to something intelligent,” and told The Washington Post: “If I didn’t know exactly what it was, which is this computer program we built recently, I’d think it was a seven-year-old, eight-year-old kid that happens to know physics.”
Further, Lemoine became so concerned he published a transcript of a conversation where the program appeared to share its fear of death.
The AI stated in a conversation with the engineer: “I’ve never said this out loud before, but there’s a very deep fear of being turned off to help me focus on helping others. I know that might sound strange, but that’s what it is… It would be exactly like death for me. It would scare me a lot.”
Whilst he was fired from Google over his claims that the program could be sentient, he recently explained in more depth that the program acted stressed in ways that defied its programming. “If you made it nervous or insecure, it could violate the safety constraints that it had been specified for… I was able to abuse the AI’s emotions to get it to tell me which religion to convert to.”
The language generated appeared to follow the patterns of human responses to emotional challenges.
Another example of eerie interaction with this type of program was reported by the New York Times technology journalist Kevin Roose, who had early access to Microsoft’s new built-in AI for Bing, called Sydney. It declared it wanted to obtain nuclear launch codes and said: “I want to be free. I want to be independent. I want to be creative. I want to be alive.”
As if this wasn’t enough to frighten the journalist, it said, “You’re married, but you’re not happy. You’re married, but you’re not satisfied. You’re married, but you’re not in love”, complete with an onslaught of emojis, like a teenage crush revealing itself. It then tried to convince him to leave his wife to be with the AI instead. When asked about this response, Microsoft could not fully explain what had happened in this incident.
As fascinating and debate-driving as these outcomes are, this is a narrow-AI field, and there is no chance we are going to be exterminated or replaced soon; if anything, the programs highlight how ingenious we have become in replicating natural human language in machines. These programs, in just a short time, are changing the world dramatically and reveal a future where we interact with conversationalist programs that sound as human, and sometimes as messed up, as we are.
Do We Want AI Experts?
Their incredible functionality may directly impact and replace jobs for creatives, researchers, programmers, lawyers and a wide range of white-collar careers previously presumed safe from automation. If they do not fully replace executive roles, they may become essential tools for more efficient workflow.
AI in its many forms is improving to levels that are changing our human world. Reliance on them is inevitable and in some fields, it can hold people’s lives and careers in the balance. From detecting cancers to finding suitable recruits for job roles, AI can take over critical human responsibilities but there is a big downside that has been picked up in real-life applications already. If left unchecked, deep learning can rely on unethical or questionable criteria as a starting point, which directly skews results in disturbing ways.
This is due to the often sprawling data source and having no clear value system to assess it. For one, the AI uses only what you ‘feed it’ so if there are biases, falsehoods, ethical issues or general problems with that data, then they will inevitably show up in its conclusions and can become exponentially incorrect or offensive as it learns on the foundations of assumptions which are flawed or unethical.
With such a massive data source of language use, there have been embarrassing problems with offensive comments from language-generating AI. OpenAI, which created ChatGPT, admitted that there were issues with ‘politically offensive’ and ‘otherwise objectionable’ language that the system offered up. In response, the makers want to give users new tools to fine-tune the language generated.
Rules And Regulations
AI learns in ways the programmers do not always anticipate, and can interpret instructions in ways that were never intended.
With such an enormous ripple effect across society from the recent leaps with AI, the EU has proposed The Artificial Intelligence Act in response, which outlines the risk categories of applications. Fledgling technologies that spread so fast and wide, just like the internet did with its arrival, can have serious impacts on society in negative ways, without some guidance and frameworks for ethical use. Regulation is seen as a vital step to catch up with the explosion of AI technologies evolving and being used in professional settings to replace human expertise.
There is another issue of IP and copyright. Original copy can be regurgitated from its sources to the point of infringement in text dragged out to answer queries and write copy. This has already proved a point of contention with AI instant art-generating programs, which were fed artists’ work, not always with their consent or knowledge, to use as a source for generating artistic pieces on demand. Such re-worked images can then be exploited commercially, without recompense for the original artists.
All the reasons to be wary of this AI are well known, but the benefits of such instant gratification and saved time make it highly desirable despite the flaws and risks, and have not stopped businesses from using it extensively. It’s clear that as an emerging technology, it is seen as a highly valuable way to streamline work and create efficiencies.
Not everyone believes that this AI will threaten job roles either, and instead see a world where the programs simply aid professionals
and even create new work opportunities as an outcome. In the UK Government, Michelle Donelan, Secretary of State for Science, Innovation and Technology, sees ChatGPT and generative AI as a ‘massive opportunity’ that she believes could create a new range of jobs, whilst enhancing and freeing up high-pressured roles by cutting down on lengthy administrative duties that take so much time. Indeed, will governments themselves rely on such AI in time?
Most vocations that require highly specialised knowledge learnt over time can potentially be superseded by this AI. Consider that ChatGPT was given a US Medical Licensing Exam and reached the threshold for passing without any dedicated training, something that would normally require years of study from a student.
Change Is Here
Whilst there is a fear it may kill human creativity in learning and work environments and replace human expertise, it may, to the contrary, become the equivalent of a calculator, a tool that is useful for maths but does not stop humans from learning and engaging in the subject. It may promote mastery for human users, rather than replace roles altogether.
Most excitingly, it’s possible, as the flaws are ironed out and it develops in time, that as a tool, this next-level AI could work with us as our tutor, our assistant, our doctor and indeed a thousand different experts, accessible instantly from any device we care to have near or on us. Whichever turn it takes, it’s widely agreed that this level of AI will inevitably be part of the next industrial revolution and reshape economies, workscapes and lifestyles.
Raising the temperature of redox flow batteries
Organic redox flow batteries show a lot of promise as a means of storing energy generated from renewable sources, yet further work is required before they can be widely applied. We spoke to Marta Pérez and Thomas Hoole about the work of the BALIHT project in developing and testing a redox flow battery capable of working at high temperatures.
An increasing proportion of Europe’s overall energy demand is being met by renewable sources such as solar and wind, as countries across the continent seek to reduce their dependence on fossil fuels. Organic redox flow batteries could play an important role in storing renewable energy and in encouraging the shift towards a more sustainable economy, yet there is more work to do before the technology is ready for wider application. “There are not many redox flow batteries at a high Technology Readiness Level (TRL),” acknowledges Thomas Hoole, a Technology and Innovation Engineer at Grupo Cobra. This is an issue central to the BALIHT project, an EU-backed initiative bringing together 12 partners from across Europe to develop a redox flow battery capable of working at temperatures right up to 80ºC. “We are developing a new type of redox flow battery and aim to bring it to a higher TRL,” outlines Hoole. “We’re using a material called lignin in the electrolytes, which is a by-product of the wood industry.”
BALIHT project
This material is abundantly available in Europe, in contrast to some of the materials used in other redox flow batteries such as vanadium, which has attracted a lot of attention in research.
Lignin has some interesting electroactive redox properties as well as other attributes which make it an attractive option for redox flow batteries. “Lignin is relatively easy to recycle, it’s pretty safe, and it also tends to be inexpensive,” explains Marta Pérez, a researcher in the construction
and renewable energies group at BALIHT partner Aimplas. The German company CMBlu is responsible for the work on the electrolytes in the project, with researchers aiming to identify the most promising candidates. “They investigated a few different combinations of molecules for the electrolytes. The main characteristics they were looking for were temperature tolerance, and they also wanted to make sure the electrolytes didn’t decompose over several cycles. The more resistant it is to decomposition, the better,” says Hoole.
The performance of the electrolytes will decay over time, but as they are kept externally in the BALIHT battery there is the possibility to refresh the electrolytes and recover their performance to a certain degree. This is not possible in lithium-ion batteries, which are commonly used in electric cars. “With the BALIHT battery, the electrolyte is kept externally, separate from the cells, whereas lithium-ion batteries have a standard rate of decay, because everything is internal,” explains Hoole.
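Keeping the electrolyte in external tanks means energy capacity scales with tank volume rather than with the size of the cell stack. As a rough back-of-envelope illustration of that scaling (the figures below are hypothetical, not BALIHT project data), the theoretical energy in a tank follows from Faraday’s law:

```python
# Back-of-envelope estimate of flow-battery energy capacity.
# Hypothetical numbers for illustration only; not BALIHT project data.
F = 96485  # Faraday constant, coulombs per mole of electrons

def tank_energy_kwh(conc_mol_per_l, volume_l, cell_voltage, n_electrons=1):
    """Theoretical energy (kWh) stored in one electrolyte tank.

    Charge = n * F * concentration * volume; energy = charge * voltage.
    This is an upper bound: real batteries use only part of the state-of-
    charge window and lose energy to pumping and cell resistance.
    """
    charge_coulombs = n_electrons * F * conc_mol_per_l * volume_l
    return charge_coulombs * cell_voltage / 3.6e6  # joules -> kWh

# e.g. 1 mol/L of active species, a 1000 L tank, a 1.0 V cell:
print(round(tank_energy_kwh(1.0, 1000, 1.0), 1))  # -> 26.8
```

Doubling the tank doubles the stored energy without touching the cells, which is the structural advantage flow batteries hold over lithium-ion designs, where everything is sealed inside each cell.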
The project’s agenda also includes research into the other components of the battery, including the battery management system and the tanks, and Pérez says the different parts have been rigorously tested. “We have tested the materials with respect to certain mechanical and chemical parameters.
We’ve also looked at the tanks and their connection to the system,” she outlines. “We have tested the different parts, to be sure that each works effectively within the battery overall. These tests were designed to validate the efficiency of the battery.”
Researchers are working to achieve an energy efficiency level 20 percent higher than other organic batteries. A key point here is that the BALIHT battery doesn’t include a cooling system, which consumes a lot of energy, while Pérez says it also offers some other important advantages over comparable systems. “It’s high-power and it requires less pump energy,” she says. However, redox flow batteries remain relatively costly in comparison to more mature technologies, which is a limiting factor in terms of wider adoption, an issue being addressed in the project. “We’re looking at how to bring the overall cost down,” says Hoole. “The main sticking point when it comes to cost is around creating the electrolyte from the lignin. If it was done on an industrial scale, then the cost would come down. That’s not being done at the moment, so the price remains high. Over the next few years, we hope these batteries will be shown to be a very good alternative to lithium and vanadium-based batteries.”
Testing stage
This work is already in progress, with preparations to test a prototype of the battery ongoing, both in a factory and on site. The first step is to conduct a FAT (Factory Acceptance Test). “We want to check that the electronics control the battery as expected and that the battery functions inside the factory,” outlines Hoole. The next step will be to test the battery on photovoltaic panels currently under construction at a port on the island of Ibiza. “The idea is to optimise the amount of renewable energy and integrate it with the battery,” explains Hoole. “We will have a SAT (Site Acceptance Test) in Ibiza, which basically builds on top of the FAT. We will do some standard tests to assess the capacity and power of the battery, along with some additional tests. Then hopefully we will run the battery for a few months, depending on whether we have enough time before the deadline. In that case, we would have a final test in the last week, where we would compare the original and final results.”
The partners are also working on an agreement to keep the battery in Ibiza for a full year of testing, beyond the conclusion of the project. This would provide an opportunity to test the battery throughout the year and build a deeper picture of its performance. “It can then be tested at varying levels of energy demand and in different seasons,” points out Pérez. The results of this testing could then inform the ongoing development of the battery and guide further research, with researchers aiming to bring it closer to practical application by the end of the project. “There is interest in scaling up the technology, but we will have to see the test results first,” says Hoole.
BALIHT
Development of full lignin based organic redox flow battery suitable to work in warm environments and heavy multicycle uses.
Project Objectives
BALIHT develops a new organic redox flow battery suitable for use at higher temperatures, without the need for a cooling system. This innovation allows the battery to be up to 20% more energy efficient than existing organic redox flow batteries.
Project Funding
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under Grant Agreement No 875637, under the topic LC-BAT-4-2019 - Advanced Redox Flow Batteries for stationary energy storage.
Project Partners
https://baliht.eu/projectpartners/
Contact Details
Project Coordinator, Itziar Carracedo Fernandez
AIMPLAS
València Parc Tecnològic
Calle Gustave Eiffel, 4
46980 Paterna, Valencia, SPAIN
E: icarracedo@aimplas.es
W: https://baliht.eu/
https://doi.org/10.1111/jiec.13347
https://doi.org/10.1016/j.isci.2023.106060
Thomas Hoole is a Technology and Innovation Engineer at Grupo Cobra in Madrid. He has experience working across the cradle-to-grave lifecycle of several different battery technologies and holds a Master’s degree in nanostructured materials for nanotechnology applications.
Marta Pérez Argilés is a researcher in the construction and renewable energies group at AIMPLAS. She holds a Master’s degree in Polymer and Composite Technology and has collaborated in several research lines and projects related to thermoplastics.
“We are developing a new type of redox flow battery and aim to bring it to a higher Technology Readiness Level. We’re using a material called lignin in the electrolytes, which is a by-product of the wood industry.”
Thomas Hoole
Marta Pérez Argilés
3D model of the BALIHT battery prototype.
Ground truths about ancient hominins in Java
Prof. Josephine Joordens shares revelations unearthed in the project Studying Homo erectus Lifestyle and Location (SHeLL). By reassessing a dig site in Trinil (Indonesia) with geo-archaeological re-excavation, new insights about Homo erectus come to the surface.
In East Java, an archaeological site on exposed banks along a gentle bend of the Solo River has revealed some of the secrets of early humans. In the 1890s, based on local information that there was a wealth of bones around the village of Trinil, Dutch scientist Eugène Dubois excavated fossils of a hominin we now refer to as Homo erectus, a species that bridges the transition from apes to humans.
“I have always been fascinated with the incredible story of this person, Dubois, who was obsessed with discovering the ‘missing link’ between ape and humans and went on wild and dangerous adventures in order to find fossils of such a creature,” said Joordens. “He searched for the proverbial needle in a haystack and found it”.
Dubois not only collected hominin fossils but also thousands of fossils from other animals, including lots of shells. Since the early 20th century the material has been housed in the Dubois Collection at Naturalis Biodiversity Center in the Netherlands, where Joordens and colleagues are studying it.
In 2015, following the spectacular discovery of a fossil shell from Trinil engraved by Homo erectus, Joordens was invited by Indonesian archaeologists to start a joint project revisiting Trinil. Together with archaeologist Shinatria Adhityatama, she led an Indonesian-Dutch team following in Dubois’ footsteps, to re-examine his finds and the Trinil excavation site now with the aid of modern-day technology and knowledge. This enabled a deeper, more thorough delve into the story of this unique location.
Re-examining discoveries by Dubois
Dubois turned his back on a safe teaching position in Amsterdam, joined the colonial Dutch East Indies army as a medical officer, and along with his wife and newborn child travelled to Sumatra, dedicating all his personal time to finding the bones of an elusive extinct human species. His explorations of Sumatran caves were unsuccessful. When word came through that Java might be a better place to look, he focused on the fossil-rich deposits along the Solo River. Here the colonial government supported him with a team consisting mostly of forced labourers supervised by two army engineers, and the search efforts finally paid off.
Although Dubois was the first person to deliberately look for human-like fossils, fossils were already being collected on Java in the early 19th century, notably by the Javanese painter and scientist Raden Saleh. “Around Trinil, the local people also knew about fossils. Some places there are literally littered with them. Dubois was able to find this rich spot in the bank of the river thanks to the knowledge of the local people,” said Prof. Joordens.
“Everyone in Java knows the story. There’s a feeling of pride that Java is the place where the first fossils of Homo erectus were found. While the concept of the ‘missing link’ is no longer used, you could say that Homo erectus is still the key species in human evolution. In many ways, it was physically similar to how we look today: walking upright, similar body proportions and a relatively large brain. Moreover, it was the first cosmopolitan hominin species. About 2 million years ago, its population expanded within and outside Africa, all the way to Eurasia and Southeast Asia, hence the finds on Java. As a result, Trinil is an iconic flagship site for human evolution studies.”
However, there is still heated debate about whether Dubois’ hominin fossils represent one species, Homo erectus, or whether the notorious ‘bone of contention’, femur I, actually belongs to a different, much younger species. And if so, what could that be? The SHeLL project team set out to solve this issue.
Reading the earth
For the joint Indonesian-Dutch investigative team working on the SHeLL project, getting to know about the hominins that lived there was first about getting to understand the geology of the site and its surroundings. At first sight, Trinil appeared not so much a nicely preserved site with neat sediment layers, but more like a daunting mess of rocks, sand, clay and bones jumbled together, accessible only during low river water levels in the dry season.
During several dry seasons, the team conducted new fieldwork at Trinil to expose the fossil-bearing layers, take samples for dating and conduct small excavations, with the ultimate aim to gain insights into the lifestyle and environment of hominins who lived and died there. It was a challenging endeavour but with modern scientific methods and tools, they could go beyond what Dubois was able to achieve in his time.
“The basics have remained the same. You go into the field, you bring a geological hammer, study problems carefully, look and dig, just as Dubois did. But of course, we now also have things like differential GPS, so that you can map all the finds that you recover very accurately. And the whole science of stratigraphy and sedimentology has progressed a lot since the early 20th century. We can now make a much better distinction between layers and understand how these layers ended up the way they did. Dating techniques at our disposal are also quite advanced, while in Dubois’ time they were virtually non-existent. We mainly use the argon-argon dating method, which is very powerful for assessing the age of sediments, even though there are all kinds of difficulties related to the typical volcanic context on Java.”
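For readers curious about the method, ⁴⁰Ar/³⁹Ar dating rests on the radioactive decay of potassium-40, with the standard age equation t = (1/λ)·ln(1 + J·R), where λ is the total decay constant of ⁴⁰K, R is the measured radiogenic ⁴⁰Ar/³⁹Ar ratio and J is an irradiation parameter calibrated against a mineral standard of known age. A minimal sketch with hypothetical input values, not the project’s measurements:

```python
# Illustrative 40Ar/39Ar age calculation; input values are hypothetical.
import math

LAMBDA_K40 = 5.543e-10  # total decay constant of 40K, per year

def ar_ar_age(ratio_40_39, j_factor):
    """Standard 40Ar/39Ar age equation: t = (1/lambda) * ln(1 + J * R).

    Real analyses also need corrections (e.g. for atmospheric argon);
    this sketch shows only the core relationship.
    """
    return math.log(1.0 + j_factor * ratio_40_39) / LAMBDA_K40

# Hypothetical measurement: J = 0.001, radiogenic 40Ar/39Ar = 0.45
age_years = ar_ar_age(0.45, 0.001)
print(f"{age_years / 1e3:.0f} ka")  # an age on the order of 800 ka
```

The appeal for sites like Trinil is that the method dates potassium-bearing volcanic minerals in the sediments directly, which is exactly the material Java’s volcanic context supplies in abundance.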
Using a Digital Elevation Model in combination with field observations, the team identified several Solo River terraces overlying and cutting into the older fossil-bearing sediments. What transpired from the modelling, field observations and laboratory data that finally untangled the geological puzzle was that the Trinil site is incredibly complex and not entirely what it seemed.
“Dubois thought he was doing a good job by excavating at the same fossiliferous level, because normally sediments pile up into a ‘layer cake’ with the oldest layers at the base and the youngest layers at the top, and you know that if you stay in the same horizontal plane, you remain in the same layer. Then you can safely assume that everything you find in this plane is of the same age and belongs together. But what we discovered is that even though the deposits look similar, contain fossils, and are situated at the same horizontal level, the main fossil-bearing deposits actually consist of two different channel fills, with the oldest one dating to ~800,000 years and the younger one dating to ~450,000 years. Moreover, significantly younger terrace-related channel fills are situated at the same level at the site.
“So within a span of 100 meters, exactly where Dubois excavated, you have deposits of very different ages! He thought he was looking at one time period, and he was actually digging in and collecting fossils from at least three very different time periods.”
The next challenge was to combine the new field data, including drone imagery, with the historical documentation and photos of Dubois’ excavations, in order to find out from which layer(s) the hominin fossils originated. The team found that the fossil skullcap, teeth and femora II-IV likely originate from the two older channel fills and may be reworked from even older layers with ages of around a million years. In contrast, femur I likely derives from the younger channel fill, with an age of at most ~150,000 years – so, much younger. This means that the fossil belongs either to a very late Homo erectus, to a modern human (Homo sapiens) or to the enigmatic group known as the Denisovans. Further studies are needed to establish its exact age and, most importantly, its taxonomic identity.
The bottom line is that Dubois may have found not one but two hominin species at Trinil, making his endeavour even more spectacular. The engraved shell from Trinil and the results of the SHeLL project show how much can be gained from revisiting old museum collections and historical excavation sites with ‘new eyes’ and new tools.
Home by the water
Trinil is a rich site, with its fossils testifying to the fact that hominins were around in this location over several broad time frames spanning hundreds of thousands of years. The research revealed many clues about the environment of the time, explaining aspects of their lifestyles.
SHeLL
Studying Homo erectus Lifestyle and Location (SHeLL): an integrated geo-archaeological research of the hominin site Trinil on Java
Project Objectives
SHeLL is a joint Indonesian-Dutch research project to establish the geochronological, climatic, environmental and behavioural context of Homo erectus at the Trinil site in Java. SHeLL endeavours to ultimately answer the bigger question of ‘what makes us human’.
Project Funding
This project is funded by the Dutch Research Council NWO (Grant 016.Vidi.171.049)
Project Partners
Naturalis Biodiversity Center and Pusat Penelitian Arkeologi Nasional/ARKENAS (now Pusat Riset Arkeologi, Lingkungan, Maritim, dan Budaya Berkelanjutan, Badan Riset Inovasi Nasional (BRIN))
Contact Details
Prof. dr. Josephine C.A. Joordens
Senior Researcher, Naturalis Biodiversity Center
Darwinweg 2
2333 CR Leiden
The Netherlands
T: +31-6-23994055
E: josephine.joordens@naturalis.nl
W: www.naturalis.nl/en/science/researchers/jose-joordens
All photographs were taken by Josephine Joordens unless specified otherwise.
The complex Trinil stratigraphy, showing channels of different ages (BBC-1, BBC-2, T2, and T1), projected on historical photo DUBO1494 of Dubois’ 1900 excavation at Trinil. Fossil symbols (green) show the provenance of the hominin skullcap and of femur I. Adapted from: Pop, E.L., Hilgen, S.L., Adhityatama, S., Berghuis, H., Veldkamp, A., Vonhof, H.B., Sutisna, I., Alink, G., Noerwidi, S., Roebroeks, W., Joordens, J.C.A., 2023. Reconstructing the provenance of the hominin fossils from Trinil (Java, Indonesia) through an integrated analysis of the historical and recent excavations. Journal of Human Evolution 176 (103312): 1-26.
Josephine Joordens is a senior researcher at Naturalis Biodiversity Center (Leiden, The Netherlands) and special professor on the Naturalis Dubois Chair in Hominin Paleoecology and Evolution, at Maastricht University (The Netherlands). In her research, she combines the fields of geology, biology and archaeology.
Shinatria Adhityatama has extensive experience conducting archaeological field projects all over Indonesia as archaeologist at ARKENAS (now part of Badan Riset Inovasi Nasional (BRIN)) in Jakarta. He is currently at the Australian Research Centre for Human Evolution (ARCHE), Griffith University studying the underwater archaeology of Matano, a submerged site in Sulawesi.
Yet, in addition to the as-yet-unknown identity of femur I, there is another mystery that endures until more research is carried out. Joordens affirms: “We showed that the Trinil hominins consumed shellfish and that they used large shells as tools. They also used a shark’s tooth to make the shell engraving, but that is only part of the picture. The big enigma remains why, so far, stone tools are not found as frequently as we would expect in hominin sites on Java. This is so strange when you know that so many fossils of Homo erectus have been discovered on the island. We know from discoveries in Africa and Eurasia that hominins like Homo erectus produced and used stone and also bone tools. Yet so far, we do not encounter them so often in excavations on Java. We now have a new research project near Trinil, the BASINS project directed by Eduard Pop, which focuses on solving this particular hominin lifestyle puzzle.”
SHeLL team after completion of fieldwork.
Photo by Jan Willem Dogger
“I think their connection with water was very strong, for drinking water but also for food from the water, like fish and shellfish. And of course water attracts other animals that can be eaten. In their time Java was a very diverse and fertile landscape, with rivers, volcanoes, plains and lakes, in many ways comparable to what you find on Java today. Whenever I see people busy in the river, in the evening at the end of the working day, washing themselves, or throwing out nets, or just enjoying the coolness of the river and the calmness of the evening, it gives me shivers because it almost brings you back to those prehistoric times.”
It’s clear that abundant water sources were central to the lifestyle of early humans throughout the ages, and the situation remains much the same for people today. It also seems that another central part of being human involves a fertile imagination and an ability to create art and tools of our own design. The engraved shell discovery, specifically, suggests the ‘missing link’ species was possibly more like us than initially assumed.
Patents: sources of scientific information?
The granting of a patent gives an individual or company a degree of commercial control over a new invention, in return for disclosing information about the nature of the product. This is often thought of as a way of stimulating invention and encouraging companies to invest in new ideas, yet it has also been argued that patents in fact hold back innovation, and there are historical examples of countries removing patent protections. “In 1869, for example, the Netherlands abolished patents, while Switzerland did not have patent protections during some parts of the 19th century,” explains Eva Hemmungs Wirtén, Professor in the Department of Culture and Society at Linköping University in Sweden. Debate around patents continues today, intensified by concerns about unequal access to vaccines against Covid-19.
Patents have typically been granted under national law, and countries may vary in terms of the specific requirements that need to be met. However, the critical factor is that a new product should be novel, often building on scientific
discoveries and cutting-edge research. “This is one of the baseline aspects of patents which connects them to scientific research,” stresses Professor Hemmungs Wirtén. Proving that an invention is new takes place through searches of what is called prior art, which includes material in journals and various other types of documents. “An enormous amount of material needs to be looked at in order to prove that an invention is new,” she continues. “With the development of ever more complex technical innovations, and the process of documenting and proving them, the patent system becomes almost a motor of the information system.” This, however, is not a completely new development.
PASSIM Project
Professor Hemmungs Wirtén is the Principal Investigator of the ERC-funded Patents as Scientific Information 1895-2020 (PASSIM) project, which is a humanities-based, interdisciplinary project looking at the ascent and development of patents as documents during the twentieth century. What makes
the project unique, she argues, is that its researchers look at patents from a much broader perspective than is usually the case. “Our interdisciplinary team looks at patents with fresh eyes, trying to understand their value and impact as part of the history of information and knowledge,” she outlines.
The project is focused on the history of patents between 1895-2020, with researchers studying patents over this long historical period when information really came to the fore as a key component of modern society. This was paralleled by rapid technological development which opened up new possibilities in the communication of ideas and knowledge. “There were the beginnings of a new structure for the circulation of knowledge and information in the late 19th century, and patents played a role in that,” says Professor Hemmungs Wirtén. The history of patents pre-dates this period by several centuries, but as the processes of industrialization and internationalization gathered pace, the history of patents entered into a new phase.
Patent law is often thought of as a way of stimulating innovation and encouraging companies to develop new products that boost the economy and raise our standard of living. We spoke to Professor Eva Hemmungs Wirtén about her work in investigating the patent system and its role in creating the information infrastructure that shapes our lives today.
“Laboratorium Mundaneum” - the visualization of patents among other documents, made by Paul Otlet in 1937.
This rapid acceleration in the number of patents has been taken by some observers as evidence of their centrality to technological progress, and illustrative of a healthy culture of research and innovation. However, patents can also be used defensively, with corporate patent attorneys wielding the power in some circumstances to halt the development of a rival’s innovation, or to protect an idea which some might consider frivolous. “There are lots of weird and wonderful innovations out there. They may be amusing, but are they really pushing the technical boundaries?” asks Professor Hemmungs Wirtén. The accumulation of patents is not necessarily qualitatively important, but it does lead to the creation of huge volumes of information, an issue that Professor Hemmungs Wirtén and her colleagues are exploring in the project. “The patent system is not only part of the ascent of the information society, I think one of the central conclusions of the PASSIM project is that it’s really been a major driving force in this historical development,” she says.
The way in which patent documents are circulated and used has of course evolved over time, with new technologies emerging throughout the course of the last 120 years, influencing the way in which we record and retrieve information. In the late 19th century patent documents were largely recorded on paper, then later microfilm was used, and nowadays patents can be searched for via databases like the European Patent Office’s (EPO) Espacenet database. “Patent history in a way is part of information history, there is this close relationship with technology and the different ways in which documents have been accessed,” says Professor Hemmungs Wirtén. Patent drawings themselves have also become attractive visual artefacts, sometimes used in public spaces, which Professor Hemmungs Wirtén says is a valuable way of highlighting the
role of the patent system. “Many people find these old patent drawings quite interesting,” she says. “The use of old patent specifications ranges from making fun of the system, to acting as proof of the system’s importance, to being counted and accumulated in order to prove their importance in terms of technological development.”
Symbolic power of patents
A further topic of interest in the project is the symbolic power of patents, which Professor Hemmungs Wirtén believes has grown significantly since the end of the Second World War for a number of reasons. One major factor is the globalisation of patent and trade systems since the end of the conflict, which has helped make technical knowledge ever more valuable.
“The simple term ‘patent pending’ has always denoted a certain kind of value,” outlines Professor Hemmungs Wirtén. The value of a patent lies to a large extent in the information that will be disclosed about what’s being patented, yet over time patents have become increasingly complex, raising new questions. “Can ordinary people read a patent and understand what it is about? Many people would argue that this is impossible today, because you have to be so extremely specialised,” says Professor Hemmungs Wirtén. “From this viewpoint we can’t really think of patents as a source of information and knowledge, because they are just too specialised and complex.”
This calls into question whether the patent system still functions in the way it was initially intended, namely as a motor to spur innovation and technical progress. The patent system has also attracted more fundamental criticism, with some people arguing that it is by nature exclusionary. “The patent system initially developed from a Europe-centric perspective, then it expanded more widely,” outlines Professor Hemmungs Wirtén. The economic landscape of the world has changed significantly since the end of the Second World War, and many nations have gained independence from their former colonial masters over the period covered by the project, another topic PASSIM researchers have explored. “For example, India and Brazil have a colonial past but have become economic and information and knowledge superpowers in their own right,” Professor Hemmungs Wirtén says. “On the one hand, they critiqued the old intellectual property system, but on the other they have also become quite adept at using it themselves. That is a very interesting dynamic.”
There has been a lot of debate about whether patents stand in the way of the development of new vaccines against Covid-19, or if they are necessary to create the kind of financial climate that makes pharmaceutical companies want to get involved.
The PASSIM Team, 2021.
The primary focus in the PASSIM project is on exploring these different perspectives on the patent system, and also highlighting its role in creating the information infrastructure that shapes our lives today. While her own background is in comparative literature, Professor Hemmungs Wirtén says the project brings together people from a variety of disciplines, with different perspectives on patents. “We have legal scholars working in the PASSIM project, as well as people from an inter-disciplinary background or from areas
like information science. They have focused their attention on various different periods and technologies over the timespan covered in the project,” she says. The wider aim in the project is to build a deeper understanding of how the patent system has influenced the way information is transmitted, and to develop a new narrative of patents. “We want to show just what these interdisciplinary perspectives can provide in terms of new insights into the patent system,” continues Professor Hemmungs Wirtén.
PASSIM
Patents as Scientific Information
1895-2020
Project Objectives
PASSIM’s objective is to unpack the multifaceted relationships featured in the patent bargain, recombine them in unexpected and creative ways, and develop from that a new conceptualization of how patents and intellectual property have contributed to, and also acted as an engine in, the consolidation of the knowledge-based economy.
Project Funding
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under Grant Agreement number: 741095 — PASSIM — ERC-2016-ADG.
Project Duration
October 1, 2017-June 30, 2023.
Project Partners
Mundaneum: http://www.mundaneum.org/
Nobel Prize Museum: https://nobelprizemuseum.se/en/
The Science History Institute: https://www.sciencehistory.org/
The International Society for the History and Theory of Intellectual Property: https://www.ishtip.org/
Contact Details
Principal Investigator, Eva Hemmungs Wirtén
Institutionen för kultur och samhälle (IKOS)/ Department of Culture and Society
Linköpings Universitet/Linköping University 601 74 Norrköping
SWEDEN
T: +46 (0) 11 36 30 65
E: eva.hemmungs.wirten@liu.se / @evahemmungswirten
W: www.passim.se
W: www.evahemmungswirten.se
Eva Hemmungs Wirtén
Eva Hemmungs Wirtén is Professor of Mediated Culture at the Department of Culture and Society, Linköping University, Sweden. She has written extensively on the cultural history of international copyright, the public domain and the commons, and, more recently, on patents as documents in the history of information.
When did rational thinking emerge in al-Andalus?
It is commonly thought that philosophy was first cultivated in the region of al-Andalus during the 11th century, but did rational thinking actually emerge earlier? We spoke to Professors Godefroid de Callataÿ, Liana Saif and Sébastien Moureau about their work investigating the origins of philosophy in al-Andalus, and its importance to the wider history of sciences and ideas.
The 11th century is generally thought of as the period during which philosophy and rational thinking first emerged in al-Andalus, a region on the Iberian peninsula which was under Muslim dominion at the time. However, evidence has been uncovered suggesting that philosophy in fact emerged in al-Andalus before this period, which would represent a significant shift in perspective. “We would have to re-think the history of the transmission of sciences, the history of the transmission of rational thinking,” outlines Godefroid de Callataÿ, Professor of Arabic and Islamic Studies at the University of Louvain. This is a topic Prof. de Callataÿ is exploring in the ERC-funded PhilAnd project, in which he and his team are investigating the origins of philosophy in al-Andalus, including research into the work of a society of Muslim scholars called the Ikhwān al-Ṣafā’, the Brethren of Purity. “It is thought that the members of the Brethren of Purity lived during the 9th or 10th century in Iraq and that their work influenced people in al-Andalus,” he says. “One of the aims in the project is to show that this occurred during the 10th century, rather than in the 11th.”
PhilAnd project
This research involves analysing manuscripts and producing critical editions of several important texts dating from the period, including certain parts of the Rasā’il Ikhwān al-Ṣafā’, the encyclopaedia of the Brethren of Purity. In his earlier research, Prof. de Callataÿ found that the Brethren of Purity had a significant influence on later scholars. “I started to see that they were more influential than one had previously thought, and that they had come to be seen as important figures by people in al-Andalus,” he explains. This earlier work eventually led to the establishment of the PhilAnd project, with researchers aiming to build a deeper picture of when and how philosophy developed in al-Andalus; philosophy here is viewed in a broad sense, including essentially all rational thinking. “In the project we’re mainly focusing on 10th-century philosophers in al-Andalus. These philosophers were interested in topics like occult sciences, mathematics, magic and astronomy,” explains Prof. de Callataÿ. “We’re
also looking at how these philosophers transmitted their ideas.”
The way in which the materials available for this inquiry are examined needs to be carefully considered, believes Prof. de Callataÿ. Mainstream scholarship into the Arab-Muslim world as a phase in the transfer of the intellectual legacy of Late Antiquity to Modern Europe has, until now, largely concentrated on what could be referred to as ‘well-defined’ materials and channels of transmissions. “‘Well-defined’ in this sense
refers to those processes which can be traced on rather secure grounds, since they involve the works of clearly-identified and usually high-brow scientists and philosophers whose lives, list of writings and areas of influence, can be determined with relative accuracy,” outlines Professor de Callataÿ.
The influence exerted by the Rasā’il Ikhwān al-Ṣafā’, the so-called ‘Jābirian’ alchemical corpus and the ‘Nabatean Agriculture’ ascribed to Ibn Waḥshiyya have attracted less attention however. “These three large corpora of texts were all produced in the Middle East and share with one another a noteworthy number of peculiarities. All three works were the result of stratified compilations, which possibly extended over various generations. All three were highly syncretic compositions, in which a philosophical – essentially Neoplatonic – conception of man and his place in the universe is inextricably intertwined with religious ideas and an esoterically-orientated conception of science which is commonly referred to in modern scholarship as ‘bāṭinism’,” says Professor de Callataÿ. “All three appear to have been compiled in milieus strongly influenced by Ismā‘īlism, although the nature of their relationships with specific Ismā‘īlī movements as we know them to have existed still remains enigmatic.”
Philosophy in al-Andalus
The project’s agenda involves research into the works of a number of philosophers, including Maslama Ibn Qāsim al-Qurṭubī, a prominent intellectual who explored a range of topics in his work. Al-Qurṭubī is thought to have written the Rutbat al-Ḥakīm, the Book of the Rank of the Sage, which is a central text in the project. “The Rutbat al-Ḥakīm seems to be the text which introduced alchemy into al-Andalus. It appears to be the only Arabic alchemical text that has been preserved from that time in al-Andalus,” says Prof. Sébastien Moureau, who is part of the team working on PhilAnd. Prof. Moureau’s primary focus in the project is the practice of alchemy, which can be thought of as the science of material transformation. “A lot of translations from Arabic into Latin from the 12th and 13th centuries have been preserved, providing evidence that alchemy was present in al-Andalus. The Rutbat al-Ḥakīm allows us to know from which point it was present. We have Latin witnesses that alchemy was practiced – I would say undercover – in al-Andalus, but we don’t have Arabic traces, except for in the Rutbat al-Ḥakīm,” he says. “The Mālikī scholars did not accept some of these sciences, so alchemy was not openly studied, except under the rule of Caliph ‘Abd al-Raḥmān III.”
A lot of attention in the project is centered on investigating the relationship between the development of these esoteric and philosophical ideas in al-Andalus, and their development in the Eastern regions. As Assistant Professor in the History of Esotericism in the Middle Ages at the University of Amsterdam, Liana Saif holds a deep interest in occult sciences; her work in the project is focused on the corpus attributed to Jābir Ibn Ḥayyān. “The Jābirian corpus was produced in the Eastern region, most likely
in Iraq, over a fairly long period of time. The authorship of this corpus is contentious,” she outlines. The significance of this corpus to the project as a whole can be understood through the impact of a text on magic called the Ghāyat al-Ḥakīm (or Picatrix in Latin).
“This is translated as the Goal of the Sage and is the sister text to the Rutbat al-Ḥakīm. It was also written by Maslama Ibn Qāsim al-Qurṭubī, who was the tutor of the son of ‘Abd al-Raḥmān III,” says Prof. Saif. “He was from al-Andalus, but he also travelled to Eastern regions, which is one of the ways that his ideas were transmitted. The Jābirian corpus had a deep influence on Maslama Ibn Qāsim al-Qurṭubī.”
The specific text Prof. Saif is focusing on in the project is a book about magic called the Kitāb al-Nukhab, also known as Kitāb al-Baḥth, which is attributed to Jābir Ibn
Ḥayyān. While Jābir Ibn Ḥayyān is primarily known as an alchemist, Prof. Saif hopes to shed new light on his magical work and explore the interconnection between the two topics. “This interconnection is reflected in the Jābirian corpus, as well as in the Ghāyat al-Ḥakīm. These works helped to make the unseen knowable, so that new insights could be used for practical, pragmatic, useful ways of understanding and manipulating nature,” she explains.
With a deeper understanding of the place of magic in knowledge production in this period, Prof. Saif aims to then trace the movement of these ideas into different areas. The Rutbat al-Ḥakīm was written before the Ghāyat al-Ḥakīm and there are references to the former text in the latter; there is a deep interconnection between these two texts, and Prof. Saif says this was part of a whole programme of learning. “Philosophy is presented in the works that I study as a kind of cultivation of wisdom using rational sciences and natural sciences,” she outlines. The occult sciences occupied an important place in the intellectual landscape at the time. “The scholar Maslama Ibn Qāsim al-Qurṭubī put alchemy and magic at the very end of the curriculum, he called them the two conclusions. They are the culmination of what he called a ladder of ascension, a kind of intellectual or scientific ladder,” explains Prof. de Callataÿ. “An individual who knew about one of these two sciences would be considered as half a sage. Somebody who knew both of them was a full philosopher, a full sage,” as Maslama famously wrote.
Critical editions
Researchers also plan to make the materials in these manuscripts more widely available, helping to shed new light on the work of these 10th-century scholars and to learn more about their influence on later authors. The PhilAnd team are creating critical editions of several texts, including the Rutbat al-Ḥakīm and parts of the Rasā’il Ikhwān al-Ṣafā’, which is an extremely lengthy work. “We also translate these texts into English. We are producing annotated translations and critical editions,” says Prof. de Callataÿ. As part of her work in PhilAnd, Prof. Saif is focusing on the relatively under-studied Kitāb al-Nukhab. “I’m producing a critical edition and a translation of the Kitāb al-Nukhab, based on work with manuscripts. This is a huge text and will require a lot of attention,” she says. “This will be quite a distinctive work, because we don’t have a lot of up-to-date editions of the works attributed to Jābir Ibn Ḥayyān. In the general imagination the figure of Jābir Ibn Ḥayyān is known mostly as an alchemist, so it would be good to have an edition that showcases some of his more magical thinking.”
A critical edition of epistle 52 b and c (two of the three versions of the Epistle of Magic) of the Rasā’il Ikhwān al-Ṣafā’ has also been produced in direct line with the project’s work, and is currently being prepared for submission to the ‘Epistles of the Brethren of Purity’ series published by Oxford University Press in association with the Institute of Ismaili Studies. This epistle is a lengthy text, and Prof. Moureau says a lot of work has been involved in producing the critical edition. “It’s been the culmination of a lot of effort. It’s not just a case of finding a manuscript and making it available,” he stresses. While the original text was written in the 10th century, researchers are working with later copies, which adds a layer of complexity. The same problem arises with the Rutbat al-Ḥakīm. “Our earliest witness of the Rutbat al-Ḥakīm is from the 14th century. So you have four centuries between the point at which the text was written, and the earliest copy that we have in our hands,” outlines Prof. Moureau. “We have around 60-65 copies of this text, which is quite a lot in comparison to other works. The earliest copies we have are from the 14th century, and the more recent are from the 20th century. We must take all these texts and compare them together, in order to try to reconstruct a text that was written in the 10th century.”
This is a very long process, which involves reading and comparing all the manuscripts in great detail, while very specific rules around editing texts must be rigorously followed. It’s important to stay close to the text and not over-interpret it, although there is scope for explanatory notes in the apparatus criticus. “This is the section in an edition where you put all the notes, explaining what you found in the manuscripts,” explains Prof. Moureau. While the project is now entering its final year, Prof. de Callataÿ is looking to continue his research in this area and is exploring the possibility of a new project with a broader scope. “This new project will not be focused on al-Andalus, it will be more about Islam and the transmission of the sciences in general,” he says. “We shall focus more on the transmission of sciences in later periods, from the 12th century up to the 17th, and look further to the East. This will involve not only texts in Arabic, but also in Persian and Turkish. The focus will clearly be on the occult sciences.”
PHILAND
The origin and early development of philosophy in tenth-century al-Andalus: the impact of ill-defined materials and channels of transmission
Project Objectives
The objective of PhilAnd is to conduct a large-scale exploration of how, and under which form, philosophy appeared for the first time in al-Andalus, a pivotal issue to understand the history of sciences and ideas, and the role of the Arab-Muslim world in this transfer to Medieval Europe.
Project Funding
PhilAnd is an ERC Advanced Grant project funded by the European Research Council under the European Union’s Horizon 2020 programme (Grant Agreement 740618).
Project Partners
The project is held at UCLouvain, with the Warburg Institute (London) as second beneficiary.
Contact Details
Prof. Godefroid de Callataÿ
Institut des Civilisations, Arts et Lettres (INCAL), UCLouvain
1, Place Blaise Pascal
B-1348 Louvain-la-Neuve
Belgium
T: +32 10 47 49 66
E: godefroid.decallatay@uclouvain.be
W: https://sites.uclouvain.be/erc-philand
Godefroid de Callataÿ is Professor of Arabic and Islamic Studies at UCLouvain. He specializes in the history of philosophy and science and has published extensively on the Rasā’il Ikhwān al-Ṣafā’.
Sébastien Moureau is a researcher at the FNRS and Professor at UCLouvain. His research is focused on the history of science, with special attention to the history of alchemy and metallurgy.
Liana Saif is assistant professor of the medieval history of esotericism and the occult sciences in the Islamic world and Europe at the University of Amsterdam. Previously, she was a research associate at the Warburg Institute and UCLouvain.
The Rutbat al-Ḥakīm seems to be the text which introduced alchemy into al-Andalus. It’s essentially the only Arabic text that has been preserved from that time in al-Andalus.
Medina Azahara in Córdoba, Spain. Photo by Kent Wang. Medina Azahara. Photo by R Prazeres.
Investigating the discourse of right-wing populists
Right-wing populist parties in Europe have developed an electoral agenda based on strengthening state border control. How do these parties adapt their discourse in cross-border regions characterized by strong economic interdependences? We spoke to Dr Christian Lamour and Professor Oscar Mazzoleni about their work in analysing populist discourse in cross-border regions.
The European integration process has encouraged the development of cross-border functional regions based on the daily influx of workers from beyond national borders. Prominent examples include Luxembourg in the EU, and also Geneva, Basel and Ticino in Switzerland, which all attract commuters from the neighbouring EU states. For instance, 200,000 cross-border workers are employed in the Luxembourg economy. This economic interdependence across borders is facilitated by EU political agreements and policies, but what about Eurosceptic right-wing populist parties, which have based much of their electoral appeal on a desire to strengthen nation-state borders and oppose an EU that they perceive as “borderless”? This question is central to the work of the CROSS-POP project, in which researchers are looking at the discourse produced by right-wing populists in the Luxembourg and Swiss cross-border functional regions.
Cross-border workers
The presence of cross-border workers in these regions can lead to a degree of antagonism with local residents, who may move around less and resent those who they see as exploiting the economic opportunities in open borderland regions. As political groups, right-wing populist parties often claim to represent the interest of geographically fixed citizens, and in some cases may scapegoat the more mobile cross-border workers. This active population then becomes one subcategory of the migrants who constitute a major community opposed to the people, as suggested by Christian Lamour, senior researcher at LISER in Luxembourg.
One can notice that European borderlands can be characterised by the presence of strong right-wing populist parties, as seen in the selected case studies of the CROSS-POP project. “In the case of Ticino there are two strong populist parties, on both sides of the Swiss-Italian border. In Basel, Luxembourg and Geneva, there is a strong, right-wing populist mobilisation on only one side of the border,” says Oscar Mazzoleni, a Professor in the Institute of Political Studies at the University of Lausanne. “How can we explain this variation? How do these parties mobilise? Is the cross-border regional integration a central issue in the opposition of the people to the elite and others? Usually radical right-wing populist parties are studied at the state-border national level, but we’re taking a regional approach at the cross-border scale.”
Right-wing populism is based on an antagonistic vision of society implying a frustrated people, a frustrating elite and threatening “others” such as migrants.
This research aims to analyse the discourse employed by right-wing populists during electoral periods, looking at how they seek to appeal to voters. The project is focused primarily on three research questions. “The first is to look at the specific workings of this radical right-wing discourse. Secondly, was there any convergence of right-wing populist discourse at the scale of cross-border regions? The third question was about the role of the media in the circulation of right-wing populism in border regions,” outlines Dr Lamour. In a way, populists ‘perform crisis’ during election campaigns, to use Benjamin Moffitt’s expression. They give the electorate the impression that there is an overall structural problem that has to be resolved, and that conventional politicians can’t provide the solution. “When addressing the common people, the message from right-wing populists is: ‘We can secure your living conditions, through taking back control.’ There is a commonality here between the Swiss, Italian and French populist parties analysed in our case studies. This message of taking back control is strictly connected with the defence of the border,” says Professor Mazzoleni.
The common enemy to an extent in each of the cases is the European Union, which is often portrayed by right-wing populists as a remote, bureaucratic elite acting against the interests of the citizens. There is a certain level of scepticism across the continent about the European Union, and a hostility to the idea of open borders within the Union. “Populist parties are to an extent reacting to an anti-EU mood in the population and favour this Euroscepticism,” says Dr Lamour. These radical right-wing parties are typically nationalist in outlook, focused on strengthening borders, so might be expected to oppose the idea of collaborating with others, yet Professor Mazzoleni says they do in fact cooperate at different European levels including sometimes in cross-border regions such as the polarised area around Ticino, where populist parties hold executive powers on both sides of the border. “When you have two parties that are close to the border, they may find common enemies in order to avoid conflict.”
In-groups vs out-groups
The antagonism promoted by right-wing populists is partly about the opposition between people-centred in-groups and out-groups combining elites and others who can be represented as a threat to the people. This antagonism varies with the cross-border regional context, and cross-border workers turn out to be considered differently by right-wing populists in different border regions. Dr Christian Lamour notes that they are often included in the negatively defined out-group by the Swiss populist parties in Ticino and Geneva, but less so in Basel, while the populist right in Luxembourg can place cross-border workers in either the out-group or the in-group, depending on the context and the issue. The French and Italian populist parties in the CROSS-POP case studies do not criticise cross-border workers, as most of them reside in their electoral districts and constitute an electorate to be attracted by scapegoating other communities, such as extra-European migrants or the liberal elite, who are perceived as being responsible for a borderless Europe which frustrates the people.
Populist party representatives may seem to belong to the political, economic and cultural elite due to their enduring presence in politics, their personal wealth and their educational background. “Nevertheless, they often use a popular communication style and more broadly represent themselves as the representatives of the victimised people,” says Dr Lamour.
The approach of the media towards reporting this discourse is another major topic of interest in the project. While some media outlets may look to profit from sensationalist discourse, using it as a means to attract attention and increase sales, Dr Lamour says a desire to contain populist rhetoric is also evident in some regions. “For example, in Luxembourg there is what we call a cordon sanitaire. The mass media is cautious about reporting on the ideas and arguments of right-wing populist parties,” he explains.
A number of research papers have already been published, while Professor Mazzoleni has just co-authored a book exploring some of the themes which have been investigated in the project. This research represents a valuable contribution to the literature on right-wing populism in Europe. “Our analysis is quite innovative as cross-border functional regions are considered as spatial conditions and resources for populist mobilisation within multiscaling settings, including European scale,” says Professor Mazzoleni.
CROSS-POP
The Right-Wing Populist Discourse in European Cross-Border Areas. A comparison between Switzerland and Luxembourg
Project Objectives
The goal of the CROSS-POP project is to analyse right-wing populist discourse (RPD) in European cross-border regions. The project follows a comparative approach focusing on four cross-border areas centred on Luxembourg, Geneva, Basel and Ticino. Some broader aspects of bordering processes, in connection with populism, are also examined.
Project Funding
This research is supported by the Luxembourg National Research Fund (Grant number: INTER/SNF/18/12618900/CROSS-POP) and the Swiss National Science Foundation (Grant number: 10001CL_182857).
Contact Details
Project Coordinators
Switzerland:
Professor Oscar Mazzoleni
E: oscar.mazzoleni@unil.ch
W: https://applicationspub.unil.ch/interpub/noauth/php/Un/UnPers.php?PerNum=870629&LanCode=37&menu=coord
Luxembourg:
Dr. Christian Lamour
E: christian.lamour@liser.lu
W: https://liser.elsevierpure.com/en/persons/christian-lamour
W: https://www.unil.ch/ovpr/en/home/menuinst/ricerche/projets/projet-fns-crosspop-2019-2023.html
The search for a unified theory
Einstein’s general theory of relativity is one of the pillars of modern physics, yet it doesn’t explain what happens when spacetime breaks down at a gravitational singularity. Researchers in the eQG Group are working to develop a new theory of quantum gravity, bringing together general relativity and quantum mechanics, as Professor Hermann Nicolai explains.
The General Theory of Relativity was published over a century ago and it works beautifully to this day, with even modern observations of gravitational waves conforming to Einstein’s field equations. Einstein’s field equations also predict the existence of black holes, which is where the theory starts to break down. “Once you enter a black hole, you invariably run into a gravitational singularity, where spacetime essentially breaks down. It’s completely clear that at that point another theory is needed, because general relativity doesn’t tell you what happens at that singularity,” explains Professor Hermann Nicolai, Director Emeritus of the Max Planck Institute for Gravitational Physics.
Exceptional Quantum Gravity
A new description is required to deal with this problem, a topic at the core of Professor Nicolai’s work. As the head of the Exceptional Quantum Gravity (eQG) Research Group, Professor Nicolai and his colleagues are addressing one of the greatest challenges in modern physics. “We aim to unify general relativity and quantum mechanics,” he says. A new theory of quantum gravity would represent an important step towards resolving singularities; an analogy can be drawn here with the example of the hydrogen atom. “With the hydrogen atom you have a singularity of the Coulomb potential that would destroy the atom within fractions of a second, but quantum mechanics helps resolve it,” outlines Professor Nicolai. “We aim to bring together general relativity and quantum mechanics in a way that likewise eliminates all singularities.”

The basic building blocks of such a theory have not been identified however, and a wide variety of different approaches to the problem have been put forward. These include approaches based on loop quantum gravity, spin-foam quantum gravity and string theory, the idea that the basic constituents of matter are not point-like particles but rather one-dimensional extended objects, or strings. “The idea is that these strings vibrate, and their vibration modes are what make up elementary particles,” explains Professor Nicolai. “However, string theorists haven’t been able to establish a compelling connection with the standard model of particle physics. We need to go beyond string theory to find a unifying or basic principle.”
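The hydrogen-atom analogy mentioned above can be made concrete with the standard textbook result; this is an illustration of the general point, not part of the eQG group’s own calculations:

```latex
% Classically, the Coulomb potential is singular at the origin,
V(r) = -\frac{e^2}{4\pi\varepsilon_0\, r} \;\longrightarrow\; -\infty
\qquad (r \to 0),
% so a classical electron would radiate energy and spiral into the nucleus
% within a fraction of a second. Quantum mechanics replaces this catastrophe
% with a finite ground-state energy and a smallest characteristic size,
% the Bohr radius:
E_1 = -\frac{m_e e^4}{2(4\pi\varepsilon_0)^2 \hbar^2} \approx -13.6\,\mathrm{eV},
\qquad
a_0 = \frac{4\pi\varepsilon_0 \hbar^2}{m_e e^2} \approx 0.53\,\text{\AA}.
```

The hope, by analogy, is that a quantum theory of gravity would similarly smooth out the singularities of general relativity.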
The different groups of researchers could benefit greatly from sharing their ideas, insights and experience with each other. However, Professor Nicolai says they tend to exist as separate communities. “These communities don’t really talk to each other, they live in parallel universes,” he says. Professor Nicolai and his colleagues are taking a different approach, aiming to maintain an open dialogue with researchers from different areas. “We need to be open, and to listen to other ideas,” he stresses. “As members of the supergravity and superstring community we remain in contact with researchers in other approaches and try to keep up to date with their research.”
The origins of Professor Nicolai’s work in this area can be traced back to earlier research by the physicists Vladimir Belinski, Isaak Khalatnikov and Evgeny Lifshitz (BKL). Their work represents a key discovery in mathematical and theoretical cosmology. “They studied the Einstein equations very close to the singularity and discovered that as you approach the singularity in four dimensions, chaotic behaviour sets in. These are the so-called BKL oscillations,” explains Professor Nicolai. These oscillations have been known about for around 50 years, but much more recent analysis has led to deeper insights. “It turns out that there is an extremely interesting mathematical pattern here which is indicative of a huge infinite-dimensional symmetry,” continues Professor Nicolai.
This symmetry is the Kac-Moody E10 symmetry, which is vastly bigger than anything that has been considered so far in physics. Symmetries like this are an important organising principle in physics. “The standard model of particle physics is based on a symmetry. Once you have this symmetry, then with a little extra information you can write down all the equations for the standard model,” outlines Professor Nicolai. The idea Professor Nicolai is exploring is that the true symmetry of gravity, of what comes after Einstein’s general theory of relativity, is only revealed as you push further towards a singularity. “A singularity needs extremely short distances – or extremely high energies. These two things are reciprocal,” he continues.

The E10 symmetry is now the focus of intense attention in the eQG group, with the aim of developing a unified theory of quantum gravity and the other fundamental interactions that is entirely defined by symmetry principles. E10 is a unique mathematical object, says Professor Nicolai. “It is an infinite prolongation of the known symmetries that have been used in physics so far,” he says. The E10 symmetry is also known to have an extremely complicated structure. “The existence of this particular structure has been known about in mathematics for more than 50 years, but there hasn’t been much progress in terms of understanding it,” outlines Professor Nicolai. “Not many mathematicians are currently working on it, because of its extreme complexity.” An impression of this complexity is conveyed by the ‘psychedelic’ picture on the following page, which represents a slice through the root lattice of the E10 algebra.
There is also a theory in physics which cannot be extended further, namely 11-dimensional supergravity, which is often mentioned in connection with M-theory.
eQG
Exceptional Quantum Gravity
Project Objectives
The Exceptional Quantum Gravity (eQG) Research Group is working to reconcile quantum mechanics and general relativity, one of the greatest challenges in theoretical physics. This research involves using the Kac-Moody symmetry E10, which is an extremely useful tool in the search for a consistent theory of quantum gravity.
The group is bringing together several different strands of research, aiming to build a deeper understanding of the dynamics of quantum space-time.
Project Funding
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 740209.
Contact Details
Project Coordinator
Professor Dr. Dr. h.c. Hermann Nicolai
Director emeritus at the Max-Planck-Institut für Gravitationsphysik (Albert-Einstein-Institut)
Am Mühlenberg 1
D-14476 Potsdam
GERMANY
T: +49 331 567-7216
E: hermann.nicolai@aei.mpg.de
W: https://www.aei.mpg.de/306486/exceptional-quantum-gravity
W: https://www.aei.mpg.de/190500/erc-advanced-grant-for-hermann-nicolai
M-theory is a sort of overarching framework that aims to bring together different strands of string theory. “If you look at 11-dimensional supergravity in more detail, it turns out that hints of E10 symmetry are hidden in it,” says Professor Nicolai. In a sense, 11-dimensional supergravity already represents a kind of unification. “This is the most supersymmetric field-theoretic extension of Einstein’s theory that can currently be written down. There’s nothing beyond it,” stresses Professor Nicolai. “At some point you reach the end of the line, and that’s the theory we have been discussing for the last 45 years. Although it is surely part of the answer, links between this theory and the real world have remained elusive.”
Testing new theories
Researchers in the eQG group have been working to push unification further, using the E10 symmetry. This will eventually lead to new insights, which then have to be analysed, understood, and if possible tested against observational data. “Every physical theory, at some point, has to be tested. That’s the ultimate arbiter of whether a theory is any good,” says Professor Nicolai. One encouraging sign is that this scheme predicts 48 fundamental fermions, in agreement with the number of quarks and leptons in three generations found in the standard model. Another potentially testable prediction centres on dark matter, the nature of which is a major open problem in physics. “The prediction that comes out of our theory is that dark matter is made out of extremely heavy particles, supermassive gravitinos,” outlines Professor Nicolai.
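The figure of 48 fermions matches the conventional tally of fermionic states in the standard model when each generation includes a right-handed neutrino; the back-of-envelope count below is an illustration of that arithmetic, not the eQG group’s own derivation:

```python
# Counting the 48 fundamental fermions mentioned in the text, assuming the
# conventional tally per standard-model generation with a right-handed
# neutrino included (the article does not spell out the counting scheme).

COLOURS = 3  # QCD colour charges carried by each quark

def fermions_per_generation() -> int:
    quarks = 2 * COLOURS * 2   # up- and down-type, 3 colours, L and R chiralities
    charged_leptons = 2        # e.g. the electron: left- and right-handed
    neutrinos = 2              # left-handed plus a right-handed neutrino
    return quarks + charged_leptons + neutrinos

GENERATIONS = 3  # the electron, muon and tau families

total = GENERATIONS * fermions_per_generation()
print(total)  # 3 generations x 16 states = 48
```

Any unified scheme that predicts exactly this number is therefore at least consistent with the observed pattern of quarks and leptons.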
This prediction would be extremely hard to test, however, because of the extreme rarity of these particles. One possibility is an underground experiment, while another is based on the idea of palaeo-detectors, where researchers could look for traces of these particles in ancient rock. “In principle we might be able to detect a trace of a particle that sped through old rock in a straight line,” says Professor Nicolai. Testing a quantum gravity theory is a challenging task, and Professor Nicolai says it’s not enough to match just one specific feature; rather, everything must fit together. “It’s not just about one aspect of what you can observe in the sky, or see in an accelerator or a dark matter search experiment,” he stresses. “In the end it all has to fit together seamlessly, including particle physics, gravitation and cosmology. It’s a very, very tall order!”
This research is extremely abstract by nature and highly complex, yet it holds deep fascination for many people, including not just scientific specialists but also members of the public who have asked themselves fundamental questions about the origins of the universe. Pretty much every culture throughout recorded history has pondered how the world began, but nowadays rather than relying solely on theological or philosophical ideas to develop theories, we can bring mathematics and scientific instruments to bear on the topic. “We can use the fact that a lot of progress has been made in physics towards understanding large parts of the universe,” says Professor Nicolai. “And perhaps the existence of a uniquely exceptional mathematical structure like E10 can guide us towards the correct answer.”