Ursula von der Leyen vows to increase EU research spending
Over €380 million new funding for LIFE programme
Microplastics merge with ‘Forever Chemicals’
The history and heritage of the written word
As a seasoned editor and journalist, Richard Forsyth has been reporting on numerous aspects of European scientific research for over 10 years. He has written for many titles including ERCIM’s publication, CSP Today, Sustainable Development magazine and eStrategies magazine, and remains a prolific contributor to the UK business press. He also works in Public Relations for businesses, helping them communicate their services effectively to industry and consumers.
At a summit of leaders in 2021, the then newly inaugurated US President, Joe Biden, said we must “overcome the existential crisis of our time. We know just how critically important that is because scientists tell us that this is the decisive decade. This is the decade where we will make decisions to avoid the worst consequences of the climate crisis”.
As Donald Trump reclaims power in the US, the scientists Biden alluded to will be watching with a sense of returning gloom, as the direction of policy on sustainable development looks set to take a nosedive beyond 2025.
Where Biden pledged to lead energy and climate mitigation policy with science, Trump has other ideas. His expected choice for Energy Secretary is Chris Wright, CEO of Liberty Energy, a giant North American hydraulic fracturing company. He’s a pro-oil climate sceptic who was flagged by LinkedIn for his statement declaring: “There is no climate crisis…”
There will also likely be a systematic dismantling of sustainability-related regulations, a playbook from his first term, leaving standards and scrutiny to take a back seat to a more ‘open’ market, made worse by the championing of fossil fuels.
In the US, and in many other places too, it is apparent that a lot of people are choosing short-term relief over long-term survival. That is understandable when driving to work costs more at the pump, or when you cannot afford food or the escalating bills to heat your home. Climate impact can be perceived as something that will only happen to others, and so it seems less pressing than being short of money. It may only be when climate disaster arrives on everyone’s literal doorstep that it begins to matter, but by then it may be too late to act.
With this in mind, it’s interesting to glance back at The World Economic Forum’s Global Risks Report 2024, published last January. The report was compiled from findings from the Forum’s Global Risks Perception Survey, which takes answers from around 1,500 experts from academia, business, government and key institutions. The report revealed the biggest global risks beyond 2024, with a view of two years into the future, and then ten.
It stated that ‘misinformation and disinformation’ would be the biggest immediate risk until 2026, with extreme weather events in second place, ‘societal polarisation’ third and ‘cyber insecurity’ fourth. Over the ten-year period from publication, however, the predicted global risks shift firmly to the natural world: extreme weather events rank first, followed in descending order by critical changes to Earth systems, biodiversity loss and ecosystem collapse, and then natural resource shortages. Nature’s demise, in other words, occupies all the top spots among threats to our way of life over the coming nine years.
While the US commands a dominant role on the global and Western stage, Europe has to remain resolute and committed to science-led policy for the safety and security of its people and the wider world. We have to have the imagination to understand that climate change, biodiversity loss and resource degradation will impact us, sooner rather than later, so every single decision we make about our future today counts like never before.
The latest US election tells us that a large proportion of people still don’t get it, or don’t care enough. Any diminishing or degrading of sustainability and climate mitigation efforts now will have grave impacts in just a handful of years, impacts that go beyond the prices of goods, energy security and our human sphere. An impending collapse of our natural world’s stability will make today’s decisions, pitched against scientific guidance, nothing short of historical markers of our ignorance in the face of reality.
Richard Forsyth Editor
EU Research takes a closer look at the latest news and technical breakthroughs from across the European research landscape.
We spoke to Professor Martin Røssel Larsen about how he is using organoids to investigate the role of sialic acids in early brain development, and the long-term consequences of any malformations.
The SciFiMed project team are developing new tools to analyse the FH family of proteins and investigate diseases associated with dysregulation of the complement system, as Professor Diana Pauly explains.
We spoke to Prof. Marianna Kruithof-de Julio and her team at the University of Bern about their research which involves using patient-derived organoids to improve bladder cancer treatment. Their work aims to tailor therapies based on individual tumour profiles, addressing the disease’s high heterogeneity and recurrence rates.
Enzymes transfer oxygen or electrons and speed up oxidation reactions so that a wide variety of products can be made without using high temperatures or harsh chemicals, as Professor John Woodley explains.
We spoke to Professor Johannes Lengler about his work in refining geometric inhomogeneous random graphs (GIRGs), and how they could improve epidemic modelling and inform more effective quarantine strategies.
It’s not clear whether left atrial appendage closure prevents strokes in patients who have not been diagnosed with the condition, a question Dr Helena Dominguez and her colleagues in the LAACS-2 study are addressing.
Mount Etna is one of the most active volcanoes in the world. The Improve project team are investigating signals from the volcano and how they relate to magma dynamics, as Dr Paolo Papale explains.
The Re:FAB project aims to address sustainability concerns and encourage the transition towards a circular economy, as Helena Tuvendal, Ola Karlsson, Johan Borgström and Fabian Sellberg explain.
We spoke to Professor Torsten Nygaard Kristensen from Aalborg University about his work in optimising the farming of efficient insect populations that can convert organic waste into high-value products.
We spoke with Professor Charlotte Jacobsen, Professor Poul Erik Jensen, Professor Marianne Thomsen, Emil Gundersen and Malene Fog Lihme Olsen to learn more about their research into microalgae cultivation.
Researchers in the Co-Green project are looking at the issues around wind farm noise and how local communities can be better included in project planning, as Dr Julia Kirch Kirkegaard and Daniel Frantzen explain.
35 TRINC
Dr Astrid Nonbo Andersen and the TRiNC project team are exploring how truth and reconciliation commissions function in different Scandinavian countries and their role in shedding new light on the past.
36 UNDERSTANDING WRITTEN ARTEFACTS
The Understanding Written Artefacts (UWA) cluster brings together a broad community of researchers to analyse a variety of written artefacts, from 5,000 years ago right up to the present day.
42 FAMILY BACKGROUND, EARLY INVESTMENT POLICIES, AND PARENTAL INVESTMENTS
How do young families respond to public policies such as subsidised childcare? Does their response depend on their background? These questions lie at the core of Dr Hans Sievertsen’s research.
44 CONTESTABLE (AI)
Individuals affected by automated decisions have the right to meaningful information about the basis on which they were reached, as well as the right to contest the decision, issues Professor Thomas Ploug is addressing.
The team behind the Ultra-Lux project are developing a new type of thin-film light-emitting diode, using a class of materials called perovskites, as Professor Paul Heremans and Karim Elkhouly explain.
48 METEOR
Rare earth elements have been used to produce magnets since the 1960s. We spoke to Professor Mogens Christensen about his work in developing a more environmentally friendly method of producing ceramic magnets.
The QFaST project team are using sophisticated sensors to measure the properties of spin excitations in magnets, and investigating whether they could be used to perform certain tasks, as Dr Pepa Martínez-Pérez explains.
The production of terrestrial gamma-ray flashes (TGFs) within thunderclouds is the most energetic natural process on Earth. We spoke to Dr Christoph Köhn about his research into the signatures of TGFs.
EDITORIAL
Managing Editor Richard Forsyth info@euresearcher.com
Deputy Editor Patrick Truss patrick@euresearcher.com
Science Writer Nevena Nikolova nikolovan31@gmail.com
Science Writer Ruth Sullivan editor@euresearcher.com
Production Manager Jenny O’Neill jenny@euresearcher.com
Production Assistant Tim Smith info@euresearcher.com
Art Director Daniel Hall design@euresearcher.com
Design Manager David Patten design@euresearcher.com
Illustrator Martin Carr mary@twocatsintheyard.co.uk
PUBLISHING
Managing Director Edward Taberner ed@euresearcher.com
Scientific Director Dr Peter Taberner info@euresearcher.com
Office Manager Janis Beazley info@euresearcher.com
Finance Manager Adrian Hawthorne finance@euresearcher.com
Senior Account Manager Louise King louise@euresearcher.com
EU Research
Blazon Publishing and Media Ltd 131 Lydney Road, Bristol, BS10 5JR, United Kingdom
T: +44 (0)207 193 9820
F: +44 (0)117 9244 022
E: info@euresearcher.com www.euresearcher.com
© Blazon Publishing June 2010 ISSN 2752-4736
The EU Research team take a look at current events in the scientific news
These projects aim to address environmental challenges and advance climate action across the continent.
The Commission has today granted more than €380 million to 133 new projects across Europe under the LIFE Programme for environment and climate action. The allocated amount represents more than half of the €574 million total investment needs for these projects, with the remainder coming from national, regional and local governments, public-private partnerships, businesses, and civil society organisations.
LIFE projects contribute to reaching the European Green Deal’s broad range of climate, energy and environmental goals, including the EU’s aim to become climate-neutral by 2050 and to halt and reverse biodiversity loss by 2030, while ensuring Europe’s long-term prosperity. This investment will have a lasting impact on our environment, the economy and the well-being of all Europeans.
The projects cover a range of environmental areas, including the circular economy, nature and biodiversity restoration, climate resilience, and clean energy transitions. These investments will have a profound impact on Europe’s environment, economy, and citizens’ quality of life, propelling the EU’s green transition.
One of the primary focuses of the LIFE Programme is promoting a circular economy. Nearly €143m has been dedicated to 26 projects that aim to reduce waste, improve water use, and tackle air and noise pollution. These initiatives also emphasise the importance of recycling and reusing materials to minimise environmental impact. Notably, Italy’s LIFE GRAPhiREC project is set to recycle graphite from battery waste, potentially generating €23.4m in revenue while cutting production costs by €25m. Spain’s LIFE POLITEX project, with a €5m investment, aims to reduce the fashion industry’s environmental footprint by converting textile waste into new fabrics. Additionally, the Canary Islands’ €9.8m DESALIFE project seeks to improve water resilience by desalinating 1.7 billion litres of ocean water using offshore wave-powered buoys.
Close to €216m has been allocated for projects that restore ecosystems and protect biodiversity. Twenty-five projects will address habitat restoration, species conservation, and improved management of freshwater, marine, and coastal environments.
LIFE4AquaticWarbler and LIFE AWOM, two biodiversity projects, aim to save the rare aquatic warbler bird, mobilising €24m across multiple countries, including Belgium, Germany, and Spain.
Budapest’s €3.6m Biodiverse City LIFE project promotes the peaceful coexistence of nature and urban life, showcasing how cities can integrate biodiversity into their planning.
With €110m earmarked for climate resilience and mitigation efforts, Europe is making strides in adapting to the impacts of climate change. IMAGE LIFE and LIFE VINOSHIELD, two notable projects, will help European vineyards and cheese producers (e.g., Parmigiano Reggiano and Camembert) adapt to extreme weather conditions and water scarcity. To further accelerate the green transition, €105m is being invested in projects that promote clean energy solutions. One example is LIFE DiVirtue, a €1.25m project using virtual and augmented reality to train construction professionals in building zero-emission structures.
Meanwhile, the €10m ENERCOM FACILITY project will empower 140 emerging energy communities across Europe to develop sustainable energy initiatives and promote long-term renewable energy solutions. The LIFE Programme’s investment in these 133 projects highlights the EU’s commitment to tackling environmental challenges and achieving a sustainable future. Through innovations in the circular economy, biodiversity restoration, and climate resilience, Europe is moving closer to its goal of becoming climate-neutral by 2050 and ensuring the success of its green transition.
https://cinea.ec.europa.eu/programmes/life_en
Closing the innovation gap with the US and China is her first ambition, but her speech is light on new details.
Ursula von der Leyen promised to put research and innovation “at the centre of our economy” as she laid out her plans for the next five years, before the European Parliament voted to confirm her second term as president of the European Commission.
A total of 401 MEPs voted in favour of von der Leyen’s re-election this afternoon, and 284 voted against, making for a more comfortable majority than her first vote in 2019. There was no mention of research in her speech to MEPs before the vote, but in her political guidelines published this morning, von der Leyen pledges to increase the EU’s research spending, and to “expand” the European Research Council (ERC) and the European Innovation Council (EIC).
The ERC delivers grants that primarily support early-stage research, with a budget of more than €16 billion under Horizon Europe. French president Emmanuel Macron has called for the ERC to be reinforced, and ERC president Maria Leptin recently urged grant holders to lobby politicians for more funding. The EIC supports researchers and innovators with grants and equity investments. Its €10.1 billion budget until 2027 makes it one of the biggest deep-tech investors in the world, but its president has stressed the need for additional funding. Meanwhile, MEPs and the research community have been calling for the Horizon Europe budget to be doubled to €200 billion in the next framework programme, as member states continue to show no sign of spending 3% of GDP on R&D, a target first proposed back in 2003.
Von der Leyen also wants to launch new public-private partnerships, propose a European biotech act next year as part of a wider life sciences strategy, and strengthen the university alliances designed to deepen cross-border links between institutions. Leptin said von der Leyen’s appointment would provide much-needed continuity, and she is “very pleased” with the commitment to increase research spending with a focus on fundamental research, scientific excellence, and disruptive innovation, and to expand the ERC. “In the ERC competitions, we can now only fund 60% of excellent proposals; increased funds would certainly help to close this gap and further curb the brain drain from Europe,” Leptin said. “What’s more, there is a need to increase the funding given to ERC grantees as the grant sizes have not changed since the creation of the ERC in 2007, despite inflation.”
Her focus on research and innovation won applause from MEPs. “I very much welcome that we finally have a Commission President again that understands the value of science, research and innovation for the European project,” said German MEP Christian Ehler in a statement posted to X. But despite research and innovation winning pride of place in von der Leyen’s guidelines, she has not yet fleshed out any policy details.
In her political guidelines for this term, von der Leyen has promised to “increase” research spending and expand the European Research Council (ERC) and European Innovation Council (EIC). But it’s unclear what budget the Commission will shoot for when haggling with member states begins in earnest next year.
In a report published this week, MEPs in the Parliament’s Committee on Industry, Research, and Energy (ITRE) say half of the FP10 budget should go to the ERC and EIC. MEPs are also backing the creation of a European Technology and Industrial Competitiveness Council to enhance private sector participation, and a European Societal Challenges Council to manage research and innovation activities addressing societal challenges.
In her speech on Wednesday, von der Leyen did not mention several policy ideas in the recent report from former Italian prime minister Mario Draghi – even though the Competitiveness Compass is set to be based on his report. The report, for example, wants the EU to set up its own breakthrough innovation agency, modelled on the US’s Defense Advanced Research Projects Agency. It also wants the ERC to fund EU chairs for “top researchers”. Following her speech, MEPs approved von der Leyen’s picks for commissioner, as expected following weeks of haggling among the parliament’s political groups. 370 voted for, 282 against, and 36 abstained. They will take office on 1 December.
Study detects a synergistic effect making the substances more dangerous, raising significant alarm given that humans are exposed to both.
Microplastics and persistent materials known as ‘forever chemicals’ are two of our most concerning modern pollution problems. Now new research has shown how their impact on the environment drastically increases when combined. A team from the University of Birmingham in the UK looked at the effects of microplastics and PFAS (per- and polyfluoroalkyl substances) on Daphnia magna water fleas, both separately and mixed together.
Exposing Daphnia to both pollutants together under laboratory conditions caused up to 41 percent more damage to the water fleas than the plastics and forever chemicals did separately. Those effects included stunted growth, delayed sexual maturity, and fewer offspring, with the severity of harm greater in tests on water fleas that had previously been exposed to other chemical pollution, suggesting a cumulative effect. “It is imperative that we investigate the combined impacts of pollutants on wildlife throughout their lifecycle to get a better understanding of the risk posed by these pollutants under real-life conditions,” says environmental scientist Mohamed Abdallah. “This is crucial to drive conservation efforts and inform policy on facing the growing threat of emerging contaminants such as forever chemicals.”
Microplastics are fragments of plastic less than 5 millimeters (0.2 inches) across, which accumulate in the environment as a result of the break-up of larger materials or the shedding of synthetic fibres.
Though the extent of their effects on ecosystems and human health isn’t fully known, research suggests there is good reason to be concerned as they increasingly spread, both into the most remote spots on Earth and deep inside our own bodies. PFAS, meanwhile, are used in a multitude of manufacturing processes for their fire-suppressing qualities, and have been linked to kidney damage and cancer growth. Slow to break down, these pollutants have been found in wildlife and falling rain – as ubiquitous in our environment as microplastics.
The study was designed to simulate D. magna’s potential exposure to both of these toxins in the natural world. These water fleas are a key part of the aquatic food chain, as well as a useful indicator of environmental pollution. “Our research paves the way for future studies on how PFAS chemicals affect gene function, providing crucial insights into their long-term biological impacts,” says evolutionary systems biologist Luisa Orsini.
Identifying the impact of individual pollutants is challenging, let alone deciphering their impact on the environment when combined. With improvements in analytical methods and technology, the researchers hope we can better quantify the harm caused by numerous pollutants under more complex circumstances. “These findings will be relevant not only to aquatic species but also to humans, highlighting the urgent need for regulatory frameworks that address the unintended combinations of pollutants in the environment,” says Orsini. “Understanding the chronic, long-term effects of chemical mixtures is crucial, especially when considering that previous exposures to other chemicals and environmental threats may weaken organisms’ ability to tolerate novel chemical pollution.”
Air pollution in Europe has declined over the past two decades but remains one of the greatest environmental health threats. This is the message from the United Nations Environment Programme (UNEP) on the International Day of Clean Air for Blue Skies, celebrated on 7 September. Almost all European city-dwellers (96%) are exposed to concentrations of fine particles well above the World Health Organization (WHO) guidelines. In 2021, this reference threshold was set at 5 µg/m³, a level deemed to pose no health risk.
Fine particles come from solid fuels used for domestic heating, industrial activities, and road transport. In 2022, only Iceland recorded a national average of fine particle concentrations below the WHO guide value, according to the European Environment Agency (EEA). On the other hand, Croatia, Italy and Poland recorded concentrations above the current European Union (EU) limit of 25 µg/m³ in 2022—a standard five times higher than the WHO threshold. Fine particles, one of the worst contributors to air pollution, caused the premature death of almost 4 million people in 2019, according to UNEP. East Asia and Central Europe are the hardest hit. The European Commission reports that this type of pollution causes 300,000 premature deaths annually in the EU.
UNEP’s “Air Pollution Note”, based on 2019 data, paints a mixed picture for the EU. In France and Belgium, the population is exposed to levels of fine particles 2.2 and 2.6 times higher than the WHO threshold (5 µg/m³). In Italy, Hungary and Romania, the average level (16 µg/m³) is 3.2 times higher than the WHO guideline. In Poland, fine particle pollution is twice as high as in Belgium (23 µg/m³, 4.6 times the WHO threshold). Peaks in concentrations of fine suspended particles in Serbia, Bosnia-Herzegovina and North Macedonia reached levels 5 to 6 times higher than the WHO threshold.
Legislation was adopted by the EU in 2016, via the new National Emission Ceilings Directive (“NEC” Directive). This legislation is in force in each of the 27 member states for five primary air pollutants: sulfur dioxide (SO2), nitrogen oxides (NOx), non-methane volatile organic compounds (NMVOCs), ammonia (NH3) and fine particulate matter (PM2.5). However, by 2022, only 16 member states had met their national commitments for the period 2020-29, according to the EEA. The remaining 11 countries missed the targets for at least one of the five primary pollutants.
The challenge remains for all EU countries, particularly in the case of ammonia. The agricultural sector is the main source of this air pollutant, and these emissions have fallen only slightly since 2005. The Green Deal’s “zero pollution” action plan targets reducing premature deaths caused by fine particles by at least 55% by 2030, compared with 2005 levels. It also includes a long-term objective of no significant impact on health by 2050. In March 2024, the European institutions reached a political agreement on ambient air quality to bring EU standards closer to the WHO reference thresholds. Thus, from 1 January 2030, the new European standard will be 10 µg/m³ for fine particles.
Dengue, the most common mosquito-borne disease in the world, is having a record-breaking year.
Climate change is placing more people at risk of mosquito-borne diseases such as dengue fever, extending the seasonal window and creating frequent outbreaks that will become increasingly difficult to deal with, experts have said. The geographic range of vector-borne diseases has expanded rapidly in the past 80 years, placing more than half the world’s population at risk. Experts warn that due to accelerating global warming and urbanisation, cases are set to spread across currently unaffected parts of northern Europe, Asia, North America and Australia over the next few decades.
As a result, an additional 4.7 billion people will be placed in danger if emissions and population growth continue to rise at their current rate. The warning – which will be shared in a presentation at this year’s Global Congress held by the European Society of Clinical Microbiology and Infectious Diseases – comes after a report by The National last month revealed global cases of dengue are rising sharply, with a number of Arab nations in particular experiencing an increase in cases in recent months.
Prof Rachel Lowe of the Catalan Institution for Research and Advanced Studies will tell the forum substantially more people will be placed at risk of mosquito-borne diseases due to warming temperatures. She said: “Global warming due to climate change means that the disease vectors that carry and spread malaria and dengue can find a home in more regions, with outbreaks occurring in areas where people are likely to be immunologically naive and public health systems unprepared.
“The stark reality is that longer hot seasons will enlarge the seasonal window for the spread of mosquito-borne diseases and favour increasingly frequent outbreaks that are increasingly complex to deal with.”
Dengue cases have been largely confined to tropical and subtropical regions because colder temperatures kill the mosquito’s larvae and eggs. But longer, hotter seasons have resulted in dengue becoming the most rapidly spreading mosquito-borne viral disease in the world. Nine of the 10 most hospitable years for dengue transmission have occurred since 2000, resulting in the disease spreading in 13 European countries, with localised transmission in France, Italy and Spain last year. According to the World Health Organisation, the number of dengue cases reported has increased in the past two decades from 500,000 in 2000 to more than 5 million in 2019.
By combining disease-carrying insect surveillance with climate forecasts, researchers are developing ways to predict when and where epidemics might occur and direct interventions to the most at-risk areas in advance. One similar project, which is being led by Prof Lowe, is using a powerful supercomputer to understand how climate and disease transmission are linked to predict mosquito-borne disease outbreaks in 12 countries. “By analysing weather patterns, finding mosquito breeding sites with drones and gathering information from local communities and health officials, we are hoping to give communities time to prepare and protect themselves,” she said. “But ultimately, the most effective way to reduce the risk of these diseases spreading to new areas will be to dramatically curb emissions.”
Scientists and whale watchers have recently spotted killer whales off the coast of Washington state in the US swimming with dead fish on their heads.
In September 2024, scientists and whale watchers spotted orcas (Orcinus orca) in South Puget Sound and off Point No Point in Washington State swimming with dead fish on their heads. This is the first time they’ve donned the bizarre headgear since the summer of 1987, when a trendsetting female West Coast orca kickstarted the behavior for no apparent reason. Within a couple of weeks, the rest of the pod had jumped on the bandwagon and turned salmon corpses into must-have fashion accessories, according to the marine conservation charity ORCA — but it’s unclear whether the same will happen this time around.
The motivation for the salmon hat trend remains a mystery. “Honestly, your guess is as good as mine,” says Deborah Giles, an orca researcher at the University of Washington who also heads the science and research teams at the non-profit Wild Orca. Salmon hats are a perfect example of what researchers call a “fad” — a behavior initiated by one or two individuals and temporarily picked up by others before it’s abandoned. Back in the 1980s, the trend only lasted a year; by the summer of 1988, dead fish were totally passé and salmon hats disappeared from the West Coast orca population.
Orca researchers’ best guess is that salmon hat fads are linked to high food availability. South Puget Sound is currently teeming with chum salmon (Oncorhynchus keta), and with too much food to eat on the spot, orcas may be saving fish for later by balancing them on their heads. Orcas have been spotted stashing food away in other places, too. “We’ve seen mammal-eating killer whales carry large chunks of food under their pectoral fin, kind of tucked in next to their body,” Giles said. Salmon is probably too small to fit securely under orcas’ pectoral fins, so the marine mammals may have opted for the top of their heads instead.
Camera-equipped drones could help researchers monitor salmon hat-wearing orcas in a way that was not possible 37 years ago. “Over time, we may be able to gather enough information to show that, for instance, one carried a fish for 30 minutes or so, and then he ate it,” Giles said. But the food availability theory could be wrong — if the footage reveals that orcas abandon the salmon without eating them, researchers will be sent back to the drawing board. Whatever the reason for the behavior, Giles said it’s been fun to watch it come back in style. “It’s been a while since I’ve personally seen it,” she said.
Brain organoids provide a means for researchers to study different mechanisms in the developing brain, and to investigate how they function at the molecular level. We spoke to Professor Martin Røssel Larsen about how he is using organoids to investigate the role of sialic acids in early brain development, and the long-term consequences of any malformations.
Sialic acids, a family of negatively charged monosaccharides, play an important role in intermolecular and intercellular interactions, and in the development of neurons. For example, the presence of a post-translational modification (PTM) called polysialic acid (PSA) on neural cell adhesion molecules (NCAM) prevents neurons from interacting with each other. “These cells are therefore effectively pushed away from each other by the negative charge of PSA, and then free to migrate. This is very important in early brain development, where cells migrate to organise and generate axons, dendrites, and other cell types,” explains Martin Røssel Larsen, Professor of Molecular Biology at the University of Southern Denmark (SDU), where he leads a research group. As Principal Investigator of several research projects based at SDU, Professor Røssel Larsen aims to build a deeper picture of the role of sialic acids in early brain development, work in which he is also considering the brains of other species.
“The biggest difference between humans and chimpanzees is sialic acid,” he continues. “There is an enzyme that takes human sialic acid and puts on an OH group, then you get the sialic acid we find in monkeys.”
As humans we all have the gene that leads to the addition of an OH group, although it doesn’t function due to a mutation. The project team is working to correct that specific gene in stem cells; the plan is then to investigate whether the introduction of monkey-type sialic acid leads to changes in the human brain, using brain organoids grown from induced pluripotent stem cells. “These organoids can be thought of as mini-brains reflecting early brain development. They represent just one part of the brain, the cortical layer,” outlines Professor Røssel Larsen. Researchers are using sophisticated imaging techniques in this work. “If we change the sialic acid, do we get a different morphology of our mini-brains? Can we correlate that to the size of the cortical layers? It is believed that the size of the cortex is related to the learning and memory differences that we see between monkeys and humans,” continues Professor Røssel Larsen. “To study this specific molecular change between humans and chimpanzees, we aim to introduce the mutation of this gene in chimpanzee stem cells, so it doesn’t function. Then we make organoids and look at whether that mutation leads to changes in the size of the cortex and further changes in proteins and their PTMs, especially sialylation.”
Researchers in the project are using imaging techniques as well as mass spectrometry to look at changes in sialic acid in the cortical layers, then relate it back to the function of specific proteins. A further strand of research involves working with nerve terminals – these are small, active compartments which are sometimes called synaptosomes. “We can stimulate these nerve terminals, and they will release neurotransmitter, then they will take up a vesicle again and perform synaptic transmission repeatedly,” explains Professor Røssel Larsen. A method has now been developed to isolate nerve terminals from brain organoids, meaning they can now be taken into a human system, which opens up new avenues of investigation, says Professor Røssel Larsen. “We can manipulate stem cells, make brain organoids from these, and then see if there is a change in the nerve terminals,” he outlines. “We’re looking at whether we can see changes if we manipulate enzymes that add sialic acids - sialyltransferases. How is sialic acid used in this very sophisticated brain system? Is it a way of fine-tuning interactions? It is likely that protein-protein interaction is heavily regulated by sialic acid on cell surfaces.”

“If we change the sialic acid, do we get a different morphology of our mini-brains? Can we correlate that to the size of the cortex? Many people believe that the size of the cortex is related to the learning and memory differences that we see between monkeys and humans.”
The hope is that this research will uncover morphological changes in the brain that can be related to changes in sialic acid, while Professor Røssel Larsen also hopes to identify the targets of sialyltransferases in the brain. By knocking down, for example, the 2,6-sialyltransferases - another distinct difference in sialic acid biology between humans and chimpanzees - Professor Røssel Larsen and his team hope to see exactly which sites and which proteins they change. “We can then find substrates for these sialyltransferases, and correlate this with any morphological changes in the brain,” he explains. Sialic acids are also thought to play a major role in immunity and in the progression of certain conditions, including certain forms of cancer and schizophrenia, another topic that Professor Røssel Larsen is addressing in his research. “We think that if we want to look at diseases like schizophrenia, we need to move away from only focusing on the genes to look at what is present in the brain at the time with respect to the building blocks that do the work in the cell, the proteins,” he outlines. “In our interdisciplinary DEVELOPNOID project we have taken plasma from patients with schizophrenia and controls. We then reprogrammed the blood cells from patients to stem cells, and we are now growing brain organoids to investigate the underlying molecular mechanisms leading to schizophrenia, using ‘omics technologies and imaging.”
This research could lead to new insights into the underlying mechanisms behind schizophrenia and other diseases, and ultimately to the development of new, more effective treatments. The project’s research at this stage is largely fundamental in nature, however, with the team focused primarily on uncovering the role of sialic acids in the brain. “If we can identify the role of sialic acids in the brain it would be a very big achievement, it would be really exciting,” says Professor Røssel Larsen. This is still a relatively neglected area of research, and Professor Røssel Larsen hopes that the project’s work will stimulate further interest and development. “We hope there will be a renewed focus on PTMs like glycosylation, which have been overlooked for a long time. Not many people work on glycosylation in Denmark, and not a lot is known about its true function, partly because the technology is not there yet,” he explains. “We want to contribute to continued development, and to use brain organoids to investigate other diseases beyond schizophrenia. Brain organoids are a very effective model system, they cover more ground than 2D systems.”
The brain comprises many different pathways that communicate with each other, yet how this communication occurs is not well understood. More sophisticated brain organoid models could help researchers gain a fuller picture. “It is possible now to make different brain organoids from several parts of the brain, then put them together and let them grow together. Then you will see that the neurons start migrating in and out of the different organoids,” says Professor Røssel Larsen. A lot of attention in the research group will be focused in future on developing methods to characterise PTMs, in particular glycosylation, a topic of deep interest to Professor Røssel Larsen. “It’s very difficult to measure and analyse changes in the glycostructure, but it’s also very fascinating,” he stresses.
Interdisciplinary project DEVELOPNOID
Project Objectives
Protein glycosylation is important for communication between cells and for cell migration, mechanisms essential for the development of the brain. A big difference between us and chimpanzees is the sugar molecule sialic acid (SA) on cell surface proteins. SA is involved in neural development, and in the present project we will investigate the role of SAs in early brain development using brain organoids, multi-omics and imaging.
Project Funding
The ‘Role of sialic acids in early brain development’ project is funded by the Independent Research Fund Denmark.
Project Partners
• Prof. Kristine Freude, Department for Veterinary and Animal Science, University of Copenhagen, DK.
• Dr. Madeline A. Lancaster, MRC Laboratory of Molecular Biology, Cambridge Biomedical Campus, Cambridge, UK.
• Prof. Jonathan Brewer, Department of Biochemistry and Molecular Biology, University of Southern Denmark, DK.
Contact Details
Project Coordinator,
Professor Martin Røssel Larsen
Department of Biochemistry and Molecular Biology
University of Southern Denmark (SDU), Campusvej 55, DK-5230 Odense M
T: +45 6550 1872
E: mrl@bmb.sdu.dk
W: https://dff.dk/ansog/stottet-forskning/podcasts/celler-med-sukker-pa
Martin Røssel Larsen
Martin Røssel Larsen is Professor of Molecular Biology at the University of Southern Denmark, where he leads a research group. He is internationally recognized for his work in developing methods for characterising post-translational modifications (PTMs) of proteins and in bridging biological mass spectrometry and biomedical research.
The complement system is an important part of the body’s defence against pathogens, and malfunctions can leave people more susceptible to disease. The SciFiMed project team are developing new tools to analyse the FH family of proteins and build a deeper picture of diseases associated with dysregulation of the complement system, as Professor Diana Pauly explains.
The complement system forms an important part of the body’s defence against pathogens, reacting rapidly to the presence of foreign material, while also identifying and tagging cellular debris for removal by phagocytes. When this system is dysregulated or out of balance, it can leave individuals more susceptible to diseases. “A malfunctioning complement system can heighten susceptibility to infections, such as meningococcal infections. Conversely, an overactive complement system can lead to autoimmune diseases, where the body attacks itself,” explains Professor Diana Pauly, group leader of the experimental ophthalmology lab at Philipps University Marburg in Germany. While the complement system has been intensively studied over the years, some of the regulatory mechanisms are still not fully understood, a topic at the heart of the EU-backed SciFiMed project. “We are investigating the role of complement factor H (FH) and FH-related proteins (FHRs), which are key regulators of the complement system. FH is one of the most important inhibitors, while FHR proteins share structural domains with FH,” says Professor Pauly, the project’s Principal Investigator.
There are generally considered to be five main FHR proteins. Researchers in the project aim to investigate their functions and gain a deeper understanding of their roles in disease development and progression. The FHRs share some structural similarities with FH, but Professor Pauly says there are also some clear differences. “The FH protein has some regulatory domains which are not present in the FHRs, while some of the FHRs have dimerization domains which are not present in the others,” she outlines. The project team is working to build a fuller picture of the impact of FHR proteins on disease, looking beyond genetic associations. “We know that changes in FHRs are associated with the development of specific diseases – or protection against others – especially in the eye and the kidney. But we only know that on a DNA basis,” explains Professor Pauly. “We have developed a variety of new tools to analyse each of the individual proteins in this FH family and get an idea of their function. We have commercialised new ELISAs (Enzyme-Linked Immunosorbent Assays) to detect FHR proteins.”
This allows researchers to identify cases where the concentration of these proteins is particularly high or low, and then investigate the impact on disease. Specific antibodies have also been generated through project partner, Sanquin, allowing researchers to distinguish between each of the proteins, which Professor Pauly says is an important development. “These tools, which are now available commercially through our partner Hycult, will be instrumental in advancing research,” she says. These antibodies are already proving invaluable for investigating the function of FHR proteins, while other tools are also under development in the project. “We have used these antibodies to stain FHR proteins in specific tissues for example, and to isolate them from tissue extracts for interaction studies. This helps us to identify the regulatory processes in which they are involved,” explains Professor Pauly. “We believe that this research will extend beyond SciFiMed, paving the way for many new discoveries.”
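ELISA absorbance readings are usually converted to protein concentrations via a standard curve, most commonly a four-parameter logistic (4PL) fit. The sketch below illustrates this general approach only; the parameter values are invented for illustration and are not SciFiMed calibration data.

```python
def fourpl(conc, a, b, c, d):
    """4PL model: absorbance as a function of concentration.
    a = response at zero concentration, d = response at saturation,
    c = inflection point (EC50), b = slope factor."""
    return d + (a - d) / (1.0 + (conc / c) ** b)

def inverse_fourpl(absorbance, a, b, c, d):
    """Invert the 4PL curve to estimate concentration from absorbance."""
    return c * ((a - d) / (absorbance - d) - 1.0) ** (1.0 / b)

# Hypothetical curve parameters, as if fitted from calibration standards.
a, b, c, d = 0.05, 1.2, 40.0, 2.0

reading = fourpl(25.0, a, b, c, d)       # simulate a sample at 25 ng/mL
estimate = inverse_fourpl(reading, a, b, c, d)
print(round(estimate, 3))                # recovers 25.0
```

In practice the four parameters would be fitted to a dilution series of known standards before the inverse function is applied to patient samples.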
The development of ELISAs, as well as lateral flow assays using these new antibodies, opens up wider possibilities. Researchers can now conduct clinical studies across a variety of biomaterials for different diseases, where previously only genetic associations with these proteins had been established. “This allows us and other researchers to explore a wide range of diseases. Initially, our partners tested healthy individuals to establish baseline values, and we have conducted preliminary studies in patients with age-related macular degeneration, delirium, and stroke, showing promising trends that merit further exploration,” says Professor Pauly. The complement system is known to be involved in a broad spectrum of autoimmune diseases, and Professor Pauly says functional assays can help researchers build a fuller picture. “We know a fair amount about the complement system’s role in infections, but less about autoimmune diseases. These assays could help researchers find out how the complement system is involved in autoimmune diseases,” she outlines.

This research represents a step towards the goal of providing personalised care that more closely reflects the specific circumstances of individual patients. Clinical studies (although not on a personalised basis) are planned, which it is hoped will lead to the identification of biomarkers for disease development and treatment effectiveness. “This will help doctors provide more personalised treatment,” continues Professor Pauly. Researchers are also working to develop lateral flow assays capable of rapidly providing information to clinicians on the levels of FH and FHR proteins, which can then guide treatment. “We can imagine a scenario where blood samples, urine, or other samples that need FHR concentration measurements no longer have to be sent to central laboratories, with a waiting time of a week or more for the results,” says Professor Pauly. “With lateral flow assays, doctors or medical technicians can check the FHR status directly on-site, allowing for immediate adjustments to potential therapies.”

A significant degree of progress has been made in this respect, and work is ongoing to improve the lateral flow assays, with the aim of moving towards commercialisation in the future. Currently the project’s lateral flow assay is capable of detecting three proteins - FHR-2, FHR-3, and FHR-4 - within the FH family in 15 minutes, while other assays are at different stages of development. “The quantitative assays and the antibodies have been commercialised and are available for researchers, while we are also developing at least two functional assays for the whole protein family,” outlines Professor Pauly.

“A malfunctioning complement system can heighten susceptibility to infections, such as meningococcal infections. Conversely, an overactive complement system can lead to autoimmune diseases, where the body attacks itself.”

The overarching goal in this research is to help improve the treatment of diseases associated with dysregulation of the complement system. The tools developed in the project will help researchers build a deeper understanding of these diseases, believes Professor Pauly. “With the ELISA and the antibodies, we hope that researchers will push the science forward, conducting studies and uncovering the role of these proteins in infections and autoimmune diseases,” she says. This will ultimately contribute to the development of new, more effective therapies against a variety of different diseases. “The importance of the complement system is increasingly recognised, especially with the growing number of complement-targeting drugs that have been approved. This represents a significant opportunity for combating various diseases,” says Professor Pauly.
Screening of inFlammation to enable personalized Medicine
Project Objectives
The function of the FH-related (FHR) proteins, important components of the complement system, is unclear, a topic at the heart of the SciFiMed project. The project consortium, bringing together eight partners from across Europe, is working to develop tools to analyse these proteins, which will ultimately help researchers develop more effective therapies against a wide variety of diseases.
Project Funding
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 899163.
Project Partners
SciFiMed brings together an excellent multidisciplinary team of Engineers, Chemists, Geneticists, Immunologists, and Physicians from both academia and industry from four different countries. https://www.scifimed.eu/our-team
Contact Details
Project Coordinator,
Professor Diana Pauly, Ph.D
Exp. Ophthalmologie, Augenheilkunde AG Pauly
AWT-Nummer: 3911
Baldingerstraße, 35043 Marburg
Germany
T: +49 170 2738864
E: diana.pauly@uni-marburg.de
W: https://www.scifimed.eu/
W: www.paulylab.de
Professor Diana Pauly is a research scientist and group leader at the Philipps University Marburg in Germany. She holds a Ph.D in Biotechnology from the University of Potsdam, and a degree in Human Biology, specialising in immunology, from the University of Greifswald.
We spoke to Prof. Marianna Kruithof-de Julio and her team at the University of Bern about their research, which involves using patient-derived organoids to improve bladder cancer treatment. Their work aims to tailor therapies based on individual tumor profiles, addressing the disease’s high heterogeneity and recurrence rates.
Bladder cancer ranks as the tenth most prevalent cancer worldwide, with approximately 550,000 new cases and 200,000 deaths each year, making it a substantial public health challenge. Despite advancements in treatment, its heterogeneity and high recurrence rate pose ongoing difficulties, highlighting the demand for more effective and personalised treatments. Prof. Marianna Kruithof-de Julio and her team at the University of Bern are developing patient-derived organoids from bladder cancer tumors to tackle these issues. These small clusters of cells mimic the genetic and molecular traits of the original tumors, providing a promising path for personalised cancer treatment.
Bladder cancer is characterised by a high heterogeneity between patients which often contributes to treatment failures. Molecular and histological differences highlight the need for personalised treatment approaches. Pathologically, the disease is classified into two main types: non-muscle invasive bladder cancer (NMIBC) and muscle-invasive bladder cancer (MIBC). NMIBC accounts for around 70% of cases and is typically treated with transurethral resection of the bladder (TUR-B) and intravesical instillations of chemotherapy, often in combination with the Bacillus Calmette–Guérin (BCG) vaccine. However, many of these tumors will recur, and about 10% will progress to MIBC, a highly aggressive tumor with a high mortality rate. MIBC is usually treated with systemic cisplatin-based chemotherapy followed by radical cystectomy, a highly invasive procedure that is only beneficial for 30-40% of patients.
Patient-derived organoids have been successfully used to predict drug response in vitro for various cancers, such as ovarian, pancreatic, and gastrointestinal cancers. Prof. Kruithof-de Julio and her research team have successfully grown organoids from various bladder cancer stages and grades that preserve the histological and molecular heterogeneity of the original tumors. In their study, the team created the organoids from specimens obtained through transurethral resection and cystectomy. The organoids were successfully cultured regardless of the tumor’s stage, grade, or histological pattern. The organoids maintained a similar level of cellular proliferation compared to the original tumors and mirrored the tumor’s evolution over time, allowing researchers to observe how the cancer changes and adapts.

The researchers have created a drug screening method to evaluate both standard-of-care drugs and FDA-approved cancer drugs on patient-derived organoids. By examining how these organoids react to the drugs and comparing their drug response profiles with their genetic information, the researchers have identified biomarkers that could forecast therapy response and resistance. This approach aids in understanding how mutations render therapies ineffective and how tumors develop resistance to treatment over time. It can also determine which treatments might be effective based on the genetic composition of a patient’s tumor. However, before organoids can be broadly implemented in clinical settings, they must undergo further validation through clinical trials. Prof. Kruithof-de Julio highlights that the success achieved with organoids in cancer treatment can potentially be applied to human patients.

“With the use of this technology, we can evaluate a broad spectrum of drugs to identify the most promising options for each individual patient.”

One of the most significant findings is the heterogeneity in organoid drug responses. For example, in the few NMIBC organoids tested, the standard chemotherapy drug mitomycin C showed little significant effect, whereas epirubicin was effective in 60% of the samples tested. In contrast, MIBC organoids displayed a higher sensitivity to the cisplatin/gemcitabine combination, which is a common treatment regimen for this aggressive form of cancer. “These findings underscore the significance of personalised treatment plans. Identifying which drugs are most effective for particular genetic profiles can enhance patient outcomes and minimise the trial-and-error method that currently dominates cancer treatment,” explains Prof. Kruithof-de Julio.
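Per-patient comparisons of this kind can be summarised by counting the fraction of organoid samples in which a drug pushes viability below a response threshold. The drug names below match the article, but all viability numbers and the 50% threshold are invented for illustration.

```python
# Hypothetical viability readings (fraction of untreated control)
# for a handful of patient-derived organoid samples per drug.
screen = {
    "mitomycin C":           [0.92, 0.88, 0.95, 0.90, 0.85],
    "epirubicin":            [0.35, 0.42, 0.81, 0.30, 0.78],
    "cisplatin/gemcitabine": [0.25, 0.40, 0.33, 0.55, 0.28],
}

THRESHOLD = 0.5  # call a sample "responsive" if viability drops below 50%

def response_rate(viabilities, threshold=THRESHOLD):
    """Fraction of organoid samples responding to a drug."""
    hits = sum(1 for v in viabilities if v < threshold)
    return hits / len(viabilities)

for drug, values in screen.items():
    print(f"{drug}: {response_rate(values):.0%} of samples respond")
```

With these made-up numbers, epirubicin responds in 3 of 5 samples (60%) while mitomycin C responds in none, mirroring the kind of contrast the team reports.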
The researchers analysed the genetic mechanisms behind drug resistance. They found that organoids derived from tumors with high genomic burdens, such as those from high-grade invasive cancers, showed different responses than those from low-grade tumors. In one case, the team analysed organoids from a patient who experienced multiple recurrences of NMIBC. By comparing organoids from different stages of the disease, they could track how the tumor evolved and became resistant to treatment. This longitudinal analysis provides a deeper understanding of the disease and its evolution. In another case, organoids from a patient diagnosed with pan-urothelial disease were tested with various drugs. The organoids showed a high sensitivity to lapatinib, a drug targeting the EGFR/ERBB2 pathway, which is frequently altered in bladder cancer. This finding suggests that lapatinib could be a viable option for patients with similar genetic profiles, which highlights the potential for repurposing existing drugs for more personalised treatments.
“Organoids allow us to test a wide array of drugs and pinpoint the most promising options for each patient,” says Prof. Kruithof-de Julio. “This approach not only saves time and resources but also reduces the burden on patients who would otherwise endure multiple treatment trials.” The research team is now intent on further refining their organoid models and increasing their drug screening capabilities. They aim to include a wider range of drugs and investigate new therapeutic targets. Additionally, they are working on enhancing the scalability of organoid cultures to make this technology more accessible for clinical use.
Project Objectives
The research project focuses on cultivating patient-derived organoids from bladder cancer tumors to develop personalised treatment plans. By growing organoids replicating the original tumors’ genetic and molecular characteristics, the team aims to identify effective drug therapies for individual patients. The study also seeks to understand the genetic mechanisms behind drug resistance and the evolution of bladder cancer, ultimately aiming to translate these findings into clinical practice for improved patient outcomes.
Project Funding
Development of a platform for GU cancer patient-derived organoids OC-2019003 3RCC / InoSwiss 101.951 IP-LS awarded to Marianna Kruithof-de Julio.
Project Partners
The team includes Prof. Marianna Kruithof-de Julio, Prof. Bernhard Kiss, Prof. Beat Roth, and Prof. George Thalmann from the Department of Urology of the University Hospital Bern, Prof. Roland Seiler-Blarer from the Department of Urology of the Spitalzentrum Biel-Bienne, and Dr. Marta De Menna, Deputy Director of the Translational Organoid Resource Core at the Department for BioMedical Research of the University of Bern. Their combined expertise drives this innovative research in personalised cancer treatment and precision oncology.
Contact Details
Professor Marianna Kruithof-de Julio, PhD
Head of the Urology Research Laboratory
Department of Urology & Department for BioMedical Research
Director, Organoid CORE facility
University of Bern, Murtenstrasse 24, 3008 Bern, Switzerland
E: marianna.kruithofdejulio@unibe.ch
Professor Kruithof-de Julio is a researcher at the University of Bern, specializing in precision medicine within urology. She leads the Urology Research Laboratory and the Organoid CORE facility, focusing on advanced tools for precision medicine. Her team has successfully developed patient-derived organoids for bladder, prostate, and pancreatic cancers, providing critical models for better understanding these diseases. They also utilize microvascular on-chip chambers to study the functional potential of single cells. What distinguishes her research group is their holistic approach, examining tumor cells, stroma, immune cells, and vasculature together to gain a comprehensive view of cancer and its progression.
As natural and biodegradable agents, enzymes are key to advancing sustainable, green manufacturing practices across diverse sectors. These natural catalysts facilitate oxidation reactions by transferring oxygen or electrons, speeding the reactions up so that a wide variety of products can be made without using high temperatures or harsh chemicals, as Professor John Woodley explains.
The majority of oxidation reactions carried out in industrial chemistry involve the use of environmentally harmful oxidants, while the reactions themselves are not particularly selective, meaning many of these reactions are not sustainable. Nevertheless, oxidations are amongst the most important chemical reactions in industry. Fortunately, enzymes found in nature could provide a more sustainable way of catalysing some of these reactions (for example in producing pharmaceuticals), and researchers are exploring their wider potential. “There is a lot of interest in using enzymes to carry out and catalyse reactions in industrial reactors and process plants,” explains John Woodley, Professor of Chemical and Biochemical Engineering at the Technical University of Denmark (DTU), and Principal Investigator of the FLUIZYME project. Enzymes catalyse a wide variety of reactions in the natural world, but they may lose their structure and stability in an industrial reactor, which then affects how they function. “Enzymes are large molecules, made up of lots of amino acids, which form into a particular structure (or shape). Enzymes need to maintain their structure in order to maintain stability,” says Professor Woodley. “And when an enzyme loses stability, it can no longer perform at the same level as it did initially.”
This may occur naturally, as a result of higher temperatures or changes in the pH of the solution for example, while other factors may also be involved. The FLUIZYME project team is investigating whether the rate at which this transition takes place is accelerated in industrial environments, using scale-down apparatus designed to mimic the conditions in larger reactors. “For example, we put bubbles containing oxygen in at the bottom of a vertical tube, which then rise to the top. We do this in a very defined way, so we know there’s a bubble coming at a specific point, and then a second will come later on,” outlines Professor Woodley. The bubbles rise to the top and the enzyme inside the tube is then exposed to the bubble, with Professor Woodley and his colleagues investigating changes in enzyme stability in this environment. “We know the size of the bubbles and how much protein is in there as well. We can measure the activity of the enzyme before and afterwards, and we can also look at the structure of the enzyme before and afterwards, and so on,” he says. “We expose enzymes to these different conditions for periods of around 70 hours, and over that time we see that they lose structure and therefore stability.”
The project team works on many enzymes, but mostly oxidases, such as NADPH oxidase and HMFO oxidase, alongside a number of variants. One of the standard ways of measuring the stability of these enzymes has been to heat them up and observe what happens to the structure until it reaches its melting temperature, at which point it essentially falls to pieces. “To date, this so-called melting temperature has been used as an indicator of the stability of an enzyme. Now we are trying with these variants to measure their melting temperatures, and to correlate that with their stability in this bubble apparatus,” explains Professor Woodley. This approach has already yielded some interesting insights. “We’ve seen that an enzyme in solution gets pulled towards the bubble, and likes to stick on the bubble. This may not seem surprising in itself, but this happens to quite a marked degree,” continues Professor Woodley. “We think that hydrophobic amino acids buried inside the enzyme are dragged towards the gas-liquid interface (bubble surface). This then has the effect of pulling the enzyme out of solution, so that it is no longer available for reactions to take place.”

“Enzymes are large molecules, made up of lots of amino acids, which form into a particular structure or shape. Enzymes need to maintain their shape in order to catalyse reactions.”
This then raises the question of whether introducing a substance like a surfactant to the solution would have a positive effect, so that an enzyme would no longer stick to the bubble, which could be an interesting avenue of investigation. In future, Professor Woodley and his colleagues plan to test many more enzymes, and build a mathematical model of their behaviour. “We are in the process of building a model which can be used to make predictions. We hope to be able to predict whether a particular enzyme with certain properties will have problems with the bubbling, or if we need to control the bubbling in some way,” he says. This is with a view to maintaining the stability of an enzyme and enabling more efficient industrial biocatalysis. “A big part of the cost of implementing industrial biocatalysis processes relates to enzyme stability. In many cases, the more stable an enzyme is, the lower the process cost,” continues Professor Woodley. “This research is very relevant to the industrial application of enzymes.”
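Before-and-after activity measurements of the kind described above are often summarised with a first-order deactivation model. The sketch below is a simplified illustration with invented numbers, not the project’s actual model.

```python
import math

def deactivation_constant(a0, a_t, hours):
    """First-order deactivation A(t) = A0 * exp(-kd * t), solved for kd."""
    return math.log(a0 / a_t) / hours

def half_life(kd):
    """Time for activity to halve under first-order decay."""
    return math.log(2.0) / kd

# Hypothetical measurement: an enzyme retains 40% of its initial
# activity after 70 hours of exposure to bubbles in the apparatus.
kd = deactivation_constant(1.00, 0.40, 70.0)
print(f"kd = {kd:.4f} per hour, half-life = {half_life(kd):.1f} hours")
```

Comparing fitted deactivation constants with and without bubbling would quantify how much the gas-liquid interface accelerates the loss of stability.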
The wider backdrop to this research is the challenge of developing more environmentally friendly and sustainable methods of producing valuable chemicals (including pharmaceuticals). Biocatalysis represents a more sustainable approach than existing methods, yet Professor Woodley says it remains difficult to run a large-scale process. “Biocatalysis takes place in water, and the solubility of oxygen in water is [low, which is a challenge] for enzymes,” he explains. The project’s work holds wider importance in terms of running any process with proteins on larger scales, and Professor Woodley is looking to share the findings more widely. “A lot of the processes use proteins as catalysts (enzymes) or produce proteins as products. Potential exposure to air is always a risk in stirred tanks, so it’s a very big issue,” he says. “We need to understand how to avoid the unnecessary effects of air drawn from the surface, and also how to supply oxygen to oxidation reactions. This is a very different kind of environment to those typically considered in enzymology.”
This research could ultimately help improve the design of bioprocess plants, reduce the cost of implementing industrial processes, and address concerns around the sustainability of the chemical and biotechnology industries. Alongside these more long-term considerations, Professor Woodley is also keen to utilise the apparatus developed in the project, and contribute to the goal of building a deeper picture of the factors that affect the stability of different enzymes.
Understanding the Effect of Non-natural Fluid Environments on Enzyme Stability
Project Objectives
Fluizyme is an ERC Advanced Grant project with the overall objective of understanding the effect of new-to-nature conditions on enzyme stability. Such conditions occur when biocatalytic reactions are scaled up in industry, in large tanks with fluctuating concentrations and often with interfaces (gas-liquid and liquid-liquid). In particular, the focus in this project is on understanding the effect of the gas-liquid interface on the stability of oxygen-dependent enzymes, such as oxidases.
Project Funding
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under Grant agreement No 101021024.
Project Participants
DTU Chemical Engineering
Contact Details
Project Coordinator, Professor John Woodley
DTU Chemical Engineering
Department of Chemical and Biochemical Engineering
Søltofts Plads 228A, 2800 Kgs. Lyngby
Denmark
T: +45 5350 3747
E: jw@kt.dtu.dk
W: https://www.kt.dtu.dk/research/prosys/projects#fluizyme
John Woodley is Professor of Chemical Engineering at the Technical University of Denmark. He has a PhD from University College London (UK) and, after gaining industrial and academic experience, in 2007 he moved to Denmark. He leads research at the interface of process and protein engineering, and has a long-standing interest in biocatalysis, where enzymes are used for the synthesis of valuable chemicals. A dedicated teacher, he has educated more than 60 PhD students. He was co-chair of the GRC Biocatalysis conference in 2016 (USA) and will be co-chair of the EIC Enzyme Engineering conference in 2025 (Denmark). John is a Fellow of the Royal Academy of Engineering (UK).
New network models have been developed over recent years that could help researchers gain deeper insights into the spread of disease. We spoke to Professor Johannes Lengler about his work in refining geometric inhomogeneous random graphs (GIRGs), and how they could improve epidemic modelling and inform more effective quarantine strategies.
A variety of highly sophisticated network models have been developed over the years to look at the dynamics of social interaction, for example the way that rumours filter through a social network and how diseases spread across populations. One prominent type is the inhomogeneous network model, which can generate clear results. “For example, we see with inhomogeneous network models that information can spread very fast,” outlines Johannes Lengler, Professor of Computer Science at ETH Zurich. Another type is the geometric model, which, while capturing certain aspects of social networks in the real world, may generate results contradictory to those from inhomogeneous network models. “For example, if you look at the spread of epidemics, then it’s extremely fast if you look at the inhomogeneous models, then it’s very slow if you look at the other type, the geometric models. So this raises the question, which is the right one?” asks Professor Lengler.
This is a topic Professor Lengler is exploring further in the DynaGIRG project, an initiative backed by the Swiss National Science Foundation (SNSF), in which he and his colleagues are looking at the dynamics of social interaction in network models that combine these characteristics. This research centres around geometric inhomogeneous random graphs (GIRGs), a new type of network model which Professor Lengler says is highly flexible. “There is a lot of freedom in the model definition, and it’s quite a robust model,” he explains. In the project Professor Lengler is working to refine these GIRGs and build a deeper understanding of their component structure. “I am taking the GIRG model, trying to improve it, and to make it even more flexible,” he continues. “Another part involves looking at certain processes on the model, including bootstrap percolation, routing and the progression of epidemics.”
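A minimal sketch of how such a graph can be sampled, assuming the commonly used formulation with heavy-tailed vertex weights and uniform positions on a one-dimensional torus. The exponents and the connection-probability expression are illustrative conventions, not the project's exact definition:

```python
import random

def sample_girg(n, alpha=1.5, tau=2.8, seed=0):
    """Sample a toy 1-D geometric inhomogeneous random graph (GIRG).

    Each vertex gets a heavy-tailed weight and a uniform position on a
    circle; two vertices connect with a probability that grows with the
    product of their weights and shrinks with their distance."""
    rng = random.Random(seed)
    weights = [rng.paretovariate(tau - 1) for _ in range(n)]   # power-law weights
    positions = [rng.random() for _ in range(n)]               # points on [0, 1)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            d = abs(positions[i] - positions[j])
            d = min(d, 1.0 - d)                                # distance on the torus
            p = min(1.0, (weights[i] * weights[j] / (n * max(d, 1e-9))) ** alpha)
            if rng.random() < p:
                edges.add((i, j))
    return weights, positions, edges
```

Tuning `tau` changes how heavy-tailed the degree distribution is, while `alpha` controls how strongly edges are localised in space, which are the two knobs that blend "inhomogeneous" and "geometric" behaviour in one model.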
The project’s agenda also includes research into how information is transmitted through social networks, a topic addressed in Stanley Milgram’s famous ‘small-world’ experiment in 1967. In this experiment two people were randomly selected in two entirely different locations, and one was asked to get a letter to the other with only limited information. “They knew their address and a little personal information, but person A was not allowed to send the letter to person B directly. They were only allowed to send the letter to one of their friends, who was then asked to send the letter on to one of their friends, and so on. Was it possible with such a chain to reach B?” says Professor Lengler. This was indeed found to be possible, and letters needed just six hops on average before they reached their intended recipient. “Even if you have only a little information, you will be able to find a friend who is better at finding the intended recipient,” continues Professor Lengler.
This routing process has been modelled on GIRGs, and researchers have been able to essentially reconstruct the chain of intermediaries which passed on the letters, as well as to analyse what happens in each of the different phases. As part of his work in the project, Professor Lengler is now looking to check whether these insights also hold in terms of how information filters through social networks. “The most obvious question is, how many steps does it take to get to the intended recipient? We can also look at the route of messages and what happens when a lot of messages are sent. Do all these paths go through the same few vertices in the model, or are all these paths very different from each other?” says Professor Lengler. “We have found that these messages do indeed all go through a very small set of well-connected vertices, which is what we would expect to see in social networks.”
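The routing process can be mimicked with a simple greedy rule: every node hands the message to whichever neighbour is geometrically closest to the target. The tiny network and 1-D torus geometry below are hypothetical, purely to illustrate the mechanism:

```python
def torus_dist(a, b):
    """Distance between two points on the unit circle [0, 1)."""
    d = abs(a - b)
    return min(d, 1.0 - d)

def greedy_route(positions, neighbours, start, target):
    """Forward a message greedily towards the target's position.

    Returns the chain of intermediaries, or None if the message gets
    stuck at a node with no neighbour closer to the target."""
    path = [start]
    while path[-1] != target:
        here = path[-1]
        nxt = min(neighbours[here],
                  key=lambda v: torus_dist(positions[v], positions[target]))
        if torus_dist(positions[nxt], positions[target]) >= \
           torus_dist(positions[here], positions[target]):
            return None                      # stuck: the "letter" is lost
        path.append(nxt)
    return path

# A small chain of acquaintances: each hop moves closer to the target.
pos = {0: 0.0, 1: 0.1, 2: 0.3, 3: 0.45}
nbrs = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(greedy_route(pos, nbrs, 0, 3))          # [0, 1, 2, 3]
```

On a GIRG the interesting question is exactly the one posed above: how long these greedy paths are, and whether many of them funnel through the same few well-connected vertices.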
A parallel can be drawn here with the epidemiology case, where a single individual may play a particularly significant role in the spread of a disease, a so-called super-spreader. The first step towards becoming a super-spreader is to meet a lot of people, yet Professor Lengler says other factors are also involved. “A person can be a super-spreader if they have a high viral load. However, it’s not the case that if you have 100 times more contacts then you will also spread a disease to 100 times more people, and we study the effects of this,” he says. The project has largely focused on the role of people in spreading disease, but Professor Lengler also intends to look at the importance of events and meetings in future. “One type of meeting would be between close-by vertices, where one vertex invites others, and usually they know each other,” he continues. “Another type of event is where vertices which are very different from each other come together.”
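One way to see why multiplying contacts does not simply multiply onward transmission is a toy SIR run on a contact network, where a node can only infect each susceptible neighbour once and only while it is infectious. This is a generic simulation sketch with made-up parameters, not the project's model:

```python
import random

def sir_simulation(neighbours, patient_zero, p_transmit=0.05, t_infectious=3, seed=1):
    """Discrete-time SIR on a contact network (all parameters hypothetical).

    Returns how many secondary infections each node caused, which is how
    super-spreaders show up in such models."""
    rng = random.Random(seed)
    state = {v: "S" for v in neighbours}     # S: susceptible, I: infectious, R: recovered
    state[patient_zero] = "I"
    clock = {patient_zero: 0}                # time spent infectious
    caused = {v: 0 for v in neighbours}      # secondary infections per node
    while any(s == "I" for s in state.values()):
        newly = []
        for v, s in state.items():
            if s != "I":
                continue
            for u in neighbours[v]:
                if state[u] == "S" and rng.random() < p_transmit:
                    newly.append((u, v))
            clock[v] += 1
            if clock[v] >= t_infectious:
                state[v] = "R"
        for u, v in newly:
            if state[u] == "S":              # first infector wins
                state[u] = "I"
                clock[u] = 0
                caused[v] += 1
    return caused
```

On a star graph the hub infects many leaves while each leaf infects no one, but the hub's tally saturates once its neighbourhood is exhausted: degree matters, just not linearly.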
The project’s research holds wider relevance in terms of understanding how diseases spread and in developing quarantine strategies to control or limit their transmission. Epidemiological models were used extensively during the Covid-19 pandemic, for example to assess the likely impact of restrictions on the spread of the disease. “If you impose restrictions on flights, if you try to force vertices not to move very far but otherwise don’t restrict them very much, does this change the dynamics of disease spread?” outlines Professor Lengler. The GIRG models can provide a more detailed picture than standard epidemic models in this respect. “Standard epidemic models build on the hidden assumption that when one vertex infects another, then this new vertex is found at random from the whole population. So if someone in the UK infects someone else there, then this second person is just chosen uniformly at random, and you will not see geometric information.”
Dynamic Processes on Geometric Inhomogeneous Random Graphs
Project Objectives
The project aims to understand how epidemics and other dynamic processes spread in networks that combine inhomogeneity with cluster-like characteristics. In particular, researchers aim to understand under which circumstances processes spread exponentially, and in which circumstances they spread even faster or slower.
Project Funding
This project is funded by the SNSF.
Contact Details
Project Coordinator,
Prof. Dr. Johannes Lengler
ETH Zürich
Department of Computer Science
OAT Z 14.1
Andreasstrasse 5, 8050 Zürich
E: johannes.lengler@inf.ethz.ch
W: https://as.inf.ethz.ch/people/members/lenglerj/index.html
Johannes Lengler is a Professor in the Department of Computer Science at ETH Zurich. He gained his PhD at the University of Saarland before moving to Zurich, and has conducted research in both pure and applied mathematics.
Depending on the parameters, the macroscopic structure of the networks differs. Left: strong heterogeneity, where shortest paths mostly use nodes with many neighbours. Right: weak heterogeneity, where shortest paths mostly use nodes with few neighbours. Image created by Joost Jorritsma.
“The spread of epidemics is extremely fast if you look at the inhomogeneous models, then it’s very slow if you look at the geometric models. So this raises the question, which is right?”
This geometric information was not typically available in the models used during the Covid-19 pandemic, a major motivation behind Professor Lengler’s work. The hope is that these new models will add a further dimension in epidemiology and allow researchers to gain a fuller picture of how diseases spread. “We have looked at how geometry affects the spread of a disease. It may or may not have a very strong impact, and this depends on the characteristics of the disease and the contact network,” says Professor Lengler. Some infections are known to spread exponentially, but there are also others which grow polynomially, and Professor Lengler says GIRGs can be used in both of these scenarios. “These models can show both types of behaviour. Earlier models had a lot of trouble modelling infections like Ebola or AIDS, because they did not have the geometry in them, while the GIRG model takes this into account,” he explains.
The evidence suggests that GIRGs are highly effective in terms of replicating the features of real-world social networks, and Professor Lengler is now actively working to improve them further. One major topic of interest is assortativity, the tendency for vertices to associate with others with similar properties. “This is about the extent to which vertices are sorted by similarities of degrees. How does this distribution compare to a baseline? For a popular vertex, do we see a lot more vertices in a neighbourhood which are also popular?” outlines Professor Lengler. The overall picture seems to be that there is a high degree of positive assortativity in social networks, where for example stars associate with other stars, while negative assortativity has been observed in certain technological networks. “The central nodes and servers here tend not to connect so much to other central nodes and servers, but rather to low-degree and low-weight nodes,” continues Professor Lengler.
The GIRG model as currently defined is entirely neutral with respect to assortativity, a topic that Professor Lengler and his colleagues hope to address in future. The idea will be to develop a method to control assortativity. “We can then have strong positive assortativity, or strong negative assortativity in a model,” says Professor Lengler. This would add a further dimension to GIRGs, and open up deeper insights into not just how diseases spread, but also other scenarios. “We can look at how two competing opinions spread in the network for example, opinions A and B. Let’s say that a vertex changes, perhaps with a bit of randomness, to the majority opinion in a neighbourhood. Can there be bubbles where both of these opinions are stable?” continues Professor Lengler. “This can happen in some types of network, while in others this would be unstable, and one opinion would take over in the end.”
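The assortativity discussed here can be measured on any edge list as the correlation between the degrees at either end of each edge (Newman's degree assortativity coefficient). A dependency-free sketch:

```python
def degree_assortativity(edges):
    """Pearson correlation of the degrees at either end of each edge.

    Positive values mean hubs link to hubs; negative values mean hubs
    link mostly to low-degree nodes. Plain-Python sketch, no networkx."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    xs, ys = [], []
    for u, v in edges:
        # count each edge in both directions so the measure is symmetric
        xs += [deg[u], deg[v]]
        ys += [deg[v], deg[u]]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    var_x = sum((x - mx) ** 2 for x in xs) / n
    var_y = sum((y - my) ** 2 for y in ys) / n
    return cov / (var_x * var_y) ** 0.5
```

A star graph, where the hub only touches degree-one leaves, gives the extreme negative value of -1; the extension sketched above would let a generative model dial this quantity up or down.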
The left atrial appendage is often closed by doctors performing open-heart surgery to reduce the risk of stroke for patients with atrial fibrillation, yet it’s not clear whether this can prevent strokes in patients who have not been diagnosed with the condition. The LAACS-2 team aim to gain a fuller picture of whether this operation protects against stroke, as Dr Helena Dominguez explains.
A condition characterised by an irregular cardiac rhythm, atrial fibrillation can lead to the development of blood clots in the upper chamber of the heart, which may provoke a stroke. It has long been common practice for surgeons to close the left atrial appendage, a kind of cul-de-sac in the heart that doesn’t seem to be necessary for cardiac function, as a preventative measure. “It’s known that clots can build up in the left atrial appendage, and surgeons have been cutting it away for many years,” says Dr Helena Dominguez. The LAAOS-3 study showed that this procedure is an effective preventive measure in patients with atrial fibrillation. In an earlier study, Dr Dominguez and her colleagues investigated whether this procedure might also protect patients without atrial fibrillation from stroke. “In our study we investigated whether left atrial appendage closure by surgery (LAACS) prevented damage to the brain regardless of whether atrial fibrillation had been diagnosed before surgery, looking at the endpoints of stroke or new infarctions in CT and MRI scans,” she outlines. “We found a difference between the group who had undergone the LAACS procedure, and those who hadn’t.”
As head of the LAACS-2 study, Dr Dominguez is now looking to gather more evidence on the effectiveness of the procedure for all patients undergoing heart surgery, and also to build a deeper picture of the factors that may heighten the risk of stroke beyond atrial fibrillation. Researchers are following the progress of a randomized group of 1,500 patients from four European sites who had open-heart surgery, although they may not have had atrial fibrillation. “These patients were scheduled for coronary bypass, aortic or mitral valve surgery, or any of these combined. They were randomized to undergo surgery as planned or to add a LAACS procedure using clips,” says Dr Dominguez. Researchers are taking a pragmatic approach to this work, following the patients’ e-health records, and looking at the incidence of strokes. Biology samples have been taken from the left and the right heart appendages in a subgroup of patients, to investigate whether there are structural changes that render the heart prone to building up clots, even in the absence of atrial fibrillation.
“It’s known that clots can build up in the left atrial appendage, and surgeons have been cutting it away for many years.”
This research will help provide a fuller picture of whether the LAACS procedure helps prevent stroke, which may then inform clinical guidelines for patients, regardless of whether they have atrial fibrillation or not. The evidence from Dr Dominguez’s study and others suggests that the left atrial appendage plays a major role in stroke, and closing it could reduce the risk. “In future it may be recommended to close the appendage in everybody who undergoes open-heart surgery,” she outlines. This line of research continues beyond LAACS-2, as Dr Dominguez has launched a mechanistic study, with further collection of heart tissue, blood and ECG, looking for biomarkers that could be used to identify patients at heightened risk of stroke. “It may be that there are some factors that cause blood to clot, then if you have atrial fibrillation on top, that heightens stroke risk,” she explains.
Left Atrial Appendage Closure by Surgery-2
Project Objectives
The main objective of the project is to determine if Left Atrium Appendage (LAA) closure added to planned open heart surgery protects against future major stroke and minor stroke.
Project Funding
Funded by the Independent Research Fund Denmark, the Innovation Fund Denmark, Novo Nordisk Foundation, Ib Mogens Christiansen, and Bispebjerg-Frederiksberg Research Fund.
Contact Details
Associate Professor Helena Domínguez, PhD
Bispebjerg-Frederiksberg Hospital
Cardiology Department
Nordre Fasanvej 57, DK-2000 Frederiksberg
Denmark
T: +45 3816 6162/6148
E: mdom0002@regionh.dk
W: www.frederiksberghospital.dk
Domínguez graduated in Medicine at the Universitat Autònoma de Barcelona in 1988. She has been engaged in research since her studies, alongside her clinical training, and specialised as a cardiologist in 2007. She gained her PhD in 2005 at the University of Copenhagen (UCPH). She currently works as a cardiologist at Bispebjerg and Frederiksberg Hospital and is an Associate Professor in the Department of Biomedicine, UCPH, Denmark.
Mount Etna is one of the most active and most comprehensively monitored volcanoes in the world, with hundreds of instruments on the surface providing terabytes of data on its activity. The Improve project team are investigating signals from the volcano and how they relate to magma dynamics, as Dr Paolo Papale explains.
As one of the best monitored and also most active volcanoes in the world, Mount Etna is the ideal location to study the behaviour of magma below the earth’s surface and its impact on volcano dynamics. Several hundred instruments are permanently deployed on the surface of Mount Etna and in the surrounding area, which provide large volumes of data on the volcano’s activity. “We are capable of seeing even very small ground movements at Mount Etna, over hundreds of square kilometres in the surrounding area,” says Dr Paolo Papale, Research Director of the Italian National Institute of Geophysics and Volcanology (INGV). As coordinator of the Improve training network, Dr Papale is part of a team analysing data from Etna, in order to gain deeper insights into volcano dynamics. “We collect many different signals from multi-parametric instrumental systems placed on Mount Etna that record continuously in real-time,” he outlines. “We investigate those signals in order to understand them and to relate them to the dynamics of magma at depths.”
A wide variety of signals are being considered in this research, including ground deformation and shaking, gas release and changes in gravity, which is related to movements of mass and variations in density. Historically, the study of ground motion around volcanoes has been divided into two disciplines, related to two extremes in terms of timescales. “One is the study of ground-shaking during earthquakes, which is very rapid, and is recorded with seismic instruments. The other is ground movements related to the inflation or deflation of the volcanic system, which occur over longer timescales. We record the deformation of the system over days, months, years, even decades,” explains Dr Papale. Between high frequency ground-shaking and very slow ground motion there is a large gap, which Dr Papale says is a major topic of interest in the project. “The numerical simulations that have been done suggest that ground movements in this hidden window can be very important,” he says.
The focus of attention in the project is on this window, with researchers placing instruments on the volcano that are designed to record signals and gather data over the period between one minute and one day. The project team is conducting experimental, numerical and field work on ground movements over that period, which Dr Papale hopes will open up new perspectives in the study of active volcanoes. “We are bringing together researchers from several disciplines to try and understand the signals in this frequency band, which represents a new dimension in the study of active volcanoes,” he outlines.
“We collect many different signals from multi-parametric instrumental systems placed on Mount Etna that record continuously in real-time, complemented by data from field experiments and by sophisticated numerical simulations. We aim at developing a more comprehensive understanding of magma dynamics leading to volcanic eruptions.”
The instruments, including infrasonic microphones, broadband seismometers, tiltmeters, high-speed visible and infra-red cameras, and prototypal high-frequency GPS receivers, were placed mainly in the upper part of Mount Etna, which will help researchers learn more about volcano dynamics at shallow depths. “We concentrated the instruments in an array geometry in a comparatively small region close to the top to see signals from shallow depths in the volcano,” continues Dr Papale. This geometry allows researchers to accurately constrain the region from which the signals emanate and also to characterise those signals.
Alongside the permanent instrumentation on Mount Etna, this provides a wealth of signals for researchers to analyse and interpret, with the goal of building a fuller understanding of the volcano. “We want to understand how things happen, such as how magma is displaced and how it moves through different reservoirs in the volcanic system. What kinds of signals are then released?” explains Dr Papale. The ultimate aim is to develop the capability to accurately forecast eruptions, and protect the many millions of people across the world who live in close proximity to active volcanoes, yet Dr Papale says it remains difficult to interpret these signals.
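The "hidden window" between seismic shaking and slow deformation corresponds to oscillation periods from roughly one minute to one day, and isolating it from a recorded trace is conceptually a band-pass filter. A simple FFT-mask sketch with illustrative cut-offs (real monitoring pipelines use proper filter design, tapering and instrument-response correction):

```python
import numpy as np

def band_pass(signal, dt_seconds, t_min=60.0, t_max=86400.0):
    """Keep only oscillations with periods between t_min and t_max seconds.

    A crude FFT mask: transform, zero out frequencies outside the band,
    transform back. Cut-offs here bracket the one-minute-to-one-day window."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=dt_seconds)
    lo, hi = 1.0 / t_max, 1.0 / t_min            # band edges in Hz
    mask = (freqs >= lo) & (freqs <= hi)
    return np.fft.irfft(spectrum * mask, n=len(signal))
```

Applied to a trace containing both a 10-second seismic wiggle and a 10-minute deformation signal, the filter discards the former and keeps the latter, which is the component the array on Etna's upper flanks is designed to capture.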
An effective and reliable forecasting system must be built on solid foundations, and so great emphasis is placed on scientific rigour in the project. By working with many different instruments and data sources, researchers are able to ensure their findings are robust. “We need to be careful in our research. That’s why we need to place so many different instruments and record so many different things,” says Dr Papale. Alongside experimental, numerical and physical studies, the project’s agenda also includes machine learning research, looking to identify similarities and differences between volcanic signals. “We are also working on automatic signal recognition through machine learning techniques, looking at both data from the volcano and synthetic data,” continues Dr Papale. “If we find some relevant similarities in the data, robustly rooted in the processes we are studying, then we can use numerical simulations to build a deeper understanding.”
This can then help researchers identify where instruments should be placed in order to maximise their observational value and guide further investigation into volcano dynamics beyond the project. Strong relationships have been forged between the partners in Improve, which Dr Papale says will provide a kind of seed for the development of future research projects. “Improve is contributing to the establishment of a European community of volcanologists, and we will continue to work together in future,” he says. “We want to help create a strongly interconnected European research community.”
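The signal-recognition idea can be sketched in a deliberately crude form: reduce each trace to a couple of spectral features and classify by the nearest class centroid. Real pipelines use far richer features and learned models; the feature choice and class labels below are entirely illustrative:

```python
import numpy as np

def signal_features(x):
    """Two toy per-trace features: total energy and the index of the
    dominant frequency bin (illustrative, not a production feature set)."""
    spec = np.abs(np.fft.rfft(x - x.mean()))
    return np.array([float(np.sum(x ** 2)), float(np.argmax(spec))])

def nearest_centroid(train, labels, query):
    """Classify a trace by the closest class centroid in feature space."""
    feats = {lab: [] for lab in set(labels)}
    for x, lab in zip(train, labels):
        feats[lab].append(signal_features(x))
    centroids = {lab: np.mean(v, axis=0) for lab, v in feats.items()}
    q = signal_features(query)
    return min(centroids, key=lambda lab: np.linalg.norm(centroids[lab] - q))
```

Running synthetic traces through such a pipeline, as the project does with its machine learning work, is a cheap way to check that a classifier responds to the physical differences one cares about before trusting it on field data.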
Innovative Multi-disciplinary European Research training network on VolcanoEs
Project Objectives
IMPROVE is a highly cooperative multidisciplinary network of European research institutes and small-medium enterprises. In IMPROVE, 15 Early Stage Researchers are trained in innovative research in volcano science, from monitoring and prospecting to advanced lab experiments, High Performance Computing, and Artificial Intelligence. The two volcanic target areas are Mount Etna in Sicily and the Krafla caldera in Iceland.
Project Funding
IMPROVE is a Marie Skłodowska-Curie European Training Network, with the full title “Innovative Multi-disciplinary European Research training network on VolcanoEs”. This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 858092.
Project Partners
There are 9 academic partners + 3 industrial partners in the project.
https://www.improve-etn.eu/index.php/partners/
Contact Details
Paolo Papale, Research Director
Chair, Class of Exact Sciences, Academia Europaea
Istituto Nazionale di Geofisica e Vulcanologia
Sezione di Pisa, Via Cesare Battisti, 53, 56125 Pisa, Italy
T: +39 050 8311931
E: paolo.papale@ingv.it
W: https://www.improve-etn.eu/
Andrew Mitchell
E: a.p.mitchell@lancaster.ac.uk
Clothilde Biensan
E: clothilde.biensan@ingv.it
Paolo Papale is Research Director at the National Institute of Geophysics and Volcanology (INGV) of Italy.
Clothilde Biensan is a PhD candidate in Earth Science at Sapienza University in Rome (Italy), working within the INGV (Istituto Nazionale di Geofisica e Vulcanologia) in Rome. Her research focuses on assessing the fundamental parameters that control the different modes of active degassing and their transitions at mafic volcanoes.
Andrew Mitchell is a PhD researcher at Lancaster University. His research focuses on using analogue models to help understand the relationship between volcano deformation and magma intrusions.
The Improve project is a training network, in which fifteen early stage researchers (ESRs) are investigating different topics around volcano dynamics. We spoke to two ESRs participating in the project, Andrew Mitchell and Clothilde Biensan, about their research at Mount Etna and its wider importance in understanding volcanoes and forecasting eruptions.
Clothilde Biensan
EU Researcher: What topic are you addressing in the Improve project?
Clothilde Biensan: I’m investigating degassing on Mount Etna. It is an open conduit volcano, so we see continuous degassing behaviour. I’m using a setup that has been developed at the INGV called SKATE (System for Kinematic Acquisition of Transient Eruptions) to acquire data.
EUR: Are you trying to relate information about degassing to the behaviour of the volcano as a whole?
CB: Yes, I’m trying to understand the behaviour and the transitions that we can observe. There are sensors in the setup that acquire visible, thermal and microphone data simultaneously and continuously.
EUR: As Etna is continuously degassing, is it difficult to relate those events to seismic behaviour?
CB: When there is a degassing event without particles, it can be correlated to a seismic signal due to in-conduit explosion or implosion of gas bubbles and also to the penetration of the acoustic wave generated by the explosion. But in our case, it is quite challenging to identify a good signature with the instruments that we have in the field, as the degassing events we observed in July 2023 were small.
When there is a change in the volcano behaviour on the seismic side (emergence of tremors for example), we can approach things differently. In a previous trip to Mount Etna the degassing activity of the volcano seemed to be constant for ten days, while I’ve just done some field work on an eruptive open-conduit volcano called Stromboli, where the behaviour changed in a single day. I have been able to do some field trips on other open-vent and mafic magma volcanoes to draw comparisons and see if there are any similarities in their behaviour.
EUR: How can you combine narrowly focused, specialised research with the goal of understanding the bigger picture around volcano behaviour?
CB: I have spent a few months abroad in the project with other ESRs working on different topics, including statistical data analysis and pattern recognition. I’m working with different people to try and model Mount Etna, and understand the behaviour we have observed.
Andrew Mitchell
EU Researcher: What is the focus of your research in Improve?
Andrew Mitchell: My interest is in the patterns and rates of ground deformation that we see around Mount Etna. In my project I’m looking at this from an analogue modelling perspective, focusing on near-surface deformations.
EUR: Could you describe the set-up of these models?
AM: In one of the set-ups we’re using granular material such as sand and glass beads to represent a scaled-down Mount Etna, while injecting viscous liquids into the model and measuring deformation changes. In another experiment we have a balloon underneath this sand cone: we inject water into this balloon, withdraw it, then repeat the process and observe the deformation on the cone.
I’m collaborating with other ESRs within IMPROVE who are working with computer models, while we also have GPS data on movements from Etna. So we have this exchange of information from multiple areas of research.
EUR: How do analogue models contribute to the overall goal of understanding volcanoes?
AM: Although Mount Etna does deform a lot over time, an increase in the rate of deformation is often seen as a sign of volcanic unrest. Can we therefore identify certain deformation patterns? If we can see these patterns in the analogue models and relate the results to Mount Etna, we may then have a better understanding of how the volcano deforms prior to an eruption.
EUR: Are you working towards a specific goal in the project, or are you focused mainly on improving the models?
AM: The ultimate goal is always to understand volcano behaviour and predict eruptions, but improving these models and understanding the underlying processes is an important step towards that.
The team behind the Re:FAB project are working to turn recycled materials from construction and demolition sites into high value new products. This will help address sustainability concerns and encourage the transition towards a circular economy, as Helena Tuvendal, Ola Karlsson, Johan Borgström and Fabian Sellberg explain.
A type of insulating material widely used in the construction industry, glass wool is currently produced through a highly energy intensive process, which involves melting glass at temperatures of over 1,000ºC. With many companies aiming to improve resource efficiency and reduce their environmental impact, the team behind the Re:FAB project are developing innovative sound absorbent solutions using recycled glass wool from acoustic ceiling panels. “From a technical point of view, glass wool ceiling panels are an ideal candidate for recycling and reuse, and they are also easy to handle and access,” explains Ola Karlsson, Sustainability & Innovation Director at Saint-Gobain Ecophon, one of the partners in the project. The aim is to demonstrate a circular value chain, from the recycling of glass wool through to its eventual reuse in new products, alongside developing new manufacturing technologies, with each of the six partners bringing their own expertise to bear on the challenge of improving resource efficiency. “At The Loop Factory we’re looking at how the raw material can be used to develop innovative new products. What technologies do we need in our production line in order to optimise production efficiency?” outlines Helena Tuvendal, Manager of the project.
This research is part of the wider goal of making better use of existing resources and improving the sustainability of the construction industry, rather than simply using new materials every time an office is renovated or redesigned. The typical Swedish office is renovated around every seven years, with an increased focus today on reusing materials from construction and demolition sites. “We aim to reuse building materials as much as possible. One part of that is reusing the acoustic ceilings that are already present in offices,” says Fabian Sellberg, an architect at project partner LINK Architecture. Not all of this material may be reused in new products, as it may not reach modern safety standards, but it can still hold value. “Very old glass wool can be taken back and re-melted, or perhaps go into other value streams, but it will not be re-introduced to customers, due to the composition,” explains Johan Borgström, a senior researcher at Saint-Gobain Ecophon. “As we go forward, the problem will become less acute, and even 20-year-old glass fibres provide a lot to work with. The majority of the ceilings that we get back are less than 20 years old.”
The recycled material is then transformed into tailored new products using the parametric design methodology, in which algorithms based on certain input parameters are used to identify the optimal design to meet a very specific need. This represents a new way of thinking in construction, which could have a significant impact on sustainability. “It’s not necessary to manufacture in large volumes and keep products in storage. We can start to press the product when we have the order, so that everything will be unique and fit for purpose,” explains Tuvendal. With parametric design each product is made to order, with the project team looking to develop acoustically optimised panels. “Let’s imagine a room that is going to be used for meetings. It might
Heading R&D, Innovation and Sustainability, Ola Karlsson focuses on sustainable innovation, including recycling and circular business models, and on material science combined with applied room acoustics. A former professor in Polymer Science and Physical Chemistry, he is an experienced researcher who has held leading positions in the Swedish and global chemical and materials industry.
be 10 x 5 metres, with a height of 3 metres, providing space for 50 people to meet around a table. We can then calculate the level of noise that such a meeting would generate,” explains Sellberg, a specialist in parametric design. “Instead of simply deciding that we need a certain amount of acoustic panels inside, we can make a panel that is formed to reflect some sound, and to absorb the wavelengths we want to capture.”
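The kind of calculation Sellberg describes can be sketched with the classic Sabine formula, which relates a room’s volume and total sound absorption to its reverberation time. The target reverberation time and absorption coefficient below are illustrative assumptions, not values from the project’s own pipeline:

```python
def sabine_rt60(volume_m3: float, absorption_area_m2: float) -> float:
    """Sabine reverberation time in seconds: RT60 = 0.161 * V / A."""
    return 0.161 * volume_m3 / absorption_area_m2

def required_absorption(volume_m3: float, target_rt60_s: float) -> float:
    """Equivalent absorption area (m^2 sabins) needed to hit a target RT60."""
    return 0.161 * volume_m3 / target_rt60_s

# The 10 x 5 x 3 m meeting room from the example.
volume = 10 * 5 * 3  # 150 m^3

# Assume a target reverberation time of 0.6 s, typical for meeting rooms.
needed = required_absorption(volume, 0.6)

# If a panel material has an absorption coefficient of 0.9,
# the panel area to install is:
panel_area = needed / 0.9
print(f"Equivalent absorption needed: {needed:.1f} m^2 sabins")
print(f"Panel area at alpha=0.9: {panel_area:.1f} m^2")
```

In a full parametric workflow, the required absorption would then be distributed unevenly across the panel grid, according to where sound should be captured or reflected.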
A school may have a very different set of acoustic requirements however, something which can be reflected in the design and properties of the panels. As part of their work in the project, Sellberg and his colleagues at LINK have been trying to optimise the design of the panels by including both absorption and diffusion in their calculations. “Acoustic panels usually absorb sound waves, but we’ve also been looking at reflecting or directing it,” he says. The types of sound a product will absorb depend to a degree on its porosity, which can be modified by pressing the material in different ways, an issue that researchers are exploring in the project. “We are looking into pressing the material unevenly, to get different thicknesses and absorption rates, depending on where it will be placed on the bigger panel,” outlines Sellberg. “For example, if there is a panel with lower absorption rates above a teacher, where they stand and talk, then they will be more audible at the far end of a classroom. We are modelling materials and their acoustic properties in the project, taking further steps towards customised design of sound absorbents.”
Johan Borgström has experience in material and product development in paint and pharmaceuticals, with special expertise in dispersions, surface chemistry and powders. He holds a PhD in chemistry and an M.Sc. in chemical engineering, with 20 years of experience in material development and surface treatment processes.
Fabian Sellberg has a background in coding, architecture, and computer science, with special expertise in developing the parametric design pipeline from design to manufacture, focusing on a parametric tool to analyse and evaluate the acoustic quality of a given room.
A further important consideration in the project is the traceability of these materials, which is central to their potential re-use in future as part of a circular economy. While previously it might have been sufficient to provide relatively limited information about an acoustic panel, Sellberg says that more data will be required if the lifetime of these materials is to be extended and their value increased, reflecting the customised nature of parametric design. “As the parametric workflow is unique for each product, all the associated information becomes very customised as well,” he points out. Information about specific aspects of an acoustic panel will be required, such as how it was produced, what room it was produced for, and how that room was used. “If a panel in a 5 x 5 grid breaks, and this panel had a specific function, then you need to know what the function was for it to be replaced,” continues Sellberg. “We’ve been working with one of the partners in the project, LogTrade Technologies. They’ve been creating an ecosystem and infrastructure around how we can facilitate this kind of information-gathering on the composition of acoustic panels.”
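LogTrade’s actual data model has not been published, but the per-panel information Sellberg describes can be pictured as a simple record attached to each made-to-order panel. All field names here are hypothetical illustrations, not the real schema:

```python
from dataclasses import dataclass, field, asdict

@dataclass
class PanelRecord:
    """Hypothetical traceability record for one made-to-order acoustic panel.
    Field names are illustrative, not LogTrade's actual schema."""
    panel_id: str
    grid_position: tuple    # e.g. (row, column) in a 5 x 5 grid
    material_batch: str     # links back to the recycled glass wool batch
    room_id: str            # the room the panel was designed for
    acoustic_function: str  # e.g. "absorb 500-2000 Hz" or "reflect"
    pressing_profile: dict = field(default_factory=dict)  # thickness map

# If the panel at grid position (2, 3) breaks, its record tells the
# manufacturer exactly what function the replacement must reproduce.
panel = PanelRecord(
    panel_id="P-0042",
    grid_position=(2, 3),
    material_batch="GW-2023-117",
    room_id="meeting-room-A",
    acoustic_function="absorb 500-2000 Hz",
)
print(asdict(panel)["acoustic_function"])
```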
The demand for these acoustic panels has grown rapidly over recent years, with many companies using them in open-plan offices to create productive, collaborative working environments. At the same time commercial companies are also looking for new ways to minimise their environmental impact, and Karlsson believes recycled materials will have a big role to play in this respect. “We see significant growth in interest,” he says. While the project has focused largely on acoustic wall panels, Sellberg says this research could also hold lessons for the development of other products, in particular modular products. “One of the main technologies that we have developed in this project is
around the placement of different modules, and looking at how they perform. Then we look at moving them around, replacing them, and try to assess the impact of changes,” he continues. “In another project we’re reusing concrete slabs, checking their structural integrity, then looking to use them again in new buildings. There are similarities with the Re:FAB project, but in this case we are looking at the performance values of a concrete slab rather than an acoustic panel.”
This reflects a general shift in outlook across the construction industry towards a greater focus on resource efficiency, and the project’s work represents an important contribution in this respect, in line with wider environmental and sustainability goals. While parametric design is still relatively new in the construction industry, it is being applied to an ever broader range of products. “It’s possible now to apply parametric design processes to more intricate processes, as the software and the technology has become more widely available. We can start to really calculate the impact of every building part earlier and with much greater levels of precision,” explains Sellberg. New prototypes have been developed in the project, which has been enormously valuable to Sellberg and his colleagues at LINK. “The performance of the acoustic panels can be measured, and we’ve been able to test them out and see how they perform in practical scenarios,” he says. “We’ve been able to translate a simulation into an actual physical product, thanks to The Loop Factory’s advanced prototyping chambers.”
The project is coming towards the end of its three-year funding term, yet strong relationships have been forged between the partners, and Tuvendal is keen to explore the possibility of building on the progress that has been made. “We have a lot of new ideas for further development. This work hasn’t finished, we want to continue our research,” she says.
Design of circular and resource-efficient processes and products for recycled glass wool
Project Objectives
This project aims to create a circular value chain by transforming complex construction and demolition waste into resource-efficient products. Through innovative manufacturing technologies, the goal is to exceed market demands for design properties, sustainability, and acoustic optimisation. The project will pave the way for scalable solutions that benefit other industries and materials.
Project Funding
This project is funded by Vinnova (the Swedish Innovation Agency). SEK 5 392 752
Project Partners
• Saint-Gobain Ecophon
• LINK Arkitektur
• DECIBEL by Johanson Design
• LogTrade Technologies
• Lund University
• The Loop Factory
Contact Details
Project Coordinator, Helena Tuvendal
Senior Project manager
The Loop Factory Honungsgatan 4B, 432 48 Varberg, Sweden
T: +46 76 131 17 97
E: helena.tuvendal@loopfactory.se
W: https://www.loopfactory.se/case/refab
Joining forces with partners Saint-Gobain Ecophon, LINK Arkitektur, DECIBEL by Johanson Design, LogTrade Technologies, and Lund University, the aim is to demonstrate a circular value chain and new resource-efficient manufacturing technologies within the construction and demolition industry, supported by Vinnova, the Swedish Innovation Agency.
Helena Tuvendal has an M.Sc. in mechanical engineering, specialised in the field of wood technology. She has solid experience from industry, with connections to several sectors: wood, forestry, electronics and medical technology. She has worked on several demonstration projects where lab results have been taken to pilot scale.
We spoke to Professor Torsten Nygaard Kristensen from Aalborg University in Denmark. His research team is working on optimizing the farming of efficient insect populations that can convert organic waste into high-value products. Their efforts will contribute to the development of sustainable food systems and address the rising global food demand.
As the global human population approaches 8 billion, we face a challenge to meet rising food demands while attempting to minimize the environmental damage caused by food production. Currently, traditional agriculture significantly strains the planet’s resources, while livestock farming contributes to climate change, land degradation, and biodiversity loss. Professor Torsten Nygaard Kristensen and his research team at the Department of Chemistry and Bioscience at Aalborg University in Denmark, are exploring innovative alternatives and solutions to these challenges, with insect farming emerging as a promising candidate because some insect species can be produced as food and feed more sustainably than traditional livestock species.
In a collaborative project also involving researchers at the Center for Quantitative Genetics and Genomics and the Department of Biology at Aarhus University, Denmark, he works on ways to optimize insect production for animal feed and human consumption. For the past four years, they have been exploring ways to enhance the efficiency of insect production through environmental and genetic improvements. “One part is how we can optimize food and feed production through environmental improvements, such as better production conditions involving optimized temperatures, humidity, and other environmental factors. The second part is the genetic part, where our ambition is to adapt principles and tools that animal and plant breeders have been using for decades to insects. This involves selective breeding, where insects with superior genes coding for traits of interest are used as parents to establish the next generation. This work is focused on traits
of importance from a production point of view, such as weight and protein content, but also on health and reproduction traits so that robust populations are generated. While we have made significant progress, it is important to recognize that, unlike livestock such as pigs, cattle, and poultry, which are domesticated and have been bred by farmers for thousands of years, large-scale insect farming is still a relatively new field in many parts of the world. This means that we have limited knowledge of the basic biology of most of the species that we work with, and unlike traditional livestock we cannot easily keep track of genetic relationships, making it challenging to execute e.g. selective breeding effectively,” explains Prof. Nygaard Kristensen.
In their work, Prof. Nygaard Kristensen and his team primarily work with two insect species: the black soldier fly (Hermetia illucens) and the house fly (Musca domestica). These insects have the potential to utilize and valorize low-quality waste products that cannot be used as feed for traditional livestock. “The black soldier fly and the house fly have this ability to produce value from materials that are typically put into landfills, used as soil fertilisers, or burned for energy. From a sustainability point of view, it is far more efficient to process them through insects, and thereby convert them into valuable biomass. Insects, and especially the two species that we have worked with, are eminent in doing that, valorizing organic waste products that are not used optimally today,” says Prof. Nygaard Kristensen. While most research has focused on the larval stage (which is typically the harvested life stage), they have also studied the adult biology of black soldier flies and house flies, especially reproduction and stress tolerance traits. One example of this work is a study led by a former PhD student on the project, Dr. Stine Frey Laursen, who explains that one of her studies showed that even on diets of low nutritional value, the insects were able to complete their life cycle and produce viable eggs. Although these diets resulted in lower yields compared to higher-quality alternatives, using such waste streams as insect feed could still be a sustainable option, adding value to underutilized and wasted resources.
These two species are promising candidates for the commercial production of insect protein, and both have been investigated in the current project.
Project Objectives
The project “Optimization of insect production for animal feed through breeding” aims to optimize insect production for sustainable animal feed by improving environmental farming conditions and incorporating advanced genetic tools and selective breeding techniques that enhance specific production traits.
Project Funding
This project is funded by the Independent Research Fund Denmark (DFF-0136-00171B).
Project Partners
“Instead of wasting these low-quality nutritional sources, it is far more efficient and sustainable to process them through insects, and thereby convert them into valuable biomass.”
Prof. Nygaard Kristensen and his team are also working on enhancing the genetics of insects through selective breeding and the use of quantitative genetic breeding tools previously applied in domestic livestock. The goal of this research is to improve production traits such as growth rate, body size, and reproduction, which is expected to make insect farming more efficient and sustainable. Selective breeding can be extremely successful in changing traits of interest across generations. This is exemplified by milk yield in dairy cattle and the weight of broiler chickens, both of which have more than doubled during the last ca. 50 years.
The idea is simple: select those individuals in a given population whose genes give rise to superior phenotypes for traits of interest, which leads to cumulative changes across generations. However, breeding in insects is not without challenges. “Insects, like the black soldier flies, have very short generation intervals; it takes only a few weeks to go through the different life stages. In many ways, that is a benefit because it means that we can rapidly obtain genetically based changes. However, it also means we need to be able to phenotype individuals very quickly to identify the best individuals that should be used as parents, and this is a main challenge. Another challenge is that, unlike traditional livestock where individual animals can be tagged, e.g. using ear tags for identification, insect breeding involves managing thousands, or even hundreds of thousands, of small individuals that molt (shed the exoskeleton) when passing from one life stage to another, making it impractical to track individuals. This means that obtaining a pedigreed population is challenging,”
says Dr. Laura Skrubbeltrang Hansen, who has also been a PhD student on the project.
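The cumulative response to selection that the team describes follows the classic breeder’s equation, R = h² × S. The sketch below simulates truncation selection on a single trait; the heritability, trait values, and selection fraction are illustrative, not measured black soldier fly parameters:

```python
import random

def select_and_breed(population, h2, top_fraction):
    """One generation of truncation selection on a single trait.

    Response follows the breeder's equation R = h2 * S, where S is the
    selection differential (mean of the selected parents minus the
    population mean) and h2 is the trait's heritability.
    """
    pop_mean = sum(population) / len(population)
    k = max(1, int(len(population) * top_fraction))
    parents = sorted(population, reverse=True)[:k]
    parent_mean = sum(parents) / k
    response = h2 * (parent_mean - pop_mean)
    new_mean = pop_mean + response
    # Offspring scatter around the shifted mean.
    return [random.gauss(new_mean, 10) for _ in range(len(population))]

random.seed(1)
pop = [random.gauss(200, 10) for _ in range(1000)]  # e.g. larval weight, mg
for generation in range(5):
    pop = select_and_breed(pop, h2=0.3, top_fraction=0.1)
print(f"Mean after 5 generations: {sum(pop)/len(pop):.1f} mg")
```

Each generation shifts the mean by a few units; because the changes are cumulative, even modest heritability produces a substantial gain over several short insect generations.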
Despite these challenges, the researchers have found solutions, for example by using half-sib/full-sib designs, where one male is mated to a number of females so that both half and full siblings are produced. Fast automated phenotyping methods, which apart from increasing the speed of assessing larval size also improve efficiency and accuracy, are another important outcome of the project. This system can be used in large-scale breeding programs, since it allows for efficient and unbiased phenotyping of large numbers of individuals, making the selection process faster and more reliable.
Additionally, to improve the breeding program, the researchers used whole genome sequencing to study two black soldier fly populations, one from Denmark and one from Texas, USA. This study showed marked genetic differences between the two populations and illustrated the power of using genomics to pinpoint signatures of selection, genetic drift, and inbreeding. Although not part of this study, the approach taken illustrates the power of full genome sequencing in identifying genetic markers linked to important production traits, such as rapid growth and high protein content, paving the way for marker-assisted selection.
“A lot has been learned from this project, and although it is challenging, we can optimize production environments and use selective breeding on insects. This means that in the future we can develop more efficient, high-yielding insect populations, helping to solve major challenges for our and future generations, including generating food for an increasing human population while reducing negative environmental impacts,” Prof. Nygaard Kristensen concludes.
Department of Chemistry and Bioscience, Aalborg University, Denmark, Center for Quantitative Genetics and Genomics, Aarhus University, Denmark, Department of Biology, Aarhus University, Denmark.
Contact Details
Project Coordinator,
Torsten Nygaard Kristensen
Professor Department of Chemistry and Bioscience
The Faculty of Engineering and Science Functional Ecology and Genomics
Aalborg University
T: +45 61463375
E: tnk@bio.aau.dk
W: https://vbn.aau.dk/en/persons/tnk
Torsten Nygaard Kristensen is an evolutionary biologist and professor at the Department of Chemistry and Bioscience at The Faculty of Engineering and Science at Aalborg University Denmark. His research interests include conservation and population genetics, ecophysiology, environmental stress adaptation, and animal breeding.
Stine Frey Laursen is an evolutionary biologist, working on improvement of insect production through environmental and genetic means. She is employed as a postdoc at Department of Chemistry and Bioscience, Aalborg University, Denmark.
Laura Skrubbeltrang Hansen is a quantitative geneticist, working on insect genetics and is employed as a postdoc at the Center for Quantitative Genetics and Genomics, Aarhus University, Denmark.
Researchers in the MASSPROVIT project are exploring the potential of cultivating microalgae: microscopic algae rich in proteins, vitamins, and omega-3 fatty acids. We spoke with Professor Charlotte Jacobsen, Professor Poul Erik Jensen, Professor Marianne Thomsen, Emil Gundersen and Malene Fog Lihme Olsen to learn more about this promising area of research.
As the global population approaches 10 billion by 2050, the demand for sustainable, resource-efficient, and environmentally friendly food production systems is becoming increasingly critical. Traditional agricultural methods place significant strain on resources such as land, water, and energy, highlighting the urgent need for innovative solutions. Microalgae offer a highly promising alternative, with their potential to provide essential nutrients while minimising environmental impact. Over the past 70 years, microalgae have repeatedly attracted attention as a valuable resource for food, biofuels, and high-value bioproducts. Thanks to their high productivity compared to traditional crops, their ability to grow in seawater, and their minimal land use requirements, microalgae stand out as an environmentally efficient option.
The first large-scale efforts to industrialise microalgae for food purposes began during World War II in Germany, when food security was a pressing concern. In the 1970s, applications expanded to include wastewater purification and aquaculture feeds, further establishing the potential of microalgae in industrial processes. More recently, the global demand for sustainable sources of protein and polyunsaturated fatty acids (PUFAs) has reignited interest in microalgae production, with companies and research institutions now focusing on microalgae as a potential replacement for meat and soy protein, as well as a source of bioactive compounds for the nutraceutical market.
In this context, the MASSPROVIT project emerges as a key initiative aimed at advancing the sustainable cultivation of microalgae using food-grade industrial side streams. Researchers from the Technical University of Denmark (DTU) and University of Copenhagen (UCPH) have, within two years of intensive research, explored the feasibility of using process water from industrial enzyme production to cultivate Nannochloropsis oceanica, a microalgae species renowned for its rich nutritional profile, which includes PUFAs, vitamins, and essential amino acids. The MASSPROVIT project aims to assess the effects of incorporating these industrial by-products on algal growth and the nutritional quality of the biomass, focusing on optimising the production process and improving the environmental sustainability of microalgae cultivation.
The MASSPROVIT project has two overarching objectives: first, to enable the sustainable, local Danish production of microalgae as a novel source of non-animal protein and health-promoting compounds such as omega-3 fatty acids and vitamins; and second, to provide proof of concept for a low-footprint, protein-rich microalgae biomass that could potentially reduce Denmark’s reliance on imported, emission-intensive protein sources.
As part of the MASSPROVIT project, the researchers tested two different industrial side streams to evaluate their suitability as partial replacements for commercial nutrient supplements in microalgae cultivation. The study revealed that one of the side streams, referred to as SS1, was capable of replacing up to 40% of the commercial nutrient supplement without negatively impacting algal growth or the nutritional quality of the biomass. SS1 was found to have a high concentration of inorganic nitrogen, primarily in the form of ammonium, which was readily assimilated by Nannochloropsis oceanica. Moreover, the nitrogen-to-phosphorus ratio (NPR) in SS1 closely matched that of commercial nutrients, supporting its use as an efficient and sustainable alternative.
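The role of the nitrogen-to-phosphorus ratio in judging a side stream can be illustrated with a simple mass-balance sketch. The concentrations below are hypothetical placeholders, not the project’s measured values:

```python
def npr(nitrogen_mg_l: float, phosphorus_mg_l: float) -> float:
    """Nitrogen-to-phosphorus mass ratio of a nutrient source."""
    return nitrogen_mg_l / phosphorus_mg_l

def blend_n_p(commercial, side_stream, replacement_fraction):
    """N and P of a blend in which `replacement_fraction` of the commercial
    supplement is replaced by side stream. Inputs are (N, P) tuples in mg/L.
    All numbers used below are hypothetical, not measured project values."""
    f = replacement_fraction
    n = (1 - f) * commercial[0] + f * side_stream[0]
    p = (1 - f) * commercial[1] + f * side_stream[1]
    return n, p

commercial = (150.0, 10.0)  # NPR = 15 (illustrative)
ss1 = (140.0, 9.5)          # NPR close to the commercial supplement

# The 40% replacement level reported for SS1.
n, p = blend_n_p(commercial, ss1, 0.4)
print(f"Blend NPR at 40% SS1: {npr(n, p):.1f}")
```

A side stream whose NPR sits close to the commercial supplement, as SS1 did, barely shifts the blend’s ratio; a stream with a much lower NPR, like SS2, drags the blend away from the ratio the algae need.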
In contrast, the second side stream, SS2, caused a significant reduction in both growth and omega-3 fatty acid content, even at a 20% replacement level. The lower NPR and nutrient content in SS2, along with potential growth-inhibiting compounds, were likely responsible for the reduced performance. These findings, while highlighting the potential of industrial side streams like SS1, also underscore the challenges of optimising nutrient formulations in microalgae cultivation, a key focus of the ongoing MASSPROVIT project.
A critical aspect of the MASSPROVIT project is its focus on the environmental and economic benefits of integrating food-grade industrial side streams into microalgae cultivation. The use of these side streams has the potential to
significantly reduce the resource footprint and production costs of microalgae-based food ingredients. By optimising side stream usage, the project aims to replace synthetic fertilisers, which are energy-intensive to produce and contribute to the environmental burden of agriculture. This is particularly important given the long-term sustainability concerns surrounding nitrogen fertilisers produced via the energy-intensive Haber-Bosch process, as well as the depleting global reserves of phosphate rock.
Moreover, the MASSPROVIT project demonstrates that microalgae can be cultivated on non-arable land and with non-potable water, reducing the strain on freshwater resources and agricultural land. Microalgae offer significantly higher areal productivity than most traditional plant crops, making them a more efficient option for future food production systems. By unlocking the potential of industrial side streams, MASSPROVIT seeks to address the pressing challenge of producing sufficient nutritious food in a resource-constrained world, while minimising environmental degradation.
Over the decades, microalgae have periodically gained and lost attention as industrial solutions to various global challenges, from food production during World War II to biofuels during the peak oil crisis. Despite fluctuations in interest, each wave of innovation has brought significant advancements in cultivation technologies. These advancements have enabled the development of raceways for wastewater treatment in the 1970s and, more recently, improvements in the production of bulk products such as single-cell protein (SCP), oils, and pigments. The MASSPROVIT project builds on this legacy, seeking to refine microalgae cultivation methods to make them more efficient and sustainable.
Nonetheless, significant challenges remain. While SCP production offers high yields, the associated costs, including those linked to low biomass concentrations, oxygen inhibition, and the energy demands for mixing and degassing in phototrophic systems, remain formidable obstacles. Emerging cultivation strategies, such as oxygen-balanced mixotrophy (OBM), offer promising solutions by recycling oxygen and carbon dioxide within the system, thereby reducing the need for external inputs. The MASSPROVIT project is exploring such innovative approaches to enhance the overall efficiency of microalgae cultivation.
While the MASSPROVIT project highlights the potential of using industrial side streams for sustainable microalgae cultivation, several challenges persist. One key issue is the digestibility of microalgae biomass for human consumption. Although microalgae offer a wealth of nutrients, current processing methods may not fully address the bioavailability of these nutrients. Future research, including ongoing efforts within the MASSPROVIT project, is focused on refining processing techniques to improve the digestibility and nutritional quality of microalgae-based food products.
Additionally, large-scale cultivation and the development of controlled, high-tech systems remain critical hurdles in the transition from research to market. As part of the MASSPROVIT project, ongoing investigations are examining zero-emission industrial systems that use upcycled water and LED lighting to further reduce the carbon footprint of microalgae production. The project is also exploring the potential of forward genetics and high-throughput screening to address current gaps in understanding microalgae metabolism, which could pave the way for strain improvement through genetic engineering.
The MASSPROVIT project underscores the feasibility of using food-grade industrial side streams as a viable strategy for enhancing the sustainability of microalgae cultivation. The positive impact of SS1 on growth and nutritional quality highlights the potential for integrating industrial by-products into microalgae production processes. Continued research within the MASSPROVIT framework will be essential for optimising side stream compositions and scaling up cultivation techniques to maximise the environmental and economic benefits.
As the world seeks more sustainable food production solutions, the insights gained from the MASSPROVIT project contribute to the broader goal of developing eco-efficient systems that can meet the growing demand for nutritious food while minimising environmental impacts. Through innovations in microalgae cultivation, the project is helping to build a more sustainable future for global food systems.
Making microalgae a sustainable future source of proteins and vitamins
Project Objectives
The MASSPROVIT project has two overarching objectives: first, to enable the sustainable, local Danish production of microalgae as a novel source of non-animal protein and health-promoting compounds such as omega-3 fatty acids and vitamins; and second, to provide proof of concept for a low-footprint, protein-rich microalgae biomass that could potentially reduce Denmark’s reliance on imported, emission-intensive protein sources.
Project Funding
Funded by The Independent Research Fund Denmark. Grant number: 1127 – 0261B
Project Partners
Technical University of Denmark and University of Copenhagen.
Contact Details
Project Coordinator,
Professor Charlotte Jacobsen, Ph.D.
Technical University of Denmark
Head of Research Group for Bioactives – Analysis and Application
Henrik Dams Allé
Building 201, room 130, 2800 Kgs. Lyngby
Denmark
T: +45 23 27 90 75
E: chja@food.dtu.dk
W: https://orbit.dtu.dk/en/persons/charlottemunch-jacobsen
Charlotte Jacobsen, Ph.D., is professor in Bioactives – Analysis and Application. She is internationally renowned for her research on the lipid oxidation of omega-3 rich foods, and she has received several awards, including the Danish Danisco Prize 2003, the French La Médaille Chevreul 2010, the German DGF Normann Medaille 2020, the Stephen S. Chang Award (2021), AOCS Fellow (2022), the European Lipid Science Technology Award (2023), and the Knight’s Cross of the Order of the Dannebrog. Her publication list includes more than 290 peer-reviewed manuscripts and book chapters.
Professor Charlotte Jacobsen, Ph.D. is the main supervisor of Emil Gundersen and Professor Poul Erik Jensen is the supervisor of Malene Lihme Olsen.
Denmark has played a pioneering role in the development of wind energy and it generates a large proportion of the country’s electricity, yet noise from turbines can cause controversy. Researchers in the Co-Green project are looking at the issues around wind farm noise and how local communities can be better included in project planning, as Dr Julia Kirch Kirkegaard and Daniel Frantzen explain.
Figure 1: Dynamics of technification and politicisation in wind energy.
Regulatory Frame
Noise from wind turbines must not be more than 44 dB for “ordinary” noise and 20 dB for “low frequency noise”
Technical Frame
Sound is to be controlled:
• Standards
• Modelling
• Wind tunnel tests
• Manufacturing Techniques to ensure regulations are met
Planning Frame
Environmental Impact Study:
Map with calculated noise contours and people’s houses
Community Responses
• “What is a dB?”
• “Why is noise calculated not measured?”
• “I am different – I am sensitive to noise.”
• “Who set the regulations?”
• “The standards don’t reflect what I hear.”
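Part of the gap between calculated noise values and lived experience comes from how decibels behave: contributions from several turbines do not add linearly but logarithmically. A small sketch, using hypothetical per-turbine contributions rather than real assessment data:

```python
import math

def combine_db(levels_db):
    """Total sound pressure level of incoherent sources:
    L_total = 10 * log10(sum(10^(L_i / 10)))."""
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels_db))

# Hypothetical calculated contributions from three turbines at one dwelling.
contributions = [40.0, 38.0, 35.0]
total = combine_db(contributions)
print(f"Combined level: {total:.1f} dB")
print(f"Within the 44 dB limit: {total <= 44.0}")
```

Doubling the number of equally loud sources adds only about 3 dB to the total, which is part of why regulatory arithmetic can feel opaque to residents asking “What is a dB?”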
The wind energy sector generates over half of the total electricity consumed in Denmark, while the country is also an international leader in the development of turbine technologies, as well as in integrating wind power into the energy system. There is broad support for wind energy across the Danish population. “Most people in Denmark are pro-renewable energy and pro-wind, but they might argue that there are genuine issues with individual wind farms,” says Daniel Frantzen, a researcher at the Technical University of Denmark (DTU) working on the Co-Green project, an initiative backed by the Independent Research Fund Denmark (DFF). The noise associated with wind turbines is a prominent concern, an issue at the heart of the project. “Why is it that we see a lot of controversy over wind turbine noise?” asks Julia Kirch Kirkegaard, Professor at the Department of Technology, Management and Economics at DTU.
Co-Green project
As head of the project, Professor Kirkegaard is looking into the root causes of increased
public resistance to wind farms, with the wider aim of improving communication between local communities and technical experts. Wind farm development, and wind energy generally, is often regarded as a highly expert and techno-scientific process, which is regulated through policies, regulations, and planning law. “We refer to this governance mode of the energy transition as ‘technification’,” explains Professor Kirkegaard. However, this technical information does not fully reflect the way that lay people and communities are affected by the presence of wind farms; they often respond by describing their everyday experiences. “There is a clash of values, a struggle we refer to as ‘politicisation’,” continues Professor Kirkegaard. (Figure 1)
This can cause resentment and stimulate opposition to a development, which often centres around the issue of noise, the impact of which is difficult to measure objectively. While developers can take noise measurements and run models, this may not reflect the actual experiences of the people affected. “When developers think about noise only in very technical, physical terms, it doesn’t necessarily match the daily reality of people’s lives. Noise can disrupt people’s lives and activities, it has a cumulative impact,” points out Professor Kirkegaard. A wide variety of factors may affect the way people experience noise. “For example, it’s well-established in the literature that if you benefit economically from wind farms then you will be less bothered by the noise they generate,” says Frantzen. “It’s also been found that people who find wind turbines visually intrusive, who believe they don’t fit in the landscape, will perceive them as more noisy.”
Figure 2 (panel title): Isolate the noise and simplify
The project team has reviewed many strands of scientific literature and interviewed not only local community members, but also a number of experts, aiming to build a deeper picture of the issue. They found that the various scientific disciplines currently involved in noise research (technical, health and social acceptance) understand noise as something fundamentally different. (Figure 2)
The efforts to address noise are thus different and sometimes contradictory. The isolation of the volume of a noise, as measured in dB(A), is a major factor in technical and health research. “Noise doesn’t have to be particularly loud to be disruptive,” says Frantzen. The characteristics of wind turbine noise are also an important consideration. “Wind turbine noise has a particular rhythm that makes it more distinctive to some people, and that can make it more disruptive than other, more stable types of noise,” says Frantzen. “Things may also change in the surrounding landscape, so it adds up to a complex overall picture.”
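Frantzen’s point, that an averaged level can miss the ‘rhythm’ of turbine noise, can be illustrated with a toy calculation. The sketch below uses made-up figures (not project data): a steady tone and an amplitude-modulated tone carrying exactly the same acoustic energy register the same equivalent sound pressure level, even though only one of them ‘pulses’.

```python
import math

FS = 8000                 # sample rate in Hz (illustrative)
P_REF = 20e-6             # reference pressure: 20 micropascals

def spl_db(samples):
    """Equivalent sound pressure level in dB re 20 µPa."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms / P_REF)

n = 2 * FS                # two seconds of signal
steady = [0.02 * math.sin(2 * math.pi * 500 * i / FS) for i in range(n)]

# Amplitude-modulate at ~1 Hz to mimic the rhythmic 'swish' of blade passes
mod = [s * (1 + 0.8 * math.sin(2 * math.pi * 1.0 * i / FS))
       for i, s in enumerate(steady)]
# Rescale so both signals carry exactly the same acoustic energy
scale = math.sqrt(sum(s * s for s in steady) / sum(m * m for m in mod))
mod = [m * scale for m in mod]

# spl_db(steady) and spl_db(mod) are now identical, yet only one 'pulses'
print(round(spl_db(steady), 2), round(spl_db(mod), 2))
```

A real assessment would add A-weighting and dedicated modulation metrics; the point here is only that equal averaged levels do not mean equal disruption.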
A second strand of investigation centres around case studies on Danish wind farm projects at various stages of the planning process, where researchers have interviewed the actors involved, including local politicians, planners, developers and concerned citizens. The latter group includes people who have become engaged in controversies about wind farms, and while some opponents may be motivated by narrow concerns, Professor Kirkegaard says it is misleading to characterise them all in this way. “We’re trying to show that it’s more complicated than that. Opponents of wind farms often have good ideas about how things could be done differently and how local people could be involved,” she stresses. However, it is often relatively late in the planning process that these alternatives are heard, meaning they are often formally dismissed, leading citizens to search for new means to block projects. (Figure 3)
(Figure: fairness, sound, justice, engagement, planning procedure, visual impact, green transition – many aspects to ‘acceptance’ that need to be understood together.)
Many municipalities do carry out early strategic planning processes for renewable energy before specific projects are considered, but Frantzen says people are often not aware that these processes are under way. “Often citizens only become aware of energy planning in their municipality when things become very concrete,” he says. “But once a very concrete project has been announced, you can’t change a lot due to the nature of planning law.”
There is a high degree of consensus in the literature that community engagement should begin at an earlier stage than is currently the case, giving local people more time to make suggestions. However, there is
Figure 3: Frustrated by how little they can affect the planning process, citizens search for ways to cancel projects – e.g. by finding protected birds.
Co-Green Controversies in the green transition: The case of wind turbine sound and its politicisation
Project Objectives
The project investigates the many different ways in which wind turbine sound is understood and the various types of expertise that try to explain it. We use this deeper understanding of sound to examine how sound from wind turbines is often politicized and problematized in wind power deployment and sometimes leads to controversy and delayed projects. On the basis of experimenting with innovative ways of communicating and ‘co-creating’ a common understanding of wind turbine “soundscapes”, the project creates fertile ground for a better involvement of citizens in the green transition.
Project Funding
This project is funded by the Independent Research Fund Denmark DFF.
Project Partners
Tom Cronin and Sophie Nyborg, DTU Wind and Energy Systems
Contact Details
Julia Kirch Kirkegaard
Professor, Department of Technology, Management and Economics, Akademivej, Building 358, DK-2800 Kgs. Lyngby
Denmark
T: +45 93511431
E: jukk@dtu.dk
W: https://wind.dtu.dk/projects/researchprojects/co-green
Julia Kirch Kirkegaard is a Professor in the Center for Human-Centered Innovation at the Technical University of Denmark (DTU). Her PhD is from Copenhagen Business School, and she specialises in valuation studies within the field of Science & Technology Studies.
Daniel Nordstrand Frantzen is a sociologist. He recently submitted his PhD thesis at the Department of Wind and Energy Systems, DTU, in which he explores ‘valuation struggles and compromises’ regarding wind energy in Denmark.
not a lot of agreement on how to conduct community engagement, another topic that researchers are exploring in the project. “We want to experiment with the notion of co-creation. We plan to hold co-creation workshops for an energy community. This is about community-based energy, where people can co-own and use their own system, then integrate it into the broader electricity grid,” says Professor Kirkegaard. Community-owned companies may be more likely to garner local support, but it’s also important to consider the wider picture, with climate concerns, the biodiversity crisis and conflict in Ukraine all increasing pressure to speed up development. “The issue is how to develop wind farms quickly,” says Frantzen.
The wind sector is likely to remain a major part of Denmark’s energy mix well into the future, yet more effective consultation between experts and local communities is central to maintaining public support. This is not simply a question of providing numbers and graphs on how noisy wind turbines are, as that doesn’t capture the reality of their impact. “We’ve seen in the past that the ‘technified’ strategy of letting science and technology give us the numbers doesn’t really end the controversy over wind turbine noise,” stresses Professor Kirkegaard. The hope is to find collaborative ways to discuss differences and identify compromises in the ongoing development of the wind energy sector. “There are plans for energy islands in the North Sea and around the Baltic. These will be huge facilities, with offshore hubs to connect multiple wind farms, enabling the distribution of very large amounts of
electricity over very long distances, first to Germany, and potentially also to Sweden and Poland,” says Professor Kirkegaard. There is also potential for any excess wind energy to be used to mitigate the environmental impact of hard-to-abate sectors, like aviation, public transport and agriculture. Excess wind energy can be converted and stored in hydrogen and other types of fuel through Power-to-X technologies, which opens up new possibilities. “These fuels can then be used in heavy transport vehicles. This means we can now put even more value on wind power, because the excess wind power can now be put to use in other ways,” explains Professor Kirkegaard. While this would seem to strengthen the case for further development of wind energy, Professor Kirkegaard says it is important to consult with affected communities and take local concerns into account. “Energy companies need to be mindful of local concerns and navigate carefully,” she acknowledges. “That’s something that we are contributing to through our work in the project.”
This could help build local support for wind farms at an early stage in development and prevent problems later on. Where people feel they haven’t been consulted, they may move to outright opposition, and seek ways of stopping a development, a situation researchers hope to prevent. “If we can get more people involved in figuring out what kinds of solutions we should have and where should they be placed, then we would probably have fewer of these big local conflicts, where projects are delayed for many years or even cancelled completely,” says Frantzen.
Truth and reconciliation commissions have been established in several Scandinavian countries to deal with historical injustices committed against the indigenous Sámi people and Finnic speaking minorities. Researchers in the TRiNC project are exploring how these commissions function and their role in shedding new light on the past, as Dr Astrid Nonbo Andersen explains.
The indigenous Sámi people and national minorities, the Kven, Norwegian Finns and Forest Finns in Norway, as well as the Kven, the Tornedalians and the Lantalaiset in Sweden, have historically been subject to discrimination and injustice by the states of Norway, Sweden and Finland. These practices include often harsh assimilation policies that marginalised or even suppressed indigenous and minority cultures. Truth commissions and Truth and Reconciliation commissions have been established in these three countries to investigate what happened and deal with the legacy of the past, although the scope of the mandate varies in each case. “The Norwegian mandate is quite tight and is focused on assimilation policies. The commission was asked to identify the truth about those policies, get an overview of their long-term effects, and make recommendations on reconciliation,”
outlines Dr Astrid Nonbo Andersen, a Senior Researcher at the Danish Institute for International Studies. The Swedish and Finnish mandates, by contrast, are longer. “The Swedish mandate for the Truth commission for the Sámi is focused on truth, while it was stipulated that a separate reconciliation process could only take place after the truth had been uncovered,” continues Dr Nonbo Andersen. “The Finnish mandate is very broad and includes a focus on rights, which is absent in the Norwegian mandate.”
As part of her work on the TRiNC project, an initiative funded by the Danish Research Council, Dr Nonbo Andersen is looking at how these commissions came about, how their reports are received in public and which types of political action they generate. The Sámi and the minorities themselves initiated the processes that led to the establishment of the commissions. “The Sámi achieved some major political and legal victories in the 1980s, particularly in Norway, and the minorities a little later,” says Dr Nonbo Andersen. However, their languages and culture are still under pressure, and the commissions are part not only of the struggle to set the historical record straight, but also of efforts to resolve ongoing conflicts. The Nordic TRCs share this in common, but there are also major differences. “The Scandinavian nations are often grouped together and thought of as pretty similar, yet there are actually some pretty big differences between them. That is related to their history, and how these states treated the Sámi people and the Finnic-speaking minorities,” explains Dr Nonbo Andersen. “This is directly reflected in the mandates and in how the TRC processes unfold in the three countries.”
The project team have been following the work of the commissions as they have progressed, with Dr Nonbo Andersen and her colleagues seeking to take a comparative, trans-state perspective. The papers written by team members reflect on how international experience with TRC processes is echoed in the Nordic TRCs, and also aim to inform a broader international audience about how the Nordic TRCs differ from previous TRCs, which may not be obvious at first glance. “We hope to shed light on how the concept of reconciliation is translated into practice in a Nordic context, through detailed analysis of the processes. This may also be useful for other activists, practitioners and policymakers considering how to deal with instances of historic injustice,” she outlines.
Truth & Reconciliation in the Nordic Countries
The project is funded by The Danish Research Council’s Sapere Aude Research Leader Program
Dr Astrid Nonbo Andersen, Senior Researcher Peace and Violence
Danish Institute for International Studies Gl. Kalkbrænderi Vej 51A
2100 Copenhagen O
Denmark
T: +45 9132 5594
E: asan@diis.dk
W: https://www.diis.dk/en/projects/trinctruth-reconciliation-in-the-nordic-countries
Astrid Nonbo Andersen is a Senior Researcher in the Unit for Peace and Conflict at the Danish Institute for International Studies (DIIS). Her research focuses on historical justice, reconciliation and restitution in the Nordic region as well as current and historical relations between Denmark and its former colonies.
Handwriting is an enormously important cultural practice, a means for people to communicate and pass on knowledge and experience. The Understanding Written Artefacts (UWA) cluster brings together a broad community of researchers to analyse a variety of written artefacts, from 5,000 years ago right up to the present day. We spoke to some members of the cluster about their research.
The oldest surviving written artefacts in the world are thought to be Mesopotamian clay tablets dating from around 5,000 years ago, while handwriting is still being practiced today in a wide variety of forms. The team behind the Understanding Written Artefacts (UWA) research cluster, based at the Centre for the Study of Manuscript Cultures (CSMC) at the University of Hamburg in Germany, are studying a diverse range of written artefacts from cultures across the world. The cluster brings together researchers from more than 40 disciplines, who are currently working on around 60 different projects. “We are looking at any practice where human societies wrote on objects. Our premise is always that we start from the material object, not the text,” outlines Professor Konrad Hirschler, Director of the UWA cluster. “This could be a cuneiform tablet, a parchment scroll, a WW1 letter or modern-day graffiti for example.”
The cluster’s distinctive feature is that it brings together humanities scholars alongside researchers in natural sciences and computer science, who have developed a number of innovative techniques to analyse written artefacts. The artefact laboratory at CSMC provides both stationary and mobile facilities designed to help researchers build a fuller picture of these objects and enrich our understanding of the past. There are three components of the artefact lab, one of which is a mobile lab, providing equipment which can be transported to the locations where written artefacts are held, so avoiding the need to transport them and expose them to undue risk. “We can go into museums and analyse the objects there, as museums will not always send them to an outside institution,” continues Professor Hirschler. “Members of the cluster have conducted research at the Louvre in Paris for example, and we have also been to the Museum of Anatolian Civilizations in Ankara recently.”
The high-performance lab at CSMC, meanwhile, is equipped with state-of-the-art technology to analyse those artefacts that can be transported. For example, Graz University Library recently sent an Egyptian papyrus known as the ‘Mummy Book’: arguably, this papyrus is a fragment of a codex dating from the 3rd century BCE, around 300-400 years before book binding was believed to have been established. This exciting manuscript was analysed in great depth in Hamburg in August 2024. “We have conducted full research on the ‘Mummy Book’ across our methodological toolkit,” outlines Professor Hirschler. The researchers are currently analysing the data, which they hope will shed light on whether the papyrus was actually part of a bound codex.
A third component of the artefact lab is a container lab, which is currently stationed in Puducherry, India, for analysis of palm-leaf manuscripts, one of the most important writing supports in East and South-East Asia. This laboratory is fully equipped to conduct biochemical and DNA analysis of historical palm-leaf manuscripts, which Professor Hirschler says can lead to fresh insights into their provenance. “We are able to gain entirely new information on the biography of these manuscripts, which are currently de-contextualised. Our approach adds a whole range of new data to scholarly discussions,” he continues.
A variety of other techniques have also been developed within the cluster to uncover writing that was previously hidden or invisible.
A prime example is the ENCI (Extracting Non-destructively Cuneiform Inscriptions)
CT-scanner, which allows researchers to read cuneiform tablets enclosed in clay envelopes. Other projects in the cluster employ imaging technologies and AI, for instance to make palimpsests - manuscripts on which later writing has been superimposed - readable. “Our researchers have been working with handwritten artefacts for over a decade, and they develop cutting-edge methods, such as image inpainting employing deep learning, to make writing visible,” says Professor Hirschler. Among the methods used, the cluster has built up a global reputation for multi-spectral imaging, which involves emitting light at different wavelengths.
“The idea is that the reflections will differ depending on whether there are still traces of ink at that particular point on the object,” explains Dr Jakob Hinze, the cluster’s communications coordinator. This can enable researchers to essentially reconstruct faded or erased texts on several different writing supports.
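The principle Dr Hinze describes can be sketched in a few lines: given reflectance values for each wavelength band, one looks for the band where ink traces and the writing support differ most. The numbers below are made up for illustration, not measured data.

```python
# Hypothetical reflectance fractions at four wavelengths (UV, blue, red, near-IR)
WAVELENGTHS_NM = [365, 450, 640, 940]
parchment = [0.30, 0.55, 0.70, 0.80]
faded_ink = [0.28, 0.52, 0.68, 0.55]   # iron-gall ink stays darker in the infrared

def best_band(a, b, bands):
    """Wavelength band with the largest reflectance contrast between two materials."""
    diffs = [abs(x - y) for x, y in zip(a, b)]
    return bands[diffs.index(max(diffs))]

# With these toy values, the near-infrared band separates ink from parchment best
print(best_band(parchment, faded_ink, WAVELENGTHS_NM))
```

In practice a multispectral camera captures a full image stack per band, but the per-pixel logic, maximising contrast across wavelengths, is the same idea.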
Many scholars within the cluster are also interested in the overall context and evidence of the practices that were applied to a material object. Each individual written artefact has its own history and is worthy of attention, believes Professor Hirschler. “A new artefact produced 800 years after the original is not simply a ‘copy’ to us, it’s a new original. It’s a new engagement with this specific content. It’s been put into a new form, layout and material shape,” he stresses. What might previously have been called a copyist in fact had a fair degree of authorial agency in actively reframing texts, which has not always been reflected in the way researchers have dealt with such artefacts. “In the 19th century critical editions of certain texts were printed, which were presented as the authoritative text, but this is not how humans have historically engaged with information management,” says Professor Hirschler. “It’s a very short blip, because in the digital era we are once again moving to a much more flexible way of dealing with texts.”
Yao religious scroll on paper. It shows the Fam Ts’ing, ‘the three pure ones’, which represent the celestial. The scroll is hung vertically and has ritual functions.
The primary goal in this research project is to investigate manuscript cultures, yet at the same time Professor Hirschler and his colleagues are also keenly aware of their ethical responsibility in dealing with the objects themselves, and the need to preserve them for future generations. Where manuscripts have been found to be in a fragile state, researchers have focused on preserving and stabilising them before engaging in research activities. “We aim to bring manuscripts in
a problematic state of conservation onto a level where we are confident that they will survive and that further research can be conducted on them in future,” says Professor Hirschler. By focusing on the material objects, researchers have learned about how they can be preserved, knowledge that they are developing with institutions in different parts of the world. “We are working internationally with numerous institutions, including in West Africa and East Asia,” says Dr Hinze.
An increasing number of humanities projects are now coming to Hamburg to use the facilities and add a material dimension to their work, which provides strong foundations for continued research. With ongoing development of the facilities, also in collaboration with the team at the DESY synchrotron in Hamburg, Professor Hirschler hopes to build on the cluster’s reputation as a centre of excellence in manuscript research. “We do a lot of methodological development work with the team at DESY, essentially developing different methods to analyse written artefacts,” he says. These methods and facilities can help museums, archives and other institutions build a deeper picture of the cultural artefacts that they hold. “By working together with DESY, we have a real chance to build up a sustainable methodology on how to use synchrotron radiation to analyse cultural heritage artefacts,” he says.
Dr Hussein Mohamed is head of the Visual Manuscript Analysis lab and a Principal Investigator in the UWA cluster at the University of Hamburg. We spoke to him about his work in the cluster and how computational document analysis can uncover new details about written artefacts.
EU Researcher: What is your role within the UWA cluster?
Dr Hussein Mohamed: My colleagues and I are utilising and further developing AI approaches to the visual analysis of written artefacts as a whole, beyond their textual content. Our work has three strands. First, we analyse handwriting in terms of style, the ink used, writing tool, and writing support. Second, we detect, cluster, and classify particular visual features of written artefacts, which makes it possible to navigate through large collections. Third, we recover lost visual information in ancient manuscripts that were damaged or overwritten. An example of this is palimpsests. We utilise generative AI techniques such as image inpainting to reconstruct texts that cannot be read anymore, not even with the help of modern imaging techniques such as multispectral imaging.
EUR: How do you approach improving these methods? What role does feedback from the scholars play?
HM: Novel methods in computer vision can help scholars in the humanities to address some of the challenges in manuscript research. For our collaborations to be fruitful, however, both sides need to invest time: As a computer scientist, I first need to understand the details of the research questions that my colleagues in the humanities are engaging with; likewise, the scholars need to invest time to get to grips with the applied approaches I can offer them, and the results that these approaches can generate. In short, to improve a given method in such a way that it helps scholars
in the humanities with their specific research question, I need to thoroughly understand the research question itself, and I need feedback from the scholar who eventually wants to work with the results generated by this method.
EUR: What are you currently working on in the cluster?
HM: I am currently researching advanced mechanisms for the visual navigation of large manuscript collections. Some collections are too complex to be analysed by a single modality, i.e. the way in which information about an artefact is represented, for example in a text or an image. In this research, known as vision-language learning, we extract features from both textual and visual modalities and learn the associations between them. For example, visually rich images cannot be fully analysed by the mere detection of a seal or a drawing. We also need information on other aspects such as the spatial relations between different visual elements and their semantic context. To get a holistic understanding of a given written artefact, we need to integrate all the information that we can extract.
EUR: Do you ultimately aim to help scholars spend more time working with the manuscripts rather than sifting through data?
HM: By developing new methods, we can indeed help scholars save time on repetitive tasks, giving them more time for the intellectually exciting aspects of their research. However, the value of our methods is not limited to automation. By combining large amounts of documented knowledge, approaches from our research in computer vision can yield genuine new insights that would otherwise be unattainable, such as finding statistical correlations between different patterns or recovering damaged texts.
Manuscript 2058/6a from the Georgian manuscript collection of Graz University Library is being prepared in the UWA laboratory for X-ray fluorescence scanning (XRF) to determine its provenance through the composition of the black and red inks used.
Jost Gippert is Senior Professor at the Centre for the Study of Manuscript Cultures at the University of Hamburg. Together with a team of researchers at CSMC, Professor Gippert is investigating the development of literacy in the Caucasian territories, building on analyses of palimpsests.
EU Researcher: What are palimpsests?
Professor Jost Gippert: Palimpsests are manuscripts, usually written on parchment, the contents of which subsequently became obsolete. However, people decided not to throw them away, as parchment was very expensive in the Middle Ages. Rather, they simply erased the contents and used the parchment again for something else. For example, we have palimpsests from the 7th century, which were read for maybe 200-300 years. Then they were erased and overwritten in the 11th century with something totally different.
EUR: What is the scope of your research in the Caucasus? What is the timeline of the materials you’re looking at?
JG: There have been two written languages in the Caucasus since the 5th century, Armenian and Georgian. A third, Caucasian Albanian, was only written for 200-300 years, between around the 5th and 8th centuries, then given up. There is a successor that still exists today, a language spoken by a small number of people in the Caucasus called Udi. Literacy in the region dates from around the 5th century, when it was Christianised. The manuscripts we have from that time are nearly all palimpsests, as practically none survived without being erased later on.
Our timeframe is from the 5th to about the 10th-11th centuries. We are trying to figure out what they were writing by deciphering the erased text on the palimpsests. Most of the texts are Bible translations, along with translations of other theological texts.
EUR: Did the languages co-exist alongside each other? Or were there times when one language was dominant?
JG: It went in waves. Armenian was dominant initially, then Georgian developed its own individual stamp a bit later. Then we had a very important historical fact, namely a schism of the churches at the beginning of the 7th century, when the Armenian Church
departed from the main Greek orthodox tradition, but the Georgians remained. This was a major split, and then they came closer and moved further apart at different times.
EUR: Are you looking at how languages evolve? How do you approach your work on cross-language synthesis?
JG: When we consider the traditions of the early centuries of literacy, we can see that everything was joined together to a certain degree, which is what we mean by synthesis. We put together the information we have on the one language with information that we have on the other, and we try to figure out the extent to which they were influencing each other.
Cécile Michel is a Professor of Assyriology at the National Center for Scientific Research in Nanterre, France, and a member of the UWA cluster, where she leads a project dedicated to reading cuneiform tablets hidden in their original clay envelopes using ENCI, a high-resolution mobile CT scanner. We spoke to her about what wider insights can be drawn from this work.
EU Researcher: Could you describe the cuneiform tablets?
Professor Cécile Michel: Cuneiform is the first form of writing invented in human history, in use from 3,400 BCE up to 100 CE. It’s a script which was used in three different writing systems – the first had one sign for one word or idea, the second one sign per syllable, and the third was an alphabet made up of cuneiform signs. This script was used by something like 15 different languages from different families across a very large area, from the Mediterranean to the Persian Gulf, to the Black Sea, to Egypt.
I am working on private archives of these tablets from the 19th century BCE, excavated in central Anatolia at a site called Kültepe – the ancient name is Kanesh. Around 23,000 tablets have been found at Kültepe, only half of which have been read and published.
EUR: Who wrote on the tablets? What type of material are you uncovering?
CM: We find many letters written by merchants, as they travelled a lot, and we also have a lot of letters sent by their wives. They would often write about economic issues, but there’s also a lot about day-to-day details and family relations. Literacy was quite widespread in the Assyrian population, although at different levels. We find tablets from skilled scribes, who write very nicely in very clean, straight lines. We also find tablets written by learners, who make clear grammatical errors.
EUR: How do you use the ENCI in your research?
CM: If, for example, a letter could not be delivered, it often remained in its clay envelope and was never read. Today, we must not destroy these envelopes because they are cultural artefacts in their own right. Nevertheless, we want to read the hidden texts. ENCI enables us to do so. We simply put a tablet into ENCI, and it generates images in slices of 15 microns, that are later reconstructed into a full object. So we can see all the details in very high definition: Not only can we read the hidden text, we can also see how the clay was prepared and shaped into a tablet. We sometimes find inclusions like
stones, snails, shells, seeds, and leaves, which give us hints about the natural environment.
EUR: How far advanced is your research? Are you also working to develop new tools and instruments to analyse cuneiform tablets in greater depth?
CM: I’m developing new ideas to better understand the cuneiform tablets in a holistic sense, so not only the text, but the object as a whole. For example, new techniques can enable us to identify the clays and be more precise about the origins of the letters. Sometimes there are fingerprints on a text which can be analysed, from which we can learn more about the age and gender of the scribes.
Understanding Written Artefacts: Material, Interaction and Transmission in Manuscript Cultures
Project Objectives
The Cluster of Excellence ‘Understanding
Professor Christian Schroer is head of the high-performance laboratory of the UWA cluster and of DESY’s synchrotron radiation source PETRA III in Hamburg, home to sophisticated equipment that can be used to analyse written artefacts. He also develops mobile tools and methods like the X-ray tomograph ENCI as part of his work in the UWA cluster. We spoke to him about how synchrotron radiation can help researchers look deeper into written artefacts.
EU Researcher: Could you tell us about your role in the cluster?
Professor Christian Schroer: I am co-director of UWA and part of the team representing the natural sciences in the cluster. We have a lot of analytical tools at the lab to investigate the chemistry, physics, structure and other aspects of material objects, which is what makes synchrotron radiation interesting. However, objects need to be taken to the lab for analysis, and many archives and museums won’t allow that, so we needed to develop mobile instruments, as well. It was clear to me that we could use tomography to look inside written artefacts like cuneiform tablets in a non-destructive way, and do it on site, which led to the development of ENCI.
EUR: How does ENCI work?
CS: The X-rays are produced by an ordinary X-ray tube, but the point where they are generated is really small. We can essentially focus the electrons onto a thin tungsten foil in which the X-rays are generated in a really small spot. There is then a very well-defined line through your object, between the source and the detector, which gives you a really sharp projection. We can then see details of the handwriting, as we have a resolution of basically half the width of a hair.
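The sharp-projection geometry Schroer describes can be sketched numerically. A minimal illustration, with invented numbers rather than ENCI’s actual focal-spot size or distances:

```python
# Hypothetical cone-beam geometry; the numbers are illustrative assumptions,
# not ENCI's actual specifications.
def effective_resolution_um(spot_um, pixel_um, src_obj_mm, src_det_mm):
    """Resolution limit of a cone-beam X-ray projection, referred to the object.

    Magnification M = (source-detector distance) / (source-object distance).
    The finite focal spot blurs the projection by spot * (M - 1) / M, while
    the detector pixel is demagnified to pixel / M; the achievable resolution
    is roughly the larger of the two.
    """
    M = src_det_mm / src_obj_mm
    source_blur = spot_um * (M - 1) / M   # penumbra from the finite spot
    pixel_limit = pixel_um / M            # detector sampling limit
    return max(source_blur, pixel_limit)

# A 5 µm spot with 50 µm detector pixels at 10x magnification:
print(effective_resolution_um(5, 50, 50, 500))  # 5.0 (µm)
```

With a small enough source spot, the resolution is set by the detector pixel divided by the magnification, which is why generating the X-rays in a really small spot gives such a sharp projection.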
EUR: And ENCI is mobile?
CS: Yes, it weighs slightly more than 400 kg, and can be disassembled into eight pieces, the heaviest of which weighs 100 kg. By comparison, a typical micro-CT system usually weighs between 2 and 7 tonnes.
ENCI needs a lot of penetrative power to shine through bricks, but at the same time we need to maintain safety standards. We use tungsten to essentially contain the X-rays, but we use as little as possible to keep the weight down.
EUR: Are you looking to improve ENCI and develop new tools?
CS: Yes, as many people within the cluster are now more aware of how tomography can be used to investigate written artefacts, and are interested in using these methods. We can look into changing the dimensions of the chamber to accommodate larger objects, while books might be more easily penetrated by X-rays, so we can also change certain parameters. I see my role as bringing these X-ray techniques that we use in the lab to the cluster, and making them applicable to the analysis of written artefacts.
The cluster’s main objective is to investigate the rich diversity of global manuscript cultures beyond traditionally held boundaries of academic discipline, time, and space. Its global perspective encompasses all objects carrying handwriting, from the beginning of writing to today’s digital age.
Project Funding
The Cluster of Excellence ‘Understanding Written Artefacts: Material, Interaction and Transmission in Manuscript Cultures’ (project no. 390893796) is funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany’s Excellence Strategy.
Project Partners
• Helmut-Schmidt-Universität / Universität der Bundeswehr Hamburg (Helmut Schmidt University/ University of the Federal Armed Forces Hamburg)
• Technische Universität Hamburg (Hamburg University of Technology)
• Deutsches Elektronensynchrotron (DESY, German Electron Synchrotron)
Contact Details
University of Hamburg
Cluster of Excellence
‘Understanding Written Artefacts’ Warburgstraße 26
20354 Hamburg, Germany
T: +49 (0)40 42838-7127
E: manuscript-cultures@uni-hamburg.de W: https://www.csmc.uni-hamburg.de/
The early years of a child’s life are highly formative, and while parents are typically the primary care-givers, the state often provides support to young families through policies such as subsidised childcare. How do families respond to public policies? Does their response depend on their background? These questions lie at the core of Dr Hans Sievertsen’s research.
The early years of a child’s life are highly formative, and parents typically devote enormous effort to nurturing their children, for example by providing nutritious meals, reading stories and taking them to toddler groups. Alongside these parental investments in children, many European governments also provide some level of support to families. “Some countries offer child benefit and tax credits to families with young children, while most governments subsidise childcare to some degree. Certain governments also have health programmes specifically targeted at young families, such as the nurse home visiting programme (NHV) in Denmark,” outlines Dr Hans Sievertsen, a researcher at VIVE, the Danish Centre for Social Science Research. As Principal Investigator of a research project based at VIVE, Dr Sievertsen is looking at the relationship between the way parents support their children and public policy, building on data from tax registries, surveys and nursing records. “We want to look at how families respond to public policies. What do families do if child benefit is increased? Does their response depend on their background?” he explains.
There are three main strands to this research. One part of the project is largely descriptive, with Dr Sievertsen and his colleagues seeking to describe existing inequalities and their effects on the very early stages of family life following childbirth, using data from Denmark. “There we look, for example, at the relationship between birth weight and parental income, or the extent to which the number of contacts with a nurse in the first year depends on the parents’ income. This is purely descriptive and is essentially about setting the stage; how different are families? How often does a high-income family see a nurse in comparison to a low-income family?” he says. The project team are also looking at health investments. “This covers essentially the first year of a child’s life, and there we look to see how families respond to a home visit by a nurse,” continues Dr Sievertsen. “The third main element of the project is focused on broader public policies, like child benefit, which starts immediately after childbirth. Here, we’re looking at data on the first three years after childbirth.”
Researchers in the project are using three sources of data in this work, one of which is administrative records from tax registries, which provide information on people’s incomes and their working patterns. This can then be linked to their educational records and other outcomes, on which a wealth of data has been recorded in Denmark. “There is a very good research infrastructure in Denmark,” stresses Dr Sievertsen. This information is combined with two other sources of data unique to the project. “We did two big surveys on around 60,000 children in 2017 and 2020, where we went out and asked parents questions on things like how much breastfeeding they did. We also did some screening questions about the child’s development,” says Dr Sievertsen. “The third source of data is nursing records from the NHV programme. In Denmark every family is visited by a nurse a few weeks after childbirth, who comes to the family, checks how things are going, and offers advice and guidance. We have access to these records – which are anonymised – and we are using them in our research.”
“We want to look at how families respond to public policies. What do families do if child benefit is increased? Does their response depend on their background?”
This research involves looking at how parents respond to public policy in terms of their working patterns. In low-income families, for example, increased levels of financial support may allow one or both parents to adjust their working patterns, potentially allowing them to increase the amount of time they can spend with their children. “We can see that when low-income families receive more child benefit, it allows the mother and father to re-allocate their labour supply,” says Dr Sievertsen. However, when child benefit is reduced to a lower level the father often starts to work more to compensate, which leads to some changes, as Dr Sievertsen explains. “The child will then spend more time with their mother, because the father is working to earn money,” he continues. “From the nurse records we can see that when low-income families receive less child benefit, the nurse’s screening is more likely to indicate maternal mental health challenges. We have also found differences in breast-feeding behaviour and other outcomes, so we can see that families do certain things differently when they have less money.”
The project is primarily focused on the impact of investments in children during their early years, yet these investments – or the lack of them – may have a long-term impact, for example on their later educational attainment. This is a topic that Dr Sievertsen is exploring. “We are working on linking the project on child benefit to school test results. We want to investigate whether there is a relationship between how a child does in school and the amount of child benefit their parents received in their early years,” he outlines. There is a large body of research in health and social sciences looking at the effects of dramatic shocks or events in early life on the subsequent life course, yet it can be difficult to assess the impact of smaller, more regular shocks that may then become part of an individual’s mental background. “Many things happen to children, they are exposed to shocks all the time. So how can we look at how one shock may affect them later in life?” asks Dr Sievertsen. “If something dramatic happens to a child – like witnessing violence – and the parents have the means, then often they will both try to compensate for the negative effects.”
A lack of resources and access to education and cultural opportunities may seem trivial by comparison, but parents may be more constrained in their ability to compensate. This may then have long-term effects on the child, an issue Dr Sievertsen plans to explore further in future. “We are in the process of working on several sub-projects and research papers,” he says. This research could also help policy-makers understand the likely impact of changes to child benefit or nursing programmes, and Dr Sievertsen is keen to bring the project’s findings to wider attention. “We want to give policy-makers the knowledge they need to decide how they want to design the nursing programme and the child benefit system for example,” he continues.
FAMILY BACKGROUND, EARLY INVESTMENT POLICIES, AND PARENTAL INVESTMENTS
Project Objectives
This project combines data from administrative records, nurse records, and surveys to examine how public policies—such as healthcare, childcare, and financial support for new families—interact with parental behaviors to shape child development.
Project Funding
Funded by the Independent Research Fund Denmark, official title “Family background, early investment policies, and parental investments”. Amount: 5,124,672 DKK. Grant: 0218-00003B.
Project Partners
Department of Economics, University of Copenhagen
Contact Details
Project Coordinator, Dr Hans Sievertsen
Professor MSO
VIVE – The Danish Centre for Social Science Research
Herluf Trolles Gade 11
1052 Copenhagen K, Denmark
T: +45 29 65 84 75
E: hhs@vive.dk W: www.hhsievertsen.net
Dr Hans Sievertsen is a professor with special responsibilities at VIVE, the Danish Centre for Social Science Research, and an associate professor at the School of Economics, University of Bristol. He is an applied micro-economist working on topics including education, health, gender and inequality.
Artificial Intelligence (AI) tools are increasingly being used to inform decision-making in areas including law, finance and healthcare. The individuals affected by automated decisions have the right to meaningful information about the basis on which they were reached, as well as the right to contest the decision, issues at the heart of Professor Thomas Ploug’s research.
The commercial potential of artificial intelligence (AI) technology is enormous, with applications across many areas, including finance, manufacturing and healthcare. In medicine, AI is already being used in diagnostics and planning treatment, yet the basis of automated decisions should still be made clear to patients under the terms of the EU’s General Data Protection Regulation (GDPR). “Under article 14 of the GDPR, patients have the right to meaningful information about the logic involved, if they are subject to an automated decision. Under article 22, patients also have the right to express their opinion about being subjected to automated decision-making, and to contest that decision,” outlines Thomas Ploug, Professor at the Centre of AI Ethics, Law, and Policy at Aalborg University. As part of his work in a research project backed by the Independent Research Fund Denmark, Professor Ploug is seeking to connect these two rights. “The project is about the right to contest decisions, which needs to be defined more clearly and given more substance,” he says.
This work relates to the field of explainable AI, in which researchers seek to ensure that meaningful information is provided about the reasoning behind an automated decision. While an AI model can be used to diagnose patients in a healthcare setting, it is typically difficult for computer scientists to explain what happens in that model. “The kind of explanations they can give are not comparable to how a doctor may explain why a certain diagnosis was reached. We have argued that since it’s difficult to arrive at explanations of AI-based decisions, maybe we should change the perspective. Maybe we should see the right to an explanation more in the light of the right to contest decisions,” says Professor Ploug. When people choose to contest a decision, they often want to contest the grounds on which it was reached; patients in this situation should have access to information about the AI model involved, believes Professor Ploug.
Many patients may be willing to accept a decision reached by an AI model, but those who do choose to contest it should have access to relevant information, believes Professor Ploug, including information about the data used, model bias and performance, and about the extent of human involvement in the decision-making. Rather than trying to look into what might be described as the black box of an AI system, Professor Ploug suggests that patients should instead have access to this information. “We should understand the right to an explanation in light of the right to contest decisions,” he says. The project team is working to essentially reinterpret the requirement for an explanation in light of the right to contest decisions.
The project’s agenda also includes research into the finance sector, where banks may use AI systems in profiling an individual’s creditworthiness or in certain types of decision-making. In this situation banks also need to provide explanations and give people the opportunity to contest AI-based decisions, while researchers are also considering legal decisions. “We are looking into asylum systems. If a decision on whether to grant an individual asylum is partly made by an AI system, then that individual also has the right to an explanation and to contest the decision. The information requirements in these different contexts may not be the same,” outlines Professor Ploug. AI technology is already being applied in these kinds of scenarios today, and with new applications emerging almost daily, there is a pressing need for effective regulation. “We are becoming more and more aware of the need to develop legislation as new and highly transformative technologies emerge,” says Professor Ploug. “The European Union’s AI Act will be the world’s first comprehensive piece of AI legislation.”
This legislation is risk-based, with AI systems classified into different categories, and the information requirements then depend on that classification. While Professor Ploug believes this is a positive step in terms of regulating AI, he says that it should be complemented by a system of rights with the needs of individual people at its heart. “The regulatory relationship is between a national AI authority and companies. We need to get citizens back into the equation. Alongside this AI Act, we should also establish rights that enable individuals to participate in the regulation of AI use and development,” he argues. Professor Ploug and his research team are therefore also working on a wider set of individual rights in relation to AI use.
“We have argued that since it’s very difficult to arrive at explanations of AI-based decisions, maybe we should change the perspective. Maybe we should see the right to an explanation more in the light of the right to contest decisions.”
Alongside this work on people’s rights, the researchers are also looking at how these rights can be communicated to the wider public, which Professor Ploug says is an important aspect of the project’s work. “We have all sorts of rights as individuals but a lot of people don’t know about them. And if you are unable to define your own rights then clearly it’s very difficult to act upon them,” he points out.
A greater awareness among the public of the right to contest automated decisions may well then encourage more people to do so. It’s essential in this respect that information is provided in an accessible and concise way which can inform lay members of the public, in contrast to many of the excessively long cookie consent forms on the internet, for example. “We would like to provide this kind of information in a manageable way,” stresses Professor Ploug. This could ultimately help boost transparency and enhance public trust in AI as the technology becomes a reality in our everyday lives. “The discussions about the potential of AI and possible negative effects are to some extent being superseded by actual development,” continues Professor Ploug. “The debate now is about where it can be used, for what purposes, and what we want it to do. We’re now moving more towards discussions about where this technology can benefit a particular enterprise or institution, or offer a particular service to clients, citizens, or patients.”
Contestable Artificial Intelligence (AI)
Project Objectives
We propose a substantial interpretation of the EU GDPR stipulation that individuals subjected to automated processing, including profiling, have a right to contest the decision-making. A substantial notion of contestability may inform technical approaches to AI explainability, as well as shed important light on an individual’s GDPR right to “meaningful information about the logic involved” when subjected to automated profiling.
Project Funding
Funded by the Independent Research Fund Denmark. Grant ID: 10.46540/202700140B. Contestable Artificial Intelligence (AI) - Defining, evaluating and communicating AI contestability in health care, law and finance.
Project Partners
• University of Aalborg, Denmark (Coordinator).
• University of Copenhagen, Denmark
• Technical University of Denmark, Denmark
• University of Manchester, United Kingdom
Contact Details
Project Coordinator,
Professor Thomas Ploug, PhD
A C Meyers Vaenge 15
2450 Copenhagen SV
T: +45 31417140
E: ploug@ikp.aau.dk
W: https://vbn.aau.dk/en/persons/ploug
Professor Thomas Ploug, PhD is director of the Centre of AI Ethics, Law, and Policy and of the Centre of Ethics Education at Aalborg University, Denmark. He heads the University Research Ethics Committee.
Ploug is a former member of the National Council of Ethics in Denmark (2010-2016), and he is currently a member of a Clinical Ethics Committee at Rigshospitalet in Copenhagen. Ploug also holds a position as guest professor at Halmstad University in Sweden.
Organic light-emitting diodes hold immense value in modern society, yet they are relatively limited in terms of brightness. The team behind the Ultra-Lux project are developing a new type of thin-film light-emitting diode, targeting extreme brightness, using a class of materials called perovskites, as Professor Paul Heremans and Dr. Karim Elkhouly explain.
Essentially devices that convert electricity into light, OLEDs hold immense value in modern society as light-emitting pixels in smartphone displays, televisions and Virtual Reality glasses. However, OLEDs are relatively limited in their brightness. As leader of the Ultra-Lux project team, Professor Heremans has been working to develop a new type of thin-film light-emitting device using a class of materials called perovskites. “The hope in the project is to achieve super-bright thin-film LEDs, 10,000 times brighter than an OLED, as a stepping stone to build laser diodes out of thin-film light-emitting devices,” he outlines.
This research encompasses all aspects of the design of a light-emitting device, with the aim of going well beyond the capabilities of current OLEDs. The brightness of a diode is a product of its efficiency in converting electricity to light, and also its current density, in which OLEDs are relatively limited. “The layers in OLEDs are organic semiconductors with a very low charge carrier mobility. OLEDs are extremely efficient at low current densities, but they aren’t at super-high current densities,” says Professor Heremans. This may be perfectly sufficient for some applications, but not for others like laser diodes, which have much higher demands in terms of high current-density operation, an issue that the project team are working to address. “We want to find a new thin-film device system, compatible with high current densities,” continues Professor Heremans.
The project team are working with metal-halide perovskites, which will form the active layer of these new LEDs, where light is created. The advantage of this material class is its high mobility for charge carriers, which should in principle allow a high current density through the device. Beyond the active perovskite layer, researchers are investigating the entire architecture of the LED, with special attention to heat management. Designs become even more complex when targeting laser diodes. “There are three main components in a laser. First you need a pump source – we want to make an electrically-pumped laser diode, so we pump electrically. Then you need the gain medium, in this case the perovskite, which we can fine-tune chemically. The third thing is a resonator, which is a cavity; a typical example is a pair of mirrors,” explains Karim Elkhouly, a researcher at imec & KU Leuven who is working on the Ultra-Lux project in the imec laboratories. One major priority in this part of the project work is to reduce the lasing threshold. “In a laser diode, you always have absorption of light, and you always have generation of light by stimulated emission. The lasing threshold is the point at which there is more
stimulated emission than absorption, and from then on you can actually see lasing,” says Professor Heremans. Researchers aim to reduce this lasing threshold to the lowest possible level, limiting the need to increase the current, as reaching very high current densities is particularly hard in thin-film light-emitting devices. “We need to reach a certain threshold or carrier density to actually see lasing. We are trying to reduce that threshold as much as possible by reducing the absorption and photon losses through various modifications, such as changes to the architecture and design, and improvements to the resonator,” outlines Professor Heremans. The project team are also using quantum confinement features to create conditions where lasing exceeds absorption in a smaller, more enclosed space. “If you manage to narrowly confine the space where you reach that lasing, then the density means there is an absolute smaller number of carriers. That means you can put in a lower absolute current,” continues Professor Heremans.
The nature of the perovskite active layer is an important factor in terms of the goal of reducing the lasing threshold, but it’s also important to consider the layers around it, which can also contribute to losses. “We aim to design a device stack in which we can achieve both high efficiency and low threshold, but there are compromises to make,” explains Elkhouly. “For instance, in order to improve the efficiency of the device, it may sometimes be necessary to do things that increase the threshold to a higher level.”
A type of perovskite with large three-dimensional grains is being used, with indium tin oxide for the electrodes, to mitigate optical losses. “Indium tin oxide is a transparent conductor, meaning that it can conduct current pretty well, but at the same time, it’s also transparent, so it allows you to control the amount of electrical conductivity versus the optical losses,” he explains. “We optimize indium tin oxide material properties to the best possible compromise between low optical losses and high electrical conductivity. We are also integrating an optical cavity, by designing a cavity that fits underneath the entire light-emitting device stack.”
“The aim in the project is to achieve super-bright thin-film LEDs with a brightness 10,000 times higher than that which an OLED can deliver.”
There is a trade-off here between improving a device’s capability to emit light at very high current densities while at the same time minimising optical losses, and significant progress has been made on both fronts, with researchers now closer to making a laser diode from these perovskites. The long-term goal is not to simply increase the brightness of perovskite LEDs, but rather to develop laser diodes that can be electrically pumped, which Professor Heremans says represents a radical shift. “We want to develop laser diodes that can be integrated on virtually any substrate by what I call monolithic integration,” he outlines. Ultra-Lux itself has recently concluded, but Professor Heremans is optimistic that full laser diodes will be developed in the next few years, while he is also interested in building on the project’s findings in future. “We’ve been working on the near-infrared, but we could extend our research towards visible wavelengths in future,” he says. “It’s fairly easy to find laser pointers for blue and for red, but they are pretty rare in green.”
This is sometimes referred to as the ‘green gap’, referring to the limited green laser options currently available. Perovskites could be a possible option here, says Professor Heremans. “Perovskites can address a wide wavelength range, and they could hold potential in terms of making green lasers,” he says.
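The threshold condition Heremans describes, lasing once stimulated emission outweighs absorption and photon losses, can be written as a one-line estimate. A toy sketch with assumed numbers, not measured Ultra-Lux values:

```python
# Toy threshold estimate for a thin-film gain medium: lasing requires the
# modal gain sigma * N to exceed the total optical loss alpha (absorption
# plus photon losses in the cavity and surrounding layers).
def lasing_threshold_density(sigma_gain_cm2, alpha_loss_per_cm):
    """Carrier density N_th (per cm^3) at which gain balances loss."""
    return alpha_loss_per_cm / sigma_gain_cm2

# Assumed gain cross-section of 1e-16 cm^2 and 50 cm^-1 total loss:
print(f"{lasing_threshold_density(1e-16, 50.0):.1e}")  # 5.0e+17
```

Halving the losses halves the threshold carrier density, which is why the team focuses on reducing absorption and photon losses, and why confining the lasing mode to a smaller volume lowers the absolute current that must be injected.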
Ultra-Bright Thin-Film Light Emitting Devices and Lasers
Project Objectives
ULTRA-LUX aims to overcome brightness limitations of thin-film light sources and achieve injection lasing via electrical pumping. By developing a high-brightness thin-film light-emitting device using metal halide perovskite semiconductors and creating a thin-film injection laser with low lasing thresholds, it will enable new applications in photonic integrated circuits, sensing and ICT.
Project Funding
This project has received funding from European Union’s Horizon 2020 research and innovation programme under grant agreement No 835133.
Project Partners
https://erc-ultralux.eu/index.php/our-team.html
Contact Details
Senior Researcher, Karim Elkhouly
IMEC
Kapeldreef 75
3001 Leuven
Belgium
T: +32471633783
E: Karim.Elkhouly@imec.be
W: https://www.imec-int.com/en/what-we-offer/research/government-funded-research/public-funded-projects/ultra-lux
Paul Heremans is Senior Fellow and Vice President for Future CMOS, Sensors and Energy at imec, and a Professor in the Electrical Engineering Department at KU Leuven. He started the organic semiconductor activities at imec in 1998 and the perovskite activities in 2014.
Karim Elkhouly is a Senior Researcher at imec, Leuven, Belgium, within the exploratory materials and devices group, having gained his PhD in 2024. His research focuses on pioneering novel device concepts for thin-film gain media, aiming to push the boundaries of photonic and optoelectronic technologies.
In magnetism, the hierarchical structure of the material is important, from the atomic structure to the microstructure via the nanostructure. The size and shape are controlled in the synthesis, while the compaction influences the texture. Making good magnets is a challenge spanning seven orders of magnitude in length scale.
Rare earth elements have been used to produce magnets since the ‘60s and production accelerated in the ‘80s, but environmental concerns and uncertainties over the supply of these materials have prompted research into alternatives. We spoke to Professor Mogens Christensen about his work in developing a more environmentally-friendly method of producing ceramic magnets.
The rare-earth magnets developed in the mid ‘80s accelerated the use of rare-earth materials, and they play an important role in many industrial applications, yet environmental concerns and restrictions on the supply of essential materials have stimulated research into alternative production methods. As head of the METEOR project, Professor Mogens Christensen is working to develop a new, more environmentally-friendly method of producing ceramic magnets that doesn’t rely on rare-earth elements. “We are developing a new approach which has significantly fewer steps and uses less energy than current methods,” he outlines. This method involves precisely controlling the shape of the crystals at very small scales. “We make sure that our crystallites have a specific shape on the nanoscale,” explains Professor Christensen. “With this shape, we can then orient the crystallites and align them so that they are all pointing in the same direction when we compact the powder. We can make our crystallites in such a way that they lie flat, then we can transform them into a magnet.”
Researchers in Professor Christensen’s group are working with crystals of a very common mineral called goethite, which is one of the main components of rust. One of the most important considerations when selecting a goethite powder in terms of the project’s overall goals is its size, and Professor Christensen and his colleagues are looking carefully at the different options. “We are currently running through five different commercial powders, to see which will perform best,” he says. Researchers then aim to essentially direct the transformation of goethite into an iron oxide compound called hematite through a relatively simple method. “We rely essentially on geometry and force,” says Professor Christensen. “Let’s think of goethite as a ruler with one very long dimension, another that is relatively thick, then an extremely thin one. Basically we throw a lot of rulers into our pressing die, then we apply pressure. As the goethite crystals have this shape, they like to align; they lie flat.”
“We can orient the crystallites and align them so that they are all pointing in the same direction when we compact the powder. We can make our crystallites in such a way that they lie flat, then we can transform them into a magnet.”

This material will then be transformed into hematite through a topotactic reaction, by which the structure of the precursor material can be related to the final product. The crystals within a material have a specific coordinate system, and that system is retained in the product crystal following a topotactic reaction. “If you can align the crystals, make sure that they are pointing in the right direction, then you can also make sure that you have the same coordinate system in your final product,” explains Professor Christensen. The wider aim is to develop magnets that meet industrial needs, which may vary across different applications. Magnets are typically characterised by the hysteresis loop, which has four main parameters of interest. “One is the saturation magnetisation, which can be linked to the atomic structure. Once you know the atomic structure, you can calculate what saturation magnetisation you should expect,” outlines Professor Christensen. “A second parameter is the coercive force, the force that will rotate magnetic moments.”

The magnetic moments tend to align, but if the crystallites become too large this leads to the emergence of a domain wall where parts of it will rotate. The other moments can be rotated relatively easily with the application of just a small magnetic field, so it’s important to control the size of the crystals. There are also two further parameters to consider. “A third parameter is the remanence magnetisation, the magnetisation that stays behind after the field has been removed. The fourth parameter is the energy product or BHmax, which is essentially an attempt to combine the other three into a single figure of merit,” says Professor Christensen. In the METEOR project, Professor Christensen is looking into both aligning the grains within a magnet, as well as modifying the internal structure. “We are putting in aluminium grains as impurities to improve the coercivity, which is very important in some areas, for example motor applications,” he outlines.
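The four figures of merit can be read off a measured hysteresis loop numerically. The sketch below does this for a synthetic descending branch of a hard-ferrite-like loop; the loop shape and all numbers (Ms, Hc, the switching width) are illustrative assumptions, not METEOR data:

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (T·m/A)

def loop_parameters(H, M):
    """Extract the four figures of merit from one descending branch
    of a hysteresis loop, M(H), sampled with H decreasing.

    Returns (Ms, Mr, Hc, BHmax):
      Ms    - saturation magnetisation (plateau at large +H)
      Mr    - remanent magnetisation, M at H = 0
      Hc    - coercive field, |H| where M crosses zero
      BHmax - maximum energy product in the second quadrant (J/m^3)
    """
    Ms = M.max()
    Mr = np.interp(0.0, H[::-1], M[::-1])       # M at H = 0 (xp must increase)
    Hc = abs(np.interp(0.0, M[::-1], H[::-1]))  # H where M = 0
    B = MU0 * (H + M)                           # flux density
    quad2 = (H < 0) & (B > 0)                   # demagnetising (second) quadrant
    BHmax = np.max(-B[quad2] * H[quad2])
    return Ms, Mr, Hc, BHmax

# Synthetic descending branch: M = Ms·tanh((H + Hc)/w), H swept downwards
Hc_true, Ms_true, w = 250e3, 380e3, 60e3        # A/m, illustrative only
H = np.linspace(500e3, -500e3, 2001)
M = Ms_true * np.tanh((H + Hc_true) / w)

Ms, Mr, Hc, BHmax = loop_parameters(H, M)
print(f"Ms={Ms:.3g} A/m  Mr={Mr:.3g} A/m  Hc={Hc:.3g} A/m  BHmax={BHmax:.3g} J/m^3")
```

A real measurement would substitute the sampled loop data for the synthetic `tanh` branch; the extraction logic is unchanged.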
A second strand of research is focussed more on increasing the saturation magnetisation of magnets through substitution. In the COMPASS project researchers are developing a strontium W-type hexaferrite, a type of ferrimagnet with some magnetic moments that are oppositely aligned to the others, which is a technically challenging task. “This W-type hexaferrite only forms at temperatures of about 1300 degrees, and it’s only stable in a range of about 100 degrees. So there’s only a very small window and we are working at very high temperatures,” explains Professor Christensen. The development of in situ techniques to follow chemical reactions is very beneficial here, enabling researchers to stop a reaction when a phase has been formed, and Professor Christensen believes they will help speed up the material discovery process. “The in situ techniques really help us in identifying when we have reached a critical point,” he says. “We can extract a lot of important information, for example about the crystallite size and the texture.”
This has already been shown in M-type strontium hexaferrite, yet there is a limit to the saturation magnetisation that can be achieved, and the goal in the project is to develop the W-type. Evidence suggests that an atmosphere with very little oxygen is more conducive; however, Professor Christensen says it is not easy to create these conditions. “We are trying to reduce the amount of oxygen that is inside the furnace,” he says.
The structure of the material has a divalent atom, so some divalent iron is required as well. “The way to make divalent atoms is to heat the sample up so high that you have an equilibrium of oxygen in the sample and in the surroundings. We are working at very high temperatures here, and we also need to control the atmosphere,” continues Professor Christensen. “The importance of the in situ technique cannot really be overestimated here. We can do multiple experiments, with time, temperature and atmosphere, that will really help us in reducing the parameter space.”
The backdrop to this research is the challenge of developing a new way of producing high-performing magnets and reducing Europe’s dependence on external supplies of rare-earth elements. There are a wide range of potential applications for these magnets, from electric vehicles to wind turbines, while Professor Christensen has also explored some innovative applications. “Previously we collaborated with a company on a magnetically levitated flywheel – there is no friction, so it will just spin and spin. This technology is used in some modern vehicles; the idea is that when you brake, you transfer the energy to the flywheel, so you save the braking energy,” he outlines. The two projects are primarily focussed on developing the production method rather than specific applications, yet Professor Christensen is very much aware of their industrial potential. “For example in an EU project (BEETHOVEN) we are part of a collaboration with a UK company called GreenSpur, who are interested in using non-rare-earth magnets for a windmill generator, while other partners are also looking at alternatives,” he says.
METEOR Magnetic Enhancements through Nanoscale Orientation
Project Objectives
Improving magnets has been an ongoing quest for a century. In the project, new synthesis and sintering methods are combined with micromagnetic simulations to investigate hierarchically structured nanocomposite ceramic hexaferrite magnets. The resulting magnets can potentially speed up the green transition within the energy sector and e-mobility.
Project Funding
Independent Research Fund Denmark, projects: METEOR (1032-00251B) and COMPASS (1127-00235B).
Project Partners
The METEOR and COMPASS projects are carried out in collaboration with the Technical University of Denmark (DTU), the Paul Scherrer Institute (PSI), Switzerland, the Faculty of Technical Sciences at Aarhus University, the Danish Technological Institute (DTI), DanMAX at MAX IV, Sweden, and the Rutherford Appleton Laboratory, UK.
Contact Details
Project Coordinator,
Professor Mogens Christensen, Ph.D. Centre for Integrated Materials Research (iMAT) Department of Chemistry & iNANO Aarhus University
Gustav Wieds Vej 14
DK-8000 Aarhus C
DENMARK
T: +45 6177 7451
E: mch@chem.au.dk
W: https://pure.au.dk/portal/en/persons/mch%40chem.au.dk
Professor Mogens Christensen was educated in Materials Physics and Chemistry at Aarhus University, graduating in 2003. He was a postdoc at ANSTO, Sydney, Australia for almost two years before returning to Aarhus University, where he was promoted to assistant professor in 2010, associate professor in 2014 and full professor in 2021.
[Diagram: a microwave signal travels along a transmission line; a capacitor creates an electric field and an inductor creates a magnetic field, which interacts with the magnetic vortex. At the centre of the vortex, the core spin precesses (the central arrow moves with a circular motion).]
The team behind the QFaST project are investigating spin excitations in magnets, using sophisticated sensors to measure their properties, while they are also looking into whether they could be used to perform certain interesting tasks. These could include the detection of dark matter and certain quantum computing applications, as Dr Pepa Martínez-Pérez explains.
The topic of spin excitations in magnets has attracted a lot of attention in research over recent years, borne out of both fundamental interest and also its potential commercial relevance, for example in quantum computing. Much of this research has focused on ferromagnetic objects, in which the magnetisation is uniform, but as Principal Investigator of the QFaST project Dr Pepa Martínez-Pérez is now looking at spin excitations in magnetic textures, which can be thought of as small, discrete magnetic modulations. “One example of a magnetic texture is a vortex in which the spins are naturally arranged, forming a central coil,” she explains. Magnetic textures are interesting for several reasons. “First of all they naturally stabilise, they are very easy to stabilise and to fabricate. They are also very robust, they cannot be destroyed by temperature or quantum fluctuations,” says Dr Martínez-Pérez. “These textures bring new spin excitations in the gigahertz regime into play. This regime is very interesting in terms of developing quantum technologies based on superconducting circuits.”
As part of her work in the QFaST project, Dr Martínez-Pérez is studying the fundamental behaviour of spin excitations in magnets, using superconducting quantum interference devices (SQUIDS), which are very sensitive to small variations in magnetisation. These SQUIDS are
typically used to measure very slow or quasi-static variations in magnetisation; the project team now aim to extend these capabilities to AC, and potentially even to the gigahertz frequency range. “SQUIDS are already capable of working at high frequencies, as the Josephson effect holds, but it can be difficult to read out the signal. We aim to extend the frequency bandwidth operation of these devices, so that they can be used not only to detect a magnetic vortex in a particle, but also to measure its dynamics and diversity,” explains Dr Martínez-Pérez.
A material called yttrium barium copper oxide (YBa2Cu3O7, or YBCO) is being used in the project to develop superconducting quantum nanocircuits which function as SQUIDS. “This is a very well-known, high-critical-temperature superconductor,” says Dr Martínez-Pérez.
This material is quite commonly used in developing superconducting circuitry, yet it can be difficult to work with, as it is fairly delicate in the sense that it is very sensitive to defects. However, these
defects also behave as Josephson junctions, which are essential to the functioning of SQUIDS. “We can use these defects to build Josephson junctions, allowing us to fabricate SQUIDS and other devices,” continues Dr Martínez-Pérez. The project’s agenda also includes another line of research, in which Dr Martínez-Pérez and her colleagues are exploring the use of conventional microwaves or high-frequency circuits to address spin excitations in a different way. “This field of research is usually called cavity magnonics,” she says.
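Cavity magnonics is commonly modelled with a standard beam-splitter Hamiltonian coupling a cavity photon mode to a magnon mode. The textbook form below is added here purely for illustration and is not taken from the QFaST project itself:

```latex
% Two coupled harmonic modes: cavity photon (a) and magnon (m)
\hat{H}/\hbar \;=\; \omega_c\,\hat{a}^{\dagger}\hat{a}
  \;+\; \omega_m\,\hat{m}^{\dagger}\hat{m}
  \;+\; g\left(\hat{a}^{\dagger}\hat{m} + \hat{a}\,\hat{m}^{\dagger}\right)
```

Here ω_c and ω_m are the cavity and magnon frequencies and g is the coupling rate; “strong coupling” means g exceeds both the cavity and magnon loss rates, so photons and magnons exchange energy coherently before it is lost.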
The project team are investigating the interaction of the spin excitation with photons trapped in cavities, in this case superconducting cavities. This holds fundamental interest for Dr Martínez-Pérez, as a strong coupling between magnetic vortex excitations and photons in a cavity has not yet been achieved, while she says this could also open up wider possibilities. “We want to learn more about these magnetic excitations, and to use them to perform interesting tasks. For instance, vortices in some senses focus the inhomogeneities of a magnetic field in very small regions. These small regions would then be very sensitive to other excitations of signals,” she says. One interesting possibility is using magnetic excitations as sensors for dark matter. “If you are able to measure the number of magnons you have, you would have a means to sense if dark matter axions are passing through,” continues Dr Martínez-Pérez. “Another possibility would be to couple these magnetic textures to other spins, to sense them.”

“We aim to extend the frequency bandwidth operation of SQUIDS, so that they can be used not only to detect a magnetic vortex in a particle, but also to measure its dynamics and diversity.”

A coupling like this would be extremely useful for the development of quantum computing technology, as spin qubits have been identified as promising candidates for building quantum memories and quantum computing protocols. These spin qubits are very coherent, yet they are also extremely difficult to sense, a topic of great interest to Dr Martínez-Pérez. “Magnons could interact strongly with spin qubits. This would be a very interesting result, as it would allow you on the one hand to sense and read out spin qubits, but also to mediate interactions between distant spin qubits,” she outlines. The project team is collaborating with another group investigating magnetic molecules as candidates for building spin qubits, with a view to their eventual application in quantum computing, which would lead to some significant benefits. “Quantum computers would not only be much faster than existing technology, but they would also allow us to perform simulations or operations that are not currently possible with classical computers,” explains Dr Martínez-Pérez.

The project team aim both to investigate these more applied possibilities arising from their research and to explore many of the fundamental aspects of magnetic textures, which Dr Martínez-Pérez believes are worthy of attention purely for their own sake. These objects encapsulate a lot of interesting physics, with respect to their boundary conditions, geometry and topology for example, while their excitations are also highly visual and can be understood intuitively. “These magnetic textures are really beautiful, and joining these objects with quantum photons in cavities is also extremely interesting from a fundamental point of view. We are building a system in which these magnetic excitations are absorbing photons and emitting them, in a quantum coherent state,” says Dr Martínez-Pérez. “I have worked for many years in superconducting circuits, but my real passion has always been magnetism, and I’m very excited about putting these two systems together.”
This research is ongoing, and Dr Martínez-Pérez says that some interesting results have been obtained over the course of the project. Prototype SQUIDS have been developed, which Dr Martínez-Pérez believes could be used in physics research soon, while progress has also been made in terms of the materials for building superconducting circuits and stabilising magnetic textures. “We are now able to observe quantum polaritons in conventional homogeneous magnetisation states. The next step will be to move to textures, which I hope to do in the next year or so,” she says.
Quantum Fast Spin dynamics addressed by High-Tc superconducting circuits
Project Objectives
The QFaST project aims to address the quantum properties of quantized spin waves (or magnons) in magnetic textures. Researchers aim to both develop useful tools for the study of nanoscopic spin excitations based on superconducting circuits, and also to access interesting quantum properties of vortex excitations, such as the characterization of zero-point magnetization fluctuations in vortices.
Project Funding
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under Grant agreement ID: 948986.
Project Partners
• Dieter Koelle and Reinhold Kleiner at University of Tübingen (Germany)
• Eugenio Coronado at Instituto de Ciencia Molecular (Valencia)
• David Zueco and Irene Lucas at Instituto de Nanociencia de Materiales de Aragón (Spain)
Contact Details
Project Coordinator, Dr. María José (Pepa) Martínez Pérez, PhD
Tenured Researcher
INMA - Instituto de Nanociencia y Materiales de Aragón
CSIC - Universidad de Zaragoza. Facultad de Ciencias.
C/ Pedro Cerbuna, 12 - 50009 Zaragoza - España
T: +34 976 76 1216
E: pemar@unizar.es
W: https://www.qmad.es/quantummagnonics-with-magnetic-textures/
Dr. María-José (Pepa) Martínez Pérez, PhD
Dr. María-José (Pepa) Martínez Pérez is a tenured CSIC Scientist and is deeply involved in the emerging field of quantum magnonics. She is on the board of directors of the Spanish Division on Condensed Matter Physics and the editorial board of the Spanish Journal of Physics.
The production of terrestrial gamma-ray flashes within thunderclouds is the most energetic natural process on Earth, yet much still remains to be learned about the underlying mechanisms of the phenomenon. We spoke to Dr Christoph Köhn about his research into the signatures of TGFs, how they are produced, and their wider effects.
Researchers reported in 1994 that bursts of gamma rays can be emitted from thunderclouds with very high energies, right up to 40 megaelectronvolts (MeV) or even beyond. These bursts, known as terrestrial gamma-ray flashes (TGFs), are produced by electrons with similar energies in a thundercloud, then emitted upwards. “There is a process that converts electrons to gamma rays,” says Dr Christoph Köhn, a Senior Researcher at the Technical University of Denmark. It seems surprising that electrons can reach such energies, as they would be expected to lose energy through collisions with air molecules, a topic of great interest to Dr Köhn. “Normally you would expect that the electron energies in air would be limited to 100 or 200 electronvolts (eV),” he explains. “How do we bridge this energy gap, from the lower levels you would typically expect in a thundercloud, to the MeV level, which is a factor of 10⁵–10⁶ higher?”
Relativistic electrons and terrestrial gamma-ray flashes from lightning
This project is funded by the Independent Research Fund Denmark (grant 1054-00104)
Project Team:
• Pierre Gourbin, Postdoc
• Elloïse Fangel-Lloyd, PhD student
• Saša Dujko, Professor
• Sven Karlsson, Associate Professor
• Mathias Gammelmark, PhD student
• Kenichi Nishikawa, Adjunct Professor
Principal Investigator: Christoph Köhn, Technical University of Denmark, DTU Space, 2800 Kgs. Lyngby, Denmark
E: koehn@space.dtu.dk
T: +45 45259794
W: www.christophkoehn.de
Christoph Köhn studied physics at the Christian-Albrechts-Universität zu Kiel and at the University of Hamburg from 2005 to 2010. After his PhD at the Centrum Wiskunde en Informatica in Amsterdam (2010-2014) and a research visit at the Vrije Universiteit Brussel, he joined the Technical University of Denmark in 2015 as a Marie Curie Postdoc.
This is a topic Dr Köhn is investigating in a research project, in which he is looking at the mechanisms by which electrons gain this level of energy. The project team is developing simulations, starting from the configuration of a thundercloud. Measurements from balloons and other atmospheric data can help researchers build a picture of the electric field in a thundercloud, which is needed to model the motion of particles within it. “We start with a few low-energy electrons in different field configurations, trace them in air, then look at the energy distribution. Would we get those 40 MeV electrons, or even 100 MeV electrons, which then produce the gamma radiation?” he outlines. “We use a technique called resampling in our simulations. When two electrons are close in space and energy, then we resample them to a so-called superelectron.”

These superelectrons are assigned an adjusted probability of collisions, which helps reduce the number of computational particles down to a level within the range of simulations. This allows researchers to gain deeper insights into the origins of TGFs. “We see that we get a front of high-energy, relativistic electrons, that move almost at the speed of light, which produce photons. A swarm of energetic photons is produced through these relativistic electrons,” outlines Dr Köhn.

“We start with a few low-energy electrons, trace them in air, then look at the energy distribution. Would we get those 40 MeV electrons, or even 100 MeV electrons?”

A well-functioning model of TGFs will help researchers learn more about thunderclouds and lightning, which are by nature hard to measure, while Dr Köhn also plans to investigate the wider effects of these TGFs, particularly on greenhouse gas emissions. “Most of the research over the last 30 years has really focused on the production of those TGFs, but comparatively little attention has been paid to their effects,” he continues.

There is no doubt that the majority of greenhouse gas emissions are caused by human activity, yet TGFs and relativistic electrons may also contribute. A greater degree of precision about their role in this respect could help improve the accuracy of climate models, says Dr Köhn. “These TGFs might activate the chemistry in a thundercloud and split certain particles, for example. We want to contribute to research in this area,” he says. A further topic of interest is the effect of TGFs on planes passing above thunderclouds, another area in which there has been relatively little research.
“There have been very few papers discussing the effect of TGFs, relativistic electrons and other particles on avionics systems,” continues Dr Köhn.
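The resampling idea behind superelectrons can be illustrated with a toy particle merge: electrons falling in the same (position, energy) bin are replaced by one superelectron carrying their summed statistical weight, so the swarm's totals are conserved. This is a minimal 1-D sketch under assumed parameters (bin widths `dx` and `dE`, uniform initial weights), not the project's actual code:

```python
import numpy as np

def resample_superelectrons(pos, energy, weight, dx, dE):
    """Merge simulation electrons that are close in space (within dx)
    and energy (within dE) into 'superelectrons'.

    Each superelectron carries the summed statistical weight of the
    electrons it replaces, and their weight-averaged position and
    energy, so the totals represented by the swarm are conserved.
    """
    # Bin particles on a (position, energy) grid; one superelectron per bin
    keys = np.stack([np.floor(pos / dx), np.floor(energy / dE)], axis=1)
    merged_pos, merged_E, merged_w = [], [], []
    for key in np.unique(keys, axis=0):
        mask = np.all(keys == key, axis=1)
        w = weight[mask]
        merged_w.append(w.sum())
        merged_pos.append(np.average(pos[mask], weights=w))
        merged_E.append(np.average(energy[mask], weights=w))
    return np.array(merged_pos), np.array(merged_E), np.array(merged_w)

# Toy 1-D swarm: uniform positions, exponential energy spectrum (illustrative)
rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 1.0, 10_000)      # metres
energy = rng.exponential(100.0, 10_000)  # eV
weight = np.ones(10_000)

p2, E2, w2 = resample_superelectrons(pos, energy, weight, dx=0.05, dE=50.0)
print(len(p2), "superelectrons represent", int(w2.sum()), "electrons")
```

In a production Monte Carlo code the merged weights would then rescale the collision probabilities, which is the step that keeps the particle count within the reach of simulation.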