EU Research SPRING 2016



Measuring Success: In-depth review of cutting-edge research in metrology
Medical devices: getting to the heart of the matter
Vision on health research
Black hole proves Einstein’s theory

Disseminating the latest research under FP7 and Horizon 2020
Follow EU Research on www.twitter.com/EU_RESEARCH



Editor’s Note

Measurements and methods to quantify our environment underpin all branches of scientific research.

The more we discover about every layer of the physical world – at the nanoscale, in radiation, in electromagnetism, or through concepts like currency and ways to scale global changes – the more we reveal the rules and laws of nature, and the more we create a hand-hold to manage those laws.

Measurement is the translation of the physical world, the code that weaves through the Universe and the language of research. We coordinate, design and map out our world by devising systems of weights and measures. This is the extraordinarily diverse and precise discipline of Metrology – an area that is still very much alive with open questions and a subject we cover in this issue of EU Research.

As a seasoned editor and journalist, Richard Forsyth has been reporting on numerous aspects of European scientific research for over 10 years. He has written for many titles including ERCIM’s publication, CSP Today, Sustainable Development magazine and eStrategies magazine, and remains a regular contributor to the UK business press. He also works in Public Relations for businesses to help them communicate their services effectively to industry and consumers.

So how can we truly appreciate the relevance of Metrology? I’ll borrow a great example posed by Dr Brian Bradshaw, Managing Director at the National Physical Laboratory (NPL), in his foreword for the publication Metrology for the 2020s. Bradshaw highlights the invention of the atomic clock as an example of how Metrology can transform our lives dramatically. Atomic clocks are now the most accurate time and frequency standards, and they serve as the references by which time distribution services are set. When the first accurate atomic clock was built in 1955 by Louis Essen and Jack Parry at the NPL in the UK, it was not envisaged that its impact on humanity would be so all-encompassing. Its timekeeping made so much possible in our lives: television, the internet, mobile phones and GPS all rely on the atomic clock. Considering this is just one example in the vast field of Metrology, we can understand how important it is to support this branch of scientific research. Hope you enjoy the issue.

Richard Forsyth Editor




Contents

4 Research News

EU Research’s Richard Davey takes a look at current events in the scientific news

10 BP-CarDiO 30 million people in the EU are living with diabetes, and the number of sufferers is predicted to grow by 10 percent by 2030. Dr Stephen Wheatcroft, Project Leader of BP-CarDiO, and his team are researching a protein called IGFBP1 which has potential for the development of novel treatments for diabetes

12 3D-OA-HISTO Osteoarthritis affects millions of people around the world, yet the root causes of the disease are not fully understood. The 3D OA HISTO project is developing a new 3D histopathological grading technique that could lead to new insights into the disease, as project leader Simo Jaakko Saarakkala explains

16 ENIGMO Both obesity and type-2 diabetes are characterised by metabolic inflammation. The ENIGMO project aims to shed new light on the topic by investigating how bacteria in the gut microbiota interact with the innate immune system and the endocannabinoid system, as Professor Patrice D. Cani explains

18 DEMOVE DEMOVE is a research project which aims at decoding and exploiting the neural information responsible for human movements by processing the electrical activity of muscles. The project, headed by Professor Dario Farina, has huge implications for those with impaired mobility.


21 Cholinomirs

Stress reactions are an essential human survival mechanism, but they can also lead to changes in gene expression and long-term damage to the body. We spoke to Professor Hermona Soreq about her research into the functioning of a family of genes called microRNAs, and their role in regulating stress and anxiety

24 SPEED Older adults tend to take longer to make decisions than younger people, as they typically seek to gather more information before reaching a conclusion. Researchers in the SPEED project are using sequential sampling models to investigate speeded decision-making in the human brain, as Professor Birte Forstmann explains

26 OptogenRet There is currently no cure for retinitis pigmentosa, a degenerative eye condition which affects photoreceptors in the retina, leading to complete blindness at the later stages of the disease. Inserting microbial opsins into the diseased retina holds real promise as a means of treating the condition and other retinal degenerative diseases, says Dr Jens Duebel, Principal Investigator

29 CREST Evidence suggests that a healthy diet can help prevent the development of eye conditions, yet many of us don’t consume enough of the nutrients needed to maintain eye health. Professor John Nolan tells us about the CREST project’s work in investigating whether enhancing nutrition in the eye has a positive impact on both healthy subjects and people with age-related macular degeneration (CREST article design by Petra Curtis)

33 NOT

Around 30,000 people are thought to have disappeared in Argentina between 1976 and 1983, as the ruling military dictatorship dealt brutally with perceived subversives and political opponents. This kind of terror and political intimidation is deeply intertwined with fantastical narratives, as Professor Kirsten Mahlke explains

36 Bureau International des Poids et Mesures (BIPM) Metrology is the area of study that makes design, manufacture and, importantly, discovery itself possible. Welcome to the science underpinning science itself! Richard Forsyth questions Dr Richard Davis on the work at BIPM – the place where measurements are born

40 FEEL The FEEL project is developing a new approach to the ‘hard’ problem of consciousness, pursuing theoretical and empirical research based on sensorimotor theory. We spoke to the project’s Principal Investigator J. Kevin O’Regan about their work in developing a fully-fledged theory of ‘feel’, and about the wider impact of their research

42 CENTAUR Despite the extensive computerization of security systems, it is still up to a human operator to monitor the protected area by viewing tens to hundreds of interconnected video cameras. The EU project CENTAUR aims to develop next generation tools to assist security operators in dealing with the particularly difficult task of monitoring crowded environments



44 P-SOCRATES Embedded computing systems increasingly require a high level of processing power, while executing their functions within a guaranteed timeframe. Professor Luis Miguel Pinho tells us about the P-SOCRATES project’s work in developing a new software design framework that meets the needs of modern systems

46 VITCEA While Fibre Reinforced Plastic (FRP) composite materials are an attractive option in several sectors of industry, certain defects can affect their strength and stiffness. The VITCEA project is developing non-destructive evaluation techniques which will encourage the wider use of FRP composites, as Michael Gower explains

48 Quantum Manifesto The development of quantum technologies could provide a significant boost to the European economy, but close collaboration between research and commerce is central to building the technologies of tomorrow. With global competition intensifying, scientists are calling for further support for European researchers in the ‘quantum manifesto’

53 SIQUTE The development of single-photon sources is central to a number of technical fields, including quantum computing, quantum cryptography and radiometry. Professor Stefan Kück tells us about developing single-photon sources, research which could have a significant impact in both the academic and commercial sectors

56 Molecsyncon The Next-Generation Organic Photovoltaics focus group brings together researchers from several disciplines to develop new, more efficient OPV devices. This work is closely related to the Molecsyncon project’s fundamental research into controlling the properties of individual molecules, as Principal Investigator Professor Ryan Chiechi explains


58 Metrosion Particulate erosion is a major concern to industry, causing significant disruption to business operations, yet Europe has only two laboratories with the facilities to meet the testing requirements needed to assist in the development of an improved testing standard. Tony Fry tells us about the Metrosion project’s work

62 ComPAg The history of plant domestication dates back millennia, during which time the genetic make-up of crops has changed and key traits have emerged. Dr Chris Stevens tells us about the ComPAg project’s work in tracking the evolution of domestication traits in over 30 crops, building a more complete picture of how agriculture developed across the world

65 Early Rice Rice has long been a staple crop across large parts of Asia, and cultivation methods have evolved as the population grew and dispersed over time. Professor Dorian Fuller tells us about his work in reconstructing the evolution of rice cultivation methods, and its wider importance in terms of our understanding of the global climate

66 MPM-DREDGE Land reclamation projects offer a means of expanding living space, which is essential to meeting the needs of the growing global population, but effective defensive structures are required first. Alexander Rohe tells us about the MPM-DREDGE project’s work in developing a numerical tool to model soil-water interactions

68 Crowdland Information about land use is relevant to a wide range of applications, yet there are gaps in the existing data. We spoke to Dr Steffen Fritz about the Crowdland project’s work in harnessing the power of crowdsourcing to provide more detailed information

EDITORIAL
Managing Editor Richard Forsyth info@euresearcher.com
Deputy Editor Patrick Truss patrick@euresearcher.com
Deputy Editor Richard Davey rich@euresearcher.com
Science Writer Holly Cave www.hollycave.co.uk
Acquisitions Editor Elizabeth Sparks info@euresearcher.com

PRODUCTION
Production Manager Jenny O’Neill jenny@euresearcher.com
Production Assistant Tim Smith info@euresearcher.com
Art Director Daniel Hall design@euresearcher.com
Design Manager David Patten design@euresearcher.com
Illustrator Martin Carr mary@twocatsintheyard.co.uk

PUBLISHING
Managing Director Edward Taberner etaberner@euresearcher.com
Scientific Director Dr Peter Taberner info@euresearcher.com
Office Manager Janis Beazley info@euresearcher.com
Finance Manager Adrian Hawthorne info@euresearcher.com
Account Manager Jane Tareen jane@euresearcher.com

EU Research
Blazon Publishing and Media Ltd
131 Lydney Road, Bristol, BS10 5JR, United Kingdom
T: +44 (0)207 193 9820
F: +44 (0)117 9244 022
E: info@euresearcher.com
www.euresearcher.com
© Blazon Publishing June 2010

Cert no. TT-COC-2200



RESEARCH NEWS

EU Research’s Richard Davey takes a look at current events in the scientific news

Einstein’s theory of relativity confirmed with Black Hole discovery
Scientists are claiming a stunning discovery in their quest to fully understand gravity
A team of scientists announced recently that they had heard and recorded the sound of two black holes colliding a billion light-years away, a fleeting chirp that fulfilled the last prediction of Einstein’s general theory of relativity. That faint rising tone, physicists say, is the first direct evidence of gravitational waves, the ripples in the fabric of space-time that Einstein predicted a century ago. It completes his vision of a universe in which space and time are interwoven and dynamic, able to stretch, shrink and jiggle. And it is a ringing confirmation of the nature of black holes, the bottomless gravitational pits from which not even light can escape, which were the most foreboding (and unwelcome) part of his theory.

More generally, it means that a century of innovation, testing, questioning and plain hard work after Einstein imagined it on paper, scientists have finally tapped into the deepest register of physical reality, where the weirdest and wildest implications of Einstein’s universe become manifest.

Conveyed by these gravitational waves, power 50 times greater than the output of all the stars in the universe combined vibrated a pair of L-shaped antennas in Washington State and Louisiana known as LIGO on Sept. 14. If replicated by future experiments, that simple chirp, which rose to the note of middle C before abruptly stopping, seems destined to take its place among the great sound bites of science, ranking with Alexander Graham Bell’s “Mr. Watson – come here” and Sputnik’s first beeps from orbit.

“We are all over the moon and back,” said Gabriela González of Louisiana State University, a spokeswoman for the LIGO Scientific Collaboration – LIGO being short for Laser Interferometer Gravitational-Wave Observatory. “Einstein would be very happy, I think.” Members of the LIGO group, a worldwide team of scientists, along with scientists from a European team known as the Virgo Collaboration, published a report in Physical Review Letters on Thursday with more than 1,000 authors.

“I think this will be one of the major breakthroughs in physics for a long time,” said Szabolcs Marka, a Columbia University professor who is one of the LIGO scientists. “Everything else in astronomy is like the eye,” he said, referring to the panoply of telescopes that have given stargazers access to more and more of the electromagnetic spectrum and the ability to peer deeper and deeper into space and time. “Finally, astronomy grew ears. We never had ears before.”

Astronomers now know that pairs of black holes do exist in the universe, and they are rushing to explain how they got so big. According to Vicky Kalogera of Northwestern University, there are two contenders right now: earlier in the universe, stars lacking elements heavier than helium could have grown to galumphing sizes and then collapsed straight into black holes without the fireworks of a supernova explosion, the method by which other stars say goodbye. Or it could be that in the dense gatherings of stars known as globular clusters, black holes sink to the centre and merge.

Michael S. Turner, a cosmologist at the University of Chicago, noted that astronomers had once referred to the search for gravitational waves as an experiment, not an observatory. “LIGO has earned its ‘O’,” he said. “That is, it will be an observatory, getting tens of events per year.”

©ESA–AOES Medialab




Scientists fear loss of funding if UK leaves EU
Sir Paul Nurse warns Brexit would be bad for UK science, though another group backs leaving
Sir Paul Nurse, who was awarded the 2001 Nobel Prize in Physiology or Medicine, warns that Brexit (a British exit from the European Union) would be bad for the country, and attacks the short-term political opportunism that some politicians are currently pursuing. Another group of scientists (details below) disagrees and insists UK science and British scientists would be better off leaving the economic bloc.

If the United Kingdom votes to leave the European Union (EU) in an in-or-out referendum on 23rd June 2016, the country’s research will suffer, says Sir Paul, who is the current director of The Francis Crick Institute. A former President of the Royal Society, Sir Paul says that after a Brexit, British scientists would find it much more difficult to get research and development funding – the move would sell ‘future generations short’, he added. The Nobel laureate was speaking at a news briefing about the consequences of a withdrawal from the economic bloc for British scientists. He was flanked by other eminent scientists who also believe that their country’s best future lies within the EU.

Prof. Nurse said: “Being in the EU gives us access to ideas, people and to investment in science. That, combined with mobility (of EU scientists), gives us increased collaboration, increased transfer of people, ideas and science – all of which history has shown us drives science.”

A group of scientists that supports Brexit – Scientists for Britain – says that it is a myth to believe that the country’s scientific community is somehow less robust than it appears, and that leaving the European Union would have an unduly adverse impact on UK science.

Sir Paul stated that Michael Gove, Lord Chancellor and Secretary of State for Justice, a pro-Brexit campaigner, suffers from ‘intellectual laziness’. According to the Lord Chancellor, the mountains of bureaucracy involved in the EU clinical trials directive had undermined ‘the creation of new drugs to cure terrible diseases’. In a separate interview Sir Paul said: “He [Michael Gove] is wrong about this. We have plenty of bureaucracy in the UK about this, I fight it myself. If we are to really make science work, we have to be part of the European Union, we have to influence the agenda, we have to make the regulations work.” He later added: “We are too small to be effective. We are an island. We cannot afford to be an island in science. If we are part of the European Union we are part of a powerhouse that can produce the data. We have to work with them and we will have no impact if we are outside. It will make it worse. It is naive, this argument.”

Health sector warned of cyber attacks
The rise of digital health technology makes cyber security potentially a matter of life and death
Some of the most exciting innovations in medicine are coming from the use of digital technology to improve the monitoring and management of people’s health. But what are the risks of this new era of “connected healthcare” being exploited by people intent on stealing sensitive data, or worse, causing harm to patients?

The FDA says the threat is real. In January it issued recommendations for how manufacturers should safeguard medical devices against cyber breaches, urging them to make security a priority at every stage, from the design process of a device onwards. “All medical devices that use software and are connected to hospital and healthcare organisations’ networks have vulnerabilities,” says Suzanne Schwartz of the FDA’s Center for Devices and Radiological Health. “Some we can . . . protect against, while others require vigilant monitoring.” These concerns have been building for some time. Dick Cheney, the former US vice-president, revealed in 2013 that doctors had disabled the wireless capabilities of his pacemaker as a precaution against hacking.

Critics say that, in the rush to digitise patient records and embrace technology, healthcare systems have not paid enough attention to security. In the UK, the Information Commissioner’s Office, the privacy watchdog, says data breaches in the NHS are “a major cause for concern”. “The Health Service holds some of the most sensitive personal information available, but instead of leading the way in how it looks after that information, the NHS is one of the worst performers,” said Christopher Graham, the Information Commissioner, last year.

Such concerns are sure to increase after Jeremy Hunt, UK health secretary, announced plans in September for NHS patients to have access to their medical records online within a year. Commissioner for Health Vytenis Andriukaitis said on the matter: “The integration of technology into healthcare has the potential to revolutionise patient care. However, the regulatory challenges associated with medical devices and data protection, combined with scepticism among the public about the use and safety of their personal data, leave significant hurdles to overcome.”




Cheaper healthy food could dramatically improve health
Reducing prices of fruit and vegetables while raising prices for sodas and other sugary drinks could save millions of lives
A ten percent drop in the price of healthy foods and a ten percent increase in the price of unhealthy foods could potentially prevent a significant number of people from dying from heart disease and stroke, according to research presented at the American Heart Association’s Epidemiology/Lifestyle 2016 Scientific Sessions. In a collaborative project, researchers at the Friedman School of Nutrition Science and Policy at Tufts University in Boston and Harvard Medical School used computer modelling to predict how price changes might impact eating habits over time and whether this could reduce cardiovascular diseases. They estimated:

Reducing the price of fruits and vegetables: Within five years of a ten percent price reduction on fruits and vegetables, deaths from cardiovascular diseases overall could decrease by 1.2 percent, and within 20 years by almost 2 percent. Specifically, heart attacks could decrease by 2.6 percent and strokes by 4 percent over the 20 years.

Reducing the price of grains: Within five years of a ten percent price reduction on grains, deaths from cardiovascular diseases overall could decrease by 0.2 percent, and within 20 years by 0.3 percent. Specifically, heart attacks could decrease by 0.83 and 0.77 percent respectively.

Increasing the price of sugary drinks: Within five years of a ten percent price increase on sugary drinks, deaths from cardiovascular diseases overall could decrease by nearly 0.1 percent, and within 20 years by 0.12 percent. Specifically, heart attacks could decrease by 0.25 percent in both timeframes and strokes could decrease by 0.17 percent in 20 years. Diabetes could decrease by 0.2 percent in five years and 0.7 percent in 20 years.

Combined, the model shows that by 2035 it would be possible to prevent 515,000 deaths from cardiovascular disease and nearly 675,000 events, such as heart attacks and strokes, across the nation with these small changes in price. If a change of one serving occurred daily – for example, one more piece of fruit (100 g), one full serving of a vegetable (100 g), one serving of whole grains (50 g), and one less 8 oz sugar-sweetened beverage – then up to 3.5 million deaths and 4 million cardiovascular events could be averted over a 20-year period. Such price changes are not unrealistic: a pilot of the SNAP program – also known as food stamps – in Massachusetts achieved a 30 percent change in prices. A recent study showed that Mexico’s imposition of a small tax on sugary drinks also decreased sales. Changing attitudes have helped sales of soft drinks fall more than 25 percent over the last two decades.
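For readers curious about the mechanics behind such figures: the Tufts/Harvard team used a detailed comparative risk assessment model, but the core logic – a price elasticity converts a price change into a consumption change, and a relative risk converts that shift into deaths averted – can be sketched in a few lines. The sketch below is purely illustrative; the elasticity, relative-risk and baseline values are hypothetical placeholders, not the study’s inputs.

```python
# Illustrative sketch only: how a price-elasticity model links a price
# change to consumption, and consumption to deaths averted. All numbers
# are hypothetical placeholders, not the Tufts/Harvard model's inputs.

def consumption_change(price_change_pct, elasticity=-0.5):
    """Percent change in consumption for a given percent price change."""
    return elasticity * price_change_pct

def deaths_averted(baseline_deaths, consumption_shift_pct, rr_per_10pct=0.98):
    """Deaths averted, applying a relative risk per 10% consumption shift."""
    relative_risk = rr_per_10pct ** (consumption_shift_pct / 10.0)
    return baseline_deaths * (1.0 - relative_risk)

# A 10% price cut on fruit and vegetables with an assumed elasticity of -0.5
shift = consumption_change(-10.0)              # +5% consumption
print(round(deaths_averted(800_000, shift)))   # ~8,000 of a hypothetical baseline
```

A real model of this kind would run such calculations per age group, per food and per disease, and project them forward year by year rather than in a single step.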

New 2D material discovered is a game changer
Physicist discovers new two-dimensional material that could be more important than graphene
The new material is made up of a mixture of silicon, nitrogen and boron, coming together to form a one-atom-thick, hexagonal structure, very similar to that of graphene. All of those elements are widely available, inexpensive and lightweight, and the finished material is extremely stable – theoretically at least. The researchers used computer simulations to try and get the bonds between the different base materials to disintegrate, but found they held strong, even at temperatures of 1,000 °C (1,832 °F).

While the structure of the new material is hexagonal – just like graphene – the different sizes of the elements used result in a less uniform structure, with uneven sides. However, while it might not be quite as uniform, it does have some significant benefits over graphene. Most notably, it can easily be turned into a semiconductor by attaching other elements on top of the silicon atoms.

At this point, the novel material only exists in a theoretical sense, with the researchers using computers at the University of Kentucky’s Center for Computational Science to perform the complex calculations. The team is now working with researchers at the University of Louisville to create the material under laboratory conditions. “We are very anxious for this to be made in the lab,” said team member Madhu Menon. “The ultimate test of any theory is experimental verification, so the sooner the better.”


Correction: “The article “Supporting Research with Śāstravid”, published in EU Researcher Winter 2015/2016, pp 62-64, might give the impression that the resource currently available at www.sastravid.net is identical with the version originally designed and built by Bridgeton Research under the direction of Dr David Gold. In fact the current version includes most of the data of this previous version, but not all of its functionality. Dr Gold is currently not associated with either the Śāstravid project or Oxford University in any way.”



Environmental Health: Innovative Device Traces Chemicals Affecting Humans
New device able to detect environmental contaminants more accurately and cost-effectively

Human and aquatic lifeforms face increasing threats from chemical contamination. The United States Environmental Protection Agency (USEPA) estimates that 10 percent of the sediments located in domestic lakes, rivers and harbours are contaminated with potentially harmful chemicals. It is therefore essential to carry out environmental sampling to properly evaluate the degree of hazard and design remediation strategies, where needed. In a new study, a multi-disciplinary, multi-institutional team of researchers headed by Rolf Halden, director of the Centre for Environmental Security at Arizona State University’s Biodesign Institute, tracks the course of a family of widely used pesticides known as fiproles. These halogenated chemicals have been identified as an emerging contaminant, recently linked to the worldwide die-off of pollinating insects, particularly honeybees. To properly assess the levels of fiprole contamination in the environment, Halden’s team invented a new device, constructed at the Biodesign Institute. Known as the IS2B, the tool is a kind of mobile laboratory or pod for performing precision analysis on sampled water and sediment. The technology offers improved accuracy of measurement compared with existing methods as well as greater versatility and cost-effectiveness.

“Health risks from pollution are dependent not necessarily on the absolute quantity of toxins present but rather on what fraction of the total pollutant mass is accessible for uptake by living organisms. The patent-pending IS2B device is designed to tell apart and quantify these two important quantities,” says Halden. Soils, sediments and water resources can bind chemicals to varying degrees, making them available to microbes, plants, wildlife and humans. Yet, few of the chemicals in daily use have been properly assessed for safety. Researchers like Halden and study co-author Denslow hope to measure the capacity of potentially hazardous chemicals to be absorbed by living organisms, a quantity known as their bioavailability. Another important benefit of the device is its ability to collect samples continuously for extended periods of time, from days to several weeks. This makes possible the detection of short-term fluctuations in chemical loading, such as illegal dumping of process streams into surface waters. As such, information gathered with the IS2B is vital to environmental compliance reporting, modeling and risk assessment for biota and humans.

Stephen Hawking and Russian billionaire plan to search universe
Russian billionaire Yuri Milner begins $100 million search of the universe using “nano-crafts”
To those hoping that Yuri Milner, the Russian billionaire who’s backed various Internet and space ventures, had called a press conference in New York to announce that his search for extraterrestrials had achieved contact: prepare to be disappointed. No aliens have been found – yet.

While Milner’s Breakthrough Listen project has achieved no such breakthrough in the first few months of its existence, this has not deterred the 54-year-old technology entrepreneur and investor from continuing to commit portions of his fortune in service of quixotic aerospace research. Milner, along with famed theoretical physicist Stephen Hawking, is to unveil another wildly ambitious research project on Tuesday at One World Trade Center. Breakthrough Starshot is funded by a $100 million grant from Milner with the goal of finding faraway planets capable of sustaining life. The grant comes in addition to the $100 million he donated last year to the Breakthrough Listen project, which will search the universe for intelligent beings over the next 10 years.

For the new endeavour, Milner’s team plans to develop tiny, unmanned spaceships to fly into the Alpha Centauri star system – the star system nearest our own, which some astronomers believe may contain planets capable of supporting life – on a research mission that will take at least 24 years to complete. “It’s doable in our lifetime,” Milner said in an interview. He has recruited a team that includes former NASA scientists and engineers to work on the “light-propelled nano-crafts,” as Breakthrough calls them. Facebook co-founder Mark Zuckerberg will join the project’s board of directors.



Commission plans cloud platform for research data
European Commission will build £5bn cloud platform for scientists
Delivering on its strategy to create a Digital Single Market, the European Commission today unveiled its plans to help European industry, SMEs, researchers and public authorities make the most of new technologies. The Commission presented a set of measures to support and link up national initiatives for the digitisation of industry and related services across all sectors, and to boost investment through strategic partnerships and networks. The Commission also proposes concrete measures to speed up the development of common standards in priority areas, such as 5G communication networks or cybersecurity, and to modernise public services. As part of today’s plans, the Commission will set up a European cloud that, as a first objective, will give Europe’s 1.7 million researchers and 70 million science and technology professionals a virtual environment to store, manage, analyse and re-use vast amounts of research data.

The European Cloud Initiative will make it easier for researchers and innovators to access and re-use data, and will reduce the cost of data storage and high-performance analysis. Making research data openly available can help boost Europe’s competitiveness by benefitting start-ups, SMEs and data-driven innovation, including in the fields of medicine and public health. It can even spur new industries, as demonstrated by the Human Genome Project. The public and private investment needed to implement the European Cloud Initiative is estimated at €6.7 billion. The Commission estimates that, overall, €2 billion in Horizon 2020 funding will be allocated to the initiative, with the remaining €4.7 billion of additional public and private investment required over a period of five years.

ERC announces two new members to governing body
The European Commission announced on Friday that it has appointed two accomplished scientists to the governing body of the European Research Council (ERC), the Scientific Council, for a four-year term of office
The two new members are Sir Christopher Clark, Regius Professor of History at the University of Cambridge, UK, and Barbara Romanowicz, Professor and Chair in Physics of the Earth’s Interior at the Collège de France, and also Professor of Geophysics at the University of California, Berkeley, in the United States. These new members have been selected by an independent identification committee, composed of six distinguished scientists appointed by the European Commission, said a press release.

The ERC Scientific Council, composed of 22 distinguished scientists and scholars representing the European scientific community, is the governing body of the European Research Council. Its main role is setting the ERC strategy and selecting the peer review evaluators.



Climate research targets still viable according to experts
Alan Finkel, Australia’s chief scientist, says he is optimistic about the future but concerned about women in science and school performance
Australia’s climate research obligations can still be met despite the threat of cuts at the CSIRO, chief scientist Alan Finkel has claimed in an impassioned inaugural appearance before the National Press Club in Canberra. Finkel’s speech in February highlighted many successes in Australian science, including regulations that allow Australia to have 1,000 clinical trials proceeding and aerial drone research to proceed easily. But he also pointed to failings, including regulations making life difficult for manufacturers of medical devices. And he decried the barriers women faced in pursuing careers in science. Finkel avoided mentioning climate science or the CSIRO cuts in his speech, but was immediately asked about them in the questions.

In response to one question, he said: “You asked, am I on the one hand confident that Australia can deliver on its climate research obligations, on the other hand do I feel that they’re under threat? Well of course it’s a bit of both, isn’t it?”

When it comes to women in science, Finkel was downbeat. “Women comprise more than half of the science PhD graduates and early career researchers, but by their mid-30s a serious gender gap begins to develop. We are improving, no doubt about it, but we have a long way to go,” he said.

Finally, he announced he would develop “a roadmap for our future national research infrastructure”. If implemented, the roadmap would “power Australian research in coming decades”.

Make sure you visit us on our website www.euresearcher.com. For more information regarding any of your dissemination needs, please contact us at info@euresearcher.com

Dirty friends make lab mice more useful
More evidence emerges of how laboratory rodents can skew research
Placing pet store mice in the same cages as laboratory mice could help improve mouse-based research into human diseases, a new study suggests. Laboratory mice are used in many areas of medical research, but their immune systems are more similar to the immature immune systems of newborn humans than to adult immune systems, according to researchers led by David Masopust at the University of Minnesota.

That’s because lab mice are kept in abnormally clean environments, the researchers said. When pet store mice were placed in the same cages as lab mice, the immune systems of the lab mice changed to more closely resemble adult human immune systems. Specifically, the lab mice housed with pet store mice had a more than 10,000 times greater immune response to a bacterial infection than typical lab mice, the study found.

The findings suggest that exposing lab mice to wild or pet store mice could improve the translation of mouse-based research to humans, the researchers said. The study was published online April 20 in the journal Nature.




Fluorescence image of human endothelial cells in which we investigate the cellular effects of IGFBP1.

Skeletal muscle cells used to investigate the effects of IGFBP1 on insulin action.

A New Hope For Treating Diabetes
30 million people in the EU are living with diabetes, and the number of sufferers is predicted to grow by 10 percent by 2030. Dr Stephen Wheatcroft, Project Leader of BP-CarDiO, and his team are researching a protein called IGFBP1 which has potential for the development of novel treatments for diabetes
For those afflicted with diabetes, type 1 or type 2, once it has manifested it is a lifelong condition in which levels of sugar in the blood become too high, which can lead to serious health problems. Type 1 diabetes, which can develop at any age but is usually seen in younger people, occurs when the pancreas doesn’t produce any insulin – a hormone that regulates blood glucose levels. The more common type 2 diabetes – which often develops later in life – occurs when the body either doesn’t produce enough insulin or the body’s cells do not react to insulin.

Diabetes can cause cardiovascular disease, which in turn can be fatal or cause serious disability. Effects for sufferers can include an increased risk of myocardial infarction, stroke and peripheral arterial disease. Patients with diabetes have an increased risk of cardiovascular disease that reduces their life expectancy by 5-15 years.

Current treatments and management of diabetes include medications, carefully managed diet and exercise regimes – all combined with continued testing of the individual’s blood glucose. The condition means that once diagnosed, the treatment plan will be set for the remainder of the patient’s lifetime. Whilst there is a range of oral drugs that help maintain good glucose control, it is also necessary in many cases to self-inject insulin to keep blood glucose levels in the normal range. However, recent landmark trials of intensive glucose lowering (ADVANCE, ACCORD and VADT) have concluded that intensive glucose control


does not improve cardiovascular outcomes or significantly lower cardiovascular risk. This is alarming, and makes research into developing new treatments for the prevention of diabetes and the cardiovascular complications that result from it all the more urgent.

A race against time Dr Stephen Wheatcroft, based at the University of Leeds in the UK, is heading up the research project BP-CarDiO. Dr Wheatcroft believes there is now an opportunity to discover new treatments. “It was not until the results of the cardiovascular endpoint study EMPA-REG OUTCOME were released very recently that we began to see significant cardiovascular risk reduction with a new diabetes drug. We now need to identify other avenues to develop more drugs that can do this,” stated Dr Wheatcroft. “One of the biggest challenges in the field is the race against time in preventing the huge burden of diabetes and cardiovascular disease we are likely to see in our lifetimes – fuelled by the global epidemic of obesity. Development of new therapeutics is a long process which needs to begin with discovery science. Identification and exploitation of physiological mechanisms implicated in metabolic and vascular regulation is one way to do this.” Type 2 diabetes is characterised by resistance to the effects of insulin, and Dr Wheatcroft’s research on rodents revealed such insulin resistance in endothelial cells, which form the linings of the blood vessels. This can lead to accelerated atherosclerosis, a condition

where the artery wall thickens. The research team understood that the key to a breakthrough would come from studying insulin-like growth factors (IGFs), which are proteins with sequence similarities to insulin. IGFs are part of a complex system (referred to as an axis) that cells utilise to communicate with their physiological environment. The research team’s goal is to exploit the IGF-IGFBP axis to prevent cardiovascular disease in the context of diabetes and obesity. In the first half of the project they made many significant discoveries. The endothelium lines the interior surface of blood vessels and lymphatic vessels and acts as an interface between circulating blood and the vessel wall. The research team found that resistance to insulin or IGF-1 in the endothelium modulates vasomotor function. Vasomotor refers to actions in a blood vessel that change its diameter. Significantly, this is associated with an altered capacity for the endothelium to repair itself after being damaged or injured. On-going studies will probe the mechanisms responsible for endothelium-metabolism crosstalk to understand this further. The project specifically studied the binding protein IGFBP1 and found that in cultured endothelial cells which were insulin resistant, the protein enhanced the functional properties of those cells in a regenerative way. The team identified critical nodes in the signalling pathways responsible for the observed effects.



IGFBP1 – a protein that is key to a solution?
IGFBP1 was first identified in amniotic fluid in the 1980s and shown to be important in placental function. Since that time, others have studied IGFBP1 in a range of fields including growth, nutrition and cancer. Recognition that circulating levels of IGFBP1 correlate with insulin sensitivity led to IGFBP1 being implicated in glucose regulation. “Epidemiological studies have identified low circulating levels of IGFBP1 as a predictor of subsequent development of diabetes and cardiovascular disease. What we find interesting is that IGFBP1 can act independently of binding to IGFs – this has been shown in several cell types but has not been studied in cells relevant to vascular physiology. Understanding how IGFBP1 affects the function of vascular cells is one of the key aims of this project,” said Dr Wheatcroft. It was discovered that IGFBP1 acts to increase insulin sensitivity and to increase generation of nitric oxide, which has a favourable effect on blood vessel function, in endothelial cells.

Development of new therapeutics is a long process which needs to begin with discovery science. Identification and exploitation of physiological mechanisms implicated in metabolic and vascular regulation is one way to do this

Progress towards preclinical trials
By characterising the mechanisms responsible at a molecular level, the team have set up the foundations to facilitate preclinical testing to discover the protective effects of IGFBP1 in diabetes. A vital consideration of the project, moving forward, is to identify new molecular targets responsible for IGFBP1 interactions with cells. Next, BP-CarDiO will investigate the potential for therapeutic modulation of the IGF-IGFBP system to prevent the metabolic and vascular consequences of diabetes, with the ultimate aim of creating a platform for the development of novel treatments for the growing EU population with diabetes. “So far we have identified encouraging properties of IGFBP1 on vascular cells, metabolic cells and pre-clinical models which lead us to believe these could be harnessed therapeutically. We now need to demonstrate that we can reproduce these findings by administering IGFBP1 in relevant models of the disease. In the next phase of the project we will drill down into the molecular mechanisms by which IGFBP1 exerts its protective effects. We need to understand exactly how IGFBP1 interacts with cells and what signalling pathways are implicated. Ultimately we would like to develop a drug-like peptide or small molecule which can replicate the effects of IGFBP1.” The development of an insulin-sensitising diabetes treatment which also protects against cardiovascular disease could have a huge impact in improving quality of life and life expectancy in individuals with diabetes. “We hope that BP-CarDiO will pave the way to us developing a new therapeutic option in diabetes which will act not only as an insulin sensitiser but will also prevent atherosclerosis and cardiovascular disease,” concluded Dr Wheatcroft.

The BP-CarDiO team (Left to right: Stephen Wheatcroft, Pooja Shah, Jessica Smith, Alex Francisco-Burns, Natalie Haywood, Paul Cordell).


At a glance
Full Project Title
Investigating the therapeutic potential of manipulating the IGF-IGFBP1 axis in the prevention and treatment of cardiovascular disease, diabetes and obesity (BP-CarDiO)
Project Objectives
To undertake mechanistic and translational research to identify how the IGF-IGFBP system regulates metabolism and vascular function.
Project Funding
Supported by a Starting Grant from the European Research Council.
Project Partners
Research in Dr Wheatcroft’s laboratory is also supported by funding from the British Heart Foundation.
Contact Details
Dr Stephen Wheatcroft
Division of Cardiovascular & Diabetes Research, Leeds Institute of Cardiovascular & Metabolic Medicine
University of Leeds, LS2 9JT, United Kingdom
T: +44 (0)113 343 7760
E: S.B.Wheatcroft@leeds.ac.uk

Dr Stephen Wheatcroft

Dr Stephen Wheatcroft is a Senior Lecturer in Cardiovascular Medicine at the University of Leeds. He qualified in medicine at the University of Birmingham and after general medical training in the West Midlands, undertook specialty training in Cardiology. He currently divides his time between research and clinical commitments in general cardiology.



A debilitating condition that causes joint pain and stiffness, osteoarthritis affects millions of people around the world, yet the root causes of the disease are not fully understood. The 3D OA HISTO project is developing a new three-dimensional histopathological grading technique that could lead to new insights into the disease, as project leader Simo Jaakko Saarakkala explains

A new approach to osteoarthritis A common musculo-skeletal disease, osteoarthritis is caused by the breakdown of joint cartilage and underlying bone, leading to pain and stiffness. While histopathological grading of 2-dimensional tissue sections is established as the current gold standard for determining the stage of the disease, researchers in the 3D-OA-HISTO project are now developing an alternative method. “The idea of this project is to use some actual tissue blocks, instead of 2-dimensional slices. We will then image them in three dimensions, and develop algorithms that would allow 3-dimensional histological analysis,” explains Simo Jaakko Saarakkala, the project’s Principal Investigator. This research could improve understanding of the initial development of osteoarthritis, and thus help clinicians diagnose the disease at an earlier stage. “We can compare the results from 3-D histopathological analysis with those from existing clinical imaging methods. Perhaps we can translate some of those approaches to clinical use and try to detect the disease at an early stage,” outlines Saarakkala. The project is analysing tissue samples from different stages of osteoarthritis, from healthy tissue through to people requiring a full knee replacement, to build up a more detailed picture of the disease. One of the main problems in diagnosing osteoarthritis is that the initial symptoms are difficult to distinguish from other conditions. “A very early sign is usually very unspecific pain symptoms, in the knee for example. Usually people don’t get diagnosed at that stage – they just go to the doctor,


they get pain medication, and are advised to avoid exercising for a few weeks, then hopefully the pain will go away,” says Saarakkala. Typically the disease is only diagnosed at quite an advanced stage, by which time there may be only limited treatment options available. “Usually it’s diagnosed by x-ray. If a doctor decides to take an x-ray they might see that the cartilage within the knee joint has already worn out. That’s quite a clear diagnosis – that’s already an end-stage case. It’s quite common to get diagnosed at that stage,” continues Saarakkala.

Histopathological grading Some level of tissue change is a natural part of the ageing process, and it can sometimes be difficult to distinguish relatively mild changes from the early stages of the actual disease. Histopathological grading offers a way to assess the health of a section of tissue, which can be useful in diagnosing osteoarthritis. “With histopathological grading (HPG) we extract a tissue sample and process it in the laboratory, then it is cut into very thin sections, around 5 micrometres. The tissue is stained with certain dyes according to its structure and placed under a microscope, then pathologists and researchers can grade it according to certain osteoarthritis grading scales,” explains Saarakkala. Different grading scales for the assessment of tissue have been published in the scientific literature, providing a basis for the diagnosis of osteoarthritis. “We can say that a certain tissue sample is grade 0 for



Researchers working within this ERC project. The image was taken in Simo Saarakkala’s laboratory, and the newly purchased micro-CT device can be seen in the background. (Left): Project’s Principal Investigator Simo Saarakkala, Ph.D. (Middle): Post-doc Fellow Mikko Finnilä, Ph.D. (Right): Doctoral student Sakari Karhula, M.Sc.

example, which means that it’s perfectly healthy. Or it could be right up to grades 5 and 6, which represent the most advanced level of pathology,” says Saarakkala. This is by nature a subjective visual evaluation, however, which can vary between pathologists or researchers according to their judgment. Saarakkala and his colleagues in the project are developing an automated system that would provide a more solid foundation for histopathological grading. “The idea is that we would develop quantitative algorithms to give the user an independent value. An additional, very important point is that we will do it in 3-D, so that we image the whole volume of the tissue sample instead of thin 2-D sections,” he outlines.

Historically, a lot of research in this area has been based on analysis of very thin sections of tissue, around 5 micrometres thick; Saarakkala believes a new, more detailed approach is required. “We really need to understand how a piece of tissue is structured in three dimensions – that’s the reason that these new imaging techniques are being used, they really allow you to do this,” he explains. “This wasn’t possible ten years ago, but now we have the necessary tools.” Researchers are using micro and nano computed tomography (CT) imaging methods, as well as clinical high-end extremity CT devices, to characterise the tissue samples in 3-D. This will act as the foundation for the development of the very first 3D HPG of osteoarthritis. “For example, we can analyse the collagen content in the articular cartilage and get a description of the collagen network. We can then identify some numbers to describe it – for example, a sample with a high collagen content might have a number of 20,” says Saarakkala.

If we are able to provide a more precise prognosis then maybe we will be able to identify in which patients the disease is likely to progress rapidly, and those in which it is likely to progress more slowly

The final step of the project will be to apply the novel 3D HPG in a clinical setting in vivo; Saarakkala hopes this will bring significant benefits over existing approaches. “We aim to create a new standard for histopathological grading of osteoarthritis. By the end of the project we aim to have a new standard, which will actually be much more sensitive than the current one,” he outlines. “It will be more detailed and more robust and it will give you the same numbers for the same samples – there wouldn’t be the variation that we saw previously.”
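The project’s own algorithms are not spelled out here, so purely as an illustration of what one quantitative 3-D descriptor might look like, the sketch below reduces a segmented micro-CT volume to its mean X-ray attenuation – the quantity the micro-CT figure caption links to collagen content. The function name, the toy data and the reduction to a single number are assumptions for illustration; how such descriptors combine into a 0-6 style grade is precisely what the project is developing.

```python
import numpy as np

# Illustrative sketch only: reduce a 3-D micro-CT volume to a single
# quantitative descriptor, in the spirit of automated 3-D grading.
# Names and values are hypothetical, not the project's algorithms.

def collagen_index(volume, cartilage_mask):
    """Mean X-ray attenuation within the segmented cartilage region.

    Higher attenuation correlates with higher collagen content,
    as noted in the micro-CT figure caption."""
    return float(volume[cartilage_mask].mean())

# Toy stand-in for a reconstructed micro-CT volume and its cartilage mask
rng = np.random.default_rng(0)
volume = rng.normal(loc=22.0, scale=4.0, size=(64, 64, 64))
mask = np.zeros_like(volume, dtype=bool)
mask[:, :, 8:40] = True  # pretend these slices are cartilage

print(f"collagen index: {collagen_index(volume, mask):.1f}")
```

An automated grader would compute many such descriptors (attenuation, surface integrity, bone sclerosis) across the whole volume, which is what removes the inter-observer variation of visual grading.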


Detailed prognosis This technique could be applied not only to diagnose osteoarthritis, but also to monitor its progression and provide a more detailed prognosis. This is an important aspect of Saarakkala’s research. “Almost all football players develop osteoarthritis for example. Our algorithm could be used to monitor tissue health,” he says. This kind of information is relevant in guiding lifestyle advice, such as exercise levels and weight loss, which are a central part of current treatment. “If we are able to provide a more precise prognosis then maybe we will be able to identify in which patients the disease is likely to progress rapidly, and those in which it is likely to progress more slowly,”



At a glance
Full Project Title
Development of 3D Histopathological Grading of Osteoarthritis (3D-OA-HISTO)
Project Objectives
1) To establish and validate the very first 3-D histopathological grading based on cutting-edge nano/micro-CT (Computed Tomography) technologies in vitro
2) To use the established method to clarify the beginning phases of OA
3) To validate 3-D histopathological grading for in vivo use.
Project Partners
This ERC Starting Grant has been fully allocated to the host institution (University of Oulu), but the following universities are actively involved in the project as collaborators: University of Helsinki (Finland) • University of Eastern Finland (Finland) • University of Toronto (Canada) • École Polytechnique de Montréal (Canada) • University of Calgary (Canada)
Contact Details
Simo Jaakko Saarakkala
Associate Professor, Academy Research Fellow
Research Unit of Medical Imaging, Physics and Technology, Faculty of Medicine, University of Oulu
POB 5000, FI-90014 Oulu, Finland
T: +358 50 574 6681
E: simo.saarakkala@oulu.fi
W: http://www.mipt-oulu.fi/portfolios/saarakkala_group
Nieminen HJ, Ylitalo T, Karhula S, Suuronen JP, Kauppinen S, Serimaa R, Hæggström E, Pritzker KP, Valkealahti M, Lehenkari P, Finnilä M, Saarakkala S. Determining collagen distribution in articular cartilage using contrast-enhanced micro-computed tomography. Osteoarthritis Cartilage (2015) 23(9):1613-21.

Simo Jaakko Saarakkala

Dr Saarakkala is an Associate Professor of Biomedical Engineering at the Research Unit of Medical Imaging, Physics and Technology, Faculty of Medicine, University of Oulu, Finland. His research group is focusing on developing new biomedical and clinical imaging methods for diagnostics of musculoskeletal diseases, primarily focusing on osteoarthritis.


Example images of cartilage-bone tissue samples imaged with the new micro-CT device (Skyscan 1272, see www.bruker.com/products/microtomography/micro-ct-for-sample-scanning/skyscan-1272/overview.html). The micro-CT device was purchased with this ERC Starting Grant. Tissue samples were extracted from patients who underwent total knee replacement surgery at Oulu University Hospital (Oulu, Finland). The left sample shows intact articular cartilage and subchondral bone tissues, i.e. no signs of osteoarthritis. The middle sample shows some level of articular cartilage degeneration and subchondral bone sclerosis, while the right sample shows advanced articular cartilage degeneration and severe subchondral bone sclerosis. All of these micro-level changes can be evaluated fully in 3-D, instead of with conventional histology, which relies on thin tissue sections. The overall goal of this ERC project is to develop algorithms that would allow automated 3-D histological analysis. The colorbar indicates the level of X-ray attenuation in the 3-D micro-CT imaging, which is also a sign of the collagen content within the articular cartilage (high attenuation means higher collagen content).

outlines Saarakkala. “Maybe we could advise the patient at risk of a higher rate of progression on their lifestyle. While this would not stop the disease in itself, at least it would help to slow its progression as much as possible, so that the patient can manage the condition and won’t need a total knee replacement.”

The idea of this project is to use some actual tissue blocks, instead of 2-dimensional slices. We will then image them in three dimensions, and develop algorithms that would allow 3-dimensional histological analysis

The main current methods of treating osteoarthritis are centred on managing the disease and pain mitigation. While the project’s primary focus is on histopathological grading, Saarakkala is interested in widening the impact of their research to help improve treatment. “It would be really nice to use these methods to study the effectiveness of pharmacological treatment. In future we want to apply our methods to follow-up and treatment, to really get detailed information on what a particular drug is doing in the tissue, and to assess whether it is having positive or negative effects,” he says. This work could include developing other new methods to diagnose patients. “Besides the ERC project we are considering what kinds of measurements we could do without going to MRI, which is the most comprehensive clinical imaging modality to diagnose osteoarthritis; however, it is quite an expensive and not widely available technique,” continues Saarakkala. “For example, we have been looking at thermal imaging of the knee, assessing the temperature of the knee. We have also been exploring the acoustic signals which are emitted from the knee when patients walk.”




Shining a light on the gut’s bacteria
Both obesity and type-2 diabetes are characterised by metabolic inflammation, yet the root causes of this inflammation are not fully understood. The ENIGMO project aims to shed new light on the topic by investigating how bacteria in the gut microbiota interact with the innate immune system and the endocannabinoid system, as Professor Patrice D. Cani explains
The number of cases of type-2 diabetes in the European Union’s Member States is predicted to rise to approximately 66 million by 2030, according to the International Diabetes Federation, and researchers continue to investigate the underlying causes of the disease. Metabolic inflammation and changes to the endocannabinoid (eCB) tone are known to play a role, underlining the wider importance of the ENIGMO project’s research. “The aim of the project is to investigate the bacteria living in our gut, the gut microbiota. We want to look at how they may interact with the innate immune system, the endocannabinoid system, and energy homeostasis,” explains Professor Patrice D. Cani, the project’s Principal Investigator.

The endocannabinoid system and the gut microbiota were not previously known to be connected, but now Professor Cani is investigating the links between them, which could lead to a deeper understanding of both obesity and type-2 diabetes. “The endocannabinoid system is comprised of different bio-active lipids and receptors. Some endocannabinoids are able to stimulate food intake, while others are able to reduce food intake. We also know that some endocannabinoids are involved in the regulation of inflammation,” he says.

Many questions remain about the root cause of metabolic inflammation, a condition which is known to be closely associated with the onset of insulin resistance, one of the first markers of the development of type-2 diabetes. This area forms a central element of Professor Cani’s research agenda. “We hope to demonstrate


that there are connections between some bacteria and the gut, but also, at a distance, with peripheral organs. A key question is whether one or several gut bacteria can trigger inflammation – or not – and thereby influence the development of insulin resistance and diabetes,” he outlines. Researchers are using several different approaches to investigate the fundamental mechanisms behind inflammation, with the wider goal of developing new therapies to treat type-2

diabetes. “We’re trying to understand how we can change the endocannabinoid system at the cellular level,” continues Professor Cani. “We are also developing different animal models with tissue-specific or cell-specific deletions of genes that are involved in the immune system or the endocannabinoid system.”

Akkermansia muciniphila A key area of research is the role of Akkermansia muciniphila, a recently identified bacterium which is present in the gut in relatively large quantities and plays a central role in regulating the host’s energy metabolism. While it was only discovered relatively recently, Professor Cani says levels of Akkermansia muciniphila are an important indicator. “We have discovered that this bacterium is lower in the context of diet-induced obesity, and perfectly correlated with the beneficial effects of prebiotics, which are non-digestible carbohydrates. They are dietary fibres that escape digestion, and are used and fermented by bacteria in the lower part of the gut,” he outlines. The second step has been to demonstrate the impact of such bacteria on host health. “We treated mice with these bacteria plus a high-fat diet – we found that the administration of Akkermansia muciniphila reduces insulin resistance, inflammation, body weight and fat mass development, and improves gut barrier function,” he outlines.

Professor Cani and his colleagues are investigating the underlying mechanisms behind this. The aim now is to assess whether this bacterium can crosstalk with the host intestinal cells via the innate immune system, the endocannabinoid system, or indeed both. It has already been demonstrated that silencing one specific gene involved in the innate immune system – for instance in the gut – has an impact on the microbial community; now the project aims to go a step further. “By using the different models developed in the project, we will not only identify the links between microbes and host, but also learn how we can influence the gut microbiota, which in turn influences the metabolism,” says Professor Cani. By analysing these parameters, which lie beyond the classical markers of type-2 diabetes, researchers aim to provide more specific data about each individual patient. “So we could potentially anticipate, based on the microbiota and also maybe the immune system response, whether a subject will respond to a particular treatment,” outlines Professor Cani. “Maybe we will be able to design specific therapeutic applications that will be effective for one cohort and not for another.”

We hope to demonstrate that there are connections between some bacteria and the gut, but also, at a distance, with peripheral organs. A key question is whether one or several gut bacteria can trigger inflammation – or not – and thereby influence the development of insulin resistance and diabetes

Overview of the phenotype observed following Napepld deletion in adipocytes: when NAPE-PLD is present and functional, adipose tissue homeostasis is maintained, and this contributes to a normal crosstalk between adipose tissue and the gut microbiota. When NAPE-PLD is absent, adipose tissue metabolism is altered, with increased inflammation, decreased browning capacity and excessive fat mass development. Napepld deletion induces alterations in glucose homeostasis and dysbiosis of the gut microbiota, which in turn participates in the metabolic alterations observed in the adipose tissue. (From Geurts et al., Nature Communications, 2015).

This kind of translational research will be an important part of the project’s agenda over the coming years, while the work also holds implications beyond type-2 diabetes. More evidence is emerging to suggest that gut barrier function is altered in other conditions, including rheumatoid arthritis and other inflammatory diseases. “This suggests that what is happening at the level of the gut may indeed have an influence on different organs. The mechanisms we have demonstrated as being important in the context of obesity and diabetes might also be applied to different diseases,” outlines Professor Cani. Having trained as a physiologist, Professor Cani believes many health problems can be traced back to the gut. “If you take care of your gut, you can prevent a lot of health problems. I think a lot of metabolic complications start in the gut, as this is where you find a large number of bacteria,” he stresses. “When pursuing research into integrative physiology, I want to see and to work on all these interconnections between organs and systems. I believe that health problems are not the simple result of a problem in one organ or one system – everything is inter-connected.”

At a glance

Full Project Title
Gut microbiota, innate immunity system and endocannabinoid metabolic interactions link inflammation with the hallmarks of obesity and type 2 diabetes (ENIGMO)

Project Objectives
In this high-risk/high-gain research programme, we propose to elucidate what could be one of the most fundamental processes shared by different key hallmarks of obesity and related diseases. This work could provide different perspectives on disease pathogenesis and knowledge-based evidence of new therapeutic options for obesity and associated metabolic disorders.

Contact Details
Patrice D. Cani
UCL, Université Catholique de Louvain
Faculty of Pharmacy and Biomedical Sciences, Louvain Drug Research Institute, Metabolism and Nutrition
WELBIO, Walloon Excellence in Life sciences and BIOtechnology
NeuroMicrobiota lab, European Associated Laboratory (INSERM/UCL)
Av. E. Mounier, 73 box B1.73.11, B-1200 Brussels, BELGIUM
T: +32 2 764 73 97
E: patrice.cani@uclouvain.be
W: http://www.ingutwetrust.org/
Twitter: @MicrObesity

Professor P. D. Cani

Gut microbiota The gut microbiota is an integral part of this research, and scientists continue to investigate links between changes in the brain-to-gut axis and other health problems. Professor Cani is also co-director of the NeuroMicrobiota lab with Professor C. Knauf (European Associated Laboratory, INSERM, Toulouse/UCL, Brussels), investigating the relationship between microbes and brain diseases such as neuro-degenerative diseases and depression. “I’m convinced that the brain-to-gut axis is very important,” he says. “I believe that we have to look beyond obesity and diabetes, and it will be really interesting to work in this integrative physiology field. I am also part of projects which are trying to link microbial changes to brain disorders.”

Project Coordinator

Professor P.D. Cani holds an M.Sc. in Nutrition and an M.Sc. in Health Sciences; he is a registered dietitian and holds a PhD in Biomedical Sciences. He is a member of several international associations and a founding member of the Belgian Nutrition Society. He is author or co-author of more than 155 scientific publications (h-index of 50; citations >11,500).



Decoding Neural Information For Human Movement DEMOVE is a research project which aims at decoding and exploiting the neural information responsible for human movements by processing the electrical activity of muscles. The project, headed by Professor Dario Farina, has huge implications for those with impaired mobility and might pave the way to a new generation of man-machine interfaces, in particular for upper-limb amputees. In order to

build advanced prosthetic interfaces that allow for natural movements, the control drive should also be as natural as possible, ideally coming directly from the motion-triggering motor neurons. To achieve such a remarkable engineering feat, the DEMOVE project, led by Professor Dario Farina and funded by the European Research Council (ERC), developed methods to decode this neural information in awake and naturally moving humans. For this purpose, his team mainly focuses on neural cells in the spinal cord, the so-called lower motor neurons. “For these cells, we have developed techniques that allow us to identify their electrical activity accurately in natural movements,” explained Professor Farina. “Although the motor neurons are not located in the brain, they receive input currents from the brain. An accurate decoding of their response, as we can now do in humans, allows the identification of these input currents and therefore allows us to understand the commands sent from the brain to generate movements.”

A Moving Proposition The neural decoding of motor neurons in the spinal cord provided the scientists with the opportunity to understand how the brain controls movement. The motor neurons send electrical activity directly to the muscles, which then transform this neural movement code into function. Therefore, reading this neural language allows for understanding the natural movement intentions sent by the brain, making it possible to predict a subject’s movements. There are several exciting applications envisaged based on the results of the DEMOVE project, but the implications specifically for amputees, who wish to have more intuitive and natural control of their prosthetic arm for instance, are truly life-changing. “We focus on several technologies but certainly artificial upper


limbs are one of the most appealing applications,” said Professor Farina. “In the case of amputations, we can still decode the needed motor-neuron information even if the muscles are missing, with a combination of surgical nerve procedures and algorithm design. Therefore, we can reconstruct the exact intention of the user and reproduce it in movements of a robotic arm or hand.” As such interfaces will understand the neural code – the language of the motion-triggering brain – they will, in the example of prosthetics, enable an artificial limb to interpret the precise neural instruction in a very natural way. This might eventually make it possible not only to perform simple actions such as shaking hands or picking up a fork, but potentially to let an amputee play the piano again. Beyond prosthetics, there is another spectrum of applications that could become possible from this research, ranging from the control of other assistive devices such as exoskeletons and wheelchairs to wearable technology for monitoring the elderly, or specifically for preventing falls.

In the case of amputations, we can still decode this information even if the muscles are missing, with a combination of surgical nerve procedures and algorithm design. Therefore, we can reconstruct the exact intention of the user and reproduce it in movements of a robotic arm or hand

Intact-bodied subject controlling a SensorHand Speed prosthesis (without glove; Ottobock Healthcare, Germany).

How It’s Done To study the neural pathways, the researchers had to identify the exact timing of the small electrical signals produced by motor neurons, so-called action potentials. Several challenges had to be overcome to achieve this goal. A central challenge was to be able, in the intact human, during natural movements, to identify the timing of activation of most of the motor neurons in the spinal cord which were responsible for the specific motor task. This implies understanding the output neural code from the spinal cord and, as a consequence, linking the neural information to its functional consequences – meaning movement. To tackle this issue and measure the output of not just some, but of almost all motor neurons that contribute to a human movement, advanced electrode systems were developed within the DEMOVE project, making in-vivo electrophysiological recordings both within and on top of muscles in humans possible. Through the development of new computational methods for extracting functionally relevant information on movement from these recordings, the neural code could then be decoded and the performed motions predicted. “Through the DEMOVE project, we now have methods not existing five years ago for decoding the spinal cord output while humans move. This has opened new ways of understanding movement as well as new possibilities for the design of neuro-technologies,” said Professor Farina. An important step to



bridge the gap between the above-described knowledge of neural decoding and well-described motion patterns is the development of biomechanical models. During the DEMOVE project, such models have been implemented to simulate, describe, and predict neurally-controlled movements in healthy subjects, amputees, and subjects suffering from other movement difficulties. A particularly attractive practical point of this research is that potentially distressing invasive surgical techniques can be avoided. The DEMOVE approach uses the muscles as biological, natural amplifiers of the nerve activity that comes from motor neurons, making it possible to extract the activity of motor neurons in the spinal cord with high precision. Professor Farina highlights the clinical implications of this: “The great advantage of this approach, with respect to placing electrodes into the spinal cord or the brain directly, is that we do not need a surgical operation and we can even make the decoding in a completely noninvasive way. The motor neurons are the only neural cells for which we have such a noninvasive access in humans. Our group has been pioneering the methods for decoding the neural code specifically for these cells.”
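To give a flavour of what identifying these firing times involves, the sketch below is a deliberately simplified, hypothetical illustration – not the DEMOVE decomposition pipeline, whose algorithms are far more sophisticated and operate on many high-density channels at once. It simulates one motor unit firing through an invented action-potential shape and recovers the firing times from the noisy trace by matched filtering; every parameter here is an assumption made for the toy example.

```python
import numpy as np

# Toy illustration only - NOT the DEMOVE algorithm. One motor unit's spike
# train is observed through an EMG-like signal (spikes convolved with an
# action-potential shape, plus noise); firing times are then recovered by
# matched filtering against the known template.

rng = np.random.default_rng(0)
fs = 2048                                 # sampling rate in Hz (hypothetical)
t = np.arange(0, 2.0, 1 / fs)             # two seconds of signal

true_firings = np.sort(rng.choice(len(t), size=20, replace=False))
impulse = np.zeros(len(t))
impulse[true_firings] = 1.0

# Hypothetical biphasic action-potential template, roughly 10 ms long
tau = np.arange(-0.005, 0.005, 1 / fs)
template = np.sin(2 * np.pi * 200 * tau) * np.exp(-(tau / 0.002) ** 2)

emg = np.convolve(impulse, template, mode="same")
emg += 0.05 * rng.standard_normal(len(t))        # additive measurement noise

# Matched filter: correlate the recording with the template and threshold
score = np.correlate(emg, template, mode="same")
candidates = np.flatnonzero(score > 0.6 * score.max())

# Keep one detection per burst (suppress neighbours closer than 5 ms)
detected = [candidates[0]]
for idx in candidates[1:]:
    if idx - detected[-1] > 0.005 * fs:
        detected.append(idx)

print(f"true firings: {len(true_firings)}, detected: {len(detected)}")
```

The real problem is much harder: dozens of motor units fire concurrently, their action potentials overlap, and the waveform shapes themselves must be learned from the data rather than assumed known – which is precisely where the project’s electrode systems and computational methods come in.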

Above: The DEMOVE approach records high-density electromyography from muscles (left panel) and uses computational methods to extract the neural code of the movement. This code is then used in real-time to drive a biomechanical musculoskeletal model (middle panel). Finally, these simulations drive a prosthesis with several degrees of freedom in a neurally-driven, natural way (right panel).

A Multi-disciplinary Approach To tackle the complex decoding of the neural signals with high-end engineering and bring it closer to application, DEMOVE has needed to link up various scientific disciplines and experts in a wide range of fields. “Our team is made up of mathematicians, physicists, biomedical engineers, movement scientists, roboticists, and many other disciplines. Bringing them all together is challenging, but I believe it also constitutes the strength of our research with respect to less interdisciplinary groups. The scientists mainly concerned with basic physiology,


Intact-bodied subject controlling a Michelangelo prosthesis (Ottobock Healthcare, Germany)



At a glance

Full Project Title
Decoding the Neural Code of Human Movements for a New Generation of Man-machine Interfaces (DEMOVE)

Project Objectives
DEMOVE aims at the development of advanced electrode systems for in-vivo electrophysiological recordings from nerves and muscles in humans, and new computational methods/models for extracting functionally significant information on human movement from these recordings. The highly innovative focus is that of providing the link between the cellular mechanisms and the behaviour of the whole motor system in the intact human, i.e. to build the bridge between the neural and functional understanding of movement.

Project Funding
Max ERC funding 2,431,473 euros. Advanced Grant (AdG), PE7, ERC-2010-AdG

Contact Details
Project Coordinator, Professor Dario Farina, PhD
Host Institution (HI): Universitaetsmedizin Goettingen, Georg-August-Universitaet Goettingen, Stiftung Oeffentlichen Rechts, Germany
T: +49 (0)551 39 20100
E: dario.farina@bccn.uni-goettingen.de
W: www.universitaetsmedizin-goettingen.de
W: www.neurorehabilitation-systems.de

Selected project in the 2014 ERC Annual Report: http://erc.europa.eu/sites/default/files/publication/files/erc_annual_report_2014.pdf (page 35 of the report).

Professor Dario Farina, Ph.D

Professor Dario Farina received Ph.D. degrees in automatic control and computer science and in electronics and communications engineering from the Ecole Centrale de Nantes, Nantes, France, and Politecnico di Torino, respectively, in 2001 and 2002. After a period as Assistant Professor at Politecnico di Torino, he was an Associate Professor (2004–2008) and then a Full Professor (2008–2010) at Aalborg University, Aalborg, Denmark. In 2010 he was appointed Full Professor and Founding Chair of the Department of Neurorehabilitation Engineering at the University Medical Center Göttingen, Georg-August University, Germany.


for example, find it very rewarding that their work can be applied to technology development. At the same time, more applied researchers are very eager to discuss and integrate into their work basic principles of neuroscience, which may broaden their view and also provide new biologically-inspired ideas for the applications. The key factor in bringing everyone together for a common goal is the right team of researchers, who need to have curiosity and interest in a very broad perspective, in addition to being highly qualified in the details of their own discipline. Within my ERC project, I have an excellent team with these characteristics.” Currently, 12 people from various scientific backgrounds work for DEMOVE. To increase scientific discussion and outreach, a DEMOVE symposium with internationally renowned scientists has been held annually over the four years of the project so far, with one final event planned for June 2016. This series of symposia has made it possible for experts in different fields to converse, collaborate, and present related findings in a formal setting in Göttingen.

Next Steps Now that the underlying techniques are successfully established, it is time to refine the technology to make it practical in ‘real world’ scenarios beyond the laboratory. Currently, experts are required to carefully mount the EMG electrodes onto an amputee’s muscles before each use, whereas in future, wearable technologies in which the electrodes are directly embedded into clothing will make the mounting process natural and easier for users. To make this possible, the amplifiers and electronics for signal recording need to be miniaturised and provided with wireless transmission. In parallel, the algorithms need to work under very general conditions, so that excellent control can be achieved outside the laboratory. Apart from its use in prosthetics, the proposed decoding approach can also be used to describe pathological behaviours and get closer to their causes, such as understanding how pathological tremor in limbs is generated and amplified. This research will also enable state-of-the-art technologies that make the lives of those suffering from such pathologies more comfortable. “With this new knowledge, it is possible to better tune training and rehabilitation programmes for elderly people and patients, and to monitor their progress objectively. Moreover, this new knowledge has allowed us to define new strategies for designing neurotechnologies,” concluded Professor Farina. It is no exaggeration to say DEMOVE will have a great impact on our fundamental understanding of how movement is generated, and might revolutionise the field of upper-limb prosthetics in the near future.

Musculoskeletal modelling of the upper limb.

Transradial amputee controlling the Michelangelo hand in the clinical SHAP test, monitored by a motion-capture system.



Stress reactions are an essential human survival mechanism, but they can also lead to changes in gene expression and long-term damage to the body. We spoke to Professor Hermona Soreq about her research into the functioning of a family of genes called microRNAs, and their role in regulating stress and anxiety

MicroRNAs and the path to stress Each individual reacts

to stressful situations differently, depending on their genetic inheritance, education and the environment in which they live. MicroRNAs, a relatively newly discovered family of genes, are thought to play a central role in the regulation of anxiety and stress, an area of great interest to Professor Hermona Soreq, the Principal Investigator of the CholinomiRs project. “The project is focused on the functioning of microRNAs in the brain. We’re looking at the regulation of one pathway of neural transmission involving a chemical called acetylcholine,” she says. Acetylcholine plays an important role in communication between the nervous system and the immune system. “Neurons communicate with each other by sending chemicals called neurotransmitters, and acetylcholine was the very first neurotransmitter to be discovered. The neurons that produce it send messages to other nerve centres, to other organs, to the intestine, and also to a lot of tissues. So it’s a very important communicator,” outlines Professor Soreq. This importance dates right back to our early ancestors. Although human civilisation has of course developed significantly over millennia, the way we


are wired is still very similar to early homo sapiens. “When early humans got anxious, their first thought was that their life was threatened. They needed to speed up their pulse and to run as fast as they could,” explains Professor Soreq. While most modern humans don’t face life-threatening situations in the course of their daily lives, we react in much the

same way to stressful circumstances, and microRNAs are an important component. “When we get stressed, regardless of the cause, we sort of prepare to run very quickly, and to avoid wasting energy on things that are not essential,” continues Professor Soreq. “We need a very efficient system in order to adjust in those kinds of situations. We have discovered over the last few years that that system is mainly controlled by microRNAs. We call those microRNAs which control cholinergic transmission – which is the transmission of acetylcholine – CholinomiRs.”

If neural transmission is a pathway, and if microRNAs control pathways, then there should be microRNAs that control more than one component of the acetylcholine pathway. If that pathway controls cognition and stress reactions, then disrupting it should lead to an imbalance, which will lead to higher anxiety and higher inflammation

MicroRNAs The emergence of microRNAs as regulators of gene expression, and of acetylcholine signalling as a regulator of anxiety and inflammation, provides an effective model for studying the interaction between the nervous system and the immune system. A lot of the microRNAs present in the human brain are not there in other species; Professor Soreq says these microRNAs play an important role in the body. “It really makes sense for the body to produce microRNAs, as they are smaller than other genes, so it doesn’t cost us much energy to produce them. They bind to other genes, based on a very small motif of the DNA. So that means they can essentially bind to all of the genes that control cholinergic transmission,” she explains.
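That “very small motif” is what molecular biologists call the seed region. As a hedged illustration of the principle – using the textbook canonical seed match and invented sequences, not any actual CholinomiR analysed in the project – scanning a transcript for potential binding sites reduces to a short string search:

```python
# Illustrative only: canonical 7-mer seed matching with invented sequences.
# A microRNA's "seed" (nucleotides 2-8) pairs with its reverse complement in
# a target mRNA, which is why such a small motif suffices to specify binding.

COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def seed_match_sites(mirna: str, target_mrna: str) -> list[int]:
    """Return start positions in target_mrna that match the miRNA seed."""
    seed = mirna[1:8]                       # nucleotides 2-8 of the miRNA
    site = "".join(COMPLEMENT[nt] for nt in reversed(seed))
    return [i for i in range(len(target_mrna) - len(site) + 1)
            if target_mrna[i:i + len(site)] == site]

mirna = "UGGAAUGUAAAGAAGUAUGUA"             # invented miRNA sequence
utr = "AGCUACAUUCCAGGUACAUUCCA"             # invented 3'UTR snippet

print(seed_match_sites(mirna, utr))         # -> [4, 15]
```

Because the match is so short, one seed can occur in the untranslated regions of many genes at once – which is what lets a single CholinomiR touch multiple components of the same pathway.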



Professor Soreq co-authored a recent paper looking at post-traumatic stress disorder (PTSD), underlining the wider relevance of research into microRNAs. “The findings of the paper suggest that the interaction between a specific microRNA – miRNA-608 – and the acetylcholinesterase gene is involved in the threat circuitry behind PTSD,” she says. The functioning of microRNAs is a central part of Professor Soreq’s overall research agenda, yet the project’s immediate focus is the role of CholinomiRs in regulating acetylcholine signalling, investigating how they interact with each other in both healthy and diseased states. This research starts from a clear concept of how neural transmission works. “If neural transmission is a pathway, and if microRNAs control a pathway, then there should be microRNAs that control more than one component of that pathway. And if the acetylcholine signalling pathway controls cognition and stress reactions, then disrupting it should lead to an imbalance, which will lead to higher anxiety and higher inflammation,” outlines Professor Soreq. This hypothesis

has been borne out by research in mice, in which microRNA regulation has been prevented on one of the cholinergic genes. “These mice are stressed because they were born like that, because we changed their functionality. Now we want to find out what happens in human beings when that occurs, and hopefully find ways to avoid such damage,” says Professor Soreq.

When we get stressed, regardless of the cause, we sort of prepare to run very quickly, and to avoid wasting energy on things that are not essential. We need a very efficient system in order to adjust in those kinds of situations

This builds on fundamental research into the role of CholinomiRs. They act almost like traffic policemen within the body, directing cholinergic transmission and helping the body to respond efficiently and effectively to stress; this is a delicate process, and Professor Soreq says many different CholinomiRs are involved in this coordinated action. “Each of them makes a relatively small contribution, but together they provide very efficient cumulative control,” she explains. Researchers have found that changes in these microRNAs, or changes in the genes they control, may imbalance the entire system of transmitting acetylcholine. “If you change one of the microRNAs then you get a domino effect – all the others will be modified too. If you interrupt the interaction of one of these CholinomiRs with its target gene, then you have too much of the target gene, because there’s not enough control over it,” says Professor Soreq.

MicroRNAs create a complex network of interactions and compete with each other in affecting all of our genes.

This can have a significant impact on an individual’s anxiety and stress levels. Professor Soreq and her colleagues investigated what happens when the interaction between CholinomiRs and a target gene goes wrong, gaining some clear results. “We found that people who carry a specific genomic change of this interaction show changes in their anxiety level. They have higher blood pressure, and they have a higher inflammation level. This is to be expected when you imbalance the transmission of acetylcholine, from the brain to the body and back,” she says. Researchers have also




studied the impact of changing the balance of cholinergic transmission in a large-scale study of healthy volunteers. “Acetylcholine controls the heartbeat. So, would changes to the heartbeat be associated with a higher level of anxiety? This question has been asked before, but only at the individual level,” continues Professor Soreq. It has historically been difficult to study the effects of fear and anxiety on cardiac functioning in large populations, as few populations are chronically exposed to stressful conditions. Based at the Hebrew University of Jerusalem, Professor Soreq says the circumstances of daily life in Israel expose people to high levels of stress, from whom the project can gather data. “We’ve looked at data from 18,000 people who come for health checks. These are healthy people – companies pay for their annual health check, and then they are asked: ‘Would you be willing to put your data into a study?’” she says. Participants were asked about the extent to which they felt the effect of terror on their lives, which researchers could then correlate with physiological data, such as pulse rate. “Typically, pulse rate slows down with age. However, some of those volunteers developed annual increases in pulse rate – and those people are at greater risk of premature death,” outlines Professor Soreq. This group comprised about 4 percent of the 18,000 people who took part in the study. The next step was to investigate any links between increased pulse rate and the way people answered the questions about terror. “We asked whether we could see any association between the increase

in their heartbeat and the way people perceive the threat of terror. Was there any association between that and the level of inflammation?” explains Professor Soreq. This work revealed some clear links. “We used machine learning to find an association, and we found that those 4 percent of people who developed an increase in pulse rate were those who answered five out of five to the question ‘to what extent do you fear terror?’ So they are really highly strung,” says Professor Soreq. “The good aspect of these findings is that while everybody here lives under chronic stress, most of the population manages to stay healthy.”
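A minimal sketch of how such an association might be tested is given below. The data are entirely synthetic and a plain logistic regression stands in for whatever machine-learning pipeline the study actually used; the variable names and effect sizes are invented to mimic the reported pattern, not taken from the cohort.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in data - NOT the study's dataset or its actual pipeline.
rng = np.random.default_rng(1)
n = 18_000

# Self-reported fear-of-terror rating on a 1-5 scale
fear = rng.integers(1, 6, size=n)

# Outcome: annual increase in pulse rate (about 4% of the cohort overall,
# made more likely here for 5/5 answers, mimicking the reported association)
p_rising = np.where(fear == 5, 0.15, 0.02)
rising_pulse = rng.random(n) < p_rising

model = LogisticRegression().fit(fear.reshape(-1, 1), rising_pulse)
odds_ratio = np.exp(model.coef_[0][0])
print(f"odds ratio per point on the fear scale: {odds_ratio:.2f}")
```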

Therapeutic avenues This research holds clear relevance in terms of therapeutic development, and two main avenues are currently being pursued. The first is to develop therapies to control inflammatory diseases, such as Crohn’s disease, while the second is to control liver disease. “If you think again about the stressed early homo sapiens creatures, they needed to ensure they used energy very carefully. That meant storing fat in their liver. We still react that way today. We are exploring the role of CholinomiRs in this pathway, and we are trying to progress with that towards therapeutic treatment,” says Professor Soreq. This kind of therapeutic work will form an important part of Professor Soreq’s future agenda, while she also plans to pursue further exploratory research. “I’m fascinated by the evolutionary path of research. We are trying to take that further, while also looking at potential therapeutic pathways, so we are proceeding in both directions,” she says.

At a glance

Full Project Title
MicroRNA Regulators of Cholinergic Signalling in the Neuro-Immune Interface (CholinomiRs)

Project Objectives
This project aims at establishing the existence of, and competition between, microRNA regulators of cholinergic signalling (CholinomiRs). Researchers aim to discover the ways in which this new regulatory mode of cholinergic processes is involved in controlling anxiety and inflammation, and to develop new tools for experimental and therapeutic interference with these regulatory processes.

Project Funding
ERC FP7 program – ncRNAPain – no. 602133
ERC FP7 program – CholinomiRs – no. 321501
ERC H2020 program – FLDcure – no. 639314

Project Partners
• Group leader for the pain project is Professor Michaela Kress, Innsbruck.

Contact Details
Petra Pollins, Soreq Lab, Department of Biological Chemistry, The Hebrew University of Jerusalem
T: +02-6585446, 054-7736511
E: petra.pollins@mail.huji.ac.il
W: https://erc.europa.eu/cholinomirsmicrorna-regulators-cholinergicsignalling-neuro-immune-interface
W: http://cordis.europa.eu/project/rcn/107242_en.html

Professor Hermona Soreq

Professor Hermona Soreq is a molecular neuroscientist studying microRNAs and cholinergic signalling. Both topics are more relevant than ever in health and disease, with major implications that integrate basic with translational research.




Older adults tend to take longer to make decisions than younger people, as they typically seek to gather more information before reaching a conclusion. Researchers in the SPEED project are using sequential sampling models to investigate speeded decision-making in the human brain, as Professor Birte Forstmann explains

Understanding speeded decision-making A number of

studies have shown that older adults take longer to make decisions than younger people, as we tend to become more cautious as we age. Based at the University of Amsterdam in the Netherlands, Professor Birte Forstmann is the coordinator of the SPEED project, an ERC-backed initiative that is investigating decision-making. “The general aim is to understand speeded decision-making in the human brain,” she outlines. The project is investigating key mechanisms in the brain, using a range of techniques. “We will use quantitative modelling – so-called sequential sampling models that can be fitted to data – in combination with state-of-the-art imaging techniques, such as 7-tesla MRI (7T MRI),” continues Professor Forstmann. “One sub-project focuses very specifically on the subthalamic nucleus (STN). This is a small nucleus in the mid-brain, which is thought to act as a kind of brake in the brain.”

Subthalamic nucleus The STN is thought to play a very important role in connecting to different cortical regions, thereby giving rise to cognitive and associative networks. Researchers in the SPEED project aim to identify manipulations that can be used to tap into these different networks, which are also represented in the STN. “The question here is whether the STN contains sub-parts that connect with these cortical regions, and thereby affect the speed of decision-making,” says Professor Forstmann. This is a key part of Professor Forstmann’s research. “One crucial question that we’re tackling is whether the STN is structured in such a way that there are specific sub-divisions, or whether differentiation is more gradual,” she continues. “Then we’re looking at how information is transferred back into the cortex from the STN, taking into account its structure and architecture.”


The project is using sequential sampling models to investigate decision-making processes and the different mechanisms involved. Researchers are investigating very simple perceptual discrimination tasks, in which knowledge, memory and experience are not involved. “For example, we might ask an individual sitting in front of a screen to decide whether some dots presented on the screen moved to the left or the right,” outlines Professor Forstmann. Researchers aim to understand what happens during the decision phase, after the relevant information has been received. “These models help us to get at this latent level. If we only look at reaction times, or errors, then we won’t understand a lot about the decision phase itself,” says Professor Forstmann.
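To make the idea concrete, here is a minimal sketch of the best-known sequential sampling model, the drift-diffusion model: noisy evidence accumulates towards one of two boundaries, and the boundary separation plays the role of the response caution that increases with age. All parameter values are illustrative assumptions, not estimates fitted to any SPEED data.

```python
import numpy as np

# Minimal drift-diffusion sketch (illustrative parameters, not SPEED data).
# Evidence x accumulates with drift v and unit-variance noise until it hits
# the upper boundary a ("correct") or 0 ("error"); t0 is non-decision time.

def simulate_ddm(v=1.0, a=1.0, t0=0.3, dt=0.001, n_trials=2000, seed=0):
    rng = np.random.default_rng(seed)
    rts, correct = [], []
    for _ in range(n_trials):
        x, t = a / 2, 0.0                  # start midway between boundaries
        while 0.0 < x < a:
            x += v * dt + rng.normal(0.0, np.sqrt(dt))
            t += dt
        rts.append(t + t0)                 # add motor/encoding time
        correct.append(x >= a)
    return np.array(rts), np.array(correct)

for a in (1.0, 2.0):                       # wider boundaries ~ more caution
    rts, correct = simulate_ddm(a=a)
    print(f"a={a}: mean RT {rts.mean():.2f} s, accuracy {correct.mean():.1%}")
```

Raising the boundary separation reproduces exactly the pattern described below – responses become slower and more accurate at the same time – which raw reaction times and error rates alone could not disentangle.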

Researchers have found that people not only make decisions more slowly as they grow older, but also that, in parallel, they become more accurate. Professor Forstmann says this represents a very interesting shift in response thresholds. “People become more cautious as they grow older, which is eventually reflected in extended reaction times. They take more time in making decisions in order to make fewer errors,” she explains. Sequential sampling models are vital to understanding the underlying mechanisms behind this. “We see a lot of variation in how elderly people solve perceptual discrimination tasks, but we also see significant variation in young, healthy people, and we want to exploit these inter-individual differences,” outlines Professor Forstmann. “By using these models we can estimate the length of the decision phase for each individual, and then relate that back to neuro-scientific measures.”

We see a lot of variation in how elderly people solve perceptual discrimination tasks, but we also see significant variation in young, healthy people, and we want to exploit these inter-individual differences

From these sequential sampling models researchers can gather information on key parameters of decision-making, such as motor execution times. This can then be correlated with, for example, brain imaging data. “We can find out whether certain areas of the brain are modulated, or whether they are modulated due to inter-individual differences in these decision parameters. So then we will learn which areas of the brain are modulated by differences in the decision process,” explains Professor Forstmann. Researchers can also use functional imaging data to build an even more detailed picture. “With ultra-high field MRI – for example 7T MRI, or an even higher field strength – small structures such as the STN become visible to such an extent that it is possible to delineate these structures manually, and to create probabilistic atlas maps,” says Professor Forstmann.
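Conceptually, a probabilistic atlas is simple once individual delineations exist: after each subject’s manually traced mask has been registered to a common space, the atlas value at a voxel is the fraction of subjects whose mask includes it. A hedged numpy sketch, with random spherical blobs standing in for real, registered STN segmentations:

```python
import numpy as np

# Toy probabilistic atlas: the voxel-wise average of binary masks that are
# assumed to be already registered to a common space. Random spheres stand
# in for real manual delineations of a structure such as the STN.
rng = np.random.default_rng(2)
shape = (32, 32, 32)                     # tiny "common space" grid
n_subjects = 20

grid = np.stack(np.meshgrid(*[np.arange(s) for s in shape], indexing="ij"))
masks = []
for _ in range(n_subjects):
    centre = 16 + rng.normal(0.0, 1.0, size=3)   # per-subject variability
    dist2 = sum((grid[i] - centre[i]) ** 2 for i in range(3))
    masks.append(dist2 < 5.0 ** 2)               # spherical stand-in mask

atlas = np.mean(masks, axis=0)   # value = fraction of subjects labelling voxel
print("voxels where all subjects agree:", int((atlas == 1.0).sum()))
print("voxels labelled by at least one subject:", int((atlas > 0).sum()))
```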

Parkinson’s disease This research could hold important implications for the treatment of Parkinson’s disease. Patients with Parkinson’s commonly suffer from tremors and movement problems during the progression of the disease; deep brain stimulation (DBS) is one method of alleviating these symptoms. “During this DBS procedure an electrode is lowered into the STN. The idea is that stimulating the STN, and in particular specific subparts of the STN, will relieve these very strong tremors that these patients display,” explains Professor Forstmann. This method is based on the hypothesis that the STN has several sub-divisions, but recent



At a glance

Full Project Title
Speeded decision-making in the basal ganglia: An integrative model-based approach (SPEED)

Project Objectives
Because of its pivotal role in how we interact with the world, the topic of speeded decision-making has been studied by many disciplines, including mathematical psychology, experimental psychology, and the cognitive neurosciences. These disciplines often work in isolation, and the main goal and defining feature of this proposal is to study speeded decision-making using an integrative, model-based approach (e.g., Forstmann et al., 2015, Annual Review of Psychology).

Project Partners
ERC • NWO • STW • Hersenstichting • Parkinsonfond

Contact Details
Project Coordinator, Professor Birte U. Forstmann, PhD
University of Amsterdam, Valkenierstraat 65-67, 1018 XE Amsterdam
T: +31 (0)20-525 6281
E: buforstmann@gmail.com
W: www.birteforstmann.com
W: http://cordis.europa.eu/project/rcn/106889_en.html

Forstmann, B. U., & Wagenmakers, E.-J. (Eds.) (2015). An introduction to model-based cognitive neuroscience. Springer.

research calls this into question. “Our initial evidence shows that there aren’t strict territories in the STN, but there are in fact gradations,” says Professor Forstmann. “So in fact the hypothesis under which DBS is applied, and the mechanisms behind it, are under scrutiny.” The DBS method is actually quite effective in alleviating the symptoms of Parkinson’s disease, yet the underlying mechanisms are not fully understood. The project aims to shed new light in this area, which could enable researchers to improve the DBS method further. “The results of this project will show in more detail how neurons are distributed and what types of neuro-transmitters are present in the STN,” says Professor Forstmann. The SPEED project itself is focused on healthy subjects, but Professor Forstmann hopes to extend the scope of their research in future. “I hope that our work will eventually be translated into the clinical neurosciences, so that more of the clinics which offer DBS procedures will use 7T MRI to get better


images for patients prior to surgery,” she says. “This will allow neurosurgeons to individually delineate the STN and then plan surgery accordingly.” The project’s immediate focus is on fulfilling the aims set out in the original grant, yet Professor Forstmann is also looking towards further research. One major part of her future work will be atlasing the human sub-cortex. “The main aim here is to focus on the mid-brain and the human sub-cortex, using ultra-high-resolution MRI. We aim to better understand – not only structurally, but also functionally – what’s going on there in both healthy and diseased states,” she outlines. This research will hold relevance for many fields, including the clinical neurosciences, the basic neurosciences, and the computational neurosciences. “A more detailed chart of the brain will be of use for many disciplines. This work is vital for fostering the neurosciences,” stresses Professor Forstmann.

Forstmann, B. U., Ratcliff, R., & Wagenmakers, E.-J. (in press). Sequential Sampling Models in Cognitive Neuroscience: Advantages, Applications, and Extensions. Annual Review of Psychology.

Professor Birte Forstmann

Professor Birte Forstmann is a Full Professor of Cognitive Neuroscience at the University of Amsterdam. Her general research goal is to understand the brain mechanisms that allow people to adapt quickly to changes in their environment. Her work combines mathematical modelling with functional magnetic resonance imaging (fMRI), diffusion-weighted imaging, ultra-high-resolution 7-Tesla MRI, electroencephalography, and tissue from post-mortem brains.



Different cell types of the mouse retina (left: photoreceptors; right: bipolar cells) labelled with a green fluorescent protein.

A new vision for treating retinal blindness There is currently no cure for retinitis pigmentosa, a degenerative eye condition which affects photoreceptors in the retina, leading to complete blindness at the later stages of the disease. Inserting microbial opsins into the diseased retina holds real promise as a means of treating the condition and other retinal degenerative diseases, says Dr Jens Duebel, Principal Investigator of the OptogenRet project. A degenerative eye

disease which severely impairs vision, retinitis pigmentosa can be caused by many different photoreceptor-specific mutations. While there is currently no cure for the disease, the work of the OptogenRet project could lead to restored visual function. “The idea is to use microbial opsins, derived from bacteria or other sources, and to introduce them into the diseased retina,” says Jens Duebel, the project’s Principal Investigator. This approach is designed to restore light sensitivity in the retina, a property which is lost in cases of retinitis pigmentosa. The feasibility of the optogenetic approach has been demonstrated in previous studies by other laboratories. “We use these microbial opsins, which are light-sensitive proteins. We put the genetic code of a microbial opsin into a viral vector – a so-called adeno-associated virus (AAV) – and inject it into the back of the eye. This is all done in mouse models of retinal degeneration,” explains Duebel. “After injection, the


virus can do its job and introduce the information from the microbial opsin. This is then expressed in a specific cell type. Then we use electrophysiology and two-photon imaging to test the functionality of the microbial opsin – i.e. if it can make the blind retina light-sensitive again.”

We are using so-called ‘optogenetics’ to restore vision. The idea is to use microbial opsins, derived from bacteria or other sources, and to introduce them into the diseased retina

The light intensity that is needed for optogenetic stimulation is very high, though. To overcome this obstacle, researchers have developed novel optogenetic treatment strategies using microbial opsins with enhanced light-sensitivity; however, Duebel says there is always a trade-off involved. “Either they are very light-sensitive but their temporal resolution goes down, or they are really fast but then they are not light-sensitive. So we are investigating boosting the intensity of light with a prosthetic device,” he outlines. “In this context, tuning the wavelength of the microbial opsin will be of major importance. If we can shift the wavelength from blue light to red light, the energy is lowered, and that’s less harmful to the retina. Thus, with red-shifted microbial opsins we are allowed to use much higher light intensities, without the risk of damaging the eye.”
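The physics behind the preference for red-shifted opsins is easy to verify: a photon’s energy is E = hc/λ, so a red photon at around 630 nm carries roughly 25% less energy than a blue one at around 470 nm, which is why brighter red stimulation can be better tolerated by the tissue. The wavelengths below are illustrative choices, not the project’s exact stimulation bands:

```python
# Photon energy E = h*c/wavelength: red photons are individually less
# energetic than blue ones, so higher red-light intensities deposit less
# energy per photon. Wavelengths here are illustrative, not project values.
h = 6.626e-34   # Planck constant (J*s)
c = 2.998e8     # speed of light (m/s)

for name, wavelength_nm in [("blue", 470), ("red", 630)]:
    energy_j = h * c / (wavelength_nm * 1e-9)
    print(f"{name} ({wavelength_nm} nm): {energy_j:.3e} J per photon")

print(f"red/blue energy ratio: {470 / 630:.2f}")   # about 0.75
```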



Fundus image of a mouse retina where a microbial opsin (green) has been introduced into retinal output neurons that are directly connected to the brain.

Another challenge is that the degree of retinal degeneration among patients varies, something which Duebel and his colleagues take into account in their research. In some patients the photoreceptors – the first-order neurons – are still present, but the light-sensitive segments have been lost. “These people are functionally blind, but the cell bodies still remain, and they make connections to the second-order neurons, the so-called bipolar cells,” explains Duebel. “In a previous study we demonstrated that the expression of an archaebacterial opsin in these surviving ‘dormant’ photoreceptors can restore visual responses in blind mice.” Duebel is also interested in patients who don’t have any photoreceptors left. “In those cases we have to go to the bipolar cells. In a mouse model of retinitis pigmentosa, we introduced a microbial opsin with a viral vector to specifically target the bipolar cells,” he says. “We took measurements in the retina and in the visual cortex of blind mice that had been treated. We wanted to know if this information from the retina was transmitted to the higher visual centre of the brain – the visual cortex.” Researchers found that they could indeed measure light responses in the cortex, and visually guided behaviour was also observed. “This showed that we could restore visual activity on the retinal level, on the level of the brain, and the behaviour


level,” outlines Duebel. “Then there is another group of patients, where only the output neurons of the retina – the ganglion cells – remain, and our data from animal models and post-mortem human retinas show that we are also able to target these cells; but there is also a trade-off in this approach, because less retinal information processing remains when the last-order neurons are targeted.” A key point of the optogenetic approach is that the strategy works independently of the genetic cause of the disease; in other words, it is a universal strategy, regardless of the patient’s mutations. This optogenetic approach could potentially be used to treat other retinal degenerative diseases, including common conditions like age-related macular degeneration, which leads to a gradual loss of central vision.

Custom in-house designed two-photon microscope combined with an electrophysiology setup.

Another major advantage of this technique is that injecting a viral vector into the eye is less invasive than an electrical implant, which requires complicated surgery: “We just inject the virus with one shot, and then the protein is expressed. In the animal model, we can see that the expression lasts for the entire life of the animal,” says Duebel. A further benefit of optogenetics over electrical implants is the much higher spatial resolution on offer. “With an electrical implant the spacing of the electrodes is constrained – you cannot put the electrodes very close together,” explains Duebel. “With optogenetics you can stimulate at a much better spatial resolution than with electrodes. In contrast to electrical implants, we expect that an optogenetic



At a glance

Full Project Title
Microbial opsins for mammalian vision: Optogenetics in the retina (OptogenRet)

Project Objectives
The ERC Starting Grant project ‘OptogenRet’ aims at developing optogenetic treatment strategies to restore vision in retinal degenerative diseases. The key idea of optogenetics is to convert light-insensitive retinal neurons into ‘artificial photoreceptors’ by genetically introducing light-sensitive proteins from microbes (algae or bacteria).

Project Funding
European Research Council (ERC) Starting Grant ‘OptogenRet’ 309776

Project Partners
• Dr D. Dalkara, Dr S. Picaud, Dr T. Léveillard, Dr O. Goureau, Professor J.-A. Sahel; Institut de la Vision, Paris, France
• Professor M. Ader; CRTD / DFG-Center for Regenerative Therapies Dresden, Cluster of Excellence, Dresden, Germany
• Dr V. Emiliani; Neurophotonics Laboratory, Paris Descartes University, CNRS, France

Contact Details
Dr Jens Duebel
Group Leader, Department of Visual Information Processing, Institut de la Vision, 17 rue Moreau, 75012 Paris, France
T: +33 6 5256 2400
E: jens.duebel@inserm.fr
W: www.institut-vision.org

Dr Jens Duebel

Dr Jens Duebel is a Group Leader at the Institut de la Vision in Paris. His research centres on neurophysiology and optogenetic applications in the retina. After his doctoral studies at the Max-Planck Institute for Medical Research Heidelberg in Germany he worked as a postdoctoral fellow in Botond Roska’s laboratory at the Friedrich Miescher Institute for Biomedical Research Basel, Switzerland.


Depending on the degree of retinal degeneration, microbial opsins can be expressed in specific cell types of the retina by using a viral vector (AAV). The drawing in the centre (Cajal, 1900) shows an intact retina. In a degenerated retina, after optogenetic treatment, microbial-opsin-driven light responses are transmitted via the optic nerve to the brain.

based therapy will be able to restore vision at much higher acuity levels.” “In addition, we are developing parallel approaches together with our colleagues, to not only restore the light-sensitive function to cells, but also to slow down degeneration. We also have viral vectors carrying a trophic factor,” continues Duebel. “These trophic factors help neurons maintain connections with their neighbours. If you lose the retinal structure then you have nothing to target,” he points out. “We aim to maintain the structure, then bring in a functional microbial opsin, to restore the function. So these two approaches are running in parallel.” The viral vector (AAV) has already been used in clinical trials to treat LCA (Leber’s Congenital Amaurosis), showing that it is non-pathogenic; now Duebel and his colleagues aim to develop their gene therapy approach further. The project’s long-term goal is to translate this research into an effective therapeutic approach. A start-up company has been established on the campus of the Vision Institute in Paris, with the aim of bringing optogenetic approaches to clinical trials in the near future; safety is a key issue here. “There are two key points regarding safety – do you get any inflammation, and how much light do you need?” says Duebel. The indications

so far are positive, and if the project’s approach doesn’t cause any inflammation, then Duebel believes there is a good chance of moving to clinical trials in the near future. “It is important to do patient screenings, to see who is eligible. This work has already been started here at the eye clinic,” he says. “Our colleagues in the ophthalmology department are using OCT scanning (Optical Coherence Tomography) to see which patients could benefit from treatment.” This imaging technique allows researchers to assess the condition and extent of the remaining retinal structure in a patient, and tailor treatment accordingly. While applied research is an important part of Duebel’s overall agenda, he is keen to stress that his lab also pursues more fundamental research into the retina. “We have to understand how the retina works in order to develop treatment strategies,” he points out. Much remains to be learned about the function of specific inter-neurons in the retina, an area that Duebel plans to investigate further in future. “We want to learn more about the function of these retinal interneurons,” he says. “We can use the optogenetic approach as a stimulation tool. We can pick a certain cell type and stimulate it. Then we can see which cell type this cell is talking to, and from that we can understand more about the function of these unknown cell types.”



Evidence suggests that a healthy diet can help prevent the development of eye conditions, yet many of us don’t eat enough nutrients to maintain eye health. Professor John Nolan tells us about the CREST project’s work in investigating whether enhancing nutrition in the eye has a positive impact on both healthy subjects and people with age-related macular degeneration

Nutrients that enhance eye performance A

degenerative condition which mainly affects older people, age-related macular degeneration (AMD) is the leading cause of blindness in the developed world, and hence is the focus of a great deal of research attention. Based at the Nutrition Research Centre Ireland at Waterford Institute of Technology in Ireland, Professor John Nolan is the Principal Investigator of the CREST project, an ERC-backed initiative investigating the effects of specific nutrients on visual performance. “The aim of the CREST project was to investigate whether enhancing or enriching nutrition in the eye, with key nutrients, will impact positively on visual function in two distinct populations,” he outlines. The first of these populations was a young, healthy group with no known eye problems (CREST Normal). “The idea here was to study whether we could take a young and healthy person, with normal vision, and enrich nutrition in their eye to give them better visual performance,” explains Professor Nolan. The second population was subjects with early-stage AMD (CREST AMD) who met a rigorously applied set of inclusion criteria, including an internationally recognised categorisation of AMD. Currently there is no cure for AMD, with the exception of some treatment options using injections for the advanced neovascular (wet) form of the disease. This treatment is very expensive, costing around €30,000 per patient, per eye, per year, and hence is not a feasible option for treating all patients in the long term. However, Professor Nolan says that nutritional changes can slow the progression of the disease.

Potentially, we could even stop the disease developing in the first place

“We know the risk genes that are associated with AMD, but the real opportunity is that by optimising lifestyle choices, such as by stopping smoking and controlling body fat and enhancing nutrition, using key nutrients known as carotenoids, we can do a very good job of pushing out the time at which this disease presents. Potentially, we could even stop the disease developing in the first place,” he says. This could be achieved by enriching the macular pigments, the carotenoids found at the macula, in the central part of the retina. “What is really interesting is that we can make patients with AMD see better, by increasing their macular pigment. We can slow down the rate of progression of the disease, and we can create a better visual experience for patients. This is wonderful and, importantly, patients really notice the difference,” continues Professor Nolan.

What is really interesting is that we can make patients with AMD see better, by increasing their macular pigment. We can slow down the rate of progression of the disease, and we can create a better visual experience for patients

Macular Pigment The macular pigment in the eye is made up of nutrients called carotenoids, which can be found in many leafy greens and coloured fruits and vegetables, and overall it performs two key functions. Firstly, it has antioxidant properties, which makes it ideal for neutralising free radicals – unstable molecules produced in the eye in a process known as oxidative stress – which destroy the cells we need for vision. Secondly, it filters short-wavelength blue light. This is important because blue light also produces free radicals, and because blue light is deleterious for visual performance and experience. One reason for this is that we have no blue-sensitive cones at the centre of the macula, and therefore blue light cannot contribute to visual performance and experience at the location of maximum acuity. Also, it is only the blue wavelengths of light that are appreciably scattered and so contribute to a phenomenon known as veiling luminance. Therefore, filtration by macular pigment is crucial if the deleterious effect of veiling luminance on visual performance is to be minimised; both oxidative stress and blue light are major factors that contribute to developing AMD. “The two main reasons why we get AMD are oxidative stress – which is damage caused by unstable molecules in the eye – and cumulative exposure to blue light,” explains Professor Nolan. However, even a healthy person with a well-balanced diet consumes far less

carotenoids than are needed to maintain optimal levels of macular pigment. “We are all walking around with sub-optimal levels of carotenoids,” says Professor Nolan. “An average western diet contains about 1.5 mg of the macular pigment carotenoids per day. Studies show that if you want to change retinal tissue levels, to a point that has a positive implication in terms of visual function, you need to be consuming between 10 and 20 mg per day.” The three carotenoids in the retina are meso-zeaxanthin (MZ), zeaxanthin (Z) and lutein (L), which together form the macular pigment, an important yellow protective pigment found in the human retina. Researchers are now rigorously investigating whether supplementation with these specific carotenoids has a positive effect on visual performance. “Earlier trials,

29


before CREST AMD, really pointed us in the direction that a combination of the three nutrients is the best way to both increase macular pigment in the retina, both centrally and across the spatial profile, which also has the best results in terms of visual performance,” outlines Professor Nolan. Supplementing diet with these carotenoids can have a positive impact both on the young, healthy population, and also on subjects with early-stage AMD. “With these specific nutrients we can actually create a better, more effective visual experience in the younger population, while also having a positive impact on patients with AMD,” says Professor Nolan. The CREST Normal study, a placebo control

clinical trial, has now been completed. The effects of supplementation with all three macular carotenoids in a 10:10:2 mg ratio (MZ, L, and Z) were compared with a placebo in subjects free of retinal disease, but with

low macular pigment. The CREST AMD study is a head-to-head clinical trial in patients with early AMD, comparing the combination of the three carotenoids in the 10:10:2 ratio (MZ, L, and Z) – uniquely also including the central carotenoid MZ – against a formulation which doesn’t contain MZ (L and Z in a 10:2 ratio). Both formulations contained the same co-antioxidants that were used in the American AREDS2 trial funded by the National Eye Institute (NEI). This work builds on the AREDS2 study, which tested the impact of carotenoids plus co-antioxidants on the progression of AMD. “The AREDS2 study used multivitamins and antioxidant supplements, together with L and Z. The study showed that this formulation had a beneficial effect for patients in terms of slowing their rate of progression,” explains Professor Nolan. There are important differences between CREST AMD and the earlier AREDS2 study, however. “CREST AMD was uniquely designed to test the advantages of adding MZ to the formula. But also, our outcome measures are visual performance and not disease progression; in other words, we want to know if we can improve visual performance

in these patients. Therefore, we have used very specialised tests for visual function to measure vision,” says Professor Nolan. Evidence suggests that MZ in particular plays a central role in the enrichment of macular pigment. It is the dominant macular carotenoid at the foveal epicentre, the part of the retina responsible for central vision, and is in the ideal position to filter light and perform antioxidant activity. “What the data shows us is that a combination of MZ, L and Z in the 10:10:2 ratio offers a better intervention because it contains MZ,” explains Professor Nolan. “Indeed, five head-to-head trials have shown that a formulation containing all three macular carotenoids in an MZ:L:Z ratio of 10:10:2 is superior to alternative formulations, in terms of both visual improvements and observed increases in macular pigment (the precise aim of supplementation).” The effectiveness of such interventions may vary according to the stage of the condition, however. “We’re looking at data from early-stage AMD, and we’ll see if we can enhance vision at this stage of the disease,” says Professor Nolan. “The AREDS2 study showed that the rate of progression from intermediate AMD to advanced AMD can be slowed, but my message is that we actually need to intervene at an earlier point.”

Dr Alan Howard, Chair of the Trustees of the Howard Foundation, UK; and Professor John Nolan, Chair of the International Macular Carotenoids Conference (see www.macularcarotenoids.org). This conference was dedicated to the role of carotenoid nutrition for visual function and brain function.



Eye Care

This could potentially start right from standard eye care, with risk assessments for AMD and optimisation of the macular pigment; in fact, such risk assessment tools are already available (see, for example, www.sightrisk.com). People at high risk of developing AMD tend to have low levels of macular pigment, and this is evident decades before the disease actually presents itself; Professor Nolan says this could be important in evaluating overall eye health. “Macular pigment is a good biomarker for the health of the retina, and the likelihood of someone getting AMD or not,” he says. Currently, AMD is only diagnosed when a patient starts experiencing problems with their vision, yet the condition typically starts developing much earlier. “The interesting thing about AMD is that you can see changes in the retina long before it starts affecting vision. These are called drusen and pigmentary changes – these changes leave an individual at very high risk of developing visually consequential AMD, where they lose their vision,” continues Professor Nolan. The condition itself can have a severe impact beyond the loss of central vision, as many people experience depression, loneliness and other health problems as their sight deteriorates. Earlier interventions, through the enrichment of macular pigment, could potentially prevent some of these problems from developing; Professor Nolan believes the benefits would far outweigh the costs. “Those at risk of getting AMD should take these supplements every day,” he says.

Vision with low contrast sensitivity.

Vision with high contrast sensitivity.

Within the CREST project, the supplements have been supplied as soft-gel capsules, which have been shown to be the most efficient way to deliver the nutrients, but they could also be delivered through other foodstuffs. “Eggs are a great opportunity, they’re a very good delivery system for the nutrients. The reason why egg yolk is yellow is because of the carotenoids, and the fats in the egg actually facilitate transport of the carotenoids,” explains Professor Nolan. “Certain other foods, such as butter and ice cream, could also act as delivery mechanisms for these nutrients, but such functional foods have not yet been developed. The supplements represent the best means to increase macular pigment, but it is important to use supplements that have been tested for efficacy in clinical trials. We have seen great results in trials with the MZ:L:Z (mg) ratio of 10:10:2, which is commercially available as MacuShield™ in Europe, and as MacuHealth with LMZ3 in the USA and Canada.”

This research holds real implications not only for slowing the progression of AMD, but also for enhancing visual performance in the wider population. The project’s findings have helped inform medical practice for the prevention and management of AMD, and now Professor Nolan and his team are looking to widen its impact.



At a glance

Full Project Title
The Central Retinal Enrichment Supplementation Trials (CREST)

Project Objectives
The Central Retinal Enrichment Supplementation Trials (CREST) is funded by the European Research Council, and comprises two different trials: CREST Normal and CREST AMD. CREST aims to assess the impact of enriched macular pigment with all three macular carotenoids – lutein (L), zeaxanthin (Z), and meso-zeaxanthin (MZ) – on visual performance and blindness. Of note, this is the first study to test the effects of supplementation with all three macular carotenoids, including MZ, in the context of a sufficiently powered, double-blind, randomised, placebo-controlled clinical trial.

Project Funding
This project is funded by a European Research Council Starting Grant, awarded in October 2011: €1,493,342 over a duration of 5 years.

Project Partners
Please see website for details.

Contact Details
Professor John Nolan, Nutrition Research Centre Ireland, School of Health Science, Carriganore House, Waterford Institute of Technology West Campus, Carriganore, Waterford, Ireland
E: jmnolan@wit.ie
W: www.mprg.ie
W: www.profjohnnolan.com

“We believe that effort now needs to go into using these target nutrients to enhance visual performance and safety for drivers, pilots and the military,” he says. These carotenoids may also have an effect on brain health, an area that Professor Nolan is keen to explore in future. “We published several papers last year looking at how these nutrients actually impact on brain health and brain function,” he says. “The data and understanding that we have gained from CREST open opportunities for us to investigate the impact that targeted nutrients – a combination of carotenoids and maybe omega-3, for example – will have on brain function and brain health.” This work will form a central part of Professor Nolan’s future research agenda. Researchers have set up a series of tests looking at the relationship between macular pigment levels and cognitive performance, generating some exciting findings. “We’ve shown that people with high macular pigment levels have better cognitive performance, and that patients with Alzheimer’s disease are deficient in these key nutrients. We’ve also shown that if you supplement patients with Alzheimer’s disease with carotenoids, then you improve their pigment and enhance their visual performance,” says Professor Nolan. “We’ve now started research with patients at an earlier stage, with what is known as mild cognitive impairment, which is very exciting, and this is the research that I will focus on for the next 10 years. We have already confirmed the importance of the carotenoids for vision, and we are so lucky to have been able to conduct research that has helped patients in real time. It is now time to see what we can learn about nutrition and brain health. I am confident that we will identify ways to help people who are suffering with this terrible disease. I am committed to this journey.”


Professor John Nolan

Professor John Nolan is a Principal Investigator in the Nutrition Research Centre Ireland (NRCI), School of Health Science, Waterford Institute of Technology. His research studies the role of eye nutrition in vision and the prevention of blindness, and current studies are investigating the link between nutrition and brain health and function. He has published over 75 peer-reviewed scientific papers in his area of research (citations: 2,447; H-index: 31; research funding to date: over €5 million).



Exploring the links between fantasy and terror

Around 30,000 people are thought to have disappeared in Argentina between 1976 and 1983, as the ruling military dictatorship dealt brutally with perceived subversives and political opponents – events which left a lasting impact on the country. This kind of terror and political intimidation is deeply intertwined with fantastic narratives, as Professor Kirsten Mahlke explains

Fantastic narratives

The military dictatorship which ruled Argentina between 1976 and 1983 dealt brutally with political opponents, using enforced disappearances, detention centres and torture to intimidate and terrorise. During the dictatorship thousands of people literally disappeared at the hands of the state; researchers in the NOT project are now investigating the wider social and literary impact of the period on Argentina. “The project aims to find out what effects the politics of enforced disappearances in Argentina had on literary and social narratives in the aftermath of the military dictatorship,” outlines Professor Kirsten Mahlke, the project’s coordinator. The use of enforced disappearances was not explicitly acknowledged by the state at the time, but people were aware that the dictatorship was willing to use highly repressive measures; this has had a huge impact on the wider population, for example in the way people communicated. “One important study in the project was about the way people in the neighbourhood of former detention centres spoke about these spaces of terror,” says Professor Mahlke. “We found that there was a lot of rumour and gossip, narrative about spectres, appearances, voices.”

This research forms a key part of the project’s overall agenda of investigating the interdependency of fantastic narratives and the historical phenomenon of terror. The twentieth century was marked by several periods of terror in which enforced disappearances were used as a method of repression, and Professor Mahlke draws on history in her research. “I recently wrote an article about the early history of enforced disappearances, which were used by Hitler to intimidate political opponents in Western Europe,” she says. There are examples of enforced disappearances in fantastic literature and high culture, such as in Der Ring des Nibelungen, the famous opera by Richard Wagner. “The very origin of the political method of enforced disappearance already lies in fantastic literature. So fantasy not only plays a role in the method itself, but it also has a perpetuating effect,” explains Professor Mahlke. “Terror is about the foreboding that it provokes in people about what could happen next. It forces people to imagine worst-case scenarios that could develop in the future. It is not about the past or scenarios that everybody is familiar with, but rather the expectation of what could happen in the future – this is one of the characteristics of terror.”

Researchers in the project are combining a literary, theoretical assessment of the use of the fantastic as a means of communicating about acts of terror with a case study of Argentina’s ‘war on terror’ between 1976 and 1983 and its aftermath. The so-called ‘dirty war’ had actually flared up around 1974, and the military dictatorship continued the conflict when it came to power two years later, hunting down and killing left-wing guerrillas; during this period a climate of fear prevailed in the country. “The irrationality of the violence which happened during the dictatorship, the terrifying effect of silence – and of ignoring the violence that



Conference Disappearance and Literature 2016.

occurred – contributed to a sense almost of madness, both before and during the military dictatorship. This continued into democratic times, in large part due to the continued silence about the crimes and the impunity of the perpetrators,” outlines Professor Mahlke. “This is translated and communicated in ways that our research group is investigating – we are also building on other studies into similar phenomena after extreme political violence and disappearances. They’re called spectrality and haunting studies – they deal with a new kind of sensibility to historical, traumatic events.” The period between 1976 and 1983 was certainly traumatic for many Argentineans, and it is taking the country some time to come to terms with its past. While the dictatorship relinquished power in 1983, uncertainty persists about the fate of many thousands of the people who were detained by the dictatorship; Professor Mahlke believes it is important to have a full account of the period. “As long as the perpetrators have not been put on trial, have not been punished, have not confessed, have not given any information about the whereabouts of the disappeared, then uncertainty continues. As long as information is missing, as long as no evidence has been found, then you can say


that enforced disappearance is still at work,” she says. In the absence of truth, people may look for other explanations for terror, developing narratives that include fantastical elements. “Indeterminacy about this method of forced disappearances generates an unending circle of terrible fantasies,” outlines Professor Mahlke. “It’s not just about fantasy, and ghosts and spectres. It generates terrifying accounts and stories which somehow paralyse other ways of rationalising and thinking about not only personal histories, but also political issues.”

National Commission

The National Commission on the Disappearance of Persons was established in 1983 to investigate the fate of the victims of enforced disappearance, yet many questions remained unanswered. The Commission recorded the fate of 8,961 people; while this is a relatively small proportion of the estimated 30,000 victims of enforced disappearance, it did at least encourage discussion about what happened. “There was a public discourse about it and the issue could never really be silenced again. It could not really be forgotten, particularly once the commission published its report in 1984, which is called Nunca Más (Never Again),” says Professor Mahlke. In the main, however, the report did not name the main perpetrators, focusing primarily on the experiences of people in the detention centres and the names of those who had disappeared. “Only a relatively small number of the main officials were jailed. Then many of these officials received presidential pardons under President Carlos Menem,” continues Professor Mahlke. “These pardons have since been rejected, so the cases could be reopened and there could be new trials. One of the questions we looked at in this project was – does this change anything now?”



A number of important figures in the military dictatorship have been put on trial over recent years, including the notorious Alfredo Astiz, and enough time has now passed to analyse the impact of those trials on the victims of enforced disappearance and their relatives. Researchers travelled to a province of Argentina to undertake an empirical study; the results showed that the trials have had a significant impact, helping people deal with the effects of enforced disappearances. “As soon as a crime is acknowledged, publicly, in the form of a trial, and the offence is dealt with and a punishment handed down, the narrative does change. The haunting can be stopped,” says Professor Mahlke. This research holds continuing relevance, with enforced disappearances still being used today as a tool of repression. “There are thought to have been around 65,000 cases of enforced disappearance between 2011 and 2015 in Syria alone, while there have also been other cases in the Middle East and North Africa. Then there are extra-judicial detention centres,” continues Professor Mahlke. “We can probably expect people to experience similar effects to those we have seen in Argentina, from individual traumatisation up to collective traumatisation.”

There is also a comparative element to the project’s research, exploring other historical examples of enforced disappearances, including in Nazi Germany, Cambodia and Sri Lanka. Several conferences have been held on the topic; Professor Mahlke says this work has thrown new light on how enforced disappearances affect people. “Enforced disappearance is not ‘just’ another war crime. It has a dimension of its own, which

is this fantastic one,” she stresses. The Mothers and the Grandmothers of the Plaza de Mayo, Argentinean human rights groups formed in 1977, helped bring the issue to international attention, and the country has led the way in establishing how enforced disappearances can be dealt with, both politically and theoretically. “The transitional regime that was established in Argentina after the fall of the military dictatorship was responsible for investigating the crimes that occurred,” explains Professor Mahlke. “These trials and investigations were internal affairs – Argentina took responsibility for its own history, its own crimes. These cases weren’t prosecuted at the Hague or another international tribunal – they took place in Argentina.”

Researchers have been in close contact with human rights groups during the course of their work, while there are also plans to publish articles and books to publicise their results. The research results have been presented in Argentina, and while Professor Mahlke is wary of over-stating the impact of this work, she says they have received positive feedback. “Some people told us that they were very relieved that they could finally talk about this fantastic dimension, which was excluded from the political and official discourse for a very long time. It seems that the fantastic is becoming a more common research topic in Argentina, which it wasn’t before,” she says. The next step for the project is to publicise the findings from recent conferences, and bring their work to wider attention. “It is not only violence followed by uncertainty which creates fantastic stories. It is also the simple fact of not telling people what has happened – not informing people,” says Professor Mahlke.

At a glance

Full Project Title
Narratives of Terror and Disappearance. Fantastic Dimensions of Argentina’s Collective Memory since the Military Dictatorship (NOT)

Project Objectives
The NOT project seeks to investigate the interdependency of fantastic narrative and the historical phenomenon of terror. The five-year, six-member project combines a literary theoretical reassessment of the fantastic as a mode of telling the unspeakable with a historical case study of the Argentinean ‘war on terrorism’ during the military dictatorship of 1976-83 and its aftermath. The project uses an approach which combines narratological analysis with a Cultural Studies perspective, including Political Science and Social Anthropology, to investigate the specific ways in which the figure of the Disappeared shapes the Argentinean social body, its histories and collective self-understanding. For the first time, the Disappeared are analysed as integral figures of the transition between historical reality and fantastical imagination. Their case history represents a paradigm for a terroristic answer to terrorism and can shed light on current debates on the war on terrorism.

Project Funding
NOT is an ERC-funded project.

Contact Details
Professor Kirsten Mahlke
Kulturtheorie und kulturwissenschaftliche Methoden
FB Literaturwissenschaft
Universitätsstr. 10
78465 Konstanz
T: +(49) 7531-882426
E: kirsten.mahlke@uni-konstanz.de
W: http://www.litwiss.uni-konstanz.de/fachgruppen/kulturtheoriekulturwiss-methoden/erc-narratives-of-terror-and-disappearance/

Professor Kirsten Mahlke

Kirsten Mahlke studied French and Spanish Literature at the University of Frankfurt/Main and gained her PhD in Romance Philology in 2002. She was appointed Professor of French and Iberoamerican Literatures at the University of Heidelberg in 2009, and has held a Professorship in Cultural Theory at the University of Konstanz since 2011.

ERC group meeting in Berlin with the Cambridge ERC grantee Yael Navaro Yashin and our own team.



New Measures At The BIPM

Measurements are so entwined in the fabric of our lives that we often take them for granted, and yet, through rigorous science, Metrology has defined ways to measure the basic units we rely on every day in every field of study. Metrology is the seed from which discovery blossoms. This sacrosanct area of study makes design, manufacture and, importantly, discovery itself possible. Welcome to the science underpinning science itself! Richard Forsyth questions Dr Richard Davis on the work at the BIPM – the place where measurements are born



There is a building located in Sèvres, a suburb of Paris, in one small area of a large public park. It is known as the International Bureau of Weights and Measures – or BIPM, after the French initialism for Bureau international des poids et mesures. The BIPM was first established in 1875 via a treaty called the Metre Convention. It is an organisation maintaining the International System of Units, in order to support a coherent system of measurements throughout the world. Within its walls there are about 70 staff members – 45 of whom work studiously in lab coats. As Dr Richard Davis – who dedicated 20 years of his career to the BIPM before retiring as head of the Mass Department in 2010 – puts it, “Everyone keeps busy!”

Due to the BIPM’s importance in globally recognised measurements there were always workshops, and many visitors from around the world would come to the BIPM for meetings, Davis recalls. “Our campus includes a historic building originally built for the brother of Louis XIV. Our laboratories are of course modern and are located in other buildings. The BIPM is a nice place to spend the day.” The BIPM is the place where scientists define the measurements used around the world – for instance, it is keeper of the international prototype of the kilogram, it maintains and analyses Coordinated Universal Time (UTC), and, importantly, it carries out research into measurement.

Artefacts That Define

So just how do you go about defining a measurement for the world? The original methods were, arguably, relatively basic. Take the very first tasks assigned to the BIPM – to provide 1 metre length standards and 1 kg mass standards to Member States, a task completed in 1889. At that time, international length measurements were traceable to a physical bar known as the international prototype of the metre. The metre was defined by the General Conference on Weights and Measures (CGPM) as the distance between two scratches at opposite ends of this bar when the bar was maintained at a specified temperature. “The role of the BIPM was then to provide copies of the international prototype to states that had acceded to the Metre Convention, providing each with a calibration certificate giving the small measured difference between the length of the copy and the length of one metre as defined by the international prototype,” explains Dr Davis.

The same kind of idea applied to the definition of the kilogram. In truth, the unit of mass called the kilogram has been with us since the end of the 18th century, when it was in essence defined as the mass of a litre of water at a temperature of 4 °C. People are accustomed to water having a density of one kilogram per litre. The BIPM, however, wanted to have a prototype it could use as a resource, which led to the creation of what is referred to as the IPK (International Prototype of the Kilogram). In a vault at the BIPM is a platinum-iridium cylinder (the IPK) taken to be exactly 1 kg. So, like the original definition of the metre, the measurements were based on physical artefacts. But times are changing. A lot has happened in the history of the BIPM in recent decades, and innovative new methods of quantifying have been devised and are being devised.


Dramatic Changes

The International System of Units (SI) was established relatively recently, in 1960, as the basis for international agreement on mechanical and electrical measurements. There are seven base units – second, metre, kilogram, ampere, kelvin, mole and candela – and other units are derived from various combinations. The SI has since been expanded to include other fields of metrology. “There have been dramatic changes,” explains Davis. “In 1960, the metre was redefined in terms of the wavelength of a particular frequency emitted by krypton-86 atoms. This was an example of what became a general goal: to replace unit definitions based on materialised ‘artefacts’ by definitions based on constants found in nature. In order to realise the new definition of the metre, the BIPM had to develop new expertise, starting with the ability to measure the wavelength of krypton with respect to the international metre bar.

“In 1983 the metre was again redefined, this time to be the distance light travels in vacuum in a specified fraction of one second – the definition has not changed since then and is not slated to change. The best way to realise this definition is through optical interferometry using frequency-stabilised lasers. Thus the BIPM became a pioneer in developing a transportable model of such a laser, as well as in carrying out comparisons with other laboratories claiming similar capabilities. Why were such comparisons necessary? Because even a laser constructed according to sound principles might not function properly due to a whole host of reasons, and it was found useful, and even essential, to compare in situ instruments constructed by different laboratories to uncover possible systematic differences.”

Length metrology was revolutionised by the development of the ‘frequency comb’. In a 2003 paper, BIPM staff showed the remarkable capabilities of such an instrument when linked to a hydrogen maser frequency standard.
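For reference, the ‘specified fraction’ Davis mentions is exact, and the 1983 definition can be written out as:

1\,\mathrm{m} \;=\; \text{the distance light travels in vacuum in}\ \tfrac{1}{299\,792\,458}\,\mathrm{s}
\quad\Longleftrightarrow\quad
c \;\equiv\; 299\,792\,458\ \mathrm{m\,s^{-1}}\ \text{(exact)}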

Working Collaboratively

The previous example also shows the benefits of an important development. As SI reference standards are (except the kilogram) defined from physical constants, it was realised that it was possible, and more cost-effective, for laboratories outside the BIPM to have reference standards calibrated by a trusted peer. Today, about 100 laboratories around the world participate in the CIPM Mutual Recognition Arrangement (CIPM-MRA) and thereby agree to accept each other’s calibration certificates, provided there is transparent evidence that each laboratory can provide the calibration and measurement services that it claims. This is all coordinated by the BIPM under the authority of the International Committee for Weights and Measures (CIPM), and includes formal procedures and databases that are maintained by the BIPM and are publicly accessible.

Weight For Change

The present definition of the kilogram has been with us since 1889 and is based on the international agreement, mentioned above, that an object stored at the BIPM has a mass of exactly one kilogram. As the last unit definition to rely on an artefact, the kilogram is also scheduled for redefinition in 2018. “In fact, the 1889 redefinition of the kilogram aimed to be



consistent with the original definition, and so will the redefinition planned for 2018,” says Davis. The 2018 kilogram will be redefined in terms of a fundamental constant of quantum physics known as the Planck constant. “At present, the Planck constant is measured in laboratory experiments that link its value to the present definition of the kilogram, based on the mass of a platinum-iridium cylinder taken to be exactly 1 kg. So the numerical value of the Planck constant depends on the mass of the IPK. To redefine the kilogram, we can simply set the present value of the Planck constant, which has a small experimental uncertainty, to have no uncertainty. The small experimental uncertainty does not vanish, but it becomes associated with the mass of the IPK instead of the Planck constant.

“One might wonder what is gained by this? First, despite the mass of the IPK being 1 kg by definition at present, there is no guarantee that its mass is constant. The consequence is that the unit which is defined by the mass of the IPK, i.e. the SI kilogram, is not guaranteed to be constant. The situation is different for a redefinition of the kilogram based on a true constant of nature. In addition, any laboratory with the means and the desire to do so will be able to realise the redefined kilogram based on the Planck constant. The IPK becomes just another mass standard, which should be recalibrated from time to time. There will be no requirement for national laboratories to come to the BIPM, although the BIPM will continue to provide calibration services.”
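Schematically, the swap Davis describes can be written out as follows (our illustration; the numbers are the CODATA 2014 values current as this issue went to press, and the value eventually fixed may differ slightly in the final digits):

\text{Before: } m(\mathcal{K}) \equiv 1\ \mathrm{kg}
\;\Rightarrow\;
h = 6.626\,070\,040(81)\times 10^{-34}\ \mathrm{kg\,m^2\,s^{-1}}
\quad (u_r \approx 1.2\times 10^{-8})

\text{After: } h \equiv 6.626\,070\,040\times 10^{-34}\ \mathrm{kg\,m^2\,s^{-1}}\ \text{(exact)}
\;\Rightarrow\;
m(\mathcal{K}) = 1\ \mathrm{kg}\ \text{with}\ u_r \approx 1.2\times 10^{-8}\ (\approx 12\ \mu\mathrm{g})

The kilogram itself does not change size; only the anchor of its definition moves from an object to a constant of nature.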

Focus For Our Future

The availability of comb technology and the existence of the CIPM-MRA mean that BIPM research can now be redirected to more pressing needs.


“Another example of change at the BIPM has been the creation of a research programme in chemical measurements. The new Chemistry Division soon became a leader, in particular contributing to improvements in measurements related to health and global climate monitoring,” continues Davis. “In the field of chemistry, the BIPM has just completed a measurement of an optical property of the ozone molecule, and this should help to harmonise global measurements of atmospheric ozone.” Davis adds: “I can briefly mention that a collaboration that included the BIPM has demonstrated the advantages of a graphene-based quantum standard of electrical resistance.”

It’s important to realise that the BIPM is also tasked with global timekeeping. “Our Time Department calculates a time scale called TAI – International Atomic Time – which is a weighted average of clock data from many laboratories around the world. TAI is the basis of Coordinated Universal Time, or UTC, so this is a big responsibility. Recently the BIPM, in collaboration with other colleagues, has developed an improved algorithm to transfer frequencies via GPS satellites to 1 × 10⁻¹⁶ accuracy using existing products. The accuracy is competitive with transfer via optical fibres.”
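Schematically (a simplified form of the ensemble principle; the BIPM’s actual algorithm also predicts each clock’s behaviour and caps individual weights), the free atomic time scale underlying TAI is a weighted average over the N contributing clocks:

\mathrm{EAL}(t) \;=\; \sum_{i=1}^{N} w_i \left[ h_i(t) + h_i'(t) \right],
\qquad \sum_{i=1}^{N} w_i = 1

where h_i(t) is the reading of clock i, h_i'(t) its predicted correction, and the weights w_i favour the most stable clocks; TAI then follows by steering this average towards the primary frequency standards.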

Challenges Ahead

The work at the BIPM seems to be a mixture of collaboration and pioneering new methods. There are always new types of challenges thrown up by research, so what are the most challenging quantities to measure? “There are many challenging measurements with applications



in health, physics, engineering, chemistry and so forth. One example from engineering is the need to measure rapidly changing force or pressure, even though the most accurate measuring instruments have been designed for traditional static measurements. While the BIPM does not do laboratory work in this area, we have organised workshops as a part of our coordination activities with a view to making progress in this field. And indeed, progress is being made.

“In chemical analysis the challenge is often to determine the amount of a particular chemical contained in a reference sample. Chemists refer to the sample material as the matrix. Measuring the content of lead in water is a different technical problem to measuring lead in shellfish. Again, impressive progress has been made in sorting this out, but there is always more to do.

“The advent of optical clocks presents challenges to our Time Department, which has the task of constructing time scales from the available clock data, as already mentioned.

“Last but not least, our Ionizing Radiation Department must keep up with changing needs for monitoring various types of radiation used for medical treatment.”

When Dr Davis said ‘Everyone keeps busy’, it wasn’t an understatement. When not in their laboratories, personnel of the BIPM also engage in important international coordination projects and sit on the expert committees of other relevant intergovernmental bodies whose work must be underpinned by sound measurements, such as the ITU, WHO, WMO, IAEA, ISO and more.

Dr Richard Davis of the BIPM

www.bipm.org



What does it feel like?

The FEEL project is developing a new approach to the ‘hard’ problem of consciousness, pursuing theoretical and empirical research based on sensorimotor theory. We spoke to the project’s Principal Investigator J. Kevin O’Regan about their work in developing a fully-fledged theory of ‘feel’, and about the wider impact of their research

The ‘hard’ problem of consciousness – explaining how certain types of brain activity give rise to certain types of ‘feels’, such as the way we experience colour and taste – is a major area of research in both philosophy and neuroscience. Based at the Laboratory of Perception Psychology at the University of Paris Descartes, J. Kevin O’Regan is the Principal Investigator of the FEEL project, in which researchers are developing an alternative way of thinking about consciousness. “The idea is that the way most people currently think about consciousness is a mistake. Most people think consciousness is something that the brain generates – just like the vitalists at the beginning of the twentieth century thought that life was something that biological systems generated,” he says. “We now know that this was the wrong way of thinking about life, because there is no vital spirit. In fact, life is just a word that describes the potentialities of certain systems interacting with the world. I think that consciousness is similar to life, in that there is nothing generated by our brains that corresponds to consciousness.”

Sensorimotor theory

The project is instead building its research on the sensorimotor theory developed by O’Regan, which suggests that

consciousness and ‘feel’ are ways of interacting with the environment. Just as the feel of the softness of a sponge is constituted by the fact that it squishes when you press it, all sensory feels are constituted by the sensorimotor laws that govern how you interact with things in the world. Certain predictions can be made using this approach, one of which is the possibility of sensory substitution. “Sensory substitution is a method by which you can replace one sense by another. For example, you might be able to use your skin to get input which provides you with visual sensations – something Paul Bach-y-Rita had already tried to do back in the 1970s. Or you might be able to use your ears to get input that gives you tactile sensations,” explains O’Regan. This raises the question of what it is about a specific neural activity that gives it a visual, tactile or auditory feel; O’Regan says the answer lies in the sensorimotor laws that describe it. “There’s nothing special about the optic or auditory nerves that gives the nerve impulses they carry a visual or auditory character. What gives visual information its visual character is what I call the sensorimotor contingencies, that is, the laws that govern how you interact with the world when you see,” he says.

A first key part of the project is to build the mathematical basis behind the concept of sensorimotor contingencies. This theoretical work, being developed by Alexander Terekhov and Guglielmo Montone in O’Regan’s team, should then prove relevant to the development of sensory substitution devices. “But instead of substituting one sense for another,” says O’Regan, “one thing we’re doing with Frank Schuman and Christoph Witzel is to actually create a new sense. This new sense is a sense of space. We know that some birds navigate by using the earth’s magnetic field – they have little grains of ferrite in their brains, and sensors that sense the orientation or movement of these grains. Wouldn’t it be nice if humans could also have such a magnetic sense? If we can provide the brain with extra sensations like this, then maybe we can not only compensate for disabilities, but expand or enhance human capabilities.”
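To make the idea of a new ‘sense of space’ concrete, here is a deliberately simplified sketch. The mapping below is invented for illustration – the article does not specify how the NaviEar device encodes direction – but it shows how a compass heading could be rendered as sound:

# Hypothetical sketch in the spirit of the "NaviEar" (the device's actual
# sound encoding is not described in the article; this mapping is invented
# for illustration): turn a compass heading into a tone whose pitch and
# stereo position indicate where North lies relative to the listener.
import math

def heading_to_tone(heading_deg):
    """Map facing direction (0 = North, 90 = East) to (freq_hz, left, right)."""
    # Bearing of North relative to the listener's facing direction.
    to_north = (-heading_deg + 180.0) % 360.0 - 180.0   # in [-180, 180)
    # Invented mapping: pitch rises as the listener turns to face North.
    freq_hz = 440.0 + 220.0 * (1.0 - abs(to_north) / 180.0)
    # Pan the tone toward the side on which North lies (-1 left, +1 right).
    pan = math.sin(math.radians(to_north))
    return freq_hz, (1.0 - pan) / 2.0, (1.0 + pan) / 2.0

# Facing East (90 deg): North is 90 deg to the left, so the tone pans hard left.
print(heading_to_tone(90.0))

With steady exposure to such a signal, the hypothesis is that the brain would learn the sensorimotor law linking head movements to the sound, so the direction eventually comes to be ‘felt’ rather than deduced.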

Colour perception

A number of other predictions can be made on the basis of sensorimotor theory, including the existence of change blindness, a phenomenon where an observer fails to notice a change in a visual stimulus. One workpackage within the project is dedicated to investigating colour. “Why does red seem

FEEL administrator Niclette Kampata using the “NaviEar”, a new sensory augmentation device that translates cardinal directions into sound.

Image Above: The same picture of the dress cut out and pasted into the shadow (left) where it appears gold and white; and into the sun (right) where it appears black and blue. To see the differences in colour perception between the two images more clearly, cover one of the two images when looking at the other. Ideally it is best not to have previously seen any photo of the dress - (Copyright © Witzel, Racey & O’Regan in press).



red to us rather than seeming green? Could it be that when you look at a red patch of colour, the feel that you get is the same feel as I get when I look at a green patch of colour?” asks O’Regan. Again, researchers can use sensorimotor theory to gain a deeper understanding of this question. “Sensorimotor theory predicts that the ‘redness of red’ depends on the way red things behave when they’re acted upon,” continues O’Regan. “So if you take a red piece of paper, and move it around under different lights, then the light coming into your eye changes. The reflection from the red piece of paper changes, because it’s facing other kinds of light. Sensorimotor theory says that, analogously to the feel of softness of the sponge, the feel of red is constituted by the law that describes how the light coming into your eye changes as you act upon the red piece of paper.”

This provides an explanation of why people sometimes see the same object in different colours. Christoph Witzel has shown this for a prominent recent example: the photo of a particular dress that provoked debate around the world as to whether it was gold and white, or blue and black. O’Regan says that these different viewpoints are predicted precisely by sensorimotor theory. “Even though you have the same wavelengths of light coming into your eye as somebody else, you may see the colour as being different from the way that person sees it. Wavelength is not colour,” he stresses. The way an individual sees the dress is also strongly determined by the assumptions he makes about the colour of the light that is illuminating it. “An individual might assume that the dress is illuminated by direct sunlight. Alternatively he might assume the dress is in the shadow. Depending on what you assume about the light, your brain will deduce that the material is different and so actually see its colour as being different,” points out O’Regan.
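A toy calculation (our illustration, not the project’s analysis) shows how the same measurement can yield two different inferred colours: the visual system can only estimate surface reflectance by discounting an assumed illuminant, so the same pixel values lead to different conclusions under different assumptions. The illuminant values below are hypothetical.

# Toy illustration (not project code): the same RGB measurement yields
# different inferred surface colours depending on the assumed illuminant.
# Inferred reflectance ~ measured light / assumed illuminant, channel-wise.
import numpy as np

measured = np.array([0.40, 0.38, 0.45])   # light reaching the eye (RGB)

# Hypothetical illuminants: warm direct sunlight vs. bluish shadow light.
sunlight = np.array([1.00, 0.95, 0.80])
shadow = np.array([0.70, 0.80, 1.00])

for name, illuminant in [("sunlight", sunlight), ("shadow", shadow)]:
    reflectance = measured / illuminant
    reflectance /= reflectance.max()       # normalise for comparison
    print(name, np.round(reflectance, 2))

# Assuming warm sunlight -> inferred reflectance is relatively blue
# (a blue/black percept); assuming bluish shadow -> relatively warm
# (a white/gold percept), as with the famous dress photo.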

The project will continue its research into fundamental questions around colour, sensory substitution and infant development in future, as well as working to improve sensory substitution devices. “We’re now experimenting with a device that gives people a sense of North – we aim to see whether it helps to improve people’s navigational abilities,” says O’Regan. Researchers will also continue to develop the sensorimotor theory, which could then act as a theoretical framework for further research. “The project will go on for another two years, and we hope to develop a fully-fledged theory of feel,” says O’Regan.

At a glance

Full Project Title
A sensorimotor approach to understanding consciousness (FEEL)

Project Objectives
In addition to investigating the philosophical implications of the “sensorimotor” approach to consciousness and making links to robotic applications, we are doing experiments to verify the theory’s predictions about why certain colours are perceived as special, about how infants develop the sense of their own body, and about how one might be able to perceive the direction of North through hearing.

Project Funding
Funded by the European Research Council (ERC)

Project Partners
FEEL is a European project financed by the European Research Council and hosted by the Laboratory of Perception Psychology of the University Paris Descartes.

Contact Details
Laboratoire Psychologie de la Perception
CNRS - Université Paris Descartes
Centre Biomédical des Saints Pères
45 rue des Sts Pères
75270 Paris cedex 06
T: +(33 1) 4286 4312
E: jkevin.oregan@gmail.com
W: http://lpp.psycho.univ-paris5.fr/feel/
http://doi.org/10.1109/IROS.2013.6696507
http://arxiv.org/abs/1308.2124

J. Kevin O’Regan

Kevin O’Regan started his career in experimental psychology studying eye movements in reading and visual perception. He was director of the University Paris Descartes Psychology of Perception Laboratory for 12 years, and most recently received a large European Research Council Advanced grant called “FEEL” to investigate a “sensorimotor” theory of consciousness which he has developed.

A robotic agent acquiring the notion of space and its topology by detecting coincidences in its sensory information. From: Terekhov AV and O’Regan JK (2016). Space as an Invention of Active Agents. Front. Robot. AI 3:4. doi: 10.3389/frobt.2016.00004 (Copyright © 2016 Terekhov and O’Regan).



Intelligent surveillance for a safer tomorrow

Despite the extensive computerization of security systems, it is still up to a human operator to monitor the protected area by viewing tens to hundreds of interconnected video cameras. The EU project CENTAUR aims to develop next-generation tools to assist security operators in dealing with the particularly difficult task of monitoring crowded environments

Many of us

regularly negotiate crowded environments, whether it’s a busy train station or a shopping street heaving with people. These crowded environments present higher security and safety threats and also pose a real challenge for security staff, whose role is to predict, react to and – if possible – prevent incidents. There have been many recent incidents of crowd violence, often resulting in injuries and damage. More extreme examples are crowd stampedes which are responsible for the deaths of scores of people every year – the 2010 Duisburg Love Parade and the 2012 Madrid Arena Halloween Party are infamous recent occurrences. The inhabitants of densely populated cities are exposed to potential crowd threats on a daily basis. Technology can help prevent such threats developing into disastrous outcomes. Security cameras offer real potential to alleviate these heightened risks and make the efforts of security staff more efficient. So far this potential is mostly wasted – typically, tens to hundreds of camera views are presented on separate monitors to a team of human operators in the security control room for their visual analysis, then recorded in a database. Wherever the occurrence of incidents is rare or unpredictable, the team of security operators is minimized for cost reasons. Taking into account the cognitive load limit of the human mind, the growing number of cameras means that maintaining situation awareness is increasingly leading to operator overload or, if greater numbers of operators are employed, rapid growth in the operational cost of a security system. This


problem forms the primary research focus of the CENTAUR project. The aim is to use computers running cognitive software to receive the video data, process it and understand it, with a view to significantly improving the security operators’ efficiency and reaction time. So far this has not been satisfactorily achieved for crowded scenes.

Automated surveillance

Automated surveillance is a complex area of research. State-of-the-art automated video surveillance is capable of understanding relatively simple scenarios, such as individuals moving in a sparsely populated space and performing simple actions like leaving their luggage behind. The project is taking this to the next level by focusing on crowded scenes. The security operator will use the automated surveillance for incident prevention or early reaction – as a tool that helps to efficiently maintain the operator’s situation awareness, where they fully comprehend the situation of the secured space. Such a tool consists of cameras attached to the computational environment – a physical computer or a cloud. There, specialized software processes the video data from all the cameras, integrates them and presents the security operator with a simplified, yet accurate overview of what is going on in the secured space.

Another way in which the operator will use the automated surveillance is forensic search. The video data are available in a database and the operator’s task is to search through it – for example, if an incident has occurred and he needs to find a related piece of information, such as the reappearance of the people involved. With large numbers of cameras and many hours of recorded feeds, manual searches can be difficult and laborious. An automated surveillance system could shorten the operator’s query from hours to minutes, which is crucial in situations where time may be precious.

Graphical user interface of the forensic search for a person re-appearance. Courtesy of Dr Slawomir Bak, Disney Research Pittsburgh and INRIA. Images from the SAIVT-SoftBio Database.

The academic partners in the project provide in-depth expertise in computer vision and hence the fundamental scientific basis for development. This strong academic triplet is complemented by two industrial research partners: the Data Centric Technologies team of the engineering corporation Honeywell, and a Czech company, Neovision. While Honeywell connects to a large base of corporate customers and provides the global business perspective, Neovision brings in the view of a highly adaptive company capable of focusing on specific customer problems. Project activities are based on research exchanges: scientists from academic labs perform their research in the business-oriented environment of the industrial partners, and vice versa. This approach was adopted by the Research Executive Agency of the European Commission to bridge the existing gap between fundamental academic research and the fast-paced applied R&D work of industrial companies. It gives academic researchers a much-needed perspective on the needs, gaps and limitations of the relevant market, while their industrial colleagues get a unique opportunity to access the deep knowledge of academics – an approach that normally could not be taken in the fast-moving commercial world.

So far, the project has been successful in several areas: the technologies developed were integrated to allow operators to track people or objects in ‘nearly crowded’ environments across camera views which may not overlap. This greatly simplifies the operator’s job, as he does not need to switch between multiple camera views to explain the activities of people or objects in the secured space. For crowded environments, the project has developed an accurate crowd violence detection technology capable of alerting the operator whenever violence breaks out, so he can give his full attention to the developing situation (*see footnote).

At the end of this endeavour into developing a truly intelligent automated surveillance system we expect to see semi-automated security systems equipped with hundreds to thousands of cameras. Perhaps only a single operator will be needed, observing a single large screen to view the secured space, establishing his situation awareness with a single brief glance. Even in the case of large crowded environments – like the London Tube, with cameras on all platforms, transition corridors and vestibules – incidents like violence and stampedes, or potential threats like high crowd density, will be brought to the operator’s attention. The operator will be able to instantly see the context, such as the exact location and the nature of the incident, and act accordingly.
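As a rough illustration of the forensic-search workflow described above (a deliberately crude sketch; CENTAUR’s actual re-identification technology is far more sophisticated), a query can be answered by ranking stored detections by appearance similarity:

# Toy sketch of a forensic re-identification query: rank stored person
# detections by appearance similarity to a query image. Real systems use
# far stronger descriptors and learned metrics; this only illustrates
# the query-and-rank workflow.
import cv2
import numpy as np

def appearance_descriptor(image_bgr):
    """Colour histogram in HSV space: a crude appearance signature."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [30, 32], [0, 180, 0, 256])
    cv2.normalize(hist, hist)
    return hist.flatten()

def forensic_search(query_bgr, gallery):
    """Return gallery entries sorted by similarity to the query.

    gallery: list of (camera_id, timestamp, image_bgr) tuples taken
    from the recorded video database.
    """
    q = appearance_descriptor(query_bgr)
    scored = []
    for cam, ts, img in gallery:
        d = appearance_descriptor(img)
        # Histogram intersection: higher means more similar.
        score = float(np.minimum(q, d).sum())
        scored.append((score, cam, ts))
    return sorted(scored, reverse=True)

The operator would then review only the top-ranked detections, which is how a manual search of hours of footage collapses into minutes.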

Ethics and privacy

The project’s research is being conducted against a backdrop of intense debate about the balance between privacy and security in the information age. Images acquired from surveillance cameras are of course open to misuse, an issue of which CENTAUR researchers are well aware. The project handles this issue by observing strict guidelines that require every project activity to adhere to privacy protection best practices. This mainly concerns how the researchers handle the data needed for the research, and provides a solid foundation for the project’s data and knowledge exchange. The project is also actively investigating how the technology itself can protect privacy. The fact that the video surveillance is computerized offers new possibilities – automated anonymization or cancellable biometrics, for example, allow the operator to maintain situation awareness while preventing disclosure of the identity of the people captured in the camera views.

The potential for commercialisation of the CENTAUR project research is currently being considered in parallel with continued investigation into the monitoring of crowded environments. The results of the CENTAUR project could be commercialised through the existing marketing channels of the industrial partners.
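To make the anonymization idea above concrete, a minimal sketch (our illustration, using a standard OpenCV face detector, not the project’s code) might blur detected faces before frames are displayed or stored:

# Minimal sketch of automated anonymization (illustrative only): blur any
# detected faces before frames are shown or stored, so operators keep
# situation awareness without seeing identities.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def anonymize(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.1, 5):
        roi = frame_bgr[y:y + h, x:x + w]
        frame_bgr[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 30)
    return frame_bgr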

At a glance

Full Project Title
Crowded Environments Monitoring for Activity Understanding and Recognition (CENTAUR)

Project Objectives
The CENTAUR project aims to develop a network of scientific excellence addressing research topics in computer vision, with a view to advancing the state of the art in video surveillance, in particular automated monitoring and understanding of crowded scenes. Under CENTAUR, two industrial research labs and three top European academic labs team up to transform advanced technology concepts into commercial products with the potential to improve the security and safety of public spaces, while facilitating the labour-intensive tasks of operators of safety/security systems and effectively protecting citizens’ privacy.

Project Funding
EUR 1.04 M over 4 years (2013-2016)

Project Partners
• Honeywell, ACS Global Labs Prague, CZ
• Neovision, CZ
• Ecole Polytechnique Federale de Lausanne, CH
• Inria, Sophia Antipolis, FR
• Queen Mary University of London, UK

Contact Details
Dr Vit Libal, PhD
Principal Research Engineer, Data-Centric Technologies / Big Data COE
ACS Global Labs - Prague
T: +420 734 645 909
E: vit.libal@honeywell.com
W: www.honeywell.com

Bialkowski, Alina, Denman, Simon, Lucey, Patrick, Sridharan, Sridha & Fookes, Clinton B. (2012) A database for person re-identification in multi-camera surveillance networks. In Digital Image Computing: Techniques and Applications (DICTA 2012), 3-5 December 2012, Esplanade Hotel, Fremantle, WA, available at http://eprints.qut.edu.au/53437/.

We would like to thank the SAIVT Research Labs at Queensland University of Technology (QUT) for freely supplying us with the SAIVT-SoftBio database for our research. The research leading to the described results has received funding from the People Programme (Marie Curie Actions) of the European Union’s Seventh Framework Programme FP7/2007-2013 under REA grant agreement n°324359.

Dr Vit Libal, Ph.D

Dr Vit Libal has been a Principal Research Engineer with Honeywell Labs since 2009. He leads research teams performing R&D and business development activities in the areas of data science, integrated security technologies and building control technologies. Vit received his MSc and PhD degrees in Microelectronics (1993) and Electrical Engineering and Informatics (2001) from the Czech Technical University in Prague. Vit’s areas of active research include machine learning, multimodal signal processing and information fusion, image recognition and computer vision, anomaly detection and semantic modeling technologies. Vit is (co-)author of 20+ technical papers and 8 patents.

* See more in: P. Bilinski, F. Bremond, Video Covariance Matrix Logarithm for Human Action Recognition in Videos, IJCAI 2015 – 24th International Joint Conference on Artificial Intelligence, July 2015, Buenos Aires, Argentina.



Where high-performance meets time-criticality

Embedded computing systems increasingly require a high level of processing power, while at the same time they need to execute their functions within a guaranteed timeframe. Professor Luis Miguel Pinho tells us about the P-SOCRATES project’s work in developing a new software design framework that meets the needs of modern systems


Embedded computing systems are a

Time-criticality

common feature of everyday life, performing dedicated functions within a number of larger systems, cars, satellites and planes. While these embedded computing systems were previously relatively simple, today they increasingly require high levels of processing power. For instance, until very recently embedded computing systems in a car were used only for very specific and limited functionalities, such as controlling the engine. But now cars have numerous sensors, they may have cameras, and the embedded computer must use all the information they generate to provide complex functionality, such as detecting obstacles on the road in real-time. An EC-funded initiative bringing together seven partners from across Europe, the P-SOCRATES project aims to develop a new software design framework, borne out of a recognition that the highperformance and embedded computing domains are converging. Highperformance computing involves thousands of processing units, which are very complex to program, as their execution needs to be coordinated. The same sets of requirements as found in high-performance computing are increasingly being seen in the embedded computing domain, but with the added consideration that developers need to guarantee that the result is reached within a certain amount of time.

This adds significantly to the complexity of software development. In order to ensure that a piece of software can process information within a given timeframe, developers may need to parallelise it. It may be necessary to re-structure the software, from one processing unit into several processing units, which are synchronised. This means developers can divide a problem, tackle it in parallel, and finish it more quickly. This approach, however, introduces a large degree of complexity into development, with programmers needing not only to engineer parallel software applications, but also to synchronise and control the behaviour of the hardware to meet time constraints. The project aims to considerably reduce the need for this dual expertise by developing a parallel programming model which incorporates more information and provides the necessary time guarantees in an automatic way. With this model, programmers can use the same methodologies as used in high-performance computing and focus their energies on parallelising the software, without needing to worry about how to achieve time-criticality. Researchers in the project are taking the methodology used to develop software for highly parallel, high-performance computing systems, and incorporating the changes required to provide time-critical computing.

The project's research agenda encompasses both adapting the methodology and developing the tools required to implement and execute these types of time-critical systems. While the tools are being built on the basis of existing examples, to ensure they are easy for programmers to use, much of the project's analytical work is highly novel. The tool that programmers will use to compile their applications is based on GCC, the most widely used compiler on Linux systems. It is being adapted with the addition of some extra features, so that a programmer who already uses GCC will not see any significant change. Within that compiler, there are also plans to develop a module which will extract and analyse the complete model of the software. This work is based on completely new theory.

Commercial impact

This research holds real importance to the commercial sector. In today's often highly collaborative working environments, individual teams or workers may not be able to start a task before another has completed an earlier one. There are clear parallels here with computing. If work is divided among different processing units, then it needs to be synchronised – if one of the processing units needs data that another is calculating, there needs to be a guarantee that it will only start when the data is available.



Technical Approach

At a glance

Full Project Title
Parallel Software Framework for Time-Critical Many-core Systems (P-SOCRATES)

Project Objectives
The aim of P-SOCRATES is to allow current and future applications with high-performance and real-time requirements to fully exploit the huge performance opportunities brought by the most advanced many-core processors, whilst ensuring predictable performance and maintaining (or even reducing) the development costs of applications.

Project Funding
€3.6 M, with EU funding of €2.7 M

If this can't be done automatically, then the synchronisation needs to be done manually, meaning that the work sequence needs to be explicitly and manually programmed. A new analytical method has been specified within the project to automate this process, and there have also been several other novel areas of work. Researchers have been working with the Erika operating system, developing a new method of analysis that will answer the question of whether the software will be able to deliver results within a specified time.
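To make this kind of dependency-driven synchronisation concrete, the sketch below uses standard OpenMP task dependencies, one widely used way of expressing "start only when the data is available". It is an illustration of the general technique, not the project's own framework; the helper functions and array size are invented for the example.

```c
#include <stdio.h>

#define N 1024

/* Illustrative placeholder computations. */
static void produce(double *data, int n)
{
    for (int i = 0; i < n; i++)
        data[i] = i * 0.5;
}

static double consume(const double *data, int n)
{
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += data[i];
    return sum;
}

int main(void)
{
    static double data[N];
    double result = 0.0;

    #pragma omp parallel
    #pragma omp single
    {
        /* Producer task: writes 'data'. */
        #pragma omp task depend(out: data)
        produce(data, N);

        /* Consumer task: the depend clause guarantees it only starts
           once the producer has finished, so the synchronisation does
           not have to be coded by hand. */
        #pragma omp task depend(in: data)
        result = consume(data, N);
    } /* implicit barrier: all tasks complete here */

    printf("result = %f\n", result);
    return 0;
}
```

Compiled with GCC's -fopenmp flag, the two tasks may run on different cores; without it the pragmas are ignored and the program simply runs sequentially, which is part of what makes this style of annotation attractive for incremental parallelisation.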

We are increasingly seeing the same sets of requirements in the embedded computing domain as you see in high-performance computing, but with the additional consideration that you need to guarantee the result is reached within a certain amount of time

The project has specified novel elements in this domain, not just because it was interesting, but because they were necessary in order to use this methodology in real systems. This is of course a key consideration, and the likely needs of industry have been a major factor in guiding research.

Development process

There are three main applications that the project is currently looking at, aiming to demonstrate how the P-SOCRATES methodology and system will improve the development process, while still meeting timing requirements. The project is now entering the validation phase. Researchers will programme these three applications to be parallel, and the goal then will be to use the compiler, tools and systems, and show that the P-SOCRATES methodology made them faster, while still meeting the key requirements. The process of meeting those timing requirements will be completely automatic. Ultimately, the aim will be to validate that high-performance programmes can be developed quickly and easily, without the need to explicitly synchronise the multiple different parallel activities of the programmes. This represents a much more efficient approach to development, which is a key priority for many companies. The project's advisory board includes multi-nationals, SMEs and application providers, and their views on software development have been clear, with many end-users saying that the complexity of software development is increasing too fast. Software is becoming too complex to develop and it's taking too much time to develop correct programmes. The P-SOCRATES framework holds great potential in these terms, and technical work will continue. While the more research-focused part of the project is finished, researchers are now looking at validation and adapting to a new version of the hardware, maintaining their focus on continued technical development.

Project Partners
• Instituto Superior de Engenharia do Porto, Portugal • Barcelona Supercomputing Center, Spain • University of Modena and Reggio Emilia, Italy • Federal Institute of Technology Zurich, Switzerland • Evidence Srl, Italy • Active Technologies, Italy • ATOS, Spain

Contact Details
Project Coordinator, Professor Luis Miguel Pinho
E: lmp@isep.ipp.pt
Project Manager, Dr Sandra Almeida
E: srca@isep.ipp.pt
CISTER Research Centre, Porto, Portugal
T: +351 22 834 0502
F: +351 22 832 1159
W: www.p-socrates.eu

"P-SOCRATES: A parallel software framework for time-critical many-core systems", Microprocessors and Microsystems, Elsevier, available online 24 June 2015, ISSN 0141-9331, http://dx.doi.org/10.1016/j.micpro.2015.06.004

Professor Luis Miguel Pinho

Luis Miguel Pinho is Coordinator Professor at the Department of Computer Engineering - School of Engineering of the Polytechnic Institute of Porto. He is also Vice-Director and Research Associate at the CISTER research unit, where he leads activities in several areas, including real-time programming models, scheduling of real-time parallel tasks and real-time middleware.



While the low weight and corrosion resistance of Fibre Reinforced Plastic (FRP) composite materials make them an attractive option in several sectors of industry, certain defects can affect their strength and stiffness. The VITCEA project is developing non-destructive evaluation techniques which will encourage the wider use of FRP composites, as Michael Gower explains

Towards validated NDE techniques


The excellent mechanical properties of Fibre Reinforced Plastic (FRP) composite materials make them an attractive option in renewable energy, oil and gas, and transport applications. However, these materials are susceptible to specific defects which prevent their wider application, says Michael Gower, the coordinator of the VITCEA project. “They're hindered by the diverse range of defects and damage that can occur, which reduce strength and stiffness,” he explains. A project bringing together four leading National Measurement Institutes (NMIs) from across Europe, the VITCEA project aims to develop techniques to evaluate FRP composites, which will help ensure they are fit for purpose, so encouraging their wider adoption. “We're looking at developing non-destructive evaluation techniques – so operational procedures on how to use specific techniques. We're also developing modelling capability to simulate the NDE techniques, in order to increase confidence in the use of composite materials,” he outlines. This research holds real importance in terms of our wider environmental goals. The use of FRP composites could help reduce greenhouse gas emissions and lessen our dependence on fossil fuels, but the materials first need to be of sufficient quality, and reliable and consistent defect detection is an essential pre-condition. “The primary focus of the VITCEA project is the energy sector. So we're looking mainly at renewable energy, such as wind, wave and tidal energy. We're also looking at the rehabilitation of ageing oil and gas infrastructure, as well as new infrastructure involving composite materials. Then there are also lightweight transport applications,” says Gower. The project has sourced the materials that are typically used in these sectors, aiming to develop and hone techniques to detect defects in those materials. “It's really about applying some fairly novel NDE techniques to detect defects in a reliable and consistent manner,” he explains.

Material defects

The materials themselves contain either glass fibre or carbon fibre, surrounded by a polymer matrix. Over 100 different types of defect can occur in these materials; the project is not aiming to cover the full range, but rather to identify those which are a particular problem for energy, oil and gas, and transport companies. “At the start of the project we held an industrial consultation exercise, where we asked participants to tell us about the types of defect that they routinely needed to detect,” says Gower. A number of defects were identified as particular priorities – including kissing bonds, delaminations, fibre misalignment and porosity – which can emerge at different stages.

Above: Sample holder for measurement of material emissivity, required for active thermography. Below left: Ultrasonic C-scan of a CFRP reference defect artefact. Below right: Optical micrograph of an edge delamination in a CFRP material.

“We're trying to develop techniques that are capable of picking up both the manufacturing – or processing – defects and the in-service defects,” continues Gower. “So things like porosity or voidage are defects that typically occur when you're manufacturing composites. Defects like impact damage, delamination, matrix cracking and fibre fracture typically occur in-service.” The size and nature of these defects can vary, depending on the application of the material across the different industry sectors. Researchers are following a two-stage approach to address these defects, with the ultimate goal of developing and validating traceable procedures for novel NDE techniques. “We've created some reference defect artefacts, using a range of materials representative of the sectors we're looking at. Within those artefacts that we've fabricated, we've artificially created and positioned certain defects. We can very tightly control the size and location of these defects,” says Gower. Along with these artificial defects, Gower and his colleagues are also looking at natural defect artefacts. “We're also creating damage or defects by natural means – so through fatigue loading or impact, for example. We have less precise control of the exact nature and distribution of the damage created through fatigue loading or impact, but it's more representative of what we'd actually see in practice.”

Variation of ultrasound beam direction using phased array ultrasonics.



At a glance

Full Project Title
Validated Inspection Techniques for Composites in Energy Applications (VITCEA)

Project Objectives
The work in Project VITCEA will develop and validate traceable procedures for novel NDE techniques with contrasting detection capabilities, which will underpin the increased use of FRP composites for improved efficiency and reliability in energy-related applications, e.g. wind and marine turbine blades, nacelles, and oil and gas flexible risers.

Project Funding
2.7 million euros

Lock-in thermography system.

Researchers are investigating two main scanning NDE techniques, ultrasonics and microwave testing, to improve the way these kinds of defects are detected. The aim is to optimise and validate these techniques; Gower and his colleagues are looking at two ultrasonic techniques, including phased array ultrasonics, an advanced testing method in which the probe is comprised of many smaller elements. “With phased array ultrasonics you have an array of transducers rather than just a single device. You can angle and steer the ultrasonic signal to optimise your inspection capability. Air-coupled ultrasonics is basically a non-contact ultrasonic inspection technique, where you're not reliant on a water couplant or a gel couplant. It tends to be used at fairly low frequencies,” he explains.
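For readers curious how the steering Gower mentions works, a standard textbook delay law (not quoted in the article) captures the idea: firing each of the $N$ array elements slightly after its neighbour tilts the combined wavefront by a chosen angle,

$$ t_n = \frac{n \, d \, \sin\theta}{c}, \qquad n = 0, 1, \dots, N-1 $$

where $d$ is the spacing between elements, $\theta$ the desired steering angle and $c$ the speed of sound in the material under inspection.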

The project is also looking at microwave techniques, to detect defects and characterise FRP structures. “Microwaves can be used to inspect non-conducting materials. So they're mainly glass fibre composites,” says Gower.

We're trying to develop techniques that are capable of picking up both the manufacturing defects and the in-service defects

Best practice

The project also has a workpackage dedicated to developing and evaluating two full-field, non-contact NDE techniques: active thermography and laser shearography. After each experimental technique has been optimised, and the modelling capability developed, the next stage will be to write up these experimental procedures as a best practice guide. “We'll then undertake a series of intercomparison exercises for each technique. We'll be looking for participants from the NDE industry and the energy sector to take part. The idea of that is to hone the operational procedure and identify any limitations, to try and work out how robust those procedures are and whether they are suitable for industrial application,” says Gower. “The other aspect is to apply the techniques and procedures in field inspections – again, with the idea of trying to ascertain whether there are any limitations in terms of issues like fixturing, power requirements or accessibility. Most of what we're doing at the moment is lab-based.” This could be an important step towards the development of standards for defect detection in composite materials.


While standard techniques and operational procedures have been established for the aerospace sector, Gower says they're not necessarily transferable. “We're trying to fill a void in the standards world, for NDE of composite-specific applications,” he says. These standards are intended to be fairly generic, addressing composite materials across a range of applications, while the project will also continue its research into natural defects over the remainder of its term. “We're currently finishing off manufacturing the natural defect artefacts. We're also going to make some additional samples available for use in a statistical detection study,” continues Gower. “These artefacts will be farmed out to the various partners in the consortium across Europe, and the experts developing each technique will come up with the formulations and the best practice guides.”

Project Partners
• National Physical Laboratory (NPL), UK – Project Coordinator • Bundesanstalt für Materialforschung und -prüfung (BAM) (Federal Institute for Materials Research and Testing), Germany • Physikalisch-Technische Bundesanstalt (PTB), Germany • Český metrologický institut (CMI), Czech Republic • CEA-LIST institute, France

Contact Details
Project Coordinator, Michael Gower
National Physical Laboratory, Hampton Road, Teddington, Middlesex TW11 0LW, United Kingdom
T: +44 20 8943 8625
E: michael.gower@npl.co.uk
W: http://projects.npl.co.uk/vitcea/
http://www.jeccomposites.com/news/compositesnews/european-project-non-destructiveinspection-techniques-polymer-composite-materi

Michael Gower

Michael Gower, senior research scientist within the Composites Group at NPL, is the coordinator of Project VITCEA. He has over 20 years of research experience in mechanical testing, NDE and condition monitoring, analysis, characterisation and finite element analysis (FEA) of composites.



A manifesto for excellence in quantum research

The development of quantum technologies could provide a significant boost to the European economy, but close collaboration between research and commerce is central to building the technologies of tomorrow. With global competition intensifying, scientists are calling for further support for European researchers in the ‘quantum manifesto'

The commercial and social potential of quantum technologies is widely recognised, with researchers investigating fundamental effects that could have a significant impact on the defence, finance and telecommunications industries, to name just three. Governments and universities across the world are investing correspondingly significant sums in research; the UK Government committed £270 million to a National Quantum Technologies Programme in its 2013 Autumn Statement, while other European countries and the EU itself have also increased spending.

Quantum technologies

So what are quantum technologies? The Engineering and Physical Sciences Research Council (EPSRC) in the UK considers quantum technologies to be those that ‘harness quantum physics to gain a functionality or performance that is otherwise unattainable'. While many of the technologies we rely on today in our everyday lives, notably the semiconductor industry, are derived to some extent from quantum physics, emerging quantum technologies are built on even more subtle aspects of quantum mechanics. This has been the focus of a great deal of research attention over recent years, stimulated by a desire both to reap the economic dividends of European expertise and to maintain a place at the leading edge of research. The UK Government signalled its commitment with the National Strategy for Quantum Technologies, which outlines detailed plans to establish the UK as a global leader. The vision is to ‘not only grow and develop a quantum technologies industry, but to ensure it remains strongly rooted in the UK and delivers long-term benefits to society as a whole.'


There are now calls for a more coherent investment strategy at the European level, building on the continent's technical expertise, infrastructure and strong research links to support long-term economic growth. Researchers across Europe have put forward a ‘quantum manifesto', calling on the EU to launch a €1 billion initiative which would invest in education, science, engineering and innovation. The authors warn that without a large-scale programme, there is a risk that research in Europe will be fragmented, and scientists will replicate each other's work. The manifesto argues that research excellence can underpin long-term prosperity. ‘Technologies based on the laws of quantum mechanics, which govern physics on an atomic scale, will lead to a wave of new technologies that will create many new businesses and help solve many of today's global challenges,' the authors write. This requires investment though. ‘Europe needs strategic investment now in order to lead the second quantum revolution. Building upon its scientific excellence, Europe has the opportunity to create a competitive industry for long-term prosperity and security.' This will be built on the technical promise of quantum technology. ‘Quantum computers are expected to be able to solve, in a few minutes, problems that are unsolvable by the supercomputers of today and tomorrow. This, in turn, will seed breakthroughs in the design of chemical processes, new materials, such as higher temperature superconductors, and new paradigms in machine learning and artificial intelligence. Based on quantum coherence, data can be protected in a completely secure way that makes eavesdropping impossible.'



Over the last fifteen years, around €250 million of EU funds have been invested in projects in the field, but scientists now argue that the time has come to move further, turning research findings into commercial development. While individual European nations have invested significant funds in research into quantum technology, the authors of the Quantum Manifesto argue that a more closely integrated ‘ecosystem' is required in order to capitalise fully on research advances, bringing together the key players. “This includes scientists to supply good ideas, engineers to turn early prototypes into devices, and innovators to develop commercial products that they can market and sell,” said Richard Murray, a co-author of the Manifesto, in an interview with physicsworld.com. “While the end applications may be unclear at the moment, one thing is certain: quantum technologies will enable the growth of multiple hi-tech companies, securing a significant number of jobs and economic prosperity for many decades into the future.”

Global competition

This call for increased funding comes against a backdrop of increasingly intense global competition, as countries across the world seek to build their stake in the technologies of tomorrow, with China in particular investing heavily in the development of quantum technologies. China became the first country in the world to establish quantum communications technology outside the laboratory in 2009, building a secure network for government officials to exchange information on the 60th anniversary of the founding of the People's Republic, and research continues apace.


Research into quantum teleportation by scientists at the University of Science and Technology of China in Hefei was named Breakthrough of the Year for 2015 by Physics World. Jian-Wei Pan and Chao-Yang Lu were able to simultaneously transfer a photon's spin (polarization) and its orbital angular momentum (OAM) to another photon some distance away. While this is still some way from the type of teleportation depicted in Star Trek, it represents an important breakthrough, and lays the foundation for even more ambitious research. Then there's the world's first quantum satellite, also under development in China, which is set to be launched into space in June this year, further building on recent achievements. The satellite, named Quantum Experiments at Space Scale (QUESS), promises to revolutionise cryptography. Its primary role will be to test the phenomenon of quantum entanglement, in which two or more particles are bound together in complementary ‘quantum states'; it will carry several sophisticated instruments to relay transmissions between two ground stations. There are also plans in China to launch the world's biggest quantum communications system at some point in the next three months. The planned network will stretch 2,000 kilometres, linking government offices in Beijing and Shanghai, while researchers are also considering the next stage in development. One objective is to establish a link to the financial centre of Hong Kong, which could greatly benefit from quantum communication, while the ultimate goal is to extend the network to cover the whole globe at some point in the next two decades.



Commercial potential

Clearly the Chinese see quantum technology as an area of rich potential, and are willing to commit significant resources to its development, helping to establish the country as pre-eminent in the field. By contrast, while individual European countries are home to great research expertise, there is not yet a common vision on how to exploit this and translate technical excellence into concrete applications. The potential impact of quantum technology is enormous, as outlined in the National Strategy for Quantum Technologies: ‘A new generation of quantum technologies has moved beyond simply exploiting naturally occurring quantum effects. They are now driving and enabling a new generation of hitherto impossible devices and systems, from breathtakingly powerful medical imaging devices to entirely new methods of computing to solve currently intractable problems – all made possible by the engineering of quantum effects into next-generation technologies.' These quantum effects are now the focus of a great deal of research attention across Europe, with the wider goal of realising the full potential of quantum technologies and eventually bringing them to the commercial marketplace. The QuTech institute, based in the Dutch city of Delft, is at the forefront of research and development.

Scientists there are investigating several aspects of quantum technology, including work on the quantum internet, described on the institute's website as ‘an optically-connected network of quantum processors'. A five-year roadmap has been developed, setting out the initial objectives and the primary technical challenges facing researchers. Within the five-year scope of the roadmap, the aim is to develop a quantum network connecting the Dutch cities of Delft, Den Haag and Leiden. While the initial plans are only for a relatively limited network, this could lay the foundations for the long-term development of the quantum internet, which promises both to boost problem-solving capabilities and to significantly enhance security. The ability to encode information on single photons of light, produced on demand, is central to the development of the quantum internet. Recent research at Eindhoven University of Technology holds great promise in these terms, with scientists developing a nanoscale device that can ‘sculpt' the shape of individual photons. This device can release single photons of light, on which information can be encoded – an important step towards the development of the quantum internet. This is just one example of the depth of European technical expertise, but the world of science never stands still, and strong leadership is required to build on the continent's strong research foundations.

A new generation of quantum technologies has moved beyond simply exploiting naturally occurring quantum effects. They are now driving and enabling a new generation of hitherto impossible devices and systems



The UK was ranked second in the world between 2008 and 2012 for the quality of research across both engineering and physical sciences. However, supporting this research and translating it into commercial applications is a major challenge, as the development of quantum technologies requires a strong network of supporting infrastructure. As a first step, the UK has established a network of four quantum technology hubs, involving 17 universities and more than 100 companies. Based at the universities of Birmingham, Glasgow, York and Oxford, and each specialising in a different area of quantum research, these hubs will help to drive development, building clusters of activity together with industry. The Birmingham hub, for example, specialises in sensors and metrology, with researchers aiming to produce quantum sensors that will outperform existing classical devices.


Quantum manifesto

These hubs are funded by the EPSRC at the UK level, but in the quantum manifesto researchers call for a common European strategy. The manifesto is open for endorsement from research institutes, industry and scientists until April 30, ready to be presented to representatives of the European Commission, Parliament and Council at the Quantum Europe conference in Amsterdam on May 17-18. The conference will provide a platform for further discussion and knowledge-sharing between researchers and industry, as the second ‘quantum revolution' gathers pace.

www.epsrc.ac.uk




Photonic trumpets lying on a gold mirror. High brightness single-photon sources.

Security with single photons

The development of compact and efficient single-photon sources is central to a number of technical fields, including quantum computing, quantum cryptography and radiometry. Professor Stefan Kück tells us about the SIQUTE project's work in developing single-photon sources, research which could have a significant impact in both the academic and commercial sectors

Massless elementary particles, photons are well-suited to a range of quantum communication, computing and metrology applications. However, no single-photon source is currently available that meets the scientific and technical criteria of these applications, an issue that lies at the core of the SIQUTE project. “In this project we are mainly focusing on the production of single photons, meaning the development of single-photon sources,” says Professor Stefan Kück, the project's coordinator. This is central to the development of a number of quantum applications, including quantum cryptography, which could greatly enhance internet security. “If you want to enhance security by having just one photon in a specific time period, then first you really need to have a single-photon source. This is very important,” stresses Professor Kück. “The reason is that if you want to establish secure quantum communication, then in principle you can do this by transmitting single photons with a specific characteristic. If, however, you transmit two photons where this characteristic is identical, then an eavesdropper can take one and read your secure data.”


The concept of quantum communication is built on the knowledge that it is not possible to copy a photon and produce another with identical characteristics. In order to copy a photon you first need to measure it, and when you measure it you change its quantum state, which has significant implications in terms of information security. “When you change the quantum state of a photon, this change can be detected. When the recipient of a secure message detects that the photon has changed, they know not to use the key, because somebody has intercepted it. This is the idea of quantum communication,” explains Professor Kück.

The development of pure single-photon sources would be a huge step forward in these terms, helping to meet the needs of cutting-edge quantum optical technologies. “We are aiming at quantum communication in this project. Our goal is to develop more accurate and more efficient single-photon sources,” says Professor Kück.

Manufacturing of QD-based SPS



Single photon sources

Researchers are investigating two main single-photon sources, the first of which are vacancy centres within nanocrystals. Professor Kück and his colleagues are investigating single-photon sources based on impurity centres within diamond. “Nitrogen vacancy centres are present in natural diamonds. But for it to be a single-photon source, you need there to be just one of these nitrogen vacancy centres in a specific volume of the crystal. So if you have two or more of these nitrogen vacancy centres, then you don't have a single-photon source – you need there to be just one,” he explains. This can be achieved by artificially producing these diamonds, for example by shooting nitrogen or silicon atoms into these nanodiamonds. “By this method you're producing silicon and nitrogen centres in very clean diamond. The trick is to have a rather low irradiation of these ions, so that you have a really low concentration of this nitrogen or silicon in your crystal.” The nitrogen vacancy centre can be irradiated with a laser pulse; it will then go into an excited state and emit a photon by spontaneous emission. However, this photon is not emitted in a specific direction, which Professor Kück says is an important consideration in terms of the project's overall goals.

Map of Diamond Nanocrystals. The inset shows the g(2)-function for the encircled nanodiamond.

Electron microscope picture of a nanodiamond.


Optically pumped design for indistinguishable photon emission. Design of a photonic NW SPS trumpet structure implementing a weak cavity effect.

“It's very difficult to collect photons from a source which emits in all directions. But if you put this nanodiamond into a specific structure, then you will guide this emission into a specific direction,” he outlines. Researchers are implementing defect-centre-doped nanocrystals into metallo-dielectric structures, which will allow for near-unity collection efficiency. “By putting this nanodiamond into a specific structure, we can guide this photon emission in a specific direction. If we can do this effectively, then we can go up to a very high collection efficiency, approaching 100 percent,” says Professor Kück. “We haven't achieved that yet, but in principle the photon collection efficiency should be very high with this approach.” The second type of single-photon source the project is investigating is quantum dots, a type of semiconductor device which is relevant to several areas of research. The project's work in this area centres on the fabrication of Indium Arsenide (InAs) quantum dots, which are placed within Gallium Arsenide (GaAs) structures. “Quantum dots are semiconductor materials, and we place them in slightly different semiconductor materials.

One major advantage of this approach is that, in principle, you can access semiconductors electrically. That means single photons can be emitted by electrical excitation,” says Professor Kück. These single-photon sources should be easier to handle; however, Professor Kück says there are also disadvantages with these semiconductor quantum dots. “They have to be operated at very low temperatures – typically at, say, 12-20 kelvin – so you need to cool them strongly,” he explains. This is a significant obstacle in terms of the practical application of quantum dots, but it is relatively insignificant when it comes to the project's work, with the consortium bringing together six National Metrology Institutes and several research laboratories with advanced facilities. This enables researchers to investigate fundamental questions around photons; the third workpackage in the project is centred on measuring entanglement. “We say that two photons are entangled when a measurement on one photon exhibits a certain relationship, or correlation, to a measurement on the other photon. It means that if you measure one then you know the characteristics of the other, even though they may be quite a long way from each other,” says Professor Kück. This gives researchers the chance to observe photons indirectly.

The measurement of the g(2)-function, the second-order correlation function, is the proof of single-photon emission. The dip in the middle, which almost goes down to zero, indicates that almost all photons arrive as single photons at the detector.
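For reference, the quantity in the caption is conventionally defined as the normalised intensity correlation

$$ g^{(2)}(\tau) = \frac{\langle I(t)\,I(t+\tau) \rangle}{\langle I(t) \rangle^{2}} $$

An ideal single-photon source gives $g^{(2)}(0) = 0$, since a single photon cannot be detected twice at the same instant, and a measured $g^{(2)}(0) < 0.5$ is the usual criterion for emission from a single quantum emitter.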

Artistic view of a so-called solid immersion lens. This lens is placed on top of an impurity center and thus collects its emission very efficiently.



“You can send a photon beam in one direction, and these photons will interact in a scene, with other particles in that scene. We can observe what is going on by looking at the entangled photons in a different location,” continues Professor Kück. A key point to note here is that entanglement between two photons is by nature fragile and liable to disruption. While on the one hand this fragility is a problem, seemingly limiting the applicability of the feature, on the other it means that entanglement can act as a highly sensitive sensor.

Single-photon detectors

This is central to the detection of weak signals, which occur in many fields beyond quantum communication, including medicine, biology and astronomy. Many national metrology institutes have worked on the calibration of detection devices, but Professor Kück says existing techniques have some clear limitations. “Currently there is no national metrology institute capable of calibrating single-photon detectors in a proven way. This is one area in which our research will have a major impact,” he says. This work will form an important part of the future research agenda.

At a glance

Full Project Title
Single-photon sources for quantum technologies (SIQUTE)

Project Objectives
The aim of the SIQUTE project is to develop compact and efficient single-photon sources and to implement them in quantum optics and metrological applications, in order to advance measurement performance and facilitate new scientific discoveries in these fields.

Project Partners
Funded partners (all national metrology institutes): CMI, Cesky Metrologicky Institut, Czech Republic • INRIM, Istituto Nazionale di Ricerca Metrologica, Italy • Metrosert, AS Metrosert, Estonia • MIKES, Mittatekniikan Keskus, Finland • NPL, NPL Management Limited, United Kingdom • PTB, Physikalisch-Technische Bundesanstalt, Germany
Unfunded partners: USMF, The University System of Maryland Foundation, Inc., United States
Funded universities and academia: Commissariat à l'énergie atomique et aux énergies alternatives, France • Danmarks Tekniske Universitet, Denmark • Friedrich-Alexander-Universität Erlangen-Nürnberg, Germany • Universität des Saarlandes, Germany

If you want to establish secure quantum communication, then in principle you can do this by transmitting single photons with a specific characteristic. If, however, you transmit two photons where this characteristic is identical, then an eavesdropper can take one and read your secure data

“You can really see very small interactions between light and matter, and these interactions will have an impact on the entanglement that you can measure. You can measure small forces and small absorptions. So you can see some things which you cannot see if you observe it classically,” outlines Professor Kück. Researchers can get more detail on photon interactions, and measure them more accurately, as part of the wider goal of carrying out precise measurements beyond classical limits; Professor Kück says this work holds wide relevance. “The potential of quantum communication is widely recognised, while we're also working on the calibration of single-photon detectors,” he outlines.

“One future research direction will be to set up calibration services – characterisation services – for devices, meaning detectors and sources. This will be done at different National Metrology Institutes,” continues Professor Kück. “A follow-up project will work on quantum communication, aiming to address the main problems in the field.” This is a complex task, and there are many issues to consider when it comes to information security. While in principle totally secure communication should be possible, in reality Professor Kück says it will be difficult to guarantee. “There is a long way to go before we can achieve absolute security,” he acknowledges.

Project Funding
This work was funded by the project Single-Photon Sources for Quantum Technology (SIQUTE) of the European Metrology Research Programme (EMRP) [Grant Agreement No. 912/2009/EC]. The EMRP is jointly funded by the EMRP participating countries within EURAMET and the European Union.

Contact Details
Physikalisch-Technische Bundesanstalt
FB 4.1 Photometrie und angewandte Radiometrie, AG 4.13 Laserradiometrie
Bundesallee 100, 38116 Braunschweig
T: +49 531 592 4100
E: stefan.kueck@ptb.de
W: http://www.ptb.de/emrp/siqute-home.html

Professor Stefan Kück

The confocal microscope setup for excitation and detection. Such a setup is typical for excitation of single impurity centres in nanodiamonds and the detection of their emission.

Professor Stefan Kück started his academic career in 1990, when he began his PhD work on Cr4+-doped laser materials, which he finished in 1994. He completed his habilitation on tunable laser materials in 2001. He then switched topics towards metrology, especially laser radiometry, photometry and single-photon metrology. He currently leads the Department of Photometry and Applied Radiometry at the Physikalisch-Technische Bundesanstalt, the German national metrology institute. Since 2013, he has coordinated the joint research project “Single-photon sources for quantum technologies” (SIQUTE).



Molecular control for next-generation OPV

The Next-Generation Organic Photovoltaics focus group brings together researchers from several disciplines to develop new, more efficient OPV devices. This work is closely related to the Molecsyncon project's fundamental research into controlling the properties of individual molecules, as Principal Investigator Professor Ryan Chiechi explains

The field of molecular electronics continues to evolve, with researchers seeking to control the properties of specific molecules and investigate their potential application in electronic circuits. Based at the University of Groningen in the Netherlands, Associate Professor Ryan Chiechi is the Principal Investigator of the Molecsyncon project, an ERC-backed initiative which is investigating tunnelling charge transport. “The dominant mode of conduction in a molecular electronic device is tunnelling. You can synthetically control the transmission probability – the tunnelling probability – by tailoring different organic structures to have different transmission properties,” he explains. The project is using synthetic chemistry to control tunnelling probability. “We're interested in the electronic properties of molecules. So we aim to control things like energy levels, conjugation patterns and polarizability,” continues Professor Chiechi.
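A common first-order picture of the transport described here – a simplified exponential decay law of the kind used throughout molecular electronics, not a formula taken from the project itself – relates the tunnelling current density $J$ to the width $d$ of the junction:

$$ J \approx J_0 \, e^{-\beta d} $$

where $\beta$ is a decay constant set by the molecule's electronic structure. Changing conjugation patterns or energy levels synthetically changes $\beta$, which is one concrete sense in which transmission probability can be controlled chemically.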

Quantum interference

Researchers aim to synthesise interesting molecules, measure them, and hopefully identify interesting properties, rather than following a more theory-driven approach. The project is using two main experimental platforms – Eutectic Ga-In (EGaIn) and STAN electrodes – to perform physical-organic studies in tunnelling junctions. “EGaIn is a very interesting material – it's a conductive liquid metal at room temperature, but its rheology is non-Newtonian. That means that we can form non-Newtonian geometries with it,” says Professor Chiechi. Researchers draw EGaIn out into sharp tips, which are then put into contact with self-assembled monolayers (SAMs) of the synthesised molecules, assembled on gold or silver. “We touch the tip of EGaIn to the SAM, which creates a metal-molecule-metal junction – a tunnelling junction. Then we sweep through a bias range, which generates an I/V (current-voltage) curve,” explains Professor Chiechi.


A lot of information can be derived from changes to the magnitude and shape of this curve, as researchers systematically adjust the molecular structures under investigation. Two different sets of molecules will be synthesised and investigated in the project, with Professor Chiechi and his colleagues targeting three main properties. “One is quantum interference. This is the idea that if you tailor the conjugation pattern of a molecule correctly, you can cause destructive interference to be dominant,” he outlines.

A droplet of EGaIn (~1 mm in diameter) suspended from the end of the needle of a syringe.

The project's eventual goal is to develop a method of controlling these kinds of properties externally, while researchers are also interested in controlling the electrostatics of molecular junctions. Professor Chiechi and his colleagues are altering the polarizability and embedded dipole moments of molecules in the junction. “Here we're looking at things like rectification, and breaking the symmetry of the junction,” he says. The third key area of the project's research is to ‘gate' the molecules. “We'll either use light or electric fields to change the electronic landscape inside the junction in real time. For example, we can shine a light on the junction and see its conductivity increase by orders of magnitude. Or we can turn on an electric field and watch it exhibit gating phenomena, just like in a field-effect transistor. It's a physically different phenomenon, but the observables are similar,” explains Professor Chiechi.

A sharp tip of EGaIn (~20 µm at the apex) being formed by drawing it between the needle of a syringe and a droplet stuck to a surface.

A state where two waves effectively cancel each other out, destructive interference has a significant impact on the operation of a molecular system. “You see a big drop in conductivity, compared to a linearly-conjugated molecule that's otherwise structurally homologous,” explains Professor Chiechi. “We want to synthesise a variety of structures where we tune the energy levels, so we move the interference feature around with respect to the bias of the system.”

Organic photovoltaics

This work could have a wider commercial impact, with Professor Chiechi also part of the Next-Generation Organic Photovoltaics (OPV) focus group, together with the group's coordinator Professor J.C. Hummelen and the other PIs: Dr L. Jan Anton Koster, Professor Maria A. Loi and Professor Remco Havenith. The group aims to enable commercially viable photovoltaic (PV) technologies; while this research shares some common elements with the work of the Molecsyncon project, Professor Chiechi says there are key differences between molecular and organic electronics. “The big difference is that in organic electronics, charges can hop,” he explains. The focus group is looking at bulk properties in organic electronics, using synthetic chemistry to control these properties. “The property that we've identified is dielectric constant, which is a bulk property – it's not a property of an individual molecule. How to adjust that experimentally is a really intriguing question,” continues Professor Chiechi. “The end goal there is to make efficient solar cells.”



A glass substrate with several epoxy sections comprising STAN electrodes (not visible). The dark spots are painted silver contacts.

The dielectric constant, a measure of how an electric field affects a material, is a key issue in these terms. Scientifically, the focus is on increasing the dielectric constant of the active layer in organic photovoltaics, which could help improve efficiency. “As the dielectric constant goes up, the charges are screened better, and if the charges are screened sufficiently then you effectively bypass excitons. Then, when the organic layer absorbs a photon, it directly produces charge carriers, which is essentially how silicon solar cells work,” outlines Professor Chiechi. The fact that OPV devices don't generate charge carriers directly is one of their biggest loss mechanisms; solving this problem should improve efficiency. “That's the aim of our research,” says Professor Chiechi. “Our ultimate goal is to move towards materials that have a high enough dielectric constant that we have no effective exciton binding energy, so they create free carriers directly.”
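The link between dielectric constant and exciton binding can be made explicit with the textbook hydrogenic estimate – an approximation for illustration, not a figure from the focus group:

$$ E_b \approx \frac{\mu}{m_0\,\varepsilon_r^{2}} \times 13.6\ \mathrm{eV} $$

where $\mu$ is the reduced effective mass of the electron-hole pair, $m_0$ the free electron mass and $\varepsilon_r$ the relative dielectric constant. Because $E_b$ scales as $1/\varepsilon_r^{2}$, even a modest increase in the dielectric constant sharply reduces the energy binding the exciton, which is why charge carriers can then be created directly.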

A STAN electrode device being interrogated by contacting the silver pads painted over the ends of the device.

The current generation of large-scale OPV devices operate at a nominal efficiency of around 3 percent, with the theoretical maximum thought to be around 20 or 30 percent. Removing the exciton loss pathway could in principle lead to significant improvements in efficiency, which Professor Chiechi says is crucial to the longer-term prospects of OPV. “If we can't demonstrate that OPV can be commercially viable and competitive, then interest is going to dwindle and funding agencies are going to focus on alternatives,” he warns. The focus group aims to demonstrate the commercial viability of the technology, pursuing research across several disciplines. “The focus group combines synthetic chemists with device physicists, spectroscopists, quantum chemists and others,” outlines Professor Chiechi. “I design and make molecules, and work with device physicists and quantum chemists to tease out some structure-function relationships and do some predictive work.” This close inter-disciplinary collaboration is crucial to addressing research challenges around the development of OPV science.

The dominant mode of conduction in a molecular electronic device is tunnelling. You can synthetically control the transmission probability – the tunnelling probability – by tailoring different organic structures to have different transmission properties


While not neglecting the theoretical side of research, Professor Chiechi believes it's important to allow chemists a level of investigative freedom. “If you want to achieve major steps forward, you need chemists to explore unknown parameters. Chemists and physicists work really well together, because chemistry is an exploratory, creative science, whereas physics is a careful, descriptive science that's very good at explaining and predicting things,” he says. There has been a steady trend of improvement in the efficiency of OPV, and Professor Chiechi now wants to contribute to a major leap forward in molecular electronics. “I'd like to really bring molecular electronics from something that is strictly academic into something that's a little more mainstream and widely adopted. To do that, I feel we need a bigger zoo of molecules, and we need to measure them,” he says.

At a glance

Full Project Title
Controlling Tunneling Charge Transport with Organic Syntheses (MOLECSYNCON)

Project Objectives
The MOLECSYNCON project aims to push Molecular Electronics (ME) beyond simple distance-dependence studies, towards controlling tunneling charge transport with organic synthesis by manipulating the intrinsic properties of organic molecules to shape the tunneling barrier. The measurements will be done with two tools that Professor Chiechi developed: Eutectic Ga-In (EGaIn) and SAM-templated nanogap (STAN) electrodes. The latter is a newer tool that allows the facile coupling of light and electric fields into SAM-based tunneling junctions.

FOM Focus group
The FOM Focus group aims to demonstrate a 20 percent efficient OPV module by 2020 (the project started in 2010). The strategy is to combine synthetic organic chemistry to (re)design donor and acceptor materials with device physics, to understand the PV process better and to optimize devices (whether they be all-organic or hybrid).

Contact Details
Professor Ryan Chiechi
University of Groningen
Nijenborgh 4, 9747 AG Groningen, The Netherlands
T: +31 50 363 7664
E: r.c.chiechi@rug.nl
W: www.rcclab.com

Professor Ryan Chiechi

Ryan Chiechi is an Associate Professor of Chemistry at the University of Groningen in the Netherlands. He earned his PhD in Chemistry from UCLA with Fred Wudl in 2005, and pursued post-doctoral research in physical-organic chemistry at Harvard University under George M. Whitesides.



Particulate erosion is a major concern to industry, causing significant disruption to business operations, yet Europe has only two laboratories with the facilities to meet the testing requirements needed to assist in the development of an improved testing standard. Tony Fry tells us about the Metrosion project's work in testing the erosion performance of structural materials and coatings

The erosive capacity of solid particles is a major concern to industry, as illustrated by the eruption of Iceland's Eyjafjallajökull volcano during April 2010, which caused significant disruption to air travel. Researchers at the National Physical Laboratory (NPL) are investigating the subject – work with wide commercial relevance. “A large proportion of our work is addressing industrial concerns, and developing the necessary metrology and measurement techniques. That could mean developing a new test, improving the uncertainties in conventional testing, or developing a deeper understanding of the uncertainties,” explains Tony Fry, Principal Research Scientist at the NPL. While particulate erosion is a major concern to industry, the Electric Power Research Institute (EPRI) recently found that only two European laboratories had the facilities required to develop an improved testing standard. “The volcanic eruption was a bit of a warning that Europe needed better capability to address industrial issues,” says Mr Fry. “At that point we were formulating the Metrosion project, and we identified a number of issues with the current state-of-the-art erosion testing which needed to be addressed to improve the reliability, reproducibility and applicability of erosion testing for industrial needs.”

Metrological framework

The Metrosion project is developing a metrological framework to fully instrument and monitor high temperature solid particle erosion (HTSPE) testing, which will help improve our understanding of material performance and durability. Researchers are looking at four main particles – alumina, silica, chromia and fly-ash – with the wider aim of enabling scientists to control the HTSPE test with greater precision. “The aim is to investigate the influence of the test conditions, which can vary from lab to lab, on how the materials erode. For instance, if I had a steel and I tested it with alumina particles, would the results be different if I used silica particles? Would I get a different result if I used a larger particle of alumina than a smaller particle? We aim to understand the experimental features of different test rigs, and the way people do their tests, and then try to quantify the uncertainty,” outlines Mr Fry. A variety of factors can affect the erosive properties of a particle, including its velocity and the temperature of both the particle and the substrate, issues which the Metrosion consortium is considering. “We're looking at varying the angle of incidence, performing tests at different temperatures and using different erodents, to try and understand how repeatable tests are and evaluate the uncertainties,” he explains.

There are some uncertainties involved with current methods of measuring particle velocity, which researchers calculate to be in the order of 30 percent. These uncertainties can be magnified if the measurements are then used in further exercises, such as modelling kinetic energy, underlining the importance of the project's work in developing a new velocity measurement technique. “Our partners at DTU are developing a very low-cost optical technique, which can be used in situ, in the test rigs. It uses particle imaging to capture images of the particles as they traverse, and then you can map the images and use particle-following techniques to calculate the velocity,” says Mr Fry.
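The magnification mentioned here follows from standard propagation of uncertainty: kinetic energy depends on the square of velocity, so its relative uncertainty is roughly twice that of the velocity,

$$ E_k = \tfrac{1}{2} m v^{2} \quad\Rightarrow\quad \frac{u(E_k)}{E_k} \approx 2\,\frac{u(v)}{v} $$

On that basis, a 30 percent relative uncertainty in particle velocity becomes roughly a 60 percent relative uncertainty in modelled kinetic energy, assuming the particle mass is comparatively well known.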

Through better, faster testing techniques and better understanding, we can help industry to accelerate material development and qualification of coatings and surface engineering approaches, to accelerate the development and implementation of solutions to particulate erosion

Using these sensors, researchers will also be able to assess the shape of the particles and how they're moving in the gas stream, both of which will affect their capacity to erode materials. “They may travel in a straight line, or they may have a rotation to them,” continues Mr Fry. “Particles have different aerodynamic profiles, and depending on their shape and their aspect ratio they might spin. If they're spinning, then it will influence the erosion mechanism – so it will have more of a cutting action, rather than just impinging on the material.”
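As a minimal sketch of the two-frame principle described above – assuming particle centroids have already been extracted from each image, and using invented names and numbers rather than the DTU instrument's actual interface – the velocity follows from the centroid displacement, the image scale and the frame interval:

```c
#include <math.h>
#include <stdio.h>

/* Particle centroid position, in pixels. */
typedef struct { double x, y; } Centroid;

/* Speed in m/s of one particle seen in two consecutive frames,
   given the image scale (metres per pixel) and frame interval (s). */
static double particle_speed(Centroid a, Centroid b,
                             double metres_per_pixel, double frame_dt)
{
    double dx = (b.x - a.x) * metres_per_pixel;
    double dy = (b.y - a.y) * metres_per_pixel;
    return sqrt(dx * dx + dy * dy) / frame_dt;
}

int main(void)
{
    /* Invented example values: a 150-pixel displacement at
       10 micrometres per pixel over a 10-microsecond interval. */
    Centroid a = { 100.0, 200.0 };
    Centroid b = { 250.0, 200.0 };
    printf("speed = %.1f m/s\n", particle_speed(a, b, 10e-6, 10e-6));
    return 0;
}
```

This example yields 150 m/s, comfortably within the rig's stated range of up to 300 m/s; in practice the particle-following step also has to match each particle between frames before any such displacement can be computed.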

Close-up of the surface plot of the velocity of the gas at the throat region of the nozzle, after which the powder feeds into the gas main stream. The powder feeder is at a 30° angle.

Schematic showing the principle of the laser triangulation method used to measure the depth of material removed during the test and image of a measurement in progress.



In addition to being able to operate at 900 °C with particle velocities of up to 300 m s-1, a number of key measurement aims for the new apparatus were identified as being critical to improving the measurement method of HTSPE.

Researchers aim to update existing equipment to provide a more solid basis for future particulate erosion testing. Characterising the size and shape of the erodent particles is another important element of the project's work. “We're trying to relate the shape of the particle to how much damage happens to the sample, also as a function of temperature,” outlines Mr Fry. Researchers are looking at two types of silica – one quite sharp and angular, the other more rounded – to try and build a more detailed picture of the erosive effects they have on materials. “The expectation is that the angular particle will cause more damage than the rounded particle,” says Mr Fry. “Our partners at PTB are using a range of optical techniques, including 3D confocal microscopy and X-ray computed tomography, to measure the three-dimensional shape and size of the particles. We're also working to develop a method of describing these particles with just a few geometrical measurements. At NPL we've performed two-dimensional optical measurements, and developed a method to describe the particles' erosivity.”

Uncoated substrates
The project is working with two uncoated substrates, Nimonic 80A and X22 steel, and is also looking at the protective effects of coatings on those materials, work which holds real importance to industry. Erosion can dramatically reduce both the efficiency and life-span of high-value components, issues which the Metrosion project's research is addressing. "The aim is that through better, faster testing techniques and better understanding, we can help industry to accelerate material development and qualification of coatings, as well as surface engineering approaches, to accelerate the development and implementation of solutions to particulate erosion," outlines Mr Fry. With an instrumented test technique, researchers can develop a fuller understanding of the underlying mechanisms of material failure, which will enable industry to target those mechanisms. "If you understand the mechanisms you can protect against them through surface engineering," says Mr Fry. "These developments will enable a HTSPE test to be conducted within a day – and it will also be possible to obtain more information than through the conventional testing methods." This will then form the basis for future material development, and continued improvements to durability and performance. The project is now in its final year, and research is ongoing on three models of particulate erosion. "We want to develop a more mechanistic-based model, where we model what happens during erosion. By the end of the project we'll have three models – one based on mass removal, one on finite element analysis and another for erosion of thermal barrier coatings," says Mr Fry.
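To see why the velocity measurement matters so much to this kind of modelling, consider the widely used empirical form in which erosion scales with a power of impact velocity. The sketch below uses placeholder coefficients and a simple ductile-style angle term; it is a generic textbook form, not one of the three Metrosion models.

```python
import numpy as np

def erosion_rate(v, angle_deg, k=1e-9, n=2.4):
    """Generic empirical erosion law E = k * v**n * f(theta).
    k and n are material-dependent fit parameters, and the sin(2*theta)
    angle term is a simple ductile-style shape; all values are placeholders."""
    return k * v ** n * np.sin(2.0 * np.radians(angle_deg))

# A 30% uncertainty in velocity inflates to roughly n * 30% in erosion:
v_nominal = 200.0
for v in (0.7 * v_nominal, v_nominal, 1.3 * v_nominal):
    print(f"v = {v:5.1f} m/s -> E = {erosion_rate(v, 30.0):.3e}")
```

With an exponent in the typical range of 2 to 3, the quoted 30 percent velocity uncertainty alone can shift predicted erosion by a factor approaching two, which is exactly why the project's in-situ velocity measurement matters.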


At a glance
Full Project Title
Metrology To Enable High Temperature Erosion Testing (METROSION)
Project Objectives
A three-year collaborative project between five European National Measurement Institutes (NMIs) to develop improved measurement and control methods to enable more precise quantification of the high temperature solid particle erosion (HTSPE) performance of structural materials and coatings.
Project Participants
A.T. Fry, M.G. Gee / National Physical Laboratory, UK • S. Clausen / Danmarks Tekniske Universitet, Denmark • U. Neuschaefer-Rube, M. Bartscher / Physikalisch-Technische Bundesanstalt, Germany • D. Spaltmann, M. Woydt / BAM Bundesanstalt für Materialforschung und -prüfung, Germany • S. Radek / Cesky Metrologicky Institut Brno, Czech Republic • F. Cernuschi / Ricerca sul Sistema Energetico SpA, Italy • J.R. Nicholls, T.W. Rose / Cranfield University, UK
Contact Details
Project Coordinator, Tony Fry
National Physical Laboratory, Hampton Road, Teddington, Middlesex, TW11 0LW
T: +44 (0)20 8943 6220
E: tony.fry@npl.co.uk
W: www.npl.co.uk
W: http://projects.npl.co.uk/metrosion/

Tony Fry

Tony Fry is a Principal Scientist in the Engineered Materials Science Group, which is concerned with the development of test methods to evaluate the life-time performance and durability of engineered materials such as metal alloys, ceramics and their composites. He leads the High Temperature Degradation team at NPL and has been working in high temperature corrosion for more than 16 years. He is a member of the IOM3 Energy Materials Group, and sits on a number of standards committees, including the ISO/TC156/WG13 committee on high temperature testing.



Looking into the roots of modern agriculture
The history of plant domestication dates back millennia, during which time the genetic make-up of crops has changed and key traits have emerged. Dr Chris Stevens tells us about the ComPAg project's work in tracking the evolution of domestication traits in over 30 crops, building a more complete picture of how agriculture developed across the world

The development of plant cultivation and domestication marked an important stage in human evolution, as we began to move away from foraging for plants to more systematically managing the natural environment. This area forms the primary focus for researchers in the ERC-backed ComPAg project. "The aim of the project really is to try and understand the circumstances in which specific plants were brought into cultivation," outlines Dr Chris Stevens, a Research Associate in the project. Collecting cereals like rice and maize is thought to have been an important step, as hunter-gatherer societies began to build up stocks of certain wild foods. "It's clear that in various regions of the world people started collecting certain cereals and other plants in quite large quantities, and they became quite important staples within their diet. We're not saying that they were the most important element though – hunter-gatherer societies usually had quite a diverse diet," says Dr Stevens. This diversity was essential to survival, meaning societies weren't dependent on a single source of food and could adapt to changing circumstances. While these wild foods may have accounted for only a relatively small proportion of the overall diet, they represented an important source of nutrition. "When we look at the archaeobotanical assemblage in these early societies, we see that there are a lot of wild resources," explains Dr Stevens.

Initially these plants were genetically wild, but over time many became domesticated, as new plant cultivation and management techniques were developed. "It's been suggested that this process of domestication, in which the genetics of a plant were changed into something which we would say was a domesticated plant, takes about 2,000 years," continues Dr Stevens. "There's a vast grey area between wild plants and domesticated plants, in which you might say plants are still semi-wild."

The project aims to gather evidence from many regions of the world to build a deeper understanding of how plant domestication evolved, a process which dates back several millennia. Researchers have found several archaeological sites with evidence of early-stage cultivation, when the management of plants developed to a stage in which people cleared the land. "They were clearing an area for that plant, and potentially sowing the seeds of that plant as well," says Dr Stevens. There is evidence that hunter-gatherers managed their environment to a degree before this point, for example by burning out undergrowth or vegetation to encourage the growth of certain plants, but plant cultivation came later. "Management of wild [cereal] stands may have involved burning off vegetation every now and again, but they weren't necessarily clearing and tilling the soil with implements, or re-sowing the seed," says Dr Stevens.

Domesticated plants
Researchers will bring together data on the evolution of over 30 crops from across the world, aiming to build a framework to compare how agriculture and plant domestication evolved in different countries. The project is pursuing primary archaeobotanical research at several locations, including sites in East Asia, India and parts of Africa, looking at the charred remains of plants that haven't decomposed. "We're looking for plants which have survived archaeologically. In normal circumstances plants decompose. Some however are preserved within what we call anaerobic environments, where there's no oxygen for those organisms that usually decompose plants to do that," says Dr Stevens. "The main way in which that happened is that these plants were thrown into a fire and became charred, from which they were then able to survive archaeologically. That's predominantly grains, and sometimes chaff and weeds."

A great deal of attention is centred on the size of the cereals and pulses that are found at these sites. Typically, the wild grains are very small and the domestic grains are quite big, from which Dr Stevens says researchers have been able to draw new insights into the extent of plant domestication. "By measuring the size of those grains through time, you can measure not only at what stage they were domesticated, but also how far along that process they've gone," he explains.

Gradual evolution of domestication traits. Top: percentage of non-shattering seed. Bottom: grain size. Means and standard deviation plotted against age before present.

Researchers are also looking at the seed coat on some plants, whose thickness governs whether a seed stays dormant. "Wild quinoa for example has a seed coat, which is usually quite thick. In a typical environment it produces thousands of seeds, which then get incorporated into what we call a seed bank. Some of those seeds will germinate quite soon, but some of them can remain there in the soil for up to a hundred years or so," says Dr Stevens. This is an important issue in terms of the cultivation of a plant and the nature and size of the yield. If a seed is being planted with the aim of it growing in the near future and providing food, then it's important to consider the thickness of the seed coat. "A reduced seed coat thickness allows chemicals from the outside environment to get into the seed and promote germination, and also allows water to get into the seed," says Dr Stevens.

Various other traits can also be measured. "One particularly important trait is spikelet bases. From this we can tell whether a plant was shattering, meaning it was a wild type, and the grain would just have fallen off the plant," outlines Dr Stevens. "If it was a domesticated type the grain would stay on the plant – we can look at the ratios between the wild and domesticated types. We find that the wild type gradually disappears over time, until we end up only with non-shattering plants within the population of plants that are being cultivated."

There is also the potential for a level of mixing between domesticated and wild plants. Wherever a domesticated plant grows in the vicinity of its wild progenitor, there's a chance of it crossing back to being a wild plant; in some areas they have mixed to the extent that it's sometimes difficult for researchers to identify whether they are wild or not. "Cannabis is a classic example – it grows over vast areas of northern India and China, but we don't know quite what its wild distribution is," says Dr Stevens. The project is integrating archaeological evidence with genetic data to try and understand where plants have diverged from the wild population and then become domesticated again. "It's very difficult to tell whether the wild plants which we have today are actually ancestors of earlier wild plants, early domesticates, or early cultivated plants that have since gone wild. So cross-fertilization between wild and domesticated plants makes it very difficult," continues Dr Stevens.

The vast majority of crops in Europe are domesticated, yet there are areas of the world where wild plants are still gathered and cultivated. Researchers in other initiatives have travelled to parts of India and Bangladesh to investigate how people gather wild rice for instance, yet Dr Stevens says this kind of work can be challenging. "There are certain areas of the world where we can do that kind of work – in certain parts of Africa for instance there are still people who would go out and collect wild millets and other plants. The biggest obstacle tends to be the political situation and actually getting to those areas," he outlines. There is still evidence of relatively recent hunter-gatherer societies who are fairly untouched by agriculture though, and researchers have access to further written materials. "We use historical ethnographic texts where the management of wild plants is described," says Dr Stevens.

Map of centres of early crop domestication and trajectories of agricultural dispersal marked in 1000s of years before present.

View from the archaeological site Jarmo, in Iraqi Kurdistan, an important early farming village sampled by the project.



Data comparison

This information will be combined with existing evidence from Western Asia and Europe to develop a deeper picture of how domestication traits evolved in plants. Archaeobotanical remains have been gathered from different time periods; researchers will bring together quantified time series data, from which new insights can then be drawn into how agriculture evolved. “We’re looking at sites from various different time periods, then looking at traits like the size of charred grains from each of those sites and the ratios of wild to domesticated spikelet bases. Then we’ll quantify that data, plot it out, and see how it changes over time,” says Dr Stevens. Researchers can then compare data gathered from different parts of the world. “We’re looking at things like what sort of subsistence patterns they had before the beginnings of plant cultivation? What sort of wild plants were they eating? What was the ecology of the plants which came into cultivation?” continues Dr Stevens.
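A minimal sketch of what this quantification might look like is shown below, mirroring the project's plots of non-shattering percentage and grain size against age before present. The numbers are invented for illustration and are not project data.

```python
import matplotlib.pyplot as plt

# Invented illustrative series (not project data); age in years before present.
age_bp         = [9500, 8500, 7500, 6500, 5500]
pct_nonshatter = [5, 20, 45, 75, 95]          # % domesticated-type spikelet bases
grain_mm_mean  = [1.1, 1.3, 1.6, 1.9, 2.1]    # mean grain length, mm
grain_mm_sd    = [0.2, 0.2, 0.3, 0.3, 0.3]

fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True)
ax1.plot(age_bp, pct_nonshatter, marker="o")
ax1.set_ylabel("% non-shattering")
ax2.errorbar(age_bp, grain_mm_mean, yerr=grain_mm_sd, marker="s", capsize=3)
ax2.set_ylabel("grain size (mm)")
ax2.set_xlabel("age (years before present)")
ax2.invert_xaxis()  # oldest assemblages on the left
plt.tight_layout()
plt.show()
```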

At a glance
Full Project Title
Comparative Pathways Towards Agriculture (ComPAg)
Project Objectives
This research will produce the first global comparative synthesis of the convergent evolution of domesticated plants and early agricultural systems, based primarily on empirical archaeobotanical data. It will provide a new framework for explaining the multiple routes from foraging to agriculture, the most significant ecological and economic change in the history of human populations.
Project Funding
European Research Council: €2,041,992
Project Partners
University College London, United Kingdom (Lead Research Organisation) • Peking University, China (Collaboration, Partner) • Australian National University (ANU), Australia (Collaboration, Partner) • Deccan College Post-Graduate and Research Institute, India (Collaboration, Partner) • Jahangirnagar University, Bangladesh (Collaboration) • University of Otago, New Zealand (Collaboration) • Yunnan Province Institute of Archaeology, China (Collaboration) • Department of Archaeology and Heritage Management, Addis Ababa University, Ethiopia (Collaboration) • Silpakorn University, Thailand (Partner) • University of Hawaii at Manoa, United States (Partner) • Bioversity International, Italy (Partner) • Chinese Academy of Sciences, China (Partner)
Contact Details
Professor Dorian Q Fuller
UCL Institute of Archaeology
31-34 Gordon Square
London WC1H 0PY
T: +44 (0)20 7679 4771
E: d.fuller@ucl.ac.uk
W: http://cordis.europa.eu/project/rcn/108519_en.html
W: http://www.ucl.ac.uk/archaeology/research/directory/compag-fuller

Professor Dorian Q Fuller

Dorian Q Fuller is Professor of Archaeobotany at the Institute of Archaeology, University College London. He completed his PhD in Cambridge (2000) on the origins of agriculture in Southern India. Since then he has expanded his studies in domestication to include all of India, China, Sudan, Ethiopia, West Africa, Southeast Asia and the Near East.


A comparison of modern sorghum (left) and archaeological sorghum preserved as imprints on ancient Sudanese pottery (right). Top row: examples of wild sorghum. Bottom row: examples of domesticated sorghum.

Researchers can also draw comparisons between cultures before they began cultivating plants, and how they explored the surrounding environment. This is an integral part of the project's wider goal of developing a comparative understanding of the context in which plant domestication took place. "We're looking at how long the process takes, from the beginning of plant cultivation to a point where plants are fully domesticated," says Dr Stevens. "We're also asking questions about how society develops during this process. At what point does the population begin to expand? Does it begin to expand, and does the distribution of that crop begin to widen? Do you get migrations of people across the landscape? Does that only happen after the crop has been cultivated?" continues Dr Stevens.

This research holds wider relevance beyond the inherent research interest of the topic. There is a great deal of interest currently in looking at the genetics of early wild species of specific crops for example, with a view to managing or cultivating them in different ways; Dr Stevens points to the example of rice. “One area that there is a lot of interest in is whether it is possible to cultivate a perennial rice species which has all the characteristics of domestic rice, but is perennial. This means that you don’t need to dig up the land every year and cultivate it,” he explains. The project’s work in enhancing our understanding of plant domestication could help inform this kind of work. “In terms of understanding the genetics of the process of plant domestication, our research could help improve farming techniques,” continues Dr Stevens.



The evolution of rice
Rice has long been a staple crop across large parts of Asia, and cultivation methods have evolved as the population grew and dispersed over time. Professor Dorian Fuller tells us about his work in reconstructing the evolution of rice cultivation methods, and its wider importance in terms of our understanding of the global climate

The world's most productive crop, rice has long formed a central part of the human diet across large parts of Asia, and the methods used to cultivate it have evolved over time. The history of rice agriculture is an area of great interest to Professor Dorian Fuller, the Principal Investigator of an NERC-backed project bringing together 13 research partners from across the world. "The general aim of the project is to develop and then deploy new methods to determine how rice was cultivated. We're looking at the ecology of rice cultivation, and how those cultivation ecologies evolved over time and then spread over different parts of Asia," he outlines.

Methane levels
This research is central to our understanding of how our climate has evolved. Global methane levels started to increase between 4,000 and 5,000 years ago; one hypothesis put forward to explain this increase is the development of rice paddy agriculture. "If you grow rice on an upland field under high rainfall (dry rice), it basically produces no methane, whereas if you grow it in an irrigated paddy-field (wet rice), it produces lots of methane. So the ecology of cultivation makes a big difference," stresses Professor Fuller. The project's work will help build a stronger evidence base in this area. "The idea behind the project was to ask whether we can ground-proof that hypothesis in terms of empirical evidence," says Professor Fuller.

The primary focus of this work is reconstructing how rice was cultivated. The project is gathering archaeobotanical data from parts of South East Asia; this evidence takes two main forms. "We're looking at archaeobotanical seed remains, and we're also looking at phytoliths, which are micro remains," says Professor Fuller. Using the insights gained from these data, researchers aim to reconstruct early rice cultivation systems, dating back around 4,500 years. "Evidence indicates that the first stretch of cultivation in South East Asia was dry rice, which is not a methane-producing rice. Methane-producing rice cultivation methods, wet rice, evolved later," outlines Professor Fuller.

These findings hold great relevance to our understanding of population growth and dispersal patterns. As populations increased, production was intensified, leading to a shift from dry rice to wet rice. "Wet rice produces four times as much grain as rain-fed rice, so you can feed four times as many people," explains Professor Fuller. The general historical trend has been away from the cultivation of low-intensity dry rice towards wet rice; Professor Fuller and his colleagues are working to identify when this shift occurred. "Our data suggests that the big increases in wet rice production happened mainly in the last 3,000 years," he says.

This will have had an impact on atmospheric methane levels. While it is understood that greenhouse gas emissions have increased enormously since the beginning of the industrial age, research suggests that methane levels were already growing. "Methane levels steadily grew from 3,000 BC up to the industrial period. There's been debate as to whether that can be explained by natural climate processes – the difficulty is that we don't see anything like this in any previous inter-glacial period," says Professor Fuller. "If you go back to ice-core records from Greenland or Antarctica, from previous inter-glacials, you don't see an increase in methane levels." The growth and dispersal of wet rice cultivation is one potential explanation for this increase, which Professor Fuller believes needs to be taken into account in climate models that build on historical climate data. While it might be assumed that human activity did not have a big impact on the climate before the industrial age, in fact agriculture and cultivation had already affected the greenhouse gas budget. "Predictive climate models need to take into account that there is already this extra greenhouse gas in the atmosphere. That has implications for future planning of what we do in terms of mitigating the impact of greenhouse gas emissions," says Professor Fuller.
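The logic of the hypothesis can be captured in a toy calculation: only the wet-rice share of cultivated area contributes methane, so the emissions curve tracks the spread of paddy farming. The emission factor and areas below are hypothetical placeholders, not figures from the project or from ice-core work.

```python
EMISSION_KG_CH4_PER_HA_YR = 250.0  # assumed paddy emission factor (placeholder)

def annual_ch4_tg(wet_rice_area_mha):
    """Annual methane output in teragrams for a wet-rice area given in
    millions of hectares; dry upland rice contributes ~0 in this scheme."""
    return wet_rice_area_mha * 1e6 * EMISSION_KG_CH4_PER_HA_YR / 1e9

# Hypothetical expansion of paddy area across successive periods:
for area_mha in (1, 5, 20):
    print(f"{area_mha:2d} Mha of paddy -> {annual_ch4_tg(area_mha):.2f} Tg CH4/yr")
```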

The Impact of Evolving Rice Systems from China to Southeast Asia (Early Rice)
Utilizing archaeobotanical evidence, the project aims to model the dispersal and evolution of rice farming systems in China and south-east Asia. This information will then be utilised to produce improved models of past wetland rice agriculture and its impact on past and present climates and populations.
Funded by the Natural Environment Research Council (NERC): £735,752
Project: NE/K003402/1; May 2013 – April 2016
Professor Dorian Q Fuller
T: +44 (0)20 7679 4771
E: d.fuller@ucl.ac.uk
W: http://gtr.rcuk.ac.uk/projects?ref=NE/K003402/1
W: http://www.ucl.ac.uk/archaeology/research/directory/evolution-rice-fuller

Dorian Q Fuller is Professor of Archaeobotany at the Institute of Archaeology, University College London. He completed his PhD in Cambridge (2000) on the origins of agriculture in Southern India.



Solid foundations for dredging projects
Land reclamation projects offer a means of expanding living space, which is essential to meeting the needs of the growing global population, but effective defensive structures are required first. Alexander Rohe tells us about the MPM-DREDGE project's work in developing a numerical tool to model soil-water interactions

A number of land reclamation projects have been completed over the last few years, including Hong Kong's new international airport, the artificial Palm Islands just off the Dubai coast and Rotterdam harbour, all of which were built on land reclaimed from the sea. All were extremely complex, large-scale projects which involved displacing large volumes of water and soil, an area which forms the primary research focus for the MPM-DREDGE project. "We are aiming to develop numerical tools for engineers in order to model large deformations of soils in contact with water," explains Alexander Rohe, the project coordinator. While dredging companies hold deep technical expertise and knowledge of soil-water interaction, Rohe believes that they would benefit from the sophisticated numerical tools the project is developing. "Dredging companies have highly experienced operators on their vessels, who are familiar with the local situation. They know the environment and they know the soils, but that can be improved in a scientific way," he says. The starting point in this is a deep understanding of soil composition. Soil in coastal areas is mainly quite sandy, and is liable to being eroded by the movement of water. "During water flow the water can erode the soil, so the sand can be carried away with the water. Then the impact of waves on the shoreline, or on dykes, can cause erosion and affect structures, like for example flood defences," outlines Rohe. Dredging of soil can cause further disruption; the project is developing a numerical software tool for dredging companies, aiming to model the impact of soil-water interactions, covering both solid and fluid mechanics. "Traditionally the disciplines of soil mechanics and hydraulics were rather separated – because one is a solid and one is a liquid. They were usually seen as two different fields of study, and separate companies were focusing on these aspects," says Rohe. "The novelty of this new software is that it can cover both fields of engineering – so both solid mechanics as well as the fluid mechanics."

Industrial application
Researchers are focusing on three major applications of this software related to the dredging industry. The first is the use of geocontainers, which are often used to provide added protection for harbour entrances from the sea, or for shore protection. "These geocontainers are like large bags filled with sand, with dimensions of maybe 50 metres x 20 x 10. They are dropped onto the sea floor and stacked up to protect the area behind against the waves," explains Rohe. These geocontainers are likely to significantly deform when they are dropped into water, which Rohe says is an important consideration for the dredging company. "The water flows through the soil in the bag and around the side of it – as such it influences how the bag deforms and how it settles onto the sea bottom," he points out. "This will also play a role in determining how strong these bags need to be and how many of them you would need. So the better you are able to install them, to drop them, then the less construction material you will need. That's a key driver of this research."

The second major application of the project's research is in modelling liquefaction, the process by which a solid becomes a liquid. When the amount of water in a particular area of soil increases to a critical degree, then the surrounding sand will lose its strength and flow away. "If the amount of water in a formation of soil on a slope changes, whether due to currents, waves, or to an inclination of the slope itself, then this slope can become unstable. The slope will then fail, so it will be liquefied, and it will flow into the sea. This process can happen very quickly," says Rohe.

The project's work is also relevant to modelling erosion processes; the tool is designed to help dredging companies understand how their operations will affect soil-water interactions, and tailor their work accordingly. "With this software tool you can establish parametric studies, and you can identify the significant influences on the process itself. Then you can focus your investigations on these particular parameters, and develop a more efficient and robust design," continues Rohe.
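The classical hand criterion for this failure mode, which long predates the Material Point Method, is the critical hydraulic gradient at which upward seepage cancels the submerged weight of the sand. The sketch below shows that textbook check only; it is not the MPM-DREDGE tool, and the unit weight is a typical illustrative value.

```python
GAMMA_W = 9.81  # unit weight of water, kN/m^3

def critical_hydraulic_gradient(gamma_sat):
    """Textbook 'quicksand' criterion: upward seepage liquefies sand once
    the hydraulic gradient i exceeds i_crit = (gamma_sat - gamma_w) / gamma_w."""
    return (gamma_sat - GAMMA_W) / GAMMA_W

# Typical saturated unit weight for a loose sand (illustrative):
i_crit = critical_hydraulic_gradient(gamma_sat=19.0)
print(f"i_crit = {i_crit:.2f}")  # ~0.94; gradients above this destabilise the soil
```

An MPM simulation effectively generalises this one-number check to arbitrary geometries, transient flows and large deformations.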



At a glance
Full Project Title
Modelling and simulation of soil-water interaction for dredging applications using the Material Point Method (MPM-DREDGE)
Project Objectives
The MPM-DREDGE project aims at developing, validating and demonstrating an advanced software tool based on the material point method for the modelling and simulation of processes related to the dredging and offshore industry. It is addressing the scientific challenges associated with large non-linear deformations, water pressures and phase transition that occur in the interaction between soils and fluids. Results are improving the understanding of the installation of geocontainers, dredging of soils, liquefaction of submarine slopes, landslides, and the erosion and scour around offshore and near-shore structures.

Top left: Erosion of the coast line due to wave attack and currents. Top right: Dredging for land reclamation. Bottom right: Eastern Scheldt storm surge barrier (The Netherlands) including model schematisation for assessing erosion.

Project Funding
FP7-PEOPLE-2012-IAPP – Marie Curie Action: "Industry-Academia Partnerships and Pathways", Grant Agreement PIAP-GA-2012-324522

Land reclamation
This is of course crucially important to land reclamation projects, which must be built on solid foundations, particularly in flood protection and offshore applications like oil and gas. The project is working with four European dredging companies involved in offshore applications, and researchers are collaborating closely with industry in the development and eventual validation of the numerical tool. "We will try to identify their needs, and we will get some data from them, to validate our work," says Rohe. The validation of the tool involves work on several different levels. "We will thoroughly test the individual parts of the software with known mathematical solutions. That's possible only for very simple cases – the next step would be to compare it to laboratory experiments," says Rohe. "For example, you could apply some water flow to a one metre high column of sand, and see how the sand grains behave. This can be simulated with the developed software, and then we compare the experimental data with our numerical data." The next step will be to compare the numerical results with field data.

Beyond the project's immediate objectives, and the testing and validation of the software tool, Rohe believes there is scope for further development. "This research is only a first step towards a more general design tool – this is quite a novel tool – in which you can model both solids and liquids. There's a lot of work to be done to make this tool attractive for engineering companies, and it will also be important to provide training and education to engineers in using these sophisticated models," he says. The goal within the project is to prove the applicability of the software to the three applications that have been identified; beyond this initial objective, new questions arise. "Can we use the software for similar problems, which are slightly different in terms of soil properties?" asks Rohe. "It could be not only sand, it could also be other types of soils, or other ways of water interacting with soils. Currently we are mainly focusing on liquefaction and erosion processes, but in future we could also think about other problems."
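Comparing the simulated column with the laboratory measurements ultimately comes down to an error metric over matched sampling points. A root-mean-square error is one common choice; the sketch below is generic, and the readings are invented for illustration.

```python
import numpy as np

def rmse(measured, simulated):
    """Root-mean-square error between experimental and numerical values
    sampled at the same locations or times."""
    m, s = np.asarray(measured, float), np.asarray(simulated, float)
    return float(np.sqrt(np.mean((m - s) ** 2)))

# e.g. pore pressures (kPa) at four depths in a one-metre sand column
# (invented readings, not project data):
print(f"RMSE = {rmse([9.8, 7.4, 5.1, 2.6], [10.1, 7.0, 5.3, 2.2]):.2f} kPa")
```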

Project Partners
Coordinator and participant: Unit Geoengineering, Deltares, Delft, The Netherlands, represented by head of department Mr Ipo Ritsema and scientist-in-charge Dr Hans Teunissen
Participant: Geotechnical and Environmental Research Group, Engineering Department, University of Cambridge, United Kingdom, represented by head of the chair and scientist-in-charge Prof Kenichi Soga
Contact Details
Project Coordinator, Dr Alexander Rohe
Deltares, Boussinesqweg 1, 2629 HV Delft, The Netherlands
T: +31 (0)88 335 73 51
E: alex.rohe@deltares.nl
W: http://mpm-dredge.eu/projectmpmdredge

Dr Alexander Rohe

Dr Alexander Rohe is a senior researcher and advisor at Deltares (Delft, The Netherlands). He conducts research on the numerical modelling of large deformation processes in soil-water-structure interaction problems related to dredging and offshore applications. This year he is a Marie Curie research fellow at the University of Cambridge and a Visiting By-Fellow at Churchill College (Cambridge, UK).



Information about land use is relevant to a wide range of applications, yet there are gaps in the existing data. We spoke to Dr Steffen Fritz about the Crowdland project’s work in harnessing the power of crowdsourcing to provide more detailed information

Filling the gaps in land use data

The European Statistical Office regularly gathers information on land cover and land use throughout the EU through the LUCAS survey, yet there are gaps in the data. Researchers in the Crowdland project now aim to harness the power of crowdsourcing to provide more detailed information. "The idea of Crowdland is to do certain quality checks of crowd-sourced data within the existing geo-wiki (see geo-wiki.org) framework," says Steffen Fritz, the project's Principal Investigator. The project aims to understand to what degree the official sample in the LUCAS survey can be complemented with additional information provided by citizens. "We aim to understand how information can be added to the official land use area frame sample, and how we can get detailed statistics and spatial information on land cover and land use in Austria and Kenya. We are using mobile technologies to do those tests, and to provide tools to citizens," continues Fritz.

FotoQuest Austria being accessed using a mobile device.

The LUCAS survey does not gather information on land use above an altitude of 1,000 metres in Austria, and the sampling density is relatively low, so the statistics tend to be quite coarse and aggregated in nature. By using crowdsourced information from more points within a country, Fritz and his colleagues aim to help build a more detailed picture of land use. "By laying more points we get more information about where changes in land use are happening," he explains. "One important change, for example, is the so-called 'land take'. This is the land that is being converted from agricultural or natural use to soil-sealed applications – such as shopping malls, road-building, or the expansion of urban areas. That's an important topic in Europe, and that's something which we specifically think Europeans are good at understanding. So it's about understanding how citizens can contribute to existing surveys."
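As a small illustration of what such change detection looks like on point data, the sketch below flags 'land take' conversions between two survey rounds. The class labels are hypothetical stand-ins, not actual LUCAS land cover codes.

```python
# Hypothetical class labels, not actual LUCAS codes.
SEALED = {"urban", "road", "shopping_mall"}
OPEN_LAND = {"cropland", "grassland", "forest"}

def land_take(points):
    """Return the ids of survey points converted from open land to
    soil-sealed surfaces between two survey rounds."""
    return [pid for pid, (before, after) in points.items()
            if before in OPEN_LAND and after in SEALED]

survey = {101: ("cropland", "shopping_mall"),
          102: ("forest", "forest"),
          103: ("grassland", "road")}
print(land_take(survey))  # -> [101, 103]
```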

Developing countries
This approach could help both mature economies and developing countries gather more detailed information about land use. Some developing countries lack rigorous statistics on land cover and land cover change, partly due to the difficulty of getting to certain areas in the first place. "Kenya for example has some very remote areas and relatively few roads, so it's much more difficult to get to certain selected sampling points," points out Fritz. Researchers are testing the potential of using very high-resolution satellite imagery, and possibly mobile money and social gaming, to encourage people to provide land cover information via desktop mapping or mobile technologies. "We will examine the costs of using these smart payment incentives to collect data using a trained crowd. We envisage training students at universities to go to those specifically selected sample points," explains Fritz. This information on land cover and land use holds real wider importance, in both economic terms and for our understanding of climate change. "The more soil is sealed, the greater the likely impact of a flood, while understanding how crops are managed is important in terms of understanding emissions from agriculture," says Fritz. Researchers plan to extend their work in this area, using very high-resolution satellite imagery to look at deforestation in Kenya. "We already have a prototype running called Picture Pile (see http://www.geo-wiki.org/games/picturepile/), that looks specifically at deforestation in Tanzania and Indonesia. We are now going to extend it to Kenya," continues Fritz. "This will really give us more detailed information on where deforestation is happening and what appears after the forest is lost, such as regrowth due to fire or cropland or rangeland conversion. It also gives us a means to cross-check existing deforestation maps derived from data in the Landsat archive."

Crowdland
This project assesses the potential of using crowdsourcing to close big data gaps of ground-sourced data on land cover, land use and change. The project builds on the Geo-Wiki crowdsourcing tool and moves from an online environment to a mobile ground-based collection system.
Dr Steffen Fritz, Senior Research Scholar, Ecosystems Services and Management, International Institute for Applied Systems Analysis (IIASA), Schlossplatz 1, A-2361 Laxenburg, Austria
T: +43 2236 807 353
E: fritz@iiasa.ac.at
W: http://www.iiasa.ac.at/web/home/research/researchPrograms/EcosystemsServicesandManagement/Crowdland.html
Dr Steffen Fritz is head of the Earth Observation Systems (EOS) group in the Ecosystems Services and Management (ESM) Program at the International Institute for Applied Systems Analysis (IIASA) in Austria. He is a senior expert in Geographic Information Systems (GIS), remote sensing, data interoperability, land use and land cover, as well as policy-related land use modelling. Dr Fritz is Coordinator of a European Research Council (ERC) grant entitled 'Harnessing the power of crowdsourcing to improve land cover and land-use information'.

Issuu converts static files into: digital portfolios, online yearbooks, online catalogs, digital photo albums and more. Sign up and create your flipbook.