EU Research Autumn 2012 Issue
The latest research from FP7
Dominique Ristori: A focus on the Joint Research Centre
Drugs - The Risks and Rewards
Exclusive interview with Professor David Nutt
Highlighting The Leading Dissemination for EU Research
Images: Martyn Poynor and Mandy Reynolds
Editor’s Note
Martin Brodie runs his own media and communications company, after a career in journalism and public relations. After starting his journalism career on the Bristol Evening Post, Martin joined Rolls-Royce plc in Bristol in 1980 and held a number of senior communications positions in the defence and civil aerospace businesses before moving to the group’s corporate headquarters in 1991. From then until his retirement in 2009, Martin was responsible for much of the Group’s international communications and was also involved in UK national media relations throughout this period. His last role was Head of Communications – Europe, Middle East and Africa. His previous roles also included responsibility for engineering and technology, during which he was editor of the first Rolls-Royce environment report, Powering a Better World.
As economies emerge from the financial crisis, they do so amid a growing acceptance in public policy that a dependence on financial services, to the detriment of high value-added manufacturing and associated services, reduces a country’s ability to achieve a successful and sustainable recovery. However, some countries are beginning to recognise that it is very difficult to create, or restore, such high value-added manufacturing capability overnight, or at the flick of an educational switch.

Technical education, in which the importance of science, technology and maths is paramount, is at the heart of a nation’s ability to generate wealth. The UK’s Coalition government has acknowledged this in the Vision of the Department for Business, Innovation and Skills (BIS), which ‘requires developing our human capital, through our universities and skills system, and science, innovation and technology development.’ The Vision also set out the role of BIS in helping to build a new and more responsible economic model that delivers sustainable growth and international competitiveness.

Without actually spelling it out, this model shares some of the aims of the European Union’s Seventh Framework Programme for Research and Technological Development (FP7). FP7 has two main objectives – to strengthen the scientific and technological base of European industry and to encourage competitiveness – while at the same time promoting research that supports EU policies. It was therefore encouraging that when Vince Cable, the new UK Business Secretary, met EU Commissioner Máire Geoghegan-Quinn in May 2010, on his first visit to Brussels since taking up his new post, they agreed that research and innovation are central to boosting growth and jobs in Europe and to ensuring a sustainable recovery, as set out in the Europe 2020 Strategy.

In a later speech, Mr Cable also spoke of the role of innovation – the introduction of new or improved products, processes or methods – as the key driver of economic growth in advanced economies. He also highlighted the difficulties in funding research against the Coalition’s plan to cut government spending to reduce the UK deficit.

The importance of FP7’s €50 billion budget is that, as individual countries struggle to fund all their research aspirations, it provides universities, small and medium-sized enterprises, innovative companies and public sector bodies with the ability to invest in research that creates the ideas that will increase Europe’s technological base and power its recovery.

Hope you enjoy the issue.

Martin Brodie
Contents
4 Interview with
Dr Konrad Talbot
The leading expert on dysbindin-1 discusses his latest research
6 ABC Professor Przemyslaw Kardas on medication programmes and an initiative which aims to help improve adherence levels
8 Endostem Professor David Sassoon explains work in developing strategies to stimulate stem cells to repair damaged tissue
10 MASC Professor Kevin Shakesheff explains to EU Research how the project is beginning to create new therapies to stimulate tissue repair
12 NANOPUZZLE The development of a new methodology for cancer treatment which discharges drugs only in the tumoral area, with Dr Jesus de la Fuente
14 Chromatin Crosstalk Professor Rolf Ohlsson talks about the communication within and between chromatin structures and the central role it plays in the body’s development
16 Pathogen Induced Apoptosis
Apoptotic cells play a major role in stimulating the innate immune response, an area being addressed by Professor Olle Stendahl
18 ASSPRO CEE 2007
Milena Pavlova, Wim Groot and Godefridus G. van Merode of the project write about their work to establish a set of evidence-based criteria to assess the effectiveness of patient payment
20 SBMPs Professor Alain Milon on membrane proteins and the key role they play in cellular processes, performing important functions in the body
22 IKPP We speak to Professor Dr Uyen Huynh-Do about the project’s research into the causes and possible treatments of chronic kidney disease, the ‘silent epidemic’
25 Exclusive Interview with Professor David Nutt
An insight into research, media and government policy
28 Conhaz Natural hazards in Europe, and the significant social and economic impact on the communities affected. Professor Reimund Schwarze explains the project’s work
30 Corpus Dr Gerd Scholl talks to EU Research about the project, the issue of sustainable consumption and bridging the gap between policy and research
32 Derreg
How globalisation is impacting on the effectiveness of regional responses, identifying the difference that can be made by regional development programmes
35 Treelim Professor Körner’s continued work on the upper tree limit of European tree species, aiming to explain the upper elevational limit of common deciduous forest trees in Europe
38 Cachet II We speak to Bai Song, coordinator of the project, about their work in developing a cost-effective technology capable of producing clean power from both coal and natural gas
40 The Intergovernmental Oceanographic Commission
Richard Davey explores the past, present and future of the IOC
44 Seeing in the dark We speak to Professor Eric Warrant about his work exploring how insects and other animals see in the dark
46 Superbad Massimo Capone identifies the cause of high-temperature superconductivity in a “cure” of the bad metallic behaviour caused by the strong electron-electron interaction
48 Flatronics Professor Andras Kis explores fundamental properties and potential applications of new nanomaterials
50 Nanogold We talk to Toralf Scharf about the work of the project in developing a metamaterial ink to produce materials with specific electromagnetic properties
53 Material Science A closer look at the history of graphene, its structure, and its potential applications
56 Heattronics Dr Tero Heikkilä talks about the limitation to the ongoing miniaturisation of electronic devices
58 UPCON Professor Anna Fontcuberta i Morral talks to EU Research about the latest developments in Ultra Pure Nanowire research and the advancement of Solar Cell technologies
62 Quantum Puzzle Professor Silke Bühler-Paschen on entirely new theoretical approaches, and advanced experimental studies of matter at extremely low temperatures
66 Bemosa Avi Kirschenbaum speaks to EU Research about the project and profound questions relating to how security issues are handled within an airport environment
68 Joint Research Centre With the global population set to rise to 8 billion by 2030, food security is a key issue for Dominique Ristori, the JRC’s Director-General
71 CEMYSS Dr Marc Chaussidon explains how the project’s work in developing new analytical techniques will help researchers learn more about the early years of the solar system
74 Ferdolim Professor Christophe Salomon speaks to EU Research about the five-year ERC funded research project that hopes to fully characterise the properties of Fermi gases
77 Visionspace Professor Colin McInnes talks to EU Research about the development of novel approaches to orbital dynamics to improve telecommunications and space exploration
EDITORIAL
Managing Editor Martin Brodie info@euresearcher.com
Deputy Editor Patrick Truss patrick@euresearcher.com
Deputy Editor Richard Davis rich@euresearcher.com
Acquisitions Editor Elizabeth Sparks info@euresearcher.com

PRODUCTION
Production Manager Jenny O’Neill jenny@euresearcher.com
Production Assistant Tim Smith info@euresearcher.com
Design Manager David Patten design@euresearcher.com
Designer Susan Clark design@euresearcher.com
Illustrator Martin Carr mary@twocatsintheyard.co.uk

PUBLISHING
Managing Director Edward Taberner etaberner@euresearcher.com
Scientific Director Dr Peter Taberner info@euresearcher.com
Office Manager Janis Beazley info@euresearcher.com
Finance Manager Adrian Hawthorne info@euresearcher.com
EU Research
Blazon Publishing and Media Ltd
131 Lydney Road, Bristol, BS10 5JR, United Kingdom
T: +44 (0)207 193 9820
F: +44 (0)117 9244 022
E: info@euresearcher.com
www.euresearcher.com
© Blazon Publishing June 2010
Cert no. TT-COC-2200
Dysbindin-1 and the Clinical Features of Schizophrenia
Dysbindin-1 is often decreased at sites of neuronal communication in several brain areas of schizophrenia cases. We spoke with Dr. Konrad Talbot of the University of Pennsylvania about his project on dysbindin-1 reduction in schizophrenia and its relation to the clinical features of the disorder
Over the past decade many advances have been made in understanding the biological basis of schizophrenia. Those advances include identification of variant genes (alleles) increasing risk of developing the disorder and of molecular abnormalities impairing communication among neurons in the brain. One of the genes in which variation has often been found to be associated with schizophrenia is known as DTNBP1, which encodes a protein called dysbindin-1. Unlike the product of nearly all genes associated with the disorder, dysbindin-1 is often decreased at sites of neuronal communication (synapses) in several brain areas of schizophrenia cases (auditory cortices, hippocampus, and prefrontal cortex), as discovered by Drs. Konrad Talbot and Steven Arnold at the University of Pennsylvania in the U.S. We spoke with Dr. Talbot about their project on dysbindin-1 reduction in schizophrenia and its relation to the clinical features of that disorder.
EU Researcher: Can you briefly describe your dysbindin-1 project? Does it involve collaboration with other researchers? Dr. Konrad Talbot: The project began in 2004 with the purpose of testing hypotheses about the extent of dysbindin-1 reductions in schizophrenia brain tissue and about synaptic functions of that protein as studied in dysbindin-1 mutant mice and wild-type controls. We adopted a multidisciplinary approach requiring a wide range of collaborators. Our laboratory specializes in biochemical and immunohistochemical studies of post-mortem tissue and of the mice just noted, enabled by an onsite brain bank, animal colonies and testing facilities. Our collaborators include behavioural neuroscientists at the University of
Pennsylvania (Irwin Lucki and Julie Blendy), an electrophysiologist at that institution (Gregory Carlson), and molecular biologists Derek Blake at Cardiff University and Wei-Yi Ong at the National University of Singapore. More recently we have focused on how dysbindin-1 reductions in schizophrenia may contribute to the clinical features of the disorder, which include both its diagnostic symptoms and its cognitive deficits. EUR: What are the main symptoms of schizophrenia? KT: There are two diagnostic categories, specifically positive and negative symptoms. The positive symptoms, which dominate media portrayals of schizophrenia, are features added to normal behaviours. They include psychosis (delusions and hallucinations) and disorganised speech, thought, and actions. The negative symptoms are features notable for deficits in normal behaviours (e.g., a poverty of speech, unchanging facial expression, lack of emotional responsiveness, and social withdrawal). Closely associated with the negative symptoms are prominent and diverse cognitive deficits such as memory
impairment, which are not diagnostic since they are associated with many neurological and psychiatric disorders. EUR: Are the genetic variations in DTNBP1 increasing risk of schizophrenia associated with specific symptoms or cognitive deficits in that disorder? KT: In six of the seven studies on this topic, DTNBP1 risk alleles were found to be associated with the severity of negative and/or positive symptoms. The most recent, and largest, of these studies, by Jaana Wessman and her colleagues at the University of Helsinki, found that such DTNBP1 risk genes are more common in a subset of schizophrenia cases marked by more prominent positive and negative symptoms, as well as greater cognitive deficits, including various forms of what is called working memory. Even among normal people, those carrying these DTNBP1 risk genes show such cognitive deficits compared to those who do not have such genes, suggesting the importance of the gene for cognition in general. EUR: Do DTNBP1 risk alleles explain the reduced levels of dysbindin-1 you found in
the brains of schizophrenia cases? KT: This appears unlikely. Schizophrenia results from interactions between environmental factors and many different genetic alterations. DTNBP1 variants are just one of many genetic factors involved and are not common even in the clinical population. In contrast to the low percentage of schizophrenia cases estimated to carry one or more of the DTNBP1 risk alleles (18 per cent or less in populations studied), its gene product (dysbindin-1) is reduced in 73-93 per cent of the schizophrenia cases studied to date, depending on the population and brain area studied. While it was initially reported that gene expression of DTNBP1 is reduced in a brain area consistently affected in schizophrenia (the prefrontal cortex), we were unable to replicate that finding and yet found that at least one form of dysbindin-1 was reduced in that brain area. This suggests that DTNBP1 risk alleles do not have a major impact on brain levels of dysbindin-1. These alleles may still affect expression of other genes and thereby affect brain function, but they do not appear to do so by affecting dysbindin-1 levels.
EUR: What then causes reduced levels of dysbindin-1 in schizophrenia cases? KT: We are just beginning to understand how levels of dysbindin-1 are regulated. Our only clue at present is the finding by Derek Blake’s laboratory at Cardiff University that an enzyme known as TRIM32 binds at least two of the three major forms of dysbindin-1 in the brain and targets them for degradation. We have found that in the human brain, the higher the levels of TRIM32 are, the lower the levels of these dysbindin-1 forms. But we also find that TRIM32 is itself reduced in schizophrenia cases, so it remains unknown what accounts for reduced levels of dysbindin-1 in their brains. Since schizophrenia is a neurodevelopmental disorder, we need to consider developmental factors in future research, not just in the neonatal period but in late adolescence-early adulthood when the disorder often manifests itself for the first time as personal responsibilities increase stress levels.

The positive symptoms, which dominate media portrayals of schizophrenia, are features added to normal behaviours. They include psychosis (delusions and hallucinations) and disorganised speech, thought, and actions

EUR: If not related to DTNBP1 risk alleles, what is the significance of reduced dysbindin-1 in schizophrenia? KT: At present, we cannot answer this question based on schizophrenia studies, because we lack detailed clinical and cognitive data on a sufficiently large number of cases for which we also have brain tissue to measure dysbindin-1 levels. But the potential importance of reduced dysbindin-1 in schizophrenia is evident from studies on sandy mice, which have a naturally occurring DTNBP1 mutation leading to a loss of dysbindin-1 protein. Studies in recent years by our laboratory and others in the U.S., China, and Japan have shown that these dysbindin-1 mutant mice show basic biological, cognitive, and behavioural features of schizophrenia, most notably impaired excitatory and inhibitory communication among and between neurons, reduced memory functions, and mouse equivalents of positive and negative symptoms.
EUR: Is there any particular abnormality in the dysbindin-1 mutant mice that provides new insights into the biological basis of schizophrenia? KT: Yes, there is increasing evidence that lack of dysbindin-1 may affect multiple clinical features of schizophrenia simply by diminishing reciprocal interactions between pyramidal cells and interneurons in the cerebral cortex and hippocampus. The pyramidal neurons excite the interneurons, which in turn inhibit the pyramidal cells. In sandy mice, pyramidal cells release less excitatory transmitter and are inhibited less by interneurons. We find that the latter effect is profound and that it is associated with a molecular abnormality in a subset of interneurons known for their parvalbumin content. Inhibitory output of these parvalbumin cells controls the timing of output by pyramidal neurons, ensuring that these excitatory cells fire synchronously at a
certain rate that maximizes their impact on neurons elsewhere in the brain and thereby coordinates activities across brain areas. The lack of such synchrony shows up in brain wave recordings as a slowing in what are called gamma waves, an effect common in schizophrenia and correlated with both its negative and positive symptoms, as well as its cognitive impairments. Dysbindin-1 reductions in schizophrenia may thus disrupt interactions between neurons in microcircuits critical for higher brain functions. EUR: What do you think are the most important implications of studies on dysbindin-1 for the treatment of schizophrenia? KT: We cannot expect that curing any one abnormality in schizophrenia will cure the disease. However, the evidence that reduced dysbindin-1 may affect not only positive symptoms, but also negative symptoms and cognition makes it an important new therapeutic target, because those last two clinical features, rather than the positive symptoms, are the most debilitating features of schizophrenia. The need for new therapies is great given that established therapies are largely effective only for positive symptoms. We are thus challenged to discover safe and effective means of restoring normal levels of dysbindin-1 in brain synapses of schizophrenia cases or restoring its synaptic functions by other means. At present, however, we do not have pharmacological means of achieving that. To develop them, we need a greater understanding of dysbindin-1 gene expression, synthesis, metabolism, and molecular functions. It is not unreasonable to expect progress in the near future, however, given the escalating pace of research on those topics.

The following publications give a more detailed discussion of Dr. Talbot’s views on dysbindin-1’s potential role in the clinical features of schizophrenia:

Talbot, K. et al. (2009) Dysbindin-1 and its protein family with special attention to the potential role of dysbindin-1 in neuronal functions and the pathophysiology of schizophrenia. In: Javitt D, Kantrowitz J, editors. Handbook of Neurochemistry and Molecular Neurobiology, 3rd ed., vol. 27 (Schizophrenia). New York: Springer US. pp. 107-241.

Talbot, K. (2009) The sandy (sdy) mouse: a dysbindin-1 mutant relevant to schizophrenia research. Prog Brain Res 179: 87-94.
ABC project’s prescription for a healthier Europe
Around 50 per cent of patients fail to adhere to the medication programmes they are prescribed, whether by not starting a course of treatment, not completing it, or missing a dose. This has serious long-term implications, says Professor Przemyslaw Kardas of the ABC project, an initiative which aims to help improve adherence levels
The failure of
patients to adhere to medication programmes severely limits the effectiveness of treatment and has significant implications for the cost of healthcare. With approximately 50 per cent of patients not adhering to medication programmes, this is a major public health concern, according to Professor Przemyslaw Kardas, the scientific coordinator of the ABC project. “The project aims to find out why people are not adhering to their medication and, on the basis of this, to provide guidance for European policymakers in order to improve adherence rates, and through that to achieve better health outcomes,” he says. Failing to adhere could mean not taking the medication at all (whether through never initiating it or through early discontinuation), taking it at the wrong time, or missing a dose; there are many reasons behind such behaviour. “Antibiotic treatments can help people quickly feel better, so they assume the problem is over. They are happy to quit the treatment after 3-4 days, when in fact they were prescribed the antibiotic for 7-10 days. Then there are people who don’t believe in the doctor’s diagnosis, or may have some objections against the medication itself. They may want to get better on their own without putting foreign substances into their body.”
Chronic conditions
One of the major issues in terms of drug adherence levels is the impact a condition has on the patient’s daily life. Patients with some chronic, long-term conditions may not see any immediate benefit from taking the medication; they often start to ask whether it’s still necessary to take the medication, to go through all the hassle and inconvenience it entails. “Some patients have found that nothing happens if they miss a dose of a drug. So they are happy to quit the treatment because they are fine in the short run,” explains Professor Kardas. However, in other cases patients find that drugs are extremely helpful in reducing the pain caused by their condition. “Patients with arthritis often find that drugs are helpful in
reducing the number and seriousness of the symptoms. The patient would then be much happier to take those drugs than somebody who has high cholesterol but isn’t aware of it and doesn’t feel any positive change in their organism after taking the drug. That’s one of the major reasons for differing levels of adherence. Beyond that, there is also a group of psychiatric conditions in which patients are not completely aware of having the condition, such as depression, and thus are not motivated to take any kind of drug at all.” There are also of course cases where people take excessive amounts of medication, but these are relatively rare and the ABC project’s focus is very much on those who take less than was prescribed. This includes people who would seem to have the strongest motivation to adhere to their medication programme, such as transplant patients. “People dream of getting a transplant for years, but non-adherence to medication is one of the biggest reasons for transplant rejection. The situation is similar with tuberculosis and HIV, in which the medication is available and in both cases is very important to the patients’ health. In tuberculosis it cures the disease while in HIV it may prolong the life of the patient,” says Professor Kardas. Regardless of the condition, failing to adhere to a medication programme has a severe impact on the effectiveness of treatment. “Take the example of oral contraceptives. Even if you only skip one tablet a month – less than 5 per cent of the total – you may ruin the whole treatment. For lipid-lowering drugs the threshold is more or less 90 per cent according to epidemiological studies. So you can miss three doses a month without compromising the effectiveness of the treatment, but if you miss more than three tablets your chances are equal to those who don’t take it at all.”

In future you may receive a much more tailored form of medication based on in-depth tests that assess your genetic background and the likely effectiveness of particular drugs. Adherence will become even more important in this situation – the more effective the treatment the more important the adherence

Identifying interventions
This not only has health consequences for the patient, but is also a serious waste of money for the healthcare provider, which can lead to further indirect costs that affect the wider economy. Raising awareness of the impact of non-adherence is a key part of the project’s work; however, they also have more tangible aims. “To date some 200 different factors have been established that affect adherence to medication. We aim to find out the factors which are major triggers of the patient behaviours, and on this basis to identify the interventions that may change these behaviours in a positive way,” outlines Professor Kardas. This work will provide important guidance for policy-makers and healthcare professionals in ensuring that patients adhere to their medication programme. “We have objectively assessed the situation in order to find out which factors are the most important in terms of non-adherence and which are the most promising in terms of both effectiveness and cost-effectiveness. We will then sum up these findings and provide policymakers with guidance on how to solve the problem. We are also providing guidance on the development of the curriculum for medical schools in order to equip doctors and nurses with the knowledge and skills which might help solve the problem.” The first step is doctors acknowledging that some of their patients are failing to adhere to the medication programmes they
prescribe. While ultimately it is of course the patient who has to take the medication, the ABC project team believes intervention by doctors can help raise adherence levels. “We aim to provide European citizens with what I would call an adherence-supporting environment. Of course we must allow patients to refuse treatment, that’s their right, but nevertheless we must also make them aware of the benefits of treatment.” One of the key issues is the simple fact that most people don’t like taking medication. “We are much happier when we feel that we are healthy and independent rather than having to support ourselves with medication,” points out Professor Kardas. “We have tried to characterise different groups of people in order to find out who is prone to non-adherence. A very interesting finding of our study was that the same people might be both adherent and non-adherent – we asked them about their adherence to long-term treatment for hypertension and at the same time we asked them about their adherence to antibiotics. There was no correlation – we can probably flag some people who are at very high risk of non-adherence, but it’s not the case that the rest are at no risk.” The project is currently working to publicise its findings and researchers would
be happy to release their results to the public for use in direct contact with policymakers at local and national levels. Patients today have an active role in treatment, and the project is keen to engage with them on how adherence rates can be improved. “I’m trying to convince the Polish Government to adopt some of the results of our project in the electronic patient records and electronic prescriptions it’s currently introducing. This would give us important insights into how patients are complying with treatment, which would be the starting point for future discussions.” The development of personalised medicine could change the way drugs are prescribed, making adherence even more important in future. “Nowadays if somebody has a chronic condition their doctor is obliged to follow certain guidelines – what you receive is basically a set of drugs universally accepted for that condition,” explains Professor Kardas. “In future you may receive a much more tailored form of medication based on in-depth tests that assess your genetic background and the likely effectiveness of particular drugs. Adherence will become even more important in this situation – the more effective the treatment the more important the adherence.”
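The adherence thresholds Professor Kardas quotes can be sanity-checked with simple arithmetic; this is an illustrative calculation, not a figure from the project, and the 28-day contraceptive cycle and 30-dose month are assumptions:

\[
\text{contraceptives: } \frac{1}{28} \approx 3.6\% \text{ of doses missed} < 5\%,
\qquad
\text{lipid-lowering drugs: } \frac{30-3}{30} = 90\% \text{ adherence}
\]

Missing a fourth monthly tablet drops adherence to 26/30, roughly 87 per cent, just below the 90 per cent threshold he describes.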
At a glance
Full Project Title
Ascertaining Barriers for Compliance: policies for safe, effective and cost-effective use of medicines in Europe
Project Objectives
The ABC Project aims to produce evidence-based policy recommendations for European policymakers for improved patient adherence and subsequent better use of medication, in order to obtain safer, more effective and cost-effective use of medicines in Europe.
Project Funding
Total budget €2,736,000, EU contribution €2,235,000
Project Partners
Medical University of Lodz • Bangor University • AARDEX Group • Keele University • Katholieke Universiteit Leuven
Contact Details
Project Coordinator, Professor Przemyslaw Kardas MD, PhD
Head, First Department of Family Medicine, Medical University of Lodz
60, Narutowicza Str., 90-136 Lodz, Poland
T: (+48 42) 678 72 10
F: (+48 42) 631 93 60
E: pkardas@csk.am.lodz.pl
W: www.ABCproject.eu
W: www.zmr.lodz.pl
Professor Przemyslaw Kardas
Project Coordinator
Professor Przemyslaw Kardas, MD, PhD, graduated from the Faculty of Medicine at the Medical University of Lodz, Poland, in 1994. He received his PhD degree with honours in 1999 for his studies on patient compliance with antibiotic treatment and has worked in the Department of Family Medicine, Medical University of Lodz, since 1998.
Using stem cells to regenerate tissue
The ability of stem cells to divide, differentiate and replace damaged tissue is significantly impaired in older people and patients with degenerative diseases. In this context the EndoStem project’s work in developing strategies to stimulate stem cells to repair damaged tissue takes on real importance, as Professor David Sassoon explains
Most tissue degeneration
leads to a regenerative response in the body, where stem cells are stimulated to divide and differentiate and replace the damaged tissue. However, a genetic defect in sufferers of the various types of muscular dystrophy, the disease of focus for the EndoStem project, ultimately leads to the reduction of stem cell functionality and deterioration of mature muscle tissue. “After prolonged regenerative responses in cases of muscular dystrophy the regenerative capacity begins to decline and it appears that the stem cells stop working as effectively,” explains Professor David Sassoon, the coordinator of EndoStem, an EC-backed initiative which brings together research and clinical teams to look at how stem cells resident in damaged tissue can be stimulated to repair it. The project’s focus is on understanding muscular and vascular regeneration in these tissues and developing biotherapies targeting them, research which is relevant not only to our understanding of muscular dystrophies, but all degenerative diseases and also old age. “Muscle mass commonly declines with age, a condition called sarcopenia. Many old people are frail – their muscle mass is reduced, and when the muscle is injured it fails to regenerate,” continues Professor Sassoon. “It is thought that understanding muscle stem cells could enhance these regenerative responses and help people recover quicker.” The mechanisms of tissue regeneration are common to all tissues, as are the factors which prevent effective repair. This means that insights from EndoStem’s work could be applied to many other tissues which degenerate as a result of damage or aging.
H&E cross section of healthy muscle from a normal mouse 1 week after injury. The central nuclei in the small fibers are typical of regenerating muscle.
Damaged tissue
When tissue is damaged signals are released that initiate an inflammatory response, where infiltrating cells in the blood move in, clean up and eat up all the dead and dying tissue. Understanding these signals and the best type and mode of therapeutic delivery could allow researchers to mimic them and develop cutting-edge approaches that enhance the regenerative response to tissue damage. “Studies show that some of the signals that are present in young people and healthy adults begin to disappear as you grow older,” outlines Professor Sassoon. However, deterioration in stem cell functioning is not only caused by changes in signalling, but also by changes in the
Immunofluorescence of fibers in culture stained for muscle myosin (red) and nuclei (blue). These photos come from a study in preparation by Nathalie Didier (Marazzi and Sassoon).
stem cells themselves. “We ask whether there is an autonomous or non-autonomous response and there’s data to support both sides of this question,” explains Professor Sassoon. “Alongside research showing that signals present in young muscle are absent in old muscle there are also other experiments, including work from our own lab, that show there are autonomous responses where the stem cell itself undergoes a process of aging. Perhaps the most well-known response in aging cells is the shortening of the telomeres at the ends of their chromosomes, which are absolutely essential for proper cell division.” There are also genetic factors in the disease case studied in Endostem which affect the ability of stem cells to regenerate tissue. In Duchenne muscular dystrophy the gene encoding a protein called dystrophin has a mutation which indirectly causes muscle damage, leading to the repeated recruitment of stem cells to repair it; Professor Sassoon says this ultimately drives the progressive downward course of the disease. “The disease is progressive – it repeatedly recruits stem cells which evolutionarily speaking were intended to be used only a limited number of times, and not on a daily basis over a period of 15-18 years,” he explains. With a genetic cure for muscular dystrophy still some way off Professor Sassoon believes it is important to pursue further research into stem cell-based therapeutics, work which holds relevance beyond muscular dystrophy. “The results of our research are not only applicable to muscular dystrophy but will be useful across the broad spectrum of clinical issues,” he stresses. “There are three phases to our project – fundamental research, preclinical research for therapy development
and clinical trials for therapeutic validation, bringing together leading research groups, clinical groups and companies.” The fundamental part of Endostem’s research centres on trying to understand the signals that get stem cells out of quiescence, where they do not divide, and stimulating them to divide and work. It is not clear whether all stem cells have the ability to generate tissue-specific cells or if some are designed to remain quiescent; while stem cells in the intestine were not examined in Endostem, Professor Sassoon says it presents an interesting example. “Cells in the intestine turn over on a daily basis, and in fact your entire intestine replaces itself every few months. It does so through stem cells that are never quiescent,” he explains. In other types of tissue stem cells are quiescent and incapable of any further cell division. “One really pertinent example is the heart,” continues Professor Sassoon. “We know that our hearts are not so different in terms of cell type to the heart of a fish, yet if you take a fish and damage the heart – literally cut away a piece of its muscle – the heart will regenerate. If you were to manipulate a mouse or human heart in the same way the individual would die and there would be no regeneration. So then you ask the question, are there stem cells in fish? The simple answer is yes – humans also have stem cells in the heart, but they don’t work very effectively.”

Muscle mass commonly declines with age, a condition called sarcopenia. Many old people are frail – their muscle mass is reduced, and when the muscle is injured it fails to regenerate. It is thought that understanding muscle stem cells could enhance these responses and help people recover quicker

Signalling pathways
From here the enticing possibility opens up of using pharmacological approaches to regenerate tissue. This will first require a deeper understanding of the signalling pathways involved. “A whole host of genes encode what are called growth factors, which can have a variety of effects on cell behaviour. They can cause cells to divide, differentiate, stop dividing, or even die,” explains Professor Sassoon. The proteins produced in this environment provide important external clues into how stem cell behaviour can be optimised; while a minority of people retain very good muscle mass well into their later years most people will inevitably undergo reduced stem cell competence as they age, reinforcing the wider relevance of the EndoStem project. “The results from this project will be very broadly applicable,” continues Professor Sassoon. “Let’s imagine that the project yields a small handful of therapeutic candidates in clinical trials that will have a beneficial effect on stem cells – it really doesn’t take much imagination to realise that’s going to be applicable to not only rare genetic diseases but also to sports medicine, as well as to gerontology.” These degeneration issues are common to all types of tissue, so the outcomes of Endostem will apply to many other tissues and diseases.
The project has three more years to run and is now at the stage of evaluating the initial returns from clinical trials. The aim is to bring a number of innovative therapies into four clinical trials by the time the project is completed. “The clinical trials are built into the project. One of the agents we’re currently testing is an anti-inflammatory for one of the central myopathies, which is a form of muscular dystrophy. The idea is that if you reduce inflammation you will be able to optimise regeneration,” outlines Professor Sassoon. Translating research advances into improvements in treatment is a long, drawn-out process; Professor Sassoon says the structure of EndoStem will allow researchers to incorporate feedback more efficiently. “We started with a number of clinical trials and asked ourselves some important questions – why did it work? Why didn’t it work? Why did it work not as well as we thought? Why did it work even better than we thought?” he explains. “We can then take those results, while we’re still at the pre-clinical and translational phase, and use them to guide our future research and effective therapy development.”
At a glance
Full Project Title
Activation of vasculature associated stem cells and muscle stem cells for the repair and maintenance of muscle tissue
Project Objectives
Endostem aims to develop new strategies to mobilise skeletal muscle tissue-associated stem cells for efficient tissue repair and novel approaches limiting tissue damage.
Project Partners
David Sassoon, Frederic Relaix, Patrick Debre and Ana Ferreiro, Giovanna Marazzi, Benedicte Chazaud, Jacques Demotes, France • Gabriella Minchiotti, Emilio Clementi, Silvia Brunelli, Elisabetta Dejana, Pier Lorenzo Puri, Marco Bianchi, Paolo Mascagni, Italy • Jeffrey Hubbell, David Glass, Thomas Meier, Switzerland • Nadia Rosenthal, Stefanie Dimmeler, Germany • Pura Munoz, Spain • Jonathan Dando, Bernardo Nadal-Ginard, United Kingdom
Contact Details
Project Coordinator, Dr David Sassoon
T: +33 1 40 77 81 31
E: david.a.sassoon@gmail.com
W: http://www.endostem.eu/
Dr David Sassoon
Project Coordinator
Professor David Sassoon is the coordinator of the EndoStem project. His group is based in Paris; their research interests range from the role of Wnt genes in smooth muscle development to the identification of novel gene pathways shared between cell stress responses and stem cell specification. Professor Sassoon actively contributes to academic-industry collaborations in translational research.
Copying natural development processes to create regenerative medicine products
The human body develops from a single cell using specialised architectures built from molecules, materials and cells. By copying these developmental processes, Professor Kevin Shakesheff is beginning to create new therapies to stimulate tissue repair
The human embryo
is a remarkable thing; it starts out as a single cell before dividing itself and, within a few weeks, becomes a complex system of cells and tissues that form the basic blueprint of our bodies. Our cells have an amazing ability to grow spontaneously and can self-assemble into incredibly complex tissue structures. Professor Kevin Shakesheff’s research aims to learn from the embryo, and his interest in its properties has opened up avenues of research that could change the way that patients who have suffered tissue damage are treated.
Professor Shakesheff is the principal investigator of the MASC project, based at the University of Nottingham. MASC, which stands for materials that impose architecture within stem-cell populations, aims to utilize breakthroughs in polymer science, nanotechnology, and materials processing, in order to create a new class of materials that could be used to mimic the regenerative nature of the human embryonic cell. “I’ve been interested for many years in the interaction between synthetic material and human cells,” Professor Shakesheff tells EU Research. “The main application of this project is to try to use synthetic materials as scaffolds, within which human cells can form human tissue.”
The project’s primary focus so far has been to look at possible clinical applications of using a synthetic material as a starting point for the growth of new tissue. The idea is that by using a synthetic, sponge-like material, new cells and blood vessels are encouraged to grow within the material’s porous structure. One potential application for this process would be for use with patients who have a large defect in their bone structure. “You can place this material inside the patient’s bone,” Professor Shakesheff explains. “It would then allow local stem cells, already present within the patient in nearby tissues, to migrate to the structure and lay down new bone material.” The next stage of the project will be to investigate how, by changing the three-
dimensional architecture of the scaffold, the way in which cells interact within the synthetic material can be changed, and the growth of better-quality tissue thereby encouraged. In order to achieve this, Professor Shakesheff and his team are looking at a number of methods. “One example,” he explains, “would be to embed within the synthetic material various drugs, which would encourage tissue formation at a faster rate; it would become a sort of pharmaceutical device.” Another method being explored is the introduction of outside stem cells to the structure. “Rather than relying on the patient having cells that migrate into the structure, you can put those cells in yourself; you would then know that there is a high concentration of the right cell type.” Professor Shakesheff tells us that there is a large gap between what he and his team are accomplishing within a lab setting, and what the human embryo itself can achieve. “The MASC project is trying to bridge that gap by investigating what it is the human embryo can do in order to control how tissues form. We hope that by using the state-of-the-art techniques that we are developing, we can develop this synthetic material that will have the same properties to control the growth of new tissue.”
Professor Shakesheff explains that within the early stages of embryo formation, stem cells have not yet decided what they will develop into; within the tissue structure of the embryo, a growth factor is released, which is a protein-based molecule. The growth factor migrates from a single point in three-dimensional space and creates a concentration gradient that affects all the cells at different levels. The combination of chemical gradients and local cell-to-cell communication acts as a kind of postcode that tells the cells what kind of tissue they should become (see the sketch after the list below). Using this as a start point, the MASC project is preparing all the techniques that would allow this process to be replicated within the lab. “We will be able to precisely locate growth factor releasing drug delivery systems with an accuracy equivalent to the width of a single cell,” Professor Shakesheff explains. “Then, by controlling the way the growth factor is released from the drug delivery systems, we can re-create gradients of growth factors in three-dimensional space.” In order to do this, the MASC team will utilize holographic optical tweezers to move single cells accurately within this three-dimensional space and “draw” tissue-like structures. “We are trying to do this with a level of precision that occurs within the embryo.” With the materials and tools developed as part of the MASC project, Professor Shakesheff and his team believe that there are three broad classes of patient who would benefit from the research:
• People who have suffered tissue damage as the result of a physical injury. This could be in the form of damage to tissue, muscle, bone, cartilage, part of the skeletal system, or it could be an internal structure or even traumatic injury to the head or damage to the nerves
• People who have suffered tissue damage, or tissue loss, due to disease. For example, cirrhosis of the liver, or liver cancer, where a lot of the liver has had to be removed. Also, heart attacks or cancers that affect most parts of the body’s tissue structures. If those tissues damaged by disease can be recreated, then medical treatment can be provided to restore tissue function
• People with congenital defects, where tissue forms incorrectly in the first place
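The growth-factor ‘postcode’ described above is commonly modelled as diffusion from a localised source with first-order degradation. The following is a textbook sketch, not the MASC project’s own formulation: for a point source of strength S, diffusion coefficient D and degradation rate k, the steady-state concentration at distance r is

\[
D\nabla^{2}C - kC + S\,\delta(\mathbf{r}) = 0
\quad\Longrightarrow\quad
C(r) = \frac{S}{4\pi D r}\,e^{-r/\lambda},
\qquad \lambda = \sqrt{D/k}.
\]

Cells at different distances from a releasing point therefore see different concentrations, which is what allows a single source to impose position-dependent instructions on a population of stem cells.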
Professor Shakesheff explains that the particular focus at the moment for the project is the growth of bone tissue. “Bone is an interesting tissue; it has structural function, and the bone marrow within it is a major site of haematopoiesis, the formation of the various blood cells,” he says. Professor Shakesheff hopes that the tools and materials that have been created throughout the MASC project will be very generic, and so will enable other researchers to build upon the team’s results and produce different tissue types. “Once you can precisely define the conditions within your scaffold, other people can take that technology further,” he says. “If you look at every tissue of the body, it is the same basic rules that are used to form that tissue, the same concept that you need to control in a three-dimensional environment at the level of a single cell.” The project is currently in the process of developing the fundamental tools that would allow the construction of the tissue structure. Working in collaboration with the University of Glasgow, the MASC team will be using a machine that will enable the “drawing” of cells in 3D. “Once the machine is up and running,” Professor Shakesheff tells us, “it could be used by many different groups of researchers, so we want to demonstrate that it works very robustly.” The team are also working on the production of the growth factor gradients. By using growth factors synthesized in the lab, the team will then form them into a pharmaceutical device known as a microparticle. The growth factor is encapsulated within a polymer shell which is designed to release the growth factor over an extended period of time. “We are using that as a way of creating growth factor gradients,” Professor Shakesheff says. “It is a very generic tool, and once we’ve got it working it could be applied to virtually any growth factor.” One of the key points of the MASC project is its multi-disciplinary nature. It is not only dealing with synthetic materials science, but it also encapsulates nanotechnology, stem cell science, and developmental biology. As Professor Shakesheff explains, “the attraction of this project is that there is an opportunity to weave all of these different areas together towards one common goal.”
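The release profile of such a polymer-encapsulated growth factor is often summarised with empirical models from the pharmaceutics literature; as an illustrative sketch rather than a result from MASC, the Korsmeyer-Peppas form is

\[
\frac{M_t}{M_\infty} = k\,t^{\,n},
\]

where M_t/M_∞ is the fraction of growth factor released by time t, k is a rate constant set by the polymer and particle size, and the exponent n indicates the mechanism (n ≈ 0.43 for Fickian diffusion from spherical particles, larger values where polymer erosion dominates). Tuning these parameters is what lets a microparticle sustain a gradient over an extended period.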
At a glance
Full Project Title
Materials that Impose Architecture within Stem Cell Populations
Project Objectives
The project will create new regenerative medicine product opportunities by incorporating ideas from developmental biology into new scaffolds and cell therapies.
Project Funding
?2.3 million
Contact Details
Project Coordinator, Professor Kevin Shakesheff
Head of School of Pharmacy
Centre for Biomolecular Sciences
University of Nottingham
University Park
Nottingham NG7 2RD
T: +44 11595 101
E: kevin.shakesheff@nottingham.ac.uk
Professor Shakesheff
Project Coordinator
Professor Shakesheff’s inventions and scientific breakthroughs have resulted in 180 peer-reviewed papers and 4000 citations. He has established two companies and won numerous international awards. A leading figure in shaping UK interdisciplinary research, he established a 320-researcher Centre for Biomolecular Sciences. In 2009, he became the Head of the School of Pharmacy and is a member of the Medicines and Healthcare products Regulatory Agency Expert Advisory Group on Biologics and Vaccines. He is also a member of the Research Excellence Framework Panel 3A for 2014.
Solving the NANOPUZZLE
Current methods of treating cancer can cause hair loss and damage to the liver, kidney and bone marrow. The NANOPUZZLE project is developing a new methodology which discharges drugs only in the tumoral area, an approach which Dr Jesus de la Fuente says will minimise the side-effects of treatment
Current methods of treating cancer are often not specific to the individual case and can have negative side-effects on patients. Cytotoxic agents used in chemotherapy kill both healthy and tumoral cells, causing hair loss and damage to the liver, kidney and bone marrow; the NANOPUZZLE project aims to develop a new methodology that will minimize these side-effects on patients. “The goal of the NANOPUZZLE project is to develop an innovative and versatile methodology for controlled drug release using magnetic nanoparticles. To achieve this, multi-functional magnetic nanoparticles will be prepared,” says Dr Jesus de la Fuente, the project’s overall coordinator. Through this methodology drugs will only be discharged in the tumoral area, avoiding many of the negative side-effects of chemotherapy. “Specific and complementary biomolecules will be incorporated to the nanoparticle surface as well as to the required biologically relevant molecules,” continues Dr de la Fuente. “Molecular recognition events will induce the multifunctionalization. The biologically relevant molecules will be able to specifically recognize the tumoral cells as well as to treat them with a strong anti-tumoral drug. The release of the anti-tumoral drug will be triggered by the application of an alternating magnetic field; the magnetic core will be heated up, and the temperature will induce the release of the drug.” The project’s main area of research has been ovarian cancer, as the researchers already had access to the animal model system, but the technology could be extended to other pathologies. This work in developing a general strategy for treating cancer could potentially bring real improvements in the
efficiency of treatment. “I believe that the choice of the right tumoral marker (specializing the treatment for each kind of tumor) could improve the efficiency of the treatment. The NANOPUZZLE multi-functionalization strategy provides a simple way to prepare different nanotools for each specific kind of cancer, while incorporating different tumoral markers will also be very simple. The main value of NANOPUZZLE is the development of a general multi-functionalization strategy,” explains Dr de la Fuente. A proof-of-concept has been prepared that uses a well-known anti-cancer drug, work which could be relevant to both the diagnosis and treatment of cancer. “The strategy that we have developed for the multi-functionalization could also be used for the nanoparticles – or other supports – which biosensors require for example,” continues Dr de la Fuente. “Our multi-functional magnetic nanoparticles labelled with tumoral markers could be a good candidate as a contrast agent, using magnetic resonance imaging (MRI) to diagnose cancer. MRI is used with these nanoparticles in collaboration with Dr. ML Garcia-Martin in the Nanomedicine Research Centre BIONAND in Malaga.”

The goal of the NANOPUZZLE project is to develop an innovative and versatile methodology for controlled drug release using magnetic nanoparticles. To achieve this, multi-functional magnetic nanoparticles will be prepared. Specific and complementary biomolecules will be incorporated to the nanoparticle surface as well as to the required biologically relevant molecules

Properties of nanostructures
This work builds on a recognition that nature has been using nanostructures for millions of years, such as in enabling plants to absorb light in certain wavelengths. The fact that nanostructures are around the same size as typical biological objects and that their properties can be tailored by changing their size or shape makes them ideal for biomedical applications, according to Dr de la Fuente. “Using nanoparticles to deliver drugs to tumors offers an interesting opportunity to avoid the obstacles that occur during conventional systemic drug administration,” he outlines. The researchers are using heat to control the release of the drug. “The release of the drug is controlled by the link between the drug and the nanoparticle, and the heat is produced by magnetic hyperthermia,” explains Dr de la Fuente. “This heat is produced as a result of the re-orientation of the nanoparticles’ magnetic moments due to an external magnetic field. However, it is unlikely that conventional magnetic hyperthermia treatments could be
extrapolated for use beyond a few superficial human tissues or organs, due to the huge amount of nanoparticles that it would be necessary to locate in the tissue in order to effectively heat and destroy the tumor. Furthermore, an intense cooling effect is generated by the blood torrent, making it necessary to apply a high external magnetic field.” The method being pursued by NANOPUZZLE actually uses fewer nanoparticles than conventional treatments, as it is not necessary to increase the temperature of the whole cell in order to kill it. Dr de la Fuente says this approach could prove more effective than more established methods. “We only need enough heat to release the drug,” he explains. The NANOPUZZLE multi-functionalization strategy offers a relatively simple way to prepare different nanotools for specific types of cancer; however, the ideal dosage of the drug will vary according to how far the disease has progressed. “Ideally we would start the treatment as soon as possible to destroy the cancer in the first stages. However, for severe cases the treatment will be longer and harder,” acknowledges Dr de la Fuente. “More of the drug will be required – we think we would need to administer the drug in several doses and monitor the reduction of the tumoral tissue. We face several technical challenges in developing this technology; concerning the nanoparticles, it is difficult to chemically characterize multi-functional biomaterials like this one. Concerning the hyperthermia, we are also having problems in finding ‘user friendly’ magnetic hyperthermia equipment, which is mainly for animal experimentation.”
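The heating Dr de la Fuente describes is usually estimated with linear response theory, following Rosensweig’s model; the expression below is a standard textbook form, not a result reported by NANOPUZZLE. For an alternating field of amplitude H_0 and frequency f, the volumetric power dissipated by the nanoparticles is

\[
P = \pi \mu_0 \chi_0 H_0^{2} f\,\frac{2\pi f\tau}{1+(2\pi f\tau)^{2}},
\qquad
\frac{1}{\tau} = \frac{1}{\tau_N} + \frac{1}{\tau_B},
\quad
\tau_N = \tau_0\, e^{KV/k_B T},
\quad
\tau_B = \frac{3\eta V_H}{k_B T},
\]

where χ_0 is the equilibrium susceptibility, τ_N and τ_B are the Néel and Brownian relaxation times, K is the anisotropy constant, V and V_H the magnetic and hydrodynamic particle volumes, and η the viscosity of the medium. Since P scales with H_0²f, a modest local temperature rise at the particle surface can be enough to break the drug-nanoparticle link, consistent with the project’s aim of using fewer particles than whole-cell hyperthermia would require.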
Minimizing side-effects
By nature anti-cancer drugs are extremely toxic, so it will be difficult to completely prevent unwanted side-effects. However, Dr de la Fuente believes that more selective administration and release of the drug in the right place will minimize the required dose and the side-effects. “We have developed magnetic nanoparticles and the multi-functionalization strategy, and we know that magnetic hyperthermia induces release of the drug using the NANOPUZZLE strategy. Now it is time for in vitro experimentation and in vivo assays,” he says. The project has only been running for 18 months, so it is too early to say that the new method will prove fully effective; nevertheless Dr de la Fuente is already looking to the future and the possibilities for further development. “The multi-functionalization strategy has been recently patented. This technology could be very interesting for research groups working in nanobiotechnology, pharmaceutical companies or the biosensor industry,” he continues. “We have created a spin-off company – Nanoimmunotech SL (NIT) – to which the patent has been licensed. Our aim is to produce different kinds of nanoparticles (for example gold, silver, iron oxide, quantum dots) which have already been prepared to link the biomolecules that have been derivatized by NIT.”
The NANOPUZZLE team
At a glance
Full Project Title
Multifunctional Magnetic Nanoparticles: Towards Smart Drugs Design
Project Funding
ERC Starting Grant
Project Partners
Institute of Nanoscience of Aragon (University of Zaragoza)
Contact Details
Project Coordinator, Jesus M de la Fuente
Institute of Nanoscience of Aragon, University of Zaragoza
Campus Rio Ebro, Edif I+D, C/ Mariano Esquillor s/n, 50018 Zaragoza (Spain)
T: +34 97 676 2982
F: +34 97 676 2776
E: jmfuente@unizar.es
W: http://www.bionanosurf.info/
Dr Jesus de la Fuente
Project Coordinator
Dr. J.M. de la Fuente currently leads the research group specialized in the Biofunctionalization of Nanoparticles and Surfaces. His research interest is the development of general and simple strategies for the functionalization of nanoparticles and surfaces for biomedical and biotechnological applications. He is currently coordinating an ERC Starting Grant and an ERA-NET project in the field of nanotherapy.
Chromatin structures and the body’s development
Communication within and between chromatin structures plays a central role in the body’s development, regulating when and where genes are activated. We spoke to Professor Rolf Ohlsson of the Karolinska Institute about his research into the chromatin architecture and its relevance to our understanding of the etiology of human diseases
Chromatin, the combination of DNA and proteins that packages the genome in a highly regulated manner, controls when and where genes are activated during the body’s development. DNA is wrapped around a smaller unit called a nucleosome, which plays a major role in cellular changes. “One big difference between eukaryotes and prokaryotes, which lack nucleosomes, is that nucleosomes repress transcription – so when you activate genes they have to de-repress, whereas in prokaryotes the default state is activity and then it has to repress to make a change.” “The general consensus is that there is a high-order structure that involves adherence between different nucleosome units. These nucleosomes can establish interactions within themselves or between factors interacting with distal nucleosomes in close physical proximity to regulate gene transcription in complicated patterns,” says Professor Rolf Ohlsson. Based at the Karolinska Institute in Stockholm, Professor Ohlsson is the principal investigator of chromatin cross talk, a project looking at chromatin architecture and its relationship to development. “It’s very important to silence or activate genes in an orderly manner to robustly manifest different repertoires of gene expression
patterns that specify liver or muscle cells, for example,” he explains. “Chromatin is also a target for changes, which are usually taken care of by chromatin remodelling complexes that open up structures and put new, different marks on the chromatin.”
Chromatin changes
Changes in chromatin structure have been correlated to certain stages of development, but much remains to be learned about the underlying mechanisms. Researchers know that chromosomes communicate with each other, while there are also extensive chromatin fibre encounters within and even between chromosomes; these interactions can then lead to wider changes. “You don’t need to have this interaction in every cell at any given time – in a population of cells a certain percentage are in sufficiently close proximity to interact. A major conundrum in the field, however, is to establish whether that interaction is sufficient to propagate a change in the chromatin structure, which would subsequently manifest itself as a transcription change for example,” says Professor Ohlsson.

We think many human diseases involve cross-talk between chromatin domains/regions, which are pre-disposed to disease, both within and between chromosomes. What we have postulated, and what we are currently pursuing, is that some individuals may have a genetic variant that either antagonises or promotes this ‘molecular discussion’

The length of the DNA also affects how changes occur. “Very short lengths of DNA, such as those in the promoter regions, are in many cases actually devoid of a nucleosome. This facilitates transcription activation by recruiting transcription factors and chromatin fibres at enhancer regions, for example,” continues Professor Ohlsson. These features set up a whole new system of gene expression regulation. Researchers who want to modify the expression repertoire of certain genes, which could be important for the differentiation process, need to take account of long-distance interactions between regulatory regions
and transcriptional units. This research holds real importance for our understanding of human disease. “We think a lot of human diseases involve individual-specific cross-talk between regions which may be pre-disposed to disease. What we have postulated, and what we are trying to pursue now, is that some individuals may have a genetic variant that either antagonises or promotes chromatin fibre cross ‘discussions’. As a result it can modulate the epigenetic state in a pleiotropic manner, to target distant regions,” explains Professor Ohlsson. “This means you have a cascade of events in disease development; if you interfere with one of them in the beginning you can have a read-out effect which is far away from the initial event. However, if you don’t know the cascade of events which connects the two points then you can’t understand what is going on.” The patterns of chromatin cross talk can be dramatically rewired during development or during disease progression. Moreover, heterogeneity in an individual’s epigenetic state/chromatin cross talk can influence the timing and the onset of human diseases. This is particularly the case in complex diseases, in which more than one gene is involved. “A lot of research effort is currently being focused on genome-wide association screens, which aim to identify the genetic variant associated with a particular disease. Analysis of this area has grown over recent years and now we know many variants of regulatory regions that can promote various human diseases. But the majority are far away from transcription units, so people don’t understand how they can be so far away and yet have an effect on the gene expression repertoire. We think these genetic variants are establishing platforms of discussions within and between chromosomes that indirectly affect the expression patterns,” explains Professor Ohlsson. In the research projects of his laboratory this cross talk primarily involves a suppressive ‘discussion’, but other networks are likely geared towards transcriptional activation. “Our colleagues working from a different angle have found that many genes are actually transcribed at the same time and in the same place, forming a network of actively transcribed genes in transcription factories,” says Professor Ohlsson.
Healthy and diseased cells A range of different cell types are being looked at in this research, including both healthy and diseased cells. The project is collaborating with a group of clinicians to
look at cancer stem cells, and Professor Ohlsson says they could also look at other conditions. “We hope to look at cells which have been taken from, let’s say, a diabetic patient, or from a patient with mental retardation or a neurological disorder of some kind. Then in the lab we will redirect those cells to the progenitor state which is associated with the initiation of the disease,” he outlines. “The genetic make-up of a particular individual is inherited from their parents, and that can affect any stage of development from when they were a foetus. Many human diseases, in particular complex diseases, are actually established when you are a foetus,” explains Professor Ohlsson. “There might be a slight imbalance in the organisation of the tissue for example, that subsequently sets the way for a lot of secondary problems when you grow up, and chromatin cross talk could influence this. But it could also affect you later on in development, because particular genes which are maybe having this crosstalk discussion don’t pair up until much later in development.” The principles governing chromatin communication are currently not well understood, but likely to be extremely complex, with the proximity of proteins and the mobility and preferences of the chromatin fibres among the most important factors. The project is analysing these interactions within a relatively small region of the genome to try and identify the underlying principles behind them. “I don’t think it’s possible to look at this at a high resolution from the whole genome perspective. We have calculated that between 5 and 500 million interactions occur at any given time within a human cell – given this number it’s better to use many small regions as baits – regions already known as chromatin nodes or essential regulatory regions – and see how it looks from there,” explains Professor Ohlsson. This work crosses many disciplinary boundaries, touching on areas of developmental biology, pathology and genetics to name just a few, so Professor Ohlsson says there is enormous scope for further research. “I firmly believe that when you pick up knowledge from different disciplines, and learn about how researchers think and reason, you’re better able to find new things and identify interfaces in a creative manner,” he says. “This work is very challenging but it can be very rewarding as well. We will keep this broad interface discussion going in my lab and hopefully we will be able to find something really new.”
At a glance
Full Project Title
Chromosomes talk to each other
Project Objectives
The overall aim is to understand the 3rd dimension of the epigenome and how it relates to both pediatric and adult cancers by incorporating quantitative traits and GWAS data of human diseases from novel perspectives.
Project Funding
€12 million
Contact Details
Project Coordinator, Rolf Ohlsson
Strategic Professor in Genomic Integrity
Department of Microbiology, Tumor and Cell Biology
Karolinska Institute, Nobels väg 16, Box 280, SE-171 77 Stockholm, Sweden
T: +46 8 524 86203
F: +46 8 342651
E: rolf.ohlsson@ki.se
W: http://ki.se/ki/jsp/polopoly.jsp?d=26255&a=68198&l=en
Rolf Ohlsson
Project Coordinator
Rolf Ohlsson has a background in biochemistry, pathology, tumor biology, cell biology, molecular biology, development and evolution. He was appointed to the chair in developmental biology at Uppsala University in 1993 and has been strategic professor in genomic integrity at Karolinska Institutet since 2008. Memberships include the Nobel Assembly and Faculty 1000, as well as stints on the editorial boards of J Biol Chem and Epigenetics.
Boosting the body’s resistance to TB
New co-infections, such as HIV, and increased levels of drug resistance mean tuberculosis remains a threat to health, particularly in poverty-stricken areas of the world. Apoptotic cells play a major role in stimulating the innate immune response, an area being addressed by Professor Olle Stendahl
Around 2 billion people across the world are infected by Mycobacterium tuberculosis, the bacterium that causes tuberculosis, and while relatively few will go on to develop the disease this still represents a significant threat to public health. More than 1.5 million people die every year from TB, a context in which the work of Professor Olle Stendahl’s research group at Linköping University takes on real significance. “The main questions we are addressing are what role the body’s innate immune response plays in the host response to TB and also how you can boost the innate immune response in order to facilitate clearing of the bacteria and enhance resistance to the infection. We are looking at the role of apoptotic cells in boosting the innate immune response in macrophages in the lung and are also interested in the role of nitric oxide in the lung as a resistance factor to TB, and in boosting the production of nitric oxide,” he says. While in most infected people the bacteria are latent and cause no ill-effects, some events can trigger the onset of the disease. “If you get a co-infection with HIV for instance, or perhaps with worms, then that can initiate the growth of the bacteria, because the immune response is impaired or changed,” explains Professor Stendahl. “That’s one reason TB has become a very threatening disease over the last couple of years, although we thought we had controlled it.”

Phagocytic cells
The role of phagocytic cells in ingesting harmful foreign particles makes them an important part of the body’s response to infection. There are two primary types of phagocytic cells – monocytes and neutrophils – which respond to chemical signals by accumulating at sites where a pathogen has invaded the body. “The monocytes and neutrophils will accumulate and then phagocytose and kill the bacteria,” explains Professor Stendahl. In more chronic cases of infection, where the bacteria rest inside cells, macrophages – cells produced by the differentiation of
monocytes in tissues – are able to kill bacteria over the longer-term; enhancing this process will offer improved protection against infection. “The BCG vaccine was developed in the 1930s. This vaccine is effective in protecting children against TB, but it doesn’t work very well in adults. So we need a better vaccine,” continues Professor Stendahl. “But there is no real established drug that will boost the immune response outside vaccines. There have been clinical
trials – people have shown for instance that vitamin D boosts the immune response and the function of the macrophages. We have shown that giving patients arginine – an amino-acid – will boost the activity of the macrophages to produce nitric oxide, and that will kill the bacteria. We can see that treatment is more effective in the presence of this compound.”

“The main questions we are addressing are what role the body’s innate immune response plays in the host response to TB and also how you can boost the innate immune response in order to facilitate clearing of the bacteria and enhance resistance to the infection”

These attributes have made macrophages an important research focus, with Professor Stendahl’s group looking at how their function can be boosted and the immune response in the lung enhanced. This also requires an active phagocytic cell that can phagocytose and kill the bacteria; the process of apoptosis holds real importance in this regard. “Apoptosis means cell death. But it’s not that the cell just dies in an uncontrolled way and goes into pieces – it’s what we used to call Programmed Cell Death, it’s a more regulated way for the cells to die,” explains Professor Stendahl. All cells have a limited lifespan; granulocytes live for only around 24 hours, after which they die in a regulated way. “When these cells go into apoptosis they have to disappear. They don’t just lie there and leak out a lot of enzymes that will damage the tissue – they are being phagocytosed by other cells,” continues Professor Stendahl. “The macrophages will phagocytose the apoptotic cells. The idea in our project is that the neutrophils will accumulate in the infectious area in the lung and then phagocytose some of the bacteria. They will go into apoptosis and then these apoptotic cells will be recognised by the macrophages, so the macrophage will phagocytose the apoptotic cell, and together with it the bacteria.”
Heat shock protein
The bacteria have certain virulence factors that specifically trigger apoptosis in the neutrophils, but there are also situations where they might impair the normal apoptotic process in macrophages. One of the main goals of Professor Stendahl’s research is to find a way to stimulate macrophages to be more active; the group have found that apoptotic cells or bodies can stimulate macrophages, and the next step is to analyse the underlying mechanisms at the molecular level. “We have found some molecules – so-called heat shock proteins, in particular heat shock protein 72 – which will signal to the macrophage to be more active,” outlines Professor Stendahl. This could eventually lead to the development of more powerful, effective methods of treating TB than are currently available. “The problem today is that treatment takes a long time – you usually have to treat people for six months with multiple drugs.
Over the last 10-15 years we have seen increased levels of resistance to certain antibiotics; more and more bacteria are resistant to these multiple drugs,” explains Professor Stendahl. “If the patients you treat do not follow this challenging regimen, there is a high risk they will develop resistance to certain antibiotics. We need to find new drugs and ways to reduce the time it takes to treat TB; this is one of our most important goals.” Alongside this basic research into apoptotic cells at Professor Stendahl’s lab in Linköping, the group is also pursuing extensive clinical studies into a large cohort of household contacts, looking at how people who are already infected with the bacteria respond to its presence. Researchers are looking at the immunological state of the household contacts and how that affects each individual’s chances of being infected. “Can we understand why just one individual is infected and not all the people sleeping in the same room? We are looking at the function of phagocytic cells, the production of nitric oxide in the lung, and also at some other immunological parameters on patients,” explains Professor Stendahl. Living conditions are another important factor in terms of an individual’s susceptibility to infection. “In Sweden the prevalence of TB was about 400 per 100,000 at the beginning of the twentieth century. Now it’s around 6 in 100,000, so the frequency has gone down by a factor of nearly 70. Most of that reduction can be explained by better living conditions,” says Professor Stendahl. “Sub-Saharan Africa in particular is going through a period of urbanisation and together with HIV, that will increase the risk of people contracting tuberculosis. People living in bad, crowded conditions are at greater risk of contracting tuberculosis.”
A model for how apoptotic cells and nutritional additives, arginine and vitamin D, enhance the innate immune response to M. tuberculosis.
Drug resistance
Concern over the increased threat posed by TB has led to an intense focus on vaccine development. However, with no new vaccine likely to emerge within the next five to ten years, Professor Stendahl believes it is important to also explore alternatives. “If we can boost the innate immune response of infected individuals in a cheap and effective way then that will definitely have an impact in the future,” he says. The group will continue its research into the underlying mechanisms behind the immune system’s response to infection. “The challenge is to understand how the apoptotic cells activate the macrophages. We have some leads for that,” outlines Professor Stendahl. “We also want to understand how co-infections affect the immune response to TB, and how this affects latent TB infections. We think that nitric oxide is important and want to find a robust way to increase nitric oxide production in lung tissue – we believe that would enhance resistance to infection. We have preliminary data showing that people who can generate more nitric oxide are not infected, while the group that cannot generate so much has a bigger chance of being infected by people in their family.”
At a glance
Full Project Title
Pathogen-induced apoptosis in phagocytic cells “Stimulation of innate immunity in tuberculosis from bench to bedside”
Project Objective
The main objective is to understand how innate immune activation stimulates macrophages, thereby facilitating early host response against and eradication of Mycobacterium tuberculosis.
Project Funding
5 million SEK (550k per annum)
Contact Details
Project Coordinator, Olle Stendahl, MD, PhD
Div Medical Microbiology, Dept of Clinical and Experimental Medicine
Faculty of Health Sciences, Linköping University, 581285 Linköping, Sweden
T: +46 101 032 050
W: http://www.hu.liu.se/ike/forskning/medmikro/stendahl?l=en&sc=true
Olle Stendahl, MD, PhD
Project Coordinator
Olle Stendahl, MD, PhD, is professor of medical microbiology at Linköping University, where he leads a translational research team focusing on host response against tuberculosis in Sweden and Ethiopia. OS received his MD and PhD from Linköping University Medical School in 1973, and has been a visiting scientist at Harvard Medical School, University of Geneva, and University of California at San Francisco (UCSF).
Many countries are moving the costs of healthcare away from government towards patients. Milena Pavlova, Wim Groot and Godefridus G. van Merode of the ASSPRO CEE 2007 project write about their work to establish a set of evidence-based criteria to assess the effectiveness of patient payment policies
The future of European healthcare
The guiding principle of European healthcare systems is that they should be available to people when they need them and provide high-quality treatment and care. Therefore many European countries have established state- or insurance-funded healthcare systems, which provide essential health services officially free at the point of use to all citizens. Nevertheless, across Europe the costs of healthcare are being shifted from government to patients to reduce the need for government funds. Such reforms are expected to reduce the deficit in the state budget but also to provide incentives to patients to use health care systems more efficiently and adopt healthier lifestyles. Policy-makers face a major challenge in designing fair, efficient and equitable patient payment systems that maintain a high quality of care for all citizens. This is particularly the case in the countries of the former Eastern bloc, where until 1989 healthcare was largely under state control. The twin pressures of an ageing population and economic shortfalls are placing an ever-greater strain on their healthcare systems, adding up to a powerful case for reform. These countries spent a long period under Communist rule before rapid and often untidy liberalisation of their healthcare systems post-1989. Since then healthcare systems have evolved, and it is now common for patients across these countries to pay not only for pharmaceuticals and medical devices but also for health services. The majority of these payments are formal, but some are informal. Informal patient payments present a problem since they negatively affect the functioning of the health care system and pose a threat to public health. They distort the system in several ways, jeopardising the efficiency, equity and quality of health care provision. The most serious risk associated with informal patient payments is that those who cannot afford them may postpone or forego medical treatment. Thus the formal-informal patient payment mix in Central and Eastern Europe imposes a considerable burden on household budgets. Would people be able to cope with new or increased formal charges? An initiative co-funded by the EC, the ASSPRO CEE 2007 project will help inform the debate. ASSPRO CEE 2007 is working to establish a set of evidence-based criteria by which the effectiveness
of patient payment policies can be assessed; the project is gathering and analysing data from Bulgaria, Hungary, Romania, Lithuania, Poland and Ukraine, as well as Albania, Serbia and Russia.
Informal payments
Patient payments which occur outside the official system, i.e. informal payments, have long been a part of the overall healthcare picture in Central and Eastern Europe. This is often attributed to limited healthcare resources and the desire of patients to ensure they receive the highest possible quality of care, which the government fails to provide. Informal payments are made to both medical staff in hospitals and general practitioners, usually for standard healthcare services. It might be thought that this is illegal, but in fact this is only the case if the payment is reported, so in many cases they occupy a kind of legal grey area, with the precise definition varying from country to country. In some countries informal payments are a significant part of health providers’ income, so banning them outright would have a major impact. Indeed, some policy-makers do not see informal payments as a negative phenomenon, especially in countries where funding for public health care services is scarce. Even taking these considerations into account, the project’s research shows that informal payments have a negative impact on the efficiency and equity of healthcare systems. Not everybody is able to make informal payments, and those who cannot might delay treatment or even put it off altogether, risking their own health and the health of the people around them. In countries where informal payments are the cultural norm, patients who cannot afford to pay may even have to take on loans or sell assets to fund treatment. The ultimate effect is to erode trust in public healthcare systems and drive many people towards private providers. It should also be considered that informal payments tend to undermine the role of policymakers and make funding assessment very difficult. The money tends not to be reinvested in public healthcare services, leading to increasing disparities between the quality of care in the public and private sectors. The practice of informal payments also means that the state may allocate fewer resources than are
required to maintain high quality health services. This has clear implications for public healthcare services and undermines the fundamental principle that everybody should have access to high quality healthcare, regardless of ability to pay.

The Project ASSPRO CEE 2007 team
Role of policy makers
Having gathered data from across Central and Eastern Europe, researchers say strengthening control and accountability in the healthcare sector should be a priority for policy-makers. This could take a variety of forms, such as increasing medical professionals’ pay, penalising those who accept informal payments and incentivising the development of the private sector. Poland and Bulgaria provide positive examples. This would lead to a variety of providers competing to provide services, while the project also recommends continued investment by the state in public healthcare services. This must be married to improved regulation if it is to deliver long-term improvement; countries where informal payments are common tend to have a weak regulatory system. There is a strong case for reform, as people who have been paying informally would accept formal charges. However, there is concern that the culture of informal payments could persist even after the introduction of formal charges, which would undermine their whole purpose. As with all types of corruption, the success of efforts to combat the prevalence of informal payments depends on a long-term cultural shift. This also needs the backing of healthcare professionals; many physicians currently benefit from informal payments and so may be reluctant to cut off what remains a highly lucrative income stream.
Effective regulation
The success of a move towards formal patient charges depends on the transparency of the system and public faith in its overall fairness. This calls for effective regulation and strengthening of the regulatory framework, which in the
long-term would lead to a cultural shift where informal payments were no longer sought or accepted. Many people in Central and Eastern Europe see informal or indirect payments for health services as part of the established way of doing things, but a majority across the countries surveyed by the project have a negative view of the practice. Increased formal charges are a key step towards ending it and making healthcare costs more transparent. Formal patient charges should be paid to the insurer or government, not to the physician. Money should be taken out of the relationship between physicians and patients. This is especially important since patients are often unable to distinguish between formal and informal payments charged by care providers. The insurer or government should reinvest the payment revenue in improved health services. The project’s research shows that ending the practice of informal payments would help improve the efficiency of healthcare services, while it could also have an impact on individual behaviour. A clear payment structure will encourage people to take more responsibility for their own health, which in turn will reduce the burden on providers and ensure resources can be targeted at acute cases. Effective communication with the public is crucial and will help patients understand the wider context in which policy is framed. European healthcare systems face strong demographic and cost pressures, underlining the need for reform. However, policy-makers also need to ensure that people on low incomes have access to high-quality care, a consideration reflected in the project’s policy recommendations. While the project suggests formal patient charges should be introduced, this should be combined with an exemption or fee reduction mechanism to ensure access for all. The revenue from formal charges could then be re-invested to widen access and improve quality, two of the key measures by which healthcare systems are judged.
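To make the recommendation concrete, here is a stylised sketch of how a formal charge with an exemption and fee-reduction mechanism might be expressed. The fee levels, income threshold and exemption categories are invented for illustration; they are not figures or recommendations from ASSPRO CEE 2007.

```python
# Stylised patient-charge schedule with exemptions and fee reductions.
# All thresholds and fee levels are invented placeholders, NOT figures
# or recommendations from the ASSPRO CEE 2007 project.

def patient_charge(base_fee: float, income: float, exempt: bool) -> float:
    """Return the formal charge a patient owes for one consultation."""
    if exempt:                 # e.g. children or the chronically ill (assumed category)
        return 0.0
    if income < 500:           # low-income fee reduction (assumed threshold)
        return base_fee * 0.25
    return base_fee

# A low-income household facing three visits in a year; charges are paid
# to the insurer rather than the physician, so revenue can be reinvested.
visits = [("child check-up", True), ("GP visit", False), ("specialist", False)]
income = 450.0
total = sum(patient_charge(10.0, income, exempt) for _, exempt in visits)
print(f"Annual out-of-pocket burden: {total:.2f}")  # 5.00 under these assumptions
```

The design point the sketch captures is that transparency comes from the rule being explicit: a patient can predict the charge in advance, which an informal payment never allows.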
At a glance
Full Project Title
Assessment of patient payment policies and projection of their efficiency, equity and quality effects. The case of Central and Eastern Europe
Project Objectives
• To identify a comprehensive set of tangible evidence-based criteria for the assessment of patient payment policies
• To develop a reliable and valid research instrument for studying the level and type of informal payments for health care
• To model consumer demand for health care services under official patient payments that accounts for the potential impact of informal payments for health care services
• To develop a projection tool for the analysis of macro-level efficiency, equity and quality effects of patient payment
Project Funding
The study is financed by the European Commission under the 7th Framework Programme, Theme 8 Socio-economic Sciences and Humanities, Project ASSPRO CEE 2007 (Grant Agreement no. 217431).
Contact Details
Project Coordinator, Dr Milena Pavlova
Department of Health Services Research (HSR), CAPHRI, Faculty of Health, Medicine and Life Sciences, Maastricht University
P.O. Box 616, 6200 MD Maastricht, The Netherlands
T: +31 43 388 1705
E: m.pavlova@maastrichtuniversity.nl
W: www.assprocee2007.com
Project coordinators: M. Pavlova, W. Groot and G.G. van Merode Dr Milena Pavlova – project coordinator and Assistant Professor of Health Economics at Maastricht University. Professor Wim Groot - scientific coordinator and Professor of Health Economics, Maastricht University and member of the Dutch Council for Public Health and Health Care. Professor G.G. van Merode – scientific coordinator and Professor of Logistics and Operations Management in Health Care, and Dean of Sciences at Maastricht University.
Studying the structure of membrane proteins
Membrane proteins play a key role in many cellular processes and perform important functions in the body. A collaborative approach to training is essential for young structural biologists to gain a solid grounding in the field, says Professor Alain Milon, coordinator of the SBMPs project
Membrane proteins play a key role in cellular communication, motion, adhesion and other processes. They also protect cells from toxic factors while allowing the entry of essential components. To effectively study the structure of such proteins, a variety of complementary techniques must be mastered. A collaborative network, comprising experts in each method, is essential for the complete training of young structural biologists. “The SBMPs project consists of a joint training effort including the major biophysical methods used in the structural biology of membrane proteins,” outlines Prof Alain Milon, the project’s scientific coordinator. This approach supports an integrated study of structure-function relationships in membranes, which will, in turn, facilitate new research possibilities. “It will open up new strategies for structure-based drug design, in particular towards G-protein coupled receptors (GPCRs), which are major drug targets. In fact, GPCRs represent 30 per cent of current drug targets.”

Structure of proteins
The current focus, however, is on training young researchers in the structural determination of membrane proteins. The SBMPs project brings together 12 research institutions and three industrial companies from across Europe, with each lab and fellow working with different proteins. “Our research groups combine all the expertise required to understand membrane protein function, purified or in their natural environment, from their expression and purification to their functional analyses, with the most advanced structural biology approaches currently available.” A wide range of membrane proteins are being studied in the SBMPs network, including GPCRs, ligand-gated ion channels, respiratory proteins, transporters and several β-barrel membrane proteins. Researchers collaborate with other labs from the SBMPs network, sharing expertise
and learning new techniques, including protein expression and purification, structural determination and the study of protein-membrane interactions. “Some partners are working on the interaction of the protein with the membrane, while others focus only on the structure of the protein itself, which is in its native membrane or reconstituted in either a natural lipidic or an artificial environment. This could be in detergent micelles, lipid bilayers, amphipols, nanodiscs or cubic phases. Some members of SBMPs study the structure of the protein associated with specific partners. Computer simulations and modelling of the structural and dynamical properties are also important aspects of the network’s research.”

Formyl Peptide Receptor 1 (FPR1) in a membrane environment. Residues in yellow, which spread from the extra-membrane region into the inner membrane, are involved in receptor activation.

Each type of membrane protein has different types of partners and performs various important roles, such as signal transduction or the transport of ions or other molecules. The project uses techniques like atomic force microscopy and electron microscopy to study individual membrane proteins as single molecules. “On another
level, membrane proteins are studied as an ensemble of purified proteins by using Nuclear Magnetic Resonance (NMR), electron microscopy or crystallography techniques. In these experiments, the protein can be associated with other components, such as partners in a biological complex or intact membranes.” Important insights have been gained by comparing results from intact cells or membranes with high-resolution structures of purified recombinant proteins. “The way proteins fold into a functional form is a critical step in terms of reconstituting them, and this can be monitored by biochemical and biophysical techniques,” outlines Prof Milon; however, he says the project’s focus is on structure determination of functional proteins. Alongside training in one of these techniques, young researchers in the SBMPs are exposed to other fields relevant to the structural biology of membrane proteins, as part of the project’s commitment to broad-based training. While the nature of modern scientific research demands a high level of technical knowledge, often in quite a narrow
field, researchers in the SBMPs project have the opportunity to learn about areas outside their own particular specialism. “All of the fellows have collaborations, allowing them to discover various fields. In addition, many of them visit other labs within the SBMPs network to directly learn complementary techniques. Important synergies occur between the various methods, and we try to maximise them. We have collaborations, student exchanges, training courses and annual meetings throughout the network. Complementary skills and contacts with the private sector are also favoured and organised within the network.”
This training is designed not only to encourage scientists to continue their research, but also to consider the broader potential of their work, an approach very much in line with the wider agenda of translating academic research into commercial development. The involvement of three major companies in SBMPs is testament to the importance industry attaches to membrane protein research, and it is hoped that SBMPs’ work will form the basis of future developments in biotechnology and personalised medicine.

“Our research groups combine all the expertise required to understand membrane protein function, purified or in their natural environment, from their expression and purification to their functional analyses, with the most advanced structural biology approaches currently available”

Collaborative approach
This collaborative approach is a real strength of the SBMPs project. The network brings together experts in complementary techniques to research the structures of membrane proteins, and provides students with the skills they need to perform detailed research. “Our research groups combine all the expertise required to understand membrane protein function, purified or in their natural environment, from their expression and purification to their functional analyses, with the most advanced structural biology approaches currently available. Our main areas of expertise are bioinformatics and molecular dynamics, liquid- and solid-state NMR, crystallography, electron microscopy and AFM-SMFS techniques.” Research advances in one area can lead to developments in other groups. “Each new structural insight allows other fields to work from this data,” stresses Prof Milon. “For example, crystallography can solve a protein structure, but NMR, AFM and SMFS will then use the structure to study some dynamic aspects in greater depth.” After solving a protein structure, the next
step is to investigate potential drugs or ligands. Many membrane proteins are implicated in certain pathologies, and the SBMPs project has established links with the commercial sector to investigate possible drug development. “A small enterprise called Zobio Inc, which specialises in drug development and ligand screening, is a partner in our network, and one of our fellows is directly involved in this research. Sanofi, a major pharmaceutical company, is another partner which contributes to the training of our fellows. Several partners, such as Bruker and JPK Instruments, are also engaged in research aimed at developing new instrumentation.” Several fruitful collaborations have been established over the course of the project, which Prof Milon believes will lead to further important research. “We are confident that the fellows we have trained in this network will have bright scientific careers in academic institutions or industry,” he says. “We also anticipate that several networks will emerge in the future from the teams gathered in the SBMPs project.”
At a glance
Full Project Title
Structural Biology of Membrane Proteins (SBMPs)
Project Objectives
Training of researchers in the first five years of their careers in theoretical and practical aspects of a wide range of structural biology approaches.
Project Partners
13 European groups involved in SBMP research: A. Milon (coordinator), J.L. Popot, E. Pebay-Peyroula, H. Vogel, A. Engel, H. Oschkinat, M. Baldus, D. Müller, H. Grubmüller, F. Bernhard, S. Filipek, M. Archer, Gregg Siegal.
4 industrial partners: Sanofi, JPK Instruments, Zobio Inc, Bruker.
International invited professors: D. Engelman, R. Griffin, K. Palczewski, O. Soubias, M. Piotto, J. Baenziger, M. Cocco, T. Schmidt, S. Yoshikawa, A. Kuklin, V. Shirokov, M. Engelhard, L. Yaguzhinskiy, G. Bueldt
Contact Details
Project Coordinator, Alain Milon
Institute of Pharmacology and Structural Biology – IPBS / CNRS UMR 5089
205 route de Narbonne, BP 64182, 31077 Toulouse cedex 4, FRANCE
T: 00 33 05 6117 5423
E: alain.milon@ipbs.fr
W: www.sbmp-itn.eu
Alain Milon
Project Coordinator
Alain Milon is a professor at the Université de Toulouse, where he leads a group specialized in biological NMR at the IPBS (UMR 5089, CNRS – UPS). His main research interests are membranes and membrane proteins, GPCRs, protein–DNA interactions and ligand screening by NMR for targets of pharmaceutical relevance.
SBMPs network participants, 2nd annual Meeting, Basel 2010
Project raises its voice over ‘silent epidemic’
With cases set to rise significantly over the next thirty years, chronic kidney disease has been described as the ‘silent epidemic’. We spoke to Professor Uyen Huynh-Do of the IKPP project about its research into the causes and possible treatments of the disease
Recent data show that approximately 10 per cent of the world population has some degree of chronic kidney disease, which restricts the body’s ability both to remove toxins and to retain important molecules, such as amino acids and proteins, in the blood. The incidence of the disease is predicted to rise further over the coming years, a context in which the work of the IKPP project takes on real importance. “The main goal of the project is to foster a new generation of researchers in the field of integrative kidney physiology and pathophysiology,” says Professor Uyen Huynh-Do, the project’s Training Programme Director. The project covers four key modules – water and salt, acid and minerals, nutrients and drugs
and finally oxygen – reflecting the central role of the kidney in maintaining homeostasis, a balanced internal environment within the body. Professor Huynh-Do says it is important to consider the whole organism in this research. “The kidney is central to the whole organism, so if you want to understand kidney physiology you have to have a global view,” she stresses. “It’s not just about the organ itself, or about some cells or molecules. We aim to look at the kidney within the whole organism.”
Functions of the kidney
The starting point is an understanding of the kidney’s functions within the body. One of the main functions of the kidney is
to retain the correct amount of salt and water, which Professor Huynh-Do says is essential to maintaining blood volume and blood pressure. “If you retain salt then you can also retain water, and this makes up the blood volume. If you don’t have enough water then your blood volume, and hence blood pressure, decreases; in contrast when you have too much water, the blood volume is too high and this leads to high blood pressure,” she explains. The second important function of the kidney is waste management. “When you eat and your muscles work, they produce waste. One type of waste is called urea, and the body has to get rid of it, which is the job of the kidney. It does this by filtering all the blood going through the
kidney, which amounts to about 1.2 litres per minute – a fifth of the volume of blood pumped by the heart per minute,” continues Professor Huynh-Do. “On the other hand the kidney also has to retain everything you need – this is not only salt and water, but also various proteins. So all those important things have to be retained and the waste has to be expelled; you really need quite a sophisticated system to do that.” The kidney’s ability to perform these functions is significantly impaired in people with chronic kidney disease. In particular the glomerular filtration rate, the capacity of the kidney to remove toxins and other molecules from the body, is reduced. “If the capacity is reduced by more than 50 per cent, or if you have any structural changes, then you have chronic kidney disease,” says Professor Huynh-Do. The project is also studying the closely related conditions of diabetes, metabolic syndrome and hypertension, which Professor Huynh-Do says are the most important causes of chronic kidney disease – in fact, hypertension can be both symptom and cause. “We know for example that when a person has advanced kidney disease the vessels into the kidney are sclerotic and rarefied, and the kidneys produce hormones increasing the blood pressure. This is one very important
reason why people get hypertension,” she explains. “In this case the kidney is really the cause of hypertension. As hypertension goes on it destroys the kidney even more. You have a vicious circle – kidney disease leads to hypertension, which in turn exacerbates kidney disease.” The kidney also produces some key hormones such as erythropoietin (EPO), which regulates red blood cell production, and active vitamin D, which is essential for bone maintenance. Lack of either EPO or vitamin D can lead to significant problems which are very difficult to reverse, so the project’s goal is to identify the risk factors in order to prevent kidney diseases. “New-born children have about 1 million nephrons, the working unit of the kidney, in each kidney. The number of nephrons declines throughout the average life-span, and because they are so complex it’s not possible for them to regenerate,” explains Professor Huynh-Do. Among the most important risk factors are diabetes, metabolic syndrome and hypertension, but Professor Huynh-Do says there are also other causes to consider, such as genetics or early foetal events. “Damage to the kidney can begin in the uterus of the mother,” she says. “It has been shown that children who have been born prematurely or are small for their age are more prone to hypertension, cardiovascular disease and
kidney disease later in life. This issue, known as foetal programming, is one of the things we are looking at in our network.”
Treating kidney disease
The ultimate focus of this research is to improve treatment. While currently there is no way of restoring the kidney to full functioning, Professor Huynh-Do says it is possible to prevent the progression of chronic kidney disease. “If you prevent proteinuria – loss of proteins through the kidney – or treat hypertension and diabetes, then you can halt kidney diseases. It can be stabilised; this has been shown in multiple studies of tens of thousands of people,” she stresses. The kidneys do not have to work at full capacity to maintain homeostasis; for example, people can donate a kidney for transplant without compromising their ability to remove toxins and produce hormones. “You won’t have 100 per cent function but you will still have 50-60 per cent, and this is absolutely enough to get rid of the body’s waste and to produce enough EPO and the other things you need,” says Professor Huynh-Do. “Normally both kidneys should work at the same time. But you don’t know, unless you go to the Doctor and they look at your blood samples and maybe make a
sonography of your kidney, to see how your kidneys are functioning. Some people never even know whether they have two functional kidneys.” This points to the difficulty of diagnosing kidney disease before it reaches a chronic state. Most kidney disease patients go to their Doctor because they feel tired or have hypertension, by which time Professor Huynh-Do says the disease has typically progressed to quite an advanced stage. “The Doctor takes a blood sample and finds the kidney is not working. But at that time the kidney may be working at only 15 per cent of its normal function, which of course is too late,” she says. People with a family history of kidney disease can be diagnosed quite early, but these cases are relatively rare and the predicted rise in cases of chronic kidney disease is more due to increases in hypertension, diabetes and metabolic syndrome. “About 30 per cent of the patients on dialysis in the industrialised world have diabetic kidney disease,” says Professor Huynh-Do. “Since cases of hypertension, diabetes and metabolic syndrome are likely to rise further over the next 30 years then you could also expect that the incidence and prevalence of kidney disease will rise over that period as well.”
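The kidney function Professor Huynh-Do describes in percentages is normally quantified in the clinic as an estimated glomerular filtration rate (eGFR). As a hedged illustration of the arithmetic – the formula below is the widely used four-variable MDRD equation and the conventional GFR stage bands, not a method specific to the IKPP project – a short script can turn a blood creatinine measurement into an eGFR and a rough disease stage:

```python
# Estimated GFR via the 4-variable MDRD study equation (IDMS-traceable form).
# Illustrative only; clinics increasingly use newer equations such as CKD-EPI.

def egfr_mdrd(creatinine_mg_dl: float, age: float, female: bool, black: bool) -> float:
    """Estimated GFR in mL/min per 1.73 m^2 of body surface area."""
    gfr = 175.0 * creatinine_mg_dl ** -1.154 * age ** -0.203
    if female:
        gfr *= 0.742
    if black:
        gfr *= 1.212
    return gfr

def ckd_stage(gfr: float) -> str:
    """Conventional GFR stage bands (stages 1-2 also require other
    evidence of kidney damage; this sketch classifies on GFR alone)."""
    for stage, threshold in [("1 (normal)", 90), ("2 (mild)", 60),
                             ("3 (moderate)", 30), ("4 (severe)", 15)]:
        if gfr >= threshold:
            return stage
    return "5 (kidney failure)"

gfr = egfr_mdrd(creatinine_mg_dl=2.0, age=60, female=True, black=False)
print(f"eGFR = {gfr:.0f} mL/min/1.73m^2, CKD stage {ckd_stage(gfr)}")  # ~25, stage 4
```

The “reduced by more than 50 per cent” threshold mentioned above corresponds roughly to an eGFR falling below 60, the conventional boundary for moderate chronic kidney disease.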
The future
This trend has led the World Health Organisation to describe chronic kidney disease as the ‘silent epidemic’, further
underlining the urgency of the situation. With researchers looking to understand the disease in greater depth, Professor Huynh-Do believes effective collaboration between basic scientists and clinicians is crucial to improving treatment. “We have multiple studies on treatment of hypertension and we also have other studies dealing with bone disease in patients with kidney diseases, an important, costly public health issue,”
she outlines. With funding in place for the next few years the project is pursuing research across a range of areas, reflecting the complexity of the kidney and its importance within the body. “We are beginning some projects in clinical nephrology which we think are quite important. For example, we are setting up a Swiss kidney stone cohort. This is really something we can treat quite effectively,” says Professor Huynh-Do. “We also have another study going on into patients with liver cirrhosis and kidney diseases. Liver dysfunction leads to kidney dysfunction, so we can also try to treat that. Currently we have several clinical research projects going on alongside our fundamental work.”

At a glance
Full Project Title
International Fellowship Programme on Integrative Kidney Physiology and Pathophysiology
Project Funding
The project budget is €3,158,532, of which €1,263,412.80 (40%) is funded by the European Commission’s 7th Framework Programme under the Marie Curie COFUND scheme.
Project Partners
• University of Bern (coordinator)
• University of Basel
• University of Geneva
• University of Lausanne
• University of Zurich
www.ikpp.unibe.ch/content/res/ri/
Contact Details
IKPP training programme director, Professor Uyen Huynh-Do
Department of Nephrology and Hypertension, Bern University Hospital, CH-3010 Bern
T: +41 31 632 3141
E: Uyen.Huynh-Do@insel.ch
W: www.ikpp.unibe.ch/content

Professor Uyen Huynh-Do
Training programme director
Uyen Huynh-Do was trained as an MD at the University of Zurich, where she specialised in internal medicine and nephrology. In 1996 she joined the Center for Vascular Biology at Vanderbilt University (USA) as a postdoctoral research fellow. In 2004 she was appointed assistant professor, and in 2008 associate professor, at the University of Bern. In 2009 she received a Master in Medical Education (MME) from the University of Bern and the University of Illinois (Chicago).
Drugs Risks & Rewards
An exclusive interview with Professor David Nutt
Professor David Nutt’s recent paper in the Lancet fundamentally challenges the way we think about drugs and their associated harms. In this wide-ranging interview he talks frankly about his research, the neurological effects of specific drugs, and the importance of impartial, clear-eyed scientific advice in drug classification
It is known that drugs can cause significant harm to both individual users and wider society, yet debate continues as to how they should be classified and their impact controlled. The UK operates a system where illegal drugs are classified on a list continuously monitored and updated by the Home Office. Those considered the most harmful (e.g. heroin and cocaine) are classified as class A, with significant penalties for both possession and dealing, down on a sliding scale to class C (e.g. ketamine and some tranquilisers), which, while still illegal, attracts less severe punishment. Scientific evidence is of course a key factor in classifying drugs, yet there are also other considerations to take into account, particularly political ones, and successive governments have amended the system in line with their own philosophy on drug control and the wider social climate. To take one of the more prominent examples, cannabis has variously been classified as either class B or class C over the past 20 years; Metropolitan Police Commander Brian Paddick experimented with a ‘softly, softly’ approach to possession in the south London borough of Lambeth in 2002, but the scheme was not extended. Alcohol’s central place in our social and cultural life has seemingly made it exempt from this debate, yet the growing number of alcohol-related health and social problems has put it squarely on the agenda. Professor David Nutt, Director of the Neuropsychopharmacology Department at Imperial College London, is an acknowledged authority on the effects of drugs and alcohol, and his recent paper in the Lancet fundamentally challenges the way we think about how they are classified.
Pharmacological research
EU Researcher: Is your research focused on looking at the fundamental structure of the brain, or are you looking more at how drugs affect the brain?
Professor David Nutt: I use a number of techniques to look at the psychopharmacology of disorders like addiction, depression and insomnia, among others. These include computerised tomography (CT), Positron Emission Tomography (PET) and Single Photon Emission Computed Tomography (SPECT). I am particularly interested in GABA receptors, a class of receptors which evidence suggests play a significant role in anxiety and depression, two separate but linked conditions. Anxiety is a kind of apprehension that a major event might go wrong, while depression tends to come afterwards; the two are very closely linked. GABA receptors respond to the neurotransmitter gamma-aminobutyric acid: drugs that act as agonists of GABA receptors, or increase the amount of GABA available, typically have relaxing effects.
EUR: How do you assess how harmful a drug is likely to be to an individual?
DN: In the study (Drug Harms in the UK: a multi-criteria decision analysis, recently published in the Lancet) that I co-authored with Drs Leslie King and Larry Phillips with the help of other
members of the ISCD, we identified 16 harms caused by drugs and alcohol. Nine of these harms are to the individual user, such as the effects of the drug on their physical and mental health, and seven to wider society, for example the economic costs of use and crime associated with the use of the drug. We assessed 20 drugs, including tobacco, alcohol and cocaine, on each of these 16 measures and ranked them accordingly. Although the study found that heroin, crack cocaine and crystal methamphetamine were most harmful to individual users, alcohol came out as the most harmful drug in terms of its impact on both individuals and wider society. Of course alcohol-related problems are nothing new, but this is a growing issue – more young people are suffering from alcohol-related illnesses; just last year a 22-year-old woman needed a transplant for alcoholic cirrhosis of the liver, which is a startlingly young age to be suffering from that condition.
Alcohol initially has a calming effect; it acts as a tranquiliser. It relaxes people and provides an immediate high through targeting two major neurotransmitters, dopamine and serotonin. Alcohol turns on the same brain systems as heroin, and although people tend not to like it the first time they’re exposed to it, people are starting to drink younger. This is partly down to the high visibility of drinks like alcopops, which alongside being heavily marketed are specifically designed to get young people used to alcohol by masking the taste with sweeteners.
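The ranking method Professor Nutt describes is a weighted multi-criteria decision analysis: each drug is scored on every harm criterion, the criteria are weighted, and the weighted scores are summed into an overall harm score. The sketch below shows only that arithmetic – the drugs, criteria, weights and scores are invented placeholders, not the 16 criteria or the published values from the Lancet paper.

```python
# Weighted multi-criteria scoring of the kind used in the Lancet drug-harms
# study. Criteria, weights and scores are invented placeholders, NOT the
# paper's 16 criteria or its published figures.

criteria_weights = {           # relative importance of each harm criterion
    "mortality": 0.30,         # harms to the individual user
    "dependence": 0.25,
    "economic_cost": 0.25,     # harms to wider society
    "crime": 0.20,
}

# Each drug is scored 0-100 on every criterion (100 = most harmful observed).
drug_scores = {
    "drug_a": {"mortality": 80, "dependence": 60, "economic_cost": 90, "crime": 70},
    "drug_b": {"mortality": 95, "dependence": 90, "economic_cost": 40, "crime": 50},
}

def overall_harm(scores: dict) -> float:
    """Weighted sum of a drug's criterion scores."""
    return sum(criteria_weights[c] * s for c, s in scores.items())

for drug, scores in sorted(drug_scores.items(), key=lambda kv: -overall_harm(kv[1])):
    print(f"{drug}: {overall_harm(scores):.1f}")   # drug_a: 75.5, drug_b: 71.0
```

This structure is what allows a drug that tops the user-harm criteria (drug_b above) to be overtaken overall once the society-wide criteria are weighted in – the pattern behind alcohol outranking heroin and crack in the published analysis.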
The politics of alcohol
EUR: Why have politicians failed to take stronger action? Is it because they’re scared of being seen as illiberal or penalising drinkers who don’t have a problem?
DN: There are several reasons. Among the most important is the simple fact that alcohol is deeply embedded in our culture; we associate it with relaxation and having a good time – a lot of people are comfortable with the status quo and are resistant to any efforts to change. Then there is the mistaken belief that tax revenues from alcohol companies outweigh the social and financial costs of alcohol-related health and social problems, and that greater regulation would have significant financial implications, an argument which of course holds a great deal of traction in the current economic climate.
This is in large part the result of a very sophisticated lobbying campaign by the drinks industry, in which politicians are often involved. Then there's persistent misinformation from the media, which sensationalises the issue and puts out inaccurate stories about tax on alcohol. The fact is that taxpayers are paying large amounts of money to treat alcohol-related problems. If you could regulate the industry more effectively – and this doesn't mean outright prohibition, I am not a prohibitionist – then you could make net savings through reducing alcohol-related illnesses, and people would be better off in the long term. I have estimated that the average taxpayer makes a contribution of approximately £1,000 per year to cover the costs of alcohol-related health and social damage.
EUR: How addictive is alcohol in comparison to other drugs?
DN: There are several ways of assessing how addictive a drug is. For example we can experiment on rats – a particular brain disorder has been identified which leads to very high rates of addiction, and there are likely to be parallels in humans. Then we can also look at what we call 'capture rates', where we analyse what proportion of people who start using a drug go on to become dependent on it. The most addictive drug of all is actually tobacco – about 40 per cent of users are dependent, while with heroin and crack it's about 30 per cent. In the case of alcohol I would say that between 10 and 15 per cent of users are dependent on it in some way, and overall about 10 per cent are in real danger. There is evidence to suggest that some people have a genetic variant which predisposes them to drug addiction. Research in Sweden and the USA has shown that the children of alcoholic parents are more likely to grow up to become alcoholics than their peers.
EUR: Could this be partly because of the nature of their upbringing? That maybe they grew up thinking alcoholism was normal?
DN: Some of these studies were on children who were actually adopted into non-drinking families, yet still grew up to suffer from similar problems to their biological parents, so genetic inheritance is clearly very important in terms of susceptibility to addiction.
It was also shown that the male children of male alcoholics had a higher tolerance for alcohol, which is innate rather than developmental, so they didn't get significant and unpleasant intoxication effects when they started to drink, which encouraged them to drink more than their peers.

Role of government
EUR: On what basis are drugs usually prohibited by governments? Is it primarily about the physiological impact of a drug and the likelihood of users becoming addicted?
DN: Scientific advice is of course an important consideration in this kind of decision, yet the reality is that it is ultimately a political one, and so a range of other factors also come into play. Governments always consider whether change is achievable in a particular area, and fear political damage if they try to deal with controversial issues likely to attract strident criticism from vested interests and elements of the media. These are the kinds of political constraints that government scientific advisers have to work within. My interest is in providing impartial, balanced evidence on the harm caused by drugs. That's why I, together with several colleagues including Professor Colin Drummond, Professor Barry Everitt and Ben Goldacre (the writer of the Bad Science column in the Guardian), set up the Independent Scientific Committee on Drugs (ISCD).
EUR: Is prohibiting a drug an effective tool in terms of managing its overall impact, or should the focus be more on making information available?
DN: Alcohol was prohibited in the U.S. in the 1920s, but it didn't really help control its impact, as the trade went underground. From the doctor's perspective prohibition was a success, as rates of liver disease fell; from a crime perspective it was a disaster, as organised crime moved into the bootleg trade on a massive scale. The focus has got to be more on making clear, accurate, accessible information about drugs widely available. The recent Lancet study was quite accessible; it gave an explicit score for each drug on its relative harms to the individual and wider society. The general perception of drugs and their associated dangers is often out of step with reality. For example, magic mushrooms and ecstasy (MDMA) are currently classified as Class A drugs, which is not commensurate with the danger they actually pose.
EUR: What role will the ISCD play in the drugs debate?
DN: The ISCD will provide unbiased analysis of drugs and their associated harms, and will contribute to the wider debate on drug classification. I hope the ISCD will play a role similar to that which the Institute for Fiscal Studies plays in scrutinising government economic and monetary policy; we want to be an independent, impartial, reliable source of information on drugs and their effects that will inform and underpin political decisions.
Managing the costs of natural hazards
Natural hazards are a fact of life in many areas of Europe, and can have a significant social and economic impact on the communities affected. Professor Reimund Schwarze explains how the ConHaz project's work in compiling state-of-the-art methods for cost assessment will help improve our understanding of the economic impact of natural hazards
The occurrence of natural hazards like droughts and floods is a fact of life in some areas of Europe, and while communities have found ways to adapt and mitigate their impact, such events still cause significant damage to local economies. Cost assessments are a central part of planning for natural hazards, an area being addressed by the ConHaz project. "ConHaz will compile state-of-the-art methods for cost assessment as used in European case studies and assess these methods in terms of terminology, data quality and availability; this will enable us to identify research gaps. From there we can synthesise new methods and recommend areas for future research," says Professor Reimund Schwarze, the project's overall coordinator. The economic impact of natural hazards is difficult to calculate, as alongside the direct costs of cleaning up and reconstruction there are also many other factors to consider, such as loss of trade or tourism. Professor Schwarze believes it is important to take both direct and indirect costs into account in order to assess costs more accurately. "We aim to improve our understanding of how the economy responds to external shocks like natural hazards," he outlines. "This means applying the broad spectrum of existing economic valuation methods to practical examples. We also need to look at mitigation costs, as non-structural measures are often not considered in current cost estimates." The project brings together eight highly reputed research institutions from across
the European Union to gain a comprehensive picture of cost assessment methodology, work which will be of enormous interest to European policymakers. Research is focused on four main types of natural hazard in particular: droughts, floods, storms and coastal hazards, and finally Alpine hazards. Areas at risk from these types of events often take preventative measures; however, policy-makers need to carefully consider the scale of the programmes required against the risk of inaction before implementing them on a wider basis, a point which illustrates the wider importance of the ConHaz project.
Natural hazards
The costs of some natural hazards are already factored into the general cost of living for those based in at-risk areas, such as people living on flood plains paying higher insurance premiums, yet there are still significant uncertainties across all areas of cost assessment. Researchers within ConHaz will evaluate the compiled methods by addressing various theoretical issues, such as the assumptions made within certain methods, and also the availability and quality of data.

Floods in Richmond, England

This work takes on even greater importance given increasing concern about climate change and its potential impact on living standards. Given the worst-case path of global greenhouse gas emissions we are on, some degree of climate change above the internationally agreed threshold of 2°C seems inevitable. Communities will need to adapt to this change if they are to protect their economies and maintain living standards. "Climate change is expected to shift the distribution of climate and weather characteristics, such as temperature, precipitation, wind and sea level. This will drive changes in spatial and temporal averages and the frequency, magnitude, and character of extreme physical events," explains Professor Schwarze. Such changes mean areas which haven't historically been affected by natural hazards could be at risk in future. In preparing contingency plans for the event of a natural disaster, policy-makers need rigorous methods to assess their likely impact. "When extreme events involve extreme direct and indirect social and economic impacts leading to a severe disruption of the normal, routine functioning of the affected society, they contribute to the occurrence of a 'disaster',"
says Professor Schwarze. Natural disasters can also have knock-on effects felt well beyond the area directly affected, as the inter-connected nature of the global marketplace means loss of demand in one area can have an impact on jobs and prosperity in another.
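Professor Schwarze's direct/indirect distinction can be made concrete with the expected-annual-damage style of calculation commonly used in flood risk assessment. A minimal sketch in Python, with invented event probabilities and losses rather than ConHaz figures:

```python
# Sketch of an expected annual damage (EAD) estimate for a flood-prone area.
# Probabilities and cost figures are hypothetical; the point is that indirect
# losses (trade, tourism) must be counted alongside direct clean-up costs.

# (annual exceedance probability, direct damage, indirect damage) in million EUR
flood_scenarios = [
    (0.10, 5.0, 2.0),    # minor flood: 1-in-10-year event
    (0.01, 80.0, 45.0),  # major flood: 1-in-100-year event
]

direct_ead = sum(p * direct for p, direct, _ in flood_scenarios)
total_ead = sum(p * (direct + indirect) for p, direct, indirect in flood_scenarios)

print(f"Expected annual damage, direct only:    {direct_ead:.2f} M EUR")
print(f"Expected annual damage, incl. indirect: {total_ead:.2f} M EUR")
```

With these numbers the direct-only estimate (1.30 M EUR per year) understates the expected burden (1.95 M EUR per year) by a third – exactly the kind of gap a direct-cost-only assessment method would never reveal.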
Preventative measures
This would seem to add up to a powerful argument for governments to take preventative measures to reduce the impact of a natural disaster, but they must be cost-effective if they are to be politically acceptable. Rigorous cost assessment has a part to play in guiding decision-making bodies towards better choices. "It's important to provide economic tools to people in key positions to support their decision-making on how to respond to natural disasters and also in terms of prevention – this will feed in to the policy development process. This holds clear relevance for the management of natural hazards and climate change adaptation planning," stresses Professor Schwarze. The evolution of the climate remains the subject of important future research and there is always likely to be some level of uncertainty. However, this is not to say that governments are powerless in the face of increasing natural hazards, and the ConHaz project has established strong links with policy-makers across Europe. "We attended an international workshop organised by the UN-ISDR and the Council of Europe, which was held in Brussels on 8-9 September," says Professor Schwarze. "We have also been at several other events, including the WG-F Ghent workshop on Floods & Economics in October 2010, organised by the Flemish Environment Agency, together with Flanders Hydraulics Research and several other bodies."
National borders
Natural hazards often cross national borders, so it is important to ensure different administrative bodies are able to work together effectively. This goes for efforts to assess costs as well, further reinforcing the importance of ConHaz's close engagement with European stakeholders; beyond this kind of continued collaboration, Professor Schwarze says there is still room for further improvement in our understanding of cost assessment methodology. "We need to consider the dynamics of risk and risk drivers, and also improve our understanding of the distribution of costs and risks. There is also a clear need for improved data sources and in some cases methods, but we need to keep in mind the efforts this will involve," he says. There is significant scope for further research into the risks posed by natural hazards, and Professor Schwarze says new projects have been established to look at some of the most pressing questions in the field. "We are involved in or starting new social science related natural hazards projects, such as Caphaznet (www.caphaz-net.org) and emBRACE (www.embrace-eu.org), which will be conducted at UFZ, the Centre for Environmental Research based in Leipzig," he says.

The ConHaz structure

At a glance
Full Project Title
Costs of Natural Hazards (ConHaz)
Project Objectives
• Compilation of state-of-the-art methods for cost assessment as used in European case studies.
• Assessing these methods in terms of terminology, data quality and availability, and research gaps.
• Synthesis and recommendations for future research.
Project Funding
€382,192.70
Project Partners
Consortium: 8 partners from 7 countries
Contact Details
Project Coordinator, Professor Reimund Schwarze
Helmholtz Centre for Environmental Research – UFZ, Germany
T: +49 341 235 1607
E: reimund.schwarze@ufz.de
W: www.conhaz.org (ConHaz reports and policy briefs downloadable)

Professor Reimund Schwarze
Project Coordinator
Reimund Schwarze is Professor of Environmental Economics at the University of Frankfurt/Oder and a Senior Researcher at the Helmholtz Centre for Environmental Research (UFZ) in Leipzig. He has authored books and journal papers on natural hazards economics, and is scientific advisor to the German Committee for Disaster Prevention of UN-ISDR.
Creating a Sustainable Future
One of Europe's leading authorities on the subject of sustainable production and consumption, Dr Gerd Scholl talks to EU Research about the CORPUS project, the issue of sustainable consumption and bridging the gap between policy and research
Sustainability is an issue that is becoming increasingly important as our natural resources become depleted. The rate at which we consume is increasing year on year and if we are to continue our current lifestyles, we need to seriously consider alternative methods of production and consumption. Dr Gerd Scholl of the Institute for Ecological Economy Research (IÖW), Berlin, spoke to EU Research on the subject of sustainable consumption and the CORPUS project that he is currently a part of. Graduating as an economist in 1993, Dr Scholl began working with the IÖW that same year. With a background in German national and EU-wide projects that looked at product-related environmental policies, he has also worked on eco-labelling and life cycle assessment. From here, he has progressed to research into sustainable product service systems, such as sharing, renting, and leasing. Dr Scholl's interest in sustainable consumption grew from a belief that sustainable development could not be achieved without altering consumer lifestyles, especially in more economically developed countries. "Efficiency gains at the product or service level are often offset by an increase in demand," Dr Scholl explains. "This clearly shows that technology cannot be the sole cure to consumption. We need a more accurate
picture of what we consume, and why, in order to successfully pave the way for a sustainable consumer culture. The idea is to create a culture in which we consume less natural resources and differently.” In order to achieve this, Dr Scholl believes that there needs to be a broad societal transformation, addressing businesses, policy, and civil society alike. “It will require a wide range of disciplines and trans-disciplinary integration to come up with appropriate strategies. That’s the great thing about research into sustainable consumption; it is informed by diverse scientific communities and often geared towards a practical solution.” It is this philosophy that underpins the CORPUS project and makes it such an important tool in the area of sustainable consumption and development. The CORPUS project is one that hopes to tackle the issue of consumption from the point of view of diverse scientific communities, and by connecting these communities to a wider audience. It aims to develop novel ways of brokering knowledge between policy-making and research. The CORPUS acronym stands for: Enhancing Connectivity Between Research and Policymaking in Sustainable Consumption. The idea is to utilise a number of methods in order to explore the issues of sustainability from both the side of researchers and from the side of policy-
makers. The project is made up of a consortium of 11 European organisations, all working together to establish a centralised hub of knowledge that will allow easier discourse on the subject of sustainable consumption. "CORPUS was inspired by the need to improve the transfer of knowledge between research and policy with the aim to support evidence-based policy making for sustainable consumption," Dr Scholl told EU Research. "For more than two decades researchers from all over Europe, and of different disciplinary backgrounds, have analysed barriers to and drivers for more sustainable consumer behaviours. There is a huge body of evidence out there which is, however, under-utilised in policy-making for various reasons: different mindsets and professional cultures, diverging timelines, mismatch of agendas, and others too." This lack of connectivity between communities is all the more problematic as sustainable consumption has gradually climbed political agendas. According to Dr Scholl, the overriding objective of CORPUS is to experiment with and develop new integrative modalities of knowledge brokerage at the policy-science interface. He sets out three avenues that will ultimately lead toward this objective:
• Improving the understanding of the knowledge interface between research
and policy-making and developing appropriate and transferable methodologies and tools for knowledge brokerage in sustainable development
• Fostering evidence-based policy-making in sustainable consumption policies (on food, mobility, and housing) at European and national level and strengthening the policy-orientation of relevant research communities through the development and implementation of online and offline knowledge brokerage mechanisms
• Stimulating community building across the involved research and policy-making communities in order to trigger a self-sustaining process of effective knowledge management in sustainable consumption policies
Essentially, the CORPUS project hopes to bridge the gap between researchers and policy makers in an attempt to spread the knowledge of sustainable consumption to a much wider audience. "CORPUS primarily addresses researchers and policy-makers," Dr Scholl explains. "The general public might benefit indirectly from CORPUS in that policies that foster more environmentally and socially benign consumer behaviour are more effectively tailored to people's needs, opportunities, and abilities. By improving the science-policy interface, research evidence will be better exploited and thus will have more of an impact." There are two primary methods that CORPUS employs to bridge the gap between policy and research. The CORPUS web platform – the "Sustainable Consumption and Production (SCP) Knowledge Hub" – provides access to a vast array of pre-existing data and will become a focal point for the uploading of future articles and data. The CORPUS workshops provide researchers and policy makers with the opportunity to enter into personal dialogues and facilitate opportunities for networking. The web platform and workshops form knowledge networks and communities of practice, which will nurture the transfer of knowledge. "Knowledge exchange will come about naturally to the extent that researchers and policy makers personally
www.euresearcher.com
connect within and across their professional communities," Dr Scholl tells us. What makes the CORPUS project unique is that, at present, it is the only knowledge repository in the area of sustainable consumption and production. It is also the only project which offers a diverse means to enhance connectivity between research and policy on sustainable consumption, particularly in the areas of food, mobility, and housing. The results of the CORPUS project will be processed in a way that will potentially benefit an even wider audience. Dr Scholl explains: "A policy brief will summarise the main results of the research and provide recommendations on proper knowledge brokerage for sustainable development policies." While this brochure is primarily aimed at policy consultants and policy practitioners, it will also be available for, and useful to, scientists in applied research. "It will offer advice on how to communicate scientific evidence more effectively." At the time of writing, the CORPUS project is reaching its halfway point. So far, more than 700 experts in the field of sustainable consumption have registered on the web platform and are benefiting from access to the wealth of data available online. The workshops have also attracted numerous professionals and are proving to be an effective means of encouraging networking between disciplines. The project's aims are by no means fully realised yet, but Dr Scholl is confident about CORPUS' goals. "Community building is a challenging task which takes time," he told us. "Inter-community collaboration can only be achieved when benefits are obvious for both sides and can easily be realised. Given this challenge the project's running time of three years can be regarded as a rather short one." While the CORPUS project is only funded by FP7 for three years, it aims to continue its mission past this point. "We hope to continue maintaining the web platform after the project has ended," Dr Scholl says. The overall desire is for CORPUS to evolve into an established institution for knowledge brokerage on sustainable consumption and production.
At a glance
Full Project Title
CORPUS – Enhancing Connectivity Between Research and Policymaking in Sustainable Consumption
Project Objectives
The overriding objective of CORPUS is to experiment with and develop new integrative modalities of knowledge brokerage at the policy-science interface.
Project Funding
€1.5 million
Contact Details
Project Coordinator, Dr Gerd Scholl
Institute for Ecological Economy Research (IÖW)
Potsdamer Str. 105, 10785 Berlin, Germany
T: +49 30 884594 2
E: gerd.scholl@ioew.de
W: www.scp-knowledge.eu
Dr Gerd Scholl
Project Coordinator
Gerd Scholl graduated as an economist in 1993 and began working with the IÖW that same year. He started with national and EU projects on product-related environmental policies, with a focus on eco-labelling and life cycle assessment, before moving on to comprehensive research into sustainable product service systems, such as sharing, renting, and leasing, in various projects. Innovative marketing strategies for such concepts were the topic of his dissertation, finalised in 2008. His more recent research interests include consumers' perceptions of new technologies (e.g. nanotechnology).
DERREG: Globalisation and the Development of Rural Europe
Globalisation presents challenges and opportunities for the development of rural regions in Europe. DERREG has examined how globalisation is impacting on them and the effectiveness of regional responses, identifying ways in which regional development programmes can make a difference
The impact that globalisation has had on our way of life is probably not something that many of us think about, but we will have experienced it nonetheless; walk down any supermarket aisle and there are numerous goods on offer that have travelled across continents to get to our shelves. What many of us will not consider is the effect that globalisation has on the smaller, usually traditional businesses that are located in rural regions throughout Europe. Developing Europe's Rural Regions in the Era of Globalisation (DERREG) is a project which seeks to address the issue of globalisation and the effect this has had upon rural regions across Europe; Professor Michael Woods, whose previous work has been concerned with changes in rural areas and the political dimensions associated with this change, spoke to EU Research about the DERREG project and what it hopes to accomplish. The DERREG project grew from Prof. Woods' interest in the subject and from discussions with colleagues throughout Europe. "A group of us came together at a conference in Leipzig and decided to embark upon a research project on this theme," Prof. Woods explains. "What we are trying to achieve through this is two things. Firstly, addressing the bias which there has been in academic literature towards looking at globalisation only in terms of urban areas, the idea of the 'Global City'; this tends to neglect what is
happening in rural areas and suggests that they are perhaps not affected as much by globalisation, or that any effect is an indirect result of what is happening in urban centres. We want to redress that balance." The second aim of the DERREG project is to challenge the idea that rural areas are the inevitable victims of globalisation. Prof. Woods tells us: "We wanted to highlight what regional development practitioners and agencies working in rural regions could do or have done to take advantage of the opportunities created by globalisation. We wanted to see how they responded positively to any challenges arising from that." DERREG wants to understand what actually happens in and around rural regions and how the various aspects of globalisation are having an impact there. Following this, the project asks questions about how the regions are responding and what action can be involved in this process. "We are trying to understand the mechanics so we can then develop an interpretive model explaining how globalisation works within rural regions," Prof. Woods tells us. "We hope that this model will be pivotal in the creation of a rural development strategy, identifying where policy intervention should be targeted to have an effect." One of the key messages that underpins the project is that there is no single common experience throughout the rural areas;
different regions experience globalisation in different ways, and their responses also differ. "Some of the biggest challenges faced are those of trade barriers being reduced," Prof. Woods says. "Markets are opened up to competition and small rural businesses then have to compete with increased international imports." The DERREG team has noticed that in some cases production has relocated to cheaper locations; traditional industries, such as textiles and factory work, have encountered closures which have a detrimental effect on the rural regions' economy. "We've also seen an impact from migration, including depopulation – particularly in areas such as Central and Eastern Europe – with people leaving rural regions not just to move to national cities, but also emigrating to the UK, Ireland or Scandinavia to become migrant workers." The project has also looked at areas that have been affected by in-migration. Some regions considered to have attractive landscapes or attractive ways of life have become increasingly popular for many Europeans. "Some areas of Slovenia that we studied have seen an increase in migration of British, German, and Dutch people either buying holiday homes or moving permanently to the area. There are challenges that then arise from that mixed population; globalisation can disrupt the old settled social and economic structure of these regions."
“One thing that was very important to us was recognising that there was not just one single experience of globalisation; we wanted to look specifically at different regions and settings to get an overall impression of globalisation’s effects”
The DERREG Structure
DERREG explores four key themes in the development of its interpretive model. The first theme looks at the engagement of rural businesses in international networks; using a survey of rural businesses within five case study regions, the team mapped the extent to which these businesses were selling products in other countries, whether they were collaborating with firms internationally, and whether they had suitable access to finance capital. "We're mapping out the degree of internationalisation of those small and medium sized enterprises in rural regions, but also looking at the extent to which this becomes embedded within the local economy." The project has encountered examples of companies who are utilising regional products and resources to create an export industry. "In northern Sweden, there is a company harvesting wild berries and exporting them internationally," Prof. Woods tells us. "It's looking at that combination of using regional resources but also accessing an international market and international resources and networks." The second theme focuses on international migration and increased mobility across borders; perhaps the most obvious example of this is the influx of migrant workers into different countries. "We examined areas of Sweden and Ireland and looked at the net influx of migrant workers and how they contributed to the local economy and how they were socially integrated," Prof. Woods explains. "We also looked at the impact of out-migration and return migration, particularly in a case study region of Lithuania. We wanted to see how that might be harnessed and used to
contribute to the local economy." The project also looked at 'amenity migrants': those who are moving in order to gain a better quality of life; this included people who are investing in holiday homes, or are making a permanent move to these regions, and again looking at the social integration and economic impact this has. The final aspect of the second theme was to look at smaller-scale cross-border migration. "We have an example where people are moving from Luxembourg into villages in Germany because property prices are cheaper and there is more land available for building. All of these examples of migration beg the question of what the impact is on local communities, and the extent to which there are opportunities to be harnessed in terms of economic development." DERREG's third theme involves looking at sustainable development and environmental capital. One aspect of globalisation that has become more prevalent in recent years is a growing concern with issues such as climate change, global warming, and biodiversity among others, and DERREG is exploring the effect that these issues have upon regional development. "We are seeing a clear emphasis in policy documents on renewable energy developments," Prof. Woods tells us. "Similarly, developments such as sustainable agriculture and eco-tourism are being encouraged and fit in with the global discourse responding to environmental change." The DERREG team investigated this by asking about what opportunities this opened for rural regions. "There are some major opportunities," he tells us. "In one of our case study regions in Northern Sweden, it was announced recently that Facebook plans to house its servers there
because it can access cheap plentiful hydroelectricity to provide its energy needs.” However, such developments can have a negative impact on more traditional industries or traditional land management practices. “In east Germany, where the traditional coal mining industry is being phased out by regional policy makers with an emphasis on renewable energy and wind farms, we found a response from the local population which was very much opposed to wind farm development. They wanted to keep the coal mines and power station open, despite the pollution, because they were the traditional sources of employment.” The fourth theme of the DERREG project encapsulates the previous three; rather than looking at the different dimensions of globalisation, the fourth theme considers the response to globalisation. As Prof. Woods explains: “We’re looking here in terms of how we build regional capacities to respond to globalisation. We’ve taken on board an idea called ‘regional learning’, where we look at how the local population, local civil society groups and businesses can be brought together to work in conjunction with universities and research institutes and reflect upon their region and how it has been affected by globalisation and their future aspirations. They can consider what they think their identity should be and from that develop grass roots initiatives for rural development.” An example of this can be seen in the Netherlands, where local universities have become involved in a project concerned with engaging local people to think about where they wanted the region to go in the future; this has led to the initiation of numerous projects looking at work in
tourism and ways of harnessing natural resources in terms of biomass energy among other things.

The DERREG Interpretative Model: Globalisation processes combine with regional contexts and capacities to create impacts in rural regions, but regional development responses are important in determining long-term outcomes.

At a glance
Full Project Title
Developing Europe's Rural Regions in the Era of Globalisation (DERREG)
Project Objectives
To produce an interpretative model that will enable regional development actors to better anticipate and respond to the key challenges for rural regions arising from globalisation.
Project Partners
Aberystwyth University, UK • Leibniz-Institut für Länderkunde, Germany • Mendel University in Brno, Czech Republic • National University of Ireland, Galway • NeVork Research, Slovenia • Nordregio, Sweden • Universität des Saarlandes, Germany • University of Ljubljana, Slovenia • Wageningen University, Netherlands
Contact Details
Project Coordinator, Professor Michael Woods
Institute of Geography and Earth Sciences, Aberystwyth University, Aberystwyth SY23 3DB, United Kingdom
T: +44 1970 622 589
F: +44 1970 622 659
E: m.woods@aber.ac.uk
W: www.derreg.eu

Professor Michael Woods
Project Coordinator
Michael Woods is Professor of Human Geography at Aberystwyth University, and works on issues of rural politics and governance, rural development, and the engagement of rural localities with globalisation. He is editor of the Journal of Rural Studies, and author of several books including Rural (Routledge, 2011), Rural Geography (Sage, 2005) and Contesting Rurality (Ashgate, 2005).
Good Practice
The findings of the DERREG project are being presented in a variety of ways. In 2011, a workshop was held in the European Parliament with members of parliament, individuals from the European Commission and other organisations in attendance; this workshop allowed the DERREG team to relay their findings from the research and discuss policy implications associated with the work. A conference was also held in Slovenia with people actively involved in regional and rural developments across Eastern Europe. As Prof. Woods explains: "We presented our findings, but also held discussions on how the practical implications of this may be taken forward. We did this because the initial call we responded to had a particular focus on assisting regions in Central and Eastern Europe." The primary way that the DERREG project has disseminated its research findings is through the project's website. The site hosts an online resource centre providing access to the various papers being published as part of the project, but it also includes documents and information aimed at non-academic users. "There are two key elements to the website," Prof. Woods tells us. "The first is a database of examples of good practice, which currently holds over one hundred different examples and initiatives from the case study regions used throughout the project. Some of these are examples of exporting businesses, some are examples of initiatives where there has been a successful integration of migrant communities; there is a wide range of searchable examples of good practice, broadly disseminated so that others can look at them and perhaps even utilise
them in their own local communities." The second element of the website is a series of short films from each case study region, which again highlight examples of good practice; as well as being available via the DERREG project website these films are also available on YouTube. One of the key benefits of the DERREG project has been its scope; it is a Europe-wide project involving nine partner institutions. "You really get a synergy from European projects such as this," Prof. Woods says. "It allows you to get a really good geographic coverage over Europe. One thing that was very important to us was recognising that there was not just one single experience of globalisation; we wanted to look specifically at different regions and settings to get an overall impression of globalisation's effects. To that end, working with local partners has been crucial." Another key benefit has been the multidisciplinary approach to the project; as Prof. Woods explains: "It has brought together very different disciplines, from geography, to sociology, to planning and economics. We've had different insights throughout the research process, and through discussions regarding the findings." Perhaps the most important message to come out of the DERREG project is the realisation that different regions respond to globalisation in different ways, and are affected differently also. "There is no one-size-fits-all solution," Prof. Woods says. "We need to recognise these differences; policies and other interventions need to be aimed at allowing regions to respond in a way that is either the best or most appropriate to them. What we also need to realise is that these rural regions are not just victims of globalisation; they are able to integrate themselves successfully within global networks."
A leading authority on plant ecology, with multiple publications in high elevation tree research, and author of Alpine Plant Life, the only textbook in this field worldwide, Professor Christian Körner talks to EU Research about his latest project
Seeing the Forest through the Trees
TREELIM, a five-year research project funded by one of the ERC's highly competitive Advanced Grants, continues Professor Körner's previous work establishing the upper tree limit of European tree species, and aims to explain the upper limit of common deciduous forest trees in Europe. TREELIM aims to answer specific questions regarding the climatic limits of European broad-leaved taxa, the most fundamental being what causes cold
climate limits in these common European tree species. “Trees like oak, beech, maple, cherry form the forest,” Professor Körner explains. “None of these species is reaching the treeline, because none of them evolved genotypes that can survive at the treeline climate.” The TREELIM research is based on a large-scale and comparative approach, a philosophy of Professor Körner’s that has underpinned his previous work.
“Studying one species in one area will limit your understanding because every species is different.” TREELIM uses techniques and avenues of investigation which are applied to a wide range of species over a large area. “We are comparing tree positions and climate conditions in northern Europe at the latitudinal climatic limit, and compare this with the elevational limit in the Alps. We utilize elevational gradients along
which one species after another finds its limit. Only the treeline species are left to go even further." Based in Switzerland, the team conducts the majority of the research in the Swiss Alps, with parallel sites in Sweden enabling a Europe-wide approach.
Five Branches of the TREELIM Project
One of the initial tasks undertaken by TREELIM was to investigate existing databases to ascertain where certain tree species occur. Utilising Geographical Information Systems to gain access to a vast amount of data allows TREELIM to compare the distribution of tree species, looking for correlations and boundaries in where certain species grow. Lead researcher in this area Christophe Randin has discovered that the northern and elevational limits of these species correlate in terms of temperature. As Professor Körner states: "The ranking of species – which goes highest, and which goes least high in the Alps – corresponds to which goes northernmost and less far to the north. It seems like most of the species in the north are actually tracking the climate; the data we have so far indicates they are tracking in both elevation and latitude." Much of this data has been gathered using meteorological stations not localised to the areas where these species grow. In order to gather more precise data, Chris Kollas and Christophe Randin spread hundreds of single-channel temperature sensors within the tree canopy, rooting zone, and understory where the various species are found furthest north and at their highest elevations. "You need two years to arrive at a data set, and over that time you're only measuring. Once the data has been gathered, you have to distil it and we are right in that place now, trying to find patterns and analysing these microclimatology patterns. This is the second branch of activities."
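The ranking comparison Randin's analysis points to – does the species that climbs highest in the Alps also reach furthest north? – can be sketched as a Spearman rank correlation. The species limits below are invented placeholders, not TREELIM data:

```python
# Do elevational limits in the Alps track latitudinal (northern) limits?
# The limit values below are invented for illustration, not project data.

species_limits = {
    # species: (upper elevational limit in the Alps [m], northern limit [deg N])
    "beech":  (1500, 58.0),
    "oak":    (1400, 61.0),
    "maple":  (1700, 62.5),
    "cherry": (1300, 57.0),
}

def ranks(values):
    """Rank of each value in its list (0 = lowest), assuming no ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    out = [0] * len(values)
    for r, i in enumerate(order):
        out[i] = r
    return out

elev_ranks = ranks([v[0] for v in species_limits.values()])
lat_ranks = ranks([v[1] for v in species_limits.values()])

# Spearman rank correlation (no ties): 1.0 means identical rankings.
n = len(elev_ranks)
d_squared = sum((a - b) ** 2 for a, b in zip(elev_ranks, lat_ranks))
rho = 1 - 6 * d_squared / (n * (n ** 2 - 1))
print(f"Spearman rank correlation: {rho:.2f}")
```

A value close to 1 would support the "tracking in both elevation and latitude" picture described above; the project's actual analysis of course rests on measured limits and temperatures rather than toy numbers.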
Micro-climatological data analysis will enable TREELIM to gain an in-depth insight into the types of extreme temperatures that these species are subjected to within their ecosystem; a task never attempted in this form before. "We will discover just how cold the tip of the tree gets in cold winter nights, or in spring nights. It is like a fingerprint of the actual environmental conditions that these trees live in." The third branch explores temperature limits of these tree species, analysing freezing resistance, an undertaking by Günter Hoch, Armando Lenz and Yann Vitasse that requires the freezing of thousands of fresh plant samples. TREELIM has developed a precise piece of machinery to aid data collection; the new purpose-built freezing lab where samples are stored is computer-controlled, enabling steady temperature decline across seven different freezing chambers. Each specimen can be tested in parallel through different freezing programs.
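The methodological point behind these programs – samples must be held at each target temperature long enough for tissues to respond, as Professor Körner explains next – can be illustrated with a simple ramp-and-hold setpoint schedule. The cooling rate, targets and hold times below are invented, not the lab's actual protocol:

```python
# Sketch of a ramp-and-hold freezing program of the kind described in the text.
# Cooling rate, target temperatures and hold durations are invented examples.

def freezing_program(start_c, targets_c, ramp_c_per_h, hold_h):
    """Return (time_h, setpoint_c) breakpoints for a stepwise ramp-and-hold run."""
    t, temp = 0.0, start_c
    schedule = [(t, temp)]
    for target in targets_c:
        t += abs(temp - target) / ramp_c_per_h  # time spent ramping to the target
        temp = target
        schedule.append((t, temp))
        t += hold_h                             # hold so tissues have time to respond
        schedule.append((t, temp))
    return schedule

# e.g. cool from +5 C through three test temperatures at 5 K/h with 1 h holds
for time_h, setpoint in freezing_program(5.0, [-5.0, -15.0, -25.0], 5.0, 1.0):
    print(f"t = {time_h:4.1f} h  setpoint = {setpoint:6.1f} C")
```

Running several chambers with different target lists in parallel is what allows each specimen to be tested against a different program at the same time.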
“This has not been done before. Normally, researchers just place specimens in deep freeze and remove them at various stages during the cooling process; the flaw with this strategy is that you need to hold the samples at certain temperatures to ensure that the tissues have had time to respond.” The fourth branch looks at the dendrology of the tree species. Studying core samples, and comparing them to climate data obtained from weather stations, Armando Lenz is able to gauge the optimum conditions under which new tissue growth is encouraged. “We can correlate the actual site climate to data collected from weather stations, whose records can span anywhere between 60 and 80 years,” Professor Körner tells us. In identifying certain growth restraints in this way, it is possible to further explain the geographical spread of certain tree species, especially those of the treeline. These field studies are complemented with a series of controlled climate experiments in growth chambers, in order to decipher mechanisms of growth at low temperatures. The final branch of the TREELIM project is perhaps the most laborious and certainly the largest scale. Using seeds collected from species at their upper limit, the team studied their quality and germinated them. “We had 5,000 seedlings which we then moved to transplant gardens along elevational gradients in the Swiss Alps.” The task of moving close to 20 tonnes of plant material to each of the four gardens in the Canton of Valais, near Martigny, and another four near Chur to the east of the Alps, required a great deal of organisation and extra help which was offered by the Swiss Army. Using reciprocal transplantation, Professor Körner and his team exchanged seedlings from eight gardens in the eastern and western Alps at all four elevations. “We had geographical replication and vertical replication,”
Professor Körner explains. "We used eight different species, and each example of a species had a different mother tree so we can also explore micro-evolutionary aspects." This stage of the project required twenty field assistants and five academics. The harvesting that took place in September 2011 of ca. 5,000 saplings was a huge undertaking, but integral to the project outcome. "We want to find out the phenology of the individual specimens and look for genetic differences between provenances from high and low elevations, work led by Yann Vitasse." The project will examine the rate of growth, the quality of tissue, the proportions between leaf and stem tissue, and the effects that colder conditions have upon different species. "It seems so far that these seedlings do well even at the highest elevated gardens. They are obviously smaller, but they are healthy; our current conclusion is that the distributional limit of these species allows them to grow vigorously and that high elevation populations are adapted to grow and develop under cool conditions." As the project stands, findings indicate that temperature does not necessarily limit the growth and spread of the trees and, as global climate change increases the planet's temperature, the trees should naturally begin to move upslope. "We already have evidence that they are doing this," Professor Körner tells us.
The Future of TREELIM The TREELIM project is pioneering in terms of scale, and incorporates a multidisciplinary approach, engendered from the desire to ensure that its research can be taken up by future generations of researchers. Professor Körner states: “The ERC are particularly interested in projects which make use of certain types of science and methodologies that can be handed
over to the next generation. For instance, freezing resistance is an area that has not seen much in the way of research in the last 20 to 30 years, and the younger biologists have had little in the way of training in this field. We are passing on knowledge, method, and expertise. I think the team benefits from the multidisciplinary approach we have adopted. The use of GIS, microclimatology, wood science, stress physiology and evolutionary biology gives the project an incredibly broad spectrum." For Professor Körner, TREELIM is asking questions that can only be solved in a comparative, large scale approach, which can account for variations. "It is the art of asking small questions that will push ecology further, and by that I mean using tools that have a strong, indicative value but which don't tie you up with tonnes of equipment and time consuming measurements. We are trying to use signals from nature such as stable carbon isotopes or simple growth measures that can be measured in a large collection of species." One of the overriding tenets of the TREELIM project is the importance of using nature to house the experiment rather than creating an artificial environment in which to house the experiments and replicate natural conditions. "The data collected has more validity than a lab setting could allow."

At a glance
Full Project Title
A functional explanation of low temperature tree species limits
Project Objectives
To decipher the biological causes of deciduous trees' distributional limits in a multi-disciplinary approach including biogeography, climatology, ecophysiology, reproductive ecology and dendrology.
Project Collaborators
Christophe Randin, Günter Hoch, Yann Vitasse, Chris Kollas and Armando Lenz (University of Basel) and Niklaus Zimmermann (WSL, Birmensdorf near Zurich).
Contact Details
Project Coordinator, Professor Christian Körner
Institute of Botany, University of Basel
Schönbeinstrasse 6, 4056 Basel, Switzerland
T: +41 (0)61 267 35 10
F: +41 (0)61 267 35 04
E: ch.koerner@unibas.ch
W: http://pages.unibas.ch/botschoen/treelim/

Professor Christian Körner
Project Coordinator
Christian Körner is Professor of plant ecology at the University of Basel, Switzerland. His research fields are alpine ecology (his textbook ‘Alpine Plant Life’ with Springer became a major reference), the explanation of climatic high-elevation treelines worldwide, and the effects of climate change and atmospheric CO2 enrichment on natural vegetation. He chairs the Global Mountain Biodiversity Assessment of DIVERSITAS. He belongs to ISI’s 1% group of highly cited authors.
The capture and storage of CO2 could significantly reduce Europe's carbon emissions, yet major barriers remain before the technology is widely adopted. We spoke to Bai Song, coordinator of the CACHET-II project, about their work in developing a cost-effective technology capable of producing clean power from both coal and natural gas
Using palladium membranes to produce clean power
The goal of reducing carbon emissions is widely shared, leading to an intense research focus on how carbon capture technology can be used to mitigate the environmental impact of industry. However, while some CO2 capture technology is currently available, the high capital costs involved and low energy efficiency represent significant barriers to its deployment; these are issues the CACHET-II project is working to address. "The objective of the project is to develop a technology that could produce clean power from natural gas or coal at a lower cost than is currently possible," says Bai Song, a process engineer at BP, one of the prime movers behind CACHET-II. The project brings together academic and industrial partners to develop palladium hydrogen-selective membranes, a pre-combustion carbon capture concept, which will separate hydrogen from CO2. "We take natural gas and coal and convert them into synthesised gas, which is a mixture of carbon monoxide and hydrogen," explains Ms Song. "It then undergoes a water-gas-shift catalytic reaction, where steam is added to the synthesised gas mixture and converts it into a CO2-hydrogen mixture." (The two reaction steps are written out below.)

Efficient separation
The next task is to separate the CO2 from the hydrogen. Most technologies selectively pick up CO2, leading to a loss of pressure which has to be raised again at a later stage; Ms Song says the CACHET-II project is following a different approach. "The beauty of the palladium hydrogen membrane technology is that we can separate hydrogen instead of taking out CO2. So we can keep CO2 at high pressure and save a lot of power, while in the process we can actually afford to lose some pressure across the membrane on hydrogen and use it to drive the hydrogen separation," she outlines. The technology will allow researchers to
capture carbon and produce hydrogen simultaneously. "Once you take out the hydrogen from the synthesised gas stream, the hydrogen is at a good level of purity and ready to be burned as a fuel to generate clean power. Meanwhile the remaining CO2 could be processed a bit more to reach the specification required for storage in geological formations. So the separation process basically creates the two products that you want at the end," explains Ms Song. "The hydrogen produced goes to a turbine to generate power, while the CO2 will be stored underground."

High pressure, high temperature membrane testing rig at ECN, Netherlands.

The membrane technology is constructed from a porous ceramic tube, which acts as the membrane support, with a layer of palladium metal deposited on its top. Hydrogen is the only molecule that is able to pass through the palladium layer, which is otherwise impermeable to normal gases; such high selectivity towards hydrogen is enormously relevant for carbon capture
application, which contributes to a near 100 per cent CO2 capture rate that most competing technologies cannot achieve. The CACHET-II technology could be used on several different sources of energy, but there are economic and technical constraints on the way it can be applied. "There are several ways in which the material could be used – you can either use pure palladium, which is not very sulphur resistant but can be suitably deployed to generate clean power from natural gas. Or you can develop advanced palladium alloy membranes, which combine palladium with other metals, such as silver, gold, copper and other noble metals. These alloy membranes are currently under development within CACHET-II to improve their levels of resistance to sulphur in order for them to become exploitable in clean power generation from coal," says Ms Song. Pure palladium membranes can still be used for coal applications if the sulphur in the coal is sufficiently removed upstream, but this is likely to prove less economic, pointing to a need for more advanced membrane technology with improved sulphur resistance. CACHET-II is also dedicating its efforts to developing an advanced sulphur removal technology that integrates efficiently with its advanced membrane technology, in order to create an optimum technology package for generating clean power from coal. While coal is commonly thought of as a 'dirty' fuel, Ms Song believes it can still play a part in long-term energy provision. "As long as we can handle the impurities in the emissions using cost-effective technologies, such as the ones that CACHET-II is trying to deliver, we feel that it is feasible to generate clean power from coal," she says.
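For reference, the two conversion steps described at the start of this article can be written out explicitly for natural gas, whose main component is methane. This is the textbook chemistry – steam reforming followed by the water-gas shift – rather than a project-specific formulation:

```latex
% Steam reforming of methane to synthesis gas (CO + H2):
\mathrm{CH_4 + H_2O \;\rightleftharpoons\; CO + 3\,H_2}
% Water-gas shift: added steam converts the CO into CO2 and further hydrogen:
\mathrm{CO + H_2O \;\rightleftharpoons\; CO_2 + H_2}
```

After the shift step, essentially all the fuel value of the stream is carried by hydrogen and all the carbon sits in the CO2, which is what makes a hydrogen-selective membrane sufficient to split the two products.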
Carbon capture
Alongside its research into generating clean
power, the project also aims to improve the efficiency of carbon capture. While a 100 per cent capture rate is theoretically possible with palladium membranes, major technical challenges remain. "The CO2 we produce for storage needs to meet certain specifications. When we use a hydrogen membrane we can't achieve the exact specification all the time – so we need to further process the CO2 stream, and when we do that we will inevitably lose some CO2 and reduce the capture rate. Another important factor is the integrity of the membrane and its seals – if the membrane is not 100 per cent leak tight then some CO2 will travel to the side where hydrogen is going to, and we will lose some capture efficiency," explains Ms Song. One of the project's key targets is to achieve a sufficiently high capture rate, and from the target the researchers have calculated how leak tight the membrane seal needs to be. "The membrane module body, when deployed in industries, will be built from steel, but the membrane tubes themselves are primarily based upon ceramics, so we need to come up with a seal to connect the ceramics to the steel. The challenge with the seal is that we need to form a leak-tight metal-to-ceramic connection that can withstand high differential pressures across it," explains Ms Song. One way is through a mechanical seal, using a compression cap; alternatively there are chemical sealing methods to link ceramics to steel. Both types of seals are being developed and tested for long-term stability and reproducibility by CACHET-II.
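The arithmetic linking seal tightness to the capture rate is straightforward to sketch; all figures below are hypothetical, chosen only to show how a capture-rate target translates into an allowable leak:

```python
# Back-of-envelope link between membrane/seal leakage and the CO2 capture rate.
# All numbers are hypothetical illustrations, not CACHET-II design values.

leak_fraction = 0.005     # fraction of CO2 slipping through seals to the H2 side
processing_loss = 0.03    # fraction lost while conditioning CO2 to storage spec

capture_rate = (1 - leak_fraction) * (1 - processing_loss)
print(f"Capture rate: {capture_rate:.1%}")          # ~96.5% with these assumptions

# Inverting the relation gives the maximum tolerable leak for a target rate:
target = 0.95
max_leak = 1 - target / (1 - processing_loss)
print(f"Max leak fraction for {target:.0%} capture: {max_leak:.2%}")
```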
These membranes could eventually be used in quite large quantities within new-build power stations. "The first application will be on natural gas, because the relevant pure palladium membrane technology is at a more developed stage than the coal-related technologies. For the natural gas application, we are trying to scale up the technology to 1 metre long. The next step will be to integrate the single membrane tubes into about 5-metre long modules; we expect that size will be well suited to commercial needs," says Ms Song. Emissions reduction is a real priority across the energy sector, yet companies will still need to be convinced of the business case before they adopt carbon capture technology. The current research paints a promising picture – the project's own internal analysis suggests their palladium membrane technology could help industries reduce their carbon capture energy penalties. "We think we can achieve about 15 per cent reduction in energy penalties when we apply this capture technology in natural gas clean power generation. We also think we can achieve up to a 40 per cent reduction in energy penalties when we apply this technology to coal-based clean power generation," outlines Ms Song. "These are significant figures, but the realisation of them does rely on us delivering the palladium membrane technologies, the more efficient sulphur removal technology, and successfully scaling them up for commercial scale applications. These are the goals we are working towards."
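Hydrogen transport through dense palladium is conventionally described by Sieverts' law, which also hints at why tube count and module size matter for scale-up: the membrane area needed scales inversely with the achievable flux. A rough sizing sketch, with every number an illustrative placeholder rather than a CACHET-II design figure:

```python
# Rough membrane-area sizing from Sieverts' law:
#   flux = (permeability / thickness) * (sqrt(p_feed) - sqrt(p_permeate))
# The square-root terms arise because H2 dissociates into atoms inside the metal.
# All values are illustrative placeholders, not CACHET-II design data.

from math import pi, sqrt

permeability = 1.0e-8       # mol m^-1 s^-1 Pa^-0.5, typical order for Pd at ~400 C
thickness = 5.0e-6          # palladium layer thickness, m
p_feed, p_perm = 20e5, 2e5  # H2 partial pressures, Pa (20 bar feed, 2 bar permeate)

flux = (permeability / thickness) * (sqrt(p_feed) - sqrt(p_perm))  # mol m^-2 s^-1

h2_duty = 500.0               # hydrogen flow required by the turbine, mol/s
area = h2_duty / flux         # total membrane area needed, m^2
tube_area = pi * 0.014 * 1.0  # one 1 m tube of 14 mm diameter (hypothetical size)

print(f"Flux {flux:.2f} mol/m2/s -> {area:.0f} m2, i.e. ~{area / tube_area:.0f} tubes")
```

At these placeholder values a single 1-metre tube supplies only a tiny fraction of the duty, which is why bundling tubes into multi-metre modules is the natural next step towards commercial scale.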
At a glance
Full Project Title
Carbon Capture and Hydrogen Production with Membranes
Project Objectives
The project aims to advance the hydrogen-selective palladium membrane technology, to be applied as a pre-combustion capture technique, for clean power generation from fossil fuel.
Project Funding
€5.2 million
Project Partners
BP (UK): Project co-ordinator and industrial partner • TECHNIP (France): Membrane module design and process integration • ECN (Netherlands): Technology development on membrane, seal and advanced sulphur removal, and pure palladium membrane pilot unit operation • SINTEF (Norway): Technology development on advanced palladium alloy membrane and advanced sulphur removal • DICP (China): Scale up of pure palladium membrane, and advanced palladium alloy membrane development • IMR (China): Chemical seal development • NTUA (Greece): Natural gas based process integration and techno-economic evaluation, and project dissemination activities • PTM (Italy): Coal based process integration and techno-economic evaluation
Contact Details
Project Coordinator, Bai Song
BP Sunbury, Chertsey Road, Middlesex, TW16 7LN
T: +44 (0)1932 764176
E: bai.song@uk.bp.com
W: www.cachet2.eu
Bai Song
Project Coordinator
Bai Song graduated from Cambridge University in 2007 with a master’s degree in chemical engineering, and joined BP in the same year. With a passion for fighting climate change, she joined the BP Alternative Energy CCS team in September 2010, focusing on research and development of advanced carbon capture technologies, including the palladium membrane technology.
A Snapshot of the IOC: Their History, Their Research, Their Future
For 50 years now, the Intergovernmental Oceanographic Commission has been the UN’s body for ocean-based research, information, and services. EU Researcher’s Richard Davey explores the past, present and future of the IOC and the importance of its work
Many of us take the Earth’s oceans and seas for granted. We know that the oceans cover around 70% of its surface; we may also know that they hold 96% of the entire planet’s surface water. The issues of global warming, the melting of the ice caps and rising sea levels will also not have escaped our attention. We may know all these things, but how many of us really take note of them? How many pause and wonder just what the oceans mean to our daily lives? The Intergovernmental Oceanographic Commission (IOC) is one group whose members have dedicated their professional lives to helping unlock the mysteries still contained within the oceans, and they want to share this knowledge with the rest of the scientific community and the rest of the world. The IOC is a branch of the United Nations Educational, Scientific, and Cultural Organisation (UNESCO) and currently comprises 142 member states, including the US, UK, and China, with a maximum of 40 member states sitting on the Executive Council at any one time. For 50 years now, the IOC has worked toward expanding not only our knowledge of the oceans and all they contain, but has also looked for ways to harness the power of the sea, searched for ways to protect the vast array of life in the seas, and monitored the state of the water itself. Part of the IOC’s mission statement declares that: “The world community faces growing challenges arising from climate variability and change, marine environmental degradation and pollution, biodiversity losses, and natural hazards. How we respond to these global issues, while facing the increasingly complex challenges of sustainable development and ecosystems-based management will dominate the work of the IOC.” – IOC Resolution EC XXXIX.1 The origins of the IOC date back to an Intergovernmental Conference on Oceanographic Research that was held in Copenhagen in 1960; it was during this conference that the decision to establish an Intergovernmental Oceanographic Commission
within UNESCO was made. A year later, the first Intergovernmental Session of IOC was held at UNESCO headquarters in Paris, and the IOC found its first 40 member states. There are four main areas that provide the focus for the IOC’s work and research:
• Coordination of oceanographic research programmes
Perhaps the most significant aspect of the IOC’s work is its commitment to international oceanographic research. The IOC oversees the development, promotion, and facilitation of many research programmes every year. The research undertaken not only helps to improve our knowledge and understanding of global and regional ocean processes, but is also geared towards the sustainable development and stewardship of ocean resources. The IOC was one of a number of organisations that were affiliated with the Census of Marine Life (CoML), a decade-long study that attempted to assess the diversity, distribution, and abundance of marine life on a global scale. CoML utilised the time and knowledge of 2,700 scientists from 80 nations around the world; cumulatively spending over 9,000 days at sea on 540 expeditions, it was the first comprehensive census of its type. From the CoML project came the Ocean Biogeographic Information System (OBIS), an online, user-friendly repository that contains the CoML project findings and which can be used by scientists and the public alike.
• Global Ocean Observing System and Data Management
Another aspect of the work the IOC undertakes is the planning, establishment and co-ordination of an observation system that monitors the global oceans. Fifteen years ago, the IOC began overseeing the Global Ocean Observing System (GOOS) in order to observe, model and analyse marine and ocean variables. The data collected by GOOS can be used to monitor the state of the oceans, provide
information relating to the future conditions of the sea, and can also provide the basis for climate forecasts. It is also used by coastal nations for ocean and coastal zone management, and by researchers studying global climate change. Advances in technology have allowed the IOC to utilise groundbreaking developments within oceanographic research in order to gain a more detailed picture of the Earth’s seas and oceans. Satellites can now monitor surface temperatures, winds and the height of the sea surface, while novel sensor equipment can provide a wealth of information regarding the ocean’s subsurface. The Argo system, a global network of over 3,200 profiling floats, measures the temperature, salinity, and velocity of the ocean down to 2,000m. What the IOC and GOOS offer us is an in-depth, dynamic snapshot of the state of the world’s oceans on a regular basis; the information collected by these systems is readily available via online database storage, accessed via the programme websites. Wendy Watson-Wright, Assistant Director-General and Executive Secretary of the Intergovernmental Oceanographic Commission of UNESCO, says: “The physical, chemical and biological characteristics of the ocean are important vital signs of the planet’s well-being. But to understand indications of change it is crucial to monitor these vital signs frequently, with as fine a detail as possible, and from marine locations around the world.”
• Mitigation of Marine Natural Hazards
When the Indian Ocean tsunami struck in 2004, the IOC campaigned for the establishment of a global marine multi-hazards warning system; it wanted a system in place that would not only monitor and predict hazards, but would also issue rapid warnings and mitigation plans when hazards occurred. This system was put to the test in March 2011, when a magnitude 9.0 earthquake struck Japan. Three minutes after the quake, the Japanese Meteorological Agency issued a major tsunami warning, and within nine minutes of the earthquake, warnings and watches had been issued in places such as Hawaii and Russia. Regular bulletins released by the IOC detailed the towns and cities affected by the tsunami, along with statistics on population levels, peak water height, and how much area was flooded during the tsunami. The IOC’s dedicated Tsunami programme has been set up to assess the risk to coastal regions from tsunami hazards and is there to inform and educate officials and civilians about their dangers. An early warning system has been established using a network of seismometers and sea level measuring devices, which broadcast real-time data to national and regional warning centres. The information received by the warning centres allows authorities to assess the potential risk of a tsunami and take necessary action if it is needed. The IOC’s goal is to minimise, as far as possible, the loss of life through such natural marine hazards.
• Support to Capacity Development
The IOC’s management of GOOS, its tsunami programme, and the co-ordination of oceanic research are just a few of the tasks it undertakes in the course of its operations. On top of this, the IOC has shown a commitment to improving the education and training of others in the oceanographic field. The IOC’s Capacity Development programme is dedicated to providing international leadership for
education and training, as well as providing expert technical assistance for global ocean and coastal zone based research, and the sustainable development of the countries involved. Using a global network of scientists and research institutions, the IOC is actively addressing the critical issues surrounding the protection and sustainable development of ocean and coastal regions. One of the most critical issues facing these areas is the effect of global climate change. The IOC is involved in a number of projects that aim to enable those who live in coastal areas to adapt to climate change. One of the methods being explored in the IOC’s work is the restoration of natural coastal ecosystems. By restoring natural coastal ecosystems that are at risk from the impact of climate change, these coastal areas will not only find themselves better protected from effects such as rising sea levels or the erosion of coastland, but will also benefit in other ways. For example, these projects will aid local communities in terms of eco-tourism, and the rejuvenated vegetation will help absorb atmospheric CO2. Such initiatives are currently being implemented in the coastal regions of Africa that are deemed most at risk from global climate change.
While these four areas provide the primary focus for the work undertaken by the IOC, they are not the only aspects of the organisation. Spanning the entire network of programmes and initiatives is an overarching policy that may not seem too groundbreaking, but which is invaluable to authorities, researchers, and civilians alike. The IOC’s open access data policy means that there is no restriction on access to the wealth of data stored online as a result of the various programmes; this benefits not only researchers and scientists, who gain access to information on vast numbers of marine species and the state of the oceans, but also government agencies, which can utilise the observation networks to watch for potential hazards and which have the power to affect policy. The general public can also benefit, as they can learn more about the structure and state of our waters, prepare themselves for hazards, or even educate themselves on issues of sustainability.
The IOC Technical Committee on Data and Information Exchange (IODE) programme, established in 1961, was created by the IOC as a means of enhancing marine research, exploitation and development by providing member states with the facilities to exchange oceanographic information and data, and with the necessary tools to allow that data to be shared with ease. The biggest benefit of the IODE programme is the variety of data that can be accessed, and the fact that researchers from participating member states can draw upon data and information that suits their particular field of expertise. A global-scale study is an arduous task, but programmes such as IODE mean that information is readily at hand. Wendy Watson-Wright says of the IOC: “Much of what IOC does may not seem glamorous. It is often behind-the-scenes work such as meetings and consultations, agreements, and seminars. But at the grass-root level the main corpus of IOC are the scientists themselves, at sea and in laboratories around the globe. Through IOC researchers are able to form networks of cooperation and share ideas and resources that enable them to tackle challenges that are too big for any one research centre, one nation, or even one region.” Part of the IOC’s overall mission is to educate and inform governments and the public alike in order to combat threats faced by future generations, the biggest being rising global sea levels. If global climate change continues at the current rate, the global sea level will rise by approximately one metre by around 2100. This figure may not sound that significant, but to put it in perspective, the IOC points out that such a rise would spell the end of the Maldives. Indeed, in Europe, parts of the Netherlands, including Amsterdam, would be predominantly underwater, and in the UK Great
Yarmouth would itself become a peninsula. On top of all this, the increased volume of salt water flowing into freshwater habitats would have a devastating impact upon many species unused to such increased salinity levels. Other issues of particular interest to the IOC include the acidification of our planet’s oceans. There are global concerns about the rising level of carbon dioxide emissions into our atmosphere; many of us now try to limit our carbon footprints as much as possible. The oceans have acted as a natural buffer of sorts, limiting the effects of climate change; they have been absorbing around 25% of the CO2 emissions caused by human activity. While this has slowed the effects of global climate change, the side effect is that the continuous absorption of CO2 and heat is beginning to have a devastating effect on marine ecosystems; the pH of the oceans is falling, making them more acidic. If this acidification continues, many species will lose their sources of food, as these cannot sustain themselves in such conditions. On top of this, areas of the ocean are also becoming deoxygenated; as the temperature of the oceans increases, the level of dissolved oxygen in the water will decrease even further. The IOC is raising awareness of these issues and, by working together, the member states are in a position to effect international change in the way that we manage and develop our oceans. This is no easy task, and the IOC has a lot of hard work ahead of it. So, what’s next for the IOC? There are still many projects being undertaken, but perhaps the biggest event in the IOC calendar is happening in June 2012. Rio de Janeiro will be hosting the Rio+20 Earth Summit, and the IOC, along with the International Maritime Organisation, the Food and Agriculture Organisation of the UN, and
the United Nations Development Programme, has prepared a Blueprint for Ocean and Coastal Sustainability to present at the summit. On the release of the blueprint, Irina Bokova, the Director-General of UNESCO, said: “The role of the ocean in sustainable development is one of our core messages for Rio. […] The ocean is a source of life and the prime regulator of climate – it is also a key provider of economic and social services to humankind.” The importance of ocean and coastal issues is to be considered within the main outcomes of the conference. The blueprint is a wake-up call to the world’s nations; without action through policy, our oceans face an uncertain future. Issues such as ocean acidification and the disappearance of coral ecosystems will be discussed alongside the need to protect more of our oceans and coastlines. Further to this, the blueprint addresses depleted fish stocks, the increasing occurrence of hypoxic or “dead” zones in the ocean, and a general loss of marine biodiversity. The blueprint does not just highlight the science of the oceans – the biology, chemistry, and physics. It addresses the world’s nations and asks them to take greater notice of what is happening to the oceans around us all:
“It is clear that the ocean choices made by world governments and the agencies they support will be critical to the welfare of future generations, in supporting poverty reduction, economic growth and environmental improvement. Ocean governance gaps, institutional failures and problems in the implementation of global and regional conservation measures, as well as the need to harness the expertise of scientific institutions are likely to feature prominently on the Rio+20 agenda.”
www.ioc-unesco.org www.ioc-tsunami.org www.ioc-goos.org
Vision and visual navigation in nocturnal animals
The most basic research into how nocturnal animals see well in very dim light has led to surprising applications in digital imaging technology. We spoke to Professor Eric Warrant about his work exploring how insects and other animals see in the dark
“Most people think of insects as lower life-forms, but they’re not. They are really sophisticated and have amazing brains. The brain of an insect is divided into many sub-regions. There’s a whole area devoted entirely to vision, and this visual area alone is broken down into even more sub-structures, each of which has hundreds of thousands of neurons connected in every possible combination.”
Professor Eric Warrant has had an active interest in the study of nocturnal insects for many years; his research stems from a desire to discover how nocturnal insects see so well in an environment where there is very little light. “Nocturnal insects can see a lot better than we can,” Professor Warrant says, “and my research is not just about trying to determine the kind of visual power these insects have in dim light, but also trying to work out exactly how it is that they see so well.” The Seeing in the Dark project is not just looking at the physical properties of insect eyes, but is taking a closer look at the neural processes that occur within the brains of the insects as they navigate in dim light. “I wanted to find out what special tricks these creatures have,” says Professor Warrant. “How can they see colour? How are they able to fly through a dark forest at night without crashing into trees? How do they learn visual landmarks and recall them to find their homes in what can be a visually complicated environment, where we ourselves would have great difficulties finding our way in bright light?” When we consider the relative size of an insect, and therefore the size of their eyes and brains, the fact that they have such sophisticated optical abilities poses a number of questions about how exactly their visual systems work. It is this quest for discovery which has driven Professor Warrant’s work.

The project itself has a four-pronged approach to the study. In order to investigate just how well insects can see, and to test the extent of their visual performance, Professor Warrant will be carrying out behavioural studies. In recent studies, he and his colleagues have shown that it is possible to train a nocturnal hawk moth to associate the colour blue with food; from this, it is then possible to introduce further stimuli, such as striped patterns or different shapes, to test spatial resolution. “What we have learned from this,” Professor Warrant says, “is that they can see colours at night, whereas we and nearly all other vertebrates cannot.” Some insects have the innate ability to see very faint patterns of polarised light, which are produced around the moon; it is this ability which allows them to navigate in the dark.

The second approach is to study the neural performance of the insects with electrophysiology, using electrodes to investigate specific neurons that have been identified in the visual system. “Traditionally, we have been working mainly in the retina,” says Professor Warrant. “We have been recording from photoreceptors and looking at what specific properties of the photoreceptors suit them for life in very dim light.” Following this, the aim is to look further into the brain and investigate different circuits of neurons; the hope is to discover how these circuits improve and enhance the visual signal as it passes from the retina to the brain. “It’s not a trivial task at all to get to the bottom of this, but the higher you get in the visual system in terms of finding circuits of cells within the brain that deal with vision, the more you can learn. The hope is that we can build a picture of the strategies that these insects use neurally. I would hope that within five to ten years we will be able to do this, but whether we can actually pin down the exact circuitry and know which cells are involved and how they are connected and exactly what they do – I will be very pleased if we can do that before I retire.”

The third approach is the study of the eye itself, looking at its optics and structure. “We are trying to work out exactly how sensitive to light the optics and morphology of the eye are and determining what exactly this adds to the nocturnal visual system,” Professor Warrant says. The fourth approach is to compile a theoretical understanding of the insect nocturnal visual system, based upon the information gathered in the preceding three stages. Using behavioural, physiological, optical and anatomical results, Professor Warrant hopes to build up a complete picture of the insects’ visual system and use that knowledge to create models of nocturnal vision and visual performance. “From our previous studies on what’s going on at the level of neurons, and what’s going on at the level of behaviour, we see that there is a huge gap,” Professor Warrant says. “So far, we haven’t really nailed the exact mechanisms that the brain is using for allowing nocturnal animals to see so well in very dim light. Their behaviour vastly exceeds their apparent visual abilities at the level, at least, of the photoreceptors. This is about animals doing things which are seemingly impossible.”
While the Seeing in the Dark project is concerned with investigating the properties of nocturnal insects’ eyes, there is a lot more to the project than meets the eye. During the course of his research, Professor Warrant and his colleagues have developed an algorithm that could be developed commercially with a number of applications. “For me, the development of this algorithm was wonderful,” he says. “It was the result of pure blue-skies, curiosity-driven research.” Several years ago, car manufacturer Toyota approached Professor Warrant to offer their assistance in helping to develop the algorithm; they provided funding for five years and employed two mathematicians to help with the project. The hope is to create a colour camera that could assist with driving during the night.

The algorithm takes some of the principles that Professor Warrant has discovered in his investigation of nocturnal insects and applies them to digital images captured in very dim light, or by cameras whose very small lenses cannot capture all of the available light. It works dynamically to enhance the colours captured in the image, providing an image which is strikingly clear. “The advantage of the algorithm is that you don’t require another light source,” explains Professor Warrant. “For example, with most night-vision goggles, you illuminate the world with infrared light; this technique is based on the way animals see, and they don’t have lamps built into their heads. They’re trying to work with existing light levels, so the philosophy of the algorithm is to do the same – to adapt to light levels as an insect’s visual system would.” As existing light levels grow dimmer, the algorithm works harder, and it is able to monitor light levels itself; when light levels change again, it can either work harder or turn itself off. “It is a clever algorithm,” Professor Warrant says. “It’s like an eye really; it works in the same way.”

As a result of the development of the algorithm, Professor Warrant is one of the founders of a start-up company to promote the software. Nocturnal Vision AB, based in Sweden, has demonstrated the effectiveness of the algorithm and posted several videos on its website showcasing the images created by it. The possible applications for the algorithm are numerous. “It could be developed for use in consumer cameras for improving filming at night; one could have them in mobile phones, or for any other application that requires cameras and video, such as surveillance, military use, and microscopes among others,” Professor Warrant says.
It is difficult to imagine that the nocturnal visual abilities of insects could have such a potential impact on our everyday consumer lives. For Professor Warrant, the varied range of the project is what makes it so refreshing. “It’s a wonderful project because there’s a lot of lab work, and a lot of field work. You could find yourself in Panama, where we’ve been studying nocturnal bees, or South Africa, where we’re working on dung beetle navigation at night. We’ve even had a project in India studying a giant nocturnal bee; it’s about 4cm long and stings like a sidewinder missile. It’s not very often that bees become nocturnal, but in hot areas of the world, particularly tropical and hot arid areas, there is so much competition during the day for limited flower resources, and so many predators, that these bees have drifted into the night. What makes this bee in India so remarkable is that it has taken its daylight eyes with it; the optical design of their eyes is best suited to bright daylight, but they’re seeing incredibly well at night, which makes the demand on the neural circuitry even greater. This has really fascinated us. I just hope that one day we can solve the mystery.”
At a glance
Full Project Title
Seeing in the dark: Vision and visual behaviour in nocturnal and deep-sea animals
Project Objectives
The objective of the project is to determine the optical and neural strategies used by nocturnal and deep-sea animals to see well in very dim light
Project Funding
The Swedish Research Council (VR) (2010-2012): 700,000 SEK per year
The US Air Force Office of Scientific Research (2009-2011): 750,000 SEK
Contact Details
Project Coordinator, Professor Eric Warrant
Department of Biology, University of Lund
Sölvegatan 35, S-22362 Lund, Sweden
T: +46 46 2229341
F: +46 46 2224425
E: eric.warrant@biol.lu.se
W: www.lu.se/o.o.i.s/8213
Professor Eric Warrant
Project Coordinator
Eric Warrant is Professor of Zoology and Director of Postgraduate Studies in Biology at the University of Lund, Sweden. He is a Fellow of the Royal Danish Academy of Sciences and Letters and the Royal Physiographic Society of Lund, and Vice Chairman of the National Committee for Biology (Royal Swedish Academy of Sciences). His collaborations with Toyota to develop night vision systems for cars resulted in the formation of a new company, Nocturnal Vision AB, of which he is part owner.
Understanding High-Temperature Superconductivity in correlated materials
SUPERBAD, an ERC Starting Grant project in theoretical condensed-matter physics, identifies the cause of high-temperature superconductivity in a “cure” of the bad metallic behaviour caused by the strong electron-electron interaction. The project leader is Massimo Capone
The SUPERBAD research project, headed by principal investigator Massimo Capone, hopes to gain a better understanding of high-temperature superconductivity from the ground up by analysing theoretically and testing a broad range of superconductive materials, and also by studying the relationship between high-temperature superconductivity and bad metallic behaviour. The project started in October 2009 and is funded by the European Research Council. Focussing on the newer superconductors discovered within the last twenty-five years, the project aims to investigate how superconductivity can be seen as a “cure” for bad metallic states. The origins of the SUPERBAD project can be found in the discoveries made in the mid-1980s, when the critical temperature – the temperature at which electrical resistance in a given material becomes zero – reached never-before-seen levels thanks to the discovery of a surprising new family of materials. Before this time, only standard superconductors were known, as explained by the Bardeen, Cooper, and Schrieffer theory (BCS theory), which suggested that superconductivity could not occur above roughly 20 Kelvin. Capone explains: “Suddenly, we moved from a point where superconductivity could occur at temperatures below a few Kelvin to an entirely new world in which it could occur at temperatures close to 100 Kelvin, which makes a huge difference because that is above nitrogen’s boiling point; therefore, the cost of cooling these new materials is greatly reduced.”
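To see why 100 Kelvin was such a shock, it helps to recall the textbook weak-coupling BCS estimate for the critical temperature – standard background material, quoted here for context rather than taken from the project:

% Weak-coupling BCS estimate of the critical temperature T_c:
%   \omega_D : Debye (phonon) frequency
%   N(0)     : electronic density of states at the Fermi level
%   V        : weak attractive pairing interaction
\[
  k_B T_c \approx 1.13\,\hbar\omega_D\, e^{-1/\left(N(0)V\right)}
\]

Since the phonon energy scale ℏω_D/k_B is at most a few hundred kelvin, and the exponential factor cuts this down sharply, phonon-mediated superconductors are effectively capped at a few tens of kelvin – which is why transition temperatures near 100 Kelvin demanded a new explanation.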
The discovery of these new materials was doubly significant when considering their properties. Standard superconductors are just normal metals which have been cooled to their critical temperature, at which point their electrical resistance drops to zero. Capone explains that the new superconductors, all based on layers of copper and oxygen atoms – for example La2-xSrxCuO4 – were strange compounds: “They are insulators if taken in their stoichiometric composition. However, it was found that when carriers were doped inside them, they became metals, and these metals had the highest critical temperatures that had ever been observed. So, you have the best superconductor you’ve ever had, and it is closely related to an insulator.” This is a surprise, because insulators by their very nature are the opposite of conductive materials. Moreover, the insulating behaviour is not of the standard type; it is an effect of the interaction between the electrons, as understood by Sir Nevill Mott.

A superconducting sample levitates in a magnetic field due to the Meissner effect. The same principle is at the basis of magnetic levitation trains

At the beginning of research into superconductivity, the search was focussed on conductive materials. However, with the discoveries made around the mid-eighties, researchers had to change their way of thinking. To begin with, researchers thought that there was nothing in common between standard superconductors and the new superconductors, and soon the buzzword became “strong correlations”; as Capone explains: “This means that there are strong interactions between electrons, which are responsible not only for the insulating behaviour of the non-doped compounds, but also for superconductivity.” After 1986 it appeared as if two essentially separate families of superconductors existed. Standard superconductors, as described by BCS theory, where phonons are the mechanism of pairing, are metals when they
are not superconductors, and they have low critical temperatures. The new superconductors can often be termed “bad metals” – metals which conduct poorly when their temperature rises above the critical temperature; in these materials it has been thought that phonon interaction is largely irrelevant to superconductivity and conductivity. Capone, however, believes that the two families of superconductors can be closer than was originally thought. By looking at fullerides – superconductive compounds formed by a solid of fullerene buckyballs (C60), where phonon interaction is directly responsible for superconductivity and a Mott insulating state is present – Capone identifies these compounds as a bridge between standard and high-temperature superconductors.

“The materials with the highest critical temperatures are far from the best metals. Indeed they are closely related to insulators and they become poor conductors above the critical temperature. Why is this?”
Capone believes that a key point of the research is to find a different way to understand why certain materials have higher critical temperatures than others, and his proposal is that the key to unlocking the mystery of high-temperature superconductors lies more in “bad metal behaviour” than in the identification of the “pairing glue”. Capone says: “Bad metal behaviour is a direct consequence of strong correlations.” The essence of superconductivity is achieving within a given material the optimum flow of electrons; the electrical resistance ideally should be zero. Researchers into superconductivity are now faced with a “catch-22” situation: a high critical temperature seems associated with a “bad metallic state” with high resistance, but you cannot increase the level of correlations too much, because that would kill superconductivity. Capone explains that these new superconductive materials, which have more in common with insulators, are evidence that superconductivity effectively “cures” them: “Superconductivity is sort of a cure for the disease that leads to a bad metallic state.”

Since the project’s inception, there have been further developments within the field of superconductive materials, and these developments have also been incorporated into the project. In March 2010, a new superconductor was discovered – potassium-doped picene, an organic aromatic compound. This superconductor, similar to the fullerene family of superconductors, was included in Capone’s calculations, and his testing produced some interesting results. “This material works even more like the copper-based superconductors, even if it is an organic one.” Capone proposes that this new material provides even stronger evidence that there is a link between organic compounds, like the fullerenes, and inorganic compounds such as the copper- or iron-based superconductors.

Perhaps the most important aspect of Capone’s research is the possibility of creating a superconductive material through which electrical current flows without any dissipation at ambient temperature. Such a material could have far-reaching implications in the development of more effective systems for practical applications. Among other things, superconductors are currently used in the magnetic imaging of the human brain, as well as in the production of magnetic levitation transport, such as the JR-Maglev train system being developed in Japan. Capone states that the SUPERBAD research project is not concerned with creating new superconductors as such, but with improving the ones already available. “This is an improvement that is definitely needed.”

The overarching idea of the project is to be able to look at a given material’s structure on an atomic level, taking account of all the atomic components within it. With this structural understanding, it would then be possible to alter specific atoms within the material’s structure to increase its potential superconductive capabilities. Essentially, the ultimate aim of such research is to reach a point where superconductivity can occur at room temperature.
At a glance
Full Project Title
Understanding high-temperature superconductivity from the foundations: Superconductivity as a cure for bad metallic behaviour
Project Funding
€1 million provided by the European Research Council (ERC) through the Starting Independent Grant scheme within the IDEAS programme of FP7/ERC
Contact Details
Project Coordinator, Massimo Capone
Istituto Officina dei Materiali, Consiglio Nazionale delle Ricerche (IOM/CNR)
International School for Advanced Studies (SISSA/ISAS)
Via Bonomea 265, I-34016 Trieste, Italy
T: +39-040-3787-374
T: +39-339-6620959
E: massimo.capone@sissa.it
E: massimocapone@gmail.com
W: http://superbadproject.wordpress.com
Massimo Capone
Project Coordinator
Massimo Capone is currently a researcher at the Democritos SISSA Unit of the Istituto Officina dei Materiali of the Italian National Research Council (CNR) and Assistant Professor at SISSA. He graduated from Sapienza University in Rome and gained his PhD at SISSA in Trieste. His research activity focuses on condensed matter physics and the properties of strongly correlated fermions in high-temperature superconductors, other functional materials and cold-atom systems.
Exploring the potential of two-dimensional materials for nanoelectronics
Single layers of transition metal dichalcogenides such as MoS2 are structurally similar to graphene but cover a wide range of electrical properties: they can be semiconducting, metallic or superconducting depending on their chemical composition. Professor Andras Kis explores fundamental properties and potential applications of these new nanomaterials
The FLATRONICS project was started in September 2009 and is concerned with exploring the electrical properties of nanoscale devices and circuits based upon nanolayers. It is funded by the European Research Council under the Seventh Framework Programme. The project’s Principal Investigator is Professor Andras Kis, head of the Laboratory of Nanoscale Electronics and Structures at the School of Engineering, Ecole Polytechnique Federale de Lausanne (EPFL), Switzerland. In general terms, the FLATRONICS project is exploring two-dimensional materials and their applications within the electronics world. The idea is to investigate materials which are similar to graphene, a carbon-based material consisting of a one-atom-thick “honeycomb crystal lattice” of carbon atoms, and which is considered to be the successor to silicon. Like graphene, these layered materials can be extracted from bulk crystals using just scotch tape. As Kis explains: “Basically you have two-dimensional layers stacked vertically. By
using scotch tape, you can peel a nanoscale layer from your starting material.” Professor Kis says that the main aim of the project is to look beyond just graphene and its applications. “Thousands of researchers at the moment are studying different types of graphene. My interest was to look at different layered materials with different properties.” Kis is concentrating the project on transition metal dichalcogenides, an example of which is molybdenite (MoS2). MoS2 is similar in structure to graphene, but it is a semiconducting material. There are around twenty materials being used in the study, and they all have very different electrical properties. “Some are metals, some are superconductors, and some are semiconductors,” Kis explains, “but they all share a common stacked layered structure. You can use the same approaches developed by the graphene community in order to extract a single layer of each material.”

Atomic force microscope image of a single layer of MoS2

While some people argue graphene’s position as an alternative to silicon, and with MoS2 appearing as a possible alternative to graphene, Kis is quick to state that it is not just a question of which material is going to succeed over another. He says: “This research is part of a much bigger picture; it is a really broad field. There is not just one material out there that will fulfil all possible roles.” In terms of nanoscale research, Kis states that since the 1990s people have been focussing primarily on the uses of nanotubes and nanowires, both one-dimensional materials. Kis believes that it is time now to
branch out and explore the full capabilities of the two-dimensional playing field. “The entire area needs to be looked at. You have two-dimensional superconductors, semiconductors, and insulators; all these materials give you something different. Graphene for example is a very good conductor, it is transparent, and mechanically it is very strong; for applications that require those properties, graphene is very useful, but there are other applications for which graphene would not work as well.” Kis explains that the project has several key components. The first is to investigate ways to synthesise and grow the materials being studied on a larger, more practical scale. “You can start by simply taking a crystal and applying scotch tape, removing very thin layers one at a time, and although that works very well, it is not very practical. It is not something that industry is going to take much interest in, despite any advances you may make, unless you can prove you can produce these materials on a large scale.” The second component of the project is to take these grown materials, or start with a crystal and use the scotch tape method, and look at the basic electrical properties of these two-dimensional layers, investigating the ways they can be altered and the effects that such alterations have. “Building on what you find out about the electrical properties, the idea is to try and develop the materials for more practical applications,” says Kis. In terms of the development of practical applications for the research, perhaps the most significant result of Professor Kis’ research is the discovery that MoS2 is a very effective semiconductor. The material is thinner than silicon and has similar conductivity; it is comprised of a layer of molybdenum atoms sandwiched between two layers of sulphur atoms, and it has a natural 1.8 electron-volt band gap. “MoS2 worked better than we had hoped. We didn’t think it would work as well as this, so that was an unexpected result,” says Kis. “I was hoping that you could make transistors, but I didn’t really expect them to be this good.” Kis believes that there are many applications that could potentially benefit from the FLATRONICS research findings. “There could be very interesting applications in optical electronics; anything that involves the
conversion between light and electricity, like solar panels for instance.” There is also the possibility of utilising the findings to significantly benefit the future of electronics. “For example, MoS2 could be used in electronic circuits, or for doing computations. It could also be used to make very effective sensors, as it is thin and, being a semiconductor, very sensitive to any changes in the environment.” Kis is also optimistic about the possible ecological ramifications of his research. Tests have shown that MoS2 transistors use 100,000 times less energy in standby mode than conventional silicon-based ones, although the question is whether this would persist when MoS2 transistors are scaled down to the same size as the silicon transistors we currently use. “When the transistor is turned off, it dissipates practically no current. It falls below the measurement capabilities of our instruments.” This, coupled with the fact that a single layer of MoS2 is only 0.65 nanometres thick, compared with the 2-nanometre record for silicon, could pave the way for smaller, more energy-efficient electronic devices. “You could have smaller transistors with the same power dissipation,” says Kis. “It’s really a question of economics and fabrication at the moment. It’s difficult to predict how things will develop on an industrial scale, but in terms of individual transistors, it looks very promising.” FLATRONICS is an experimental project, but Kis says that there is a certain amount of collaboration with theoretical groups. “The initial project brief was purely based upon making devices with the materials being studied,” he says. “Now that this has happened, we are collaborating with these groups in order to take our discoveries in some interesting future directions.”
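As a quick check on why the 1.8 electron-volt band gap quoted above matters for optical electronics, converting it to a photon wavelength with the standard relation λ = hc/E puts it squarely in the visible range. The short Python calculation below is our own illustration, not a project result:

# Convert the 1.8 eV band gap of single-layer MoS2 into the photon
# wavelength it can absorb or emit, via lambda = h*c / E.
h = 6.62607015e-34    # Planck constant, J s
c = 2.99792458e8      # speed of light, m/s
eV = 1.602176634e-19  # joules per electron-volt

band_gap = 1.8 * eV                  # band gap of single-layer MoS2, in joules
wavelength_nm = h * c / band_gap * 1e9
print(f"{wavelength_nm:.0f} nm")     # ~689 nm: visible red light

A semiconductor that absorbs and emits visible light is exactly what applications such as solar cells and optical sensors call for.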
A chip with MoS2-based transistors
At a glance
Full Project Title
FLATRONICS: Electronic devices based on nanolayers
Project Funding
€1.8 million, 2009-2014
Project Partners
None (ERC Starting Independent Researcher Grant)
Contact Details
Project Coordinator, Professor Andras Kis
School of Engineering, Electrical Engineering Institute, EPFL
T: +41 21 6933925
E: andras.kis@epfl.ch
Professor Andras Kis
Project Coordinator
Andras Kis has been an assistant professor of electrical engineering at EPFL since 2008. Between 2004 and 2007 he worked as a postdoctoral researcher at the University of California, Berkeley, in the group of Prof. Zettl. Andras Kis has performed research in the nanomechanics of biomolecules and carbon nanotubes, as well as in the electronic properties of nanotubes, graphene and related two-dimensional materials. In these areas, he has published 25 peer-reviewed papers that have been cited more than 800 times.
The building blocks of metamaterials
Exploring a new concept for the fabrication of metamaterials allows researchers to control their effective properties in an unprecedented manner, opening up new potential applications. We spoke to Toralf Scharf of the NANOGOLD project about their work in developing a metamaterial ink to produce materials with specific electromagnetic properties
The use of amorphous structures with structural dimensions that are typically much shorter than the wavelength of the operating electromagnetic radiation could help researchers explore and analyse metamaterials in greater detail than ever before. This area forms the primary research focus of the NANOGOLD project, an EU-funded initiative which aims to fabricate and apply bulk three-dimensional metamaterials. “The basic idea is to make a kind of metamaterial ink, like a type of paint, which allows you to produce a material with non-conventional electromagnetic properties. This is in the form of a liquid which could then be dried or processed to produce a stable and durable material, but with a certain volume,” explains Dr Toralf Scharf, the project’s scientific coordinator. “The ground-breaking aspect of our material is that it’s not just a single surface – it will be something that has a truly three-dimensional volumetric structure. But it doesn’t necessarily have to be included in a container – it could be a thin film with a finite thickness that possesses bulk properties. This allows us to create electromagnetic materials with high efficiency and to observe effects usually associated only with the realms of fantasy.”
Self-assembly
The project is following an interdisciplinary approach to form these
metamaterials, bringing together elements of organic chemistry, physics and liquid crystal technology. The starting point is the use of resonant entities, such as metal nanoparticles, to develop composite metamaterials with specific electromagnetic properties. “We introduce the metal nanoparticles directly into an organic molecule. And then we let these hybrid molecules self-assemble via intermolecular interaction,” says Dr Scharf. These hybrid molecules should contain both the metal as the active plasmonic entity and organic molecules in the form of mesogens; effectively they will need to be macro-molecules, as they have to be relatively large to carry and wrap the metal nano-particles. “This metal
nanoparticle is a resonant entity in the hybrid molecule – it has a resonance frequency at which it interacts in an extreme manner with light,” explains Dr Scharf. “You can take these hybrid molecules and assemble them further on, either by self-organisation of the molecules themselves or with other techniques, to create second-level morphology. The second-level morphology will also interact with light and can lead to additional interference phenomena. This interference is structure-driven and not molecule-driven.” This structure-driven interference, together with the molecular resonance which comes from the metal entity, leads to particular features of the material. Combining resonance and interference effects allows the project to use different sets of parameters, which Dr Scharf says illustrates the strength of the NANOGOLD concept. “The hybrid molecules form an entire material that already has particular electromagnetic properties, on both the nanoparticle and the organic molecular level. In particular this material has a high load of resonant nanoparticles, a feature needed to explore resonances and their coupling. You can use such an entire material and integrate more functionalities; you can put in functionalities like fluorescence or absorption,” he outlines. This work is based on analysis of both the individual components – the hybrid molecules – and the material as a whole. “We have a kind of technology pool, where we use the material made of hybrid molecules to form the structural components and to go to the second-level morphology,” says Dr Scharf. “Then we use the theory to simulate the optical properties and guide us in the most effective direction in terms of material realisation.” A variety of techniques can be used to analyse composite materials, such as reverse-engineering, which can help researchers understand how the material properties can be modified and enhanced. However, Dr Scharf says it is extremely challenging to control self-organisation on the molecular level, partly because of the size of the hybrid molecules. “They’re huge, the molecular interaction is complicated, and it’s extremely hard to predict the kind of self-organisation phase that will occur,” he explains. With larger molecules it is difficult to find evidence of self-organisation; the major challenge is that the metallic entities are relatively big
compared to the molecular attachments (the mesogens) that are used for creating the macro-molecules. “The core is a kind of metallic sphere which you attach molecules to,” continues Dr Scharf. “These molecules are usually much smaller than the metallic sphere itself; however, you want to create something where the metallic sphere is almost hidden in the resulting hybrid molecule.”
Inorganic particles
Ensuring that the material’s homogeneity is not disturbed by the nanoparticle is a real challenge. The nanoparticles are spheres and are typically between 3 and 5 nanometres (nm) in size, making hiding an inorganic particle very difficult, while there are also other issues to address. “The resulting material very often has features that are usually not favourable for technology processing – for instance the viscosity is too high, or it needs processing temperatures that are relatively high – let’s say 100°C,” says Dr Scharf. However, it is possible to tune the properties of the material by manipulating the molecules in various ways. “We are looking at so-called thermotropic systems. So with the temperature you can influence physical parameters like viscosity and the structure; you can increase and decrease temperature to adjust properties,” continues Dr Scharf. “The effects on nanoparticle resonance and structural
interference are different. The nanoparticle’s properties and the related resonance are stable, but the surrounding morphology and the interference are changing. We get the possibility to tune the material properties and freeze the structure to conserve it.” It is often the case that materials with particular electromagnetic properties can’t be used to create devices with certain functionalities. Researchers are using a hybrid approach which allows them to be more flexible on the material parameters and second-level morphologies. “On the resonance level we’re mainly using the hybrid approach. So we use organic molecules and nanoparticles to create functions,” explains Dr Scharf. One important consideration is that the second morphology level of the material should be below the wavelength of light so that it appears homogeneous to visible light; the structure size should be less than around 400 nm. “If you put it a little bit larger than the wavelength of light then you would get diffraction effects, which are dispersive, scattering effects that depend on the wavelengths of light. We want to avoid this,” stresses Dr Scharf. “If your structure is smaller than the wavelength of light then you don’t have this diffraction. You only work in what we call a ‘zero order’ phase, where the light doesn’t bend and the material it experiences can be described as homogeneous.” This allows researchers to focus on the effective properties of the material with respect to the wavelength. The aim is to develop a metamaterial with an amorphous structure which leads to certain non-conventional optical and electromagnetic properties. “The concept of metamaterials is that you have something that behaves almost like glass – you can describe it from the outside by a single set of parameters, and you can use this single set of parameters for all angles of incidence. There are two main parameters – the refractive index and the absorption,” explains Dr Scharf. With this approach the materials can be described by a single refractive index; Dr Scharf says this is very beneficial in terms of the potential applications of NANOGOLD’s research. “We have in mind applications to hide objects and to shield particular areas from electromagnetic energy,” he outlines. “There are also applications in the optical domain, such as making materials with features like complete light absorption.”
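Dr Scharf’s “zero order” condition can be stated compactly. In standard textbook notation (our addition, not project material), the propagating diffraction orders m of a periodic structure of period d, illuminated at normal incidence with wavelength λ, satisfy:

% Grating condition for propagating diffraction orders m at normal
% incidence on a structure of period d, illuminated at wavelength \lambda:
\[
  \sin\theta_m = \frac{m\lambda}{d}, \qquad m = 0,\ \pm 1,\ \pm 2,\ \dots
\]

Because |sin θ_m| ≤ 1, a period d < λ admits only the undiffracted m = 0 beam: the light is not bent, and the material can be characterised by effective parameters – a refractive index and an absorption – exactly as the quote above describes.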
At a glance
Full Project Title
Self-organized metamaterials based on spatially arranged nanoparticles (NANOGOLD)
Project Objectives
The NANOGOLD project aims at the fabrication and application of bulk electro-magnetic meta-materials. A promising new concept for the exploration of meta-materials is the use of periodic structures with periods considerably shorter than the wavelength of the operating electromagnetic radiation
Project Funding
€3.52 million
Project Partners
• École polytechnique fédérale de Lausanne EPFL
• Virtual Institute for Artificial Electromagnetic Materials and Metamaterials - Metamorphose VI AISBL, Belgium
• Universite de Geneve, Switzerland
• Ruprecht-Karls-Universitaet Heidelberg, Germany
• University of Patras, Greece
• Friedrich-Schiller-Universitaet Jena, Germany
• Universita della Calabria, Italy
• University of Hull, United Kingdom
• The University of Sheffield, United Kingdom
Contact Details
Scientific Coordinator, Toralf Scharf
École polytechnique fédérale de Lausanne EPFL
Rue A.-L. Breguet 2, Neuchatel 2000, Switzerland
T: +41 327 183286
F: +41 327 183201
E: toralf.scharf@epfl.ch
W: nanogold.epfl.ch
Energy detection
This feature could be used to convert energy into heat, an attribute which holds real relevance to energy detection and light harvesting. For instance, shining a light on a complete absorber can be used to measure radiative energy; the energy is transferred into heat, and the heat is detected and measured. Dr Scharf says this method is very efficient. “This method measures all the energy that comes onto the surface of the material,” he stresses. It’s not entirely clear how this research could be applied, but some areas have been identified. “It could be possible to create devices that are transparent for certain wavelengths but not for others,” says Dr Scharf. “In fact, it is already possible to do this today using some complicated multi-interference structures. We are aiming to develop a kind of ink that could be applied to a surface and allow you to get these functionalities and create particular effects in a single layer. Shields against electromagnetic radiation could feasibly be used for solar cell applications for example.” The design of the material is based on the resonances, which usually have particular spectral features. This means it’s not possible to cover the whole visible light spectrum. “The resonances have a certain spectral width, and usually they are used to create unusual optical and electromagnetic properties. So automatically these properties are created only in a particular spectral window, because outside this spectral window you
don’t have the resonance effects, and you need the resonance effects to make these refractive indices. So as long as the material is based on resonance effects, there’s always a limited spectral window,” explains Dr Scharf. The design of metamaterials is largely based on resonances in nano-entities; Dr Scharf says that high-precision machining techniques can be used to create very sharp resonances and spectacularly large resonance features. “You get these very high resonance peaks, but at the same time the peak is very narrow,” he outlines. The NANOGOLD project is using more amorphous structures, where the resonances are less sharp, so the effects are automatically broader. However, widening the spectral window is not the project’s primary focus; Dr Scharf says the next big challenge is to explore how functional devices can be made out of the material. “The challenge is to use the available materials and to try to make functional structures and demonstrate the functionality of devices,” he outlines. While the NANOGOLD project is nearing the end of its term, Dr Scharf is keen to pursue further research into metamaterials. “We are looking at the pre-commercial exploitation level at the moment, but the speed of advancement depends very much on the level of funding we are able to attract,” he continues. “We will continue to collaborate with our partners across Europe and aim to get further funding from national research bodies.”
Toralf Scharf
Project Coordinator
Toralf Scharf focuses his research activities at the École polytechnique fédérale de Lausanne on interdisciplinary subjects bringing micro-systems, material technology and optics together. With a background in surface physics (MSc) and physical chemistry (PhD), and extensive experience in optics, he is familiar with all the necessary aspects of technology development and application and can communicate with different scientific communities.
Graphene: an amazing material
A material formed of a two-dimensional layer of carbon atoms, graphene is not just an efficient conductor, but is also extremely flexible and even stronger than diamond. These qualities mean it could play a key role in the consumer electronics of the future; we take a closer look at the history of graphene, its structure, and its potential applications
A material formed of a two-dimensional layer of carbon atoms, graphene has attracted a great deal of research attention since it was first described in Philip Wallace’s theoretical paper of 1947. Wallace’s paper worked out the material’s electronic band structure; graphene’s remarkable properties, in particular its elasticity and electrical mobility, open up a wide range of potential applications. These could include integrated circuits, transparent conducting electrodes and single-molecule gas detection. The electronic properties of graphene led to an intense research focus on synthesising the material, a goal which for many years remained frustratingly elusive. Graphene is best thought of as a 2-dimensional, 1-atom thick layer of graphite; the sp2-bonded atoms are densely packed, making what is one of the world’s thinnest materials also one of its strongest. However, the unique properties of graphene are only found in a 1-atom thick layer of graphite; the same properties are not observed even in a bi- or trilayer of atoms. The extraction of this single atom-thick layer proved a difficult task. For years researchers had grappled with the problem of extracting graphene: just how could a single, atom-thick layer of graphite be pulled off the material? As it turned out, this was achieved with a method based on surprisingly everyday materials. After years of research, Andre Geim and Kostya Novoselov of the University of Manchester succeeded in 2004 in extracting a 2-dimensional layer of graphene from 3-dimensional graphite. The seeming simplicity of their method belies its wider significance and potential commercial impact, particularly in the consumer electronics industry. Geim and Novoselov used ordinary scotch tape to peel off a layer of graphene from graphite, then placed it on a silicon dioxide substrate and simply looked for an atom-thick layer of graphene using an optical microscope. They found they could identify a single layer of graphene in some regions of the flake. The technique is known as micro-mechanical cleavage, and the work earned them the 2010 Nobel Prize in Physics.
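A worked detail from the textbook treatment – following the tight-binding analysis Wallace introduced, and quoted here for orientation rather than as a project result: near the corners of graphene’s hexagonal Brillouin zone the electron energy depends linearly on momentum,

\[ E_{\pm}(\mathbf{k}) = \pm \hbar v_F |\mathbf{k}|, \qquad v_F \approx 10^6\ \mathrm{m\,s^{-1}}, \]

so the charge carriers behave like massless relativistic particles travelling at the Fermi velocity \(v_F\). This linear ‘Dirac cone’ dispersion is what underlies the extraordinary electrical mobility discussed below.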
Applications of graphene Since Geim and Novoselov’s discovery, other ways of synthesising the material have been developed, such as using chemical methods to extract it from graphite, and the graphene research agenda continues to evolve. Now that graphene can be synthesised, researchers are looking at how the properties of the material can be further enhanced, how it reacts to the presence of other materials, and of course at its potential commercial applications. These seem virtually limitless. The development of graphene has generated particular excitement in the consumer electronics industry, with scientists predicting that it could be used in phones that fit behind the ear and could even replace silicon in the production of computer chips. These applications are made possible by the electrical properties of the material. Graphene has extraordinary electrical mobility – electrons can travel through graphene with very little resistance and
almost no scattering, offering potentially enormous electronic capabilities which, as things stand, cannot be matched by conventional materials. These electronic properties are matched by its physical attributes, which of course are enormously important in terms of its commercial potential. Despite being just one atom thick, graphene is extremely flexible – it can be stretched by 20 per cent without any damage – and is stronger than steel. And not only that: graphene is also transparent, which sets it apart from many other conductive materials. Companies across the world are keen to utilise these qualities in the next generation of commercial products, with many investing significant sums in research and development. Electronics giant Samsung have produced a large layer of pure graphene in collaboration with researchers from Sungkyunkwan University in Korea. The sheet, which is as large as a TV panel, was produced using roll-to-roll printing processes; it could be used in flat panel displays and a range of other applications.
Enhancing graphene This is just the tip of the iceberg in terms of commercial potential, and with researchers from across a variety of scientific disciplines looking at how the properties of the material could be enhanced still further, there is enormous scope for further development. However, while graphene has exciting electronic potential, it will need to be modified for certain applications; this kind of research is very much inter-disciplinary in nature, bringing together chemists, physicists and engineers to add their perspectives and expertise. An energy gap is essential if graphene is to be used in transistors for electronic devices, for example, so researchers are looking into how a band gap can be opened up in the material. One possible method is to bring molecular agents, such as organic molecules, close to graphene so that they can be adsorbed onto the material. They could then interact at the molecular level and open up a gap in graphene; a sketch of what such a gap means for the electron spectrum is given below. A number of molecular agents could potentially be used for this purpose. Little is known at this stage about exactly how these agents will interact with graphene and the effect they will have on its properties, so there are many possible ways in which the material could be modified. Some researchers are even talking about how magnetism could be introduced into graphene: how can spins be incorporated in graphene? How will spin propagate in graphene? Could graphene even be used to store information? The answers to these questions are of both academic and commercial interest. Europe as a whole faces major challenges from emerging economies with significantly lower labour costs, a trend which places an even higher premium on scientific and technical knowledge. With many analysts predicting long-term stagnation in the European economy, Britain needs to tap into its scientific and technical expertise if it is to remain competitive.
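Here is that sketch – a generic model, not a specific result of the research described above. If an adsorbed molecular layer breaks the symmetry between graphene’s two carbon sublattices, the massless spectrum acquires a mass term,

\[ E_{\pm}(\mathbf{k}) = \pm\sqrt{(\hbar v_F |\mathbf{k}|)^2 + \Delta^2}, \]

leaving a gap of \(2\Delta\) between the valence and conduction bands. A transistor channel needs such a gap so that the device can be switched fully off.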
Investing in graphene The UK is historically slow off the mark when it comes to capitalising on the commercial potential of research, and while
no country can claim a monopoly on a new idea, Britain could do better in building on the work of its scientific researchers. In countries like America and South Korea there is a culture of close links between industry and academia, and of establishing spin-off companies built on scientific research, but in Britain there has historically been a bigger gap between our leading universities and our leading companies. There are signs this is changing, despite the wider financial climate. Like many areas of the public sector, Britain’s universities face significant cuts in funding, but with graphene the Government is making an exception. A new £50 million research fund has been established to build on the initial development of graphene; Manchester University will be home to this new graphene research and technology hub, which Chancellor George Osborne says is part of the Government’s efforts to rebalance the British economy. “We’re going to get Britain making things again,” he said.
Research expertise As for the research pioneers themselves, Professors Geim and Novoselov continue their work in Manchester, but with a new title to add to their scientific qualifications. The Nobel Laureates were both awarded Knighthoods in the 2012 New Year Honours list, the ultimate mark of approval from the British establishment, in recognition of the impact of their research. Professor Geim says scientific research has a major role to play in economic recovery. “Technology is the engine of the economy, and science is the petrol to keep this engine running. The state of the global economy is in such a mess that its engine requires urgent repairs,” he commented. On this analogy, the British economy is short of both petrol and money; Professor Geim says applied science offers an escape route. “With the enormous interest this material has generated around the world, we expect to be able to turn our world-leading research expertise into real technologies,” he continued. “The research hub will certainly allow us to explore deeper into the vast applied potential of graphene, but also will lead to new exciting results, continuing the scientific excellence in the UK.”
The Government’s plans were warmly welcomed by Professor Sir Peter Knight, President of the Institute of Physics, who was keen to highlight the wider economic importance of scientific research. “We’re delighted that the Government recognises the role science can play in creating a vibrant, diverse economy for the future of the UK – investment in science delivers great returns economically and intellectually,” he said. The electronics of the future are already being shaped in laboratories and research institutions across the country, and while there is still a big time lag between new findings and commercial development, graphene could have a major role to play in the next generation of consumer devices. Already researchers at Cambridge University have made flexible electronics from graphene using a printer, bringing the prospect of wearable devices one step closer, all with a material just one atom thick.
Heattronics: electrons feel the heat Heating of electronic components is a serious limitation to the ongoing miniaturisation of electronic devices. However, it also gives insightful access to interesting physical phenomena, says Dr Tero Heikkilä, the overall coordinator of the Heattronics project An ERC-funded project based at Aalto University’s Low Temperature Laboratory, the Heattronics project is studying how heat is distributed within nanoconductors, focusing in particular on those with dimensions of less than a micrometre. One of the key issues in this work is how the size of the conductor compares with the length scales characteristic of the different types of processes occurring within them. “Two such important size scales are the phase coherence length and the energy relaxation length. The former characterises the scale over which the electron loses its quantum-mechanical wave character and becomes a classical billiard ball-like object; the latter tells us the scale over which the electron loses its energy, either to other electrons or to the ion lattice,” explains Dr Tero Heikkilä, the project’s overall coordinator. While much of Heattronics’ research is fundamental in nature, Dr Heikkilä says their work nevertheless holds broad relevance. “The ability of a conductor to conduct heat away generally decreases in smaller conductors,” he explains. “This is the most serious limitation envisaged at present for the continued miniaturisation of electronic devices: how to carry the heat away from them. Therefore it is extremely important to find out exactly how this happens and how it depends on the operating conditions and materials used to make the devices.”
Length scales The effect of low temperatures on electron behaviour is an area of particular interest to Dr Heikkilä. The phase coherence length and the energy relaxation length are of the same order of magnitude: at room temperature they are only a few nanometres, whereas at temperatures below 1 kelvin (-272°C) they may become several micrometres, a point at which some significant changes can be observed. “If you drive such a small conductor out of equilibrium, by for example applying a voltage, then the electrons in it heat up. At length scales below the energy relaxation length the temperature of the electrons may not be well-defined. Moreover, even if the electron temperature is well-defined, it may be different from the ion lattice temperature,” explains Dr Heikkilä. Device cooling has not traditionally been based on an understanding of these kinds of processes, or of how heat is generated in a system and driven away; Dr Heikkilä says
this will need to change in line with commercial development. “Current methods are based on the idea that if you just plug a powerful enough cooler inside your computer, it will not heat up too much. This is bound to change, as modern devices require more and more cooling,” he points out. “Therefore it might be helpful to know how the elementary operations of a device (say, switching a transistor on or off) consume energy, how this energy relaxes and what the fundamental limitations for such processes are.” Knowledge of these kinds of processes could help companies develop devices which consume less energy, or at least understand their heating limits. The question of how energy relaxation processes occur on the macroscopic scale is mostly an engineering one, and not of direct interest in Heattronics; the more relevant question is how they occur on the nanoscale, and how much energy is required to carry out the elementary tasks in individual electronic components. “Nanoscale transistors typically have a small active area where the desired physical process takes place, and this area is connected to larger electrodes. The usual flow of energy is such that the power from a voltage source gives some energy to the electrons in the active area,” explains Dr Heikkilä. “This energy then either decays into the electrodes as the heated electrons flow into them, or to the lattice via the electron-phonon interaction. The details of both of these processes also depend on the interactions between the electrons themselves, and on the amount of elastic
(non-energy transferring) scatterings in between. Eventually the energy is dissipated into the lattice, either inside the active area or inside the electrodes. The lattice is typically the lattice of the metal film that resides on top of some substrate. So the energy relaxation process between the phonons of the film and those of the substrate may also be relevant. Detailed understanding of these processes may help to find materials where the produced heat may be used for useful work, and thereby increase the overall efficiency of the devices.”
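A commonly used rule of thumb for the electron-phonon step in this chain – a standard low-temperature result quoted for illustration, with an exponent and coupling constant that depend on the material and are not specific to Heattronics – is

\[ P_{e\text{-}ph} = \Sigma\,V\,\bigl(T_e^{\,5} - T_{ph}^{\,5}\bigr), \]

where \(P_{e\text{-}ph}\) is the power flowing from electrons at temperature \(T_e\) to phonons at temperature \(T_{ph}\), \(V\) is the conductor volume and \(\Sigma\) a material-dependent coupling constant. The steep fifth-power law is why the two temperatures can differ strongly at sub-kelvin temperatures while being practically locked together at room temperature.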
Sub-systems The question of how much the various subsystems, such as the electrons in the active area or the film phonons, heat up in the presence of a given power depends on the level of heat conductance between them. At low temperatures the different subsystems have different (effective) temperatures within observable length scales, but at higher temperatures the picture changes. “At high temperatures it is harder to separate the electron temperature from a phonon temperature, as the relaxation between the two is quite strong. In a previous project we described the analogue for heat transport of the GMR (giant magnetoresistance) effect, a phenomenon where a magnetic field can have a large effect on the resistance of a given magnetic resistor system. Its use in hard drives earned its inventors the 2007 Nobel Prize in Physics, and a newer variant, TMR (tunnelling magnetoresistance), is also in use nowadays. When discussing our giant magnetoheat resistance (GMHR) we had to introduce the concept of a spin-dependent electron temperature, and especially the energy relaxation between the two spin populations of the electrons. Here the energy relaxation from electron-phonon coupling suppresses the GMHR effect at high temperatures,” says Dr Heikkilä. Heat can be transferred on the nanoscale in several ways, but some processes are more relevant to Heattronics’ overall agenda than others. “Typically the most relevant energy exchange processes are electron-electron and electron-phonon,” explains Dr Heikkilä. This does not exclude investigations into new processes, such as heat exchange between electrons and photons, which could lead to a deeper understanding of energy relaxation. Research in this area is also important to our understanding of noise and fluctuations, another area being addressed by the project. “With a
manually tuned radio you can hear noise by tuning the receiver to frequencies outside the broadcasting ones. In fact, dissipation (like resistance in electronics) always comes with fluctuations of the current, meaning noise. This is typically called thermal noise, and its properties are quite well-known,” outlines Dr Heikkilä. Knowledge of noise holds real practical importance, as it poses limits to all kinds of measurements, while Dr Heikkilä says it is also possible to find out more about a system by studying the noise it emits. “Recently we have studied another type of noise, which is due to fluctuations of temperature inside the conductors themselves. In some non-linear systems these fluctuations challenge the conventional thermodynamics of the devices, and may dominate the noise emitted by them. Direct measurement of such fluctuations poses stringent requirements for thermometry – and raises new and interesting questions. How fast can a given temperature be measured? Are there fundamental quantum-mechanical limitations to the accuracy of thermometry? These questions are being studied in the Heattronics project.”
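The thermal noise referred to here is the familiar Johnson-Nyquist result (standard physics rather than a project finding): a resistor \(R\) at temperature \(T\) shows voltage fluctuations with spectral density

\[ S_V = 4 k_B T R, \]

flat in frequency up to quantum corrections. Read backwards, this relation is the basis of noise thermometry – measuring \(S_V\) yields \(T\) – which is one reason the temperature fluctuations Dr Heikkilä describes leave a direct imprint on the emitted noise.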
At a glance Full Project Title Mesoscopic heattronics: thermal and nonequilibrium effects and fluctuations in nanoelectronics (Heattronics) Objectives The Heattronics project studies the distribution and relaxation of heat in nanoelectronics resulting from the operation of electronic devices. It also aims to describe the fluctuations of the heat currents, and the effects caused by them. Contact Details Project Coordinator, Dr Tero Heikkilä Low Temperature Laboratory P.O. Box 15100 FIN-00076 AALTO T: +358-(0)9-470 22396 F: +358-(0)9-470 22969 E: Tero.Heikkila@aalto.fi W: http://ltl.tkk.fi/~ttheikki/ W: http://ltl.tkk.fi/wiki/Tero_Heikkila
The future This research is not targeted at any specific applications; nevertheless, it holds real relevance for industry, particularly nanoelectronics. While Dr Heikkilä’s job is not to actively look for applications for his work, he has a joint patent application with a group from the Finnish State Research Centre VTT. “That group makes heat cameras based on superconducting radiation detectors. The patent is an idea for a novel type of radiation detector that could be used as a detector of THz radiation. This is produced as thermal radiation from all objects, as all of us radiate heat,” he says. Many interesting fundamental questions remain in the field, particularly around noise, and these will form the focus of Dr Heikkilä’s work in future. “Noise in fact brings heattronics into contact with recent studies of fundamental thermodynamics. Studying fluctuations allows one to study the basic thermodynamics in nanoelectronic systems. In fact not only the ‘basic’ thermodynamics, but also some recently discovered additions, such as the so-called fluctuation theorem,” he outlines. “Our aim is to see how this fluctuation theorem connects with quantum mechanics in quantum conductors.”
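For orientation, the fluctuation theorem mentioned here can be stated compactly in its generic detailed form (how it is applied within Heattronics is not spelled out in this article): the probability of observing an entropy production \(\Delta S\) over a given interval is related to that of observing its opposite by

\[ \frac{P(+\Delta S)}{P(-\Delta S)} = e^{\Delta S / k_B}, \]

so entropy-consuming trajectories are exponentially rare but not forbidden – precisely the regime that small, fluctuating nanoelectronic conductors occupy.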
Dr Tero Heikkilä
Project Coordinator
Tero Heikkilä received his doctoral degree from the former Helsinki University of Technology (now Aalto University) in 2003. At present he leads a condensed matter theory group in the Low Temperature Laboratory, Aalto University. He has also worked as a researcher at the University of Karlsruhe, the University of Basel, Delft University of Technology and Lancaster University.
UPCON: Investigating the Structural Properties of Ultra-Pure Nanowires and Improving Energy Conversion
Professor Anna Fontcuberta i Morral, a leading authority in the field of nanowires, talks to EU Research about the UPCON project and the latest developments in ultra-pure nanowire research, and also discusses the advancement of solar cell technologies Over the years, advances in technology have provided us with smaller and more powerful devices. However, we don’t tend to think about what makes the items that are now commonplace in our homes and workplaces actually work. Many people do not know exactly what a nanowire is, let alone the pivotal role nanowires play in the development and operation of a plethora of novel technologies. The Ultra-Pure nanowire heterostructures and energy CONversion (UPCON) project is investigating the synthesis of ultra-pure nanowires with the specific
intention of applying them in third-generation solar cells. Third-generation solar cells follow the first generation, based on crystalline silicon, and the second, based on thin-film technologies. “The switch between generations has been fostered by the motivation of reducing the amount of material used while trying to increase efficiency,” says Professor Anna Fontcuberta i Morral, principal investigator of the UPCON project. “In third generation solar cells, one expects that the novel properties observed in nanoscale materials will enable a quantitative jump in the value of efficiencies, while still reducing production costs thanks to the reduction
in material quantity.” These ultra-pure nanowires represent a model system in which the main working principles can be investigated and the device optimised. An initial interesting facet of the project concerns the synthesis of the ultra-high purity nanowires themselves. Nanowires have garnered a great deal of interest within the scientific community in recent years thanks to their multidisciplinary potential. Indeed, they hold promise in areas as diverse as biotechnology, sensing, electronics and energy conversion. Professor Morral points to the work of Professor Harry Atwater at Caltech as the first example of nanowires being used
within silicon solar cells. “Three or four years ago, Professor Atwater showed that if wires were grown using gold as the catalyst, the electrical properties of those wires were poorer than those of wires grown using other metals, due to contamination by the gold,” she says. This hints at an important aspect: ultra-high purity in nanoscale materials is key to their successful implementation in devices. Indeed, gold is a known impurity which affects the mobility of electrons and the carrier lifetime within semi-conducting materials, yet its use as a catalyst in nanowire synthesis is the most common method employed in their growth. One of the UPCON project’s main aims is to offer an alternative to the use of gold as a catalyst in nanowire growth. “We are one of the first teams to try growing nanowires without gold, but it has been a concern for quite some time,” Professor Morral says. “In many facilities that produce semi-conductors, you will find that gold is a banned substance; it diffuses very quickly at room temperature. Essentially, if you get a bit of gold in one machine, you may as well just throw that machine away.” In order to grow ultra-pure nanowires, the UPCON project will employ molecular beam epitaxy (MBE), whereby
the materials used to grow the wires are heated in a high-vacuum chamber until they evaporate and then condense on a wafer, with no need for an external catalyst. Nanowires grown in this way have yielded optical properties which surpass those of nanowires grown using external catalysts. Professor Morral says, “The MBE techniques will allow us to clarify the role of gold in the properties of the nanowires and contribute to the ongoing discussions.” As well as investigating the role that gold plays in the properties of nanowires, Professor Morral and her team will be investigating ways in which different elements can be combined within the nanowires themselves. The composition of a nanowire can be varied axially or coaxially, and both methods lead to the fabrication of nanowires with different properties. “With an axial composition,” Professor Morral says, “it is possible to combine two materials with different band gap energy, which for example enables resonant tunnelling, single-electron transistors, and light-emitting quantum dot devices.” The fabrication of axial heterostructures involves introducing two different materials along the growth axis. It is relatively easy to produce nanowires in this way with gold used as an external catalyst; however, it is less straightforward when external catalysts are not used. “To our knowledge, there are no publications reporting on axial heterostructures obtained without the use of an external metal as catalyst.”
Coaxial nanostructures, comprising a core and a shell of different materials, have received very little attention in the scientific community in terms of the geometric and electronic structure of the outer shell; attention has predominantly focused on the internal nanowire. “We have shown that it is possible to uniformly coat the nanowires with successive epitaxial layers, resulting in multiple heterostructures defining, for example, quantum wells with a prismatic geometry,” says Professor Morral. “All these possibilities expand the functionalities of the nanowire and, accordingly, increase the freedom of design for the devices.” While one of the project’s concerns is the synthesis of ultra-pure nanowires, this is not the only outcome that Professor Morral and her team are investigating. The second stage of the project is to take their findings and apply them to the creation of a more efficient solar cell. As it stands, the majority of solar cells used today are constructed primarily from a silicon base in bulk or thin-film form; however, solar cells that have been developed for use in space are constructed using multi-junction III-V semi-conductors. Multi-junction solar cells are expensive but allow much more efficient absorption and conversion of sunlight. The reason for this is the need for high
At a glance Full Project Title Ultra-Pure nanowire heterostructures and energy CONversion (UPCON) Project Objectives This proposal is devoted to the synthesis of ultra-pure semiconductor nanowire heterostructures for energy conversion applications in the photovoltaic domain. Project Funding €1.270 million Contact Details Anna Fontcuberta i Morral Project Coordinator, Laboratory of Semiconductor Materials, Institute of Materials, Ecole Polytechnique Federale de Lausanne, Station 12, 1015 Lausanne, Switzerland T: +41 21 693 73 94 E: anna.fontcuberta-morral@epfl.ch W: http://lmsc.epfl.ch
Anna Fontcuberta i Morral
Project Coordinator
A. Fontcuberta i Morral completed her master’s studies in physics at the University of Barcelona (Spain) in 1997. She obtained a master’s in materials science from the University of Paris XI (France) in 1998 and her PhD in materials science from Ecole Polytechnique (France) in 2001. In 2001 and 2002 she was a postdoctoral fellow in the group of Prof. Harry A. Atwater at the California Institute of Technology. After that she obtained a permanent research position with the French research council (CNRS) at Ecole Polytechnique (France). In 2004, she took a leave of absence to return to the California Institute of Technology, where she co-founded the start-up company Aonex Technologies Inc. Then, in 2005, she started her own research group in the chair of Prof. G. Abstreiter at the Technical University of Munich (Germany). Since 2008 she has been an assistant professor at the Institute of Materials of the Ecole Polytechnique Federale de Lausanne (Switzerland).
efficiency and a longer life span in space applications. As global energy needs become an increasingly pressing issue, the commercial desire for more energy-efficient and cost-effective solar cells is steadily increasing, especially considering the abundance of the source. It is here that nanowires can play an important role, as they make it possible to rationalise the use of expensive, high-performing materials. An array of nanowires one micron long, separated by about one micron, can perform as well as a 350-micron-thick substrate of the same material; the amount of material used can therefore be reduced by a factor of more than 2,000. Professor Morral has always held a particular interest in the area of solar cells, and the funding for the UPCON project has allowed her and her team to investigate fully the potential of nano-structures as a basis for solar cell development. “The area of nanowire-based solar cells is still in its infancy,” Professor Morral says. “A higher design effort and more basic research on the fundamental processes occurring during energy conversion need to be realised in the coming years in order to completely fulfil the expectations.” The success of solar cells made utilising nanowire technology depends upon more research being carried out into what effectively makes a nanowire a better energy converter. It is this question which is ultimately driving the UPCON project. For Professor Morral, the UPCON project is not just about creating solar cells that are more efficient; it is about redefining the ratio between efficiency and cost. “We want to find a way to minimise, as much as possible, the cost of production for each solar cell device.” One way of achieving
this is to minimise the amount of material used in the fabrication of each solar cell; by using nanowires, a great deal of material is saved due to their small size. “The aim is to produce highly functional nanowires, which are about one micron in size, on a base of silicon, which is a cheap material to use. The principal advantage of this would be to reduce the cost per watt,” Professor Morral says. Professor Morral and her team also highlight other applications of their ultra-pure nanowires in converting energy. The possibility of using nanowires to split water is something that they find fascinating. “Nanowire electrodes would be inserted into water,” Professor Morral says, “and the sun would shine onto the water, splitting it into oxygen and hydrogen; the hydrogen could then be used as fuel.” The idea could then be taken further in order to fabricate other organic molecules, such as methane, or even more complex molecules. This could therefore lead to even more opportunities for the sun to provide us with a source of fuel. A further application could be to use the nanowires as electrodes within batteries; lithium-ion batteries using silicon nanowires around a stainless steel anode have been developed by Dr Yi Cui at Stanford University. There are a great number of possibilities open for the research Professor Morral and her team are undertaking, with a significant amount of interest coming from within industry. “There are people extremely motivated to see these devices put into the market place because they see the advantages of our work,” Professor Morral says. “I think within about eight years, we will see a real impact.”
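The saving of a factor of more than 2,000 quoted above can be checked with rough geometry; the wire diameter used below is an assumed illustrative value, not a figure given by the project. For wires of length \(L = 1\ \mu\mathrm{m}\) and diameter \(d\) on a square grid of pitch \(p = 1\ \mu\mathrm{m}\), the material used relative to a film of thickness \(t = 350\ \mu\mathrm{m}\) is

\[ \frac{L f}{t}, \qquad f = \frac{\pi d^2}{4 p^2}, \]

where \(f\) is the areal fill factor of the array. Taking \(d \approx 0.4\ \mu\mathrm{m}\) gives \(f \approx 0.13\), so the reduction factor is \(t/(Lf) \approx 350/0.13 \approx 2{,}800\), consistent with the figure quoted above.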
Ultra-low temperatures to understand quantum criticality The recent discovery of a new energy scale which vanishes at the quantum critical point of a heavy fermion material calls for entirely new theoretical approaches, and for advanced experimental studies of matter at extremely low temperatures. Professor Silke Bühler-Paschen of the QuantumPuzzle project explains how her research will advance the field At absolute zero, matter can reach a highly exotic state where the system itself is uncertain about the next stage in its development, and undergoes strong collective quantum fluctuations as a result. These fluctuations are described by quantum criticality, an area that forms the primary research focus of the QuantumPuzzle project, an ERC-backed initiative based at the Institute of Solid State Physics at Vienna University of Technology. “We aim to advance the field of quantum criticality in strongly correlated electron systems, which are characterized by low-lying and frequently competing energy scales,” says Professor Silke Bühler-Paschen, the project’s Principal Investigator. The recent discovery of a new energy scale, which vanishes at the quantum critical point of a heavy fermion material, cannot be explained by the standard theory of quantum phase transitions; therefore entirely new experimental and theoretical approaches are needed to advance the field, a goal which QuantumPuzzle is working towards. The five-year project started in 2009 and is run by an international team of more than 10 post-docs and PhD students, supported by experienced scientists at Vienna University of Technology and from abroad. QuantumPuzzle’s primary focus is on pursuing experiments into the nature of the new low-lying energy scale that has been identified; the project is looking mainly at heavy fermion compounds, whose properties have made them an attractive field of investigation for numerous groups worldwide. “The physical
properties of heavy fermion compounds are governed by very low energy scales,” explains Professor Bühler-Paschen. “This allows researchers to deliberately induce changes between different ground states by varying external parameters such as pressure or magnetic field. This has led to impressive progress being made in the field in recent years.”
Experimental challenges “In order to study zero-temperature phase transitions – called quantum phase transitions – other, non-thermal parameters are needed in addition to low temperatures,” explains Professor Bühler-Paschen. “Physical properties must then be studied as a function of at least two parameters. In QuantumPuzzle we are working on several experiments that have
not been used before to characterize quantum criticality. For instance, in the newly founded Vienna Micro-Kelvin Laboratory, which is equipped with a powerful nuclear de-magnetization cryostat, samples shall be cooled down to the micro-kelvin regime – about two orders of magnitude lower than the temperatures reached in most state-of-the-art dilution refrigerators.” The project is using several other highly sophisticated techniques to characterize quantum criticality, which could lead to significant advances in the field. For instance, microwave experiments shall be used to study the dynamic behaviour near the quantum critical point, which Professor Bühler-Paschen believes will reveal the microscopic meaning of the mysterious new energy scale. “Such experiments are extremely challenging. In order to keep the sample at very low temperatures during a measurement, only very low excitations (e.g., electrical currents) may be used,” she explains. “The tiny measurement signals (e.g., electrical voltages) must then be detected to a very high level of precision.”
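To give the vanishing energy scale a concrete form, here is the generic scaling ansatz of quantum phase transition theory, quoted for orientation rather than as QuantumPuzzle’s own result: near a quantum critical point reached by tuning a non-thermal parameter \(\delta\) (pressure, magnetic field, composition) to its critical value \(\delta_c\), the correlation length and the characteristic energy scale behave as

\[ \xi \sim |\delta - \delta_c|^{-\nu}, \qquad k_B T^{*} \sim \xi^{-z} \sim |\delta - \delta_c|^{\nu z}, \]

with \(\nu\) the correlation-length exponent and \(z\) the dynamical exponent. \(T^{*}\) collapses to zero at \(\delta_c\), which is why ever lower temperatures – down to the micro-kelvin regime – are needed to expose the critical behaviour.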
Figure: Quantum critical behaviour in different materials. Across the new energy scale (red line), conduction electrons and spins become successively entangled, leading to full screening of the spins at high fields. At zero temperature, at the quantum critical point, the crossover is abrupt and is seen experimentally as a jump in the Fermi volume.
Studying quantum critical behaviour usually goes hand in hand with identifying the types of systems in which it occurs. As such Professor Bühler-Paschen says it is important to study various different materials, and ultimately even different classes of materials, to gain an accurate picture. “Quantum critical behaviour has
been observed in systems as diverse as high-temperature superconductors, metamagnets, organic compounds, and heavy fermion compounds. From classical (finite temperature) phase transitions the phenomenon of universality is well known. We want to find out the extent to which quantum phase transitions are also universal,” she outlines. “One important aspect in our search for new materials is their dimensionality. The form of quantum criticality we recently observed in a cubic material was unexpected and calls for verification with other three-dimensional materials. Synthesising adequate compounds is just as challenging as measuring them – the slightest amount of disorder may alter the properties severely and thus hide the intrinsic properties.”
Standard theory and beyond While the ‘standard theory’ of quantum criticality is an extension of the theory of classical continuous phase transitions to zero temperature, a number of materials have been identified which appear to require completely new methods of theoretical description. Analysis of quantum fluctuations in these materials has already led to some interesting results. “One of the most interesting aspects of quantum criticality is that quantum critical fluctuations can stabilize new phases. Unconventional superconductivity, for instance, occurs in numerous heavy fermion systems in the immediate vicinity of a quantum critical point. Quantum critical fluctuations are also discussed as ‘glue’ between the Cooper pairs in the high-temperature cuprate superconductors. In some materials ‘non-Fermi liquid phases’ with still unknown microscopic origins have been observed,” says Professor Bühler-Paschen. The nature of the field offers infinite scope for further research, and the project is still pursuing further investigations into low-lying energy scales. The results generated are likely to advance not only the fields of heavy fermion systems and quantum criticality, but also the current understanding of phase transitions in general, something which holds relevance beyond the condensed matter physics field. “We hope that our experiments will not only broaden the range in which we can test quantum critical scaling relations, but also lead to the discovery of new phases,” says Professor Bühler-Paschen. “Ultimately we hope this will help to solve the ‘puzzle’ of quantum criticality beyond the standard scenario.”
At a glance Full Project Title Quantum Criticality – The Puzzle of Multiple Energy Scales Project Funding €2,100,000 Project Duration 60 months Project Partners Single PI project Contact Details Project Coordinator, Professor Silke Bühler-Paschen Institute of Solid State Physics Vienna University of Technology Wiedner Hauptstraße 8-10 1040 Vienna Austria T: +43 1 58801 13716 F: +43 1 58801 13899 E: paschen@ifp.tuwien.ac.at W: http://erc.tuwien.ac.at/paschen
Professor Silke Bühler-Paschen
Silke Bühler-Paschen is a Full Professor of Solid State Physics at the Institute of Solid State Physics, part of Vienna University of Technology. She gained a PhD in physics from EPF Lausanne in 1995 and worked in Germany and Japan before assuming her current role.
www.photokina.com
COLOGNE | 18 – 23. SEPTEMBER
THE MUST-SEE EVENT FOR THE SECTOR. Experts from trade and industry, as well as manufacturers, service providers, professional users and photo enthusiasts will all come together at the world’s leading trade fair for the imaging sector. Take advantage of the combined expertise and networking opportunities to learn about the latest technologies and current trends – up-close and first-hand. No other trade fair combines such a range of technologies, and no other business platform covers the whole imaging workflow – from image capture to image processing and image output.
Welcome to photokina 2012 !
Koelnmesse GmbH · Messeplatz 1 · 50679 Köln · Germany Tel. +49 180 5 10 3101 · Fax +49 221 821- 99 1270 photokina@visitor.koelnmesse.de
Buy tickets online and save up to 37%! www.photokina.com
BEMOSA – Behavioural Modelling For Security In Airports The 9/11 attacks raised profound questions concerning security issues within the airport environment, and it is these questions which formed the impetus behind the BEMOSA project. Professor Avi Kirschenbaum, the project’s initiator and principal investigator, who has previously worked in the area of disaster management and behaviour, speaks to EU Research about the FP7 Cooperation Programme-funded project There’s no denying that the attacks of September 11th 2001 changed the world at large. No one could have predicted that morning that terrorists would be thrown so dramatically into the public eye, and that issues of security would come under such detailed scrutiny. Now, over a decade later, the threat of terrorism and security issues are still major points of discussion the world over. Professor Kirschenbaum’s latest research project hopes to take the issue of security to the next level. BEMOSA, or Behavioural Modelling for Security in Airports, is a three-year research project funded under the FP7 Cooperation Programme (Aeronautics and Air Transport theme); the aim of the project is to address security issues within an airport’s organisational framework with a detailed exploration of the decision-making process during both routine situations and a security threat. The end result of this will be the construction of a prototype comprehensive and practical training programme that encompasses all aspects of dealing with a crisis situation. BEMOSA hopes to uncover exactly what happens within an airport: “There are many rules and regulations,” Professor Kirschenbaum explains. “Then there’s the administration, the structure of how things should be done, and all of this is on paper. Very few people actually know what happens and our first step was to find out what goes on.” By studying and learning how both employees and passengers behave within an airport it is then possible to begin the process of improving and enhancing security. “The rules and regulations which govern airports are built upon logic and rational thinking, but in a crisis logic and reason do not always form our decisions.” With a behavioural understanding in place, BEMOSA will generate an evidence-based training
programme to give people a better idea of how to deal with the different situations which will occur that may be outside of the rules and regulations. “Here’s a simple example of the kind of situation that we’re talking about. In many airports you see a piece of emergency glass that has to be broken with a small hammer. In the protocols it says ‘take the hammer and break the glass’, but what happens if the hammer is not there? What do you do? There are no rules or regulations for this eventuality, and so it requires some thought. What if you use a chair to break the glass? Will you lose your job? Will you be sued?” The BEMOSA project looked at how employees relate to security measures, and one thing that they have discovered is the extent to which rules and regulations are broken or bent to fit the situation. “This in a sense shows the degree of innovation in terms of behaviour. During the course of the research and the analysis of the data, we called this the ‘adaptive employee’.” The structure of the project was built around observation and interviews. BEMOSA is an incredibly broad study that goes beyond normal research organisation; it utilises profiles of the employees, ethnographic studies, questionnaires, interviews, and hundreds of hours of recordings, all spread across eight sample
airports across Europe. “We interviewed over 500 employees, from the cleaners to the head of security, as well as the sales assistants in the malls. We also spoke to passengers.” BEMOSA wants to understand how security information is passed along the line within an airport environment and is asking the question ‘who do you trust?’. “If a colleague tells you something, do you trust them, or do you trust your boss?” Professor Kirschenbaum asks. “It’s things like this that affect the decision making process.” The project is currently conducting panel studies and following up with group observations to uncover how behaviour changes in the event of a security incident. In order to capture behaviour, Professor Kirschenbaum and his team staged a number of incidents within the airports and videotaped the results; this information coupled with the questionnaires gave the team all they needed to discover how decisions are made in airports both among security and non-security personnel. “We literally have a broad picture of the whole operation of the airport; broad but very encompassing.” What BEMOSA has discovered is that the extent to which rules are broken or bent is more widespread than initially thought. “It’s a whole informal system within the airport’s social structure, and it is this that basically determines how decisions are made.” What BEMOSA aims to do is devise a supplementary training programme to be administered alongside current training programmes; it will provide real life scenarios that will be enmeshed with available training in order to increase the skill level and human capital of the employees themselves. The hope is that this new training programme will help reduce the number of false alarms and will also
serve to increase the skills and motivation of the employees; as a result of this, passengers will themselves receive much better service. “How security decisions are made can mean the difference between a terrorist getting through or not getting through. The thing about BEMOSA that goes way beyond any similar research is that we make the assumption that technology has reached its zenith of ability. What we’ve done is consider the fact that technology was created by humans and has to be interpreted by humans. The decision process all along is difficult because we make the decisions, not the machines; the machines simply provide us with the information.” With this in mind, BEMOSA has adapted its simulation process to take into account the myriad of options available to a person to make a decision. “We’ve combined empirical evidence of behaviour and have modified the mathematical simulation programme to incorporate the reality of how people behave,” Professor Kirschenbaum explains. “The fact that we take into account unexpected situations and have a large variety of options available is what sets our simulation apart from others. In terms of decision making, the broad data collection that we’ve undertaken, studying who speaks to whom, where information originates, and who is trusted, is vital to the project.” BEMOSA was always conceived as an interdisciplinary project, but the extent to which this worked out beneficially was better than expected. “We had airport managers and security, we had mathematicians, social scientists, psychologists, and people from risk analysis. All of these elements together shaped how we undertook the study.” In total, eight airports participated in the study, though more were interested in taking part once word of the project spread. The input from each participant, their concerns, complaints and arguments have allowed BEMOSA to design all the different components of the research programme, including the training programme, from the ground up to cater for any eventuality that may arise. “We are ready now to begin the testing of the training programme in one or two airports to see how people react to it,” Professor Kirschenbaum tells us. “After this, there will be different iterations of it; we will change it and try to improve upon it. It has really been a wonderful consortium of partners.” The degree of interaction
between the partners is what has made the BEMOSA project so unique and cutting edge. “Each partner has been involved throughout the process at every step and we were constantly giving feedback to one another.” The success of the project can be seen in the number of interested parties who would like to be involved in a series of three workshops scheduled to take place at the end of March. “Once other airports started to hear what we were doing, they wanted to get involved, and on top of this we have all the major European agencies that deal with airports and airport security in our special interest groups. They all recognise that technology has gotten as far as it will go and they are looking for something novel to help them.” The research being carried out by BEMOSA has gone further than the original design brief outlined. Due to the amount of data that has been collected and the constant analysis, the researchers are learning more and subsequently adapting and improving the range of the training programmes. The future of BEMOSA is something that Professor Kirschenbaum and his team are already considering. “We have another proposal which is more or less a continuation of BEMOSA,” he explains. “It seems to me to be a logical progression of the work we’re doing now. Furthermore, considering all the data and research and looking at the future of aviation or airport security, it would be a pity if the results of the research and the conclusions and recommendations from BEMOSA were not implemented and incorporated into airport daily practice. I don’t see any reason why they should not be. I think the work that we’ve been doing will have an impact on airport security policy.” One of the main lessons learned from the BEMOSA project, and acknowledged by airport personnel from aviation and security, is the fact that technology alone will not solve the problem of security. “Today airports are viewed in terms of an industrial process; it’s all about logistics and operations research. They forget that people are involved as both employees and passengers. BEMOSA has found that it really is a people organisation rather than a manufacturing organisation. The whole concept of security, of making passengers more satisfied, making the airport more effective, and at the bottom line making money, will in the end depend upon looking at airports in terms of the people involved in them.”
At a glance Full Project Title Behaviour Modelling for Security in Airports Project Objectives Its objective is to improve security in airports by enhancing the capability of airport authority personnel to correctly detect potential security hazards and reduce false alarms. Furthermore, BEMOSA aims to improve the way in which airports learn from experience, revising and updating their safety and security skills/procedures. Project Partners Technion – Israel Institute of Technology (Coordinator - Israel); Deep Blue (Italy); BMI (Czech Republic); USE2ACES (Netherlands); Helios (UK); Technische Universiteit Delft (Netherlands); Avitronics (Greece); University of Modena e Reggio Emilia (Italy); Letiste Brno A.S. - Airport Brno (Czech Republic); CARTIF (Spain); University of Žilina (Slovakia) Contact Details Project Coordinator, Professor Avi Kirschenbaum Faculty of Industrial Engineering and Management, Technion - Israel Institute of Technology, Haifa 32000, ISRAEL T: +972 2 678 6120 / +972 54 456 3384 E: bemosa@bemosa.eu W: www.bemosa.eu
Professor Avi Kirschenbaum
Project Coordinator
Alan (Avi) Kirschenbaum (Professor of Organizational Sociology and Disaster Management, Technion – Israel Institute of Technology) is an Executive member of the International Research Committee on Disasters (RC-39) and the European Disaster and Social Crisis Research Network, a consultant to a Canadian national project in emergency preparedness and a member of numerous scientific associations. Author of “Chaos Organization and Disaster Management” (Marcel Dekker, 2003) and many scientific journal articles, Kirschenbaum is presently coordinator of BEMOSA, a European Union FP7 consortium.
Food security A key issue for JRC Director-General The Joint Research Centre (JRC) is the only part of the European Commission charged with direct research. With the global population set to rise to 8 billion by 2030, food security is a key issue for Dominique Ristori, the JRC’s Director-General
As Director-General of the European Commission’s Joint Research Centre (JRC), Dominique Ristori has a lot on his plate. Energy, transport, health, agriculture, environment and climate change research all fall under his remit, a daunting in-tray by anyone’s standards. However, Mr Ristori is not the type to back away from a challenge. Since his appointment in November 2010 he has thrown himself into the role. Food security is an area that Mr Ristori sees as a particular priority, describing it as ‘one of the biggest challenges that humanity has ever faced.’ The world’s population currently stands at around 7 billion people, of whom almost 1 billion don’t have access to what Mr Ristori describes as ‘normal’ levels of food. This is mainly in the developing world, but the issue of food security is also having a wider impact; recent rises in food prices put budgets everywhere under strain, adding to wider economic worries. Emerging demographic trends are likely to add to the pressure.
Demographic trends Continued growth in the global population, which is expected to rise to 8 billion by 2030, will put further strain on the world’s food resources, while rapid urban development in countries like China and India and the growth of a more affluent middle class will have a similar effect. The growth of these new social classes offers significant economic opportunities for western commerce. With Europe in the grip of its worst financial crisis since the ’30s many countries are keen to develop their exports to emerging economies like China and India, aiming to tap into a growing market for consumer goods. The scale of the food security challenge is immense, underlining the importance of continued research into new, more efficient production methods. This is something in which the JRC is playing a major role. “We develop a range of projects that are concerned with the necessary evolution of agricultural production in the light of climate change,” says Mr Ristori.
The European Union is a major aid donor across the world, and the JRC has strong links with the UN and other important global agriculture organisations. Food security is very much a global problem and so requires a global solution. “We have to promote a sort of appropriate global governance in this area,” stresses Mr Ristori. The key issue is how to enhance production in a sustainable way; Mr Ristori says continued scientific research is crucial. “We have to produce more food and we have to produce it better. This means that scientific research is absolutely fundamental in order to find out which production techniques will be best for the environment.”
Food security The JRC is pursuing research into a range of areas, including the typology of soils, soil degradation and deforestation, as part of this work. Many people view agriculture as quite a traditional industry but this does not mean there is no scope for technical innovation or improvements in efficiency. The statistics underline the potential impact of technical improvements in agriculture.
Currently agriculture accounts for approximately 70 per cent of global water withdrawals from rivers and aquifers and for around 10-12 per cent of global greenhouse gas emissions. Even relatively small reductions in these figures could have wider effects on the global food security issue. Agriculture varies in economic importance across the world, forming the bedrock of some countries’ exports while remaining peripheral in others, yet it is part of virtually every major economy, and reductions in emissions could have a correspondingly wide impact. Soil quality is another important issue in terms of sustainability. Pesticides and fertilisers can enhance production levels, but often harm the health of the soil in the long term, leaving it unable to support crops. Therefore a delicate balance, underpinned by clear research data, needs to be struck between enhancing production and maintaining the health of the soil. The JRC’s Food Security and Global Governance conference, held in Brussels in September 2011, addressed this and many other questions. The conference brought together key international
organisations, producer countries and EU politicians to discuss food security issues and how technology and innovation can be used to improve production. Among the most prominent issues was that of identifying areas at risk of crop failure before it occurs. The JRC is at the forefront of innovation in monitoring and early warning systems using earth observation techniques, agro-meteorological models and information and communication technologies.
Crop development
One of the models developed by the JRC simulates crop development and identifies water stress, and hence areas at risk of crop failure, as happened in the Horn of Africa in 2011. The short-rains cropping season in the region failed between November 2010 and March 2011, followed by the failure of the main cropping season later that year. This led to a devastating drought, which has affected more than 13 million people across Ethiopia, Kenya and Somalia, and the effects continue to be felt today. The EU continues to provide aid to the region, but the long-term goal has to be improving food production and making global agriculture more resilient. The JRC is supporting the work of the EU through dozens of crop monitoring bulletins designed to provide relevant information to people on the ground. These bulletins include a specific focus on the Horn of Africa crisis, and provide information at a regional and national level to EU external services, national authorities and international organisations. This information supports organisations in their decision-making processes and humanitarian interventions, ensuring aid is targeted at those who need it most. The JRC is also a scientific partner in the Integrated Food Security Phase Classification, which played an important role in identifying the severity of the crisis.
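Underlying such crop-simulation models is, at heart, water-balance bookkeeping: comparing the water a crop needs with the water it actually receives over a season. The following is a deliberately minimal sketch of that idea in Python; the function name, threshold and data are invented for illustration and are not the JRC’s operational model.

```python
# Illustrative water-balance stress indicator (a toy sketch, not the JRC model).
# Idea: accumulate the shortfall between crop water demand and rainfall over a
# season; a large relative shortfall flags a risk of crop failure.

def water_stress_index(rainfall_mm, crop_demand_mm):
    """Seasonal water deficit as a fraction of total crop demand.

    Inputs are per-dekad (10-day) totals for one growing season.
    0.0 means demand fully met; values near 1.0 mean near-total failure.
    """
    deficit = sum(max(demand - rain, 0.0)
                  for rain, demand in zip(rainfall_mm, crop_demand_mm))
    total_demand = sum(crop_demand_mm)
    return deficit / total_demand if total_demand > 0 else 0.0

# Hypothetical season in which the rains collapse mid-season.
rain = [35, 40, 12, 5, 0, 0, 8, 20]        # mm per dekad (invented data)
demand = [30, 35, 40, 45, 45, 40, 35, 30]  # mm per dekad (invented data)
wsi = water_stress_index(rain, demand)
print(f"water stress index: {wsi:.2f}")    # 0.63 here
if wsi > 0.5:                              # threshold chosen arbitrarily
    print("early warning: high risk of crop failure")
```

Operational systems combine indicators of this kind with satellite observations and agro-meteorological forecasts, but the underlying deficit logic is the same.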
Access to resources
This work also needs to be placed in the wider social and political context. Research by the JRC shows that historically many armed conflicts have arisen over access to food, water and mineral resources; even seemingly minor issues can quickly escalate and exacerbate existing tensions, entrenching negative attitudes and prejudices. These prejudices are often at the core of ongoing resentment and mistrust, which may then be passed on to the next generation. These kinds of problems will not be immediately resolved by distributing resources more equitably, but evidence from the JRC shows that management of food resources is a major issue in conflict zones across the world. The JRC is looking at four pilot study areas: the Great Lakes region of Africa, the Horn of Africa, Western Africa and Central Asia. Natural resources such as forests, croplands or grasslands are unevenly distributed across the globe and in some cases are poorly managed or exploited, leading to resentment among those affected. The JRC has gathered and analysed data on about 1,500 conflict events and identified the factors that could potentially lead to armed conflicts. Prominent among them are economic underdevelopment, proximity to mineral resources and the prevalence of grasslands or croplands, with research confirming that many conflicts are related to food issues. Alongside the human impact of conflict, it can also have a long-lasting impact on agriculture. Even small-scale conflicts can disrupt food production and distribution networks, leading to rises in the price of food or even its outright destruction.

It is very important to understand and to stress the important input that science and research can give to the definition and implementation of food security in the world

There is no easy solution, but effective, accurate monitoring can help identify potential food security problems before they occur and prevent them from escalating. The JRC has developed tools to constantly monitor droughts in Europe and has defined indicators for monitoring and assessing drought and desertification processes across the globe. Monitoring technologies are only one part of the picture, and not everything can be solved by science. Nevertheless, Mr Ristori is in no doubt that scientific research is crucial to food security. “It is very important to understand and to stress the important input that science and research can give to the definition and implementation of food security in the world.”
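The drought indicators mentioned here are typically anomaly-based: rainfall over the current period is compared with the long-term record for the same place and season. A heavily simplified, z-score version of that logic is sketched below; operational indices such as the Standardised Precipitation Index fit a probability distribution to the record rather than using a plain z-score, and the data here are invented.

```python
# Simplified precipitation-anomaly indicator (illustrative only).
from statistics import mean, stdev

def precipitation_anomaly(history_mm, current_mm):
    """Standardised anomaly of current seasonal rainfall against the same
    season in previous years; strongly negative values indicate drought."""
    mu, sigma = mean(history_mm), stdev(history_mm)
    return (current_mm - mu) / sigma if sigma > 0 else 0.0

# Hypothetical ten-year record of seasonal rainfall (mm) at one station.
history = [210, 195, 230, 250, 190, 205, 240, 220, 215, 235]
z = precipitation_anomaly(history, current_mm=120)
print(f"anomaly z = {z:.1f}")  # about -5.0: an exceptional drought signal
```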
New insights into the evolution of the solar system
Chondrules in a section of a primitive chondrite. Image courtesy of M. Champenois, CRPG.
The grains and solids that have accumulated in meteorites can provide important insights into the evolution of the solar system. Dr Marc Chaussidon explains how the CEMYSS project’s work in developing new analytical techniques will help researchers learn more about the early years of the solar system

Studies of meteorite composition can offer valuable insights into the formation of the solar system. Meteorites formed in the early years of the solar system have accumulated grains and solids throughout their evolution; this area forms the primary research focus of the CEMYSS project. “The idea of our project is to study meteorites and develop new analytical techniques to measure the isotopic
composition of different components of meteorites. This will help us put the evolution of the solar system into the context of what astrophysicists have observed in young stars and their accretion disks,” says Dr Marc Chaussidon, the project’s scientific coordinator. Researchers in the project are using a special analytical tool called an ion microprobe to analyse the isotopic composition of meteorites. “We
can look at how the sun was formed from the mixture of inter-stellar gas and dust, and how this dust merged into small objects which then grew together to make bigger objects and planets,” explains Dr Chaussidon. “One very important observation which has emerged from various studies is that the processes involved in forming the sun were very rapid and led to the rapid formation of the
first small planets, called planetesimals, which were 60-100 kilometres in diameter.”

Thin section of the Chassigny meteorite showing olivine crystals from the mantle of Mars. Image courtesy of B. Zanda, MNHN.
Planetesimal formation
It is thought these planetesimals formed very rapidly around the sun, in the accretion disk, over the first 2-3 million years after its formation. Previously it was believed that planets grew through a gradual process of hierarchical evolution, but recent astrophysical models have shown that it was in fact possible to go straight from dust to planetesimals; this has significant implications for researchers’ understanding of meteorites’ chemical composition. “There are samples of small planetesimals in meteorites, and if we can identify the types of planetesimals they came from then we can date the sample and measure its composition. We are trying to find the logical steps in between these stages, to determine the timescale which operated in our own solar system, and to compare that
to astrophysical models,” outlines Dr Chaussidon. By measuring the abundance of short-lived radioactive nuclides in meteorites Dr Chaussidon says it is possible to investigate the ages of those meteorites very precisely. “We are measuring the products of radioactive elements which were present at the beginning of the solar system,” he explains.
Planetary evolution
The short half-life of these radioactive nuclides means researchers can use them to reconstruct the chronology of early processes in the solar system, even within the wider timescale of its evolution. The two key nuclides being used are aluminium-26 (26Al), which has a half-life of approximately 700,000 years, and beryllium-10 (10Be), which Dr Chaussidon says was probably produced by the sun. “When the sun formed it was very active and it emitted a lot of particles. The particles emitted by the present day sun make the solar wind radiation – the early sun was probably much more active than the present day sun,” he explains. These particles collided with the gas and dust of the accretion disk, leading to nuclear reactions and the production, among others, of 10Be, which is a signature of the activity of the young sun. “The sun was very active during the first 1-2 million years after its formation, so 10Be is the signature of this period. It can be used to relate components of meteorites to the activity of the young sun,” continues Dr Chaussidon. “Another reason for focusing on these short-lived radioactive nuclides is that nuclides like 26Al were probably responsible for the fact that some of the early planets heated very rapidly and melted.” This was probably due to the heat generated by the decay of 26Al, showing that short-lived radioactive nuclides are not only effective tools for establishing the chronology of solar processes, but also help control the evolution of planets.

The idea of our project is to study meteorites and develop new analytical techniques to measure the isotopic composition of meteorites. This will help us put the evolution of the solar system into the context of what we and astrophysicists have observed in young stars and their accretion disks

The instrument developed for the purposes of the CEMYSS project: ion microprobe Cameca 1280HR2.

The project’s work in developing an ion microprobe, a type of mass spectrometer, means these short-lived radioactive nuclides can be measured to a high level of precision. “Take 26Al, which decays to magnesium-26 (26Mg); if you measure the three magnesium isotopes you can precisely determine the excess amount of 26Mg, which is derived from the decay of 26Al. You can then calculate the age of the meteorite much more precisely,” explains Dr Chaussidon. The project works with astrophysicists to model the production of 26Al and its distribution in the disk, and hence better understand the processes which control it. “If you compare two samples containing excess amounts of 26Mg, then you can say for instance that the one which contains less is younger than the other, because the 26Al has decayed before its formation,” says Dr Chaussidon. “This is of course based on the questionable assumption that originally the 26Al distribution was
homogeneous in the disk.” This work on the distribution of 26Al is just part of the project’s approach. Other elements were produced continuously over a period of approximately 1-2 million years in the early years of the solar system; by also measuring the abundance of 10Be Dr Chaussidon says it is possible to gain further detail. “We can couple different short-lived radioactive nuclides which have different sources, different half-lives and which decay at different rates. By coupling them within the same meteoritic component we can put much better constraints on its age and the processes involved in its formation,” he stresses.

This is also relevant to the project’s goal of discovering the very first planetesimals; oxygen isotopic composition is a very important tool in this regard. “Oxygen has three stable isotopes. We know that oxygen isotopic composition is very variable in the solar system,” explains Dr Chaussidon. “For instance you can identify a meteorite from Mars simply because of its oxygen isotopic composition. Presumably, planetesimals also had different oxygen isotopic compositions, reflecting the variable oxygen isotopic composition in the accretion disk.” The source of this variability has not been established. Different possibilities have been suggested, which in turn could be used to help identify planetesimals; alongside oxygen, Dr Chaussidon is also looking at nitrogen isotopic composition in the early solar system. “We are working to determine the oxygen and nitrogen isotopic composition of the sun itself,” he says. The project is also using data gathered by other initiatives in this work. “I have been working to measure the nitrogen isotopic composition of the sun
Primitive chondrite with its fusion crust. Image courtesy of B. Zanda, MNHN.
by analysing samples from NASA’s GENESIS mission, which sampled the solar wind,” continues Dr Chaussidon. “Another group did the same with oxygen. This established that the sun is very different in oxygen and nitrogen isotopic composition from other objects in the solar system, including the Earth and meteorites. Initially there was extremely wide variation in oxygen isotopic composition – we are trying to find components within meteorites that we think are fragments of those planetesimals that formed very early. We can then pair these fragments together by measuring their oxygen isotopic composition.” The project’s findings could then be integrated with those from other initiatives to gain a more complete picture. This will be enormously relevant to our understanding of how the solar system evolved. “Our results will be used to establish times and processes in the early solar system. So they are very important to understanding the growth of planets,” stresses Dr Chaussidon. The next step is to discover very short half-life radioactive nuclides which would be important in linking the formation of some components in the meteorites to the early evolution of the sun. “For instance irradiation processes around the sun could have shaped and modified the composition of the dust. This would be very important to understanding the chemical composition of planets,” says Dr Chaussidon. “Our efforts at the moment are focused on trying to discover these short-lived radioactive nuclides. We think we have also identified some of these very early planets which were destroyed, and we are working on trying to reconstruct their geological history based on much more precise measurements.”
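The relative chronology described in this article follows directly from the exponential decay law. Assuming, as Dr Chaussidon notes one must, an initially homogeneous 26Al distribution, the inferred initial 26Al/27Al ratios of two components give their age difference – a standard cosmochemical relation, stated here purely for illustration:

\[
\Delta t \;=\; t_2 - t_1 \;=\; \frac{t_{1/2}}{\ln 2}\,
\ln\!\left[\frac{(^{26}\mathrm{Al}/^{27}\mathrm{Al})_1}{(^{26}\mathrm{Al}/^{27}\mathrm{Al})_2}\right],
\qquad t_{1/2} \approx 0.7\ \mathrm{Myr}.
\]

A component whose inferred initial ratio is half that of another therefore formed about one half-life – roughly 700,000 years – later, which is how small excesses of 26Mg translate into the million-year chronology discussed above.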
At a glance

Full Project Title
Cosmochemical exploration of the first 2-3 million years of the solar system (CEMYSS)

Project Objectives
Make progress in our understanding of the processes by which the nebular gas was transformed into solids and planets around the early sun at the very beginning of the solar system.

Project Funding
12 million euros

Contact Details
Project Coordinator, Marc Chaussidon
CRPG-CNRS BP 20
54501 Vandoeuvre-lès-Nancy, France
T: +33 3 83 59 42 25
E: chocho@crpg.cnrs-nancy.fr
W: www.crpg.cnrs-nancy.fr/index.php
W: www.crpg.cnrs-nancy.fr/Sonde/intro-sonde.html
Marc Chaussidon
Project Coordinator
Marc Chaussidon graduated from the Ecole Nationale Supérieure de Géologie in Nancy, France, and is Directeur de Recherches CNRS at the Centre de Recherches Pétrographiques et Géochimiques (CRPG) in Nancy. He is a geochemist and cosmochemist specialising in isotopic analysis by ion microprobe. His interests concern the formation and early evolution of the solar system and the Earth.
Professor Christophe Salomon, a leading expert in the field of ultracold quantum gases and Head of the cold Fermi gas group at ENS, speaks to EU Research about FERLODIM, a five-year ERC-funded research project that hopes to fully characterise the properties of Fermi gases
FERLODIM: A High Precision Exploration of the Properties of Ultracold Gases of Fermions

FERLODIM, or Fermi Gases in Lower Dimensions, seeks to take advantage of recent advances in cooling and trapping techniques to explore the unique quantum properties of Fermi gases. For this task it aims to develop two ultracold atom machines, which will enable a full characterisation of the properties of Fermi gases. “In nature we have two types of particles, depending on their internal spin: Fermions, with half-integer spins, and Bosons, with integer spins,” Professor Salomon explains. “Both types of particles behave in a very different manner. The individual components of matter, such as protons, neutrons and electrons, are all Fermions, and understanding the properties of interacting Fermions represents a key challenge of modern quantum physics. These gases are at temperatures just above absolute zero – around ten nanokelvin. At this temperature the quantum nature of the particles becomes dominant, and so essentially what is interesting in our field is studying quantum matter in a regime where quantum effects and interactions are dominant.” Professor Salomon’s work with laser cooling techniques began in the late 1980s and early 1990s with a group of colleagues at the Ecole Normale Supérieure. Following advances in the manipulation of atoms, and the discovery of Bose-Einstein Condensation at JILA and MIT in the US in 1995, Salomon turned towards exploring quantum phases of fermionic matter at low temperatures in 1997. “Particles described by Bose statistics undergo Bose-Einstein
Condensation at low temperature,” Professor Salomon explains, “macroscopic occupation of the lowest energy state in a trap. All atoms move in lockstep and behave in exactly the same way, but this is forbidden by quantum mechanics for Fermions. Two Fermions cannot occupy the same quantum state of the trap, and because of this Fermions behave very differently.” Over the following years, new cooling techniques had to be adapted for Fermions; in 1999 the quantum regime for Fermions was first obtained at JILA, and in 2001 at ENS by Salomon’s team. A breakthrough in the field occurred in 2003, when Salomon and his team discovered that a fermionic gas could be stable in the strongly interacting regime. “This was a great surprise,” Professor Salomon explains. “We could produce molecules by attaching together two Fermions, and this pairing was found to be extremely stable, even though their binding energy was extremely small. Our collaborative work with D Petrov and G Shlyapnikov allowed us to understand and explain this remarkable stability.” This discovery opened up new avenues of investigation, and soon the fermion pairs were condensed into a superfluid phase at Innsbruck, JILA, ENS, and MIT. “This made an interesting link with condensed matter physics; the pairing of electrons in superconductors, for instance.” With metallic superconductivity, despite the natural repulsion between particles with the same charge, electrons in matter are weakly attracted to each other due to their interaction with phonons. It was
exactly one hundred years ago that Kamerlingh Onnes discovered the surprising properties of superconductivity. He discovered that at low temperatures, the electrical resistance of certain metals drops to zero. “This only occurs at low temperatures, where quantum effects manifest themselves more prominently.” Working instead with cold gases allows the FERLODIM team to exert more control over the atoms, and also offers the ability to fine-tune the interactions between them. “We can fully explore this link between the superfluidity of fermions with weak attraction and the superfluidity of strongly bound fermion pairs. This connection was pointed out more than 25 years ago but never checked before the advent of cold atomic gases,” Professor Salomon explains. “When you take two Fermions and bind them into a molecule, you create a Boson. If the binding energy is large these Bosons are described by Bose-Einstein statistics and undergo Bose-Einstein Condensation.” One of the biggest benefits of working with dilute gases is the amount of control the researchers have over the system under study; nearly all parameters of the experiment can be tuned – for instance the interactions between atoms, as mentioned above, but also the trapping potential, the dimensionality of the system, the atomic density and the temperature. Furthermore, these quantities can be made time-dependent, and many of the tools of atomic physics, such as precision spectroscopy and matter-wave interferometry, can be used to probe the new properties of these quantum gases. Professor Salomon explains: “Because the atoms are so far
apart, their interactions can be modelled essentially with one parameter, avoiding a very complex description with interaction potentials. Furthermore, this interaction can be tuned; we can turn a dial and either increase or decrease interaction, or make the gas attractive or repulsive. This is unique. Within a solid, attraction cannot be changed.” Professor Salomon tells us that the systems being developed as part of FERLODIM will not revolutionise society per se, but they will provide a clearer understanding of the quantum mechanics involved in the superfluidity of quantum gases. This understanding will enable the researchers to turn their attentions to solid-state materials. “This is a model system,” Professor Salomon says. “We are trying to learn everything we can in order to transpose this to new systems, new materials.”

Photograph by J F Dars

“The critical temperature for superconductivity is very high in our system. If we were able to design solid-state materials with exactly the same properties as our cold atom systems, then these materials would be superconducting even above room temperature”

A far-reaching goal of this would be the eventual development of superconductors that can operate at or above room temperature. “We and others have found that the critical temperature for superconductivity is very high in our system. If we were able to design solid-state materials with exactly the same properties as our cold atom systems, then these materials would be superconducting even above room temperature, at 500 Kelvin, or 200 Celsius. This would then solve the problem of losses in the transportation of energy.” Such an undertaking is not easy as the FERLODIM ultracold gases are model systems; they lack the tremendous complexity and richness of real materials. Professor Salomon believes that conceptually it is of
great importance to identify the basic ingredients needed to further this field and also to guide theory. “In FERLODIM, we have been able to make for the first time a detailed comparison between high-precision experimental data and the most advanced theory models.” FERLODIM’s experimental data has questioned current theories within the field; data collected across a range of temperatures studied as part of the project does not tally with many of the current theoretical discussions. “This was surprising to us, but it illustrates the difficulty of these theories.” Professor Salomon also pointed out that recent theories developed in the US have been compared against the project’s data. “This is not even published yet, which goes to show you just how quickly the field is developing.”

“In fundamental physics, you have to let freedom take over, give people the freedom to explore new things”

This is another aspect of FERLODIM that sets it apart from other research projects. The benchmarking of the most advanced theoretical concepts allows the ENS team to help further develop current theories. “This is a fascinating concept for me,” Professor Salomon explains. “You can get an experimental answer to a complicated many-body problem by the precise measurement of gas properties and then make a detailed comparison with theoretical models.” The FERLODIM project is pushing the frontiers of experimental investigation with regard to trapping strongly interacting Fermi gases. Much of the work currently being undertaken as part of the project did not form part of the original proposal; as the research began to take shape, new avenues of investigation opened up and were explored by the FERLODIM team. “In fundamental physics, you have to let freedom take over, give people the freedom to explore new things,” Professor Salomon says. “What surprised us in the course of our research was the fact that we could push the method so far and to such high precision.” For Professor Salomon this is the beauty of research: the ability to take a relatively simple idea and follow wherever the results may lead, even if it transcends the original idea. FERLODIM is approximately halfway through its five-year run time, and the results that the team have so far discovered are opening up further areas for investigation. “In terms of the future of this research,” Professor Salomon tells us, “we have plenty of ideas to implement. We have already built a new experiment where we will further explore the possibilities of these systems by looking specifically at two different species of fermions – heavy ones and light ones. With the economic climate in Europe the way it is, we have been very privileged to receive such ERC funding that has allowed us the research freedom we have had.”

At a glance

Full Project Title
Fermions in low dimensions (FERLODIM)

Project Objectives
• Quantum simulation of strongly correlated fermions
• Understand the low temperature phase diagram, in particular the superfluid and normal phases
• Benchmark theoretical models
• Study dynamic and thermodynamic properties
• Develop a new experiment with a mixture of fermions
• Develop a new solid-state laser source at 671 nm for cooling of lithium atoms

Project Funding
€2.05 million

Project Partners
Yvan Castin, Ecole Normale Supérieure, Research Director at CNRS

Contact Details
Project Coordinator, Professor Christophe Salomon
Laboratoire Kastler Brossel, Ecole Normale Supérieure
24 rue Lhomond, 75231 Paris, France
T: +33 1 44 32 25 10
F: +33 1 44 32 34 34
E: salomon@lkb.ens.fr
W: www.lkb.ens.fr/-Ultracold-Fermi-gases

Professor Christophe Salomon
Project Coordinator
Christophe Salomon obtained his PhD in 1984 at Paris 13 University (France). He currently works on quantum gases and ultrastable clocks at the Laboratoire Kastler Brossel, Ecole Normale Supérieure in Paris, France. As a Research Director at CNRS, he is Head of the cold Fermi gas group at ENS and Principal Investigator for the ACES/PHARAO space clock mission of ESA. His research interests range from superfluidity in quantum gases to high precision measurements and fundamental physics tests with space clocks.
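The Fermi-Bose contrast at the heart of FERLODIM can be made concrete with the textbook occupation functions. The short sketch below is standard physics rather than project code; the temperatures and energies are arbitrary illustrative values.

```python
# Mean occupation of a single-particle state of energy E at temperature T.
# Fermions (Fermi-Dirac, +1 in the denominator): occupation can never exceed
# 1, the Pauli principle. Bosons (Bose-Einstein, -1): occupation grows without
# bound as E approaches mu, the onset of Bose-Einstein condensation.
import math

def fermi_dirac(E, mu, kT):
    """Fermi-Dirac occupation: always between 0 and 1."""
    return 1.0 / (math.exp((E - mu) / kT) + 1.0)

def bose_einstein(E, mu, kT):
    """Bose-Einstein occupation (valid for E > mu): unbounded near E = mu."""
    return 1.0 / (math.exp((E - mu) / kT) - 1.0)

kT = 0.05  # low temperature, arbitrary energy units
print(fermi_dirac(0.5, mu=1.0, kT=kT))      # ~1.0: a state below mu is full
print(fermi_dirac(1.5, mu=1.0, kT=kT))      # ~0.0: a state above mu is empty
print(bose_einstein(0.010, mu=0.0, kT=kT))  # ~4.5: bosons pile into low states
print(bose_einstein(0.001, mu=0.0, kT=kT))  # ~49.5: diverging as E -> mu
```

Pairing two fermions, as in the experiments described above, produces a composite boson, which is why tightly bound fermion pairs can condense.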
One of the world’s leading authorities on the subject of solar sails, Professor Colin McInnes talks to EU Research about the VISIONSPACE project and the development of novel approaches to orbital dynamics to improve telecommunications and space exploration
VISIONSPACE: Taking Orbital Dynamics to New Levels

Since the launch of Sputnik, the first man-made satellite, in 1957, the number of satellites in orbit around Earth has increased dramatically. Current figures from the Space Surveillance Network estimate that there are around 8000 objects in orbit around the planet with a diameter no smaller than 10 cm; of this number, 974 are operational satellites, with the rest made up of dead satellites and assorted space debris. As the relative costs of production have decreased and demand has increased over the years, governments, space agencies and companies have found themselves in a position to place more satellites in orbit with relative ease. As a direct result, space in the most useful orbits is coming at a premium; the space in space, as it were, is running out around Earth. This is just one of many challenges being addressed by the VISIONSPACE project, a
five-year project funded by a European Research Council Advanced Investigator Grant. Based at the University of Strathclyde’s Advanced Space Concepts Laboratory (ASCL), which opened in 2009 under VISIONSPACE funding, the project is investigating a number of avenues of orbital dynamics that have far-reaching applications for space exploration and the enhancement of telecommunications capacity, as well as developing novel spacecraft designs with both commercial and scientific applications. Professor Colin McInnes is the VISIONSPACE project’s Principal Investigator, and Director of the ASCL. His work on solar sails has made him one of the leading figures in the field, and his current research interests centre on the orbital dynamics and mission applications of solar sail
spacecraft. Part of this work includes the development of families of highly non-Keplerian orbits for solar sails and other spacecraft which can enable novel applications. Professor McInnes spoke to EU Research about the VISIONSPACE
project and some of the avenues they have been investigating.

Swarm of ‘smart dust’ sensor nodes

One of the primary goals of the VISIONSPACE project is to investigate ways of establishing novel orbits that would provide new opportunities for space science, telecommunications and Earth observation. For example, VISIONSPACE has been instrumental in proving a theory that, until recently, had been thought impossible. The use of a levitated geostationary orbit, accomplishable with the aid of a solar sail, would allow satellites to follow an orbital path that sits outside normal, Keplerian orbits. Professor McInnes and PhD student Shahid Baig have published a paper showing that the use of a solar sail attached to a satellite could elevate its orbit between 10 and 30 km north or south of the standard geostationary ring. “Attaching a large reflective sail could utilise sunlight to propel a satellite above or below the equator,” Professor McInnes explains. “This would provide enough thrust to sustain a geostationary position, almost indefinitely, without the need for any additional fuel that would otherwise be required to maintain this orbital position.” These non-Keplerian orbits present an opportunity to significantly boost the number of satellites that can be placed in popular orbits around the Earth, offering applications for greater telecommunications networks as well as for weather forecasting, among others. Another aspect of the VISIONSPACE project, also incorporating novel orbits, is the investigation into stationary orbits with a particular focus on the two poles.
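Both the levitated geostationary orbit described above and the pole-sitters discussed next depend on how much continuous acceleration sunlight can actually supply. A back-of-envelope sketch using the standard ideal-sail scaling is given below; the efficiency and sail loading are assumed example values, not VISIONSPACE design figures.

```python
# Ideal solar-sail characteristic acceleration at 1 AU (back-of-envelope).
# a = 2 * eta * P / (sigma * c): P is the solar flux, c the speed of light,
# sigma the sail loading (total mass per unit sail area) and eta the
# reflective efficiency. All numbers below are illustrative assumptions.
P = 1361.0     # W/m^2, solar flux at 1 AU
c = 2.998e8    # m/s, speed of light
eta = 0.9      # reflective efficiency (assumed)
sigma = 0.010  # kg/m^2 sail loading, i.e. 10 g/m^2 (assumed)

a = 2 * eta * P / (sigma * c)
print(f"characteristic acceleration: {a * 1000:.2f} mm/s^2")  # ~0.82 mm/s^2
```

Continuous accelerations of this order, slight as they seem, are what allow an orbit to be displaced by tens of kilometres and held there indefinitely without propellant.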
These ‘pole sitters’ are devices that would utilise solar sail propulsion and solar electric propulsion in hybrid form in order to sustain a continuous stationary position above the poles. Professor McInnes explains: “This would mean that we can capture real-time observations of the Earth’s poles, with a full hemispheric view, and could enable a wide range of new applications, including monitoring of the ice cover and line-of-sight telecommunications to high-latitude regions.” System designs have been drawn up, and the hybrid solar sail / solar electric propulsion system has the potential to offer significant improvements to mission performance.

Solar sail trajectory – ‘pole-sitter’ orbit for polar Earth observation

Earth-centred orbits are not the only concern of the VISIONSPACE project; there are other aspects of the project which offer
many possibilities. Near Earth Asteroids are one of the biggest long-term threats to our planet from space; a collision could potentially wipe out life as we know it. However, they also represent a valuable source of materials for future large-scale space engineering ventures. As Professor McInnes explains: “You would be surprised how easy it is in principle to manipulate the trajectory of an asteroid; using a continuous push from a spacecraft and altering the trajectory of the asteroid at just the right point, it would be possible to manoeuvre one into orbit around Earth.” Such a venture would provide us with a wealth of raw materials that could be used to prime future ventures in space exploration and exploitation. “A 24-metre M-class asteroid could potentially provide around 30,000 tonnes of metal, and even 1 tonne of platinum group metals.” This would save the need to transport such materials into space, reducing launch costs in the process. Other benefits could include harvesting the water content of such asteroids; many are hydrated carbonaceous asteroids, and as such could produce significant volumes of water. Further research within the VISIONSPACE project is investigating the possibility of a space-based solution to global climate change. The concept of geo-engineering looks at altering the planet’s climate by using large clouds of small dust grains to help reduce the amount of sunlight that reaches the Earth. In statistical terms, a 1.7% reduction of sunlight reaching the planet will offset
the effects caused by a 2°C increase in global temperature. “The process is called solar insolation reduction,” Professor McInnes explains. “It can be achieved by dispersing dust particles from NEAs, or even transporting dust from the surface of the Moon using mass drivers.” The dust clouds could be placed close to the first Lagrange point between the Sun and the Earth, where they would be relatively unaffected by gravity and would loiter on the Sun-Earth line. Another aspect of this particular investigation is to explore the possibility of using a captured NEA positioned at the Lagrange point to form an “anchor” for the dust cloud, thereby increasing its effectiveness. Professor McInnes also spoke about the future possibilities of swarm technology within space-based applications. “Recent advances in miniaturisation have enabled the fabrication of spacecraft with smaller length scales,” he explains. “There are even examples of spacecraft with the dimensions of a single microchip.” Such spacecraft are relatively low cost to manufacture, and so vast numbers of these “smart dust” devices can be constructed for use in a number of swarm applications, such as global sensor networks for Earth observation and communications, real-time sensing for space science, and supporting damage detection on conventional spacecraft, among others. VISIONSPACE’s considerations of orbital dynamics will come into play here, as many of the swarm applications will rely on solar radiation pressure and aerodynamic drag to manoeuvre the smart dust devices into orbits around the Earth. “On top of this, the short lifetime of the swarm can be increased by utilising the same light pressure to gain energy, and the drag to dissipate it.” Furthermore, the effect of drag can be
exploited at mission’s end in order to ensure that no long-term debris is left in orbit; the idea is that the swarm burns up in its entirety within the Earth’s atmosphere. Professor McInnes describes the VISIONSPACE project as blue-skies research. Some aspects of the research being undertaken as part of the project are not yet applicable in the real world due to technological limitations; however, the results of the project will provide a clearer picture of the plausibility of such ventures. “The key benefit of European Research Council support is the ability to conduct curiosity-driven research which can help shape the future.” The benefits from the VISIONSPACE project are not limited to the scientific communities and governments; the knock-on effect of the research has the potential to bolster the local economy. Professor McInnes commented that “the space sector looks as if it could be immune from the effects of the recession”. He highlighted rapid growth within the UK and European space sector, and believes that “there is a successful collaboration between companies and academics to help bring jobs to Scotland in this growing sector.” In recognition of the work that the project has already done, the ASCL was awarded the 2011 Sir Arthur C Clarke Award for Achievement in Space Research, presented at the UK Space Conference at Warwick University; Professor McInnes remarked that “this was recognition of the space innovation delivered by the entire team at the University of Strathclyde.”
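As a rough energy-balance check on the 1.7 per cent figure quoted above, using standard textbook values rather than figures from the project: a 1.7 per cent cut in sunlight changes the globally averaged absorbed solar flux by approximately

\[
\Delta F \;\approx\; 0.017 \times \frac{S_0}{4}\,(1-\alpha)
\;\approx\; 0.017 \times 340\ \mathrm{W/m^2} \times 0.7
\;\approx\; 4\ \mathrm{W/m^2},
\]

taking the solar constant S0 ≈ 1361 W/m² and a planetary albedo α ≈ 0.3. This is comparable to the roughly 3.7 W/m² radiative forcing of a doubling of atmospheric CO2, which is itself associated with warming of the order of 2°C or more, so the quoted offset is at least dimensionally plausible.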
At a glance

Full Project Title
Visionary Space Systems: Orbital Dynamics at Extremes of Spacecraft Length-Scale (VISIONSPACE)

Project Objectives
VISIONSPACE will deliver radically new approaches to orbital dynamics at extremes of spacecraft length-scale to underpin new space-derived products and services.

Project Funding
Total VISIONSPACE project budget is €2.1 million (2009-2014)

Project Partners
The Advanced Space Concepts Laboratory works with a broad range of academic, industry and agency partners including EADS Astrium, Clyde Space and the European Space Agency.

Contact Details
Project Coordinator, Colin McInnes
Department of Mechanical and Aerospace Engineering, University of Strathclyde
T: +44 (0)141 548 2049
E: colin.mcinnes@strath.ac.uk
W: www.strath.ac.uk/space
Colin McInnes
Project Coordinator
Colin McInnes is Director of the Advanced Space Concepts Laboratory at the University of Strathclyde. His work includes the investigation of families of novel spacecraft orbits and their mission applications, autonomous control of multiple spacecraft systems and advanced space concepts. Recent work explores new approaches to spacecraft orbital dynamics at extremes of spacecraft length-scale to underpin future spacederived products and services. McInnes has received national and international awards including a Philip Leverhulme Prize, the Royal Aeronautical Society Pardoe Space Award, and the Ackroyd Stuart Propulsion Prize. The International Association of Space Explorers awarded him a Leonov medal in 2007.
Space-based approaches to terrestrial geo-engineering
For more information, please visit: www.euresearcher.com