Early Warning Systems in Earth Management

In addition to the measures currently being implemented to establish an early tsunami warning system in the Indian Ocean, the German Federal Ministry of Education and Research (BMBF) has launched a portfolio of 11 research projects for developing and testing early warning systems for other natural geological catastrophes. The projects are carried out under the umbrella of the national R&D programme GEOTECHNOLOGIEN. The overall aim of the integrated projects is the development and deployment of integral systems in which terrestrial observation and measurement networks are coupled with satellite remote sensing techniques and interoperable information systems. All projects are carried out in close collaboration between universities, research institutes and small and medium-sized enterprises at the national and international level. This abstract volume contains the presentations given at the »Kick-Off-Meeting« held in Karlsruhe, Germany, in October 2007. The presentations reflect the multidisciplinary approach of the programme and offer a comprehensive insight into the wide range of research opportunities and applications.
GEOTECHNOLOGIEN Science Report
No. 10

Early Warning Systems in Earth Management
Kick-Off-Meeting, 10 October 2007, Technical University Karlsruhe

Programme & Abstracts

The GEOTECHNOLOGIEN programme is funded by the Federal Ministry for Education and Research (BMBF) and the German Research Council (DFG)

ISSN: 1619-7399
Impressum
Schriftleitung: Dr. Ludwig Stroink
© Koordinierungsbüro GEOTECHNOLOGIEN, Potsdam 2007
ISSN 1619-7399
The Editors and the Publisher cannot be held responsible for the opinions expressed and the statements made in the articles published; such responsibility rests with the authors.
Die Deutsche Bibliothek – CIP-Einheitsaufnahme: GEOTECHNOLOGIEN; Early Warning Systems in Earth Management, Kick-Off-Meeting, 10 October 2007, Technical University Karlsruhe, Programme & Abstracts – Potsdam: Koordinierungsbüro GEOTECHNOLOGIEN, 2007 (GEOTECHNOLOGIEN Science Report No. 10)
Bezug / Distribution: Koordinierungsbüro GEOTECHNOLOGIEN, Heinrich-Mann-Allee 18/19, 14473 Potsdam, Germany
Fon +49 (0)331-620 14 800, Fax +49 (0)331-620 14 801
www.geotechnologien.de, geotech@gfz-potsdam.de
Bildnachweis Titel / Copyright Cover Picture: NASA/JSC; http://visibleearth.nasa.gov
Preface
Geological events like earthquakes, volcanic eruptions and landslides devastate many regions of our planet time and again. Natural events increasingly turn into natural catastrophes: regionally due to the dense concentration of people and property in threatened areas, and globally due to the unification of the world economy. An exact prognosis of where and how earthquakes occur or volcanoes erupt is therefore essential for effective protection of the population and the economy. Within the framework of the R&D programme GEOTECHNOLOGIEN, 11 joint projects between academia and industry were launched in 2007. The objective of this research is the development and deployment of a new generation of early warning systems against earthquakes, volcanic eruptions and landslides. All joint projects are funded by the Federal Ministry of Education and Research (BMBF) with about €9 million for the next three years.
Currently supported activities focus on the following key themes:
1. Development and improvement of real-time measurement and observation systems for the online transmission of decisive physical-chemical danger parameters
2. Development and calibration of coupled prognosis models for the quantitative determination of physical-chemical processes within and at the surface of the Earth
3. Improvement of the reliability of forecasts and prognoses for decision-making and the optimization of disaster control measures
4. Implementation of mitigation measures in concrete socioeconomic damage prognoses
5. Development of information systems that ensure the prompt and reliable availability of all information necessary for the technical implementation of early warning and for decision-making by disaster managers
The main objective of the Kick-Off-Meeting was to bring together the scientists and investigators of the funded projects to present their ideas and proposed work plans to each other; several projects are interlinked and could therefore benefit from synergies. All who are interested in the forthcoming activities of the projects, from Germany, Europe or overseas, are welcome to share their ideas and results.
Ludwig Stroink Max Wyss
Table of Contents
Scientific Programme Kick-Off-Meeting »Early Warning Systems« . . . 5

GPS Stress Evaluation within Seconds (G-SEIS)
Rothacher M., Gendt G., Galas R., Schöne T., Ge M. . . . 7

Rapid Automated Determination of Seismic Source Parameters (RAPID)
Meier T., Dahm T., Friederich W., Hanka W., Kind R., Krüger F., Ohrnberger M., Scherbaum F., Stammler K., Yuan X. . . . 14

WeraWarn: Real-time detection of tsunami-generated signatures in current maps measured by the HF radar WERA to support coastal regions at risk
Stammer D., Pohlmann T., Gurgel K.-W., Schlick T., Helzel T., Kniephoff M. . . . 20

Early Warning System for Transport Lines (EWS TRANSPORT)
Wenzel F., Kühler T., Hohnecker E., Buchmann A., Schedel F., Schöbinger F., Bonn G., Hilbring D., Quante F. . . . 31

METRIK – Model-Based Development of Technologies for Self-Organizing Decentralized Information Systems in Disaster Management (DFG-Graduiertenkolleg)
Fischer J., Avanes A., Brüning S., Fahland D., Gläßer T. M., Köhne K., Quilitz B., Sadilek D. A., Scheidgen M., Wachsmuth G., Weißleder S., Kühnlenz F., Poser K. . . . 41

Earthquake Disaster Information System for the Marmara Region, Turkey (EDIM)
Wenzel F., Erdik M., Zschau J., Milkereit C., Redlich J. P., Lupp M., Lessing R., Schubert C. . . . 51

Numerical Last-Mile Tsunami Early Warning and Evacuation Information System
Birkmann, Dech, Hirzinger, Klein, Klüpfel, Lehmann, Mott, Nagel, Schlurmann, Setiadi, Siegert, Strunz . . . 62

Sensor-based Landslide Early Warning System (SLEWS)
Arnhardt C., Asch K., Azzam R., Bill R., Fernandez-Steeger T. M., Homfeld S. D., Kallash A., Niemeyer F., Ritter H., Toloczyki M., Walter K. . . . 75

Integrative Landslide Early Warning Systems (ILEWS)
Glade T., Becker R., Bell R., Burghaus S., Danscheid M., Dix A., Greiving S., Greve K., Jäger S., Kuhlmann H., Krummel H., Paulsen H., Pohl J., Röhrs M. . . . 89

Development and testing of an integrative 3D early warning system for alpine instable slopes (alpEWAS)
Thuro K., Wunderlich T., Heunecke O. . . . 101

Development of suitable information systems for early warning systems
Breunig M., Reinhardt W., Ortlieb E., Mäs S., Boley C., Trauner F. X., Wiesel J., Richter D., Abecker A., Gallus D., Kazakos W. . . . 113

Exupéry: Managing Volcanic Unrest – The Volcano Fast Response System
Hort M., Wassermann J., Dahm T. and the Exupéry working group . . . 124

Authors' Index . . . 134

GEOTECHNOLOGIEN Science Reports – Already published . . . 136
Scientific Programme
Kick-Off-Meeting »Early Warning Systems for Natural Hazards«
10 October 2007

8.30
Welcome
8.40 – 9.00
GPS Stress Evaluation within Seconds (G-SEIS)
9.00 – 9.30
Rapid Automated Determination of Seismic Source Parameters (RAPID)
9.30 – 9.50
Real-time detection of tsunami-generated signatures in current maps (WeraWarn)
9.50 – 10.10
Early Warning System for Transport Lines (TRANSPORT)
10.10 – 10.30
Presentation of the DFG-Graduiertenkolleg »METRIK«: Model-Based Development of Technologies for Self-Organizing Decentralized Information Systems in Disaster Management
10.30 – 11.00
Coffee break
11.00 – 11.40
Earthquake Disaster Information System for the Marmara Region, Turkey (EDIM)
11.40 – 12.20
Numerical last-mile Tsunami Early Warning and Evacuation Information System (Last Mile/T-RISK)
12.20 – 13.20
Lunch
13.20 – 13.50
Development of a geoservice infrastructure as basis for early warning systems for landslides (SLEWS)
13.50 – 14.20
Integrative Early Warning System for Landslides (ILEWS)
14.20 – 14.40
Development and testing of an integrative 3D early warning system for alpine instable slopes (alpEWAS)
14.40 – 15.10
Coffee break
15.10 – 15.40
Development of suitable information systems for early warning systems (EGIFF)
15.40 – 16.20
Managing Volcanic Unrest – The Volcano Fast Response System (Exupéry)
16.20 – 17.00
Final discussion
G-SEIS: GPS Surface Deformations within Seconds
Rothacher M., Gendt G., Galas R., Schöne T., Ge M.
Department of Geodesy and Remote Sensing, GeoForschungsZentrum Potsdam, 14473 Potsdam, Germany
Abstract
The G-SEIS (GPS-SurfacE deformations withIn Seconds) project focuses on monitoring the Earth's surface on different temporal and spatial scales using GPS and/or similar or future space-based GNSS systems (GLONASS, GALILEO) for a new generation of monitoring systems for hazards such as earthquakes, landslides and volcanic eruptions. In order to meet the requirements on accuracy, time latency and spatial and temporal resolution of the GPS-derived deformation, new conceptual sensor stations with modern data communication and transfer capabilities will be developed. New strategies for real-time data processing for different applications are designed, and the related software package will be developed for real-time data processing of huge global networks with both static and kinematic stations. The developed system will be tested in two prototype applications, the GPS-Shield for Sumatra and the Automatic GPS Array for Volcano Early Warning System (AGAVE). The related work packages are presented here in detail.

1. Introduction
While GPS is able to measure and quantify surface movements of a source region in real time, seismological networks need a certain time span to accumulate data in order to determine the nature of the rupture process, and some remote sensing techniques with a high spatial sampling give only inadequate temporal sampling, depending on the orbital parameters of the satellites. For time-critical applications, e.g. the tsunami event in Sumatra, GPS networks might be the only way to get a fast insight into the nature of a rupture process. Using combined seismological and GPS information will make it possible to issue alerts and also to support near-real-time modeling of tsunami propagation. Today, in scientific Earth-monitoring applications, GPS is mainly used on continental and regional scales, but the data are processed in batches, leading to a certain latency in providing position results. Therefore, the current GPS monitoring stations and analysis software cannot fulfill future requirements on GPS for use in, e.g., a tsunami early warning system or volcano observatories. The G-SEIS project therefore focuses on monitoring the Earth's surface on different temporal and spatial scales for a new generation of hazard monitoring systems. This confronts us with the following tasks:
– Design and development of a new series of multi-parameter stations running unattended and autonomously for long periods,
– Development of new and automated GPS/GNSS software allowing real-time data analysis and self-detection of events, and
– Development and testing of new data communication strategies for high-rate and high-volume data.
Within the project, a new generation of robust, autonomous field GPS-based monitoring stations, having a modular software and
hardware architecture, will be developed. Hybrid power subsystems for these field stations will be designed and developed. Various satellite-based and wireless terrestrial data communication technologies for high- and very-high-rate GPS data will be proposed. Appropriate software tools for high-rate (1 Hz) real-time data streaming will be developed. The performance of the station equipment in difficult environments will then be verified. A real-time software package will be developed that will be capable of:
– providing satellite orbits and clocks in real time (network software),
– kinematic monitoring of station positions in real time for networks with good real-time data transmission, and
– performing real-time Precise Point Positioning (PPP) for stations with problems in high-rate data communication, based on orbits and clocks from the network software (in such cases only orbits and clocks have to be transferred to the station).
Special studies will provide a solid background for the future application of
– high-rate GNSS data, where high-rate means in this context 20 to 50 Hz, and
– combined GNSS solutions comprising GPS, GLONASS and the upcoming GALILEO system, which will yield a new quality for many applications by offering the possibility of using nearly 100 satellites simultaneously.
Two prototype deformation monitoring networks will be set up and operated. The prototypes should demonstrate that the main goal of the undertaking, i.e. the detection of the Earth's deformation and the issuing of warning alarms, is possible »within seconds«.

2. Sensor Stations, Data Transfer and Communication
Converting scientific monitoring networks into early warning networks requires new approaches in terms of autonomous operation, failure tolerance, error recovery, power supply and data communication. During the past 10 years GFZ has developed GPS field stations for different projects (global tectonic monitoring, volcano monitoring). Based on
Figure 1: Monitoring station utilizing different techniques (Sensors, Communication, Power Supply)
this experience, new station concepts will be developed, providing new types of multi-sensor (smart) stations. For anyone working with a laptop on a 100 MBit/s internet connection, it is hard to imagine transfer rates of 2,400 bit/s. However, many remote areas have only very limited dial-up access to global telecommunication. This requires a cascade of measures, e.g. to down-sample or compress data streams, to develop concepts for emergency dial-ups and to test new approaches to data transfer. VSAT, BGAN, HF and L-band communication are among those to be studied. Part of the project is to test the different devices and to develop adapted data transfer techniques.
A new requirement is also the multi-sensor/multi-purpose approach. Since the installation and maintenance of remote stations are very expensive, stations need to be equipped with different sensors. For the G-SEIS project the focus will be on the combination of GPS, seismic, tide-gauge and weather sensors. To achieve reliable station operation, appropriate hardware has to be selected and new station management software has to be developed. Also of immense importance is the power supply: only very few stations have mains power, and in remote areas monitoring stations rely mostly on solar power. Sophisticated concepts for a reliable power supply and monitoring tools will be developed.

3. Software Development
The software packages available for global and regional GPS data analysis (e.g. BERNESE, GAMIT, EPOS) are based on batch processing, i.e. the analysis of data in daily, hourly or even shorter batches. This strategy reaches its natural limits as applications approach real-time monitoring: although it is feasible to generate precise orbit predictions (except for maneuvered satellites), clock predictions with sub-nanosecond accuracy are not possible. Commercially available real-time software packages are normally designed under the assumption that phase biases can be precisely represented by those of nearby reference stations. They are suitable for local or regional networks but cannot handle global analysis problems. An alternative and most promising strategy for global monitoring applications is the planned real-time IGS service: the global data shall be analyzed in real time to provide real-time orbits and clocks, and based on those real-time products with the highest possible accuracy, the position of a single station is obtained by PPP. In order to obtain highly precise and reliable position results, online quality control is a key element of the strategy. Proper integer ambiguity-fixing is a challenge and will improve the PPP accuracy dramatically; first tests will be performed.

3.1. Real-Time Software
a) Generation of orbits and clocks in real time: All data analysis is based on data streams from a global GNSS network with possible regional densification. A tool has to be set up for centralized data acquisition and pre-processing that handles all possible events and provides pre-processed data for the selected network at the given sampling rate (e.g. 1 Hz) to the analysis modules. A special analysis line will be set up using data from a global network with the best data communications (GFZ project sites and IGS sites for global applications, EUREF sites for European applications) to provide real-time satellite orbits and clocks (the latency should be less than 5–10 s). In a first stage the software will deliver satellite clocks for given predicted satellite orbits only. Tests will be performed to determine whether the uncalibrated phase delay (UPD), originating in the satellites, can be estimated reliably from the global network to enable ambiguity-fixing within PPP for given user stations. Based on the generated orbits and clocks, local and regional station networks will be monitored.
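The epoch-wise monitoring and alerting described here could, in its simplest form, compare each new position epoch against a running pre-event reference and require the offset to persist for several epochs before raising an alert. The following Python sketch is purely illustrative: the names (`EpochMonitor`, `threshold_m`, `persistence`) are assumptions, not part of the G-SEIS software, and a real system would operate on filtered coordinate streams with station-specific noise models.

```python
# Hypothetical sketch of epoch-by-epoch deformation monitoring; all
# class/parameter names are illustrative, not from the project software.
from collections import deque
from statistics import mean

class EpochMonitor:
    """Compare each new position epoch against a running reference window.

    An alert fires when the horizontal offset from the reference exceeds a
    threshold for several consecutive epochs, suppressing single-epoch
    outliers.
    """

    def __init__(self, window=300, threshold_m=0.05, persistence=5):
        self.north = deque(maxlen=window)  # reference window, metres
        self.east = deque(maxlen=window)
        self.threshold_m = threshold_m
        self.persistence = persistence
        self._hits = 0

    def update(self, north_m, east_m):
        """Feed one epoch; return True if an event alert fires."""
        if len(self.north) < self.north.maxlen:
            # Still accumulating the pre-event reference.
            self.north.append(north_m)
            self.east.append(east_m)
            return False
        dn = north_m - mean(self.north)
        de = east_m - mean(self.east)
        offset = (dn * dn + de * de) ** 0.5
        if offset > self.threshold_m:
            self._hits += 1
        else:
            self._hits = 0
            # Quiet epochs continue to refine the reference window.
            self.north.append(north_m)
            self.east.append(east_m)
        return self._hits >= self.persistence
```

For a 1 Hz stream, a five-minute reference window and a 5 cm threshold sustained over five epochs would, under these assumptions, still allow an alert within seconds of a coseismic step.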
b) Station network solution in real time: For local or regional networks with optimal data connections, as used for earthquake detection and volcano monitoring, continuous monitoring will be set up. Studies will be
performed to determine whether a classical network solution or PPP plus network ambiguity-fixing is the more suitable approach for efficient and very precise relative positioning, applicable to the prototype applications in Section 5.

3.2. Real-Time Software: PPP Mode
For special applications where high-rate GPS data cannot be transmitted to an analysis centre, e.g. GPS on buoys, the orbit and clock products from the global solutions will be transmitted to the affected stations in case of an event, and the analysis will immediately start on the local computer in PPP mode. To get an optimal time-series solution, the analysis will start with data from several hours back (<24 hours) and proceed afterwards with real-time monitoring until the end of the event; it will thus be able to detect tsunami waves (in the case of buoys) or analyze post-seismic behavior (in the case of earthquakes or volcanic activity). To support such applications a real-time PPP software tool will be developed. Tests will be performed to determine whether integer ambiguity-fixing can be realized in real-time PPP based on additional information obtained from the network.

3.3. Deformation Monitoring
Finally, the obtained solutions will be used to monitor the station motions in a selected region. Automated procedures will decide whether an event has been detected and consequently trigger alerting sequences. The deformation information of the GPS stations will be tested for its suitability for extracting earthquake parameters and information about the earthquake mechanism and rupture process. It should be mentioned that the software will be able to analyze all GNSS data (GPS, GALILEO and GLONASS), aiming for the highest accuracy.

4. Supporting Studies
4.1. High Data Rates
For monitoring tectonic or volcanic motions and detecting events, 1 Hz data are normally
sufficient. However, to get more information on the source mechanisms, a much higher sampling rate is of great interest. It is planned to perform studies with sampling rates of 10 Hz and even 20 Hz. The reliability of the data stream, the quality of the data and the accuracy of the analysis results will be studied.

4.2. GNSS Studies
The number of satellites observed simultaneously significantly influences the quality of the results. Presently, 8 to 12 GPS satellites are normally visible simultaneously (depending on elevation cutoff and latitude). With the completion of the GLONASS system and the upcoming GALILEO, this number will increase to about 24 to 35, which will bring a new quality of results. Studies of the influence on the accuracy of real-time results are planned.

5. Prototype Applications
One of our primary tasks will be the implementation of the new hardware and software environment in applications and warning regimes. Recent discussions and the experience of the Thailand GPS monitoring of the Sumatra earthquake showed the advantage of monitoring not only the tectonic situation but also the station movements in real time. A fast output of coordinate changes would enable real-time modeling of earthquake source mechanisms and allow real-time predictions of the tsunamigenic hazard potential. Therefore, the general concept will be demonstrated for the region of Sumatra. Hardware is being installed in the course of GITEWS, so this expertise and these resources will be used. The results will be made available to the to-be-established Tsunami Warning Center and can be used as additional information in the warning chain. A second application will be the monitoring of a volcano. In Indonesia, Merapi is highly at risk and is already monitored with different sensors by GFZ and other institutes. GPS stations were operated by the MERAPI project for 3 years, but the sites were not permanently occupied.
In Mexico the applicants had also installed and operated a volcano monitoring network. This effort, however, was stopped after 1.5 years due to manpower restrictions. Either the monitoring array in Mexico will be resumed, updated and equipped with real-time infrastructure, or a new real-time GPS array will be set up on an active volcano in Chile. With both applications we will be able to demonstrate the potential of this new technique directly with in-situ data.

5.1. Prototype Application: GPS Shield for Sumatra
The main goal of this work package is to investigate whether the proposed new strategy of a GPS shield, based on real-time continuous GPS, can be successfully applied to tsunami early warning systems, as recently suggested (Sobolev et al., 2006). In this concept the deformation information, especially the deformation gradient within a GPS array, is used to obtain more reliable earthquake parameters. Figure 2 shows the GPS-derived high-rate position change of station JOG2 caused by an earthquake.
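The deformation gradient within a small array can be estimated, for instance, by a least-squares plane fit to the displacements observed at the array stations. The sketch below is a hypothetical Python illustration; the function names and the plane-fit model are assumptions made for exposition, not the project's actual algorithm.

```python
# Illustrative sketch: horizontal displacement gradient from a small GPS
# array via a least-squares plane fit. Names are assumptions, not the
# project's code.

def solve3(a, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    m = [row[:] + [rhs] for row, rhs in zip(a, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            for c in range(col, 4):
                m[r][c] -= f * m[col][c]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (m[r][3] - sum(m[r][c] * x[c] for c in range(r + 1, 3))) / m[r][r]
    return x

def displacement_gradient(stations, displ):
    """Fit u(x, y) = a + b*x + c*y to one displacement component.

    stations: list of (x, y) station positions in metres (local frame)
    displ:    displacement of that component at each station, metres
    Returns (b, c), i.e. the spatial gradient (du/dx, du/dy).
    """
    # Build the 3x3 normal equations of the least-squares plane fit.
    ata = [[0.0] * 3 for _ in range(3)]
    atb = [0.0] * 3
    for (x, y), u in zip(stations, displ):
        row = (1.0, x, y)
        for i in range(3):
            atb[i] += row[i] * u
            for j in range(3):
                ata[i][j] += row[i] * row[j]
    coeffs = solve3(ata, atb)
    return coeffs[1], coeffs[2]
```

Applied separately to the north, east and up components, the fitted slopes give the displacement gradients that, under these assumptions, would complement the precise coordinates of the Array Master Stations.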
The Mentawai islands and the West Sumatra coast will be selected for a prototype installation and operation of GPS arrays, because:
– It is expected that one of the future tsunamigenic earthquakes may occur close to the large city of Padang.
– A reliable prediction of tsunami wave heights for Padang cannot be provided using traditional earthquake-magnitude-based methods, because wave heights on the coast of Padang may differ by more than a factor of 5 for earthquakes with the same seismic magnitude but different slip distributions. It is expected that the slip distribution can be measured successfully with small-scale GPS arrays.
– Several other large cities located close to seismogenic zones also cannot be protected using seismological networks alone.
Furthermore, GFZ is engaged in the development of GITEWS, and the proposed array near Padang is of great importance to such efforts. Key elements will be arrays of GPS-based multi-sensor stations. High-rate (1–10 Hz) continuous GPS arrays should be installed on the Mentawai islands (Siberut and Pagai) along the trench, and one or two on the west coast of Sumatra (near Padang and to the south). The arrays will provide displacement gradients and precise coordinates of the Array Master Stations. Time series of these parameters will be used to predict the heights of tsunami waves following an earthquake.

Figure 2: GPS-derived high-rate deformation caused by an earthquake.

5.2. Prototype Application: Automatic GPS Array for Volcano Early Warning System (AGAVE)
The purpose of this work package is the installation and continuous operation of an automatic mobile array of GPS receivers and auxiliary sensors to monitor deformations of an active volcano prior to or during eruptions and thus to provide short-term hazard assessments. Continuous GPS provides an excellent tool for setting up an early warning system for forecasting and/or predicting volcanic eruptions, but the observational data must necessarily be streamed and processed in real time. A state-of-the-art small-scale deformation monitoring system using GPS has been designed and developed at GFZ in past years. It is an open system that can be easily extended, but it must be upgraded and equipped with real-time capability. Implementation of real-time processing, the array upgrade and the demonstration of its usefulness for volcano monitoring are the main points of this work. Installations of technical equipment on active volcanoes are critical in terms of working safety; therefore, alternative project sites are proposed. The array will be installed either on an active volcano in Chile, or our previous system in Mexico (Popocatépetl/Colima) will be upgraded and taken into operation again. It will consist of one »Array Master Station« (AMS) and of three or four monitoring »Array Slave Stations« (ASS). Local partners will support the installation and operation of the system.
6. Schedule and Workflow
The sensor station development will have a very high priority during the first year. Based on previous experience, the hardware will be tested for use in hazard monitoring projects, and new components will be implemented. This work package is an important extension of our station management concept. The work on data transfer and communication will deal with testing different media for data transfer, which is important for the future concept of reliable data transfer. VSAT, internet and classical satellite telephone lines will be tested in the first year. BGAN (Broadband Global Area Network) will be tested as soon as it is available in the target region of our prototype applications.
The software development will begin with the start of the project. In the first 18 months the core modules of the real-time software will be written and tested (real-time data acquisition, pre-processing, update of GNSS clocks). In the second half, the software will be extended for the processing of a full set of parameters, including orbits and station positions, and for different data types (fast-moving stations, network stations). Deformation monitoring for groups of points and the alerting capability will be implemented.
The supporting studies will run in parallel to the software development during the last year. As soon as results are available, they will be implemented and tested. They will certainly need the new stations developed in the sensor-station work package; for the first studies no real-time software is needed.
The main hardware elements of the array will be set up at GFZ first. All software modules will be implemented, tested and updated, and the engineering model will be used for the prototype. The first array will be installed in an easily accessible area with a high hazard risk; accessibility is an important issue in the test phase. Most probably it will be located on the island of Pagai (GPS Shield Sumatra, Indonesia). The demonstration phase is foreseen for three months. The other elements of the prototype applications will be installed and taken into operation afterwards.

7. Summary
GFZ has already demonstrated its capability for high-quality GPS processing and operational service of real-time GPS stations for various applications. The G-SEIS project will modernize the two already existing chains, namely real-time data transfer and high-precision GPS data processing, which currently run separately, and join them together in order to provide real-time, precise and high-rate surface deformation for a new generation of hazard monitoring systems. The details of the five major work packages have been described, as well as the related new approaches. The approaches developed in the project will, in a very short time, revolutionize the monitoring of Earth surface deformation parameters and the detection of hazardous earthquakes, volcanic eruptions and landslides.

References:
Blewitt, G., et al. (2006): Rapid Determination of Earthquake Magnitude using GPS for Tsunami Warning Systems. Proceedings IGS Workshop 2006, ESA, Darmstadt, in print.
Reigber, Ch., Galas, R., Koehler, W., Forberg, M., Ramatschi, M., Wickert, J. (2003): GFZ HR/LL GPS Ground Station Networks and their Use. AGU Fall Meeting, San Francisco, December 8–12, 2003.
Schaer, S., Beutler, G., Rothacher, M., Brockmann, E., Wiget, A., Wild, U. (2000): The Impact of the Atmosphere and Other Systematic Errors on Permanent GPS Networks. IAG Symposia, Springer Verlag, 121, 373–380.
Sobolev, S.V., Babeyko, A.Y., Wang, R., Galas, R., Rothacher, M., Sein, D.V., Schröter, J., Lauterjung, J., Subarya, C. (2006): GPS-Shield: reliable prediction of tsunami amplitude within less than 10 minutes of an earthquake. Accepted by EOS.
Bock, H., Beutler, G., Schaer, S., Springer, T.A., Rothacher, M. (2000): Processing Aspects Related to Permanent GPS Arrays. Earth Planets Space, 52, 657–662.
Galas, R., Reigber, Ch., Michel, G. (1998): Current Status & Perspectives of the GFZ GPS-Based Volcano Monitoring System. WPGM 1998 (invited paper), EOS supplement.
Galas, R., Wickert, J., Burghardt, W. (2001): High Rate Low Latency GPS Ground Tracking Network for CHAMP. Phys. Chem. Earth (A), 26 (6–8), 649–652.
Genrich, J.F., Bock, Y. (2006): Instantaneous geodetic positioning with 10–50 Hz GPS measurements: Noise characteristics and implications for monitoring networks. J. Geophys. Res., 111, B03403, doi:10.1029/2005JB003617.
Miyazaki, S., Larson, K.M., et al. (2004): Modeling the rupture process of the 2003 September 25 Tokachi-Oki (Hokkaido) earthquake using 1-Hz GPS data. Geophys. Res. Lett., 31(21), L21603.
Rapid Automated Determination of Seismic Source Parameters (RAPID)
Meier T. (1), Dahm T. (2), Friederich W. (3), Hanka W. (4), Kind R. (5), Krüger F. (6), Ohrnberger M. (7), Scherbaum F. (8), Stammler K. (9), Yuan X. (10)
(1) PD Dr. Thomas Meier, Institut für Geologie, Mineralogie und Geophysik, Ruhr-Universität Bochum, Universitätsstr. 150, NA 3/173, 44801 Bochum, meier@geophysik.rub.de
(2) Prof. Dr. Torsten Dahm, Universität Hamburg, Institut für Geophysik, Bundesstr. 55, 20146 Hamburg, dahm@dkrz.de
(3) Prof. Dr. Wolfgang Friederich, Ruhr-Universität Bochum, Institut für Geologie, Mineralogie und Geophysik, Universitätsstr. 150, NA 3/165, 44801 Bochum, friederich@geophysik.rub.de
(4) Dr. Winfried Hanka, GFZ Potsdam, Telegrafenberg, 14473 Potsdam, hanka@gfz-potsdam.de
(5) Prof. Dr. Rainer Kind, GFZ Potsdam, Telegrafenberg, 14473 Potsdam, kind@gfz-potsdam.de
(6) Dr. Frank Krüger, Universität Potsdam, Institut für Geowissenschaften, PSF 601553, 14415 Potsdam, kruegerf@geo.uni-potsdam.de
(7) Dr. Matthias Ohrnberger, Universität Potsdam, Institut für Geowissenschaften, Karl-Liebknecht-Str. 24/25, 14415 Potsdam, mohrn@rz.uni-potsdam.de
(8) Prof. Dr. Frank Scherbaum, Universität Potsdam, Institut für Geowissenschaften, Karl-Liebknecht-Str. 24/25, 14415 Potsdam, fs@geo.uni-potsdam.de
(9) Dr. Klaus Stammler, Seismologisches Zentralobservatorium Gräfenberg, Bundesanstalt für Geowissenschaften und Rohstoffe, Mozartstr. 57, 91052 Erlangen, stammler@szgrf.bgr.de
(10) Dr. Xiaohui Yuan, GFZ Potsdam, Telegrafenberg, 14473 Potsdam, yuan@gfz-potsdam.de
Introduction Automated, rapid and robust detection and localization of earthquakes using real-time data of regional and global networks lie at the heart of any earthquake early warning system. Early recognition of earthquakes and the determination of their source parameters are essential for rapid warning against earthquake-generated tsunamis as well as for any rapid post-earthquake response decisions by emergency management authorities. Because of recent progress in real-time data transfer as well as in off-line data processing, automated near-real-time estimation of source parameters is feasible. However, methods have to be adapted for near-real-time data processing, and software has to be developed that makes use of the incoming real-time data stream. Building on existing real-time data transfer and data processing software packages like SeedLink and SeisComP developed at GFZ Potsdam,
additional software components for the automated determination of source parameters and a near-real-time estimate of the spatial distribution of strong shaking from regional and teleseismic information will be developed. The SeedLink IP data transmission protocol, part of the SeisComP data acquisition and processing software package, has become a de facto global standard, adopted as a manufacturer-independent real-time data exchange protocol by international organisations like FDSN, ORFEUS and IOC/IOTWS. Due to its simplicity and robustness, SeisComP is meanwhile used at many institutions worldwide for real-time data acquisition and data processing. The GEOFON data centre at GFZ presently acquires near real-time data from 45 of its 54 stations together with the data from more than 250 globally distributed stations of GEOFON partner networks. Using these real-time data feeds, automatic processes for data quality
checks, event detection, localisation and source quantification are applied, and the resulting earthquake information is automatically published on the Internet (http://www.gfzpotsdam.de/geofon/new/eq_inf.html); alerts are distributed by email and SMS messages to a widespread user community. The SeisComP seismological control centre software architecture will also serve as the basis for the application of the software modules developed by the RAPID project. Based on methodical studies, algorithms for the automated near-real-time determination of source parameters will be developed and tested using data available at the GEOFON data centre. The algorithms will be implemented at the GEOFON data centre, where the developed software components will complement the existing ones. The software will be portable to other data centres. Tests of the software at these data centres as well as the integration of generated alerts into European early warning systems are intended. The project consists of 5 work packages, each of which aims at providing a software component for a specific purpose. We will focus on (1) signal detection, signal characterisation and event localisation, (2) rupture monitoring, (3) determination of source mechanisms, (4) teleseismic estimation of shake maps and (5) evaluation and classification of source parameter estimates. A flowchart of the processing scheme we will work on is given in Fig. 1. Broad-band waveforms transferred to the GEOFON data centre in real time are input to the software package to be developed in the framework of the project. Source parameters are estimated in near-real time and are fed into several processing schemes. In one processing chain, the tsunami hazard potential is evaluated based on the source parameter estimates. In a second chain, source parameter estimates are used in conjunction with strong ground-motion models to derive first estimates of the expected spatial distribution of strong shaking (»tele-shakemaps«). In the case of a large earthquake or the possible occurrence of a tsunami, alert messages are generated and further non-seismological hazard evaluations are triggered. The algorithms should be automated, rapid and robust. The robustness of the tools has to be tested intensively, and classification schemes for the estimated source parameters as well as rapid hazard-estimation algorithms have to be developed in order
Figure 1: Flowchart of the early warning data processing scheme.
to generate informative alerts that may be used for further decisions in the case of a strong earthquake. The software components will be tested using data of events in Indonesia and the Mediterranean. In the following, short descriptions of the 5 work packages are given. The project started on July 1, 2007 and will end on September 30, 2010. Work package 1: Signal detection, signal characterisation and event localisation T. Meier (Ruhr-University Bochum), W. Friederich (Ruhr-University Bochum), F. Scherbaum (University Potsdam), K. Stammler (Seismologisches Zentralobservatorium Gräfenberg, BGR) Work package 1 is focused on rapid automated event detection, localisation, and estimation of magnitude. Rapid automated event detection and localisation are the basis of any earthquake early warning system. Carried out continuously and in near-real time, they represent the first processing steps of any near-real-time data processing chain and trigger subsequent processing steps if the potential for a large earthquake is detected. Existing software components of the localisation scheme developed at the GEOFON data centre are to be supplemented by additional tools. Crucial for the accuracy of the localisation is the quality of automatic arrival-time estimates. Methods for the determination of arrival times based on higher-order statistics and autoregressive (AR) models will be adapted to near-real-time data processing. Special emphasis will be put on the estimation of S-wave arrival times at local and regional distances, which allow a reduction of the uncertainty in event localisations. The potential of a modified algorithm for the determination of AR parameters applied to three-component waveforms will be investigated. The programme HYPOSAT (Schweitzer, 2001) will be adapted for rapid event localisation using only a limited number of stations at local and regional distances.
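Arrival-time picking of the kind described above is often implemented with an Akaike-Information-Criterion (AIC) onset picker, which is closely related to the AR-based methods mentioned in the text. The following is a minimal, self-contained sketch of such a picker on a synthetic trace; it illustrates the general technique only and is not the RAPID project's implementation (all names and parameters are our own).

```python
import numpy as np

def aic_pick(x):
    """AIC onset picker: find the sample index that best separates the
    trace into a low-variance (noise) and a high-variance (signal) part.
    Illustrative sketch only, not the project's production code."""
    n = len(x)
    aic = np.full(n, np.inf)
    for k in range(2, n - 2):
        v1 = np.var(x[:k])   # variance of the assumed noise segment
        v2 = np.var(x[k:])   # variance of the assumed signal segment
        if v1 > 0 and v2 > 0:
            aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
    return int(np.argmin(aic))

# Synthetic trace: weak noise, then a ten-times stronger arrival at sample 400
rng = np.random.default_rng(0)
trace = np.concatenate([0.1 * rng.standard_normal(400),
                        1.0 * rng.standard_normal(200)])
onset = aic_pick(trace)
```

In practice such a picker is run only in a short window around a coarse detector trigger (e.g. STA/LTA), both for speed and for robustness.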
Additional information, such as quality estimates of automatic picks and polarisation estimates, is effectively
accounted for by HYPOSAT. Besides rapid detection and localization, a fundamental challenge for any early warning system is the rapid recognition and characterisation of strong earthquakes. After a preliminary localisation has been obtained, incoming waveforms will be continuously compared to waveforms of master events by a cross-correlation technique. A database of master-event waveforms will be set up using synthetics (Friederich and Dalkolmo, 1995) as well as waveforms of previous events recorded at GEOFON stations. Similarities to master events will be quantified, allowing the rapid identification of earthquakes with high hazard potential. In the case of a high correlation with master events, source time functions are determined using a Wiener filter approach, and magnitude estimates are obtained from the determined source time functions. Inputs are the incoming real-time data from seismological broad-band stations. Outputs are event detections, localizations, magnitude estimates, and similarities to master events. Work package 2: Near real-time rupture monitoring R. Kind (GFZ Potsdam), F. Krüger (University of Potsdam), X. Yuan (GFZ Potsdam), W. Hanka (GFZ Potsdam), T. Meier (Ruhr-University Bochum) Recently it was shown that it is possible to monitor the rupture propagation of strong earthquakes with the aid of teleseismic array technology (Krüger and Ohrnberger, 2005; Ishii et al., 2005). Presently we are working on a technique to monitor the rupture propagation in real time using a polarization analysis of local and regional events. This is a second, independent technique that provides redundancy and is also closer to real time than teleseismic techniques. The determination of backazimuth and incidence angles from the amplitude ratios of the three components of the P phase at one station is a very old technique for estimating the earthquake epicenter. We apply, in real time, a moving-window technique for the determination of the backazimuth as a function of time. With the epicenter known from other techniques, we will monitor the rupture propagation starting at the epicenter. We apply this technique to a number of stations and obtain a reliable image of the rupture propagation in near real time; using many stations increases the reliability of the observations. We have applied this technique to the data of the great Sumatra event of December 2004 and obtained results very close to those of the teleseismic array technique. The next step will be the application of the new technique to a series of other great earthquakes. We intend to develop effective software for the automatic real-time application of this technique to any large earthquake occurring anywhere on Earth. We also intend to publish the results on the internet, in a similar way as Harvard publishes its moment tensor solutions, but in near real time. The advantage of our technique is the inclusion of the nearest stations in the solution, which permits faster solutions than purely teleseismic ones. Inputs are real-time seismic broadband data from existing seismic stations. Outputs are real-time determinations of the rupture lengths of large earthquakes, which are very important for determining the earthquake magnitude and estimating possible damage, including the generation of tsunamis. Work package 3: Estimation of source mechanisms – kinematic and dynamic source parameters T. Dahm (University of Hamburg), F. Krüger (University of Potsdam), T. Meier, W. Friederich (Ruhr-University Bochum), K. Stammler (Seismologisches Zentralobservatorium Gräfenberg, BGR) Rapid, automatic and robust estimation of the parameters of the extended seismic source, such as its duration, spatial extent, slip and seismic energy release, is a necessary prerequisite for estimating with modern methods the local damage and more widespread consequences
of strong earthquakes (like tsunami generation) for a specific region of the Earth. We plan to determine these important quantities on a global scale from regional and teleseismic broadband waveforms within 10 to 20 minutes of the onset of a specific event, for events with Mw > 5.7. For that purpose we will use higher-order moment tensor inversion on the basis of simple parameterizations of the finite source (Dahm and Krüger, 1999; Vallee and Bouchon, 2004), with constraints provided by work packages 1 and 2 and the recently developed imaging of the rupture process by arrays in the teleseismic distance range (Krüger and Ohrnberger, 2005a, 2005b). Inputs are global/regional seismological broadband waveforms provided by GEOFON and other data centers, station parameter databases and Green's functions from our own Green's function databases. If available, continuous GPS data will be added. Additional inputs are the location (first onset, including an estimate of the source depth, an initial estimate of the scalar moment and the type of mechanism from work package 1) and a first estimate of the rupture direction from work package 2. Outputs are estimates of the source mechanism, scalar moment and centroid depth, the lateral dimension of the source and its temporal duration, the average rupture velocity, the average type of rupture (unilateral/bilateral), the rupture direction and estimates of average slip and energy. Work package 4: Near real-time estimation of the expected spatial distribution of strong ground-motion F. Scherbaum (University Potsdam), F. Krüger (University Potsdam) ShakeMaps are representations of the ground shaking produced by an earthquake and are becoming an increasingly important tool to facilitate rapid post-earthquake response by private and public agencies. In their original form they are based on regionally derived estimates of strong ground-motion from accelerometric networks, which are subsequently interpolated spatially and converted into instrumental intensity maps. This approach, however, is limited to well-instrumented regions. We intend to develop and implement a novel approach to rapidly obtain first estimates of the expected spatial distribution of strong shaking for strong earthquakes in continental regions (and with lesser accuracy also for subduction zone events) based on teleseismic recordings (»TeleShakeMaps«). TeleShakeMaps will complement traditional ShakeMaps (Wald et al., 1999), while for remote areas (e.g., the Pakistan earthquake of fall 2005) they will provide unique information. The backbone of TeleShakeMaps will be the combination of teleseismically determined source dimensions (Dahm and Krüger, 1999) obtained in work package 3 with composite ground-motion models (Scherbaum et al., 2005), calibrated for the prospective application regions using the method of Scherbaum et al. (2004). This will make it possible not only to rapidly predict site-specific strong ground-motion parameters but also to quantify the associated epistemic uncertainties. Inputs are the estimates of source size, source orientation, and moment magnitude obtained from work package 3 using the approach of Dahm and Krüger (1999), and target-region-specific ground-motion models. The latter are going to be either empirical (cf. Douglas, 2004), stochastic (Boore, 2003), or hybrid empirical (Campbell, 2003), using the method of Scherbaum et al. (2006) to determine the regional transfer functions. Output is an estimate of the spatial distribution of strong ground shaking together with associated uncertainty estimates. Work package 5: Evaluation and classification of near-real-time source parameter estimates for tsunami hazard Matthias Ohrnberger (University of Potsdam), Frank Scherbaum (University of Potsdam) The objective of this work package is the development of an automatic and robust alert level for tsunami generation based on the evidence of seismological observations as
described in work packages 1 to 3. The alert level system is intended to serve as a prototype non-public message generator for scientific staff involved in tsunami warning centers. The alert level system will be based on the formulation of belief networks (Aspinall et al., 2003; Woo and Aspinall, 2005). Belief networks allow both discrete and continuously distributed data sources, as well as quantified expert knowledge, to be incorporated into a probabilistic framework that can be evaluated very efficiently. For the establishment of such a system it is necessary to account for all available prior information (Moreno et al., 2004). This includes simple datasets like a shoreline database as well as geological priors (e.g., fault information, seismo-tectonic regimes, etc.) and finally expert knowledge about generation models of tsunamigenic or tsunami earthquakes (e.g., Bilek and Lay, 1999; Satake and Tanioka, 1999; Polet and Kanamori, 2000). Based on the prior information and the arriving evidence (seismic source parameters), it is possible to automatically obtain reasoning similar to that of an expert seismologist judging the soundness of the data analysis, the probability of strong ground shaking or tsunami generation, and finally the severity of possible consequences. There are two types of inputs into the system: 1) detection messages with analysis results provided by work packages 1 to 3 (i.e. epicentral location parameters, seismic moment estimates, source mechanism, rupture direction and speed, etc.); 2) prior information derived from global seismicity data sets (Wadati-Benioff zone geometry, moment tensor solutions) and geographical and geological databases (e.g. digital elevation maps and bathymetry, shoreline database, tectonic boundary database). Outputs are the automatically generated alert messages.
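The core of such an evidence-combination scheme can be illustrated with sequential Bayesian updating, the building block that belief networks generalise with conditional dependencies between nodes. All probabilities and the alert threshold below are invented for illustration; they are not the project's calibrated values.

```python
def update(prior, p_evidence_given_t, p_evidence_given_not_t):
    """One Bayesian update: posterior P(tsunamigenic | new evidence)."""
    num = prior * p_evidence_given_t
    return num / (num + (1.0 - prior) * p_evidence_given_not_t)

# Hypothetical conditional probabilities (illustration only; a real
# belief network would encode expert knowledge and geological priors)
p = 0.05                      # prior: event is tsunamigenic
p = update(p, 0.9, 0.2)       # evidence: Mw > 7.5
p = update(p, 0.8, 0.3)       # evidence: shallow submarine thrust event
alert = "WATCH" if p > 0.3 else "INFO"
```

Each incoming message from work packages 1 to 3 would trigger one such update, and an alert message would be emitted whenever the resulting state changes.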
A message will be created whenever the alert level system changes its state based on the evaluation of seismological evidence. References Aspinall, W.P., G. Woo, B. Voight and P.J. Baxter (2003). Evidence-based volcanology:
application to eruption crises, JVGR, 128, 273–285. Bilek, S. and T. Lay (1999). Rigidity variations with depth along interplate megathrust faults in subduction zones. Nature, 400, 443–446.
Satake, K. and Y. Tanioka (1999). Sources of Tsunami and Tsunamigenic Earthquakes in Subduction Zones, PAGEOPH, 154, 467–483.
Boore, D.M. (2003). Simulation of ground motion using the stochastic method, Pure and Applied Geophysics, 160, 635–676.
Scherbaum, F., F. Cotton, and P. Smit (2004). On the use of response spectral-reference data for the selection of ground-motion models for seismic hazard analysis: the case of rock motion, Bulletin of the Seismological Society of America, 94(6), 2164–2185.
Campbell, K.W. (2003). Prediction of strong ground motion using the hybrid empirical method and its use in the development of ground-motion (attenuation) relations in Eastern North America, Bulletin of the Seismological Society of America, 93(3), 1012–1033.
Scherbaum, F., J.J. Bommer, H. Bungum, F. Cotton, and N. A. Abrahamson (2005). Composite ground-motion models and logic trees: methodology, sensitivities, and uncertainties, Bulletin of the Seismological Society of America, 95, 1575–1593.
Dahm, T. and F. Krüger (1999). Higher-degree moment tensor inversion using far-field broadband recordings I: Theory and evaluation of the method with application to the 1994 Bolivia deep earthquake, GJI, 137, 35–50.
Scherbaum, F., F. Cotton, and H. Staedtke (2006). The estimation of minimum-misfit stochastic models from empirical prediction equations, Bulletin of the Seismological Society of America, 96(2), 427–445.
Douglas, J. (2004). Ground motion estimation equations 1964–2003. Reissue of ESEE Report 01-1: »A comprehensive worldwide summary of strong-motion attenuation relationships for peak ground acceleration and spectral ordinates (1969 to 2000)« with corrections and additions (ESEE Report No. 04-0001-SM). London: Imperial College of Science, Technology and Medicine, Civil Engineering Department.
Schweitzer, J. (2001). HYPOSAT – An enhanced routine to locate seismic events. PAGEOPH, 158, 277–289.
Friederich, W. and J. Dalkolmo (1995). Complete synthetic seismograms for a spherically symmetric Earth by a numerical computation of the Green’s function in the frequency domain. GJI, 122, 537–550.
Wald, D.J., V. Quitoriano, et al. (1999). TriNet »ShakeMaps«: Rapid Generation of Instrumental Ground Motion and Intensity Maps for Earthquakes in Southern California. Earthquake Spectra, 15, 537–556. Woo, G., and W. Aspinall (2005). Need for a risk-informed tsunami alert system, Nature, 433, 457.
Moreno, B., K. Atakan, K.A. Furlokken, S. Temel and O.J. Berland (2004). SAFE-Tools: A web-Based Application for Identification of Active Faults. SRL, 75, 2, 205–213. Polet, J., and H. Kanamori (2000). Shallow subduction zone earthquakes and their tsunamigenic potential, GJI, 142, 684–702.
WeraWarn: Real time detection of tsunami generated signatures in current maps measured by the HF radar WERA to support coastal regions at risk Stammer D. (1), Pohlmann T. (1), Gurgel K.-W. (1), Schlick T. (1), Helzel T. (2), Kniephoff M. (2) (1) Institute of Oceanography, Center for Marine and Atmospheric Research, University of Hamburg, Bundesstr. 53, D-20146 Hamburg (2) Helzel Messtechnik Ltd., Carl-Benz-Str. 9, D-24568 Kaltenkirchen
Abstract The WeraWarn project aims at optimizing the WERA current and wave radar for the early warning of tsunamis, as well as at testing the system's capabilities and limitations in this context. To date, the WERA system has been deployed extensively to measure ocean surface currents and wave heights over sea regions covering several thousand square kilometres in real time and at high temporal and spatial resolution. These features will be used for detecting tsunami-induced current signatures in the shelf-break area. Thus a WERA system can produce warnings for the affected coastal regions. The investigation will be carried out in three steps: 1) Based on temporally and spatially high-resolution numerical simulations, the typical hydrodynamic properties of an oncoming tsunami wave will be investigated for various sea regions. This involves evaluating the influence of different shelf-break geometries on the signatures in the surface currents. It is expected that the slope and orientation of the shelf break contribute significantly to the strength and propagation direction of the observed signature. 2) The relationship between typical tsunami current signatures, calculated with a radar backscatter model driven by the model study described above, and the actually measured radar backscatter spectra serves as a basis for the development of a specific tsunami early
warning algorithm. 3) In addition to the theoretical considerations and numerical simulations, a field experiment will be carried out in a third phase, during which a tidal bore of substantial height, as occurs, for example, in the Hangzhou Delta in China, will be recorded with the WERA system. In this way, the detection capability of the system can be determined. 1. Introduction In the aftermath of the natural disaster that occurred in December 2004 in the Indian Ocean, the necessity of a reliable early warning system has become the focus of attention of politics, science and research. It has turned out that existing technical solutions are not yet sufficiently reliable and thus carry a high risk of producing false alarms. The aim of this project is to investigate the capabilities of a far-range HF coastal radar, which has been successfully deployed at various locations during many national and international field experiments, to contribute to reducing the number of false alarms through the observation of direct tsunami signatures. On the basis of theoretical considerations, numerical simulations and a final verification of the results by field experiments, the possibilities and limitations of the WERA HF-Radar system and its suitability as a supportive or even essential component of an early warning system shall be investigated.
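One simple way to suppress the kind of false alarms discussed above is a persistence filter: an alarm is raised only when an anomaly is seen in several consecutive current maps, so that a single noisy map cannot trigger a warning. The sketch below is our own illustration of this idea and is not part of the WeraWarn software.

```python
def alarm_stream(anomaly_flags, persistence=3):
    """Raise an alarm only after `persistence` consecutive anomalous
    current maps, suppressing single-map false alarms. Sketch only;
    the threshold of 3 maps is an arbitrary example value."""
    run, alarms = 0, []
    for flag in anomaly_flags:
        run = run + 1 if flag else 0   # length of the current anomalous run
        alarms.append(run >= persistence)
    return alarms

# One spurious detection, then a sustained tsunami-like signature
flags = [False, True, False, True, True, True, True]
alarms = alarm_stream(flags)
```

With maps produced every few minutes, such a filter trades a short additional delay for a substantially reduced false-alarm rate.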
2. Objectives and concept The development and testing of early warning systems for minimizing the risk of damage caused by tsunamis or cyclone surges comprises several different stages. The proposed project will make an outstanding contribution to the development and improvement of monitoring systems, to the development of models for coupling remote sensing data and numerical simulations, and to the quality testing of early warning systems. Furthermore, a first estimate of the usefulness of an HF-Radar-based early warning system will be made for the most endangered coastal regions worldwide. To this end, we will fall back on our own model calculations and on those of the AWI project group »Deutsche Finite Elemente Ozean Model«. The data and results of the project proposed here will be available for the »New Spaceborne and Ground-based Microwave or UHF Systems for Detection of Tsunami« during the term of the project. 2.1. Development and improvement of monitoring systems The project WeraWarn will, by combining and adapting tested remote sensing technology and numerical simulations, provide a new and powerful component for a modern early warning system that fulfils the requirement of providing prompt, extensive and reliable information. The WERA Radar System, manufactured by the Helzel Messtechnik Company and developed in collaboration with the University of Hamburg (IFM/ZMAW), offers simultaneous large-area measurement of ocean currents, wave heights and wind direction, i.e., desirable attributes of an early warning system »sensor«. To date, the HF-Radar has a range of up to 250 km with a spatial resolution of up to 150 m and a temporal resolution down to less than 3 minutes. Rapid detection of significant changes is thus already possible. Nevertheless, further technical system improvements must be achieved in order to rapidly detect the expected current signatures that a tsunami at the coastal shelf or a cyclone at sea would generate.
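A rough plausibility check of the warning lead time such a radar could provide is to integrate the shallow-water travel time, using c = sqrt(g·h), from the detection range to the coast. The depth profile below is invented for illustration; actual lead times depend entirely on the local bathymetry.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def lead_time_minutes(detection_range_km, depth_profile):
    """Approximate warning lead time: travel time of a shallow-water
    wave from the detection range to the coast, summed segment-wise
    with c = sqrt(g*h). depth_profile is a list of
    (segment_length_km, mean_depth_m) pairs; values are examples."""
    total_km = sum(seg for seg, _ in depth_profile)
    assert abs(total_km - detection_range_km) < 1e-6
    t = 0.0
    for seg_km, h in depth_profile:
        c = math.sqrt(G * h)            # phase velocity in m/s
        t += (seg_km * 1000.0) / c      # travel time in seconds
    return t / 60.0

# Hypothetical profile: 150 km of deep approach (1000 m mean depth),
# then 100 km across the shelf (100 m mean depth)
t_min = lead_time_minutes(250.0, [(150.0, 1000.0), (100.0, 100.0)])
```

For this invented profile the lead time comes out at roughly an hour and a quarter, which is the order of magnitude that makes shelf-edge detection attractive.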
In particular, the challenge is to continually analyse (filter) the continuously generated current maps, or more precisely the HF-Radar backscatter spectra on which the maps are based, in order to extract a set of parameters that is capable of reliably triggering an alarm when a critical constellation is reached. 2.2. Development and calibration of a quantitative physical model To date, it has been possible to investigate highly dynamic current processes in coastal regions with the HF-Radar. Tidal currents, drifting eddies, fronts and cyclone-driven currents (Florida) have been recorded successfully and used to drive numerical simulations for the forecasting of water movements (e.g., shipping-route optimization in the EUROROSE and Wings-for-Ships EU projects). However, the model-generated current patterns of a tsunami wave in the shelf area (e.g., Seychelles; Androsov) have to date never been recorded with an HF radar or a comparable remote sensing method, which would possibly allow a derivation of the topographically invariant parameters of a tsunami wave. Due to the fortunately (in the statistical sense) rare occurrence of tsunami events, it will not be possible to directly ascertain such parameters from measurement data in the future either. Thus the modelling of the physical process will be applied here in two ways to compensate for the lack of measurement data. In a first step (cf. 4.3 Modelstudy TsunamiCoast), appropriate models will be developed and operated to generate high-resolution, small-scale tsunami-like coastal-area current patterns, which will be used in a second step (cf. 4.4 Modelstudy TsunamiHfSpec) to calculate the corresponding HF-Radar backscatter spectra. It will thus be possible to quantitatively describe the chain of physical processes from the origin of the tsunami to the HF-Radar spectrum. In addition, the model results will be used to conceptually support the planned field experiment (cf. 4.5 Fieldexperiment TidalBore).
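The link between a surface current and the backscatter spectrum rests on standard HF-radar relations: the radar resonates with ocean waves of half its electromagnetic wavelength (Bragg scattering), and a radial surface current u displaces the first-order Bragg peaks by a Doppler shift of 2u/λ. The sketch below uses these textbook relations with an assumed 12 MHz carrier; it is not WeraWarn code.

```python
import math

C = 299_792_458.0   # speed of light, m/s
G = 9.81            # gravitational acceleration, m/s^2

def bragg_frequency(f0_hz):
    """First-order Bragg peak frequency for a radar at carrier f0_hz.
    Backscatter comes from deep-water ocean waves of half the radar
    wavelength, travelling at their own phase speed."""
    lam = C / f0_hz                                  # radar wavelength
    lam_bragg = lam / 2.0                            # resonant ocean wavelength
    v = math.sqrt(G * lam_bragg / (2.0 * math.pi))   # deep-water phase speed
    return 2.0 * v / lam

def doppler_shift(u_radial, f0_hz):
    """Extra Doppler shift of the Bragg peaks caused by a radial surface
    current u_radial (m/s); a tsunami-induced current burst appears as a
    transient displacement of the first-order peaks."""
    lam = C / f0_hz
    return 2.0 * u_radial / lam

f_b = bragg_frequency(12e6)     # roughly 0.35 Hz for a 12 MHz carrier
df = doppler_shift(1.0, 12e6)   # shift caused by a 1 m/s radial current
```

The tsunami detection problem is thus one of recognising a spatially coherent, fast-moving pattern of such peak displacements in the continuously produced spectra.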
Subsequently, as data from the tidal bore experiment become available, these data,
evaluated on the basis of experience gained from the models, could possibly contribute in an iterative process to improving the models and/or optimizing the HF-Radar system characteristics. 3. A brief description of the basics 3.1. Modelling of tsunami waves Shortly after the occurrence of the disaster in the Andaman Sea, Dr. Alexey Androsov carried out model calculations at the ZMAW on the basis of the model GNOM (General hydrostatic/non-hydrostatic Ocean Model). The simulation provided surface displacements and current velocities for the sea areas in front of Phuket and the Seychelles, as shown for the Seychelles in Fig. 1. The model covered an area that included the islands of Mahé and La Digue, the shallow shelf and the deep-sea region bordering the shelf. The simulation was based on the assumption that the initial quake occurred near the Nicobar Islands and triggered a tsunami. The tsunami wave propagates at high velocity towards the Seychelles and reaches the northeast model boundary with a wave height of 50 cm at model time t = 0. From this point on, the high-energy wave is influenced more and more by the structure of the shelf. The model
simulation confirms the assumption that at a distance of 150 to 200 km from the island of La Digue, significant alterations (changes in magnitude and direction) of the current field occur along the shelf break (edge) at model time t = 72 min, cf. Fig. 1. The extreme alterations of the current field and the pronounced orientation along the isobaths could serve as an indicator of an oncoming tsunami. From this point on, it takes another 100 minutes until the tsunami reaches the coast of La Digue, in agreement with known propagation times. After another 30 minutes, the tsunami arrives at the larger island of Mahé. Velocities of up to 2 m/s are observed in the current field, which is an indication that, with high probability, the associated current patterns could be detected and tracked by an HF-Radar. The radar integration time must only be short enough (ca. 4 to 5 minutes) to resolve the dynamics of the event. 3.2. Theory and numerical simulation A tsunami is generally generated by an abrupt vertical displacement of a large amount of water in the ocean. This can be induced by earthquakes, volcanic eruptions or landslides. Meteorite impacts also carry the potential to trigger a tsunami. For example, on 26 December 2004 we were faced with
Figure 1: Simulated current signatures at the shelf edge of the Seychelles due to a propagating tsunami
Figure 2: Phase velocity as a function of the water depth
Figure 3: Current velocity as a function of the water depth
the worst effects of this kind of wave in the Andaman Sea. Over a length of 500 to 1000 km, the ocean floor was displaced by an earthquake, which itself was possibly triggered by another earthquake at the other end of the Indo-Australian Plate near Antarctica two days before. Due to the resulting imbalance (comparable to a plate of ice in the Arctic Ocean), the earthquake in the Andaman Sea raised the sea bottom by 10 m to 30 m. Other natural events are able to induce a tsunami as well. For example, the enormous mixture of gas and steam ejected during a submarine volcanic eruption may generate a tsunami wave due to the large displacement of the water column.
In theory, a tsunami wave is a long wave, because the ratio between the wavelength of the tsunami (~100 km) and the water depth (4 to 5 km) is very large. Such waves are called shallow-water waves, and they are strongly influenced by the bottom topography. The phase velocity c is given by c = sqrt(g*h), cf. Fig. 2. The total energy of a tsunami in the deep sea is nearly conserved, which leads to small amplitudes (0.1 to 1 m) and very high phase velocities in the area of the epicentre (nearly 900 km/h). But when the wave reaches islands or shelf edges, the water depth decreases (and with it the phase velocity), while the potential energy increases strongly (so we get a
Figure 3.1: Bathymetry of the Seychelles with transect at 4.9° South
Figure 3.2: Variance of currents along this section in a moving window of size 5 km
much higher amplitude of up to about 30 m), while the propagation velocity of the wave energy decreases (down to about 30 km/h). Fig. 2 illustrates the decrease of the phase velocity with decreasing water depth, from 700–800 km/h in the deep sea down to 100–150 km/h as the wave reaches the continental shelf edge (100 to 200 m water depth). These effects are accompanied by increasing amplitudes and current velocities. Fig. 3 shows the relation between current velocity and water depth for a wave propagating from deep to shallow water. For a real bathymetry, as shown in Fig. 3.1 for the Seychelles islands, current velocities of about 1 m/s and more can be observed at some points. Suppose we acquired the surface currents along the section above with a radar system and determined the variance of the current velocity in radar cells with an extent of about 5 km; we would then obtain significant peaks in the variance (cf. Fig. 3.2) at the continental shelf edge, which in the end can be used as an indication of a tsunami wave. Exact statements concerning the spatial and temporal behaviour of tsunami-related signatures in the velocity field require model studies with a very high resolution, of the order of several hundred metres. According to theory, tsunami waves can be considered as »long waves«. To a first-order approximation they produce purely barotropic (vertically homogeneous) currents. Thus for the proposed investigations it is sufficient to apply a two-dimensional model. Moreover, the baroclinic (density-induced) part of the circulation can be neglected, since it plays only a minor role for the wave motion. Therefore a simple tidal model can be applied in the frame of the investigations. Models of this type were already used at the Institute of Oceanography in Hamburg in the seventies. 
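The two relations used above, the shallow-water phase velocity c = sqrt(g·h) and the moving-window variance of the current velocity along a transect, can be sketched as follows. The window size and depths are example values; this is an illustration, not the project's detection algorithm.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def phase_velocity_kmh(depth_m):
    """Shallow-water phase velocity c = sqrt(g*h), converted to km/h."""
    return math.sqrt(G * depth_m) * 3.6

# Deep ocean (~4500 m) versus shelf edge (~150 m), cf. Fig. 2
c_deep = phase_velocity_kmh(4500.0)
c_shelf = phase_velocity_kmh(150.0)

def window_variance(values, win):
    """Variance of the current velocity in a moving window along a
    transect (cf. Fig. 3.2); peaks mark the shelf-edge signature."""
    out = []
    for i in range(len(values) - win + 1):
        w = values[i:i + win]
        m = sum(w) / win
        out.append(sum((v - m) ** 2 for v in w) / win)
    return out
```

For the example depths, the deep-ocean velocity falls in the 700–800 km/h range and the shelf-edge velocity in the 100–150 km/h range quoted in the text.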
Due to the enormous development in computer technology, it is nowadays possible to run this relatively simple water-elevation/current model with an extremely high spatial resolution, which allows the above-mentioned grid distances of several hundred metres to be implemented. The largest problem, if scenarios of an actual region are to be simulated, is the lack of sufficiently high-resolution topographic data. In certain cases it may be necessary to carry out side-scan sonar surveys in the respective section of the shelf break.

3.3. Basics of HF radar techniques
The use of HF radar in oceanography was initiated by Crombie (1955), and it has been applied for the measurement of ocean current maps at the Institute of Oceanography, University of Hamburg, Germany, since 1980 within the scope of many field experiments¹. Due to progress in system hardware and processing software, driven by oceanographic requirements, it is now also possible to measure ocean waves and wind direction at increased spatial and temporal resolution. The basic physics utilized by an HF radar installed at the coast, namely ground wave propagation of electromagnetic decameter waves and Bragg scattering at the ocean surface, allows the observation of processes far beyond the horizon (up to 250 km). The electromagnetic waves are transmitted at the coast and propagate along the ocean surface. A small fraction of the transmitted energy is scattered back by the roughness of the ocean surface and is received by the radar. After processing of the echoes, radar backscatter spectra (cf. Fig. 4) are available.

Figure 4: Typical HF radar backscatter spectrum

Based on these spectra, surface current, wave height and wind direction can be derived. The most important parts of the spectra are the ›First Order Peaks‹ and the ›Second Order Returns‹. In the case of a propagating tsunami wave, special current signatures can be expected at the ocean surface, which influence the shape of the backscatter Doppler spectrum. Numerical simulations and experiments will be used to investigate the influence of the tsunami on the Doppler spectrum. Later on, these findings can be used to deduce information on a tsunami wave from a Doppler spectrum. As a single HF radar system measures only the radial component of the surface current, towards or away from the radar, it is necessary to install two sites at different locations along the coast. The two radial components can be combined to give two-dimensional current vectors. Using this set-up, it is possible to measure actual current maps covering large areas. Fast-changing features in the current field, e.g. drifting eddies, fronts, or tsunami waves, can be monitored and resolved, because the system is able to measure actual current maps within minutes. Fig. 5 shows a sequence of current maps measured at the French coast near Brest with a temporal resolution of 12 minutes. Successful tsunami detection requires that actual current maps be available at short time intervals and processed in real time, so that a tsunami warning can be released as early as possible.

3.4. Basics and requirements in hydrodynamic numerical simulations
The Institute of Oceanography of Hamburg University has participated decisively in the development of numerical hydrodynamical modelling since the mid-fifties (Hansen, 1956; Sündermann, 1966). Until the mid-seventies, two-dimensional barotropic, wind and tidal
Figure 5: Varying currents at 12-minute intervals; typical current amplitudes are in the range of tsunami-induced amplitudes
driven models were the focus of research activities (Maier-Reimer, 1977). Subsequently, a development towards three-dimensional, baroclinic models took place (Backhaus, 1979; Maier-Reimer, 1979). At the beginning of the eighties, the IfM was the first European institute to conduct long-term simulations for actual periods. Within this type of modelling, the circulation model HAMSOM played a key role. Due to the semi-implicit formulation of the gravitational waves and of the vertical exchange, it was possible for the first time to perform simulations for periods of decades using realistic forcing conditions (Backhaus and Hainbucher, 1987). Thomas Pohlmann was one of the persons chiefly responsible for the development of a prognostic baroclinic North Sea model, in particular concerning the treatment of the thermal stratification (Pohlmann, 1996, 1997). Furthermore, Th. Pohlmann participated in a number of national (e.g. ZISCH, PRISMA) and European projects (e.g. NOMADS2 (Delhez et al., 2004)), several times in a leading position as principal investigator. In these projects the hydrodynamical model has been applied successfully. In particular, the BMBF project »Storm Surge Development« laid the foundations for the applied tsunami project. In this project a prognosis for the development of storm surges along the North Sea coast was generated, taking future climate development into account. For this purpose, climate scenarios calculated by the Max Planck Institute for Meteorology were used. Presently, Th. Pohlmann is principal investigator in a GLOBEC-D subproject and in the projects Vietnam Upwelling and Siak River Dispersion. In all of these projects, hydrodynamical modelling plays a dominant role.
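As background to the HF radar basics of section 3.3: the position of the first-order Bragg peaks in Fig. 4 follows from the deep-water dispersion relation, because the resonant ocean waves have half the radar wavelength. A quick sketch (the 10 MHz operating frequency is chosen purely for illustration):

```python
import math

C = 2.998e8   # speed of light [m/s]
G = 9.81      # gravitational acceleration [m/s^2]

def bragg_frequency_hz(radar_freq_hz: float) -> float:
    """First-order Bragg Doppler shift for deep water.

    The resonant ocean wave has half the radar wavelength, so with
    wavenumber k = 4*pi/lambda_radar:
        f_B = sqrt(g*k) / (2*pi) = sqrt(g / (pi * lambda_radar))
    """
    lam = C / radar_freq_hz
    return math.sqrt(G / (math.pi * lam))

# e.g. a 10 MHz HF radar (illustrative value within the HF band):
print(round(bragg_frequency_hz(10e6), 3))  # 0.323 Hz
```

Any additional Doppler shift of these peaks reflects the radial surface current, which is the quantity exploited throughout this proposal.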
4. Description of the working plan
4.1. Basic structure of the concerted project
The cooperation between the »Zentrum für Marine und Atmosphärische Wissenschaften« (ZMAW) and Helzel Messtechnik (HMT) aims to allow the development of an operative component of a tsunami early warning system (TEWS). The extraordinary experience of HMT in the field of hardware design and systems technology, especially radar technology, will unburden ZMAW from hardware-related work and therefore allow it to concentrate on the data analysis and on the development of numerical models, including tsunami wave propagation and radio backscatter simulations. For the WeraWarn project, four tasks can be distinguished, cf. Fig. 8 (project overview), which will be discussed in the following sections.

4.2. Model study »TsunamiGlobal«
In TsunamiGlobal, an existing storm surge model will be applied to the world ocean. The planned resolution will be 10′ (approx. 20 km). Due to the implicit solution scheme for the external gravity waves, no stability limitation of the time step has to be considered; the time step can be chosen purely according to physical requirements. With the global model, qualitative studies with respect to the tsunami risk will be conducted for different coastal zones. This analysis will mainly be of a comparative nature, i.e. the risk potential of different coastal zones will be compared. The tsunami will be triggered at regions with known tectonic instabilities. The duration and strength of the perturbation will also be defined by means of historic tsunami events.

4.3. Model study »TsunamiCoast«
In TsunamiCoast, the storm surge model described above will be applied to selected coastal regions. The resolution will be between 0.25′ (500 m) and 1′ (approx. 2 km), depending on the region of interest. For this study the parallelized version of the model will be employed in order to obtain optimal performance on the vector computer at the »Deutsches Klimarechenzentrum« (DKRZ). The parallelization will be performed by means of a domain-splitting scheme. The vectorization of the implicit part of the model code is carried out with the help of the red-black algorithm, which is known to show very good numerical properties for diagonally dominant matrices.
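The red-black ordering mentioned above colours the grid like a checkerboard, so each Gauss-Seidel sweep splits into two independent half-sweeps that vectorize well. A minimal sketch on a 2-D Poisson-type problem (a stand-in for the diagonally dominant implicit step, not the actual model code):

```python
import numpy as np

def red_black_gauss_seidel(u, f, h, sweeps=100):
    """Red-black Gauss-Seidel relaxation for the 2-D Poisson equation.
    All points of one colour depend only on the other colour's values,
    so each half-sweep can be updated simultaneously (vectorized)."""
    for _ in range(sweeps):
        for colour in (0, 1):  # two checkerboard colours
            for i in range(1, u.shape[0] - 1):
                # interior columns of this colour in row i
                start = 1 + (i + colour) % 2
                j = np.arange(start, u.shape[1] - 1, 2)
                u[i, j] = 0.25 * (u[i - 1, j] + u[i + 1, j]
                                  + u[i, j - 1] + u[i, j + 1]
                                  - h * h * f[i, j])
    return u

# Toy problem: Laplace equation with u = 1 on one boundary edge.
u = np.zeros((32, 32)); u[0, :] = 1.0
f = np.zeros_like(u)
u = red_black_gauss_seidel(u, f, h=1.0, sweeps=500)
print(0.0 < u[16, 16] < 1.0)  # interior relaxes towards the boundary data
```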
At the beginning, the model will be validated by means of historical tsunami events. In this context it is possible to calibrate relevant parameters such as the bottom friction coefficient. Subsequently, scenario runs for selected topographies will be conducted. The resulting signatures of the surface velocities will be analysed in combination with the governing parameters, such as shelf width, topography gradient, distance to the epicentre, and strength and form of the perturbation. Finally, the signatures and the derived analyses will be made available to the other project partners. They will use this information to describe more accurately the influence of tsunami waves on the sea surface currents, which is the parameter that is finally detected by the HF radar.

4.4. Numerical simulation »TsunamiHfSpec«
The ocean surface current data simulated by the high-resolution tsunami model are available at a resolution of about 100 m in space and 10 s in time. Together with ocean wave spectra derived from climatology, they are passed to a radar backscatter model, which calculates the highly variable backscatter Doppler spectra across the shelf edge for the radar resolution cells as a function of time. This backscatter model is used to investigate the characteristic structure of a tsunami signature: how long can it be seen within a resolution cell, what is the variance of the Bragg frequency, how fast does the signature move through the resolution cells, and how do these characteristics depend on the topography? The results of this investigation will be used to optimize the required spatial and temporal resolution of the radar. In a next step, typical design parameters for a special filter that allows tsunami signatures to be recognized in HF radar measurements will be derived from EM backscatter simulations. The sensitivity and response characteristic of such a filter can be tested by superimposing a simulated tsunami signature on real HF radar data. On this basis, the fast-propagating signature should be observable and assessable within the normal ocean circulation dynamics generated by tides, fronts and wind influence.

4.5. Field experiment »TidalBore«
Besides all the theoretical and numerical investigations, the capability of HF radar to detect tsunami waves should be tested in a field experiment. For this purpose the radar system must be adapted and optimized for this special task, which is mainly the responsibility of Helzel Messtechnik.

Figure 6: Area of observation with 3-radar coverage at the Hangzhou delta, China

Figure 7: Tidal bore at the Hangzhou delta, China

The following paragraphs summarise the main aspects of the system optimization:
– The excellent signal-to-noise characteristic of a WERA system allows modifications of this kind without the risk of reducing reliability. The higher temporal resolution can be achieved by means of optimised signal processing algorithms and more powerful computers.
– The required higher spatial resolution can be achieved by increasing the operating bandwidth of the radar. Since our radar is based on a broadband concept, this higher resolution is technically possible, but for long-range systems the generated data volume would be hard to handle. This problem is again a matter of signal processing optimisation and adaptation to the individual application (area of interest).
– As a component of a national disaster management system, the radar needs to be very reliable. At present, the rate of availability of data from the entire field of view is about 90%. The requirement for a sensor component of a disaster management system is stricter: almost 100% should be the goal. The reason for losing data is typically external electromagnetic interference (EMI). This problem can be strongly reduced by operating the radar in more than one frequency band, i.e. a multi-frequency system that automatically selects the best frequency band. Even though the WERA is already designed as a broadband system, special antennas, input filters and multiplexers as well as software need to be developed for this purpose.
Our main goal is to carry out a field experiment in which we can test the radar system,
software, warning strategy, and the alarm chain under »tsunami conditions«. The variability of the current pattern of a tsunami wave in shallow shelf water, where it gains in height and runs with reduced speed towards the coast, should at least be comparable to that of a tidal bore, cf. Fig. 7. A practical test shall determine whether these assumptions hold true in nature. There are, however, no sufficiently pronounced tidal bores in Europe that occur regularly enough to give a high probability of observing one during a measurement campaign. Searches to date have produced three possible locations:
1. Canada, Bay of Fundy or PETITCODIAC (bore amplitude ca. 1 m)
2. Brazil, Amazon delta or ARAGUARI (amplitude ca. 4–6 m)
3. China, Hangzhou delta bore QIANTANG DRAGON (amplitude ca. 7–9 m)
Location No. 3 (China, cf. Fig. 6) is preferred by us, since the tidal bore there is the most pronounced and predictable. Second and third choices would be location No. 2 (Brazil) and location No. 1 (Canada), respectively.

5. Conclusion
Three independent research institutes (the University of Hamburg, the University of Sheffield and the research group of Codar Ocean Sensors) came to the same conclusion, namely that an approaching tsunami will generate a significant ocean current signature when it reaches the continental shelf. Due to the dramatic reduction of the tsunami wave's velocity in shallow water, there can be enough time for a pre-warning if the shelf edge is more than 50 km offshore. The detection and analysis of this effect can provide a reliable forecast of the effects at specific coastlines or harbour areas. Since even smaller tsunamis can cause loss of life or damage to coastal property, such a local forecast is important. It is well known that warnings will be effective only if the number of false alarms is minimal. A local detection system can increase the reliability of a national tsunami warning system.
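A variance-threshold detector of the kind outlined in sections 3.2 and 4.4 can be caricatured in a few lines. Everything below is illustrative: synthetic noise, an artificial tsunami-like pulse, and an invented window length and threshold:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic radial-current time series for one radar cell [m/s]:
# weak background variability plus a short tsunami-like pulse.
t = np.arange(0, 3600, 10.0)                 # 10 s sampling over 1 hour
current = 0.05 * rng.standard_normal(t.size)
pulse = (t > 1800) & (t < 2100)
current[pulse] += 0.8 * np.sin(2 * np.pi * (t[pulse] - 1800) / 300)

def moving_variance(x, window):
    """Variance of x in a sliding window of `window` samples."""
    out = np.empty(x.size - window + 1)
    for k in range(out.size):
        out[k] = x[k:k + window].var()
    return out

var = moving_variance(current, window=30)    # 5-minute windows
background = np.median(var)                  # robust background level
alarm = var > 20 * background                # illustrative threshold
print(alarm.any())                           # True: the pulse is flagged
```

Tuning the threshold is exactly the false-alarm trade-off mentioned in the conclusion: too low and background variability triggers spurious warnings, too high and weak signatures are missed.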
6. Sketch of project overview
Figure 8: Combination of working modules
References
BACKHAUS, J. O., 1979: First results of a three-dimensional model on the dynamics in the German Bight. In: Proc. of 10th Internat. Liège Colloq. on Ocean Hydrodynam. Ed.: J. C. J. Nihoul. Amsterdam: Elsevier, Elsevier Oceanography Series 25, 333–349.
BACKHAUS, J. O. AND D. HAINBUCHER, 1987: A finite difference general circulation model for shelf seas and its application to low frequency variability on the North European Shelf. In: Three-dimensional models of marine and estuarine dynamics. Ed.: J. C. J. Nihoul and B. M. Jamart.
DELHEZ, E. J. M., P. DAMM, E. DE GOEDE, J. M. DE KOK, F. DUMAS, J. E. JONES, J. OZER, T. POHLMANN, P. S. RASCH, M. SKOGEN, R. PROCTOR (2004): What can we expect from shelf seas models: the NOMADS2 Project. Journal of Marine Systems, 45, 39–53.
HAINBUCHER, D., J. O. BACKHAUS, 1999: Circulation of the eastern North Atlantic and north-west European continental shelf – a hydrodynamic modelling study. Fisheries Oceanography, 8, Suppl. 1, 1–12.
POHLMANN, T. (2005): A meso-scale model of the central and southern North Sea: consequences of an improved resolution. Continental Shelf Research (accepted).
SÜNDERMANN, J., 1966: Ein Vergleich zwischen der analytischen und der numerischen Berechnung winderzeugter Strömungen und Wasserstände in einem Modellmeer mit Anwendungen auf die Nordsee. Mitt. Inst. Meeresk. Univ. Hamburg Nr. 4, 77 S.
CROMBIE, D. D. (1955): Doppler spectrum of sea echo at 13.56 Mc/s. Nature 175 (4449): 681–682.
GURGEL, K.-W., H.-H. ESSEN, and T. SCHLICK, 2003: The use of HF radar networks within operational forecasting systems of coastal regions. In: ›Building the European Capacity in Operational Oceanography‹, 3rd International EuroGOOS Conference, Proceedings, pp. 245–250, Elsevier, ISBN 0 444 1550 X.
HANSEN, W., 1956: Theorie zur Errechnung des Wasserstandes und der Strömungen nebst Anwendungen. Tellus, 8, 287–300.
GURGEL, K.-W., 1999: Applications of Coastal Radars for Monitoring the Coastal Zone. Proceedings EUROMAR Workshop ’99. Published by: EUROMAR Office, Agência de Inovação, S.A., Av. dos Combatentes, 43, 10°C/D, 1600-042 Lisboa, Portugal, pp. 21–30.
MAIER-REIMER, E., 1979: Some effects of the Atlantic circulation and of river discharges on the residual circulation of the North Sea. Dt. Hydrogr. Z., 32, 126–130.
GURGEL, K.-W., H.-H. ESSEN, and T. SCHLICK, 2000: Eddy dynamics off the Norwegian coast, recorded by HF radar. IGARSS 2000 Conference, Proceedings, pp. 1839–1841.
POHLMANN, T. (1996): Predicting the Thermocline in a Circulation Model of the North Sea–Part I: Model Description, Calibration and Verification. Continental Shelf Research, 16/2, 131–146.
POHLMANN, T. (1997): Estimating the Influence of Advection During FLEX’76 by Means of a Three-Dimensional Shelf Sea Circulation Model. Deutsche Hydrographische Zeitschrift 49 Vol. 2–3, 215–225.
¹ http://ifmaxp1.ifm.zmaw.de/Experiments.shtml
Early Warning System for Transport Lines (EWS TRANSPORT) Wenzel F. (1), Kühler T. (1), Hohnecker E. (2), Buchmann A. (2), Schedel F. (2), Schöbinger F. (2), Bonn G. (3), Hilbring D. (3), Quante F. (3) (1) Geophysical Institute, University of Karlsruhe (2) Department of Railway Systems, University of Karlsruhe (3) Fraunhofer Institute for Information and Data Processing, Karlsruhe
Abstract
Based on recent improvements in early detection methods for earthquakes, as well as advances in the field of communication and information technologies, new risk minimization strategies for railbound transportation systems are being developed.

1. Goals of the project
The purpose of the proposed research project is to develop and test an early earthquake warning system that minimizes the risk of damage for transport lines. The term »transport lines« is meant to comprise transport operation, vehicles (including passengers and goods), and infrastructure. The development and testing of an early warning system for transport lines will be carried out first for railbound transportation systems. On the one hand, this transportation mode is particularly vulnerable, and risk minimization measures are more difficult to implement, e.g. because of the long braking distance of trains. On the other hand, in railbound traffic every train movement is planned in advance and controlled from the outside, and it is thus characterized by very high standards of organization, control, and safety. Therefore, this mode of transportation is predestined for applying specific and coordinated emergency measures from outside, and hence well suited for testing the efficiency of the proposed early warning system. The experience gained here will be useful in the development of information systems for other transport lines. For many transportation systems there exist traffic control centers, which can interfere with and influence the flow of traffic more or less directly. In the case of an earthquake, traffic control centers will decide whether and where in a particular region traffic is prevented from entering an endangered bridge or tunnel. Furthermore, an immediate speed reduction of the vehicles in that region may be initiated. In the case of railbound transportation systems, passenger safety requires that certain operational situations, e.g. trains stopping on a bridge or train encounters (danger of collapse, danger of panic and ensuing damage), are avoided. In railbound traffic, vehicle movement can be influenced not only by the train driver but also by an external train control center. This means that one has enhanced possibilities of interfering from the outside, but it also entails additional decision-making problems. After an earthquake, the train control center must decide which parts of the network are accessible. In general, the train driver cannot react sufficiently quickly even if an obstacle on the track (e.g. due to a landslide) or track buckling is recognized ahead of time, except when driving in the low-safety mode »running at sight« (vmax ≤ 40 km/h). Figure 1 and the following sections provide an overview of the three research and development areas of the proposed early warning system. In chapter 4 the methods used in each area are discussed in more detail.
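The interplay of warning time and braking distance can be sketched with a constant-deceleration model. The 1 m/s² deceleration and the train speed are illustrative assumptions, not project figures; the 3 km/s shear-wave speed and the 60 km example distance are the values used in section 3.1:

```python
def warning_time_s(distance_km: float, v_s: float = 3.0) -> float:
    """Seconds between a (near-instantaneous) radio alert and the
    arrival of S-/surface waves at the given hypocentral distance."""
    return distance_km / v_s

def speed_after_braking(v_kmh: float, t_s: float, decel: float = 1.0) -> float:
    """Remaining speed [km/h] after braking uniformly at `decel` m/s^2
    for t_s seconds (illustrative service-braking value)."""
    return max(0.0, v_kmh / 3.6 - decel * t_s) * 3.6

t_warn = warning_time_s(60.0)   # 20.0 s of warning at 60 km distance
print(t_warn)
print(round(speed_after_braking(200.0, t_warn)))  # 128 km/h still remain
```

Even 20 seconds of warning does not stop a fast train, but it cuts its speed substantially, which is precisely why automated, externally triggered braking is worthwhile.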
Figure 1: The three research and development areas in EWS Transport
Early detection (I) is performed by seismological base stations, where relevant earthquake parameters are derived from the raw data. These parameters are then transmitted via radio communication to various traffic control centers, where a risk minimization procedure (II) is started by choosing appropriate emergency plans and measures, of which early warning is one of the most important. Efficient IaC technologies (III) are necessary to support the decision-making process because of the short warning times and the complexity of the problem.
(I) New early detection methods are used to determine important parameters such as magnitude, location, time, and spatial distribution of an earthquake. The seismic hazard for a particular region is then projected onto the transport lines within the range of a given control center. With the proposed early detection methods, the necessary information can be obtained within a few seconds, and, depending on the epicentral distance, warning times of up to 1 minute or more are achievable.
(II) For risk minimization, detailed emergency plans are carried out by the various traffic control centers, based on preconceived hazard and damage catalogues from which the appropriate measures can be chosen. As mentioned before, a typical action would be stopping a train before it enters a damaged part of the network. Furthermore, EWS Transport will produce an early estimate of the damage in the network caused by the earthquake. Finally, the proposed standardized early warning system will allow continuous monitoring of the network state even when there is no imminent danger.
(III) In view of the short warning times and the large number of alternatives, the use of new information and communication technologies (IaC) is absolutely necessary in order to accelerate the exchange of information and to support the decision-making processes of the responsible managers. Using so-called »message brokers«, data transfer between quite different systems is facilitated. In addition to conventional radio communication, the railway-specific GSM-R standard is suitable for prioritized and fast signal transmission, in order to execute the automatic emergency procedures and to exchange information between those affected by the earthquake. The use of geoinformation system (GIS) standards facilitates the creation of shake maps, and High Level Architecture (HLA) compliant IaC systems allow the efficient use and coupling of the different models needed for hazard prognosis simulations. The latter are an essential building block of the decision support system. The collaboration between the three research and development areas (see Fig. 1) can be characterized as follows: experts from the fields »early detection« and »risk minimization« develop and compile the required scientific and practical knowledge. This comprises the development of early detection
methods, and the compilation of hazard and damage catalogues for estimating the imminent danger to transport operation and the damage done to various infrastructure components. On the basis of these catalogues and a given set of earthquake parameters, appropriate emergency plans will be automatically selected and executed. In the IaC field, computer scientists develop the functionalities that allow a simulation of the various steps in the early warning chain. The results of these simulations can then be used in decision support processes. The structure and criteria that underlie the decision support system are defined by the experts from the areas »early detection« and »risk minimization«. The proposed research and development is carried out by an interdisciplinary team of researchers and practitioners from universities, research institutes, international engineering firms and industry.
2. Relevance to society
As recent events have shown, the world community is no longer willing to accept the catastrophes caused by major earthquakes as mere fate. The motivation for the proposed project is based on the knowledge that major advances in the fields of sensor, computer, and IaC technologies have been made. This opens up new possibilities for minimizing damage through the use of an early warning system in connection with support systems for decision making. In recent years, numerous projects have been started in a worldwide effort, aiming at risk reduction and an improvement of disaster management in the event of an earthquake. In Europe the main emphasis has been put on projects which aim at reducing the risks accompanying an earthquake through preventive measures and at improving catastrophe management. In this context the following projects are noteworthy:
– ORCHESTRA (http://www.eu-orchestra.org): aims at an open architecture for risk management with emphasis on prevention (with participation of IITB, see below)
– OASIS (http://www.oasis-fp6.org): catastrophe management system
– WIN (http://www.win-eu.org): IT system which provides a connection to technological systems that are useful for damage prevention and in the case of damage
– LESSLOSS (www.lessloss.org): a coordinated effort of 46 European partners for earthquake hazard prognosis, assessment of its effects on the environment, such as cities and infrastructures, and disaster preparedness and protection strategies
The present project incorporates pertinent results from these EU projects.

3. Current state of the art
3.1. Early warning systems
An early warning system is usually characterized by a network of detectors, e.g. accelerometers, which are distributed uniformly over an earthquake-prone area so that the distance between an assumed epicenter and the nearest detector is minimized. The seismic waves emitted by an earthquake can be classified into body waves (compressional and shear waves) and surface waves. The shear waves (S-waves) and surface waves have the largest amplitudes and cause the greatest damage. However, their propagation velocity is only about half of the P-wave velocity. When the fast P-waves arrive at the nearest detector, the data are transmitted electromagnetically to a central information processing facility. There, the raw detector data are converted into a few significant parameters, such as magnitude, location of the epicentre, and beginning of the earthquake, including an estimate of their reliability. These parameters are then transmitted by radio communication to potential users, e.g. the train control centers in the case of railbound transportation. The basic idea of an early warning system is to take advantage of the time difference between the arrival of a radio signal propagating with the speed of light, vem ≈ 300 000 km/s, and the seismic shear or surface waves, which propagate with a typical velocity of vS ≈ 3 km/s. For example, assuming that a vulnerable structure is
located at a hypocentral distance of 60 km, this leads to a time difference of 20 seconds before the S- and surface waves reach the structure. This time difference has to be used by intelligent information systems in order to minimize the damage. Applications of early warning systems exist in only a few places and only in simple form, e.g. in Japan (NAKAMURA, 2004), Taiwan (WU, 1998, 1999; TSAI, 1997), Turkey (ERDIK, 2003), Mexico (ESPINOSA, 1995), and California (ALLEN, 2003). Either the signals of different seismometer stations are transmitted to a central processing station, or they are processed directly at the sites of the individual stations. Usually, only a few seismological parameters are extracted from the raw data. Threshold criteria are used as alarm signals. With present early warning systems, important information relevant for hazard prognosis and damage prevention, such as the hypocenter location, the propagation direction of the seismic waves, and shake maps, is rarely used. However, particularly for railbound systems with their narrow tolerances of track alignment parameters, more detailed information concerning the propagation direction and amplitudes of seismic waves in the three spatial dimensions could be essential. This information could be used to assess the impact of an earthquake on train operation and various infrastructure elements with higher accuracy and to choose the most suitable emergency measures.

3.2. Risk minimization for transport lines
Currently, there are no early warning system applications that focus on the special requirements of transport lines, with the exception of a Japanese system for railways that has been under development since the 1980s. According to NAKAMURA, the UrEDAS (Urgent Earthquake Detection and Alarm System) for railbound transportation is worldwide the only early warning system for transport lines in operation (NAKAMURA, 2004). In recent years it has been repeatedly activated.
In case of an earthquake, single stations equipped with 3-D accelerometers detect the P-waves, calculate the magnitude, and within 4 seconds transmit a radio warning signal to endangered sites within a range of about 200 km. A standard emergency measure is then, for example, the deceleration and stopping of a fast Shinkansen train. Each single UrEDAS station is able to measure, calculate, and transmit corresponding warning signals. A higher-level network organization is not used. Possible disadvantages of the system are the ones mentioned in section 3.1 and the relatively low network density. A higher-density network of P-wave detectors could help to save valuable seconds and to provide a more detailed and more reliable damage prognosis.

3.3. IaC technologies
In seismology, complex IaC technologies, e.g. for data fusion, have so far only been used in investigations of recorded signals after an earthquake. Recent progress in detector and computer hardware, accompanied by simultaneous cost reduction, in connection with advances in the fields of early detection methods, model building, simulation, and decision support systems, suggests that an application in early warning systems and risk minimization procedures is promising.

4. Employed methods
4.1. Early warning methodology
Ground motion caused by an earthquake consists of a wave train on three components (vertical and horizontal) with a length that depends on the magnitude and ranges from 10 seconds (M = 5) to a minute (M = 8). The acceleration signal starts weakly, with the compressional waves arriving first, and increases with time to a damaging level associated with the shear waves or surface waves. For a given magnitude, hypocenter, and source time, ground motion can be estimated at any location (a) by stochastic modelling, or (b) by empirically determined attenuation relations for various parameters, such as Peak Ground Acceleration (PGA). Traditionally, source parameters are determined with a seis-
mological network of stations. Although this provides reliable values, it takes time (at least one minute) and relies on functioning communication between the stations and a data center (TENG, 1997). Alternatively, new methods have been developed in recent years that typically utilize the first 3 seconds of the weak initial arrivals at a single station in order to predict the magnitude. A method developed at the Geophysical Institute in Karlsruhe bridges the gap between single-station alarm systems and those requiring records from an entire network by means of the Artificial Neural Network methodology (BÖSE et al., 2004, 2005). It allows an alarm to be issued with one station, but upgrades it continuously with each additional available record. This approach should be extended not only to estimate ground motion at one or more specified sites but also to determine source parameters and the relative amplitudes of the ground motion components. The feasibility of component-specific prediction of ground motion amplitudes is assessed by analysis of existing earthquake databases (e.g. data from the Japanese K-net and KiK-net). Basically, the following information can be extracted from seismic signals recorded in the vicinity of railway lines:
– After 3 seconds: Has an earthquake with damage potential occurred? What is its magnitude?
– After a few more seconds: When will the strong-motion phase arrive at which location on the railway line? How is the ground motion distributed over the line-parallel, line-perpendicular, and vertical components?
– After minute(s): A fairly accurate map of ground motion along the railway line (shake map) that allows the state of the line and potential damage to be assessed. This requires detailed knowledge of the geotechnical parameters along the railway line, so that site effects and secondary phenomena such as liquefaction and landslide potential are included. The geotechnical parameters are usually known along railways, as they are required for the structural design.
Assessment of the precision and reliability of seismic prognoses within a short time with Artificial Neural Nets represents a major challenge for the seismological part of this project, particularly with regard to the noisy environment in the vicinity of railway lines. Hence, seismic measurements are carried out along train tracks (see Fig. 2) to obtain the required noise
Figure 2: Seismogram showing 45 minutes of ground velocity recorded by a seismometer at a distance of 4.83 m from the train track. The upper plot shows the component of ground motion oriented perpendicular to the track (E), the middle plot the component parallel to the track (N), and the lower plot the vertical component. The periods with distinctly high amplitudes are caused by passing trains.
model. This model will allow an appropriate evaluation of how much information on earthquake ground motion can be extracted from accelerometers along railway lines. Furthermore, the measurements provide a basis for the development of methods for continuous track state monitoring.
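The incremental single-station approach described above can be illustrated with a short sketch. This is not the actual Artificial Neural Network of BÖSE et al.; the scaling relation and its coefficients are purely hypothetical, and only the update scheme (issue an alarm with the first record, refine it with each additional one) reflects the text.

```python
import math

# Purely illustrative single-station magnitude sketch: the scaling relation
# and its coefficients are hypothetical, not the BOESE et al. neural network.
def station_estimate(peak_amplitude, distance_km):
    """Rough magnitude guess from the first seconds of one station's record."""
    return 4.0 + 1.0 * math.log10(peak_amplitude) + 1.5 * math.log10(distance_km)

class IncrementalAlarm:
    """Issue an estimate with one station, refine it with each new record."""

    def __init__(self):
        self.estimates = []

    def add_record(self, peak_amplitude, distance_km):
        # Each arriving record upgrades the running alarm.
        self.estimates.append(station_estimate(peak_amplitude, distance_km))
        return self.magnitude()

    def magnitude(self):
        # Averaging over stations reduces the single-station scatter.
        return sum(self.estimates) / len(self.estimates)
```

The essential property, mirrored from the text, is that an alarm exists as soon as one station has reported and becomes more reliable as further records arrive.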
The complexity of the problem suggests that only a rigorous, computer-based formalization of the underlying knowledge (hazard and damage estimates on the one hand, suitable measures for risk minimization on the other) can support the responsible emergency managers.
4.2. Development of strategies for risk minimization
A risk minimization procedure will be successful if the relations between a given three-dimensional signal strength and its consequences for transport operation and transport infrastructure are known in some detail. This knowledge is available in the form of physics and civil engineering reference manuals, and in the form of special models (e.g. derailment models for railbound systems). An integrated application of this knowledge, combined with information about the spatial and temporal distribution of strong ground motion (shake maps), leads to improved hazard estimates and impact prognoses. The latter serve as a basis for applying proper emergency measures in the region affected by an earthquake. This process may be considerably accelerated if one can rely on pre-existing earthquake hazard and damage catalogues containing various impact scenarios as well as corresponding emergency plans and measures. The choice of appropriate emergency measures from these catalogues can be supported by a computer if the aforementioned relations have been formalized as an expert system. In the case of an earthquake with damage potential, the responsible emergency managers must make numerous decisions under rather difficult circumstances:
– There are numerous alternatives
– The preference for a particular alternative is not clear
– A number of criteria have to be taken into account
– The required information is not sufficiently accurate
– Decisions have to be made under enormous time pressure
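The decision difficulties listed above motivate the expert-system formalization proposed in the project. The following sketch shows the general shape such a formalization could take; the thresholds, situation attributes and measures are invented for illustration and are not taken from the project.

```python
# Hypothetical rule base: each rule maps a predicted shaking level and track
# situation to an emergency measure. Thresholds and measures are invented.
RULES = [
    (lambda s: s["predicted_pga_g"] >= 0.3,
     "stop all trains in the affected region"),
    (lambda s: 0.1 <= s["predicted_pga_g"] < 0.3,
     "reduce speed to 40 km/h"),
    (lambda s: s["predicted_pga_g"] >= 0.1 and s["curved_track"],
     "inspect curves before resuming service"),
    (lambda s: s["predicted_pga_g"] < 0.1,
     "no action, continue monitoring"),
]

def recommend(situation):
    """Return every measure whose rule fires for this situation."""
    return [measure for condition, measure in RULES if condition(situation)]
```

Because the rules are explicit, the resulting recommendations are transparent and reproducible, which is precisely the property the text demands of a decision support system.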
4.2.1. Compilation of hazard, damage, and emergency measure catalogues
In Europe, railway traffic is sometimes organized and supervised by a few national train control centers (e.g. about 5 in Germany). These train control centers can intervene and affect individual trains over a large area if a corresponding warning from a superior agency (e.g. meteorological service, Department of the Interior) has been issued. For example, train control centers can stop and redirect trains within their range and coordinate emergency measures. If the track alignment is perturbed by an earthquake, the probability of derailment increases rapidly with increasing speed, in particular in curves. Even a moderate speed reduction initiated by an early warning system implemented in the train control centers can considerably reduce the severity of an accident, or possibly avoid it altogether. Therefore, one goal of EWS Transport is the compilation of catalogues containing various hazard, damage, and emergency plan scenarios based on a given earthquake scenario. For this, the physical properties of the track infrastructure and engineering works, the dynamics of train motion, and the positions and velocities of the trains on the network have to be known. To achieve this goal we can rely on the experience and expertise of a network of researchers and practitioners from seismology, civil engineering, train operation, and information science.
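The benefit of a moderate speed reduction can be quantified with a simple back-of-the-envelope relation: the kinetic energy of a train, a rough proxy for accident severity, scales with the square of its speed. The speeds used below are arbitrary examples.

```python
def kinetic_energy_ratio(v_reduced, v_full):
    """Fraction of the original kinetic energy remaining after slowing down.

    Kinetic energy is E = m * v**2 / 2, so the train mass cancels in the
    ratio and only the squared speed ratio remains.
    """
    return (v_reduced / v_full) ** 2

# Halving the speed, e.g. from 160 km/h to 80 km/h, leaves only a quarter
# of the kinetic energy, which is why even a moderate early-warning-triggered
# speed reduction can greatly reduce the severity of a derailment.
ratio = kinetic_energy_ratio(80.0, 160.0)
```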
4.2.2. Track network as an earthquake detector and infrastructure monitor
An interesting aspect of railbound systems is the principal possibility of employing the existing rail network, with its internal sensors and electrical wiring, for early detection and for information transmission. The earthquake early warning system would then be part of the standard train control and protection system. Because the train control and protection infrastructure is currently being redesigned with the aim of achieving technical interoperability between different EU member states (BERGER, 2004), it seems timely to ask: How must the train control infrastructure be modified so that it can be used for early detection, fast damage map generation, and continuous track state monitoring? To answer this question, a feasibility study is performed and suitable sensors and IaC solutions are developed during the first phase of EWS Transport. In a second phase, a test track section (demonstrator) will be designed and built in order to test these components of an early warning and monitoring system in practice.
4.2.3. Demonstrator for practical testing
For a future demonstrator there are two principally different track systems, employing (a) discretely supported and (b) continuously-elastically supported and embedded rails, as shown in Fig. 3. Due to its low number of construction elements and its quasi-monolithic structure, the latter system is particularly well suited for testing purposes. Practical testing of the early detection and track state monitoring equipment is performed using a modern Embedded Rail System
(ERS) in collaboration with Edilon Ltd. In an ERS the rails are embedded in two longitudinal troughs carved into a concrete slab and elastically fastened by a filling of a two-component polyurethane-cork-granulate glue, called corkelast (see Fig. 3b). In contrast to the standard ballast track (Fig. 1a) and most slab track systems derived from it, where the rails are supported only at discrete support points, in an ERS the rails are continuously-elastically supported over their entire length, in both the vertical and horizontal directions (HOHNECKER, 2002). Furthermore, instead of about 55,000 parts (bolts, nuts, clamps) per km of track in the case of a ballast track, an ERS employs only a few structural components (see Fig. 3b) that are quasi-monolithically connected. PVC pipes are usually inserted in order to reduce the required corkelast mass. If the sensors (e.g. strain gauges, accelerometers) are not embedded in the corkelast itself, the sensors and their wiring can be accommodated inside the PVC pipes. In this way they are protected from vandalism and from damage due to train operation. Standardized units of ERS test track sections can be integrated into the worldwide railway network.
4.3. Advanced IaC technologies
For the communication of seismological base stations with central processing stations, several transmission methods (via satellite, spread spectrum, ADSL) with high stability and efficient noise suppression can be employed.
Figure 3: (a) Standard ballast track with discrete rail support; (b) details of a continuously-elastically supported Embedded Rail System (ERS)
The choice of method depends on the circumstances of the particular application (possible perturbations, distance between stations, available energy). In the railway sector, the possibility of using the standard GSM-R has to be investigated. Usually, the communication between an early detection system and a traffic control center will involve different data transmission formats, depending on the country where the early warning system is installed. In order to avoid frequent interface adjustments, the implementation of a »message broker«, which accomplishes the required universal transformations between the different data formats used by various standard and non-standard applications, is recommended. Spatial data plays an important role in the early warning system for transportation lines. On the one hand, spatial earthquake data (shake maps) need to be recorded, edited, stored, modelled and analyzed, as well as visualized alphanumerically and graphically. On the other hand, these data need to be combined with the spatial data of the transportation lines themselves. The early warning system shall take these tasks into account and shall make use of available geospatial standards, for example those provided by the OGC (Open Geospatial Consortium), and of adequate functions of geographical information systems (GIS) for its realization, with the aim of minimizing the risks caused by earthquakes. In this context, an important role is played by models that allow the behavior of the various infrastructure components of a transport line to be simulated. Such models can be of different nature, depending on the state of the art in the particular field, i.e., they are encoded either heuristically as a set of rules or in mathematical form. The complexity of such models can be very high. For most infrastructure components one can use existing models (e.g. for the vibrational behavior of bridges, tunnels, or earthworks).
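The role of the »message broker« described above can be sketched as follows: each sender format is parsed into one canonical internal representation, and each receiver format is produced from it, so that n formats require n adapters rather than n² pairwise converters. The two formats shown are invented examples, not actual railway or seismological message formats.

```python
# Hypothetical sender format: "station,lat,lon,pga" as comma-separated text.
def parse_shake_msg_csv(text):
    station, lat, lon, pga = text.split(",")
    return {"station": station, "lat": float(lat),
            "lon": float(lon), "pga_g": float(pga)}

# Hypothetical receiver format expected by a train control center.
def format_for_control_center(msg):
    return f"STN={msg['station']};PGA={msg['pga_g']:.2f}g"

class MessageBroker:
    """Translate between formats via one canonical internal representation."""

    def __init__(self):
        self.parsers, self.formatters = {}, {}

    def register(self, name, parser=None, formatter=None):
        if parser:
            self.parsers[name] = parser
        if formatter:
            self.formatters[name] = formatter

    def transform(self, payload, src, dst):
        # Parse into the canonical dict, then emit in the target format.
        return self.formatters[dst](self.parsers[src](payload))
```

New applications then only register their own adapter with the broker instead of requiring interface adjustments in every peer system.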
New models have to be developed for special tasks, e.g., a derailment model. In any case, it is important to establish
an interoperability of the various models and simulation tools by implementing the HLA (High Level Architecture) standard. For the decision support component of such an integrated system, experts from the respective transportation system must define the rules and criteria which, if satisfied, lead to a unique set of measures (early warning of a certain region, stopping of vehicles, blocking of certain road/track sections). This process increases the transparency and justification of the decisions taken; furthermore, the decisions become rational and reproducible. If these prerequisites are satisfied, a computer-based decision support system can be constructed through formalization of these rules and criteria. Such a computer-based decision support system is an essential part of EWS Transport, because sufficiently fast and effective decisions cannot be expected from the responsible emergency managers alone, given the boundary conditions listed in section 4.2 that they have to cope with. Even with an implemented IT solution, time remains the decisive factor. Therefore, an Information System for the EWS Transport System will be developed which takes into account the tasks and aspects discussed in the previous paragraphs. The architecture of the Information System will be designed following the »Reference Model of Open Distributed Processing« (RM-ODP). The design foresees looking at the architecture from different viewpoints. The first viewpoint defines the use case scenarios that the Information System shall realize. The first version of the architecture distinguishes between the following use cases:
1. Early warning system for transport lines
2. Dynamic early warning system for transport lines, describing the workflow from the recognition of an earthquake to the final action that needs to be performed (e.g. stopping a train)
3. Generation of an event report, which can be created for the post-processing of an event
4. Generation of a damage report, which can be created for the post-processing of an event
5. Training of the Artificial Neural Network that is used to detect an earthquake
6. Preparation of the damage catalogue that is needed for the evaluation of predicted damages
7. Continuous monitoring of infrastructure elements of transport lines
In the second viewpoint, information models will be created. They will be used in the service specifications of the third viewpoint. One goal will be to reuse and adapt information models and service specifications already defined by consortia or projects such as the OGC or ORCHESTRA (a European Integrated Project which has developed a reference architecture for the application field »Risk Management«). The models of viewpoints two and three are described in abstract UML diagrams before the mapping to current technologies is implemented. This ensures that the architecture can be flexibly adapted to newly developed technologies in the future.
5. References
Allen, R. and Kanamori, H., 2003. The potential for earthquake early warning in Southern California, Science, 300, 786–789.
Berger, R. et al., 2005. The way to coordinated deployment of ERTMS/ETCS throughout the European Network, Railway Technical Review 4, pp. 21.
Böse, M., Erdik, M. and Wenzel, F., 2005. Earthquake Early Warning – Real-time prediction of ground motion from the first seconds of seismic recordings. Proc. Volume of the Int. Conference on the 250th Anniversary of the 1755 Lisbon Earthquake, 185–187, 1–4 Nov. 2005.
Böse, M., Erdik, M. and Wenzel, F., 2004. Near real-time estimation of ground shaking for earthquake early warning. In: Malzahn, D. & Plapp, T. (eds). Disaster and Society – From Hazard Assessment to Risk Reduction, Logos Verlag, 175–182.
Erdik, M., Fahjan, Y., Ozel, O., Alcik, H., Mert, A., and Gul, M., 2003. Istanbul Earthquake Rapid Response and the Early Warning System. Bull. of Earthquake Engineering, Vol. 1, Issue 1, pp. 157–163.
Espinosa-Aranda, J., Jiménez, A., Ibarrola, G., Alcantar, F., Aguilar, A., Inostroza, M., and Maldonado, S., 1995. Mexico City seismic alert system, Seism. Res. Lett. 66, 42–53.
Hohnecker, E., 2002. Diskret gelagerte oder kontinuierlich eingebettete Schienenfahrbahnsysteme? EI-Eisenbahningenieur 53, 45.
Hohnecker, E., 2003. Acoustic properties of railway superstructures, World Congress on Railway Research, 28 Sept.–1 Oct. 2003, Edinburgh.
Nakamura, Y., 2004. UrEDAS, Urgent Earthquake Detection and Alarm System, Now and Future, 13th World Conference on Earthquake Engineering, Vancouver, August 1–6, 2004.
Quante, F., 2001. »Innovative Track Systems – Criteria for their Selection«. Better Railways for a European Transport Market, Quante, F. (Ed.), Media Network, Pfungstadt, Nov. 2001.
Quante, F., Kuhla, E., Strothmann, W., 2005. »CroBIT – Cross Border Information Technology for Interoperability in European Rail Freight Transport«. Proceedings 5th European Congress on Intelligent Transport Systems, Hannover, June 2005.
Röthlingshöfer, F., 2002. Erschütterungstechnische Analyse eines Straßenbahnoberbaus, Fallstudie Thereses gate, Oslo, Diplomarbeit, Universität Karlsruhe.
Usländer, T. (Ed.), 2005. »Reference Model for the ORCHESTRA Architecture (RM-OA)«. Open Geospatial Consortium Discussion Paper (05-107), 2005.
Usländer, T., 2005. Trends of environmental information systems in the context of the European Water Framework Directive. Elsevier Journal Environmental Modelling & Software 20 (2005), 1532–1542.
Teng, T. L., Wu, Y. M., Shin, T. C., Tsai, Y. B., and Lee, W. H. K., 1997. One minute after: strong-motion map, effective epicenter, and effective magnitude, Bull. Seism. Soc. Am. 87, 1209–1219.
Tsai, Y. B. and Wu, Y. M., 1997. Quick determination of magnitude and intensity for seismic early warning, 29th IASPEI meeting, Thessaloniki, Greece.
Wu, Y. M., Shin, T. C., and Tsai, Y. B., 1998. Quick and reliable determination of magnitude for seismic early warning, Bull. Seism. Soc. Am. 88, 1254–1259.
Wu, Y. M., Chung, J. K., Shin, T. C., Hsiao, N. C., Tsai, Y. B., Lee, W. H. K., and Teng, T. L., 1999. Development of an integrated seismic early warning system in Taiwan – case for the Hualien area earthquakes, TAO 10, 719–736.
Wenzel, F., Oncescu, M. C., Baur, M., Fiedrich, F. & Ionescu, C., 1999. An early warning system for Bucharest, Seismol. Res. Letters, 70, 2, 161–169.
Wenzel, F., Baur, M., Fiedrich, F., Oncescu, M. C. & Ionescu, C., 2001. Potential of Earthquake Early Warning Systems, Journal of Natural Hazards, 23, 407–416.
Wenzel, F. and Marmureanu, G., 2006. Earthquake Information Systems, to be published in Pageoph Topical Issue, Proc. Volume 22nd Int. Tsunami Symposium, Chania, Greece, 27–29 June 2005.
METRIK – Model-Based Development of Technologies for Self-Organizing Decentralized Information-Systems in Disaster Management (DFG-Graduiertenkolleg) Speaker: Fischer J. (1), fischer@informatik.hu-berlin.de (1) Institute for Informatics, Humboldt University of Berlin (2) GeoForschungsZentrum Potsdam (3) Zuse Institute Berlin (4) Fraunhofer FIRST (5) Hasso Plattner Institute for Software Systems Engineering
Summary
Recent progress in basic research has led to visions of how new self-organizing networks can be used for advanced information systems. These networks function without central administration: all nodes are able to adapt themselves to new environments autonomously and independently. The addition of new nodes or the failure of individual nodes does not significantly impact the network's ability to function properly. Information systems and the underlying technologies for self-organizing networks, in the context of a specific application domain, are the central research topic of this graduate school. The research focuses on the key technologies needed at each individual node of a self-organizing network. Research challenges within this graduate school include: finding a path through a network with the help of new routing protocols and forwarding techniques, replication of decentralized data, automated deployment and update of software components at runtime, as well as work-load balancing among terminal devices with limited resources. Furthermore, non-functional aspects such as reliability, latency and robustness will be studied. The graduate school's focus on decentralized information systems with self-organizing networks is sharpened by relating these more general research issues to a very specific application domain: computer-supported disaster management. For this reason, the graduate school emphasizes techniques, methods, and concepts for designing and implementing geographic information systems on top of dynamic, highly flexible, self-organizing networks, and their integration with services for geographic information systems based on existing technologies in these areas. To manage the complexity of data, information, and services and to make them available to the user, it is extremely important to hide, as much as possible, the difficulties and complexity of such an environment. Only if we succeed in shielding the user from internal errors and changes will such systems be accepted. Aside from the specific demands posed by the geographically dispersed positions in our application domain, the network topology plays an important role in the configuration process, because the partitioning of the network into separated subnetworks must be detected and prevented. Furthermore, if the network does become partitioned, it must be possible to find all subnetworks and reconnect them, if possible, with a minimal number of links.
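The partition handling described above, detecting separated subnetworks and reconnecting them with a minimal number of links, can be illustrated with a small graph routine. Reconnecting k components always needs exactly k − 1 new links; which physical links are actually feasible (radio reachability, available energy) is the hard part in practice and is ignored in this sketch.

```python
from collections import defaultdict

def components(nodes, edges):
    """Find the connected components (subnetworks) of an undirected graph."""
    adjacency = defaultdict(set)
    for a, b in edges:
        adjacency[a].add(b)
        adjacency[b].add(a)
    seen, comps = set(), []
    for n in nodes:
        if n in seen:
            continue
        # Depth-first traversal collects one whole component.
        stack, comp = [n], set()
        while stack:
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend(adjacency[v] - comp)
        seen |= comp
        comps.append(comp)
    return comps

def repair_links(nodes, edges):
    """Return a minimal set of new links that makes the network connected.

    k components are chained via one representative each, giving k - 1 links.
    """
    reps = [min(c) for c in components(nodes, edges)]
    return [(reps[i], reps[i + 1]) for i in range(len(reps) - 1)]
```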
Research in the suggested application domain is interdisciplinary by nature. The graduate school will conduct basic research in applying workflow management technology to disastrous events, such as earthquakes, based on the developed network protocols and information service concepts. The goal of this work is to support decision makers in making better informed decisions by using the complete range of available options. A key differentiator of this graduate school is its model-based approach, which will be applied to all layers of the system. Meta-model languages will aid disaster management experts in modeling their workflows, which may in turn be simulated in order to assess decision processes. Theoretical studies of workflow usability will provide the basis for investigations of the composability of partial workflows in complex scenarios. Workflows will also be studied for their applicability to aiding the self-organization of systems by dynamically allocating network resources. The combination of functional specification, automated code generation, and performance analysis methods is a distinguishing aspect of model-based service engineering in self-organizing distributed information systems, and will contribute significantly to the service quality of all system components.
Current PhD Subjects
Transactional Workflows in Self-Organizing Information Systems
Artin Avanes, avanes@gk-metrik.de
The integration of sensor networks, other embedded systems and mobile devices into a self-organizing information system (IS) results in faster detection of, and better prevention and recovery from, urban disasters. However, efficient disaster management strongly depends on the reliable execution of workflow processes before, during and after a disaster strikes. Workflows describe the operational aspects of work procedures, specifying the individual tasks, their order, and the conditions under which the tasks must be executed. In
the area of disaster management, workflow activities may run on sensor networks measuring ground motion and providing mobile rescue teams with important event data. The integration of smart, wirelessly collaborating devices poses new technical challenges for the coordination and execution of workflow activities. Unlike traditional IS, we have to cope with:
– Dynamics: Device nodes may fail, or the communication links between devices can temporarily break.
– Limited resource capabilities: Self-organizing networks contain small devices with limited resources, e.g. limited battery power or a finite memory capacity.
– Heterogeneity: Different kinds of participants with varying software and hardware capabilities have to be considered during workflow execution.
In view of these challenges, we will focus on strategies to support an energy-efficient distribution and a robust execution of workflow activities in such self-organizing environments. We use the approach of constraint programming to distribute the activities efficiently. We model the deployment task (mapping of activities to device nodes) as a constraint satisfaction optimization problem (CSOP). Our goal is to select the assignment with optimal energy consumption from the set of possible execution settings. Based on a transactional model for workflows that relaxes the ACID properties, we will also develop different failure protocols, such as replication and migration strategies. We will compare the costs of migrating activities with the replication efforts for different failure classes (software, hardware, and communication failures).
Dependable, Service-Based Infrastructure for Disaster Management
Stefan Brüning, bruening@gk-metrik.de
Computer systems for use in disaster management are based on very heterogeneous infrastructures. Powerful servers are the backbone of these systems and are used for workflow management, task force coordination and
communication. These servers process data generated by humans and by sensors. Sensor data can come from a variety of different sensor types mounted on individual sensor nodes. The complexity of these sensors ranges from sophisticated physical systems to simple nodes, as used in sensor networks, that have only very limited resources. All parts of the system have to work effectively and efficiently. A special focus lies on dependability: a breakdown of some parts must not lead to a failure of the whole system but must instead be compensated by the remaining parts. The system has to self-adapt to the changed infrastructure and still allow successful management of the disaster. Service-oriented architectures (SOAs) have been established as a standard for distributed systems in recent years. The possibility of dynamically discovering and binding to new services allows a loosely coupled and adaptive infrastructure. The dependable integration of powerful servers and resource-limited sensor nodes into one disaster management system is a challenging task. Servers for use in workflow systems can be embedded in a service-based architecture built on web services. Several approaches for increasing the dependability of web services exist. Nevertheless, there are no exact measurements yet of whether those increases in dependability can really be achieved. In my dissertation I will examine to what extent certain technologies and methods increase dependability. For this purpose, a SOA consisting of several test services on different computers will be deployed in our laboratory. Artificial failures will be simulated using fault injection techniques. The results will be evaluated based on metrics for service dependability. Additionally, a fault taxonomy will be developed as a basis for fault injection.
Modeling and Verifying Declarative Workflows
Dirk Fahland, fahland@gk-metrik.de
The support of disaster management by self-organizing information systems and networks
cannot succeed without support of the intermeshed work procedures emerging in this context: administrative processes that are carried out when disaster strikes and that cover the overall situation need to fit the emergency and rescue procedures for locally limited actions. Both will interfere with the procedures and behaviour of the supporting self-organizing information system. A failure-free and robust concurrence, that is, correctness of the involved procedures, is a prerequisite for successful disaster management. Workflows have been established, in different variants, as a mathematically founded, operational model for work procedures. Thus, questions about the correctness of one or several joint processes can be formulated and answered on a principled basis. Like any model, established workflow models have been built under assumptions such as the availability of resources, a communication infrastructure, and the continuity of cooperating partners. The dynamics of disaster management and of self-organizing systems violate these assumptions: workflows for disaster management need to be flexible to allow the execution of a process in various circumstances, and unpredictable events require the (runtime) adaptation of a workflow. At the same time, a flexible and adaptive workflow has to be correct in order to be a reliable tool in disaster management. A property of particular interest is the self-stabilization of a workflow that has encountered a faulty situation. We have identified two ideas that may effectively allow the modeling of flexible and adaptive workflows such that these models can be analyzed for the properties of interest. We step back from classical operational models like Petri nets and introduce declarative and scenario-based elements: temporal logics, for instance linear-time temporal logic, allow critical aspects like communication or resource access to be modeled precisely, while giving only a loose, but sound, characterization of the ordering of workflow tasks.
A scenario-based approach, as exercised in Harel's Live Sequence Charts, breaks behaviourally complex models into human-conceivable parts that can be modified as the (changing) situation requires.
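The declarative idea can be made concrete with two classic constraint templates, written here as plain predicates over a recorded task trace rather than in a temporal logic; the task names in the usage below are invented. Any ordering of tasks that satisfies all constraints counts as a correct execution, which is exactly the looseness the declarative style provides.

```python
def precedence(a, b):
    """Constraint: task b may only occur after some earlier occurrence of a."""
    def check(trace):
        seen_a = False
        for task in trace:
            if task == a:
                seen_a = True
            elif task == b and not seen_a:
                return False
        return True
    return check

def response(a, b):
    """Constraint: every occurrence of a must eventually be followed by b."""
    def check(trace):
        pending = False
        for task in trace:
            if task == a:
                pending = True
            elif task == b:
                pending = False
        return not pending
    return check

def conforms(trace, constraints):
    """A trace is a correct execution if it satisfies every constraint."""
    return all(check(trace) for check in constraints)
```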
This approach requires the definition of a feasible (declarative and/or scenario-based) modeling language for workflows that meets the ideas and concepts of processes in disaster management. Once models exist in such a language, they have to be made effective, for instance by translating them into an operational model (and vice versa). Finally, we are interested in a (temporal-)logic-based characterization of relevant properties like self-stabilization, and in feasible verification procedures for checking the correctness of a workflow.
Data Management for Wireless Sensor Networks
Timo Mika Gläßer, timo.glaesser@gk-metrik.de
The Graduiertenkolleg METRIK focuses on self-organizing systems for use in disaster management before, during and after natural disasters like earthquakes, tsunamis or volcanic eruptions. We will try to address the various problems that arise with (a) data gathering and (b) event detection using wireless sensor networks in the above-mentioned scenarios. In fact, wireless sensor networks play a central role in the envisioned systems for disaster management. Wireless sensor networks consist of small, independent sensor devices with highly incomplete, local knowledge that can measure physical properties of their environment and communicate with other sensor devices in their area using broadcast wireless communication. These sensor devices are equipped with small batteries as their only power source, and each sampling, computation and communication operation consumes energy. Hence, respecting the energy constraints of these wireless sensor devices is critical to keeping them operational, and optimizing for lower energy consumption may increase device and network lifetime. We will try to provide a declarative publish/subscribe-style interface to the sensor network that enables users to specify queries like »Report all locations of sensors in the area around Istanbul where the shaking exceeds a
threshold τ.« or »Report all clusters of nodes and the average shaking in the clusters in the Istanbul area where at least 80an earthquake of magnitude greater than τ.« When a user hands such a query to our system, the query will automatically be decomposed into smaller operations arranged in a dataflow graph and be spread across the wireless sensor network. The operations in the dataflow graph can be executed in different orders yielding different energy usages at both the individual sensor devices and the network level. Our system will continuously try to optimize the cost of multiple such queries to extend the wireless sensor network’s lifetime. Additionally as routing is of utmost importance in multi-hop wireless networks we provide novel ideas of including routing decisions in the evaluation and optimization process of queries. Prior research in the area does either not perform in network query processing – all data is routed to a gateway node and processed there – or considers single-queries only. Additionally our approach differ from others by not only allowing simple aggregations like MIN, MAX, AVG but also supporting the clustering of nodes based on the network topology, e.g. hop-distance. QoS in 802.11 Multihop Networks Kai Köhne, koehne@gk-metrik.de Wireless local area networks using the IEEE 802.11 standard are popular both in home and office use. Most often, the technology is used for the »last hop«, letting the user access the (wired) core network via an access point. However, projects such as MIT Roofnet and Berlin Roofnet demonstrate that even a larger metropolitan network can rely solely on wireless links. In these networks, every participating node does not only process its own data, but also relays traffic for neighbouring nodes, so that every node in the network can be reached by multi-hop connections. These techniques have also been studied and examined for sensor networks. In the EDIM and SAFER projects we create an earth-
quake early warning system based on 802.11 multi-hop technologies. Implementing demanding real-time applications such as earthquake early warning based on wireless multi-hop networks require high network reliability. However, each link in a 802.11 network is highly volatile, suffering from physical phenomena like fading, but also from interferences with other devices. Moreover, in cities often 10 or more 802.11 networks operate in parallel, effectively limiting the available bandwidth for each of them. One way to achieve a higher reliability for time critical applications is service differentiation. The recent addition 802.11e provides new medium access methods that share the medium either in a fully distributed (EDCA) or centralized (HCCA) way. EDCA is of special importance for multi-hop networks: High priority packets have a higher chance to access the medium in the case of conflicts. In my work I evaluate the advantages and problems of using EDCA in wireless multi-hop networks. One problem largely ignored in the literature is spectrum competition between different 802.11 networks. Networks are able to optimize their EDCA parameters so that they can gain a higher share of the bandwidth. If every network does use the most aggressive settings, the differentiation mechanism is effectively undermined. We therefore develop a new differentiation protocol which builds upon EDCA. In these protocols, every data message has the same EDCA priority. Service differentiation is achieved using a separate control channel. This avoids the mangling of intra- and inter-network priorities, and resolves unfairness issues due to interference between data and control packets. Semantic Integration for Disaster Management Bastian Quilitz, quilitz@gk-metrik.de Fast and effective disaster management largely depends on the fast provision of detailed, precise, and up-to-date information. It is vital for decision makers and rescue crews to be informed about the state of critical infrastruc-
tures, the number, location, and status of injured people, etc. The required information is likely to be scattered over many distributed information systems belonging to different organizations, and not all relevant information sources may be known in advance. Another important aspect in disaster management is the different information needs depending on the user’s role: decision makers will require different information than rescue crews in the field. Integrated information systems provide access to a number of information sources. In such systems, heterogeneity is one of the main challenges. While some approaches exist to overcome technical heterogeneity, such as object-oriented middleware, overcoming structural and semantic heterogeneity is still an ongoing research question. For disaster management we distinguish between two types of information sources based on their temporal characteristics: databases and streaming information sources. While databases provide query access to persistent data that is relatively static, such as background information or management information, streaming information sources are updated continuously. Examples of streaming information sources are sources for sensor data or locations of rescue teams. In the context of METRIK, I envision an information system for disaster management that seamlessly integrates both types of information sources and provides the right information to the right user. Users can either submit subscriptions to the system and continuously receive updates, or ask ad-hoc queries if they have a specific information need. In this context, I will investigate methods for the semantic integration of geo data from a multitude of heterogeneous, distributed data sources. The focus will be on efficient query processing for the integration system, utilizing the spatiotemporal context of the data for query optimization. The goal is the timely provision of integrated, spatiotemporally consistent data.
In addition, I will investigate methods to ensure data quality.
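The split between one-shot queries over persistent data and continuous subscriptions to streaming sources can be sketched as follows. The `Mediator` class and all names here are hypothetical illustrations, not part of the METRIK system:

```python
# Minimal sketch (hypothetical API) of a mediator that unifies a static
# database and a streaming source behind one query/subscription interface.
from typing import Callable, Dict, List


class Mediator:
    def __init__(self, static_data: Dict[str, dict]):
        self.static_data = static_data  # persistent background data
        self.subscribers: List[Callable[[dict], None]] = []

    def ad_hoc_query(self, predicate: Callable[[dict], bool]) -> List[dict]:
        """One-shot query over the persistent source."""
        return [r for r in self.static_data.values() if predicate(r)]

    def subscribe(self, callback: Callable[[dict], None]) -> None:
        """Register for continuous delivery of stream updates."""
        self.subscribers.append(callback)

    def on_stream_update(self, record: dict) -> None:
        # A streaming source (e.g. rescue-team positions) pushes updates
        # here; every subscriber receives them as they arrive.
        for cb in self.subscribers:
            cb(record)
```

A decision maker would then issue an ad-hoc query (e.g. for all hospitals), while a field unit subscribes once and is pushed every new position report.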
Using metaprogramming and metamodeling for prototyping middleware abstractions in wireless sensor networks Daniel A. Sadilek, sadilek@gk-metrik.de In METRIK, wireless, self-organizing sensor networks (WSNs) are supposed to be the basis of complex disaster management systems. Software development for such systems is difficult for numerous reasons: applications on WSNs are distributed and show the usual synchronization and communication problems; they have to respect the limited resources of the sensor nodes; deployment and debugging are very costly. Middleware could help with these problems, but existing middleware technologies are inappropriate for WSNs – and it is not yet clear which services, which programming model, and which abstraction level a middleware technology for WSNs should offer. Furthermore, for WSNs the term »middleware« is used in a much broader sense than traditionally: there are operating systems, languages, and pure libraries all named »middleware«. I am looking at middleware from the perspective of domain-specific languages (DSLs), which may enable domain experts to actively develop parts of a WSN application. Because services, programming model, and abstraction level are not yet clear, I am working on technologies that allow the prototyping and simulation of a DSL’s domain concepts and its semantics. The definition and usage of DSLs has been routine for decades in metaprogrammable languages like Lisp and Smalltalk. However, these languages lack support for purpose-built concrete syntax. On the other hand, technologies from the field of model-driven software engineering (MDSE) provide such support. However, they lack something that metaprogrammable languages provide: direct executability – metamodels alone do not have operational semantics. The semantics is usually provided by a transformation, in which the knowledge of how to map domain-specific concepts to concepts of the target platform is encoded.
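The contrast drawn above – metamodels alone lack operational semantics, while a metaprogrammable host provides direct executability – can be illustrated with a toy interpreter. All concept names are invented for this sketch; the actual work uses Scheme and the Eclipse modeling technologies:

```python
# Sketch of "direct executability": the DSL program is plain data (a
# nested structure), and a small interpreter in the host language
# supplies its operational semantics. All names are illustrative.
def interpret(node, value):
    op = node[0]
    if op == "threshold":            # fire only above a limit
        _, limit, child = node
        return interpret(child, value) if value > limit else None
    if op == "scale":                # transform the reading
        _, factor, child = node
        return interpret(child, value * factor)
    if op == "emit":                 # leaf: produce a result
        return ("alert", value)
    raise ValueError(f"unknown concept {op!r}")


# A DSL "sentence": scale a raw sensor reading by 2, alert above 3.0.
program = ("scale", 2.0, ("threshold", 3.0, ("emit",)))
```

Here `interpret(program, 2.5)` yields an alert, while `interpret(program, 1.0)` yields none; a metamodel would describe the structure of `program` but, without such an interpreter or a transformation, could not run it.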
My approach is to use flexible metaprogrammed domain abstractions into which fixed object-oriented metamodeling layers are introduced. I hope to gain synergies from this combination that allow a large part of the language semantics developed for the simulation to be reused on the target platform. Currently, I am doing a proof-of-concept implementation based on the Eclipse modeling technologies and Scheme. I am working on a development process specification, the simulation of multiple nodes, support for communication primitives, configuration of runtime parameters, support for mixed graphical/textual DSLs, and compilation for the target platform. As a first evaluation project, I am developing a stream-based language, which can be used for the development of an earthquake early warning system with algorithms for earthquake detection, distributed warning determination, and warning dissemination.

Language Modeling Markus Scheidgen, scheidge@gk-metrik.de Computer languages are at the centre of software engineering; they are the tools to communicate problems and solutions to both humans and computers. Computer languages have to allow unambiguous and effective expressions, independent of the specific language’s nature, its level of abstraction, purpose, or area of application. Language sentences have to be checked and tested. The artefacts written in computer languages have meaning; this semantics is manifested in compilers, simulators, model transformations, or by employing calculi and formalisms. In this sense, a computer language is much more than a set of sentences; it is rather a set of tools that provide for all these aspects. Language modeling is based on the hypothesis that languages are pieces of software and that they should be developed like software, in a model-driven fashion. We use models to describe languages, their concepts, notations, and semantics. Language models, known as meta-models, are artefacts written in several meta-languages. Each of these languages can
be used to describe a single language aspect. These aspects include, for example, abstract language structure, static semantics rules, textual or graphical notations, operational semantics, or code generation. The semantics of a meta-language is manifested in a generic tool which uses a language description written in that meta-language. The ultimate goal is to provide a complete framework of meta-languages that allows a language to be described in all necessary aspects. These descriptions can then be used by humans to understand the language and by machines to provide automated tool support for the language. We contribute several meta-languages and generic tool support for the modeling and use of computer languages. We developed the meta-modeling framework A MOF 2 for Java based on the MOF 2.0 standard. This framework, in contrast to others, supports enhanced refinement and specialisation features that allow for more flexible and reusable structure models. It also uses a new Java mapping which utilises generics and covariant return types for safer programming with models. We developed the Textual Editing Framework, which provides a meta-language for modeling textual notations. The framework includes a generic editor that provides semantically rich editing with syntax highlighting, code completion, error annotations, and much more. A third contribution provides meta-language and generic tool support for execution semantics. Based on structure definitions for abstract language syntax and runtime data, the language engineer can specify behaviour at the meta-level. The generic model simulator can then execute language instances solely based on such descriptions. We are evaluating our approach on the Specification and Description Language (SDL). This proves the general applicability (and also reveals the weaknesses) of our contributions and is our basis for reasoning about the hypothesis: languages, as software, can be modeled.
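The idea that a generic tool can process any language purely from its description can be sketched as follows; the dictionary-based metamodel and the `check` function are illustrative stand-ins, not the A MOF 2 for Java API:

```python
# Sketch of the "generic tool" idea: the metamodel is itself data, and
# one generic checker validates any model instance against any
# metamodel. Names and structures are illustrative only.
metamodel = {
    "State":      {"attrs": {"name"},  "refs": {"outgoing": "Transition"}},
    "Transition": {"attrs": {"event"}, "refs": {"target": "State"}},
}


def check(instance, metamodel):
    """Generic static-semantics check driven purely by the metamodel."""
    errors = []
    for obj in instance:
        cls = metamodel.get(obj["class"])
        if cls is None:
            errors.append(f"unknown class {obj['class']}")
            continue
        for attr in obj.get("attrs", {}):
            if attr not in cls["attrs"]:
                errors.append(f"{obj['class']}: unknown attribute {attr}")
    return errors


model = [
    {"class": "State", "attrs": {"name": "idle"}},
    {"class": "Transition", "attrs": {"label": "go"}},  # 'label' undeclared
]
```

The same `check` function works for a state-machine metamodel or any other one; swapping the metamodel retargets the tool without changing its code, which is the point of manifesting a meta-language's semantics in a generic tool.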
Model Driven Engineering (MDE) for Modeling and Simulation in Disaster Management Falko Theisselmann, theissel@informatik.hu-berlin.de Scientific models and simulations of environmental systems are major information sources in disaster management. Various frameworks and domain-specific languages have been developed to ease modeling and simulation tasks and to provide non-experts with access to new modeling and simulation technologies. These frameworks usually stick to certain modeling formalisms (e.g. box models, state machines, cellular automata) and modeling languages. As a result, models are usually framework- and language-specific. Using a framework requires knowledge of its concepts, languages, and capabilities. Changing the framework usually requires expensive, manual reimplementation of models. In environmental modeling practice, this is an obstacle to the use of frameworks, the reuse of models, and their integration into coupled multi-model models. We investigate the use of MDE to tackle this problem. Our approach is conceptually based on the Object Management Group’s Model Driven Architecture (MDA). The usual modeling formalisms in the disaster management domain are platforms in the sense of MDA. Domain-specific models (DSMs) are mapped onto formalism-specific models (FMSMs). In the following steps, these are mapped onto framework-specific models (FWSMs) that allow for the generation of framework-specific executable code. One DSM may be transformed into several FMSMs. Note that one FMSM may be transformed into other FMSMs and into several FWSMs. This opens the possibility of extensive model reuse, if the formalisms allow for it. Also, several DSMs may be translated into a single FMSM, which facilitates model integration/coupling. To formulate DSMs, we use DSLs whose concepts and notational elements are tailored to the concepts and cognitive spaces of the domain experts. The logic defined
by the transformations ensures the correct application of formalisms and concepts, which otherwise would be the responsibility of the domain expert. The goal of our research is to evaluate and show how MDE may enhance the reuse, integration, and combination of different domain models and promote the sensible use of frameworks. Moreover, we want to investigate how to represent domain knowledge adequately and how to support collaborative, multidisciplinary modelling in the field of disaster management.

Metamodel-based Language Engineering Guido Wachsmuth, guwac@gk-metrik.de In METRIK, we follow a model-based approach to develop disaster management systems. We want to provide domain experts with modelling means that are intuitive, concise, and semantically precise. To fulfil these requirements, we propose a combination of several domain-specific languages (DSLs). Language Engineering brings Software Engineering to languages. It is concerned with language design, language maintenance, language extension, language recovery, translation, generation, interpretation, etc. Metamodels are our chosen tool to specify languages. Like other software artefacts, metamodels and metamodel-based language descriptions evolve over time. Metamodel evolution is usually performed manually by stepwise adaptation. Furthermore, models need to co-evolve in order to remain compliant with the metamodel. Without co-evolution, these artefacts become invalid. Like metamodel evolution, co-evolution is typically performed manually. This error-prone task leads to inconsistencies between the metamodel and related artefacts. In our work, we explore transformational metamodel adaptation as a foundation for metamodel-based Language Engineering. As a first result, we provide a theoretical basis for studying the effects of metamodel evolution in terms of metamodel relations. We employ well-defined evolutionary steps for metamodels compliant with OMG’s Meta Object Facility
(MOF). The steps are specified as transformations in QVT Relations, the relational part of OMG’s Query/View/Transformation language. Each step forms a metamodel adaptation and is classified according to its semantics and instance-preservation properties. Automatic co-evolution steps are deduced from these well-defined evolution steps. This co-adaptation prevents inconsistencies and metamodel erosion. Starting from our theoretical results, we are developing an Adaptation Browser for MOF-compliant metamodels. The tool is built upon the Eclipse Language Toolkit. This provides us with undo/redo support, an adaptation history, and scripting facilities. We are now examining our approach in two case studies. The first study is concerned with the design of a new DSL. The transformational approach facilitates a well-defined, stepwise metamodel design. Starting from basic features, new features are introduced by construction. We hope that extensive usage of this principle leads to an agile process. The second study is concerned with the extraction of a language from a natural-language document. The document describes disaster management processes semi-formally. Language recovery is concerned with the derivation of a formal language specification from such sources. For grammar recovery, a transformational approach has already proved to be valuable. In a similar way, we employ our approach to assist metamodel recovery.

Model Coupling and Test Stephan Weißleder, stephan.weissleder@gmail.com Models are a fundamental element of model-driven software development (MDSD). By using model transformations and code generation patterns, available tools are already able to generate a large part of the source code automatically. However, while much effort has been invested in source code generation, the automatic generation of test code has been widely neglected. This test code should reveal errors, which range from abnormal program terminations to small deviations from
the specified behavior. Normally, the late detection of such errors results in high costs for their removal and for the compensation of their consequences. Early and extensive testing is therefore most important. Nevertheless, the priority of testing is often still considered lower than the priority of delivering products early, which is mainly due to the high costs of test suite development. All in all, models and tests both aim at improving the quality of the developed system. Our current work combines methods from these two fields. The intention of this combination is to gain extra benefit for software testing by using different models together. Therefore, we use several models describing different aspects of a system, combine them, and derive enough information for automatic test case generation. Several former approaches dealt only with single models or with models that used just a very small aspect of another model. Likewise, constructive but random approaches for automatic test input data generation have been neglected. Our current work includes the automatic, model-based generation of a whole test suite based on UML state machines and class diagrams. We combine both models via their OCL constraints and construct partitions for the corresponding test input values. Furthermore, we apply boundary testing to these partitions to generate test inputs for positive and negative test cases. Additionally, our current work combines structural and behavioral models to benefit from their particular relationships. For instance, behavioral specifications (e.g. state machines) can be reused in several static contexts (e.g. classes) via structural relationships like inheritance. We have already implemented our main ideas in SMOTEG – a tool based on the Eclipse Modeling Framework.
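The partition-and-boundary idea can be sketched as follows: given an input range such as one derived from an OCL invariant `0 <= x <= 100`, boundary testing picks values just inside the partition as positive test inputs and values just outside as negative ones. This is a simplification of what a tool such as SMOTEG would derive, with invented parameter names:

```python
# Sketch of boundary testing on an input partition derived from a range
# constraint (e.g. an OCL invariant 0 <= x <= 100). Values on and just
# inside the boundaries become positive test inputs; values just
# outside become negative test inputs, for which rejection is expected.
def boundary_inputs(lower, upper, step=1):
    positive = [lower, lower + step, upper - step, upper]  # inside
    negative = [lower - step, upper + step]                # outside
    return positive, negative


pos, neg = boundary_inputs(0, 100)
```

Each positive value would drive a test case that expects normal behavior of the system under test, while each negative value drives a test case that expects the constrained operation to reject the input.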
The identified challenges of our current approach are the following: the identification of retraceable constraints, so that the influence of input parameters becomes clear; the combination of several behavioral specifications for one class (multiple inheritance); and the introduction of concurrency, which is important for distributed systems like sensor nets. In the context of METRIK, we aim at adapting our approach to specific tasks of geo-information systems (GIS) or meta-modeling. Furthermore, we need to identify significant case studies to show our approach’s benefit.

Associated PhD Subjects

Model-based simulation of decentralized, ad-hoc sensor systems Frank Kühnlenz, kuehnlenz@informatik.hu-berlin.de The popularity of sensor systems increases with the ability to realize more complex scenarios at low cost. Additionally, hardware has become cheaper and even more powerful over the last years. This leads to the vision of »smart dust«, whereby billions of tiny sensors provide services in a cooperative way. They form an ad-hoc network in which sensor nodes may fail and new ones may be added, and they act cooperatively in a self-organizing and »intelligent« way (e.g. distributed computing). One challenging application domain for sensor networks is earthquake early warning systems (EWS). The reliability of the sensor net is especially crucial in this domain, where under adverse conditions single sensors may be destroyed but the system as a whole must still detect the earthquake. Today, sensor nets for earthquake early warning have a static infrastructure that was carefully planned by an operator and comprises one or more data centers (e.g. IERREWS in Turkey). In this case, the data center performs all computation and decision logic. In two international, interdisciplinary projects, SAFER and EDIM, methods and technologies are being developed and prototypically realized for an earthquake early warning system based on a decentralized, wireless mesh sensor network. Like every early warning system, it must meet several optimization goals: early warning time as the most important one, but also minimal false-positive alarms (which should be guaranteed by issuing an
alarm only after »discussion« within a group of sensor stations). For this new approach, distributed detection algorithms and consensus-based alarm-handling protocols must be created and explored. My work therefore concentrates on developing model-based simulations and evaluating them against specific criteria for EWS. We follow a top-down approach, starting with higher-order questions – in the case of alarm handling, for example: how is a group of sensor stations defined, and how many stations are necessary? Accordingly, we first abstract from the transport-oriented network level and instead derive constraints for the network layer. Such a constraint could be the conclusion: if a certain number of messages between designated stations can be transported within a defined time, then the early warning time is below a specific value. To follow the top-down approach and to support the abstractions of the individual levels, different simulation frameworks are used and must be coupled. For example, models of communication systems can be designed very well in SDL-RT and directly executed (which is why it is used for modeling alarm handling). In the later development steps it is necessary to couple in a specialized network simulation framework (e.g. ns-2).

Quality assessment of reports from the public for flood disaster management Kathrin Poser, kathrin.poser@gfz-potsdam.de For disaster management it is important to integrate information from different sources to obtain an assessment of the situation that is as complete as possible. Particularly when larger areas are affected, the local population can contribute valuable information for disaster response and recovery. New internet technologies facilitate the fast and easy collection of data from the public. So far, however, information provided by the affected population has only been taken into account systematically in earthquake intensity mapping, and no comprehensive evaluation of the quality of these data has been performed.
In order to make informant reports useful for disaster management and to use them as input for damage estimation, the following issues need to be addressed:
– Which types of information required in disaster management can be supplied by the affected population?
– How can these data be collected?
– How can these data be validated and their quality assured?
– How can these data be integrated with other information such as sensor data or remotely sensed data?
The research questions will be addressed exemplarily for the case of (rapid) flood loss estimation with empirical loss models, which describe the susceptibility of elements at risk to flooding. The input data for these models are information about the flooding event and about the exposed values; the output is the estimated monetary loss. The use of information supplied by non-experts for disaster management is impeded by the fact that its reliability is in general unknown. Therefore, it is important to assess the reliability and quality of these data. Our approach to such a quality assessment will rely on geostatistical methods and concepts from trust models. These methods will compare the statements with each other as well as with other data about the flood event, such as gauge measurements and remotely sensed data, and will include a plausibility check based on terrain and topographic data. The results of this research will contribute to efforts to tap the potential of »humans as sensors«, i.e. to make use of information provided by the general public via the internet for disaster management.
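As a rough illustration of such a plausibility check (the comparison rule and the tolerance are invented for this sketch, not the project's actual geostatistical method), a reported water depth can be compared with the depth implied by a nearby gauge reading and the terrain elevation at the report location:

```python
# Illustrative plausibility check for a public flood report: compare
# the reported water depth with the depth implied by a gauge water
# level and the terrain elevation at the report location.
# The tolerance of 0.5 m is a hypothetical parameter.
def plausible(reported_depth_m, gauge_level_m, terrain_elev_m, tol_m=0.5):
    implied_depth = max(0.0, gauge_level_m - terrain_elev_m)
    return abs(reported_depth_m - implied_depth) <= tol_m


# A report of 1.2 m of water where gauge and terrain imply about 1.0 m
# would pass, while a report of 3.0 m at the same spot would be flagged
# for further review rather than discarded outright.
```

In a real assessment, such a check would be one input among several, combined with cross-comparison of reports and a trust score per informant.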
Earthquake Disaster Information System for the Marmara Region, Turkey (EDIM) Wenzel F. (1), Erdik M. (2), Zschau J. (3), Milkereit C. (3), Redlich J. P. (4), Lupp M. (5), Lessing R. (6), Schubert C. (6) (1) Universität Karlsruhe (TH), Geophysikalisches Institut, Hertzstr. 16, 76187 Karlsruhe (2) Kandilli Observatory and Earthquake Research Institute, Bogazici University, Cengelkoy, Istanbul, Turkey (3) GeoForschungsZentrum Potsdam (GFZ), Telegrafenberg, 14473 Potsdam (4) Humboldt-Universität, Institut für Informatik, Unter den Linden 6, 10099 Berlin (5) lat/lon GmbH, Aennchenstr. 19, 53177 Bonn (6) DELPHI InformationsMusterManagement GmbH, Dennis-Gabor-Str. 2, 14469 Potsdam
Abstract EDIM – Earthquake Disaster Information System for the Marmara Region – is an interdisciplinary research project that focuses on improving earthquake early warning capacity in the Marmara region around the megacity of Istanbul. The consortium that addresses this goal consists of the Geophysical Institute of the University of Karlsruhe (TH), GeoForschungsZentrum Potsdam (GFZ), the Computer Science Department of Humboldt University Berlin (HU), lat/lon GmbH, Bonn, DELPHI InformationsMusterManagement GmbH, Potsdam, and the Kandilli Observatory and Earthquake Research Institute of Bogazici University in Istanbul (KOERI).

Objectives of EDIM The existing earthquake early warning system for Istanbul (Istanbul Earthquake Rapid Response and Early Warning System – IERREWS) will be expanded to a regional scale (the Marmara Sea region) and significantly improved in terms of the quality of the early warning information. Estimates of the reliability of the warnings will be delivered, as well as real-time information on the earthquake source parameters and near-real-time shake maps for the region. At the same time, this project will serve as a test site where new advanced sensor technology consisting of self-organizing networks will be assessed. Data and information exchange will be controlled by a dynamic geo-information infrastructure utilizing a user-oriented information and visualization strategy. The expected results and products that have the potential of being transferable to other earthquake-prone regions are: real-time seismology using neural nets, self-organizing sensor technology, real-time information systems, and personalized visualization systems. We can take advantage of a large body of previous work on ground motion assessment, site effect quantification, and risk assessment (KOERILOSS). On this basis, we address real-time information before, during, and after an earthquake, following an interdisciplinary approach. Seismology will provide novel algorithms for the rapid detection of earthquakes, source properties, and the expected level of ground shaking; seismologists, computer and communication scientists will develop the new technology of self-organizing networks for early warning and rapid reporting; and geoinformatics will provide methods for the incorporation of additional information and will focus on tuning the resulting early warning information to a user’s needs. In order to achieve these objectives, we have built a consortium (Verbundvorhaben) of
Figure 1: Location of currently used stations for early warning in Istanbul (blue stars) and proposed locations of new stations for expanding the system to the Marmara area (red stars). Possible locations of the proposed downtown low-cost sensor test network are also shown.
research and commercial organizations consisting of
– the Geophysical Institute, University of Karlsruhe (TH),
– GeoForschungsZentrum Potsdam (GFZ),
– the Computer Science Department, Humboldt University of Berlin (HU),
– lat/lon GmbH, Bonn,
– DELPHI InformationsMusterManagement GmbH, Potsdam, and
– Kandilli Observatory and Earthquake Research Institute, Bogazici University, Istanbul, Turkey (KOERI).
The consortium is coordinated by TU, while KOERI will allow access to the existing IERREWS system, will implement our results, and will assist in the development of an improved communications system. Data from the self-organizing sensors, combined with additional spatial information, will be provided by an interoperable information infrastructure. The project is structured into three packages: Real-time information from a regional accelerometer network (A), Self-organizing sensor system (B1), Infrastructure for self
organizing sensor system (B2), and Information system (C). The specific objectives of the project are: (1) Expansion of real-time communication via satellite within the existing system to the Marmara region (Fig. 1). This step is crucial, as the entire region is highly industrialized and densely populated. The work will be financed by the Governorate of Istanbul within 2007 and 2008. From the very beginning, KOERI will provide its German partners with internet access to IERREWS. (2) Optimization of the existing early warning system in terms of area coverage and signal processing capacity. This is important, as the current system operates with a very simple scheme that does not allow it to fully utilize the available seismological information, nor does it include information on the reliability of the early warnings. (3) The self-organizing network is planned to act as an autonomous wireless mesh network that works efficiently even in the event of the failure of individual nodes, similar to the experimental BerlinRoofNet project (http://sar.infor-
Figure 2: Sensor node network with potential embeddings
matik.hu-berlin.de/research/projects/2005BerlinRoofNet/berlin_roof_net.htm). Inexpensive seismic sensor devices are incorporated into these network nodes, resulting in a sensor system that will be configured with the appropriate hardware and software for application as an earthquake early warning system. Beyond that, further sensors are to be included that can register, for example, building motion or changes in groundwater characteristics. This requires the installation of specific analysis and processing software, as well as suitable communication protocols and services for the cooperative analysis and evaluation of seismic waves and of time series of other parameters. Furthermore, more distant nodes of the sensor system are to be connected by wire or over an existing network infrastructure (e.g. the Internet), without the system losing its self-organizing character (Fig. 2). (4) The EDIM information and mediation system must provide appropriate information to various user groups, e.g. disaster management, politicians, media, private people, and scientists. For this purpose, a geo-information infrastructure will be established, realized by components following current ISO and W3C/OGC standards. Interfaces that link all components are also to be implemented.

State-of-the-Art Substantial progress in seismic real-time acquisition and communication technologies, aside from enhancements of seismic processing soft-
ware, has been made over the past few years. This has paved the way for the design and implementation of earthquake early warning systems all over the world (Zschau and Küppers, 2003; Kanamori, 2005; Gasparini et al., 2007). Systems are now in operation in Japan, Taiwan, and Mexico (Nakamura, 1989; Wu and Teng, 2002; Espinosa-Aranda et al., 1995), while in Romania and Turkey, systems have also recently been constructed (Wenzel and Marmureanu, 2006; Erdik et al., 2003). Earthquake early warning systems are effective tools for disaster mitigation if used for triggering and executing automatic actions that prepare vulnerable systems and dangerous processes for the imminent danger. Seismic warnings can be used to slow down high-speed trains to avoid derailments, to close various pipelines, for example gas, to minimize fire hazards, to shut down manufacturing operations to decrease potential damage to equipment, and to save vital computer information to inhibit the loss of data. A compilation of effective measures in response to such warnings is given by Goltz (2002).

Instrumentation One hundred strong-motion accelerometers have been placed in populated areas of Istanbul, within an area of approximately 50 × 30 km, to constitute a network that will enable rapid shake-map and damage assessment after a damaging earthquake. Once triggered by an earthquake, each station will process the strong-motion data to yield the spectral accelerations at specific periods and will send these parameters in the form of SMS messages to the main data center through available GSM network services. For earthquake early warning information, ten strong-motion stations were established as close as possible to the Marmara Fault. The continuous on-line data from these stations will be used to provide near-real-time warnings for potentially disastrous earthquakes.
In addition, 40 strong-motion recorder units will be placed on critical engineering structures to augment the already instrumented structures in Istanbul. Altogether, this network and its functions are called the
Istanbul Earthquake Rapid Response and Early Warning System (IERREWS). The system is designed and operated by Bogazici University, with the logistical support of the Governorate of Istanbul, the First Army Headquarters, and the Istanbul Metropolitan Municipality. Continuous telemetry of data between these stations and the main data center is realized with a digital spread-spectrum radio modem system involving repeater stations. Depending upon the location of the earthquake (the initiation of fault rupture) and the recipient facility, the alarm time can be as high as about 8 s. Thus, Istanbul has a basic version of an early warning and rapid response system. The seismological network of the new self-organizing early warning and information system will be designed to take advantage of the seismological knowledge resulting from many studies and international projects that have focused on Istanbul and the Marmara region, especially after the 17 August 1999 Izmit earthquake. New source parameter scaling relationships (Parolai et al., 2007) and crustal attenuation laws for the area (Bindi et al., 2006) will play a fundamental role, for example, in the calculation procedures adopted for the alert and shake maps. On the other hand, for a proper calibration of the attenuation and source parameter scaling relationships, it is of primary importance to also assess the effects of amplification/deamplification and the lengthening of seismic ground motion due to surficial geology. For this reason, GFZ and KOERI are carrying out an extensive field campaign for the mechanical site characterization of Istanbul. In particular, single-station (about 200) and 2D-array (8) measurements have been carried out in the western part of the metropolitan area of Istanbul, with the aim of estimating the S-wave velocity profile of the sedimentary cover and of identifying areas where increased amplification of seismic ground motion is to be expected.
Therefore, all available seismological and geophysical information will be used for the optimization of the single-node analysis (e.g. frequency-dependent analysis for seismic event detection) and for the improvement of the self-organizing early
warning system as a whole. The new early warning system will be designed to complement existing classical seismological monitoring networks, but it will focus on the metropolitan area of Istanbul, strong ground motion and building response. Sensor development In addition to improvements in the existing centralized infrastructure, we will also deploy and test a decentralized wireless mesh network based on inexpensive hardware. Such networks are currently a popular research topic in the networking field (Zhang, 2006). They aim to be self-organizing in the sense that only a minimum of manual configuration is required and no central administration is necessary. One of the key ideas is that the services of the network are not provided by a dedicated central infrastructure, but by nodes near the requester collaborating; ideally, this pattern allows the network to be robust against single failures and to scale well as the number of nodes grows. In effect, this enables a network density and reliability which centralized networks can only achieve at extremely high cost. In recent years, projects like the MIT Roofnet and the Berlin RoofNet (Sombrutzki et al., 2006) have successfully implemented large wireless mesh networks, primarily to provide free internet access in densely populated areas. They use standard WLAN hardware, which utilizes license-free spectrum bands for communication. We profit from the hands-on experience gained in these projects and apply it to the new area of geo-sensor networks. To our knowledge, this will be the first study of the use of self-organizing ad-hoc networks for the purpose of earthquake monitoring and early warning. An evaluation of the real-time behavior of such monitoring and alerting systems based on self-organizing network infrastructures is almost impossible, or too expensive, without accompanying model investigations.
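The decentralized dissemination idea described above can be illustrated with a toy flooding scheme, in which every node rebroadcasts an alert once to its direct neighbours; the node names and ring topology are invented for illustration, and real mesh protocols must additionally handle radio loss, routing and congestion.

```python
# Minimal sketch of decentralized alert flooding in a mesh network.
# Topology and names are illustrative assumptions, not the EDIM design.

class Node:
    def __init__(self, name):
        self.name = name
        self.neighbors = []      # directly reachable nodes
        self.seen = set()        # message ids already forwarded

    def receive(self, msg_id, payload, log):
        if msg_id in self.seen:  # duplicate suppression stops the flood
            return
        self.seen.add(msg_id)
        log.append(self.name)
        for n in self.neighbors: # rebroadcast to all neighbors
            n.receive(msg_id, payload, log)

def link(a, b):
    a.neighbors.append(b)
    b.neighbors.append(a)

# Four nodes in a ring: even if one link fails, alerts still spread.
a, b, c, d = Node("A"), Node("B"), Node("C"), Node("D")
link(a, b); link(b, c); link(c, d); link(d, a)

delivered = []
a.receive(1, "P-wave trigger at A", delivered)
print(sorted(delivered))  # → ['A', 'B', 'C', 'D']
```

No node plays a privileged role here: any node can originate the alert, which is exactly the robustness property that motivates the mesh approach.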
One of the most widely used engineering tools is experimental investigation by computer simulation. However, computer simulations require and depend on concrete
investigation goals. Based on these goals, several models are developed (in our context, e.g., behavioral models of the early warning system and transport and alerting communication protocols) and suitable simulation frameworks must be identified (ODEMx library, Gerstenberger, 2003). Information Systems The transfer and processing of heterogeneous specific information and its integration in a harmonized information system is the approach of part C of the project. Methods of semantic interoperability are concerned with the integration of knowledge from different application domains. In former research projects (meanInGs, completed 09/2005, www.meanings.de, and DeCOVER, in progress until 06/2008, www.decover.info), DELPHI IMM used and developed methods and processes for semantic interoperability between application domains. In order to integrate existing, heterogeneous data sources automatically, it is necessary to provide a system for the semantic interoperability of existing data sources. A methodical approach is the use of ontologies (Gruber, 1993). Ontologies allow the mapping of classes into a hierarchical structure; the classes are linked to each other through thesauri, taxonomies and relations. This concept provides high expressive power for the formal description of technical terms. Ontologies are knowledge-based models and realize a kind of description logic for the computer. With this approach it is possible to draw logical conclusions from feature-based queries (Klien et al., 2004). With the integration of this approach in web services, geoinformation can be provided to a broad user group. A dynamic geoinformation infrastructure will be developed in order to disseminate relevant information from various sources before, during and after hazardous events for different target audiences: – for the general public, the system will provide aggregated information on the latest events and personalized guidance.
Via one central node, citizens get localized information in their native language.
– local administration and politicians have access, for instance, to hazard and vulnerability maps in order to estimate the endangerment of their population; – emergency response coordination is supported with information provided during and shortly after a hazard, for instance to coordinate rescue teams and humanitarian aid. Based on a spatial data infrastructure (SDI) approach, the following requirements can be accomplished: – The communication is based on web services supporting international standards of ISO, OGC and W3C. The standardized interfaces enable the interoperability of the components, making the integration of information from different providers into the central portal possible: the integration of EDIM-internal OGC Web Map Services (WMS) or the OGC WMSs of a local government into the portal (service-to-portal), as well as service-to-service communication between an OGC Web Feature Service and an OGC WMS, can be accomplished. – In order to support semantic aspects of the information system, the use of international standards builds a good and reusable basis. – The SDI concept provides a distributed network of components connected via the Internet based on the HTTP protocol. The distribution of services creates a basis for decentralization and reduces the vulnerability of the entire system. Each service is a module in the infrastructure with a different task: while sensors collect data such as accelerations at one end, the information is disseminated via the standardized interface of the OGC sensor observation service (SOS) to the central portal for the emergency crew. A Web Feature Service (WFS) stores vector data of roads and information about current road conditions. Users who want to know whether a road is blocked after a hazardous event can log into the portal and check a map with that information rendered by a WMS. Depending on their rights, they have access to more confidential information, controlled by the owsProxy component. The portal is the
central information node where users get their localized and personalized information. By using standardized interfaces, services from other networks can be hooked in transparently for the user. Between the portal and the services, data processing takes place in order to aggregate data and make information from different sources semantically comparable. First Results The existing data transmission is via 2.4 GHz spread-spectrum radio modems and relay stations. Currently, a second data transmission avenue is being implemented through the use of satellites, involving the installation of Direcway satellite dishes at each station to transmit the data to the main center. This produces a time delay of about 0.5 s. The Dexar Company in Istanbul (affiliated with U.S. Hughes Network Systems) will provide this service. The future plan
Figure 3: Alert Map for Istanbul
involves adding a third transmission system via ADSL. To provide total coverage in the Marmara Region (Fig. 1), the current 10-station network (blue stars in the attached figure) needs to be expanded by an additional 10 stations (red stars in the attached figure). Early Warning Methodology In cooperation with KOERI, a novel scheme for extracting early warning information has been developed within the framework of a PhD project (Böse, 2006), jointly supervised by TH (Prof. Friedemann Wenzel) and KOERI (Prof. Mustafa Erdik). Some of the results have been published in Böse et al. (2005) and Böse et al. (2006). Fig. 3 shows an example of an alert map for Istanbul for the case of an M = 6.5 earthquake in the eastern part of the Marmara Fault. The left panel indicates the distribution of PGA after the earthquake; the right panels
show the predicted alert maps after 3.5 and 4.2 seconds. After 4.2 seconds the map comes fairly close to the final ground motion, so that after this time the regional ground motion is fairly well predicted. The actual warning time for Istanbul for this earthquake is 15.8 seconds. The partners at KOERI developed the existing warning and information system, which includes the installation of sensors, real-time communication, shake-map generation and earthquake-loss estimation technology (Erdik et al., 2003). The neural net methodology PreSEIS (Böse, 2006) for earthquake early warning has been calibrated with real data from Southern California. The dataset consists of 69 shallow local earthquakes with moment magnitudes ranging between 1.96 and 7.1. The data come from broadband (20 or 40 Hz) or high broadband (80 or 100 Hz), high-gain channels (3-component). Although the data have been recorded at a total of 177 stations of the Southern California Seismic Network (SCSN), only certain combinations of single stations have been tested so far. As input for the neural nets, the envelopes of the waveforms defined by Cua (2004) are used. The envelope is obtained by taking the maximum absolute amplitude of the recorded ground motion time history over a 1-second time window. Because not all of the stations recorded each earthquake, missing records were replaced by synthetic envelopes calculated using the envelope attenuation relationships developed by Cua (2004). Sensors and Networks The model-driven development (MDD) initiative of the OMG proposes a transfer from platform-independent models to platform-dependent models. We will use this approach to realize the complex task of simulating the real-time behavior of the ad-hoc mesh sensor network for EEW. Therefore, we designed and partly implemented a prototyping infrastructure with the following characteristics: – A repository with customizable and executable model components, which enables the efficient configuration of executable EEWS models (Experiment Management System). – Model components in the repository which represent software components of specific seismic, alerting and distributed services are to be evaluated by simulation experiments. From these models, corresponding software components can be generated for the EEWS and deployed on the target platform. This ensures that the same software component which was evaluated by simulation runs on the target platform. We designed a component-based architecture to be deployed on every node (see Figure 4). It consists of both functional and technical software components.
Figure 4: Software Architecture of a Node
Functional software components, like the analysis of sensor data, reflect the core services the self-organizing network has to provide. Functional components are first implemented and tested within the simulation part of the prototyping infrastructure. In contrast, the technical components are driven by the needs and restrictions of the underlying hardware infrastructure, and are therefore developed directly for the platform. Components already implemented include the retrieval of data within the node (Data Sensing, from the sensors as well as from a file) and the supply of data (Raw Data Archiving) through the wireless network. This allowed us to conduct first experiments in a small in-house test bed with real hardware. The component Map Parameter Calculation computes the peak ground acceleration (PGA) necessary for ground shake maps as well as for alert maps. The component Collaborative Early Warning Management decides how to deal with incoming alarm messages from other nodes (work in progress). A first version of a low-cost sensor system (ADXL203) for three-component acceleration and/or velocity measurements (Figure 5a) has been developed by the GFZ. Furthermore, taking into account the low-cost margin of the planned system, which makes the use of expensive commercial AD converters (AD-c) prohibitive, a 4-channel AD-c was developed by the GFZ (Figure 5b). These converters are designed for sample rates of up to 400 Hz with a signal
bandwidth from 0 Hz to 80 Hz. Moreover, a low-pass acausal filter system with a hardware-adjustable corner frequency, applied before digitization and to be integrated into each sensor, has been developed. In cooperation with the project partner HU, a suitable low-cost hardware platform has been identified for reading and handling the data streams from the AD-c board. The hardware configuration for the WLAN computer is based on the use of a single-board computer, the PC Engines WRAP.2E board (Figure 5c), which is equipped with two 802.11a/b/g WiFi cards (Atheros chipset), a 1 GByte CompactFlash card and two USB ports, one of which is used to attach the AD converter developed by the GFZ. The communication system between sensors is being developed taking advantage of the experience with the operational seismological real-time system for worldwide earthquake monitoring developed by the GFZ within the context of the GEOFON Project. In fact, the SeedLink software installed on the WRAP computers allows data exchange for a first prototype seismological network, and a plug-in has been written to handle the data streams from the AD-c. The new seismic sensors, properly equipped with an integrated USB port for data transport to the mesh point of the network and GPS instruments for time synchronization and location, will be tested at the GFZ during a preliminary field experiment. In particular, the usage of a limited prototype network at the Telegrafenberg Albert Einstein Science Park,
Figure 5: a) Three accelerometers (ADXL203 from Analog Devices) with an analog filter form the 3-component low-cost seismic sensor. b) A new 24-bit low-cost AD converter for 4 channels. GPS will be integrated for timing and location determination. c) WRAP computers handle the data streams from the sensor board. Data exchange with neighboring sensors will be realized via WLAN.
both inside and outside buildings, will provide the opportunity to evaluate the sensitivity of the low-cost 3-component accelerometers, the power consumption of the individual components and of the complete system, and the general reliability and stability of the system, paying particular attention to both the performance of the individual nodes and communication between nodes. At present, basic software for filtering, STA/LTA triggering, and event detection and discrimination has been developed. Therefore, during the field experiment, the software that is to run on each single node will be tested. Information Systems In EDIM, a number of application scenarios are being developed and validated regarding the setup of a geodata infrastructure, the integration of geodata and geo-services, and an intelligent search for geoinformation. During the development of the semantic interoperability of geodata, the focus is on the following services: support of the information providers, support of the information inquirers, mediation services between information providers and information inquirers, and an automatic process to perform quality control through the use of Service Level Agreements (SLAs). Concrete results of this approach are expected in language-independent risk estimation, real-time warnings, fast disaster evaluation and support for the preparation of effective supporting measures. The following application scenarios are seen as vital regarding user requirements. Geology and earthquake catalogue – this scenario describes the integration of existing geoscientific, earthquake and disaster information in a knowledge-based model. A second scenario shows the integration of land use information, infrastructure data and possibly socio-economic data sets.
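To illustrate the standardized OGC interfaces on which the infrastructure rests, a portal client might assemble a WMS GetMap request as sketched below. The endpoint and layer names are hypothetical; only the query parameters follow the OGC WMS 1.1.1 specification.

```python
# Sketch of an OGC WMS 1.1.1 GetMap request as a portal client might
# build it. Endpoint, layers and bounding box are invented examples.
from urllib.parse import urlencode

def getmap_url(base, layers, bbox, width=800, height=600):
    """Build a GetMap URL; bbox is (min_lon, min_lat, max_lon, max_lat)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

# Hypothetical road-condition layers over the Marmara region:
url = getmap_url("https://example.org/wms", ["roads", "road_status"],
                 (26.0, 40.0, 30.0, 41.5))
print(url)
```

Because every component speaks the same parameterized HTTP interface, a WMS of a local government can be swapped in behind the portal without changing the client, which is the interoperability argument made above.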
Relevant classification catalogues for early warning and disaster systems can be produced by ontologies, and similarity measures make targeted warnings and assistance possible. A further application is seen in the use of Earth observation satellites. In this case it
must be examined whether a derived classification catalogue can be applied for a guided evaluation shortly after a disaster. For the entire system, Open Source components will be used: – For the database backend, PostgreSQL with the spatial extension PostGIS will be used. – For all OGC web services like WMS, WFS and SOS, deegree services will be used. – The security components are provided by the deegree project as well. – For the central information node, a portal framework (Apache Jetspeed) will be used. The clients for each OGC web service will be realized as portlets conforming to the JSR-168 standard. Currently, a new geometry and feature model for deegree3 is being designed to support the upcoming ISO and OGC standards. A new concept for the client framework, supporting both web and desktop clients, is being developed. For the first prototype, deegree2 components will be used, presenting the current possibilities. Step by step, they will be updated with deegree3 components implementing the newest standards. The interaction of all components (including those already in use) will be accomplished by supporting the corresponding standards. Summary The EDIM consortium has the potential and strategy to address key issues in further developing the Istanbul Earthquake Rapid Response and Early Warning System (IERREWS). We will bridge the gap between single-station alarm systems and those requiring records from an entire network, using the Artificial Neural Network methodology (Böse et al., 2005) that allows an alarm to be issued from one station, but which will be upgraded with each additional available record. In cooperation with GFZ and HU, we will develop a fundamentally new monitoring and communication system, based on low-cost, self-organizing recorders with a high potential for applications in many places around the world. We will address the rapid inversion of source parameters as critical input for city-wide damage predictions. We will study neural net
methods, specifically adapted for the Marmara region, that do not require velocity models but incorporate experience from previous events and simulations. Algorithms to be developed will consider the degree of reliability of the information about forthcoming ground shaking and damage. For example, the neural net methods developed at TH allow the reliability of the early warning to be enhanced with evolving time. Linking ground motion information with damage modelling (KOERILOSS) aims at the rapid quantification of economic damage to the region, so that the socio-economic implications of an earthquake can be understood immediately after the event. Self-organizing networks can deliver much more detailed information about the functionality of specific structures and critical facilities. Data from the warning system, including the self-organizing sensors, and further spatial information will be provided by an interoperable information infrastructure.
References
Bindi, D., Parolai, S., Grosser, H., Milkereit, C., and Karakisa, S., 2006: Crustal attenuation characteristics in northwestern Turkey in the range from 1 to 10 Hz. Bull. Seism. Soc. Am. 96(1): 200–214.
Böse, M., Erdik, M. and Wenzel, F., 2005: Earthquake Early Warning – Real-time prediction of ground motion from the first seconds of seismic recordings. Proc. Volume of the Int. Conference on the 250th Anniversary of the 1755 Lisbon Earthquake, 185–187, 1–4 Nov. 2005.
Böse, M., Erdik, M. and Wenzel, F., 2006: A New Approach for Earthquake Early Warning. In: J. Zschau and P. Gasperini (eds.): Seismic Early Warning Systems, Springer Verlag, in press.
Böse, M., 2006: Earthquake Early Warning for Istanbul using Artificial Neural Networks. PhD thesis, University of Karlsruhe, Germany.
Cua, G., 2004: Creating the Virtual Seismologist: developments in ground motion characterization and seismic early warning. PhD thesis, California Institute of Technology, USA.
Erdik, M., Fahjan, Y., Özel, O., Alcik, H., Mert, A., and Gul, M., 2003: Istanbul Earthquake Rapid Response and the Early Warning System. Bulletin of Earthquake Engineering, 1, 157–163.
Espinosa-Aranda, J., Jimenez, A., Ibarrola, G., Alcantar, F., Aguilar, A., Inostroza, M., and Maldonado, S., 1995: Mexico City Seismic Alert System. Seismological Research Letters, 66(6), 42–53.
Gasparini, P., Manfredi, G. and Zschau, J. (eds.), 2007: Earthquake Early Warning Systems, 349 pages, Springer.
Gerstenberger, R., 2003: ODEMx: Neue Lösungen für die Realisierung von C++-Bibliotheken zur Prozesssimulation [ODEMx: New solutions for implementing C++ libraries for process simulation]. Diploma thesis, Humboldt-Universität zu Berlin.
Goltz, J.D., 2002: Introducing earthquake early warning in California: A summary of social science and public policy issues. Caltech Seismological Laboratory, Disaster Assistance Division, a report to OES and the Operational Areas.
Gruber, T.R., 1993: A Translation Approach to Portable Ontology Specifications. Knowledge Acquisition, 5(2).
Kanamori, H., 2005: Real-time seismology and earthquake damage mitigation. Annual Reviews of Earth and Planetary Sciences, 33, 5.1–5.20.
Klien, E., Lutz, M., Einspanier, U., and Hübner, S., 2004: An Architecture for Ontology-Based Discovery and Retrieval of Geographic Information. Proceedings of the 7th Conference on Geographic Information Science (AGILE 2004).
Nakamura, Y., 1989: Earthquake alarm system for Japan Railways. Japanese Railway Engineering, 8(4), 3–7.
Nakamura, Y., 2004: UrEDAS, Urgent Earthquake Detection and Alarm System, Now and Future. Proceedings of the 13th World Conference on Earthquake Engineering, Vancouver, August 1–6, 2004.
Parolai, S., Bindi, D., Durukal, E., Grosser, H., and Milkereit, C., 2007: Source Parameters and Seismic Moment–Magnitude Scaling for Northwestern Turkey (Short Note). Bull. Seism. Soc. Am. 97(2): 655–660.
Sombrutzki, R., Zubow, A., Kurth, M., and Redlich, J.-P., 2006: Self-Organization in Community Mesh Networks – The Berlin RoofNet. First Workshop on Operator-Assisted (Wireless Mesh) Community Networks, pp. 1–11.
Wenzel, F. and Marmureanu, G., 2006: Earthquake Information Systems. Pageoph Topical Issue, Proc. Volume of the 22nd Int. Tsunami Symposium, Chania, Greece, 27–29 June 2005, in press.
Wu, Y.-M. and Teng, T.-L., 2002: A virtual subnetwork approach to earthquake early warning. Bulletin of the Seismological Society of America, 92(5), 2008–2018.
Zschau, J. and Küppers, N. (eds.), 2003: Early Warning Systems for Natural Disaster Reduction, 834 pages, Springer.
Zhang, Y., Luo, J., and Hu, H. (eds.), 2006: Wireless Mesh Networking: Architectures, Protocols and Standards, 592 pages, Auerbach Publications.
Numerical Last-Mile Tsunami Early Warning and Evacuation Information System Birkmann (1), Dech (2), Hirzinger (3), Klein (4), Klüpfel (5), Lehmann (3), Mott (6), Nagel (7), Schlurmann (8), Setiadi (1), Siegert (6), Strunz (3) (1) United Nations University, Institute for Environment and Human Security (2) University of Würzburg (3) German Aerospace Center (DLR) (4) University of Bonn (5) TraffGo HT GmbH (6) Remote Sensing Solutions GmbH (RSS) (7) Technical University Berlin (8) Leibniz University Hannover
1. Background Seventeen of the twenty most disastrous natural hazards since 1950 have occurred in the last 10 years. Extreme environmental events are growing in frequency and magnitude. Hence, the number of human losses since the 1950s has now reached 1.7 million, and the economic damage exceeds 1.4 billion US$.1 Most of the hazards affect more than just one region, the most prominent example being the devastating tsunami of 26 December 2004 in the Indian Ocean. According to current estimates, it killed more than 220,000 people, made more than 1 million people homeless, and left many thousands without any basis of existence. The estimated economic cost of the tsunami (10 billion US$) is comparatively low. However, only 20% of this sum has been registered as insured loss.2 This clearly indicates that the aftermath of the seaquake to the west of Sumatra also has dimensions that were not foreseen. Therefore, a significant shift in the risk perception of extreme events must follow. The aftermath of the tsunami, with all its consequences, has already affected development politics and the international, multi-sectoral research communities. The overarching goal of these initiatives is to improve disaster prevention through preparedness methodologies, including public
and political awareness of the residual risk, and to address global hazard resilience. The current multi-disciplinary project »Last-Mile – Evacuation« attempts to incorporate all of these guiding principles and, jointly with local partners, develops the necessary evacuation recommendations based on scenarios for a tsunami early warning and evacuation information system. The reasons for the disastrous impacts of hazards, especially in developing countries, are manifold: rapid population growth, growing socio-economic segregation across large parts of the population and the extreme growth of urban agglomerations in coastal areas are the major factors. This automatically creates a dilemma in which growing utilisation and settlement pressures in coastal areas have to be balanced against preserving a natural landscape. A natural coastline and its native green belts play an important role in protecting a region from both tsunamis and storm surges.3 The narrow coastal zone up to 100 kilometres inland represents only 6 per cent of the global land area. However, nearly 40% of the world's population lives in this small strip, and this tendency is growing constantly. On top of this, two thirds of all major cities with more than 2.5 million citizens can be found in exactly this coastal zone.4 The vulnerability of the people living in the coastal zone will
grow constantly. It seems obvious that only sustainable development will deliver solutions to decrease disaster risks in coastal zones. Therefore, sustainability's basic principles and criteria for protection and use must include economic efficiency and integrity while ensuring equitableness.5 In general, such attempts consider the major concepts and principles of subsidiarity and/or reversibility; this project, in the context of early warning systems (EWS) in earth management, includes prevention as the main means of disaster risk reduction. The danger of natural disasters lies in the complexity of hazards, human vulnerability, coping capacity, resilience and the given environmental realities, like water, drinking water, soil and agriculture in coastal zones. A disaster is more likely to occur if the community is not informed about, aware of, or able to perceive the existing risks. However, the ever-growing frequency and magnitude of natural events show how crucial disaster preparation has become. Reducing human vulnerability is the only way to minimise the risk of a natural disaster. Besides activities to raise public awareness, the development of early warning systems is indispensable.6 The lives of many people and the level of economic loss depend strongly on those systems. The overarching ambition of »Last-Mile – Evacuation« within the context of early warning systems in earth management is to develop and test a tsunami early warning and evacuation information system, which will be implemented exemplarily as a prototype application in the city of Padang, West Sumatra, Indonesia. The term last mile itself characterizes the ability to secure the proper execution of an early warning chain in all its facets and institutions, i.e.
from intergovernmental commissions via national bodies down to coastal communities and the individual.7 The image of the last mile was first drawn to define the so-called crucial link after the devastating earthquake in Bam, Islamic Republic of Iran, in 2003. It describes the interface between modern communication technologies and the intended recipients of the information – the potentially affected people. Any design of an early warning system has to recognize and subsequently embed local structures, actors and capacities to create an end-to-end EWS. Local authorities and affected people have to be made aware of and adequately prepared for natural hazards in order to interpret issued warning dossiers properly and react adequately. Thus, it is likewise a primary goal of this project to raise awareness and develop appropriate preparedness mechanisms by establishing a numerical last-mile tsunami early warning and evacuation information system, which delivers concrete evacuation recommendations for tsunami inundation scenarios and methodologies for optimized evacuation, developed with the local authorities and scientists in a joint research approach. Moreover, it is essential to recognize that the local government of a district or city plays the most significant role when developing and initiating research projects in Indonesia, due to the fact that the decentralization and democratization processes in the country are still in progress. It is evident that districts (Kabupaten) and cities (Kota) continually receive more and more competencies and responsibilities from the Indonesian government, especially in relation to disaster management and prevention policies. By constitution, the Lord Mayor has the authority to make decisions in disastrous events, e.g. whether a city or coastal stretch is to be evacuated or not. The long-term and sustainable distribution of resources and the civil protection of citizens in a city are steered by the local authorities. In fact, the stringent decentralization process within a country like Indonesia is beneficial for customized local disaster management schemes and likewise advantageous for an anticipated rapid response to tsunami warning dossiers in terms of evacuation.
Therefore, the project management of »Last-Mile – Evacuation« has to elucidate whether the local authorities and capacities already constitute an efficient combination of effective, interdisciplinary, integrated and wide-ranging managing institutions for disaster management plans in the city of Padang, and, if necessary, should initiate guidelines to set
an institution like this into effect for the full usage of the information tool being developed in this project. 2. Description of the Research Project This research project develops a numerical last-mile tsunami early warning and evacuation information system (acronym: »Last-Mile – Evacuation«; sponsorship code: 03G0643A-E) on the basis of detailed earth observation data and techniques as well as unsteady hydraulic numerical modelling of the small-scale flooding and inundation dynamics of a tsunami, including evacuation simulations in the urban coastal hinterland for the city of Padang, West Sumatra, Indonesia. It is well documented that Sumatra's third-largest city, with almost one million8 inhabitants, is located directly on the coast and partially sited below sea level, and is thus located in a zone of extreme risk due to severe earthquakes and potentially triggered tsunamis. »Last-Mile – Evacuation« takes the inundation dynamics into account and additionally assesses the physical-technical susceptibility and the socio-economic vulnerability of the population, with the objective of mitigating human and material losses due to possible tsunamis. By means of discrete multi-agent techniques, risk-based, time- and site-dependent forecasts of the evacuation behaviour of the population and the flow of traffic in large parts of the road system in the urban coastal strip are simulated and concurrently linked with the other components. This project is being developed in close cooperation with the local authorities of Padang, the Engineering Faculty of the Andalas University in Padang and Indonesian research institutions. Thus, by modelling tsunami inundation scenarios as well as the performance of evacuation processes, a crucial disaster preparedness measure is made available. By developing and implementing a suitable software application, i.e. an information system, disaster mitigation methodologies are created and optimized and can be applied before and during catastrophic natural extreme events. The time frame allocated for »Last-Mile – Evacuation« is three years.
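The multi-agent idea can be reduced to a toy sketch: agents distributed along a coastal road walk toward a shelter, and the simulation reports the fraction that reaches safety within a time horizon. All numbers and the single-road geometry are invented for illustration; the actual project resolves full road networks, traffic flow and agent interactions.

```python
# Toy sketch of a discrete multi-agent evacuation simulation.
# Speeds, distances and geometry are illustrative assumptions only.
import random

def simulate(n_agents=100, road_km=3.0, speed_kmh=4.0,
             dt_min=1.0, horizon_min=90, seed=42):
    """Fraction of agents reaching the shelter within the horizon."""
    rng = random.Random(seed)
    # agents start at random positions along the road (km to shelter)
    positions = [rng.uniform(0.0, road_km) for _ in range(n_agents)]
    step_km = speed_kmh * dt_min / 60.0
    evacuated = 0
    for _ in range(int(horizon_min / dt_min)):
        remaining = []
        for x in positions:
            x -= step_km           # everyone walks toward the shelter
            if x <= 0.0:
                evacuated += 1     # reached safety
            else:
                remaining.append(x)
        positions = remaining
    return evacuated / n_agents

print(simulate())  # prints 1.0 — 90 min at 4 km/h clears a 3 km road
```

Varying the horizon (e.g. `simulate(horizon_min=30)`) already shows partial evacuation, which is the kind of time- and site-dependent forecast the project aims to produce at city scale.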
2.1. Scientific and technological objectives
An adequate disaster preparedness scheme within any early warning system consists of four or five levels. These stages, applied consecutively, build the so-called early warning chain. Only the interaction of all stages can assure an effective early warning system (e.g. in the case of a tsunami):
Figure 1: One of Padang's main roads located along the coastline
Table 1: Levels of an effective early warning chain
This research project mainly addresses level four, while also taking level five of an early warning system in the city of Padang into consideration. To deliver recommendations and model the evacuation behaviour on this level of an effective early warning chain, the project aims at deriving tsunami inundation dynamics and evacuation simulations for extreme natural events such as tsunamis on the local coasts. This is done by building an information system for subsequent disaster mitigation methodologies that can be indirectly steered prior to and during catastrophic natural extreme events. The determining aspects of a numerical last-mile tsunami early warning system in coastal communities (level 5) have so far been neglected from a scientific point of view. Against this background, the major task of future developments must focus on applied tsunami research themes as well as on a general understanding of the reciprocal relation between the tsunami, the coast (urban hinterland) and people's behaviour (reaction) in the case of a concrete tsunami warning or, likewise, a storm surge warning. The overarching objective is to deliver precise preparation measures in the form of clear warning and evacuation schemes that account for time-of-day- and day-of-week-specific dependencies in near-shore urban agglomerations. The project funded by the Federal Ministry of Education and Research (BMBF) with the title »Implementation of core elements of a tsunami early warning system in the Indian Ocean in collaboration with Indonesia and other partners – GITEWS« (submission number: 03TSU01)
aims to develop an effective and sustainable early warning system with Indonesia and other countries on the Indian Ocean Rim. This major project of the German Helmholtz institutions and other research centres mainly concentrates on monitoring systems and innovative technologies (levels 1 and 2). A seaquake in the region of the Sunda Arc that triggers a tsunami might reach the West Sumatran coast of Indonesia within 18–20 minutes. This critical time mark defines the design and the data evaluation of the tsunami early warning system in Indonesia [9]. The tsunami early warning system can therefore only make a meaningful difference if the issued warning dossier (or all-clear dossier) reaches the affected region within 10–12 minutes, so that a proper evacuation of the coastal region can be initiated. »Last-Mile – Evacuation« builds upon some of the information and evaluations that have been developed or are in the process of being generated in GITEWS. In this respect, certain work packages are directly related to this research project, since the tsunami risk assessment studies in GITEWS are being conducted for the whole of the Indonesian coastal stretches (approximately 4000 km) bordering the Indian Ocean (Sumatra, Java, Bali, etc.). A coarse-resolution analysis is therefore conducted in GITEWS, which concentrates on research on deep-water tsunami propagation in the whole Indian Ocean as well as on parameterizing the interaction between tsunami generation and seismic shocks. The local tsunami run-up is analyzed similarly, but the hydro-numerical simulations stem from a meso-scale numerical grid size and irregular
cell structure (100 × 100 m, occasionally refined to 50 × 50 m). The hydro-numerical models used in common approaches neither incorporate local infrastructure nor consider street networks or urban waterways in the respective cities. The simulations make use of constant coastal slopes representing the near-shore region and the urban hinterland without resolving infrastructure. To establish quantitative information about the local run-up heights onshore, land-use and land-cover data are often parameterized to capture the characteristic roughness and thus to provide friction coefficients for the model. Neither are numerical evacuation simulations envisaged in common research projects, nor is an information system produced that relies on detailed, time-dependent inundation dynamics and evacuation recommendations. »Last-Mile – Evacuation« thus builds its scientific basis upon previous studies but goes far beyond their expected outcome. In summary, »Last-Mile – Evacuation« defines itself as an independent research project: it is neither a parallel approach to the same scientific challenge, nor does it produce redundant results for validation purposes. With reference to Table 1, »Last-Mile – Evacuation« completes the early warning chain and exemplarily shows the mechanisms of evacuation efforts and the dynamics of tsunami inundation in an urban area of Western Sumatra. The project thus addresses the crucial questions of how – after a verified tsunami scenario alignment and thereby validated dissemination – the initiation of evacuation takes effect, and whether the subsequent chronology of this procedure is feasible at all. What are the impacts on and reactions of the people in Padang, and which evacuation recommendations or technical preparedness measures (i.e. shelters) follow from them?
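The land-use-based roughness parameterization mentioned above is often realized as a simple lookup from land-cover classes to Manning friction coefficients. The following sketch is illustrative only (not project code); the class names and n-values are assumptions for demonstration, not values used in GITEWS or »Last-Mile – Evacuation«.

```python
# Hypothetical lookup of Manning's n per land-use class (illustrative values)
MANNING_N = {
    "open_water": 0.020,
    "beach_sand": 0.025,
    "grassland": 0.035,
    "dense_urban": 0.080,
    "forest": 0.100,
}

def roughness_grid(land_use, default_n=0.030):
    """Map a 2D grid of land-use labels to a grid of Manning's n values."""
    return [[MANNING_N.get(cell, default_n) for cell in row] for row in land_use]

# A 2x2 toy land-cover grid, e.g. derived from classified satellite imagery
grid = [["beach_sand", "grassland"], ["dense_urban", "forest"]]
friction = roughness_grid(grid)
```

In practice such a grid would be rasterized from classified remote sensing data and fed to the hydro-numerical solver as a spatially distributed bottom-friction field.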
It is reasonable to transfer the project's methodological approach to other coastal cities on the Indian Ocean Rim by merely substituting the boundary conditions, thereby providing a best-practice solution.
2.2. Overall project objectives
The key objective of this research project is to achieve and fulfil the final level of the tsunami early warning chain in the city of Padang. Answers as well as concepts regarding the following central aspects are determined:
– In what respect is – after localization of a tsunamigenic earthquake, detection of the triggered tsunami in the deep ocean, verified scenario alignment and thereby validated dissemination in the crisis and information centre in Jakarta – the initiated request for evacuation, and the chronology of this procedural instruction, feasible at all, considering the inherent physical-technical susceptibility and the socio-economic vulnerability of the population in the coastal region as well as the respective time of day and day of the week?
– What kind of inundation dynamics characterizes a tsunami in the city of Padang, and which consequences can be derived for the optimization of evacuation schemes and standard operational procedures? What are the exact time frames of tsunami inundations in Padang's coastal districts within which evacuation routines must be organized successfully?
– Which bottlenecks arise during the process of evacuation? Which time-of-day- and day-of-week-specific dependencies emerge during the evacuation? Where are critical infrastructures located, and how can their individual evacuation be optimized? What is the general traffic situation under an acute tsunami threat after the onset of evacuation, and how might the resulting obstructions and bottlenecks be resolved?
– How can the vulnerability of the population and of the critical infrastructures be exemplified and measured? When and where are hotspots of vulnerability, and where is special assistance needed in the case of an evacuation?
– Which scenario-specific decisions and recommendations, e.g. vertical evacuation, tsunami shelters, etc., have to be met technically and within future administrative spatial planning processes in order to minimize the tsunami disaster risk in Padang, i.e. to reduce the susceptibility of Padang's coast and the vulnerability of its people?
On the whole, »Last-Mile – Evacuation« leads to insights into the detailed flow dynamics of tsunamis and points to optimized early warning and evacuation mechanisms in the city of Padang, which can be taken up in municipal and spatial planning efforts towards integrated coastal zone management – here in Padang. A site-specific software application with a Graphical User Interface (GUI) is defined as the physical end product of the project. This system assembles scenarios, data sets and analysis results of the four thematic subcomponents of the project and inherently generates additional, coupled information for decision-makers and local authorities. The project's methodological approach is transferable to other coastal cities on the Indian Ocean Rim. The system is being developed in close cooperation with local authorities and research institutions as a participative approach and is finally recommended as a supplementary tool for the local decision-makers, since it integrates an efficient and effective collection, administration, processing and distribution of data, and generates and hosts analysis results of early warning and evacuation processes in the city of Padang. It thus also incorporates approaches for the development of adaptation and mitigation strategies to reduce disaster risks in the region, especially by means of capacity building and stakeholder dialogue to generate knowledge and further insights about tsunami threats and appropriate evacuation recommendations for the people of the region. The end products of the project, i.e. the evacuation recommendations and the information system, are established gradually. A first, rudimentary version of the application is recommended to the local authorities as a basic tool for daily planning processes.
It embeds an efficient and effective collection, administration, processing and distribution of data and of the inherent information from calculations concerning early warning and evacuation. The information system's architecture is built on open-source code to enable autonomous upgrades by the local authorities. Interfaces and standards are defined to feed updates into the system, e.g. field survey data or newly derived census data. On the whole, the project demands and incorporates an interdisciplinary research approach (social sciences, economics, engineering and natural sciences) together with a strong linkage to local authorities, decision-makers and the people of the city of Padang.
2.3. Particular scientific and technological work load
The work towards the research project objectives is divided into five work packages, based on the field of expertise of each partner, namely socio-economic vulnerability assessment (WP 1000), inundation scenarios and flow analysis (WP 2000), geodatabase, information system and vulnerability assessment (WP 3000), evacuation analysis and pedestrian traffic optimization (WP 4000), and a highly resolved 3D model of Padang (WP 5000) (see Figure 6 in Chapter 3). On the basis of satellite images and other data sources, advanced, highly resolved information including topography, infrastructure, population density and settlement structures is provided. These tasks are compiled by WP 3000. The collected data serve as the starting point for further analyses. In cooperation, WP 1000 focuses on the analysis of the susceptibility and coping capacity of the inhabitants of certain ranges of the coastal stretches in Padang, in order to exemplify and measure vulnerability in a broader sense. This is done in a holistic endeavour to resolve the hotspots of vulnerability and to ascertain where special assistance is needed in case of an evacuation. These widespread evaluations are linked to the coupled information system that is being developed upon all assembled data by WP 3000, so that real-time prognoses after a tsunami has battered the shores can give first estimates of tentative losses and potential damages.
Thus, essential information for ad hoc disaster management activities can be determined effectively.
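Indicator-based vulnerability assessments of the kind described above are frequently operationalized as a weighted composite index over normalized indicators. The following is a hedged sketch under invented assumptions: the indicator names, district values and weights are illustrative only and do not reflect WP 1000's actual indicator set.

```python
def normalize(values):
    """Min-max normalize a list of raw indicator values to [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def vulnerability_index(indicators, weights):
    """Weighted sum of normalized indicators, one score per district."""
    norm = {name: normalize(vals) for name, vals in indicators.items()}
    n_districts = len(next(iter(indicators.values())))
    return [sum(weights[name] * norm[name][i] for name in indicators)
            for i in range(n_districts)]

# Three hypothetical coastal districts (all numbers invented)
indicators = {
    "population_density": [12000, 4000, 8000],  # persons per km^2
    "share_elderly": [0.12, 0.08, 0.20],        # fraction of population
    "distance_to_shelter": [1800, 400, 900],    # metres
}
weights = {"population_density": 0.4, "share_elderly": 0.3, "distance_to_shelter": 0.3}

scores = vulnerability_index(indicators, weights)
hotspot = scores.index(max(scores))  # index of the most vulnerable district
```

Such a score can flag "hotspots of vulnerability" for prioritizing evacuation assistance; the real assessment would of course rest on surveyed census and infrastructure data rather than invented values.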
Figure 2: Physical vulnerability of built infrastructure in Padang (Source: DLR)
WP 1000 concentrates on how to specify vulnerability indicators with regard to exposed and susceptible elements (people, different social groups and their socio-economic structures) as well as evacuation capability, perception and behaviour. It comprises the quantitative collection of suitable indicators of Padang's population through local municipal statistics (e.g. the census) as well as additional surveys. Moreover, indicators on critical infrastructures will be collected in collaboration with WP 3000, which focuses on conducting an extensive physical vulnerability assessment. The use and unification of different data sources are essential for the development of insightful vulnerability indicators. This provides a deeper understanding and characterisation of Padang's districts, so as to come up with evacuation information in case of a heightened tsunami threat. Data for these particular analyses are collected and processed by local researchers and students from Andalas University in Padang. This holistic approach to vulnerability assessment aims primarily at small-scale and household-level investigations. It focuses on a thorough assessment of the vulnerability of society and population, economy and environment to natural disasters, in particular tsunamis or storm surges, which require special attention when establishing an information system associated with evacuation routines. In addition, this vulnerability analysis intends to elucidate the characterisation and
quantification of coping capacities and their spatial distribution in the city of Padang, in order to determine the existing institutions for disaster control and management (fire brigade, police) and their situation, as well as appropriate emergency shelters for evacuation. This latter aspect is linked with a particular component of GITEWS. The work package Capacity Building also strengthens local governments and organizations in developing appropriate warning and disaster preparedness mechanisms, mostly on the basis of drills and trainings in the field. This component is carried out by the German Technical Cooperation (GTZ) and is being put into operation in three pilot regions in Indonesia – among them Padang. Detailed hazard maps for tsunamis or storm surges in coastal areas, based on one- and two-dimensional unsteady hydro-numerical simulations, are produced by WP 2000. It develops the tsunami inundation scenarios in »Last-Mile – Evacuation«, which are established on shallow-water bathymetry and the urban topography of the coastal hinterland. The requirements in terms of precision and detail of a digital elevation model (DEM) are extremely high. It has been agreed with the Indonesian researchers from the line agencies BAKOSURTANAL, BPPT and LIPI to jointly extend the simple digital elevation model for the city of Padang in a sustainable way. WP 2000 and WP 3000 support the Indonesian counterparts in upgrading that DEM. Spatially highly resolved bathymetry data are indispensable because of the major impact of submerged small-scale structures, e.g. coral reefs, sea canyons, headlands, sandbars or fluvial near-shore sedimentation, on the hydrodynamic simulation of the wave transformation once the tsunami approaches the shore. It is known from Indonesian researchers that coarse bathymetry data are available, but further measuring campaigns are being conducted by WP 2000. Once the DEM is extended, WP 2000 conducts micro-scale hydro-numerical simulations of near-shore tsunami wave dynamics off the coast of Padang and links these simulations with the bore propagation of the incoming water onshore on different numerical scales. A hybrid approach is chosen that takes its boundary conditions from the variation of water levels and near-shore currents of a pre-selected hydro-numerical model of the near-shore coastal wave field. In the transition zone between the beach and the urban area, an efficient and rather rapid one-dimensional hydro-numerical model is accordingly set up, integrating significant open paths and waterways (streets, rivers) as well as the capabilities of retention areas (open fields, parks, markets). This model is extracted from the highly resolved DEM (WP 3000 and WP 5000). It is subsequently provided to WP 4000 for further usage in their respective research programmes. In the meantime, this basic model is expanded into a two-dimensional adaptation that is able to cover specific critical infrastructures (hospitals, schools or other public buildings) in detail. By means of numerical coupling, this enhanced model is integrated into the existing one-dimensional representation of the city of Padang and fully linked with WP 4000, which subsequently aims to model how evacuation procedures in specific infrastructures and streets evolve, based on the established hydro-numerical model provided by WP 2000. As a further result for subordinate implementations, e.g.
the detailed surge flow dynamics past selected infrastructures, the parameters water level, flow rate and direction at any place of the city for the entire period of the tsunami inundation are
made available in the urban area. The precision and detail of any pre-calculated scenario is provided on spatial domains in the order of 1.0 metre, with temporal resolutions of about 5 seconds. These highly resolved inundation mechanisms in the city of Padang then make it possible to estimate relevant potential damages, provide decision-support parameters for unambiguous vulnerability assessments, and consequently derive optimized evacuation procedures. This directly establishes the linkage to WP 4000. For the development and examination of evacuation concepts, WP 4000 uses discrete multi-agent models that concentrate on the substantial aspects of the evacuation process itself. Thus, extremely fast and efficient numerical simulations are feasible. WP 4100 concentrates on evacuations within buildings and typical infrastructure (blocks) in the city of Padang. Design sketches of the structures provide the basis for these computations. By means of thorough data collection, WP 1000 delivers typical time-of-day and day-of-week information about the habits and behaviours of the people and different social groups, to gain a deeper understanding of where and when the citizens of Padang work, live, relax and gather. It will develop characteristic scenarios with WP 4200 to create typical boundary conditions for the time- and site-dependent numerical evacuation simulations. With these data, the subsequent simulations of the evacuation behaviour within the buildings are likely to pinpoint critical infrastructure (design malfunctions) and bottlenecks within the distinctive evacuation procedures.
Figure 3: Existing evacuation plan in Padang (Source: KOGAMI)
WP 4200 numerically simulates the road traffic in case of an evacuation, using the road network information provided by WP 3000. The backbone of these computations is formed by a so-called »Queue Model«, which simplifies roads as edges and crossings as nodes to facilitate quick computations of large scenarios. The model requires only simple, graph-based information (connectivity of the edges and nodes; length, width and capacity of the edges), based on the existing model generated by WP 2000. The computed evacuation procedures in the streets of Padang also contain a behavioural mode, including the simulation of individual escape routes. On the whole, future spatial planning conceptions in coastal zones around Padang should make extensive use of the outcome of these evacuation models and should connect potential infrastructural measures with the one- and two-dimensional hydro-numerical tsunami inundation model.
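A queue model of the kind just described can be sketched very compactly: each road edge has a free-flow traversal time and a per-time-step outflow capacity, and agents queue at the downstream end. The following is a minimal illustrative sketch, not the project's simulator; the network topology, capacities and agent counts are invented assumptions.

```python
from collections import deque

class Edge:
    """A road segment: traversal time (steps) and outflow capacity per step."""
    def __init__(self, travel_time, capacity_per_step):
        self.travel_time = travel_time
        self.capacity = capacity_per_step
        self.queue = deque()  # entries: (agent_id, earliest_exit_step)

def simulate(route_edges, n_agents):
    """Push n_agents through a single chain of edges; return total steps needed."""
    # all agents start on the first edge at step 0
    for a in range(n_agents):
        route_edges[0].queue.append((a, route_edges[0].travel_time))
    step, done = 0, 0
    while done < n_agents:
        step += 1
        for i, e in enumerate(route_edges):
            moved = 0
            # release agents whose traversal time has elapsed, up to capacity
            while e.queue and e.queue[0][1] <= step and moved < e.capacity:
                agent, _ = e.queue.popleft()
                moved += 1
                if i + 1 < len(route_edges):
                    nxt = route_edges[i + 1]
                    nxt.queue.append((agent, step + nxt.travel_time))
                else:
                    done += 1  # agent has left the network (reached safety)
    return step

# A two-edge escape route with a bottleneck (capacity 2 per step) at the end
route = [Edge(travel_time=3, capacity_per_step=5),
         Edge(travel_time=2, capacity_per_step=2)]
total_steps = simulate(route, n_agents=20)
```

The bottleneck edge dominates the evacuation time, which is exactly the kind of effect the work package aims to expose; a full model would operate on the whole road graph with route choice rather than a single chain.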
Figure 4: Last-Mile – Evacuation flight planning to take high resolution aerial images of Padang in 2007
WP 5000 provides the high-resolution spatial data required by WP 2000 and WP 4000. Currently, the only source of surface height data in the area is the Shuttle Radar Topography Mission (SRTM). The major disadvantage of SRTM data in the context of damage modelling is its coarse resolution and non-validated sources; overview maps produced from these data cannot support precise modelling of damage patterns. WP 5000 therefore focuses on delivering high-resolution aerial images and elevation data of about 500 km², covering the city of Padang and adjacent villages. These data are derived by applying the Multi Functional Camera (MFC) system (developed by the German Aerospace Center – DLR), which has unique properties and is one of the most advanced digital aerial photographic systems worldwide. Synchronously with the image data acquisition at up to 16 cm spatial resolution, a detailed Digital Surface Model (DSM) is derived through highly automated processing. The MFC system is one of the very few available systems capable of producing image and elevation data with the radiometric and geometric quality needed. The spatial resolution of both the DSM and the delivered image data is 25 cm. Based on the DSM, a Digital Elevation Model (DEM) will be derived, and in a further step all buildings with their individual heights will be extracted and classified according to their building structure. These data will be converted by a highly automated process into a three-dimensional city and coastal model. Based on this model, the potential risk to individual buildings can be estimated by taking the hydro-numerical calculations (WP 2000) into account. The results will be implemented into a spatial geodatabase and combined with the results of WP 1000 and WP 3000.
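The step from DSM to building heights is commonly realized as a normalized DSM (nDSM = DSM − DEM) with a minimum-height threshold to suppress low vegetation and noise. The sketch below illustrates this principle only; the grids and the 2 m threshold are invented assumptions, not the project's actual processing chain.

```python
def building_heights(dsm, dem, min_height=2.0):
    """Per-cell object height (DSM minus DEM) where it exceeds min_height, else 0."""
    heights = []
    for dsm_row, dem_row in zip(dsm, dem):
        row = []
        for surface, terrain in zip(dsm_row, dem_row):
            h = round(surface - terrain, 2)  # normalized DSM value, metres
            row.append(h if h >= min_height else 0.0)
        heights.append(row)
    return heights

dsm = [[5.0, 3.1], [12.4, 3.0]]   # surface elevations incl. buildings (m)
dem = [[3.0, 3.0], [3.2, 2.9]]    # bare-earth terrain elevations (m)
ndsm = building_heights(dsm, dem)  # → [[2.0, 0.0], [9.2, 0.0]]
```

In a real workflow the thresholded nDSM would additionally be segmented into building footprints and classified by structure type before feeding the 3D city model.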
Figure 5: Photorealistic 3D city model (Source: RSS GmbH)
In order to evaluate the possible effects of tsunami inundations in Padang with respect to proposed evacuation measures and the reduction of potential damages or losses, rule-based (decision-tree) strategies as a function of time-of-day and day-of-week information are developed. The consequences of the tsunami flooding at the coast and in the hinterland in terms of vulnerability and hazard assessments are determined by work packages 1000, 2000 and 3000 and made available to all project stakeholders via a suitable common exchange interface. It is of particular interest to determine by which time the streets or individual buildings in Padang have to be evacuated; WP 4100 assesses evacuations within buildings and typical infrastructure, and WP 4200 determines whether the traffic evolving under a heightened tsunami warning is limited by the present road network in Padang. In this regard, the information system can also be used for the distribution of relief efforts after a tsunami has hit the city, due to the fact that street network data and flooded districts as well as the essential socio-economic data (demographics, mobility, e.g. availability of cars, motorcycles or bicycles) are integrated in the system. As an end product, a web-based information system will be developed by WP 5000, which gathers all information derived during the project »Last-Mile – Evacuation« in order to provide it to public and governmental institutions. The information system will be available in 3D and displays the potential risks of districts and even single buildings, taking different tsunami
scenarios into account. It provides all relevant information for evacuation and risk assessment in an easily understandable way, as well as for the evaluation of different measures in future decision processes, e.g. spatial planning and construction sites, which may decrease the damaging effect of coastal hazards. This application will be available via intranet as well as internet. Moreover, video animations are rendered by combining all components of the project. In this way the behaviour of a tsunami hitting the coast and the wave propagation inside the city of Padang, as well as potential damage and possible evacuation routes, can be realistically simulated. The animations will be used for awareness campaigns and as training material for the local authorities.
3. Structural Design and Current Status of the Project
Figure 6 provides a general impression of the internal structural design of the current project. (The detailed list of the German partner institutions is attached in the Annex.) The five work packages of the project are led by the respective German partners, and each of
Figure 6: Project organization and work structure
them is subdivided into several subcomponents. Each work package is characterized by interfaces open to the partner institutions. By coupling these components, new insights of the work packages and subcomponents in terms of data exchange or joint scenario development are created and forwarded to the overarching work packages Evacuation recommendations & information system and Web application, which finally lead to the establishment of an open software application encompassing all available scientific knowledge, information, scenarios, data and analysis results regarding inundation dynamics, vulnerability and evacuation in Padang. This end product is nourished and promoted by all involved project partners. Within this project, networking is carried out internally as well as externally. Internal networking takes place through content-based technical collaboration within this multidisciplinary combination of partner institutions. External networking focuses on collaborating with the Indonesian partners in GITEWS (BAKOSURTANAL, BPPT and LIPI), yet it primarily takes the local authorities, the Faculty of Engineering at Andalas University and the Indonesian research institutions into account. In order to enable a functioning and reliable network of partnering academic institutions, besides signing letters of intent and/or endorsing the objectives of the project, financial support is also allocated for conducting research on the collection and analysis of essential socio-economic data sets in Padang. During the foreseen meetings, a thorough needs assessment of prospective partners in Padang concerning the tangible potentials and expertise of academic institutions and local authorities is conducted. Likewise, an institutional mapping approach on the local and regional scale will be followed to determine mandate and participation in decision-making processes regarding disaster management. The above-mentioned Indonesian research institutions have already been determined through the research being executed in GITEWS, which has so far led to the creation of a well-functioning bilateral network among German and Indonesian partners. The methods and technologies
within »Last-Mile – Evacuation« partially support the work of GITEWS, but take its workload even further and do not duplicate its core scientific challenges. This research project is currently in its early stage, in which networking with the local Indonesian institutions is being initiated and preliminary data are being collected. A hydrographical survey is currently being conducted in order to gather highly resolved data on the near-shore bathymetry of Padang over an extent of up to 100 km², as well as to collect information about flow rates perpendicular and parallel to the coastline, aiming to enhance the insight into tidal dynamics and wave interactions. Moreover, an official kick-off meeting with all German and Indonesian partners is planned for the beginning of 2008.
Annex I: References
[1] UNDP (2004), Reducing Disaster Risk – A Challenge for Development. United Nations Development Programme
[2] MunichRe (2005), Schadenspiegel 3/2005 – Themenheft Risikofaktor Wasser, MunichRe
[3] Fernando, J. (2005), Coral Poaching Worsens Tsunami Destruction in Sri Lanka. Eos, Vol. 86, No. 33
[4], [5] Oumeraci, H. (2003), Wasser im Küstenraum. In: Denkschrift Wasser, Deutsche Forschungsgemeinschaft (DFG)
[6] Bogardi, J. (2004), Hazards, risks and vulnerabilities in a changing environment. Global Environm. Change 14, pp. 361–365
[7] Shah, H. (2003), Last-mile – Earthquake Risk Mitigation Assistance in Developing Countries, Report Stanford University
[8] Population number in year 2005 was 801,344, from BPS Kota Padang (2006), Padang dalam Angka 2006
[9] GITEWS Konsortialpartner (2005), Technical Layout of GITEWS, Helmholtz-Gemeinschaft

Annex II: List of German project partners (& institutions) in Last-mile – Evacuation

Partner 1: Dr.-Ing. Jörn Birkmann
Institute for Environment and Human Security, United Nations University
Hermann-Ehlers-Straße 10, 53113 Bonn
Tel.: +49 (0)228 815–0208, Fax: +49 (0)228 815–0299
Email: birkmann@ehs.unu.edu

Partner 2: Univ.-Prof. Dr.-Ing. habil. Torsten Schlurmann (Projektleiter)
Franzius-Institut für Wasserbau und Küsteningenieurwesen
Fakultät für Bauingenieurwesen und Geodäsie, Leibniz Universität Hannover
Nienburger Straße 4, 30167 Hannover
Tel.: +49 (0)511 762–2572, Fax: +49 (0)511 762–4002
Email: schlurmann@fi.uni-hannover.de
Partner 3: Prof. Dr. S. Dech, Dr. Günter Strunz
Universität Würzburg, Sanderring 2, 97070 Würzburg
Lehrstuhl für Fernerkundung
Email: guenter.strunz@dlr.de

Partner 4a: Dr. Hubert Klüpfel (KMU)
TraffGo HT GmbH, Bismarckstr. 142, 47057 Duisburg
Tel.: +49 (0)203 8783–3601, Fax: +49 (0)203 8783–3609
Email: kluepfel@traffgo-ht.com

Partner 4b: Univ.-Prof. Dr. Kai Nagel
Verkehrssystemplanung und Verkehrstelematik, Institut für Land- und Seeverkehr
Fakultät V – Verkehr- und Maschinensysteme, Technische Universität Berlin
Sekr. SG 12, Salzufer 17–19, 10587 Berlin
Tel.: +49 (0)30 314–23308, Fax: +49 (0)30 314–26269
Email: nagel@vsp.tu-berlin.de

Partner 5a: Prof. Dr. Florian Siegert, Dr. Claudius Mott
Remote Sensing Solutions GmbH (RSS), Wörthstr. 49, 81667 München
Tel.: +49 (0)89–48954765, Fax: +49 (0)89–48954767
Email: siegert@rssgmbh.de und mott@rssgmbh.de

Partner 5b: Deutsches Zentrum für Luft- und Raumfahrt e.V. (DLR), Linder Höhe, 51147 Köln
(ausführendes Institut: Institut für Robotik und Mechatronik, Abt. Optische Informationssysteme, Rutherfordstr. 2, 12489 Berlin)
Prof. Dr. Gerd Hirzinger und Frank Lehmann
Tel.: +49 (0)8153 282401, Fax: +49 (0)8153 288667
Email: gerd.hirzinger@dlr.de und frank.lehmann@dlr.de
Partner 5c: Prof. Dr. Reinhard Klein Institut für Informatik II – Computer Graphik, Universität Bonn Römerstr. 164, 53117 Bonn, Germany Tel.: +49 (0)228 734201, Fax: +49 (0)228 734212 Email: rk@cs.uni-bonn.de
Sensor based Landslide Early Warning System – SLEWS
Development of a geoservice infrastructure as basis for early warning systems for landslides by integration of real-time sensors
Arnhardt C. (1), Asch K. (3), Azzam R. (1), Bill R. (2), Fernandez-Steeger T. M. (1), Homfeld S. D. (4), Kallash A. (1), Niemeyer F. (2), Ritter H. (4), Toloczyki M. (3), Walter K. (2)
(1) Chair of Engineering Geology and Hydrogeology (LIH), RWTH Aachen University, Email: arnhardt@ / azzam@ / fernandez-steeger@ / kallash@lih.rwth-aachen.de
(2) Chair of Geodesy and Geoinformatics (GGR), Rostock University, Email: ralf.bill@ / frank.niemeyer@ / kai.walter@uni-rostock.de
(3) Federal Institute for Geosciences and Natural Resources (BGR), Hannover, Email: kristine.asch@ / markus.toloczyki@bgr.de
(4) ScatterWeb GmbH (SWB), Berlin, Email: homfeld@ / ritter@scatterweb.de
Abstract
Early warning systems are becoming one of the main pillars of disaster prevention for natural hazards, especially where mitigation strategies are not realizable. The call for multi-hazard early warning systems and for improved monitoring of natural hazards is therefore steadily growing. However, data gathering through sensor integration and the system-independent supply of information from early warning networks remain particular challenges. Because very complex information and warning chains have to be served, standardisation and interoperability play an important role for all organisations and infrastructures involved. The SLEWS project investigates the complete information chain, from data gathering using wireless sensor networks via information processing and analysis to information retrieval. This is demonstrated for landslides and mass movements. The proposed approach addresses in particular the funding targets of mobile, cost-reduced and easily deployable measurement systems, as well as modern information
systems under consideration of interoperability and service-oriented architecture concepts. The proposed wireless network provides the basis for applying further techniques such as sensor fusion and the identification of malfunctions or errors. The obtained geodata is processed according to the requirements of particular users and can be provided for use in local and regional as well as global information structures. Interfaces will be created that make a mobile and flexible adaptation or integration of geodata possible. Thus, it is ensured that the early warning system reaches the whole spectrum of potential users and provides maximum flexibility.
1. Introduction
Due to the progressive development of urban areas and infrastructure in Europe as well as world-wide, more and more people settle in environments that are, or will become, endangered by mass movements. This situation is complicated by the fact that the dependency of today's society on a functioning infrastructure and the number of humans
or objects in endangered areas increase at the same time. This leads to an overall increase of risk for our society. This development is not a problem specific to Germany or the Alpine region; it is a world-wide challenge, as is apparent from the increasing number of national and international programs in recent years. Besides the investigation of hazards, these programs focus on risk reduction through monitoring and early warning, since immediate positive effects for the protection of human lives and property can be expected here. Currently existing monitoring systems for early warning are available as monolithic systems. This is a very cost-intensive approach in terms of installation as well as operational and personnel expenses. A very complex emergency plan is usually executed in case of warning. This requires disciplined adherence to an information chain; failure of single elements will significantly disturb the flow of information. The new approach provides the possibility for an involved institution to obtain user-adapted information at a very early stage, independent of a hierarchic information structure. The present joint project aims at the systematic development of a prototype alarm and early warning system to address different kinds of natural hazards, taking landslides as an example. It is based on an innovative service-oriented information structure integrating OGC standards and the application of highly flexible sensors, including sensors from the automotive sector chosen for their availability and cost-efficiency. It is an advance over classical early warning in that the sensor networks are freely scalable and extensible, and foreign data from other providers can be integrated thanks to an open platform strategy with web processing services. The planned geoservice infrastructure will involve sensors, geodata, recent information and communication, as well as methods and models to estimate parameters relevant to the stability of landslides.
2. Objectives
One main goal of this project is the development of a low-cost autonomous sensor network that is suitable for the detection and observation of mass movements. In order to obtain information about the movement rate, acceleration and movement direction of a landslide area, appropriate real-time observation systems are selected and will be tested. One focus will be on the determination of realistic scenarios and adequate sensor profile requirements. To develop a reasonable and effective sensor combination (sensor fusion) and alignment for early warning, the results of experiments with the different sensors will be analyzed and evaluated with regard to their quality. Special focus will be on sensor and network fusion with respect to the reduction of false alarm rates, malfunction detection and information enhancement. The sensors will be integrated on so-called motes, providing energy supply, processing capabilities and a wireless network connection. In this context, wireless sensor networks make perfect sense as they provide good coverage of a wide area without unacceptable deployment effort. With the multihop approach of wireless networks, each self-organized sensor (node) automatically integrates into the network and forwards both local measurements and data of peer nodes to a data collection point. In addition, innovative geodetic measuring principles are used to describe movements on the surface in space and time. The resulting heterogeneous observation sets (real-time and geodetic measurements) have to be integrated in a common adjustment and filtering approach. In the given project, geodetic methods for deformation analyses have to be adapted and modified with respect to real-time requirements. The second main goal of this project is to link, prepare and aggregate relevant information for early warning systems.
This will be based on data received from different information sources of the cooperating project partners in the process of building a spatial data and spatial service infrastructure. The whole information system will be optimized on the basis of the current state of the art of web-based, distributed spatial information systems. The development and setup of a geodata and geoservice infrastructure for the integration of real-time data and geoprocessing is essential for reaching the objectives. The infrastructure developed here will provide syntactic and semantic interoperability. To this end, applicable processes will be required for the ad-hoc networking of spatial services and for the ad-hoc building of highly complex processing routines and models. The information generation will be carried out in parallel with the provided models and processes, but should remain open for future development. Much importance will be attached to the use and further development of existing standards of the Open Geospatial Consortium (OGC), from which synergies with other projects can result. Another important goal of the project is the formulation and analysis of an interface to end users and political decision makers. An essential component is the integration of the system into present network structures, the
connection to other early warning systems and the feeding of attained results into national and international research programs. Working packages were defined, and each package is assigned to a project partner who is responsible for its content. The Department of Engineering Geology and Hydrogeology of the RWTH Aachen University (LIH) is responsible for establishing an observation network, integrating and evaluating new geo-sensors for the observation of mass movements in consideration of landslide type and mechanism. The ScatterWeb GmbH (SWB) manages the initialization of wireless, self-governed and self-organizing sensors at competitive costs. The Department of Geodesy and Geoinformatics of the Rostock University (GGR) improves data quality and provides the interface for implementing a geoservice infrastructure (GDI), while the Federal Institute for Geosciences and Natural Resources (BGR) identifies user demands and the possibilities to profit from the project's outcome by installing an expert system in line with European standards. Although individual partners lead specific tasks within the joint project, the partners will
Figure 1: Linkage and interaction between project partners involved
work in close cooperation to ensure a smooth flow of information and exchange, and thereby continuous progress (Fig. 1).
3. Technology
3.1. Wireless sensor network
3.1.1. Sensor concepts and sensor fusion
A new level of miniaturization of processors and radio modules has encouraged the rise of a new kind of network: ad-hoc wireless sensor networks. Measurement data is collected by distributed sensor nodes, which interact separately and connect to each other in a self-organizing manner. Modern nodes for wireless sensor networks are built in a strongly modular way, providing open interfaces for the integration of very different measuring sensors. Simple temperature and humidity sensors can be integrated as well as high-precision displacement and acceleration transducers, tiltmeters, geophysical and acoustic sensors, or even GPS modules for location determination (Fig. 2). For the observation of mass movements it is very important to define which measuring sensor can be used for a specific failure mechanism. In order to observe slow movements, like creeping, very sensitive measuring sensors are necessary that can detect small elongations. These sensors are often very expensive because of their high precision. By contrast, slide and fall movements are much faster, with higher accelerations and elongations; hence, sensors are used that can detect larger movement amplitudes. This knowledge guides the selection of the sensors in line with the project's target of using low-cost sensors. The idea is to use sensors that are readily available from the automotive industry and can be integrated with little complexity to sense the mass movements. At the outset, however, it is important to link the sensor measurements to the failure mechanisms. Tab. 1 shows the five main landslide failure mechanisms that are examined in this project. The reasons for concentrating on these failure mechanisms are:
1. The project concentrates on fast mass movements like rock fall, topples or rock slides.
2. Debris flows are not considered because of the costs of the monitoring measurements
Figure 2: Schema of a Sensor based Landslide Early Warning System
Table 1: Example of Failure Mechanisms and Instrumentation of different Measuring Sensors
and their completely different boundary conditions (flow process, very fast).
3. The warning should be a short-term or post-event warning.
As stated in Tab. 1, the sensing of acceleration is helpful for all chosen mechanisms. Depending on the kind of sensor used (three-axial or mono-axial), the signal can be dashed or continuous in the case of fall or topple: if a three-axial sensor is used the signal will be continuous, but it will be dashed if the sensor has only one axis (Fig. 3). Because of the high price of three-axial sensors, two or three mono-axial sensors will be mounted in different orientations on a mote to set up a multi-axis acceleration sensor. The sensing of displacement can be realised using potentiometric draw wire displacement transducers or normal linear displacement transducers (Fig. 4). The draw wire displacement transducer has the advantage that the stainless steel cable is relatively flexible at its end, so that movement in all directions poses no problem. The angle sensor is a useful instrument for monitoring toppling or, in some cases, falling. In the case of a rotational slide whose slip surface is known, it is useful
Figure 3: Examples of diagrams from mono-axial and three-axial acceleration sensors
A.) Draw Wire Displacement Transducer
B.) Linear Displacement Transducer
Figure 4: Examples of displacement transducers
to use an angle transducer to identify the active part of the slide. Angle sensors can also help to detect the reactivation of endangered areas as a first signal of beginning movements. Additionally, angle sensors can be used to measure landslide movements indirectly, for example on walls or buildings, as they detect the deformation and tilting of such constructions. In the frame of the project, the selection and testing of appropriate sensor configurations and the fusion of data from different sensors (sensor fusion) to improve prediction quality is of major importance. The focus lies on the determination of realistic scenarios and adequate sensor profile requirements. The results of realistic experiments with different sensors will be analyzed and evaluated with regard to their quality. Main aspects are:
– Significance of sensor signals with regard to geology and mechanics
– Improved gathering of information through sensor and network fusion (fusion of different data)
– Identification and handling of errors, outliers and malfunctions
– Evaluation of measurements, data fusion and realization of decision-making concepts with regard to false alarm rate reduction
– Examination and evaluation of the sensor network concerning its applicability as a monitoring tool beyond early warning
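The fusion and false-alarm-reduction aspects listed above can be illustrated with a minimal majority-decision rule, one of the simple exclusion rules mentioned in this project. The threshold values and readings below are purely hypothetical illustration values, not parameters from the SLEWS system.

```python
# Minimal sketch of a majority-vote fusion rule across sensor nodes.
# Thresholds and readings are hypothetical illustration values.

def node_alarm(acceleration_g, tilt_deg, acc_threshold=0.05, tilt_threshold=0.5):
    """A node raises a local alarm if either reading exceeds its threshold."""
    return acceleration_g > acc_threshold or tilt_deg > tilt_threshold

def majority_decision(node_readings):
    """Raise a network-level alarm only if more than half of the nodes agree,
    reducing the false-alarm rate caused by single malfunctioning sensors."""
    alarms = [node_alarm(a, t) for a, t in node_readings]
    return sum(alarms) > len(alarms) / 2

# A single outlier (e.g. a malfunctioning node) does not trigger the alarm:
print(majority_decision([(0.01, 0.1), (0.9, 0.0), (0.02, 0.2)]))  # False

# Consistent readings across several nodes do:
print(majority_decision([(0.08, 0.1), (0.06, 0.7), (0.02, 0.9)]))  # True
```

In practice such a rule would be one leaf of a larger hierarchic decision structure, combined with the regression-tree or fuzzy-logic methods named above.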
In cooperation with the project partners, simple but effective decision-making algorithms will be developed to evaluate the results from experiments. Thus, early warning may be realized under outdoor conditions. For the algorithm development, hierarchic decision-making structures (e.g. simple exclusion rules like »winner takes all« or majority decisions, regression trees) will be used due to their interpretability and feasibility. Alternatively, cluster and fuzzy logic or logistic regression methods will be examined. Data mining methods will be used for data analysis and the parameterization of decision-making processes.
3.1.2. Network and nodes
Modern measurement nodes for wireless sensor networks are built in a strongly modular way. In particular, they provide open interfaces for the integration of very different sensors. Simple temperature and humidity sensors can be integrated as well as high-precision vibration sensors or even GPS modules for exact location determination. Currently available products are mainly experimental platforms, with only a few of them ready for the market. Products that are ready for the market have to pass certifications of regulatory bodies. In particular, the European standard EN 300-220 must be observed for any low-rate radio module that is used outside research laboratories. Certifications also include further market-relevant assertions
Figure 5: ScatterGate, measurement nodes and case in a ruggedized version
like reliability over a wide temperature range from –20 °C to 75 °C. The ScatterNodes of the project partner ScatterWeb follow these requirements and are ready to be used. Sensor data is collected by distributed measurement nodes and forwarded across the wireless network towards a data collection point. This collection point aggregates data and may perform further steps like data preprocessing and compression. Finally, data is
provided via a wide-area network like GSM/UMTS for secured remote access via the Internet. In the ScatterWeb product line, the data collection point is called the ScatterGate. A set of measurement nodes and one ScatterGate in a ruggedized version can be seen in Figure 5. Low energy consumption is a key requirement for long-lived wireless sensor networks. Therefore, modern sensor networks use power-down modes of processor and radio module as much as possible while still providing fast and reliable data transmission. That way it is possible to run an outdoor wireless sensor network for years using standard batteries or, where possible, solar cells. Figure 6 shows the installation of a WSN using components from ScatterWeb for the study of warming effects in the Swiss Alps as part of the project SensorGIS [15].
3.1.3. Wireless sensor network technology in landslide monitoring
Mass movements, e.g. slope movements or rock fall, may be described as complex deformation processes at the surface, resulting in
Figure 6: SensorGIS installation in the Swiss Alps
a down-slope movement. The observation and monitoring of such movements requires a large number of observations (in-situ measured data) and information; only the combination of such information permits an appropriate conclusion about the mechanism and behaviour of the sliding area, resulting in early warning information. A Wireless Sensor Network (WSN) consists of inexpensive computational nodes which gather relevant parameters and transfer this information to a collection point for further processing. WSN nodes are used for sensing the environment while at the same time they communicate with neighbouring nodes and can even perform basic computations on the collected data. The usage of WSNs covers more than 222 general and special applications [1]. The usage of WSNs in engineering geology is still at its beginning, but the following examples show the worldwide interest in their application to engineering geological and other problems in hazard monitoring. One of these projects is the »SENSLIDE« project in India, led by US universities: »SENSLIDE is a distributed sensor system for predicting landslide« [2]. The basic idea of this project is to use a large number of distributed inexpensive single-axis strain gauges connected to cheap nodes. Because of the missing linkage between the sensors and their point-wise measuring method, these sensors only report rock movement at individual locations, without information about the relative motion between the rocks. Thus, »by measuring the cause of the landslide, we can predict landslides as easily as if we were measuring the incipient relative movement of rocks« [2]. Another project uses WSNs to predict the location of the slip surface of a landslide. To deploy the sensors, this project builds a »sensor column« which includes four types of sensors: geophones, strain gauges, pore pressure transducers and reflectometers. The WSN motes function as a telemetric system, transmitting information from the single column into the network.
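The multihop collection scheme described above, in which each node forwards its own measurements and those of peer nodes hop by hop towards the collection point, can be sketched as follows. The node names and the static routing table are hypothetical; a real WSN builds its routes in a self-organizing manner.

```python
# Sketch of multihop forwarding towards a collection point (gateway).
# The topology and node ids are hypothetical illustration values.

# Static routing table: node id -> next hop (None = gateway reached)
next_hop = {"n3": "n2", "n2": "n1", "n1": "gate", "gate": None}

def route(origin):
    """Return the hop sequence a packet from `origin` travels to the gateway."""
    path, node = [origin], origin
    while next_hop[node] is not None:
        node = next_hop[node]
        path.append(node)
    return path

def collect(measurements):
    """Gateway view: aggregate (node, value) packets arriving over multihop routes."""
    return {node: {"value": value, "hops": len(route(node)) - 1}
            for node, value in measurements}

data = collect([("n3", 0.04), ("n1", 0.02)])
print(route("n3"))          # ['n3', 'n2', 'n1', 'gate']
print(data["n3"]["hops"])   # 3
```

In a deployment such as the ScatterWeb one, the gateway role is played by the ScatterGate, which may additionally preprocess and compress the aggregated data.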
The project »Wireless Sensor Networks with Self-Organization Capabilities for Critical and Emergency Applications (WINSOC)« aims, in one of its subprojects, to use WSNs to detect rainfall-induced landslides and lahars in India. This is a relatively large project with 11 partners in 7 countries [4]. The project aims at the development of WSNs for early warning systems, including hardware and software development under consideration of biologically inspired information processing. The main idea here is to use large numbers of inexpensive sensor nodes which collect data and process them in reasonable local decision-making processes [4]. The projects described above are all at the first stages of realisation. Different strategies for applying WSNs to hazard observation and early warning systems are under consideration, from their single use as a telemetric system to integrated information processing networks. This shows that the application is still at an early stage but has great potential, especially as WSN technology seems to be reaching a good level of maturity. In particular, the evolution of micro-sensors, e.g. from the automotive sector, and the application of sensor fusion offer great potential.
3.2. Information system
3.2.1. State of the art
Studies and pilot applications, in particular within the Open Geospatial Consortium (OGC), have shown that the classic technologies (measurement systems, simulation/calculation models, database systems, mapping and geographic information applications) should be extended or replaced by interrelated infrastructure-oriented technologies which may be described using the keywords Open Web Services, Sensor Web and Web Processing Service (WPS). These are grounded in a web-service paradigm and have been developed in the context of OGC Spatial Web applications. This service-oriented technology is highly interoperable overall and more capable of being coupled together than the classic monolithic technologies.
Service implementations are made available on the internet as cascading and self-describing processes. This means that a service may be called not only directly by a human user, but also from another service. Each service may describe what operations are executable, what inputs are required and what outputs are possible, in both human- and machine-readable form. Whole production chains for the preparation of information may be realised using these chained or cascading services. Because each service is capable of describing its own operations, the cascade may be formed not only as a linear structure but also as a complex and partially self-organising network. These services therefore already implement structural elements of the semantic web, the successor of the WWW. Web services isolate single tasks of spatial information management and implement these as separate modules within a single architecture. These modules take the form of self-contained services which are capable of intercommunication via standardised web interfaces. Such OGC Web Services (OWS) form the basis of spatial data infrastructure (SDI) projects on the international (GSDI) and national (GDI-DE) level, as well as in almost all federal states (e.g. GDI-NRW, GDI GeoMV). Currently the most widely implemented and sophisticated services are the Web Map Service (WMS), Web Feature Service (WFS) and Web Coverage Service (WCS). Further services are e.g. the Web Catalog Service (CSW) and the Web Coordinate Transformation Service (WCTS). OGC specifications for Sensor Web Enablement (SWE) and WPS are also available, some currently in draft form, others already approved. These broaden the OWS to include sensors and models as well as processing and analysis functionality. Initial pilot studies, such as the OGC Web Services Testbed Phases 3 and 4, have indicated that the development of the concept has reached an application-ready stage.
3.2.2. Information system architecture
The prototype of a Spatial Data Infrastructure (SDI) in the context of an alarm- and early
warning system will be orchestrated from a number of services described by the OGC. Services of the OGC SWE family will be applied, such as the Sensor Observation Service (SOS) and the Sensor Alert Service (SAS), accompanied by the use of associated specifications such as the Sensor Model Language (SensorML) and the Observations & Measurements Schema (O&M), and other services for the use of mapping data (WMS), metadata (CSW) and vector and raster data (WFS, WCS). Another fundamental task will be the implementation of analytic processes and calculations forming the business logic, represented in self-contained web-based services (WPS). Early drafts divide the information infrastructure into three main parts: sensor-side logic, server-side logic and client-side logic. The sensor-side logic contains process chains to survey and accumulate sensor data, representing the underlying sensor network to the outside via SOS et al. at a level of abstraction still to be defined. The server-side logic embodies the infrastructure core by providing process chains for harvesting, analysing and visualizing sensor data and sensor parameters under consideration of geodetic and engineering-geological expertise. The client-side logic will provide a combination of portal and catalogue services allowing decision-makers to research and visualise security-relevant data from landslide-endangered areas. Visualisation techniques will be designed to take the user's knowledge of the field into account and will range from detailed output to fuzzy graphical warning elements. Additional synchronous and asynchronous information services such as the SAS and the Web Notification Service (WNS) can provide defined users and user groups with time-critical readings from the observation site.
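As a small illustration of how a client might retrieve readings from the SOS mentioned above, the following sketch builds a key-value-pair GetObservation request as defined by the OGC SOS 1.0 specification. The endpoint URL, offering name and observed-property URN are hypothetical placeholders, not identifiers from the SLEWS infrastructure.

```python
# Sketch of an OGC SOS 1.0 key-value-pair GetObservation request.
# Endpoint, offering and observed property are hypothetical placeholders.
from urllib.parse import urlencode

def get_observation_url(endpoint, offering, observed_property):
    params = {
        "service": "SOS",
        "version": "1.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
    }
    return endpoint + "?" + urlencode(params)

url = get_observation_url(
    "http://example.org/slews/sos",           # hypothetical SOS endpoint
    "LANDSLIDE_SITE_1",                       # hypothetical offering
    "urn:ogc:def:property:acceleration")      # hypothetical observed property
print(url.startswith("http://example.org/slews/sos?service=SOS"))  # True
```

Because the interface is standardised, the same request shape works against any compliant SOS, which is exactly the interoperability argument made above.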
Although there is a certain predetermined process chain between measuring sensor data and formulating conclusions for information purposes, the open and standardised architecture places no constraints on using the described services in an arbitrary manner. Enabled through standardised software service interfaces, experienced users can employ the provided services
Figure 7: Information system service infrastructure: Sensor-side logic: (a) business logic (b) SOS (c) SAS/WNS; Server-side logic: (d) business logic/WPS (e) data base system (f) WMS/WFS/WCS; Client-side logic: (g) visualization services (h) CSW (i) viewing/portal services and applications
to fit their needs using a wide range of existing compliant software products.
3.2.3. Positioning and localization
One aim of the project »SLEWS« is to develop low-cost methods and techniques for early warning systems against landslides. There are possibilities to use special sensors such as inclinometers, pressure sensors, hygrometers, humidity sensors or linear position sensors. These sensors have high accuracy, for example better than one millimetre; price and accuracy stand in direct relation. All these sensors are local sensors providing output about local phenomena. Classical surveying methods such as the Global Positioning System, terrestrial measurement and remote sensing are established means to obtain information about movements across an area of unstable land. Although such methods have a high accuracy, their application is very expensive, and they will therefore only be used in areas where the potential costs of a landslide are much higher. All these methods need a direct line of sight between sensors and reflectors and/or receivers. This direct visible connection can be obstructed, for example by vegetation, and is not always assured. Furthermore, the total cost of the measurements increases with a decreasing interval between epochs. Many observation points must be defined to cover the area; the outcome of this is a local geodetic network. Geodetic networks can be measured with the help of electro-optical range finding and theodolites. The accuracies
of electro-optical range finding today are at the millimetre level. The accuracies of the best theodolites today are under one milligon (3 arc seconds). The reference points of this local network will be realised with differential GPS; accuracies of a few millimetres are possible. It is also possible to measure distances with other methods such as radio technology. One of the newest methods in this domain is WLAN (WLAN Indoor Positioning System). Tests of WLAN indoor positioning systems show positioning accuracies between 1 and 3 metres. For this reason, this method is unsuitable for monitoring landslips. To obtain acceptable information about landslips, the measuring methods must be accurate to one centimetre. Therefore, ultra-wide band (UWB) technologies, which promise higher accuracies (sub-centimetre), may be applicable. The following paragraphs detail some important results of studies concerning UWB. Radio systems use electromagnetic waves with different frequencies to send information. One of these systems is the narrow-band system, which is also used in electro-optical range finding for surveying. The distance is calculated using the phase difference between sender and receiver; the accuracy in this case is better than one millimetre. To use this system it is necessary to have a direct view between sensor and reflector. Multipath effects are a common cause of error in this method. The next system is the broad-band system, also known as »Multi Band Orthogonal Frequency Division Multiplexing«. In this case the information sequence is modulated onto a carrier frequency, and the receiver demodulates the information from the carrier frequency. This system is used for example in GPS (C/A code). Multipath effects are also a problem for this method. Ultra-wide band can send information with or without a carrier frequency. The absolute bandwidth is more than 500 MHz; the relative bandwidth is at least a quarter of the centre frequency. The frequency ranges between 0.1–0.96 GHz and 3.1–10.6 GHz, but the information is sent with low power (max. –41.3 dBm/MHz). The maximum range for this method is therefore about 50 metres. To send information without a carrier frequency, the information must be discretely encoded as binary data. Due to the high frequency and very short electro-magnetic pulses, a high data rate (theoretically more than 1 GBit/s) can be achieved. This standard was developed in two working groups at the Institute of Electrical and Electronics Engineers (IEEE): group 802.15.3 developed standards for the pulse method (Direct Sequence UWB, DS-UWB) and group 802.15.4 developed standards for the system with a carrier frequency (Multi Band Orthogonal Frequency Division Multiplexing, MB-OFDM). In 2002 the Federal Communications Commission (FCC) released UWB licences for free use. Two European working groups at the European Telecommunications Standards Institute (ETSI) have developed standards for the European market (ERM/TG31A, ERM/TG31B). BLANKENBACH et al. [6] carried out tests with UWB for distance measurement. The tests took place outdoors with a direct line of sight, and the results were compared with known true distances. A number of distances were each measured 100 times; it was possible to measure distances of up to 50 metres. BLANKENBACH et al. give 7 cm as the standard deviation of a single observation and 1 cm as the standard deviation of the arithmetic mean.
Differences between the true and observed values due to systematic errors were less than 10 centimetres.
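The two reported standard deviations are consistent with simple averaging of repeated measurements: for n independent repetitions, the standard deviation of the arithmetic mean is the single-observation standard deviation divided by the square root of n.

```python
# Worked check of the reported UWB figures: sigma_mean = sigma_single / sqrt(n)
import math

sigma_single = 7.0   # cm, standard deviation of one observation (BLANKENBACH et al.)
n = 100              # each distance was measured 100 times

sigma_mean = sigma_single / math.sqrt(n)
print(sigma_mean)    # 0.7 (cm), in line with the ~1 cm reported for the mean
```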
The most important advantages and disadvantages are as follows:
Advantages
– very short electro-magnetic pulses (< 2 ns), hence no multipath effects
– signals penetrate materials
– UWB signals are mostly not distinguishable from background noise for other radio receivers because of the restricted transmission power
– UWB systems do not need a carrier frequency
– the installation and the hardware are easier and cheaper compared to systems with a carrier frequency
– lower power requirements compared to other systems with a carrier frequency (for example WLAN)
– no direct line of sight is needed
Disadvantages
– the maximum range is about 50 m
– UWB transmitters may interfere with other frequencies
Some questions remain currently unanswered, such as:
– Are there instruments for commercial use?
– How much will they cost and what is their accuracy?
– Who sells such systems?
UWB is a new technology which can be used to transmit data at high rates. It is also possible to use UWB as a range-finding method. The accuracy is high enough to consider its use in landslide monitoring.
3.2.4. Work progress
Recent progress has been made in designing a generic, extensible data model for gathering positioning data. At this point the data model contains the geodetic information necessary for determining angles and distances between observation points, information which is used to express possible surface deformation processes. Future data sets can be fitted with an arbitrary number of additional information attributes. A data profile has been implemented in the spatial database
management system PostGIS, and data sets can be visualised using a WMS provided by UMN MapServer. The data profile has been filled with real-life sample data to test first application scenarios. Future steps will address the first deployment of OGC SWE services, for example describing preliminary sensor setups in SensorML and making remotely simulated sample data available via SOS.
3.3. Warning and end user
Early warning systems require very complex information and communication structures. They need to interface sensor technology, data management with database management systems, data preparation and analysis, prognosis and model calculations, computer networks and communication terminal units, and combine these into a processing and communication tool. Studies and pilot trials, particularly within the context of the Open Geospatial Consortium (OGC), have shown that classic technologies are being replaced by interrelated, infrastructure-focused technologies. They are all based on the web-service paradigm and were developed within the context of the OGC's Spatial Web approach. This technology has far greater networking and interfacing capabilities than the classic technologies. Initial pilot tests indicate that the concept has reached a level at which real-world implementation is possible. It remains to be evaluated how far the developed concepts can be implemented in practice and which modifications and changes are required in order to use the new technology successfully for early warning purposes.
In this joint project the Federal Institute for Geosciences and Natural Resources (BGR) Hannover is embedded as a multiplier to the community as well as a potential user of an early warning system for landslide hazards. The involvement of potential users at this early stage of the project ensures the development of a high-quality, re-usable and portable information system for application in the field of early warning. Potential users may be dam operators who monitor unstable reservoir banks (e.g. the Thüringer Talsperrenverwaltung, Gabel slope) or the German Rail Company (DB), which needs to maintain the stability and safety of its rail network (e.g. rockfall hazards on the Rheintal route). Other potential users are operators of mountain railways and lifts in the Alpine region, where repeated thawing of permafrost can destabilise the subsoil. A second relevant group of users are those responsible for isolated sensor networks or early warning systems whose applications might feed data into the system. Examples of these are dam operators, mining companies (RWE Power), the BGR and even participants of the BMBF programme (GFZ, AWI). Besides these potential users, operators of early warning systems for other natural hazards related to mass movements will have an interest in this study. Examples are the measuring network and early warning systems of the German Weather Service or the flood forecasting centre of Baden-Württemberg, which operates a measuring network for rainfall and river gauges related to flood forecasts. The project results are planned to comply with open international specifications to ensure
Figure 8: Digital elevation model sample data (left), PostGIS profile (center), visualization via WMS (right)
their availability to a wide community. Once the WPS specification of the OGC is adopted, it could advance to an international standard within the scope of this project. Among the project goals are the determination of user needs and the development and configuration of end-user-friendly early warning systems. Finally, the integration of the project results into national, European and international programmes should be realised following international standards. An important aspect of the development of information systems is to observe standards in order to achieve transferability to other problems and the reuse of developed technologies in future applications. The results have to be integrated directly into open and international specifications in order to make them available to a broad community. Development has to be intensified to establish deterministic prediction systems that supply early warning systems with reliable data based on only a few determinable parameters. This is important because monitoring systems tend to thin out over time due to the loss of hardware components.
4. Conclusion
This project provides a substantial contribution to the development of methods and technologies for early warning with regard to mass movement events. Its interdisciplinary constitution makes it possible to address the complex problem of effective early warning by combining multiple approaches. Highly complex events like slope failures depend on multiple sensitive parameters (e.g. moisture and the stress state in an endangered area). The combination of small and precise measuring sensors, such as tiltmeters or displacement transducers as used in automotive technology, with a self-organising monitoring system in a network permits the development of a real-time monitoring system that is suitable for the detection and observation of mass movements.
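The conclusion above describes nodes that preprocess readings locally and fuse them across the network before raising a warning. A much-simplified sketch of that idea (the threshold, smoothing window and quorum are invented placeholder values, not project parameters):

```python
TILT_THRESHOLD = 0.5   # degrees; invented placeholder value
QUORUM = 2             # invented: number of neighbours that must agree

def local_alarm(tilt_readings, threshold=TILT_THRESHOLD):
    """Node-local preprocessing: average the last few tilt readings and
    compare against a threshold, instead of forwarding every raw sample."""
    window = tilt_readings[-5:]
    return sum(window) / len(window) > threshold

def network_alarm(node_flags, quorum=QUORUM):
    """Simple sensor fusion: raise a network-level warning only when at
    least `quorum` adjacent nodes agree, suppressing false positives
    from a single faulty or disturbed sensor."""
    return sum(node_flags) >= quorum
```

The quorum rule is the simplest possible fusion scheme; the point is that a single noisy node cannot trigger a warning on its own.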
Modern sensor networks such as ad hoc wireless sensor networks have the advantage over currently existing landslide monitoring systems that they can be deployed very flexibly and are quite simple to install. Furthermore, the short-distance communication between adjacent nodes (still up to 1 km free-space communication distance) allows cutting down costs and administrative effort. In addition, as the nodes are directly connected with each other, local preprocessing of the data can be applied, and sensor fusion techniques can minimise false positives. The planned geoservice infrastructure will involve sensors, geodata, up-to-date information and communication, as well as methods and models to estimate parameters relevant to the stability of landslides. The processing of these geodata with capable procedures and algorithms will lead to better predictions and early warnings. In these respects, this project contributes important improvements to geoservices by developing geoinformatics methods. The new approach provides the possibility for any involved institution to get user-adapted information at a very early stage, independent of a hierarchical information structure. The service-oriented infrastructure closes the gap between data gathering and information retrieval using services such as SAS and WNS. The intermediate results of the project are linked to the feedback of potential users, which will be integrated into the current project via workshops and interviews. Improvement of user interfaces and the possibility of model and prognosis integration are of main interest and will lead to the development of new standards.
5. References
[1] K. Sohraby, D. Minoli and T. Znati (2007): Wireless Sensor Networks: Technology, Protocols, and Applications. John Wiley & Sons, Inc.
[2] A. Sheth, Ch. A. Thekkath (2005): SenSlide: A Sensor Network Based Landslide Prediction System. SenSys '05, Nov 2–4
[3] A. Terzis, A. Anandarajah, K. Moore and I. Wang (2006): Slip Surface Localization in Wireless Sensor Networks for Landslide Prediction. IPSN '06
[4] http://www.winsoc.org/keyfact.htm
[5] Retscher, G.; Moser, E. (2007): Genauigkeits- und Leistungstest eines WLAN Indoor Positionierungssystems. zfv – Zeitschrift für Geodäsie, Geoinformation und Landmanagement, Vol. 132, No. 1, pp. 4–10
[6] Blankenbach, J.; Norrdine, A.; Schlemmer, H.; Willert, V. (2007): Indoor-Positionierung auf Basis von Ultra Wide Band. AVN – Allgemeine Vermessungs-Nachrichten, May 2007, pp. 169–178
[7] IMST GmbH: UWB – Ultra Wide Band. URL: http://www.imst.de/de/funk_wir_uwb.php. Last accessed 06.07.2007
[8] Bartels, O.: UWB: Störer oder Helfer? URL: http://www.heise.de/mobil/artikel/57048 (04.03.2005). Last accessed 06.07.2007
[9] ITWissen: UWB (ultra wideband). URL: http://www.itwissen.info/index.php?aoid=13589&id=31. Last accessed 06.07.2007
[10] CIO: Trapeze offers 802.11n draft access point. URL: http://www.cio.de/news/cio_worldnews/838319/index1.html (19.06.2007). Last accessed 06.07.2007
[11] Macwelt: 480 Mbit/s: Ultra-Breitband-Chip für Mega-Content-Handys. URL: http://www.macwelt.de/news/netz/443866/index.html. Last accessed 06.07.2007
[12] ELKO: UWB – Ultra-Wideband Wireless. URL: http://elektronik-kompendium.de/sites/kom/1010131.htm. Last accessed 06.07.2007
[13] Sikora, A.: Ratgeber: Funknetzwerke daheim. URL: http://www.pcwelt.de/start/computer/netzwerk/praxis/85162/. Last accessed 06.07.2007
[14] Macwelt: ZigBee-Standard verabschiedet: Revolution der Fernsteuerung? URL: http://www.macwelt.de/news/software/335987/index.html. Last accessed 06.07.2007
[15] http://www.sensorgis.de
Integrative Landslides Early Warning Systems (ILEWS) Glade T. (3), Becker R. (6), Bell R. (3), Burghaus S. (2), Danscheid M. (2), Dix A. (1), Greiving S. (7), Greve K. (2), Jäger S. (5), Kuhlmann H. (2), Krummel H. (4), Paulsen H. (8), Pohl J. (2), Röhrs M. (1) (1) University of Bamberg (2) University of Bonn (3) University of Vienna (4) geoFact (5) geomer (6) IMKO (7) plan + risk consult (8) terrestris
Introduction
Early warning of landslides is a challenging topic because the possibilities of prediction vary significantly. A distinction is not only limited to different types of landslides; it also comprises the differentiation between reactivated and new movements. Due to their previously displaced mass, reactivated landslides are relatively easy to identify, and, if required, appropriate monitoring instruments can be installed. The location of new landslides is, by contrast, rather difficult to predict. The aim is to design and implement an integrative early warning system for known (reactivated) and new landslides and debris flows, which provides information on future events with regard to local and regional requirements. The methodical configuration of the early warning system is designed to be transferable and modular, i.e. it can be adapted to local structures of different countries, and it can be customised to other natural processes (e.g. rockfalls). Key project targets are:
– Formulation of an integrative early warning concept for landslides
– Investigation and installation of an adapted early warning system
– Monitoring, parametrisation and modelling of local data
– Development of risk management options
– Linkage of local findings to regional modelling
– Provision of information necessary for early warning
– Integration of warning into the respective social processes of decision making, strengthening the awareness of underestimated risks that are associated with landslides
– Transfer of the concept to an area with many already existing monitoring stations which are not yet networked in the above-mentioned manner.
Within the scope of the submitted cooperative project, early warning systems are to be implemented in two European test areas.
1. Swabian Alb in Germany
The study area is a settlement area on a historically active complex rotational slide. Regularly recurring damage to houses and inclinometer measurements show the reactivation of at least parts of the slide. The landslide body is already under investigation within the InterRISK research project, so that the existing infrastructure as well as valuable data can be accessed and existing knowledge can be built upon.
2. South Tyrol in Italy
Research focuses on a debris flow in Nals, which is already equipped with an early warning system, and the Corvara Landslide, a well-investigated complex rotational slide flow which poses a threat to the village of Corvara. The monitoring infrastructure and long measurement series of the debris flow and the Corvara Landslide are provided for the project free of charge. In a first step it is intended to implement an ideal type of early warning system in the study site of the Swabian Alb – from sensor to recommendation of actions. In a second step it is attempted to apply findings and methods from this early warning system to the second study site in South Tyrol. The early warning system consists of three clusters: Monitoring, Modelling and Implementation. The Cluster Monitoring is designed to realise a robust and easy-to-handle online measurement system for comprehensive monitoring of landslides. It provides the basis for an integrative early warning system. The system consists of a new sensor combination for the fields of meteorology, soil hydraulics and soil mechanics with the corresponding innovative data transmission technologies. The broad spectrum of observed processes contributes to a better understanding of process interdependencies. Hence, the evaluation of the risk of critical landslides can be significantly improved. Measurement and data transmission are automated. Data is provided in a standardised form by a central measurement database for further processing (CoreSDI, Setup Monitoring, SensorGIS). Important design criteria for the survey system are the ability to operate on different scales and to be transferable. Using adjusted sensor systems for various area sizes, regions and processes, it must be possible to draw regional conclusions. The main aim of the Cluster Modelling is the conversion of all continuously and periodically gathered information into a reliable and efficient early warning.
Therefore, different new models will be applied (Movement analysis Early Warning Model, Physically based Early
Warning Model) and an already existing model (Real-Time Early Warning Model for Debris Flows) will be integrated. The core of the Movement analysis Early Warning Model (Geomorphic Modelling, Geodetic Modelling) is the trend analysis of the reciprocal landslide movement rate. Linear trends give an early indication of the time of a catastrophic landslide event. This model is complemented by the integration of the continuously measured field parameters into permanent slope stability modelling. The latter results in a permanently updated safety factor (Physically based Early Warning Model). The combination of both early warning models should lead to a more reliable early warning. Regarding debris flows, an existing Real-Time Early Warning System will be used and optimised. The disadvantage of the existing system is the short warning lead time of only a few minutes. It is intended to extend this time span by integrating the weather forecast. Particularly important is the design of a structure that remains fully functional even if one or more system components fail. The Implementation of an early warning system is a multistage process. It starts with the definition of protection goals – differentiated for the various subjects of protection and their respective damage potential (economic values, critical infrastructure, people, environment). The defined protection goals are then an important basis for the decision-making of the Clusters Monitoring and Modelling. The scientific analysis is an essential part of an integrated early warning system. However, the question of which risks are acceptable or tolerable is a normative one, i.e. it has to be answered by the authorised players. Thus, a cooperative risk communication (Communication) is integrated into the cooperative project, identifying the configuration of the involved actors and the communication relationships between them.
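The trend analysis of the reciprocal landslide movement rate described above can be sketched in a few lines: fit a straight line to 1/v over time and extrapolate it to zero, which marks the predicted time of catastrophic failure. This is a deliberately minimal illustration, not the project's actual model:

```python
def predict_failure_time(times, velocities):
    """Fit a straight line to the reciprocal movement rate 1/v over time
    (least squares) and extrapolate it to 1/v = 0, which in the
    progressive-failure interpretation marks the time of failure."""
    inv_v = [1.0 / v for v in velocities]
    n = len(times)
    t_mean = sum(times) / n
    y_mean = sum(inv_v) / n
    sxy = sum((t - t_mean) * (y - y_mean) for t, y in zip(times, inv_v))
    sxx = sum((t - t_mean) ** 2 for t in times)
    slope = sxy / sxx
    if slope >= 0:
        return None                    # no accelerating trend, no forecast
    intercept = y_mean - slope * t_mean
    return -intercept / slope          # time where the fitted 1/v hits zero
```

With synthetic accelerating movement, e.g. v(t) = 1/(100 - t), the fitted line reaches 1/v = 0 at t = 100, the forecast failure time; a decelerating slope yields no forecast.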
Within the framework of information management, the respective results will form the basis for the development of a communication network towards and between the involved actors of an early warning chain.
The described project ILEWS is divided into ten subprojects. Five of them are executed by universities, the others by companies. The aims of the different subprojects are explained on the following pages.
Subproject: Monitoring of landslide movement and Early Warning Modelling
Thomas Glade & Rainer Bell, University of Vienna
Within this subproject shallow translational slides, deep-seated rotational slides and debris flows will be investigated. Subsurface landslide movements are monitored periodically using a mobile inclinometer device and continuously with a permanently installed inclinometer chain. The results obtained complete the assessment of relevant early warning parameters for the process type »slide« within the Cluster Monitoring. All gathered monitoring data of the Swabian Alb and supplied data of South Tyrol are integrated for a user-optimised and reliable early warning message. The early warning modelling concept uses a physically based »Near Real-Time« Early Warning Model, a Surface Movement Analysis Early Warning Model (with the subproject »Geodetic Modelling«) and Regional Early Warning Models. Within the physically based »Near Real-Time« Early Warning Model the continuously measured data, especially soil moisture and rainfall (provided by the subprojects »Moisture Geoelectric« and »Setup Monitoring«), will be integrated into equations to calculate slope stability. Thus, a continuous safety factor can be calculated (the respective WebGIS application will be programmed by the subproject »InfoManagement«). If the safety factor falls below a specified threshold value, preliminary warning messages are provided in the WebGIS and SMS are sent to the scientists of the subproject (by the subproject »SensorGIS«) to check and validate the warning using sophisticated slope stability models as well as current subsurface movements. The slope stability software will also be applied to calculate highly likely sliding circles for rotational slides
and to identify the best-suited slope stability equations. The results are then again input for the physically based »Near Real-Time« Early Warning Model. At the end of the optimisation period an autonomously running early warning model will be set up, which is to some degree self-controlling and only needs to be supervised by experts. Within the Surface Movement Analysis Early Warning Model all measured movement rates (inclinometer, inclinometer chain, tachymetry, GPS – with the subproject »Geodetic Modelling«) are analysed using the approach of »progressive failures« (Petley et al. 2005). Depending on how the movement rates change, the catastrophic failure of a slope can be predicted. In a last step, it is investigated whether both models can automatically support each other. At the Reisenschuh landslide in South Tyrol it is tested whether the early warning models developed for slides in the Swabian Alb can be transferred to other study areas which are already partly equipped with measurement instruments. Regarding the already existing Debris Flow Early Warning System, the scientific basis as well as the extensive measurement series will be analysed in detail, on the one hand to suggest possibilities for improving the system. On the other hand, important insights towards the set-up of a Regional Early Warning Model (with the subproject »History«) are expected. Local Early Warning Modelling will also provide information on the magnitude of potential events. To better define the endangered areas depending on the type and magnitude of the process, empirical or physically based run-out models are applied. Regional Early Warning Models for slides and debris flows are based on the regionalised critical local conditions. Regarding slides, GIS-based models will be applied which calculate the threat due to translational slides on the basis of the »infinite slope model«.
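The »infinite slope model« named above admits a compact closed form for the factor of safety. The sketch below shows one common formulation together with a hypothetical warning threshold; all parameter values are invented for illustration, and an operational system would calibrate its own thresholds:

```python
import math

def infinite_slope_fs(c_eff, phi_deg, gamma, z, beta_deg, m, gamma_w=9.81):
    """Factor of safety of an infinite translational slide.
    c_eff    effective cohesion [kPa]
    phi_deg  effective friction angle [degrees]
    gamma    unit weight of the soil [kN/m^3]
    z        depth of the slip surface [m]
    beta_deg slope angle [degrees]
    m        saturated fraction of the slip depth (0..1)
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resisting = c_eff + (gamma - m * gamma_w) * z * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

WARN_THRESHOLD = 1.3   # invented placeholder value

def check_slope(fs, threshold=WARN_THRESHOLD):
    return "preliminary warning" if fs < threshold else "stable"
```

For an invented slope (c' = 5 kPa, phi' = 30°, gamma = 19 kN/m³, z = 2 m, beta = 25°), the safety factor drops from about 1.58 when dry (m = 0) to about 0.94 when fully saturated (m = 1), crossing the warning threshold as the continuously measured moisture rises.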
For debris flows, a statistical model will be applied which transfers the results from the extensive measurement series of one debris flow track to other catchments. The integration of the official
weather forecast into local and regional early warning models will lead to an extension of the warning time. A comparative analysis of all applied early warning models will give important information on the suitability of each approach for different conditions. Historical frequency-magnitude analyses based on the data provided by the subproject »History« will help to estimate the recurrence interval of landslides of various magnitudes more reliably and to better interpret local measurement results. The GIS-based combination of early warning modelling and damage estimation (with the subproject »Management«) will provide very helpful information for decision makers when choosing the best-suited system for risk mitigation and reduction. In cooperation with the Cluster Implementation, a user-optimised preparation and provision of the early warning message will lead to improved reaction capabilities of the affected people.
References
Petley, D.N., Higuchi, T., Petley, D.J., Bulmer, M.H. and Carey, J., 2005. Development of progressive landslide failure in cohesive materials. Geology 33(3): 201–204.
Subproject: Coordination, Integration and Optimisation of a Multi-Sensor System for Monitoring of Landslides
Rolf Becker, IMKO Micromodultechnik, Ettlingen
The most important objective of the »Setup Monitoring« project within the collaborative research project is the coordination and hardware-related integration of heterogeneous field sensors into a unified, robust and simple-to-use measurement system, as a stand-alone component of an extensive landslide early warning system. The innovations of the monitoring system are its particular measuring procedures and integrated data recording mechanisms, which will be specifically adapted to the early warning of rotational slides. The system includes sensors
for determining the load (meteorological data), the inner status of the vadose zone (water content, soil suction head and pore water pressure) and the system response (assessed by monitoring kinematics using GPS, tilt meters, inclinometers and extensometers) of a slipping body. Conventional sensors as well as novel measurement procedures will be used. The time-variant soil water dynamics are the key factor governing the current geomechanical stability of a slipping body. Determining the soil moisture in clay or highly electrically conductive soils is a technological challenge due to energy dissipation during the measuring procedure. The measuring principle Time-Domain Reflectometry (TDR) is less prone to these effects and thus especially suited for this particular application. TRIME-TDR sensors by IMKO supply reliable measurements even in difficult soils, and are therefore used as the solid backbone of soil water measurements related to landslides. The novel stationary geoelectric system from the project partner geoFact is one of the first capable of continuous monitoring and will be a key component of the integrated monitoring system. The scale transition from point to slipping body is achieved by the extended two- or three-dimensional conductivity fields resulting from geoelectrics, in combination with the pointwise soil moisture measurements used for calibration. The subproject will support geoFact in developing a procedure for upscaling soil moisture. The »Spatial TDR« method, currently being developed at several German universities and research institutes, allows the determination of water content profiles along elongated sensor cables several metres in length. However, this procedure requires a large mathematical effort to analyse the signals and locally does not achieve the same accuracy as conventional TDR sensors.
As part of the project it will be tested whether a combination of Spatial TDR and standard TDR sensors provides a significant information gain concerning infiltration, perched ground water tables and slope seepage.
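The scale transition described above, calibrating the areal geoelectric conductivity field against pointwise TDR soil moisture, could in its simplest form be a linear fit at the co-located points, applied to the whole section. Real petrophysical relations are nonlinear, so the following Python sketch is only a schematic illustration with invented values:

```python
def calibrate(tdr_moisture, conductivity_at_tdr):
    """Least-squares fit theta = a * sigma + b from co-located point
    measurements (TDR soil moisture vs. geoelectric conductivity)."""
    n = len(tdr_moisture)
    x_mean = sum(conductivity_at_tdr) / n
    y_mean = sum(tdr_moisture) / n
    sxy = sum((x - x_mean) * (y - y_mean)
              for x, y in zip(conductivity_at_tdr, tdr_moisture))
    sxx = sum((x - x_mean) ** 2 for x in conductivity_at_tdr)
    a = sxy / sxx
    b = y_mean - a * x_mean
    return a, b

def upscale(conductivity_field, a, b):
    """Apply the point calibration to a whole 2-D conductivity section,
    turning it into a soil moisture field for the slipping body."""
    return [[a * sigma + b for sigma in row] for row in conductivity_field]
```

The TDR points thus anchor the absolute moisture values, while the geoelectric section contributes the spatial coverage.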
Another forward-looking aspect of the system integration is the fusion of IMKO's well-established sensor technology for recording environmental variables with novel self-organising wireless networks, which will be installed and operated by the subproject »SensorGIS«. A small number of the sensors for vadose zone monitoring will be taken out of the previously built cabled field bus system and integrated into the wireless network. The wireless network nodes from ScatterWeb have a variety of interfaces to connect the sensors. Here too, hardware and software adjustments will probably be required to match the different interfaces. Robustness, prevention of downtime and energy supply for the planned sensor network are important aspects of the joint investigations. These issues are decisive criteria for future applications of wireless sensor networks in environmental monitoring.
Subproject: Cooperative Risk Communication
Marco Danscheid & Jürgen Pohl, University of Bonn
The subproject »Communication« has two superordinate aims: one is the clarification of general local and regional needs; the other regards the cooperative implementation of early warning systems with both the affected players and the other subprojects. As a logical starting point, the work is oriented towards the end users' needs. These shall be assessed in detail right from the beginning of the project, and they shall influence the design of the Early Warning System that is to be developed. It is of decisive importance to identify the information required and how it should be presented. More pointedly, do the end users want a »flashing red light«, or do they prefer more detailed information, which requires them to come to a decision? Questions are: Which assets worth protecting exist at the localities? Which concrete need for action does that imply? Which kind of early warning information do the players need?
Wherever people are involved, one can speak of social systems. Every social system (e.g. companies or public authorities) has specific logics and associated languages. To investigate these is an important aim of this project. Since there are different logics and languages in different social systems, it is important to take this into account for risk communication. It is essential to develop showcase translations to enable different social systems to communicate. Questions are: How can it be guaranteed that timely communicated early warnings will be transformed into adequate reactions? How can an early warning system be conceived that is accepted by the end users? Examining other research projects on »early warning of landslides«, it becomes obvious that social scientific components are either completely neglected or only implemented as a simple communication module. These modules are often only used to communicate natural scientific results to the players. A cooperative implementation of the Early Warning System, in which players – both end users and consultants – are taken seriously right from the beginning, does not yet exist, at least not as it is specified in the plans for the subproject »Communication«. The subproject »Communication« would like to break new ground by penetrating the complexity of social actor systems with the help of cooperative interviews and by developing sensible solutions in collaboration with the players. The aim of this approach is to contribute to an early warning system which works even in the ultimate consequence – and in doing so saves lives. The cooperative implementation of the Early Warning System stretches across the whole project time. A main focus of the subproject »Communication« is to conduct qualitative interviews with the involved players and to analyse the findings. Based on the analyses, showcase translation schemas are developed.
They shall help to improve the communication between different actor systems or even to enable communication for the first time. The question is, how
does the communication of a player have to be formed so that other players perceive it and acknowledge it as relevant.
Subproject: Historical comparative regional analysis of frequency and magnitude of landslides
Andreas Dix & Matthias Röhrs, University of Bamberg
The subproject's objective is to develop methods for monitoring the frequency and magnitude of landslides through history. This is to be conducted in two landslide-prone regions, based on research in the Swabian Alb and compared with South Tyrol. Based on the results and experiences gained, conclusions shall be drawn as to how historical analysis can in future be integrated into an early warning system which is as effective as possible. Historical data plays a decisive role in the complex chain of early warning and risk communication in at least two system areas: 1. Current knowledge of the spatio-temporal distribution of past landslide events is very incomplete. The effectiveness of an early warning system, however, depends upon the quality of data, especially on the frequency and magnitude of events. It is therefore necessary to use all existing data pools to implement an early warning system, which includes the historical information preserved in the archives. Systematic research and indexing of the historical material thus helps to improve the knowledge of the total distribution of landslides. Only by establishing event series whose spatial and temporal resolution is as high as possible can conclusions regarding risk zones and expected distributions of future events be drawn. Along with the measurements of current conditions, these form the basis of a meaningful early warning system. This data can also be used to check existing data for its representativeness and validity. 2. Apart from scientific risk analysis, an early warning system is only effective if it is based on a high public risk awareness. This knowledge, on the other hand, can essentially only be based on past events. Therefore historical knowledge seems fundamental for the successful implementation of an early warning system. This knowledge also includes the handing down of the perception and handling of these specific natural hazards through history. The question of what and how much of this knowledge was actually handed down can be adequately reconstructed by analysing the archives. Data acquisition is to be carried out comparatively, in cooperation with the subprojects from the Monitoring, Modelling and Implementation clusters, in two sufficiently different test regions, the Swabian Alb and South Tyrol. In a first work step, the index books of central archives are to be analysed so that in a following step the relevant files can be searched through thoroughly and other local archives can be included. This should allow long data series regarding frequency, magnitude, triggering factors and perception of landslides to be created. To achieve results in a short period of time, the snowball method has proven successful in previous investigations. This does not only take into account all available types of sources but also follows up leads from other subprojects, experts or the local population. Here an exchange in both directions has proven to be fruitful. It can be assumed that, based on the experiences in the Swabian Alb region, systematic research not previously carried out in South Tyrol will produce similar results. The source density in South Tyrol benefits from the fact that from the 19th century onwards the Habsburg Monarchy conducted high-resolution cadastral mappings, precise to the individual lot, which can be rated as very good in terms of quality and density. We also assume that during the Alpine War from 1915 onwards, not only ground surveys but also the first aerial photograph series and photogrammetric surveys were conducted on a large scale; these have never been used for such purposes before.
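The event series reconstructed from the archives lend themselves to simple frequency-magnitude statistics. As a schematic illustration (the events and magnitude classes below are invented, not archival data), mean recurrence intervals per magnitude class can be derived from the record length and cumulative event counts:

```python
from collections import Counter

def recurrence_by_class(events, record_length_years):
    """events: list of (year, magnitude_class) tuples reconstructed from
    archives. Returns the mean recurrence interval [years] of events of
    each magnitude class *or larger* (cumulative frequency)."""
    counts = Counter(cls for _, cls in events)
    return {
        cls: record_length_years /
             sum(c for k, c in counts.items() if k >= cls)
        for cls in counts
    }

# Invented 200-year event series: class 1 = small, 3 = large events.
events = [(1820, 1), (1851, 1), (1873, 2), (1900, 1),
          (1911, 3), (1950, 2), (1988, 1)]
intervals = recurrence_by_class(events, 200)
```

The longer and more complete the archival series, the more reliable such recurrence estimates become, which is exactly the motivation for the systematic indexing described above.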
Subproject: Integration of early warning into an integrated risk management
Stefan Greiving, plan + risk consult, Dortmund
An early warning system has to be developed according to the requirements of its users, who have been identified in the Cluster »Implementation«. Therefore, the management of information as well as the dissemination of information (risk communication) are of particular importance. Thus, there is a strong connection to both of the other subprojects that belong to the Cluster »Implementation«. The research objective of the part »Management«, to be awarded by contract, is to broaden the perspective and to provide the stakeholders in the case study areas with an appropriate consideration of action alternatives. An early warning system, however, is only one of many alternatives, because the whole »disaster management cycle« of prevention, preparedness, reaction and reconstruction has to be considered. An early warning system must thus not be understood as a single, isolated measure whose implementation is based only on the identified hazard. Rather, existing vulnerabilities and action alternatives have to be considered, too. As part of the envisaged integrated approach, different and sometimes alternative measures may compete with each other. Criteria for the assessment of measures – defined in co-operation with local stakeholders – are especially the protection goals for hazard-prone areas, but also aspects of efficiency and effectiveness. Consequently, this part contributes to improving the implementation of the early warning system to be developed and tested, as well as to improving its economic exploitation by the involved companies. Further, it will contribute to a secure and efficient use of public funds that are needed in order to establish early warning systems in practice. The existing fragmentation of isolated approaches often derives from the broad distribution of responsibilities for action within
the disaster management cycle among numerous persons and/or institutions in charge. Given that planning and implementation competences are bundled only at the local level, and taking into account that proximity to potentially affected people promises a high level of involvement of concerned stakeholders, the focus of the work within this part will be on the local authorities of Lichtenstein-Unterhausen (Swabian Alb, Germany) and Nals/Nalles (South Tyrol, Italy). The relevant stakeholders in the Swabian Alb (association of administrations, county, Regional Association Neckar-Alb) and in South Tyrol (Autonomous Province of Bozen/Bolzano with its responsible departments) will be identified and involved in the project. Subproject: Central Spatial Data Infrastructure, Open Web Services and Web Processing Services for the Development of an Information and Decision-Support System for Risk Management in Early Warning Systems for Landslides Klaus Greve, University of Bonn This subproject aims to develop the basic technological principles for an early-warning-specific spatial data infrastructure. Early warning systems require very complex information and communication structures. They need to interface sensor and data management technology with database management systems, data preparation and analysis, prognosis and model calculations, computer networks, communication terminal units, result processing systems, and communication tools. To date, major problems on syntactic and semantic levels have resulted from the connection of heterogeneous components in complex systems. Basically, such interconnections are easily controllable with classical technology. However, due to the special complexity of early warning systems, the coupling of numerous components and interfaces (syntactic problem), different requirements on the content (semantic problem), and a highly dynamic technology development, the interconnection is more difficult and potentially lossy. In these cases, infrastructure-oriented technologies described by the keywords Sensor Web, Open Web Services and WPS will be used. They are based on the Web Service paradigm and originated in the context of the Spatial Web approach of the OGC. This technology offers much higher interconnection and networking abilities than classical technology. Web Services isolate different ›jobs‹ of spatial information processing and implement them as separate modules. These modules can be implemented as stand-alone services and communicate via standardized interfaces. Open Web Services (OWS) are the basis of spatial data infrastructure projects on international (GSDI) and national levels (GDI.de), and in nearly all of the German federal states (GDI NRW, GDI BB, GDI NI etc.). Initial pilot tests have indicated that the concept has reached a level of possible ›real world‹ implementation. Part of the information provided by these services includes assessments of the service's own ability to provide information. Interlinking is possible not only along linear structures, but also within complex and partly self-organizing networks. These services thus already anticipate structural elements of the Semantic Web, the successor to the WWW. In this subproject, the spatial information technology basis of early warning systems will be explored by first implementing current concepts of geoinformatics into the service architecture and processing modules. The strengths and weaknesses of the solutions will then be evaluated within the application. In order to do this, the central components of an early-warning-specific spatial data infrastructure will be implemented. The basic concept and all essential modules of this subproject will be developed and implemented as Web Services, and will therefore be usable in other Spatial Data Infrastructures.
The concept is considered innovative, and real-world efficiency can be expected.
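As a minimal illustration of the Web Service paradigm described above, the following Python sketch assembles a standard OGC WMS 1.3.0 GetMap request as a key-value-pair URL. Only the OGC-defined parameter names are real; the endpoint and layer name are invented for illustration.

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width, height, crs="EPSG:4326"):
    """Build a standard OGC WMS 1.3.0 GetMap request as a KVP URL."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(f"{v:.4f}" for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only
url = wms_getmap_url("https://example.org/ows", "landslide_susceptibility",
                     (48.35, 9.20, 48.45, 9.30), 800, 600)
```

Because every parameter is standardized, any OGC-compliant client can issue such a request against any conformant service, which is exactly the interoperability the spatial data infrastructure relies on.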
96
In this project, a highly complex Spatial Data Infrastructure with very heterogeneous information sources and information will be created, as defined by the involved organisations. Subproject: Development of an adequate data model schema for an information and decision support system for risk management in landslide early warning systems Stefan Jäger, geomer, Heidelberg A standardized early warning process for mass movements requires a well-structured information management for all relevant information and processes. This sub-project is a cross-cutting issue within the project's framework. The temporal and spatial uncertainties on the one hand, and the great variety in quality, amount and availability of relevant data to be expected on the other hand, require complex database schemas. These must be able to completely represent the early warning chain and thus the activities of the monitoring and the modelling clusters of ILEWS. The information and decision support system must be able to provide decision makers as well as disaster management with qualified decision support. This also includes information on the triggering mechanisms of landslide processes and their temporal and spatial quality, which is dealt with in the modelling cluster. This means that information must be covered and conveyed ranging from dense, site-specific monitoring systems on single landslide bodies, to simple landslide susceptibility maps and statistically poorly sustained rainfall intensity/duration triggering indices. The often uncertain information situation also requires the possibility to represent conclusions based on probabilities. In addition, a landslide early warning system must be able to provide estimates of the potentially associated risks for the affected population and economic activities. Information about critical infrastructure locations and their importance is therefore an essential part of the information process.
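As a sketch of what such a schema might hold, the following Python fragment models two of the entities discussed above: a generic time-stamped observation and a probability-qualified warning estimate. All class and field names are illustrative assumptions, not the project's actual data model.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class Observation:
    """One time-stamped sensor reading, kept generic across monitoring methods."""
    station_id: str
    phenomenon: str          # e.g. "pore_water_pressure", "displacement"
    value: float
    unit: str
    time: datetime
    lon: float
    lat: float
    quality: Optional[float] = None   # 0..1 confidence, None if unassessed

@dataclass
class WarningEstimate:
    """A conclusion with an explicit probability, as required for uncertain information."""
    area_id: str
    probability: float       # estimated probability of slope failure within the lead time
    lead_time_h: float
    based_on: list = field(default_factory=list)  # ids of contributing observations
```

The explicit `probability` and `quality` fields reflect the requirement above that conclusions based on uncertain information carry their uncertainty with them through the warning chain.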
– Definition of a geodata model for the relevant information components. The geodata model consists of basic geodata as well as monitoring data from sensor webs, socio-demographic data and critical infrastructure information, with respect to OGC-conformal data formats and descriptions (metadata standards). – Definition of interfaces to already existing data infrastructures of the national and municipal civil defense authorities, under consideration of the interoperability standards of the OGC. Here, the task is to develop web-based services for the dissemination of the data and information sampled and compiled by the sub-projects to the users (civil protection, decision makers etc.). – Development of an AJAX-based visualization component for various user groups and devices on the basis of the aforementioned web services. This concerns primarily the monitoring and measurement data, which serve as input for the models, and the model results themselves. These data should be visualized preferably in real time and for various time steps for visual control. As far as geodata are concerned, the visualization component will mainly be developed on the basis of standardized OGC-compatible mapping services. Additional development demand exists in the field of visualization of historical data as well as socio-demographic and economic data in the context of early warning. – Implementation and tests of the system components will finalize the development process. Subproject: Geodetic Monitoring and Modelling Heiner Kuhlmann & Stephan Burghaus, University of Bonn The task of geodetic monitoring measurements is to confirm predictable changes (e.g. subsidence behaviour of buildings) or to prove a non-expected or non-predictable change of an object (e.g. landslide). Information is generally supplied through selected measuring points. The behaviour of the object can be quantified by analysing the movements over time. Closely related to the determination of movements is the question of their causes, in order to derive a causal connection. By means of a specially created geodetic point network that covers stable as well as critical slope areas, those areas are to be identified whose movement rates differ significantly from other areas due to certain effects (e.g. increase of humidity, change of pore water pressure, etc.). Absolute movements of ground points in slope areas are recorded and compared to reference points via measuring methods such as GPS and electronic precision tacheometry. Apart from these (geodetic) network points, further measuring stations are created which are equipped with sensors for relative measurements (e.g. chain inclination measuring systems in the subproject »Geomorphic Modelling«). They must be linked with the geodetic measuring points in order to obtain redundant but assignable measuring information on movements. It is intended to use both geodetic measuring methods alternately. If we assume movement rates of about 0.3 mm/month, the measuring resolution of the precision tacheometer of about 0.2–0.3 mm will be sufficient if episodic measurements are carried out and repeated about every 2–3 months. These epochs do not only cover the monitoring of the geodetic point network but also the respective integration of all measuring stations for relative measurements. This does not only deliver redundant information on areas close to each other; the automatic relative measurements can also serve as indicators of beginning movements, in order to initiate monitoring measurements outside the scheduled measuring epochs if necessary. As already indicated, GPS measurements shall be carried out in the same measuring epochs.
GPS offers advantages in continuous monitoring over several weeks, provided the measurements are carried out as static measurements. The data obtained in the local network come together in a central control and evaluation unit, which makes it possible to calculate the baselines between the network points automatically and continuously in order to derive station movements. The observation deviations of the GPS method show a certain auto-correlation in the range of a few minutes up to a few hours. Reasons for this are e.g. multipath and propagation effects of the electromagnetic waves. The magnitude of these deviations lies above the expected point movements. A reduction of these observation deviations can be achieved with long observation periods and post-processing analysis. For the planned early warning system, however, a real-time process is required, which arranges the analysis in such a way that measurement deviations and point movements are separated in a filter approach. To achieve a relative precision of the tacheometric measurements of clearly less than 1 ppm, a regular examination of the measuring instrument will be necessary. It is also not sufficient to introduce the meteorological parameters obtained at the endpoint as representative for the entire measuring distance. Hence the determination of the refractive index plays an important role and ultimately is the precision-limiting factor for distance measurement, especially in mountainous areas. A successful approach to high-precision distance measurement was taken in the 1990s: based on light dispersion in a turbulent medium, the fluctuations due to atmospheric exchange processes are described in a model by means of statistical factors. Suitable commercial systems to measure these atmospheric fluctuations have been developed by Scintec/Tübingen in the form of scintillometer measuring systems. For the chosen study areas a scintillometer will be used which can measure over distances of up to 4–5 km.
Its use will lead to a significant improvement in the modelling of the refractive index and can therefore account for large parts of the refractive components which have previously been difficult to determine. In the chosen study areas, changes with an explicit temporal dependence are expected. The observed movements can be modelled as a function of rainfall, pore water pressure, slope inclination, ground parameters, etc. This will then be the basis for an early warning system through which appropriate response and emergency measures can be initiated in order to best handle the current situation. Subproject: Spatial monitoring of soil parameters with geophysical survey methods Heinrich Krummel, geoFact, Bonn
Primary objective – The aim of the project is the development of an innovative monitoring system for soil parameters and flow potential based on geophysical survey arrays. The intention is to permanently install sturdy 2D/3D geoelectrical survey systems in potentially endangered landslide areas. An automatic procedure will be developed for data collection and for transferring the data via modem to a central processing unit. The calibration of the geoelectrical data is done by singular in situ soil moisture measurements (co-operator: IMKO, Karlsruhe). The processed survey results will be stored in a central database for the mutual use of all co-operators involved in the main project and can be evaluated and interpreted with respect to the common goal of developing an early warning system for landslides. Within this sub-project it is intended to permanently install a geoelectrical survey array at a known landslide-endangered location in Lichtenstein-Unterhausen (Schwaebische Alb, Germany). The landslide body will be examined with geophysical methods (e.g. seismics, geoelectrics) prior to the installation to determine the geometry of the body and to identify critical »hot spots« for the setup of the permanent array. The survey system will automatically perform several surveys each day. Special (TDR) probes of partner IMKO GmbH will be installed within the array at different positions and depths to measure soil moisture. The geoelectrical results will be calibrated using the probes in order to derive areas of different soil moisture from the geoelectrical resistivity data. Automatic procedures for data collection, data transfer and analysis will be developed within the sub-project.
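The calibration step described above – turning co-located in situ probe readings into a transfer function for the whole resistivity section – could, in its simplest form, look like the following Python sketch. The power-law form and all numerical values are illustrative assumptions; the project will derive its own petrophysical relation from the measured data.

```python
import math

def fit_power_law(rho_samples, theta_samples):
    """Least-squares fit of theta = c * rho**k in log-log space, using the
    co-located soil-moisture probe readings as calibration points."""
    xs = [math.log(r) for r in rho_samples]
    ys = [math.log(t) for t in theta_samples]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    k = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    c = math.exp(my - k * mx)
    return c, k

def calibrate_section(resistivity_grid, c, k):
    """Apply the fitted relation to an inverted 2D resistivity section,
    yielding a soil-moisture section for the same grid."""
    return [[c * rho ** k for rho in row] for row in resistivity_grid]
```

A power law is only one plausible choice (Archie-type relations have a similar shape); the point is the scale transformation from a few point measurements to spatial information.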
Scientific and technical objectives – Optimisation of 2D/3D geoelectrical survey arrays (e.g. linear, star-shaped or quadratic arrays, electrode separation) for continuous surveying of soil moisture/flow potential at »hot spots«, e.g. the sliding surface of a potential landslide. – Development of an automatic procedure for periodic data collection and data transfer. – Development of an automatic procedure for the analysis of geoelectrical data for the interpretation of soil moisture conditions, including: – Development of an automatic inversion procedure to create geoelectrical resistivity models from apparent resistivity data. – Scale transformation for soil moisture from singular in situ measurements to spatial information by using 2D/3D geoelectrical survey systems. – Development of a prototype of an automated 2D/3D geoelectrical survey system for monitoring the soil moisture/flow potential at potential landslide locations. Subproject: Standardised, wireless sensor networks for the efficient acquisition, transmission, storage and visualisation of geodata Hinrich Paulsen, terrestris, Bonn Warnings about imminent events in the realm of mass movements require timely information about certain parameters. Amongst these are changes in slope angles, pore water pressure,
precipitation intensities and soil moisture contents, to name but a few. The aim of terrestris is to employ wireless sensor networks (WSN) to efficiently acquire, transmit, store and visualise relevant geodata. Data will be visualised in a web-based geographical information system (WebGIS). In every phase of the project the use of a WebGIS has the added advantage that incoming data can immediately be evaluated and quality-checked, with ensuing modifications to the system gradually optimising the sensors. Following this procedure it is possible to evaluate the robustness and functionality of the system – on the one hand with regard to the data and on the other hand with regard to the hardware. Sensors that fail due to lack of energy, frost or mechanical damage, etc. will immediately show up in the WebGIS. Apart from the above-mentioned advantages the WebGIS can also function as a communication platform, which is particularly useful when dealing with spatial data since it also facilitates the coordination of and discussion amongst spatially distributed project participants. Other players like communities, scientists working on the same test site, or even the general public can easily be granted access to the system. Wireless sensor networks are a new type of geographical information system for in-depth and continuous monitoring of the environment. Technically termed »embedded systems«, these mini computers only became possible through the progress in semiconductor technology of recent years. They consist of a central processing unit (CPU = microcontroller), memory and radio technology. This basic hardware can be equipped with sensors of any kind, such as those measuring temperature, vibration, movement, humidity, etc. In principle, any sensor that provides an electronic signal can be attached. Because the sensors are distributed in space, they can be assigned a coordinate, which turns the measured data into geodata.
This geodata is then routed to a special node with access to the internet by means of the Global System for Mobile Communications (GSM), the General Packet Radio Service (GPRS) or radio, and is directly written to a database. The databases in question are object-relational, with some of them (PostgreSQL, Oracle) being able to directly store geographical features through the use of a spatial extension. Collaborative geographical information systems depend on standards to function efficiently. The Open Geospatial Consortium, Inc. (OGC) works with government, private industry, and academia to create open and extensible software application programming interfaces for geographic information systems (GIS) and other mainstream technologies. One of its initiatives is the Sensor Web Enablement (SWE) activity, which is establishing interfaces and protocols that will enable sensors of all types to be accessed over the Web.
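The chain from sensor reading to geo-referenced database record can be illustrated in a few lines of Python. Node identifiers, field names and the GeoJSON-like layout are illustrative assumptions, not the project's actual wire format.

```python
import json
import time

class SensorNode:
    """Minimal model of a WSN node: a measurement plus the node's fixed
    coordinate turns the reading into a geodata record."""

    def __init__(self, node_id, lon, lat):
        self.node_id, self.lon, self.lat = node_id, lon, lat

    def report(self, phenomenon, value, unit):
        # GeoJSON-like feature, ready to be written to a spatially
        # enabled object-relational database
        return {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [self.lon, self.lat]},
            "properties": {
                "node": self.node_id,
                "phenomenon": phenomenon,
                "value": value,
                "unit": unit,
                "time": time.time(),
            },
        }

# Hypothetical node reporting a soil-moisture reading
node = SensorNode("wsn-07", 9.26, 48.41)
msg = json.dumps(node.report("soil_moisture", 31.5, "%vol"))
```

Serialized like this, the record can be forwarded over GSM/GPRS to the gateway node and inserted into the database without any format conversion at the receiving end.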
Development and testing of an integrative 3D early warning system for alpine instable slopes (alpEWAS) Thuro K. (1), Wunderlich T. (2) & Heunecke O. (3) (1) Prof. Dr. Kurosch Thuro, Lehrstuhl für Ingenieurgeologie, Technische Universität München, Arcisstraße 21, 80333 München, thuro@tum.de; http://www.geo.tum.de. (2) Prof. Dr. Thomas Wunderlich, Lehrstuhl für Geodäsie, Technische Universität München, Arcisstraße 21, 80333 München, Th.Wunderlich@bv.tum.de; http://www.geo.bv.tum.de. (3) Prof. Dr. Otto Heunecke, Institut für Geodäsie, Universität der Bundeswehr München, 85577 Neubiberg, otto.heunecke@unibw.de; http://www.unibw.de/ifg.
1. Introduction Although great advances in the recognition, prediction and mitigation of landslides have been made in the last few years, major events, especially in alpine regions, still take a high social and economic toll. Particularly through extreme weather conditions, such as the intense rainfall in August 2005, instable slopes can be activated and endanger people, settlements and goods in their surroundings. An aggravation of this problem caused by global climate change can currently be observed. Recent landslides in the alpine region demonstrate the need for a deeper understanding of the geological and physical processes which can lead to the spontaneous failure of a natural slope. Major rockslides such as Vajont (1963, Italy) or Randa (1991, Switzerland) and recent minor events such as Sibratsgfäll (1999, Austria) prove the destructive potential of these mass movements and the need to investigate the mechanics of such processes more deeply. Progress in the assessment of landslide risk will only be achieved if the triggering processes and the kinematics of the movements are better understood. To accomplish this task, a presumably instable slope has to be examined for its engineering geological properties and then observed continuously with a suitable monitoring system. Suitable instruments and methods to achieve this are available, but for economic reasons they are rarely used, while at the same time the number of localities in need of monitoring is rising noticeably. Therefore the goal of the alpEWAS joint project, which is being carried out by the Technische Universität München and the Universität der Bundeswehr München, is to develop and test a relatively economic and widely applicable monitoring and early warning system at a designated location. The monitoring system is based on the integration of innovative and economical measuring technologies into a geo-sensor network. The surface movements will be detected pointwise and with high precision with the Global Navigation Satellite System (GNSS), as well as extensively over a large part of the landslide area through reflectorless tacheometry; the movements at depth along boreholes will be measured using a newly adapted Time Domain Reflectometry (TDR) system. In parallel to these measurements, the hydrostatic pore pressure as well as the climatic conditions at the site will be registered. In this way the 3D movements of the slope, which are determined nearly in real time, can be compared with the surrounding conditions (precipitation, hydrostatic pore pressure etc.) and analysed for trigger mechanisms.
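How the comparison of movements with triggering conditions might feed a threshold-based alarm can be sketched as follows. The rate computation is a plain finite difference and the threshold value is an invented placeholder, since the project will derive its critical values from the monitoring data itself.

```python
def displacement_rate(series):
    """Finite-difference movement rate (mm/day) from
    (day, cumulative_displacement_mm) samples."""
    return [(d2, (x2 - x1) / (d2 - d1))
            for (d1, x1), (d2, x2) in zip(series, series[1:])]

def check_threshold(rates, limit_mm_per_day):
    """Return the first epoch at which the movement rate exceeds the
    critical threshold (the trigger for the automated alarm function),
    or None if it is never exceeded."""
    for day, rate in rates:
        if rate > limit_mm_per_day:
            return day
    return None

# Toy series: slow creep, then acceleration between day 10 and day 12
rates = displacement_rate([(0, 0.0), (10, 0.5), (12, 2.5)])
alarm_day = check_threshold(rates, 0.5)  # placeholder threshold
```

In the real system the same comparison would run continuously against the incoming GNSS, tacheometry and TDR series, together with rainfall and pore-pressure records, rather than against a static list.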
Through the large amount of data collected by the system (in time and space) it should be possible to determine causal and temporal coherences between the most important influencing factors and the movement of the instable slope within a relatively short period of time (6 to 9 months), allowing the definition of critical threshold values. By using an automated alarm function, which informs the responsible authorities when a threshold is exceeded, an early warning system can be implemented. A major task of the alpEWAS project is to develop the necessary hard- and software and to test the system at a designated site. A real cost advantage of the fully developed monitoring system is achieved by making remote maintenance and inquiry possible. All important functions can be queried and controlled from the project office via modem, reducing personnel costs in comparison to conventional campaigns by eliminating the need for repeated manual measurements. Furthermore the installation costs
are – when compared to the amount of data gathered in space and time – relatively low due to the measuring technologies used (reflectorless tacheometry, low-cost GPS and TDR). The linking of the different measuring units with the local control center will, as far as possible, be accomplished using WLAN, which makes the use of an expensive mobile telephone system or directional radio unnecessary. 2. Project Site The landslide »Aggenalm«, situated in the Sudelfeld area near Bayrischzell (Bavaria), has been selected as the location for the installation of the early warning system (Figure 1). After a first engineering geological investigation, the location has proven suitable due to its movement rates of about 2 cm per year and the anticipated maximum depth of the landslide of about 25 m in most areas. Additionally the Aggenalm, a grass slope interspersed with single rock blocks, is ideal for the development and testing of the new monitoring strategies needed when using reflectorless robot-tacheometry.

Figure 1: Orthophoto (scale ca. 1:5000) of the Aggenalm landslide with displacement vectors at a scale of ca. 1:1 (observation period: 2003–2004). The results shown stem from periodic geodetic measurements of the slope which were carried out on behalf of the Bavarian state office for the environment (Bayerisches Landesamt für Umwelt). In the upper half of the picture a secondary landslide which occurred in 1997 can be seen. The Aggenalm and the access road to the ski resort Sudelfeld are affected by the movement.

3. Deformation measurement techniques 3.1. Time Domain Reflectometry (TDR) Time Domain Reflectometry (TDR) is widely known as a system for the measurement of soil moisture (e.g. TOPP, DAVIS & ANNAN 1980). With few modifications TDR can also be used for the monitoring of localized deformation in rock and soil. To date this application has only found wider acceptance in North America, while it is still largely unknown in Europe. This is probably because most of the research has so far been carried out at Northwestern University (Evanston/Chicago, Illinois) under the leadership of Dowding and O'Connor, who have convincingly demonstrated the usability of TDR in landslide monitoring (O'CONNOR & DOWDING, 1999). Especially the comparably low installation costs and the possibility to perform continuous measurements make TDR an interesting alternative to inclinometers. Presently, TDR landslide monitoring systems are capable of determining the exact depth of the observed deformation zone, while only a semi-quantitative statement can be made about the amount of movement. The orientation of the movement cannot be determined at all. Furthermore, in most instances the application is limited to the measurement of localized deformation as it is typically observed in rock (e.g. localized shearing along joints) (KANE, BECK & HUGHES, 2001). In the opinion of the authors some of these disadvantages can be overcome by defining standardized installation procedures (e.g. grout type, coaxial cable type) adjusted to different geologic settings. Furthermore, new methods for the analysis of the received TDR data are under development, especially when multiple TDR measuring points are connected to produce a 3D model of the deformation zone. The monitoring of deformation in rock/soil using TDR is based on an indirect measuring method: the deformation itself is not measured (as with an inclinometer), but a directly dependent value – the change in impedance of a coaxial cable as a result of deformation. TDR can be described as »cable-based radar« and consists of two basic components: a combined transmitter/receiver (TDR cable tester) and a coaxial cable (Figure 2). The TDR cable tester produces electric impulses, which are sent down the coaxial cable. When these pulses approach a deformed portion of the coaxial cable an electric pulse is reflected and sent back to the TDR cable tester. The reflected signals are collected and analysed. As with radar, by measuring the time span between emission and reception of the signal, the distance between the TDR cable tester and the deformation can be determined. Furthermore, by analysing the reflected pulse (amplitude, width, form etc.) information about the type and amount of deformation can be obtained. For landslide monitoring a semi-rigid coaxial cable is installed into a borehole and connected to the rock mass with grout. There are basically three different installation methods (Figure 3): 1. The TDR cable is installed parallel to an inclinometer within the same borehole; 2. the coaxial cable is installed into a sheared inclinometer casing, thus extending the lifespan of an inclinometer borehole; 3. the coaxial cable is installed into a borehole of its own. The installation parallel to an inclinometer in the same borehole serves primarily research purposes, since it allows a direct comparison of inclinometer measurements (direct measuring method) with the TDR readings (indirect measuring method). For this reason this method will be an important part of the ongoing research.
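The radar analogy translates into a one-line distance computation. The velocity factor below – the fraction of the vacuum speed of light at which the pulse travels in the cable – is an assumed typical value for coaxial cable, not a measured project parameter.

```python
C = 299_792_458.0            # speed of light in vacuum, m/s

def fault_distance(two_way_time_ns, velocity_factor=0.66):
    """Distance from the TDR cable tester to the deformation, derived from
    the two-way travel time of the reflected pulse (cable-based radar)."""
    t = two_way_time_ns * 1e-9           # ns -> s
    return velocity_factor * C * t / 2.0  # halve: pulse travels out and back

# A reflection received 500 ns after emission locates the deformed
# section roughly 49.5 m down the cable (with the assumed factor)
d = fault_distance(500.0)
```

The division by two is the essential step: the measured time covers the path to the deformation and back, so only half of it corresponds to the cable length being sought.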
Nevertheless, options 2 and 3 are probably the most appropriate installation methods for commercial use: option 2 because it extends the lifespan of an existing inclinometer borehole, and option 3 because a sole TDR installation can be established at comparably low cost, owing to the relatively small borehole diameters required and the low cost of the coaxial cable. Furthermore, with option 3 the grout
Figure 2: Basic setup of a TDR measuring site. The coaxial cable is installed into an instable slope and connected to the TDR cable tester. As soon as the coaxial cable is deformed by the mass movement, a peak can be seen in the reflected signal. Its amplitude is dependent on the amount of deformation taking place.
Figure 3: Possible installation setups for a TDR coaxial cable in a borehole, with and without an inclinometer.
and other installation parameters can be optimized to greatly enhance the performance of the TDR measurements. Many different factors influence the measurement of deformation with TDR (O'CONNOR & DOWDING, 1999). Generally two groups of effects can be distinguished: the influence of the deformation itself and the installation parameters of the TDR system. TDR measurements are influenced greatly by changes in the geometry of the measurement cable. The type (shearing, extension and compression) and the amount of deformation, as well as the width of the deformation zone, affect the received signals. Preliminary simple tests have shown that discerning the different types of deformation is not always possible. However, especially with progressive deformation, a determination of the deformation type is sometimes possible. It is evident that the components used in a TDR system, and the way it is installed into a landslide, influence the measurement. In particular, two parameters stand out in their importance: the type of coaxial cable used and the physical properties of the grout. The material from which the cable is made, its diameter and its length all influence the measurements. Generally, when using thicker cables, larger total deformations can be observed before the cable is severed. At the same time, the sensitivity of the system to small movements is reduced. Accordingly, thinner cables should be used for landslides with low deformation rates, and thicker cables for »faster« landslides. The grout is the interface between the cable and the rock mass. It is, therefore, important to match the physical properties of the grout to the particular geology, especially when working in soil. If, for example, the grout is too strong, a pillar effect might occur whereby the soil moves around a pillar of over-stiff grout, resulting in too little or no deformation of the TDR cable.
Varying the type of cable and the grout mix to suit the surrounding geology and the anticipated deformation rates will enhance the quality of the received TDR measurements.
This is one main task of the ongoing research: defining installation procedures for different geological settings and deformation rates. Furthermore, a careful calibration of these setups is a prerequisite for ensuring the maximum quantity and quality of information (especially the deformation type and amount) obtained through the indirect measurement of deformation with a TDR system. Within the alpEWAS project these newly defined installation setups, which were determined in laboratory shear tests, will be tested for the first time in the field. Signal analysis software will be developed which will allow an automated processing of the TDR signals. The TDR measurements are then compared and, if necessary, further calibrated against parallel inclinometer measurements. 3.2. Reflectorless Robot-Tacheometry The component RL-TPS (Prismless Terrestrial Positioning System) fulfills the task of interpolation within the joint project. Its aim is to densify the displacement pattern determined by TDR and GNSS at only a few selected loci, and it focuses on discovering local maxima or singularities. To attain high economic efficiency, fitting targets with special prisms should be avoided. An exception is only made for those points which are needed for continuously testing the stability of the observation station itself and for confirming the motions of the GNSS and TDR spots. Most objects to be observed will be natural targets, e.g. rocks standing out of the slope and tree stumps. Spatial polar methods based on automatically targeting robot-tacheometers with prismless EDM are well suited for the periodic surveying of these objects. With corresponding programming, these geodetic precision instruments are able to scan suitable objects with a dense grid. The 3D model derived from this grid can be investigated for displacement and deformation by comparison with a 3D template from the initial epoch.
The 3D templates themselves will be acquired at close range with a high-resolution terrestrial laser scanner. To determine the spatial components of the movement, corresponding software has to be developed based on known geodetic algorithms; in addition, appropriate statistical significance tests have to be provided. The solution could be substantially improved if the image information collected internally by the automatic target recognition unit could be used: contours could then be extracted and possibly grey values compared. The Chair of Geodesy owns an instrument with such an interface and has successfully pioneered automatic pointing at natural targets. However, the fixed-focus internal camera prevents its application to the present problem, because images of objects closer than 500 m are poorly defined. A single manufacturer has developed a prototype of a focusable internal camera; although it is not yet on the market, an operational instrument based on current tacheometer technology has been ordered by the Chair of Geodesy. First tests and calibration procedures are already planned and must be completed before the system can become operational at the site. These additional tasks must be performed to reach the highly innovative final aim: a system able to autonomously find and select suitable natural targets. Only with this image information will it become possible to train a survey robot to systematically search a slope for suitable natural targets, scan them, and subsequently perform a periodic motion check. The necessary strategies have to be developed and programmed so that the RL-TPS can adapt itself to changing atmospheric and topographic conditions. The accuracy, promptness and reliability of the RL-TPS will be slightly lower than those of TDR and GNSS, since the sophisticated observation and evaluation processes introduce a certain delay over the numerous natural targets, and since a certain generalization during the modelling routines cannot be avoided.
Nevertheless, the combined evidence on the behaviour of the entire landslide will be of substantially increased value for gaining insight into its complex kinematics.
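The comparison of a rescanned 3D model against its initial template can be reduced, in its simplest conceivable form, to a centroid-shift test with a significance check. The sketch below assumes corresponded point sets and a pure translation; the real software must additionally handle rotation, partial occlusion and the actual scanner geometry, and the point noise sigma is an assumed value.

```python
import numpy as np

def detect_displacement(template, rescan, sigma=0.005, k=3.0):
    """Test whether a natural target has moved between two epochs.

    template, rescan : (N, 3) arrays of corresponding surface points [m]
    sigma : assumed per-coordinate noise of a single scanned point [m]
    k     : significance factor (a rough 3-sigma style test)
    Returns (3D displacement vector, boolean significance flag).
    """
    d = rescan.mean(axis=0) - template.mean(axis=0)   # centroid shift
    # std of one centroid coordinate difference: sigma * sqrt(2 / N)
    sd = sigma * np.sqrt(2.0 / len(template))
    significant = bool(np.linalg.norm(d) > k * sd * np.sqrt(3.0))
    return d, significant
```

With 500 scanned points and 5 mm point noise, the detection threshold of this toy test lies below 2 mm, so a centimetre-level shift is flagged with a large margin.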
The high economic advantage of using natural objects as targets is offset by an unavoidable disadvantage: in the Alps especially, such targets will be covered with snow during the winter season and therefore invisible. Moreover, the supply of solar energy must be expected to be weak in cold weather. We therefore suggest letting the TPS scan the current snow cover once a week; this information can be used to compute the water volume from melting snow in spring, incorporating the most effective landslide trigger into the general model.

Low-cost GNSS and TDR are very suitable for continuously gathering highly precise information about the movements on the surface and underground, respectively. In this way the tendencies at a few selected points within an unstable slope can be detected economically. However carefully representative measurement sites are selected, these are still only spot checks, which can miss areas with high rates of movement and therefore hamper a deeper understanding of the underlying landslide mechanism. It is thus necessary to gain information covering the complete landslide area economically, so that all problematic regions can be incorporated into the monitoring system by either moving existing sensors or installing new ones. Where high rates of movement are encountered, aerial photogrammetry and satellite remote sensing are used. At lower rates of movement, and where continuously updated information is needed, TPS is preferable for reasons of economy and accuracy. However, a dense placement of reflectors within an unstable slope is not reasonable: the snow depths to be expected in the mountains would force installation on long metal posts, which would not only deface the slope and produce an unwanted sense of threat, but would with time also become skewed and attract lightning during thunderstorms.
Additionally, the installation of posts would lead to extra costs, especially when using active reflectors (depending on the manufacturer), which need their own energy supply. An economical solution must therefore rely on natural targets. This has already been attempted with reflectorless tacheometers, which sweep the terrain in a manner comparable to a scanner, but without verifiable success in the monitoring of unstable slopes – in contrast to the observation of snow accumulations for avalanche mitigation (SCHEIKL et al. 2001). The new approach proposed in this project combines the advantages of two established methods and at the same time exploits an anticipated instrumental innovation. The idea is first to detect a large number of natural targets, e.g. large rocks or tree trunks, using terrestrial laser scanning (TLS), and then to localize these with a terrestrial positioning system (TPS). The target objects are first aimed at in their last known position and scanned with a reflectorless precision tacheometer (STEMPFHUBER & WUNDERLICH 2004). The resulting 3D model of each structure is then compared with the original TLS template (WUNDERLICH et al. 2005) and statistically checked for changes in its spatial position. Later, the internal camera of the tacheometer, normally used for automatic aiming at reflectors, will support reflectorless aiming by conducting a contour comparison. First results with this method were achieved at the Chair of Geodesy (TUM) with the automatic recognition of church spires for geodetic orientation (WASMEIER 2003, 2004). At present the technique is limited by the internal camera, which, owing to its fixed focus, can only image objects at distances above 500 m sharply. However, one manufacturer has recently developed a tacheometer containing a camera with adjustable focus (WALSER & BRAUNECKER 2003), which can be expected to reach the market soon. Then the third phase begins, in which a great advance in operating efficiency can be achieved.
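In its simplest form, the TLS-based search for candidate targets can be sketched as a plane fit plus a protrusion test: points sticking out of the average slope surface are potential rocks or stumps. This is a hypothetical illustration, not the project's detection strategy; `min_height` is an assumed threshold, and a real slope would require robust, locally adaptive surface fitting.

```python
import numpy as np

def candidate_targets(points, min_height=0.5):
    """Return TLS points that protrude from the average slope surface.

    points : (N, 3) array of scanned points [m]
    A plane z = a*x + b*y + c is fitted by least squares; points lying
    more than min_height above it are rough candidates for natural
    targets such as large rocks or tree stumps.
    """
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    residual = points[:, 2] - A @ coeffs
    return points[residual > min_height]
```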
For such an instrument a control system can be developed that automatically searches a slope for suitable natural objects, measures them and repeatedly checks them for changes in position. Therefore it can be
installed at virtually any location and find targets by itself, then starting their periodic survey. In symbiosis with the other two systems, TDR and GNSS, a nearly complete monitoring intelligence results, which makes it possible to establish an early warning system and to better understand the mechanical processes within a landslide.

3.3. Global Navigation Satellite System

The primary contribution of the Universität der Bundeswehr München to the alpEWAS project will be an all-weather-proof low-cost GNSS system which continuously records movements on the surface of the unstable slope with an accuracy of a few millimetres. The work comprises the development of a prototype of a GNSS-Monitoring Component with the following special characteristics:
– low-cost GNSS sensor technology (initially low-cost GPS sensor technology);
– WLAN communication between the stations (wireless data transmission);
– autonomous power supply of the rovers;
– flexible analysis through post-processing options that can be adapted at any time;
– remote maintenance and remote inquiry of the system (remote desktop operation);
– separation between data recording and the actual evaluation (especially time series analysis), with a defined ASCII interface (»dual system«);
– the possibility of incorporating existing, proven and powerful program systems;
– an open design that can integrate or be adapted to other sensors (»multi-sensor system«, component of a Geo-Sensor Network).

After installation, the entire system – the GNSS-Monitoring Component is only one part of the planned Geo-Sensor Network – will continuously provide measuring data; its presentation and analysis are essential parts of the later project phases (see below). Beside the development and trial of the three technological monitoring systems TDR, GNSS and TPS, their integration into the Geo-Sensor Network
Figure 4: Measuring configuration of the GNSS-Monitoring Component at the test site Aggenalm
is the centre of the scientific project. This comprises first a purely technical and later a software-based, database-oriented combination of the measuring results, and represents another focal point of the work done by the Universität der Bundeswehr München. To date the GNSS-Monitoring Component is de facto a pure GPS component. As soon as they become available, it should be switched to receivers that can also register GALILEO and optionally GLONASS signals; this is not yet possible, but imminent. The planned set-up of the online low-cost GPS system (L1 receivers, no RTK capability) comprises four stations/receivers: two or three stations are installed within the sliding slope, and the fourth receiver on a central station in the presumably stable area in the near vicinity. The sketched configuration is currently being tested within a still ongoing dissertation at the Universität der Bundeswehr München. Beside the OEM boards (Novatel Smart Antenna, a receiver/antenna combination), different sen-
sors are also integrated in this test configuration: temperature sensors, voltmeters to monitor the battery voltage in solar mode, and geotechnical sensors such as extensometers and hydrostatic measuring devices. A year-round field trial is now imminent; the preceding test structure has been operating for a few months on the campus of the Universität der Bundeswehr München. The system can be remotely maintained and, in viewer mode, monitored by third parties at any time. At the computer centre (host system), where two computers are provided for system control, access to a 220 V power supply and to the internet (telephone line) is necessary. The other stations of the GNSS-Monitoring Component (rover stations) have a self-sufficient, autonomous power supply. The conception for the Aggenalm landslide relies solely on solar panels and dispenses with the additional or alternative use of wind generators; this is made possible by sufficient dimensioning of the panels and buffer batteries. Demands on the GNSS rover stations are:
– as little signal shading of the orbit/space segment as possible (no signal obstacles);
– quasi-optical view to the computer centre (with WLAN access point), obtainable through a raised set-up of the WLAN antennas (transmitting poles), the choice of antennas and/or a repeater station (planned within the budget, including the necessary self-sufficient power supply, see below);
– proximity to the TDR measuring arrays, so that the obtained information can be analysed in the superior reference frame and integrated into the Geo-Sensor Network together with the TPS data.

Technically, the GNSS-Monitoring Component relies on low-cost receivers with post-processing: on the one hand for cost reasons (RTK systems are about ten times more expensive), on the other hand for the advantages of option-controlled analysis (cf. HARTINGER 2001). Among other things, the update interval can be chosen freely; it usually lies in the range of approx. 15 min up to 1 h. This also matters for the achievable accuracy of the baselines: the longer the time period over which phase measurements are acquired in the OEM board, the better the positioning accuracy that can be expected. Studies of the GPS sensor Novatel Smart Antenna conducted at the Universität der Bundeswehr München show that an accuracy of a few millimetres for the baselines can be achieved using 15 min intervals.

4. Integration and Automation

In addition to the development and testing of the three measuring systems, their integration into a combined Geo-Sensor Network is a main task of the research project (Figure 5). On the one hand this applies to the development of techniques and software for the database-oriented combination of the data, on the other hand to the scientific insight gained from the analysis and interpretation of the data.
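The accuracy gain from longer observation spans, noted above for the 15 min baselines, follows the familiar averaging rule. The toy simulation below assumes uncorrelated epoch noise and a purely illustrative 10 mm single-epoch scatter; real L1 errors are correlated (multipath, atmosphere), so the practical gain is smaller than the square-root-of-N ideal, and the millimetre figures quoted above come from the authors' own studies, not from this sketch.

```python
import numpy as np

rng = np.random.default_rng(42)
EPOCH_NOISE_MM = 10.0   # assumed scatter of a single-epoch baseline solution
RATE_HZ = 1.0           # assumed solution rate

def window_std(minutes, trials=2000):
    """Empirical std [mm] of a baseline component averaged over a window."""
    n = int(minutes * 60 * RATE_HZ)
    samples = rng.normal(0.0, EPOCH_NOISE_MM, size=(trials, n))
    return samples.mean(axis=1).std()

# Ideal white-noise case: a 15 min window of 1 Hz solutions averages
# 900 epochs, shrinking 10 mm scatter to about 10 / sqrt(900) = 0.33 mm.
```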
WLAN (54 Mbps data rate) is earmarked for the data transmission between the rover stations, at which the GNSS, TPS, TDR, piezometer and climate measurements are collected, and the computer centre. With suitable antennas, WLAN can bridge clear (»quasi-optical«) lines of sight of up to 3 km. Given the distinct slope morphology at the Aggenalm landslide, direct visibility is severely limited, so a repeater station is planned; with regard to the development of a flexible monitoring system, the incorporation of such a relay station is in any case of fundamental interest. Compared with the usual transmission paths (especially with RTK systems) WLAN offers several advantages and is becoming established. Basically, every sensor with an RS232 interface can be integrated into the network using a COM server. Within the project the batch-controlled processing will be carried out with the multi-vendor-capable software GrafNav by Waypoint (see www.waypnt.com). GrafNav is embedded in a LabVIEW program (National Instruments, see www.ni.com/labview) which manages the sampling and selection of the sensor nodes. As a result of the LabVIEW/GrafNav pre-processing, the baselines (time information, 3D coordinate differences and the assigned covariance matrices) are passed through an open interface to the automated evaluation system GOCA (GPS Control and Alarm System, developed at the Fachhochschule Karlsruhe; see e.g. KÄLBER et al. 2001a, b, and www.goca.info). In this conception GOCA is essentially a highly sophisticated analysis tool for monitoring tasks, in which all features of time series theory (filter operations such as moving averages and spline approximation, trend estimation, Kalman filtering, etc.) are realized, as well as alarm functions (via SMS, e-mail, etc.) triggered by the exceedance of thresholds. Furthermore, GOCA offers a nearly complete visualization component. Results from tacheometric observations can also be integrated (DANESCU 2006).
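The filtering and threshold-alarm principle described for GOCA can be illustrated with a minimal moving-average sketch. This shows the principle only and is not GOCA code; the window length and threshold are assumed values.

```python
import numpy as np

def check_alarm(series_mm, window=12, threshold_mm=5.0):
    """Smooth a displacement series and test it against a threshold.

    series_mm    : displacement time series of one monitoring point [mm]
    window       : number of epochs in the moving-average filter
    threshold_mm : alarm threshold on the smoothed displacement
    Returns True if the smoothed series exceeds the threshold anywhere.
    """
    kernel = np.ones(window) / window
    smoothed = np.convolve(series_mm, kernel, mode="valid")
    return bool(np.any(np.abs(smoothed) > threshold_mm))
```

A noisy but stable point stays silent, while a point creeping towards 10 mm raises the alarm; in an operational system the alarm would then be dispatched, e.g. via SMS or e-mail.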
Figure 5: Schematic illustration of an integrative 3D early warning system for unstable slopes. GNSS and TDR continuously gather highly precise point-wise information about the movements on the surface and underground, respectively. The TPS, in the form of a reflectorless robot tacheometer, can gather displacement information covering most of the landslide area. Furthermore, triggering mechanisms are observed using piezometers (hydrostatic pore pressure) and a meteorological station (precipitation). By combining all the different measurements, a 3D model of the slope and its movements can be derived in temporal correlation with the triggering parameters. The measured data is transmitted to the master station via WLAN and stored on a central computer. The data can be accessed and downloaded from a remote computer for further analysis and interpretation.

Finally, however, GOCA may be replaced by any other program that enables time series analysis. For the combination with all the other data sources (TDR, TPS, meteo-sensors, …) – this is not possible with GOCA – the development of an adequate, database-oriented software equipped with GIS functionalities is planned within this project. This should be done in a similar way to the program DIVA from the diploma thesis of CRAUSE & ELING (2001) and the program ANALYSIS by
the diploma thesis of GLÄSER & KNAUER (1999). Furthermore, cooperation and coordination with the participants of the joint project EGIFF (»Entwicklung geeigneter Informationssysteme für Frühwarnsysteme«) has been arranged. The software component to be developed should offer the user manifold ways of (interactively) accessing the data, acting as an information system. It is assumed here that the data from the different components TDR, GNSS and TPS (»reaction quantities«) are already pre-processed and, if necessary, reduced; in the ideal case the new software tool only handles coordinates in a common reference frame with respect to the reaction quantities. Minimum characteristics of the new software component to be developed are:
– input of pre-processed data from TDR, GNSS and TPS as well as gauge data and meteo-sensors, with storage in a database, preferably MS ACCESS;
– sophisticated possibilities to select sensors of interest (Visual Basic and/or MATLAB programming, showing the allocation of the sensors on the slide, activation via »click«);
– numerical and graphical description of single time series, including a zoom function;
– combination of any selectable sensors (e.g. comparison of impact and reaction quantities, comparison of adjacent sensors);
– alert function when limiting values (thresholds) are reached;
– additional information for the sensors (digital images, specifications, maintenance, remarks, …).
The software should be easy and intuitive to handle, especially for third parties who are not permanently involved in the project, so that they can quickly gain an overview. A modular composition is intended: in the end, only the main user interface must be changed to make the software available for another project in a similar way. In the course of the geological examination of the received data, first simple causal and temporal correlations between the parameters influencing the slope (e.g. precipitation) and the occurring movements will be identified. This simple analysis can be done within a relatively short period after the start of the measurements and will lead to the definition of first threshold values for the early warning system. Then, through the creation of numerical models of the unstable slope using the Itasca UDEC and FLAC codes, the geome-
chanical behaviour of the slope can be analysed more precisely. The continuously collected measuring data is integrated into the models, leading to their constant improvement. The resulting better understanding of the triggers, mechanics and damage potential of the landslide is used to optimise the defined threshold values.

Literature

CRAUSE, C. & ELING, D. (2001): Entwicklung eines Datenvisualisierungs- und Analyseprogramms für die Überwachungsmessungen der Okertalsperre im Harz. – Diplomarbeit Universität Hannover, Geodätisches Institut.

DANESCU, A. (2006): Case Studies Related to a New Approach of the GPS-Based Online Control and Alarm System Package Software for Standard and non-GPS Application. – Diplomarbeit Universität der Bundeswehr München, Institut für Geodäsie.

GLÄSER, A. & KNAUER, C. (1999): Beurteilung von Deformationsprozessen mit der Hilfe der Filterung von Zeitreihen und der Ableitung von Konfidenzbändern. – Diplomarbeit Universität Hannover, Geodätisches Institut.

HARTINGER, H. (2001): Development of a Continuous Deformation Monitoring System using GPS. – Ingenieurgeodäsie – TU Graz (Hrsg. F.K. Brunner), Shaker-Verlag, Aachen.

KANE, W.F., BECK, T.J. & HUGHES, J.J. (2001): Application of Time Domain Reflectometry to Landslide and Slope Monitoring. – TDR 2001 – Second International Symposium and Workshop on Time Domain Reflectometry for Innovative Geotechnical Applications, Infrastructure Technology Institute, Northwestern University, Evanston, Illinois, U.S.A.

O'CONNOR, K.M. & DOWDING, CH.H. (1999): GeoMeasurements by Pulsing TDR Cables and Probes. – CRC Press, Boca Raton.

SCHEIKL, M., GRAFINGER, H. & POSCHER, G. (2001): Entwicklung und Einsatz eines
automatischen Fernüberwachungssystems basierend auf einem Laserscanner. – in: Tagungsband zur 11. Int. Geodätischen Woche Obergurgl, Wichmann-Hüthig Verlag, Heidelberg.
SINGER, J. & THURO, K. (2006): Development of a continuous 3D-monitoring system for unstable slopes using Time Domain Reflectometry. – 10th IAEG Congress, Nottingham, United Kingdom, 6–10 September 2006, Geological Society of London (in press).

SINGER, J., THURO, K. & SAMBETH, U. (2006): Entwicklung eines kontinuierlichen 3D Überwachungssystems für instabile Hänge mittels Time Domain Reflectometry (TDR). – Felsbau 24 (3), 16–23.

STEMPFHUBER, W. & WUNDERLICH, TH. (2004): Leica System 1200: Auf dem Weg zur Sensorsynchronisation von GPS und TPS für kinematische Messaufgaben. – AVN, 111. Jg., Heft 5, S. 175–184, Wichmann-Hüthig Verlag, Heidelberg.

TOPP, G.C., DAVIS, J.L. & ANNAN, A.P. (1980): Electromagnetic Determination of Soil Water Content: Measurement in Coaxial Transmission Lines. – Water Resources Research 16, Nr. 3, S. 574–582.

WALSER, B. & BRAUNECKER, B. (2003): Automation of Surveying Systems through Integration of Image Analysis Methods. – in: Proc. of Optical 3D Measurement Techniques VI-1, Zürich.
WASMEIER, P. (2003): The Potential of Object Recognition Using a Servo-Tacheometer TCA2003. – in: Proc. of Optical 3D Measurement Techniques VI-2, Zürich.

WASMEIER, P. (2004): Potenzial der Objekterkennung mit dem Servotachymeter TCA2003. – Geomatik Schweiz, Heft 2.

WUNDERLICH, TH., SCHÄFER, TH. & ZILCH, K. (2005): Schadenserkennung an einer Spannbetonbrücke durch reflektorlose Deformationsmessungen. – in: Festschrift 60. Geb. Prof. Bancila, S. 60–68, TU Politehnica Timisoara.
Development of suitable information systems for early warning systems (EGIFF)

Breunig M. (1), Reinhardt W. (2), Ortlieb E. (2), Mäs S. (2), Boley C. (3), Trauner F. X. (3), Wiesel J. (4), Richter D. (4), Abecker A. (5), Gallus D. (5), Kazakos W. (6)

(1) Institute for Geoinformatics and Remote Sensing (IGF), University of Osnabrück, Kolpingstr. 7, D-49069 Osnabrück, Email: mbreunig@uni-osnabrueck.de
(2) Geoinformatics Working Group (AGIS), University of the Bundeswehr Munich, Heisenberg Weg 39, D-85577 Neubiberg, Email: Wolfgang.Reinhardt@unibw.de
(3) Institute for Soil Mechanics and Foundation Engineering, University of the Bundeswehr Munich, Heisenberg Weg 39, D-85577 Neubiberg, Email: Conrad.Boley@unibw.de
(4) Institute of Photogrammetry and Remote Sensing (IPF), University of Karlsruhe, Englerstr. 7, D-76128 Karlsruhe, Email: Wiesel@ipf.uni-karlsruhe.de
(5) Research Centre for Information Technologies (FZI) at University of Karlsruhe, Haid-und-Neu-Str. 10–14, D-76131 Karlsruhe, Email: Andreas.Abecker@fzi.de
(6) disy Informationssysteme GmbH, Stephanienstr. 30, D-76133 Karlsruhe, Email: Kazakos@disy.net
Abstract

At present, the analysis and preparation of information are particularly critical points of an early warning chain. The responsible decision makers are usually confronted with huge amounts of structured and unstructured data. To enable reliable early warning, the available data must be pre-selected, analysed and prepared, so that the decision makers are provided with a reliable and manageable amount of information for the warning decision and for taking preventive measures. In the joint project »Development of suitable information systems for early warning systems« (EGIFF) introduced here, components of an information system for the early recognition of geological hazards will be developed and examined in a mass movement scenario. In particular, the improvement of information analysis and preparation is investigated. To this end, techniques such as GIS, numerical simulation, spatial data mining, geodatabases and linguistic methods will be combined. A particular focus is put on innovative methodical investigations, which will serve to merge new technologies into the operational workflow of early warning and to
establish an efficient and reliable medium for distributing information to the responsible hazard/crisis managers.

1. Overall Objectives of the Project

The central components of an early warning system are the recognition of threats, the assessment and evaluation of the danger, the dissemination and communication of the warning, and the public reaction to the warning (Smith, 2004). The effectiveness of an early warning system largely depends on the transformation of event recognition into a warning report to the population (Dikau and Weichselgartner, 2005). Obviously, analysis and information preparation are particularly critical points of the early warning chain: they provide the basis for the warning decision and for estimating the extent of the natural event. The objective of the joint project EGIFF is to improve the early warning chain through the conception and development of appropriate components for early warning systems. Here, the analysis and preparation of information form the basis of the warning and of the risk estimation of the forthcoming geological natural
event. The main research objectives of the project include:
1. conception of a distributed component architecture of an information system for early warning systems;
2. geotechnical evaluation of mass movements with the help of documented past events;
3. coupling of numerical simulations and GIS;
4. modelling and visualization of spatial relationships and their uncertainty, as well as their extraction from textual and numerical data;
5. transfer of data mining and analysis methods to spatially referenced data (spatial data mining), as well as processing of structured and unstructured data (free texts of disaster and damage messages);
6. web-based alerting;
7. geodatabase support for handling data of different scenarios as a decision basis for the geotechnical evaluation of mass movements;
Figure 1: Application Area »Isarhänge«
8. development of concepts and methods to support the evaluation of risks, and their prototypical implementation; and finally
9. evaluation of the developed prototype on the basis of concrete geological data and application scenarios.

2. Application Area

A number of places in the German alpine foreland and the Bavarian Alps appeared suitable as investigation areas for exemplary landslide simulations. The main selection criteria were the availability of detailed geographical data and sensor measurements over a longer period of time, a coherent and comprehensive geology allowing the developed methodology to be verified in a generalizable way, and the presence of a potential risk. After investigation of different landslide areas, a part of the Isar valley south of Munich, near Pullach and Neugrünwald (see Figure 1), was selected for the further studies.
Figure 2: Slopes in the Application Area »Isarhänge«
In this area, the height difference of the slope reaches up to 40 meters and the potentially endangered human infrastructure is located directly at the edge of the slope. Because of the high risk potential, the area is observed by
the responsible authorities (Bayerisches Landesamt für Umwelt), and corresponding inclinometer and extensometer measurements and geodetic surveys are available.
Figure 3: Overall Architecture
3. Description of the Subprojects

The project is subdivided into three subprojects. Figure 3 illustrates these subprojects and their interaction. The following subchapters provide an insight into the objectives and applied methodologies of the subprojects.

3.1. Subproject I »Development of an interconnected information and simulation system«

Responsible: Conrad Boley, Institute for Soil Mechanics and Foundation Engineering (IBGN), and Wolfgang Reinhardt, Geoinformatics Working Group (AGIS), both University of the Bundeswehr Munich

3.1.1. Objectives

The main aim of this subproject is the conception, prototypical implementation and evaluation of an interlinked three-dimensional numerical simulation system (SIMS) and a geographical information system (GIS). Its research contributions are, in particular, the development of coupled information and simulation models for mass movements with user-friendly control of the complete system. A further focus is on more precise predictability and a more exact determination of the exposure of slopes. To this end, a transferable numerical simulation algorithm for landslide movements will be realised within a simulation system. It thus becomes possible, based on existing field data (sensor and GIS data, e.g. a DTM) and virtual exogenous scenarios, to assess slope stability, future system behaviour and potential risk scenarios. As a result, »communication« between the sensor measurements, the GIS and the geotechnical model is enabled. The interconnection between SIMS and GIS is shown schematically in Figure 4. The process starts with the selection of relevant parameters (input data) which may influence the occurrence of landslides; these basically include temperature, precipitation and slope geometry. The parameter transfer is controlled by the GIS. In the simulation system, the slope stability and deformation are calculated and, if possible, used for prediction of
the future deformation behaviour. Results of the simulation are parameters such as hazard potential, deformation and movement vectors, and stability indices. These results are transferred to the GIS for visualization and, as far as possible, enrichment with other information. Furthermore, the data can be checked against rules to support the decision-making process of the user. The developed concepts for the interconnection of the GIS and the simulation system will be prototypically implemented and evaluated. With this prototype of a Decision Support System (DSS), it can be demonstrated how the linking of numerical simulations and GIS can improve predictions of landslide hazards and the calculation of vulnerability, and how uncertainties in, and gaps within, the data can be considered. Additionally, it will be verified how the output of simulations, enriched with GIS analysis methods, can optimize the decision-making process for the prevention of catastrophic hazards. Within the overall system architecture (see Figure 3), this DSS is integrated with the geodatabase (Subproject III) and the data mining system components (Subproject II). The geodatabase provides the input data for the simulation and the modelling of slope stability and stores the simulation results. The event messages detected by the data mining system will also be incorporated into the decision support.

Figure 4: Interconnection between simulation system and GIS

3.1.2. Concepts and Methodology

In the following, the main work packages of Subproject I are described:

1. Concept of an overall architecture
In cooperation with the project partner University of Osnabrück, a suitable architecture of the overall system, consisting of SIMS, GIS, geodatabase, in-field sensors and other components, will be developed. The architecture should enable seamless communication between the individual system components.

2. Data integration and modelling
The data models required for the numerical simulation and GIS will be designed. The main focus of the modelling process lies in the combination of stratigraphical, lithological and geotechnical data with geometric sensor data and long-term physical measurement series, as well as on the requirements of visualisation and further processing of the simulation results. Subsequent to the modelling, the available data of the test area will be adopted and integrated.

3. Conception of geotechnical models and their interpretation
Natural slopes often consist of several different soil types whose mechanical properties have to be described with different constitutive equations. For a realistic characterisation of the deformation behaviour of the different soil types, the validity of distinct constitutive equations for each relevant material has to be analysed and assessed, thereby enabling the application of appropriate constitutive equations for different soil types. To apply the most suitable constitutive equations, the required material parameters have to be identified and a concept for their determination developed. The material parameters will be stored in a database, so that the final numerical model can access them. For the numerical modelling of landslides, initial and boundary conditions have to be defined; these include the geometry of the slope as well as superimposed loads, meteorological influences, etc. Owing to the complexity of the actual initial and boundary conditions, a simplification of the numerical model is inevitable. Numerical modelling yields not only the deformation of the slope but also the slope stability. For the determination of slope stability, failure criteria have to be developed, which may vary for different materials and different constitutive equa-
117
tions respectively. With the definition of these criteria the material properties, the state of stress and strain and the desired security level have to be considered, as well as the strain rate and stress level. Based on a time variant 4D-simulation, calibrated with backwards analysis for past events, probabilities for future scenarios can be calculated. By comparing predicted deformations with measured displacements, the model can be adjusted and improved. 4. Conception of the Decision Support System The criteria for determining the stability of slopes enable the mapping of landslide prone areas and vulnerable zones. This means the slopes can be divided into homogenous areas according to the degree of actual or potential hazard. Furthermore, methods will be developed for refining and linking the simulation results with other information (e.g. from subproject II) and for visualizing the simulation events. In addition, the simulation results can be linked with decision rules in order to support and guide the user. For example, dangerous situations and adverse constellations that may be predefined and formalized in the system can be automatically revealed to the user as advice. The main goal is to establish an efficient and reliable medium for distributing information to the responsible hazard managers. Furthermore, it will be investigated if and how uncertainties of the data used in the simulation and subsequent processes can be modelled. In particular, for visualization and for the support of the user in the decision-making process these uncertainties should be recognizable, in order to allow for validation of the results by the user. Moreover it will be investigated if and how this validation of uncertainties can be done automatically. Another emphasis is put on the convenient presentation of the extensive simulation results to the user and the development of a user friendly and intuitive system control.
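The kind of decision rule described above can be illustrated with a small sketch. The `SlopeZone` structure, the stability thresholds and the uncertainty flag below are purely illustrative assumptions for demonstration, not specifications from the project:

```python
from dataclasses import dataclass

@dataclass
class SlopeZone:
    zone_id: str
    stability_index: float   # output of the numerical simulation
    data_uncertainty: float  # 0.0 (reliable) .. 1.0 (very uncertain)

def classify(zone, warn_threshold=1.3, critical_threshold=1.0):
    """Map a stability index to a hazard class; a factor of safety
    below 1.0 conventionally indicates instability (illustrative
    thresholds, not project values)."""
    if zone.stability_index < critical_threshold:
        level = "critical"
    elif zone.stability_index < warn_threshold:
        level = "elevated"
    else:
        level = "stable"
    # Flag results the user should validate manually when the input
    # data are too uncertain to trust the classification outright.
    needs_review = zone.data_uncertainty > 0.5
    return level, needs_review
```

A zone with a stability index of 0.9 would be classified as critical, while a marginally stable zone with highly uncertain input data would be flagged for manual review, mirroring the requirement that uncertainties remain recognizable to the user.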
5. Definition of the interfaces between GIS/geodatabase and the simulation component The main interfaces that need to be defined are for the transfer of the input data and the simulation results between the GIS and the simulation component, and for the connection of the GIS to the geodatabase. To this end, available internationally standardized interfaces will be examined and, if necessary, restricted, extended or adjusted. For the bidirectional access to the geodatabase, standardized interfaces of the OGC (Open Geospatial Consortium) will be taken into consideration. For the link between the simulation system and the GIS, standards such as HLA (High Level Architecture) will also be analysed. 6. Prototype development The developed concepts will be prototypically implemented and verified with data from the above-mentioned application area. Practical tests will be carried out to determine the system's feasibility, performance and reliability. Furthermore, a transferability study shall analyse the applicability of the concepts and solutions to areas with similar as well as differing geologic and boundary conditions and data availability. 3.2. Subproject II »Integration, analysis and evaluation of vague textual descriptions of geoscientific phenomena for early warning system support« Responsible: Peter Lockemann, FZI Karlsruhe; Joachim Wiesel, Institute of Photogrammetry and Remote Sensing (IPF), University of Karlsruhe; Wassilios Kazakos, disy Informationssysteme GmbH, Karlsruhe 3.2.1. Objectives The goal of subproject II is the collection, preprocessing and analysis of relevant structured and unstructured data (numerical and textual measurements, observations and descriptions by specialists and laymen (citizens)) related to natural hazards in the form of disastrous mass movements, with a focus on the development and
application of novel computational methods aimed at a combination of the results of analyses from heterogeneous data sources, followed by a prototypical implementation as suitable components of an early warning system. For this goal, the following central tasks have been identified: – development and application of techniques and methods aimed at an automatic extraction of early-warning-relevant information and spatial references from textual messages, – transfer of suitable techniques and methods for automated data analyses and an application thereof in the domain of applied geosciences, – development and application of novel methods aimed at a combination of the results of analyses of structured and unstructured data related to natural hazard phenomena of the type »disastrous mass movement«, – design of ergonomic user interfaces for the task of complex data analyses in the context of early warning systems, with a focus on users with little or no IT expertise. Besides concentrating on disastrous mass movements as the context for the development and application of appropriate techniques and methods, a high degree of transferability to other natural hazard types or even analytical tasks in the domain of applied geosciences is regarded as desirable. The latter is also seen as part of a general objective to maximize scientific connectivity as well as the prospective commercial value of the project’s results. 3.2.2. Concepts and Methodology In the following, the three main work packages of subproject II are described. They will be worked on collaboratively by FZI, IPF and disy Informationssysteme GmbH, Karlsruhe. 1. Conceptual work In the first year of the project, the more geology-oriented and the more computer-science-oriented project partners will come to a common understanding of the project’s goals, and develop well-defined,
effective and efficient ways of collaboration. Based on a deeper understanding of the application domain with its specific requirements, characteristics and potential problems, and in close cooperation with the other subprojects, a comprehensive system vision and architecture will be developed, including a set of accepted interfaces and – where necessary – shared data structures. On this basis, example-driven techniques and algorithms will be specified, to be implemented later (year 2). In detail, we aim to focus on the following tasks: 1.1 Analysis of the application domain In a first step of the analysis of the application domain, a glossary of central concepts and terms will have to be developed for the subproject partners in order to facilitate communication with other subprojects and with end users. At the same time, relevant functional and non-functional requirements from the end users’ perspective and constraints for the systems to be developed have to be identified. 1.2 Investigation of relevant prior work In this task, the focus is on an investigation of relevant prior work within and outside the domain of applied geosciences and an analysis with respect to its applicability and usefulness in the given problem setting. A first part of the work will include a review of SOKRATES, a system developed by FGAN/FKIE in Wachtberg (Schade and Frey, 2004). Relying on a combination of methods from natural language processing and techniques of knowledge representation, the system can be used to display events on a tactical map by means of limited automated interpretation of battlefield reports. The combination of techniques employed by SOKRATES can serve as a methodological basis for parts of the system envisioned in subproject II. Due to differences in application context, a number of adjustments will have to be made, whereas parts of the system may be
replaced by Semantic Web de facto standards or Open Source developments. A parallel thread of work will focus on an investigation of data mining techniques which could be transferred into the domain of applied geosciences in order to devise, develop and select suitable techniques, algorithms and tools (Lindner, 2005). In this context, the assessment of algorithms and tools will also comprise an evaluation of the transferability/applicability of selected representation formalisms, methods and systems to other real-world applications, as well as their interoperability with established tools. 1.3 Collection and detailed analysis of input data The next step in the task chain will concern the collection and detailed analysis of available data for the data mining algorithms to be developed. Available structured and unstructured data (text) will have to be collected, formally described and stored for application and testing. Additionally, further work will address the acquisition of expert knowledge needed for the interpretation of information as the basis for planning actions to avert damage to people and property. 1.4 Design of interfaces and specification of requirements for shared data structures and system components In this task, the focus is on the design of interfaces and requirements for shared data structures and system modules in order to ensure interoperability with the other subprojects, in particular with respect to the specific requirements introduced by the computational representation of three-dimensional spatial data and the representation of vagueness and uncertainty resulting from the analysis of textual spatial references. 1.5 Specification of systems to be implemented (end user perspective) The next steps, subsumed in subtask 1.5, will deal with a specification of systems
and methods in the form of use cases, including a coordination of use cases with the other subprojects’ use cases and an evaluation of the use cases by end users. Another task will be concerned with a general assessment of the transferability, scientific connectivity and commercial exploitability of the systems to be developed, with respect to different applications in the context of disaster management (early warning systems) or data analysis in the domain of applied geosciences. 1.6 Specification of algorithms to be implemented In this task, three parallel threads of work will focus on the following subtasks: In a first subtask, the necessary changes and modifications to the SOKRATES system with respect to the given setting will be defined. For the task of the analysis of structured spatial data, suitable data mining techniques will be specified. Additionally, the task of a rough specification of a graphical user interface of the system will be addressed. The tasks in the first year of the project (conceptual work) will be worked on collaboratively by all partners. 2. Implementation-technical realization In the second year, the focus of subproject II will be set on an implementation of the algorithms previously specified and the necessary system adjustments. Depending on the progress and state of work in the other subprojects, further functional aspects will be defined which will be implemented later. It seems reasonable to defer the specification of these functional aspects to this phase of the project, until basic work in the other subprojects has been completed and first feedback on the results of the implementation has been received. In particular, the following packages will be processed:
2.1 Implementation of text analysis methods In this work package, the previously specified adjustments, modifications, and enhancements to the SOKRATES system will be prototypically implemented. Subsequently, the system will be tested, making use of the data collected. 2.2 Implementation of spatial data mining/suitable methods for the analysis of structured spatial data In this work package, the techniques and methods previously specified will be implemented, adapted to, and integrated into the project’s common software framework. Adaptations of existing techniques may also include the handling of three-dimensional spatial data, methods for efficient processing and techniques for processing data characterized by inherent vagueness or uncertainty. 2.3 Specifying mixed analysis methods and the integrated user interface Relying on results from 2.1 and 2.2, specifications of the methods to be implemented in the third year of the project will be developed. A first part of the work will consist of devising methods for the combined processing of structured data and post-processed unstructured data (i.e. results of analyses of textual messages). In this task, the central focus will be set on a more detailed investigation of the concept of »spatial coincidence« and certain aspects of methods for the combined processing of heterogeneous data, in particular related to a quantitative treatment of results under a common decision-theoretic framework. Secondly, an integrated user interface will be specified which will allow non-IT-experts to efficiently make use of the multitude of data sources and algorithms. Here, conceptual work related to problems of suitable representation of spatial data will be put into practice, including, but not limited to, a pragmatic choice of tools for formulating and testing hypotheses or for querying complex data and knowledge bases.
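As an illustration of the »spatial coincidence« concept, the following sketch scores how well a vague, text-derived location matches a structured sensor observation. The representation of both observations as circles of positional vagueness and the exponential decay of the score are illustrative assumptions, not the method specified by the project:

```python
import math

def spatial_coincidence(text_obs, sensor_obs, scale=1.0):
    """Return a score in [0, 1] expressing how well a vague textual
    spatial reference and a structured sensor reading coincide.
    Each observation is (x, y, radius), where radius encodes the
    positional vagueness in the same (projected) units."""
    (x1, y1, r1), (x2, y2, r2) = text_obs, sensor_obs
    dist = math.hypot(x2 - x1, y2 - y1)
    slack = r1 + r2
    if dist <= slack:
        # The vagueness regions overlap: full coincidence.
        return 1.0
    # Otherwise the score decays with the distance beyond the
    # combined vagueness of the two observations.
    return math.exp(-(dist - slack) / (scale * max(slack, 1.0)))
```

A citizen's report of »cracks near the village road« with a vagueness radius of a few hundred metres could thus be quantitatively matched against inclinometer readings, one ingredient of the decision-theoretic treatment mentioned above.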
In the context of the visualization of spatial data, characteristics of the data related to the reliability and precision of spatial references will have to be presented in a way satisfying requirements for adequate usability. In particular, previously established qualities like the priority or urgency of information and their possible interpretations as well as scale-dependencies will be major aspects. 3. Evaluation In the third project year, two major threads of work will be pursued. In the first subtask, the implemented algorithms and systems will be documented, tested, evaluated with real-world data and validated against the different defined use cases. This will be done in coordination with the other subprojects, and in touch with the international scientific community. In the second subtask, a parallel thread of work will be started, focussing on an implementation of combined analysis algorithms for structured and unstructured data. Additional work will include an implementation of ergonomic user interfaces, test-driven removal of errors and deficiencies based on the results of parallel evaluation work, and finally an integration of the implementation with the other subprojects’ results. 3.3. Subproject III »Geodatabase support for the geotechnical evaluation of mass movements« Responsible: Martin Breunig (Consortium coordinator), Institute for Geoinformatics and Remote Sensing (IGF), University of Osnabrück 3.3.1. Objectives In this subproject, the primary geological data, provided by the Bayerisches Landesamt für Umwelt, will be modelled and managed in a geodatabase management system. The management of the geological spatial and time-related primary data enables its use for analysis, simulation, and 3D visualization at any time. In a second step, finite element models and their results will be stored in the
geodatabase. This supports the geotechnical evaluation of mass movements through the assistance of suitable database queries to be developed during the project. Finally, the development and implementation of geometric/topological and time-related operations as input for spatial data mining methods shall provide new spatio-temporal patterns for the recognition of mass movement hazards. The developed methods will be evaluated on the basis of concrete geological application data from the selected area of investigation. 3.3.2. Concepts and Methodology Geodatabases are used for archiving and for providing fast and efficient access to geodata. This enables the reuse of geodata, e.g. for upcoming new natural events. In addition, a geodatabase can also manage models and their results and even become active, e.g. by computing geometric intersection queries between a set of geoobjects. The subproject pursues three objectives: 1. Geometric and topological management and 3D visualization of existing geological primary data and time-dependent data. 2. Management of FE models and their results for use in geotechnical evaluations of mass movements. 3. Development and implementation of geometric/topological and time-dependent operations as input for spatial data mining methods. For the first objective, selected test data from the investigation area (see section Application Area) will be examined and managed in a geodatabase in close cooperation with the geotechnical group of Boley (subproject I). This includes a first interpretation of the primary data. In particular, the representation of discontinuities (e.g. already existing faults in the slope) will be taken into account. This goal can build on the developments described in Breunig et al. (2005) and benefit from this work on geodata management for complex geological models (surface layer and volume models).
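A geometric intersection query of the kind mentioned above is commonly implemented as a filter/refine process. The following sketch shows only the coarse filter step over axis-aligned 3D bounding boxes; the object names and box coordinates are invented for illustration, and a real geodatabase would refine the candidates against the exact geometries afterwards:

```python
def boxes_intersect(a, b):
    """Overlap test for axis-aligned 3D bounding boxes given as
    (xmin, ymin, zmin, xmax, ymax, zmax)."""
    return all(a[i] <= b[i + 3] and b[i] <= a[i + 3] for i in range(3))

def intersection_query(geoobjects, query_box):
    """Coarse filter step of a geometric intersection query:
    return the ids of all objects whose bounding box overlaps
    the query box."""
    return [oid for oid, box in geoobjects.items()
            if boxes_intersect(box, query_box)]
```

The same filter/refine pattern underlies spatial indexes such as R-trees, which keep such queries efficient for large sets of geoobjects.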
The available data structures and operations have to be examined for their suitability with regard to the special requirements of mass movements. Where necessary, these need to be adapted and extended to fit the new special requirements. The second goal basically concerns the management of the input data and results from 3D model calculations within the geodatabase. This will be done in close cooperation with the geotechnical group of Boley. With the FE model calculation, a second kind of data interpretation will be realized. The management in a geodatabase creates new possibilities for the further use of such modelling results and for overlaying them with further information like utilization data. For example, the information »How is the area under the endangered slope used?« is relevant for early warning and therefore needs to be coupled with the model calculation. Homogenized and/or converted geometric data models from the model calculations and the geodatabases have to be examined. Also, the information computed for each cell (scalars and vectors) of the FE model has to be considered. Furthermore, the results of a larger number of model calculations can be stored in the database and later compared and queried on demand from different points of view. For example, versions or scenarios of model calculations can be stored and compared with each other. Additionally, taking the use of a »4D model« into consideration, i.e. different time steps of the FE model, the data for the temporal analysis of mass movements can be managed in the geodatabase. It is also important to store error margins and their accuracy in the database. Here we refer to subproject II. Intervals need to be stored in the database. Exceeding or falling short of these intervals will trigger an action by the database. From the database side, further geometric/topological constraints and/or integrity conditions can be realized and included in the analysis of mass movement endangerment.
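The interval mechanism described here can be sketched as the check a database-side trigger might perform. The parameter names and interval bounds below are invented for illustration only:

```python
def check_intervals(readings, intervals):
    """For each monitored quantity, report whether the current
    reading leaves its stored tolerance interval; leaving an
    interval would trigger an action on the database side."""
    alerts = []
    for name, value in readings.items():
        lo, hi = intervals[name]
        if value < lo:
            alerts.append((name, "below"))
        elif value > hi:
            alerts.append((name, "above"))
    return alerts
```

In an active geodatabase, such a check would typically run inside a trigger on each insert of new sensor values, so that exceeding an interval immediately produces an event message for the decision support component.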
The thematic information connected with the geometry has to be extended in a way that the model calculation receives information about the different semantics of geometrically equal geometries (e.g. fault surfaces versus boundary surfaces). In this context we
speak of »thematic colouring« of the model, which indicates to the user, for example, at which locations danger zones exist. Afterwards, the results are used in the data mining component (subproject II). The third goal consists of developing geometric/topological and time-dependent database operations which support the geotechnical analysis of mass movements. Of concern are not only simple range calculations between point geometries (as is the case in classical 2D GIS buffer operations); the movement/speed of complex 3D geometries and the direction of the movement also have to be considered. The implemented operations can afterwards be used in the data mining methods, in order to determine a priori unknown spatial patterns from the combination of spatial and non-spatial attributes for upcoming mass movements. Methodically, one of the principal purposes of object-oriented software technology, the reusability of individual components, will be exploited during the software development, in order to react as flexibly as possible to new requirements of future natural events. 4. Outlook The concrete research results of the joint project will be: – Geotechnical evaluation of mass movements; – Reusability of the geological and geotechnical primary data and models relevant for early warning through management in a geodatabase; – Employment of spatial data mining methods in support of the warning decision for natural events; – Development of new methods for the visual representation and decision support of early-warning-relevant information from alphanumeric data and texts, including fuzzy and incomplete data; – Prototypical implementation of the developed methods and concepts and evaluation with the application scenario »mass movements« (e.g. landslides); – Improvement of the reliability of alarms for the early warning of natural events/hazards by employment of the developed methods.
The evaluation of the project results on the basis of concrete application scenarios and their economic utilization by the partner company are part of the project. The close cooperation with the Bayerisches Landesamt für Umwelt guarantees the direct application of the research results. Acknowledgements The funding of the research project »Development of suitable information systems for early warning systems« (Entwicklung geeigneter Informationssysteme für Frühwarnsysteme) by the German Ministry of Education and Research (BMBF) under grant no. 03G0645A et al. within the framework of the GEOTECHNOLOGIEN initiative (http://www.geotechnologien.de) is gratefully acknowledged. The responsibility for the contents of this publication lies with the authors. We also thank the Bayerisches Landesamt für Umwelt (LfU, www.lfu.bayern.de) and the Bayerisches Landesamt für Vermessung und Geoinformation (LVG, www.lvg.bayern.de) for providing data for the application area. References Breunig, M.; Bär, W.; Häussler, J.; Reinhardt, W.; Staub, G.; Wiesel, J. (2005): Advancement of Mobile Spatial Services for the Geosciences. In: Data Science Journal, International Council for Science (ICSU). Dikau, R.; Weichselgartner, J. (2005): Der unruhige Planet – Der Mensch und die Naturgewalten. Wissenschaftliche Buchgesellschaft, Darmstadt, 191 pp. Lindner, G. (2005): Algorithmenauswahl im KDD-Prozess. Dissertation, Universität Karlsruhe, Institut AIFB. Schade, U.; Frey, M. (2004): Beyond information extraction: the role of ontology in military report processing. In: Buchberger, E. (Ed.): Proc. of KONVENS 2004, Vienna, Austria. Smith, K. (2004): Environmental Hazards: Assessing Risk and Reducing Disaster. London.
Exupéry: Managing Volcanic Unrest – The Volcano Fast Response System Hort M. (1), Wassermann J. (2), Dahm T. (1) and the Exupéry working group (1) Inst. für Geophysik, Universität Hamburg, 20146 Hamburg (2) Department für Geo- und Umweltwissenschaften, Universität München, 80333 München
Abstract Volcanic unrest and volcanic eruptions are among the major natural hazards, next to earthquakes, floods, and storms. In the framework of this project we will develop the core of a mobile Volcano Fast Response System (VFRS) that can be quickly deployed in case of a volcanic crisis or volcanic unrest. The core of the system builds on established volcanic monitoring techniques such as seismicity, ground deformation, and remote sensing tools for gas measurements. A major novelty of this mobile system is the direct inclusion of satellite-based observations to deduce ground deformation, to detect hazardous gas emissions and to monitor thermal activity. The backbone of the VFRS is a wireless communication network tied to a central database that collects data from the instruments in the field. Using well-defined exchange protocols that adhere to international standards, the system will be open for novel and promising tools like geoelectric soundings and new spectroscopic methods to monitor gas chemistry during volcanic unrest. The raw data collected by the VFRS will be further analyzed and fed into different models to constrain the actual state of activity at the volcano and to set alert levels. Finally, all information and results will be made available through GIS-based visualization tools to communicate the results to local decision makers. 1. Scientific reasoning 1.1. Introduction A great majority of the world’s potentially active volcanoes are unmonitored. Less than
twenty-five percent of the volcanoes that are known to have had eruptions in historical times are monitored at all, and, of these, only about two dozen are thoroughly monitored (Ewert and Miller, 1995). Moreover, seventy-five percent of the largest explosive eruptions since 1800 occurred at volcanoes that had no previous historical eruptions (Simkin and Siebert, 1994). Being able to react quickly to volcanic unrest at poorly monitored or unmonitored volcanoes is therefore a societal challenge, not least because the danger associated with volcanoes is not restricted to their eruption, but also includes earthquakes, dangerous gases (Lake Nyos, Cameroon; Sigvaldason, 1989), flank movement and other deformation, tsunamis (Stromboli, Italy; Bonaccorso et al., 2003), landslides, and even climatic changes (e.g. eruption of Mt. Pinatubo, 1991; McCormick et al., 1995). Defining criteria by which to forecast volcanic eruptions is therefore the most fundamental goal of volcanological research; it is a mandatory prerequisite for any successful hazard mitigation strategy associated with volcanic activity and critically depends on a full understanding of volcanic systems. The basic question of why volcanoes become restless and erupt is not trivial. In fact, it is a fundamental problem, because only a small fraction of the magmas generated at depth ever reaches the surface (Crisp, 1984). One of the keys to understanding the eruptive potential of a volcanic system is our ability to characterize the actual state of stress of a volcanic system (e.g. Walter et al., 2005) and to understand how susceptible the system is to small parameter changes (e.g. pressure, temperature, water content; see also Hill et al., 2002). Whereas the first task involves proper monitoring strategies, including novel ground- and space-based observation methods, the second is concerned with understanding the response of volcanic systems to the various forces acting on them. This requires an in-depth modeling approach to understand so far unexplored internal feedback mechanisms in volcanic systems. Only then will we be able to identify a near-critical state, and understand how small changes in system parameters may well be the decisive factor for the »final push« of magma ascent. Is there solid ground to believe that forecasting volcanic eruptions is a reasonable goal? Yes, because – volcanoes are basically point sources allowing focused research strategies, – their number is limited and only a few volcanoes are born each century, – volcanoes harbor an unparalleled event stratigraphy and chronology, and – many different types of chemical and physical precursors can be recognized and monitored weeks or months prior to an eruption. Classic success stories are the prediction of the eruptions of Mt. St. Helens, USA (Lipman and Mullineaux, 1981; Swanson et al., 1983), the 1991 eruption of Mt. Pinatubo (Newhall and Punongbayan, 1996) and the crisis at Merapi volcano in 2006. In all cases thousands of lives were saved because of timely evacuation. These predictions were only possible because of combined geodetic, seismologic, petrologic (in the case of Pinatubo) and remote sensing analysis. Especially the detection of different precursors points towards the importance of a multiparameter monitoring strategy that will ultimately lead to a proper characterization of the volcano’s state of activity. However, the nature of most precursor signals of active volcanoes has not been deciphered satisfactorily and many volcanic eruptions still occur unexpectedly.
Successful volcano understanding and hazard assessment therefore necessitate significant advances in modeling processes inside volcanic systems.
1.2. Mechanisms known and suspected to trigger volcanic eruptions Over the course of the last two to three centuries, several mechanisms triggering volcanic unrest have been identified. Regardless of which mechanism one is looking at, the final question is always: does the state of stress inside the system change in response to changing system parameters? Only if this is the case is there a potential for unrest and an eruption. Therefore, all mechanisms known to trigger volcanic unrest change the state of stress of the system, some in a very direct way (e.g. injection of magma into a confined system), some in a much more subtle manner. A successful early warning strategy therefore critically depends on identifying and detecting parameters that reflect the state of stress of volcanic systems on very different time scales. Well established trigger mechanisms involve magma injection (e.g. Kilauea volcano, Hawaii; Eaton and Murata, 1960; Tilling and Dvorak, 1993) and magma mixing (e.g. Pallister et al., 1992; Eichelberger, 1995), volatile exsolution (e.g. Holloway, 1976), and magma/water interaction (e.g. White and Houghton, 2000). Improved monitoring techniques and space-based observations have led to the identification of several new trigger mechanisms like the seismic energy radiated by earthquakes (e.g. Barrientos, 1994; Marzocchi et al., 2002; Moran et al., 2002), tectonic stresses (e.g. Nostro et al., 1998), meteorological (e.g. Violette et al., 2001; Hort et al., 2003; Richter et al., 2004) and climatic conditions (e.g. Schmincke, 2004), gravitational loading (Borgia et al., 2000), and tidal forces (e.g. Emter, 1997; Tolstoy et al., 2002). 2. Objectives and workplan Major motivations for this research, aside from broadening our fundamental understanding of volcanic systems, are the fast-growing population, the increasing number of megacities, and the complex communication, transport and supply networks – all of which have resulted in a dramatically increased vulnerability of modern society, especially near large volcanoes
(e.g., Naples, Catania, Tokyo, Mexico City, Seattle, Manila, Yogyakarta, Managua). Because some of the high-risk volcanic systems (especially in the EU) are already monitored in various ways, we propose not to develop yet another monitoring system but to develop the core of a prototype of a mobile Volcano Fast Response System (VFRS) that can be deployed on a volcano in case of unrest or a crisis, though certainly only upon specific request from the government of the country in which the volcano is located. The main idea behind this system is: – that it can be installed fast due to intelligent, cable-free communication between the different stations and a data center, – that a larger number of stations can be deployed, – that all data are collected in a central database, including the data from an existing network (open system) if desired by local scientists and authorities, – that models are developed to derive activity parameters from the recorded data, – that the data are visualized and partially analyzed in real time, and – that objective and reliable data evaluations are carried out, including recommendations for crisis management. The system is intended to densify an existing network, including some novel monitoring parameters that are not regularly observed but allow further insight into the processes inside the volcanic edifice. The VFRS includes a package of programs that uses the recorded data to model, and thereby better constrain, the complex internal feedback mechanisms in the volcanic system to further improve early warning. All data are collected in a central database that holds all raw data. All results from specific data analysis and modeling are fed into a second database. A GIS-based visualization system will display these metadata as well as the raw data in real time, providing a continuous view of the system’s activity.
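How the recorded data might be condensed into a coarse activity indicator can be sketched as follows. The parameter names, baselines and level thresholds are invented placeholders; an actual alert level system would rest on calibrated, volcano-specific models:

```python
def alert_level(observations, baselines):
    """Derive a coarse alert level (0 = background .. 3 = critical)
    from how strongly each monitored parameter exceeds its baseline.
    baselines maps a parameter name to (background value, typical
    spread); the mapping to levels is a deliberately simple
    placeholder, not a validated scheme."""
    worst = 0.0
    for name, value in observations.items():
        base, spread = baselines[name]
        if spread > 0:
            # Number of "spreads" above background for this parameter.
            worst = max(worst, (value - base) / spread)
    if worst < 1.0:
        return 0
    if worst < 2.0:
        return 1
    if worst < 4.0:
        return 2
    return 3
```

Taking the maximum over parameters makes the indicator sensitive to a single strongly anomalous observable, one plausible design choice among several (a weighted combination would be another).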
In case of a deployment of the VFRS all information gathered by the VFRS will be discussed with and provided to local experts and authorities to aid their assessment of volcanic activity. This is a very sensitive issue and requires a
great amount of diplomacy. It is clear that any official statement regarding the state of activity, including setting alert levels or calling for evacuation, can only be made by the local authorities and not by us. Our task will be to provide additional information and parameters, but we are not in a position to officially communicate those results, nor can we call for any actions to be taken. During the three years of funding we will certainly not be able to deliver a perfect mobile volcanic early warning system. However, we will try to develop the core of an open system with well-defined interfaces that will certainly evolve further over the years. We therefore view our Volcano Fast Response System as a seed for a Volcanic Task Force Team. We deem the inclusion of older geophysical data recorded by the already existing network into the fast response system database extremely important, because these data serve as a baseline for assessing the current activity. In case the Volcano Fast Response System is deployed at a volcanic system whose past activity is largely unknown, an essential part of the Task Force operating the system would be a thorough mapping of that past activity. Past activity is essential information and one of the keys to assessing future activity (e.g., Newhall and Punongbayan, 1996). However, this is beyond the scope of this project, as it depends on the individual volcano to be monitored.
2.1. Project strategy
The project includes a total of 9 groups working on different parts of the Volcano Fast Response System. We therefore divided the project into 5 work packages containing related tasks. This enhances communication between the different groups working on related topics. Three of the work packages focus on advancing ground-based (WP1) and space-based (WP2) observational techniques as well as on developing a central database, an alert level system and visualization tools (WP3).
One work package focuses on quantitative, physical models and data interpretation (WP5).
Figure 1: Connection between the five different work packages
Finally, WP4 hosts the development of the communication network and the prototype installation of the VFRS as well as the overall coordination of the whole project.
Work Package 1: Ground-based observations
PIs: C. Gerstenecker, M. Becker (Darmstadt), T.H. Hansteen (Kiel)
The goal of this work package is to provide novel instrumentation for ground-based observations to the VFRS. The main problem of instrumenting a volcano during a crisis is a) the availability of a significant number of instruments that b) can be installed at a safe distance from the volcano while still providing key data, which c) all need to be transmitted in real time to a central system d) in order to be evaluated in real time. The best-known and most established observational method is the observation of seismic activity. Broadband seismometers are well developed, and for our VFRS test installation (see WP4 below) we will use three-component broadband instruments from the amphibian pool (DEPAS). Aside from seismic observations we incorporate two novel ground-based observational techniques: a) high-resolution local deformation measurements performed by a ground-based InSAR system, and b) ground-based gas measurements. The great advantages of ground-based InSAR systems, which are now becoming commercially available, are their high resolution, repeat measurements that are not tied to the cycle times of satellites, and the immediate availability of the data. Degassing rates are also a key measurement indicating activity changes in the magmatic system (see Fig. 2). We will therefore incorporate a mini-DOAS instrument into our system for continuous measurements of gas concentrations and fluxes of various volcanic volatiles.
Figure 2: Temporal changes of the SO2 emissions at San Cristobal volcano, Nicaragua, on 23 November 2002. Clearly visible is the high temporal variability of the SO2 flux, underpinning the importance of continuous gas flux measurements in the framework of the VFRS.
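To make concrete how such DOAS measurements translate into an emission rate, the sketch below integrates slant column densities from a plume traverse and scales by the plume speed. All numbers, the station/traverse geometry and the simple rectangle integration are illustrative assumptions, not the project's actual retrieval chain, which would include radiative-transfer corrections and wind measurements at plume height.

```python
# Sketch: converting a mini-DOAS plume traverse into an SO2 mass flux.
# Hypothetical values; flux = integral of column density across the
# plume cross-section, times plume transport speed, times molar mass.

AVOGADRO = 6.022e23   # molecules per mole
M_SO2 = 0.064         # kg per mole

def so2_flux(column_densities, pixel_width_m, plume_speed_ms):
    """Integrate SO2 slant column densities (molecules/m^2) along a
    traverse perpendicular to the plume and scale by plume speed."""
    integrated = sum(scd * pixel_width_m for scd in column_densities)  # molecules/m
    moles_per_m = integrated / AVOGADRO
    return moles_per_m * M_SO2 * plume_speed_ms  # kg/s

# Hypothetical traverse: roughly Gaussian plume cross-section
columns = [0.2e22, 1.1e22, 2.8e22, 3.4e22, 2.6e22, 1.0e22, 0.3e22]
flux = so2_flux(columns, pixel_width_m=50.0, plume_speed_ms=8.0)
print(f"SO2 flux: {flux:.1f} kg/s")
```

Repeating this calculation continuously is what produces time series like the one in Figure 2.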
Work Package 2: Space-based observations
PIs: N. Adam, M. Eineder, T. Erbertseder, P. Valks (DLR, Oberpfaffenhofen), W. Thomas (Offenbach), R. Bamler, S. Hinz (TU München), M. Hort, D. Stammer (Hamburg)
The goal of this work package is to include space-based observations in the VFRS, because in recent years developments in space-based observation have increasingly opened up a new field in volcanology by allowing the areal mapping of deformation (Massonnet et al., 1993; satellites/instruments: Envisat, ERS-1/2, ALOS, TerraSAR), the detection of thermal spots (Wright et al., 2004; satellites/instruments: AVHRR, GOES, MODIS, ASTER), and the identification of broad degassing signatures, especially of SO2 (e.g., Eisinger and Burrows, 1998; satellites/instruments: GOME-2). Near-real-time data are already becoming available, with satellites having access times of 2 days (TerraSAR), planned missions getting down to 12 hours (FIRES), and data from the geostationary system GOES being available every 15 minutes. The strategy for including space-based observations is twofold: a) we will use older satellite images to assess the evolution of deformation and degassing before the crisis, and b) as a real novelty of this mobile system, we will include the near-real-time data from TerraSAR, GOME and MODIS, which come in about every two days; new GOME-2 data will be available every day.
Work Package 3: Databases, IT architecture and visualization
PIs: K. Klinge, K. Stammler (SZGRF – BGR), J. Wassermann (LMU München)
The goal of this work package is to provide the database, including GIS capability and visualization tools, and the determination of alert levels. The success of the fast response system, including early warnings, depends on its ability to connect to already installed monitoring systems and to the new wireless network, as well as on its combination of different raw and model data. At present none of the existing databases is able to simultaneously handle the multi-parameter data resulting from modern volcano monitoring networks, which are quite common today (e.g., Zschau et al., 1998; Neuberg, 2000; Richter et al., 2004), to assess volcanic activity. In practice this means a high-dimensional,
Figure 3: IT Architecture and exchange protocols of the »expert system«
complicated (raw or already parameterized) data stream with different sampling rates and time histories that has to be stored and analyzed. The success of the system will be closely tied to its capability to visualize the results of the data analysis as well as results from model calculations. Part of the recorded data (e.g., gas measurements) can be displayed directly, whereas, for example, seismic data need to be processed to determine, e.g., hypocenters (see WP5). All direct data as well as analysis results are stored in a GIS-based metadata database, and tools will be developed to either look at the data in real time or to visualize the temporal evolution over a certain time period (see Fig. 3). Taking all recorded data as well as the temporal evolution of the system into account, automatic alert levels will be determined. By overlaying existing GIS layers on land use, vulnerable objects, critical industrial complexes etc., existing evacuation plans as well as evacuation routes can be reconsidered in the light of the current activity and the results of model predictions.
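As a toy illustration of how such multi-parameter streams could be condensed into an automatic alert level, the sketch below standardizes each parameter against its pre-crisis baseline and maps the worst anomaly onto four levels. The parameter names, baseline statistics and thresholds are invented for illustration and are not the project's actual alert scheme.

```python
# Sketch: fusing multi-rate monitoring streams into a simple alert level.
# Each parameter is reduced to a z-score against its pre-crisis baseline;
# the worst anomaly determines the level (0 background .. 3 critical).

def zscore(value, baseline_mean, baseline_std):
    return (value - baseline_mean) / baseline_std

def alert_level(latest, baselines, thresholds=(1.0, 2.0, 3.0)):
    """Map the worst standardized anomaly across parameters onto
    alert levels 0..3 by counting exceeded thresholds."""
    worst = max(abs(zscore(latest[p], *baselines[p])) for p in latest)
    level = sum(worst >= t for t in thresholds)
    return level, worst

baselines = {            # (mean, std) from the pre-crisis archive
    "seismicity_per_day": (12.0, 4.0),
    "so2_flux_kg_s":      (3.0, 1.0),
    "deformation_mm_day": (0.5, 0.3),
}
latest = {"seismicity_per_day": 31.0,
          "so2_flux_kg_s": 4.1,
          "deformation_mm_day": 0.9}
level, worst = alert_level(latest, baselines)
print(level, round(worst, 2))  # prints: 3 4.75
```

A production system would of course weight parameters, handle gaps and data quality, and keep a human expert in the loop before any level change is communicated.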
Work Package 4: Prototype installation of the Volcano Fast Response System and overall project coordination
PIs: T. Dahm, M. Hort (Hamburg)
The goal of this work package is to develop the wireless network, install and test the Volcano Fast Response System (land and sea parts) and coordinate the whole project. A key to the success of the Volcano Fast Response System is its wireless network, which allows the fast deployment of additional instrumentation alongside an existing network. The network will be based on so-called mesh nodes, which build a self-organized multipoint-to-multipoint network that reroutes data via a new connection once a single connection fails (see Fig. 4). The VFRS itself will be tested in a prototype installation on the Azores, because there is currently an unclear situation at Fogo volcano on São Miguel near Ponta Delgada. Fogo volcano has shown seismic unrest since May 2002, and GPS measurements indicate a deflation on the NE flank of Fogo volcano. The test installation includes a land and a marine part that will be
Figure 4: General WLAN network design of the VFRS. Please note that the design is not specifically optimized for use during the test experiment but is a general layout allowing installation in very different environments. The arrows indicate all different ways of routing the data in this multipoint network.
deployed at the same time. The installation of the VFRS (land part) is scheduled for early 2009, because by that time the basic functionality of the system will be available. The associated marine expedition will take place in May 2009. Carrying out the installation much later would not leave enough time to improve the system. This improvement will be achieved by replaying the data collected during the prototype installation in real time to further develop the functionality of the system.
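The rerouting behaviour of such a mesh network can be illustrated with a toy model: find a path from a station to the data center, fail one link, and find a new path over the remaining mesh. Real mesh nodes run a distributed routing protocol rather than this centralized recomputation, and the node names and topology here are hypothetical.

```python
# Sketch: self-healing routing in a multipoint-to-multipoint mesh.
# Breadth-first search finds a shortest path over whatever links remain.
from collections import deque

def shortest_path(links, src, dst):
    """BFS over an undirected link set; returns a node list or None."""
    adj = {}
    for a, b in links:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in adj.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

links = {("station1", "relay_a"), ("relay_a", "datacenter"),
         ("station1", "relay_b"), ("relay_b", "datacenter")}
print(shortest_path(links, "station1", "datacenter"))

# A single link fails; traffic is rerouted over the remaining mesh.
links.discard(("relay_a", "datacenter"))
print(shortest_path(links, "station1", "datacenter"))
```

The second call still reaches the data center via relay_b, which is the property that makes a fast, cable-free field deployment robust against individual node or link failures.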
Work Package 5: Multiparameter analysis of continuous network data for quantitative physical model building
PIs: T. Dahm (Uni Hamburg), M. Ohrnberger (Uni Potsdam), Th. Walter (GFZ Potsdam), J. Wassermann (LMU München), U. Wegler (Uni Leipzig)
The goal of this work package is to provide physical models to the VFRS that allow the determination of the state of stress inside the volcano. Understanding volcanic systems is fairly difficult, not only because they are built from a series of intrusive and extrusive/explosive events but also because there are many still poorly understood and/or unknown internal feedback mechanisms. The structure as well as various internal feedback mechanisms may finally lead to a critical stress change in the system, culminating in an eruption. This work package strongly builds on the seismological observations and deformation measurements in order to constrain the activity status of a volcano (see Fig. 5). In particular, algorithms for event detection and classification and inversion tools for transient and quasi-continuous signals will be developed. Locating the recorded events will be based on different approaches, including seismic moment tensor inversion. These results will be transferred to WP3 in order to be displayed in real time.
3. Deployment strategies for the VFRS
A prerequisite for the temporary installation of the VFRS is a request from a foreign administration to our government for help and assistance in managing a period of volcanic unrest or a volcanic crisis. The request for help can include various levels of technical assistance, ranging from simply lending instruments all the way to the installation of different components of the VFRS. Before actually deploying parts of or the whole system, it must be determined in detail what type of help is asked for by the foreign government and what can be provided by us. This includes a detailed explanation by the coordinator of the VFRS of which systems are available. At this point it will be of utmost importance to adhere to the local governmental structures when discussing what type of help is needed. This may require assistance from the local German embassy in order to avoid any misunderstanding.
Figure 5: Scheme for near-real-time dislocation source and stress field modeling. The sketch illustrates InSAR observation of a deforming volcano (1), the deformation data of which (2) are then inverted in elastic dislocation models (3). The dislocation source obtained from the inversion shall then automatically be included in forward models (4) to calculate the static stress field change at regions of interest (magma chamber, faults or other heterogeneities). Illustrated here for data of Sierra Negra volcano, Galapagos Islands (InSAR data courtesy of T. Walter (Potsdam) and F. Amelung (Miami)).
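A minimal example of the forward-modeling step in such a scheme is the Mogi point source, the simplest elastic source model for volcano deformation: for a Poisson solid (nu = 0.25) the vertical surface uplift is uz = 3 dV d / (4 pi (d^2 + r^2)^(3/2)), with source depth d and volume change dV. The depth and volume change below are hypothetical; an inversion as in step (3) would adjust such parameters until the predicted pattern matches the InSAR data, and richer dislocation sources would replace the point source.

```python
# Sketch: Mogi point-source forward model (Poisson solid, nu = 0.25).
# Predicts vertical surface displacement from a volume change at depth.
import math

def mogi_uz(r, depth, dV):
    """Vertical surface displacement (m) at radial distance r (m) from
    a point pressure source at the given depth (m), volume change dV (m^3)."""
    R3 = (depth**2 + r**2) ** 1.5
    return 3.0 * dV * depth / (4.0 * math.pi * R3)

# Hypothetical inflation episode: 1e6 m^3 at 3 km depth
depth, dV = 3000.0, 1.0e6
for r in (0.0, 2000.0, 5000.0):
    print(f"r = {r:6.0f} m  uz = {mogi_uz(r, depth, dV) * 1000:.1f} mm")
```

The predicted bull's-eye pattern (maximum uplift above the source, decaying with distance) is exactly the signature that both the ground-based and spaceborne InSAR observations of WP1 and WP2 would be matched against.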
Once it has been agreed which type of help can be provided, the coordinator puts together a small team of scientists who will travel to provide the help. The decision on who is going to travel will be based on a) specific requests of the foreign authorities, who may already have worked with geoscientists from Germany, b) knowledge of the instrumentation requested, and c) expertise matching the type of volcanic system at unrest. The group will be kept as small as possible in order not to give the impression of taking over from the local agencies. Scientists of the VFRS provide all information gathered by the VFRS to the local scientists. They will also teach local scientists how to use the systems, so that they can keep using the technology once the crisis is over, if that is agreed between both parties. During the whole mission of the VFRS, participating scientists will not communicate any results or predictions of what may be happening to local or international agencies or to the press, in order to a) give the local authorities a chance to speak with one voice and b) avoid any confusion about who is in charge of managing the crisis. The local scientists and agencies will always be in charge of managing the crisis. As many volcanic crises have shown before, speaking with one voice is one of the keys to successful crisis management.
4. Relation to the current social discussion
Reducing the impact of natural disasters on the human habitat is a fundamental research goal. In volcanology the ultimate objective is to predict volcanic eruptions in time and space. The chances of success are high because volcanoes show many different types of precursors that can often be recognized prior to an eruption. In addition, volcanoes are known point sources with a very well documented history. Unfortunately most volcanoes, especially in third-world countries, are still unmonitored.
It is therefore highly desirable to provide help in case of a volcanic crisis in order to reduce risk and fatalities. With the technology and system developed in the framework of
this project, Germany will be in a position to offer help in case of a volcanic crisis, provided that additional funding for instruments to operate a task force is available. Offering such help upon request is therefore an internationally visible effort, and it would also include the transfer of technology and knowledge to poorer countries. In addition to the task force of the USGS, this would be the second task force worldwide, and due to their partially different instrumentation both task forces would complement each other nicely.
References
Barrientos SE (1994) Large thrust earthquakes and volcanic eruptions. Pure Appl Geophys 142: 225–237
Bonaccorso A, Calvari S, Garfi G, Lodato L, Patane D (2003) Dynamics of the December 2002 flank failure and tsunami at Stromboli Volcano inferred by volcanological and geophysical observations. Geophys Res Lett 30, doi:10.1029/2003GL017702
Borgia A, Delaney P, Denlinger RP (2000) Spreading volcanoes. Ann Rev Earth Planet Sci 28: 539–570
Crisp JA (1984) Rates of magmatism. J Volcanol Geotherm Res 20: 177–211
Eaton JP, Murata DJ (1960) How volcanoes grow. Science 132: 925–938
Eichelberger JC (1995) Silicic volcanism; ascent of viscous magmas from crustal reservoirs. Ann Rev Earth Planet Sci 23: 41–63
Eisinger M, Burrows JP (1998) Tropospheric sulfur dioxide observed by the ERS-2 GOME instrument. Geophys Res Lett 25: 4177–4180
Emter D (1997) Tidal triggering of earthquakes and volcanic events. In: Lecture Notes in Earth Sciences 66: Tidal Phenomena, Wilhelm H, Zürn W, Wenzel HG (eds). Springer, Berlin Heidelberg New York, pp. 293–310
Ewert JW, Miller CD (1995) http://vulcan.wr.usgs.gov/Vdap/Publications/OFR95–553/OFR95–553.html
Hill DP, Pollitz F, Newhall C (2002) Earthquake-volcano interaction. Physics Today, Nov 2002: 41–47
Holloway JR (1976) Fluids in the evolution of granitic magmas: consequences of finite CO2 solubility. Geol Soc Am Bull 87: 1513–1518
Hort M, Seyfried R, Vöge M (2003) Radar Doppler velocimetry of volcanic eruptions: theoretical considerations and quantitative documentation of changes in eruptive behaviour at Stromboli volcano, Italy. Geophys J Int 154: 515–532
Lipman PW, Mullineaux DR (eds) (1981) The 1980 eruption of Mount St. Helens. US Geol Surv Prof Pap 1250: 1–844
Marzocchi W, Casarotti E, Piersanti A (2002) Modelling the stress variations induced by great earthquakes on the largest volcanic eruptions of the 20th century. J Geophys Res 107, 2320, doi:10.1029/2001JB001391
Massonnet D, Rossi M, Carmona C, Adragna F, Peltzer G, Feigl K, Rabaute T (1993) The displacement field of the Landers earthquake mapped by radar interferometry. Nature 364: 138–142
McCormick PM, Thomason LW, Trepte CR (1995) Atmospheric effects of the Mt. Pinatubo eruption. Nature 373: 399–404
Moran SC, Stihler SD, Power JA (2002) A tectonic earthquake sequence preceding the April–May 1999 eruption of Shishaldin Volcano, Alaska. Bull Volcanol 64: 520–524
Neuberg J (2000) External modulation of volcanic activity. Geophys J Int 142: 232–240
Newhall C, Punongbayan RS (1996) The narrow margin of successful volcanic-risk mitigation. In: Scarpa R, Tilling RI (eds) Monitoring and mitigation of volcanic hazards. Springer Verlag, pp. 807–838
Nostro C, Stein RS, Cocco M, Belardinelli ME, Marzocchi W (1998) Two-way coupling between Vesuvius eruptions and southern Apennine earthquakes, Italy, by elastic stress transfer. J Geophys Res 103: 24487–24424
Pallister JS, Hoblitt RP, Reyes AG (1992) A basalt trigger for the 1991 eruptions of Pinatubo volcano. Nature 356: 426–428
Richter G, Wassermann J, Zimmer M, Ohrnberger M (2004) Correlation of seismic activity and fumarole temperature at Mt. Merapi volcano (Indonesia) in 2000. J Volcanol Geotherm Res 135: 331–342
Schmincke HU (2004) Volcanism. Springer Verlag, Berlin
Sigvaldason GE (1989) International conference on Lake Nyos disaster, Yaoundé, Cameroon, 16–20 March 1987: Conclusions and recommendations. J Volcanol Geotherm Res 39: 97–107
Simkin T, Siebert L (1994) Volcanoes of the World, 2nd edition. Geoscience Press, Tucson, 349 p.
Tilling RI, Dvorak JJ (1993) Anatomy of a basaltic volcano. Nature 363: 125–133
Tolstoy M, Vernon FL, Orcutt JA, Wyatt FK (2002) Breathing of the seafloor: tidal correlations of seismicity at Axial volcano. Geology 30: 503–506
Turner JS, Campbell IH (1986) Convection and mixing in magma chambers. Earth Sci Rev 23: 255–352
Violette S, de MG, Carbonnel JP, Goblet P, Ledoux E, Tijani SM, Vouille G (2001) Can rainfall trigger volcanic eruptions? A mechanical stress model of an active volcano: »Piton de la Fournaise«, Reunion Island. Terra Nova 13: 18–24
Walter TR, Acocella V, Neri M, Amelung F (2005) Feedback processes between volcanic activity and flank slip at Mt. Etna (Italy) during the 2002–2003 eruption. J Geophys Res 110, B10205, doi:10.1029/2005JB003688
White JDL, Houghton B (2000) Surtseyan and related phreatomagmatic eruptions. In: Sigurdsson H, Houghton BF, McNutt S, Rymer H, Stix J (eds) Encyclopedia of Volcanology. Academic Press, San Diego, pp. 495–511
Wright R, Flynn LP, Garbeil H, Harris AJL, Pilger E (2004) MODVOLC: near-real-time thermal monitoring of global volcanism. J Volcanol Geotherm Res 135: 29–49
Zschau J, Sukhyar R, Purbawinata MA, Lühr B-G, Westerhaus M (1998) Project MERAPI – Interdisciplinary Research at a High-Risk Volcano. DGG-Mitteilungen, Sonderband III/1998, pp. 3–8
Author’s Index
A Abecker A. . . . . . . . . . . . . . . . . . . 113 Arnhardt C. . . . . . . . . . . . . . . . . . . 75 Asch K. . . . . . . . . . . . . . . . . . . . . . . 75 Azzam R. . . . . . . . . . . . . . . . . . . . . 75 B Becker R. . . . . . . . . . . . . . . . . . . . . 89 Bell R. . . . . . . . . . . . . . . . . . . . . . . . 89 Bill, R. . . . . . . . . . . . . . . . . . . . . . . . 75 Birkmann . . . . . . . . . . . . . . . . . . . . 62 Boley C. . . . . . . . . . . . . . . . . . . . . 113 Bonn G. . . . . . . . . . . . . . . . . . . . . . 31 Breunig M. . . . . . . . . . . . . . . . . . . 113 Buchmann A. . . . . . . . . . . . . . . . . . 31 Burghaus S. . . . . . . . . . . . . . . . . . . 89 D Dahm T. . . . . . . . . . . . . . . . . . 14, 124 Danscheid M. . . . . . . . . . . . . . . . . . 89 Dech . . . . . . . . . . . . . . . . . . . . . . . . 62 Dix A. . . . . . . . . . . . . . . . . . . . . . . . 89 E Erdik M. . . . . . . . . . . . . . . . . . . . . . 51 F Fernandez-Steeger T. M. . . . . . . . . . 75 Fischer J. . . . . . . . . . . . . . . . . . . . . . 41 Friederich W. . . . . . . . . . . . . . . . . . . 14 G Galas R. . . . . . . . . . . . . . . . . . . . . . . 7 Gallus D. . . . . . . . . . . . . . . . . . . . . 113 Ge M. . . . . . . . . . . . . . . . . . . . . . . . . 7 Gendt G. . . . . . . . . . . . . . . . . . . . . . 7
Glade T. . . . . . . . . . . . . . . . . . . . . . 89 Greiving S. . . . . . . . . . . . . . . . . . . . 89 Greve K. . . . . . . . . . . . . . . . . . . . . . 89 Gurgel K.-W. . . . . . . . . . . . . . . . . . . 20 H Hanka W. . . . . . . . . . . . . . . . . . . . . 14 Helzel T. . . . . . . . . . . . . . . . . . . . . . 20 Heunecke O. . . . . . . . . . . . . . . . . . 101 Hilbring D. . . . . . . . . . . . . . . . . . . . 31 Hirzinger . . . . . . . . . . . . . . . . . . . . . 62 Hohnecker E. . . . . . . . . . . . . . . . . . 31 Homfeld S. D. . . . . . . . . . . . . . . . . . 75 Hort M. . . . . . . . . . . . . . . . . . . . . . 124 J Jäger S. . . . . . . . . . . . . . . . . . . . . . . 89 K Kallash A. . . . . . . . . . . . . . . . . . . . . 75 Kazakos W. . . . . . . . . . . . . . . . . . . 113 Kind R. . . . . . . . . . . . . . . . . . . . . . . 14 Klein . . . . . . . . . . . . . . . . . . . . . . . . 62 Klüpfel . . . . . . . . . . . . . . . . . . . . . . 62 Kniephoff M. . . . . . . . . . . . . . . . . . 20 Krüger F. . . . . . . . . . . . . . . . . . . . . . 14 Krummel H. . . . . . . . . . . . . . . . . . . 89 Kühler T. . . . . . . . . . . . . . . . . . . . . . 31 Kuhlmann H. . . . . . . . . . . . . . . . . . 89 L Lehmann . . . . . . . . . . . . . . . . . . . . 62 Lessing R. . . . . . . . . . . . . . . . . . . . . 51 Lupp M. . . . . . . . . . . . . . . . . . . . . . 51
M Mäs S. . . . . . . . . . . . . . . . . . . . . 113 Meier T. . . . . . . . . . . . . . . . . . . . . . 14 Milkereit C. . . . . . . . . . . . . . . . . . . . 51 Mott . . . . . . . . . . . . . . . . . . . . . . . . 62
N Nagel . . . . . . . . . . . . . . . . . . . . . . . 62 Niemeyer F. . . . . . . . . . . . . . . . . . . . 75
O Ohrnberger M. . . . . . . . . . . . . . . . . 14 Ortlieb E. . . . . . . . . . . . . . . . . . . . 113
P Paulsen H. . . . . . . . . . . . . . . . . . . . . 89 Pohl J. . . . . . . . . . . . . . . . . . . . . . . . 89 Pohlmann T. . . . . . . . . . . . . . . . . . . 20
Q Quante F. . . . . . . . . . . . . . . . . . . . . 31
R Redlich J. P. . . . . . . . . . . . . . . . . . . . 51 Reinhardt W. . . . . . . . . . . . . . . . . . 113 Richter D. . . . . . . . . . . . . . . . . . . . 113 Ritter H. . . . . . . . . . . . . . . . . . . . . . 75 Röhrs M. . . . . . . . . . . . . . . . . . . . . . 89 Rothacher M. . . . . . . . . . . . . . . . . . . 7
S Schedel F. . . . . . . . . . . . . . . . . . . . . 31 Scherbaum F. . . . . . . . . . . . . . . . . . 14 Schlick T. . . . . . . . . . . . . . . . . . . . . . 20 Schlurmann . . . . . . . . . . . . . . . . . . 62 Schöbinger F. . . . . . . . . . . . . . . . . . 31 Schöne T. . . . . . . . . . . . . . . . . . . . . . 7 Schubert C. . . . . . . . . . . . . . . . . . . . 51 Setiadi . . . . . . . . . . . . . . . . . . . . . . 62 Siegert . . . . . . . . . . . . . . . . . . . . . . 62 Stammer D. . . . . . . . . . . . . . . . . . . 20 Stammler K. . . . . . . . . . . . . . . . . . . 14 Strunz . . . . . . . . . . . . . . . . . . . . . . . 62
T Thuro K. . . . . . . . . . . . . . . . . . . . . 101 Toloczyki M. . . . . . . . . . . . . . . . . . . 75 Trauner F. X. . . . . . . . . . . . . . . . . . 113
W Walter K. . . . . . . . . . . . . . . . . . . . . 75 Wassermann J. . . . . . . . . . . . . . . . 124 Wenzel F. . . . . . . . . . . . . . . . . 31, 51 Wiesel J. . . . . . . . . . . . . . . . . . . . . 113 Wunderlich T. . . . . . . . . . . . . . . . . 101
Y Yuan X. . . . . . . . . . . . . . . . . . . . . . . 14 Z Zschau J. . . . . . . . . . . . . . . . . . . . . . 51
GEOTECHNOLOGIEN Science Reports – Already published editions
No. 1
Gas Hydrates in the Geosystem – Status Seminar, GEOMAR Research Centre Kiel, 6–7 May 2002, Programme & Abstracts, 151 pages.
No. 2
Information Systems in Earth Management – Kick-Off-Meeting, University of Hannover, 19 February 2003, Projects, 65 pages.
No. 3
Observation of the System Earth from Space – Status Seminar, BLVA Munich, 12–13 June 2003, Programme & Abstracts, 199 pages.
No. 4
Information Systems in Earth Management – Status Seminar, RWTH Aachen University, 23–24 March 2004, Programme & Abstracts, 100 pages.
No. 5
Continental Margins – Earth’s Focal Points of Usage and Hazard Potential – Status Seminar, GeoForschungsZentrum (GFZ) Potsdam, 9–10 June 2005, Programme & Abstracts, 112 pages.
No. 6
Investigation, Utilization and Protection of the Underground – CO2-Storage in Geological Formations, Technologies for an Underground Survey Areas – Kick-Off-Meeting, Bundesanstalt für Geowissenschaften und Rohstoffe (BGR) Hannover, 22–23 September 2005, Programme & Abstracts, 144 pages.
No. 7
Gas Hydrates in the Geosystem – The German National Research Programme on Gas Hydrates, Results from the First Funding Period (2001–2004), 219 pages.
No. 8
Information Systems in Earth Management – From Science to Application, Results from the First Funding Period (2002–2005), 103 pages.
No. 9
1. French-German Symposium on Geological Storage of CO2, June 21–22, 2007, GeoForschungsZentrum Potsdam, Abstracts, 202 pages.