GEOTECHNOLOGIEN Science Report
Early Warning Systems in Earth Management
Status Seminar
12–13 October 2009
Technische Universität München
Programme & Abstracts
No. 13
Impressum
Editorship / Schriftleitung: Dr. Ludwig Stroink
© Koordinierungsbüro GEOTECHNOLOGIEN, Potsdam 2009
ISSN 1619-7399
The Editors and the Publisher cannot be held responsible for the opinions expressed and the statements made in the articles published, such responsibility resting with the author.
Die Deutsche Bibliothek – CIP Einheitsaufnahme
GEOTECHNOLOGIEN, Early Warning Systems in Earth Management, Status Seminar, 12–13 October 2009, Technische Universität München, Programme & Abstracts. – Potsdam: Koordinierungsbüro GEOTECHNOLOGIEN, 2009 (GEOTECHNOLOGIEN Science Report No. 13)
ISSN 1619-7399
Distribution / Bezug:
Koordinierungsbüro GEOTECHNOLOGIEN
Heinrich-Mann-Allee 18/19
14473 Potsdam, Germany
Fon +49 (0)331-288 10 71
Fax +49 (0)331-288 10 77
www.geotechnologien.de
geotech@gfz-potsdam.de
Copyright Cover Picture / Bildnachweis Titel: Hadi
Preface

Geo-hazards, vulnerability and risk management are key words in the current discussion about natural phenomena like volcanic eruptions, earthquakes, tsunamis, landslides and rock falls. In Germany, the national programme »Early Warning Systems for Natural Hazards« was launched in 2007 as part of the research and development programme GEOTECHNOLOGIEN to reduce the risks posed by these natural phenomena. After two years of funding, 11 projects present the results of their research and software development. The following original objectives of the projects have been widely reached:
1. Development and improvement of measurement and observation systems for the online, real-time transmission of decisive physical-chemical danger parameters
2. Development and calibration of coupled prognosis models for the quantitative determination of physical-chemical processes within and at the surface of the Earth
3. Improvement in the reliability of forecasts and prognoses for decision-making and the optimization of disaster control measures
4. Implementation of mitigation measures in concrete socio-economic damage prognoses
5. Development of information systems based on an open spatial data infrastructure that ensure prompt and reliable availability of all information necessary for the technical implementation of early warning and for decision-making by disaster managers

The main objective of the second status seminar »Early Warning Systems« is to bring together all participants from the different research projects to discuss the results. Additionally, we have organised for the first time a demonstration of the technical developments in the projects, for example an open spatial data infrastructure or an innovative wireless sensor network. Further investigations in the research programme will concentrate on the integration of the single developments into a coordinated interoperable platform.

Werner Dransch
Table of Contents

Scientific Programme Status Seminar »Early Warning Systems for Natural Hazards« . . . 1

SLEWS – A Prototype System for Flexible Real Time Monitoring of Landslides Using an Open Spatial Data Infrastructure and Wireless Sensor Networks
Fernandez-Steeger T. M., Arnhardt C., Walter K., Haß S. E., Niemeyer F., Nakaten B., Homfeld S. D., Asch K., Azzam R., Bill R., Ritter H. . . . 3

ILEWS – Integrative Landslide Early Warning Systems (ILEWS)
Bell R., Becker R., Burghaus S., Dix A., Flex F., Glade T., Greiving S., Greve K., Jäger S., Janik M., Krummel H., Kuhlmann H., Lang A., Li L., Mayer C., Mayer J., Padberg A., Paulsen H., Pohl J., Röhrs M., Schauerte W., Thiebes B., Wiebe H. . . . 16

alpEWAS – The Aggenalm Landslide – Innovative Developments for an Effective Geo Sensor Network for Landslide Monitoring and Early Warning
Thuro K., Wunderlich T., Heunecke O., Singer J., Wasmeier P., Schuhbäck S., Festl J., Glabsch J. . . . 33

EGIFF – Developing Advanced GI Methods for Early Warning in Mass Movement Scenarios
Breunig M., Schilberg B., Kuper P. V., Jahn M., Reinhardt W., Nuhn E., Mäs S., Boley C., Trauner F.-X., Wiesel J., Richter D., Abecker A., Gallus D., Kazakos W., Bartels A. . . . 49

Last-Mile – Numerical Last-Mile Tsunami Early Warning and Evacuation Information System
Taubenböck H., Goseberg N., Setiadi N., Lämmel G., Moder F., Schlurmann T., Oczipka M., Klüpfel H., Strunz G., Birkmann J., Nagel K., Siegert S., Lehmann F., Dech S., Gress A., Klein K. . . . 73

EDIM – Earthquake Disaster Information System for the Marmara Region, Turkey
Wenzel F., Erdik M., Köhler N., Zschau J., Milkereit C., Picozzi M., Fischer J., Redlich J. P., Kühnlenz F., Lichtblau B., Eveslage I., Christ I., Lessing R., Kiehle C. . . . 84

Exupéry – Volcano Fast Response System
Hort M., Barsch R., Bernsdorf S., Beyreuther M., Cong X., Dahm T., Eineder M., Erbertseder T., Gerstenecker C., Hammer C., Hansteen T., Krieger L., Läufer G., Maerker C., Montalvo Garcia A., Ohrnberger M., Rix M., Rödelsperger S., Seidenberger K., Shirzaei M., Stammler K., Stittgen H., Valks P., Walter T., Wallenstein N., Wassermann J., Zakšek K. . . . 96

WeraWarn – Real Time Detection of Tsunami Generated Signatures in Current Maps Measured by the HF Radar WERA to Support Coastal Regions at Risk
Dzvonkovskaya A., Gurgel K.-W., Pohlmann T., Schlick T., Xu J. . . . 111

EWS Transport – Early Warning System for Transport Lines
Hohnecker E., Buchmann A., Wenzel F., Titzschkau T., Bonn G., Hilbring D., Quante F. . . . 123

Status & Results of G-SEIS – With Focus on Real-time GNSS Software Development
Chen J., Ge M., Gendt G., Dousa J., Ramatschi M., Falck C., Schöne T. . . . 141

RAPID – Rapid Automated Determination of Seismic Source Parameters
Meier T., Dahm T., Friederich W., Hanka W., Kind R., Krüger F., Ohrnberger M., Scherbaum F., Stammler K., Yuan X. . . . 149
2. Statusseminar »Early Warning Systems for Natural Hazards«, 12–13 October 2009, TU München

12 October 2009
10.30 – 11.00  Welcome
11.00 – 11.45  A Prototype System for Flexible Real Time Monitoring of Landslides Using an Open Spatial Data Infrastructure and Wireless Sensor Network (SLEWS)
11.45 – 12.45  Integrative Landslide Early Warning Systems (ILEWS)
12.45 – 13.15  Short Presentations of Technical Solutions (5–10 min)
13.15 – 14.30  Lunch
14.30 – 15.15  Innovative Developments for an Effective Geo Sensor Network for Landslide Monitoring and Early Warning (alpEWAS)
15.15 – 16.00  Developing Advanced GI Methods for Early Warning in Mass Movement Scenarios (EGIFF)
16.00 – 17.00  Numerical Last-Mile Tsunami Early Warning and Evacuation Information System (Last-Mile)
17.00 – 17.30  Coffee break
17.30 – 18.30  Cooperation Between the Projects – Session I: SLEWS / ILEWS – User Requirements; Session II: SLEWS / ILEWS / alpEWAS / EWS-Transport / EGIFF – WebGIS/SWE
About 19.30    Dinner

13 October 2009
08.30 – 09.30  Earthquake Disaster Information System for the Marmara Region, Turkey (EDIM)
09.30 – 10.00  Volcano Fast Response System (Exupéry) – Lecture
10.00 – 10.30  Volcano Fast Response System (Exupéry) – Poster Discussion
10.30 – 11.00  Coffee Break
11.00 – 11.30  Realtime Detection of Tsunami Generated Signatures in Current Maps Measured by the HF Radar WERA (WeraWarn)
11.30 – 12.30  Early Warning System for Transport Lines (EWS Transport)
12.30 – 13.30  Lunch
13.30 – 14.00  GPS-SurfacE Deformations WithIn Seconds (G-SEIS)
14.00 – 14.45  Rapid Automated Determination of Seismic Source Parameters (RAPID)
14.45 – 15.45  Cooperation Between the Projects – Session III: G-SEIS and RAPID – Realtime-GPS and its Application; Open Discussion and Presentations of Further Cooperations and Innovations
15.45 – 16.00  Final Discussion
SLEWS – A prototype system for flexible real time monitoring of landslides using an open spatial data infrastructure and wireless sensor networks

Fernandez-Steeger T. M. (1), Arnhardt C. (1), Walter K. (2), Haß S. E. (3), Niemeyer F. (2), Nakaten B. (1), Homfeld S. D. (4), Asch K. (3), Azzam R. (1)*, Bill R. (2), Ritter H. (4)

(1) Chair of Engineering Geology and Hydrogeology (LIH), RWTH Aachen University, e-mail: arnhardt@ / azzam@ / fernandez-steeger@ / nakaten@lih.rwth-aachen.de
(2) Chair of Geodesy and Geoinformatics (GG), Rostock University, e-mail: ralf.bill@ / frank.niemeyer@ / kai.walter@uni-rostock.de
(3) Bundesanstalt für Geowissenschaften und Rohstoffe (BGR), Hannover, e-mail: kristine.asch@ / stefanie.hass@bgr.de
(4) ScatterWeb GmbH (SWB), Berlin, e-mail: homfeld@ / ritter@scatterweb.de

* Coordinator of the project: Prof. Dr. rer. nat. Dr. h. c. R. Azzam, RWTH Aachen University
1. Introduction
The danger posed by landslides to people and infrastructure is still rising worldwide. Due to the progressive development of urban areas and infrastructure, more and more people settle in environments that are, or become, endangered by mass movements. The situation is further complicated by the fact that the dependency of our society on functioning infrastructure and the number of people and objects in endangered areas increase at the same time. Early warning and alarm systems are an efficient tool to face landslide hazards and reduce the risks from landslides, especially where no other mitigation strategies are suitable. However, the available technologies are comparatively expensive and not very flexible. Moreover, such systems normally represent the state of technology and do not focus on the needs of the users (people-centred warning). The joint project »A Sensor-based Landslide Early Warning System (SLEWS)« aims at the systematic development of a prototype alarm and early warning system for different types of landslides, utilizing
ad hoc wireless sensor networks and spatial data infrastructure technologies according to OGC (Open Geospatial Consortium) guidelines for real-time monitoring. SLEWS therefore investigates and simulates the whole chain from data gathering and acquisition, through evaluation and interpretation, up to data analysis, visualization and data supply for different end users. Besides the technical aspects of alarm and early warning systems, the demands of the users and the requirements for effective warning management are further important research fields within the project.

2. Wireless ad-hoc multi-hop sensor network for landslide monitoring
Modern, so-called ad-hoc wireless sensor networks (WSN) are characterized by the self-organization and self-healing capacity of the system (autonomous systems), without predefined infrastructure (Arnhardt et al., 2008). This permits a non-hierarchical data exchange and thus allows a very simple adaptation of the network to changing conditions. The use of a wireless system eliminates the need for extensive cabling and so reduces material costs as well as the vulnerability of the system (Garich, 2007); as a result, the reliability of the system increases. The network consists of numerous sensor nodes that can interact with their neighbour nodes (Fig. 1) and perform simple data processing. Each node has its own power supply, transmission and receiving unit, microprocessor, and internal memory. This setup allows each node to work independently of the other nodes of the WSN and permits a stable runtime. Data packages from each node are sent to a collection point (gateway), either directly via radio or over other nodes (multi-hop). The multi-hop function reduces the requirement for long-range transmission and in turn also the transmission power (Sohraby et al., 2007). Due to the ad-hoc character of the self-organization, new nodes or temporarily disconnected nodes can easily connect and synchronize themselves with the system (Fig. 1).
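The forwarding principle can be illustrated with a small routing sketch; the node coordinates, radio range and greedy next-hop rule below are purely illustrative and do not reproduce the ScatterWeb routing protocol:

```python
import math

# Hypothetical node positions (x, y in metres) and a gateway; values are illustrative only.
NODES = {"n1": (0, 0), "n2": (60, 10), "n3": (120, 25), "n4": (180, 20), "gateway": (240, 0)}
RADIO_RANGE = 80.0  # assumed maximum single-hop distance in metres


def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])


def route_to_gateway(start):
    """Greedy multi-hop forwarding: each node hands the packet to the
    reachable neighbour that is closest to the gateway."""
    path, current = [start], start
    while current != "gateway":
        reachable = [n for n in NODES
                     if n != current and dist(NODES[current], NODES[n]) <= RADIO_RANGE]
        if not reachable:
            raise RuntimeError("no route to gateway")
        current = min(reachable, key=lambda n: dist(NODES[n], NODES["gateway"]))
        path.append(current)
    return path


print(route_to_gateway("n1"))  # e.g. ['n1', 'n2', 'n3', 'n4', 'gateway']
```

Because every node only needs to reach a nearby neighbour rather than the distant gateway, the required transmission power per hop stays low, which is the point made above.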
2.1 The SLEWS sensor network
The sensor network used in the SLEWS project uses the 868 MHz frequency band for sending radio signals. The frequency band is divided into 30 sub-bands to avoid noisy channels. Data rates range from 4.8 to 115 kbit/s at transmission distances from 10 m to 1.2 km (ScatterWeb, 2007).
For the detection and direct monitoring of different kinds of landslide processes and deformations, a new sensor board was developed (Fig. 2). It includes the measuring sensors (acceleration, inclination and a barometric pressure sensor) and the node for data and command transfer. Due to the modular setup of the hardware it is easy to adapt the components or to integrate them into existing solutions. The sensor node is encapsulated in a box with a high protection class (IP 68). For the power supply, each node can be connected to a normal (battery pack) or a solar-powered (solar pack) rechargeable battery (Fig. 2). The gateway can be powered by a 6 V or 12 V adapter, solar panels (solar gateway) or batteries (6 V or 12 V). To extend the standby time, the nodes can operate in different energy-saving modes. The bi-directional structure of the system enables data transfer from each node and also allows commands or software updates to be transmitted to individual nodes or groups of nodes.

2.2 Sensors for the monitoring of surface deformations
Landslides are complex ground movements that may cause deformation at the surface but also damage to constructions.
Fig. 1: Structure of an ad hoc wireless sensor network (WSN). Data packages are sent from each node either over other nodes (multi hop) or directly to a collection point (gateway)
Fig. 2: Components of the SLEWS sensor network. The left side shows a sensor module with the ScatterNode mounted on the sensor board and energy modules. The right side shows a solar powered and standard gateway
To get information about the activity in the subsurface and to develop a suitable warning structure, it is important to monitor these deformations permanently with precise, robust and reliable measuring instruments. Even though existing systems are quite accurate, they are often very expensive and difficult to operate in the field. The size and weight of these systems can be another problem. It is therefore important to develop new measuring instruments that are small and precise but also inexpensive, so that they can be used in large numbers in the field. Micro-Electro-Mechanical Systems (MEMS) can support this effort, because they fulfill all of these requirements. MEMS sensors combine very small mechanical and electronic units, sensing elements and transducers on a small microchip. In the frame of the SLEWS project, different microsystems are integrated in a sensor node to detect and monitor surface deformations in landslide areas (Fig. 3). The selected sensors consist of 3D MEMS silicon capacitive sensors made of single crystal
silicon and glass for measuring acceleration, tilt and barometric pressure. The 3D acceleration sensor has a sensitivity of 1333 counts/g and an acceleration range of ± 2 g. Two different 2D inclination (tilt) sensors are tested in this project: the first one is a dual-axis inclinometer with a measuring range of ± 30° and a sensitivity of 1638 counts/g, while the other one has a measuring range of 90° and a sensitivity of 819 counts/g. The digital absolute barometric pressure sensor has a resolution of 1.5 Pa and an accuracy of ± 50 Pa. All sensors have a standard SPI digital interface and an internal temperature device. The three sensor types are mounted on a newly developed sensor board. In addition, precise position sensors are used for elongation measurements (Fig. 3). Two different types can be integrated into the network: potentiometric displacement transducers and linear magnetostrictive position transducers. Analogous to classical extensometers, they are used to monitor the opening and closing of cracks and fissures. The displacement transducer used has a measuring range of up to 1 m with a given linearity of ± 0.05% of the effective range. The magnetostrictive transducer used in this project has a measuring range of 1000 mm, with a given linearity of < ± 0.01% of the effective range. Due to the A/D converter integrated in the position sensors, their energy demand is substantially higher.
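To illustrate how raw readings from such MEMS sensors translate into physical quantities, the following sketch converts digital counts into acceleration and static tilt using the sensitivities quoted above; the helper functions and example readings are hypothetical and do not reproduce the SLEWS firmware:

```python
import math

ACC_COUNTS_PER_G = 1333.0    # 3D acceleration sensor, ±2 g range (see text)
TILT_COUNTS_PER_G = 1638.0   # dual-axis inclinometer, ±30° range (see text)


def counts_to_g(raw_counts, counts_per_g):
    """Convert a raw digital reading into multiples of the gravitational acceleration g."""
    return raw_counts / counts_per_g


def tilt_angle_deg(raw_counts, counts_per_g=TILT_COUNTS_PER_G):
    """Static tilt from a gravity-referenced MEMS axis: measured a = g * sin(angle)."""
    ratio = max(-1.0, min(1.0, counts_to_g(raw_counts, counts_per_g)))
    return math.degrees(math.asin(ratio))


# Hypothetical readings: 27 counts on an acceleration axis, 143 counts on a tilt axis.
print(round(counts_to_g(27, ACC_COUNTS_PER_G), 4))   # ~0.0203 g
print(round(tilt_angle_deg(143), 2))                  # ~5.01 degrees
```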
Fig. 3: Sensors used in the frame of the SLEWS project. The left box shows the microsystems (MEMS), i.e. the acceleration, tilt and pressure sensors. The right side shows the position sensors used in the project
2.3 Enhanced data and information retrieval by sensor fusion
Existing monitoring systems are very often monolithic solutions with only one or two measuring devices in the field. Due to the small number of instruments, reference systems are missing. Thus it is often very difficult to distinguish whether the measured values are caused by slope deformations or by malfunctions. The solution to this problem is the combination of sensor information (sensor fusion) from different autonomously working devices. Sensor fusion provides a more detailed and qualified image of what is happening in the field. For example, numerous sensors mounted on a rotating or tilting block should all show the same movement pattern, even though they provide different measuring values or information (Fig. 4). Hence landslide activity can be recognized much better and more reliably. This concept
assists decision making and reduces false alarm rates. If one sensor on a non-moving block indicates changes while other sensors near it show no changes, the probability is very high that this single sensor is not working correctly. The detection of false reports and technical errors of the system is thus an important part of sensor fusion. Different strategies of sensor fusion are developed and tested in the frame of the project. The first one is the combination of identical sensor types (redundant sensor fusion) mounted on the same node. The data of two or more identical sensors (for example tilt sensors) may be used to supervise the system as well as for decision finding. Another strategy is the combination of different sensor types (complementary sensor fusion) at one node, such as a tilt sensor and an acceleration sensor. Normally, sensors that complement each other's observations, like tilt and acceleration or tilt and displacement transducers, are used for sensor fusion. This is a very efficient strategy, as no additional sensors are necessary; due to the sensor configuration on the standard sensor board it is one of the main types of sensor fusion in this project. The combination of sensor information from spatially distributed nodes (spatial sensor fusion) may improve the spatial monitoring.
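A minimal sketch of such a fusion-based plausibility check, combining complementary sensors on one node with readings from neighbouring nodes; the thresholds and data layout are assumed for illustration only:

```python
def fused_alarm(node, neighbours, tilt_thr=0.1, acc_thr=0.02):
    """Flag a movement only if complementary sensors on the node (or nearby nodes) agree;
    flag a probable sensor fault if a single reading is inconsistent with both its
    complement and the neighbouring nodes (spatial fusion)."""
    tilt_moves = abs(node["tilt_change_deg"]) > tilt_thr
    acc_moves = abs(node["acc_change_g"]) > acc_thr
    neighbour_moves = any(abs(n["tilt_change_deg"]) > tilt_thr for n in neighbours)

    if tilt_moves and (acc_moves or neighbour_moves):
        return "movement"                 # consistent pattern -> credible alarm
    if tilt_moves and not acc_moves and not neighbour_moves:
        return "suspected sensor fault"   # single inconsistent sensor
    return "no alarm"


node = {"tilt_change_deg": 0.25, "acc_change_g": 0.001}
neighbours = [{"tilt_change_deg": 0.02}, {"tilt_change_deg": 0.01}]
print(fused_alarm(node, neighbours))  # -> 'suspected sensor fault'
```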
Fig. 4: A potential application of complementary and spatial sensor fusion for tilt monitoring of a rock pillar. Case A at the left shows a clear situation where the pattern of sensor information indicates movements. In case B the pattern indicates a malfunction or error, as only one sensor is active and even the complement sensor shows no reaction
Open structures of the SLEWS system also allow a standardized integration of data from other sources. Thus the combination of different networks (network fusion or data fusion) is possible and will be tested with other projects in the frame of the GEOTECHNOLOGIEN programme. In this context, different experiments to investigate sensor combinations and sensor fusion were carried out. The tests showed that good correlations between the data of different sensors are possible. Additionally, outliers and measurement errors can easily be recognized. The first tests were done with a small group of sensors and nodes, but the scope will be increased in the near future.

2.4 Quality tests of the network components under laboratory conditions
To get information about sensor quality and stability, different test boards were developed. Although every sensor has a well-defined datasheet from the manufacturer, it is important to verify this data in the environment of the SLEWS system. In a first stage, the data spread under stable conditions was investigated: the data stream from the sensors was observed under stable boundary conditions for several days and afterwards analysed using different statistical methods.
The analyses showed that all sensors exhibit very small data variations without measurable drift. Within the accuracy of the test stations, the maximum variation of the tilt sensors is ± 0.06° to ± 0.1° with a standard deviation of < 0.05°. For the acceleration sensor, the variation of the gravitational acceleration under static conditions is ± 0.008 g with a standard deviation of less than 0.003 g. No data drift was measurable over a measuring period of 5 days. In a following step, the accuracy of the different sensors was investigated under dynamic conditions. Tilt and static acceleration measurements at different inclinations showed that changes > 0.1° and > 0.02 g can be detected reliably by the sensors. To assess the accuracy of the displacement transducer, its data were correlated with a precise dial gauge (accuracy: 0.01 mm). The test showed that length variations down to 0.1 mm can be identified with high precision. The comparison of the test results with the specifications given in the datasheets from the manufacturers showed good concordance.
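The static test essentially compares the spread and drift of a quiet data stream with the datasheet tolerances; a minimal sketch with hypothetical readings and the tolerances quoted above:

```python
from statistics import stdev


def static_noise_check(readings, max_std, max_drift):
    """Verify a quiet sensor stream: small standard deviation and no drift
    (drift taken here simply as the difference between last and first reading)."""
    spread = stdev(readings)
    drift = abs(readings[-1] - readings[0])
    return {"std": spread, "drift": drift,
            "ok": spread <= max_std and drift <= max_drift}


# Hypothetical tilt readings in degrees recorded over a static test period.
tilt = [12.51, 12.53, 12.49, 12.52, 12.50, 12.52, 12.51]
print(static_noise_check(tilt, max_std=0.05, max_drift=0.1))
```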
Fig. 5: Test area for field experiments at the Super Sauze landslide (France). The left picture (A) gives an overview of the landslide area. The upper picture on the right (B) shows a node with solar panel. The lower picture (C) shows a displacement transducer, which recorded an opening of the fissure of about 1.5 mm in the course of the 5-day field test
2.5 Operational test under field conditions
To test the system under field conditions, a small wireless sensor network with seven nodes was set up on the Super Sauze landslide in the south of France (Fig. 5). This active landslide has a high movement rate of up to 0.4 m/day (Malet et al., 2005). The test took place in cooperation with scientists from Stuttgart University, Delft University and the observatory OMIV (Observatoires des Instabilités de Versants). The Super Sauze mudslide was triggered in the 1960s and is an example of a complex landslide in soft clay shale. For testing the SLEWS sensor network, seven nodes were set up in the area. Six of the nodes were equipped with inclination, acceleration and pressure sensors. The nodes were installed at the edge of the slope near cracks in order to detect deformations and movements at the surface (Fig. 5). The opening of one of the more active cracks in the area was monitored by a displacement transducer (Fig. 5). As no rainfall, the main trigger for movements, occurred during the test period, the movements were quite small. Nevertheless, the system recognized small movements: within the range of the displacement transducer, a crack opening of 1.5 mm in 5 days
could be observed. During the testing time, all of the nodes worked without problems. Despite large temperature variations of more than 25 degrees, the measuring data did not show any measurable shifts. This first test shows that the system can be used in a landslide environment even with smaller movement rates. Even in an unknown environment, the installation of the system worked very well; the first data could be measured shortly after start-up. Further field tests are planned this year to examine the system over longer periods and especially to observe the data transfer into the spatial data infrastructure of the SLEWS system.

3. Spatial data infrastructure and data processing
A central point of the SLEWS project is the utilisation of ad hoc wireless sensor networks combined with spatial data infrastructure technologies according to OGC guidelines to provide a low-cost, interoperable and performant early warning system.
Fig. 6: Distributed service infrastructure in the frame of the SLEWS project. Sensor network and gateway server are operated in Aachen, while the service infrastructure is operated in Rostock. Data communication and handling is realized using W3C and OGC standards
Methods of data access, communication and visualisation are to be implemented using OGC Sensor Web Enablement (SWE; Botts, 2007) specifications, concurrently offering information resources via open standards to external applications while importing interoperable resources in return. In the following, the progress made in the system under development in converting native sensor network data into a standardised format that can be accessed and published via SWE services is described.

3.1 Data import and management
Testbed WSNs are operated by the project partner RWTH Aachen University (LIH). The observation data can be accessed and interfaced with external applications from each WSN gateway via a server unit (gateway server) connected to it (Fig. 6). Communication between gateway and gateway server can be established using a physical connection or a wireless data connection such as a GSM mobile link. A listener program running on the gateway server accesses the data stream arriving at the gateway, which consists of WSN-specific text output representing the data packages received from the sensor nodes. Additionally, the data are stored in a local database. However, to be available to generic SWE-enabled client applications, the measurement data has to be converted into an appropriate data schema.
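A minimal sketch of what such a conversion step could look like, assuming a hypothetical semicolon-separated gateway text format and a small look-up table (the actual SLEWS message layout and the full SWE schema mapping are simplified here):

```python
from datetime import datetime, timezone

# Hypothetical look-up table: node id -> metadata needed by the SWE observation model.
NODE_LOOKUP = {
    "N07": {"position": (7.102, 50.737), "property": "tilt_x", "unit": "deg"},
}


def parse_packet(line):
    """Turn one raw gateway text line, e.g. 'N07;tilt_x;143', into an
    observation record that a feeder could pass on to the SOS database."""
    node_id, _channel, raw_value = line.strip().split(";")
    meta = NODE_LOOKUP[node_id]
    return {
        "procedure": node_id,
        "observed_property": meta["property"],
        "uom": meta["unit"],
        "sampling_point": meta["position"],
        "time": datetime.now(timezone.utc).isoformat(),
        "value": float(raw_value),
    }


print(parse_packet("N07;tilt_x;143"))
```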
For this purpose, the listener program (Fig. 7) implemented by the project partner LIH parses the measurement strings using look-up tables to determine suitable parameters and extends them to meet the requirements of the SWE observation data model. In a next step, the observation data is repeatedly transferred by a feeder application (Fig. 7) to a PostGIS spatial database at Rostock University (GG). The database serves as the central data management component and is the source for the Sensor Observation Service (SOS). Another method to insert observation data into the database is to use the SOS interface itself. For this, the SOS operations must be extended by a transactional profile. The advantage of data entry via the SOS service interface is its interoperability: any data provider using the proper query syntax is able to register with the SOS and supply it with data sets without previous knowledge. However, given the high frequency of data updates, direct database inserts result in significantly better system performance.

3.2 Data access and distribution
The SOS interface can be used by compliant clients such as visualisation or analysis applications to retrieve observation data and sensor metadata via spatial and temporal queries.
Fig. 7: WSN data import. A listener program on the gateway server reads and parses the data stream from the gateway. The feeder application transfers the data to a PostGIS spatial database
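For illustration, such a spatial/temporal query could be issued against the SOS over HTTP as sketched below; the endpoint URL, offering name and observed property are placeholders, and the exact request encoding (key-value pairs vs. XML POST) depends on the SOS version and implementation deployed:

```python
import urllib.parse
import urllib.request

# Placeholder endpoint and parameter values; adjust to the deployed SOS instance.
SOS_URL = "http://example.org/sos"
params = {
    "service": "SOS",
    "request": "GetObservation",
    "version": "1.0.0",
    "offering": "SLEWS_TILT",
    "observedProperty": "tilt_x",
    "eventTime": "2009-06-01T00:00:00Z/2009-06-02T00:00:00Z",
    "responseFormat": 'text/xml;subtype="om/1.0.0"',
}

request_url = SOS_URL + "?" + urllib.parse.urlencode(params)
with urllib.request.urlopen(request_url) as response:
    om_document = response.read()  # O&M-encoded observations returned by the service
print(om_document[:200])
```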
Response formats of the SOS are the XML-based data encoding schemes SensorML, to describe and formalise the parameters of sensors, and Observations & Measurements (O&M) for the exchange of actual measurements and observation data. However, in a monitoring and early warning scenario, not only the »pulling« of data sets from the SOS on user or application request is needed, but also an implementation of event-based notification mechanisms. Messages have to be »pushed« asynchronously to draw the user's attention when respective thresholds are exceeded or system alerts arrive. This is achieved using the Sensor Alert Service (SAS) (Fig. 6 and 7). Sensors can publish observation and other data to the SAS, to which clients can subscribe in order to receive notifications filtered by specific user-defined thresholds (Simonis, 2007). Both sensor node and client join a Multi User Chat (MUC) through which messages are passed along via XMPP-based (Extensible Messaging and Presence Protocol) protocols such as Jabber. In addition to transferring observation data to the SOS database, the feeder application running on the gateway server (Fig. 6 and 7) publishes this data and other respective WSN system messages to the MUC. Subscribed users can receive these messages, depending on their subscription parameters, using desktop or web-based XMPP/Jabber client software.
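The threshold-based filtering behind these notifications can be sketched as follows; the subscription structure is hypothetical, and the publish step is a simple stand-in for posting a message into the XMPP Multi User Chat or handing it to the WNS:

```python
subscriptions = [
    # Hypothetical user-defined alert subscriptions (sensor, property, threshold).
    {"user": "duty_officer", "procedure": "N07", "property": "tilt_x", "threshold_deg": 0.5},
]


def publish_alert(user, message):
    # Stand-in for pushing a message into the XMPP MUC or via WNS (e-mail, SMS).
    print(f"ALERT to {user}: {message}")


def on_observation(obs):
    """Called for every new observation; notifies subscribers whose thresholds are exceeded."""
    for sub in subscriptions:
        if (obs["procedure"] == sub["procedure"]
                and obs["observed_property"] == sub["property"]
                and abs(obs["value"]) > sub["threshold_deg"]):
            publish_alert(sub["user"],
                          f"{obs['procedure']} {obs['observed_property']} = {obs['value']}")


on_observation({"procedure": "N07", "observed_property": "tilt_x", "value": 0.8})
```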
In order to extend the range of possible communication channels, a Web Notification Service (WNS) server can be used as a protocol converter, bridging the so-called last mile by notifying users by e-mail or SMS.

3.3 Data processing and visualisation
The use of SWE services as an interoperable data interface offers the basis for the implementation of highly scalable information products, ranging from detailed expert-driven text or numeric output to clearly represented map visualisations. A browser-based client (Fig. 8) using the OpenLayers framework is currently under development to quickly and comprehensibly visualise the development of landslide-related parameters at the respective sensor nodes over time. Other external OGC-conformant data sources, e.g. map data, national weather data or measurements of other sensor networks, can be integrated easily to offer up-to-date information support. As a next step, to meet early warning requirements, the browser application is being extended to integrate and display XMPP messages sent by the SAS. Access to the SOS interface for more sophisticated simulation and analysis operations can be realised using applications such as MATLAB.
Fig. 8: SOS visualization client for real time visualization of sensor observations in their spatial context
First preliminary interoperability tests with parallel GEOTECHNOLOGIEN projects, integrating SLEWS data into external applications, have been established and are showing promising results. Future work will focus on establishing a complete early warning cycle by adapting notification, visualisation and analysis methods to the requirements of the monitoring and early warning scenarios concurrently tested within the SLEWS project. Especially data resulting from advanced spatial analysis operations will have to be closely connected to the data management (and also offered via OGC services) to provide better decision support information. Another objective is to establish bi-directional communication with the WSN by accessing remote functions, such as duty cycling or the measuring rate, that are already offered through the gateway server software. By wrapping the control software in a Sensor Planning Service (SPS) interface, these functions may be integrated into client applications and operated very easily.

4. The interface between users and politics – information and warning management
As the technical aspects of early warning systems are often challenging and resource-consuming,
the users and their requirements are often neglected. Despite this, people-centred warning is the central element of modern early warning systems. Thus an effective warning management requires the identification of the decision makers and their responsibilities within the warning process. In Germany, disaster protection and disaster response lie in the responsibility of different departments within federal, state or local government authorities. This makes it difficult to develop a general people-centred warning chain. Instead, a standard scenario for a critical event has to be developed in a pre-event phase at the local level. This requires an assessment of risk and preventive measures in a given area. Unfortunately, to this day there is no strategy or political framework at the national or sub-national level for the management of natural risks in Germany. The implementation of prevention measures at the local level quite often starts only after the occurrence of natural disasters or catastrophes. Furthermore, individual security concepts are often worked out in reply to catastrophic
events, instead of exercising the elements of a holistic risk management. There is a lack of societal discussion in response to the questions: What can happen (risk analysis) and what may happen (risk evaluation)? Furthermore, there has to be an agreement about the accepted risk and the handling of residual risk. It is the task of early warning systems to limit the residual risk. This illustrates that the implementation of a people-centred early warning system requires a risk dialogue between all stakeholders, such as local authorities, scientific authorities, disaster management agencies, engineers, land use and urban planners and the local community.

4.1 Risk mitigation strategies in the frame of an early warning system
A risk mitigation strategy includes different kinds of information and warning levels (Fig. 9). Thereby, a distinction must be made between a technical-scientific and a socio-scientific understanding of an information and warning system. Considering the technical-scientific point
of view, the information and warning system consists of a monitoring system, threshold values and warning levels which depend on hazard intensity, lead time and probability of occurrence. In the context of landslide early warning, four levels of information are possible: sensor information, early warning, alert/alarm and post-event alarm.

1) Sensor information
Sensor measurements should lead to a better understanding of landslide processes. Moreover, the monitoring system should be able to provide information about all relevant parameters at any time. In practice, the trend is to integrate measured data into multiple-information systems, including raw data, processed data, measurements of other parameters (e.g. weather data, seismic data) and information about the slope stability. It is therefore important to analyse user requirements and to develop user profiles. According to the specific user profiles, measured data can be
Fig. 9: Information and warning chain within risk management. Depending on the hazard and risk level, different scenarios may be feasible and acceptable for the different stakeholders
presented as charts or time series. Stakeholders who are responsible for security and intervention on-site need understandable information about the hazard situation, not just a large data collection about slope inclination or acceleration. Furthermore, individual graphical user interfaces can help to provide people-centred information. Finally, the content of the information depends on how the early warning system is intended to be used: as a stand-alone information system or as part of a response operation system.

2) Early warning
Currently, landslides are predicted using probability statements (probability = low/medium/high). To do so, the categorization of landslide motions and their process intensity is required. It needs to be determined whether the processes are slow-moving (permanent landslide) or fast-moving or sudden (e.g. rock fall, mudflow or spontaneous landslide). Another important aspect is whether there is sufficient knowledge about the mean velocity of the slide or its kinetic energy. Therefore, the kinematics of each landslide type and its physiographic environment has to be examined. The trigger factors for each process should also be measurable. In the best case, an underlying stability model determines landslide-triggering thresholds and provides predictions about the effective lead time. There are different approaches to implement graduated warnings depending on the ranking of the decision makers, the remaining response time, the time slice or the event type.

3) Alert or alarm
Reliable statements about imminent hazardous landslide events (probability = 1) are not possible up to now. Thus an alert or alarm which involves the evacuation of a given area should be considered critically. Too many false alarms result in a general loss of public confidence in warning systems and lower the risk awareness. Unnecessary evacuations are also very cost-intensive. If short-term forecasting techniques are to deliver
a reliable performance in practice, the warning message needs to be integrated into the already existing warning dissemination and communication systems of local or regional rescue coordination centres. In Germany, the structures for civil protection and emergency aid are regulated by law. Communication systems for the dissemination of warnings in case of emergency already exist at the regional level (»Rettungsleitstelle«).

4) Post-event alarm
Automated road or rail closure is an appropriate action to protect persons from potential risks due to damaged infrastructure. The acceptability of such automated instructions depends on the reliability of the warning system. At the same time, the regional rescue coordination centres and scientific authorities have to be notified by the warning system about the incurred event. In practice, there is still scepticism towards automatically activated traffic light indicators. In fact, the majority of the decision makers seem to require primarily a validation of the damaged infrastructure by experts (e.g. geological survey or road maintenance) before closing the road or railway.

Even if the development of early warning systems is primarily focused on the technical-scientific approach of defining warning levels, warning management should be considered in a broader context. In practice, information systems and warning systems can often not be separated accurately. An early warning system for landslides consists of different kinds of sub-systems, such as monitoring systems, information systems, warning systems and emergency systems for dissemination, communication and coordination in the case of an event. The sub-systems can be spatially distributed and operated by different organisations (e.g. a rescue coordination centre and an engineering company at the local level). Before implementing an early warning system, the operational procedures or the action plan and the information chain have to be outlined in an emergency plan. Finally, it depends on the
local or national security strategy and the legal framework whether an early warning system for landslides is organized by several operators or follows the single-voice principle.

4.2 Warning management
The socio-scientific understanding of warning management also includes organizational measures like awareness creation, the implementation of alarm chains and action planning. Methods like human observations or archived materials from earlier events should also be included and analysed. The objective is to establish long-term data and observation series for hazard assessment and to reduce risk through the development of a risk culture in situ. In this context, the implementation of an early warning system requires
– a strategy for a greater awareness of geohazards among officials, authorities and citizens
– the identification of key actors and decision makers in situ
– the development of user profiles and user-centred information
– the integration of disaster profiles and scenarios in emergency plans
– standardized procedures in lines of communication and warning
– an open discussion about the handling of the remaining risk

5. Conclusion
The SLEWS project investigates the whole information chain of a landslide early warning system. The major technical components of the system are now running and will be further tested under laboratory and field conditions. Laboratory and field tests with the wireless sensor network have shown that the system works in the expected way, even under difficult environmental conditions. This shows that wireless sensor networks are a promising alternative to conventional systems in terms of cost efficiency, flexibility, easy installation, energy efficiency and reliability. The developments in
the Sensor Web Enablement (SWE, OGC) provide new perspectives regarding easier availability and integration of sensor data. The application of interfaces according to OGC guidelines allows web-based modular monitoring and early warning systems to be set up, integrating data management, processing and visualisation in a flexible system. The currently running infrastructure already fulfils the main requirements of the project. The operation of the sensor network is partly managed using the developed infrastructure. To provide full operational control in the future, bi-directional communication by accessing remote functions has to be established. In a long-term field test during winter, remote operation with distributed components will be tested under real conditions. The investigation of the stakeholders and their demands, as well as of the warning management, brought up surprising results. Distributed responsibilities, especially in Germany, make it difficult to develop a general people-centred warning chain. Instead, standard scenarios for critical event situations have to be developed in a pre-event phase at the local level. Best practice examples can assist these efforts and help to establish standards. Today's systems are developed from a technical perspective and normally reflect the state of the art. Despite this, lead time and suitable action plans are often much more important for the efficiency and success of monitoring or early warning systems. Soft factors like risk awareness and fast but precise reactions in an event situation may have a larger impact than more sophisticated technical systems. One step in this direction may be a more structured and adapted warning management considering different stakeholders, information and reaction levels. In future field tests, the demands and perspectives of a structured warning management will be investigated. A shift of strategy from the best available technical equipment to suitable equipment and an information management that considers these demands can help to make warning management more efficient in the future.
6. References
Arnhardt, C., Nakaten, B., Fernandez-Steeger, T. M., Azzam, R. (2008): Data-Fusion of MEMS-Sensors in a wireless Ad-Hoc Multi-Hop Sensor Network for Landslide Monitoring and Realtime Early Warning. In: Schriftenreihe der Deutschen Gesellschaft für Geowissenschaften, International Conference and 160th annual meeting of the Deutsche Geologische Gesellschaft für Geowissenschaften e.V. and 98th annual meeting of the Geologische Vereinigung e.V., Geo 2008 Aachen, 176.
Botts, M. (2007): OGC White Paper – OGC Sensor Web Enablement: Overview and High Level Architecture. Version 3. URL: http://www.opengeospatial.org/pressroom/papers. OGC 07-165.
Garich, E. A. (2007): Wireless automated monitoring for potential landslide hazards. Master Thesis, Texas A&M University, 48 p.
Malet, J.-P., Laigle, D., Remaitre, A., Maquaire, O. (2005): Triggering conditions and mobility of debris flows associated to complex earthflows. Geomorphology, 66, 215–235.
ScatterWeb GmbH (2007): Company presentation. http://www.scatterweb.com (accessed 04/2008).
Simonis, I. (2007): Sensor Alert Service Implementation Specification. Version 0.9.0. URL: http://www.opengeospatial.org/standards/requests/44. OGC 06-028r5.
Sohraby, K., Minoli, D., Znati, T. (2007): Wireless Sensor Networks – Technology, Protocols, and Applications. 328 p.
ILEWS – Integrative Landslide Early Warning Systems

Bell R. (1), Becker R. (2), Burghaus S. (3), Dix A. (4), Flex F. (5), Glade T. (1)*, Greiving S. (5), Greve K. (6), Jäger S. (7), Janik M. (8), Krummel H. (8), Kuhlmann H. (3), Lang A. (4), Li L. (3), Mayer C. (6), Mayer J. (6), Padberg A. (6), Paulsen H. (9), Pohl J. (6), Röhrs M. (4), Schauerte W. (3), Thiebes B. (1), Wiebe H. (8)

(1) Institut für Geographie und Regionalforschung, Universität Wien, Austria
(2) IMKO Micromodultechnik GmbH, Ettlingen, Germany
(3) Institut für Geodäsie und Geoinformation, Universität Bonn, Germany
(4) Institut für Geographie, Universität Bamberg, Germany
(5) plan + risk consult, Dortmund, Germany
(6) Institut für Geographie, Universität Bonn, Germany
(7) geomer GmbH, Heidelberg, Germany
(8) GeoFact GmbH, Bonn, Germany
(9) Terrestris GmbH & Co KG, Bonn, Germany

* Coordinator of the project: Univ.-Prof. Dipl.-Geogr. Dr. Glade, Universität Wien
1. Introduction
Landslides cause fatalities and economic damage all over the world. In most cases, severe consequences could have been reduced if a reliable and understandable warning had been provided in time. Since the possibilities to predict landslides vary significantly, early warning of landslides is a challenging topic. New technologies are needed to set up reliable early warning systems. However, a well-working technical early warning system might not be sufficient if the issued warning is not understood by the threatened people. Thus, effective early warning must also integrate social science, humanities and decision making to ensure that the early warning system meets the needs of the involved players and the threatened people. The main aim of the project ILEWS is to design and implement such an integrative early warning system for landslides, which provides information on future events with regard to local and regional requirements.
Key project targets are:
1. Formulation of an integrative early warning concept for landslides.
2. Investigation and installation of an adapted early warning system on a landslide in Lichtenstein-Unterhausen (Swabian Alb, Germany).
3. Monitoring, parametrisation and modelling of local data.
4. Development of risk management options.
5. Provision of the information necessary for early warning.
6. Integration of warning into the respective social processes of decision making, and strengthening the awareness of the underestimated risks that are associated with landslides.
7. Transfer of the concept to an area in South Tyrol with already existing monitoring stations which are not yet networked in the above-mentioned manner.

2. Study areas
The study areas are located in the Swabian Alb (Germany) and South Tyrol (Italy). In the Swabian Alb, a historically active complex rotational slide in Lichtenstein-Unterhausen is
equipped with a new and innovative sensor combination to investigate the landslide movements in detail and to set up and test different kinds of landslide early warning models. Reactivations of at least parts of the landslide cause significant damage to a building. In South Tyrol, research focuses on a debris flow in Nals, where sediments are delivered by sliding processes further upslope, and on a complex slide in Pflersch, without the installation of new instruments. Whereas in Nals an early warning system has already been implemented, there is none available in Pflersch. In South Tyrol, regional analyses regarding debris-flow-triggering rainfall thresholds are carried out.

3. Structure of project and research programme
The ILEWS project consists of the following three clusters: Monitoring, Modelling and Implementation.

3.1 Cluster Monitoring
Within the cluster Monitoring, local measurements in Lichtenstein-Unterhausen are carried out and data is collected, transferred and stored. More detailed information is listed below.

3.1.1 Measurements
– Slope movement: inclinometer chain (continuously), inclinometer (periodically), tacheometric surveying supported by the application of a scintillometer and precision levelling (periodically to episodically)
– Soil moisture: 2D/3D geoelectrics (every two hours, automated analysis), TDR probes (continuously), tensiometers (continuously), »Spatial«-TDR cable (continuously)
– Meteorology: temperature, precipitation, wind, radiation, snow (continuously)

3.1.2 Data transfer
– Data collection via wires
– Wireless sensor networks
– Temporary storage of data on local servers
– Continuous data transfer via DSL to central servers
3.2 Cluster Modelling
Within this cluster, data from the local measurements is used to set up and test the following early warning models:
– Movement-analysis early warning model
– Physically-based early warning model
– Empirical early warning model
Furthermore, if rainfall thresholds can be determined which, when exceeded, might trigger landslides, the possibility of integrating the weather forecast into the respective models will be investigated. In addition, historical and current frequency-magnitude relationships will be examined to better understand the general activity of the investigated geosystems and to issue early warnings more reliably.

3.3 Cluster Implementation
Within the cluster Implementation, modern geoinformatic approaches are applied to manage and disseminate information, a cooperative risk communication is set up and early warning is integrated into an integrated risk management.

3.3.1 Information management
– Setup of a decision-support information system
– User-optimised preparation and visualisation of the results and warnings
– WebGIS, SensorGIS, standardised and interoperable geodata infrastructure

3.3.2 Cooperative risk communication
– Analysis of the general local and regional needs
– Cooperative implementation of the early warning system

3.3.3 Integration of early warning into an integrated risk management
– Definition of protection targets based on damage potentials
– Analysis of alternative management options (e.g. spatial planning, protection measures) based on the respective laws and their discussion with the involved actors
4. Status of the subprojects and intermediate results
In Lichtenstein-Unterhausen (Swabian Alb), new and innovative sensor combinations were installed, which are shown in Fig. 1.

4.1 Subproject »Geomorphic Modelling«
The overall aim of the subproject »Geomorphic Modelling« is the monitoring of subsurface landslide movement and the modelling of early warning using three different approaches: movement-based modelling of early warning, a physically-based model and a statistical-empirical analysis of critical thresholds. In the following, the focus lies on the most advanced model, the physically-based modelling of early warning implemented as a web processing service. Landslide movement has been monitored periodically by mobile inclinometers since 2004 and continuously with an inclinometer chain since 2007. Data is automatically sent to the server in the field and to the ILEWS database. Movements differ seasonally: whereas in summer/autumn there is an 8.5 m deep flowing move-
ment following heavy summer rainfalls, in spring sliding movements down to 15 m depth occur, mainly after snow melt (Fig. 2). In general, both movement types are extremely slow (following the classification of Cruden & Varnes 1996), so that the total maximum displacement is approx. 1.2 cm from August 2004 to February 2009. Subsurface and surface movement rates were analysed in cooperation with the subproject »Geodetic Modelling«; in some cases the movement rates are in good agreement (see also subproject »Geodetic Modelling«). Drilling cores were taken in the field and sampled in the laboratory to analyse e.g. grain size distribution, moisture content and shear parameters. Based on the results of the laboratory analysis and the geophysical surveys by the subproject »Moisture Geoelectric«, a general subsurface model was created. This model is the basis for the physically-based early warning model using the Combined Hydrology and Stability Model (CHASM), which was initially developed by e.g. Anderson 1990 and Wilkinson et al. 2002. This model is used for the calculation of the most likely
Fig. 1: Locations of installations in Lichtenstein-Unterhausen (Swabian Alb, Germany)
Fig. 2: Measured subsurface displacements at inclinometer Lic02
shear surfaces. Furthermore, the model is implemented as a web processing service (by the subproject »Info-Management«) to enable the web-based calculation of early warning. Two features are implemented as web processing: an automated and a user-initiated calculation of slope stability. The automated computation is performed along two profiles: the first follows the geoelectric monitoring profile (run by the project partner »Moisture Geoelectric«) and has a length of 135 m, the second one is 300 m long and reflects the situation of a full reactivation of the landslide body. The user-initiated feature allows the calculation of slope stability along freely selectable profiles. Additionally, the user can select from rainfall scenarios and initial soil-moisture states and start the slope stability calculation from a conventional web browser. The web processing service is implemented and currently runs on test data regarding soil moisture. The full integration of real-time measurements of soil moisture and rainfall, and eventually weather forecasts, is expected for late 2009. Rainfall scenarios are based on the German weather service's KOSTRA2000 atlas (Koordinierte Starkniederschlags-Regionalisierungs-Auswertungen) and comprise storm events with varying duration (1 hour to 72 hours) and probability of occurrence (1 to 100 years).
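As a much simpler stand-in for the coupled hydrology and stability calculation performed by CHASM, the following sketch shows how rising pore water pressure lowers a classical infinite-slope factor of safety; all material parameters are hypothetical and not those of the Lichtenstein-Unterhausen slope:

```python
import math


def infinite_slope_fs(c_kpa, phi_deg, gamma_kn_m3, depth_m, slope_deg, pore_pressure_kpa):
    """Classical infinite-slope factor of safety:
    FS = (c' + (gamma*z*cos^2(beta) - u) * tan(phi')) / (gamma*z*sin(beta)*cos(beta))."""
    beta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    normal_stress = gamma_kn_m3 * depth_m * math.cos(beta) ** 2
    shear_stress = gamma_kn_m3 * depth_m * math.sin(beta) * math.cos(beta)
    return (c_kpa + (normal_stress - pore_pressure_kpa) * math.tan(phi)) / shear_stress


# Hypothetical parameters: a wetter scenario (higher pore pressure u) lowers the factor of safety.
for u in (0.0, 20.0, 40.0):  # kPa
    fs = infinite_slope_fs(c_kpa=8.0, phi_deg=22.0, gamma_kn_m3=20.0,
                           depth_m=8.5, slope_deg=12.0, pore_pressure_kpa=u)
    print(f"u = {u:4.1f} kPa -> FS = {fs:.2f}")
```

The same logic, with a far more sophisticated hydrological forcing and slip-surface search, is what the web processing service evaluates along the selected profiles for the chosen rainfall scenario.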
To date, the movement-based early warning model, which follows the very promising approach of Petley et al. (2005) of analysing the reciprocal movement rate over time, could not be applied due to the extremely slow movements and the missing significant acceleration of the landslide, which is essential for this approach (a minimal sketch of this inverse-velocity idea is given below). However, if the landslide starts to accelerate, the approach will be applied and further developed. The third early warning model applies the statistical analysis of critical thresholds in all data regarding slope movement, soil moisture and weather conditions. First analyses were carried out and will be intensified as soon as the improved analysis and visualisation tools of the subprojects »SensorGIS« and »Info-Management« are available. Furthermore, the transferability of the concepts outlined to the study sites in South Tyrol has been discussed with the Geological Survey of South Tyrol. Whereas the technical early warning system developed by ILEWS does not seem suitable for the very active landslides in the debris-flow catchment in Nals, it might be beneficially applied to the landslide in Pflersch.
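As referenced above, the inverse-velocity approach of Petley et al. (2005) exploits the observation that, during accelerating creep, 1/v often decreases roughly linearly with time, so that extrapolating the fitted line to 1/v = 0 yields an estimate of the failure time. A minimal sketch with synthetic data (not measurements from Lichtenstein-Unterhausen):

```python
# Inverse-velocity extrapolation: fit a straight line to 1/v over time and
# estimate the time at which 1/v reaches zero (projected failure time).
times_days = [0.0, 2.0, 4.0, 6.0, 8.0]          # synthetic observation times
velocities_mm_day = [0.5, 0.7, 1.1, 2.2, 8.0]   # synthetic accelerating movement

inv_v = [1.0 / v for v in velocities_mm_day]

# Least-squares line inv_v = a + b * t (no external libraries needed).
n = len(times_days)
mean_t = sum(times_days) / n
mean_y = sum(inv_v) / n
b = sum((t - mean_t) * (y - mean_y) for t, y in zip(times_days, inv_v)) / \
    sum((t - mean_t) ** 2 for t in times_days)
a = mean_y - b * mean_t

t_failure = -a / b  # time where the fitted 1/v line crosses zero
print(f"projected failure at t = {t_failure:.1f} days")
```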
Fig. 3: Results of the deformation analysis from multiple two-epoch comparisons (measuring dates: Nov 2007, Mar 2008, Aug 2008, Sep 2008, Nov 2008, Mar 2009, Jun 2009)
First rainfall thresholds for the initiation of debris flows were derived at a regional scale for South Tyrol based on data from weather stations. Currently, it is being analysed whether the utilisation of rainfall radar data can confirm or even improve the derived thresholds. In a last step it will be tested how the applied models can automatically support each other. Finally, numerous interviews were carried out together with the subprojects »Communication« and »Management« (see their reports for the main results).

4.2 Subproject »Geodetic Modelling«
The aim of the subproject »Geodetic Modelling« is the surface surveying of the extremely slowly moving landslide with different geodetic methods and the development of a complex landslide model to describe current and predict future movements. To survey the landslide, seven epochs have been measured so far. These measurements were done by precise levelling and by surveying a precise geodetic tacheometry network. The results of the basic deformation analysis of the tacheometric network are shown in Fig. 3. Significant movements of the landslide can be found in the area around
three different point groups: point 201, points 202 to 204, and the area of points 205 to 207. All of these points are located on or near the upper street of the research area, where the gradient of the slope is higher than at points 101 to 106 on the lower street. Because of the higher gradient in this area, higher movement rates were expected, which are now confirmed by the surveying. The movements at points 205 to 207 show a constant downhill movement, overlaid by sideward movements. Though these sideward movements cannot yet be fully explained, they are confirmed by inclinometer measurements near point 306 carried out by the subproject »Geomorphic Modelling« (see also Burghaus, Bell and Kuhlmann 2009). The movement at point 201 is interpreted as an effect of swelling and shrinking of clay-rich soils in the underground, following dry and wet periods; levelling results support this hypothesis. The recent uphill movement of points 202 to 204 might indicate rotational sliding movements. Rotational movements could also be found in inclinometers Lic02 and Lic04 of the subproject »Geomorphic Modelling«, but in February 2009. Lic05 is quite close to point 204. Unfortunately,
the uphill movement there is not as strong as at the other locations, and the uphill movement in February cannot be recognised in the periodic tacheometric measurements. At present, a complex landslide model is under development, which shall include, in addition to the geodetic measurements, data from other sensors like inclinometers, soil moisture sensors, tensiometers, precipitation sensors, etc. This landslide model shall make it possible to predict future movements of this landslide depending on precipitation and soil moisture. Another advanced method to monitor landslides in real time is GPS measurement, where real-time series analysis is one of the major parts, providing an important basis for identifying a landslide. To obtain the observations, one GPS experiment was carried out in Bonn, because GPS cannot be used in Lichtenstein-Unterhausen due to limited satellite visibility and the extremely slow movement rate. If the GPS observations with a high sampling rate are correlated, a shaping filter can be used to describe the long-term behaviour of the correlated measurement deviations. The accuracy of the processed time series can be improved by the use of a Kalman filter with shaping filter. Because of the similarity and the differences between deformations and outliers, the Kalman filter with shaping filter is used to detect and distinguish deformations and outliers simultaneously, and it is discussed how to determine the state vector when a deformation or an outlier has occurred. Details of the results can be found in Li & Kuhlmann (2008a). Two other methods were also used to reduce the coloured noise and detect the deformation epochs with less time delay: a sequential algorithm and a finite impulse response (FIR) filter. The FIR filter removes noise by weighting the previous observations. The Kalman filter removes noise from the signal by using the initialisation and the propagation of error covariance statistics. For static GPS measurements, the accuracy is in the order of a few millimetres.
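The shaping-filter extension is specific to the cited work; as a generic illustration of how Kalman filtering smooths a noisy GPS coordinate component, the following scalar random-walk filter may serve (all noise parameters are assumed and not those used in the project):

```python
def kalman_1d(observations_mm, process_var=0.01, obs_var=25.0):
    """Scalar Kalman filter for a single coordinate component:
    state = position (mm), random-walk process model, noisy GPS observations."""
    x, p = observations_mm[0], obs_var   # initial state estimate and its variance
    filtered = [x]
    for z in observations_mm[1:]:
        # Prediction step: position assumed unchanged, uncertainty grows slightly.
        p += process_var
        # Update step: blend prediction and new observation according to the Kalman gain.
        k = p / (p + obs_var)
        x += k * (z - x)
        p *= (1.0 - k)
        filtered.append(x)
    return filtered


# Hypothetical north-component time series in millimetres with a few millimetres of noise.
series = [0.2, -3.1, 4.0, 1.2, -2.5, 0.8, 6.3, 2.1, -1.0, 3.4]
print([round(v, 1) for v in kalman_1d(series)])
```

A sustained offset in the filtered series would indicate a deformation, whereas a single spike that the filter quickly absorbs is more likely an outlier; the shaping filter and the sequential algorithm described in the text refine exactly this distinction.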
The different methods differ in their ability to detect deformation. For example, the Kalman filter with shaping filter can detect a deformation of 5 mm. The sequential algorithm can detect a deformation of 1 mm, but it causes a larger time delay than the other two methods. A thorough comparison of these three methods has been made in Li & Kuhlmann (2008b). Such high GPS accuracies can only be achieved if 7–8 satellites are available simultaneously. Unfortunately, this is not the case in the study area Lichtenstein-Unterhausen. In other landslide areas with better satellite availability, GPS measurements using one of the developed methods enable the monitoring of even extremely slow landslides.

4.3 Subproject »Setup Monitoring«

The most important objective of this subproject is the coordination and hardware-related integration of heterogeneous field sensors into a unified, robust and simple-to-use measurement system, as a standalone component of an extensive landslide early warning system. A slide-prone hill slope is a system with an internal state reacting to external loads and forces. The inner state is determined by quasi-static morphological characteristics (e.g. soil type, grain size distribution, landscape formation, stratification) and time-varying soil hydraulic variables such as soil moisture, suction and pore water pressure. Both static and dynamic variables define the soil mechanical stability at a given time. Therefore, diligent monitoring of soil hydraulic variables is crucial to assess time-varying slope stability. Moreover, the load on the system, i.e. precipitation in the form of rain and snow, has to be captured, too. The system reacts to the load depending on the current state of inner stability. To fulfil the particular needs for a robust, flexible and scalable soil monitoring system, a distributed soil hydraulic measuring system including a weather station for the main atmospheric variables (temperature, pressure, precipitation, wind, radiation) has been set up in
close cooperation with the subproject »Geomorphic Modelling«. The technical components chosen are well known for their reliability. Among these are IMKO Time Domain Reflectometers (TDR), which measure soil moisture reliably even in problematic soils exhibiting electromagnetic wave dispersion (e.g. clay) and energy dissipation (saline soils), and widely used tensiometers by UMS to measure the soil matric potential. TDRs and tensiometers were installed in pairs at three different depths (approx. 2 m, 5 m and 10 m) along nine vertical profiles. Seven vertical profiles are aligned along two perpendicularly crossing transects. A Spatial TDR (STDR) system, which is still under construction, complements the soil water monitoring. Spatial TDR allows the derivation of spatially continuous dielectric profiles along elongated TDR wave guides such as custom-built three-wire ribbon cables. This is done by inverse parameter estimation similar to tomography. Moisture profiles are obtained from the dielectric profiles by means of calibration functions. The special STDR ribbon cable, with RF relays at each end right at the junction to the coaxial feeding cable, has already been built and buried. Both feeding cables will be connected to a custom-designed RF multiplexer allowing TDR measurement from both sides. The development of the control software is the final step. To test different data collection and dissemination schemes, data is collected partly by means of solar-powered data loggers storing information locally and transferring it periodically to an internet host, and partly by means of a common PC set up in a container installed on site and connected to the power grid and the internet. For remote sensors a wireless sensor network is under construction (in cooperation with the subproject »SensorGIS«). A significant part of a modular monitoring system is a dedicated, self-contained database to provide a harmonised way of data sharing. PostgreSQL has been chosen as the object-relational database management system (ORDBMS).
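For illustration only, a minimal measurement table and insert routine might look as follows (Python with psycopg2); the table layout, column names and connection string are assumptions and not the actual ILEWS data model:

```python
import psycopg2  # PostgreSQL database adapter

DDL = """
CREATE TABLE IF NOT EXISTS measurement (
    sensor_id   integer          NOT NULL,
    observed_at timestamptz      NOT NULL,
    value       double precision,
    quality     smallint,        -- simple quality flag set on import
    PRIMARY KEY (sensor_id, observed_at)
);
"""

def store_reading(conn, sensor_id, observed_at, value, quality=0):
    """Insert one sensor reading into the monitoring database."""
    with conn.cursor() as cur:
        cur.execute(
            "INSERT INTO measurement (sensor_id, observed_at, value, quality) "
            "VALUES (%s, %s, %s, %s)",
            (sensor_id, observed_at, value, quality),
        )
    conn.commit()

if __name__ == "__main__":
    conn = psycopg2.connect("dbname=ilews user=monitoring")  # placeholder DSN
    with conn.cursor() as cur:
        cur.execute(DDL)
    conn.commit()
```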
Such a database facilitates the perusal of raw measurement data and the assessment of data quality and sensor health. Moreover, it provides standard import and export interfaces. The monitoring database is linked to the central ILEWS database and forms an important part of the early warning system. Even so, it can be run on its own and hooked up to another system, ensuring modularity and scalability. For fast access to quality-of-service data (sensors available, data loggers reachable, hosts running), Nagios has been installed and is still under test. Nagios is an open-source computer system and network monitoring application; it is extremely flexible and extensible, so that it can be used to inform and alert an operator about the health of the monitoring system via e-mail, SMS or a web frontend. It could even be configured to disseminate alerts generated by the early warning system. Soil moisture and weather data are analysed in cooperation with the subprojects »Geomorphic Modelling« and »Moisture Geoelectric«.

4.4 Subproject »Moisture Geoelectric«

Pilot surveys using seismics and DC resistivity had been performed at the landslide location in Lichtenstein-Unterhausen prior to the installation of the DC resistivity monitoring system. The aim was to find the best configuration for the monitoring setup and to provide a model of the subsurface, especially of the potential landslide body. The latter was derived in cooperation with the project partner »Geomorphic Modelling«. Based on the results of the pilot survey, a DC resistivity monitoring system consisting of two perpendicular geoelectrical lines has been installed. Every two hours a measurement is carried out and the raw data saved locally. In cooperation with the project partner »SensorGIS« the raw data are then transferred via an FTP connection to the central database, and also to »Moisture Geoelectric«, where the data are processed further. The monitoring system is remote-controlled via an OpenVPN connection, which allows for
maintenance, troubleshooting and changing of system parameters without the need for on-site inspection. The raw data are processed in several steps (a minimal sketch of the screening steps is given at the end of this subsection):
– Visualisation of the data sets in so-called pseudosections. In cooperation with »SensorGIS« web-based data visualisation was implemented.
– A series of procedures to clean the data of outliers etc.
– Cutting off datum points above 1200 Ωm and below 1 Ωm
– Using mean and median filters
– Sorting the data into a continuous time series for each electrode configuration
– Converting the raw data into a special file and data format for use with the inversion software (Res2DInv)
– Piping the transformed data into the inversion software
– Sorting the inversion results into timelines so that changes can be followed over time
A crucial point in this procedure is the elimination of outliers. Only a clean data set will provide reliable information for further analyses, and the possibilities for automatic outlier removal are limited. A quick look at the visualised data usually provides better results than programmed algorithms. Nonetheless the automation is useful, as large quantities of data
are produced every day and this approach significantly reduces the effort of manual data processing. Checking for unusual results after the inversion process can indicate corrupt data sets, which can then be cleaned by hand. The calibration of the measured apparent resistivity values against the measurements from the TDR probes provided by the project partner »Setup Monitoring« has so far produced mixed results. Not every change in the TDR data can be assigned to changes in resistivity, and vice versa. One possibility for observing changes in subsoil parameters is the so-called time-lapse inversion. Here the differences between several data sets are processed, not the data sets themselves. Several time-lapse inversions have been carried out around specific points in time where changes in soil moisture or rain events were recorded. For example, the data sets from 23rd to 26th February 2009 show fluctuations in the TDR data, especially for the sensor triple 30003, 30002 and 30013, and rain events were recorded in this period (Fig. 4). Fig. 5 shows a line plot of a selected data point from the resistivity monitoring for the same period. With the use of time-lapse inversion the changes in soil moisture can be followed. The inversion results also indicate that sensor 30002 may be positioned inside a clay lens,
Fig. 4: Raw data of the TDR sensors
Fig. 5: Rain events and apparent resistivity for a selected data point
Fig. 6: Result of the time lapse inversion. The first image shows the reference model, the others the change in resistivity in percent relative to the reference model at different times
resulting in damped amplitudes in the TDR data. The other sensors do not react in the same way; only sensor no. 30005 shows an increase in soil moisture, which corresponds to the near-surface saturation changes visible in the time-lapse results (Fig. 6). The above results show the great potential of DC resistivity measurements for soil moisture monitoring. With DC resistivity monitoring, water paths within the landslide body as well as the temporal infiltration of rainwater into the subsurface can be analysed and visualised. Both effects are essential for the early warning modelling of landslides.
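A minimal sketch of the screening steps listed above might look as follows (Python with pandas); the column names, the filter window and the rolling median as the concrete filter choice are assumptions, not the project's actual processing code:

```python
import pandas as pd

def clean_apparent_resistivity(df: pd.DataFrame) -> pd.DataFrame:
    """Screen raw apparent resistivity readings before inversion.
    Expects columns 'config' (electrode configuration id), 'time' and
    'rho_a' (apparent resistivity in Ohm*m)."""
    # cut off physically implausible datum points
    df = df[(df["rho_a"] >= 1.0) & (df["rho_a"] <= 1200.0)].copy()
    # sort into a continuous time series for each electrode configuration
    df = df.sort_values(["config", "time"])
    # rolling median per configuration as a simple spike / outlier filter
    df["rho_filtered"] = df.groupby("config")["rho_a"].transform(
        lambda s: s.rolling(window=5, center=True, min_periods=1).median()
    )
    return df

# The cleaned series would then be written to the Res2DInv file format
# and piped into the (time lapse) inversion.
```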
4.5 Subproject »CoreSDI«

»CoreSDI« is responsible for the technological foundation of a Spatial Data Infrastructure for early warning applications in ILEWS. A Spatial Data Infrastructure (SDI) is an infrastructure connecting a multitude of distributed resources (data and services) and providing a well-defined set of users with the means to access these resources through standardised interfaces. The infrastructure of an early warning system (EWS) is highly complex. An EWS incorporates sensors, databases, data processing via GIS, forecast and model generation, computer networks and other user devices, as well as procedures for handling the results.
Significant problems arise at the syntactic and semantic level when heterogeneous components are combined in complex systems. OGC interoperability testbeds and pilot projects show the traditional technologies being replaced by Sensor Web, Open Web Services and Web Processing Services. The services are based on the Web Service paradigm and originate from the OGC Spatial Web approach. These technologies are more suitable for achieving interoperability than traditional technologies. Each Web Service is responsible for a single spatial information task and is deployed in an infrastructure of several different modules. Because of the high level of complexity, it was necessary for »CoreSDI« to carry out a thorough analysis of the requirements of data providers and customer needs prior to modelling the SDI architecture. In the ILEWS project the SDI consists of technical components such as sensors for measuring subsurface movements, the data generated by these sensors and services for accessing these data, as well as organisational components such as the rule set for data access. »CoreSDI« is concerned with developing the actual architecture of this SDI to facilitate the use of the data generated by the project partners. The SDI is necessary for satisfying the informational and communicational needs of the other parts of the project. For this task a tight cooperation between the subprojects »SensorGIS«, »Info-Management« and »CoreSDI« has been established. Together with these project partners a data model was developed that is used for storing the data generated by the sensors in the central database. Sensors, including meta information and their input and output properties, can be described using SensorML documents. The SensorML standard developed by the OGC specifies »models and XML encodings that provide a framework within which the geometric, dynamic, and observational characteristics of sensors and sensor systems can be defined« (www.opengeospatial.org/standards/sensorml). »CoreSDI« supports the project partners by creating SensorML documents for some of the
sensors used in the project. The SensorML documents are also utilised when setting up Sensor Observation Services (SOS) that provide the sensor measurements in a standardised way, so that these services can be integrated into the project's SDI. Further activities of »CoreSDI«, apart from those mentioned above, include providing advice to the project partners that implement and run the SDI, as well as supporting all users of the project's SDI. The »CoreSDI« part of the project aims at providing a flexible infrastructure for spatial data management and processing to the project partners.

4.6 Subproject »SensorGIS«

Integrative landslide early warning systems depend on sensors, computers, transmission technology and other hardware. Sensor data in conjunction with historical facts from archives and information gleaned from interviews with people involved in the process constitute an information system that is designed to furnish early warnings. The aim of »SensorGIS« in the current project was to provide an infrastructure to process incoming data and to facilitate easy access to these data through an interactive, web-based client. Since the last science report in 2007 many sensors have been installed in the research area in Lichtenstein-Unterhausen. Among these are DC resistivity, soil moisture, soil suction, soil temperature, meteorological and inclinometer sensors. Apart from the measurements for a geodetic network, all sensors take readings at regular intervals and transmit their data automatically to the central database via a container installed on site that is connected to the power grid and to the internet. Two servers, one productive and one as a backup, were configured with the necessary software and located in Bonn and Heidelberg, Germany, respectively, to ensure the redundancy and fail-safe setup needed for an early warning system. Once a day data is transferred from the devices in the field and the backup
server synchronises itself with the productive server. Should the main server fail, the IP address would automatically be mapped to the second server without the user noticing, thus enabling uninterrupted service. The system administrator would receive an automated message so that the problem can be rectified. Apart from being involved in the technical setup described above, »SensorGIS« was instrumental in storing and visualising the incoming data. For storage purposes a PostgreSQL database is used in combination with the PostGIS library, which enables the direct storage of geo-objects. For visualisation purposes the PHP framework symfony was used to program graphical user interfaces (GUIs), in cooperation with the subproject »Info-Management«, that allow interactive viewing and analysis of the data. Provided the user has the appropriate rights, data can also be modified or deleted. Bringing the abundance of data online confirmed an initial assumption: the system is designed to be used by experts who are familiar with the data and who know how to extract the appropriate information from the tables and graphs. Another initial idea was to make the system also accessible to local decision makers and the interested public as a target group. Interviews conducted by the subprojects »Communication«, »Management« and »Geomorphic Modelling« revealed that this target group is basically not interested at all because there is little to no perceived risk. This result has had a fundamental influence on the design of the second GUI, developed in cooperation with »Info-Management« and »Communication«, which was made as simple as possible to accommodate the wishes of this target group. Another aspect of the system being built is its interoperability with other systems. In this respect, standards of the Open Geospatial Consortium (OGC) are being met, and to date the Web Map Service (WMS), Web Feature Service (WFS), Web Processing Service (WPS) and Sensor Observation Service (SOS) standards
have been implemented. Adhering to these standards will allow future expansion of the system because other standards-based services will easily fit into the given architecture. In the same way, services of ILEWS can easily be integrated into other systems. Interoperability tests will be conducted together with the research projects SLEWS, alpEWAS and EWS-Transport.

4.7 Subproject »Info-Management«

The subproject on information management, in close cooperation with the subprojects »SensorGIS« and »CoreSDI«, has developed and tested the database model and the use case definitions for ILEWS. The database model includes sub-models for the complete monitoring components of the field instrumentation, whose data are transferred remotely to two database servers. Additional data models for socio-economic as well as for historic data have been designed. The geospatial components used are compatible with OGC standards. The two database servers, located in different geographical locations, work as synchronised backups for each other, so that if one of them malfunctions, the other can take over. The collaborative development process is supported by the PHP web development framework symfony. In close cooperation with the subproject »Geomorphic Modelling« the slope-stability model CHASM (University of Bristol) was implemented as an OGC-compatible Web Processing Service (WPS). The model can be operated automatically or manually. In the first case it is fed with design rainfall scenarios derived from the German Weather Service's KOSTRA atlas, which are compared with the monitoring data. The design rainfalls have been integrated into the data model, too. In manual mode the WPS can be »fed« with local datasets by a modelling specialist, in which case the terrain and subsurface data are located on the server but the model parameters are set by the modeller. Technically, CHASM has been wrapped in software wrapper classes. The conceptual setup of the WPS is shown in Fig. 7.
Fig. 7: Schematic description of the WPS setup for slope stability modeling
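For illustration, an automated client could trigger the slope-stability WPS with a standard OGC WPS 1.0.0 Execute request; the endpoint URL, process identifier and input names below are invented placeholders and not the actual ILEWS service:

```python
import requests

WPS_URL = "http://example.org/ilews/wps"   # placeholder endpoint

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "CHASM",  # assumed process identifier
    # key=value pairs separated by ';' as defined for WPS KVP DataInputs
    "datainputs": "rainfall_scenario=KOSTRA_design;profile=lichtenstein_T1",
}

response = requests.get(WPS_URL, params=params, timeout=60)
response.raise_for_status()
# The ExecuteResponse XML would contain the computed slope stability result.
print(response.text)
```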
In a further cooperation among the subprojects »Info-Management«, »SensorGIS« and »Communication«, and based upon the findings of the latter subproject, a hierarchical warning system has been designed and is currently being implemented. An automated database analysis will issue a preliminary warning. However, due to the complex nature and mainly due to the very low frequency of movements in the study area of Lichtenstein-Unterhausen, a final warning will only be issued after at least one expert has ruled out a measurement error. Several AJAX-based data visualisations have been developed and implemented. The most important ones are the integrative real-time visualisation of the field monitoring data and the tools for controlling and visualising the slope stability model.

4.8 Subproject »Communication«

The subproject »Communication« aims to break new ground by penetrating the complexity of social actor systems with the help of cooperative interviews and by developing sensible solutions in collaboration with the involved players. For early warning systems to be functional and efficient, they have to be thought of as integrative chains. The players are then not the last link of the chain, but should play a central role throughout the development of an early warning system. If the players merely have a
significance as a social variable to be calculated, the social implementation of early warning will remain a »black box«. Therefore, the end users are the centre of interest for the subproject. The subproject »Communication« aims to clarify general local and regional needs regarding a landslide early warning system. Interestingly, intermediate results show that local authorities, stakeholders and residents do not show any interest in a landslide risk management strategy. They are not interested in a landslide early warning system at all. Local people do not see any necessity for such a system. Stakeholders show little sensitivity towards infrequent events such as landslides. The potential damage is often underestimated by decision makers and the public. Given these findings, and contrary to all expectations as well as to the research results of the DFG project InterRisk, the involved stakeholders show only very little willingness to cooperate. It was therefore exceedingly difficult to carry out the cooperative risk communication and the cooperative implementation of the early warning system at the local level with the involved stakeholders. The interpretation of the qualitative interviews with the involved stakeholders in the Swabian Alb shows that they do not need any precise information. They want only sparse and simplified information, for example in the form of a
»flashing red light signal«. Interviews were carried out together with the subprojects »Management« and »Geomorphic Modelling«. Results of several cooperative interviews as well as a workshop with experts in South Tyrol confirm these experiences for the investigation area of South Tyrol. Experts at the regional level at the provincial government take responsibility for prevention measures such as the development, implementation and maintenance of early warning systems. In collaboration with the involved players in South Tyrol, and together with the subproject »Management«, risk management options beyond technical early warning systems have been discussed. Thus, within the scope of an integrative risk management approach, existing vulnerabilities and action alternatives have to be taken into account. The implementation of the landslide early warning system, and hence the development of the user interface, has to take these findings into account. The user interface, cooperatively developed with the subprojects »SensorGIS« and »Info-Management«, has two profiles. The »non-professional« user profile is for interested laypersons, who need simplified information about the current slope situation. They receive this information in the form of a traffic-light status lamp. The »professional« user can see all the data, interpreted data as well as raw data, after a personal login. This second group has to check the system after a critical threshold is exceeded. These experts (e.g. system developers as experts for a specific module such as »Moisture Geoelectric«, responsible institutions such as regional authorities, the administrative district responsible for emergency management, etc.) decide whether there is an error in the measurements or whether the situation is truly critical. Depending on this data check, the traffic light turns red or back to green and further action can be taken if necessary. The requirements on the system (which information, how detailed, how reliable) vary depending on the particular situation. The physical and
social processes are specific, depending on the slope stability and the social setting. Because of the complexity of the social and physical processes, any landslide and any social setting is specific to its situation and is neither representative nor generalisable. The landslide early warning system developed by the ILEWS project provides the opportunity to customise the components to a particular situation and to the end users' requirements. In general, the responsibilities and competencies of the involved actors for landslide risk management play a decisive role in the implementation of an early warning system. Surprisingly, the regional level gains in importance. Organisations at the regional scale, for example the state office for geology, resources and mining or the regional planning administration, have a large scope and flexibility in decision making and in designing the risk management process. In this context, landslide risk management options can be developed together with these organisations. The consideration of landslides in the regional plan for the Swabian Alb on the recommendation of the ILEWS project exemplifies the possibilities of developing risk management options (see subproject »Management«). Generally, the important role of different social systems and the applicability of the social systems theory of Niklas Luhmann (Luhmann 1984) could be confirmed. The method of cooperative interviews turned out to be very successful for investigating the social processes in the context of early warning. The results of the subproject »Communication« make a substantial contribution to the technical development of the landslide early warning system.

4.9 Subproject »Management«

An early warning system has to be developed according to the requirements of its users, who are identified by the subproject »Communication«. This means that the management of information as well as the dissemination of information (risk communication) have an important role to play.
The research objective of the subproject »Management« is to broaden the perspective and to provide the stakeholders in the case study areas with an appropriate consideration of action alternatives. An early warning system, however, is only one of many options. Moreover, an early warning system should not be understood as a single isolated measure whose implementation is based only on the identified hazard. Rather, existing vulnerabilities and action alternatives have to be considered, too. In line with the envisaged integrated approach, different and sometimes alternative measures may compete with each other. Criteria for the assessment of measures – defined in cooperation with local stakeholders – are especially the protection goals for hazard-prone areas, but also aspects of efficiency and effectiveness. The analyses took place in two study areas: the municipality of Lichtenstein (Swabian Alb, Germany) and the municipality of Nals (South Tyrol, Italy). The report focuses on the Italian case study, because the municipality of Nals is much more affected by landslides (especially debris flows and rock falls). Moreover, precise information about the frequency and magnitude of landslides was only available for Nals. In contrast, landslides in the Swabian Alb are known as regional phenomena with low probability and mostly low velocity. However, the damage potentials in the threatened residential area »Im Weingarten« were analysed based on recovery costs. The estimated sum is about 18.6 Mio. € plus 1.7 Mio. € for public infrastructure. The project nonetheless finally concentrated on recommendations for regional planning in order to keep landslide-prone areas free of further development as far as possible or, where this is not possible, to adjust the technical building design to the given threats. This scientific advice has been integrated as »Vorbehaltsgebiet« (reserve zone) into the regional plan for the district of Neckar-Alb by the regional planning association. This is the first case ever where landslide
proneness is regarded as important by regional planning in Germany. An instrument called »Gefahrenzonenplan« (hazard zone plan) is obligatory for municipalities in the Italian study area of South Tyrol (see provincial ordinance GZP/KSR). According to this instrument, each land-use type defined by the urban land-use plan is assigned a specific loss potential rank between level 1 (not vulnerable) and level 4 (highly vulnerable). The results are shown in a so-called »Karte der Schadensanfälligkeit« (damage susceptibility map). Moreover, a hazard map is part of the »Gefahrenzonenplan«. Hazard-prone areas are classified into four classes, from H 0 = not endangered to H 4 = very high hazard intensity. Overlaid with the damage susceptibility of hazard-prone areas, a risk map is the final outcome, consisting of four risk classes from low to very high risk. In the framework of the ILEWS project these maps have been compiled by the subproject »Management« for the community of Nals. Since there is currently no land-use plan in force for the community of Nals, the presented map had to be worked out on the basis of existing land-use data. In addition to these maps, which are obligatory for each community, the existing damage potentials within hazard-prone areas of the study areas have been identified and processed at the meso-scale (at the level of land-use types). The overall damage potential is about 70 Mio. €. Moreover, an examination at the micro-scale within the identified hazard-prone areas was carried out. The main focus lies on objects of high economic value, such as buildings and technical infrastructure. Social values (human lives, health) and the environment were subordinated. Here, not current real estate values but recovery costs were considered. These costs were estimated on the basis of the official land register and combined with the construction cost index, which in Italy (South Tyrol) is part of the official statistics. Finally, annual estimated
losses, which result from the probability of the natural hazards under consideration, were calculated. The probability of the natural hazards was only available for the Italian case study. This is unproblematic for the German case study, since the building plots there are only affected by a landslide of currently extremely low velocity (as mentioned above). Thus, for every hazard-prone building in Nals (South Tyrol) the annual amount of (financial) loss was determined. Depending on the phenomenon, its frequency and the damage potential of a given building, the estimated annual losses are about 1,700–6,800 € per object. In the next working stage, several mitigation measures will be compared with the early warning system. Here, the construction and operating costs have to be identified in order to estimate the effectiveness and efficiency of the alternatives. This allows a prioritisation of the implementation of early warning systems in different landslide-prone areas depending on the cost-benefit ratio.

4.10 Subproject »History«

The aim of the subproject is to develop methods for measuring the frequency and the magnitude of landslides on a historical time scale. This was carried out in two landslide-prone regions and was based on two different studies in the Swabian Alb, compared with the region of South Tyrol. With these results, combined with the methodological experience gained, statements can be made about how historical analysis can be included in an effectively working early warning system in the future. First, we deepened our research on the area of interest, the Swabian Alb region, by taking the already investigated hand-written archive sources from the former DFG project InterRISK and consolidating the timeline, which goes back to the 15th century. After this, important information on the time, location, trigger and damage extent of each event was entered into a database constructed by the subproject »SensorGIS«. Furthermore, a multitemporal,
GIS-based land-use analysis was carried out for the region of the community of Lichtenstein-Unterhausen based upon historical maps and aerial photographs. In addition, historical landslides found in the archives were precisely located in the study region. Parallel to this, systematic research in archives from Tyrol and South Tyrol was carried out. This also led to a large number of discoveries in the area of investigation of previously unknown and insufficiently documented landslides dating back to the 16th century. The original historical sources not only give information about the spatiotemporal spread of the events, but also include important information on natural and anthropogenic triggers, on the process itself and on the handling of natural disasters by former societies. The research was concluded by adding a number of discovered maps, plans, engravings and paintings to the hand-written sources, which help to refine the localisation of the events. After finishing the explorative phase, the collected historical material was and will be further prepared for integration into the central database. The data will thus be available to other subprojects, especially »Geomorphic Modelling«, making it accessible for an early warning system. To conclude in general: the deeper one digs into the existing historical sources, the clearer the lack of knowledge about former natural events becomes. Furthermore, whilst so-called »nature chronicles« offer a large amount of unattributed data, as soon as these data are compared with archival sources it becomes apparent how sparse and fragmentary the present level of knowledge is. It seems that until now the relevant archival inventory was ignored and researchers were content to copy from the existing 19th-century chronologies of events without ever questioning their credibility, given the missing source records, and thus their scientific value. But for a meaningful early warning system, exact historical data about the trigger and the
spatiotemporal spread of the events, in the form of preferably long and dense time series, is not only informative but essential.

5. Benefits of an integrative project

Developing integrative landslide early warning systems is quite complex and involves experts from various disciplines and completely different scientific backgrounds. But only by integrating these different experts can all important questions be thoroughly tackled, which cannot be done by individual experts alone. In the past, very promising technical developments often failed in the end because they were developed without regard to actual needs. In the following, a selection of results is listed which were mainly achieved only due to the integrative approach of ILEWS:
– Reliable soil moisture information, which is essential for the early warning system and especially for extending the early warning times, can only be derived from the new and innovative sensor combinations through the close cooperation of geophysicists, physicists, geomorphologists and computer scientists.
– A detailed picture of current surface and subsurface movements can only be obtained through the combination of geomorphological and geodetic expertise.
– A user-optimised frontend of the web-based early warning information system was developed which meets the specific local and regional requirements. This could only be achieved through the cooperation of computer scientists, technicians and social scientists.
– Only due to the cooperation of a spatial planner, a social scientist and a geomorphologist, and based on their comprehensive scientific advice, will landslide-prone areas be integrated as reserve zones into the regional plan for the district of Neckar-Alb by the regional planning association. This is the first time this has happened in Germany.
Above all, the project profits greatly from the intensive interdisciplinary discussions of the most important questions, such as: Which
requirements must the early warning system fulfil? How can critical thresholds be defined? Who are the involved actors? What is the legal framework for setting up an early warning system and for reducing natural risks in general?

6. Collaboration with other research projects

Regarding the spatial data infrastructure developed in ILEWS, interoperability tests will be conducted together with the research projects SLEWS, alpEWAS and EWS-Transport. Based on SensorML documents and a Sensor Observation Service (SOS), an attempt will be made to integrate data from the ILEWS sensors into the Demonstrator developed by EWS-Transport. In the research fields of end-user demands and requirements for effective warning and risk management, ILEWS cooperates with the research project SLEWS.

7. References

Anderson, M. G. (1990): A feasibility study in mathematical modelling of slope hydrology and stability. Geotechnical Control Office Civil Engineering Services Department Hong Kong Report CE 23/90.
Burghaus, S., Bell, R. and Kuhlmann, H. (2009): Improvement of a terrestric network for movement analysis of a complex landslide. In: Proceedings of FIG Working Week: Surveyors Key Role in Accelerated Development, 3–8 May 2009, Eilat, Israel.
Cruden, D. & Varnes, D. J. (1996): Landslide types and processes. In: A. K. Turner and R. L. Schuster (eds.), Landslides: investigation and mitigation. Special Report. National Academy Press, Washington, D.C., pp. 36–75.
Li, L. and Kuhlmann, H. (2008a): Detection of deformations and outliers in real-time GPS measurements by Kalman filter model with shaping filter. 13th FIG International Symposium on Deformation Measurements and Analysis and 4th IAG Symposium on Geodesy for Geotechnical and Structural Engineering, Lisbon, May 2008.
Li, L. and Kuhlmann, H. (2008b): Comparison of coloured noise reduction performance in GPS real-time series based on FIR filter, Kalman filter with shaping filter, and sequential algorithm. 4th International Conference on Engineering Surveying, Bratislava, Slovakia, October 2008.
Luhmann, N. (1984): Soziale Systeme. Grundriß einer allgemeinen Theorie. Frankfurt am Main.
Petley, D. N., Higuchi, T., Petley, D. J., Bulmer, M. H. and Carey, J. (2005): Development of progressive landslide failure in cohesive materials. Geology, 33(3), pp. 201–204.
Wilkinson, P. L., Anderson, M. G., Lloyd, D. M. (2002): An integrated hydrological model for rain induced landslide prediction. Earth Surface Processes and Landforms 27, pp. 1285–1297.
alpEWAS – The Aggenalm Landslide – Innovative Developments for an Effective Geo Sensor Network for Landslide Monitoring and Early Warning Thuro K. (1)*, Wunderlich T. (2), Heunecke O. (3), Singer J. (4), Wasmeier P. (5), Schuhbäck S. (6), Festl J. (7), Glabsch J. (8) (1) Chair of Engineering Geology, Technische Universität München, Germany e-mail: thuro@tum.de (2) Chair of Geodesy, Technische Universität München, Germany e-mail: Th.Wunderlich@bv.tu-muenchen.de (3) Institute of Geodesy, Universität der Bundeswehr München, Germany e-mail: otto.heunecke@unibw.de (4) Chair of Engineering Geology, Technische Universität München, Germany e-mail: singer@tum.de (5) Chair of Geodesy, Technische Universität München, Germany e-mail: P.Wasmeier@bv.tu-muenchen.de (6) Institute of Geodesy, Universität der Bundeswehr München, Germany e-mail: stefan.schuhbaeck@unibw.de (7) Chair of Engineering Geology, Technische Universität München, Germany e-mail: festl@tum.de (8) Institute of Geodesy, Universität der Bundeswehr München, Germany e-mail: jessica.glabsch@unibw.de *Coordinator of the project: Prof. Dr. rer. nat. K. Thuro, Technische Universität München
Brief progress report

The alpEWAS project (»development and testing of an integrative 3D early warning system for alpine instable slopes«) has made major progress since the last status seminar in October 2008. Not only has the development of the new measuring techniques for landslide monitoring – time domain reflectometry (TDR), prismless robot tacheometry (TPS) and low-cost global navigation satellite systems (GNSS) – been brought forward, but the integrative data analysis and the supplemental software and geo-database development have also made great progress, leading to a now fully operational landslide monitoring and alarm system, which started its continuous operation
in October 2008. While the TDR and GNSS systems – as far as can be stated to date – fulfil the expectations concerning reliability and accuracy, some general issues have arisen in the outdoor use of the TPS system, which still limit its usability for the landslide monitoring task at the moment. The time lost, mainly because of the difficulties during the borehole drilling programme (Singer et al. 2008), has now almost been regained, as some of the other work packages could be moved up in the schedule, e.g. the numerical modelling of the Aggenalm Landslide for the determination of critical threshold values for the triggers precipitation and pore water pressure.
Fig. 1: Comparison of TDR calibration curves (deformation against maximum TDR signal amplitude) based on several shear test results using a cement-bentonite grout (a) and a cement grout with the admixture Rheomatrix100 (b). The admixture BASF Rheomatrix100 is a synthetic copolymer which reduces the segregation and bleeding of the cement, making the cement mixture usable for grouting. While the results of the shear tests in which a cement-bentonite grout was used deviate significantly from each other, the results of the cement-Rheomatrix100 grout are more homogeneous, which leads to a higher correlation coefficient for the polynomial fit of the corresponding data
In the currently ongoing project phase 4, the main focus is on the development of integrative data analysis methods for reliability enhancement and the issuance of early warnings.

Measuring systems

The continuous development and optimisation of suitable measurement systems is the main task of the alpEWAS project. Therefore each of the three subprojects focuses on one of the deformation measuring systems TDR, TPS and GNSS.

Time Domain Reflectometry (Subproject 1)

The quantification of subsurface deformations with Time Domain Reflectometry is influenced by several parameters, such as the coaxial cables used and the grout composition. These influences have been quantified in a large number of laboratory shear tests, whose results were partly presented in the previous status report (Singer et al. 2008).

Laboratory shear tests

Further shear tests have been performed since then in order to determine the usefulness of various chemical cement admixtures (accelerator, plasticizer, shrinkage-reducer, segregation
and bleeding-reducer, air-entraining agent) for the grout at TDR measuring sites, as bentonite-cement grouts are difficult to control especially during field installation (e.g. due to clumping; Mikkelsen 2002), which leads to strongly varying mechanical properties of the grout and consequently often to poor reproducibility of the TDR measurements (Festl 2008). The preliminary results of the shear tests (e.g. Fig. 1) show a distinct improvement in the reproducibility of the TDR measurements when chemical cement admixtures are used instead of bentonite. By combining different admixtures, the mechanical properties of the grout can be further modified to fit the surrounding rock mass. Work on a TDR field installation guidebook, which includes installation suggestions (grout composition, cable types, procedures) for different geological settings, is still in progress.

TDR deformation analysis

The deformation analysis procedure is shown in Fig. 2 and described in more detail in Singer & Thuro 2007a. It is based on empirically determined calibration curves (polynomial fits in Fig. 2) which describe the relation between the maximum amplitude of a TDR signal caused by
Fig. 2: TDR deformation analysis procedure. Translated and edited from Singer & Thuro 2007a
the deformation of a coaxial cable and the true shear deformation as determined in laboratory shear tests. As this relation is in general ambiguous, a calibration curve has to be determined for each installation setup (combination of grout, measurement cable, lead cable, measuring device, etc.) and deformation regime (shear, tension, shear zone width, etc.). By comparing not only the maximum signal amplitude but also other characteristic signal properties, e.g. the signal width or signal symmetry, the deformation regime is determined and the best corresponding calibration curve is selected for the deformation quantification. When long lead or measurement cables are used, the effect of signal attenuation has to be taken into account. This is done using simple correction functions, which are also determined in laboratory tests. In contrast to the empirical approach presented here, Lin et al. (2009) have recently presented
a promising mathematical TDR wave propagation model to numerically simulate TDR deformation measurements. The model works well when the influence of the grout is neglected (air-filled shear gaps), but to date it cannot be applied to actual field instrumentations, as the numerical modelling of the complex soil-grout-cable interaction is not yet solved.

Software development

The complete deformation analysis procedure – from manual or automated data acquisition in the field to the output of deformation time series – has been automated using the National Instruments LabVIEW software development environment. The software features various data management options (e.g. import/export of several data formats including a new XML-based data format, data selection and hourly/daily/weekly averaging for data reduction), the visualisation of TDR raw data (Fig. 3) in several different ways, the creation of calibration files (from shear test data) and the automated
Fig. 3: TDR data from site B5-1 from January 1st to June 30th 2009. Shown is the absolute change of the TDR reflection coefficient compared to the reference measurement acquired on January 1st 2009. Until June 5th only very small deviations from the reference occur. A signal caused by the deformation of the coaxial cable was not observed. A heavy precipitation event in the first half of June caused the flooding of the installation shaft at this site, leading to water intrusion into the connector between the lead and measuring cables, which resulted in a number of erroneous measurements
deformation detection and quantification. Although it is still in the beta phase, the program has been tested on data from several laboratory tests as well as field installations and has thereby proven its general functionality (Singer & Thuro 2009, Singer et al. 2009). The fully automated TDR deformation analysis software, incorporated as a sensor plug-in in the alpEWAS control software (see below), was tested for about a month in a laboratory setup, during which no major problems arose. In August 2009 the software was installed at the Aggenalm Landslide for a first field test.

First field experiences

The Campbell Scientific TDR measuring system at the Aggenalm Landslide has been in continuous operation since October 2008, performing hourly measurements. The TDR system has been very reliable; in the last six months (February to July 2009) the data loss caused by some short power outages and by memory overflows due to late data retrieval sums up to less than 5% of the planned measurements. Still, through the
recent installation of the automated data acquisition and status monitoring software (see below) it is probable that an even higher reliability will be achieved in the future. Most transmission lines (lead and measuring cables, connectors; mostly buried) have also proven resistant to outside influences, even throughout the winter. During a heavy multi-day precipitation event in June 2009 some installation shafts at the Aggenalm Landslide filled with water, submerging several TDR connectors. As the installed connectors had purposely not been waterproofed, in order to allow easy access to all cable connections, the intruding water led to erroneous measurements at several sites (Fig. 3). To avoid this in the future, all connectors have now been waterproofed using heat-shrink tubing. To date no significant deformation signals could be found in the TDR measurements, so no deformation analyses could be performed. As no meaningful deformation was measured by the inclinometers either, it seems the Aggenalm Landslide has shown
only very little subsurface movement in the monitored boreholes throughout the past nine months. This contradicts the GPS surface measurements (see below). Future data from the monitoring system and the integrative analysis will hopefully make it possible to determine the reason for this discrepancy.

Prismless robot tacheometry (Subproject 2)

Up to the last status seminar in October 2008, primarily calibration work and accuracy examinations had been performed. In doing so, meteorological influences and illumination variations in particular proved to be crucial for imaging and measurement quality. Several test series, using artificial targets as well as rocks in the project area as natural targets, have been performed since then in order to quantify these effects and to work out possible prevention strategies (see also Wasmeier 2009b). In purely geometric terms, especially high-frequency, turbulent air density variations due to thermal and stochastic convection (air flickering over hot surfaces, scintillation of objects at larger distances) lead to a degradation of the image representation of objects:
– Apparent position variations of target points: The optical path from the target point to the tacheometer changes and so does its imaging position on the camera chip. With every image representing a discrete state, this impact on a single image is neither quantifiable nor correctable. To reduce it, a sequence of images can be evaluated.
– Apparent deformations of target structures: Parts of the object scene are shifted relative to each other within the image. Geometric structures become deformed, so that the automatic detection of geometric primitives by means of image analysis becomes more complicated; matching algorithms suffer additional bias and a lower classification quality. These deformations are usually local and consist of stochastic components.
– Blurring: During exposure, air turbulence smears edges and small structures. In the histogram, local extremes are diminished, influencing threshold and texture operators. Blurring is naturally intimately connected with the actual exposure time. The IATS camera is very sensitive to natural daylight, resulting in exposure times normally between 2 and 20 ms. For outdoor imaging, blurring is therefore a minor effect.
Scintillation can be seen even at distances of considerably less than 100 m on averagely sunny days. In the Aggenalm project area its influence on close targets is also significant. Illumination variations at different measurement times additionally alter the appearance of the mapped object scene. This applies to the typical brightness changes throughout the day, but can especially be seen when rapid changes occur (the sun being covered by a cloud). In these cases, modifications of the exposure time by up to a factor of ten can be required to sustain the average image brightness. Feedback algorithms on exposure and image processing, e.g. non-linear scaling, are used to try to minimise these effects, but in areas that are too bright or too dark not only radiometric but also geometric information may be lost. This is problematic especially when there are bright, blooming image domains on the one hand, while on the other too little texture information can be seen in dark areas. This circumstance also leads to blurring of object edges and is a severe limiting factor. Considering this, the goal of the different image processing steps is to transform the radiometric parameters of an image into a preferably small range of values, independently of the circumstances at acquisition time. Subsequent image analysis operators can then act with high reliability and repeatability (see also Wasmeier 2009a). Processing techniques consist especially of histogram and multi-channel adjustment, adaptive smoothing and simple
segmentation. For the analysis steps, different approaches have been examined. Geometric and topological techniques have proven adequate for artificial passive targets, while matching algorithms prove to be the method of choice for natural objects: beforehand, a template of every target object has to be created in a teaching phase; at run time, instances of these templates are searched for in the actual measurement image. An overview of different matching algorithms can be found in Steger et al. 2008. The alpEWAS matching task is performed best by using intensity- or edge-based algorithms. The latter are considerably harder to use with low-contrast structures such as rocks and therefore have a higher failure rate, but they also show a higher accuracy when they succeed and so can be used for deformation detection. By special mathematical methods, a conditional independence from global image brightness can be added to the operator, so that mainly the blooming problem remains as the most negative impact. The matching algorithms which are finally used in the alpEWAS project have proven to be rather insensitive to purely geometric distortions of the object representation relative to the template. Variances of up to 2 mgon can be found for single-image measurements of targets in the range up to 100 m, while the mean standard deviation is below 1 mgon. The usual procedure of co-evaluating subsequent images gives additional redundancy, so that the angular direction towards an ordinary target point will result in 1–3 mm of position accuracy. This is comparable to the accuracy of the tacheometer's distance measurement unit. On the other hand, the algorithm parameters can be very sensitive to the individual image quality. Successful evaluation is often only possible with slight modifications of the parameter set. So far the research work with the alpEWAS tacheometer has shown that automatic and autonomous measurement using video tacheometry is preferably performed in controlled environments, e.g. in industrial
production. Working outdoors, only simple and clear target structures can be reliably detected. Complex objects with few individual descriptive features and low texture, as natural objects usually are, have to be extracted by semi-automatic process steps with manual validation of individual results at the end. A completely automatic monitoring system for arbitrary natural targets cannot be provided at the moment due to the sensitivity of the algorithm parameters. Considering the goals of alpEWAS subproject 2, this means that at the current project stage the visionary goal of a completely autonomously acting measurement robot is hardly realisable. The hardware used in the project, which is in any case at the upper end of a low-cost system's price range, is not sufficient for this application area without major modification. The experience already gained in developing image analysis algorithms shows that the reliability which, by definition, must be demanded of an early warning sensor for its mode of operation and its results to be accepted, is not sufficient at this stage when an automatic control loop is used. Human intervention seems indispensable. Apart from that, a supervised video tacheometric system, as currently used in the course of further alpEWAS project work at the monitoring site, can achieve several of the planned project goals: the detection of natural objects based on video tacheometric images, without any need to enter the area at risk, has already been accomplished. Using calibrated spatial direction measurements taken directly from the image together with reflectorless distance measurement, 3D localisation of surface objects is carried out to consolidate the TDR and GNSS points. Currently, selected points on the slope are observed semi-automatically and periodically (Fig. 4). In this regard, the IATS setup pillar and the project area have been integrated into the official Gauss-Krüger coordinate system by DGPS measurements. The various measurement positions, stable TPS reference points and natural target points are now absolutely coordinated.
Fig. 4: The position of suitable surface rocks to be observed with video tacheometry. The average quality of the IATS images can be seen (although the real resolution is much higher). The illumination problems can already be estimated from these images, but they have still proven adequate for semi-automatic operation
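As an illustrative stand-in for the intensity-based matching described above (the project itself relies on operators of the kind discussed in Steger et al. 2008, not on this sketch), normalised cross-correlation between a taught template and the current IATS frame can be outlined with OpenCV; the file names and the acceptance threshold are assumptions:

```python
import cv2

# Taught template of a target rock and the current IATS frame
# (file names are placeholders).
template = cv2.imread("target_rock_template.png", cv2.IMREAD_GRAYSCALE)
frame = cv2.imread("iats_frame.png", cv2.IMREAD_GRAYSCALE)

# Crude radiometric normalisation to reduce dependence on global brightness.
template = cv2.equalizeHist(template)
frame = cv2.equalizeHist(frame)

# Intensity-based matching by normalised cross-correlation.
scores = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_pos = cv2.minMaxLoc(scores)

if best_score > 0.8:            # acceptance threshold, tuned per target
    x, y = best_pos
    print(f"target found at pixel ({x}, {y}), score {best_score:.2f}")
else:
    print("matching failed - flag result for manual validation")
```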
In addition to video tacheometer applications, tacheometric scanning approaches have been examined further. In the context of accuracy evaluation, a diploma thesis was completed (Striegl 2008) which dealt with ICP algorithms for determining transformation parameter sets of rigid-body movements (as shown, for example, by surface rocks) using sparse, epoch-wise scan patterns; a minimal sketch of the underlying rigid-body estimation step is given at the end of this subsection. The achieved accuracy (standard deviation) is in the range of 1 mm to 2 cm, depending on the movement and the surface structure. As expected, especially for very small displacements only little significance can be proven. This first step can be developed further (especially when using instruments with a higher scanning frequency of up to 20 points per second), but it is very time-consuming with the project's instrument. The evaluation of snow height detection using tacheometric scanning was also continued. Due to the very flat angles of incidence at the Aggenalm slope it is not possible to scan the whole project area, as backscattering is often very poor. In this context, snow is actually a more cooperative target surface than, for
meadow, which has a very high divergence loss. In the surroundings of the IATS pillar, however, surface coverage models are being developed which can generally be extrapolated as an indicator of the current snow amount and, furthermore, of the speed of snowmelt. In alpEWAS subproject 2, tacheometric scanning will again be a primary focus in the coming winter period; especially the comparison of surface coverage with the additional spring discharge recordings is of interest. As long as weather conditions allow, further image-assisted displacement measurements to the selected surface rocks will be performed and stored in the alpEWAS database. By integrating the other measurement systems, an advanced accuracy model for the acquired deformations shall be derived. Within the scope of the alpEWAS project, a PhD thesis entitled »Basic principles of displacement monitoring using video tacheometric data« was completed at the Chair of Geodesy at TU München in June 2009 (Wasmeier 2009a).
Tab. 1: Statistics of the days with a GNSS system malfunction
Global navigation satellite system (Subproject 3)
In this status report, the work completed in the finished project phase 3 and in the ongoing phase 4 is presented, as well as first results of the GNSS monitoring component.
System status
As mentioned in the last report, project phase 3 – the learning phase – was split into two parts: the test phase on the campus in Munich and the learning phase proper at the test site, which started with the first start-up of the system in August 2008. The most significant shortcoming during the system's first field test was the DSL-via-satellite link, which was not available until the end of January 2009; the delay was mainly caused by the telecommunications service provider. Consequently, remote control of the central computer station on site was impossible, immediate maintenance operations were unfeasible and troubleshooting was hampered. The main causes of the fragmentary data recording during this period were repeated power blackouts in the two alpine huts and persistent, unexpected communication problems (now completely eliminated by various antenna changes). All these problems could be solved, and with the (few) data obtained the most important parameters for the GNSS analysis could be defined. The epoch length is currently set to 15 minutes, and a new set of input parameters of the baseline processing software, specially fitted to the conditions on site, could be defined. All milestones of project phase 3 have been achieved.
With the availability of the DSL-via-satellite equipment in January 2009, project phase 4, the testing phase, started. Since February 2009 the system has been running satisfactorily. Tab. 1 shows the days with malfunctions for all three object points on the slope in the period from February 2009 to July 2009; a day of malfunction is a day on which more than 12 hours of recorded data are missing. The functional reliability of the system over half a year was 89.6%; without the two longer incidents it would have been 95.1%. Both values are still not really satisfactory for an early warning system, which is why the optimization of status control and defect detection is being pursued vigorously. Despite these troubles it can be emphasized positively that winter conditions with a snow coverage of 200 cm and temperatures well below –10°C (see Fig. 5) had no negative effects. The autarkic power supply worked correctly, and the individual power management, communication and sensor components are adequately dimensioned and configured.
Results
The results of the continuous data recording at the test site Aggenalm, covering the interesting period of snowmelt in spring and a period of heavy rainfall in June 2009, are presented in the following and are discussed mainly with respect to the achievable accuracies. The unfiltered raw GNSS results, i.e. the 15-minute solutions from the baseline processing step, cannot be used without further filtering. The results for one exemplary sensor on the slope are depicted in Fig. 6.
Fig. 5: Winter conditions at Aggenalm Landslide. The picture shows the GNSS sensor and the autarkic power supply via solar panel of a sensor node
Fig. 6: Raw GNSS results from the baseline processor (February – July 2009). Depicted are the time series for horizontal coordinates X, Y and the height H for every 15 minutes. Some blunders and gaps occur. The used sensor is a Novatel Smart Antenna
Fig. 7: Variations of the horizontal position during a representative day. Depicted are again the 15-minute results, with all outliers eliminated and the gaps closed
Outliers, mainly resulting from a bad satellite constellation, lead to insufficient baseline processing results and must be eliminated by a reliable, highly sophisticated quality management. Furthermore, it has to be considered that the height component of GNSS measurements is always worse by a factor of approximately 2–3 compared with the horizontal components. The result of the raw data depicted above for a selected day is shown in Fig. 7. After removal of all incorrect measurements, the total variation of the horizontal position lies within an area of less than 1.5 cm. Between 06:00 and 11:00
a gap in the measurements occurs. In this period many incorrect results from the baseline processing had to be eliminated: for quite a long time the constellation was poor, with only few satellites in view due to strong shadowing effects at both stations (the reference on stable ground and the depicted sensor on the slope). For this day, a 1 h solution combining four single epochs using a moving average filter is shown in Fig. 8. The variations now clearly stay in the sub-centimeter range and allow reliable short-term conclusions about the deformation process.
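For illustration only, the following Python sketch mimics this kind of post-processing: the 15-minute baseline solutions are cleaned of gross outliers and smoothed with a moving average of four epochs (1 h); longer windows (e.g. 24 epochs = 6 h) would yield the medium-term trends discussed below. Column names, the file format and the simple 3-sigma criterion are assumptions, not the project's actual quality management.

# Sketch of the filtering described above: 15-minute baseline solutions are
# cleaned of outliers and smoothed with a 1 h moving average (4 epochs).
import pandas as pd

def smooth_gnss(csv_path: str, window_epochs: int = 4) -> pd.DataFrame:
    df = pd.read_csv(csv_path, parse_dates=["time"]).set_index("time")
    df = df.resample("15min").mean()          # enforce the 15-minute epoch grid (gaps become NaN)
    # crude outlier rejection: discard epochs further than 3 sigma from the daily median
    daily = df.groupby(df.index.date)
    cleaned = df[(df - daily.transform("median")).abs() <= 3 * daily.transform("std")]
    # moving average over 4 epochs = 1 hour; a window of 24 epochs corresponds to 6 h
    return cleaned.rolling(window=window_epochs, min_periods=2).mean()

# smoothed = smooth_gnss("point2_baseline_solutions.csv")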
Fig. 8: Variations of the horizontal position during a representative day using a moving average filter of 1 hour
Fig. 9: Medium-term trend of point 1 in the upper region of the slope (moving average filter of 6 hours from February to July 2009)
Fig. 10: Medium-term trend of point 2 in the middle of the slope (moving average filter of 6 hours from February to July 2009)
Fig. 11: Medium-term trend of point 3 in the lower region of the slope (moving average filter of 6 hours from February to July 2009)
In order to draw conclusions about long-term trends of the deformation process, the use of a longer filter interval, e.g. 6 hours, is favorable. Especially points 2 and 3 show noticeable movements (Fig. 10 and Fig. 11), on the one hand in spring during the snowmelt (April–May) and on the other hand during the period of heavy, continuous rainfall at the end of June. Point 1 is apparently not affected by the rainfall (Fig. 9); it can also be seen that the data quality of point 1 is somewhat worse due to the shadowing effects. Stronger filtering for an enhanced trend interpretation can be performed if necessary.
Conclusion
The first experiences and results from outdoor use under temporarily adverse conditions demonstrate the reliability and performance of the GNSS monitoring component in the alpEWAS project. Although the obtained results are very positive, further effort has to be made to increase the data quality. The current efforts concentrate on a comprehensive quality management
and on developing additional processing and estimation procedures until the end of the project. The results of the baseline processing and of a subsequent first quality management step are automatically archived in a MySQL database (see below). Additionally, the complete raw data is stored separately, making it possible to perform advanced investigations (e.g. reprocessing with an updated set of parameters) at any time.
Integrative data management and analysis
Within the alpEWAS project, a new flexible and easily extendible control, management and data analysis software package for landslide early warning systems has already been developed, based on the National Instruments LabVIEW software development environment. The software has been designed to be easily adaptable to different sensor constellations and to be easily extendible by additional or new sensors at any time. This is achieved using
a »plug-in« program structure (Fig. 13). Most major program elements (sensor I/O, database management, data viewers and analyzers, import and export filters etc.) are independent programs which can individually be added to the main control program according to the current needs. The core of the software package is formed by an open source MySQL database. All sensor data (metadata, raw data, results of intermediate steps and preliminary (1st level) analysis results, status information etc.) are stored centrally in a flexible but well structured way. All elements of the software package access this database, which thereby serves as the central »data turntable«.
Sensor I/O
The communication (control and data acquisition) with each sensor or group of sensors is carried out by a »sensor plug-in«, which ideally is itself a LabVIEW program. In this case the sensor plug-in integrates a graphical user interface (GUI) into the main application, thereby providing seamless access and full control of the sensors from within the main application. In the alpEWAS project this is the case for the TDR, GNSS and piezometer measuring systems as well as for the weather station. If a sensor cannot be controlled by a LabVIEW program (e.g. due to proprietary I/O protocols), it is also possible to use an automated data file import, as is planned at the Aggenalm Landslide for the integration of the TPS data; in this case, however, the control functionality is limited. In most cases the sensor plug-ins also perform first data analyses (e.g. baseline processing for GNSS data or deformation analysis for TDR data) and provide status information (e.g. error messages) about the sensor(s), which is written into the MySQL database.
Integrative analysis
The main benefit of the combined application of the different measurement systems arises from the integrative analysis. A first intuitive evaluation and analysis of the data from the
different measuring systems can be done by simple time series analyses. This is often the data presentation experts ask for as a basis for decisions concerning the landslide. The alpEWAS software offers this functionality, always including the newest available data, and allows the user to combine up to four different value types (each from several sensors) in a common time series plot (Fig. 12). Furthermore, data viewers for the visualization of inclinometer measurements and TDR raw data are available. The next step in the integration of the different sensor datasets is still in progress: it is planned to automatically cross-check the data from the three different deformation measuring systems and thereby provide information to identify erroneous measurements. Also planned is the geometric combination of related measuring sites (e.g. GNSS at the top of a TDR borehole, or closely spaced GNSS and TPS measuring points) to provide better information about the 3D deformation mechanism.
Alarming and early warning
Basic email alarming and notification functions have now been added to the software. A status monitor repeatedly checks the status of the different sensor plug-ins (plug-in execution status and sensor status information within the MySQL database) and issues an email if any sensor or sensor plug-in has encountered an error. It is also possible to have the program send status reports at certain time intervals (e.g. daily), which include all error messages and the current status information of the entire sensor network and its infrastructure. This way the system administrators can easily keep an eye on the status of the monitoring system. Additionally, thresholds can be defined for any of the datasets within the database (e.g. precipitation amount, deformation rate); if a newly acquired value surpasses a threshold, an email is sent. The user can decide whether a notification is sent only once or every time the threshold is surpassed, which, however, might lead to heavy email traffic. The implementation of an SMS notification service in addition to the email service is planned.
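The threshold and notification logic described here can be sketched roughly as follows; the actual alpEWAS implementation is a LabVIEW program working against the MySQL database, so the table layout, threshold values and mail settings below are purely illustrative assumptions.

# Illustrative sketch of the threshold-based notification logic described above.
# Table/column names, thresholds and mail settings are assumptions.
import smtplib
import sqlite3  # stand-in for the project's MySQL database
from email.message import EmailMessage

THRESHOLDS = {"precipitation_mm_h": 20.0, "deformation_rate_mm_d": 5.0}

def check_and_notify(db_path: str) -> None:
    con = sqlite3.connect(db_path)
    for value_type, limit in THRESHOLDS.items():
        row = con.execute(
            "SELECT value, sensor_id FROM measurements "
            "WHERE value_type = ? ORDER BY timestamp DESC LIMIT 1", (value_type,)
        ).fetchone()
        if row and row[0] > limit:
            msg = EmailMessage()
            msg["Subject"] = f"alpEWAS alert: {value_type} = {row[0]} (limit {limit})"
            msg["From"], msg["To"] = "monitor@example.org", "admin@example.org"
            msg.set_content(f"Sensor {row[1]} exceeded the configured threshold.")
            with smtplib.SMTP("localhost") as smtp:
                smtp.send_message(msg)  # in practice: optionally only on the first exceedance
    con.close()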
Fig. 12: Screenshot of the time series data viewer within the alpEWAS control center program showing actual rainfall and piezometer data from the Aggenalm Landslide: reaction of the ground water level (site B4) to a heavy precipitation event in June 2009
Early warning will be achieved by defining sets of graded thresholds (information – preliminary warning – imminent danger) for the main triggering factor precipitation and, in consequence, for the resulting ground water level rise (monitored with piezometers). The critical issue here is the definition of the threshold sets. For this purpose, several numerical models of the Aggenalm Landslide have been created using FLAC and Phase 2, which incorporate all geological and hydrological knowledge gained through the geological examination and the geotechnical measurements so far. Sensitivity analyses are performed with the aim of quantifying the influence of a ground water rise on the unstable slope (Jung 2007). The preliminary results of these models have led to the definition of a first set of threshold values for critical ground water levels.
Data interoperability
Another important aspect of an early warning system for unstable slopes is widespread and easy use of the results. Status information about the observed object – including different metadata
such as, for instance, the supply voltage – should reach the corresponding end users (administrators, stakeholders, general viewers) as fast as possible and in an easily understandable form. In times of worldwide networking via the internet, interoperability has top priority: only with standardized interfaces is trouble-free data exchange between heterogeneous systems possible. The Sensor Web Enablement (SWE) initiative of the Open Geospatial Consortium (OGC), for example, aims to develop and introduce worldwide standards for handling spatially oriented data in an interoperable way. Ultimately, every monitoring system has to be regarded as part of a Spatial Data Infrastructure (SDI); providing open, standardized interfaces is therefore one of the main criteria of an up-to-date monitoring system. The alpEWAS project addresses these criteria of interoperable data exchange, and further work will concentrate on this aspect, which is not yet implemented.
Conclusion and perspective
To date, with the successful installation and operation of a continuous geo sensor network
Fig. 13: Structure of the alpEWAS sensor control and data management solution. The hatched elements are still in development
at the Aggenalm Landslide, most project goals have been successfully realized. The deformation measuring methods TDR, GNSS and TPS have been further developed and tested in the field, and over time the reliability of the individual sensors and of the whole geo sensor network has been constantly increased. Additionally, a flexible and easily extendible integrative data management and analysis software with alarming capabilities has been developed. The numerical modelling of the slope for the definition of reliable threshold values for the triggering factors is in progress and, when completed, will hopefully make reliable early warnings possible.
Still, as the deformation of the Aggenalm Landslide has slowed down since the beginning of the project (proven by sporadic conventional geodetic measurements), the determination of the field performance of the deformation measuring systems can partly only proceed slowly. This remains one major aspect of the future work at the Aggenalm Landslide. Apart from that, the main focus is put on the development of integrative data analysis methods for reliability enhancement and early warning issuance. Furthermore, it is planned to extend the geo sensor network by an additional measuring system in the next months: measuring weirs will be constructed at the two dominant springs in the landslide area. The information thereby gained about the spring discharge will assist the development of a reliable hydrological model of the slope, further enhancing the quality of the numerical models for the threshold evaluation. Additionally, in cooperation with the Remote Sensing Technology Institute (IMF) of the German Aerospace Center (DLR) (Prof. Bamler), it is planned to install several corner reflectors at the Aggenalm Landslide for the punctiform determination of the landslide deformation using the D-InSAR remote sensing technique with the new TerraSAR-X satellite.
Currently it cannot be assured that the evaluation of the field performance will be accomplished by the end of the project time. Additional data of up to two years might be necessary to successfully complete this task. This additional time would also allow the further development of the existing numerical models and, together with the growing amount of monitoring data, would make the consolidation of the early warning thresholds possible.
References
Günther, J., Heunecke, O., Pink, S. & Schuhbäck, S. (2008): Developments towards a low-cost GNSS based Sensor Network for the Monitoring of Landslides. – LNEC, 13th FIG Symposium on Deformation Measurement and Analysis, Lisbon, 12–15 May 2008.
Lin, C. P., Tang, S. H., Lin, W. C. & Chung, C. C. (2009): Quantification of Cable Deformation with Time Domain Reflectometry – Implications to Landslide Monitoring. – Journal of Geotechnical and Geoenvironmental Engineering, 135 (1), 143–152.
Mikkelsen, P. E. (2002): Cement-Bentonite Grout Backfill for Borehole Instruments. – Geotechnical Instrumentation News, 2002 (1), 36–40.
Steger, C., Ulrich, M. & Wiedemann, C. (2008): Machine Vision. Algorithms and Applications. – Wiley-VCH Verlag, Weinheim.
alpEWAS Publications
Festl, J. (2008): Eignungsprüfung von Zement-Bentonit-Suspensionen als Injektionsgut bei TDR Deformationsmessungen. – unpublished M.Sc. thesis, TU München.
Festl, J., Singer, J. & Thuro, K. (2009): Kalibrierung von TDR Deformationsmessungen für Hangbewegungen. – In: Schwerter, R. (ed.): Tagungsband der 17. Tagung für Ingenieurgeologie mit Forum für junge Ingenieurgeologen, Hochschule Zittau-Görlitz, Fachsektion Ingenieurgeologie, Deutsche Gesellschaft für Geotechnik, 6–9 May 2009, 397–400.
Glabsch, J., Heunecke, O. & Schuhbäck, S. (2009a): Monitoring the Hornbergl landslide using a recently developed low cost GNSS sensor network. – Journal of Applied Geodesy (JAG), vol. 3, 179–192. (reviewed paper)
Glabsch, J., Heunecke, O. & Schuhbäck, S. (2009b): Development and testing of a low-cost GNSS monitoring system. – Joint Board of Geospatial Information Societies (JB GIS), Best Practices Booklet on Geo-information for Risk and Disaster Management. (paper accepted, completion in progress)
Glabsch, J., Heunecke, O. & Schuhbäck, S. (2009c): A low cost PDGNSS based sensor network for landslide monitoring – challenges, possibilities, prospects. – International Journal of Digital Earth (IJDE). (paper submitted, review in progress)
Hoepfner, R., Singer, J., Thuro, K. & Aufleger, M. (2008): Development of an integral system for dam and landslide monitoring based on distributed fibre optic technology. – British Dam Society, 15th Biennial Conference on »Ensuring Reservoir Safety into the Future«, University of Warwick, 10–13 September 2008.
Jung, S.-Ch. (2007): Untersuchung der Hangbewegung an der Aggenalm östlich des Sudelfelds zwischen Bayrischzell und Oberaudorf. – unpublished diploma thesis, TU München.
Singer, J., Schuhbäck, S., Wasmeier, P., Thuro, K., Heunecke, O., Wunderlich, Th., Glabsch, J. & Festl, J. (2009): Monitoring the Aggenalm Landslide using economic deformation measurement techniques. – Austrian Journal of Earth Sciences (AJES). (paper accepted, review in progress)
Singer, J., Wasmeier, P., Schuhbäck, S., Glabsch, J., Thuro, K., Wunderlich, Th. & Heunecke, O. (2008): alpEWAS Status Report. – Geotechnologien Statusseminar, University of Osnabrück, 8–9 October 2008.
Singer, J. & Thuro, K. (2006): Monitoring mit Time Domain Reflectometry (TDR). – In: Brunner, F. K. (ed.): Ingenieurvermessung 07 – Beiträge zum 15. Internationalen Ingenieurvermessungskurs Graz, 17–20 April 2007. – 430 p., Heidelberg (Wichmann), 259–270. (reviewed paper)
Singer, J. & Thuro, K. (2007a): Computergestützte Auswertung von Time Domain Reflectometry Messdaten zur Überwachung von Hangbewegungen. – Tagung für Computer Orientierte Geologie, 2 July 2008, Salzburg. (reviewed paper)
Singer, J. & Thuro, K. (2007b): Entwicklung eines kontinuierlichen 3D-Überwachungssystems für instabile Hänge mittels Time Domain Reflectometry (TDR). – In: Otto, F. (ed.): Veröffentlichungen von der 16. Tagung für Ingenieurgeologie, 7–10 March 2007, Bochum. – 492 p., Bochum (Technische Fachhochschule Georg Agricola), 69–76.
Singer, J. & Thuro, K. (2009): Monitoring of subsurface deformations using Time Domain Reflectometry. – In: Bayerisches Landesamt für Umwelt (ed.): 6th European Conference on Regional Geoscientific Cartography and Information Systems EUREGEO, Earth and Man. – 480 p., Proceedings Volume I, Munich, 9–12 June 2009, 397–400.
Singer, J., Grafinger, H. & Thuro, K. (2009): Deformation monitoring of temporary top heading inverts using Time Domain Reflectometry. Überwachung der Deformation einer temporären Kalottensohle mit Time Domain Reflectometry. – Geomechanics & Tunnelling – Geomechanik & Tunnelbau 3: 238–249. (reviewed paper)
Singer, J., Thuro, K. & Sambeth, U. (2006): Development of a continuous 3D monitoring system for unstable slopes using time domain reflectometry. – Felsbau 24 (3), 16–23.
Striegl, P. (2008): Deformationsmessung nicht signalisierter Objekte mittels tachymetrischen Scannings. – unpublished diploma thesis, TU München.
Thuro, K., Wunderlich, Th. & Heunecke, O. (2007): Development and testing of an integrative 3D early warning system for alpine instable slopes (alpEWAS). – In: Stroink, L. (ed.): Geotechnologien Science Report No. 10. – 136 p., Kick-Off-Meeting 10 October 2007, Technical University Karlsruhe, Programme und Abstracts, 10: 101–112.
Thuro, K., Wunderlich, Th., Heunecke, O., Singer, J., Schuhbäck, S., Wasmeier, P., Glabsch, J. & Festl, J. (2009a): Low cost 3D early warning system for alpine instable slopes – the Aggenalm Landslide monitoring system. Kostengünstiges 3D Frühwarnsystem für alpine instabile Hänge – Das Überwachungssystem der Aggenalm-Hangbewegung. – Geomechanics & Tunnelling – Geomechanik & Tunnelbau 3: 221–237. (reviewed paper)
Wasmeier, P. (2003): Potential der Objekterkennung mit dem TCRA2003. – Proceedings of 6th Optical 3D Measurement Techniques.
Wasmeier, P. (2009a): Grundlagen der Deformationsbestimmung mit Messdaten bildgebender Tachymeter. – Dissertationsschrift, DGK Reihe C 638.
Wasmeier, P. (2009b): Videotachymetrie – Sensorfusion mit Potenzial. – AVN 7/2009, 261–267, Wichmann Verlag, Heidelberg.
EGIFF – Developing Advanced GI Methods for Early Warning in Mass Movement Scenarios Breunig M. (1)*, Schilberg B. (1), Kuper P. V. (1), Jahn M. (1), Reinhardt W. (2), Nuhn E. (2), Mäs S. (2), Boley C. (3), Trauner F.-X. (3), Wiesel J. (4), Richter D. (4), Abecker A. (5), Gallus D. (5), Kazakos W. (6), Bartels A. (6) (1) Institute for Geoinformatics and Remote Sensing (IGF), University of Osnabrück, Barbarastr. 22b, D-49076 Osnabrück, e-mail: mbreunig@uni-osnabrueck.de (2) Geoinformatics Working Group (AGIS), University of the Bundeswehr Munich, Werner-Heisenberg-Weg 39, D-85577 Neubiberg, e-mail: Wolfgang.Reinhardt@unibw.de (3) Institute for Soil Mechanics and Geotechnical Engineering, University of the Bundeswehr Munich, Werner-Heisenberg-Weg 39, D-85577 Neubiberg, e-mail: Conrad.Boley@unibw.de (4) Institute of Photogrammetry and Remote Sensing (IPF), University of Karlsruhe, Englerstr. 7, D-76128 Karlsruhe, e-mail: Wiesel@ipf.uni-karlsruhe.de (5) Research Centre for Information Technologies (FZI) at University of Karlsruhe, Haid-und-Neu-Str. 10–14, D-76131 Karlsruhe, e-mail: Andreas.Abecker@fzi.de (6) disy Informationssysteme GmbH, Erbprinzenstr. 4-12, D-73133 Karlsruhe, e-mail: Kazakos@disy.net *Coordinator of the project: Prof. Dr. Martin Breunig, University of Osnabrück
Abstract There is a strong demand for analyzing mass movement scenarios and developing early warning systems to save lives and properties. However, hitherto the preparation of information and the analysis of hazards are still particularly critical links in the early warning chain. The responsible decision makers are usually confronted with huge amounts of structured and unstructured data. Thus the question arises, how they may be provided with a reliable and manageable amount of information to create the warning decision and to take preventive measures. In this article, objectives, concepts and results are presented, examining methods of an information system for the early recognition of geological hazards in mass movement scenar-
ios. The simulation of landslides is executed on the basis of geotechnical, mechanically founded models. By coupling the simulation with GIS and advanced geo-databases, a better understanding of the corresponding geoscientific processes is achieved. Additionally, the analysis of structured and unstructured data executed by statistical and linguistic methods, respectively, improves risk assessment and supports the early warning decision. Finally, a service-based 3D/4D geo-database manages selected data of a mass movement scenario. Keywords: early warning, mass movement, geotechnical model, GI methods, 3D/4D geodatabase, landslides, decision support system, learning system, finite element analysis.
1. Introduction and objectives The number of geological events such as landslides (Glade and Dikau, 2001; Merritt et al., 2003; Bell and Glade, 2004; Dikau and Weichselgartner, 2005) has increased worldwide during the last decades. Thus, there is a strong demand for developing early warning systems to save lives and properties. The central components of an early warning system for natural phenomena are the recognition of the threats, the assessment and evaluation of the danger, the dissemination and communication of the warning, as well as the public reaction to the warning (Smith, 2009). The effectiveness of an early warning system largely depends on the transformation of the event recognition into the report for warning to the population. Obviously, the analysis and the preparation of information are particularly critical points of the early warning chain. However, they contribute significantly to the warning decision and to the risk estimate and the extent of the consequences caused by the natural event. The objective of the GEOTECHNOLOGIEN joint project (Geotech, 2009) »Development of suitable information systems for early warning systems« (EGIFF) is to improve the early warning chain by the design and development of new methods to be integrated into appropriate components for early warning systems (Breunig et al., 2008). 2. Application areas The methods of the EGIFF project are tested and evaluated with real mass movement scenarios. For this task, suitable application areas were selected. Main selection criteria were the availability of detailed data and sensor measurements, a coherent and comprehensive geology to verify the gained methodology in a generalized way and the potential risk for landslides. After the investigation of different landslide areas, in close cooperation with the Bavarian Environment Agency (LfU), a part of the Isar valley in the south of Munich, next to Pullach and Neugrünwald, has been selected for further studies. In this area the height difference of the slope is up to around 40 meters
and endangered human infrastructure is located near the edge of the slope. Because of the risk potential, the area is observed by the responsible authorities. Inclinometer, extensometer and groundwater level measurements as well as geodetic surveys are available. These data include:
– Digital Elevation Model (DEM);
– topographic data (vector and raster format);
– orthophoto;
– geological and geomorphological data including drilling profiles and tearing edges;
– groundwater level measurements;
– deformation measurements;
– geotechnical properties.
Since the »Isarhänge Grünwald« application area and the available data appeared to be well suited for numerical simulation, but not for testing statistical and linguistic methods, we decided to additionally use a second application area. The main reason was the larger size of the second application area, which is located in Vorarlberg. The data of the second application area include:
– Digital elevation model (resolution 5 m), contour lines (50 m);
– topographic data (vector and raster format);
– orthophoto mosaic (resolution 1 m);
– geological data including tectonics, geomorphology, drilling profiles and tearing edges (vector and raster format);
– risk maps for landslide hazards in various areas prepared by the Dept. of Applied Geology (AGK), University of Karlsruhe;
– measuring points for precipitation (since 1893);
– documentation of natural phenomena including landslide hazards (since 1400);
– statistics of the Vorarlberg fire brigade (1997–2006).
3. EGIFF system architecture
In this section the EGIFF components for the early recognition of geological hazards in mass movement scenarios are presented. In the first component, the coupling of GIS and modeling/simulation to support the evaluation of risks is examined. In the second component
Fig. 1: Application areas »Isarhänge Grünwald« (Bavaria) and »Vorarlberg« (Austria)
text descriptions are processed and spatial data mining is executed for the construction of hazard susceptibility maps. In the third component primary and secondary data are modeled and managed in a 3D/4D geo-database management system. The following subchapters provide an insight into the objectives and applied methods of the components.
EGIFF Component I: Development of an interconnected information and simulation system
System Architecture and data flow of the coupled system
At present, the application of the Finite-Element-Method (FEM) for the analysis and simulation of landslides is subject of research. Due to its complexity the corresponding simulation systems are predominantly used by experts and scientists. For disaster prevention and management, such tools are currently not available. An FE-analysis of presumably instable slopes requires detailed information about the subsoil structure, the occurring soil materials, the deformation history and the stress situation of the slope, etc. Therewith the configuration of the simulation input data is very complex and usually not sufficiently supported by the simulation system. Furthermore, simulation outputs are extensive and the interpretation of the simulation results is usually only weakly supported by the simulation system. For a broader use of simulation systems and for landslide susceptibility determinations, their handling should be more intuitive and user-friendly. GIS with their ability to store, manage and visualize geographical information provide a good basis for setting up the inputs of an FE simulation, analyzing and integrating the outputs to finally support a decision.
The interconnection between simulation system and GIS is schematically shown in Fig. 2. The process starts with the selection of relevant parameters which are required for the analysis. The parameter transfer is controlled by the GIS. These parameters basically describe the model geometry, the subsoil structure and additional boundary conditions. Within the simulation system the modeling of the slope and the simulation of the landslide evolution are executed.
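A hypothetical sketch of this parameter transfer is given below: parameters describing the model geometry, the subsoil structure and the boundary conditions are collected on the GIS side and written to a simple exchange file. The field names and the JSON format are assumptions for illustration, not the interface actually used between the GIS and the simulation system.

# Hypothetical sketch of the GIS-to-simulation parameter transfer described above.
import json

def export_simulation_input(path: str, dem_points, layers, boundary_conditions) -> None:
    model = {
        "geometry": {"surface_points": dem_points},   # DEM in local coordinates
        "subsoil": layers,                            # e.g. [{"name": "gravel", "phi": 35.0}]
        "boundary_conditions": boundary_conditions,   # e.g. fixed base, groundwater level
    }
    with open(path, "w", encoding="utf-8") as f:
        json.dump(model, f, indent=2)

# export_simulation_input("slope_model.json",
#                         dem_points=[(0.0, 0.0, 612.4), (5.0, 0.0, 611.9)],
#                         layers=[{"name": "quaternary gravel", "unit_weight": 21.0}],
#                         boundary_conditions={"groundwater_level": 598.0})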
After the analysis, the results are transferred to the GIS for visualization, assessment and for processing them into a form which is understandable for decision makers. Furthermore, i.a. stability indices and movement vectors can be calculated from the simulation results to assess the slope stability, the likely system behavior in future and the potential risk scenario. Uncertainties in the data used in the simulation and in subsequent processes should also be modeled and visualized in the GIS. In particular to support the user in the decision-
Fig. 2: Interconnection between simulation system and GIS
Fig. 3: Architecture of the decision support and the learning system
making process, the uncertainties have to be recognizable, in order to allow the validation of the results by the user. Additionally, rule-based GIS components support the user in the decision whether to issue an early warning or not. Operational modes of the coupled system Comprehensive and exhaustive simulations are computationally intensive and can be too time-
consuming in case of an early warning decision. Therefore two main operational modes of the coupled system with differing computational costs were identified (Ortlieb et al., 2009a). a) Learning system mode for better understanding of landslides and b) Decision support system (DSS) mode for prevention or reaction to a hazardous event.
Fig. 4: Support of the generation of the geotechnical model in the GIS
The learning system enables the evaluation of the consequences of various scenarios and allows a better understanding and prognosis of landslides. It can be used to analyze and compare different simulations which were performed under varying conditions or for different time intervals. Furthermore, the learning system makes it possible to compare observed historical events with simulated ones. Thus, critical events can be determined (e.g. a critical flood discharge). This allows warnings to be announced at an early stage when the critical event is expected or forecast (e.g. intense rainfall in the weather forecast). Another functionality of the learning system is the comparison of simulations with actual measured values. This enables the calibration and refinement of the simulations and improves the understanding of the geotechnical characteristics of the slope. The results gained in the learning system are stored in the database. This allows either executing further analyses in the learning system mode or applying the DSS (see Fig. 3). In contrast to the learning system, the DSS is used if an acute danger exists and immediate action is essential. Examples are intense rainfall or an approaching flood wave, which may destabilize the slope and cause a potential danger. Such an occurrence requires a fast decision on whether to issue an early warning or not. Because in most cases there is no time for complex, comprehensive and therefore time-consuming numerical modeling of the slope and simulation of the system behavior, it is necessary that information from previous simulations is already available in the database for
the actual case. If there has not been a simulation before, a new analysis has to be carried out. In this case a simplified FE-analysis is executed, because an exhaustive simulation would take too much time. How this simplified simulation can be carried out with satisfying accuracy and significance is still under investigation.
Generation of geotechnical models and slope stability analysis
Before a slope can be analyzed in the simulation system, an appropriate model of the system has to be set up which is able to reflect the physical processes leading to slope failures, e.g. landslides. The generation of these geotechnical models (Trauner and Boley, 2009) is supported by the GIS. First, the area of interest is selected within the GIS platform on the basis of an orthophoto. Instead of an orthophoto, various maps can be used, for example topographic maps or susceptibility maps, which may already indicate the necessity of further detailed investigations. The associated data for the ground and subsurface structure of the area of interest is obtained from the geo-database (see chapter »Component III«) and preprocessed in the GIS. The ground surface topography is represented by a digital elevation model (DEM), which is transformed to a local coordinate system before it is imported into the simulation system (see Fig. 4). In addition to the DEM, the subsurface structure is needed to generate a wire-frame model, which represents the geometry of the ground for the selected area of interest in the
simulation system (see Fig. 5). This information is obtained from borehole logs located in or around the selected area. Upon request, the data can either be retrieved from the geo-database, if available for that area, or provided by the user. The wire-frame block model generated for the area of interest represents a part of the subsurface continuum. For the FE analysis this continuum has to be discretized, i.e. a defined grid of points (nodes) is generated within the continuum and each node is connected to its neighbors. These connections represent the edges of the finite elements. The structure of finite elements represents the geometry defined by the wire-frame model and is called the Finite-Element mesh (FE-mesh). All characteristics or attributes of the material and of the actions are then assigned to the nodes or elements, respectively. Fig. 6 shows an FE-mesh based on the above-mentioned wire-frame model (Fig. 5), assembled from tetrahedral elements. The set-up of the FE-mesh is supported by an external mesh generator, since particular geometric design conditions have to be followed to achieve satisfactory performance of the numerical analysis. The mesh generator is integrated into the simulation system, but any external tool could be employed within the modular system architecture, if favored.
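The resulting discretization can be pictured with the following minimal data-structure sketch (nodes, tetrahedral elements referencing node indices, and a material attribute per element); in the project the mesh itself is produced by the external mesh generator, so this snippet is purely illustrative.

# Minimal sketch of an FE discretization: nodes, tetrahedra and per-element materials.
from dataclasses import dataclass, field

@dataclass
class FEMesh:
    nodes: list[tuple[float, float, float]] = field(default_factory=list)   # x, y, z
    tets: list[tuple[int, int, int, int]] = field(default_factory=list)     # node indices
    materials: list[str] = field(default_factory=list)                      # one entry per element

    def add_node(self, x: float, y: float, z: float) -> int:
        self.nodes.append((x, y, z))
        return len(self.nodes) - 1

    def add_tet(self, n0: int, n1: int, n2: int, n3: int, material: str) -> None:
        self.tets.append((n0, n1, n2, n3))
        self.materials.append(material)

mesh = FEMesh()
ids = [mesh.add_node(*p) for p in [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]]
mesh.add_tet(*ids, material="moraine")  # material name is an illustrative assumption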
Fig. 5: Wire-frame model for the area of interest
Fig. 6: FE-mesh for the wire-frame model (Fig. 5)
Fig. 7: Visualized deformation vectors from a 3D analysis
On the basis of the FE-mesh, all locations and magnitudes of loads (forces are arranged on nodes, edges or surfaces of finite elements) or geometric modifications (action effects in general) are defined. In addition to the geometric description of the model represented by the FE-mesh, the behavior of the soil material due to changes in the primary stress state has to be described by constitutive equations. Based on these mathematical functions, deformations of the nodes can be calculated if the application of loads or geometric changes causes the formation of a new equilibrium stress state.
When all information required for an FE-analysis is available, the data is put together in a single input file which then forms the basis for the computation. This input file is compiled such that the computation program can directly process the data.
During the computation, the defined action effects are applied incrementally to the slope, i.e. the loads are not applied with their full magnitude at once, but stepwise. If the entire action effects are finally applied successfully and an equilibrium stress state is determined, the deformations of the nodes and the degree of material utilization at different locations are obtained. If no equilibrium stress state is determined for the simulated scenario, the slope will fail and a landslide may occur. In this case, the latest possible equilibrium stress state defines the slope's ultimate limit state, i.e. all loads greater than the ones applied during incremental loading will result in failure of the slope. Although the computation aborts in this case, information on the expected deformations of the slope before its failure and on the critical magnitude of the action effects can be gained from these computations.
Relevant results from the analysis – with regard to the assessment of the endangerment by possible landslides (e.g. deformations, degree of material strength utilization, etc.) – are written to an output file and transferred back to the GIS for further processing and visualization.
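The incremental application of the action effects described above can be summarized schematically as in the following Python sketch; the model object, its methods and the toy failure criterion are placeholders standing in for the actual FE solver.

# Schematic sketch of incremental load application with an equilibrium check.
class SlopeModel:                       # placeholder standing in for the FE system
    def __init__(self):
        self.load = 0.0
    def current_state(self):
        return {"applied_load": self.load}
    def solve_equilibrium(self, increment) -> bool:
        self.load += increment
        return self.load <= 100.0       # toy failure criterion (assumed)

def apply_loads_incrementally(model, total_load, n_increments=10):
    applied = 0.0
    last_stable_state = model.current_state()
    for i in range(1, n_increments + 1):
        increment = total_load * (i / n_increments) - applied
        applied += increment
        if not model.solve_equilibrium(increment):
            # no equilibrium stress state found: the slope fails at this load level;
            # the previous state marks the ultimate limit state
            return {"failed": True, "ultimate_limit_load": applied - increment,
                    "state": last_stable_state}
        last_stable_state = model.current_state()
    return {"failed": False, "applied_load": applied, "state": last_stable_state}

print(apply_loads_incrementally(SlopeModel(), total_load=150.0))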
Preparation of the extensive simulation results in the GIS for decision support Results of the simulation include several parameters (e.g. stresses, strains or deformations, degree of material utilization), which can be referred to the nodes of the FE mesh. In Fig. 7 the result of a 3D simulation is shown. In this
Fig. 8: Deformation vectors for different areas
example the deformations of the FE nodes are visualized as deformation vectors. To identify the important parameters, namely the direction and length of the deformation vectors, the depiction has to be strongly enlarged, but then the overview of the whole slope is lost. In the following paragraphs a short outline of a methodology that allows for a user-friendly visualization of the complex simulation results is presented. The methodology is shown schematically on simplified 2D simulation results; for a more detailed version see Ortlieb et al., 2009b. The length of the deformation vectors may indicate whether the slope has to be categorized as susceptible to landsliding. This length results directly from the loads which were applied during the simulation: usually, the stronger these impacting loads have been, the larger are the deformations of the FE mesh nodes and therefore the lengths of the deformation vectors. According to their lengths, these deformation vectors can be divided into classes. Subsequently, the deformation vectors are divided into direction classes. Afterwards, clusters are detected which contain deformation vectors belonging to the same deformation class and to the same direction class; the clusters are spatially adjacent.
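A minimal sketch of this binning step, together with the aggregation to one representative vector per cluster described in the next paragraph, could look as follows; the bin widths are assumed values, and the spatial adjacency test and the polygon construction around the FE nodes are omitted.

# Sketch: bin deformation vectors by length and direction, then average per bin.
import numpy as np

def aggregate_vectors(positions: np.ndarray, vectors: np.ndarray,
                      length_bin: float = 0.01, angle_bin_deg: float = 30.0):
    lengths = np.linalg.norm(vectors, axis=1)
    angles = np.degrees(np.arctan2(vectors[:, 1], vectors[:, 0])) % 360.0
    keys = np.stack([lengths // length_bin, angles // angle_bin_deg], axis=1)
    aggregated = {}
    for key in np.unique(keys, axis=0):
        mask = np.all(keys == key, axis=1)
        aggregated[tuple(key)] = (positions[mask].mean(axis=0),  # cluster centroid
                                  vectors[mask].mean(axis=0))    # representative vector
    return aggregated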
The deformation vectors which belong to a cluster can be aggregated into one single deformation vector. For the aggregated deformation vectors, the area of validity has to be determined; for this purpose the FE mesh can be used. Around the FE nodes which belong to a cluster, polygons are built (see Fig. 8). The result of this method is a visualization which presents the characteristic movement tendencies of different areas, each with a related deformation vector. Besides the user-friendly preparation and visualization, the complex simulation results can be enriched by additional data from any other resources in the GIS to support the decision-making process. The modeling area can, for example, be intersected with available digital topographical data (see Fig. 9) to determine the potentially endangered human infrastructure (e.g. buildings and streets). This information can then be used to support the user in deciding whether to issue an early warning.
EGIFF Component II: Analysis and valuation of fuzzy textual descriptions of geo-scientific phenomena to support and improve early warning systems
The objectives during the development of component II are the collection, pre-processing and analysis of relevant structured and un-
Fig. 9: Modeling area with overlain topographical data
structured data (numerical and textual measurements, observations and descriptions by specialists and laymen (citizens)) related to natural hazards (alpine mass movements). The focus is on the development and application of novel computational methods aimed at a combination of results obtained from analyses of heterogeneous data sources, followed by a prototypical implementation as suitable components of an early warning system. In the following, the main working results concerning component II are described. They have been worked on collaboratively by FZI, IPF and disy Informationssysteme GmbH, Karlsruhe. The work in component II can be divided into a) conceptual work (01.04.2007–31.03.2008) and b) implementation/technical realization (01.09.2007).
Conceptual work
In the first year of the project, research into the application domain seen from the point of view of component II, followed by an investigation into previous work relevant to the objectives of component II, was conducted. The two main lines of work pursued by the project partners focused on (1) the investigation of statistical approaches to predictive modeling of landslide susceptibility/hazard, including an investigation of model-free (geo-
statistical) and model-based statistical techniques/pattern recognition methods (FZI), and (2) an investigation of approaches integrating natural language processing and knowledge representation for the task of an analysis of unstructured (i.e., textual) data relevant to the application domain (IPF). Analysis of structured data In the first year of the project, the first line of work (1) was followed by the selection of suitable methods combining GIS techniques and multivariate statistics/pattern recognition, a comparative analysis, an application on regional scale, and a dissemination step (Gallus et al., 2007a; Gallus and Kazakos, 2008). The progress in (1) was the result of a contribution from a preliminary stage, involving the collection of structured data, data preparation, and a pre-processing step (IPF). In the second year of the project, further work was directed into research, conceptual work and development related to improvement in the performance of the chosen techniques (classification performance, computational complexity), and further enhancements. Towards the goal of improving classification performance, approximate inference techniques for Gaussian process model classification were investigated, with focus on (fast) analytical approximation tech-
Fig. 10: a) Hazard map (Gaussian process model/Laplace approximation) – prediction for study area Hochtannberg/Arlberg (Vorarlberg), ArcGIS postprocessing b) Hazard map (Gaussian process model/Laplace approximation) – uncertainty of prediction for study area Hochtannberg/Arlberg (Vorarlberg), ArcGIS postprocessing
niques (PQL/ Laplace, Expectation Propagation). As a second line of research, an investigation into established and emerging web service/ Geo-Data Infrastructure (GDI) technology standards (OGC web coverage service (WCS), and web processing service (WPS)) was launched, in order to facilitate integration of statistical computation procedures into existing software (desktop GIS systems). As a first step towards integration, a prototypical implementation (Java) was developed, allowing for flexible, TCP/IP-based access to statistical computation procedures from desktop GIS software. Analysis of unstructured data For the second line of work (2) the SOKRATES system (Schade and Frey, 2004), a system combining natural language processing (NLP) and knowledge representation techniques, was investigated. The system, which can be used to display events on a tactical map by means of
limited automated interpretation of battle field reports, has been considered as a methodological basis for the envisioned system. However, an application of SOKRATES in component II was ruled out for several reasons, including the different application context, missing spatial and temporal concepts, and the missing consideration of uncertainties and fuzziness inherent in textual data. Instead, the open source framework GATE (Cunningham et al., 2002), used for all sorts of language processing tasks, including Information Extraction (IE) in many languages, was chosen. The IE module was realized with various gazetteers and JAPE grammars, in order to extract relevant information on different entities, including location and spatial relations, time, event, trigger, and damage/consequence. As a starting point for the development of the prototype, historical data, including 1008 text excerpts were collected to generate a test corpus. Furthermore, an empir-
ical survey was carried out investigating how humans describe spatial locations, in order to support interpretation and locality modeling (Andrienko and Andrienko, 2001; Schuffert, 2009). Different classes of spatial descriptions (Tab. 1) were distinguished according to Guo et al. (2008) and identified by means of GATE, in order to subsequently calculate coordinates and uncertainties. All named places, including features like river confluences or road junctions, are determined using the available Vorarlberg base maps, and feature generation will be done within a spatial database that stores these base maps as well as the text corpus and the annotations of the information extraction process. Various approaches exist for georeferencing locality descriptions, providing methods to handle the sources of imprecision and uncertainty resulting from an analysis of textual spatial references (Dilo, 2006; Guo et al., 2008; Hwang and Thill, 2005; Wieczorek et al., 2004). For the different classes of spatial descriptions, various approaches were investigated, followed by an exemplary modeling using a fuzzy set approach (Schuffert, 2009). Current work focuses on the evaluation and implementation of adequate modeling methods and on the combination of analysis results from structured and unstructured data (1) and (2) (see Fig. 11).
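Purely for illustration – the project's information extraction runs in GATE with gazetteers and JAPE grammars, not in Python – the following toy snippet mimics the kind of entities extracted (date, location, trigger, damage) with simple regular expressions and word lists; the vocabularies and the sample sentence are invented.

# Toy gazetteer-style extraction of date, location, trigger and damage entities.
import re

GAZETTEERS = {
    "trigger": ["heavy rainfall", "snowmelt", "thunderstorm"],
    "damage": ["road blocked", "house damaged", "bridge destroyed"],
}

def extract(text: str) -> dict:
    found = {"date": re.findall(r"\b\d{1,2}\.\d{1,2}\.\d{4}\b", text),
             "location": re.findall(r"\bnear ([A-ZÄÖÜ][\w-]+)", text)}
    for entity, terms in GAZETTEERS.items():
        found[entity] = [t for t in terms if t in text.lower()]
    return found

print(extract("On 23.06.1910 heavy rainfall triggered a debris flow near Schruns; the road blocked for days."))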
In the context of spatial data visualization, characteristics of the data related to reliability and precision of spatial references will be presented in a way that they satisfy the requirements for adequate usability. In particular, previously established qualities like priority or urgency of information and their possible interpretations are major aspects to be considered. Also, scale-dependencies will play a special role. EGIFF Component III: 3D/4D geo-database support for the geotechnical assessment of mass movements In EGIFF component III, measured and interpreted geological data are modeled and managed in a service-based geo-database management system. Fig. 12 shows an example for data of the application area »Isarhänge Grünwald« managed by the geo-database management system, visualized with the GoCAD®1 3D modeling and visualization system (Mallet, 1992; Mallet, 2002). In this component the geo-database is used for archiving data and for providing fast and efficient access to geo-data. This enables the re-use of geo-data, e.g. for upcoming new hazards. In addition, the geo-database can also manage models and their results and even become active, e.g. by computing geometric intersection queries between a set of geo-objects.
Tab. 1: Exemplary classes of spatial descriptions according to Guo et al. (2008).
Fig. 11: Common workflow to integrate results obtained from analyses of structured and unstructured data
Fig. 12: 3D database query example for data of the application area »Isarhänge Grünwald«
The following three main topics have been identified: 1. 3D Geometric and topological management of measured and interpreted geological data including time sequences. 2. Development and implementation of geometric/topological and time-dependent geo-database operations. 3. Management of FE model parameters and their results for the use in geotechnical evaluations of mass movements. For the first topic, selected measured and interpreted test data (e.g. extensometer data
and profile sections) from the application area »Isarhänge Grünwald« have been examined and managed in the geo-database. The second topic consists of developing geometric, topological and time-dependent geo-database operations, which support the geotechnical analysis of mass movements. Not only simple range calculations between point geometries (as it is the case in classical 2D GIS buffer operations) are of interest, but also the movement/speed of complex 3D geometries and the direction of the movement have to be considered. The third topic concerns the management of the input data and results from 3D model calculations within the geo-database.
During the FE model calculation a second kind of data interpretation takes place. The management of modeling results in a geo-database creates new possibilities for their further use. Furthermore, the results of a larger number of model calculations can be stored in the geo-database and, on later demand, compared and queried via different database views. For example, versions or scenarios of model calculations can be stored and compared with each other. The geo-database has been tested in the early-warning scenario »Isarhänge Grünwald«. Hitherto, the following heterogeneous data are managed by the geo-database:
– digital terrain model (resolution of 2 m);
– drilling profiles (borehole data);
– approximated surface model in 3D space.
DB4GeO
To meet the requirements of the three topics, DB4GeO, a service-oriented geo-database core, is used (Bär, 2007; Breunig et al., 2009a; Breu-
nig et al. 2009b) which is based on our experiences gained from GeoToolKit (Balovnev et al., 2004) and geological modeling (Alms et al., 1998; Mallet 2002). DB4GeO is developed as an extensible toolkit for 3D/4D geo-database services. It provides suitable data types and geometric geo-database services to support data analysis by complex geo-database queries. Furthermore, combined thematic and spatio-temporal database queries are supported by its data model. The easy-to-use internetbased interface enables the direct service access and usage from other geo-tools. DB4GeO is embedded into a simple service infrastructure. DB4GeO has been designed with a servicebased architecture right from the beginning and is exclusively implemented in the Java programming language, i.e. its interface provides Java-based services. Presently, REST (Fielding, 2000) is used as communication platform. The system architecture of DB4GeO is presented in Fig. 13. On the client side, GIS clients or mobile clients may access 3D data managed by the DB4GeO server. On the server side,
Fig. 13: System architecture of DB4GeO
DB4GeO may be accessed exclusively via its services, being divided into operations and version management. The central part of DB4GeO is its 3D/4D geodatabase which is based on a geometry library and the R-tree based spatial access structure. The latest version of DB4Geo is implemented upon the free object-oriented database management system db4o. Geometry library of DB4GeO The geometry library of DB4GeO consists of geometric and analytic objects. The relevant classes are Vector3D, Line3D, Plane3D, Point3D, Segment3D, Triangle3D, Tetrahedron3D and Wireframe3D. The class ScalarOperator makes methods available to compare
scalars with a given accuracy constant (space feature). All geometric comparisons are based on this principle to avoid errors due to imprecise representations of floating-point numbers. The class MBB3D defines a minimal bounding box of 3D objects. Based on the geometry library, a geometric model was implemented (Fig. 14). Its structure is symmetrical to the dimension of the geometric objects. Putting <Simplex> in place of the k-dimensional Point3D, Segment3D, Triangle3D or Tetrahedron3D, the model can be described as follows. The class <Simplex>Net3D has a number of <Simplex>Net3DComp components. The components have no topological coherence to each other. Each <Simplex> Net3DComp models the specific k-dimensional
Fig. 14: Geometric model of DB4GeO in UML notation (figure from (B채r, 2007))
simplicial complex through an adjacent disjoint set of 3D elements, i.e. <Simplex>Elt3D objects. There are no isolated simplexes within those components, i.e. they are topologically connected. <Simplex>Elt3Ds are finally direct subclasses of the supported simplex types collected by the geometric library. A special case of the implementation is the support of solids. The hull representation is established by 2-dimensional simplicial complexes implemented in ClosedHull3D and ClosedHull3DComp. Those classes wrap around TriangleNet3D and TriangleNet3DComp. Area calculations are passed to these more simple classes and volume calculations are implemented in the wrapping classes.
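DB4GeO itself is implemented in Java; the following Python sketch merely illustrates the wrapping idea described above, with the hull delegating the area computation to its triangle net and adding a volume computation from signed tetrahedron volumes of consistently oriented facets.

# Illustration of the wrapping pattern: area delegated, volume added by the wrapper.
import numpy as np

class TriangleNet3D:
    def __init__(self, triangles):            # list of triangles, each with three 3D vertices
        self.triangles = [np.asarray(t, dtype=float) for t in triangles]

    def area(self) -> float:
        return sum(0.5 * np.linalg.norm(np.cross(t[1] - t[0], t[2] - t[0]))
                   for t in self.triangles)

class ClosedHull3D:
    def __init__(self, triangle_net: TriangleNet3D):
        self.net = triangle_net

    def area(self) -> float:                   # passed through to the simpler class
        return self.net.area()

    def volume(self) -> float:                 # implemented in the wrapping class
        # signed tetrahedron volumes; assumes consistently oriented facets
        return abs(sum(np.dot(t[0], np.cross(t[1], t[2])) / 6.0
                       for t in self.net.triangles))

# unit tetrahedron hull: area = 1.5 + sqrt(3)/2, volume = 1/6
hull = ClosedHull3D(TriangleNet3D([
    [(0, 0, 0), (1, 0, 0), (0, 1, 0)], [(0, 0, 0), (1, 0, 0), (0, 0, 1)],
    [(0, 0, 0), (0, 1, 0), (0, 0, 1)], [(1, 0, 0), (0, 1, 0), (0, 0, 1)],
]))
print(round(hull.area(), 4), round(hull.volume(), 4))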
»4D« objects in DB4GeO
Time dependency is one of DB4GeO's key features. Thus, besides the retrieval of spatial data, we offer the ability to access temporal data in a simple and effective way. »4D« objects in DB4GeO are structured into components, which hold a unique sequence for every single geometry object, e.g. a triangle. A spatio-temporal component C consists of a set C = {seq1, seq2, …, seqn} of sequences whose temporal discretizations are all equal (Fig. 15). The elements of C build a contiguous and topologically invariant network.
A sequence S = {ste1, ste2, …, sten} consists of a series of n spatio-temporal states (ste). Every element stem+1 is a continuation of the element stem (1 ≤ m ≤ n–1). The implementation is realized with the class Point4DTubes. Each of these tubes consists of n Point4D elements and describes one point of an n-simplex (0 ≤ n ≤ 3) in the specified time interval.
Semantic model for spatial data types
The »semantic« (thematic) model (Fig. 16) implements the attributes which may be attached to the geometric objects. The theme is defined by the class ThematicObject3D, which implements the abstract class Thematic3D. Geometric objects are members of a thematic group in which attributes are defined. Floating-point numbers, integers, Boolean values, strings and vectors are supported. Likewise, an indexed table can be defined at runtime to include a collection of datasets for thematic purposes.
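Returning to the »4D« objects described above, the tube idea (one vertex tracked over equally discretized time steps) can be pictured with the following simplified sketch. It is an illustration only, not the actual Point4DTubes implementation, and the linear interpolation between stored states is our assumption.

import java.util.List;

// Simplified illustration of the tube idea: one vertex of a simplex tracked over
// a series of time steps. This is not DB4GeO's Point4DTubes class; the linear
// interpolation between stored states is an assumption.
public class PointTube {

    public static class Point4D {
        final double x, y, z, t;
        Point4D(double x, double y, double z, double t) {
            this.x = x; this.y = y; this.z = z; this.t = t;
        }
    }

    private final List<Point4D> states;  // ordered by time, equal discretization

    public PointTube(List<Point4D> states) {
        this.states = states;
    }

    // Position of the vertex at time t, interpolated between the two enclosing states.
    public double[] positionAt(double t) {
        for (int i = 0; i < states.size() - 1; i++) {
            Point4D a = states.get(i);
            Point4D b = states.get(i + 1);
            if (t >= a.t && t <= b.t) {
                double w = (t - a.t) / (b.t - a.t);
                return new double[] { a.x + w * (b.x - a.x),
                                      a.y + w * (b.y - a.y),
                                      a.z + w * (b.z - a.z) };
            }
        }
        throw new IllegalArgumentException("time outside the stored interval");
    }
}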
DB4GeO services
Fig. 17 shows the approximated surface model of the application area (a) and the result of applying one of DB4GeO's typical services, the so-called »3D-to-2D service«, to our application scenario »Isarhänge Grünwald« (b). This
Fig. 15: Structure of 4D objects in DB4GeO
Fig. 16: Thematic model of the database objects, shown in UML notation
Fig. 17: (a) Approximated surface model of application area »Isarhänge Grünwald«
(b) Result of 3D-to-2D service
approximated surface model has been designed manually with the 3D modeling tool GOCAD® (the GOCAD software is distributed by Paradigm) and then stored in DB4GeO. The 3D-to-2D service computes the intersection geometry between geologically defined surfaces and a vertical plane specified by the user. Internally, the basic 3D database operators (intersects, isContained, boundingBox) are applied to the surface model. In DB4GeO, spatio-temporal geo-objects are accessed in a simple and effective way. But how do we get the geo-objects at a specified time step out of the database? To handle this request, we have developed the »4D-to-3D service«. With this service we offer access to the following queries:
– »Get one specified geo-object at a specified time step«.
– »Get all geo-objects contained in a space4D at a specified time step«.
– »Get one specified geo-object and all its time representations in one response«.
– »Get all geo-objects and all their time representations in one response«.
The REST requests of these four sample queries run as follows:
1. http://server/projects/p1/area1/object?getTimeStep(STEP)
2. http://server/projects/p1/area1/all?getTimeStep(STEP)
3. http://server/projects/p1/area1/object.gml
4. http://server/projects/p1/area1/all.gml
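Such requests can be issued with any HTTP client. The following minimal Java sketch retrieves the snapshot of one geo-object at a given time step; the URL pattern is taken from the first sample query above, while the server name, project path and time step value are placeholders.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// Minimal sketch of a DB4GeO REST call; the URL pattern mirrors sample query 1,
// the server name and time step are placeholders.
public class TimeStepClient {
    public static void main(String[] args) throws Exception {
        int step = 3;  // hypothetical time step
        URL url = new URL("http://server/projects/p1/area1/object?getTimeStep(" + step + ")");
        HttpURLConnection con = (HttpURLConnection) url.openConnection();
        con.setRequestMethod("GET");

        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(con.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);  // 3D geometry of the object at the time step
            }
        }
    }
}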
Fig. 18: Results of the 4D-to-3D service at different time steps
As we can see in Fig. 18, the access to the temporal data is very simple and clearly structured. With every request we get the 3D geometry of the object at the specified time step. The 4D-to-3D service has explicitly been designed to retrieve snapshots of moving 3D objects. A typical workflow, i.e. a geo-database session in this scenario, is as follows: 1) Export the 3D model from a 3D modeling tool and import it into DB4GeO. 2) Use the DB4GeO services (e.g. the 3D-to-2D or 4D-to-3D service) for data retrieval and geometric computations. 3) Work with a spatio-temporal client application to analyze the temporal change of landslides.
Conception and prototypical implementations for using the thematic model of DB4GeO for the management of FE modeling parameters
The geometric representation of the Finite Element Model constructed in component I by the Abaqus® software may be stored in DB4GeO; the only prerequisite is an ASCII export provided by Abaqus. To manage the geometry of the Finite Element Model in the 3D/4D geo-database, the ASCII export has to be sent to the database via HTTP using the request method PUT. A minimal example call on the command line with the cURL program (cURL is used to transfer files without a browser) runs as follows:
curl -X PUT -s --header "Content-Type: text/plain" --header "Content-Encoding: gzip" --data-binary "@AbaqusGeometryExport.gz" --url http://localhost/egiff/isarhang-data-binary
These spatial and spatio-temporal services are ready to support the analysis of landslides. To provide further early warning functionality, such as »triggering« a client application when the content of the geo-database has changed, DB4GeO is being coupled with suitable services such as GeoRSS (http://georss.org).
Considering a »4D model«, it is planned to manage different geometric time/modeling steps of the FE model in the 3D/4D geo-database for the temporal analysis of mass movements. With the assistance of the 4D modules/components introduced above, the movement/speed of complex 3D geometries and the direction of movements will be considered.
DB4GeO's thematic model can be used for the management of geotechnical parameters for the materials of the surfaces used in numerical modeling. Exemplary geotechnical parameters are: density, dilatation, cohesion, and joint permeability factor. They can be retrieved from the geo-database by REST requests such as:
GET /egiff/isarhang/sandstone?getThematic(density)
Examples of thematic REST queries for some cells of a 3D geo-object are:
GET /egiff/isarhang/sandstone/componentID/elementID?getThematic(density<26)
GET /egiff/isarhang/sandstone/componentID/elementID?getThematic(cohesion!=10)
Adding a single geotechnical parameter can be done by the following REST request:
http://server/egiff/isarhang/sandstone?addThematic(density=26)
If several geotechnical parameters shall be added to an object, an ASCII text containing the geotechnical parameters can be sent to the 3D/4D geo-database:
density = 26
cohesion = 10
joint normal stiffness = 20
…
By extending the themes, it is possible to capture different semantics of geometrically identical objects. This information is important for the numerical modeling. In this context we also speak of the »thematic coloring« of the model. The data (scalars and vectors) calculated in the FE model for every cell (2D simplex) can also be taken into account by extending themes in DB4GeO:
GET /egiff/isarhang/sandstone/componentID/elementID?getVector
GET /egiff/isarhang/sandstone/componentID/elementID?getScalar
The thematic model of DB4GeO can be used to attach thematic attributes to 0D, 1D, 2D, and 3D simplexes, respectively.
No close coupling of the geo-database with the FE model is intended when using DB4GeO's thematic model. The aim of using the thematic model is the documentation of the process (movement/deformation history); therefore, the management of result geometries and their movements as well as the management of geotechnical parameters for the numerical modeling are provided.
4. Integration of the EGIFF components
One main challenge was to integrate the separate EGIFF modules into one coherent application in order to demonstrate how a user can interact with the separate services offered by EGIFF. In a first step, the text annotation and the hazard map calculation were integrated (see Fig. 19). The basic idea is that a user starts with a given text, coming for instance from a current message such as »landslide observed between village 1 and village 2«. Through the text analysis process developed within EGIFF, the main annotations are extracted. Special emphasis is put on the spatial annotations, like names of specific locations and terms such as »between« (1). The service proposes a list of terms which are then extended by their geometries (2) through a gazetteer. The text annotation itself also uses a gazetteer for the annotations. The user now has a list of the areas mentioned in the text and can zoom to one of the specific areas (3). In case a pre-calculated hazard map for the area is available, he can view the map by contacting the appropriate service (5). If no hazard map is available, he can call the service calculating hazard maps (4). This service uses the statistical methods developed within EGIFF and provides the calculated hazard map. In case the calculation takes longer, the service will publish the new map to a WMS server where the user can access this map. In the first step, the process was developed without the use of Web services. An extension to Web services is currently under development. To do so, we decided to use the reference architecture of the OGC, using the Internet as a service bus and Web Processing Services
Fig. 19: EGIFF service based integration of components
Fig. 20: EGIFF samples: Integration of text-analysis and statistical methods
(WPS), Web Feature Services (WFS) and Web Map Services (WMS) to demonstrate the application in a wider scenario. The application itself was developed as a plugin for the desktop GIS software disy GISterm, a Java-based software package widely used in environmental authorities in Germany and Austria. Fig. 20 gives some impressions of the current prototype.
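The interplay of the integrated components can be sketched as a simple orchestration. All interfaces in the following Java sketch are hypothetical stand-ins for the EGIFF text annotation, gazetteer and hazard map services; they merely mirror the numbered steps above and are not the actual service APIs.

import java.util.List;

// Hypothetical stand-ins for the EGIFF services; the real components are based
// on WPS/WFS/WMS and a desktop GIS plug-in and are not reproduced here.
interface TextAnnotator { List<String> extractSpatialTerms(String message); }

interface Gazetteer { double[] resolveBoundingBox(String placeName); }

interface HazardMapService {
    boolean hasPrecalculatedMap(double[] boundingBox);
    String requestMapCalculation(double[] boundingBox);  // returns a WMS layer name
}

// Sketch of the integration workflow described above.
class EgiffWorkflowSketch {
    static String process(String message, TextAnnotator annotator,
                          Gazetteer gazetteer, HazardMapService hazardService) {
        List<String> terms = annotator.extractSpatialTerms(message);  // step (1): spatial annotations
        double[] area = gazetteer.resolveBoundingBox(terms.get(0));   // step (2): attach geometries
        // step (3): the user zooms to one of the proposed areas (omitted here)
        if (hazardService.hasPrecalculatedMap(area)) {                // step (5): show existing map
            return "precalculated hazard map";
        }
        return hazardService.requestMapCalculation(area);             // step (4): calculate, publish via WMS
    }
}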
5. Cooperation with EWS joint projects and Geotechnologien main topics The EGIFF project seeks cooperation with other early warning system (EWS) projects within the GEOTECHNOLOGIEN research program. For example, the EGIFF project maintains a close cooperation with the staff of the alpEWAS project (Thuro et al., 2007; Singer et al., 2008). The objective of this project is to design and implement an integrated early warning
system for known (reactivated) and new landslides and mudflows. Innovative sensor measurement technology, comprising among others surface sensors (Global Navigation Satellite System), reflectorless tachymetric surveys and Time-Domain-Reflectometry systems, promises the availability of large amounts of accurate deformation data. The GIS/simulation coupling is also interesting for the alpEWAS project as a tool to show and clarify failure mechanisms. Hence, sensors can be installed in crucial locations to obtain the most valuable data. The concepts and results of the EGIFF and alpEWAS projects have been discussed intensively in joint sessions (Schuhbäck, 2008; Trauner, 2008). Furthermore, the EGIFF project was represented at the workshop »Transparenz schaffen, Synergien nutzen – Geodateninfrastruktur Komponenten und Sensor Observation Services« at Bonn University (Richter 2008). However, after the workshop the question remained open whether the early warning system projects within the GEOTECHNOLOGIEN research program really have the personnel capacity for additional intensive training and sustainable use of standards and infrastructures for spatial information. In future project phases we see realistic chances for EGIFF project cooperation with other joint EWS projects in the following fields:
a) Exchange of results gained from numerical methods and sensors developed for the analysis of mass movements
Data on surface and subsurface deformations, ground water level fluctuations and other additional information obtained from various sensor measurements are of great importance for numerical simulation models. These data are required to represent the geometrical characteristics of a study area in a preferably realistic manner and to calibrate the models by adjusting/choosing appropriate material parameters as well as constitutive equations. On the other hand, results obtained from numerical analysis may be valuable to explain triggers
of slope movements and deformations in detail, especially underground, which are often hardly visible on the slope surface. By providing this information to other EWS projects, the cooperation partners can select appropriate, best-suited sensors and install them in key positions.
b) Methodological exchange in the domain of quantitative assessment of landslide/mass movement hazard/susceptibility and beyond
In EGIFF component II, traditional and novel statistical classification methods have been selected, developed and evaluated against spatial data on a regional scale. The techniques developed in this component can also be used at a »higher level« of hazard research. Research towards an integration of data analysis techniques into the framework of emerging GDI technology standards is considered a promising direction. Regarding increasing extreme weather situations, relatively simple and fast methods are required for landslide hazard and risk assessment, also to provide people and decision makers with useful information to reduce hazards. Therefore, site-specific quantitative methods based on extensive data and formal analysis can contribute to the assessment of mass movement risk. Furthermore, historical frequency-magnitude analyses supply valuable information, whereby linguistic methods can play an important role. For example, the ILEWS subproject (Glade et al. 2007) develops/investigates methods for monitoring the frequency and magnitude of landslides through history in the Swabian Alb and South Tyrol regions in order to integrate historical analysis into early warning systems. However, the spatio-temporal distribution of past landslide events is only incompletely known. Nevertheless, all existing data, including historical information stored in archives, should be provided for an adequate reconstruction of time series. This is the first step towards conclusions regarding risk zones and expected distributions of future events. These methods could also be transferred to reconstruct previous historical volcanic eruptions (Hort et al., 2007; Exupéry, 2008).
c) Software exchange and discussion of standards, infrastructures for spatial information, and web-based GIS
As a consequence of the workshop »Transparenz schaffen, Synergien nutzen – Geodateninfrastruktur Komponenten und Sensor Observation Services« at Bonn University in April 2008 and further discussions at the status meeting in Osnabrück in October 2008, it should be further examined what effort is needed to achieve a more homogeneous software use in the EWS projects. Web-based GIS and Sensor Web Enablement (SWE) seem to be suitable »candidates« to start this process. Also, the effort for using and further developing or extending standards and infrastructures for spatial information should be further examined. However, the different priorities of the projects, such as different application areas, data types, methods, and software environments, have to be taken into account. Regarding software and data interoperability, the GML data exchange standard is used in the EGIFF project. The most promising cooperation partners in this field seem to be the ILEWS project (Glade et al. 2007), the SLEWS project (Arnhardt et al. 2007, Fernandez-Steeger et al. 2008), the alpEWAS project (Thuro et al. 2007, Singer et al. 2008), and the TRANSPORT project (Hohnecker et al. 2008). These projects are using modular software environments that could be used in more general EWS software architectures. Furthermore, the service-based 3D/4D geo-database access via the Internet realized in the EGIFF project could be interesting for other EWS projects. Finally, the plug-in techniques used by our commercial project partner disy Informationssysteme GmbH, Karlsruhe, could also contribute to a further integration of EWS projects in the future. Additionally, requirements for specific visualization and presentation methods should be exchanged between our project and the projects ILEWS (Glade et al. 2007) and SLEWS (Arnhardt et al. 2007), because advanced 3D/
4D visualization techniques should be used for all mass movement scenarios. Last but not least, the EGIFF project has maintained close cooperation with two other Geotechnologien main topics: the main topic »Technologies for safe and permanent storage of the greenhouse gas CO2« and the closed main topic »Sedimentary basins: The largest resource of humanity«. In both main topics EGIFF is cooperating with the group of H.-J. Götze/Sabine Schmidt/Andreas Thomsen (working group Geophysics and Geoinformation, Christian-Albrechts-Universität zu Kiel). These »trans-disciplinary connections« provide interesting insights into the need for GI technologies and data integration methods in the geosciences. Clearly, the GI methods needed are transferable to the EWS program.
6. Conclusions and outlook
In this article, objectives, requirements, and results of the GEOTECHNOLOGIEN joint project »Development of suitable information systems for early warning systems« (EGIFF) have been presented. In our work, the focus has been set on the design and implementation of new methods and on the combination of methods from the fields of simulation, GIS, spatial data mining, geo-databases and linguistics. The simulation of mass movements is executed on the basis of geotechnical, mechanically founded models. Thus a better understanding of the geological processes is achieved. The decision for early warning is supported by coupling the simulation of mass movements with GIS. Furthermore, methods for the statistical analysis of structured data and linguistic examinations of unstructured data have been presented. Finally, measured and interpreted data have been managed by an advanced 3D/4D geo-database to support the analysis and early warning of mass movements. The presented work is a first important step on the way to efficient recognition and early warning of mass movements. However, establishing operative early warning systems from scratch needs more time and future research.
Acknowledgements
The funding of the research project »Development of suitable information systems for early warning systems« (Entwicklung geeigneter Informationssysteme für Frühwarnsysteme) by the German Ministry of Education and Research (BMBF) under grant no. 03G0645A et al. within the framework of the Geotechnologien initiative (http://www.geotechnologien.de) is gratefully acknowledged. We also thank the Geotechnologien office for their support. Finally, we thank the Bayerisches Landesamt für Umwelt (LfU, www.lfu.bayern.de) and the Bayerisches Landesamt für Vermessung und Geoinformation (LVG, www.lvg.bayern.de) for providing data for the application area.
References
Alms, R., Balovnev, O., Breunig, M., Cremers, A. B., Jentzsch, T., Siehl, A. (1998): Space-time modelling of the Lower Rhine Basin supported by an object-oriented database. Physics and Chemistry of the Earth, Vol. 23, No. 3, Elsevier Science, 251–260. Andrienko, N. and G. Andrienko (2001): Intelligent Support for Geographic Data Analysis and Decision Making in the Web. Geographical Information and Decision Analysis 5(2): 115–128. Arnhardt, C., Asch, K., Azzam, R., Bill, R., Fernandez-Steeger, T. M., Homfeld, S. D., Kallash, A., Niemeyer, F., Ritter, H., Toloczyki, M., Walter, K. (2007): Sensor based Landslide Early Warning System – SLEWS. Development of a geoservice infrastructure as basis for early warning systems for landslides by integration of real-time sensors. Stroink, L. (ed.): Geotechnologien. Science Report No. 10. – 136 p., Kick-Off-Meeting 10. October 2007, Technical University Karlsruhe, 75–88. Balovnev, O., Bode, Th., Breunig, M., Cremers, A.B., Müller, W., Pogodaev, G., Shumilov, S., Siebeck, J., Siehl, A., Thomsen, A. (2004): The Story of GeoToolKit – an Object-oriented Geo-Database Kernel System. Geoinformatica 8:1, Kluwer Academic Publishers, 5–47.
Bär W. (2007): Verwaltung geowissenschaftlicher 3D Daten in mobilen Datenbanksystemen. Ph. D. Thesis, University of Osnabrück, Germany, 166 p. Bell, R., Glade, T. (2004): Quantitative risk analysis for landslides – Examples from Bíldudalur, NW-Iceland. – Natural Hazard and Earth System Science 4(1): 117–131. Breunig, M., Broscheit, B., Reinhardt, W., Ortlieb, E., Mäs, S., Boley, C., Trauner, F.-X., Wiesel, J., Richter, D., Abecker, A., Gallus, D., Kazakos, W., Bartels, A. (2008): Towards an information system for early warning of landslides. In: Möller A., Page B., Schreiber M. (eds.), Environmental Informatics and Industrial Ecology, 22nd Conference on Informatics for Environmental Protection, Lüneburg, Germany, 476-481. Breunig, M., Broscheit, B., Thomsen, A., Butwilowski, E., Jahn, M., Kuper, P. V. (2009a): Towards a 3D/4D geo-database supporting the analysis and early warning of landslides. Proceedings Cartography and Geoinformatics for Early Warning and Emergency Management: Towards Better Solutions, Prague, 10 p. Breunig, M., Schilberg, B., Thomsen, A., Kuper, P. V., Jahn, M., Butwilowski, E. (2009b): DB4GeO: Developing 3D geo-database services. Accepted for 4th international 3DGeoInfo workshop, Ghent, Belgium, 6 p. Cunningham, H., Maynard, D. et al. (2002): GATE: A Framework and Graphical Development Environment for Robust NLP Tools and Applications. Proceedings of the 40th Anniversary Meeting of the Association for Computational Linguistics (ACL’02). Philadelphia. Dikau, R., Weichselgartner, J. (2005): Der unruhige Planet – Der Mensch und die Naturgewalten, Wissenschaftliche Buchgesellschaft, Darmstadt, 191 p. Dilo, A. (2006): Representation of and reasoning with vagueness in spatial information: a system for handling vague objects. Disserta-
tion. Wageningen Universiteit. Mathematical and Statistical Methods Group. Enschede. Exupéry team (2008): Exupéry Volcano Fast Response System, Status seminar »Early Warning systems in earth management, University of Osnabrück,131–140. Fernandez-Steeger, T. M., Arnhardt, C., Niemeyer, F., Haß, S. E., Walter, K., Homfeld, D., Nakaten, B., Post, C., Asch, K., Azzam, R., Bill, R., Ritter, H., Tolczyki, M. (2008): Current Status of SLEWS – a sensor-based landslide early warning system. Fielding, R. Th. (2000): Architectural Styles and the Design of Network-based Software Architectures, Ph.D. Dissertation, University of California, Irvine, Information and Computer Science. Gallus, D., Abecker, A., and Richter, D. (2007): Classification of Landslide Hazard/Susceptibility in the Development of Early Warning Systems. SDH/SAGEO, Montpellier, France, Springer. Gallus, D., Kazakos, W. (2008): Einsatz von statistischen Methoden zur automatischen Erstellung von Gefährdungskarten in Frühwarnsystemen am Beispiel gravitativer Massenbewegungen. AGIT 2008, Salzburg. Geotech (2009): http://www.geotechnologien.de. Glade, Th., Dikau, R. (2001): Gravitative Massenbewegungen: Vom Naturereignis zur Naturkatastrophe. Petermanns Geographische Mitteilungen, 145 (6): 42–55. Glade, T., Becker, R., Bell, R., Burghaus, S., Danscheid, M., Dix, A., Greiving, S., Greve, K., Jäger, S., Kuhlmann, H., Krummel, H., Paulsen, H., Pohl, J., Röhrs, M. (2007): Integrative Landslides Early Warning Systems (ILEWS). – In: Stroink, L. (ed.): Geotechnologien. Science Report No. 10. – 136 p., Kick-Off-Meeting 10 October 2007, Technical University Karlsruhe, 89–100.
Guo, Q., Y. Liu and Wieczorek, J. (2008): Georeferencing locality descriptions and computing associated uncertainty using a probabilistic approach. International Journal of Geographical Information Science, Vol. 22, No. 10, 1067–1090. Hohnecker, E., Buchmann, A., Schöbinger, F., Wenzel, F., Titzschkan, T., Bonn, G., Hilbring, D., Quante, F. (2008): Early warning system for transport lines EWS Transport. In: Status seminar »Early Warning systems in earth management«, University of Osnabrück, 59–69. Hort M., Wassermann J., Dahm T. and the Exupéry working group (2007): Exupéry: Managing Volcanic Unrest – The Volcano Fast Response System. – In: Stroink, L. (ed.): Geotechnologien. Science Report No. 10. – 136 p., Kick-Off-Meeting 10 October 2007, Technical University Karlsruhe, 124–133. Hwang, S. and J.-C. Thill (2005): Modeling Localities with Fuzzy Sets and GIS, In: Cobb, M., Petry, F., and Robinson V. (eds) Fuzzy Modeling with Spatial Information for Geographic Problems, Springer-Verlag, 71–104. Mallet, J. L. (1992): GOCAD: a computer aided design program for geological applications. In: A. K. Turner (ed.), Three-Dimensional Modeling with Geoscientific Information Systems, NATO ASI 354, Kluwer Academic Publishers, Dordrecht, 123–142. Mallet, J. L. (2002): Geomodeling, Oxford Press, 599 p. Merritt, W. S., Letcher, R. A., Jakeman, A. J. (2003): A review of erosion and sediment transport models, Environmental Modelling & Software, Volume 18, Issues 8–9, October–November, 761–799. Ortlieb, E., Reinhardt, W. and Trauner, F.-X. (2009a): Development of a coupled geo information and simulation system for early warning systems. In: Cartography and Geoinformatics for Early Warning and Emergency
Management: Towards better Solutions Proceedings, Prague, 10 p. Ortlieb, E., Reinhardt, W., Trauner, F.-X. (2009b): Kopplung von Geoinformations- und Simulationssystemen zur Entscheidungsunterstützung. Paper accepted for: Tagungsband zu 45. AgA-Tagung – Automation in der Kartographie, Photogrammetrie und GIS, Frankfurt a. M., 10 p. Richter, D. (2008): EGIFF protocol from the workshop »Transparenz schaffen, Synergien nutzen Geodateninfrastrukturkomponenten und Sensor Observation Services«, Geographical Institute, Bonn University, 4.1.2008. Schade, U. and M. Frey (2004): Beyond Information Extraction: The Role of Ontology in Military Report Processing. KONVENS 2004 – 7. Konferenz zur Verarbeitung natürlicher Sprache, Schriftenreihe der Österreichischen Gesellschaft für Artificial Intelligence, Vol. 5, Wien. Schuffert, S. (2009): Untersuchung von Unsicherheiten räumlicher Beschreibungen und deren Modellierung, Diplomarbeit. Institut für Photogrammetrie und Fernerkundung, Universität Karlsruhe. Schuhbäck, A. (2008): Joint project »Entwicklung und Erprobung eines integrativen 3D-Frühwarnsystems für alpine instabile Hänge (alpEWAS)«, invited presentation at the EGIFF-project meeting, UniBW München, 03.12.2008. Singer, J., Wasmeier, P., Schuhbäck S., Thuro, K., Wunderlich, Th., Heunecke, O. (2008): The Aggenalm slide – a testing site for an integrative 3D early warning system. In: proceedings Status seminar »Early Warning systems in earth management, University of Osnabrück, 95–104. Smith, K. (2009): Environmental Hazards: Assessing Risk and Reducing Disaster, London. Thuro, K., Wunderlich, Th. & Heunecke, O. (2007): Development and testing of an inte-
grative 3D early warning system for alpine instable slopes (alpEWAS). – In: Stroink, L. (ed.): Geotechnologien. Science Report No. 10. – 136 p., Kick-Off-Meeting 10. October 2007, Technical University Karlsruhe, Programme und Abstracts, 10: 101–112. Trauner F.-X. (2008): Entwicklung geeigneter Informationssysteme für Frühwarnsysteme, invited presentation at the Workshop »Geotechnologien für instabile Hänge«, TU München, 05.01.2008. Trauner F.-X., Boley C. (2009): Application of geotechnical models for early warning systems to mitigate threats by landslides. Accepted for: 17th International Conference on Soil Mechanics and Geotechnical Engineering 2009, Alexandria, Egypt. Wieczorek, J., Q. Guo and Hijmans, R. J. (2004): The point-radius method for georeferencing locality descriptions and calculating associated uncertainty. International Journal of Geographical Information Science 18: 745–767.
Last-Mile – Numerical Last-Mile Tsunami Early Warning and Evacuation Information System Taubenböck H. (1, 2), Goseberg N. (3), Setiadi N. (4), Lämmel G. (5), Moder F. (6), Schlurmann T. (3)*, Oczipka M. (7), Klüpfel H. (8), Strunz G. (1), Birkmann J. (4), Nagel K. (5), Siegert S. (6), Lehmann F. (7), Dech S. (1, 2), Gress A. (9), Klein K. (9) (1) German Remote Sensing Data Center (DFD), German Aerospace Center (DLR), 82234 Wessling-Oberpfaffenhofen e-mail: hannes.taubenboeck@dlr.de; günter.strunz@dlr.de; stefan.dech@dlr.de (2) Julius-Maximilans University Würzburg, Geographical Institute, Earth Observation, 97074 Würzburg (3) Leibniz Universität Hannover, Franzius-Institut für Wasserbau und Küsteningenieurwesen, 30167 Hannover e-mail: goseberg@fi.uni-hannover.de; schlurmann@fi.uni-hannover.de (4) United Nations University, Institute for Environment and Human Security (UNU-EHS), 53113 Bonn e-mail: setiadi@ehs.unu.edu; birkmann@ehs.unu.edu (5) Technische Universität Berlin Verkehrssystemplanung und Verkehrstelematik, Institut für Land- und Seeverkehr, Fakultät V – Verkehr- und Maschinensysteme, 10587 Berlin e-mail: laemmel@vsp.tu-berlin.de; nagel@vsp.tu-berlin.de (6) Remote Sensing Solutions GmbH (RSS), 81667 München e-mail: moder@rssgmbh.de; siegert@rssgmbh.de (7) Institut für Robotik und Mechatronik, German Aerospace Center (DLR), Berlin-Adlershof e-mail: martin.oczipka@dlr.de; frank.lehmann@dlr.de (8) TraffGo HT GmbH, Falkstr. 73-77, 47058 Duisburg e-mail: kluepfel@traffgo-ht.com (9) Insitute of Computer Science II, University of Bonn, 53117 Bonn e-mail: gress@cs.uni-bonn.de; klein@cs.uni-bonn.de *Coordinator of the project: Prof. Dr.-Ing. Torsten Schlurmann, Leibniz Universität Hannover
1. Introduction
The joint research project funded by DFG/BMBF (Sponsorship code: 03G0666A-H) develops a numerical last-mile tsunami early warning and evacuation information system on the basis of detailed earth observation data and techniques as well as unsteady, hydraulic numerical modelling of small-scale flooding and inundation dynamics of the tsunami, including evacuation simulations in the coastal hinterland for the city of Padang, West Sumatra, Indonesia.
The joint research project is composed of five working packages, based on the field expertise of each project partner, namely socio-economic vulnerability assessment (WP 1000), inundation scenarios and flow analysis (WP 2000), geodatabase, information system and vulnerability assessment (WP 3000), evacuation analysis and pedestrian traffic optimization (WP 4000), and a highly resolved 3D model of Padang based on aerial large-scale topographic mapping (WP 5000). The project itself has completed its second project year. In the following sections, the current progress of each working package is briefly explained.
2. WP 1000: Socio-economic vulnerability assessment
Susceptibility and coping capacity as components of vulnerability are defined as the degree to which a system is able or unable to cope with the adverse effects of a hazardous impact (adapted from McCarthy et al., 2001). We attempt to measure the degree of vulnerability within the specific context of tsunami early warning and evacuation, i.e. the socio-economic vulnerability assessment should provide hints on the challenges that may appear due to the perception and behaviour of the people. Social groups confronted with the tsunami risk reveal, in reality, gaps between early warning technology and social response capability. These gaps are not covered by pure engineering approaches or models that normally use simplified assumptions on the social aspects. It is intended to investigate parameters that determine the susceptibility or coping capacity of various social groups during evacuation in order to understand the reasons for the identified gaps. Initially, the socio-economic factors influencing the exposure of various social groups to tsunamis, derived from daily activity patterns and mobility data, are further investigated using data obtained from the household questionnaires on socio-economic vulnerability. This provides complementary information to vali-
date the time-specific population distribution according to the physical structural pattern extracted from remotely sensed data. Subsequently, further analysis of the evacuation behaviour is conducted. As a starting point, the following hypotheses were used with regard to the relevant vulnerability themes:
– various social groups have different access to early warning,
– various social groups would respond differently to the warning and to the given evacuation instruction,
– various social groups would have different capabilities and considerations (e.g. evacuation routes, evacuation in groups) in conducting evacuation.
These hypotheses are tested using common statistical methods for descriptive analysis, correlation and regression analysis. As an example, the findings from the household questionnaires showed that during the September 2007 earthquake in the region of Padang, 26.5% of the people did not receive any tsunami early warning (Fig. 1). 51.3% received formal warning through television (TV) and radio; the rest (22.2%) was only informally notified. The effectiveness of warning dissemination through the different media varied in proportion to the availability of the respective media in the households. This may be
Fig. 1. Various responses to the warning during the last »potential« tsunami experience
explained by the utilization of these media during the time of the event. It indicates that some households have to take uninformed decisions because they receive different warning messages from different media. Results show that especially the informal notification and the interpretation of the warning information were difficult. Only a small proportion of the people evacuated after receiving the warning during the last experience (34%); however, the intention to evacuate in case of a future tsunami early warning showed higher rates (75%) (Setiadi 2008; Birkmann et al. 2008). Multinomial logistic regressions are employed in order to investigate which parameters influence the intention to evacuate. The preliminary results identify a model of evacuation intention including parameters with some degree of significant influence, namely knowledge of tsunamis, discussion about tsunami risk in the community, perception of people's vulnerability, knowledge of evacuation places, personal preference for safe places, self-efficacy in evacuation, and doubt about the tsunami early warning. The aggregation of these parameters shows the level of awareness and attitude towards the current early warning and evacu-
ation plan and will serve as a proxy for the potential response to tsunami early warning. Moreover, distance to the coast and household characteristics, such as household size and the presence of elderly people, also play a role. In order to estimate the capability of the people to conduct an evacuation on time, the following parameters are taken into account: particularly parameters related to the demographic distribution of vulnerable groups with limited running speed (and access to a vehicle, if evacuation by vehicle is considered) as well as the evacuation destination, related to evacuation knowledge, participation in evacuation drills and perceived safe places. The assessments reveal socio-economic parameters which shall be used as indicators for continuously monitoring the effectiveness of implemented and planned measures to reduce the vulnerability of the people, such as education/training programs. The assessment does not solely use questionnaires and qualitative research, but also incorporates spatial information extracted from remote sensing. This combination is an enriching approach for assessing vulnerability comprehensively.
Fig. 2. Effectiveness of various media in disseminating the warning during the last experience
Fig. 3. Knowledge of places and routes of evacuation
3. WP 2000: Tsunami Inundation Scenarios
For the tsunami inundation modeling, a hybrid approach was chosen to study wave propagation from the source onto dry land. In a first step, the source modeling and the deep-water propagation were carried out by the Alfred Wegener Institute (AWI) based on the model TsunAWI (Harig et al., 2007). Subsequently, a hydrodynamic inundation modeling tool called ANUGA (Australian National University and Geoscience Australia, Nielsen et al. 2005) is employed for the computation of run-up and on-land flow. A twofold strategy is adopted throughout our modeling for the city of Padang in order to account for the fact that the urban topology of our focus area is densely built up. On the one hand, we employ two different house masks derived from the remotely sensed imagery. On the other hand, we employ the highly resolved digital surface model, which is sufficiently represented by a maximal triangle area of about 20 square meters. A large set of sensitivity studies was carried out alongside the main study in order to estimate several effects resulting from the numerical computation of the governing equations. A common approach to calibrate and validate numerical studies is to compare them with historic tsunami inundation patterns. However, the availability of validation events for our numerical studies is very poor for the city of
Padang (Borrero et al. 2006). Therefore we decided to rely on a large set of pre-calculated scenarios with spatially distributed epicentres. Based on these scenarios, hazard zones are identified and communicated in order to conduct adjacent hazard and risk assessment studies. The pre-calculated scenarios were taken as boundary conditions for both a preliminary study and the full model domain. The results from the preliminary study revealed some important hints for the further modeling, published in Goseberg et al. (2009) and Goseberg and Schlurmann (2008). The most striking fact is that the employment of highly resolved data sets, in contrast to commonly available data, leads to channelized flow in major streets together with much higher currents. When a more sophisticated DEM is used, not only are flow velocities amplified, but flow depths and water levels in direct adjacency of the shoreline are also increased. Fig. 4 shows maximum inundation extents for two different model runs for the two modeling strategies discussed above. On the left, the maximum inundation is plotted using the digital surface model of Padang. Due to the mesh resolution, open spaces and built-up areas are not distinguishable in the result, since buildings that are clearly represented in the DSM degenerate under the applied mesh interpolation algorithms. Nevertheless, building structures from the DSM significantly affect the model results compared to the plain DEMs that are commonly used for such simulations. We
Fig. 4: Exemplary flood extent maps for the different modeling strategies applied. Left: Maximum inundation depth [m] for a model with digital surface model; Right: Maximum inundation depth [m] for a model using a medium house mask derived from remotely sensed data
Fig. 5: Exemplary hazard maps from preliminary results of our multi-scenario approach, compiled from 21 scenarios with the underlying DSM model derived from aerial image acquisition (cp. WP 5000); counts are normalized by the maximal number of scenarios; left: city center of Padang, right: harbour Teluk Bayur south of the city
found that when using DSM data or DEM data together with a house mask in our numerical studies, the inundation extent is smaller compared to simulation runs using only DEM data. The reason is that the flow is impeded by the house features existing in the model. The overland flow is reflected or decelerated while interacting with houses. On the other hand, the medium resolved house mask used in the combined simulation influences the inundation dynamics as well.
Finally, we generated probabilistic hazard maps from the modeling approach discussed above. The hazard maps presented in Fig. 5 have been compiled from the already post-processed scenarios (Mw 9.0) by counting every inundated raster cell in each scenario. For the probabilistic analysis, inundated raster cells were assigned a value of 1 and dry raster cells a value of 0. The resulting hazard map was deduced by summing up over the available scenarios. Fig. 5 shows the values normalized with respect to the maximal number of tsunami scenarios of the same moment magnitude, classified into 4 hazard zones. The resulting map products are finally being delivered to assess perspectives of vulnerability and to improve evacuation planning.
4. WP 3000: Contribution of Remote Sensing to a comprehensive early warning chain
Remote sensing is one scientific field contributing to a comprehensive early warning chain for a potential tsunami event in Padang, West Sumatra, Indonesia. Efficient risk and vulnerability management and accordingly substantial urban planning need precise, up-to-date and area-wide spatial knowledge of the urban situation. We used multi-sensoral remotely sensed data to provide a highly detailed city model as an information basis for the planning of field surveys, physical vulnerability analysis, socio-economic analysis (WP 1000), the analysis of interactions of the simulated tsunami impact (WP 2000) and evacuation planning (WP 4000). To ensure substantial collaboration among the project partners, a data exchange platform has been set up. As presented at the Status Meeting 2008, we derived a high resolution 3-D city model to provide a substantial, area-wide and up-to-date information basis on the urban landscape. The results are eight classes mapping the urban structure: houses, streets, sealed areas, grassland, trees, wetland, bare soil, and water. For the generation of a city model, parameters defining the urban objects have been added. In particular, building footprints were calculated as well as building heights, which were inferred from the correlation of a house and its corresponding shadow length. Utilizing the field work experience, land use was assessed as an additional feature of every building, basically differentiating between residential, mixed and commercial usage. Based on these data sets we also correlated population information with the static urban morphology to assess the time-dependent population distribution (Taubenböck et al., in press). The interdisciplinary combination of the hazard map, the 3-D city model and the time-dependent population distribution now facilitates the probabilistic quantification of scenario-dependent affected buildings and people. Tab. 1 shows the results for the 4 risk zones (cp. Fig. 5) and reveals for the worst case scenario in our test area that overall 37.347 buildings would be inundated in the different risk zones, and 227.100 people at daytime and 217.400 at night-time would be at risk.
Tab. 1. Quantification of affected buildings and time-dependent people with respect to the hazard map
Name of zone   | Probability   | Buildings affected | Buildings affected (%) | People affected (day-time) | People affected (night-time)
No risk        | 0,0%          | 50 657             | 57,56%                 | 285 600                    | 281 500
Low risk       | 0,01–5,0%     | 4 213              | 4,79%                  | 36 900                     | 32 400
Medium risk    | 5,01–15,0%    | 17 418             | 19,79%                 | 98 800                     | 102 100
High risk      | 15,01–45,0%   | 14 308             | 16,26%                 | 81 900                     | 75 700
Extreme risk   | 45,01–100,0%  | 1 408              | 1,60%                  | 9 500                      | 7 200
Total          |               | 88 004             | 100%                   | 512 700                    | 498 900
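The counting scheme behind the hazard maps of Fig. 5 and the zone-wise quantification of Tab. 1 can be illustrated with the following sketch. The data structures are placeholders, and only the zone boundaries follow the probability classes of Tab. 1; the code is not the project's actual implementation.

// Illustrative sketch: scenarios vote per raster cell (0/1), the normalized count
// is classified into the hazard zones of Tab. 1, and affected people are
// aggregated per zone. All data structures are placeholders.
public class HazardAggregation {

    // Fraction of scenarios flooding each cell: sum of 0/1 flags / number of scenarios.
    static double[][] hazardProbability(boolean[][][] floodedPerScenario) {
        int n = floodedPerScenario.length;
        int rows = floodedPerScenario[0].length, cols = floodedPerScenario[0][0].length;
        double[][] p = new double[rows][cols];
        for (boolean[][] scenario : floodedPerScenario)
            for (int r = 0; r < rows; r++)
                for (int c = 0; c < cols; c++)
                    if (scenario[r][c]) p[r][c] += 1.0 / n;
        return p;
    }

    // Hazard zones as in Tab. 1: 0 = no, 1 = low, 2 = medium, 3 = high, 4 = extreme risk.
    static int zone(double p) {
        if (p <= 0.0)  return 0;
        if (p <= 0.05) return 1;
        if (p <= 0.15) return 2;
        if (p <= 0.45) return 3;
        return 4;
    }

    // Aggregates (e.g. day-time) occupants per hazard zone for buildings at given cells.
    static double[] peoplePerZone(double[][] p, int[][] buildingCells, double[] occupants) {
        double[] totals = new double[5];
        for (int i = 0; i < buildingCells.length; i++) {
            int z = zone(p[buildingCells[i][0]][buildingCells[i][1]]);
            totals[z] += occupants[i];
        }
        return totals;
    }
}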
Furthermore, the 3-D city model allows localizing and identifying safe areas for various hazard scenarios as well as assessing stable structures in case of a tsunami event. Fig. 6 visualizes safe areas, which fulfil the following conditions: accessibility via the available street network; an area larger than 10.000 m² to be able to accommodate enough people in a rescue situation; a flat area with the land cover bare soil or grassland; and, with respect to the hazard map, a location outside any inundation area. In addition, stable structures are identified by point-wise field work, an analysis of 500 sample buildings by civil engineers. Correlating this stability analysis with the area-wide physical information derived from remote sensing allows an extrapolation to assess safe structures over the complete urban landscape of Padang with an accuracy of over 80% (Fig. 6). Finally, the 3-D city model supports the identification of appropriate locations for the planned sirens (towers) as one way to inform people in advance of a tsunami.
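The safe-area criteria listed above amount to a simple filter over candidate open spaces. The following sketch illustrates this; the CandidateArea type and its fields are placeholders standing in for the land-cover objects of the 3-D city model, not the project's data model.

import java.util.ArrayList;
import java.util.List;

// Sketch of the safe-area criteria listed above; CandidateArea is a placeholder type.
public class SafeAreaFilter {

    public static class CandidateArea {
        final double areaSqm;
        final boolean accessibleByStreet, flat, insideInundationZone;
        final String landCover;  // e.g. "bare soil", "grassland"

        CandidateArea(double areaSqm, boolean accessibleByStreet, boolean flat,
                      String landCover, boolean insideInundationZone) {
            this.areaSqm = areaSqm;
            this.accessibleByStreet = accessibleByStreet;
            this.flat = flat;
            this.landCover = landCover;
            this.insideInundationZone = insideInundationZone;
        }
    }

    static List<CandidateArea> selectSafeAreas(List<CandidateArea> candidates) {
        List<CandidateArea> safe = new ArrayList<>();
        for (CandidateArea a : candidates) {
            boolean suitableCover = a.landCover.equals("bare soil")
                                 || a.landCover.equals("grassland");
            if (a.accessibleByStreet            // reachable via the street network
                    && a.areaSqm > 10000.0      // large enough for a rescue situation
                    && a.flat
                    && suitableCover
                    && !a.insideInundationZone) // outside any modelled inundation area
                safe.add(a);
        }
        return safe;
    }
}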
5. WP 4000: Risk minimizing evacuation strategies under uncertainty
Time dependent evacuation routes: The overall egress time is a crucial aspect in most evacuation situations. There are many models that compute optimal routing strategies. Sometimes the area that has to be evacuated is time dependent. For instance, large-scale inundations or conflagrations do not cover the entire endangered area at once. One solution is to model this as a time-dependent network, where an endangered area is passable as long as the inundation or conflagration has not reached this area. Ways of implementing this approach within a microscopic multi-agent simulation have been shown in our previous work. However, that approach only works as long as the advance warning time is known beforehand. In the case of uncertain advance warning times, this approach no longer results in a routing strategy that is always risk minimizing.
Fig. 6. Street network, safe areas, and identification of safe structures in Padang
Risk minimization methods
Within the context of the »Last-Mile« project, only risk-decreasing move strategies are proposed. A move is defined as risk-decreasing if, after that move, the evacuee's distance to the danger is larger than before the move. Within the endangered area, the distance describes the temporal distance. For inundation scenarios, this means that the location of the evacuee before the move will be flooded earlier than the location of the evacuee after the move. But even people outside the directly affected area should keep some distance to the danger. This is important because otherwise those people could block evacuees from leaving the endangered area. Therefore, we propose an additional buffer around the endangered area that also has to be evacuated. Within this buffer, a move is defined as risk-decreasing if, after that move, the evacuee's spatial distance to the danger is larger than before the move. An illustration of this risk minimizing strategy is given in Fig. 7.
Simulation framework
The risk minimizing strategy has been implemented as an optimization module in a microscopic multi-agent based large-scale evacua-
tion simulation framework. The simulation framework is based on MATSim and has been discussed in many of our previous papers. During an evacuation run, each evacuee (agent) iteratively optimizes its personal evacuation plan. After each iteration, every evacuee calculates the cost of the most recently executed plan. Based on this cost, the evacuees revise their most recently executed plans. Some evacuees generate new plans using a time-dependent router. The others select an existing plan they have previously used.
Time dependent router
The outcome of a simulation of this kind relies strongly on the cost function of the router. The time-dependent router calculates a shortest path based on a generalized cost function. If the cost function only takes the expected travel time into account, the system converges towards a Nash equilibrium. To find a risk minimizing evacuation solution, the cost function is the sum of two cost components. The first component is the expected travel time for a link. The expected travel time corresponds to the experienced travel time from the previous iteration. The second component represents the risk costs. The risk cost for a link is zero if the ori-
Fig. 7. Illustration of the risk minimizing strategy: the boxes denoted with »FL TIME« show the flooding time for the corresponding crossings (nodes), and the boxes denoted with »DIST« show the distance of the corresponding crossings to the flooding area. The black arrows point towards lower risk
gin node is closer to the danger than the destination node. Otherwise, an additional risk cost, depending on the destination node's distance to the danger, will be applied. Since the solution should not be a trade-off between risk minimizing and fast evacuation routes, the risk cost has to be higher than the cost of the most expensive risk minimizing evacuation path.
Case study
The performance of the risk minimizing strategy will be demonstrated through the evacuation of the city of Padang. The advance warning time for a tsunami wave is expected to be between 15 and 30 minutes. The risk minimizing strategy will help to give appropriate evacuation recommendations even under uncertainty. However, the simulation results have shown that in particular areas an evacuation to higher ground does not seem achievable in such a short time period. Therefore we recommend establishing tsunami-proof shelters in such regions. For the local evacuation planner, the simulation framework could be used to find appropriate locations and sizes for the shelters. Currently we are working on the integration of the tsunami-proof shelters into the simulation framework. With this feature it will be possible to estimate the effect of a tsunami-proof shelter beforehand, which will help the local decision makers not only to find appropriate locations for the shelters but also with the dimensioning of the planned shelters.
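Schematically, the generalized link cost described above can be written as follows. This is an illustration of the idea rather than the actual MATSim-based module; the penalty constant and the distance inputs are assumptions.

// Schematic link cost for the time-dependent router: expected travel time plus a
// risk term that is zero only for risk-decreasing links. Not the actual MATSim
// module; RISK_PENALTY and the distance values are assumptions.
public class RiskAwareLinkCost {

    // Chosen to exceed the cost of the most expensive risk-minimizing evacuation
    // path, so that risk avoidance is never traded against travel time.
    static final double RISK_PENALTY = 1.0e6;

    static double linkCost(double expectedTravelTime,
                           double distOriginToDanger,
                           double distDestinationToDanger) {
        double riskCost = 0.0;
        if (distDestinationToDanger <= distOriginToDanger) {
            // Moving towards (or parallel to) the danger: penalize; the closer the
            // destination is to the danger, the higher the cost.
            riskCost = RISK_PENALTY + RISK_PENALTY / (1.0 + distDestinationToDanger);
        }
        return expectedTravelTime + riskCost;
    }
}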
6. WP 5000: Very high resolution topographic mapping of densely populated coasts in support of risk assessment of tsunami hazards in Indonesia
A data acquisition campaign was conducted using a Multi Functional Camera (MFC-3). The MFC-3 camera, developed by the German Aerospace Center DLR (Deutsches Zentrum für Luft- und Raumfahrt), is one of the most advanced digital aerial photographic systems worldwide in the context of spaceborne, airborne and terrestrial data acquisition. The camera's very high ground resolution of 25 cm
has a height estimation accuracy of +/– 40 cm. The high resolution DSMs, derived from the acquired stereo images and the multispectral image data, formed the basis of several working packages such as the inundation modeling and were the main data source for the 3-D Viewer applied for capacity building purposes. Coping capacity in the case of a disastrous event encompasses management and physical planning as well as social and economic capacity. This requires pooling all the different perspectives and results along the process chain in order to derive recommendations for decision-makers before, during and after an event. The basis for substantial management is realized in a tool where all results of the project are stored in a centralized geodatabase. The tool streams all necessary data via the internet to a specifically developed 3D client browser with a multilingual user interface. The browser enables the user, such as the city administration of Padang, rescue teams, etc., to view and analyze tsunami-relevant data in 3D at different levels of detail, even on the level of individual buildings. Because of the low data transfer rates in Indonesia's IT infrastructure, it was essential to build on very efficient multi-resolution techniques for visualizing the DSM (Wahl et al. 2004) and for efficiently mapping GIS data on it (Schneider & Klein 2007). These techniques further improve the geometry representation (Wahl et al. 2008), the texturing quality (Schneider & Klein 2008), as well as caching and streaming, in such a way that the usability and efficiency of the viewer application within a restricted network infrastructure could be assured while retaining high rendering quality. The 3D visualization of geospatial data (Fig. 8a), adapted to the special requirements of hazard management such as tsunami events, offers an effective tool even for inexperienced GIS users to interpret geospatial data. This is indispensable if the results are to be spread to a wider audience, such as in Padang, with its limited computing and IT infrastructure. The maintenance and update of the geospatial data is very cost-efficient due to the centralized data storage.
Fig. 8. (a) Landmark building, Safe buildings (identified by the University Würzburg – WP 3000) (b) Maximum number of persons per square meter in case of an evacuation, from 0.1 persons per square meter in green to 5.4 persons per square meter in red (calculated and provided by TU Berlin and visualized in the 3-D Viewer of RSS GmbH, superimposed on the aerial image of DLR)
The tool intends to improve coping capacity in manifold ways: by combining the interdisciplinary research results and visualizing them in a consistent manner (building on the concepts described in Greß & Klein 2009), it supports decision making before, during and after a disastrous event. Before the event, we aim at the correct assessment of various hazard scenarios to quantify exposed susceptible elements and people, assess their coping capacity and identify bottlenecks for evacuation. Furthermore, the web application is used as a platform for information dissemination, and thus aims at awareness raising and guidance of political will. This analysis is the basis for recommendations in preparation for the expected disaster. During and after the tsunami event, the platform is used to spatially plan and manage rescue measures.
7. Conclusion
The results provide a large spatial data basis for interdisciplinary research within the Last-Mile project, and in comparison to most other studies, we provide high geometric and thematic detail: e.g. the building mask is used as an input parameter for the inundation modeling of a potential tsunami (Goseberg et al., 2009), and the derived street network is used for evacuation modelling (Lämmel et al., 2008). Furthermore, the 3-D city
model is used to distribute the survey samples (Setiadi, 2008). Multi-layer analysis allows quantifying and localizing exposed elements of the urban landscape (Taubenböck et al., in press). Furthermore, the results are combined and presented in the 3-D Viewer which, as described in the previous section, supports decision making before, during and after a disastrous event. In addition, the results of Last-Mile are actively contributed to the Padang consensus process through many workshops and meetings with decision-makers and stakeholders in Padang.
8. Bibliography
Birkmann, J., Setiadi, N. & Gebert, N. (2008): Socio-economic vulnerability assessment at the local level in context of tsunami early warning and evacuation planning in the city of Padang, West Sumatra, Paper for International Conference on Tsunami Warning, November 12–14, 2008, Nusa Dua, Indonesia.
Borrero, J. C., Sieh, K., Chlieh, M. & Synolakis, C. E. (2006): Tsunami inundation modeling for western Sumatra. Proceedings of the National Academy of Sciences of the United States of America, 2006, 103, 19673–19677. PNAS online publication, available at http://www.pnas.org
Nielsen, O., Roberts, S., Gray, D., McPherson, A. & Hitchman, A. (2005): Hydrodynamic modeling of coastal Inundation. In: Zerger, A. (Hrsg.): MODSIM 2005 Int. Congress on Modeling and Simulation, pp. 518–523, Modeling and Simulation Society of Australia and New Zealand, 2005.
Goseberg, N., Stahlmann, A., Schimmels, S., Schlurmann, T. (2009): Highly resolved numerical modeling of tsunami run-up and inundation scenario in the city of Padang, West Sumatra. Proc. of the 31st Int. Conference on Coastal Engineering, 2009 (in press).
Schneider, M. & Klein, R. (2007): Efficient and Accurate Rendering of Vector Data on Virtual Landscapes. Journal of WSCG (Jan. 2007), 15: 1–3 (59–64).
Goseberg, N. & Schlurmann, T. (2008): Relevant factors on the extent of inundation based on tsunami scenarios for the city of Padang, West Sumatra. Proc. of the International Conference on Tsunami Warning (ICTW), 2008. Greß, A. & Klein, R. (2009): Visualization Framework for the Integration and Exploration of Heterogeneous Geospatial Data. Proceedings of Eurographics 2009 – Areas Papers, pages 27–34, Eurographics Association, Apr. 2009. Harig, S., Chaeroni, C., Behrens, J., Schroeter, J. (2007): Tsunami Simulations with unstructured grids (TsunAWI) and a comparison to simulations with nested grids (Tsunami-N3). 6th Int. Workshop on Unstructured Mesh Numerical Modeling of Coastal, Shelf and Ocean Flows, London, 2007. Lämmel, G., Rieser, M., Nagel, K., Taubenböck, H., Strunz, G., Goseberg, N., Schlurmann, T., Klüpfel, H., Setiadi, N., and Birkmann, J. (2008): Emergency Preparedness in the case of a Tsunami – Evacuation Analysis and Traffic Optimization for the Indonesian city of Padang, Proc. of the 4th International Conference on Pedestrian and Evacuation Dynamics, Wuppertal, Germany.
McCarthy, J. J., O. F. Canziani, N. A. Leary, D. J. Dokken, & K. S. White (eds.): (2001): Climate Change: Impacts, Adaptation and Vulnerability. Cambridge: Cambridge University Press.
Setiadi, N. (2008): WP 1000: Socio-Economic Vulnerability Assessment – Numerical Last-Mile Tsunami Early Warning and Evacuation System. Presentation for the Status Meeting Geotechnologies, October 8, 2008, Osnabrück. Taubenböck, H., Goseberg, N., Setiadi, N., Lämmel, G., Moder, F., Oczipka, M., Klüpfel, H., Wahl, R., Schlurmann, T., Strunz, S., Birkmann, J., Nagel, K., Siegert, F., Lehmann, F., Dech, S., Gress, A., Klein, R. E. (in press): ›LastMile‹ preparation for a potential disaster – Interdisciplinary approach towards tsunami early warning and an evacuation information system for the coastal city of Padang, Indonesia. In: Natural Hazards and Earth System Sciences (NHESS). Wahl, R., Massing, M., Degener, P., Guthe, M. & Klein, R. (2004): Scalable Compression and Rendering of Textured Terrain Data. Journal of WSCG (Feb. 2004), 12:3(521–528). Wahl, R., Schnabel, R. & Klein, R. (2008): From Detailed Digital Surface Models to City Models Using Constrained Simplification. Photogrammetrie, Fernerkundung, Geoinformation (2008): 3 (207–215).
83
EDIM – Earthquake Disaster Information System for the Marmara Region, Turkey Wenzel F. (1)*, Erdik, M. (6), Köhler N. (1), Zschau J. (2), Milkereit C. (2), Picozzi M. (2), Fischer J. (3), Redlich J. P. (3), Kühnlenz F. (3), Lichtblau B. (3), Eveslage I. (3), Christ I. (4), Lessing R. (4), Kiehle C. (5) (1) Geophysical Institute, Karlsruhe University (TH) e-mail: friedemann.wenzel | nina.koehler@gpi.uni-karlsruhe.de (2) GeoForschungsZentrum (GFZ) Potsdam e-mail: zschau | online | picoz@gfz-potsdam.de (3) Computer Science Department, Humboldt University Berlin e-mail: fischer | jpr | kuehnlenz | lichtbla@informatik.hu-berlin.de; eveslage@gmail.com (4) DELPHI IMM GmbH, Potsdam e-mail: Ingrid.christ | rolf.lessing@delphi-imm.de (5) lat/lon GmbH, Bonn e-mail: kiehle@lat-lon.de (6) Kandilli Observatory and Earthquake Research Institute (KOERI), Bogazici University, Istanbul e-mail: erdikm@gmail.com *Coordinator of the project: Prof. Dr. Friedemann Wenzel, Karlsruhe University
1. Introduction The main objectives of EDIM were to enhance the Istanbul earthquake early warning (EEW) system with a number of scientific and technological developments that – in the end – provide a tool set for EEW with wide applicability. Innovations focus on three areas: (1) analysis and options for improvement of the current system; (2) development of a new type of self-organising sensor system and its application to early warning; (3) development of a geoinformation infrastructure and geoinformation system tuned to early warning purposes. Development in the frame of the Istanbul system, set up and operated by KOERI, allows testing our novel methods and techniques in an operational system environment and working in a partnership with a long-standing tradition of success. EDIM is a consortium of Karlsruhe University (TH), GeoForschungsZentrum (GFZ) Potsdam, Humboldt University (HU) Berlin, lat/lon GmbH Bonn, DELPHI Informations Muster Management
GmbH Potsdam, and Kandilli Observatory and Earthquake Research Institute (KOERI) of the Bogazici University in Istanbul. The work packages (WPs) are distributed among the project participants as follows:
– WP A: Real-time information from a regional accelerometer network (Karlsruhe University)
– WP B1: The Self-Organising Seismic Early Warning Information System (GFZ Potsdam)
– WP B2: Infrastructure of Self-Organising Sensor Systems (HU Berlin)
– WP C1: Development of a Dynamic Geoinformation Infrastructure (DELPHI IMM GmbH)
– WP C2: EDIM Information System (lat/lon GmbH).
2. Real-time information from a regional accelerometer network This work package aims at the evaluation and optimisation of the existing EEW system for Istanbul, and at establishing the best possible database of seismic events in the Marmara region. As already described in last year's status report
[Köhler & Wenzel, 2008], a set of 280 simulated earthquake scenarios located along the segments of the Main Marmara Fault has been developed for this purpose at Karlsruhe University. Subsets of the simulated data are used by DELPHI IMM as input scenarios for estimating building damage in Istanbul. These estimates are based on spectral displacement values calculated from the spectral acceleration of the scenario earthquakes delivered by Karlsruhe University. The spectral acceleration is calculated from the response spectra that describe the peak motion response of a single-degree-of-freedom elastic structure (building) to a base acceleration (seismic ground motion). With a damping of 5%, the spectral acceleration is calculated for pre-defined periods. A comparison of the peak ground acceleration (PGA) and spectral acceleration with standard attenuation relationships from the literature [Boore et al., 1997; Campbell & Bozorgnia, 2008; Özbey et al., 2004] showed that, for a distance range of about 10–100 km, our simulated ground motion is of high quality. However, the correlation with the literature values is slightly better for larger magnitudes (M 7) and hard rock sites than for soft rock sites.
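To make the response-spectrum step concrete, the sketch below computes a 5%-damped pseudo-spectral acceleration for a set of pre-defined periods by time-stepping a single-degree-of-freedom oscillator. It uses a synthetic accelerogram and a simple integrator for illustration only and is not the processing chain used in the project.

import numpy as np

def pseudo_spectral_acceleration(acc, dt, periods, damping=0.05):
    """PSA = wn^2 * max|u| for damped SDOF oscillators driven by a base acceleration."""
    psa = []
    for period in periods:
        wn = 2.0 * np.pi / period                        # natural circular frequency
        sub = max(1, int(np.ceil(20.0 * dt / period)))   # sub-steps so the step stays << period
        h = dt / sub
        u = v = 0.0
        umax = 0.0
        for ag in acc:                                   # zero-order hold of the ground acceleration
            for _ in range(sub):
                # semi-implicit Euler for u'' + 2*damping*wn*u' + wn^2*u = -ag
                v += h * (-ag - 2.0 * damping * wn * v - wn * wn * u)
                u += h * v
                umax = max(umax, abs(u))
        psa.append(wn * wn * umax)
    return np.array(psa)

# demo: synthetic accelerogram (white noise), 100 Hz sampling, 20 s long
rng = np.random.default_rng(0)
acc = rng.normal(0.0, 0.5, 2000)                 # m/s^2
periods = np.array([0.1, 0.2, 0.5, 1.0, 2.0])    # s
print(pseudo_spectral_acceleration(acc, 0.01, periods))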
Because of this year's 500th anniversary of the historic 1509 Istanbul earthquake, we decided to additionally simulate this earthquake to investigate the effects of such a major event on today's city of Istanbul.
According to Ambraseys [2001], the earthquake ruptured a 70 ± 30 km long fault segment in the Sea of Marmara. Violent and protracted ground shaking was caused in Istanbul, and probably also considerable damage in its immediate vicinity. About 13,000 people were killed by the earthquake. We simulated the event, analogously to the existing synthetic data, using FINSIM, a stochastic simulation method for finite faults [Beresnev & Atkinson, 1997]. We assumed a moment magnitude of 7.3 and a rupture length of 70 km. The event is located on the central fault segment between Istanbul and the Central Basin. The fault width is set to 18.9 km, with a fault strike of 265° and a dip of 90°. The depth to the upper fault edge is set to 0.4 km. For calculating the acceleration time series, the fault is divided into 7 sub-faults along strike and 3 sub-faults along dip. The slip on each sub-fault is randomly drawn from a normal distribution and varies between 0.0 and 6.6 m. The acceleration time series are calculated for Istanbul on a dense grid with 0.005° x 0.005° spacing, taking into account the site classifications of the grid nodes. The hypocentre location is set to the middle of the fault. To investigate directivity effects, we also set the hypocentre location to both ends of the fault.
After comparing PGA and spectral acceleration with the above-mentioned attenuation relationships from the literature, we set the average stress drop to 5 MPa – this led to the best correlation of the simulated ground motion with the literature values (Fig. 1).
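Purely for illustration, the scenario parameters above could be collected as in the sketch below; the container and field names, and the mean and standard deviation used for drawing the sub-fault slip, are assumptions and do not reflect the actual FINSIM input format.

from dataclasses import dataclass
import numpy as np

@dataclass
class FiniteFaultScenario:
    magnitude: float = 7.3        # moment magnitude
    length_km: float = 70.0       # rupture length
    width_km: float = 18.9        # fault width
    strike_deg: float = 265.0
    dip_deg: float = 90.0
    depth_top_km: float = 0.4     # depth to the upper fault edge
    n_along_strike: int = 7       # sub-faults along strike
    n_down_dip: int = 3           # sub-faults along dip
    stress_drop_mpa: float = 5.0  # value giving the best fit to the attenuation relations

    def random_slip(self, rng, mean=3.3, sigma=1.1, max_slip=6.6):
        """Draw normally distributed sub-fault slip, clipped to [0, max_slip] m (assumed moments)."""
        slip = rng.normal(mean, sigma, (self.n_down_dip, self.n_along_strike))
        return np.clip(slip, 0.0, max_slip)

scenario = FiniteFaultScenario()
print(scenario.random_slip(np.random.default_rng(1)))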
Fig. 1. Attenuation of PGA with distance from the earthquake for hard (left) and soft rock sites (right) of the 1509 scenario, with the hypocentre located in the middle of the fault. The simulation curves are smoothed over 5 km
Fig. 2 shows the simulated ground motion maps for Istanbul for the three different hypocentre locations. The PGA values range between 23.1 and 456.9 cm/s² for the hypocentre located in the middle of the fault, between 21.1 and 391.2 cm/s² for the hypocentre located at the eastern end of the fault, and between 25.3 and 445.9 cm/s² for the hypocentre located at the western end of the fault. The seismic intensity values range between 4.4 and 9.1, 4.2 and 8.6, and 4.3 and 9.0, respectively. The ground motion depends strongly on the site classifications and the distance to the fault. In all three scenarios, the strongest ground motions occur in the European part of Istanbul. The fact that the nearest hypocentre position does not automatically generate the strongest ground motion is caused by directivity effects. Another task of last year was the performance evaluation of PreSEIS, a neural network-based approach to EEW developed at Karlsruhe University [Böse et al., 2008], using real earthquake observations. The method is used as a benchmark for performance tests of the Istanbul EEW system. However, due to the lack of real earthquake observations in the Marmara region, we tested the performance of PreSEIS using a dataset of southern California earthquakes instead. The results of this successful study have been accepted for publication by Seismological Research Letters
(Köhler, N., Cua, G., Wenzel, F., and M. Böse (2009). Rapid source parameter estimations of southern California earthquakes using PreSEIS. Seismological Research Letters, in press, DOI 10.1785/gssrl.80.5.743). 3. The Self-Organising Seismic Early Warning Information System The Self-Organising Seismic Early Warning Information Network (SOSEWIN) is technically a decentralised, wireless mesh sensor network, made up of low-cost components, with a special seismological application that supports EEW and rapid response tasks. The development of SOSEWIN focuses on two points: the first is the design of the low-cost nodes themselves, while the second is the network's self-organising, decentralised character. The low-cost nodes consist of an embedded computer equipped with 2 WLAN Mini-PCI cards, a compact flash card as data storage, and a digitizer board with MEMS accelerometers and a GPS unit which delivers the seismic data. The basic system software running on a node consists of OpenWRT (Linux), OLSR as a self-organising mesh network routing protocol, a SeedLink server, and further self-written components to retrieve, store and process the acceleration data measured by the SOSEWIN nodes. The system is developed by GFZ Potsdam and HU Berlin.
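As a rough orientation (with assumed acquisition parameters, not SOSEWIN specifications), the following back-of-envelope sketch estimates the raw data rate of one node and of the 20-station test network described below, and checks that it fits comfortably into a 1 Mbit/s wireless link.

# assumed acquisition parameters, not SOSEWIN specifications
SAMPLE_RATE_HZ = 100      # samples per second and component
COMPONENTS = 3            # MEMS accelerometer: two horizontal + one vertical
BYTES_PER_SAMPLE = 4      # 32-bit samples, ignoring MiniSEED compression
NODES = 20                # size of the Istanbul test-bed network

node_bps = SAMPLE_RATE_HZ * COMPONENTS * BYTES_PER_SAMPLE * 8   # bits per second per node
network_bps = node_bps * NODES

print(f"per node : {node_bps / 1e3:.1f} kbit/s")
print(f"20 nodes : {network_bps / 1e3:.1f} kbit/s")
print(f"fits into a 1 Mbit/s link: {network_bps < 1e6}")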
Fig. 2. Distribution of simulated PGA (top) and seismic intensity (bottom) of the 1509 scenario in Istanbul. The black lines represent the fault, and the solid black circles mark the location of the epicentre
GFZ activities focused on testing the SOSEWIN system and on identifying operational problems: significant progress has been made during the last year with regard to utilizing the system for seismic early warning and for civil infrastructure monitoring. Since July 2008, a first test-bed deployment of the SOSEWIN EEW system has been operating in the Ataköy district of Istanbul. The network of 20 stations provides a continuous stream of data which a SeisComP server at GFZ collects in real time and distributes to third parties (e.g. KOERI, HU Berlin, and lat/lon). In cooperation with the colleagues at HU Berlin, the performance of the network is continuously monitored. However, since no relevant seismicity has been observed close to Istanbul during this period, the preliminary tests of the test-bed network performance focused on the various aspects of communication. The main positive results are – the performance and the long-term stability of the sensor nodes as strong motion sensors, which have proven to run stably for several months; – the performance of the installed network and its self-organisation capability. During this period, several problems were also encountered and solved: problems with the WLAN drivers were observed, and the transmission rate of the accelerometric data had to be throttled to 1 Mbit/s. However, despite the low transmission rate, there is still enough bandwidth for streaming all data out of the network with SeedLink. Modifications of the SOSEWIN software by HU Berlin allow the data rate to be increased above 1 Mbit/s. Over the longer term we observed problems with the performance of standard, commercial CompactFlash (CF) cards (which act as the hard disk of the SOSEWIN stations). In order to solve these problems, we tested new industrial-grade CF cards. These new hardware components showed a higher level of reliability.
SOSEWIN's software was optimised for the new CF cards by HU Berlin. A manuscript describing the SOSEWIN philosophy, hardware, and software, as well as an overview of the communication performance of the first test-bed SOSEWIN deployed in Istanbul, has been accepted for publication by Seismological Research Letters (Fleming, K., Picozzi, M., Milkereit, C., Kuehnlenz, F., Lichtblau, B., Fischer, J., Zulfikar, C., Ozel, O., and the SAFER and EDIM working groups. The Self-Organising Seismic Early Warning Information Network (SOSEWIN), accepted for publication by Seismological Research Letters). During summer 2008, the suitability of the SOSEWIN system for monitoring the vibration characteristics and dynamic properties of strategic civil infrastructures was tested. In particular, an ambient vibration recording field test was performed on the Fatih Sultan Mehmet Bridge in Istanbul. The bridge is also equipped with a traditional vibration monitoring system comprising 5 Guralp Systems CMG-5TD instruments. These instruments are located inside the deck at its edges and provide continuous data by transmission to KOERI. One of the main goals of the experiment was to compare the signals recorded by the SOSEWIN and Guralp sensors. Fig. 3 shows the corresponding Power Spectral Density (PSD) functions computed for the vertical components of motion at the sensors located approximately in the middle and at one-third of the bridge's deck. Although the WSUs lie on top of the bridge's deck while the Guralp sensors are installed inside the deck, the agreement between their PSDs is still strong. Fig. 4 provides an overview of the ambient vibration analysis results for pairs of SOSEWIN stations installed at characteristic locations on the bridge (i.e. the deck and the towers, respectively).
Fig. 3. PSD functions for the vertical components of motion. Average PSD plus +/-95% confidence interval for SOSEWIN (white and dark grey, respectively) and Guralp (black and light grey, respectively). (a) Sensors located on the middle of the bridge’s deck (i.e. WSUs over the deck, and Guralp within the deck). (b) Similar to (a), but with nodes located at about 1/3 of the way along the bridge’s deck
Fig. 4. Results for pairs of WSUs. (a) Selected sensors (white symbols) placed at the bridge's deck. (b) is similar to (a), but of sensors placed on top of the bridge's towers. (c) and (d): Spectral Ratio (SR) functions for the vertical (dark gray), longitudinal (light gray), and transversal (black) components of motion. (e) and (f): the corresponding SR spectrograms
When comparing the average spectral ratio (SR) curves (Figs. 4c and d) for pairs of sensors installed at different points, it is clear that the SOSEWIN stations provide consistent and robust results, with a clear image of how the diverse parts of the bridge react differently to the ambient vibrations. Moreover, the SR spectrograms (Figs. 4e and f) show that the ambient vibrations have a stationary character, and indicate that the SOSEWIN stations provide stable estimates.
Comparisons with standard instrumentation, and the results obtained in terms of the modal properties of the bridge, indicate an excellent performance of the low-cost WSUs. The results were found to be consistent with those from the studies of Brownjohn et al. [1992], Apaydin [2002], and Stengel [2009].
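The kind of spectral comparison reported above can be reproduced in a few lines; the sketch below uses scipy's Welch estimator on synthetic signals (with an assumed sampling rate and an assumed deck-mode frequency) to compute a PSD and an amplitude spectral ratio between two stations. It is not the actual SOSEWIN processing.

import numpy as np
from scipy.signal import welch

fs = 100.0                                  # assumed sampling rate in Hz
t = np.arange(0, 600, 1.0 / fs)             # 10 minutes of data
rng = np.random.default_rng(0)

# deck sensor: ambient noise plus an assumed dominant deck mode near 0.2 Hz
deck = rng.normal(0.0, 1.0, t.size) + 5.0 * np.sin(2.0 * np.pi * 0.2 * t)
tower = rng.normal(0.0, 1.0, t.size)        # second (reference) sensor

f, psd_deck = welch(deck, fs=fs, nperseg=4096)
_, psd_tower = welch(tower, fs=fs, nperseg=4096)

spectral_ratio = np.sqrt(psd_deck / psd_tower)      # amplitude spectral ratio
peak = f[np.argmax(spectral_ratio[1:]) + 1]         # skip the zero-frequency bin
print(f"dominant frequency in the spectral ratio: {peak:.2f} Hz")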
A manuscript dealing with the testing of SOSEWIN for the monitoring of the Fatih Sultan Mehmet Suspension Bridge has been accepted for publication by the Bulletin of Earthquake Engineering (Picozzi, M., Milkereit, C., Zulfikar, C., Fleming, K., Ditommaso, R., Erdik, M., Zschau, J., Fischer, J., Safak, E., Özel, O., and Apaydin, N. (2009). Wireless technologies for the monitoring of strategic civil infrastructures: an ambient vibration test on the Fatih Sultan Mehmet Suspension Bridge in Istanbul, Turkey. Bulletin of Earthquake Engineering, in press, DOI 10.1007/s10518-009-9132-7). HU Berlin focused on the further development of the system architecture and on software improvements: Fig. 5 shows an overview of the current SOSEWIN layer architecture with several services. The core functionality of SOSEWIN is EEW, and therefore it provides a hierarchical alarming system, defined by the alarming protocol. Following a model-based development approach, the alarming protocol is based on common structure and behavioural models.
The alarming protocol was first defined informally; based on this, a formal description language (SDL in combination with ASN.1/UML/C++) was used to build a formal model of its functionality. From this formal model, code is generated for the target hardware platform (the sensor nodes) and for different kinds of simulators that support different experiment scenarios (including the system and its environment) in preparation for the implementation. A further speciality of our approach is the integration of the model-driven tool chain into a spatial-time-based Experiment Management System (EMS) in connection with a Geographic Information System (GIS). This allows us to describe the WSN topology and the distribution or movement of the physical phenomena on a geographic map. All tool components are integrated by our GIS-based Development and Administration Framework for Wireless Sensor Networks (GAF4WSN). With this infrastructure, large networks with thousands of nodes can be simulated and their behaviour evaluated. It also allows administering the prototype SOSEWIN installed in Ataköy, Istanbul.
Fig. 5. SOSEWIN layer architecture
Fig. 6. Results of simulation experiment
The functionality of the alarming protocol depends on several parameters with certain value ranges (e.g. earthquake event parameters, group topology). The combinations of these parameter values span a huge parameter space which has to be taken into account in the testing and evaluation of the alarming protocol. Experiments covering (interesting parts of) this parameter space have to be carried out within different execution environments (different simulators and the real-world prototype), each of which has different limitations, feasibility and significance. The aim is to observe the complex dynamic reaction of the system using different input configurations, according to the identified experiment sets, in different execution environments, answering particular questions about the system's behaviour. For example, Fig. 6 shows the results of a set of 20 simulation experiments in which the distance between the epicentre and the first station of the network was varied from 2 to 40 km in steps of 2 km. All other parameters (e.g. wave travel speed, rupture characteristics, network topology, P-wave
detection parameters) are constant. As expected, there is a linear correlation between the distance and the time when the early warning message leaves the network: increasing distances result in later detections because the seismic waves have to travel longer. In future work we will continue the experimentation (e.g. with larger SOSEWIN topologies) and connect the SOSEWIN prototype installation in Ataköy with the visualising infrastructure developed by our project partners lat/lon and DELPHI IMM.
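A toy version of this experiment, with an assumed P-wave velocity and a fixed processing delay (neither taken from the project), reproduces the expected linear trend:

P_WAVE_SPEED_KM_S = 6.0     # assumed crustal P-wave velocity
PROCESSING_DELAY_S = 1.0    # assumed fixed detection and alarming overhead

for distance_km in range(2, 41, 2):
    t_warning = distance_km / P_WAVE_SPEED_KM_S + PROCESSING_DELAY_S
    print(f"epicentral distance {distance_km:2d} km -> warning message after {t_warning:4.1f} s")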
4. Geoinformation Infrastructure and geoinformation systems EEW and rapid response are – to a large extent – based on information provided in a geospatial context and to a specific, generally restricted, set of users. In this frame, information services such as rapid damage estimates and a mediation system are needed (work package C1 of DELPHI IMM GmbH), and access to information and real-time sensor data must be organised (work package C2 of lat/lon GmbH).
Damage calculation for earthquakes is fundamental in densely populated regions at high seismic risk. Both conditions apply in particular to the megacity Istanbul, which lies less than 10 km from the Main Marmara Fault and has a rapidly growing population (see Fig. 7). The dynamic geoinformation infrastructure, consisting of a damage estimation service for buildings and a mediation system for flexibly visualising earthquake information, responds to these requirements in the field of disaster management. Here we also integrate achievements of the other work packages dealing with ground motion prediction and with information and sensor networks. Results of work package C1 were presented and discussed in May 2009 at a meeting in Istanbul with members of AKOM (Afet Koordinasyon Merkezi) and KOERI. User requirements are already considered in our geoinformation infrastructure. Rapid assessment of building damage after an event (rapid response) provides important information for operating a disaster task force, getting a quick damage overview or estimating costs. The damage estimation service (DES) is implemented in Java as a web service, as it aims at quick calculation, at being available to many users and especially at being integrated into a spatial data infrastructure. The DES procedure is based on the FEMA 356 coefficient method, modified by KOERI. This performance-based method assesses the damage probability with respect to the vulnerability of buildings resulting from their structure. A main element of this method is the building inventory of Istanbul, classified with respect to building type (e.g. RC frame building, masonry building or pre-fabricated building), building height and construction date. The construction date is an important factor for classifying the design level of buildings because it gives hints about the applied seismic code. Probability values for each damage class and building class in every grid cell are an intermediate result of this six-step processing; finally, the number of affected buildings is calculated. The expertise of KOERI was gratefully used for validation.
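A heavily simplified sketch of the counting step at the end of such a damage estimation is given below; the building classes, inventory counts and damage-state probabilities are invented for illustration and do not represent the FEMA 356 coefficient method or the Istanbul inventory.

# buildings per class in one hypothetical grid cell
inventory = {
    "RC_frame_midrise": 120,
    "masonry_lowrise": 80,
    "prefabricated": 15,
}

# hypothetical P(damage state | building class) for one ground-motion scenario
damage_probs = {
    "RC_frame_midrise": {"slight": 0.30, "moderate": 0.25, "extensive": 0.15, "complete": 0.05},
    "masonry_lowrise":  {"slight": 0.35, "moderate": 0.30, "extensive": 0.20, "complete": 0.10},
    "prefabricated":    {"slight": 0.20, "moderate": 0.10, "extensive": 0.05, "complete": 0.01},
}

totals = {}
for bclass, count in inventory.items():
    for state, p in damage_probs[bclass].items():
        totals[state] = totals.get(state, 0.0) + count * p

for state, expected in totals.items():
    print(f"{state:9s}: {expected:6.1f} buildings expected")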
The use of the DES for planning (long before an earthquake) or for scientific scenario analysis is as important as its use in the event of damage. The simulated scenarios of the historic 1509 Istanbul earthquake from Karlsruhe University described above were used as input for the damage calculation to show the impact on the current building inventory. Fig. 7 shows the results of the three scenarios, which differ in the position of the hypocentre. With the help of the DES, the different impact on buildings caused by shifting the hypocentre can be shown. As for the distribution of ground motion, it is not the nearest hypocentre location that causes the most damage to buildings. For scenarios 1 and 3, almost the same number of buildings is affected (about 30%), and the distribution of damage classes is almost identical as well. The area of extensively destroyed buildings stays the same for all three scenarios. The mediation system visualises earthquake information, especially the results of the building damage estimation, using flexible, user-oriented functions. It is realised as an OGC (Open Geospatial Consortium) compliant map client and addresses further web services, for example the DES. In the field of disaster management it is necessary to analyse information according to individual requirements in order to make well-founded statements and the right decisions. For this purpose, a flexible filter system is available which allows the selection of any combination of building and damage classes. In addition, a statistical summary of the total number of affected buildings per damage class (slight, moderate, extensive and complete) supports the interpretation. The simultaneous viewing of two earthquake results is a newly implemented function which is very helpful for comparing scenarios in detail. The individually selected view of the damage data can be integrated, OGC-compliant, into the EDIM information system to view the information in a broader context. Work package C2, as part of the overall goal of the EDIM project, deals with providing access to data relevant for analysing earthquake-related
Fig. 7. Comparison of complete building damages of the three simulated scenarios for the 1509 earthquake
information in the Marmara region. Besides access to data and information, an EEW and rapid response component should also be accessible to restricted user groups. As potential user groups, lat/lon has identified scientists analysing historical as well as the most current data sets, EEW/rapid response specialists from AKOM, and interested parties from the general public.
During an on-site demonstration in Istanbul, the information system and the underlying concepts were presented to local stakeholders, and the assumed set of user groups was verified. In order to restrict access to classified data to an authorised user group, the system has been enhanced by a user and rights management component.
The access-restricted EDIM information system is realised as a spatial data infrastructure based on the deegree framework [Fitzke et al., 2004] and is therefore implemented as a set of loosely coupled services adhering to OGC interfaces. While the first two project years were dominated by the development and enhancement of the service architecture, the integration of components provided by the project partners has been the focus since 2009. This includes an integration of the EDIM DES provided by DELPHI IMM, an integration of various data sets provided by KOERI, as well as a connection to the SAFER Middleware for Geographical Applications [Fischer et al., 2009]. In addition to the software implementation, the following goals have been achieved: – Project partners as well as potential users of the information system have been informed about the current project status during a workshop in Istanbul. – Project partners as well as potential users in Istanbul have been granted access to the information system in order to provide feedback on its usability and functionality. – User access to the system has been restricted. Classified data will only be visible to authorised persons. Authorisation is ensured through the deegree user rights management system. – A concept for realising high availability of the infrastructure is under development. This concept utilises the paradigm of Grid computing [Foster & Kesselman, 1998] to achieve service availability, short response times as well as advanced back-up mechanisms for data and services. – Documentation of the available components (https://wiki.deegree.org/deegreeWiki/deegree3) to ensure a sustainable availability of the developed components. – Compliance testing of the Sensor Observation Service with OGC in order to provide a standardised interface for sharing observations and measurements (http://www.opengeospatial.org/resource/products/details/?pid=761); a sketch of such a request is given below.
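For orientation, the sketch below assembles a key-value-pair GetObservation request as standardised by the OGC SOS 1.0.0 interface; the endpoint, offering and observed property are placeholders, not the actual EDIM service configuration.

from urllib.parse import urlencode

SOS_ENDPOINT = "https://example.org/sos"    # placeholder, not the EDIM endpoint
params = {
    "service": "SOS",
    "version": "1.0.0",
    "request": "GetObservation",
    "offering": "ACCELERATION",                                       # hypothetical offering id
    "observedProperty": "urn:ogc:def:property:ground_acceleration",   # hypothetical property
    "responseFormat": 'text/xml;subtype="om/1.0.0"',
}
print(f"{SOS_ENDPOINT}?{urlencode(params)}")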
The EDIM information system is able to access static (historical) spatial data but also – and more importantly – real-time data through a sensor observation service. A direct access to sensor nodes placed in-situ at the Istanbul test site has been established by HU Berlin and GFZ Potsdam. The inter-sensor communication is based on a proprietary protocol which triggers alerts containing detailed information on seismic events [Fischer et al., 2009]. When an alert is triggered (i.e. a seismic event exceeds a predefined threshold), a notification is issued by a Message Notification Broker (MNotiBroker). To receive such notifications, a web service (based on the SOAP protocol and described by WSDL) has to be registered (subscribed) with the Message Notification Service provided by HU Berlin. After subscribing to the MNotiBroker, any seismic event is propagated to the registered web service. Within EDIM, lat/lon has developed an integration layer providing a mechanism to subscribe to the seismic notification service. The notification, which is not based on standardised interfaces, is evaluated by the integration layer, which propagates the relevant information to portal users. Fig. 8 illustrates the workflow between a portal user accessing iGeoPortal (lat/lon's web-based client for the EDIM information system) and the MNotiBroker. The MNotiBroker is a web service within the SOSEWIN middleware provided by HU Berlin and located on-site. The integration layer (Fig. 8: alerting subsystem) provides the necessary functionality to subscribe to the MNotiBroker and receive alerting messages. These messages are evaluated by a message processing component (Fig. 8: MessageProcessing). The portal component (iGeoPortal) polls the messages and propagates the information to the portal user. This implementation realises the connection between the EDIM information system and the on-site sensor system and makes the required information available to the stakeholders accessing the information system.
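Conceptually, the alerting subsystem can be pictured as in the minimal sketch below; the class and method names are invented for illustration, whereas the real MNotiBroker is a SOAP/WSDL web service provided by HU Berlin.

class AlertingSubsystem:
    """Integration layer: subscribes to the broker and buffers incoming alert messages."""

    def __init__(self):
        self.messages = []

    def on_notification(self, alert):
        # MessageProcessing step: evaluate the notification and keep the relevant parts
        self.messages.append({"event": alert["event_id"], "pga": alert["pga"]})

    def poll(self):
        # iGeoPortal polls the buffered messages and clears the queue
        pending, self.messages = self.messages, []
        return pending

subsystem = AlertingSubsystem()
subsystem.on_notification({"event_id": "demo-001", "pga": 120.0})   # simulated alert
print(subsystem.poll())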
Fig. 8: UML sequence diagram illustrating the information flow between a portal user and an event notification from the SOSEWIN system
5. Summary The integration of strong motion seismology, sensor system hardware and software development, and geoinformation real-time management tools proves to be a successful concept for making seismic early warning a novel technology with high potential for scientific and technological innovation, disaster mitigation, and many spin-offs for other fields. EDIM can serve as a model for further developments in the field of early warning on a global scale. 6. References Ambraseys, N. N. (2001). The Earthquake of 1509 in the Sea of Marmara, Turkey, Revisited. Bull. Seismol. Soc. Am., 91(6), 1397–1416. Apaydin, N. (2002). Seismic Analysis of Fatih Sultan Mehmet Suspension Bridge. Ph.D. Thesis, Department of Earthquake Engineering, Bogazici University, Istanbul, Turkey. Beresnev, I. and G. Atkinson (1997). Modeling finite-fault radiation from the ωⁿ spectrum. Bull. Seismol. Soc. Am., 87(1), 67–84. Böse, M., Wenzel, F., and M. Erdik (2008). PreSEIS: A neural network-based approach to
earthquake early warning for finite faults. Bull. Seismol. Soc. Am., 98(1), 366–382, doi:10.1785/0120070002. Boore, D. M., Joyner, W. B., and T. E. Fumal (1997). Equations for Estimating Horizontal Response Spectra and Peak Acceleration from Western North American Earthquakes: A Summary of Recent Work. Seismol. Res. Lett., 68(1), 128–153. Brownjohn, J. M. W., Dumanoglu, A. A., and R. T. Severn (1992). Ambient vibration survey of the Fatih Sultan Mehmet (Second Bosphorus) Suspension Bridge. Earthquake Engineering and Structural Dynamics, 21, 907–924. Campbell, K. W., and Y. Bozorgnia (2008). NGA Ground Motion Model for the Geometric Mean Horizontal Component of PGA, PGV, PGD and 5% Damped Linear Elastic Response Spectra for Periods Ranging from 0.01 to 10 s. Earthquake Spectra, 24(1), 139–171. FEMA 356 (2000). Federal Emergency Management Agency: »Pre-standard and Commentary for the Seismic Rehabilitation of Buildings«.
Fischer, J., Kühnlenz, F., Eveslage, I., Ahrens, K., Lichtblau, B., Nachtigall, J., Milkereit, C., Fleming, K., and M. Picozzi (2009). Deliverable D4.22 Middleware for Geographical Applications. Online: http://casablanca.informatik.hu-berlin.de/wiki/images/2/2f/D4.22_SAFER_UBER.pdf Fitzke, J., Greve, K., Müller, M., and A. Poth (2004). Building SDIs with Free Software – the deegree project. In: Global Spatial Data Infrastructure. 7th International Conference, February 2–6, 2004, Bangalore, India. Technical Symposia. SDI Technologies, Bangalore: 136–148. Foster, I., and C. Kesselman (1998). The Grid: Blueprint for a New Computing Infrastructure. San Francisco. Köhler, N. and F. Wenzel (2008). Real-time Information from a Regional Accelerometer Network. In: R&D Programme GEOTECHNOLOGIEN – Early Warning Systems in Earth Management, Abstracts of Status-Seminar, October 8–9, 2008, University of Osnabrück. Özbey, C., Sari, A., Manuel, L., Erdik, M., and Y. Fahjan (2004). An empirical attenuation relationship for Northwestern Turkey ground motion using a random effects approach. Soil Dynamics and Earthquake Engineering, 24, 115–125. Stengel, D. (2009). System Identification for 4 Types of Structures by Frequency Domain Decomposition and Stochastic Subspace Identification Methods. Diploma Thesis, Karlsruhe University, Germany.
Exupéry – Volcano Fast Response System Hort M. (1)*, Barsch R. (2), Bernsdorf S. (3), Beyreuther M. (2), Cong X. (4), Dahm T. (1), Eineder M. (4), Erbertseder T. (4), Gerstenecker C. (5), Hammer C. (6), Hansteen T. (7), Krieger L. (1), Läufer G. (5), Maerker C. (4), Montalvo Garcia A. (1), Ohrnberger M. (6), Rix M. (4), Rödelsperger S. (5), Seidenberger K. (4), Shirzaei M. (8), Stammler K. (9), Stittgen H. (9), Valks P. (4), Walter T. (8), Wallenstein N. (10), Wassermann J. (2), Zaksˇek K. (1) (1) Institute of Geophysics, University of Hamburg e-mail: matthias.hort@zmaw.de, torsten.dahm@zmaw.de, lars.krieger@zmaw.de, arturo.montalvo@zmaw.de, klemen.zaksek@zmaw.de (2) Department of Earth and Environmental Sciences, Geophysical Observatory, Ludwig-Maximilians Universität Munich e-mail: barsch@lmu.de; beyreuther@geophysik.uni-muenchen.de; jowa@geophysik.uni-muenchen.de (3) Centauron – geosoftware & consulting e-mail: stefan.bernsdorf@centauron.de (4) DLR Oberpfaffenhofen, DLR e-mail: xiao.cong@dlr.de, michael.eineder@dlr.de, thilo.erbertseder@dlr.de, cordelia.maerker@dlr.de, meike.rix@dlr.de, katrin.seidenberger@dlr.de, pieter.valks@dlr.de (5) Institute of Physical Geodesy, Technical University Darmstadt e-mail: gerstenecker@geod.tu-darmstadt.de, laeufer@ipg.tu-darmstadt.de, roedelsperger@geod.tu-darmstadt.de (6) Institute of Geosciences, University of Potsdam e-mail: conny@geo.uni-potsdam.de, matthias.ohrnberger@geo.uni-potsdam.de (7) IFM-GEOMAR, Leibniz Institute of Marine Sciences e-mail: thansteen@ifm-geomar.de (8) Department physics of the earth, GFZ e-mail: shirzaei@gfz-potsdam.de, twalter@gfz-potsdam.de (9) BGR, Geocentre Hannover e-mail: stittgen@sdac.hannover.bgr.de, klaus@szgrf.bgr.de (10) Centro de Vulcanologia e Avaliação de Riscos Geológicos, Universidade Dos Acores e-mail: nicolau.mb.wallenstein@azores.gov.pt *Coordinator of the project: Prof. Dr. Matthias Hort, Institute of Geophysics, University of Hamburg
Introduction During the summer of 2008, three remote volcanoes in the Aleutians erupted within a month. Due to their remote location they did not affect many people, except that a number of airplanes had to be rerouted because of the ash emitted by the volcanoes. At the same time, several volcanoes were in unrest in heavily populated regions of Southeast Asia, some of them almost unmonitored. It is for those volcanoes that the volcano fast response system has been designed. It is intended to be a system (including various types of instru-
ments) that is easy to install on a volcano showing signs of unrest. The idea behind this system is that one wants access to all the different data collected during volcano monitoring in one system: one does not want to move from one system to another to look at different datasets, and one wants to be able to share the data with colleagues abroad to discuss the activity status of a volcano. The core of the volcano fast response system is a database that allows for the collection of data of various kinds, i.e. simple time series
data like seismic data, satellite data, deformation data, gas measurements, and so on. Part of the data collected in the database may come from an already existing network; other data may be transmitted through a wireless network that has been specifically designed for the volcano fast response system. One of the main challenges is to access all the data through one common interface. This has been implemented through a GIS-based web interface which allows various layers of data to be viewed at the same time. The system is designed such that it only uses open source software, so it can be easily installed on other systems without having to purchase proprietary software. The parts of the system are developed by various research groups in Germany. The project has been subdivided into 5 main work packages that address various parts of the system. WP1 delivers ground based observational data, like GPS measurements, ground based INSAR observations, and ground based gas measurements, to the system. WP2 provides satellite based observations including deformation maps, SO2 concentrations and thermal hot spots. Inside WP3 the GIS-based visualization system as well as the backbone database are developed. WP4 deals with the development of the wireless communication system as well
as with the test installation of the VFRS on Sao Miguel. Finally, inside WP5, parts of the incoming data are analyzed (i.e. earthquakes are classified, moment tensors are calculated, and deformation is modelled to locate the source of the deformation), and the results are sent back to the database to be displayed in the common visualization system. WP1: Ground based observations WP1A: GROUND BASED DEFORMATION MONITORING In April 2009 a combined GPS and ground based SAR deformation monitoring system was installed at the crater lake on Mount Fogo on Sao Miguel, Azores. The system consists of the Synthetic Aperture Radar (SAR) instrument IBIS-L, three single-frequency GPS receivers and two dual-frequency GPS receivers. IBIS-L automatically scans the area shown in Fig. 1 every 10 minutes. The data are automatically processed and evaluated in near real time. All field stations are equipped with mesh nodes, which control the sensors, store the data and establish the wireless LAN. Additionally, meteorological data are measured at different points of the network, and a webcam takes a picture of the view across the lake every 10 minutes to permit a better assessment of the atmospheric conditions.
Fig. 1. Area monitored by the ground based SAR Instrument IBIS-L
Fig. 2. Mini-DOAS measurements were performed at fumaroles of the Furnas volcano
Main problems are the power supply of the stations and the stability of the radio communication. Long distances and harsh weather conditions make the stand-alone operation more difficult. Nevertheless a big amount of data was gathered, processed and included into the ExupĂŠry database. WP1A: GROUND BASED SO2 MEASUREMENTS The basis for a successful incorporation of the mobile Mini-DOAS system into the VFRS was the reprogramming and re-compilation of the control software, which was originally developed within the EU-funded project NOVAC (www.novac-project.eu). The required changes were done in close cooperation with colleagues at the Chalmers Technical University, Gothenburg (group of Prof. Bo Galle). During our field work on the Azores, both mobile and stationary Mini-DOAS measurements were performed at several fumaroles of the Furnas volcano (Fig. 2). Using such a combination of methods, both concentrations and
fluxes of SO2 can be determined. Increased SO2 fluxes are generally a reliable indicator for the presence of shallow-level magma or volcanic unrest (i.e. seismicity, changes in degassing activity). The Mini-DOAS system was verified prior to the measurement campaign using calibrated gas cells containing known amounts of SO2. Such a routine serves as an indicator for the reliable operation of all optical components in the system, and further ensures that the measurements are comparable to those performed at other locations under different atmospheric and technical measurement conditions. Measurements at several fumaroles showed that no SO2 could presently be detected, which is unequivocal evidence against a shallow magma source below Furnas volcano. Fluid samples from the same fumaroles, sampled and analysed by our colleagues at the Universidade dos Açores, similarly show that the fumaroles presently do not emit SO2. The field tests were performed without technical problems.
WP2: Satellite based observations WP2A: MONITORING OF VOLCANIC SO2 PLUMES USING THE GOME-2 SATELLITE INSTRUMENT Since October 2008, the activities within the task on monitoring volcanic SO2 plumes were dedicated to the improvement of the GOME-2 SO2 retrieval and the trajectory analysis, as well as to the provision of daily SO2 and trajectory maps. A correction for the interference of SO2 and ozone absorption within the wavelength range of the retrieval (~320 nm) was developed and implemented in the automatic processing chain. In addition, the SO2 vertical columns are provided for three different altitudes (2.5 km, 6 km and 15 km) because the retrieval depends on the height of the SO2 plume. These three altitudes represent typical volcanic events: 2.5 km for non-eruptive emissions (passive degassing), 6 km for moderate eruptions and 15 km for explosive eruptions. A first validation of the GOME-2 SO2 retrieval with ground-based measurements and other satellite-based observations has been carried out and yields good results. The stochastic trajectory analysis technique has been optimized for the estimation of eruption characteristics from the GOME-2 data. Information on emission height, emission time and duration, as well as the most probable emission source, is provided. The latter requires the creation of probability density maps. The emission height, time and duration are derived from backward trajectory ensembles that are calculated for the most probable emission source. The results are statistically robust due to the use of coherent trajectory ensembles. Furthermore, hypothetical forward trajectories and probability maps are generated starting at possibly active volcanoes. This allows forecasting the dispersion of volcanic SO2 and ash, depending on the emission height, in case of an eruption. First studies of the atmospheric transport of SO2 with the Lagrangian Particle Dispersion Model (PDM) FLEXPART have been carried out. The PDM was initialized using the
source term parameters determined by the trajectory analysis. The first results are promising. The GOME-2 retrieval and the stochastic trajectory analysis have been tested and validated during recent eruptions (Okmok, Alaska: July 2008; Kasatochi, Alaska: August 2008; Alu/Dalaffilla, Ethiopia: November 2008; Mt. Redoubt, Alaska: March 2009) and non-eruptive events (Papua New Guinea/Vanuatu: 2007–2009, Ecuador: 2007–2009). The investigation of passive degassing is very important for the early warning system of Exupéry because changes in volcanic SO2 fluxes can be an indicator for the onset of volcanic unrest. The GOME-2 SO2 data are available at the World Data Centre for Remote Sensing of the Atmosphere in near-real time (2–3 hours after the overflight) for the whole world as well as for 29 predefined volcanic regions (wdc.dlr.de). An important task within the Exupéry project is the development of a database and the visualization of the data within the Exupéry webGIS. Therefore, daily maps of the SO2 vertical columns, the hypothetical trajectories and the probability density are provided automatically for the volcanic regions »Italy« (Fig. 3) and »Azores«. The maps are available in the database and the GIS within 7 hours after the GOME-2 observation. In addition, detailed information on the SO2 (e.g. maximum values and special SO2 alert levels) enables the characterization of the volcanic activity. WP2B: Interferometric Measurements During the first Azores campaign in April, two corner reflectors were successfully installed for TerraSAR-X on the test site Lagoa do Fogo in order to ensure the coherence for interferometric point measurements (Fig. 4). Between April and August, eleven strip-map scenes were subsequently gathered. All scenes are downloaded from the server approximately five days after the acquisition time (when precise science orbits are available) and post-processed at DLR. For the standard InSAR processing, the two scenes with the smallest temporal separation were selected.
Fig. 3: Visualization of GOME-2 SO2 vertical columns for the Italy region within the Exupéry GIS
Fig. 4: Plots of theoretical value of radar cross section vs. the measurements from SAR images (left); the energy from the background (right)
The resulting interferogram, differential interferogram and coherence image will be delivered to the central database in Hannover and will be published instantly in the Exupéry database. Coherent scatterers over the whole acquisition duration, as well as their phase values, were extracted from the stack of interferograms. From the phase measurements, different parameters can be estimated, such as deformation velocity, DEM
error and atmospheric distortions. This result will be delivered to the Hannover database at the end of the measurement phase. All products were post-processed at DLR with the GENESIS software, developed at DLR. Time series analysis with coherent scatterers from the stacking was applied in order to detect complex deformation in the mountainous area. The test area will be illuminated
from different directions, each direction once in an eleven-day repeat cycle. The goal is to merge the measurements afterwards using ground truth (corner reflectors) to increase the temporal and spatial redundancies. This will be done after gathering all data needed for the PSI processing. According to the milestones of the second year, a new algorithm suitable for volcanic areas shall be developed, together with the system for the data processing. The processing system was built on existing modules of the GENESIS software. The results were processed semi-automatically and were delivered regularly to the database in Hannover. Within the cooperation with GFZ, we supported the modelling group with interferometric measurements for a better understanding of the geophysical phenomena. A further cooperation with Prof. Thuro (co-supervision of a master thesis) for a joint experiment was planned. WP2C: MONITORING THERMAL ANOMALIES The emphasis of this task is on detecting and characterizing thermal anomalies using remote sensing techniques. According to Wien's displacement law, the peak emission wavelength gets longer with decreasing temperature. Therefore, a difference between medium (MIR – spectra around 4 µm) and thermal (TIR – spectra around 10 µm) infrared data can reveal a thermal anomaly within a pixel even if its area is significantly smaller than the whole pixel area. For example, given a lava lake at 500 K that covers a quarter of the pixel area, an observation in the MIR spectral range would result in a temperature of 450 K, whereas the TIR observation would be just 390 K, given that the background temperature equals 300 K. Based on this, a reliable and effective thermal anomaly detection algorithm was developed for the MODIS sensor; it is based on a threshold on the so-called normalized thermal index (NTI). This is the method we use in Exupéry, where we characterize each detected thermal anomaly by temperature, area, heat flux and effusion rate.
At the moment, only MODIS (a sensor aboard NASA's Terra and Aqua satellites) data are used operationally. These data are usually available four times per day with a nadir spatial resolution of 1000 m. MODIS is basically a meteorological sensor, thus it is also suitable for producing general overview images of the crisis area. For each processed MODIS image we produce an RGB image in which some basic meteorological features are classified, e.g. clouds, ash plume, ocean, etc. (Fig. 5). In the case of a detected hotspot, an additional image is created; it contains the original measured radiances of the selected channels for the crisis area. All anomaly and processing parameters are additionally written into an XML file. The results are available in the web GIS in the worst case two hours after NASA processes the retrieved data to level 1b. The software for thermal anomaly monitoring is based on IDL code, which means that the end user either has to have an IDL licence or has to use the IDL virtual machine to run it. The code responsible for data input/output is written in Python, which can be used on various operating systems. This software combination has been operational since the beginning of the field experiment on the Azores. Almost 60 GB of data have been processed in the first four months, but no thermal anomalies have been detected so far. The operational version of the system can also be used effectively to monitor forest fires: with minor adjustments to the code, one could set up a near real-time monitoring system of forest fires for a whole country. In the future we would like to increase the reliability of the results. Four MODIS images per day might be too sparse to monitor some sudden volcanic events, thus we would like to include additional satellite data in the operational use. Additional data might also reduce the noise in the measurements; our current research is oriented towards fusing the results from MODIS with AVHRR data using a Kalman filter. In order to validate the proposed approach we use data from the DLR micro-satellite mission BIRD.
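The sub-pixel example above can be checked with the Planck function; the short sketch below mixes a hot surface with a cooler background at approximate MODIS band wavelengths and evaluates a normalized thermal index of the form NTI = (L_MIR − L_TIR)/(L_MIR + L_TIR). The band wavelengths, the index definition and the commonly quoted detection threshold of about −0.8 are assumptions taken from MODVOLC-style approaches, and the resulting brightness temperatures differ somewhat from the rounded values quoted above because no instrument response functions are applied.

import numpy as np

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23     # Planck, speed of light, Boltzmann (SI)

def planck_radiance(wavelength_m, temp_k):
    """Black-body spectral radiance in W m^-2 sr^-1 m^-1."""
    return (2.0 * H * C**2 / wavelength_m**5) / (np.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0)

def brightness_temperature(wavelength_m, radiance):
    """Invert the Planck function for the equivalent black-body temperature."""
    return H * C / (wavelength_m * KB * np.log(1.0 + 2.0 * H * C**2 / (wavelength_m**5 * radiance)))

MIR, TIR = 3.96e-6, 12.0e-6        # approximate MODIS band 22 / band 32 wavelengths (assumed)
frac, t_hot, t_bg = 0.25, 500.0, 300.0

l_mir = frac * planck_radiance(MIR, t_hot) + (1.0 - frac) * planck_radiance(MIR, t_bg)
l_tir = frac * planck_radiance(TIR, t_hot) + (1.0 - frac) * planck_radiance(TIR, t_bg)

nti = (l_mir - l_tir) / (l_mir + l_tir)      # values well above -0.8 flag a thermal anomaly (assumed threshold)
print("MIR brightness temperature:", round(brightness_temperature(MIR, l_mir), 1), "K")
print("TIR brightness temperature:", round(brightness_temperature(TIR, l_tir), 1), "K")
print("NTI:", round(nti, 3))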
Fig. 5: Thermal infrared image; no hotspot detected on July 17, 2009; one can observe clouds – dark areas south of Sao Miguel (bright, in the centre)
WP3: Expert System WP3A: DATABASE AND GIS The main task of this subproject is to store and give access to the multi-component datasets provided by the other work packages within the Exupéry project. It was decided to use the database SeisHub, mainly because of its easy extensibility and the limitations of other existing, purely seismological databases. SeisHub claims to be a modular, scalable web service framework for serving and processing XML documents, in particular geophysical metadata, via a network such as the Internet. In addition to the database itself, WP3 is also concerned with the development of a browser-based, license-free GIS to visualize the data. JenaOptronik has been assigned this task. Since SeisHub handles XML documents, all data have to be provided in this format. For every type of dataset, the structure of the corresponding XML documents was determined in coordination with the project partners.
These XML documents contain metadata and, in some cases, the data itself (e.g. GPS, MiniDOAS, seismic stations and events). In other cases, references to additional data files are included (e.g. IBIS and satellite images). The resource descriptions and several interfaces (in particular to the GIS) were implemented into SeisHub. A prototype system with all the necessary software components (primarily the SeisHub database and a Tomcat web server hosting the GIS) is now running on a server at the BGR in Hannover and can be accessed by the project partners. All satellite data are automatically downloaded from the respective data providers and stored in the database. The ground based data are supposed to be sent to SeisHub directly by the suppliers. Because of technical problems or lack of real data (the MiniDOAS system was not part of the field experiment on the Azores because no SO2 could be detected on site) this was only tested and not realized continuously as proposed.
For the field test on the Azores, three servers and a RAID system – acquired especially for the Exupéry project – were shipped to Ponta Delgada. One of them hosted an environment comparable to the one at the BGR (SeisHub and GIS). However, due to the low internet bandwidth and the interests of the local university, this system only contained data derived on the island (no satellite data) and was not remotely accessible. Another computer was used as a SeedLink server and successfully received online data streams from the seismic stations (Fig. 6) set up by project partners around the volcano Sete Cidades. These data, as well as data recorded from the existing seismic network on the Azores using the newly developed software package ObsPy, were stored on the RAID system and fed into the automatic analysis systems SeisComP3 and Earthworm, which were installed in parallel on the servers on the Azores. The detected seismic events were sent to the SeisHub databases on the Azores and at the BGR in Hannover.
The waveform data recorded on the Azores were copied to a file server in Hannover on a daily basis to make the data available to the project partners. A SeisHub feature was used to automatically index all the MiniSEED files and extract the quality information. This allows an easy overview of gaps, overlaps, time quality etc. via the SeisHub web interface and to some extent the GIS. The development of the GIS is well advanced. All the georeferenced Exupéry data are available as separate layers. The data can be selected via a predefined filter strategy which is attached to the layer. The resulting data are then visualized by selecting from an attached style set. For every layer, additional information from the XML documents can be shown via a description button. Furthermore, it is possible to access the full XML documents via the GIS. A diagram representation of the data is also available for certain layers. In some cases, the GIS will even provide time series imaging or links to external programs that allow the user to work with the data interactively.
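Assuming ObsPy is installed, the kind of quality overview mentioned above can be obtained directly from a waveform file; the file name below is a placeholder.

from obspy import read

st = read("exupery_station_day.mseed")       # placeholder MiniSEED day file
print(st)                                    # one summary line per trace
for net, sta, loc, cha, t_start, t_end, delta, nsamples in st.get_gaps():
    print(f"gap on {net}.{sta}.{loc}.{cha}: {t_start} -> {t_end} ({delta:.1f} s, {nsamples} samples)")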
Fig. 6: Seismic stations network; the stations that are telemetered via the WIFI system are indicated (four grey triangles surrounded by a black line in the West of the island)
Fig. 7: Alert level control within the Exupéry GIS
WP3B: AUTOMATIC ALERT LEVEL ESTIMATION USING ARTIFICIAL INTELLIGENCE The objective of the alert level (AL) task of WP3 in the last year was to improve the performance of the artificial intelligence (AI) method and to automatically connect the AI method to the Exupéry database and the GIS system (Fig. 7). Bayesian networks are used as the AI method. In detail, the basic Bayesian network of Dr. W. Aspinall (University of Bristol) was examined with respect to its possible application to other volcanoes. In the special case of the field study at Sete Cidades (Azores), the structure of the network needed to be slightly modified. To adapt the conditional probabilities, two single expert elicitations (University of Munich) were carried out. The resulting Bayesian network was examined by estimating the alert level from randomly generated observations. The automation was the second focus of this subproject. So far, all developments were carried out manually using the graphical software GeNIe (genie.sis.pitt.edu). Methods need to be designed to automatically deduce the alert level from observations stored in the Exupéry
database SeisHub (www.seishub.org). The program for estimating the alert level is based on the C++ graphical decision-theory library SMILE (genie.sis.pitt.edu). The connection to the database on the server and on the client side is written in Python. On the server side these programs are called mappers and return, e.g., the maximum number of earthquakes per day over the last two weeks. In addition to the proposed milestones of this subproject, the focus in the last year was to support the development of the WiFi mesh and the computer administration before, during and after the field test on the Azores. Among other things, a program was developed which reads the proprietary format of the existing seismic stations on the Azores and feeds it into the Exupéry network/database structure. As part of this, the Python Toolbox for Seismology ObsPy (www.obspy.org) was significantly extended. This will increase the flexibility of the VFRS in the future, as a standardized library of software modules now exists which is also easily extendable. In order to incorporate the automatic alert level estimation in the »Volcano Fast Response System«, the preliminary Bayesian network needs to be generalized for different volcanoes. During a field study, an already existing Bayesian network for the given type of volcano should then only need to be adapted. Both the structure and the conditional probabilities of these general Bayesian networks still need to be investigated. This is the main subject for the next year.
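As a purely illustrative toy (a naive-Bayes stand-in, not the elicited Exupéry network), the following sketch shows how discretised observations could be turned into a posterior distribution over alert levels:

ALERT_LEVELS = ["green", "yellow", "orange", "red"]
PRIOR = {"green": 0.70, "yellow": 0.20, "orange": 0.08, "red": 0.02}

# hypothetical P(observation is present | alert level)
LIKELIHOOD = {
    "many_earthquakes_per_day": {"green": 0.05, "yellow": 0.30, "orange": 0.60, "red": 0.90},
    "significant_deformation":  {"green": 0.02, "yellow": 0.20, "orange": 0.50, "red": 0.80},
}

def alert_posterior(observations):
    """Combine binary observations with the prior (naive independence assumption)."""
    post = dict(PRIOR)
    for obs, present in observations.items():
        for level in ALERT_LEVELS:
            p = LIKELIHOOD[obs][level]
            post[level] *= p if present else (1.0 - p)
    total = sum(post.values())
    return {level: value / total for level, value in post.items()}

print(alert_posterior({"many_earthquakes_per_day": True, "significant_deformation": False}))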
WP4: Installation of the mobile volcano fast response system During the first 18 months of the project the field components of the mobile system have been developed. These include the power management system and the wireless communication system. The power management system was designed such that its components can be bought in almost any country around the globe; we purposefully omitted highly integrated circuitry for this reason. The WiFi radios are off-the-shelf products which usually run an operating system called IMedia, designed for station and access point communication. This has been changed to a so-called mesh node protocol configuration, which allows the addition of new stations to the system without paying attention to station and access point configurations. During the last month before the installation of the test system on Sao Miguel (Azores), 16 units of the power management system as well as 18 WiFi radios were assembled and tested for use during the field test of the volcano fast response system. From March 23rd until April 6th, 20 seismic stations, a DOAS system, a ground based INSAR system, single and dual band GPS receivers and a couple of corner reflectors were installed on the island of Sao Miguel. As all of the land on the Azores is privately owned, one of the main challenges during this installation period was to get permission for putting stations out. About half of the stations have been installed in already existing empty vaults built and maintained by the volcanology group of the University of the Azores and by the CVARG (Centro de Vulcanologia e Avaliação de Riscos Geológicos).
A map of the station locations is shown in Fig. 6 above. The installation of all the equipment was carried out by a total of 9 people. Out of the 20 seismic stations installed on the island, 4 are telemetered to the University of the Azores via a WiFi network (see the marked region in Fig. 6). The main reason why we did not transmit signals of more stations to the University of the Azores is the topography of the island: we would have needed several repeater stations, including all the necessary permissions for their installation. This was too complex a task for the simple proof-of-concept installation on Sao Miguel. The WiFi system was working for a day, after which it broke down right at the end of our installation period. A snapshot of the WiFi cloud is shown in Fig. 8. We therefore decided to leave one person on the island of Sao Miguel for the time of the test installation, so he could work on getting the WiFi network to run. Since then, a couple of problems have been identified. One is the bad location of our two repeater stations. The antenna of the main repeater for the data downlink from Sete Cidades to the University has since been moved to a new, better spot, significantly stabilizing the data throughput. The second repeater, connecting the stations east and north of Lagoa das Sete Cidades to the rest of the network, is currently located on an old, abandoned hotel. Permits for moving that antenna to a nearby communications tower have been granted and the repeater will be moved at the end of July. We then hope that the connection of the different stations will work much better. Our main problems with the WiFi system raise the question of whether the mesh node protocol is really the proper one for an installation where stations are at least 4–5 km apart and usually see only one or two other stations, and not 10 or 20 like in a typical mesh setting. In the case of a volcano installation it might actually be better to switch back to the simple station access point protocol, which is much better tested and well established.
Fig. 8: Mesh topology of the network at Sete Cidades. The values indicated along the different connections are ETX values, i.e. the number of packets that have to be sent in order to successfully deliver one packet
well established. The transmission of the GPS and ground-based InSAR data has actually been switched back to a station/access point mode due to significant problems with the mesh node protocol and the topography. After the land part of the seismic network had been installed, the seismic network was extended by the deployment of 15 Ocean-Bottom-Seismometers (OBS) and 3 Ocean-Bottom-Tiltmeters (OBT) around the island during cruise P381 of the R/V Poseidon in April 2009. In September 2009 all stations (land and sea, in this case using the R/V Meteor) on Sao Miguel will be recovered and sent back. The data stored in the instrument data loggers during the experiment (land and sea) will be transferred to the project database afterwards. WP5: Multiparameter analysis of continuous network data for quantitative physical model WP5A: EVENT DETECTION AND WAVEFORM CLASSIFICATION Volcanic eruptions are often preceded by seismic activity which can be used to quantify the volcanic activity. The automatic detection and classification of seismic signals related to volcanic activity are based on a combination of different
methods. To enable a first robust event detection in the continuous data stream, different modules are implemented in the real-time system Earthworm, which is widely used for monitoring active volcanoes. Among these software modules are classical trigger algorithms such as STA/LTA and cross-correlation master-event matching, the latter also being used to detect different classes of signals. Furthermore, an additional module is implemented in the real-time system to compute continuous activity parameters. Here the real-time seismic amplitude measurement (RSAM) as well as the seismic spectral amplitude measurement (SSAM) are used to characterize the volcanic activity. In order to be independent of previously existing databases of event classes, we develop a novel seismic event spotting technique. The main goal is to provide a robust event classification based on a minimum number of reference waveforms, thus allowing for a fast build-up of a volcanic signal classification scheme as soon as events have been identified. We follow an approach by Wilcox and Bush (1992) for indexing recorded speech, where the Hidden Markov Toolkit (HTK), which is mainly intended for speech recognition, is used for the implementation. First of all, a valuable set of wave field parameters for characterizing
different types of seismic sources is extracted in a sliding window from the unknown continuous data stream. Among these features are polarization, spectral and time-domain attributes; the spectral characteristics show the best discrimination among event classes. In the following, these parameters are used to extract a fixed number of clusters in the feature space. In order to allow the build-up of a classifier from an individual data example, the sequence of features extracted from the reference waveform is modeled using the clusters defined before. Finally, the obtained model is trained using the reference waveform again to ensure the best fit to these types of events. The performance of the classification system shows a strong dependency on the reference waveform: not every event shows the typical feature pattern of the corresponding class. Accordingly, the reference waveform is chosen interactively based on seismological expert opinion. Furthermore, the features used to characterize an event class play a prominent role, and classification results vary between classes depending on the set of features used. Since the mechanisms generating seismic energy for different event types might be best represented by different features, it is reasonable to use an individual subset of features for each class. At the moment the most suitable subset of features for describing the current class can be chosen interactively based on expert opinion. However, we are also pursuing the possibility of choosing the features automatically using mutual information, following Ding and Peng (2005). Here a test function is minimized to account for minimum redundancy within one class and maximum discriminability between different classes. First results of the automatic feature selection process are promising and the method will be tested further.
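For orientation, the classical STA/LTA trigger mentioned above can be summarized in a few lines. The sketch below is not the Earthworm module used in the project; the function name, window lengths and on/off thresholds are illustrative assumptions only.

```python
import numpy as np

def sta_lta_trigger(x, fs, sta_win=1.0, lta_win=30.0, on=3.5, off=1.5):
    """Minimal classic STA/LTA trigger on a single trace.

    x: ground-motion samples, fs: sampling rate in Hz.
    Window lengths and thresholds are illustrative defaults only.
    """
    n_sta = int(sta_win * fs)
    n_lta = int(lta_win * fs)
    energy = np.asarray(x, dtype=float) ** 2
    # Running means of the signal energy over short and long windows
    sta = np.convolve(energy, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(energy, np.ones(n_lta) / n_lta, mode="same")
    ratio = sta / np.maximum(lta, 1e-20)

    triggers, active, t_on = [], False, 0.0
    for i, r in enumerate(ratio):
        if not active and r > on:            # trigger switches on
            t_on, active = i / fs, True
        elif active and r < off:             # trigger switches off
            triggers.append((t_on, i / fs))
            active = False
    return triggers                          # list of (on, off) times in seconds
```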
WP5B: NEAR-REAL-TIME INVERSION FOR DISLOCATION SOURCES AND STRESS FIELD COMPUTATION The subproject aims to develop new modelling techniques for fast inversion and stress field computation. As described in the last report, we developed two random search approaches that are utilized iteratively. The iterated approach helps to prevent the algorithms from getting trapped in local minima and increases the redundancy of the search-space exploration. We successfully tested this optimization method for the modelling of synthetic and real volcano InSAR data, such as that provided by partners of the Exupery project. Our findings and developments are novel and have been described in detail in a manuscript that was recently accepted for publication. The first phase of the sub-project was devoted to model geometry construction and the development of model inversions; this has been achieved. We are now in an application and testing period of the developed tools and algorithms. In order to facilitate data exchange and work directly on the automation, we currently collaborate very closely with DLR (WP2). Our algorithm developments have been very successful: they drastically reduce computation time and yield more stable results. This is of fundamental importance for the success of the sub-project and the development of a fast response system. Different modelling strategies have been tested, such as the boundary element method and the multiple Yang source method. We found that the consideration of time series is a fundamental step that will have to be included in the modelling scheme; this will be targeted during the coming months. We are also currently working on strategies for implementing the developed modelling scripts in a comprehensive software tool. The results may also have a larger impact for other research agencies and especially volcano observatories that have to rely on fast source characterization. In addition, we are currently investigating the possibility of utilizing the developed modelling strategies for earthquake hazards as well, in order to evaluate changes in stress fields and aftershock locations. Collaboration, application and technical advice are envisaged
within the Potsdam Research Cluster for Georisk Analysis (PROGRESS, BMBF), where the planned project start is October 2009, pending positive evaluation. The application of the results to other disciplines, especially earthquake hazards, is of particular interest and one of the most promising possibilities to be explored in the near future. Stress-triggering effects observed in reservoir geology (oil, gas and salt) may also follow similar physical principles and hence allow similar modelling strategies to be applied; in a collaboration project we test this idea for a salt diapir. WP5C: (Near-)Realtime Centroid Moment Tensor Inversion The focus of this sub-project is on seismic long-period events (LP events) with frequencies below 0.5 Hz, since these events have been associated with magma and volcanic fluid movement and are considered an important class of events for characterizing a specific state of unrest of a volcano. Working with relatively low frequencies avoids a possibly strong influence from the unknown small-scale structure of the volcano complex. The developed software module performs a near-real-time source analysis of LP events and is usually applied to a pre-defined time window of multi-trace waveform data after the module is triggered by the event detection process (WP5A, see above). The inversion is based on a grid search over location and time and a least-squares generalized inverse for the time-dependent moment tensor. The result is a complete point-source description of the possible event. All output data concerning the source and the metadata describing the full inversion process are stored in an XML file, which is sent to the project database. In case of a detected and »confirmed« LP event, the best moment tensor solution and location is displayed in the GIS using the standard projection, i.e. a lower hemispherical projection of the radiation pattern on the unit sphere around the centroid.
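The core of such a scheme can be illustrated with a short sketch. This is not the project's software module: it assumes pre-computed Green's functions for each trial centroid and shows only the static least-squares step per grid node, whereas the actual module inverts for a time-dependent moment tensor; all names are illustrative.

```python
import numpy as np

def grid_search_mt(data, greens, trial_points):
    """Minimal sketch of a grid-search moment tensor inversion.

    data:   observed waveforms, flattened into one vector (n_samples,)
    greens: dict mapping trial point index -> matrix (n_samples, 6) of
            Green's functions for the six independent moment tensor
            components (assumed to be pre-computed elsewhere)
    Returns the best trial point, its moment tensor and the misfit.
    """
    best = None
    for idx in trial_points:
        G = greens[idx]                              # (n_samples, 6)
        # Least-squares solution for the moment tensor at this grid node
        m, _, _, _ = np.linalg.lstsq(G, data, rcond=None)
        misfit = np.linalg.norm(data - G @ m)
        if best is None or misfit < best[2]:
            best = (idx, m, misfit)
    return best
```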
The developed modules were used as a stand-alone analysis tool for the investigation of real seismic data from known (non-volcanic) events that radiated low-frequency signals. Plotting routines for moment tensor projections and seismic traces – real data and synthetic traces – have been implemented for visual control of the results.
Project publications
JOURNAL ARTICLES
1. S. Roedelsperger, M. Becker, C. Gerstenecker, G. Laeufer, K. Schilling, and D. Steineck: Digital Elevation Model with the Ground-based SAR IBIS-L as Basis for Volcanic Deformation Monitoring. Submitted to Journal of Geodynamics.
2. M. Rix, P. Valks, N. Hao, J. van Geffen, C. Clerbaux, L. Clarisse, P.-F. Coheur, D. G. Loyola R., T. Erbertseder, W. Zimmer and S. Emmadi: Satellite monitoring of volcanic sulfur dioxide emissions for early warning of volcanic hazards. IEEE, Fostering Applications of Earth Observations of the Atmosphere Special Issue. Accepted for publication.
3. M. Shirzaei, T. R. Walter: Randomly Iterated Search and Statistical Competency (RISC) as powerful inversion tools for deformation source modeling: application to volcano InSAR data. JGR. In press.
CONFERENCE ARTICLES / ABSTRACTS 2009
4. Bonanno, S. Gresta, M. Palano, E. Privitera, G. Puglisi, M. Shirzaei, and T. R. Walter: Analysis of Coulomb stress changes during 1997–1998 intrusive period at Mt. Etna volcano. EGU General Assembly, Vienna, Austria, April 2009.
5. S. Rödelsperger, M. Becker, C. Gerstenecker, G. Läufer, D. Steineck: First Results of Monitoring Displacements with the Ground Based SAR IBIS-L. EGU General Assembly, Vienna, Austria, April 2009.
6. M. Shirzaei and T. R. Walter: New wavelet based InSAR time series (WAB-InSAR) technique eliminating atmospheric and topographic
artifacts using wavelet transforms, leading to accurate spatio-temporal deformation field mapping. EGU General Assembly, Vienna, Austria, April 2009.
7. M. Shirzaei and T. R. Walter: InSAR time series shows multiple deformation and interaction of gravitational spreading, intrusion and compaction on Hawaii Island. EGU General Assembly, Vienna, Austria, April 2009.
CONFERENCE ARTICLES / ABSTRACTS 2008
8. X. Cong, S. Hinz, M. Eineder, A. Parizzi: Ground deformation measurement with radar interferometry in Exupéry. In Proceedings of USE of remote sensing techniques (USEReST) for monitoring volcanoes and seismogenic areas, 11–14 November 2008, Naples, Italy, 2008.
9. M. Hort, K. Zakšek: Managing volcano unrest: the mobile volcano fast response system. In Proceedings of USE of remote sensing techniques (USEReST) for monitoring volcanoes and seismogenic areas, 11–14 November 2008, Naples, Italy, 2008.
10. G. Laeufer, M. Becker, R. Drescher, C. Gerstenecker, S. Leinen, and S. Roedelsperger: Ground based deformation monitoring within the Geotechnologien Project »Exupéry«. In Proceedings of 14th General Assembly of WEGENER, 15–18 September 2008, Darmstadt, Germany, 2008.
11. G. Laeufer, M. Becker, M. Drescher, C. Gerstenecker, S. Leinen, and S. Roedelsperger: Exupery: Entwicklung eines Volcano Fast Response Systems. 41. Herbsttagung des Arbeitskreises Geodäsie/Geophysik, Hirschegg, Austria, October 2008.
12. C. Maerker, K. Seidenberger, T. Erbertseder, M. Rix, P. Valks, J. van Geffen: Trajectory matching and dispersion modeling of volcanic plumes utilising space-based observations. In Proceedings of USE of remote sensing techniques (USEReST) for monitoring volcanoes and seismogenic areas, 11–14 November 2008, Naples, Italy, 2008.
13. M. Rix, P. Valks, N. Hao, T. Erbertseder, J. van Geffen: Monitoring of volcanic SO2 emissions using the GOME-2 satellite instrument. In Proceedings of USE of remote sensing techniques (USEReST) for monitoring volcanoes and seismogenic areas, 11–14 November 2008, Naples, Italy, 2008.
14. S. Roedelsperger, C. Gerstenecker, G. Laeufer, and D. Steineck: Deformationsmessungen mit IBIS-L im Steinbruch Dieburg. 41. Herbsttagung des Arbeitskreises Geodäsie/Geophysik, Hirschegg, Austria, October 2008.
15. M. Shirzaei, M. Motagh, T. R. Walter, A. Golamzadeh and F. Yamini-Fard: Earthquake mechanism and stress transfer induces salt diapir deformation: Results from hybrid inversion of d-InSAR and aftershock data of the Nov 27 2005 Qeshm Island earthquake, Iran. EGU General Assembly, Vienna, Austria, April 2008.
16. K. Zakšek, M. Hort: Integration of thermal anomaly monitoring into a mobile volcano fast response system. In Proceedings of USE of remote sensing techniques (USEReST) for monitoring volcanoes and seismogenic areas, 11–14 November 2008, Naples, Italy, 2008.
CONFERENCE ARTICLES / ABSTRACTS 2007
17. M. Shirzaei, T. R. Walter, M. Motagh, A. Manconi, R. Lanari: Inversion of InSAR data via Genetic Algorithm: Campi Flegrei volcanic region. FRINGE 2007 Workshop, European Space Agency.
POSTERS 2009
18. K. Seidenberger: Estimation of volcanic eruption characteristics using satellite-based observations and coherent trajectory ensembles. Poster at EGU General Assembly, 19–24 April 2009, Vienna, Austria, 2009.
19. S. Rödelsperger, M. Becker, C. Gerstenecker, G. Läufer, K. Schilling, D. Steineck: Generation of a Digital Elevation Model with the Ground Based SAR IBIS-L. Poster at EGU General
Assembly, 19–24 April 2009, Vienna, Austria, 2009.
POSTERS 2008
20. S. Roedelsperger, M. Becker, R. Drescher, C. Gerstenecker, G. Läufer, and S. Leinen: Deformationsmessungen mit GPS und Synthetic Aperture Radar. Poster at 100 Jahre August-Euler Flugplatz, 30–31 August 2008, Griesheim, Germany, August 2008.
21. C. Maerker, K. Seidenberger, T. Erbertseder, M. Rix, P. Valks, J. van Geffen: Trajectory matching and dispersion modeling of volcanic plumes utilising space-based observations. In Proceedings of USE of remote sensing techniques (USEReST) for monitoring volcanoes and seismogenic areas, 11–14 November 2008, Naples, Italy, 2008.
22. K. Zakšek, M. Hort: Integration of thermal anomaly monitoring into a mobile volcano fast response system. Poster at USE of remote sensing techniques (USEReST) for monitoring volcanoes and seismogenic areas, 11–14 November 2008, Naples, Italy, 2008.
UNPUBLISHED CONFERENCE CONTRIBUTIONS 2009
23. C. Maerker: Derivation of eruption characteristics and modelling of the long-range transport of volcanic SO2 based on space-based measurements. Invited talk at GlobVolcano User Workshop, 7–10 July 2009, San José, Costa Rica, 2009.
24. M. Rix: Monitoring of volcanic SO2 emissions using satellite instruments. Invited talk at GlobVolcano User Workshop, 7–10 July 2009, San José, Costa Rica, 2009.
25. K. Zakšek: Exupéry – the mobile volcano fast response system VFRS. Invited talk at SAVAA Users Workshop, 6–7 April 2009, Rome, Italy, 2009.
WeraWarn – Real Time Detection of Tsunami Generated Signatures in Current Maps Measured by the HF Radar WERA to Support Coastal Regions at Risk Dzvonkovskaya A., Gurgel K.-W., Pohlmann T., Schlick T., Xu J. University of Hamburg, Centre for Marine and Climate Research, Institute of Oceanography, Germany *Coordinator of the project: Prof. Dr. Detlef Stammer
1. Introduction The WeraWarn project intends to develop a contribution to a tsunami early warning system. The idea is to use over-the-horizon High Frequency (HF) radar techniques to identify a tsunami wave travelling towards the coast at ranges of up to 200 km off the shore. HF radars operate in the decameter wave fraction of the radio spectrum and use operating frequencies between 5 and 30 MHz. Bragg-resonant backscattering by ocean waves with half the electromagnetic wavelength makes it possible to measure the ocean surface current at large distances. These radar systems have recently become an operational tool in coastal monitoring, e.g. along the coasts of the United States. Applications include the distribution of pollutants, search and rescue, ship detection and tracking, as well as ship guidance, fishery and tourism. As these systems already exist, mainly a software package has to be added to enable support for tsunami detection. The hope is that in case of an approaching tsunami, a strong ocean current signature can be observed by the radar when the tsunami wave crosses the shelf edge and travels into shallower water (200 to 50 meters). Depending on the width and depth of the shelf, it would then take up to 45 minutes from the first detection of the signature until the tsunami reaches the coast. As no HF radar measurements of a real tsunami exist, the
concept cannot be applied to real conditions and is not yet proven. To test the concept, the current signature generated by a tsunami was therefore simulated by an oceanographic model, in this case a modified HAMSOM. The ocean model output is then used as input to an electromagnetic backscatter model to simulate what an HF radar would actually observe. Based on these results, an algorithm to detect a tsunami signature in the HF radar data is being developed. The project is organized in several sub-projects which are discussed in more detail in the following sections. 2. TsunamiCoast After successfully concluding the simulations of tsunami waves on global and also on local scales, it was decided to develop an assimilation scheme which allows the direct incorporation of HF-radar derived sea surface currents into hydrographic numerical models. By means of this strategy the usability of HF-radar data can be strongly enhanced compared to the traditional validation and calibration procedure. With respect to the envisaged tsunami detection, such an assimilation scheme could, in principle, provide great advantages, since the natural background variability can be determined much more accurately and realistically if the model can be updated with observational data during the model run. However, applying the data assimilation in real time remains a
Fig. 1: Bathymetry of the model domain (m) covering the EuroROSE investigation area, with a resolution of 1 nautical mile
challenge and is likely impractical. The data assimilation approach used in this context is the Ensemble Kalman Filter Method (Evensen, 1994; Evensen, 1997), which is based on the Kalman filter scheme and designed for nonlinear systems. The Kalman filter is a sequential filter method, which means that the model can be integrated forward in time like a standard model run and, whenever measurements are available, they are used to update the model result, after which the model is re-initialised. In order to find a practical way to implement the Ensemble Kalman Filter Method, a number of sensitivity tests have been conducted. As a basis for these tests, HF-radar data collected during the EU project EuroROSE have been chosen. This European research project focussed on a small regional area off the north-western Norwegian coast. The obtained observations have been assimilated into the circulation model HAMSOM, which was also utilised for the tsunami forward runs. For this test case HAMSOM was adapted to the topography of the EuroROSE region. Fig. 1 shows the employed bathymetry, which has a resolution of 1 nautical mile (approx. 1.85 km). In addition to the HF-radar data, HAMSOM was forced by meteorological data from NCEP/NCAR which were available for the actual period of the EuroROSE measuring campaign. Moreover, at
the open boundaries tidal data from the global tide model OTIS have been prescribed. The traditional Ensemble Kalman method requires an ensemble of model realisations in order to determine the possible model error. Due to limited computer resources we undertook five model realisations using slightly perturbed initial distributions. As an example, Fig. 2 shows the differences between the fourth and the first ensemble member run for the sea surface elevation (Fig. 2a) and the sea surface currents (Fig. 2b) after 31 days of simulation. The resulting difference can be used to identify the existing model error. Obviously, the largest deviations between the different model realisations occur in regions which exhibit significant frontal structures. This is due to the fact that in these areas slightly different transport rates can lead to large local differences in salinity and temperature and subsequently in the baroclinic flow fields. The traditional Ensemble method has the disadvantage that a certain number of model realisations is required to give a representative distribution of the expected model error; thus, this method has an extremely high demand for computing resources. In order to circumvent this problem, Breivik and Saetra (2001) have developed the Quasi Ensemble Kalman Filter
Fig. 2: Difference between the fourth and first ensemble member run after 31 days of simulation a) left: sea surface elevation (cm) b) right: sea surface velocity (cm/s)
Method. In this approach the required error covariance matrix is extracted from a reference forward run which is assumed to be representative of the conditions during the assimilation run. However, this error covariance matrix is constant over the entire assimilation run, which means it is not flow-dependent. To account for this deficiency, we have modified the original idea of Breivik and Saetra: in our new approach the error matrix is obtained from a model run covering a period for which the actual assimilation data are available. In order to achieve an even stronger linkage between the error covariance matrix and the observational data, the particular model states around the individual observation times have been chosen. Following the suggestion of Peter Jan van Leeuwen (pers. comm.), the assumption was made that forecast errors can be set equal to forecast tendencies, which is a common procedure for variational assimilation methods. By this means it is guaranteed that the error covariance matrix is flow-dependent and as close as possible to the assimilation data. An additional modification was made concerning the initial conditions: if these conditions are derived from the ensemble runs, a further improvement can be achieved.
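For orientation, the analysis step of a generic stochastic Ensemble Kalman Filter can be written compactly as below. This is a minimal sketch, not the HAMSOM implementation or the modified quasi-ensemble scheme described above; the explicit matrix inversion and all names are illustrative and would be replaced by the flow-dependent covariances discussed in the text.

```python
import numpy as np

def enkf_analysis(X_f, d, H, R):
    """One generic Ensemble Kalman Filter analysis step (sketch only).

    X_f : forecast ensemble, shape (n_state, n_members)
    d   : observation vector, shape (n_obs,)
    H   : observation operator matrix, shape (n_obs, n_state)
    R   : observation error covariance, shape (n_obs, n_obs)
    """
    n_members = X_f.shape[1]
    x_mean = X_f.mean(axis=1, keepdims=True)
    A = X_f - x_mean                              # ensemble anomalies
    P = A @ A.T / (n_members - 1)                 # ensemble covariance
    # Kalman gain K = P H^T (H P H^T + R)^-1
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    # Perturb observations for each member (stochastic EnKF variant)
    rng = np.random.default_rng(0)
    D = d[:, None] + rng.multivariate_normal(np.zeros(len(d)), R, n_members).T
    # Update every ensemble member with its innovation
    X_a = X_f + K @ (D - H @ X_f)
    return X_a
```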
The different steps are presented in Fig. 3. Fig. 3a shows the direct measurement from the HF-radar, whereas Fig. 3b presents the pure forecast model run, i.e., without any assimilation. In contrast, Fig. 3c and 3d depict the results of the assimilation runs. Obviously, the assimilation generates significantly more regional structures compared to the pure forward run. In particular, the velocity maximum in the north-western part of the investigation area can only be reproduced by the data assimilation run. Looking at the sensitivity towards the choice of the initial fields, it is obvious that the employment of realistic initial conditions (Fig. 3d) leads to an additional improvement of the simulation results compared to unrealistic initial conditions (Fig. 3c). 3. TsunamiHFSpec This sub-project aims to simulate the signals seen by an HF radar in the case of a tsunami travelling towards the coast and to propose an algorithm to detect and monitor the moving tsunami signature. 3.1 Backscatter Model Fig. 4 shows the radial component of the ocean surface current modeled by HAMSOM at about 25 minutes after the underwater earthquake. The bottom topography used represents a typical shape from the deep ocean towards the shelf, with the shelf edge being
Fig. 3: Sea surface velocity (m/s) in the EuroROSE region off the north-western Norwegian coast a) upper left: HF-radar measurement b) upper right: simulation results without assimilation c) lower left: simulation result with assimilation but without realistic initialisation d) lower right: simulation result with assimilation and with realistic initialisation
located 100 km off the coast. The underwater earthquake is represented by an initial ocean elevation disturbance of 2 meters, 180 km off the coast. The ocean surface current field induced by a propagating tsunami is calculated at spatial scales of 1 km, covering an area of 250 × 250 km², and at temporal scales of 1 second. The white lines indicate the area which the radar covers at 250 km range and an azimuthal angle of ± 60 degrees. Due to the decreasing water depth, the wave speed decreases and it takes about 30 minutes until the tsunami reaches the coast. The tsunami induced surface current reaches up to 1.5 m/s, which is well above the background ocean current due to tides, density gradients, and wind.
As WERA uses a linear receive antenna array, beam forming algorithms can be applied to the acquired backscatter data to distinguish different directions of echo return. An example of measured range-Doppler spectra is shown in Fig. 5a. In this case, WERA was operated in the 8 MHz frequency range and the two first-order Bragg lines can be observed up to 250 km offshore. The tsunami induced surface current signal is superposed on the time series of the radar backscatter by calculating the time series of the additional Doppler shift due to the tsunami and forming the Hadamard (element-wise) product of the two time series. Because the Doppler shift due to the tsunami changes quickly with time, the integration time of the Doppler spectrum must be kept between 2 and 3 minutes. Fig. 5b shows the resulting range-Doppler spectrum.
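A minimal sketch of this superposition step is given below. It is not the WERA processing chain: the function and variable names are illustrative, and it assumes the tsunami-induced radial current is available for every backscatter sample.

```python
import numpy as np

def superpose_tsunami_doppler(backscatter, v_radial, fs, f_radar=8.0e6):
    """Superpose an additional tsunami Doppler shift on a backscatter series.

    backscatter : complex baseband samples of one radar cell
    v_radial    : tsunami-induced radial current for each sample (m/s)
    fs          : sampling rate of the time series (Hz)
    f_radar     : radar operating frequency (here 8 MHz, as in the text)
    """
    c = 3.0e8                                   # speed of light (m/s)
    wavelength = c / f_radar
    # Instantaneous additional Doppler shift caused by the radial current (Hz)
    f_doppler = 2.0 * v_radial / wavelength
    # Integrate the instantaneous frequency to obtain the extra phase
    phase = 2.0 * np.pi * np.cumsum(f_doppler) / fs
    modulation = np.exp(1j * phase)
    # Element-wise (Hadamard) product of the two time series
    return backscatter * modulation
```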
Fig. 6 presents an example of measured radial ocean surface currents using the WERA system. Vectors of surface currents were evaluated
Fig. 4: Simulated radial velocities of tsunami induced currents using HAMSOM model
Fig. 5: Measured range-Doppler-spectra without (left) and with (right) superposition of the tsunami induced currents shown in Fig. 4
from the beam-formed range-Doppler power spectra. The grid spacing is 1.5 km with a maximum range of 250 km off the coast. Arrows show the direction of the radial current component. The color scale corresponds to the ocean current velocity in m/s. Specific tsunami current signatures are clearly observed in this map. The first appearance of such signatures can be monitored early enough to issue a warning message about an approaching tsunami. 3.2 Tsunami Detection Algorithm Two different statistical scenarios for the simulated tsunami event are considered in order to investigate tsunami detection possibilities. The
first scenario includes the tsunami event simulated with the HAMSOM model, where the total simulation time covers a one-hour time frame after the earthquake has happened, while the second scenario covers a whole day of radar measurements with a tsunami superposed between 14:00 UTC and 15:00 UTC. 3.2.1 Scenario one The proposed tsunami detection technique is based on a statistical approach. The ocean current velocity values are mapped into an entropy field using Shannon entropy filtering for a single snapshot, i.e. about 2 min of integration time, collected by the HF radar. The current velocity map statistics may vary from
Fig. 6: Radial ocean surface current velocity map based on the measured HF radar spectra with the superimposed simulated tsunami currents shown in Fig. 4
snapshot to snapshot; therefore we have to detect the tsunami event against background clutter and noise, which has an unknown distribution of entropy values. We apply the detection technique to the range-azimuth ocean current entropy map using a conventional constant-false-alarm-rate (CFAR) algorithm. Variations in both the signal strength and the contending noise or clutter level determine the probability that a signal on the threshold boundary will be registered as a detection. The probability of false alarm is determined by the likelihood that the noise or clutter signal amplitude exceeds a given level. The detection of a tsunami event can be expressed with the hypotheses H0 (no tsunami, only clutter) and H1 (tsunami and clutter), where clutter comprises all environmental surface current changes and interference. CFAR methods usually formulate a test statistic for each cell of interest and compare it to some threshold. The CFAR threshold calculation is usually based on the Neyman-Pearson criterion with a fixed probability of false alarm and a maximum probability of target detection.
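The entropy-mapping step described above can be sketched as follows. This is an illustration only, not the WeraWarn implementation; the window size and the number of histogram bins are assumptions.

```python
import numpy as np

def entropy_map(velocity, win=5, bins=16):
    """Map a current-velocity snapshot to a local Shannon entropy field.

    velocity : 2-D array of radial current velocities (range x azimuth)
    win      : edge length of the sliding window (illustrative choice)
    bins     : number of histogram bins used to estimate the local pdf
    """
    rows, cols = velocity.shape
    half = win // 2
    out = np.full_like(velocity, np.nan, dtype=float)
    for i in range(half, rows - half):
        for j in range(half, cols - half):
            patch = velocity[i - half:i + half + 1, j - half:j + half + 1]
            hist, _ = np.histogram(patch, bins=bins)
            p = hist / hist.sum()
            p = p[p > 0]
            out[i, j] = -np.sum(p * np.log2(p))   # Shannon entropy in bits
    return out
```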
A detection decision must be made for each range-azimuth cell of each entropy map individually. The adaptive threshold εthr at the specified test cell can be selected according to the ordered-statistic CFAR (OS-CFAR) detector. The test cell entropy value ε is compared to the threshold εthr using the following statistical test rule:
– We choose hypothesis H1 if the test value ε exceeds the threshold value εthr;
– We choose hypothesis H0 if the test value ε is less than the threshold value εthr.
The threshold is calculated as the product of a scaling factor C, used to adjust the probability of false alarm Pfa, and the entropy distribution quantile εα of the α-th order. In order to specify the threshold, which is necessary to make a correct decision about a tsunami event, we have to evaluate the probability distribution of the maximum entropy values related to the quantile entropy values. The relation of these quantiles gives the necessary background to find the scaling factor C. The measured quantile relations can be fitted by estimating the
Fig. 7: Probability density function (pdf) and cumulative distribution function (cdf) for measured entropy quantile relations and generalized extreme value (GEV) distribution fitting. Data were measured on May 18th, from 14:00 till 15:00
parameters of the assumed distribution. In our case we propose that such relations can be described by the generalized extreme value (GEV) distribution. The GEV distribution is often used to model the smallest or largest value among a large set of independent, identically distributed random values representing measurements or observations, in our case the quantiles of entropy. The probability density function (pdf) of the GEV distribution depends on the location parameter µ, the scale parameter σ, and the shape parameter k. The GEV distribution thus »lets the data decide« which distribution is appropriate: the three cases covered by the GEV distribution are often referred to as Types I, II, and III, also known as the Gumbel, Fréchet, and Weibull distributions, respectively. Using a 95% confidence level we obtain maximum likelihood estimates of the GEV parameters, thus defining the GEV cumulative distribution function. Based on this function, the scaling factor C can be estimated as a quantile of the (1–Pfa)-th order.
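A compact sketch of this threshold construction, using the GEV implementation in SciPy, might look as follows. It is illustrative only: the input data, the false-alarm probability and the function names are assumptions, not the project's code.

```python
import numpy as np
from scipy.stats import genextreme

def cfar_threshold(entropy_ratios, quantile_eps, p_fa=1e-4):
    """Derive an OS-CFAR threshold from GEV-fitted entropy statistics.

    entropy_ratios : measured relations of maximum to quantile entropy
                     values collected during a tsunami-free period
    quantile_eps   : the entropy quantile (e.g. the 75% quantile) of the
                     current snapshot
    p_fa           : desired probability of false alarm (illustrative)
    """
    # Maximum likelihood fit of the generalized extreme value distribution
    shape, loc, scale = genextreme.fit(np.asarray(entropy_ratios))
    # Scaling factor C: quantile of order (1 - p_fa) of the fitted GEV
    C = genextreme.ppf(1.0 - p_fa, shape, loc=loc, scale=scale)
    return C * quantile_eps

def detect(cell_entropy, threshold):
    """Hypothesis test for one cell: True (H1, tsunami) if above threshold."""
    return cell_entropy > threshold
```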
Fig. 7 illustrates how to choose the probability of false alarm Pfa based on the measured entropy quantile cumulative distribution function (cdf) and how to find the scaling factor C. The area under the estimated GEV pdf corresponds to the selected probability of false alarm. To investigate the OS-CFAR procedure we consider three different quantile orders, i.e. α = 75%, 90% and 95%, applied to the surface current velocity maps measured on May 18th from 14:00 till 15:00 without a tsunami. Fig. 8 shows the measured entropy quantile relations and the selected thresholds. The changes due to a tsunami event are clearly visible and exceed the threshold line. When the local entropy value exceeds this threshold, we detect a tsunami area according to our hypothesis test rule and issue a tsunami warning. The detection results for the OS-CFAR with Pfa = 10⁻⁴ using the 75%-quantile thresholding are shown in Fig. 9. These results are the basis of the tsunami alert maps shown in Fig. 10. Besides the detection decisions, they give the location of the tsunami wave, which
Fig. 8: The measured entropy quantile relations and the selected thresholds
Fig. 9: Detection results for the simulated tsunami, which is generated on May 18th between 14:00 and 15:00
can even be tracked over time. Since the probability of false alarm is not zero in the real world, warning messages need to be confirmed: if tsunami detections occur consecutively in time, the tsunami alarm can be considered confirmed. For example, Fig. 10a–10d show the tsunami alert map at different times. The tsunami detection technique, based on the OS-CFAR detection algorithm applied to the entropy field of surface currents, thus allows tsunami alert messages to be issued automatically during real-time monitoring with the WERA radar. 3.2.2 Scenario two The second scenario covers one full day of radar measurements including the simulated
tsunami event in the time period between 14:00 and 15:00. When we consider the detection results based on the proposed OS-CFAR algorithm for the whole day of May 18th (see Fig. 11), the tsunami detections between 14:00 and 15:00 can be seen, but there are also several values showing detections even without superimposed tsunami currents, especially at around 13h, 17h and 21h. If we investigate the quantile relations for the full day, we observe a time dependency of the quantile relations during the day, so that the constant threshold derived from one hour of data (see Fig. 8) is not sufficient, i.e. the process cannot be treated as a stationary process.
Fig. 10: Tsunami alert maps at (a) 5 minutes after the tsunami earthquake; (b) 15 minutes after the earthquake; (c) 25 minutes after the earthquake; (d) 35 minutes after the earthquake
Fig. 11: 24-hour detection results for the simulated tsunami, which is generated on May 18th between 14:00 and 15:00
In order to understand the reason for these outliers, we have to investigate our measured radar spectra. Fig. 12 shows the range-Doppler spectra for a specified beam angle in the vicinity of 13:00. It can be seen that large
parts of the spectra are heavily corrupted by signals caused by ionospheric interference. The ionosphere is the uppermost part of the atmosphere, consisting of a shell of electrons and
Fig. 12: Range-Doppler spectra corrupted by strong ionosphere interference during the measurements
electrically charged atoms and molecules that surrounds the Earth, stretching from a height of about 80 km to more than 1000 km. It owes its existence primarily to high-energy ultraviolet radiation from the sun and has practical importance because it influences radio wave propagation to distant places on the Earth. Total refraction of the radio wave can occur when the electron density in the ionosphere is high enough. We observe this specific case in Fig. 12, where the ionospheric interference appears as horizontal patches of high spectral power. The patches are produced by strong reflections of the radar signal from a sporadic, dense, low-altitude ionospheric layer. Moreover, these specific patches lead to a wrong estimation of the ocean surface current velocities. The suppression of ionospheric interference is a key problem of HF radar signal processing and has been an ongoing topic of research for decades. One solution to this problem is to use a second radar frequency, for which the interaction with the ionosphere is completely different. Therefore, the development of a dual-frequency HF radar was part of the work package WeraOptimize (Helzel Messtechnik) within WeraWarn.
Another solution is to apply a special filtering technique based on knowledge of the shelf edge. For this particular tsunami simulation we assume the shelf edge to start 100 km off the coast. This allows us to blank out all very strong surface current velocity deviations that occur at farther ranges over deep water, because they do not correspond to a tsunami event. We provide a new simulation of the tsunami event using the same HAMSOM data as previously; now the event happens between 12:00 and 13:00. Fig. 13 shows the entropy quantile relations for the current velocity maps measured on May 18th from 12:00 till 13:00, when the strong ionospheric interference occurs. Three cases are considered, i.e. entropy quantile relations without the superimposed tsunami, with the superimposed tsunami but without data filtering, and with the superimposed tsunami and data filtering. The changes due to a tsunami event are clearly visible and exceed the threshold line after the data filtering.
Fig. 13: The measured entropy quantile relations with and without filtering the corrupted data
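Such a shelf-edge filter reduces to a simple masking operation, sketched below. The shelf-edge range and the velocity limit are assumptions taken from the values quoted in the text, not parameters of the operational software.

```python
import numpy as np

def blank_deep_water_outliers(velocity, ranges_km, shelf_edge_km=100.0, v_max=1.5):
    """Blank out strong velocity deviations beyond the shelf edge.

    velocity  : 2-D radial current map (range x azimuth), in m/s
    ranges_km : range of each row in km
    Cells beyond the shelf edge whose speed exceeds v_max (illustrative
    limit) are treated as interference and set to NaN.
    """
    v = np.asarray(velocity, dtype=float).copy()
    deep = np.asarray(ranges_km)[:, None] > shelf_edge_km   # deep-water rows
    outlier = np.abs(v) > v_max
    v[deep & outlier] = np.nan
    return v
```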
To conclude the above-mentioned results, we point out the following items:
– The developed tsunami detection algorithm based on constant thresholding works well in a quiet electromagnetic environment and allows tsunami alerts to be generated automatically while operating the HF radar system.
– In the case of radio frequency interference from ionospheric reflections, either a dual-frequency radar or a special algorithm based on adaptive thresholding and filtering of the corrupted data according to knowledge of the ocean shelf extension can be applied. This complicated topic has to be investigated and confirmed by additional measurements.
4. TidalBore This experiment was designed as a test bed for the detection algorithm developed within TsunamiCoast and TsunamiHFSpec. As tsunamis happen at unpredictable times, it is difficult to acquire HF radar backscatter data during a real tsunami event. A tidal bore is expected to generate comparable signals in the radar system at predictable times and is considered to be a good replacement.
There are only a few places worldwide where the tidal bore phenomenon leads to waves of extreme height and extreme propagation speed at spatial scales suitable to the resolution of HF radar measurements: in China at the Hangzhou delta and in Brazil inside the Amazon delta at the mouth of the Rio Araguari. The first option, to have the experiment in China, was dropped due to problems with authorization and data exchange between China and Germany, as at that time the Chinese Navy became responsible for research in Chinese coastal waters. After a site survey, the second option, to have the experiment in Brazil, turned out to be logistically more complicated and expensive than initially planned. In addition, due to the rainy season in the Amazon delta, the experiment could no longer be finished in time to process and validate the data within the project's schedule. The experiment is now planned within an international consortium, preferably under Chinese lead to simplify authorization and data exchange issues. To help bring together this consortium, the results of WeraWarn have already been presented during the
Oceans 2009 conference and the International Radiowave Oceanography Workshop ROW2009. They will be presented at the International Radar Symposium IRS-2009 and the IEEE Radar Conference 2009 in the near future. Besides interested parties from Australia and Canada, there is a new contact in Japan. Kokusai Kogyo Co. Ltd. is very interested in our results, as they are involved in the design of the Japanese tsunami early warning system. Kokusai Kogyo may also have access to historical HF radar measurements of tsunami events. It must, however, be clarified whether the HF radar technology available at that time allows the data to be processed by the WeraWarn detection algorithm, because of the requirements on spatial resolution and integration time.
References:
R. Breivik and R. Saetra. Real time assimilation of HF radar currents into a coastal ocean model. Journal of Marine Systems, 28: 161–182, 2001.
G. Evensen. Sequential data assimilation with a nonlinear quasi-geostrophic model using Monte Carlo methods to forecast error statistics. Journal of Geophysical Research, 99 (C5): 10143–10162, 1994.
G. Evensen. Application of ensemble integrations for predictability studies and data assimilation. In Monte Carlo Simulations in Oceanography, Proceedings 'Aha Huliko'a Hawaiian Winter Workshop, University of Hawaii at Manoa, 14–17, 1997.
A. Dzvonkovskaya, K.-W. Gurgel, T. Pohlmann, T. Schlick, J. Xu. Simulation of Tsunami Signatures in Ocean Surface Current Maps Measured by HF Radar. Oceans 2009, Bremen, Germany, May 2009.
K.-W. Gurgel, G. Antonischki, H.-H. Essen and T. Schlick. Wellen Radar (WERA), a new groundwave based HF radar for ocean remote sensing. Coastal Engineering, Vol. 37, Nos. 3–4, ISSN 0378-3839, pp. 219–234, August 1999.
EWS Transport – Early Warning System for Transport Lines Hohnecker E. (1) , Buchmann A. (1), Wenzel F. (2), Titzschkau T. (2), Bonn G. (3), Hilbring D. (3), Quante F. (3) (1) Department of Railway Systems, University of Karlsruhe (TH) (2) Geophysical Institute, University of Karlsruhe (TH) (3) Fraunhofer Institute for Information and Data Processing, Karlsruhe *Coordinator of the project: Univ.-Prof. Dr.-Ing. E. Hohnecker, University of Karlsruhe (TH)
1. Introduction The project EWS Transport (Early Warning System for Transportation Lines) analyses the potential of earthquake early warning for railway systems. The project's main goals are (a) performance studies of dense station networks that detect earthquakes based on a neural network method, (b) understanding of noise conditions for seismological recording close to railway tracks, (c) risk analysis that identifies and warns endangered parts of the railway network as well as estimates potential damage to the railway infrastructure, and (d) the specification of an open service architecture providing access to information and results created by the early warning workflow. The principle of functionality, from extracting earthquake parameters to visualizing estimated infrastructure damage, is realized in an online demonstrator. The research components are integrated in the demonstrator with databases for Baden-Württemberg, as all required data are available to us for this region. If data from high-seismicity areas become available, the results can easily be extended to such cases. 2. Realtime Seismology for Earthquake Early Warning
2.1 Earthquake Early Warning: Example Baden-Württemberg The federal state of Baden-Württemberg (BW), located in the south-west of Germany, was chosen as test area for EWS Transport because of, e.g., the availability of GIS railway data, knowledge of the railway operational system and information on the local geology. BW is the state in Germany with the highest seismic activity, which, however, is low but not negligible from a global viewpoint. The maximum magnitudes range between 6.0 and 6.5 and, in terms of European Macroseismic Scale intensities, the non-exceedence probability of 90% in 50 years ranges between V and VIII (Grünthal and Bosse, 1996; Grünthal et al. 1998). Due to the rare occurrence of earthquakes and the lack of large magnitude events, there is a shortage of observed data suitable for this study. Synthetic seismograms of earthquakes in and around BW have therefore been generated and provide the basis for the analysis of EWS Transport options. The use of synthetically generated ground motion makes it possible to test different earthquake and sensor network scenarios. EWS Transport's innovative idea of implementing a large number of low-cost sensors in or near railway tracks can thus be simulated. Such a network could make use of the given infrastructure, power lines and communication options and would provide comprehensive information directly for and directly at the object of interest.
2.1.1 Synthetic Database Within the scope of this study, Sokolov and Wenzel (2008) developed the first stochastic ground motion model customized to BW.
Fig. 1: Map of the test area
It is based on regional earthquake ground motion data and site amplification parameters. With this model, ground motion records can be created at any point in the test area and for any given hypocenter and magnitude. In order to generate scenarios that are as realistic as possible and as hazardous as necessary for EWS Transport testing purposes, the historic earthquake catalog of Germany was consulted (Leydecker, 2008). The distribution of earthquakes was obtained from hypocenter coordinates of historic events with an intensity ≥ 5 in a radius of 250 km around the center of BW. A Gutenberg-Richter relationship of log10 N = 4.9 – 0.72 ML, where N is the number of earthquakes with a local magnitude ML or greater, was determined from all earthquakes in the catalog with ML ≥ 2.5. As input for the generation of synthetic ground motion, 290 earthquakes with a magnitude distribution between 4.5 and 6.5 following the Gutenberg-Richter relationship were randomly distributed over the hypocenters.
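Drawing magnitudes that follow the quoted Gutenberg-Richter relation (b = 0.72) on a truncated range can be done by inverse-transform sampling, as sketched below. The function name and the random seed are illustrative; this is not the generator used in the project.

```python
import numpy as np

def sample_gr_magnitudes(n, b=0.72, m_min=4.5, m_max=6.5, seed=0):
    """Draw n magnitudes from a truncated Gutenberg-Richter distribution.

    With log10 N = a - b*ML, magnitudes follow an exponential law in ML;
    here we apply inverse-transform sampling on the range [m_min, m_max].
    """
    rng = np.random.default_rng(seed)
    u = rng.random(n)
    lo, hi = 10.0 ** (-b * m_min), 10.0 ** (-b * m_max)
    return -np.log10(lo - u * (lo - hi)) / b

# e.g. the 290 scenario magnitudes between ML 4.5 and 6.5
mags = sample_gr_magnitudes(290)
```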
For the sake of testing the system, 15 additional earthquakes with magnitudes between 6.6 and 8.0, which are considered impossible in this area, were added at the locations of historically greatest earthquake intensity. The distribution of all 305 earthquakes is shown in Fig. 1. For the current version of the online demonstrator (see section 4.2), synthetic ground motion was generated at the locations of the 13 stations of the Landeserdbebendienst (LED) BW that have direct data transfer (http://www.lgrb.unifreiburg.de) (Fig. 1, white squares) and at 50 additional virtual stations that are randomly distributed along railway lines in the test area (Fig. 1, white triangles). 2.1.2 Earthquake Early Warning for Extended Objects Using PreSEIS EWS Transport implements a modified version of the earthquake early warning method PreSEIS (Böse et al., 2008) for earthquake detection
and analysis. In PreSEIS, earthquake parameters are extracted from the sensor network via artificial neural network technology. This early warning method combines the advantages of regional and on-site early warning approaches (Böse et al., 2008; Köhler et al., 2009): PreSEIS is quick and reliable because it is not required that seismic waves have arrived at all stations before parameter estimates can be issued, and the parameters are continuously updated using the incoming information from all triggered and not yet triggered stations. The synthetic data used in this analysis are very basic and source and path effects have not been optimized. However, PreSEIS has already been proven to work successfully with real data (Köhler et al., 2009), and EWS Transport does not focus on the early warning method itself, but on the special requirements of an earthquake early warning system for railway lines. Two-layer feed-forward neural networks are trained with a subset of the synthetic dataset to extract information from P waves at the 13 LED stations (where sensitive instruments record the ground motion in a quiet environment) and issue parameter estimates, which are updated every 0.5 seconds. The networks are trained to estimate hypocenter coordinates and magnitude, as well as the expected horizontal peak ground acceleration (PGA) at each of the 50 virtual sites along the railway lines, i.e. directly at user sites. The ability of the network to estimate correct parameters from ›unknown‹ earthquakes (not included in the training dataset) is tested with the remainder of the dataset. Based on the 50 PGA estimates, alert maps for BW with a resolution of 1 km² are calculated by interpolation. Alert maps are a crucial component of EWS Transport, as they provide the basis for hazard assessment along the railway lines in the test area (see section 3.1). Immediately after an event, a shake map with a resolution of 1 km² is calculated by interpolation of the 50 true PGA values obtained from the seismograms of each station along the railway lines. Shake maps allow the identification of areas that have been affected by very
high ground shaking and the assessment of potential damage to the railway infrastructure (see section 3.2). The performance of a trained network is evaluated by comparing the estimated with the true hazard at 3176 points along the railway lines, i.e. every ~1 km, for each time increment and each earthquake. The hazard assessment depends on railway track characteristics and the PGA value at each point and results in an alarm (trains have to stop), a limit (trains have to reduce their speed) or a clear (trains can continue as planned) being issued for each track section (see section 3.1 for details). A comparison of the estimated hazard (using PGA values from the alert maps) with the true hazard (using PGA from the shake maps) allows assessing the correct hazard identification at different points, time steps and for different earthquakes and thus reflects the performance of a network. Fig. 2, lower left panel, shows the percentage of points for which, based on the estimated PGA value, a correct alarm, limit or clear assessment has been made. The percentage increases with time (horizontal axis), since more and more information from the recorded P waves is included in the PGA estimations, which thus become more accurate. The hazard assessment for large magnitude earthquakes (magnitude increases with earthquake number, vertical axis) is not as good as for smaller events (lower left plot). Summed over all 305 earthquakes, the percentage of correctly identified points ranges between 91% and 97% in the first 11 seconds (upper left plot) for both the training and the test dataset. The lower central panel of Fig. 2 shows the relation of points with missed alarm/limits (corresponds to PGA being underestimated) to the total number of points where an alarm or limit should be issued. Small and distant earthquakes often do not generate large accelerations at any point along the railway lines, so that no alarm/limit is issued and the PGA cannot be underestimated (white lines). The lower right plot displays the relation of points with false alarms/limits (correspond to the PGA being overestimated) to the total number of points where a clear or only a limit and not
Fig. 2: Performance plot of a neural network
an alarm should be given. The PGA is often greatly underestimated at first and then, especially for larger magnitude events, slightly overestimated at later times. The upper central and right plots show the time dependence of the percentage of missed or false alarms/limits, respectively, summed over all events in the test (grey line) and training (black line) dataset. Performance plots of trained networks can vary greatly, depending on e.g. the chosen network training parameters, the initial neuron weights and the earthquakes that are in the training dataset.
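The alarm/limit/clear assessment used in this performance evaluation can be condensed into a small decision rule. The sketch below is a simplified illustration based on the thresholds given in section 3.1 (0.4 and 1.0 m/s², 80 km/h line speed); it is not the demonstrator's implementation.

```python
def hazard_decision(pga, line_speed_kmh):
    """Simplified alarm/limit/clear logic per track section (illustrative).

    pga            : estimated horizontal peak ground acceleration (m/s^2)
    line_speed_kmh : permitted speed of the line at this section
    Thresholds follow the 0.4 and 1.0 m/s^2 values given in section 3.1.
    """
    if pga < 0.4:
        return "clear"                # trains can continue as planned
    if line_speed_kmh > 80 or pga >= 1.0:
        return "alarm"                # stop trains, block the line
    return "limit"                    # reduce speed to below 20 km/h
```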
2.2 Ground movement measurements along railway lines In order to assess the options of densely instrumenting railway tracks with accelerometers, the noise level in the vicinity of railway lines has to be quantified. Several seismic measurements were carried out using a Lennartz seismometer as well as SOSEWIN accelerometers (Picozzi et al., 2009; Fleming et al., 2009), which have been developed in the Geotechnologien project EDIM (http://www.cedim.de/EDIM.php). The sensor type, the connection of the sensor to the ground, the distance of the sensor to the track and the track type were varied in these measurements in order to allow an appropriate evaluation of the possibility of extracting information on earthquake ground motion from accelerometers along railway lines. Furthermore, the measurements may provide a basis for the development of methods for continuous track state monitoring. 2.2.1 Measurements with SOSEWIN Sensors In the most recent measurements the SOSEWIN sensors were placed directly in a ballast track between two sleepers and the rails
Fig. 3: Examples of vertical ground accelerations caused by trains
under a ~20 cm layer of ballast. Passing freight trains generated large amplitudes of up to –8.1 m/s² on the vertical component (UD). The amplitudes recorded on the components parallel (PA) and perpendicular (PP) to the track are smaller by a factor of 20 and 10, respectively. Between train passages, amplitudes on all components are slightly larger than 0.01 m/s², which corresponds to the sensitivity of the sensor. As expected, the large accelerations caused by trains will usually overprint any P waves on the UD component. The duration of the train-induced accelerations, however, is very short. Fig. 3 shows the UD signal generated by a passenger train (left) and a freight train (right). The regular pattern of the largest amplitudes during a train passage indicates that they are caused by the axles. The signal is well defined and does not begin and end with a gradual increase or decrease in amplitude. Generally, the down time during a train passage can thus be expected to extend less than 1 second before and after the passage of a train, whose duration depends on the length and the speed of the train. The further away the sensor is from the track, the smaller the amplitudes of ground motion caused by trains are, as measurements at different distances to the track confirm. On the
other hand, due to scattering of the waves, the duration of the induced vibrations will increase (Chen et al., 2004). An interesting observation, made in particular for freight train passages, is that the acceleration is mostly or completely negative, i.e. downwards. This leads to an imbalance of the complete seismogram because the negative amplitudes are not equalized. The upward amplitudes could be suppressed by the consecutive forces induced by the trains' axles, which are especially large for freight trains due to their heavy weight, or by a destructive interference of the forces induced by all axles of the train. Also, the resolution of a 100 Hz recording may not allow the detection of high-frequency upward amplitudes. The effect is currently being studied in detail. 2.2.2 Signal to Noise Ratios Seismic noise along railway lines is characterized following an approach by McNamara et al. (2004 and 2009). The recorded data are divided into 82-second segments with 50% overlap. For frequencies ranging between 0.02 and 35 Hz, the power spectral density is calculated for each segment and smoothed in full-octave averages at 1/8-octave intervals. Powers for each 1/8-period interval are accumulated in 1 decibel (dB) power bins (dB with respect to 1 (m/s²)²/Hz). The statistical analysis of all
power bins then allows the calculation of probability density functions (PDFs) as a function of noise power for each of the octave bands. Fig. 4 shows a PDF of a 24 hour measurement with a SOSEWIN sensor placed in a ballast track, as described above. The limits of the high and low noise model (Peterson, 1993) are given by the black lines in Fig. 4. The peak power probability is almost constant at –62 dB in the period range of 0.1–5 seconds and corresponds to the sensitivity level of the instrument (1 mg). Thus, ambient noise levels in the track are smaller than signals that can be detected by the sensors. The peak probability decreases to smaller dB levels for lower periods due to the instrument filter function and smears out at higher periods due to the short segment lengths of 82 seconds. The small probability of dB levels above the peak power line is caused by the passing of trains.
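A condensed sketch of this PSD probability-density analysis is given below. The 82 s segment length, the 50% overlap and the 1 dB binning follow the text; the octave smoothing is omitted for brevity, and the function is an illustration rather than the EWS Transport processing code.

```python
import numpy as np
from scipy.signal import welch

def psd_pdf(acc, fs, seg_s=82.0, db_edges=np.arange(-200.0, 1.0, 1.0)):
    """Sketch of a McNamara-style PSD probability density estimate.

    acc : acceleration record in m/s^2, fs : sampling rate in Hz.
    The record is cut into 82 s segments with 50% overlap; each segment
    yields one PSD, and the powers (in dB relative to 1 (m/s^2)^2/Hz)
    are counted per frequency in 1 dB bins.
    """
    nper = int(seg_s * fs)
    step = nper // 2                                   # 50% overlap
    counts, freqs = None, None
    for start in range(0, len(acc) - nper + 1, step):
        freqs, pxx = welch(acc[start:start + nper], fs=fs, nperseg=nper)
        pxx_db = 10.0 * np.log10(np.maximum(pxx, 1e-30))
        idx = np.clip(np.digitize(pxx_db, db_edges) - 1, 0, len(db_edges) - 2)
        if counts is None:
            counts = np.zeros((len(freqs), len(db_edges) - 1))
        counts[np.arange(len(freqs)), idx] += 1        # one count per segment
    pdf = counts / counts.sum(axis=1, keepdims=True)   # PDF of power per frequency
    return freqs, db_edges[:-1], pdf
```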
The recorded trains did not excite characteristic frequencies and span a broad dB range from –62 to 0, increasing from lower to higher periods with a small decrease around 0.6 Hz. Therefore, train passages cannot be filtered easily and must be classified as down time on the UD component. On the PP component, maximum power probability levels of up to –42 dB are reached, and on the PA component they do not exceed –52 dB. Fig. 4 suggests that the noise level for SOSEWIN accelerometers integrated in the track is, at times without train passages, not higher than the sensitivity of the instrument, but well above the limits of the high noise model. Due to the low sensitivity of the instrument, it may be of limited use for seismological recording. However, accelerometers integrated into the track can be of relevance for the detection of strong motion and for structural health monitoring.
Fig. 4: PDF of a 24 h measurement with a SOSEWIN sensor (UD component)
3. Risk Reduction for Transport Lines In order to reduce the risk for railbound transportation systems during and after an earthquake, it is necessary to identify the relevant hazard and damage parameters, to define corresponding emergency measures, and to make reliable predictions of the expected infrastructure damage. As discussed in the following, we distinguish two main applications (use cases) of EWS Transport: (i) the early warning phase right before and during strong motion and (ii) the subsequent damage analysis. 3.1 Hazard parameters, thresholds, and emergency measures Concerning the early warning phase, the horizontal peak ground acceleration aPGA at the site of the track is identified as the crucial parameter that characterizes the hazard in train operation. In particular, acceleration thresholds above which train derailment is likely to occur, and pertinent emergency measures, are defined as a function of train velocity. According to current practice for high speed trains (Shinkansen), the local acceleration threshold is defined as aPGA = 0.4 m/s², above which the local power supply at the train position is switched off and an emergency braking of the train is initiated (Nakamura, 1996). These measures are applied directly without the intermediary of a train control center. As a first step, we adopt the local horizontal peak ground acceleration aPGA at the track site as the relevant hazard parameter, and aPGA = 0.4 m/s² as the critical threshold above which train operation must be considered unsafe and emergency train control measures are initiated. Emergency measures depend on the actual ground acceleration along the track and the type of railway system, e.g. high speed or low speed lines. For the early warning phase we currently employ two measures in EWS Transport: (a) for high speed lines with v > 80 km/h, trains are stopped and the corresponding lines are blocked if the horizontal ground acceleration is above 0.4 m/s²; (b) for low speed lines with v < 80 km/h, train speed is reduced to
v < 20 km/h if the horizontal ground acceleration exceeds 0.4 m/s2 but is below a second threshold usually taken as 1 m/s2 (Nakamura, 2008). For aPGA > 1 m/s2 trains are stopped and the track infrastructure is inspected on foot, before it is decided where and in which form train operation is resumed. This classification is based on the fact that high speed trains are more vulnerable to derailment than low speed trains. Derailment takes place when horizontal track forces Y, which increase quadratically with train speed, become larger than the vertical axle load Q. Track stability and safety against derailment requires that the ratio of horizontal to vertical track forces Y/Q be below the critical value Y/Q < 0.8 (Prud’homme, 1978, Lichtberger, 2004). Ground shaking and seismically induced track irregularities cause additional train velocity dependent horizontal forces so that fast trains reach the critical value Y/Q = 0.8 earlier than slow trains. 3.2 Vulnerability assessment, damage catalogue, and damage map For fast damage map generation immediately after the earthquake, the expected damage to the railway infrastructure is calculated as a function of local ground acceleration and the results are stored in a data base (damage catalogue). A damage catalogue is constructed for a specific railway network and geographical region. The »Deutsche Bahn AG« has supplied the EWS Transport research partners with data of its infrastructure in Baden-Württemberg. These data (GIS ArcView shape files) contain the entire railway network including information on the geographic location of stations, bridges, tunnels, as well as route number, electrification, etc. The damage catalogue is organized in the form of a matrix where the rows contain the various railway infrastructure elements (e.g. track, bridges, tunnels, buildings) and their main attributes (e.g. for bridges these are: length, number of spans, width, construction type, construction year, etc.) and the columns describe various damage classes together with
Fig. 5: Analytical and empirical fragility curves (Basöz and Mander, 1999)
the ground acceleration thresholds underlying the definition of these damage classes. A reliable damage classification for a given structure and seismic loading will be possible if the design loads for this structure and the local soil classes are known. For buildings in Germany design loads and soil classes are defined by DIN standards (DIN 4149, 2005). On the other hand, information on damage analyses for railway infrastructure is scarce. In the following we concentrate on assessing the seismic vulnerability of two railway infrastructure elements, namely bridges and straight tracks. 3.2.1 Railway Bridges For highway bridges a seismic damage analysis is available (Meskouris, 2007; Basöz and Mander, 1999), where five bridge damage classes (none, minor, moderate, major, complete) and the corresponding median peak ground accelerations are defined for various generic bridge construction types. Basöz and Mander define six bridge construction categories: (i) multi-column bents simply supported, (ii) single-column bents box girders, (iii) continuous concrete, (iv) continuous steel, (v) single span, (vi) long bridges (l > 150 m). For assessing the seismic vulnerability of a bridge empirically determined fragility curves are often used. The latter describe the probability P as a function of PGA that the actual
bridge damage D exceeds a certain damage state dsi, where the subscript i indicates one of the five damage states. These classifications and the corresponding fragility curves have been used for estimating railway bridge damage in EWS Transport. A fragility curve for a given bridge type and damage state is a lognormal probability function that is characterized by two parameters, a median value for the peak ground acceleration ai and a standard deviation σi. As an example, we show in Fig. 5 the comparison of analytical and empirical fragility curves for single span bridges with non-monolithic abutments for the damage class »major damage«. Presently, the EWS Transport database for the test area includes only two bridge classes, short bridges with a length l < 150 m and long bridges with l > 150 m. For both bridge types and the five damage classes the corresponding acceleration thresholds ai for i = 2 … 5 (medians) are taken from Basöz and Mander for »single span« and »long bridges« (see Tab. 1) and are integrated in the EWS Transport database. We currently define the five damage classes 1–5 as a function of the median ground acceleration thresholds a2 … a5 listed in Tab. 1 as follows: no damage (class 1) if a < a2; minor damage (class 2) if a2 ≤ a < a3;
Tab. 1: Acceleration thresholds for two bridge types and damage classes 1–5 (see text)

a [m/s2]         a2     a3     a4     a5
short bridges    7.8    8.8    10.8   15.7
long bridges     3.9    4.9    5.9    7.8
moderate damage (class 3) if a3 ≤ a < a4; major damage (class 4) if a4 ≤ a < a5; complete damage (class 5) if a5 ≤ a. These thresholds are used as input values for the damage assessment and damage map generation that is discussed in more detail in section 4. Although railway bridges are generally recognized as the most vulnerable infrastructure elements of a railway system, there are various other and more frequently occurring causes for system failure. Examples are seismically induced liquefaction, track buckling, as well as landslides, which render a railway line unserviceable. For assessing the seismic vulnerability of a straight track segment empirical fragility curves are published (Pitilakis, 2009) but acceleration thresholds for specific railway track systems are not available. Therefore, we investigate seismically induced track buckling (see Fig. 6), which is a frequently occurring failure mode, and estimate for different track systems the critical lateral accelerations beyond which major track damage will occur. 3.2.2 Railway Track The seismic vulnerability of a straight track segment is mainly characterized by the lateral track resistance w [N/m]. The latter is defined as the critical horizontal force which leads to a lateral sleeper displacement of 2 mm, beyond which the static Coulomb friction goes over into sliding friction. The lateral track resistance depends on the type of track superstructure (ballast track or slab track), the type of sleepers (wooden or concrete sleepers) as well as the density and granular composition of the ballast bed. If the lateral forces due to ground acceleration exceed the critical horizontal force, permanent track deformation (buckling) will result.
We consider several limiting situations. First, for a ballasted track with wooden sleepers and tamped ballast, the empirically determined lateral track resistance is 5 kN/m (Fendrich, 2007). Using a track mass per unit length (including rails, sleepers and the upper part of the ballast bed) of 1300 kg/m one obtains a critical horizontal acceleration of acr = 4 m/s2 beyond which track stability is no longer guaranteed. Second, for a ballast track with concrete B70 sleepers and stabilized ballast bed, the lateral resistance is increased to 17 kN/m (Fendrich, 2007). With a corresponding track mass of 1700 kg/m the critical horizontal acceleration is 10 m/s2. These figures do not yet take into account that the lateral track resistance is lowered in the presence of vertically upward directed seismic forces. To assess the vulnerability of a free track segment exposed to vertical seismic forces, we take the empirical lateral resistance 1 kN/m for a track with wooden sleepers lying on top of the ballast (Lichtberger, 2004) as the worst case scenario. For this situation we find a critical lateral acceleration of 3 m/s2. Third, for slab track systems, which (in Germany) are mainly used for high speed lines, the lateral resistance and hence the critical horizontal acceleration are in general considerably higher (>> 10 m/s2) than for a ballasted track. Therefore, with respect to lateral track stability, the seismic vulnerability of a slab track system is very much reduced compared to the standard ballast track. Currently, we employ two damage classes (no damage and major damage) and a critical threshold value of aPGA = 4 m/s2 above which
Tab. 2: Acceleration thresholds for various railway track systems (see text)

Track system                    w [kN/m]   acr [m/s2]
ballast (wooden sleepers)       5.0        4.0
ballast (concrete sleepers)     17.0       10.0
no ballast (wooden sleepers)    1.0        3.0
slab track                      >> 17      >> 10
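The critical accelerations listed in Tab. 2 follow from the ratio of lateral track resistance to track mass per unit length, acr = w/m. A minimal check for the two ballast-track cases whose masses are quoted in the text (the values are hard-coded here purely for illustration):

```python
# critical lateral acceleration a_cr = w / m  (w in N/m, m in kg/m)
track_cases = {
    "ballast, wooden sleepers":   {"w_N_per_m": 5_000.0,  "m_kg_per_m": 1_300.0},
    "ballast, concrete sleepers": {"w_N_per_m": 17_000.0, "m_kg_per_m": 1_700.0},
}

for name, case in track_cases.items():
    a_cr = case["w_N_per_m"] / case["m_kg_per_m"]
    print(f"{name}: a_cr = {a_cr:.1f} m/s^2")
# -> about 3.8 m/s^2 (rounded to 4 in the text) and 10 m/s^2
```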
Fig. 6: Track buckling (China, Tangshan, 1976) from nisee.berkeley.edu
major track damage results. It should be emphasized that the acceleration thresholds compiled in table 2 are crude estimates. Further research is required to substantiate, or if necessary to modify the above thresholds for a straight track segment. Although the empirical lateral track resistance w includes the effect of temperature related longitudinal rail stresses on lateral track stability, we nevertheless investigated the critical buckling force Pcr of a continuously welded track. Due to the constrained thermal expansion of continuously welded rails, large axial (longitudinal) compression forces are generated inside the rails. If these axial forces exceed a certain critical value, even small additional horizontal forces caused either by track irregularities or by seismic influences will lead to the typical sinusoidal lateral track buckling (see Fig. 6). For orientation we employ Euler’s buckling theory combined with Winkler’s elastic foundation model to calculate the critical
compression load Pcr for a beam (rail) fixed at both ends, which is given by Pcr = √(kEI). Here, EI is the rail's lateral bending stiffness, where E [N/m2] is the elastic modulus of steel and I [m4] the second area moment of the rail cross section. Winkler's elastic foundation parameter in the lateral direction is denoted by k [N/m2]. For a UIC 60 rail profile and typical values for k, one obtains Pcr = 1072 kN. This corresponds to a critical temperature difference ΔT ≈ 56 K, which may very well occur in practice. Furthermore, the expression for the critical buckling load clearly shows that a seismically induced decrease of the lateral track stiffness parameter k (related to the lateral track resistance w) leads to a corresponding decrease of the critical buckling load and to an increased vulnerability of the track. Separate investigations are necessary for curves and switches because these infrastructure elements have higher demands on track stability and hence an increased seismic vulnerability.
In addition, for tracks laid on an embankment or in a cut, the seismic vulnerability of these earth structures has to be taken into account. 3.3 Earthquake early warning and permanent infrastructure monitoring An interesting aspect of railbound systems is the possibility, in principle, to employ the existing rail network for earthquake and damage detection as well as for permanent infrastructure monitoring. The earthquake early warning system would then become part of the standard train control and protection system. The train control and protection infrastructure is currently being redesigned with the aim of achieving technical interoperability between different EU member states (Berger, 2004). In the following we investigate whether track integrated sensors can be used for early detection purposes, fast damage map generation, as well as for continuous track state monitoring. Track integrated accelerometers may be used for early warning purposes if (i) train passage induced vibrations can be clearly distinguished from a seismic event and if (ii) there is a sufficient number of sensors that remain unaffected by train induced background noise. We have studied the spectrum of train induced ground vibrations over a wide frequency range between 1 Hz and several hundred Hz using the model of a Bernoulli-Euler beam (rail) on an elastic Winkler foundation (representing rail fastening, sleepers, ballast, subgrade), which in railway engineering is known as Winkler's model (Esveld, 1989). In the low frequency domain (f < 16 Hz), bending deformation of the subgrade due to the moving load is the dominant excitation mechanism. For train velocities v between 20 km/h and 160 km/h the corresponding excitation frequencies lie in the seismically relevant region 1 Hz < f < 8 Hz. Higher frequency excitations come from periodically changing loads, e.g. due to the so-called discrete support frequency f = v/a, where a [m] is the sleeper distance, with excitation frequencies between 9 Hz < f < 74 Hz for train speeds 20 km/h < v < 160 km/h. Still higher frequency
vibrations result from global superstructure resonances, which depending on the railway system lie typically between 60 Hz (embedded rail system) and 130 Hz (ballast track). The moving load, discrete rail support, and superstructure excitations have a clear signature in experimental vibration spectra. In contrast to a seismic signal, a train passage spectrum will always contain these characteristic high frequency components. Thus, a train passage can be clearly distinguished from a seismic signal. An important issue is the relative down time of a given sensor because of train passages, during which the relatively small seismic Pwave signals cannot be detected reliably because they will be buried under a huge background of train induced vibrations (for details see section 2). Using the railway line with the highest train frequency of 750 trains/24h, an average train velocity of 100 km/h, and an effective train length of 2500 m (including an envelope of train induced track vibrations of 1000 m in front of and behind the train) we obtain a relative down time per sensor of 75%. However, the average relative down time per sensor for the entire test area is less than 1%. A second estimate is based on the spatial distribution of trains on the rail network of total track length LG = 4745 km. Assuming an average distance between track acceleration sensors of 1 km (typical railway block distance) the test area contains 4745 sensors. At a given time, the average number of trains on the network is about 360 occupying a track length L = (360) (2500 m) = 900 km. Compared to the total network length LG, this amounts to 19%. These estimates show that at any point and time there is a sufficient number of sensors available for earthquake early warning. The same dense network of track integrated acceleration sensors can also be used for permanent infrastructure monitoring. An accelerometer network can detect critical deviations from regular track geometry and rail head irregularities because when a train passes over these defects, it will generate charac-
teristic traces in the vibration spectra. In addition, irregularities of the rolling stock, such as wheel flats, polygonized wheels and broken axles, cause typical track vibration signals that can be detected (Müller-Borrutau, 2009). Aside from standard piezoelectric accelerometers, modern laser and fiber optic sensor techniques avoiding electromagnetic interference with existing track based train control equipment are nowadays available and may be employed to the benefit of both railway infrastructure and rolling stock operators. The various aspects of track integrated sensors and their use for earthquake early warning and permanent infrastructure monitoring are described in more detail elsewhere (Quante, 2009; Eisenmann, 2009). 4. Information System for Earthquake Early Warning 4.1 System Architecture Early detection of an earthquake in combination with alert creation and, finally, the possible manipulation of rail traffic is a complex task. Therefore the early warning system needs an information system addressing key aspects such as access to data, data processing and the visualisation of results. This can be achieved by the specification of an appropriate system architecture. The defined architecture and its principles should be tested in a first test implementation. This section describes the architecture of the information system which has been designed for the EWS Transport project and the test implementation of this architecture in the »Earthquake Early Warning Simulator«. The architecture is designed according to the principles of an open architecture with standard-based interfaces of system components. These interfaces ensure the flexible and easy exchange of components and are especially advantageous for the joint use of system components between cooperating institutions. The architecture is designed according to the views of the ISO »Reference Model for Open Distributed Processing« (RM-ODP, 2009) and takes principles of the ORCHESTRA and
SANY project into account (RM-ODP, 2009) (ORCHESTRA, 2009) (SANY, 2009). Several use cases for the early warning system have been defined in the Enterprise Viewpoint of the architecture (Hohnecker et al., 2008). The two main scenarios are the use cases »Earthquake Early Warning System« and »Structural Health Monitoring«. The use case »Earthquake Early Warning System« describes the workflow starting with the detection of an earthquake via the hazard estimation (see section 3.1) through to the notification of the railway operating company. The use case »Structural Health Monitoring« describes the workflow starting with the detection of an earthquake through to the damage estimation for endangered infrastructure elements such as railway tracks or bridges. The theoretical background is described in section 3.2. A similar workflow could also be used to realize the permanent infrastructure monitoring described in section 3.3. Based on these scenarios, the architecture defines main building blocks concentrating semantically related parts of the system in the Information Viewpoint. The block »Sensor System & Early Detection« realizes the early detection of the earthquake. This block encapsulates the sensor network from the information system. The block »Assessment & Decision« combines the earthquake information delivered by the first block with the railway infrastructure to identify potential hazards and damages. The block »Action Execution« is responsible for the execution of actions proposed by the block »Assessment & Decision«, while the block »Visualisation & Analysis« visualizes the results and provides analysis functionality. The main functional blocks were filled with the services necessary for the realisation of the scenarios in the Service Viewpoint of the architecture. Wherever possible, open geospatial standards defining service interfaces were used. This ensures, on the one hand, that system components can easily be exchanged and, on the other hand, that information produced by the system can be accessed from outside the
Fig. 7: Standard Based Interfaces in Early Warning System Architecture
system. Fig. 7 shows the resulting architecture which differs partly from the proposed architecture in 2008, due to experiences with the first test implementation. The following services defined by the Open Geospatial Consortium (OGC) have been identified for the usage in the information system: – The Sensor Observation Service (SOS) provides means for accessing observed data (Nah, Priest, 2007). EWS Transport has defined a new observation type for the SOS according to the Observation und Measurement Model of the OGC. The observation type describes an earthquake with the following parameters: location of hypocenter, magnitude and an acceleration field containing peak ground acceleration values in the area, which is monitored by the sensor network. The defined observation type can be used for the creation of alert maps and shake maps during and after the strong motion phase of an earthquake. The acceleration field in the newly defined observation type is a coverage. The idea to deliver coverages through an SOS rather than the conventional time series data has been developed in SANY and successfully tested in the SANY Fusion SOS (Kunz et al., 2009). This approach has been re-used by EWS Transport.
– The Web Processing Service (WPS) provides means for the execution of geospatial processes (Schut, 2007). EWS Transport has defined new processes for the WPS. One algorithm realizes the hazard analysis of the railway tracks. This information can be used for train alerts. Another algorithm calculates the estimated damages to the railway infrastructure elements (railway tracks and bridges). Finally, a third algorithm transforms earthquake peak ground acceleration values to earthquake intensity values. – The Web Feature Service (WFS) provides means for accessing geospatial data. It has been used in EWS Transport for two purposes. Firstly it provides access to the geospatial railway infrastructure data (railway tracks and bridges). These are the elements contained in the damage catalogue described in section 3.2. Secondly it provides access to the result elements created by the early warning system, such that external components can easily access the result layers via the WFS interface. – The Web Map Service (WMS) provides means for accessing maps. It has been used in EWS Transport for the visualization of the results of the early warning system. The following service defined by OASIS has been identified for the usage in the information system:
– The Web Service Notification (WSN) Standard provides interfaces for the realization of event-based alerts (Graham et al., 2006). A component interested in receiving alerts (a consumer) can subscribe to a broker for specific topics. EWS Transport has defined the following topics: The »AlertMap« topic represents the event that an earthquake has just been detected and that proposed earthquake parameters are available. The »ShakeMap« topic represents the event that the earthquake has finished and that measured earthquake values are available. The »DamageMap« topic represents the event that the early warning system has finished its calculations and that the results are available for consumers outside the system. Once an event occurs, the component which has detected the event notifies the broker of the occurrence of the specific topic. The detecting components of EWS Transport are the Earthquake Early Detection component for the AlertMap and ShakeMap topics and the client for the DamageMap topic. The broker forwards the notification to the consumers which have subscribed to this topic. The notifications sent by the components are structured using the Common Alerting Protocol (CAP) (Jones, Botterell, 2005). The following service defined by the project ORCHESTRA has been identified for the usage in the information system: – The Map and Diagram Service (MDS) is an extension of the WMS (Iosifescu-Enescu, 2007). It offers possibilities for the visualization of coverages in the Ascii Grid format. This format is created from the earthquake acceleration field contained in the newly defined SOS observation. The usage of the MDS can be recommended for the visualization of alert maps, shake maps and intensity maps.
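The topic logic described above can be illustrated with a schematic publish/subscribe sketch; this is not the WS-Notification/CAP implementation used in EWS Transport, and all names and message contents are illustrative:

```python
from collections import defaultdict
from typing import Callable

class TopicBroker:
    """Schematic broker: consumers subscribe to topics, detecting
    components notify the broker, and the broker forwards the message."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[dict], None]):
        self._subscribers[topic].append(callback)

    def notify(self, topic: str, message: dict):
        for callback in self._subscribers[topic]:
            callback(message)

broker = TopicBroker()
broker.subscribe("AlertMap", lambda msg: print("client: hazard analysis", msg))
broker.subscribe("ShakeMap", lambda msg: print("client: damage estimation", msg))
broker.subscribe("DamageMap", lambda msg: print("external consumer", msg))

# the earthquake early detection component publishes proposed parameters
broker.notify("AlertMap", {"magnitude": 6.0, "hypocenter": (48.8, 9.2, 10.0)})
```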
4.2 Earthquake Early Warning Simulator The architecture described in the previous section has been tested in a first implementation: the Earthquake Early Warning Simulator. It realizes the two main scenarios »Earthquake Early Warning« and »Structural Health Monitoring«. Both use cases have been mapped to the following workflow: 1. The user selects one of the available earthquakes for the simulation in the demonstrator's »Live Client« and triggers the simulation by hitting the »Start« button. 2. The Earthquake Early Detection component uses PreSEIS for the early detection of the simulated earthquake (see section 2.1.2). Once PreSEIS has detected the earthquake, an observation of the newly defined type is generated, containing an alert map, and uploaded to the SOS. The Earthquake Early Detection component triggers an AlertMap event. 3. The Client of the early warning system listens continuously for events. Once it receives the AlertMap event, the information is forwarded to the hazard algorithm of the WPS, which identifies endangered railway tracks. This information can be used to influence the train traffic. 4. Once the strong motion phase of the earthquake has passed, PreSEIS calculates the shake map, uploads it to the SOS and triggers the ShakeMap event. 5. When the Client receives the ShakeMap event, it calculates the potential damages for the railway tracks and bridges and the earthquake intensity map by means of the WPS.
Fig. 8: Selection of Earthquake to be simulated
Fig. 9: Dark: Block Track; Light: Speed Limit for Track
Fig. 10: Estimated Bridge Damage
6. Finally the client stores all calculated results in the WFS and visualizes the results via the MDS together with railway infrastructure data and topographical data accessible via WMS. The DamageMap notification is created by the client and sent to components interested in receiving results from the earthquake early warning system. Web Services were chosen for the implementation of the demonstrator. The advantages are, firstly that available open source components could be tested and used for the implementation and secondly that the demonstrator is online accessible for demos. The following components were used for the implementation and extended with the work described in section 4.2: the Fusion SOS of SANY, the WPS of 52North, The WMS and WFS of GeoServer, the MDS of the ETH Zürich and the WSN of Apache ServiceMix (52North, 2009) (GeoServer, 2009) (QGIS Mapserver, 2009) (Apache ServiceMix, 2009). The Earthquake
Early Warning Simulator is online accessible under: http://ews-transport.iitb.fraunhofer.de. 4.3 Future Potential for the Information System The open service architecture of the early warning system provides the possibility to share results via well defined standard based interfaces. This will be tested in the following scenario. Once an earthquake has occurred and caused severe damage, rescue measures need to be planned. Here the »Lagetisch« a multi display system can help to coordinate the necessary actions. The Lagetisch provides means for accessing data from geospatial standards like WMS and WFS and therefore is able to directly access and visualize results of the early warning system. Another aspect of the standard based interoperability can be tested with the access to external sensor data available via Sensor Observation Services. Access to data from ex-
ternal SOS could be integrated easily into the early warning component to improve the outcomes of the »Sensor System & Early Detection« block. First simple tests with another SOS provided by the SLEWS project (http://www.slews.de) show that the technical integration is easy due to the standard base interfaces used in the architecture. These examples show that external systems can easily cooperate with the early warning system designed by EWS Transport. This is promising for future extensions of the system which could not only include the transfer of the system in areas of significant seismic activity but which could also include the integration of other natural hazards (e.g. landslides) or the extension of the monitored infrastructure. 5. Summary and Outlook Major results of the EWS Transport research project, which focuses on earthquake early warning for railway lines, are briefly summarized below. Synthetic seismic data can be generated at any point in the test area, therefore the performance of different sensor deployment scenarios can be analysed and compared. Real ground acceleration measurements along railway lines allow to characterize signal to noise ratios and to study the possible use for structural health monitoring. Sensors are nowadays available at low costs, so the implementation of a large number of sensors is realistic. Optical fiber sensors, for example, are not affected by electric and magnetic fields and thus may be especially appropriate for EWS Transport purposes. For the early warning phase, local ground acceleration thresholds for low and high speed railway lines and pertinent emergency measures are defined. Furthermore, the seismic vulnerability of various railway infrastructure components is assessed using published fragility curves and established railway engineering models. Corresponding ground acceleration thresholds are integrated in a damage catalogue which forms the basis of a post earthquake damage map.
The standard-based design which is chosen for the architecture of the Early Warning System offers possibilities for cooperations outside the project. The design has been tested successfully in the implementation of the Earthquake Early Warning Simulator and shows potential for the interoperability with external systems, as first tests with a multi display system for emergency planning and the project SLEWS show. The online demonstrator models EWS Transport’s entire early warning chain ranging from earthquake detection and risk reduction measures for railway transportation to damage map generation. This tool is the basis for optimizing the various subtasks of the early warning chain as well as their interrelations. It allows the incorporation of data from high seismicity areas, if available, and thus guarantees the portability of our results. 6. References Apache ServiceMix (visited in July 2009) Apache ServiceMix, http://servicemix.apache.org/ home.html Berger, R. et al. (2005), The way to coordinated deployment of ERTMS/ETCS throughout the European Network, Railway Technical Review 4, 21, 2005. Basöz, N. and Mander, J. (1999), Enhancement of Highway Transportation Lifeline Module in HAZUS, Draft 7, Federal Emergency Management Agency (FEMA) and National Institute for Building Sciences, 1999. Böse, M., Wenzel, F. and Erdik, M. (2008) PreSEIS: A Neural Network-Based Approach to Earthquake Early Warning for Finite Faults, Bull. Seism, Soc. Am., 98 (1), 366–382. Chen, Q., Li, l., Li, G., Chen, L., Peng, W., Tang, Y., Chen, Y., Wang, F. (2004) Seismic features of vibration induced by train. Acta Seismolgica Sinica, 17 (6), 715–724. DIN 4149-2005-04 (2005), Bauten in deutschen Erdbebengebieten – Lastannahmen, Bemessung und Ausführung üblicher Hochbauten , 2005.
Esveld, C. (1989), Modern Railway Track, MRT Productions, Duisburg,1989. Eisenmann, Th. (2009) Fiber Optical Sensing with Fiber Bragg Gratings, BMBF Geotechnologien Science Report 14, 2009. Fendrich, L. (Ed.) (2007) , Handbuch Gleisinfrastruktur, Springer Verlag, Berlin, 2007. Fleming, K., Picozzi, M., Milkereit, C., Kuehnlenz, F., Lichtblau, B., Fischer, J., Zulfikar, C., Ozel, O., and the SAFER and EDIM working groups (accepted 2009) The Self-Organising Seismic Early Warning Information Network (SOSEWIN), to be published by Seismological Research Letters. GeoServer (visited in June 2009) GeoServer, http://geoserver.org Graham S., Hull D., Murray B. (eds.) (2006): OASIS Web Services Base Notification 1.3 (WS-BaseNotification), http://docs.oasisopen.org/wsn/wsn-ws_base_notification-1.3spec-os.pdf Grünthal, G. and Bosse, Ch. (1996) Probabilistische Karte der Erbebengefährdung der Bundesrepublik Deutschland – Erdbebenzonierungskarte für das Nationale Anwendungsdokument zum Eurocode 8, GeoForschungsZentrum Potsdam, STR 96/10, 24 pp. Grünthal, G., Mayer-Rosa, D., and Lenhardt, W. A. (1998) Abschätzung der Erdbebengefährdung für die D-A-CH-Staaten – Deutschland, Österreich, Schweiz, Bautechnik, 10, 753–767. Hohnecker E. et al (2008) Early Warning System for Transport Lines, Geotechnologien, Frühwarnsysteme, Statusseminar Osnabrück. Iosifescu-Enesu, I. (2007) ORCHESTRA, Specification of the Map and Diagram Service, http://www.eu-orchestra.org/docs/OA-Specs/ Map_and_Diagram_Service_Specification_v3.0 -ETHZ.pdf
Jones E., Botterell A. (eds.) (2005) OASIS Common Alerting Protocol, v1.1, http://www.oasisopen.org/committees/download.php/15135/e mergency-CAPv1.1-Corrected_DOM.pdf Köhler, N., Cua, G., Wenzel, F., and M. Böse (accepted: June 2009) Rapid source parameter estimations of southern California earthquakes using PreSEIS. Seism. Res. Lett., in press. Kunz, Usländer, Watson (2009) A Testbed for Sensor Service Networks and the Fusion SOS: towards plug & measure in sensor networks for environmental monitoring with OGC standards, accepted by 18th World IMAC/ MODSIM Congress, Cairns, Australia. Leydecker, G. (2008) Erdbebenkatalog für die Bundesrepublik Deutschland mit Randgebieten ab dem Jahre 800; Datafile: http://www.bgr.de/ quakecat; Bundesanstalt für Geowissenschaften und Rohstoffe (BGR); Version 22.05.2008. Lichtberger, B. (2004), Handbuch Gleis, Tetzlaff Verlag, Hamburg 2004. Meskouris, K. et al. (2007), Gefährdungsabschätzung von Brücken in Deutschland unter Erdbebenbelastung, Forschung Straßenbau und Straßenverkehrstechnik 952, Bundesministerium für Verkehr, Bau und Stadtentwicklung, 2007. Müller-Borrutau, F, Breitsamer, N., Pieper, S. (2009), Railway Technical Review 1, 24–29. Nakamura, Y. (1996), Real-time information systems for seismic hazards mitigation UrEDAS, HERAS and PIC, Quaterly Review of RTRI, Vol. 37, No. 3, 1996. Nakamura, Y. (2008), private communication McNamara, D. E., Buland, R. P. (2004) Ambient Noise Levels in the Continental United States. Bull. Seism. Soc. Am., 94 (4), 1517–1527. Mc Namara, D. E., Hutt, C. R., Gee, L. S., Benz, H. M., Buland, R. P. (2009) A Method to
Establish Seismic Noise Baselins for Automated Station Assessment, Seism. Res. Lett., 80 (4), 628–637. Nah, A., Priest, M. (eds.) (2007) Sensor Observation Service, Open Geospatial Consortium, OGC 06-009r6 ORCHESTRA (visited in July 2009) ORCHESTRA, http://www.eu-orchestra.org/ Peterson, J. (1993) Observation and modeling of seismic background noise. U.S.G.S. Open File Report, 93–322. Picozzi, M., Milkereit, C., Zulfikar, C., Fleming, K., Ditommaso, R., Erdik, M., Zschau, J., Fischer, J., Safak, E., Özel, O., and Apaydin N. (2009) Wireless technologies for the monitoring of strategic civil infrastructures: an ambient vibration test on the Fatih Sultan Mehmet Suspension Bridge in Istanbul, Turkey. In press on Bulletin of Earthquake Engineering, DOI 10.1007/s10518-009-91327. Pitilakis, K., Argyroudis, S. (2009) Seismic risk assessment of lifeline systems with emphasis to transportation systems, BMBF Geotechnologien Science Report 14, 2009. Prud’homme, A.,(1978) Forces and behaviour of railroad tracks at very high train speeds, Railroad Track Mechanics and Technology, ed. A. D. Kerr, Pergamon Press, Oxford, 1978. Quante, F. and Schnellbögl, G. (2009). Railway Infrastructure and Seismic Early Warning Systems BMBF Geotechnologien Science Report 14, 2009. QGIS Mapserver (visited in June 2009) Implementing an easy to use and cartographically rich Web Map Server, http://karlinapp.ethz.ch/ qgis_wms/index.html RM-ODP (visited in July 2009) ISO Reference Model for Open Distributed Processing, http://www.rm-odp.net/
SANY (visited in July 2009) Sensors Anywhere, http://sany-ip.eu/ Schut P (ed.) (2007) OpenGIS Web Processing Service, Open Geospatial Inc., OGC 05-007r7 Sokolov V. and Wenzel F. (2008) Toward realistic ground motion prediction models for Baden-Württemberg, Germany, Proceedings of the 31 General Assembly of European Seismological Commission, September 2008, Crete, Greece, CD ROM. 52North, (visited in June 2009) WPS, http://52north.org/maven/project-sites/wps/ 52n-wps-webapp/
Status & Results of G-SEIS – With Focus on Real-time GNSS Software Development Chen J., Ge M., Gendt G.*, Dousa J., Ramatschi M., Falck C., Schöne T. Department of Geodesy and Remote Sensing Helmholtz-Zentrum Potsdam, Deutsches GeoForschungsZentrum (GFZ) 14473 Potsdam, Germany *Coordinator of the project: Dr. sc. nat. G. Gendt
1 Introduction Natural hazards are of major concern to society. Huge losses have been reported in recent years. It is widely believed that modern GNSS (Global Navigation Satellite System) technologies are effective in hazard monitoring, detection and modeling. Considering current accuracy and reliability, however, more sophisticated strategies and models have to be developed. The G-SEIS (GPS – SurfacE Deformations withIn Seconds) project focuses on the monitoring of Earth surface deformations on different temporal and spatial scales based on GPS and other GNSS systems (e.g. Galileo). The aim of G-SEIS is to develop a monitoring system for hazards such as landslides and volcanic eruptions, or for the detection of earthquake precursors. Within G-SEIS the existing expertise at GFZ is used to enhance existing GNSS hardware towards real-time capability and to develop state-of-the-art real-time data analysis software for deformation monitoring at the centimeter to millimeter level. This report summarizes the current status of the G-SEIS project. The main focus is on recent results of GNSS analysis software developments for real-time applications, where multi-technique (e.g. SLR and GNSS) and multi-system (e.g. GPS and Galileo) data sources can be handled in a unified way.
2 G-SEIS Objectives The main work packages of the G-SEIS project focus on both, hardware and software components. On the hardware side: – new multi-sensor stations for unattended and autonomous long-term operation will be designed, including hybrid power systems for remote regions, – high data rates (of up to 20–50 Hz) should be possible, – combination with seismic sensors, tide gauges and weather sensors should be possible, – modern data transfer techniques (including satellite and wireless terrestrial data communication technologies) should be developed and tested. On the software side, a new and automated real-time GPS/GNSS data analysis software for centimeter to millimeter positioning is being developed. This includes: – acquisition and pre-processing of real-time raw data streams – high-precision satellite orbit and clock estimation at the centimeter level – fast data processing algorithms (filter techniques) for high-precision positioning – reliable automated event detection – product and warning dissemination via TCP/IP
Fig.1: GFZ GSS (Geodetic Sensor Station)
In addition, all newly developed components will be tested in several application studies. This includes high data rate studies (of up to 20 to 50 Hz), system studies (including the GLONASS- and Galileo-system) and prototype application studies in selected regions with seismic activities. 3 G-SEIS Status 3.1 Sensor Stations (WP 200) Based on GFZ’s experience of GPS sensor station development in other projects (such as German Indonesian Tsunami Early Warning System – GITEWS) new hardware has been designed and tested for G-SEIS (Fig. 1). The main features of a newly designed 19-inch rack are: – suitable for any kind of commercial GNSS receiver – ›no moving parts design‹: removable solid state hard disks, no fans for power supply or CPU
– ›12 Volt design‹: all components operated with 12 Volt DC (except monitor) – ›off the shelf design‹: components are bought from regular providers – no single vendor source – integrated LCD monitor, keyboard and mouse – low power consumption – high internal UPS autonomy endurance – climate chamber tested from –20° C to +40° C – virtually no noise production under normal conditions – secure VPN connection through VPN router, tested e.g. with VSAT – high redundancy: two PC’s installed – high remote management capability down to PC BIOS level – remote temperature and voltage logging – upgradable for future requirements Within WP220 existing technology used in the GITEWS project has been integrated into the
Fig. 2: Data communication techniques. Left: BGAN system, Right: VSAT system
newly developed sensor station. Different concepts have been tested to reduce power consumption and to enhance reliability. Further investigations are currently in progress. In WP230 (Arrays of GPS-based Sensor Stations) a first prototype network of field sensor stations has been set up to test communication systems between the sensor stations (Array Slave Stations, ASS), the Array Master Station (AMS) and a Central Control Station (CCS) which monitors the array performance, downloads/archives data and products on a regular basis. 3.2 Data Transfer and Communication (WP 300) Within WP 310 different technologies such as VSAT, BGAN (Broadband Global Area Network), HF or L-Band communication are tested or used for data communication from the sensor stations to the processing center. Currently, the most reliable and thus most promising communication technique is based on BGAN, a service of INMARSAT. This technique enables high data rates with low power consumption. Fig. 2 shows example installations.
WP 320 (›First mile‹ Communication Issues): For the so-called ›first mile‹ communication (i.e., from the GNSS sensor to the next public internet or satellite-based network connection node) currently a ›Radio Modem Link‹ is used. First devices for both one-way and two-way communication have been purchased and are currently under investigation. Currently, most of the real-time data streams are based on Ntrip technology (Networked Transport of RTCM via Internet Protocol). This technology has been developed by the Federal Agency of Cartography and Geodesy (BKG) and the University of Dortmund. It enables the transfer of high-rate GNSS raw data via TCP/IP and the HTTP protocol. In the first few months of the project (in WP 330 Data Transfer & Streaming), several performance checks were conducted and led to the conclusion that up to 100 data streams of 1-Hz data can easily be handled by both the Ntrip broadcasters and the GFZ data acquisition software (so-called Ntrip clients). In addition to the use of existing real-time data streams of IGS stations, a commercial Ntrip broadcaster has been set up and most of the GFZ GNSS sensor stations have been upgraded towards data dissemination via the Ntrip technique.
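For illustration, a minimal Ntrip 1.0 request over a raw socket is sketched below; caster address, mount point and credentials are placeholders, and error handling as well as RTCM decoding are omitted:

```python
import base64
import socket

CASTER, PORT = "caster.example.org", 2101      # placeholder caster
MOUNTPOINT, USER, PASSWORD = "STAT0", "user", "pass"

credentials = base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()
request = (
    f"GET /{MOUNTPOINT} HTTP/1.0\r\n"
    f"User-Agent: NTRIP example-client/1.0\r\n"
    f"Authorization: Basic {credentials}\r\n\r\n"
)

with socket.create_connection((CASTER, PORT), timeout=10) as sock:
    sock.sendall(request.encode())
    header = sock.recv(1024)                   # expect 'ICY 200 OK' from the caster
    print(header.decode(errors="replace"))
    chunk = sock.recv(4096)                    # first chunk of the raw RTCM stream
    print(f"received {len(chunk)} bytes")
```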
Fig. 3: Structure and data flow of the network solution for deformation monitoring (Regional deformation, volcano, and landslide) and for precise positioning of moving platform (airplane, ship etc)
Presently the station and communication concept is also used for GPS and hydro-meteorological stations in Central Asia to gain more experience in various climate regions. 3.3 Analysis Software Development (WP 400) A newly designed software package, EPOS-RT (with SRIF being the main program), is being developed for both Network-mode and PPP-mode, and it is planned that this software will be the basis of the future GNSS software at GFZ. Many new features have been implemented; for more details one can refer to (Ge et al. 2008, 2009; Chen et al. 2008, 2009). 3.3.1 Data analysis scenarios of EPOS-RT EPOS-RT applies an undifferenced, epoch-wise processing strategy and is designed to support multiple processing modes, such as:
– real-time / post-mission processing – static / kinematic / dynamic positioning – … Using the same software, two real-time processing scenarios are supported. The first one is the so-called »real-time network monitoring« (Fig. 3). In network monitoring we use a regional network, where several reference stations are fixed and satellite clocks, station tropospheric parameters and ambiguities, together with »rover« clocks and kinematic coordinates, are estimated. The other scenario is the »PPP (Precise Point Positioning) based positioning service« (Fig. 4). The PPP based positioning service involves two steps. Firstly, real-time satellite clocks are estimated based on a reference network. They are then disseminated together with real-time orbits to the users, who calculate their precise positions thereafter.
Fig. 4: Structure and data flow of the PPP based positioning service. Data from a global distributed reference network are received at the data processing center. Satellite clocks are estimated epoch by epoch in real-time by fixing the estimated orbits. The satellite orbits, and clocks are disseminated through internet. The user receives the related information and performs single point positioning
Both scenarios basically comprise three parts: data communication, processing kernel, and product service. Different types of data, either file-based or real-time streams (via internet), are imported into the data processing kernel through the data communication part. The data processing kernel can run in Network-mode and PPP-mode, where only one »control« file needs to be adapted when switching from one approach to the other. In the Network-mode approach, parameters – including station coordinates and clocks, satellite orbits and clocks, troposphere parameters etc. – may be set up, and the coordinates of targeted stations are monitored. In PPP-mode, station coordinates and clocks together with troposphere parameters are estimated using user-specified satellite orbits and clocks, where the satellite orbits and clocks may be received from the broadcast of the »product service« block.
In the following sub-sections, we present some examples of the monitoring of natural and artificial deformations, for which the above strategies within the EPOS-RT software are evaluated. 3.3.2 Real-time deformation monitoring An on-site real-time test was performed on the roof of building A17 at GFZ on 12 May 2009. One experimental station (A17D) was installed, whose height could be manually adjusted; the height changes were recorded with the scale panel attached to the tripod (Fig. 5 left). The IGS reference station POTS at GFZ, several meters away from A17D, was used as reference station to perform a real-time network solution. During this test, we began to smoothly increase the antenna height at 13:02 (UTC) until the height had been increased by 15 cm at 13:03:07 (UTC).
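The subsequent comparison against the known coordinates (the RMS quoted in Fig. 5) amounts to a simple per-component statistic; a sketch with synthetic values, not the actual A17D data:

```python
import numpy as np

def coordinate_rms(estimated_enu, reference_enu):
    """Per-component RMS of estimated minus known (reference) coordinates.
    Both arrays have shape (n_epochs, 3) for east, north, up in metres."""
    residuals = estimated_enu - reference_enu
    return np.sqrt(np.mean(residuals**2, axis=0))

# synthetic example: mm-level horizontal scatter, cm-level vertical scatter
rng = np.random.default_rng(1)
reference = np.zeros((3600, 3))
estimated = rng.normal(scale=[0.003, 0.003, 0.008], size=(3600, 3))
print(coordinate_rms(estimated, reference))    # ~[0.003 0.003 0.008] m
```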
Fig. 5: Real-time deformation monitoring. Left: Experimental station (A17D) with instruments to adjust antenna height and record height changes. Right: Coordinate changes of station A17D compared to the truth, where black lines represent the real track of the manual height changes and RMS refers to the known coordinates
After running for around 1 hour and 46 minutes, we began to decrease the antenna height by 20 cm. Fig. 5 (right) shows the kinematic coordinate changes of A17D in real-time, compared to its known coordinates (from a previous daily solution). In Fig. 5 (right), a coordinate accuracy of a few millimetres is obtained in the horizontal components, and differences of less than 1 cm were observed in the height component. We conclude that the manual deformation was precisely recorded. 3.3.3 Co-seismic deformation monitoring The Eastern Honshu earthquake occurred at 23:43:46 (UTC) on June 13, 2008. The GPS station Mizusawa (MIZU, Japan) operated by GFZ, around 50 km from the epicenter, recorded the co-seismic deformation. Using EPOS-RT, the GPS data were analyzed in a real-time-like post-processing mode, where an epoch-by-epoch solution was performed. The network shown in Fig. 6 (left) was used, where the stations PETS (Russia), SUWN (South Korea) and SHAO (China) were fixed and MIZU was treated as a kinematic station. A network solution was performed using the combined L3 observations. GFU orbits and ERPs were used and fixed; satellite and station clocks together with the kinematic coordinates of MIZU were estimated at each epoch.
Tropospheric parameters were set up using a Piece-Wise-Constant (PWC) model with an interval of 1 hour. A data span of 2 hours (1 Hz sampling), starting from 22:00, was analyzed, and ambiguity fixing was performed every 20 minutes after the first 30 minutes (due to the long baselines the solution takes longer to converge). Fig. 6 (right) shows the kinematic coordinate changes of MIZU compared to its known coordinates (from a previous daily solution). From Fig. 6 (right), we see that the earthquake wave arrives at MIZU after around 20 seconds. Strong shaking lasts for a few minutes. Later on, an obvious offset of 10 cm in the east-west component can be detected. 3.4 Supporting Studies (WP 500) For the PPP based real-time service, the real-time satellite clocks are the parameters to be provided to the users. Under the IGS Real-Time Pilot Project (IGS-RTPP), more than 100 IGS stations provide real-time streams and can be used to estimate real-time satellite clocks.
Fig. 6: Left: location of the stations used in the experiment, and length of the baselines from the three reference stations to the »rover« (kinematic station), MIZU in Japan. Right: co-seismic deformation of station MIZU, Japan, June 13, 2008
Tab. 1: Satellite clock precision, compared to the IGS Final clock, with different networks
Num.        85       70       60       50       40       30
bias(ns)    –0.25    –0.25    –0.25    –0.25    –0.24    –0.24
STD(ns)     0.10     0.10     0.10     0.10     0.11     0.11
Tab. 2: Satellite clock precision, compared to the IGS Final clock, using different orbits

Orbit       igu 00–03h   igu 01–04h   igu 02–08h   igu 00–24h
bias(ns)    1.23         0.54         –0.24        1.20
STD(ns)     0.10         0.10         0.11         0.12
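The bias and STD values in Tabs. 1 and 2 are statistics of the estimated-minus-reference clock differences; a minimal sketch with synthetic values, ignoring the reference-clock and datum alignment that a rigorous comparison requires:

```python
import numpy as np

def clock_comparison(estimated_ns, reference_ns):
    """Bias (mean) and standard deviation of clock differences in ns."""
    diff = np.asarray(estimated_ns) - np.asarray(reference_ns)
    return diff.mean(), diff.std(ddof=1)

# synthetic example: estimates offset by -0.25 ns with 0.1 ns scatter
rng = np.random.default_rng(2)
reference = rng.normal(scale=5.0, size=86400)
estimated = reference - 0.25 + rng.normal(scale=0.10, size=86400)
bias, std = clock_comparison(estimated, reference)
print(f"bias = {bias:.2f} ns, STD = {std:.2f} ns")
```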
Based on our current strategy, which is the fastest within the IGS community, epoch-wise data analysis of a network with around 100 stations normally requires 1 second. More stations mean more processing time. We therefore investigated the influence of the network size on the satellite clock estimation. With networks consisting of different numbers of globally distributed stations (from 85 to 30 in Tab. 1), we computed 1 Hz satellite clocks. Tab. 1 compares the estimated clocks to the IGS Final clocks. From Tab. 1 we conclude that using a network with more than 30 globally distributed stations is sufficient for the satellite clock estimation. For the PPP based real-time service, real-time orbits are the next key point, since they influence the satellite clock estimation. The real-time orbits can be obtained from the predicted part of the IGU orbits. IGU orbits contain orbits for 48 hours, where the first 24-hour part is estimated and the second 24-hour part is predicted. The precision of the predicted orbits is around a few cm for the first 6 hours. The IGU orbits are
updated every 6 hours with a delay of 3 hours. Some analysis centers of the IGS update their orbits every 3 hour. We investigated whether current orbit updating strategy (interval and delay) is sufficient for the real-time satellite clock estimation. Using a global network consisting of 30 globally distributed stations, we computed 1 Hz satellite clocks using different orbits (igu 00–03h, igu 01–04 h etc. where e.g. igu 01–04 h means predicted IGU orbits spanning 3 hours with a delay of 1 hour; igr 00–24 is the estimated IGS rapid orbit). Depending on the different orbit time intervals, satellite clocks in the corresponding time span are estimated. Tab. 2 compares the estimated clocks to the IGS Final clocks. From Tab. 2 we conclude that current orbit updating strategy is sufficient for real-time satellite clock estimation. 4 Summary For all work packages the project is right on schedule: All tasks could be successfully completed.
We have developed the new software system, EPOS-RT, for real-time network deformation monitoring, providing real-time PPP based positioning service and for post-processing applications as well. Various tests have been performed with confidential results. References Chen, J., M. Ge, M. Vennebusch, Z. Deng, G. Gendt, M. Rothacher (2008) Progress of the real-time GNSS software development at GFZ, Oral presentation at International GNSS Service Analysis Center Workshop, Miami Beach, Florida, USA. 2–6 June, 2008. Chen J., M. Ge, Jan Dousa, G. Gendt (2009) GNSS Software Development for Real-time Surface Deformation Monitoring at GFZ. Oral presentation at IWGT 2009, Shanghai, China, 1–4 June, 2009. Chen, J., M. Bender, G. Beyerle, G. Dick, C. Falck, M. Ge, G. Gendt, S. Heise, M. Ramatschi, T. Schmidt, R. Stosius, and J. Wickert (2009) GNSS Activities for Natural Disaster Monitoring and Climate Change Detection at GFZ. Oral presentation at EOGC 2009, Chengdu, China, 25–29 May 2009, full paper to be published in the conference proceedings. Chen, J., M. Ge, Jan Dousa, G. Gendt (2009) Evaluation of EPOS-RT for real-time deformation monitoring, submitted to Journal of Global Satellite Systems. Ge, M., J. Chen, M. Vennebusch, G. Gendt, M. Rothacher (2008) GFZ prototype for GPSbased real time deformation Monitoring, Geophysical Research Abstracts, Vol. 10, EGU2008-A-03990, Oral presentation at EGU General Assembly 2008. Ge, M., J. Chen, G. Gendt(2009) EPOS-RT: Software for Real-time GNSS Data Processing, Geophysical Research Abstracts,Vol. 11, EGU2009-8933, Oral presentation at EGU General Assembly 2009.
RAPID – Rapid Automated Determination of Seismic Source Parameters Meier T. (1)*, Dahm T. (2), Friederich W. (3), Hanka W. (4), Kind R. (5), Krüger F. (6), Ohrnberger M. (7), Scherbaum F. (8), Stammler K. (9), Yuan X. (10) (1) PD. Dr. Thomas Meier, Institut für Geologie, Mineralogie und Geophysik, Ruhr-Universität Bochum, D-44801 Bochum, Universitätsstr. 150, NA 3/173, e-mail: meier@geophysik.rub.de (2) Prof. Dr. Torsten Dahm, Universität Hamburg, Institut für Geophysik, Bundesstr. 55, D-20146 Hamburg, e-mail: dahm@dkrz.de (3) Prof. Dr. Wolfgang Friederich, Ruhr-Universität Bochum, Institut für Geologie, Mineralogie und Geophysik, NA 3/165, Universitätsstr. 150, D-44801 Bochum, e-mail: friederich@geophysik.rub.de (4) Dr. Winfried Hanka, GFZ Potsdam, Telegrafenberg, 14473 Potsdam, e-mail: hanka@gfz-potsdam.de (5) Prof. Dr. Rainer Kind, GFZ Potsdam, Telegrafenberg, 14473 Potsdam, e-mail: kind@gfz-potsdam.de (6) Dr. Frank Krüger, Universität Potsdam, Institut für Geowissenschaften, PSF 601553, 14415 Potsdam, e-mail: kruegerf@geo.uni-potsdam.de (7) Dr. Matthias Ohrnberger, Universität Potsdam, Institut für Geowissenschaften, Karl- Liebknecht Str. 24/25, D-14415 Potsdam, e-mail: mohrn@rz.uni-potsdam.de (8) Prof. Dr. Frank Scherbaum, Universität Potsdam, Institut für Geowissenschaften, Karl- Liebknecht Str. 24/25, D-14415 Potsdam, e-mail: fs@geo.uni-potsdam.de (9) Dr. Klaus Stammler, Seismologisches Zentralobservatorium Gräfenberg, Bundesanstalt für Geowissenschaften und Rohstoffe, Mozartstr. 57, D-91052 Erlangen, e-mail: stammler@szgrf.bgr.de (10) Dr. Xiaohui Yuan, GFZ Potsdam, Telegrafenberg, 14473 Potsdam, e-mail: yuan@gfz-potsdam.de *Coordinator of the project: PD. Dr. Thomas Meier, Ruhr-Universität Bochum
Introduction Automated processing of seismological recordings is essential for earthquake and tsunami early warning systems. Early recognition of earthquakes and the determination of their source parameters are needed for rapid warning against earthquake-generated tsunamis as well as for rapid post-earthquake response decisions by emergency management authorities. Because of recent progress in real-time data transfer as well as in off-line data
processing, automated near-real time estimation of source parameters is feasible. Building on existing real-time data transfer and data processing software packages like SeedLink and SeisComP developed at GFZ Potsdam, additional software components are provided in the framework of the RAPID project. The project consists of 5 work packages. Each work package aims at developing a software component for a specific purpose namely (1)
signal detection, signal characterisation and event localisation, (2) rupture monitoring, (3) determination of source mechanisms, (4) teleseismic estimation of shake maps and (5) evaluation and classification of source parameter estimates. Here we report on the significant progress that has been achieved in the second year of the project. New algorithms have been developed and successfully tested by all work packages. The algorithms are suitable for implementation into real-time data processing systems. In the following, detailed reports by the 5 work packages are given. Work package 1: Signal detection, signal characterization and event localization J. Lee, L. Küperkoch, T. Meier, W. Friederich (Ruhr-University Bochum), F. Scherbaum (University Potsdam), K. Stammler (Seismologisches Zentralobservatorium Gräfenberg, BGR) Automated estimation of arrival times and localization of events An earthquake early warning system requires reliable automatic event detection and localization as quickly as possible. The reliability of the event localization depends on the accuracy of the P and S arrival time estimates. Conventional automatic algorithms for event detection and localization are usually based on a comparison of short-term averages (STA) with long-term averages (LTA), which form the characteristic function (CF) to which the picker is applied. Two algorithms have been developed: one using kurtosis for P-phase picking and another based on autoregressive (AR) prediction. The latter is applicable to both P- and S-phases. The CF calculated by AR prediction represents the estimated error of the predicted waveform with respect to the measured one, using a fixed-order AR process. Therefore, the CF increases after the onset of the signal because the signal cannot be predicted from the noise preceding it. This algorithm accounts for changes in frequency and phase as well as amplitude. In the case of AR prediction, precise arrival times of P and S phases are obtained in two steps: first, the Akaike Information Criterion (AIC) is applied to the CF in order to identify the phase onset and to obtain an initial pick; then a pragmatic picking routine is used to estimate the arrival time and to improve the initial pick.
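As an illustration of this two-step idea, a minimal AIC picker applied to a synthetic characteristic function is sketched below; the project's kurtosis and AR-prediction CFs are not reproduced here:

```python
import numpy as np

def aic_onset(cf):
    """Maeda-type AIC applied to a characteristic function segment.
    Returns the index of the AIC minimum, taken as the initial onset pick."""
    cf = np.asarray(cf, dtype=float)
    n = len(cf)
    aic = np.full(n, np.nan)
    for k in range(1, n - 1):
        var1 = np.var(cf[:k])
        var2 = np.var(cf[k:])
        aic[k] = k * np.log(var1 + 1e-12) + (n - k - 1) * np.log(var2 + 1e-12)
    return int(np.nanargmin(aic))

# synthetic CF: low prediction error on noise, sharp increase at the onset
rng = np.random.default_rng(3)
cf = np.concatenate([0.1 * rng.random(200), 1.0 + rng.random(100)])
print("initial pick at sample", aic_onset(cf))   # close to sample 200
```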
Akaike Information Criterion (AIC) is applied to the CF in order to identify the phase onset and to obtain an initial pick; a pragmatic picking routine is then used to estimate the arrival time more precisely and to improve the initial pick. In order to test the accuracy and robustness of the developed algorithms we apply them to a large waveform dataset of local and regional events. The waveform data contain recordings from 2600 events recorded by the EGELADOS network from October 2005 to April 2006. The quality of the P and S onsets is estimated from the slope of the CF in our phase picking routine. The automatic P and S arrival time picks are compared with the corresponding manual picks. An example of the automated estimation of S-wave arrival times recorded at station KASO of the EGELADOS network is shown in Fig. 1. The time residuals of the initial and the refined automatic picks relative to the manual picks are 0.28 and 0.03 seconds, respectively. Fig. 2 shows the distribution of differences between automatically estimated P and S arrival times and the corresponding manual onsets. 1123 P picks are counted, with an average deviation from the manual picks of 0.057 ± 0.22 seconds for excellent automatic picks; 571 S picks are counted, with an average deviation of –0.242 ± 2.43 seconds for excellent automatic picks. These results show that the proposed algorithms are well suited for automated arrival time estimation in the framework of an early warning system. Tests have shown that the localization routine HYPOSAT is well suited for rapid automated localization. A new version of the routine has been released by NORSAR that can be implemented into real-time early warning systems.
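To make the two-step picking scheme concrete, the following minimal sketch (our illustration in Python, not the project code) computes a characteristic function from the prediction error of an autoregressive model fitted to a pre-signal noise window and then derives an initial onset with a simple AIC criterion; the AR order, the window lengths and the synthetic test trace are assumptions.

    import numpy as np

    def ar_prediction_cf(x, order=4, nwin=200):
        """Characteristic function from the misfit of an AR prediction.

        An AR model of fixed order is fitted to the first `nwin` samples
        (assumed to be pre-event noise) by least squares; the CF is the
        squared error between predicted and measured waveform, which grows
        once the signal can no longer be predicted from the noise."""
        A = np.column_stack([x[i:nwin - order + i] for i in range(order)])
        b = x[order:nwin]
        coeff, *_ = np.linalg.lstsq(A, b, rcond=None)
        pred = np.zeros_like(x)
        for n in range(order, len(x)):
            pred[n] = np.dot(coeff, x[n - order:n])
        return (x - pred) ** 2

    def aic_pick(cf):
        """Initial onset from the minimum of a simple AIC function of the CF."""
        n = len(cf)
        aic = np.full(n, np.nan)
        for k in range(1, n - 1):
            v1 = np.var(cf[:k]) or 1e-12
            v2 = np.var(cf[k:]) or 1e-12
            aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
        return int(np.nanargmin(aic))

    # Example with a synthetic trace: noise followed by a stronger wavelet at sample 600.
    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, 1000)
    x[600:] += 5.0 * np.sin(2 * np.pi * 0.05 * np.arange(400))
    cf = ar_prediction_cf(x)
    print("initial AIC pick at sample", aic_pick(cf))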
Fig. 1. Local earthquake example of the automated estimation of S-wave arrival times using autoregressive prediction on the E-W and N-S horizontal components recorded at station KASO. This M = 2 event occurred on February 22, 2006 at an epicentral distance of about 0.17 degrees. The automatic picks by the AIC and pragmatic routines are plotted in the left and right panels, respectively. The waveforms of the horizontal components are shown in black and blue. The pink line in the left panel shows the AIC. The red and blue lines above the waveform in black show the corresponding unsmoothed and smoothed characteristic functions, respectively. The green and red vertical lines mark the arrival times of the manual and automatic picks.
Fig. 2. Distribution of deviations between automatically and manually derived P and S arrival times. Automatic P and S picks are obtained using kurtosis and AR, respectively. In addition to the arrival times, quality estimates are obtained that can be used to determine weights of the arrival times for the subsequent localization. Weight 0 corresponds to an excellent pick while a pick with weight 4 is not considered by the localization procedure
Magnitude estimation
An important issue for early warning systems is the fast determination of the moment magnitude of large earthquakes. An instantaneous measure of the current size of an ongoing earthquake can be estimated from the source-time function (STF) determined in real time from the available data using waveform inversion. As a test, a real-time simulation was carried out with waveforms of the 2004 Northern Sumatra earthquake that occurred at 00:58:53 UTC on December 26, 2004. The moment magnitude is estimated from the waveform recorded at station UGM. About 800 seconds of the STF are necessary to reach a moment magnitude of about
9.2 (Fig. 3). This preliminary result indicates that we can estimate the size and duration of a strong earthquake by continuously updating the STF while the data come in. The magnitude estimate from the determined STF will complement existing tools for estimating Mwp, ML, or a cumulative body-wave magnitude.
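As a simple illustration of this continuous updating, the sketch below (not the project code) integrates a hypothetical moment-rate function to the cumulative seismic moment and converts it to a time-dependent moment magnitude using the standard relation Mw = 2/3 (log10 M0 - 9.1) with M0 in N·m; the shape and amplitude of the moment-rate function are placeholders.

    import numpy as np

    def moment_magnitude(m0):
        """Moment magnitude from the seismic moment in N*m (standard IASPEI relation)."""
        return (2.0 / 3.0) * (np.log10(m0) - 9.1)

    # Hypothetical 800-s moment-rate function sampled at 1 Hz (placeholder shape).
    t = np.arange(0.0, 800.0, 1.0)
    moment_rate = 1.5e20 * np.exp(-((t - 400.0) / 150.0) ** 2)   # N*m/s

    # Cumulative moment: running integral of the moment-rate function.
    m0_t = np.cumsum(moment_rate) * (t[1] - t[0])

    # Time-dependent magnitude, updated as if the data were streaming in.
    mw_t = moment_magnitude(np.maximum(m0_t, 1.0))
    print("Mw after 200 s: %.2f, after 800 s: %.2f" % (mw_t[200], mw_t[-1]))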
Fig. 3. Top: 800-second-long positive moment-rate function derived from waveform modelling. Middle: moment function obtained by integrating the moment rate shown at the top. Bottom: time-dependent moment magnitude estimated from the seismic moment.
Outlook
The next steps are: 1. comparison of the developed algorithms with those implemented in SeisComp3; 2. build-up of a Green's function database to be applied to moment magnitude estimation in order to reduce the computation time of the waveform inversion; 3. implementation into SeisComp3 as external modules.
Work package 2: Rupture Tracking in Real Time by Polarization Analyses of P-waves Bayer, B., Kind, R., Yuan, X., Hanka, W. (GFZ Potsdam); Meier, T. (University of Bochum); Krueger, F. (University of Potsdam)
Work package 2: Rupture Tracking in Real Time by Polarization Analyses of P-waves
Bayer, B., Kind, R., Yuan, X., Hanka, W. (GFZ Potsdam); Meier, T. (University of Bochum); Krueger, F. (University of Potsdam)
Rapid estimation of earthquake rupture propagation is essential for declaring an early warning for tsunami-generating earthquakes. Here we present a new method to follow the rupture progress in near real time by a polarization analysis of local and regional P-wave seismograms. The method stems from a single-station earthquake location method and is expanded here to monitor P-wave polarization variations
through time. The back azimuth of an incoming P-wave is expected to change correspondingly while the earthquake source moves away from the hypocentre. By polarization analysis we may therefore be able to monitor the temporal change in P-wave back azimuth and follow the rupture progress in near real time. Three-component P-wave seismograms are scanned to determine the azimuthal variation as a function of time. The back azimuth of the moving rupture front is determined by calculating the covariance matrix. Seismic stations at local and regional distances (less than ~30°) are suitable for the method, partly because there are no disturbing multiple P-waves (e.g., PP) and partly because the effect of 3-D Earth heterogeneity on the polarization analysis becomes more significant at larger distances. We note that the method may track only part of the rupture length for very large earthquakes owing to the early arrival of S-waves, and that it does not work for bilateral rupture propagation. We tested the new method with a theoretical simulation of the Great Andaman earthquake (December 26, 2004, Mw = 9.3) and with real data from the same earthquake as well as from the Sichuan earthquake (May 12, 2008, Mw = 7.9).
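The windowed polarization measurement mentioned above can be sketched as follows (an illustrative implementation under our own conventions, not the project code): the eigen-decomposition of the covariance matrix of one three-component window yields the dominant particle-motion direction, from which a back azimuth and a linearity value are derived. The window length, the particular linearity definition and the sign convention used to resolve the 180-degree ambiguity are assumptions.

    import numpy as np

    def polarization_window(z, n, e):
        """Back azimuth (degrees) and linearity from one three-component window.

        The principal eigenvector of the covariance matrix gives the dominant
        particle-motion direction; linearity is taken here as
        1 - (lambda2 + lambda3) / (2 * lambda1), one common definition."""
        cov = np.cov(np.vstack([z, n, e]))
        eigval, eigvec = np.linalg.eigh(cov)            # ascending eigenvalues
        l1, l2, l3 = eigval[2], eigval[1], eigval[0]
        v = eigvec[:, 2]                                # principal eigenvector (Z, N, E)
        # Resolve the sign ambiguity (convention used here: vertical part non-negative).
        v = v * (1.0 if v[0] >= 0 else -1.0)
        baz = np.degrees(np.arctan2(v[2], v[1])) % 360.0    # measured from N over E
        linearity = 1.0 - (l2 + l3) / (2.0 * l1)
        return baz, linearity

    # Example: a rectilinear P-type signal arriving from a back azimuth of about 60 deg.
    rng = np.random.default_rng(1)
    sig = np.sin(2 * np.pi * 1.0 * np.linspace(0, 2, 200))
    baz_true = np.radians(60.0)
    z = 0.7 * sig + 0.02 * rng.normal(size=200)
    n = np.cos(baz_true) * sig + 0.02 * rng.normal(size=200)
    e = np.sin(baz_true) * sig + 0.02 * rng.normal(size=200)
    print("estimated back azimuth and linearity:", polarization_window(z, n, e))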
Introduction of the new method
The main idea is based on a grid search for the smallest orthogonal distance weighted with the corresponding linearity (weighted-distance grid). To be more precise, for each grid point, each time step and each station, the (spherical, orthogonal) distance to a ray pointing in the direction of the corresponding polarization value is calculated and weighted with the corresponding linearity. The latter is essential as it helps to qualify a measurement by emphasising very good data. This procedure yields, for each grid point on the Earth's surface, a weighted sum of the distances to all rays calculated from the time-dependent P-polarization at the seismological stations. Finally, the smallest value within the summed weighted-distance grid is determined and can then be linked to the location where the P-wave was generated. By this means the unilateral rupture propagation can be monitored.
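A flat-Earth sketch of the weighted-distance grid (our simplification, not the project code, which works with spherical distances): for every grid point the orthogonal distance to the ray defined by each station's back azimuth is computed, weighted by the corresponding linearity and summed; the minimum of the summed grid marks the inferred P-wave source location. Station coordinates, back azimuths and linearities below are placeholders.

    import numpy as np

    def weighted_distance_grid(grid_x, grid_y, stations, bazs_deg, linearities):
        """Sum of linearity-weighted orthogonal distances from grid points to the
        back-azimuth rays of all stations (flat-Earth approximation)."""
        gx, gy = np.meshgrid(grid_x, grid_y)
        total = np.zeros_like(gx, dtype=float)
        for (sx, sy), baz, lin in zip(stations, bazs_deg, linearities):
            # Unit vector from the station toward the back azimuth
            # (x = east, y = north; back azimuth clockwise from north).
            d = np.array([np.sin(np.radians(baz)), np.cos(np.radians(baz))])
            rx, ry = gx - sx, gy - sy
            # Orthogonal distance of each grid point to the ray through the station.
            ortho = np.abs(rx * d[1] - ry * d[0])
            total += lin * ortho
        return total

    # Placeholder geometry: three stations whose rays intersect near (50, 50) km.
    stations = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
    bazs = [45.0, 315.0, 135.0]
    lins = [0.9, 0.8, 0.95]
    gx = gy = np.linspace(0.0, 100.0, 201)
    grid = weighted_distance_grid(gx, gy, stations, bazs, lins)
    iy, ix = np.unravel_index(np.argmin(grid), grid.shape)
    print("minimum of the weighted-distance grid at", (gx[ix], gy[iy]), "km")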
Fig. 4. Rupture propagation of about 240 s of a moving source (blue crosses) derived from synthetics.
New method verified with synthetics
Working with synthetics is an important first step to verify the idea. For that purpose, we used the algorithm of R. Wang (pers. comm., 2009), which is based on the calculation of Green's functions for a multi-layered half-space (Wang, 1999b). The program enables the modeling of an extended source, and thus we modeled the rupture of the Great Andaman earthquake with 30 single sources (orange dots in Fig. 4) according to the rupture geometry obtained by the semblance technique (Krueger and Ohrnberger, 2005). In Fig. 4 we show the moving source with minimum-distance values retrieved by applying the new method to the polarization and linearity values obtained from the synthetics. Joining the back-azimuth and linearity information from different stations surrounding the rupture front now enables an interpretation of part of the rupture front. The minimum distances (blue crosses) fit the hypothetical point sources used for the calculation of the synthetics (orange dots) very well. As expected, the rupture moves from south to north, and the rupture tracking was terminated at about 240 s by the arrival of the first S-wave.

New method verified with field data
As we have shown above, the idea of rupture tracking with back-azimuth and linearity values over time works successfully with synthetics. Another challenge, however, is working with real data, as heterogeneities of the Earth's structure often disturb the wavefield. We therefore applied the new polarization technique to real data sets of the two large earthquakes, Great Andaman and Sichuan. Great Andaman earthquake: The polarization and linearity values obtained at the four stations DGAR, QIZ, KMI, and MBWA were taken into account. These stations
are located at regional distances around the epicenter. In Fig. 5, a propagating source of about 200 s of the Great Andaman earthquake is derived by applying the new technique. The map shows the minimum-distance points (blue crosses), which align perfectly with the plate boundary (red line) and fall within the aftershock region (orange circles). As a first-order approximation of the resulting rupture length we obtained about 440 km by calculating the distance along the great-circle path defined by the geographical coordinates of the first and last minimum-distance points. Sichuan earthquake: The polarization and linearity values of the six regional stations CHTO, MDJ, SSE, TATO, TLY, and ULN were taken into account. In Fig. 6, a moving source of the Sichuan earthquake is derived by applying the new technique. The minimum-distance values are denoted by blue stars and form the rupture track, for which we obtained a length of about 180 km. This value was obtained by calculating the distance along the great-circle path defined by the geographical coordinates of the minima at 0 s and 75 s. Between 75 s and 100 s, the rupture direction reverses.
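The rupture-length estimate from two minimum-distance points reduces to a great-circle distance between two geographical coordinates, for example via the haversine formula as sketched below (our illustration; the coordinates in the example are placeholders, not the actual minimum-distance points).

    import numpy as np

    def great_circle_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two geographical points (haversine formula)."""
        p1, p2 = np.radians(lat1), np.radians(lat2)
        dlat = p2 - p1
        dlon = np.radians(lon2 - lon1)
        a = np.sin(dlat / 2.0) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dlon / 2.0) ** 2
        return 2.0 * radius_km * np.arcsin(np.sqrt(a))

    # Placeholder coordinates standing in for the first and last minimum-distance points.
    print("rupture length: %.0f km" % great_circle_km(2.5, 95.5, 6.3, 93.7))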
Fig. 5. Rupture propagation of about 200 s derived from real data of the Great Andaman earthquake
Fig. 6. Rupture propagation of about 100 s derived from real data of the Sichuan earthquake.
Afterwards, a more random distribution of the minima characterizes the track of the minimum distances, and we therefore terminated the rupture tracking there. The retrieved rupture track lies within the fault zones (thin black lines), and the rupture direction follows the area of the aftershocks as well as the solution of the semblance technique. The latter is additionally plotted with green crosses in Fig. 6 (M. Ohrnberger, pers. comm., 2009).

Outlook
Having verified the new method, the algorithms can now be implemented into the real-time system SeisComp3.

References
Krueger, F. & Ohrnberger, M. (2005). Tracking the rupture of the Mw 9.3 Sumatra earthquake over 1,150 km at teleseismic distance. Nature, 435, 937–939, doi:10.1038/nature03696.
Wang, R. (1999b). A simple orthonormalization method for the stable and efficient computation of Green's functions. Bulletin of the Seismological Society of America, 89, 733–741.
Work package 3: Fast estimation of kinematic parameters of the earthquake source
Cesca S., Heimann S., Dahm T. (University of Hamburg)

Fast, automated and stable routines for the inversion of kinematic earthquake sources face the problem of the overparameterization of the rupture model, as occurs for example with standard slip-map representations (Beresnev 2003). The adoption of such models for the earthquake source makes it possible to reproduce observations well, but may result in unstable inversions and ambiguous solutions, as several different source models may fit the observations equally well. To overcome this problem and implement an automated kinematic inversion, we adopted the eikonal source model to represent the extended earthquake source and a multi-step inversion strategy to retrieve both point and extended source parameters. Significant source information, including focal mechanism, source depth, magnitude, centroid location, resolution of the fault-plane ambiguity, rupture size, average slip and directivity effects, can be provided. Automation
and fast parameter estimation are discussed here for specific applications, including the recent L'Aquila (Italy) earthquake and a set of moderate to large earthquakes that occurred in Japan. The quality and stability of the inversion results are first discussed using full waveform information. Point-source parameters are always well determined, while kinematic parameters such as the rupture extension, the average slip and the unilateral or bilateral character of the rupture can be resolved in many cases. The possibility of providing fast solutions, which are needed within early warning systems, is further discussed.

The eikonal source model and a multi-step inversion approach
The eikonal source model (Heimann et al. 2008) is fully described by 13 parameters. Latitude, longitude, depth and origin time locate the source in space and time; strike, dip, rake and scalar moment describe the point-source radiation pattern; radius, relative rupture velocity, nucleation coordinates and rise time describe the finite rupture. The rupture area is circular for small earthquakes and bounded within the seismogenic region (e.g., almost rectangular for large shallow earthquakes constrained within the crust). The depth-dependent rupture velocity (which scales with the shear-wave velocity) and the modelling of directivity effects (by varying the nucleation coordinates) are examples of powerful features of the eikonal model. The source model parameters may be grouped according to the source characteristic they describe (e.g. radiation pattern, centroid location, extended rupture process); this classification suggests their inversion at different stages, each time using the most appropriate approach for their stable retrieval. The multi-step inversion approach (Cesca et al. 2009) retrieves the eikonal source parameters in three steps. First, a moment tensor inversion is carried out in the frequency domain by fitting amplitude spectra: strike, dip, rake, scalar moment and depth are determined here. Then, the centroid location (latitude, longitude, time offset) is inverted in the time domain by fitting body-wave time windows. Finally, the kinematic inversion is carried out: fault-plane discrimination, radius, area, average slip, rise and rupture times, and the nucleation point (directivity) are retrieved.
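For illustration, the 13 eikonal parameters and their grouping over the three inversion steps can be collected in a simple container as sketched below (our own sketch; the field names, units and default values are assumptions, not the project data model).

    from dataclasses import dataclass

    @dataclass
    class EikonalSource:
        """The 13 parameters of the eikonal source model, grouped by inversion step."""
        # Step 1: frequency-domain moment tensor inversion (amplitude spectra).
        strike: float = 0.0
        dip: float = 0.0
        rake: float = 0.0
        scalar_moment: float = 0.0           # N*m
        depth: float = 0.0                   # km
        # Step 2: time-domain centroid location (body-wave time windows).
        latitude: float = 0.0
        longitude: float = 0.0
        origin_time_offset: float = 0.0      # s, relative to the assumed origin time
        # Step 3: kinematic inversion of the finite rupture.
        radius: float = 0.0                  # km
        relative_rupture_velocity: float = 0.9   # fraction of the shear-wave velocity
        nucleation_x: float = 0.0            # relative nucleation coordinates
        nucleation_y: float = 0.0
        rise_time: float = 0.0               # s

    STEP_PARAMETERS = {
        1: ["strike", "dip", "rake", "scalar_moment", "depth"],
        2: ["latitude", "longitude", "origin_time_offset"],
        3: ["radius", "relative_rupture_velocity",
            "nucleation_x", "nucleation_y", "rise_time"],
    }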
A case study: the April 2009 Abruzzo (Italy) seismic sequence
Given the recent occurrence of the Mw 6.3 Abruzzo (Italy) earthquake, which caused almost 300 fatalities and considerable damage, we show here results from the application of the proposed automated inversion method to the main shock (6.4.2009) and its strongest aftershock (7.4.2009). Similar successful applications have also been carried out for shallow earthquakes at regional distances in Greece and Germany (Cesca et al. 2009). Fig. 7 shows the local seismicity in the period 5–14 April 2009 and the focal mechanisms determined for the two events. Source depth estimates indicate a shallower source for the main event (h = 7.8 km) and a deeper one for the aftershock (h = 23.9 km). The scalar moments correspond to magnitudes Mw 6.2 and 5.5, respectively. The centroids are located southward of the assumed epicenters; the centroid times show offsets of 6 and 4 s. Kinematic parameters were inverted for the main shock: our results indicate that the earthquake occurred along the SW-dipping plane, with an almost unilateral rupture propagating toward the SE. The source radius of 15 km may be overestimated, as a consequence of the assumption of a too short rise time and/or a too fast relative rupture velocity (fixed to 0.9, equivalent to an average rupture velocity of 2.9 km/s). The estimated average slip is 12 cm. The kinematic inversion is unstable for the aftershock with this station configuration, given its lower magnitude.
Fig. 7. Source modelling of the 6/4/2009 L'Aquila (Italy) earthquake and its strongest aftershock.
Fast inversion: Mw 6.5–8.0 earthquakes in Japan
A fast inversion approach has been further investigated, selecting as dataset 15 earthquakes with magnitudes Mw 6.5–8.0 that occurred in Japan. A fast and stable routine relies on the one hand on a fast parallelized algorithm, which has recently been improved. On the other hand, the speed of the response strongly depends on the quality and availability of seismic data. Since our approach is based on seismic waveform modelling, seismograms of a given time length and from a sufficient number of seismic stations have to be recorded in order to allow a stable inversion. Promising results have been obtained in recent applications using the first 10 minutes of recordings. The quality of the inversion results is now evaluated using a bootstrap approach, which can provide uncertainty estimates for each source parameter. The focal mechanism and scalar moment inversion stably retrieved the expected solutions, consistent with those provided by other institutions. Centroid depths show a general agreement with the average behaviour of available solutions; the large scatter among the values proposed by different catalogues underlines the difficulty of a stable and precise depth retrieval. It should be stressed that we use a different approach, consisting in fitting the amplitude spectra of full waveforms and body-wave phases. The centroid location routine identified the best relative location (in space and time) with respect to the assumed epicentral location. Time offsets vary for the studied events between 8 and 28 s, with a consistent trend relating larger offsets to larger earthquakes.
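The bootstrap evaluation can be sketched generically as follows (not the project implementation): stations are resampled with replacement, the inversion is repeated on every resample, and the scatter of the results gives an uncertainty estimate; here the inversion is replaced by a simple mean of hypothetical per-station depth estimates.

    import numpy as np

    def bootstrap_uncertainty(per_station_estimates, n_resamples=1000, seed=0):
        """Bootstrap mean and standard deviation of a parameter estimated from many stations.

        Stations are resampled with replacement; the (placeholder) inversion is
        re-run on every resample and the scatter of the results is returned."""
        rng = np.random.default_rng(seed)
        est = np.asarray(per_station_estimates, dtype=float)
        boot = np.empty(n_resamples)
        for i in range(n_resamples):
            sample = rng.choice(est, size=est.size, replace=True)
            boot[i] = sample.mean()          # stand-in for the full inversion
        return boot.mean(), boot.std()

    # Hypothetical centroid-depth estimates (km) from individual stations.
    depths = [22.0, 25.5, 23.1, 27.0, 24.4, 21.8, 26.2, 23.9]
    mean_depth, sigma = bootstrap_uncertainty(depths)
    print("centroid depth: %.1f +/- %.1f km" % (mean_depth, sigma))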
Kinematic modelling, carried out for a subset of events, suggests that the true fault plane, the radius and area of the rupture process, the nucleation point, the average slip and the rupture time can be identified. However, fast automated kinematic inversion still needs to be improved by taking into account different stability and quality indicators.

Outlook
Our next effort is to improve the stability and speed of the inversion approach and to test it on a larger dataset. A second goal is to provide inversion results in terms of the probability of different rupture models, allowing the further evaluation of ground-shaking scenarios by other work packages.

References
Beresnev, I. A., 2003. Uncertainties in finite-fault slip inversion: to what extent to believe? (A critical review). Bull. Seism. Soc. Am., 93, 6, 2445–2458.
Cesca, S., S. Heimann, K. Stammler, and T. Dahm, 2009. An automated procedure for point and kinematic source inversion at regional distances. J. Geophys. Res., submitted.
Heimann, S., S. Cesca, T. Dahm, and F. Krüger, 2008. Stable estimation of extended source properties for medium-sized earthquakes using teleseismic waveform data. EGU General Assembly, Vienna, Austria.
Work package 4: Near real-time estimation of expected spatial distribution of strong ground-motion
Delavaud, E., Scherbaum, F., Krüger, F. (University Potsdam)

In the framework of the RAPID project, work package 4 is developing and implementing a method to rapidly obtain first estimates of the expected spatial distribution of strong shaking for damaging earthquakes based on teleseismic recordings (»TeleShakeMaps«). The backbone of this approach is the combination of teleseismically determined source dimensions obtained by work package 5.3 with composite ground-motion prediction equations (GMPEs) calibrated for the prospective application regions using an information-theoretic approach. Since the previous progress report one year ago, we have particularly focused on the test and calibration of a new method (Tasks 1 and 2), compared with the implementation of the final software, TeleShakeMap (Task 4).

Test and calibration of the method
Over the past ten years, a large number of empirical regression models have been developed for the prediction of ground motion in different geographical and tectonic regions. With more than 150 GMPEs recently counted for pseudo-spectral ordinates, we are theoretically able to estimate ground motion and the associated uncertainties in most regions of interest. However, determining the degree of appropriateness of each model for each region is not straightforward. It is now standard to combine these GMPEs in a logic-tree framework in which different GMPEs populate alternative branches. The corresponding branch weights then correspond to the degree of belief of an expert in a particular model. These weighting factors are the key ingredient of our composite model to rapidly estimate ground motion. If stored in tables, they can be read by the TeleShakeMap software to quickly determine the composite model that is considered the most appropriate to predict ground motion when a specific earthquake happens in a specific region. The determination of these weighting factors is therefore a challenging and critical task.
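One simple way to turn such branch weights into a composite prediction is sketched below (our illustration with placeholder numbers, not the TeleShakeMap implementation; as noted further below, the weights may instead serve as likelihood estimates to update prior beliefs): the GMPEs are treated as normal distributions in log ground-motion units and combined as a weighted mixture.

    import numpy as np

    def composite_prediction(medians_log, sigmas_log, weights):
        """Weighted mixture of GMPE predictions in log ground-motion units.

        Each GMPE contributes a normal distribution; the composite mean and
        standard deviation of the mixture follow from the branch weights."""
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()
        mu = np.asarray(medians_log, dtype=float)
        sig = np.asarray(sigmas_log, dtype=float)
        mean = np.sum(w * mu)
        # Total variance of a Gaussian mixture: within- plus between-model part.
        var = np.sum(w * (sig ** 2 + (mu - mean) ** 2))
        return mean, np.sqrt(var)

    # Placeholder values: three GMPEs predicting log10(PGA in g) at one site.
    medians = [-0.82, -0.95, -0.74]
    sigmas = [0.30, 0.28, 0.33]
    weights = [0.5, 0.3, 0.2]
    print("composite log10(PGA): %.2f +/- %.2f" % composite_prediction(medians, sigmas, weights))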
We have worked on the implementation of a new method, based on an information-theoretic approach, to rank these models and to compute the corresponding weighting factors. The candidate models, which are defined as probability distributions, are ranked according to the likelihood concept, which tells how likely a model is given the observed data. We have tested and calibrated the method, which has been implemented in Mathematica, using data from eight Californian earthquakes with magnitudes from 6.2 to 7.3 and seventeen candidate GMPEs for the computation of response spectral data, combined with the model of Atkinson and Kaka (2007) for the computation of macroseismic intensities. The results of this applicability study are very encouraging, both regarding the use of response spectra and of macroseismic intensities. Fig. 8 shows the comparison between the ranking of the seventeen candidate GMPEs based on response spectral data and the one based on macroseismic intensities for the Mw 6.7 Northridge earthquake. The colors indicate the region of derivation of the models (red for California and the western US, blue for Europe and the Middle East, green for central and western US and magenta for Japan). As one would expect, Californian and western US models perform best. The figure also shows the coherency between the two rankings. These results have been presented in a paper which is now under review, while a first paper about the theoretical study is already in press (see Publications below). Now that we have demonstrated the coherency and the potential of the method, we are extending the application of these tools to a global scale. For this purpose, we are selecting models and collecting data worldwide.
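A generic sketch of such a likelihood-based ranking (not the published procedure and not the Mathematica implementation): each candidate GMPE is treated as a normal distribution in log ground-motion units, its average log-likelihood over the observed data is computed, and the scores are turned into normalized weights; all names and numbers are placeholders.

    import numpy as np

    def log_likelihood(observed_log, predicted_log, sigma_log):
        """Average log-likelihood of the observations under a GMPE treated as a
        normal distribution in log ground-motion units."""
        obs = np.asarray(observed_log, dtype=float)
        mu = np.asarray(predicted_log, dtype=float)
        return np.mean(-0.5 * np.log(2 * np.pi * sigma_log ** 2)
                       - 0.5 * ((obs - mu) / sigma_log) ** 2)

    def rank_and_weight(observed_log, candidates):
        """Rank candidate models by average log-likelihood and turn the scores
        into normalized weights (higher likelihood gives a larger weight)."""
        scores = {name: log_likelihood(observed_log, mu, sig)
                  for name, (mu, sig) in candidates.items()}
        w = np.exp(np.array(list(scores.values())))
        weights = dict(zip(scores, w / w.sum()))
        ranking = sorted(scores, key=scores.get, reverse=True)
        return ranking, weights

    # Placeholder data: observed log10(PGA) at four stations and two candidate GMPEs.
    obs = np.array([-0.80, -0.95, -1.10, -0.70])
    candidates = {
        "GMPE_A": (np.array([-0.85, -0.90, -1.05, -0.75]), 0.30),
        "GMPE_B": (np.array([-0.60, -0.70, -0.80, -0.55]), 0.30),
    }
    print(rank_and_weight(obs, candidates))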
Fig. 8. Comparison between the rankings based on response spectral data and on macroseismic intensities, respectively, for the 1994 Northridge earthquake.
Development of the TeleShakeMap software
Like all members of the RAPID project, we have collaborated with the GFZ Potsdam to define how our software can interface with the rapid near real-time earthquake alert system SeisComp3. On the basis of this discussion, we have improved and completed the implementation of the TeleShakeMap program, in accordance with their representation of seismological data based on the QuakeML data model. In particular, we have defined a new object describing the extended source, which does not yet exist in the QuakeML representation. We have also discussed with the members of work package 3 the uncertainties of the source parameters they will provide. We have agreed on a definition of their degree of confidence in these parameters, so that we can integrate these uncertainties into the logic tree for the estimation of ground motion. We are now working on the implementation of the composite ground-motion model. We are exploring different approaches, such as Monte Carlo simulations. We believe that the weights should not be used directly in model aggregation schemes but should rather serve as likelihood estimates to update experts' prior beliefs in the models.
Time schedule for the remaining tasks
– Library of GMPE weighting factors for global application (Task 2): 4 months
– Implementation of the composite model, finalization of Tasks 1 and 4: 4 months
– Interface with WP 5.3 (Task 3): 1 month
– Test of the complete system, reporting, publications (Task 5): 3 months

Publications:
Delavaud, E., F. Scherbaum, N. Kuehn and C. Riggelsen (2010). Information-Theoretic Ground-Motion Model Selection for Seismic Hazard Analysis: An Applicability Study Using Californian Data. Under review for publication in BSSA.
Scherbaum, F., E. Delavaud and C. Riggelsen (2009). Model Selection in Seismic Hazard Analysis: An Information-Theoretic Perspective. In press for publication in BSSA.

Conference Abstracts:
2009 SSA Annual Meeting: Delavaud, E., F. Scherbaum and C. Riggelsen. Ground-Motion Model Selection Based on Response Spectra and Macroseismic Intensities: A Case Study Using Californian Data.
2008 AGU Fall Meeting: Delavaud, E., F. Scherbaum and C. Riggelsen. Ground Motion Model Selection Based on Macroseismic Intensities: Application to Californian Earthquakes.

Work package 5: Evaluation and classification of near-real time seismic source parameter estimates for tsunami hazard
L. Blaser, M. Ohrnberger, F. Scherbaum (University of Potsdam)

Emergency managers at tsunami early warning centers are in urgent need of automatic tools supporting their decision making in critical situations when lives and property are at stake. The tectonic situation in the Mediterranean Sea as well as in Indonesia leads to earthquake locations very close to the coast, so that little time remains for evacuation in case of emergency. Timely and precise warnings are thus essential to save the coastal population. The first available evidence for tsunami triggering comes from seismic waves as an indirect measure of tsunami generation. As the earthquake source parameter estimates are prone to large uncertainties, decision makers at tsunami warning centers face a difficult challenge: to issue tsunami warnings based on incomplete and ambiguous data. The common rule-based hazard estimation method at tsunami warning centers relies on the evaluation of only three variables (location, magnitude, depth). As long as one variable is missing, no risk assessment can be made; furthermore, the (large) uncertainties cannot be taken into consideration. Our approach focuses on the evaluation of the seismic parameters with respect to tsunami hazard using a Bayesian Belief Network (BBN), which is capable of handling uncertain and missing data. We aim to estimate the likelihood of occurrence of an earthquake-triggered tsunami exceeding a
specific run-up level given the real-time estimates of earthquake source parameters provided by WP 1–3. The immediate tsunami hazard estimate could support a decision maker at a tsunami warning center whenever new seismological evidence becomes available.

BBN construction
A BBN is a probabilistic graphical model representing a set of random variables and their (in)dependencies. The network consists of nodes (variables) connected by arcs (structure), which are quantified by conditional probabilities (parameters). To construct a tsunami early warning BBN, we defined in a first step a set of nine variables comprising earthquake epicenter, magnitude, hypocentral depth, rupture length and width, mean slip, focal mechanism, water depth at the epicenter location, as well as the maximal run-up height of the tsunami wave at a specific site. These variables are assumed to have an influence on the tsunami run-up size and location and will be provided by WP 1–3. The structure and the parameters of the BBN were learned from data. As large earthquakes that are able to trigger tsunamis are rare events, and the resulting tsunamis are even less frequent, the historical worldwide tsunami database is too sparse to learn a BBN. Therefore we compiled a synthetic tsunami database for a study area covering the Indonesian island of Sumatra, its forearc and the offshore subduction zone. We compiled 50,000 entries for all of the nine variables by ancestral sampling and learned a first tsunami early warning BBN draft (Blaser et al. 2009).

BBN testing and refinement
The first draft of our tsunami early warning BBN behaves in an expected and coherent way¹.
¹ The first draft can be downloaded from http://www.geo.uni-potsdam.de/mitarbeiter/Blaser/blaser.html and interactively tested with the freely available software GeNIe (http://genie.sis.pitt.edu/).
The output is composed of probability estimates for four discrete tsunami threat levels. »Major tsunami« describes an expected maximal tsunami wave height of more than 3 m, »tsunami« a wave height between 0.5 m and 3 m, »minor tsunami« between 0.1 m and 0.5 m, and »no tsunami« anything smaller than 0.1 m. As soon as seismic evidence is available, the a priori tsunami threat probability is immediately replaced with a conditional probability, e.g. the probability of a major tsunami given the epicenter location, P(»major
tsunami« | epicenter). Every new piece of incoming evidence updates the tsunami threat probability and provides fast and helpful additional information to the chief officer on duty at a tsunami warning center.
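The sequential updating can be illustrated with a drastically simplified, naive-Bayes-style stand-in for the full BBN (our sketch; the prior, the likelihood tables and the evidence labels are invented placeholders, not values learned from the synthetic database).

    import numpy as np

    THREAT_LEVELS = ["no tsunami", "minor tsunami", "tsunami", "major tsunami"]

    # Placeholder prior and likelihoods P(evidence value | threat level); in the
    # real network these quantities are learned from the synthetic database.
    PRIOR = np.array([0.90, 0.06, 0.03, 0.01])
    LIKELIHOOD = {
        ("magnitude", "M>=8"): np.array([0.02, 0.10, 0.30, 0.70]),
        ("region", "offshore Bengkulu"): np.array([0.10, 0.30, 0.40, 0.50]),
        ("depth", "shallow"): np.array([0.30, 0.60, 0.70, 0.80]),
    }

    def update(posterior, evidence):
        """Bayes update of the threat-level distribution with one piece of evidence."""
        post = posterior * LIKELIHOOD[evidence]
        return post / post.sum()

    # Evidence arriving one piece after another, as in the real-time case.
    p = PRIOR.copy()
    for ev in [("region", "offshore Bengkulu"), ("magnitude", "M>=8"), ("depth", "shallow")]:
        p = update(p, ev)
        print(ev, {lvl: round(float(prob), 3) for lvl, prob in zip(THREAT_LEVELS, p)})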
Fig. 9. Evaluation of the real-time seismic source parameters for the Bengkulu 2007 tsunami at the cities of Padang and Bengkulu
A sensitivity analysis confirmed the selection of the variables epicenter location and earthquake magnitude that are considered in common rule-based models: they carry the largest information content in our first BBN draft. Given location and magnitude, other earthquake parameter estimates (e.g. rupture length) do not change the tsunami probability estimate significantly. As our synthetic tsunami database contains only shallow earthquakes, it is not possible to learn the conditional probabilities for hypocentral depth from the database. In this case, expert knowledge was used to specify the influence of the hypocentral depth of an earthquake on its tsunamigenic potential: increasing depth reduces the probability of tsunami triggering, down to a threshold of 100 km beyond which no tsunami can be triggered any more. We removed the nodes water depth and mean slip, as no significant influence on tsunami run-up size could be found. The tsunami early warning BBN with seven variables (location, magnitude, rupture length and width, focal mechanism, hypocentral depth and tsunami size) was tested with archived real-time earthquake source parameter estimates of historical events. Besides some small events (no or minor tsunami at the coast of Sumatra), there was one earthquake triggering a tsunami offshore Sumatra in the last three years. In September 2007, a shallow earthquake with magnitude M = 8.4 located offshore the city of Bengkulu triggered a destructive local tsunami. The real-time seismic source parameter estimates are available in the SeisComp3 archive. Fig. 9 shows the three analyzed real-time earthquake parameter estimates (location, discretized to region, magnitude and hypocentral depth) during the first 15 min after event detection, as well as the probabilities of the tsunami threat levels at the cities of Padang and Bengkulu given the seismic evidence available at eight different time steps (white: no tsunami, light gray: minor tsunami, gray: tsunami, black: major tsunami). The figure illustrates how the tsunami threat probabilities are updated whenever new evidence
is available. The observations of the Bengkulu 2007 tsunami fit the tsunami threat level probabilities well. The tsunami threat estimates provide important additional information for an emergency manager at a tsunami warning center. We note, however, that before use in a tsunami warning center, guidelines for useful decision support have to be elaborated first.

Outlook
The real-time implementation into SeisComp3 has been started and will be developed further. We will implement Bayesian model averaging to reduce model uncertainties (e.g. the dependence on the discretization of the variable ranges) and, in particular, analyze the influence of extended seismic source parameters such as rupture length on tsunami triggering. Furthermore, we will consider additional variables such as rupture direction to enlarge and complete the BBN variable set.

References
Blaser, L., Ohrnberger, M., Riggelsen, C., and Scherbaum, F., 2009. Bayesian Belief Network for Tsunami Warning Decision Support. In: Sossai, C. & Chemello, G. (eds.), ECSQARU 2009, Springer, p. 757–768.
Authors' Index
A Arnhardt C. . . . . . . . . . . . . . . . . . . . . 3 Asch K. . . . . . . . . . . . . . . . . . . . . . . . 3 Azzam R. . . . . . . . . . . . . . . . . . . . . . . 3 B Barsch R. . . . . . . . . . . . . . . . . . . . . . 96 Becker R. . . . . . . . . . . . . . . . . . . . . . 16 Bell R. . . . . . . . . . . . . . . . . . . . . . . . 16 Bernsdorf S. . . . . . . . . . . . . . . . . . . . 96 Beyreuther M. . . . . . . . . . . . . . . . . . 96 Bill R. . . . . . . . . . . . . . . . . . . . . . . . . . 3 Birkmann J. . . . . . . . . . . . . . . . . . . . 73 Bonn G. . . . . . . . . . . . . . . . . . . . . . 123 Buchmann A.. . . . . . . . . . . . . . . . . 123 Burghaus S. . . . . . . . . . . . . . . . . . . . 16 C Chen J. . . . . . . . . . . . . . . . . . . . . . 141 Christ I. . . . . . . . . . . . . . . . . . . . . . . 84 Cong X. . . . . . . . . . . . . . . . . . . . . . . 96 D Dahm T. . . . . . . . . . . . . . . . . . . . . . 149 Dahm T. . . . . . . . . . . . . . . . . . . . . . . 96 Dech S. . . . . . . . . . . . . . . . . . . . . . . 73 Dix A. . . . . . . . . . . . . . . . . . . . . . . . 16 Dousa J.. . . . . . . . . . . . . . . . . . . . . 141 Dzvonkovskaya A. . . . . . . . . . . . . . 111 E Eineder M. . . . . . . . . . . . . . . . . . . . . 96 Erbertseder T.. . . . . . . . . . . . . . . . . . 96 Erdik, M. . . . . . . . . . . . . . . . . . . . . . 84 Eveslage I. . . . . . . . . . . . . . . . . . . . . 84
F Falck C. . . . . . . . . . . . . . . . . . . . . 141 Fernandez-Steeger T. M. . . . . . . . . . . . Festl J. . . . . . . . . . . . . . . . . . . . . . . . 33 Fischer J. . . . . . . . . . . . . . . . . . . . . . 84 Flex F. . . . . . . . . . . . . . . . . . . . . . . . 16 Friederich W. . . . . . . . . . . . . . . . . . 149 G Ge M. . . . . . . . . . . . . . . . . . . . . . . 141 Gendt G. . . . . . . . . . . . . . . . . . . . . 141 Gerstenecker C. . . . . . . . . . . . . . . . . 96 Glabsch J. . . . . . . . . . . . . . . . . . . . . 33 Glade T. . . . . . . . . . . . . . . . . . . . . . . 16 Goseberg N. . . . . . . . . . . . . . . . . . . 73 Greiving S. . . . . . . . . . . . . . . . . . . . . 16 Gress A. . . . . . . . . . . . . . . . . . . . . . . 73 Greve K. . . . . . . . . . . . . . . . . . . . . . 16 Gurgel K.-W. . . . . . . . . . . . . . . . . . 111 H Hammer C. . . . . . . . . . . . . . . . . . . . 96 Hanka W. . . . . . . . . . . . . . . . . . . . . 149 Hansteen T. . . . . . . . . . . . . . . . . . . . 96 Haß S. E. . . . . . . . . . . . . . . . . . . . . . . 3 Heunecke O. . . . . . . . . . . . . . . . . . . 33 Hilbring D. . . . . . . . . . . . . . . . . . . . 123 Hohnecker E. . . . . . . . . . . . . . . . . . 123 Homfeld S. D. . . . . . . . . . . . . . . . . . . 3 Hort M. . . . . . . . . . . . . . . . . . . . . . . 96 J Jäger S. . . . . . . . . . . . . . . . . . . . . . . 16 Janik M. . . . . . . . . . . . . . . . . . . . . . . 16
K Kiehle C. . . . . . . . . . . . . . . . . . . . . . 84 Kind R.. . . . . . . . . . . . . . . . . . . . . . 149 Klein K. . . . . . . . . . . . . . . . . . . . . . . 73 Klöpfel H. . . . . . . . . . . . . . . . . . . . . 73 Köhler N. . . . . . . . . . . . . . . . . . . . . . 84 Krieger L. . . . . . . . . . . . . . . . . . . . . . 96 Krüger F. . . . . . . . . . . . . . . . . . . . . 149 Krummel H. . . . . . . . . . . . . . . . . . . . 16 Kuhlmann H. . . . . . . . . . . . . . . . . . . 16 Kühnlenz F. . . . . . . . . . . . . . . . . . . . 84
O Oczipka M. . . . . . . . . . . . . . . . . . . . 73 Ohrnberger M.. . . . . . . . . . . . . . . . 149 Ohrnberger M.. . . . . . . . . . . . . . . . . 96
L Lämmel G. . . . . . . . . . . . . . . . . . . . . 73 Lang A. . . . . . . . . . . . . . . . . . . . . . . 16 Läufer G. . . . . . . . . . . . . . . . . . . . . . 96 Lehmann F. . . . . . . . . . . . . . . . . . . . 73 Lessing R.. . . . . . . . . . . . . . . . . . . . . 84 Li L. . . . . . . . . . . . . . . . . . . . . . . . . . 16 Lichtblau B. . . . . . . . . . . . . . . . . . . . 84
Q Quante F. . . . . . . . . . . . . . . . . . . . . 123
M Maerker C. . . . . . . . . . . . . . . . . . . . 96 Mayer C. . . . . . . . . . . . . . . . . . . . . . 16 Mayer J.. . . . . . . . . . . . . . . . . . . . . . 16 Meier T. . . . . . . . . . . . . . . . . . . . . . 149 Milkereit C. . . . . . . . . . . . . . . . . . . . 84 Moder F. . . . . . . . . . . . . . . . . . . . . . 73 Montalvo Garcia A. . . . . . . . . . . . . . 96 N Nagel K.. . . . . . . . . . . . . . . . . . . . . . 73 Nakaten B.. . . . . . . . . . . . . . . . . . . . . 3 Niemeyer F. . . . . . . . . . . . . . . . . . . . . 3
P Padberg A.. . . . . . . . . . . . . . . . . . . . 16 Paulsen H. . . . . . . . . . . . . . . . . . . . . 16 Picozzi M. . . . . . . . . . . . . . . . . . . . . 84 Pohl J. . . . . . . . . . . . . . . . . . . . . . . . 16 Pohlmann T.. . . . . . . . . . . . . . . . . . 111
R Ramatschi M.. . . . . . . . . . . . . . . . . 141 Redlich J.P. . . . . . . . . . . . . . . . . . . . . 84 Ritter H. . . . . . . . . . . . . . . . . . . . . . . . 3 Rix M. . . . . . . . . . . . . . . . . . . . . . . . 96 Rödelsperger S. . . . . . . . . . . . . . . . . 96 Röhrs M. . . . . . . . . . . . . . . . . . . . . . 16 S Schauerte W. . . . . . . . . . . . . . . . . . . 16 Scherbaum F. . . . . . . . . . . . . . . . . . 149 Schlick T. . . . . . . . . . . . . . . . . . . . . 111 Schlurmann T. . . . . . . . . . . . . . . . . . 73 Schöne T. . . . . . . . . . . . . . . . . . . . . 141 Schuhbäck S. . . . . . . . . . . . . . . . . . . 33 Seidenberger K. . . . . . . . . . . . . . . . . 96 Setiadi N. . . . . . . . . . . . . . . . . . . . . . 73 Shirzaei M.. . . . . . . . . . . . . . . . . . . . 96 Siegert S. . . . . . . . . . . . . . . . . . . . . . 73 Singer J.. . . . . . . . . . . . . . . . . . . . . . 33
Stammler K. . . . . . . . . . . . . . . . . . . 149 Stammler K. . . . . . . . . . . . . . . . . . . . 96 Stittgen H. . . . . . . . . . . . . . . . . . . . . 96 Strunz G. . . . . . . . . . . . . . . . . . . . . . 73 T Taubenböck H. . . . . . . . . . . . . . . . . . 73 Thiebes B. . . . . . . . . . . . . . . . . . . . . 16 Thuro K. . . . . . . . . . . . . . . . . . . . . . 33 Titzschkau T. . . . . . . . . . . . . . . . . . 123 V Valks P. . . . . . . . . . . . . . . . . . . . . . . 96 W Wallenstein N. . . . . . . . . . . . . . . . . . 96 Walter K. . . . . . . . . . . . . . . . . . . . . . . 3 Walter T. . . . . . . . . . . . . . . . . . . . . . 96 Wasmeier P. . . . . . . . . . . . . . . . . . . . 33 Wassermann J. . . . . . . . . . . . . . . . . . 96 Wenzel F. . . . . . . . . . . . . . . . . . . . . 123 Wenzel F. . . . . . . . . . . . . . . . . . . . . . 84 Wiebe H. . . . . . . . . . . . . . . . . . . . . . 16 Wunderlich T. . . . . . . . . . . . . . . . . . . 33 X Xu J. . . . . . . . . . . . . . . . . . . . . . . . 111 Y Yuan X. . . . . . . . . . . . . . . . . . . . . 149 Z Zakšek K. . . . . . . . . . . . . . . . . . . . . 96 Zschau J. . . . . . . . . . . . . . . . . . . . . . 84