Brenden Jongman, Hessel C. Winsemius, Stuart A. Fraser, Sanne Muis, and Philip J. Ward
This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Natural Hazard Science. Please check back later for the full article.
Flooding of rivers and coastlines is the most frequent and damaging of all natural hazards. Between 1980 and 2013, total direct damages exceeded $1 trillion, and at least 220,000 people lost their lives. Events with major economic losses include the 2011 flooding in Thailand ($40 billion) and the 2013 Central Europe floods ($16 billion). Flooding also triggers great humanitarian challenges. The 2015 Malawi floods were the worst in the country’s history and were followed by food shortages across large parts of the country.
Flood losses are increasing rapidly in some world regions, driven by economic development in floodplains and increases in extreme precipitation events and global sea level due to climate change. The biggest increase in flood losses is seen in low-income countries, where population growth is rapid and many cities are expanding quickly. At the same time, evidence shows that adaptation to flood risk is already happening, and that a large proportion of losses can be successfully contained by effective risk management strategies. Such risk management strategies may include floodplain zoning, construction and maintenance of flood defenses, reforestation of land draining into rivers, and use of early warning systems.
To reduce risk effectively, it is important to know the location and impact of potential floods under current and future social and environmental conditions. In a risk assessment, models can be used to map the flow of water over land after an intense rainfall event or storm surge (the “hazard”). Modeled for many different potential events, this provides estimates of potential inundation depth in flood-prone areas. Such maps can be constructed for different climate change scenarios, based on the changes in rainfall, temperature, and sea level that those scenarios specify.
To assess the impact of the modeled hazard, that is, the cost of damage or lives lost, exposure (including buildings, population, and infrastructure) must be mapped using land-use and population density data, as well as construction information. Population growth and urban expansion can be simulated by increasing the density or extent of the urban area in the model. The effects of a flood on people and on types of buildings and infrastructure are determined using a vulnerability function, which indicates the damage expected to occur to a structure (or group of people) as a function of flood intensity (i.e., inundation depth and flow velocity).
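The hazard-exposure-vulnerability chain described above can be sketched numerically. The following fragment is a minimal illustration, not any published model: the depth-damage curve, asset values, and inundation depths are all hypothetical.

```python
import numpy as np

# Hypothetical depth-damage (vulnerability) curve: fraction of an asset's
# value lost as a function of inundation depth in metres.
curve_depths = np.array([0.0, 0.5, 1.0, 2.0, 4.0])   # inundation depth (m)
curve_damage = np.array([0.0, 0.2, 0.4, 0.7, 1.0])   # damage fraction

def damage_fraction(depth_m):
    """Interpolate the vulnerability curve; depths beyond 4 m mean total loss."""
    return np.interp(depth_m, curve_depths, curve_damage)

# Exposure: asset values on the same grid cells as the modelled inundation
# depths for a single flood event (all values hypothetical).
asset_values = np.array([100_000.0, 250_000.0, 80_000.0, 60_000.0])  # $ per cell
inundation   = np.array([0.3,       1.5,       0.0,      2.2])       # m per cell

# Event loss = sum over cells of exposure value times damage fraction.
event_loss = float(np.sum(asset_values * damage_fraction(inundation)))
```

A full risk assessment repeats this calculation for many events and scenarios; in practice the vulnerability curve is derived empirically or analytically for each building class rather than assumed.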
Potential adaptation measures such as land-use change or new flood defenses can be included in the model, to understand how effective they may be in reducing flood risk. This way, risk assessments can demonstrate the possible approaches available to policy makers to build a less risky future.
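One common way to quantify the effect of such measures, though not one the summary spells out, is through the expected annual damage (EAD): event losses are integrated over their annual exceedance probability, and a flood defense is represented by truncating losses for events within its protection standard. The sketch below assumes this standard approach, with entirely hypothetical loss figures.

```python
import numpy as np

# Hypothetical event set: modelled losses (in $M) at several return periods.
return_periods = np.array([2.0, 10.0, 50.0, 100.0, 500.0])   # years
losses         = np.array([0.0, 5.0, 20.0, 40.0, 90.0])      # $M per event

def expected_annual_damage(rp, loss):
    """Trapezoidal integration of loss over annual exceedance probability."""
    p = 1.0 / np.asarray(rp, dtype=float)          # exceedance probability
    order = np.argsort(p)                          # ascending probability
    p, loss = p[order], np.asarray(loss, dtype=float)[order]
    return float(np.sum(0.5 * (loss[1:] + loss[:-1]) * np.diff(p)))

baseline = expected_annual_damage(return_periods, losses)

# A defense protecting against events up to the 1-in-50-year flood is
# represented by zeroing losses within the protection standard.
protected = np.where(return_periods <= 50.0, 0.0, losses)
with_defense = expected_annual_damage(return_periods, protected)
```

Comparing `baseline` and `with_defense` for alternative portfolios of measures is how a risk assessment can rank the options available to policy makers.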
Evolution of Strategic Flood Risk Management in Support of Social Justice, Ecosystem Health, and Resilience
Throughout history, flood management practice has evolved in response to flood events. This heuristic approach has yielded some important incremental shifts in both policy and planning (from the need to plan at a catchment scale to the recognition that flooding arises from multiple sources and that defenses, no matter how reliable, fail). Progress, however, has been painfully slow and sporadic, but a new, more strategic, approach is now emerging.
A strategic approach does not, however, simply sustain an acceptable level of flood defense. Strategic Flood Risk Management (SFRM) is an approach that relies upon an adaptable portfolio of measures and policies to deliver outcomes that are socially just (when assessed against egalitarian, utilitarian, and Rawlsian principles), contribute positively to ecosystem services, and promote resilience. In doing so, SFRM offers a practical policy and planning framework to transform our understanding of risk and move toward a flood-resilient society. A strategic approach to flood management involves much more than simply reducing the chance of damage through the provision of “strong” structures and recognizes adaptive management as much more than simply “wait and see.” SFRM is inherently risk based and implemented through a continuous process of review and adaptation that seeks to actively manage future uncertainty, a characteristic that sets it apart from the linear flood defense planning paradigm based upon a more certain view of the future.
In doing so, SFRM accepts there is no silver bullet to flood issues and that people and economies cannot always be protected from flooding. It accepts flooding as an important ecosystem function and that a legitimate ecosystem service is its contribution to flood risk management. Perhaps most importantly, however, SFRM enables the inherent conflicts as well as opportunities that characterize flood management choices to be openly debated, priorities to be set, and difficult investment choices to be made.
How big, how often, and where from? This is almost a mantra for researchers trying to understand tsunami hazard and risk. What we do know is that events such as the 2004 Indian Ocean Tsunami (2004 IOT) caught scientists by surprise, largely because there was no “research memory” of past events for that region, and as such, there was no hazard awareness, no planning, no risk assessment, and no disaster risk reduction. Forewarned is forearmed, but to be in that position, we have to be able to understand the evidence left behind by past events—palaeotsunamis—and to have at least some inkling of what generated them.
While the 2004 IOT was a devastating wake-up call for science, we need to bear in mind that palaeotsunami research was still in its infancy at the time. What we now see is still a comparatively new discipline that is practiced worldwide, but as the “new kid on the block,” there are still many unknowns. What we do know is that in many cases, there is clear evidence of multiple palaeotsunamis generated by a variety of source mechanisms. There is a suite of proxy data—a toolbox, if you will—that can be used to identify a palaeotsunami deposit in the sedimentary record. Things are never quite as simple as they sound, though, and there are strong divisions within the research community as to whether one can really differentiate between a palaeotsunami and a palaeostorm deposit, and whether proxies as such are the way to go. As the discipline matures, though, many of these issues are being resolved, and indeed we have now arrived at a point where we have the potential to detect “invisible deposits” laid down by palaeotsunamis once they have run out of sediment to lay down as they move inland. As such, we are on the brink of being able to better understand the full extent of inundation by past events, a valuable tool in gauging the magnitude of palaeotsunamis.
Palaeotsunami research is multidisciplinary, and as such, it is a melting pot of different scientific perspectives, which leads to rapid innovations. Basically, whatever is associated with modern events may be reflected in prehistory. Also, palaeotsunamis are often part of a landscape response pushed beyond an environmental threshold from which it will never fully recover, but that leaves indelible markers for us to read. In some cases, we do not even need to find a palaeotsunami deposit to know that one happened.
Abdelghani Meslem and Dominik H. Lang
In the fields of earthquake engineering and seismic risk reduction the term “physical vulnerability” defines the component that translates the relationship between seismic shaking intensity, dynamic structural response (physical damage), and cost of repair for a particular class of buildings or infrastructure facilities. The concept of physical vulnerability started with the development of the earthquake damage and loss assessment discipline in the early 1980s, which aimed at predicting the consequences of earthquake shaking for an individual building or a portfolio of buildings. In general, physical vulnerability has become one of the main key components used as model input data by agencies when developing prevention and mitigation actions, code provisions, and guidelines. The same may apply to the insurance and reinsurance industry in developing catastrophe models (also known as CAT models).
Since the late 1990s, a blossoming of methodologies and procedures for modelling and measuring physical vulnerability can be observed, ranging from empirical to basic and more advanced analytical methods. These methods use approaches that differ in terms of level of complexity, calculation effort (in evaluating the seismic demand, structural response, and damage), and the modelling assumptions adopted in the development process. One challenge that is often encountered at this stage is that some of these assumptions may strongly degrade the reliability and accuracy of the resulting physical vulnerability models, thereby introducing important uncertainties into the estimation and prediction of the inherent risk (i.e., estimated damage and losses).
Other challenges commonly encountered when developing physical vulnerability models are the paucity of exposure information and gaps in knowledge caused by technical or nontechnical problems, such as missing inventory data that would allow for accurate building stock modeling, or missing economic data that would allow for a better conversion from damage to monetary losses. Hence, these physical vulnerability models will carry different types of intrinsic uncertainties, of both aleatory and epistemic character. To produce reliable predictions of expected damage and losses for an individual asset (e.g., a building) or a class of assets (e.g., a building typology class or a group of buildings), physical vulnerability models have to be generated considering all these peculiarities and the associated intrinsic uncertainties at each stage of the development process.
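The summary does not prescribe a functional form, but analytical vulnerability and fragility models are commonly expressed as lognormal curves giving the probability of reaching or exceeding a damage state as a function of a shaking intensity measure. The sketch below uses that common convention; the median and dispersion values for the building class are hypothetical.

```python
import math

def fragility(im, median, beta):
    """Lognormal fragility curve: P(damage state reached or exceeded | im).

    median: intensity at which the exceedance probability is 50%.
    beta:   lognormal dispersion, capturing aleatory/epistemic uncertainty.
    """
    if im <= 0.0:
        return 0.0
    z = (math.log(im) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical curves for one building class, intensity measured as PGA (g).
states = {"slight": (0.15, 0.6), "moderate": (0.30, 0.6), "extensive": (0.60, 0.6)}

pga = 0.30
probs = {state: fragility(pga, m, b) for state, (m, b) in states.items()}
```

By construction, the exceedance probability equals 0.5 at the median intensity, and lighter damage states are always at least as probable as heavier ones for the same shaking level.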
Mahesh Prakash, James Hilton, and Claire Miller
This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Natural Hazard Science. Please check back later for the full article.
Remotely sensed data for the observation and analysis of natural hazards is becoming increasingly commonplace and accessible, and its accuracy and coverage are rapidly improving. In parallel with this growth are ongoing developments in computational methods to store, process, and analyze this data for a variety of geospatial needs. One such use of this geospatial data is as input and calibration data for models of natural hazards, such as the spread of wildfires, flooding, tidal inundation, and landslides. Computational models for natural hazards show increasing real-world applicability, and only recently has the full potential of using remotely sensed data in these models begun to be understood and investigated. Some examples of geospatial data required for natural hazard modelling include:
• Elevation models derived from Radio Detection and Ranging (RADAR) and Light Detection and Ranging (LIDAR) techniques for flooding, landslide, and wildfire spread models.
• Accurate vertical datum calculations from geodetic measurements for flooding and tidal inundation models.
• Multi-spectral imaging techniques to provide land cover information for fuel types in wildfire models or roughness maps for flood inundation studies.
Accurate modelling of such natural hazards allows qualitative and quantitative estimates of the risks associated with such events. With increasing spatial and temporal resolution, there is an opportunity to investigate further value-added uses of remotely sensed data in the disaster-modelling context. Improving spatial resolution gives models greater fidelity, allowing, for example, the impact of fires or flooding on individual households to be determined. Improving temporal resolution allows models to incorporate short- and long-term trends, such as the changing conditions through a fire season or the changing depth and meander of a water channel.
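As a small, concrete illustration of how one of the data types listed above feeds a hazard model, terrain slope (an input to landslide and wildfire spread models) can be derived from an elevation grid by finite differences. The grid and cell size below are synthetic stand-ins for a real LIDAR- or RADAR-derived DEM.

```python
import numpy as np

# Synthetic 4x4 elevation grid in metres; a real DEM would come from
# LIDAR/RADAR products resampled to the hazard model's grid.
dem = np.array([[10.0, 10.0, 10.0, 10.0],
                [12.0, 12.0, 12.0, 12.0],
                [14.0, 14.0, 14.0, 14.0],
                [16.0, 16.0, 16.0, 16.0]])
cell_size = 30.0   # grid spacing in metres (hypothetical resolution)

# Central-difference gradients; np.gradient handles the grid edges with
# one-sided differences. First array varies along rows (y), second along
# columns (x).
dz_dy, dz_dx = np.gradient(dem, cell_size)
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
```

Because this synthetic surface is a uniform ramp rising 2 m per 30 m cell, every cell ends up with the same slope, arctan(2/30) in degrees; on a real DEM the same two lines yield a per-cell slope map.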
Between 50 and 70 volcanoes erupt each year—just a fraction of the 1,000 identified volcanoes that may erupt in the near future. When compared with the catastrophic loss of lives and property resulting from typhoons, earthquakes, and floods, losses from the more infrequent but equally devastating volcanic eruptions are often overlooked. Volcanic events are usually dramatic, but their various effects may occur almost imperceptibly or with horrendous speed and destruction. The intermittent nature of this activity makes it difficult to maintain public awareness of the risks. Assessing volcanic hazards and their risks remains a major challenge for volcanologists.
Several generations ago, only a small, international fraternity of volcanologists was involved in the complex and sometimes dangerous business of studying volcanoes. To understand eruptions required extensive fieldwork and analysis of the eruption products—a painstaking process. Consequently, most of the world’s volcanoes had not been studied, and many were not yet even recognized. Volcano research was meagerly supported by some universities and a handful of government-sponsored geological surveys. Despite the threats posed by volcanoes, few volcanological observatories had been established to monitor their activity.
Volcanology is now a global venture. Gone are the days when volcanologists were educated or employed chiefly by the industrial nations. Today, volcanologists and geological surveys are located in many nations with active volcanoes. Volcanological meetings, once limited to geologists, geophysicists, and a smattering of meteorologists and disaster planners, have greatly expanded. Initially, it was a hard sell to convince volcanologists that professionals from the “soft sciences” could contribute to the broad discipline of volcanology. However, it has become clear that involving decision makers such as urban planners, politicians, and public health professionals with volcanologists is a must when exploring and developing practical, effective volcanic-risk mitigation.
Beginning in 1995, the “Cities on Volcanoes” meetings were organized to introduce an integrated approach that would eventually help mitigate the risks of volcanic eruptions. The first conference, held in Rome and Naples, Italy, encompassed a broad spectrum of topics from the fields of volcanology, geographic information systems, public health, remote sensing, risk analysis, civil engineering, sociology and psychology, civil defense, city management, city planning, education, the media, the insurance industry, and infrastructure management. The stated mission of that meeting was to “better evaluate volcanic crisis preparedness and emergency management in cities and densely populated areas.” Since that meeting nearly twenty years ago, Cities on Volcanoes meetings have taken place in New Zealand, Hawaii, Ecuador, Japan, Spain, and Mexico; the 2014 venue was Yogyakarta, Indonesia. The significant and rewarding result of these efforts is a growing connection between basic science and the practical applications needed to better understand the myriad risks as well as the possible hazard mitigation strategies associated with volcanic eruptions.
While we pursue this integrated approach, we see advances in the technologies needed to evaluate and monitor volcanoes. It is impossible to visit all the world’s restless volcanoes, let alone establish effective monitoring stations for most of them. However, we can now scrutinize their thermal signatures and local ground deformation with instruments on earth-observing satellites. When precursory activity is detected by remote sensors in an area where a population is at risk, teams can be deployed for ground-based monitoring of that activity. In addition, by evaluating a volcano’s past eruption history, scientists can forecast both future activity and the possible risks to inhabitants. Physics-based modeling has brought a better understanding of the types and severity of potential eruption phenomena such as pyroclastic flows, ash eruptions, gaseous discharge, and lava flows. Field observations of changes indicating an imminent eruption are now monitored with geophysical and geochemical instrumentation that is smaller, tougher, and more affordable.
Volcanology has evolved into a broader, integrated scientific discipline, but there is much still to be accomplished. The new generation of volcanologists, who have the advantage of knowing the theoretical underpinnings of volcanic activity, can now turn to the allied endeavor of reducing risk—their aspiration for the 21st century.