Anna Bozza, Domenico Asprone, and Gaetano Manfredi
In the early 21st century, achieving the sustainability of urban environments while coping with increasingly frequent natural disasters is an ambitious challenge for contemporary communities. In this context, urban resilience is a comprehensive objective that communities can pursue to ensure future sustainable cities able to cope with the risks to which they are exposed.
Researchers have developed different definitions of resilience as this concept has been applied to diverse topics and issues in recent decades. Essentially, resilience is defined as the capability of a system to withstand major unexpected events and recover in a functional and efficient manner. When dealing with urban environments, the efficiency of the recovery can be related to multiple aspects, many of which are often hard to control. Mainly it is quantified in terms of the restoration of urban economy, population, and built form (Davoudi et al., 2012). In this article, engineering resilience is defined in relation to cities’ capability to remain sustainable during the occurrence of an extreme event while reconfiguring their physical form. In this view, a city is resilient if it remains sustainable when a hazardous event occurs.
Accordingly, in an urban context, a wide range of nonhomogeneous factors and intrinsic dynamics have to be accounted for, which requires a multi-scale approach, from the single building level to the urban and, ultimately, the global environmental scale. As a consequence, cities can be understood as physical systems assessed through engineering metrics. Hence, the physical dimension represents a starting point from which to approach resilience. When shifting the focus from the single structure to the city scale, human behavior is revealed to be a critical factor because social actors behave and make choices every day in an unpredictable and unorganized manner, which affects city functioning. Urban complexity can then be addressed through an ecosystem theory approach, which accounts for the interrelations between physical and human components.
Russ S. Schumacher
Heavy precipitation, which in many contexts is welcomed because it provides the water necessary for agriculture and human use, in other situations is responsible for deadly and destructive flash flooding. Over the 30-year period from 1986 to 2015, floods were responsible for more fatalities in the United States than any other convective weather hazard (www.nws.noaa.gov/om/hazstats.shtml), and similar findings are true in other regions of the world. Although scientific understanding of the processes responsible for heavy rainfall continues to advance, there are still many challenges associated with predicting where, when, and how much precipitation will occur. Common ingredients are required for heavy rainfall to occur, but there are vastly different ways in which the atmosphere brings the ingredients together in different parts of the world. Heavy precipitation often occurs on very small spatial scales in association with deep convection (thunderstorms), factors that limit the ability of numerical models to represent or predict the location and intensity of rainfall. Furthermore, because flash floods are dependent not only on precipitation but also on the characteristics of the underlying land surface, there are fundamental difficulties in accurately representing these coupled processes. Areas of active current research on heavy rainfall and flash flooding include investigating the storm-scale atmospheric processes that promote extreme precipitation, analyzing the reasons that some rainfall predictions are very accurate while others fail, improving the understanding and prediction of the flooding response to heavy precipitation, and determining how heavy rainfall and floods have changed and may continue to change in a changing climate.
Tropical cyclones, also known as hurricanes or typhoons, are one of the most violent weather phenomena on the planet, posing significant threats to those living near or along coastlines where tropical cyclone–related impacts are most pronounced. About 80 tropical cyclones form annually, a rate that has been remarkably steady over the period of reliable historical record. Roughly two thirds of these storms form in the Northern Hemisphere from about June to November, while the remaining third form in the Southern Hemisphere typically during the months of November to May. Our understanding of the global and regional spatial patterns, the year-to-year variability, and temporal trends of these storms has improved considerably since the advent of meteorological satellites in the 1960s because of advances in both remote-sensing technology and operational analysis procedures. The well-recognized spatial patterns of tropical cyclone formation and tracks were laid out in a series of seminal papers in the late 1960s and 1970s and remain an accurate sketch even to this day. Concerning the year-to-year variability of tropical cyclone frequency, the El Niño Southern Oscillation (ENSO) has by far the most dominant influence across multiple ocean basins, so much so that it is typically used as the main predictor for statistical forecasts of seasonal tropical cyclone activity. ENSO has a modulating influence on atmospheric circulation patterns, even in regions remote to the tropical Pacific, which, in turn, can act to enhance or inhibit tropical cyclone formation.
While the meteorological and climate community has come a long way in our understanding of the global and regional climatological features of tropical cyclones, as well as some aspects of the broader relationship between tropical cyclones and climate, we are still hindered by temporal inconsistencies within the historical record of storm data, particularly pertaining to tropical cyclone intensity. Despite recent efforts to homogenize the historical record using satellite-derived intensity data back to the early 1980s, the relatively short period makes it difficult to discern secular trends due to anthropogenic climate change from natural trends occurring on decadal to multidecadal time scales.
Evolution of Strategic Flood Risk Management in Support of Social Justice, Ecosystem Health, and Resilience
Throughout history, flood management practice has evolved in response to flood events. This heuristic approach has yielded some important incremental shifts in both policy and planning (from the need to plan at a catchment scale to the recognition that flooding arises from multiple sources and that defenses, no matter how reliable, fail). Progress, however, has been painfully slow and sporadic, but a new, more strategic, approach is now emerging.
A strategic approach does not, however, simply sustain an acceptable level of flood defense. Strategic Flood Risk Management (SFRM) is an approach that relies upon an adaptable portfolio of measures and policies to deliver outcomes that are socially just (when assessed against egalitarian, utilitarian, and Rawlsian principles), contribute positively to ecosystem services, and promote resilience. In doing so, SFRM offers a practical policy and planning framework to transform our understanding of risk and move toward a flood-resilient society. A strategic approach to flood management involves much more than simply reducing the chance of damage through the provision of “strong” structures and recognizes adaptive management as much more than simply “wait and see.” SFRM is inherently risk based and implemented through a continuous process of review and adaptation that seeks to actively manage future uncertainty, a characteristic that sets it apart from the linear flood defense planning paradigm based upon a more certain view of the future.
In doing so, SFRM accepts there is no silver bullet to flood issues and that people and economies cannot always be protected from flooding. It accepts flooding as an important ecosystem function and that a legitimate ecosystem service is its contribution to flood risk management. Perhaps most importantly, however, SFRM enables the inherent conflicts as well as opportunities that characterize flood management choices to be openly debated, priorities to be set, and difficult investment choices to be made.
The immediate aftermath of a great urban earthquake is a dramatic and terrible event, comparable to a massive terrorist attack. Yet the shocking impact soon fades from the public mind and receives surprisingly little attention from historians, unlike wars and human atrocities. In 1923, the Great Kanto earthquake and its subsequent fires demolished most of Tokyo and Yokohama and killed around 140,000 Japanese: a level of devastation and fatalities comparable with the atomic bombing of Hiroshima and Nagasaki in 1945. But the second event has infinitely more resonance in public consciousness and historical studies than the first. Indeed, most people would be challenged to name a single earthquake with an indisputable historical impact, including even the most famous of all earthquakes: the San Francisco earthquake and fire of 1906.
In truth, however, great earthquakes, from ancient times—as recorded by Greek and biblical writers—to the present day, have had major cultural, economic, and political consequences—often a combination of all three—some of which were beneficial. Thus, the current prime minister of India owes his election in 2014 to an earthquake that devastated part of his home state of Gujarat in 2001, which led to its striking economic growth. The martial law imposed on Tokyo and Yokohama after the 1923 earthquake gave new authority to the Japanese army, which eventually took over the Japanese government and led Japan to war with China and the world. The destruction of San Francisco in 1906 produced a boom in rebuilding and financial and technological development of the surrounding area on the San Andreas Fault, including what became Silicon Valley. A great earthquake in Venezuela in 1812 was the principal cause of the temporary defeat of its leader Simon Bolivar by the Spanish colonial regime, but his subsequent exile led to his permanent freeing of Bolivia, Colombia, Ecuador, Peru, and Venezuela from Spanish rule. The catastrophic Lisbon earthquake of 1755—as well known in the early 19th century as the 1945 atomic bombings are today—was a pivotal factor in the freeing of Enlightenment science from Catholic religious orthodoxy, as epitomized by Voltaire’s satirical novel Candide, written in response to the earthquake. Even the minor earthquakes in Britain in 1750, the so-called Year of Earthquakes, produced the earliest scientific understanding of earthquakes, published by the Royal Society: the beginning of seismology.
The long-term impact of a great earthquake depends on its epicenter, magnitude, and timing—and also on human factors: the political, social, intellectual, religious, and cultural resources specific to a region’s history. Each earthquake-struck society offers its own particular lesson, and yet, taken together, such earth-shattering events have important shared consequences for the history of the world.
Michael Wehner, Federico Castillo, and Dáithí Stone
Extremely high air temperatures are uncomfortable for everyone. For some segments of the population, they can be deadly. Both the physical and societal aspects of intense heat waves in a changing climate warrant close study. The large-scale meteorological patterns leading to such events lay the framework for understanding their underlying causal mechanisms, while several methods of quantifying the combination of heat and humidity can be used to determine when these patterns result in stressful conditions. We examine four historic heat waves as case studies to illustrate differences in the structure of heat waves and the variety of effects of extreme heat on humans, which are characterized in terms of demographic, geographic, and socioeconomic impacts, including mortality and economic ramifications.
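One widely used measure of combined heat and humidity is the U.S. National Weather Service heat index, commonly computed with the Rothfusz regression. A minimal sketch is given below; the function name and the simple low-temperature cutoff are illustrative choices, and the sketch omits the NWS adjustment terms applied at very low or very high humidity.

```python
def heat_index(temp_f: float, rh: float) -> float:
    """Approximate the NWS heat index via the Rothfusz regression.

    temp_f: air temperature in degrees Fahrenheit
    rh: relative humidity in percent (0-100)

    The regression is intended for temperatures of roughly 80 F and
    above; below that, this sketch simply returns the air temperature.
    """
    if temp_f < 80.0:
        return temp_f
    t, r = temp_f, rh
    return (-42.379
            + 2.04901523 * t
            + 10.14333127 * r
            - 0.22475541 * t * r
            - 6.83783e-3 * t * t
            - 5.481717e-2 * r * r
            + 1.22874e-3 * t * t * r
            + 8.5282e-4 * t * r * r
            - 1.99e-6 * t * t * r * r)

# Example: 95 F air temperature at 50% relative humidity
# yields an apparent temperature of roughly 105 F.
print(round(heat_index(95.0, 50.0)))
```

Indices like this one turn a large-scale pattern of temperature and moisture into a single apparent-temperature value, which is how thresholds for "stressful conditions" are typically operationalized.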
Weather station data and climate model projections for the future point to an increase in the frequency and intensity of extreme heat waves as the overall climate gets warmer. Changes in the radiative energy balance of the planet are the principal culprit behind this increase. Quantifying changes in the statistics of extreme heat waves allows for examination of changes in their potential contribution to human health risk. Large-scale mortality during heat waves always occurs within a context of other factors, including public health policy, rural and urban management and planning, and cultural practices. Consequently, the impacts of heat waves can be reduced, and may in many places be manageable into the future, through implementation of such measures as public health warning systems, effective land management, penetration of air conditioning, and increased monitoring of vulnerable or exposed individuals. Given the potential for severe impacts of the more intense heat waves that are virtually certain to occur in the warmer future, it is critical that both the physical and social sciences be considered together to enable society to adapt to these conditions.
Guy J.-P. Schumann
For about 40 years, with a proliferation over the last two decades, remote sensing data, primarily in the form of satellite and airborne imagery and altimetry, have been used to study floods, floodplain inundation, and river hydrodynamics. The sensors and data processing techniques that exist to derive information about floods are numerous. Instruments that record flood events may operate in the visible, thermal, and microwave range of the electromagnetic spectrum. Due to the limitations posed by adverse weather conditions during flood events, radar (microwave range) sensors are invaluable for monitoring floods; however, if a visible image of flooding can be acquired, retrieving useful information from this is often more straightforward. During recent years, scientific contributions in the field of remote sensing of floods have increased considerably, and science has presented innovative research and methods for retrieving information content from multi-scale coverages of disastrous flood events all over the world. Progress has been transformative, and the information obtained from remote sensing of floods is becoming mature enough to not only be integrated with computer simulations of flooding to allow better prediction, but also to assist flood response agencies in their operations.
Furthermore, this advancement has led to a number of recent and upcoming satellite missions that are already transforming current procedures and operations in flood modeling and monitoring, as well as our understanding of river and floodplain hydrodynamics globally. Global initiatives that utilize remote sensing data to strengthen support in managing and responding to flood disasters (e.g., The International Charter, The Dartmouth Flood Observatory, CEOS, NASA’s Servir and the European Space Agency’s Tiger-Net initiatives), primarily in developing nations, are becoming established and also recognized by many nations that are in need of assistance because traditional ground-based monitoring systems are sparse and in decline. The value remote sensing can offer is growing rapidly, and the challenge now lies in ensuring sustainable and interoperable use as well as optimized distribution of remote sensing products and services for science as well as operational assistance.
Rapid urbanization and growing populations have put tremendous pressures on limited global housing stocks. As the frequency of disasters has increased with devastating impacts on this limited stock of housing, the discourse on post-disaster housing recovery has evolved in several ways. Prior to the 1970s, the field was largely understudied, and there was a narrow understanding of how households and communities rebuilt their homes after a catastrophic event and of the effectiveness of housing recovery policy and programs designed to assist them. Early debates on post-disaster housing recovery centered on cultural and technological appropriateness of housing recovery programs. The focus on materials, technology, and climate missed larger socioeconomic and political complexities of housing recovery. Since then, the field has come a long way: current theoretical and policy debates focus on the effect of governance structures, funding practices, the consequences of public and private interventions, and socioeconomic and institutional arrangements that affect housing recovery outcomes.
There are a number of critical issues that shape long-term post-disaster housing recovery processes and outcomes, especially in urban contexts. Some of them include the role of the government in post-disaster housing recovery, governance practices that drive recovery processes and outcomes, the challenges of paying for post-disaster housing repair and reconstruction, the disconnect between planning for rebuilding and planning for housing recovery, and the mismatch between existing policy programs and housing needs after a catastrophic event—particularly for affordable housing recovery. Moreover, as housing losses after disasters continue to increase, and as the funding available to rebuild housing stocks shrinks, it has become increasingly important to craft post-disaster housing recovery policy and programs that apply the limited resources in the most efficient and impactful ways. Creating housing recovery programs by employing a needs-based approach instead of one based solely on loss could more effectively focus limited resources on those who need them most. Such an approach would be broad based and proportional, as it would address the housing recovery of a wide range of groups based upon their needs, including low-income renters, long-term leaseholders, residents of informal settlements and manufactured homes, as well as those with preexisting resources such as owner-occupant housing.
Glacier retreat is considered to be one of the most obvious manifestations of recent and ongoing climate change in the majority of glacierized alpine and high-latitude regions throughout the world. Glacier retreat itself is both directly and indirectly connected to the various interrelated geomorphological/hydrological processes and changes in hydrological regimes. Various types of slope movements and the formation and evolution of lakes are observed in recently deglaciated areas. These are most commonly glacial lakes (ice-dammed, bedrock-dammed, or moraine-dammed lakes).
“Glacial lake outburst flood” (GLOF) is a phrase used to describe a sudden release of a significant amount of water retained in a glacial lake, irrespective of the cause. GLOFs are characterized by extreme peak discharges, often several times in excess of the maximum discharges of hydrometeorologically induced floods, with an exceptional erosion/transport potential; therefore, they can turn into flow-type movements (e.g., GLOF-induced debris flows). Some of the Late Pleistocene lake outburst floods are ranked among the largest reconstructed floods, with peak discharges of up to 10⁷ m³/s and significant continental-scale geomorphic impacts. They are also considered capable of influencing global climate by releasing extremely high amounts of cold freshwater into the ocean. Lake outburst floods associated with recent (i.e., post-Little Ice Age) glacier retreat have become a widely studied topic from the perspective of the hazards and risks they pose to human society, and the possibility that they are driven by anthropogenic climate change.
Despite apparent regional differences in triggers (causes) and subsequent mechanisms of lake outburst floods, rapid slope movement into lakes, producing displacement waves leading to dam overtopping and eventually dam failure, is documented most frequently, being directly (ice avalanche) and indirectly (slope movement in recently deglaciated areas) related to glacial activity and glacier retreat. Glacier retreat and the occurrence of GLOFs are, therefore, closely tied, because glacier retreat is connected to: (a) the formation of new, and the evolution of existing, lakes; and (b) triggers of lake outburst floods (slope movements).
Fatalism about natural disasters hinders action to prepare for those disasters, and overcoming this fatalism is one key element to preparing people for these disasters. Research by Bostrom and colleagues shows that failure to act often reflects gaps and misconceptions in citizens’ mental models of disasters. Research by McClure and colleagues shows that fatalistic attitudes reflect people’s attribution of damage to uncontrollable natural causes rather than to controllable human actions, such as preparation. Research shows which precise features of risk communications lead people to see damage as preventable and to attribute damage to controllable human actions. Messages that enhance the accuracy of mental models of disasters by including human factors recognized by experts lead to increased preparedness. Effective messages also communicate that major damage in disasters is often distinctive and reflects controllable causes. These messages underpin causal judgments that reduce fatalism and enhance preparation. Many of these messages are not only beneficial but also newsworthy. Messages that are logically equivalent but are differently framed have varying effects on risk judgments and preparedness. The causes of harm in disasters are often contested, because they often imply human responsibility for the outcomes and entail significant cost.