Title Application of wildfire simulation models for risk analysis
Authors A. Ager, M. Finney
Conference EGU General Assembly 2009
Media type Article
Language English
Digital document PDF
Published In: GRA - Volume 11 (2009)
Record number 250023866
 
Abstract
Wildfire simulation models are widely used by fire and fuels specialists in the U.S. to support tactical and strategic decisions related to the mitigation of wildfire risk. Much of this application has resulted from the development of a minimum travel time (MTT) fire spread algorithm (M. Finney) that makes it computationally feasible to simulate thousands of fires and generate burn probability and intensity maps over large areas (10,000 – 2,000,000 ha). The MTT algorithm is parallelized for multi-threaded processing and is embedded in a number of research and applied fire modeling applications. High-performance computers (e.g., 32-way 64-bit SMP) are typically used for MTT simulations, although the algorithm is also implemented in the 32-bit desktop FlamMap3 program (www.fire.org). Extensive testing has shown that this algorithm can replicate large fire boundaries in the heterogeneous landscapes that typify much of the wildlands in the western U.S. In this paper, we describe the application of the MTT algorithm to understand spatial patterns of burn probability (BP) and to analyze wildfire risk to key human and ecological values. The work is focused on a federally managed 2,000,000 ha landscape in the central interior region of Oregon State, USA. The fire-prone study area encompasses a wide array of topography and fuel types and a number of highly valued resources that are susceptible to fire. We quantitatively defined risk as the product of the probability of a fire and the resulting consequence. Burn probabilities at specific intensity classes were estimated for each 100 x 100 m pixel by simulating 100,000 wildfires under burn conditions that replicated recent severe wildfire events that occurred under conditions where fire suppression was generally ineffective (97th percentile August weather).
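The Monte Carlo estimation step described above — counting, for each pixel and intensity class, how many of the simulated fires burned that pixel — can be sketched as follows. This is a minimal illustration, not the authors' MTT code: the grid size, fire count, intensity classes, and the stand-in "fire footprint" generator are all made-up assumptions standing in for an actual fire-spread simulation.

```python
import numpy as np

# Illustrative sketch of per-pixel burn probability (BP) estimation from
# Monte Carlo fire simulations. A real analysis would replace the random
# rectangular footprints with MTT fire-spread outputs.
rng = np.random.default_rng(0)
GRID = (50, 50)     # study-area raster (100 x 100 m pixels in the paper)
N_FIRES = 1000      # the abstract simulates 100,000 wildfires
N_CLASSES = 4       # discrete fire intensity classes (assumed count)

# burn_counts[c, i, j] = number of fires burning pixel (i, j) at class c
burn_counts = np.zeros((N_CLASSES,) + GRID)

for _ in range(N_FIRES):
    # Stand-in for one simulated fire: a small rectangular footprint with
    # one intensity class assigned per burned pixel.
    i0 = rng.integers(0, GRID[0] - 5)
    j0 = rng.integers(0, GRID[1] - 5)
    h, w = rng.integers(1, 6, size=2)
    footprint = np.zeros(GRID, dtype=bool)
    footprint[i0:i0 + h, j0:j0 + w] = True
    classes = rng.integers(0, N_CLASSES, size=GRID)
    for c in range(N_CLASSES):
        burn_counts[c] += footprint & (classes == c)

# BP per class and pixel: fraction of simulated fires burning that pixel
bp = burn_counts / N_FIRES        # shape (N_CLASSES, 50, 50)
bp_total = bp.sum(axis=0)         # overall burn probability per pixel
print(bp_total.mean())
```

Keeping the counts split by intensity class is what allows BPs to be coupled later with intensity-dependent loss functions rather than a single scalar loss.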
We repeated the simulation under milder weather (70th percentile August weather) to replicate a “wildland fire use” scenario in which suppression is minimized and fires are managed for fuel reduction. The average BP was calculated for these scenarios to examine variation within and among a number of key designated management units, including the forest-urban interface, conservation areas, protected species habitat, municipal watersheds, recreation areas, and others. To quantify risk, we developed a number of loss-benefit functions using fire effects models that relate fire intensity to tree mortality and biomass consumption. We used these relationships to measure the change in highly valued old forest, designated wildlife conservation areas, aboveground carbon, surface fuels, and other wildland values. The loss-benefit functions were then coupled with BPs for different intensity classes to estimate the expected value change (risk) for each pixel. For a subset of the study area we also measured the change in risk from fuels management for selected resources. Estimates of BP from the simulations, excluding non-burnable fuels (water, rock), ranged from 0.00001 to 0.026 within the study area, with a mean value of 0.007. In comparison, the annual burn probability estimated from fire occurrence data within the study area (1910 – 2003) was 0.0022. The estimate from simulations represents the average probability of a random pixel burning in a single large fire that escapes suppression, hence some difference is expected. Variation in BP among designated conservation and fire protection units was relatively large and illustrated spatial differences in wildfire likelihood among highly valued resources. For instance, among the 130 different forest-urban interface areas, average BP varied from 0.0001 to 0.02. BP for nesting sites used by the endangered Northern spotted owl averaged 0.04 and varied from 0.001 to 0.01.
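The coupling of class-specific BPs with loss-benefit functions to get expected value change per pixel amounts to a sum over intensity classes of BP(class) times loss(class). A hedged sketch, with entirely illustrative BP values and a hypothetical loss function (the study's actual loss-benefit functions come from fire effects models relating intensity to tree mortality and biomass consumption):

```python
import numpy as np

# Sketch of the expected-value-change (risk) calculation: per-pixel risk
# is the sum over intensity classes of burn probability times the fraction
# of resource value lost at that intensity. All values here are invented
# for illustration, not the study's data.
N_CLASSES = 4
GRID = (3, 3)

rng = np.random.default_rng(1)
# BP per intensity class and pixel, as produced by the simulation step
bp = rng.uniform(0.0, 0.01, size=(N_CLASSES,) + GRID)

# Hypothetical loss-benefit function: value fraction lost when a pixel
# burns at each intensity class (low-intensity fire loses little, crown
# fire loses most).
loss_fraction = np.array([0.05, 0.20, 0.60, 0.95])

# Expected value change (risk) per pixel: sum_c bp[c] * loss_fraction[c]
expected_loss = np.tensordot(loss_fraction, bp, axes=1)  # shape GRID
print(expected_loss)
```

A negative "loss" entry would represent a benefit (e.g., surface fuel reduction), which is how a single loss-benefit function can capture both sides of the trade-off the abstract describes.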
The marginal BPs for high fire intensities were higher for many of the conservation areas compared to the surrounding managed forest. Conservation areas located on the lee side of non-burnable fuels such as lava flows and lakes showed markedly reduced BP. When wildfire probabilities were combined with habitat loss functions for the Northern spotted owl, we observed expected loss from a random wildfire event ranging from 0.0 to 9.4%, with a mean value of 1.5%. Expected loss was strongly correlated with BP for owl habitat, apparently because fires at even very low intensities caused understory mortality and reduced stand canopy closure below minimum levels. Simulating strategic fuel treatments on a subunit of the area resulted in a significant decrease in the expected loss of owl habitat. Changing the weather from severe to mild (97th to 70th percentile) resulted in a dramatic 8-fold drop in BP and reduced the average wildfire size. However, the reduction was not uniform, with the departures well correlated with specific fuel models. In total, this work demonstrated the application of wildfire spread models to quantitative risk assessment for fuels management on federally managed lands in the U.S. The analyses revealed spatial variation in BP that is useful in prioritizing fuels treatments and guiding other wildfire mitigation activities. The work also illuminated the conflict between biodiversity conservation efforts on federally managed lands and the high wildfire risk on fire-prone landscapes.