Crown scorch volume and scorch height: estimates of postfire tree condition

1985 ◽  
Vol 15 (3) ◽  
pp. 596-598 ◽  
Author(s):  
David L. Peterson

In salvage operations after wildfire, timber managers need to identify those trees most likely to die. Crown scorch volume and scorch height are commonly used to estimate damage to conifers after fire. Calculated crown scorch volume based on scorch height and tree dimensions was compared with observed crown scorch volume for four common conifer species of the northern Rocky Mountains. Calculated crown scorch volume was significantly greater than observed crown scorch volume for all species. The overestimates result from differences in crown shape among species and among individual trees. When postfire tree condition was evaluated from observed crown scorch volume rather than from measured scorch height, crown damage was estimated with greater accuracy. Functions that estimate postfire tree mortality based on crown damage should be based on observed crown scorch volume rather than scorch height.
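The geometric source of the overestimate can be sketched with a toy calculation. The function below assumes a simple conical crown with its apex at the treetop; this is an illustrative geometry only, not the specific crown model used in the study. Because a cone's volume scales with the cube of its height, scorching the lower half of the crown's length removes far more than half of its volume, and real crown shapes that deviate from a cone shift the true fraction.

```python
def scorch_volume_fraction(tree_height, crown_base_height, scorch_height):
    """Fraction of crown volume scorched, assuming a conical crown
    with its apex at the treetop (an illustrative geometry only)."""
    crown_length = tree_height - crown_base_height
    if crown_length <= 0:
        raise ValueError("crown base must be below the treetop")
    # Scorch below the crown base damages no crown volume.
    scorched_length = max(0.0, min(scorch_height - crown_base_height, crown_length))
    # The unscorched top is a similar cone; its volume scales with height cubed.
    unscorched_fraction = ((crown_length - scorched_length) / crown_length) ** 3
    return 1.0 - unscorched_fraction

# A 20 m tree with crown base at 8 m, scorched to 14 m: half the crown
# length is scorched, yet 87.5% of the conical crown volume is scorched.
print(scorch_volume_fraction(20, 8, 14))
```

Estimates derived this way from scorch height alone inherit whatever error the assumed crown shape introduces, which is why the abstract favors directly observed scorch volume.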

2007 ◽  
Vol 37 (6) ◽  
pp. 1058-1069 ◽  
Author(s):  
Sharon Hood ◽  
Barbara Bentz

Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco) were monitored for 4 years following three wildfires. Logistic regression analyses were used to develop models predicting the probability of attack by Douglas-fir beetle (Dendroctonus pseudotsugae Hopkins, 1905) and the probability of Douglas-fir mortality within 4 years following fire. Percent crown volume scorched (crown scorch), cambium injury, diameter at breast height (DBH), and stand density index for Douglas-fir were most important for predicting Douglas-fir beetle attacks. A nonlinear relationship between crown scorch and cambium injury was observed, suggesting that beetles did not preferentially attack trees with both maximum crown scorch and cambium injury, but rather at some intermediate level. Beetles were attracted to trees with high levels of crown scorch, but not cambium injury, 1 and 2 years following fire. Crown scorch, cambium injury, DBH, and presence/absence of beetle attack were the most important variables for predicting postfire Douglas-fir mortality. As DBH increased, the predicted probability of mortality decreased for unattacked trees but increased for attacked trees. Field sampling suggested that ocular estimates of bark char may not be a reliable predictor of cambium injury. Our results emphasize the important role of Douglas-fir beetle in tree mortality patterns following fire, and the models offer improved prediction of Douglas-fir mortality for use in areas with or without Douglas-fir beetle populations.
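The mortality model's structure can be sketched as a logistic function of the predictors the abstract names. The coefficients below are invented for illustration, not the fitted values from the study; note how the sign of the DBH term flips with attack status, reproducing the reported interaction (larger unattacked trees are less likely to die, larger attacked trees more likely).

```python
import math

def mortality_probability(crown_scorch_pct, cambium_quadrants_killed,
                          dbh_cm, attacked):
    """Probability of postfire death from a logistic model of the general
    form used in the study. All coefficients here are hypothetical."""
    # Linear predictor: scorch and cambium injury raise risk; the DBH
    # effect reverses sign depending on beetle attack (an interaction).
    dbh_effect = 0.05 if attacked else -0.05
    x = (-4.0
         + 0.05 * crown_scorch_pct
         + 0.8 * cambium_quadrants_killed
         + dbh_effect * dbh_cm
         + (1.5 if attacked else 0.0))
    return 1.0 / (1.0 + math.exp(-x))

# Same injury level: a large attacked tree is at higher risk than a
# large unattacked tree under these illustrative coefficients.
print(mortality_probability(60, 2, 50, True))
print(mortality_probability(60, 2, 50, False))
```

Fitting such a model in practice means estimating the coefficients from binary survival data by maximum likelihood, as the authors did with logistic regression.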


1986 ◽  
Vol 16 (6) ◽  
pp. 1175-1179 ◽  
Author(s):  
David L. Peterson ◽  
Michael J. Arbaugh

Survival patterns after late summer wildfires were evaluated for Douglas-fir and lodgepole pine in the northern Rocky Mountains. Crown scorch was the most important variable for predicting postfire survival, and variables representing bole damage improved the significance of logistic regression models for both species. Crown scorch and basal scorch were the best combination of variables for predicting survival in lodgepole pine. Crown scorch and insect damage were the best combination of variables for predicting survival in Douglas-fir. Postfire survival of lodgepole pine, which has relatively thin bark, was more sensitive than that of Douglas-fir to variables quantifying bole damage.


1991 ◽  
Vol 1 (1) ◽  
pp. 63 ◽  
Author(s):  
DL Peterson ◽  
MJ Arbaugh ◽  
GH Pollock ◽  
LJ Robinson

Dendroecological methods were used to study the effects of wildfire on radial growth of Pseudotsuga menziesii (Douglas-fir) and Pinus contorta (lodgepole pine) in the northern Rocky Mountains. Mean basal area increment during a 4-year postfire period declined relative to prefire growth in 75% of burned P. menziesii trees and 70% of P. contorta trees. Percent of crown volume scorched was the most important variable related to postfire growth of P. menziesii, while basal scorch was slightly more important than crown scorch to postfire growth of P. contorta. Postfire growth always declined when crown scorch exceeded 50% in P. menziesii and 30% in P. contorta. None of the significant regression models had high predictive capability because of the large amount of variance in the data. It is clear, however, that crown injury is critical to postfire survival and growth of P. menziesii, while basal injury is critical for the thin-barked species P. contorta.
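Basal area increment (BAI), the growth metric used here, is computed from annual ring widths by treating the stem cross-section as a set of circular annuli. The sketch below shows the standard dendrochronological calculation under that simplifying assumption; the prefire/postfire comparison at the end mirrors the study's 4-year windows and is illustrative only.

```python
import math

def basal_area_increments(ring_widths_mm):
    """Annual basal area increment (mm^2) from pith-to-bark ring widths,
    assuming circular stem cross-sections (a standard simplification)."""
    increments = []
    radius = 0.0
    for width in ring_widths_mm:
        new_radius = radius + width
        # BAI for a year is the area of the annulus added by that ring.
        increments.append(math.pi * (new_radius ** 2 - radius ** 2))
        radius = new_radius
    return increments

def mean_bai(bai_values):
    return sum(bai_values) / len(bai_values)

# Hypothetical ring-width series; the last 8 values bracket a fire year.
rings = [2.1, 2.0, 1.9, 2.0, 1.8, 1.9, 2.0, 1.9, 1.2, 1.0, 0.9, 1.1]
bai = basal_area_increments(rings)
prefire, postfire = mean_bai(bai[-8:-4]), mean_bai(bai[-4:])
print(postfire < prefire)  # growth decline in this invented example
```

Because BAI depends on the accumulated radius, even a constant ring width yields increasing BAI in a healthy tree, which is why a postfire decline relative to prefire BAI is a meaningful injury signal.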



The Holocene ◽  
2019 ◽  
Vol 30 (3) ◽  
pp. 479-484
Author(s):  
Daniel P Maxbauer ◽  
Mark D Shapley ◽  
Christoph E Geiss ◽  
Emi Ito

We present two hypotheses regarding the evolution of Holocene climate in the Northern Rocky Mountains that stem from a previously unpublished environmental magnetic record from Jones Lake, Montana. First, we link two distinct intervals of fining magnetic grain size (documented by an increasing ratio of anhysteretic to isothermal remanent magnetization) to the authigenic production of magnetic minerals in Jones Lake bottom waters. We propose that authigenesis in Jones Lake is limited by rates of groundwater recharge and ultimately regional hydroclimate. Second, at ~8.3 ka, magnetic grain size increases sharply, accompanied by a drop in concentration of magnetic minerals, suggesting a rapid termination of magnetic mineral authigenesis that is coeval with widespread effects of the 8.2 ka event in the North Atlantic. This association suggests a hydroclimatic response to the 8.2 ka event in the Northern Rockies that, to our knowledge, is not well documented. These preliminary hypotheses present compelling new ideas that we hope will highlight the sensitivity of magnetic properties to climate variability and attract future research into aridity, hydrochemical response, and climate dynamics in the Northern Rockies.


2009 ◽  
Vol 18 (7) ◽  
pp. 857 ◽  
Author(s):  
Chad T. Hanson ◽  
Malcolm P. North

With growing debate over the impacts of post-fire salvage logging in conifer forests of the western USA, managers need accurate assessments of tree survival when significant proportions of the crown have been scorched. The accuracy of fire severity measurements will be affected if trees that initially appear to be fire-killed prove to be viable after longer observation. Our goal was to quantify the extent to which three common Sierra Nevada conifer species may ‘flush’ (produce new foliage in the year following a fire from scorched portions of the crown) and survive after fire, and to identify tree or burn characteristics associated with survival. We found that, among ponderosa pines (Pinus ponderosa Dougl. ex Laws.) and Jeffrey pines (Pinus jeffreyi Grev. & Balf.) with 100% initial crown scorch (no green foliage following the fire), the majority of mature trees flushed and survived. Red fir (Abies magnifica A. Murr.) with high crown scorch (mean = 90%) also flushed, and most large trees survived. Our results indicate that, if flushing is not taken into account, fire severity assessments will tend to overestimate mortality, and post-fire salvage could remove many large trees that appear dead but are not.

