Westerling et al. find that changes in the timing of snowpack melt in the mountains of the western US, driven by climate change, have led to an increased number of wildfires and a higher large-fire frequency over the period 1970 – 2003. They suggest this is due to a longer fire season (i.e. spring snow-melt is occurring earlier in the year and the onset of winter freezing is occurring later) and that, generally, wildfire regimes at broad scales across this region are more sensitive to climate change than to human land-use histories.
I find this interesting for two reasons:
- In work I’ve done with others we found that, for a similar time period (1970 – 2000) and across similarly broad spatial scales, there was no significant change in the frequency-area distribution of wildfire through time (i.e. at the decadal scale). I’d like to extend this empirical work and examine causal factors as Westerling et al. have
- I plan to examine the influence of both land-use and climate change on wildfire regimes using the simulation model I’m currently working on; it will be interesting to see which I find is more important…
Today I was also thinking about the importance of vegetation flammability for the frequency-area scaling of wildfires in a region. Which is most important:
- total flammability of all vegetation in a region (related to broad scale climate)
- distribution of risk between different vegetation species (composition of the landscape)
- spatial distribution of risk across a landscape (configuration of the landscape)?
Something to examine with a CA model in the future maybe…
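As a starting point for that future CA experiment, here is a minimal sketch of a Drossel–Schwabl-style forest-fire cellular automaton with an explicit flammability parameter. Everything here is an illustrative assumption, not the model described above: the function names (`step`, `fire_sizes`), the grid size, and all probability values are placeholders, and burning cells per time step is only a crude proxy for fire size (proper frequency-area analysis would need cluster labelling of connected burned areas).

```python
import random

def step(grid, p_growth, p_ignite, flammability):
    """One CA update: burning cells die, fire spreads to flammable tree
    neighbours, lightning ignites trees, and trees regrow on empty cells.
    Cell states: 0 = empty, 1 = tree, 2 = burning."""
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            cell = grid[i][j]
            if cell == 2:            # burning -> empty
                new[i][j] = 0
            elif cell == 1:          # tree: may catch fire
                neighbours = [grid[(i + di) % n][(j + dj) % n]
                              for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]
                if 2 in neighbours and random.random() < flammability:
                    new[i][j] = 2    # fire spreads with prob. = flammability
                elif random.random() < p_ignite:
                    new[i][j] = 2    # lightning strike
            elif random.random() < p_growth:
                new[i][j] = 1        # empty -> tree
    return new

def fire_sizes(n=50, steps=200, p_growth=0.05, p_ignite=0.001,
               flammability=1.0):
    """Run the CA and record the number of burning cells per step
    (a crude stand-in for fire size; illustrative only)."""
    random.seed(1)                   # fixed seed so runs are repeatable
    grid = [[0] * n for _ in range(n)]
    sizes = []
    for _ in range(steps):
        grid = step(grid, p_growth, p_ignite, flammability)
        burning = sum(row.count(2) for row in grid)
        if burning:
            sizes.append(burning)
    return sizes
```

Varying `flammability` here stands in for the "total flammability" question above; the composition and configuration questions would need per-species flammabilities assigned to cells and different spatial arrangements of those species, which this sketch deliberately leaves out.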