Our new paper shows that the probability of crown fire in mountain forests under extreme weather conditions is greatest when trees are about 15 years old. This has implications for debates about how timber harvesting influences the risk of fire.
Crown fire is a major driver of the dynamics of mountain ash forests, so changes in the chance of crown fire with stand age are important. Crown fire also has a major impact on risk to humans. Crown fires typically burn with much greater intensity than fires that remain on the forest floor, partly because more fuel is consumed in the crown. And the probability of houses being burnt and people dying increases with fire intensity.
The relationship between stand age and the incidence of crown fire is also somewhat controversial. Some studies suggest that the probability of crown fire in mountain ash forests decreases with stand age, while others suggest it increases. These previous studies have tended to look only for monotonic relationships, and have sometimes not controlled for the weather conditions at the time of the fire.
Our recent paper shows that, under the most extreme fire weather conditions, the probability of crown fire is very low in the youngest forests (<5 years), but then increases rapidly until around 15 years of age. After that point, the probability of crown fire decreases substantially with age.
This pattern reflects changes in forest structure and fuel availability. In mountain ash forests, fuel loads approach their maximum at around 15-20 years of age. Beyond that age, fuel loads remain high, but several factors reduce the risk of crown fire. First, and most obviously, the crowns are further from the ground, so flames are less able to climb into them. Second, moister forest elements (e.g., rainforest plant species) can become more prevalent over time, and these can reduce fire intensity.
One of the other interesting aspects of this paper is our method of analysis. We used a spatially correlated probit regression model to model the probability of fire. This is a form of the multivariate probit model used in our recent joint species distribution model. The difference is that in the fire paper we modelled the correlation as a simple (negative exponential) function of distance. That means that fire was perfectly correlated in its incidence for points at zero distance apart (i.e., pairs of coincident points either both burnt or both remained unburnt). As the distance between points increased, the correlation decayed toward zero (i.e., at large distances, the incidence of fire at one point was independent of that at the other).
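To make the correlation structure concrete, here is a minimal sketch in Python. It is not the model from the paper: the point locations, the decay range `rho`, and the zero threshold are all invented for illustration. It simply builds the negative-exponential correlation matrix from pairwise distances, draws one spatially correlated latent Gaussian field, and thresholds it probit-style so that nearby points tend to share the same burnt/unburnt outcome.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 50
coords = rng.uniform(0, 10, size=(n, 2))  # hypothetical point locations (km)
rho = 2.0                                 # assumed correlation decay range (km)

# Pairwise distances and the negative-exponential correlation matrix.
# At zero distance the correlation is exactly 1 (the diagonal); it decays
# toward 0 as points get further apart.
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
corr = np.exp(-d / rho)

# Draw one realisation of the spatially correlated latent variable and
# threshold it, probit-style: a positive latent value means the point burns.
L = np.linalg.cholesky(corr + 1e-10 * np.eye(n))  # small jitter for stability
latent = L @ rng.standard_normal(n)
burnt = latent > 0

print(burnt.mean())  # proportion of points "burnt" in this realisation
```

In the actual analysis the latent mean would also depend on covariates such as stand age and fire weather; this sketch shows only the distance-decay correlation component.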
The controversy has hit the press, largely focusing on the consequences for fire risk. Essentially, if an area was logged or burnt a decade or three earlier, the risk of crown fire is substantially increased. In the areas where the Black Saturday fires burnt, many of these younger stands were places where 1939 regrowth had been harvested. Hence we see headlines such as “Study finds logging increased intensity of Black Saturday fires”.
If you’d like to know more, please read the paper.
Taylor, C., McCarthy, M.A., and Lindenmayer, D.B. (in press). Non-linear effects of stand age on fire severity. Conservation Letters. [Online]
Originally published at http://mickresearch.wordpress.com/2014/08/04/effects-of-stand-age-on-fire-severity/