By David Middleton – WUWT
This post was inspired by Anthony Watts’ recent post about wildfires and their unwillingness to cooperate with the Gorebal Warming narrative.
If the “climate” was warming faster or sea level was rising faster “than at any time in the past 2000 years,” it would be front-page material (neither is the case; see Addendum 1). However, actual evidence that “global biomass burning during the past century has been lower than at any time in the past 2000 years” doesn’t get much notice in the mainstream media.
Analyses of charcoal records in sediments and isotope-ratio records in ice cores suggest that global biomass burning during the past century has been lower than at any time in the past 2000 years. Although the magnitude of the actual differences between pre-industrial and current biomass burning rates may not be as pronounced as suggested by those studies, modelling approaches agree with a general decrease of global fire activity at least in past centuries. In spite of this, fire is often quoted as an increasing issue around the globe.
In spite of this, fire is often quoted as an increasing issue around the globe…
Why is fire “often quoted as an increasing issue around the globe”?
We have been told, many times, that wildfires are multiplying and will continue to increase in severity and/or frequency due to global warming.
From November 2012:
4. Wildfires are multiplying
This map published in the National Research Council report shows how rising temperatures and increased evaporation will cause widespread fires in the western US. Fire damage in the northern Rocky Mountain forests, marked by region B, is expected to more than double annually for each 1.8 degree Fahrenheit increase in average global temperatures. With the same temperature increase, fire damage in the Colorado Rockies (region J) is expected to be more than seven times what it was in the second half of the 20th century.
Despite no “direct relationship between climate change and fire,” they are continuing to tell us “that wildfires [will] increase due to global warming”…
Will global warming produce more frequent and more intense wildfires?
There isn’t a direct relationship between climate change and fire, but researchers have found strong correlations between warm summer temperatures and large fire years, so there is general consensus that fire occurrence will increase with climate change.
Hot, dry conditions, however, do not automatically mean fire—something needs to create the spark and actually start the fire. In some parts of the country (like Alaska), most fires are ignited by lightning. In other regions (like California), most fires are ignited by humans.
Climate models tell us that average summer temperatures will continue to increase through this century, but ignition is the wild card. What will happen in the future is a more complicated story because we don’t understand what will happen with convective storms and lightning.
“There isn’t a direct relationship between climate change and fire, but…” There’s always a BIG BUT…
But models and consensus say that Gorebal Warming will increase the frequency and/or severity of wildfires… But… another BIG BUT… There’s no scientific basis for the first BIG BUT.
A Geological Perspective of Wildfires
The Fire Window
Geological evidence for ancient wildfires generally consists of sedimentary charcoal deposits (inertinite). Fossil charcoal is also a key factor in understanding the evolution of Earth’s atmosphere, particularly oxygen content. The first clear evidence of fire is in the Late Silurian.
Fossil charcoal provides direct evidence for fire events that, in turn, have implications for the evolution of both terrestrial ecosystems and the atmosphere. Most of the charcoal record is known from terrestrial or near-shore environments and indicates the earliest occurrences of fire in the Late Silurian. However, despite the rise in available fuel through the Devonian as vascular land plants became larger and trees and forests evolved, charcoal occurrences are very sparse until the Early Mississippian where extensive charcoal suggests well established fire systems. We present data from the latest Devonian of North America from terrestrial and marine rocks indicating that fire became more widespread and significant at this time. This may be a function of rising O2 levels and the occurrence of fire itself may have contributed to this rise through positive feedback. Recent atmospheric modelling suggests an O2 low during the Middle Devonian (around 13%), with O2 rising steadily through the Late Devonian and Early Carboniferous (Mississippian) (from 17-19%). In Devonian-Carboniferous marine black shales, fossil charcoal (inertinite) steadily increases up-section suggesting the rise of widespread fire systems. Scanning electron and reflectance microscopy of charcoal from Late Devonian sites indicate that the fires were moderately hot (around 550 °C) and burnt mainly surface vegetation dominated by zygopterid ferns and lycopsids, rather than being produced by forest crown fires.
The evolution of vascular land plants from the Late Silurian through the Mississippian (Lower Carboniferous) provided both the fuel and the oxygen required for widespread fire systems.
Fire is an exothermic oxidation reaction dependent on the rapid combination of fuel and oxygen in the presence of heat. Charcoal is a by-product of wildfire and is first documented in the latest Silurian, and has subsequently been recorded in all geological periods from a range of sedimentary settings. Calculation of fuel flammability at varying oxygen concentrations enables past pO2 to be constrained within the range 15-35% (‘fire window’) whenever charcoal is recovered from the fossil record.
The “fire window” is defined as an atmospheric oxygen content range of 13-15% to 35%. Below 13-15% fire will not ignite and above 35% fire cannot be extinguished (which would really suck!).
Under modern atmospheric O2 conditions (Present Atmospheric Level (PAL) = 21%), a low plant-moisture content is necessary for fire to spread (Wildman et al., 2004) as is a sufficient fuel build-up and an ignition source (usually lightning) (Pyne et al., 1996). Atmospheric O2 levels are crucial: below 13% fires will not ignite and spread (Chaloner, 1989) whereas at high O2 levels (above 35%) plants may burn irrespective of fuel moisture, resulting in an upper limit for atmospheric O2 concentration as above this level no fire could be extinguished (Watson et al., 1978; Lenton and Watson, 2000; Lenton, 2001). This has led to the concept of the fire window (Jones and Chaloner, 1991).
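Taken at face value, the thresholds in that passage reduce to a simple range check. A minimal sketch in Python (the function and its wording are mine; the 13% and 35% cutoffs are the values quoted above):

```python
def fire_window_status(o2_percent: float) -> str:
    """Classify an atmospheric O2 level against the 'fire window'
    thresholds quoted above: below ~13% fires will not ignite and
    spread; above ~35% plants burn regardless of fuel moisture."""
    if o2_percent < 13.0:
        return "below fire window: fires will not ignite and spread"
    if o2_percent > 35.0:
        return "above fire window: fires burn regardless of fuel moisture"
    return "within fire window: fire depends on fuel, moisture and ignition"

# Present Atmospheric Level (PAL) = 21% sits comfortably inside the window.
print(fire_window_status(21.0))  # within fire window: fire depends on fuel, moisture and ignition
```

Note that everywhere inside the window, oxygen is no longer the limiting factor; fuel, fuel moisture, and an ignition source are.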
Inertinite abundance provides estimates of past fire systems and atmospheric oxygen content. As vascular land plants became more abundant, atmospheric oxygen levels rose. The increasing abundance of vascular land plants and higher atmospheric oxygen levels led to an increase in the prevalence of fire.
As can be seen in Figure 1, wildfires were far more common during the Late Paleozoic and Mesozoic than they have been in the Cenozoic. While the various methods of estimating Phanerozoic atmospheric O2 yield somewhat different results (Berner 1999, 2006, 2009), they generally depict a rapid rise from about 15% to 25-30% during the Carboniferous, an Early Permian peak of 30-35%, then roughly 25% from the Triassic through the mid-Cretaceous, punctuated by drops to 15-20% associated with anoxic events.
The atmospheric oxygen level has been slowly declining over time. O2/N2 ratios from Greenland and Antarctic ice cores indicate that atmospheric oxygen has declined by 0.7% over the past 800,000 years (Stolper et al., 2018).
The modern decline in atmospheric oxygen is also reflected in recent instrumental records.
While the atmosphere is still well-within the “fire window,” the trend in atmospheric oxygen content has been steadily moving in the opposite direction of increased fire risk since the Late Cretaceous.
So, atmospheric oxygen has not been playing ball with the Gorebal Warming narrative… Unless the Gorebal Warming narrative asserts that oxygen levels will decline as wildfires increase… Which would be one of the most fracking moronic things a Gorebot could assert.
What Does Fire Need Apart From Oxygen?
Wood and other biomass. Dry wood and other dry biomass, to be more precise. The U.S. Forest Service Wildland Fire Assessment System bases fire risk on a number of factors revolving around humidity, atmospheric stability, lightning potential, wind, and fuel moisture.
Fire Potential / Danger
- Fire Danger Rating
- Haines Index
- Dry Lightning
- Potential Lightning Ignition
- Lightning Efficiency
- NDFD Fire Danger Forecasts
- Fire Weather
Moisture / Drought
- Dead Fuel Moisture
- Growing Season Index
- AVHRR NDVI
- Keetch-Byram Index
- Palmer Index
- National Fuel Moisture Database
The National Park Service also has a very informative web page on the assessment of fire danger, which includes the following:
Ignition Component (IC)
The Ignition Component is a number that relates the probability that a fire will result if a firebrand is introduced into a fine fuel complex. The ignition component can range from 0 when conditions are cool and damp, to 100 on days when the weather is dry and windy. Theoretically, on a day when the ignition component registers a 60 approximately 60% of all firebrands that come into contact with wildland fuels will require suppression action.
- A firebrand must come into contact with the dead fuel,
- the fuel particle must be dry, and
- the temperature of the fuel particle must be raised to the kindling point which is about 380 degrees centigrade.
Living material in the fine fuel complex reduces the efficiency of ignition. Therefore, an adjustment to the ignition component is made based on the percentage of live fuel (herbaceous vegetation) in the fine fuel complex.
The moisture content of the dead component of the fine fuel (1-hr. timelag fuel moisture) is determined by the state of the weather (sunny or cloudy), air temperature, and relative humidity at the time of the 2 p.m. fire weather observation.
The condition of the herbaceous (live) vegetation and the 1-hour time lag fuel moisture are then integrated in the calculation of the fine fuel moisture (FFM), which expresses the effective moisture content of the fine fuels.
The closer the initial temperature of the fuel is to the ignition temperature, the more likely a fire will result when a firebrand is introduced into the fine fuel complex, since not as much energy is required to raise the fuel particle to its ignition temperature.
It’s little wonder that wildfires don’t correlate very well with the rise in average surface temperatures over the past 500 years. 15.0 °C isn’t significantly closer to 380 °C than 14.2 °C is.
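That last point is easy to put numbers on. A minimal sketch (the specific heat of ~1.7 kJ/(kg·K) for dry wood is a textbook approximation I have assumed, not a figure from the sources quoted above):

```python
def heating_energy_kj(initial_c: float, ignition_c: float = 380.0,
                      mass_kg: float = 1.0, cp_kj_per_kg_k: float = 1.7) -> float:
    """Sensible heat (kJ) needed to raise a fuel particle from its
    initial temperature to the ~380 deg C kindling point quoted above.
    cp ~1.7 kJ/(kg K) is a textbook approximation for dry wood (assumed)."""
    return mass_kg * cp_kj_per_kg_k * (ignition_c - initial_c)

e_cool = heating_energy_kj(14.2)  # ~621.9 kJ per kg of dry fuel
e_warm = heating_energy_kj(15.0)  # ~620.5 kJ per kg of dry fuel
# A 0.8 deg C rise in ambient temperature trims the required energy by ~0.2%.
print(f"reduction: {100 * (e_cool - e_warm) / e_cool:.2f}%")  # reduction: 0.22%
```

A fraction of a percent less heating energy per firebrand is lost in the noise next to fuel moisture and ignition frequency.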
Referring back to Rimmer et al., 2015:
Under modern atmospheric O2 conditions (Present Atmospheric Level (PAL) = 21%), a low plant-moisture content is necessary for fire to spread (Wildman et al., 2004) as is a sufficient fuel build-up and an ignition source (usually lightning) (Pyne et al., 1996).
From a climatological perspective, fuel moisture is the key factor in fire risk; from a geological perspective, it is the second most important factor, right behind atmospheric O2 conditions.
Fuel moisture is closely related to drought conditions. If the soil is wet, the fuel is probably wet, and the “global” soil has actually been getting wetter since 1950.
An overall increasing trend in global soil moisture, driven by increasing precipitation, underlies the whole analysis, which is reflected most obviously over the western hemisphere and especially in North America.
And there has been no statistically meaningful change in global drought conditions since 1950.
Here we show that the previously reported increase in global drought is overestimated because the PDSI uses a simplified model of potential evaporation that responds only to changes in temperature and thus responds incorrectly to global warming in recent decades. More realistic calculations, based on the underlying physical principles that take into account changes in available energy, humidity and wind speed, suggest that there has been little change in drought over the past 60 years.
This is TOO FRACKING FUNNY!
Here we show that the previously reported increase in global drought is overestimated because the PDSI uses a simplified model of potential evaporation that responds only to changes in temperature and thus responds incorrectly to global warming in recent decades.
PDSI *overestimates* drought conditions? Too fracking funny!
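The PDSI criticism is concrete: the classic Thornthwaite potential-evaporation formula behind it takes temperature as its only meteorological input, so any warming registers as "drying" regardless of what humidity, wind, or available energy are doing. A minimal sketch (standard Thornthwaite coefficients; the monthly temperatures are illustrative, not real station data):

```python
def thornthwaite_pet(monthly_temps_c):
    """Monthly potential evapotranspiration (mm) via the classic
    Thornthwaite formula. Its only meteorological input is temperature,
    which is the simplification criticised in the quote above."""
    # Annual heat index; months at or below 0 deg C contribute nothing.
    I = sum((max(t, 0.0) / 5.0) ** 1.514 for t in monthly_temps_c)
    a = 6.75e-7 * I**3 - 7.71e-5 * I**2 + 1.792e-2 * I + 0.49239
    return [16.0 * (10.0 * max(t, 0.0) / I) ** a for t in monthly_temps_c]

# Illustrative mid-latitude monthly mean temperatures (deg C), not real data.
temps = [2, 4, 8, 12, 17, 21, 24, 23, 19, 13, 7, 3]
pet = thornthwaite_pet(temps)
# Warm every month by 1 deg C: every monthly PET value rises, even though
# humidity, wind and available energy never entered the calculation.
pet_warmer = thornthwaite_pet([t + 1.0 for t in temps])
```

Feed a temperature-only PET like this into the PDSI water balance and recent warming shows up as intensifying drought by construction, which is exactly the overestimate the quoted study describes.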
Does anyone need an explanation of Figure 5?
So… Why Are We Experiencing So Many “Megafires”?
Modern Megafires Are Truly Unusual
By Live Science Staff May 16, 2012
The gigantic wildfires that blow through the southwestern United States today are unprecedented in the long-term historical record, new research suggests, and are due to modern human activities.
“The U.S. would not be experiencing massive large-canopy-killing crown fires today if human activities had not begun to suppress the low-severity surface fires that were so common more than a century ago,” study researcher Christopher Roos, of Southern Methodist University, said in a statement.
They discovered that this time period, the Medieval Warm Period, was no different from the Little Ice Age in terms of what drives frequent low-severity surface fires: year-to-year drought patterns.
“It’s true that global warming is increasing the magnitude of the droughts we’re facing, but droughts were even more severe during the Medieval Warm Period,” Roos said. “It turns out that what’s driving the frequency of surface fires is having a couple wet years that allow grasses to grow continuously across the forest floor and then a dry year in which they can burn. We found a really strong statistical relationship between two or more wet years followed by a dry year, which produced lots of fires.”
The researchers found that even when ancient climates varied from each other — one hotter and drier and the other cooler and wetter — the frequencies of year-to-year weather patterns that drive fire activity were similar.
In ancient forests, frequent small fires swept the forest floor. These “fires cleaned up the understory, kept it very open, and made it resilient to climate changes because even if there was a really severe drought, there weren’t the big explosive fires that burn through the canopy because there were no fuels to take it up there,” Roos said. “The trees had adapted to frequent surface fires, and adult trees didn’t die from massive fire events because the fires burned on the surface and not in the canopy.”
“If anything, what climate change reminds us is that it’s pretty urgent that we deal with the structural problems in the forests,” Roos said. “The forests may be equipped to handle the climate change, but not in the condition that they’re currently in. They haven’t been in that condition before.”
One answer to today’s megafires might be changes in fire management, the researchers said.
The findings were published today in the March issue of the journal The Holocene.
- “The Medieval Warm Period was no different from the Little Ice Age in terms of what drives frequent low-severity surface fires: year-to-year drought patterns.”
- “Droughts were even more severe during the Medieval Warm Period” than in the modern era.
- “For at least 200 years prior to Euroamerican settlement, extensive fires occurred frequently in these forests (e.g. every 3–15 years), consuming fine surface fuels and maintaining an open, park-like structure of mixed age forests.”
- “Frequent low-severity surface fires” cleared the forests of fuel. These types of fires are rare in the modern era due to fire prevention methods and firefighting activities… Hence megafires.
Couple all of that with the fact that atmospheric oxygen levels are declining and soil moisture is increasing and you can schist-can the notion that Gorebal Warming is increasing the frequency and/or severity of wildfires. Although, human activities clearly are increasing the risk of “megafires.”
So… That pretty much settles it… Gorebal Warming does not cause an increase in forest fires… And… Only you can prevent forest fires.
Read the rest at WUWT