Uncovered: decades-old government report showing climate data was bad, unfit for purpose

Image: Former NOAA scientist: Colleagues manipulated climate change data for political reasons

IPCC Knew from Start Climate Data Inadequate

Climate Monitoring Weather Station at the University of Arizona, Tucson, measuring temperature in the University parking lot in front of the Atmospheric Science Building. The station has since been closed. 2007 Photo by Warren Meyer

Guest opinion: Dr. Tim Ball

In 1999, the National Research Council, the research arm of the National Academy of Sciences, released a study expressing concern about the accuracy of the data used in the debate over climate change. It said there are

“Deficiencies in the accuracy, quality and continuity of the records,” that “place serious limitations on the confidence that can be placed in the research results.”

The people who reached these conclusions, and their affiliations at the time, are listed below.

———————-

  • THOMAS R. KARL (Chair), National Climatic Data Center, Asheville, North Carolina
  • ROBERT E. DICKINSON, University of Arizona, Tucson
  • MAURICE BLACKMON, National Center for Atmospheric Research, Boulder, Colorado
  • BERT BOLIN, University of Stockholm, Sweden
  • JEFF DOZIER, University of California, Santa Barbara
  • WILLIAM P. ELLIOTT, NOAA/Air Resources Laboratory, Silver Spring, Maryland
  • JAMES GIRAYTYS, Certified Consulting Meteorologist, Winchester, Virginia
  • RICHARD E. HALLGREN, American Meteorological Society, Washington, D.C.
  • JAMES E. HANSEN, NASA/Goddard Institute for Space Studies, New York, New York
  • SYDNEY LEVITUS, NOAA/National Oceanic Data Center, Silver Spring, Maryland
  • GORDON MCBEAN, Environment Canada, Downsview, Ontario
  • GERALD MEEHL, National Center for Atmospheric Research, Boulder, Colorado
  • PHILIP E. MERILEES, Naval Research Laboratory, Monterey, California
  • ROBERTA BALSTAD MILLER, CIESIN, Columbia University, Palisades, New York
  • ROBERT G. QUAYLE, NOAA/National Climatic Data Center, Asheville, North Carolina
  • S. ICHTIAQUE RASOOL, University of New Hampshire, Durham
  • STEVEN W. RUNNING, University of Montana, Missoula
  • EDWARD S. SARACHIK, University of Washington, Seattle
  • WILLIAM H. SCHLESINGER, Duke University, Durham, North Carolina
  • KARL E. TAYLOR, Lawrence Livermore National Laboratory, Livermore, California
  • ANNE M. THOMPSON, NASA/Goddard Space Flight Center, Greenbelt, Maryland
  • Ex Officio Members
  • W. LAWRENCE GATES, Lawrence Livermore National Laboratory, Livermore, California
  • DOUGLAS G. MARTINSON, Lamont-Doherty Earth Observatory, Columbia University, Palisades, New York
  • SOROOSH SOROOSHIAN, University of Arizona, Tucson
  • PETER J. WEBSTER, University of Colorado, Boulder

—————————

These are prominent names, many of them important in the Intergovernmental Panel on Climate Change (IPCC) and the promotion of its ideas. For example, Gordon McBean chaired the formation meeting of the IPCC in Villach, Austria, in 1985. Bert Bolin was appointed the first IPCC chairman, with Sir John Houghton chairing its science working group. Thomas Karl and James Hansen were two dominant figures in the control and manipulation of the data right up to their very recent retirements.

Karl chaired the study, so he knew better than anyone that achieving the result they wanted, namely a steadily increasing temperature over the 120-plus years of the instrumental record, was made easier by the inadequacy of the data. They ignored the fact that this inadequacy negated the viability of the work they planned and did. For example, the extent, density, and continuity of the data are completely inadequate as the basis for a mathematical computer model of global climate. In short, they knew they would have to create, make up, or modify data to even approximate a result. The trouble is that the data were so inadequate that, even with these actions, the results could not approximate reality.

Despite this, the IPCC committed to the surface data even though, elsewhere, decisions were made to reduce the number of stations and thus further limit the coverage. There were two main reasons for the reduction: the increasing diversion of funds to global warming research and expensive computer models, and the anticipation of weather data from satellites. NASA GISS produced a plot (Figure 1) to show what was happening. I expanded each graph to show the important details (Figures 2, 3, 4).


Figure 1

Commendably, they upgraded the data, but all that does is emphasize the anomalies.


Figure 2

Important points:

· There are no stations with over 130 years of record.

· There are approximately 300 stations with about 120 years of record.

· Virtually all the stations with over 100 years of record are in the eastern US or western Europe.


Figure 3

Important points:

· The first significant decrease occurs after 1960, anticipating replacement by satellites.

· The second decrease, around 1995, is associated with the switch of funding from data collection to global warming research and a reduction in the number of stations used.

· Figure 2 shows the maximum number of stations at approximately 7,200, but Figure 3 shows it at approximately 5,500.


Figure 4

Important points:

· Despite the reduction in the number of stations, the coverage declines only slightly. That is scientifically not possible.

· Currently, 20 percent of the Northern Hemisphere and 25 percent of the Southern Hemisphere have no coverage (a rough way to quantify such coverage is sketched after this list).

· The quality of the coverage is critical but very variable, as Thomas Karl notes in the foreword: “Unlike sciences where strict laboratory controls are the rule, climate researchers have to rely on observations collected in different countries and using differing instruments.” Remember, the US coverage is the best, yet only 7.9% of its stations are accurate to <1°C. Here is an example, from the preface to the 1951-1980 Canadian Climate Normals, of what Karl is reporting: “No hourly data exists in the digital archive before 1953, the averages appearing in this volume have been derived from all ‘hourly’ observations, at the selected hours, for the period 1953 to 1980, inclusive. The reader should note that many stations have fewer than the 28 years of record in the complete averaging.”
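As a rough illustration of what a coverage percentage like this means in practice, here is a minimal sketch, entirely my own construction and not taken from the report or from GISS code, that estimates the fraction of a hemisphere lying within a fixed influence radius of at least one station. The 1,200 km radius, the Monte Carlo approach, and the toy station list are all assumptions for illustration.

```python
import math
import random

EARTH_RADIUS_KM = 6371.0
INFLUENCE_KM = 1200.0  # assumed influence radius per station (illustrative only)

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine distance in kilometres between two points given in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 2.0 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def covered_fraction(stations, north=True, n_samples=20000, seed=1):
    """Monte Carlo estimate of the hemisphere area within INFLUENCE_KM of any station."""
    rng = random.Random(seed)
    lo, hi = (0.0, 1.0) if north else (-1.0, 0.0)
    hits = 0
    for _ in range(n_samples):
        lat = math.degrees(math.asin(rng.uniform(lo, hi)))  # area-uniform latitude
        lon = rng.uniform(-180.0, 180.0)
        if any(great_circle_km(lat, lon, s_lat, s_lon) <= INFLUENCE_KM
               for s_lat, s_lon in stations):
            hits += 1
    return hits / n_samples

# Hypothetical station list, deliberately clustered in the eastern US and western Europe
stations = [(40.7, -74.0), (41.9, -87.6), (38.9, -77.0),
            (51.5, -0.1), (48.9, 2.4), (52.5, 13.4)]

print(f"Northern Hemisphere covered: {covered_fraction(stations, north=True):.1%}")
print(f"Southern Hemisphere covered: {covered_fraction(stations, north=False):.1%}")
```

With a realistic station list, the same calculation shows why adding or removing a few remote stations changes the nominal coverage far more than adding stations in regions that are already densely covered.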

Although he did not contribute to the study, Kevin Trenberth commented on its release that,

“It’s very clear we do not have a climate observing system… This may be a shock to many people who assume that we do know adequately what’s going on with the climate, but we don’t.”

Despite this, Trenberth created an energy balance model that is central to the entire set of greenhouse-effect climate claims.
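For readers unfamiliar with the term, the sketch below is the simplest, zero-dimensional form of an energy balance model. It is a generic textbook construction, not Trenberth’s published budget, and the solar constant, albedo, and effective emissivity are standard assumed values.

```python
# Zero-dimensional energy balance: absorbed solar = emitted infrared,
# i.e. S * (1 - albedo) / 4 = emissivity * sigma * T^4, solved for T.
SOLAR = 1361.0    # W/m^2, total solar irradiance (assumed)
ALBEDO = 0.30     # planetary albedo (assumed)
SIGMA = 5.670e-8  # W/m^2/K^4, Stefan-Boltzmann constant

def equilibrium_temperature(emissivity: float = 1.0) -> float:
    """Surface temperature (K) that balances absorbed and emitted radiation."""
    absorbed = SOLAR * (1.0 - ALBEDO) / 4.0
    return (absorbed / (emissivity * SIGMA)) ** 0.25

print(equilibrium_temperature(1.0))    # ~255 K: no greenhouse effect
print(equilibrium_temperature(0.612))  # ~288 K: effective emissivity < 1 mimics greenhouse warming
```

Published budgets add many more terms (evaporation, convection, clouds), but they rest on the same basic balance.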

The National Research Council study focused on the inadequacies of the instrumental record. It appeared shortly after H. H. Lamb released his autobiography (1997), in which he expanded on the larger limitations of climate research. He wrote that he established the Climatic Research Unit because,

“…it was clear that the first and greatest need was to establish the facts of the past record of the natural climate in times before any side effects of human activities could well be important.”

So, in 1999 the people involved in the creation and control of the IPCC knew that the data covering the 120 years of human activity were inadequate. They also knew that the data necessary to show the extent of the human impact were equally inadequate. That didn’t stop them, but it meant they knew they had to create the data they needed, and they knew how to do it.

The IPCC could maintain the story and the data that supported their claim of human-caused global warming (AGW) if they controlled all the data sources. This all started to unravel after 2000.

1. The satellite data, confirmed by balloon records, were reaching a length of record they could no longer ignore. By 2007, the IPCC included the following comment in the Fourth Assessment Report (AR4).

“New analyses of balloon-borne and satellite measurements of lower- and mid-tropospheric temperature show warming rates that are similar to those of the surface temperature record and are consistent within their respective uncertainties, largely reconciling a discrepancy noted in the TAR.”

2. Severely cold winters and increased snowfall captured public attention as many cartoons attested.


3. The pause, or hiatus, passed the 17 years that Benjamin Santer said were required before the pattern could even be considered.

4. The gap between increasing atmospheric CO2 levels and actual temperature continued to grow.

But none of this stopped the search for proof. Thomas Karl created a record and wrote a paper with Tom Peterson claiming the hiatus did not occur, using data and methods with serious limitations. After the misuse was exposed, the co-authors refused a congressional subpoena seeking the data and methods used.

Karl and others involved in the deception that is anthropogenic global warming knew from the start, and better than anyone, the severe limitations of the instrumental data set. They likely knew from Lamb’s work the limitations of the historic record. Despite this, or rather because of this, they oversaw the building of computer models, wrote papers, produced ‘official’ reports, and convinced the world that it faced impending doom through runaway global warming. Their work, based on what they knew from the start was inadequate data, achieved universal acceptance. Of course, they also knew better than most how to manipulate and select data to make points that supported their false hypothesis, at least until the satellite data achieved scientific status; even then, they maintained the deception. They have proved themselves charter members of the post-fact society. The latest example was triggered by panic over Trump’s withdrawal from the Paris Climate Agreement and was analysed by Tony Heller.

Ref.: https://wattsupwiththat.com/2017/08/12/uncovered-decades-old-report-showing-climate-data-was-bad/


Why does CO2 lag behind temperature?

71% of the Earth is covered by ocean. Water is about 1,000 times denser than air, and the mass of the oceans is roughly 360 times that of the atmosphere. Small temperature changes in the oceans therefore not only modulate air temperature; according to Henry’s Law, they also affect the atmospheric CO2 level.
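To make the Henry’s Law point concrete, here is a minimal sketch of how the equilibrium amount of dissolved CO2 falls as water warms. The Henry solubility constant and the van ’t Hoff temperature coefficient are typical literature values used as assumptions here, not figures taken from the article.

```python
import math

# Henry's Law: dissolved concentration c = H(T) * p, where the solubility H
# decreases as temperature rises (van 't Hoff form). Values below are typical
# literature figures for CO2 in water, used as illustrative assumptions.
H_REF = 3.3e-2        # mol/(L*atm), Henry solubility of CO2 at 298.15 K
T_REF = 298.15        # K
VANT_HOFF_K = 2400.0  # K, d(ln H)/d(1/T) for CO2

def henry_solubility(temp_k: float) -> float:
    """Henry solubility H(T) = H_REF * exp(VANT_HOFF_K * (1/T - 1/T_REF))."""
    return H_REF * math.exp(VANT_HOFF_K * (1.0 / temp_k - 1.0 / T_REF))

def dissolved_co2(p_co2_atm: float, temp_k: float) -> float:
    """Equilibrium dissolved CO2 (mol/L) at a given CO2 partial pressure."""
    return henry_solubility(temp_k) * p_co2_atm

p_co2 = 400e-6  # ~400 ppm expressed as a partial pressure in atm
for t_celsius in (5.0, 15.0, 25.0):
    c = dissolved_co2(p_co2, 273.15 + t_celsius)
    print(f"{t_celsius:4.1f} C: {c:.2e} mol/L dissolved CO2")
```

All else being equal, warmer water holds less dissolved CO2, which is the mechanism behind the claim that ocean temperature changes modulate atmospheric CO2.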

The reason it is called a “Law” is that it has been “proven”!

“… scientific laws describe phenomena that the scientific community has found to be provably true …”

That means the graph proves CO2 does not control temperature, which again proves that (man-made) Global Warming, now called “Climate Change” due to lack of … Warming, is once again debunked!