Another BOM scandal: Australian climate data is being destroyed as routine practice (Updated 10.08.2017)

The two different thermometers sit side-by-side in a Stevenson Screen, this example is at Wagga Wagga airport, NSW. Photo: Bill Johnston.

Historic climate data is being destroyed

The Bureau has a budget of a million dollars a day, but seemingly can’t afford an extra memory stick to save historic scientific data.

In the mid 1990s thermometers changed right across Australia — new electronic sensors were installed nearly everywhere. Known as automatic weather stations (AWS), these are quite different from the old “liquid-in-glass” type. The electronic sensors can pick up very short bursts of heat, so they can register extremes of temperature that the old mercury or liquid-in-glass thermometers would miss unless the spike of heat lasted for a few minutes. It is difficult (arguably impossible) to believe that, across the whole temperature range, these two different instruments would always behave in exactly the same way. There could easily be an artificial warming trend generated by this change (see the step change in the graphs). The only way to compare the old and new types of thermometer is to run side-by-side comparisons in the field, at many sites. That is exactly what the Bureau was doing, but the data has never been put into an archive, or has been destroyed. It is not easily available, and possibly not available at all. We have this in writing, after an FOI application by Dr Bill Johnston (see below).
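To see why response time matters, here is a minimal sketch with invented numbers (not Bureau code): a one-minute temperature trace contains a brief two-minute burst of warm air; a fast electronic probe reads it directly, while the slow liquid-in-glass thermometer is crudely modelled as a five-minute running average (real thermometer lag is exponential, so this is only a cartoon). The fast probe logs a higher daily maximum from exactly the same air:

```python
import numpy as np

# One day of air temperature sampled every minute, with a brief heat spike.
minutes = np.arange(24 * 60)
base = 20 + 8 * np.sin((minutes - 6 * 60) / (24 * 60) * 2 * np.pi)  # smooth diurnal cycle
air = base.copy()
air[840:842] += 1.5  # invented +1.5 C gust of warm air lasting two minutes, around 2 pm

# "Fast" electronic probe: effectively instantaneous, so it sees the full spike.
fast_max = air.max()

# "Slow" liquid-in-glass thermometer: crudely modelled as a 5-minute running mean.
kernel = np.ones(5) / 5
slow = np.convolve(air, kernel, mode="same")
slow_max = slow.max()

print(f"fast-probe daily max:  {fast_max:.2f} C")
print(f"slow-thermometer max:  {slow_max:.2f} C")
print(f"difference:            {fast_max - slow_max:.2f} C")
```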

These measurements from past years can never be re-recorded. A four-terabyte external hard drive costs a couple of hundred dollars and would probably store a whole year’s worth of text files. For just 0.02% of their daily budget they could buy one every day. Why, why, why wouldn’t a scientist who cared about the climate want to save this information?
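The arithmetic behind that figure is easy to check, using the article’s own round numbers (about a million dollars a day, a couple of hundred dollars for a drive):

```python
# Back-of-envelope check of the storage-cost claim; both figures are the
# article's round numbers, not official Bureau accounts.
daily_budget = 1_000_000      # dollars per day
drive_cost = 200              # dollars for a 4 TB external drive
share = drive_cost / daily_budget
print(f"One drive per day would consume {share:.2%} of the daily budget")  # prints 0.02%
```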

Dr Bill Johnston put in an FOI request for side-by-side data from both kinds of thermometer. He asked for six months of data from Sydney and Canberra Airports and was told it would cost him $460. That’s quite a barrier, and that was only to access the Sydney records. Look at what happened to the Canberra ones — the data was gone. No one could analyze it, no matter how much they were willing to pay.

Field books “disposed”?

Here’s the FOI decision regarding raw data from Canberra Dec 2014.

Bureau of Meteorology, raw data, Canberra, temperatures. FOI.

The BOM stated that “in accordance with records management practices”, the field books for early 2013 at Canberra Airport were “disposed of” twelve months after the observations were taken. By mid 2014 the situation was even worse (if that were possible). The more recent Canberra Airport observations didn’t even have field books that could be destroyed; there were no records to dispose of in the first place.

For what it’s worth, the $460 data fee was helpfully reduced to $230 after a lengthy appeal. The four-page assessment cost the taxpayer more than the $230 charge, but it did successfully stop taxpayers from analyzing the data. Was that the point? The Bureau has a budget of $365 million a year – how much does it cost to store a text file?

Johnston declined to buy the Sydney data (it was confounded by multiple site changes, and he’s not paid to do this work).

He commented this week on the scant evidence that was available and the potential for an undocumented warming effect:

Comparisons between screens were done at one site using PRT (Platinum Resistance Thermometer) only, and reported as a “preliminary report”, which is available; but after Automatic Weather Stations (AWS) became primary instruments, as I’ve reported before, the Bureau had an internal policy that parallel liquid-in-glass thermometer data were not databased. Furthermore, they had another policy that paper data was destroyed after two years. So there is nothing that is easily available….

The only way to compare AWS and liquid-in-glass is to hunt for sites where there is overlap between two stations, where the AWS is given a new number. This is possible, BUT the problem is that the change-over is invariably confounded with either a site move or the change to a small screen.

So, we suspect that the introduction of and reliance on AWS has led to artificially higher maxima (and thus record temperatures) than in the past, but we have no way of knowing for sure, or by how much.
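If such overlapping records can be found, the comparison itself is straightforward. A minimal sketch follows, assuming hypothetical daily-maximum files for the old liquid-in-glass station and the AWS under its new number (file names and column names are invented, not a Bureau product), and bearing in mind that any difference it reports is confounded by the site moves and screen changes just described:

```python
import pandas as pd

# Hypothetical daily-maximum files for an old (liquid-in-glass) station and the
# AWS that replaced it under a new station number; file names and columns are
# illustrative only, not a real BoM product.
lig = pd.read_csv("station_old_lig.csv", parse_dates=["date"]).set_index("date")
aws = pd.read_csv("station_new_aws.csv", parse_dates=["date"]).set_index("date")

# Keep only the dates on which both stations reported a maximum.
overlap = lig[["tmax"]].join(aws[["tmax"]], lsuffix="_lig", rsuffix="_aws").dropna()

diff = overlap["tmax_aws"] - overlap["tmax_lig"]
print(f"overlap days:      {len(diff)}")
print(f"mean AWS - LiG:    {diff.mean():+.2f} C")
print(f"days AWS warmer:   {(diff > 0).mean():.1%}")
```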

How can the CSIRO hope to produce reliable climate modelling with any number of climate scientists when the BOM cannot produce reliable temperature data?  Garbage in, garbage out.

More at: http://joannenova.com.au/2017/08/another-bom-scandal-australian-climate-data-is-being-destroyed-as-routine-practice/

BOM caught again fudging temperature data

Australia’s Bureau of Meteorology (BOM) has been caught again fudging temperature data.
Long-time BOM watcher Dr Jennifer Marohasy has been keeping a watchful eye on the Bureau for several years, ever since she caught them “homogenizing” station data.
Dr Marohasy is well qualified as a skeptic or ‘outsider’ (and our friend), since she was fired by Central Queensland University in 2015.

Dr Marohasy’s website: http://jennifermarohasy.com/2014/08/h…
Data fraud in Australia at BOM:
http://jennifermarohasy.com/tag/tempe…
http://jennifermarohasy.com/wp-conten…
http://jennifermarohasy.com/2014/09/h…
http://www.theaustralian.com.au/natio…
If just the raw data itself is used, Australia’s hottest year was 1885.
Data fraud at New Zealand’s NIWA:
http://wattsupwiththat.com/2010/10/09…

BoM raw temperature data, GHCN and global averages.

In honor of Google’s latest diversity kerfuffle, I continue with my diversity initiative on WUWT with a guest post by Nick Stokes.~ctm

By Nick Stokes

There is an often-expressed belief at WUWT that temperature data is manipulated or fabricated by the providers. This persists despite the fact that, for example, the 2015 GWPF investigation went nowhere, and the earlier BEST investigation ended up corroborating the main data sources. In this post, I would like to walk through the process whereby, in Australia, the raw station data is immediately posted online, then aggregated by month, submitted via CLIMAT forms to the WMO, and then transferred to the GHCN monthly unadjusted global dataset. This can then be used directly in computing a global anomaly average. The main providers insert a homogenization step, the merits of which I don’t propose to canvass here. The essential point is that you can compute the average without that step, and the results are little different.
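To make the shape of that chain concrete before walking through it, here is a toy version of the monthly aggregation step, with a handful of invented daily values (a real CLIMAT report is built from the Bureau’s full month of quality-controlled daily maxima and minima):

```python
import pandas as pd

# Toy version of the monthly aggregation step; four invented days are enough
# to show the shape of the calculation.
daily = pd.DataFrame(
    {
        "tmax": [31.7, 24.0, 22.5, 28.9],   # daily maxima (deg C), illustrative only
        "tmin": [15.7, 12.1, 11.8, 14.0],   # daily minima (deg C), illustrative only
    },
    index=pd.date_range("2016-12-01", periods=4, freq="D"),
)

mean_max = daily["tmax"].mean()             # monthly mean maximum
mean_min = daily["tmin"].mean()             # monthly mean minimum
tavg = (mean_max + mean_min) / 2            # the monthly mean reported on the CLIMAT form

# GHCN-M stores temperatures as integers in hundredths of a degree Celsius.
print(round(mean_max, 1), round(mean_min, 1), round(tavg, 1), round(tavg * 100))
```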

The accusations of data corruption got a workout with the recent kerfuffle over a low temperature reading on a very cold morning at Goulburn, NSW, in July, so I’ll start with the Bureau of Meteorology’s online automatic weather station data. I recently counted a total of 712 such stations, for which data is posted online every half hour, within ten minutes of being measured. You can find the data by state – here is NSW. You can find the other states from the bar at the top, under “latest observations”. Here is a map of the stations in NSW in this table:

[Image: map of the NSW AWS stations listed in the BoM observations table]

For context, I have marked in green the stations of Goulburn and Thredbo Top, which had temperatures below -10°C flagged on that very cold morning in July. On that BoM table, you can see stations listed like this (switching now to Victoria):

[Image: BoM latest-observations table for Victorian stations, 4 December 2016]

I switched because I am now following a post from Moyhu here, and I want a GHCN station that I can follow through. But it is the same format for all stations. This data is from 4 December 2016, and I have highlighted in green the min/max data that will flow through (unchanged, except for possible quality-control flagging) to GHCN unadjusted. It shows, for Melbourne Airport, the most recent temperature (22.4) at 7pm, various other data, and then the min and max, along with the time recorded. The min is incomplete; it showed the latest 7pm temperature, but would no doubt be lower by 9am the next day, which is the cut-off. The max probably wouldn’t change. You can see the headings by linking to the page here.
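If you want to reproduce that 9am cut-off yourself from half-hourly data, something like the following works; the file name and column names are assumptions rather than the Bureau’s actual schema, and the Bureau’s real attribution rules for max versus min differ in detail from this simplification:

```python
import pandas as pd

# Half-hourly air temperatures with a timestamp index (schema assumed, not BoM's).
obs = pd.read_csv("half_hourly_obs.csv", parse_dates=["time"]).set_index("time")

# Treat the climate day as running 9am to 9am: shift timestamps back by 9 hours
# before grouping, so everything up to 8:59am counts against the previous date.
climate_day = (obs.index - pd.Timedelta(hours=9)).date
daily = obs.groupby(climate_day)["air_temp"].agg(tmin="min", tmax="max")
print(daily.head())
```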

If you click on the station name, it brings up a full table of the half-hourly readings for the last three days, in this style:

[Image: three days of half-hourly observations for Melbourne Airport]

Apologies for jumping forward to now (7 August), but I didn’t record this back in December. It shows the headings relevant to the table above too; the top line is the present (a few minutes ago), going back in time down the page. Now you can see that this has to be automated; no one is hovering over this stream of data with an eraser. If you click on “Recent months”, it brings up the following table (an extract here, and we’re back in December 2016):

[Image: extract of the recent-months daily table, December 2016]

That was taken at the same time (just after 7pm, 4 December), and you’ll see that it shows the minimum attributed to Sunday the 4th (before 9am), at 9.1, but not yet the max. If you look below that table you’ll see links to the last 13 months, for which you can bring up the complete table. Here is what that December 2016 table now looks like:

[Image: the completed December 2016 daily table for Melbourne Airport]

The max of 31.7 is there; the min went down to 15.7. The other data hasn’t changed. Further down on that page, as it appears now, are the summary statistics for the month:

[Image: December 2016 monthly summary statistics for Melbourne Airport]

At the end of December 2016, that was transmitted to the WMO as a CLIMAT form, which you can see summarized at the Ogimet site:

[Images: Ogimet summary of the Melbourne Airport CLIMAT report for December 2016]

You can see that the min and max are transmitted unchanged. The mean of the two has also been calculated and is marked in brown. If you want further authenticity, that site will show you the code that the met office transmitted.
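As a side note on that brown figure: the monthly mean transmitted this way is simply the average of the two monthly values, as described above, not an average over all the half-hourly observations. A one-line sketch with hypothetical inputs:

```python
# Monthly mean as the average of (monthly mean daily max) and (monthly mean
# daily min); the inputs here are hypothetical, chosen only to show the convention.
def climat_mean(mean_tmax: float, mean_tmin: float) -> float:
    return round((mean_tmax + mean_tmin) / 2, 1)

print(climat_mean(25.7, 13.3))   # 19.5 -- the kind of value that later appears in GHCN as 1950
```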

Finally, the CLIMAT form is transcribed into the GHCN unadjusted file, which you can see here. It’s a big file, and you have to gunzip and untar it. You can also get files for max and min. Then you have a text file in which, if you search for 501948660002016TAVG (which includes the Melbourne station code), you see this line:

[Image: the matching record from the GHCN unadjusted TAVG file]

There is the 19.5 (multiplied by 100, as GHCN does, so it appears as 1950). The other numbers will appear in the GHCN TMAX and TMIN files.
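For anyone who wants to read those records programmatically rather than by eye, here is a sketch of a line parser, assuming the GHCN-M version 3 fixed-width layout in use at the time (11-character station ID, year, element, then twelve blocks of a 5-character value plus three flag characters); treat the column positions as my reading of that format rather than gospel:

```python
def parse_ghcn_v3_line(line: str):
    """Parse one fixed-width record from a GHCN-M v3 .dat file (layout assumed:
    11-char station ID, 4-char year, 4-char element such as TAVG/TMAX/TMIN, then
    12 blocks of a 5-char value in hundredths of a degree C plus 3 flag chars;
    -9999 means missing)."""
    station, year, element = line[:11], int(line[11:15]), line[15:19]
    values = []
    for month in range(12):
        start = 19 + month * 8
        raw = int(line[start:start + 5])
        values.append(None if raw == -9999 else raw / 100.0)  # back to degrees C
    return station, year, element, values

# e.g. the December entry of the record beginning 501948660002016TAVG
# would come back as 19.5 once divided by 100.
```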

You can even go through to the adjusted file and, guess what, it is still unchanged. That is because homogenization rarely modifies recent data, though older data may be. GHCN unadjusted does not change, except if the source notifies an error. There are quality controls, which don’t change numbers but may flag them.

There have been endless articles at WUWT about individual site adjustments, but no one has tried to calculate the whole picture of the effect of adjustment. With the unadjusted and adjusted files, it is possible to do that. I have been calculating a global anomaly every month, using the unadjusted GHCN data with ERSST. The June result is here; there is an overview page here, with links to the methods and code. This post compares the results from unadjusted and adjusted GHCN; the difference is small. Here is a plot from it, covering 1900 to the start of 2015, showing TempLS (my program) unadjusted (blue) vs adjusted (green) and GISS (brown), as 12-month running means. It’s an active plot, so you can see more detail at the linked site.

[Image: TempLS unadjusted vs adjusted vs GISS, 12-month running means, 1900 to 2015]
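The averaging itself is not the mysterious part. As an illustration only (TempLS fits irregular station and SST data by least squares rather than averaging a regular grid like this), here is a toy area-weighted global mean of a gridded anomaly field, with random numbers standing in for data:

```python
import numpy as np

# Toy global average of a gridded anomaly field: weight each cell by cos(latitude)
# so equal-angle grid cells contribute in proportion to their area.
nlat, nlon = 36, 72                                         # 5-degree grid
lats = np.linspace(-87.5, 87.5, nlat)
anom = np.random.default_rng(0).normal(size=(nlat, nlon))   # stand-in anomaly data

weights = np.cos(np.deg2rad(lats))[:, None] * np.ones((1, nlon))
global_mean = np.average(anom, weights=weights)
print(f"area-weighted global anomaly: {global_mean:+.3f} C")
```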

If you want more convenient access to the station data, I have a portal page here. The heading line looks like this:

[Image: heading line of the station-data portal page]

The BoM AWS link takes you to this page, listing all station names with links to their current month data page. BoM also posts the metadata for all their stations, and that link takes you to this page, which lists all stations (not just AWS, and including closed stations) with links to metadata. The GHCN Stations button links to this page, which links to the NOAA summary page for each GHCN station by name, or if you click the radio buttons, to station annual data in various formats.

Summary

I have shown, for Australia (BoM) at least, that you can follow the unadjusted temperature data right through from within a few minutes of measurement to its incorporation into the global unadjusted GHCN, which is then homogenized for global averages. Of course, I can only show one example of how it goes through without change, but the path is there, and transparent. Those who are inclined to doubt should try to find cases where it is modified.

Ref.: https://wattsupwiththat.com/2017/08/08/bom-raw-temperature-data-ghcn-and-global-averages/



Why does CO2 lag behind temperature?

71% of the Earth is covered by ocean. Water is roughly 1,000 times denser than air, and the mass of the oceans is around 360 times that of the atmosphere. Small temperature changes in the oceans don’t just modulate air temperature; they also affect the CO2 level, in accordance with Henry’s Law.
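To put a rough number on the solubility side of that, here is a back-of-envelope sketch using commonly tabulated values for the Henry’s-law constant of CO2 and its van ’t Hoff temperature dependence; both constants are approximate textbook figures, not measurements of any particular ocean:

```python
import math

# Henry's law: dissolved CO2 concentration c = kH(T) * pCO2.
# Temperature dependence (van 't Hoff form): kH(T) = kH(T0) * exp(B * (1/T - 1/T0)),
# with kH(298 K) ~ 0.034 mol/(L*atm) and B ~ 2400 K for CO2 (approximate values).
KH_298 = 0.034     # mol per litre per atm at 25 C
B = 2400.0         # K

def kh(temp_c: float) -> float:
    t = temp_c + 273.15
    return KH_298 * math.exp(B * (1.0 / t - 1.0 / 298.15))

# Warm surface water from 15 C to 16 C at constant CO2 partial pressure:
drop = 1.0 - kh(16.0) / kh(15.0)
print(f"CO2 solubility falls by about {drop:.1%} per degree of warming")  # ~2.8%
```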

The reason it is called a “Law” is that it has been “proven”!

“.. scientific laws describe phenomena that the scientific community has found to be provably true ..”

That means the graph shows that CO2 does not control temperature, which in turn means that (man-made) Global Warming, now called “Climate Change” due to the lack of … warming, is once again debunked!