By Paul Homewood
Reposted from Jo Nova:
On Sunday, Goulburn got colder than the BOM thought was possible (and a raw data record was “adjusted”).
The BOM got caught this week auto-adjusting cold extremes to be less cold. Lance Pidgeon of the unofficial BOM audit team noticed that the thermometer at Goulburn airport recorded -10.4°C at 6.17am on Sunday morning, but the official BOM climate records said it was -10.0°C. (What’s the point of that decimal place?) Either way this was a new record for Goulburn in July. (The previous coldest ever July morning was -9.1°C. The coldest day in Goulburn was in August 1994, when it reached -10.9°C.)
Apparently this was an automated event where the thermometer recorded something beyond a set limit, and the value put into the official database was the artificial limit. Since colder temperatures have already been recorded in Goulburn, who thought it was a good idea to trim all future minus-ten-point-somethings as if they were automatically “spurious”?
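The behaviour described, where a reading beyond a preset limit is replaced in the database by the limit value itself, amounts to a simple clamp. A minimal sketch of that logic (the function name and the limit values are hypothetical illustrations; the BOM has not published its actual thresholds):

```python
# Hypothetical illustration of a quality-control clamp: readings outside
# preset limits are replaced by the limit value itself, rather than being
# stored as recorded or merely flagged for review.

LOWER_LIMIT = -10.0   # assumed lower bound (actual BOM threshold unpublished)
UPPER_LIMIT = 52.0    # assumed upper bound, purely illustrative

def clamp_reading(temp_c: float) -> float:
    """Return the reading truncated to the [LOWER_LIMIT, UPPER_LIMIT] range."""
    return max(LOWER_LIMIT, min(UPPER_LIMIT, temp_c))

# The Goulburn case: the thermometer recorded -10.4, the database stored -10.0.
print(clamp_reading(-10.4))  # -10.0
print(clamp_reading(-9.1))   # -9.1 (within limits, unaffected)
```

Under such a scheme the recorded extreme is silently rewritten, which is exactly why a -10.4 reading could surface in the official record as -10.0.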
Yesterday the BOM acknowledged the error and at first deleted the -10.0 figure, replacing it with a blank space. Then today, after Jennifer Marohasy’s post, they corrected it.
You might think half a degree between friends is not that significant, but this opens a whole can of worms in so many ways — what are these “limits”, do they apply equally to records on the high side, who set them, how long has this been going on, and where are they published? Are the limits on high temperatures set this close to previously recorded temperatures? How many times have raw records been automatically truncated?
This raises a further question: what is “raw” data?
Perhaps most importantly, Jennifer Marohasy, I, and the whole BOM audit team had been told that Climate Data Online (CDO) represented real raw temperatures. Now apparently it does not. “Raw”, it seems, is not necessarily raw, but pre-adjusted, possibly by unpublished, unknown methods. The CDO data is the only data that matters for long-term climate studies. To a scientist, shouldn’t the real raw data be sacred?
Marohasy uses a simple plot of minimum temperatures recorded at Goulburn and a normal curve to show that the BOM choice of -10.0 would be expected to cut off normal real raw measurements.
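Marohasy’s normal-curve argument can be sketched numerically. Using hypothetical values for the mean and standard deviation of Goulburn’s July minima (the real station statistics would have to be computed from the CDO record), a cutoff at -10.0 sits close enough to the tail that genuine readings will occasionally fall below it:

```python
# Sketch of the normal-curve argument: with assumed statistics for July
# daily minima, what fraction of genuine readings would a -10.0 C cutoff
# intercept?
from statistics import NormalDist

# Hypothetical distribution of July daily minimum temperatures at Goulburn
# (degrees C); NOT the actual station statistics.
july_minima = NormalDist(mu=0.0, sigma=4.0)

cutoff = -10.0
p_below = july_minima.cdf(cutoff)  # probability a real reading falls below the cutoff
print(f"P(minimum < {cutoff}) = {p_below:.4f}")
```

With these assumed numbers the cutoff sits 2.5 standard deviations below the mean, so roughly 0.6% of mornings — a few per decade — would produce a genuine reading that the limit would truncate, rather than a spurious one.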
This is yet another way to bias the long-term so-called “raw” climate data. Thanks to a belief in man-made global warming, researchers might have a mindset that temperatures can only naturally break records on the high side, so they may have set asymmetrical high and low limits. There’s no way to know until the BOM provides the details. But if the top-end limit is set at 52°C, while the bottom-end limit is set at -10°C — a temperature that has already been recorded in recent history — this would be yet another artificial bias. High-end noise might be considered “real” while low-end real data might be considered “spurious”.
Where are these methods published, or is it another secret process?
Jennifer Marohasy has the full story here.
Is this simply an unfortunate mistake? Maybe, but these datasets usually have some sort of algorithm to throw out suspect readings. Is this evidence that it is the colder temps that tend to be chucked out, rather than the warmer ones?
Meanwhile, clearly fake temperatures at Heathrow are accepted as gospel.