Fifteen posts into this series — and I certainly hope that you have read all of them — perhaps there are still a few of you out there who continue to believe that this whole global average surface temperature (GAST), “hottest year ever,” “record warming” thing can’t really be completely fraudulent. I mean, these claims are put out by government bureaucrats, highly paid “experts” in their designated field of temperature measurement. It’s really complicated stuff to figure out a “global average surface temperature” from hundreds of scattered thermometers, some of which get moved, get read at different times of the day, have cities grow up around them, whatever. Somebody’s got to make the appropriate adjustments. Surely, they are trying their best to get the most accurate answer they can with a challenging task. Could it really be that they are systematically lying to the people of America and the world?
The designated field for my own career was civil litigation, and in that field lawyers regularly call upon ordinary members of the public (aka jurors) to draw the inference of whether fraud has occurred. Lawyers claiming that a defendant has committed fraud normally proceed by presenting to the jury a few glaring facts about what the defendant has done. “Here is what he said”; and “here is the truth.” The defendant then gets the chance to explain. The jurors apply their ordinary judgment and experience to the facts presented.
So, consider yourself a member of my jury. The defendants (NASA and NOAA) have been accused of arbitrarily adjusting the temperatures of the past downward in order to make fraudulent claims of “hottest year ever” for the recent years. You decide! I’ll give you a couple of data points that have come to my attention just today.
James Freeman is the guy who has taken over the Wall Street Journal’s “Best of the Web” column since James Taranto moved on to another gig at the paper earlier this year. Here is his column for yesterday. (You probably can’t get the whole thing without subscribing, but I’ll give you his critical links.) Freeman first quotes the New York Times, March 29, 1988, which in turn quotes James Hansen, then head of the part of NASA that does the GAST calculations:
One of the scientists, Dr. James E. Hansen of the National Aeronautics and Space Administration’s Institute for Space Studies in Manhattan, said he used the 30-year period 1950-1980, when the average global temperature was 59 degrees Fahrenheit, as a base to determine temperature variations.
So 59 deg F was the “average global temperature” for the 30-year period 1950-1980. Could that have been a typo? Here is the Times again, June 24, 1988:
Dr. Hansen, who records temperatures from readings at monitoring stations around the world, had previously reported that four of the hottest years on record occurred in the 1980’s. Compared with a 30-year base period from 1950 to 1980, when the global temperature averaged 59 degrees Fahrenheit, the temperature was one-third of a degree higher last year.
OK, definitely not a typo. Freeman also has multiple other quotes from the Times, citing both NASA and “a British group” (presumably Hadley CRU) for the same 59 deg F global average for the period 1950-80. So let’s then compare that figure to the official NOAA January 18, 2017 “record” global warming press release: “2016 marks three consecutive years of record warmth for the globe”:
2016 began with a bang. For eight consecutive months, January to August, the globe experienced record warm heat. With this as a catalyst, the 2016 globally averaged surface temperature ended as the highest since record keeping began in 1880. . . .
And kindly tell us, what was the global average temperature that constituted this important “record warm heat”?
The average temperature across global land and ocean surfaces in 2016 was 58.69 degrees F . . . .
OK, over to you to decide. Was the claimed “record warm heat” real, or was it an artifact of downward adjustments of earlier temperatures? If you think it might help (it won’t), here is a link to NASA’s lengthy bafflegab explanation of its adjustments. It’s way too long to copy into this post, and provides literally no useful information as to what they are doing, or why they think it’s OK.
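The arithmetic here is trivial, but it is the whole ballgame, so let’s spell it out. A minimal sketch (the two figures are exactly the ones quoted above; the variable names are mine):

```python
# Illustrative arithmetic only, using the two published figures quoted above.
baseline_1950_80_f = 59.0   # NASA/Hansen's 1950-1980 global average, per the NYT
record_2016_f = 58.69       # NOAA's 2016 "record" global average, per its press release

difference_f = baseline_1950_80_f - record_2016_f
print(f"The 2016 'record' is {difference_f:.2f} deg F BELOW the 1950-80 baseline")
```

In other words, taking the two official figures at face value, the “hottest year ever” came in about a third of a degree Fahrenheit colder than the average of 1950-1980.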
Do you still think it might be possible that they are playing straight with you? My friend Joe D’Aleo (he’s one of the co-authors of the paper that was the subject of Part XV of this series) sent me this morning a write-up he had done about the temperature adjustments at one of the most prominent sites in the country, the one at Belvedere Castle in Central Park in Manhattan. There are lots of charts and graphs at the link for your edification. The temperature measuring site has been at the very same location near the exact middle of the park since 1920. That location is about 0.2 mi from the west edge of the park and 0.3 mi from the east edge, so, relatively speaking, it is highly insulated from the local land-use changes that affect many other stations. Yes, the City has grown some in that century, but the periphery of the park was already rather built up in 1920, and in any event the nearest park boundary, along Central Park West, is almost a quarter-mile away.
This paper is another real eye-opener. You should read the whole thing (it’s only 7 pages long). The Central Park site is one for which the National Weather Service (part of NOAA) makes completely original, raw data available. D’Aleo does a comparison between that completely raw data and adjusted data for the same site from NOAA’s so-called “HCN Version 1” set, for two months each year (July and January), covering the century from 1909 to 2008. Essentially all of the temperatures for Central Park in the HCN Version 1 set are adjusted down, and dramatically so; but the adjustments are not uniform. From approximately 1950 to 1999, the downward adjustments for both months are approximately a flat 6 deg F — an astoundingly huge amount, especially given that the recently declared “record” temperature for 2016 beat the previous “record” by all of 0.07 deg C (which works out to 0.126 deg F). Then, starting in 1999, the downward adjustments decrease rapidly each year, until by 2008 the downward adjustment is only about 2 deg F. Result: whereas the raw data have no material upward or downward trend of any kind over the whole century under examination, the adjusted data show a dramatic upward slope in temperatures post-2000, all of which is in the adjustments rather than the raw data. D’Aleo:
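To get a feel for the relative magnitudes involved, here is the arithmetic in a short sketch (the adjustment figures are the approximate ones D’Aleo reports; the comparison is mine):

```python
# Illustrative arithmetic: scale of the Central Park adjustments vs. the "record" margin.
record_margin_c = 0.07                      # 2016 beat the prior "record" by 0.07 deg C (NOAA)
record_margin_f = record_margin_c * 9 / 5   # converts to 0.126 deg F

adjustment_1950s_1990s_f = 6.0   # approximate flat downward adjustment, ~1950-1999 (per D'Aleo)
adjustment_2008_f = 2.0          # approximate downward adjustment by 2008 (per D'Aleo)

# The CHANGE in the adjustment alone (~4 deg F) is roughly 30 times the
# margin by which the 2016 "record" was declared.
swing_f = adjustment_1950s_1990s_f - adjustment_2008_f
print(f"Adjustment swing: {swing_f:.1f} deg F, vs. record margin of {record_margin_f:.3f} deg F")
print(f"Ratio: about {swing_f / record_margin_f:.0f}x")
```

The point of the comparison: the post-1999 shrinkage of the adjustment by itself dwarfs, by more than an order of magnitude, the margin on which the “record” claim rests.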
[T]he adjustment [for July] was a significant one (a cooling exceeding 6 degrees from the mid 1950s to the mid 1990s.) Then inexplicably the adjustment diminished to less than 2 degrees. The result is [that] a trendless curve for the past 50 years became one with an accelerated warming in the past 20 years. It is not clear what changes in the metropolitan area occurred in the last 20 years to warrant a major adjustment to the adjustment. The park has remained the same and there has not been a population decline but a spurt in the city’s population [since 1990].
Since NOAA and NASA will not provide a remotely satisfactory explanation of what they are doing with the adjustments, various independent researchers have tried to reverse-engineer the results to figure out what assumptions are implied. One such effort was made by Steve McIntyre of the climateaudit.org website, and D’Aleo discusses that effort at the link. McIntyre gathered from correspondence with NOAA that their algorithm was making an “urbanization” adjustment based on the growing population of the urbanized area surrounding the particular site. Based on the adjusted temperatures reported at Central Park and the known population of New York City in the first half of the twentieth century, McIntyre then extrapolated to calculate the implied population of New York City for the recent years of the adjusted record. He came up with an implied population of about 17 million for 1975-95, which then suddenly plunges to barely 1 million by 2005. Well, I guess that’s not how they do it! Any other guesses out there?
By the way, in case you have the idea that you might be able to dig into this and figure out what they are doing, I would point out that by the time you have completed any analysis they will undoubtedly have adjusted their data yet again and will declare your work inapplicable because that’s “not how we do it any more.” As the Wall Street Journal’s Holman Jenkins noted in November 2015:
By the count of researcher Marcia Wyatt in a widely circulated presentation, the U.S. government’s published temperature data for the years 1880 to 2010 has been tinkered with 16 times in the past three years.
I’m just wondering if you still think there’s anything honest about this.