Category Archives: Climate Science

GHCN V3: A First Look

Back around Christmas of 2009, Dr. Tom Peterson of the NCDC let it be known, in an email to Willis Eschenbach, that the GHCN dataset was going through a revision. Willis posted what Dr. Peterson had to say in a comment on WUWT, and you can read the whole thing there:

http://wattsupwiththat.com/2009/12/20/darwin-zero-before-and-after/#comment-272529

However, I will copy the parts pertinent to this post here:

They are currently in the process of applying their adjustment procedure to GHCN. Preliminary evaluation appears very, very promising (though of course some very remote stations like St Helena Island (which has a large discontinuity in the middle of its long record due to moving downhill) will not be able to be adjusted using this approach). GHCN is also undergoing a major update with the addition of newly available data. We currently expect to release the new version of GHCN in February or March along with all the processing software and intermediate files which will dramatically increase the transparency of our process and make the job of people like you who evaluate and try to duplicate surface temperature data processing much easier.

SNIP

We’re doing a lot of evaluation of our new approach to adjusting global temperature data to remove artificial biases but additional eyes are always welcome. So I would encourage you to consider doing additional GHCN evaluations when we release what we are now calling GHCN version 2.5 in, hopefully, February or March of 2010.

Well, as we now know, what they then called GHCN v2.5 was actually named GHCN v3. We also know that Dr. Peterson was overly optimistic on the release date by over a year, since GHCN v3 didn't go “live” until May 2, 2011:

 Effective May 2, 2011, the Global Historical Climatology Network-Monthly (GHCN-M) version 3 dataset of monthly mean temperature will replace GHCN-M version 2 as the dataset for operational climate monitoring activities. Although GHCN-M version 2 will continue to be updated with recent observations until June 30, 2011, users are encouraged to begin using GHCN-M version 3.

http://www.ncdc.noaa.gov/ghcnm/

Now, prior to this news release, GHCN v3 was available in a beta version, and people started comparing the GHCN v3 beta against GHCN v2 adj; however, I held off, waiting for the live version before I started looking. There were a couple of reasons for this: Read more of this post


The Southeast US Has Been Cooling For 115 Years

Here we go again. Whenever there is an outbreak of severe WEATHER, the global warmists try to tie it to “Climate Change”.

First off, WEATHER is not CLIMATE; but even if you could tie them together, the facts do not support their assertion that it is getting warmer and that the warming is causing these outbreaks.

Look at where these tornadoes occurred: the Southeast US (Arkansas, Mississippi, Georgia and the Carolinas).

What the warmists are trying to do is tie the long-term trend in the Global Temperature Anomaly (GTA) to a regional phenomenon, instead of looking at what the long-term trend in the regional temperatures actually is. According to NOAA/NCDC, the Southeast US is in a long-term cooling trend:

http://www.ncdc.noaa.gov/temp-and-precip/time-series/index.php?parameter=tmp&month=12&year=2010&filter=12&state=104&div=0

Now you can clearly see in this screenshot of NOAA/NCDC's own information that the long-term trend over 115 years is a cooling one. You can also see that this region cycles: from about 1900 to about the mid-1930's it warmed, from then until the mid-70's it cooled, and from there until recently it warmed again. However, in the last couple of years (2007 and on) it has been getting cooler again.

This clearly shows that where the tornadoes occurred the surface air temperature has NOT been warming but cooling.
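If you want to check that trend yourself, below is a minimal sketch of fitting a least-squares trend line to the annual values. The file name and layout are my assumptions for illustration; the actual data comes from the NOAA/NCDC time-series page linked above.

```python
# Minimal sketch: fit a least-squares trend line to annual mean temperatures.
# "se_us_annual.csv" is a hypothetical two-column file (year, annual mean temp)
# saved from the NOAA/NCDC time-series page linked above.
import numpy as np

data = np.loadtxt("se_us_annual.csv", delimiter=",")
years, temps = data[:, 0], data[:, 1]

# np.polyfit with degree 1 returns [slope, intercept];
# a negative slope means a long-term cooling trend.
slope, intercept = np.polyfit(years, temps, 1)
print(f"Trend: {slope * 100:+.2f} degrees per century")
```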

Do We Really Know What The Temperature Is?

How many times have you seen temperature graphs similar to this:

Figure 1

Figure 1 is a graph of the average yearly anomalies for the USHCN Raw dataset for the station located at State College, PA, but it could just as well be a station record from GISS or GHCN, or from any place around the world, and the basic properties of the graph would be the same. You would have start and end dates, data points, possibly a trend line, and so forth. It all looks fine except for one little problem that I have yet to see addressed in Climate Science: Instrument Error.
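For readers who haven't built one of these graphs themselves, here is a minimal sketch of how yearly means become anomalies. The file name and the 1961-1990 baseline are my assumptions for illustration, not necessarily what USHCN uses.

```python
# Minimal sketch: turn yearly mean temperatures into anomalies.
# "station_yearly.csv" is a hypothetical two-column file (year, yearly mean).
import numpy as np

data = np.loadtxt("station_yearly.csv", delimiter=",")
years, means = data[:, 0], data[:, 1]

# An anomaly is just the departure from a chosen baseline average
# (1961-1990 here, picked purely for illustration).
baseline = means[(years >= 1961) & (years <= 1990)].mean()
anomalies = means - baseline

for year, anom in zip(years.astype(int), anomalies):
    print(year, f"{anom:+.2f}")
```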

Every device that man has built to measure something has a built-in error range, and it is something that can only be accounted for and known if you compare your device against one of known value or against something with a known physical property. Thermometers are typically compared against the known freezing and boiling points of water. This is called calibration, and even after calibration there is still an error band, which is the accuracy of the thermometer. With today's technology we can sometimes whittle that error band down to as little as +/- 0.001°, but accuracy like that costs a lot more money, so typically you won't get anywhere near that level. However, what about in the past? How accurate were those thermometers back in the late 1800's and early 1900's? What was the accuracy of the ones in use even in the 40's and 50's?

If you look around the internet you find that Liquid in Glass (LIG) thermometers are still for sale, with error ranges as large as +/- 0.5°, others at +/- 0.3°, and you can buy Platinum Resistance Thermometers (PRT) with accuracies of +/- 0.1°. Which brings us back to the question of what it was like back in the olden days, before the people taking these readings were worried about rises in temperature of a fraction of a degree. So would it be reasonable to assume that early Max/Min thermometers had an accuracy of at least +/- 0.5°? I would say yes.

Now you may ask, what does that have to do with anything? The answer is: a lot. You see, that +/- 0.5° rating means that when you see daily max/min temperatures in those old records, they aren't exactly true. For example, if it is written that the Max was 50°F and the Min was 39°F, the Max could actually be as high as 50.5° or as low as 49.5°, with the Min actually being between 38.5° and 39.5°. This in turn affects the mean for the day, and consequently the average mean for the month, by the same amount. Here is an example: let's say that the thermometer in State College, PA during the year 1939 had an accuracy of +/- 0.5°F. Below you will see how much that can affect the means.
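Here is a minimal sketch of that arithmetic, using made-up max/min readings. Since the daily mean is (max + min) / 2, and both readings can be off by up to 0.5° in the same direction, the daily means, and the monthly average built from them, inherit the same +/- 0.5° band.

```python
# Minimal sketch: worst-case effect of a +/- 0.5 degree instrument error
# on daily means and their monthly average. Readings below are made up.
ERR = 0.5  # assumed instrument accuracy, degrees F

days = [(50, 39), (48, 37), (52, 40), (55, 41), (53, 42), (49, 36), (47, 35)]

recorded = [(mx + mn) / 2 for mx, mn in days]
lowest   = [((mx - ERR) + (mn - ERR)) / 2 for mx, mn in days]
highest  = [((mx + ERR) + (mn + ERR)) / 2 for mx, mn in days]

def avg(xs):
    return sum(xs) / len(xs)

print(f"Average of recorded means: {avg(recorded):.2f} F")
print(f"Could really be anywhere from {avg(lowest):.2f} to {avg(highest):.2f} F")
```

Read more of this post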

2010 Hottest Year Ever?

2010 is coming to a close at the end of this month, at least in a meteorological sense. You see, for meteorological purposes the year ends at the end of November instead of December, because December is the starting month of the winter season. So how is 2010 shaping up? Well, according to most of the analyses, 2010 will not be the Hottest Year on Record.
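For anyone unfamiliar with that convention, here is a tiny sketch of how months map to a meteorological year under it, with December opening the winter of the following year:

```python
# Minimal sketch: map a calendar (year, month) to its meteorological year,
# using the convention that December starts the following winter, so the
# meteorological year runs December through November.
def met_year(year: int, month: int) -> int:
    return year + 1 if month == 12 else year

print(met_year(2010, 11))  # 2010 -- November closes meteorological 2010
print(met_year(2010, 12))  # 2011 -- December 2010 opens winter of 2011
```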

Figure 1

Here in Figure 1 we see the monthly anomalies for each of the analysis datasets. It only goes back to 1979 for two reasons: first, 1979 is as far back as you can go with the satellite datasets; second, all of the datasets have their Hottest Year on Record after 1979. As can be clearly seen, the only dataset that has 2010 as a possible record year is the GISS analysis. Read more of this post

Another Problem with GISS Infilling?

Over on WUWT, Steven Mosher has a very interesting post up about problems with the latitude and longitude that NCDC gives in its GHCN dataset. What he has found is that the coordinates given by NCDC for different temperature monitoring stations are off, and not just by a little bit; sometimes it's by as much as 300 km from their actual locations. He shows how this error is then transmitted from GHCN into the GISS analysis via the GISS nightlights adjustment for UHI. One example he shows is a temperature station at an airport out among the sands of a desert. This should cause GISS to find nightlights and also to list the station as an airport, but it doesn't. According to GISS that station comes up dark and is listed as desert. Why? The WMO (World Meteorological Organization) has the coordinates listed for the airport, but GHCN has the coordinates 300 km away in the middle of the desert with nothing surrounding them. This will obviously skew the numbers for that station in the GISS analysis, but by how much is not known at the moment, and we do not know how many other stations are incorrectly placed (some listed as dark when they should be lighted, and vice versa) or how much the global numbers are affected. While I have tried to explain it adequately, it is better to see the post itself, which includes Google Earth shots that drive the point home, before I move on to the next part. Here is the link to the post:

http://wattsupwiththat.com/2010/10/31/metadata-errors-in-the-global-weather-station-database/
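To get a feel for how large a 300 km metadata error is, here is a minimal sketch that computes the great-circle distance between a database-listed position and an actual position. The two coordinate pairs are made-up stand-ins, not the real station's.

```python
# Minimal sketch: great-circle (haversine) distance between two lat/lon
# points, e.g. a station's GHCN-listed location vs. its true location.
# Both coordinate pairs below are hypothetical.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    R = 6371.0  # mean Earth radius, km
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

listed = (24.55, 54.00)  # hypothetical coordinates in the database
actual = (24.43, 56.75)  # hypothetical true (airport) coordinates
print(f"Listed vs. actual: {haversine_km(*listed, *actual):.0f} km apart")
```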

Now, I haven't read all the comments on WUWT yet, but a thought occurred to me: if this error causes this problem in GISS, could it cause another type of problem? My answer is yes.

Read more of this post

How Long Do Spring, Summer and Fall Last on Mt. Hood?

Well, for the year of 2010, if you go by the ski season, they total about 4 months. You see, back in the second week of June, when I was on vacation in the Portland area, I visited Mt. Hood and the Timberline Lodge ski resort. At that time the resort was still open and there was skiing; here is a picture taken from there:

Read more of this post

East Coast Summer

In my last post we had the West Coast; now it's time for the East Coast. Again the stations were selected on the basis of having records from the late 1800's through the summer of 2010, and to include both coastal and inland stations, rural and urban. First I will work from North to South with the coastal stations, then do the same with the inland ones.

Figure 1

Here in Figure 1 we see that 2010 was a record-breaking summer, with the summer mean being more than 2.25°C above the average. When you look at this record you see regular ups and downs on top of a steady rise over time.

Figure 2

Here in Figure 2 we see again that 2010 was a record-breaking summer, with the mean being more than 2°C above the average. This record is very similar to Figure 1, with what looks to be regular ups and downs on top of a steady rise. Read more of this post

West Coast Summer

UPDATE: Some of the graphs were in error and have now been fixed; however, the corrections didn't change any of the results.

Back in August I did a post showing that Los Angeles and San Diego had record-breaking cool Julys. I also did a post on how hot July really was on the East Coast. This time I will do expanded posts on the summers for the West Coast and the East Coast. These posts include both rural and urban stations, coastal and inland; the criterion I looked for was records that run from the late 1800's to the present in the GISS station analysis. You can get the data from here: http://data.giss.nasa.gov/gistemp/station_data/
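If you want to reproduce these summer means yourself, below is a rough sketch of averaging the June, July and August values from a monthly station file. The CSV layout and file name are assumptions for illustration; adjust them to the actual GISS download format.

```python
# Minimal sketch: compute each year's summer (Jun/Jul/Aug) mean from a
# monthly station file. "station_monthly.csv" is a hypothetical file with
# columns: year, month, temperature.
import csv
from collections import defaultdict

summer = defaultdict(list)
with open("station_monthly.csv") as f:
    for year, month, temp in csv.reader(f):
        if int(month) in (6, 7, 8):
            summer[int(year)].append(float(temp))

for year in sorted(summer):
    if len(summer[year]) == 3:  # skip incomplete summers
        print(year, round(sum(summer[year]) / 3, 2))
```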

So we will start on the West Coast with 11 stations in total: 6 coastal, 5 inland. I will start in Washington and work south for the coastal stations, then do the same with the inland stations.


Figure 1


Here in Figure 1 we see that the summer of 2010 was the second coolest in the Aberdeen record and well below average.


Figure 2


Here in Figure 2 we see that in Astoria, while 2010 was below the average temperature, it was nowhere close to a record summer; if anything, it would be considered just another normal summer. Read more of this post

Wildfires, Pine Beetles and CAGW/Climate Change Part 2

Back at the beginning of September there were two wildfires in Colorado, one of which, the Fourmile Canyon fire, got major media attention in the US, which brought about people looking to tie it to Global Warming/Climate Change/Climate Disruption or whatever they are calling it this week. That led me to make this post: https://boballab.wordpress.com/2010/09/12/wildfires-pine-beetles-and-cagwclimate-change/ . Now I have finally got things straightened out and found some time, so here is part 2 of the series.

As shown in Part 1, the assertion is that AGW/Climate Change is causing longer summers in the Rockies and thus an increase in the Pine Beetle population. This in turn leads to more forest area being vulnerable to wildfires.

Here in step one I used the USHCN Web Interface, based on Google Maps, to find which stations are in and around the wooded areas of Colorado. There are 13 stations that meet that criterion in the USHCN database, and thus in the NASA GISS dataset, which is the one I will be using for this look at temperatures. Of those 13 stations only 12 are usable: one station's (Collbran) data stops in the year 2000, while the others all reach to 2008 or 2009.
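The screening step itself is simple; here is a minimal sketch of it, with the station list cut down to made-up entries except for Collbran, whose record really does stop in 2000:

```python
# Minimal sketch: keep only stations whose records reach recent years.
# Apart from Collbran, the names and end years here are hypothetical.
last_year = {
    "Collbran": 2000,
    "Example Station A": 2009,
    "Example Station B": 2008,
}

usable = sorted(name for name, end in last_year.items() if end >= 2008)
print(f"{len(usable)} usable stations:", usable)
```

Read more of this post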

Mann Criticizes Mann 08 For Data Quality Errors!

No, I'm not making that up, for that is what Dr. Michael Mann has done in his attempt to refute the McShane & Wyner paper (MW10). Dr. Mann, in conjunction with Dr. Gavin Schmidt and Dr. Rutherford, has written a comment on the MW10 paper, and the Journal of Applied Statistics has accepted it for print. However, it wasn't a very good critique of the MW10 paper, because one of the points they make is that:

However, the absence of both proper data quality control and appropriate “pseudoproxy” tests to assess the performance of their methods invalidate their main conclusions.

Now, if you read the MW10 paper, you will see that the issue of data quality control was explicitly handled this way:

We are not interested at this stage in engaging the issues of data quality. To wit, henceforth and for the remainder of the paper, we work entirely with the data from Mann et al. (2008). This is by far the most comprehensive publicly available database of temperatures and proxies collected to date. It contains 1,209 climate proxies (with some going back as far as 8855 BC and some continuing up till 2003 AD). It also contains a database of eight global annual temperature aggregates dating 1850-2006 AD (expressed as deviations or “anomalies” from the 1961-1990 AD average). Finally, there is a database of 1,732 local annual temperatures dating 1850-2006 AD (also expressed as anomalies from the 1961-1990 AD average). All three of these datasets have been substantially processed including smoothing and imputation of missing data (Mann et al., 2008). While these present interesting problems, they are not the focus of our inquiry. We assume that the data selection, collection, and processing performed by climate scientists meets the standards of their discipline. Without taking a position on these data quality issues, we thus take the dataset as given.

Now, as you can see, if there are any data quality issues in the publicly available data, then they are the responsibility of the climate scientist who made the database. McShane and Wyner started their paper with the a priori assumption that the climate scientist knew what he or she was doing when the sampling, collection, and processing were performed. Now, who was the climate scientist who did the processing of the data in the database and was responsible for its quality? Dr. Michael Mann.

In conclusion, for Dr. Mann to now claim that the data used by McShane & Wyner was of poor quality, he is actually admitting that the data in the Mann 08 paper is poor and wasn't handled properly. Thus, by criticizing the MW10 paper he actually criticizes himself and Mann 08, because the data used in MW10 is nothing more than the data used in Mann 08.
You can read the MW paper here:

You can read the Mann/Schmidt/Rutherford comment here: