A Hypothetical Cow “Gored” to Death

Hopefully this post will go a long way toward “Goring” to death, once and for all, the warmist hypothetical cow that explains away the “Great Dying of Thermometers” in the GHCN dataset as nothing more than a lag in reporting.

Background first:

This all started with a post on EM Smith’s blog about how, since basically the 1990s, there has been a huge drop-off in the number of reporting stations used in GHCN.

http://chiefio.wordpress.com/2009/11/03/ghcn-the-global-analysis/

Now GHCN started out back in the late ’80s and early ’90s as a project of the National Climatic Data Center. There they gathered historical data from around the world and compiled it in 1991 into the first iteration of GHCN. That would have taken a lot of work and time, since paper records had to be transcribed into digital format. GHCN then went through a second revision in 1997 into what we now know as GHCN v2, which can be found here:

http://www.ncdc.noaa.gov/oa/climate/ghcn-monthly/index.php

Now this new update shouldn’t have taken nearly as much time, since most of the transcribing into digital format was already done. Only newly discovered older records would have required that process; all new data (such as the monthly updates) arrives in digital form, and ingesting it should be a simple job for a competent programmer.
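
To give a sense of how mechanical that ingestion is, here is a minimal sketch of a parser for v2.mean-style records. The column layout is my reading of the published v2 format, so treat it as an assumption and check NCDC’s readme before relying on it:

```python
# Minimal sketch: parse GHCN v2.mean-style fixed-width records.
# Assumed layout: 11-char station ID, 1-char duplicate flag,
# 4-char year, then twelve 5-char monthly means in tenths of a
# degree C, with -9999 marking a missing month.

def parse_v2_mean_line(line: str):
    station_id = line[0:11]
    duplicate = line[11]
    year = int(line[12:16])
    temps = []
    for m in range(12):
        raw = int(line[16 + 5 * m : 21 + 5 * m])
        temps.append(None if raw == -9999 else raw / 10.0)  # deg C
    return station_id, duplicate, year, temps

# Fabricated example record (station ID and values are invented):
values = [-95, -42, 13, 64, 121, 168, 198, 187, 141, 77, 12, -63]
line = "403719260000" + "1990" + "".join(f"{v:5d}" for v in values)
print(parse_v2_mean_line(line))
```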

So when EM Smith found that the number of reporting stations in the GHCN dataset went down from a high of roughly 6,000 to 7,000 stations to around 1,500 stations by 2008, something didn’t sit right, and he looked at which ones went “missing”.

What he found was that the thermometers dropped were the ones at higher latitudes and elevations. Now, it is well known that the closer you get to the poles the lower your temperatures are, and the same applies as you get further above sea level. This led to a hypothesis on his part: since these colder thermometers were included in the “baseline” period but not in the later periods, their loss would induce a bias in the trend, causing the later temperature anomalies to read artificially higher than they should and giving a false trend.
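
To make the hypothesized mechanism concrete, here is a toy calculation of my own (not EM Smith’s code). It deliberately uses a naive average of absolute station temperatures, with no gridding and no per-station anomaly step, since that is exactly the situation the hypothesis worries about:

```python
# Toy illustration of the hypothesized station drop-out bias.
# Two stations with flat, unchanging climates: one cold (high
# latitude/elevation), one warm. Neither warms at all.
cold_station = 5.0    # mean annual temperature, deg C
warm_station = 15.0

# Baseline period: both stations report.
baseline_mean = (cold_station + warm_station) / 2    # 10.0 deg C

# Later period: the cold station has "died"; only the warm one
# reports. A naive average of absolute temperatures rises even
# though no individual station warmed.
later_mean = warm_station                            # 15.0 deg C

print(f"apparent 'warming': {later_mean - baseline_mean:+.1f} deg C")
# -> +5.0 deg C of spurious trend from station loss alone. Note that
# per-station anomalies (each station measured against its own
# baseline) would cancel this particular effect, which is why how
# the anomalies are actually computed is central to the hypothesis.
```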

Now, before this hypothesis had even been tested, warmists rushed out with a few hypothetical cows to explain the “missing” thermometers. One of these is that the drop-off happened because there is a lag in reporting. In some cases this does make sense, such as areas with political unrest like Central Africa or Iraq, which you can see dropping out in my animation “March of the Thermometers”. However, that doesn’t in any way, shape, or form explain Canada.

Now, on the face of it, the excuse of a reporting lag for Canada is farcical, since we all know Canadians have access to every piece of modern technology we have here in the US. To say that remote stations in the rugged Canadian wilderness must send their reports in on paper for lack of technology is insulting not just to my intelligence but to the entire country of Canada.

To make it simple for the warmists: even people living in the back hills of the Appalachians, known as “hillbillies” and “rednecks”, know you can get Internet access via satellite, and I’m sure the Canadian government knows this too. All you need is a generator or other power supply (even a solar panel) and a laptop with a satellite Internet hookup, and the most remote place on Earth can report in virtual “real time”.

Now, to give a little of my background: I was one of the technicians who operated the first teleradiology systems on the planet when I deployed with Fleet Hospital Six. Via landline or our backup sat-phone we could send X-rays taken at our version of a MASH back to Bethesda Naval Hospital for experts to look at, and they could talk to the docs on our end via teleconferencing. All we did was scan the X-ray into a laptop and use that newfangled thing called the Internet. That was all done back in 1994, and I’m sure the Canadians have figured out the basics of it by now. This isn’t rocket science, people; it can be done with stuff you find at Radio Shack.

However, that piece of logic is probably too much for warmists to accept, so let’s go with what the WMO has to say about this. You see, the WMO set up the Global Telecommunication System (GTS) so that even the most remote and backward countries could report in with a monthly CLIMAT report. So let’s go to the WMO guide on operations and see what it says:

2.6.2 Logging and reporting of observations

Immediately after taking an observation at a manual station, the observer must enter the data into a logbook, journal, or register that is kept at the station for this purpose. Alternatively, the observation may be entered or transcribed immediately into a computer or transmission terminal and a database. Legislation or legal entities (such as courts of law) in some countries may require a paper record or a printout of the original entry to be retained for use as evidence in legal cases, or may have difficulty accepting database-generated information. The observer must ensure that a complete and accurate record has been made of the observation. At a specified frequency (ranging from immediately to once a month), depending on the requirements of the NMHS, data must be transferred from the station record (including a computer database) to a specific report form for transmittal, either by mail or electronically, to a central office.

Climatological station personnel must ensure that there is a correct copy of the pertinent information in the report form. In the case of paper records, the necessities for good, clear handwriting, “clean” journals and report forms should be emphasized. Usually, more information, perhaps pertaining to unusual weather phenomena and occurrences, is entered in the local record than is required by the central office. The on-station record must be retained and readily accessible so that the station personnel can respond to any inquiries made by the central office regarding possible errors or omissions in the report form. Some services request observers to send logbooks to the national climate center for permanent archiving.

Some national climate centers will require the station personnel to calculate and insert monthly totals and means of precipitation and temperature so that the data may be more easily checked at the section or central office. In addition, either the climate center or observer should encode data for the CLIMAT messages (WMO/TD-No. 1188), if appropriate. WMO has software to encode the data. The observer should note in the station logbook and on the report forms the nature and times of occurrence of any damage to or failure of instruments, maintenance activities, and any change in equipment or exposure of the station, since such events might significantly affect the observed data and thus the climatological record. Where appropriate, instructions should be provided for transmitting observations electronically. If mail is the method of transmission, instructions for mailing should be provided to the station as well as preaddressed, stamped envelopes for sending the report forms to the central climate office.

Pages 46-47 of http://www.wmo.int/pages/prog/wcp/documents/Guide2.pdf

Now I know that mail is slow, but I doubt it takes up to 18 years for observations taken in the early ’90s to reach NCDC. Also notice that the WMO provides software for properly encoding CLIMAT reports for transmission.
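
For readers who have never seen one, a monthly CLIMAT message boils down to a handful of monthly values per station. The sketch below is a schematic stand-in of my own, not the real FM 71 CLIMAT code form (that is specified in WMO/TD-No. 1188); it is only meant to show how little data is actually involved:

```python
from dataclasses import dataclass

@dataclass
class MonthlyReport:
    """Schematic monthly climate report. NOT the real FM 71 CLIMAT
    code form (see WMO/TD-No. 1188); it only carries the same kind
    of information content such a message does."""
    wmo_station_id: str     # e.g. a 5-digit WMO index number
    year: int
    month: int
    mean_temp_c: float      # monthly mean air temperature
    precip_total_mm: float  # monthly precipitation total

    def to_line(self) -> str:
        # One short line per station-month: this is the entire
        # payload that supposedly takes years to arrive.
        return (f"{self.wmo_station_id} {self.year}-{self.month:02d} "
                f"T={self.mean_temp_c:+.1f}C P={self.precip_total_mm:.1f}mm")

# Hypothetical example values:
print(MonthlyReport("71123", 1995, 7, 17.3, 42.0).to_line())
```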

Now that should kill the hypothetical cow of “late reporting” right there, but let’s keep looking:

3.3 Climate data management

Climatological data are most useful if they are edited, quality controlled, and stored in a national archive or climate center and made readily accessible in easy-to-use forms. Although technological innovations are occurring at a rapid pace, many climatological records held by NMHS are still in non-digital form. These records must be managed along with the increasing quantity of digital records. A Climate Data Management System (CDMS) is a set of tools and procedures that allows all data relevant to climate studies to be properly stored and managed.

The primary goals of database management are to maintain the integrity of the database at all times, and to ensure that the database contains all the data and metadata needed to meet the requirements for which it was established, both now and into the future. Database management systems have revolutionized climate data management by allowing efficient storage, access, conversion, and update for many types of data, and by enhancing security of the data.

A major step forward in climate database management occurred with the World Climate Data and Monitoring Programme (WCDMP) Climate Computing (CLICOM) project in 1985. This project led to the installation of climate database software on personal computers, thus providing NMHS in even the smallest of countries with the capability of efficiently managing their climate records. The project also provided the foundation for demonstrable improvements in climate services, applications, and research. In the late 1990s, the WCDMP initiated a CDMS project to take advantage of the latest technologies to meet the varied and growing data management needs of WMO Members. Aside from advances in database technologies such as relational databases, query languages, and links with Geographical Information Systems, more efficient data capture was made possible with the increase in AWS, electronic field books, the Internet, and other advances in technology.

Page 63 of http://www.wmo.int/pages/prog/wcp/documents/Guide2.pdf

So here we see that starting in 1985, with the CLICOM project, the WMO gave even “the smallest of countries” the means to manage their climate records in digital form. Now I’m sure Canada could have managed that feat at some point in the last 25 years, let alone the “smallest countries” the WMO mentions. Then in the late ’90s the WMO initiated a project that uses such newfangled things as the Internet and electronic field books for more efficient data capture. I’m sure Canada could have implemented that system in the last 10 years.

Now this hypothetical cow of “late reporting”, or a lag, is looking more and more dead, but let’s soldier on:

3.5 Exchange of climatic data

Exchange of data is essential for climatology. For states that are members of WMO, the obligation to share data and metadata with other members, and the conditions under which these may be passed to third parties, is covered under WMO Resolution 40 (Cg-XII) for meteorological data, WMO Resolution 25 (Cg-XIII) for hydrological data, and the Intergovernmental Oceanographic Commission Resolution XXII-6 for oceanographic data. The Resolutions embody the concepts of “essential” and “additional” data, with a specification of a minimum set of data that should be made available in a non-discriminatory manner and at a charge of no more than the cost of reproduction and delivery, without charge for the data and products themselves. Members may decide to declare as “essential” more than the minimum set. The use of agreed-upon international standard formats for data exchange is critical.

Beyond CLIMAT and related messages (see section 4.8.7), members are also asked to provide additional data and products that are required to sustain WMO programmes at the global, regional, and national levels and to assist other Members in providing meteorological and climatological services in their countries. Members supplying such additional data and products may place conditions on their re-export. Research and educational communities should be provided with free and unrestricted access to all data and products exchanged under the auspices of WMO for their non-commercial activities.

Members of WMO volunteer subsets of their stations to be parts of various networks, including the GCOS GUAN, the GCOS GSN, the Regional Basic Synoptic Network, and the Regional Basic Climatological Network. Nomination of stations in these networks implies an obligation to share the data internationally.

Data are also shared through International Council for Science World Data Centers (WDCs). The WDC system works to guarantee access to solar, geophysical, and related environmental data. It serves the whole scientific community by assembling, scrutinizing, organizing, and disseminating data and information. WDCs collect, document, and archive measurements and the associated metadata from stations worldwide and make these data freely available to the scientific community. In some cases WDCs also provide additional products including data analyses, maps of data distributions, and data summaries. There are climate-related International Council for Science WDCs covering Meteorology, Paleoclimatology, Oceanography, Atmospheric Trace Gases, Glaciology, Soils, Marine Geology and Geophysics, Sunspots, Solar activity, Solar-Terrestrial Physics, Airglow, Aurora, Cosmic Rays, as well as for other disciplines.

WMO is actively involved in the provision of data to a number of these International Council for Science WDCs, and there are a number of associated centers operated directly through WMO. The WMO centers include Ozone and Ultraviolet Radiation, Greenhouse Gases, Aerosols, Aerosol Optical Depth, Radiation, and Precipitation Chemistry. There are differences in data access policy for International Council for Science and WMO centers. International Council for Science data centers will exchange data among themselves without charge and will provide data to scientists in any country free of charge. WMO data centers must abide by data exchange Resolutions 40 and 25 that allow for some data or products to be placed in the WDCs with conditions attached to their use.

There are many other centers beyond the International Council for Science WDCs that operate under cooperative agreements with WMO or with individual NMHS. These centers include the Global Precipitation Climatology Center and Global Runoff Data Center (Germany); Australian National Climate Center; Canadian Climate and Water Information Data Center; Hadley Center (UK); and in the USA, Lamont-Doherty Earth Observatory of Columbia University, National Climatic Data Center, National Oceanographic Data Center, National Geophysical Data Center, NASA Goddard Distributed Active Archive Center, Tropical Pacific Ocean Observing Array, and University Corporation for Atmospheric Research.

Exchange of digital data is simple for many members because of the range of computer communications systems available. The Global Telecommunication System is the meteorological communication system with connections to virtually all countries of the world. As an operational system with a critical role in global weather forecasting, it provides reliable communication services, albeit sometimes with low bandwidth. Like the Internet, the Global Telecommunication System is based on a confederation of interconnected networks. However, as a closed system, it is free from the security breaches that often plague the Internet. Open communication linkages such as the Internet should be protected by the best available security software systems to minimize the danger of unwanted access and file manipulation or corruption.

It is highly unlikely that archived formats used for climatological data by one country would be the same as those used by another. The format documentation describing the data organization, element types, units, and any other pertinent information should accompany the data. In addition, if the digital data are compacted or in a special non-text format, it is extremely useful for the contributing archive center to provide “read” routines to accompany digital data requested from an archive.

International data exchange agreements allow for the global compilation of publications such as Climatic Normals, World Weather Records, and Monthly Climatic Data for the World. Bilateral or multilateral agreements are also important in creating and exchanging long-term data sets, such as the Global Historical Climate Network, Comprehensive Aerological Reference, and Comprehensive Ocean-Atmosphere Data Sets compiled by the United States and the Hadley Centre global observations data sets compiled by the United Kingdom. These data sets are generally provided to research centers.

The current WMO information systems have been developed to meet a diverse set of needs for many different programmes and Commissions. The multiplicity of systems has resulted in incompatibilities, inefficiencies, duplication of effort, and higher overall costs for Members. An alternative approach planned to improve efficiency of the transfer of data and information among countries is the WMO Information System. It is envisioned that the WMO Information System will be used for the collection and sharing of information for all WMO and related international programmes. Non-meteorological and non-climatic environmental and geophysical data such as ecological, earthquake and tsunami data could be included. The WMO Information System vision provides guidance for the orderly evolution of existing systems into an integrated system that efficiently meets the international environmental information requirements of Members.

The WMO Information System will provide an integrated approach to routine collection and automated dissemination of observed data and products, timely delivery of data and products, and requests for data and products. It should be reliable, cost-effective, and affordable for developing as well as developed Members; technologically sustainable and appropriate to local expertise; modular; and scalable, flexible, and extensible. It should be able to adjust to changing requirements, allow dissemination of products from diverse data sources, and allow participants to collaborate at levels appropriate to their responsibilities and budgetary resources. The WMO Information System should also support different user groups and access policies such as WMO Resolutions 40 and 25, data as well as network security, and integration of diverse data sets.


Pages 79-82 of http://www.wmo.int/pages/prog/wcp/documents/Guide2.pdf

Now this section is very important, and I have highlighted the most important parts of it (some of which may already be emphasized in the WMO document).

Guess who the WDC is that maintains the temperature archives the WMO tells its member nations to send their data to?

World Data Center (WDC) for Meteorology, Asheville is one component of a global network of discipline subcenters that facilitate international exchange of scientific data. Originally established during the International Geophysical Year (IGY) of 1957, the World Data Center System now functions under the guidance of the International Council of Scientific Unions (ICSU).

The WDC for Meteorology, Asheville is maintained by the U.S. Department of Commerce, National Oceanic and Atmospheric Administration (NOAA) and is collocated and operated by the National Climatic Data Center (NCDC).

In accordance with the principles set forth by ICSU, WDC for Meteorology, Asheville acquires, catalogues, and archives data and makes them available to requesters in the international scientific community. Data are exchanged with counterparts, WDC for Meteorology, Obninsk and WDC for Meteorology, Beijing as necessary to improve access. Special research data sets prepared under international programs such as the IGY, World Climate Program (WCP), Global Atmospheric Research Program (GARP), etc., are archived and made available to the research community. All data and special data sets contributed to the WDC are available to scientific investigators without restriction.

http://www.ncdc.noaa.gov/oa/wdc/index.php

That’s right: the National Climatic Data Center (NCDC) in Asheville, NC. That is who is supposed to get every CLIMAT report sent over the GTS.

Why?

One reason, as mentioned by the WMO, is the GHCN dataset.

Various data sets and data products from international programs and/or experiments, including meteorological and nuclear radiation data for International Geophysical Year (IGY) (see IGY Annuals, Vol. 26); meteorological data and data products from Global Atmospheric Research Program, World Climate Research Program, World Climate Data and Monitoring Program; and data (including data publications) exchanged with the WDC by participating countries. Quality control is performed and documentation prepared by designated experiment centers or contributors before submission to WDC.

Global Historical Climate Network (GHCN) dataset. GHCN is a comprehensive global baseline climate data set comprised of land surface station observations of temperature, precipitation, and pressure. All GHCN data are on a monthly basis with the earliest record dating from 1697.

Is it starting to dawn on the warmists now that all CLIMAT records are available to the makers of the GHCN dataset and, furthermore, that one of their primary jobs is to archive all the temperature data in the world?

There is no such thing as a “lag” in reporting the data from Canada to NCDC: thanks to the GTS, once a CLIMAT report is transmitted, NCDC has access to it.

However, let’s make sure this hypothetical cow gets turned completely into hypothetical hamburger. You see, there is another organization that gets those same CLIMAT reports NCDC is supposed to be getting. That organization is the US Air Force, which combines the CLIMAT reports with weather data from its own AWN system and gives the QCed dataset to NCDC.

National Climatic Data Center

DATA DOCUMENTATION FOR

DATA SET 9950 (DSI-9950)

DATSAV2 SURFACE

January 6, 2003

Abstract: DATSAV2 is the official climatological database for surface observations. The database is composed of worldwide surface weather observations from about 10,000 currently active stations, collected and stored from sources such as the US Air Force’s Automated Weather Network (AWN) and the WMO’s Global Telecommunications System (GTS). Most collected observations are decoded at the Air Force Weather Agency (AFWA) formerly known as the Air Force Global Weather Central (AFGWC) at Offutt AFB, Nebraska, and then sent electronically to the USAF Combat Climatology Center (AFCCC), collocated with NCDC in the Federal Climate Complex in Asheville, NC. AFCCC builds the final database through decode, validation, and quality control software. All data are stored in a single ASCII format. The database is used in climatological applications by numerous DoD and civilian customers.

AFCCC sorts the observations into station-date-time order, validates each station number against the Air Weather Service Master Station Catalog (AWSMSC), runs several quality control programs, and then merges and sorts the data further into monthly and yearly station-ordered files. AFCCC then provides the data to the collocated National Climatic Data Center (NCDC).


http://www1.ncdc.noaa.gov/pub/data/documentlibrary/tddoc/td9950.pdf
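
The AFCCC processing chain described above (validate each station against the master catalog, run quality control, sort into station-date-time order, merge into monthly files) is bread-and-butter data plumbing. Here is a minimal sketch of the same shape; the station IDs, values, and QC rule below are all invented for illustration:

```python
# Minimal sketch of an AFCCC-style processing chain: validate
# station IDs against a master catalog, apply a crude QC check,
# sort into station-date-time order, and group into monthly files.
from collections import defaultdict

master_catalog = {"71123", "72406", "91765"}  # stand-in for the AWSMSC

observations = [
    # (station_id, iso_timestamp, temperature_deg_c)
    ("72406", "1993-02-01T12:00", 4.1),
    ("71123", "1993-02-01T06:00", -12.3),
    ("99999", "1993-02-01T06:00", 7.7),   # unknown station: rejected
    ("71123", "1993-02-01T12:00", -9.8),
    ("71123", "1993-02-02T06:00", 99.0),  # fails the crude range QC
]

valid = [ob for ob in observations
         if ob[0] in master_catalog          # catalog validation
         and -90.0 <= ob[2] <= 60.0]         # crude range QC
valid.sort(key=lambda ob: (ob[0], ob[1]))    # station-date-time order

monthly = defaultdict(list)                  # key: (station, "YYYY-MM")
for station, ts, temp in valid:
    monthly[(station, ts[:7])].append((ts, temp))

for key, rows in sorted(monthly.items()):
    print(key, rows)
```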

In 2003, while GHCN was losing stations, the Air Force somehow came up with about 10,000 active reporting stations, and according to NASA the number today is up to 13,000:

Contains worldwide surface observations (synoptic, airways, METAR, synoptic ship) for about 13,000 stations. All weather elements transmitted are retained; in some cases, computed/derived values are incorporated into the record. Also available are “station files” (individual station data sets for selected stations) that have … received more quality control. Elements reported are: Wind direction, Snowfall and snow depth data, Wind speed, Runway data, Barometric pressures, Hail data, Pressure tendency & change, Sunshine data, Dry bulb temperature, ground temperature and conditions, Dew point temperature, Maximum and minimum temperatures, Total sky cover, Ship data, Visibility, Sea surface temperature, Past and present weather, Wave data, Cloud layer data, Swell data, Ceiling, Ship ice reports, Precipitation data.

The DATSAV2 Surface Data is also available from NOAA/NESDIS/NCDC (National Climatic Data Center) in Asheville, NC.

http://gcmd.nasa.gov/records/GCMD_USAFETAC_SFFMG.html

Now we see that this dataset has anywhere from 10,000 to 13,000 active stations, while GHCN has about 1,500. I can tell you straight out that the US military does not operate 8,500 to 11,500 bases around the world, so most of that data is coming over the WMO’s GTS, which NCDC is supposed to receive and archive.

So how is it that the Air Force is able to find and use these stations to build what NCDC calls the official climatological database, but NCDC can’t seem to include them in GHCN, even after the Air Force has handed it the data?

So, as shown: the technological tools are there; Canada has the resources to use them; NCDC is the designated world archive for the world’s temperatures, yet it can’t seem to find active stations beyond the 1,500 in GHCN. Meanwhile, the US Air Force is able to find over 10,000 active stations, update its dataset every three months, and even share that data with NCDC, yet NCDC still can’t get that data into GHCN after most of the work has been done for it.
