Monthly Archives: August 2010

History of NOAA 16

The images that showed the extreme temperatures for Lake Michigan were Advanced Very High Resolution Radiometer (AVHRR) images taken by the AVHRR sensor on the NOAA 16 satellite. Now, these images came from July 4th 2010 and have been dismissed by defenders of NOAA, who say "They say don't use it for climatological studies". There are multiple problems with this defense. The first is that those types of warnings haven't stopped scientists from using inappropriate data before (see the Mann 08 paper and its use of an inappropriate temperature proxy after the scientist who made the proxy said it couldn't be used in temperature reconstructions). The bigger problem, however, is that this satellite has a known history of problems in its AVHRR sensor systems, as seen in public records:

On Friday 19th September 2003 NOAA 16 developed AVHRR synch. problems
similar to those on NOAA 14 (and previously NOAA 15).
Since then reports have been mixed. Some periods of good data, some
poor.

http://homepage.ntlworld.com/phqfh1/status.htm

NOAA-16 started malfunctioning in early 2004, when a scan motor problem caused a ‘barcode’ appearance.

http://www.nass.usda.gov/research/avhrr/avhrrmnu.htm

A known problem exists with the NOAA-16 Version 3 data. Due to a problem with the scan motor, all the channel data are shifted sporadically causing the channels to contain data from another channel; thus, the derived parameters are in error when this channel shift occurs. The dates when the scan motor problem affects the data are list in Table 12. Note however that the scan motor problem does not affect the entire composited image on these dates, but rather there are patches of bad data within the composite.

http://nsidc.org/data/docs/daac/nsidc0066_avhrr_5km.gd.html

DEVELOPMENT OF A GLOBAL QC/QA PROCESSOR FOR OPERATIONAL NOAA 16-18 AND METOP AVHRR SST PRODUCTS

Figure 2 shows time series of global number of observations, mean, standard deviation, skewness, and kurtosis of SST anomalies for NOAA 16-18 and MetOp-A (left panel: day, right panel: night). Each point represents an 8 day period of global statistics. Mean anomalies from N17 and N18 during nighttime are highly consistent, whereas N16 shows anomalous behavior due to sensor problems (Figure 2b).

http://www.eumetsat.int/Home/Main/Publications/Conference_and_Workshop_Proceedings/groups/cps/documents/document/pdf_conf_p50_s5_01_dash_p.pdf

Volume 26, Issue 9 (September 2009)

Effect of Out-of-Band Response in NOAA-16 AVHRR Channel 3b on Top-of-Atmosphere Radiances Calculated with the Community Radiative Transfer Model

This study shows that this discontinuity is caused by the out-of-band response in NOAA-16 AVHRR Ch3b and by using a single layer to the NCEP GFS temperature profiles above 10 hPa for the alpha version of CRTM. The problem has been solved in CRTM v.1.1, which uses one of the six standard atmospheres to fill in the missing data above the top pressure level in the input NCEP GFS data. It is found that, because of the out-of-band response, the NOAA-16 AVHRR Ch3b has sensitivity to atmospheric temperature at high altitudes.

http://journals.ametsoc.org/doi/abs/10.1175/2009JTECHA1259.1?journalCode=atot

Click the link below to see the latest status of NOAA 16 and the dates of all the problems with the satellite:

http://www.oso.noaa.gov/poesstatus/spacecraftStatusSummary.asp?spacecraft=16

Now, certain other places around the planet that claim to be skeptical of science point out in an article today something that the real skeptics pointed out at minimum yesterday: that RSS and UAH do not use NOAA 16 data. With that said, they then try to paper over the fact that scientists studying the Great Lakes use the data from this secondary satellite; that's why it was on a major US university's website and is still used by NOAA.

Now ask yourself these important questions: Why are even local studies being done with data from a KNOWN bad satellite, one giving KNOWN bad data for years? Why isn't data from the NOAA 18 and 19 satellites being used in the MSU archives instead; they are the primary satellites, after all?

Also, we are told that CoastWatch is not used for climate studies. Why, then, does NOAA say on its CoastWatch website that CoastWatch data is used for climate studies?

Climate

The CoastWatch search interface provides access to multiple satellite ocean remote sensing data and products useful for climate studies. In the search panel on the left, a default set of products have been pre-selected. These products will activate after a region has been selected. Modification of the products, sensors and satellites may be selected/highlighted by using “[shift]-click” and dates may be entered manually into the text fields or by using the pop-up calendars.

http://coastwatch.noaa.gov/cwn/search/interface.html?application=Climate

When you go to that page you see a data selector. When you select the Great Lakes region, then the AVHRR sensor, then SST, you get a list of satellites to choose from: NOAA 15-19 and Metop 2. You can still pull up data from the other secondary satellites, such as NOAA 15 and 17, but when you select the NOAA 16 option you get no info, because NOAA pulled that data once this was brought to their attention. As a matter of fact, the ONLY time the NOAA 16 data option even appears is when you select AVHRR and SST.

Here is the metadata from NOAA; it tells you which platforms and sensors were used in one of their global SST datasets:

Sea Surface Temperature (100 KM Global)

Metadata from the NOAA Metadata Manager and Repository (NMMR)

Platform and Instrument Identification:

Mission Name: POES > Polar-orbiting Operational Environmental Satellites
Platform Full Name: National Oceanic & Atmospheric Administration-16
Platform Short Name: NOAA-16
Platform Serial Identifier: N16
Instrument Name: Advanced Very High Resolution Radiometer/3
Instrument Short Name: AVHRR/3

http://www.class.ngdc.noaa.gov/saa/products/nmmr/Classic/SST100

Again, why is a satellite that is considered secondary, and KNOWN for years to be faulty, having its data archived for use for any reason?


Things That Make You Go Hmmm Part 3

In the first two parts we saw that, whether using the older adjustment method from GHCN or the new one in USHCN, the adjustments are the only thing that reversed the findings of the Christy et al 2006 paper for the Hanford and Lemon Cove stations.

https://boballab.wordpress.com/2010/07/31/things-that-make-you-go-hmmm/ 

https://boballab.wordpress.com/2010/08/09/things-that-make-you-go-hmmm-part-2/ 

In this third part I'm going to see whether the same reversal of results shows up in the Tmax and Tmin trends for the Merced and Yosemite stations in the USHCN v2 dataset. You can download the data from the following link:

http://cdiac.ornl.gov/epubs/ndp/ushcn/ushcn_map_interface.html 

  

A couple of things I noticed about the Merced/Yosemite station data: there is no raw data for Yosemite prior to 1911, except for one year, and there is no raw data after 2003. All adjusted data for those years comes from the USHCN Filnet computer routine. Merced isn't much better, with very scarce data in the early raw record, but it does have raw data up to 2009. So to make a true comparison for this pair we have to restrict the data range to 1911 through 2003.

Also take note that both stations have a lot more missing data than the Hanford/Lemon Cove stations in the selected data range. 
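The trend numbers quoted in this series can be reproduced with an ordinary least-squares fit. Here is a minimal sketch, assuming the monthly USHCN values have already been reduced to annual means; the function name and the fake demonstration data are mine, not part of the dataset:

```python
# Minimal sketch: least-squares trend over a restricted year range.
# Assumes the monthly USHCN values are already averaged to annual means.
import numpy as np

def trend_per_period(years, temps, start=1911, end=2003):
    """Return the total change (slope * span) over [start, end],
    ignoring missing values (NaN)."""
    years = np.asarray(years, dtype=float)
    temps = np.asarray(temps, dtype=float)
    mask = (years >= start) & (years <= end) & ~np.isnan(temps)
    slope, intercept = np.polyfit(years[mask], temps[mask], 1)
    return slope * (end - start)

# Demonstration with made-up numbers: a 0.02 F/yr warming plus noise
yrs = np.arange(1911, 2004)
fake = 60 + 0.02 * (yrs - 1911) + np.random.normal(0, 0.5, yrs.size)
print(round(trend_per_period(yrs, fake), 2), "F over 1911-2003")
```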

Figure 1

 

Here in Figure 1 we see that the Merced (Valley) raw Tmax trend is a warming one of about 2.3°F over the time period. After adjustment, the trend becomes a very slight warming of maybe 0.1° to 0.2°F.

For Yosemite (Sierra), the raw Tmax trend is a cooling one of about -0.6°F over the time period. After adjustment, it becomes a very large cooling trend of -1.75°F.

Figure 2

 

Here in Figure 2 we see that the Merced (Valley) raw Tmin trend is a warming one of about 0.5°F. After adjustment, the trend is reduced very slightly, to about a 0.4°F warming.

Yosemite (Sierra) raw Tmin, on the other hand, is an eye opener to say the least. You have a very large negative excursion right at the beginning of the graph, followed by a relatively flat period, followed by what looks like a step jump around 1977. Usually those are signs of an equipment change or station move, which means the metadata on that station deserves a close look. With all that said, the trend is a very large warming one of 7°F. The adjusted trend is still a very strong warming one of about 3.5°F.

So let's see: does this match what Dr. Christy found in his 2006 paper, namely both Valley and Sierra showing a lack of warming during daytime (Tmax), with the Valley showing a warming during nighttime (Tmin) and the Sierra not?

In this pair we saw that the Valley is strongly warming during daytime in the raw data and only slightly in the adjusted, while the Sierra station shows a cooling in the raw data that is increased by adjustment. On the nighttime side, the Valley shows a slight warming trend in both raw and adjusted data. The Sierra station, on the other hand, has a very large warming trend; adjustment cuts it in half, and it is still a very strong warming trend.

Now, this goes against Dr. Christy's findings, but IMHO this pair of stations is a poor pair for testing whether irrigating a desert landscape into farmland causes a temperature change. The reason: if you go into Google Earth yourself and zoom all the way in on the Merced station coordinates given in USHCN, you find yourself smack dab in the middle of the Merced Airport.

 

Yes, there is farmland to the west of the airport, but there is a city just to the east of the airport, and let's not forget the airport itself. To me this doesn't make for a very good station.

Now what about Yosemite? Earlier I pointed out what looked like possible station moves in the record, but that is not why I think it's a bad choice. IMHO it's a bad choice because it's not in the Sierra foothills in desert-like terrain; it is in the Sierras themselves. If you go back and look at the Google Earth image that shows the stations in relative position to each other, you will see a brown strip between the city of Merced and the trees and mountains of the Sierras in Yosemite National Park. That brown strip is the desert foothills of the Sierras, and that is what we are supposed to be comparing to, not the temperatures in the mountains. As a matter of fact, the Yosemite station can be seen very clearly to be sitting up in the mountains at the Park Ranger's HQ. IMHO this station should never have been in the 2006 paper, since it is not in the Central Valley, unless it was when the study was done and moved to its present location afterward (the USHCN station numbers match between this station and the one in the study), but I can't find any evidence of that in the USHCN or GHCN records (no multiple stations in GHCN, no flags in USHCN showing a move).

For more on the Merced Station location: 

http://gallery.surfacestations.org/main.php?g2_itemId=662 

http://gallery.surfacestations.org/main.php?g2_itemId=27876 

For more on the Yosemite Station location: 

http://gallery.surfacestations.org/main.php?g2_itemId=710 

http://gallery.surfacestations.org/main.php?g2_itemId=2443

Lake Michigan Temp July 4th 2010: 489.2°F ?

Major Update about the NOAA 16 satellite and when its problems became known: see the end of the article.

Well, according to NOAA, at least part of Lake Michigan reached that temperature, while other parts had temperatures in excess of 100°, 200°, 300° and 400°F. I saw this a couple of days ago and at first thought it was a hoax being perpetrated to try to discredit skeptics, and I still had a nagging suspicion that was the case when I saw the data myself in Michigan State University's CoastWatch archives (CoastWatch is a cooperative effort between NOAA and MSU). Then yesterday I saw something about all the data for the NOAA-16 satellite going bye-bye (I don't remember where exactly).

Well, today those two little facts got tied together with a nice little bow, along with a big attempt by NOAA to first cover up and then whitewash what is, at a minimum, gross negligence by the agency.

First, to put things in perspective, you should read the 9 Aug article on the website "Climate Change Fraud" (CCF), where the story about the broiling waters of Lake Michigan is found:

In his email the faceless whistleblower explains that what precipitated the scoop was “a rather dubious report in the media that the Great Lakes temperatures have risen 10 to 15 degrees, I found it was downright laughable.” (Just a few examples of media hysteria here and here and here and here)

He continues, “ Prior to this report I would frequent the ‘Coastal Watch’ temperature maps for northern Lake Michigan.  When this report came out it dawned on me that the numbers didn’t match what I had been reading on the Coastal Watch temperature page.”

Under a scheme called ‘Sea Grant’ NOAA collaborates with national universities to compile an official federal temperature record. In this instance, the partnersip is with Michigan University’s ‘Coastal Watch.’

Together the two institutions show temperature maps for northern Lake Michigan registering an absurd 430 degrees Fahrenheit -yes, you read it right –that’s four hundred and thirty degrees-and this is by no means the highest temperature recorded on the charts.

In the heated debate about Earth’s ever-changing climate you certainly don’t need to be scientist to figure out that the Great Lakes would have boiled away at a mere 212 degrees so something has seriously gone awry inside this well-funded program.

http://www.climatechangefraud.com/climate-reports/7479-us-government-in-massive-new-global-warming-scandal-noaa-disgraced

Now, before you go running off to the CoastWatch site to see for yourself, be advised that it has been taken down. You can still see the map, though, because CCF copied it before NOAA could make it disappear, and I reproduce it here:

 Here is the direct link to the map at CCF: http://www.climatechangefraud.com/images/stories/pics3/2010_Jul04_959EDT.gif

How helpful of them to tell you not to use that map for navigational purposes!

The next part of the story came on 11 Aug, when CCF broke the follow-up, and here you see how NOAA operates:

US Government admits satellite temperature readings “degraded.” All data taken offline in shock move. Global warming temperatures may be 10 to 15 degrees too high.

NOAA Whitewash Fails in One Day

NOAA’s Chuck Pistis went into whitewash mode on first hearing the story about the worst affected location, Egg Harbor, set by his instruments onto fast boil. On Tuesday morning Pistis loftily declared, “I looked in the archives and I find no image with that time stamp. Also we don’t typically post completely cloudy images at all, let alone with temperatures. This image appears to be manufactured for someone’s entertainment.”

But later that day Chuck and his calamitous colleagues now with egg on their faces, threw in the towel and owned up to the almighty gaffe. Pistis conceded,

“I just relooked and (sic) the image again AND IT IS in my archive. I do not know why the temperatures were so inaccurate (sic). It appears to have been a malfunction in the satellite. WE have posted thousands if (sic) images since the inauguration of our Coatwatch (sic) service in 1994. I have never seen one like this.”

But the spokesman for the Michigan Sea Grant Extension, a ‘Coastwatch’ partner with NOAA screening the offending data, then confessed that its hastily hidden web pages had, indeed, showed dozens of temperature recordings three or four times higher than seasonal norms. NOAA declined to make any comment as to whether such a glitch could have ramped up the averages for the entire northeastern United States by an average of 10-15 degrees Fahrenheit by going undetected over a longer time scale

That article is here: http://www.climatechangefraud.com/climate-reports/7491-official-satellite-failure-means-decade-of-global-warming-data-doubtful

Well at least going forward we can use this as a learning moment and we will get better satellites and data in the future.

Right? Ummm, no. You see, the federal government is cutting funding for the satellites, causing crucial sensors not to be installed. You can read the article here: http://www.contracostatimes.com/top-stories/ci_15689267?nclick_check=1

Of course the usual blame-Bush angle is tried, but wasn't it the Obama administration that changed NASA's priority from exploring space to a Muslim outreach program:

http://www.youtube.com/watch?v=aUNc9bWu_1I

So what's the upshot?

NOAA, whether through incompetence, negligence, or malice, "cooked" the books (pun intended) on Lake Michigan temperatures, but this raises far wider questions and implications. If the NOAA 16 satellite is "degraded", what about the others, such as the new NOAA 18 satellite and the older NASA Aqua satellite used by UAH and RSS to make the "official" satellite temperature records? What about NOAA 15, which RSS and UAH used prior to NASA's Aqua: is it degraded as well, and should the newer ones be expected to degrade too? Is the older bird not just "degraded" by orbital decay but also in its sensors, and if so, when did it start? Were UAH and RSS told, and if so, how was it handled?

Then you have the question of how many scientific studies in the Peer Reviewed Litichur might now be nothing more than junk science because of NOAA's mismanagement.

Then there is this to consider: the underlying data used by CRU and GISS for their temperature analyses is almost all collected and managed by NOAA through the NCDC. So now ask yourself this question: if NOAA could screw this up so badly, why should we trust the data that they collect and manage to make those surface temperature records?

What a can of worms this is.

Update: In the comments section over at CCF, Luboš Motl (The Reference Frame blog: http://motls.blogspot.com/) pointed out that RSS stopped using the NOAA 16 satellite in 2007:

# Luboš Motl 2010-08-12 06:19

RSS AMSU has kicked out NOAA-16 in 2007 and I guess that UAH AMSU of Christy and Spencer has done the same thing years ago, too.

This was quickly followed up by another commenter who found this info:

Hi Ian! Thanks Lubos! Here is a site that shows the Remote Sensing Systems. www.remss.com/msu/msu_data_description.html For this purpose NOAA-16 was removed as of Feb 2007. The reason was (www.remss.com/data/msu/support/Changes_from_Version%202_1_to_3_0.pdf) “Data from NOAA-16 is no longer used. The data from this instrument appears to be drifting relative to data from the earlier satellites. The cause of this drift has not yet been determined. The drift is as large as several tenths of a degree K per decade, as large or larger than the expected climate signal”.

When you click the link to the RSS PDF, it's there plain as day. This is huge: it shows that a KNOWN malfunctioning satellite was still being used by NOAA for research purposes 3 years after the fact. The malfunction had to have started before Feb 2007; the PDF doesn't say exactly when, but for RSS to "kick" the satellite from its analysis in that month, the drift had to predate it. Now the question is when the drift first became noticeable, and how long before that it started.

 
UPDATE #2: UAH stopped using the NOAA 16 data as of 5 Dec 2006, taking out all NOAA 16 data from Oct 2005 on and recalculating:
Update 5 Dec 2006 *******************************

Data products are still 5.2 and 5.1.  For LT 5.2 and MT 5.1 we have
eliminated the data from NOAA-16 after September 2005 when NOAA-16
began to diverge in a manner that suggested NOAA-16 was having problems.
Thus, the data since Oct 2005 is based on NOAA-15.  The net effect on this
change was to increase post-Oct 2005 temperatures slightly, and thus the
global trend is increased by about 0.01 C/decade.
ftp://ghrc.nsstc.nasa.gov/pub/data/msu/t2lt/readme.13Apr2010

What is interesting are the earlier comments in that file, from before that date, about problems they were seeing. Some were problems they corrected, but others remained. At first they thought the culprit was the older NOAA 15 satellite (launched 1998), not the newer NOAA 16 bird (Feb 2001). Pay attention to how long before this they noticed the problems start:

Update 6 Apr 2006 ****************************

Roy is working on a diurnal adjustment for the AMSU
instruments as they have now drifted over an hour
from their initial crossing time.  NOAA-15 has backed up
from 7:30 to 5:48 and NOAA-16 has drifted forward from 1:54
to 3:10.  Be on the lookout for a new version that will have
these additional adjustments.

There is also a divergence between NOAA-15 and NOAA-16 that
has developed in the last 12 months.  We don’t know if
N15 is spuriously warming or N16 is spuriously cooling. As
soon as this is resolved, we hope to include this correction
in the next version as well.

So between Apr 2005 and Apr 2006 they spotted the problem but initially blamed NOAA 15; from Dec 2006 on they relied on just NOAA 15, until they were able to get data from NASA's Aqua satellite.

 

Monetizing The Debt

Today Fed Chairman Ben Bernanke did what he told Congress the Fed wouldn't do: monetize the debt.

06/05/09 London, England “Either cuts in spending or increases in taxes will be necessary to stabilize the fiscal situation,” said Ben Bernanke in response to a question posed by a member of Congress. Then, he added…

“The Federal Reserve will not monetize the debt.”

That last sentence has a ring to it. It reminds us of Richard Nixon’s “I am not a crook.” Surely, it is destined to make its way into the history books, alongside Bill Clinton’s “I did not have sex with that woman” and the builder of the Titanic’s “even God himself couldn’t sink this ship.”

Monetizing the debt is precisely what the Fed will do. But it will not do so precisely. Instead, it will act clumsily…reluctantly…incompetently…accidentally…and finally, catastrophically.

That’s our prediction, here at The Daily Reckoning. Prove us wrong!

http://dailyreckoning.com/monetizing-debt-the-grandest-of-larcenies/

As you can see, that prediction was made over a year ago, right after the Fed Chairman said the Fed wouldn't monetize the debt. Here is what the Fed did:

In a step that will be one of the markers on the road to economic and financial catastrophe, the Federal Open Market Committee (otherwise known as the FOMC) of the Federal Reserve, made a bombshell policy decision on August 10, 2010, one fraught with dangerous long-term consequences for the American and global economy. In a policy being dubbed QE2, the Federal Reserve’s FOMC conceded that the so-called U.S. economic recovery has “slowed,” and required more stimulus from the Fed. However, with federal funds interest rates now effectively at zero, the only aspect of monetary policy left is money printing. Thus, the Federal Reserve, in effect, will use its printing press to buy long-term U.S. government debt.

What is likely to result from the QE2 phase of the Federal Reserve’s disastrous policymaking? In time, sovereign wealth funds will recognize Bernanke’s manoeuvre for what it is: monetization of the U.S. national debt. When that happens, Treasury auctions will begin to fail, and yields will advance. This will all put added pressure on the Fed to print even more dollars, and monetize an increasing proportion of the federal government’s debt. This will unquestionable inject liquidity into the U.S. economy. But this Federal Reserve monetary injection will be as beneficial as money printing was in Weimar Germany in the early1920s, or Zimbabwe more recently.

http://www.globaleconomiccrisis.com/blog/archives/1153

So it looks like The Daily Reckoning's prediction was spot on.

GHCN v3 Timetable

Way back (even before I started this blog), long, long ago (Dec 23rd 2009), in a reply to an email far away, Dr. Peterson of NCDC told Willis Eschenbach that GHCN would be changing adjustment methods and that the new version should be ready in 2 to 3 months:

Partly in response to this concern, over the course of many years, a team here at NCDC developed a new approach to make homogeneity adjustments that had several advantages over the old approaches. Rather than building reference series it does a complex series of pairwise comparisons. Rather than using an adjustment technique (paper sent) that saw every change as a step function (which as the homogeneity review paper indicates was pretty standard back in the mid-1990s) the new approach can also look at slight trend differences (e.g., those that might be expected to be caused by the growth of a tree to the west of a station increasingly shading the station site in the late afternoon and thereby cooling maximum temperature data). That work was done by Matt Menne, Claude Williams and Russ Vose with papers published this year in the Journal of Climate (homogeneity adjustments) and the Bulletin of the AMS (USHCN version 2 which uses this technique). Everyone here at NCDC is very pleased with their work and the rigor they applied to developing and evaluating it.

They are currently in the process of applying their adjustment procedure to GHCN. Preliminary evaluation appears very, very promising (though of course some very remote stations like St Helena Island (which has a large discontinuity in the middle of its long record due to moving downhill) will not be able to be adjusted using this approach). GHCN is also undergoing a major update with the addition of newly available data. We currently expect to release the new version of GHCN in February or March along with all the processing software and intermediate files which will dramatically increase the transparency of our process and make the job of people like you who evaluate and try to duplicate surface temperature data processing much easier.

You can find the full email here: http://wattsupwiththat.com/2009/12/20/darwin-zero-before-and-after/#comment-272529
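For readers wondering what a "complex series of pairwise comparisons" looks like in practice, here is a toy sketch of the underlying idea only, not the actual Menne-Williams code: the climate signal shared by neighboring stations cancels in a difference series, so a discontinuity at one station stands out as a step.

```python
# Toy illustration of the pairwise idea (NOT the actual NCDC code):
# an inhomogeneity at one station shows up as a step in its
# difference series against a neighbor, since shared signal cancels.
import numpy as np

def step_change_year(years, target, neighbor):
    """Find the split year that minimizes within-segment variance of
    the difference series; return it with the implied step size."""
    diff = np.asarray(target) - np.asarray(neighbor)
    best_year, best_score, best_step = None, np.inf, 0.0
    for i in range(5, len(diff) - 5):       # keep both segments non-trivial
        left, right = diff[:i], diff[i:]
        score = left.var() * len(left) + right.var() * len(right)
        if score < best_score:
            best_year, best_score = years[i], score
            best_step = right.mean() - left.mean()
    return int(best_year), round(float(best_step), 2)

years = np.arange(1950, 2010)
climate = np.random.normal(0, 0.5, years.size)            # shared signal
neighbor = 14.0 + climate + np.random.normal(0, 0.1, years.size)
target = 15.0 + climate + np.random.normal(0, 0.1, years.size) \
         + np.where(years >= 1980, 0.8, 0.0)              # "move" in 1980
print(step_change_year(years, target, neighbor))          # ~ (1980, 0.8)
```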

Well, this of course caused a bit of interest far and wide, with a bunch of us waiting to see the differences between GHCN v2 adjusted and GHCN v3 adjusted, and what new data was added to the files, which in turn would change the GISS output and give us another comparison. However, something must have happened on the way to debuting the files and programs, because in an update to a paper submitted for publication, Dr. Hansen of GISS stated that GHCN v3 will not come out until late 2010:

When GHCN version 3 becomes available, expected in late 2010, we will make results of our analysis available on our website for both versions 2 and 3 for a period that is at least long enough to assess the effect of differences between the two versions. 

  http://data.giss.nasa.gov/gistemp/paper/gistemp2010_draft0803.pdf 

That raises the question: what happened to delay the launch by over half a year? The adjustment process was already in standard use for USHCN v2 at the time of the email, so it shouldn't have been too much trouble to come up with a program that took in GHCN "raw" data instead of USHCN "raw" data. Possible causes of delay range from building a web interface similar to USHCN's, rather than relying on just an FTP site, to getting the program to correctly spit out the intermediate steps, something not provided with USHCN v2. Another, slightly less innocent, reason for delay involves the controversy over how the USHCN v2 adjustment method was devised.

Contrary to what many believe, NOAA and NCDC were paying close attention to Anthony Watts and the Surface Stations Project (http://www.surfacestations.org/). According to Mr. Watts, after some initial contacts about a possible collaboration with NCDC on a paper/project dealing with what he had found, NCDC decided to go it alone. He says NCDC and Dr. Menne used a non-quality-controlled, 43%-complete dataset, one he had made available online for general reference purposes only, to perform an analysis, and from that came up with the USHCN v2 adjustment method. Mr. Watts says he has an almost 90%-complete, quality-controlled dataset that he is working on with another climatologist to produce a journal paper, and that their findings are vastly different from what Dr. Menne reached; he announced as much on his blog Watts Up With That back in January 2010. You can read the article here: http://wattsupwiththat.com/2010/01/27/rumours-of-my-death-have-been-greatly-exaggerated/

So with that announcement made just days before February rolled around, NCDC might have taken a wait-and-see attitude, waiting to see when, or if, that paper makes it into the peer-reviewed lititchur, and whether it really does show their method is incorrect based on the underlying data.

Either way, my guess would be: don't expect GHCN v3 until late October at the earliest.

Your Tax Dollars At Work!

There are many things that can only be accomplished by the federal government, and we the taxpayers must foot the bill for them. The following is not one of them; as a matter of fact, it is downright stupid. What am I talking about? Just the waste of time and money the Substance Abuse & Mental Health Services Administration (part of the Department of Health and Human Services) came up with:

Tips for Talking to Children & Youth About the Oil Spill Disaster – A Guide for Parents and Educators  

http://samhsa.gov/Disaster/docs/OilSpill_TipsforTalkingtoChildrenYouth_GuideforParentsandEducators_508.pdf  

Now this piece of wasted taxpayer money starts out from the premise that all the kids in the US are going to react to the Oil Spill in the Gulf in negative ways:  

 Children and youth may react to the oil spill disaster in many different ways. Some may have reactions very soon; others may do fine for weeks or months and then begin to show troubling behavior. Knowing the signs that are common at different ages can help parents, caregivers, and teachers recognize problems and respond appropriately.  

Notice these are all negative consequences, some sooner, others later, but all negative. Not one single kid will ever have a positive reaction, such as wanting to become a scientist who invents a way to generate power without using oil, or who comes up with other ways to reduce or eliminate oil use after seeing the spill. Nope, not one will think "well, that sucks for the people living there, but now that they are getting things cleaned up they can get back to their normal lives." Nope, all of them, sooner or later, are going to break down because oil spilled into the Gulf. Let's see what the breakdowns are for the various age groups:

Preschool Age   

Children ages 1–5 find it particularly hard to adjust to change and loss. These youngsters have not yet developed their own coping skills, so they depend on parents, family members, and teachers to help them through difficult times. 

  

Very young children may return to an earlier behavioral stage to cope with the stress and loss associated with the oil spill disaster. Preschoolers may resume thumb-sucking or bedwetting, or they may suddenly become afraid of strangers, animals, darkness, or “monsters.” They may cling to a parent or teacher, or become very attached to a place where they feel safe.  

Changes in eating and sleeping habits are also common, as are unexplainable aches and pains. Other symptoms to watch for are disobedience, hyperactivity, speech difficulties, and aggressive or withdrawn behavior.   

Oh Dear God where to start on this mess?  

thumb-sucking? Bedwetting? Afraid of Strangers? Hyperactivity? Darkness and Monsters in the Night?  

HAVE THESE PEOPLE NEVER RAISED A CHILD BEFORE!  

Seriously, how many 1-year-olds have you known that have wet the bed, sucked a thumb, or been afraid of strangers, the dark, or monsters in the closet? Has there ever been a 3-year-old that hasn't been hyperactive? This entire list of so-called "symptoms" is nothing more than run-of-the-mill stuff for kids growing up; this is nothing new. Really, how many 1-year-olds watch the fricking news? Sesame Street, yes; the CBS Evening News with Katie Couric, no (I know, bad example: most people, not just preschoolers, don't watch the CBS Evening News). We actually pay taxes for this drivel!

Early Childhood  

Children ages 5–11 may have some of the same reactions that younger children have. They also may withdraw from playgroups and friends, compete more for the attention of parents, fear going to school, allow school performance to drop, become aggressive, or find it hard to concentrate. These children may return to more childish behaviors, such as asking to be fed or dressed.  

Fear of going to school? Withdraw from playgroups and friends? Compete for more attention from parents? Allow school performance to drop? 

  

Sigh. Again, these are things many of us adults experienced growing up. Why is it the oil spill's fault that little Billy fears going to school, and not the fault of the class bully who beats the shit out of him during recess every day? Why is it the spill's fault that little Billy had a fight with his playmates and is not talking to them (at least for that week)? Why is it the spill's fault that little Billy feels like his little brother or sister is getting more attention from his parents than he is? Why is it the spill's fault that little Billy already understands that 2+2=4 and finds it boring as hell to sit all day in class while the teacher tries to teach that to the future climatologists of the world who believe 2+2=5? Or maybe little Billy is a future climatologist and can't grasp the concept that 2+2=4, not 5?

Adolescence  

Children and youth ages 12–18 are likely to have vague physical complaints when under stress, and they may abandon chores, school work, or other responsibilities that they previously handled. Although some may compete vigorously for attention from parents and teachers, they also may withdraw, resist authority, become disruptive or aggressive at home or in the classroom, or begin to experiment with high-risk behaviors, such as alcohol or drug use.  

Abandon chores, school work, or other responsibilities? Still competing for attention from parents? Resist authority? Become disruptive at home and school? Experiment with drugs and/or alcohol?
 
Now, the other sections were bad, but this section is the worst for one simple reason: have these people never heard the term Teenage Rebellion? Since it seems they haven't, I'll help them out:
 Temple University psychologist Laurence Steinberg suggests that “competing systems within the brain make adolescents more susceptible to engaging in risky or dangerous behavior.”[1] He argues that social programs and measures discouraging youth from taking part in risky behavior (such as drug and alcohol abuse, reckless driving, and unsafe sex) have been largely ineffective.


Hmm, so teenagers were doing risky things before the oil spill; tell me again why their doing risky things after the oil spill is caused by the spill? And of course no teen ever refused to clean his room, or quit his paper route, or defied his parents, or got into fights with his/her siblings. Nope, they were all perfect angels until the oil spill came along!
 
So what is the government's answer to all this?
Well, talk to the kids and restrict their exposure to the news (seriously, as a kid, how many times did you say to your friends, "sorry, have to rush home, the news is on"?), and if that doesn't work, run to the nearest shrink to get a bunch of pills.
 
Now, that was bad enough, but they were not content to just waste your tax dollars on that; they also needed to make a separate PDF in case Joe or Jane Six-Pack living in Kansas couldn't cope:

http://samhsa.gov/Disaster/docs/OilSpill_TipsforCoping_ManagingYourStress_508.pdf 

 
Keep in mind these documents weren't made just for the people down at the Gulf, but for the general public at large:
Tips for Parents and Teachers
 
Tips for Talking to Children & Youth About the Oil Spill Disaster – A Guide for Parents and Educators

Tips for Talking to Children & Youth About the Oil Spill Disaster – Interventions at Home for Preschoolers to Adolescents  

  

Tips for the General Public

Tips for Coping with the Oil Spill Disaster – Managing Your Stress  

Tips for Dealing with Grief Due to the Oil Spill Disaster  

Now, down at the bottom of the page, they finally have something specifically for someone affected by the spill:
  
 You can find the page here: 

http://samhsa.gov/Disaster/traumaticevents.aspx  


Tips for Emergency Response Workers

Tips for Oil Spill Disaster Response Workers – Possible Signs of Alcohol and Substance Abuse  

Tips for Oil Spill Disaster Response Workers – Managing and Preventing Stress for Managers and Workers  

What a colossal waste of time and taxpayer money.  

Climate Change and Power Generation Research

I came across a short blog post by Dr. Roger Pielke Jr. concerning journalists that can’t seem to help themselves when it comes to natural disasters.

http://rogerpielkejr.blogspot.com/2010/08/catastrophe-catnip.html

The example he shows is from a reporter at Time magazine trying to tie the floods in Pakistan to "Global Warming" (or is it "Climate Change" this week?). This intrepid reporter appeals to the authority of the IPCC, claiming floods are going to get worse due to rising CO2 emissions. Dr. Pielke pointed out that the IPCC actually said very little about floods like those we are seeing in Pakistan. This is nothing new with journalists; they even try to turn severe cold-weather events into something caused by CO2-induced "Global Warming", and this typically gets skeptics' dander up. However, that isn't what interests me; it's the second part that Dr. Pielke highlights in his piece: the call for low-carbon emissions.

Now, what is one of the biggest sources of CO2 emissions? Power generating stations, such as coal-fired power plants.

So from there the Green brigades tied research into new energy-generating technologies to "Global Warming", and that is where things went off the rails, IMO. If you go back and look at how energy technology changed over time, you will see one constant theme: new technologies are only adopted when the older one becomes uneconomical. From that point you can see why the Greens are pushing what they are: implement a Cap and Tax scheme and you push up the price of the older technology of coal-fired power generation, while at the same time giving subsidies to the newer (but not actually that new) technologies of "alternative" or "renewable" power generation.

So you can follow the chain of logic: the world is ending due to rising temps caused by human CO2 emissions, so to fix that, change power generation from the carbon-based system to zero- or low-carbon systems. From there, governments went out and basically picked winners and losers in the research funding sweepstakes. In France the winner has been nuclear power, which would give most Greens in Germany and the US a heart attack, so it's a loser in those countries. The two big winners in most countries, via big subsidies for their use, were wind and solar power, neither of which is really that new. This is where the problems started popping up. Wind is very unreliable as a base-load power generating source: besides not generating the needed power when there is not enough wind, turbines can catch fire or explode if the wind is too strong and the brakes fail. You can see this very clearly on YouTube, no matter how much the greens want to say it isn't true:

http://www.youtube.com/watch?v=7nSB1SdVHqQ

http://www.youtube.com/watch?v=rkGXoE3RFZ8

http://www.youtube.com/watch?v=MOfHxINzGeo

http://www.youtube.com/watch?v=ppLh5pGX3qQ&feature=related

Another problem with wind is the land area needed for wind farms relative to the power generated. Here is what I'm talking about: a 2004 report put New York City's 2003 peak power demand at 11,020 MW and estimated that by 2008 it would rise by 3,780 MW to 14,800 MW.

http://www.nyc.gov/html/om/pdf/energy_task_force.pdf

In 2003, to supply that power with wind turbines, it would have taken 14 wind farms the size of the Roscoe wind farm in Texas, the world's largest. That one wind farm has 627 wind turbines, so for 14 farms the total rises to 8,778. Now let's say the report was right and 2008 needed 14,800 MW: the number of farms increases to 19, with a total of 11,913 turbines. How much ground would be taken up to supply that power in 2008? According to the wiki article linked below, the Roscoe farm covers nearly 100,000 acres, so for New York City alone you are talking about 1,900,000 acres, or 2,969 sq miles, of ground. To put that in perspective, you would need to take the entire state of Delaware, empty it of every person and building, and fill it with wind turbines, and you would still be about 500 sq miles short of space; and this is just to power New York City alone, not the entire state of New York.

http://en.wikipedia.org/wiki/Roscoe_Wind_Farm
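As a sanity check, here are those wind numbers as a short calculation. The 781.5 MW Roscoe nameplate capacity is my assumption from the Wikipedia article; the post itself only gives the turbine count and acreage:

```python
# Back-of-the-envelope check of the wind-farm numbers above, using
# Roscoe Wind Farm as the yardstick.
ROSCOE_TURBINES = 627
ROSCOE_ACRES = 100_000
ROSCOE_MW = 781.5          # assumed nameplate capacity (Wikipedia)
ACRES_PER_SQ_MI = 640

def wind_footprint(demand_mw):
    farms = round(demand_mw / ROSCOE_MW)   # nearest whole farm, as the post does
    turbines = farms * ROSCOE_TURBINES
    sq_mi = farms * ROSCOE_ACRES / ACRES_PER_SQ_MI
    return farms, turbines, sq_mi

for demand in (11_020, 14_800):            # NYC peak: 2003 actual, 2008 estimate
    farms, turbines, sq_mi = wind_footprint(demand)
    print(f"{demand:,} MW -> {farms} farms, {turbines:,} turbines, {sq_mi:,.0f} sq mi")
# 11,020 MW -> 14 farms, 8,778 turbines, 2,188 sq mi
# 14,800 MW -> 19 farms, 11,913 turbines, 2,969 sq mi
```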

Even without mentioning the unsustainability of the government subsidies that make wind an economically feasible alternative to carbon-based generation, the area needed is simply not practical for meeting our present or future energy needs, so wind is not a viable replacement for coal.

Solar faces the same problem of area versus usable power generated: the world's largest solar farm takes up 247 acres to produce just 20 MW of power.

http://www.treehugger.com/files/2008/02/powering_20000.php

So let's give it the New York City treatment: for a need of 14,800 MW you would need 740 of these solar farms, covering an area of 182,780 acres, or 286 sq miles. To put that in perspective, the area of the city of New York is just about 469 sq miles, so you would need over half the size of the city. Not as bad as wind, but again not very practical, especially when you take into account that solar farms only work when the sun is shining, and the storage batteries needed are neither cheap nor environmentally friendly to make, dispose of, or recycle. Again, we haven't even counted the subsidies that make this a "viable" economic alternative to coal.
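The same arithmetic for solar, scaled from the 247-acre, 20 MW farm above (the 469 sq mi NYC figure is from the text):

```python
# Scaling the world's largest solar farm (247 acres, 20 MW) to NYC demand.
SOLAR_ACRES = 247
SOLAR_MW = 20
ACRES_PER_SQ_MI = 640
NYC_SQ_MI = 469

demand_mw = 14_800
farms = demand_mw // SOLAR_MW                    # 740 farms
acres = farms * SOLAR_ACRES                      # 182,780 acres
sq_mi = acres / ACRES_PER_SQ_MI                  # ~286 sq mi
print(farms, acres, f"{sq_mi:.0f} sq mi = {sq_mi / NYC_SQ_MI:.0%} of NYC")
```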

So where does that leave the US citizen, when the government, with the backing of the greens, has sunk our research dollars into non-viable alternatives?

If you try to change the research funding so a viable alternative can be found, you get the greens, eco-loonies, and the far left wing of the Dems out in the street crying about "global warming" and killing the planet (never mind how many will die if you go with the crazy idea that wind and solar can replace coal). This pressure causes the politicians, not all but enough, to keep on this insane course.

You see, finding new sources of energy is needed, not because of "Global Warming", but because eventually coal will become uneconomical as a fuel source. As an example, go back to when the majority of the world got most of its heating and cooking energy from burning wood. As trees were used up (this is prior to the idea of loggers replanting the forests they cut down) to build homes, ships, and furniture, as well as to heat homes in winter and cook food year round, the price of wood increased. England at one point was importing wood from colonial America because it was that short of trees. Now, here is the thing: coal was known to be a fuel source well before this point, but in places like North America and most of Europe it was a lot more expensive to mine coal than it was to just go out and chop down the trees. When the trees got scarce, the economics changed and coal became viable; notice this didn't happen through government picking winners and losers. Then, when the industrial revolution hit, the ways coal was used changed to get more bang for the buck, again without governments getting involved. Governments didn't suddenly pour money into mining or power generation research; that was done by private companies and individuals, and that is the way things stayed until the 1950s, when government got the nuclear bug and sank fortunes into researching it, not just to build bombs but for power generation.

Then another problem cropped up: man-made "Global Warming" is not that big of a deal. The physics have not changed in over 20 years; you still only get about a 1°C rise out of a doubling of CO2. The hypothesized "positive water feedback" has never been found in that entire 20-year period; as a matter of fact, observations are finding that the atmosphere is actually not that sensitive to CO2. So the reason the greens put forth for needing new power generation technologies flew out the window, taking with it the government's justification for picking winners and losers.

These steps are what I think need to happen to move ahead and find new practical as well as economical ways to generate power:

  1. Power generation research needs to get that anchor of "Global Warming" from around its neck. These are two different issues, and by tying one to the other, both will sink when CAGW due to human-emitted CO2 goes down the tubes as nonphysical.
  2. Governments need to butt out of research. They are doing far more harm than good; the market really will fix this problem on its own. Besides the coal example I gave, look at the logging industry: loggers saw that by not replanting cleared areas they would eventually run out of trees, followed shortly by being out of business, so the industry changed to meet economic reality. The role government can play is making sure what comes out of the research is safe, and if it really wants to help money-wise, stop the direct grants. Instead, don't tax private enterprise on any money it spends on power generation research, and let companies give grants to universities and hire the bright new scientists graduating from them. This will ensure that what they come up with really will be economically viable.

So when the economics and practicality of the technology cause us to shift away from carbon-based power generation, you will get your de-carbonizing without government interference. The cry that we need to de-carbonize to save the planet is patently untrue, and all it does is divert funds (still searching for that elusive feedback and the hotspot, you know) from needed research into viable technology, thus prolonging the carbonization of the world's energy production. But then again, the greens' position was never really about saving the planet, or getting clean, affordable, abundant power to the poor of the world. It was always about their socialist political goals.

Things That Make You Go Hmmm Part 2

OK, the last time I looked at the data I found that the adjustments made by NCDC to the Hanford and Lemon Cove Tmax and Tmin records made those stations look like they do not support the conclusions of Dr. Christy's 2006 paper, whereas in the "raw" data you see a huge warming trend in the Hanford Tmin, a slight warming trend in the Lemon Cove Tmin, and cooling trends in Tmax at both stations. Those trends go against what the computer models say should happen; as a matter of fact, they are reversed. According to the models, Lemon Cove's Tmin should have a large warming trend because it is in the mountains, and Hanford's should not because it is in a valley. This was shown in the Snyder 2002 paper. So you can imagine my surprise that the new adjustment method NCDC came up with in 2009 for USHCN adjusts the data in such a way that it suddenly matches what the models said back in 2002. The adjustments were not small either, not when you are dealing with an almost 3°F swing, from a huge warming trend in the Valley station's Tmin down to a very small one of 0.45°F.

https://boballab.wordpress.com/2010/07/31/things-that-make-you-go-hmmm/ 

Now, I was going to start looking at the USHCN records for the Merced/Yosemite pair, but I remembered something about overlap between USHCN and GHCN. Basically, what it boils down to is that some USHCN stations are also among the GHCN stations. What makes that important is that GHCN still uses the old adjustment method that USHCN used to use, so if the Hanford/Lemon Cove pair is among the GHCN stations, we can compare the old adjustment method to see if it gives different results: i.e., does it agree with Dr. Christy's paper? If not, how much difference is there?

So I went and checked my copy of the GHCN max and min files (Dec 2009 is the one I have unzipped atm), and yes, Hanford and Lemon Cove are in there. So it's time to put them into the spreadsheet and see what we've got.

NOTE: GHCN records its data in Celsius whereas USHCN is in Fahrenheit, so the temperature anomalies in the last post will not scale with the ones in this one. However, the overall trends are not affected: a warming trend is still a warming trend in either scale, and likewise for cooling.

In Figure 1 we have the Tmax numbers for Hanford. You will notice that the data only goes up to 2005; that is because that is all there is in the GHCN file I got. Also note that I converted the USHCN adjusted Tmax data from Fahrenheit to Celsius and added it in.
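One subtlety worth making explicit: absolute temperatures need the 32° offset, but anomalies and trends are differences, so the offset cancels and only the 5/9 scale factor applies. A minimal sketch:

```python
# Converting USHCN Fahrenheit series for comparison with GHCN Celsius.
# Absolute temperatures need the 32-degree offset; anomalies and trends
# are differences, so the offset cancels and only 5/9 remains.
def f_to_c(temp_f):
    return (temp_f - 32.0) * 5.0 / 9.0

def f_to_c_anomaly(delta_f):
    return delta_f * 5.0 / 9.0

print(f_to_c(68.0))            # 20.0 C
print(f_to_c_anomaly(0.45))    # 0.25 C (e.g. the 0.45 F Tmin trend above)
```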

Figure 1

 

Here you see the GHCN raw trend is almost -1.5°C over the 105-year period. The GHCN adjusted trend is down to -0.25°C, while the USHCN adjusted trend is about 0.3°C of warming. So while the old adjustment method (First Difference) does lower the cooling trend, it doesn't flip it to a warming trend.

Figure 2

 

Now in Figure 2 we see the Tmax for Lemon Cove, with a raw trend of about -1°C over the time period. The GHCN adjusted trend is almost +1.25°C, and the USHCN adjusted trend is just a little over +0.25°C. Both adjustment methods change a cooling trend into a warming trend, but the older method still used in GHCN is the more egregious here, with an almost 2.25°C swing, where the newer method in USHCN reduces that swing to just a 1.25°C change (notice that both stations, with both adjustment methods, produce swings greater than the IPCC-backed warming trend of 0.7° to 0.8°C per century).

Now we move on to Tmin for both stations, starting with the Hanford Tmin in Figure 3      

Figure 3

 

Here for the raw data we get a trend of 2°C over the time period. The GHCN adjusted trend is about 2.25°C, while the USHCN adjusted trend is not even 0.5°C, closer to a 0.4°C trend; that is a huge adjustment from the raw.

So to recap the numbers for Hanford: 

        Tmax Trend   Tmin Trend   Tmean Trend [Tmean = (Tmax+Tmin)/2]
Raw     -1.5°C       +2.0°C       +0.25°C
GHCN    -0.25°C      +2.25°C      +1.0°C
USHCN   +0.3°C       +0.4°C       +0.35°C
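Since Tmean = (Tmax + Tmin)/2 is linear, the Tmean trend column is just the average of the other two trends; a quick check with the table's numbers:

```python
# Tmean trend follows directly from the Tmax and Tmin trends,
# since Tmean = (Tmax + Tmin) / 2 is linear.
hanford = {"Raw": (-1.5, 2.0), "GHCN": (-0.25, 2.25), "USHCN": (0.3, 0.4)}
for source, (tmax, tmin) in hanford.items():
    print(f"{source}: Tmean trend = {(tmax + tmin) / 2:+.2f} C")
# Raw: +0.25, GHCN: +1.00, USHCN: +0.35
```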

So according to the numbers, the Tmean trend should be lowest for the raw data, with a strong daytime cooling trend and a strong nighttime warming trend. For GHCN, a warming trend about 0.2° to 0.3°C above the global average, with a small daytime cooling trend and a strong nighttime warming trend. For USHCN, a small warming trend, with small daytime and nighttime trends. Figure 4 shows that my eyeball estimates are very close:

Figure 4

 

Now in Figure 5 we will look at the Lemon Cove Tmin:      

Figure 5

 

For the raw Tmin trend we get about 0.5°C of warming. For GHCN adjusted we see a slightly higher trend of about 0.6° to 0.65°C of warming. For the USHCN adjusted we get a trend of about 1.2°C of warming, which increases the trend by almost 0.75°C over the raw trend and almost 0.6°C over the GHCN adjusted trend.

So a recap of the Lemon Cove numbers: 

        Tmax Trend   Tmin Trend   Tmean Trend [Tmean = (Tmax+Tmin)/2]
Raw     -1.0°C       +0.5°C       -0.25°C
GHCN    +1.25°C      +0.6°C       +0.93°C
USHCN   +0.25°C      +1.2°C       +0.73°C

As we see, there is a slight cooling trend in the Lemon Cove Tmean by the raw numbers, with a strong cooling trend for daytime highs and a moderate warming trend for nighttime lows. GHCN adjusted shows a strong overall warming trend, with a strong warming trend for daytime temperatures (an over 2°C change from the raw numbers) and a moderate warming trend for nighttime lows. USHCN also has a strong warming trend, very close to the adjusted global average, with a slight warming trend for daytime highs and a strong warming trend for nighttime lows.

Figure 6

 

The GHCN and USHCN trends in the graph, made from the v2 mean_adj file and the USHCN data, appear to match my rough numbers in the table above almost exactly; however, the v2 mean file trend appears a little smaller than the number derived from the rough numbers in the table, only about a 0.1° to 0.15°C cooling trend.

Conclusion:      

For Hanford (Valley), the GHCN adjusted trends are close to the results of the Christy et al paper, in that the Tmin warming Dr. Christy pointed out shows up, along with a reduced cooling trend in Tmax. For Lemon Cove (Sierra), however, GHCN is even worse than USHCN for Tmax: Christy et al 2006 found no warming trend in either Tmax or Tmin for Sierra stations, but GHCN has an even higher warming trend than USHCN for Tmax, while making almost no adjustment to the Tmin trend from raw.

So once again, the adjustments made to the dataset by NCDC are what cause the divergence from the Christy et al findings of no appreciable warming trend in either Valley or Sierra Tmax (both show a cooling in the paper), with a clear warming trend in Valley Tmin but no appreciable warming trend in Sierra Tmin.

Now it's off to look at the other pair and see if things work out the same. If they do, there are 3 stations in Costa Rica I want to revisit, to see if they are in the GHCN Tmax and Tmin files and whether the Valley/Mountain theme was just a Central Valley thing or also applies to the Beach/Mountain pairing in Costa Rica.

The MWP Is Important, But Maybe Not For What You Think

In the raging debate about CACC (Catastrophic Anthropogenic Climate Change), formerly known as CAGW (Catastrophic Anthropogenic Global Warming), you keep hearing about something called the Medieval Warm Period (MWP). Now, one side of the debate says it's important and the other says it isn't, and IMHO both sides are right and both are wrong.

Now that I got you confused because you are scratching your head and saying “They can’t be both right and wrong”, I’ll explain how that is a true statement.

Skeptics say the MWP is important. The reason they typically give is that it shows the late 20th century warming is not unprecedented. The reason they are wrong is that you don't need the MWP to prove that: the earlier Roman and Holocene Optimum warm periods show the 20th century warming is not unprecedented, and so do the previous interglacials. So you don't need the MWP for that, which is where the Alarmists get it right.

Now, where the Alarmists are wrong and the Skeptics are right is that the MWP's importance is not strictly about how high the temperatures were during that time. It has been shown in numerous studies that there was a MWP, and even Dr. Keith Briffa of the CRU believes there was one. What makes it important is not how high the temps rose, but that they rose at all, combined with the relationship between temperature and CO2. If you remember the famous graph in Al Gore's fiction film "An Inconvenient Truth", you saw one curve following the other; one of the things he got wrong is that he had them backwards. In the film he showed temperature rising after an increase in CO2, when in fact the opposite occurs. It should be well known by now that an increase in temperature gets a corresponding increase in CO2 some 800 to 1000 years later. So if temperatures rose during the MWP, then 800 to 1000 years later there should have been a corresponding natural rise in the CO2 level of the atmosphere. So, first things first: when did the MWP occur?

Well according to NOAA the MWP occurred between the 9th and 13th centuries:

Medieval Warm Period – 9th to 13th Centuries

Norse seafaring and colonization around the North Atlantic at the end of the 9th century indicated that regional North Atlantic climate was warmer during medieval times than during the cooler “Little Ice Age” of the 15th – 19th centuries. As paleoclimatic records have become more numerous, it has become apparent that “Medieval Warm Period” or “Medieval Optimum” temperatures were warmer over the Northern Hemisphere than during the subsequent “Little Ice Age”, and also comparable to temperatures during the early 20th century. The regional patterns and the magnitude of this warmth remain an area of active research because the data become sparse going back in time prior to the last four centuries.

http://www.ncdc.noaa.gov/paleo/globalwarming/medieval.html

So that means the MWP occurred between the years 801 AD and 1300 AD. NOAA also states that it started towards the end of the 9th century, which puts the onset somewhere between 850 AD and 900 AD. Now add the 800-to-1,000-year lag for the corresponding rise in CO2 and what years do you get? Somewhere between 1650 AD and 1900 AD before you see a rise. These are rough estimates from the range of the MWP onset.

So let’s take a look at what the US Government says the historic CO2 levels were during that time period:

This graph was taken from a US government report here: http://clinton4.nara.gov/Initiatives/Climate/greenhouse.html

Now look at that: according to this graph, the CO2 started rising around the year 1860 AD, and what do you know, that corresponds to the lag you get from the MWP (an 1860 rise matches MWP temperatures from 860 AD to 1060 AD). Following that through, the natural rise in CO2 driven by the MWP warming could end anywhere between 2001 AD and 2300 AD. To get the early bound, subtract the latest matching start year from the earliest MWP end year, 1201 – 1060 = 141, then add that to the 1860 start of the CO2 rise = 2001. For the late bound, subtract the earliest matching start year from the latest MWP end year, 1300 – 860 = 440, then add that to 1860 = 2300.
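
To make that arithmetic easy to check, here is the same calculation laid out as a small Python sketch. Nothing in it is new data; the bounds are the ones used above (NOAA’s 9th-to-13th-century MWP, an 800-to-1,000-year lag, and a CO2 rise starting around 1860 AD), so you can swap in different assumptions and recompute the window.

# Bounds used above: the MWP ends somewhere in the 13th century
# (1201-1300 AD), CO2 lags temperature by roughly 800-1000 years,
# and the observed CO2 rise begins around 1860 AD.
mwp_end_earliest, mwp_end_latest = 1201, 1300
lag_min, lag_max = 800, 1000
co2_rise_start = 1860

# Subtracting the lag from 1860 gives the MWP years whose warming
# would show up as the 1860 CO2 rise: 860 AD to 1060 AD.
mwp_match_earliest = co2_rise_start - lag_max  # 860
mwp_match_latest = co2_rise_start - lag_min    # 1060

# The shortest and longest remaining stretch of the MWP, added to 1860,
# bound when the natural CO2 rise could end: 2001 AD to 2300 AD.
end_earliest = co2_rise_start + (mwp_end_earliest - mwp_match_latest)
end_latest = co2_rise_start + (mwp_end_latest - mwp_match_earliest)
print(end_earliest, end_latest)  # 2001 2300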

So what does that mean? It means this:

Yes, humans are pumping out CO2, but unlike what the Alarmists want you to believe, the rise in CO2 levels is not all due to man; part of it is the natural rise you get from the MWP. How much of a rise, I haven’t calculated myself or seen calculated elsewhere, but there will be a natural component, and it might even be large enough that we could continue emitting CO2 and still see CO2 levels drop in the coming years. Maybe that is why the UN and the Alarmists wanted the Copenhagen accords signed so badly: if CO2 levels start dropping on their own, without some overarching, UN-made global-governance mandate for man-made emission reductions, there goes the whole CAGW theory out the window. And if the warming during the 20th century was natural and not caused by man-made CO2, you could see temps still rising even while CO2 levels fall, without human emission reductions. That is why the MWP is important: it could explain the majority of the CO2 rise during the 20th century without needing a correlation with the 20th century temperature rise.

So until someone does a study on how much of a bump the MWP could cause in CO2, what I just posted might come to pass. Of course, I might have missed a study that says the effect is small and won’t play any major role; if anyone knows of one, post it in a comment so I can read it.

Democrats’ Crocodile Tears

Nancy Pelosi and the Democrats are calling Congressional members back to the swamp in DC to pass a bill that will, as they put it, give money to state and local municipalities to pay for teachers, cops and Medicaid. Now they claim they will pay for this by using some unspent stimulus money, but also by closing some loopholes in the tax code so that those nasty multinational corporations have to pay their fair share.

The Republicans and Tea Party groups have called the Dems out on this as job-killing in the middle of a recession. In turn, Nancy Pelosi and the Dems shed crocodile tears about how the Republicans can possibly be against the cops and teachers who will lose their jobs if Congress doesn’t pass this bill, all the while stuffing teacher-union donations into their campaign coffers.

Now in the Bizarro world that the Dems inhabit, that is the end of the story, because they don’t believe in how the real world works.

They don’t believe that every time they push corporate taxes up, the corporations make up the money lost to the government by doing two things: raising their prices and cutting jobs or not hiring. They haven’t grasped the simple idea that a business is in business to make money, nothing else. Not to provide jobs, not to provide benefits, and most assuredly not to pay taxes to the US Government. Businesses work on a very old and simple principle: money earned – money spent = profit = successful business. If for whatever reason the equation goes the other way, money earned – money spent = loss, then you have an unsuccessful business and eventually it goes bankrupt. Of course the Dems have an answer for that too: bailout/government takeover. However, businesses and corporations don’t operate too well on that basis.

So what does that mean to you, the average Joe and Jane sitting at home? Plenty, for you are the one that is going to take it on the chin, not the “evil” multinationals, nor the teachers’ union, nor the bankrupt municipalities with their morally bankrupt politicians (see Bell, California, for the poster child).

Let’s say the Dems get what they want and it passes: the loopholes are closed and the multinationals pay more in taxes. Well, guess what? They are not going to eat that loss; they spread the pain. As an example, one of those “evil” multinational companies is Walmart. Whether you like them or not, they are the single largest PRIVATE employer in the US, and two things will happen. One, the low prices for those who shop there will go up, especially on products made by other “evil” multinationals like Coke, Pepsi, Unilever and Procter & Gamble, because they will charge Walmart more. Two, Walmart will cut back on employment; you know, the minimum-wage earners, the poor that the Dems say they are trying to help. So there you go: people who needed that paycheck are now out of work and prices go up, causing spending to go down. This will apply to all stores, be they Target, your local supermarket, or even wholesale clubs like Costco. Need a new TV? Sorry, Best Buy has to raise its prices too, since Sony, Samsung, LG and the others are “multinationals”. Are you catching the drift? Yep, it’s the dreaded Trickle Down Economics, except where Reagan cut taxes to give you more money to spend, Pelosi is going to make sure what you do have doesn’t go as far.

Now this has a follow-on effect: because more people are losing jobs, unemployment goes up, which means more unemployment spending bills Congress has to pass. Thus the only thing this did was change where these people get their pittance: from Walmart, where they got health insurance, to the government and the public paying for their health coverage. Other companies will do the same thing, the bloated government gets more bloated and more powerful, and it goes deeper into debt as those people drop into the “no longer paying tax” category. That’s right: instead of decreasing the debt as the Dems claim, it will have the opposite effect. People need to engrave this on their foreheads: an increase in the tax rate does not equal an increase in tax revenue. This was something the Dems knew at one time, back in the early 1960s when JFK was President and argued that cutting tax rates would raise tax revenue. I know, you thought trickle-down economics was a Republican idea; the Dems kind of gloss that over when they bring him up. Funny that they like to point to him but never talk much about his policies. Maybe that’s because his policies make him sound like Ronald Reagan?

I know what you’re saying: “Well, at least the cops are still there because of this and the kiddies still have teachers.”

Here is the truth on that: unless the Feds do a complete state bailout, state and local governments are going to have to tighten their belts, period. For those who saw the Glenn Beck program a couple of days ago this will be familiar; for those who didn’t, it will be instructive. Let’s look at two municipalities near where I live that are in the hole spending-wise. They plan to cut cops, but Philadelphia is still handing out a million dollars for “street murals”, and Baltimore, while cutting cops, is still spending a boatload on……wait……wait…..an opera house. Yep, there you go folks: these cities are so far in debt they have to lay off cops, but they can’t stop funding graffiti artists and building an opera house. Now everyone knows that every lower-middle-class and low-income family just can’t wait to pay huge sums of money to see shows at the new opera house. Yep, can’t cut that from the spending; nope, got to cut the cops and teachers if the Feds won’t give us more money. Remember, opera houses are a need and cops and teachers are a want.

That’s why it’s crocodile tears the Dems are shedding for the cops and teachers: instead of cutting unnecessary things like opera houses and paying for “street murals”, they want to cut the basic functions government is called on to do, just so they can make political hay when the Republicans oppose them. This is just another tax-and-spend scheme, nothing new from the progressive wing of the Democratic Party. As a matter of fact, they haven’t had a new idea since FDR, whose progressive programs prolonged the Great Depression by at least eight years.

Unlike Captain Smith, who didn’t see the iceberg that sank the Titanic, Pelosi, Reid and Obama not only see the iceberg, they are steering right for it and telling you to stay the course and not elect Republicans or Conservatives. Yep, keep on that course right into the iceberg and watch the ship of state sink right after it.