Saturday, December 05, 2009

The Carbon Dioxide Fallacy

And the Man-Made Global Warming -- I mean Climate Change -- hoax continues to unravel:

We are told, based on computer models, that human beings, by burning fossil fuels -- and by exhaling -- are increasing the amount of carbon dioxide (CO2) in the atmosphere. This added CO2, in turn, is said to trap heat, which is held responsible for the modest temperature increase between 1976 and 1998. The conclusion is that we must alter our entire lifestyles to avoid a planetary catastrophe.
For the computer models to be accurate, their inputs must include every factor that can affect climate. Knowing this -- and believing it likely that the majority of the factors that do affect climate are still unknown -- how can we trust the models?
Especially when we find out the inputs were "fudged" if not outright fabricated.

And yet we who doubt the agenda-driven alarmism are said to be "anti-science." Ha! The real "anti-scientists" -- and sinister liars -- are those pushing the man-made global warming hoax: those out for more research grant money at best, crypto-totalitarians looking to take away our mobility, and therefore our freedom, at worst, and those looking for another excuse to tax us in between.

Gee, it sure is "science" when the "scientists" have been caught red-handed adjusting -- or fabricating -- data to fit the "computer models." If the data doesn't fit the models, make it fit! Science!

Wow, what happened to empirical proof and causal links?

The earth's orbital cycles, sunspots, natural non-industrial sources of carbon dioxide, and volcanic eruptions could not be reached for comment.

It is also important to understand what a measure of "global warming" in degrees Fahrenheit or degrees Celsius/Centigrade *really* means. Remember, zero degrees Fahrenheit and zero degrees Celsius/Centigrade are man-made conventions, not physical zeros. To state a percent temperature change relative to either is meaningless.

Temperature is a measure of thermal energy. At absolute zero (about -460 degrees F, or -273 degrees C) there is no thermal energy. So any thermodynamically meaningful measure would state a percentage change relative to absolute zero -- that is, in kelvins -- rather than against an arbitrary Fahrenheit or Celsius/Centigrade zero point.
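To make that arithmetic concrete, here is a minimal sketch of the conversion described above. The 15 C baseline and 0.6 C rise are assumed round figures chosen purely for illustration, not data from this post:

```python
# Illustrative only: the temperatures below are hypothetical round figures.

ABS_ZERO_C = -273.15  # absolute zero expressed in degrees Celsius

def celsius_to_kelvin(t_c):
    """Convert a Celsius temperature to kelvins (a true-zero scale)."""
    return t_c - ABS_ZERO_C

def percent_change_kelvin(t1_c, t2_c):
    """Percent change in absolute (kelvin) temperature between two Celsius readings."""
    k1 = celsius_to_kelvin(t1_c)
    k2 = celsius_to_kelvin(t2_c)
    return 100.0 * (k2 - k1) / k1

# A hypothetical 0.6 C rise on a roughly 15 C global mean:
print(round(percent_change_kelvin(15.0, 15.6), 2))  # ~0.21 (percent)
```

Stated this way, a fraction-of-a-degree Celsius change works out to roughly a fifth of one percent of the absolute temperature.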

Why is this distinction important? Because variations in the sun's radiation reaching the earth are measured against a true zero (usually in watts per square meter). When the earth's average temperature change is likewise stated as a percentage relative to absolute zero, it can easily be explained through the typical variations in the sun's radiant energy. That no "climate expert" mentions this only further solidifies my dislike for the lot of them.
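For comparison, the same percent-against-a-true-zero arithmetic can be applied to the sun's output. The irradiance figures below are assumed approximate values (roughly the mean total solar irradiance and its swing over a solar cycle), not numbers taken from this post:

```python
# Rough, assumed figures for illustration only.
S_MEAN = 1361.0   # approx. mean total solar irradiance at the earth, W/m^2
S_SWING = 1.3     # approx. peak-to-trough variation over a solar cycle, W/m^2

# Irradiance is measured from a true zero, so a percentage is meaningful.
pct_solar = 100.0 * S_SWING / S_MEAN
print(f"solar-cycle variation: {pct_solar:.3f}% of mean irradiance")
```

Under these assumed figures, the solar-cycle swing comes out to roughly a tenth of one percent of the mean, the same order of magnitude as the temperature percentage computed against absolute zero.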
