While much of the United States is freezing, let's all warm ourselves by considering what a bunch of liars the global warming zealots have become!
Completely inadequate IPCC models produce the ultimate deception about man-made global warming
By Dr. Tim Ball
Canada Free Press
December 22, 2008
E. R. Beadle said, “Half the work done in the world is to make things appear what they are not.” The Intergovernmental Panel on Climate Change (IPCC) does this with purpose and great effect. They built the difference between appearance and reality into their process. Unlike the procedure used elsewhere, they produce and release a summary report independently, and before the actual technical report is completed. This way the summary gets maximum media attention and becomes the public understanding of what the scientists said. Climate science is made to appear what it is not. Indeed, it is not even what is in their Scientific Report.
The pattern of falsifying appearances began early. Although he works at the National Center for Atmospheric Research (NCAR), Stephen Schneider was heavily employed in the work of the IPCC as this biography notes.
Much of Schneider’s time is taken up by what he calls his “pro bono day job” for the Intergovernmental Panel on Climate Change (IPCC). He was a Coordinating Lead Author in Working Group II of the IPCC from 1997 to 2001 and a lead author in Working Group I from 1994 to 1996. Currently, he is a Coordinating Lead Author for the controversial chapter on “Assessing Key Vulnerabilities and the Risks from Climate Change,” in short, defining “dangerous” climate change. - Pubmedcentral.nih.gov
He continued this work by helping prepare the Summary for Policymakers (SPM) of the Fourth Assessment Report (AR4) released in April 2007.
Schneider, among others, created the appearance that the Summary was representative of the Science Report. However, he gave an early insight into the thinking when speaking about global warming to Discover magazine (October 1989): scientists need “to get some broader based support, to capture the public’s imagination…that, of course, entails getting loads of media coverage. So we have to offer up some scary scenarios, make simplified dramatic statements and make little mention of any doubts we may have…each of us has to decide what the right balance is between being effective, and being honest.” The last sentence is deeply disturbing: no such decision should be required.
The Summary for Policymakers is designed to convince everyone that global warming is due to human production of CO2. The SPM for AR4, issued in April 2007, says, “Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic GHG concentrations.” The term “very likely” comes from a table reportedly produced by Schneider and means greater than 90%. Professor Roy Spencer says of probabilities in this context: “Any statements of probability are meaningless and misleading. I think the IPCC made a big mistake. They’re pandering to the public not understanding probabilities. When they say 90 percent, they make it sound like they’ve come up with some kind of objective, independent, quantitative way of estimating probabilities related to this stuff. It isn’t. All it is is a statement of faith.”
So they create an appearance of certainty about a human cause of warming. But what is the reality? The only place where CO2 is causing temperature increase is in the IPCC computer models. In every record of any duration for any time period in the history of the Earth, temperature increase precedes CO2 increase. So an incorrect assumption that a CO2 increase will cause temperature increase is built into the computer models. That is damaging enough, but the computer models themselves are completely inadequate to represent global climate or make any predictions about future climate. But don’t believe me. The IPCC Technical Report (“The Physical Science Basis”), produced by Working Group I and released in November 2007, says so.
Problems begin with the definition of climate change used, because it requires that they consider only human causes. Article 1 of the United Nations Framework Convention on Climate Change (UNFCCC) defines climate change as “a change of climate which is attributed directly or indirectly to human activity that alters the composition of the global atmosphere and which is in addition to natural climate variability observed over comparable time periods.” But you cannot determine the human portion unless you understand natural climate change. As Professor Roy Spencer said in his testimony before the US Senate EPW Committee, “And given that virtually no research into possible natural explanations for global warming has been performed, it is time for scientific objectivity and integrity to be restored to the field of global warming research.”
The media and the public are allowed to believe the IPCC makes climate predictions, but it does not. The First Assessment Report (Climate Change 1992) said, “Scenarios are not predictions of the future and should not be used as such.” The Special Report on Emissions Scenarios says: “Scenarios are images of the future or alternative futures. They are neither predictions nor forecasts.” Climate Change 2001 continues the warnings: “The possibility that any single emissions path will occur as described in the scenario is highly uncertain.” In the same Report they say, “No judgment is offered in this report as to the preference for any of the scenarios and they are not assigned probabilities of occurrence, neither must they be interpreted as policy recommendations.” This refers to the range of scenarios they produce using different possible future economic conditions. Of course, they did not build in the recent financial collapse.
Climate Change 2001 substitutes the word projection for prediction. Projection is defined as follows, “A projection is a potential future evolution of a quantity or set of quantities, often computed with the help of a model. Projections are distinguished from predictions in order to emphasise that projections involve assumptions concerning e.g. future socio-economic and technological developments that may or may not be realised and are therefore subject to substantial uncertainty”.
This and similar statements are based on the unproven hypothesis that human-produced CO2 is causing warming and/or climate change. The evidence is based solely on the output of 18 computer climate models selected by the IPCC. There are a multitude of problems, including the fact that the models produce different results every time they are run, so an average of all the runs is used for each model. The IPCC then takes those averaged results from the 18 models and averages them again for its Reports.
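The two-stage averaging described above can be sketched in a few lines of code. This is an illustrative toy, not IPCC code: the number of runs per model, the warming values, and the noise level are invented purely for the example.

```python
# Illustrative sketch (not IPCC code) of two-stage ensemble averaging:
# each model is run several times, the runs are averaged per model,
# then the per-model means are averaged across all 18 models.
import random

random.seed(42)  # deterministic for the example

N_MODELS = 18    # number of models selected by the IPCC (per the text above)
N_RUNS = 5       # hypothetical number of runs per model

def fake_model_run(model_id):
    """Stand-in for one GCM run: an invented warming value plus
    run-to-run scatter representing sensitivity to initial conditions."""
    return 2.0 + 0.1 * model_id + random.gauss(0.0, 0.5)

per_model_means = []
for m in range(N_MODELS):
    runs = [fake_model_run(m) for _ in range(N_RUNS)]
    per_model_means.append(sum(runs) / len(runs))   # average the runs

# Average the 18 per-model means for the reported figure.
multi_model_mean = sum(per_model_means) / len(per_model_means)
print(f"Multi-model ensemble mean: {multi_model_mean:.2f}")
```

Note that averaging in this way smooths over the run-to-run spread rather than explaining it; the spread itself is the signature of the chaotic behaviour discussed later in the article.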
Tim Palmer, a leading climate modeler at the European Centre for Medium-Range Weather Forecasts, said, “I don’t want to undermine the IPCC, but the forecasts, especially for regional climate change, are immensely uncertain.” This comment is partly explained by the scale of the General Circulation Models (GCM). The models are mathematical constructs that divide the world into rectangles. The size of the rectangles is critical to the abilities of the models, as the IPCC AR4 acknowledges: “Computational constraints restrict the resolution that is possible in the discretized equations, and some representation of the large-scale impacts of unresolved processes is required (the parametrization problem).” (AR4, Chapter 8, p. 596.)
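The computational constraint the quote refers to can be made concrete with simple arithmetic. The grid spacings below are my own illustrative choices, not figures from the report:

```python
# Back-of-envelope sketch (my own numbers, not from AR4): how many grid
# columns a model has at a given horizontal resolution, and why processes
# smaller than a cell must be handled by parametrization.
EARTH_SURFACE_KM2 = 510_000_000   # approximate surface area of the Earth

def grid_columns(cell_size_km):
    """Approximate number of grid columns for square cells of the given size."""
    return EARTH_SURFACE_KM2 // (cell_size_km ** 2)

for cell in (250, 100, 10):
    print(f"{cell} km cells -> ~{grid_columns(cell):,} columns")

# An individual cumulus cloud is on the order of 1 km across, far smaller
# than any of these cells, so its large-scale effects cannot be resolved
# directly and must instead be approximated (parametrized).
```

Halving the cell size quadruples the number of columns (and also forces smaller time steps), which is why resolution is limited by computing power.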
The IPCC uses surface weather data, which for most of the world is inadequate in both space and time to create an accurate model. The limitations of the surface data are compounded by an almost complete lack of information above the surface. The IPCC’s own comment on the problems of modeling Arctic climates illustrates the surface problem.
“Despite advances since the TAR, substantial uncertainty remains in the magnitude of cryospheric feedbacks within AOGCMs. This contributes to a spread of modelled climate response, particularly at high latitudes. At the global scale, the surface albedo feedback is positive in all the models, and varies between models much less than cloud feedbacks. Understanding and evaluating sea ice feedbacks is complicated by the strong coupling to polar cloud processes and ocean heat and freshwater transport. Scarcity of observations in polar regions also hampers evaluation.” (AR4, Chapter 8, p. 593.) Most of the information for the Arctic came from the Arctic Climate Impact Assessment (ACIA), and a diagram from that report illustrates the problem.
The very large area labeled “No Data” covers most of the Arctic Basin, an area of approximately 14,250,000 km2 (5,500,000 square miles). Remember, claimed certainties about Arctic ice conditions are core to Gore’s alarmism.
In the Southern Hemisphere the IPCC identifies this problem over a vast area of the Earth’s surface. “Systematic biases have been found in most models’ simulation of the Southern Ocean. Since the Southern Ocean is important for ocean heat uptake, this results in some uncertainty in transient climate response.” (AR4, Chapter 8, p. 591.)
Atmosphere and oceans are fluids governed by non-linear rather than linear equations. Such equations exhibit an inherent unpredictability, also known as chaos, which explains why the models get different results every time they are run. These problems, well known outside of climate science, were specifically acknowledged in the IPCC Third Assessment Report (TAR): “In climate research and modeling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible.” (TAR, p. 774.)
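The sensitivity to initial conditions that chaos implies can be demonstrated with the logistic map, a standard toy chaotic system (not a climate model): two trajectories that start almost identically end up completely different.

```python
# Minimal chaos demonstration using the logistic map x -> r*x*(1-x),
# a standard example of a non-linear chaotic system (not a climate model).
def logistic_trajectory(x0, r=3.9, steps=50):
    """Iterate the logistic map 'steps' times from initial value x0.
    For r near 4 the map is chaotic: tiny input differences grow rapidly."""
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

a = logistic_trajectory(0.500000)
b = logistic_trajectory(0.500001)   # initial condition differs by only 1e-6
print(f"After 50 steps: {a:.4f} vs {b:.4f}")
```

After a few dozen iterations the two trajectories bear no resemblance to each other, which is the same qualitative behaviour that makes individual model runs differ from one another.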
Validation is essential for any model before it is used for predictions. The normal procedure is to require proven evidence that a model can make future predictions to a satisfactory level of accuracy. The IPCC uses the term evaluation instead of validation, but they do not evaluate the entire model. They say that doing so reveals problems whose source cannot be determined. Instead they evaluate at the component level, which means they never evaluate the important interactions between the components.
IPCC Report AR4 makes a remarkable statement not repeated in the Summary for Policymakers. It speaks to the lack of validation, which explains the failure of their projections. “What does the accuracy of a climate model’s simulation of past or contemporary climate say about the accuracy of its projections of climate change? This question is just beginning to be addressed, exploiting the newly available ensembles of models.” (AR4, Chapter 8, p. 594.)
A simple working definition of science is the ability to predict. The IPCC makes no predictions, yet presents its work as scientific prediction. The media and the public generally believe the IPCC is making predictions, and that is clearly the assumption behind government policies. Sadly, members of the IPCC do nothing to dissuade the public from that view. All previous “projections” were wrong. The most recent example is the period from 2000 to 2008: the IPCC projected warming, but temperatures went down while CO2 increased. Finally, the IPCC AR4 itself explains why IPCC model projections fail.
“Models continue to have significant limitations, such as in their representation of clouds, which lead to uncertainties in the magnitude and timing, as well as regional details, of predicted climate change.” (AR4, Chapter 8, p. 600.)
It is hard to imagine a better example of Beadle’s axiom, paraphrased as follows: “Half the work done by the IPCC is to make things appear what they are not.”

Dr. Tim Ball is a renowned environmental consultant and former climatology professor at the University of Winnipeg. Dr. Ball employs his extensive background in climatology and other fields as an advisor to the International Climate Science Coalition, Friends of Science and the Frontier Centre for Public Policy.