Here comes yet another in the long line of analyses showing how uncertain the climate models are, and on what flimsy scientific ground they rest. In this case it concerns cloud formation, energy budgets and ”Greenhouse Gas Forcing”.
So these are the voodoo models that the Global Warming hysterics worship, and on whose altar they are prepared to sacrifice OUR prosperity and economic future.
See also my earlier posts:
- Validation, Evaluation and Exaggeration from the IPCC
- Has Global Warming Research Misinterpreted Cloud Behavior?
- Honest Statement Of Current Capability In Climate Forecasts
- Tropical Water Vapor and Cloud Feedbacks in Climate Models
- Basic Greenhouse Equations ”Totally Wrong” – another talk from the New York conference
- Hey, Nobel Prize Winners, Answer Me This
- The Sloppy Science of Global Warming!
- IPCC models are incoherent and invalid from a scientific point of view!
- But the forecasts, especially for regional climate change, are immensely uncertain!
- There will be no more warming for the foreseeable future
- ROBUSTNESS AND UNCERTAINTIES OF CLIMATE CHANGE PREDICTIONS
- Has the IPCC inflated the feedback factor?
- Climate change confirmed but global warming is cancelled
- Why multiple climate model agreement is not that exciting!
- Open letter to IPCC to renounce its current policy!
- Average Day By Day Variations Of The Global And Hemispheric Average Lower Tropospheric Temperatures
- Scientists Reveal Presence Of Ocean Current ‘Stripes’
- Cold in the tropical troposphere but it should be warming if Global Warming ”theories” are correct!
- Assessment of the reliability of climate predictions based on comparisons with historical time series
- More on the climate models’ falsification
- The climate models’ falsification
- The climate models’ swindle – errors of 100–300%!
The analysis can be found here:
The claim that anthropogenic CO2 is responsible for the current warming of Earth climate is scientifically insupportable because climate models are unreliable
by Patrick Frank
”One approach to determining error is to integrate the total cloudiness retrodicted by each model and compare that to the total cloudiness actually observed (SI Section 3). Calculating error this way is a little simplistic because positive error in one latitude can be cancelled by negative error in another. This exercise produced a standard average cloudiness error of ±10.1%, which is about half the officially assessed GCM cloud error.24 So let’s call ±10.1% the minimal GCM cloud error.
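The cancellation caveat in the passage above can be illustrated with a toy calculation. All the numbers below (latitude bands, cloud fractions, band errors) are invented for illustration and are not taken from any model or dataset:

```python
# Toy illustration of the cancellation problem described above: a signed,
# area-weighted global-mean cloud error can be near zero even when the
# per-latitude errors are large. All numbers here are invented.
import math

lats = [-85 + 10 * i for i in range(18)]           # 18 latitude bands
w = [math.cos(math.radians(l)) for l in lats]      # area weights ~ cos(latitude)
total = sum(w)
w = [x / total for x in w]

observed = [50.0] * 18                             # hypothetical observed cloud cover, %
modeled = [50.0 + (5.0 if i % 2 == 0 else -5.0)    # alternating +/-5% band errors
           for i in range(18)]

signed_err = sum(wi * (m - o) for wi, m, o in zip(w, modeled, observed))
abs_err = sum(wi * abs(m - o) for wi, m, o in zip(w, modeled, observed))
print(round(signed_err, 6), round(abs_err, 1))     # -> 0.0 5.0
```

The signed global mean comes out near zero even though every band is off by 5%, which is exactly why the text calls the integrated-error approach "a little simplistic" and treats ±10.1% as a minimal estimate.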
The average energy impact of clouds on Earth climate is worth about -27.6 W/m2.27 That means ±10.1% error produces a ±2.8 W/m2 uncertainty in GCM climate projections. This uncertainty equals about ±100% of the current excess forcing produced by all the human-generated greenhouse gasses presently in the atmosphere.10 Taking it into account will reflect a true, but incomplete, estimate of the physical reliability of a GCM temperature trend.
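The arithmetic in the paragraph above is simple enough to check directly (both input numbers are from the text; the rounding is mine):

```python
# Check the quoted arithmetic: +/-10.1% of the ~-27.6 W/m^2 cloud energy
# impact gives the +/-2.8 W/m^2 uncertainty the text carries forward.
cloud_forcing = -27.6    # W/m^2, average energy impact of clouds (from the text)
cloud_error = 0.101      # minimal GCM cloud error, +/-10.1% (from the text)

uncertainty = abs(cloud_forcing) * cloud_error
print(round(uncertainty, 1))   # -> 2.8
```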
So, what happens when this ±2.8 W/m2 is propagated through the SRES temperature trends offered by the IPCC in Figure SPM-5 (Figure 1)? When calculating a year-by-year temperature projection, each new temperature plus its physical uncertainty gets fed into the calculation of the next year’s temperature plus its physical uncertainty. This sort of uncertainty accumulates each year because every predicted temperature includes its entire ± (physical uncertainty) range (SI Section 4).
Figure 4. The Special Report on Emission Scenarios (SRES-SPM-5) A2 projection from Figure 1 showing the physical uncertainty of the projected temperature trend when including ±10.1% cloud error (light shading), or the uncertainty in greenhouse gas forcing (dark shading). Inset: A close-up view of the first 20 years of the A2 projection and the uncertainty limits.
Figure 4 shows the A2 SRES projection as it might have looked had the IPCC opted to show the minimal ±10.1% cloud error as a measure of the physical accuracy of their GCM-scenarioed 21st century temperature trend. The result is a little embarrassing. The physical uncertainty accumulates rapidly and is so large at 100 years that accommodating it has almost flattened the steep SRES A2 projection of Figure 1. The ±4.4°C uncertainty at year 4 already exceeds the entire 3.7°C temperature increase at 100 years. By 50 years, the uncertainty in projected temperature is ±55°. At 100 years, the accumulated physical cloud uncertainty in temperature is ±111 degrees. Recall that this huge uncertainty stems from a minimal estimate of GCM physical cloud error.
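The growth of the quoted uncertainty figures is consistent with simple linear accumulation, and the sketch below reproduces them under that assumption. The ~1.11 °C per-year step (±2.8 W/m² converted through a sensitivity of roughly 0.4 °C per W/m²) is my reading of the quoted numbers, not a value stated in the text:

```python
# Hedged sketch: the quoted uncertainties (+/-4.4 C at year 4, ~+/-55 C at
# year 50, +/-111 C at year 100) all follow from a fixed per-year step of
# ~1.11 C that accumulates linearly, as the text's description implies.
PER_YEAR_C = 1.11   # assumed per-year temperature uncertainty from +/-2.8 W/m^2

def accumulated_uncertainty(years):
    """Linear accumulation: each year adds the full per-year uncertainty."""
    return PER_YEAR_C * years

for y in (4, 50, 100):
    print(y, round(accumulated_uncertainty(y), 1))
```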
In terms of the actual behavior of Earth climate, this uncertainty does not mean the GCMs are predicting that the climate may possibly be 100 degrees warmer or cooler by 2100. It means that the limits of resolution of the GCMs – their pixel size – are huge compared to what they are trying to project. In each new projection year of a century-scale calculation, the growing uncertainty in the climate impact of clouds alone makes the view of a GCM become progressively fuzzier.
It’s as though a stronger and stronger distorting lens was placed in front of your eyes every time you turned around. First the flowers disappear, then the people, then the cars, the houses, and finally the large skyscrapers. Everything fuzzes out leaving indistinct blobs, and even large-scale motions can’t be resolved. Claiming GCMs yield a reliable picture of future climate is like insisting that an indefinable blurry blob is really a house with a cat in the window.
The dark shading in Figure 4 shows the error due to uncertainties in greenhouse gas forcings themselves (~1% for CO2, ~10% for methane, ~5% for nitrous oxide),10 and how this small uncertainty accumulates during 100 years of climate projection. After a century, the uncertainty in predicted global average temperature is ±17 degrees just from accumulation of the smallish forcing error alone.
The difficulty is serious even over short times. The inset to Figure 4 shows that after only 20 years, the uncertainty from cloud error is ±22° and for forcing, it’s ±3°. The effect of the ~1% forcing uncertainty alone tells us that a 99% accurate GCM couldn’t discern a new Little Ice Age from a major tropical advance from even 20 years out. Not only are these physical uncertainties vastly larger than the IPCC allows in Figure SPM-5 (Figure 1), but the uncertainties the IPCC allows in Figure SPM-5 aren’t even physical.16
When both the cloud and the forcing uncertainties are allowed to accumulate together, after 5 years the A2 scenario includes a 0.34°C warmer Earth but a ±8.8°C uncertainty. At 10 years this becomes 0.44±15° C, and 0.6±27.7°C in 20 years. By 2100, the projection is 3.7±130°C. From clouds alone, all the IPCC projections have uncertainties that are very much larger than the projected greenhouse temperature increase. What is credible about a prediction that sports an uncertainty 20-40 times greater than itself? After only a few years, a GCM global temperature prediction is no more reliable than a random guess. That means the effect of greenhouse gasses on Earth climate is unpredictable, and therefore undetectable. And therefore moot.
The rapid growth of uncertainty means that GCMs cannot discern an ice age from a hothouse from 5 years away, much less 100 years away. So far as GCMs are concerned, Earth may be a winter wonderland by 2100 or a tropical paradise. No one knows.
Direct tests of climate models tell the same tale. In 2002, Matthew Collins of the UK Hadley Centre used the HadCM3 GCM to generate an artificial climate, and then tested how the HadCM3 fared predicting the very same climate it had generated.28 It fared poorly, even though it was the perfect model. The problem was that tiny uncertainties in the inputs – the starting conditions – rapidly expanded and quickly drove the GCM into incoherence.
Even with a perfect model, Collins reported that, ”[I]t appears that annual mean global mean temperatures are potentially predictable 1 year in advance and that longer time averages are also marginally predictable 5 and 10 years in advance.” So with a perfect climate model and near-perfect inputs one might someday ”potentially [predict]” and ”marginally [predict],” but cannot yet actually predict 1 year ahead. But with imperfect models, the IPCC predicts 100 years ahead.
Likewise, in a 2006 test of reliability, William Merryfield used 15 GCMs to predict future El Niño-Southern Oscillations (ENSO) in a greenhouse-warmed climate,29 and found that, ”Under CO2 doubling, 8 of the 15 models exhibit ENSO amplitude changes that significantly (p<0.1) exceed centennial time scale variability within the respective control runs. However, in five of these models the amplitude decreases whereas in three it increases; hence there is no consensus as to the sign of change.” So of the 15 GCMs, seven predicted no significant change, five predicted a weaker ENSO, and three predicted a stronger ENSO. This result is exactly equivalent to ‘don’t know.’ The 15 GCMs tested by Merryfield were the same ones used by the IPCC to produce its Fourth Assessment Report.
In light of all this, why is the IPCC so certain that human-produced CO2 is responsible for recent warming? How can the US National Academy of Sciences say, in a recent brochure, that, ” … Earth’s warming in recent decades has been caused primarily by human activities that have increased the amount of greenhouse gases in the atmosphere”?30 This brochure offers a very telling Figure 4 (SI Section 5), showing the inputs to 20th century global temperature from a GCM projection. Only when the effects of human greenhouse gasses are included with normal temperature variation, we are told, does the GCM projected temperature trend match the observed temperature trend.
But their Figure 4 has another trait that is almost ubiquitous in GCM temperature projections. It shows no physical uncertainty limits. We are given a projected temperature trend that is implicitly represented as perfectly accurate. NAS Figure 4 would be more truthful if the National Academy presented it complete with ±100 degree uncertainty limits. Then it would be obvious that the correspondence between the observations and the projection was no more than accidental. Or else that the GCM was artificially adjusted to make it fit. It would also be obvious that it is meaningless to claim an explanatory fit is impossible without added CO2, when in fact an explanatory fit is impossible, period.
It is well-known among climatologists that large swaths of the physics in GCMs are not well understood.31 Where the uncertainty is significant GCMs have ”parameters,” which are best judgments for how certain climate processes work. General Circulation Models have dozens of parameters and possibly a million variables,32 and all of them have some sort of error or uncertainty.
A proper assessment of their physical reliability would include propagating all the parameter uncertainties through the GCMs, and then reporting the total uncertainty.33 I have looked in vain for such a study. No one seems to ever have directly assessed the total physical reliability of a GCM by propagating the parameter uncertainties through it. In the usual physical sciences, an analysis like this is required practice. But not in GCM science, apparently, and so the same people who express alarm about future warming disregard their own profound ignorance.
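For what such a propagation study would look like in outline, here is a minimal Monte Carlo sketch on a deliberately trivial stand-in model. The parameter values and their error bars are invented, and a real GCM would require sampling dozens of parameters through a full simulation rather than a one-line formula:

```python
# Minimal Monte Carlo sketch of parameter-uncertainty propagation on a toy
# model (NOT a real GCM): draw each uncertain parameter from its assumed
# error distribution, run the model per draw, and report the output spread.
import random

random.seed(0)

def toy_model(sensitivity, forcing):
    # Stand-in for a full simulation: temperature response = sensitivity * forcing.
    return sensitivity * forcing

def propagate(n=20_000):
    outputs = []
    for _ in range(n):
        s = random.gauss(0.8, 0.2)   # hypothetical sensitivity, C per (W/m^2), +/- 1-sigma
        f = random.gauss(3.7, 0.4)   # hypothetical forcing, W/m^2, +/- 1-sigma
        outputs.append(toy_model(s, f))
    mean = sum(outputs) / n
    sigma = (sum((o - mean) ** 2 for o in outputs) / (n - 1)) ** 0.5
    return mean, sigma

mean, sigma = propagate()
print(f"projected response: {mean:.2f} +/- {sigma:.2f} C")
```

The point, as the text argues, is that the spread around the central value – not the central value alone – is the physically meaningful result of such a study.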
So the bottom line is this: When it comes to future climate, no one knows what they’re talking about. No one. Not the IPCC nor its scientists, not the US National Academy of Sciences, not the NRDC or National Geographic, not the US Congressional House leadership, not me, not you, and certainly not Mr. Albert Gore. Earth’s climate is warming and no one knows exactly why. But there is no falsifiable scientific basis whatever to assert this warming is caused by human-produced greenhouse gasses because current physical theory is too grossly inadequate to establish any cause at all.
Nevertheless, those who advocate extreme policies to reduce carbon dioxide emissions inevitably base their case on GCM projections, which somehow become real predictions in publicity releases. But even if these advocates admitted the uncertainty of their predictions, they might still invoke the Precautionary Principle and call for extreme reductions ”just to be safe.” This principle says, ”Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.”34 That is, even if we don’t fully know that CO2 is dangerously warming Earth climate, we should curtail its emission anyway, just in case. However, if the present uncertainty limit in General Circulation Models is at least ±100 degrees per century, we are left in total ignorance about the temperature effect of increasing CO2. It’s not that we, ”lack … full scientific certainty,” it’s that we lack any scientific certainty. We literally don’t know whether doubling atmospheric CO2 will have any discernible effect on climate at all.
If our knowledge of future climates is zero then for all we know either suppressing CO2 emissions or increasing them may make climate better, or worse, or just have a neutral effect. The alternatives are incommensurate but in our state of ignorance either choice equally has two chances in three of causing the least harm.35 Complete ignorance makes the Precautionary Principle completely useless. There are good reasons to reduce burning fossil fuels, but climate warming isn’t one of them.
Some may decide to believe anyway. ”We can’t prove it,” they might say, ”but the correlation of CO2 with temperature is there (they’re both rising, after all),36 and so the causality is there, too, even if we can’t prove it yet.” But correlation is not causation,37 and cause can’t be assigned by an insistent ignorance. The proper response to adamant certainty in the face of complete ignorance38 is rational skepticism. And aren’t we much better off accumulating resources to meet urgent needs than expending resources to service ignorant fears?
So, then, what about melting ice-sheets, rising sea levels, the extinction of polar bears, and more extreme weather events? What if unusually intense hurricane seasons really do cause widespread disaster? It is critical to keep a firm grip on reason and rationality, most especially when social invitations to frenzy are so pervasive. General Circulation Models are so terribly unreliable that there is no objectively falsifiable reason to suppose any of the current warming trend is due to human-produced CO2, or that this CO2 will detectably warm the climate at all. Therefore, even if extreme events do develop because of a warming climate, there is no scientifically valid reason to attribute the cause to human-produced CO2. In the chaos of Earth’s climate, there may be no discernible cause for warming.39 Many excellent scientists have explained all this in powerful works written to defuse the CO2 panic,40 but the choir sings seductively and few righteous believers seem willing to entertain disproofs.