“The report of our extinction was an exaggeration.”

To paraphrase what Mark Twain wrote in the New York Journal in 1897 when he learned that his obituary had been published in that very paper.

The same can be said of the polar bears.

Do you remember the famous polar bear picture that Al Gore uses in his film and in his lectures to show how the ice has melted, how threatened the polar bears are, etc.? What he conveniently failed to mention is that the picture was used without permission and was taken in the middle of summer.

Well, a number of organizations, among them the U.S. Department of the Interior and the U.S. Geological Survey (USGS), became so alarmed by these claims that they launched investigations to see how the polar bears were being affected by Global Warming and the ”melting” ice.

They also wanted to list the polar bears as ”a threatened species” under the U.S. Endangered Species Act (January 2007, the U.S. Fish and Wildlife Service).

These ”investigations” concluded that the polar bear population would shrink by at least two thirds by the year 2050 because of Global Warming.

Well, Armstrong, Green and Soon have now taken the principles and methods for scientific forecasting that I described in more detail in my post The Unscientific way of IPCC:s forecasts, or IPCC:s lie part 2!, and used them to analyze the USGS (U.S. Geological Survey) reports, to see how much science there actually is behind the claims and predictions the USGS makes in them.

The USGS uses these reports to scare people with the catastrophic consequences Global Warming would supposedly have for the polar bears. And our politicians and mass media, of course, follow them slavishly.

And the result?

A total demolition of the so-called scientific methods the USGS used. In one case the report manages to satisfy only 10% of the requirements, in the other only 15%.

 

It is with such ”brilliant” scientific efforts that the Global Warming hysteria is kept alive.

 

Some quotes:

Table 1: Summary ratings from the forecasting audits

Principles               AMD    H6
Contravened               41    61
Apparently contravened    32    19
Not auditable             26    15
Properly applied          17    10

”Of the nine, Amstrup, Marcot, and Douglas (2007) and Hunter et al. (2007) were the most relevant to the listing decision. Their forecasts were products of complex sets of assumptions. The first in both cases was the erroneous assumption that General Circulation Models provide valid forecasts of summer sea ice in the regions inhabited by polar bears. We nevertheless audited their conditional forecasts of what would happen to the polar bear population assuming, as the authors did, that the extent of summer sea ice would decrease substantially over the coming decades.

In all, Amstrup et al. properly applied only 15% of relevant forecasting principles and Hunter et al. only 10%. Their forecasts are unscientific and therefore a disservice to decision makers. We recommend that all relevant principles should be observed and forecasting processes audited when important public policy decisions depend on accurate forecasts.”

 

”The reports included 444 unique references in total. We were unable to find any references to works providing evidence that the forecasting methods used in the reports had been validated.”

 

”The fact that the AMD forecasts rest on the GCM forecasts and that these forecasts lack a scientific basis puts their assumptions into question. Indeed, some climate modelers state that the GCMs do not provide forecasts. Furthermore, the GCM models have not been designed for analysis at a regional level in the way they are used by AMD and H6.”

”Since the current population of bears is not at a level that causes concern, the case for listing depends upon forecasts of serious declines in bear numbers in decades to come. None of the reports included references to scientific works on forecasting methods.

We found that the two reports most relevant to the listing decision made some questionable assumptions. Even if these assumptions had been valid, the bear population forecasting procedures used in the reports contravened many important forecasting principles.”

The USGS reports are available here:

http://www.usgs.gov/newsroom/special/polar_bears/

The analysis is available here:

Polar Bear Population Forecasts: A Public-Policy Forecasting Audit

http://forecastingprinciples.com/Public_Policy/PolBears.pdf

Polar Bear Hearing: Armstrong Testimony

Inhofe Statement at EPW Committee Hearing on Polar Bears

 Polar Bear Hearing: Glenn Testimony

Also read other bloggers’ opinions on <a href="http://bloggar.se/om/milj%F6" rel="tag">the environment</a>

Quotes from the analysis:

”Abstract

 

Calls to list polar bears as a threatened species under the U.S. Endangered Species Act are based on forecasts of large long-term declines in their population. Nine government reports were prepared to ”support” the listing decision. We assessed them in the light of evidence-based (scientific) forecasting principles. None referred to works on scientific forecasting methodology.

 

Of the nine, Amstrup, Marcot, and Douglas (2007) and Hunter et al. (2007) were the most relevant to the listing decision. Their forecasts were products of complex sets of assumptions. The first in both cases was the erroneous assumption that General Circulation Models provide valid forecasts of summer sea ice in the regions inhabited by polar bears.

We nevertheless audited their conditional forecasts of what would happen to the polar bear population assuming, as the authors did, that the extent of summer sea ice would decrease substantially over the coming decades. In all, Amstrup et al. properly applied only 15% of relevant forecasting principles and Hunter et al. only 10%. Their forecasts are unscientific and therefore a disservice to decision makers. We recommend that all relevant principles should be observed and forecasting processes audited when important public policy decisions depend on accurate forecasts.

 

Polar bears have been described by some as the ”canaries of climate change.” Despite widespread agreement that the polar bear population rose rapidly over recent years after stricter polar bear hunting rules were imposed, new concerns have been expressed that climate change will threaten the survival of some sub-populations in the 21st Century. Such concerns led the U.S. Fish and Wildlife Service to consider listing polar bears as a threatened species under the Endangered Species Act. To list a species that is currently in good health must surely require forecasts that its population would, if it were not listed, decline to levels that threaten the viability of the species.

The decision on listing polar bears thus rests on long-term forecasts.

In order to provide the necessary forecasts for the listing decision, the Secretary of the Interior and the Fish and Wildlife Service requested the U.S. Geological Survey to conduct additional analyses. The Geological Survey in turn commissioned nine ”administrative reports” to satisfy the request.

We asked ”Are the forecasts derived from accepted scientific procedures?”

In order to answer our question, we first examined the references in the nine unpublished government reports. Second, we examined the forecasting methods employed in two of those nine reports by assessing the procedures described in them against forecasting principles. The forecasting principles are based on evidence from scientific research that has revealed which methods provide the most accurate forecasts for a given situation and which methods to avoid.

 

We examined the references cited in the nine unpublished U.S. Geological Survey Administrative Reports posted on the Internet at http://usgs.gov/newsroom/special/polar_bears/. The reports were

Amstrup et al. (2007); Bergen et al. (2007); DeWeaver (2007); Durner et al. (2007); Hunter et al. (2007); Obbard et al. (2007); Regehr et al. (2007); Rode et al. (2007); and Stirling et al. (2007).

 

The reports included 444 unique references in total. We were unable to find any references to works providing evidence that the forecasting methods used in the reports had been validated.

We audited the forecasting procedures used in what we judged to be the two reports most crucial to the forecasts. These were Amstrup et al. (2007), which we refer to as AMD, and Hunter et al. (2007), which we refer to as H6.”

 

And if we then look in detail at which scientific principles the USGS violates:

”AMD made forecasts of polar bear populations for 45, 75, and 100 years from the year 2000. AMD implicitly made many assumptions: (1) global warming will occur; (2) this will both reduce the extent of and thin the summer sea ice; (3) polar bears will obtain less food by hunting from the sea ice platform than they do now; (4) they will not obtain adequate supplementary food using other means or from other sources; (5) the bear population will decline; (6) the designation of polar bears as an endangered species will solve the problem and will not have serious detrimental effects; and (7) there are no other policies that would produce better outcomes than those based on an endangered species classification.

AMD assumed that the general circulation models (GCMs) provide scientifically valid forecasts of global temperature and the extent and thickness of sea ice. They stated (AMD 2007, p. 2 and Fig 2 on p. 83): ”Our future forecasts are based largely on information derived from general circulation model (GCM) projections of the extent and spatiotemporal distribution of sea ice.” That is, their forecasts are conditional on long-term global warming forecasts leading to a dramatic reduction in Arctic sea ice during melt-back periods in spring, late summer and fall.”

”The fact that the AMD forecasts rest on the GCM forecasts and that these forecasts lack a scientific basis puts their assumptions into question. Indeed, some climate modelers state that the GCMs do not provide forecasts. Furthermore, the GCM models have not been designed for analysis at a regional level in the way they are used by AMD and H6 (see the discussion of Principle 9.2 in H6 below).

We audited AMD’s polar bear population forecasting procedures to assess whether they would produce valid forecasts assuming valid climate and sea ice forecasts were available as inputs. Of the 140 forecasting principles, we agreed that 24 were irrelevant to the forecasting problem. We then examined principles on which our ratings differed. After two rounds of consultation (i.e., the process involved three rounds in all), we were able to reach consensus on all 116 relevant principles. We were unable to rate 26 relevant principles (Table A.3) due to a lack of information. We attempted to obtain additional information from the authors of the administrative reports, but they refused to cooperate. Full disclosure of the ratings is provided in Tables A.1, A.2 and A.3 in the Appendix.

Overall, we found that AMD contravened 41 principles and apparently contravened 32. No justification was provided for the contraventions. We could see no reason for any contraventions. We describe some of the more serious problems with the AMD forecasts here:

 

H6 forecast polar bear numbers and their survival probabilities in the southern Beaufort Sea for 45, 75, and 100 years from 2000. To do so, they implicitly assumed the following: (1) global warming will occur; (2) frequent ”low-ice years” in the Arctic region will be a consequence of global warming; (3) polar bears cannot adapt to ”low-ice years” and their population will decline; (4) the designation of polar bears as an endangered species will solve the problem and will not have serious detrimental effects; and (5) there are no other policies that would produce better outcomes than those based on listing polar bears under the Endangered Species Act.

These are key issues and they require scientific forecasts, not assumptions.

Like AMD, H6 accepted GCM forecasts of global warming and reduced extent and thickness of sea ice. They stated that ”we extracted forecasts of the availability of sea ice for polar bears in the Southern Beaufort Sea region, using monthly forecasts of sea ice concentrations from 10 IPCC Fourth Assessment Report (AR4) fully-coupled general circulation models” (p. 11 of H6).

That is, their forecasts are conditional on long-term forecasts of global warming producing dramatic effects. However, Green and Armstrong (2007) were unable to find any forecasts made in accordance with scientific forecasting principles that support the hypothesized predictions of global warming throughout the 21st Century.”

 

We found that H6’s procedures clearly contravened 61 principles (Appendix Table A.4) and probably contravened an additional 19 principles (Appendix Table A.5). We were unable to rate 15 relevant principles (Appendix Table A.6) due to a lack of information.

Many of the contraventions in H6 were similar to those in AMD and we provide the H6 audit details in the Appendix. Here are some examples of contraventions, some of which, on their own, raise serious questions about the value of the H6 forecasts:”

 

Summary and conclusions

 

We inspected the nine administrative reports that were commissioned by the USGS with the stated purpose of supporting the listing of polar bears under the Endangered Species Act. Since the current population of bears is not at a level that causes concern, the case for listing depends upon forecasts of serious declines in bear numbers in decades to come. None of the reports included references to scientific works on forecasting methods.

We found that the two reports most relevant to the listing decision made some questionable assumptions. Even if these assumptions had been valid, the bear population forecasting procedures used in the reports contravened many important forecasting principles. Table 1 summarizes our forecasting audits of the two key reports, Amstrup et al. (2007) and Hunter et al. (2007):

Table 1: Summary ratings from the forecasting audits

Principles               AMD    H6
Contravened               41    61
Apparently contravened    32    19
Not auditable             26    15
Properly applied          17    10

Decision makers and the public should expect to see scientific forecasts of both the polar bear population and the net benefits from feasible policies before any decision is made on whether to list polar bears as threatened or endangered. We recommend that important forecasting efforts such as this should observe all relevant principles and that their procedures be audited to ensure that they do so.
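As a quick sanity check on Table 1, here is a minimal sketch (my own illustration in Python, not code from the audit) that tallies the ratings and reproduces the ”properly applied only 15% / only 10%” figures quoted in the abstract. The total of 140 principles and the 24 judged irrelevant for AMD come from the audit text quoted above.

```python
# Minimal sketch (illustration only, not from the audit itself):
# tally the Table 1 ratings and recompute the "properly applied"
# percentages quoted in the abstract (~15% for AMD, ~10% for H6).
# The audit rated the reports against 140 forecasting principles;
# for AMD, 24 were judged irrelevant, leaving 116 relevant ones.

ratings = {
    "AMD": {"contravened": 41, "apparently contravened": 32,
            "not auditable": 26, "properly applied": 17},
    "H6":  {"contravened": 61, "apparently contravened": 19,
            "not auditable": 15, "properly applied": 10},
}

for report, counts in ratings.items():
    relevant = sum(counts.values())      # 116 for AMD, 105 for H6
    applied = counts["properly applied"]
    print(f"{report}: {applied} of {relevant} relevant principles "
          f"properly applied = {applied / relevant:.0%}")

# Output:
# AMD: 17 of 116 relevant principles properly applied = 15%
# H6: 10 of 105 relevant principles properly applied = 10%
```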

 

Appendix: Full Disclosure of the Codings

 

 

Table A.1: Principles contravened in AMD

 

Setting Objectives:

1.2 Prior to forecasting, agree on actions to take assuming different possible forecasts.

1.3 Make sure forecasts are independent of politics.

1.4 Consider whether the events or series can be forecasted.

1.5 Obtain decision makers’ agreement on methods.

Identify Data Sources:

3.5 Obtain information from similar (analogous) series or cases. Such information may help to estimate trends.

Collecting Data:

4.2 Ensure that information is reliable and that measurement error is low.

Selecting Methods:

6.1 List all the important selection criteria before evaluating methods.

6.2 Ask unbiased experts to rate potential methods.

6.7 Match the forecasting method(s) to the situation

6.8 Compare track records of various forecasting methods.

6.10 Examine the value of alternative forecasting methods.

Implementing Methods: General

7.3 Be conservative in situations of high uncertainty or instability.

Implementing Judgmental Methods:

8.1 Pretest the questions you intend to use to elicit judgmental forecasts.

8.2 Frame questions in alternative ways.

8.5 Obtain forecasts from heterogeneous experts.

8.7 Obtain forecasts from enough respondents.

8.8 Obtain multiple forecasts of an event from each expert.

Implementing Quantitative Methods:

9.1 Tailor the forecasting model to the horizon.

9.3 Do not use ”fit” to develop the model.

9.5 Update models frequently.

Implementing Methods: Quantitative Models with Explanatory Variables:

10.6 Prepare forecasts for at least two alternative environments.

10.8 Apply the same principles to forecasts of explanatory variables.

10.9 Shrink the forecasts of change if there is high uncertainty for predictions of the explanatory variables.

Combining Forecasts:

12.1 Combine forecasts from approaches that differ.

12.2 Use many approaches (or forecasters), preferably at least five.

12.3 Use formal procedures to combine forecasts.

12.4 Start with equal weights.

Evaluating Methods:

13.6 Describe potential biases of forecasters.

13.10 Test assumptions for validity.

13.32 Conduct explicit cost-benefit analyses.

Assessing Uncertainty:

14.1 Estimate prediction intervals (PIs).

14.2 Use objective procedures to estimate explicit prediction intervals.

14.3 Develop prediction intervals by using empirical estimates based on realistic representations of forecasting situations.

14.5 Ensure consistency over the forecast horizon.

14.7 When assessing PIs, list possible outcomes and assess their likelihoods.

14.8 Obtain good feedback about forecast accuracy and the reasons why errors occurred.

14.9 Combine prediction intervals from alternative forecasting methods.

14.10 Use safety factors to adjust for overconfidence in the PIs.

14.11 Conduct experiments to evaluate forecasts.

14.13 Incorporate the uncertainty associated with the prediction of the explanatory variables in the prediction intervals.

14.14 Ask for a judgmental likelihood that a forecast will fall within a pre-defined minimum-maximum interval

Table A.2: Principles apparently contravened in AMD

 

Structuring the problem:

2.1 Identify possible outcomes prior to making forecasts.

2.7 Decompose time series by level and trend.

Identify Data Sources:

3.2 Ensure that the data match the forecasting situation.

3.3 Avoid biased data sources.

3.4 Use diverse sources of data.

Collecting Data:

4.1 Use unbiased and systematic procedures to collect data.

4.3 Ensure that the information is valid.

Selecting Methods:

6.4 Use quantitative methods rather than qualitative methods.

6.9 Assess acceptability and understandability of methods to users.

Implementing Methods: General

7.1 Keep forecasting methods simple.

Implementing Quantitative methods:

9.2 Match the model to the underlying phenomena.

9.4 Weight the most relevant data more heavily.

Implementing Methods: Quantitative Models with Explanatory Variables:

10.1 Rely on theory and domain expertise to select causal (or explanatory) variables.

10.2 Use all important variables.

10.5 Use different types of data to measure a relationship.

Combining Forecasts:

12.5 Use trimmed means, medians, or modes

12.7 Use domain knowledge to vary weights on component forecasts.

12.8 Combine forecasts when there is uncertainty about which method is best.

12.9 Combine forecasts when you are uncertain about the situation.

12.10 Combine forecasts when it is important to avoid large errors.

Evaluating Methods:

13.1 Compare reasonable methods.

13.2 Use objective tests of assumptions.

13.7 Assess the reliability and validity of the data.

13.8 Provide easy access to the data.

13.17 Examine all important criteria.

13.18 Specify criteria for evaluating methods prior to analyzing data.

13.27 Use ex post error measures to evaluate the effects of policy variables.

Assessing Uncertainty:

14.6 Describe reasons why the forecasts might be wrong.

Presenting Forecasts:

15.1 Present forecasts and supporting data in a simple and understandable form.

15.4 Present prediction intervals.

Learning That Will Improve Forecasting Procedures:

16.2 Seek feedback about forecasts.

16.3 Establish a formal review process for forecasting methods.

Table A.3: Principles not rated due to lack of information in AMD

 

Structuring the problem:

2.5 Structure problems to deal with important interactions among causal variables.

Collecting data:

4.4 Obtain all of the important data

4.5 Avoid the collection of irrelevant data

Preparing Data:

5.1 Clean the data.

5.2 Use transformations as required by expectations.

5.3 Adjust intermittent series.

5.4 Adjust for unsystematic past events.

5.5 Adjust for systematic events.

5.6 Use multiplicative seasonal factors for trended series when you can obtain good estimates for seasonal factors.

5.7 Damp seasonal factors for uncertainty

Selecting Methods:

6.6 Select simple methods unless empirical evidence calls for a more complex approach.

Implementing Methods: General

7.2 The forecasting method should provide a realistic representation of the situation

Implementing Judgmental Methods:

8.4 Provide numerical scales with several categories for experts’ answers.

Implementing Methods: Quantitative Models with Explanatory Variables:

10.3 Rely on theory and domain expertise when specifying directions of relationships.

10.4 Use theory and domain expertise to estimate or limit the magnitude of relationships.

Integrating Judgmental and Quantitative Methods:

11.1 Use structured procedures to integrate judgmental and quantitative methods.

11.2 Use structured judgment as inputs to quantitative models.

11.3 Use pre-specified domain knowledge in selecting, weighting, and modifying quantitative methods.

11.4 Limit subjective adjustments of quantitative forecasts.

Evaluating Methods:

13.4 Describe conditions associated with the forecasting problem.

13.5 Tailor the analysis to the decision.

13.9 Provide full disclosure of methods.

13.11 Test the client’s understanding of the methods.

13.19 Assess face validity.

Assessing Uncertainty:

14.12 Do not assess uncertainty in a traditional (unstructured) group meeting.

Learning That Will Improve Forecasting Procedures:

16.4 Establish a formal review process to ensure that forecasts are used properly.

Table A.4: Principles contravened in H6

 

Setting Objectives:

1.3 Make sure forecasts are independent of politics.

1.4 Consider whether the events or series can be forecasted.

Structuring the problem:

2.6 Structure problems that involve causal chains.

Identify Data Sources:

3.4 Use diverse sources of data.

3.5 Obtain information from similar (analogous) series or cases. Such information may help to estimate trends.

Collecting Data:

4.4 Obtain all of the important data

Preparing Data:

5.2 Use transformations as required by expectations.

5.4 Adjust for unsystematic past events.

5.5 Adjust for systematic events.

Selecting Methods:

6.1 List all the important selection criteria before evaluating methods.

6.2 Ask unbiased experts to rate potential methods.

6.6 Select simple methods unless empirical evidence calls for a more complex approach.

6.7 Match the forecasting method(s) to the situation.

6.8 Compare track records of various forecasting methods.

6.10 Examine the value of alternative forecasting methods.

Implementing Methods: General

7.1 Keep forecasting methods simple.

7.2 The forecasting method should provide a realistic representation of the situation.

7.3 Be conservative in situations of high uncertainty or instability.

7.4 Do not forecast cycles.

Implementing Quantitative Methods:

9.1 Tailor the forecasting model to the horizon.

9.2 Match the model to the underlying phenomena.

9.3 Do not use ”fit” to develop the model.

9.5 Update models frequently.

Implementing Methods: Quantitative Models with Explanatory Variables:

10.2 Use all important variables.

10.5 Use different types of data to measure a relationship.

10.7 Forecast for alternate interventions.

10.9 Shrink the forecasts of change if there is high uncertainty for predictions of the explanatory variables.

Integrating Judgmental and Quantitative Methods:

11.1 Use structured procedures to integrate judgmental and quantitative methods.

11.2 Use structured judgment as inputs to quantitative models.

11.3 Use pre-specified domain knowledge in selecting, weighting, and modifying quantitative methods.

Combining Forecasts:

12.1 Combine forecasts from approaches that differ.

12.2 Use many approaches (or forecasters), preferably at least five.

12.3 Use formal procedures to combine forecasts.

12.8 Combine forecasts when there is uncertainty about which method is best.

12.9 Combine forecasts when you are uncertain about the situation.

12.10 Combine forecasts when it is important to avoid large errors.

Evaluating Methods:

13.1 Compare reasonable methods.

13.2 Use objective tests of assumptions.

13.3 Design test situations to match the forecasting problem.

13.5 Tailor the analysis to the decision.

13.6 Describe potential biases of forecasters.

13.7 Assess the reliability and validity of the data.

13.8 Provide easy access to the data.

13.10 Test assumptions for validity.

13.12 Use direct replications of evaluations to identify mistakes.

13.13 Replicate forecast evaluations to assess their reliability.

13.16 Compare forecasts generated by different methods.

13.17 Examine all important criteria.

13.18 Specify criteria for evaluating methods prior to analyzing data.

13.26 Use out-of-sample (ex ante) error measures.

13.27 Use ex post error measures to evaluate the effects of policy variables.

13.31 Base comparisons of methods on large samples of forecasts.

Assessing Uncertainty:

14.3 Develop prediction intervals by using empirical estimates based on realistic representations of forecasting situations.

14.5 Ensure consistency over the forecast horizon.

14.9 Combine prediction intervals from alternative forecasting methods.

14.10 Use safety factors to adjust for overconfidence in the PIs.

14.11 Conduct experiments to evaluate forecasts.

14.13 Incorporate the uncertainty associated with the prediction of the explanatory variables in the prediction intervals.

14.14 Ask for a judgmental likelihood that a forecast will fall within a pre-defined minimum-maximum interval (not by asking people to set upper and lower confidence levels).

Presenting Forecasts:

15.1 Present forecasts and supporting data in a simple and understandable form.

15.2 Provide complete, simple, and clear explanations of methods.

Table A.5: Principles apparently contravened in H6

 

Setting Objectives:

1.1 Describe decisions that might be affected by the forecasts.

1.2 Prior to forecasting, agree on actions to take assuming different possible forecasts.

Structuring the problem:

2.1 Identify possible outcomes prior to making forecasts.

2.3 Decompose the problem into parts.

Identify Data Sources:

3.2 Ensure that the data match the forecasting situation.

3.3 Avoid biased data sources.

Collecting Data:

4.2 Ensure that information is reliable and that measurement error is low.

4.3 Ensure that the information is valid.

Preparing Data:

5.3 Adjust intermittent series.

5.7 Damp seasonal factors for uncertainty

5.8 Use graphical displays for data.

Implementing Methods: General

7.6 Pool similar types of data.

Implementing Methods: Quantitative Models with Explanatory Variables:

10.4 Use theory and domain expertise to estimate or limit the magnitude of relationships.

10.8 Apply the same principles to forecasts of explanatory variables.

Evaluating Methods:

13.4 Describe conditions associated with the forecasting problem.

13.9 Provide full disclosure of methods.

Assessing Uncertainty:

14.6 Describe reasons why the forecasts might be wrong.

14.7 When assessing PIs, list possible outcomes and assess their likelihoods.

14.8 Obtain good feedback about forecast accuracy and the reasons why errors occurred.

Table A.6: Principles not rated due to lack of information in H6

 

Setting Objectives:

1.5 Obtain decision makers’ agreement on methods

Structuring the problem:

2.7 Decompose time series by level and trend

Identify Data Sources:

3.1 Use theory to guide the search for information on explanatory variables

Collecting Data:

4.1 Use unbiased and systematic procedures to collect data

4.5 Avoid the collection of irrelevant data

Preparing Data:

5.1 Clean the data

Selecting Methods:

6.4 Use quantitative methods rather than qualitative methods

6.5 Use causal methods rather than naive methods if feasible

6.9 Assess acceptability and understandability of methods to users

Evaluating Methods:

13.11 Test the client’s understanding of the methods

13.19 Assess face validity

Presenting Forecasts:

15.3 Describe your assumptions

Learning That Will Improve Forecasting Procedures:

16.2 Seek feedback about forecasts

16.3 Establish a formal review process for forecasting methods

16.4 Establish a formal review process to ensure that forecasts are used properly


One response to ”The report of our extinction was an exaggeration.”

  1. Björntjänst Says:

Thank you, Sophia Albertina.

Al Gore has done the Swedes and the rest of the world a disservice through the dreadful bluff of, together with the IPCC, manipulating forth the claim that global warming is caused by man's CO2 emissions and has thereby contributed to the incipient extinction of the polar bears in the Arctic regions. These false pictures have frightened children all over the world. If Al Gore has any shame in him, he should return the so-called peace prize he received in Oslo, admit the hoax, and apologize for his falsifications together with the IPCC.
