Climate Sensitivity Lower than Model Assumptions

by Kajm

The IPCC uses computer models to make its predictions of the future climate. ALL of these simulations predict hell on Earth, some of them as early as the 2040s.

Every single one of those models has problems: they cannot properly model clouds. They cannot accurately model major ocean-atmospheric effects such as El Niño. They do not take negative feedback mechanisms in the atmosphere into account; in fact, they ASSUME that all feedbacks are positive. Example: more CO2 going into the atmosphere leads to more warming, which leads to more water vapour entering the atmosphere, which leads to more warming, which leads to THE PLANET IS GOING TO COOK.
This is despite the fact that warming has pretty much flat-lined over the past 18 years, and actually shows a very slight cooling trend over the past 13 years.

One of the major problems with computer models of the climate is that they assume extremely high climate sensitivity: as high as 6 to 8 degrees per doubling of CO2 levels (to 560 ppm, something which will not happen for decades, possibly a century, and possibly not at all). After a century of advances in science, those who back the 'man-made' global warming hypothesis do not seem to have been able to narrow this range down.
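
To see why that 'per doubling' number matters so much, here is a minimal Python sketch of my own. It uses the standard rule of thumb that equilibrium warming scales with the logarithm of the CO2 ratio; the sensitivity values and the 280/560 ppm concentrations are assumptions for illustration, not anything from the articles below:

```python
import math

def equilibrium_warming(c_final_ppm, c_initial_ppm=280.0, ecs_per_doubling=3.0):
    """Equilibrium warming (deg C) for a CO2 change, assuming warming scales with
    the number of doublings: dT = ECS * log2(C_final / C_initial).
    The ECS value and the concentrations are illustrative assumptions."""
    doublings = math.log(c_final_ppm / c_initial_ppm, 2)
    return ecs_per_doubling * doublings

# One and the same CO2 pathway (560 ppm = a doubling of the ~280 ppm preindustrial
# level), very different outcomes depending on the sensitivity you assume:
for ecs in (1.5, 2.0, 3.0, 4.5, 6.0):
    dT = equilibrium_warming(560.0, ecs_per_doubling=ecs)
    print(f"assumed sensitivity {ecs:.1f} C/doubling -> warming at 560 ppm = {dT:.1f} C")
```

Same CO2 pathway, wildly different outcomes; that is why the sensitivity number is the whole ballgame.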

However, those who have chosen to work with real-life observations and data have, over the past 5 years or so, come out with much lower climate sensitivity values. The IPCC, activist scientists, pro-AGW alarmists and politicians have chosen to demonize these people rather than pay attention to what is becoming increasingly obvious: the Earth has a great many mechanisms with which it controls and constrains the atmosphere.

I've touched on a few of those mechanisms from time to time, and really need to get some more examples of them up for you. I'll keep trying. In the meantime, climate scientist Judith Curry has recently written an op-ed on climate sensitivity which may help you better understand the points I try to get across. Following her discussion is another article which relates to the above chart and links directly to 14 peer-reviewed papers, all of which indicate that climate sensitivity is actually fairly low compared to the high numbers the IPCC and others use to create their 'WE'RE ALL GONNA DIIIIIIIIIIIIIIIIIIIIIIIE!' scenarios.

Note: because Judith Curry does not advocate what pro-AGW alarmists want (high taxes, shutting down major portions of economies world-wide, leaving coal and oil in the ground, limiting YOUR choices in every aspect of YOUR lives, depopulation of the planet), she is demonized as a 'climate denier.'

My WSJ op-ed: Global warming statistical meltdown

Posted on October 9, 2014

by Judith Curry

At the recent United Nations Climate Summit, Secretary-General Ban Ki-moon warned that “Without significant cuts in emissions by all countries, and in key sectors, the window of opportunity to stay within less than 2 degrees [of warming] will soon close forever.” Actually, this window of opportunity may remain open for quite some time. A growing body of evidence suggests that the climate is less sensitive to increases in carbon-dioxide emissions than policy makers generally assume—and that the need for reductions in such emissions is less urgent.

According to the U.N. Framework Convention on Climate Change, preventing “dangerous human interference” with the climate is defined, rather arbitrarily, as limiting warming to no more than 2 degrees Celsius (3.6 degrees Fahrenheit) above preindustrial temperatures. The Earth’s surface temperatures have already warmed about 0.8 degrees Celsius since 1850-1900. This leaves 1.2 degrees Celsius (about 2.2 degrees Fahrenheit) to go.

In its most optimistic projections, which assume a substantial decline in emissions, the Intergovernmental Panel on Climate Change (IPCC) projects that the “dangerous” level might never be reached. In its most extreme, pessimistic projections, which assume heavy use of coal and rapid population growth, the threshold could be exceeded as early as 2040. But these projections reflect the effects of rising emissions on temperatures simulated by climate models, which are being challenged by recent observations.

Human-caused warming depends not only on increases in greenhouse gases but also on how “sensitive” the climate is to these increases. Climate sensitivity is defined as the global surface warming that occurs when the concentration of carbon dioxide in the atmosphere doubles. If climate sensitivity is high, then we can expect substantial warming in the coming century as emissions continue to increase. If climate sensitivity is low, then future warming will be substantially lower, and it may be several generations before we reach what the U.N. considers a dangerous level, even with high emissions.

The IPCC’s latest report (published in 2013) concluded that the actual temperature change over 70 years if carbon-dioxide concentrations double, called the transient climate response, is likely in the range of 1 to 2.5 degrees Celsius. Most climate models have transient climate response values exceeding 1.8 degrees Celsius. But the IPCC report notes the substantial discrepancy between recent observation-based estimates of climate sensitivity and estimates from climate models.

Nicholas Lewis and I have just published a study in Climate Dynamics that shows the best estimate for transient climate response is 1.33 degrees Celsius with a likely range of 1.05-1.80 degrees Celsius. Using an observation-based energy-balance approach, our calculations used the same data for the effects on the Earth’s energy balance of changes in greenhouse gases, aerosols and other drivers of climate change given by the IPCC’s latest report.

We also estimated what the long-term warming from a doubling of carbon-dioxide concentrations would be, once the deep ocean had warmed up. Our estimates of sensitivity, both over a 70-year time-frame and long term, are far lower than the average values of sensitivity determined from global climate models that are used for warming projections. Also our ranges are narrower, with far lower upper limits than reported by the IPCC’s latest report. Even our upper limits lie below the average values of climate models.

Our paper is not an outlier. More than a dozen other observation-based studies have found climate sensitivity values lower than those determined using global climate models, including recent papers published in Environmetrics (2012), Nature Geoscience (2013) and Earth System Dynamics (2014). These new climate sensitivity estimates add to the growing evidence that climate models are running “too hot.” Moreover, the estimates in these empirical studies are being borne out by the much-discussed “pause” or “hiatus” in global warming—the period since 1998 during which global average surface temperatures have not significantly increased.

This pause in warming is at odds with the 2007 IPCC report, which expected warming to increase at a rate of 0.2 degrees Celsius per decade in the early 21st century. The warming hiatus, combined with assessments that the climate-model sensitivities are too high, raises serious questions as to whether the climate-model projections of 21st century temperatures are fit for making public policy decisions.

The sensitivity of the climate to increasing concentrations of carbon dioxide is a central question in the debate on the appropriate policy response to increasing carbon dioxide in the atmosphere. Climate sensitivity and estimates of its uncertainty are key inputs into the economic models that drive cost-benefit analyses and estimates of the social cost of carbon.

Continuing to rely on climate-model warming projections based on high, model-derived values of climate sensitivity skews the cost-benefit analyses and estimates of the social cost of carbon. This can bias policy decisions. The implication of the lower values of climate sensitivity in our paper, as well as of other similar recent studies, is that human-caused warming near the end of the 21st century should be less than the 2-degrees-Celsius “danger” level for all but the IPCC’s most extreme emission scenario.

This slower rate of warming—relative to climate model projections—means there is less urgency to phase out greenhouse gas emissions now, and more time to find ways to decarbonize the economy affordably. It also allows us the flexibility to revise our policies as further information becomes available.

-------

Climate Sensitivity Papers:

 

The new paper (report here) by independent UK-based climate researcher Nicholas Lewis and Professor Judith Curry of the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology derives estimates of the warming effect of CO2 (climate sensitivity) using observational data. This raises the possibility that projections made by the Intergovernmental Panel on Climate Change (IPCC) in its most recent scientific report (known as AR5) for future temperature trends may be too high because they rely on over-estimates of the Earth's climate sensitivity.

In their Climate Dynamics paper, entitled “The implications for climate sensitivity of AR5 forcing and heat uptake estimates”, the authors write that their estimates are “less dependent on global climate models” and so allow more realistically for the uncertainties in the warming effect (or forcing) of CO2 compared with similar estimates which use forcings diagnosed from simulations by such models.

This is the latest in a series of papers to highlight the possibility that IPCC climate sensitivity estimates may be too high. Most of the sensitivities estimated in these papers are 40% below the average climate sensitivity of the models used by the IPCC, according to a feature on the Cato website co-authored by climate scientist and Cato director Patrick Michaels.

Below we reproduce the Cato feature that lists these papers, and below that we reproduce the abstracts of, and links to, all 14 papers on lower climate sensitivity.

Cato Feature

Here is the Cato Institute feature reproduced from its website:

September 25, 2014 5:09PM

The Collection of Evidence for a Low Climate Sensitivity Continues to Grow

By Patrick J. Michaels and Paul C. "Chip" Knappenberger

Nic Lewis and Judith Curry just published a blockbuster paper that pegs the earth’s equilibrium climate sensitivity—how much the earth’s average surface temperature is expected to rise in association with a doubling of the atmosphere’s carbon dioxide concentration—at 1.64°C (1.05°C to 4.05°C, 90% range), a value that is nearly half of the number underpinning all of President Obama’s executive actions under his Climate Action Plan.

This finding will not stop the President and the EPA from imposing more limits on greenhouse-gas emissions from fossil fuels. A wealth of similar findings have appeared in the scientific literature beginning in 2011 (see below) and they, too, have failed to dissuade him from his legacy mission.

The publication of the Lewis and Curry paper, along with another by Ragnhild Skeie and colleagues, brings the number of recent low-sensitivity climate publications to 14, by 42 authors from around the world (this doesn’t count our 2002 paper on the topic, “Revised 21st Century Temperature Projections”).  Most of these sensitivities are a good 40% below the average climate sensitivity of the models used by the U.N.’s Intergovernmental Panel on Climate Change (IPCC).

Lewis and Curry arrive at their lower equilibrium climate sensitivity estimate by using updated compilations of the earth’s observed temperature change, oceanic heat uptake, and the magnitude of human emissions, some of which should cause warming (e.g., greenhouse gases), while others should cause cooling (e.g., sulfate aerosols). They try to factor out “natural variability.” By comparing values of these parameters from the mid-19th century to now, they can estimate how much the earth warmed in association with human greenhouse gas emissions.

The estimate is not perfect, as there are plenty of uncertainties, some of which may never be completely resolved. But, nevertheless, Lewis and Curry have generated  a very robust observation-based estimate of the equilibrium climate sensitivity.

For those interested in the technical details, and a much more thorough description of the research, author Nic Lewis takes you through the paper (here) and has made a pre-print copy of the paper freely available (here).

In the chart below, we’ve added the primary findings of Lewis and Curry as well as those of Skeie et al. to the collection of 12 other low-sensitivity papers published since 2010 that conclude that the best estimate for the earth’s climate sensitivity lies below the IPCC estimates. We’ve also included in our Figure both the IPCC’s  subjective and model-based characteristics of the equilibrium climate sensitivity. For those wondering, there are very few recent papers arguing that the IPCC estimates are too low, and they all have to contend with the fact that, according to new Cato scholar Ross McKitrick, “the pause” in warming is actually 19 years in length. 

References:

Aldrin, M., et al., 2012. Bayesian estimation of climate sensitivity based on a simple climate model fitted to observations of hemispheric temperature and global ocean heat content. Environmetrics, doi: 10.1002/env.2140.

Annan, J.D., and J.C. Hargreaves, 2011. On the generation and interpretation of probabilistic estimates of climate sensitivity. Climatic Change, 104, 423-436.

Hargreaves, J.C., et al., 2012. Can the Last Glacial Maximum constrain climate sensitivity? Geophysical Research Letters, 39, L24702, doi: 10.1029/2012GL053872

Lewis, N. 2013. An objective Bayesian, improved approach for applying optimal fingerprint techniques to estimate climate sensitivity. Journal of Climate, doi: 10.1175/JCLI-D-12-00473.1.

Lewis, N., and J.A. Curry, 2014. The implications for climate sensitivity of AR5 forcing and heat uptake estimates. Climate Dynamics, doi: 10.1007/s00382-014-2342-y.

Lindzen, R.S., and Y-S. Choi, 2011. On the observational determination of climate sensitivity and its implications. Asia-Pacific Journal of Atmospheric Science, 47, 377-390.

Loehle, C., 2014. A minimal model for estimating climate sensitivity. Ecological Modelling, 276, 80-84.

Masters, T., 2013. Observational estimates of climate sensitivity from changes in the rate of ocean heat uptake and comparison to CMIP5 models. Climate Dynamics, doi: 10.1007/s00382-013-1770-4.

McKitrick, R., 2014. HAC-Robust Measurement of the Duration of a Trendless Subsample in a Global Climate Time Series. Open Journal of Statistics, 4, 527-535. doi: 10.4236/ojs.2014.47050.

Michaels, P.J., et al., 2002. Revised 21st century temperature projections. Climate Research, 23, 1-9.

Otto, A., F. E. L. Otto, O. Boucher, J. Church, G. Hegerl, P. M. Forster, N. P. Gillett, J. Gregory, G. C. Johnson, R. Knutti, N. Lewis, U. Lohmann, J. Marotzke, G. Myhre, D. Shindell, B. Stevens, and M. R. Allen, 2013. Energy budget constraints on climate response. Nature Geoscience, 6, 415-416.

Ring, M.J., et al., 2012. Causes of the global warming observed since the 19th century. Atmospheric and Climate Sciences, 2, 401-415, doi: 10.4236/acs.2012.24035.

Schmittner, A., et al., 2011. Climate sensitivity estimated from temperature reconstructions of the Last Glacial Maximum. Science, 334, 1385-1388, doi: 10.1126/science.1203513.

Skeie,  R. B., T. Berntsen, M. Aldrin, M. Holden, and G. Myhre, 2014. A lower and more constrained estimate of climate sensitivity using updated observations and detailed radiative forcing time series. Earth System Dynamics, 5, 139–175.

Spencer, R. W., and W. D. Braswell, 2013. The role of ENSO in global ocean temperature changes during 1955-2011 simulated with a 1D climate model. Asia-Pacific Journal of Atmospheric Science, doi:10.1007/s13143-014-0011-z.

van Hateren, J.H., 2012. A fractal climate response function can simulate global average temperature trends of the modern era and the past millennium. Climate Dynamics,  doi: 10.1007/s00382-012-1375-3.

End of CATO feature.

Abstracts Links and Citations

Here are abstracts, citations and links to all these papers:

Aldrin, M., et al., 2012.

Bayesian estimation of climate sensitivity based on a simple climate model fitted to observations of hemispheric temperature and global ocean heat content. Environmetrics, doi: 10.1002/env.2140.

Abstract

Predictions of climate change are uncertain mainly because of uncertainties in the emissions of greenhouse gases and how sensitive the climate is to changes in the abundance of the atmospheric constituents. The equilibrium climate sensitivity is defined as the temperature increase because of a doubling of the CO2 concentration in the atmosphere when the climate reaches a new steady state. CO2 is only one out of the several external factors that affect the global temperature, called radiative forcing mechanisms as a collective term. In this paper, we present a model framework for estimating the climate sensitivity. The core of the model is a simple, deterministic climate model based on elementary physical laws such as energy balance. It models yearly hemispheric surface temperature and global ocean heat content as a function of historical radiative forcing. This deterministic model is combined with an empirical, stochastic model and fitted to observations on global temperature and ocean heat content, conditioned on estimates of historical radiative forcing. We use a Bayesian framework, with informative priors on a subset of the parameters and flat priors on the climate sensitivity and the remaining parameters. The model is estimated by Markov Chain Monte Carlo techniques. Copyright © 2012 John Wiley & Sons, Ltd.
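
To give a feel for the kind of machinery described there (a simple deterministic energy-balance core fitted to observations with Bayesian Markov Chain Monte Carlo), here is a toy Python sketch of my own. The forcing ramp, noise level, heat capacity and 'true' sensitivity are all invented for illustration; this is nowhere near the hemispheric, ocean-heat-content-constrained model of the actual paper:

```python
import numpy as np

rng = np.random.default_rng(0)
F2X = 3.7                    # assumed forcing from a CO2 doubling, W/m^2
HEAT_CAP = 8.0               # assumed effective heat capacity, W yr m^-2 K^-1
years = np.arange(160)
forcing = np.linspace(0.0, 2.5, years.size)     # idealized forcing ramp, W/m^2

def simulate(ecs):
    """One-box energy balance, C dT/dt = F - (F2X/ECS) * T, Euler-stepped yearly."""
    lam = F2X / ecs
    temps = np.zeros(years.size)
    for i in range(1, years.size):
        temps[i] = temps[i-1] + (forcing[i-1] - lam * temps[i-1]) / HEAT_CAP
    return temps

# Synthetic "observations": a true ECS of 1.8 C plus observational noise.
sigma = 0.08
obs = simulate(1.8) + rng.normal(0.0, sigma, years.size)

def log_post(ecs):
    """Flat prior on ECS over (0.1, 10) and a Gaussian likelihood for the residuals."""
    if not 0.1 < ecs < 10.0:
        return -np.inf
    resid = obs - simulate(ecs)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Plain Metropolis random walk over the single unknown parameter, ECS.
samples = []
current = 3.0
lp = log_post(current)
for _ in range(20000):
    proposal = current + rng.normal(0.0, 0.15)
    lp_prop = log_post(proposal)
    if np.log(rng.random()) < lp_prop - lp:
        current, lp = proposal, lp_prop
    samples.append(current)

post = np.array(samples[5000:])   # discard burn-in
print(f"posterior median ECS ~ {np.median(post):.2f} C, 90% interval "
      f"[{np.percentile(post, 5):.2f}, {np.percentile(post, 95):.2f}] C")
```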

Read the abstract and get the paper here.

Annan, J.D., and J.C. Hargreaves, 2011.

On the generation and interpretation of probabilistic estimates of climate sensitivity. Climatic Change, 104, 423-436.

Abstract

The equilibrium climate response to anthropogenic forcing has long been one of the dominant, and therefore most intensively studied, uncertainties in predicting future climate change. As a result, many probabilistic estimates of the climate sensitivity (S) have been presented. In recent years, most of them have assigned significant probability to extremely high sensitivity, such as P(S > 6C) > 5%. In this paper, we investigate some of the assumptions underlying these estimates. We show that the popular choice of a uniform prior has unacceptable properties and cannot be reasonably considered to generate meaningful and usable results. When instead reasonable assumptions are made, much greater confidence in a moderate value for S is easily justified, with an upper 95% probability limit for S easily shown to lie close to 4°C, and certainly well below 6°C. These results also impact strongly on projected economic losses due to climate change.
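
Their point about priors can be shown numerically. The toy sketch below (my own setup, not the paper's) assumes the data constrain the feedback parameter lambda = F2x/S with Gaussian uncertainty, which automatically produces a fat upper tail in S, and then compares a uniform prior on S with a moderately informative one. The numbers are illustrative assumptions only:

```python
import numpy as np

F2X = 3.7                                  # assumed forcing per CO2 doubling, W/m^2
S = np.linspace(0.1, 20.0, 4000)           # sensitivity grid, deg C per doubling
dS = S[1] - S[0]

# Assumed likelihood: the observations constrain lambda = F2X/S with Gaussian error.
lam_mean, lam_sd = 1.25, 0.5               # illustrative numbers only
likelihood = np.exp(-0.5 * ((F2X / S - lam_mean) / lam_sd) ** 2)

def posterior(prior):
    p = prior * likelihood
    return p / (p.sum() * dS)              # normalize on the grid

# A uniform prior on S versus a moderately informative lognormal prior centred near 3 C.
priors = {
    "uniform on S": np.ones_like(S),
    "lognormal, median 3 C": np.exp(-0.5 * ((np.log(S) - np.log(3.0)) / 0.5) ** 2) / S,
}

for name, prior in priors.items():
    post = posterior(prior)
    p_above_6 = post[S > 6.0].sum() * dS
    cdf = np.cumsum(post) * dS
    s95 = S[np.searchsorted(cdf, 0.95)]
    print(f"{name:>22}: P(S > 6 C) = {p_above_6:.2f}, 95th percentile ~ {s95:.1f} C")
```

The likelihood is the same in both cases; only the prior changes, and with it the upper tail.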

Read the abstract and get the paper here.

Hargreaves, J.C., et al., 2012.

Can the Last Glacial Maximum constrain climate sensitivity? Geophysical Research Letters, 39, L24702, doi: 10.1029/2012GL053872

Abstract

We investigate the relationship between the Last Glacial Maximum (LGM) and climate sensitivity across the PMIP2 multi-model ensemble of GCMs, and find a correlation between tropical temperature and climate sensitivity which is statistically significant and physically plausible. We use this relationship, together with the LGM temperature reconstruction of Annan and Hargreaves (2012), to generate estimates for the equilibrium climate sensitivity. We estimate the equilibrium climate sensitivity to be about 2.5°C with a high probability of being under 4°C, though these results are subject to several important caveats. The forthcoming PMIP3/CMIP5 models were not considered in this analysis, as very few LGM simulations are currently available from these models. We propose that these models will provide a useful validation of the correlation presented here.
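
The technique here is what is called an 'emergent constraint': regress climate sensitivity on some observable quantity across a model ensemble, then read off the sensitivity implied by the real-world value of that observable. A toy Python version, with invented ensemble numbers purely for illustration (not actual PMIP2 output), looks like this:

```python
import numpy as np

# Invented ensemble values, purely for illustration: each pair is
# (simulated LGM tropical cooling in deg C, that model's equilibrium climate sensitivity).
lgm_cooling = np.array([1.8, 2.2, 2.6, 3.0, 3.4, 3.9, 4.3])
model_ecs   = np.array([2.1, 2.4, 2.7, 3.1, 3.3, 3.8, 4.2])

# Fit the cross-ensemble relationship ECS = a * cooling + b.
a, b = np.polyfit(lgm_cooling, model_ecs, 1)

# Plug in an assumed reconstructed LGM tropical cooling (illustrative value).
reconstructed_cooling = 2.5
ecs_constrained = a * reconstructed_cooling + b

scatter = np.std(model_ecs - (a * lgm_cooling + b), ddof=2)
print(f"constrained ECS ~ {ecs_constrained:.1f} C "
      f"(regression scatter ~ +/-{scatter:.1f} C, before reconstruction uncertainty)")
```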

Read the abstract and get the paper here.

Lewis, N. 2013.

An objective Bayesian, improved approach for applying optimal fingerprint techniques to estimate climate sensitivity. Journal of Climate, doi: 10.1175/JCLI-D-12-00473.1.

Abstract

A detailed reanalysis is presented of a “Bayesian” climate parameter study (as exemplified by Forest et al.) that estimates climate sensitivity (ECS) jointly with effective ocean diffusivity and aerosol forcing, using optimal fingerprints to compare multidecadal observations with simulations by the Massachusetts Institute of Technology 2D climate model at varying settings of the three climate parameters. Use of improved methodology primarily accounts for the 90% confidence bounds for ECS reducing from 2.1–8.9 K to 2.0–3.6 K. The revised methodology uses Bayes's theorem to derive a probability density function (PDF) for the whitened (made independent using an optimal fingerprint transformation) observations, for which a uniform prior is known to be noninformative. A dimensionally reducing change of variables onto the parameter surface is then made, deriving an objective joint PDF for the climate parameters. The PDF conversion factor from the whitened variables space to the parameter surface represents a noninformative joint parameter prior, which is far from uniform. The noninformative prior prevents more probability than data uncertainty distributions warrant being assigned to regions where data respond little to parameter changes, producing better-constrained PDFs. Incorporating 6 years of unused model simulation data and revising the experimental design to improve diagnostic power reduces the best-fit climate sensitivity. Employing the improved methodology, preferred 90% bounds of 1.2–2.2 K for ECS are then derived (mode and median 1.6 K). The mode is identical to those from Aldrin et al. and [using the same Met Office Hadley Centre Climate Research Unit temperature, version 4 (HadCRUT4), observational dataset] from Ring et al. Incorporating nonaerosol forcing and observational surface temperature uncertainties, unlike in the original study, widens the 90% range to 1.0–3.0 K.

Read the abstract and get the paper here.

Lewis, N., and J.A. Curry, 2014.

The implications for climate sensitivity of AR5 forcing and heat uptake estimates. Climate Dynamics, doi: 10.1007/s00382-014-2342-y.

Energy budget estimates of equilibrium climate sensitivity (ECS) and transient climate response (TCR) are derived using the comprehensive 1750–2011 time series and the uncertainty ranges for forcing components provided in the Intergovernmental Panel on Climate Change Fifth Assessment Working Group I Report, along with its estimates of heat accumulation in the climate system. The resulting estimates are less dependent on global climate models and allow more realistically for forcing uncertainties than similar estimates based on forcings diagnosed from simulations by such models. Base and final periods are selected that have well matched volcanic activity and influence from internal variability. Using 1859–1882 for the base period and 1995–2011 for the final period, thus avoiding major volcanic activity, median estimates are derived for ECS of 1.64K and for TCR of 1.33K. ECS 17–83 and 5–95% uncertainty ranges are 1.25–2.45 and 1.05–4.05K; the corresponding TCR ranges are 1.05–1.80 and 0.90–2.50K. Results using alternative well-matched base and final periods provide similar best estimates but give wider uncertainty ranges, principally reflecting smaller changes in average forcing. Uncertainty in aerosol forcing is the dominant contribution to the ECS and TCR uncertainty ranges.
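
Stripped to its core, the energy-budget method behind these numbers is two ratios: TCR is roughly F2x times the temperature change divided by the forcing change, and ECS is the same thing with the ongoing heat uptake subtracted from the forcing change first. Here is a Python sketch using round, illustrative inputs of roughly the right magnitude, not the paper's actual data:

```python
F2X = 3.71     # assumed forcing from a doubling of CO2, W/m^2

def energy_budget(dT, dF, dQ, f2x=F2X):
    """Simple energy-budget estimators (an illustration, not the paper's full method):
    TCR scales the warming by the forcing change alone; ECS first removes the heat
    still flowing into the system (mostly the ocean) from the forcing change."""
    tcr = f2x * dT / dF
    ecs = f2x * dT / (dF - dQ)
    return tcr, ecs

# Round, illustrative changes between a 19th-century base period and a recent final period:
dT = 0.75      # K,     change in global mean surface temperature
dF = 2.0       # W/m^2, change in total radiative forcing
dQ = 0.5       # W/m^2, change in system heat uptake (planetary imbalance)

tcr, ecs = energy_budget(dT, dF, dQ)
print(f"TCR ~ {tcr:.2f} K, ECS ~ {ecs:.2f} K (with these assumed inputs)")
```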

Read the abstract and get the paper here.

Our report here.

Lindzen, R.S., and Y-S. Choi, 2011.

On the observational determination of climate sensitivity and its implications. Asia-Pacific Journal of Atmospheric Science, 47, 377-390.

Abstract

We estimate climate sensitivity from observations, using the deseasonalized fluctuations in sea surface temperatures (SSTs) and the concurrent fluctuations in the top-of-atmosphere (TOA) outgoing radiation from the ERBE (1985–1999) and CERES (2000–2008) satellite instruments. Distinct periods of warming and cooling in the SSTs were used to evaluate feedbacks. An earlier study (Lindzen and Choi, 2009) was subject to significant criticisms. The present paper is an expansion of the earlier paper where the various criticisms are taken into account. The present analysis accounts for the 72 day precession period for the ERBE satellite in a more appropriate manner than in the earlier paper. We develop a method to distinguish noise in the outgoing radiation as well as radiation changes that are forcing SST changes from those radiation changes that constitute feedbacks to changes in SST. We demonstrate that our new method does moderately well in distinguishing positive from negative feedbacks and in quantifying negative feedbacks. In contrast, we show that simple regression methods used by several existing papers generally exaggerate positive feedbacks and even show positive feedbacks when actual feedbacks are negative. We argue that feedbacks are largely concentrated in the tropics, and the tropical feedbacks can be adjusted to account for their impact on the globe as a whole. Indeed, we show that including all CERES data (not just from the tropics) leads to results similar to what are obtained for the tropics alone — though with more noise. We again find that the outgoing radiation resulting from SST fluctuations exceeds the zero-feedback response thus implying negative feedback. In contrast to this, the calculated TOA outgoing radiation fluxes from 11 atmospheric models forced by the observed SST are less than the zero-feedback response, consistent with the positive feedbacks that characterize these models. The results imply that the models are exaggerating climate sensitivity.
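
Their statistical point (that naively regressing outgoing radiation against temperature can badly understate a negative feedback when part of the radiative variability is itself driving the temperature) can be reproduced with a toy simulation. The sketch below is my own minimal illustration with made-up parameters, not the ERBE/CERES analysis in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
lam_true = 3.0      # assumed true net feedback parameter (restoring), W m^-2 K^-1
heat_cap = 30.0     # assumed mixed-layer heat capacity, W month m^-2 K^-1
n = 2000            # months

# Internal radiative variability (e.g. clouds) as an AR(1) process: it perturbs the
# energy budget AND, by accumulating in the ocean, drives the temperature.
N = np.zeros(n)
for t in range(1, n):
    N[t] = 0.9 * N[t-1] + rng.normal(0.0, 0.6)

# Net downward TOA flux anomaly R and the temperature it drives.
T = np.zeros(n)
R = np.zeros(n)
for t in range(1, n):
    R[t-1] = N[t-1] - lam_true * T[t-1]      # forcing noise minus the feedback response
    T[t] = T[t-1] + R[t-1] / heat_cap        # mixed layer integrates the imbalance
R[-1] = N[-1] - lam_true * T[-1]

# Naive approach: regress R on T and read the feedback off the slope (lambda ~ -slope).
slope = np.polyfit(T, R, 1)[0]
print(f"true feedback       : {lam_true:.2f} W m^-2 K^-1")
print(f"regression estimate : {-slope:.2f} W m^-2 K^-1")
# In this extreme toy case ALL the temperature variability is radiatively driven, so the
# regression estimate collapses toward zero and the diagnosed sensitivity (F2x/lambda)
# blows up; mixing in non-radiative variability shrinks the bias but does not remove it.
```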

Get the abstract and read the paper here.

Loehle, C., 2014.

A minimal model for estimating climate sensitivity. Ecological Modelling, 276, 80-84.

Abstract

Climate sensitivity summarizes the net effect of a change in forcing on Earth's surface temperature. Estimates based on energy balance calculations give generally lower values for sensitivity (<2°C per doubling of forcing) than those based on general circulation models, but utilize uncertain historical data and make various assumptions about forcings. A minimal model was used that has the fewest possible assumptions and the least data uncertainty. Using only the historical surface temperature record, the periodic temperature oscillations often associated with the Pacific Decadal Oscillation and Atlantic Multidecadal Oscillation were estimated and subtracted from the surface temperature data, leaving a linear warming trend identified as an anthropogenic signal. This estimated rate of warming was related to the fraction of a log CO2 doubling from 1959 to 2013 to give an estimated transient sensitivity of 1.093°C (0.96–1.23°C 95% confidence limits) and equilibrium climate sensitivity of 1.99°C (1.75–2.23°C). It is argued that higher estimates derived from climate models are incorrect because they disagree with empirical estimates.
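
The recipe compresses nicely into code: fit and remove a multidecadal oscillation, treat the leftover linear trend as the anthropogenic signal, and scale it by the fraction of a CO2 doubling realized over the period. The series below is synthetic and the CO2 endpoints are approximate Mauna Loa values; this is an illustration of the recipe, not a re-run of the paper:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)
years = np.arange(1959, 2014)

# Synthetic stand-in for a surface temperature series: a ~60-year oscillation,
# a linear anthropogenic-style trend, and noise (illustrative, not HadCRUT).
true_trend = 0.007                                   # deg C per year
temp = (0.1 * np.sin(2 * np.pi * (years - 1975) / 60.0)
        + true_trend * (years - years[0])
        + rng.normal(0.0, 0.07, years.size))

def model(t, amp, phase, trend, offset):
    """60-year sinusoid (period held fixed, as a simplification) plus a line."""
    return amp * np.sin(2 * np.pi * t / 60.0 + phase) + trend * t + offset

p, _ = curve_fit(model, years - years[0], temp, p0=[0.1, 0.0, 0.005, 0.0])
fitted_trend = p[2]                                  # deg C per year

# Scale the residual trend by the fraction of a CO2 doubling over the period.
co2_1959, co2_2013 = 316.0, 396.0                    # ppm, approximate Mauna Loa values
fraction_of_doubling = np.log(co2_2013 / co2_1959) / np.log(2.0)
warming_over_period = fitted_trend * (years[-1] - years[0])
transient_sensitivity = warming_over_period / fraction_of_doubling

print(f"fitted anthropogenic trend ~ {fitted_trend * 100:.2f} C/century")
print(f"fraction of a doubling 1959-2013 ~ {fraction_of_doubling:.2f}")
print(f"implied transient sensitivity ~ {transient_sensitivity:.2f} C per doubling")
```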

Get the abstract and read the paper here.

Our report here.

Masters, T., 2013.

Observational estimates of climate sensitivity from changes in the rate of ocean heat uptake and comparison to CMIP5 models. Climate Dynamics, doi: 10.1007/s00382-013-1770-4

Abstract

Climate sensitivity is estimated based on 0–2,000 m ocean heat content and surface temperature observations from the second half of the 20th century and first decade of the 21st century, using a simple energy balance model and the change in the rate of ocean heat uptake to determine the radiative restoration strength over this time period. The relationship between this 30–50 year radiative restoration strength and longer term effective sensitivity is investigated using an ensemble of 32 model configurations from the Coupled Model Intercomparison Project phase 5 (CMIP5), suggesting a strong correlation between the two. The mean radiative restoration strength over this period for the CMIP5 members examined is 1.16 Wm−2K−1, compared to 2.05 Wm−2K−1 from the observations. This suggests that temperature in these CMIP5 models may be too sensitive to perturbations in radiative forcing, although this depends on the actual magnitude of the anthropogenic aerosol forcing in the modern period. The potential change in the radiative restoration strength over longer timescales is also considered, resulting in a likely (67 %) range of 1.5–2.9 K for equilibrium climate sensitivity, and a 90 % confidence interval of 1.2–5.1 K.

Get the abstract and read the paper here.

McKitrick, R., 2014.

HAC-Robust Measurement of the Duration of a Trendless Subsample in a Global Climate Time Series. Open Journal of Statistics, 4, 527-535. doi: 10.4236/ojs.2014.47050.

Abstract

The IPCC has drawn attention to an apparent leveling-off of globally-averaged temperatures over the past 15 years or so. Measuring the duration of the hiatus has implications for determining if the underlying trend has changed, and for evaluating climate models. Here, I propose a method for estimating the duration of the hiatus that is robust to unknown forms of heteroskedasticity and autocorrelation (HAC) in the temperature series and to cherry-picking of endpoints. For the specific case of global average temperatures I also add the requirement of spatial consistency between hemispheres. The method makes use of the Vogelsang-Franses (2005) HAC-robust trend variance estimator which is valid as long as the underlying series is trend stationary, which is the case for the data used herein. Application of the method shows that there is now a trendless interval of 19 years duration at the end of the HadCRUT4 surface temperature series, and of 16 - 26 years in the lower troposphere. Use of a simple AR1 trend model suggests a shorter hiatus of 14 - 20 years but is likely unreliable.
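
The flavour of the calculation can be shown with a simpler tool than the Vogelsang-Franses estimator the paper actually uses: ordinary least squares with Newey-West (HAC) standard errors, scanning backwards from the end of a series for the longest window whose trend is statistically indistinguishable from zero. The series below is synthetic, and the swap of Newey-West for Vogelsang-Franses is my own simplification:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)

# Synthetic annual series: warming up to the late 1990s, then a flat stretch,
# with autocorrelated noise (purely illustrative, not HadCRUT4).
years = np.arange(1970, 2015)
signal = np.where(years < 1998, 0.02 * (years - 1970), 0.02 * (1998 - 1970))
noise = np.zeros(years.size)
for t in range(1, years.size):
    noise[t] = 0.5 * noise[t-1] + rng.normal(0.0, 0.08)
temp = signal + noise

def trend_is_zero(y, maxlags=4, alpha=0.05):
    """OLS trend with Newey-West (HAC) standard errors; True if the trend is
    not significantly different from zero at the given level."""
    t = np.arange(y.size, dtype=float)
    X = sm.add_constant(t)
    fit = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": maxlags})
    return fit.pvalues[1] > alpha

# Scan backwards from the end for the longest trendless window.
longest = 0
for n in range(10, years.size + 1):
    if trend_is_zero(temp[-n:]):
        longest = n
print(f"longest trendless window ending in {years[-1]}: ~{longest} years")
```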

Get the abstract and read the paper here.

Our report here.

Michaels, P.J., et al., 2002.

Revised 21st century temperature projections. Climate Research, 23, 1-9.

Abstract

Temperature projections for the 21st century made in the Third Assessment Report (TAR) of the United Nations Intergovernmental Panel on Climate Change (IPCC) indicate a rise of 1.4 to 5.8°C for 1990–2100. However, several independent lines of evidence suggest that the projections at the upper end of this range are not well supported. Since the publication of the TAR, several findings have appeared in the scientific literature that challenge many of the assumptions that generated the TAR temperature range. Incorporating new findings on the radiative forcing of black carbon (BC) aerosols, the magnitude of the climate sensitivity, and the strength of the climate/carbon cycle feedbacks into a simple upwelling diffusion/energy balance model similar to the one that was used in the TAR, we find that the range of projected warming for the 1990–2100 period is reduced to 1.1–2.8°C. When we adjust the TAR emissions scenarios to include an atmospheric CO2 pathway that is based upon observed CO2 increases during the past 25 yr, we find a warming range of 1.5–2.6°C prior to the adjustments for the new findings. Factoring in these findings along with the adjusted CO2 pathway reduces the range to 1.0–1.6°C. And thirdly, a simple empirical adjustment to the average of a large family of models, based upon observed changes in temperature, yields a warming range of 1.3–3.0°C, with a central value of 1.9°C. The constancy of these somewhat independent results encourages us to conclude that 21st century warming will be modest and near the low end of the IPCC TAR projections.

Get the abstract and read the paper here.

Otto, A., et al., 2013.

Energy budget constraints on climate response. Nature Geoscience, 6, 415-416.

Abstract

The rate of global mean warming has been lower over the past decade than previously. It has been argued that this observation might require a downwards revision of estimates of equilibrium climate sensitivity, that is, the long-term (equilibrium) temperature response to a doubling of atmospheric CO2 concentrations. Using up-to-date data on radiative forcing, global mean surface temperature and total heat uptake in the Earth system, we find that the global energy budget implies a range of values for the equilibrium climate sensitivity that is in agreement with earlier estimates, within the limits of uncertainty. The energy budget of the most recent decade does, however, indicate a lower range of values for the more policy-relevant transient climate response (the temperature increase at the point of doubling of the atmospheric CO2 concentration following a linear ramp of increasing greenhouse gas forcing) than the range obtained by either analysing the energy budget of earlier decades or current climate model simulations.

Get the abstract and read the paper here.

Ring, M.J., et al., 2012.

Causes of the global warming observed since the 19th century. Atmospheric and Climate Sciences, 2, 401-415, doi: 10.4236/acs.2012.24035.

Abstract

Measurements show that the Earth’s global-average near-surface temperature has increased by about 0.8℃ since the 19th century. It is critically important to determine whether this global warming is due to natural causes, as contended by climate contrarians, or by human activities, as argued by the Intergovernmental Panel on Climate Change. This study updates our earlier calculations which showed that the observed global warming was predominantly human-caused. Two independent methods are used to analyze the temperature measurements: Singular Spectrum Analysis and Climate Model Simulation. The concurrence of the results of the two methods, each using 13 additional years of temperature measurements from 1998 through 2010, shows that it is humanity, not nature, that has increased the Earth’s global temperature since the 19th century. Humanity is also responsible for the most recent period of warming from 1976 to 2010. Internal climate variability is primarily responsible for the early 20th century warming from 1904 to 1944 and the subsequent cooling from 1944 to 1976. It is also found that the equilibrium climate sensitivity is on the low side of the range given in the IPCC Fourth Assessment Report.

Get the abstract and read the paper here.

Schmittner, A., et al., 2011.

Climate sensitivity estimated from temperature reconstructions of the Last Glacial Maximum. Science, 334, 1385-1388, doi: 10.1126/science.1203513.

Abstract

Assessing the impact of future anthropogenic carbon emissions is currently impeded by uncertainties in our knowledge of equilibrium climate sensitivity to atmospheric carbon dioxide doubling. Previous studies suggest 3 kelvin (K) as the best estimate, 2 to 4.5 K as the 66% probability range, and nonzero probabilities for much higher values, the latter implying a small chance of high-impact climate changes that would be difficult to avoid. Here, combining extensive sea and land surface temperature reconstructions from the Last Glacial Maximum with climate model simulations, we estimate a lower median (2.3 K) and reduced uncertainty (1.7 to 2.6 K as the 66% probability range, which can be widened using alternate assumptions or data subsets). Assuming that paleoclimatic constraints apply to the future, as predicted by our model, these results imply a lower probability of imminent extreme climatic change than previously thought.

Get the abstract and read the paper here.

Skeie,  R. B., T. Berntsen, M. Aldrin, M. Holden, and G. Myhre, 2014.

A lower and more constrained estimate of climate sensitivity using updated observations and detailed radiative forcing time series. Earth System Dynamics, 5, 139–175.

Abstract

The equilibrium climate sensitivity (ECS) is constrained based on observed near-surface temperature change, changes in ocean heat content (OHC) and detailed radiative forcing (RF) time series from pre-industrial times to 2010 for all main anthropogenic and natural forcing mechanism. The RF time series are linked to the observations of OHC and temperature change through an energy balance model and a stochastic model, using a Bayesian approach to estimate the ECS and other unknown parameters from the data. For the net anthropogenic RF the posterior mean in 2010 is 2.1 W m−2 with a 90% credible interval (C.I.) of 1.3 to 2.8 W m−2, excluding present day total aerosol effects (direct + indirect) stronger than −1.7 W m−2. The posterior mean of the ECS is 1.8 °C with 90% C.I. ranging from 0.9 to 3.2 °C which is tighter than most previously published estimates. We find that using 3 OHC data sets simultaneously substantially narrows the range in ECS, while using only one set and similar time periods can produce comparable results as previously published estimates including the heavy tail in the probability function. The use of additional 10 yr of data for global mean temperature change and ocean heat content data narrow the probability density function of the ECS. In addition when data only until year 2000 is used the estimated mean of ECS is 20% higher. Explicitly accounting for internal variability widens the 90% C.I. for the ECS by 60%, while the mean ECS only becomes slightly higher.

Get the abstract and read the paper here.

Spencer, R. W., and W. D. Braswell, 2013.

The role of ENSO in global ocean temperature changes during 1955-2011 simulated with a 1D climate model. Asia-Pacific Journal of Atmospheric Science, doi:10.1007/s13143-014-0011-z.

Abstract

Global average ocean temperature variations to 2,000 m depth during 1955–2011 are simulated with a 40 layer 1D forcing-feedback-mixing model for three forcing cases. The first case uses standard anthropogenic and volcanic external radiative forcings. The second adds non-radiative internal forcing (ocean mixing changes initiated in the top 200 m) proportional to the Multivariate ENSO Index (MEI) to represent an internal mode of natural variability. The third case further adds ENSO-related radiative forcing proportional to MEI as a possible natural cloud forcing mechanism associated with atmospheric circulation changes. The model adjustable parameters are net radiative feedback, effective diffusivities, and internal radiative (e.g., cloud) and non-radiative (ocean mixing) forcing coefficients at adjustable time lags. Model output is compared to Levitus ocean temperature changes in 50 m layers during 1955–2011 to 700 m depth, and to lag regression coefficients between satellite radiative flux variations and sea surface temperature between 2000 and 2010. A net feedback parameter of 1.7Wm−2 K−1 with only anthropogenic and volcanic forcings increases to 2.8Wm−2 K−1 when all ENSO forcings (which are one-third radiative) are included, along with better agreement between model and observations. The results suggest ENSO can influence multi-decadal temperature trends, and that internal radiative forcing of the climate system affects the diagnosis of feedbacks. Also, the relatively small differences in model ocean warming associated with the three cases suggests that the observed levels of ocean warming since the 1950s is not a very strong constraint on our estimates of climate sensitivity.
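
The basic architecture of a 1D forcing-feedback-mixing model (a surface layer that takes the radiative forcing and loses heat through a feedback term, sitting on deeper layers that exchange heat by vertical diffusion) can be sketched compactly. The layer count, diffusivity and forcing below are assumptions for illustration, not the 40-layer, ENSO-forced configuration of the paper:

```python
import numpy as np

def run_column(forcing_wm2, n_layers=20, dz=100.0, lam=2.0, kappa=1.0e-4, dt_yr=0.1):
    """Toy 1D ocean column: layer 0 is the mixed layer (takes the radiative forcing F
    and loses lam * T0 through feedback); heat diffuses vertically between layers
    with diffusivity kappa (m^2/s). Temperatures in K, forcing in W/m^2."""
    rho_cp = 4.1e6                       # volumetric heat capacity of seawater, J m^-3 K^-1
    sec_per_yr = 3.156e7
    T = np.zeros(n_layers)
    out = np.zeros(len(forcing_wm2))
    for i in range(len(forcing_wm2)):
        flux_in = forcing_wm2[i] - lam * T[0]            # W/m^2 at the surface
        dT = np.zeros(n_layers)
        dT[0] += flux_in * dt_yr * sec_per_yr / (rho_cp * dz)
        # Diffusive exchange between adjacent layers (conserves heat).
        exchange = kappa * (T[1:] - T[:-1]) / dz
        dT[:-1] += exchange * dt_yr * sec_per_yr / dz
        dT[1:] -= exchange * dt_yr * sec_per_yr / dz
        T += dT
        out[i] = T[0]
    return out

# Example: ramp the forcing to ~3.7 W/m^2 (one nominal CO2 doubling) over 100 years.
steps = 1000                                             # 0.1-year steps
forcing = np.linspace(0.0, 3.7, steps)
surface = run_column(forcing)
print(f"surface warming at year 100 ~ {surface[-1]:.2f} K (with lam = 2.0 W m^-2 K^-1)")
```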

Get the abstract and read the paper here.

Our report here.

van Hateren, J.H., 2012.

A fractal climate response function can simulate global average temperature trends of the modern era and the past millennium. Climate Dynamics,  doi: 10.1007/s00382-012-1375-3.

Abstract

A climate response function is introduced that consists of six exponential (low-pass) filters with weights depending as a power law on their e-folding times. The response of this two-parameter function to the combined forcings of solar irradiance, greenhouse gases, and SO2-related aerosols is fitted simultaneously to reconstructed temperatures of the past millennium, the response to solar cycles, the response to the 1991 Pinatubo volcanic eruption, and the modern 1850–2010 temperature trend. Assuming strong long-term modulation of solar irradiance, the quite adequate fit produces a climate response function with a millennium-scale response to doubled CO2 concentration of 2.0 ± 0.3 °C (mean ± standard error), of which about 50 % is realized with e-folding times of 0.5 and 2 years, about 30 % with e-folding times of 8 and 32 years, and about 20 % with e-folding times of 128 and 512 years. The transient climate response (response after 70 years of 1 % yearly rise of CO2 concentration) is 1.5 ± 0.2 °C. The temperature rise from 1820 to 1950 can be attributed for about 70 % to increased solar irradiance, while the temperature changes after 1950 are almost completely produced by the interplay of anthropogenic greenhouse gases and aerosols. The SO2-related forcing produces a small temperature drop in the years 1950–1970 and an inflection of the temperature curve around the year 2000. Fitting with a tenfold smaller modulation of solar irradiance produces a less adequate fit with millennium-scale and transient climate responses of 2.5 ± 0.4 and 1.9 ± 0.3 °C, respectively.
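
The response-function idea translates directly into code: the temperature response to a sustained forcing is a weighted sum of exponential terms whose e-folding times run from sub-annual to centennial, with weights falling off as a power law in the e-folding time. The exponent and the 2.0 °C long-term response below are assumptions chosen to roughly match the split quoted in the abstract; the paper fits these quantities to reconstructions:

```python
import numpy as np

taus = np.array([0.5, 2.0, 8.0, 32.0, 128.0, 512.0])   # e-folding times, years
p = -0.2                                                # assumed power-law exponent
weights = taus ** p
weights /= weights.sum()
# With this exponent the 0.5/2-yr, 8/32-yr and 128/512-yr pairs carry roughly
# 50%, 30% and 20% of the weight, close to the split quoted in the abstract.

def step_response(t_years, long_term_response=2.0):
    """Warming after a sustained 'doubled CO2' forcing is switched on at t = 0:
    each exponential filter relaxes toward equilibrium on its own timescale."""
    t = np.atleast_1d(np.asarray(t_years, dtype=float))[:, None]
    return long_term_response * np.sum(weights * (1.0 - np.exp(-t / taus)), axis=1)

for t in (1, 10, 70, 1000):
    print(f"t = {t:>4} yr: {step_response(t)[0]:.2f} C of a 2.0 C long-term response")
```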

Get the abstract and read the paper here.

Sources

Cato Institute here.

Our report on the new Lewis and Curry Climate Dynamics paper here.

-------

My thoughts: You will have noticed that I have covered climate sensitivity several times before. But, just as with the constantly-proven-FALSE '97% consensus,' it is necessary to keep pounding on climate sensitivity arguments. The Earth is NOT cooperating with pro-AGW alarmists. This is why they keep scrambling to find reasons why things are going the way they are: 52 different 'reasons' why the Pause in warming is happening. At least half a dozen different reasons why Antarctic sea-ice is expanding beyond all records. At least half a dozen different reasons why we are getting cold winters and increased snowfalls world-wide, when 30 years of 'settled science' has been saying that winters would be getting warmer, wetter and shorter. And so on, and so forth...
