Climate Change Science
Compiled by Ken Gregory,
Calgary, Alberta, Canada
Revised March 16, 2013 (Revision history)
The Science in Summary
Greenhouse Gas Effect
Climate Is Always Changing
CO2 - Temperature Correlation
CO2 Changes Follow Temperature Changes
Sun Activity Correlates with Temperature
Sun and Cosmic Rays
Heating of the Troposphere
Warming on Other Planets
CO2 Versus the Sun/Cosmic Ray Warming Theories
CO2 Greatly Increases Plant and Forest Growth
IPCC and Model Projections
Computer Models Fail
Water Vapour Feedback
The IPCC Hockey Stick
Urban Heat Island Effects
Falsified Historical CO2 Measurements
Effects of Warming
Sea Level Rise
Warming Is Good for Your Health
Agriculture and Climate Change
Warming Effects on Animals
Kyoto Protocol - Misallocation of Funds
An Inconvenient Truth
Warnings of Global Cooling
One of the goals of the Friends of Science Society is to educate the public through dissemination of relevant, balanced and objective technical information on the scientific merit of the Kyoto Protocol and the global warming issue. The science of climate change is complex. Unfortunately, politics and the media have affected the science. Climate research institutions know that they must present scary climate forecasts to receive continued funding - no crisis means no funding. The media present stories of climate disaster to sell their products. Scientific research suggesting that climate change is mostly natural receives little if any media coverage. These factors have seriously misled the general public on climate issues, resulting in wasteful expenditures of billions of dollars in an ineffective attempt to control climate. This document gives an overview of climate change issues as determined by a comprehensive review of the state of climate science.
The graph above shows the temperature changes of the lower troposphere from the surface up to about 8 km, as determined from the average of two analyses of satellite data. The best fit line from January 2002 to February 2013 indicates a decline of 0.03 °C/decade. The sharp temperature spikes in 1998 and 2010 are El Niño events. Surface temperature data are contaminated by the effects of urban development. The Sun's activity, which was increasing through most of the 20th century, has recently become quiet. The magnetic flux from the Sun reached a peak in 1992. High magnetic flux reduces cloud cover and causes warming. Since then the Sun has become quiet; however, it continues to cause warming for about a decade after its peak intensity due to the huge heat capacity of the oceans, so the warming would be expected to peak at about 2002. The green line shows the CO2 concentration in the atmosphere, as measured at Mauna Loa, Hawaii. The ripple in the CO2 curve is due to seasonal changes in biomass: far more seasonally affected land area lies in the northern hemisphere than in the southern, so during the northern hemisphere summer the large uptake of CO2 by growing plants causes a drop in the atmospheric CO2 concentration.
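The "best fit line" quoted above is an ordinary least-squares trend. A minimal sketch of that calculation is below; the anomaly values in the example call are invented for illustration, not the actual satellite series.

```python
# Sketch of a "best fit line" trend: the ordinary least-squares
# slope of monthly temperature anomalies versus time, scaled to
# degrees per decade. The example anomalies are invented.
def trend_per_decade(monthly_anomalies):
    n = len(monthly_anomalies)
    xs = [i / 12.0 for i in range(n)]  # time in years
    xbar = sum(xs) / n
    ybar = sum(monthly_anomalies) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, monthly_anomalies))
    den = sum((x - xbar) ** 2 for x in xs)
    return 10.0 * num / den  # degrees per decade

# A gently declining series yields a negative trend.
print(trend_per_decade([0.10, 0.08, 0.09, 0.07, 0.06, 0.07,
                        0.05, 0.06, 0.04, 0.05, 0.03, 0.04]))
```

With real monthly anomaly data in place of the toy list, this reproduces the trend-per-decade figures quoted throughout the document.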
The data are obtained from microwave sounding units (MSUs) on the National Oceanic and Atmospheric Administration's satellites, which relate the intensity or brightness of microwaves emitted by oxygen molecules in the atmosphere to temperature. The MSU data set represents the temperatures of a layer of the atmosphere extending from the surface to approximately 8 kilometres (5 miles) above the surface. The data are from the University of Alabama in Huntsville and Remote Sensing Systems.
The Science in Summary
The history of the Earth tells us that the climate is always changing; from warm periods when the dinosaurs flourished, to the many ice ages when glaciers covered much of the land. Climate has always changed due to natural cycles without any help from people.
The United Nations Intergovernmental Panel on Climate Change (IPCC) is a political organization promoting the theory that recent minor temperature increases may be caused largely by man-made carbon dioxide (CO2) emissions. CO2 is an infrared-absorbing gas, and increasing concentrations can potentially increase the average global temperature as the gas absorbs long-wave radiation from the Earth and re-emits the absorbed energy. However, the warming ability of CO2 is limited because much of its absorption spectrum is near or fully saturated. When CO2 concentrations were ten times greater than today, the Earth was in the grip of one of the coldest ice ages. The climate system is dominated by strong negative feedbacks from clouds and water vapour, which offset the warming effects of CO2 emissions.
The history of climate and CO2 concentration shows that temperature changes precede CO2 changes, so CO2 cannot be a significant driver of climate. Temperature changes over different time scales correlate well with solar cycles, cosmic ray flux and cloud cover. Recent research shows that cosmic rays act as a catalyst in creating low clouds, which cool the planet. When the Sun is more active, the solar wind repels cosmic rays, reducing low cloud cover and allowing the Sun to warm the planet.
Computer model results presented in the IPCC Fourth Assessment Report predict that global warming will cause a distinctive temperature profile in the atmosphere: an enhanced warming rate at 8 to 12 km altitude over the tropics. The predicted profile is the result of an expected increase in water vapour in the upper atmosphere, which would amplify a CO2-induced warming threefold. The computer models are programmed to hold relative humidity constant with increasing CO2, resulting in a large water vapour feedback. Actual temperature data show no such enhanced warming profile. Therefore, the comparison of observed data to computer models proves that no such water vapour induced warming amplification exists, so CO2 is not the main climate driver. In atmosphere layers near 8 km, the modelled temperature trend from 1980 is 200 to 400% higher than observed. Weather balloon data show that specific humidity has fallen 9% since 1960 in the upper troposphere (400 mbar pressure level), where the models predict the greatest feedback. Adding CO2 to the atmosphere replaces a significant amount of water vapour, the most important greenhouse gas, resulting in only a small increase of the greenhouse effect.
An analysis of satellite data shows that clouds cause a strong negative feedback on temperature, but climate models assume that clouds cause a positive feedback. Modelers assumed that all cloud changes are caused by temperature changes, leading them to infer a positive feedback. But changing cloud cover can also cause temperature changes, and scientists can now separate these two effects. The correct analysis shows that clouds cause a strong negative feedback: if temperatures increase, cloud cover increases, reflecting solar energy back to space and greatly reducing the warming effect of CO2 emissions.
Several planets and moons have warmed recently along with the Earth, confirming a natural, Sun-caused warming trend. Over longer time periods, as the solar system moves in and out of the galactic arms, the cosmic ray flux changes, causing ice ages and warm ages. A comparison of temperature and solar activity proxy data suggests that solar effects can explain at least 75% of the surface warming during the last 100 years.
CO2 is plant food and the increase in the CO2 concentration may have increased the global food production by 15% since 1950 resulting in huge benefits for people. For Canada, any CO2 warming effect would also benefit us by reducing our space heating costs and making a more pleasant climate.
The IPCC predicts that global average temperatures will increase by 0.17 to 0.38 °C per decade to the end of the century, depending on the rate of CO2 growth in the atmosphere and other assumptions. The projections assume that no action is taken to limit CO2 emissions. However, these predictions are unrealistic because they falsely assume that the recent temperature changes are driven solely by CO2 and that the Sun has little effect on climate. A recent study of past climate change used by the IPCC has been shown to be wrong due to the use of a faulty algorithm and the inappropriate selection of data.
The land temperature record is contaminated by the urban heat island effect. Fully correcting the land temperature record would reduce the warming trend from 1980 to 2002 by half. The IPCC historical CO2 record may be incorrect due to inappropriate adjustments to the ice core data, and ignoring direct historical CO2 measurements. The IPCC selects and adjusts data to conform to its CO2 warming hypothesis and ignores alternative climate theories. This is the wrong way to do science. Many scientists strongly disagree with the IPCC conclusions.
The sea level data shows no increase in the recent rate of sea level rise, and no such increase is expected over the next hundred years. There has been no detected increase in severe storms and there is no reason to expect an increase in the number or intensity of hurricanes resulting from any warming assumed to be from human caused CO2 emissions.
Any increase in temperatures due to human caused CO2 emissions will likely be beneficial to human health. The CO2 fertilization effect will increase the rate of forest growth and CO2 induced crop yield increases will reduce the pressures to cut down forests for farmland expansion. This will greatly benefit animals by slowing habitat destruction.
The benefits of CO2 emissions greatly exceed any likely harmful effects. Several authorities who have studied solar cycles have warned that the Earth may soon enter a cooling phase as the Sun is expected to become less active. The atmosphere may warm because of human activity, but if it does, the expected change is unlikely to be more than 0.5 °C, and probably less, in the next 100 years.
Greenhouse Gas Effect
This graphic, from Trenberth et al. 2009 here, shows the exchange of energy among space, the Sun, the atmosphere and the Earth. Greenhouse gases are primarily water vapour, carbon dioxide and ozone. Greenhouse gases are mostly transparent to incoming solar radiation, but absorb outgoing long wavelength radiation. The absorbed energy is then transferred to cooler molecules or radiated at longer wavelengths than the energy previously absorbed. This process makes the Earth warmer, by about 33 degrees Celsius, than it otherwise would be without the greenhouse gases (but with the atmosphere and clouds).
Water vapour and clouds together account for over 70% of the total current greenhouse effect. However, in terms of changes to the greenhouse effect due to human activities, water vapour is generally considered a feedback and not a forcing agent. Computer simulations show that a uniform 1.8% change in water vapour has the same effect on outgoing longwave radiation as a 10% change in CO2 concentration. (See the water vapour feedback section for further information.)
Optical depth is a measure of how transparent the atmosphere is to longwave radiation. More greenhouse gases reduce the transparency of the atmosphere to longwave radiation from the surface.
See here for a discussion of CO2 versus water's contribution to the greenhouse effect.
The top panel of the graph above shows the absorption spectral intensity of the greenhouse gases. Most of the short-wavelength solar radiation in the visible part of the spectrum is transmitted to the surface. Most of the upward thermal long wave radiation from the surface is absorbed, except in the atmospheric window indicated by the blue region. About 16% of the long wave radiation is transmitted directly to space and the rest is absorbed by greenhouse gases. The middle panel shows the total absorption bands by wavelength of downward solar radiation and upward thermal radiation. The gray shading at 100 percent indicates that the energy is fully absorbed at that wavelength. The lower panel shows the absorption of the major greenhouse gases. Comparing the CO2 and H2O absorption spectra shows that much of the CO2 spectrum overlaps with that of water. Parts of the CO2 spectrum are already fully saturated, so adding more CO2 will have ever-diminishing effects as more of the available wavelengths become saturated. The temperature response to adding CO2 to the atmosphere depends on the amount of positive and negative feedbacks from water vapour, clouds and other sources. The temperature effect of increasing CO2 concentration is approximately logarithmic. This means that if doubling the CO2 concentration from 300 ppm to 600 ppm (a 300 ppm increase) causes the temperature to rise by 1 °C, it would take another 600 ppm increase (to 1200 ppm) to add a further 1 °C. Methane has an absorption band (at 8 micrometres) that largely overlaps with water vapour, so an increase in methane has little effect on temperature.
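The logarithmic relationship can be illustrated with a short calculation: equal doublings of CO2, not equal ppm increments, produce equal warming. The 1 °C per doubling used here is the text's illustrative figure, not a measured sensitivity.

```python
import math

# Illustration of the approximately logarithmic temperature
# response: warming scales with log2 of the concentration ratio.
# The 1 degree per doubling value is illustrative only.
def warming(c_new, c_ref, per_doubling=1.0):
    return per_doubling * math.log2(c_new / c_ref)

print(warming(600, 300))   # 300 ppm added: 1.0 (one doubling)
print(warming(1200, 600))  # 600 ppm added: another 1.0 (one doubling)
```

The second call needs twice the ppm increase of the first to yield the same temperature gain, which is the point made in the paragraph above.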
The above diagram shows the upward radiation spectrum from the top of the atmosphere at 20 km with 300 ppm CO2 and 600 ppm CO2, as calculated by the Modtran radiative code. (Note that the horizontal axis of this diagram shows wavenumber, or number of wavelengths per cm, which is the reciprocal of the wavelength in micrometres used in the previous diagram.) This model-calculated radiation is very similar to what is actually measured by satellites from space. The green curve shows the emission spectrum with 300 ppm CO2 in the atmosphere, and the blue curve shows the spectrum with 600 ppm CO2 with the same surface temperature and water vapour profile. The model shows that doubling the CO2 concentration changes the spectrum only at the edges of the main CO2 absorption band, at 600 and 740 cm-1. The resulting forcing of 3.34 W/m2 would cause the surface temperatures to increase if not offset by negative feedbacks.
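For comparison, the widely used simplified expression for CO2 radiative forcing (Myhre et al. 1998) also scales with the logarithm of concentration; for a doubling it gives a somewhat larger value than the 3.34 W/m2 Modtran result quoted here.

```python
import math

# The commonly cited simplified expression for CO2 radiative
# forcing, dF = 5.35 * ln(C/C0) W/m2 (Myhre et al. 1998). A
# doubling yields about 3.71 W/m2, slightly larger than the
# 3.34 W/m2 Modtran figure in the text.
def co2_forcing(c_new, c_ref):
    return 5.35 * math.log(c_new / c_ref)

print(round(co2_forcing(600.0, 300.0), 2))  # 3.71
```

The gap between 3.34 and 3.71 W/m2 reflects different model assumptions (atmospheric profile, clouds), not an error in either calculation.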
Climate Is Always Changing
The Earth's history shows that the climate has always been changing, over both short-term and long-term time scales. These changes have sometimes been abrupt and severe, without any help from humans. Climate temperature reconstructions are determined from a variety of sources, such as from tree ring width studies and ocean floor sediments. During the last 2 billion years, the Earth has alternated between cool periods like today, and warm periods like when the dinosaurs roamed the planet. The figure below on the left is a temperature reconstruction of the Earth over 2 billion years. Temperatures over this time frame are determined by mapping the distribution of ancient coals, desert deposits, tropical soils, salt and glacial deposits, as well as the distribution of plants and animals that are sensitive to climate, such as alligators, palm trees & mangrove swamps. See here for further information.
Temperature Over Geological Time
The above chart from here shows that CO2 levels have been declining since the end of the Jurassic period. The change in CO2 as indicated by the red line in the red circle is the change in CO2 since the industrial revolution.
The graph above shows the northern hemisphere temperature history since the last ice age.
Temperature History from North Atlantic Ocean Sediments
The graph above shows temperature variations of the past 3,000 years (during recorded history), as determined from ocean sediment studies in the North Atlantic. [Keigwin, 1996]. Note the rapid variations, as well as the much warmer temperatures 1,000 and 2,500 years ago. See here for further information.
A new temperature reconstruction with decadal resolution, covering the last two millennia, is shown above for the extratropical Northern Hemisphere (90-30°N), utilizing many palaeo-temperature proxy records, from Ljungqvist 2010 here. The shading represents 2 standard deviation errors.
RWP = Roman Warm Period AD 1-300; DACP = Dark Age Cold Period 300-900; MWP = Medieval Warm Period 800-1300; LIA = Little Ice Age 1300-1900; CWP = Current Warm Period 1900-. The proxy data show that parts of the Roman Warm Period and the Medieval Warm Period were as warm as the late twentieth century.
Climate is always changing, as the history of Europe's temperature over the last thousand years shows in the graph below:
1000 Years Temperature History IPCC 1990
The temperature history shown at the left was published in the first IPCC report in 1990, based on Lamb's estimated climate history of Central England.
Clearly, human activity could not have had a significant effect on the temperature changes before 1900. These changes are the result of natural processes.
See here for NASA's GISS temperature graphs since 1880.
HadCRUT3 is the global surface temperature index produced by the Hadley Centre and the Climatic Research Unit, England. It combines land and marine temperature data. The best fit line from January 2002 to December 2012 indicates a decline of 0.085 °C/decade. The IPCC projected that temperatures would increase by 0.2 °C/decade during this period.
CO2 - Temperature Correlation
The temperature of the Earth has warmed slightly, about 0.7 °C, over the last hundred years. Over this time, CO2 concentration in the atmosphere has increased, mostly due to the increased use of fossil fuels. However, the Sun has increased in intensity since 1900, which may have induced much of the observed warming. Scafetta and West estimate that the Sun may have caused 10 to 20% of the increase in CO2 during the last century. (See page 2 of their paper here.) A short-term correlation does not imply that the CO2 increase caused the temperature increase. Causation can be inferred if there is a correlation over several cycles of CO2 concentration changes, with the CO2 change preceding the temperature change. The actual climate history shows no such correlation, and there is no compelling evidence that the recent rise in temperature was caused by CO2. Temperatures have been variable over time and do not correlate to CO2 concentration. When CO2 concentrations were 10 times higher than they are now, the Earth was in a major ice age. As a greenhouse gas, CO2 is vastly outweighed by (natural) water vapour and clouds, which account for over 70% of the greenhouse effect. Human-related CO2 emissions soared after 1940, yet most of the 20th century's worldwide temperature increase occurred beforehand. See here for a graphic of the carbon cycle.
The CO2 annual growth rate to December 2012 is given below.
The actual increase of CO2 concentration averaged 0.5% per year since 1990.
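The 0.5% figure can be sanity-checked with a compound-growth calculation. The endpoint concentrations below are approximate round numbers assumed for the Mauna Loa annual means, used only for illustration.

```python
# Rough check of the ~0.5%/year growth figure using assumed,
# approximate Mauna Loa annual means: ~354 ppm in 1990 and
# ~394 ppm in 2012. Compound annual growth rate over 22 years:
c1990, c2012, years = 354.0, 394.0, 22
rate = (c2012 / c1990) ** (1.0 / years) - 1.0
print(round(100.0 * rate, 2))  # roughly 0.49 percent per year
```

A result just under 0.5% per year is consistent with the average quoted above.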
CO2 Changes Follow Temperature Changes
Fischer et al. (1999) examined records of atmospheric CO2 and air temperature derived from Antarctic Vostok ice cores that extend back in time across a quarter of a million years. Over this immense time span, the three most dramatic warming events experienced on Earth were those associated with the terminations of the last three ice ages; and for each of these tremendous global warmings, Earth's air temperature rose well before there was any increase in atmospheric CO2. In fact, the air's CO2 content did not begin to rise until 400 to 1,000 years after the planet began to warm. Ice cores provide a detailed record of local temperature and CO2 concentrations. A study by Caillon et al. (2003) finds that the CO2 increase lagged Antarctic deglacial warming by 800 ± 200 years. The authors measured the isotopic composition of argon-40 and the CO2 concentration in air bubbles in the Vostok core during the end of the third most recent ice age (Termination III), 240,000 years before the present. The argon-40 isotope is found to be an excellent proxy for temperature.
Vostok Ice Core Data over End of Third Ice Age BP
The CO2 and Argon (Temperature) Age Scales are Shifted 800 Years
The CO2 concentration, shown by the black line, is plotted against age in years before present (BP) on the bottom axis, and argon-40, a temperature proxy, shown by the grey line, is plotted against age on the top axis. The age scale for the CO2 has been shifted by a constant 800 years to obtain the best correlation of the two data sets. The correlation shows that temperature changes precede CO2 concentration changes by about 800 years.
These findings confirm that an increase in CO2 has never initially caused an increase in temperature during a deglaciation. Temperature increases cause the oceans to expel CO2, increasing the CO2 content of the atmosphere. When temperature is at its maximum in each cycle and starts to fall, CO2 concentrations continue to increase for another 800 years! As CO2 increases, temperatures fall. This is the opposite of what one would expect if CO2 were a primary climate driver. The ice core data proves that CO2 is not a primary climate driver. One must invoke reverse time causality to claim the ice core data shows CO2 causes temperature change, like suggesting actions taken today can affect the conquests of Mongol leader Genghis Khan. Logic demands that cause must precede effect. Increases in air temperature drive increases in atmospheric CO2 concentration, and not vice versa.
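The 800-year shift is found by a lag-correlation search: slide one series against the other and keep the shift that maximizes the correlation. A minimal sketch with synthetic data (a temperature pulse and a delayed copy standing in for CO2, not the actual Vostok records):

```python
# Find the lag (in samples) of one series behind another by
# maximizing the Pearson correlation over candidate shifts.
def correlation(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def best_lag(temp, co2, max_lag):
    """Shift of co2 relative to temp giving the highest correlation."""
    scores = {}
    for lag in range(max_lag + 1):
        scores[lag] = correlation(temp[:len(temp) - lag], co2[lag:])
    return max(scores, key=scores.get)

temp = [0, 1, 3, 6, 8, 9, 8, 6, 3, 1, 0, 0, 0]
co2  = [0, 0, 0, 0, 1, 3, 6, 8, 9, 8, 6, 3, 1]  # temp delayed 3 samples
print(best_lag(temp, co2, 6))  # 3
```

With ice core data sampled at known intervals, the winning lag converts directly into years; this is the same idea as the constant 800-year shift applied to the Vostok age scales.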
A more recent portion of the Vostok ice core record from Joanne Nova's Skeptics Handbook here is shown below.
See here for more information. See here for a graph of Vostok ice core data. See here for the Caillon et al. (2003) paper.
Sun Activity Correlates with Temperature
Numerous papers published in major peer-reviewed scientific journals show that the Sun is the primary driver of climate change. There is a very strong correlation between solar activity and temperature.
Early in the nineteenth century, William Herschel (1738-1822), discoverer of Uranus, found that five periods of low number of sunspots corresponded to high wheat prices when the temperatures were cold. (Cold climate reduces the supply of wheat causing its price to rise.) See here.
E. Friis-Christensen and K. Lassen have shown that the length of the mean 11-year sunspot cycle correlates to the northern hemisphere temperature during the past 130 years. The length of the sunspot cycle is known to vary with solar activity, with high solar activity implying a short sunspot cycle length. See here for further information.
See here for an updated plot based on Friis-Christensen and Lassen's methodology.
Here is a correlation of the sunspot cycle length, global temperature and CO2 concentrations.
Sunspot Cycle Length Temperature and CO2
The red squares on the graph represent the sunspot cycle lengths. One point is the cycle length from the time of the maximum number of sunspots to the maximum of the next cycle, and the following point is the cycle length from the time of the minimum number of sunspots to the minimum of the next cycle. The sunspot cycles are back-filtered using weights 1, 2, 3, 4 applied to each cycle point, both min-to-min and max-to-max. This assumes that the current cycle has the most effect on temperature (weight 4), that previous half cycles affect current temperatures in declining amounts, and that future cycles have no effect on the current temperature. The temperature curve in blue uses the HadCRUT3 land and sea data to 1978, the MSU satellite data from 1984 to 2006, and the average of the two datasets for 1979 to 1983. This eliminates much of the urban heat island effect. The temperatures are unfiltered annual values. The CO2 concentrations (ppmv) from 1958 to 2007 are derived from air samples collected at the Mauna Loa Observatory, Hawaii. CO2 concentrations prior to 1958 are uncertain.
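The 1-2-3-4 back-filter described above amounts to a short one-sided weighted moving average. A minimal sketch follows; the cycle-length values in the example are invented for illustration, not actual sunspot data.

```python
# Sketch of the 1,2,3,4 back-filter: each plotted value is a
# weighted average of the current cycle-length point (weight 4)
# and the three preceding half-cycle points (weights 3, 2, 1),
# so future cycles never influence the current value.
def back_filter(cycle_lengths, weights=(1, 2, 3, 4)):
    k = len(weights)
    out = []
    for i in range(k - 1, len(cycle_lengths)):
        window = cycle_lengths[i - k + 1 : i + 1]
        out.append(sum(w * v for w, v in zip(weights, window)) / sum(weights))
    return out

print(back_filter([11.5, 10.9, 10.2, 9.8, 10.4]))
```

Because the window only looks backward, the filtered curve responds to past solar cycles with a lag, matching the physical argument that ocean heat capacity delays the temperature response.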
Note the correspondence between sunspot cycle length and temperature. Both the temperature and the cycle length curves begin to rise at 1910, temperatures fall from 1945 to 1975 when the cycle length curve falls, and both curves rise again after 1975. Temperatures have increased since 1980 faster than can be explained by the sunspot cycle length, indicating a possible human CO2 contribution. The recent increase of the cycle lengths explains why there has been no warming since 2002. Temperature changes are expected to follow changes in solar activity with a time lag resulting from the large heat capacity of the oceans.
N. Scafetta of Duke University, Durham, NC and B.J. West of the US Army Research Office, NC studied the solar impact on 400 years of the Northern Hemisphere temperatures since 1600. They find good correspondence between temperature and solar irradiance proxy reconstructions up until 1920 as shown on the graph below.
Northern Hemisphere Temperature vs Solar Irradiance 400 years
The temperature curve is derived from proxy records to 1850 by Moberg et al., and from instrumental surface temperature data from 1850 to about 1980. The surface temperature record includes the effects of the urban heat island (UHI) and land use changes. The Northern Hemisphere MSU lower troposphere record, which eliminates most of the UHI effects, is shown from 1979 in blue. Two different solar irradiance proxy reconstructions are shown: Lean 2000 and Wang et al. 2005. Both curves merge the ACRIM satellite data since 1980 with the proxy data. Using ACRIM, solar activity has an increasing trend during the second half of the 20th century. This graph is modified from the version created by Scafetta and West, which uses the contaminated instrument record after 1979 instead of the satellite data. See the original version here.
Note the low solar activity periods occurring during the Maunder Minimum (1645-1715, the Little Ice Age) and during the Dalton Minimum (1795-1825).
Note the excellent correlation from 1600 to 1900, when humans were unlikely to affect climate. During the 20th century one continues to observe a significant correlation between the solar and temperature patterns: both records show an increase from 1900 to 1950, a decrease from 1950 to 1970, and again an increase from 1970 to 2000.
A divergence of the curves from the Scafetta and West original graph indicates that the Sun is responsible for 56% (using Lean 2000) or 69% (using Wang 2005) of the northern hemisphere warming from 1900 to 2005. The authors estimate the error at 20%.
There are two solar composites available from satellite data. The ACRIM composite is obtained directly from the satellite data, while the PMOD composite assumes that the Nimbus7/ERB satellite data covering the ACRIM gap (1989-1992) are still significantly corrupted and require additional severe adjustments. The ACRIM data show higher solar irradiance during solar cycles 22-23 than the PMOD data. Using the PMOD data and the original graph, the Sun likely has contributed 50% of the surface warming from 1900 to 2005.
The authors did a similar analysis using the Mann and Jones 2003 temperature reconstruction. This temperature history shows little variation before 1900 and shows a hockey stick shape. This reconstruction has been severely criticized for several reasons. See The IPCC Hockey Stick section of this essay. The authors found that the Mann and Jones 2003 reconstruction (when compared to the Lean 2000 data) results in an unphysical zero response time to solar forcing. The ocean's large heat capacity should result in a time lag of surface temperatures with respect to long time solar changes of several years, so this reconstruction cannot be correct.
The authors' analysis shows the Sun has contributed 50 to 69% of the surface warming, depending on the reconstructions utilized. The remainder may be due to CO2, UHI and land use changes. The authors compare the Sun's irradiance to the Northern Hemisphere land surface temperatures, which are contaminated with the urban heat island effect. The global MSU satellite temperatures, which are not contaminated by the UHI effect, have increased by half as much as the Northern Hemisphere temperatures since 1980. If the Scafetta and West analysis used the uncontaminated satellite data since 1980, the results would show that the Sun has contributed at least 75% of the global warming of the last century. See more about the UHI effect later in this essay here. See here for the November 2007 article.
A group of NASA and university scientists have found convincing evidence of a link between solar activity and climate by comparing records of the historical water level of the Nile River to the number of auroras observed in northern Europe and the Far East between 622 and 1470 AD. Auroras are bright glows in the night sky following solar flares, and are an excellent means of tracking solar activity. See this link for further information.
A study by WJR Alexander et al, published June 2007 compared hydrometeorological data to solar variability. The study looked at rainfall, river flow and flood data. The authors conclude that there is "an unequivocal synchronous linkage between these processes in South Africa and elsewhere, and solar activity." The study included an analysis of the level of Lake Victoria, which has been carefully monitored since 1896. In the early 1960s a dramatic rainfall increase significantly raised the lake level, and the level since then has been falling at about 29 mm per year. The decline has been removed from the data plotted below. The plot shows two periods of strong correlation between lake level and sunspot number, corresponding to periods of high levels of volcanic dust.
Lake Victoria Water Level and Sunspot Number
See the paper "Linkages between solar activity, climate predictability and water resource development" here.
Longer term, here is a correlation of a solar proxy to a temperature proxy over a period of 3,000 years. Values of carbon-14 (produced by cosmic rays, hence a proxy for solar activity) correlate extremely well with oxygen-18 (a temperature proxy). The lower graph shows a particularly well-resolved time interval from 8,350 to 7,900 years BP.
The above graph summarizes data obtained from a stalagmite from a cave in Oman, as reported in the paper, Neff, U., et al. 2001.
A team of researchers led by scientists from the Max Planck Institute for Solar System Research analysed radioactive isotopes in trees and found that the Sun has been more active in the last half of the 20th century than at any time in the last 8,000 years. This study showed that the current episode of high solar activity since about the year 1940 is unique within the last 8,000 years. See a press release here. A graph from the study is below. The bottom chart is a detail of the shaded period of the top chart, from 9,300 to 8,600 years before the present.
Recently, Tim Patterson, an adviser to FOS, has studied high-resolution Holocene climate records from fjords and coastal lakes in British Columbia and demonstrated a link between temperature and solar cycles.
The spectral analysis shown here is from sediment cores obtained from Effingham Inlet, Vancouver Island, British Columbia. The annually deposited laminations of the core are linked to the changing climate conditions. The analysis shows a strong correlation to the 11-year sunspot cycle.
See here for a powerpoint slide show by Tim Patterson.
N. Shaviv and J. Veizer, using seashell thermometers, show a strong correlation between temperature and the cosmic ray flux over the last 520 million years.
Cosmic Ray Flux and Tropical Temperature Variation Over the Phanerozoic 520 million years
The upper curves describe the cosmic ray flux (CRF) using iron meteorite exposure age data. The blue line depicts the nominal CRF, while the yellow shading delineates the allowed error range. The two dashed curves are additional CRF reconstructions that fit within the acceptable range. The red curve describes the nominal CRF reconstruction after its period was fine-tuned to best fit the low-latitude temperature anomaly. The bottom black curve depicts the smoothed temperature change derived from calcitic shells over the Phanerozoic. The red line is the predicted temperature model for the red curve above. The green line is the residual. The top blue bars indicate ice ages.
A paper by Nicola Scafetta (2011) compares the historical records of mid-latitude auroras from 1700 to the surface temperature records. It shows that the aurora records share the same oscillation frequencies evident in the temperature record and in several planetary and solar records. The author argues that the aurora records reveal a physical link between climate change and astronomical oscillations. The abstract states: "In particular, a quasi-60-year large cycle is quite evident since 1650 in all climate and astronomical records herein studied ... The existence of a natural 60-year cyclical modulation of the global surface temperature induced by astronomical mechanisms, by alone, would imply that at least 60 to 70% of the warming observed since 1970 has been naturally induced. Moreover, the climate may stay approximately stable during the next decades because the 60-year cycle has entered in its cooling phase." See here.
Sun and Cosmic Rays
During the 20th century the Sun's activity increased, and it may have contributed directly to a third of the warming over the last hundred years. The change in solar output is too small to directly account for most of the observed warming. However, the Sun-cosmic ray connection provides an amplification mechanism by which a small change in solar irradiance can have a large effect on climate.
A paper by H. Svensmark and E. Friis-Christensen of the Center for Sun-Climate Research of the Danish National Space Center in Copenhagen has shown that cosmic rays correlate strongly with low cloud formation. Changes in the intensity of galactic cosmic rays alter the Earth's cloudiness.
An experiment in 2005 shows the effect of cosmic rays in a reaction chamber containing air and trace chemicals found over the oceans. Electrons released in the air by cosmic rays act as a catalyst in making aerosols. They significantly accelerate the formation of stable, ultra-small clusters of sulphuric acid and water molecules, which are the building blocks of cloud condensation nuclei.
Danish scientists reported in May 2011 that they had succeeded for the first time in directly observing that the electrically charged particles coming from space and hitting the atmosphere at high speed contribute to creating the aerosols that are the prerequisites for cloud formation. In a climate chamber at Aarhus University, scientists created conditions similar to the atmosphere at the height where low clouds are formed. This artificial atmosphere was irradiated with fast electrons from ASTRID, Denmark's largest particle accelerator. The experiments show that increased radiation from cosmic rays leads to more aerosols. In the atmosphere, these aerosols grow into actual cloud nuclei in the course of hours or days. Water vapour concentrates on the nuclei, forming small cloud droplets. See the news release here.
A team of 63 scientists published results in August 2011 of a much more sophisticated experiment which investigated the effects of cosmic rays on cloud formation. The CLOUD (Cosmics Leaving OUtdoor Droplets) experiment at CERN (European Organization for Nuclear Research) in Geneva shows large effects of pions from an accelerator, which simulate cosmic rays and ionize the air in the experimental chamber. The CLOUD experiment is the most rigorous test of the cosmic ray hypothesis yet devised. The experiments show that cosmic rays strongly enhance the formation rate of aerosols, by up to tenfold, confirming the earlier results from the Danish experiment. The aerosols may grow into cloud condensation nuclei on which cloud droplets form. See the CERN press release here.
The graph below shows the aerosol particle concentration growth in the CLOUD chamber. In an early-morning experimental run at CERN, starting at 03:45, ultraviolet light began making sulphuric acid molecules in the chamber, while a strong electric field cleansed the air of ions. As soon as the electric field was switched off at 04:33, natural cosmic rays raining down through the roof helped to build clusters at a higher rate. When CLOUD simulated stronger cosmic rays with a beam of charged pion particles starting at 04:58, the rate of cluster production became faster still. The various colours are for clusters of different diameters (in nanometres) as recorded by various instruments. The largest (black) took longer to grow than the smallest (blue). See here. The CLOUD results also show that trace vapours assumed until now to account for aerosol formation in the lower atmosphere can explain only a tiny fraction of the observed atmospheric aerosol production.
Coronal mass ejections from the Sun cause large, short-lived decreases in the cosmic ray count, called Forbush decreases. These dramatic, short-term cosmic ray decreases can be used to test the cosmic ray effect on clouds. The magnetic plasma clouds from solar coronal mass ejections provide a temporary shield against galactic cosmic rays.
A study by Svensmark et al in 2009 shows that the decrease in cosmic rays has a large effect on the amount of aerosols, the cloud cover and the liquid water content of clouds. The authors conclude "From solar activity to cosmic ray ionization to aerosols and liquid-water clouds, a causal chain appears to operate on a global scale."
The figure below shows the evolution of fine aerosols particles in the lower atmosphere (AERONET), cloud water content (SSM/I), liquid water cloud fraction (MODIS), and low IR-detected clouds (ISCCP), averaged for the 5 strongest Forbush decreases in the period 1987-2007. The red dashed line shows the average cosmic ray count percent change. The lowest aerosol count occurs 5 days after the Forbush minimum, and the cloud water content minimum occurs 4 days later. The response in cloud water content for the larger events is about 7%.
The broken horizontal lines denote the mean for the first 15 days before the Forbush minimum of each of the four data sets.
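The figure's curves are superposed-epoch averages: the same window of days around each Forbush minimum is extracted from the daily record and averaged across events, so that the common response survives while unrelated noise cancels. A minimal numpy sketch of that averaging; the function name, window lengths and the synthetic data are illustrative, not taken from the Svensmark et al. study:

```python
import numpy as np

def superposed_epoch_average(series, event_indices, days_before=15, days_after=20):
    """Average fixed windows of a daily time series centred on event days."""
    windows = []
    for i in event_indices:
        if i - days_before >= 0 and i + days_after < len(series):
            windows.append(series[i - days_before : i + days_after + 1])
    return np.mean(windows, axis=0)

# Illustrative daily series: noise, with a dip inserted at three "Forbush" days
rng = np.random.default_rng(0)
series = rng.normal(0.0, 1.0, 400)
events = [100, 200, 300]
for e in events:
    series[e : e + 5] -= 5.0              # a 5-day depression after each event

avg = superposed_epoch_average(series, events)
print(len(avg))                            # 36: 15 days before + event day + 20 after
# The depression stands out in the average even though each single event is noisy
print(avg[:15].mean() > avg[15:20].mean())
```

The broken horizontal lines in the figure play the same role as the pre-event mean computed here: a baseline against which the post-minimum dip is measured.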
Data from the International Satellite Cloud Climatology Project and the Huancayo cosmic ray station show a remarkable correlation between low clouds (below 3 km) and cosmic rays. There are more than enough cosmic rays at high altitudes, so changes in the cosmic ray flux do not affect high clouds. But fewer cosmic rays penetrate to the altitude of the low clouds, so low clouds are sensitive to changes in cosmic rays.
Cosmic Rays and Low Clouds
The blue line shows variations in global cloud cover collated by the International Satellite Cloud Climatology Project. The red line is the record of monthly variations in cosmic-ray counts at the Huancayo station.
Low-level clouds cover more than a quarter of the Earth's surface and exert a strong cooling effect on the surface. A 2% change in low clouds during a solar cycle will change the heat input to the Earth's surface by 1.2 watts per square metre (W/m2). This compares to the total 20th-century forcing of 1.4 W/m2 cited by the IPCC. (The IPCC does not recognize the effect of the Sun and cosmic rays, and attributes the warming to CO2.)
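The arithmetic behind these figures can be checked directly. The 60 W/m2-per-unit-cloud-fraction sensitivity below is the value implied by the text's own numbers, back-calculated rather than independently sourced:

```python
# Implied net radiative effect of low clouds, back-calculated from the text:
# a 0.02 (2 percentage point) change in low-cloud fraction is said to change
# the heat input at the surface by 1.2 W/m^2.
delta_cloud_fraction = 0.02        # change in low-cloud cover over a solar cycle
delta_forcing = 1.2                # W/m^2, resulting change in surface heat input

# Implied sensitivity, in W/m^2 per unit change in low-cloud fraction
cloud_radiative_sensitivity = delta_forcing / delta_cloud_fraction
print(cloud_radiative_sensitivity)              # 60.0

# Compare against the 1.4 W/m^2 total 20th-century forcing cited in the text
ipcc_total_forcing = 1.4                        # W/m^2
print(round(delta_forcing / ipcc_total_forcing, 2))  # ~0.86 of the total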
Cosmic ray flux can be determined from radioactive isotopes such as beryllium-10, or from the Sun's open coronal magnetic field. The two independent cosmic ray proxies confirm that there has been a dramatic reduction in the cosmic ray flux during the 20th century as the Sun has gained intensity and the Sun's coronal magnetic field has doubled in strength.
Cosmic Ray Flux Since 1700
Changes in the flux of galactic cosmic rays since 1700 are here derived from two independent proxies: 10Be (light blue) and open solar coronal flux (dark blue) (Solanki and Fligge 1999). Low cloud amount (orange) is scaled and normalized to observational cosmic-ray data from Climax (red) for the period 1953 to 2005 (3 GeV cut-off). Both scales are inverted to correspond with rising temperatures. Note that the high cosmic ray flux around 1700 occurs at the end of the Little Ice Age. Also note the increase in cosmic ray flux after 1780, at the time of the Dickens winters.
The graph below shows a correlation between the cosmic ray counts and the global troposphere temperature radiosonde data. The cosmic ray scale is inverted to correspond to increasing temperatures. High solar activity corresponds to low cosmic ray counts, reduced low cloud cover, and higher temperatures. The upper panel shows the troposphere temperatures in blue and the cosmic ray count in red. The lower panel shows the match achieved by removing El Nino, the North Atlantic Oscillation, volcanic aerosols and a linear trend of 0.14 Celsius/decade.
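Removing known signals as in the lower panel is ordinarily done by multiple linear regression: fit the temperature series against the ENSO and NAO indices, a volcanic aerosol series and a linear trend, then subtract the fitted contributions. A hedged numpy sketch with synthetic stand-in series; the indices, coefficients and noise level below are illustrative, not values from Svensmark and Friis-Christensen:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 360                                  # 30 years of monthly data
t = np.arange(n) / 120.0                 # time in decades

# Synthetic stand-ins for the explanatory series
enso = rng.normal(size=n)
nao = rng.normal(size=n)
volcanic = rng.normal(size=n)
temperature = (0.14 * t                  # linear trend, C/decade, as in the text
               + 0.3 * enso + 0.1 * nao - 0.2 * volcanic
               + rng.normal(0, 0.05, n))

# Design matrix: intercept, linear trend, and the three indices
X = np.column_stack([np.ones(n), t, enso, nao, volcanic])
coef, *_ = np.linalg.lstsq(X, temperature, rcond=None)

# Residual after subtracting everything fitted: what remains to be
# compared against the cosmic-ray record
residual = temperature - X @ coef
print(round(coef[1], 2))                 # recovered trend, ~0.14 C/decade
```

The quality of the resulting match then depends on how much of the residual lines up with the cosmic ray series, which is the comparison the lower panel makes.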
The negative correlation between cosmic ray counts and troposphere temperatures is very strong, indicating that the Sun is the primary climate driver. H. Svensmark and E. Friis-Christensen published the above graph in a paper in October 2007, in response to a paper by M. Lockwood and C. Frohlich in which they argue that the historical link between the Sun and climate came to an end about 20 years ago. However, the Lockwood paper had several deficiencies, including the problem that it used surface temperature data contaminated by the urban heat island effect (see below). The authors also failed to account for the large time lag between long-term solar intensity changes and the climate temperature response.
See the Svensmark rebuttal of the Lockwood paper here, and a critique by myself here.
Over the 20th century the Sun's activity and irradiance intensity increased, directly providing some warming. The graph below from here shows the rising solar flux during most of the twentieth century.
Open Solar Flux
Dr. U.R. Rao of Bangalore, India, shows that galactic cosmic rays, using 10Be measurements in deep polar ice as the proxy, have decreased by 9% during the last 150 years. The decrease in cosmic rays causes a 2.0% decrease in low cloud cover, resulting in a radiative forcing of 1.1 W/m2, which is about 60% of that due to the CO2 increase during the same period. See here.
In the top panel showing cosmic ray intensity, the continuous line represents estimated Climax neutron monitor counting rate (1956-2000), open circles denote ionization chamber measurements during (1933-1956) and filled circles represent cosmic ray intensity derived from 10Be (1801-1932). 10Be is a long-lived radioactive beryllium isotope produced by cosmic rays. The middle panel shows the near-Earth helio-magnetic field and the lower panel shows the sunspot number.
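Rao's "about 60% of the CO2 forcing" figure can be cross-checked against the widely used simplified expression for CO2 radiative forcing, dF = 5.35 ln(C/C0) W/m2 (Myhre et al. 1998). The 280 and 390 ppm endpoints below are assumed round numbers for the ~150-year interval, not values taken from Rao's paper:

```python
import math

# Simplified CO2 radiative forcing expression (Myhre et al. 1998)
def co2_forcing(c_now, c_ref):
    return 5.35 * math.log(c_now / c_ref)   # W/m^2

# Assumed round-number CO2 endpoints for the ~150-year interval
f_co2 = co2_forcing(390.0, 280.0)
print(round(f_co2, 2))                      # ~1.77 W/m^2

# Cloud forcing from the 2.0% low-cloud decrease quoted by Rao
f_cloud = 1.1                               # W/m^2
print(round(f_cloud / f_co2, 2))            # ~0.62, i.e. about 60%
```

Under these assumptions the 1.1 W/m2 cloud forcing is indeed close to 60% of the CO2 forcing, consistent with the figure quoted in the text.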
A reconstruction of the near Earth heliospheric magnetic field strength from 1900 through 2009 from here by Svalgaard and Cliver (2010) is shown below.
The red curve is satellite direct measurements of the near-Earth heliospheric magnetic field (HMF) strength resulting from the solar wind. The blue curve is the Inter-Diurnal Variability (IDV) index calculated from the geomagnetic field observations one hour after midnight. The IDV is highly correlated with the near-Earth HMF. The green curve is the estimate of the HMF by Lockwood et al 2009.
When the Sun is active it has a higher number of sunspots and emits more solar wind - a continuous stream of very high-speed charged particles. The increased solar wind and magnetic field repel cosmic rays that would otherwise hit the Earth's atmosphere, resulting in fewer aerosols in the lower atmosphere and thereby reducing low cloud formation. The low clouds have a high reflectivity and exert a strong cooling effect by reflecting sunlight back into space.
In summary, the process is:
More active Sun --> more sunspots --> more solar wind --> fewer cosmic rays --> fewer aerosols --> fewer low clouds --> more sunlight to the surface --> global warming.
The theory of CO2 warming implies that the Arctic and Antarctic should be warming at about the same rate, and that the polar regions should be warming more than the rest of the Earth. However, Antarctica has not warmed since 1975, which is a big problem for the CO2 theory. The ice covering Antarctica has even higher reflectivity than low clouds, so fewer low clouds cool Antarctica, while fewer low clouds warm the rest of the planet. (Greenland's ice sheet is much smaller and is not so reflective.) This Antarctic temperature trend is strong evidence that the Sun, not CO2, is the primary climate driver.
Antarctica and North America Temperature Trends
The top curve is the North American surface temperature and the bottom curve is the Antarctica (64 S - 90 S) surface temperature over the past 100 years. The Antarctic data have been averaged over 12 years to minimize the temperature fluctuations. The blue and red lines are fourth-order polynomial fits to the data. The curves are offset by 1 K for clarity, otherwise they would cross and re-cross three times.
The cosmic ray flux is not only influenced by the solar wind, it also varies with the position of the solar system in the galactic arms. The solar system passes through the arms of the Milky Way galaxy roughly every 140 million years. When the solar system is in the galactic arms the intensity of cosmic rays increases, as we are closer to more supernovas that give off powerful bursts of cosmic rays. The variations of the cosmic ray flux due to the solar system passing through four arms of the Milky Way galaxy during the last 550 million years is ten times greater than that caused by the Sun. The correlation between cosmic rays and temperatures over 520 million years by N. Shaviv and J. Veiser was shown previously. Below is a similar graph based on their work, but with the times of the galactic arm crossings shown.
Cosmic Ray Flux and Temperature Changes with Galactic Arm Crossings
Four switches from warm hothouse to cold icehouse conditions during the Phanerozoic are shown in variations of several degrees K in tropical sea-surface temperatures (red curve). They correspond with four encounters with spiral arms of the Milky Way and the resulting increases in the cosmic-ray flux (blue curve, scale inverted). (After Shaviv and Veizer 2003)
Temperature changes over this time range cannot be explained by the CO2 theory.
CO2 Concentrations 500 Million Years
The graph shows CO2 concentration over the last 500 million years. The CO2 does not correlate with temperature. Note that when CO2 concentrations were more than 10 times present levels, about 175 million and 440 million years ago, the Earth was in two very cold ice ages.
See here for a paper on CosmoClimatology by Henrik Svensmark.
See here for a discussion of the Shaviv and Veizer 2003 paper by Tim Patterson. See here for their paper.
The Earth-Sun orbital changes are the principal causes of long-term climate change. During the last 800,000 years, eight periods of glaciation have occurred. Each ice age lasts about 100,000 years, with warm interglacial periods lasting 10,000 to 12,000 years. Milutin Milankovitch (1879-1958) identified three major cyclical variables which became recognized as the major causes of climate change. The amount of solar radiation reaching the Earth depends on the distance of the Earth from the Sun and the angle of incidence of the Sun's rays upon the Earth's surface. The Earth's axis tilt changes on a 40,000-year cycle, the precession of the equinox changes on a 21,000-year cycle, and the eccentricity of the Earth's elliptical orbit changes on a 100,000-year cycle.
The Earth's axis tilt (also known as obliquity of the ecliptic) changes from 22 to 24.5 degrees over a 40,000-year cycle. Summer to winter extremes are greater when the axis tilt is greater. The precession of the equinox refers to the Earth's wobble as it spins on its axis. Currently, the north axis points to the North Star, Polaris. In 13,000 years it will point to the star Vega, then return to Polaris in another 13,000 years, creating a 26,000-year cycle. When this is combined with the advance of the perihelion (the point at which the Earth is closest in its orbit to the Sun), it produces a 21,000-year cycle. The variation of the elliptical shape of the Earth's orbit around the Sun ranges from an almost exact circle (eccentricity = 0.0005) to a slightly elongated shape (eccentricity = 0.0607) on a 100,000-year cycle. The Earth's eccentricity varies primarily due to interactions with the gravitational fields of other planets. The impact of the variation is a change in the amount of solar energy from the closest approach to the Sun (perihelion, around January 3) to the farthest distance from the Sun (aphelion, around July 4). Currently the Earth's eccentricity is 0.016 and there is about a 6.4 percent increase in incoming solar energy from July to January. In the Northern Hemisphere, winter occurs during the closest approach to the Sun. The graph below shows the three cycles versus time. The vertical line represents the present, negative time is the past and positive time is the future. See here.
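The 6.4 percent figure follows from the inverse-square dependence of solar flux on distance. With perihelion at a(1-e) and aphelion at a(1+e), the perihelion-to-aphelion flux ratio is ((1+e)/(1-e))^2, which to first order in e is 1+4e; 4 x 0.016 gives the quoted 6.4%, while the exact value is slightly higher:

```python
e = 0.016                                  # current eccentricity, from the text

# Solar flux varies as 1/r^2; perihelion r = a(1-e), aphelion r = a(1+e)
exact = ((1 + e) / (1 - e)) ** 2 - 1       # exact fractional flux increase
approx = 4 * e                             # first-order approximation, 1 + 4e

print(round(approx, 3))                    # 0.064 -> the quoted 6.4%
print(round(exact, 4))                     # ~0.0661 -> exact value, ~6.6%
```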
Analysis of deep-sea cores shows sea temperature changes corresponding to these cycles, with the 100,000-year cycle being the dominant signal.
These orbital cycles do not cause enough change in the solar radiation reaching the Earth to cause the major climatic changes without an amplifier effect. A plausible amplifier is the Sun's varying solar wind, which modifies the amount of cosmic rays reaching the Earth's atmosphere.
The rate of change of global ice volume varies inversely with the solar insolation due to orbital changes. The graph below compares the June solar insolation anomaly north of 65 degrees latitude to the rate of change of global ice volume over the last 750,000 years. Reconstructions of global ice volumes rely on the measurement of oxygen isotopes in the shells of foraminifera from deep-sea sediment cores. The records also in part reflect deep ocean temperatures. Two ice records are shown; SPECMAP and HW04.
The ice melting and sublimation rates are very sensitive to summertime temperatures. The strong correlations and the absence of a large time lag demonstrate essentially concurrent variations in the change of ice volumes and summertime insolation in the northern high latitudes. Both ice volume reconstructions therefore support the Milankovitch hypothesis and show that the Sun is the dominant climate driver. The graph is from a paper by G. Roe here.
Heating of the Troposphere
Computer models based on the theory of CO2 warming predict that the troposphere in the tropics should warm faster than the surface in response to increasing CO2 concentrations, because that is where the CO2 greenhouse effect operates. The Sun-cosmic ray warming would warm the troposphere more uniformly.
The UN's IPCC fourth assessment report includes a set of plots of computer model predicted rate of temperature change from the surface to 30 km altitude and over all latitudes for 5 types of climate forcings as shown below.
Computer Model Predicted Temperature Change
The six plots show predicted temperature changes due to:
a) the Sun
b) volcanic activity
c) anthropogenic CO2 and other greenhouse gases
d) anthropogenic ozone
e) anthropogenic sulphate aerosol particles
f) all the above forcings combined
The rate of temperature change is shown by the colour in degrees Celsius per decade.
It is apparent that plot c) of warming caused by greenhouse gases is strikingly distinct from the other causes of warming. Plot f) is similar to plot c) only because the IPCC assumes that CO2 is the dominant cause of global warming.
The computer models show that greenhouse warming will cause a hot-spot at an altitude between 8 and 12 km over the tropics between 30 N and 30 S. The temperature at this hot-spot is projected to increase at a rate of two to three times faster than at the surface.
However, the Hadley Centre's real-world plot of radiosonde temperature observations shown below does not show the projected CO2 induced global warming hot-spot at all. The predicted hot-spot is entirely absent from the observational record. This shows that most of the global temperature change can not be attributed to increasing CO2 concentrations.
HadAT2 Radiosonde Data 1979 - 1999
The left scale is atmosphere pressure in hPa. The right scale is altitude in km.
Source: HadAT2 radiosonde observations, from CCSP (2006), p116, fig. 5.7E
See Greenhouse Warming? What Greenhouse Warming? by Christopher Monckton
The graph below compares the global annual temperatures of the troposphere to the surface measurements. The lower troposphere measurements (LT UAH) are from the University of Alabama in Huntsville; they represent the temperature of the troposphere up to approximately 8 km. The HadCRUT3 curve is the Land and Sea-Surface Temperatures data set from the UK Met Office. The GISS curve is the surface temperatures from the Goddard Institute of Space Studies. The three curves are scaled so that the average temperature of the first 5 years equals 0 degrees Celsius.
The graph below compares the annual temperatures of the troposphere to the surface measurements in the tropics. The lower troposphere measurements from the UAH cover 20 degrees North to 20 degrees South, the surface temperatures from HadCRUT4 cover 30 degrees North to 30 degrees South, and the surface temperatures from GISS cover 24 degrees North to 24 degrees South.
A comparison of the records shows that the surface has warmed faster than the troposphere, the opposite of what is predicted by the theory of CO2 warming. The observations agree with the Sun-cosmic ray warming theory.
The response of the troposphere temperatures in the tropics is sometimes called the fingerprint of the CO2 contribution to warming.
This graph shows two analyses of Microwave Sounding Unit (MSU) satellite temperature measurement data of the troposphere over the tropics from 20 degrees North to 20 degrees South. The UAH analysis is from the University of Alabama in Huntsville and the RSS analysis is from Remote Sensing Systems. The two analyses use different methods to adjust for factors such as orbital decay and inter-satellite differences. The overall trend lines to December 2012 show increasing temperatures at 0.06 C/decade for UAH and 0.11 C/decade for RSS. However, since January 2002, the temperatures have been declining at 0.16 C/decade for UAH and 0.24 C/decade for the RSS data. The IPCC projections do not agree with the data.
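Trend figures like these are slopes of ordinary least-squares fits to monthly anomaly series, rescaled from degrees per month to degrees per decade. A minimal sketch on synthetic data; the actual UAH and RSS series are not embedded here, so the series below is an illustrative stand-in with a known trend:

```python
import numpy as np

def trend_per_decade(monthly_anomalies):
    """OLS slope of a monthly anomaly series, in degrees C per decade."""
    months = np.arange(len(monthly_anomalies))
    slope_per_month = np.polyfit(months, monthly_anomalies, 1)[0]
    return slope_per_month * 120.0          # 120 months per decade

# Synthetic series with a known 0.11 C/decade trend plus monthly noise
rng = np.random.default_rng(2)
n = 408                                     # Jan 1979 .. Dec 2012, monthly
series = 0.11 / 120.0 * np.arange(n) + rng.normal(0, 0.1, n)
print(round(trend_per_decade(series), 2))   # recovers ~0.11
```

Fitting a sub-period (such as January 2002 onward) is the same computation restricted to that slice of the series, which is why short-period trends can differ so much from the full-record trend.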
The graph "HadAT2 Radiosonde Data 1979 - 1999" in the previous section shows that the stratosphere (above 16 km) has cooled, which might appear to indicate a greenhouse gas effect. However, stratospheric cooling is predicted to occur due to both greenhouse gases and ozone depletion. The ozone concentration in the stratosphere declined from 1970 until 1995, and has not declined at all since then due to the implementation of the Montreal Protocol, which limits the emission of ozone-reducing CFCs. See here. The stratosphere temperatures are given below from here.
The lower stratosphere temperature has not declined at all since 1995 (while ozone levels have been stable or slightly increasing), so the data do not indicate any greenhouse gas cooling of the stratosphere. In fact, it appears that there has been a slight warming of the lower stratosphere since 1995, the opposite of what is predicted by computer models of the greenhouse gas effect. The stratosphere cooling indicated by the radiosonde data is caused by the changing ozone concentration, not by greenhouse gases.
Warming on Other Planets
If the Sun is the primary driver of climate change, one should expect to see evidence of recent warming on other planets. As the Earth has warmed over the last 100 years, so too have Jupiter, Neptune, Mars and Pluto.
Jupiter is the largest planet in the solar system. Its most distinctive feature is the Great Red Spot, a huge storm that has been raging for over 300 years. A new storm, called Red Spot Jr., recently formed from the merger of three oval-shaped storms between 1998 and 2000. The latest images from the Hubble Space Telescope suggest that Jupiter is in the midst of a global change that can modify temperatures by as much as 10 degrees Fahrenheit on different parts of the globe. The new storm has been rising in altitude above the surrounding clouds, which signals a temperature increase. See here from Space.com.
Neptune is the furthest planet from the Sun (Pluto is now classified as a dwarf planet) and orbits at 30 times the Earth-Sun distance. In a recent article, Hammel and Lockwood, of the Space Science Institute in Colorado and the Lowell Observatory, show that Neptune has been getting brighter since around 1980; furthermore, infrared measurements of the planet show that it warmed steadily from 1980 to 2004.
In the figure, (a) represents the corrected visible light from Neptune from 1950 to 2006; (b) shows the temperature anomalies of the Earth; (c) shows the total solar irradiance as a percent variation by year; (d) shows the ultraviolet emission from the Sun. All data have been corrected for the effects of Neptune's seasons, variations in its orbit, the apparent tilt of the axis as viewed from the Earth, the varying distance from Neptune to Earth, and changes in the atmosphere near the Lowell Observatory.
See here for more information.
There is also strong evidence of global warming on Neptune's largest moon, Triton, which has heated up significantly since the Voyager spacecraft visited it in 1989. The warming trend is causing Triton's frozen nitrogen surface to turn into gas, making its atmosphere denser. See here.
A recent study shows that Mars is warming four times faster than the Earth. Mars is warming due to increased Sun activity, which increases dust storms. The study's authors, led by Lori Fenton, a planetary scientist at NASA, say the dust makes the atmosphere absorb more heat, causing a positive feedback. Surface air temperatures on Mars increased by 0.65 C (1.17 F) from the 1970s to the 1990s. Residual ice on the Martian south pole, they note, has steadily retreated over the last four years. Thermal spectrometer images of Mars taken by NASA's Viking mission in the late 1970s were compared with similar images gathered more than 20 years later by the Mars Global Surveyor.
Mars polar ice cap
See here or here or here for more information.
The demoted planet Pluto is also undergoing warming, according to astronomers. Pluto's atmospheric pressure has tripled over the last 14 years, indicating rising temperatures even as the planet moves farther from the Sun. See here for further information.
CO2 Versus the Sun/Cosmic Ray Warming Theories
The following table sets out a comparison of the predictions of two climate theories - the CO2 warming theory and the Sun/Cosmic Ray theory - and actual real world data.
| | Prediction - CO2 Theory | Prediction - Sun/Cosmic Ray Theory | Actual Data |
|---|---|---|---|
| Antarctic and Arctic temperatures | Temperatures in the Arctic and Antarctic will rise symmetrically | Temperatures will initially move in opposite directions | Temperatures move in opposite directions |
| Troposphere vs. surface warming | Fastest warming will be in the troposphere over the tropics | The troposphere warming will be more uniform | The surface warming is similar to or greater than the troposphere warming |
| Timing of CO2 and temperature changes at end of ice age | CO2 increases, then temperature increases | Temperature increases, then CO2 increases | CO2 concentrations increase about 800 years after temperature increases |
| Temperature correlation with the driver over last 400 years | Temperature correlates with CO2 | Temperature correlates with solar activity and cosmic ray flux | Cosmic ray flux and Sun activity correlate with temperature; CO2 does not |
| Temperatures during Ordovician period | Very hot due to CO2 levels > 10X present | Very cold due to high cosmic ray flux | Very cold ice age |
| Other planets' climate | | Other planets will warm | Warming has been detected on several planets |
CO2 Greatly Increases Plant and Forest Growth
CO2 is a major plant fertilizer. The increase in CO2 emissions has caused increased crop yields and faster-growing plants and forests, thereby greening the planet. Estimates vary, but somewhere around 15% seems to be the common number cited for the increase in global food crop yields due to aerial fertilization with increased carbon dioxide since 1950. This increase has both helped avoid a Malthusian disaster and preserved or returned enormous tracts of marginal land as wildlife habitat that would otherwise have had to be put under the plow in an attempt to feed the growing global population. Commercial growers deliberately generate CO2 and increase its levels in agricultural greenhouses to between 700 ppm and 1,000 ppm to increase productivity and improve the water efficiency of food crops far beyond those in the somewhat CO2-starved atmosphere. CO2 feeds the forests, grows more usable lumber in timber lots (meaning there is less pressure to cut old growth or push into "natural" wildlife habitat), makes plants more water efficient (helping to beat back the encroaching deserts in Africa and Asia) and generally increases bio-productivity. See here.
Bigtooth Aspen Growth Response to Enhanced CO2 and Temperature
Jurik et al. (1984) exposed bigtooth aspen leaves to atmospheric CO2 concentrations of 325 ppm and 1935 ppm and measured their photosynthetic rates at a number of different temperatures. At 25C, where the net photosynthetic rate of the leaves exposed to 325 ppm CO2 is maximal, the extra CO2 of this study boosted the net photosynthetic rate of the foliage by nearly 100%; and at 36C, where the net photosynthetic rate of the leaves exposed to 1935 ppm CO2 is maximal, the extra CO2 boosted the net photosynthetic rate of the foliage by a whopping 450%. These results are similar to studies of many other plants.
Young Eldarica Pine Tree Growth Response to CO2
Young Eldarica pine trees were grown for 23 months under four CO2 concentrations and then cut down and weighed. Each point represents an individual tree. Weights of tree parts are as indicated. See here.
Wheat Yield Response to CO2
This graph shows the response of wheat grown under wet conditions and when the wheat was stressed by lack of water. These were open-field experiments. Wheat was grown in the usual way, but the atmospheric CO2 concentrations of circular sections of the fields were increased by means of arrays of computer-controlled equipment that released CO2 into the air to hold the levels as specified. Average CO2-induced increases for the two years were 10% for wet and 23% for dry conditions.
Since atmospheric CO2 is the basic "food" of nearly all plants, the more of it there is in the air, the better they function and the more productive they become. For a 300 ppm increase in the atmosphere's CO2 concentration above the planet's current base level of slightly less than 400 ppm, for example, the productivity of earth's herbaceous plants rises by something on the order of 30% (Kimball, 1983; Idso and Idso, 1994), while the productivity of its woody plants rises by something on the order of 50% (Saxe et al., 1998; Idso and Kimball, 2001). Thus, as the air's CO2 content continues to rise, so too will the productive capacity or land-use efficiency of the planet continue to rise, as the aerial fertilization effect of the upward trending atmospheric CO2 concentration boosts the growth rates of nearly all plants.
A 2003 study using 18 years (1982 to 1999) of satellite observations shows that global net primary plant production increased 6% over the period. The largest increase was in tropical ecosystems. Amazon rain forests accounted for 42% of the global increase in net primary production. See here.
The world's population is 6.6 billion and increasing at 1.18% per year. People will require increasing quantities of food and more natural ecosystems will be lost to crops and pastures. The resulting loss of habitat may result in species extinctions if crop yields are not significantly increased. Unfortunately, the rate of increase of crop yields is declining as crops are approaching the genetic yield limits. Increasing crop yields on existing farmlands would help to save lands for nature. If crop yields fail to increase, humans will suffer more frequent famines. Fortunately, the increase in CO2 concentrations will substantially enhance crop yields and is essential to prevent or delay the destruction of habitat and animal species, and may allow us to produce sufficient agricultural commodities to feed the growing population. Any action taken by us to slow or reverse the increase in CO2 concentration in the air may result in more frequent famines and species extinctions.
See here from CO2Science.
IPCC and Model Projections
The Intergovernmental Panel on Climate Change (IPCC) presents projections of climate change, which are based on computer models. The projections given in the Summary for Policy Makers are based on six scenarios, which include different assumptions about population growth, economic growth, technological change and CO2 emissions. The scenarios assume that no climate change mitigation actions are taken, and they do not assume implementation of the Kyoto Protocol. The IPCC does not assign any probability or likelihood to any of the scenarios, and the middle scenarios should not be interpreted as the most likely. The Fourth Assessment Report (AR4) used the Special Report on Emissions Scenarios (SRES) projections of greenhouse gas emissions that were used in the Third Assessment Report (TAR). The emission projections were converted to CO2 concentration projections in the TAR using the Bern climate-carbon cycle model (BERN-CC; Joos et al., 2001), which accounted for the climate-carbon cycle feedback (AR4 Ch 10, pages 750 and 790). The Bern-CC model gives a range of CO2 concentrations at the beginning of 2100 for the A2 emissions scenario of 735 ppm to 1080 ppm, with a best-estimate reference value of 836 ppm. Coupled climate-carbon cycle models simulate a range of CO2 concentrations for the A2 emissions scenario of 730 to 1020 ppm by the beginning of 2100. These results show that there is large uncertainty in the projected CO2 concentrations, even for a given emissions scenario, due to uncertainty in future changes in the carbon cycle.
The CO2 concentrations from the Bern-CC model were used as inputs to the IPCC AR4 climate models to calculate the projected warming of each emission scenario. The initial growth rates of the projected CO2 concentrations range from 0.48%/year to 0.56%/year. The CO2 concentrations of the six projections increase from 367 ppm at the beginning of year 2000 to a range of 540 ppm to 958 ppm at the beginning of year 2100. The table below shows the AR4 projections. The CO2 concentrations are the reference Bern-CC model values given here.
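As a sanity check on these figures, the average annual growth rate implied by a start and end concentration can be computed directly. This is a minimal sketch assuming compound growth; the function name is illustrative:

```python
# Implied average CO2 growth rate, assuming compound growth:
# C_end = C_start * (1 + r)^years, solved for r.

def implied_growth_rate(c_start_ppm, c_end_ppm, years):
    """Average annual growth rate in percent/year."""
    return ((c_end_ppm / c_start_ppm) ** (1.0 / years) - 1.0) * 100.0

# Endpoints of the six AR4 projections quoted above: 367 ppm in 2000,
# 540 to 958 ppm in 2100.
low = implied_growth_rate(367.0, 540.0, 100)
high = implied_growth_rate(367.0, 958.0, 100)
print(f"implied average growth: {low:.2f} to {high:.2f} %/year")
```

The quoted endpoints imply average growth rates of roughly 0.39 to 0.96 %/year, consistent with the highest scenario growth rate discussed below.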
[Table: AR4 scenario projections. Columns: Scenario; Best Estimate and Likely Range (C at 2090-2099 relative to 1980-1999); Best Estimate from 2006; Rate of Change; CO2 Concentration; CO2 Maximum Growth; CO2 Average Growth]
The temperature change "Best Estimate" given in the second column is measured from the average surface temperature in the period 1980 to 1999. The "Best Estimate" from 2006 given in the fourth column is reduced by 0.3 C to account for the actual temperature change from the 1980-1999 average to 2006. The average CO2 growth rates of the last two scenarios, at 0.82 and 0.96 %/year, appear unrealistic considering that the actual CO2 growth rate over 1990-2006 was 0.5%/year, and fossil fuels are expected to become more expensive as it becomes increasingly difficult to replace depleting oil and gas reserves. Note that the CO2 growth rate of the A1FI scenario increases from 0.51%/year in the 2000-2010 decade to 1.20%/year by 2060. The CO2 concentration projections corresponding to the six emission scenarios are shown below.
The graph below compares the actual CO2 concentrations as measured at Mauna Loa, Hawaii to the IPCC CO2 reference projections of the Bern-CC model for the period January 2000 to January 2010. The uncertainty in the CO2 projections is greater than the range shown here. The Low IPCC projection is from the B2 emission scenario, and the High IPCC projection is from the A1B emission scenario. The average IPCC projection is the average of the six scenarios. The actual annual CO2 growth rates in 2007 and 2008 were both 0.49%.
The IPCC temperature projections for the A2, A1B and B1
scenarios are displayed below.
Kevin Trenberth is head of the large US National Center for Atmospheric Research and one of the advisors of the IPCC. Trenberth asserts ". . . there are no (climate) predictions by IPCC at all. And there never have been". Instead, there are only "what if" projections of future climate that correspond to certain emissions scenarios. According to Trenberth, GCMs ". . . do not consider many things like the recovery of the ozone layer, for instance, or observed trends in forcing agents. None of the models used by IPCC is initialised to the observed state and none of the climate states in the models corresponds even remotely to the current observed climate." However, Scott Armstrong and Kesten Green audited the relevant chapter in the IPCC's latest report. They find that "in apparent contradiction to claims by some climate experts that the IPCC provides 'projections' and not 'forecasts', the word 'forecast' and its derivatives occurred 37 times, and 'predict' and its derivatives occur 90 times" in the chapter. Consequently, it is not surprising that the public has the misimpression that the IPCC predicts future climate.
Computer Models Fail
The computer models predict that 20th century temperatures should have increased by 1.6 to 3.74 Celsius, while the actual observed 20th-century temperature increase was about 0.6 Celsius. A model that fails to history match is useless for predicting the future.
The chart below compares the surface warming projections of the 2007 IPCC report to the actual global temperatures as represented by the HadCrut3 index. The red, green and blue curves are temperature projections from the A2, A1B and B1 emission scenarios. The orange curve is the temperature projection assuming the CO2 levels stay constant at the year 2000 value. The pink curve is the annual HadCrut3 actual temperature measurements. The black curve is the Fast Fourier Transform (FFT) best fit to the data.
See here. The quoted error on a single measurement is 0.05 C. The probability that the IPCC projections overstate the warming is greater than 90%.
The IPCC Fourth Assessment Report projected a surface temperature increase from 1990 to 2100 of 1.4 C to 5.8 C, corresponding to 0.13 C/decade to 0.53 C/decade. The IPCC low estimate corresponds to the actual temperature warming rate as measured by satellite data.
Dr. John Christy presented to the US Senate on August 1, 2012 the following graph of the results from 34 climate models that will be used in the IPCC's fifth assessment report. The thick black line is the multi-model mean hindcast and projection from 1975 to 2020. The graph also shows the surface temperature observations and satellite observations adjusted to surface temperatures.
The graph shows that these new climate model
results are wildly different from the observations. The satellite
observations show the temperature has increased by 0.2 Celsius from 1980 to
2010, but the climate models mean increase is 0.6 Celsius. Surface observations
show a 0.35 Celsius increase from 1980, but as explained elsewhere in this
document, the surface temperatures are contaminated by urban development. The
model mean temperature increase from 1980 to 2010 is three
times higher than the satellite observations so the forecasts are
useless for making policy decisions. Dr. Christy's presentation is here.
The IPCC assumes that the Sun has little effect, even though observational evidence clearly shows the Sun has a significant effect on climate.
The models assume the 20th century temperature rise was caused by CO2 increases, and parameters are set in the models to make the temperature rise in response to the CO2. The direct effect of increasing CO2 concentration on global warming is very small. All the models amplify an initial increase in temperature due to CO2 by employing water vapour and clouds as a large positive feedback. However, there is no evidence that water vapour and clouds provide a large positive feedback. They may provide a negative feedback.
The amount of solar energy the Earth receives depends on the Earth's albedo, or reflectivity. The greater the albedo, the more sunlight is reflected and the less solar energy is absorbed by the Earth. Project "Earthshine" at the Big Bear Solar Observatory measures the Earth's albedo by observing the amount of sunlight reflected by the Earth to the dark side of the Moon and back to Earth. The process is shown below.
The results show that the Earth's albedo gradually declined up to 1997, likely causing most of the global warming through 1998. Since 2001 the albedo has increased rapidly, which has stopped the warming and resulted in the current global cooling. The recent dimming of the Earth is likely due to increased low cloud cover. The albedo is shown below.
The blue lines are the observed earthshine data for 1994-1995 and 1999-2003. The black line is the reconstructed albedo from partially overlapping satellite cloud data with respect to the mean of the calibration period 1999 to 2001. The vertical red line shows the cumulative climate forcing of the increase in greenhouse gases over the 20th century of 2.4 W/m2 according to the IPCC. Note that the change of the albedo's climate forcing in W/m2 is much greater than that due to greenhouse gases. Current climate models do not show such large albedo variability. See an article by Anthony Watts here for further information. See the project Earthshine site here.
Climate models utilize large grid blocks to simulate climate, which are too large to include thunderstorms or hurricanes, so they use parameterizations to account for these. These parameterizations ignore real-world transfers of energy, moisture and momentum that could significantly alter the results, and this severely limits the usefulness of climate model projections. Computer models employ approximations to represent physical processes that cannot be directly computed due to computational limitations. Because many empirical parameters can be selected to force a model to match observations, the ability of a model to match observations cannot be cited as evidence that the model is realistic and does not imply it is reliable for forecasting climate. See the Fraser Institute's Independent Summary for Policy Makers.
Atmospheric methane concentrations have been declining in recent years. Methane is a significant greenhouse gas. Climate models assume that methane concentrations increase with temperature, and it is not known why its concentration is declining. Aerosols play a key role in climate, with a potential impact of more than three times that of CO2 emissions, but their influence is very poorly understood. Aerosols exert an overall cooling effect on climate but estimates of the effect vary by a factor of ten. Models used in the IPCC Fourth Assessment Report assume aerosols have a large cooling effect, thereby attributing a large warming effect to CO2.
Only 2 of the 23 models used by the IPCC account for varying Sun intensity, and these models do not assume the Sun affects the cosmic ray flux and cloud formation. Only 2 of the models account for land use changes.
Computer models predict warming at the north and south poles to be symmetrical, but there is a warming trend at the North Pole but not at the South Pole. They also predict that the polar surface regions will warm more than the surface at the tropics. Winter temperatures will warm more than summer temperatures; night-time temperatures will warm more than day-time temperatures. Therefore, according to the CO2 warming theory, winter nights in the arctic will warm, but there will be little summer day time warming in the tropics.
A team of four researchers from three American universities led by David Douglass compared the troposphere temperature trends in the tropics predicted by climate models to actual satellite and radiosonde observations. In a paper published in December 2007 by the Royal Meteorological Society, Douglass et al. analysed the simulation results from 22 climate models at the surface and at 12 different altitudes. The simulation results were compared to the temperature trends determined from two analyses of satellite data and four radiosonde datasets for the period January 1979 through December 2004.
Computer Model Temperature Trends versus Observations
The above diagram shows the comparison of temperature trends from 1979 through 2004 of climate models and actual satellite and radiosonde observations, expressed as degrees Celsius per decade versus altitude and atmospheric pressure. The left panel shows four radiosonde results as IGRA, RATPAC, HadAT2 and RAOBCORE. The thick red line shows the mean of the 22 computer model results, and the models' 2 times standard error of the mean are shown as the two thin red lines. Temperature trends from three surface measurement datasets are identified in the legend by Sfc and are plotted on the left axis. The RSS and UAH analysis of satellite data are plotted on the right panel at two effective layers: T2lt represents the lower troposphere with a weighted mean at 2.5 km, T2 represents the mid troposphere with a weighted mean at 6.1 km altitude. A trend is the slope of the line that has been least-squared fit to the data. Synthetic model values corresponding to the effective layers of the satellite data are shown in the right panel as open red circles.
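Since the comparison hinges on the definition of a trend as the slope of a least-squares straight-line fit, that calculation can be sketched as follows. This is a minimal illustration on a synthetic series, not the actual satellite data:

```python
# A trend here is the slope of a least-squares straight-line fit,
# quoted in degrees C per decade.
import numpy as np

def trend_per_decade(monthly_values):
    """Least-squares slope of a monthly series, converted to per-decade."""
    t_years = np.arange(len(monthly_values)) / 12.0
    slope_per_year = np.polyfit(t_years, monthly_values, 1)[0]
    return slope_per_year * 10.0

# Synthetic 26-year monthly series warming at exactly 0.1 C/decade
months = np.arange(312)
series = 14.0 + 0.01 * (months / 12.0)
print(f"trend: {trend_per_decade(series):.2f} C/decade")
```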
An essential place to compare observations with greenhouse computer models is the layer between 450 hPa and 750 hPa atmospheric pressure where the presence of water vapour is most important, and is called the "characteristic emission layer". In this layer, the observations all fall outside the 2 times standard error test. The radiosonde and satellite trends are inconsistent with the model trends at all altitudes above the surface. Douglass et al. conclude that "model results and observed temperature trends are in disagreement in most of the tropical troposphere, being separated by more than twice the uncertainty of the model mean. In layers near 5 km, the modelled trend is 100 to 300% higher than observed, and, above 8 km, modelled and observed trends have opposite signs." Therefore any projections of future climate from the models are very likely too high, and these projections should not be used to form public policy. See the paper "A comparison of tropical temperature trends with model predictions" here.
A technical paper published by R. McKitrick, S. McIntyre and C. Herman in Atmospheric Science Letters, August 2010 shows that the climate model temperature trends of the mid-troposphere, using 57 runs from 23 climate models, are four times larger than observations from satellites and weather balloons.
See here for a discussion by D. Stockwell and see the technical paper here.
While air temperature may fluctuate from year to year as heat is transferred between the air and oceans, if CO2 is causing global warming by the IPCC hypothesis, the ocean heat content must increase monotonically provided there are no major volcanic eruptions. Ocean heat content is a much more robust metric than surface air temperature for assessing global climate change because the ocean's heat capacity is greater than that of the atmosphere by many orders of magnitude. For any given area on the ocean's surface, the upper 2.6 m of water has the same heat capacity as the entire atmosphere above it! According to the IPCC models, all major feedbacks are positive, so there is no mechanism that would allow the heat content of the Earth to decline.
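The 2.6 m figure can be checked with a back-of-envelope calculation. The constants below are standard textbook approximations, not values from this document; depending on the constants chosen the result comes out roughly 2.4 to 2.6 m:

```python
# Depth of ocean whose heat capacity per unit area equals that of the
# full atmospheric column above it.
cp_air = 1004.0              # J/(kg K), specific heat of air at constant pressure
air_column_mass = 101325.0 / 9.81   # kg/m^2, surface pressure divided by g
atm_heat_capacity = cp_air * air_column_mass   # J/(m^2 K)

cp_water = 4186.0            # J/(kg K), specific heat of water
rho_water = 1025.0           # kg/m^3, seawater density
depth = atm_heat_capacity / (cp_water * rho_water)   # metres
print(f"equivalent ocean depth: {depth:.2f} m")
```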
Heat accumulating in the climate system can be measured on a global scale from 2003 by the ARGO array of 3341 free-drifting floats that measure temperature and salinity in the upper 2000 m of ocean. The robotic floats rise to the surface every 10 days and transmit data to a satellite which also determines their location as shown below.
Dr. Craig Loehle has analyzed the ocean heat content for a linear trend over 4.5 years of data from mid-2003 to the end of 2007. The data shows an annual variation because most of the oceans are in the southern hemisphere. To eliminate the annual cycle, a model was fit with slope, intercept, and sinusoidal (1-year fixed period) terms using nonlinear least-squares estimation. The linear component of the model shows a decline of 0.35 × 10^22 Joules/year. (The graph shows the recalibrated data, after the data from certain instruments with a cool bias were removed. Initial Argo results showed strong cooling.) The Argo heat content is shown below. See his paper here.
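The fitting approach described (slope, intercept, and a fixed 1-year sinusoid) can be sketched on synthetic data. With the period held fixed, the sinusoid splits into sine and cosine columns and the problem becomes linear in its parameters, so ordinary least squares recovers the same trend a nonlinear fit would:

```python
# Fit intercept, linear trend, and a fixed 1-year sinusoid by least squares.
import numpy as np

def fit_trend_with_annual_cycle(t_years, y):
    A = np.column_stack([
        np.ones_like(t_years),            # intercept
        t_years,                          # linear trend
        np.sin(2 * np.pi * t_years),      # annual cycle (sin part)
        np.cos(2 * np.pi * t_years),      # annual cycle (cos part)
    ])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs  # [intercept, slope, sin_amp, cos_amp]

# Synthetic heat-content series: declining 0.35 units/year plus a seasonal cycle
t = np.arange(0, 4.5, 1 / 12.0)
y = 10.0 - 0.35 * t + 0.8 * np.sin(2 * np.pi * t + 0.3)
coeffs = fit_trend_with_annual_cycle(t, y)
print(f"fitted linear trend: {coeffs[1]:.2f} per year")
```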
William DiPuccio compared the projected ocean heat content of the GISS climate model to two analyses of the ARGO heat content data. The projected heat content of the GISS model was adjusted to include only the upper oceans for comparison to the ARGO actual data. He also calculated a lower limit by scaling the net global anthropogenic radiative flux to ocean surface area. The observed ocean heat content trends were calculated by Josh K. Willis of NASA's Jet Propulsion Laboratory and Craig Loehle of the National Council for Air and Stream Improvement, Inc. Loehle's calculations have a smaller margin of error than Willis's, because Willis only uses annual average data.
The heat deficit shows that from 2003-2008 there was no positive radiative imbalance caused by anthropogenic forcing, despite increasing levels of CO2. Indeed, the radiative imbalance was negative, meaning the earth was losing slightly more energy than it absorbed. The figure reveals a robust failure on the part of the GISS climate model.
William DiPuccio says, "Since the oceans are the primary reservoir of atmospheric heat, there is no need to account for lag time involved with heat transfer. By using ocean heat as a metric, we can quantify nearly all of the energy that drives the climate system at any given moment. So, if there is still heat in the pipeline, where is it? The deficit of heat after nearly 6 years of cooling is now enormous. Heat can be transferred, but it cannot hide." See his paper here.
Below is a graph which compares, for the ARGO era (2003 to Q1 of 2011), the ocean heat content of the top 700 m from the National Oceanographic Data Center to the projections of the GISS climate model. The NODC OHC dataset is based on the Levitus et al (2009) paper which describes various adjustments and corrections to the data. The NODC data includes the ARGO data as described above and data from expendable bathythermographs. The GISS model projection is discussed here. The NODC data is here, and the graph is from here.
Note the enormous discrepancy between the measurements and the climate model projections.
The graph below shows the GISS-ER climate model 20th century hindcast (9 runs) and the projections (5 runs) compared to the NOAA observations. See here.
One of the most important parameters in determining climate sensitivity in climate models is the amount of heat they transfer to the oceans. The following graph by Dr. Spencer compares the Levitus observations of ocean warming trends during 1955-1999 to 15 IPCC 4AR climate model runs.
Note that the climate models exhibit wildly different trends, with the deep ocean cooling just as often as warming. The green curve is the Levitus actual observation to a depth of 700 m. Most of the models produce too much warming in the layer to 700 m. Many models produce unexpected ocean cooling below 100 m while the surface warms. None of the models even remotely match the observations. The weak ocean warming in the 700 m layer suggests low climate sensitivity, even if all the warming was due to CO2 emissions. See here.
The graphs below in this section, prepared by Bob Tisdale, compare temperature series to hindcasts of computer models used by the IPCC. Computer model hindcasts should be compared to the actual historical observations to determine how well the models matched the historical record. A model that fails to history match will not produce realistic projections.
The animation below compares observed North Atlantic temperature anomalies to the modeled surface air temperatures for the 6 individual ensemble members and the ensemble mean of the National Center for Atmospheric Research (NCAR) coupled climate model CCSM4. All data have been smoothed with a 121-month filter.
Bob Tisdale writes "The NCAR CCSM4 coupled climate model appears to do a poor job of hindcasting the multi-decadal variability of North Atlantic temperature anomalies." See here.
The animation below compares the sea surface temperature (SST) in the NINO 3 region to the climate model hindcasts. NINO 3 is a region in the Eastern Pacific tropics where El Nino events occur. The animation shows how poorly the models hindcast the frequency, magnitude, and trend of ENSO events. The model ensemble mean trend is 14 times greater than the trend of the observations.
Bob Tisdale writes "the frequency and magnitude of El Nino and La Nina events of the individual ensemble members do not come close to matching those observed in the instrument temperature record. Should they? Yes. During a given time period, it is the frequency and magnitude of ENSO events that determines how often and how much heat is released by the tropical Pacific into the atmosphere ..."
The graph below compares the linear trends for the observations and the model mean of the IPCC AR4 hindcasts/projections of SST for the period of January 1982 to February 2011 in 5-degree-latitude bands. The models predicted much greater warming trends in the tropics than what was observed. The actual warming in the northern regions is greater than modeled. Warming was predicted in the southern region but the SST trend was actually negative in much of the region. This shows that the models do an extremely poor job of simulating how tropical heat is transported to the Polar Regions. See here.
The sea surface temperatures from -50 to -80 degrees latitude (south) and from 50 to 80 degrees latitude (north) are shown below. The IPCC claims that CO2 is the main driver of climate change, but the best fit linear temperature trends declined at 0.4 C/century in the southern region and increased at 2.0 C/century in the northern region despite the fact that the CO2 concentrations in the two regions are the same.
The graph below compares the Eastern Pacific SST to models by latitude. This includes the important El Nino region so a good history match here is critical. The Eastern Pacific tropical SST has declined at the equator at 0.14 C/decade, but the models show a strong warming of 0.19 C/decade. See here.
The graph below compares SST observations to climate model outputs for the period of 1910 to August 2011. The SST is from the HADISST dataset and the model hindcast is the IPCC model mean published in 2007. The models do not match the temperature variability in the period 1910 to 1975. They are made to match the warming trend from 1975 to 2002 by assuming most of the warming is due to CO2 and using high sensitivity to greenhouse gases. The projections diverge from observations after 2002 despite the continued increase in CO2 emissions. See here.
The graph below shows the northern hemisphere sea surface temperature measurements and the climate model hindcasts for the period 1910 to 1944. The actual temperature rise was 4.5 times greater than the modeled trend. The models cannot replicate the measurements because they do not include natural causes of climate change. The graph is from here.
The global surface temperature trend from HadCRUT for the early 20th century warming period 1917 to 1944 at 0.174 C/decade is similar to the late warming period 1976 to 2005 at 0.195 C/decade as shown below. But the net anthropogenic forcing in the climate models during the late warming period is 3.8 times higher than the forcing during the early warming period. The 3.8-fold increase in forcing had almost no effect on the temperature trends of the two warming periods, indicating that the theory of anthropogenic global warming is seriously flawed. The graph is from here.
The graph below compares the 17-year (204 months) trends of the global SST to the IPCC model mean. Each point on the curves represents the 17-year straight-line best-fit trend up to that point in time. The IPCC models projected the global 17-year SST trend ending August 2011 at 0.15 C/decade, but the observed rise was only 0.02 C/decade. See here.
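The rolling-trend construction can be sketched as follows: each output point is the least-squares slope of the trailing 17 years of monthly data, expressed in C/decade. The series below is synthetic and illustrative, not the actual SST record:

```python
# Rolling 17-year trends: each point is the least-squares slope of the
# trailing 204-month window, with time in decades so the slope is C/decade.
import numpy as np

def rolling_trend(series, window_months=204):
    t_decades = np.arange(window_months) / 120.0
    trends = []
    for end in range(window_months, len(series) + 1):
        window = series[end - window_months:end]
        trends.append(np.polyfit(t_decades, window, 1)[0])
    return np.array(trends)

# Synthetic 30-year SST series warming at a constant 0.15 C/decade
months = np.arange(360)
sst = 0.015 * (months / 12.0)
trends = rolling_trend(sst)
print(f"final 17-year trend: {trends[-1]:.2f} C/decade")
```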
Bob Tisdale writes, "The coupled climate models used to hindcast past and project future climate in the IPCC 2007 report AR4 were not initialized so that they could reproduce the multi-decadal variations that exist in the global temperature record. This has been known for years." and "The climate models used by the IPCC appear to be missing a number of components that produce the natural multi-decadal signal that exists in the instrument-based Sea Surface Temperature record."
The daily temperature range over land has been decreasing because the daily minimum temperatures (Tmin) have increased more than the daily maximum temperatures (Tmax) over the 20th century. The NOAA Global Historical Network database shows that 2/3 of the warming is due to the increase in the minimum temperatures. The trend of the difference between the maximum and minimum daily temperatures is called the Diurnal Temperature Range and it is a very important climate parameter. A paper by McNider et al. (2012) shows that 6 climate models with published minimum and maximum temperatures replicate only 20% of the measured diurnal temperature trend, as shown in the figure below. This is a five-fold climate model error.
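The arithmetic linking Tmin's share of the warming to the DTR trend is simple: if two-thirds of the mean warming comes from Tmin, the DTR trend must be negative. The 0.6 C/century mean trend below is a hypothetical round number for illustration, not a value from the NOAA database:

```python
# Diurnal temperature range (DTR) trend implied by Tmin's share of warming.
# Tmean = (Tmin + Tmax) / 2, so trend_min + trend_max = 2 * trend_mean.
def dtr_trend(mean_trend, tmin_fraction=2.0 / 3.0):
    """Trend of (Tmax - Tmin) given the mean trend and Tmin's share of it."""
    total = 2.0 * mean_trend
    tmin_trend = tmin_fraction * total
    tmax_trend = (1.0 - tmin_fraction) * total
    return tmax_trend - tmin_trend

print(f"DTR trend: {dtr_trend(0.6):+.2f} C/century")
```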
Climate models are tuned to match only the 1970 to 2000 temperature rise of the average of the minimum and maximum temperatures (Tmean). If models are replicating Tmean but are not capturing the trend in Tmin, then this must mean that the model Tmax is warming faster than the actual Tmax. A computer analysis of the near surface boundary layer shows that an increase in greenhouse gases causes increased mixing of the boundary layer which brings warm nighttime air aloft down to the surface. Only 20% of the warming was due to longwave energy in the model simulation and 80% was due to increased turbulence. A layer only 20 to 50 m in thickness is warmed by this turbulence. The Tmax measured during the daytime represents a boundary layer 1 to 2 km deep. The climate models assume the Tmean represents an air thickness of 1 to 2 km, but it is actually only 20 to 50 m thick. The modeled Tmax is warming much faster and represents a much greater air thickness than the actual Tmax. See here.
Most of the warming in climate models is due to increasing water vapour as temperatures rise. The climate models greatly overestimate the Tmax trend, which represents the deep atmosphere, so they greatly overestimate the increase in water vapour in the lower atmosphere as well.
About 50% of human-caused CO2 emissions are absorbed by natural sinks and 50% remains in the atmosphere. The fraction of emissions that are sequestered in sinks is called the sink efficiency. The graph below shows that as more CO2 is produced, the fraction of emissions that is sequestered in sinks has increased at 0.94%/decade.
Most of the models forecast that the sink efficiency will decline, so that the CO2 concentration in the atmosphere will rise by an additional 50 to 100 ppm by 2100 compared to a constant sink efficiency. But the actual sink efficiency change is in the opposite direction of the climate models, so it is likely the CO2 content will rise more slowly than climate model predictions. A paper that discusses the climate model sink efficiency forecasts is here. The annual CO2 concentration data from Mauna Loa is here. The annual CO2 emissions data is here.
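The sink-efficiency bookkeeping works as follows. This is a sketch with made-up round numbers; the ~2.13 GtC-per-ppm conversion is a standard approximation, not a figure from the linked datasets:

```python
# Sink efficiency: the fraction of a year's CO2 emissions taken up by
# land and ocean sinks rather than remaining in the atmosphere.
GTC_PER_PPM = 2.13   # ~2.13 GtC of carbon corresponds to 1 ppm of CO2

def sink_efficiency(emissions_gtc, delta_co2_ppm):
    """Fraction of emissions absorbed by natural sinks."""
    airborne_fraction = (delta_co2_ppm * GTC_PER_PPM) / emissions_gtc
    return 1.0 - airborne_fraction

# Example: 10 GtC emitted while atmospheric CO2 rises 2.0 ppm that year
eff = sink_efficiency(10.0, 2.0)
print(f"sink efficiency: {eff:.0%}")
```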
Many important inputs to climate models are very uncertain and real-world observational evidence does not support them, so it is foolish to rely on their projections to make expensive policy decisions.
A scorecard listing the success of models is here.
Water Vapour Feedback
Relative humidity is the fraction of water vapour in a small parcel of air relative to the total amount of water vapour the air could contain at the given temperature and pressure. All the General Circulation Models, also known as Global Climate Models (GCM), just set various evaporation and precipitation parameters to achieve approximately the result: Relative humidity = constant.
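What the constant-RH assumption implies can be illustrated with the Magnus approximation for saturation vapour pressure. The formula and its coefficients are a standard approximation chosen for this sketch, not what GCMs actually use:

```python
# At constant relative humidity, the actual water vapour pressure rises
# with temperature in proportion to the saturation vapour pressure.
import math

def saturation_vapour_pressure(temp_c):
    """Magnus approximation, in hPa."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

rh = 0.5                                          # hold relative humidity fixed
e_cold = rh * saturation_vapour_pressure(15.0)
e_warm = rh * saturation_vapour_pressure(16.0)    # 1 C warmer
increase = (e_warm / e_cold - 1.0) * 100.0
print(f"vapour pressure rises ~{increase:.0f}% per degree at constant RH")
```

This is the familiar ~6-7% per degree increase in water vapour content implied by holding relative humidity constant as the air warms.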
Box 8.1 of 4AR Chapter 8 page 632 states:
The radiative effect of absorption by water vapour is roughly proportional to the logarithm of its concentration, so it is the fractional change in water vapour concentration, not the absolute change, that governs its strength as a feedback mechanism. Calculations with GCMs suggest that water vapour remains at an approximately constant fraction of its saturated value (close to unchanged relative humidity (RH)) under global-scale warming (see Section 8.6.3.1). Under such a response, for uniform warming, the largest fractional change in water vapour, and thus the largest contribution to the feedback, occurs in the upper troposphere.
The assumption of constant relative humidity is not correct. Here is a graph of global average annual relative humidity at various elevations in the atmosphere expressed in millibars (mb) from 300 mb to 700 mb for the period 1948 to 2012. [Standard atmospheric pressure = 1013 mb. 1 mb = 1 hectopascal (hPa)] The data is from the NOAA Earth System Research Laboratory here.
This graph shows that the relative humidity has been dropping, especially at higher elevations, allowing more heat to escape to space. The curve labelled 300 mb is at about 9 km altitude, which is in the middle of the predicted (but missing) tropical troposphere hot-spot. This is the critical elevation, as this is where radiation can start to escape without being recaptured. The average relative humidity at this altitude has declined by 21% (9.8 percentage points) from 1948 to 2012!
There is no logical reason to expect relative humidity to remain constant with increasing CO2 above the cloud layer. Relative humidity in a cloud is exactly 100% because the water droplets that make up the cloud are in equilibrium with the air. Likewise, relative humidity immediately above the oceans is 100%. The relative humidity in air parcels moving up over mountains will increase up to 100%, causing rainfall. This saturation limit controls the average humidity in the atmosphere up to the top of the cloud layer. But the relative humidity at 400 mb averages only 35% globally, or 30% in the tropics, and rarely gets anywhere near the saturation limit except in high thunderstorm clouds. The saturation limit therefore plays little role in determining the water vapour content of the upper atmosphere.
Doubling the amount of CO2 would increase temperatures by only about 1 degree Celsius if nothing else changed, according to the IPCC. But the amount of water vapour will change in response to a CO2 induced temperature increase. Warmer air can hold more water vapour, so if relative humidity remains constant, the amount of water vapour increases with increasing temperatures. More water vapour, being a powerful greenhouse gas, would cause a further temperature increase, which is called a positive feedback. Most of the IPCC's projected warming is due to this water vapour feedback.
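The "about 1 degree if nothing else changed" figure follows from the widely used simplified forcing expression dF = 5.35 ln(C/C0) W/m2 combined with a no-feedback (Planck) response of roughly 0.3 K per W/m2. Both numbers are conventional approximations, not values from this document:

```python
# No-feedback warming from a change in CO2 concentration.
import math

def no_feedback_warming(concentration_ratio, planck_sensitivity=0.3):
    """Warming in K, with sensitivity ~0.3 K per W/m^2 (no feedbacks)."""
    forcing = 5.35 * math.log(concentration_ratio)   # W/m^2
    return planck_sensitivity * forcing

dT = no_feedback_warming(2.0)   # doubling of CO2
print(f"no-feedback warming for 2xCO2: {dT:.1f} C")
```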
But the above graph shows falling relative humidity where the IPCC says changing water vapour content is most important. If relative humidity declines with increasing CO2 concentrations, the amount of water vapour in the upper troposphere may not increase, but might decline instead, resulting in a negative water vapour feedback.
Here is a graph of specific humidity, or the actual water vapour content, in grams of water vapour per kilogram of air, at the 400 mb level (about 8 km altitude).
This shows that the actual water vapour content in the upper troposphere declined by 13.7% (best-fit line) from 1948 to 2012 at the 400 mb pressure level. The climate models predict that humidity will increase in the upper troposphere, but the data shows a large decrease, just where water vapour changes have the greatest effect on global temperatures.
The NASA water vapour project (NVAP) uses multiple satellite sensors to create a standard climate dataset to measure long-term variability of global water vapour. The Heritage NVAP merges data from several satellites and radiosonde water vapour products for the years 1988 to 2001. The graph below left was presented at the GEWEX/ESA DUE GlobVapour workshop, March 8, 2011, here. Water vapour content of an atmospheric layer is represented by the height in millimeters (mm) that would result from precipitating all the water vapour in a vertical column to liquid water.
The graph shows a significant decline in global water vapour in the atmosphere layer from 500 to 300 hPa, about 6 to 9 km altitude. The animation above right shows the amount of water vapor over the earth in the 500 to 300 mbar pressure layer. The Heritage NVAP global water vapour data (1988 to 2001) by layer is available from a NASA website here.
The global annual average precipitable water vapour by atmospheric layer and by hemisphere is shown in the following graph. The data in Excel format is here.
The graph is presented on a logarithmic scale so the vertical change of the curves approximately represents the forcing effect of the change. The water content of the L1 layer, surface to 700 mb, is about 20 times greater than in the L3 layer, 500 to 300 mb, whereas the forcing effect of a change in L3 is approximately 14.5 times that of the same change in L1. From 1990 to 2001, the water vapour changed by: L3 -0.55 mm, L2 -0.57 mm, L1 +1.73 mm. The decrease in L3 is equivalent to an 8 mm reduction in L1. The water vapor decline in the L2 and L3 layers overwhelms the forcing effect of the water vapor increase in the L1 layer, so the water vapor feedback is negative. The upper atmosphere (L2 and L3) water vapor content of the southern hemisphere is less than, and has declined more than, that of the northern hemisphere.
The above graphs show the precipitable water vapour by layer versus latitude by one degree bands.
Dr. Ferenc Miskolczi performed computations using the HARTCODE line-by-line radiative code to determine the sensitivity of OLR to a 0.3 mm change in precipitable water vapor in each of 5 layers of the NVAP-M project.
The results show that a water vapor change in the 500-300 mb layer has 29 times the effect on OLR of the same change in the 1013-850 mb near-surface layer. A water vapor change in the 300-200 mb layer has 81 times the effect on OLR of the same change in the near-surface layer.
The table below shows the precipitable water vapor for the three layers of the Heritage NVAP and the CO2 content for the years 1990 and 2001, and the change.
The table below shows the change in OLR per change in water vapor in each layer, and the change in OLR from 1990 to 2001 due to the change in precipitable water vapor (PWV).
                        L1      L2      L3      Sum      CO2
OLR/PWV (W/m2/mm)    -0.329  -1.192  -4.75
OLR/CO2 (W/m2/ppmv)                                    -0.0101
OLR change (W/m2)    -0.569   0.679   2.613   2.723    -0.171
The calculations show that the cooling effect of the water vapor changes on OLR is 16 times the warming effect of CO2 during this 11-year period. The cooling effect of the two upper layers is 5.8 times the warming effect of the lowest layer.
These results highlight the fact that changes in the total water vapor column, from the surface to the top of the atmosphere, are of little relevance to climate change, because the sensitivity of OLR to water vapor changes in the upper atmosphere overwhelms that to changes in the lower atmosphere. See here.
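The OLR entries in the table follow from multiplying each layer's sensitivity by its 1990-2001 water vapour change; the arithmetic quoted above can be checked directly:

```python
# Sensitivities (W/m2 per mm of PWV) and 1990-2001 PWV changes (mm) per layer,
# taken from the figures quoted in the text.
sens = {"L1": -0.329, "L2": -1.192, "L3": -4.75}
dpwv = {"L1": +1.73, "L2": -0.57, "L3": -0.55}

# OLR change per layer: sensitivity times water vapour change
olr = {layer: sens[layer] * dpwv[layer] for layer in sens}
total_h2o = sum(olr.values())
olr_co2 = -0.171  # W/m2, from -0.0101 W/m2/ppmv times the CO2 increase

print({k: round(v, 3) for k, v in olr.items()})  # matches the table row
print(f"water vapour total: {total_h2o:+.3f} W/m2")
print(f"ratio to CO2 effect: {total_h2o / abs(olr_co2):.0f}x")
```

The sum reproduces the table's 2.723 W/m2, and dividing by the 0.171 W/m2 CO2 term gives the factor of about 16 stated in the text.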
The NVAP-M project extends the analysis to 2009 and reprocesses the Heritage NVAP data. This data is not publicly available but is expected to be released about March 2013. The total water vapour amount from the NVAP-M data is shown below.
The global total precipitable water vapour column, from here, is given below. Climate models assume that water vapour increases with increasing CO2 concentrations, but the NVAP-M data, using the best available satellite data, shows no increase in the total water vapour column.
The most obvious way to determine the water vapour feedback due to CO2 changes, i.e. the effect that CO2 changes have on upper atmosphere water vapour, is to plot the annual specific humidity versus CO2 concentration. Annual data is used to eliminate the seasonal signal. The climate models show that the maximum predicted water vapour feedback is at about the 400 mbar pressure level, which is in the predicted but missing tropical troposphere hot spot as shown in the Heating of the Troposphere section.
It has been suggested that the early NOAA Earth System Research Laboratory data is unreliable due to poor coverage and calibration issues. Water vapour in air immediately above the ocean is in equilibrium with the water, so the air is near 100% relative humidity, regardless of the temperature. Water vapour over land is expected to vary proportionally with water vapour over the oceans, resulting in a near-constant global average relative humidity near the surface with global warming. Data before 1960 is considered less reliable because it shows surface relative humidity values that are too high, which would create a spurious declining relative humidity trend. The graph below shows the relative humidity near the surface at 1000 mbar pressure from the NOAA database from 1960 to 2011. The best fit trend line shows no trend, confirming that the NOAA water vapour data from 1960 has no drying bias near the surface. Therefore, we use only the data from 1960 onward in the analysis.
The graph below shows the 13-month centered running average of monthly specific humidity at the 400 mbar pressure level versus CO2 concentration by three latitude bands. The 13-month average removes the seasonal signal. Note that in the tropics there is a significant drying trend. There is very little trend in either the northern or southern extra-tropics.
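A 13-month centered running average of the kind used here can be sketched in a few lines; the synthetic monthly series below is illustrative, not the NOAA data:

```python
import math

def centered_running_mean(series, window=13):
    """Centered running mean; returns None where the window is incomplete."""
    half = window // 2
    out = []
    for i in range(len(series)):
        if i < half or i + half >= len(series):
            out.append(None)  # not enough neighbours for a centered window
        else:
            out.append(sum(series[i - half:i + half + 1]) / window)
    return out

# A pure 12-month seasonal cycle is suppressed by a factor of 13:
monthly = [math.sin(2 * math.pi * m / 12) for m in range(48)]
smoothed = centered_running_mean(monthly)
print(max(abs(v) for v in smoothed if v is not None))
```

An odd 13-month window is used rather than 12 months so that the average stays centered on a month while still spanning a full seasonal cycle.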
The graph below shows the global average annual specific humidity at the 400 mbar pressure level versus CO2 concentration from 1960 to 2012.
The blue line shows that as CO2 increases, water vapour decreases, which is opposite to climate model predictions. The brown line shows what the specific humidity would have been at the actual measured temperature assuming the relative humidity was held constant at the 1960 value.
The graph below shows the annual specific humidity in the tropics from 30 degrees North to 30 degrees South latitude at the 400 mbar pressure level versus CO2 concentration from 1960 to 2012. This is in the middle of the predicted but missing tropical hot spot.
Note the greater discrepancy between the actual data and the constant relative humidity assumption in the tropics versus the discrepancy for the global average. In the tropics, the specific humidity best fit line has declined by 0.11 g/kg, or 13%, from 1960 to 2012, while the global average specific humidity best fit line has declined by 0.05 g/kg. There is a remarkably high correlation in the tropics between specific humidity and CO2 concentration, with an R-squared (R2) value of 0.73. (The coefficient of determination R2 is a measure of how well the data fits the linear regression straight line.) The brown line shows what the specific humidity would have been assuming a constant relative humidity. The actual climate model projections would show a much greater increase in specific humidity than indicated by the brown line because the climate models, in addition to the incorrect constant relative humidity assumption, also project the temperature increase in the upper atmosphere to be four times greater than the actual temperature trend determined by radiosonde and satellite measurements.
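The R2 values quoted in this section are coefficients of determination of ordinary least-squares fits. A minimal sketch of the computation, applied to an illustrative noisy downward trend rather than the actual NOAA series:

```python
def linear_r_squared(x, y):
    """R^2 of the least-squares line y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Synthetic series loosely mimicking declining humidity against rising CO2:
co2 = [315 + i for i in range(50)]
q = [1.0 - 0.002 * i + 0.01 * ((-1) ** i) for i in range(50)]
print(f"R^2 = {linear_r_squared(co2, q):.2f}")
```

An R2 near 1 means the straight line explains nearly all of the variance; an R2 near 0, as in the humidity-versus-temperature plot discussed below, means the line explains almost none of it.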
To compare this correlation to the climate model assumptions, the following graph shows the annual specific humidity in the tropics from 30 degrees North to 30 degrees South latitude at the 400 mbar pressure level versus temperature from 1960 to 2012. The climate models assume that water vapour changes only in response to a temperature change. If this were true, this graph should show a very strong correlation of increasing humidity with temperature. The graph is a phase space plot of the data points connected in time sequence. Over short time periods, especially over a season, an increase in temperature causes an increase in specific humidity. The annual data shows linear striations increasing from bottom left to top right, confirming that higher temperatures relate to higher specific humidity over short time intervals. But the overall trend is down, proving that specific humidity is responding to factors other than temperature.
The graph not only shows a very poor correlation of specific humidity to temperature at the 400 mbar pressure level, but the trend is negative rather than strongly positive as assumed in the climate models. Increasing CO2 would initially cause a slight warming before considering a water vapour or cloud response. In climate models this warming causes an increase in upper atmosphere water vapour, because the models assume that water vapour can only change in response to a temperature change. But the data shows that water vapour declines with increasing CO2 with an R2 of 0.73, while its decline with temperature has an R2 of only 0.016. Obviously specific humidity is not responding only to temperature changes. In the long term, factors other than temperature determine upper atmosphere humidity; contrary to climate model assumptions, temperature has little effect on long-term upper atmosphere specific humidity. CO2 emissions are causing a decline in upper atmosphere water vapour, thereby allowing heat to escape to space. We believe that the long-term specific humidity in the upper atmosphere is determined by the maximum entropy principle, not temperature: the atmosphere maximizes the loss of heat to space, subject to the constraint of the saturation limit in the lower atmosphere, by decreasing the water vapour content in the upper atmosphere in response to increasing CO2 concentrations.
The NOAA humidity data is here in Excel format.
The graph below compares the IPCC AR5 hindcast/forecast multi-model mean to the NOAA total precipitable water vapour column anomaly. It also shows that water vapour changes lag ENSO by about 3 months. The graph is from a blog comment by Bill Illis here.
The AGW theory is essentially the idea that an increase in CO2 will cause water vapour to increase, causing an enhanced greenhouse effect. The graph shows that the models roughly agree with observations to 1984; after that, the models significantly overestimate the total water vapour content of the atmosphere. The modellers apparently make no attempt to match observations after 1984.
Greenhouse gases absorb long-wave radiation, making the atmosphere opaque at those wavelengths. Dr. Ferenc M. Miskolczi has developed a program called High-resolution Atmospheric Radiative Transfer Code (HARTCODE) that uses thousands of measured absorption lines and is capable of doing accurate radiative flux calculations. The calculations are independent of any greenhouse theory and contain no assumptions on how the greenhouse effect works, other than the fact that greenhouse gases absorb and emit radiation.
Water vapour is the most important greenhouse gas. HARTCODE simulations show that a 10% increase in CO2 concentration has the same effect on out-going longwave radiation (OLR) as a uniform 1.80% change in water vapour. A uniform 1% change in water vapor has 5.4 times the effect on OLR that a 1% change in CO2 has. A doubling of CO2 can be offset by a 12.3% reduction in H2O. This is shown in the following graph.
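These equivalences are roughly mutually consistent if the greenhouse effect of each gas scales with the logarithm of its concentration. The check below takes the 5.4 ratio from the text; the logarithmic scaling is an assumption of this example, not a HARTCODE result:

```python
import math

H2O_TO_CO2_RATIO = 5.4  # effect of a 1% H2O change relative to a 1% CO2 change

def equivalent_h2o_change(co2_fractional_change):
    """H2O fractional change with the same OLR effect, under log scaling."""
    return math.exp(math.log(1 + co2_fractional_change) / H2O_TO_CO2_RATIO) - 1

# A 10% CO2 increase is matched by roughly a 1.8% H2O increase:
print(f"{100 * equivalent_h2o_change(0.10):.2f}%")
# A doubling of CO2 is matched by roughly a 13-14% H2O change,
# close to the text's 12.3%:
print(f"{100 * equivalent_h2o_change(1.00):.1f}%")
```

The small-change equivalence (1.8%) reproduces the quoted figure closely; the large-change figure depends on exactly how the scaling is extrapolated, which is why it lands near, rather than exactly on, 12.3%.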
The radiation balance is determined at the top of the troposphere. HARTCODE was used to determine the effect of changes of water vapour in the upper atmosphere versus near the surface. The graph below shows that changing the water vapour content in an atmospheric layer from the 300 mb to the 400 mb level has 30 times the effect on out-going long-wave radiation (OLR) as the same small change near the surface. So water vapour changes in the upper atmosphere are more important than changes in the lower atmosphere.
Optical depth is a measure of how opaque the atmosphere is to long-wave radiation, and so is a measure of the strength of the greenhouse effect. Miskolczi used HARTCODE to compute the optical depth from 1948 to 2008 using the measured CO2 content at Mauna Loa, Hawaii and the global average water vapour content from the NOAA Earth System Research Laboratory. The optical depths are calculated for each greenhouse gas and summed line-by-line across the electromagnetic spectrum. The resulting optical depth curve is a measure of the total greenhouse gases by effect over the last 61 years. The result is given below.
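The line-by-line summation reflects the fact that, by the Beer-Lambert law, optical depths of different absorbers add at each wavelength. The sketch below uses made-up absorption coefficients purely for illustration; HARTCODE itself sums thousands of measured lines:

```python
import math

def total_optical_depth(absorbers):
    """Optical depths add per absorber at each wavelength (Beer-Lambert).

    absorbers: list of (absorption_coefficient, column_amount) pairs, where
    each product k * u is that absorber's optical depth contribution.
    """
    return sum(k * u for k, u in absorbers)

# Toy values at one wavelength: water vapour dominates, CO2 adds a little.
tau = total_optical_depth([(0.05, 25.0),     # H2O: k per mm, precipitable mm
                           (0.002, 390.0)])  # CO2: k per ppmv, ppmv
transmittance = math.exp(-tau)
print(f"tau = {tau:.2f}, transmittance = {transmittance:.3f}")
```

Because transmittance falls off as exp(-tau), summing optical depths per absorber and then exponentiating once is equivalent to multiplying the individual transmittances.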
The blue line of the graph shows the optical depth of the atmosphere with changing CO2 and water vapour content. The green line is the linear trend of this data, which indicates an insignificant trend. The pink line is the effect of increasing CO2 with water vapour held constant; it shows a small upward trend. The difference of these trends is the water vapour feedback. Recall that the IPCC assumes that water vapour provides a large positive feedback, which implies that the green line should be increasing much more steeply than the pink line. The HARTCODE results show the opposite: a large negative feedback, where the changing water vapour offsets most of the warming effect of CO2.
The results show that the total effective amount of greenhouse gases in the atmosphere has not significantly increased over the last 60 years.
The IPCC claims that the warming over the last half century was due to an increase in the quantity of greenhouse gases in the atmosphere. But the HARTCODE result shows that CO2 replaces water vapour as a greenhouse gas, so it can't be responsible for global warming.
Here is the GCM error of specific humidity as reported by the IPCC's 4AR, Chapter 8-Suppl page 54:
This chart shows the multi-model mean fractional error, expressed as a percent (i.e., simulated minus observed, divided by observed and multiplied by 100). The observational estimate is from the 40-year European Reanalysis (ERA40, Uppala et al., 2005) based on observations over the period 1980-1999. The model results are from the same period of the CMIP3 20th Century
Note that the chart shows that the model's errors in specific humidity at the altitude where the largest contribution of the feedback is predicted to occur is between 20% to 40% too high! If the specific humidity were corrected in the models at this critical altitude, the positive feedback would change to a strong negative feedback.
The strength of the greenhouse effect is undetermined in the current theory utilized by climate models. Parameters are just set to match the current temperatures. A new greenhouse effect theory by Ferenc Miskolczi shows that the current greenhouse effect equations are incomplete because they do not include all the necessary energy constraints. When these constraints are included in a new theory, the strength of the GHE is determined analytically. The new theory presented in Miskolczi's paper shows that the atmosphere maintains a saturated greenhouse effect, controlled by water vapor content. There is a near infinite supply of greenhouse gases available to the atmosphere in the form of water vapor from the ocean to provide the greenhouse effect, but the atmosphere takes up only a portion of the water vapour it could hold due to energy balance constraints. Adding CO2 to the atmosphere just replaces an equivalent amount of water vapour to maintain an almost constant greenhouse effect and has negligible effect on global temperatures. See here for more information.
Climate models are limited by our understanding of cloud formation. While scientists have a basic understanding of cloud formation, the details controlling how bright clouds are, how dense they become and how large they grow are poorly understood. We lack the detailed understanding of clouds required to make accurate climate models. Clouds play a major role in climate by reflecting sunlight back into space, trapping heat, and producing precipitation.
As the Earth warms, there is more evaporation from the oceans, therefore more water vapour in the atmosphere available for cloud formation. But low clouds reflect sunlight back into space resulting in a strong cooling effect, negating most of the initial temperature increase.
Researchers at the University of Alabama in Huntsville (UAH) reported in August 2007 that individual tropical warming cycles, which served as proxies for global warming, saw a decrease in the coverage of heat-trapping [high altitude] cirrus clouds, according to Dr. Roy Spencer, a principal research scientist in UAHuntsville's Earth System Science Center.
"All leading climate models forecast that as the atmosphere warms there should be an increase in high altitude cirrus clouds, which would amplify any warming caused by manmade greenhouse gases," he said. "That amplification is a positive feedback. What we found in month-to-month fluctuations of the tropical climate system was a strongly negative feedback. As the tropical atmosphere warms, cirrus clouds decrease. That allows more infrared heat to escape from the atmosphere to outer space."
"While low clouds have a predominantly cooling effect due to their shading of sunlight, most cirrus clouds have a net warming effect on the Earth," Spencer said. With high altitude ice clouds their infrared heat trapping exceeds their solar shading effect. If computer models incorporated this enhanced cooling effect due to such a reduction of high clouds, "it would reduce estimates of future warming by over 75 percent," Spencer said.
See the UAH News article here, and a report in ScienceDaily here. The paper abstract is here.
The modelers do only a crude analysis of feedback from satellite data. They observe that low clouds tend to decrease with warming and assume that the warming caused the low clouds to decrease. But cloud changes also cause temperatures to change: when a cloud moves to block the Sun, temperatures fall, and the amount of cloud can change in response to a general ocean circulation change. So cloud changes are sometimes a cause of temperature change, and sometimes an effect of it. The false assumption that all cloud changes are the effect of temperature changes led modelers to vastly overestimate the feedback from clouds.
Dr. Roy Spencer has developed a method to separate cause and effect in cloud variability. His technique is to plot quarterly average temperature and net flux readings from satellite data on a graph. These averages are plotted every day, allowing the time evolution to be visualized. He found that the plots have two types of patterns: a set of linear striations with a common slope, and superimposed, slower random spiral patterns.
To understand these patterns, Spencer developed a simple computer model in which he can specify the amount of feedback and can input radiative forcing that might be caused by random cloud changes. The model shows that the slope of the linear striations corresponds to the feedback in the climate system. The striations are due to changes in evaporation and precipitation, which cause temperature changes; the temperature changes cause cloud changes, which is the cloud feedback signal we are looking for. The spiral patterns are caused by radiative forcing that might be due to changing low cloud cover, which varies the solar radiation warming the surface.
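A model of this kind can be sketched as a one-layer energy balance, C dT/dt = S + N - lambda*T, where S is slowly varying radiative (cloud) forcing and N is fast non-radiative forcing from evaporation and precipitation. The parameter values below are illustrative assumptions of this sketch, not Spencer's:

```python
import random

def simulate(feedback=6.0, heat_capacity=2.1e8, dt=86400.0, days=3000, seed=1):
    """One-layer energy balance: C dT/dt = S(t) + N(t) - lambda*T.

    feedback: lambda, W/m2 per C; heat_capacity: J/m2/C (~50 m ocean layer).
    Returns daily (temperature, measured net flux) pairs, where the measured
    flux S - lambda*T is what a satellite at the top of the atmosphere sees.
    """
    rng = random.Random(seed)
    t, s = 0.0, 0.0
    points = []
    for _ in range(days):
        s += rng.gauss(0, 0.1) - 0.05 * s  # slowly varying cloud forcing
        n = rng.gauss(0, 1.0)              # fast evaporation/precip. noise
        t += dt * (s + n - feedback * t) / heat_capacity
        points.append((t, s - feedback * t))
    return points

pts = simulate()
print(f"{len(pts)} daily (T, net flux) points; "
      f"T range {min(p[0] for p in pts):.3f}..{max(p[0] for p in pts):.3f} C")
```

Plotting the flux against the temperature from such a run produces the mix of striations and spirals described above: the non-radiative noise traces lines of slope set by the feedback parameter, while the slow cloud forcing wanders the operating point around, decorrelating the overall regression slope from the true feedback.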
Spencer has analyzed the temperature-radiative patterns of the NASA Terra satellite. The Terra data starts in March 2000, and its temperature-radiative plot is shown below.
The plot shows two types of patterns: linear striations and random spiral patterns. The usual interpretation of this data by climate modelers would be to use the best fit line, which shows a slope of 0.7 W/m2/C, a very high positive feedback. The actual feedback should be determined by the slope of the linear striations, which is 8 W/m2/C, a very high negative feedback. A value of 3.3 W/m2/C corresponds to no feedback. (No feedback means that if the temperature of the atmosphere were uniformly increased by 1 C and nothing else changed, the top of the atmosphere would radiate 3.3 W/m2 more radiation to space.) The feedback is observed to occur on shorter time scales in response to evaporation and precipitation events, which are superimposed upon a more slowly varying background of radiative imbalance due to natural fluctuations in cloud cover changing the rate of solar heating of the Earth's surface.
The satellite data shows that over short time scales, clouds provide strong negative feedbacks. Spencer also analyzed the radiative flux and temperature variations from climate models used by the IPCC to determine whether the short-term negative feedback found in the satellite data also applies to long-term feedback. He found that the short-term linear striations and the spiral patterns show up in all 18 climate models that he analyzed. Spencer says the slopes of the linear striations do indeed correspond to the long-term feedbacks diagnosed from these models' response to anthropogenic greenhouse gas forcing. This strongly suggests that the short-term negative feedback shown in satellite data also applies to long-term global climate change.
The feedback estimate from the Terra satellite data gives a climate sensitivity of 0.46 C for a hypothetical doubling of carbon dioxide.
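This figure follows from dividing the IPCC's 3.71 W/m2 CO2-doubling forcing (quoted later in this document) by the feedback parameter; the arithmetic can be checked directly:

```python
CO2_DOUBLING_FORCING = 3.71  # W/m2, the IPCC no-feedback forcing for 2xCO2

def sensitivity_from_feedback(feedback_w_m2_per_c):
    """Equilibrium warming for doubled CO2 given a feedback parameter."""
    return CO2_DOUBLING_FORCING / feedback_w_m2_per_c

# Terra striation slope of 8 W/m2/C -> about 0.46 C per doubling:
print(f"{sensitivity_from_feedback(8.0):.2f} C")
# No-feedback value of 3.3 W/m2/C -> about 1.1 C per doubling:
print(f"{sensitivity_from_feedback(3.3):.1f} C")
```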
See here for a more detailed discussion of cloud feedbacks.
Aerosols are suspensions of fine particles in the atmosphere and include smoke, oceanic haze, smog, etc. The most significant aerosols from human sources that affect climate are sulphate and black carbon aerosols. Sulphate aerosols come primarily from the burning of fossil fuels and generally cause a cooling effect by reflecting solar radiation. Black carbon aerosols come from the burning of biomass and generally have a warming effect because they absorb solar radiation.
Three recent papers discussed below show that changes in aerosols account for a much larger portion of recent climate change than assumed in climate computer models, implying that the effect of CO2 is much less than what the climate models show. The Sun is likely the main cause of the global warming of the 20th century, with aerosol changes providing a significant contribution. When one combines the effects of aerosols with the Sun, ocean cycles and the urban heat island effect, there is no climate change left for CO2 to explain.
A paper published in the Journal of Geophysical Research in June 2009 shows that changes in the amount of aerosols in the atmosphere over the 20th century have had a much larger impact on global temperatures than they are given credit for in the climate computer models. Martin Wild of the Institute for Atmospheric and Climate Science, Zurich, Switzerland, shows that the increase of sulphate aerosols from fossil fuels caused a global solar dimming effect from the 1950s to the 1980s and contributed to global cooling. Air pollution control measures reduced sulphate aerosols from the 1980s to the 2000s, resulting in solar brightening which significantly contributed to global warming; the pollution controls allowed more solar radiation to warm the surface. However, on a global basis the effect of aerosols has been stable since 2000, and there has been no global warming this century. Wild says satellite data and Earthshine observations both show a stable planetary albedo after 2000. See here.
A paper published in the journal Science in July 2009 reports that a careful study of satellite data shows the assumed cooling effect of aerosols in the atmosphere to be significantly less than previously estimated. Gunnar Myhre of the Centre for International Climate and Environmental Research, Oslo, Norway, states that previous values for aerosol cooling are too high by as much as 40 percent, implying the IPCC's model sensitivity to CO2 is too high. The main anthropogenic aerosols that cause cooling are sulphate, nitrate, and organic carbon, whereas black carbon absorbs solar radiation. Myhre argues that since preindustrial times, black carbon soot particle concentrations have increased much more than other aerosols. See here.
NASA research published in Nature Geoscience in April 2009 suggests that much of the atmospheric warming observed in the Arctic since 1976 may be due to changes in aerosol particles. Scientists led by Drew Shindell of NASA found that the mid and high latitudes are especially responsive to changes in the level of aerosols. The research suggests aerosols likely account for 45 percent or more of the warming that occurred in the Arctic during the thirty years to 2005. (Arctic temperatures have been falling since 2005.) Since decreasing amounts of sulphates and increasing amounts of black carbon in the Arctic both encourage warming, temperature increases can be especially rapid. In the Antarctic, in contrast, the impact of sulphates and black carbon is minimized because of the continent's isolation from major population centres. Antarctic temperatures have not increased over the last 30 years. See here.
A study published in March 2007 uses the longest uninterrupted satellite record of aerosols in the lower atmosphere, a unique set of global estimates funded by NASA. Satellite measurements show large, short-lived spikes in global aerosols caused by major volcanic eruptions in 1982 and 1991, but a gradual decline since about 1990. By 2005, global aerosols had dropped as much as 20 percent from the relatively stable level between 1986 and 1991.
Sun-blocking aerosols around the world steadily declined (red line) since the 1991 eruption of Mount Pinatubo, according to satellite estimates.
Credit: Michael Mishchenko, NASA. See here.
Since 2005, China has made a major effort to install state-of-the-art desulphurisation in its coal-fired plants, installing more such units than the rest of the world combined. At the end of 2008, 66% of China's coal-fired power plant capacity was equipped with flue gas desulphurisation, and today 75% of all desulphurisation systems are being installed in China. See here. The reduction of aerosols, especially over China, allows more sunlight through the atmosphere to warm the Earth's surface, contributing to global warming.
China's SO2 emissions have declined 14.3% from 2006 to 2011 according to the 2010 and 2011 reports on the state of the environment in China. See here and here.
Many studies have shown that aerosols associated with biological activity provide a negative feedback to climate change. An initial warming stimulates production of marine phytoplankton. These micro-organisms emit greater volumes of dimethyl sulphide, or DMS. The DMS is oxidized in the atmosphere creating acidic aerosols that function as cloud condensation nuclei. Tiny water droplets form around these aerosols leading to the creation of more and brighter clouds that reflect more incoming solar radiation back to space, thereby providing a cooling effect.
Land plants emit greater amounts of carbonyl sulfide gas in response to CO2 fertilization and temperature rise, which is transformed into sulfate aerosol particles, which have a cooling effect. See here for more information.
Climate sensitivity refers to the equilibrium change in global mean surface temperature following a doubling of the atmospheric CO2 concentration. There are many estimates of climate sensitivity. When the Earth warms, it emits more infrared radiation to outer space. This natural cooling effect amounts to an average of 3.3 Watts per square meter for every 1 deg C (W/m2/C) that the Earth warms. This is often expressed in the reciprocal form as a gray body Earth sensitivity of 0.30 C/(W/m2), as given here. According to the IPCC, a doubling of CO2 concentration would cause a radiation flux forcing of 3.71 W/m2. Therefore, a doubling of CO2 would cause 3.71 W/m2 / 3.3 W/m2/C = 1.1 Celsius of global surface temperature increase, assuming no feedbacks. This sensitivity assumes that the amount of water vapour, cloud cover, vegetation and ice cover does not change. There is little scientific back-up for the CO2 radiation flux forcing numbers used by the IPCC. Miskolczi calculates a no-feedback climate sensitivity of 0.48 C.
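The gray-body arithmetic above can be verified directly:

```python
GRAY_BODY_RESPONSE = 3.3     # W/m2 of extra OLR per degree C of warming
CO2_DOUBLING_FORCING = 3.71  # W/m2, the IPCC no-feedback forcing for 2xCO2

# Reciprocal form quoted in the text, about 0.30 C/(W/m2):
print(f"sensitivity = {1 / GRAY_BODY_RESPONSE:.2f} C/(W/m2)")
# No-feedback warming for a doubling of CO2, about 1.1 C:
print(f"dT = {CO2_DOUBLING_FORCING / GRAY_BODY_RESPONSE:.1f} C")
```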
There is a wide range of estimates of the climate sensitivity with feedbacks. The IPCC assumes that clouds and water vapour cause a positive feedback, while other scientists say that clouds and water vapour cause a strong negative feedback.
Since pre-industrial times, atmospheric CO2 has increased from 280 ppmv to 385 ppmv. Humans have not caused all of this increase: Scafetta and West have estimated that the Sun has caused 10 to 20% of the CO2 increase. Using 15%, humans have caused an estimated 90 ppmv increase in CO2, or a 32% increase from the pre-industrial value.
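The apportionment works out as follows:

```python
PREINDUSTRIAL_CO2 = 280.0  # ppmv
CURRENT_CO2 = 385.0        # ppmv, as quoted in the text
SOLAR_SHARE = 0.15         # mid-range of the Scafetta and West 10-20% estimate

total_rise = CURRENT_CO2 - PREINDUSTRIAL_CO2  # 105 ppmv
human_rise = (1 - SOLAR_SHARE) * total_rise   # ~89 ppmv; text rounds to 90
print(f"human-caused rise: {human_rise:.0f} ppmv "
      f"({100 * human_rise / PREINDUSTRIAL_CO2:.0f}% of pre-industrial)")
```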
The table below shows estimates of climate sensitivity from various sources. The climate sensitivity is shown as temperature change in degrees Celsius per doubling of CO2 concentration (C/CO2X2), and as temperature change per radiation flux (C/W/m2). The last column shows the estimated global surface temperature change from pre-industrial time to now due to the human caused increase in atmospheric CO2 of 90 ppmv.
[Table of climate sensitivity estimates by source: Climate Sensitivity (C/CO2X2), Climate Sensitivity (C/W/m2), Temperature Change (C)]
The Miskolczi estimate is based on a greenhouse theory with energy constraints that fully determines the strength of the greenhouse effect. It predicts that increasing CO2 concentrations would reduce the quantity of water vapour in the upper troposphere. In fact, relative humidity at 9 km altitude has declined 21% from 1950 to 2007.
The Idso and Spencer estimates are based on temperature change observations, but do not take account of the effect of reduced water vapour in response to increasing CO2, and so are likely too high. (The Spencer article presents a feedback estimate of 8 W/m2/C, which is the reciprocal of the 0.125 C/W/m2 shown in the above table.)
The Schwartz and Chylek estimates both assume that the Sun has no effect on the temperature increase, and attribute the 20th century temperature change to CO2, modified by aerosols. This assumption greatly over-estimates the climate sensitivity due to CO2. The estimates also rely on the surface temperature record, which is contaminated by the urban heat island effect.
The IPCC determined climate sensitivity by two methods: from analysis of observed temperature and cloud variability, and from indirect clues in the geological record.
Climate sensitivity estimates used by the IPCC assumed that observed temperature variability caused the observed cloud variability. But causation also flows in the opposite direction, with cloud variability causing temperature variability. A temperature change caused by cloud variability would be incorrectly interpreted as a positive feedback. This error gives the estimates a built-in bias toward high climate sensitivity. We know that the Sun can cause a change in low cloud cover, which causes a temperature change. The IPCC does not consider possible climate change from the Sun, as its mandate is to investigate man-made climate change. The analysis of indirect clues from the geological record is very uncertain. The IPCC 4AR gives a range of climate sensitivity of 2 to 4.5 C per doubling of CO2, with a best estimate of 3 C.
The following chart from a presentation by Dr. Richard Lindzen shows prediction results from a number of climate models and satellite data. The horizontal axis shows the change in sea surface temperatures per year as measured over various time intervals. The vertical axis is the change in outgoing longwave radiation at the top of the atmosphere as predicted by several climate models.
A positive correlation (slope from bottom left to top right) indicates that there is a negative feedback loop in SST change such that the hotter the sea gets the more heat is radiated away to space, which reduces the temperature rise. A negative correlation (slope from top left to bottom right) indicates that there is a positive feedback loop in that the atmosphere inhibits heat loss to space, which increases the temperature further.
The first correlation, labeled ERBE, is the actual data as measured by the Earth Radiation Budget Experiment (ERBE) satellite. The slope of the line indicates a strong negative feedback which offsets the initial temperature rise. The eleven other correlations are from climate models. They all show negative correlations corresponding to positive feedbacks, which amplify the initial temperature rise. All the models have the feedback in the wrong direction, confirming that the models are fundamentally wrong.
In the following graph, each climate model's predicted climate sensitivity is plotted versus the slope of the correlations shown above, which corresponds to the strength of the temperature feedback. The curved black line shows the relation between the feedback and the climate sensitivity to doubling the amount of carbon dioxide in the atmosphere. The large errors in the feedback factors cause a large range of predicted equilibrium climate sensitivities. The model results show the climate sensitivity could range from 1.3 degrees to over 5 degrees Celsius considering the range of feedback factors. But the ERBE satellite data tells a completely different story. It shows a climate sensitivity of 0.4 to 0.5 degrees Celsius. This small temperature change would not cause any problem, and there is no reason to be concerned about our CO2 emissions. See here or here for further information.
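The relation traced by the curved black line can be illustrated with the standard zero-dimensional feedback formula, in which the equilibrium warming for doubled CO2 is the no-feedback response divided by (1 - f). This is a minimal sketch, not the authors' calculation; the no-feedback value of about 1.2 degrees Celsius and the example feedback factors are assumed textbook numbers, not values from the graph.

```python
# A minimal sketch (not the authors' calculation) of the standard
# zero-dimensional feedback relation.  The no-feedback (Planck-only)
# response DT0 of about 1.2 C for doubled CO2 is an assumed textbook value.

DT0 = 1.2  # assumed no-feedback warming for 2xCO2, degrees Celsius

def equilibrium_sensitivity(f):
    """Equilibrium warming DT = DT0 / (1 - f) for a net feedback factor f."""
    if f >= 1:
        raise ValueError("f >= 1 implies a runaway response")
    return DT0 / (1.0 - f)

# A positive feedback amplifies the response: f = 0.75 gives 4.8 C
print(round(equilibrium_sensitivity(0.75), 1))   # prints 4.8
# A negative feedback damps it: f = -1.5 gives about 0.5 C, the ERBE-like case
print(round(equilibrium_sensitivity(-1.5), 1))   # prints 0.5
```

Because the denominator shrinks as f approaches 1, small differences in a model's feedback factor translate into the wide range of predicted sensitivities discussed above.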
The ERBE determined climate sensitivity may be too high because it was calculated from short term temperature variations. It does not account for the long term reduction in water vapour content in the atmosphere as shown in the Water Vapour Feedback section, so the long term climate sensitivity may be even less than that indicated here.
The IPCC Hockey Stick
The IPCC published the "Hockey Stick" graph from Mann, Bradley and Hughes (MBH 1998) in its Third Assessment Report, which shows little change in temperatures for hundreds of years, then a sharp increase in the last hundred years. This temperature history was given bold prominence in the IPCC reports, distributed to all Canadian households and used to support major policy decisions involving the expenditure of billions of dollars. The IPCC argues that there was little natural climate change over the last 1000 years, so that the temperature change over the last 100 years is unusual and likely caused by human activities. A senior IPCC researcher said in an email "We have to get rid of the Medieval Warm Period." Christopher Monckton says "They did this by giving one technique, measurement of tree-rings from bristlecone pines, 390 times more weighting than other techniques but didn't disclose this. Tree-rings are wider in warmer years, but pine tree rings are also wider when there's more carbon dioxide in the air: it's plant food. This carbon dioxide fertilization distorts the calculations. They said they had included 24 data sets going back to 1400. Without saying so, they left out the set showing the medieval warm period, tucking it into a folder marked "Censored Data". They used a computer model to draw the graph from the data, but two Canadians [Ross McKitrick and Stephen McIntyre] later found that the model almost always drew hockey-sticks even if they fed in random, electronic "red noise" because it used a faulty algorithm." The MBH 1998 report was never properly peer reviewed before the IPCC used it in their publications.
See here for comments from Christopher Monckton.
McKitrick and McIntyre say in their paper "the dataset used to make this construction contained collation errors, unjustified truncation or extrapolation of source data, obsolete data, incorrect principal component calculations, geographical mislocations and other serious defects. These errors and defects substantially affect the temperature index. The major finding is that the values in the early 15th century exceed any values in the 20th century. The particular hockey stick shape derived in the MBH98 proxy construction (a temperature index that decreases slightly between the early 15th century and early 20th century and then increases dramatically up to 1980) is primarily an artefact of poor data handling, obsolete data and incorrect calculation of principal components." See here for their paper.
The IPCC hockey stick is shown below, along with the corrected version. The error ranges are not shown here.
The dispute over the hockey stick prompted the United States Congress to investigate the matter. The US National Research Council (NRC) held public hearings and prepared a report in 2006 for the US House of Representatives Committee on Science. The NRC Report made no criticism of the McKitrick and McIntyre papers. The report concludes "strip-bark samples should be avoided in temperature reconstructions." These strip-bark Bristlecone/Foxtail samples are responsible for the sharp increase in the graph in the twentieth century, but the growth spurt is not related to temperatures. It also confirmed that Mann's algorithm, which used non-centered principal component analysis, mines for hockey stick shapes from random red noise data, as previously shown by McKitrick and McIntyre, and notes that "uncertainties of the published reconstructions have been underestimated."
Meanwhile, the US House of Representatives Committee on Energy and Commerce had independently commissioned a study from Edward Wegman, who is chairman of the NAS Committee on Applied and Theoretical Statistics and a Fellow of the Royal Statistical Society. The Wegman Report states "Overall, our committee believes that Mann's assessments that the decade of the 1990s was the hottest decade of the millennium and that 1998 was the hottest year of the millennium cannot be supported by his analysis." It also states "In general, we find the criticisms by [the McKitrick and McIntyre papers] to be valid and their arguments to be compelling. We were able to reproduce their results and offer both theoretical explanations (Appendix A) and simulations to verify that their observations were correct." The report also examined the social network of the group of scientists who publish temperature reconstructions, finding that they collaborate with each other and share proxy data and methodologies, so that the "independent" studies are not independent at all. See the Wegman Report here.
Both of these reports were public six months before the IPCC began the release of the Fourth Assessment Report; however, the 4AR makes no mention of the Wegman Report, gives only one citation of the NRC Report, and ignores the findings and recommendations of the reports.
David Holland wrote a comprehensive history and discussion of the hockey stick affair. See Holland's paper - "Bias and Concealment in the IPCC Process: The 'Hockey Stick' Affair and its Implications" published by "Energy & Environment", October 2007 here.
David Holland says "it is scandalous that the WGI Chapter 6 authors ignored most of its [NRC Report] substantive findings. Despite the clear analysis in Wegman et al. showing the lack of independence between the various temperature reconstructions, the authors of AR4 WGI Chapter 6 persisted with their reliance on a spaghetti diagram of reconstructions in Figure 6.10(b) to continue to justify the claim that average Northern Hemisphere temperatures during the second half of the 20th century were likely the highest in at least the past 1,300 years."
Urban Heat Island Effects
The urban heat island effect is the artificial increase of local temperatures caused by the heat-retaining properties of concrete and asphalt in urban areas. Because of this human influence on local surface temperature, temperatures in or near urban centres are warmer than in rural areas.
Surface Temperature Trends in 47 California Counties
This graph shows the size of the effect on surface temperatures and the problems associated with objective sampling. The surface temperature trends determined from ground stations for the period 1940 to 1996 were averaged for each county. The trends were grouped by county population and plotted as closed circles along with the standard errors of their means. The straight line is a least-squares fit to the closed circles. The points marked ''X'' are the six unadjusted station records selected by NASA GISS for use in their estimate of global temperatures. Note that 5 of the 6 selected stations are in populous counties. Note also that extrapolating the straight line to a county population of 10,000 gives a temperature trend of zero. See here.
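The least-squares fit and extrapolation described above can be sketched as follows. The county values here are hypothetical stand-ins, not the study's data; only the method (an ordinary least-squares line through warming trend versus log10 of county population, read off at a population of 10,000) follows the description.

```python
# Sketch of the analysis described above: fit a least-squares line to
# county warming trends versus log10(population) and read off the trend
# the line implies for a county of 10,000 people.  The data points below
# are invented for illustration, not the study's actual values.
import math

def ols(xs, ys):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

# (population, warming trend in C/decade) -- hypothetical values
counties = [(1e4, 0.00), (1e5, 0.05), (1e6, 0.10), (5e6, 0.13)]
xs = [math.log10(p) for p, _ in counties]
ys = [t for _, t in counties]
slope, intercept = ols(xs, ys)

# Extrapolate the fitted line to a county population of 10,000
trend_at_10k = slope * math.log10(1e4) + intercept
print(f"{trend_at_10k:.3f} C/decade")  # close to zero by construction
```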
Here is an example of a weather station used by the IPCC to record temperature rise.
Temperature Trends of Major City Sites and Rural Sites
Peterson (2003) is an influential study, cited by the IPCC Fourth Assessment Report, purporting to show that the urbanization effect is negligible.
The IPCC relied heavily on this flawed study, in which Peterson states "no statistically significant impact of urbanization could be found in annual temperatures." However, Steve McIntyre, using Peterson's data, shows that "actual cities have a very substantial trend of over 2 deg C per century relative to the rural network - and this assumes that there are no problems with rural network - something that is obviously not true since there are undoubtedly microsite and other problems." Peterson uses two lists of stations in his study, one labelled Urban and one labelled Rural. However, an analysis of the lists shows that the Urban list includes many rural sites and the Rural list includes many urban sites. These results are discussed in a Climate Audit article here.
Most scientists agree that many temperature station measurements are contaminated by urban heat island effects, but they argue that the major global temperature indexes are adjusted to correct for these effects. There is an "Urbanization Adjustment" to correct for the effects of urbanization, a "Time of Observation Bias Adjustment" to correct for changes to the time of day when measurements are taken, and a "Coverage Adjustment" to account for the loss of measurement stations. These adjustments are intended to produce a record of what the temperatures would be if nobody lived near the measurement stations. If the adjustments were adequate, there would be no statistically significant correlation between the temperature record and socioeconomic indicators.
Ross McKitrick and Patrick Michaels published a paper in 2004 in which they analyse the pattern of warming over the Earth's land surface compared to local economic conditions. They found a statistically significant correlation between the adjusted temperature data and economic development, meaning that the adjustments are not adequate to remove the urban heat island effects. They conclude "If the contamination were removed, we estimated the average measured warming rate over land would decline by about half."
Dutch meteorologists Jos de Laat and Ahilleas Maurellis, using different testing methodologies, came to similar conclusions. They showed that there is a statistically significant correlation between the spatial pattern of warming in the adjusted temperature data and the spatial pattern of industrial development. They concluded that it adds a large upward bias to the measured global warming trend. They also showed that climate model predictions show no correlation between temperature and industrial development.
The IPCC acknowledges the correlation between the warming trends and socioeconomic development, but dismisses it as a mere coincidence due to unspecified atmospheric circulation changes. This nonsensical claim contradicts the IPCC's widely advertised assertion that recent warming cannot be attributed to natural causes, and the de Laat and Maurellis research shows it to be false.
McKitrick and Michaels published an updated paper in December 2007 using a larger data set with a more complete set of socioeconomic indicators. They discussed two types of contamination: anthropogenic surface processes, which are changes to the landscape due to urbanization or agriculture, and inhomogeneities, i.e. equipment changes, missing data, poor quality control, etc. They showed that the spatial pattern of warming trends is tightly correlated with indicators of economic activity. They present a battery of statistical tests to prove that the result is not a fluke or spurious correlation. They conclude "The average trend at the surface in the post-1980 interval would fall from about 0.30 degrees (C) per decade to about 0.17 degrees." Removing the net warming bias due to urban heat effects in surface temperature data could explain as much as half the recent warming over land.
Bias of IPCC Temperature Data
The graph above is from the McKitrick and Michaels December 2007 paper. Each square is colour-coded to indicate the size of the local bias. Blank areas indicate that there was no data available. See the Background Discussion on the paper here.
An audit by researcher Steve McIntyre reveals that NASA has made urban adjustments of temperature data in its GISS temperature record in the wrong direction. NASA has applied a "negative urban adjustment" to 45% of the urban station measurements (where adjustments are made), meaning that the adjustments make the warming trends steeper. The urban adjustment is supposed to remove the effects of urbanization, but the NASA negative adjustments increase the urbanization effects. The result is that the surface temperature trend utilized by the Intergovernmental Panel on Climate Change (IPCC) is exaggerated. See here.
The website www.surfacestations.org was created by Anthony Watts in response to the realization that very little physical site survey data exists for the entire United States Historical Climatological Network (USHCN) of surface stations. Volunteers do hands-on site surveys to photograph and document all 1221 USHCN climate stations in the USA. As of February 2009, 854 of the 1221 stations in the USHCN network had been examined. Each site is assigned a site quality rating of 1 through 5 based on the Climate Reference Network Rating Guide. Only 11% of the stations are in suitable locations, while 69% are within 10 m of an artificial heat source. Below is a picture of a poorly situated station.
The graph "Surface and Troposphere Temperature Trends" presented in the Heating of the Troposphere section of this essay shows temperature trends of the land, of the land and sea, and of the troposphere in the tropics. The land surface temperature trend has the highest rate of increase because it is contaminated by the heat island effect. The land and sea surface temperature trend is lower than the land trend because the sea temperature data does not have any heat island effect. The troposphere shows the lowest rate of temperature increase. We know that the CO2 theory of climate change requires the troposphere to warm faster than the surface, but the opposite has happened. It is illogical to believe that CO2 is the primary temperature driver and concurrently believe that the surface measurements used to the IPCC are accurate. If the surface temperature data were fully adjusted to remove the effects of urbanization by reducing the warming rate by half, it would closely match the troposphere warming trend.
Actual Siple, Antarctica Ice Core and Mauna Loa Data
Note that the measured concentration declines with increasing load pressure and depth.
Shifted Siple, Antarctica Ice Core and Mauna Loa Data
Because the actual measurements show that air in ice deposited in 1890 AD contains 328 ppm of CO2, not the 290 ppm required to fit the IPCC hypothesis of human-caused increasing CO2 concentration and global warming, the average age of the air was arbitrarily decreed to be exactly 83 years younger than the ice in which it was trapped.
The shifted ice data were then smoothly aligned with the Mauna Loa record and reproduced in countless publications as the famous Siple curve. Not until thirteen years later, in 1993, did glaciologists attempt to verify the age assumption experimentally, and they failed.
CO2 Measurements between 1800 and 1955
IPCC modellers ignored the direct measurements of CO2 concentration indicating that the 19th century CO2 concentration was 335 ppm.
The encircled values were arbitrarily selected by Callendar for his estimate of 292 ppm as the average 19th century CO2 concentration.
A study of stomatal frequency in fossil leaves from Holocene lake deposits in Denmark, showing that the atmospheric CO2 level was 333 ppm 9,400 years ago and 348 ppm 9,600 years ago, falsifies the notion that the CO2 concentration of the air was stable and low until the advent of the industrial revolution.
See here for more information.
Recently, Ernst-Georg Beck has summarized 90,000 accurate chemical analyses of CO2 in air since 1812. The historic chemical data reveal that changes in CO2 track changes in temperature, and therefore climate, in contrast to the simple, monotonically increasing CO2 trend depicted in the post-1990 literature on climate change. Since 1812, the CO2 concentration in northern hemispheric air has fluctuated, exhibiting three high-level maxima around 1825, 1857 and 1942, the latter showing more than 400 ppm.
Between 1857 and 1958, the Pettenkofer process was the standard analytical method for determining atmospheric carbon dioxide levels, and usually achieved an accuracy better than 3%. These determinations were made by several scientists of Nobel Prize level distinction. Following Callendar (1938), modern climatologists have generally ignored the historic determinations of CO2, despite the techniques being standard textbook procedures in several different disciplines. The chemical methods were discredited as unreliable, and only the few results that fit the assumption of a climate-CO2 connection were retained.
Ernst-Georg Beck calls the falsification of the CO2 record "The greatest scandal in the modern history of science".
See here for a summary of the Beck paper, or here for the paper itself.
See here for Beck's Berlin presentation of May 30, 2007.
See here for CO2: The Greatest Scientific Scandal of Our Time, by Zbigniew Jaworowski, Spring/Summer 2007 21st CENTURY Science & Technology.
In January 2009, a Japanese group launched the IBUKI satellite to monitor CO2 and methane spectral bands around the world to establish exactly where the world's biggest sources and sinks of greenhouse gases are. The results from Japan's Aerospace Exploration Agency (JAXA) show that industrialized nations appear to be absorbing the carbon dioxide emissions from the Third World. The satellite data show that levels of CO2 are typically lower in developed countries than in air over developing countries. Areas with higher net emission (man-made plus natural emissions less natural absorption into sinks) would show higher CO2 concentrations. CO2 levels are lower than average in industrial countries, as indicated by the blue dots. The highest net emissions, at least on this graph, are predominantly in China and central Africa. See here.
Author Michael Crichton warned of the dangers of "consensus science" in a 2003 speech. He says "Consensus is the business of politics. Science, on the contrary, requires only one investigator who happens to be right, which means that he or she has results that are verifiable by reference to the real world. In science consensus is irrelevant. What is relevant is reproducible results. The greatest scientists in history are great precisely because they broke with the consensus."
In an open letter to Prime Minister Stephen Harper, 61 prominent scientists called for an open climate science review. The letter states "Observational evidence does not support today's computer climate models, so there is little reason to trust model predictions of the future. Significant advances have been made since the protocol was created, many of which are taking us away from a concern about increasing greenhouse gases. If, back in the mid-1990s, we knew what we know today about climate, Kyoto would almost certainly not exist, because we would have concluded it was not necessary. Global climate changes all the time due to natural causes and the human impact still remains impossible to distinguish from this natural "noise.""
The Petition Project was organized by the Oregon Institute of Science and Medicine.
The petition states in part:
"There is no convincing scientific evidence that human release of carbon dioxide, methane, or other greenhouse gasses is causing or will, in the foreseeable future, cause catastrophic heating of the Earth's atmosphere and disruption of the Earth's climate. Moreover, there is substantial scientific evidence that increases in atmospheric carbon dioxide produce many beneficial effects upon the natural plant and animal environments of the Earth."
So far (May 2009) the petition has received 31,478 signatures. Signatories are approved for inclusion in the Petition Project list if they have obtained formal educational degrees at the level of Bachelor of Science or higher in appropriate scientific fields. All of the listed signers have formal educations in fields of specialization that suitably qualify them to evaluate the research data related to the petition statement. Many of the signers currently work in climatological, meteorological, atmospheric, environmental, geophysical, astronomical, and biological fields directly involved in the climate change controversy. See here.
The Heartland Institute conducted an international survey of 530 climate scientists in 2003. The survey asked if the current state of scientific knowledge is developed well enough to allow for a reasonable assessment of the effects of greenhouse gases. Two-thirds of the scientists surveyed (65.9 percent) disagreed with the statement, with nearly half (45.7 percent) scoring it with a 1 or 2, indicating strong disagreement. Only 10.9 percent scored it with a 6 or 7, indicating strong agreement. See here for the full survey results.
In an Open Letter to the Secretary-General of the United Nations and the heads of state of many nations, dated December 13, 2007, titled "UN Climate Conference Taking the World in Entirely the Wrong Direction", more than 100 specialists from around the world, many of whom are leading scientists, state that "It is not possible to stop climate change, a natural phenomenon that has affected humanity through the ages." The letter states that recent climate changes have been well within the bounds of known natural variability. It further states that climate models cannot predict climate, that there has been no global warming since 1998, that the IPCC has ignored much significant new peer-reviewed research that has cast even more doubt on the hypothesis of dangerous human-caused global warming, and that attempts to cut emissions will slow development and are likely to increase human suffering from future climate change rather than decrease it. See here for the letter as published by the National Post.
A report to the US Senate lists 400 qualified scientists from around the world who dispute the claims by IPCC and others, that "climate science is settled" and that there is a "consensus". See here.
There is no consensus on whether or to what degree human activities are causing the problem, or even whether there is a problem. Global cooling, widely predicted in the 1970s, would have been much more dangerous than warming.
Effects of Warming
The IPCC and related groups have suggested several adverse effects of global warming. Real-world data show that these claims are mostly false. They ignore the huge benefits of warming and of CO2 emissions on plant growth.
Sea Level Rise
The sea level has been rising at about 2 mm/year from 1860 to 2000, as shown below.
Sea Level Data
Mean global sea level (gsl) (top), with its shaded 95% confidence interval, and mean gsl rate (bottom), with its shaded standard error interval. Adapted from Jevrejeva et al. (2006). See here from CO2science.
The IPCC AR4 estimates that "Global average sea level rose at an average rate of 1.8 [1.3 to 2.3] mm per year over 1961 to 2003. The rate was faster over 1993 to 2003, about 3.1 [2.4 to 3.8] mm per year." It also states "There is high confidence that the rate of observed sea level rise increased from the 19th to the 20th century."
Since August 1992 satellite altimeters have been measuring sea level on a global basis. The University of Colorado at Boulder provides data from a series of satellites. Tide gauge calibrations are used to estimate altimeter drift. The global sea level rise with the seasonal signal removed is shown here. It shows a trend from 1992 through March 2009 of 3.1 mm/year. Below are graphs of global, Pacific ocean and Atlantic ocean sea levels with trends from January 1992 to December 2003 and from January 2004 to July 2011. The seasonal signal was removed from the global sea level data, but included in the Pacific and Atlantic ocean data.
Note that there has been a significant flattening of the trend since 2004. The global sea level rise since January 2004 of 1.62 mm/year is half of the trend from 1992 to December 2003 of 3.22 mm/year. The trends since January 2004 of the Pacific and Atlantic oceans are 0.037 mm/year and 1.34 mm/year, respectively. The slowing of the sea level rise is consistent with the current lack of global warming.
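The trends quoted above come from fitting straight lines to the altimetry series over different windows. A minimal sketch of that calculation, using a synthetic monthly series (3.2 mm/year to 2003, then 1.6 mm/year) in place of the real data:

```python
# Sketch of the trend comparison described above: fit a least-squares line
# to a monthly sea-level series over two windows and compare the slopes in
# mm/year.  The series here is synthetic, standing in for the altimetry data.

def slope_mm_per_year(times, levels):
    """Least-squares slope of level (mm) against time (decimal years)."""
    n = len(times)
    mt, ml = sum(times) / n, sum(levels) / n
    num = sum((t - mt) * (l - ml) for t, l in zip(times, levels))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

times = [1992 + m / 12.0 for m in range(240)]             # 240 months from 1992
levels = [3.2 * (t - 1992) if t < 2004 else
          3.2 * 12 + 1.6 * (t - 2004) for t in times]     # mm, synthetic

early = [(t, l) for t, l in zip(times, levels) if t < 2004]
late = [(t, l) for t, l in zip(times, levels) if t >= 2004]
print(round(slope_mm_per_year(*zip(*early)), 2))   # prints 3.2
print(round(slope_mm_per_year(*zip(*late)), 2))    # prints 1.6
```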
The Permanent Service for Mean Sea Level (PSMSL), here, lists 10 tide gauge stations on the West coast of Canada with near-continuous monthly data from 1973 through 2011.
The graph shows the average monthly sea level of 10 tide gauge stations on the West coast of Canada. The black line is the linear best fit to the data. Over the period 1973 to 2011 the average sea level has declined at 0.5 mm/year.
Envisat is the newest and most sophisticated satellite measuring global sea level. Launched in 2002, Envisat is the largest Earth Observation spacecraft ever built. Its data show there has been no global sea level rise since the end of 2003.
Dr. Nils-Axel Morner, who has spent a lifetime in the study of sea levels, says "There is a total absence of any recent acceleration in sea level rise as often claimed by IPCC and related groups." Read his fascinating interview "Claim That Sea Level Is Rising Is a Total Fraud", June 22, 2007, EIR Economics 33.
Dr. Morner says the global sea level rose at 1.1 mm/year from 1850 to about 1940, then showed no increase until 1970. The IPCC uses a tide gauge in Hong Kong that shows 2.3 mm/year of sea level rise. The tide gauge is located where the land is known to be subsiding, so the record should not be used. Satellite altimetry data from the TOPEX/POSEIDON mission has measured the sea level relative to the centre of the Earth (rather than relative to the coast) since 1992.
Satellite altimetry of TOPEX/POSEIDON
The graph above from Morner, 2004, shows the original satellite sea level data from 1992 to early 2000. Other than the effect of the 1997/98 El Nino, the data shows no sea level rise.
The satellite data shows no increase, but the IPCC adds a "correction factor" to the satellite data to make it agree with the tide gauge data at 2.3 mm/year. This data is presented as satellite data, but Morner says "it is a falsification of the data set".
Satellite Altimetry Data of TOPEX/POSEIDON Tilted Back to Original Level
The graph above from Morner, 2005, shows the satellite altimetry sea level data from 1993 to 2003 tilted back to the original level by excluding the tide-gauge factor. It shows variability around zero plus ENSO events.
See here for Dr. Morner's Memorandum paper, which was presented to the United Kingdom's House of Lords.
Satellite altimetry Topex/Poseidon data is adjusted by the University of Colorado for NASA to match the rate of sea level rise measured by a set of 64 tide gauges. Any difference between the raw satellite measurement and the tide gauge measurement is assumed to be the sum of satellite measurement drift error and the vertical land movement at the tide gauge location. A separate estimate of the land movement is made mainly by using "doppler orbitography and radiopositioning integrated by satellite" (DORIS) data at the tide gauge location. The raw satellite data is tilted by applying the satellite measurement drift as determined by the tide gauges. See here and here for a description of how satellite data is calibrated from a set of tide gauges.
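Under the assumptions described above, the calibration arithmetic looks roughly like this. All numbers are invented for illustration; the method simply subtracts an independently estimated land movement from each satellite-minus-gauge difference and removes the resulting mean drift from the raw satellite trend.

```python
# A toy illustration of the calibration logic described above, with made-up
# numbers.  At each tide-gauge site the satellite-minus-gauge difference is
# assumed to be instrument drift plus vertical land movement; subtracting an
# independent (e.g. DORIS) land-movement estimate leaves the drift, whose
# average is then removed from the raw satellite trend.

# (satellite-minus-gauge difference, land movement) per site, mm/year
sites = [(0.6, 0.2), (0.4, -0.1), (0.5, 0.1)]

drifts = [diff - land for diff, land in sites]
mean_drift = sum(drifts) / len(drifts)

raw_satellite_trend = 3.5                      # mm/year, hypothetical
corrected = raw_satellite_trend - mean_drift
print(round(mean_drift, 2), round(corrected, 2))   # prints 0.43 3.07
```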
The graph above shows the sea level trends from January 2002 to April 2011. Note that most of the sea level rise in this period is located in an area north of Australia. The average of five tide gauge stations' trends from the north coast of Australia, using annual data from 2002 to 2009, is 17.7 mm/year. However, the tropical Pacific ocean sea level was decreasing at up to 16 mm/year.
A famous tree in the Maldives shows no evidence of having been swept away by rising sea levels, as would be predicted by the global warming advocates. A group of Australian global-warming advocates came along and pulled the tree down, destroying the evidence that their theory was false.
The "INQUA Commission on Sea-Level Change and Coastal Evolution" led by Dr. Morner, prepared as estimate that the global sea level will rise 10 cm plus or minus 10 cm in the next 100 years. Dr. Morner has since revised his estimate to 5 cm per 100 years after considering data of the Sun activity suggesting that the warming trend may have ended and the Earth may be headed into a cooling trend.
It seems increasingly likely that warming will increase precipitation and ice accumulation in the Polar Regions, and thus slow down or even reverse the ongoing sea level rise.
See here update 10.
The Proudman Oceanographic Laboratory estimates the rate of sea level rise at 1.42 plus or minus 0.14 mm/year for the period 1954 to 2003. This is less than the estimate of 1.91 plus or minus 0.14 mm/year for the period 1902 to 1953, indicating a slowing of the rate.
See here for an analysis of sea level rise by the Proudman Oceanographic Laboratory. The following graph shows the rate of sea level change since 1905 using the highest quality long record tide gauges.
Comparison of the global mean rates of sea level change calculated from nine long-record stations with those calculated from 177 stations averaged into 13 regions. The shaded region indicates 1 S.E. These records are from regions which do not experience high rates of Glacial Isostatic Adjustment (GIA) and which are not significantly affected by earthquakes. The comparison shows that over the common period of the two analyses (1955-1998) there is very strong agreement between the two global means.
Wöppelmann et al. used global positioning satellite (GPS) stations to correct tide gauge data for vertical land movements. In a 2007 paper, Wöppelmann et al. analyzed data from 160 GPS stations that were within 15 km of tide gauges to determine the vertical movement of the tide gauges. They determined that the global average sea-level rise from January 1999 to August 2005, after correcting the tide gauge data for the vertical land movement, was 1.31 +/- 0.30 mm/year. Note that this estimate is 58% less than the estimate reported (1993 - 2003) in the IPCC AR4. See here from World Climate Report, and the study abstract here.
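The GPS correction can be sketched in a few lines. A tide gauge measures sea level relative to the land, so adding the GPS-derived vertical land velocity (positive for uplift) recovers a geocentric rate; the gauge rates and land velocities below are hypothetical.

```python
# Hedged sketch of the correction described above: the geocentric sea-level
# rate at a gauge is its relative rate plus the GPS vertical land velocity
# (positive = uplift).  Numbers are invented for illustration.

gauges = [
    # (relative sea-level rate mm/yr, GPS vertical land velocity mm/yr)
    (0.8, 0.6),    # land rising: relative rate understates the true rise
    (2.1, -0.7),   # land subsiding: relative rate overstates the true rise
    (1.5, 0.1),
]

geocentric = [rel + land for rel, land in gauges]
network_average = sum(geocentric) / len(geocentric)
print(round(network_average, 2))   # prints 1.47
```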
The movie "An Inconvenient Truth" (AIT) suggests that
the Antarctic ice sheet could melt, but in fact the temperature of Antarctica
has been declining over the last 25 years by 0.11 Celsius per decade. There has
been no significant melting during previous warm periods when temperatures were
warmer than today.
Antarctica Temperatures 1979 - 2006 MSU Data Set (Latitude -90 to -70)
This graph was created from the MSU Data from www.CO2Science.org.
The Antarctic ice sheet has been growing in thickness by 5 mm/year (1992 to 2003) according to a recent mass balance study. This net extraction of water from the global ocean, according to Wingham et al., occurs because "mass gains from accumulating snow, particularly on the Antarctic Peninsula and within East Antarctica, exceed the ice dynamic mass loss from West Antarctica."
A similar story is found in Greenland. The warmest period was not the last quarter century. Rather, as Vinther et al. report, "the warmest year in the extended Greenland temperature record was 1941, while the 1930s and 1940s were the warmest decades." In fact, their newly lengthened record reveals there has been no net warming of the region over the last 75 years. A study of the Greenland ice sheet by Johannessen et al. found that below 1500 meters, the mean change of ice sheet height with time was a decline of 2.0 ± 0.9 cm/year, qualitatively in harmony with the statements of Alley et al.; but above 1500 meters there was a positive growth rate of fully 6.4 ± 0.2 cm/year. Averaged over the entire ice sheet, the mean result was also positive, at 5.4 ± 0.2 cm/year. Adjusted for an isostatic uplift of about 0.5 cm/year, this yields a mean growth rate of approximately 5 cm/year, for a total increase in the mean thickness of the Greenland Ice Sheet of about 55 cm over the 11-year period, driven primarily by increased snowfall over the ice sheet.
A recent study by Zwally et al. (2007) found that the Greenland ice sheet has experienced a net accumulation of ice, which is producing a 0.03 ± 0.01 mm/year decline in sea level.
The IPCC claims that global warming will result in more severe weather. This doesn't make any sense, as most storms are caused by a difference in temperatures of colliding air masses. If CO2 warms the Polar Regions there will be smaller temperature differences, and less severe storms. All other things being equal, a warmer world should have fewer, not more, severe storms.
Unlike most storms, hurricanes are driven by the difference in temperature between the sea surface and the storm top.
Researchers Knutson and Tuleya examined a suite of climate models and found that they virtually unanimously projected that in a CO2-enhanced world, the middle and upper troposphere will warm at a faster rate than the surface, especially over the tropical oceans. More warming aloft than at the surface makes the atmosphere more stable and less conducive to storm formation. Thus, Knutson and Tuleya reported that the model-projected vertical stability increases in the future would temper (but not totally cancel out) the increase in storm intensity by rising sea surface temperature.
However, researchers Vecchi and Soden found that the climate models almost unanimously project that there will be an increase in the vertical wind shear during the hurricane season, which also acts to inhibit tropical cyclone formation. The combined result is that any increase in hurricane intensity will be so small as to be undetectable. Incidentally, the actual vertical wind shear of Atlantic hurricanes has been declining since 1973, the opposite of the trend predicted by the climate models. See here.
There is absolutely no evidence of increasing severe storm events in the real world data. Here is a graph of hurricane intensity for the USA.
For the North Atlantic as a whole, according to the World Meteorological Organization, "Reliable data ... since the 1940s indicate that the peak strength of the strongest hurricanes has not changed, and the mean maximum intensity of all hurricanes has decreased."
Gulev et al. (2000) employed NCEP/NCAR reanalysis data since 1958 to study the occurrence of winter storms over the Northern Hemisphere. They found a statistically significant (at the 95% level) decline of 1.2 cyclones per year over the period, during which temperatures reportedly rose in much of the hemisphere.
"Global warming causes increased storminess" makes for interesting headlines. It also violates fundamental scientific truth and the lessons of history.
Global hurricane activity declined by mid-2012 to levels not seen since 1978. The Accumulated Cyclone Energy (ACE) index is the 2-year running sum of a combination of hurricanes' intensity and longevity. During the past 40 years, Global and Northern Hemisphere ACE have undergone significant variability but exhibit no significant statistical trend. The global 2013-02 ACE was 62% of the 1998-01 ACE. See here.
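The ACE index described above can be sketched in a few lines of code. This is a simplified illustration, not the operational calculation: ACE sums the squares of a storm's 6-hourly maximum sustained winds (in knots) while the storm is at tropical-storm strength or above, scaled by 10^-4. The storm wind data below are hypothetical.

```python
# Sketch of the Accumulated Cyclone Energy (ACE) calculation, assuming
# 6-hourly maximum sustained wind reports in knots (hypothetical data).
def ace(six_hourly_winds_kt):
    """Sum v^2 / 10^4 over 6-hourly fixes at tropical-storm strength (>= 35 kt)."""
    return sum(v**2 for v in six_hourly_winds_kt if v >= 35) / 1e4

# Hypothetical storm: intensifies to hurricane strength, then decays.
storm = [30, 35, 45, 65, 80, 90, 75, 55, 40, 30]
print(round(ace(storm), 2))  # -> 3.22
```

Summing such per-storm values over all storms in a running 24-month window gives the 2-year ACE totals plotted in the graphs that follow.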
The graph above shows the last four decades of Global and Northern Hemisphere ACE through February 2013. Note that the year indicated represents the value of ACE through the previous 24 months for the Northern Hemisphere (bottom line/gray boxes) and the entire globe (top line/blue boxes). The area in between represents the Southern Hemisphere total ACE.
The graph above shows the last four decades of Global Tropical Storm and Hurricane frequency as 12-month running sums through February 2013. The top time series is the number of tropical cyclones that reach at least tropical storm strength (maximum lifetime wind speed exceeding 34 knots). The bottom time series is the number of hurricane-strength (64 knots and above) tropical cyclones. The global frequency of tropical cyclones has reached a historical low.
The northern hemisphere 2008 ACE was 85% of the 2005 ACE as shown in the stacked bar chart below.
Most thunderstorms occur in the tropics, but most tornadoes occur in the USA. Less than 1% of thunderstorms in the USA spawn tornadoes. Tornadoes require directional wind shear, a change of wind direction with height. Wind shear occurs when cold and warm air masses collide. This rarely happens in the tropics, so tornadoes rarely occur there. The graph below shows that average USA temperatures have increased since 1960 while the number of strong (F3 to F5) tornadoes has declined. See here from Dr. Roy Spencer.
An outbreak of tornadoes in 2011 in the USA was caused by unseasonably cold spring weather. Dr. Spencer writes "An unusually warm Gulf of Mexico of 1 or 2 degrees right now cannot explain the increase in contrast between warm and cold air masses which is key for tornado formation because that slight warmth cannot compete with the 10 to 20 degree below-normal air in the Midwest and Ohio Valley which has not wanted to give way to spring yet. ... global warming causes FEWER tornado outbreaks, not more."
Dr. Indur M. Goklany prepared a study which examines whether losses due to such events (as measured by aggregate deaths and death rates) have increased globally and for the United States in recent decades. It puts these deaths and death rates into perspective by comparing them with the overall mortality burden, and briefly discusses what trends in these measures imply about human adaptive capacity. Globally, mortality and mortality rates have declined by 95 percent or more since the 1920s. The largest improvements came from declines in mortality due to droughts and floods, which apparently were responsible for 93 percent of all deaths caused by extreme events during the 20th century. See here.
The most telling graph is the first one in the paper below:
The chart displays data on aggregate global mortality and mortality rates between 1900 and 2006 for the following weather-related extreme events: droughts, extreme temperatures (both extreme heat and extreme cold), floods, slides, waves and surges, wild fires and windstorms of different types (e.g., hurricanes, cyclones, tornadoes, typhoons, etc.). It indicates that both deaths and death rates have declined at least since the 1920s. Specifically, comparing the 1920s to the 2000 - 2006 period, the annual number of deaths declined from 485,200 to 22,100 (a 95 percent decline), while the death rate per million dropped from 241.8 to 3.5 (a decline of 99 percent).
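The percentage declines quoted above follow directly from the figures in the text; a quick arithmetic check:

```python
# Percentage-decline check using the figures quoted in the text.
deaths_1920s, deaths_2000s = 485_200, 22_100   # annual deaths, extreme weather
rate_1920s, rate_2000s = 241.8, 3.5            # deaths per million population

decline_deaths = (1 - deaths_2000s / deaths_1920s) * 100   # ~95.4%
decline_rate = (1 - rate_2000s / rate_1920s) * 100         # ~98.6%, rounds to 99
print(round(decline_deaths), round(decline_rate))  # -> 95 99
```

The death-rate decline is larger than the raw-death decline because world population grew several-fold over the same period.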
Researchers analyzed 7,000 years of data from sediment cores from southern France's coastal region and found that severe storms were more frequent during global cooling, including The Little Ice Age, than during global warming periods, such as the Medieval Warming Period. See here.
The IPCC suggests that warming might result in more floods and droughts. There is no reason why a warmer world would have more floods and droughts, and there is no trend of increasing floods or droughts. The Palmer Drought Index maintained by NOAA shows no trend in either floods or droughts in the USA, as shown below.
The 1930's and 1950's were very dry in the USA. We are fortunate that climate is so much better now.
Pederson et al. found that droughts during the end of the Little Ice Age were more severe and of longer duration than those of the 20th and 21st centuries. Cooler climates produced more extreme conditions in many parts of the world. See here.
Woodhouse et al. published a 1,200-year perspective on Southwestern North America droughts: "The medieval period was characterized by widespread and regionally severe, sustained drought in western North America. Proxy data documenting drought indicate centuries-long periods of increased aridity across the central and western U.S. ... The recent drought, thus far, pales hydrologically in comparison." See here.
Solar activity was high during both the Medieval and modern periods. High solar energy can result in periods of more intense drought and they have nothing to do with CO2 emissions.
Warming is Good for Your Health
The health benefits of a warmer planet are many times greater than any harmful effect. The positive health effects of heat have been well documented over the past quarter century. The early studies of Bull (1973) and Bull and Morton (1975a,b) in England and Wales, for example, demonstrated that even normal changes in temperature are typically associated with inverse changes in death rates, especially in older people. That is, when temperatures rise, death rates fall, while when temperatures fall, death rates rise.
Speculations on the potential impact of continued warming on human health often focus on mosquito-borne diseases. Elementary models suggest that higher global temperatures will enhance their transmission rates and extend their geographic ranges. However, the histories of three such diseases - malaria, yellow fever, and dengue - reveal that climate has rarely been the principal determinant of their prevalence or range. Human activities and their impact on local ecology have generally been much more significant. It is therefore inappropriate to use climate-based models to predict future prevalence.
Agriculture and Climate Change
A small temperature drop would decrease the length of the growing season and cause a severe drop in the arable area in northern climates. Conversely, warming would lengthen the growing season and increase the area suitable for agriculture.
The map below shows the present principal area of Canadian wheat production, and the reduction that would result from decreases of 1 and 2 degrees Celsius in average surface temperature.
Warming Effects on Animals
As indicated previously, both higher temperatures and higher CO2 concentrations enhance plant growth, especially for trees. This increases the habitat available for many animals. The bulk of scientific studies show an increase in biodiversity in response to global warming and atmospheric CO2 enrichment almost everywhere on Earth where it is not restricted by habitat destruction.
Global warming alarmists have picked the polar bear as their poster animal. Time magazine has told its readers that they should be worried about polar bear extinction. The data, however, do not support reasons for concern. In the Baffin Bay region between North America and Greenland, temperatures have been declining and the polar bear population has declined. In the Beaufort Sea region the temperature has increased, and so has the polar bear population. In other areas the polar bear population has been stable. So the trend of polar bear populations relative to temperature has been the opposite of what Time would lead its readers to believe.
There has been recent warming in the western Arctic as a result of the Pacific Decadal Oscillation, which periodically shifts the climate in the western Arctic by changing ocean currents. These cycles have occurred over thousands of years. No evidence suggests that polar bears and the conservation systems that regulate them will not adapt and respond to the new conditions. Polar bears have persisted through many similar climate cycles. See here for an article by Dr. Mitchell Taylor, Polar Bear Biologist.
Polar bear fossils have been dated to over one hundred thousand years, which means that polar bears have already survived an interglacial period when temperatures were considerably warmer than they are at present and when, quite probably, levels of summertime Arctic sea ice were correspondingly low. See here.
Canadian scientists summarized the various estimates of polar bear populations at an international meeting in 1965 as follows:
"Scott and others (1959) concluded that about 2,000 to 2,500 polar bears existed near the Alaskan coast. By extrapolation, they arrived at a total polar bear population of 17,000 to 19,000 animals. Uspensky (1961) estimated the world polar bear population at 5,000 to 8,000 animals. Harington (1964) ... believes the world polar bear population is well over 10,000."
In 1993, the Polar Bear Specialist Group press release noted, "The state of knowledge of individual subpopulations ranges from good to almost nothing." Then it said that "the world population of polar bears was thought to be between about 21,000 and 28,000." In 2005 the group reported "The total number of polar bears worldwide is estimated to be 20,000-25,000." See here. The Polar Bear Specialist Group of the International Union for the Conservation of Nature (IUCN) has reported in May 2011 that there was no change in the polar bear population in the most recent four-year period studied. The polar bear population is apparently more than double that of the 1960s.
Kyoto Protocol - Misallocation of Funds
Of all the major problems of the world, climate change is one of the least important because funds spent to reduce CO2 emissions will have an insignificant effect on climate. Computer model projections show that full implementation of the Kyoto Protocol may result in temperature reduction of an undetectable 0.06 Celsius by 2050 at a cost of about $1,000,000,000,000 US. See here. (This estimate assumes the sun has no effect on climate. Since the sun has a major effect, the 0.06 Celsius estimate is likely high by a factor of 2 or more.)
The Copenhagen Consensus (directed by environmentalist Bjorn Lomborg) analysed the major challenges facing the world and produced a prioritized list of opportunities responding to those challenges. Submissions by 24 United Nations ambassadors and other senior diplomats were reviewed by economists, who determined that the top priority for addressing major world challenges should be given to communicable diseases, sanitation and water, malnutrition, and education. Ranked toward the bottom of the 40-category list were issues relating to climate change and the Kyoto Protocol.
An Inconvenient Truth
Al Gore's movie "An Inconvenient Truth" (AIT) is grossly misleading about climate change. Nearly every major statement made in the movie is one-sided, exaggerated, or plainly false. This movie has had a large effect on public opinion even though most scientists agree it is misleading.
Some of the problems with AIT are:
Implies that, during the past 650,000 years, changes in carbon dioxide levels largely caused changes in global temperature, whereas the causality mostly runs the other way, with CO2 changes trailing global temperature changes by hundreds to thousands of years. Never mentions that global temperatures were warmer than the present during each of the past four interglacial periods, even though CO2 levels were lower.
Presents images showing what 20 feet of sea level rise would do to the world's major coastal communities. There is no credible evidence of an impending collapse of the great ice sheets. We do have fairly good data on ice mass balance changes and their effects on sea level. NASA scientist Jay Zwally and colleagues found a combined Greenland/Antarctica ice loss sea level rise equivalent of 0.05 mm per year during 1992-2002. At that rate, it would take a full century to raise sea level by just 5 mm.
Presents the hockey stick reconstruction of Northern Hemisphere temperature history used by the IPCC, according to which the 1990s were likely the warmest decade of the past millennium. It is now widely acknowledged that the hockey stick was built on a flawed methodology and inappropriate data.
Assumes a linear relationship between CO2 levels and global temperatures, whereas the actual CO2-warming effect is logarithmic, meaning that the next 100 ppm increase in CO2 levels adds only half as much heat as the previous 100 ppm increase.
Claims that the rate of global warming is accelerating, whereas the rate was roughly constant for the 30 years to 2002, about 0.17 Celsius per decade, with no warming from 2002 through 2006.
Claims that Lake Chad in Northern Africa is drying up due to global warming. The lake is the water source for 20 million people, and it has an average depth of only 1.5 to 4.5 meters. It has actually been dry multiple times in the past: in 8500 BC, 5500 BC, 2000 BC and 100 BC. The lake has shrunk in size due to a rapidly expanding population drawing water from the lake, the introduction of irrigation technologies and local overgrazing. These causes are neither global nor warming, and are utterly independent of CO2. In addition, Africa as a continent experienced a dramatic shift towards drier weather at the end of the 19th century that is not generally attributed to CO2.
Distracts viewers from the main hurricane problem facing the United States: the ever-growing concentration of population and wealth in vulnerable coastal regions, which is partly a consequence of federal flood insurance and other political subsidies.
Blames global warming for the decline since the 1960s of the emperor penguin population in Antarctica, implying that the penguins are in peril, their numbers dwindling as the world warms. In fact, the population declined in the 1970s and has been stable since the late 1980s.
Never explains why anyone should be alarmed about the current Arctic warming, considering that our stone-age ancestors survived, and likely benefited from, the much stronger and longer Arctic warming known as the Holocene Climate Optimum.
Presents one climate model's projection of increased U.S. drought as authoritative even though another leading model forecasts increased wetness. Climate model hydrology forecasts on regional scales are notoriously unreliable. Most of the United States, outside the Southwest, became wetter during 1925-2003.
Blames global warming for the record number of typhoons hitting Japan in 2004. Local meteorological conditions, not average global temperatures, determine the trajectory of particular storms, and data going back to 1950 show no correlation between North Pacific storm activity and global temperatures.
Claims that global warming endangers polar bears even though polar bear populations are increasing in Arctic areas where it is warming and declining in Arctic areas where it is cooling. In fact 11 of the 13 main groups in Canada are thriving, and there is evidence that the only groups that are not thriving are in a region of the Arctic that has cooled. Polar bears have survived the Holocene Climate Optimum and the Medieval Warm Period, both of which were significantly warmer than today's climate.
Warns that a doubling of pre-industrial CO2 levels to 560 ppm will so acidify sea water that all optimal areas for coral reef construction will disappear by 2050. This is not plausible. Coral calcification rates have increased as ocean temperatures and CO2 levels have risen, and today's main reef builders evolved and thrived during the Mesozoic Period, when atmospheric CO2 levels hovered above 1,000 ppm for 150 million years and exceeded 2,000 ppm for several million years.
Blames global warming for the resurgence of malaria in Kenya, even though several studies have found no climate link and attribute the problem to decreased spraying of homes with DDT and anti-malarial drug resistance.
Claims that 2004 set an all-time record for the number of tornadoes in the United States. Tornado frequency has not increased; rather, the detection of smaller tornadoes has increased. If we consider the tornadoes that have been detectable for many decades (category F-3 or greater), there actually has been a downward trend since 1950.
Cites Tuvalu, Polynesia, as a place where rising sea levels force residents to evacuate their homes. In reality, sea levels at Tuvalu fell during the latter half of the 20th century and even during the 1990s.
Neglects to mention that global warming could reduce the severity of winter storms (also called frontal storms because their energy comes from colliding air masses, or fronts) by decreasing the temperature differential between colliding air masses.
Ignores the large role of natural variability in Arctic climate, never mentioning either that Arctic temperatures during the 1930s equalled or exceeded those of the late 20th century, or that the Arctic during the early- to mid-Holocene was significantly warmer than it is today.
Ignores a study by University of Missouri professor Curt Davis that found an overall Antarctic ice mass gain during 1992-2003.
Neglects to mention that NASA satellites show an Antarctic cooling trend of 0.11 Celsius per decade since 1978.
Calls carbon dioxide the most important greenhouse gas. Water vapour and clouds are the leading contributors and account for over 70% of the greenhouse effect.
Claims that the ice cap on Mt. Kilimanjaro is disappearing due to global warming, though satellite measurements show no temperature change at the summit.
This is only a partial list of errors, omissions and exaggerations.
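One item in the list above, the logarithmic CO2-temperature relationship, can be illustrated numerically. The sketch below uses the widely cited simplified forcing approximation dF = 5.35 ln(C/C0) W/m2; the concentration values are round numbers chosen purely for illustration, and the exact ratio between successive increments depends on the baseline concentration.

```python
import math

# Simplified CO2 radiative forcing approximation: dF = 5.35 * ln(C / C0) W/m^2.
# Because the relationship is logarithmic, equal ppm increments produce
# successively smaller forcing increments.
def forcing(c, c0=280.0):
    """Forcing in W/m^2 relative to a 280 ppm pre-industrial baseline."""
    return 5.35 * math.log(c / c0)

step1 = forcing(380) - forcing(280)   # first 100 ppm increase above 280 ppm
step2 = forcing(480) - forcing(380)   # next 100 ppm increase
print(round(step1, 2), round(step2, 2))  # -> 1.63 1.25
```

Each doubling of concentration, by contrast, adds the same forcing (about 3.7 W/m2 under this approximation), which is why the effect of added CO2 diminishes per ppm as concentrations rise.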
See here from the Competitive Enterprise Institute.
See here for an article listing 35 errors in AIT by Christopher Monckton of Brenchley.
The decision by the British government to distribute the film "An Inconvenient Truth" to schools has been the subject of a legal action. The British High Court found that the film was false or misleading in 11 respects.
In order for the film to be shown, the High Court ruled in October, 2007 that teachers must make it clear to their students that:
1.) The film is a political work and promotes only one side of the argument.
2.) Nine inaccuracies have to be specifically drawn to the attention of school children.
The inaccuracies are listed here.
Al Gore and the IPCC shared the 2007 Nobel Peace Prize "for their efforts to build up and disseminate greater knowledge about man-made climate change, and to lay the foundations for the measures that are needed to counteract such change". Irena Sendler was considered for the prize for saving 2,500 children and infants from the Nazi Warsaw Ghetto and the extermination camps during World War II. She was not selected. See her story here.
Warnings of Global Cooling
Nigel Weiss, Professor Emeritus at the Department of Applied Mathematics and Theoretical Physics at the University of Cambridge, says that throughout Earth's history climate change has been driven by factors other than man: "Variable behaviour of the sun is an obvious explanation," says Dr. Weiss, "and there is increasing evidence that Earth's climate responds to changing patterns of solar magnetic activity." The sun's most obvious magnetic features are sunspots, formed as magnetic fields rip through the sun's surface. "If you look back into the sun's past, you find that we live in a period of abnormally high solar activity," Dr. Weiss states. These hyperactive periods do not last long, "perhaps 50 to 100 years, then you get a crash," says Dr. Weiss. "It's a boom-bust system, and I would expect a crash soon."
In addition to the 11-year cycle, sunspots almost entirely "crash," or die out, every 200 years or so as solar activity diminishes. When the crash occurs, the Earth can cool dramatically. These phenomena, known as "grand minima," have recurred over the past 10,000 years, if not longer. In the 17th century, sunspots almost completely disappeared for 70 years. That was the coldest interval of the Little Ice Age, when New York Harbour froze, allowing walkers to journey from Manhattan to Staten Island, and when Viking colonies abandoned Greenland, a once verdant land that became tundra.
In contrast, when the sun is very active, such as the period we're now in, the Earth can warm dramatically. This was the case during the Medieval Warm Period, when the Vikings first colonized Greenland and when Britain was wine-growing country.
No one knows precisely when a crash will occur but some expect it soon, because the sun's polar field is now at its weakest since measurements began in the early 1950s. Some predict the crash within five years, and many speculate about its effect on global warming. Several authorities are now warning of global cooling because the sun has entered a quiet period.
A Russian Academy of Sciences report in August 2006 warns that global cooling could develop on Earth in 50 years and have serious consequences.
David Archibald's presentation titled "The Past and Future of Climate" (here), presented to the Lavoisier Group's 2007 Workshop in Melbourne, Australia, shows a forecast of global temperatures based on a detailed analysis of sunspot cycles. He expects the next sunspot cycle (24) to be weak, resulting in the start of a long cooling trend. The forecast shows a 1.5 Celsius drop in global temperature from 2007 to 2025. He warns "...this will have a large and negative effect on Canadian grain production...".
On July 1, 2008, the Space and Science Research Center, a solar research organization, issued a formal declaration on climate change: Global warming has ended - a new climate era of pronounced cold weather has begun.