Blog & News

Unsettling Science

Thu 21 January 2016

Category: Guest Contributor, Membership

Article by R G Heath of Seismic and Oilfield Services Ltd, rgheath@btconnect.com
Originally published in EAGE’s First Break November 2015


The Scientific Method and Hydrocarbon Prosperity
The scientific method is perhaps humanity’s greatest achievement, laying the foundation for the industrial revolution, which improved living standards to unimaginable levels. It led to the harnessing of fossil fuels, providing far more energy than any individual previously had at their disposal and increasing prosperity in the process. There are direct relationships between per capita CO2 output and standard-of-living indices such as child mortality, life expectancy and funding for environmental protection. Most social advancements have relied on electricity generated with the fossil fuels our profession helps to locate and, despite costly drives towards renewables, 87% of world energy in 2013 came from hydrocarbons.

For 150 years there have been claims that fossil fuels have peaked. William Jevons in the 1860s forecast that Britain would quickly run out of coal, leading him to conclude that the country’s ‘present happy progressive condition’ would be of limited duration. But thanks to geoscientists, we still enjoy reliable supplies enabling affluence to expand. Despite demonisation by environmentalists, civilisation will depend on hydrocarbons for most of our foreseeable power needs permitting us to continue escaping Malthusian limitations.

Imagine a planet where Jevons was right and we had run short of coal during the reign of Queen Victoria. Would standards of living have gone on increasing and how would the environment have suffered?

Settled Science?
The Oxford English Dictionary defines the scientific method as a ‘procedure that has characterised natural science since the 17th century, consisting in systematic observation, measurement and experiment, and the formulation, testing, and modification of hypotheses’. This is not entirely correct. It was Islamic scholars who first formalised the idea. Ibn Al-Haytham, writing a thousand years ago about the natural world stated ‘We must be seekers of truth and not rely on any consensus.’ He realised that the burden of proof is on those who come up with conjectures. Islam might have become the culture that eventually brought us the fully fledged scientific revolution had it not been for Genghis Khan.  T. H. Huxley amplified Al-Haytham’s views: ‘The improver of natural knowledge absolutely refuses to acknowledge authority as such. Scepticism is the highest of duties; blind faith the one unpardonable sin.’

However, believers in a manmade climate alarm refer to those who do not accept their position as sceptics and deniers because ‘there is a consensus that the science is settled’. Michael Crichton said: ‘If it’s consensus it isn’t science and if it’s science it isn’t consensus.’ As physical scientists, we have the right to ask how much science backs up the claims of climate catastrophe based on the combustion of substances for which we explore, and how closely the alarmist cause adheres to the scientific method.

Most know that climate changes constantly; temperature is always fluctuating as the atmosphere is a thermodynamically unstable fluid-like system acted on by various non-linear forces and feedbacks of differing magnitudes and durations whose origins range from the deep oceans to outer space. Forty years ago almost everyone thought change was normal and trending towards a new ice age. It was only in the 1980s that manmade ‘Global Warming’ emerged and was later reborn as the more encompassing ‘Climate Change’. But as MIT atmospheric physicist Richard Lindzen observed: ‘The move from ‘global warming’ to ‘climate change’ indicated the silliness of this issue. The climate has been changing since the Earth was formed.’ The most recent study I found about climate variability is from China, showing from the geochemistry of marine sediments that change has been underway for at least 1.4 billion years. The topic later underwent another makeover and was reincarnated as ‘Dangerous Climate Disruption’. But since it is the manmade warming issue that is supposed to be scary, I will use the acronym AGW – Anthropogenic Global Warming.

As a member of various organisations involved in climate science and policy, I have written articles, worked with internationally read authors and given talks to AGW adherents. The late Nigel Calder, one of the greatest science writers of recent times and a frequent contributor on this subject, found climate change to be an ‘angry science’. I have had various disagreements over the decades about exploration and never got into a heated argument, whereas I can hardly remember a single climate-related event where there was not some ad hominem aggression, especially if I revealed my profession. When asking AGW fanatics for evidence of catastrophe, the usual response is that they do not actually have any but that a consensus of scientists assures us that things are dire.

The notion of consensus is hardly scientific. Science is not settled by a vote and, if it were, democracy does not support AGW. Over 30,000 scientists signed The Oregon Petition challenging alarmism. Experts who signed the Leipzig Declaration in 1995 called climate controls ‘ill-advised, lacking credible support from the underlying science’. Four thousand world leaders and scientists, including 70 Nobel Prize winners, signed the Heidelberg Appeal calling AGW ‘highly uncertain’ and, recently, mathematicians from the French Société de Calcul Mathématique produced 195 pages entitled: “The battle against global warming: an absurd, costly and pointless crusade.”

Many believe that the UN’s Intergovernmental Panel on Climate Change (IPCC) is the gatekeeper of climate truths. It is said to use thousands of the world’s ‘top scientists’, and its various Assessment Reports (AR) are said to contain hard evidence. Those making such claims may not have spent much time reading ARs, which do not hold the information they are often imagined to. There is no excuse for not studying these reports; the EAGE bookstall occasionally sells them off cheaply. I spent €10 on AR4 (‘The Physical Science Basis and Summary for Policy Makers’), which was a bargain. I have been unable to get those in influential positions on the alarmist side to deny that these documents contain no hard empirical or observational data. For example, in personal communications, a key IPCC scientist agreed there was indeed no evidence but said that ‘a picture would emerge of climate-based catastrophes leaving no one in doubt’. Many books have been written about these computer-predicted disasters. However, one model may forewarn of floods while another sees drought devastating the same place. Our own field makes great use of models, but we are far more wary of their predictive limitations.

AGW advocates may use information from NGO press releases based on the Summary for Policy Makers (SPM) from the ARs, but SPMs rarely represent the science. Another source of scares is journalists, many of whom have little education in the physical sciences. For example, some UK ‘science’ correspondents boast degrees in English and Geography. The Oscar-winning film An Inconvenient Truth is also an oft-quoted source, but covered itself in little glory given that a UK High Court judge ruled that AIT had nine scientific errors; I counted over thirty.

Patrick Michaels (Department of Environmental Science, University of Virginia) estimated that tens of thousands of scientists now depend on there being a climate problem, plus far larger numbers of hangers-on. AGW is big business; compare that with job security in our field. In the US, several federal agencies take part in the Global Change Research Programme which annually costs many billions. Both the American and EU AGW programmes also have big private money behind them. There is an interesting recent Senate Committee on Environment and Public Works report The Chain of Environmental Command: How a Club of Billionaires and Their Foundations Control the Environmental Movement. In Europe, The International Policy Network produced a report entitled Friends of the EU – The Costs of a Taxpayer-Funded Green Lobby revealing how green pressure groups are funded by the EU in order to lobby the EU, with AGW high on the agenda.

It is difficult even to know the scale of recent temperature change or why it is worthy of special regard. The ~20-year AGW period (sometimes called the ‘Alarmocene’, which finished in 1997/98) has nothing unique to identify it. Its duration and delta are similar to the 1915-1935 warming, which no one associates with CO2. The most reliable measurements of Earth’s temperature are from microwave-sounding unit (MSU) satellites. The data are published monthly, are global in coverage and are independently verified by radiosonde. University of Alabama at Huntsville (UAH) MSU data show no change in temperature for ~19 years, and Remote Sensing Systems basically agrees. MSU data were recently reprocessed and revealed even less to be alarmed about: www.drroyspencer.com/2015/04/version-6-0-of-the-uah-temperature-dataset-released-new-lt-trend-0-11-cdecade/. Alarmism is generally based on surface data, which may suffer from dubious raw quality, poor sampling, poor siting of stations and occasional manipulation. They generally indicate significantly more warming than MSU data. An inquiry into this has been launched by The Global Warming Policy Foundation (www.thegwpf.org/inquiry-launched-into-global-temperature-data-integrity/) and results will be made public.

The Alarmocene’s rate of temperature change has been far lower than the initial recovery from the Little Ice Age (LIA) of 2.2 K/century, and we have not reached the balmy temperatures of the Mediaeval Warm Period (MWP). Global Accumulated Cyclone Energy demonstrates no upward trend since satellite monitoring began in the 1970s. The March 2015 US tornado count is zero – only the second time this has happened since 1950. Even glaciers – Exhibit A for the prosecution – are poor witnesses for AGW; the Alaskan glacier visited by President Obama in September to highlight AGW started receding more than a century ago and is uncovering remains of trees which grew there during the MWP, when it was warmer. Sea level increase is not unusual: allowing for tectonic movement, tide gauge measurements indicate eustatic rise of ~1.8 mm/y for the last century (Leuliette, 2012), while satellites show 1.3±0.9 mm/y for 2005-2011. If CO2 caused warming, sea level should be changing more quickly.

We often hear Earth’s ice stores are disappearing. However, 2014’s Arctic summer ice melt was minimal with thickness back to levels of almost a decade ago. A recent paper in Nature by Environment Canada noted ‘from 2007-13 there was a near-zero trend in observed Arctic September sea-ice extent, in large part due to a strong uptick of the icepack in 2013 which has continued into 2014’. Freshwater melt was supposed to decelerate the Gulf Stream but two decades of velocity measurement show no decrease. Antarctic sea ice is steadily increasing.

The undoubted signature of anthropogenic warming was supposed to be a ‘hotspot’ in the upper troposphere caused by an ascending water vapour (WV) layer. This is crucial, as most AGW fears stem from WV, so no hotspot equals no need for panic. Only radiosondes can investigate this in detail and, despite the vast numbers of such soundings, no hotspot can be found. All in all, we seem to have no carbon-induced catastrophes to concern ourselves with, so indeed a picture has emerged. Those attending COP-21 Paris please take note.

The Science?
Climate sensitivity (CS) is the temperature increase from doubling CO2 concentration from ~280 ppmv to 560 ppmv (due around 2100). Temperature change is proportional to the log of the relative CO2 increase, so whatever delta-T results from reaching 560 ppmv, a further doubling to 1120 ppmv is required to produce another increase of the same magnitude. This is in the IPCC’s third report and is due to the shape of CO2’s absorption lines leading to progressive saturation, though it is only an approximation. The same delta-T would theoretically have resulted from going from 140 to 280 ppmv, from 70 to 140 ppmv and so on; thus CO2 long ago lost its ability to increase temperatures significantly.
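The logarithmic relationship described above is easy to sketch numerically. The per-doubling sensitivity value S below is purely illustrative – it is the very quantity whose magnitude the rest of the article disputes:

```python
import math

def delta_t(c_new, c_old, s_2x):
    """Temperature change (K) for a CO2 rise from c_old to c_new (ppmv),
    assuming delta-T is proportional to the log of the relative increase.
    s_2x is the per-doubling sensitivity, an illustrative input only."""
    return s_2x * math.log(c_new / c_old) / math.log(2)

S = 1.1  # hypothetical per-doubling value (K), for illustration only

# Each successive doubling yields the same increment:
print(delta_t(560, 280, S))   # 280 -> 560 ppmv
print(delta_t(1120, 560, S))  # 560 -> 1120 ppmv: the same delta-T again
print(delta_t(280, 140, S))   # 140 -> 280 ppmv: and the same historically
```

Because the relationship is logarithmic, each further doubling buys only the same increment as the previous one, which is the progressive-saturation point made above.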

All CO2 accounts for about one twentieth of the total greenhouse effect, and mankind’s emissions are just a small portion of that. Up to 96% of the GH effect comes from WV. CO2’s capabilities are a tiny fraction of a tinier fraction: the rate of CO2 increase represents one additional molecule for every 100,000 already in the atmosphere, every five years. CO2 is now ~400 ppmv and we hear it has not been at this concentration for 800,000 years. This statement’s veracity is dubious at best but, even if it were true, so what? CO2 and temperature correlate poorly. It has been much colder with 10-15 times as much CO2. Current temperature is less than the average of the last eight millennia of the Holocene Warm Period, which is probably what allowed civilisation to take root. There were also the MWP, Minoan and Roman warm periods – each of which is referred to as a ‘climate optimum’.

CO2 traps some of the outgoing long wave radiation and is a “forcing” while other processes either amplify or reduce that warming, and are feedbacks. Some calculate CO2 doubling alone will lead to ~1.1K increase, others think it less. Climate models assume feedbacks are positive, increasing warming significantly. However, real data indicates feedbacks are net negative (e.g. Lindzen and Choi 2009).

Schools teach that if it were not for CO2, average surface temperature would be about -16ºC, whereas thanks to CO2 it is a life-friendly +16ºC. But as Spencer describes in his book Climate Confusion, in the absence of all other influences the surface, due to CO2, would be a deadly +55ºC and the stratosphere cold enough to turn jet fuel into gel. This does not happen because, before this huge temperature gradient can establish itself, the atmosphere becomes convectively unstable – the laws of thermodynamics come to the rescue with negative feedbacks allowing heat to flow from where there is a lot to where there is less, both upwards and polewards. Planetary-scale air movements create winds causing evaporation, carrying heat away from oceans, lakes, marshes etc. Additional CO2 does not cause a warming tendency; it merely enhances the considerable pre-existing tendency by a very small amount.

Edmund Halley, of comet fame, calculated that in temperate regions, from ‘one square degree of the ocean’ (about 177 km²), 30 million tonnes of water evaporate daily. The rate in the tropics is significantly greater. Vapour carried upwards in thermals eventually condenses out to form water droplets/clouds, releasing the heat removed from the surface. This heat keeps clouds aloft, as seen from the adiabatic lapse rate. Small clouds can carry a few hundred tonnes of water, but large cumulonimbus clouds defy gravity to hold ~1 million tonnes, containing the same amount of heat as released by a small nuclear device. The condensed water may fall as rain – one estimate is that average daily planetary rainfall is 1.4 trillion tonnes – or re-evaporate and rise to higher levels. At any one time, between 100 and 150 billion tonnes of water is held aloft. This process represents a giant solar-powered air conditioner keeping the surface much cooler than it otherwise would be. Eventually, the heat is radiated into space. So it is not CO2 which makes life possible; it is weather systems distributing heat.

Clouds also change planetary albedo, reflecting radiation back into space. Low-level clouds help to keep the planet cool and high-level clouds do the opposite, though the net effect of all clouds taken together is cooling. Global temperature is controlled to a considerable degree by cloud formation, and clouds, along with WV, represent the biggest unknowns in terms of precise climatic effect.

William Herschel, who discovered infrared radiation and the planet Uranus, noted 200 years ago a relationship between sunspot activity and grain harvests, though he did not know the mechanism. There is interesting recent work (e.g. Christensen and Svensmark) showing how cloud formation is modulated by coronal mass ejections and higher-energy cosmic rays. On a decadal basis temperature records match solar activity well; cosmic ray intensity and terrestrial climate correlate for more than 3,000 years (C14 and O18 isotopes respectively).

Global brightening from surface incident solar radiation can explain all recent warming during which there was an increase in forcing of between 2 and 7W/sq.m from reduced cloud cover – the IPCC says anthropogenic CO2 only exerts ~3W/sq.m. There is also correlation between temperature change and the major oceanic influences such as ENSO. So CO2 does not have to be invoked as the ocean-atmosphere-heliosphere system is nonlinear, dynamical and chaotic and changes all by itself over relevant timescales. It is also probable that processes which kept Earth within narrow temperature bands over billions of years still operate.

Many other scientists spurred curiosity about climate. Robert Hooke thought prediction required systematic recording of weather data, which he was one of the first to do. This influenced Daniel Defoe (author of Robinson Crusoe) who catalogued what became known as The Great Storm of 1703 in his book The Storm. This storm was one of the most destructive in recorded British history – long before anthropogenic CO2.

But it was only in the last few centuries that scientists recognised CO2 as a greenhouse gas. Some are referenced as though their pronouncements have relevance to today’s debate, including the Swedish chemist Svante Arrhenius, who won the Nobel Prize in 1903 for work on the conduction of electricity in electrolytes. He thought the release of carbonic acid (as he referred to CO2) would increase temperatures, which he saw as positive, proclaiming ‘We may hope to enjoy ages with more equable and better climates.’ Arrhenius’s calculations did not allow for Planck’s work on black body radiation. When Arrhenius saw it, he halved his temperature estimate, and the lack of future warming depressed him. He might have been even more upset had he understood how feedbacks further reduce temperature.

How we got here
The IPCC was established in 1988 by the World Meteorological Organisation and the UN’s Environmental Programme. Encouraged by the success of the Montreal Protocol which covered eight synthetic gases said to harm the ozone layer, they targeted three GH gases: carbon dioxide, nitrous oxide and methane. The difference between these groups of gases was that the science of the first was well agreed, and CFCs were manufactured only by a few companies so regulation was simple. However, CO2-based warming was hotly disputed and CO2/CH4 are the outputs of most life forms.

The IPCC’s Charter includes assessing literature ‘relevant to the understanding of the risk of human-induced climate change’, the definition of which is in Article 1 – ‘a change of climate which is attributed directly or indirectly to human activity that alters the composition of the global atmosphere and which is in addition to natural climate variability observed over comparable time periods’. The WMO assumed a range of natural variability of ±1K. Hubert Lamb, whom some see as the father of UK climatology, thought it impossible to define a figure, adding that the range is itself subject to variation. Lamb doubted that anthropogenic CO2 caused much warming, because existing levels of CO2 and WV already block most of the radiation and because warming oceans out-gas CO2. He stated: ‘On balance, the effect of increased carbon dioxide on climate is almost certainly in the direction of warming but is probably much smaller than the estimates which have commonly been accepted’. Lamb’s proper caution did not deter alarmists from reducing the WMO’s value to ±0.5K, possibly as a prelude to claiming that anything beyond it was anthropogenic.

The IPCC was set up with three different working groups. WG1 deals with the science and WG2 with the potential impact of warming on society. But WG3 is the only group that matters politically as it recommends international action on emissions – WG1 is there to provide gravitas to WG3. These groups work in parallel with little pretence of WG3 waiting for the deliberations of any scientists. This is analogous to building a refinery before any seismic is shot.

Five Assessment Reports have been produced. The first AR (1990) formed the basis for 1992’s Rio Climate Summit, which led to the Framework Convention on Climate Change, which in turn formed the basis for the 1997 Kyoto Protocol to control CO2 emissions. The problem was that the IPCC needed a lengthy CO2 atmospheric residence time to secure its raison d’être. Hitherto, this was widely accepted as 4-15 years, but such a duration made emission controls pointless. So they devised a new definition enabling figures of >75 years. Segalstad’s 1998 listing of 36 related experiments, all of which still showed <15 years, suggests otherwise.

In the second AR many key scientific contributions were excluded, e.g. ‘No study to date has positively attributed all or part [of observed climate change] to anthropogenic causes’. The IPCC did not deny these deletions, saying changes were made under pressure from governments. The report ended up stating: ‘The balance of evidence suggests a discernible human influence on global climate’. Professor Frederick Seitz wrote: ‘In my more than 60 years as a member of the American scientific community, including service as president of both the National Academy of Sciences and the American Physical Society, I have never witnessed a more disturbing corruption of the peer review process than the events which led to this IPCC report.’

The third AR wiped away the MWP and LIA from all historical and scientific records so warming coincided with CO2 emissions, as shown in the now infamous hockey stick graph, despite AR1 featuring figures which showed otherwise. The hockey stick appeared six times in AR3 and was included in the IPCC’s logo before its fall from grace. However, on examination by Canadians Steve McIntyre and Ross McKitrick, the graph was proven to be an artefact of unusual statistical methodology. Such shapes are created even when random data (red noise) is used. McKitrick pointed out: ‘The hockey stick debate is thus about two things. At a technical level it is about flaws in methodology and erroneous results in a scientific paper. But at the political level the debate is about whether the IPCC betrayed the trust of governments around the world.’ The non-existence of a worldwide MWP still remains an article of faith for some AGW apostles. Recently, I attended one of their talks to hear ‘the MWP hardly existed even in Greenland, and nowhere else’ ignoring data from hundreds of global sites.

In February 2007 the IPCC was supposed to publish AR4, saying it constituted ‘the standard reference for all concerned with climate change in academia, government and industry worldwide’. It was heralded as proof that severe anthropogenic climate change was underway. However, by that time, the only document available was a summary which referred to a report still being prepared. The IPCC explained that the delay was to ‘permit adjustments to the scientific report’. When the SPM for AR4 was eventually released it even contained simple arithmetical errors: expected sea level rise was greatly overestimated simply from decimal points in the wrong place. Only about a quarter of the much-vaunted 2,500 experts contributed to the science chapters, several of whom were environmental activists. Only a handful had any experience in the essential aspect – the evaluation of climate sensitivity. All others depended on CS values proposed by these few. AR3’s hockey stick had all but disappeared. AR4 concluded with a 90% confidence level that the CS range was 2.0-4.5K.

The fifth AR reduced the lower bound of the range to give 1.5-4.5K, and its SPM said recent warming was due to mankind. The IPCC claimed ex cathedra that this had a 95% confidence level, though AR5 contained no supporting data. Forty years ago the US National Academy of Sciences investigated CS and arrived at 1.5-4.5K, so the hundred billion spent during this time has profited mankind little. Neither AR4 nor AR5 mentioned the warming pause, which is approaching 20 years’ duration. Were we to cherry-pick data, CS for this period, which represents about half the satellite era, is zero. The IPCC, having predicted a resumption of warming, started to refer to ‘missing heat’. One conjecture is that it is at the bottom of the ocean, but to get there it would have to have sneaked past the ARGO network, keeping its signature hidden.

Many governmental policy institutions almost blindly follow the IPCC. For example, the UK’s AGW Ministry says it is ‘largely informed by the IPCC’s Assessment Reports’, stating: ‘Heat lost from the surface by evaporation is transported upwards and radiated to space. The height at which this occurs depends on the infrared opacity of the atmosphere. Thickening of this opacity due to GH gas emissions tends to shift the radiating height upwards to emit at a colder layer. This means that more energy enters the Earth system than is flowing out so the Earth must warm up.’ As with much alarmist ‘science’ this is not the whole picture. If it were, more CO2 would always mean a warmer climate.

Given how critical this is, it would make sense to do two things. Firstly, subject AGW to a cost-benefit analysis. The 2006 Stern Report is one attempt, but it was derided by some other economists, one reason being its choice of discount rates. It suggested that AGW would cost 20% of world GDP but that, if action were taken immediately, the cost would only be 1%. It believed renewable energy should be boosted but, even before publication, countries that had already made serious commitments to renewables showed this figure to be flawed. Other environmental groups stated that for 1% of global GDP there were far more pressing concerns than AGW. (In the last decade alone EU states have spent about €600 billion on renewable energy projects.)

The second issue that the IPCC avoided was a second opinion. There are various bodies interested in climate science which the IPCC ignores; perhaps the largest is the Non-Governmental International Panel on Climate Change. The NIPCC was founded in 2003 in Italy by concerned scientists during one of the frequent UN climate meetings. In 2007 the NIPCC began offering alternative scientific interpretations and providing independent examinations of the peer-reviewed literature. It is now an international coalition of experts, greater in number than are involved with the IPCC and produces its own hefty reports. See www.nipccreport.org

Climate Insensitivity
To evaluate CS, we must define relationships between CO2 concentration, radiative forcing, temperature change and any feedbacks to create a surface delta-T. As a crude approximation we allow for only four variables: A, B, C and D.  ARs do not always make things this easy to follow and I am grateful to Christopher Monckton and others for their input. (Also see ‘Why Models Run Hot’, Chinese Science Bulletin, Monckton, Soon et al).

A is the factor due to CO2 doubling compared with pre-anthropogenic levels: A = ln(CO2new/CO2old) = ln 2 ≈ 0.69.

B covers other GH gases, and turns CO2 concentration increase A into a forcing in Watts/sq.m hitting the surface. It also allows for other effects including contrails, aerosols etc. Very approximately its value in this case is 5.

C is what AR5 refers to as the “Planck sensitivity”, which turns increased radiative forcing into a surface temperature delta. Laboratory experiments give it a value of 0.31ºC per W/sq.m.

The product of A, B and C gives ~1.1K increase for CO2 doubling, but then we must allow for feedbacks. Factor D is this ‘feedback multiplier’, covering any mutually amplified heating/damping effects. The IPCC gives this a strongly positive value of ~3.1, producing a CS figure of ~3.3K (the central estimate; much greater in some scenarios due to choice of a higher D value). So to be a cause for concern, CO2’s minimal warming must be multiplied by a large factor solely due to hypothetically positive feedbacks.
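As a sanity check, the arithmetic of the four factors can be reproduced in a few lines. This is the article’s simplified chain with the approximate values quoted above, not a climate model:

```python
import math

# Factors as defined in the text (approximate values):
A = math.log(2)   # ~0.69: CO2 doubling term
B = 5.0           # very approximate conversion to a forcing in W/sq.m
C = 0.31          # 'Planck sensitivity' in degC per W/sq.m
D_IPCC = 3.1      # the IPCC's strongly positive feedback multiplier

no_feedback = A * B * C            # ~1.1 K for CO2 doubling alone
cs_central = no_feedback * D_IPCC  # ~3.3 K IPCC-style central estimate

print(round(no_feedback, 2))  # 1.07
print(round(cs_central, 2))   # 3.33
```

The pre-feedback product of ~1.1K matches the figure in the text; everything alarming is contributed by the multiplier D.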

Some believe that forcing drops off more quickly than is covered by A’s logarithmic relationship and that IPCC-accepted figures for B and C are exaggerated, as laboratory experiments do not replicate the atmosphere. Professor Lindzen believes the figure should be cut by up to a factor of 3. With no possibility to undertake real-world testing, these values should be climate model outputs but such programmes are incapable of evaluating them, so estimates have to be made and used as inputs.

The value of C is based on a couple of papers which in turn cite a few others, providing little justification for a value so high. CS is very sensitive to the value of D. In 2001, the IPCC put D at 2.08 and offered scant reason for the 50% increase in 2007, and we have seen how weather works to make such amplifications improbable. Team-AGW says additional surface heat creates extra evaporation which humidifies the air. But positive WV feedback only occurs if the cold upper troposphere remains moist, which is what models assume. If the UT is moist, emissions to space take place from vapour in the warmer boundary layer in the lower troposphere, leaving less energy to be emitted from the surface. However, the GH effect is not controlled by surface evaporation but precipitation systems regulate the amount of vapour in the troposphere. Surface evaporation is always trying to saturate the atmosphere but ongoing precipitation stops that happening. Otherwise, it would take less than a week to completely humidify the atmosphere (100% relative humidity). From basic physics, some show that whatever warming is caused by CO2, weather systems can completely negate it.

If we re-do the CS calculation using realistic negative feedbacks, a central estimate is <1K, most of which must already have happened. Thus, it seems we may continue to use hydrocarbons guilt-free. But why does anyone believe that feedbacks must be positive and enhance CO2’s modest warming? The answer lies in climate models.
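The re-done calculation is the same chain with only D changed. The 0.85 below is my own hypothetical choice of a net-negative feedback multiplier (D < 1); the text states only that realistic negative feedbacks bring the central estimate below 1K:

```python
import math

# Pre-feedback warming from the A*B*C factors defined in the text (~1.07 K):
no_feedback = math.log(2) * 5.0 * 0.31

# Hypothetical net-damping feedback multiplier (D < 1). The specific value
# is illustrative only; the article claims merely that realistic negative
# feedbacks yield a central estimate under 1 K.
D_negative = 0.85
cs_low = no_feedback * D_negative

print(round(cs_low, 2))  # 0.91 -- below 1 K
```

Any D below ~0.93 drops the result under 1K, so the conclusion does not hinge on the particular value chosen here.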

Climate Muddles
To know more about climate model limitations I recommend the IPCC publication An Introduction to Simple Climate Models used in the IPCC Second Assessment Report. Models are used to generate scenarios into the distant future by incorporating the human contribution to climate change (‘attribution’). But AR3 admitted ‘we are dealing with a coupled-nonlinear chaotic system, and therefore long-term prediction of future climate states is not possible’, yet the ludicrous assumption is made that natural variation is understood. Nevertheless, a series of graphs is often publicised “proving” that recent warming cannot have been due to natural variation, but these were produced ex post facto, and only worked with a low (by IPCC standards) CS value of about 2.5ºC.

Models simulate forcings, e.g. increasing CO2, volcanic emissions and aerosols (>20 drivers have been identified, with different intensities and timescales), with calculations based on the classical laws of motion and thermodynamics. These rely greatly on partial differential equations to estimate temperature, pressure, density etc. in the atmosphere and oceans. However, a partial derivative should only be used to investigate the change in one variable with respect to another while everything else is held constant. In the real climate there are large numbers of dependent variables, and nothing can be held constant. The use of such calculus is mathematically totally inappropriate; variations in initial conditions can lead to large output fluctuations, making it impossible to compute accurate solutions. As pointed out by Dr Vincent Gray (an expert reviewer of all five ARs) in his book ‘The Global Warming Scam and the Climate Change Superscam’, not a single model has ever been validated.

Climate models are severely limited in other ways. Forecasting requires precise understanding of processes such as cloudiness, condensation heating, evaporative cooling, cloud-free radiation, air-sea moisture-temperature flux, extra-terrestrial effects, etc., none of which are well understood. There is also the closure problem of turbulence in fluids: after 150 years of trying, we cannot even determine pipe flow from first principles or obtain the lowest-order statistics. Thus, models do a poor job of simulating many areas of climate which alarmists claim are well understood, e.g. the current hiatus. They are even further from incorporating various negative feedbacks, such as the process recently identified by European chemists whereby the oceans emit huge volumes of volatile organic compounds into the atmosphere, providing a significant cooling effect.

Yet none of this prevents model-based predictions from sounding the alarm. But we do not need to rely on models; Earth Radiation Budget Experiment (ERBE) satellite data suggest that feedbacks tend to be net negative.

2C or not 2C
Our industry should re-evaluate its position on AGW. The increase in CO2 concentration averages ~2 ppmv/y, a rate which changes with natural temperature variation and of which less than half is attributable to the combustion of hydrocarbons (‘Relaxation Kinetics of Atmospheric Carbon Dioxide’, Gösta Pettersson, revised 2014); meanwhile, CO2 has not been shown to be the cause of any dangerous warming.

Alarmists insist that >2°C is bad for the planet, equating to 420-450 ppmv (choosing high values of B-D), while the IEA reports that two thirds of all hydrocarbon reserves will be unexploitable once we reach that concentration. But 2°C was never based on hard data; in the preparations for Kyoto, Sweden suggested 2°C as a ‘threshold’, but it only came to prominence ten years ago through the World Wildlife Fund, which apparently ignored plentiful research showing that 2-3°C of warming would be good for the planet in many ways. They claimed >2°C would harm biodiversity, even though warmth encourages biodiversity, with most species living in the tropics. The EU then saw its chance to make 2°C official, although at that time there had been a decade without warming. (The UK Met Office used climate models to estimate that 0.3°C of warming would occur between 2004 and 2014; during this period we saw no warming at all.) The 2007 Bali climate boondoggle, which 12,000 of the planet’s most carbon-conscious flew in to attend, turned 2°C into a Communiqué which some in our industry put their names to. It is surprising that geoscientists signed up, given that IPCC calculations are based on high net positive feedbacks making 2°C imminent, which would significantly devalue reserves.
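Taking the ~2 ppmv/y growth rate quoted earlier, and assuming a starting level of roughly 400 ppmv (approximately the concentration when this was written; an illustrative assumption), the 420-450 ppmv range works out to a decade or two away:

```python
# Back-of-envelope sketch: years until atmospheric CO2 reaches the
# 420-450 ppmv range said to correspond to 2 degrees C.
# Assumes the ~2 ppmv/y growth rate quoted in the text and a ~400 ppmv
# starting concentration (an illustrative assumption, roughly the 2015 level).

def years_to_reach(target_ppmv, current_ppmv=400.0, rate_ppmv_per_year=2.0):
    """Years to reach a target concentration at a constant linear growth rate."""
    return (target_ppmv - current_ppmv) / rate_ppmv_per_year

print(years_to_reach(420))  # -> 10.0
print(years_to_reach(450))  # -> 25.0
```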

Recently, The Right Climate Stuff, a group formed from veterans of the Apollo programme, set out to see whether 2°C of warming from CO2 was even possible. They generously assumed that ALL warming since 1850 – about 0.7°C – was anthropogenic, then used total recoverable fossil fuels [traditional resources] and satellite temperature data to calculate an upper bound of an additional 1.2°C, i.e. a 1.9°C total increase. The latest UAH data suggest that a CO2 doubling would likely lead to only 1.3°C.

It is also ironic that 2°C of warming is still mentioned given the growing number of recent predictions of a mini ice age coming to a continent near you around 2030; we will then need all the hydrocarbons we can produce. This is something the oil industry could take the lead on – preparing the planet against what would be a real emergency. But with or without a new ice age, we can happily go on creating planetary prosperity for a very long while before having to worry about carbon-created climate change.

The question has been asked: what can our business do to clear the way for an informed debate? The answer is simple: review the science and the latest data; isn’t this how we always used to proceed? We should get our PR machinery organised and spokesmen appointed, bragging about the good our industry has done, the additional benefits it has yet to bring to ever more of mankind – and that science is on our side. Alex Epstein, who is not even a geoscientist, has made a good start on our behalf.

Conclusion
For reasons of space, this article has not covered many important issues, but it is sufficient to repeat that no evidence exists to support AGW. The IPCC is believed to have demonstrated CO2’s harmfulness, but some see it as untrustworthy and unrepresentative of major aspects of climate research, while the alarm is based on models which cannot, without tweaking, even hindcast the recent past. While most individual AGW scientists perform to the highest standards, the process as a whole appears to have been occasionally hijacked by those with other motives, and the scientific method exchanged for Post-Normal Science.

It makes little sense for one of the world’s most rigorous users of the scientific method (our industry) to go along with the AGW conjecture. There are significant ethical, economic, environmental and scientific grounds not just for the continued use of hydrocarbons but for their appropriate expansion.

Acknowledgements
Various individuals from our own and the climate industries provided material for this piece. Not all wanted their names mentioned, but I am especially grateful to John Graham for his input.

References
Climate of Corruption: Politics and Power Behind the Global Warming Hoax; Larry Bell.
Unstoppable Global Warming; Fred Singer & Dennis Avery.
Heaven and Earth: Global Warming, the Missing Science; Ian Plimer.
The Chilling Stars; Henrik Svensmark & Nigel Calder.
Chill; Peter Taylor.
The Great Global Warming Blunder; Roy Spencer.
The Moral Case for Fossil Fuels; Alex Epstein.
The Physics of Atmospheres; John Houghton.
The Many Benefits of CO2 Enrichment; Craig & Sherwood Idso.
The Age of Global Warming; Rupert Darwall.