Oil Industry Group’s Own Report Shows Early Knowledge of Climate Impacts

A report the American Petroleum Institute commissioned in 1982 reveals its knowledge of global warming, which predated its campaign to sow doubt.


An oil tanker loads up in the Port of Los Angeles. The oil industry knew its impact on climate change in the early '80s, an internal report shows. Credit: Getty Images


A Columbia University report commissioned by the American Petroleum Institute in 1982 cautioned that global warming “can have serious consequences for man’s comfort and survival.” It is the latest indication that the oil industry learned of the possible threat it posed to the climate far earlier than previously known.

The report, “Climate Models and CO2 Warming, A Selective Review and Summary,” was written by Alan Oppenheim and William L. Donn of Columbia’s Lamont-Doherty Geological Observatory for API’s Climate and Energy task force, said James J. Nelson, the task force’s former director. From 1979 to 1983, API and the nation’s largest oil companies convened the task force to monitor and share climate research, including their in-house efforts. Exxon ran the most ambitious of the corporate programs, but other oil companies had their own projects, smaller than Exxon’s and focused largely on climate modeling.

The task force commissioned the report to better understand the models being produced in the nascent field of climate science, Nelson said.  

“There was discussion in the committee about all the noise and information” around carbon dioxide, Nelson said. “There were all sorts of numbers being thrown around. We were not trying to find a model to hang our hats on. It was more, ‘If you see this model, this is how it’s built and these are its strengths and weaknesses.'”

Obtained from a university library by the Union of Concerned Scientists and made available to InsideClimate News, the report described in detail five models used at the time by climate scientists. They ranged from simple to complex: the radiation balance, energy balance, radiative-convective, thermodynamic and general circulation models. A table showed the predictions each model generated of the average increase in global temperature if atmospheric concentrations of CO2 doubled compared to pre-industrial times, from 0.6 degrees C per hemisphere under the thermodynamic model to 2 to 3.5 degrees C globally under the general circulation model. The poles were expected to undergo even greater jumps in temperature.
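To see how the simplest of these models arrive at such numbers, here is a minimal zero-dimensional energy-balance sketch, not drawn from the 1982 report itself: it uses the commonly cited logarithmic approximation for CO2 radiative forcing and an assumed climate feedback parameter, both of which are illustrative values rather than figures from the report.

```python
# Minimal zero-dimensional energy-balance sketch (illustrative only, not from the 1982 report).
# It estimates equilibrium warming from a doubling of atmospheric CO2 using the commonly
# cited logarithmic forcing approximation dF = 5.35 * ln(C/C0) W/m^2 and an assumed
# climate feedback parameter; both constants are assumptions chosen for illustration.

import math

FORCING_COEFF = 5.35    # W/m^2 per natural-log unit of CO2 change (standard approximation)
FEEDBACK_PARAM = 1.25   # W/m^2 per degree C; assumed, roughly mid-range value


def equilibrium_warming(co2_ratio: float) -> float:
    """Equilibrium temperature change (deg C) for a given CO2 concentration ratio."""
    forcing = FORCING_COEFF * math.log(co2_ratio)  # radiative forcing in W/m^2
    return forcing / FEEDBACK_PARAM                # divide by feedback to get warming


if __name__ == "__main__":
    # A doubling of CO2 relative to pre-industrial levels:
    print(f"Estimated warming for doubled CO2: {equilibrium_warming(2.0):.1f} deg C")
    # With these assumed parameters the answer lands near 3 deg C, inside the
    # 2 to 3.5 deg C range the general circulation models of the early 1980s produced.
```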

The report did not focus on the forces behind the increase in CO2 concentrations, but it linked the phenomenon plainly to fossil fuel use. Atmospheric CO2, it said, “is expected to double some time in the next century. Just when depends on the particular estimate of the level of increasing energy use per year and the mix of carbon based fuels.”
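The timing question the report raises reduces to simple doubling-time arithmetic: under a constant annual growth rate in atmospheric CO2, the doubling time is ln(2) / ln(1 + r). The sketch below illustrates that calculation; the growth rates and the 1982 baseline year are assumptions for illustration, not figures taken from the report.

```python
# Illustrative doubling-time arithmetic; the growth rates and baseline year are assumptions.
import math

BASE_YEAR = 1982  # assumed baseline, the year the report was commissioned


def doubling_year(annual_growth_rate: float, base_year: int = BASE_YEAR) -> int:
    """Year in which CO2 concentration doubles, given a constant fractional growth rate."""
    years_to_double = math.log(2) / math.log(1 + annual_growth_rate)
    return base_year + round(years_to_double)


if __name__ == "__main__":
    for rate in (0.004, 0.007, 0.010):  # 0.4%, 0.7%, 1.0% per year, purely illustrative
        print(f"{rate:.1%} growth per year -> doubling around {doubling_year(rate)}")
```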

Like many studies at the time, the report stressed the models’ inherent uncertainties. “All models are still sufficiently unrealistic that a definitive evaluation of the problem requires continued effort,” the authors wrote in the summary.

Still, the report concluded that the models pointed to hikes in global average temperatures as CO2 concentrations rose. “They all predict some kind of increase in temperature within a global mean range of 4 degrees C,” the report stated. “The consensus is that high latitudes will be heated more than the equator and the land areas more than the oceans.”

That consensus has accurately predicted how global warming has proceeded since then, and it matches what current models still project for the future.

The consequences for humanity were serious, the authors wrote, “since patterns of aridity and rainfall can change, the height of the sea level can increase considerably and the world food supply can be affected.”

The authors concluded that “optimum forecasting of climate changes is a necessity for any realistic long term planning by government and industry.”

When it commissioned the report, Nelson said, the API task force did not provide any guidance on which models to use and did not meddle with the assessment. Committee members received periodic updates about the report’s progress, he said. The final document “was well received by the committee,” Nelson recalled. “We didn’t change it at all. Copies were sent to all the member companies since they paid for it.”

Nelson said the general feeling of the task force members about climate models echoed the report’s findings that models were not yet realistic enough to evaluate global warming definitively.

Nelson said he suggested the Columbia study in part because of his own skepticism of atmospheric modeling. A former Air Force pilot, he had sometimes found himself in hairy situations because of inaccurate weather forecasting based on modeling.

“Everybody kept talking about the doubling of CO2 in the atmosphere, and we wanted to know where the numbers came from, what kind of assumptions were in the models as definitely as possible,” he said, “because I had the experience from a long time before that if you put garbage in, you get garbage out.”

Task force members also wanted to understand the modeling because they worried that the predictions could lead to what they believed were unnecessary regulations. “Where we were also coming from, we felt we didn’t want the EPA throwing a lot of new rules at this until we knew more precisely what would be the most effective methods of solving the problem,” Nelson said.

The report accurately described climate physics and the models used in the early 1980s, said Anthony Del Genio, a NASA atmospheric scientist and expert on the general circulation model and climate feedbacks, who recently read the Columbia document.

But it also reflected “biases prevalent in the academic community at the time” that simpler models were better than the general circulation one, Del Genio said in an email. Further, the report’s “failure to critically evaluate the models, some of it justified by the limited knowledge at that time but some of it a failure to think critically about the simple models, is its greatest weakness.”

Del Genio also questioned why API commissioned such a paper when the National Academy of Sciences had issued a definitive assessment of climate models in 1979, known as the Charney report.

More telling is what API did with the information once it read its own report. “API could have used that knowledge to invest in developing solutions to climate change,” said Peter Frumhoff, director of science and policy for UCS.

Instead, a year after the task force circulated the report to API’s members, the organization disbanded the committee and shifted its work on climate change from the environment directorate to its lobbying arm.

The industry’s lobbying effort over the years sought to emphasize the uncertainties surrounding global warming, even as the models improved and the scientific consensus around man-made climate change grew stronger. Throughout the 1990s, for instance, it joined Exxon and other fossil fuel interests in the Global Climate Coalition (GCC), whose objective was to derail international efforts to curb greenhouse emissions by questioning climate science. In 1998, API coordinated a multi-million dollar campaign to convince the public and policymakers that the Kyoto Protocol was based on tenuous science.

The groups declared victory when President George W. Bush pulled the U.S. out of the Kyoto Protocol in 2001.

A June 2001 briefing memorandum records a top State Department official thanking the GCC because Bush “rejected the Kyoto Protocol in part, based on input from you.”
