The European Union has a widely quoted objective of avoiding anthropogenic temperature rise of more than 2°C. That is to say, all the greenhouse gases we have pumped into the atmosphere should, at no point, produce enough radiative forcing to increase mean global temperature by more than 2°C above its 1750 level.
What is less commonly recognized is how ambitious a goal this is. The difficulty of the goal is closely connected to climate sensitivity: the “equilibrium change in global mean surface temperature following a doubling of the atmospheric (equivalent) CO2 concentration.” According to the Intergovernmental Panel on Climate Change, this is: “likely to be in the range 2 to 4.5°C with a best estimate of about 3°C, and is very unlikely to be less than 1.5°C. Values substantially higher than 4.5°C cannot be excluded, but agreement of models with observations is not as good for those values.”
If we take their most likely value of 3°C, the implication is that we cannot allow global greenhouse gas concentrations to double. Before the Industrial Revolution, carbon dioxide concentrations were about 280ppm. Today, they are about 380ppm.
Based on the IPCC’s conclusions, stabilizing greenhouse gas levels at 450ppm only produces a 50% chance of staying below 2°C of warming. In order to have a relatively high chance of success, levels need to be stabilized below 400ppm. The Stern Review’s economic projections are based around stabilization between 450 and 500ppm. Stabilizing lower could be quite a lot more expensive.
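To make the arithmetic concrete, here is a rough back-of-envelope sketch (my own, not from the IPCC or Stern) of the equilibrium warming implied by various CO2 stabilization levels, assuming the central 3°C sensitivity and ignoring non-CO2 gases, aerosols, and the lag before equilibrium is reached:

```python
import math

def equilibrium_warming(conc_ppm, baseline_ppm=280.0, sensitivity=3.0):
    """Rough equilibrium warming (deg C) for a given CO2 concentration,
    using the standard logarithmic relationship between concentration
    and radiative forcing. CO2 only; other gases and aerosols ignored."""
    doublings = math.log(conc_ppm / baseline_ppm, 2)
    return sensitivity * doublings

for level in (380, 400, 450, 500, 560):
    print(f"{level} ppm -> ~{equilibrium_warming(level):.1f} deg C at equilibrium")
# 450 ppm comes out at roughly 2.1 deg C with a 3 deg C sensitivity, which is
# why stabilizing there offers only about even odds of staying under 2 deg C.
```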
Finally, there is considerable uncertainty about climate sensitivity itself. Largely, this is the consequence of feedback loops within the climate. If feedbacks are so strong that climate sensitivity is greater than 3°C, it is possible that current GHG concentrations are sufficient to breach the 2°C target for total warming. Some people argue that climatic sensitivity is so uncertain that temperature-based targets are useless.
The 2°C target is by no means sufficient to avoid major harmful effects from climate change. Effects listed for that level of warming in the Stern Review include:
- Failing crop yields in many developing regions
- Rising number of people at risk from hunger, with half the increase in Africa and West Asia
- Severe impacts in marginal Sahel region
- Significant changes in water availability
- Large fraction of ecosystems unable to maintain current form
- Rising intensity of storms, forest fires, droughts, flooding, and heat waves
- Risk of weakening of natural carbon absorption and possible increasing natural methane releases and weakening of the Atlantic Thermohaline Circulation
- Onset of irreversible melting of the Greenland ice sheet
Just above 2°C, there is “possible onset of collapse of part or all of Amazonian rainforest” – the kind of feedback-inducing effect that could produce runaway climate change.
George Monbiot has also commented on this. The head of the International Energy Agency has said that it is too late for the target to be met (PDF).
In a half-page ad in the Globe and Mail, Tuesday, Suncor boasted that it had reduced GHG emission intensity in its Alberta oil sands development by 51 per cent between 1990 and 2006. The ad failed to mention that – thanks to huge increases in production – the company’s absolute emissions increased by 131 per cent over the same period. According to a PowerPoint presentation that you can find here, Suncor also plans to increase its production in the next three to five years from 260 thousand barrels per day to 550 thousand – a further jump of more than 110 per cent.
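Since absolute emissions are just intensity multiplied by production, intensity can fall sharply while total emissions climb. A small sketch using the percentages above (only the ratios matter, so no absolute tonnage is needed):

```python
# Absolute emissions = emissions intensity (per barrel) x production (barrels),
# so the two percentage changes quoted above imply a large growth in production.
intensity_change = -0.51   # intensity fell 51% from 1990 to 2006
absolute_change = 1.31     # absolute emissions rose 131% over the same period

implied_production_growth = (1 + absolute_change) / (1 + intensity_change) - 1
print(f"Implied production growth 1990-2006: {implied_production_growth:.0%}")  # ~371%

# The planned expansion from 260 to 550 thousand barrels per day:
print(f"Planned further increase: {(550 - 260) / 260:.0%}")  # ~112%
```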
Dr. Andrew Weaver from the University of Victoria is putting the finishing touches on a new book called Keeping Our Cool: Canada in a Warming World.
There’s a broad consensus now among scientists and governments that we need to limit global warming on average to no more than 2 degrees Celsius above the pre-industrial level. Above 2 degrees you start to get into a zone of really quite catastrophic impact globally. Irreversible impact.
The EU, for example, as long ago as 1996 adopted a two degree limit as its objective in the long term. To date, the Government of Canada has simply refused to take a position on 2 degrees. In other words, the Government of Canada officially doesn’t have an opinion on how much global warming is too much, either for the world or for Canadians.
From 1996-2006, the level of CO2 in the atmosphere grew by an annual average of 1.93 parts per million to reach 381.2 parts per million last year, the biggest increase seen since scientists began continuous monitoring of the gas in 1959. WMO said last year’s uptick in CO2 concentrations was largely due to fossil fuel combustion.
In comparison, atmospheric CO2 levels rose by about 1.58 ppm per year in the 1980s and 1.49 ppm in the 1990s.
Temperatures have already risen by about 1.4 degrees Fahrenheit compared to pre-industrial levels. Since 1750, CO2 levels have risen to 136 percent of their pre-industrial level, methane to 255 percent, and nitrous oxide to 119 percent.
Today, I added two graphics taken from the Fourth Assessment Report of the IPCC to this entry.
Ocean circulation in a warming climate
Nature 451, 286-288 (17 January 2008) | doi:10.1038/nature06590; Published online 16 January 2008
Climate models predict that the ocean’s circulation will weaken in response to global warming, but the warming at the end of the last ice age suggests a different outcome.
There is an old truism in climate circles that the cold climate at the Last Glacial Maximum (LGM), which occurred 21,000 years ago, had stronger winds. This idea fits with the common observation that it is windier in the winter than in the summer because there is greater thermal contrast within the atmosphere in the winter hemisphere. Temperature reconstructions from the LGM show that Equator-to-pole gradients in sea surface temperature were indeed larger — that is, the polar oceans were colder than the tropical ocean at the LGM in comparison with the temperature differences today.
Avoiding catastrophic global warming requires stabilizing carbon dioxide concentrations, not emissions. Studies find that many, if not most, people are confused about this, including highly educated graduate students. I have personally found even well informed people are confused on this point and its crucial implications.
We need to cut emissions 50 to 80 percent below current levels just to stop concentrations from rising. And global temperatures will not be stabilized for decades after concentrations are stabilized. And of course, the ice sheets may not stop disintegrating for decades — and if we dawdle too long, centuries — after temperatures stabilize. That is why we must act now if we want to have any reasonable hope of averting catastrophe.
One 2007 MIT study, “Understanding Public Complacency About Climate Change: Adults’ mental models of climate change violate conservation of matter,” concluded “Low public support for mitigation policies may be based more on misconceptions of climate dynamics than high discount rates or uncertainty about the risks of harmful climate change.”
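The stock-and-flow point is easy to see with a toy “bathtub” model (a sketch of my own with illustrative numbers, not a calibrated carbon-cycle model): the concentration keeps rising as long as emissions exceed natural uptake, so merely freezing emissions at today’s level does nothing to stabilize it.

```python
# Toy "bathtub" model: atmospheric CO2 (the stock) rises while emissions
# (inflow) exceed natural uptake (outflow). Illustrative numbers only.

def concentration_after(years, emissions, uptake=2.0, start=380.0):
    """Concentration (ppm) after `years`, with emissions and uptake
    expressed in ppm-equivalent per year."""
    conc = start
    for _ in range(years):
        conc += emissions - uptake
    return conc

# Emissions held constant at roughly today's ~4 ppm-equivalent per year:
print(concentration_after(50, emissions=4.0))   # 480 ppm and still climbing

# Emissions cut by about half, to roughly match natural uptake:
print(concentration_after(50, emissions=2.0))   # stays at 380 ppm
```

In reality natural uptake is not fixed and tends to weaken over time, which is why the required cut is closer to 80 percent than 50.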
Here is a great video clarifying the issue, which you can send to folks. It is narrated by my friend Andrew Jones:
‘Stabilizing climate requires near-zero emissions’
A new climate science paper calls for dramatic action
Avoiding climate catastrophe will probably require going to near-zero net emissions of greenhouse gases this century. That is the conclusion of a new paper in Geophysical Research Letters (subs. req’d) co-authored by one of my favorite climate scientists, Ken Caldeira, whose papers always merit attention. Here is the abstract:
Current international climate mitigation efforts aim to stabilize levels of greenhouse gases in the atmosphere. However, human-induced climate warming will continue for many centuries, even after atmospheric CO2 levels are stabilized. In this paper, we assess the CO2 emissions requirements for global temperature stabilization within the next several centuries, using an Earth system model of intermediate complexity. We show first that a single pulse of carbon released into the atmosphere increases globally averaged surface temperature by an amount that remains approximately constant for several centuries, even in the absence of additional emissions. We then show that to hold climate constant at a given global temperature requires near-zero future carbon emissions. Our results suggest that future anthropogenic emissions would need to be eliminated in order to stabilize global-mean temperatures. As a consequence, any future anthropogenic emissions will commit the climate system to warming that is essentially irreversible on centennial timescales.
Is 450 ppm (or less) politically possible? Part 1
We’ll need a lot of Socolow and Pacala’s wedges
The short answer is: “Not today — not even close.”
The long answer is the subject of this post.
…
“The purpose of my last post on the adaptation trap was to make clear that 800 to 1,000 ppm, which is where we are headed, is a catastrophe far beyond human imagining, one that makes a mockery of the word “adaptation,” that has a “cost” far beyond that considered by any traditional economic cost-benefit analysis. It is a rationally and morally impossible choice. So too, I think, is 550 ppm, assuming we could stop there — which as I argued, we probably can’t, thanks to the carbon cycle feedbacks like the melting tundra.”
7 April 2008
Target CO2
What is the long term sensitivity to increasing CO2? What, indeed, does long term sensitivity even mean? Jim Hansen and some colleagues (not including me) have a preprint available that claims that it is around 6ºC based on paleo-climate evidence. Since that is significantly larger than the ‘standard’ climate sensitivity we’ve often talked about, it’s worth looking at in more detail.
Britain’s climate target ‘impossible’
Efforts to help keep world temperature rises under 2C will fail, says thinktank, even if UK sticks to policy on carbon emissions
Juliette Jowit, environment editor, The Observer, Sunday June 8 2008
Britain will find it ‘impossible’ to meet its target as part of the world’s battle to ensure temperatures do not rise more than 2C – a key threshold for dangerous climate change, according to a study by a panel of leading experts.
The report ‘Carbon Scenarios’ by the Stockholm Network thinktank says that if existing policies and hopes of international agreement on reducing emissions were implemented, there would still be a 90 per cent chance the temperature rise would reach about 3C, a level that experts fear would provoke ‘feedback’ of more carbon by melting permafrost, threatening the world’s forests.
But how strong is this warming effect? That is the only fundamental doubt about anthropogenic climate change that can still be legitimately debated. We climatologists describe this in terms of the climate sensitivity, the warming that results in equilibrium from a doubling of CO2. The IPCC gives the uncertainty range as 1.5-4.5 ºC. Only if this is wrong, and the true value is lower, can we escape the fact that unabated emissions of greenhouse gases will lead to the warming projected by the IPCC.
Chances for that are not good. A new large uncertainty analysis that appeared this week in Nature shows that it is very difficult to get a climate sensitivity below 2 ºC in a climate model, no matter how one changes the parameters. And climate history, with its Ice Ages and other large changes, also speaks strongly against low climate sensitivity.
First of all, how much does atmospheric CO2 rise if you add 3000 GtC to the system in a (geologically) short period of time? Zeebe et al. did this calculation and the answer is about 700 ppmv – quite a lot eh? However, that is a perturbation to the Paleocene carbon cycle – which they assume has a base CO2 level of 1000 ppm, and so you only get a 70% increase – i.e. not even a doubling of CO2. And since the forcing that goes along with an increase in CO2 is logarithmic, it is the percent change in CO2 that matters rather than the absolute increase. The radiative forcing associated with that is about 2.6 W/m2. Unfortunately, we don’t (yet) have very good estimates of background CO2 levels in Paleocene. The proxies we do have suggest significantly higher values than today, but they aren’t precise. Levels could have been less than 1000 ppm, or even significantly more.
If (and this is a key assumption that we’ll get to later) this was the only forcing associated with the PETM event, how much warmer would we expect the planet to get? One might be tempted to use the standard ‘Charney’ climate sensitivity (2-4.5ºC per doubling of CO2) that is discussed so much in the IPCC reports. That would give you a mere 1.5-3ºC warming which appears inadequate. However, this is inappropriate for at least two reasons. First, the Charney sensitivity is a quite carefully defined metric that is used to compare a certain class of atmospheric models. It assumes that there are no other changes in atmospheric composition (aerosols, methane, ozone) and no changes in vegetation, ice sheets or ocean circulation. It is not the warming we expect if we just increase CO2 and let everything else adjust.
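To make the arithmetic in the last two paragraphs explicit, here is a sketch using the common logarithmic approximation for CO2 forcing (my own calculation, not the one in Zeebe et al., so the numbers differ slightly):

```python
import math

def co2_forcing(new_ppm, base_ppm):
    """Approximate radiative forcing (W/m2) from a change in CO2,
    using the common logarithmic fit dF ~ 5.35 * ln(C/C0)."""
    return 5.35 * math.log(new_ppm / base_ppm)

# A ~70% rise from an assumed Paleocene baseline of 1000 ppm:
forcing = co2_forcing(1700, 1000)
print(f"Forcing: ~{forcing:.1f} W/m2")   # ~2.8 W/m2, near the ~2.6 quoted above

# Warming implied by the Charney range (2-4.5 C per doubling, ~3.7 W/m2):
for sens in (2.0, 4.5):
    print(f"{sens} C per doubling -> ~{sens * forcing / 3.7:.1f} C")
# roughly 1.5-3.5 C with this approximation, close to the 'mere 1.5-3 C' above
```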
In fact, the concept we should be looking at is the Earth System Sensitivity (a usage I am trying to get more widely adopted) as we mentioned last year in our discussion of ‘Target CO2‘. The point is that all of those factors left out of the Charney sensitivity are going to change, and we are interested in the response of the whole Earth System – not just an idealised little piece of it that happens to fit with what was included in GCMs in 1979.
—
We need to start with some definitions. Sensitivity is defined as the global mean surface temperature anomaly response to a doubling of CO2 with other boundary conditions staying the same. However, depending on what the boundary conditions include, you can get very different numbers. The standard definition (sometimes called the Charney sensitivity), assumes the land surface, ice sheets and atmospheric composition (chemistry and aerosols) stay the same. Hansen’s long term sensitivity (which might be better described as the Earth System sensitivity) allows all of these to vary and feed back on the temperature response. Indeed, one can imagine a whole range of different sensitivities that could be clearly defined by successively including additional feedbacks. The reason why the Earth System sensitivity might be more appropriate is because that determines the eventual consequences of any particular CO2 stabilization scenario.
“If humanity wishes to preserve a planet similar to that on which civilization developed and to which life on Earth is adapted, paleoclimate evidence and ongoing climate change suggest that CO2 will need to be reduced from its current 385 ppm to at most 350 ppm.”
James Hansen, NASA’s chief climatologist
‘Scary’ climate message from past
By Richard Black
Environment correspondent, BBC News website
A new historical record of carbon dioxide levels suggests current political targets on climate may be “playing with fire”, scientists say.
Researchers used ocean sediments to plot CO2 levels back 20 million years.
Levels similar to those now commonly regarded as adequate to tackle climate change were associated with sea levels 25-40m (80-130 ft) higher than today.
Scientists write in the journal Science that this extends knowledge of the link between CO2 and climate back in time.
The last 800,000 years have been mapped relatively well from ice cores drilled in Antarctica, where historical temperatures and atmospheric content have left a series of chemical clues in the layers of ice.
But looking back further has been more problematic; and the new record contains much more precise estimates of historical records than have been available before for the 20 million year timeframe.
CLIMATE CHANGE: Four Degrees of Devastation
By Stephen Leahy
UXBRIDGE, Canada, Oct 9 (IPS) – The prospect of a four-degree Celsius rise in global average temperatures in 50 years is alarming – but not alarmist, climate scientists now believe.
Eighteen months ago, no one dared imagine humanity pushing the climate beyond an additional two degrees C of heating, but rising carbon emissions and inability to agree on cuts has meant science must now consider the previously unthinkable.
“Two degrees C is already gone as a target,” said Chris West of the University of Oxford’s UK Climate Impacts Programme.
“Four degrees C is definitely possible…This is the biggest challenge in our history,” West told participants at the “4 Degrees and Beyond, International Climate Science Conference” at the University of Oxford last week.
A four-degree C overall increase means a world where temperatures will be two degrees warmer in some places, 12 degrees and more in others, making them uninhabitable.
It is a world with a one- to two-metre sea level rise by 2100, leaving hundreds of millions homeless. This will head to 12 metres in the coming centuries as the Greenland and Western Antarctic ice sheets melt, according to papers presented at the conference in Oxford.
Four degrees of warming would be hotter than any time in the last 30 million years, and it could happen as soon as 2060 to 2070.
World on course for catastrophic 6° rise, reveal scientists
Fast-rising carbon emissions mean that worst-case predictions for climate change are coming true
By Steve Connor and Michael McCarthy
The world is now firmly on course for the worst-case scenario in terms of climate change, with average global temperatures rising by up to 6C by the end of the century, leading scientists said yesterday. Such a rise – which would be much higher nearer the poles – would have cataclysmic and irreversible consequences for the Earth, making large parts of the planet uninhabitable and threatening the basis of human civilisation.
We are headed for it, the scientists said, because the carbon dioxide emissions from industry, transport and deforestation which are responsible for warming the atmosphere have increased dramatically since 2002, in a way which no one anticipated, and are now running at treble the annual rate of the 1990s.
This means that the most extreme scenario envisaged in the last report from the UN Intergovernmental Panel on Climate Change, published in 2007, is now the one for which society is set, according to the 31 researchers from seven countries involved in the Global Carbon Project.
“With carbon emissions still rising, and political foot-dragging continuing, some scientists began to consider what the world will look like if we miss the target of limiting global temperature increase to 2 °C above pre-industrial levels.
Writing in April in Nature (458, 1102; 2009), Martin Parry of Imperial College London and colleagues warned that we should prepare to adapt to an overshoot of the 2 °C mark. Even if emissions peak in 2015 and decrease by three per cent per year, there’s an even chance we’ll exceed 2 °C, they said. As a precaution, we should begin planning now to adapt to 4 °C.
This message was reiterated at a conference in September in Oxford, by which stage scientists had done considerably more research on what 4 °C of warming would mean. Among other things, in a 4 °C world we could look forward to the destruction of US$1 trillion worth of gross domestic product and displacement of 146 million people if sea levels rise a metre, as well as starvation, disease, fire and flooding.
Richard Betts, a researcher with the UK Met Office Hadley Centre in Exeter, told the conference that temperatures could reach 4 °C above pre-industrial levels by 2060, in part because natural carbon sinks might lose their ability to absorb carbon from the atmosphere.
In November, a European consortium of 65 research centers concluded that to avoid overshooting 2 °C, emissions would have to reach almost zero by 2100, and we might need to start pulling carbon out of the atmosphere by 2050.”
“Charney was seeking the equilibrium global warming, the warming after the atmosphere and ocean have come to a new final temperature in response to increased carbon dioxide. The immediate effect of doubling carbon dioxide, if everything else were fixed, would be a decrease of about 4 watts (per square metre) in the heat radiation from Earth to space. This is simply physics… The added carbon dioxide increases the opacity (opaqueness) of the atmosphere for heat radiation, so radiation to space arises from a higher level, where it is colder, thus reducing emission to space.
Any physicist worth his salt can immediately tell you the answer to Charney’s problem if everything except temperature is fixed… We can use Planck’s law to calculate how much Earth must warm up to radiate 4 more watts and restore the planet’s energy balance. The answer we find is 1.2 degrees Celsius. So the climate sensitivity in this simple case of Planck radiation is 0.3 degrees Celsius per watt of climate forcing.
This simple Planck’s law climate sensitivity, 0.3 degrees Celsius per watt of climate forcing, is called the no-feedback climate sensitivity. Feedbacks occur in response to variations in temperature and can cause further global temperature change, either magnifying or diminishing the no-feedback, or blackbody, response. Feedbacks are the guts of the climate problem. Forcings drive climate change. Feedbacks determine the magnitude of climate change.”
Hansen, James. Storms of My Grandchildren. p.42 (hardcover)
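Hansen’s no-feedback figure can be reproduced with Stefan-Boltzmann arithmetic. The sketch below is the standard textbook estimate, not his exact calculation, so it lands slightly below his 1.2°C:

```python
# No-feedback (blackbody) climate sensitivity from the Stefan-Boltzmann law:
# how much must Earth's effective radiating temperature (~255 K) rise to
# emit an extra 4 W/m2?
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T_EFF = 255.0     # effective radiating temperature, K

emission_per_kelvin = 4 * SIGMA * T_EFF**3   # d(sigma*T^4)/dT
print(f"Extra emission per K of warming: {emission_per_kelvin:.2f} W/m2")           # ~3.8
print(f"Warming needed to radiate 4 W/m2 more: {4.0 / emission_per_kelvin:.1f} K")  # ~1.1
print(f"No-feedback sensitivity: {1 / emission_per_kelvin:.2f} K per W/m2")         # ~0.27
# Close to the ~1.2 C and ~0.3 C per watt Hansen quotes; the small difference
# reflects details of how forcing and emission temperature are defined.
```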
Note that in Storms of My Grandchildren James Hansen provides a specific estimate of climate sensitivity, based on paleoclimatic data. He observes that the magnitude of greenhouse gas forcings changed by 3±0.5 watts between the last ice age and the Holocene. In addition, surface changes such as changes in vegetation and exposure of continental shelves altered forcings by 3.5 watts.
The 6.5 watts together maintained an equilibrium temperature change of about 5°C, implying a climate sensitivity of about 0.75°C per watt of forcing. As such, the expected change in equilibrium temperature from the four watt forcing that accompanies a doubling of the concentration of atmospheric carbon dioxide is about 3°C – right in the middle of Charney’s range of estimates.
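The arithmetic Hansen describes is a simple ratio, sketched below using the numbers quoted above:

```python
# Empirical sensitivity from the last glacial maximum, as Hansen describes:
# total forcing change divided into the equilibrium temperature change.
ghg_forcing = 3.0       # W/m2, greenhouse gas changes (quoted as 3 +/- 0.5)
surface_forcing = 3.5   # W/m2, ice sheets, vegetation, exposed continental shelves
delta_t = 5.0           # deg C, glacial-interglacial temperature change

sensitivity = delta_t / (ghg_forcing + surface_forcing)
print(f"Sensitivity: ~{sensitivity:.2f} C per W/m2")             # ~0.77
print(f"Per CO2 doubling (~4 W/m2): ~{4 * sensitivity:.1f} C")   # ~3.1 C
```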
Hansen argues that this sensitivity covers only a bounded range, and that the climate can be pushed out of that range in either direction. Going one way would produce a ‘snowball Earth’ while going the other (the way we are headed) would produce a runaway greenhouse effect.
Tong, J.A., You, Y., Müller, R.D. and Seton, M. 2009. Climate model sensitivity to atmospheric CO2 concentrations for the middle Miocene. Global and Planetary Change 67:129-140; Mills, T.C. 2009. How robust is the long-run relationship between temperature and radiative forcing? Climatic Change 94:351-361.
Two new studies, one based on past climate behavior millions of years ago and the other comparing climate and radiative forcing trends during the past 150 years, suggest a global warming in response to a doubling of CO2 concentrations on the order of 2°C. This is within, but at the lower end of, the large range of estimates provided by past studies.
There has been long-standing debate about how sensitive the global climate system is to rising greenhouse gas concentrations. Most estimates indicate at least a 1.5°C equilibrium warming for a doubling of carbon dioxide concentrations, while some suggest it could be 7°C or higher. Two recent studies have added to the debate. In one of these, a team of Australian modelers investigates the response of the earth’s climate to changes in carbon dioxide some 15 million years ago. Their results suggest a climate sensitivity during that period of about 2.2°C per doubling of CO2. In the second analysis, a British economist uses a statistical comparison of changes in temperatures over the past 150 years with concurrent estimated changes in radiative forcing. Assuming that net radiative forcing from multiple causes has the same effect on global climate as that due to CO2 alone, he concludes that global climate sensitivity is likely in the range of 1 to 3°C per CO2 doubling. Both of these estimates are within the range of past estimates, but at the lower end of the range.
Summary courtesy of Environment Canada
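The second study’s basic technique is a regression of observed temperatures on estimated total forcing, with the slope scaled by the ~3.7 W/m2 forcing of a CO2 doubling. Here is a toy version with synthetic data, purely to show the mechanics; Mills’s actual analysis is a much more careful time-series treatment:

```python
import numpy as np

# Toy regression-based sensitivity estimate: fit temperature anomaly against
# total radiative forcing and scale the slope to a per-doubling figure.
# The data below are synthetic, generated only to illustrate the mechanics.
rng = np.random.default_rng(0)
years = np.arange(1850, 2000)
forcing = 0.012 * (years - 1850) + rng.normal(0, 0.1, years.size)   # W/m2
temperature = 0.5 * forcing + rng.normal(0, 0.1, years.size)        # deg C anomaly

slope, intercept = np.polyfit(forcing, temperature, 1)
print(f"Fitted slope: {slope:.2f} C per W/m2")
print(f"Implied sensitivity: ~{slope * 3.7:.1f} C per CO2 doubling")
```

A simple fit like this captures the transient response rather than the true equilibrium, since the oceans are still taking up heat; that lag is exactly why such historical estimates tend to come out at the low end, a point taken up further below.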
SIR – The reduced warming of the past decade is brief and can be understood in terms of natural fluctuations from the El Niño phenomenon, the effects of volcanoes, the solar cycle and the uptake of heat from the oceans, which continues, in contrast to your statement. There are and will always be fluctuations in global temperature, but the underlying trend is robust, man-made and consistent with a climate sensitivity of around 3°C.
The IPCC’s range on sensitivity is supported by, but not merely based on, models. It is deeply rooted in physics. Quantum physics and thermodynamics, the same physical laws that underlie the functioning of our computers and power plants, yield a baseline climate sensitivity of about 3°C. This is based on the facts that carbon dioxide, water vapour and methane absorb infra-red; a warmer atmosphere holds more water; and ice and snow melt under warming. Any deviation from this baseline needs a reason. As long as we do not find modern physics to be fundamentally wrong, we will have to plan for a climate sensitivity of 3°C.
Since CO2 emissions are consistently at the upper end of the IPCC’s scenarios, both our solid understanding of climate change on a global level and our lack of understanding of hurricanes and other climatic extremes demand more, not less, caution.
Professor Anders Levermann
Potsdam Institute for Climate Impact Research
Potsdam, Germany
Making sense of palaeoclimate sensitivity
Nature 491, 683–691 (29 November 2012); doi:10.1038/nature11574
Many palaeoclimate studies have quantified pre-anthropogenic climate change to calculate climate sensitivity (equilibrium temperature change in response to radiative forcing change), but a lack of consistent methodologies produces a wide range of estimates and hinders comparability of results. Here we present a stricter approach, to improve intercomparison of palaeoclimate sensitivity estimates in a manner compatible with equilibrium projections for future climate change. Over the past 65 million years, this reveals a climate sensitivity (in K W⁻¹ m²) of 0.3–1.9 or 0.6–1.3 at 95% or 68% probability, respectively. The latter implies a warming of 2.2–4.8 K per doubling of atmospheric CO2, which agrees with IPCC estimates.
http://www.nature.com/nature/journal/v491/n7426/full/nature11574.html
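Converting the paper’s units into degrees per doubling is just multiplication by the forcing from doubling CO2 (a quick sketch; the paper itself uses its own forcing definitions):

```python
# Convert a sensitivity in K per (W/m2) into K per CO2 doubling,
# taking ~3.7 W/m2 as the forcing from a doubling of CO2.
FORCING_PER_DOUBLING = 3.7   # W/m2

for low, high, label in [(0.3, 1.9, "95% range"), (0.6, 1.3, "68% range")]:
    print(f"{label}: {low * FORCING_PER_DOUBLING:.1f}-{high * FORCING_PER_DOUBLING:.1f} K per doubling")
# The 68% range works out to roughly 2.2-4.8 K per doubling, matching the abstract.
```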
How much global temperatures rise for a certain level of carbon emissions is called climate sensitivity and is seen as the single most important measure of climate change. Computer models have long indicated a high level of sensitivity, up to 4.5C for a doubling of CO2 in the atmosphere.
However in recent years estimates of climate sensitivity based on historical temperature records from the past century or so have suggested the response might be no more than 3C. This would mean the planet could be kept safe with lower cuts in emissions, which are easier to achieve.
But the new work, using both models and paleoclimate data from warming periods in the Earth’s past, shows that the historical temperature measurements do not reveal the slow heating of the planet’s oceans that takes place for decades or centuries after CO2 has been added to the atmosphere.
“The hope was that climate sensitivity was lower and the Earth is not going to warm as much,” said Cristian Proistosescu, at Harvard University in the US, who led the new research. “There was this wave of optimism.”
The new research, published in the journal Science Advances, has ended that. “The worrisome part is that all the models show there is an amplification of the amount of warming in the future,” he said. The situation might be even worse, as Proistosescu’s work shows climate sensitivity could be as high as 6C.
Prof Bill Collins, at the University of Reading, UK, and not part of the new research, said: “Some have suggested that we might be lucky and avoid dangerous climate change without taking determined action if the climate is not very sensitive to CO2 emissions. This work provides new evidence that that chance is remote.” He said greater long term warming had implications for melting of the world’s ice sheets and the rise of sea levels that already threatens many coastal cities.
Among Charney’s group was Akio Arakawa, a pioneer of computer modeling. On the final night at Woods Hole, Arakawa stayed up in his motel room with printouts from the models by Hansen and Manabe blanketing his double bed. The discrepancy between the models, Arakawa concluded, came down to ice and snow. The whiteness of the world’s snowfields reflected light; if snow melted in a warmer climate, less radiation would escape the atmosphere, leading to even greater warming. Shortly before dawn, Arakawa concluded that Manabe had given too little weight to the influence of melting sea ice, while Hansen had overemphasized it. The best estimate lay in between. Which meant that the Jasons’ calculation was too optimistic. When carbon dioxide doubled in 2035 or thereabouts, global temperatures would increase between 1.5 and 4.5 degrees Celsius, with the most likely outcome a warming of three degrees.
The publication of Jule Charney’s report, “Carbon Dioxide and Climate: A Scientific Assessment,” several months later was not accompanied by a banquet, a parade or even a news conference. Yet within the highest levels of the federal government, the scientific community and the oil-and-gas industry — within the commonwealth of people who had begun to concern themselves with the future habitability of the planet — the Charney report would come to have the authority of settled fact. It was the summation of all the predictions that had come before, and it would withstand the scrutiny of the decades that followed it. Charney’s group had considered everything known about ocean, sun, sea, air and fossil fuels and had distilled it to a single number: three. When the doubling threshold was broached, as appeared inevitable, the world would warm three degrees Celsius. The last time the world was three degrees warmer was during the Pliocene, three million years ago, when beech trees grew in Antarctica, the seas were 80 feet higher and horses galloped across the Canadian coast of the Arctic Ocean.
https://www.nytimes.com/interactive/2018/08/01/magazine/climate-change-losing-earth.html
Climate goal of 1.5C is ‘gasping for breath’, says UN head
https://www.theguardian.com/environment/2022/dec/19/climate-goal-15c-gasping-breath-un-head-antonio-guterres