
Weather Bee | More than one dataset can breach the 1.5°C threshold this year

Nov 17, 2024 09:00 AM IST

Divergence between datasets means that the breach of the 1.5°C threshold can be a staggered event with different datasets reporting the event in different years

The Copernicus Climate Change Service (C3S) forecast last week that 2024 is almost certain to be the first calendar year in which the world’s average temperature in its ERA5 dataset is more than 1.5°C warmer than the pre-industrial average. The importance of this event was explained in this column last week. What is concerning is that ERA5 is unlikely to be the only dataset in which global temperature this year is at least 1.5°C warmer than the pre-industrial average. An HT analysis shows a good likelihood of this happening in at least two other datasets. Here is why.

Raisen, Nov 13 (ANI): A hot air balloon flies over UNESCO’s World Heritage Site, the Sanchi Stupa, during a hot air balloon show. (ANI Photo/Sanjeev Gupta)

Before explaining how multiple datasets are likely to breach the 1.5°C threshold this year, it is important to understand why there are multiple datasets for global temperature at all, and why the level of warming appears different in each of them. Multiple datasets exist because temperature measurements are not spread uniformly across the globe, and the different ways of filling in the gaps to make the data uniform produce differences in the resulting global averages. This is especially true of temperatures in the pre-industrial period (usually taken as 1850-1900), where the geographical holes in the data are bigger. That is why datasets disagree more on the pre-industrial average than on current temperatures or the rate of warming.
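A toy illustration of how the choice of infilling method alone can shift the global average: the grid values and the two fill rules below are hypothetical simplifications, not how any of these datasets is actually built.

```python
# Toy example: the same sparse observations can yield different "global" means
# depending on how unobserved grid cells are filled in. Purely illustrative;
# real datasets use far more sophisticated, area-weighted methods.

cells = [1.2, 1.4, None, None, 1.7, None]  # anomalies (°C); None = no coverage

# Rule A: average only the cells that have observations.
observed = [v for v in cells if v is not None]
mean_observed_only = sum(observed) / len(observed)

# Rule B: fill each empty cell with its nearest observed neighbour, then average all cells.
def nearest_fill(values):
    filled = []
    observed_idx = [j for j, w in enumerate(values) if w is not None]
    for i, v in enumerate(values):
        if v is not None:
            filled.append(v)
        else:
            nearest = min(observed_idx, key=lambda j: abs(j - i))
            filled.append(values[nearest])
    return filled

mean_infilled = sum(nearest_fill(cells)) / len(cells)

print(round(mean_observed_only, 2), round(mean_infilled, 2))  # 1.43 vs 1.52
```

The two rules agree on every observed value and still disagree on the global mean; with the sparse coverage of the 19th century, such methodological choices matter even more.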

The divergence between datasets also means that the breach of the 1.5°C threshold can be a staggered event, with different datasets reporting it in different years. For example, of the five prominent global temperature datasets analysed here, one breached the threshold in 2023 itself; three others came close in 2023 and would appear to breach it if the temperature deviation were rounded to one decimal place; and one was still around 0.2°C short of the threshold last year.

In the data we have for 2024 so far (nine months in three of the five datasets and ten months in two), the average warming is above the 1.5°C threshold in all but the dataset produced by NOAA. This means that most of them will breach the threshold for the year as a whole even if the remaining months are less than 1.5°C above the pre-industrial average.
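To see why a partial-year lead can be decisive, here is a rough arithmetic sketch in Python; the 1.6°C and ten-month figures are hypothetical placeholders, not values from any of these datasets.

```python
# Sketch: how the months already observed constrain the annual mean.
# All numbers here are hypothetical; they are not taken from ERA5, NOAA,
# or any other dataset discussed in the column.

def full_year_mean(mean_so_far, months_so_far, mean_remaining):
    """Weighted average of the months already observed and those remaining."""
    remaining = 12 - months_so_far
    return (mean_so_far * months_so_far + mean_remaining * remaining) / 12

def required_remaining_mean(mean_so_far, months_so_far, threshold=1.5):
    """Average the remaining months must stay below for the year to finish under the threshold."""
    remaining = 12 - months_so_far
    return (threshold * 12 - mean_so_far * months_so_far) / remaining

# If the first ten months averaged 1.6°C above pre-industrial, even remaining
# months at exactly 1.5°C leave the year above the threshold...
print(round(full_year_mean(1.6, 10, 1.5), 2))       # 1.58

# ...and the last two months would have to average just 1.0°C to avoid a breach.
print(round(required_remaining_mean(1.6, 10), 2))   # 1.0
```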

Since we know the average deviation required in the remaining part of the year for breaching the 1.5°C threshold, we can also calculate how much sequential temperature change is required. For example, the data produced by Berkeley Earth will show a breach unless the average temperature in October-December is at least 0.42°C cooler than in July-September. The data produced by NOAA, on the other hand, will not show a breach unless the last three months are 0.42°C warmer than July-September.
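The sequential-change calculation itself is simple: compare the average the remaining months would need with the average of the most recent comparable period. A minimal sketch, again with hypothetical numbers rather than the actual Berkeley Earth or NOAA figures:

```python
# Sketch of the sequential-change calculation. The anomaly values below are
# hypothetical stand-ins, not the figures cited for any dataset.

def required_sequential_change(mean_so_far, months_so_far, recent_period_mean,
                               threshold=1.5):
    """Change the remaining months need relative to the most recent comparable
    period (negative = cooling) for the annual mean to land exactly on the threshold."""
    remaining = 12 - months_so_far
    required_remaining = (threshold * 12 - mean_so_far * months_so_far) / remaining
    return required_remaining - recent_period_mean

# Hypothetical example: nine months averaging 1.62°C and a July-September mean of 1.55°C.
# The result, about -0.41°C, means October-December would have to be that much cooler
# than July-September for the year to finish exactly at 1.5°C.
print(round(required_sequential_change(1.62, 9, 1.55), 2))  # -0.41
```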

Calculating the required sequential temperature change is useful because the global average temperature does not change quickly by large amounts. If a dataset requires the world to cool down by a large margin for the breach not to happen, that dataset will most likely record a breach.

For example, the GISTEMP data produced by NASA requires the world to cool down by at least 0.28°C in November-December compared to September-October. Such sequential cooling has occurred in only 31 of the 1,735 sequential changes available in the data (1.8%), which means the likelihood of GISTEMP not recording a breach is low. ERA5, whose dataset begins in 1940, has never recorded the sequential cooling of 0.54°C it would require. This is one reason why C3S is almost certain of a 1.5°C breach in its dataset this year. As the accompanying chart shows, on the basis of past statistics alone, the probability of the breach not happening is very low in three out of five datasets.
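This kind of historical frequency can be computed directly from a monthly anomaly series. A rough sketch, run here on randomly generated placeholder data rather than the actual GISTEMP record:

```python
# Sketch: count how often the mean of a two-month window has been cooler than the
# mean of the preceding two-month window by at least a given amount.
# `monthly_anomalies` is randomly generated placeholder data, not GISTEMP.
import random

random.seed(0)
monthly_anomalies = [random.gauss(0.0, 0.3) for _ in range(1740)]

def sequential_cooling_frequency(series, drop=0.28, window=2):
    changes = []
    for i in range(window, len(series) - window + 1):
        prev_mean = sum(series[i - window:i]) / window
        curr_mean = sum(series[i:i + window]) / window
        changes.append(curr_mean - prev_mean)
    count = sum(1 for c in changes if c <= -drop)
    return count, len(changes), count / len(changes)

count, total, share = sequential_cooling_frequency(monthly_anomalies)
print(f"{count} of {total} two-month transitions cooled by at least 0.28°C ({share:.1%})")
```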

To be sure, calendar years have little meaning for processes such as global warming. For example, the 12-month running mean of global temperature has already crossed 1.5°C of warming in four of the five datasets analysed here. This means that we are already past the point where warming averaged over a period as long as a year exceeds 1.5°C. Unless there is course correction (and the 29th Conference of Parties, or COP29, currently unfolding in Baku does not suggest one is coming), it is only a matter of time before the world becomes more than 1.5°C warmer than the pre-industrial average even in the long term.
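Checking for a running-mean breach is straightforward once a monthly series is in hand. A minimal sketch on made-up data, not on any of the five datasets:

```python
# Sketch: find the first month whose trailing 12-month mean anomaly exceeds 1.5°C.
# `series` is a made-up monthly anomaly series, not real data.

def first_running_mean_breach(series, threshold=1.5, window=12):
    for i in range(window, len(series) + 1):
        if sum(series[i - window:i]) / window > threshold:
            return i - 1  # index of the last month in the first breaching window
    return None

series = [1.3] * 24 + [1.6] * 18  # hypothetical monthly anomalies (°C)
print(first_running_mean_breach(series))  # 32: the ninth 1.6°C month tips the mean over
```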

Abhishek Jha, HT’s senior data journalist, analyses one big weather trend in the context of the ongoing climate crisis every week, using weather data from ground and satellite observations spanning decades.
