How will climate change stress the power grid? Hint: Look at dew point temperatures

Photo illustration of electric transmission towers and the effect of global warming.

Study reveals new forecast model suggesting some energy providers may be underestimating future electricity demands

Release Date: September 24, 2018

Sayanti Mukherjee

BUFFALO, N.Y. — A new study suggests the power industry is underestimating how climate change could affect the long-term demand for electricity in the United States.

The research, published today in the journal Risk Analysis, was led by the University at Buffalo and Purdue University.

It describes the limitations of prediction models used by electricity providers and regulators for medium- and long-term energy forecasting. And it outlines a new model that includes key climate predictors — mean dew point temperature and extreme maximum temperature — that researchers say present a more accurate view of how climate change will alter future electricity demands.

“Existing energy demand models haven’t kept pace with our increasing knowledge of how the climate is changing,” says the study’s lead author Sayanti Mukherjee, PhD, assistant professor of industrial and systems engineering in UB’s School of Engineering and Applied Sciences. “This is troublesome because it could lead to supply inadequacy risks that cause more power outages, which can affect everything from national security and the digital economy to public health and the environment.”

“The availability of public data in the energy sector, combined with advances in algorithmic modeling, has enabled us to go beyond existing approaches that often exhibit poor predictive performance. As a result, we’re able to better characterize the nexus between energy demand and climate change, and assess future supply inadequacy risks,” says co-author Roshanak Nateghi, PhD, assistant professor of industrial engineering and environmental and ecological engineering at Purdue.

The limitations of existing models

The overwhelming majority of climate scientists predict global temperatures will rise throughout the 21st century. This is expected to increase the demand for electricity as more people turn to air conditioners to keep cool.

One of the most common energy modeling platforms used to predict future electricity demand — MARKAL, named after MARKet and ALlocation — does not consider climate variability.

Another common energy-economic model, the National Energy Modeling System, or NEMS, does consider the climate. However, it's limited to heating and cooling degree days. A heating degree day is a day when the average temperature falls below 65 degrees Fahrenheit (18 degrees Celsius). A cooling degree day is one when the average temperature rises above 65 degrees.

While there are different ways to measure heating and cooling degree days, they are most often calculated by adding the day’s high temperature to the day’s low temperature, and then dividing the sum by two. For example, a high of 76 degrees and a low of 60 degrees results in an average temperature of 68 degrees.

The trouble with this approach, Mukherjee says, is that it doesn’t consider time. For example, it could be 76 degrees for 23 hours and 60 degrees for one hour — yet the average temperature that day would still be recorded as 68 degrees.
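The degree-day arithmetic described above can be sketched in a few lines. This is a generic illustration of the conventional calculation, not code from the study; the 65 °F balance point is the standard convention the article mentions.

```python
# Degree days from a day's high and low temperatures, using the
# conventional 65 °F balance point described in the text.
BALANCE_POINT_F = 65.0

def degree_days(high_f: float, low_f: float) -> tuple[float, float]:
    """Return (heating_degree_days, cooling_degree_days) for one day."""
    mean = (high_f + low_f) / 2        # daily mean ignores duration, as noted above
    hdd = max(0.0, BALANCE_POINT_F - mean)  # heating: mean below 65 °F
    cdd = max(0.0, mean - BALANCE_POINT_F)  # cooling: mean above 65 °F
    return hdd, cdd

# The article's example: a high of 76 °F and a low of 60 °F average to 68 °F,
# whether the day spent one hour or 23 hours near the high.
print(degree_days(76, 60))  # (0.0, 3.0): three cooling degree days
```

Because only the high and low enter the calculation, two days with very different temperature profiles can produce identical degree-day totals, which is the limitation Mukherjee describes.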

“Moreover, the choice of an accurate balance point temperature is highly contentious, and there is no consensus in the research community on how best to select it,” says Mukherjee.

Dew point temperature is the key

To address these limitations, she and Nateghi studied more than a dozen weather measurements. They found that the mean dew point temperature — the temperature at which air is saturated with water vapor — is the best predictor of increased energy demand. The next best predictor is the extreme maximum temperature for a month, they say.
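Dew point is usually reported directly in weather data, but it can also be estimated from air temperature and relative humidity. The sketch below uses the well-known Magnus-Tetens approximation; it is offered as background on the quantity itself and is not taken from the study.

```python
import math

# Magnus-Tetens approximation for dew point over water.
# Standard constants for the approximation; valid roughly 0-60 °C.
A = 17.27       # dimensionless
B = 237.7       # °C

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Estimate dew point (°C) from air temperature (°C) and relative humidity (%)."""
    gamma = (A * temp_c) / (B + temp_c) + math.log(rel_humidity_pct / 100.0)
    return (B * gamma) / (A - gamma)

# At 100% relative humidity the air is saturated, so the dew point
# equals the air temperature; drier air has a lower dew point.
print(dew_point_c(25.0, 100.0))  # ~25.0
print(dew_point_c(25.0, 50.0))   # ~13.9
```

Because dew point reflects both heat and moisture, it tracks the muggy conditions that drive air-conditioning load more closely than temperature alone, which is consistent with the researchers' finding that it is the strongest predictor.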

The researchers combined these climate predictors with three other categories — the sector (residential, commercial and industrial) consuming the energy, weather data and socioeconomic data — to create their model.
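The article does not give the model's mathematical form, only that it combines climate predictors with sectoral and socioeconomic inputs via algorithmic modeling. As a rough illustration of the general idea of regressing demand on such predictors, here is an ordinary least-squares fit on synthetic data; every variable name, coefficient, and number below is invented for illustration and is not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # synthetic monthly observations (hypothetical)

# Hypothetical predictors mirroring the categories in the text
dew_point = rng.normal(50, 10, n)     # mean dew point temperature (°F)
extreme_max = rng.normal(85, 8, n)    # extreme maximum temperature (°F)
population = rng.normal(100, 5, n)    # socioeconomic proxy (thousands)

# Synthetic demand generated with made-up coefficients, plus noise,
# purely to exercise the fit below
demand = 2.0 * dew_point + 1.5 * extreme_max + 0.8 * population + rng.normal(0, 5, n)

# Least-squares fit: recover the coefficients from the noisy data
X = np.column_stack([np.ones(n), dew_point, extreme_max, population])
coef, *_ = np.linalg.lstsq(X, demand, rcond=None)
print(coef[1:])  # estimates close to the true values 2.0, 1.5, 0.8
```

The study's actual approach is more sophisticated, but the structure is the same: demand on the left, climate and socioeconomic predictors on the right, fit separately per sector.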

They applied the model to the state of Ohio and found that the residential sector is most sensitive to climate variability. With a moderate rise in dew point temperature, electricity demand could increase by up to 20 percent. The prediction jumps to 40 percent with a severe rise.

By comparison, the Public Utility Commission of Ohio (PUCO), which does not consider climate change in its models, predicts residential demand increases of less than 4 percent through 2033.

It’s similar in the commercial sector, where the researchers say demand could increase by up to 14 percent. Again, PUCO’s projection is lower, at 3.2 percent. The industrial sector is less sensitive to temperature variability; however, the researchers say demand could still exceed projections.

During the winter months, the variation between the models is less significant. That is due, in part, to the relatively low percentage (22.6 percent) of Ohio residents who heat their homes with electricity.

While the study is limited to Ohio, the researchers say the model can be applied to other states. To communicate results, the researchers used heat maps, which provide an immediate visual summary of the data through color. The idea, they say, is to better inform decision makers with accurate, easy-to-understand information.
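A heat map of the kind described can be produced in a few lines of matplotlib. The sector and scenario labels echo the article, but the numbers in the grid below are placeholders for illustration, not the study's results.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Illustrative data only (NOT the study's figures): percent increase
# in electricity demand by sector and warming scenario.
sectors = ["Residential", "Commercial", "Industrial"]
scenarios = ["Mild", "Moderate", "Severe"]
pct_increase = np.array([[5, 20, 40],   # residential (placeholder values)
                         [3,  8, 14],   # commercial
                         [1,  3,  6]])  # industrial

fig, ax = plt.subplots()
im = ax.imshow(pct_increase, cmap="YlOrRd")
ax.set_xticks(range(len(scenarios)), labels=scenarios)
ax.set_yticks(range(len(sectors)), labels=sectors)
# Annotate each cell so the map reads at a glance
for i in range(len(sectors)):
    for j in range(len(scenarios)):
        ax.text(j, i, f"{pct_increase[i, j]}%", ha="center", va="center")
fig.colorbar(im, ax=ax, label="Demand increase (%)")
fig.savefig("demand_heatmap.png")
```

Color encodes magnitude, so a decision maker can spot the most climate-sensitive sector and scenario without reading a table.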

The research was funded in part by the Purdue Climate Change Research Center and the National Science Foundation.

Media Contact Information

Cory Nealon
Director of News Content
Engineering, Computer Science
Tel: 716-645-4614
cmnealon@buffalo.edu
Twitter: @UBengineering