A weather forecasting question:

For the past two months we’ve had one heat wave after another here in West Virginia. It’s been awful. High temperatures are in the low 90s (Fahrenheit), with humidity pushing the heat index to around 100F. These conditions are 15 to 20 degrees above normal. Sometimes a heat wave goes on for four or five days at a time without a single break. Then there’s a one- or two-day break in the 80s, and then another heat wave. It’s now the middle of September, almost fall, and it continues. The forecast shows another heat wave next weekend, after the first day of fall.

Something odd that I’ve never seen before is happening this summer: forecast temperatures, both highs and lows, are consistently four or five degrees lower than the temperatures actually reached. The forecast says 90, but it reaches 95. The forecast says 87, but it reaches 92. The error never goes the other way.

As an engineer, I know that random errors behave like noise, scattering to either side of the correct value. If the error falls consistently on one side, there’s a systematic problem or calibration error. A quick sketch of what I mean is below.
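To put a number on it, here’s a back-of-the-envelope check using made-up temperatures (the values below are only illustrative, not my actual records). Random error alone would leave the average error near zero, with misses on both sides; a one-sided miss shows up as a nonzero mean error, i.e. a bias.

```python
# Hypothetical forecast vs. observed daily highs (degrees F) -- illustrative only.
forecast_highs = [90, 87, 91, 88, 92, 89, 90]
observed_highs = [95, 92, 95, 93, 96, 94, 95]

# Error for each day: observed minus forecast.
errors = [obs - fcst for fcst, obs in zip(forecast_highs, observed_highs)]

mean_error = sum(errors) / len(errors)                       # the bias
spread = (sum((e - mean_error) ** 2 for e in errors) / len(errors)) ** 0.5

print(f"errors: {errors}")
print(f"mean error (bias): {mean_error:+.1f} F")
print(f"spread of errors:  {spread:.1f} F")
# Pure noise would give a mean error near zero with errors on both sides.
# A mean error of +4 or +5 F, with no negative errors at all, is the
# signature of a systematic (calibration-type) bias, not random scatter.
```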

Here’s the question. I’ve been an amateur meteorologist for over 50 years, and this got me thinking. Long ago, before the powerful computer weather models of today, the historical average temperature was factored into a weather forecast. I don’t know if that’s still the case today. Might it be that I’m observing one of the effects of climate change? The climate is changing, today’s temperatures run above normal, and the historical weather data is biasing the forecasts several degrees too low. Is that what’s going on?
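To spell out the mechanism I’m imagining: I don’t know how today’s forecast products are actually put together, so the blending weight below is pure assumption on my part, but if a forecast mixed raw model guidance with the climatological normal for the date, a warming climate would drag the blend cool by a few degrees.

```python
# Toy illustration of my hypothesis only -- not how any real forecast system works.
def blended_forecast(model_high, climo_high, climo_weight=0.25):
    """Weighted blend of model guidance and climatology (all values in F).
    The 25% climatology weight is a made-up number for illustration."""
    return (1 - climo_weight) * model_high + climo_weight * climo_high

model_high = 95   # what the raw model guidance says the high will be
climo_high = 82   # historical average high for the date
print(blended_forecast(model_high, climo_high))
# -> 91.75, i.e. the published forecast comes out roughly 3 F cooler than
#    the model guidance whenever the day runs well above the old normals.
```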

I invite anyone who might know the answer to comment below. Thank you.