Unless you've been living under the Geico rock, you know that the US is experiencing one of the coldest early winters on record. It's been fascinating watching the grid in New England, a region where CAVE dwellers have adamantly opposed natural gas, apparently preferring oil to produce electricity instead.
Previous posts include this one. That's an important post because it provides a bit of an explanation of why oil might be preferred over natural gas in New England (and it doesn't have to do with the environment). But I digress.
A reader pointed out that when electricity demand is at its greatest, the region maxes out on nuclear, coal, and natural gas, and ever-increasing amounts of oil are required. I had not noticed that until the reader pointed it out, and, subsequently, a writer over at Forbes made the same observation.
Now, a new wrinkle. In the graph below, note that when electricity demand was at its greatest:
nuclear and coal maxed out; natural gas was likely soon to max out (based on earlier data); and, just when the region most needed additional sources of electricity, electricity from renewable sources actually decreased significantly. Perhaps overcast, cloudy conditions degraded solar output, and high winds and extreme cold degraded wind farm output.
Dynamic link here.
Here is the spot price of electricity
($285/MWh) just when the region needed electricity most, and renewable
energy was unable to provide the extra electricity needed.
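Just to put that spot price in perspective, here is a quick back-of-the-envelope sketch. The $285/MWh figure is from above; the regional load number is a made-up placeholder for illustration only, not a reported figure.

```python
# Rough sketch only. The $285/MWh spot price is the figure cited above;
# the 20,000 MW load is a hypothetical placeholder, not actual ISO-NE data.

spot_price_per_mwh = 285.0                    # $/MWh, spot price cited above
price_per_kwh = spot_price_per_mwh / 1000.0   # 1 MWh = 1,000 kWh

assumed_load_mw = 20_000                      # MW, illustrative assumption
hourly_cost = spot_price_per_mwh * assumed_load_mw

print(f"Spot price: ${price_per_kwh:.3f}/kWh ({price_per_kwh * 100:.1f} cents/kWh)")
print(f"At an assumed {assumed_load_mw:,} MW load, one hour at that price "
      f"runs roughly ${hourly_cost:,.0f}")
```

In other words, $285/MWh works out to 28.5 cents/kWh at the wholesale level, before transmission, distribution, and everything else gets added on.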
And, of course, it should be noted that whatever renewable sources are there, they all have to be backed up by conventional sources, such as oil, coal, or natural gas. And that back-up has to be available immediately, so coal, natural gas, or oil is burning just to keep the back-up ready to go the moment it's needed.
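A toy dispatch calculation makes the point. This is illustrative only, not how ISO-NE actually schedules generation, and every number in it is a made-up placeholder.

```python
# Illustrative arithmetic only; all figures are hypothetical placeholders.

demand_mw = 20_000            # hypothetical system demand
renewable_output_mw = 1_200   # hypothetical wind + solar output at that hour
renewable_drop_mw = 600       # hypothetical sudden drop (clouds, icing, high-wind cutouts)

# Dispatchable generation (nuclear/coal/gas/oil) must cover demand minus renewables...
dispatchable_needed_mw = demand_mw - renewable_output_mw

# ...and enough reserve must already be online, burning fuel, to absorb a sudden
# renewable drop the moment it happens.
spinning_reserve_mw = renewable_drop_mw

print(f"Dispatchable generation needed: {dispatchable_needed_mw:,} MW")
print(f"Reserve kept ready, just in case: {spinning_reserve_mw:,} MW")
```

The point of the sketch: the back-up capacity has to be running (or at least hot) before the renewable output drops, which is why the conventional units keep burning fuel even when the wind and sun are cooperating.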