Your utility is probably blind and afraid to admit it.
Historically, utilities have had virtually no visibility beyond the substation into how much power is being consumed, by whom, at any given point in time. Lights flickering? It's often a symptom of low voltage.
To get around the problem, utilities typically “over-juice” their customers to ensure that the minimum threshold is being met. Utilities in the U.S. are required to deliver power to consumers at 120 volts, plus or minus 5 percent.
Since voltage gradually drops along a distribution feeder line as the cumulative load increases, power must leave the substation at a high enough voltage that the very last customer on the line still receives at least the 114-volt minimum during peak load, while customers at the front of the line receive no more than 126 volts.
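The constraint can be pictured with a toy calculation. The sketch below uses hypothetical numbers (a uniform 0.1-volt drop per customer and a 125-volt sendout; real feeders are not this tidy) to check whether every customer along the line stays inside the 114-to-126-volt band.

```python
# Toy model: linear voltage drop along a distribution feeder.
# All numbers below (sendout voltage, customer count, drop per
# customer) are hypothetical, chosen only to illustrate the band check.

SERVICE_MIN, SERVICE_MAX = 114.0, 126.0  # allowed service-voltage band

def feeder_voltages(sendout: float, customers: int, drop_per_customer: float):
    """Voltage seen at each customer position, assuming a uniform drop."""
    return [sendout - i * drop_per_customer for i in range(1, customers + 1)]

volts = feeder_voltages(sendout=125.0, customers=100, drop_per_customer=0.1)
print(f"first customer: {volts[0]:.1f} V")   # 124.9 V
print(f"last customer:  {volts[-1]:.1f} V")  # 115.0 V
print("within band:", all(SERVICE_MIN <= v <= SERVICE_MAX for v in volts))
```

Push the sendout voltage much lower and the last customer falls below 114 volts; push it higher and the first customer exceeds 126, which is exactly the squeeze that motivates "over-juicing."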
The result of this practice, however, is not just higher (and arguably unfair) bills, but also more greenhouse gas emissions and all the other problems that come with unnecessarily burning fossil fuels.
Reducing voltage (a measure of electric potential difference, not of energy delivered over time) by 1 volt on the feeder lines that run from substations to businesses and homes yields between 0.8 and 1.2 percent savings in total load. So dropping voltage from 120 to 118 could allow utilities to reduce the power they deliver by nearly two percent.
Few would call two percent a huge gain, but accumulated over twenty-four hours a day, 365 days a year, those gains could add up to gigawatt-hours of saved energy. And since we have yet to identify a magic-bullet solution to our current energy challenges, conservation voltage reduction techniques deserve to be explored.
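A back-of-the-envelope sketch makes the scale concrete. Everything here is hypothetical except the 0.8-to-1.2-percent-per-volt range quoted above: the example assumes a single feeder carrying a steady 10 MW load and a 2-volt reduction held around the clock.

```python
# Rough annual energy savings from conservation voltage reduction.
# The CVR factor range (0.8 to 1.2 percent load reduction per volt)
# comes from the text; the 10 MW feeder load and round-the-clock
# reduction are illustrative assumptions.

CVR_LOW, CVR_HIGH = 0.008, 0.012  # fractional load reduction per volt

def annual_savings_mwh(load_mw: float, delta_v: float, cvr_per_volt: float) -> float:
    """Energy saved per year, assuming the load reduction holds all year."""
    return load_mw * delta_v * cvr_per_volt * 24 * 365

low = annual_savings_mwh(load_mw=10.0, delta_v=2.0, cvr_per_volt=CVR_LOW)
high = annual_savings_mwh(load_mw=10.0, delta_v=2.0, cvr_per_volt=CVR_HIGH)
print(f"annual savings: {low:,.0f} to {high:,.0f} MWh")  # 1,402 to 2,102 MWh
```

One mid-sized feeder saves on the order of 1 to 2 GWh a year under these assumptions; multiplied across the thousands of feeders a large utility operates, the gigawatt-hours add up quickly.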
Although voltage correction like this might seem mundane, it is actually rather revolutionary. And imagine what could be accomplished in Europe or Asia, where 220-volt service is the rule.
Smart meters can help by giving grid operators precise voltage measurements at the points of consumption, so the voltage can be corrected from the substation rather than guessed at. A more flexible, dynamic grid would also allow variable generation sources like wind and solar to be added while keeping system voltage stable.
But consumer advocacy groups will ask: will the public benefit, or is this an improvement that helps a private utility at consumers' expense? Are there techniques that would yield greater savings for less investment? And flickering or outages during an upgrade could lead to complaints and lawsuits.
Is it worth tackling this problem when there are so many other issues with the grid? What do you think?