
Getting a grip on the grid

In this hypothetical scenario, red areas indicate vulnerable portions of the western North American power grid that network operators must address. Gray areas are causes for concern and green areas are safe. Overlaying risk-level data sets (here, 200 of them) allows operators to visualize the collective risk on the system.
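One way to picture that overlay is a rough sketch along the following lines; the 200-layer count comes from the caption, but the grid size, the averaging rule and the color thresholds are invented for illustration and do not describe the lab's actual pipeline.

import numpy as np

# Rough illustration of overlaying many risk-level data sets into one map.
rng = np.random.default_rng(seed=0)
layers = rng.random((200, 64, 64))      # 200 risk data sets on a 64 x 64 map
collective = layers.mean(axis=0)        # aggregate risk at each map cell
# Bin cells into green (safe), gray (concern) and red (must address);
# the threshold values here are assumptions, not published figures.
categories = np.digitize(collective, bins=[0.49, 0.51])
print(np.bincount(categories.ravel(), minlength=3))  # cell count per color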

When Zhenyu “Henry” Huang considers the grid of power sources and wires that supplies electricity to the United States, he doesn’t see a network so much as he sees a giant machine.

And it’s straining to keep up.

“The power grid is doing things today which it was not designed for,” says Huang, a senior research engineer in the Energy Technology Development Group at Pacific Northwest National Laboratory, a Department of Energy facility.

“If something serious happens in the system, it can wipe out a whole region in a few seconds,” Huang says. “That’s why we want to think about how we can respond to those situations more quickly, and that’s why we want to look into high-performance computers to help us.”

Huang and fellow PNNL researchers are developing ways for powerful computers to quickly analyze which failures can combine to most threaten the grid. When multiple components break down, the consequences can be devastating, as in the massive blackout that struck the northeastern United States and Ontario in 2003.
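A minimal sketch of the underlying idea, assuming a brute-force search written in Python with the networkx library (this is an illustration, not PNNL's actual code): it flags pairs of line outages that would split a toy network into islands.

from itertools import combinations

import networkx as nx

def screen_double_outages(grid: nx.Graph) -> list:
    """Flag pairs of lines whose simultaneous loss splits the grid."""
    critical = []
    for line_a, line_b in combinations(grid.edges, 2):
        trial = grid.copy()
        trial.remove_edges_from([line_a, line_b])
        if not nx.is_connected(trial):  # the outage pair islands the network
            critical.append((line_a, line_b))
    return critical

# Toy five-line network: buses are nodes, transmission lines are edges.
grid = nx.Graph([(1, 2), (2, 3), (3, 4), (4, 1), (1, 3)])
print(screen_double_outages(grid))

On a realistic grid with tens of thousands of lines, checking every outage pair this way blows up combinatorially, which is one reason the work leans on high-performance computers.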

The risk of repeating such a giant failure is increasing as the power system copes with a list of major challenges – including rising demand on a grid that already operates near capacity.

Most of the problems stem from power systems’ high-wire balancing act. “You cannot store a large amount of power like you can store computers in a warehouse,” Huang says. Demand and production must always be roughly equal. Too much or too little generation can cause overloads, brownouts and blackouts.

The addition of renewable energy sources like wind and solar tests that balance. Passing clouds can make solar generation fluctuate; winds can suddenly accelerate or die, boosting or idling turbines. Power companies must quickly redistribute traditional generation to compensate.
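As a back-of-the-envelope illustration of that compensating step, the sketch below ramps conventional units to cover a sudden drop in solar output. All plant names, limits and megawatt figures are made up for the example.

def redispatch(demand_mw, renewable_mw, units):
    """Ramp conventional units, in order, until generation meets demand."""
    shortfall = demand_mw - renewable_mw - sum(u["output_mw"] for u in units)
    for unit in units:
        if shortfall <= 0:
            break
        headroom = unit["max_mw"] - unit["output_mw"]
        ramp = min(headroom, shortfall)
        unit["output_mw"] += ramp
        shortfall -= ramp
    return shortfall  # anything left over signals a brownout risk

# Invented fleet: a passing cloud bank cuts solar from 150 MW to 60 MW
# against 500 MW of demand.
units = [
    {"name": "gas_peaker", "output_mw": 50, "max_mw": 200},
    {"name": "hydro", "output_mw": 300, "max_mw": 400},
]
print(redispatch(demand_mw=500, renewable_mw=60, units=units))  # prints 0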

Thinking fast

Power company employees and computers monitor the grid at large control centers. To cope with problems, they adjust generation and open or close lines and transformers to meet demand and reroute power.

That’s not good enough, Huang says. To avoid catastrophic incidents, utilities must be able to quickly analyze how failures affect the system.

A power system’s condition can change completely over a period of hours, but it usually does so gradually, says John Allemong, principal engineer for American Electric Power, an Ohio-based utility that owns the nation’s largest transmission system. However, “There are topological changes that occur fairly frequently,” he adds. “When those do occur it’s a good idea to get a handle on where you are fairly quickly. The ability to do so would be a plus.”



Thomas R. O'Donnell

Thomas R. O'Donnell is senior science writer at the Krell Institute and a frequent contributor to DEIXIS.

