
Bright future

A power pole. (Photo: fir0002 | flagstaffotos.com.au via Wikimedia Commons.)

One mid-August afternoon in 2003, a cascade of events shut down the power grid from Ohio through much of the Northeast and Ontario, affecting an estimated 50 million people. In many places, the power returned by evening. But some areas went without electricity for four days, and others had rolling blackouts for more than a week. The cost ran into the billions of dollars.

A joint U.S.-Canadian task force concluded that both human and technological failures started the cascade when transmission lines began to fail, overloading the network’s remaining lines. Lacking moment-by-moment information, operators failed to divert a surge of power out of the system and could not prevent the blackout.

In the blackout’s aftermath, the North American power system, sometimes called the biggest machine on Earth, is turning into what utility planners call a “smart grid.”

The smart grid will provide the information and diagnostic tools operators need. Network-wide high-speed sensors will provide real-time readings from multiple points. High-level automation will apply algorithms to help make effective decisions. And this revamped machine will need high-performance computing to guide its development and keep it working.

New algorithms are becoming increasingly important as renewable energy sources, mainly wind and solar power, continue to grow and introduce random supply fluctuations.

Researchers charged with helping the grid become as reliable and cost-effective as possible face several challenges. An important example is the security-constrained optimization problem – maintaining, at every moment, the optimal level of vital but expensive reserve energy.

“You’re trying to compute the mix of energy that will not just minimize the cost, but you will minimize the cost and will survive the loss of any one asset,” explains Mihai Anitescu, a senior computational mathematician in the Laboratory for Advanced Numerical Software in the Mathematics and Computer Science Division at Argonne National Laboratory.
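The idea can be made concrete with a small sketch. The toy dispatch below is not Argonne’s code – the unit names, capacities, ramp rates, costs and demand are all invented – but it illustrates the same trade-off: enumerate possible generator outputs, keep only those that meet demand and still survive the loss of any single unit, then pick the cheapest.

```python
from itertools import product

# Hypothetical generating units: name -> (capacity MW, fast-ramp capability MW, cost $/MWh)
UNITS = {
    "coal":  (500, 50, 25),
    "gas_1": (400, 200, 45),
    "gas_2": (400, 200, 45),
}
DEMAND = 800  # MW of load to serve right now
STEP = 25     # dispatch granularity for this toy search

def survives_any_single_loss(dispatch):
    """N-1 test: after losing any one unit, can the others ramp up to replace it?"""
    for lost, output in dispatch.items():
        pickup = sum(min(UNITS[u][1], UNITS[u][0] - mw)
                     for u, mw in dispatch.items() if u != lost)
        if pickup < output:
            return False
    return True

def fuel_cost(dispatch):
    return sum(mw * UNITS[u][2] for u, mw in dispatch.items())

best = None
names = list(UNITS)
for levels in product(*(range(0, UNITS[u][0] + 1, STEP) for u in names)):
    dispatch = dict(zip(names, levels))
    if sum(levels) != DEMAND:
        continue                                   # must meet demand exactly
    if not survives_any_single_loss(dispatch):
        continue                                   # fails the security constraint
    if best is None or fuel_cost(dispatch) < fuel_cost(best):
        best = dispatch

print("cheapest N-1-secure dispatch:", best, "- hourly cost:", fuel_cost(best))
```

Real security-constrained dispatch is solved with optimization software rather than enumeration, and over networks with thousands of generators and lines, but the constraint being enforced is the same one Anitescu describes.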

Such computations are an important part of a project Anitescu leads, sponsored by the Department of Energy’s Advanced Scientific Computing Research program: Multifaceted Mathematics for Complex Energy Systems (M2ACS). He works with colleagues at Argonne, Pacific Northwest National Laboratory, Sandia National Laboratories, the University of Wisconsin-Madison and the University of Chicago to develop simulations and computational tools for a more stable and efficient power grid.

Among the tools the M2ACS team is developing are algorithms that gather data from throughout one of the four giant networks, or interconnects, that make up the grid. The algorithms run scenarios that enable operators to make intelligent decisions in time to prevent blackouts and equipment damage.

Any power grid is peppered by random influences, such as equipment failures and rising and falling consumer demand. Add the growing use of renewable energy sources, with their unpredictability, and the algorithms become even more complex. For example, “you do not know exactly how much wind you’re going to have,” Anitescu says. “It’s not like a coal plant, where you say, ‘Just give me one gigawatt.’ It’s going to be what it’s going to be. So you have to allow for more scenarios to deal with the uncertainty in renewable supply.”

The team has developed stochastic optimization algorithms – ones that account for randomness – that can generate tens of thousands of scenarios incorporating billions of random variables and billions of constraints on the demands placed on grid components. By narrowing the uncertainty in the outcomes that may follow a given operator decision, the algorithms help power facilities make more intelligent choices. That also means they can operate with smaller reserves, maintaining a stable system while saving money and reducing greenhouse gas emissions. The algorithms are scalable, meaning each facility can invest in only as much computer hardware as it needs.
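At toy scale, the scenario idea looks something like the sketch below. It is only an illustration – the numbers, the penalty for unserved energy and the brute-force search are all assumptions, and real stochastic programs optimize many decisions over thousands of scenarios at once – but it shows how sampling possible wind outcomes lets a planner choose a reserve level that minimizes expected cost rather than bracing for the single worst case.

```python
import random

random.seed(0)
FORECAST_WIND = 300.0       # MW expected from wind farms
RESERVE_COST = 10.0         # $/MWh to keep a megawatt of spare capacity spinning
SHORTFALL_PENALTY = 500.0   # $/MWh for energy the reserves fail to cover

# Sample plausible wind outcomes around the forecast (a stand-in for real forecast models).
scenarios = [max(0.0, random.gauss(FORECAST_WIND, 80.0)) for _ in range(10_000)]

def expected_cost(reserve_mw):
    """Cost of holding reserves plus the average shortfall penalty over all scenarios."""
    penalty = 0.0
    for wind in scenarios:
        deficit = FORECAST_WIND - wind                 # how far the wind falls short
        shortfall = max(0.0, deficit - reserve_mw)     # the part reserves cannot cover
        penalty += shortfall * SHORTFALL_PENALTY
    return reserve_mw * RESERVE_COST + penalty / len(scenarios)

# Pick the reserve level that balances the cost of spare capacity against the
# risk that a low-wind scenario leaves demand unmet.
best_reserve = min(range(0, 401, 5), key=expected_cost)
print("reserve level minimizing expected cost:", best_reserve, "MW")
```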

“We can now solve a large number of scenarios for the real-size networks,” says Anitescu, whose study showed networks would be able to incorporate “more uncertain sources such as wind and solar power without losing reliability.”

Exactly how to add more wind, solar and other power sources hinges on another set of tools the M2ACS team is developing: predictive models of the grid’s behavior.

Each time a generator quits, a transmission line goes down or a lightning strike delivers a dose of electricity to the grid, the system must adjust. Within each interconnect, generators are synchronized to put out their 60 cycles per second in unison. As generators accelerate to make up for a lost generator or transmission line or slow down when power demand drops, they can veer off that critical rate. Since the generators, transformers and other equipment are all designed to function at 60 cycles, an excursion too far off that frequency can damage equipment.
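A drastically simplified, single-machine sketch – not one of the team’s models, and with invented constants throughout – gives a feel for why those excursions matter: when a big block of generation suddenly disappears, frequency sags until governor response and load damping arrest the dip, and protective relays watch whether it strays outside a safe band.

```python
# Illustrative constants only -- a real interconnect model tracks thousands of machines.
F0 = 60.0         # nominal frequency, Hz
H = 5.0           # aggregate inertia constant, seconds
DROOP = 0.05      # 5 percent governor droop
DAMPING = 1.0     # load damping (per-unit power per per-unit frequency)
LOSS = 0.06       # generation suddenly lost, as a fraction of system load
HEADROOM = 0.10   # spare capacity governors can call on, per unit
TRIP_BAND = 0.5   # Hz deviation at which protective relays might act

dt, t, f = 0.01, 0.0, F0
while t < 20.0:
    dev = (f - F0) / F0                                     # per-unit frequency deviation
    governor = min(HEADROOM, max(-HEADROOM, -dev / DROOP))  # primary response, capped
    imbalance = -LOSS + governor - DAMPING * dev            # per-unit power imbalance
    f += dt * (F0 / (2.0 * H)) * imbalance                  # swing-equation-style update
    t += dt
    if abs(f - F0) > TRIP_BAND:
        print(f"t={t:5.2f} s  f={f:6.3f} Hz -- outside the ±{TRIP_BAND} Hz band; relays may act")
        break
else:
    print(f"frequency settled near {f:.3f} Hz and stayed inside the ±{TRIP_BAND} Hz band")
```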

To avoid expensive, lengthy shutdowns and repairs, many relays – the circuit breakers of the grid’s high-voltage realm – are set at conservative levels. They may cut off power even when the equipment could have kept running. “It’s what happened at the Super Bowl in 2013,” Anitescu says. “That blackout happened because somebody assigned the breaker too low of a value.”

The team’s predictive models aren’t for making on-the-fly decisions. Instead, they’re for predicting grid behavior under various conditions, then designing networks and setting the parameters for relays. “We are at the stage where we can now simulate an interconnect in real time or faster than real time,” Anitescu says. These tools will likely be further developed and become available for use in a few years, he predicts.

The M2ACS team thinks their tools and simulations can lead to a deeper understanding of the grid. Although humans built the world’s biggest machine, it may hold some mysteries. For example, researchers are still interpreting one observation from the 2003 blackout: Part of the grid did not fail that day. “The cascade did not go to a full blackout, although that is what the theory predicted,” Anitescu says. “An island emerged which was self-sustaining.”

The surprising resilience of a grid section suggests that a cascade may not necessarily lead to a blackout. Maybe the grid can be modified to “fail gracefully,” Anitescu says. “In other words, we don’t at the moment design for limited failure. We design for complete non-failure but then if a failure starts to grow, we don’t know quite what the system will do.”

Inspired by this idea, two M2ACS collaborators, Christopher DeMarco at the University of Wisconsin-Madison and Jonathan Weare at the University of Chicago, are borrowing mathematical tools from chemistry to determine how the grid can operate at different levels of connectivity, analogous to the different energy levels in a chemical compound. A step away from an all-or-nothing, on-or-off model of the grid might be a step toward a more responsive and reliable system.
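As a loose analogy – and emphatically not DeMarco and Weare’s method – the sketch below simulates a noisy system in a double-well energy landscape, the kind of toy problem chemists use to study metastable states. The system lingers in one well and occasionally hops to the other, much as a grid might settle into distinct, partially connected operating states rather than simply being “on” or “off.”

```python
import math, random

random.seed(1)

def slope(x):
    """Derivative of a double-well potential V(x) = (x**2 - 1)**2."""
    return 4.0 * x * (x * x - 1.0)

dt, noise, x = 0.01, 0.7, -1.0        # start in the left-hand well
steps = 200_000
time_left = time_right = hops = 0
well = -1                             # the well the system is currently committed to

for _ in range(steps):
    # Overdamped Langevin step: drift downhill plus random thermal kicks.
    x += -slope(x) * dt + noise * math.sqrt(dt) * random.gauss(0.0, 1.0)
    if x > 0.5 and well == -1:        # committed crossing into the right well
        well, hops = 1, hops + 1
    elif x < -0.5 and well == 1:      # committed crossing back to the left well
        well, hops = -1, hops + 1
    if well == -1:
        time_left += 1
    else:
        time_right += 1

print(f"time share left/right well: {time_left/steps:.2f} / {time_right/steps:.2f}, "
      f"hops between wells: {hops}")
```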

Today, the power grid is relatively calm but expensive to run. With growing renewable sources, it’s likely to be harder to tame. “It’s like not having realized the full potential,” Anitescu says. “But there’s going to be a lot more freedom – and people being able to use the energy as they want to use it. And we have to allow for that.”

Andy Boyles

The author is science media manager at the Krell Institute.
