Banishing blackouts

October 2020
Filed under: Argonne
An Argonne researcher upgrades supercomputer optimization algorithms to boost reliability and resilience in U.S. power systems.

Early Career awardee Kibaek Kim’s research thrust: keeping the lights on. (Photo: Argonne National Laboratory.)

The most extensive blackout in North American history occurred on Aug. 14, 2003, affecting an estimated 55 million people in eight U.S. states across the Midwest and Northeast and the Canadian province of Ontario.

Such high-impact outages are rare, and Kibaek Kim, a computational mathematician at Argonne National Laboratory, aims to keep them that way. He’s in the second year of a five-year, $2.5 million Department of Energy Office of Science Early Career Research Award to develop data-driven optimization algorithms that account for power grid uncertainties.

“Right now, one of the issues in the grid system is that we have a lot of distributed energy resources,” says Kim, who received his doctorate from Northwestern University before joining Argonne’s Mathematics and Computer Science Division. Such resources include solar panels, electric storage systems, or other energy-related devices located at privately owned homes or commercial buildings. “These resources are not controllable by system operators – utilities, for example. These are considered as uncertain generating units.”

Such components introduce uncontrollable and potentially risky uncertainties into the grid. If the system somehow becomes imbalanced, “it can lead to power outages and even wider-area blackouts,” Kim notes. “The question we’re trying to address is how to avoid these blackouts or outages, how to make the system reliable or resilient.”

In 2004, the U.S.-Canada Power System Outage Task Force issued a report on the causes of the 2003 blackout and made recommendations to prevent similar events. “Providing reliable electricity is an enormously complex technical challenge, even on the most routine of days,” the report says. “It involves real-time assessment, control and coordination of electricity production at thousands of generators, moving electricity across an interconnected network of transmission lines, and ultimately delivering the electricity to millions of customers by means of a distribution network.”

Kim’s project builds on experience he gathered in recent Advanced Grid Modeling program collaborations with researchers in Argonne’s Energy Systems Division. Bolstered by access to Argonne’s Laboratory Computing Resource Center, his initiative relies on machine learning to help make grid planning and operation more reliable. Just last year he led one of 18 teams, drawn from national laboratories, industry and academia, selected to compete in the first Grid Optimization Competition, sponsored by DOE’s Advanced Research Projects Agency-Energy.

His new research aims to more fully integrate techniques from two fields that so far have been only loosely connected: numerical optimization on one side and data science and statistics on the other.

“I see this as a new paradigm,” Kim says. “As we see more and more data available from sensors, computing and simulation, this is probably the time to integrate the two areas.”

The work will include investigating what he calls “statistically robust design and control” mechanisms for complex systems, particularly data sampling in the energy sector. “Systems decisions would be made with future uncertainties in mind,” he says.

Kim will use a sophisticated tool called distributionally robust multistage stochastic optimization to deal with data uncertainty. “This is a relatively new area that considers realistic situations where we can make decisions over time, sequentially, with streaming observation of data points for uncertainties.”
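
In generic terms, a two-stage version of this approach chooses decisions that hedge against the worst-case probability distribution in an ambiguity set. The sketch below is a textbook formulation, not Kim’s specific model, and the symbols are our own notation.

```latex
% Generic two-stage distributionally robust stochastic program:
% pick first-stage decisions x before the uncertain data \xi are revealed,
% protecting against the worst distribution P in an ambiguity set \mathcal{P}.
\[
  \min_{x \in X} \; \max_{P \in \mathcal{P}} \;
    \mathbb{E}_{\xi \sim P}\!\left[ c^{\top} x + Q(x, \xi) \right]
\]
% Q(x, \xi) is the optimal cost of the recourse problem solved after \xi
% is observed; the multistage variant nests this min-max across a sequence
% of decision and observation stages, matching streaming data.
```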

The optimization algorithms Kim has previously developed and now proposes for his current project will decompose, or systematically split, a large problem into smaller ones. “The adaptive decomposition that I propose here would dynamically adjust the structures, meaning we have a larger piece that would dynamically change the smaller pieces based on algorithm performance.”
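
As a rough illustration of the master-and-subproblem pattern behind such methods, here is a minimal dual-decomposition sketch in Python. It is a generic textbook scheme applied to a toy dispatch problem, not Kim’s adaptive algorithm, which would additionally merge or split the pieces as the run progresses.

```python
# Minimal dual-decomposition sketch (generic textbook scheme, not Kim's
# adaptive algorithm). Toy model: n generators minimize
# sum_i 0.5 * a_i * x_i^2 subject to the coupling constraint
# sum_i x_i = demand. A price (Lagrange multiplier) on that constraint
# lets each generator's subproblem be solved independently.

def dual_decomposition(a, demand, steps=200, step_size=0.5):
    price = 0.0
    for _ in range(steps):
        # Each smaller piece solves its own problem given the price:
        # argmin_x 0.5 * a_i * x^2 - price * x  =>  x = price / a_i.
        x = [price / a_i for a_i in a]
        # Master step: move the price toward balancing supply and demand.
        price += step_size * (demand - sum(x))
    return x, price

x, price = dual_decomposition(a=[1.0, 2.0, 4.0], demand=7.0)
print([round(v, 3) for v in x], round(price, 3))  # -> [4.0, 2.0, 1.0] 4.0
```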

Kim’s work takes into account how electrical grid systems can be disrupted by extreme weather, adversarial attacks or decentralized control mechanisms, including unexpected consumption patterns. Once a system is disrupted, the control mechanism would act to rapidly restore it to normal operations.

Motivating Kim’s interest in the project is the increasing availability of sensor data that flow from the grid’s many power systems and related computer networks. The nation’s electrical systems are increasingly incorporating smart-grid sensors, energy storage, microgrids and distributed-energy resources such as solar panels. His work also incorporates data from an Argonne-based National Science Foundation-supported project called the Array of Things. It collects and processes real-time power-use and infrastructure data from geographically scattered sensors.

The Aurora supercomputer. (Photo: Argonne National Laboratory.)

This trend presents optimization problems requiring algorithms that can operate at multiple scales on powerful computers. Specifically, Kim is developing methods for this purpose to run on the Aurora supercomputer, scheduled for delivery to Argonne next year. It will be one of the nation’s first exascale machines, capable of performing a quintillion calculations per second, almost 100 times faster than Argonne’s current top supercomputer.

Processing ever-shifting variables in the vast, highly complex U.S. power grid requires Aurora’s high-performance computing brawn. The grid comprises some 20,000 power plants and 60,000 substations linked by 3 million miles of power lines carrying electricity at nearly the speed of light; an outage in just a few of those components can spawn more than a billion potential failure scenarios.
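
The billion-scenario figure is plausible from simple counting. For instance (our back-of-the-envelope arithmetic, not the article’s), the number of ways just two of roughly 80,000 plants and substations could fail simultaneously already tops three billion:

```python
import math

# Pairwise (N-2 style) outage combinations among ~20,000 plants
# and ~60,000 substations; a rough illustration of scenario growth.
components = 20_000 + 60_000
print(math.comb(components, 2))  # 3,199,960,000 two-component scenarios
```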

Kim will adapt his algorithms for parallel computing on machines like Aurora to sift through this massive data flow, a departure from current serial-computing tools that are ill-equipped to make full use of high-performance computers for optimization.
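
The parallel pattern itself is straightforward to sketch, though the example below is hypothetical and greatly simplified: independent scenario subproblems are farmed out to worker processes rather than solved one at a time.

```python
from concurrent.futures import ProcessPoolExecutor

def solve_scenario(scenario: int) -> float:
    # Stand-in for one scenario subproblem; a real solver call goes here.
    return float(scenario * scenario)

if __name__ == "__main__":
    scenarios = range(1_000)
    # Workers solve scenarios concurrently instead of in serial order.
    with ProcessPoolExecutor() as pool:
        costs = list(pool.map(solve_scenario, scenarios))
    print(sum(costs) / len(costs))  # aggregate, e.g., an expected cost
```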

His research will encompass two applications: resilient smart distribution systems and distributed learning, which will “serve as a support framework for processing data such as building energy consumption and ambient weather conditions through design and control systems for smart grids,” he notes. “With the data-support system we should be able to rapidly control these smart grids, particularly for contingency situations.”


About the Author

Steve Koppes is a former science writer and editor with the University of Chicago, Arizona State University and the University of Georgia. He’s been a frequent contributor to Purdue University’s College of Agriculture and is the award-winning author of Killer Rocks From Outer Space.
