Simulations of continuous, physics-based systems, such as chemical reactions or hydrodynamic phenomena, are relevant to many of the Laboratory’s missions. Models for these types of simulations use differential equations to track a system’s response, or activity, at every time step as it evolves. Alternatively, discrete event simulation is used to model complex, asynchronous systems in which events are represented as discontinuous state changes at distinct points in space and time, occurring at irregular intervals. Examples include simulations of traffic flow in a road system or data packets traveling through communication networks.
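For illustration, the core of a discrete event simulator can be sketched in a few lines of Python; the packet-arrival event and its random interarrival times below are hypothetical, chosen only to show the mechanics. Pending events wait in a time-ordered queue, and the simulation clock jumps directly from one event to the next rather than marching through fixed time steps.

    import heapq
    import random

    # Minimal discrete event loop: pending events sit in a time-ordered queue,
    # and the clock jumps from one event directly to the next instead of
    # advancing in fixed time steps.
    def simulate(end_time=10.0, seed=1):
        random.seed(seed)
        clock = 0.0
        queue = []   # entries are (timestamp, sequence number, handler)
        seq = 0

        def schedule(delay, handler):
            nonlocal seq
            heapq.heappush(queue, (clock + delay, seq, handler))
            seq += 1

        def packet_arrival():
            # Hypothetical event: a packet reaches a router, and the next
            # arrival is scheduled after a random, irregular interval.
            print(f"t = {clock:6.3f}  packet arrives")
            schedule(random.expovariate(1.0), packet_arrival)

        schedule(0.0, packet_arrival)
        while queue and queue[0][0] <= end_time:
            clock, _, handler = heapq.heappop(queue)
            handler()

    simulate()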
Livermore computer scientists David Jefferson and Steven Smith and physicist Peter Barnes are tapping into the power of the Laboratory’s high performance computing capabilities to perform parallel discrete event simulation (PDES) of real-world systems. “PDES can be used to model systems that have an abstract geometry and cannot be precisely mapped to a physical layout,” says Barnes. Recently, their work has focused on models that can more accurately and effectively simulate California’s electric grid. Smith states, “Using high performance computing, we are improving electrical grid operation to save money and increase reliability as well as enhance the system’s security posture.”
Traditionally, electricity produced in large power generation facilities, such as nuclear power plants, is delivered through transmission lines to substations, which then distribute it to consumers. However, additional distributed generation from resources such as solar photovoltaic devices and wind generators requires new sensors, controls, and communication networks to be added to the grid. As a result, the future smart grid will be an automated system with two-way flow of electricity and information.
In partnership with the California Energy Systems for the 21st Century (CES-21)—a collaboration that includes San Diego Gas and Electric, Southern California Edison, Pacific Gas and Electric, and Lawrence Livermore—and leveraging prior research funded by the Laboratory Directed Research and Development Program, Barnes and Smith are leading an effort with Brian Kelley and Jamie Van Randwyk to improve grid simulation tools.
“Current grid simulation tools separately model transmission and distribution,” says Smith. “As the energy system changes to become more integrated, these tools will be insufficient. For example, as clouds move across a rooftop solar photovoltaic array, the output power drops, requiring more power from the transmission network to meet demand. Our simulations can help operators adjust to these changes.”
Smith is the lead developer of ParGrid, a large-scale integrated dynamic simulator that couples the effects of transmission and distribution grids as well as the communication networks between them. The current approach of simulating transmission and distribution separately cannot address the complexity of emerging smart grid systems, and grid transformation requires simulations fast enough to support real-time grid operations. “Our long-term vision is to improve both hardware and algorithms to enable fast and high-fidelity dynamic simulations,” says Smith.
ParGrid was designed to federate multiple parallel simulators running across high performance computing nodes for large-scale power grid analysis. These distributed simulators communicate through asynchronous messages while maintaining a consistent global simulation time across all components. The approach ensures efficient and correct execution. ParGrid has the potential to conduct cross-domain analysis for a large-scale grid such as California’s, which includes 6,000 transmission substations connecting about 10 million customers.
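ParGrid’s synchronization details are not spelled out here, but the general idea of federating simulators under a shared notion of time can be sketched as follows; the federate names, events, and lookahead value are illustrative assumptions, not ParGrid’s actual design. Each federate processes only those events that fall below a globally agreed safe time, so a message from another federate can never arrive in its past.

    import heapq

    LOOKAHEAD = 0.5   # assumed minimum delay on any message between federates

    class Federate:
        """One simulator in the federation, with its own event queue and clock."""
        def __init__(self, name):
            self.name = name
            self.queue = []   # (timestamp, description) pairs kept in time order
            self.clock = 0.0

        def post(self, time, description):
            heapq.heappush(self.queue, (time, description))

        def next_time(self):
            return self.queue[0][0] if self.queue else float("inf")

        def advance(self, safe_time, others):
            # Process local events strictly earlier than the global safe time;
            # any message sent here is delayed by at least LOOKAHEAD, so it can
            # never land in another federate's past.
            while self.queue and self.queue[0][0] < safe_time:
                self.clock, description = heapq.heappop(self.queue)
                print(f"{self.name:12s} t={self.clock:5.2f}  {description}")
                for other in others:
                    other.post(self.clock + LOOKAHEAD, f"update from {self.name}")

    def run(federates, end_time):
        while True:
            # No federate can yet hold, or later receive, an event earlier than
            # the globally earliest pending event plus the lookahead.
            safe = min(f.next_time() for f in federates) + LOOKAHEAD
            if safe == float("inf") or safe > end_time:
                break
            for f in federates:
                f.advance(safe, [g for g in federates if g is not f])

    transmission = Federate("transmission")   # illustrative federate names
    distribution = Federate("distribution")
    transmission.post(0.0, "generator setpoint change")
    distribution.post(0.2, "feeder load change")
    run([transmission, distribution], end_time=2.0)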
“We are the first group to simulate such large systems in a time-accurate manner,” says Smith. Such a capability enhances the design process for electric utility wide-area control schemes, which can improve overall system security and reliability.
The team is also exploring a novel use of PDES to solve continuous problems related to the grid. “We usually have a trade-off with continuous simulations and PDES—depending on which is used, results are either inefficient or inexact. In a system where derivatives of variables change and actions change, simulations tend to be inefficient if time stepped. It is a reasonable hypothesis that we could mix continuous simulations and PDES to achieve more accurate results,” says Barnes. For example, ParGrid could use this mixed approach to simulate the control system as a PDES network and the flow of electricity as a continuous system.
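One way such a mix might look, in deliberately simplified form, is sketched below: the electrical state evolves continuously and is integrated only between discrete control events rather than at a fixed global time step. The first-order voltage dynamics, rate constant, and event schedule are illustrative assumptions, not a model drawn from ParGrid.

    import heapq

    def integrate(voltage, setpoint, duration, rate=2.0, dt=0.01):
        # Simple explicit-Euler relaxation of voltage toward its setpoint,
        # standing in for the continuous (differential equation) part.
        steps = max(1, int(duration / dt))
        for _ in range(steps):
            voltage += rate * (setpoint - voltage) * (duration / steps)
        return voltage

    def simulate(end_time=3.0):
        clock, voltage, setpoint = 0.0, 1.00, 1.00
        # Discrete control events (time, new setpoint), as might arrive from a
        # PDES model of the communication and control network.
        events = [(0.5, 0.95), (1.5, 1.02), (2.5, 1.00)]
        heapq.heapify(events)
        while events and events[0][0] <= end_time:
            event_time, new_setpoint = heapq.heappop(events)
            # Advance the continuous state only as far as the next event,
            # rather than stepping the whole system at a fixed global rate.
            voltage = integrate(voltage, setpoint, event_time - clock)
            clock, setpoint = event_time, new_setpoint
            print(f"t={clock:4.2f}  setpoint -> {setpoint:.2f}  voltage={voltage:.3f}")
        voltage = integrate(voltage, setpoint, end_time - clock)
        print(f"t={end_time:4.2f}  final voltage={voltage:.3f}")

    simulate()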
The success of the Laboratory’s PDES simulations relies in part on advances Livermore computer scientists have made in making the models more efficient. PDES simulations rely on the program’s ability to run event code both forward and in reverse, so that events processed out of order can be rolled back. Livermore developers have created Backstroke—an advanced application of the Laboratory’s ROSE compiler—that automatically generates reverse code from manually developed forward code. Backstroke saves time and effort, as programmers do not have to write code in both directions.
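Backstroke itself operates on C++ source code through the ROSE compiler, but the forward/reverse pairing it automates can be illustrated with a hand-written sketch. In the hypothetical Python example below, the forward event handler records just enough information to undo its own changes, and the reverse handler restores the state exactly if the event must be rolled back.

    # Illustration only: the forward handler stashes the old value it is about
    # to overwrite, and the reverse handler plays that record back so an
    # optimistically executed event can be undone.
    class Substation:
        def __init__(self):
            self.load = 100.0
            self.events_handled = 0

    def forward_add_load(state, event):
        # Forward event: apply the change and save what is needed for rollback.
        event["saved_load"] = state.load
        state.load += event["delta"]
        state.events_handled += 1

    def reverse_add_load(state, event):
        # Reverse event: restore exactly what the forward handler changed.
        state.load = event["saved_load"]
        state.events_handled -= 1

    s = Substation()
    e = {"delta": 12.5}
    forward_add_load(s, e)    # optimistic execution of the event...
    reverse_add_load(s, e)    # ...rolled back when an earlier event arrives late
    assert s.load == 100.0 and s.events_handled == 0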
Livermore is also part of the Extreme Scale Parallel Discrete Event Simulation (XPDES) consortium, which includes collaborators from Rensselaer Polytechnic Institute (RPI), the University of Illinois Urbana-Champaign (UIUC), and Georgia Tech. The consortium aims to improve the scalability of PDES. The team is currently working on combining RPI’s ROSS simulation engine with UIUC’s Charm++ software to create an object-oriented, asynchronous message-passing programming platform that improves load balancing by migrating model objects among the nodes of a supercomputer at runtime.
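The migration mechanism itself belongs to the runtime system rather than to model code, but the load-balancing decision can be sketched simply. In the illustrative Python example below, model objects carry a measured cost and a greedy pass moves the cheapest object off the busiest node whenever that reduces the peak load; the object names, costs, and node assignments are hypothetical.

    def rebalance(assignment, costs, passes=10):
        # assignment maps model object -> node; costs maps object -> measured work.
        nodes = set(assignment.values())
        for _ in range(passes):
            load = {n: 0.0 for n in nodes}
            for obj, node in assignment.items():
                load[node] += costs[obj]
            busiest = max(nodes, key=load.get)
            idlest = min(nodes, key=load.get)
            movable = [o for o, n in assignment.items() if n == busiest]
            if len(movable) <= 1:
                break                          # nothing useful left to move
            obj = min(movable, key=costs.get)  # cheapest object to migrate
            if load[idlest] + costs[obj] >= load[busiest]:
                break                          # no move can reduce the peak load
            assignment[obj] = idlest           # "migrate" the model object
        return assignment

    costs = {"feeder_a": 3.0, "feeder_b": 1.0, "substation_x": 5.0, "relay_1": 0.5}
    placement = {"feeder_a": 0, "feeder_b": 0, "substation_x": 0, "relay_1": 1}
    print(rebalance(placement, costs))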
PDES is also being used to study other, more volatile systems, including the stock market; materials science applications involving billions to trillions of atoms; and biological processes, such as the evolution of viruses. PDES experts are propelling the field forward and proving that simulation tools are key to understanding complex, asynchronous systems with many interacting components.