Lawrence Livermore National Laboratory (LLNL) has named Stefanie Guenther as Computing’s fourth Sidney Fernbach Postdoctoral Fellow in the Computing Sciences. This highly competitive fellowship is named after LLNL’s former Director of Computation and is awarded to exceptional candidates who demonstrate the potential for significant achievements in computational mathematics, computer science, data science, or scientific computing.
For the two-year fellowship, Guenther will work in the Center for Applied Scientific Computing (CASC) with her mentor, Rob Falgout, project leader and main software architect for the parallel multigrid-in-time integration project XBraid. She will concentrate on innovative numerical methodologies that leverage insights from optimal control with partial differential equations (PDEs) to improve and accelerate current learning models with deep neural networks specifically designed for scientific applications. “I plan to develop parallel-in-time optimization methods that utilize high performance clusters for distributing the neural network layers onto different compute units using multigrid techniques. CASC is a good place to do that. It has the software and HPC architectures, as well as the experts in one place, which is an ideal environment for my research,” says Guenther.
Guenther’s research focuses primarily on scientific machine learning (SciML), a new, evolving field that merges scientific laws built upon physical knowledge with automated data analysis from machine learning. In computational science and engineering applications, SciML leverages the tremendous success of data-driven machine learning models to enhance physics-based simulations. “I’m developing expertise in an area where there’s been lots of development, but knowledge transfer has been slow. I hope to bring the two together to improve machine learning interpretability and accelerate learning by leveraging theory and numerics from the well-established field of optimal control,” says Guenther.
In 2019, Guenther began her postdoctoral research at CASC with a focus on parallel-in-time integration and optimization methods for differential algebraic equations (DAEs) in electric power grid applications. Before her LLNL postdoc work, she focused on layer-parallelization for deep residual networks and time-parallel adjoint sensitivities using automatic differentiation techniques in a postdoctoral appointment in the Scientific Computing Group at Technische Universität Kaiserslautern in Germany. She received her Ph.D. in Applied Mathematics from Rheinisch-Westfälische Technische Hochschule Aachen, also in Germany, working on numerical optimization methods with PDEs for optimal shape design in aerodynamics, and has had undergraduate internships at Argonne National Laboratory (ANL) and Massachusetts Institute of Technology (MIT).
“As an emergent field, machine learning and deep learning can be presented as an optimal control problem. People are looking at discrete neural networks as a continuous ordinary differential equation (ODE). Numerical optimization with ODEs then lies at the heart of the training process,” Guenther explains. “My work will aim to develop a full multigrid approach to solving nonlinear, time-dependent optimal control problems that enables massive parallelism for deep learning on many-core high performance compute clusters.”
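The neural-ODE viewpoint Guenther describes can be made concrete in a few lines: a residual network’s layer update, x_{k+1} = x_k + h·f(x_k, θ_k), is exactly a forward-Euler step of the ODE dx/dt = f(x, θ(t)), where each layer corresponds to one point in time. A minimal sketch of this correspondence follows; the tanh layer function, step size, and dimensions are illustrative assumptions, not details from the article.

```python
import numpy as np

def layer(x, W, b):
    # Illustrative residual-block function f(x, theta); the tanh
    # nonlinearity is an assumption for this sketch.
    return np.tanh(W @ x + b)

def resnet_forward(x0, weights, biases, h=0.1):
    # Each residual update x_{k+1} = x_k + h * f(x_k, theta_k)
    # is a forward-Euler step of the ODE dx/dt = f(x, theta(t)).
    # Layers thus play the role of time steps, which is what makes
    # parallel-in-time (multigrid) methods applicable across layers.
    x = x0
    for W, b in zip(weights, biases):
        x = x + h * layer(x, W, b)
    return x

rng = np.random.default_rng(0)
dim, depth = 4, 8
weights = [0.1 * rng.standard_normal((dim, dim)) for _ in range(depth)]
biases = [np.zeros(dim) for _ in range(depth)]
x0 = rng.standard_normal(dim)
print(resnet_forward(x0, weights, biases))
```

In this picture, training the network means choosing the controls θ(t) (the layer weights) to minimize a loss on the ODE’s final state, which is precisely an optimal control problem with the ODE as the constraint.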
Guenther initially came to the Bay Area in 2018 for a three-month internship at CASC, where she worked with Jacob Schroder on developing adjoint-based sensitivities for XBraid, and fell in love with the region. “I really loved being in the Bay Area and living here. I love the diversity, the people, and the outdoor lifestyle where I can bike, hike, and rock climb. After my internship at CASC, I knew I wanted to come back,” she says.
—Genevieve Sexton