Computational mathematician Yohann Dudouit wrote his first finite element simulation as an undergraduate student—creating a video game physics engine involving complex meshing, discretization, solvers, and visualization, in just three months. Yohann, who both initiated and led the project, says, “It was an ambitious goal to fully develop this engine for simulating properties of object deformation in such a short amount of time. Even now, I am proud of what my team accomplished.” Looking back, this project was also the genesis of what would become a successful career focused on developing generalized mathematical libraries for simulating an array of complex physical phenomena.

After earning his Ph.D. in Computational and Applied Mathematics from the University of Bordeaux, Yohann joined the Laboratory as a postdoc in 2017. Working with the MFEM team on CEED (Center for Efficient Exascale Discretizations)—part of the Department of Energy’s (DOE’s) Exascale Computing Project—he gained insight into mapping abstract mathematics to physical hardware, memory hierarchy, and the critical metrics for optimizing performance on GPU architectures. He says, “We focused on developing a general mathematical library for simulation using matrix-free techniques to improve application performance and optimize for GPU architectures.”

In 2020, Yohann became a full-time staff member of the Research Software Engineering group within the Laboratory’s Center for Applied Scientific Computing. In this role, he has applied the expertise gained from CEED to more specific applications, including the Laboratory’s BLAST code, extending matrix-free approaches to discontinuous Galerkin methods and adaptive mesh refinement (AMR)—work for which he earned an LLNL Spot Award. He later utilized those same performance-optimization techniques to help develop matrix-free numerical methods for the GEOSX application. Yohann says, “Software developers don’t often get to see how their products are used, but at the Laboratory, we can directly engage with our end users. It is very motivating to receive direct feedback on how software development benefits applications.”

Over the last three years, Yohann has participated in a Laboratory Directed Research and Development (LDRD) Program–funded project to develop a generalized arbitrary-dimension finite element library—called GenDiL—for simulating high-dimensional complex physics processes, from space-time discretizations in 4D to radiation transport and kinetic simulations in 6D. “We are concentrating on high-dimensional methods to solve challenging problems that are central to many applications at the Laboratory but were previously too computationally expensive and memory intensive to solve,” says Yohann. “By transferring the knowledge gained through development of high-order methods to high dimension, we strongly mitigate computational cost and can actually run some benchmarks faster in 6D than we could in 3D.”

Yohann became the LDRD project lead in 2024 and has refocused efforts to turn GenDiL’s unique capabilities into a state-of-the-art, high-performance computing finite-element library for next-generation high-dimensional simulations, specifically for radiation transport and kinetic models central to inertial confinement fusion. A key thrust of the project is pairing phase-space AMR with local-dimension refinement (LDR). Yohann says, “AMR concentrates resolution where the dynamics live, while LDR mixes 3D and 6D models in one simulation—so we pay for 6D only where physics demand it, which cuts costs and improves accuracy.”

Building on his work with GenDiL, Yohann also serves as co-principal investigator for VISTA, a DOE Office of Advanced Scientific Computing Research project to deliver end-to-end, machine-checked proofs of correctness for scientific software libraries, including floating-point correctness. “GenDiL is a target application, which, if successful, would be the first finite-element library with a full proof of correctness,” says Yohann. “My goal is to bring formal methods, which are typically used for critical and embedded software verification, into the world of scientific computing. The long-term goal is to raise confidence in simulation results, particularly for safety-critical or certification-driven applications.” Yohann also notes the utility of formal verification techniques as AI continues to rapidly evolve. “As AI-driven hardware trends toward very low-precision formats and may even de-emphasize or omit double precision, scientific software will have to adapt. Proving floating-point correctness becomes critical in that transition.”

Yohann is proud to be a part of the important work done at the Laboratory. “High-dimensional simulations enable more accurate models for critical mission areas. My work aims to bring these capabilities within reach by reducing memory footprint and computation time while preserving accuracy and scalability,” he says. “Going beyond 3D simulation was unthinkable until recently. The work we’ve done to simulate processes in 6D is nontrivial.” Combining this work with efforts on formal verification, Yohann sees a future full of possibilities. “With formal verification methods, we would start with mathematical specifications to define what generic programming functions must do to solve a specific problem, making the design problem as simple as possible for exploring extremely vast implementation spaces,” he says. “This idea is an old dream of computer science—writing specifications rather than programs and deriving the program from the specification. In the near future, such an outcome might actually be achievable.”

—Caryn Meissner

Published on November 13, 2025