The report lays out a comprehensive vision for the DOE Office of Science and NNSA to expand their work on the scientific use of AI by building on existing strengths in world-leading high-performance computing systems and data infrastructure.
LLNL CTO Bronis de Supinski talks about how the Lab deploys AI machines with novel architectures and provides an update on El Capitan.
As CTO of Livermore Computing, de Supinski is responsible for formulating, overseeing, and implementing LLNL’s large-scale computing strategy, a role that requires managing multiple collaborations with the HPC industry and academia.
The addition of the spatial data flow accelerator to LLNL’s Livermore Computing Center is part of an effort to upgrade the Lab’s cognitive simulation (CogSim) program.
Adding machine learning and other artificial intelligence methods to the feedback cycle of experimentation and computer modeling can accelerate scientific discovery.
The award recognizes progress in the team's ML-based approach to modeling inertial confinement fusion (ICF) experiments, which has produced faster and more accurate models of ICF implosions.
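As a rough illustration of the surrogate-modeling idea behind these CogSim efforts, the sketch below trains a fast regression surrogate on outputs of an expensive "simulation" and then uses it to screen candidate designs. It is a minimal sketch, not the team's actual models or code; the parameter names and the toy figure-of-merit function are invented for illustration.

```python
# Minimal sketch of a simulation surrogate in an experiment/modeling loop.
# The "simulator" and its inputs are toy stand-ins, not an actual ICF code.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def expensive_simulation(params):
    """Toy stand-in for a costly physics simulation: maps hypothetical design
    parameters (e.g., laser drive, shell thickness) to a scalar figure of merit."""
    drive, thickness = params
    return np.exp(-((drive - 0.6) ** 2 + (thickness - 0.3) ** 2) / 0.05)

# 1. Run a modest batch of "simulations" to build training data.
X_train = rng.uniform(0.0, 1.0, size=(200, 2))
y_train = np.array([expensive_simulation(p) for p in X_train])

# 2. Fit a fast neural-network surrogate to the simulation outputs.
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000, random_state=0)
surrogate.fit(X_train, y_train)

# 3. Use the cheap surrogate to screen many candidate designs and pick
#    the most promising one for the next simulation or experiment.
candidates = rng.uniform(0.0, 1.0, size=(10000, 2))
best = candidates[np.argmax(surrogate.predict(candidates))]
print("Next design to try:", best,
      "predicted merit:", surrogate.predict(best.reshape(1, -1))[0])
```

In a full CogSim-style workflow, the surrogate would be retrained as new simulation or experimental results arrive, closing the feedback cycle between experimentation and computer modeling described above.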
Livermore Computing sited two different AI accelerators in 2020: the Cerebras wafer-scale AI engine, attached to Lassen, and an AI accelerator from SambaNova Systems, integrated into the Corona cluster.
LLNL has established the AI Innovation Incubator (AI3), a collaborative hub aimed at uniting experts from LLNL, industry, and academia to advance AI for scientific and commercial applications.
In his opening keynote address at the AI Systems Summit, LLNL CTO Bronis de Supinski described the integration of two AI-specific systems to achieve system-level heterogeneity.