The collaboration has enabled expanding systems of the same architecture as LLNL’s upcoming exascale supercomputer, El Capitan, featuring AMD’s cutting-edge MI300A processors.
In two papers from the 2024 International Conference on Machine Learning, Livermore researchers investigate how LLMs perform under measurable scrutiny.
To keep employees abreast of the latest tools, two data science–focused projects are under way as part of Lawrence Livermore’s Institutional Scientific Capability Portfolio.
The proposed Frontiers in Artificial Intelligence for Science, Security and Technology (FASST) initiative will advance national security; attract and build a talented workforce; harness AI for scientific discovery; address energy challenges; and develop the technical expertise necessary for AI governance.
This issue highlights some of CASC’s contributions to the DOE's Exascale Computing Project.
LLNL is applying ML to real-world applications on multiple scales. Researchers explain why water filtration, wildfires, and carbon capture are becoming more solvable thanks to groundbreaking data science methodologies on some of the world’s fastest computers.
LLNL’s HPC capabilities play a significant role in international science research and innovation, and Lab researchers have won 10 R&D 100 Awards in the Software/Services category in the past decade.
Held May 7–8 in Washington, DC, the Special Competitive Studies Project (SCSP) AI Expo showcased groundbreaking initiatives in AI and emerging technologies. Kim Budil and other Lab speakers presented at center stage and at the DOE exhibition booth.
In a groundbreaking development for addressing future viral pandemics, a multi-institutional team involving LLNL researchers has successfully combined an AI-backed platform with supercomputing to redesign and restore the effectiveness of antibodies whose ability to fight viruses has been compromised.
Throughout the workshop, speakers, panelists, and attendees focused on algorithm development, the potential dangers of superhuman AI systems, and the importance of understanding and mitigating risks to humans, as well as the urgent scientific and political measures needed to address those risks.
LLNL’s fusion ignition breakthrough, more than 60 years in the making, was enabled by a combination of traditional fusion target design methods, HPC, and AI techniques.
By accounting for weather-related variables that directly impact the electrical grid, such as wildfire, flooding, wind, and sunlight, researchers can improve electrical grid model projections for a more stable future.
New research reveals subtleties in the performance of neural image compression methods, offering insights toward improving these models for real-world applications.
LLNL is participating in the 35th annual Supercomputing Conference (SC23), which will be held both virtually and in Denver on November 12–17, 2023.
Merlin is an open-source workflow orchestration and coordination tool that makes it easy to build, run, and process large-scale workflows.
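To give a flavor of how Merlin workflows are expressed, the sketch below shows a minimal YAML study spec in the style of Merlin's documentation; the step names, commands, and two-step structure are illustrative assumptions, not an actual Lab workflow.

```yaml
description:
    name: hello_sketch
    description: A minimal two-step workflow sketch

study:
    - name: say_hello
      description: Print a greeting
      run:
          cmd: echo "hello from Merlin"

    - name: wrap_up
      description: Runs only after say_hello completes
      run:
          cmd: echo "workflow finished"
          depends: [say_hello]
```

Per Merlin's documentation, a spec like this is queued with `merlin run hello_sketch.yaml` and executed by distributed workers launched via `merlin run-workers hello_sketch.yaml`.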
Cindy Gonzales earned a bachelor’s degree and master’s degree and changed careers—all while working at the Lab. Meet the deputy director of LLNL’s Data Science Institute.
CASC computational mathematician Andrew Gillette has always been drawn to mathematics and says it’s about more than just crunching numbers.
Using explainable artificial intelligence techniques can help increase the reach of machine learning applications in materials science, making the process of designing new materials much more efficient.
Highlights include MFEM community workshops, compiler co-design, HPC standards committees, and AI/ML for national security.
In a time-trial competition, participants trained an autonomous race car with reinforcement learning algorithms.
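The reinforcement-learning setup behind such a competition can be sketched with a toy tabular Q-learning loop. Everything below, including the one-dimensional "track," the spin-out penalty, and the hyperparameters, is an illustrative stand-in, not the competition's actual environment or algorithms.

```python
import random

TRACK_LEN = 6          # positions 0..5; position 5 is the finish line
ACTIONS = (0, 1)       # 0 = coast (advance 1), 1 = accelerate (advance 2, may spin out)

def step(pos, action, rng):
    """Advance along the track; accelerating risks spinning out back to the start."""
    if action == 1 and rng.random() < 0.2:
        return 0, -1.0                                     # spun out: restart
    new_pos = min(pos + 1 + action, TRACK_LEN - 1)
    reward = 10.0 if new_pos == TRACK_LEN - 1 else -0.1    # small per-step cost
    return new_pos, reward

def train(episodes=2000, alpha=0.1, gamma=0.95, eps=0.1, seed=0):
    """Tabular Q-learning with epsilon-greedy exploration."""
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(TRACK_LEN)]
    for _ in range(episodes):
        pos = 0
        while pos < TRACK_LEN - 1:
            if rng.random() < eps:
                a = rng.choice(ACTIONS)                    # explore
            else:
                a = max(ACTIONS, key=lambda x: q[pos][x])  # exploit
            nxt, r = step(pos, a, rng)
            q[pos][a] += alpha * (r + gamma * max(q[nxt]) - q[pos][a])
            pos = nxt
    return q

q = train()
# Greedy policy: the learned action (coast vs. accelerate) at each track position.
policy = [max(ACTIONS, key=lambda a: q[s][a]) for s in range(TRACK_LEN - 1)]
```

The same trade-off drives a real time trial: riskier actions shorten laps but can cost the whole run, and the value estimates learn to balance the two.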
LLNL participates in the International Parallel and Distributed Processing Symposium (IPDPS), held May 30 through June 3.
Winning the best paper award at PacificVis 2022, a research team has developed a resolution-precision-adaptive representation technique that reduces mesh sizes, thereby reducing the memory and storage footprints of large scientific datasets.
From molecular screening, a software platform, and an online database to the computing systems that power these projects.
LLNL’s cyber programs work across a broad sponsor space to develop technologies addressing sophisticated cyber threats directed at national security and civilian critical infrastructure.
This project advances research in physics-informed ML, invests in validated and explainable ML, creates an advanced data environment, builds ML expertise across the complex, and more.
