From our fall 2022 hackathon, watch participants train an autonomous race car with reinforcement learning algorithms.
Highlights include MFEM community workshops, compiler co-design, HPC standards committees, and AI/ML for national security.
In a time-trial competition, participants trained an autonomous race car with reinforcement learning algorithms.
LLNL is participating in the 33rd annual Supercomputing Conference (SC21), which will be held both virtually and in St. Louis on November 14–19, 2021.
The 2021 Conference on Computer Vision and Pattern Recognition features two papers, co-authored by an LLNL researcher, aimed at understanding robust machine learning models.
New research debuting at ICLR 2021 demonstrates a learning-by-compressing approach to deep learning that outperforms traditional methods without sacrificing accuracy.
Highlights include scalable deep learning, high-order finite elements, data race detection, and reduced order models.
The Data Science Institute sponsored LLNL’s 27th hackathon on February 11–12. Organizers offered a deep learning tutorial and presentations showcasing data science techniques.
LLNL and IBM research on deep learning models to accurately diagnose diseases from X-ray images won the Best Paper award for Computer-Aided Diagnosis at the SPIE Medical Imaging Conference.
Our researchers will be well represented at the virtual SIAM Conference on Computational Science and Engineering (CSE21) on March 1–5. SIAM, the Society for Industrial and Applied Mathematics, has an international community of more than 14,500 individual members.
Three papers address feature importance estimation under distribution shifts, attribute-guided adversarial training, and uncertainty matching in graph neural networks.
An LLNL team has developed a “Learn-by-Calibrating” method for creating powerful scientific emulators that could be used as proxies for far more computationally intensive simulators.
Lawrence Livermore National Laboratory has named Stefanie Guenther as Computing’s fourth Sidney Fernbach Postdoctoral Fellow in the Computing Sciences. This highly competitive fellowship is named after LLNL’s former Director of Computation and is awarded to exceptional candidates who demonstrate the potential for significant achievements in computational mathematics, computer science, data science, or scientific computing.
Highlights include response to the COVID-19 pandemic, high-order matrix-free algorithms, and managing memory spaces.
Rafael Rivera-Soto is passionate about artificial intelligence, deep learning, and machine learning technologies. He works in LLNL’s Global Security Computing Applications Division, also known as GSCAD.
Cindy Gonzales earned a bachelor’s degree, started her master’s degree, and changed careers—all while working at the Lab. Meet one of our newest data scientists.
Highlights include perspectives on machine learning and artificial intelligence in science, data-driven models, autonomous vehicle operations, and the OpenMP 5.0 standard.
With nearly 100 publications, CASC researcher Jayaraman “Jay” Thiagarajan explores the possibilities of artificial intelligence and machine learning technologies.
Highlights include recent LDRD projects, Livermore Tomography Tools, our work with the open-source software community, fault recovery, and CEED.