The 2021 Conference on Computer Vision and Pattern Recognition, the premier conference of its kind, will feature two papers co-authored by an LLNL researcher that aim to improve the understanding of robust machine learning models.
New research debuting at ICLR 2021 demonstrates a learning-by-compressing approach to deep learning that outperforms traditional methods without sacrificing accuracy.
The Data Science Institute sponsored LLNL’s 27th hackathon on February 11–12. Organizers offered a deep learning tutorial and presentations showcasing data science techniques.
LLNL and IBM research on deep learning models that accurately diagnose diseases from X-ray images won the Best Paper award for Computer-Aided Diagnosis at the SPIE Medical Imaging Conference.
Three papers address feature importance estimation under distribution shifts, attribute-guided adversarial training, and uncertainty matching in graph neural networks.
An LLNL team has developed a “Learn-by-Calibrating” method for creating powerful scientific emulators that could be used as proxies for far more computationally intensive simulators.
Computing’s summer hackathon was held virtually on August 6–7 and featured presentations from teams who tested software technologies, expanded project features, or explored new ways of analyzing data.
Two papers featuring LLNL scientists were accepted at the 2020 International Conference on Machine Learning (ICML), one of the world’s premier conferences of its kind.
Surrogate models supported by neural networks could lead to new insights in complicated physics problems such as inertial confinement fusion.
A team led by an LLNL computer scientist proposes a deep learning approach aimed at improving the reliability of classifier models for predicting disease types from diagnostic images.
LLNL teams conduct research using AI, and the Machine Learning Reading Group serves as a resource for employees to keep one another apprised of developments in this ever-changing field.
As part of the Department of Energy’s role in the fight against cancer, scientists are building tools that use supercomputers to solve problems in entirely new ways.
Brothers and Computation teammates Joe and Sam Eklund discuss their multi-hackathon project using Deep Voice 3.
LLNL’s Center for Applied Scientific Computing looks back at 2018 papers, presentations, and other activities recognizing research and innovation in data science.
With nearly 100 publications, CASC researcher Jayaraman “Jay” Thiagarajan explores the possibilities of artificial intelligence and machine learning technologies.
LLNL employees attended a five-part “Deep Learning 101” course, which introduced the basics of neural networks and machine learning to anyone with a basic knowledge of programming in Python.