Topic: ML Theory

LLNL held its first-ever Machine Learning for Industry Forum (ML4I) on August 10–12, co-hosted by the Lab’s High-Performance Computing Innovation Center and Data Science Institute.

News Item

The 2021 Conference on Computer Vision and Pattern Recognition features two papers co-authored by an LLNL researcher, both aimed at understanding robust machine learning models.

News Item

New research debuting at ICLR 2021 demonstrates a learning-by-compressing approach to deep learning that outperforms traditional methods without sacrificing accuracy.

News Item

Three papers address feature importance estimation under distribution shifts, attribute-guided adversarial training, and uncertainty matching in graph neural networks.

News Item

An LLNL team has developed a “Learn-by-Calibrating” method for creating powerful scientific emulators that could be used as proxies for far more computationally intensive simulators.

News Item

The 34th Conference on Neural Information Processing Systems features two papers advancing the reliability of deep learning for mission-critical applications at LLNL.

News Item

Two papers featuring LLNL scientists were accepted to the 2020 International Conference on Machine Learning (ICML), one of the world’s premier machine learning conferences.

News Item

An LLNL-led team proposes a deep learning approach aimed at improving the reliability of classifier models that predict disease types from diagnostic images.

News Item

LLNL's Jay Thiagarajan joins the Data Skeptic podcast to discuss his recent paper "Calibrating Healthcare AI: Towards Reliable and Interpretable Deep Predictive Models." The episode runs 35:50.

News Item

Highlights include perspectives on machine learning and artificial intelligence in science, data-driven models, autonomous vehicle operations, and the OpenMP 5.0 standard.

News Item

With nearly 100 publications, CASC researcher Jayaraman “Jay” Thiagarajan explores the possibilities of artificial intelligence and machine learning technologies.

People Highlight