Topic: ML Theory

Three papers address feature importance estimation under distribution shifts, attribute-guided adversarial training, and uncertainty matching in graph neural networks.

News Item

An LLNL team has developed a “Learn-by-Calibrating” method for creating powerful scientific emulators that could be used as proxies for far more computationally intensive simulators.

News Item

The 34th Conference on Neural Information Processing Systems features two LLNL papers advancing the reliability of deep learning for mission-critical applications.

News Item

Two papers by LLNL scientists were accepted to the 2020 International Conference on Machine Learning (ICML), one of the world’s premier conferences of its kind.

News Item

LLNL's Jay Thiagarajan joins the Data Skeptic podcast to discuss his recent paper "Calibrating Healthcare AI: Towards Reliable and Interpretable Deep Predictive Models." The episode runs 35:50.

News Item

LLNL’s Data Science Institute hosted its second annual workshop with the University of California, emphasizing key challenges with machine learning and artificial intelligence in scientific research.

News Item

Highlights include perspectives on machine learning and artificial intelligence in science, data-driven models, autonomous vehicle operations, and the OpenMP 5.0 standard.

News Item

CASC computer scientists demonstrate how LLNL's innovative data-driven machine learning techniques teach computers to solve real-world problems.

News Item

With nearly 100 publications, CASC researcher Jayaraman “Jay” Thiagarajan explores the possibilities of artificial intelligence and machine learning technologies.

People Highlight