LLNL researchers and collaborators have developed a highly detailed, ML-backed multiscale model revealing the importance of lipids to RAS, a family of proteins whose mutations are linked to many cancers.
Topic: Data Science
Highlights include power grid challenges, performance analysis, complex boundary conditions, and a novel multiscale modeling approach.
LLNL is participating in the 33rd annual Supercomputing Conference (SC21), which will be held both virtually and in St. Louis on November 14–19, 2021.
Brian Gallagher works on applications of machine learning for a variety of science and national security questions. He’s also a group leader, student mentor, and the new director of LLNL’s Data Science Challenge.
New research debuting at ICLR 2021 demonstrates a learning-by-compressing approach to deep learning that outperforms traditional methods without sacrificing accuracy.
Highlights include scalable deep learning, high-order finite elements, data race detection, and reduced order models.
BUILD tackles the complexities of HPC software integration with dependency compatibility models, binary analysis tools, efficient logic solvers, and configuration optimization techniques.
Our researchers will be well represented at the virtual SIAM Conference on Computational Science and Engineering (CSE21) on March 1–5. SIAM, the Society for Industrial and Applied Mathematics, has an international community of more than 14,500 individual members.
Three papers address feature importance estimation under distribution shifts, attribute-guided adversarial training, and uncertainty matching in graph neural networks.
StarSapphire is a collection of scientific data mining projects focusing on the analysis of data from scientific simulations, observations, and experiments.
fpzip is a library for lossless or lossy compression of multidimensional floating-point arrays, designed primarily for lossless compression.
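As a minimal usage sketch, the C snippet below compresses a 3D single-precision array losslessly through fpzip's buffer interface (fpzip_write_to_buffer, fpzip_write, fpzip_write_close). The array dimensions and buffer sizing are placeholder assumptions, and field names or defaults may differ across fpzip versions.

/* Sketch: lossless compression of a 3D float array with fpzip.
 * Array sizes and buffer sizing are illustrative; consult fpzip.h for your version. */
#include <stdio.h>
#include <stdlib.h>
#include "fpzip.h"

int main(void)
{
  const int nx = 64, ny = 64, nz = 64;
  size_t n = (size_t)nx * ny * nz;
  float* data = malloc(n * sizeof *data);
  for (size_t i = 0; i < n; i++)
    data[i] = (float)i / n;                   /* placeholder field data */

  size_t bufsize = 1024 + n * sizeof *data;   /* generous output buffer */
  void* buffer = malloc(bufsize);

  FPZ* fpz = fpzip_write_to_buffer(buffer, bufsize);
  fpz->type = FPZIP_TYPE_FLOAT;               /* single-precision input */
  fpz->prec = 0;                              /* 0 = full precision, i.e., lossless */
  fpz->nx = nx; fpz->ny = ny; fpz->nz = nz;
  fpz->nf = 1;                                /* one scalar field */

  size_t outbytes = fpzip_write(fpz, data);   /* compressed size; 0 signals an error */
  fpzip_write_close(fpz);

  if (outbytes == 0)
    fprintf(stderr, "compression failed\n");
  else
    printf("compressed %zu bytes to %zu bytes\n", n * sizeof *data, outbytes);

  free(buffer);
  free(data);
  return 0;
}

Decompression mirrors this pattern with the corresponding fpzip_read_* calls; setting prec to fewer bits than the source type trades accuracy for a smaller compressed stream.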
Nisha Mulakken is advancing COVID-19 R&D and mentoring the next generation. “The opportunities we are exposed to early in our careers can shape the limits we place on ourselves and our approaches to challenges we encounter throughout our careers,” she says.
Lawrence Livermore National Laboratory has named Stefanie Guenther as Computing’s fourth Sidney Fernbach Postdoctoral Fellow in the Computing Sciences. This highly competitive fellowship is named after LLNL’s former Director of Computation and is awarded to exceptional candidates who demonstrate the potential for significant achievements in computational mathematics, computer science, data science, or scientific computing.
Highlights include response to the COVID-19 pandemic, high-order matrix-free algorithms, and managing memory spaces.
Rafael Rivera-Soto is passionate about artificial intelligence, deep learning, and machine learning technologies. He works in LLNL’s Global Security Computing Applications Division, also known as GSCAD.
ADAPD integrates expertise from DOE national labs to analyze growing global data streams and traditional intelligence data, enabling early warning of nuclear proliferation activities.
Researchers develop innovative data representations and algorithms to provide faster, more efficient ways to preserve information encoded in data.
Highlights include perspectives on machine learning and artificial intelligence in science, data-driven models, autonomous vehicle operations, and the OpenMP 5.0 standard.
Simulation workflows for arbitrary Lagrangian-Eulerian (ALE) methods often require manual tuning. We are developing novel predictive analytics for simulations and an infrastructure for integrating those analytics into simulation workflows.
Highlights include CASC director Jeff Hittinger's vision for the center as well as recent work with PruneJuice, DataRaceBench, Caliper, and SUNDIALS.
AIMS (Analytics and Informatics Management Systems) develops integrated cyberinfrastructure for big climate data discovery, analytics, simulations, and knowledge innovation.
Marisa Torres, software developer with LLNL’s Global Security Computing Applications Division, combines her love of biology with coding.
Highlights include recent LDRD projects, Livermore Tomography Tools, our work with the open-source software community, fault recovery, and CEED.
Highlights include the directorate's annual external review, machine learning for ALE simulations, CFD modeling for low-carbon solutions, seismic modeling, and an in-line floating-point compression tool.
SOAR (Stateless, One-pass Adaptive Refinement) is a view-dependent mesh refinement and rendering algorithm.
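As a rough illustration of the kind of test that drives view-dependent refinement in SOAR-style renderers, the sketch below activates a vertex when its object-space error bound, projected to screen space, exceeds a pixel tolerance. The function name, the scale factor kappa, and the exact inequality are illustrative assumptions, not the published SOAR formulation.

/* Illustrative view-dependent refinement test (not the published SOAR math):
 * refine a vertex when its projected error exceeds a screen-space tolerance. */
#include <math.h>

typedef struct { float x, y, z; } Vec3;

/* delta:  object-space error bound for the vertex
 * radius: bounding-sphere radius enclosing the vertex's descendants
 * kappa:  screen-space scale, e.g. viewport_width / (2 * tan(fov_x / 2))
 * tau:    screen-space error tolerance in pixels */
int vertex_is_active(Vec3 eye, Vec3 v, float delta, float radius,
                     float kappa, float tau)
{
  float dx = v.x - eye.x, dy = v.y - eye.y, dz = v.z - eye.z;
  float dist = sqrtf(dx * dx + dy * dy + dz * dz) - radius; /* conservative distance */
  if (dist <= 0.0f)
    return 1;                        /* viewer inside the bounding sphere: always refine */
  return kappa * delta / dist > tau; /* projected error exceeds the pixel tolerance */
}

Evaluating a cheap test like this per vertex, in a single pass over a hierarchy of nested error bounds, is what lets view-dependent schemes adapt mesh resolution to the current viewpoint without retained refinement state.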