From her first day as a computer scientist in the Center for Applied Scientific Computing (CASC), Maya Gokhale has enjoyed tackling a wide variety of challenges. Over the years, she has taken part in groundbreaking projects and developed high-risk ideas, from solving the data movement challenges facing supercomputers to reducing the energy required for autonomous sensor data processing in remote locations. As a Distinguished Member of Technical Staff, LLNL's highest technical job classification, Gokhale is considered an expert in her field and continues to enjoy the fast pace of innovation and change in computing.
Gokhale attended Wake Forest University where she obtained a bachelor’s degree in mathematics in just three years. After the intense academic engagement, she jumped into computer science (CS) research and development for the giants of computing in that era, Burroughs Corporation (later part of Unisys) and Hewlett-Packard. “I had to learn on the job and be really quick on my feet,” says Gokhale. “But I ended up discovering that I really do like logical thinking and a systems perspective for how all the pieces fit together.”
Gokhale soon realized she couldn't stay away from academics for long, completing a master's degree in CS while working full time and subsequently pursuing a CS doctorate, both at the University of Pennsylvania. On her daily walk to her office at Penn, she passed glass cases containing parts of ENIAC, the first general-purpose electronic computer.
While a member of the University of Delaware Computer and Information Sciences department, Gokhale was drawn to full-time research at the newly formed Supercomputing Research Center. Her passion for parallel processing was amply fulfilled by engaging with teams designing novel heterogeneous architectures, including one of the first processing-in-memory chips ever fabricated, and by the new opportunities for computing in programmable logic. Over a decade, Gokhale's teams designed parallel languages uniquely targeting reconfigurable computing with field-programmable gate arrays (FPGAs) and built compilers to translate high-level parallel C to hardware gates. One such compiler was licensed to a computer-aided design company; another, released as open source, won an R&D 100 award.
At LLNL, Gokhale's research has focused on the blurring distinction between memory and storage afforded by newly emerging persistent memories such as flash and resistive random-access memory (ReRAM). Her projects have delivered memory-mapping libraries targeting exascale compute nodes as well as specialized near-memory processing designs that reduce the cost of moving data between memory and CPU.
Upon joining CASC in 2007, Gokhale was immediately engaged in a new project focusing on memory and storage in supercomputing architectures. Gokhale observes, “At LLNL Computing, I get to be involved with diverse challenges and enjoy new experiences across the stack from systems software to purpose-built hardware.”
Currently, Gokhale is engaged in a variety of projects related to memory-intensive computing and heterogeneous architectures. One is a Laboratory Directed Research and Development (LDRD) project focused on low-level, specialized hardware meant to reduce energy use in autonomous near-sensor computing devices. The hardware her team is developing exploits asynchronous data flows through compute units that produce new data.
“Normally, artificial intelligence is associated with inference-only chips where the data just flows through because it already has a path through a particular network,” says Gokhale. “Our chip employs asynchronous logic to enable runtime adaptation to changing environment and data characteristics.”
Gokhale says it's the unknowns that excite her most about this work. The team combines expert hardware designers and machine learning researchers at LLNL with a leading-edge Yale University hardware team to develop the technology, which she refers to as a "high-risk idea" because of the challenge it presents: breaking the barrier of adaptable autonomous processing.
“We face many questions on both hardware and machine learning fronts,” says Gokhale. “Support for potentially disruptive research is a unique strength of the Lab. It’s really what LDRD is all about.”
That technical challenge, the variety in projects, and the support of leadership are what Gokhale enjoys most about being a member of CASC. "Sometimes in the span of 15 minutes, I will have pivoted between 3 different projects," she says. "But I like that variety."
Gokhale carries that variety into her personal life by helping register voters for local and national elections and caring for her body and spirit through regular yoga practice. If you see Gokhale around the office, ask her about power yoga, which she says builds muscle, reinforces joints, and improves bone density. She may even give you tips on perfecting your handstand (which may be her secret to having a head so full of ideas) but cautions that not knowing how to do something shouldn't be a reason not to try it.
“We tend to think, ‘I’m not perfect, I’ll break it, I’m not doing it right.’ But I learned that it doesn’t really matter because most people are making it up as they go,” says Gokhale. “And even when we make it up, we can end up being pretty spot on.”
—Amy Weldon