In-Memory Computing Hardware Accelerators: Co-designing devices, circuits, and architectures for explainable machine learning and pattern matching applications


 Dr. Cat Graves, Principal Research Scientist at Hewlett Packard Labs
Cost: Free, but registration is required. Register: Here
     Registered attendees will receive an email with a link to the Zoom meeting.

Note: to keep these announcements from going to spam, Mailchimp recommends adding this email address to your address book: LincolnBourne@gmail.com
Wed April 20 – Agenda (California Time)
11:30 AM – Check-in
12:00 PM – Announcements and Speaker Introduction
12:10 – 1:30 PM – Seminar and Q&A
 
           
Abstract: 
The dramatic rise of data-intensive workloads has revived special-purpose hardware as a path to continued gains in computing performance. Several promising special-purpose approaches take inspiration from the brain, which outperforms digital computing in power and performance on key tasks such as pattern matching. One brain-inspired approach, “in-memory computing,” significantly reduces data movement and has been shown to improve performance in CMOS ASIC demonstrations. However, these implementations still suffer from limited power efficiency. Emerging non-volatile memories are a highly attractive alternative for achieving low power and high performance in these architectures. Originally developed as digital (binary) non-volatile memories, many of these devices have highly tunable analog resistances that are well matched to in-memory computing architectures. I will review our team’s recent work using crossbar and content addressable memory (CAM) circuits to accelerate important computing workloads in machine learning, complex pattern matching, and optimization. I will also discuss our team’s recently invented analog CAM circuit, designed to accelerate interpretable machine learning models. Our work spans co-design from circuits and devices to algorithms and architectures to enable low-power, high-throughput computation for important computing workloads.
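For attendees new to the area, the core idea behind crossbar in-memory computing fits in a few lines: a matrix of programmable device conductances G multiplies an input voltage vector V in one analog step, because each output current is a Kirchhoff's-law sum of Ohm's-law products (I = G·V). The Python sketch below is a minimal idealized model of that operation, not the speaker's implementation; the conductance range, differential weight encoding, and noise level are illustrative assumptions.

```python
import numpy as np

def program_crossbar(weights, g_min=1e-6, g_max=1e-4):
    """Map a real-valued weight matrix onto device conductances.

    Uses a simple differential encoding: each weight is the difference
    of a 'positive' and 'negative' conductance pair, one common scheme
    for representing signed values with strictly positive conductances.
    """
    scale = (g_max - g_min) / np.max(np.abs(weights))
    g_pos = g_min + scale * np.clip(weights, 0, None)
    g_neg = g_min + scale * np.clip(-weights, 0, None)
    return g_pos, g_neg, scale

def crossbar_matvec(g_pos, g_neg, scale, v_in, noise=0.02, rng=None):
    """One analog matrix-vector multiply: output currents are the
    column-wise sums I = G @ V (Kirchhoff's current law). Gaussian
    conductance noise crudely mimics device-to-device variability."""
    if rng is None:
        rng = np.random.default_rng(0)
    gp = g_pos * (1 + noise * rng.standard_normal(g_pos.shape))
    gn = g_neg * (1 + noise * rng.standard_normal(g_neg.shape))
    i_out = (gp - gn) @ v_in          # analog summation of currents
    return i_out / scale              # undo the conductance scaling

# Usage: a 4x3 weight matrix applied to an input vector in "one step".
W = np.array([[0.5, -1.0, 0.25],
              [1.0,  0.0, -0.5],
              [-0.75, 0.5, 1.0],
              [0.25, 0.25, 0.25]])
v = np.array([1.0, 0.5, -1.0])
g_pos, g_neg, scale = program_crossbar(W)
print("analog :", crossbar_matvec(g_pos, g_neg, scale, v))
print("digital:", W @ v)
```

The differential pair is only one way to encode signed weights; real designs must also contend with wire resistance, ADC precision, and programming error, which is part of what makes the device-circuit co-design discussed in the talk necessary.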

                        
 Dr. Cat Graves is a Principal Research Scientist at Hewlett Packard Labs developing analog and neuromorphic computational accelerators that leverage emerging devices, such as resistive RAM (RRAM), to achieve higher energy efficiency and throughput than general-purpose digital approaches in data-centric domains. Some of her previous work used multilevel analog RRAM devices to natively perform matrix multiplication within crossbars, accelerating a core computation of wide-ranging applications from neural networks to signal processing. Currently, she leads a research team exploring uses of RRAM-based and analog associative memory circuits for accelerating diverse computational models, including tree-based ML models and finite automata processing for network security and genomics applications. Cat was named the Silicon Valley Intellectual Property Law Association (SVIPLA) Inventor of the Year in 2021 for her co-invention of analog content addressable memories. She received her Ph.D. in Applied Physics from Stanford University, where she studied ultrafast magnetism for future magnetic memory technologies as an NSF Graduate Research Fellow. She has published over 35 peer-reviewed papers and three book chapters, and has been awarded 14 US patents.
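As background on the analog content addressable memory (CAM) highlighted above: a CAM compares a query against every stored word in parallel and reports which rows match, and the analog variant stores a per-cell acceptance interval rather than a single bit, so a row matches when every query value falls inside all of that row's intervals. This maps naturally onto tree-based ML models, since each root-to-leaf path of a decision tree is a conjunction of threshold tests. The Python sketch below models only this matching behavior in software; the intervals, features, and class labels are invented for illustration and are not from the speaker's work.

```python
import numpy as np

# Each analog-CAM row stores per-feature acceptance intervals [lo, hi].
# A decision-tree path like (x0 <= 0.5) AND (x1 >= 0.2) becomes one row:
#   feature 0 -> (-inf, 0.5], feature 1 -> [0.2, +inf)
rows = np.array([
    # feature 0 lo/hi,   feature 1 lo/hi
    [[-np.inf, 0.5], [0.2,  np.inf]],    # leaf A: class 0
    [[-np.inf, 0.5], [-np.inf, 0.2]],    # leaf B: class 1
    [[0.5, np.inf],  [-np.inf, np.inf]]  # leaf C: class 1 (x1 "don't care")
])
leaf_classes = np.array([0, 1, 1])

def acam_search(rows, query):
    """Parallel interval match: a row fires only if every query value
    lies within that row's stored [lo, hi] interval. In hardware all
    rows are evaluated simultaneously."""
    lo, hi = rows[..., 0], rows[..., 1]
    match = np.all((query >= lo) & (query <= hi), axis=1)
    return np.flatnonzero(match)

q = np.array([0.3, 0.7])
hit = acam_search(rows, q)
print("matched row(s):", hit, "-> class", leaf_classes[hit])
```

Because every stored rule is checked at once, the lookup is a single-step operation regardless of how many rules are stored, which is the source of the throughput advantage for rule-based, interpretable models.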