Physics-Aware Tiny Machine Learning by Swapnil Sayan Saha

Speaker: Swapnil Sayan Saha
Affiliation: Electrical and Computer Engineering, University of California, Los Angeles (UCLA)

Friday, April 21, 2023 – 1:00 PM to 3:00 PM
Engineering Bldg. IV – Tesla Room #53-125

Abstract:
Tiny machine learning has enabled Internet of Things platforms to make intelligent inferences from unstructured data for time-critical and remote applications. However, realizing edge artificial intelligence systems that can perform long-term, high-level reasoning and obey the underlying system physics, rules, and constraints within a tight platform resource budget is challenging. This dissertation explores how rich, robust, and intelligent inferences can be made on extremely resource-constrained platforms in a platform-aware and automated fashion. First, we introduce a robust training pipeline that handles sampling-rate variability, missing data, and misaligned data timestamps through intelligent data-augmentation techniques at training time. Second, we introduce TinyNS, a platform-aware neurosymbolic architecture search framework for the automatic co-optimization and deployment of neural operators and physics-based process models. TinyNS exploits fast, gradient-free, black-box Bayesian optimization to automatically construct the most performant learning-enabled, physics-aware, and context-aware edge artificial intelligence program from a search space of neural and symbolic operators, subject to the platform's resource constraints. Third, we introduce the concept of neurosymbolic tiny machine learning, which combines the context awareness and integrity of symbolic techniques with the robustness and performance of machine learning models; we provide recipes for defining the physics-aware tiny machine learning program-synthesis search space from five neurosymbolic program categories. We develop several novel TinyML applications, such as onboard physics-aware neural-inertial navigation, on-device human activity recognition, on-chip fall detection, neural-Kalman filtering, and co-optimization of neural and symbolic processes. Finally, we showcase techniques to personalize and adapt tiny machine learning systems to the target domain and application, illustrating transfer learning, resource-efficient unsupervised template creation and matching, and foundation models as pathways to generalizable, domain-aware, and data-efficient edge artificial intelligence systems.
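
To make the first contribution concrete, below is a minimal sketch of training-time augmentation for sampling-rate variability and missing data, assuming single-channel sensor windows stored as NumPy arrays. The function names, rate-drift range, and dropout probability are illustrative choices, not the dissertation's actual pipeline.

    import numpy as np

    def resample_to_rate(window, rate_scale, target_len=None):
        """Stretch or compress a 1-D window in time via linear interpolation,
        simulating a sensor whose true sampling rate drifted by rate_scale."""
        n = len(window)
        src_t = np.linspace(0.0, 1.0, n)
        dst_t = np.linspace(0.0, 1.0, max(2, int(round(n * rate_scale))))
        stretched = np.interp(dst_t, src_t, window)
        # Interpolate back to a fixed length so the model input shape stays constant.
        m = target_len or n
        return np.interp(np.linspace(0.0, 1.0, m), dst_t, stretched)

    def mask_missing(window, drop_prob, rng):
        """Mimic dropped samples by holding the last observed value."""
        out = window.copy()
        for i in range(1, len(out)):
            if rng.random() < drop_prob:
                out[i] = out[i - 1]  # zero-filling is a common alternative
        return out

    rng = np.random.default_rng(0)
    clean = np.sin(np.linspace(0, 4 * np.pi, 128))  # stand-in sensor window
    augmented = mask_missing(resample_to_rate(clean, rng.uniform(0.8, 1.2)), 0.1, rng)

Applying such transforms on the fly during training exposes the model to the timing imperfections it will encounter in deployment.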
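
TinyNS is described as black-box Bayesian optimization over a neural/symbolic search space under platform resource constraints. The sketch below shows that general pattern with scikit-optimize's gp_minimize; it is not the TinyNS API. The candidate evaluation is a toy placeholder for real train-then-validate, and the resource model (4 bytes per float32 weight, a 256 KB flash budget) is an assumption.

    from skopt import gp_minimize            # pip install scikit-optimize
    from skopt.space import Integer

    FLASH_BUDGET_KB = 256                    # assumed MCU flash budget

    def objective(params):
        n_layers, width = params
        # Rough parameter count for a stack of width x width layers.
        flash_kb = n_layers * width * width * 4 / 1024
        if flash_kb > FLASH_BUDGET_KB:
            # Penalize infeasible candidates so the optimizer steers back into budget.
            return 1.0 + flash_kb / FLASH_BUDGET_KB
        # Placeholder for training the candidate and measuring validation error.
        val_error = 1.0 / (1.0 + n_layers * width)
        return val_error

    result = gp_minimize(objective, [Integer(1, 8), Integer(8, 128)],
                         n_calls=25, random_state=0)
    print("best (layers, width):", result.x, "score:", result.fun)

Because the optimizer only sees scalar scores, the same loop can rank mixed neural and symbolic candidates without gradients, which is what makes the black-box formulation attractive for program synthesis.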
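
One showcased application is neural-Kalman filtering, where a learned measurement feeds a physics-based state estimator. Here is a minimal 1-D constant-velocity sketch in NumPy: neural_velocity is a hypothetical stand-in for a trained inertial network, and the noise covariances are illustrative.

    import numpy as np

    def neural_velocity(imu_window):
        """Hypothetical stand-in for a trained network regressing velocity from IMU data."""
        return float(np.mean(imu_window))    # placeholder inference

    dt = 0.1
    F = np.array([[1.0, dt], [0.0, 1.0]])    # constant-velocity motion model
    H = np.array([[0.0, 1.0]])               # the network measures velocity only
    Q = np.diag([1e-4, 1e-3])                # process noise (assumed)
    R = np.array([[1e-2]])                   # measurement noise of the network (assumed)

    x = np.zeros(2)                          # state: [position, velocity]
    P = np.eye(2)

    for step in range(100):
        imu_window = np.random.randn(32) + 1.0   # stand-in sensor data
        # Predict with the physics model.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update with the neural measurement.
        z = np.array([neural_velocity(imu_window)])
        y = z - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P

The filter keeps the state consistent with the motion model even when individual network outputs are noisy, which is the integrity benefit the neurosymbolic framing emphasizes.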
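
For the personalization contribution, one pathway named is resource-efficient unsupervised template creation and matching. A minimal sketch under assumed details: a few k-means iterations build templates from unlabeled feature vectors, and new windows are matched to the nearest template; the feature dimension and all names are illustrative.

    import numpy as np

    def build_templates(features, k=3, iters=10, rng=None):
        """Unsupervised template creation: tiny k-means over feature vectors."""
        rng = rng or np.random.default_rng(0)
        templates = features[rng.choice(len(features), k, replace=False)]
        for _ in range(iters):
            # Assign each vector to its nearest template (Euclidean distance).
            d = np.linalg.norm(features[:, None, :] - templates[None, :, :], axis=2)
            assign = d.argmin(axis=1)
            for j in range(k):
                if np.any(assign == j):
                    templates[j] = features[assign == j].mean(axis=0)
        return templates

    def match(feature, templates):
        """Matching: return the index of the nearest stored template."""
        return int(np.linalg.norm(templates - feature, axis=1).argmin())

    feats = np.random.randn(200, 8)          # stand-in embeddings from an edge model
    templates = build_templates(feats)
    print("matched template:", match(feats[0], templates))

Storing only k small template vectors and comparing by distance keeps both memory and compute within reach of microcontroller-class hardware.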

Biography:
Swapnil Sayan Saha is a Ph.D. candidate in Electrical and Computer Engineering (ECE) at UCLA. He received his M.S. in ECE from UCLA in 2021 and his B.Sc. in Electrical and Electronic Engineering from the University of Dhaka in 2019. He works in the Networked and Embedded Systems Lab under Professor Mani Srivastava, where he creates deployable, physics-aware, uncertainty-injected, learning-enabled, and resource-constrained embedded systems. He has published 25+ peer-reviewed articles and received 30+ awards in robotics, technical, and business-case competitions worldwide. He is also the Director of Graduate Events of the UCLA Graduate Students Association (GSA), Vice President of the UCLA Engineering Graduate Students Association (EGSA), and Vice President of the UCLA Electrical and Computer Engineering Graduate and Postdoc Society (ECEGAPS).
