Computational Imaging and Sensing in Diagnostics with Deep Learning


Via Zoom


Abstract:  Computational imaging and sensing aim to redesign optical systems from the ground up, jointly designing the hardware/sensors and the software/reconstruction algorithms to enable new modalities with superior capabilities, speed, cost, and/or footprint. Such systems can often be optimized with a target application in mind, such as low-light imaging or remote sensing in a specific spectral regime. Over the last decade, the increased availability of data and of cost-effective computational resources, coupled with the commoditization of neural networks, has accelerated and expanded the potential of these computational sensing systems.

First, I will present our work on a cost-effective system for quantifying antimicrobial resistance (AMR), which could be of particular use in resource-limited settings, where poverty, population density, and a lack of healthcare infrastructure drive the emergence of some of the most resistant bacterial strains. The device uses optical fibers to spatially subsample all 96 wells of a standard microplate without any scanning components, and a neural network identifies bacterial growth from the optical intensity information captured by the fibers. Our accelerated antimicrobial susceptibility testing (AST) system interfaces with the existing clinical laboratory workflow and, when blindly tested on patient isolates at UCLA Health, identified bacterial growth after an average of 5.72 h, compared with the 18–24 h required by the gold-standard method. The system is fully automated, eliminating the need for a trained medical technologist to manually inspect each well of the 96-well microplate for growth.
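As a rough illustration of the detection step, the sketch below flags growth in a single well from its transmitted-intensity time series using a simple relative-change threshold. This is a stand-in for the trained neural network in the actual system; the threshold, time points, and intensity values are all invented for illustration and are not from the talk.

```python
import numpy as np

def detect_growth(intensity, threshold=0.15):
    """Flag growth when the relative intensity change from the first
    (baseline) reading exceeds a threshold.

    A thresholding stand-in for the system's trained neural network;
    the default threshold is illustrative only.
    """
    baseline = intensity[0]
    change = np.abs(intensity - baseline) / baseline
    return bool(np.any(change > threshold))

# Synthetic example: hourly intensity reads over 8 h for two wells.
rng = np.random.default_rng(0)
hours = np.arange(9)
# Bacterial growth increases turbidity, so transmitted intensity falls.
growth_well = 1.0 - 0.05 * hours + rng.normal(0, 0.005, hours.size)
sterile_well = np.ones(hours.size) + rng.normal(0, 0.005, hours.size)

print(detect_growth(growth_well))   # growing well
print(detect_growth(sterile_well))  # sterile well
```

In the real system, a classifier would see the intensities from all fiber subsampling points of a well, not a single scalar per time point, which is what motivates learning the decision rule rather than hand-tuning a threshold.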

Second, I will discuss a deep learning-enabled spectrometer framework based on localized surface plasmon resonance (LSPR). By fabricating an array of periodic nanostructures with varying geometries, we created a “spectral encoder chip” whose spatial transmission pattern depends on the incident spectrum of light. A neural network then reconstructs the underlying spectrum from the transmitted intensities captured by a CMOS image sensor. Unlike conventional diffraction-based spectrometers, this framework is scalable to large areas through imprint lithography, conducive to compact, lightweight designs, and, crucially, free of the resolution–signal strength tradeoff inherent to grating-based designs.
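The reconstruction step can be sketched as a linear inverse problem: each nanostructure tile transmits the incident spectrum weighted by its own spectral response, so the sensor readings are y = A s for a calibration matrix A whose rows are the tile responses. Below, a regularized least-squares solve stands in for the neural network the framework actually trains; the matrix sizes, random responses, and test spectrum are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_wavelengths, n_tiles = 64, 128   # illustrative sizes, not from the talk

# Each row of A is the (calibrated) transmission response of one nanostructure
# tile across wavelength; random responses stand in for measured LSPR curves.
A = rng.random((n_tiles, n_wavelengths))

# Ground-truth test spectrum: two Gaussian peaks on a normalized axis.
wl = np.linspace(0.0, 1.0, n_wavelengths)
s_true = np.exp(-((wl - 0.3) / 0.05) ** 2) + 0.5 * np.exp(-((wl - 0.7) / 0.08) ** 2)

# Tile intensities recorded by the CMOS sensor, with small read noise.
y = A @ s_true + rng.normal(0, 1e-3, n_tiles)

# Ridge-regularized least-squares recovery: (A^T A + lam I) s = A^T y.
# (The framework trains a neural network for this step instead.)
lam = 1e-3
s_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_wavelengths), A.T @ y)

print(float(np.max(np.abs(s_hat - s_true))))  # small reconstruction error
```

A learned reconstructor can outperform this linear solve when the responses are imperfectly calibrated or the noise is non-Gaussian, which is one motivation for the deep learning approach.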

Biography:  Calvin Brown earned a B.S. in Engineering Physics from the University of California, Berkeley in 2015 and an M.S. in Electrical Engineering from UCLA in 2017. His current research sits at the intersection of computational imaging/sensing and artificial intelligence, particularly for medical and global health applications. In addition, he has conducted research at Lawrence Livermore National Laboratory, the Berkeley Orthopaedic Biomechanics Laboratory, and Helmholtz-Zentrum Berlin in various engineering disciplines. Calvin was awarded an NSF Graduate Research Fellowship in 2016, won third place for his presentation in the Technology & Engineering category at the 2018 Emerging Researchers National (ERN) Conference, and received a 2019 NSF PATHS-UP Innovation Seed Fund award.

For more information, contact Prof. Aydogan Ozcan.

Date(s) - Oct 21, 2020
2:00 pm - 4:00 pm
