January 2020 UWIN Seminar speakers: Fred Rieke and Arka Majumdar

The UWIN seminars start in January 2020 with a pair of short talks by Fred Rieke and Arka Majumdar. The seminar is on Wednesday, January 8, 2020 at 3:30pm in Husky Union Building (HUB) 337. Refreshments will be served prior to the talks.

“Towards generalizable models for sensory responses”

Fred Rieke, Professor, Department of Physiology and Biophysics, University of Washington

“Extreme miniaturization of optics using metasurface and computational imaging”

Arka Majumdar, Assistant Professor, Department of Electrical and Computer Engineering, University of Washington

Abstracts:

“Towards generalizable models for sensory responses” (Fred Rieke)

Receptive field models attempt to concisely summarize neuronal responses to sensory inputs. These models instantiate our understanding of the mechanistic basis of circuit function, and by doing so, help identify gaps in that understanding. Current models, however, often fail to generalize: they cannot predict responses to stimuli other than those to which they were directly fit. Such failures are particularly striking for stimuli with the large dynamic range and strong temporal and spatial correlations characteristic of natural visual images. I will summarize recent work aimed at generating models that generalize better across stimuli, and discuss what those models can reveal about key circuit functions.

“Extreme miniaturization of optics using metasurface and computational imaging” (Arka Majumdar)

Modern image sensors consist of systems of cascaded and bulky spherical optics for imaging with minimal aberrations. While these systems provide high-quality images, the improved functionality comes at the cost of increased size and weight. One route to reducing a system’s complexity is computational imaging, in which much of the aberration correction and functionality of the optical hardware is shifted to post-processing in software. Alternatively, a designer could miniaturize the optics by replacing them with diffractive optical elements, which mimic the functionality of refractive systems in a more compact form factor. Metasurfaces are an extreme example of such diffractive elements, in which quasiperiodic arrays of resonant subwavelength optical antennas impart spatially varying changes on a wavefront. While computational imaging and metasurfaces are each promising avenues toward simplifying optical systems, a synergistic combination of the two can further enhance system performance and enable advanced capabilities. In this talk, I will present a method that combines these two techniques to perform full-color imaging across the whole visible spectrum. I will also discuss the use of computational techniques to design new metasurfaces, and the use of metasurfaces to perform computation on wavefronts, with applications in optical information processing and sensing.