Sung Soo Kim, Ph.D.
Thursday, April 27, 2023
10am
Online zoom (ID: 728-142-6028)
Neuro@noon Seminar
Date: 10am, Thursday, April 27th
Speaker: Sung Soo Kim, Ph.D. (University of California, Santa Barbara)
Title: Visual processing for the fruit fly head direction system
Abstract: Vision provides the richest and most reliable information for animals navigating an environment. Your uncertain sense of location and direction in a dark room vanishes as you flick on the light switch and receive visual input from the illuminated scene. Strong visual influence on the navigation system is common across animal species, including the fruit fly, Drosophila melanogaster. In flies, E-PG neurons, each innervating one sector of the torus-shaped ellipsoid body, encode the animal’s heading. Heading is represented by a localized peak, or “bump”, of population activity: the bump moves smoothly around the ellipsoid body as the fly turns, and its position relative to a visual landmark is consistent across trials. This suggests that visual input is essential for encoding heading, yet the nature of the relevant visual features and how E-PG neurons use them are not well understood. Recently, we used diverse naturalistic visual stimuli to understand how the bump position is determined. We found that E-PG neurons can store a unique bump position relative to the orientation of each naturalistic scene. Interestingly, the more different two scenes are, the less predictable the bump position for one scene is from the bump position of the other. This suggests that when two scenes differ, they are decoupled in the memory system, providing an important clue to the mechanisms of scene memory. Because our computational modeling suggests that the number of visual features may dictate memory capacity, we used electron-microscopy data to reconstruct, at synaptic resolution, the entire visual pathway from the optic lobes to the E-PG neurons. Using this connectome data, which includes channels for all visual features, we predicted and confirmed that light wavelength and visual stimulus shape are encoded differently across distinct populations of visual neurons presynaptic to E-PG neurons.
Overall, this line of work will provide a comprehensive understanding of how visual features are extracted and transformed across multiple stages of processing to support the computation of the fly’s head direction.