Scale-Invariant Neural Dynamics for Cognition

Speaker: Yue Liu

When: September 23, 2019 (Mon), 10:00AM to 11:00AM
Location: RKC 106C

This event is part of the Preliminary Oral Exam.

Examining Committee: Marc W. Howard, Michael E. Hasselmo, Pankaj Mehta, Martin Schmaltz

Abstract: The brain operates in a world with rich dynamics across a wide range of timescales, so brain state should reflect these dynamics. Limited by experimental techniques and the nature of behavior, most established results in systems neuroscience concern static feature detectors. New techniques for large-scale, chronic measurement of neural activity open up the opportunity to investigate neural dynamics across different timescales. In this talk I will present modeling, theoretical, and data-analysis work on a particular type of temporal dynamics - scale-invariant dynamics - which has been implicated by behavioral experiments and neural data. I will start with a neural circuit model that uses the Laplace transform and its inverse to produce scale-invariant sequential neural activity, and point out evidence for the elements of the model in neural data. I will then present a theoretical analysis of the ability of a linear recurrent neural network to generate scale-invariant neural activity. I will show that a geometric series of eigenvalues and translated eigenvectors in the connectivity matrix are needed to generate scale-invariant activity. Finally, I will show the existence of reliable neural dynamics on the timescale of minutes in neural data. Taken together, these results suggest that neural activity has important dynamics over a much wider range of timescales than previously thought, and I will explore the consequences for neural circuit models.
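The Laplace-transform circuit mentioned in the abstract can be sketched in a few lines. This is a minimal illustration, not the speaker's actual model or code: it assumes a bank of leaky integrators with rate constants s that encodes the Laplace transform of an impulse input, F(s, t) = exp(-s t), and an inverse step in the style of Post's approximation, under which each unit's impulse response is proportional to (t/tau)^k exp(-k t/tau) with tau = k/s. Each unit then peaks at its own tau with a width proportional to tau, which is the scale-invariant sequential activity the abstract describes; the order k and the set of taus below are arbitrary choices for the demo.

```python
import numpy as np

k = 4                                   # order of the inverse-Laplace approximation (assumed)
taus = np.array([1.0, 2.0, 4.0, 8.0])   # geometrically spaced peak times (assumed)
t = np.linspace(0.01, 40.0, 8000)       # time axis after an impulse at t = 0

def time_cell(t, tau, k):
    """Impulse response of one inverse-Laplace unit: (t/tau)^k * exp(-k t/tau).

    The maximum sits at t = tau, and rescaling tau rescales the whole
    profile, so the population forms a scale-invariant sequence.
    """
    u = t / tau
    return u**k * np.exp(-k * u)

# Each unit fires in sequence, peaking near its own tau.
peaks = [t[np.argmax(time_cell(t, tau, k))] for tau in taus]
print(np.round(peaks, 2))
```

Because the taus are spaced geometrically, the peaks tile log time evenly, which is one way the scale-invariance of the sequence shows up in such models.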