Abstract
There is an avalanche of new data on the brain's activity, revealing the collective dynamics of vast numbers of neurons. In principle, these collective dynamics can be of almost arbitrarily high dimension, with many independent degrees of freedom, and this may reflect powerful capacities for general-purpose computation and information processing. In practice, neural datasets reveal a range of outcomes, including collective dynamics of much lower dimension, and this may reflect the structure of behavioral tasks and learning rules. For what networks does each case occur? The speaker will present his contribution to the answer: a new framework that links tractable statistical properties of network connectivity with the dimension of the activity it produces. He will also explore how and why connectivity features that impact dimension arise as networks learn to perform basic tasks. He will describe where he has succeeded, where he has failed, and the many avenues that remain.
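As a rough illustration of what "dimension of the activity" can mean in practice, one widely used measure is the participation ratio of the eigenvalue spectrum of the activity covariance. The sketch below is a minimal, assumed illustration using a toy linear rate network with random Gaussian connectivity; it is not the speaker's framework, and the parameter choices (network size, coupling strength g) are placeholders.

```python
import numpy as np

# Minimal sketch (illustrative assumption, not the speaker's method):
# estimate the "dimension" of collective activity via the participation
# ratio of the activity covariance spectrum.

rng = np.random.default_rng(0)
n_neurons, n_timesteps = 200, 5000

# Toy linear rate dynamics driven by noise: x_{t+1} = W x_t + noise.
# The coupling strength g is an assumed knob controlling how strongly
# recurrent connectivity shapes the collective activity.
g = 0.5
W = g * rng.standard_normal((n_neurons, n_neurons)) / np.sqrt(n_neurons)

x = np.zeros(n_neurons)
activity = np.empty((n_timesteps, n_neurons))
for t in range(n_timesteps):
    x = W @ x + rng.standard_normal(n_neurons)
    activity[t] = x

# Participation ratio: (sum of covariance eigenvalues)^2 divided by the
# sum of squared eigenvalues. It ranges from 1 (activity confined to a
# single direction) up to n_neurons (all directions contribute equally).
eigvals = np.linalg.eigvalsh(np.cov(activity.T))
participation_ratio = eigvals.sum() ** 2 / np.sum(eigvals ** 2)
print(f"Estimated dimension (participation ratio): {participation_ratio:.1f}")
```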
About the speaker
Prof. Eric Shea-Brown received his PhD in Applied and Computational Mathematics from Princeton University in 2004. He joined the University of Washington in 2008, where he is currently a Professor of Applied Mathematics. He is also an affiliate investigator at the Allen Institute for Brain Science, an adjunct faculty member in the Department of Physiology and Biophysics, and a member of the Program in Neuroscience.
Prof. Shea-Brown's research focuses on the time course of decision making in neural networks, the mechanisms and consequences of coordinated spike patterns, graph-theoretic tools for cutting neural connectomics problems down to size, and the stability/chaos transition in spiking circuits.
Prof. Shea-Brown has received numerous awards, including the Simons Fellowship in Mathematics (2014), the US National Science Foundation CAREER Award (2011-2016), and the Burroughs Wellcome Fund Career Award at the Scientific Interface (2006-2013).