Abstract:
Flexible function is essential for the brain to cope with varying environments, changing quality of sensory information, and context dependence. This requires organized communication and flexible information processing among the different sub-circuits of neuronal networks. Yet, while network oscillations, and in particular coherent dynamics between different sub-networks, have been shown to change correlation structures, the precise mechanisms underlying the self-regulated processing of information in brains are not well understood.
Here we propose [1,2] that neuronal network activity has two separate components: a collective reference state on top of which information is encoded and distributed as deviations from this reference, similar to how radio signals broadcast information via frequency or amplitude modulation. Switching between dynamical reference states then enables fast and flexible rerouting of information through a network [1]. In particular, for coupled oscillator networks we show analytically how the physical network structure and the dynamical reference state act together to generate a specific information routing pattern, as quantified by transfer entropy. In modular networks, we find that local changes within a sub-network, e.g. as a result of local processing, can influence the network's global reference dynamics and thereby actively control the network-wide distribution of information; this in turn influences the local processing. Thus, in this loop, the network, while performing computations, also continuously updates its own function in a dynamic and flexible way [2]. We show numerically that this mechanism for self-organized information processing naturally enables context-dependent pattern recognition in an oscillatory Hopfield network and an analog version of belief propagation.
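To give a flavor of the routing mechanism, the following minimal sketch (illustrative only, not code from [1]) prepares a Kuramoto-type ring of phase oscillators in two different phase-locked reference states, injects the same weak signal at one node as phase deviations, and uses a simple binned transfer entropy estimate to quantify how the signal reaches another node. The ring topology, all parameter values, the node choices, and the crude estimator are assumptions made for this sketch.

import numpy as np

rng = np.random.default_rng(0)

def simulate(A, omega, theta0, drive_node, drive, K=2.0, dt=0.01):
    # Euler integration of Kuramoto phases:
    #   dtheta_i/dt = omega_i + (K/n) * sum_j A_ij * sin(theta_j - theta_i),
    # with a weak signal injected at drive_node on top of the reference dynamics.
    n, steps = len(omega), len(drive)
    theta = theta0.copy()
    traj = np.empty((steps, n))
    for t in range(steps):
        coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
        dtheta = omega + (K / n) * coupling
        dtheta[drive_node] += drive[t]
        theta = theta + dt * dtheta
        traj[t] = theta
    return traj

def transfer_entropy(x, y, bins=6, lag=1):
    # Crude binned estimate of TE(x -> y) = I(y_{t+lag}; x_t | y_t) in bits.
    edges = lambda v: np.quantile(v, np.linspace(0, 1, bins + 1)[1:-1])
    xd, yd = np.digitize(x, edges(x)), np.digitize(y, edges(y))
    yf, yp, xp = yd[lag:], yd[:-lag], xd[:-lag]
    def H(*vs):  # joint Shannon entropy of discrete sequences
        _, counts = np.unique(np.stack(vs), axis=1, return_counts=True)
        p = counts / counts.sum()
        return -(p * np.log2(p)).sum()
    return H(yf, yp) + H(yp, xp) - H(yp) - H(yf, yp, xp)

# Ring of n identical oscillators; two stable phase-locked reference states
# (in-phase and "twisted"); the same weak noise signal is injected at node 0.
n = 10
A = np.zeros((n, n))
for i in range(n):
    A[i, (i - 1) % n] = A[i, (i + 1) % n] = 1.0
omega = np.zeros(n)
drive = 0.3 * rng.standard_normal(60000)

for name, theta0 in [("in-phase", np.zeros(n)),
                     ("twisted", 2 * np.pi * np.arange(n) / n)]:
    traj = simulate(A, omega, theta0, drive_node=0, drive=drive)
    dev = traj - traj.mean(axis=1, keepdims=True)  # deviations from the collective reference
    dev = dev[::20]                                # subsample to decorrelate consecutive samples
    te = transfer_entropy(dev[:, 0], dev[:, 3])
    print(f"{name} reference: TE(node 0 -> node 3) ~ {te:.3f} bits")

The comparison is the point of the sketch: linearizing the dynamics around each reference state weights every physical link by the cosine of its locked phase difference, so the same anatomical network yields a different effective routing of the deviations depending on the collective state it is in.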
We are currently exploring learning strategies within this approach [3] and developing novel data analysis tools that combine dimensionality reduction with dynamic motif detection to identify candidate reference dynamics in multi-site electrode recordings of brain activity.
If time permits, we will also discuss how we are currently using our brain-inspired approach to design novel neuromorphic hardware based on energy-efficient superconducting oscillators [4].
[1] Kirst, Timme, Battaglia, Nature Communications (2016)
[2] Kirst, Magnasco, Modes, Current Opinion in Systems Biology (2017)
[3] Zhang, Kirst (in preparation)
[4] Cheng, Kirst*, Vasudevan, IEEE Transactions on Applied Superconductivity (2023, in revision)
Biography:
CHRISTOPH KIRST, Ph.D., is a theoretical neuroscientist interested in flexible computation and the coordination of information processing in neuronal circuits of brains and machines. Christoph's work combines mathematical theory and dynamical-systems modeling with algorithm development, machine learning, and the analysis of large-scale data sets.
Christoph is currently an Assistant Professor at the University of California, San Francisco (UCSF) and a Faculty Scientist at the Lawrence Berkeley National Laboratory (LBL). After studying mathematics and physics at the University of Göttingen (Germany), Oxford University (UK), and the Humboldt University of Berlin (Germany), Christoph obtained a Master's degree in Mathematics from Cambridge University (UK) and a Diploma in Physics from the University of Göttingen (Germany).
For his PhD in theoretical physics at the Max Planck Institute for Dynamics and Self-Organization (Germany), Christoph studied collective dynamics and computation in complex networks. After studying universal modulation mechanisms of single-neuron dynamics at the Ludwig Maximilian University of Munich (Germany), he became an independent research fellow in Physics and Biology at the Rockefeller University (New York City). Christoph also became an independent Kavli Physics Fellow and a member of the Steering Committee at the Kavli Neural Systems Institute (New York City), where he developed tools for the analysis of large-scale data sets to better understand brain activity, brain structure, and their flexible, modulatory function. He also developed fundamental theory for mechanisms that coordinate large-scale distributed processing in neuronal networks, which is currently being used to develop a new generation of neuromorphic computing devices.