Our group develops and adapts cutting-edge approaches from mathematics, machine learning, and physics to analyze learning, dynamics, efficiency, and robustness of computation in the brain.
We wish to uncover how high-level cognitive function emerges from the bottom up. What aspects of circuit architecture drive the emergence of function from simple constituents? What are the dynamical properties of neural circuits, how do they arise from the circuit, and how do they support function? What are the properties of memory systems constructed from noisy, leaky neurons, and how are the limitations of these building blocks overcome? What structural biases (such as modularity) make brains efficient and robust learners? How do simple plasticity and competitive rules drive the emergence of these structures over development?
In pursuing answers to these questions, we find that the brain contains neural circuits that express invariant low-dimensional dynamics, and that these circuits underlie fundamental computations like integration. We show how such circuits can combine to support flexible and data-efficient computation, inference, and learning. We find that the brain contains surprising new analog error-correcting codes that enable fault-tolerant computation. We identify ways in which the brain outperforms modern AI in learning speed, data efficiency, and robustness, and seek to transfer these insights to build better machine intelligence.
To learn more, please see our Publications.
Who we are: Each path into science is idiosyncratic and unique, and this idiosyncrasy is a strength that leads to new ways of looking at and solving problems, as you can see from our People page.
Accessibility at MIT: http://accessibility.mit.edu