Reading list
**Wang, L., Zhang, X., Su, H., & Zhu, J. (2024). A comprehensive survey of continual learning: Theory, method and application. IEEE Transactions on Pattern Analysis and Machine Intelligence, 46(8), 5362-5383.
**Ehret, B., Henning, C., Cervera, M. R., Meulemans, A., von Oswald, J., & Grewe, B. F. (2021). Continual learning in recurrent neural networks. ICLR 2021.
**Barry, M. L., Gerstner, W., & Bellec, G. (2025). Context selectivity with dynamic availability enables lifelong continual learning. Neural Networks, 107728.
**Shan, H., Minni, S., & Duncker, L. (2025). Separating the what and how of compositional computation to enable reuse and continual learning. NeurIPS 2025.
**Zheng, W. L., Wu, Z., Hummos, A., Yang, G. R., & Halassa, M. M. (2024). Rapid context inference in a thalamocortical model using recurrent neural networks. Nature Communications, 15(1), 8275.
*Hummos, A. (2023). Thalamus: a brain-inspired algorithm for biologically-plausible continual learning and disentangled representations. ICLR 2023.
*Mishra, P., & Narayanan, R. (2021). Stable continual learning through structured multiscale plasticity manifolds. Current Opinion in Neurobiology, 70, 51-63.
*Logiaco, L., et al. (2021). Thalamic control of cortical dynamics in a model of flexible motor sequencing. Cell Reports, 35(9), 109090.
*Costacurta, J., Bhandarkar, S., Zoltowski, D., & Linderman, S. (2024). Structured flexibility in recurrent neural networks via neuromodulation. Advances in Neural Information Processing Systems, 37, 1954-1972.
*Hiratani, N. (2024). Disentangling and mitigating the impact of task similarity for continual learning. Advances in Neural Information Processing Systems, 37, 3243-3274.
Driscoll, L. N., Shenoy, K., & Sussillo, D. (2024). Flexible multitask computation in recurrent networks utilizes shared dynamical motifs. Nature Neuroscience, 27(7), 1349-1363.
Zenke, F., & Laborieux, A. (2024). Theories of synaptic memory consolidation and intelligent plasticity for continual learning. arXiv preprint arXiv:2405.16922.
Heald, J. B., Lengyel, M., & Wolpert, D. M. (2021). Contextual inference underlies the learning of sensorimotor repertoires. Nature, 600(7889), 489-493.
Heald, J. B., Lengyel, M., & Wolpert, D. M. (2023). Contextual inference in learning and memory. Trends in Cognitive Sciences, 27(1), 43-64.
Cossu, A., Carta, A., Lomonaco, V., & Bacciu, D. (2021). Continual learning for recurrent neural networks: an empirical evaluation. Neural Networks, 143, 607-627.
Wortsman, M., Ramanujan, V., Liu, R., Kembhavi, A., Rastegari, M., Yosinski, J., & Farhadi, A. (2020). Supermasks in superposition. Advances in Neural Information Processing Systems, 33, 15173-15184.
Sandbrink, K., Bauer, J., Proca, A., Saxe, A., Summerfield, C., & Hummos, A. (2024). Flexible task abstractions emerge in linear networks with fast and bounded units. Advances in Neural Information Processing Systems, 37, 6938-6978.
Todo
- Learn JAX.
- Reproduce the main results from Leo's thesis in JAX.
- Present the literature on continual learning (CL) in RNNs.
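As a warm-up for the first two items, here is a minimal, self-contained JAX sketch that is close to the reading list's theme: a vanilla tanh-RNN rolled out with `jax.lax.scan`, differentiated through time with `jax.grad`. All names, sizes, and the quadratic loss are illustrative choices, not taken from any of the papers above.

```python
# Minimal JAX warm-up: a vanilla RNN step and a gradient through time.
# Shapes, initialization, and the loss are illustrative assumptions.
import jax
import jax.numpy as jnp

def rnn_step(params, h, x):
    """One tanh-RNN update: h' = tanh(W_h h + W_x x + b)."""
    W_h, W_x, b = params
    return jnp.tanh(W_h @ h + W_x @ x + b)

def loss(params, h0, xs, target):
    """Roll the RNN over a sequence with lax.scan; score the final state."""
    h_final, _ = jax.lax.scan(
        lambda h, x: (rnn_step(params, h, x), None), h0, xs
    )
    return jnp.sum((h_final - target) ** 2)

key = jax.random.PRNGKey(0)
n_h, n_x, T = 4, 3, 5  # hidden size, input size, sequence length
k1, k2, k3 = jax.random.split(key, 3)
params = (
    0.1 * jax.random.normal(k1, (n_h, n_h)),  # W_h
    0.1 * jax.random.normal(k2, (n_h, n_x)),  # W_x
    jnp.zeros(n_h),                           # b
)
xs = jax.random.normal(k3, (T, n_x))
h0 = jnp.zeros(n_h)
target = jnp.ones(n_h)

# Gradients w.r.t. (W_h, W_x, b), backpropagated through the scan.
grads = jax.grad(loss)(params, h0, xs, target)
print(tuple(g.shape for g in grads))
```

`jax.grad` differentiates through the whole rollout, so each gradient has the same shape as its parameter; swapping the loss or step function is enough to start reproducing results from the continual-learning RNN papers above.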