Efficient recurrent architectures through activity sparsity and sparse back-propagation through time. A Subramoney, KK Nazeer, M Schöne, C Mayr, D Kappel. arXiv preprint arXiv:2206.06178, 2022. Cited by 14.
SpiNNaker2: A large-scale neuromorphic system for event-based and asynchronous machine learning. HA Gonzalez, J Huang, F Kelber, KK Nazeer, T Langer, C Liu, et al. arXiv preprint arXiv:2401.04491, 2024. Cited by 4.
Language Modeling on a SpiNNaker 2 Neuromorphic Chip. KK Nazeer, M Schöne, R Mukherji, C Mayr, D Kappel, A Subramoney. arXiv preprint arXiv:2312.09084, 2023. Cited by 2.
Block-local learning with probabilistic latent representations. D Kappel, KK Nazeer, CT Fokam, C Mayr, A Subramoney. arXiv preprint arXiv:2305.14974, 2023. Cited by 2.
Activity sparsity complements weight sparsity for efficient RNN inference. R Mukherji, M Schöne, KK Nazeer, C Mayr, A Subramoney. arXiv preprint arXiv:2311.07625, 2023. Cited by 1.
EGRU: Event-based GRU for activity-sparse inference and learning. A Subramoney, KK Nazeer, M Schöne, C Mayr, D Kappel. 2022. Cited by 1.
Weight Sparsity Complements Activity Sparsity in Neuromorphic Language Models. R Mukherji, M Schöne, KK Nazeer, C Mayr, D Kappel, A Subramoney. arXiv preprint arXiv:2405.00433, 2024.
An efficient RNN Language Model using activity sparsity and sparse back-propagation through time. M Schöne, KK Nazeer, C Mayr, D Kappel, A Subramoney.