Training Neural Networks to Use Time Like the Brain Does


About this audio

Featured paper: [**Efficient event-based delay learning in spiking neural networks**](https://doi.org/10.1038/s41467-025-65394-8)

What if AI could learn to use time the way your brain does, with a fraction of the energy? In this episode, we explore groundbreaking research that's revolutionizing spiking neural networks by teaching them to master synaptic delays. Discover how this brain-inspired approach uses sparse, event-driven spikes instead of constant data streams, slashing energy consumption while processing temporal information.

We dive into the breakthrough EventProp algorithm that calculates exact gradients for both connection weights and delays, running 26 times faster than previous methods while using half the memory. Learn why adding learnable delays transforms small networks into powerhouses, achieving state-of-the-art accuracy with five times fewer parameters on speech recognition tasks. Join us as we unpack how this event-based training is paving the way for neuromorphic hardware that thinks like the brain but runs on just 20 watts of power. Perfect for anyone fascinated by the future of energy-efficient AI that truly understands the language of time.

*Disclaimer: This content was generated by NotebookLM. Dr. Tram doesn't know anything about this topic and is learning about it.*
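To make the idea of learnable synaptic delays concrete, here is a minimal, hypothetical Python sketch (not the paper's EventProp implementation, and all names and values are illustrative assumptions): each synapse carries its own delay parameter, which shifts when its spikes arrive at the postsynaptic neuron, so the network can learn *when* inputs coincide, not just how strongly they connect.

```python
# Illustrative sketch only: per-synapse delays in an event-based setting.
# A learning rule such as EventProp would adjust `delays` via gradients;
# here we just show how delays reshape the event stream.

def delayed_spike_times(input_spikes, delays):
    """Shift each presynaptic spike by its synapse's delay.

    input_spikes: dict mapping synapse id -> list of spike times (ms)
    delays: dict mapping synapse id -> learnable delay (ms)
    Returns a time-sorted list of (arrival_time, synapse_id) events.
    """
    events = []
    for syn, times in input_spikes.items():
        for t in times:
            events.append((t + delays[syn], syn))
    return sorted(events)

# Two synapses with different (hypothetical) learned delays:
spikes = {0: [1.0, 5.0], 1: [2.0]}
delays = {0: 0.5, 1: 3.0}
print(delayed_spike_times(spikes, delays))
# [(1.5, 0), (5.0, 1), (5.5, 0)]
```

Note how synapse 1's 3 ms delay moves its spike next to synapse 0's second spike: tuning delays lets the network align events in time, which is exactly the temporal structure the episode discusses.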
