Fatiguing Leaky Integrate and Fire Neurons (fLIF)
- Neurons are connected via weighted connections
- Neurons collect activation
- Neurons fire if they get enough activation
- When a neuron fires, it sends activation on to other neurons
- This is essentially the Hopfield model, though we use sparse,
uni-directional connections. That is Integrate and Fire.
- Leaky means that if a neuron doesn't fire in a cycle,
some of the activation leaks away.
- LIF neurons are widely used in computational modelling
- Fatigue means that the more a neuron fires, the harder
it becomes to fire
- We model this by increasing the firing threshold when a
neuron fires, and decreasing it when it doesn't
- There's not much work with fatiguing LIF neurons
- It's a pretty good model of biological neural behaviour
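The mechanics above (integrate, leak, fire, fatigue) can be sketched as a minimal single-neuron simulation. This is an illustrative sketch, not the specific model described here: the parameter names and values (`leak`, `fatigue_up`, `fatigue_down`, the base threshold) are assumptions chosen for clarity.

```python
class FLIFNeuron:
    """A minimal fatiguing leaky integrate-and-fire (fLIF) neuron sketch."""

    def __init__(self, threshold=1.0, leak=0.5,
                 fatigue_up=0.2, fatigue_down=0.1):
        self.base_threshold = threshold
        self.threshold = threshold        # current (fatigued) firing threshold
        self.leak = leak                  # fraction of activation retained per quiet cycle
        self.fatigue_up = fatigue_up      # threshold increase after a spike
        self.fatigue_down = fatigue_down  # threshold recovery per quiet cycle
        self.activation = 0.0

    def step(self, input_activation):
        """Integrate one cycle's input; return True if the neuron fires."""
        self.activation += input_activation
        if self.activation >= self.threshold:
            # Fire: reset activation, and fatigue makes the next fire harder.
            self.activation = 0.0
            self.threshold += self.fatigue_up
            return True
        # No fire: some activation leaks away ...
        self.activation *= self.leak
        # ... and the threshold recovers back toward its base value.
        self.threshold = max(self.base_threshold,
                             self.threshold - self.fatigue_down)
        return False


# Drive the neuron with a constant sub-threshold input: activation
# accumulates (minus leak) until the neuron fires, then fatigue raises
# the threshold before it slowly recovers.
neuron = FLIFNeuron()
spikes = [neuron.step(0.6) for _ in range(10)]
```

Dropping the fatigue update recovers a plain LIF neuron; dropping the leak as well recovers plain Integrate and Fire.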