Fatiguing Leaky Integrate-and-Fire (fLIF) neurons are the model
we use.
They're not as biologically accurate as compartmental models, but
they are accurate enough for our purposes, and we can simulate
100,000 of them in real time on my PC.
IF neurons are McCulloch-Pitts neurons (1943): a neuron collects
activation, and if the total surpasses a threshold, it fires and
sends out activation.
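The threshold rule above can be sketched as a simple function; the weights and inputs here are illustrative values of my own, not taken from any particular network.

```python
def if_neuron(inputs, weights, threshold):
    """McCulloch-Pitts style unit: fire (return 1) if the weighted
    sum of inputs reaches the threshold, otherwise return 0."""
    activation = sum(w * x for w, x in zip(weights, inputs))
    return 1 if activation >= threshold else 0

# Two active inputs push the unit over threshold; one does not.
if_neuron([1, 1, 0], [0.5, 0.6, 0.9], threshold=1.0)  # fires
if_neuron([1, 0, 0], [0.5, 0.6, 0.9], threshold=1.0)  # silent
```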
LIF neurons are commonly used. If a neuron does not fire in a given
cycle (or, for continuous models, gradually), some of its activation
leaks away.
Fatigue means that the more a neuron fires, the harder it becomes
to fire. We model this by raising the threshold when a neuron fires
and lowering it (down to a base value) when it doesn't fire.
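Putting leak and fatigue together, one discrete fLIF cycle might look like the sketch below. All parameter names and values (`LEAK`, `FATIGUE_STEP`, `RECOVERY_STEP`, `BASE_THRESHOLD`) are illustrative assumptions, not our actual settings.

```python
BASE_THRESHOLD = 1.0   # resting firing threshold (assumed value)
LEAK = 0.5             # fraction of activation retained when not firing
FATIGUE_STEP = 0.2     # threshold increase after a spike
RECOVERY_STEP = 0.1    # threshold decrease per silent cycle

def flif_step(activation, threshold, incoming):
    """Advance one fLIF neuron by one discrete cycle.

    Returns (fired, new_activation, new_threshold)."""
    activation += incoming
    if activation >= threshold:
        # Fire: reset activation and raise the threshold (fatigue).
        return True, 0.0, threshold + FATIGUE_STEP
    # No spike: activation leaks away, threshold recovers toward base.
    new_threshold = max(BASE_THRESHOLD, threshold - RECOVERY_STEP)
    return False, activation * LEAK, new_threshold
```

Driving a neuron with constant input shows the fatigue effect: each spike raises the threshold, so successive spikes take more cycles to reach.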
To our knowledge, we are nearly unique in using this model.
Our model is discrete and ignores synaptic delays and
refractory periods, so each cycle is roughly equivalent to 10 ms
of biological time.