Hebbian Learning
- A while ago we showed (and I don't think it's particularly surprising)
that neurons are Turing complete. That is, given enough neurons (the
simple integrate-and-fire kind or more complex ones), you can program
anything.
- So what's really interesting about neurons is that they learn.
- They learn in lots of ways, but the simple and widely used way
is a permanent change of synaptic weights. In biology this
is long-term potentiation (and depression).
- The way we learn is a Hebbian rule:
- If neuron A tends to cause neuron B to fire, the synaptic
strength increases.
- This of course is underspecified; a toy rate-based version is
sketched below.
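
To make the underspecification concrete, here is a minimal rate-based
sketch of the plain rule. The learning rate, the linear neuron model,
and the function name are my choices, not anything fixed above:

```python
import numpy as np

def hebbian_update(w, pre, post, lr=0.01):
    """Plain Hebbian step: grow each weight in proportion to the
    coincidence of pre- and post-synaptic activity."""
    return w + lr * post * pre

# Toy example: one post-synaptic neuron with three input synapses.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, size=3)
pre = np.array([1.0, 0.0, 1.0])   # pre-synaptic firing rates
post = float(w @ pre)             # linear post-synaptic response
w = hebbian_update(w, pre, post)
```

Note that nothing here bounds the weights: repeated coincidences grow
them forever, which is part of what makes the plain rule
underspecified and motivates normalised variants like Oja's below.
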
- There is no widespread agreement on which learning rule is the
right one.
- Currently spike-timing-dependent plasticity (STDP) is in favour.
- If the pre-synaptic spike happens before the post-synaptic spike,
increase the weight; if it happens after, decrease it (see the
sketch below).
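
A sketch of the standard pair-based form of STDP with exponential
windows. The amplitudes and time constants below are illustrative
placeholders, not values from these notes:

```python
import numpy as np

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for one spike pair, dt = t_post - t_pre in ms.

    Pre before post (dt > 0) potentiates; post before pre
    (dt <= 0) depresses. Both effects decay exponentially with
    the gap between the spikes."""
    if dt > 0:
        return a_plus * np.exp(-dt / tau_plus)
    return -a_minus * np.exp(dt / tau_minus)

print(stdp_dw(5.0))    # pre 5 ms before post: positive change
print(stdp_dw(-5.0))   # post 5 ms before pre: negative change
```
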
- We started working with an older rule called Oja's rule.
- It drives each synaptic weight toward the likelihood that the
post-synaptic neuron fires when the pre-synaptic neuron does.
- We called it the correlatory rule, but it's Oja's; a minimal
sketch follows.
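
A minimal sketch of Oja's rule for a single linear neuron; the
learning rate, dimensions, and toy input distribution are my choices:

```python
import numpy as np

def oja_update(w, x, lr=0.01):
    """One Oja step: Hebbian growth y * x minus the normalising
    decay y^2 * w, which keeps ||w|| close to 1."""
    y = w @ x
    return w + lr * (y * x - y * y * w)

# With zero-mean inputs, repeated updates drive w toward a
# unit-norm estimate of the first principal component.
rng = np.random.default_rng(1)
w = rng.normal(0.0, 0.1, size=2)
for _ in range(5000):
    x = rng.multivariate_normal([0.0, 0.0], [[3.0, 1.0], [1.0, 1.0]])
    w = oja_update(w, x)
print(w, np.linalg.norm(w))
```

The extra -y^2 * w decay is what distinguishes Oja's rule from the
plain Hebbian update: it bounds the weights without an explicit
renormalisation step.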