Hebbian Learning as Correlation
- A central tenet of this work has been that learning is Hebbian.
- Aside from cell birth and death, and synaptic growth and death, Hebbian
learning seems to be the only biologically supported form of learning.
- One particularly good theoretical point is that Hebbian
learning makes synapses into correlators (a short gloss follows).
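- One way to see this (my gloss, not spelled out in these notes): with
binary activities and a plain additive rule, the per-step change is
dw = eta * x_pre * x_post, so its expected value is
eta * P(pre and post coactive). The weight therefore grows in
proportion to the co-firing probability, i.e., it accumulates a
running correlation statistic.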
- Hebb's postulate is that when two neurons are coactive, the
strength of the connection between them tends to increase.
- Assume this only applies to neurons that are connected.
- If there is an A->B connection
and an A->C connection, and B fires along with A more often
than C does, then A->B becomes stronger than A->C (see the
simulation sketch below).
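- A minimal simulation of that example. This is my sketch, assuming
binary firing and a plain additive increment; eta and the 0.8/0.3
co-firing rates are illustrative assumptions, not values from this work:

    import random

    random.seed(0)
    eta = 0.01                   # learning rate (assumed)
    w_ab = w_ac = 0.1            # initial strengths (assumed)
    for step in range(1000):     # only steps on which A fires
        b = 1 if random.random() < 0.8 else 0   # B coactive with A 80%
        c = 1 if random.random() < 0.3 else 0   # C coactive with A 30%
        w_ab += eta * b          # Hebbian: strengthen on coactivity
        w_ac += eta * c
    print(w_ab, w_ac)            # w_ab > w_ac, ranking the correlations

- The final weights come out roughly proportional to the co-firing
frequencies, which is the correlator claim in miniature.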
- Any monotonically increasing function will work, but
recently I've been using one that makes the relation between
the two linear (see the sketch below).
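- A sketch of that linear choice, assuming "the two" means co-firing
frequency and connection strength (the function name and the w_max
cap are hypothetical):

    def strength(cofire_count, pre_count, w_max=1.0):
        # Linear map from co-firing frequency in [0, 1] to strength.
        # Any other monotonically increasing map would preserve the
        # ordering of the weights; the linear one makes it proportional.
        return w_max * cofire_count / pre_count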
- Has anyone said this before?
- Extra notes: you need an anti-Hebbian learning rule, also
called a forgetting rule.
- The amount of change has to depend on the existing strength,
with large strengths increasing less and decreasing more.
- In the simulations I'm describing, I'm only using pre-not-post
anti-Hebbian learning: the weight decreases when the presynaptic
neuron fires without the postsynaptic one (see the sketch below).
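- A hedged sketch combining the three rules above: Hebbian
strengthening, pre-not-post weakening, and strength-dependent step
sizes. The multiplicative (1 - w) and w factors are my assumption;
the notes only require that large strengths increase less and
decrease more.

    def update(w, pre, post, eta=0.05):
        if pre and post:                 # Hebbian: coactive -> strengthen
            return w + eta * (1.0 - w)   # increment shrinks as w -> 1
        if pre and not post:             # pre-not-post anti-Hebbian: weaken
            return w - eta * w           # decrement grows with w
        return w                         # all other cases left unchanged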
- I worked out some mathematical theory as to why the algorithm
works (see "Neural Network Theory: from correlators to CAs").
- I did some simulations.