Hebbian Learning
- Hebb is of course more renowned (at least to computer scientists)
for Hebbian learning than for CAs.
- The basic idea is that if neuron A repeatedly takes part in firing
neuron B, then the synapse that connects them is strengthened.
- This leaves a wide range of possible learning rules.
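The basic rule above can be sketched in a few lines. This is a minimal illustration, not the author's actual rule; the variable names, learning rate, and matrix layout are all assumptions made for the example.

```python
import numpy as np

def hebbian_update(w, pre, post, lr=0.01):
    """One Hebbian step: strengthen w[i, j] whenever pre-synaptic
    neuron j and post-synaptic neuron i fire together.
    (lr and the rank-1 outer-product form are illustrative choices.)"""
    return w + lr * np.outer(post, pre)

# Two co-active neurons strengthen the connection between them.
w = np.zeros((2, 2))
pre = np.array([1.0, 0.0])   # pre-synaptic neuron 0 fired
post = np.array([1.0, 0.0])  # post-synaptic neuron 0 fired
w = hebbian_update(w, pre, post)
```

Note that this sketch only ever increases weights, which is one reason the range of possible rules (and the need for weakening) matters.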
- It's clear you also need anti-Hebbian learning, where synapses are
weakened; since it still depends on the activity of the two neurons,
it is Hebbian in the broad sense.
- We've done some work on simulating different varieties of
Hebbian rules.
- Our novel work here is on compensatory learning, where the total
synaptic strength of a neuron is limited.
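One way to realise the compensatory idea is to follow each Hebbian step with a rescaling so that a neuron's total incoming weight never exceeds a cap. This is only a sketch of that idea under assumed parameters; the rule in the actual work may differ in detail.

```python
import numpy as np

def compensatory_update(w, pre, post, lr=0.1, total=1.0):
    """Hebbian step, then scale down any post-synaptic neuron whose
    total incoming weight exceeds `total`. The cap value and scaling
    scheme are illustrative assumptions."""
    w = w + lr * np.outer(post, pre)
    sums = w.sum(axis=1, keepdims=True)          # per-neuron total
    scale = np.minimum(1.0, total / np.maximum(sums, 1e-12))
    return w * scale

# Repeated co-activity: growth saturates at the cap instead of diverging.
w = np.zeros((2, 3))
pre = np.array([1.0, 1.0, 1.0])
post = np.array([1.0, 1.0])
for _ in range(100):
    w = compensatory_update(w, pre, post)
```

The effect is that synapses compete for a fixed budget: strengthening one connection effectively weakens the others.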
- We've also done some work with STP. I think this will really
change the dynamics of these systems.
- Still, I'm convinced that the real neural learning rules
are more sophisticated.
- Neurobiology helps here, but I think the current evidence and theory
are not conclusive.
- Spike-timing-dependent plasticity (STDP) seems to have some nice
evidence for it.
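The core of STDP is a weight change that depends on the relative timing of the pre- and post-synaptic spikes: pre-before-post potentiates, post-before-pre depresses. A minimal sketch of the standard pairwise exponential window, with illustrative (not sourced) amplitudes and time constant:

```python
import math

def stdp_dw(dt, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Pairwise STDP window. dt = t_post - t_pre in ms.
    dt > 0 (pre fired first, helping cause post) -> potentiation;
    dt < 0 (post fired first) -> depression.
    a_plus, a_minus and tau are illustrative parameter choices."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    if dt < 0:
        return -a_minus * math.exp(dt / tau)
    return 0.0
```

The asymmetry of the window is what gives the rule its causal flavour, which is part of why the experimental evidence for it is appealing.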
- There may be different rules for different types of neuron
pairs.
- I'd like to integrate neuron fatigue into the learning rule.
- Clearly there is a lot to explore here.