Compensatory Learning
- Hebbian learning roughly states that when two connected neurons
co-fire, the synapse between them gets stronger.
- It also implies that when one fires and the other does not, the
connection weakens.
- Together, these simple rules drive each weight toward the
  likelihood that the post-synaptic neuron fires given that the
  pre-synaptic neuron does.
- We've added a compensatory mechanism. It takes into account the total
  strength of a neuron's synapses, pushing that total toward a fixed
  saturation value.
- In these simulations we've used two variants: one based on the total
  outgoing weight of the pre-synaptic neuron (pre-compensatory) and one
  based on the total incoming weight of the post-synaptic neuron
  (post-compensatory).
- The input subnet neurons learn using the pre-compensatory rule,
and the SOM neurons learn from the post-compensatory rule. The
saturation bases are 5 and 1 respectively.
- This speeds the spread of learned neural circuits into the
  unstimulated SOM layer, and it also helps distribute each circuit
  across that layer.
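
The rules above can be sketched in NumPy. This is a minimal illustration, not the simulations' actual implementation: the learning rate, the multiplicative form of the compensatory push, and the function names are all assumptions; only the saturation bases (5 for pre-compensatory, 1 for post-compensatory) come from the notes.

```python
import numpy as np

def hebbian_update(w, pre, post, lr=0.1):
    """Hebbian rule with weakening: strengthen w[i, j] when pre neuron i
    and post neuron j co-fire, weaken it when i fires and j does not.
    pre and post are 0/1 activity vectors; w[i, j] connects pre i to post j."""
    # (2 * post - 1) is +1 for firing post neurons, -1 for silent ones,
    # so the outer product strengthens co-firing pairs and weakens the rest.
    return w + lr * np.outer(pre, 2 * post - 1)

def pre_compensatory(w, target=5.0, rate=0.01):
    """Push each pre-synaptic neuron's total outgoing weight toward the
    saturation base `target` (5 in these notes) via a multiplicative nudge
    (an assumed form; the exact compensatory dynamics are not specified)."""
    totals = w.sum(axis=1, keepdims=True)  # total weight out of each pre neuron
    return w + rate * w * (target - totals)

def post_compensatory(w, target=1.0, rate=0.01):
    """Same idea along the other axis: push each post-synaptic neuron's
    total incoming weight toward the saturation base `target` (1 here)."""
    totals = w.sum(axis=0, keepdims=True)  # total weight into each post neuron
    return w + rate * w * (target - totals)
```

In this sketch, the input-subnet neurons would apply `pre_compensatory` after each Hebbian step and the SOM neurons `post_compensatory`; the compensatory push lets weak, unstimulated units gain weight toward the saturation base rather than decaying to zero.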