Compensatory Learning
- Hebbian learning roughly states that when two connected neurons
co-fire, the synapse between them gets stronger.
- It is commonly extended so that when one fires and the other does
  not, the connection weakens; we use that extended form here.
- Together, these rules make the weight track the probability that
  the post-synaptic neuron fires when the pre-synaptic neuron does
  (see the first sketch after this list).
- On top of this, we've added a compensatory mechanism: it tracks the
  total strength of the synapses attached to a neuron and pushes that
  total toward a threshold (see the second sketch after this list).
- In these simulations, we've used the total weight into the
post-synaptic neuron.
- This speeds up the spread of neural circuits into the unstimulated
  hidden layer.
- Note that compensatory learning alone doesn't solve the XOR problem.
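
As a concrete illustration, here is a minimal sketch of one way the
strengthen/weaken rule above could be realized for a single synapse,
assuming the weight is only updated when the pre-synaptic neuron fires.
The function name `hebbian_step`, the learning rate, and the toy firing
probabilities are illustrative choices, not values from the simulations.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_step(w, pre, post, lr=0.05):
    """One Hebbian-style update for a single synapse.

    When the pre-synaptic neuron fires, the weight moves toward 1 if the
    post-synaptic neuron also fired (strengthening) and toward 0 if it
    did not (weakening).  With this form, w settles near an estimate of
    P(post fires | pre fires).
    """
    if pre:
        w += lr * (post - w)
    return w

# Toy demonstration: the post-synaptic neuron fires with probability 0.7
# whenever the pre-synaptic neuron fires.
w = 0.0
for _ in range(5000):
    pre = rng.random() < 0.5
    post = pre and (rng.random() < 0.7)
    w = hebbian_step(w, pre, post)

print(f"learned weight: {w:.2f} (target conditional probability: 0.7)")
```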
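
And a minimal sketch of the compensatory mechanism, assuming the
simplest additive form: the gap between each post-synaptic neuron's
total incoming weight and a target threshold is spread evenly across
that neuron's incoming synapses.  The names `compensatory_step`,
`target_total`, and `rate` are placeholders; the actual simulations may
combine this with the Hebbian step or normalize differently.

```python
import numpy as np

rng = np.random.default_rng(1)

def compensatory_step(W, target_total=1.0, rate=0.01):
    """Push each post-synaptic neuron's total incoming weight toward a
    target threshold.

    W has shape (n_post, n_pre): row i holds the synapses into
    post-synaptic neuron i.  The deficit (or excess) relative to the
    target is spread evenly over that neuron's incoming synapses.
    """
    deficit = target_total - W.sum(axis=1, keepdims=True)
    W += rate * deficit / W.shape[1]
    return W

# A hidden layer of 4 neurons with 3 inputs each, starting near zero.
W = rng.random((4, 3)) * 0.01
for _ in range(1000):
    W = compensatory_step(W)

print("total incoming weight per neuron:", W.sum(axis=1).round(2))
```

Because this correction does not depend on activity, even a hidden
neuron that never fires accumulates incoming weight, which is the
property that helps circuits spread into the unstimulated hidden layer.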