Post-Hebbian Learning
- The specific Hebbian learning rule says that the weights
change by a constant amount
- Post-Hebbian learning rules can base the change on other
things
- Both post-Hebbian and anti-Hebbian learning rules are Hebbian
in the broad sense.
- Post-Hebbian rules eliminate the axonal weight problem
- If the amount of the change is based on the existing weight,
- higher existing weights change less, and lower existing weights
change more
- Strengthening and weakening offset each other, and the weights
will tend toward an intermediate value
- There will be a few large strengthenings, and many small
weakenings
- Similarly, inhibitory weights will also tend toward an
intermediate value
- So far this is only an argument; can an experiment show it?
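A small simulation can test the argument. The sketch below assumes one concrete "soft-bound" form of a weight-dependent rule: strengthening scales with the remaining headroom (w_max - w), weakening scales with the current weight w, so large weights strengthen less and weaken more. The rule, parameter values, and function name are illustrative, not taken from the notes.

```python
import random

def simulate(w0, w_max=1.0, eta=0.05, p_coactive=0.5,
             steps=5000, seed=0):
    """Weight-dependent Hebbian update (soft-bound form).

    On each step the synapse is either strengthened (with
    probability p_coactive, standing in for pre/post co-activity)
    or weakened. The size of the change depends on the existing
    weight, as the argument above requires.
    """
    rng = random.Random(seed)
    w = w0
    for _ in range(steps):
        if rng.random() < p_coactive:
            w += eta * (w_max - w)   # strengthening: small when w is large
        else:
            w -= eta * w             # weakening: small when w is small
    return w

# Weights that start high and weights that start low both
# drift toward an intermediate value rather than saturating.
high_start = simulate(0.95)
low_start = simulate(0.05)
```

Under this rule the fixed point sits where expected strengthening and weakening balance (here, near p_coactive * w_max), so both trajectories settle around the middle of the range instead of pinning at 0 or w_max. The same construction with a negative range would show inhibitory weights tending toward an intermediate value as well.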