Things from the Last Talk
- I gave a talk here in May 2005.
- Some things that I reported on were:
- CA capacity (it's O(n) in the number of neurons).
- Categorisation tasks (artificial, Congressional Voting, and
Information Retrieval).
- Correlatory and compensatory learning. Correlatory learning
sets a weight to the likelihood that the post-synaptic
neuron fires when the pre-synaptic neuron does. Compensatory
learning modifies this to regulate a neuron's total synaptic
weight. This helps CAs fire up sooner, but also prevents neurons
from getting too powerful and causing simulated epilepsy (see the
sketch after this list).
- Hierarchical Categorisation. We used overlapping CAs to store
hierarchical categories. Compensatory learning helped here.
- Compensatory learning can also be used to recover from
"bad" neural recruitment.
- Spontaneous activation can be useful for starting things up
and of course is a biological phenomenon.
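
To make the two learning rules concrete, here is a minimal Python sketch
of the idea (not the simulator used in the work): the correlatory step
moves a weight toward the observed likelihood that the post-synaptic
neuron fires when the pre-synaptic one does, and the compensatory step
rescales a neuron's total incoming weight toward a target. The function
names, the learning rate RATE, and the target W_TOTAL are illustrative
assumptions.

```python
import numpy as np

RATE = 0.1      # learning rate -- illustrative value, not from the talk
W_TOTAL = 1.5   # target total synaptic weight per neuron -- illustrative value

def correlatory_update(w, pre_fired, post_fired):
    """Nudge w toward the likelihood that the post-synaptic neuron
    fires when the pre-synaptic neuron does: toward 1 on co-firing,
    toward 0 when only the pre-synaptic neuron fires."""
    if pre_fired:
        target = 1.0 if post_fired else 0.0
        w += RATE * (target - w)
    return w

def compensatory_update(weights_in):
    """Rescale a neuron's incoming weights so their sum drifts toward
    W_TOTAL: weakly connected neurons gain (fire up sooner), strongly
    connected ones lose (no simulated epilepsy)."""
    total = weights_in.sum()
    if total > 0:
        weights_in *= 1.0 + RATE * (W_TOTAL - total) / total
    return weights_in

# Toy usage: one post-synaptic neuron with three incoming synapses.
w_in = np.array([0.2, 0.6, 0.9])
pre_fired = [True, True, False]   # which pre-synaptic neurons fired this step
post_fired = True                 # the post-synaptic neuron fired

w_in = np.array([correlatory_update(w, p, post_fired)
                 for w, p in zip(w_in, pre_fired)])
w_in = compensatory_update(w_in)
print(w_in)
```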