Parameters and Iris Results
- One of the things I failed to anticipate from Diehl and Cook's paper was
the variation in synaptic delays and the incomplete connectivity.
- This prepares certain neurons for certain inputs.
- The learning then specialises the neurons for particular inputs.
- You want one neuron firing per epoch.
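The first two points can be sketched in code. This is a minimal, hypothetical illustration (names and parameter values are mine, not from any of the cited papers) of how incomplete connectivity and per-synapse delay variation predispose different neurons to different inputs:

```python
import random

def build_sparse_delayed_connections(n_pre, n_post, p_connect=0.3,
                                     delay_range=(1.0, 10.0), seed=0):
    """Sketch: each input-to-neuron synapse exists only with probability
    p_connect (incomplete connectivity), and each existing synapse gets
    its own random delay (synaptic time variation).  The resulting
    connection pattern biases particular neurons toward particular inputs
    before any learning happens."""
    rng = random.Random(seed)
    connections = []
    for pre in range(n_pre):
        for post in range(n_post):
            if rng.random() < p_connect:            # incomplete connectivity
                delay = rng.uniform(*delay_range)   # delay variation (ms)
                connections.append((pre, post, delay))
    return connections

# e.g. 4 Iris features projecting to 10 category neurons
connections = build_sparse_delayed_connections(4, 10)
```

Learning (e.g. STDP on these synapses) would then specialise each neuron for the inputs its particular connections and delays favour.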
- "Without adaptation of intrinsic excitabilities, the network would
start performing erroneous inference, learning would reinforce this
erroneous behaviour, and performance would quickly break down"
(Habenschuss et al.)
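One common way to adapt intrinsic excitability is an adaptive firing threshold, as in Diehl and Cook's model. A minimal sketch (my own toy update rule and constants, for illustration only):

```python
def adapt_threshold(theta, fired, theta_plus=0.05, decay=0.999):
    """One homeostatic update of a neuron's firing threshold:
    firing raises the threshold, silence lets it decay back down,
    so no single neuron can win every presentation."""
    theta *= decay           # slow decay toward the resting threshold
    if fired:
        theta += theta_plus  # firing makes the neuron harder to fire again
    return theta

# A neuron that fires on every step sees its threshold climb ...
theta_busy = 1.0
for _ in range(100):
    theta_busy = adapt_threshold(theta_busy, fired=True)

# ... while a silent neuron's threshold decays, keeping it competitive.
theta_idle = 1.0
for _ in range(100):
    theta_idle = adapt_threshold(theta_idle, fired=False)
```

Without something like this, the neurons that fire early keep firing, learning reinforces them, and the rest of the population never specialises.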
- I usually trained for one pass through the data.
- At the end, you rebuild the topology with static synapses.
- You present the training items again, which lets you associate
particular category neurons with particular categories to a certain
degree.
- You then present the test items, and decide the category based on
which neurons fire.
| System | Train | Test |
| --- | --- | --- |
| Orig Params (1 ms) | 70.67% | 67.33% |
| Orig Params (0.1 ms) | 68% | 71.33% |
| Novel Params | 75.33% | 76% |
| Huyck 2020 | 92% | 90.6% |
| Huyck and Mitchell 2014 | | 93.67% |
- I had to wrestle with it, but I managed to get the parameters from
Rybka et al. 2024 working. Those are the Orig Params rows, with a 1 ms
and a 0.1 ms time step.
- I did a bit better by changing the synaptic weights from the category
neurons to the inhibitory neurons, and the weights back from inhibition
to the category neurons.
- These are reasonable results, but other spiking systems (including my
own earlier ones) have done better.