Simultaneous unsupervised and supervised learning
of cognitive functions in biologically plausible
spiking neural networks

Trevor Bekolay, Carter Kolbeck, Chris Eliasmith
Centre for Theoretical Neuroscience, University of Waterloo
bekolay.org/cogsci2013-pres
How can we learn the connection weights
in Spaun's spiking neural networks?
1. Cognitive functions
Vector Symbolic Architecture (Plate, 2003)
A presented digit is first encoded as a vector, e.g.
$\text{“5”} \Rightarrow \left[ 0.12, 0.56, 0.48, \ldots \right]$
These vectors are combined into structured representations through binding ($\circledast$) and superposition ($+$), e.g. the successive memory states of a counting task:
$\text{COUNT} \circledast \text{ONE} + \text{NUMBER} \circledast \text{FIVE}$
$\text{COUNT} \circledast \text{TWO} + \text{NUMBER} \circledast \text{FIVE}$
$\text{COUNT} \circledast \text{THREE} + \text{NUMBER} \circledast \text{FIVE}$
How can we learn the binding function $\circledast$?
2. In spiking neurons
Neural Engineering Framework
(Eliasmith & Anderson, 2003)
A vector $X$ is encoded by spiking neurons, each with an encoding vector $e_i$ and nonlinear response function:
$a_i = f(e_i \cdot X)$
The represented vector is recovered from the neural activities with linear decoders $d_i$:
$\hat{X} = \sum_i d_i a_i$
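A sketch of NEF-style encoding and decoding under simplifying assumptions: rate neurons with a rectified-linear $f$, and decoders found by regularized least squares. The sizes and regularization constant here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 100, 1  # neurons, dimensions (illustrative)

# Random unit encoders e_i, with gains and biases
E = rng.standard_normal((n, d))
E /= np.linalg.norm(E, axis=1, keepdims=True)
gain = rng.uniform(1.0, 5.0, n)
bias = rng.uniform(-2.0, 2.0, n)

def rates(X):
    """a_i = f(e_i . X); f is rectified linear here (illustrative choice)."""
    return np.maximum(0.0, gain * (X @ E.T) + bias)

# Solve for decoders d_i by regularized least squares over sample points
X = np.linspace(-1, 1, 200).reshape(-1, d)
A = rates(X)
reg = 0.1 * A.max()
D = np.linalg.solve(A.T @ A + reg**2 * np.eye(n), A.T @ X)

X_hat = A @ D  # \hat{X} = sum_i d_i a_i
print(np.sqrt(np.mean((X - X_hat) ** 2)))  # small reconstruction error
```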
3. Learning
Random initial $\omega_{ij}$
Supervised learning
Given the error $E = X - \hat{X}$:
$\Delta\omega_{ij} \propto a_i \, (e_j \cdot E)$
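A single update under this supervised rule, as a NumPy sketch; the sizes, learning rate, and signals are illustrative, and $\omega$ is stored as a post × pre matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
n_pre, n_post, d = 50, 50, 1                         # illustrative sizes
omega = 1e-3 * rng.standard_normal((n_post, n_pre))  # random initial weights
e_post = rng.standard_normal((n_post, d))            # postsynaptic encoders e_j
kappa = 1e-4                                         # learning rate (illustrative)

def supervised_update(omega, a_pre, E):
    """delta omega_ij ∝ a_i (e_j . E); omega[j, i] connects pre i to post j."""
    return omega + kappa * np.outer(e_post @ E, a_pre)

a_pre = rng.uniform(0, 100, n_pre)  # presynaptic firing rates a_i
E = np.array([0.3])                 # error signal E = X - X_hat
omega = supervised_update(omega, a_pre, E)
```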
Unsupervised learning (BCM rule; Kirkwood, Rioult & Bear, 1996):
$\Delta\omega_{ij} \propto a_i \, a_j (a_j - \theta)$
Combined learning
$\Delta\omega_{ij} \propto a_i \Big[ \underbrace{S \, e_j \cdot E}_{\text{supervised}} + \underbrace{(1 - S) \, a_j (a_j - \theta)}_{\text{unsupervised}} \Big]$
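The combined rule interpolates between the two terms with the supervision ratio $S$; with $S = 1$ it reduces to the supervised rule above. A sketch under the same illustrative assumptions, with an illustrative choice of threshold $\theta$:

```python
import numpy as np

def combined_update(omega, a_pre, a_post, e_post, E, S, kappa=1e-4, theta=None):
    """Combined update:
    delta omega_ij ∝ a_i [ S (e_j . E) + (1 - S) a_j (a_j - theta) ]."""
    if theta is None:
        theta = a_post.mean()  # illustrative sliding modification threshold
    supervised = S * (e_post @ E)                      # shape: (n_post,)
    unsupervised = (1 - S) * a_post * (a_post - theta)
    return omega + kappa * np.outer(supervised + unsupervised, a_pre)
```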
Given an error signal, $E$, we can learn binding.
How are error signals generated?
Learning parameters
Neurons per dimension, learning rate, supervision ratio ($S$)
Searched with hyperopt (jaberg/hyperopt), as sketched below.
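A minimal hyperopt search over these three parameters; `run_simulation` is a hypothetical stand-in for training the network and returning its error, and the search ranges are illustrative:

```python
from hyperopt import fmin, tpe, hp

def run_simulation(neurons_per_dim, learning_rate, supervision_ratio):
    """Hypothetical stand-in: train the network with these parameters
    and return the final error (dummy surface here)."""
    return (supervision_ratio - 0.7) ** 2 + learning_rate

def objective(params):
    return run_simulation(int(params["neurons_per_dim"]),
                          params["learning_rate"],
                          params["supervision_ratio"])

space = {
    "neurons_per_dim": hp.quniform("neurons_per_dim", 10, 100, 5),
    "learning_rate": hp.loguniform("learning_rate", -12, -6),  # e^-12 .. e^-6
    "supervision_ratio": hp.uniform("supervision_ratio", 0.0, 1.0),
}

best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=100)
print(best)
```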