Slow stochastic learning with global inhibition: a biological solution to the binary perceptron problem
Date of Publication
June 1, 2004
Publication Type
Article
Series
Neurocomputing
ISSN or ISBN (if monograph)
0925-2312
Publisher
Elsevier
Language
English
Description
Networks of neurons connected by plastic all-or-none synapses tend to quickly forget previously acquired information when new patterns are learned. For random uncorrelated patterns, this problem can be solved by randomly selecting a small fraction of synapses to be modified upon each stimulus presentation (slow stochastic learning). Here we show that more complex, but still linearly separable, patterns can be learned by networks with binary excitatory synapses in a finite number of presentations, provided that: (1) there is non-vanishing global inhibition, (2) the binary synapses are changed with small enough probability (slow learning), and only when the output neuron does not give the desired response (as in the classical perceptron rule), and (3) the neuronal threshold separating the total synaptic inputs corresponding to the different classes is small enough.
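The learning scheme described in the abstract can be illustrated with a minimal sketch. The code below is not the authors' model: all parameter values (`N`, `P`, `q`, `g`), the teacher construction, and the stopping criterion are illustrative assumptions. It shows the three ingredients named above: global inhibition subtracted from the excitatory drive, binary synapses flipped with a small probability `q`, and updates applied only on output errors, as in the classical perceptron rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters, chosen for illustration only:
N = 100   # number of binary excitatory synapses
P = 40    # number of patterns to learn
q = 0.05  # per-synapse modification probability (slow stochastic learning)
g = 0.5   # global inhibition: each active input also contributes -g

# Binary input patterns, labelled by a binary "teacher" with the same
# architecture, so the task is linearly separable by construction.
X = rng.integers(0, 2, size=(P, N))
w_teacher = rng.integers(0, 2, size=N)
h_teacher = X @ w_teacher - g * X.sum(axis=1)
theta = float(np.median(h_teacher))          # threshold; teacher solves the task
labels = (h_teacher > theta).astype(int)

def output(w, x):
    # Total excitatory input minus global inhibition, compared to threshold.
    return int(x @ w - g * x.sum() > theta)

w = rng.integers(0, 2, size=N)               # student's binary synapses
errors = P
for epoch in range(500):
    errors = 0
    for i in rng.permutation(P):
        x, d = X[i], labels[i]
        if output(w, x) != d:
            errors += 1
            # Slow stochastic update: only synapses on active inputs are
            # eligible, and each is modified with small probability q.
            flip = (x == 1) & (rng.random(N) < q)
            # Potentiate toward 1 if the desired output is 1, depress to 0 otherwise.
            w = np.where(flip, d, w)
    if errors == 0:
        break

print("epochs used:", epoch + 1, "remaining errors:", errors)
```

Note that the synapses stay strictly binary throughout: an error never nudges a weight by a small amount, it flips a small random subset of eligible synapses all the way, which is what makes the small probability `q` play the role of a learning rate.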
File(s)
| File | File Type | Format | Size | License | Publisher/Copyright statement | Content |
|---|---|---|---|---|---|---|
| 1-s2.0-S092523120400058X-main.pdf | text | Adobe PDF | 198.9 KB | | publisher | published |