How multiple synapses increase a neuron's classification capacity
Neurons in the brain can be connected by multiple synapses, rather than just one. Using numerical simulations, Song and Benna showed that these parallel connections can increase a neuron’s classification capacity beyond the limit achieved by the perceptron, the classic mathematical model of a linear neuron. This talk describes ongoing work towards deriving this result analytically, with the goal of understanding how parallel synapses improve performance. In our model, a neuron with parallel synapses can apply a monotone nonlinear transformation in each input dimension before performing “standard” pattern separation, as a perceptron does. We have shown that if there exists an entrywise monotone function mapping all patterns onto a surface where the total activity summed across inputs is the same for every pattern, then any labeling of these patterns can be classified. We have also reduced the problem of finding the classification capacity of this nonlinear “dendrite” model to a more familiar perceptron problem, but with correlated inputs. Our geometric characterization already shows that capacity is high when the input patterns are normalized. Since sensory input to the brain is believed to be normalized, this suggests that parallel synapses may boost classification capacity even more than predicted by models that ignore these geometric effects.
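As a minimal formalization of the model described above (the symbols $f_i$, $g_i$, $\theta$, and $N$ are illustrative labels of ours, not notation from the talk), the parallel-synapse neuron can be taken to compute, for an input pattern $\mathbf{x} \in \mathbb{R}^N$,
\[
\hat{y}(\mathbf{x}) \;=\; \mathrm{sign}\!\left( \sum_{i=1}^{N} f_i(x_i) \;-\; \theta \right),
\]
where each $f_i$ is a monotone (generally nonlinear) function of the activity on input line $i$, realized by the several synapses connecting that input to the neuron, and $\theta$ is a firing threshold. The ordinary perceptron is recovered when each $f_i$ is linear, $f_i(x_i) = w_i x_i$. In this notation, the sufficient condition stated above reads: if there exist entrywise monotone functions $g_1, \dots, g_N$ and a constant $c$ with $\sum_i g_i(x_i^{\mu}) = c$ for every pattern $\mu$, then any labeling of the patterns $\{\mathbf{x}^{\mu}\}$ can be classified by some choice of the $f_i$ and $\theta$.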