Abstract

We investigate the consequences of maximizing information transfer in a simple neural network (one input layer, one output layer), focusing on the case of nonlinear transfer functions. We assume that both receptive fields (synaptic efficacies) and transfer functions can be adapted to the environment. The main result is that, for bounded and invertible transfer functions, in the case of a vanishing additive output noise and no input noise, maximization of information (Linsker's infomax principle) leads to a factorial code, and hence to the same solution as required by the redundancy-reduction principle of Barlow. We also show that this result is valid for linear and, more generally, unbounded transfer functions, provided optimization is performed under an additive constraint, i.e. one that can be written as a sum of terms, each specific to one output neuron. Finally, we study the effect of a non-zero input noise. We find that, to first order in the input noise, assumed to be small in comparison with the (small) output noise, the above results remain valid, provided the output noise is uncorrelated from one neuron to another.
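In the single-neuron, zero-noise case, the bounded-invertible infomax solution has a concrete form: output entropy is maximized when the transfer function equals the cumulative distribution function of the input, so the output is uniformly distributed over the bounded range. A minimal numerical sketch of this (not code from the paper; the Gaussian input distribution, sample size, and bin count are illustrative assumptions):

```python
import math
import random

random.seed(0)

# Inputs drawn from a standard Gaussian (an illustrative choice).
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]

# Entropy-maximizing transfer function for zero output noise:
# the input's own CDF (here the Gaussian CDF). It maps inputs to
# outputs uniform on (0, 1), the maximum-entropy distribution for
# a bounded output.
def transfer(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

ys = [transfer(x) for x in xs]

# Check uniformity: each of 10 equal-width bins should hold ~10%
# of the samples.
bins = [0] * 10
for y in ys:
    bins[min(int(y * 10), 9)] += 1
fractions = [b / len(ys) for b in bins]
print([round(f, 3) for f in fractions])
```

With independent output noise and several output neurons, the multi-neuron analogue of this matching condition is what drives the outputs toward the factorial code discussed in the abstract.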

Keywords

Transfer function, Noise, Nonlinear system, Mathematics, Maximization, Bounded function, Constraint, Gaussian noise, Computer science, Control theory, Algorithm, Applied mathematics, Mathematical optimization, Mathematical analysis, Artificial intelligence

Related Publications

Network In Network

Abstract: We propose a novel deep network structure called Network In Network (NIN) to enhance model discriminability for local patches within the receptive field. The conventional con...

2014 arXiv (Cornell University) 1037 citations

Publication Info

Year: 1994
Type: Article
Volume: 5
Issue: 4
Pages: 565-581
Citations: 127
Access: Closed


Cite This

Jean‐Pierre Nadal, Néstor Parga (1994). Nonlinear neurons in the low-noise limit: a factorial code maximizes information transfer. Network: Computation in Neural Systems, 5(4), 565-581. https://doi.org/10.1088/0954-898x/5/4/008

Identifiers

DOI
10.1088/0954-898x/5/4/008