Abstract
A class of neural models is introduced in which the topology of the neural network is generated by a controlled probability model. It is shown that the resulting linear operator has a spectral measure that converges in probability to a universal limit as the size of the net tends to infinity: a law of large numbers for the spectra of such operators. The analytical treatment is accompanied by computational experiments.
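The paper's exact probability model for the network topology is not reproduced on this page, so the following is only a minimal sketch of the general idea: randomly wire a net of n units (here an assumed Erdős–Rényi-style wiring with connection probability p and i.i.d. Gaussian weights, both illustrative choices), form the resulting linear operator, and observe that summary statistics of its empirical spectral measure stabilize as n grows, the kind of law-of-large-numbers behaviour the abstract describes.

```python
# Sketch only: illustrative random-topology model, not the paper's construction.
import numpy as np

def empirical_spectrum(n, p=0.1, seed=None):
    """Eigenvalues of a randomly wired, randomly weighted linear operator."""
    rng = np.random.default_rng(seed)
    mask = rng.random((n, n)) < p           # which connections exist
    weights = rng.normal(0.0, 1.0, (n, n))  # i.i.d. synaptic strengths
    A = mask * weights / np.sqrt(n * p)     # scale so the spectrum has O(1) support
    return np.linalg.eigvals(A)

if __name__ == "__main__":
    for n in (100, 400, 1600):
        lam = empirical_spectrum(n, seed=0)
        # Summary statistics of the empirical spectral measure settle down
        # as the size of the net increases.
        print(f"n={n:5d}  mean|lambda|={np.abs(lam).mean():.3f}  "
              f"max|lambda|={np.abs(lam).max():.3f}")
```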
Publication Info
- Year: 1977
- Type: article
- Volume: 32
- Issue: 2
- Pages: 499-519
- Citations: 85
- Access: Closed
Identifiers
- DOI: 10.1137/0132041