Abstract

Reservoir computing is a computational framework suited for temporal/sequential data processing. It is derived from several recurrent neural network models, including echo state networks and liquid state machines. A reservoir computing system consists of a reservoir for mapping inputs into a high-dimensional space and a readout for pattern analysis from the high-dimensional states in the reservoir. The reservoir is fixed and only the readout is trained with a simple method such as linear regression or classification. Thus, the major advantage of reservoir computing over other recurrent neural networks is fast learning, resulting in low training cost. Another advantage is that the reservoir, requiring no adaptive updating, is amenable to hardware implementation using a variety of physical systems, substrates, and devices. Indeed, such physical reservoir computing has attracted increasing attention in diverse fields of research. The purpose of this review is to provide an overview of recent advances in physical reservoir computing by classifying them according to the type of reservoir. We discuss the current issues and perspectives related to physical reservoir computing, in order to further expand its practical applications and develop next-generation machine learning systems.
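To make the fixed-reservoir/trained-readout split concrete, below is a minimal echo state network sketch in Python. It is not taken from the review; the reservoir size, spectral radius, ridge parameter, and the sine-wave prediction task are all illustrative assumptions.

```python
import numpy as np

# Minimal echo state network sketch: the input and recurrent weights are
# drawn at random and kept fixed; only the linear readout is trained,
# here with closed-form ridge regression.

rng = np.random.default_rng(0)
n_inputs, n_reservoir = 1, 100
spectral_radius = 0.9  # illustrative choice; scales recurrent weights

W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))   # fixed input weights
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))   # fixed recurrent weights
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u_seq):
    """Map an input sequence into the high-dimensional reservoir state space."""
    x = np.zeros(n_reservoir)
    states = []
    for u in u_seq:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Toy task (hypothetical): one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u, y = np.sin(t[:-1]), np.sin(t[1:])

X = run_reservoir(u)
washout = 100  # discard initial transient states

# Train only the readout: ridge regression in closed form.
ridge = 1e-6
A = X[washout:]
W_out = np.linalg.solve(A.T @ A + ridge * np.eye(n_reservoir), A.T @ y[washout:])

y_pred = A @ W_out
print("training MSE:", np.mean((y_pred - y[washout:]) ** 2))
```

Because training reduces to a single linear solve, learning is fast regardless of the reservoir's complexity, which is the cost advantage the abstract highlights; in physical reservoir computing the `run_reservoir` step would be replaced by measurements of a physical dynamical system.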

Keywords

Reservoir computing, Computer science, Artificial neural network, Reservoir modeling, Artificial intelligence, Variety (cybernetics), Echo state network, Recurrent neural network, Machine learning, State (computer science), Distributed computing, Algorithm, Petroleum engineering, Geology

MeSH Terms

Algorithms; Machine Learning; Neural Networks, Computer

Publication Info

Year: 2019
Type: Review
Volume: 115
Pages: 100-123
Citations: 1828
Access: Closed

Citation Metrics

OpenAlex: 1828
Influential: 42
CrossRef: 1659

Cite This

Gouhei Tanaka, Toshiyuki Yamane, J. B. Héroux et al. (2019). Recent advances in physical reservoir computing: A review. Neural Networks, 115, 100-123. https://doi.org/10.1016/j.neunet.2019.03.005

Identifiers

DOI: 10.1016/j.neunet.2019.03.005
PMID: 30981085
arXiv: 1808.04962
