TY - JOUR
T1 - Model-free prediction of spatiotemporal dynamical systems with recurrent neural networks
T2 - Role of network spectral radius
AU - Jiang, Junjie
AU - Lai, Ying-Cheng
N1 - Publisher Copyright: © 2019 authors. Published by the American Physical Society.
PY - 2019/10
Y1 - 2019/10
N2 - A common difficulty in applications of machine learning is the lack of any general principle for guiding the choices of key parameters of the underlying neural network. Focusing on a class of recurrent neural networks - reservoir computing systems, which have recently been exploited for model-free prediction of nonlinear dynamical systems - we uncover a surprising phenomenon: the emergence of an interval in the spectral radius of the neural network in which the prediction error is minimized. In a three-dimensional representation of the error versus the time and spectral radius, the interval corresponds to the bottom region of a "valley." Such a valley arises for a variety of spatiotemporal dynamical systems described by nonlinear partial differential equations, regardless of the structure and the edge-weight distribution of the underlying reservoir network. We also find that, while the particular location and size of the valley depend on the details of the target system to be predicted, the interval tends to be larger for undirected than for directed networks. The valley phenomenon can be beneficial to the design of optimal reservoir computing, representing a small step forward in understanding these machine-learning systems.
AB - A common difficulty in applications of machine learning is the lack of any general principle for guiding the choices of key parameters of the underlying neural network. Focusing on a class of recurrent neural networks - reservoir computing systems, which have recently been exploited for model-free prediction of nonlinear dynamical systems - we uncover a surprising phenomenon: the emergence of an interval in the spectral radius of the neural network in which the prediction error is minimized. In a three-dimensional representation of the error versus the time and spectral radius, the interval corresponds to the bottom region of a "valley." Such a valley arises for a variety of spatiotemporal dynamical systems described by nonlinear partial differential equations, regardless of the structure and the edge-weight distribution of the underlying reservoir network. We also find that, while the particular location and size of the valley depend on the details of the target system to be predicted, the interval tends to be larger for undirected than for directed networks. The valley phenomenon can be beneficial to the design of optimal reservoir computing, representing a small step forward in understanding these machine-learning systems.
UR - http://www.scopus.com/inward/record.url?scp=85082770617&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85082770617&partnerID=8YFLogxK
U2 - 10.1103/PhysRevResearch.1.033056
DO - 10.1103/PhysRevResearch.1.033056
M3 - Article
SN - 2643-1564
VL - 1
JO - Physical Review Research
JF - Physical Review Research
IS - 3
M1 - 033056
ER -