Dynamical features simulated by recurrent neural networks

Abstract

The evolution of two-dimensional neural network models with rank-one connection matrices and saturated linear transfer functions is dynamically equivalent to that of piecewise linear maps on an interval. It is shown that their iterative behavior ranges from highly predictable dynamics, in which almost every orbit converges to an attracting fixed point, to chaotic regimes containing cycles of arbitrarily large period.
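
To make the setting concrete, the following Python snippet is a minimal sketch (not the paper's own construction) of such a two-neuron network: the connection matrix W is rank one, the transfer function is the saturated linear (ramp) map, and the scalar projection s_t = v · x_t then evolves under a piecewise linear map of an interval. The particular values of u, v, and the bias theta are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np

def sigma(x):
    # Saturated linear transfer function: identity on [-1, 1], clipped outside.
    return np.clip(x, -1.0, 1.0)

# Rank-one connection matrix W = u v^T (illustrative values, assumed for this sketch).
u = np.array([1.0, -0.8])
v = np.array([2.5, 1.3])
W = np.outer(u, v)
theta = np.array([0.1, -0.2])  # bias term (assumed)

def step(x):
    # Discrete-time update of the two-neuron recurrent network.
    return sigma(W @ x + theta)

# Because W x = u (v . x), the update depends on the state only through s = v . x,
# so s_t obeys a one-dimensional piecewise linear map.
x = np.random.uniform(-1.0, 1.0, size=2)
trajectory = []
for _ in range(200):
    x = step(x)
    trajectory.append(v @ x)

# The tail of the projected orbit reveals the regime: a fixed point, a periodic
# cycle, or an irregular (chaotic) sequence, depending on the chosen parameters.
print(trajectory[-10:])
```

Varying the magnitudes of u and v in such a sketch moves the induced interval map between the fixed-point and chaotic regimes described in the abstract.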

Publication Title

Neural Networks
