Recurrent neural networks with iterated function systems dynamics

Peter Tino, Georg Dorffner

Publication: Working/Discussion Paper, WU Working Paper


Abstract

We suggest a recurrent neural network (RNN) model whose recurrent part corresponds to iterated function systems (IFS), introduced by Barnsley [1] as a fractal image compression mechanism. The key ideas are that 1) our model avoids learning the RNN state part by fixing non-trainable connections between the context and recurrent layers (this makes the training process less problematic and faster), and 2) the RNN state part codes the information-processing states of the symbolic input stream in a well-organized and intuitively appealing way. We show that there is a direct correspondence between the Rényi entropy spectra characterizing the input stream and the spectra of Rényi generalized dimensions of activations inside the RNN state space. We test both the new RNN model with IFS dynamics and its conventional counterpart with a trainable recurrent part on two chaotic symbolic sequences. In our experiments, RNNs with IFS dynamics outperform the conventional RNNs with respect to information-theoretic measures computed on the training and model-generated sequences. (author's abstract)
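The non-trainable IFS state dynamics described above can be illustrated with a minimal sketch. This is not the paper's exact parametrization: it assumes each input symbol is assigned a fixed affine contraction toward a distinct corner of the unit hypercube, with contraction coefficient `k < 1` (the function name `ifs_states` and the corner assignment are illustrative choices, not from the source). Driving these fixed maps with a symbol stream yields the fractal-like state coding, and only a readout layer would be trained.

```python
import numpy as np

def ifs_states(sequence, alphabet, k=0.5):
    """Map a symbolic sequence to a trajectory of IFS states.

    Each symbol s acts as the affine contraction
        x_{t+1} = k * x_t + (1 - k) * c_s,
    where c_s is a fixed corner of the unit hypercube.
    """
    # State-space dimension: enough bits to give each symbol its own corner.
    dim = int(np.ceil(np.log2(max(len(alphabet), 2))))
    corners = {s: np.array([(i >> b) & 1 for b in range(dim)], dtype=float)
               for i, s in enumerate(alphabet)}
    x = np.full(dim, 0.5)  # start at the center of the hypercube
    states = []
    for s in sequence:
        x = k * x + (1 - k) * corners[s]  # contract toward the symbol's corner
        states.append(x.copy())
    return np.array(states)

states = ifs_states("abbaab", alphabet="ab", k=0.5)
```

Because every map is a contraction, sequences sharing a recent suffix end up in nearby regions of the state space, which is the sense in which the state part codes the input stream's information-processing states in a well-organized way.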

Publication series

Series: Report Series SFB "Adaptive Information Systems and Modelling in Economics and Management Science"
Number: 18

WU Working Paper Series

  • Report Series SFB "Adaptive Information Systems and Modelling in Economics and Management Science"
