An attempt to classify neural networks' computational capabilities

Jérémie Cabessa

University Joseph Fourier, Grenoble

Recurrent neural networks constitute a very powerful model of computation. Their expressive power has been proved to range from that of simple finite state automata up to super-Turing capabilities, depending on both the kind of synaptic weights and the cells' activation function under consideration. In this context, a natural question is how to refine this understanding of the networks' computational capabilities. As a first step in this direction, we consider a class of everlasting McCulloch-Pitts neural networks, i.e. finite recurrent neural networks with rational synaptic weights, a hard-threshold activation function, and a never-ending input stimulation. We then prove an effective equivalence between these networks and a specific class of automata reading infinite inputs, the so-called Muller automata. More precisely, for any Muller automaton, one can construct an everlasting McCulloch-Pitts neural network recognizing the same language, and conversely, for any such network, one can build a Muller automaton recognizing the same neural language. By this equivalence result, the most refined classification of Muller automata according to their computational complexity, known as the Wagner hierarchy, induces a corresponding classification of neural networks according to their computational capabilities. Interestingly, the complexity of a network in this classification turns out to be intimately linked to the attracting properties of the underlying dynamical system.
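To make the objects of the abstract concrete, the following minimal Python sketch simulates a Boolean McCulloch-Pitts network with rational weights and a hard-threshold activation function, driven by an infinite Boolean stimulation, and tests a Muller-style acceptance condition: the set of states visited infinitely often must belong to a prescribed table. The toy two-cell network, the acceptance table, and the finite-tail approximation of "visited infinitely often" are all illustrative assumptions on my part, not the paper's actual constructions.

```python
from fractions import Fraction
from itertools import islice, repeat

def mcp_step(W, U, theta, x, u):
    """One synchronous hard-threshold update:
    x_i(t+1) = 1  iff  sum_j W[i][j]*x_j(t) + sum_k U[i][k]*u_k(t) >= theta[i]."""
    return tuple(
        1 if sum(W[i][j] * x[j] for j in range(len(x)))
             + sum(U[i][k] * u[k] for k in range(len(u))) >= theta[i]
        else 0
        for i in range(len(x))
    )

def run(W, U, theta, stream, steps):
    """Drive the network with a prefix of an infinite Boolean stream and
    return the sequence of Boolean states it visits."""
    x = tuple(0 for _ in W)
    trace = [x]
    for u in islice(stream, steps):
        x = mcp_step(W, U, theta, x, u)
        trace.append(x)
    return trace

def muller_accepts(trace, table, tail):
    """Muller-style acceptance, approximated over a finite run: the set of
    states seen in the tail of the trace (a finite stand-in for the states
    visited infinitely often) must belong to the acceptance table."""
    inf_states = frozenset(trace[-tail:])
    return inf_states in table

# Toy 2-cell network (an illustrative assumption): cell 0 copies the current
# input bit, cell 1 copies cell 0's previous value, so the network's state
# records the last two input bits.
half = Fraction(1, 2)
W = [[Fraction(0), Fraction(0)],   # recurrent weights
     [Fraction(1), Fraction(0)]]
U = [[Fraction(1)],                # input weights (single input line)
     [Fraction(0)]]
theta = [half, half]               # hard thresholds

# Acceptance table: accept exactly the streams that are eventually all 1s,
# i.e. the only state visited infinitely often is (1, 1).
table = {frozenset({(1, 1)})}

ones = repeat((1,))                             # the stream 111...
alternating = ((t % 2,) for t in range(10**9))  # the stream 0101...

print(muller_accepts(run(W, U, theta, ones, 200), table, tail=100))         # True
print(muller_accepts(run(W, U, theta, alternating, 200), table, tail=100))  # False
```

On a genuinely infinite stimulation, the Muller condition refers to the exact set of states recurring infinitely often; the finite tail above is only a simulation-time surrogate for that set, which is where the attracting behaviour of the underlying dynamical system mentioned in the abstract comes into play.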
