Circular Backpropagation Networks for Classification

Sandro Ridella, Stefano Rovetta, and Rodolfo Zunino

IEEE Transactions on Neural Networks, vol. 8, no. 1, January 1997
(c) 1997 IEEE

ABSTRACT

The class of mapping networks is a general family of tools able to perform a wide variety of tasks; however, no unifying framework exists to describe their theoretical and practical properties. This paper presents a standardized, uniform representation for this class of networks and introduces a simple modification of the multilayer perceptron with interesting properties, especially well suited to pattern classification tasks. The proposed model unifies the two main representation paradigms found in the class of mapping networks for classification, namely, the surface-based and the prototype-based schemes, while retaining the advantage of being trainable by backpropagation. Enhancements in representation properties and generalization performance are assessed through results on the worst-case number of hidden units required, the Vapnik-Chervonenkis dimension, and the Cover capacity. The theoretical properties of the network also suggest that the proposed modification to the multilayer perceptron is in many senses optimal. A number of experimental verifications confirm the theoretical results on the model's increased performance, as compared with the multilayer perceptron and the Gaussian radial basis function network.

INDEX TERMS

Feedforward neural networks, backpropagation, pattern classification, knowledge representation
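
The abstract refers to a simple modification of the multilayer perceptron that unifies surface-based and prototype-based representations. A minimal sketch of one common reading of this modification is given below: each input pattern is augmented with its squared norm, so that a single sigmoidal unit can realize either a planar (MLP-like) or a spherical (prototype-like) decision surface. The function names and parameter values are illustrative assumptions, not the paper's code.

    import numpy as np

    # Hypothetical sketch of an augmented "circular" sigmoidal unit, assuming
    # the input vector x is extended with the extra component sum(x**2).

    def sigmoid(a):
        return 1.0 / (1.0 + np.exp(-a))

    def circular_unit(x, w, w_q, b):
        """Output of one augmented sigmoidal unit.

        x   : input vector, shape (d,)
        w   : ordinary linear weights, shape (d,)
        w_q : weight on the extra quadratic input sum(x**2) (assumed name)
        b   : bias
        With w_q == 0 the unit reduces to a standard perceptron unit
        (hyperplane boundary); with w_q != 0 the locus
        w.x + w_q*||x||^2 + b = 0 is a hypersphere, i.e., a
        prototype-centered boundary.
        """
        return sigmoid(np.dot(w, x) + w_q * np.dot(x, x) + b)

    # Example: a unit whose decision boundary is the unit circle centered
    # at the origin in 2-D (w = 0, w_q = -1, b = 1).
    x_inside, x_outside = np.array([0.2, 0.1]), np.array([2.0, 0.0])
    print(circular_unit(x_inside,  np.zeros(2), -1.0, 1.0))  # > 0.5 (inside)
    print(circular_unit(x_outside, np.zeros(2), -1.0, 1.0))  # < 0.5 (outside)

Because the augmentation only adds one input component, such a unit remains trainable by ordinary backpropagation, which is the property the abstract emphasizes.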