This paper discusses the effect of response functions on the performance of multi-layered perceptrons. It will be shown that the N-bit parity problem, and indeed any binary classification problem, is exactly solvable by a simple perceptron with a suitable (non-monotonic) response function. We discuss the problems that arise when non-monotonic response functions are combined with learning, and present some results. As a new approach we introduce function-learning, which adjusts the response functions during the learning process.
Keywords: Neural networks, multi-layer perceptron, N-bit parity, response functions, pattern classification, function-learning.
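The parity claim can be illustrated with a minimal sketch. This is not the paper's own construction, just one well-known instance of the idea: a single unit with all weights equal to 1 and the non-monotonic response function g(h) = cos(πh) outputs +1 for even parity and -1 for odd parity, since the weighted sum h counts the number of active inputs.

```python
import itertools
import math

def parity_perceptron(bits):
    """A single perceptron-like unit solving N-bit parity.

    Weighted sum (all weights 1) followed by the non-monotonic
    response function g(h) = cos(pi * h), which equals (-1)^h
    for integer h: +1 for even parity, -1 for odd parity.
    """
    h = sum(bits)  # weighted sum with unit weights
    return math.cos(math.pi * h)

# Verify against the true parity for all 4-bit input patterns.
for bits in itertools.product([0, 1], repeat=4):
    true_parity = sum(bits) % 2               # 0 = even, 1 = odd
    predicted = 0 if parity_perceptron(bits) > 0 else 1
    assert predicted == true_parity
```

No hidden layer is needed: the non-monotonicity of the cosine response does the work that a monotonic sigmoid would require hidden units for.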