From neural PCA to neural ICA

E. Oja, J. Karhunen, A. Hyvarinen and L. Wang
Helsinki University of Technology, Finland

Abstract

Neural networks, whether biological or artificial, are very limited in their computational capabilities. Such aspects as locality, parallelism, and homogeneity are important in learning rules if they are to be realized in neural nets. The talk first gives a brief overview of neural Principal Component and Minor Component learning by Hebbian and anti-Hebbian learning rules. Then some simple contrast functions, especially suitable for neural computation, are chosen from among the many suggested for ICA, and they are shown to lead to stochastic gradient descent/ascent algorithms that again have the constrained Hebbian form. Some analysis, extensions, and applications of these learning algorithms are discussed.
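The constrained Hebbian PCA learning mentioned above is exemplified by Oja's classical single-unit rule, w ← w + η y (x − y w) with y = wᵀx, which extracts the first principal component while keeping ‖w‖ near 1. The following is a minimal NumPy sketch of that rule; the data distribution, learning rate, and epoch count are illustrative choices, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative 2-D data: variance 9 along the first axis, 1 along the second,
# so the principal eigenvector of the covariance is approximately [±1, 0].
X = rng.normal(size=(1000, 2)) * np.array([3.0, 1.0])

w = rng.normal(size=2)      # random initial weight vector
eta = 0.005                 # small learning rate for stability

for _ in range(5):          # a few passes over the data
    for x in X:
        y = w @ x                     # linear neuron output y = w^T x
        w += eta * y * (x - y * w)    # Oja's constrained Hebbian update

# After training, w is approximately a unit vector along the
# dominant-variance direction (up to sign).
```

The term η y x is plain Hebbian learning; the subtracted term −η y² w implements the normalization constraint locally, which is what makes the rule plausible for a neural implementation.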
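One simple contrast function of the kind alluded to for ICA is kurtosis: for whitened data z, maximizing E[(wᵀz)⁴] under the constraint ‖w‖ = 1 yields one independent component, and the gradient update again has a Hebbian-like form η y³ z followed by normalization. Below is a hedged sketch of this idea; the mixing matrix, Laplacian (super-Gaussian) sources, and the batch gradient (used instead of a sample-by-sample stochastic update, purely for readability) are illustrative assumptions, not details from the talk.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
S = rng.laplace(size=(n, 2))            # two independent super-Gaussian sources
S /= S.std(axis=0)                       # unit-variance sources
A = np.array([[2.0, 1.0],                # illustrative mixing matrix
              [1.0, 1.0]])
X = S @ A.T                              # observed linear mixtures

# Whiten the mixtures: zero mean, identity covariance.
X = X - X.mean(axis=0)
d, E = np.linalg.eigh(np.cov(X.T))
Z = (X @ E) / np.sqrt(d)

w = rng.normal(size=2)
w /= np.linalg.norm(w)
eta = 0.1
for _ in range(200):
    y = Z @ w
    w += eta * (y**3 @ Z) / n            # ascent on the kurtosis contrast E[y^4]
    w /= np.linalg.norm(w)               # enforce the constraint ||w|| = 1

y = Z @ w                                # recovered component (one source, up to sign)
corrs = np.abs(np.corrcoef(np.c_[y, S].T)[0, 1:])
```

For whitened data the constrained extrema of the kurtosis contrast coincide with the independent component directions, so the recovered output y should correlate strongly with one of the original sources.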