Artificial neural networks (ANNs) are popular machine learning tools, widely used for function approximation, time series prediction, medical diagnosis, character recognition, and optimization problems across many domains of science and engineering. When training an ANN, a good optimization method is essential for fast convergence. Simultaneous Perturbation Stochastic Approximation (SPSA) is one such method. SPSA owes its power and relative ease of use in difficult multivariate optimization problems to its underlying gradient approximation, which requires only two objective function measurements per iteration regardless of the dimension of the problem. This book discusses neural network learning algorithms that solve classification and nonlinear function approximation problems by combining simultaneous perturbation, dynamic tunneling, modified back-propagation, and a neighborhood approach with adaptive learning parameters. The efficiency of these algorithms is demonstrated with detailed simulation results for a range of problems.
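To illustrate the two-measurement gradient approximation mentioned above, here is a minimal sketch of standard SPSA in Python. The function names (`spsa_gradient`, `spsa_minimize`), the fixed gain values, and the quadratic test objective are illustrative choices, not taken from the book; Spall's original formulation uses decaying gain sequences a_k and c_k.

```python
import numpy as np

def spsa_gradient(f, theta, c=0.1, rng=None):
    """Estimate the gradient of f at theta using only two evaluations of f,
    via a simultaneous random perturbation of all coordinates."""
    rng = rng if rng is not None else np.random.default_rng(0)
    # Rademacher (+/-1) perturbation vector, the usual choice in SPSA.
    delta = rng.choice([-1.0, 1.0], size=theta.shape)
    y_plus = f(theta + c * delta)   # first measurement
    y_minus = f(theta - c * delta)  # second measurement
    # The same two measurements estimate every gradient component,
    # so the cost per iteration is independent of the dimension.
    return (y_plus - y_minus) / (2.0 * c * delta)

def spsa_minimize(f, theta0, steps=200, a=0.1, c=0.1, seed=0):
    """Minimal SPSA loop with fixed gains (illustrative; decaying
    gain sequences are needed for formal convergence guarantees)."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for _ in range(steps):
        theta = theta - a * spsa_gradient(f, theta, c=c, rng=rng)
    return theta

# Usage: minimize a simple quadratic; the true minimizer is the origin.
theta = spsa_minimize(lambda x: float(np.sum(x ** 2)), np.ones(5))
```

The key point of the sketch is that `spsa_gradient` perturbs all parameters at once instead of one at a time, which is what keeps the measurement count at two per iteration in any dimension.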