
Backpropagation and its Modifications


Marketed By: LAP LAMBERT Academic Publishing | Sold By: Kamal Books International
Delivery in: 10-12 Business Days



Rs. 3,651

Availability: In stock

  • Product Description

Gradient-based methods are among the most widely used error-minimization techniques for training backpropagation (BP) networks. The BP training algorithm is a supervised learning method for multi-layered feedforward neural networks: essentially a gradient-descent local optimization technique that corrects the network weights by propagating the error backward. It has well-known limitations, including slow convergence, getting trapped in local minima, and poor performance. To address these, various modifications are used, such as introducing momentum and bias terms or employing conjugate gradient methods. In conjugate gradient algorithms, a search is performed along conjugate directions, which generally produces faster convergence than search along steepest-descent directions. In this monograph we consider the parity-bit-checking problem, solved with the conventional backpropagation method and with these modified methods. A suitable neural network must be constructed and trained properly: the training dataset is used to train a "classification engine", and the trained network is then used for testing and validation.
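The kind of setup the description refers to can be sketched in a few lines. The following is a minimal illustration (not taken from the monograph) of backpropagation with a momentum term, trained on the 2-bit parity problem (XOR); the network size, learning rate, and momentum coefficient are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# 2-bit parity (XOR): the output is 1 when the number of 1-bits is odd.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(W1, W2):
    A0 = np.hstack([X, np.ones((4, 1))])       # inputs plus bias term
    H = sigmoid(A0 @ W1)                       # hidden activations
    A1 = np.hstack([H, np.ones((4, 1))])       # hidden plus bias term
    return A0, H, A1, sigmoid(A1 @ W2)         # ... and network output

# One hidden layer of 8 units; biases handled via the augmented "1" input.
W1 = rng.normal(scale=1.0, size=(3, 8))        # input (2 + bias) -> hidden
W2 = rng.normal(scale=1.0, size=(9, 1))        # hidden (8 + bias) -> output
V1, V2 = np.zeros_like(W1), np.zeros_like(W2)  # momentum accumulators
lr, mu = 0.5, 0.9                              # learning rate, momentum

_, _, _, out = forward(W1, W2)
initial_mse = float(np.mean((out - y) ** 2))

for epoch in range(10000):
    A0, H, A1, out = forward(W1, W2)
    # backward pass: squared-error gradient pushed through the sigmoids
    d_out = (out - y) * out * (1 - out)
    d_hid = (d_out @ W2[:8].T) * H * (1 - H)
    # gradient-descent step with a momentum term
    V2 = mu * V2 - lr * (A1.T @ d_out)
    V1 = mu * V1 - lr * (A0.T @ d_hid)
    W2 += V2
    W1 += V1

_, _, _, out = forward(W1, W2)
final_mse = float(np.mean((out - y) ** 2))
print(final_mse < initial_mse)
```

The momentum update `V = mu * V - lr * grad` accumulates a moving average of past gradients, which is the standard way to damp oscillations and speed up plain gradient descent; a conjugate gradient variant would instead choose each search direction to be conjugate to the previous ones.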

Product Specifications
SKU: COC17606
Author: Richa Kathuria Karthikeyan
Number of Pages: 72
Publication Date: January 26, 2012
Edition: 1st
Book Type: Computing & Information Technology
Country of Manufacture: India
Product Brand: LAP LAMBERT Academic Publishing
Product Packaging Info: Box
In The Box: 1 Piece
Product First Available On ClickOnCare.com: 2015-07-26