
The back-propagation algorithm is often used for training feed-forward neural networks. Why do we need to calculate the gradient in the BP algorithm?

Solution

The backpropagation algorithm is essentially gradient descent. It is usually restricted to first derivatives, because applying the chain rule to the first derivative of the loss is what produces the "back propagation" of error signals through the layers.

Backpropagation is a common method of training artificial neural networks, used in conjunction with an optimization method such as gradient descent.

It calculates the gradient of a loss function with respect to all the weights in the network; the optimizer then uses this gradient to update the weights in the direction that reduces the loss.
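The chain-rule computation described above can be sketched on a tiny network. This is a hypothetical, minimal example (not from the original answer): one input, one sigmoid hidden unit, one linear output, and a squared-error loss.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w1, w2):
    h = sigmoid(w1 * x)   # hidden activation
    y_hat = w2 * h        # linear output
    return h, y_hat

def backprop(x, y, w1, w2):
    """Apply the chain rule to get dJ/dw1 and dJ/dw2
    for the loss J = 0.5 * (y_hat - y)**2."""
    h, y_hat = forward(x, w1, w2)
    dJ_dyhat = y_hat - y              # dJ/dy_hat
    dJ_dw2 = dJ_dyhat * h             # y_hat = w2 * h
    dJ_dh = dJ_dyhat * w2             # propagate error back to h
    dh_dz = h * (1.0 - h)             # sigmoid derivative
    dJ_dw1 = dJ_dh * dh_dz * x        # chain rule through the hidden unit
    return dJ_dw1, dJ_dw2

# One gradient-descent step on the weights (illustrative values):
x, y = 1.0, 0.5
w1, w2 = 0.3, -0.2
lr = 0.1
g1, g2 = backprop(x, y, w1, w2)
w1 -= lr * g1
w2 -= lr * g2
```

After the update, the loss on this example is smaller than before, which is exactly what the gradient buys us.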

Backpropagation implementations are prone to subtle bugs: a small error in the gradient computation can be present and ruin training.

With such a bug, it may look like J(θ) is decreasing, but in reality it may not be decreasing by as much as it should. Using a numerical method to check the gradient can help diagnose the bug.

Gradient checking helps make sure an implementation is working correctly.
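The numerical check mentioned above can be sketched as follows. This is an illustrative example (the function J and the values are assumptions): approximate dJ/dw with a centered finite difference and compare it to the analytic gradient obtained via the chain rule.

```python
def numeric_grad(J, w, eps=1e-5):
    """Centered difference: (J(w+eps) - J(w-eps)) / (2*eps)."""
    return (J(w + eps) - J(w - eps)) / (2.0 * eps)

# Example loss: J(w) = (w*x - y)**2 for fixed x, y.
# The chain rule gives the analytic gradient 2*(w*x - y)*x.
x, y, w = 2.0, 1.0, 0.7
J = lambda w: (w * x - y) ** 2
analytic = 2.0 * (w * x - y) * x
approx = numeric_grad(J, w)
assert abs(analytic - approx) < 1e-6  # a large mismatch would signal a bug
```

In practice this check is run once on a small network to validate the backpropagation code, then turned off, since the finite-difference approximation is far too slow for training.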

