Understanding Backpropagation and Gradient Descent in Neural Networks
Backpropagation is the algorithm that makes it possible to train neural networks with hidden layers: it applies the chain rule to propagate the error signal backwards from the output, yielding the gradient of the loss with respect to every weight in the network, including the input-to-hidden connections. Gradient descent then uses those gradients to adjust the weights. On its own, gradient descent suffices for a single layer of connection weights; backpropagation is what extends it to networks with multiple layers of neuron-like processing units.
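To make the two steps concrete, here is a minimal sketch in NumPy: a small two-layer network trained on XOR, with the backward pass written out by hand and a plain gradient descent update. The layer sizes, sigmoid activation, mean squared error loss, learning rate, and random seed are all illustrative choices, not prescribed by the text above.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR inputs and targets -- a task a single layer of weights cannot solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two layers of connection weights: input->hidden (W1) and hidden->output (W2)
W1 = rng.normal(0, 1, (2, 3))
b1 = np.zeros((1, 3))
W2 = rng.normal(0, 1, (3, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0  # learning rate for the gradient descent step (illustrative)
for step in range(10000):
    # Forward pass: hidden activations, then the network output
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass (backpropagation): chain-rule the error from the
    # output layer back through the hidden layer
    d_out = (out - y) * out * (1 - out)   # dLoss/d(pre-activation) at output
    d_h = (d_out @ W2.T) * h * (1 - h)    # error signal at the hidden layer

    # Gradient descent: move each weight against its gradient
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round(3))  # should approach [[0], [1], [1], [0]]
```

Note that the hidden-layer error signal d_h is computed from the output-layer error d_out; this backwards flow is exactly what lets gradient descent reach the input-to-hidden weights W1.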