Backpropagation

    Algorithm for training neural networks by adjusting weights, using gradients of the error, to minimize prediction error.

    A supervised learning algorithm used to train artificial neural networks. It propagates the error backward through the network layers, using the chain rule to compute the gradient of the loss with respect to each weight, and updates the weights in the direction that reduces the prediction error.
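    The idea can be sketched in a few lines of NumPy. The network below (a hypothetical two-layer sigmoid network trained on XOR with squared-error loss; the architecture, learning rate, and iteration count are illustrative choices, not part of the definition above) computes a forward pass, then propagates the error backward layer by layer via the chain rule and applies gradient-descent updates:

    ```python
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Toy data: XOR (one input per row, one target per row).
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)   # input -> hidden
    W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros(1)   # hidden -> output
    lr = 0.5

    loss_before = np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2)

    for _ in range(5000):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)     # hidden activations
        out = sigmoid(h @ W2 + b2)   # network predictions

        # Backward pass: chain rule, from output back toward the input.
        d_out = (out - y) * out * (1 - out)      # error through output sigmoid
        d_h = (d_out @ W2.T) * h * (1 - h)       # error through hidden sigmoid

        # Gradient-descent weight updates.
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0)

    loss_after = np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2)
    ```

    After training, `loss_after` should be well below `loss_before`, since each update moves the weights against the gradient of the error.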