Backpropagation

Backpropagation is a machine-learning algorithm that adjusts a model's parameters to minimize a loss function. It computes the gradient of the loss with respect to the weights of the network efficiently, one layer at a time from the output layer backward, reusing intermediate results in the manner of dynamic programming. Backpropagation is commonly used in conjunction with gradient descent for training neural networks.
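
As a minimal illustrative sketch (not taken from any course materials; the network size, data, and learning rate are arbitrary assumptions), the NumPy code below trains a one-hidden-layer network: the forward pass caches intermediate values, the backward pass applies the chain rule one layer at a time to obtain gradients, and a gradient-descent step updates the weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 samples, 3 input features, 1 target value each.
X = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# Parameters of a 3 -> 5 -> 1 network (sizes chosen for illustration).
W1 = rng.normal(size=(3, 5)) * 0.1
b1 = np.zeros((1, 5))
W2 = rng.normal(size=(5, 1)) * 0.1
b2 = np.zeros((1, 1))

lr = 0.1
for step in range(100):
    # Forward pass: cache intermediate values needed by the backward pass.
    z1 = X @ W1 + b1               # hidden-layer pre-activation
    a1 = np.tanh(z1)               # hidden-layer activation
    y_hat = a1 @ W2 + b2           # network output
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass: apply the chain rule layer by layer, reusing the
    # cached forward values (the dynamic-programming aspect).
    dy_hat = 2 * (y_hat - y) / y.shape[0]    # dL/dy_hat
    dW2 = a1.T @ dy_hat                      # dL/dW2
    db2 = dy_hat.sum(axis=0, keepdims=True)
    da1 = dy_hat @ W2.T                      # propagate gradient to hidden layer
    dz1 = da1 * (1 - np.tanh(z1) ** 2)       # derivative of tanh
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0, keepdims=True)

    # Gradient-descent update using the gradients from backpropagation.
    W1 -= lr * dW1
    b1 -= lr * db1
    W2 -= lr * dW2
    b2 -= lr * db2
```

In practice a framework such as PyTorch computes these gradients automatically via the same principle; the manual version above only makes the layer-by-layer chain rule explicit.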

1 course covers this concept

11-785 Introduction to Deep Learning

Carnegie Mellon University

Spring 2020

This course provides a comprehensive introduction to deep learning, starting from foundational concepts and moving toward complex topics such as sequence-to-sequence models. Students gain hands-on experience with PyTorch and fine-tune models through practical assignments. A basic understanding of calculus, linear algebra, and Python programming is required.
