Hebbian theory, introduced by Donald Hebb in 1949, explains how synaptic efficacy increases when a presynaptic cell repeatedly and persistently stimulates a postsynaptic cell. It is often summarized as "cells that fire together wire together," though Hebb's formulation emphasizes temporal precedence: the presynaptic cell must take part in firing the postsynaptic one. The theory is used to explain associative learning and is considered a neuronal basis of unsupervised learning in neural networks.
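The update described above is often written as Δw = η·x·y: a weight grows in proportion to the product of presynaptic input x and postsynaptic output y. A minimal sketch in NumPy, with hypothetical names and an assumed linear neuron y = w·x:

```python
import numpy as np

def hebbian_update(w, x, eta=0.1):
    """One Hebbian step: weights strengthen where presynaptic input x
    and postsynaptic activation y = w . x are simultaneously active."""
    y = float(np.dot(w, x))   # postsynaptic activation (linear neuron)
    return w + eta * x * y    # dw = eta * x * y: "fire together, wire together"

# Repeatedly present the same input pattern; weights on active inputs grow,
# weights on silent inputs stay unchanged (unsupervised: no target signal).
w = np.full(3, 0.1)              # small nonzero start so y > 0 initially
x = np.array([1.0, 0.0, 1.0])    # persistent presynaptic pattern
for _ in range(5):
    w = hebbian_update(w, x)
```

Note that this plain rule only ever increases weights for co-active units, so practical variants (e.g. Oja's rule) add a normalization term to keep weights bounded.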
Carnegie Mellon University
Spring 2020
This course provides a comprehensive introduction to deep learning, starting from foundational concepts and progressing to advanced topics such as sequence-to-sequence models. Students gain hands-on experience implementing and fine-tuning models in PyTorch through practical assignments. A basic understanding of calculus, linear algebra, and Python programming is required.