A recursive neural network is a deep neural network that applies the same set of weights recursively over a structured input, producing structured predictions over variable-size inputs. It has been used in natural language processing to model sequences and parse trees, and to learn distributed representations of phrases and sentences. Models and frameworks of this kind have been developed since the 1990s.
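The weight-sharing idea above can be sketched in a few lines of NumPy: one composition matrix is reused at every internal node of a binary parse tree, so inputs of any size map to a fixed-dimensional vector. The names (`compose`, `encode`) and the toy tree are illustrative assumptions, not from any particular framework.

```python
import numpy as np

d = 4                                        # embedding dimension (toy value)
rng = np.random.default_rng(0)
W = rng.standard_normal((d, 2 * d)) * 0.1    # shared weights, reused at every node
b = np.zeros(d)

def compose(left, right):
    """Combine two child vectors into a parent vector using the shared weights."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

def encode(tree, embeddings):
    """Recursively encode a tree: leaves are words, internal nodes are (left, right) pairs."""
    if isinstance(tree, str):
        return embeddings[tree]
    left, right = tree
    return compose(encode(left, embeddings), encode(right, embeddings))

# Hypothetical word embeddings and the parse ((the cat) sat);
# the same W composes every constituent, regardless of tree size.
embeddings = {w: rng.standard_normal(d) * 0.1 for w in ["the", "cat", "sat"]}
vec = encode((("the", "cat"), "sat"), embeddings)
print(vec.shape)  # (4,)
```

Because the composition function is shared, the number of parameters is independent of input size; training would backpropagate through the tree structure, which this sketch omits.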
Stanford University
Winter 2023
CS 224N provides an in-depth introduction to neural networks for NLP, focusing on end-to-end neural models. The course covers topics such as word vectors, recurrent neural networks, and transformer models, among others.