Word Vectors

Word embedding

Word embeddings are vector-space representations of words that encode meaning so that semantically similar words lie close together in that space. They can be generated with a variety of techniques, such as neural networks, dimensionality reduction, and probabilistic models, and have been shown to improve performance on NLP tasks such as syntactic parsing and sentiment analysis.
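As a minimal sketch of the "similar words are close together" idea, the snippet below compares a few toy, hand-picked 3-dimensional vectors with cosine similarity. The vectors are illustrative assumptions only; real embeddings are learned (for example by a neural network) and typically have hundreds of dimensions.

```python
import numpy as np

# Toy, hand-picked 3-dimensional vectors purely for illustration;
# learned embeddings would come from a trained model, not from hand-tuning.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.10, 0.05, 0.90]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words get a higher similarity score.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # close to 1
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # much lower
```

The same cosine-similarity computation is what nearest-neighbor lookups over real embedding tables rely on.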

1 course covers this concept

CS 224N: Natural Language Processing with Deep Learning

Stanford University

Winter 2023

CS 224N provides an in-depth introduction to neural networks for NLP, focusing on end-to-end neural models. The course covers topics such as word vectors, recurrent neural networks, and transformer models, among others.

