N-gram Language Models


An n-gram language model is a type of language model that predicts the probability of the next word in a sequence based on a fixed window of the n − 1 preceding words (the Markov assumption). For example, a bigram model (n = 2) conditions on only the single previous word, while a trigram model (n = 3) conditions on the previous two. These models can be estimated using frequency counts from a text corpus, but typically require smoothing techniques to handle data sparsity, since many valid word sequences never appear in the training data. N-gram models are no longer common in modern natural language processing research and applications, having been largely replaced by deep learning methods such as large language models.
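The estimation procedure described above can be sketched in a few lines. The example below is a minimal, illustrative bigram model: probabilities are maximum-likelihood counts with add-one (Laplace) smoothing, and the `<s>`/`</s>` boundary markers and the tiny corpus are assumptions chosen for the demonstration, not part of the original text.

```python
from collections import Counter

def train_bigram_model(corpus, vocab):
    """Estimate smoothed bigram probabilities from a tokenized corpus.

    corpus: list of sentences, each a list of word tokens.
    vocab:  set of all word types (including boundary markers).
    Returns a function prob(prev, word) giving P(word | prev).
    """
    unigram_counts = Counter()
    bigram_counts = Counter()
    for sentence in corpus:
        tokens = ["<s>"] + sentence + ["</s>"]
        unigram_counts.update(tokens[:-1])          # context counts
        bigram_counts.update(zip(tokens[:-1], tokens[1:]))
    V = len(vocab)

    def prob(prev, word):
        # Add-one smoothing: (count(prev, word) + 1) / (count(prev) + V)
        # ensures unseen bigrams still receive nonzero probability.
        return (bigram_counts[(prev, word)] + 1) / (unigram_counts[prev] + V)

    return prob

# Toy corpus (hypothetical data for illustration only).
corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
vocab = {"<s>", "</s>", "the", "cat", "dog", "sat"}
prob = train_bigram_model(corpus, vocab)
# P(sat | cat) = (1 + 1) / (1 + 6) = 2/7
```

Without the add-one term, any bigram absent from the corpus would get probability zero, which is exactly the sparsity problem that smoothing addresses.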

1 course covers this concept

COS 484: Natural Language Processing

Princeton University

Spring 2023

This course introduces the basics of NLP, including recent deep learning approaches. It covers a wide range of topics, such as language modeling, text classification, machine translation, and question answering.
