Markov's inequality

Markov's inequality is a result in probability theory, named after Andrey Markov, that gives an upper bound on the probability that a non-negative random variable, or a non-negative function of a random variable, is greater than or equal to some positive constant. The result was also discovered by Pafnuty Chebyshev and Jean-Baptiste Bienaymé.
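
Stated precisely: for a non-negative random variable X and any constant a > 0,

\[
\Pr(X \ge a) \le \frac{\mathbb{E}[X]}{a},
\]

and, more generally, for a non-negative function f of X, Pr(f(X) ≥ a) ≤ E[f(X)] / a. For instance, if a non-negative random variable has mean 2, the probability that it takes a value of at least 10 is at most 2/10 = 0.2.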

3 courses cover this concept

CSE 312 Foundations of Computing II

University of Washington

Winter 2022

This course dives deep into the role of probability in computer science, with applications to algorithms, systems, data analysis, machine learning, and more. Prerequisites include CSE 311, MATH 126, and a grasp of calculus, linear algebra, set theory, and basic proof techniques. Topics covered range from discrete probability to hypothesis testing and bootstrapping.

CS 168: The Modern Algorithmic Toolbox

Stanford University

Spring 2022

CS 168 provides a comprehensive introduction to modern algorithmic techniques, covering hashing, dimension reduction, programming, gradient descent, and regression. It emphasizes both theoretical understanding and practical application, with each topic complemented by a mini-project. It is suitable for students who have taken CS107 and CS161.

CS 265 / CME 309 Randomized Algorithms and Probabilistic Analysis

Stanford University

Fall 2022

This course explores the use of randomness in algorithms and data structures, emphasizing the theoretical foundations of probabilistic analysis. Topics range from tail bounds and Markov chains to randomized algorithms, with applications to machine learning, networking, and systems. The prerequisites indicate that an intermediate-level background is required.
