Conditional Independence


Conditional independence in probability theory describes situations in which, once some conditioning information is known, an additional observation provides no further information about a hypothesis. It is expressed as an equality between the probability of the hypothesis given both the conditioning information and the observation, and its probability given the conditioning information alone. This concept is central to graph-based theories of statistical inference.
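In standard notation (a minimal statement of the definition, assuming events A, B, and C with P(C) > 0), A and B are conditionally independent given C when

P(A ∩ B | C) = P(A | C) · P(B | C),

or, equivalently, P(A | B, C) = P(A | C) whenever P(B ∩ C) > 0. In words: once C is known, learning B does not change the probability of A.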

1 course covers this concept

10-401 Introduction to Machine Learning

Carnegie Mellon University

Spring 2018

A comprehensive exploration of machine learning theory and practical algorithms. Covers a broad spectrum of topics, such as decision tree learning, neural networks, statistical learning, and reinforcement learning, and encourages hands-on learning through programming assignments.
