Independence

Defines: independence, independent, independent events

Two events A and B are independent if the occurrence of one does not affect the probability of the other: P(A ∩ B) = P(A) · P(B). Equivalently, conditioning is uninformative: P(A | B) = P(A) whenever P(B) > 0, and P(B | A) = P(B) whenever P(A) > 0. Independence means that knowing B occurred gives no information about whether A occurred.
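The product rule can be checked directly by enumerating a finite sample space. A minimal sketch (the two-dice setup and event choices here are illustrative, not from the text): roll two fair dice, let A be "the first die is even" and B be "the second die shows 6", and verify P(A ∩ B) = P(A) · P(B) with exact rational arithmetic.

```python
from fractions import Fraction
from itertools import product

# Sample space: all ordered pairs (first die, second die), equally likely.
omega = list(product(range(1, 7), repeat=2))

def prob(event):
    """Exact probability of an event under the uniform measure on omega."""
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] % 2 == 0   # first die is even
B = lambda w: w[1] == 6       # second die shows 6

p_a = prob(A)                          # 1/2
p_b = prob(B)                          # 1/6
p_ab = prob(lambda w: A(w) and B(w))   # 1/12
assert p_ab == p_a * p_b               # P(A ∩ B) = P(A) · P(B)
```

Using Fraction avoids floating-point round-off, so the equality test is exact rather than approximate.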

Independence is symmetric (if A is independent of B, then B is independent of A) but not transitive. It is distinct from mutual exclusivity: mutually exclusive events (A ∩ B = ∅) are strongly dependent — knowing one occurred tells you the other did not. Pairwise independence of several events does not imply mutual independence; the latter requires the product rule P(Aᵢ₁ ∩ Aᵢ₂ ∩ … ∩ Aᵢₖ) = P(Aᵢ₁) · P(Aᵢ₂) · … · P(Aᵢₖ) to hold for every subcollection of the events, not just for the full collection.
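The gap between pairwise and mutual independence has a classic witness (often attributed to Bernstein; the construction below is a standard illustration, not taken from the text): flip two fair coins, let A be "the first flip is heads", B be "the second flip is heads", and C be "the two flips match". Every pair satisfies the product rule, but the triple does not.

```python
from fractions import Fraction
from itertools import product

# Sample space: ordered pairs of coin results, equally likely.
omega = list(product("HT", repeat=2))

def prob(event):
    """Exact probability of an event under the uniform measure on omega."""
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] == "H"     # first flip heads
B = lambda w: w[1] == "H"     # second flip heads
C = lambda w: w[0] == w[1]    # the two flips match

# Every pair is independent: P(X ∩ Y) = P(X) · P(Y).
for X, Y in [(A, B), (A, C), (B, C)]:
    assert prob(lambda w: X(w) and Y(w)) == prob(X) * prob(Y)

# But the triple fails the product rule: 1/4 != 1/8.
triple = prob(lambda w: A(w) and B(w) and C(w))
assert triple == Fraction(1, 4)
assert prob(A) * prob(B) * prob(C) == Fraction(1, 8)
```

Intuitively, any two of A, B, C carry no information about each other, yet any two together determine the third, so the three cannot be mutually independent.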

Random variables X and Y are independent if every event defined through X is independent of every event defined through Y. Independence of random variables is the foundation of classical statistics: it justifies using the sample to draw conclusions about the population, and it underlies the law of large numbers and the central limit theorem.

Cite

@misc{emsenn2026-independence,
  author    = {emsenn},
  title     = {Independence},
  year      = {2026},
  url       = {https://emsenn.net/library/math/domains/probability/terms/independence/},
  publisher = {emsenn.net},
  license   = {CC BY-SA 4.0}
}