mutual information

A technical introduction to mutual information.

Mutual Information and Dependence

Assumed audience

  • Reading level: technical.
  • Background: probability and basic entropy.
  • Goal: understand mutual information as shared uncertainty.

Definition

Mutual information measures how much knowing one variable reduces uncertainty about another:

I(X;Y) = H(X) + H(Y) - H(X,Y).
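The definition can be computed directly from a joint probability table. A minimal sketch (the `joint` table and helper names are illustrative, not from the source):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint table joint[x][y]."""
    px = [sum(row) for row in joint]          # marginal distribution of X
    py = [sum(col) for col in zip(*joint)]    # marginal distribution of Y
    pxy = [p for row in joint for p in row]   # flattened joint distribution
    return entropy(px) + entropy(py) - entropy(pxy)

# Example: correlated binary pair where X = Y with probability 0.9.
joint = [[0.45, 0.05],
         [0.05, 0.45]]
print(round(mutual_information(joint), 3))  # ≈ 0.531 bits
```

Knowing X here resolves about half a bit of the one bit of uncertainty in Y.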

Interpretation

  • If X and Y are independent, I(X;Y) = 0.
  • If X determines Y, I(X;Y) = H(Y): all of Y's uncertainty is resolved by knowing X.
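Both limiting cases can be checked numerically. A short sketch using the definition above (the joint tables are illustrative assumptions):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint table joint[x][y]."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    pxy = [p for row in joint for p in row]
    return entropy(px) + entropy(py) - entropy(pxy)

# Independent fair coins: the joint factorizes, so I(X;Y) = 0.
independent = [[0.25, 0.25],
               [0.25, 0.25]]

# X determines Y (here Y = X): I(X;Y) = H(Y) = 1 bit.
deterministic = [[0.5, 0.0],
                 [0.0, 0.5]]

print(mutual_information(independent))    # 0.0
print(mutual_information(deterministic))  # 1.0
```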

Why this matters

Mutual information is a core tool for detecting statistical dependence and for feature selection, since it captures nonlinear relationships that correlation can miss.

Relations

Date created
Date updated
Dependencies
  • Information curricula entropy and surprise.md
Part of
Information disciplines information theory terms

Cite

@misc{emsenn2025-mutual-information,
  author    = {emsenn},
  title     = {mutual information},
  year      = {2025},
  note      = {A technical introduction to mutual information.},
  url       = {https://emsenn.net/library/information/domains/information-theory/terms/mutual-information/},
  publisher = {emsenn.net},
  license   = {CC BY-SA 4.0}
}