Likelihood

Defines: likelihood, likelihood function

The likelihood of a hypothesis H given observed evidence E is P(E | H) — the probability of observing the evidence if the hypothesis were true. Likelihood is not itself a probability of the hypothesis; it is the probability of the data under the hypothesis, viewed as a function of H with E fixed.
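To make the distinction concrete, here is a minimal sketch in Python using an assumed coin-flip scenario (7 heads in 10 tosses): the likelihood function takes a hypothesized heads-probability θ and returns the probability of the fixed data under that hypothesis.

```python
from math import comb

def likelihood(theta, heads=7, flips=10):
    """P(E | H): probability of observing `heads` heads in `flips`
    independent tosses if the coin's true heads-probability is theta.
    With the data fixed, this is a function of the hypothesis theta."""
    return comb(flips, heads) * theta**heads * (1 - theta)**(flips - heads)

# The hypothesis theta = 0.7 explains 7 heads in 10 better than theta = 0.5,
# but neither value is "the probability that theta is true".
print(likelihood(0.7))  # ≈ 0.267
print(likelihood(0.5))  # ≈ 0.117
```

Note that the likelihood values across all θ need not sum to 1, which is one way to see that the likelihood is not itself a probability distribution over hypotheses.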

Likelihood plays a central role in both frequentist and Bayesian statistics. In maximum likelihood estimation, the parameter value that maximizes the likelihood function is taken as the best estimate. In Bayesian inference, the likelihood is the updating factor in Bayes’ theorem: posterior ∝ likelihood × prior, or P(H | E) ∝ P(E | H) · P(H).
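Both uses can be sketched with the same assumed coin-flip data (7 heads in 10 tosses). The grid search below is purely illustrative; for a binomial the maximum likelihood estimate has the closed form heads/flips.

```python
from math import comb

def likelihood(theta, heads=7, flips=10):
    """P(E | H) for `heads` heads in `flips` tosses given heads-probability theta."""
    return comb(flips, heads) * theta**heads * (1 - theta)**(flips - heads)

thetas = [i / 100 for i in range(101)]  # candidate hypotheses 0.00 .. 1.00

# Maximum likelihood estimation: pick the theta that maximizes the likelihood.
mle = max(thetas, key=likelihood)  # → 0.7, matching the closed form 7/10

# Bayesian inference on the same grid: posterior ∝ likelihood × prior.
prior = {t: 1 / len(thetas) for t in thetas}           # uniform prior over the grid
unnormalized = {t: likelihood(t) * prior[t] for t in thetas}
z = sum(unnormalized.values())                          # normalizing constant
posterior = {t: u / z for t, u in unnormalized.items()}  # sums to 1
```

With a uniform prior, the posterior is proportional to the likelihood itself, so its mode coincides with the MLE; a non-uniform prior would shift the posterior away from the likelihood's peak.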

The likelihood ratio P(E | H₁) / P(E | H₂) compares how well two hypotheses explain the same evidence, without requiring prior probabilities. This ratio is the basis of likelihood ratio tests in frequentist statistics and of Bayes factors in Bayesian model comparison.
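A sketch of the ratio for the same assumed data, comparing a "biased toward heads" hypothesis (θ = 0.7) against a "fair coin" hypothesis (θ = 0.5); no prior probabilities appear anywhere in the computation.

```python
from math import comb

def likelihood(theta, heads=7, flips=10):
    """P(E | H) for `heads` heads in `flips` tosses given heads-probability theta."""
    return comb(flips, heads) * theta**heads * (1 - theta)**(flips - heads)

# P(E | H1) / P(E | H2): how many times better H1 explains the evidence than H2.
lr = likelihood(0.7) / likelihood(0.5)  # ≈ 2.28
```

A ratio above 1 means the evidence favors H₁ over H₂; here the data favor the biased-coin hypothesis by a factor of roughly 2.3, which is modest support rather than a decisive result.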

Cite

@misc{emsenn2026-likelihood,
  author    = {emsenn},
  title     = {Likelihood},
  year      = {2026},
  url       = {https://emsenn.net/library/math/domains/probability/terms/likelihood/},
  publisher = {emsenn.net},
  license   = {CC BY-SA 4.0}
}