---
id: 20260114132351
title: Bayes' Theorem
type: permanent
created: 2026-01-14T18:23:51Z
modified: 2026-01-14T18:43:20Z
tags:
---

# Bayes' Theorem

Suppose we know the prior P(X), we collect some data Y, and we want to know P(X|Y): in English, the probability that our model X is correct given the data we've collected.

We can use Bayes' Theorem to find it!

$$P(X|Y) = \frac{P(Y|X)\, P(X)}{P(Y)}$$

Where:

- P(X|Y) is called the posterior
- P(Y|X) is called the likelihood
- P(X) is called the prior
- P(Y) is called the evidence

We can usually find the evidence by marginalizing the likelihood over all possible states x_i (the law of total probability):

$$P(Y) = \sum_i P(Y|x_i)\, P(x_i)$$
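
As a quick illustration, here is a minimal sketch of a discrete Bayesian update in Python. The three hypotheses and their prior/likelihood values are made-up numbers for the example, not from the source:

```python
import numpy as np

prior = np.array([0.5, 0.3, 0.2])        # P(x_i): prior over three hypotheses
likelihood = np.array([0.9, 0.4, 0.1])   # P(Y | x_i): how well each hypothesis explains the data Y

# Evidence via the law of total probability: P(Y) = sum_i P(Y | x_i) P(x_i)
evidence = np.sum(likelihood * prior)

# Bayes' Theorem: P(x_i | Y) = P(Y | x_i) P(x_i) / P(Y)
posterior = likelihood * prior / evidence

print(posterior)   # ~[0.763, 0.203, 0.034]; sums to 1
```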

We can apply Bayes' Theorem to states that evolve in time too.

$$P(X_t|Y_t) = \frac{P(Y_t|X_t)\, P(X_t)}{P(Y_t)}$$

where:

- X_{t+1} = f(X_t, w_t) is the state-transition model
- Y_t = h(X_t) + v_t is the measurement model

Here f and h are the (generally distinct) state-transition and measurement functions, w_t is process noise, and v_t is measurement noise.

Bayes' Rule is great for simulation and data-based approaches.
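
Below is a minimal sketch of a grid-based recursive Bayes filter that applies this update in simulation. The linear dynamics f(x, w) = 0.9x + w, identity measurement h(x) = x, Gaussian noise levels, grid range, and number of steps are all assumptions chosen for illustration, not from the source:

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretized state space and a uniform prior P(X_0).
states = np.linspace(-5, 5, 101)
belief = np.full(states.size, 1.0 / states.size)

def f(x, w):
    return 0.9 * x + w          # assumed state transition: X_{t+1} = f(X_t, w_t)

def h(x):
    return x                    # assumed measurement model: Y_t = h(X_t) + v_t

q, r = 0.3, 0.5                 # std devs of process noise w_t and measurement noise v_t

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

x_true = 2.0
for t in range(20):
    # Simulate the true system one step and take a noisy measurement.
    x_true = f(x_true, rng.normal(0, q))
    y = h(x_true) + rng.normal(0, r)

    # Predict: push the belief through the dynamics, P(X_{t+1}) = sum_x P(X_{t+1} | x) P(x).
    transition = gaussian(states[:, None], f(states[None, :], 0.0), q)   # P(x' | x)
    transition /= transition.sum(axis=0, keepdims=True)
    belief = transition @ belief

    # Update: Bayes' Theorem with the likelihood P(Y_t | X_t); dividing by the sum is
    # dividing by the evidence P(Y_t).
    belief *= gaussian(y, h(states), r)
    belief /= belief.sum()

print("true state:", x_true, "MAP estimate:", states[np.argmax(belief)])
```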

Expected Value

## Sources

- Bayesian Signal Processing w/ Dan (1/12)