---
id: 20260114132351
title: Bayes' Theorem
type: permanent
created: 2026-01-14T18:23:51Z
modified: 2026-01-14T18:43:20Z
tags: []
---
# Bayes' Theorem
Suppose we know $P(X)$, *the prior*, and we collect some data
$Y$. We want to know the probability $P(X|Y)$, which in
plain English is *the probability that our model is correct
given the data we've collected*.
We can use Bayes' Theorem to find it!
$$ P(X|Y) = \frac{P(Y|X) P(X)}{P(Y)} $$
Where:
- $P(X|Y)$ is called the *posterior*
- $P(Y|X)$ is called the *likelihood*
- $P(X)$ is called the *prior*
- $P(Y)$ is called the *evidence*
We can usually find the evidence with the law of total
probability, summing over every possible value $x_i$ of $X$:
$$ P(Y) = \sum_i P(Y|x_i) P(x_i) $$
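As a minimal sketch of how these pieces fit together, here is a hypothetical two-hypothesis example in Python (the coins, their head probabilities, and the single observation are illustrative assumptions, not from the source):
```python
import numpy as np

# Hypothetical example: which of two coins (fair or biased) produced
# an observed flip? The hypotheses x_i index the arrays below.
prior = np.array([0.5, 0.5])      # P(x_i): both coins equally likely a priori
p_heads = np.array([0.5, 0.8])    # P(heads | x_i) for each coin

# Data Y: we observe a single flip that lands heads.
likelihood = p_heads              # P(Y | x_i)

# Evidence P(Y) = sum_i P(Y | x_i) P(x_i)
evidence = np.sum(likelihood * prior)

# Posterior P(X | Y) from Bayes' Theorem
posterior = likelihood * prior / evidence
print(posterior)                  # [0.3846..., 0.6153...]
```
Dividing by the evidence is what makes the posterior a proper probability distribution over the hypotheses.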
We can apply Bayes' Theorem to states in time too.
$$ P(X_t|Y_t) = \frac{P(Y_t|X_t) P(X_t)}{P(Y_t)} $$
where:
- $X_{t+1} = f(X_t, w_t)$ is the state-transition (process) model
- $Y_{t} = h(X_t) + v_t$ is the measurement model
$w_t$ is process noise, while $v_t$ is measurement noise.
Bayes' Rule is the foundation of simulation- and data-based
estimation approaches.
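Below is a minimal sketch of that recursive idea as a discrete (histogram) Bayes filter, assuming a small grid of states, a random-walk process model, and Gaussian measurement noise; the grid size, transition probabilities, noise level, and measurements are all illustrative assumptions, not from the source.
```python
import numpy as np

# Hypothetical 1D example: the state lives on a small discrete grid.
states = np.arange(10)
belief = np.full(len(states), 1.0 / len(states))  # initial prior P(X_0)

# Process model X_{t+1} = f(X_t, w_t): a random walk whose noise w_t
# spreads probability to the neighboring cells.
T = np.zeros((len(states), len(states)))
for i in states:
    for j, p in [(i - 1, 0.1), (i, 0.8), (i + 1, 0.1)]:
        if 0 <= j < len(states):
            T[i, j] += p
T /= T.sum(axis=1, keepdims=True)  # renormalize rows at the grid edges

def likelihood(y, x, sigma=1.0):
    # Measurement model Y_t = h(X_t) + v_t with h(x) = x and Gaussian v_t.
    return np.exp(-0.5 * ((y - x) / sigma) ** 2)

# One predict/update cycle per measurement y_t.
for y in [3.2, 4.1, 4.9]:
    belief = T.T @ belief            # predict: propagate the prior through f
    belief *= likelihood(y, states)  # weight by the likelihood P(Y_t | X_t)
    belief /= belief.sum()           # divide by the evidence P(Y_t)

print(states[np.argmax(belief)])     # most probable state after three updates
```
Each pass through the loop is the posterior equation above applied recursively: the previous posterior becomes the next prior.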
## Related Ideas
[[Expected Value]]
## Sources
Bayesian Signal Processing w/ Dan (1/12)