# Perplexity - Bayes Rule in Odds Form
## Metadata
**Status**:: #x
**Zettel**:: #zettel/fleeting
**Created**:: [[2025-11-04]]
**Parent**:: [[Perplexity - Bayes Rule]]
## Synopsis
Bayes' rule in odds form states that posterior odds equal prior odds multiplied by the likelihood ratio (or Bayes factor):
$
O(A_1 : A_2 \mid B) = O(A_1 : A_2) \cdot \Lambda(A_1 : A_2 \mid B)
$
Where:
- $O(A_1 : A_2)$ represents the **prior odds** between two hypotheses $A_1$ and $A_2$
- $O(A_1 : A_2 \mid B)$ represents the **posterior odds** after observing evidence $B$
- $\Lambda(A_1 : A_2 \mid B) = \dfrac{P(B \mid A_1)}{P(B \mid A_2)}$ is the **likelihood ratio** (also called the Bayes factor)
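
The identity follows directly from writing Bayes' rule for each hypothesis and taking the ratio, so the normalizing constant $P(B)$ cancels:
$
\frac{P(A_1 \mid B)}{P(A_2 \mid B)} = \frac{P(B \mid A_1)\,P(A_1)\,/\,P(B)}{P(B \mid A_2)\,P(A_2)\,/\,P(B)} = \frac{P(A_1)}{P(A_2)} \cdot \frac{P(B \mid A_1)}{P(B \mid A_2)}
$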
## Understanding the Components
The prior odds between two hypotheses are simply the ratio of their probabilities:
$
O(A_1 : A_2) = \frac{P(A_1)}{P(A_2)}
$
Similarly, posterior odds are defined as:
$
O(A_1 : A_2 \mid B) = \frac{P(A_1 \mid B)}{P(A_2 \mid B)}
$
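
As an illustrative instance (the numbers here are made up for this note): suppose $P(A_1) = 1\%$ and $A_2 = \neg A_1$, so the prior odds are $1 : 99$, and the evidence $B$ is nine times as likely under $A_1$ as under $A_2$, i.e. $\Lambda(A_1 : A_2 \mid B) = 9$. Then
$
O(A_1 : A_2 \mid B) = \frac{1}{99} \cdot 9 = \frac{1}{11}
$
which converts back to a posterior probability of $P(A_1 \mid B) = \frac{1/11}{1 + 1/11} = \frac{1}{12} \approx 8.3\%$.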
## Why the Odds Form Is Useful
The odds form of Bayes' rule has a significant practical advantage over the standard probability form. When updating sequentially on multiple pieces of evidence, you can simply multiply the successive likelihood ratios into the prior odds (assuming the pieces of evidence are conditionally independent given each hypothesis), with no need to renormalize after each observation. Converting odds back to a probability only has to be done once, at the very end of the calculation, which makes the odds form computationally cheaper and easier to chain.
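
A minimal sketch of this workflow in Python, assuming two mutually exclusive and exhaustive hypotheses (so odds $o$ convert back to a probability as $o / (1 + o)$); the prior odds and likelihood ratios below are hypothetical values chosen only for illustration:

```python
def posterior_odds(prior_odds: float, likelihood_ratios: list[float]) -> float:
    """Multiply prior odds by each likelihood ratio P(B_i | A_1) / P(B_i | A_2).

    Valid as a simple product when the B_i are conditionally independent
    given each hypothesis.
    """
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds


def odds_to_probability(odds: float) -> float:
    """Convert odds O(A_1 : A_2) to P(A_1), assuming A_2 is the complement of A_1."""
    return odds / (1.0 + odds)


if __name__ == "__main__":
    prior = 1 / 99               # 1% prior probability for A_1, expressed as odds 1:99
    evidence_lrs = [9.0, 9.0]    # two observations, each 9x more likely under A_1
    post_odds = posterior_odds(prior, evidence_lrs)
    print(f"posterior odds   : {post_odds:.4f}")                        # 81/99 ~ 0.8182
    print(f"posterior P(A_1) : {odds_to_probability(post_odds):.4f}")   # 0.4500
```

Normalization happens only in `odds_to_probability`, once at the end; every intermediate update is just a multiplication.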