Bayesian Inference

Understanding uncertainty through a coin toss

Observed Data

We start with actual observations: a running tally of coin tosses, the only information we have from the real world. Before any tosses, the tally reads:

Heads: 0 | Tails: 0
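A minimal sketch of how such a tally might be represented, using a hypothetical sequence of tosses (the counts below are illustrative, not from the demo):

```python
# Hypothetical observed data: a sequence of coin tosses (1 = heads, 0 = tails).
tosses = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]

heads = sum(tosses)
tails = len(tosses) - heads
print(f"Heads: {heads} | Tails: {tails}")  # Heads: 6 | Tails: 4
```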

Prior Belief

Before seeing any data, we express our belief about the probability of heads, p, as a probability distribution over its possible values.
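A common choice for a probability between 0 and 1 is the Beta distribution; the sketch below assumes that choice (the demo may use a different prior). A Beta(1, 1) prior is uniform, meaning every value of p starts out equally plausible:

```python
from math import gamma

def beta_pdf(p, a, b):
    """Density of the Beta(a, b) distribution at p, for 0 < p < 1."""
    return gamma(a + b) / (gamma(a) * gamma(b)) * p**(a - 1) * (1 - p)**(b - 1)

# Beta(1, 1) is flat: the density is 1.0 at every p.
for p in (0.2, 0.5, 0.8):
    print(p, beta_pdf(p, 1, 1))

# A more opinionated prior, Beta(2, 2), peaks at p = 0.5.
print(beta_pdf(0.5, 2, 2))
```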

Likelihood

The likelihood measures how probable the observed data are under each candidate value of p: values of p that make the data more probable are more compatible with what we saw.
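For coin tosses the likelihood is binomial. A sketch, using hypothetical counts of 6 heads and 4 tails: p = 0.6 matches the data best, while values far from it score lower.

```python
from math import comb

def likelihood(p, heads, tails):
    """Binomial likelihood: probability of the observed counts given p."""
    return comb(heads + tails, heads) * p**heads * (1 - p)**tails

# Compare how compatible different values of p are with 6 heads, 4 tails.
for p in (0.3, 0.6, 0.9):
    print(p, likelihood(p, 6, 4))
```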

Posterior (Updated Belief)

The posterior combines the prior belief and the likelihood via Bayes' rule (posterior ∝ prior × likelihood) to form an updated belief after seeing the data.
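If the prior is a Beta distribution (an assumption, as above), the update has a closed form: a Beta(a, b) prior plus binomial data gives a Beta(a + heads, b + tails) posterior. A sketch with the hypothetical counts from earlier:

```python
def posterior_params(a, b, heads, tails):
    """Conjugate update: Beta(a, b) prior + coin data -> Beta posterior."""
    return a + heads, b + tails

# Uniform Beta(1, 1) prior updated with 6 heads and 4 tails (hypothetical).
a_post, b_post = posterior_params(1, 1, 6, 4)
mean = a_post / (a_post + b_post)
print(a_post, b_post)   # Beta(7, 5)
print(round(mean, 3))   # posterior mean 7/12 ≈ 0.583
```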

How Prior and Data Combine

The posterior reweights the prior by how well each value of p explains the observed data. Drawing samples from the posterior gives many plausible values of p, and their spread reflects the uncertainty that remains.
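A sketch of sampling from the posterior, assuming the Beta(7, 5) posterior from the hypothetical update above (uniform prior, 6 heads, 4 tails):

```python
import random

random.seed(0)

# Draw many plausible values of p from the posterior Beta(7, 5).
samples = [random.betavariate(7, 5) for _ in range(10_000)]

# The sample average should sit near the posterior mean, 7/12 ≈ 0.583,
# and the spread of the samples shows the remaining uncertainty in p.
print(round(sum(samples) / len(samples), 3))
```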

Trace Plot (Sampling Over Time)

Each point is one sampled value of p. Over many samples, the trace shows how the sampler explores the posterior: it wanders around the high-probability region, visiting values in proportion to how plausible they are.
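One way such a trace can arise is from a Metropolis sampler, sketched below under the same assumptions as before (uniform prior, hypothetical counts of 6 heads and 4 tails). The demo's sampler may differ; this only illustrates the idea that each step records one value of p:

```python
import math
import random

random.seed(1)

def log_post(p, heads=6, tails=4):
    """Unnormalized log posterior under a uniform prior (hypothetical counts)."""
    if not 0.0 < p < 1.0:
        return float("-inf")
    return heads * math.log(p) + tails * math.log(1 - p)

# Metropolis sampling: propose a nearby p, accept or reject, record the result.
trace, p = [], 0.5
for _ in range(5_000):
    prop = p + random.gauss(0, 0.1)
    diff = log_post(prop) - log_post(p)
    if diff >= 0 or random.random() < math.exp(diff):
        p = prop
    trace.append(p)  # the trace is the sequence of sampled values over time

# After discarding early "burn-in" steps, the average hovers near 7/12 ≈ 0.583.
print(round(sum(trace[1000:]) / len(trace[1000:]), 3))
```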