Bayes' Theorem: Algebraic Conditional Probability
Bayes’ theorem provides the algebraic formula for computing P(A|B)—the probability of event A given evidence B—directly from known probabilities.
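The formula can be applied directly with exact rational arithmetic. This is a minimal sketch (the function name and example numbers are illustrative, not from the source):

```python
from fractions import Fraction

def bayes_posterior(prior_a, likelihood_b_given_a, prob_b):
    """P(A|B) = P(B|A) * P(A) / P(B)."""
    return likelihood_b_given_a * prior_a / prob_b

# Example: P(A) = 1/3, P(B|A) = 2/3, P(B) = 5/9
posterior = bayes_posterior(Fraction(1, 3), Fraction(2, 3), Fraction(5, 9))
print(posterior)  # 2/5
```

Using Fraction keeps results exact, so posteriors like 2/5 come out as clean fractions instead of floating-point approximations.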
Biased Coin Problem: Non-Uniform Outcome Probabilities
Bob possesses three coins: two fair, one biased (heads 2/3, tails 1/3). He selects one at random, flips it, and gets heads—what is the probability he chose the biased coin?
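The problem can be worked directly from the law of total probability plus Bayes' theorem, a sketch assuming the priors stated above (2/3 fair, 1/3 biased):

```python
from fractions import Fraction

# Priors: 2 fair coins, 1 biased coin, chosen uniformly.
p_fair, p_biased = Fraction(2, 3), Fraction(1, 3)
# Likelihood of heads under each hypothesis.
heads_fair, heads_biased = Fraction(1, 2), Fraction(2, 3)

# Total probability of heads, then the posterior for the biased coin.
p_heads = p_fair * heads_fair + p_biased * heads_biased
p_biased_given_heads = p_biased * heads_biased / p_heads
print(p_biased_given_heads)  # 2/5
```

Seeing heads raises the probability of the biased coin from the prior 1/3 to 2/5, a modest update because one head is weak evidence.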
Asymptotic Certainty: Never Reaching 100% Confidence
No matter how many consecutive heads occur, the probability the coin is unfair approaches but never reaches 100%, preserving fundamental uncertainty in inductive inference.
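This limit behavior is easy to demonstrate numerically. A sketch assuming the fair-versus-double-headed setup described in the "Fair vs Unfair Coin" entry, with a 50/50 prior over the two coins (the function name is illustrative):

```python
from fractions import Fraction

def p_unfair_after_heads(n):
    """Posterior that the coin is double-headed after n consecutive heads,
    assuming a 50/50 prior between a fair and a double-headed coin."""
    fair_path = Fraction(1, 2) * Fraction(1, 2) ** n    # fair coin, n heads
    unfair_path = Fraction(1, 2)                        # double-headed coin always shows heads
    return unfair_path / (fair_path + unfair_path)

for n in (1, 2, 10, 50):
    print(n, float(p_unfair_after_heads(n)))
# Posterior is 2**n / (2**n + 1): it climbs toward 1 but never equals 1.
```

The fair-coin branch shrinks by half with each head but never vanishes, which is exactly why certainty is never reached.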
Fair vs Unfair Coin: Conditional Probability Example
A person possesses two coins—one fair, one double-sided (both heads)—randomly selects one, flips it, and reports “heads.” What is the probability they chose the fair coin?
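A sketch of the calculation, assuming the coin is chosen uniformly at random:

```python
from fractions import Fraction

p_fair = p_double = Fraction(1, 2)            # coin chosen at random
heads_fair, heads_double = Fraction(1, 2), Fraction(1, 1)

# Total probability of heads, then the posterior for the fair coin.
p_heads = p_fair * heads_fair + p_double * heads_double
p_fair_given_heads = p_fair * heads_fair / p_heads
print(p_fair_given_heads)  # 1/3
```

Of the three equally likely ways to see heads (two faces of the double-headed coin, one face of the fair coin), only one comes from the fair coin, hence 1/3.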
Probability Trees: Visualizing Sequential Events
Conditional probability problems become tractable through probability tree methodology—systematically growing branches representing sequential events and their possible outcomes.
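The tree can be represented programmatically by multiplying probabilities along each path from root to leaf. A sketch using the two-coin example (fair vs double-headed) with illustrative labels:

```python
from fractions import Fraction

# Branch 1: which coin was picked.
coins = {"fair": Fraction(1, 2), "double": Fraction(1, 2)}
# Branch 2: the flip outcome, conditional on the coin.
outcomes = {
    "fair":   {"H": Fraction(1, 2), "T": Fraction(1, 2)},
    "double": {"H": Fraction(1, 1), "T": Fraction(0, 1)},
}

# Each leaf's probability is the product of the probabilities along its path.
leaves = {
    (coin, face): coins[coin] * p
    for coin in coins
    for face, p in outcomes[coin].items()
}
print(leaves)
```

The leaf probabilities always sum to 1, which is a useful sanity check when growing larger trees.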
Sequential Evidence: Declining Confidence with Multiple Heads
When the coin-flipper reports a second consecutive “heads,” the probability tree extends further, and confidence in the fair coin decreases from 1/3 to 1/5.
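Extending the tree one level per flip gives the stated 1/3 → 1/5 decline. A sketch assuming a 50/50 prior over the fair and double-headed coins (the function name is illustrative):

```python
from fractions import Fraction

def p_fair_given_heads(n):
    """Posterior for the fair coin after n consecutive reported heads."""
    fair_path = Fraction(1, 2) * Fraction(1, 2) ** n   # fair coin, n heads in a row
    double_path = Fraction(1, 2)                       # double-headed coin always shows heads
    return fair_path / (fair_path + double_path)

print(p_fair_given_heads(1))  # 1/3
print(p_fair_given_heads(2))  # 1/5
```

Each additional head halves the fair-coin path's weight while the double-headed path is unchanged, so the posterior follows the pattern 1/3, 1/5, 1/9, ….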
Tree Balancing: Least Common Multiple for Equal Leaves
When outcome probabilities differ across branches (fair coin 1/2-1/2, biased coin 2/3-1/3), tree balancing using least common multiples enables simple counting of equally likely leaves.
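The balancing step can be sketched as follows: take the LCM of the outcome-probability denominators and split each branch into that many equally likely leaves (requires Python 3.9+ for `math.lcm`; the variable names are illustrative):

```python
from fractions import Fraction
from math import lcm

# Branch outcome probabilities: (heads, tails) for each coin.
branches = {
    "fair":   [Fraction(1, 2), Fraction(1, 2)],
    "biased": [Fraction(2, 3), Fraction(1, 3)],
}

# LCM of all denominators tells us how many equal leaves each branch needs.
denoms = [p.denominator for probs in branches.values() for p in probs]
n = lcm(*denoms)  # 6 here
leaf_counts = {
    coin: [int(p * n) for p in probs] for coin, probs in branches.items()
}
print(n, leaf_counts)  # fair -> [3, 3] leaves, biased -> [4, 2] leaves
```

With 6 leaves per coin, probabilities reduce to counting: the biased coin shows heads on 4 of its 6 equally likely leaves, i.e. 2/3.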
Tree Trimming: Eliminating Impossible Outcomes
When new evidence emerges, probability trees require “trimming”—cutting branches leading to outcomes contradicted by observations.
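Trimming amounts to discarding the leaves inconsistent with the evidence and renormalizing what remains. A sketch using the two-coin tree's leaves after "heads" is reported (leaf labels are illustrative):

```python
from fractions import Fraction

# Leaves of the two-coin tree: (coin, flip outcome) -> path probability.
# The double-headed coin has no tails leaf.
leaves = {
    ("fair", "H"):   Fraction(1, 4),
    ("fair", "T"):   Fraction(1, 4),
    ("double", "H"): Fraction(1, 2),
}

# Evidence: "heads" was reported, so trim every tails branch...
surviving = {path: p for path, p in leaves.items() if path[1] == "H"}
# ...then renormalize the surviving leaves so they sum to 1.
total = sum(surviving.values())
posterior = {path: p / total for path, p in surviving.items()}
print(posterior)  # fair -> 1/3, double -> 2/3
```

Renormalization is what turns the surviving path weights into conditional probabilities given the observation.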