Continuity Theorem

The Continuity Theorem in probability theory states that if a sequence of random variables converges in distribution to a limiting random variable whose cumulative distribution function is continuous (has no jumps), then the distribution functions of the sequence converge to the distribution function of the limit at every point; that is, the probability that a variable in the sequence falls at or below any given value converges to the corresponding probability for the limiting variable. Essentially, it links convergence of distributions to the convergence of probabilities of specific events, guaranteeing that the approximation becomes consistent as the sequence progresses. This theorem is fundamental for understanding how approximate models behave as they approach a target distribution.
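
A compact formal statement of this version of the theorem may help make the prose precise. The following is a minimal LaTeX sketch; the symbols X_n, X, and F are introduced here for illustration and do not appear in the article itself.

% Minimal sketch, compilable on its own; amsmath/amssymb supply the arrow and \mathbb.
\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}
If $X_n \xrightarrow{d} X$ (convergence in distribution) and the limiting
distribution function $F(x) = \Pr(X \le x)$ is continuous on $\mathbb{R}$, then
\[
  \lim_{n \to \infty} \Pr(X_n \le x) = F(x) \quad \text{for every } x \in \mathbb{R}.
\]
\end{document}

Read this way, the theorem says that continuity of the limiting distribution function upgrades convergence in distribution, which by definition only requires convergence at the continuity points of F, to convergence of the probabilities at every point.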