Numerical Analysis II
Convergence in distribution (also called weak convergence or convergence in law) describes how a sequence of random variables X_1, X_2, ... approaches a limiting random variable X: the sequence converges in distribution to X if the cumulative distribution functions F_n(x) = P(X_n <= x) converge to F(x) = P(X <= x) at every point x where F is continuous. This is one of the weakest modes of convergence in probability theory and statistics, and it is central to results like the central limit theorem, which describe how probability distributions behave as sample sizes increase.
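A minimal numerical sketch of this idea, using the central limit theorem as the example (the uniform distribution, sample sizes, and grid below are illustrative choices, not part of the definition): we standardize sample means of Uniform(0,1) draws and measure the largest gap between their empirical CDF and the standard normal CDF on a grid. As the sample size grows, the gap should shrink, which is exactly CDF convergence at continuity points.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

def normal_cdf(t):
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

def sup_cdf_gap(sample_size, n_reps=20000):
    """Largest observed gap between the empirical CDF of the
    standardized sample mean and the standard normal CDF."""
    # Uniform(0,1) draws: mean 1/2, variance 1/12.
    x = rng.uniform(size=(n_reps, sample_size))
    z = (x.mean(axis=1) - 0.5) / sqrt(1.0 / 12.0 / sample_size)
    grid = np.linspace(-3.0, 3.0, 121)
    # Empirical CDF of the standardized means, evaluated on the grid.
    ecdf = (z[:, None] <= grid).mean(axis=0)
    phi = np.array([normal_cdf(t) for t in grid])
    return float(np.max(np.abs(ecdf - phi)))

gaps = {n: sup_cdf_gap(n) for n in (1, 5, 50)}
for n, g in gaps.items():
    print(f"n = {n:3d}: sup |F_n - Phi| ~ {g:.3f}")
```

The printed gaps decrease as n grows, illustrating that the distribution of the standardized mean converges (in distribution) to the standard normal even though no individual sample path converges to anything.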