The concept of "convergence in distribution"


I have been studying convergence in distribution, but I am confused about the concept.

Wikipedia introduces convergence in distribution as follows.


$\text{A sequence } X_1,X_2, \dots \text{ of real-valued random variables is}$ $\text{said to }\mathbf{converge}\text{ }\mathbf{in}\text{ }\mathbf{distribution}\text{ to a random variable } X \text{ if}$

$$\lim_{n\to\infty}F_{n}(x)=F(x)$$

$\text{for every number } x\in\mathbb{R}\text{ at which }F\text{ is continuous.}$ ${\text{Here } F_{n} \text{ and } F \text{ are the cdfs of the random variables }X_n\text{ and }X\text{, respectively.}}$

$\text{Convergence in distribution can be denoted as } X_n\xrightarrow{d} X.$
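To make sure I understand the definition, I checked it numerically on the classical CLT example (written here in Python/NumPy rather than MATLAB; the choice of Uniform(0,1) summands and the sample sizes are just my own illustration): standardized sample means converge in distribution to $N(0,1)$, so their empirical cdf should approach the standard normal cdf.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

n = 100     # number of summands per standardized mean
m = 50000   # number of samples of the standardized mean

# X_n = standardized mean of n iid Uniform(0,1) variables.
# Var(mean) = 1/(12n), so multiply by sqrt(12n) to standardize.
u = rng.random((m, n))
x = (u.mean(axis=1) - 0.5) * np.sqrt(12.0 * n)

def phi(t):
    """Standard normal cdf."""
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

# Compare the empirical cdf of x with the standard normal cdf at a few points.
for t in (-1.0, 0.0, 1.0):
    print(t, float(np.mean(x <= t)), phi(t))
```

The empirical cdf values come out close to $\Phi(t)$, matching the definition pointwise.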


I wonder about the difference between the histogram of samples drawn from a sequence of random variables and convergence in distribution.

For example, even though $\{X_{n}\}$ does not converge in distribution to a specific $X$, its samples $x_1, x_2, \dots, x_n$ may still form a specific histogram, e.g. Gaussian, for large $n$.

(As in my earlier questions, if $\{X_n\}$ is a discrete-time Gaussian process with correlation, its samples $x_1, x_2, \dots, x_n, \dots$ form a Gaussian histogram for large $n$, but the sequence does not converge, marginally, to a Gaussian. Simple MATLAB code is below.)

%% Generating a correlated Gaussian series %%

w = randn(1,10000);                % white Gaussian noise
y = conv(w, ones(1,5)/3, 'same');  % moving-average filter introduces correlation
histogram(y)
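For reference, here is the same experiment translated to Python/NumPy (my own translation of the MATLAB snippet above, with the seed and checks added by me). It also verifies the two properties at issue: the marginal distribution is Gaussian with std $\sqrt{5}/3$, while neighboring samples are strongly correlated.

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.standard_normal(10000)          # white Gaussian noise
y = np.convolve(w, np.ones(5) / 3.0, mode='same')  # moving-average filter

# Marginal: each y_k is a sum of 5 iid N(0,1) terms scaled by 1/3,
# so its std is sqrt(5)/3 (about 0.745).
print(float(np.std(y)))

# Lag-1 autocorrelation: adjacent length-5 windows share 4 of 5 terms,
# so theoretically it is about 4/5.
r1 = float(np.corrcoef(y[:-1], y[1:])[0, 1])
print(r1)
```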

Is there a mathematical name for this concept?

Thank you for reading!