Convergence in probability measure


We are given the following: $(X_n)_{n \in \mathbb{N}}$ is a sequence of independent, identically distributed random variables with pdf $\frac{\ln(x)}{x^2}$ for $x>1$. Now set $X := \min(X_1, X_2, \ldots, X_n)$.

Does $X$ converge in probability to $1$?

By definition, we'd have to show (or disprove) the following:

$\lim_{n \to +\infty} P(\{|X_n - 1| \geq \epsilon \}) = 0$ for all $\epsilon>0$.

My answer: The statement is not true, since $X$ does not converge in distribution to the constant $1$. For this, I've used the fact that, by definition, the cdf of $X$ is given by $\frac{\ln(x)^2}{2}$. For example, as $x$ approaches $1$ from the right, the cdf of $X$ seems to go to zero. Hence $X$ wouldn't converge to $1$ in distribution, and thus also not in probability.

I just need someone to check my reasoning, or correct me. I feel like this could be done more rigorously, or I'm just wrong, which would explain my uneasy gut feeling about the answer I currently have.

Best answer:

You must have been confused by the notation.

Let $Y_n= \min(X_1,\ldots,X_n)$ and note that $Y_n\geq 1$ a.s., so that $$\begin{aligned} P(|Y_n-1| \geq \epsilon) = P(Y_n - 1 \geq \epsilon) &=P(\min(X_1,\ldots,X_n) \geq 1+\epsilon) \\ &=P\Big(\bigcap_{i=1}^n \{X_i\geq 1+\epsilon\}\Big) \\ &=P(X_1\geq 1+\epsilon)^n, \end{aligned}$$ where the last step uses independence. Since $P(X_1\geq 1+\epsilon)<1$ for every $\epsilon>0$, we get $P(X_1\geq 1+\epsilon)^n \to 0$, and hence $Y_n$ converges in probability to $1$.
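This is easy to check numerically. A quick Monte Carlo sketch (mine, not from the post): integrating the given pdf by parts yields the survival function $P(X_1 \geq x) = \frac{1+\ln x}{x}$ for $x \geq 1$, so $P(Y_n \geq x) = \big(\frac{1+\ln x}{x}\big)^n$, which we can invert numerically to sample $Y_n$ directly and compare the simulated frequency of $\{Y_n - 1 \geq \epsilon\}$ with the exact value $P(X_1\geq 1+\epsilon)^n$ from the derivation above.

```python
import math
import random

def surv(x):
    """P(X_1 >= x) = (1 + ln x)/x for x >= 1, from integrating the pdf ln(t)/t^2 by parts."""
    return (1.0 + math.log(x)) / x if x > 1.0 else 1.0

def sample_min(n, u):
    """Inverse-transform sample of Y_n = min(X_1,...,X_n):
    solve 1 - surv(x)**n = u for x by bisection (the cdf has no closed-form inverse)."""
    lo, hi = 1.0, 2.0
    while 1.0 - surv(hi) ** n < u:   # grow the bracket until it contains the root
        hi *= 2.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if 1.0 - surv(mid) ** n < u:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

random.seed(0)
eps, trials = 0.1, 2000
est = {}
for n in (10, 100, 1000):
    hits = sum(sample_min(n, random.random()) - 1.0 >= eps for _ in range(trials))
    est[n] = hits / trials
    exact = surv(1.0 + eps) ** n     # the answer's formula P(X_1 >= 1+eps)^n
    print(f"n={n:5d}  simulated={est[n]:.3f}  exact={exact:.3f}")
```

The simulated frequencies shrink toward $0$ as $n$ grows (slowly, since $P(X_1 \geq 1.1) \approx 0.996$ is close to $1$), matching the exact formula.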