Sets and convergence in probability

Let us consider this toy example: suppose an unknown parameter $\beta$ satisfies $0\leq\beta\leq x_0$, where $x_0$ is some fixed but unobserved positive number. This can of course be written as $$ \beta\in [0,x_0]. $$ Now suppose we have a sequence of random variables $\{x_1,x_2,\ldots\}$ such that $x_n$ converges in probability to $x_0$, and let $$ P_n\equiv\Pr[\beta\in[0,x_n]]. $$ It seems intuitive that $P_n\to 1$ as $n\to\infty$, but how do I prove this rigorously (if the claim is in fact true)?

This is not true in general. Suppose that $x_0=\beta=1$ and $x_n=1-1/n$ (a deterministic sequence, which in particular converges in probability to $x_0$). Then $P_n=0$ for all $n\ge 1$.
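The counterexample is easy to confirm numerically; this minimal sketch (variable names are just illustrative) evaluates the deterministic sequence directly:

```python
# Counterexample: beta = x0 = 1 and x_n = 1 - 1/n.
# x_n -> x0 deterministically (hence in probability),
# yet beta = 1 > 1 - 1/n for every n, so P_n = 0 for all n.
beta = x0 = 1.0

for n in (1, 10, 100, 10_000):
    x_n = 1.0 - 1.0 / n
    in_interval = 0.0 <= beta <= x_n  # always False
    print(f"n = {n:6d}   beta in [0, x_n]: {in_interval}")
```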


You can actually make this work once you place a prior on the location of $\beta$ in $[0,x_0]$. For any $\epsilon>0$, the event $\{\beta\notin[0,x_n]\}$ forces either $\beta\notin[0,x_0-\epsilon]$ or $x_n<x_0-\epsilon$, so \begin{align} \mathsf{P}(\beta\notin[0,x_n])&\le \mathsf{P}(\beta\notin[0,x_0-\epsilon])+\mathsf{P}(x_n<x_0-\epsilon) \\ &\to \mathsf{P}(\beta\notin[0,x_0-\epsilon]) \quad\text{as}\quad n\to\infty, \end{align} where the second term vanishes by convergence in probability. Therefore, provided the remaining probability goes to $0$ as $\epsilon\searrow 0$ (e.g., for a uniform prior on $[0,x_0]$ it equals $\epsilon/x_0$), the desired result $P_n\to 1$ follows.
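As a sanity check of the prior-based argument, here is a Monte Carlo sketch under assumptions not in the original post: $\beta$ uniform on $[0,x_0]$, and $x_n=x_0+\text{noise}$ with Gaussian noise of standard deviation $1/\sqrt{n}$ (so $x_n\to x_0$ in probability). The estimated $P_n$ should climb toward $1$ as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(0)
x0 = 1.0
trials = 100_000

# Assumed prior: beta ~ Uniform(0, x0), so P(beta > x0 - eps) = eps/x0 -> 0.
beta = rng.uniform(0.0, x0, size=trials)

for n in (1, 10, 100, 1000):
    # x_n = x0 + Gaussian noise with std 1/sqrt(n): converges to x0 in probability.
    x_n = x0 + rng.normal(0.0, 1.0 / np.sqrt(n), size=trials)
    P_n = np.mean(beta <= x_n)  # Monte Carlo estimate of Pr[beta in [0, x_n]]
    print(f"n = {n:4d}   P_n ~ {P_n:.3f}")
```

With 100,000 trials the estimates are stable to a couple of decimal places, and they increase monotonically toward $1$, matching the bound above.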