$X_n=Y_n/\lambda_n$ converges in probability to 1


Let $X_n=\frac{Y_n}{\lambda_n}$, where $Y_n$ is a Poisson variable with mean $\lambda_n$. Assume that $\lambda_n\to\infty$ and prove that $X_n\to 1$ in probability.

My attempt is that if $Y_n\sim$Pois$(\lambda_n)$, then we can write $Y_n=\sum_{i=1}^nY_i\sim$Pois$(\sum_{i=1}^n\lambda_i)$, so each $Y_i\sim$Pois$(\lambda_i)$ where each $\lambda_i=\frac{\lambda_n}{n}$. Now $\overline{Y_n}\to\mathbb{E}(Y_1)=\frac{\lambda_n}{n}$, and $\overline{\lambda_n}\to\mathbb{E}(\lambda_1)=\frac{\lambda_n}{n}$ by the law of large numbers. Is this the correct way to say this? Can I now conclude that

$X_n=\sum_{i=1}^nX_i=\frac{\sum_{i=1}^nY_i}{\sum_{i=1}^n\lambda_i}=\frac{\frac{1}{n}\sum_{i=1}^nY_i}{\frac{1}{n}\sum_{i=1}^n\lambda_i}\to\frac{\frac{\lambda_n}{n}}{\frac{\lambda_n}{n}}=1$ in probability by Slutsky's lemma.


BEST ANSWER

There are some abuses and mistakes in your reasoning (for example, writing $\mathbb{E}[\lambda_1]$ doesn't make much sense, since $\lambda_1$ is a constant and not a random variable), but I think they can be corrected without too much trouble. The main problem is the step $$Y_n = \sum_{i=1}^n Y_i,$$ because this is not actually given in your problem. What you are really doing there is using a coupling technique: you are constructing a new sequence of random variables, say $\widetilde{Y}_n$, distributed as $Y_n$, and you are showing that for this new sequence $$ \frac{\widetilde{Y}_n}{\lambda_n}\to 1 \text{ in probability.}$$ In fact, if you look closer, your reasoning shows that $\frac{\widetilde{Y}_n}{\lambda_n}\to 1$ almost surely.

This does settle your problem: it implies that the original sequence $Y_n/\lambda_n$, having the same distribution as the new one, converges in distribution to $1$, and since $1$ is a constant you also obtain convergence in probability. In a sense this is a very elegant proof, but at the same time it is overkill: you have essentially proved Skorokhod's representation theorem in this special case.

A much easier proof can be given by Chebyshev's inequality. Recall that if $Y_n\sim \text{Pois}(\lambda_n)$, then $$\mathbb{E}[Y_n]=\lambda_n, \quad \operatorname{Var}(Y_n)=\lambda_n.$$ So for any $\varepsilon>0$, $$ \mathbb{P}(\vert X_n-1\vert \geq\varepsilon) = \mathbb{P}(\vert Y_n-\lambda_n\vert \geq\varepsilon\lambda_n) \leq \frac{\operatorname{Var}(Y_n)}{\varepsilon^2 \lambda_n^2} = \frac{1}{\varepsilon^2 \lambda_n}\to 0 \text{ as } n\to\infty,$$ which is exactly convergence in probability.
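As a quick sanity check (not part of the proof), a Monte Carlo sketch, assuming NumPy is available, compares the empirical tail probability $\mathbb{P}(\vert X_n-1\vert \geq\varepsilon)$ with the Chebyshev bound $1/(\varepsilon^2\lambda_n)$ for a few growing values of $\lambda_n$:

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 0.1
trials = 100_000
results = {}

# For growing lambda, estimate P(|Y_n/lambda_n - 1| >= eps) by simulation
# and compare it with the Chebyshev bound 1/(eps^2 * lambda_n).
for lam in [100, 1_000, 10_000]:
    x = rng.poisson(lam, size=trials) / lam    # samples of X_n = Y_n / lambda_n
    emp = np.mean(np.abs(x - 1) >= eps)        # empirical tail probability
    bound = 1 / (eps**2 * lam)                 # Chebyshev bound
    results[lam] = (emp, bound)
    print(f"lambda={lam:>6}: empirical {emp:.5f}, Chebyshev bound {bound:.5f}")
```

The empirical tail probability shrinks to zero much faster than the $1/(\varepsilon^2\lambda_n)$ bound, which is expected since Chebyshev is a crude inequality.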

ANOTHER ANSWER

I'm sure that this can be proven directly from the formulae; however, I thought I'd provide a proof that relies only on taking a limit and some established facts.

I use two facts.

  • A sequence $X_n$ which converges in distribution to a constant $a$, converges in probability to $a$.
  • Pointwise convergence of the moment generating functions $M_n$ of a sequence $X_n$ implies convergence in distribution of the sequence, and the limit variable has moment generating function $M(t) = \lim_{n \rightarrow \infty} M_n(t)$.

Note that the first of these two statements really does need convergence to a constant, as in general convergence in distribution does not imply convergence in probability.

Now, the moment generating functions of the sequence $Y_n / \lambda_n$ are given by

$$ M_n(t)= \mathbf E \left[ e^{t Y_n / \lambda_n} \right] = \exp \left( \lambda_n \left( e^{t/\lambda_n} - 1\right) \right).$$ Note then that $$\lim_{n \rightarrow \infty} \lambda_n \left( e^{t/\lambda_n} - 1\right) = t,$$ and so

$$M_n(t) \rightarrow e^t,$$ which is the moment generating function of the constant random variable $Y = 1$, and we are done.

A semi-rigorous derivation of the above limit can be obtained by noting that from the power series of the exponential function:

$$\lambda_n \left( e^{t/\lambda_n} - 1\right) = \lambda_n \left(\sum_{k=0}^\infty \frac{1}{k!} \left(\frac{t}{\lambda_n}\right)^k - 1 \right) = t + \frac{t^2}{2\lambda_n} + O\left(\lambda_n^{-2}\right) \to t.$$
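The MGF convergence can also be checked numerically. The sketch below (plain Python; `math.expm1` is used to avoid cancellation when $t/\lambda_n$ is small) evaluates $M_n(t) = \exp\left(\lambda_n\left(e^{t/\lambda_n}-1\right)\right)$ for a fixed $t$ and growing $\lambda_n$, and records the distance to the limit $e^t$:

```python
import math

# Check numerically that M_n(t) = exp(lambda_n * (e^{t/lambda_n} - 1))
# approaches e^t as lambda_n grows, for a fixed t.
t = 0.7
target = math.exp(t)
errs = []
for lam in [10, 100, 1_000, 10_000]:
    m = math.exp(lam * math.expm1(t / lam))  # expm1(x) = e^x - 1, computed stably
    errs.append(abs(m - target))
    print(f"lambda={lam:>6}: M_n(t)={m:.6f}, |M_n(t)-e^t|={errs[-1]:.2e}")
```

The error decays like $t^2/(2\lambda_n)$, matching the Taylor expansion above.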