Convergence in law of sample means of random variable


Let $\{X_n | n \in \mathbb{N} \}$ be a sequence of independent identically distributed random variables with density function:

$$f_X(x) = e^{\theta - x}I_{(\theta, \infty)}(x)$$

with $\theta > 0$. We define the sequence of sample means by: $$\tilde{X_n} = \frac{1}{n}\sum_{j=1}^nX_j$$

Prove convergence in law of $\{\tilde{X_n} | n \in \mathbb{N}\}$ to a random variable with degenerate distribution at the point $1 + \theta$ as $n$ tends to infinity. (Hint: $\ln(1-x) = -\sum_{k=1}^\infty k^{-1}x^k$.)

My try: I've easily proven convergence in probability (which implies convergence in law) by verifying that the hypotheses of Khinchine's weak law of large numbers are satisfied. From there the result follows immediately. However, the hint provided in the text suggests the exercise was perhaps intended to be solved by more elementary means, and I don't see how the power-series expansion of $\ln(1-x)$ can be used to prove the convergence.

How was this exercise intended to be solved?


Best answer:

The moment generating function of the random variable $\,\,\dfrac{1}{n}\sum\limits_{j=1}^{n}X_j\,\,$ is (for $\lambda<n$) $$ \mathbb{E}\left[e^{\lambda\frac{1}{n}\sum\limits_{j=1}^{n}X_j}\right]{}={}\left(1-\lambda/n\right)^{-n}e^{\theta\lambda}. $$
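For completeness, here is the one-variable computation behind that formula. Writing $X_1 = \theta + E_1$ with $E_1 \sim \mathrm{Exp}(1)$ (which matches the given density), for $t < 1$ and with the substitution $u = x - \theta$:

$$ \mathbb{E}\left[e^{tX_1}\right] = \int_{\theta}^{\infty} e^{tx}\,e^{\theta-x}\,dx = e^{\theta t}\int_{0}^{\infty} e^{-(1-t)u}\,du = \frac{e^{\theta t}}{1-t}. $$

Taking $t = \lambda/n$ and using independence of the $X_j$ gives $\left(e^{\theta\lambda/n}\left(1-\lambda/n\right)^{-1}\right)^{n} = e^{\theta\lambda}\left(1-\lambda/n\right)^{-n}$.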

One way to justify that, as $n\to\infty$, this MGF tends to the MGF of the required degenerate distribution, that is, to justify

$$ \lim\limits_{n\to\infty}\left(1-\lambda/n\right)^{-n}e^{\theta\lambda}{}={}e^{(1+\theta)\lambda}, $$

is to take the logarithm of the MGF and apply the hint. That is, using the hint, one shows that $$ \lim\limits_{n\to\infty}\log\bigg(\left(1-\lambda/n\right)^{-n}e^{\theta\lambda}\bigg){}={}\theta\lambda+\lambda\,. $$ Since $e^{(1+\theta)\lambda}$ is the MGF of the constant $1+\theta$, pointwise convergence of the MGFs on a neighborhood of $0$ implies convergence in distribution.
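The series step can be made explicit. Using the expansion $\ln(1-x) = -\sum_{k=1}^{\infty} x^k/k$, valid for $|x|<1$ (so, for fixed $\lambda$, for all $n > |\lambda|$):

$$ \log\Big(\left(1-\lambda/n\right)^{-n}e^{\theta\lambda}\Big) = \theta\lambda - n\log\left(1-\lambda/n\right) = \theta\lambda + n\sum_{k=1}^{\infty}\frac{1}{k}\left(\frac{\lambda}{n}\right)^{k} = \theta\lambda + \lambda + \sum_{k=2}^{\infty}\frac{\lambda^{k}}{k\,n^{k-1}}, $$

and the last sum is $O(1/n)$ for fixed $\lambda$, so the limit is $\theta\lambda + \lambda$.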

Another answer:

Don't have time for a full answer, but here's an approach that seems to use the hint:

$f_X(x;\theta)= e^{\theta - x}I_{(\theta, \infty)}(x)\implies F_X(x;\theta)=\left(1-e^{\theta-x}\right)I_{(\theta, \infty)}(x)=:p_{\theta}\implies e^{\theta-x}=1-p_{\theta}$ for $x>\theta$

Now take log of both sides:

$$\theta-x=\ln(1-p_{\theta})\implies x=\theta+\sum_{k=1}^{\infty} \frac{p_{\theta}^k}{k}$$

which converges for all $p_{\theta}<1$.

Now, what you would actually do with this to get the CDF of the sample mean, I don't know... perhaps the Bonferroni or Boole inequality?
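Not a proof, but here is a quick numerical sanity check of the claimed limit (the function and parameter names below are my own, not from the exercise): sampling $X_j = \theta + E_j$ with $E_j \sim \mathrm{Exp}(1)$, the sample means should concentrate near $1+\theta$.

```python
import random

def sample_mean(n, theta, rng):
    # Each X_j = theta + Exp(1), matching the density e^{theta - x} on (theta, inf).
    return sum(theta + rng.expovariate(1.0) for _ in range(n)) / n

rng = random.Random(0)
theta = 2.0  # arbitrary choice of theta > 0
for n in (10, 1000, 100000):
    print(n, sample_mean(n, theta, rng))  # should approach 1 + theta = 3.0
```

Since $\mathbb{E}[X_1] = 1 + \theta$, the drift toward $1+\theta$ is just the law of large numbers in action, consistent with the MGF argument above.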