Limiting Distribution of $\left(\prod\limits_{i=1}^{n} U_i\right)^{1/n}$ with $(U_i)$ i.i.d. uniform $(0,\theta)$


Let $(U_i)$ be i.i.d. uniform on $(0,\theta)$ and define $$T_n=\left(\prod_{i=1}^{n} U_i\right)^{1/n}$$

Compute the limiting distribution of the sequence $(T_n)$.

My try: $$ F_{T_n}(t) =\mathsf P(T_n \leq t)=\mathsf P\left(\left(\prod_{i=1}^{n} U_i\right)^{1/n}\leq t\right) =\mathsf P\left(\prod_{i=1}^{n} U_i\leq t^n\right)$$ hence $$ F_{T_n}(t) =\mathsf P\left(\log\prod_{i=1}^n U_i\leq \log t^n\right) =\mathsf P\left(\sum_{i=1}^n \log U_i \leq n\log t\right) $$ that is, $$ F_{T_n}(t)=\mathsf P\left(V_n \leq \log t\right)=F_{V_n}(\log t) $$

where $$V_n=\frac1n\sum_{i=1}^n \log U_i$$

Since $E(\log U_1)=\log\theta-1$ and $\log U_1$ is square integrable, by the CLT, for some positive $\sigma^2$,

$$\sqrt{n}\left(V_n-(\log \theta-1)\right)\stackrel{d}{\rightarrow}\mathsf N(0,\sigma^2)$$

Then

$$ \lim_{n\rightarrow\infty}F_{V_n}(v)= \begin{cases} 1 & v\gt \log\theta -1 \\ 0 & v\lt \log\theta -1 \\ \end{cases} $$ hence $$ \lim_{n\rightarrow\infty}F_{T_n}(t)= \begin{cases} 1 & t\gt \ell \\ 0 & t\lt \ell \\ \end{cases} $$ where $$\ell=\theta e^{-1}$$ Thus, $F_{T_n}(t)\to F_T(t)$ where $\mathsf P(T=\ell)=1$, at every point $t$ where $F_T$ is continuous, that is, at every point $t\ne\ell$. By a well-known theorem, this suffices to show that $T_n\to T$ in distribution, where $\mathsf P(T=\ell)=1$, that is, $T_n\to\theta e^{-1}$ in distribution (hence also in probability).
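A quick Monte Carlo sketch (not part of the argument above) confirms the moments used here: $E(\log U_1)=\log\theta-1$, and the variance equals $1$ exactly, since $-\log(U_1/\theta)$ is a standard exponential variable. The choice $\theta=2$ is arbitrary and purely illustrative.

```python
import random, math

# Numerical sanity check (not in the original post): estimate E[log U] and
# var(log U) for U ~ Uniform(0, theta).  theta = 2 is an arbitrary
# illustrative choice; the exact values are log(theta) - 1 and 1.
random.seed(0)
theta = 2.0
n = 200_000
logs = [math.log(random.uniform(0.0, theta)) for _ in range(n)]

mean = sum(logs) / n
var = sum((x - mean) ** 2 for x in logs) / n

print(round(mean, 3), round(math.log(theta) - 1, 3))  # sample vs. exact mean
print(round(var, 3))                                  # sample variance; exact value is 1
```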

Thus the sequence $T_1, T_2, \dots$ converges in distribution to a degenerate random variable with pmf

$$f_T(t)=I_{\{\theta e^{-1}\}}(t)$$
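This degenerate limit is easy to see in simulation: the following sketch (with an arbitrary illustrative choice $\theta=2$) shows averages of $T_n$ settling at $\theta e^{-1}$ as $n$ grows.

```python
import random, math

# Monte Carlo illustration (not part of the original post) that the geometric
# mean T_n = (prod U_i)^(1/n) concentrates at theta/e as n grows;
# theta = 2 is an arbitrary illustrative choice.
random.seed(1)
theta = 2.0

def sample_tn(n):
    # work on the log scale so the product cannot underflow
    s = sum(math.log(random.uniform(0.0, theta)) for _ in range(n))
    return math.exp(s / n)

for n in (10, 100, 10_000):
    draws = [sample_tn(n) for _ in range(200)]
    avg = sum(draws) / len(draws)
    print(n, round(avg, 3))

print(round(theta / math.e, 3))  # limiting constant theta * e^{-1} -> 0.736
```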



Best answer

As @Chris Janjigian stressed, this method relies on approximations and would need to be made rigorous: the use of the CLT below yields an approximation valid for large $n$.

Consider the random variable (the $V_n$ of the question) \begin{equation} V_n = \frac{\sum_{i=1}^n \ln U_i}{n}. \end{equation}

The mean and variance of $\ln U_i$ are \begin{equation} \mu = E(\ln U) = \int_{-\infty}^{\infty} f_U(u) \ln u \, du = \frac{1}{\theta} \int_{0}^{\theta} \ln u \, du = \ln \theta - 1, \end{equation} \begin{equation} E(\ln^2 U) = \int_{-\infty}^{\infty} f_U(u) \ln^2 u \, du = \frac{1}{\theta} \int_{0}^{\theta} \ln^2 u \, du = \ln^2 \theta - 2 \ln \theta + 2, \end{equation} so \begin{equation} \sigma^2 = \operatorname{var}(\ln U) = E(\ln^2 U) - \mu^2 = 1. \end{equation}

Now \begin{equation} \mathsf P\left(\frac{\sum_{i=1}^n \ln U_i}{n} \leq \ln t\right) = \mathsf P\left(\sqrt{n}(V_n - \mu) \leq \sqrt{n}(\ln t - \mu)\right). \end{equation} By the central limit theorem, $\sqrt{n}(V_n - \mu) \rightarrow \mathsf N(0,\sigma^2) = \mathsf N(0,1)$, so \begin{equation} \mathsf P\left(\frac{\sum_{i=1}^n \ln U_i}{n} \leq \ln t\right) \simeq \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\sqrt{n}(\ln t - \mu)} \exp\left(-\tfrac{1}{2}x^2\right) dx. \end{equation}

Let $y = e^{x/\sqrt{n} + \mu}$, so that $dy = \frac{1}{\sqrt{n}} y \, dx$; the limits $x = -\infty$ and $x = \sqrt{n}(\ln t - \mu)$ become $y = 0$ and $y = t$, and we get \begin{equation} \mathsf P\left(\frac{\sum_{i=1}^n \ln U_i}{n} \leq \ln t\right) \simeq \frac{1}{\sqrt{2\pi}} \int_{0}^{t} \exp\left(-\tfrac{1}{2}x^2\right) \frac{\sqrt{n}}{y} \, dy. \end{equation} Since $x = \sqrt{n}(\ln y - \mu)$, this rearranges to \begin{equation} \mathsf P\left(\frac{\sum_{i=1}^n \ln U_i}{n} \leq \ln t\right) \simeq \frac{1}{\sqrt{2\pi\frac{1}{n}}} \int_{0}^{t} \frac{1}{y} \exp\left(-\frac{1}{2\frac{1}{n}}(\ln y - \mu)^2\right) dy, \end{equation} which is the CDF of a log-normal distribution with log-scale parameters $\mu = \ln \theta - 1$ and $\sigma^2 = \frac{1}{n}$.
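As a rough numerical check of this approximation (not part of the original answer), the sketch below compares the empirical CDF of $T_n$ with the log-normal CDF $\Phi\left(\sqrt{n}(\ln t - \mu)\right)$. The values $\theta=2$, $n=50$, and the number of replications are arbitrary illustrative choices.

```python
import random, math

# Sketch (not from the original answer) comparing the empirical CDF of
# T_n = exp((1/n) * sum log U_i) with the log-normal approximation derived
# above: log-scale parameters mu = log(theta) - 1 and variance 1/n.
random.seed(2)
theta, n, reps = 2.0, 50, 20_000
mu = math.log(theta) - 1

def sample_tn():
    s = sum(math.log(random.uniform(0.0, theta)) for _ in range(n))
    return math.exp(s / n)

draws = sorted(sample_tn() for _ in range(reps))

def lognorm_cdf(t):
    # Phi(sqrt(n) * (log t - mu)) for the Lognormal(mu, 1/n) approximation
    z = math.sqrt(n) * (math.log(t) - mu)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# maximum discrepancy between empirical and approximate CDF at the sample points
ks = max(abs((i + 1) / reps - lognorm_cdf(t)) for i, t in enumerate(draws))
print(round(ks, 3))  # small value indicates a good approximation
```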