Finding the pdf of a random variable generated from another random variable with a known pdf


Initially, there is a non-negative random variable $X$ with probability density function $$f(x) = \lambda e^{-\lambda x}, \qquad x \ge 0.$$ Now we repeatedly sample $X$ and form sets of $N$ values of this variable: $(x_1,x_2,\ldots,x_N)$.

For each set, we define, for example, $$S = \frac{1}{N}\sum_{k=1}^N x_k^2.$$ $S$ is then a new random variable. My question is: how can one find the pdf of this new random variable?

There are 2 solutions below.


Comment: your question is a possible duplicate of this Q&A, where a method is shown to get the answer, but not the answer itself.

You mention simulation: Here are results from a simulation in R of 100,000 realizations of $X^2,$ where $X \sim \mathsf{Exp}(\text{rate}\, \lambda = 1/3)$ and $n = 5.$ Your reason for simulation is unclear; perhaps you want to see if theoretical and simulated answers agree.

Of course, if $n$ is large enough, this sum of independent random variables will be approximately normal. With $m = 100000$ the mean and SD should be accurate to a couple of significant digits.

```r
set.seed(309)  # retain for exactly the same simulation; delete for a fresh run
m = 10^5; n = 5; lam = 1/3
x = rexp(m*n, lam)
MAT = matrix(x^2, nrow=m)  # each row of the matrix is a sample of size n
s = rowMeans(MAT)          # S for each sample
mean(s);  sd(s)
## 17.90796
## 17.83484
hist(s, prob=T, br=100, ylim=c(0,.05), col="skyblue2", main="")
lines(density(s, from=0, to=300), lwd=2, col="darkgreen")
```

[Figure: histogram of the simulated values of $S$ with a kernel density overlay.]
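For comparison, the same simulation can be sketched in Python (NumPy assumed; the seed and generator differ from R, so individual draws will not match, but the summary statistics should again land near the theoretical values $\mathbb{E}[S] = 2/\lambda^2 = 18$ and $\mathrm{SD}(S) = \sqrt{20/(\lambda^4 n)} = 18$):

```python
import numpy as np

# Illustrative Python re-implementation of the R simulation above.
rng = np.random.default_rng(309)               # seed chosen arbitrarily
m, n, lam = 10**5, 5, 1/3
x = rng.exponential(scale=1/lam, size=(m, n))  # m samples of size n from Exp(lam)
s = (x**2).mean(axis=1)                        # S for each sample

# Theory: E[S] = 2/lam^2 = 18 and SD(S) = sqrt(20/lam^4/n) = 18
print(s.mean(), s.std())                       # both should be close to 18
```

With $m = 10^5$ replications, the sample mean and SD of $S$ agree with the theoretical values to roughly two significant digits, consistent with the R output above.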


Since $x_1^2$ has characteristic function $$\phi(t) := \int_0^\infty \lambda \exp(itx^2 - \lambda x)\,dx,$$ and the $x_k$ are independent, $S$ has characteristic function $\phi^N(\frac{t}{N})$, making its pdf $$\frac{1}{2\pi}\int_{\mathbb{R}} \phi^N\!\left(\tfrac{t}{N}\right) e^{-itx}\,dt.$$ I doubt this has a nice closed form, though of course the CLT gives a good approximation for large $N$. We have $\mathbb{E}[x_1^k] = k!\,\lambda^{-k}$, so $x_1^2$ has mean $2\lambda^{-2}$ and variance $24\lambda^{-4} - (2\lambda^{-2})^2 = 20\lambda^{-4}$. Thus $S$ has the same mean and $N^{-1}$ times as much variance.
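As a sanity check on the moment claims, here is a short Python sketch (assuming NumPy and SciPy are available) that evaluates $\mathbb{E}[x_1^k]$ by numerical integration and confirms the mean and variance of $x_1^2$:

```python
import math
import numpy as np
from scipy.integrate import quad

lam = 1/3  # same rate as in the simulation answer; any lam > 0 works

def moment(k):
    # E[X^k] for X ~ Exp(lam), computed numerically; should equal k!/lam^k
    val, _ = quad(lambda x: x**k * lam * math.exp(-lam * x), 0, np.inf)
    return val

m2, m4 = moment(2), moment(4)
mean_Y = m2              # E[x_1^2] = 2/lam^2  (= 18 for lam = 1/3)
var_Y = m4 - m2**2       # Var(x_1^2) = 24/lam^4 - 4/lam^4 = 20/lam^4 (= 1620)

print(mean_Y, 2 / lam**2)
print(var_Y, 20 / lam**4)
```

Dividing `var_Y` by $N$ then gives the variance of $S$, matching the last sentence of the answer.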