$W$ is an $\bigl(\alpha = 3,\ \beta = \tfrac12\bigr)$-Gamma random variable, and $N$ is a Poisson random variable with mean $\mu = \tfrac13$, independent of $W$.
What is $\mathbb{E}\bigl[W^N\bigr]$?
Note: the pmf of a $\mu$-Poisson RV is $p_N(n) = P\bigl(N = n\bigr) = e^{-\mu}\,\frac{\mu^n}{n!}$.
Also, the pmf of an $(r, p)$-Negative Binomial RV is $p_T(n) = \binom{n-1}{r-1}\, p^r (1-p)^{n-r}$, for $n = r, r+1, \dots$
So far, I believe I need to build a negative binomial variable out of $W^N$, presumably by manipulating their pmfs and pdfs.
I have that $P\bigl(W^N \leq t \bigr) = P\bigl(W \leq t^{1/N} \bigr)$, and from there I tried to find their joint cdf, which I think just amounts to replacing $t$ in the Gamma pdf by $t^{1/n}$, and then using the usual rule to find the conditional cdf, which should look something like a negative binomial cdf.
However, when I do that, I just get stuck and can't really move forward. Am I heading in the right direction? Are there other ways to solve this?
Thanks!
We often don't need to compute the full pmf if we only want the expected value.
Here, we can use the law of total expectation, as in $E[E[g(X,Y) \mid Y]]=E[g(X,Y)]$, to write
$$ E[W^N] = E [E[W^N | N]] \tag1$$
But $E[W^n]$, for any fixed positive integer $n$, is the $n$-th (non-central) moment of $W$, an $(\alpha,\beta)$-Gamma rv.
Using the standard formula for the moments of a Gamma distribution, we can write
$$ E[W^n] = \frac{\beta^n \Gamma(\alpha+n)}{\Gamma(\alpha)}\tag2$$
which is also valid for $n=0$. In our case this gives
$$ E[W^n] = \frac{ \Gamma(3+n)}{2^n \Gamma(3)}=\frac{(n+2)!}{2^n 2!}=\frac{(n+2)!}{2^{n+1}} \tag3$$
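As a side note, formula $(3)$ is easy to sanity-check numerically; here is a minimal sketch with NumPy (the seed and sample size are arbitrary choices), sampling $W$ from a Gamma with shape $3$ and scale $\tfrac12$:

```python
import numpy as np
from math import factorial

# Sample W ~ Gamma(shape=3, scale=1/2) and compare Monte Carlo
# estimates of E[W^n] with (n+2)!/2^(n+1) from equation (3).
rng = np.random.default_rng(0)
w = rng.gamma(3.0, 0.5, size=1_000_000)

for n in range(5):
    exact = factorial(n + 2) / 2 ** (n + 1)
    print(n, (w ** n).mean(), exact)
```

The empirical moments should match the closed form to within Monte Carlo error.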
Then $$E[W^N] = E [E[W^N | N]] = E\left[\frac{(N+2)!}{2^{N+1}} \right] \tag4$$
where $N$ follows a Poisson distribution with mean $1/3$. Can you go on from here?
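If you want to check whatever closed form you end up with, the expectation in $(4)$ can be evaluated numerically by truncating the series (the cutoff of 50 terms is an arbitrary choice; the terms decay very quickly):

```python
from math import exp, factorial

# Evaluate E[(N+2)!/2^(N+1)] for N ~ Poisson(1/3) by truncating
# the series  sum_n e^(-mu) mu^n/n! * (n+2)!/2^(n+1).
mu = 1 / 3
total = sum(exp(-mu) * mu ** n / factorial(n)
            * factorial(n + 2) / 2 ** (n + 1)
            for n in range(50))
print(total)
```

Comparing this number with your analytic answer is a quick way to catch algebra slips.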