Variance of a random sum of a function of multiple independent random variables?


I have a function of the form

$$I = {\left( {\sum\limits_{i = 0}^{N-1} {{{h(a_i)}{g(r_i)}}} } \right)^2}$$

where $N$, $a$, and $r$ are independent random variables with known distributions ($N$: Poisson, $a$: normal, $r$: uniform), and $h(a)$ and $g(r)$ are known functions of the random variables $a$ and $r$, respectively. Note that for one realization of $I$ there is one realization of $N$, but $N$ realizations each of $a$ and $r$. My goal is to calculate the variance $Var(I)$. To do so I need to calculate $E(I)$ and $E(I^2)$, since $Var(I) = E(I^2) - E(I)^2$. I calculated the expected value of $I$ as $$E(I) = {\left( {{{E(N)E(h(a))}{E(g(r))}}} \right)^2}$$ although I'm not 100% sure that this is correct. What I don't know is how to calculate $E(I^2)$. Any suggestions?

Further information: $$N\sim Poisson(\lambda)$$ $$E(N)=\lambda$$ $$a\sim Normal(\mu, (\sqrt{0.1}\mu)^2)$$ $$h(a) = a^3$$ $$E(h(a)) = 1.3\mu^3$$ $$r\sim uniform[0, (d/2)^2]$$ $$g(r)={1\over \sqrt {{F^2} + r} }$$ $$E(g(r)) = {8 \over {{d^2}}}\left( {\sqrt {{F^2} + {{(d/2)}^2}} - F} \right)$$

Edit: The answer does not need to be specific for my problem, I would just appreciate any hints or useful theorems that can help me to calculate the variance of a squared random sum of a function of independent random variables.
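Any closed-form answer can be sanity-checked by simulating the setup directly. Below is a minimal Monte Carlo sketch in Python/NumPy; the parameter values for $\lambda$, $\mu$, $d$, and $F$ are illustrative stand-ins, not values from the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameter values (not specified in the question):
lam, mu, d, F = 3.0, 1.0, 2.0, 1.5
trials = 200_000

vals = np.empty(trials)
for t in range(trials):
    N = rng.poisson(lam)                      # one realization of N per realization of I
    a = rng.normal(mu, np.sqrt(0.1) * mu, N)  # N realizations of a ~ Normal(mu, 0.1*mu^2)
    r = rng.uniform(0.0, (d / 2) ** 2, N)     # N realizations of r ~ Uniform[0, (d/2)^2]
    vals[t] = np.sum(a**3 / np.sqrt(F**2 + r)) ** 2   # I = (sum h(a) g(r))^2

print("E[I]   ~", vals.mean())
print("Var(I) ~", vals.var())
```

For $N\sim$ Poisson, the empirical mean should match $E(I)=\lambda E(X^2)+\lambda^2(E(X))^2$ with $X=h(a)g(r)$ (a compound-Poisson second moment), which already shows the proposed $E(I)=(E(N)E(h(a))E(g(r)))^2$ cannot be right in general.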

There are 2 answers below.

Accepted answer:

My goal is to calculate the variance $Var(I)$.

Rewrite your squared sum in the form $$I=S_N^2 = \left(\sum_{i=1}^N X_i\right)^2 $$ where $S_N = X_1+\dots+X_N$, the $X_i$ are i.i.d. copies of $X$ independent of $N$, and now $N-1\sim \text{Poisson}(\lambda).$ (Note that a Poisson r.v. has support $\{0,1,2,\dots\}$, so $N$ has support $\{1,2,3,\dots\}.$)

  1. The variance is $$\begin{align}V(I) &= E(I^2)-(E(I))^2\\ &= E(E(I^2\mid N))-(E(E(I\mid N)))^2\\ &= E(m_2(N))-(E(m_1(N)))^2 \end{align}$$ where $m_1()$ and $m_2()$ are given in the next part.

  2. The conditional moments of $I$, given $N=n$, are $$\begin{align}m_1(n)=E(I\mid N=n) &= E(S_n^2)\\ &= n\,E(X^2) + n(n-1)(E(X))^2 \\ \\ m_2(n)=E(I^2\mid N=n)&=E(S_n^4)\\ &=n\,E(X^4)\\ &+4n(n-1)\,E(X^3)\,E(X)\\ &+3n(n-1)\,(E(X^2))^2\\ &+6n(n-1)(n-2)\,E(X^2)\,(E(X))^2\\&+n(n-1)(n-2)(n-3)\,(E(X))^4 \end{align}$$ These can be proved by a combinatorial approach: count how many terms of each form appear in the multivariate polynomials $S_n^2=\left(\sum_{i=1}^n X_i\right)^2$ and $S_n^4=\left(\sum_{i=1}^n X_i\right)^4$. The terms in $S_n^2$ take exactly two forms, namely $n$ of type $X_i^2$ and $n(n-1)$ of type $X_iX_j$ $(i\ne j)$, which give rise to the expectations $E(X^2)$ and $(E(X))^2$, respectively. Similarly, the terms in $S_n^4$ take exactly five forms, namely $X_i^4$, $X_i^3X_j$, $X_i^2X_j^2$, $X_i^2X_jX_k$, and $X_iX_jX_kX_l$ (with distinct indices within each term), which give rise to the expectations $E(X^4)$, $E(X^3)E(X)$, $(E(X^2))^2$, $E(X^2)(E(X))^2$, and $(E(X))^4$, respectively. Counting the number of terms of each type thus provides the coefficients (polynomials in $n$) of the corresponding expected values. (Before I found the linked proof above, I found these for $S_n^4$ by generating the multivariate polynomials for $n=1..10$ using Sage, counting the various types of term, then using OEIS to determine the formula. The linked formula confirms mine.)
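The counting argument in (2) can be checked by brute force: enumerate every index tuple in the expansion of $S_n^p$ and classify each monomial by its multiplicity pattern. A short Python sketch (the helper name is mine):

```python
from collections import Counter
from itertools import product

def term_counts(n, power):
    """Classify the n**power monomials of (x_1 + ... + x_n)**power
    by multiplicity pattern, e.g. x_i^2 x_j x_k -> (2, 1, 1)."""
    shapes = Counter()
    for idx in product(range(n), repeat=power):
        pattern = tuple(sorted(Counter(idx).values(), reverse=True))
        shapes[pattern] += 1
    return shapes

n = 6

# S_n^2: n terms of type X_i^2 and n(n-1) of type X_i X_j (i != j)
c2 = term_counts(n, 2)
assert c2[(2,)] == n and c2[(1, 1)] == n * (n - 1)

# S_n^4: the five types and the coefficients quoted in the answer
c4 = term_counts(n, 4)
assert c4[(4,)] == n
assert c4[(3, 1)] == 4 * n * (n - 1)
assert c4[(2, 2)] == 3 * n * (n - 1)
assert c4[(2, 1, 1)] == 6 * n * (n - 1) * (n - 2)
assert c4[(1, 1, 1, 1)] == n * (n - 1) * (n - 2) * (n - 3)
```

The coefficients sum to $n^4$ (here $6 + 120 + 90 + 720 + 360 = 1296 = 6^4$), as they must.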

  3. Substituting (2) into (1) (and leaving the simple but tedious details to you), $$\begin{align}V(I) &= E(m_2(N))-(E(m_1(N)))^2\\ &= E(\text{4th degree polynomial in }N) - (E(\text{2nd degree polynomial in }N))^2\\ &= E(\text{4th degree polynomial in }M) - (E(\text{2nd degree polynomial in }M))^2\\ &=c_4E(M^4)+c_3E(M^3)+c_2E(M^2)+c_1E(M)+c_0 \end{align}$$ where $M=N-1\sim\text{Poisson}(\lambda),$ and the first four moments of the Poisson distribution are well known: $$\begin{align}E(M^1)&=\lambda\\ E(M^2)&=\lambda(\lambda+1)\\ E(M^3)&=\lambda(\lambda^2+3\lambda+1)\\ E(M^4)&=\lambda(\lambda^3+6\lambda^2+7\lambda+1). \end{align} $$ (The constant term $c_0$ is needed: at $\lambda=0$ we have $N=1$, so $V(I)=E(X^4)-(E(X^2))^2$, which is generally nonzero, while every $E(M^k)$ vanishes.)
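The quoted Poisson raw moments are easy to verify numerically by summing the pmf directly (a small sketch; truncation at 100 terms is more than enough for moderate $\lambda$):

```python
import math

def poisson_raw_moment(k, lam, terms=100):
    """E[M^k] for M ~ Poisson(lam), by truncated summation over the pmf."""
    p = math.exp(-lam)  # P(M = 0)
    total = 0.0
    for m in range(terms):
        total += m**k * p
        p *= lam / (m + 1)  # pmf recursion: P(M = m+1) = P(M = m) * lam / (m + 1)
    return total

lam = 2.5
closed_form = {
    1: lam,
    2: lam * (lam + 1),
    3: lam * (lam**2 + 3 * lam + 1),
    4: lam * (lam**3 + 6 * lam**2 + 7 * lam + 1),
}
for k, val in closed_form.items():
    assert abs(poisson_raw_moment(k, lam) - val) < 1e-9
```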

This completely solves the problem, in the sense that $V(I)$ is now given as a polynomial in $\lambda$, whose coefficients $c_i$ are known functions of the first four moments of $X= h(a)\,g(r)$, where $h,g$ are known functions and $a,r$ are independent random variables with known distributions.

Second answer:

To simplify notation, recognize that a function of a random variable is itself a random variable. Using capital letters for random variables, and working first with a fixed number $n$ of terms in the sum, consider the slight modification $$K_n=\left(\sum_{i=0}^{n-1}H_iG_i\right)^2=\left(\sum_{i=0}^{n-1}C_i\right)^2=\sum_{i=0}^{n-1}\sum_{j=0}^{n-1}C_iC_j$$

For fixed $n$, the expectation can be written as $$E[K_n]=\sum_{i=0}^{n-1}\sum_{j=0}^{n-1}E[C_iC_j]$$ Since the $C_i$ are independent, for $i\neq j$ $$V_{ij}=E[C_iC_j]-E[C_i]E[C_j]=0$$ and $$V[C]=E[C_iC_i]-E[C_i]^2$$ Collecting terms: $$E[K_n]=n(V[C]+E[C]^2)+(n^2-n)E[C]^2=n^2E[C]^2+nV[C]$$

Now suppose that $n$ is an outcome of a random variable $N$. The joint pmf/pdf of the random variables $N$ and $K$ is $$f(n,k)=f_N(n)f_{K|n}(k|n)$$ so $$E[K]=\sum_{n=1}^\infty f_N(n)\int kf_{K|n}(k|n)\,dk=\sum_{n=1}^\infty f_N(n)E[K_n]$$ which you can evaluate provided you know the expectation and variance of $C$. Hopefully you can use this approach to work out the variance of $K$ as well; start by working out the variance of $K_n$.
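The conditioning step can be checked numerically. The sketch below uses $C\sim\text{Exponential}(1)$ as an illustrative stand-in for $C=HG$ and $N\sim\text{Poisson}(\lambda)$ (both choices are mine, not from the answer), and compares $E[K]$ computed by conditioning on $N$ against a direct simulation:

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 3.0  # illustrative Poisson rate

# Illustrative stand-in for C = H*G: C ~ Exponential(1), so E[C] = V[C] = 1.
EC, VC = 1.0, 1.0

# Conditioning on N:  E[K] = sum_n P(N=n) E[K_n]  with  E[K_n] = n^2 E[C]^2 + n V[C],
# which collapses to  E[K] = E[N^2] E[C]^2 + E[N] V[C].
# For N ~ Poisson(lam): E[N] = lam, E[N^2] = lam + lam^2.
ek_conditioned = (lam + lam**2) * EC**2 + lam * VC

# Direct simulation of K = (sum of N iid C's)^2
trials = 200_000
ks = np.empty(trials)
for t in range(trials):
    n = rng.poisson(lam)
    ks[t] = rng.exponential(1.0, n).sum() ** 2

print(ek_conditioned, ks.mean())
```

The same two-step pattern (condition on $N=n$, then average over $N$) extends to $E[K^2]$ and hence to $V[K]$, exactly as the accepted answer carries out.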