Find these moments?


[The problem statement appeared as an image: it asks for the moments of $Z=\xi_1+\dots+\xi_N$, where $P(\xi_i=\pm1)=\frac{1}{2}$ and $N$ has pmf $p_N(n)$.]

I'm just looking for guidance with part (a); I haven't attempted part (b) yet. Here's my work for (a):

Let $Z=\xi_1+\dots+\xi_N$. This is just a guess, so I doubt it's accurate, but $P(\xi_i=\pm1)=\frac{1}{2}$ suggests a binomial distribution, since it amounts to a heads-or-tails decision at each step.

If that's the case, then I would take $E[\xi_i]=Np=\mu$ and $Var[\xi_i]=Np(1-p)=\sigma^2$. Since $p_N(n)$ follows a geometric distribution with parameter $\alpha$, $E[N]=\frac{1}{\alpha}=v$ and $Var[N]=\frac{1-\alpha}{\alpha^2}=\tau^2$. (Note: the nomenclature is from Pinsky's An Introduction to Stochastic Modeling.)

Then $E[Z]=\mu v=\frac{Np}{\alpha}$ and $Var[Z]=v\sigma^2+\mu^2\tau^2=\frac{Np(1-p)}{\alpha}+\frac{(Np)^2(1-\alpha)}{\alpha^2}$.

I'm very new to all of this; the last probability course I took was in undergrad, and it didn't really cover statistics. I took an applied statistical methodology graduate course last term and did well. This is for a graduate independent study in stochastic modeling. Would someone please help direct my attempt?

Accepted answer:

So, conditioned on $N$, $Z$ is a rescaled, shifted Binomial random variable (note that the summands take values in $\{-1,1\}$, not $\{0,1\}$). Unfortunately, $N$ is itself random, so you cannot apply the Binomial moment formulas directly.
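As a quick numeric sanity check (not part of the original answer), one can simulate the conditional law of $Z$ given $N=n$ and compare it with the rescaled, shifted Binomial $2B - n$, $B\sim\mathrm{Bin}(n,\tfrac12)$; the values $n=10$ and the sample size below are arbitrary illustrations:

```python
import random

# Sanity-check sketch: conditioned on N = n, the sum Z = xi_1 + ... + xi_n
# of +/-1 steps has the same law as 2*B - n with B ~ Binomial(n, 1/2),
# since each +1 step corresponds to one "success".
random.seed(0)
n, trials = 10, 200_000

# Sample Z directly from +/-1 steps.
z = [sum(random.choice((-1, 1)) for _ in range(n)) for _ in range(trials)]
# Sample the rescaled, shifted Binomial 2*B - n.
w = [2 * sum(random.random() < 0.5 for _ in range(n)) - n for _ in range(trials)]

mean_z, mean_w = sum(z) / trials, sum(w) / trials
var_z = sum(v * v for v in z) / trials  # E[Z | N=n] = 0, so Var = E[Z^2]
var_w = sum(v * v for v in w) / trials

print(mean_z, mean_w, var_z, var_w)  # means near 0, variances near n = 10
```

Both empirical means sit near $0$ and both empirical variances near $n$, consistent with the shifted-Binomial identification.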

What you can do, however, is use the law of total expectation: $$ \mathbb{E}[Z] = \mathbb{E}[\mathbb{E}[Z\mid N]] \tag{1} $$ Since each $\xi_k$ has mean zero, $\mathbb{E}[Z\mid N] = 0$ (it's a shifted Binomial!), so $(1)$ immediately gives $$ \mathbb{E}[Z] = 0 $$ without even needing the probability mass function of $N$.
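A short Monte Carlo sketch can illustrate $(1)$; since the problem's actual pmf of $N$ is only given in the (missing) image, the code assumes, purely for illustration, that $N$ is geometric on $\{1,2,\dots\}$ with parameter $\alpha=0.3$:

```python
import random

# Monte Carlo check of E[Z] = 0 via the law of total expectation.
# Illustrative assumption: N ~ Geometric(alpha) on {1, 2, ...}.
random.seed(1)
alpha, trials = 0.3, 100_000

def sample_z():
    n = 1
    while random.random() >= alpha:  # geometric: keep failing w.p. 1 - alpha
        n += 1
    return sum(random.choice((-1, 1)) for _ in range(n))

zs = [sample_z() for _ in range(trials)]
avg = sum(zs) / trials
print(avg)  # close to 0
```

The empirical average lands near $0$, as predicted, regardless of the particular pmf chosen for $N$.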

Now, for the variance, there is a law of total variance you can use similarly, but since you will also want the third and fourth moments, we may look at a useful rewriting of $Z$: $$ Z = \sum_{k=1}^N \xi_k = \sum_{k=1}^\infty \xi_k \mathbf{1}_{N \geq k} \tag{2}$$ This is useful because, by assumption, $\xi_k$ and $\mathbf{1}_{N \geq k}$ are independent for all $k$. For instance, if we wanted to re-derive the expectation of $Z$: $$ \mathbb{E}[Z] = \sum_{k=1}^\infty \mathbb{E}[\xi_k \mathbf{1}_{N \geq k}] = \sum_{k=1}^\infty \mathbb{E}[\xi_k]\cdot \mathbb{E}[\mathbf{1}_{N \geq k}] = \sum_{k=1}^\infty 0\cdot \mathbb{P}\{N \geq k\}= 0\,. $$ We retrieve the result. But now, also, $$\begin{align} \mathbb{E}[Z^2] &= \mathbb{E}\left[\left(\sum_{k=1}^\infty \xi_k \mathbf{1}_{N \geq k}\right)^2\right] = \mathbb{E}\left[\sum_{k=1}^\infty\sum_{\ell=1}^\infty \xi_k\xi_\ell \mathbf{1}_{N \geq k}\mathbf{1}_{N \geq \ell}\right]\\ &= \sum_{k=1}^\infty\sum_{\ell=1}^\infty \mathbb{E}\left[\xi_k\xi_\ell\right]\cdot\mathbb{E}\left[\mathbf{1}_{N \geq k}\mathbf{1}_{N \geq \ell}\right] \end{align}$$ (the interchange of expectation and sum is justified, e.g., by Tonelli's theorem after splitting into positive and negative parts). However, $\mathbb{E}[\xi_k\xi_\ell] = 1$ if $\ell=k$, and $0$ if $\ell\neq k$ (by independence and since $\xi_k^2=1$; can you see why?), so, using $\mathbb{E}[Z]=0$, the above expression for the variance simplifies a lot: $$ \operatorname{Var} Z = \mathbb{E}[Z^2] = \sum_{k=1}^\infty\mathbb{E}[\mathbf{1}_{N \geq k}] = \sum_{k=1}^\infty\mathbb{P}\{N \geq k\} = \mathbb{E}[N]\,. $$ The third and fourth moments can be computed similarly, though the calculations get longer.
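To see the identity $\operatorname{Var} Z = \mathbb{E}[N]$ in action, here is a simulation sketch (again assuming, for illustration only, a geometric $N$ on $\{1,2,\dots\}$ with parameter $\alpha=0.3$, so $\mathbb{E}[N]=1/\alpha$):

```python
import random

# Check Var(Z) = E[N] by simulation.
# Illustrative assumption: N ~ Geometric(alpha) on {1, 2, ...}, E[N] = 1/alpha.
random.seed(2)
alpha, trials = 0.3, 100_000

def sample_n():
    n = 1
    while random.random() >= alpha:  # geometric: keep failing w.p. 1 - alpha
        n += 1
    return n

ns = [sample_n() for _ in range(trials)]
zs = [sum(random.choice((-1, 1)) for _ in range(n)) for n in ns]

mean_n = sum(ns) / trials
var_z = sum(z * z for z in zs) / trials  # E[Z] = 0, so Var(Z) = E[Z^2]
print(mean_n, var_z)  # both near 1/alpha
```

The empirical variance of $Z$ tracks the empirical mean of $N$, exactly as the tail-sum computation predicts.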