We have 16 independent normal random variables $X_1, \dots, X_{16}$ where $E(X_t) = \alpha t$ and the variance of each is $\sigma^2$. We define new random variables
$$S_t = (1 + \sqrt3)X_{t−2} − (3 + \sqrt3)X_{t−1} + (3 −\sqrt3)X_t − (1 −\sqrt3)X_{t+1}$$
for $t = 3, \dots, 15$.
(a) Show that $ES_t = 0$ for any $t$.
$$E(S_t) = (1 + \sqrt3)E(X_{t−2}) − (3 + \sqrt3)E(X_{t−1}) + (3 −\sqrt3)E(X_t) − (1 −\sqrt3)E(X_{t+1}) $$ $$= (1 + \sqrt3)\alpha(t−2) − (3 + \sqrt3)\alpha(t−1) + (3 −\sqrt3)\alpha(t) − (1 −\sqrt3)\alpha(t+1)$$
When you expand the last line, everything cancels and $E(S_t) = 0$.
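Spelling out the cancellation: collect the coefficient of $\alpha t$ and the constant multiple of $\alpha$ separately,
$$E(S_t) = \alpha t\left[(1+\sqrt3) − (3+\sqrt3) + (3−\sqrt3) − (1−\sqrt3)\right] + \alpha\left[−2(1+\sqrt3) + (3+\sqrt3) − (1−\sqrt3)\right]$$
$$= \alpha t \cdot 0 + \alpha \cdot 0 = 0.$$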
(b) Prove $S = \frac{1}{\sqrt2}[X_1 + X_2]$ and $T = \frac{1}{\sqrt2}[X_1 − X_2]$ are independent. Are $S' = X_1 + X_2$ and $T' = X_1 − X_2$ independent?
$$S = \alpha X + \beta Y$$
$$T = -\beta X + \alpha Y$$ are independent as long as $X$ and $Y$ are independent normals with the same variance (then $S$ and $T$ are jointly normal and uncorrelated); the normalization $\alpha^2 + \beta^2 = 1$ makes the transformation a rotation, so it also preserves the variances. This is true for $S = \frac{1}{\sqrt2}[X_1 + X_2]$ and $T = \frac{1}{\sqrt2}[X_1 − X_2]$, so $S$ and $T$ are independent.
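To see where the equal-variance condition enters, compute the covariance directly (the cross terms vanish because $X$ and $Y$ are independent):
$$\operatorname{Cov}(S, T) = \operatorname{Cov}(\alpha X + \beta Y,\; -\beta X + \alpha Y) = -\alpha\beta\operatorname{Var}(X) + \alpha\beta\operatorname{Var}(Y) = \alpha\beta\left[\operatorname{Var}(Y) - \operatorname{Var}(X)\right],$$
which is $0$ here since $X_1$ and $X_2$ both have variance $\sigma^2$. Zero correlation plus joint normality then gives independence.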
I had trouble figuring out/proving whether or not $S' = X_1 + X_2$ and $T' = X_1 − X_2$ are independent. How would I go about doing this? Isn't the complement just $1$ minus that probability? Is there a property about sums of normal random variables I'm forgetting?
(c) Calculate
$$ E\left[\frac{1}{7} \sum_{j=2}^7 S_{2j+1}^2\right]$$
Everything inside the brackets confuses me: what does the summation of $S_{2j+1}^2$ over $j = 2$ to $7$ mean? How would I go about solving this?
Hints and comments:
(a) You seem to have figured out this one.
(b) Jointly $X_1$ and $X_2$ have an uncorrelated normal distribution. A rotation by 45 degrees gives the joint normal dist'n of $S$ and $T$ (as originally defined in the problem), also uncorrelated. For jointly normal dist'ns zero correlation implies independence because the joint density function can be factored.
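As for $S'$ and $T'$: note $S' = \sqrt2\,S$ and $T' = \sqrt2\,T$, and rescaling each variable by a nonzero constant preserves independence, so $S'$ and $T'$ are independent as well. Here is a quick Monte Carlo sanity check of the zero covariance; the parameter values $\alpha = 1$, $\sigma = 2$ are my own illustrative choices, not from the problem:

```python
import random

# Illustrative parameters (my assumption): alpha = 1, sigma = 2.
random.seed(0)
alpha, sigma, reps = 1.0, 2.0, 200_000
s_vals, t_vals = [], []
for _ in range(reps):
    x1 = random.gauss(1 * alpha, sigma)  # E(X_1) = alpha * 1
    x2 = random.gauss(2 * alpha, sigma)  # E(X_2) = alpha * 2
    s_vals.append(x1 + x2)               # S' = X_1 + X_2
    t_vals.append(x1 - x2)               # T' = X_1 - X_2

def mean(v):
    return sum(v) / len(v)

ms, mt = mean(s_vals), mean(t_vals)
cov = mean([(s - ms) * (t - mt) for s, t in zip(s_vals, t_vals)])
print(cov)  # should be close to 0: Cov(S', T') = Var(X_1) - Var(X_2) = 0
```

Of course, zero sample covariance is only a sanity check, not a proof; the proof is the joint-normality argument above.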
Note: This may be leading up to a proof that the mean $\bar X$ and the variance $S^2$ of an iid normal sample of size $n \ge 2$ are independent random variables. (Clearly, they are not functionally independent because $\bar X$ appears in the def'n of $S^2$, but they are stochastically independent, which is the sense of the word 'independent' used here.) It may seem counterintuitive, but it is true only for normal distributions. (For example, for exponential data, $\bar X$ and $S^2$ are highly correlated.)
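The normal vs. non-normal contrast in that parenthetical is easy to see numerically. A sketch, assuming samples of size $n = 5$ and unit-rate exponentials (both my own choices): estimate the correlation of $\bar X$ and $S^2$ across many iid samples.

```python
import random

def corr_xbar_s2(draw, n=5, reps=50_000, seed=0):
    """Estimate corr(X-bar, S^2) over many iid samples of size n."""
    rng = random.Random(seed)
    xbars, s2s = [], []
    for _ in range(reps):
        sample = [draw(rng) for _ in range(n)]
        xbar = sum(sample) / n
        s2 = sum((x - xbar) ** 2 for x in sample) / (n - 1)
        xbars.append(xbar)
        s2s.append(s2)
    m1, m2 = sum(xbars) / reps, sum(s2s) / reps
    cov = sum((a - m1) * (b - m2) for a, b in zip(xbars, s2s)) / reps
    v1 = sum((a - m1) ** 2 for a in xbars) / reps
    v2 = sum((b - m2) ** 2 for b in s2s) / reps
    return cov / (v1 * v2) ** 0.5

r_norm = corr_xbar_s2(lambda rng: rng.gauss(0, 1))       # normal data
r_exp = corr_xbar_s2(lambda rng: rng.expovariate(1.0))   # exponential data
print(r_norm)  # near 0 for normal data
print(r_exp)   # clearly positive for exponential data
```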
(c) The sum inside the brackets amounts to $S_5^2 + S_7^2 + \dots + S_{15}^2$. The sum of squares of $k$ iid standard normal RVs is $\chi^2(\nu = k)$. If $Q \sim \chi^2(k)$, then $E(Q) = k$. (I am not sure I see immediately where this is leading, if anywhere.)
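For what it's worth, linearity of expectation already settles (c) without any independence or chi-square argument: each $S_t$ has mean $0$, so $E(S_t^2) = \operatorname{Var}(S_t) = \left[(1+\sqrt3)^2 + (3+\sqrt3)^2 + (3-\sqrt3)^2 + (1-\sqrt3)^2\right]\sigma^2 = 32\sigma^2$, and the six terms $t = 5, 7, \dots, 15$ give $6 \cdot 32\sigma^2/7 = 192\sigma^2/7$. A simulation sketch, with $\alpha = 1$ and $\sigma = 1$ as my own illustrative values:

```python
import random

# Sanity-check E[(1/7) * sum_{j=2}^{7} S_{2j+1}^2] by simulation.
# alpha = 1 and sigma = 1 are illustrative choices, not from the problem.
random.seed(1)
alpha, sigma, reps = 1.0, 1.0, 100_000
c = [1 + 3**0.5, -(3 + 3**0.5), 3 - 3**0.5, -(1 - 3**0.5)]  # coeffs of X_{t-2}, ..., X_{t+1}
total = 0.0
for _ in range(reps):
    x = [random.gauss(alpha * t, sigma) for t in range(1, 17)]  # X_1, ..., X_16
    # S_t = sum_k c[k] * X_{t-2+k}; x is 0-indexed, so X_j = x[j-1]
    inner = sum(sum(ck * x[t - 3 + k] for k, ck in enumerate(c)) ** 2
                for t in [2 * j + 1 for j in range(2, 8)])
    total += inner / 7
est = total / reps
print(est)  # theory: 192 * sigma**2 / 7, about 27.43
```

Note the overlapping $S_t$ (e.g. $S_5$ and $S_7$ share $X_5, X_6$) are not independent, but that does not matter for the expectation.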
I see you are relatively new to the site, so some points of strategy may be appropriate: You might get better hints toward proofs at a relevant level by telling us what level course you are taking and what topics you have studied recently. Also, it is not always a good strategy to string several problems together into one Question.