I have some independent and identically distributed (i.i.d.) random variables $X_1, X_2, \dots$ with $E(X_1)=0$ and $\sigma^2=\operatorname{Var}(X_1) < \infty$, and I define $Y_n=2X_nX_{n+1}$.
I would like to calculate the variance of $Y_n$, but I'm stuck in my derivation. I have already computed the expectation of $Y_n$, which equals $0$ (since $E(Y_n)=2E(X_n)E(X_{n+1})=0$ by independence).
Now, for the variance, this is my work so far: $\operatorname{Var}(Y_n) =\operatorname{Var}(2X_nX_{n+1}) = 4\operatorname{Var}(X_nX_{n+1})$
Now, since the variables are i.i.d., we can rewrite this as $4\operatorname{Var}(X_nX_{n+1}) = 4\operatorname{Var}(X_1X_2)$.
Am I correct to assume that, since the variables are i.i.d., their variances are all equal, i.e. $\operatorname{Var}(X_1)=\operatorname{Var}(X_2)=\dots=\sigma^2$? Or won't this be helpful? I'm not sure how to continue, since this is the variance of a product of random variables.
Any input is appreciated!
$\newcommand{\Var}{\operatorname{Var}}\newcommand{\E}{\mathbb{E}}$For any random variable $Z$ with finite expectation we have: $$\Var(Z)=\Bbb E[Z^2]-(\Bbb E[Z])^2$$Fix now an $n\in\Bbb N$. We have: $$\Var(Y_n)=\Var(2X_nX_{n+1})=4\Var(X_nX_{n+1})=4\Bbb E[X_n^2X_{n+1}^2]-4(\Bbb E[X_nX_{n+1}])^2$$
If $f,g:\Bbb R\to\Bbb R$ are any measurable functions and $A,B$ are independent random variables, the composites $f(A)$ and $g(B)$ are also independent random variables. In particular, $X_n^2$ is independent of $X_{n+1}^2$. The expectation of the product of independent random variables is the product of expectations. So I find: $$\begin{align}\Var(Y_n)&=4\Bbb E[X_n^2]\Bbb E[X_{n+1}^2]-4(\Bbb E[X_n]\Bbb E[X_{n+1}])^2\\&=4(\Var(X_n)+(\Bbb E[X_n])^2)(\Var(X_{n+1})+(\Bbb E[X_{n+1}])^2)-4\cdot0^2\end{align}$$Concluding: $$\Var(Y_n)=4\Var(X_n)\Var(X_{n+1})=4\sigma^2\cdot\sigma^2=4\sigma^4$$
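As a quick numerical sanity check of the conclusion $\Var(Y_n)=4\sigma^4$ (not part of the derivation itself), here is a small Monte Carlo sketch. It assumes, purely for illustration, that the $X_i$ are normal with mean $0$ and standard deviation $\sigma=2$; the identity holds for any distribution with mean $0$ and finite variance.

```python
import random

random.seed(0)
sigma = 2.0          # chosen for illustration; any sigma works
N = 1_000_000

# Draw i.i.d. mean-zero samples X_1, ..., X_N (here: normal, as an assumption)
xs = [random.gauss(0.0, sigma) for _ in range(N)]

# Form Y_n = 2 * X_n * X_{n+1}
ys = [2.0 * xs[i] * xs[i + 1] for i in range(N - 1)]

# Sample variance of Y_n
mean_y = sum(ys) / len(ys)
var_y = sum((y - mean_y) ** 2 for y in ys) / len(ys)

print(var_y)  # should be close to 4 * sigma**4 = 64.0
```

Note that consecutive $Y_n$ are not independent (they share a factor $X_{n+1}$), but each $Y_n$ individually has variance $4\sigma^4$, which is what the sample variance above estimates.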
Almost by definition of "identically distributed", we have $\Var(X_n)=\Var(X_1)$ for all $n\in\Bbb N$. So you are correct to "assume" it, but you should also try to justify it fully.