Let x and y be statistically independent random variables. The probability density function of x is a sum of delta functions: f(x) = (1/2)δ(x+1) + (1/2)δ(x−1). From this density I can deduce that the expected value of x is 0, since f is symmetric about 0. Now I would like to find the variance of x.
Secondly, there is another random variable y with a Gaussian density g(y) of mean 0. I would like to find the variance of z, where z = x·y, and the covariance K(y, z).
As for my working, I am confused about the variance of x. Variance is defined as E[x²] − (E[x])². From this I can see that (E[x])² is zero, but will E[x²] also be 0?
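To check this part numerically, I also ran a quick Monte Carlo simulation (a rough sketch only; the sample size and seed are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)

# x takes the values -1 and +1 with probability 1/2 each,
# matching f(x) = (1/2)d(x+1) + (1/2)d(x-1)
x = rng.choice([-1.0, 1.0], size=100_000)

print("E[x]   ~", x.mean())        # should be close to 0
print("E[x^2] ~", (x**2).mean())   # x**2 is identically 1, so this prints 1.0
```

Since x only ever takes the values ±1, x² is always 1 in every sample, which suggests E[x²] is not 0.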
I found that the variance of z will be 0, since it is defined as E[(x − E[x])(y − E[y])]. Now, since x and y are statistically independent, I can write this as E[xy] − E[x]E[y] = 0.
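I also checked the quantity E[xy] − E[x]E[y] by simulation. Note that g(y) does not specify a variance, so I assumed y ~ N(0, 1) here; the seed and sample size are again arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

x = rng.choice([-1.0, 1.0], size=n)  # x = +/-1 with probability 1/2 each
y = rng.standard_normal(n)           # assumed: y ~ N(0, 1), variance not given

# sample estimate of E[xy] - E[x]E[y]
cov_xy = (x * y).mean() - x.mean() * y.mean()
print(cov_xy)  # close to 0, consistent with x and y being independent
```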
For the covariance K(y, z), I solved up to E[y²·x] − E[y²]E[x]. Now, if x and y are statistically independent, will x and y² also be statistically independent?
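As a numerical sanity check on that last step, I estimated E[y²x] − E[y²]E[x] by simulation (again assuming y ~ N(0, 1), since the variance of g is not specified):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

x = rng.choice([-1.0, 1.0], size=n)  # x = +/-1 with probability 1/2 each
y = rng.standard_normal(n)           # assumed: y ~ N(0, 1)

lhs = (y**2 * x).mean()              # estimate of E[y^2 x]
rhs = (y**2).mean() * x.mean()       # estimate of E[y^2] E[x]
print(lhs - rhs)                     # should be close to 0
```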
I hope my working above is correct; I am looking forward to your opinions.