Covariance of two functions, multiplication rules of the second moment as $E(aX^2)$


Since $Cov(X,Y) = E(XY)-E(X)E(Y)$, I somehow have to compute $E(XY)$.
I know $E(X)$ and $E(Y)$, and I also know $Var(X)$ and $Var(Y)$. $X$ and $Y$ are both defined in terms of the same two independent random variables:
$$X= 5a + 3b - 5$$ and
$$Y= -3a + 5b - 1$$

$a$ and $b$ each have their own expected values and variances, given as constants (for example 5 and 6, and 7 and 8). I can work out all the expected values and variances, but I run into a problem when forming $E(XY)$ for the covariance.

I get a term where some constant $c$ multiplies the second moment in the following way:
$E(ca^2)$

I do know that $Var(a)=E(a^2)-(E(a))^2$, and I can work out that $E(a^2)=Var(a)+(E(a))^2$, since $Var(a)$ and $E(a)$ are known. Therefore I should be able to plug something in for $E(a^2)$.

But now comes the real question: is it fine to take the constant out as follows,
$E(c\cdot a^2)=cE(a^2)$,
and then plug in $E(a^2)=Var(a)+(E(a))^2$?

For this example I just used the case of variable $a$, but the same dilemma also arises in cases where there is $E(kb^2)$, $k$ being a constant.
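Both identities in question ($E(c\cdot a^2)=cE(a^2)$, which holds by linearity of expectation, and $E(a^2)=Var(a)+(E(a))^2$) can be sanity-checked numerically. A minimal simulation sketch; the normal distribution and the moment values $E(a)=1$, $Var(a)=3$ are arbitrary illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# a is any random variable; here normal with E(a)=1, Var(a)=3 (arbitrary choice)
a = rng.normal(loc=1.0, scale=np.sqrt(3.0), size=1_000_000)
c = -15.0

# Linearity of expectation: E(c*a^2) == c * E(a^2)
lhs = np.mean(c * a**2)
rhs = c * np.mean(a**2)
print(abs(lhs - rhs) < 1e-6)  # True (equal up to floating-point rounding)

# And E(a^2) == Var(a) + (E(a))^2, checked on the sample moments
# (np.var uses ddof=0, i.e. mean(a^2) - mean(a)^2, so this is exact)
print(abs(np.mean(a**2) - (np.var(a) + np.mean(a)**2)) < 1e-6)  # True
```

Note that the first check holds for any distribution of $a$; it is a pure consequence of linearity, not of the choice of moments.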

As a separate pondering:
I have a gut feeling that $Cov(X,Y)$ should not be zero ($X$ and $Y$ are not independent, which is algebraically quite clear, I dare say), even though at the outset we cannot say anything about the value of $X$ from knowing what $Y$ put out, and vice versa. The underlying variables $a$ and $b$ are independent and can take pretty much any real values, but we do know, for example, that if $a$ and $b$ were both zero, then $X$ would be $-5$ and $Y$ would be $-1$.

Thus $P(X = 0, Y = 0) = 0 \ne P(X=0)P(Y=0)$, and the variables are not independent.

If someone wants to take on this last part, that would be nice, but I really want an answer for the $E(c\cdot a^2)=cE(a^2)$ situation: is it allowed or not?

Edit: an example of why I believe the covariance is not 0.

Let's set $E(a)=1$, $E(b)=2$, $Var(a)=3$, and $Var(b)=4$. Then $E(X) = 6$ and $E(Y)=6$. Also $E(a^2) = 4$ and $E(b^2) = 8$. Let's then calculate $Cov(X,Y)=E(XY)-E(X)E(Y)$; first, $E(X)E(Y)=36$.

Let's then figure out $E(XY)=E((5a+3b-5)(-3a+5b-1))=E(-15a^2+16ab+10a+15b^2-28b+5)$

Now let's calculate each EV-term and then sum separately like this: $E(-15a^2)+E(16ab)+E(10a)+E(15b^2)+E(-28b)+E(5)=-15E(a^2)+16E(a)E(b)+10E(a)+15E(b^2)-28E(b)+5$

Now let's put in everything we know: $-15(4) + 16(1)(2) + 10(1) +15(8)-28(2)+5=-60+32+10+120-56+5=51$. Then, subtracting the two terms: $Cov(X,Y)=E(XY)-E(X)E(Y)=51-36=15$.
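This worked value of 15 can also be checked by simulation. A minimal sketch, assuming (arbitrarily, since only the moments matter) normal distributions for $a$ and $b$ with the stated moments:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2_000_000

# Independent a, b with the moments assumed in the example:
# E(a)=1, Var(a)=3, E(b)=2, Var(b)=4 (normal is an arbitrary choice)
a = rng.normal(1.0, np.sqrt(3.0), n)
b = rng.normal(2.0, np.sqrt(4.0), n)

X = 5*a + 3*b - 5
Y = -3*a + 5*b - 1

# Monte Carlo estimate of Cov(X,Y) = E(XY) - E(X)E(Y); converges to 15
cov_mc = np.mean(X * Y) - np.mean(X) * np.mean(Y)
print(cov_mc)
```

With two million samples the estimate lands within a few hundredths of the exact value 15.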

Edit 2: The only thing that would make $Cov(X,Y)=0$ would be to square the expected values, i.e. to replace $E(a^2)$ and $E(b^2)$ by $(E(a))^2$ and $(E(b))^2$ inside the expression $E(-15a^2)+E(16ab)+E(10a)+E(15b^2)+E(-28b)+E(5)$, with or without the constants taken out (if that squaring were allowed, the expression would equal $36$, and $36-36 = 0$). But as I understood it in the first place, that is not allowed; instead we have to use the values $E(a^2) = 4$ and $E(b^2) = 8$.


Best answer:

by two independent random variables which appear in both:

Because $a, b$ are independent: $\mathsf E(ab)=\mathsf E(a)\,\mathsf E(b)$, so...

$\qquad\begin{align}\mathsf E(XY) &=\mathsf E((5a + 3b - 5)~(-3a + 5b - 1))\\ &= \mathsf E(5 + 10 a - 15 a^2 - 28 b + 16 a b + 15 b^2)\\& = 5+10\,\mathsf E(a) -15\,\mathsf E(a^2) -28\,\mathsf E(b) +16\,\mathsf E(ab)+15\,\mathsf E(b^2)\\& = 5+10\,\mathsf E(a) -15\,\mathsf E(a^2) -28\,\mathsf E(b) +16\,\mathsf E(a)\,\mathsf E(b)+15\,\mathsf E(b^2)\\&=5+10\,\mathsf E(a)-15(\mathsf{Var}(a)+\mathsf E(a)^2)-28\,\mathsf E(b)+16\,\mathsf E(a)\,\mathsf E(b)+15(\mathsf {Var}(b)+\mathsf E(b)^2)\\\mathsf E(X)\,\mathsf E(Y) &=5+10\,\mathsf E(a) -15\,\mathsf E(a)^2 -28\,\mathsf E(b) +16\,\mathsf E(a)\,\mathsf E(b)+15\,\mathsf E(b)^2\\[2ex]\mathsf{Cov}(X,Y)&=-15\,\mathsf{Var}(a)+15\,\mathsf{Var}(b)\end{align}$
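As a sanity check, the moment route above and the closed form $\mathsf{Cov}(X,Y)=-15\,\mathsf{Var}(a)+15\,\mathsf{Var}(b)$ can be compared exactly, using the example values from the question ($E(a)=1$, $Var(a)=3$, $E(b)=2$, $Var(b)=4$). A minimal sketch, no simulation needed:

```python
# Given moments from the question's example
Ea, Va = 1.0, 3.0   # E(a), Var(a)
Eb, Vb = 2.0, 4.0   # E(b), Var(b)

# Second moments via E(z^2) = Var(z) + E(z)^2
Ea2 = Va + Ea**2
Eb2 = Vb + Eb**2

# E(XY) expanded term by term; a, b independent, so E(ab) = E(a)E(b)
EXY = 5 + 10*Ea - 15*Ea2 - 28*Eb + 16*Ea*Eb + 15*Eb2
EX = 5*Ea + 3*Eb - 5
EY = -3*Ea + 5*Eb - 1

cov_moments = EXY - EX*EY          # the moment route
cov_closed = -15*Va + 15*Vb        # the closed form from the derivation

print(cov_moments, cov_closed)  # 15.0 15.0
```

Both routes give 15, matching the arithmetic in the question's edit.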


I have a gut feeling that the Cov(X,Y) shall not be zero (X and Y are not independent which is algebraically quite clear, I dare say)

Independence always entails zero covariance, but some dependent variables may also have zero covariance: covariance measures only linear association.

$$P\to Q\text{ does not mean }\lnot P\to\lnot Q$$