The greater of two random variables


For two independent normal random variables with non-zero means,

$$ X \sim N(\mu_1,\sigma_1), \qquad Y \sim N(\mu_2,\sigma_2), $$

suppose that $$E(X^2) > E(Y^2).$$

Does it then always follow that $$E(\sqrt{X^2}) > E(\sqrt{Y^2})?$$


BEST ANSWER

For a normal variable $N(\mu,\sigma),$ where $\sigma$ is the standard deviation (so $\sigma^2$ is the variance), the identity $V(X)=E(X^2)-[E(X)]^2$ gives $E(X^2)=\mu^2+\sigma^2.$
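As a quick sanity check of the identity $E(X^2)=\mu^2+\sigma^2,$ here is a small Monte Carlo sketch (the specific sample size and seed are arbitrary choices, not from the answer):

```python
import random
import statistics

# Monte Carlo check of E(X^2) = mu^2 + sigma^2 for X ~ N(mu, sigma).
random.seed(0)  # fixed seed so the estimate is reproducible
mu, sigma = 2.0, 1.5
xs = [random.gauss(mu, sigma) for _ in range(200_000)]

estimate = statistics.fmean(x * x for x in xs)
print(estimate)  # close to mu**2 + sigma**2 = 6.25
```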

Now let $X \sim N(2,1.5)$ and $Y \sim N(\sqrt{5},1),$ so that $E(X^2)=6.25$ and $E(Y^2)=6;$ thus $E(X^2)>E(Y^2).$ But comparing the two expected absolute values (which, as far as I know, means evaluating the integrals numerically) gives $E(|X|)\approx 2.12718$ and $E(|Y|) \approx 2.24488.$ That is, the expected absolute values are in the reverse order from the expected squares in this case, so the answer to the question is no. [The post refers to expected values of $\sqrt{X^2},$ but for real $a$ one has $\sqrt{a^2}=|a|,$ so it changes nothing to use absolute values here.]
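The numerical values above can be checked without ad hoc integration: for $X \sim N(\mu,\sigma),$ the quantity $E(|X|)$ is the mean of a folded normal distribution, which has the known closed form $\sigma\sqrt{2/\pi}\,e^{-\mu^2/(2\sigma^2)} + \mu\,(1-2\Phi(-\mu/\sigma)).$ A minimal sketch using that formula (this is an alternative to the numerical integration mentioned in the answer, not the answerer's own method):

```python
import math

def folded_mean(mu, sigma):
    """E|X| for X ~ N(mu, sigma): the folded-normal mean."""
    # Phi(-mu/sigma) via the error function: Phi(z) = (1 + erf(z/sqrt(2)))/2
    phi = 0.5 * (1 + math.erf(-mu / (sigma * math.sqrt(2))))
    return (sigma * math.sqrt(2 / math.pi) * math.exp(-mu**2 / (2 * sigma**2))
            + mu * (1 - 2 * phi))

# The counterexample from the answer: X ~ N(2, 1.5), Y ~ N(sqrt(5), 1)
EX2 = 2**2 + 1.5**2                   # E(X^2) = mu^2 + sigma^2 = 6.25
EY2 = 5 + 1                           # E(Y^2) = 6
EabsX = folded_mean(2, 1.5)           # ≈ 2.12718
EabsY = folded_mean(math.sqrt(5), 1)  # ≈ 2.24488

print(EX2 > EY2, EabsX < EabsY)  # prints: True True -- the orders are reversed
```

This reproduces the two numerical values quoted in the answer and confirms that $E(X^2)>E(Y^2)$ while $E(|X|)<E(|Y|).$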