Statistics basics


Suppose $X$ has mean $a$ and variance $b$. Then $E(X^2) = a^2 + b^2$. Why is this true? Please provide a proof alongside any other relevant information. Thanks in advance.


BEST ANSWER

This is purely a matter of definitions: recall that $Var(X) = E(X^2) - E(X)^2$. Try to conclude from there based on what you know. Also, note that if the variance is $b$, the identity gives $E(X^2) = a^2 + b$, so they likely meant that $b$ is the standard deviation, making the variance $b^2$.


I think you meant that $b$ is the standard deviation, so $Var(X) = b^2$.
Since $Var(X) = E[X^2] - (E[X])^2$ and $E[X] = \mu = a$,
$$E[X^2] = Var(X) + (E[X])^2 = b^2 + a^2.$$
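The identity above is easy to sanity-check numerically. A minimal sketch (my own illustration, not from either answer), assuming $X$ is normal with mean $a$ and standard deviation $b$ — though the identity holds for any distribution with finite variance:

```python
import numpy as np

# Check E[X^2] = a^2 + b^2 when a is the mean and b is the
# standard deviation (so Var(X) = b^2).
rng = np.random.default_rng(0)
a, b = 3.0, 2.0                  # assumed example values: mean 3, std dev 2
x = rng.normal(loc=a, scale=b, size=1_000_000)

empirical = np.mean(x**2)        # Monte Carlo estimate of E[X^2]
theoretical = a**2 + b**2        # 9 + 4 = 13

print(empirical, theoretical)    # empirical should be close to 13
```

With a million samples the empirical second moment typically agrees with $a^2 + b^2$ to within a few hundredths.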