Inequality of variance of a convex combination


Let $X,Y$ be two random variables with $Var(X)=\sigma_X^2<\infty$ and $Var(Y)=\sigma_Y^2<\infty$. For $t\in[0,1]$, is it true that $$Var[tX+(1-t)Y]\le t\sigma_X^2+(1-t)\sigma_Y^2?$$ If variance were a convex function, I could use Jensen's inequality to prove the statement; is this the case?

Best answer

Jensen's inequality is unnecessary. Since $E[tX+(1-t)Y]=t\mu_X+(1-t)\mu_Y$, we have $$Var[tX+(1-t)Y]=E\Big[\Big(t(X-\mu_X)+(1-t)(Y-\mu_Y)\Big)^2\Big].$$ Now use the fact that $x \mapsto x^2$ is convex to get the pointwise bound $$\Big(t(X-\mu_X)+(1-t)(Y-\mu_Y)\Big)^2 \leq t(X-\mu_X)^2 + (1-t)(Y-\mu_Y)^2,$$ and take expectations of both sides, which yields $Var[tX+(1-t)Y]\le t\sigma_X^2+(1-t)\sigma_Y^2$.
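As a quick numerical sanity check, here is a short sketch in Python (the specific distributions and the correlation between the samples are arbitrary choices for illustration) that compares $Var[tX+(1-t)Y]$ against $t\,Var(X)+(1-t)\,Var(Y)$ over a grid of $t$ values. The same algebra as above applies to empirical variances, so the inequality should hold exactly up to floating-point error:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two correlated samples with different variances (arbitrary example)
x = rng.normal(0.0, 2.0, size=100_000)
y = 0.5 * x + rng.exponential(1.0, size=100_000)

for t in np.linspace(0.0, 1.0, 11):
    lhs = np.var(t * x + (1 - t) * y)          # Var[tX + (1-t)Y]
    rhs = t * np.var(x) + (1 - t) * np.var(y)  # t*Var(X) + (1-t)*Var(Y)
    # Convex combination of variances dominates (small tolerance for rounding)
    assert lhs <= rhs + 1e-9, (t, lhs, rhs)
```

The gap between the two sides equals $t(1-t)\,Var(X-Y)$, so it vanishes at $t=0$, $t=1$, or when $X-Y$ is constant.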