Let $U: \mathbb{R} \to \mathbb{R}$ be a concave function, and let $X$ be a random variable with a normal distribution, expected value $\mu$, and standard deviation $\sigma$. Let $\lambda \gt 1$, and let $Y$ be a random variable with a normal distribution, expected value $\mu$, and standard deviation $\mu \sigma$.
(a) Prove that $U(\mu + c) + U(\mu-c) \ge U(\mu + c\sqrt{\lambda}) + U(\mu - c\sqrt{\lambda})$ for all $c \gt 0$.
(b) By an appropriate change of variable, and using (a), prove that $E[U(X)] \ge E[U(Y)]$.
I only know Jensen's inequality, which for concave $U$ is $U(E[X]) \ge E[U(X)]$. I have no idea how to prove (a) and (b). Can you please help me? I have been away from statistics for a long time, so I greatly appreciate your help. Thanks.
0) I think you have some typos in the question. Let us make things clearer by assuming $Y$ has standard deviation $\sqrt{\lambda}\,\sigma$ (this matches the $\sqrt{\lambda}$ appearing in part (a)).
1) Draw a picture and compare the midpoints of the two line segments. (Each of $\mu \pm c$ is a convex combination of $\mu + c\sqrt{\lambda}$ and $\mu - c\sqrt{\lambda}$, with weights $t$ and $1-t$ where $t = \frac{1}{2}\left(1 + \frac{1}{\sqrt{\lambda}}\right)$; apply concavity to each and add.)
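If it helps to convince yourself before proving it, here is a quick numerical spot check of the inequality in (a). It is not a proof; $U(x) = -e^{-x}$ is just one arbitrary concave function, and $\mu$, $\lambda$ are illustrative picks, not given by the problem.

```python
import math

# Spot check of part (a): U(mu+c) + U(mu-c) >= U(mu+c*sqrt(lam)) + U(mu-c*sqrt(lam)).
# U(x) = -exp(-x) is an arbitrary concave choice; mu and lam are illustrative.
U = lambda x: -math.exp(-x)
mu, lam = 0.5, 3.0

for c in [0.1, 1.0, 2.5]:
    lhs = U(mu + c) + U(mu - c)
    rhs = U(mu + c * math.sqrt(lam)) + U(mu - c * math.sqrt(lam))
    assert lhs >= rhs, (c, lhs, rhs)
```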
2) Define "centered densities":
\begin{align} \tilde{f}_X(x) &= f_X(x+\mu)\\ \tilde{f}_Y(x) &= f_Y(x + \mu) \end{align} Note that $\tilde{f}_X(c)=\tilde{f}_X(-c)$ and $\tilde{f}_Y(c) = \frac{1}{\sqrt{\lambda}}\tilde{f}_X\left(\frac{c}{\sqrt{\lambda}}\right)$ for all $c \in \mathbb{R}$. I think this problem wants you to do the computation:
\begin{align} E[U(X)] &= \int_{c=0}^{\infty} U(\mu+c)\tilde{f}_X(c)dc + \int_{c=0}^{\infty} U(\mu-c)\tilde{f}_X(-c)dc\\ &= \int_{c=0}^{\infty} \tilde{f}_X(c)[U(\mu+c)+U(\mu-c)]dc\\ &\geq \ldots \end{align}
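To see where the computation is heading, here is a Monte Carlo spot check of the target inequality $E[U(X)] \ge E[U(Y)]$, with $Y$ given standard deviation $\sqrt{\lambda}\,\sigma$ as in point 0). Again $U(x) = -e^{-x}$ and the parameter values are arbitrary choices for the sketch.

```python
import math
import random

# Monte Carlo spot check (not a proof) of E[U(X)] >= E[U(Y)].
# U(x) = -exp(-x) is an arbitrary concave choice; mu, sigma, lam are
# illustrative; Y has standard deviation sqrt(lam)*sigma as in point 0).
random.seed(0)
mu, sigma, lam = 0.5, 1.0, 3.0
U = lambda x: -math.exp(-x)
n = 100_000

mean_UX = sum(U(random.gauss(mu, sigma)) for _ in range(n)) / n
mean_UY = sum(U(random.gauss(mu, math.sqrt(lam) * sigma)) for _ in range(n)) / n
# For this U the exact values are -exp(-mu + var/2): here -1 vs -e.
assert mean_UX > mean_UY
```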
A more general statement that does not require the density of $X$ to be symmetric about the mean is this. (For part (b), take the convex function $g(x) = -U(x+\mu)$, let the claim's $X$ be $Y - \mu$, and note that with $\theta = 1/\sqrt{\lambda}$ the variable $\theta X$ has the same distribution as the original $X - \mu$.)
Let $g:\mathbb{R}^N\rightarrow\mathbb{R}$ be a convex function. Let $X$ be a random vector in $\mathbb{R}^N$ with $E[X]=0$ and let $Y=\theta X$ for some $\theta \in [0,1]$.
Claim: $E[g(Y)]\leq E[g(X)]$.
Proof: Define the convex function $h(x) = g(x)-g(0)$. Then $h(0)=0$ and we have: \begin{align} E[h(Y)] &= E[h(\theta X)]\\ &= E[h(\theta X + (1-\theta)0)]\\ &\leq E[\theta h(X) + (1-\theta)h(0)]\\ &=\theta E[h(X)] \end{align} However, $E[h(X)] \geq h(E[X])=h(0)=0$, and so $\theta E[h(X)]\leq E[h(X)]$. It follows that $E[h(Y)]\leq E[h(X)]$. Thus: $$ E[g(Y)-g(0)] \leq E[g(X)-g(0)]$$ $\Box$
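As a numerical sanity check of the claim, here is a Monte Carlo spot check with an arbitrary convex $g$, an arbitrary $\theta$, and a deliberately *nonsymmetric* mean-zero $X$ (a centered exponential), since symmetry is exactly what the general statement avoids assuming.

```python
import random

# Monte Carlo spot check of E[g(theta*X)] <= E[g(X)] with a convex g and
# a nonsymmetric mean-zero X: X = E - 1, E ~ Exponential(1).
# g and theta are arbitrary illustrative choices.
random.seed(1)
g = lambda x: x * x + x        # convex
theta = 0.4
n = 100_000

xs = [random.expovariate(1.0) - 1.0 for _ in range(n)]   # skewed, mean zero
mean_gX = sum(g(x) for x in xs) / n
mean_gY = sum(g(theta * x) for x in xs) / n
# Exact values here are E[g(X)] = Var(X) = 1 and E[g(theta*X)] = theta**2 = 0.16.
assert mean_gY <= mean_gX
```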