Calculate the standard deviation of $Y=X+\frac{(X-k)^2}{a}$, knowing the standard deviation of $X$


Let $\mu$ be the expected value of a random variable $X$ with density $f(x)$. The standard deviation of $X$ is defined as $$\sigma_X \equiv \sqrt{\operatorname{E}\left[(X-\mu)^2\right]}.$$ I would like to know whether the standard deviation of the random variable $Y$, defined as $$Y = X + \frac{(X-k)^2}{a}$$ (where $a$ and $k$ are constants), can be expressed starting from $\sigma_X$.

Edit. Suppose that all the moments $E[X^n]$ are known.


BEST ANSWER

Yes, you can do that if you know the moments of $X$ (knowing $\sigma_X$ alone is not enough), since

$$\sigma_Y^2 = \langle Y^2 \rangle - \langle Y \rangle^2.$$

Since $Y$ and $Y^2$ are polynomials in $X$ (of degrees $2$ and $4$ respectively), both expectations can be expressed in terms of expectations of positive integer powers of $X$. Note the standard identity that holds for any random variable:
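Carrying out that expansion explicitly (a sketch of the algebra, writing $m_n \equiv \langle X^n \rangle$, so $m_1 = \mu$), only moments up to fourth order are needed:

$$\langle Y \rangle = m_1 + \frac{m_2 - 2k\,m_1 + k^2}{a},$$

$$\langle Y^2 \rangle = m_2 + \frac{2}{a}\left(m_3 - 2k\,m_2 + k^2 m_1\right) + \frac{1}{a^2}\left(m_4 - 4k\,m_3 + 6k^2 m_2 - 4k^3 m_1 + k^4\right),$$

and then $\sigma_Y^2 = \langle Y^2 \rangle - \langle Y \rangle^2$.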

$$\langle (X - \langle X \rangle)^2 \rangle = \langle X^2 - 2 X \langle X \rangle + \langle X \rangle^2 \rangle= \langle X^2 \rangle - \langle X \rangle^2$$
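As a numerical sanity check of this moment expansion, the sketch below (assuming illustrative values for $k$, $a$, and a normal distribution for $X$; any distribution with a finite fourth moment would do) compares the moment-based value of $\sigma_Y$ with the empirical standard deviation of $Y$ computed directly:

```python
import numpy as np

rng = np.random.default_rng(0)
k, a = 1.0, 3.0                            # illustrative constants (assumed values)
X = rng.normal(2.0, 1.5, size=1_000_000)   # sample of X; normal chosen for illustration

# Sample moments of X up to order 4: m[n] = E[X^n]
m = [np.mean(X**n) for n in range(5)]

# <Y> and <Y^2> expressed purely through the moments of X
EY = m[1] + (m[2] - 2*k*m[1] + k**2) / a
EY2 = (m[2]
       + (2/a) * (m[3] - 2*k*m[2] + k**2*m[1])
       + (1/a**2) * (m[4] - 4*k*m[3] + 6*k**2*m[2] - 4*k**3*m[1] + k**4))
sigma_Y_moments = np.sqrt(EY2 - EY**2)

# Direct check against the empirical standard deviation of Y
Y = X + (X - k)**2 / a
sigma_Y_direct = np.std(Y)
print(sigma_Y_moments, sigma_Y_direct)
```

Because $Y^2$ expands into a polynomial identity in $X$, the two values agree to floating-point precision on the same sample, not merely approximately.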