In my probability book, I found the following equation:
$$E[((X-\mu)b+\sigma)^2] = \sigma^2(1+b^2),$$
where $\mu = E[X]$ is finite, $\sigma^2 = \operatorname{Var}(X) > 0$, and the identity is stated for every $b > 0$.
Sadly, the book doesn't give a proof for this, and I couldn't find one online. Since the book omits the proof, it probably isn't too hard. Nevertheless, I haven't managed to prove it myself.
You can expand the square on the left side and apply linearity of expectation, together with the facts $E[X-\mu]=0$ and $E[(X-\mu)^2]=\sigma^2$.
That gives you
$$ E[((X-\mu)b+\sigma)^2] = b^2E[(X-\mu)^2] + 2b\sigma E[X-\mu] + E[\sigma^2] = \sigma^2b^2+0+\sigma^2 = \sigma^2(1+b^2). $$
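If you want a quick numerical sanity check of this identity, a Monte Carlo estimate works: draw many samples of $X$, average $((X-\mu)b+\sigma)^2$, and compare with $\sigma^2(1+b^2)$. The sketch below uses a normal distribution with arbitrarily chosen $\mu = 2$, $\sigma = 3$, $b = 1.5$ (the identity doesn't depend on the distribution, only on its mean and variance).

```python
import numpy as np

# Monte Carlo check of E[((X - mu) b + sigma)^2] = sigma^2 (1 + b^2).
# The distribution and the parameter values here are arbitrary choices.
rng = np.random.default_rng(0)
mu, sigma, b = 2.0, 3.0, 1.5
x = rng.normal(mu, sigma, size=1_000_000)

lhs = np.mean(((x - mu) * b + sigma) ** 2)  # sample estimate of the expectation
rhs = sigma**2 * (1 + b**2)                 # closed form: 9 * (1 + 2.25) = 29.25

print(lhs, rhs)  # the two values should agree to within sampling error
```

With $10^6$ samples the two numbers should typically agree to two or three significant figures.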
Are you sure the book didn't say $\sigma^2(1+b)^2$ instead? If so, note that that would not be the right value; the computation above confirms $\sigma^2(1+b^2)$.