Let $x\in \mathbb{R}$ be a random variable with a univariate Normal distribution with mean $0$ and variance $\Sigma$. Let $y$ be defined as follows: \begin{equation} y = f\left(x\right),\qquad x\in\left(a,b\right) \end{equation} where $f\left(x\right)$ is a function of $x$ and $(b-a) < \sqrt{\Sigma}$. In order to find the variance of $y$, I need the variance of $x$, which leads to the following question:
Since $y$ is defined only on a small subset of the range of $x$, should I use $\Sigma$ as the variance of $x$ when calculating the variance of $y$, or should I use $\Sigma_x$ in place of $\Sigma$, where \begin{equation} \Sigma_x = \int_a^b x^2\frac{1}{\sqrt{2\pi\Sigma}}\exp\left(-\frac{x^2}{2\Sigma}\right)dx \end{equation} Note: this is not a homework problem; the calculation is part of a larger problem I am trying to solve.
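To make the comparison concrete, here is a quick numerical sketch of the proposed $\Sigma_x$ against $\Sigma$. The values $\Sigma = 1$, $(a,b) = (0, 0.5)$ are hypothetical, chosen only so that $(b-a) < \sqrt{\Sigma}$ holds as in the setup above:

```python
import math
from scipy.integrate import quad

# Hypothetical illustration values: Sigma = 1, (a, b) = (0.0, 0.5),
# chosen so that (b - a) < sqrt(Sigma) as in the question.
Sigma, a, b = 1.0, 0.0, 0.5

def pdf(x):
    # Density of N(0, Sigma)
    return math.exp(-0.5 * x**2 / Sigma) / math.sqrt(2 * math.pi * Sigma)

# Sigma_x exactly as written in the question:
# the second moment of x restricted to (a, b), without renormalization
Sigma_x, _ = quad(lambda x: x**2 * pdf(x), a, b)

# Probability mass of the interval, in case one wants the
# renormalized (truncated-distribution) second moment instead
mass, _ = quad(pdf, a, b)

print(Sigma, Sigma_x, Sigma_x / mass)
```

For these numbers $\Sigma_x$ comes out far smaller than $\Sigma$, so the two choices lead to very different answers; note also that $\Sigma_x$ as written is not divided by $\int_a^b$ of the density, so it is the restricted second moment rather than the variance of a truncated Normal.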