I am trying to find tight upper and lower bounds for $E[X \mid X > x]$, where $X$ is a standard normal random variable. After some calculation I found that $$ E [X\mid X>x] = \frac{\exp(-x^2/2)}{\int_{x}^{\infty} \exp(-u^2/2) \, du} $$
When I plot this in R, the result looks like a straight line. How can I simplify the above expression and find a closed-form solution? Or, failing that, can I get tight bounds?
This answers the original formulation of the question, which was to find tight lower and upper bounds for $\mathrm{E}(X\mid X>0)$.
The expectation $\mathrm{E}\left(X \mid X>0\right)$ can be computed in closed form as follows: $$ \mathrm{E}\left(X \mid X>0\right) = \frac{\int_0^\infty x \phi(x) \mathrm{d}x}{\int_0^\infty \phi(x) \mathrm{d}x} = \frac{\int_0^\infty x \phi(x) \mathrm{d}x}{1/2} = \int_0^\infty 2 x \frac{1}{\sqrt{2\pi}} \exp\left(-\frac{x^2}{2}\right) \mathrm{d}x $$ Making a change of variables $u = \frac{x^2}{2}$ $$ \mathrm{E}\left(X \mid X>0\right) = \int_0^\infty \frac{2}{\sqrt{2 \pi}} \exp(-u) \mathrm{d}u = \sqrt{\frac{2}{\pi}} $$ This is as tight as it gets.
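The substitution above can be checked numerically. The sketch below (an illustration, not part of the derivation; the helper names are my own) integrates $\int_0^\infty x\,\phi(x)\,dx$ and $\int_0^\infty \phi(x)\,dx$ with Simpson's rule, truncating the upper limit at 10, and compares the quotient with $\sqrt{2/\pi}$:

```python
import math

def phi(t):
    """Standard normal density."""
    return math.exp(-t * t / 2.0) / math.sqrt(2.0 * math.pi)

def simpson(f, a, b, n=10_000):
    """Composite Simpson's rule on [a, b] with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + (2 * k - 1) * h) for k in range(1, n // 2 + 1))
    s += 2 * sum(f(a + 2 * k * h) for k in range(1, n // 2))
    return s * h / 3

# Truncate the infinite upper limit at 10 (the tail beyond is negligible):
num = simpson(lambda t: t * phi(t), 0.0, 10.0)
den = simpson(phi, 0.0, 10.0)
print(num / den)               # numerical value of E(X | X > 0)
print(math.sqrt(2 / math.pi))  # closed form, ≈ 0.79788
```

Both printed values agree to many decimal places, confirming the closed form.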
The updated question can be solved by the technique from my answer to an earlier question. Let $Z$ be a Gaussian random variable with mean $\mu$ and variance $\sigma^2$. Then $$ \mathrm{E}\left(X \mid X > x\right) = x + \mathrm{E}\left(X-x \mid X -x > 0\right) = x + \mathrm{E}\left(Z \mid Z > 0\right) $$ where $\mu = -x$ and $\sigma=1$. Now using an easily verified identity $$ \sigma^2 \frac{\partial}{\partial \mu} f_Z(z; \mu, \sigma) + \mu f_Z(z; \mu, \sigma) = z f_Z(z; \mu, \sigma) $$ $$ \begin{eqnarray} \mathrm{E}\left(Z \mathbf{1}_{Z > 0}\right) &=& \int_0^\infty z f_Z(z) \mathrm{d} z = \mu \Pr(Z>0) + \sigma^2 \frac{\partial}{\partial \mu} \Pr(Z>0) \\ &=& \mu \Phi\left(\frac{\mu}{\sigma}\right) + \sigma^2 \frac{\partial}{\partial \mu} \Phi\left(\frac{\mu}{\sigma}\right) \\ &=& \mu \Phi\left(\frac{\mu}{\sigma}\right) + \sigma \phi\left(\frac{\mu}{\sigma}\right) \end{eqnarray} $$ Hence $$ \mathrm{E}(X \mid X > x) = x + \frac{\mathrm{E}\left(Z \mathbf{1}_{Z > 0}\right)}{\Pr(Z>0)} = x + \mu + \sigma \frac{\phi\left(\mu/\sigma\right)}{\Phi\left(\mu/\sigma\right)} $$ Now recalling that $\mu = -x$, $\sigma=1$, and $\phi(-x) = \phi(x)$, the terms $x + \mu$ cancel and $$ \mathrm{E}(X \mid X > x) = \frac{\phi(x)}{\Phi(-x)} $$ which is precisely what you found.
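As a quick numerical cross-check of the closed form (an illustration with my own helper names, not part of the derivation), the sketch below computes $\mathrm{E}(X\mid X>x)$ by trapezoidal integration of $\int_x^\infty t\,\phi(t)\,dt \big/ \int_x^\infty \phi(t)\,dt$, truncated at an upper limit of 12, and compares it with $\phi(x)/\Phi(-x)$ for a few values of $x$:

```python
import math

def phi(t):
    """Standard normal density."""
    return math.exp(-t * t / 2.0) / math.sqrt(2.0 * math.pi)

def Phi(t):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def cond_mean_numeric(x, upper=12.0, n=20_000):
    """E(X | X > x) by composite trapezoidal integration on [x, upper]."""
    h = (upper - x) / n
    ts = [x + k * h for k in range(n + 1)]
    num = sum(t * phi(t) for t in ts) - 0.5 * (x * phi(x) + upper * phi(upper))
    den = sum(phi(t) for t in ts) - 0.5 * (phi(x) + phi(upper))
    return num / den  # the step size h cancels in the ratio

for x in (-1.0, 0.0, 0.5, 2.0):
    closed = phi(x) / Phi(-x)
    print(x, cond_mean_numeric(x), closed)
```

The two columns agree for every $x$, and at $x=0$ both reduce to $\sqrt{2/\pi}$ as in the first part of the answer.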
This function is known as the inverse Mills ratio. It is not a linear function, but it approaches the line $y = x$ from above for large positive $x$:
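The asymptotic behavior can be seen numerically. The sketch below (helper name is my own) evaluates $\phi(x)/\Phi(-x)$ using `math.erfc` for the tail probability and prints the gap above the line $y = x$, which shrinks roughly like $1/x$:

```python
import math

def inverse_mills(x):
    """phi(x) / Phi(-x): the inverse Mills ratio for the standard normal."""
    phi = math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)
    tail = 0.5 * math.erfc(x / math.sqrt(2.0))  # Phi(-x) = P(X > x)
    return phi / tail

# The gap above y = x shrinks roughly like 1/x for large x:
for x in (1.0, 2.0, 5.0, 10.0):
    r = inverse_mills(x)
    print(x, r, r - x, 1.0 / x)
```

This explains why the plot in R looks like a straight line: already for moderate $x$ the curve is visually indistinguishable from $y = x$.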