Let $(\varepsilon, \nu)$ be jointly normally distributed:
$$\begin{pmatrix}\varepsilon \\ \nu \end{pmatrix} \sim N\left(\begin{pmatrix}0 \\ 0 \end{pmatrix} , \begin{pmatrix} \sigma_{\varepsilon}^2 & \rho \sigma_{\varepsilon} \\ \rho \sigma_{\varepsilon} & 1 \end{pmatrix} \right)$$ where $\rho$ is the correlation between $\varepsilon$ and $\nu$ (note that $\operatorname{Var}(\nu) = 1$, so $\operatorname{Cov}(\varepsilon, \nu) = \rho\sigma_{\varepsilon}$). Show that for any real constant $\alpha$: $$\varepsilon \mid \nu > \alpha \sim N(\rho \sigma_{\varepsilon}\lambda(-\alpha), \sigma_{\varepsilon}^2(1-\rho^2))$$ where $\lambda(\cdot) = \frac{\phi(\cdot)}{\Phi(\cdot)}$ is the inverse Mills ratio, and $\phi$ and $\Phi$ are the pdf and cdf of the standard normal distribution, respectively.
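As a quick sanity check, the conditional-mean part of the claim, $E[\varepsilon \mid \nu > \alpha] = \rho\sigma_{\varepsilon}\lambda(-\alpha)$, can be verified by Monte Carlo simulation. The parameter values below are illustrative choices, not part of the problem:

```python
import math

import numpy as np

# Illustrative parameter values (any rho in (-1,1), sigma_eps > 0, real alpha work)
rho, sigma_eps, alpha = 0.6, 2.0, 0.5

rng = np.random.default_rng(0)
n = 2_000_000

# Construct (eps, nu) with the stated covariance structure:
# nu ~ N(0,1), eps = rho*sigma_eps*nu + sigma_eps*sqrt(1-rho^2)*u with u independent
# of nu, so Var(eps) = sigma_eps^2 and Cov(eps, nu) = rho*sigma_eps.
nu = rng.standard_normal(n)
u = rng.standard_normal(n)
eps = rho * sigma_eps * nu + sigma_eps * math.sqrt(1 - rho**2) * u

def phi(x):
    """Standard normal pdf."""
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

def Phi(x):
    """Standard normal cdf."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

# Sample mean of eps over the event {nu > alpha} vs. rho*sigma_eps*lambda(-alpha)
mc_mean = eps[nu > alpha].mean()
theory = rho * sigma_eps * phi(-alpha) / Phi(-alpha)
print(mc_mean, theory)  # the two values should agree to about two decimal places
```

With $n = 2{,}000{,}000$ draws, roughly $n\,\Phi(-\alpha)$ observations survive the truncation, so the Monte Carlo standard error is small enough to make the agreement convincing.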
I know that the conditional distribution of $\varepsilon \mid \nu = \alpha$ is normal with mean $\rho\sigma_{\varepsilon}\alpha$ and variance $\sigma_{\varepsilon}^2(1-\rho^2)$, by the standard properties of the bivariate normal distribution, but how do I prove the statement above when the conditioning event is an inequality rather than an equality?
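For what it's worth, one standard way to bridge equality and inequality conditioning (a sketch of one possible approach, not necessarily the intended one) is the orthogonal decomposition implicit in the conditional distribution already noted:

```latex
\varepsilon = \rho\sigma_{\varepsilon}\,\nu + u,
\qquad u \sim N\!\bigl(0,\; \sigma_{\varepsilon}^2(1-\rho^2)\bigr),
\qquad u \perp \nu,
```

so that, since $u$ is independent of the event $\{\nu > \alpha\}$,

```latex
E[\varepsilon \mid \nu > \alpha]
= \rho\sigma_{\varepsilon}\, E[\nu \mid \nu > \alpha]
= \rho\sigma_{\varepsilon}\,\frac{\phi(\alpha)}{1-\Phi(\alpha)}
= \rho\sigma_{\varepsilon}\,\frac{\phi(-\alpha)}{\Phi(-\alpha)}
= \rho\sigma_{\varepsilon}\,\lambda(-\alpha),
```

using the symmetry $\phi(\alpha) = \phi(-\alpha)$ and $1-\Phi(\alpha) = \Phi(-\alpha)$, together with the truncated-normal mean formula $E[\nu \mid \nu > \alpha] = \phi(\alpha)/(1-\Phi(\alpha))$.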