This is an exercise found in Mathematical Statistics with Applications, by Freund. The book defines the regression equation of $Y$ on $X$ as $$ \mu_{Y|x} = E[Y|x] = \int_{-\infty}^{\infty}y\,f(y|x)\,dy $$
There is a theorem (proven in the book) which states that if the regression of $Y$ on $X$ is linear (i.e., if $\mu_{Y|x} = \alpha + \beta x$), then $$\mu_{Y|x} =\mu_{2} +\rho\frac{\sigma_{2}}{\sigma_{1}}(x-\mu_1)\; , $$ where $\mu_1 = E[X]$, $\mu_2 = E[Y]$, $\sigma_{1}^2 = \text{var}[X]$, $\sigma_{2}^2 = \text{var}[Y]$, and $\rho = \frac{\text{cov}[X,Y]}{\sigma_1 \sigma_2}$.
Statement: Show that if $\mu _{Y|x}$ is linear in $x$ and var$[Y|x]$ is constant, then var$[Y|x] = \sigma_{2}^2 (1-\rho ^2)$.
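As a numerical sanity check (not part of the exercise itself), the identity can be spot-checked by simulation in the bivariate normal case, where the regression is known to be linear and $\text{var}[Y|x]$ is constant. The parameter values, the point $x_0$, and the window half-width `0.02` below are all arbitrary illustrative choices.

```python
import numpy as np

# Spot-check of var[Y|x] = sigma_2^2 (1 - rho^2) for a bivariate normal,
# where the regression is linear and the conditional variance is constant.
# All parameter values below are arbitrary illustrative choices.
rng = np.random.default_rng(0)
mu1, mu2 = 1.0, 2.0
s1, s2, rho = 1.5, 0.8, 0.6
cov = [[s1**2, rho * s1 * s2],
       [rho * s1 * s2, s2**2]]
X, Y = rng.multivariate_normal([mu1, mu2], cov, size=2_000_000).T

# Approximate conditioning on X = x0 by keeping samples in a narrow window.
x0 = 1.7
near = np.abs(X - x0) < 0.02

cond_mean = Y[near].mean()  # estimate of E[Y | x0]
cond_var = Y[near].var()    # estimate of var[Y | x0]
print(cond_mean, mu2 + rho * (s2 / s1) * (x0 - mu1))  # regression line
print(cond_var, s2**2 * (1 - rho**2))                 # claimed identity
```

Both printed pairs should agree to about two decimal places, which is consistent with the theorem and with the identity to be proved.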
Attempt: $\;$ Suppose $\mu _{Y|x}$ is linear in $x$ and var$(Y|x)$ is constant. Then there exist constants $\alpha , \beta \in \mathbb{R}$ such that $\mu_{Y|x} = \alpha + \beta x$, where $\mu_{Y|x} := E[Y|x]$. Equating these two,
$$\int_{-\infty}^{\infty}yf(y|x)dy = \alpha + \beta x $$ Multiplying both sides by $y$,
$$ \int_{-\infty}^{\infty}y^2 f(y|x)dy = (\alpha + \beta x)y $$ The left side is the conditional expectation of $Y^2$; hence
$$E[Y^2|x]= (\alpha + \beta x)y $$
The conditional variance of $Y$ is defined as $\text{var}[Y|x] = E\left[ \left( Y - E[Y|x] \right)^2 \biggr| x \right]$; this can be rewritten as $\text{var}[Y|x]= E[Y^2|x] - \left( E[Y|x] \right)^2$. We have
\begin{align*} \text{var}[Y|x] &= (\alpha + \beta x)y - (\alpha + \beta x)^2 \\ &= (\alpha + \beta x) (y-\alpha -\beta x) \\ &= \text{const} \tag{1} \end{align*}
This is where I'm stuck. Is my approach correct? Assuming it is, I tried plugging $\alpha = \mu_2 - \rho \frac{\sigma_2}{\sigma_1}\mu_1$ and $\beta = \rho\frac{\sigma_2}{\sigma_1}$ into equation (1). I also tried implicit differentiation of equation (1) with respect to $x$ and obtained an expression for $y'$. I then integrated to obtain $y$, substituted into (1), and obtained an expression for $\text{var}[Y|x]$ in terms of $x$ only.
None of this seems to work; I'd appreciate any help.