Let $Y_i=\alpha_0+\beta_0 X_i + \epsilon_i$, where $\epsilon_i \sim N(0, \sigma_0^2)$ and $X_i \sim N(\mu_x,\tau_0^2)$ are independent.
The data $(X_i, Y_i)$, $i = 1, \dots, n$, are i.i.d. draws from this model.
I have found the maximum likelihood estimator for each parameter by using $$L_n({X_i, Y_i};\, \alpha, \beta, \mu_x, \sigma^2, \tau^2) = \prod_{i=1}^n f(X_i, Y_i)=\prod_{i=1}^n f_x(X_i)f_{.|X_i}(Y_i),$$ differentiating it with respect to the parameter, setting it equal to zero, and solving for the parameter.
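Concretely, under the stated normal model the log-likelihood separates into an $X$-part and a conditional $Y$-part (writing out both normal densities):

```latex
\ell_n(\alpha,\beta,\mu_x,\sigma^2,\tau^2)
= \sum_{i=1}^n \left[
    -\tfrac{1}{2}\log(2\pi\tau^2) - \frac{(X_i-\mu_x)^2}{2\tau^2}
    -\tfrac{1}{2}\log(2\pi\sigma^2) - \frac{(Y_i-\alpha-\beta X_i)^2}{2\sigma^2}
  \right]
```

Setting $\partial \ell_n / \partial \alpha = 0$ gives $\sum_{i=1}^n (Y_i - \alpha - \beta X_i) = 0$, which rearranges to the estimator below.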
For example, I get that $\hat{\alpha}_{MLE}=\bar{Y_n}-\hat{\beta}_{MLE} \bar{X_n}$.
But how do I show that $\hat{\alpha}_{MLE}$ converges to the true value $\alpha_0$?
I do not know how to identify that true value or how to begin showing convergence.
Thank you.
You have by the law of total expectation, since $E[Y_i \mid X_i] = \alpha_0 + \beta_0 X_i$, that \begin{equation} E[Y_i] = E[E[Y_i \mid X_i]] = E[\alpha_0 + \beta_0 X_i] = \alpha_0 + \beta_0\mu_x \end{equation}
You also have by the law of large numbers that $\bar{X_n} \overset{P}{\rightarrow} \mu_x$ and $\bar{Y_n} \overset{P}{\rightarrow} E[Y_i] = \alpha_0+\beta_0\mu_x$. Since convergence in probability is preserved under sums and products with constants (a consequence of the continuous mapping theorem, or of Slutsky's theorem), combining these two limits gives \begin{equation} \bar{Y_n} - \beta_0\bar{X_n} \overset{P}{\rightarrow} (\alpha_0 +\beta_0\mu_x) - \beta_0\mu_x = \alpha_0 \end{equation}
If your estimator uses $\hat{\beta}_{MLE}$ in place of $\beta_0$, the same conclusion follows from Slutsky's theorem once you also show that $\hat{\beta}_{MLE} \overset{P}{\rightarrow} \beta_0$.
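You can also check this consistency numerically. The sketch below (with arbitrarily chosen parameter values) simulates the model and computes $\hat{\alpha} = \bar{Y}_n - \hat{\beta}\bar{X}_n$, where $\hat{\beta}$ is the usual sample-covariance-over-sample-variance slope estimate; the estimate should drift toward $\alpha_0$ as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)
# Arbitrary "true" parameter values for the simulation:
alpha_0, beta_0, mu_x, sigma_0, tau_0 = 2.0, -1.5, 3.0, 1.0, 2.0

def alpha_mle(n):
    """Draw n observations from the model and return alpha-hat."""
    x = rng.normal(mu_x, tau_0, n)
    y = alpha_0 + beta_0 * x + rng.normal(0.0, sigma_0, n)
    # MLE of the slope: sample covariance / sample variance (both biased versions)
    beta_hat = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    return y.mean() - beta_hat * x.mean()

for n in (100, 10_000, 1_000_000):
    print(n, alpha_mle(n))
```

The printed estimates concentrate around $\alpha_0 = 2$ as $n$ increases, illustrating the convergence in probability derived above.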