I only know, from some textbook, that we can do least-squares regression (LSR) in this way:
Since the derivation is too long, I am sorry that I cannot typeset it here.
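(Presumably the textbook method is ordinary least squares, which minimizes the vertical residuals $\sum_i (y_i - a x_i - b)^2$ and gives the familiar estimates

$$\hat a = \frac{\sum_i (x_i - \bar x)(y_i - \bar y)}{\sum_i (x_i - \bar x)^2}, \qquad \hat b = \bar y - \hat a\,\bar x,$$

where $\bar x, \bar y$ are the sample means.)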
This method only considers the errors in $y$. In a real experiment, however, both $x$ and $y$ may have errors. How can we perform a least-squares regression that accounts for errors on both axes?
So the question is:
Suppose two variables $x, y$ are related by $y = ax + b$. In several experiments, the errors in $x$ and $y$ are random variables $\epsilon_x \sim N(0,\sigma_x)$ and $\epsilon_y \sim N(0,\sigma_y)$, respectively. How can we give an MLE of $a$ and $b$?
Since you have stated that $y = ax + b$, it is unclear from your question where the errors enter your model. However, if you mean that there is measurement error in both $x$ and $y$, then what you are describing is a regression model with errors-in-variables. This subject has a large literature, but you can find an introduction here.
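To make this concrete: when the ratio of error variances $\delta = \sigma_y^2/\sigma_x^2$ is known, the maximum-likelihood fit is the classical Deming regression (orthogonal regression in the special case $\delta = 1$). A minimal sketch, assuming the function name `deming_regression` is mine and that the data are not all collinear in a degenerate way (i.e. the sample covariance $s_{xy} \neq 0$):

```python
import numpy as np

def deming_regression(x, y, delta=1.0):
    """MLE of (a, b) in y = a*x + b when both x and y carry Gaussian noise.

    delta is the (assumed known) variance ratio sigma_y**2 / sigma_x**2;
    delta = 1 gives orthogonal regression.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xbar, ybar = x.mean(), y.mean()
    # Sample (co)variances about the means.
    sxx = np.mean((x - xbar) ** 2)
    syy = np.mean((y - ybar) ** 2)
    sxy = np.mean((x - xbar) * (y - ybar))  # assumed nonzero
    # Closed-form Deming slope: positive root of the MLE quadratic.
    a = (syy - delta * sxx
         + np.sqrt((syy - delta * sxx) ** 2 + 4.0 * delta * sxy ** 2)) / (2.0 * sxy)
    b = ybar - a * xbar
    return a, b

# Noise-free sanity check: points exactly on y = 2x + 1 recover (2, 1).
x = np.arange(10)
a, b = deming_regression(x, 2 * x + 1)
```

Note that ordinary least squares is the limiting case $\sigma_x \to 0$ (i.e. $\delta \to \infty$), which is why the textbook formula only penalizes vertical residuals.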