I have a feeling there should be a mathematical formula for determining the estimators of the coefficients of a $\frac{1}{x^{2}}$ weighted linear regression.
I was able to derive the estimators ($a$ and $b$) for the unweighted linear regression ($y=ax+b$). I did it by minimizing $\sum \epsilon^2 $, where $\epsilon = y - (ax+b) $.
I partially differentiated $\sum \epsilon^2 $ with respect to $a$ and $b$ and set each derivative equal to zero. At the end I got the estimators as:
$$a = \frac{n\sum xy - \sum x \sum y}{n\sum {x}^2 - {(\sum x)}^2} $$
and
$$b = \frac{\sum y - a\sum x}{n} $$
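As a quick sanity check of these closed-form estimators, here is a small numerical sketch (the data points are made up for illustration); `np.polyfit` minimizes the same sum of squared errors, so its slope and intercept should agree:

```python
import numpy as np

# Made-up sample data, roughly following y = 2x with some noise
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 5.9, 8.2, 9.8])
n = len(x)

# Slope and intercept from the closed-form formulas above
a = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
b = (np.sum(y) - a * np.sum(x)) / n

# Reference: np.polyfit solves the same unweighted least-squares problem
a_ref, b_ref = np.polyfit(x, y, 1)

print(a, b)  # should match a_ref, b_ref
```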
Now my question is this:
If I decide to use a $\frac{1}{x^{2}}$ weight on the linear regression $y=ax+b$, is there a way to minimize the weighted sum of squared errors (i.e., minimize $\sum \frac{1}{x^{2}} \ [y - (ax+b)]^2 $) and come up with a similarly simple mathematical relationship for the estimators $a$ and $b$?
Why not? You have two equations in two variables. Just solve them.
We have to minimize
$$S=\sum \frac 1{x_i^2}\left[y_i-(ax_i+b)\right]^2=\sum \left[\frac {y_i}{x_i}-\left(a+\frac {b}{x_i}\right)\right]^2$$
Setting the partial derivatives to zero:
\begin{align*}
\frac{\partial S}{\partial a}&=0\Rightarrow\sum \left[\frac {y_i}{x_i}-\left(a+\frac {b}{x_i}\right)\right]=0\Rightarrow \boxed{a=\frac{\sum\frac {y_i}{x_i}-b\sum\frac {1}{x_i}}{n}}\tag{1}\\
\frac{\partial S}{\partial b}&=0\Rightarrow\sum \frac1{x_i}\left[\frac {y_i}{x_i}-\left(a+\frac {b}{x_i}\right)\right]=0\Rightarrow a\sum\frac1{x_i}+b\sum\frac1{x_i^2}=\sum\frac {y_i}{x_i^2}\tag{2}
\end{align*}
Putting the value of $a$ from $(1)$ in $(2)$, we get
\begin{align*}
\left(\frac{\sum\frac {y_i}{x_i}-b\sum\frac {1}{x_i}}{n}\right)\sum\frac1{x_i}+b\sum\frac1{x_i^2}&=\sum\frac {y_i}{x_i^2}\\
\sum\frac {y_i}{x_i}\sum\frac1{x_i}-b\left(\sum\frac {1}{x_i}\right)^2+nb\sum\frac1{x_i^2}&=n\sum\frac {y_i}{x_i^2}\\
\boxed{b=\frac{n\sum\frac {y_i}{x_i^2}-\sum\frac {y_i}{x_i}\sum\frac1{x_i}}{n\sum\frac1{x_i^2}-\left(\sum\frac {1}{x_i}\right)^2}}
\end{align*}
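The boxed estimators can be checked numerically. A small sketch with made-up data follows; note that `np.polyfit` minimizes $\sum w_i^2\,[y_i - f(x_i)]^2$, so passing `w=1/x` reproduces the $\frac{1}{x^2}$ weighting used here:

```python
import numpy as np

# Made-up sample data for illustration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 5.9, 8.2, 9.8])
n = len(x)

# The four sums appearing in the boxed formulas
s1 = np.sum(1 / x)       # sum of 1/x_i
s2 = np.sum(1 / x**2)    # sum of 1/x_i^2
sy1 = np.sum(y / x)      # sum of y_i/x_i
sy2 = np.sum(y / x**2)   # sum of y_i/x_i^2

# Boxed formula for b, then formula (1) for a
b = (n * sy2 - sy1 * s1) / (n * s2 - s1**2)
a = (sy1 - b * s1) / n

# Reference: polyfit with w = 1/x minimizes sum((1/x_i)^2 * residual^2)
a_ref, b_ref = np.polyfit(x, y, 1, w=1 / x)

print(a, b)  # should match a_ref, b_ref
```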