I'm trying to solve this problem:
Linearize the function $y = ax^b$ into $Y = AX + B$ and, with the least squares method, determine $a$ and $b$ for the following data:
x: 2, 2.2, 2.4, 2.6, 2.8, 3
y: 0.6935, 0.7885, 0.8750, 0.9555, 1, 1.1
The first thing I did is write the function like so:
$\log(y) = b \cdot \log(x) + \log(a)$
And that gives me
$Y = \log(y), \quad A = b, \quad B = \log(a), \quad X = \log(x)$
Now I don't really get the least squares method. In school we did something like
$A_i = \begin{bmatrix}1 & X_i\end{bmatrix}, \quad B_i = \begin{bmatrix}Y_i\end{bmatrix}$ (one row per data point, stacked into a matrix $A$ and a vector $B$)
And then solved the normal equations $A^T A\,c = A^T B$ for the coefficient vector $c$ (this $c$ is the vector of unknowns, not the $X = \log(x)$ from before).
But I get the wrong results. What's the proper way of doing this?
I figured it out: $A_i = \begin{bmatrix}1 & X_i\end{bmatrix}$ needs to be $A_i = \begin{bmatrix}X_i & 1\end{bmatrix}$, otherwise I'm solving for $Y = BX + A$. Either column order produces the same fitted line; the only difference is which component of the solution vector is the slope and which is the intercept, so with the columns in the order $\begin{bmatrix}1 & X_i\end{bmatrix}$ I was reading the intercept as $A$.
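For reference, the whole procedure can be sketched numerically. This is a minimal NumPy sketch (the variable names are mine, not from the problem statement): linearize with logs, build the design matrix with columns $[X_i\;\; 1]$, solve the normal equations, and transform back to $a$ and $b$.

```python
import numpy as np

# Data from the problem.
x = np.array([2.0, 2.2, 2.4, 2.6, 2.8, 3.0])
y = np.array([0.6935, 0.7885, 0.8750, 0.9555, 1.0, 1.1])

# Linearize y = a*x^b as Y = A*X + B with
# Y = log(y), X = log(x), A = b, B = log(a).
X = np.log(x)
Y = np.log(y)

# Design matrix with rows [X_i, 1], so the solution
# vector comes out in the order [A, B] = [slope, intercept].
M = np.column_stack([X, np.ones_like(X)])

# Normal equations: (M^T M) c = M^T Y, c = [A, B].
A, B = np.linalg.solve(M.T @ M, M.T @ Y)

# Transform back to the original parameters.
b = A
a = np.exp(B)
print(f"a = {a:.4f}, b = {b:.4f}")
```

In practice `np.linalg.lstsq(M, Y, rcond=None)` is preferred over forming $M^T M$ explicitly, since it is numerically more stable, but the normal-equations route matches the hand calculation above.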