Let $$y(x)=a\cdot\frac{x^c}{b^c}$$
Assume that we have a number of data samples $(x_i,y_i)$.
The objective is to find $a, b,$ and $c$ that minimize the squared-error function, i.e.,
$$J(a,b,c) = \sum_{\forall i} (y(x_i)-y_i)^2$$
As always, to minimize, we set the derivatives to 0, i.e.,
$$\frac{\partial J}{\partial a}=0 \iff 2\sum_{\forall i} (a\cdot\frac{x_i^c}{b^c}-y_i)\cdot \frac{x_i^c}{b^c}=0 \iff \sum_{\forall i} (a\cdot\frac{x_i^c}{b^c}-y_i)\cdot x_i^c=0$$
$$\frac{\partial J}{\partial b}=0 \iff -2\sum_{\forall i} (a\cdot\frac{x_i^c}{b^c}-y_i)\cdot c \cdot a \cdot x_i^c \cdot b^{-c-1}=0 \iff \sum_{\forall i} (a\cdot\frac{x_i^c}{b^c}-y_i)\cdot x_i^c=0$$
$$\frac{\partial J}{\partial c}=0 \iff 2\sum_{\forall i} (a\cdot\frac{x_i^c}{b^c}-y_i)\cdot a\cdot\frac{x_i^c}{b^c} \cdot \log\frac{x_i}{b}=0 \iff \sum_{\forall i} (a\cdot\frac{x_i^c}{b^c}-y_i)\cdot x_i^c \cdot \log\frac{x_i}{b}=0$$
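A quick symbolic check (a sketch using SymPy, with a single-sample version of $J$) confirms the cancellation: the $a$- and $b$-gradients differ only by a nonzero factor, so setting them to zero gives the same equation.

```python
import sympy as sp

a, b, c, x, y = sp.symbols('a b c x y', positive=True)
r = a * x**c / b**c - y   # residual for one sample
J = r**2                  # single-sample squared error

dJa = sp.diff(J, a)
dJb = sp.diff(J, b)

# The two gradients are proportional: dJ/db = (-a*c/b) * dJ/da,
# so both equations reduce to the same condition on the residuals.
ratio = sp.simplify(dJb / dJa)
print(ratio)   # -> -a*c/b
```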
As you may have noticed, I expected to get a 3-by-3 system of equations, but due to cancellation I ended up with only two distinct equations in three unknowns (the first and second equations are identical after simplification).
Does this imply the existence of many solutions, or did I make a mistake in deriving the equations? Note that all the parameters are assumed to be positive reals.
Please help.
The problem is that there are infinitely many least-square solutions. This can, for example, be seen by noticing that $J(k^ca,kb,c) = J(a,b,c)$ for all $k > 0$. Only the parameter combination $(a/b^c,c)$ can be constrained, so your fitting function has one parameter that cannot be determined from the data. You can get around this problem by instead fitting the functional form $f(x) = a x^c$. With this form you will get a well-defined equation system.
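The invariance is easy to verify numerically. A minimal sketch (the sample data and parameter values here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(1.0, 10.0, size=20)
y = 3.0 * x**1.7 + rng.normal(0.0, 0.1, size=20)   # synthetic samples

def J(a, b, c):
    """Squared-error objective from the question."""
    return np.sum((a * x**c / b**c - y)**2)

a, b, c = 3.0, 2.0, 1.7
k = 5.0
# Rescaling (a, b) -> (k^c * a, k * b) leaves the objective unchanged
# (up to floating-point rounding), so (a, b) are not identifiable.
print(J(a, b, c), J(k**c * a, k * b, c))
```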
You can also, if desired, reduce the problem to a standard linear least-square problem by taking $\log$'s, which allows you to use the standard well-known formulas for this case directly. If we define $\tilde{y}_i = \log(y_i)$ and $\tilde{x}_i = \log(x_i)$ (assuming for simplicity that $y_i > 0$; otherwise we need to work with absolute values) and fit these points to $\tilde{y}(\tilde{x}) = \alpha \tilde{x}+\beta$, then the least-square solution for $(\alpha,\beta)$ is related to the least-square solution for $(a,c)$ via $c = \alpha$ and $a = e^{\beta}$.
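A minimal sketch of this log-linear fit with NumPy (synthetic data; `np.polyfit` performs the linear least-square fit):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(1.0, 10.0, size=50)
# Samples from y = a*x^c with multiplicative noise (a=2.5, c=1.3 assumed)
y = 2.5 * x**1.3 * np.exp(rng.normal(0.0, 0.01, size=50))

# Fit log(y) = alpha*log(x) + beta, then recover c = alpha, a = exp(beta)
alpha, beta = np.polyfit(np.log(x), np.log(y), deg=1)
c_hat, a_hat = alpha, np.exp(beta)
print(a_hat, c_hat)   # close to 2.5 and 1.3
```

One caveat worth keeping in mind: the log transform minimizes the squared error of $\log y$, not of $y$ itself, so it weights the residuals differently than the original objective.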