So I have a table with points in time $t_i$ and measurements $s_i$ given by
$$\begin{array}{c|cccc}t_i & 0 & 1 & 2 & 3\\\hline s_i & 4.30 & 1.48 & 0.56 & 0.24\end{array}$$
As a regression function I'm given
$$s(t)=a_1 2^{-2t}+a_2 2^{-t}$$
Now I'm supposed to find the parameters $a_1$ and $a_2$ with the least squares method, which means I have to minimize
$$\sum_{i=0}^{3}|s(t_i)-s_i|^2.$$
I think I understand linear regression quite well and would be able to perform it on this example, but the two terms of the function really throw me off.
$$s(t)=a \:2^{-2t}+b\: 2^{-t}$$
Introduce the variable $x_i=2^{-t_i}$:
$$s(x)=a \:x^2+b\: x$$
To find the approximations of $a,b$, one has to perform a linear regression.
This means linear with respect to the sought parameters $a,b$, of course not with respect to the functions $x^2,x$.
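To see this concretely, here is a minimal numerical sketch (assuming NumPy is available; the variable names are mine) of the fit in the basis $x^2, x$, which is linear in $(a,b)$:

```python
import numpy as np

# Data from the question
t = np.array([0, 1, 2, 3])
s = np.array([4.30, 1.48, 0.56, 0.24])

# Change of variable: x_i = 2^(-t_i)
x = 2.0 ** (-t)

# Design matrix for s(x) = a*x^2 + b*x; the model is linear in (a, b)
A = np.column_stack([x**2, x])

# Ordinary least squares on the original s values
(a, b), *_ = np.linalg.lstsq(A, s, rcond=None)
print(a, b)  # a ≈ 2.708, b ≈ 1.594
```

The nonlinearity in $t$ lives entirely in the design matrix; the solve itself is an ordinary linear least-squares problem.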
I suppose all of this is explained in your textbook in a more general manner.
If you are troubled by the function $x^2$, you can do this (not recommended, only for information):
$$\frac{s(t)}{2^{-t}}=a \:2^{-t}+b$$
Compute $y_i=\frac{s_i}{2^{-t_i}}$:
$$y(x)=a \:x+b$$
You are certainly more familiar with this simpler form.
The result is slightly different from the one above. This is not surprising, because the fit then minimizes the residuals with respect to $y$ instead of $s$.
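For comparison, a sketch of this linearized version (again assuming NumPy; the names are illustrative), fitting the straight line $y=a\,x+b$ to the transformed data:

```python
import numpy as np

# Same data as in the question
t = np.array([0, 1, 2, 3])
s = np.array([4.30, 1.48, 0.56, 0.24])

x = 2.0 ** (-t)   # x_i = 2^(-t_i)
y = s / x         # y_i = s_i / 2^(-t_i)

# Ordinary simple linear regression y = a*x + b (degree-1 polynomial fit)
a, b = np.polyfit(x, y, 1)
print(a, b)  # a ≈ 2.731, b ≈ 1.575
```

These values are close to, but not equal to, those from the direct fit of $s$, illustrating that the two formulations minimize different residuals.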