Sinusoidal regression


I am trying to do my homework on finding parameters from some data and I am kind of stuck. The problem is to find parameters $(T_m, T_0, t_0, \omega) \in \mathbb R \times \mathbb R \times \mathbb R_{+} \times \mathbb R_{+}$ such that $$T_i = T_m + T_0 \sin (\omega (t_i-t_0)) + \epsilon_i$$ for all data points $(t_i, T_i)$, where $\epsilon_i$ is a small noise term. I tried to minimize the quadratic error, but since the quadratic error function is not convex, a minimizer found this way is not guaranteed to be global. Can someone help me find a method to solve the problem?

Best answer

The model being $$T = T_m + T_0\, \sin [\omega (t-t_0)] $$ nonlinear regression will be required, which means that rather good starting estimates will be needed.

What I would do is to fix $\omega$ at a given value and rewrite the model as $$T=T_m+T_0 \cos(\omega t_0)\sin(\omega t)-T_0 \sin(\omega t_0)\cos(\omega t)$$ that is to say $$T=T_m+a \,\sin(\omega t)+b \,\cos(\omega t)$$ where $a=T_0 \cos(\omega t_0)$ and $b=-T_0 \sin(\omega t_0)$.

Since, for the time being, $\omega$ is fixed, define two variables $x_i=\sin(\omega t_i)$ and $y_i=\cos(\omega t_i)$, making the model $$T=T_m+a x+b y$$ which is a simple multilinear regression.
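As a sketch of this fixed-$\omega$ linear step (in Python with NumPy; the synthetic data and the true parameter values here are my own illustration, not from the question):

```python
import numpy as np

# Hypothetical sample data: T = 2.0 + 1.5 sin(1.3 (t - 0.4)) + noise.
# Replace with your own measurements (t_i, T_i).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)
T = 2.0 + 1.5 * np.sin(1.3 * (t - 0.4)) + 0.05 * rng.standard_normal(t.size)

omega = 1.3  # omega is held fixed for this step

# Design matrix for T ≈ T_m + a sin(omega t) + b cos(omega t)
A = np.column_stack([np.ones_like(t), np.sin(omega * t), np.cos(omega * t)])
(T_m, a, b), *_ = np.linalg.lstsq(A, T, rcond=None)
```

With $\omega$ fixed, the problem is linear in $(T_m, a, b)$, so ordinary least squares solves it exactly in one shot.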

Now, consider $$SSQ(\omega)=\sum_{i=1}^n (T_m+a x_i+b y_i-T_i)^2$$ and compute this sum of squares for various values of $\omega$ until you see a minimum (plot the results). If required, repeat the procedure with a smaller stepsize $\Delta \omega$.
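The scan over $\omega$ could be sketched like this (Python/NumPy; again with synthetic data of my own, true $\omega = 1.3$, so the step size and range here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)
T = 2.0 + 1.5 * np.sin(1.3 * (t - 0.4)) + 0.05 * rng.standard_normal(t.size)

def ssq(omega, t, T):
    """Residual sum of squares of the linear fit for a fixed omega."""
    A = np.column_stack([np.ones_like(t), np.sin(omega * t), np.cos(omega * t)])
    coef, *_ = np.linalg.lstsq(A, T, rcond=None)
    r = A @ coef - T
    return float(r @ r)

# Coarse grid search; refine the step size around the minimum if needed.
omegas = np.arange(0.1, 3.0, 0.05)
best = min(omegas, key=lambda w: ssq(w, t, T))
```

Plotting `ssq` against `omegas` makes the minimum easy to spot, as the answer suggests.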

Say that $\omega_*$ is your best candidate after this search. For this specific value, the linear regression also gives you the corresponding $a_*$ and $b_*$.

Now, $a_*^2+b_*^2=T_0^2$, so $T_0=\sqrt{a_*^2+b_*^2}$ gives the estimate for $T_0$. Similarly, $\frac{b_*}{a_*}=-\tan(\omega_* t_0)$, from which $t_0=-\frac{1}{\omega_*}\arctan\!\left(\frac{b_*}{a_*}\right)$ gives the estimate for $t_0$.
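In code, recovering $T_0$ and $t_0$ from $a_*$ and $b_*$ is just the inverse of the trigonometric rewrite above (the numerical values of `omega_star`, `a_star`, `b_star` below are my own illustration, consistent with true parameters $T_0 = 1.5$, $t_0 = 0.4$):

```python
import numpy as np

# Hypothetical fit results at omega_*, consistent with T0 = 1.5, t0 = 0.4:
omega_star = 1.3
a_star = 1.5 * np.cos(omega_star * 0.4)    # a = T0 cos(omega t0)
b_star = -1.5 * np.sin(omega_star * 0.4)   # b = -T0 sin(omega t0)

T0 = np.hypot(a_star, b_star)              # T0 = sqrt(a^2 + b^2)
t0 = -np.arctan2(b_star, a_star) / omega_star  # since b/a = -tan(omega t0)
```

Using `arctan2` rather than `arctan` keeps the quadrant right when $a_*$ is negative.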

Now you should be ready to start the nonlinear regression for the four original parameters.
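The final nonlinear step could be sketched with `scipy.optimize.curve_fit` (one possible tool; the answer does not prescribe a particular solver, and the data and starting values below are my own illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)
T = 2.0 + 1.5 * np.sin(1.3 * (t - 0.4)) + 0.05 * rng.standard_normal(t.size)

def model(t, T_m, T_0, omega, t_0):
    """The original four-parameter sinusoidal model."""
    return T_m + T_0 * np.sin(omega * (t - t_0))

# Starting values as they would come out of the linear search (assumed here)
p0 = [2.0, 1.4, 1.30, 0.38]
popt, pcov = curve_fit(model, t, T, p0=p0)
T_m, T_0, omega, t_0 = popt
```

With starting values this close, the optimizer converges in a few iterations; the diagonal of `pcov` gives the variance of each estimate, matching the standard errors reported below.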

Edit

Have a look here for a very interesting method (page 21 and next) developed and extensively used by JJacquelin, an MSE user.

Update

For illustration purposes, I used the data set given on page 23 of JJacquelin's linked book and applied the procedure described above.

The preliminary search gives the following results $$\left( \begin{array}{ccccc} \omega & \text{SSQ}(\omega) & T_m & a & b \\ 0.0 & 16.854 & -0.26694 & 0.43645 & +0.21608 \\ 0.5 & 8.369 & +4.32249 & 0.84523 & -5.29375 \\ 1.0 & 7.265 & +0.61499 & 0.68349 & -1.62695 \\ 1.5 & 3.869 & -0.10656 & 1.01910 & -0.90143 \\ 2.0 & \color{red}{0.326} & -0.39790 & 1.28306 & -0.57357\\ 2.5 & 3.278 & -0.42749 & 1.18252 & -0.85369 \\ 3.0 & 8.766 & -0.10998 & 0.78829 & +0.34607 \end{array} \right)$$

Then, from the values obtained for $\omega_*=2.0$, we get the estimates $T_m=-0.40$, $T_0=1.40$ and $t_0=0.21$. Using these as starting values, the nonlinear regression converges in a couple of iterations and gives $$\begin{array}{ccc} & \text{Estimate} & \text{Standard Error} \\ T_m & -0.39070 & 0.05412 \\ T_0 & +1.41040 & 0.06313 \\ \omega & +1.98131 & 0.04346 \\ t_0 & +0.21064 & 0.02831 \\ \end{array}$$ and the final results $$\left( \begin{array}{ccc} t & T & T_{calc} \\ -1.983 & 0.936 & 0.9262 \\ -1.948 & 0.810 & 0.8881 \\ -1.837 & 0.716 & 0.7275 \\ -1.827 & 0.906 & 0.7102 \\ -1.663 & 0.247 & 0.3712 \\ -0.815 & -1.513 & -1.6537 \\ -0.778 & -1.901 & -1.6963 \\ -0.754 & -1.565 & -1.7201 \\ -0.518 & -1.896 & -1.7897 \\ 0.322 & 0.051 & -0.0820 \\ 0.418 & 0.021 & 0.1726 \\ 0.781 & 1.069 & 0.8849 \\ 0.931 & 0.862 & 1.0052 \\ 1.510 & 0.183 & 0.3670 \\ 1.607 & 0.311 & 0.1259 \end{array} \right)$$