Real problem: I have $n$ points $(t_i, \ x_i)$ (from an experiment) and I want to find the best values for $X = (A, B, \xi, \omega)$ to fit
$$ x(t) = \exp \left(-\xi \omega t\right)\left[A\cos (\mu \omega t) + B \sin (\mu \omega t)\right] $$
With $0 < \xi < 1$ and $\mu = \sqrt{1-\xi^2}$.
My minimizing function $J$ (least square) is
$$ J(A, \ B, \xi, \ \omega) = \dfrac{1}{2}\sum_{i=1}^{n} (x(t_i) - x_i)^2 $$
Then, the equation to solve is
$$ \nabla J = \begin{bmatrix}\dfrac{\partial J}{\partial A} & \dfrac{\partial J}{\partial B} & \dfrac{\partial J}{\partial \xi} & \dfrac{\partial J}{\partial \omega}\end{bmatrix} = \vec{0} $$
As this is not a linear system of equations, I use Newton's method
$$ X_{j+1} = X_{j} - \left[\nabla^2 J\right]^{-1} \cdot \left[\nabla J\right] $$
Unfortunately, this method converges only when I have a good estimate for the initial value $X_{0}$, which is hard to obtain because my data is very noisy.
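For concreteness, here is a minimal sketch of that iteration (the function names, the central-difference approximation of $\nabla J$ and $\nabla^2 J$, and the step size `h` are my own choices; with analytic derivatives the behaviour is the same):

```python
import numpy as np

def J(X, t, x):
    """Least-squares objective for x(t) = exp(-xi*om*t)(A cos + B sin)."""
    A, B, xi, om = X
    mu = np.sqrt(1.0 - xi**2)
    model = np.exp(-xi * om * t) * (A * np.cos(mu * om * t)
                                    + B * np.sin(mu * om * t))
    return 0.5 * np.sum((model - x) ** 2)

def newton(X0, t, x, h=1e-4, iters=15):
    """Newton iteration X <- X - H^{-1} g, with g and H from
    central finite differences of J."""
    X = np.asarray(X0, dtype=float)
    n = X.size
    I = np.eye(n)
    for _ in range(iters):
        g = np.array([(J(X + h*I[i], t, x) - J(X - h*I[i], t, x)) / (2*h)
                      for i in range(n)])
        H = np.array([[(J(X + h*I[i] + h*I[j], t, x)
                        - J(X + h*I[i] - h*I[j], t, x)
                        - J(X - h*I[i] + h*I[j], t, x)
                        + J(X - h*I[i] - h*I[j], t, x)) / (4*h*h)
                       for j in range(n)] for i in range(n)])
        X = X - np.linalg.solve(H, g)
    return X
```

Started close to the true parameters on clean data it converges quickly; from a poor $X_0$ the Hessian can be indefinite and the iteration wanders off, which is exactly the problem described above.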
Complex problem: So I thought about transforming my equation into complex form, reducing the problem to two variables $C, \ s \in \mathbb{C}$:
$$ s = -\xi \omega + i \cdot \mu \omega \in \mathbb{C} $$
$$ C = A + i \cdot B \in \mathbb{C} $$
$$ z(t) = C \exp \left(s t\right) $$
And this equation is much easier to solve, because taking logarithms turns it into a linear regression (least squares):
$$ \ln z = \ln C + s \cdot t $$
Once $z(t)$ is found, I can recover $x(t)$ as
$$ x(t) = \Re(z(t)) $$
Unfortunately, I don't have the "imaginary" points $y_i$ needed to form $z_i = x_i + i \cdot y_i$ and run the regression in the complex plane.
Question: Is there any way to make this work? That is, can I transform a real problem (with 4 real variables) into a complex problem (with 2 complex variables) and obtain a method that is more stable (less dependent on $X_0$, or independent of it altogether)?
Context: The function $x(t)$ is the solution of the ODE of a mass-spring-damper system when $0 < \xi < 1$, and I have noisy experimental data $(t_i, \ x_i)$. I want the best values of $\xi$ and $\omega$ to characterize my system.
$$ m \ddot{x} + c\dot{x} + k x = 0 $$
$$ \ddot{x} + 2\xi \omega \dot{x} + \omega^2 x = 0 $$
Even with perfect data, a model such as $$x = \exp \left(-\xi \omega t\right)\left[A\cos (\mu \omega t )+ B \sin (\mu \omega t)\right]\qquad \text{with} \quad \mu = \sqrt{1-\xi^2}$$ would be very difficult to fit in the absence of good guesses.
Fortunately, only two parameters make the problem nonlinear: $\xi$ and $\omega$.
For the time being, suppose that we fix their values. Defining two variables $$y_i=\exp \left(-\xi \omega t_i\right)\cos (\mu \omega t_i )\qquad \text{and} \qquad z_i=\exp \left(-\xi \omega t_i\right)\sin (\mu \omega t_i )$$ the problem becomes minimizing
$$\Phi(A,B)=\dfrac{1}{2}\sum_{i=1}^{n}\Big[A y_i+B z_i- x_i\Big]^2$$ which is simple to do using the normal equations $$\sum_{i=1}^{n} x_i\,y_i=A \sum_{i=1}^{n} y^2_i+B\sum_{i=1}^{n} y_i\,z_i$$ $$\sum_{i=1}^{n} x_i\,z_i=A \sum_{i=1}^{n} y_i\,z_i+B\sum_{i=1}^{n} z^2_i$$
from which the values of $A_{\text{opt}}$, $B_{\text{opt}}$ and $\Phi_{\text{min}}=\Phi(A_{\text{opt}},B_{\text{opt}})$ follow for the current $\xi$ and $\omega$.
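As a concrete sketch (the function name and synthetic test data are my own), the normal equations above are just a $2\times 2$ linear system:

```python
import numpy as np

def fit_AB(t, x, xi, omega):
    """Solve the normal equations for A, B at fixed (xi, omega);
    returns (A, B, Phi_min)."""
    mu = np.sqrt(1.0 - xi**2)
    env = np.exp(-xi * omega * t)
    y = env * np.cos(mu * omega * t)
    z = env * np.sin(mu * omega * t)
    # Gram matrix and right-hand side of the normal equations
    G = np.array([[y @ y, y @ z],
                  [y @ z, z @ z]])
    rhs = np.array([x @ y, x @ z])
    A, B = np.linalg.solve(G, rhs)
    r = A * y + B * z - x
    return A, B, 0.5 * (r @ r)
```

With the true $(\xi,\omega)$ and noise-free data this recovers $A$ and $B$ exactly (up to rounding), and $\Phi_{\text{min}}$ drops to zero.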
Now, build a regular grid for $0 < \xi < 1$ and $0 <\omega < ?$ and for each pair compute $\Phi_{\text{min}}$, searching for the couple $(\xi,\omega)$ which gives the lowest value of $\Phi_{\text{min}}$.
When found, you have your starting guesses for $(A,B,\xi,\omega)$ and you can start Newton-Raphson iterations.
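A minimal sketch of the whole procedure (the grid bounds and resolution are arbitrary illustrative choices; the inner solve is just the normal equations for $A$ and $B$ at fixed $\xi$, $\omega$):

```python
import numpy as np

def phi_min(t, x, xi, omega):
    """Best Phi over (A, B) at fixed (xi, omega), via the normal equations."""
    mu = np.sqrt(1.0 - xi**2)
    env = np.exp(-xi * omega * t)
    y = env * np.cos(mu * omega * t)
    z = env * np.sin(mu * omega * t)
    G = np.array([[y @ y, y @ z], [y @ z, z @ z]])
    A, B = np.linalg.solve(G, np.array([x @ y, x @ z]))
    r = A * y + B * z - x
    return 0.5 * (r @ r), A, B

def grid_search(t, x, xis, omegas):
    """Return starting guesses (A, B, xi, omega) minimizing Phi_min
    over the (xi, omega) grid."""
    best_phi, best_X = np.inf, None
    for xi in xis:
        for om in omegas:
            phi, A, B = phi_min(t, x, xi, om)
            if phi < best_phi:
                best_phi, best_X = phi, (A, B, xi, om)
    return best_X
```

The tuple returned by `grid_search` is then the $X_0$ from which the Newton-Raphson iterations can be started safely.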