Is it possible to fit the initial time?


Assume the initial time is $t = 0$ and the following general ODE model:

$\dot{\textbf{y}} = f(t, \textbf{p}, \textbf{y})$

where $t$ is the time, $\textbf{p}$ is the vector of model parameters, and $\textbf{y} = (y_1, y_2)^T$ is the vector of model outputs. I am trying to fit this model to my data, $\hat{u}$. For my data, I assume that:

$\hat{u} = y_1 + y_2$

The problem is that the first data point, $\hat{u}_0$, is not actually measured at $t = 0$. That is:

$\hat{u}_0 = y_1(t_0) + y_2(t_0)$, with $t_0 \neq 0$

My problem: $t_0$, $y_1(t_0)$, and $y_2(t_0)$ are unknown. I only know $\hat{u}_0$. Here are my approaches:

  1. First approach: Fit a proportional parameter, $p$. So, $y_1(t_0) = p \hat{u}_0$ and $y_2(t_0) = (1-p)\hat{u}_0$.
  2. Second approach: Fit $t_0$.
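To make the first approach concrete, here is a minimal sketch with SciPy. The two-state linear model, the rate names `k1`/`k2`, and all numbers are assumptions for illustration, not my actual model:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# Toy two-state linear stand-in for f; k1, k2, and all numbers below
# are assumptions for illustration only.
def f(t, y, k1, k2):
    y1, y2 = y
    return [-k1 * y1, k1 * y1 - k2 * y2]

u0_hat = 1.0                      # first measured value, y1(t_0) + y2(t_0)
t_rel = np.linspace(0.0, 8.0, 9)  # sample times relative to the first point

def model(k1, k2, p):
    # Approach 1: split u0_hat between the two states with a fitted
    # fraction p, and start the clock at the first sample (for an
    # autonomous f, the unknown t_0 can be absorbed this way).
    y0 = [p * u0_hat, (1.0 - p) * u0_hat]
    sol = solve_ivp(f, (0.0, t_rel[-1]), y0, t_eval=t_rel,
                    args=(k1, k2), rtol=1e-9, atol=1e-9)
    return sol.y.sum(axis=0)

u_hat = model(0.5, 0.2, 0.7)  # noise-free synthetic "data"

fit = least_squares(lambda th: model(*th) - u_hat,
                    x0=[0.3, 0.3, 0.5], bounds=([0, 0, 0], [5, 5, 1]))
```

On noise-free synthetic data this recovers the generating rates, although in this particular toy model they are only identified up to a label swap of `k1` and `k2` (with a compensating `p`).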

Normally, I would take the first approach. However, I was told to try the second approach. I tried it, and it did not work: the algorithm estimates all the other parameters but leaves $t_0$ untouched, so $t_0$ just stays at its starting value. For example, if I set the initial guess for $t_0$ to $50$, the algorithm returns $t_0 = 50$.
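My best guess at a reproduction of this behavior (the toy model, the state split, and the numbers are assumptions): if $f$ is autonomous and $t_0$ only sets where the integration starts, while the initial state and the relative sample times are held fixed, then $t_0$ is a pure time translation. The residuals do not depend on it at all, so a gradient-based fitter sees a zero gradient for $t_0$ and never moves it:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy autonomous stand-in for f; k1, k2, the state split, and the data
# values below are all assumptions for illustration.
def f(t, y, k1=0.5, k2=0.2):
    y1, y2 = y
    return [-k1 * y1, k1 * y1 - k2 * y2]

u_hat = np.array([1.0, 0.8, 0.65])  # fake measurements
dt = np.array([0.0, 1.0, 2.0])      # sample times relative to the first point

def residuals(t0):
    # t0 only shifts where the integration starts; the initial state and
    # the relative sample spacing are unchanged: a pure time translation.
    sol = solve_ivp(f, (t0, t0 + dt[-1]), [0.7, 0.3],
                    t_eval=t0 + dt, rtol=1e-8)
    return sol.y.sum(axis=0) - u_hat

# Time-translation invariance of an autonomous ODE: identical residuals
# whether t0 starts at 1 or at 50, so the fitter has no reason to move it.
r1, r50 = residuals(1.0), residuals(50.0)
```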

The second approach seems really odd to me, but I am not sure why it did not work, and I am not sure how to explain the failure to the people who suggested it. Has anyone ever done something similar?
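For completeness, here is a sketch of a setup in which fitting $t_0$ can work, assuming the state at the model's absolute origin $t = 0$ is known (the toy model and all numbers are again assumptions). Integrating from $t = 0$ and evaluating at $t_0 + \Delta t_k$ makes the residuals genuinely depend on $t_0$; without some anchor at absolute time, $t_0$ cannot be identified:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# Toy stand-in for f; k1, k2, y0, and all numbers are assumptions.
def f(t, y, k1, k2):
    y1, y2 = y
    return [-k1 * y1, k1 * y1 - k2 * y2]

y0 = [1.0, 0.0]                 # assumed *known* state at absolute t = 0
dt = np.linspace(0.0, 8.0, 9)   # sample times relative to the first point

def model(k1, k2, t0):
    # Integrate from the absolute origin and read off y1 + y2 at t0 + dt,
    # so changing t0 genuinely changes the residuals.
    sol = solve_ivp(f, (0.0, t0 + dt[-1]), y0, t_eval=t0 + dt,
                    args=(k1, k2), rtol=1e-9, atol=1e-9)
    return sol.y.sum(axis=0)

u_hat = model(0.5, 0.2, 2.0)    # noise-free synthetic data, true t0 = 2

fit = least_squares(lambda th: model(*th) - u_hat,
                    x0=[0.4, 0.3, 1.0], bounds=([0, 0, 0], [5, 5, 10]))
```

Here the fitted $t_0$ moves away from its starting value and recovers the value used to generate the data, precisely because the known state at $t = 0$ breaks the time-translation invariance.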