I want to know if there is an exact solution for the following problem and how to approach solving it:
I have a discrete-time signal, sampled so that the Nyquist criterion is satisfied:
$$ r_k = \sum_i a_i^{(1)}h^{(1)}(kT-iT-\tau_i^{(1)}) +\sum_i a_i^{(2)}h^{(2)}(kT-iT-\tau_i^{(2)}),$$
where $a_i^{(1)}$ and $a_i^{(2)}$ are two independent random streams of $\{\pm 1\}$ symbols, $h^{(1)}(t)$ and $h^{(2)}(t)$ are their corresponding pulses, $T$ is the bit period, and $\tau_i^{(1)}$ and $\tau_i^{(2)}$ are timing errors.
All of the parameters in the equation above are known. From them, I want to compute a second set of samples:
$$ s_k = \sum_i a_i^{(1)}h^{(1)}(kT-iT) +\sum_i a_i^{(2)}h^{(2)}(kT-iT), $$
which looks exactly like the original samples except that the timing errors have been removed. I know that I cannot achieve this just by resampling. How should I approach this?
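To make the setup concrete, here is a minimal sketch of the two sample sets as defined above. The pulse shapes, bit period, stream length, and timing-error statistics are all illustrative assumptions, not part of my actual system:

```python
import numpy as np

rng = np.random.default_rng(0)

T = 1.0                 # bit period (assumed)
N = 64                  # number of bits per stream (assumed)
k = np.arange(N)        # sample indices, one sample per bit period

# Two independent +/-1 symbol streams
a1 = rng.choice([-1.0, 1.0], size=N)
a2 = rng.choice([-1.0, 1.0], size=N)

# Known per-symbol timing errors (assumed small relative to T)
tau1 = 0.1 * rng.standard_normal(N)
tau2 = 0.1 * rng.standard_normal(N)

def h1(t):
    # illustrative pulse for stream 1: sinc (Nyquist pulse for period T)
    return np.sinc(t / T)

def h2(t):
    # illustrative pulse for stream 2: Gaussian (assumption)
    return np.exp(-(t / T) ** 2)

def synthesize(tau1, tau2):
    """Evaluate sum_i a_i^(m) h^(m)(kT - iT - tau_i^(m)) for both streams."""
    i = np.arange(N)
    # (k, i) grid of time offsets kT - iT
    t = k[:, None] * T - i[None, :] * T
    return h1(t - tau1[None, :]) @ a1 + h2(t - tau2[None, :]) @ a2

r = synthesize(tau1, tau2)               # received samples r_k, with timing errors
s = synthesize(np.zeros(N), np.zeros(N)) # desired samples s_k, timing errors removed
```

Here `r` holds the samples I have and `s` the samples I want; in the sketch `s` is obtained by brute-force re-synthesis, whereas what I am asking is how (or whether) one can go from `r` to `s` exactly.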
Please share your thoughts.