There is a sequence of time intervals, each modeled as an independent normally distributed random variable: $$ t_i \sim N(\mu_i, \sigma_i) \qquad \forall i \in [1..n] $$
I know $\mu_i$ and $\sigma_i$ for all the distributions above. I need to find the sequence $t_1..t_n$ with the highest probability density such that $$ \sum_{i=1}^{n} t_i = T $$
$T$ is the total time. Is there a known methodology to solve the problem above?
The solution I am working on right now: we have to maximize
$$ P = \prod_{i=1}^{n} \mathcal{N}(t_i; \mu_i,\sigma_i) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma_i^2}} e^{-\frac{(t_i - \mu_i)^2}{2\sigma_i^2}} $$ Taking the log, we have to maximize $$ \log(P) = \sum_{i=1}^{n} \left( -\frac{1}{2}\log(2\pi\sigma_i^2) - \frac{(t_i - \mu_i)^2}{2\sigma_i^2} \right) $$
Dropping the constant terms and the factor $\frac{1}{2}$, this is equivalent to minimizing $$ \sum_{i=1}^{n} \frac{(t_i-\mu_i)^2}{\sigma_i^2} $$
I couldn't get any further, but it looks like there should be a simple closed-form solution.