We have a time-delay element that, given an input pulse, generates an output pulse after a time $T_{nom}$, the average (nominal) delay of the device. Due to the device's mismatch characteristics, the delay is stochastic in nature and follows a Gaussian distribution:
$$F(t)=\frac{1}{\sqrt{2\pi} \ \sigma_{mis}}e^{-\frac{(t-T_{nom})^2}{2\sigma_{mis}^2}}$$
where $\sigma_{mis}$ is the device's mismatch standard deviation. In addition to the mismatch, the device also exhibits unavoidable random noise, or jitter. This jitter, which further perturbs the delay time, also follows a Gaussian distribution, with standard deviation $\sigma_{jit}$.
What I want now is to include the effect of the jitter in the function $F(t)$ and arrive at a compact model that fully describes the stochastic behavior of the device. But how?
As hinted in the comments, if you assume that the jitter is independent of the device's mismatch, the total time delay $\Delta T$ is given by $$\Delta T = T_{mis}+T_{jit},$$ where $T_{mis}\sim \mathcal N(T_{nom},\sigma_{mis}^2)$ and $T_{jit}\sim \mathcal N(0,\sigma_{jit}^2)$.
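The step from independence to the distribution of the sum is the standard convolution argument: the PDF of the sum of two independent random variables is the convolution of their PDFs, and the convolution of two Gaussians is again Gaussian, with the means and variances adding:

$$f_{\Delta T}(t)=\int_{-\infty}^{\infty} f_{T_{mis}}(\tau)\, f_{T_{jit}}(t-\tau)\, d\tau \quad\Longrightarrow\quad \Delta T \sim \mathcal N\!\left(T_{nom},\ \sigma_{mis}^2+\sigma_{jit}^2\right).$$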
Now, since $T_{mis}$ and $T_{jit}$ are assumed to be independent, you can deduce that their sum $\Delta T$ is also normally distributed with mean $T_{nom}$ and variance $\sigma_{mis}^2+\sigma_{jit}^2$. So the PDF of $\Delta T$ is $$f_{\Delta T}(t) = \frac{1}{\sqrt{2\pi(\sigma_{jit}^2+\sigma_{mis}^2)}}\cdot \exp\left({-\frac{1}{2}\left(\frac{t-T_{nom}}{\sqrt{\sigma_{jit}^2+\sigma_{mis}^2}}\right)^2}\right) $$
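You can sanity-check this result numerically with a quick Monte Carlo simulation. The sketch below (the numerical values for $T_{nom}$, $\sigma_{mis}$, and $\sigma_{jit}$ are illustrative assumptions, not taken from the question) draws independent mismatch and jitter samples, sums them, and compares the empirical mean and standard deviation against $T_{nom}$ and $\sqrt{\sigma_{mis}^2+\sigma_{jit}^2}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameter values (assumptions for this sketch)
T_nom = 10.0      # nominal delay
sigma_mis = 0.5   # mismatch standard deviation
sigma_jit = 0.2   # jitter standard deviation

N = 1_000_000
T_mis = rng.normal(T_nom, sigma_mis, N)  # mismatch component ~ N(T_nom, sigma_mis^2)
T_jit = rng.normal(0.0, sigma_jit, N)    # jitter component   ~ N(0, sigma_jit^2)
delta_T = T_mis + T_jit                  # total delay

# Predicted standard deviation of the sum of the two independent Gaussians
sigma_tot = np.hypot(sigma_mis, sigma_jit)  # sqrt(sigma_mis^2 + sigma_jit^2)

print(f"empirical mean: {delta_T.mean():.4f}  (predicted {T_nom})")
print(f"empirical std:  {delta_T.std():.4f}  (predicted {sigma_tot:.4f})")
```

With a million samples, the empirical moments should match the predicted $T_{nom}$ and $\sqrt{\sigma_{mis}^2+\sigma_{jit}^2}$ to a few decimal places.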