Let $(X_t)_{0 \leq t \leq 1}$ be a Gauss-Markov semimartingale. I believe the Gaussian property implies that if this process jumps, it jumps at fixed (non-random) times. But can the size of the jumps be random?
My intuition: if a Gaussian process jumps, the size of the jump must be Gaussian. For instance, $X_t= B_t + Y 1_{[1/2,1]}(t)$ where $Y$ is a Gaussian random variable and $(B_t)$ a Brownian motion. But then, $(X_t)$ is not Markov, since for $1/2 < t < 1$, $P(X_1 <x \, | \, \mathcal F_t^X) = P(X_1 <x \,| \,Y, X_t) $, where $(\mathcal F_t^X)$ is the natural filtration of $(X_t)$. The only way that $P(X_1 <x \,|\, Y, X_t) = P(X_1 <x \,|\, X_t)$ can hold is if $Y$ is a degenerate Gaussian, that is, a constant. Hence the size of the jumps is not random.
EDIT: Here is an interesting fact I noticed: the size of the jumps can be random if what follows the jump is independent of what precedes it (this follows immediately). For example, $B_t 1_{[0, 1/2)}(t) + Y 1_{[1/2, 1]}(t)$ where $Y$ is a Gaussian random variable and $(B_t)$ a Brownian motion independent of $Y$. So the interesting question now is: can the size of the jumps be random when there is dependence between what follows the jump and what precedes it?
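To make the edited example concrete, here is a minimal simulation sketch (the grid size, seed, and variable names are my own choices) that samples paths of $B_t 1_{[0, 1/2)}(t) + Y 1_{[1/2, 1]}(t)$ and records the jump size $Y - B_{1/2-}$ at $t = 1/2$; its empirical variance should be close to $\operatorname{Var}(Y) + \operatorname{Var}(B_{1/2}) = 1 + 1/2$, so the jump size is genuinely random:

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths = 10_000
n_steps = 500  # time grid on [0, 1]
dt = 1.0 / n_steps

# Brownian motion on [0, 1/2): sum the increments to get the left limit B_{1/2-}
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps // 2))
B_half_minus = increments.sum(axis=1)

# Independent Gaussian level that the process takes on [1/2, 1]
Y = rng.normal(0.0, 1.0, size=n_paths)

# Jump size at t = 1/2; since Y and B are independent, its variance is 1 + 1/2
jump = Y - B_half_minus
print(jump.var())
```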
Consider the process
$$ X_t = Z \mathbf{1}_{\{ t \geq 1/2 \}} $$
for some $Z \sim \mathcal{N}(0, 1)$. We claim that $(X_t)_{t\in[0,1]}$ is a Gaussian–Markov process.
1st Proof. For any $0 \leq t_0 \leq t_1 \leq \ldots \leq t_n = 1$,
$$ X_{t_1} - X_{t_0}, \quad X_{t_2} - X_{t_1}, \quad \ldots, \quad X_{t_n} - X_{t_{n-1}} $$
is a sequence of independent Gaussian RVs. This follows from the observation that only one of these increments can be non-degenerate, namely the one straddling $t = 1/2$, whose value is $Z$. Consequently, $X = (X_t)_{t \in [0, 1]}$ is a Gaussian process with independent increments, hence it has the Markov property as required.
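This observation is easy to check numerically; a quick sketch with an arbitrary partition of $[0,1]$ (the partition and seed are illustrative choices): every increment of $X_t = Z\mathbf{1}_{\{t \geq 1/2\}}$ vanishes except the one over the subinterval containing $1/2$, which equals $Z$.

```python
import numpy as np

rng = np.random.default_rng(1)
Z = rng.normal(0.0, 1.0, size=5)          # a few sample values of Z
times = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])  # an arbitrary partition

for z in Z:
    X = z * (times >= 0.5)                # X_t = Z * 1{t >= 1/2} on the grid
    inc = np.diff(X)                      # increments over the partition
    # at most one increment is nonzero (the one over (0.4, 0.6]),
    # and the increments telescope to X_1 - X_0 = z
    assert np.count_nonzero(inc) <= 1
    assert np.isclose(inc.sum(), z)
```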
2nd Proof. The law $X = (X_t)_{t \in [0, 1]}$ is the same as that of the process $\tilde{X} = (\tilde{X}_t)_{t \in [0, 1]}$ defined by
$$ \tilde{X}_t = W(\theta_t) = \begin{cases} W(1), & t \geq 1/2, \\ W(0) = 0, & t < 1/2, \end{cases} $$
where $W = (W(t))_{t\geq 0}$ is a standard Wiener process and $\theta_t = \mathbf{1}_{\{t \geq 1/2\}}$. Since $\theta_t$ is a deterministic, non-decreasing, right-continuous function of $t$, $\tilde{X}$ is again both Gaussian and Markovian. So the same is true for the law of $X$ as well.
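As a quick sanity check on the equality of laws (a Monte Carlo sketch; the sample size and seed are arbitrary): forming $\tilde{X}_t = W(\theta_t)$ from samples of $W(1) \sim \mathcal{N}(0,1)$ reproduces the marginals of $X$, namely a point mass at $0$ for $t < 1/2$ and a standard Gaussian for $t \geq 1/2$.

```python
import numpy as np

rng = np.random.default_rng(2)
W1 = rng.normal(0.0, 1.0, size=100_000)   # W(1) for each sample path

theta = lambda t: float(t >= 0.5)         # deterministic time change
X_tilde = lambda t: W1 * theta(t)         # X~_t = W(theta_t)

print(X_tilde(0.3).var())                 # 0: degenerate before t = 1/2
print(X_tilde(0.7).var())                 # ~ 1: standard Gaussian after
```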
3rd Proof. Fix any $t \in [0, 1]$ and consider
$$ 0 \leq t_1 < t_2 < \ldots < t_n \leq 1 $$
so that $t_j \neq t$ for all $j$. Also, partition the index set $J = \{1,\ldots,n\}$ into two parts as follows:
$$ J_{\text{p}} = \{j \in J : t_j < t\} \qquad\text{and}\qquad J_{\text{f}} = \{j \in J : t_j > t\}. $$
Also, for each $I \subseteq J$ we write $X_I = (X_{t_j})_{j \in I}$. To establish the Markov property of $(X_t)_{t\in[0,1]}$, we will show that, conditioned on $X_t = x$, the CF of $X_J$ factors into the CF of the "past" part $X_{J_{\text{p}}}$ times the CF of the "future" part $X_{J_{\text{f}}}$. To this end, we consider the characteristic function of $X_I$ given $X_t = x$:
\begin{align*} \phi_{X_I \mid X_t = x}(\xi_I) &= \mathbf{E} \biggl[ \exp \biggl( i\sum_{j \in I} \xi_j X_{t_j} \biggr) \,\biggm|\, X_t = x \biggr] \\ &= \mathbf{E} \biggl[ \exp \biggl( i Z \sum_{j \in I} \xi_j \mathbf{1}_{\{ t_j \geq 1/2 \}} \biggr) \,\biggm|\, X_t = x \biggr]. \end{align*}
Case 1. If $t < 1/2$, then $X_t$ is a degenerate RV with the value $0$. So it suffices to consider the case $x = 0$ only. Moreover, since $X_t = 0$ holds almost surely, the condition $X_t = 0$ can be dropped from the expectation without affecting its value, yielding
\begin{align*} \phi_{X_I \mid X_t = x}(\xi_I) &= \mathbf{E} \biggl[ \exp \biggl( i Z \sum_{j \in I} \xi_j \mathbf{1}_{\{ t_j \geq 1/2 \}} \biggr) \biggr] = \exp\biggl[ -\frac{1}{2} \biggl( \sum_{j \in I} \xi_j \mathbf{1}_{\{ t_j \geq 1/2 \}} \biggr)^2 \biggr]. \end{align*}
Then by plugging $J$, $J_{\text{p}}$, and $J_{\text{f}}$ in for $I$ respectively, it is clear that
$$ \phi_{X_J \mid X_t = x}(\xi_J) = \phi_{X_{J_{\text{p}}} \mid X_t = x}(\xi_{J_{\text{p}}}) \phi_{X_{J_{\text{f}}} \mid X_t = x}(\xi_{J_{\text{f}}}). $$
This shows that $X_{J_{\text{p}}}$ and $X_{J_{\text{f}}}$ are independent given $X_t = 0$ as desired.
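The standard Gaussian CF used in Case 1 can be verified by simulation (a sketch; here $s$ plays the role of $\sum_{j \in I} \xi_j \mathbf{1}_{\{t_j \geq 1/2\}}$, and the sample size and seed are arbitrary): the Monte Carlo estimate of $\mathbf{E}[e^{isZ}]$ matches $e^{-s^2/2}$.

```python
import numpy as np

rng = np.random.default_rng(3)
Z = rng.normal(0.0, 1.0, size=200_000)    # samples of Z ~ N(0, 1)

for s in (0.5, 1.0, 2.0):
    empirical = np.exp(1j * s * Z).mean() # Monte Carlo estimate of E[exp(isZ)]
    exact = np.exp(-0.5 * s**2)           # CF of a standard Gaussian
    print(s, abs(empirical - exact))      # small for each s
```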
Case 2. Assume that $t \geq 1/2$. Then $X_t = Z$, and so,
\begin{align*} \phi_{X_I \mid X_t = x}(\xi_I) &= \mathbf{E} \biggl[ \exp \biggl( i Z \sum_{j \in I} \xi_j \mathbf{1}_{\{ t_j \geq 1/2 \}} \biggr) \,\biggm|\, Z = x \biggr] = \exp \biggl( i x \sum_{j \in I} \xi_j \mathbf{1}_{\{ t_j \geq 1/2 \}} \biggr). \end{align*}
Using this, it is straightforward to show that
$$ \phi_{X_J \mid X_t = x}(\xi_J) = \phi_{X_{J_{\text{p}}} \mid X_t = x}(\xi_{J_{\text{p}}}) \phi_{X_{J_{\text{f}}} \mid X_t = x}(\xi_{J_{\text{f}}}) $$
holds, hence $X_{J_{\text{p}}}$ and $X_{J_{\text{f}}}$ are independent given $X_t = x$, as desired.
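In Case 2 the factorization is just additivity of the exponent; a tiny numerical illustration with hypothetical values of $x$ and of the partial sums $s_{\text{p}} = \sum_{j \in J_{\text{p}}} \xi_j \mathbf{1}_{\{t_j \geq 1/2\}}$ and $s_{\text{f}} = \sum_{j \in J_{\text{f}}} \xi_j \mathbf{1}_{\{t_j \geq 1/2\}}$:

```python
import numpy as np

x = 0.7        # hypothetical conditioning value X_t = Z = x
s_p = -0.9     # hypothetical "past" partial sum of xi_j * 1{t_j >= 1/2}
s_f = 1.4      # hypothetical "future" partial sum

joint = np.exp(1j * x * (s_p + s_f))              # CF of X_J given X_t = x
factored = np.exp(1j * x * s_p) * np.exp(1j * x * s_f)

print(np.isclose(joint, factored))                # the two sides agree
```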