If $X(t)$ is observed at a random time $U \sim \text{Uniform}(0, 1)$, then, by the law of total probability, we have that $$P(X(U) = k \mid X(0) = 1) = \int_0^\infty P(X(u) = k \mid X(0) = 1) g_U(u) \ du,$$
where
$$g_U(u) = \begin{cases} 1, & 0 < u < 1, \\ 0, & \text{otherwise}. \end{cases}$$
How is the law of total probability being used here to obtain this result for the conditional probability?
[I will ignore the conditioning on $X(0)=1$ in my answer for simplicity; you can apply the same argument with conditioning on $X(0)=1$ afterward.]
General result: if $X = (X(t))_{t \ge 0}$ and $U$ are independent and $U$ has density $g_U$, then independence gives $E[f(X,U) \mid U=u] = E[f(X, u)]$, and hence, by the tower property, $$E[f(X,U)] = E\big[E[f(X,U) \mid U]\big] = \int_{-\infty}^\infty E[f(X,u)] \, g_U(u) \, du.$$
Applying this with $f(x,u) := \mathbf{1}\{x(u)=k\}$, so that $E[f(X,u)] = P(X(u)=k)$, gives $$P(X(U)=k) = \int_{-\infty}^\infty P(X(u)=k) \, g_U(u) \, du.$$
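As a sanity check, here is a small Monte Carlo sketch of this identity. The choice of process is an assumption for illustration only: I take $X$ to be a Poisson process with rate `lam`, so that $P(X(u)=k) = e^{-\lambda u}(\lambda u)^k/k!$, and compare the empirical frequency of $\{X(U)=k\}$ against a numerical value of $\int_0^1 P(X(u)=k)\,du$ (the integral reduces to $(0,1)$ because $g_U$ vanishes elsewhere).

```python
# Monte Carlo check of P(X(U)=k) = \int_0^1 P(X(u)=k) du,
# with X a Poisson process of rate lam (an illustrative assumption;
# the identity holds for any X independent of U).
import math
import random

random.seed(0)
lam, k, n = 2.0, 1, 200_000

# Empirical side: draw U ~ Uniform(0,1), then X(U) ~ Poisson(lam * U).
hits = 0
for _ in range(n):
    u = random.random()
    # Sample a Poisson(lam * u) variate by CDF inversion.
    x = 0
    p = cdf = math.exp(-lam * u)
    r = random.random()
    while r > cdf:
        x += 1
        p *= lam * u / x
        cdf += p
    hits += (x == k)
mc = hits / n

# Analytic side: midpoint rule for the integral of
# P(X(u)=k) = exp(-lam*u) (lam*u)^k / k! over (0,1).
m = 10_000
integral = sum(
    math.exp(-lam * (j + 0.5) / m) * (lam * (j + 0.5) / m) ** k
    for j in range(m)
) / (m * math.factorial(k))

print(f"Monte Carlo: {mc:.4f}, integral: {integral:.4f}")
```

With `n = 200_000` samples the two numbers should agree to roughly two decimal places.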
You can modify this to condition on an event $A$ that is independent of $U$ (here $A = \{X(0)=1\}$). The general result becomes $E[f(X,U) \mid U=u, A] = E[f(X,u) \mid A]$, and $$E[f(X,U) \mid A] = \int_{-\infty}^\infty E[f(X,u) \mid A] \, g_U(u) \, du.$$
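For concreteness, taking $A = \{X(0)=1\}$ and $f(x,u) := \mathbf{1}\{x(u)=k\}$ in this conditioned version recovers exactly the formula from the question:
$$P(X(U)=k \mid X(0)=1) = \int_{-\infty}^\infty P(X(u)=k \mid X(0)=1) \, g_U(u) \, du = \int_0^1 P(X(u)=k \mid X(0)=1) \, du,$$
where the last step uses the fact that $g_U$ equals $1$ on $(0,1)$ and $0$ elsewhere.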