Let $X=(X_t)_{t \ge 0}$ be a Lévy process with real-valued $X_t: \Omega \to \mathbb{R}$. I heard a slogan about Lévy processes that
"the probability measure on the space of paths $t \mapsto X_t(\omega)$ (for fixed $\omega \in \Omega$) is completely determined by the probability measure $P_{X_1}$ of a single $X_t$, say wlog $X_1$."
Why?
Recall that the probability measure on the space of paths
is defined as the pushforward measure $X(P)$ of the measure $P$ on
$\Omega$ under the measurable map
$$X: \Omega \to \mathbb{R}^{\mathbb{R}_{\ge 0}}, \quad \omega \mapsto (X_t(\omega))_{t \ge 0}.$$
It is well known that $X(P)$ is uniquely determined by the finite-dimensional marginal distributions $\pi_{t_1,\dots,t_n}(X(P)) = P_{X_{t_1},\dots,X_{t_n}}$ for all $t_1,\dots,t_n \ge 0$, where $\pi_{t_1,\dots,t_n}: \mathbb{R}^{\mathbb{R}_{\ge 0}} \to \mathbb{R}^n$ is the natural projection map.
Therefore the question is: why, for all $t_1,\dots,t_n$, do the marginal distributions $P_{X_{t_1},\dots,X_{t_n}}$ depend only on $P_{X_1}$?
That is because, for a Lévy process, all the finite-dimensional marginal distributions are determined by the distribution of $X_1$.
Let me start once again with the definition of a Lévy process. A stochastic process $(X_t)_{t \geq 0}$ is a Lévy process if $X_0 = 0$ a.s., $X_t$ has stationary independent increments, and is continuous in probability, i.e. for all $\epsilon>0$ and $t \geq 0$ we have $\mathbb P(|X_{t +h} - X_t| > \epsilon) \to 0$ as $h \to 0$.
The key point is that there are stationary independent increments.
Let the path space of $X$ be denoted $\Omega'$. What are the sigma-algebra and measure on $\Omega'$? As mentioned, $\omega \mapsto (X_t(\omega))_{t \geq 0}$ is a function from $\Omega$ to $\Omega'$.
If we seek to "push" the measure forward, we get the following sigma-algebra $\Sigma'$ on $\Omega'$: for Borel subsets $B_1,\dots,B_n \subset \mathbb R$ and times $0 \leq t_1 < \dots < t_n$, the set $\{X_{t_i} \in B_i \ \forall i\}$ lies in $\Sigma'$, and $\Sigma'$ is generated by these sets (called cylinder sets).
The measure $P'$ on $(\Omega',\Sigma')$ is then determined by $P'(\{X_{t_i} \in B_i \ \forall i\}) = P_{t_1,\dots,t_n}(X_{t_1} \in B_1, \dots, X_{t_n} \in B_n)$, where $P_{t_1,\dots,t_n}$ is the joint distribution of $(X_{t_1},\dots,X_{t_n})$.
But we quickly observe something about the joint distributions. Set $t_0 = 0$, so that $X_{t_0} = 0$ a.s., and consider the increments $Y_i = X_{t_i} - X_{t_{i-1}}$ for $i = 1, \dots, n$. The vector $(X_{t_1},\dots,X_{t_n})$ is the image of $(Y_1,\dots,Y_n)$ under the linear map $$T: (y_1,\dots,y_n) \mapsto (y_1,\ y_1+y_2,\ \dots,\ y_1+\dots+y_n),$$ so $P_{t_1,\dots,t_n}$ is the pushforward under $T$ of the joint law of $(Y_1,\dots,Y_n)$. By independence of increments, that joint law is the product measure $P_{Y_1} \otimes \dots \otimes P_{Y_n}$, and by stationarity $P_{Y_i} = P_{X_{t_i - t_{i-1}}}$. Therefore $$ P_{t_1,\dots,t_n} = T\big( P_{X_{t_1 - t_0}} \otimes \dots \otimes P_{X_{t_n - t_{n-1}}} \big), $$ so the joint distributions are completely determined by the one-dimensional distributions $P_{X_t}$, $t \geq 0$.
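To make this recipe concrete, here is a small Python sketch (my own illustration, not part of the argument) using the rate-$1$ Poisson process, a standard example of a Lévy process: the joint law of $(X_{t_1}, X_{t_2})$ is assembled purely from the one-dimensional marginals of the increments, and as a consistency check the marginal of $X_{t_2}$ recovered from it agrees with the known $\mathrm{Poisson}(t_2)$ distribution.

```python
from math import exp, factorial

def poisson_pmf(lam, k):
    """pmf of a Poisson(lam) variable at the integer k >= 0."""
    return exp(-lam) * lam**k / factorial(k)

t1, t2 = 1.0, 3.0
N = 40  # truncation level; the tail mass beyond N is negligible here

# Joint pmf of (X_{t1}, X_{t2}) built from one-dimensional marginals via the
# increment recipe: P(X_{t1}=a, X_{t2}=b) = P(X_{t1}=a) * P(X_{t2-t1}=b-a).
joint = {(a, b): poisson_pmf(t1, a) * poisson_pmf(t2 - t1, b - a)
         for a in range(N) for b in range(a, N)}

# Consistency check: the marginal of X_{t2} recovered from this joint law
# must equal the known one-dimensional marginal Poisson(t2).
marginal_t2 = [sum(joint.get((a, b), 0.0) for a in range(b + 1)) for b in range(N)]
err = max(abs(marginal_t2[b] - poisson_pmf(t2, b)) for b in range(N))
print(err)  # essentially zero (float round-off)
```

The agreement here is just the fact that the convolution of $\mathrm{Poisson}(t_1)$ and $\mathrm{Poisson}(t_2 - t_1)$ is $\mathrm{Poisson}(t_2)$.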
At this point we have reduced the problem to the one-dimensional distributions, but it is not yet clear why these depend only on $P_{X_1}$. The key is that stationarity and independence of increments let us express the distribution of $X_t$ in terms of that of $X_1$, for every $t$. Let me explain how.
Let's look at $X_2$. We can write $X_2 = X_1 + (X_2 - X_1)$, where by independence and stationarity the increment $X_2 - X_1$ is independent of $X_1$ and has the same distribution as $X_1$. Then $P_{X_2} = P_{X_1} * P_{X_1}$, where $*$ denotes the convolution of probability measures. Thus $X_2$ is determined in distribution by $X_1$. Similarly, you can see why $X_3, X_4$, etc. are also determined in distribution by $X_1$.
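As a quick numerical sanity check (my own illustration), take again the rate-$1$ Poisson process: there $X_1 \sim \mathrm{Poisson}(1)$ and $X_2 \sim \mathrm{Poisson}(2)$, and convolving the pmf of $X_1$ with itself indeed reproduces the pmf of $X_2$.

```python
from math import exp, factorial

def poisson_pmf(lam, k):
    """pmf of a Poisson(lam) variable at the integer k >= 0."""
    return exp(-lam) * lam**k / factorial(k)

N = 30  # truncation level; tail mass beyond N is negligible here
p1 = [poisson_pmf(1.0, k) for k in range(N)]

# Discrete convolution: (P_{X_1} * P_{X_1})(k) = sum_j P(X_1 = j) P(X_1 = k - j)
conv = [sum(p1[j] * p1[k - j] for j in range(k + 1)) for k in range(N)]

p2 = [poisson_pmf(2.0, k) for k in range(N)]
err = max(abs(a - b) for a, b in zip(conv, p2))
print(err)  # essentially zero (float round-off)
```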
At this point, it's helpful to switch to a different language: characteristic functions. The characteristic function of $X_2$ is $\phi_{X_2}(u) = E[e^{iuX_2}] = E[e^{iuX_1}]^2 = \phi_{X_1}(u)^2$, and since the characteristic function determines the distribution, $X_2$ is determined in distribution by $X_1$. In general, $\phi_{X_p} = (\phi_{X_1})^p$ for positive integers $p$.
What about $X_{\frac pq}$ for positive integers $p,q$? Writing $X_p$ as the sum of the increments over the intervals $[\frac{(i-1)p}{q}, \frac{ip}{q}]$, $i = 1, \dots, q$, independence and stationarity say that the sum of $q$ independent copies of $X_{\frac pq}$ has the same distribution as $X_p$. Therefore, in terms of characteristic functions, $(\phi_{X_{\frac pq}})^q = \phi_{X_p} = (\phi_{X_1})^p$, so $\phi_{X_{\frac pq}} = (\phi_{X_1})^{\frac pq}$ (the root is unambiguous because the characteristic function of a Lévy process never vanishes, so one can work with a continuous logarithm). It is difficult to describe $X_{\frac pq}$ in terms of $X_1$ using the probabilities it assigns to various sets, which is why the characteristic function is handy.
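These relations are easy to verify numerically for a concrete example. Below is a short Python sketch (my own illustration) using the rate-$1$ Poisson process, whose characteristic function is $\phi_{X_t}(u) = \exp\big(t(e^{iu}-1)\big)$:

```python
import cmath

def psi(u):
    """Lévy exponent of a rate-1 Poisson process: psi(u) = e^{iu} - 1."""
    return cmath.exp(1j * u) - 1.0

def phi(t, u):
    """Characteristic function of X_t: phi_{X_t}(u) = exp(t * psi(u))."""
    return cmath.exp(t * psi(u))

u = 0.7
p, q = 3, 4

# Integer times: phi_{X_p} = (phi_{X_1})^p
assert abs(phi(p, u) - phi(1, u) ** p) < 1e-12

# Rational times: (phi_{X_{p/q}})^q = phi_{X_p}, i.e. phi_{X_{p/q}} = (phi_{X_1})^{p/q}
assert abs(phi(p / q, u) ** q - phi(p, u)) < 1e-12
print("checks passed")
```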
Now, let $r$ be an arbitrary positive real number and let $\frac{p_n}{q_n}$ be a sequence of rational numbers converging to $r$. Then, by the continuity in probability hypothesis, one can check essentially from the definition that $X_{\frac{p_n}{q_n}} \to X_r$ in probability, and hence in distribution. (Note that convergence in distribution only gives $P(X_r \in B) = \lim_{n \to \infty} P(X_{\frac{p_n}{q_n}} \in B)$ for sets $B$ whose boundary is a $P_{X_r}$-null set, which is another reason to prefer characteristic functions.) Convergence in distribution implies pointwise convergence of the characteristic functions, therefore $$\phi_{X_r} = \lim_{n \to \infty} \phi_{X_{p_n/q_n}} = \lim_{n \to \infty} (\phi_{X_1})^{\frac{p_n}{q_n}} = (\phi_{X_1})^r,$$ which shows that the distribution of $X_r$ is determined by the distribution of $X_1$ for every real $r \ge 0$. One can't express this as a "by definition" determination, because there is no direct formula relating the probability that $X_r$ lies in a set to the probability that $X_1$ lies in that set, but the characteristic function makes the relation between the distributions transparent.
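The limiting step can also be illustrated numerically (again my own sketch, using the rate-$1$ Poisson process as the concrete example): as rational approximations $\frac{p_n}{q_n}$ of an irrational time $r$ improve, $\phi_{X_{p_n/q_n}}(u)$ approaches $\phi_{X_r}(u)$.

```python
import cmath
from fractions import Fraction

def phi(t, u):
    """phi_{X_t}(u) = exp(t * (e^{iu} - 1)) for a rate-1 Poisson process."""
    return cmath.exp(t * (cmath.exp(1j * u) - 1.0))

u = 0.7
r = 2 ** 0.5  # an irrational time

# Rational approximations p_n/q_n -> r give phi_{X_{p_n/q_n}}(u) -> phi_{X_r}(u).
errors = []
for max_q in (10, 100, 1000, 10000):
    t_n = Fraction(r).limit_denominator(max_q)  # a rational p_n/q_n near r
    errors.append(abs(phi(float(t_n), u) - phi(r, u)))

print(errors)  # decreasing toward 0
```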
Thus, for each $r$, $P_{X_r}$ is determined by $P_{X_1}$ via this characteristic function relation.
The above explanation, along with everything we've said before, tells us that the joint distributions $P_{t_1,\dots,t_n}$ are determined by $P_{X_1}$ (since the $B_i$ are arbitrary, the events $\{X_{t_i} \in B_i \ \forall i\}$ generate the same sigma-algebra as $(X_{t_1},\dots,X_{t_n})$).
Therefore, $P'$ is determined on a generating set of $\Sigma'$ by $P_{X_1}$, and hence $P'$ itself is determined by $P_{X_1}$, as desired.
But there is nothing special about the time point $1$, apart from it being non-zero. If one replaces $1$ with an arbitrary real number $r > 0$, then the same argument determines $X_{\frac{rp}{q}}$ in distribution for positive integers $p, q$, and the set of times $\frac{rp}{q}$ is dense in $[0, \infty)$, so by continuity in probability the distribution at every time point is again determined.
Hopefully this explains the otherwise vague slogan a lot better.