I know that the chain rule for mutual information $I(X_1,...,X_n;Y)$ states that:
$$I(X_1,...,X_n;Y) = I(X_1;Y) + I(X_2;Y|X_1) + ... + I(X_n;Y|X_1,...,X_{n-1})$$
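As a quick numerical sanity check (not from the lecture notes), here is a minimal NumPy sketch that verifies the $n=2$ case of the chain rule, $I(X_1,X_2;Y) = I(X_1;Y) + I(X_2;Y|X_1)$, on an arbitrary discrete joint distribution; the distribution itself is randomly generated and purely illustrative:

```python
import numpy as np

def mutual_info(pxy):
    """I(X;Y) in nats from a 2-D joint distribution pxy[x, y]."""
    px = pxy.sum(axis=1, keepdims=True)   # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)   # marginal p(y)
    mask = pxy > 0                        # skip zero-probability cells
    return float((pxy[mask] * np.log(pxy[mask] / (px * py)[mask])).sum())

def cond_mutual_info(p):
    """I(X2;Y | X1) from a 3-D joint distribution p[x1, x2, y]."""
    px1 = p.sum(axis=(1, 2))              # marginal p(x1)
    total = 0.0
    for i, w in enumerate(px1):
        if w > 0:
            # weight each conditional mutual information by p(x1)
            total += w * mutual_info(p[i] / w)
    return total

# arbitrary joint distribution over (X1, X2, Y) with sizes 2 x 3 x 4
rng = np.random.default_rng(0)
p = rng.random((2, 3, 4))
p /= p.sum()

lhs = mutual_info(p.reshape(6, 4))                       # I(X1,X2;Y)
rhs = mutual_info(p.sum(axis=1)) + cond_mutual_info(p)   # I(X1;Y) + I(X2;Y|X1)
print(np.isclose(lhs, rhs))  # True
```

Since the chain rule is an identity, the two sides agree up to floating-point error for any joint distribution you plug in.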
But the lecture notes for one of the courses in my degree say that:
and I do not really see how they get there. The notes are describing the mutual information for a Markov chain with $X$, $Y$ and $Z$.

$X-Y-Z$ form a Markov chain iff $I(X;Z|Y)=0$. Both decompositions are possible via the chain rule:
$$I(X;Y,Z) = I(X;Y) + I(X;Z|Y) = I(X;Z) + I(X;Y|Z).$$
Since $I(X;Z|Y)=0$ for the Markov chain, this gives $I(X;Z) = I(X;Y) - I(X;Y|Z)$, and because mutual information is always greater than or equal to zero, $I(X;Z) \le I(X;Y)$ (the data processing inequality).
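To see this argument concretely, here is a small NumPy sketch (my own illustration, not from the notes) that builds a random joint distribution with the Markov structure $p(x,y,z)=p(x)\,p(y|x)\,p(z|y)$, then checks both that $I(X;Z|Y)=0$ and that $I(X;Z) \le I(X;Y)$:

```python
import numpy as np

def mutual_info(pxy):
    """I(X;Y) in nats from a 2-D joint distribution pxy[x, y]."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float((pxy[mask] * np.log(pxy[mask] / (px * py)[mask])).sum())

rng = np.random.default_rng(1)
px = rng.random(3); px /= px.sum()                       # p(x)
py_x = rng.random((3, 3)); py_x /= py_x.sum(axis=1, keepdims=True)  # p(y|x)
pz_y = rng.random((3, 3)); pz_y /= pz_y.sum(axis=1, keepdims=True)  # p(z|y)

# joint p(x,y,z) = p(x) p(y|x) p(z|y)  =>  X - Y - Z is a Markov chain
p = px[:, None, None] * py_x[:, :, None] * pz_y[None, :, :]

# I(X;Z|Y) = sum_y p(y) * I(X;Z | Y=y)
py = p.sum(axis=(0, 2))
i_xz_given_y = sum(w * mutual_info(p[:, j, :] / w)
                   for j, w in enumerate(py) if w > 0)

i_xy = mutual_info(p.sum(axis=2))  # I(X;Y), marginalizing out Z
i_xz = mutual_info(p.sum(axis=1))  # I(X;Z), marginalizing out Y

print(np.isclose(i_xz_given_y, 0.0), i_xz <= i_xy)  # True True
```

By construction $Z$ depends on $X$ only through $Y$, so the conditional mutual information vanishes (up to floating-point noise) and the data processing inequality holds.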