I've studied Markov processes with 2x2 matrices. Using linear algebra and calculus, it is clear to me how a Markov chain works.
However, I'm still not able to grasp the intuitive, immediate meaning of a Markov chain. Intuitively, why, as $n\rightarrow +\infty$, is the state of the system independent of both the initial state and the states reached during the process?

It does depend on the initial state. Consider the Markov chain consisting of two disjoint states, i.e. with transition matrix $$\begin{bmatrix} 1 & 0 \\ 0 & 1\end{bmatrix}$$
If you start in state $1$, you will be stuck in state $1$ forever, since it is an absorbing state; similarly if you start in state $2$.
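A minimal sketch of this (plain Python; the helper `step` is just an illustrative name): with the identity matrix, the chain keeps whatever distribution it starts with, so the initial state is never forgotten.

```python
def step(dist, P):
    """One step of the chain: row vector `dist` times transition matrix `P`."""
    return [dist[0] * P[0][0] + dist[1] * P[1][0],
            dist[0] * P[0][1] + dist[1] * P[1][1]]

I = [[1.0, 0.0],
     [0.0, 1.0]]      # both states are absorbing
d = [1.0, 0.0]        # start in state 1 with probability 1
for _ in range(100):
    d = step(d, I)
print(d)              # still [1.0, 0.0]: the initial state is never forgotten
```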
The examples you have encountered probably have a unique stationary distribution; in that case, no matter where you begin, the chain ends up there.
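By contrast, here is a sketch of such a chain (the matrix `P` below is an example I made up, with stationary distribution $(5/6, 1/6)$): iterating from either starting state converges to the same distribution, which is the "forgetting the initial state" behaviour you asked about.

```python
def step(dist, P):
    """One step of the chain: row vector `dist` times transition matrix `P`."""
    return [dist[0] * P[0][0] + dist[1] * P[1][0],
            dist[0] * P[0][1] + dist[1] * P[1][1]]

P = [[0.9, 0.1],
     [0.5, 0.5]]          # each state reachable from the other: unique stationary distribution
a = [1.0, 0.0]            # start in state 1
b = [0.0, 1.0]            # start in state 2
for _ in range(200):
    a, b = step(a, P), step(b, P)
# both a and b are now very close to the stationary distribution [5/6, 1/6]
print(a, b)
```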