Q1) We say that $X_n\to X$ in probability if $$\forall \varepsilon>0, \lim_{n\to \infty }\mathbb P\{|X_n-X|>\varepsilon\}=0.$$
What does it mean concretely? What is the interpretation behind it?
Q2) What would be the difference between
1) $$\forall \varepsilon>0, \lim_{n\to \infty }\mathbb P\{|X_n-X|\leq \varepsilon\}=1$$
2) $$\forall \varepsilon>0, \mathbb P\{\lim_{n\to \infty }|X_n-X|\leq \varepsilon\}=1$$
3) $$\mathbb P\{\forall \varepsilon>0, \lim_{n\to \infty }|X_n-X|\leq \varepsilon\}=1$$
4) $$\lim_{n\to \infty }\mathbb P\{\forall \varepsilon>0, |X_n-X|\leq \varepsilon\}=1.$$
I'm not really sure how to interpret these four limits, since they look almost the same to me. I can see that 1) is nothing more than convergence in probability. If someone could explain the difference between all these limits to me, it would help me a great deal to better understand these notions of convergence.
About your first question: basically, convergence in probability is nothing other than $L^1$ convergence once you have decided to ignore high peaks (and here you can choose the height threshold as small or as large as you please). In fact, convergence in probability is equivalent to $$\forall M>0,\ \int(|X_n-X|\land M )\, d\mathbb{P}\rightarrow0,\quad n\rightarrow\infty$$ and also to $$\exists M>0,\ \int(|X_n-X|\land M )\, d\mathbb{P}\rightarrow0,\quad n\rightarrow\infty,$$ where $a\land b:=\min(a,b)$.
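As a quick numerical illustration of this equivalence (a sketch of my own, not part of the argument): take the classical peak example $X_n = n\cdot\mathbf 1_{[0,1/n]}$ on $[0,1]$ with Lebesgue measure. It converges to $0$ in probability but not in $L^1$, while the truncated integrals do vanish. The sample size and the values of $\varepsilon$, $M$ below are arbitrary choices:

```python
import numpy as np

# Peak example on ([0,1], Lebesgue): X_n = n on [0, 1/n], 0 elsewhere.
# X_n -> 0 in probability, but E|X_n| = 1 for every n (no L^1 convergence),
# while the truncated mean E(|X_n| /\ M) = M/n -> 0 for any fixed M.
rng = np.random.default_rng(0)
u = rng.random(10**6)  # Monte Carlo samples of a Uniform(0,1) "omega"

def X(n, u):
    return np.where(u <= 1.0 / n, float(n), 0.0)

n, eps, M = 1000, 0.5, 5.0
x = X(n, u)
prob_exceed = np.mean(x > eps)         # ~ 1/n:  P(|X_n| > eps) -> 0
l1_norm = np.mean(x)                   # ~ 1:    E|X_n| does NOT go to 0
truncated = np.mean(np.minimum(x, M))  # ~ M/n:  E(|X_n| /\ M) -> 0
print(prob_exceed, l1_norm, truncated)
```

So the untruncated $L^1$ norm stays near $1$ while both the probability of a deviation and the truncated integral shrink like $1/n$.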
Edit: let me explain why you should fall in love with this result (if it isn't clear yet).
Now let's prove the claims. To simplify the notation, define $Y_n:=|X_n-X|$. What we have to prove is the equivalence of:
1) $\forall\varepsilon>0,\ \mathbb{P}(Y_n>\varepsilon)\rightarrow0,\ n\rightarrow\infty$;
2) $\forall M>0,\ \int(Y_n\land M )\, d\mathbb{P}\rightarrow0,\ n\rightarrow\infty$;
3) $\exists M>0,\ \int(Y_n\land M )\, d\mathbb{P}\rightarrow0,\ n\rightarrow\infty$.
That 1) implies 2) is a simple consequence of the fact that, for each $M>0$, 1) implies that the sequence $(Y_n\land M)_{n\in\mathbb{N}}$ converges in probability to zero; by the variant of the bounded convergence theorem in which convergence in probability takes the place of a.s. convergence, we get 2).
The fact that 2) implies 3) is obvious.
Now assume 3), say for some $M>0$. Suppose, to get a contradiction, that 1) does not hold. Then there exist $\varepsilon$ with $0<\varepsilon <M$, $\delta>0$, and a strictly increasing sequence of positive integers $(n_k)_{k\in\mathbb{N}}$ such that $$\forall k\in\mathbb{N}, \mathbb{P}(Y_{n_k}>\varepsilon)\ge\delta.$$ Then $$\forall k\in\mathbb{N}, \delta\le\mathbb{P}(Y_{n_k}>\varepsilon)\le\mathbb{P}(Y_{n_k}-(Y_{n_k}\land M)>\frac{\varepsilon}{2})+\mathbb{P}(Y_{n_k}\land M>\frac{\varepsilon}{2})\\\le\mathbb{P}(\{Y_{n_k}-(Y_{n_k}\land M)>\frac{\varepsilon}{2}\}\cap\{Y_{n_k}\ge M\})+\mathbb{P}(\{Y_{n_k}-(Y_{n_k}\land M)>\frac{\varepsilon}{2}\}\cap\{Y_{n_k}< M\})+\mathbb{P}(Y_{n_k}\land M>\frac{\varepsilon}{2})\\=\mathbb{P}(Y_{n_k}>M+\frac{\varepsilon}{2})+\mathbb{P}(\emptyset)+\mathbb{P}(Y_{n_k}\land M>\frac{\varepsilon}{2})\\\le\mathbb{P}(Y_{n_k}>M+\frac{\varepsilon}{2})+\frac{2}{\varepsilon}\int Y_{n_k}\land M\,d\mathbb{P},$$ using Markov's inequality in the last step. Since $\int Y_{n_k}\land M\,d\mathbb{P}\rightarrow0$, it follows that $$\liminf_{k\rightarrow\infty}\mathbb{P}(Y_{n_k}>M+\frac{\varepsilon}{2})\ge\delta.$$ Then $$0=\lim_{k\rightarrow\infty}\int Y_{n_k}\land M\,d\mathbb{P}\ge\liminf_{k\rightarrow\infty}\int_{Y_{n_k}>M+\frac{\varepsilon}{2}} Y_{n_k}\land M\,d\mathbb{P}\\\ge\liminf_{k\rightarrow\infty}M\,\mathbb{P}(Y_{n_k}>M+\frac{\varepsilon}{2})\ge M\delta>0,$$ which is absurd. So 3) implies 1).
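The key estimate in the chain above, $\mathbb{P}(Y>\varepsilon)\le\mathbb{P}(Y>M+\frac{\varepsilon}{2})+\frac{2}{\varepsilon}\int Y\land M\,d\mathbb{P}$, can be sanity-checked numerically. A minimal sketch, where the exponential distribution for $Y$ and the values of $\varepsilon$, $M$ are just arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.exponential(scale=0.2, size=10**6)  # some nonnegative Y, just for the check
eps, M = 0.5, 3.0

lhs = np.mean(y > eps)                          # P(Y > eps)
tail = np.mean(y > M + eps / 2)                 # P(Y > M + eps/2)
markov = (2 / eps) * np.mean(np.minimum(y, M))  # (2/eps) * E(Y /\ M), Markov bound
rhs = tail + markov
print(lhs, rhs)
```

The decomposition splits the event $\{Y>\varepsilon\}$ according to whether the excess over the truncation is large (the tail term) or the truncated part itself is large (the Markov term).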
About your second question: the first is equivalent to convergence in probability; the second and the third are nothing other than almost sure convergence; while the fourth is something strange: it is strictly stronger than convergence in probability (e.g. $X_n=1/n$ converges to zero in probability but doesn't satisfy 4)), yet it doesn't imply a.s. convergence (as the typewriter sequence of functions shows), nor is it implied by a.s. convergence (as $X_n=1/n$ shows again). Basically, 4) states that the probability of the set where $X_n$ and $X$ are equal tends to $1$.
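The typewriter sequence can be made concrete (a sketch; this dyadic enumeration is one standard choice): write $n=2^k+j$ with $0\le j<2^k$ and set $X_n=\mathbf 1_{[j2^{-k},(j+1)2^{-k})}$ on $[0,1]$. Then $\mathbb{P}(X_n=1)=2^{-k}\rightarrow0$, so 4) holds, yet for every fixed $\omega$ the value $X_n(\omega)$ equals $1$ once in each dyadic level, so it never converges pointwise:

```python
def typewriter_interval(n):
    # Write n = 2^k + j with 0 <= j < 2^k;
    # X_n is the indicator of the dyadic interval [j/2^k, (j+1)/2^k).
    k = n.bit_length() - 1
    j = n - (1 << k)
    return j / 2**k, (j + 1) / 2**k

omega = 0.3  # any fixed point of [0,1) works the same way
hits = [n for n in range(1, 1 << 10)
        if typewriter_interval(n)[0] <= omega < typewriter_interval(n)[1]]
# X_n(omega) = 1 exactly once per dyadic level k = 0, ..., 9, so X_n(omega) = 1
# infinitely often: no pointwise (hence no a.s.) limit, even though the
# intervals' lengths 2^{-k}, i.e. P(X_n = 1), tend to 0.
print(len(hits))
```

Here `hits` collects the indices $n < 2^{10}$ with $X_n(\omega)=1$: one per level $k=0,\dots,9$, which is why the sequence keeps returning to $1$ at every $\omega$.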