In some proofs where the aim is to show that a random variable, say $X_n$, converges in $L^2$ to another random variable $X$, where $X$ is a priori unknown, I notice that sometimes one computes $\mathbb E[X_n]$ and then proves that in fact $X_n\to \mathbb E[X_n]$ in $L^2$.
What I don't understand is whether there is some relation between the $L^2$-limit $X$ and $\mathbb E[X_n]$, so that there is a good chance that $\mathbb E[X_n]$ actually coincides with the $L^2$-limit, or whether there is no such relation and it is merely a "right guess" by the author.
First of all, note that in order for your statements to make sense, you need $\mathbb E[X_n] = c$ to be independent of $n$.
In probability theory one often wants to prove some kind of law of large numbers, i.e. that the empirical mean converges to the expected value. The simplest version is: $$ X_i\stackrel{\rm i.i.d.}{\sim}X \quad \implies \quad S_n :=\frac{1}{n}\sum_{i=1}^{n} X_i \xrightarrow{n\to\infty}\mathbb E[X] $$ (under suitable assumptions, e.g. finite variance for $L^2$-convergence).
(My notation slightly varies from yours, since here $S_n\to\mathbb E[X]$ and not $X_n\to \mathbb E[X]$.)
As an intuitive example, if you throw a fair die again and again and compute the mean of the outcomes after e.g. $1000$ throws, the result will typically be close to $3.5$, which is the expected value $\mathbb E[\text{one fair die roll}]$.
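You can check this numerically with a quick sketch, simulating the die with Python's standard `random` module (the seed and the number of throws are arbitrary choices for the illustration):

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

n = 1000
# simulate n throws of a fair six-sided die
rolls = [random.randint(1, 6) for _ in range(n)]
empirical_mean = sum(rolls) / n
print(empirical_mean)  # typically close to the expected value 3.5
```

Increasing `n` makes the empirical mean concentrate ever more tightly around $3.5$, which is exactly the law-of-large-numbers behaviour described above.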
Note that in fact $\mathbb E[S_n] = \frac{1}{n}\sum_{i=1}^{n} \mathbb E[X_i] = \frac{1}{n}\sum_{i=1}^{n} \mathbb E[X] = \mathbb E[X]$.
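To see why the convergence actually holds in $L^2$ (assuming in addition that $X$ has finite variance), one can spell out the mean-square error directly, combining the computation above with the independence of the $X_i$:

$$\mathbb E\big[(S_n - \mathbb E[X])^2\big] = \operatorname{Var}(S_n) = \frac{1}{n^2}\sum_{i=1}^{n}\operatorname{Var}(X_i) = \frac{\operatorname{Var}(X)}{n} \xrightarrow{n\to\infty} 0.$$

So the constant $\mathbb E[S_n] = \mathbb E[X]$ is not a lucky guess: it is the only possible $L^2$-limit, since the mean-square distance to it is exactly $\operatorname{Var}(S_n)$.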
This is an intuitive but also very strong and fundamental result of probability theory that keeps reappearing in various forms.