Let $E$ be a vector space, with $\dim E = n$.
Let $u$ be an endomorphism of $E$ and let $B = (b_1, \dots, b_n)$ and $C = (c_1, \dots, c_n)$ be two bases of $E$.
Let $\operatorname{Mat}_{BC}(u)$ denote the matrix of $(u(b_1), \dots, u(b_n))$ in basis $C$.
Here is my question: if we introduce two further bases $B', C'$ of $E$, is the following formula right?
$$\operatorname{Mat}_{B'C'}(u) = \left(\operatorname{Mat}_{C'C}(\operatorname{Id}_E)\right)^{-1}\operatorname{Mat}_{BC}(u)\operatorname{Mat}_{B'B}(\operatorname{Id}_E)$$
Secondly, if this is correct, any explanation or interpretation would be greatly appreciated.
Yes, your formula is correct, although it is usually written differently. Let me introduce some notation which will make the formula look more natural. Given an $n$-dimensional vector space $E$ and an ordered basis $B = (b_1, \dots, b_n)$ for $E$, any vector $v \in E$ can be written uniquely as $v = v_1 b_1 + \dots + v_n b_n$, where the scalars $v_i \in \mathbb{F}$ are called the components of the vector $v$ with respect to the basis $B$. Set
$$ [v]_B = \begin{pmatrix} v_1 \\ \vdots \\ v_n \end{pmatrix}. $$
That is, $[v]_B$ is a column vector which contains the components of the vector $v$ with respect to the basis $B$.
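To make this concrete, here is a small numerical sketch of computing $[v]_B$: if $P_B$ is the matrix whose columns are the basis vectors (written in standard coordinates), then $[v]_B$ is the solution of $P_B \, [v]_B = v$. The particular basis and vector below are my own illustrative choices, not taken from the question.

```python
import numpy as np

# Example basis B = (b1, b2) of R^2 (illustrative choice)
b1, b2 = np.array([1.0, 1.0]), np.array([1.0, -1.0])
P_B = np.column_stack([b1, b2])   # columns are the basis vectors

v = np.array([3.0, 1.0])
# [v]_B solves P_B @ [v]_B = v, i.e. v = v1*b1 + v2*b2
v_B = np.linalg.solve(P_B, v)
print(v_B)  # components of v with respect to B
```

Here $v = 2\,b_1 + 1\,b_2$, so the printed component vector is $(2, 1)$.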
Now, let's assume $E,F$ are vector spaces, $B = (b_1, \dots, b_n)$ is a basis for $E$ and $C = (c_1, \dots, c_m)$ is a basis for $F$. Given a linear map $T \colon E \rightarrow F$, there is a unique $m \times n$ matrix denoted by $[T]^B_C$ (you should read it as the matrix representing $T$ with respect to the basis $B$ in the domain and the basis $C$ in the codomain) which satisfies $$ [Tv]_C = [T]^B_C \cdot [v]_B $$
for all $v \in E$. On the right we have the matrix multiplication of the $m \times n$ matrix $[T]^B_C$ by the $n \times 1$ column vector $[v]_B$, so we get an $m \times 1$ column vector which contains the components of $Tv$ with respect to the basis $C$. The way to remember this formula (which is essentially the definition of $[T]_C^B$) is to note that the $B$'s cancel out as in:
$$ \require{cancel} [Tv]_C = [T]^{\cancel{B}}_C \cdot [v]_{\cancel{B}}. $$
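A numerical sketch of this definition: the $j$-th column of $[T]^B_C$ is $[T b_j]_C$, and the identity $[Tv]_C = [T]^B_C \, [v]_B$ can then be checked directly. The map $T$ and the bases below are illustrative choices of mine.

```python
import numpy as np

# Illustrative linear map T : R^2 -> R^3
def T(v):
    x, y = v
    return np.array([x + y, x - y, 2 * y])

# Illustrative bases B of R^2 and C of R^3
B = [np.array([1.0, 0.0]), np.array([1.0, 1.0])]
C = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]),
     np.array([1.0, 1.0, 1.0])]
P_B, P_C = np.column_stack(B), np.column_stack(C)

# j-th column of [T]^B_C is [T b_j]_C
T_BC = np.column_stack([np.linalg.solve(P_C, T(b)) for b in B])

# Check the defining identity [Tv]_C = [T]^B_C [v]_B on a sample v
v = np.array([2.0, -1.0])
v_B = np.linalg.solve(P_B, v)
assert np.allclose(np.linalg.solve(P_C, T(v)), T_BC @ v_B)
```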
Now, let's say we have another linear map $S \colon F \rightarrow G$ and a basis $D = (d_1, \dots, d_l)$ for $G$. The fact that matrix multiplication corresponds to composition of linear maps can be written as
$$ \require{cancel} [S \circ T]^B_D = [S]_D^{\cancel{C}} \cdot [T]_{\cancel{C}}^B $$
where again, on the right you have matrix multiplication and on the left you have the matrix representing the composition.
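This composition rule can also be verified numerically. If we write each basis as the columns of an invertible matrix, then $[A]^{\text{dom}}_{\text{cod}} = P_{\text{cod}}^{-1} A\, P_{\text{dom}}$, and the rule $[S \circ T]^B_D = [S]^C_D \, [T]^B_C$ falls out of matrix algebra. All data below is randomly generated for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, l = 3, 4, 2
P_B = rng.normal(size=(n, n))   # columns: basis B of E
P_C = rng.normal(size=(m, m))   # columns: basis C of F
P_D = rng.normal(size=(l, l))   # columns: basis D of G
T = rng.normal(size=(m, n))     # T : E -> F in standard coordinates
S = rng.normal(size=(l, m))     # S : F -> G in standard coordinates

def mat(A, P_dom, P_cod):
    """Matrix of A with respect to the given domain/codomain bases."""
    return np.linalg.solve(P_cod, A @ P_dom)

lhs = mat(S @ T, P_B, P_D)                  # [S o T]^B_D
rhs = mat(S, P_C, P_D) @ mat(T, P_B, P_C)   # [S]^C_D [T]^B_C
assert np.allclose(lhs, rhs)
```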
In particular, if $E = F$ and $T$ is invertible, then by taking $S = T^{-1}$ we get
$$ \require{cancel} I_n = [\operatorname{Id}_E]_B^B = [T^{-1} \circ T]^B_B = [T^{-1}]_B^{\cancel{C}} \cdot [T]_{\cancel{C}}^B $$
and so
$$ [T^{-1}]_B^{C} = \left( [T]_{C}^B \right)^{-1}. $$
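The inverse rule $[T^{-1}]_B^{C} = \bigl([T]_C^B\bigr)^{-1}$ can likewise be checked with random data (a randomly generated square matrix is invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
P_B = rng.normal(size=(n, n))   # columns: basis B
P_C = rng.normal(size=(n, n))   # columns: basis C
T = rng.normal(size=(n, n))     # invertible endomorphism (generic case)

T_BC = np.linalg.solve(P_C, T @ P_B)                    # [T]^B_C
Tinv_CB = np.linalg.solve(P_B, np.linalg.inv(T) @ P_C)  # [T^{-1}]^C_B
assert np.allclose(Tinv_CB, np.linalg.inv(T_BC))
```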
The relation between the notation I described and your notation is that $\operatorname{Mat}_{BC}(u) = [u]^B_C$. Translating your formula to my notation, we get
$$ [u]^{B'}_{C'} = \left( [\operatorname{Id}_E]^{C'}_{C} \right)^{-1} \cdot [u]^B_C \cdot [\operatorname{Id}_E]^{B'}_B = [\operatorname{Id}_E^{-1} ]^{C}_{C'} \cdot [u]^B_C \cdot [\operatorname{Id}_E]^{B'}_B = [\operatorname{Id}_E ]^{\cancel{C}}_{C'} \cdot [u]^{\cancel{B}}_{\cancel{C}} \cdot [\operatorname{Id}_E]^{B'}_{\cancel{B}} $$
so, after the cancellations, your formula is exactly the statement that matrix multiplication corresponds to composition of linear maps.
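As a final sanity check, here is a numerical verification of the original formula $\operatorname{Mat}_{B'C'}(u) = \left(\operatorname{Mat}_{C'C}(\operatorname{Id}_E)\right)^{-1}\operatorname{Mat}_{BC}(u)\operatorname{Mat}_{B'B}(\operatorname{Id}_E)$, with four random bases and a random endomorphism $u$ of $\mathbb{R}^n$:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
P_B, P_C = rng.normal(size=(n, n)), rng.normal(size=(n, n))
P_Bp, P_Cp = rng.normal(size=(n, n)), rng.normal(size=(n, n))  # B', C'
U = rng.normal(size=(n, n))  # u in standard coordinates

def mat(A, P_dom, P_cod):
    """Matrix of A from basis (columns of P_dom) to basis (columns of P_cod)."""
    return np.linalg.solve(P_cod, A @ P_dom)

I = np.eye(n)
lhs = mat(U, P_Bp, P_Cp)                       # Mat_{B'C'}(u)
rhs = (np.linalg.inv(mat(I, P_Cp, P_C))        # (Mat_{C'C}(Id))^{-1}
       @ mat(U, P_B, P_C)                      # Mat_{BC}(u)
       @ mat(I, P_Bp, P_B))                    # Mat_{B'B}(Id)
assert np.allclose(lhs, rhs)
```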