It was explained to me that a vector field, when seen as "arrows" in the plane, has divergence 0 when its magnitude doesn't change, i.e. when the "arrows" keep the same length. But the following examples puzzle me:
$F(x)=x/|x|$ always has norm 1, but its divergence is not 0
$F(x)=x/|x|^2$ does not have constant norm, but its divergence is 0
Is there some contradiction or do I have a wrong/incomplete picture?
Divergence has nothing (little?) to do with norms of the vectors.
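One standard way to make this precise (a sketch using the limit definition of divergence, which the rest of this answer doesn't rely on) is to measure the outward flux per unit area through a small disk $B_r(p)$ around a point $p$:
$$\operatorname{div} v(p) = \lim_{r \to 0} \frac{1}{\pi r^2} \int_{\partial B_r(p)} v \cdot n \, ds.$$
A constant field has zero net flux through any closed curve, so the length of the arrow at $p$ by itself contributes nothing to this limit; only how the arrows vary near $p$ matters.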
Think instead of drawing a closed region $\Omega$ in the plane, and of the arrows as measuring the velocity of material flowing through the plane.
The region that you drew encloses some amount of material. As the material flows, that amount changes: in particular, it is common sense that the rate at which the amount of enclosed material changes is equal to the rate at which material is crossing the boundary of the region.
You can think of the divergence at a point as measuring the rate at which the density of material is decreasing at that point. The rate at which the total amount of material in $\Omega$ is decreasing is then
$$\int_\Omega \operatorname{div} v$$
and the rate at which material is passing outward through the boundary of $\Omega$ is $$\int_{\partial \Omega} v \cdot n,$$ where $n$ is the outward unit normal; the fact that these two must be equal is exactly the divergence theorem.
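To see how this plays out in your two examples, here is a quick sketch of the computation in the plane (away from the origin), writing the point as $(x,y)$ and $r = \sqrt{x^2+y^2}$:
$$\operatorname{div}\frac{(x,y)}{r} = \frac{\partial}{\partial x}\frac{x}{r} + \frac{\partial}{\partial y}\frac{y}{r} = \frac{2}{r} - \frac{x^2+y^2}{r^3} = \frac{1}{r} \neq 0,$$
$$\operatorname{div}\frac{(x,y)}{r^2} = \frac{\partial}{\partial x}\frac{x}{r^2} + \frac{\partial}{\partial y}\frac{y}{r^2} = \frac{2}{r^2} - \frac{2(x^2+y^2)}{r^4} = 0.$$
In the first field the arrows all have length 1, but material moving outward spreads over larger and larger circles, so it thins out and the divergence is positive. In the second field the arrows also fan out, but their length shrinks like $1/r$ at exactly the rate needed to compensate: the flux through every circle centered at the origin is the same ($2\pi$), so the divergence is 0. There is no contradiction; divergence is about how the flow spreads or compresses, not about whether the arrows keep the same length.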