How do I stop my normals from spreading out under perturbation?


In many generative AI models you start with a so-called latent vector $\mathbf{z}$ of high dimension $d$ such that each $z_i \sim \mathcal{N}(0,1)$. I'd like to randomly perturb this vector in such a way that the perturbed version has the same distribution as $\mathbf{z}$ (i.e. a bunch of independent standard normals). When I try something like this

$$\mathbf{z} \rightarrow \mathbf{z} + \Delta \mathbf{w}$$

where $\Delta$ is a small number and $\mathbf{w}$ is another vector of independent standard normals, the values start to spread out after enough iterations:

[plot: the distribution after repeated perturbations]

It looks like my new distribution is spreading out (albeit symmetrically). Two questions:
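For concreteness, here is a minimal sketch of the update above (in Python with NumPy; the language and all parameter values are my own assumptions, not from the post). After $n$ steps each coordinate is a sum of independent normals, so its standard deviation grows like $\sqrt{1 + n\Delta^2}$, which is the spreading seen in the plot:

```python
import numpy as np

rng = np.random.default_rng(0)

d = 10_000       # latent dimension (illustrative value)
delta = 0.05     # small perturbation scale Δ
n_steps = 400    # number of update iterations

z = rng.standard_normal(d)                   # z_i ~ N(0, 1)
for _ in range(n_steps):
    z = z + delta * rng.standard_normal(d)   # z -> z + Δ w

# Each z_i is now N(0, 1 + n Δ²), so the sample std has grown:
print(z.std())   # ≈ sqrt(1 + 400 * 0.05²) = sqrt(2) ≈ 1.41
```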

  1. How do I adjust my update so that the standard deviation stops growing? I assume the correction is a function of $\Delta$ and maybe the dimension $d$?
  2. Is accounting for the standard deviation enough, or are there higher moments that need to be adjusted as well? I've plotted the first 4 and they look OK (code for reference), but I'd like to understand this from a mathematical perspective.
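As an empirical illustration of question 2 (a sketch with assumed NumPy code and made-up parameters, not the post's actual plotting code): a sum of independent Gaussians is itself Gaussian, so after the update each $z_i$ remains exactly normal, just with a larger variance. The standardized higher moments should therefore stay at their Gaussian values (skewness $\approx 0$, kurtosis $\approx 3$) even as the scale grows:

```python
import numpy as np

rng = np.random.default_rng(1)
d, delta, n_steps = 100_000, 0.05, 400   # illustrative values

z = rng.standard_normal(d)
for _ in range(n_steps):
    z = z + delta * rng.standard_normal(d)

c = z - z.mean()
std = c.std()
skew = np.mean((c / std) ** 3)   # stays ≈ 0 (Gaussian value)
kurt = np.mean((c / std) ** 4)   # stays ≈ 3 (Gaussian value)
print(std, skew, kurt)
```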