How to normalize a sum of noise signals (fbm)?


Given an artificially generated, centered noise signal of amplitude 1 (values bounded within [-½;½]), how can we determine the amplitude of the following fbm() function?

$$ \text{fbm}(t) = \displaystyle\sum_{k=0}^{\text{octaves}-1} \text{noise}(t \times \text{lacunarity}^k) \times \text{gain}^k $$

$\text{noise}()$ is a 1D gradient noise whose gradients (slopes) lie between -45° and 45°. Its empirical distribution roughly resembles a Laplace distribution (see the linked gradient-noise distribution plot). A major difference from the Laplace distribution is that its support is bounded to [-½;½] (instead of $\mathbb{R}$, where the density converges to 0 but never reaches it).
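To make the setup concrete, here is a minimal sketch of the fbm sum above. The `noise()` stand-in is hypothetical (a smooth product of sinusoids bounded in [-½;½]); any centered noise with that range would fit the definition:

```python
import math

def noise(t: float) -> float:
    # Hypothetical stand-in for 1D gradient noise: smooth, centered,
    # and bounded in [-1/2, 1/2]. Not real gradient noise; illustration only.
    return 0.5 * math.sin(1.7 * t + 0.3) * math.cos(0.9 * t)

def fbm(t: float, octaves: int = 5, lacunarity: float = 2.0,
        gain: float = 0.5) -> float:
    # Sum progressively finer (lacunarity^k) and weaker (gain^k) octaves,
    # matching fbm(t) = sum_{k=0}^{octaves-1} noise(t * lacunarity^k) * gain^k.
    return sum(noise(t * lacunarity ** k) * gain ** k
               for k in range(octaves))
```

Note that with `gain = 0.5` and 5 octaves, `|fbm(t)|` can reach up to ½·(1 + ½ + ¼ + ⅛ + ¹⁄₁₆) ≈ 0.97, which is exactly the amplitude-growth problem the question is about.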

Given the following to simplify the notation:

  • $X_k=\text{noise}(t \times \text{lacunarity}^k)$ ($\text{lacunarity}$ and $t$ don't really matter for its distribution)
  • $r=\text{gain}$
  • $N=\text{octaves} - 1$

We can write fbm like this:

$$ \text{fbm}(t) = \displaystyle\sum_{k=0}^{N} X_k r^k $$

Now, what is $f$ such that $\frac{\text{fbm}(t)}{f(r,N)}$ fits as well as possible within [-½;½] (amplitude 1, just like the "vanilla" $\text{noise}()$ signal)?
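For reference, the triangle inequality gives one natural candidate: a worst-case normalizer, assuming only that each $|X_k| \le \tfrac{1}{2}$. It guarantees the output stays in [-½;½], but it may over-normalize typical signals, since the octaves rarely peak simultaneously:

$$ |\text{fbm}(t)| \le \frac{1}{2}\sum_{k=0}^{N} r^k = \frac{1}{2}\cdot\frac{1-r^{N+1}}{1-r} \quad (r \ne 1), $$

so taking $f(r,N) = \frac{1-r^{N+1}}{1-r}$ bounds $\frac{\text{fbm}(t)}{f(r,N)}$ within [-½;½] in the worst case; whether it is the *best* fit to the observed data is exactly what is asked below.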

By empirically measuring $f(r,N)$ over a range of parameters, we get: https://i.imgur.com/gol9BEc.mp4
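A rough way to reproduce such measurements is a Monte-Carlo sketch like the one below. It replaces `noise()` with i.i.d. uniform samples in [-½;½] (an assumption: real gradient noise is correlated and Laplacian-like, so this leans toward the worst case), then records the peak amplitude of the weighted sum:

```python
import numpy as np

def estimate_peak(r: float, N: int, samples: int = 200_000,
                  seed: int = 0) -> float:
    # Empirical peak |fbm| when each octave X_k is replaced by an
    # i.i.d. uniform sample in [-1/2, 1/2]. This is a stand-in, not
    # the actual gradient noise() from the question.
    rng = np.random.default_rng(seed)
    X = rng.uniform(-0.5, 0.5, size=(N + 1, samples))
    gains = r ** np.arange(N + 1)
    fbm = gains @ X  # sum_k X_k * r^k, one value per sample
    return float(np.abs(fbm).max())
```

Dividing the result by ½ (the target half-range) gives an empirical estimate of $f(r,N)$ to compare against any closed-form candidate.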

But what is the actual formula?