I am trying to simulate the following random process:
$$ \frac{dy}{dt} = f(y) + AN(t) $$
where $N(t)$ is additive Gaussian white noise, which, if I remember right, is defined by $\int_0^t N(t')\, dt'$ having zero mean and variance $t$.
Aside from the noise, I know I can simulate $\frac{dy}{dt} = f(y)$ as
$$y[k+1] = y[k] + f(y[k])\Delta t$$
as long as the system's dynamics are much slower than the timestep $\Delta t$; for example, if $f(y) = y/\tau$, the requirement is $\Delta t \ll \tau$.
How does the simulation get changed when adding noise?
I think the noise term scales with the timestep: if you halve $\Delta t$, the variance of each noise increment must also halve, i.e. the standard deviation shrinks by a factor of $\sqrt{2}$. So I think this would work:
$$y[k+1] = y[k] + f(y[k])\Delta t + An[k]\sqrt{\Delta t}$$
where $n[k]$ is a sequence of independent samples from a Gaussian distribution with zero mean and $\sigma = 1$.
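For concreteness, here is a sketch of what I mean in Python, using $f(y) = -y/\tau$ (an Ornstein–Uhlenbeck process) as a test case; the values of `tau`, `A`, `dt`, and `n_steps` are just illustrative choices:

```python
import numpy as np

# Proposed scheme: y[k+1] = y[k] + f(y[k])*dt + A*n[k]*sqrt(dt)
# applied to the test case f(y) = -y/tau.
rng = np.random.default_rng(0)

tau = 1.0        # relaxation time of f(y) = -y/tau
A = 0.5          # noise amplitude
dt = 0.01 * tau  # timestep, chosen so that dt << tau
n_steps = 200_000

y = np.empty(n_steps)
y[0] = 0.0
for k in range(n_steps - 1):
    n_k = rng.standard_normal()  # zero mean, unit variance
    y[k + 1] = y[k] + (-y[k] / tau) * dt + A * n_k * np.sqrt(dt)

# For this choice of f, the exact stationary variance is A**2 * tau / 2,
# so the late part of the trajectory should roughly match that.
print(np.var(y[n_steps // 2:]))
```

If the $\sqrt{\Delta t}$ scaling is right, the sample variance printed at the end should stay near $A^2\tau/2$ regardless of how small I make $\Delta t$.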
Is this correct?