I am trying to simulate a simple SDE $$dS = \mu S\,dt + \sigma S\,dW_t$$ with $\mu,\sigma \in\mathbb{R}$. When I use the scheme
\begin{align} S_{n+1} &= S_n + \mu h S_n + \sigma S_n N(0,h) \end{align}
the results do not make sense. But when I use
\begin{align} S_{n+1} &= S_n + \mu h S_n + \sigma S_n \sqrt{h}\, N(0,1), \end{align} where $h$ is the step size of the grid $$t_0 < t_1 < \dots < t_n \text{ with } |t_{i+1} - t_i| = h,$$ the simulation looks correct, but I do not understand why the first scheme fails. The Wikipedia page says that $$W_{t_2} = W_{t_1} + \sqrt{t_2-t_1}\,N(0,1) \quad \text{for } t_2 > t_1$$ "is useful for simulations", but I do not see why this is the case. Is the first method bad for numerical reasons, or is there a problem with the method itself?
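For concreteness, here is a minimal sketch of the second (working) scheme in NumPy; the parameter values are made up for illustration. One detail worth noting when comparing the two schemes in code: `np.random` routines take the *standard deviation* as the scale argument, not the variance, so a draw from $N(0,h)$ must be generated as `sqrt(h) * standard_normal()` (or `normal(0, sqrt(h))`), never `normal(0, h)`.

```python
import numpy as np

# Euler-Maruyama for dS = mu*S dt + sigma*S dW_t (geometric Brownian motion).
# Illustrative parameters (assumptions, not from the question):
mu, sigma = 0.05, 0.2
S0, T, n = 1.0, 1.0, 1000
h = T / n  # step size |t_{i+1} - t_i|

rng = np.random.default_rng(0)
S = np.empty(n + 1)
S[0] = S0
for i in range(n):
    # Brownian increment over a step of length h: N(0, h) = sqrt(h) * N(0, 1)
    dW = np.sqrt(h) * rng.standard_normal()
    S[i + 1] = S[i] + mu * h * S[i] + sigma * S[i] * dW

print(S[-1])
```

The increments `dW` have variance $h$, matching $W_{t_{i+1}} - W_{t_i} \sim N(0, h)$.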