For uni I have an exercise where I need to simulate Brownian motion with drift:

I'm doing this with the following Python code:
import numpy as np
import math
from matplotlib import pyplot as plt
import scipy.stats as stats

class BMD:  # Brownian motion with drift
    def __init__(self, drift, variance_term, m, T):
        self.dt = T / m
        self.Z = np.random.standard_normal(m)
        self.arr = np.zeros(m + 1)
        for i in range(m):
            self.arr[i + 1] = self.arr[i] + variance_term * math.sqrt(self.dt) * self.Z[i] + drift * self.dt
        self.end = self.arr[-1]

N = 1000
drift = 0
variance_term = 0.25
m = 200
T = 2

fig, axs = plt.subplots(2)
ends = np.zeros(N)
for i in range(N):
    bmd = BMD(drift, variance_term, m, T)
    ends[i] = bmd.end
    axs[0].plot(np.arange(0, m + 1), bmd.arr)
axs[1].hist(ends, density=True, bins=100)

exp_mu = drift * T
exp_sigma = (variance_term**2) * T
x = np.linspace(exp_mu - 3 * exp_sigma, exp_mu + 3 * exp_sigma, 100)
axs[1].plot(x, stats.norm.pdf(x, exp_mu, exp_sigma))
plt.show()
The problem is that when simulating I get the following image:
The orange line shows what the histogram is supposed to look like. I think my BMD definition is correct, but I don't know how to set T in my code: when I set T=2, the histogram is too low, and when I set T=365*2, the histogram is too high. But this should not matter, right? Changing the unit of time should not influence the outcome of the simulations. How do I fix this? Any help is appreciated!
The second argument (`scale`) of `stats.norm.pdf` is the standard deviation, not the variance. After fixing this bug, the expected PDF matches your histogram much more closely.
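As a minimal sketch of the fix, using the same parameter values as your snippet: the terminal value of the process is normal with mean `drift * T` and standard deviation `variance_term * sqrt(T)`, and that standard deviation (not the variance `variance_term**2 * T`) is what `scale` expects.

```python
import math

import numpy as np
import scipy.stats as stats

drift, variance_term, T = 0, 0.25, 2

exp_mu = drift * T                        # mean of B(T): drift * T
exp_std = variance_term * math.sqrt(T)    # std dev of B(T): sigma * sqrt(T)

# 101 points so that the grid contains the mean exactly
x = np.linspace(exp_mu - 3 * exp_std, exp_mu + 3 * exp_std, 101)
pdf = stats.norm.pdf(x, loc=exp_mu, scale=exp_std)  # scale = standard deviation

# sanity check: a normal pdf peaks at the mean with height 1/(std * sqrt(2*pi))
peak = 1 / (exp_std * math.sqrt(2 * math.pi))
print(abs(pdf.max() - peak) < 1e-9)  # True
```

With these values the curve peaks around 1.13 instead of around 3.19, which is why the mismatch looked like a problem with T: changing T changed the variance and the (wrongly scaled) curve along with it.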