I am running a simulation of an OU process. The process is defined such that the change in each time step is
$dx = -\theta\, x(t)\, dt + \sqrt{\kappa\, dt}\,\mathrm{randn}()$
where $\theta$ and $\kappa$ are parameters of the process and $\mathrm{randn}()$ denotes a random variable drawn from a standard normal distribution. I then generate a large number, $n_2$, of points from this process and sample only the last $n_1$ of them, with $n_1 \ll n_2$, so that I can assume an effectively infinite amount of time has passed since the process started. In this limit the variance of the points is $\frac{\kappa}{2\theta}$.
If I average the mean of the $n_1$ points over enough repeated loops I obviously get zero. But if I instead average the absolute value of the mean of the $n_1$ points over many loops, I get some non-zero value. I can see numerically that this quantity, let's call it $x$, is proportional to $\sqrt{\kappa}$, inversely proportional to $\theta$, and inversely proportional to $\sqrt{dt}$. However, I do not know how to show this analytically. Does anyone here know how?
So effectively I want to generate an OU process and then calculate the expectation value of the absolute value of the process after a certain length of time has passed.
Any help will be greatly appreciated
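For reference, here is a minimal Python sketch of the loop described above (parameter values are illustrative, not the ones from my actual runs):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters -- not the values from my actual runs
theta, kappa, dt = 1.0, 2.0, 0.01
n2, n1, loops = 5_000, 1_000, 200

abs_means = []
for _ in range(loops):
    x = np.zeros(n2)
    noise = rng.standard_normal(n2 - 1)
    for i in range(1, n2):
        # Euler step: dx = -theta*x*dt + sqrt(kappa*dt)*randn()
        x[i] = x[i - 1] - theta * x[i - 1] * dt + np.sqrt(kappa * dt) * noise[i - 1]
    # keep only the last n1 points, then record |mean|
    abs_means.append(abs(x[-n1:].mean()))

print(np.mean(abs_means))  # the non-zero quantity the question asks about
```

With these parameters the total simulated time ($n_2\,dt = 50$) is much longer than the relaxation time $1/\theta = 1$, so the last $n_1$ points are effectively stationary.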
You are using an Euler-Maruyama scheme to simulate the process $$dX_t = -\theta X_t\,dt + \sqrt{\kappa}\, dB_t,$$ where $B_t$ is a standard Brownian motion. This SDE has an explicit solution, and its transition kernel can also be written down. At time $t$ it is $$p_t(x_0,y) = \frac{1}{\sqrt{\pi \kappa(1-e^{-2\theta t})/\theta}} \exp\left( -\frac{(y-x_0 e^{-\theta t})^2}{\kappa (1-e^{-2\theta t})/\theta} \right).$$ Your desired expectation at time $t$ is then $$E[|X_t|] = \int_{-\infty}^\infty |y|\, p_t(x_0, y)\,dy,$$ which, assuming $x_0 = 0$, is the expectation of the absolute value of a normal random variable, i.e. the mean of a half-normal distribution. The underlying normal has variance $\sigma^2 = \kappa(1-e^{-2\theta t})/(2\theta)$, and the integral works out to $$ E[|X_t|] = \sigma\sqrt{\frac{2}{\pi}} = \sqrt{\frac{\kappa(1-e^{-2\theta t})}{\pi\theta}} \propto \sqrt{\frac{\kappa}{\theta}}.$$ This confirms your observation that the expectation varies proportionally to $\sqrt{\kappa}$, but not that it varies inversely with $\theta$. It is of course entirely possible I've gotten some calculation wrong, and I urge you to check.
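A quick numerical sanity check of the half-normal formula, sampling $X_t$ directly from the transition kernel above rather than simulating paths (parameter values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
theta, kappa, t = 0.5, 2.0, 3.0  # arbitrary illustrative values

# Variance of X_t given X_0 = 0, read off the transition kernel:
# sigma^2 = kappa * (1 - e^{-2 theta t}) / (2 theta)
sigma = np.sqrt(kappa * (1 - np.exp(-2 * theta * t)) / (2 * theta))

samples = rng.normal(0.0, sigma, size=1_000_000)
empirical = np.abs(samples).mean()      # Monte Carlo estimate of E[|X_t|]
analytic = sigma * np.sqrt(2 / np.pi)   # half-normal mean
print(empirical, analytic)              # the two should agree closely
```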
Finally, you should question your statement about the scaling with $dt$. The expectation should be insensitive to $dt$, so long as it is sufficiently small. Are you sure you account for the fact that you need to take more steps to reach the same final time when $dt$ is reduced?
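One way to check this: fix the final time $T$ and compare Euler-Maruyama estimates of $E[|X_T|]$ for two step sizes (a sketch with arbitrary parameters; both runs should land near the analytic value, with the coarser $dt$ carrying only a small discretisation bias):

```python
import numpy as np

rng = np.random.default_rng(2)
theta, kappa, T = 1.0, 2.0, 5.0  # arbitrary illustrative values

def mean_abs_endpoint(dt, n_paths=50_000):
    """Euler-Maruyama estimate of E[|X_T|], starting all paths at 0."""
    x = np.zeros(n_paths)
    for _ in range(int(round(T / dt))):
        x += -theta * x * dt + np.sqrt(kappa * dt) * rng.standard_normal(n_paths)
    return np.abs(x).mean()

# Analytic value: sigma*sqrt(2/pi) with sigma^2 = kappa*(1 - e^{-2 theta T})/(2 theta)
sigma = np.sqrt(kappa * (1 - np.exp(-2 * theta * T)) / (2 * theta))
analytic = sigma * np.sqrt(2 / np.pi)

coarse, fine = mean_abs_endpoint(0.1), mean_abs_endpoint(0.01)
print(coarse, fine, analytic)
```

Note that halving $dt$ doubles the number of steps per path; the estimate converges to the analytic value as $dt \to 0$ rather than growing like $1/\sqrt{dt}$.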
Also, note that when you take $n_2 \gg n_1$ steps, the last $n_1$ points are essentially samples from a Gaussian, namely the stationary distribution of the Ornstein-Uhlenbeck process, $N(0, \kappa/(2\theta))$.