I have implemented Hamiltonian Monte Carlo. To test the effectiveness of my implementation, I have run it against a normal random variable.
After $n$ steps, I compare the sample mean and variance of the HMC output against the true mean and variance of the target distribution. The sample mean $\bar{x}$ converges to $\mu$, but the sample variance $s^2$ seems to converge to something slightly higher than $\sigma^2$.
Aside from an incorrect implementation, which algorithm parameters might produce this behavior (e.g., the integrator step size, or the number of integrator steps per transition)?
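For reference, here is a minimal sketch of the kind of check I am describing (not my actual implementation): standard leapfrog HMC targeting a standard normal, with a Metropolis accept/reject step, followed by a comparison of the sample moments to the true $\mu = 0$, $\sigma^2 = 1$. The function and parameter names are illustrative.

```python
import numpy as np

def hmc_sample(logp, logp_grad, x0, n_samples, step_size, n_leapfrog, rng):
    """Minimal 1-D HMC sampler; returns an array of n_samples draws."""
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        p = rng.standard_normal()  # resample momentum each transition
        x_new, p_new = x, p
        # Leapfrog integration of the Hamiltonian dynamics
        p_new = p_new + 0.5 * step_size * logp_grad(x_new)
        for _ in range(n_leapfrog - 1):
            x_new = x_new + step_size * p_new
            p_new = p_new + step_size * logp_grad(x_new)
        x_new = x_new + step_size * p_new
        p_new = p_new + 0.5 * step_size * logp_grad(x_new)
        # Metropolis correction for the integrator's discretization error
        log_accept = (logp(x_new) - 0.5 * p_new**2) - (logp(x) - 0.5 * p**2)
        if np.log(rng.uniform()) < log_accept:
            x = x_new
        samples[i] = x
    return samples

# Target: standard normal, log p(x) = -x^2 / 2 up to an additive constant
logp = lambda x: -0.5 * x**2
logp_grad = lambda x: -x

rng = np.random.default_rng(0)
samples = hmc_sample(logp, logp_grad, 0.0, 20000,
                     step_size=0.5, n_leapfrog=10, rng=rng)
print(samples.mean(), samples.var())  # should be near 0 and 1
```

With the accept/reject step in place, both moments should match; if that step is skipped (or broken), larger step sizes are exactly the kind of thing I would expect to inflate the sampled variance.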