I'm running a simulation of the evolution of a population and, while the program runs, calculating heritability and confidence intervals for a type's frequency (the number of individuals of that type in each generation). The program runs for x generations and this is repeated y times (say x = 200 and y = 5000), but the result differs from run to run because a random process is being modeled.
$$\text{Heritability} = \frac{Cov(\text{parent values},\ \text{offspring values})}{Var(\text{parent values})}$$
I believe I am taking a sample, because the process is random; yet covariance and variance are defined in terms of expectations (where the sum is divided by all $N$ pairs of values). So: is the denominator of the covariance $N$ or $N-1$, and is the denominator of the variance $N$ or $N-1$?
Thank you very much for your input.
Asymptotically it makes no difference. Clearly $\frac{n}{n-1}$ converges to $1$ so provided your sample is sufficiently large, you'll get basically the same answer.
With that said, $n-1$ is normally used for the sample variance in order to make the estimator unbiased. The covariance of two random variables can be expressed as the difference of two variances:
$$Cov(X,Y) = Var \left(\frac{X+Y}{2} \right)-Var \left(\frac{X-Y}{2} \right)$$
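This identity can be checked numerically. Note that it also holds exactly for the $n-1$ sample estimators, since it is an algebraic identity in the centered sums. A quick sketch with NumPy (the sample size, seed, and simulated variables are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 0.5 * x + rng.normal(size=100)  # correlated with x

# Sample covariance with denominator n-1 (NumPy's default for np.cov).
cov_xy = np.cov(x, y)[0, 1]

# Right-hand side of the identity, using the same n-1 convention.
rhs = np.var((x + y) / 2, ddof=1) - np.var((x - y) / 2, ddof=1)

print(np.isclose(cov_xy, rhs))  # True: the identity holds term for term
```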
Hence, in order to make the covariance estimator unbiased, you should also divide by $n-1$.
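One further point worth noting: in the heritability ratio specifically, the choice of $n$ versus $n-1$ cancels entirely, as long as the numerator and denominator use the same convention. A minimal NumPy sketch of the parent-offspring estimate (the function name and the simulated data are illustrative, not taken from the question):

```python
import numpy as np

def heritability(parent, offspring):
    """Estimate heritability as sample Cov(parent, offspring) / sample
    Var(parent), both with denominator n-1 (np.cov's default; ddof=1)."""
    parent = np.asarray(parent, dtype=float)
    offspring = np.asarray(offspring, dtype=float)
    cov = np.cov(parent, offspring)[0, 1]
    return cov / np.var(parent, ddof=1)

# Illustrative data: offspring regress on parents with slope 0.6,
# so the estimate should land near 0.6 for a large sample.
rng = np.random.default_rng(1)
p = rng.normal(10, 2, size=500)
o = 0.6 * p + rng.normal(0, 1, size=500)
print(heritability(p, o))  # close to 0.6
```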