Having searched the internet for a while and read a few textbooks I found in the library, I am unable to answer a very simple question:
As the size $n$ of a sample taken from a normal distribution increases, should the mean of the estimator stay constant if the estimator is unbiased, and vary with $n$ if the estimator is biased?
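To pin down what I mean, my understanding of the definition is that an estimator $\hat\theta_n$ of a parameter $\theta$ is unbiased when

$$\mathbb{E}\big[\hat\theta_n\big] = \theta \quad \text{for every sample size } n,$$

so the mean of an unbiased estimator should not move as $n$ grows, while the mean of a biased estimator may depend on $n$.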
I can provide further detail if needed; I feel the exact context is largely irrelevant, but I have added some below.
The distribution had variance 64. When I used the estimator $S^2$ for $n = 3, 10$, and $100$, the sample means of $S^2$ were 63.9, 64.1, and 63.8, respectively. However, when I used $\bar S^2$ for the same values of $n$, the sample means were 42.5, 58.0, and 63.2. What does this tell me about the estimators?
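My guess (and it is only a guess) is that $\bar S^2$ here is the divide-by-$n$ estimator, $\bar S^2 = \frac{1}{n}\sum_{i=1}^{n}(X_i - \bar X)^2$, because then its expectation works out to

$$\mathbb{E}\big[\bar S^2\big] = \frac{n-1}{n}\,\sigma^2 = \frac{n-1}{n}\cdot 64 \approx 42.7,\ 57.6,\ 63.4 \quad \text{for } n = 3, 10, 100,$$

which is very close to the 42.5, 58.0, and 63.2 I observed.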
N.B. When I work out the sample variances of the estimators, both decrease as $n$ increases, so the estimators appear to be concentrating around 64 as well. But I only have the three sample sizes 3, 10, and 100, so I can't be certain.
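For what it's worth, here is a minimal sketch of the kind of simulation I describe (written in Python; the mean of 0, the seed, and the 10,000 replications are my choices and were not fixed by the setup above):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 64       # true variance of the normal distribution
reps = 10_000     # simulated samples per n (an arbitrary choice)

for n in (3, 10, 100):
    # reps independent samples of size n from N(0, 64); the mean 0 is arbitrary
    x = rng.normal(loc=0.0, scale=np.sqrt(sigma2), size=(reps, n))
    s2 = x.var(axis=1, ddof=1)       # unbiased estimator S^2 (divides by n - 1)
    s2_bar = x.var(axis=1, ddof=0)   # biased estimator (divides by n)
    print(f"n={n:3d}  mean S^2={s2.mean():6.2f}  mean biased={s2_bar.mean():6.2f}  "
          f"var S^2={s2.var():8.1f}  var biased={s2_bar.var():8.1f}")
```

With many replications, the mean of $S^2$ stays near 64 for every $n$, while the mean of the biased estimator climbs toward 64 as $n$ grows, and the variances of both estimators shrink toward 0.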