"Everybody knows" that there are lots of variations on the theme of the central limit theorem. The most frequently seen form seems to be this: Suppose $X_1,X_2,X_3,\ldots$ are i.i.d. random variables and $\sigma^2=\operatorname{var}(X_1)<\infty$ and $\mu=\mathbb E(X_1)$. Then for all real $x$, $$ \lim_{n\to\infty} \Pr\left( \frac{\frac{X_1+\cdots+X_n}n -\mu}{\sigma/\sqrt n} \le x \right) = \frac 1{\sqrt{2\pi}} \int_{-\infty}^x e^{-u^2/2}\,du. $$
Now suppose $X_1,X_2,X_3,\ldots$ are sampled without replacement from a finite population. In that case the assumption of independence does not hold, but it holds approximately when $n$ is small by comparison to the size of the population. One may imagine the probability above initially approaching the integral above, but when $n$ reaches the size of the whole population, the sample average above is with probability $1$ equal to the population average, and the behavior of its probability distribution when $n$ is, for example, $1/2$ or $3/4$ or $99/100$ of the population is unclear to me.
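The departure from independence is easy to see in the variance: for simple random sampling without replacement, the classical finite-population correction gives $\operatorname{var}(\bar X_n)=\frac{\sigma^2}{n}\cdot\frac{N-n}{N-1}$ rather than the i.i.d. value $\frac{\sigma^2}{n}$, so the sample mean is *less* variable than the i.i.d. formula predicts, and degenerates entirely at $n=N$. A simulation sketch (the particular population and sizes are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# A fixed finite population (arbitrary illustrative values)
N = 1000
population = rng.normal(size=N)
mu = population.mean()
sigma2 = population.var()  # population variance (divide by N)

def sample_mean_var(n, trials=20000):
    """Monte Carlo variance of the mean of n draws without replacement."""
    means = np.array([rng.choice(population, size=n, replace=False).mean()
                      for _ in range(trials)])
    return means.var()

n = 500  # half the population
mc_var = sample_mean_var(n)
fpc_var = sigma2 / n * (N - n) / (N - 1)  # finite-population correction
iid_var = sigma2 / n                      # what the i.i.d. formula predicts
# mc_var should track fpc_var, noticeably below iid_var
```

At $n=N/2$ the correction factor is roughly $1/2$, so the i.i.d. variance formula is off by about a factor of two.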
There must be published results about this. (?) What are they?
For which value of $n$ is the probability whose limit is taken above closest to the integral?
First, a note on the statement: for finite $n$ one only has $\bar X = \frac{\sum X_i}{n}\approx \mathcal{N}\left(\mu,\frac{\sigma^2}{n}\right)$ approximately. The exact limit statement requires subtracting the mean and dividing by the standard deviation, so that $\sqrt{n}\,\frac{\bar X-\mu}{\sigma} \xrightarrow{d} \mathcal{N}(0,1)$, which is what your displayed limit expresses.
That aside, the answer will depend on the underlying distribution of the population. Let's say $X_i=1\;\forall i$; then it will never converge to the integral. Barring such degenerate distributions, you have the issue of defining your metric for "closest". The probability measure of your sample mean is a function which you are comparing to another function. Do you want to minimize the Kullback–Leibler divergence? I think this seems like a good choice, at least theoretically.
You will need to know the underlying sampling distribution of the mean as a function of $n$ and $N$ (sample and population size, respectively), then do a nonlinear optimization where you minimize the Kullback–Leibler divergence as a function of $n$.
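When the sampling distribution is not available in closed form, the inner objective can be estimated by Monte Carlo. The following is a rough sketch, not a definitive implementation: the skewed example population, the bin count, and the trial count are all arbitrary assumptions, and the reference normal uses the question's i.i.d. scaling $\sigma^2/n$.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(2)

def norm_cdf(x, mean, sd):
    # Normal CDF via the error function
    return 0.5 * (1.0 + erf((x - mean) / (sd * sqrt(2.0))))

def kl_to_normal(n, population, trials=20000, bins=30):
    """Discretized Monte Carlo estimate of D(P_n || N(mu, sigma^2/n)).

    P_n is the empirical distribution of the mean of n draws without
    replacement; the reference normal uses the question's i.i.d. CLT
    scaling sigma^2/n (an assumption, not the finite-population variance).
    """
    mu, sigma2 = population.mean(), population.var()
    means = np.array([rng.choice(population, size=n, replace=False).mean()
                      for _ in range(trials)])
    edges = np.linspace(means.min(), means.max(), bins + 1)
    p, _ = np.histogram(means, bins=edges)
    p = p / p.sum()
    sd = sqrt(sigma2 / n)
    q = np.array([norm_cdf(edges[i + 1], mu, sd) - norm_cdf(edges[i], mu, sd)
                  for i in range(bins)])
    mask = (p > 0) & (q > 0)
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical skewed population, to make the comparison non-trivial
population = rng.exponential(size=200)
kl_small = kl_to_normal(5, population)   # small n: CLT has not yet kicked in
```

One would then scan (or numerically optimize over) $n$ and report the minimizer; since $n$ is an integer bounded by $N$, a direct scan is often simpler than a nonlinear solver.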