I'm learning basic statistics on my own, so sorry if this is a naive question. Today I was reading about the Central Limit Theorem. I understand that, no matter what distribution the population has, if we take many large samples and compute the mean of each one, the distribution of those means will be close to normal.
What I don't understand is how this applies to a single sample. We are never going to have all possible samples; we will usually have one sample of size n and want to infer something about the population it was taken from. So how does the theorem apply to that one sample?
The more precise versions of the CLT (for example, the Berry–Esseen theorem) give rates of convergence that depend on n. For a single sample of size n, this tells you how close the distribution of your standardized sample mean is to a Gaussian, so you can treat it as approximately normal and quantify the approximation error. And because the whole distribution converges to a Gaussian, not just the mean and variance, you can say much more: you get approximate quantiles (which justify the usual confidence intervals) and tail inequalities.
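A small simulation may make the connection to a single sample concrete. The first part draws many samples just to visualize what the CLT asserts; the second part uses only one sample, the situation you actually face, and builds the standard normal-based 95% confidence interval that the CLT justifies. (The exponential population with mean 2 and the sample size n = 50 are arbitrary choices for illustration.)

```python
import numpy as np

rng = np.random.default_rng(0)

pop_mean, n = 2.0, 50  # exponential population: decidedly non-normal

# What the CLT says: means of many samples of size n are approximately
# normal with mean pop_mean and sd pop_sd / sqrt(n) (here 2 / sqrt(50)).
means = rng.exponential(scale=pop_mean, size=(10_000, n)).mean(axis=1)
print(means.mean(), means.std())  # ~ 2.0 and ~ 0.28

# What you do in practice: you have ONE sample. The CLT lets you treat
# your sample mean as a draw from that (approximately normal) sampling
# distribution, which justifies this confidence interval.
sample = rng.exponential(scale=pop_mean, size=n)
se = sample.std(ddof=1) / np.sqrt(n)          # estimated standard error
ci = (sample.mean() - 1.96 * se, sample.mean() + 1.96 * se)
print(ci)  # should contain 2.0 for about 95% of samples
```

The point is that the histogram of `means` is only a thought experiment; the interval at the end uses nothing but the single observed sample, with the CLT supplying the "1.96" (the normal quantile).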