I am in a grad-level statistical inference class and getting my concepts thoroughly confused. Here's a cutout of the concepts that most need clarification, from the Rice Mathematical Statistics book:
- What is the difference between the variance of an estimate and the estimated variance? I always remember using $\frac{1}{n-1}\sum_i (x_i - \bar{x})^2$ to find the sample variance, but now everything is jumbled together and I feel like I know nothing.
I am missing a lot of links. Any help appreciated. Thank you.
The estimator (of whatever parameter) is a random variable, so it fluctuates from sample to sample. Its variance, which is the second-to-last column in your picture, describes the actual degree to which it fluctuates between samples. In practice this variance depends on parameters whose values we realistically don't know, but it is still helpful to know the exact form of that dependence. The variance can itself be estimated, and that estimate is what appears in the last column of your picture.
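To make the distinction concrete, here is a small simulation sketch (not from the answer; NumPy is assumed, and the numbers $\mu = 5$, $\sigma = 2$, $n = 25$ are made up for illustration). It contrasts the true variance of $\bar{X}$, namely $\sigma^2/n$, with the estimate $s^2/n$ computed from a single sample:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" parameters -- in practice these are unknown.
mu, sigma, n = 5.0, 2.0, 25

# Variance of the estimator xbar: Var(xbar) = sigma^2 / n.
# This is the "second-to-last column" quantity; it depends on sigma.
true_var_of_mean = sigma**2 / n

# Simulate many independent samples to watch xbar actually fluctuate.
xbars = rng.normal(mu, sigma, size=(100_000, n)).mean(axis=1)
print(xbars.var())  # close to sigma^2 / n = 0.16

# From one sample we can only *estimate* Var(xbar) by s^2 / n,
# where s^2 = 1/(n-1) * sum (x_i - xbar)^2. This is the "last column".
sample = rng.normal(mu, sigma, size=n)
s2 = sample.var(ddof=1)  # ddof=1 gives the 1/(n-1) divisor
print(s2 / n)  # an estimate of 0.16; it varies from sample to sample
```

Rerunning the last two lines with fresh samples shows that $s^2/n$ bounces around the fixed target $\sigma^2/n$, which is exactly the relationship between the two columns.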
The values in that second-to-last column are indeed unobservable in realistic situations; the last column is how you estimate them from a sample. That said, you will occasionally see setups that look artificial, like estimating $\mu$ when you somehow already know $\sigma$.