In the book Statistical Inference (Casella and Berger, 2nd ed.), page 470, there is an example:
$\bar{X}_n$ is the mean of $n$ iid observations, with $\operatorname{E}X=\mu$ and $\operatorname{Var}X=\sigma^2$. "If we take $T_n=1/\bar{X}_n$, we find that the variance is $\operatorname{Var}T_n=\infty$, so the limit of the variances is infinity." Why is the limit of the variances infinity? I can go as far as
$$\operatorname{Var}\frac 1{\bar{X}_n}=\operatorname{E}_{\bar{X}_n}\left[\left(\frac1{\bar{X}_n}-\operatorname{E} \frac 1{\bar X_n} \right)^2 \mid \mu\right] $$
but I don't know what comes next. I know that $\lim_{n\to\infty}\operatorname{Var}\bar X_n=0$. However, $\lim_{n\to\infty}1/\operatorname{Var}\bar X_n\neq\lim_{n\to\infty}\operatorname{Var}(1/\bar X_n)$. How can we show that the variance is infinite for every $n$?
Thanks!

In the example, $\overline{X}_n$ is the mean of $n$ iid normal observations. Therefore, $\overline{X}_n$ also has a normal distribution, with mean $\mu$ and variance $\sigma^2/n$; its probability density function is $f(x)=\frac1{\sqrt{2\pi\sigma^2/n}}e^{\frac{-(x-\mu)^2}{2\sigma^2/n}}$. When we try to compute the mean of $T_n=1/\overline{X}_n$, we find: $$\operatorname{E}|T_n| = \int_{-\infty}^{\infty} \frac1{\sqrt{2\pi\sigma^2/n}}e^{\frac{-(x-\mu)^2}{2\sigma^2/n}}\frac1{|x|}\ dx=\infty,$$ since near $x=0$ the density is bounded below by a positive constant, so the integrand is bounded below by a constant multiple of $1/|x|$, which has infinite integral. Thus each $T_n$ has an undefined mean, and hence an undefined variance as well. I'm not sure why Casella says that $\operatorname{Var}T_n=\infty$; I think it would be more correct to say that the variance of each $T_n$ is undefined, although $\operatorname{E}(T_n^2)=\infty$ by the same argument (the integrand there is bounded below by a constant multiple of $1/x^2$ near $0$), which is presumably what is meant.
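To see the divergence numerically, here is a minimal Python sketch. It computes the truncated integral $\int_\varepsilon^1 f(x)\frac1x\,dx$ for shrinking $\varepsilon$, using the illustrative choices $\mu=0$ and $\sigma^2/n=1$ (any values with $f(0)>0$ behave the same way); the substitution $x=e^u$ turns the integrand into $f(e^u)$ on $[\ln\varepsilon,\,0]$, which a simple midpoint rule handles well. The output grows roughly like $f(0)\ln(1/\varepsilon)$, illustrating that the full integral is infinite.

```python
import math

def density(x, mu=0.0, var=1.0):
    # Normal density of X̄_n; mu = 0 and var = sigma^2/n = 1 are
    # illustrative choices, not values fixed by the example.
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def truncated_integral(eps, upper=1.0, steps=20000):
    # Midpoint rule for \int_eps^upper f(x) * (1/x) dx.
    # Substituting x = e^u (dx = x du) turns the integrand into f(e^u),
    # integrated over u in [ln(eps), ln(upper)].
    log_lo, log_hi = math.log(eps), math.log(upper)
    h = (log_hi - log_lo) / steps
    total = 0.0
    for i in range(steps):
        u = log_lo + (i + 0.5) * h
        total += density(math.exp(u)) * h
    return total

for eps in (1e-2, 1e-4, 1e-8, 1e-16):
    print(f"eps = {eps:.0e}  integral ≈ {truncated_integral(eps):.3f}")
```

Each time $\varepsilon$ is squared, the integral gains roughly a fixed increment ($f(0)$ times the extra width in $\ln x$), so it is unbounded as $\varepsilon\to0$; this mirrors the lower-bound argument above.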