I am currently reading the book *Bayesian Logical Data Analysis*. In Chapter 5 the following is stated:
"What happens to the average of samples drawn from a distribution which has an infinite variance? In this case, the error bar for the sample mean does not decrease with increasing $n$."
My question is: what would be a good example of a distribution with infinite variance?
A simple example is the standard Cauchy distribution $\dfrac{dx/\pi}{1+x^2}$.
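To see the divergence directly, one can check that the second moment of the standard Cauchy distribution is infinite:
$$
\operatorname{E}\left(X^2\right) = \int_{-\infty}^{\infty} \frac{x^2}{\pi(1+x^2)}\,dx = \frac 1 \pi \int_{-\infty}^{\infty} \left(1 - \frac{1}{1+x^2}\right) dx = \infty,
$$
since the integrand tends to $1/\pi$ as $x \to \pm\infty$, so the integral diverges.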
If the variance of a probability distribution is characterized as $\frac 1 2 \operatorname{E}\left(( X_1 - X_2)^2\right)$, where $X_1,X_2$ are an i.i.d. pair, then the Cauchy distribution has infinite variance. (This characterization is convenient here because it does not require the mean to exist, and the Cauchy distribution has no mean.)
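As a sanity check on this characterization: when the ordinary variance $\sigma^2$ does exist, the two definitions agree, since for i.i.d. $X_1, X_2$,
$$
\operatorname{E}\left((X_1 - X_2)^2\right) = \operatorname{E}\left(X_1^2\right) - 2\operatorname{E}(X_1)\operatorname{E}(X_2) + \operatorname{E}\left(X_2^2\right) = 2\operatorname{E}\left(X^2\right) - 2\left(\operatorname{E}(X)\right)^2 = 2\sigma^2,
$$
so $\frac 1 2 \operatorname{E}\left((X_1 - X_2)^2\right) = \sigma^2$.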
If $X_1,\ldots,X_n$ are independent random variables with this distribution, then $\bar X_n = \dfrac{X_1+\cdots+X_n}{n}$ also has this same distribution, rather than one that is more concentrated near the center.
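You can watch this happen in a quick simulation. The sketch below (standard library only; the function names are mine, not from the book) repeatedly computes the mean of $n$ Cauchy draws and reports the interquartile range of those sample means. For a normal distribution the IQR shrinks like $1/\sqrt n$; for the Cauchy it stays put no matter how large $n$ gets.

```python
import math
import random
import statistics

def cauchy_draw(rng):
    # Inverse-CDF sampling: if U ~ Uniform(0, 1), then
    # tan(pi * (U - 1/2)) is standard Cauchy.
    return math.tan(math.pi * (rng.random() - 0.5))

def gauss_draw(rng):
    # Standard normal draw, for comparison.
    return rng.gauss(0.0, 1.0)

def iqr_of_sample_means(draw, n, reps, rng):
    """Interquartile range over `reps` replications of the mean of n draws."""
    means = sorted(
        statistics.fmean(draw(rng) for _ in range(n)) for _ in range(reps)
    )
    return means[(3 * reps) // 4] - means[reps // 4]

if __name__ == "__main__":
    rng = random.Random(0)
    for n in (10, 1000):
        c = iqr_of_sample_means(cauchy_draw, n, 2000, rng)
        g = iqr_of_sample_means(gauss_draw, n, 2000, rng)
        print(f"n={n:5d}  Cauchy-mean IQR={c:.3f}  Normal-mean IQR={g:.3f}")
```

Running this, the normal column drops by roughly a factor of $\sqrt{100} = 10$ going from $n=10$ to $n=1000$, while the Cauchy column stays near $2$ (the IQR of the standard Cauchy itself), which is exactly the "error bar does not decrease" behavior the book describes.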