I see that condition (C2) in the paper http://www-stat.wharton.upenn.edu/~tcai/paper/Precision-Matrix.pdf is called "polynomial-type tails," but I do not understand why it is given that name.
Let $X$ be a mean-zero random variable. Condition (C2) there is $E|X|^p<C$ for some positive constants $p,C$.
I know that if $E|X|^p<C$ for some positive constants $p,C$, then by Markov's inequality \begin{equation} P(|X|\ge t)\le \frac{E|X|^p}{t^p}\le Ct^{-p} \ \text{for all $t>0$}. \end{equation} So a finite moment implies a polynomial tail bound.
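To see the inequality above concretely, here is a small numerical check (my own illustration, not from the paper): take $X \sim N(0,1)$ and $p=2$, so $E|X|^2 = 1$, and compare the empirical tail $P(|X|\ge t)$ with the Markov bound $E|X|^p/t^p$.

```python
# Sanity check of Markov's inequality: P(|X| >= t) <= E|X|^p / t^p.
# Illustrative choice (not from the paper): X ~ N(0, 1), p = 2.
import random

random.seed(0)
n = 200_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
p = 2
moment = sum(abs(x) ** p for x in xs) / n  # estimate of E|X|^p (about 1 here)

for t in [1.0, 2.0, 3.0]:
    tail = sum(1 for x in xs if abs(x) >= t) / n  # estimate of P(|X| >= t)
    bound = moment / t ** p                       # Markov bound E|X|^p / t^p
    print(f"t={t}: P(|X|>=t) approx {tail:.4f} <= bound {bound:.4f}")
```

The empirical tail sits well below the bound at every $t$, and the bound itself decays polynomially in $t$, as in the displayed inequality.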
Now my question is: if $P(|X|\ge t)\le Ct^{-p}$ for some constant $p>0$ and all $t>0$ (or perhaps only as $t\to \infty$?), does $X$ have any finite moments?
Or could anyone explain why they call condition (C2) a polynomial-type tail condition?
Thanks.
If $P(|X|>t) \sim t^{-p},$ then (differentiating the CDF) the density behaves like $f_X(x)\sim x^{-(p+1)},$ so it has a power-law tail. When you integrate to get $E(|X|^q)$ you have $\int^\infty x^q\,x^{-(p+1)}\,dx,$ which converges for $q-p-1 < -1,$ i.e. $q <p,$ and diverges for $q\ge p.$
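The convergence claim above can be checked numerically for a concrete power-law tail (my own example): a standard Pareto density $f(x) = p\,x^{-(p+1)}$ for $x\ge 1$, which has $P(X>t)=t^{-p}$. Truncating the moment integral at a cutoff $M$ shows it stabilizing at $p/(p-q)$ when $q<p$ and growing like $p\log M$ when $q=p$.

```python
# Truncated moment integral for a Pareto density f(x) = p * x^{-(p+1)}, x >= 1:
# E[X^q] = integral_1^inf x^q * p * x^{-(p+1)} dx = p/(p-q) if q < p,
# while the truncated integral grows without bound if q >= p.

def truncated_moment(q, p, M, steps=200_000):
    """Midpoint-rule approximation of integral_1^M p * x^(q-p-1) dx."""
    h = (M - 1.0) / steps
    return sum(p * (1.0 + (i + 0.5) * h) ** (q - p - 1) * h for i in range(steps))

p = 3.0
# q = 1 < p: converges to p/(p-1) = 1.5 as M grows
print(truncated_moment(1.0, p, 10.0), truncated_moment(1.0, p, 1000.0))
# q = p = 3: truncated integral is about p*log(M), so it keeps growing with M
print(truncated_moment(3.0, p, 10.0), truncated_moment(3.0, p, 1000.0))
```

Pushing the cutoff from $M=10$ to $M=1000$ barely moves the $q=1$ value but roughly triples the $q=p$ value, matching the $q<p$ vs. $q\ge p$ dichotomy in the answer.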
Thus if you have a power-law tail (I'd call it power law, not polynomial, but that's unimportant), it corresponds to a particular moment diverging: the faster the power law decays, the more moments are finite. Whereas if the tail decays exponentially, all moments are finite.