Infinite Integrals and "Moments" of a Random Vector


I'm trying to understand the concept of "moments" of a random vector, and what this concept means for (potentially) infinite integrals. Here is an example.

Suppose we have an $m$-dimensional random vector $\alpha$ with distribution $F$ and density $f$, where the support of $F$ is bounded in every dimension. Consider the function

$$g(\alpha)=\int_{1}^{\infty}t^{m-1}f(t\alpha)\,dt,$$

where $t$ is a scalar. Here is the question: is $g$ always finite? A paper I'm reading claims that "if enough moments for $\alpha$ exist," then $g(\cdot)$ will be finite. What does "enough moments" mean here, and why does having them force this integral to be finite?
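
Here is as far as I got on my own (my rewriting, not from the paper). Assuming $\alpha\neq 0$ and substituting $s=t\lVert\alpha\rVert$:

$$g(\alpha)=\int_{1}^{\infty}t^{m-1}f(t\alpha)\,dt=\frac{1}{\lVert\alpha\rVert^{m}}\int_{\lVert\alpha\rVert}^{\infty}s^{m-1}f\!\left(s\,\tfrac{\alpha}{\lVert\alpha\rVert}\right)ds,$$

so $g(\alpha)$ is (up to the factor $\lVert\alpha\rVert^{-m}$) a tail of the radial integral of $f$ along the ray through $\alpha$. In polar coordinates, $\int_{S^{m-1}}\int_{0}^{\infty}s^{m-1}f(s\omega)\,ds\,d\omega=1$, so these radial integrals are well behaved for almost every direction, but I still don't see where a moment condition like $E\lVert\alpha\rVert^{k}<\infty$ enters.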

My intuition: because $F$ has bounded support, the density $f(t\alpha)$ is zero once $t$ is large enough (for $\alpha\neq 0$, the point $t\alpha$ eventually leaves the support), so the integrand vanishes before $t$ reaches infinity. But that argument never uses the "moments" of $\alpha$ at all.
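
To check that intuition numerically, here is a minimal sketch (my own, not from the paper), assuming a concrete toy density: uniform on the unit square $[0,1]^2$, so $m=2$, with $\alpha$ taken to have positive components. The integrand is exactly zero beyond a finite $t$, and $g$ comes out finite with no moment condition in sight:

```python
import numpy as np
from scipy.integrate import quad

m = 2  # dimension of the toy example

def f(x):
    # Hypothetical toy density: uniform on the unit square [0, 1]^2.
    return 1.0 if np.all((x >= 0.0) & (x <= 1.0)) else 0.0

def g(alpha):
    # g(alpha) = integral_1^inf t^(m-1) f(t * alpha) dt.
    # For this density (and alpha with positive components),
    # t * alpha leaves the support once t > 1 / max(alpha), so the
    # integrand is 0 beyond t_max and the integral is over a finite interval.
    alpha = np.asarray(alpha, dtype=float)
    t_max = 1.0 / alpha.max()
    if t_max <= 1.0:
        return 0.0
    val, _ = quad(lambda t: t ** (m - 1) * f(t * alpha), 1.0, t_max)
    return val

alpha = np.array([0.2, 0.5])
print(g(alpha))                          # numerical: 1.5
print((1.0 / alpha.max() ** 2 - 1) / 2)  # closed form for this f: 1.5
```

The value blows up as $\alpha\to 0$ but is finite for every fixed $\alpha\neq 0$, which is why the moment condition puzzles me in the bounded-support case.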

The paper is:

Armstrong, M. (1996). Multiproduct nonlinear pricing. Econometrica, 64(1), 51–75. See footnote 13 on p. 62.