Do random variables always integrate to 1?


In my experience, a classic exam question for stats students is to ask them to prove that a given function is a random variable. Typically, they are expected to give an answer which includes integrating (or summing) over the sample space (which is almost always $\mathbb{R}$) and pointing out that the result is $1$. However, does this trick always work? What if, for example, we're working with a suspected random variable that's exotic enough to require Lebesgue integration? Or what if we're in a rather nasty probability space?

BEST ANSWER

There are random variables that have neither a density nor a probability mass function, so it wouldn't make sense to ask a question of the form "show that this function is a density" or "show that this function is a PMF" about a random variable like that.
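For instance (a standard example, not one given in the original answer): let $X = 0$ with probability $\tfrac{1}{2}$, and let $X$ be uniform on $(0,1)$ otherwise. Its CDF is

$$F_X(x) = \begin{cases} 0 & x < 0, \\ \tfrac{1}{2} + \tfrac{x}{2} & 0 \le x < 1, \\ 1 & x \ge 1. \end{cases}$$

The jump at $0$ rules out a density, and the continuous part rules out a PMF, yet $X$ is a perfectly good random variable.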

But from the measure-theoretic point of view, if you have a function on a probability space, the only thing to be done to show it's a random variable is to show it's measurable. The "integrates to 1" part is already built into the measure.

Showing that a given measure is a probability measure would be closer to the spirit of your question: that is where the "total mass equals $1$" condition is actually something you verify.
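As a concrete illustration of that last check (a sketch of my own, not part of the original answer; the function names and the choice of the standard normal density are mine), here is how one might numerically confirm that a candidate density integrates to $1$, which is the analogue of showing a measure has total mass $1$:

```python
import math


def normal_pdf(x: float) -> float:
    """Standard normal density (illustrative choice of candidate density)."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)


def integrate(f, a: float, b: float, n: int = 100_000) -> float:
    """Midpoint Riemann sum of f over [a, b] with n subintervals."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h


# The normal tails beyond +/-10 contribute a negligible amount,
# so integrating over [-10, 10] approximates the integral over all of R.
total = integrate(normal_pdf, -10, 10)
print(total)  # should be very close to 1
```

Of course, a numerical check like this only ever verifies the density/measure side of the story; it says nothing about measurability, which is the actual defining property of a random variable.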