Characteristic function for a random variable that can take the value infinity


I want to derive a characteristic function for the duration of a stochastic process that can possibly never end. Specifically, I have $X \in \mathbb{N} \cup \{0, +\infty \}$ and $\sum_{k = 0}^{\infty} P(X = k) <1$.

I tried to go like this:

$$ \phi_X(t) = \mathbb{E} \left [e^{itX} \mid X < \infty \right ] P(X < \infty ) + \mathbb{E} \left [e^{itX} \mid X = \infty \right ] P(X = \infty ) $$

But it seems to me that $e^{it\infty}$ is not defined:

$$ \mathbb{E} \left [e^{itX} \mid X = \infty \right ] = \cos(t \times \infty)+ i \sin(t \times \infty) $$

If I remember correctly, the characteristic function exists for every real-valued distribution. Does that include distributions on the extended real line? Does a characteristic function exist in this specific case?

Update: the process in question is similar to the Gambler's Ruin when playing against an infinitely rich adversary, like the one described in paragraph 1.1 here: http://www.columbia.edu/~ks20/FE-Notes/4700-07-Notes-GR.pdf Assuming the gambler starts with an initial wealth of \$1, the duration of the game is the duration of a nonnegative random walk, where the probability of stepping up is $p$. The probability that the process lasts exactly $k = 2m+1$ periods (note that $k$ must be odd) is

$$ P(X = 2m+1) = \frac{1}{2m + 1} \binom{2m + 1}{m} p^m (1-p)^{m+1} $$

In the case $p > 0.5$, we have $\sum_{m = 0}^{\infty} P(X = 2m+1) = \frac{1-p}{p} < 1$, which is the probability that the process eventually terminates.

Best answer:

For what it's worth, none of the variations listed at Wikipedia would apply to the extended real numbers.

I have two ideas on how you might proceed in whatever you're trying to do.


The first idea is to write $X = Y + Z$, where $Y$ and $Z$ are independent random variables with $Y \in \mathbb{N}$ and $Z \in \{ 0, \infty \}$. Explicitly, their distributions are determined by

  • $ P(Y = y) = P(X = y \mid X \neq \infty) $
  • $P(Z = \infty) = P(X = \infty)$

Given this, you use $\phi_Y$ as much as you can in situations where you would like to use characteristic functions, and use the distribution of $Z$ to account for the distinction between $X$ and $Y$.

(Normally I would consider $X = YZ$ with $Z \in \{ 1, \infty \}$, but my first impression is that the additive version is more suitable to your needs.)


The second idea is to define a function

$$ p: \mathbb{R} \times \overline{\mathbb{N}} \to \mathbb{C} : (t, n) \mapsto \begin{cases} \exp(i t n) & n \neq \infty \\ 0 & n = \infty \end{cases} $$

and then set

$$ \phi_X(t) = \mathbb{E}[p(t, X)] $$

Note that you can still recover the distribution of $X$ from this. In fact, there is a straightforward relationship to $Y$ and $Z$:

  • $\phi_X(0) = P(Z = 0)$
  • $\phi_X(t) = \phi_X(0) \phi_Y(t) $

Amusingly, $\phi_Z(t)$ is the constant function with value $\phi_X(0)$, so this last identity is the addition formula

$$ \phi_{Y+Z}(t) = \phi_Y(t) \phi_Z(t) $$

Maybe there is some nice theory to develop here.

This same approach should generalize from $\mathbb{N} \cup \{ \infty \}$ to the projective real numbers, the compactification where $+\infty$ and $-\infty$ are the same point (not the extended real numbers, since there doesn't appear to be any reasonable way to distinguish between the two ends).