Modulus of characteristic function equal to 1 implies almost surely constant


Let $X$ be a random variable, and let $\hat{\mu_X}$ be its characteristic function.

Suppose that $|\hat{\mu_X}(u)| = |\hat{\mu_X}(v)| = 1$ for some $u,v \in \mathbb{R}^*$, with $uv^{-1} \not \in \mathbb{Q}$. We want to show that $X$ is a.s. constant.

My solution: if $|\hat{\mu_X}(u)| = 1$, it is not difficult to show that the image of $uX$ (on a set of full measure) lies in $\theta + 2 \pi \mathbb{Z}$ for some $\theta \in \mathbb{R}$. Hence, the image of $X$ (almost surely) lies in:

$$ \bigg( \frac{\theta_1}{u} + \frac{2 \pi}{u} \mathbb{Z} \bigg) \cap \bigg( \frac{\theta_2}{v} + \frac{2 \pi}{v} \mathbb{Z} \bigg)$$

for some $\theta_1, \theta_2 \in \mathbb{R}$. This intersection contains at most one point: if $x \neq y$ both belonged to it, then $x - y$ would be a nonzero element of $\frac{2\pi}{u}\mathbb{Z} \cap \frac{2\pi}{v}\mathbb{Z}$, which would force $uv^{-1} \in \mathbb{Q}$. So $X$ is almost surely constant.
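As a quick numerical illustration of this (not part of the argument; the values $u = 1$, $v = \sqrt{2}$, $\theta_1 = 0.3$ and the two-point law are arbitrary choices of mine), a non-constant variable supported on the first lattice cannot also satisfy $|\hat{\mu_X}(v)| = 1$:

```python
import numpy as np

# Hypothetical choices for illustration: u = 1, v = sqrt(2), theta1 = 0.3.
u, v, theta1 = 1.0, np.sqrt(2.0), 0.3

# X supported on two points of (theta1/u) + (2*pi/u) * Z, with equal weights.
support = np.array([theta1 / u, theta1 / u + 2 * np.pi / u])
weights = np.array([0.5, 0.5])

def char_fn(t):
    """Characteristic function E[exp(i t X)] of the discrete law above."""
    return np.sum(weights * np.exp(1j * t * support))

print(abs(char_fn(u)))  # = 1: X lies in a u-lattice
print(abs(char_fn(v)))  # < 1: X cannot also lie in a v-lattice unless constant
```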

Question: Is this proof correct? I'm slightly unsure because the question gives the hint to consider "an independent copy of $X$", so maybe an argument like this is expected, but I don't see how it works here, and if the above is correct, then it is surely simpler. I was also wondering if it is relevant that $\langle u, v \rangle \le (\mathbb{R}, +)$ is dense, but I couldn't think of how that might be applied either.

I would be interested to see any other arguments which can be used to solve this question.

Best answer:

We first prove a lemma.

Lemma: Let $X$ be such that $|\hat{\mu_X}(u)| = 1$ for some $u \neq 0$. Then $u$ is a period of $|\hat{\mu_X}|$.

Proof

Note that we have the inequality: $$|\mathbb{E} [e^{itX}]| \le \mathbb{E}[|e^{itX}|] = 1$$

In order for equality to hold, say at $t = u$, we must have:

$$e^{iuX} = e^{i \theta} \; \; \text{a.s.}$$

for some $\theta \in \mathbb{R}$. Indeed, write $\mathbb{E}[e^{iuX}] = e^{i\theta}$; then $\mathbb{E}[1 - \cos(uX - \theta)] = 0$, and since the integrand is nonnegative, it must vanish almost surely. We deduce that:

$$uX(\omega) \in \theta + 2 \pi \mathbb{Z} \;\; \text{a.s.}$$

Now observe that, since $X$ is (almost surely) supported on the countable set $\{(\theta + 2k\pi)/u : k \in \mathbb{Z}\}$, we can write:

$$\hat{\mu_X}(t) = \mathbb{E}[e^{itX}] = \sum_{k \in \mathbb{Z}} \exp\bigg(it\,\frac{\theta + 2 k\pi}{u}\bigg) \mathbb{P}(uX = \theta + 2 k\pi) $$

$$\implies \hat{\mu_X}(t +u) = e^{i \theta}\hat{\mu_X}(t)$$

(replacing $t$ by $t + u$ multiplies the $k$-th term by $e^{i(\theta + 2k\pi)} = e^{i\theta}$)

$$ \implies |\hat{\mu_X}(t)| = |\hat{\mu_X}(t+u)|$$

Result follows. $\square$
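The lemma can be spot-checked numerically (a sketch; the values of $u$, $\theta$ and the weights below are arbitrary choices, not from the original post):

```python
import numpy as np

u, theta = 1.7, 0.4  # hypothetical values
ks = np.arange(-3, 4)
support = (theta + 2 * np.pi * ks) / u        # points of (theta + 2*pi*Z)/u
weights = np.exp(-ks.astype(float) ** 2)      # arbitrary positive weights
weights /= weights.sum()

def char_fn(t):
    """Characteristic function of the lattice-supported law above."""
    return np.sum(weights * np.exp(1j * t * support))

# |char_fn| should be u-periodic; moreover char_fn(t + u) = e^{i theta} char_fn(t).
for t in np.linspace(-5, 5, 7):
    assert abs(abs(char_fn(t + u)) - abs(char_fn(t))) < 1e-10
    assert abs(char_fn(t + u) - np.exp(1j * theta) * char_fn(t)) < 1e-10
print("periodicity verified")
```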

Proposition: Let $X$ be a random variable, and let $\hat{\mu_X}$ be its characteristic function. Suppose that $|\hat{\mu_X}(u)| = |\hat{\mu_X}(v)| = 1$ for some $u,v \in \mathbb{R}^*$, with $uv^{-1} \not \in \mathbb{Q}$. Then, $X$ is a.s. constant.

Proof

By the conditions on $u,v$, the subgroup $\langle u,v \rangle \le (\mathbb{R},+)$ is dense in $\mathbb{R}$: a subgroup of $(\mathbb{R},+)$ is either cyclic or dense, and $u\mathbb{Z} + v\mathbb{Z}$ cannot be cyclic, since $u = ma$, $v = na$ for a generator $a$ and integers $m,n$ would force $uv^{-1} = m/n \in \mathbb{Q}$. Consider the random variable $Z := X -X'$, where $X'$ is an independent copy of $X$. Then:

$$\hat{\mu_Z}(t) = \mathbb{E}[e^{itZ}]$$

$$= \mathbb{E}[e^{it(X-X')}]$$ $$\stackrel{\text{indep.}}{=} \mathbb{E}[e^{itX}] \cdot \mathbb{E}[e^{-itX'}]$$ $$ = \hat{\mu_X}(t) \overline{\hat{\mu_X}(t)}$$ $$ = |\hat{\mu_X}(t)|^2$$

By the lemma, both $u$ and $v$ are periods of $|\hat{\mu_X}|$, so $\hat{\mu_Z} = |\hat{\mu_X}|^2$ equals $1$ at every point of the dense subgroup $\langle u,v \rangle$ (it equals $1$ at $0$, hence at every $mu + nv$). But note that, as the Fourier transform of a probability measure, $\hat{\mu_Z}$ is continuous! Thus, we must have $\hat{\mu_Z} \equiv 1$.
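The density of $u\mathbb{Z} + v\mathbb{Z}$ used here can be illustrated by brute force (a sketch; the target value and search range are arbitrary choices of mine):

```python
import numpy as np

u, v = 1.0, np.sqrt(2.0)   # u/v irrational
target = 0.123456

# Search for m*u + n*v close to the target over a finite box of integers.
best = min(
    (abs(m * u + n * v - target), m, n)
    for m in range(-200, 201)
    for n in range(-200, 201)
)
print(best)  # the gap shrinks as the search range grows
```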

Since characteristic functions uniquely characterise the laws of their underlying random variables, we may deduce that $Z$ is a.s. constant with value $0$. In particular, $X = X'$ a.s.
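The symmetrisation identity $\hat{\mu_Z} = |\hat{\mu_X}|^2$ can also be checked directly on a toy example (hypothetical discrete law; support and weights are arbitrary):

```python
import numpy as np

support = np.array([-1.0, 0.5, 2.0])   # arbitrary support points
weights = np.array([0.2, 0.5, 0.3])    # arbitrary probabilities

def phi_X(t):
    """Characteristic function of X."""
    return np.sum(weights * np.exp(1j * t * support))

def phi_Z(t):
    """Characteristic function of Z = X - X', with X' an independent copy."""
    total = 0j
    for a, wa in zip(support, weights):
        for b, wb in zip(support, weights):
            total += wa * wb * np.exp(1j * t * (a - b))
    return total

for t in np.linspace(-3, 3, 5):
    assert abs(phi_Z(t) - abs(phi_X(t)) ** 2) < 1e-12
print("phi_Z == |phi_X|^2 verified")
```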

To conclude, fix $c \in \mathbb{R}$. Since $X = X'$ a.s., we have:

$$\mathbb{P}(X \le c) = \mathbb{P}(X \le c, X' \le c)$$

$$\stackrel{\text{indep.}}{=} \mathbb{P}(X \le c) \cdot \mathbb{P}(X' \le c)$$

$$ = \mathbb{P}(X \le c)^2$$

$$ \implies \mathbb{P}(X \le c) = 0 \text{ or } \mathbb{P}(X \le c) = 1$$

This holds for all $c \in \mathbb{R}$.

Hence, let $c^* := \inf \{c : \mathbb{P}(X \le c) = 1 \}$, which is finite since $\mathbb{P}(X \le c) \to 1$ as $c \to \infty$ and $\to 0$ as $c \to -\infty$. By right-continuity of the distribution function, $\mathbb{P}(X \le c^*) = 1$, while $\mathbb{P}(X \le c) = 0$ for every $c < c^*$. Hence $X = c^*$ a.s. $\square$