Probability theory - request for proof verification (further properties of characteristic functions)


Thanks to the user @uniquesolution, I believe I now understand a proof of a certain further property of characteristic functions. I have filled in the shortcuts in the reasoning that were too big for me, but I am not confident that the proof, with my additions, is correct in its current form.

Below is the lemma, together with its proof, concerning further properties of characteristic functions in probability theory.

Lemma: Let $X$ be a random variable and assume there exists $a\in\mathbb{R}$ such that $P(X=a)>\frac{1}{2}$. Prove that the characteristic function of $X$ cannot take the value $0$.

Proof: Let $\gamma$ be the characteristic function of $X$.

If $a=0$: Since there exists $p \in (\frac{1}{2},1]$ with $P(X=0)=p$, there is a probability measure $\mu$ such that $P_{X}=p\delta_{0}+(1-p)\mu$. Hence $(1-p)\mu = P_{X}-p\delta_{0}$, and integrating $e^{itx}$ against both sides gives $(1-p)\int e^{itx}\,d\mu(x) = \int e^{itx}\,dP_{X}(x) - p\int e^{itx}\,d\delta_{0}(x)$, i.e. $(1-p)\hat{\mu}(t)=\gamma(t)-p$. Therefore $|\gamma(t)-p|=(1-p)|\hat{\mu}(t)|\leq(1-p)\cdot 1 =1-p$, and by the reverse triangle inequality $|\gamma(t)| \geq p - |\gamma(t)-p| \geq p-(1-p)=2p-1>0$. So $|\gamma|$ is bounded below by $2p-1>0$ and $\gamma$ never vanishes.
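As a quick numerical sanity check of the bound $|\gamma(t)| \geq 2p-1$ in the case $a=0$, here is a minimal sketch with an illustrative choice of distribution (a two-point law, which is my assumption, not part of the post): $X=0$ with probability $p$ and $X=1$ with probability $1-p$, so $\gamma(t)=p+(1-p)e^{it}$.

```python
import numpy as np

# Illustrative two-point distribution (an assumption for this check):
# X = 0 with probability p, X = 1 with probability 1 - p, so
# gamma(t) = p * e^{it*0} + (1 - p) * e^{it*1} = p + (1 - p) * exp(i t).
p = 0.7
t = np.linspace(-10, 10, 4001)
gamma = p + (1 - p) * np.exp(1j * t)

lower_bound = 2 * p - 1          # = 0.4 here
# The modulus of gamma never drops below 2p - 1:
assert np.all(np.abs(gamma) >= lower_bound - 1e-12)
# The bound is attained at t = pi, where exp(i*pi) = -1:
assert abs(abs(p + (1 - p) * np.exp(1j * np.pi)) - lower_bound) < 1e-12
```

The second assertion shows the constant $2p-1$ is sharp for this distribution, since at $t=\pi$ the atom at $1$ contributes with a factor of $-1$.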

If $a \in \mathbb{R}\setminus \{0\}$, consider $Y=X-a$, which satisfies $P(Y=0)=p>\frac{1}{2}$. Its characteristic function is $\gamma_{Y}(t)=e^{-ita}\gamma(t)$, and analogously to the case $a=0$ we get $|\gamma(t)-pe^{ita}|=|\gamma(t)e^{-ita}-p|\leq 1-p$. Since multiplying by the unimodular factor $e^{-ita}$ does not change the modulus, the case $a=0$ applied to $Y$ gives $|\gamma(t)| = |\gamma_{Y}(t)| \geq 2p-1 > 0$, so again $\gamma$ is bounded away from $0$ $\blacksquare$
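The reduction to $a=0$ can also be checked numerically: a minimal sketch, again with an illustrative two-point law of my own choosing (not from the post), $X=a$ with probability $p$ and $X=a+1$ with probability $1-p$, so that $Y=X-a$ has its atom at $0$.

```python
import numpy as np

# Illustrative choice (an assumption): X = a w.p. p, X = a + 1 w.p. 1 - p.
p, a = 0.7, 2.5
t = np.linspace(-10, 10, 4001)

gamma_X = p * np.exp(1j * t * a) + (1 - p) * np.exp(1j * t * (a + 1))
gamma_Y = p + (1 - p) * np.exp(1j * t)   # Y = X - a has its atom at 0

# Shifting X by a multiplies gamma by the unimodular factor e^{-ita},
# so the modulus is unchanged and the a = 0 bound carries over:
assert np.allclose(np.abs(gamma_X), np.abs(gamma_Y))
assert np.all(np.abs(gamma_X) >= 2 * p - 1 - 1e-12)
```

The first assertion is exactly the observation $|\gamma(t)| = |e^{-ita}\gamma(t)| = |\gamma_{Y}(t)|$ used in the proof.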

Problem: I have read this proof nine times. The current version fills in what I suspected were overly large shortcuts in the reasoning, but I am still not sure whether it is now completely correct.

I would be extremely thankful for feedback!


There is 1 answer below.

On BEST ANSWER

$\hat{\mu}$ stands for the Fourier transform of $\mu$. To understand the solution to the problem, you should know that the characteristic function of a random variable is the Fourier transform of its distribution $P_{X}$, i.e. $\gamma(t)=\int e^{itx}\,dP_{X}(x)$. (Note that it is the distribution, not a probability density function, that is transformed; here $P_{X}$ has an atom at $a$, so no density exists.)
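To illustrate this identification, here is a minimal sketch (my own example, not part of the answer): a Monte Carlo estimate of $\gamma(t)=E[e^{itX}]$ for the mixture $P_{X}=p\delta_{0}+(1-p)\,N(0,1)$, compared with the exact transform $p+(1-p)e^{-t^{2}/2}$.

```python
import numpy as np

# Monte Carlo check that gamma(t) = E[e^{itX}] matches the Fourier
# transform of the distribution P_X = p*delta_0 + (1-p)*N(0,1).
# (The mixture and sample size are illustrative assumptions.)
rng = np.random.default_rng(0)
p, n = 0.7, 200_000
atom = rng.random(n) < p                            # with prob p, X = 0
X = np.where(atom, 0.0, rng.standard_normal(n))     # else X ~ N(0, 1)

t = 1.3
empirical = np.mean(np.exp(1j * t * X))             # sample average of e^{itX}
exact = p + (1 - p) * np.exp(-t**2 / 2)             # p + (1-p) * N(0,1) transform
assert abs(empirical - exact) < 0.01
```

The sampling error is of order $1/\sqrt{n}$, so the tolerance of $0.01$ is comfortable for $n=200{,}000$.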