Random equation


We want to prove the following proposition:

Let $X$ and $Y$ be two independent and identically distributed random variables with variance $\sigma^2.$ Let $\alpha,\beta \in \mathbb{R}$ be such that $\alpha\beta \neq0$ and $\alpha^2+\beta^2=1.$ Suppose that $\alpha X+\beta Y$ and $X$ have the same distribution. Prove that $X$ is normal $N(0,\sigma^2).$

Following these steps: let $\phi$ be the characteristic function of $X,$ and $\psi(x)=\frac{\phi'(x)}{\phi(x)}.$

1) Prove that $\psi$ is a $C^1$ function, compute $\psi(0)$ and $\psi'(0).$

2) Let $W$ be a random variable with law $P_W=\alpha^2\delta_{\alpha}+\beta^2\delta_{\beta}.$ Prove that $$\forall x \in \mathbb{R},\quad \psi'(x)=E[\psi'(Wx)].$$

3) Deduce $\psi'=-\sigma^2.$

4) Conclude.

Attempt:

1) Since $E[X^2]<+\infty,$ $\phi$ is a $C^2$ function, and hence $\psi$ is $C^1$ (wherever $\phi\neq0$). Taking expectations in the distributional identity gives $(\alpha+\beta-1)E[X]=0.$ Moreover $\alpha+\beta\neq1$: otherwise $1=(\alpha+\beta)^2=\alpha^2+\beta^2+2\alpha\beta=1+2\alpha\beta$ would force $\alpha\beta=0.$ Hence $E[X]=0,$ so $\psi(0)=\frac{\phi'(0)}{\phi(0)}=iE[X]=0$ and $\psi'(0)=\frac{\phi''(0)\phi(0)-\phi'(0)^2}{\phi(0)^2}=-E[X^2]=-\sigma^2.$

2) Since $\alpha X+\beta Y$ and $X$ have the same distribution and $X,Y$ are independent, we have $\forall x \in \mathbb{R},\ \phi(\alpha x)\phi(\beta x)=\phi(x).$ Differentiating twice, $$\alpha\phi'(\alpha x)\phi(\beta x)+\beta \phi(\alpha x)\phi'(\beta x)=\phi'(x)$$ and $$\alpha^2\phi''(\alpha x)\phi(\beta x)+2\alpha \beta \phi'(\alpha x)\phi'(\beta x)+\beta^2\phi(\alpha x)\phi''(\beta x)=\phi''(x).$$ Therefore $$\phi''(x)\phi(x)-\phi'(x)^2 = \alpha^2\phi(\beta x)^2\bigl[\phi''(\alpha x)\phi(\alpha x)-\phi'(\alpha x)^2\bigr] + \beta^2\phi(\alpha x)^2\bigl[\phi''(\beta x)\phi(\beta x)-\phi'(\beta x)^2\bigr],$$ the cross terms $2\alpha\beta\phi'(\alpha x)\phi'(\beta x)\phi(\alpha x)\phi(\beta x)$ cancelling. Dividing by $\phi(x)^2=\phi(\alpha x)^2\phi(\beta x)^2$ gives $$\psi'(x)=\alpha^2\psi'(\alpha x)+\beta^2\psi'(\beta x)=E[\psi'(Wx)].$$
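The cancellation of the cross terms in step 2 is purely algebraic and holds for any two smooth functions in place of $\phi(\alpha\,\cdot)$ and $\phi(\beta\,\cdot)$. A quick numerical sanity check of this algebra (with arbitrary smooth test functions $g,h$ standing in for $\phi$, and $\alpha=0.6,\ \beta=0.8$ chosen so that $\alpha^2+\beta^2=1$ — all of these choices are mine, not from the problem):

```python
import math

# Arbitrary smooth positive test functions standing in for phi,
# with first and second derivatives written out by hand.
g   = lambda t: math.cos(t) + 2.0        # g > 0 everywhere
dg  = lambda t: -math.sin(t)
ddg = lambda t: -math.cos(t)
h   = lambda t: math.exp(t)
dh  = h
ddh = h

alpha, beta = 0.6, 0.8                   # alpha^2 + beta^2 = 1
x = 0.7                                  # arbitrary evaluation point

# f(x) = g(alpha x) h(beta x) plays the role of phi(x) = phi(alpha x) phi(beta x)
f   = g(alpha*x) * h(beta*x)
df  = alpha*dg(alpha*x)*h(beta*x) + beta*g(alpha*x)*dh(beta*x)
ddf = (alpha**2*ddg(alpha*x)*h(beta*x)
       + 2*alpha*beta*dg(alpha*x)*dh(beta*x)
       + beta**2*g(alpha*x)*ddh(beta*x))

# "psi'" built from each function: (u'' u - u'^2) / u^2
lhs = (ddf*f - df**2) / f**2
rhs = (alpha**2 * (ddg(alpha*x)*g(alpha*x) - dg(alpha*x)**2) / g(alpha*x)**2
       + beta**2 * (ddh(beta*x)*h(beta*x) - dh(beta*x)**2) / h(beta*x)**2)

print(abs(lhs - rhs))                    # agrees to floating-point precision
```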

4) By part 3), $\psi'(x)=-\sigma^2$ for all $x.$ Since $\psi(0)=0,$ integrating gives $\psi(x)=-\sigma^2x,$ i.e. $\phi'(x)=-\sigma^2x\,\phi(x).$ With $\phi(0)=1,$ this yields $\phi(x)=e^{-\frac{1}{2}\sigma^2x^2},$ the characteristic function of $N(0,\sigma^2),$ so $X\sim N(0,\sigma^2)$ by uniqueness of characteristic functions.
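As a sanity check (not part of the proof), one can verify numerically that the Gaussian law is indeed a fixed point of the map $(X,Y)\mapsto\alpha X+\beta Y$: with $X,Y$ i.i.d. $N(0,\sigma^2)$ and $\alpha^2+\beta^2=1$, the low moments of $\alpha X+\beta Y$ match those of $X$. A minimal simulation, with $\alpha=0.6,\ \beta=0.8,\ \sigma=1.5$ picked arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta, sigma = 0.6, 0.8, 1.5       # alpha^2 + beta^2 = 1
n = 1_000_000

X = rng.normal(0.0, sigma, size=n)
Y = rng.normal(0.0, sigma, size=n)
Z = alpha*X + beta*Y                     # should again be N(0, sigma^2)

for k in (1, 2, 3, 4):
    # moments of Z match those of X up to Monte Carlo error
    print(k, np.mean(X**k), np.mean(Z**k))
```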

I am having a problem with part 3): how does one deduce that $\psi'(x)=-\sigma^2$? Also, $\psi$ is defined by dividing by $\phi$, so why do we have $\phi(x) \neq 0$ for all $x \in \mathbb{R}$?


There are 2 best solutions below

On BEST ANSWER

Suppose for a contradiction that there exists $t_0$ such that $\phi(t_0)=0$. Since $\phi(t)=\phi(\alpha t)\phi(\beta t)$, there exists $t_1\in\{\alpha t_0,\beta t_0\}$ such that $\phi(t_1) = 0$. Inductively, we may then construct a sequence $\{t_n\}$ with $t_n\in\{\alpha t_{n-1},\beta t_{n-1}\}$ such that $\phi(t_n)=0$ for all $n$. Since $\alpha^2 + \beta^2 = 1$ and $\alpha\beta\neq0$, both $\alpha^2<1$ and $\beta^2<1$, so $r:=\max\{|\alpha|,|\beta|\}<1$; clearly $|t_n| \le r|t_{n-1}|$, so $|t_n|\le r^n|t_0|\to0$ as $n\to\infty$. But since $\phi$ is continuous at zero, we then have $1=\phi(0)=\lim_{n\to\infty}\phi(t_n)=0$, a contradiction. Hence $\phi$ is never zero, so $\psi$ is continuously differentiable.

The proof that $\psi'=-\sigma^2$ is quite similar. Suppose for a contradiction that there exists $t_0$ such that $\psi'(t_0)\neq-\sigma^2$. Then for some $\epsilon>0$, assume without loss of generality that $\operatorname{Re}\psi'(t_0) \ge -\sigma^2+\epsilon$ (we can repeat the exact same argument for $\operatorname{Re}\psi'(t_0) \le -\sigma^2-\epsilon$, $\operatorname{Im}\psi'(t_0) \ge \epsilon$ and $\operatorname{Im}\psi'(t_0) \le -\epsilon$). Observe that $\psi'(t_0)=\alpha^2\psi'(\alpha t_0) + \beta^2\psi'(\beta t_0)$ is a convex combination of $\psi'(\alpha t_0)$ and $\psi'(\beta t_0)$, and so at least one of $\psi'(\alpha t_0),\psi'(\beta t_0)$ must have real part no smaller than $-\sigma^2+\epsilon$. That is, we have found $t_1\in\{\alpha t_0,\beta t_0\}$ such that $\operatorname{Re}\psi'(t_1) \ge -\sigma^2+\epsilon$. So once again we have a sequence $\{t_n\}$ with $|t_n|\le r|t_{n-1}|$ satisfying $\operatorname{Re}\psi'(t_n)\ge-\sigma^2+\epsilon$ for all $n$. Since $t_n\to0$ as $n\to\infty$ and $\psi'$ is continuous, we find $-\sigma^2=\operatorname{Re}\psi'(0)=\lim_{n\to\infty}\operatorname{Re}\psi'(t_n) \ge -\sigma^2+\epsilon$, a contradiction. Hence $\psi'(t)=-\sigma^2$ for all $t$, completing the proof.


Okay, so I commented that you could probably do this the same way you prove Cramér's decomposition theorem, but you can actually get this from the central limit theorem. Without loss of generality, assume that $\sigma = 1$. I'll write the proof in the case $\alpha = \beta = 1/\sqrt{2}$, and then you may be able to see how to generalize it (although it seems nasty).

Let $X_j$ be i.i.d. copies of $X$. Then notice that for each $n$, $$\frac{1}{2^{n/2}}\sum_{j = 1}^{2^n} X_j = X $$ in distribution, by iterating the given identity. As $n \to \infty$, the left-hand side converges in distribution to a standard normal by the central limit theorem (recall $E[X]=0$ and $\operatorname{Var} X = 1$); since it is equal in distribution to $X$ for every $n$, $X$ itself must be standard normal.
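To see the CLT mechanism at work, here is an illustration (my own, not part of the answer) with a stand-in non-Gaussian law: $X_j$ uniform on $[-\sqrt3,\sqrt3]$, which has mean $0$ and variance $1$. The rescaled sum of $2^n$ copies already has nearly Gaussian moments for $n=8$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8                                    # 2^n = 256 summands
m = 20_000                               # Monte Carlo samples
half = np.sqrt(3.0)                      # uniform on [-sqrt(3), sqrt(3)]: variance 1

# S = 2^{-n/2} * (sum of 2^n i.i.d. copies), as in the iterated identity
draws = rng.uniform(-half, half, size=(2**n, m))
S = draws.sum(axis=0) / 2**(n/2)

print(S.var())                           # close to 1
print(np.mean(S**4))                     # close to 3, the Gaussian fourth moment
```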


EDIT: If the symmetric case is too easy, then consider instead the identity obtained by iterating $\alpha X + \beta Y = X$ (in distribution) $n$ times: $$\sum_{i_1,\ldots,i_n \in \{0,1\}} \alpha^{\sum_j i_j} \beta^{n - \sum_j i_j} X_{i_1,\ldots,i_n} = X$$ in distribution, where the $X_{i_1,\ldots,i_n}$ are i.i.d. copies of $X$,

and use the Lindeberg central limit theorem.