In which case is $|x| = x^2$?


I had an exercise in which I had to derive the law of large numbers. I couldn't solve it, because the solution used the following equivalence:

$|x| = x^2$

To be more specific: let $Z_{1}, Z_{2}, \ldots$ be a sequence of independent $p$-coin flips and let $X_{n} = Z_{1}+\cdots+Z_{n}$, so that $X_{n} \sim \operatorname{Bin}(n,p)$. For $\epsilon > 0$, the solution claims $$P\left(\left|\frac{Z_{1}+\cdots+Z_{n}}{n}-p\right| \geq \epsilon\right)=P\left(\left(\frac{X_{n}}{n}-p\right)^2 \geq \epsilon\right)\leq\frac{1}{\epsilon}\cdot\frac{pq}{n}, \quad\text{and hence}\quad \lim_{n\to\infty}\frac{1}{\epsilon}\cdot\frac{pq}{n} = 0,$$ where $q = 1-p$.

My question now: in which case is $|x| = x^2$ correct, and why?


There are 3 answers below.

BEST ANSWER

This is not an answer, but the formulae in the question are bothersome to me.

Chebyshev's inequality states that for a random variable $X$ with finite expectation $\mu $, finite variance $\operatorname{var} X$ and $\epsilon>0$ that $P[|X-\mu| \ge \epsilon] \le { \operatorname{var} X \over \epsilon^2} $.

If $Z_k$ are independent coin flips with probability $p$ of getting a head then $\mu= E Z_k = p$ and $\operatorname{var} Z_k = p(1-p)$.

Then $E [{1 \over n}(Z_1+\cdots+Z_n)] = \mu = p$ and $\operatorname{var} {1 \over n}(Z_1+\cdots+Z_n) = {1 \over n}\operatorname{var} Z_k = {p (1-p) \over n} $.

Now substitute into Chebyshev's inequality to get $P[|{1 \over n}(Z_1+\cdots+Z_n) - p| \ge \epsilon] \le {{p (1-p) \over n} \over \epsilon^2}$ and we see that $\lim_{n \to \infty} P[|{1 \over n}(Z_1+\cdots+Z_n) - p| \ge \epsilon] = 0$
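The bound can be checked numerically. The following Python sketch (my own addition, not part of the original answer) computes $P\left[\left|\frac1n(Z_1+\cdots+Z_n)-p\right| \ge \epsilon\right]$ exactly from the binomial pmf and compares it with the Chebyshev bound $\frac{p(1-p)}{n\epsilon^2}$; the values of $p$ and $\epsilon$ are arbitrary choices:

```python
from math import comb

def binom_pmf(k, n, p):
    # P(X_n = k) for X_n ~ Bin(n, p)
    return comb(n, k) * p**k * (1 - p)**(n - k)

def tail_prob(n, p, eps):
    # Exact P(|X_n/n - p| >= eps), summing the binomial pmf
    return sum(binom_pmf(k, n, p) for k in range(n + 1)
               if abs(k / n - p) >= eps)

p, eps = 0.4, 0.1
for n in (10, 100, 1000):
    bound = p * (1 - p) / (n * eps**2)
    print(n, tail_prob(n, p, eps), bound)
```

The exact tail probability stays below the Chebyshev bound, and both tend to $0$ as $n$ grows, which is exactly the weak law of large numbers for coin flips.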


Assuming $x\in \mathbb R$, only $x=0$, $x=-1$ and $x=1$. This happens because

  • if $x\ge0$, $|x|=x$, and so the equation becomes $x=x^2$, or $0=x(x-1)$;
  • if $x<0$, $|x|=-x$, and the equation is now $-x=x^2$, or $0=x(x+1)$.

Since those are all possible cases, the set of solutions is $\{-1,0,1\}$.
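A quick brute-force check of this solution set in Python (just an illustration; the candidate list is my own choice):

```python
# Brute-force check of |x| = x**2 over a few candidate real values
candidates = [-2, -1, -0.5, 0, 0.5, 1, 2]
solutions = [x for x in candidates if abs(x) == x**2]
print(solutions)  # only -1, 0 and 1 satisfy the equation
```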

Anyway, that's not the case for $\frac{X_n}n-p$. For instance, let $n=2$ and $p=0.4$, so that $X_n\sim \operatorname{Bin}(2, 0.4)$. Then $\frac {X_n}n$ is one of the elements of $\{0,\tfrac12, 1\}$, so $\left|\frac{X_n}n-p\right| \in \{0.4, 0.1, 0.6\}$ and $\left(\frac{X_n}n-p\right)^2 \in \{0.16, 0.01, 0.36\}$. So if $\epsilon=0.2$, then $$\left|\frac{X_n}n-p\right| \ge \epsilon \quad \iff \quad \frac{X_n}n \in \{0,1\}$$ and $$\left(\frac{X_n}n-p\right)^2 \ge \epsilon \quad \iff \quad \frac{X_n}n \in \{1\},$$ and hence $$P\left(\left|\frac{X_n}n-p\right| \ge \epsilon\right)>P\left(\left(\frac{X_n}n-p\right)^2 \ge \epsilon \right).$$
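For concreteness, the same computation can be done in Python (my own check, not part of the original answer), using the exact binomial pmf for $n=2$, $p=0.4$, $\epsilon=0.2$:

```python
from math import comb

n, p, eps = 2, 0.4, 0.2
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

# Probabilities of the two events being compared
p_abs = sum(w for k, w in pmf.items() if abs(k / n - p) >= eps)   # |X_n/n - p| >= eps
p_sq  = sum(w for k, w in pmf.items() if (k / n - p)**2 >= eps)   # (X_n/n - p)^2 >= eps
print(p_abs, p_sq)
```

The first probability is $0.36 + 0.16 = 0.52$ while the second is only $0.16$, confirming that the two events are not the same.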

So the claim is not true as stated. I imagine there is some additional assumption on $\epsilon$ somewhere, such as it being "small enough" (although that only makes sense for fixed $n$). Note also that $|x| \ge \epsilon \iff x^2 \ge \epsilon^2$ always holds, so the intended step may have been $P\left(\left(\frac{X_n}n-p\right)^2 \ge \epsilon^2\right) \le \frac{pq}{n\epsilon^2}$, which is just Chebyshev's inequality. It is also true that for fixed $\epsilon$, both probabilities tend to $0$ as $n$ grows. Maybe you should check again to see whether you overlooked some important remark in the text.


For a non-negative number $x$, $$ |x| = x^2$$ is the same as $$x = x^2,$$ so $x=0$ or $x=1$.

For negative $x$, it is the same as $$-x=x^2$$ with $x=-1$ a solution in negative numbers.

Therefore the only possible solutions are $x=0$, $x=1$ and $x=-1$.