Theorems for i.i.d. [2]


Let $(X_j)$ be an i.i.d. sequence of random variables taking values $\pm 1$, with $P(X_1 = 1) = p$. Let $Y_n$ be the number of series (maximal runs) of constant sign among $X_1, \dots, X_n$. Prove that $\lim\limits_{n \to \infty} \frac{Y_n}{n}$ exists and find it.


There are 2 solutions below.

BEST ANSWER

I suspect $Y_n$ is defined to be the number of series of constant sign up to the $n^{\text{th}}$ step. (I will use $0$ instead of $-1$ for the rest of this answer.) For instance, for the sequence $001110100$, $Y_5 = 2$.

Now, we can see that this forms a renewal process: a renewal occurs whenever the sequence "resets", i.e., at the start of each new cycle consisting of a run of one sign followed by a run of the other. The renewal theorem states that the number of renewals per unit time converges to the reciprocal of the mean renewal length. Here the two runs have geometric lengths with means $\frac{1}{1-p}$ and $\frac{1}{p}$, so the mean cycle length is $E[X] = \frac{1}{p} + \frac{1}{1-p}$, and the limiting rate of renewals is its reciprocal, $\frac{1}{\frac{1}{p} + \frac{1}{1-p}}$.

Edit: The number above is the average number of renewals per unit time; to get the number of series, we have to multiply it by $2$ (since every renewal cycle contributes $2$ series), which gives $\frac{2}{\frac{1}{p} + \frac{1}{1-p}} = 2p(1-p)$.
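As a quick sanity check (not part of the original argument), one can simulate a long path and compare $Y_n/n$ with $2p(1-p)$. This is a Monte Carlo sketch in Python; the helper names `run_count` and `estimate_limit` are mine:

```python
import random

def run_count(xs):
    """Number of maximal runs (series) of constant sign in xs."""
    # Each sign change starts a new run; the first run is not preceded by one.
    return 1 + sum(1 for a, b in zip(xs, xs[1:]) if a != b)

def estimate_limit(p, n, seed=0):
    """Simulate one +/-1 path of length n with P(X=1)=p and return Y_n / n."""
    rng = random.Random(seed)
    xs = [1 if rng.random() < p else -1 for _ in range(n)]
    return run_count(xs) / n

p = 0.3
# Both values should be close to 2*p*(1-p) = 0.42 for large n.
print(estimate_limit(p, 200_000), 2 * p * (1 - p))
```

For $p = 0.3$ the empirical ratio lands within a fraction of a percent of $0.42$, consistent with the renewal-theorem answer.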

ANSWER

EDIT: I just realized the probabilities for $+1$ and $-1$ need not be equal. The approach below of course extends to the general case, but I restrict to $p=\frac{1}{2}$.

It seems that if $Y_n^{(1)} := \#\{2 \le k \le n : X_{k-1} = -1, X_k = 1\}$ and $Y_n^{(2)} := \#\{2 \le k \le n : X_{k-1} = 1, X_k = -1\}$, then your $Y_n$ equals $Y_n^{(1)}+Y_n^{(2)}+1$, since the number of series is one more than the number of sign changes. The extra $+1$ vanishes after dividing by $n$, so below I work with $Y_n = Y_n^{(1)}+Y_n^{(2)}$.

Since $Y_n$ is a random variable, it is unclear what you mean by $\lim_{n \to \infty} \frac{Y_n}{n}$.

If by $Y_n$ here you mean $E(Y_n)$, the expected value of $Y_n$, then by symmetry $\lim_{n \to \infty} \frac{E(Y_n)}{n} = 2\lim_{n \to \infty} \frac{E(Y_n^{(1)})}{n}$, and this is easy to calculate using linearity of expectation. For any fixed $k$, the probability that $X_{k-1}=-1$ and $X_k = 1$ is simply $\frac{1}{4}$, so $E(Y_n^{(1)}) = \frac{n-1}{4}$, and hence $\lim_{n \to \infty} \frac{E(Y_n)}{n} = 2 \cdot \frac{1}{4} = \frac{1}{2}$.
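The linearity-of-expectation value $E(Y_n^{(1)}) = \frac{n-1}{4}$ can be verified exactly for small $n$ by brute force over all $2^n$ equally likely sign sequences. A Python sketch (the function name is my own choice):

```python
from itertools import product
from fractions import Fraction

def exact_mean_up_crossings(n):
    """E[Y_n^(1)]: expected number of k in [2, n] with X_{k-1} = -1, X_k = +1,
    averaged over all 2^n equally likely +/-1 sequences (the p = 1/2 case)."""
    total = 0
    for xs in product([-1, 1], repeat=n):
        total += sum(1 for a, b in zip(xs, xs[1:]) if a == -1 and b == 1)
    return Fraction(total, 2 ** n)

n = 10
# Both print 9/4, matching (n-1)/4.
print(exact_mean_up_crossings(n), Fraction(n - 1, 4))
```

Using `Fraction` keeps the enumeration exact, so the agreement with $\frac{n-1}{4}$ is not a floating-point coincidence.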

If you mean the almost sure pointwise limit of $\frac{Y_n}{n}$, the question is a bit trickier. Once again, by symmetry, it suffices to study $\lim_{n \to \infty} \frac{2Y_n^{(1)}}{n}$. For $k \in [2,n]$, if we let $Z_k$ denote the indicator function of the event $\{X_{k-1}=-1, X_k = 1\}$, then $\frac{Y_n^{(1)}}{n} = \frac{1}{n}\sum_{k=2}^n Z_k$. The strong law of large numbers does not apply directly here, since the $Z_k$'s are not independent. However, we may use ergodic theory.

Let $\Omega = \{-1,1\}^{\mathbb{N}}$ with the product topology, let $\mathcal{B}$ be the Borel sigma-algebra, let $\mu$ be the $\frac{1}{2}$-$\frac{1}{2}$ product measure, and let $T : \Omega \to \Omega$ be the left shift. It is well known that $(\Omega,\mathcal{B},\mu,T)$ is an ergodic measure-preserving system. If we let $E = \{(X_1,X_2,\dots) \in \Omega : X_1 = -1, X_2 = 1\}$, then $Z_k = 1_E(T^{k-2}(X_1,X_2,X_3,\dots))$, so $\frac{1}{n}\sum_{k=2}^n Z_k$ is a Birkhoff average of $1_E$. By Birkhoff's ergodic theorem, $\lim_{n \to \infty} \frac{Y_n}{n} = 2\mu(E) = \frac{1}{2}$ almost surely. Furthermore, by von Neumann's ergodic theorem, $\frac{Y_n}{n}$ converges to $\frac{1}{2}$ in the $L^2(\Omega)$ norm as well.
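The almost-sure convergence guaranteed by Birkhoff's theorem can be illustrated numerically: along a single simulated path, the ergodic average of $1_E$ settles near $\mu(E) = \frac{1}{4}$. A Python sketch (the name `ergodic_average` is mine, and this is an illustration, not a proof):

```python
import random

def ergodic_average(n, seed=1):
    """Single-path Birkhoff average (1/n) * sum_{k=2}^n Z_k, where Z_k
    indicates the pattern X_{k-1} = -1, X_k = +1, for a fair sign sequence."""
    rng = random.Random(seed)
    xs = [rng.choice([-1, 1]) for _ in range(n)]
    z = sum(1 for a, b in zip(xs, xs[1:]) if (a, b) == (-1, 1))
    return z / n

# The averages drift toward mu(E) = 1/4 as n grows.
for n in [1_000, 100_000]:
    print(n, ergodic_average(n))
```

Doubling this average gives an empirical $\frac{Y_n}{n}$ near $\frac{1}{2}$, matching both the expectation computation and the renewal-theory answer at $p = \frac{1}{2}$.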