Convergence of sum of MLE's over all possible $0/1$ sequences


Fix $N\in\mathbb N$ and take $\Theta=\{\frac1{N+1},\ldots,\frac N{N+1}\}$. For $y^T\in\{0,1\}^T$, let $\hat\theta(y^T)$ be the number in $\Theta$ that is closest to $\frac1T\sum_{i=1}^Ty^T_i$. Let $\sum_{i=1}^Ty^T_i=:k(y^T)$, and let $p_\theta(y^T)=\theta^{k(y^T)}(1-\theta)^{T-k(y^T)}$. In my lecture notes I read that $$\sum_{y^T\in\{0,1\}^T}p_{\hat{\theta}(y^T)}(y^T)\to N,$$ as $T\to\infty$, by the law of large numbers. I have not been able to figure out exactly why this holds. My first problem is that I don't even see how the LLN comes into play here. Any help on this is much appreciated.


Accepted answer:

Denote $Y_k = \{y^T: \hat\theta(y^T) = \frac{k}{N+1}\}$ for $k=1,\ldots,N$; these sets partition $\{0,1\}^T$. Note that $$ \sum_{y^T\in Y_k} p_{\hat \theta(y^T)}(y^T) = \sum_{y^T\in Y_k} p_{k/(N+1)}(y^T). $$ This is nothing other than the probability that, for a sequence of $T$ Bernoulli trials with success probability $\frac{k}{N+1}$, the empirical success ratio is closer to $\frac{k}{N+1}$ than to any other point of $\Theta$; in particular, this event contains the event that the ratio lies within distance $\frac1{2(N+1)}$ of $\frac{k}{N+1}$. By the law of large numbers the empirical success ratio converges (in probability) to the true success probability $\frac{k}{N+1}$, so $$ \sum_{y^T\in Y_k} p_{k/(N+1)}(y^T)\to 1, \quad T\to\infty. $$ Summing over the partition, $$ \sum_{y^T\in \{0,1\}^T} p_{\hat\theta(y^T)}(y^T) = \sum_{k=1}^N\sum_{y^T\in Y_k} p_{k/(N+1)}(y^T)\to N, \quad T\to\infty. $$