Understanding "For every $\theta \in \Theta$" in the definition of weak consistency of an estimator


The definition I have in mind is: A sequence of estimators $W_n$ is weakly consistent for $\theta$ if, for all $\epsilon > 0$ and all $\theta \in \Theta$, $$P_{\theta}\left[\lvert W_n - \theta \rvert > \epsilon \right] \rightarrow 0.$$ I am confused about the fact that this convergence in probability must hold for all $\theta \in \Theta$. Or, as Wikipedia puts it, "the convergence in probability must take place for every possible value of this parameter".
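To make the quantifier concrete, here is a small Monte Carlo sketch (my own illustration, not part of the definition) using the sample mean $W_n = \bar X_n$ of a $N(\theta, 1)$ sample, which is weakly consistent for $\theta$. The point is that for each candidate true value $\theta$, the data are drawn from $P_\theta$ itself, and $P_{\theta}[\lvert W_n - \theta \rvert > \epsilon]$ is estimated under that same $P_\theta$:

```python
import numpy as np

# Sketch: check weak consistency of the sample mean W_n for the mean
# theta of a N(theta, 1) sample. For each theta in (a few points of)
# the parameter space, simulate under P_theta and estimate
# P_theta[|W_n - theta| > eps] for growing n.

rng = np.random.default_rng(0)
eps = 0.1
reps = 2000  # Monte Carlo replications per (theta, n) pair

for theta in [-2.0, 0.0, 3.5]:           # a few values of theta in Theta
    for n in [10, 100, 1000]:
        # reps independent samples of size n, each drawn from P_theta
        samples = rng.normal(theta, 1.0, size=(reps, n))
        w_n = samples.mean(axis=1)       # the estimator W_n under P_theta
        p_hat = np.mean(np.abs(w_n - theta) > eps)
        print(f"theta={theta:5.1f}  n={n:5d}  "
              f"P_theta[|W_n - theta| > eps] ~ {p_hat:.3f}")
```

For every $\theta$ the estimated probability shrinks toward $0$ as $n$ grows; the subscript on $P_\theta$ changes along with the $\theta$ inside the absolute value, which is exactly what the "for every $\theta \in \Theta$" quantifier requires.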

In my mind, if $P_{\theta}[\lvert W_n - \theta \rvert > \epsilon ] \rightarrow 0$, then $W_n$ converges to $\theta$. So if I considered some other $\theta' \in \Theta$ and $\epsilon' < \theta - \theta'$, then $\lvert W_n - \theta' \rvert > \epsilon'$, so wouldn't $P_{\theta}[\lvert W_n - \theta' \rvert > \epsilon' ] \rightarrow 1$? (I think that last step is where my error is.)

I'm looking for someone to clarify the definition or my error.

Thank you.