Does pointwise convergence of estimator imply consistency


Let $n \in \mathbb N$ and $\Omega=\mathbb N^{n}, \mathcal{F}=2^{\Omega},\mathcal{P}:=\{P_{\vartheta}:=\operatorname{Geom}(\vartheta)^{\otimes n}:0<\vartheta<1\}$

Find the estimator $\hat{\vartheta}:\Omega\to (0,\infty)$ such that $\forall \omega \in \Omega:P_{\hat{\vartheta}(\omega)}(\{\omega\})=\max_{\vartheta}P_{\vartheta}(\{\omega\})$, using the function $f: \vartheta \mapsto \log{(P_{\vartheta}(\{\omega\}))}$

And then show that the estimator is consistent.

My idea:

$\log({P_{\vartheta}(\{\omega\})})=\log(\prod_{i=1}^{n}(1-\vartheta)^{\omega_{i}-1}\vartheta)$

Then define $S:=\sum_{i=1}^{n}\omega_{i}$ and see that

$\log(\prod_{i=1}^{n}(1-\vartheta)^{\omega_{i}-1}\vartheta)=\log((1-\vartheta)^{S-n}\vartheta^n)=(S-n)\log(1-\vartheta)+n\log(\vartheta)$

It follows that $f'(\vartheta)=\frac{n}{\vartheta}-\frac{S-n}{1-\vartheta}$ and $f'(\vartheta)=0 \iff \vartheta = \frac{n}{S}$, and since $f''(\vartheta)<0$, the function is maximized at $\vartheta = \frac{n}{S}$.
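(Writing out the algebra behind that equivalence, for completeness:)

$$f'(\vartheta)=0 \iff \frac{n}{\vartheta}=\frac{S-n}{1-\vartheta} \iff n(1-\vartheta)=(S-n)\vartheta \iff n=S\vartheta \iff \vartheta=\frac{n}{S}.$$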

So our estimator is $\hat{\vartheta}=\frac{n}{S}$.
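As an empirical sanity check (not a proof of anything), one can simulate $n$ i.i.d. $\operatorname{Geom}(\vartheta)$ draws and watch $\hat{\vartheta}=n/S$ approach $\vartheta$ as $n$ grows. The following Python sketch uses only the standard library; the helper names `geometric_sample` and `simulate_mle` are my own, not from the problem.

```python
import random

def geometric_sample(theta, rng):
    """Draw from Geom(theta) on {1, 2, ...}: count Bernoulli(theta) trials
    until the first success."""
    k = 1
    while rng.random() >= theta:
        k += 1
    return k

def simulate_mle(theta, n, seed=0):
    """Return the MLE n / S for one sample omega_1, ..., omega_n."""
    rng = random.Random(seed)
    S = sum(geometric_sample(theta, rng) for _ in range(n))
    return n / S

# n / S should get close to the true theta as n increases
for n in (10, 1_000, 100_000):
    print(n, simulate_mle(0.3, n))
```

By the strong law of large numbers $S/n$ concentrates around $\mathbb{E}[\omega_i]=1/\vartheta$, which is why the printed estimates stabilize near $0.3$ for large $n$.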

Now onto my actual problem: showing that an estimator is consistent. My understanding of a consistent estimator $\hat{\vartheta}$ of $\vartheta$ is that

For any $P_{\vartheta} \in \mathcal{P}$, $\hat{\vartheta}\xrightarrow{n \to \infty}\vartheta(P_{\vartheta})$

But how can I test whether $\hat{\vartheta}$ converges to a parameter if I do not know what parameter $\vartheta$ is supposed to be?

Additional questions:

$1.$ Does my definition of consistent estimator: $\hat{\vartheta}\xrightarrow{n \to \infty}\vartheta(P)$ mean that $\hat{\vartheta}$ converges to $\vartheta(P)$ pointwise and thereby almost everywhere?

$2.$ Since I am supposed to choose any $P_{\vartheta}\in \mathcal{P}$, my probability measure already depends on my choice of parameter $\vartheta$, so therefore I cannot choose any $P \in \mathcal{P}$, can I?

$3.$ Does pointwise convergence of an estimator imply that the estimator is consistent?