Let $X_1,\dots,X_n$ be i.i.d. random variables following a geometric distribution with parameter $p$, supported on $\{1,2,\dots\}$. One estimator can be obtained from the second moment $E[X^{2}]=\frac{2-p}{p^2}$: solving $\frac{2-\hat p}{\hat p^{2}}=\frac{1}{n}\sum_{i=1}^{n}X_i^2$ for $\hat p$ gives $$\hat{p}_1=\frac{-1+\sqrt{1+\frac{8}{n}\sum_{i=1}^{n}X_i^2}}{\frac{2}{n}\sum_{i=1}^{n}X_i^2}.$$ Another family of estimators can be obtained by observing that $E[\mathbb{1}_{(k,+\infty)}(X_1)] = P[X_1>k]=(1-p)^{k}$, so solving $(1-\hat p)^k=\frac{1}{n}\sum_{i=1}^{n}\mathbb{1}_{(k,+\infty)}(X_i)$ yields $$\hat{p}_2 = 1- \left(\frac{1}{n}\sum_{i=1}^{n}\mathbb{1}_{(k,+\infty)}(X_i)\right)^{1/k}.$$ I want to determine whether $\hat{p}_1$ and $\hat{p}_2$ are biased and consistent, but this seems difficult since I am not able to work out the distribution of these estimators. Perhaps I have to use inequalities, but I am not sure how to proceed. Any hints would be much appreciated.
Edit: Let $Y=\frac{1}{n}\sum_{i=1}^{n}X_i^2$; then $$E[\hat{p}_1]=E[f(Y)]$$ where $$f(y) = \frac{-1+\sqrt{1+8y}}{2y}=\frac{4}{1+\sqrt{1+8y}}.$$ Since $f''(y)>0$ ($f$ is strictly convex) and $Y$ is non-degenerate, Jensen's inequality gives $$E[f(Y)]>f(E[Y])=f\!\left(\tfrac{2-p}{p^2}\right)=p,$$ so $E[\hat{p}_1]>p$. For the second estimator, we try Jensen with $Y=\frac{1}{n}\sum_{i=1}^{n}\mathbb{1}_{(k,+\infty)}(X_i)$ and $$f(y)=1-y^{1/k}.$$ Then $$f''(y)=\frac{k-1}{k^2}\,y^{1/k-2}>0 \quad\text{for } k\geq 2,$$ so again $$E[\hat{p}_2]=E[f(Y)]>f(E[Y])=1-\left((1-p)^{k}\right)^{1/k}=p.$$ I think this argument shows that $\hat{p}_1$ and $\hat{p}_2$ are biased (for $k=1$, $f$ is affine and $\hat{p}_2$ is unbiased). I am not sure how the law of large numbers applies to $Y=\frac{1}{n}\sum_{i=1}^{n}X_i^2$, since it is not an average of the $X_i$ themselves. Perhaps someone could explain this to me.
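As a quick sanity check on the Jensen argument for $\hat p_1$, here is a small Monte Carlo sketch (assuming NumPy; the choices $p=0.3$, $n=20$, and the number of replications are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 0.3, 20, 20_000

# Draw reps samples of size n from Geometric(p) supported on {1, 2, ...}
X = rng.geometric(p, size=(reps, n))

# Second-moment estimator: solve (2 - p)/p^2 = mean(X_i^2) for p
m2 = (X**2).mean(axis=1)
p_hat1 = (-1 + np.sqrt(1 + 8 * m2)) / (2 * m2)

# The Monte Carlo average sits above p, consistent with E[p_hat1] > p
print(f"p = {p}, Monte Carlo E[p_hat1] ≈ {p_hat1.mean():.4f}")
```

The upward bias shrinks as $n$ grows, since $\operatorname{Var}(Y)\to 0$ and the Jensen gap closes.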
For the first estimator we can use the strong law of large numbers: since the $(X_i)_{i\geq 1}$ are i.i.d., so are the $(X_i^2)_{i\geq 1}$, with finite mean $E[X^2]=\frac{2-p}{p^2}$ (a geometric random variable has moments of all orders). Hence $\frac{1}{n}\sum_{i=1}^n X_i^2\stackrel{\text{a.s.}}{\to} \frac{2-p}{p^2}$ — this answers the question in the edit: $Y$ is exactly an average, just of the squares $X_i^2$ rather than of the $X_i$ themselves. Since $f(y)=\frac{-1+\sqrt{1+8y}}{2y}$ is continuous on $(0,\infty)$, it follows that $$ \hat{p}_1=f\!\left(\frac{1}{n}\sum_{i=1}^n X_i^2\right)\stackrel{\text{a.s.}}{\to} f\!\left(\frac{2-p}{p^2}\right)=p $$ as $n\to \infty$. So the first estimator is strongly consistent. The same argument (SLLN applied to the i.i.d. indicators, whose average converges a.s. to $(1-p)^k$, followed by continuity) shows that $\hat{p}_2$ is strongly consistent as well.
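To illustrate the strong consistency numerically, one can watch $\hat p_1$ settle toward $p$ along a single sample path as $n$ grows (again a NumPy sketch; $p=0.3$ and the sample sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
p = 0.3

# One long i.i.d. Geometric(p) sample; evaluate the estimator on growing prefixes
X = rng.geometric(p, size=1_000_000)
for n in (100, 10_000, 1_000_000):
    m2 = (X[:n].astype(np.float64) ** 2).mean()  # (1/n) * sum of X_i^2
    p_hat1 = (-1 + np.sqrt(1 + 8 * m2)) / (2 * m2)
    print(f"n = {n:>9}: p_hat1 = {p_hat1:.4f}")
```

The printed values approach $0.3$, exactly as SLLN plus the continuous mapping theorem predicts.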