Suppose $\bar{X}$ is the sample mean of i.i.d. Poisson random variables with mean parameter $\lambda$, and $T$ is any other unbiased estimator. How do we prove without any theorem in Statistics that $E\left(\bar{X}-\lambda\right)^{2}\leq E\left(T-\lambda\right)^{2}$?
I have tried the following, $$ \begin{align} E\left(T-\lambda\right)^{2} &= E\left(T-\bar{X}+\bar{X}-\lambda\right)^{2} \\ &= E\left(T-\bar{X}\right)^{2} + E\left(\bar{X}-\lambda\right)^{2} + 2\,E\left(T-\bar{X}\right)\left(\bar{X}-\lambda\right) \end{align} $$ but I'm not sure how to proceed from here. Thanks!
Your asking "How do we prove without any theorem in Statistics" is a little much.
You should first check that the expectation of your rival estimator $T$, conditional on $\overline X$, is actually $\overline X$. Once you have that, the cross term in your decomposition vanishes by conditioning on $\overline X$, and you can also appeal to the conditional form of Jensen's inequality. When you are done you have rediscovered an application of Rao-Blackwellization, a beautiful idea with an ugly name.
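To make the conditioning step concrete, here is a sketch of how the cross term dies, assuming you have already verified that $E\left[T\mid\overline X\right]=\overline X$ (the check the answer asks for):

```latex
\[
\begin{aligned}
E\!\left[(T-\bar X)(\bar X-\lambda)\right]
  &= E\!\left[\,E\!\left[(T-\bar X)(\bar X-\lambda)\mid \bar X\right]\right]
     && \text{(tower property)} \\
  &= E\!\left[(\bar X-\lambda)\,E\!\left[T-\bar X\mid \bar X\right]\right]
     && \text{($\bar X$ is known given $\bar X$)} \\
  &= E\!\left[(\bar X-\lambda)\cdot 0\right] = 0.
     && \text{(since } E[T\mid\bar X]=\bar X\text{)}
\end{aligned}
\]
```

Plugging this into the decomposition in the question gives $E(T-\lambda)^2 = E(T-\bar X)^2 + E(\bar X-\lambda)^2 \ge E(\bar X-\lambda)^2$, since the first term on the right is nonnegative.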