I have two quick questions:
If an estimator is consistent, does that imply it is unbiased?
If an estimator is biased, does that imply it is not consistent?
We know that consistency means $\text{bias}^2+\text{variance}\to 0$ as $n \to \infty$.
So consistency requires both terms to go to zero, because both are nonnegative. But does this hold in all cases or not? Any suggestions would be appreciated.
Consistency is an asymptotic property, which roughly asks: as our sample becomes large, does our estimator become accurate? Bias, on the other hand, is not an asymptotic property. The bias tells us, for a given sample size, how far off our estimator is in expectation. There are consistent estimators that are biased, and unbiased estimators that are not consistent (e.g., $\hat{\theta}=X_1$ is unbiased for the mean of an i.i.d. sample but does not converge to it as $n$ grows).
Based on your question, I assume you are asking about mean-square consistency, i.e. if $\theta$ is the true parameter and $\hat{\theta}_n$ is an estimator, then $E\big((\hat{\theta}_n-\theta)^2\big)\to 0$ as $n\to\infty$ (which happens exactly when the squared bias and the variance of the estimator both converge to 0). There is another, more commonly used notion of consistency: convergence in probability, $\operatorname{plim}_{n\to\infty} \hat{\theta}_n = \theta$. Neither of these is the same as unbiasedness; in fact, the example below shows this for both notions.
Since consistency is an asymptotic property, it does not imply unbiasedness. For instance, if a sample $X_1, \ldots, X_n$ is drawn independently from $N(\theta,1)$, then the estimator $\hat{\theta}_n = \frac{1}{n}\sum_{i=1}^n X_i + \frac{1}{n}$ is a consistent estimator of $\theta$: the first term converges to $\theta$ by the law of large numbers, and the second term converges to 0, so the bias ($\frac{1}{n}$) and the variance ($\frac{1}{n}$) both go to 0. But $\hat{\theta}_n$ is not unbiased, since $E(\hat{\theta}_n)=\theta + \frac{1}{n}$.
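If it helps, here is a small Monte Carlo sketch of that estimator (the parameter values and variable names are my own choices, not from the question): it shows the empirical bias tracking $1/n$ while the MSE shrinks toward 0.

```python
import numpy as np

# Sketch: simulate theta_hat_n = mean(X_1..X_n) + 1/n for X_i ~ N(theta, 1).
# Expectation is theta + 1/n (biased), but bias and variance both vanish
# as n grows, so the estimator is mean-square consistent.
rng = np.random.default_rng(0)
theta = 2.0       # true parameter (arbitrary choice for the demo)
reps = 20000      # Monte Carlo replications per sample size

for n in (10, 100, 1000):
    samples = rng.normal(theta, 1.0, size=(reps, n))
    theta_hat = samples.mean(axis=1) + 1.0 / n
    bias = theta_hat.mean() - theta           # should be close to 1/n
    mse = np.mean((theta_hat - theta) ** 2)   # close to 1/n + 1/n^2
    print(f"n={n:5d}  bias~{bias:.4f}  MSE~{mse:.5f}")
```

The printed bias roughly halves each time $1/n$ halves, while the MSE keeps shrinking, which is consistency without unbiasedness at any finite $n$.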
A less "artificial" example is the sample variance: $\frac{1}{n} \sum_{i=1}^n (X_i-\bar{X})^2$ is a biased but consistent estimator of the variance, since its expectation is $\frac{n-1}{n}\sigma^2$, which differs from $\sigma^2$ for every finite $n$ but converges to it.
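The same kind of simulation illustrates this (again, $\sigma^2$ and the variable names here are just illustrative choices): the average of the $\frac{1}{n}$-divisor sample variance sits near $\frac{n-1}{n}\sigma^2$, below the true variance, with the gap closing as $n$ grows.

```python
import numpy as np

# Sketch: the "1/n" sample variance of N(0, sigma^2) data has
# expectation ((n-1)/n) * sigma^2 -- biased downward, but the
# bias vanishes as n grows, so the estimator is consistent.
rng = np.random.default_rng(1)
sigma2 = 4.0      # true variance (arbitrary choice for the demo)
reps = 20000      # Monte Carlo replications per sample size

for n in (5, 50, 500):
    x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
    s2 = ((x - x.mean(axis=1, keepdims=True)) ** 2).mean(axis=1)
    print(f"n={n:3d}  mean(s2)~{s2.mean():.3f}  "
          f"(n-1)/n * sigma2 = {(n - 1) / n * sigma2:.3f}")
```

At $n=5$ the downward bias ($0.8\sigma^2$ vs. $\sigma^2$) is very visible; at $n=500$ it is negligible, which is exactly "biased but consistent".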