An unbiased, consistent estimator of the mean given $N$ i.i.d. samples is the sample mean, $$\hat \mu = \frac{1}{N}\sum\limits_{i = 1}^N {x_i},$$ and its variance is ${\mathop{\rm var}} (\hat \mu ) = \frac{{{\sigma ^2}}}{N}$. However, if I use the other formula for the variance I get
${\mathop{\rm var}} \left( {\hat \mu } \right) = E\left[ {\hat \mu }^2 \right] - {\left( {E\left[ {\hat \mu } \right]} \right)^2} = E\left[{\left( {\frac{1}{N}\sum\limits_{i = 1}^N {{x_i}} } \right)^2}\right] - {\left( {E\left[ {\frac{1}{N}\sum\limits_{i = 1}^N {{x_i}} } \right]} \right)^2} = \frac{{{\sigma ^2}}}{N} - {\mu ^2}$
Why am I getting the extra additive term? What did I do wrong?
You have computed the expectation incorrectly. The easiest way to obtain the result is to use the fact that the variance of a sum is the sum of the variances when the summands are independent, i.e. if the $x_i$ are i.i.d. with mean $\mu$ and variance $\sigma^2$, then
$$\text{Var}\left(\frac{1}{N}\sum_{i=1}^N x_i\right)=\frac{1}{N^2}\sum_{i=1}^N \text{Var}(x_i)=\frac{1}{N^2}\sum_{i=1}^N \sigma^2=\frac{1}{N^2}N\sigma^2=\frac{\sigma^2}{N}.$$
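The $\sigma^2/N$ result is easy to confirm numerically. Below is a quick Monte Carlo sanity check in NumPy (a sketch; the variable names and the choice of a normal population are illustrative, the result holds for any i.i.d. samples with finite variance):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, N = 2.0, 3.0, 50
trials = 200_000

# Each row is one sample of N i.i.d. draws; each row's mean is one
# realization of the estimator mu_hat.
samples = rng.normal(mu, sigma, size=(trials, N))
mu_hat = samples.mean(axis=1)

print(mu_hat.var())   # empirical Var(mu_hat)
print(sigma**2 / N)   # theoretical sigma^2 / N = 0.18
```

The empirical variance of the sample means agrees with $\sigma^2/N$ up to Monte Carlo error.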
Using your approach correctly would get you there as well, albeit with a bit more work:
$$E\left[\left(\frac{1}{N}\sum_{i=1}^N x_i\right)^2\right]-\left(E\left[\frac{1}{N}\sum_{i=1}^N x_i\right]\right)^2\\ =\frac{1}{N^2}E\left[\sum_{i=1}^Nx_i^2+2\sum_{i=1}^N\sum_{j=i+1}^Nx_ix_j\right]-\mu^2\\ =\frac{1}{N^2}\left[\sum_{i=1}^NE\left[x_i^2\right]+2\sum_{i=1}^N\sum_{j=i+1}^N E\left[x_ix_j\right]\right]-\mu^2\\ =\frac{1}{N^2}\left[\sum_{i=1}^NE\left[x_i^2\right]+2\sum_{i=1}^N\sum_{j=i+1}^N E\left[x_i\right]E\left[x_j\right]\right]-\mu^2\quad \left(x_i \text{ indep.}\right)\\ =\frac{1}{N^2}\left[\sum_{i=1}^N \left\{\mu^2+\sigma^2\right\}+2\sum_{i=1}^N\sum_{j=i+1}^N \mu^2\right]-\mu^2\quad \left(x_i \text{ identically distrib.}\right)\\ =\frac{1}{N^2}\left[N(\mu^2+\sigma^2)+(N^2-N)\mu^2\right]-\mu^2\\ =\frac{\sigma^2}{N} $$
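The derivation shows where your extra $-\mu^2$ came from: the second moment of the sample mean is $E[\hat\mu^2]=\mu^2+\sigma^2/N$, not $\sigma^2/N$, and the $\mu^2$ is exactly what cancels against $(E[\hat\mu])^2$. A short simulation makes this visible (a sketch; the particular $\mu$, $\sigma$, $N$ and the normal population are just illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, N = 2.0, 3.0, 50
trials = 200_000

# Simulate many realizations of the sample mean mu_hat.
samples = rng.normal(mu, sigma, size=(trials, N))
mu_hat = samples.mean(axis=1)

# Second moment of the sample mean: E[mu_hat^2] = mu^2 + sigma^2/N,
# not sigma^2/N alone.
second_moment = np.mean(mu_hat**2)
print(second_moment)         # empirical E[mu_hat^2]
print(mu**2 + sigma**2 / N)  # theoretical mu^2 + sigma^2/N = 4.18
```

Subtracting $(E[\hat\mu])^2=\mu^2$ from this second moment recovers $\sigma^2/N$ with no leftover term.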