Show that if an estimator $\hat\mu=a_1X_1 +a_2X_2 +\cdots+a_nX_n$, where $a_1, a_2,\ldots,a_n$ are constants, is unbiased, then its variance is minimized when $a_1=a_2=\cdots=a_n=\frac{1}{n}$, i.e., when $\hat\mu=\bar X$.
I've tried subjecting it to the constraint $\sum a_i=1$, and I know that $\sum a_i^2$ is minimized by choosing $a_1=a_2=\cdots=a_n=1/n$, but I'm not sure what to do next.
We are assuming all observations are iid.
This statement is not generally true. For instance, if the $X_i$'s are dependent, not identically distributed, or do not have finite second moments, then it is false.
If your observations are iid with finite variance, which is probably assumed, then the answer you want can be found by solving
$$\min_{\sum_{i=1}^n a_i=1} V\biggl(\sum_{i=1}^n a_i X_i\biggr),$$
using the familiar rules
$$V(cX)=c^2V(X)$$
and
$$\text{$X,Y$ independent with finite variances} \Rightarrow V(X+Y)=V(X)+V(Y).$$
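To spell out how those two rules finish the argument, here is a sketch of the computation, assuming the $X_i$ are iid with mean $\mu$ and variance $\sigma^2<\infty$:

$$E(\hat\mu)=\sum_{i=1}^n a_i E(X_i)=\mu\sum_{i=1}^n a_i,$$

so unbiasedness forces $\sum_{i=1}^n a_i=1$. By independence and the scaling rule,

$$V\biggl(\sum_{i=1}^n a_i X_i\biggr)=\sigma^2\sum_{i=1}^n a_i^2.$$

Under the constraint $\sum_{i=1}^n a_i=1$, a direct expansion gives

$$\sum_{i=1}^n a_i^2=\sum_{i=1}^n\Bigl(a_i-\tfrac{1}{n}\Bigr)^2+\frac{2}{n}\sum_{i=1}^n a_i-\frac{1}{n}=\sum_{i=1}^n\Bigl(a_i-\tfrac{1}{n}\Bigr)^2+\frac{1}{n}\;\ge\;\frac{1}{n},$$

with equality iff $a_i=\tfrac{1}{n}$ for all $i$. So the minimum variance is $\sigma^2/n$, attained exactly by $\hat\mu=\bar X$.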