For a regular IG($\mu$, $\lambda$) with pdf: $f(x;\mu,\lambda) = \left(\frac{\lambda}{2\pi x^3}\right)^{1/2} e^{-\frac{\lambda}{2\mu^2}\frac{(x-\mu)^2}{x}}$
Taking the log-likelihood produces
$$\frac{n}{2}\ln\lambda + \frac{n}{2}\ln\left(\frac{1}{2\pi}\right) + K - \frac{\lambda}{2\mu^2} \sum_{i=1}^n \frac{(x_i-\mu)^2}{x_i},$$ where $K$ is a term involving neither $\mu$ nor $\lambda$.
Proceeding to differentiate the one relevant term at the end,
I get that $\sum_{i=1}^n \frac{(x_i-\mu)^2}{x_i} = 0$.
If this is correct, how does this imply that $\mu$ = sample mean?
What about the second parameter $\lambda$? It should reduce to $\frac{1}{\hat\lambda} = \frac{1}{n}\sum_{i=1}^n\left(\frac{1}{x_i} - \frac{1}{\hat\mu}\right)$
Where $\mu$ is the estimated $\mu$ just shown.
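As a numerical sanity check on the claim that the maximizer in $\mu$ is the sample mean, here is a minimal sketch assuming NumPy (`numpy.random.Generator.wald` samples an inverse Gaussian; the parameter values are arbitrary). It profiles the $\mu$-dependent part of the log-likelihood over a grid:

```python
import numpy as np

rng = np.random.default_rng(0)
mu_true, lam_true = 2.0, 3.0  # arbitrary illustrative values
x = rng.wald(mu_true, lam_true, size=10_000)  # inverse-Gaussian sample

# mu-dependent part of the log-likelihood for a fixed lambda, expanded as
#   -(lambda/2) * (sum(x)/mu^2 - 2n/mu + sum(1/x))
n, s1, sm1 = len(x), x.sum(), (1.0 / x).sum()
mu_grid = np.linspace(0.5, 5.0, 20_001)
loglik = -(lam_true / 2) * (s1 / mu_grid**2 - 2 * n / mu_grid + sm1)

mu_argmax = mu_grid[np.argmax(loglik)]
print(mu_argmax, x.mean())  # the grid argmax sits at the sample mean (up to grid spacing)
```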
First, I believe your PDF is incorrect: it should be $$f_X(x \mid \mu, \lambda) = \left(\frac{\lambda}{2\pi x^3}\right)^{1/2} \exp \left( - \frac{\lambda(x-\mu)^2}{2\mu^2 x} \right), \quad x > 0.$$ Source: Wikipedia
Using this PDF, a likelihood of a sample from this distribution is $$\mathcal L(\mu, \lambda \mid \boldsymbol x) \propto \lambda^{n/2} \exp \left( - \frac{\lambda}{2\mu^2} \sum_{i=1}^n \frac{(x_i - \mu)^2}{x_i} \right).$$

The log-likelihood is then $$\ell(\mu, \lambda \mid \boldsymbol x) \propto \frac{n}{2} \log \lambda - \frac{\lambda}{2\mu^2} \sum_{i=1}^n \frac{(x_i - \mu)^2}{x_i}.$$

If we are interested in estimating $\mu$ for a known $\lambda$, then this expression is more conveniently written $$\ell(\mu, \lambda \mid \boldsymbol x) \propto \frac{n}{2} \log \lambda - \frac{\lambda}{2} \sum_{i=1}^n \frac{((x_i/\mu) - 1)^2}{x_i},$$ and the partial derivative with respect to $\mu$ is $$\frac{\partial \ell}{\partial \mu} \propto \sum_{i=1}^n \left(\frac{x_i}{\mu} - 1\right)\frac{1}{\mu^2}.$$

Thus any extremum occurs at a critical point satisfying $\partial \ell/\partial \mu = 0$, or $$\sum_{i=1}^n x_i = n\mu,$$ hence $\hat \mu = \bar x$ as claimed.
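The closed-form estimators can also be checked against simulated data. A minimal sketch assuming NumPy, with arbitrary true parameter values, using $\hat\mu = \bar x$ and the standard companion result $1/\hat\lambda = \frac{1}{n}\sum_{i=1}^n (1/x_i - 1/\bar x)$ (see e.g. the Wikipedia article cited above):

```python
import numpy as np

rng = np.random.default_rng(42)
mu_true, lam_true = 1.5, 4.0  # arbitrary illustrative values
x = rng.wald(mu_true, lam_true, size=200_000)  # inverse-Gaussian sample

mu_hat = x.mean()                                 # MLE of mu: the sample mean
lam_hat = 1.0 / np.mean(1.0 / x - 1.0 / mu_hat)   # 1/lam_hat = (1/n) sum(1/x_i - 1/x_bar)

print(mu_hat, lam_hat)  # both should be close to the true (mu, lambda)
```

With a sample this large, both estimates land within a couple of standard errors of the true values.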