Question: Suppose $n$ samples are taken from the following distribution:
$f(x)=\frac{x}{\beta} e^{-x^{2} /(2 \beta)}$
where $x$ and $\beta$ are positive. Given that $E(X_i^{2}) = 2\beta$ and $E(X_i) = \sqrt{\frac{\beta \pi}{2}}$, we are asked to find the moment estimator and the maximum likelihood estimator for $\beta$.
Here are my two approaches, whose results are for some reason not consistent with each other (my maximum likelihood estimator is the negative of the moment estimator!):
Moment estimator method:
$E[X]=\bar{x} \rightarrow \sqrt{\frac{\beta \pi}{2}}=\bar{x} \rightarrow \frac{\beta \pi}{2}=\bar{x}^{2} \rightarrow \beta=2 \bar{x}^{2} / \pi$
Maximum likelihood method:
$L(\beta)=\prod_{i=1}^{n} f\left(x_{i} ; \beta\right)=\prod_{i=1}^{n} \frac{x_{i}}{\beta} e^{-x_{i}^{2}/(2 \beta)}$
$ \ \ \ \ \ \ \ \ =\beta^{-n}\left(\prod_{i=1}^{n} x_{i}\right) \exp \left(\frac{-\sum_{i=1}^{n} x_{i}^{2}}{2 \beta}\right)$
$\log L(\beta)=-n \log \beta+\sum_{i=1}^{n} \log x_{i}-\frac{\sum_{i=1}^{n} x_{i}^{2}}{2 \beta}$
$\frac{d}{d \beta} \log L(\beta)=\frac{-n}{\beta}+\frac{\sum_{i=1}^{n} x_{i}^{2}}{2 \beta^{2}}=0$
$\frac{\sum_{i=1}^{n} x_{i}^{2}}{2 \beta}=n \rightarrow \beta=\frac{1}{2} \times \frac{\sum_{i=1}^{n} x_{i}^{2}}{n} = \frac{E(X_i^{2})}{2}$
$E(X_i^{2}) = Var(X_i) - E(X_i)^2 = 2\beta - \frac{\beta \pi}{2} - \bar{X}^2$
Hence $\beta=-2 \bar{x}^{2} / \pi$
I can't find the problem in my solutions — can anyone help?
Your method of moments estimator is correct, so clearly your maximum likelihood calculation cannot also be. The slip is in your final substitution: the identity is $E(X_i^2) = \operatorname{Var}(X_i) + E(X_i)^2$, not a difference; more fundamentally, the MLE $\frac{1}{2n}\sum_i x_i^2$ is already a statistic of the sample, so there is no population moment left to substitute.
The likelihood function for a sample $\boldsymbol x = (x_1, \ldots, x_n)$ is proportional to $$\mathcal L(\beta \mid \boldsymbol x) \propto \beta^{-n} \exp \left(- \frac{1}{2\beta} \sum_{i=1}^n x_i^2 \right)$$ so that the log-likelihood is $$\ell(\beta \mid \boldsymbol x) = -n \log \beta - \frac{n \overline{x^2}}{2\beta} ,$$ where $\overline{x^2}$ is the sample second moment; i.e., the mean of the square of the sample. Consequently we search for critical points satisfying $$0 = \frac{\partial \ell}{\partial \beta} = -\frac{n}{\beta} + \frac{n\,\overline{x^2}}{2\beta^2},$$ hence $$\hat \beta = \frac{\overline{x^2}}{2}.$$ This of course is different from the method of moments estimator, but both are consistent estimators of $\beta$.
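As a sanity check (not part of the derivation above), here is a short NumPy simulation. The density $f$ is the Rayleigh density with $\beta = \sigma^2$, so NumPy's Rayleigh generator with `scale=sqrt(beta)` draws from it; both estimators should land near the true $\beta$:

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 2.0        # true parameter
n = 200_000       # large sample so both estimators are near beta

# f(x) = (x/beta) exp(-x^2/(2 beta)) is Rayleigh with scale sigma = sqrt(beta)
x = rng.rayleigh(scale=np.sqrt(beta), size=n)

beta_mom = 2 * x.mean() ** 2 / np.pi   # method of moments: 2*xbar^2/pi
beta_mle = (x ** 2).mean() / 2         # MLE: half the sample second moment

print(beta_mom, beta_mle)  # both close to 2.0, but not identical
```

In finite samples the two estimates differ slightly (the MoM uses the first sample moment, the MLE the second), which is exactly why they need not agree, let alone be negatives of each other.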