I'm stuck on a question where we need to find the method of moments (MOM) estimator for IID random variables with density:
$$f(x \mid \alpha) = \frac{\Gamma(3\alpha)}{\Gamma(\alpha)\Gamma(2\alpha)} \, x^{\alpha-1} (1-x)^{2\alpha-1}$$
where $\alpha$ is the parameter.
These are also given:
$$\operatorname{E}(X) = \frac{1}{3}, \qquad \operatorname{Var}(X) = \frac{2}{9(3\alpha+1)}$$
The question is to find the MOM estimate of $\alpha$.
I understand what the gamma function does but I'm completely stuck on finding the first moment. The solution suggests calculating:
$$E(X^2) = \text{Var}(X) + (E(X))^2$$
I understand that relation, but why the second moment? We only have one parameter ($\alpha$), so shouldn't the first moment suffice?
I'm also stuck on getting the first moment itself - I understand how to calculate the MOM for common distributions (normal, gamma, Poisson, ...), but how do you approach a distribution like this?
No, the first moment is insufficient to estimate $\alpha$: you are told that $\operatorname{E}[X] = 1/3$, which does not depend on $\alpha$, so equating it to the sample mean gives no relationship from which to solve for $\alpha$.
For now, take it for granted that the expectation and variance are provided to you. Then, you would simply substitute these to find $\operatorname{E}[X^2]$. Then the method of moments equates the sample moment to the raw moment; i.e., for as many $k = 1, 2, \ldots$ as are needed, equate $$\operatorname{E}[X^k] = \frac{1}{n} \sum_{i=1}^n x_i^k$$ and solve the resulting system. So in your case, since $k = 1$ fails to relate $\alpha$ to the sample moment, you simply proceed to $k = 2$: $$\operatorname{E}[X^2] = \operatorname{Var}[X] + \operatorname{E}[X]^2 = \frac{1}{n} \sum_{i=1}^n x_i^2.$$ Substitute and solve for $\alpha$ in terms of the sum of squares of the sample.
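To make this concrete: solving $\operatorname{E}[X^2] = \frac{2}{9(3\alpha+1)} + \frac{1}{9} = m_2$ for $\alpha$, where $m_2 = \frac{1}{n}\sum_i x_i^2$ is the sample second raw moment, gives $\hat\alpha = \frac{1}{3}\left(\frac{2}{9(m_2 - 1/9)} - 1\right)$. Here is a quick sketch in Python (the function name is my own, and the simulation just uses that Beta$(\alpha, 2\alpha)$ form we identify below):

```python
import numpy as np
from scipy import stats

def mom_alpha_raw(x):
    """MoM estimate of alpha from the second raw moment.

    Solves E[X^2] = 2/(9(3a+1)) + 1/9 = mean(x_i^2) for a.
    """
    m2 = np.mean(np.asarray(x) ** 2)
    return (2.0 / (9.0 * (m2 - 1.0 / 9.0)) - 1.0) / 3.0

# Sanity check on simulated Beta(alpha, 2*alpha) data with true alpha = 2
rng = np.random.default_rng(0)
x = stats.beta(a=2, b=4).rvs(size=100_000, random_state=rng)
print(mom_alpha_raw(x))  # should be close to 2
```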
Of course, this is not the only way to get a method of moments estimator. You can equate the central sample moments; e.g., $$\operatorname{Var}[X] = \frac{1}{n} \sum_{i=1}^n (x_i - \bar x)^2$$ where $\bar x$ is the sample mean. Does the resulting equation, solved for $\alpha$, result in the same method of moments estimator as you found using the raw moments? Why or why not?
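You can probe that question numerically before answering it analytically. Solving $\operatorname{Var}[X] = \frac{2}{9(3\alpha+1)} = v$ gives $\hat\alpha = \frac{1}{3}\left(\frac{2}{9v} - 1\right)$ for either choice of $v$; a sketch (helper names mine, simulation again via scipy's Beta):

```python
import numpy as np
from scipy import stats

def alpha_from_variance(v):
    """Solve Var[X] = 2/(9(3a+1)) = v for a."""
    return (2.0 / (9.0 * v) - 1.0) / 3.0

rng = np.random.default_rng(1)
x = stats.beta(a=2, b=4).rvs(size=1_000, random_state=rng)

# Central-moment estimator: plug in the sample variance m2 - xbar^2.
alpha_central = alpha_from_variance(np.var(x))
# Raw-moment estimator: plug in m2 - (1/3)^2, using the known mean.
alpha_raw = alpha_from_variance(np.mean(x**2) - 1 / 9)

print(alpha_central, alpha_raw)  # close, but generally not identical
```

The two plug in different estimates of the variance ($m_2 - \bar x^2$ versus $m_2 - (1/3)^2$), which is the crux of the question above.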
Finally, to compute the expectation and variance, one way is to recognize that the $X_i$ are drawn from a Beta distribution with parameters $\alpha$ and $2\alpha$. If you take for granted that for a general Beta distribution, the integral of its density over its support is $1$; that is to say, $$\frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)} \int_{x=0}^1 x^{a-1} (1-x)^{b-1} \, dx = 1, \quad a, b > 0,$$ then we can write $$\begin{align*} \operatorname{E}[X] &= \frac{\Gamma(3\alpha)}{\Gamma(\alpha)\Gamma(2\alpha)} \int_{x=0}^1 x \cdot x^{\alpha-1} (1-x)^{2\alpha-1} \, dx \\ &= \frac{\Gamma(\alpha+1)\Gamma(3\alpha)}{\Gamma(\alpha)\Gamma(3\alpha+1)} \cdot \frac{\Gamma(3\alpha+1)}{\Gamma(\alpha+1)\Gamma(2\alpha)} \int_{x=0}^1 x^{(\alpha+1)-1} (1-x)^{2\alpha-1} \, dx \\ &= \frac{\Gamma(\alpha+1)\Gamma(3\alpha)}{\Gamma(\alpha)\Gamma(3\alpha+1)} \\ &= \frac{\alpha}{3\alpha} \\ &= \frac{1}{3}. \end{align*}$$ In the second step we have evaluated the integral by recognizing that it corresponds to a Beta density with parameters $\alpha+1$ and $2\alpha$, so it evaluates to $1$, leaving the factor that we pulled out, which then simplifies by the rules of the gamma function.
The calculation of the second raw moment proceeds in a similar fashion.
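If you want to double-check the resulting formulas without redoing the gamma-function algebra, scipy's Beta distribution (purely as a numerical cross-check) agrees with both moments:

```python
from scipy import stats

# Check E[X] = 1/3 and Var[X] = 2/(9(3a+1)) for a few values of alpha,
# using X ~ Beta(alpha, 2*alpha).
for alpha in (0.5, 1.0, 2.0, 5.0):
    mean, var = stats.beta(a=alpha, b=2 * alpha).stats(moments="mv")
    assert abs(mean - 1 / 3) < 1e-9
    assert abs(var - 2 / (9 * (3 * alpha + 1))) < 1e-9
print("moment formulas verified")
```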