I'm considering inference on the factor loadings in classical factor analysis.
The model is as follows: suppose there are $T$ i.i.d. samples from the data-generating process $Y = \Lambda F + \varepsilon$, where $Var(\varepsilon) = \Sigma$ is diagonal and $Var(F) = I_r$, with $r$ a fixed known constant that is typically much smaller than the sample size $T$. Classical factor analysis solves $Var(Y) = \Lambda\Lambda^T + \Sigma$ to identify and estimate the parameters.
Now I wonder whether we can use the generalized method of moments (GMM) to do inference on the loading matrix $\Lambda$. Anderson et al. (1956) note that the theoretical form of the asymptotic variance of $\hat\Lambda$ is complicated, but perhaps estimating that variance numerically via GMM is feasible. Does anyone know of work along these lines? I'd appreciate any directions or discussion!
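To make the question concrete, here is a minimal sketch of what I have in mind (a hypothetical toy setup, not a claim about the right way to do it): with mean-zero data, the moment conditions are $E[\mathrm{vech}(y_t y_t^T) - \mathrm{vech}(\Lambda\Lambda^T + \Sigma)] = 0$, and the usual GMM sandwich formula $(D^T S^{-1} D)^{-1}/T$ would give numerical standard errors for $\Lambda$. The dimensions $p = 3$, $r = 1$ (just-identified, and rotation is only an issue up to sign) and all numerical values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# --- simulate a toy example: p = 3 observed variables, r = 1 factor ---
p, r, T = 3, 1, 2000
Lam_true = np.array([[1.0], [0.8], [0.5]])     # hypothetical loadings
sig_true = np.array([0.5, 0.6, 0.7])           # diagonal of Sigma
F = rng.standard_normal((T, r))
Y = F @ Lam_true.T + rng.standard_normal((T, p)) * np.sqrt(sig_true)

iu = np.triu_indices(p)                        # vech via the upper triangle

def vech(M):
    return M[iu]

def moments(theta):
    """Per-observation moments g_t(theta) = vech(y_t y_t' - Lam Lam' - Sigma)."""
    Lam = theta[:p * r].reshape(p, r)
    Sig = np.diag(theta[p * r:])
    implied = vech(Lam @ Lam.T + Sig)
    return np.stack([vech(np.outer(y, y)) for y in Y]) - implied  # T x q

def gbar(theta):
    return moments(theta).mean(axis=0)

# Just-identified here (6 parameters, q = p(p+1)/2 = 6 moments),
# so GMM reduces to solving gbar(theta) = 0.
theta0 = np.concatenate([np.ones(p * r), np.ones(p)])
theta_hat = least_squares(gbar, theta0).x

# --- GMM sandwich variance: (D' S^{-1} D)^{-1} / T ---
Gt = moments(theta_hat)
S = np.cov(Gt, rowvar=False)                   # i.i.d. long-run variance of moments
k = len(theta_hat)
eps = 1e-6
D = np.zeros((Gt.shape[1], k))                 # Jacobian of gbar, finite differences
for j in range(k):
    e = np.zeros(k)
    e[j] = eps
    D[:, j] = (gbar(theta_hat + e) - gbar(theta_hat - e)) / (2 * eps)
V = np.linalg.inv(D.T @ np.linalg.inv(S) @ D) / T
se_Lam = np.sqrt(np.diag(V))[:p * r]           # standard errors for the loadings
print("loadings:", theta_hat[:p * r])
print("s.e.:", se_Lam)
```

This is just the mechanical GMM recipe; whether these numerical standard errors match the complicated analytic variance in Anderson et al. (1956) is exactly what I am asking about.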