Suppose $Y_i$, $i = 1, \ldots, n$, are independent and identically distributed $N(0, \sigma^2)$. The estimand is $\sigma = \sqrt{\operatorname{Var}(Y_1)}$. Derive the likelihood, sufficient statistic, and score for $\sigma$.
$$L(\mu, \sigma^2) = \prod_{i=1}^n f(X_i \mid \mu, \sigma^2) $$
$$= \prod_{i=1}^n \frac{1}{\sqrt{2\pi\sigma^2}}e^{-(X_i-\mu)^2/2\sigma^2}$$ With $\mu = 0$: $$= \prod_{i=1}^n \frac{1}{\sqrt{2\pi\sigma^2}}e^{-X_i^2/2\sigma^2}$$
$$L(\sigma^2) = \frac{1}{(\sqrt{2\pi\sigma^2})^n}e^{-\sum_{i=1}^n X_i^2/2\sigma^2}$$ Is it possible to simply take the square root here so that the likelihood is parametrized by $\sigma$ rather than $\sigma^2$? I understand that once you get this far, you take the derivative of the log of this function to get the score.
For the sufficient statistic, we look for a factorization of the form $g(\sigma; T(\vec{x}))\, h(\vec{x})$.
$$(2\pi)^{-n/2} \sigma^{-n} e^{-\frac{1}{2\sigma^2}\sum_{i=1}^n X_i^2}$$ Once here, though, I'm confused about which factor is relevant for $\sigma$. Can anyone help me with this problem?
The likelihood is $$ L(\sigma \mid X) = \frac{1}{( 2\pi )^{n/2} \sigma^n } \exp\{ - 2^{-1}\sigma ^ {-2} \sum_{i=1}^n X_i^2\} . $$ The minimal sufficient statistic is $ T(X) = \sum_{i=1}^n X_i^2 $.
The log-likelihood is $$ l(\sigma)=-\frac{n}{2} \ln(2 \pi) - n\ln \sigma -\frac{1}{2\sigma^2}\sum_{i=1}^n X_i^2, $$ and the score is $$ l'(\sigma) = -\frac{n}{\sigma} + \frac{1}{\sigma^3}\sum_{i=1}^nX_i^2. $$ Note this answers your square-root question: you can work directly in terms of $\sigma$, and setting $l'(\sigma)=0$ gives the MLE $\hat\sigma = \sqrt{n^{-1}\sum_{i=1}^n X_i^2}$, which is indeed the square root of the MLE of $\sigma^2$.
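As a quick numerical sanity check (a sketch, not part of the derivation), the score above should vanish at $\hat\sigma = \sqrt{n^{-1}\sum X_i^2}$; the sample size and seed below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_true = 2.0
n = 1000
x = rng.normal(0.0, sigma_true, size=n)

def score(sigma, x):
    # l'(sigma) = -n/sigma + (1/sigma^3) * sum x_i^2
    return -len(x) / sigma + np.sum(x**2) / sigma**3

# The MLE solves l'(sigma) = 0, giving sigma_hat = sqrt(sum x_i^2 / n)
sigma_hat = np.sqrt(np.mean(x**2))
print(score(sigma_hat, x))  # zero up to floating-point error
```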
EDIT: the variance of the score, i.e. the Fisher information for $\sigma$, is $$ \operatorname{var}(l'(\sigma)) = \frac{1}{\sigma^2} \operatorname{var} \left( \sum_{i=1}^nX_i^2/\sigma^2 \right) = \frac{1}{\sigma^2}\operatorname{var}(\chi^2_{(n)})=\frac{2n}{\sigma^2}, $$ using that $\sum_{i=1}^n X_i^2/\sigma^2 \sim \chi^2_{(n)}$, whose variance is $2n$.
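The value $2n/\sigma^2$ can also be checked by Monte Carlo (a sketch with arbitrary choices of $\sigma$, $n$, and replication count): evaluate the score at the true $\sigma$ across many simulated samples and compare its empirical variance to the formula.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma, n, reps = 1.5, 50, 20000

# Simulate `reps` datasets of size n and evaluate the score at the true sigma
x = rng.normal(0.0, sigma, size=(reps, n))
scores = -n / sigma + (x**2).sum(axis=1) / sigma**3

print(scores.var())      # Monte Carlo estimate of var(l'(sigma))
print(2 * n / sigma**2)  # theoretical value 2n/sigma^2
```

With these settings the two numbers agree to within roughly one percent.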