I am stuck on the asymptotic variance of the MLE. The MLE I derived is $\hat{\theta} = \frac{\sum_i x_i y_i}{\sum_i x_i^2}$. Using iterated expectation, I can show that $\hat{\theta}$ is unbiased. When computing its variance, however, I can only get as far as the last step shown below, which involves the expectation of the inverse of a sum of squares, and I don't know how to evaluate that term:
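For reference, the unbiasedness step is a one-liner (assuming, as the conditional expectations below do, that $E[y_i \mid x_i] = \theta x_i$):

$$\begin{aligned} E[\hat{\theta}] = E_X\{E_{Y|X}[\hat{\theta}]\} = E_X\left\{\frac{\sum_i x_i E[y_i \mid x_i]}{\sum_i x_i^2}\right\} = E_X\left\{\frac{\theta \sum_i x_i^2}{\sum_i x_i^2}\right\} = \theta. \end{aligned}$$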
\begin{aligned} E[\hat{\theta}^2] &= E_{X} \left\{ E_{Y|X}\left[ \frac{\sum_i x_i^2 y_i^2 + \sum_i \sum_{j \neq i} x_i y_i x_j y_j}{(\sum_i x_i^2)^2} \right] \right\} \\ &= E_{X} \left\{ \frac{\sum_i x_i^2 (1 + x_i^2 \theta^2) + \sum_i \sum_{j \neq i} \theta^2 x_i^2 x_j^2}{(\sum_i x_i^2)^2} \right\} \\ &= E_{X} \left\{ \frac{\theta^2 (\sum_i x_i^2)^2 + \sum_i x_i^2}{(\sum_i x_i^2)^2} \right\} \\ &= \theta^2 + E_X \left[ \frac{1}{\sum_i x_i^2} \right] \end{aligned}

(Note that $\hat{\theta}^2$ has $(\sum_i x_i^2)^2$ in the denominator, so $\operatorname{Var}(\hat{\theta}) = E[\hat{\theta}^2] - \theta^2 = E_X[1/\sum_i x_i^2]$; the question is how to evaluate this last expectation.)
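A quick Monte Carlo sketch of the identity $E[\hat{\theta}^2] = \theta^2 + E_X[1/\sum_i x_i^2]$, under assumptions I am supplying (they are not stated in the question): $y_i = \theta x_i + \varepsilon_i$ with $\varepsilon_i \sim N(0,1)$, and $x_i \sim N(0,1)$ i.i.d., chosen so that $E[\sum_i x_i^2] = n$. Under that choice of $X$-distribution, $\sum_i x_i^2 \sim \chi^2_n$, whose reciprocal has mean $1/(n-2)$ for $n > 2$, giving a closed form to compare against:

```python
import numpy as np

# Assumed model (not from the question): y_i = theta*x_i + eps_i, eps_i ~ N(0,1),
# and x_i ~ N(0,1) i.i.d., so that E[sum_i x_i^2] = n.
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 200_000

x = rng.standard_normal((reps, n))
y = theta * x + rng.standard_normal((reps, n))

# MLE theta_hat = sum x_i y_i / sum x_i^2, one estimate per replication.
theta_hat = (x * y).sum(axis=1) / (x ** 2).sum(axis=1)

# Monte Carlo estimate of E[theta_hat^2] vs. the derived theta^2 + E[1/sum x_i^2].
mc_second_moment = np.mean(theta_hat ** 2)
predicted = theta ** 2 + np.mean(1.0 / (x ** 2).sum(axis=1))

# If x_i ~ N(0,1), then sum x_i^2 ~ chi-square(n) and E[1/sum x_i^2] = 1/(n-2).
closed_form = theta ** 2 + 1.0 / (n - 2)

print(mc_second_moment, predicted, closed_form)
```

All three numbers should agree to a couple of decimal places, supporting both the corrected second-moment formula and the closed form for this particular $X$-distribution.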
Any help would be greatly appreciated.
