I found this question that states:
Given the random variables $X_1, X_2, \dots , X_n$ from a Pareto distribution with probability density function: $$ f(x) = \begin{cases} \theta x_0^{\theta}x^{-\theta - 1} & x > x_0 \\ 0 & x \leq x_0 \end{cases} $$
where $\theta > 0 $ is not known and $x_0$ is known.
It asks to show that $T_n = \frac{1}{n} \sum_{i = 1}^n \log \frac{X_i}{x_0}$ is an unbiased estimator of $\frac{1}{\theta}$, to calculate the variance of $T_n$, and, given that $n$ is large enough that $T_n$ is approximately normally distributed, to find the $95\%$ confidence interval.
Can anyone give some hints as to where to start on these three questions? I am completely lost when it comes to this part of my course.
Note that, with the substitution $u=\log(x/x_0)$ (so $\textrm{d}u=\textrm{d}x/x$), \begin{align} \mathbb{E}\log\left(\frac{ X_1}{x_0}\right) &=\int_{x_0}^{\infty} \theta\log\left(\frac{ x}{x_0}\right) x_0^{\theta} x^{-\theta-1}\textrm{d}x= \int_{x_0}^{\infty} \theta \log\left(\frac{x}{x_0}\right) \left(\frac{x}{x_0}\right)^{-\theta} x^{-1}\textrm{d}x\\ &=\int_{0}^{\infty} \theta ue^{-\theta u}\textrm{d}u=\frac{1}{\theta}\int_0^{\infty} te^{-t}\textrm{d}t=\frac{1}{\theta}, \end{align} where the last two equalities use the further substitution $t=\theta u$ and $\int_0^{\infty} te^{-t}\textrm{d}t=\Gamma(2)=1$.
Hence, $$ \mathbb{E}T_n=\frac{1}{n}\sum_{i=1}^n \frac{1}{\theta}=\frac{1}{\theta}, $$ so your estimator is unbiased.
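As a quick sanity check (not part of the derivation), you can simulate this. The values $\theta=3$, $x_0=2$, $n=50$ below are arbitrary; sampling uses the inverse CDF $F(x)=1-(x_0/x)^{\theta}$:

```python
import numpy as np

# Monte Carlo check that T_n is unbiased for 1/theta.
# theta, x0, n, reps are arbitrary illustrative choices.
rng = np.random.default_rng(0)
theta, x0, n, reps = 3.0, 2.0, 50, 200_000

# Inverse-CDF sampling: F(x) = 1 - (x0/x)^theta, so X = x0 * U^(-1/theta)
u = rng.random((reps, n))
x = x0 * u ** (-1.0 / theta)

t_n = np.log(x / x0).mean(axis=1)   # one T_n per replication
print(t_n.mean(), 1 / theta)        # sample mean of T_n should be close to 1/theta
```

The average of $T_n$ over the replications lands very close to $1/\theta = 1/3$, as the unbiasedness calculation predicts.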
For the variance, by the same change of variables \begin{align} \mathbb{E}\left[\left(\log\frac{ X_1}{x_0}\right)^2\right] &=\int_0^{\infty} \theta u^2 e^{-\theta u}\textrm{d}u=\frac{1}{\theta^2} \int_0^{\infty} t^2e^{-t}\textrm{d}t=\frac{2}{\theta^2}, \end{align}
and hence $\operatorname{Var}(\log(X_1/x_0))=\frac{2}{\theta^2}-\frac{1}{\theta^2}=\frac{1}{\theta^2}$. Assuming independence, this gives $$ \operatorname{Var}(T_n)=\frac{1}{n^2}\sum_{i=1}^{n} \frac{1}{\theta^2}=\frac{1}{n\theta^2}. $$ Asymptotic normality follows from the CLT. For the confidence interval, note that $T_n$ is approximately $N\!\left(\frac{1}{\theta}, \frac{1}{n\theta^2}\right)$ and that the standard deviation $\frac{1}{\theta\sqrt{n}}$ can be estimated by $\frac{T_n}{\sqrt{n}}$, so an approximate $95\%$ confidence interval for $\frac{1}{\theta}$ is $T_n\left(1 \pm \frac{1.96}{\sqrt{n}}\right)$.
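Again as a simulation check (with arbitrary values $\theta=2$, $x_0=1$, $n=200$, not from the original post), the empirical variance of $T_n$ should match $\frac{1}{n\theta^2}$, and the plug-in interval $T_n\left(1 \pm \frac{1.96}{\sqrt{n}}\right)$ should cover $1/\theta$ about $95\%$ of the time:

```python
import numpy as np

# Arbitrary illustrative parameters for the check.
rng = np.random.default_rng(1)
theta, x0, n, reps = 2.0, 1.0, 200, 50_000

# Inverse-CDF sampling: X = x0 * U^(-1/theta)
u = rng.random((reps, n))
x = x0 * u ** (-1.0 / theta)
t_n = np.log(x / x0).mean(axis=1)

# Empirical variance of T_n vs the formula 1/(n * theta^2)
print(t_n.var(), 1 / (n * theta**2))

# Approximate 95% CI for 1/theta, estimating the sd 1/(theta*sqrt(n)) by T_n/sqrt(n)
lo = t_n * (1 - 1.96 / np.sqrt(n))
hi = t_n * (1 + 1.96 / np.sqrt(n))
coverage = np.mean((lo < 1 / theta) & (1 / theta < hi))
print(coverage)   # should be close to 0.95
```

The coverage proportion sits near $0.95$, confirming that the normal approximation is already good at this sample size.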