Firstly, what does it mean to calculate a statistic? Do they just want us to find the mean, variance, and standard deviation? For instance, if I'm asked to calculate a statistic with a chi-squared distribution with three degrees of freedom, or an $F$ distribution with 1 and 2 degrees of freedom, how would I go about doing that? The assumptions are that the $X_i$ are independent with $N(i,i^{2})$ distributions, and I'm asked to use the $X_i$ to construct the statistics.
Would I just use the formulas:
$$\bar{X} = \frac{X_{1}+\cdots+X_{n}}{n} \quad \text{(sample mean)}$$
and
$$s^{2} = \frac{\sum_{i=1}^{n} (X_{i}-\bar{X})^{2}}{n-1} \quad \text{(sample variance, with $\bar{X}$ the sample mean)}$$
So if you do use these formulas, how can you evaluate the values $X_{1}, \ldots, X_{n}$? How is knowing just the type of distribution and its degrees of freedom going to help me come up with the individual random sample values? Or is my understanding completely incorrect?
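(For concreteness, here is what those two formulas compute on a small made-up sample — the numbers below are arbitrary, just to illustrate; this doesn't by itself answer the distribution question.)

```python
import numpy as np

# Hypothetical sample values, purely to illustrate the two formulas above.
x = np.array([2.1, 0.4, 3.3, 1.8, 2.6])
n = len(x)

mean = x.sum() / n                            # (X_1 + ... + X_n) / n
variance = ((x - mean) ** 2).sum() / (n - 1)  # sum of (X_i - mean)^2 / (n - 1)

# NumPy's built-ins compute the same quantities (ddof=1 gives the n-1 divisor).
print(np.isclose(mean, x.mean()))        # True
print(np.isclose(variance, x.var(ddof=1)))  # True
```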
I'm going to assume that you have independent $X_{1}, X_{2}, \ldots, X_{n}$ with $X_{i} \sim N(i,i^{2})$ and you want to find a statistic that has, for example, a $\chi^{2}$-distribution.
A $N(0,1)$ random variable squared has a $\chi^{2}(1)$ distribution. You can show this using moment generating functions. A sum of independent $\chi^{2}$ random variables has a $\chi^{2}$ distribution with the individual degrees of freedom added up. You can show that using moment generating functions as well.
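If you want to see the first fact empirically rather than via moment generating functions, a quick simulation (my own sketch, with an arbitrary seed and sample size) shows that squared standard normals have the mean 1 and variance 2 of a $\chi^{2}(1)$ variable:

```python
import numpy as np

rng = np.random.default_rng(0)

# Square a large sample of standard normal draws.
z_sq = rng.standard_normal(200_000) ** 2

# A chi-squared(1) random variable has mean 1 and variance 2, so the
# empirical moments of the squared normals should land near those values.
print(z_sq.mean())  # close to 1
print(z_sq.var())   # close to 2
```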
So, you could take $X_{i} \sim N(i,i^{2})$, "standardize" it to a $N(0,1)$: $$ \frac{X_{i}-i}{\sqrt{i^{2}}} = \frac{X_{i}-i}{i} \sim N(0,1), $$ square it to get a $\chi^{2}$ distribution $$ \left( \frac{X_{i}-i}{i} \right)^{2} \sim \chi^{2}(1), $$ and add them up (using independence) to get $$ \sum_{i=1}^{n} \left( \frac{X_{i}-i}{i} \right)^{2} \sim \chi^{2}(1+1+ \cdots + 1) = \chi^{2}(n). $$
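The whole construction can be checked by simulation as well (again a sketch of mine, with arbitrary choices of $n$, seed, and replication count): draw $X_{i} \sim N(i, i^{2})$ many times, form the statistic, and compare its empirical moments to the mean $n$ and variance $2n$ of a $\chi^{2}(n)$ distribution.

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 5, 100_000

# Each row is one draw of X_1, ..., X_n with X_i ~ N(i, i^2).
i = np.arange(1, n + 1)
x = rng.normal(loc=i, scale=i, size=(reps, n))

# Standardize each X_i to N(0,1), square, and sum over i:
# this is the statistic claimed to be chi-squared with n degrees of freedom.
t = (((x - i) / i) ** 2).sum(axis=1)

# chi-squared(n) has mean n and variance 2n.
print(t.mean())  # close to 5
print(t.var())   # close to 10
```

This also answers the question above: you never need the individual sample values in advance — the statistic is a recipe applied to whatever $X_{1}, \ldots, X_{n}$ you observe, and the distributional assumptions guarantee its distribution.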