Calculating a statistic of specific distributions and degrees of freedom


Firstly, what does it mean to calculate a statistic? Do they just want us to find the mean, variance, and standard deviation? For instance, if I'm asked to calculate a statistic with the chi-squared distribution with three degrees of freedom, or the F distribution with 1 and 2 degrees of freedom, how would I go about doing that? The assumption is that the $X_i$ are independent with $N(i,i^{2})$ distributions, and I'm asked to use the $X_i$ to construct the statistics.

Would I just use the formulas:

$$\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_{i} = \text{mean}$$

and

$$s^{2} = \frac{1}{n-1}\sum_{i=1}^{n} (X_{i}-\bar{X})^{2} = \text{variance}$$
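For concreteness, here is a minimal sketch of what these two formulas compute, using NumPy and a small hypothetical sample (the values are arbitrary, just for illustration):

```python
import numpy as np

# hypothetical sample values, just for illustration
x = np.array([2.0, 5.0, 1.5, 4.0, 3.5])
n = len(x)

mean = x.sum() / n                            # sample mean: (X_1 + ... + X_n) / n
variance = ((x - mean) ** 2).sum() / (n - 1)  # sample variance with the n - 1 divisor

# the same results via NumPy's built-ins (ddof=1 gives the n - 1 divisor)
assert np.isclose(mean, np.mean(x))
assert np.isclose(variance, np.var(x, ddof=1))
```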

So if you do use these formulas, how can you evaluate all the $X_1$ to $X_n$ values? How is knowing just the type of distribution and its respective degrees of freedom going to help me come up with the individual random sample values? Or is my understanding completely incorrect?

There are 2 best solutions below


I'm going to assume that you have independent $X_{1}, X_{2}, \ldots, X_{n}$ with $X_{i} \sim N(i,i^{2})$ and you want to find a statistic that has, for example, a $\chi^{2}$-distribution.

A $N(0,1)$ random variable squared has a $\chi^{2}(1)$ distribution. You can show this using moment generating functions. A sum of independent $\chi^{2}$ random variables has a $\chi^{2}$ distribution with the individual degrees of freedom added up. You can show that using moment generating functions as well.

So, you could take $X_{i} \sim N(i,i^{2})$, "standardize" it to a $N(0,1)$: $$ \frac{X_{i}-i}{\sqrt{i^{2}}} = \frac{X_{i}-i}{i} \sim N(0,1), $$ square it to get a $\chi^{2}$ distribution $$ \left( \frac{X_{i}-i}{i} \right)^{2} \sim \chi^{2}(1), $$ and add them up (using independence) to get $$ \sum_{i=1}^{n} \left( \frac{X_{i}-i}{i} \right)^{2} \sim \chi^{2}(1+1+ \cdots + 1) = \chi^{2}(n). $$
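A quick simulation confirms this construction (a sketch using NumPy; the choice of $n$, the number of trials, and the seed are all arbitrary). A $\chi^{2}(n)$ distribution has mean $n$ and variance $2n$, so we can check the empirical moments of the statistic:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5              # number of X_i, so the statistic should be chi^2(5)
trials = 200_000   # independent replications of the statistic

i = np.arange(1, n + 1)
# X_i ~ N(i, i^2), i.e. mean i and standard deviation i, drawn per trial
x = rng.normal(loc=i, scale=i, size=(trials, n))

# standardize, square, and sum:  sum_i ((X_i - i) / i)^2
stat = (((x - i) / i) ** 2).sum(axis=1)

# a chi^2(n) distribution has mean n and variance 2n
print(stat.mean())   # should be close to 5
print(stat.var())    # should be close to 10
```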


To calculate a statistic, in your case, means to compute the value of a test statistic, a value used in statistical testing (though in other contexts, a statistic can also refer to the mean, standard deviation, etc. of a sample). Let's take one of the examples you mentioned: the $\chi^2$ (chi-squared) distribution. This distribution, as you may already know, is useful for determining whether differences between observed and expected values of categorical data are significant. The formula for the chi-squared statistic is $$\chi^2 = \sum_{i=1}^n\frac{(O_i - E_i)^2}{E_i},$$ where $n$ is the number of data points you have, $O_i$ is the $i$-th observed value, and $E_i$ is the $i$-th expected value.

If all you were asked for is the $\chi^2$ statistic, then you're done once you calculate this. However, the $p$-value that results from the $\chi^2$ statistic is much more useful in practice, and you will usually be asked for it in a question dealing with statistical testing. This is where degrees of freedom come in. To get the $p$-value, you can do one of two things: look it up in a table, or use a graphing calculator or a computer. Both require your $\chi^2$ statistic and the number of degrees of freedom.
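The steps above can be sketched as follows. The observed and expected counts here are hypothetical, and the $p$-value lookup uses SciPy's `chi2.sf` (the survival function, $P(X > x)$ for a $\chi^2$ variable), which plays the role of the table:

```python
import numpy as np
from scipy.stats import chi2

# hypothetical observed and expected counts for 4 categories
observed = np.array([18, 22, 29, 31])
expected = np.array([25, 25, 25, 25])

# chi-squared statistic: sum of (O_i - E_i)^2 / E_i over all categories
stat = ((observed - expected) ** 2 / expected).sum()

# degrees of freedom for a goodness-of-fit test: number of categories - 1
df = len(observed) - 1

# p-value: probability that a chi^2(df) variable exceeds the statistic
p_value = chi2.sf(stat, df)
print(stat, p_value)
```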