Why does $\alpha$ appear as $\alpha/2$ when constructing a confidence interval for the mean at significance level $\alpha$?


I can't follow some steps in the reasoning that gives us the confidence interval for the mean. I will state the argument; please let me know where I am wrong.

Given a sample $X_1,\dots,X_n$ from a $N(\mu,\sigma^2)$ with $\sigma$ unknown, I want to find a confidence interval for the mean $\mu$ at significance level $\alpha$. The argument considers the random variable $$T=\frac{\overline X-\mu}{S/\sqrt{n}},$$ where $S$ is the sample standard deviation, because this has a $t$ distribution with $n-1$ degrees of freedom. (But if I had used $\sigma$ instead of $S$, wouldn't $T$ be $N(0,1)$? I just standardised, right? Also, why exactly do I consider $T$?)

Okay, so now the reasoning says that the interval at confidence level $1-\alpha$ satisfies $$P(-t_{\alpha/2}\leq T \leq t_{\alpha/2}) = 1-\alpha,$$ and after rearranging inside the brackets to isolate the mean $\mu$, we end up with the confidence interval for $\mu$.
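For concreteness, here is my own writing-out of that rearrangement step (assuming $\sigma$ is unknown, so the sample standard deviation $S$ appears in the statistic):

```latex
% Start from the probability statement for the pivotal quantity T,
% then solve the inequalities for mu:
P\left(-t_{\alpha/2} \le \frac{\overline X-\mu}{S/\sqrt{n}} \le t_{\alpha/2}\right)
   = 1-\alpha
\iff
P\left(\overline X - t_{\alpha/2}\,\frac{S}{\sqrt{n}} \;\le\; \mu \;\le\;
       \overline X + t_{\alpha/2}\,\frac{S}{\sqrt{n}}\right) = 1-\alpha
```

So the endpoints of the confidence interval are $\overline X \pm t_{\alpha/2}\,S/\sqrt{n}$.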

So why do we consider $t_{\alpha/2}$? Where does the $\alpha/2$ come from?
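To make my confusion concrete, here is a small numerical sketch (my own illustration, using the standard normal from Python's standard library for simplicity; the $t$ distribution behaves the same way since it is also symmetric about $0$). It shows that for a symmetric two-sided interval, the leftover probability $\alpha$ is split equally between the two tails, so the cutoff is the $(1-\alpha/2)$ quantile:

```python
# Illustration (not from the original post): why alpha/2 appears for a
# symmetric two-sided interval. Uses N(0,1); the t distribution is
# handled identically because it is also symmetric about 0.
from statistics import NormalDist

alpha = 0.05
Z = NormalDist()  # standard normal N(0, 1)

# We want P(-z <= Z <= z) = 1 - alpha. By symmetry, the remaining
# probability alpha splits equally between the two tails, so each tail
# holds alpha/2 and z is the (1 - alpha/2) quantile.
z = Z.inv_cdf(1 - alpha / 2)

# Check: the central probability is 1 - alpha.
central = Z.cdf(z) - Z.cdf(-z)
print(round(z, 4))        # ~1.96
print(round(central, 4))  # 0.95
```

If one instead used $z_\alpha$ (the $(1-\alpha)$ quantile), the interval $[-z_\alpha, z_\alpha]$ would cover probability $1-2\alpha$, not $1-\alpha$, which is what I suspect the $\alpha/2$ is compensating for.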