A credit card company wants to test the hypothesis that its account holders spend an average of \$100 per month at gasoline stations. They take a sample of 1000 accounts and find an average spend of \$115 with a standard deviation of \$41. Conduct this hypothesis test with a .01 level of significance. What is the test statistic?
So I did this and got
$(115-100)/(41/\sqrt{1000}) = 11.57$
That gave me 11.57, but I think I am doing this incorrectly.
I know I probably need to do something with the .01 level of significance, but I am not sure what.
Your test statistic is fine. You use the significance level to find a cutoff point that you compare to the value you calculated for the test statistic. The cutoff point depends on whether you are using a one-tailed or two-tailed test; your book or instructor should provide you with some information about which to use.
You can think of it as follows: "For a 0.01 significance level, I reject the null hypothesis if the absolute value of the test statistic is greater than A. Is 11.57 greater than A? If yes, I reject the null hypothesis; if not, I fail to reject the null hypothesis." You find the cutoff level A by looking it up in a table (or using a computer program or calculator).
You can find a two-tailed t table here. You give the table the degrees of freedom and the significance level, and it gives you the cutoff value.
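As a sketch, here is the whole calculation in Python. Since the sample size is 1000, the t distribution with 999 degrees of freedom is essentially the standard normal, so the cutoff below is taken from the normal distribution via the standard library (a t table would give nearly the same number):

```python
import math
from statistics import NormalDist

# Sample values from the question
n = 1000                  # sample size
x_bar = 115.0             # sample mean
mu_0 = 100.0              # hypothesized mean under H0
s = 41.0                  # sample standard deviation
alpha = 0.01              # significance level

# Test statistic: t = (x_bar - mu_0) / (s / sqrt(n))
t_stat = (x_bar - mu_0) / (s / math.sqrt(n))

# Two-tailed cutoff A: with 999 degrees of freedom the t distribution
# is essentially normal, so use the normal quantile at 1 - alpha/2
cutoff = NormalDist().inv_cdf(1 - alpha / 2)

print(round(t_stat, 2))        # → 11.57
print(round(cutoff, 2))        # → 2.58
print(abs(t_stat) > cutoff)    # → True, so reject H0
```

Since 11.57 is far beyond the cutoff of roughly 2.58, you would reject the null hypothesis at the .01 level.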