I have two sets of sample data. The first set has a mean of
100.1
and the second set has a mean of
192.1
If I want to find the mean difference between these two sets, how do I know which mean should be subtracted from which?
For example, I could subtract the second set's mean from the first:
100.1 - 192.1 = -92.0
or subtract the first set's mean from the second:
192.1 - 100.1 = 92.0
I'm confused about which should be subtracted from which, since I will then need to use the mean difference to calculate the 95% confidence interval. I would love some advice on this.
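To see concretely that the choice of direction only flips the sign of the difference and of the confidence-interval endpoints, here is a minimal sketch. The sample data are hypothetical (only the two means, 100.1 and 192.1, come from the question), and it uses a large-sample z interval (z = 1.96) rather than any particular t-based method:

```python
import math
import random
import statistics

# Hypothetical samples -- only the two means (100.1 and 192.1) are from the question.
random.seed(0)
a = [random.gauss(100.1, 15) for _ in range(50)]
b = [random.gauss(192.1, 15) for _ in range(50)]

def mean_diff_ci(x, y, z=1.96):
    """mean(x) - mean(y) with an approximate 95% CI (large-sample z interval)."""
    diff = statistics.fmean(x) - statistics.fmean(y)
    se = math.sqrt(statistics.variance(x) / len(x) + statistics.variance(y) / len(y))
    return diff, (diff - z * se, diff + z * se)

d_ab, ci_ab = mean_diff_ci(a, b)  # first minus second -> negative difference
d_ba, ci_ba = mean_diff_ci(b, a)  # second minus first -> same magnitude, positive

# Reversing the direction flips every sign; no information is lost either way.
assert math.isclose(d_ab, -d_ba)
assert math.isclose(ci_ab[0], -ci_ba[1])
assert math.isclose(ci_ab[1], -ci_ba[0])
```

Either direction is valid as long as you state it and interpret the interval consistently (e.g. "mean of set 2 minus mean of set 1").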