Standard error: SD multiplied by √n, or SD divided by √n?


I'm relearning stats after a number of years of non-use, and the textbook I'm using (which I like a lot) defines the SE as the square root of the number of draws from a box, multiplied by the standard deviation of the box's n members.
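In symbols, as best I can transcribe the book's rule (writing "SD of the box" for the standard deviation of the numbers in the box), that is:

$$\text{SE} = \sqrt{\text{number of draws}} \times \text{SD}_{\text{box}}$$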

Yet everywhere else I see the SE defined as the standard deviation divided by the square root of n.
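In other words:

$$\text{SE} = \frac{\text{SD}}{\sqrt{n}}$$

To make the contrast concrete: with 100 draws and an SD of 10, the book's rule gives $\sqrt{100} \times 10 = 100$, while the other formula gives $10 / \sqrt{100} = 1$, so the two clearly can't be describing the same quantity.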

Can someone help me understand this discrepancy? Thank you!