If $x_i$ is from a random sample, is the conditional variance of the mean (or the sum of squares, or really any statistic based on $x$) just treated as a constant? I saw this in an OLS variance-of-a-parameter proof. I'm just looking for some rationale as to why it is true.
I think this would also work for expected values. So, for example, $$E(\bar x \mid x_1,\ldots,x_n)=\bar x$$ is true as well. What's the reasoning behind these things?
I believe it can also be assumed that the $x_i$ form an independent and identically distributed random sample.
When you know all of your $X_i$, you also know $\bar{X}$; hence $E(\bar{X}\mid X_1,X_2,\ldots,X_n)=\bar{X}$, since conditional on the sample, the sample mean is no longer a random variable but a constant.
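Spelled out with linearity of conditional expectation, using the fact that each $X_i$ is known (hence constant) given the conditioning:

$$E(\bar{X}\mid X_1,\ldots,X_n)=\frac{1}{n}\sum_{i=1}^n E(X_i\mid X_1,\ldots,X_n)=\frac{1}{n}\sum_{i=1}^n X_i=\bar{X}.$$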
Thus, $$Var(\bar{X}\mid X_1,\ldots,X_n)=E(\bar{X}^2\mid X_1,\ldots,X_n)-\big(E(\bar{X}\mid X_1,\ldots,X_n)\big)^2=\bar{X}^2-\bar{X}^2=0.$$