The following question is from Introduction to Probability by Joe Blitzstein and Jessica Hwang:
Let $X_1, \cdots , X_n$ be i.i.d. r.v.s with mean $\mu$ and variance $\sigma^2$, and $n \geq 2$. A bootstrap sample of $X_1, \cdots , X_n$ is a sample of $n$ r.v.s $X_1^{\ast}, \cdots, X_n^{\ast}$ formed from the $X_j$ by sampling with replacement with equal probabilities. Let $\overline{X_n}^{\ast}$ denote the sample mean of the bootstrap sample: $$\overline{X_n}^{\ast} = \frac{1}{n} (X_1^{\ast} + \cdots + X_n^{\ast})$$
(a) Calculate E($X_j^{\ast}$) and Var($X_j^{\ast}$) for each $j$.
(b) Calculate E($\overline{X_n}^{\ast}|X_1, \cdots , X_n$) and Var($\overline{X_n}^{\ast}|X_1, \cdots , X_n$). Hint: Conditional on $X_1, \cdots , X_n$, the $X_j^{\ast}$ are independent, with a PMF that puts probability $\frac{1}{n}$ at each of the points $X_1, \cdots , X_n$. As a check, your answers should be random variables that are functions of $X_1, \cdots , X_n$.
(c) Calculate E($\overline{X_n}^{\ast}$) and Var($\overline{X_n}^{\ast}$).
(d) Explain intuitively why Var($\overline{X_n}$) < Var($\overline{X_n}^{\ast}$).
For part (a), I thought the mean and variance of each bootstrap draw $X_j^{\ast}$ would be the same as the mean and variance of each original $X_j$, namely $\mu$ and $\sigma^2$ respectively. However, I have seen some answers that use Adam's law and Eve's law, which confuses me, as I'm not sure why there is any need to condition on the original r.v.s.
Similarly, for part (b), I don't understand how the conditional expectation and variance can be functions of the random variables $X_1, \cdots, X_n$; that in turn is what would make part (c) a straightforward application of Adam's law and Eve's law to the answers from part (b).
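To sanity-check my intuition for part (a), I ran a quick Monte Carlo sketch (Python; the exponential distribution, sample size and seed are my own choices purely for illustration, with $\mu = \sigma^2 = 1$):

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 5, 200_000

# Original samples: Exponential(1), so mu = 1 and sigma^2 = 1.
X = rng.exponential(1.0, size=(reps, n))

# In each replication, draw one bootstrap observation X_1^* uniformly
# from that replication's X_1, ..., X_n (sampling with replacement).
idx = rng.integers(0, n, size=reps)
X1_star = X[np.arange(reps), idx]

print(X1_star.mean())   # should be close to mu = 1
print(X1_star.var())    # should be close to sigma^2 = 1
```

The unconditional mean and variance of $X_1^{\ast}$ do come out close to $\mu$ and $\sigma^2$, which seems to support my answer to (a), yet the answers I have seen still condition on the $X_j$.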
The statement
$X^*_1,\ldots,X^*_n$ are independent conditionally on $X=(X_1,\ldots,X_n)$
does not seem correct. More specifically, denote by $\nu_j$ the number of times $X_j$ is chosen. Then $$\Pr (\nu_1=k_1,\ldots,\nu_n=k_n)=\frac{1}{n^n}\frac{n!}{k_1!\cdots k_n!}$$ for $k_1+\cdots+k_n=n$, while $X_j^*=X_j\nu_j.$ Therefore $$E(e^{s_1\nu_1+\cdots+s_n\nu_n})=\left(\frac{1}{n}(e^{s_1}+\cdots+e^{s_n})\right)^n,\ \ \ \ (*)$$ $$E(e^{s_1X_1^*+\cdots+s_nX_n^*}|X)=E(e^{s_1\nu_1X_1+\cdots+s_n\nu_nX_n}|X)=\left(\frac{1}{n}(e^{s_1X_1}+\cdots+e^{s_nX_n})\right)^n,$$ and this last function of $s_1,\ldots,s_n$ is not a product of a function of $s_1$ alone, a function of $s_2$ alone, and so on; hence the claimed independence does not take place.
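The count formula above can be checked by brute-force enumeration for a small $n$ (a short Python sketch, not part of the argument; $n=3$ is chosen so all $n^n = 27$ sequences can be listed):

```python
from itertools import product
from math import factorial
from collections import Counter

n = 3  # small enough to enumerate all n^n = 27 draw sequences

# Each bootstrap draw picks one of the n original indices uniformly,
# so a full sample is a sequence in {0,...,n-1}^n, all n^n equally likely.
counts = Counter()
for seq in product(range(n), repeat=n):
    nu = tuple(seq.count(j) for j in range(n))  # (nu_1, ..., nu_n)
    counts[nu] += 1

# Compare each empirical probability with the multinomial formula.
for nu, c in sorted(counts.items()):
    pmf = factorial(n) / (n ** n)
    for k in nu:
        pmf /= factorial(k)
    assert abs(c / n ** n - pmf) < 1e-12
print("multinomial count formula verified for n =", n)
```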
I have also computed $E(X_1^*)=E(\nu_1X_1)=E(\nu_1)E(X_1)=E(X_1)$, since $\nu_1\sim B(n,1/n)$ is binomial and independent of $X_1$. Also $E(\nu_1^2)= 2-\frac{1}{n}$, leading to $$\sigma^2(X_1^*)=\sigma^2(X_1)+\left(1-\frac{1}{n}\right)E(X_1^2).$$ Thus your claim that $\sigma^2(X_1^*)=\sigma^2(X_1)$ does not seem correct.
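Since $\nu_1$ is independent of $X_1$, this variance is $E(\nu_1^2)E(X_1^2)-E(X_1)^2=\left(2-\frac{1}{n}\right)E(X_1^2)-E(X_1)^2$, which a short simulation confirms (exponential $X_1$, $n=5$ and the seed are chosen purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 5, 400_000

# nu_1 ~ Binomial(n, 1/n), drawn independently of X_1.
# Here X_1 ~ Exponential(1), so E(X_1) = 1 and E(X_1^2) = 2.
nu1 = rng.binomial(n, 1.0 / n, size=reps)
X1 = rng.exponential(1.0, size=reps)

var_sim = (nu1 * X1).var()
var_exact = (2 - 1 / n) * 2.0 - 1.0 ** 2  # E(nu_1^2) E(X_1^2) - E(X_1)^2
print(var_sim, var_exact)  # both near 2.6, well above sigma^2(X_1) = 1
```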
I have also computed $\sigma^2(\overline{X_n})<\sigma^2(\overline{X_n^*})$, but the calculation is a bit too long to display here; it uses (*) for computing $E(\nu_1\nu_2)$.
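For readers who would rather check the inequality numerically than repeat the long calculation, here is a quick Monte Carlo sketch (Python; exponential data with $\sigma^2=1$, and $n=5$, chosen purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 5, 200_000

X = rng.exponential(1.0, size=(reps, n))  # sigma^2 = 1
xbar = X.mean(axis=1)                     # sample mean of X_1, ..., X_n

# One full bootstrap sample per replication: resample each row's
# entries with replacement, then average.
idx = rng.integers(0, n, size=(reps, n))
xbar_star = np.take_along_axis(X, idx, axis=1).mean(axis=1)

print(xbar.var())       # ~ sigma^2 / n = 0.2
print(xbar_star.var())  # larger, ~ 0.36
```

The simulated $\sigma^2(\overline{X_n^*})$ is visibly larger than $\sigma^2(\overline{X_n})$, consistent with part (d).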