Let $X_1,\ldots,X_n$ be random variables with expectations $\mu_{1},\ldots,\mu_{n}$, respectively. Further, let $\pi$ be a random variable taking values in $\left\{1,\ldots,n\right\}$. In the machine learning literature, it is claimed that $\mathbb{E}[X_{\pi}]=\mathbb{E}[\mu_{\pi}]$. Is this true?
I know that by the tower property, $\mathbb{E}[X_{\pi}]=\mathbb{E}[\mathbb{E}[X_{\pi}\mid\pi]]$, where $\mathbb{E}[X_{\pi}\mid\pi]$ is the conditional expectation of $X_{\pi}$ with respect to the $\sigma$-algebra generated by $\pi$. Intuition says that $\mathbb{E}[X_{\pi}\mid\pi]=\mu_{\pi}$, since once $\pi$ is known, $X_{\pi}$ becomes a random variable with a deterministic index.
However, I can't manage to show that $\mathbb{E}[\mathbb{I}_{\left\{\pi=k\right\}}\mu_{\pi}]=\mathbb{E}[\mathbb{I}_{\left\{\pi=k\right\}}X_{\pi}]$ for all $k\in [n]$, which would establish that $\mu_{\pi}$ is a version of $\mathbb{E}[X_{\pi}\mid\pi]$.
Thank you for your help in advance.
I think the two expressions can be connected with the right way of thinking about the experiment.
First, for $\mathbb{E}[\mu_{\pi}]$, the experiment is drawing $\pi$, and the possible outcomes are $\{\mu_1,\mu_2,\ldots,\mu_n\}$:
$\mathbb{E}[\mu_{\pi}]=\sum_{i=1}^{n}\mu_{i}P(\pi=i).$
For $\mathbb{E}[X_{\pi}]$, the experiment has two stages. First you draw $\pi$, which gives an outcome $i\in\{1,2,\ldots,n\}$. Then, knowing which random variable $X_i$ you have, you draw from $X_i$, whose conditional distribution is $P(X_i=j\mid\pi=i)$. Therefore (taking the $X_i$ discrete for concreteness), you get
$\mathbb{E}[X_{\pi}]=\sum_{i=1}^{n} \sum_{j} P(\pi=i)P(X_i=j\mid\pi=i)\times j$
Then,
$\sum_{i=1}^{n} \sum_{j} P(\pi=i)P(X_i=j\mid\pi=i)\times j=\sum_{i=1}^{n} P(\pi=i) \sum_{j} P(X_i=j\mid\pi=i)\times j.$
Then, using the definition of the expected value of a random variable, and assuming that $\pi$ and the $X_i$ are independent (so that $E(X_i\mid\pi=i)=E(X_i)=\mu_i$),
$\sum_{i=1}^{n} P(\pi=i) \sum_{j} P(X_i=j\mid\pi=i)\times j=\sum_{i=1}^{n} P(\pi=i)E(X_i\mid\pi=i)=\sum_{i=1}^{n} P(\pi=i)\mu_i=\mathbb{E}[\mu_{\pi}].$
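As a sanity check, here is a small simulation sketch. The distributions of the $X_i$ and of $\pi$ are chosen arbitrarily for illustration; the index $\pi$ is drawn independently of the $X_i$, matching the independence assumption above. It compares a Monte Carlo estimate of $\mathbb{E}[X_{\pi}]$ with the exact value $\mathbb{E}[\mu_{\pi}]=\sum_i \mu_i P(\pi=i)$:

```python
import random

random.seed(0)

# Three discrete random variables X_1, X_2, X_3, each given as
# (values, probabilities); the numbers are arbitrary illustration choices.
supports = [
    ([0, 1], [0.5, 0.5]),          # X_1, mean 0.5
    ([1, 2, 3], [0.2, 0.3, 0.5]),  # X_2, mean 2.3
    ([10, 20], [0.9, 0.1]),        # X_3, mean 11.0
]
mus = [sum(v * p for v, p in zip(vals, probs)) for vals, probs in supports]

# Distribution of the random index pi, drawn independently of the X_i.
pi_probs = [0.2, 0.5, 0.3]

# Exact value of E[mu_pi] = sum_i mu_i * P(pi = i).
exact = sum(m * p for m, p in zip(mus, pi_probs))

# Monte Carlo estimate of E[X_pi]: draw pi, then draw X_pi independently.
n_samples = 200_000
total = 0.0
for _ in range(n_samples):
    i = random.choices(range(3), weights=pi_probs)[0]
    vals, probs = supports[i]
    total += random.choices(vals, weights=probs)[0]
estimate = total / n_samples

print(exact, estimate)  # the two should agree up to sampling noise
```

With independence the two printed numbers coincide up to Monte Carlo error; if instead the distribution of $X_i$ changed depending on the event $\{\pi=i\}$, the step $E(X_i\mid\pi=i)=\mu_i$ would fail and so would the identity.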