I am stuck on the following question and would be grateful for any help. Assume we have a simple Gaussian mixture: \begin{align} \text{with prob. } \pi: y &=\mu+\epsilon\\ \text{with prob. } 1-\pi: y &= \epsilon \end{align} I have a prior $\mu \sim \mathcal{N}(\mu_0, \sigma_0^2)$, and I know that $\epsilon \sim \mathcal{N}(0, \sigma^2)$. I observe a single observation $y_1$ and would like to write down the posterior expectation $\mathrm{E}[\mu \vert y_1]$ (and the corresponding density $p(\mu \vert y_1)$) in two cases: (1) when $\pi$ is known, and (2) when $\pi$ is unknown and I only have a uniform prior on it, $\pi \sim U[0,1]$.
For the first case, I think we should have $\mathrm{E}[\mu\vert y_1]=\pi \left(\mu_0 + \frac{\sigma_0^2}{\sigma_0^2 + \sigma^2}(y_1-\mu_0)\right)+(1-\pi)\mu_0$, while for the second case I get $\mathrm{E}[\mu\vert y_1]=\int_0^1 \left[\pi \left(\mu_0 + \frac{\sigma_0^2}{\sigma_0^2 + \sigma^2}(y_1-\mu_0)\right)+(1-\pi)\mu_0\right] \, \mathrm{d}\pi$.
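If it helps, here is a quick grid-based check I can run for the known-$\pi$ case. The parameter values are arbitrary (my own choice, just for the check), and `norm_pdf` is a hand-rolled normal density; the grid approximates $p(\mu \vert y_1) \propto p(y_1 \vert \mu)\, p(\mu)$ directly, so its mean can be compared against my closed-form guess:

```python
import numpy as np

# Arbitrary example values, only for the numerical check
mu0, sig0, sig, pi_, y1 = 1.0, 2.0, 1.0, 0.3, 2.5

def norm_pdf(x, m, s):
    # density of N(m, s^2) at x
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

# Dense grid over mu; the prior puts essentially all mass within +-10 sigma_0
mu = np.linspace(mu0 - 10 * sig0, mu0 + 10 * sig0, 200001)

# p(mu | y1) is proportional to p(y1 | mu) p(mu), with pi known:
# p(y1 | mu) = pi N(y1; mu, sig^2) + (1 - pi) N(y1; 0, sig^2)
lik = pi_ * norm_pdf(y1, mu, sig) + (1 - pi_) * norm_pdf(y1, 0.0, sig)
post = lik * norm_pdf(mu, mu0, sig0)
post /= post.sum()                      # normalize on the grid

grid_mean = (mu * post).sum()           # numerical E[mu | y1]

# My conjectured closed form from above
shrunk = mu0 + sig0**2 / (sig0**2 + sig**2) * (y1 - mu0)
conjectured_mean = pi_ * shrunk + (1 - pi_) * mu0

print(grid_mean, conjectured_mean)
```

Comparing the two printed numbers for a few parameter settings would tell me whether my formula matches the actual posterior mean.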
Are my calculations correct? What confuses me is that the observation $y_1$ is not used in any way to update the prior on $\pi$ in the second case. Shouldn't $y_1$ tell me something about both $\mu$ and $\pi$?
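To probe this confusion numerically, I can also look at the marginal posterior $p(\pi \vert y_1)$ on a two-dimensional grid over $(\mu, \pi)$ and see whether it still looks uniform. Again the parameter values are arbitrary assumptions of mine, and `norm_pdf` is a hand-rolled normal density:

```python
import numpy as np

# Arbitrary example values, only for the numerical check
mu0, sig0, sig, y1 = 1.0, 2.0, 1.0, 2.5

def norm_pdf(x, m, s):
    # density of N(m, s^2) at x
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

mu = np.linspace(mu0 - 10 * sig0, mu0 + 10 * sig0, 2001)  # grid for mu
pi_grid = np.linspace(0.0, 1.0, 1001)                     # grid for pi

# Joint posterior p(mu, pi | y1) on the grid, up to a constant:
# prior(mu) * prior(pi) * [pi N(y1; mu, sig^2) + (1 - pi) N(y1; 0, sig^2)],
# where prior(pi) = 1 on [0, 1]
M, P = np.meshgrid(mu, pi_grid, indexing="ij")
joint = norm_pdf(M, mu0, sig0) * (
    P * norm_pdf(y1, M, sig) + (1 - P) * norm_pdf(y1, 0.0, sig)
)
joint /= joint.sum()

post_pi = joint.sum(axis=0)        # marginal posterior over pi (grid weights)
e_pi = (pi_grid * post_pi).sum()   # posterior mean of pi

print(e_pi)
```

If the printed posterior mean of $\pi$ differs from the prior mean $0.5$, that would suggest $y_1$ does carry information about $\pi$, which is what I suspect.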