Consider the probabilistic PCA setting, where $x \in \mathbb{R}^d$ is an input vector drawn from $p(x)$, $z \in \mathbb{R}^m$ is an explicit latent variable with $p(z) = \mathcal{N}(0,\mathbb{I}_m)$, and $p(x \mid z, \theta) = \mathcal{N}(Wz + \mu, \sigma^2 \mathbb{I}_d)$, where $W \in \mathbb{R}^{d \times m}$, $\mu \in \mathbb{R}^d$, and $\sigma^2 \in \mathbb{R}^{+}$ are parameters. For the E-step, I want to show that: $$\mathbb{E}[z_n \mid x_n] = (W^T W + \sigma^2 \mathbb{I}_m)^{-1} W^T (x_n - \mu).$$ To that end, I wrote $$p(z_n \mid x_n) = \frac{p(x_n \mid z_n) \, p(z_n)}{\int p(x_n \mid z) \, p(z) \, dz},$$ but now I need to show that $p(z_n \mid x_n)$ is again a Gaussian with mean $(W^T W + \sigma^2 \mathbb{I}_m)^{-1} W^T (x_n - \mu)$.
How can I get there?
Let $p(z|x_n)$ denote the conditional density of $z$ given $x=x_n$. To identify $p(z|x_n)$, we may drop any multiplicative factor that does not depend on $z$.
Bayes' rule states that $$\begin{aligned}p(z|x_n)&=\frac{p(x_n|z)\,p(z)}{p(x_n)}\\ &\propto p(x_n|z)\,p(z)\\ &\propto \exp\Big(-\frac{1}{2\sigma^2}\|Wz+\mu-x_n\|^2 - \frac 12 \|z\|^2\Big)\\ &\propto \exp\Big(-\frac 1{2\sigma^2}\big(z^T(W^TW+\sigma^2I_m)z+2z^TW^T(\mu-x_n)\big)\Big)\\ &\propto \exp\Big(-\frac 1{2}\Big[z^T\tfrac 1{\sigma^2}(W^TW+\sigma^2I_m)z -2z^T\tfrac 1{\sigma^2}(W^TW+\sigma^2I_m)(W^TW+\sigma^2I_m)^{-1}W^T(x_n-\mu)\Big]\Big) \end{aligned} $$
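As a quick numerical sanity check (a sketch, not part of the derivation), you can verify that the exponent before and after completing the quadratic form differ only by a term independent of $z$, using randomly drawn $W$, $\mu$, $x_n$:

```python
import numpy as np

rng = np.random.default_rng(0)
d, m, sigma2 = 5, 3, 0.7
W = rng.standard_normal((d, m))
mu = rng.standard_normal(d)
x = rng.standard_normal(d)

def exponent_raw(z):
    # -1/(2 sigma^2) ||Wz + mu - x||^2 - 1/2 ||z||^2
    return -np.sum((W @ z + mu - x) ** 2) / (2 * sigma2) - 0.5 * np.sum(z ** 2)

def exponent_quadratic(z):
    # -1/(2 sigma^2) (z^T (W^T W + sigma^2 I_m) z + 2 z^T W^T (mu - x))
    A = W.T @ W + sigma2 * np.eye(m)
    return -(z @ A @ z + 2 * z @ W.T @ (mu - x)) / (2 * sigma2)

# The difference should be the same constant for every z (the dropped factor).
zs = rng.standard_normal((4, m))
diffs = [exponent_raw(z) - exponent_quadratic(z) for z in zs]
assert np.allclose(diffs, diffs[0])
```

The constant difference is exactly $-\frac{1}{2\sigma^2}\|\mu - x_n\|^2$, the $z$-free term discarded under $\propto$.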
Now, as a function of $z$, the density of $\mathcal N(a,\Sigma)$ is proportional to $\exp\big(-\frac{1}2\left[z^T\Sigma^{-1}z-2z^T\Sigma^{-1}a\right]\big)$.
Matching terms, $p(z|x_n)$ is the density of a multivariate normal with mean $$(W^TW+\sigma^2I_m)^{-1}W^T(x_n-\mu)$$ and covariance matrix $$\sigma^2(W^TW+\sigma^2I_m)^{-1}.$$
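You can cross-check this mean numerically against a different route (a sketch under the standard Gaussian-conditioning formula, which is not derived here): for jointly Gaussian $(z, x)$ one has $\mathbb{E}[z\mid x_n]=\operatorname{Cov}(z,x)\operatorname{Cov}(x,x)^{-1}(x_n-\mu)$ with $\operatorname{Cov}(z,x)=W^T$ and $\operatorname{Cov}(x,x)=WW^T+\sigma^2I_d$; by the push-through identity this agrees with the $m\times m$ form above:

```python
import numpy as np

rng = np.random.default_rng(1)
d, m, sigma2 = 6, 2, 0.5
W = rng.standard_normal((d, m))
mu = rng.standard_normal(d)
x = rng.standard_normal(d)

# m x m form from the derivation above: (W^T W + sigma^2 I_m)^{-1} W^T (x - mu)
mean_small = np.linalg.solve(W.T @ W + sigma2 * np.eye(m), W.T @ (x - mu))

# d x d form from Gaussian conditioning: W^T (W W^T + sigma^2 I_d)^{-1} (x - mu)
mean_big = W.T @ np.linalg.solve(W @ W.T + sigma2 * np.eye(d), x - mu)

assert np.allclose(mean_small, mean_big)
```

The $m \times m$ form is the one used in practice, since typically $m \ll d$.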
Besides, note that $W^TW+\sigma^2I_m$ is invertible: its eigenvalues are those of the positive semidefinite matrix $W^TW$ shifted by $\sigma^2>0$, hence all strictly positive.
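This eigenvalue-shift argument is also easy to confirm numerically (a sketch with a random $W$):

```python
import numpy as np

rng = np.random.default_rng(2)
d, m, sigma2 = 5, 3, 0.3
W = rng.standard_normal((d, m))

eig_WtW = np.linalg.eigvalsh(W.T @ W)
eig_shifted = np.linalg.eigvalsh(W.T @ W + sigma2 * np.eye(m))

# Spectrum is shifted by sigma^2, so every eigenvalue is >= sigma^2 > 0
# and the matrix is invertible.
assert np.allclose(eig_shifted, eig_WtW + sigma2)
assert np.all(eig_shifted >= sigma2 - 1e-12)
```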