I am trying to work through the derivation below from https://zhiyzuo.github.io/VI/, a tutorial on Variational Inference (CAVI for a Gaussian mixture model):
\begin{align} \dfrac{\partial}{\partial \phi_{ij}}~\text{ELBO} & \propto \dfrac{\partial}{\partial \phi_{ij}}\Big\{E_q\Big[-\dfrac{(x_i-\mu_j)^2}{2}\Big] \phi_{ij} - \phi_{ij}\log~\phi_{ij} \Big\}\\ & = E_q\Big[-\dfrac{(x_i-\mu_j)^2}{2}\Big] - \log~\phi_{ij} - 1 = 0 \\ \log~\phi_{ij} & \propto E_q\Big[-\dfrac{(x_i-\mu_j)^2}{2}\Big] \\ \phi_{ij}^* & \propto \exp\{ -\tfrac{1}{2}(m_j^2+s_j^2) + x_i m_j \}\end{align}
and
\begin{align} \dfrac{\partial}{\partial m_{j}}~\text{ELBO} & \propto \dfrac{\partial}{\partial m_{j}}~\Big\{ -E\big[\dfrac{\mu_j^2}{2\sigma^2}\big] - \sum_i \phi_{ij} E[\dfrac{(x_i-\mu_j)^2}{2}] \Big\} \\ & \propto \dfrac{\partial}{\partial m_{j}}~\Big\{ -\dfrac{1}{2\sigma^2} m_j^2 + \sum_i \phi_{ij} \big[ -\dfrac{1}{2}m_j^2 + x_i m_j \big] \Big\} \\ & = -\dfrac{1}{\sigma^2}m_j - \sum_i\phi_{ij} m_j + \sum_i \phi_{ij} x_i = 0 \\ m_j^* &= \dfrac{\sum_i\phi_{ij}x_i}{\tfrac{1}{\sigma^2} + \sum_i\phi_{ij}} \end{align}
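As a numerical sanity check of the two update equations above, here is a minimal CAVI sketch in Python/NumPy for the 1-D mixture with unit component variance and a $\mathcal{N}(0,\sigma^2)$ prior on the means. The function name `cavi_gmm` and the variance update $s_j^2 = 1/(1/\sigma^2 + \sum_i \phi_{ij})$ (which shares the denominator of $m_j^*$) are my additions, not from the blog post:

```python
import numpy as np

def cavi_gmm(x, K, sigma2=1.0, n_iter=50, seed=0):
    """CAVI for a 1-D Bayesian GMM: unit-variance components,
    N(0, sigma2) prior on the means. A sketch of the updates
    derived above, not the blog post's exact code."""
    rng = np.random.default_rng(seed)
    m = rng.normal(size=K)   # q(mu_j) = N(m_j, s_j^2)
    s2 = np.ones(K)
    for _ in range(n_iter):
        # phi_ij* ∝ exp{ x_i m_j - (m_j^2 + s_j^2)/2 }
        logits = np.outer(x, m) - 0.5 * (m**2 + s2)
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        phi = np.exp(logits)
        phi /= phi.sum(axis=1, keepdims=True)         # q(c_i) = Categorical(phi_i)
        # m_j* = sum_i phi_ij x_i / (1/sigma2 + sum_i phi_ij)
        denom = 1.0 / sigma2 + phi.sum(axis=0)
        m = (phi * x[:, None]).sum(axis=0) / denom
        s2 = 1.0 / denom
    return phi, m, s2
```

On well-separated synthetic data (e.g. two clusters around $\pm 3$), the returned `m` lands near the cluster means, which is a quick way to check the algebra.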
Given:
\begin{align}E[\mu_j] = m_j \text{; } E[\mu_j^2] = V[\mu_j] + E^2[\mu_j] = s_j^2 + m_j^2\end{align}
I am trying to evaluate the following expectation (taken with respect to the variational distribution $q$):
\begin{align}{E_q\Big[-\dfrac{(x_i-\mu_j)^2}{2}\Big]}\end{align}
The two results quoted in the derivations above are:
\begin{align} -\tfrac{1}{2}(m_j^2+s_j^2) + x_i m_j \end{align}
\begin{align} \big[ -\dfrac{1}{2}m_j^2 + x_i m_j \big] \end{align}
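One can check numerically that expanding the square and substituting $E[\mu_j]=m_j$, $E[\mu_j^2]=s_j^2+m_j^2$ reproduces the first result up to the additive constant $-x_i^2/2$ (absorbed by the $\propto$). A Monte Carlo sketch, with arbitrary illustrative values of $x_i$, $m_j$, $s_j$:

```python
import numpy as np

rng = np.random.default_rng(0)

# arbitrary values for illustration
x_i, m_j, s_j = 1.5, 0.7, 0.3

# Monte Carlo: sample mu_j from q(mu_j) = N(m_j, s_j^2)
mu = rng.normal(m_j, s_j, size=1_000_000)
mc = np.mean(-(x_i - mu) ** 2 / 2)

# closed form from expanding the square:
# E_q[-(x_i - mu_j)^2 / 2] = -x_i^2/2 + x_i*E[mu_j] - E[mu_j^2]/2
closed = -x_i**2 / 2 + x_i * m_j - (m_j**2 + s_j**2) / 2

print(mc, closed)  # agree up to Monte Carlo error
```

The two printed values match to several decimal places, confirming the closed form.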
Question:
1. How do I evaluate this expectation to obtain the two individual results above? I tried expanding it by factorization but had no luck.
2. What does the subscript $q$ on the expectation mean? Is it the same as an ordinary expectation?
Sorry for the poor syntax; I hope someone can help me.