The following is from the original VAE (variational autoencoder) paper by Kingma and Welling (2014):
B. Solution of $D_{KL}(q_\phi(z)||p_\theta(z))$ of Gaussian case
The variational lower bound (the objective to be maximized) contains a $KL$ term that can often be integrated analytically. Here we give the solution when both the prior $p_\theta(z) = \mathcal{N}(0, I)$ and the posterior approximation $q_\phi(z|x^{(i)})$ are Gaussian. Let $J$ be the dimensionality of $z$. Let $\mu$ and $\sigma$ denote the variational mean and s.d. evaluated at datapoint $i$, and let $\mu_j$ and $\sigma_j$ simply denote the $j$-th element of these vectors. Then:
$$\int q_\theta(z) \log p(z) \, dz = \int \mathcal{N}(z; \mu, \sigma^2) \log \mathcal{N}(z; 0, I) \, dz = -\frac{J}{2} \log(2\pi) - \frac{1}{2} \sum_{j=1}^{J} \left( \mu_j^2 + \sigma_j^2 \right)$$
I can't understand how the second equality in the equation above is calculated. Any hint for understanding this equality?
I suppose that the distribution $N(\mu,\sigma^2)$ is actually $\mathcal{N}(\mu,\Sigma)$, where $\Sigma$ is a $J\times J$ covariance matrix; in the VAE paper it is diagonal, so $\sigma^2$ denotes the vector of its diagonal entries.
Expand the distribution inside the log as \begin{align*} &\int \mathcal{N}(\mu,\Sigma) \log \left( \frac{1}{\sqrt{2\pi}^J} \exp\left( -\frac{z^T I z}{2} \right) \right) dz\\ =&\int \mathcal{N}(\mu,\Sigma) \log\left( \frac{1}{\sqrt{2\pi}^J} \right) dz + \int \mathcal{N}(\mu,\Sigma) \log \left( \exp\left( -\frac{z^T I z}{2} \right) \right) dz\\ =&-\frac{J}{2}\log(2\pi)\int \mathcal{N}(\mu,\Sigma)dz - \frac{1}{2}\int \mathcal{N}(\mu,\Sigma) z^Tzdz\\ =&-\frac{J}{2}\log(2\pi) \mathbb{E}_{Z}[1] - \frac{1}{2}\mathbb{E}_Z[Z^T Z]\\ =&-\frac{J}{2}\log(2\pi)- \frac{1}{2}\mathbb{E}_Z\left[\sum_{i=1}^J Z_i^2\right]\\ =&-\frac{J}{2}\log(2\pi)- \frac{1}{2}\sum_{i=1}^J \left( \mu_i^2+\sigma_i^2 \right) \end{align*}
The expectation $\mathbb{E}_Z$ is for $Z\sim \mathcal{N}(\mu,\Sigma)$ and $\sigma_i^2=\Sigma_{ii}$. The last step uses the second moment of each coordinate: $\mathbb{E}[Z_i^2] = \mathrm{Var}(Z_i) + \mathbb{E}[Z_i]^2 = \sigma_i^2 + \mu_i^2$.
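As a sanity check on the derivation, a Monte Carlo estimate of $\mathbb{E}_{z\sim\mathcal{N}(\mu,\,\mathrm{diag}(\sigma^2))}[\log \mathcal{N}(z;0,I)]$ can be compared against the closed form. This is a minimal sketch in NumPy; the values of `J`, `mu`, and `sigma` are arbitrary choices for illustration, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary example parameters (assumed for illustration)
J = 3
mu = np.array([0.5, -1.0, 2.0])
sigma = np.array([0.3, 1.5, 0.7])

# Draw z ~ N(mu, diag(sigma^2)) and average log N(z; 0, I)
z = rng.normal(mu, sigma, size=(1_000_000, J))
log_p = -0.5 * J * np.log(2 * np.pi) - 0.5 * np.sum(z**2, axis=1)
mc_estimate = log_p.mean()

# Closed form: -J/2 log(2*pi) - 1/2 * sum_j (mu_j^2 + sigma_j^2)
closed_form = -0.5 * J * np.log(2 * np.pi) - 0.5 * np.sum(mu**2 + sigma**2)

print(mc_estimate, closed_form)
```

With a million samples the two numbers agree to within Monte Carlo error, which is a quick way to convince yourself the analytic result is right before trusting it in a VAE loss.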