Here is a result from the appendix of the paper
The Variational Gaussian Approximation Revisited (Manfred Opper and Cédric Archambeau).
This result is used in the popular paper Practical Variational Inference for Neural Networks (Alex Graves).
Here is the statement:
Derivatives of multivariate Gaussian expectations with respect to the mean $\boldsymbol{\mu}$ and the covariance $\boldsymbol{\Sigma}$ are most conveniently computed using the characteristic function $$G(\mathbf{k})=\left\langle e^{i \mathbf{k}^{\top} \mathbf{x}}\right\rangle_{q}=e^{-\frac{1}{2} \mathbf{k}^{\top} \boldsymbol{\Sigma} \mathbf{k}+i \mathbf{k}^{\top} \boldsymbol{\mu}}$$
of the Gaussian measure $q$. Using standard Fourier analysis, we can express the expectation of any (sufficiently regular) function $V$ as $$\begin{aligned}\langle V(\mathbf{x})\rangle_{q} &=\frac{1}{(2 \pi)^{n}} \int G(\mathbf{k}) e^{-i \mathbf{k}^{\top} \mathbf{y}} V(\mathbf{y}) d \mathbf{y} d \mathbf{k} \\ &=\frac{1}{(2 \pi)^{n}} \int e^{-\frac{1}{2} \mathbf{k}^{\top} \boldsymbol{\Sigma} \mathbf{k}+i \mathbf{k}^{\top}(\boldsymbol{\mu}-\mathbf{y})} V(\mathbf{y}) d \mathbf{y} d \mathbf{k} \end{aligned}$$
This shows that any derivative with respect to $\boldsymbol{\mu}$ acting on the exponential under the integral is, up to a sign, a derivative with respect to $\mathbf{y}$. Integrating by parts with respect to $\mathbf{y}$ then moves that derivative onto $V$, which yields
$$\begin{aligned} \nabla_{\boldsymbol{\mu}}\langle V(\mathbf{x})\rangle_{q} &=\left\langle\nabla_{\mathbf{x}} V(\mathbf{x})\right\rangle_{q} \\ \nabla_{\boldsymbol{\Sigma}}\langle V(\mathbf{x})\rangle_{q} &=\frac{1}{2}\left\langle\nabla_{\mathbf{x}} \nabla_{\mathbf{x}} V(\mathbf{x})\right\rangle_{q}=\frac{1}{2} \nabla_{\boldsymbol{\mu}} \nabla_{\boldsymbol{\mu}}\langle V(\mathbf{x})\rangle_{q} \end{aligned}$$
where the second equality in the last line follows from the first line.
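To make the step concrete, here is what I believe the intended manipulation is in one dimension (a sketch; it assumes $V$ grows slowly enough that the boundary terms in the integration by parts vanish):

$$\begin{aligned}
\partial_{\mu}\langle V(x)\rangle_{q}
&=\frac{1}{2\pi}\int (ik)\, e^{-\frac{1}{2}\sigma^{2}k^{2}+ik(\mu-y)}\, V(y)\, dy\, dk \\
&=\frac{1}{2\pi}\int \left(-\partial_{y}\, e^{-\frac{1}{2}\sigma^{2}k^{2}+ik(\mu-y)}\right) V(y)\, dy\, dk \\
&=\frac{1}{2\pi}\int e^{-\frac{1}{2}\sigma^{2}k^{2}+ik(\mu-y)}\, \partial_{y}V(y)\, dy\, dk
=\left\langle V'(x)\right\rangle_{q},
\end{aligned}$$

where the second line uses $\partial_{y}\, e^{ik(\mu-y)} = -ik\, e^{ik(\mu-y)}$ and the third is the integration by parts.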
I followed this hint and tried integration by parts with respect to $\mathbf{y}$, but I could not reproduce the result. Could anyone kindly help?
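For what it's worth, the two identities do check out numerically. Below is a one-dimensional sanity check (not from the paper; the test function $V$ and all parameter values are my own arbitrary choices) that computes $\langle V\rangle_q$ by Gauss–Hermite quadrature and compares finite-difference derivatives in $\mu$ and $\sigma^2$ against $\langle V'\rangle_q$ and $\frac{1}{2}\langle V''\rangle_q$:

```python
import numpy as np

# Gauss-Hermite nodes/weights for integrals of the form int exp(-t^2) f(t) dt.
nodes, weights = np.polynomial.hermite.hermgauss(60)

def expect(f, mu, sigma2):
    # E_q[f(x)] for q = N(mu, sigma2), via the substitution x = mu + sqrt(2*sigma2)*t,
    # which maps the Gaussian density onto the Hermite weight exp(-t^2).
    return (weights * f(mu + np.sqrt(2.0 * sigma2) * nodes)).sum() / np.sqrt(np.pi)

# Arbitrary smooth test function and its first two derivatives.
V = lambda x: np.sin(x) + 0.1 * x**2
dV = lambda x: np.cos(x) + 0.2 * x
d2V = lambda x: -np.sin(x) + 0.2

mu, sigma2, h = 0.7, 0.5, 1e-5  # arbitrary mean, variance, finite-difference step

# Central finite differences of <V> with respect to mu and to the variance.
lhs_mu = (expect(V, mu + h, sigma2) - expect(V, mu - h, sigma2)) / (2.0 * h)
lhs_sig = (expect(V, mu, sigma2 + h) - expect(V, mu, sigma2 - h)) / (2.0 * h)

print(abs(lhs_mu - expect(dV, mu, sigma2)))          # tiny: matches <V'(x)>
print(abs(lhs_sig - 0.5 * expect(d2V, mu, sigma2)))  # tiny: matches (1/2)<V''(x)>
```

So the statement itself is fine; the difficulty must be in my integration-by-parts bookkeeping.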