In the paper
Sharpee, T., Rust, N.C., Bialek, W.: Analyzing neural responses to natural signals: maximally informative dimensions. Neural Comput. 16, 223–250 (2004).
I found the following claim (equation A.1, p. 242):
For any function $F(\mathbf x)$ of a Gaussian random variable $\mathbf x\in \mathbb R^n$, the following identity holds:
$$\left\langle x_i F(\mathbf x) \right\rangle = \left\langle x_i x_j\right\rangle \cdot \left\langle \frac{\partial }{\partial x_j} F(\mathbf x) \right\rangle,$$ where $\langle \cdot \rangle$ denotes the expectation operator.
This equation, read with fixed indices, is obviously wrong. To see this, let $\mathbf x \in\mathbb R^2$, take $i = 1$, $j = 2$, and $F(\mathbf x) = x_1$. Then $\frac{\partial }{\partial x_2} F(\mathbf x) = 0$, so the right-hand side vanishes, which would imply $\left\langle x_1 F(\mathbf x) \right\rangle=\left\langle x_1^2 \right\rangle=0$. This is false for every nondegenerate normal random variable.
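The failure of the fixed-index reading is easy to confirm numerically; here is a minimal Monte Carlo sketch (my own, using NumPy and an arbitrarily chosen nondegenerate covariance matrix) of the counterexample above:

```python
# Monte Carlo check of the counterexample (my own sketch, not from the
# paper): x ~ N(0, Sigma) in R^2, F(x) = x_1, indices fixed at i = 1, j = 2.
import numpy as np

rng = np.random.default_rng(0)
Sigma = np.array([[1.0, 0.5],
                  [0.5, 2.0]])       # an arbitrary nondegenerate covariance
x = rng.multivariate_normal(np.zeros(2), Sigma, size=200_000)

lhs = np.mean(x[:, 0] * x[:, 0])     # <x_1 F(x)> with F(x) = x_1
# dF/dx_2 = 0 identically, so the right-hand side is <x_1 x_2> * 0:
rhs = Sigma[0, 1] * 0.0

print(lhs, rhs)                      # lhs ≈ Var(x_1) = 1.0, but rhs = 0.0
```

Under this reading the two sides clearly disagree, exactly as the counterexample predicts.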
However, I would be curious whether there is a correct version of the equation in the paper. I tried to make sense of it but was not successful. In particular, it puzzles me how the Gaussian assumption enters the equation.
Has anyone encountered a similar property of Gaussian random variables and could clarify this for me?
See also this question about the same formula, which the authors used in a different paper: https://mathoverflow.net/questions/200930/expectation-of-gaussian-random-vector-arbitrary-function-thereof