Suppose $f$ is a probability density on $y \in \mathbb{R}$ and $\theta \in \mathbb{R}$ is some parameter.
Define $$S(\theta) := \frac{\partial}{\partial \theta} \log f(y; \theta)$$
Then I'm trying to show that $E[S(\theta)] = 0$. The derivation shown in my class first sets $L(\theta; y) = f(y; \theta)$ and then observes that:
\begin{align} 0 &= \frac{\partial 1}{\partial \theta} \\ &= \frac{\partial}{\partial \theta} \int f(y; \theta)\, dy \\ &= \int \frac{\partial f(y; \theta)}{\partial \theta}\, dy \\ &= \int f(y; \theta) \frac{ \frac{ \partial L(\theta; y) }{\partial \theta} }{ L(\theta; y) }\, dy \\ &= \int f(y; \theta) \frac{ \partial \log L(\theta; y) }{ \partial \theta}\, dy \\ &= E[S(\theta)] \end{align}
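(To convince myself the identity is at least plausible, I checked it directly for a simple family of my own choosing, the exponential density with rate $\theta > 0$, which is not from the class notes:)

$$f(y; \theta) = \theta e^{-\theta y}, \quad y > 0, \qquad S(\theta) = \frac{\partial}{\partial \theta} \log f(y; \theta) = \frac{1}{\theta} - y,$$

and since $E[Y] = \frac{1}{\theta}$ here, indeed $E[S(\theta)] = \frac{1}{\theta} - \frac{1}{\theta} = 0$.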
I think my lack of intuition for this derivation comes from a weak background in analysis: I'm not clear on what conditions are required for interchanging differentiation and integration, and without that intuition I don't see why the result is obvious.
So I tried to gain more intuition by proceeding more directly: starting from $E[S(\theta)]$ and unpacking the definition until I arrived at $$E[S(\theta)] = \int \frac{ \partial f(y; \theta) }{\partial \theta }\, dy$$
and from here it is not obvious to me that I may interchange the integration and the differentiation, so I don't know how to finish the direct argument.
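As a numerical sanity check (my own experiment, not part of the class derivation), I also verified the result by Monte Carlo for a $N(\theta, 1)$ density, whose score with respect to $\theta$ is $S(\theta) = y - \theta$:

```python
import numpy as np

# Sanity check: for f(y; theta) = N(theta, 1), the score is
#   S(theta) = d/dtheta log f(y; theta) = y - theta,
# and its expectation under f(.; theta) should be 0.
rng = np.random.default_rng(0)
theta = 2.0
y = rng.normal(loc=theta, scale=1.0, size=1_000_000)
score = y - theta            # score evaluated at the true theta
print(np.mean(score))        # close to 0, up to Monte Carlo error
```

This of course only checks the claim for one family, not why the interchange is justified in general.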
Thanks in advance.