Consider a super-population model without auxiliary covariates,
$y_i = \theta + \epsilon_i$, where $\epsilon_i \sim N(0,1)$. The score function is then $S_i(y_i, \theta) = y_i-\theta$.
In the case of informative/nonignorable non-response, the response probability $\pi_i = P(\delta_i = 1 \mid y_i)$, where $\delta_i$ is the response indicator ($\delta_i = 1$ if $y_i$ is observed, $0$ otherwise), must depend on $y_i$. However, I am not assuming an explicit relation between $y_i$ and $\pi_i$, such as the commonly used logistic model $\operatorname{logit}(\pi_i) = \log\{\pi_i/(1-\pi_i)\} = \gamma y_i$. Instead I consider $\operatorname{logit}(\pi_i) = \eta_i(y_i)$, where $\eta_i$ is an unspecified function of $y_i$: the exact relation between $y_i$ and $\pi_i$ is unknown, but they are not independent.
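For concreteness, here is a small simulation of this setup (my own illustration; the particular $\eta$ below is a hypothetical nonlinear choice, since the true $\eta_i$ is unspecified):

```python
import numpy as np

rng = np.random.default_rng(0)

theta = 2.0
N = 100_000

# Super-population model: y_i = theta + eps_i, eps_i ~ N(0, 1)
y = theta + rng.standard_normal(N)

# Hypothetical unspecified eta(y): any nonlinear choice illustrates the point;
# it is centered near theta so the response rate is moderate
def eta(y):
    return (y - theta) + 0.4 * np.sin(y)

# Response probability pi_i depends on y_i, so non-response is informative
pi = 1.0 / (1.0 + np.exp(-eta(y)))

# Response indicators delta_i ~ Bernoulli(pi_i)
delta = rng.binomial(1, pi)

# Respondents over-represent large y_i: the complete-case mean is biased upward
print(y[delta == 1].mean(), y.mean())
```

The gap between the respondent mean and the full-population mean is exactly the bias that the inverse-probability weighting in (1)–(2) is meant to remove.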
Now, to estimate the parameter $\theta$ when the response probabilities $\pi_i$ are known, one can write the sample estimating equation \begin{equation} \frac{1}{N} \sum_{i=1}^{N}\frac{\delta_i}{\pi_i} S_i(y_i, \theta) = 0. \tag{1} \end{equation}
Using the empirically estimated response probabilities $\hat \pi_i$ instead, the sample estimating equation becomes \begin{equation} \frac{1}{N} \sum_{i=1}^{N}\frac{\delta_i}{\hat \pi_i} S_i(y_i, \theta) = 0. \tag{2} \end{equation}
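With $S_i(y_i, \theta) = y_i - \theta$, (1) and (2) can be solved for $\theta$ in closed form, $\hat\theta = \sum_i (\delta_i/\hat\pi_i)\, y_i \big/ \sum_i (\delta_i/\hat\pi_i)$. A quick numerical sketch (again with a hypothetical $\eta$, and using the true $\pi_i$ in place of $\hat\pi_i$) showing that the weighted estimating equation removes the non-response bias:

```python
import numpy as np

rng = np.random.default_rng(1)

theta = 2.0
N = 200_000

y = theta + rng.standard_normal(N)

# Hypothetical response mechanism (unknown to the analyst in practice)
pi = 1.0 / (1.0 + np.exp(-((y - theta) + 0.4 * np.sin(y))))
delta = rng.binomial(1, pi)

# Complete-case mean ignores informative non-response and is biased
theta_naive = y[delta == 1].mean()

# Solving (1): (1/N) sum_i delta_i/pi_i (y_i - theta) = 0
# => theta_hat = sum(w_i y_i) / sum(w_i) with w_i = delta_i / pi_i
w = delta / pi
theta_ipw = np.sum(w * y) / np.sum(w)

print(theta_naive, theta_ipw)
```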
I need to show that this estimating equation is unbiased in order to derive further properties of the estimator $\hat \theta$. For unbiasedness, I need to show that the expectation of the estimating function in (2) is either exactly zero or asymptotically zero.
In (2), both $y_i$ and $\delta_i$ are random variables. Taylor-expanding $1/\hat \pi_i$ around $\pi_i$ and first taking the expectation with respect to $\delta_i$ (using $E(\delta_i \mid y_i)=\pi_i$), the leading term $\frac{1}{N}\sum_{i=1}^{N} S_i(y_i, \theta)$ has expectation zero under the model, the first-order term vanishes when $E(\hat \pi_i)=\pi_i$, and the remaining second-order bias term is \begin{equation} \frac{1}{N} \sum_{i=1}^{N} S_i(y_i, \theta)\, \frac{Var(\hat \pi_i)}{\pi_i}. \tag{3} \end{equation} Now write $G_i=\frac{Var(\hat \pi_i)}{\pi_i}\geq 0$ and $S_i=S_i(y_i, \theta)$, so that (3) becomes
\begin{equation} \frac{1}{N} \sum_{i=1}^{N} S_i G_i. \tag{4} \end{equation} Taking the expectation under the super-population model gives \begin{equation} \frac{1}{N} \sum_{i=1}^{N}E[S_i G_i]. \tag{5} \end{equation}
Here $S_i$ is a known function of $y_i$, but $G_i$ is an unspecified function of $y_i$, because $\pi_i$ is an unspecified function of $y_i$. We can assume the $S_i$ are IID with $E(S_i)=0$, and suppose $G_i$ is symmetric, i.e., an even function of $\epsilon_i = y_i - \theta$. Then I need to show that
$\frac{1}{N} \sum_{i=1}^{N} E(S_i G_i) = 0$.
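As a sanity check on the first question (a sketch under my reading that "symmetric" means $G_i$ is an even function of $\epsilon_i = y_i - \theta$): since $S_i = \epsilon_i$ is odd and the $N(0,1)$ density $\phi$ is symmetric, $\epsilon\, G(\epsilon)\, \phi(\epsilon)$ is an odd function, so $E(S_i G_i)=0$ whenever the expectation exists. A Monte Carlo check with two arbitrary even choices of $G$:

```python
import numpy as np

rng = np.random.default_rng(2)

n = 1_000_000
eps = rng.standard_normal(n)   # eps_i = y_i - theta, so S_i = eps_i

# Two hypothetical even functions of eps (stand-ins for the unknown G_i)
G1 = eps**2                    # even, unbounded
G2 = 1.0 / (1.0 + eps**2)     # even, bounded

# Since eps * G(eps) is odd and the N(0,1) density is symmetric,
# E[S * G] = 0 whenever the expectation exists; the sample means are ~0
print(np.mean(eps * G1), np.mean(eps * G2))
```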
Now suppose instead that the $y_i$ are fixed; then $S_i$ and $G_i$ are not random variables, and I need to show that, as $N \to \infty$,
$\frac{1}{N} \sum_{i=1}^{N} S_i G_i \to 0$.
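For the fixed-$y_i$ case, one illustrative sufficient condition (my own hypothetical example, not a general answer) is a design that is exactly symmetric about $\theta$: the terms then cancel pairwise for any even $G$, which suggests that, more generally, one would need the empirical distribution of the $y_i$ to become symmetric in the limit, plus moment or boundedness conditions on $S_i G_i$:

```python
import numpy as np

theta = 2.0

# Fixed design: points placed exactly symmetrically around theta
half = np.linspace(0.01, 3.0, 500)
eps = np.concatenate([-half, half])   # eps_i = y_i - theta, symmetric about 0
y = theta + eps

S = y - theta                          # S_i = y_i - theta

# A hypothetical even G (the true G_i is unknown); any even choice cancels
G = np.exp(-np.abs(eps)) + eps**2

# Pairwise cancellation: S(-e) G(-e) = -S(e) G(e), so the average is ~0
avg = np.mean(S * G)
print(avg)
```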
Perhaps I need to impose conditions on the moments of $G_i$ for the first question, and conditions on both $S_i$ and $G_i$ for the second.
Any suggestions would be much appreciated, thanks.