Let $n$ be any positive natural number. I was wondering if for any function $g\colon\{0,1\}^n \to \{0,1\}$ it is always possible to determine a second function $f\colon \{0,1\}^n \to \mathbb{R}$ such that, for any distribution $\rho$ on $\{0,1\}$ and any $X_1, \ldots, X_n \overset{\text{i.i.d.}}{\sim} \rho$, we have $$\mathbb{E}\bigl[ f(X_1,\ldots,X_n) \bigr] = \mathbb{E}[ X_1 ] \, \mathbb{E} \bigl[ g(X_1,\ldots,X_n) \bigr]$$
In other words, I was wondering whether it is possible to find an unbiased estimator of the product of the expectations of the two (non-independent!) random variables $X_1$ and $g(X_1,\ldots,X_n)$ (both of which are $\sigma(X_1,\ldots,X_n)$-measurable), using only the $n$ samples $X_1,\ldots, X_n$.
Every fiber of my being tells me that this should not be possible but I have no idea how to even attack such a problem.
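To make the two sides of the desired identity concrete, here is a quick Python sketch (the helper name `expect` is mine) that computes the exact expectation of any function of $n$ i.i.d. Bernoulli($p$) samples by enumerating $\{0,1\}^n$; note that any such expectation is a polynomial in $p$ of degree at most $n$:

```python
from itertools import product

def expect(h, n, p):
    """E[h(X_1,...,X_n)] for X_i i.i.d. Bernoulli(p), by enumerating {0,1}^n."""
    total = 0.0
    for xs in product([0, 1], repeat=n):
        k = sum(xs)  # number of ones in this outcome
        total += h(*xs) * p**k * (1 - p)**(n - k)
    return total

# Illustration: n = 2, g = AND. The target quantity is
# E[X_1] * E[g(X_1, X_2)] = p * p^2 = p^3.
g = lambda x1, x2: x1 * x2
p = 0.3
target = expect(lambda x1, x2: x1, 2, p) * expect(g, 2, p)
print(target)  # approximately p**3
```

Since the target is $p \cdot \mathbb{E}[g]$, it can have degree $n+1$ in $p$, while $\mathbb{E}[f]$ has degree at most $n$, which is the tension the question is probing.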
Counterexample: $n=1, g(x_1)=x_1$.
Then we have to find $f:\{0,1\}\to\mathbb{R}$ such that for every probability measure $\mathbb{P}$ on $(\{0,1\},2^{\{0,1\}})$ we have $$\mathbb{E}_{\mathbb{P}}(f(X_1))=\left(\mathbb{E}_{\mathbb{P}}(X_1)\right)^2.$$ Every probability measure $\mathbb{P}$ on $(\{0,1\},2^{\{0,1\}})$ is fully determined by $\mathbb{P}(\{1\})$, and as $\mathbb{P}$ ranges over all such measures, $\mathbb{P}(\{1\})$ ranges over the whole of $[0,1]$. Hence the previous requirement is equivalent to $$\forall p\in[0,1],\quad f(0)(1-p)+f(1)p=p^2.$$ Plugging $p=0$ into this equality gives $f(0)=0$, and plugging in $p=1$ gives $f(1)=1$. However, for $p=1/2$ we get $$f(0)(1-p)+f(1)p=0\cdot \frac{1}{2}+1\cdot\frac{1}{2} = \frac{1}{2} \neq \frac{1}{4}=\left(\frac{1}{2}\right)^2=p^2,$$ a contradiction, so no such $f$ exists.
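The computation above can be checked mechanically. A minimal Python sketch (the function name `expectation_f` is mine): the expectation of any $f$ on one Bernoulli($p$) sample is affine in $p$, so it cannot equal the quadratic $p^2$ on all of $[0,1]$:

```python
def expectation_f(f0, f1, p):
    """E[f(X_1)] for X_1 ~ Bernoulli(p), where f(0) = f0 and f(1) = f1."""
    return f0 * (1 - p) + f1 * p

# The constraints at p = 0 and p = 1 force f(0) = 0 and f(1) = 1.
f0, f1 = 0.0, 1.0

for p in [0.0, 0.25, 0.5, 0.75, 1.0]:
    lhs = expectation_f(f0, f1, p)   # affine in p
    rhs = p ** 2                     # target, quadratic in p
    print(f"p = {p:.2f}: E[f] = {lhs:.4f}, p^2 = {rhs:.4f}")
```

At $p = 1/2$ the loop exhibits the mismatch $1/2 \neq 1/4$ from the answer.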