Suppose I have two independent random variables $X_1$ and $X_2$ at device D1 and device D2, respectively. A server wants to compute a function $f(X_1,X_2)$, so it needs information from the devices. Let D1 send a random variable $G_1$, which depends only on $X_1$, and let D2 send a random variable $G_2$, which depends only on $X_2$.
My question: once the server can recover the value of $f(X_1,X_2)$ (inferred from $G_1$ and $G_2$), how much, at minimum, has it already learned about $X_1$ and $X_2$ by observing $G_1$ and $G_2$? The `how much' can be measured in any reasonable way, e.g., entropy or mutual information. We assume the prior distribution of $(X_1, X_2)$ is known to both devices and the server.
For example, for the constant function $f(X_1,X_2)=0$, the server does not need to learn anything about $X_1$ and $X_2$. By contrast, if $f(X_1,X_2)=X_1+X_2$, the server ends up learning the values of both $X_1$ and $X_2$.
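To make the two examples concrete, here is a small brute-force sketch for the special case where $X_1, X_2$ are independent fair bits and the encoders are deterministic single-shot maps $g_i:\{0,1\}\to\{0,1\}$ (a restrictive assumption of mine, not part of the question). It enumerates all encoder pairs that let the server infer $f$ from $(G_1,G_2)$ alone and reports the minimum leakage $I(G_1,G_2;X_1,X_2)$, which for deterministic encoders equals $H(G_1,G_2)$:

```python
import itertools
import math

# X1, X2: independent fair bits. Encoders g1, g2: {0,1} -> {0,1}, deterministic.
# An encoder pair is "valid" if f(x1, x2) is a function of (g1(x1), g2(x2)),
# i.e. the server can compute f from the messages alone.
# Leakage = I(G1, G2; X1, X2), which equals H(G1, G2) here since the
# encoders are deterministic functions of (X1, X2).

XS = [0, 1]

def entropy(dist):
    """Shannon entropy (in bits) of a dict value -> probability."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def min_leakage(f):
    """Minimum H(G1, G2) over all valid deterministic encoder pairs."""
    best = None
    # each encoder is a tuple (g(0), g(1))
    for g1 in itertools.product(XS, repeat=2):
        for g2 in itertools.product(XS, repeat=2):
            # validity: identical message pairs must carry identical f-values
            seen = {}
            ok = True
            for x1 in XS:
                for x2 in XS:
                    key = (g1[x1], g2[x2])
                    if key in seen and seen[key] != f(x1, x2):
                        ok = False
                    seen[key] = f(x1, x2)
            if not ok:
                continue
            # distribution of (G1, G2) under uniform inputs
            dist = {}
            for x1 in XS:
                for x2 in XS:
                    key = (g1[x1], g2[x2])
                    dist[key] = dist.get(key, 0) + 0.25
            h = entropy(dist)
            if best is None or h < best:
                best = h
    return best

print(min_leakage(lambda x1, x2: 0))        # constant f: 0.0 bits leaked
print(min_leakage(lambda x1, x2: x1 + x2))  # sum: 2.0 bits leaked
```

In this toy setting the sum forces both encoders to be injective (merging two values of $X_1$ would make $f$ ambiguous), so the full $2$ bits of $(X_1,X_2)$ leak, matching the claim above; the constant function leaks $0$ bits. Randomized or multi-round encodings could in principle do better, which is part of what the question is asking.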
I suspect this problem has been studied, but I do not know the keywords to search for. Please leave references if you have any.