How to find a bi-variate function from two univariate functions?


I was analyzing data from an experiment where the dependent variable $z$ is known to vary with two independent variables $x$ and $y$. The experiment was conducted in two ways.

  1. Both $x$ and $y$ were varied and the values of $z$ observed. Averaging $z$ over the different values of $y$, the relationship between $x$ and the average value of $z$ was found to be $z = f(x)$.
  2. Both $x$ and $y$ were varied and the values of $z$ observed. Averaging $z$ over the different values of $x$, the relationship between $y$ and the average value of $z$ was found to be $z = g(y)$.

At the end of the experiment, the individual relationships $z = f(x)$ and $z = g(y)$ are known, but the combined relationship $z = h(x,y)$ is not.

Question: Is there any standard way of finding the relationship $h$ such that $z = h(x,y)$?

In my specific example, $z$ can be represented in terms of $x$ using the cumulative distribution function of a normal distribution:

$$ z = \frac{1}{2} + \frac{1}{2}erf \Bigg(\frac{x-a}{b\sqrt 2}\Bigg) $$

where $erf$ is the error function. In terms of $y$, $z$ can be represented as

$$ z = \frac{1}{2}erfc \Bigg(-\frac{\log y-c}{d\sqrt 2}\Bigg) $$ where $erfc$ is the complementary error function. Here $a,b,c$ and $d$ are positive constants. I was able to find $h$ using numerical approximations. I would like to know if there is an easier theoretical approach.
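For concreteness, the two marginal relationships can be evaluated directly with the standard library's `erf` and `erfc`. The constants below are hypothetical placeholders, not values from the experiment; note that $f$ is the CDF of a normal distribution with mean $a$ and standard deviation $b$, while $g$ is the CDF of a lognormal distribution with parameters $c$ and $d$:

```python
import math

# Hypothetical constants; the true a, b, c, d would come from the fit.
a, b, c, d = 2.0, 0.5, 0.3, 0.8

def f(x):
    """Marginal in x: z = 1/2 + 1/2 * erf((x - a)/(b*sqrt(2))),
    i.e. the Normal(a, b^2) CDF."""
    return 0.5 + 0.5 * math.erf((x - a) / (b * math.sqrt(2)))

def g(y):
    """Marginal in y: z = 1/2 * erfc(-(log y - c)/(d*sqrt(2))),
    i.e. the Lognormal(c, d^2) CDF."""
    return 0.5 * math.erfc(-(math.log(y) - c) / (d * math.sqrt(2)))

# Both marginals are CDFs, so each crosses 1/2 at its median.
print(f(a))             # → 0.5 (erf(0) = 0)
print(g(math.exp(c)))   # → 0.5 (erfc(0) = 1)
```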

You cannot recover $h$ in general, but a good guess is $$ \tilde{h}(x,y):=f(x)+g(y)-\mathbb{E}[f] $$

In particular, if your observations are consistent in the sense that $\mathbb{E}[f]=\mathbb{E}[g]$ then this guess matches your observations in the sense that $$ \mathbb{E}[\tilde{h}(x,y) | x ] = f(x)\\ \mathbb{E}[\tilde{h}(x,y) | y ] = g(y) $$

You can easily verify that the guess agrees with the true $h$ if and only if the true $h$ can be written as the sum of a function of $x$ and a function of $y$.
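A small numerical check of this additive reconstruction, using toy marginals on a uniform grid (a stand-in for the experiment's design, not the original data): when $\mathbb{E}[f]=\mathbb{E}[g]$, averaging $\tilde{h}(x,y)$ over $y$ recovers $f(x)$.

```python
# Toy marginals; identical so that E[f] = E[g] holds by construction.
def f(x):
    return x * x

def g(y):
    return y * y

# Uniform grids standing in for the experimental design points.
xs = [i / 10 for i in range(1, 11)]
ys = [j / 10 for j in range(1, 11)]

Ef = sum(f(x) for x in xs) / len(xs)
Eg = sum(g(y) for y in ys) / len(ys)

def h_tilde(x, y):
    """The additive guess: f(x) + g(y) - E[f]."""
    return f(x) + g(y) - Ef

# E[h_tilde(x, y) | x], averaged over the y grid, equals
# f(x) + E[g] - E[f], which is f(x) when E[f] = E[g].
for x in xs:
    avg = sum(h_tilde(x, y) for y in ys) / len(ys)
    assert abs(avg - f(x)) < 1e-12
```

The same averaging over $x$ recovers $g(y)$ by symmetry; if the true $h$ had a multiplicative interaction between $x$ and $y$, the row-wise averages would still match, but $\tilde{h}$ would differ from $h$ pointwise.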