Can $f_1(x, y)-f_2(u, v)$ be written as $g(x-u, y-v)$?


While doing some calculations on the basic theory of diffractive neural networks, I ran into the following question, which is blocking my way. It is a pure math question: $f_1(x, y)$ and $f_2(u, v)$ are both nonlinear functions (i.e. $f(ax + by) \neq af(x) + bf(y)$, where $a$ and $b$ are constants). Is there any set of such functions satisfying $f_1(x, y) - f_2(u, v) = g(x-u, y-v)$? If not, I would appreciate a proof. Thanks.
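As a quick numerical illustration of what the question is asking (a sketch; the choice $f_1 = f_2 = x^2 + y^2$ is just an arbitrary nonlinear example, not from the question), one can check whether the difference depends only on $(x-u, y-v)$:

```python
# Quick check: for the nonlinear choice f(x, y) = x**2 + y**2,
# is f(x, y) - f(u, v) a function of (x - u, y - v) alone?
def f(x, y):
    return x**2 + y**2

# Two argument tuples with the SAME differences (x - u, y - v) = (1, 1):
d1 = f(1, 1) - f(0, 0)   # x=1, y=1, u=0, v=0 -> 2 - 0 = 2
d2 = f(2, 2) - f(1, 1)   # x=2, y=2, u=1, v=1 -> 8 - 2 = 6
print(d1, d2)  # different values, so no g(x-u, y-v) can exist for this f
```

Since the two differences disagree while $(x-u, y-v)$ is the same, no such $g$ exists for this particular nonlinear $f$; the accepted answer below shows this is no accident.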

On BEST ANSWER

Assuming we are talking about continuous functions defined on $\mathbb{R}^2$, this is not possible. Proof:

Set
$$
x=a,\ y=b,\ u=0,\ v=0 \Rightarrow f_1(a,b)-f_2(0,0)=g(a,b)\\
x=0,\ y=0,\ u=a,\ v=b \Rightarrow f_1(0,0)-f_2(a,b)=g(-a,-b)\\
x=0,\ y=0,\ u=0,\ v=0 \Rightarrow f_1(0,0)-f_2(0,0)=g(0,0)
$$
So we can express $f_1$ and $f_2$ completely in terms of $g$. Inserting these into the original equation, and renaming $(-u,-v)$ as $(u,v)$, yields
$$
g(x,y)+g(u,v)-g(0,0)=g(x+u,y+v)
$$
From here we need to check that $g$ is in fact affine. The easiest way I thought of assumes that $g$ is differentiable, but I am certain this assumption can be relaxed if necessary. Rearranging,
$$
g(x+u,y+v)-g(x,y)=g(u,v)-g(0,0)
$$
and differentiating both sides along the direction $(u,v)$ (i.e. with respect to $t$ at $t=0$ after replacing $(u,v)$ by $(tu,tv)$) gives
$$
Dg(x,y)[(u,v)]=Dg(0,0)[(u,v)]
$$
So the derivative everywhere in any direction equals the derivative at zero, implying $g$ is affine. Inserting back yields that $f_1$ and $f_2$ are linear in an even stricter way than you suggested, namely $f(x,y)=ax+by+c$, contradicting your nonlinearity assumption.
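The construction above can be sanity-checked numerically. This is a sketch with arbitrarily chosen coefficients $a, b, c$ and an arbitrary constant for $f_2(0,0)$: it verifies that an affine $g$ satisfies the derived functional equation, and that the resulting affine $f_1, f_2$ satisfy the original one.

```python
import random

# Affine g(x, y) = a*x + b*y + c with arbitrary coefficients (assumed values).
a, b, c = 2.0, -3.0, 5.0

def g(x, y):
    return a * x + b * y + c

# Reconstruct f1, f2 from g as in the proof:
#   f1(x, y) = g(x, y) + f2(0, 0)   (here f2(0, 0) = 1.0, chosen arbitrarily)
#   f2(u, v) = f1(0, 0) - g(-u, -v)
def f1(x, y):
    return g(x, y) + 1.0

def f2(u, v):
    return f1(0.0, 0.0) - g(-u, -v)

for _ in range(1000):
    x, y, u, v = (random.uniform(-10, 10) for _ in range(4))
    # Derived functional equation: g(x,y) + g(u,v) - g(0,0) = g(x+u, y+v)
    assert abs(g(x, y) + g(u, v) - g(0, 0) - g(x + u, y + v)) < 1e-9
    # Original equation: f1(x,y) - f2(u,v) = g(x-u, y-v)
    assert abs(f1(x, y) - f2(u, v) - g(x - u, y - v)) < 1e-9

print("all checks passed")
```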