I am trying to approximate the XOR function with a linear function. This is not meant to be useful in itself; it is a textbook exercise derived from the discussion of learning XOR in the Deep Learning book: https://www.deeplearningbook.org/contents/mlp.html
We know that if $x_1$ and $x_2$ are two booleans, the XOR truth table is as follows:
- 1 XOR 0 = 1
- 0 XOR 1 = 1
- 0 XOR 0 = 0
- 1 XOR 1 = 0
Now, if I use $w_1 x_1 + w_2 x_2 + b$ to approximate $x_1$ XOR $x_2$ under the least-squares loss function, I end up having to compute the minimum of
$$F(w_1,w_2,b)=(w_1+b-1)^2+(w_2+b-1)^2+b^2+(w_1+w_2+b)^2$$
where $w_1$, $w_2$, and $b$ are real numbers. How can I minimize this function $F$? Any clue?
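As a sanity check, the four squared-error terms of $F$ can be assembled from the XOR truth table with a design matrix; this is a minimal sketch assuming NumPy is available:

```python
import numpy as np

# Design matrix: one row per truth-table entry, columns (x1, x2, bias term).
X = np.array([[1, 0, 1],
              [0, 1, 1],
              [0, 0, 1],
              [1, 1, 1]], dtype=float)
y = np.array([1, 1, 0, 0], dtype=float)  # XOR targets

def F(w1, w2, b):
    """Sum of squared errors of the linear model over all four inputs."""
    return float(np.sum((X @ np.array([w1, w2, b]) - y) ** 2))

# Matches the expanded form term by term, e.g. at (w1, w2, b) = (1, 1, 0):
# (1+0-1)^2 + (0+0-1)... no, directly: (1-1)^2 + (1-1)^2 + 0^2 + (1+1+0)^2 = 4
print(F(1.0, 1.0, 0.0))  # 4.0
```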
Guide:
Note that the optimization problem is convex.
Differentiating with respect to $w_1$ and setting the derivative to zero gives
$$2(w_1+b-1)+2(w_1+w_2+b)=0.$$
Do the same for $w_2$ and $b$.
From there, solve the resulting linear system of three equations in three unknowns.
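The steps above can be sketched numerically. Setting all three partial derivatives of $F$ to zero and simplifying (my own derivation, worth re-checking) yields a $3\times 3$ linear system, which NumPy can solve directly:

```python
import numpy as np

# Normal equations A @ [w1, w2, b] = c, obtained by setting each partial
# derivative of F to zero and dividing out common factors:
#   2*w1 +   w2 + 2*b = 1   (from dF/dw1 = 0)
#     w1 + 2*w2 + 2*b = 1   (from dF/dw2 = 0)
#     w1 +   w2 + 2*b = 1   (from dF/db  = 0)
A = np.array([[2, 1, 2],
              [1, 2, 2],
              [1, 1, 2]], dtype=float)
c = np.array([1, 1, 1], dtype=float)

w1, w2, b = np.linalg.solve(A, c)
print(w1, w2, b)  # 0.0 0.0 0.5
```

The minimizer is $w_1 = w_2 = 0$, $b = \tfrac12$, so the best linear fit outputs $\tfrac12$ on every input — the same conclusion the Deep Learning book draws when motivating why a linear model cannot represent XOR.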