Why is the Error surface for a 2 input neural network with 2 weights a parabolic bowl


I am new to machine learning and AI in general, and I have a quick question about the error-function surface of a simple two-input neural net.

After reading the following wiki: https://en.wikipedia.org/wiki/Backpropagation

I understand that the error for a single input/output pair of this net can be written as:

$$E = (t-y)^2$$ where $t$ is the target (expected) value and $y$ is the value of the output node of the net.

However, $y$ is a function of the weights $w_1$ and $w_2$:

$$y=x_1w_1 + x_2w_2$$

I believe this means that the error, as a function of the weights, should be:

$$E = (t - x_1w_1 - x_2w_2)^2$$

If I graph this function for specific values of $t$, $x_1$, and $x_2$, I get a surface that is a parabolic cylinder rather than a parabolic bowl. Is there anything I might be looking at wrong? Thanks so much.
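For reference, here is the quick numeric check I did (a minimal sketch with arbitrary values for $t$, $x_1$, $x_2$ that I chose myself): moving the weights along the direction $(-x_2, x_1)$ leaves $y$, and therefore $E$, unchanged, which seems consistent with the cylinder shape I am seeing.

```python
def error(w1, w2, t=1.0, x1=2.0, x2=3.0):
    """Squared error E = (t - x1*w1 - x2*w2)^2 for one training example."""
    return (t - x1 * w1 - x2 * w2) ** 2

t, x1, x2 = 1.0, 2.0, 3.0   # arbitrary example values
w1, w2 = 0.5, -0.2          # arbitrary starting weights

# Step along (-x2, x1): since x1*(-x2) + x2*(x1) = 0, the output
# y = x1*w1 + x2*w2 is unchanged, so E stays flat in this direction.
for s in (0.0, 0.5, 1.0, 10.0):
    e = error(w1 - s * x2, w2 + s * x1, t, x1, x2)
    print(f"s = {s:5.1f}  E = {e:.6f}")
```

Every printed value of $E$ is identical, i.e. the surface has an entire line of minima (and of every other level) instead of a single lowest point.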