Write equation of a simple neural network


I have a number of weights and biases, and would like to write the network equations explicitly to understand how this works.

$$w^2 = \begin{bmatrix} 0.336067 & -0.322224 \\ -0.700076 & 0.00861628 \\ \end{bmatrix}$$

$$b^2 = \begin{bmatrix} 4.17819 \\ -6.55492 \\ \end{bmatrix}$$

$$w^3 = \begin{bmatrix} -106.923 & -1936.34\\ -36.9631 & 1949.76 \\ \end{bmatrix}$$ $$b^3 = \begin{bmatrix} 113.353 \\ 27.4203 \\ \end{bmatrix}$$ $$w^4 = \begin{bmatrix} 1138.29 \\ 2871.45 \\ \end{bmatrix}$$ $$b^4 = \begin{bmatrix} -1138.24 \\ \end{bmatrix}$$

Here are the equations I came up with:

$$ z^2 = w^2a^1 + b^2 $$ $$ a^2 = σ(z^2) $$ $$z^3 = w^3a^2 + b^3 $$ $$a^3 = σ(z^3)$$ $$z^4 = w^4a^3 + b^4$$ $$a^4 = z^4$$

Here, the hidden layers are sigmoid neurons and the output layer is linear. However, I worry I have written the equations incorrectly: the output $a^4$ is supposed to be a scalar, but according to these expressions it is a $2\times 1$ vector.

Any help is appreciated!

Best answer:

The problem here is $w^4$. The result after adding $b^3$, namely $a^3 = \sigma(z^3)$, is a $2\times 1$ vector, and $w^4$ is also $2\times 1$, so the product $w^4 a^3$ is not defined. If you transpose $w^4$, so that $z^4 = (w^4)^T a^3 + b^4$, the dimensions match and the output becomes a scalar, as expected.
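As a quick shape check, here is a minimal NumPy sketch of the forward pass with the transpose fix applied, using the weights and biases from the question (the input `a1` is an arbitrary assumed value, since none is given):

```python
import numpy as np

# Weights and biases from the question.
w2 = np.array([[0.336067, -0.322224],
               [-0.700076, 0.00861628]])
b2 = np.array([[4.17819], [-6.55492]])
w3 = np.array([[-106.923, -1936.34],
               [-36.9631, 1949.76]])
b3 = np.array([[113.353], [27.4203]])
w4 = np.array([[1138.29], [2871.45]])   # 2x1 column vector
b4 = np.array([[-1138.24]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(a1):
    a2 = sigmoid(w2 @ a1 + b2)          # (2,1)
    a3 = sigmoid(w3 @ a2 + b3)          # (2,1)
    # Transpose w4: (1,2) @ (2,1) -> (1,1), i.e. a single number.
    z4 = w4.T @ a3 + b4
    return z4                           # linear output layer

a1 = np.array([[0.5], [0.5]])  # assumed example input
print(forward(a1).shape)       # (1, 1): a scalar output, as expected
```

Without the `.T`, the line `w4 @ a3` raises a shape error, which is exactly the dimension mismatch described above.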

If you want to get started with this type of thing, you should look into automatic differentiation.
