derivative of sign() as activation function in backpropagation


I've been given the task of implementing the backpropagation algorithm for a neural network. My activation function is simply sign(·).

$$w' = w + \text{learning rate} \times \delta \times \frac{df(e)}{de} \times \text{input}$$

So I need to calculate $\frac{df(e)}{de}$, but I don't understand how to compute the derivative of the sign() function.

Do you have some idea?
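For concreteness, the update rule from the question can be sketched like this (the function name and the example values are illustrative, not from the original post):

```python
import numpy as np

def update_weights(w, x, delta, dfde, learning_rate=0.1):
    """One weight update: w' = w + learning_rate * delta * df(e)/de * x."""
    return w + learning_rate * delta * dfde * x

w = np.array([0.5, -0.3])
x = np.array([1.0, 2.0])

# With dfde = 1.0 the weights move in the direction of the input:
w_new = update_weights(w, x, delta=1.0, dfde=1.0)  # -> [0.6, -0.1]
```

Note that whatever value $\frac{df(e)}{de}$ takes multiplies the entire update, so the answer to "what is the derivative of sign()?" directly determines whether the weights move at all.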

1 Answer
Formally, the derivative is $0$ everywhere except the origin, where it is undefined. In the distributional sense, the derivative is $2\delta$, where $\delta$ is the Dirac delta distribution. In practice this means the $\frac{df(e)}{de}$ factor in your update rule is zero almost everywhere, so the update term vanishes and plain backpropagation cannot train a network with sign() activations; that is why differentiable activations are normally used instead.
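You can check the pointwise claim numerically with central finite differences (a small sketch; the sample points are arbitrary):

```python
import numpy as np

# Away from the origin, sign() is locally constant, so a central
# finite-difference estimate of its derivative is exactly 0.
# The jump at 0 only appears as a Dirac delta in the distributional sense.
h = 1e-6
for e in [-2.0, -0.5, 0.5, 2.0]:
    deriv = (np.sign(e + h) - np.sign(e - h)) / (2 * h)
    assert deriv == 0.0
```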

Read http://en.wikipedia.org/wiki/Sign_function for more information.