ReLU on a feature map


I have produced a 5x3 feature map from an input layer that was 5x7. The map contains mostly 2s and 3s, plus one 1. The aim is now to apply the ReLU activation to it. My supervisor said that ReLU of 2 is equal to 1.

I use the definition ReLU(b) = max{0, b}. I don't see how he gets 1.

What are ReLU(3) and ReLU(1)? Aren't they just 3 and 1? I don't get it.
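As a minimal sketch, here is ReLU(b) = max{0, b} applied elementwise to a feature map with the values from the question (the exact arrangement of 2s, 3s, and the single 1 is illustrative, since the original layout isn't given):

```python
def relu(b):
    """ReLU(b) = max{0, b}."""
    return max(0, b)

# Hypothetical 5x3 feature map containing 2s, 3s, and one 1.
feature_map = [
    [2, 3, 2],
    [3, 2, 3],
    [2, 2, 1],
    [3, 3, 2],
    [2, 3, 3],
]

# Apply ReLU elementwise; all entries are positive, so nothing changes.
activated = [[relu(b) for b in row] for row in feature_map]
```

Since every entry is positive, ReLU passes them through unchanged: ReLU(2) = 2, ReLU(3) = 3, and ReLU(1) = 1.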


There is 1 answer below.

BEST ANSWER

You're right: ReLU(2) = 2. Your supervisor might be thinking of a piecewise linear function that "maxes out at 1", such as min{1, max{0, b}}, so that its shape is similar to a sigmoid function.
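A sketch of such a clipped variant, assuming the supervisor meant a function that is linear on [0, 1] and saturates at 1 (this is a guess at his intent, not the standard ReLU):

```python
def clipped_relu(b):
    """Piecewise linear function that is 0 for b <= 0, b on [0, 1],
    and saturates at 1 for b >= 1 (a hard, sigmoid-like shape)."""
    return min(1, max(0, b))

# Under this definition the supervisor's claim holds:
# clipped_relu(2) == 1, clipped_relu(3) == 1, clipped_relu(1) == 1,
# while plain ReLU would leave these values unchanged.
```

In the middle segment the function is the identity, e.g. clipped_relu(0.5) = 0.5, which is why its overall shape resembles a sigmoid.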