I'm not sure whether this counts as a ReLU function, but the function is
v0 = max(h0, h1, 0)
I need to find the derivative of v0 with respect to h0 and h1 (which are hidden-layer activations in a convolutional neural network). This explains the derivative of ReLU, but I'm not sure what to do when there are two or more variables. Please help! Thank you.
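To make the question concrete, here is a small sketch of the function and the piecewise partial derivatives I *think* are right, checked against finite differences. The function names and the piecewise logic are my own guess, not from any library:

```python
def v0(h0, h1):
    # Forward pass: max over h0, h1, and 0.
    return max(h0, h1, 0.0)

def grad_v0(h0, h1):
    # My guess at the (sub)gradient: the gradient is 1 for whichever
    # argument achieved the max, and 0 for the others; if 0 wins,
    # both partials are 0 (as with plain ReLU in the negative region).
    if h0 > h1 and h0 > 0:
        return (1.0, 0.0)   # dv0/dh0 = 1, dv0/dh1 = 0
    elif h1 > h0 and h1 > 0:
        return (0.0, 1.0)   # dv0/dh0 = 0, dv0/dh1 = 1
    return (0.0, 0.0)       # 0 is the max (ties: pick a subgradient)

def numeric_grad(h0, h1, eps=1e-6):
    # Central finite differences as a sanity check, away from the kinks.
    d0 = (v0(h0 + eps, h1) - v0(h0 - eps, h1)) / (2 * eps)
    d1 = (v0(h0, h1 + eps) - v0(h0, h1 - eps)) / (2 * eps)
    return (d0, d1)

print(grad_v0(2.0, 1.0))      # h0 achieves the max
print(numeric_grad(2.0, 1.0)) # should roughly agree
```

Is this "only the winning argument gets gradient" reasoning the right way to handle max over multiple variables, and what is the convention at the ties/kinks where the function is not differentiable?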