Matrix Derivatives In Back-propagation: Matrix times vector


I am currently performing back-propagation on a neural network by hand, which involves taking derivatives with respect to matrices and vectors. How would I calculate $\frac{d}{dh_a}(W_xX + W_hh_a + h_bh_a)$, where $W_x$ is a $3 \times 3$ matrix, $X$ is a vector of length 3, $W_h$ is a $3 \times 3$ matrix, $h_a$ is a vector of length 3, and $h_b$ is a vector of length 3?

I assume that the derivative would equal $W_h + h_b$ when thinking from a scalar perspective, but this does not seem to be correct. How would I go about calculating this derivative?
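One way to sanity-check a candidate Jacobian is to compare it against finite differences. The sketch below assumes $h_b h_a$ denotes the elementwise (Hadamard) product, in which case the Jacobian of the expression with respect to $h_a$ would be $W_h + \mathrm{diag}(h_b)$; the specific values are arbitrary test data.

```python
import numpy as np

rng = np.random.default_rng(0)
W_x = rng.standard_normal((3, 3))
W_h = rng.standard_normal((3, 3))
X = rng.standard_normal(3)
h_b = rng.standard_normal(3)
h_a = rng.standard_normal(3)

def f(h):
    # Interpreting h_b h_a as the elementwise (Hadamard) product
    return W_x @ X + W_h @ h + h_b * h

# Central-difference Jacobian: column j is df/dh_j
eps = 1e-6
J = np.zeros((3, 3))
for j in range(3):
    e = np.zeros(3)
    e[j] = eps
    J[:, j] = (f(h_a + e) - f(h_a - e)) / (2 * eps)

# Candidate analytic Jacobian under the elementwise-product reading
J_analytic = W_h + np.diag(h_b)
print(np.allclose(J, J_analytic, atol=1e-6))  # True
```

Since $f$ is affine in $h_a$, the central difference recovers the Jacobian exactly up to floating-point rounding, so a match here confirms the candidate.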