How is this Jacobian matrix calculated?


I'm going through the matrix calculus tutorial The Matrix Calculus You Need For Deep Learning, and I am puzzled by this equation:

Assuming that:

I don't get why the partial derivative of $x_i$ with respect to $x_j$ equals $0$ when $j \neq i$. If we take the first row of the Jacobian as an example, shouldn't the second element, the partial derivative with respect to $x_2$, be $3(x_1)^2$?
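A quick numerical sanity check may help here. This is only a sketch under an assumption: judging by the $3(x_1)^2$ in my question, I take the element-wise function to be $f_i(\mathbf{x}) = x_i^3$, so its first component $f_1$ depends only on $x_1$:

```python
def f1(x1, x2):
    # First component of the assumed element-wise function f_i(x) = x_i^3.
    # Note it uses only x1; x2 never appears.
    return x1 ** 3

def partial(f, args, i, h=1e-6):
    """Central finite-difference approximation of the partial
    derivative of f with respect to its i-th argument."""
    lo = list(args)
    hi = list(args)
    lo[i] -= h
    hi[i] += h
    return (f(*hi) - f(*lo)) / (2 * h)

x = (1.0, 2.0)
print(partial(f1, x, 0))  # d(x1^3)/dx1 ~= 3*x1^2 = 3 at x1 = 1
print(partial(f1, x, 1))  # d(x1^3)/dx2 = 0, since f1 ignores x2
```

Under this assumption, $3x_1^2$ is the diagonal entry $\partial f_1/\partial x_1$, while the off-diagonal entry $\partial f_1/\partial x_2$ is $0$ because $x_2$ does not appear in $f_1$ at all.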