Let $u_{0}$ be a row vector of real numbers, $U$ a matrix of real numbers, and $x$ some row vector.
Then let $v = xU + u_{0}$.
My question is how you would calculate $$ \frac{\partial v}{\partial u_{0j}} $$
where $u_{0j}$ is the jth element of the vector $u_{0}$.
Clearly $\frac{\partial v}{\partial u_{0}} = 1$, but I'm confused about differentiating with respect to a single element of $u_0$. Do I need the chain rule?
(Apologies for the strange question without context - it's part of a broader question in a machine learning problem, and I don't have much experience with matrix calculus, so I need to make sure I get it right.)
$\vec{v} = M\vec{x} + \vec{u}_0$
$v^i=M^i_jx^j+u_0^i$
$\frac{\partial v^i}{\partial u_0^k}=\frac{\partial u_0^i}{\partial u_0^k}=\delta^i_k$
Here a repeated upper and lower index implies a sum over that index from 1 to the dimension of the vector. A superscript denotes the $i$th component, not exponentiation. The upper and lower indices on $M$ label rows and columns, respectively.
$\delta^i_k$ is the Kronecker delta: one if the indices are equal, zero otherwise.
So in this case, the Jacobian $\frac{\partial v}{\partial u_0}$ is the identity matrix.
This assumes $x$ and $M$ are independent of $u_0$.
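As a quick numerical sanity check (not part of the derivation above), one can approximate the Jacobian of $v = xU + u_0$ with respect to $u_0$ by finite differences and confirm it is the identity. The dimensions here ($n = 3$, $m = 4$) are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 4
x = rng.standard_normal(n)        # row vector x
U = rng.standard_normal((n, m))   # matrix U
u0 = rng.standard_normal(m)       # row vector u0

def v(u0):
    # v = x U + u0
    return x @ U + u0

# Central finite differences: J[i, k] approximates dv_i / du0_k
eps = 1e-6
J = np.zeros((m, m))
for k in range(m):
    e = np.zeros(m)
    e[k] = eps
    J[:, k] = (v(u0 + e) - v(u0 - e)) / (2 * eps)

# The numerical Jacobian matches the Kronecker delta, i.e. the identity
print(np.allclose(J, np.eye(m)))
```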