Differentiation with respect to a vector


I have a function, $$ K\left(\nabla u \right) = \left(K_0 + K_1 \left\| \nabla u \right\|^2 \right) \nabla u $$ and I want to differentiate $K$ with respect to $ \nabla u $.

I will try to rephrase the question. Let $$\nabla u = e$$ with $$e = \left(e_1, e_2 \right),$$ so that $$ K\left(\nabla u \right) = K\left(e \right). $$ Here $e \in \mathbb{R}^2$ is a vector, and $K$ is a function that takes a vector as input and gives a vector as output, $$K\left(e\right) \in \mathbb{R}^2.$$

I want the derivative/gradient (i.e. the Jacobian) of $K$ with respect to $e$.

Can someone help me with this?


Consider the vector $$ \mathbf{k} = (k_0+k_1 \| \mathbf{x} \|^2) \mathbf{x} $$ with $\mathbf{x} \in \mathbb{R}^N$.

From what I understand, you are asking for the Jacobian. Using differentials and the product rule, and noting that $\mathrm{d}\left(\|\mathbf{x}\|^2\right) = 2\mathbf{x}^T \mathrm{d}\mathbf{x}$, we have $$ \mathrm{d}\mathbf{k} = (k_0+k_1 \| \mathbf{x} \|^2) \mathrm{d}\mathbf{x} + 2 k_1 \mathbf{x}\mathbf{x}^T \mathrm{d}\mathbf{x} $$ from which you deduce that $$ \frac{\partial \mathbf{k}}{\partial \mathbf{x}} = (k_0+k_1 \| \mathbf{x} \|^2) \mathbf{I}_N + 2 k_1 \mathbf{x}\mathbf{x}^T $$ where $\mathbf{I}_N$ is the identity matrix.
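As a sanity check, the closed-form Jacobian above can be compared against a central finite-difference approximation. This is a minimal sketch; the constants `k0` and `k1` are arbitrary illustrative values, not from the original problem:

```python
import numpy as np

# Illustrative constants (any real values work)
k0, k1 = 2.0, 0.5

def k(x):
    """k(x) = (k0 + k1 * ||x||^2) * x"""
    return (k0 + k1 * np.dot(x, x)) * x

def jacobian(x):
    """Analytic Jacobian: (k0 + k1 * ||x||^2) I + 2 k1 x x^T"""
    n = x.size
    return (k0 + k1 * np.dot(x, x)) * np.eye(n) + 2 * k1 * np.outer(x, x)

# Central finite differences, column j = d k / d x_j
x = np.array([1.0, -2.0])
h = 1e-6
J_num = np.column_stack([
    (k(x + h * e) - k(x - h * e)) / (2 * h)
    for e in np.eye(x.size)
])

print(np.allclose(jacobian(x), J_num, atol=1e-6))  # True
```

Dropping the factor of 2 in the $\mathbf{x}\mathbf{x}^T$ term makes this check fail, which is a quick way to catch that kind of slip.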