Gradient of a norm with a linear operator


In mathematical image processing many algorithms are stated as an optimization problem: given an observation $f$, we want to recover an image $u$ that minimizes an objective function. To obtain smooth results, a regularization term is applied to the image gradient $\nabla u$, which can be implemented with finite differences in the $x$ and $y$ directions.

If I have a regularizer of the form $||\nabla u||_2$, with $||x||_2 = \sqrt{\sum_i x_i^2}$, how can I compute the gradient with respect to $u$? I don't know how to handle the operator $\nabla$.
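For concreteness, the finite-difference gradient $\nabla u$ and the regularizer $||\nabla u||_2$ can be sketched in NumPy as follows. The forward-difference scheme and the boundary handling (zero difference at the last row/column) are assumptions; other conventions are common:

```python
import numpy as np

def grad(u):
    """Forward-difference gradient of a 2-D image u.

    Returns (ux, uy); the difference is set to zero at the last
    row/column (one common boundary convention).
    """
    ux = np.zeros_like(u)                 # differences along the row (x) direction
    ux[:-1, :] = u[1:, :] - u[:-1, :]
    uy = np.zeros_like(u)                 # differences along the column (y) direction
    uy[:, :-1] = u[:, 1:] - u[:, :-1]
    return ux, uy

def reg(u):
    """The regularizer ||grad u||_2 = sqrt(sum_ij ux^2 + uy^2)."""
    ux, uy = grad(u)
    return np.sqrt((ux**2 + uy**2).sum())
```

For example, on `u = np.arange(9.0).reshape(3, 3)` the row differences are 3 and the column differences are 1 (away from the boundary).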

Best answer

I now have the answer, as it is usually done in image processing. As mentioned, the gradient $\nabla u$ is implemented with finite differences, so we can use the same finite-difference scheme for the derivative.

$|| \nabla u ||_2 = \left|\left| \sqrt{ (u^x)^2 + (u^y)^2 } \right|\right|_2 = \left( \sum_{i,j} (u_{i+1,j} - u_{i,j})^2 + (u_{i,j+1} - u_{i,j})^2 \right)^{1/2}$, with the forward differences $u^x_{i,j} = u_{i+1,j} - u_{i,j}$ and $u^y_{i,j} = u_{i,j+1} - u_{i,j}$, and the square root inside the norm taken pointwise,

From this form it is easy to take the derivative with respect to $u_{i,j}$.
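Carrying that derivative out for an interior pixel (a step I am filling in here): only four squared terms of the sum contain $u_{i,j}$, so with $E := ||\nabla u||_2$,

$\dfrac{\partial E}{\partial u_{i,j}} = \dfrac{(u_{i,j} - u_{i+1,j}) + (u_{i,j} - u_{i,j+1}) + (u_{i,j} - u_{i-1,j}) + (u_{i,j} - u_{i,j-1})}{E} = -\dfrac{(\Delta u)_{i,j}}{E},$

i.e. the gradient of the regularizer is the negative discrete Laplacian of $u$ scaled by $1/||\nabla u||_2$; boundary pixels simply lose the terms that fall outside the image.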

Note also that $|| \nabla u ||_2$ implicitly means $|| \nabla u ||_{2,2}$.
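As a sanity check, one can compare the analytic gradient (the adjoint of the forward differences divided by $||\nabla u||_2$, under the same boundary assumption of a zero difference at the last row/column) against a central-difference numerical gradient. This is a sketch of my own, not part of the original answer:

```python
import numpy as np

def reg(u):
    # ||grad u||_2 with forward differences, zero difference at the last row/column
    ux = np.zeros_like(u); ux[:-1, :] = u[1:, :] - u[:-1, :]
    uy = np.zeros_like(u); uy[:, :-1] = u[:, 1:] - u[:, :-1]
    return np.sqrt((ux**2 + uy**2).sum())

def reg_grad(u):
    # analytic gradient: adjoint of the forward differences, divided by ||grad u||_2
    ux = np.zeros_like(u); ux[:-1, :] = u[1:, :] - u[:-1, :]
    uy = np.zeros_like(u); uy[:, :-1] = u[:, 1:] - u[:, :-1]
    g = -ux - uy                      # the -p_{i,j} part of the adjoint
    g[1:, :] += ux[:-1, :]            # the +p_{i-1,j} part (x direction)
    g[:, 1:] += uy[:, :-1]            # the +p_{i,j-1} part (y direction)
    return g / np.sqrt((ux**2 + uy**2).sum())

rng = np.random.default_rng(0)
u = rng.standard_normal((5, 5))

# central-difference numerical gradient, one entry at a time
eps = 1e-6
num = np.zeros_like(u)
for idx in np.ndindex(u.shape):
    up, um = u.copy(), u.copy()
    up[idx] += eps
    um[idx] -= eps
    num[idx] = (reg(up) - reg(um)) / (2 * eps)

print(np.max(np.abs(reg_grad(u) - num)))
```

The printed maximum deviation should be tiny (limited only by the finite-difference step `eps`), confirming that the adjoint-based formula matches the derivative taken entry by entry.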