Least Squares Problem with Generalized Tikhonov Regularization


Let $c \in \mathbb{R}^n$ and $\mu \in \mathbb{R}$, $\mu > 0$. Find a matrix $A$ and a vector $b$ such that this problem can be solved as a least squares approximation $\min_x \left\| A x - b \right\|_{2}^{2}$:

$$\min_{x \in \mathbb{R}^n} \left\{ \|x - c\|_{2}^{2} + \mu \sum_{i=1}^{n - 1} (x_{i+1} - x_{i})^2 \right\}$$

Best answer:

The problem can be generalized as follows (in practice, this is Tikhonov regularization of least squares):

$$ \arg \min_{x} f \left( x \right) = \arg \min_{x} \left\| A x - c \right\|_{2}^{2} + \mu \left\| D x \right\|_{2}^{2} $$

where in your case $ A = I $ and $ D \in \mathbb{R}^{\left( n - 1 \right) \times n} $ is the forward-difference matrix, i.e. $ \left( D x \right)_{i} = x_{i+1} - x_{i} $.
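For concreteness, here is a NumPy sketch (the variable names are mine) that builds $ D $ explicitly and recovers the pair the question asks for: stacking the fit and penalty terms gives $ \tilde{A} = \begin{bmatrix} I \\ \sqrt{\mu} D \end{bmatrix} $ and $ \tilde{b} = \begin{bmatrix} c \\ 0 \end{bmatrix} $ (written with tildes to avoid clashing with the $ A $ above), so that $ \| \tilde{A} x - \tilde{b} \|_2^2 = \| x - c \|_2^2 + \mu \| D x \|_2^2 $:

```python
import numpy as np

n = 5
mu = 0.5
rng = np.random.default_rng(0)
c = rng.normal(size=n)

# Forward-difference matrix: (D x)_i = x_{i+1} - x_i, so D is (n-1) x n.
D = np.diff(np.eye(n), axis=0)

# Stacked least squares pair: ||A_tilde x - b_tilde||^2
#   = ||x - c||^2 + mu * ||D x||^2.
A_tilde = np.vstack([np.eye(n), np.sqrt(mu) * D])
b_tilde = np.concatenate([c, np.zeros(n - 1)])

# Ordinary least squares on the stacked system solves the regularized problem.
x_hat, *_ = np.linalg.lstsq(A_tilde, b_tilde, rcond=None)
```

Solving this stacked system with any standard least squares routine gives the same $ \hat{x} $ as the closed form derived below.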

The objective above is convex and smooth, hence the solution $ \hat{x} $ is given by the stationary point:

$$ \nabla f \left( \hat{x} \right) = 2 {A}^{T} \left( A \hat{x} - c \right) + 2 \mu {D}^{T} D \hat{x} = 0 \Rightarrow \hat{x} = \left( {A}^{T} A + \mu {D}^{T} D \right)^{-1} {A}^{T} c $$

Setting your data ($ A = I $) yields:

$$ \hat{x} = \left( I + \mu {D}^{T} D \right)^{-1} c $$
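A quick numerical sanity check of this closed form (a sketch; the helper name is mine): the gradient at $ \hat{x} $ should vanish, and as $ \mu $ grows the penalty forces consecutive differences toward zero, pulling $ \hat{x} $ toward the constant vector $ \operatorname{mean}(c) \cdot \mathbf{1} $:

```python
import numpy as np

def solve_smoothed(c, mu):
    """Closed-form solution x_hat = (I + mu * D^T D)^{-1} c (helper name is mine)."""
    n = len(c)
    D = np.diff(np.eye(n), axis=0)  # forward differences: (D x)_i = x_{i+1} - x_i
    return np.linalg.solve(np.eye(n) + mu * D.T @ D, c)

c = np.array([0.0, 4.0, 1.0, 3.0])
x_small = solve_smoothed(c, mu=0.1)    # mild smoothing: stays close to c
x_large = solve_smoothed(c, mu=100.0)  # heavy smoothing: nearly constant
```

For large $ \mu $ the solution approaches $ \operatorname{mean}(c) $ in every coordinate, since constant vectors span the null space of $ D $ and the data-fit term then picks the constant closest to $ c $.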