Regularized Least Squares Solution Matrix - 2 Equivalent Formulations


Suppose we have the following least squares problem $$\min_{x \in \mathbb{R}^N} \left( \lVert y - Ax \rVert_2^2 + \delta \lVert x \rVert_2^2 \right) $$

Setting the gradient with respect to $x$ to zero yields the normal equations: $$A^T y = (A^T A + \delta I)x$$

Since $\delta > 0$, the matrix $A^T A + \delta I$ is positive definite and hence invertible, so: $$(A^T A + \delta I)^{-1} A^T y = x$$

Now, my professor said that this can also be written as: $$A^T (A A^T + \delta I)^{-1} y = x$$

I cannot quite see why this is the case.
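As a quick numerical sanity check (a minimal sketch with hypothetical random data, not part of the original question), both formulations can be compared directly:

```python
import numpy as np

# Hypothetical small example: verify numerically that both formulations
# of the regularized least-squares solution produce the same x.
rng = np.random.default_rng(0)
m, n = 8, 5          # A is m x n, y has length m
delta = 0.1          # regularization parameter, assumed > 0
A = rng.standard_normal((m, n))
y = rng.standard_normal(m)

# Formulation 1: x = (A^T A + delta I)^{-1} A^T y
x1 = np.linalg.solve(A.T @ A + delta * np.eye(n), A.T @ y)

# Formulation 2: x = A^T (A A^T + delta I)^{-1} y
x2 = A.T @ np.linalg.solve(A @ A.T + delta * np.eye(m), y)

print(np.allclose(x1, x2))  # True
```

Note that formulation 2 inverts an $m \times m$ matrix instead of an $n \times n$ one, which matters when $m \ll n$ or $n \ll m$.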

1 Answer
Let $\boldsymbol{A} \in \mathbb{R}^{m \times n}$ and $\boldsymbol{I}_{r} \in \mathbb{R}^{r \times r}$ be the identity matrix of size $r$.

One way to prove it is to start from the following identity:

$$ \boldsymbol{A}^{T} \left( \lambda \boldsymbol{I}_{m} + \boldsymbol{A} \boldsymbol{A}^{T} \right) = \left( \lambda \boldsymbol{I}_{n} + \boldsymbol{A}^{T} \boldsymbol{A} \right) \boldsymbol{A}^{T} $$

To verify it, simply expand each side: both equal $\lambda \boldsymbol{A}^{T} + \boldsymbol{A}^{T} \boldsymbol{A} \boldsymbol{A}^{T}$.

Now, since both $\left( \lambda \boldsymbol{I}_{m} + \boldsymbol{A} \boldsymbol{A}^{T} \right)$ and $\left( \lambda \boldsymbol{I}_{n} + \boldsymbol{A}^{T} \boldsymbol{A} \right)$ are Symmetric Positive Definite (SPD) matrices for $\lambda > 0$, their inverses are well defined.

Multiplying both sides by ${\left( \lambda \boldsymbol{I}_{n} + \boldsymbol{A}^{T} \boldsymbol{A} \right)}^{-1}$ on the left and by ${\left( \lambda \boldsymbol{I}_{m} + \boldsymbol{A} \boldsymbol{A}^{T} \right)}^{-1}$ on the right, we end up with:

$$ {\left( \lambda \boldsymbol{I}_{n} + \boldsymbol{A}^{T} \boldsymbol{A} \right)}^{-1} \boldsymbol{A}^{T} = \boldsymbol{A}^{T} {\left( \lambda \boldsymbol{I}_{m} + \boldsymbol{A} \boldsymbol{A}^{T} \right)}^{-1} $$
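This is exactly the "push-through" identity used in the question (with $\lambda = \delta$). A short numerical check of the identity itself, on hypothetical random data:

```python
import numpy as np

# Sanity check of the push-through identity
#   (lambda I_n + A^T A)^{-1} A^T == A^T (lambda I_m + A A^T)^{-1}
# for a random rectangular A; lambda > 0 makes both matrices SPD.
rng = np.random.default_rng(1)
m, n = 6, 4
lam = 0.5
A = rng.standard_normal((m, n))

lhs = np.linalg.inv(lam * np.eye(n) + A.T @ A) @ A.T
rhs = A.T @ np.linalg.inv(lam * np.eye(m) + A @ A.T)

print(np.allclose(lhs, rhs))  # True
```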