Uncertainty of estimates from SVD


I have a vector of values $\vec d$ that have each been measured with known uncertainty $\vec \sigma$.

With an equation

$$ \mathbf{A} \vec x = \vec d$$

where each row of $\mathbf{A}$ and each component of $\vec d$ have been divided by the corresponding component of $\vec \sigma$, I can estimate $\vec x$ using the singular value decomposition (SVD) via

$$ \mathbf{A} = \mathbf{U} \boldsymbol{\Sigma} \mathbf{V}^T $$ $$ \vec x = \mathbf{V} \boldsymbol{\Sigma}^{-1} \mathbf{U}^T \vec{d}. $$
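A minimal NumPy sketch of this solve, assuming $\mathbf{A}$ and $\vec d$ have already been scaled row-wise by $1/\sigma_i$ (the matrix and vector here are random placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 20, 3
A = rng.normal(size=(m, n))  # design matrix, already scaled by 1/sigma
d = rng.normal(size=m)       # data vector, already scaled by 1/sigma

# Thin SVD: A = U @ diag(s) @ Vt, with U (m x n), s (n,), Vt (n x n)
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# x = V @ Sigma^{-1} @ U^T @ d
x = Vt.T @ ((U.T @ d) / s)

# This agrees with the ordinary least-squares solution
x_lstsq, *_ = np.linalg.lstsq(A, d, rcond=None)
```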

What is the uncertainty on each component of $\vec x$?

1 Answer
I'm adding some assumptions to the question:

Assume $A$ is an $n \times n$ invertible matrix and $d$ is a random vector with covariance matrix $C$ whose diagonal is the vector $\sigma^2$. Then we seek the variance of $x$, the unique solution to $Ax=d$. But $x=A^{-1}d=A^{-1}(\mu+C^{1/2}\epsilon)$ where $\epsilon$ is the noise vector, assumed uncorrelated and with unit variance (no other assumptions are required). Now $A^{-1} \mu$ is a fixed vector, so the covariance of $x$ is that of the random vector $A^{-1} C^{1/2} \epsilon$, namely $A^{-1} C^{1/2} (A^{-1} C^{1/2})^T = A^{-1} C A^{-T}$.

If we now assume additionally that $C$ is diagonal (i.e. the noise in $d$ is uncorrelated), then we can read off the variance directly: the variance of $x_i$ is $\sum_{j=1}^n (A^{-1})_{ij}^2 \sigma^2_j$, and the standard deviation of $x_i$ is the square root of that. I think you can proceed similarly when $C$ is not diagonal, but the situation gets significantly more complicated.
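Here is a sketch of that variance formula in NumPy, cross-checked against the full covariance propagation $\operatorname{Cov}(x) = A^{-1} C A^{-T}$ (the matrix and $\sigma$ values are arbitrary placeholders):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.normal(size=(n, n))             # assumed invertible
sigma = rng.uniform(0.5, 2.0, size=n)   # per-component noise standard deviations

Ainv = np.linalg.inv(A)

# Diagonal C: var(x_i) = sum_j (A^{-1})_{ij}^2 * sigma_j^2
var_x = (Ainv**2) @ sigma**2
std_x = np.sqrt(var_x)

# Full propagation: Cov(x) = A^{-1} C A^{-T}; its diagonal matches var_x
C = np.diag(sigma**2)
cov_x = Ainv @ C @ Ainv.T
```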

I'm not sure getting the SVD of $A$ involved here does you any good from the mathematical perspective (as opposed to the numerical perspective).

Apparently the $A$ here is actually $m \times n$ with $m>n$, so that $Ax=d$ has no exact solution for most $d$. Then the least-squares solution is given by $x=A'd$, where $A'$ is the Moore-Penrose pseudoinverse; this is exactly what the SVD formula in the question computes, since $A' = V \Sigma^{-1} U^T$ for a full-rank $A$. One can repeat the above analysis exactly as written with $A'$ replacing $A^{-1}$, and the result turns out the same.
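A quick sketch of the overdetermined case, assuming $A$ has full column rank: the pseudoinverse from `np.linalg.pinv` matches the SVD construction $V \Sigma^{-1} U^T$, and the variance formula carries over with $A'$ in place of $A^{-1}$.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 30, 3
A = rng.normal(size=(m, n))             # tall matrix, full column rank (a.s.)
sigma = rng.uniform(0.5, 2.0, size=m)   # per-datum noise standard deviations

# Moore-Penrose pseudoinverse
Aprime = np.linalg.pinv(A)

# Same pseudoinverse built from the thin SVD: A' = V Sigma^{-1} U^T
U, s, Vt = np.linalg.svd(A, full_matrices=False)
Aprime_svd = Vt.T @ np.diag(1.0 / s) @ U.T

# Variance formula with A' replacing A^{-1}: var(x_i) = sum_j (A')_{ij}^2 * sigma_j^2
var_x = (Aprime**2) @ sigma**2
```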