Distance between matrix and the span of matrices


How do I find the distance between the matrix $A$

$$A = \left( \begin{matrix}1 & 2\\ 6 & 6\end{matrix} \right)$$

and the span of the following matrices $B$, $C$ and $D$?

$$B = \begin{pmatrix} 6 & 3\\ 3 & 3\end{pmatrix}, \qquad C = \begin{pmatrix} 1 & 1\\ 4 & 3\end{pmatrix}, \qquad D = \begin{pmatrix} 5 & 3\\ 6 & 5\end{pmatrix} $$


If the (Frobenius) dot product of two matrices is

$$ \langle X,Y \rangle = \mbox{tr} \left( X^T Y \right)$$

I computed the dot products of these matrices, but I am not sure what to do next:

$$\langle B,C \rangle = 18+12=30, \qquad \langle C,D \rangle = 29+18=47, \qquad \langle B,D \rangle = 48+24=72$$

The span is the set $$\left\{ \alpha_1 B + \alpha_2 C + \alpha_3 D : \alpha_1, \alpha_2, \alpha_3 \in \mathbb{R} \right\}.$$
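The inner products above are easy to verify numerically. A minimal NumPy sketch (the helper name `ip` is mine):

```python
import numpy as np

# Matrices from the question.
B = np.array([[6., 3.], [3., 3.]])
C = np.array([[1., 1.], [4., 3.]])
D = np.array([[5., 3.], [6., 5.]])

# Frobenius inner product <X, Y> = tr(X^T Y).
def ip(X, Y):
    return np.trace(X.T @ Y)

print(ip(B, C), ip(C, D), ip(B, D))  # 30.0 47.0 72.0
```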


There are 3 answers below.

Answer 1

If you look at the matrices $B,C,D$, you'll find that they are linearly dependent, because you can write

$ D = \dfrac{2}{3} B + C $

Therefore,

$\text{Span}(B, C, D) = \text{Span}(B,C) $

So, let the matrix $E$ be the projection of matrix $A$ onto the span of $B, C$, then

$E = c_1 B + c_2 C $

we want $E - A $ to be orthogonal to $B$, $C$, i.e.

$ \langle c_1 B + c_2 C - A , B \rangle = 0 $

$ \langle c_1 B + c_2 C - A , C \rangle = 0 $

which are two equations in $c_1, c_2$. Evaluating gives

$ 63 c_1 + 30 c_2 = 48 $

$ 30 c_1 + 27 c_2 = 45 $

Solving yields $c_1 = - \dfrac{6}{89}, c_2 = \dfrac{155}{89} $

So that

$E = \dfrac{1}{89} \begin{bmatrix} 119 & 137\\ 602 & 447\end{bmatrix}$

Finally the distance will be

$ d = \sqrt{ \langle E - A, E-A \rangle } = \sqrt{\dfrac{166}{89}} $
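This whole computation can be checked with a short NumPy sketch (the names `G` and `rhs` are mine; it solves the two normal equations above and evaluates the distance):

```python
import numpy as np

A = np.array([[1., 2.], [6., 6.]])
B = np.array([[6., 3.], [3., 3.]])
C = np.array([[1., 1.], [4., 3.]])

def ip(X, Y):
    return np.trace(X.T @ Y)  # Frobenius inner product

# Normal equations <c1 B + c2 C - A, B> = 0 and <c1 B + c2 C - A, C> = 0:
G = np.array([[ip(B, B), ip(B, C)],
              [ip(B, C), ip(C, C)]])   # [[63, 30], [30, 27]]
rhs = np.array([ip(A, B), ip(A, C)])   # [48, 45]
c1, c2 = np.linalg.solve(G, rhs)       # -6/89 and 155/89

E = c1 * B + c2 * C            # projection of A onto span(B, C)
d = np.sqrt(ip(E - A, E - A))  # the distance, sqrt(166/89)
```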

Answer 2

First of all, you need to decide on a distance on the space of matrices, say $d(M_1,M_2) = \|M_1-M_2\|$ for some matrix norm. Your problem is then to find the real numbers $\alpha, \beta, \gamma$ that minimize $$ G(\alpha, \beta, \gamma)=\|A-\alpha B - \beta C -\gamma D\|^2 $$

In the case of the Frobenius norm, you have that $$ G(a,b,c) = (1 - 6 a - b - 5 c)^2 +(2 - 3 a - b - 3 c)^2+(6 - 3 a - 4 b - 6 c)^2+(6 - 3 a - 3 b - 5 c)^2 $$

Now, you just need to compute the global minimum of $G$, which is $\frac{166}{89}$; the minimum distance is therefore $\sqrt{\frac{166}{89}}$.
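Equivalently, since the Frobenius norm of a matrix is the Euclidean norm of its flattened entries, minimizing $G$ is an ordinary least-squares problem. A NumPy sketch (the name `S` for the design matrix is mine):

```python
import numpy as np

A = np.array([[1., 2.], [6., 6.]])
B = np.array([[6., 3.], [3., 3.]])
C = np.array([[1., 1.], [4., 3.]])
D = np.array([[5., 3.], [6., 5.]])

# Each 2x2 matrix becomes a length-4 vector; the span becomes a column space.
S = np.column_stack([B.ravel(), C.ravel(), D.ravel()])

# lstsq handles the rank-deficient case (here D = 2B/3 + C) gracefully.
coeffs, *_ = np.linalg.lstsq(S, A.ravel(), rcond=None)
dist = np.linalg.norm(S @ coeffs - A.ravel())  # sqrt(166/89)
```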

Answer 3

The span of $B,C$ and $D$ is simply the set

$$\left\{xB+yC+zD:(x,y,z)\in\mathbb{R}^3\right\}$$ so the squared distance from $A$ to an element of that subspace is given by

$$D(x,y,z):=\langle E(x,y,z),E(x,y,z)\rangle$$

where $E(x,y,z):=A-Bx-Cy-Dz$ and we use the inner product to induce a norm and a distance on the space of matrices. The distance to the span is, by definition, the square root of the minimum of $D(x,y,z)$.

To find the minimum, we compute the derivative of this expression with respect to $x$, to get

$$\dfrac{\partial D}{\partial x}=2x\mathrm{tr}(B^TB)+y\mathrm{tr}(B^TC+C^TB)+z\mathrm{tr}(B^TD+D^TB)-\mathrm{tr}(A^TB+B^TA).$$

The other derivatives are obtained analogously. Equating all the derivatives to zero yields the expression

$$\underbrace{\begin{bmatrix}2\mathrm{tr}(B^TB) & \mathrm{tr}(B^TC+C^TB) & \mathrm{tr}(B^TD+D^TB)\\ \mathrm{tr}(C^TB+B^TC) & 2\mathrm{tr}(C^TC) & \mathrm{tr}(C^TD+D^TC)\\ \mathrm{tr}(D^TB+B^TD) & \mathrm{tr}(D^TC+C^TD) & 2\mathrm{tr}(D^TD) \end{bmatrix}}_{\mbox{$M$}}\underbrace{\begin{bmatrix}x\\y\\z\end{bmatrix}}_{\mbox{$X$}}=\underbrace{\begin{bmatrix}\mathrm{tr}(B^TA+A^TB)\\\mathrm{tr}(C^TA+A^TC)\\\mathrm{tr}(D^TA+A^TD)\end{bmatrix}}_{\mbox{$b$}}.$$
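As a sanity check, $M$ and $b$ can be assembled directly from these trace formulas (a NumPy sketch; the list name `mats` is mine):

```python
import numpy as np

A = np.array([[1., 2.], [6., 6.]])
B = np.array([[6., 3.], [3., 3.]])
C = np.array([[1., 1.], [4., 3.]])
D = np.array([[5., 3.], [6., 5.]])
mats = [B, C, D]

# M_ij = tr(Mi^T Mj + Mj^T Mi) and b_i = tr(Mi^T A + A^T Mi), as above.
M = np.array([[np.trace(X.T @ Y + Y.T @ X) for Y in mats] for X in mats])
b = np.array([np.trace(X.T @ A + A.T @ X) for X in mats])
```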

The Hessian of $D$ is the constant matrix $M$, which is positive semidefinite (and positive definite exactly when $B$, $C$ and $D$ are linearly independent). Since $D$ is therefore convex, any stationary point is a global minimum.

There are several cases from here:

  • The matrix $M$ is invertible and we have $X^*=M^{-1}b$.
  • The matrix $M$ is singular and $b$ is not in the column space of $M$, so there is no solution.
  • The matrix $M$ is singular and $b$ lies in the column space of $M$, so there is an infinite number of solutions.

When the matrices $B,C$ and $D$ are linearly independent, the columns of $M$ are linearly independent and, as a result, $M$ is nonsingular. This proves the uniqueness of the minimizer in the case where $B,C$ and $D$ are linearly independent.

When the matrices $B,C$ and $D$ are linearly dependent, then there exists a vector $v\in\mathbb{R}^3$ such that

$$v_1B+v_2C+v_3D=0.$$ We then have both $v^Tb=0$ and $v^TM=0$, which proves that $b$ always lies in the column space of $M$, no matter what. The same argument holds when multiple dependencies exist. Hence the no-solution case cannot occur; instead there are infinitely many solutions, which can be expressed as

$$X^*=M^+b+(I-M^+M)z$$

where $M^+$ is the Moore-Penrose pseudoinverse of $M$ and $z\in\mathbb{R}^3$.

In conclusion, if the matrices $B,C$ and $D$ are linearly independent, the solution is unique; if not, there are infinitely many solutions. This conclusion holds for matrices of any dimension and for any number of spanning matrices.

Upon computing $X^*$, the distance can be evaluated as

$$D(x^*,y^*,z^*)^{1/2}.$$


In the present case, the matrices $B,C$ and $D$ are linearly dependent since $D=2B/3+C$. The matrix $M$ and the vector $b$ are given in this case by

$$M=\begin{bmatrix} 126 & 60 & 144\\ 60 & 54 & 94\\ 144 & 94 & 190 \end{bmatrix},\ b=\begin{bmatrix}96\\90\\154\end{bmatrix}.$$

We can verify that $v^TM=0$ and $v^Tb=0$ with $v=(2/3, 1, -1)$. All the solutions are then given by

$$X^*=\begin{bmatrix}-519/979\\ 2051/1958\\ 1359/1958\end{bmatrix}+\begin{bmatrix}2/11 & 3/11 & -3/11\\ 3/11 & 9/22 & -9/22\\ -3/11 & -9/22 & 9/22 \end{bmatrix}z,\quad z\in\mathbb{R}^3.$$

In this case, the minimum distance is given by $\sqrt{166/89}$.
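A NumPy sketch of the pseudoinverse route (variable names are mine; it also checks that every solution in the family gives the same distance):

```python
import numpy as np

A = np.array([[1., 2.], [6., 6.]])
B = np.array([[6., 3.], [3., 3.]])
C = np.array([[1., 1.], [4., 3.]])
D = np.array([[5., 3.], [6., 5.]])
mats = [B, C, D]

M = np.array([[2 * np.trace(X.T @ Y) for Y in mats] for X in mats])
b = np.array([2 * np.trace(X.T @ A) for X in mats])

Mp = np.linalg.pinv(M)   # Moore-Penrose pseudoinverse of M
x_star = Mp @ b          # minimum-norm solution, [-519/979, 2051/1958, 1359/1958]
P = np.eye(3) - Mp @ M   # projector onto the null space of M

def dist(c):
    x, y, z = c
    return np.linalg.norm(A - x * B - y * C - z * D)

d0 = dist(x_star)                                 # sqrt(166/89)
d1 = dist(x_star + P @ np.array([1., -2., 0.5]))  # any z gives the same distance
```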