Determine a linear combination of rank-$1$ matrices for a rank-$2$ matrix?


I'm interested in the following question:

Given that some matrix $M$ (which we have) can be expressed as $\vec{\lambda}\vec{x}^t + \vec{x}\vec{\mu}^t$, with $\vec{x}$ given, how can we determine values for $\vec{\lambda}$ and $\vec{\mu}$?

I'm interested in the general case, where $\vec{x}$ is not necessarily perpendicular to $\vec{\lambda}$ or $\vec{\mu}$ (when it is perpendicular to both, the problem reduces to finding the singular value decomposition of this rank-$2$ matrix).

Is there a systematic procedure for solving this problem?


There are 2 answers below.

BEST ANSWER

Let me do it for a more general equation $\lambda x^t + y\mu^t = M$, with $x,y$ known.

We also assume that both $x$ and $y$ are non-zero.


Write $v = (1, 0, \dotsc, 0)^t$.

Since $x$ and $y$ are non-zero, we may then find invertible matrices $P, Q$ such that $Px = Qy = v$.

Multiplying the equation on the left by $Q$ and on the right by $P^t$, we get: $$(Q\lambda) v^t + v(P\mu)^t = QMP^t.$$

Writing $\lambda' = Q\lambda,\mu' = P\mu,M' = QMP^t$, we see that it suffices to solve the equation $\lambda' v^t + v \mu'^t = M'$.

But the vector $v$ is so simple that this equation is trivially solved: $\lambda'$ is just the first column of $M'$ (denoted by $c$), and $\mu'^t$ is just the first row of $M'$ (denoted by $r^t$), with the exception that the first components of $\lambda'$ and $\mu'$ are not uniquely determined. Instead, their sum is determined as the $(1, 1)$-th entry of $M'$ (denoted by $m$).

Hence we get a family of solutions $\lambda' = c - av$ and $\mu'^t = r^t - bv^t$, with $a + b = m$.

It only remains to multiply by the inverses of $P$ and $Q$ to get back $\lambda$ and $\mu$.
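The reduction above is mechanical enough to sketch in code. Below is a minimal NumPy sketch (the function names `basis_change` and `solve_general` are my own, not from the answer). It builds $P$ and $Q$ by completing $x$ and $y$ to bases of their respective spaces, with $v$ standing for the first standard basis vector of the appropriate size in each space.

```python
import numpy as np

def basis_change(w):
    """Invertible B with B e1 = w: replace one coordinate axis by w (sketch)."""
    n = w.size
    j = int(np.argmax(np.abs(w)))  # w_j != 0, so dropping e_j keeps B invertible
    cols = [w] + [np.eye(n)[:, i] for i in range(n) if i != j]
    return np.column_stack(cols)

def solve_general(M, x, y, a=0.0):
    """One solution of lam x^T + y mu^T = M, parametrised by a (sketch)."""
    Bx, By = basis_change(x), basis_change(y)
    P, Q = np.linalg.inv(Bx), np.linalg.inv(By)  # P x = e1, Q y = e1
    Mp = Q @ M @ P.T                             # M' = Q M P^T
    c, r, m0 = Mp[:, 0].copy(), Mp[0, :].copy(), Mp[0, 0]
    c[0] -= a           # lam' = c - a e1
    r[0] -= (m0 - a)    # mu'  = r - b e1, with a + b = m0
    lam = By @ c        # undo lam' = Q lam, i.e. lam = Q^{-1} lam'
    mu = Bx @ r         # undo mu'  = P mu,  i.e. mu  = P^{-1} mu'
    return lam, mu
```

Any choice of the free parameter `a` gives a valid solution, reflecting the one-parameter family $a + b = m$ described above.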


Suppose that $M$ has size $m \times n$. We are given that $M$ is rank $2$. In particular, we know that $x$ is an element of the column-space and the row space of $M$ and that $\lambda,\mu$ are not multiples of $x$. I assume that the matrices being discussed are real.

One procedure is as follows: find the first column of $M$ that is not a multiple of $x$, and call this column $v$. The vectors $x$ and $v$ form a basis of the column space, so for the $i$th column $m_i$ of $M$ there exist $a_i,b_i \in \Bbb R$ such that $a_i \vec v + b_i \vec x = \vec m_i$. In other words, after solving $n$ (typically over-determined) systems of equations, we have $$ M = \pmatrix{m_1 & \cdots & m_n} = \pmatrix{v & x} \pmatrix{a_1 & a_2 & \cdots & a_n\\ b_1 & b_2 & \cdots & b_n}. $$ Because $x,v$ are linearly independent, the solution to each of these systems is unique, so the vectors $\vec a = (a_1,\dots,a_n)$ and $\vec b = (b_1,\dots,b_n)$ are uniquely determined. We have $$ M = \pmatrix{v & x} \pmatrix{a & b}^T = v a^T + x b^T. $$

Now, I claim that $a$ must be a multiple of $x$ (proof below). That is, we have $a = kx$ for some non-zero $k \in \Bbb R$ (which can be easily computed). We can now say $$ M = \vec v (k\vec x)^T + \vec x \vec b^T = (k \vec v) \vec x^T + \vec x \vec b^T. $$ So, we can solve the problem with $\lambda = kv$ and $\mu = b$.
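As a sanity check, here is a direct NumPy sketch of this procedure (the name `decompose` and the rank-based test for "multiple of $x$" are my choices, not from the answer):

```python
import numpy as np

def decompose(M, x):
    """Find lam, mu with M = lam x^T + x mu^T, for M of rank 2 (sketch)."""
    n = M.shape[1]
    v = None
    for i in range(n):
        # column i is a multiple of x iff [x | M[:, i]] has rank < 2
        if np.linalg.matrix_rank(np.column_stack([x, M[:, i]])) == 2:
            v = M[:, i]
            break
    # v exists: M has rank 2, so not every column is a multiple of x
    V = np.column_stack([v, x])                  # basis of the column space
    AB, *_ = np.linalg.lstsq(V, M, rcond=None)   # rows of AB are a^T and b^T
    a, b = AB[0], AB[1]                          # M = v a^T + x b^T
    j = int(np.argmax(np.abs(x)))
    k = a[j] / x[j]                              # a = k x, by the proof below
    return k * v, b                              # lam = k v, mu = b
```

Note that $(\lambda,\mu)$ is only determined up to the substitution $\lambda \mapsto \lambda + cx$, $\mu \mapsto \mu - cx$, so a test should compare the reconstructed $M$ rather than the vectors themselves.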


Proof: Suppose that $M = \alpha x^T + x\beta^T$ for some vectors $\alpha,\beta$. Let $P$ be the orthogonal projection onto $x^\perp$. We have $$ PM = P(\alpha x^T + x\beta^T) = (P\alpha) x^T,\\ PM = P(v a^T + x b^T) = (Pv) a^T. $$ Because $P\alpha$ and $Pv$ are non-zero (neither $\alpha$ nor $v$ is a multiple of $x$, since $M$ has rank $2$), these two rank-$1$ expressions can only agree if $P\alpha,Pv$ are scalar multiples of each other and $x,a$ are scalar multiples of each other. Thus, we indeed conclude that $a$ is a multiple of $x$. $\square$


Some computational improvements:

  1. We can select a $v$ (and thus eventually a $\lambda$) that is perpendicular to $x$ as follows. Let $P$ be the orthogonal projection onto $x^\perp$. Take $v$ to be the first non-zero column of $PM$.

  2. We can produce the matrix with rows $a,b$ in a more efficient way. Let $$ Q = [\pmatrix{v&x}^T\pmatrix{v&x}]^{-1}\pmatrix{v&x}^T = \pmatrix{v^Tv & v^Tx\\v^Tx & x^Tx}^{-1}\pmatrix{v^T\\x^T}. $$ Because $Q$ is a left-inverse to $\pmatrix{v&x}$, we can compute $$ \pmatrix{\vec a^T\\ \vec b^T} = QM. $$ If we select $v$ as in suggestion 1 above, then the square matrix whose inverse we compute is a diagonal matrix.
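Combining both improvements gives a slightly leaner NumPy sketch (the tolerance `1e-10` for detecting a non-zero column is an arbitrary choice of mine):

```python
import numpy as np

def decompose_fast(M, x):
    """As above, but choosing v perpendicular to x and avoiding lstsq (sketch)."""
    n = len(x)
    P = np.eye(n) - np.outer(x, x) / (x @ x)   # orthogonal projection onto x-perp
    PM = P @ M
    # first non-zero column of PM (up to a float tolerance); it is perp. to x
    i = next(j for j in range(M.shape[1]) if np.linalg.norm(PM[:, j]) > 1e-10)
    v = PM[:, i]
    V = np.column_stack([v, x])
    G = V.T @ V                     # diagonal, since v is perpendicular to x
    Q = np.linalg.inv(G) @ V.T      # left-inverse of V, as in suggestion 2
    a, b = Q @ M                    # rows a^T, b^T with M = v a^T + x b^T
    j = int(np.argmax(np.abs(x)))
    return (a[j] / x[j]) * v, b     # lam = k v (now with v perp. to x), mu = b
```

Choosing $v \perp x$ means the $2\times 2$ Gram matrix $G$ is diagonal, so its inversion is trivial, and the resulting $\lambda$ is perpendicular to $x$ as promised in suggestion 1.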