A lemma on lifting vectors to make them orthogonal


For a paper in progress, I am trying to come up with a quick proof of the following claim:

Claim. Let $V$ be a real inner product space of dimension $n$ (if you like you may take $\mathbb{R}^n$ with its usual Euclidean inner product), and let $v_1, \dots, v_k \in V$ be an arbitrary finite list of vectors in $V$. Then there exists another real inner product space $W$ of dimension at most $n+k$ (or thereabouts), a list of orthogonal vectors $w_1, \dots, w_k \in W$, and a surjective partial isometry $T : W \to V$ such that $T w_i = v_i$ for all $i=1,\dots, k$.

Recall that a surjective partial isometry is a linear mapping $T : W \to V$ which maps the orthogonal complement of its kernel isometrically onto $V$. That is, if $y_1, y_2 \perp \ker T$, then $\langle T y_1, T y_2 \rangle_V = \langle y_1, y_2 \rangle_W$.
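For concreteness, here is a small numerical sketch of the definition (my own code, assuming $V=\mathbb R^n$, $W=\mathbb R^{n+k}$): the coordinate projection is a surjective partial isometry, and it preserves inner products between vectors orthogonal to its kernel.

```python
import numpy as np

# A surjective partial isometry in coordinates: T = [I_n | 0] maps
# R^(n+k) onto R^n.  Its kernel is the last k coordinates, so vectors
# orthogonal to ker T have the form (x, 0), and T preserves inner
# products between such vectors.
n, k = 3, 3
T = np.hstack([np.eye(n), np.zeros((n, k))])

rng = np.random.default_rng(0)
y1 = np.concatenate([rng.standard_normal(n), np.zeros(k)])  # y1 orth. to ker T
y2 = np.concatenate([rng.standard_normal(n), np.zeros(k)])  # y2 orth. to ker T

print(np.isclose((T @ y1) @ (T @ y2), y1 @ y2))  # True
```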

Please note that there are no assumptions at all on the given vectors $v_1, \dots, v_k$. In particular, they may or may not be linearly independent. They need not be distinct, and some of them could be zero. Or, some or all of them might already be orthogonal. Ideally the proof would not have to treat these cases separately.

It's not so important that the bound on the dimension of $W$ should be exactly $n+k$; something like $n+k+1$ or $n+k+2$ would also be okay. I'd rather have a short proof than a sharp bound. Actually, for the specific application at hand, I really only need to cover $n=k=3$, and it would be all right if the dimension of $W$ is merely bounded by some constant. But I would rather prove the claim for general $n,k$.

I do have a proof of this statement, but I don't really like it. The idea is to suppose $V = \mathbb{R}^n$ and let $W = \mathbb{R}^{n+k}$, with their usual Euclidean inner products, and $T$ the orthogonal projection onto the first $n$ coordinates. Then we construct the vectors $w_i$ by setting their first $n$ coordinates equal to those of the $v_i$ (so that $T w_i = v_i$), and compute their remaining $k$ coordinates by induction on $i$ so as to make them orthogonal at each step. This requires solving a system of $i$ linear equations at the $i$th step, and a bit of care to ensure that the systems remain consistent. So it proves the statement, but it's messier than I think should be necessary, and I would prefer not to have to work in coordinates. I feel like there should be a more elegant argument.
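In case it helps, here is a sketch of that inductive construction in code (with one convenient choice that is mine, not forced: supporting the tail $u_i$ of $w_i$ on the first $i$ extra coordinates with $i$th entry $1$, which makes each linear system triangular and hence automatically consistent):

```python
import numpy as np

# Inductive lift: w_i = (v_i, u_i) with u_i in R^k chosen so that
# <w_i, w_j> = 0 for all j < i.  Supporting u_i on the first i tail
# coordinates with u_i[i] = 1 turns each step into a triangular solve.
def lift_inductive(vs):
    k = len(vs)
    us = np.zeros((k, k))
    for i in range(k):
        us[i, i] = 1.0
        for j in range(i):
            # enforce <v_i, v_j> + <u_i, u_j> = 0; u_j is supported on
            # coordinates 0..j, so this equation pins down u_i[j]
            us[i, j] = -(vs[i] @ vs[j] + us[i, :j] @ us[j, :j]) / us[j, j]
    return [np.concatenate([v, u]) for v, u in zip(vs, us)]

rng = np.random.default_rng(1)
vs = [rng.standard_normal(3) for _ in range(3)]            # n = k = 3
ws = lift_inductive(vs)
print(all(abs(ws[i] @ ws[j]) < 1e-9
          for i in range(3) for j in range(3) if i != j))  # True
```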

I'd also be happy with a reference to any reasonably simple theorem that implies this one, or an explanation of why it's an obvious consequence of some well known fact.

There are 2 solutions below.


How to do it in $n+k(k-1)/2$

You can definitely do it in $\mathbb R^{n+k(k-1)/2},$ except when $k=2$ and $v_1=v_2=0.$

This is terrible for general $k,$ but for $k=3$ the number of extra dimensions is $k(k-1)/2=3,$ so we get a nice bound in that case.

If all the $v_i$ are zero, and $k>2,$ then (since $k\le k(k-1)/2$) we can pick $u_1,\dots,u_k$ as part of an orthonormal basis of $\mathbb R^{k(k-1)/2}.$

Otherwise, re-order the $v_i$ so that $v_k\neq 0.$

Define for $i\neq j$ $$r_{ij}=\begin{cases} -1&i<j\\\langle v_i,v_j\rangle&i>j\end{cases}.$$

Let $e_{ij}$ with $1\leq i<j\leq k,$ be an orthogonal basis of $\mathbb R^{k(k-1)/2},$ and when $i>j,$ write for shorthand $e_{ij}=e_{ji}.$

Let $u_i=\sum_{j\neq i} r_{ij}e_{ij}.$

Then we get $$\langle u_i,u_j\rangle=r_{ij}r_{ji}=-\langle v_i,v_j\rangle$$ when $i\neq j,$ and $\langle u_i,u_i\rangle\neq 0,$ except possibly when $i=k.$

So we let $w_i=(v_i,u_i)\in \mathbb R^{n+k(k-1)/2}.$ Then $$\langle w_i,w_j\rangle = \langle v_i,v_j \rangle +\langle u_i,u_j\rangle.$$ This is zero when $i\neq j,$ and we are done.

We know $\langle w_i,w_i\rangle = \langle v_i,v_i \rangle +\langle u_i,u_i\rangle>0$ because the $u_i$ are non-zero when $i<k$ and $v_k\neq 0.$
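The construction above can be checked numerically; here is a sketch (my own code) for random $v_i$:

```python
import numpy as np

# Build u_i in R^(k(k-1)/2) with <u_i, u_j> = r_ij * r_ji = -<v_i, v_j>,
# then w_i = (v_i, u_i); the first n coordinates of w_i recover v_i.
def lift_pairs(vs):
    k = len(vs)
    pairs = [(i, j) for i in range(k) for j in range(i + 1, k)]
    index = {p: t for t, p in enumerate(pairs)}        # basis e_{ij}, i < j
    us = np.zeros((k, len(pairs)))
    for i in range(k):
        for j in range(k):
            if j != i:
                r_ij = -1.0 if i < j else vs[i] @ vs[j]
                us[i, index[(min(i, j), max(i, j))]] = r_ij
    return [np.concatenate([v, u]) for v, u in zip(vs, us)]

rng = np.random.default_rng(2)
vs = [rng.standard_normal(4) for _ in range(3)]        # n = 4, k = 3
ws = lift_pairs(vs)
print(all(abs(ws[i] @ ws[j]) < 1e-9
          for i in range(3) for j in range(3) if i != j))  # True
```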


Special case $k=2$ when both vectors are zero

When $k=2$ and $v_1=v_2=0,$ the problem is that $k(k-1)/2=1<2=k$: there is no pair of non-zero orthogonal vectors in $\mathbb R^1.$ Taking $u_1,u_2$ orthonormal in $\mathbb R^2$ instead, we can do it in $\mathbb R^{n+2}.$


What $k=3$ looks like in $n+3$

So if $k=3,$ and $v_3\neq 0,$ you get:

$$\begin{align}w_1&=(v_1,-1,-1,0)\\w_2&=(v_2,v_1\cdot v_2,0,-1)\\w_3&=(v_3,0,v_3\cdot v_1,v_3\cdot v_2).\end{align}$$


An intuition for $n+k$

An idea to get it down to $\mathbb R^{n+k}.$ Pick $R$ large so that you can define, for $i\neq j$:

$$\theta_{ij}=\arccos \frac{-\langle v_i,v_j\rangle}{R^2}$$

When $R$ is very large compared to the $v_i,$ we get that the $\theta_{ij}$ are all close to $\frac{\pi}2.$ So if we can get vectors $u_i$ with $|u_i|=R$ and the angle between $u_i$ and $u_j$ is $\theta_{ij},$ we are done.

If we find such $u_i,$ we get $$u_i\cdot u_j= R^2\cos\theta_{ij}=-v_i\cdot v_j,$$ and we can pick $w_i=(v_i,u_i).$

I couldn’t solve this question myself, so I asked it here. This answer shows that a sufficient condition for finding the $u_i$ is that, for $i\neq j:$

$$\left|\cos\theta_{ij}\right|\leq \frac{1}{n-1}.$$

We can pick $R$ large enough to meet this condition, say:

$$R=\sqrt{(n-1)\max_{i\neq j}\left|\langle v_i,v_j\rangle \right|}$$

(if all the inner products vanish, any $R>0$ works).
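A quick numerical sanity check (mine) that choosing $R^2=(n-1)\max_{i\neq j}|\langle v_i,v_j\rangle|$ makes every $|\cos\theta_{ij}|$ at most $1/(n-1),$ assuming not all the inner products vanish:

```python
import numpy as np

# cos(theta_ij) = -<v_i, v_j> / R^2 with R^2 = (n-1) * max |<v_i, v_j>|,
# so |cos(theta_ij)| <= 1/(n-1) automatically.
rng = np.random.default_rng(3)
n, k = 3, 3
vs = rng.standard_normal((k, n))
M = max(abs(vs[i] @ vs[j]) for i in range(k) for j in range(k) if i != j)
R2 = (n - 1) * M                                # R = sqrt((n-1) * M)
cosines = [abs(vs[i] @ vs[j]) / R2
           for i in range(k) for j in range(k) if i != j]
print(all(c <= 1 / (n - 1) + 1e-12 for c in cosines))  # True
```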

A matrix completion argument

Assume $V=\mathbb R^n$. Let $A=\pmatrix{v_1&\cdots&v_k}\in\mathbb R^{n\times k}$. Let $a>\|A\|_2$ and $X=a^{-1}A$. Then $\|X\|_2<1$, so $I_k-X^TX$ is positive definite and $Y=\left(I_k-X^TX\right)^{1/2}\in\mathbb R^{k\times k}$ exists. Since $X^TX+Y^TY=I_k$, the matrix $\pmatrix{X\\ Y}$ has orthonormal columns, hence is the first $k$ columns of some $Q\in O(n+k,\mathbb R)$. Now $T=\pmatrix{I_n&0_{n\times k}}Q\in\mathbb R^{n\times(n+k)}$ is a surjective partial isometry and $T(ae_j)=v_j$ for $j=1,2,\ldots,k$.
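A numerical sketch of this construction (variable names are mine; the matrix square root is taken via an eigendecomposition, and the orthogonal completion via an SVD):

```python
import numpy as np

# Complete X stacked on Y = (I - X^T X)^(1/2) to an orthogonal Q;
# then T = [I_n | 0] Q is a surjective partial isometry with
# T(a e_j) = v_j for j = 1, ..., k.
rng = np.random.default_rng(4)
n, k = 3, 3
A = rng.standard_normal((n, k))                 # columns v_1, ..., v_k
a = 2.0 * np.linalg.norm(A, 2)                  # any a > ||A||_2 works
X = A / a

S, U = np.linalg.eigh(np.eye(k) - X.T @ X)      # positive definite
Y = U @ np.diag(np.sqrt(S)) @ U.T               # (I_k - X^T X)^(1/2)
M = np.vstack([X, Y])                           # orthonormal columns
full = np.linalg.svd(M, full_matrices=True)[0]  # basis extending col(M)
Q = np.hstack([M, full[:, k:]])                 # orthogonal (n+k)x(n+k)
T = np.hstack([np.eye(n), np.zeros((n, k))]) @ Q

print(np.allclose(T @ T.T, np.eye(n)))          # partial isometry: True
print(np.allclose(a * T[:, :k], A))             # T(a e_j) = v_j: True
```

Since $T$ has orthonormal rows ($TT^T=I_n$), it maps $(\ker T)^\perp$ isometrically onto $\mathbb R^n$, which is exactly the surjective-partial-isometry condition.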

Remarks.

  1. Put simply, every (possibly rectangular) matrix $A$ can be completed to a block matrix $\pmatrix{A&\ast\\ \ast&\ast}$ that is a nonzero scalar multiple of an orthogonal matrix.
  2. The vectors $w_j$ ($=ae_j$) above are not only mutually orthogonal, but also have identical norms ($=a$), and this common norm can be any pre-specified value greater than $\|A\|_2$.