Is a submatrix of an orthogonal matrix always singular?


I am trying to implement Grassmannian rank-one update subspace estimation (GROUSE) as per this paper. For the solution of the least squares problem of (1) in the paper, the value of $w$ is

$$w = (U_{\Omega_t}^T U_{\Omega_t})^{-1} U_{\Omega_t}^T v_{\Omega_t}$$

where $U$ is an $n \times d$ ($d < n$) orthogonal matrix and $U_{\Omega_t}$ is a submatrix constructed by selecting the rows of $U$ whose indices are stored in the set $\Omega_t$; e.g. if $\Omega_t = (1, 2, 4, 11, \ldots, 45, \ldots)$, then the respective rows are stacked to form $U_{\Omega_t}$. But every time I try to compute $w$ in RStudio, it reports that $U_{\Omega_t}^T U_{\Omega_t}$ is singular. Note that the indices in $\Omega_t$ are randomly generated, so I have no control over which rows of $U$ are selected. Is a submatrix of an orthogonal matrix always singular?

Here is the R code

# Normal-equations solve; fails whenever t(U_t_omega) %*% U_t_omega is singular
w_t = solve(t(U_t_omega) %*% U_t_omega) %*% t(U_t_omega) %*% v_t_omega
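The failure mode can be reproduced outside R. Here is a small NumPy sketch (Python rather than R; the dimensions, seed, and variable names are made up for illustration) showing that when fewer than $d$ rows are sampled, $U_{\Omega_t}^T U_{\Omega_t}$ is necessarily singular, while with enough generically positioned rows it is usually invertible:

```python
import numpy as np

rng = np.random.default_rng(0)

n, d = 20, 5
# U: n x d matrix with orthonormal columns (QR of a random Gaussian matrix)
U, _ = np.linalg.qr(rng.standard_normal((n, d)))

# With fewer than d sampled rows, U_omega^T U_omega must be singular:
omega_small = rng.choice(n, size=3, replace=False)
G_small = U[omega_small, :].T @ U[omega_small, :]
print(np.linalg.matrix_rank(G_small))   # at most 3 < d, hence singular

# With at least d sampled rows it is typically (though not always) invertible:
omega_big = rng.choice(n, size=8, replace=False)
G_big = U[omega_big, :].T @ U[omega_big, :]
print(np.linalg.matrix_rank(G_big))     # typically full rank d
```

So the first thing to check is whether $|\Omega_t| \geq d$ each time the indices are drawn.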

There are 2 best solutions below


An example: Let $$U = \left( \begin{matrix} 1 & 0 \\ 0 & 1\\ 0 & 0 \\ 0 & 0 \end{matrix} \right)$$

If you select a subset $\Omega\subset \{1,2,3,4\}$, then $\Omega$ must contain both 1 and 2, or else $U_\Omega$ will have rank less than 2 (and thus your product will be singular). For example, with $\Omega=\{2,3,4\}$:

$$U_\Omega = \left( \begin{matrix} 0 & 1\\ 0 & 0 \\ 0 & 0 \end{matrix} \right)$$ And $U_\Omega^T U_\Omega = \left( \begin{matrix} 0 & 0\\ 0 & 1 \end{matrix} \right)$ is singular.
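This example is easy to verify numerically; a NumPy check (Python, for illustration, with 0-based row indices):

```python
import numpy as np

U = np.array([[1., 0.],
              [0., 1.],
              [0., 0.],
              [0., 0.]])

# Omega = {2, 3, 4}: rows 2..4, i.e. indices 1..3 in 0-based NumPy
U_omega = U[[1, 2, 3], :]
G = U_omega.T @ U_omega
print(G)                          # [[0, 0], [0, 1]]
print(np.linalg.matrix_rank(G))   # 1, so G is singular

# Keeping rows 1 and 2 instead restores invertibility
G_ok = U[[0, 1], :].T @ U[[0, 1], :]
print(np.linalg.matrix_rank(G_ok))  # 2
```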


Your problem may be distilled down into the following question:

Let $S$ be a set of $n$ linearly independent vectors of length $m$. Given some $i\in[m]$, remove the $i$th coordinate from each $v\in S$, so that each vector now has length $m-1$.

Is the resulting set still linearly independent?

Unfortunately, this question has no easy general answer, as the following example shows.

Example

Let $S=\{v_1,v_2,v_3\}$ such that

$$ v_1 = \left[\begin{matrix}2 \\ 4 \\ 6 \\ 8 \end{matrix}\right]; \quad v_2 = \left[\begin{matrix}1 \\ 3 \\ 5 \\ 7 \end{matrix}\right]; \quad v_3 = \left[\begin{matrix}1 \\ 1 \\ 2 \\ 3 \end{matrix}\right]. $$

We can check that $S$ is linearly independent. However, consider removing the $i$th coordinate. Let's check $i=1$ and $i=2$ and see the results.

$i = 1$:

$$ \left\{\left[\begin{matrix} 4 \\ 6 \\ 8 \end{matrix}\right], \left[\begin{matrix} 3 \\ 5 \\ 7 \end{matrix}\right], \left[\begin{matrix} 1 \\ 2 \\ 3 \end{matrix}\right]\right\} \text{is not linearly independent.}$$

$i = 2$:

$$ \left\{\left[\begin{matrix} 2 \\ 6 \\ 8 \end{matrix}\right], \left[\begin{matrix} 1 \\ 5 \\ 7 \end{matrix}\right], \left[\begin{matrix} 1 \\ 2 \\ 3 \end{matrix}\right]\right\} \text{is linearly independent.}$$

This shows that whether the reduced set stays linearly independent depends on which position is removed, and that in turn depends on the entries of the vectors (the columns of $U$, in your case).
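Both rank claims above can be checked numerically. A small NumPy verification (Python, for illustration; `rank_without_row` is a hypothetical helper name):

```python
import numpy as np

v1 = np.array([2., 4., 6., 8.])
v2 = np.array([1., 3., 5., 7.])
v3 = np.array([1., 1., 2., 3.])
S = np.column_stack([v1, v2, v3])   # 4 x 3 matrix with the vectors as columns

def rank_without_row(A, i):
    """Rank after deleting the i-th (0-based) coordinate from every vector."""
    return np.linalg.matrix_rank(np.delete(A, i, axis=0))

print(np.linalg.matrix_rank(S))   # 3: S is linearly independent
print(rank_without_row(S, 0))     # 2: dropping coordinate i = 1 breaks independence
print(rank_without_row(S, 1))     # 3: dropping coordinate i = 2 does not
```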

The Paper

To achieve nonsingularity, I have a few suggestions based on what little I could personally take away from the paper.

  • If $U$ is arbitrary, I would suggest always using the standard basis vectors $e_i$ for your columns, as @H. H. Rugh did in their example.

  • If the only restriction on $S$ is that $\dim(S) = d < n$, then perhaps consider choosing which $d$ basis vectors to use so that $\Omega_t$ has a smaller chance of removing important rows (i.e. rows with a 1 in them).
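As a further practical workaround (my suggestion, not something taken from the paper): rather than forming the explicit inverse, a minimum-norm least-squares solve returns a well-defined $w$ even when $U_{\Omega_t}^T U_{\Omega_t}$ is singular. A sketch in NumPy, with made-up sizes and a hypothetical helper name:

```python
import numpy as np

def grouse_weights(U_omega, v_omega):
    """Least-squares weights w minimizing ||U_omega w - v_omega||.

    np.linalg.lstsq returns the minimum-norm solution even when the
    Gram matrix U_omega^T U_omega is singular (e.g. fewer sampled
    rows than columns), where an explicit inverse would fail.
    """
    w, *_ = np.linalg.lstsq(U_omega, v_omega, rcond=None)
    return w

# Hypothetical sizes: U has orthonormal columns, only 3 of 10 rows sampled
rng = np.random.default_rng(1)
U, _ = np.linalg.qr(rng.standard_normal((10, 4)))
omega = [0, 2, 7]
v_omega = rng.standard_normal(3)
w = grouse_weights(U[omega, :], v_omega)
print(w.shape)  # (4,)
```

In R, the analogous change would be replacing `solve()` on the normal equations with a QR-based least-squares routine.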