Semi-orthogonal tall matrices with zero column-means


For a semi-orthogonal $m\times n$ matrix $\mathbf{A}$ with $m\gt n$, such that $\mathbf A^T\mathbf A=\mathbf I_n$, can we prove or disprove the following proposition: for given $m, n$, there exists an $\mathbf A$ such that $$\sum_{i=1}^m a_{ij} = 0\qquad\forall j \in [1, n]$$ Further, from an optimization perspective, and regardless of whether the above proposition holds:

Starting from an arbitrary semi-orthogonal matrix $\mathbf A_{m\times n}$ which does not meet the zero-column-mean criterion, how closely can we approach it? The other way around would be to start with a matrix that has zero column means and minimize $\Vert\mathbf A^T\mathbf A-\mathbf I\Vert_F$ under that constraint.
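For what it's worth, a small numpy sketch suggests the two criteria can (generically) be met simultaneously: subtract the column means, then re-orthonormalize with a QR decomposition. Since every centered column is orthogonal to $\mathbf 1$, the whole column space stays orthogonal to $\mathbf 1$, so the re-orthonormalized columns keep zero sums (assuming the centered matrix retains full rank $n$):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 7, 3

# Start from an arbitrary semi-orthogonal matrix (Q factor of a random matrix).
A, _ = np.linalg.qr(rng.standard_normal((m, n)))

# Subtract column means, then re-orthonormalize via QR.
C = A - A.mean(axis=0, keepdims=True)
Q, _ = np.linalg.qr(C)

# Q's columns are linear combinations of C's columns, all orthogonal to 1,
# so Q is semi-orthogonal AND has zero column sums.
print(np.linalg.norm(Q.T @ Q - np.eye(n)))  # ~0: semi-orthogonality
print(np.abs(Q.sum(axis=0)).max())          # ~0: column sums
```

So in the generic case the "distance" to the criterion is zero, not merely small; the centering step can only fail when some column of $\mathbf A$ is (nearly) parallel to $\mathbf 1$, which makes $\mathbf C$ rank-deficient.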

Could a Lagrangian do the trick here? Say, if we start with the orthogonality, we could begin with the $\mathbf{QR}$ decomposition of a random matrix and minimize $\Vert\mathbf Q^T\mathbf Q-\mathbf I\Vert_F$ subject to $\sum_{i} a_{ij} = 0$ for every $j$. But how exactly do we proceed in the multidimensional case?

Alternatively under a statistical view, assuming $n$ zero-centered distributions each with sample size $m$, can we somehow optimize an equivalent distance metric? I am not too sure about this intuition.

I also looked into the possibility of using normalized Hadamard matrices, although they are only guaranteed to exist for certain orders (e.g. $m=2^k$ via the Sylvester construction) and are conjectured to exist whenever $m$ is divisible by $4$. I also found an older paper discussing the weights (total number of 1s) in these matrices; it provides upper and lower bounds for many cases, but its definition of weight is not columnar. Hadamard matrices that fit my query must have weight exactly $m/2$ in at least $n$ columns. They are not easy to find in the first place for an arbitrary size, and I suspect even harder to optimize over, since we have to start with binary elements.

Approaching from a combinatorial perspective, I looked into a related concept from coding theory, the so-called equidistant constant-weight codes, but could not glean much from them. In any case, my problem is not necessarily combinatorial in nature, although such a solution would be equally acceptable.

Kindly pardon any mistakes in the text or notation. An easy-to-understand explanation would be preferred and much appreciated. Thanks.

Best answer

You want $v = \mathbf{1} \in \mathbb{R}^m$ to be orthogonal to the $n$ columns, where $n \lt m$. Solve the linear system

$ v^T x = 0 $

whose solution space, the null space of $B = v^T$, is $(m-1)$-dimensional. Orthonormalize a basis of this null space using Gram-Schmidt, then select any $n$ of the $m-1$ basis vectors. These vectors will be the columns of $\mathbf{A}$: they are orthonormal, so $\mathbf A^T\mathbf A=\mathbf I_n$, and each is orthogonal to $\mathbf 1$, so every column sum is zero.
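A numpy sketch of this construction, using a full QR decomposition of $v$ in place of explicit Gram-Schmidt (the trailing columns of the complete $Q$ factor form exactly the orthonormal null-space basis described above):

```python
import numpy as np

m, n = 6, 4
v = np.ones((m, 1))

# Complete QR of v: the first column of Q is v/||v|| (up to sign),
# the remaining m-1 columns are an orthonormal basis of null(v^T).
Q, _ = np.linalg.qr(v, mode="complete")
A = Q[:, 1:1 + n]   # pick any n of the m-1 basis vectors

print(np.allclose(A.T @ A, np.eye(n)))  # semi-orthogonal
print(np.allclose(A.sum(axis=0), 0))    # zero column sums
```

This settles the existence question affirmatively for every $m \gt n$, since the null space always supplies $m-1 \ge n$ orthonormal vectors.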