Largest orthogonal matrix that can be created using only 1 and -1 as elements


I am trying to understand what the largest possible orthogonal matrix is that can be created using only 1 and -1 as its elements.

Clearly for a $\ 4 \times 4 $ matrix:

We can construct an orthogonal matrix as : $\ \begin{bmatrix}1&1&1&-1\\1&1&-1&1\\1&-1&-1&-1\\1&-1&1&1\end{bmatrix} $
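One can check this numerically: the columns of this matrix are pairwise orthogonal, each with squared norm 4, so dividing by 2 gives an orthogonal matrix in the strict sense. A quick sketch with NumPy:

```python
import numpy as np

# The 4x4 matrix from the question, entries restricted to +1/-1.
M = np.array([[1,  1,  1, -1],
              [1,  1, -1,  1],
              [1, -1, -1, -1],
              [1, -1,  1,  1]])

# Gram matrix of the columns: off-diagonal entries are the pairwise
# dot products (all zero); diagonal entries are the squared norms (all 4).
G = M.T @ M
print(G)  # prints 4 times the identity matrix

# Dividing by sqrt(4) = 2 yields a truly orthogonal matrix.
print(np.allclose((M / 2) @ (M / 2).T, np.eye(4)))  # prints True
```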

But I want to understand whether it is possible to prove that we can or cannot create such a matrix for larger sizes.

Any guidance is appreciated :)

My attempts until now have been restricted to finding bounds on column rank of a matrix with all possible column vectors constructed from 1 and -1.

But even then, I am unable to link it to orthogonality of vectors.


There are 2 best solutions below


It is straightforward to see that there is a $(2^r) \times (2^r)$ solution matrix for each integer $r\geq 2$. Indeed, consider a matrix $A=(a(I,J))$ where the two indices $I$ and $J$ are subsets of $\lbrace 1,2,\ldots ,r \rbrace$, and

$$a(I,J)=(-1)^{|I\cap J|}$$

Let us show that the rows of this matrix are pairwise orthogonal (so that $2^{-r/2}A$ is orthogonal in the strict sense). Let $I_1 \neq I_2$; we must show that the sum

$$ S(I_1,I_2)=\sum_{J\subseteq [r]} a(I_1,J)a(I_2,J) \tag{2} $$

is zero. If $[r]=A\cup B$ is any partition of $[r]$ (with $A\cap B=\emptyset$), we can split each $J$ as $J=J_A\cup J_B$ with $J_A\subseteq A$ and $J_B\subseteq B$, note that $a(I,J)=a(I,J_A)\,a(I,J_B)$, and rewrite $S(I_1,I_2)$ as

$$ S(I_1,I_2)=\sum_{J_A\subseteq A}\sum_{J_B\subseteq B} a(I_1,J_A)a(I_1,J_B)a(I_2,J_A)a(I_2,J_B) \tag{2'} $$

In particular, if we take $A=I_1\cup I_2$ and $B=[r] \setminus A$ in (2'), the terms $a(I_1,J_B)$ and $a(I_2,J_B)$ are always equal to 1, so

$$ S(I_1,I_2)=2^{|B|} \sum_{J_A\subseteq A} a(I_1,J_A)a(I_2,J_A) \tag{3} $$

In other words, we may assume without loss that $I_1\cup I_2 = [r]$. Similarly, using (2') again with $A=I_1\cap I_2$, we may also assume without loss that $I_1\cap I_2 = \emptyset$.

At this point $(I_1,I_2)$ is a partition of $[r]$. We may then use (2') a third time, with $A=I_1,B=I_2$ :

$$ S(I_1,I_2)=\sum_{J_A\subseteq I_1}\sum_{J_B\subseteq I_2} a(I_1,J_A)a(I_1,J_B)a(I_2,J_A)a(I_2,J_B)= \sum_{J_A\subseteq I_1}\sum_{J_B\subseteq I_2}(-1)^{|J_A|+|J_B|}= \bigg(\sum_{J_A\subseteq I_1}(-1)^{|J_A|}\bigg)\bigg(\sum_{J_B\subseteq I_2}(-1)^{|J_B|}\bigg) \tag{4} $$

Now at least one of $I_1,I_2$ is nonempty (they are disjoint and distinct), and for a nonempty set $I$ the binomial theorem gives $\sum_{J\subseteq I}(-1)^{|J|}=(1-1)^{|I|}=0$, so the product vanishes.
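The construction above is easy to verify by brute force for small $r$; a sketch (the function name is mine) that encodes each subset of $[r]$ as an $r$-bit mask, builds $A$, and checks $A A^{\mathsf T} = 2^r I$:

```python
import numpy as np

def subset_matrix(r):
    """Build the 2^r x 2^r matrix a(I, J) = (-1)^{|I intersect J|},
    encoding each subset of {1, ..., r} as an r-bit mask."""
    n = 2 ** r
    A = np.empty((n, n), dtype=int)
    for I in range(n):
        for J in range(n):
            # |I ∩ J| is the popcount of the bitwise AND of the two masks.
            A[I, J] = (-1) ** bin(I & J).count("1")
    return A

for r in range(2, 6):
    A = subset_matrix(r)
    n = 2 ** r
    # Rows are pairwise orthogonal: the Gram matrix is 2^r times the identity.
    assert np.array_equal(A @ A.T, n * np.eye(n, dtype=int))
print("verified for r = 2..5")
```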


Thanks to @PeterTaylor for guiding me in the right direction. This problem falls under the construction of Hadamard matrices, which are orthogonal matrices built using only 1 and -1 as elements.

https://en.wikipedia.org/wiki/Hadamard_matrix

There are many constructions for large matrices of size $2^k$, like Sylvester's construction (from Wikipedia):

Examples of Hadamard matrices were actually first constructed by James Joseph Sylvester in 1867. Let H be a Hadamard matrix of order n. Then the partitioned matrix

$$\begin{bmatrix}H&H\\H&-H\end{bmatrix}$$ is a Hadamard matrix of order 2n. This observation can be applied repeatedly and leads to the following sequence of matrices, also called Walsh matrices.

$$\begin{aligned}H_{1}&={\begin{bmatrix}1\end{bmatrix}},\\H_{2}&={\begin{bmatrix}1&1\\1&-1\end{bmatrix}},\\H_{4}&={\begin{bmatrix}1&1&1&1\\1&-1&1&-1\\1&1&-1&-1\\1&-1&-1&1\end{bmatrix}},\end{aligned}$$
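The doubling step translates directly into code; a minimal sketch using NumPy's `np.block` (the function name `sylvester` is mine):

```python
import numpy as np

def sylvester(k):
    """Return the Walsh/Sylvester Hadamard matrix of order 2^k."""
    H = np.array([[1]])
    for _ in range(k):
        # Each step replaces H by [[H, H], [H, -H]], doubling the order.
        H = np.block([[H, H], [H, -H]])
    return H

H4 = sylvester(2)
print(H4)  # matches the H_4 shown above

# Hadamard property: H H^T = n I, where n is the order.
print(np.array_equal(H4 @ H4.T, 4 * np.eye(4, dtype=int)))  # prints True
```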

The construction of such matrices of arbitrary size is still an open area of research; the Hadamard conjecture states that a Hadamard matrix of order 4k exists for every positive integer k.

Just a fun fact to add: these matrices are extremely useful for error-correcting codes, which allow messages transmitted over very unreliable channels to be recovered.
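To illustrate the error-correction point: the rows of a Hadamard matrix of order $n$, together with their negations, form a code with minimum distance $n/2$, so maximum-correlation decoding recovers the transmitted row even after several sign flips. A toy sketch (the decoder here is my own illustration, not a standard library API):

```python
import numpy as np

def sylvester(k):
    """Walsh/Sylvester Hadamard matrix of order 2^k."""
    H = np.array([[1]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])
    return H

H = sylvester(4)                 # order-16 Hadamard matrix
codebook = np.vstack([H, -H])    # 32 codewords of length 16, min distance 8

sent = codebook[7].copy()
received = sent.copy()
received[[1, 5, 10]] *= -1       # corrupt 3 of 16 symbols (fewer than d/2 = 4)

# Decode by picking the codeword with the largest correlation:
# the true codeword correlates at 16 - 2*3 = 10, every other one at most 6.
decoded = int(np.argmax(codebook @ received))
print(decoded)  # prints 7: the original codeword is recovered
```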