Determine the dimension of the subspace of all matrices that commute with the all-ones matrix, and find a basis


Let $\Gamma$ denote the $n$-by-$n$ matrix which has $1$ in all its entries. Denote $$S := \{A \in \mathcal{M}_n(\mathbb{R}) : A\Gamma = \Gamma A\}$$ Determine the dimension of $S$, and find a basis for $S$.

A simple calculation shows that $S$ consists of all matrices whose row sums and column sums all equal a common value. This resembles the case of magic matrices, so I consulted the literature on those and concluded that the dimension is $\,n^2 - 2n + 2\,$.

Problem is, how do I find a basis for it?

Exhausting all the possibilities by hand seems undesirable, and I can't think of a way to do it systematically.
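As a sanity check on the claimed dimension (a numerical sketch, assuming `numpy` is available, not part of the argument): the commutant is the kernel of the linear map $A \mapsto A\Gamma - \Gamma A$, which can be written as an $n^2 \times n^2$ matrix via the identity $\operatorname{vec}(AXB) = (B^\top \otimes A)\operatorname{vec}(X)$, and its nullity read off from the rank.

```python
import numpy as np

def commutant_dim(n):
    """Dimension of {A : A @ Gamma == Gamma @ A} for the all-ones matrix Gamma."""
    Gamma = np.ones((n, n))
    I = np.eye(n)
    # vec(A Gamma - Gamma A) = (Gamma^T ⊗ I - I ⊗ Gamma) vec(A)
    M = np.kron(Gamma.T, I) - np.kron(I, Gamma)
    return n * n - np.linalg.matrix_rank(M)

for n in range(2, 7):
    assert commutant_dim(n) == n * n - 2 * n + 2
```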

BEST ANSWER

First hint: $\Gamma\,$ is symmetric and satisfies $\Gamma^2 =n\,\Gamma$, so that $\,\frac1n\Gamma\,$ is an orthogonal projector. It is straightforward to see that its image is one-dimensional. Hence $\,\frac1n\Gamma\,$ is similar to $E_{nn}=\left(\begin{smallmatrix} 0 &\ldots &0 \\ \vdots &\ddots &\vdots \\ 0 &\dots &1\end{smallmatrix}\right)$, which has just one nonzero entry. Similarity means there is some invertible transformation matrix $T$ such that $\,TE_{nn}T^{-1}=\frac1n\Gamma\,$.
Consider the commutator map $[\,\cdot\, ,E_{nn}]: \mathcal{M}_n(\mathbb{R})\to\mathcal{M}_n(\mathbb{R}), A\mapsto AE_{nn} -E_{nn}A$, which is linear.

Second hint: $\dim\ker\, [\,\cdot\, ,E_{nn}] = (n-1)^2+1$ may be read off from $$[A,E_{nn}] \;=\; \begin{pmatrix} 0 &\dots &0 &a_{1,n}\\ \vdots &\ddots &\vdots & \vdots\\ 0 &\dots &0 &a_{n-1,n}\\ -a_{n,1} &\dots &-a_{n,n-1} &0\end{pmatrix}$$
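The displayed form of the commutator can be checked numerically (a small sketch assuming `numpy`): for any $A$, the product $AE_{nn}$ copies the last column of $A$ and $E_{nn}A$ copies the last row, so only $2(n-1)$ entries of $[A,E_{nn}]$ can be nonzero.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
E = np.zeros((n, n))
E[-1, -1] = 1.0

C = A @ E - E @ A
# Expected shape of [A, E_nn]: last column carries a_{i,n} (i < n),
# last row carries -a_{n,j} (j < n), corner and interior are zero.
expected = np.zeros((n, n))
expected[:-1, -1] = A[:-1, -1]
expected[-1, :-1] = -A[-1, :-1]
assert np.allclose(C, expected)
# Vanishing of 2(n-1) entries cuts the dimension by 2(n-1):
# nullity = n^2 - 2(n-1) = (n-1)^2 + 1.
```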

First question: How are $S$ and $\,\ker\, [\,\cdot\, ,E_{nn}]\,$ related to each other, in terms of the transformation matrix $T$ ?

Last question: How to systematically deduce from the preceding a basis of $S$ ?
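One concrete way to realize the two hints in code (a sketch assuming `numpy`; taking $T$ from the eigendecomposition of $\Gamma$ is one possible choice of transformation matrix): conjugating a basis of $\ker\,[\,\cdot\,,E_{nn}]$ by $T$ produces matrices that commute with $\Gamma$.

```python
import numpy as np

n = 4
Gamma = np.ones((n, n))

# Orthogonal T with (1/n) Gamma = T @ E_nn @ T^{-1}: eigenvectors of Gamma,
# sorted so the eigenvalue-n eigenvector (the normalized all-ones vector) is last.
w, T = np.linalg.eigh(Gamma)   # eigenvalues ascend: 0, ..., 0, n

# Basis of ker[·, E_nn], read off from the second hint:
# the E_ij with i, j < n, plus E_nn itself.
kernel_basis = []
for i in range(n - 1):
    for j in range(n - 1):
        B = np.zeros((n, n))
        B[i, j] = 1.0
        kernel_basis.append(B)
B = np.zeros((n, n))
B[n - 1, n - 1] = 1.0
kernel_basis.append(B)

# Conjugation by T (orthogonal, so T^{-1} = T^T) sends this kernel onto S.
S_basis = [T @ B @ T.T for B in kernel_basis]
for M in S_basis:
    assert np.allclose(M @ Gamma, Gamma @ M)
assert len(S_basis) == (n - 1) ** 2 + 1
```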

If $A \in S$ and $s$ is the common sum of the rows and columns of $A$, then $A-\frac{s}{n} {\Gamma} \in {\mathcal{V}}_{n}$, the space of ${n}\times{n}$ matrices whose row and column sums are all $0$. We can find a basis of ${\mathcal{V}}_{n}$ and add ${\Gamma}$ to it to obtain a basis of $S$.

Let us now suppose that $A = \left({a}_{i , j}\right) \in {\mathcal{V}}_{n}$ and let $m = n+1$. Let $\left({x}_{1} , \dots , {x}_{n}\right)$ and $\left({y}_{1} , \dots , {y}_{n}\right)$ be $2 n$ arbitrary scalars. We build a matrix $B = \left({b}_{i , j}\right) \in {\mathcal{V}}_{m}$ by defining

\begin{equation} \left\{\begin{array}{lcl}{b}_{i , j}&=&{a}_{i , j}+{x}_{j}+{y}_{i} \quad \text{ if } i , j \leqslant n\\ {b}_{i , m}&=&{-{s}_{x}}-n {y}_{i} \quad \text{ if } i \leqslant n\\ {b}_{m , j}&=&{-n} {x}_{j}-{s}_{y} \quad \text{ if } j \leqslant n\\ {b}_{m , m}&=&n {s}_{x}+n {s}_{y} \end{array}\right. \end{equation}

where ${s}_{x} = \sum _{j} {x}_{j}$ and ${s}_{y} = \sum _{i} {y}_{i}$. Conversely, any matrix $B \in {\mathcal{V}}_{m}$ can be written in this way because the ${x}_{j}$ and ${y}_{i}$ must be defined by

\begin{equation} \renewcommand{\arraystretch}{1.5} \left\{\begin{array}{lcl}{x}_{j}&=&{-\frac{1}{n}} {b}_{m , j}-\frac{1}{n} {s}_{y}\\ {y}_{i}&=&{-\frac{1}{n}} {b}_{i , m}-\frac{1}{n} {s}_{x} \end{array}\right. \end{equation}

The sums of the last row and column of $B$, together with the constraint ${b}_{m , m} = n {s}_{x}+n {s}_{y}$, then imply that ${s}_{x} = \sum _{j} {x}_{j}$ and ${s}_{y} = \sum _{i} {y}_{i}$. It follows that the matrix ${a}_{i , j} = {b}_{i , j}-{x}_{j}-{y}_{i}$ for $i , j \leqslant n$ belongs to ${\mathcal{V}}_{n}$. This decomposition is not unique, because we have one degree of freedom in the choice of $s_x$ and $s_y$; we could get a unique decomposition by choosing, for example, $s_y=0$.
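The construction and the recovery formulas can be exercised numerically (a sketch assuming `numpy`; the projection used to manufacture a random element of $\mathcal{V}_n$ is an illustrative choice, not part of the answer):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 4, 5

# A random element of V_n: subtract row means, then column means.
# (After the first step all row sums are 0, so the column-mean vector
# sums to 0 and the second step preserves the zero row sums.)
A = rng.standard_normal((n, n))
A -= A.mean(axis=1, keepdims=True)
A -= A.mean(axis=0, keepdims=True)

x = rng.standard_normal(n)
y = rng.standard_normal(n)
sx, sy = x.sum(), y.sum()

# Build B in V_m by the displayed formulas.
B = np.zeros((m, m))
B[:n, :n] = A + x[None, :] + y[:, None]
B[:n, m - 1] = -sx - n * y
B[m - 1, :n] = -n * x - sy
B[m - 1, m - 1] = n * sx + n * sy
assert np.allclose(B.sum(axis=0), 0) and np.allclose(B.sum(axis=1), 0)

# Recovery: given s_x and s_y, the last row and column determine x and y.
x_rec = -B[m - 1, :n] / n - sy / n
y_rec = -B[:n, m - 1] / n - sx / n
assert np.allclose(x_rec, x) and np.allclose(y_rec, y)
```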

From this we deduce the recurrence $\dim {\mathcal{V}}_{n+1} = (2 n-1)+\dim {\mathcal{V}}_{n}$, hence $\dim {\mathcal{V}}_{n} = (n-1)^2$ because $\dim {\mathcal{V}}_{1}=0$. The construction also gives us a way to build a basis of these vector spaces: take the matrices appearing as coefficients of the ${x}_{j}$'s and ${y}_{i}$'s in the process.
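For a concrete endpoint, here is one explicit choice of basis consistent with these dimensions (a sketch assuming `numpy`; this closed-form basis of $\mathcal{V}_n$, the matrices $E_{ij}-E_{in}-E_{nj}+E_{nn}$ for $i,j \leqslant n-1$, is a standard choice rather than the one the recursion literally produces), together with $\Gamma$ to complete a basis of $S$:

```python
import numpy as np

def basis_of_S(n):
    """(n-1)^2 matrices with zero row/column sums (spanning V_n), plus Gamma."""
    basis = []
    for i in range(n - 1):
        for j in range(n - 1):
            B = np.zeros((n, n))
            B[i, j] = 1.0
            B[i, n - 1] = -1.0
            B[n - 1, j] = -1.0
            B[n - 1, n - 1] = 1.0
            basis.append(B)
    basis.append(np.ones((n, n)))   # Gamma itself
    return basis

n = 5
basis = basis_of_S(n)
Gamma = np.ones((n, n))
for M in basis:
    assert np.allclose(M @ Gamma, Gamma @ M)
# Linear independence: stack as vectors and check the rank.
stacked = np.array([M.ravel() for M in basis])
assert np.linalg.matrix_rank(stacked) == n * n - 2 * n + 2
```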