Completing inductive step of proof of Cochran's theorem?


On slides 23-26 here, they prove the case for $k=2$ of Cochran's theorem. I don't see why you can use induction to prove it for general $k$.

In more detail, they already prove the following:

If $$ Q = \sum_{i=1}^n x_i^2 = x'A_1 x + x'A_2 x = Q_1 + Q_2 $$ where $A_1$ and $A_2$ are PSD with ranks $r_1, r_2$ such that $r_1+r_2=n$, then there exists an orthogonal matrix $C$ such that, with $x=Cy$,

$$ Q_1 = y_1^2 + \cdots + y_{r_1}^2, \\ Q_2 = y_{r_1+1}^2 + \cdots + y_n^2. $$
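As a sanity check on the $k=2$ statement, here is a minimal numerical sketch (my own, not from the slides). It builds $A_1$ directly as a rank-$r_1$ orthogonal projection and sets $A_2 = I - A_1$, which is one valid instance of the hypotheses; the orthogonal $C$ then comes from the eigenvectors of $A_1$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r1 = 5, 2

# A1 = projection onto a random r1-dimensional subspace (PSD, rank r1);
# A2 = I - A1 is PSD with rank n - r1, so r1 + r2 = n as required.
M = rng.standard_normal((n, r1))
Q_basis, _ = np.linalg.qr(M)           # orthonormal basis of the subspace
A1 = Q_basis @ Q_basis.T
A2 = np.eye(n) - A1

# Eigenvectors of A1 (eigenvalues 1, then 0) give the orthogonal C.
vals, C = np.linalg.eigh(A1)           # eigh returns ascending eigenvalues
C = C[:, ::-1]                         # reorder so eigenvalue-1 vectors come first

x = rng.standard_normal(n)
y = C.T @ x                            # x = C y

Q1 = x @ A1 @ x
Q2 = x @ A2 @ x
assert np.isclose(Q1, np.sum(y[:r1] ** 2))
assert np.isclose(Q2, np.sum(y[r1:] ** 2))
```

The two assertions are exactly the displayed conclusion: $Q_1$ is the sum of the first $r_1$ squared $y$-coordinates, and $Q_2$ is the sum of the remaining ones.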

I don't see how to use this to show it's true for general $k$ not just $k=2$ as above by induction.

If you split $Q_2$ into $Q_{21}+Q_{22}$ where $Q_{21}$ and $Q_{22}$ have ranks $a$ and $b$ for example, then you'd have a different orthogonal matrix $C_1\neq C$ such that, for $\hat{x}=(x_{r_1+1},\ldots,x_n)$,

$\hat{x}=C_1 \hat{y}$ for $\hat{y}=(y_{r_1+1},\ldots,y_n)$, and $$Q_{21}=y_{r_1+1}^2 + \cdots + y_{r_1+a}^2,\quad Q_{22}=y_{r_1+a+1}^2 + \cdots + y_n^2.$$

I don't see how you can show from $k=2$ that this is true for general $k\geq 2$, i.e. that there exists an orthogonal matrix $C$ that simultaneously gets us $$Q_i = y^2_{r_1+\cdots+r_{i-1}+1} + \cdots + y^2_{r_1+\cdots+r_i}$$ for $i=1,\ldots,k$. Anyone know how to complete this proof?

On BEST ANSWER

Sup Y9,

Note that the assumption $r_1 + \cdots + r_k = n$ will be problematic if you try to set up the induction directly: after handling $A_1$, you are left with $r_2 + \cdots + r_k$, which is not equal to $n$. This suggests that the induction involves looking at a lower-dimensional problem, i.e. the "new $n$" when you apply the induction step after handling $A_1$ is $\tilde{n} = r_2 + \cdots + r_k$, as described below.


It might be easier to state the result as follows: if $A_1, \ldots, A_k$ are PSD with ranks $r_1, \ldots, r_k$ summing to $n$ and satisfying $I = A_1 + \cdots + A_k$, then there exists an orthonormal basis $\{c_1, \ldots, c_n\}$ such that

  • $A_1$ is the identity on the subspace $\text{span}\{c_1, \ldots, c_{r_1}\}$, and zero on the orthogonal complement,
  • $A_2$ is the identity on the subspace $\text{span}\{c_{r_1 + 1}, \ldots, c_{r_1 + r_2}\}$, and zero on the orthogonal complement,
  • and so on.

[Note that $c_1, \ldots, c_n$ here correspond to the columns of $C$ in the original formulation of the result.]


Using the $k=2$ result, we can handle the situation $I = A_1 + B$ where $B := A_2 + \cdots + A_k$. (The $k=2$ hypotheses do hold here: $\operatorname{rank}(B) \le r_2 + \cdots + r_k = n - r_1$ by subadditivity of rank, while $A_1 + B = I$ forces $\operatorname{rank}(B) \ge n - r_1$.) This gives an orthonormal basis $c_1, \ldots, c_n$ such that $B = A_2 + \cdots + A_k$ is the identity on the subspace $\operatorname{span}\{c_{r_1 + 1}, \ldots, c_n\}$, and zero on the orthogonal complement, i.e. $\operatorname{span}\{c_1, \ldots, c_{r_1}\}$.

Moreover, each individual $A_i$ (for $i \ge 2$) must also be zero on $\operatorname{span}\{c_1, \ldots, c_{r_1}\}$: for $x$ in that subspace, $0 = x'Bx = x'A_2x + \cdots + x'A_kx$ with each term nonnegative, so $x'A_ix = 0$ for every $i$, and since $A_i$ is PSD (write $A_i = S_i'S_i$, so $x'A_ix = \|S_ix\|^2$) this forces $A_ix = 0$.

So we can restrict our attention to this $(n-r_1)$-dimensional subspace $\operatorname{span}\{c_{r_1 + 1}, \ldots, c_n\}$, and we have another instance of the theorem's conditions: the sum of $A_2, \ldots, A_k$ restricted there is the identity, and the sum of their ranks, $r_2 + \cdots + r_k$, is the dimension of the space ($n - r_1$). By induction on $k$, we may apply the theorem to this smaller problem and obtain the result.
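The induction above can be sketched as a recursion in code. This is my own illustrative implementation, not anything from the slides: it peels off $A_1$ via its eigendecomposition, restricts the remaining matrices to the orthogonal complement of its range, and recurses. It assumes the theorem's hypotheses hold for its inputs (under which each $A_i$ is automatically a projection):

```python
import numpy as np

def cochran_basis(As):
    """Given PSD matrices As = [A_1, ..., A_k] with sum(As) = I and ranks
    summing to n, return an orthogonal C whose consecutive column blocks
    span the ranges of A_1, ..., A_k (a sketch of the induction)."""
    n = As[0].shape[0]
    if len(As) == 1:
        return np.eye(n)                # base case: A_1 = I on the whole space
    # k=2 step applied to A_1 and B = A_2 + ... + A_k: eigenvalue-1
    # eigenvectors of A_1 span its range; the rest span the range of B.
    vals, V = np.linalg.eigh(As[0])
    V = V[:, np.argsort(-vals)]         # eigenvalue-1 vectors first
    r1 = int(np.sum(vals > 0.5))
    U = V[:, r1:]                       # basis of the complement, dim n - r1
    # Restrict the remaining A_i to the complement and recurse.
    sub = [U.T @ A @ U for A in As[1:]]
    return np.concatenate([V[:, :r1], U @ cochran_basis(sub)], axis=1)

# Hypothetical check: projections onto complementary subspaces of R^6
# with ranks 1, 2, 3 (so the ranks sum to n = 6 and the A_i sum to I).
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((6, 6)))
blocks = [Q[:, :1], Q[:, 1:3], Q[:, 3:]]
As = [B @ B.T for B in blocks]
C = cochran_basis(As)

x = rng.standard_normal(6)
y = C.T @ x
offsets = [0, 1, 3, 6]                  # cumulative ranks
for A, lo, hi in zip(As, offsets, offsets[1:]):
    assert np.isclose(x @ A @ x, np.sum(y[lo:hi] ** 2))
```

The final loop checks exactly the simultaneous conclusion: each $Q_i = x'A_ix$ equals the corresponding consecutive block of squared $y$-coordinates under the single orthogonal matrix $C$.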


It is possible to translate the above proof into strictly matrix notation if you wish. The gist of it is that you can modify certain subsets of columns of $C$ at a time without messing up previous steps.

When you make lots of money from your cushy job, please remember us.