There is a line I don't understand in a proof I'm reading. It goes like this:
Let $A_{ki}$ be the cofactor of $a_{ki}$. Then:
$$\sum_{i} A_{ij} \sum_{k} a_{ki}h_{k} = \sum_{k} h_{k} \sum_{i} A_{ij}a_{ki} = \Delta \cdot h_{j}$$
where $\Delta$ is the determinant of the (symmetric) matrix whose entries are the $a_{ki}$.
Can anyone tell me whether the second equality is valid at all? I know that the $j$-th summand of the outer sum is equal to $\Delta \cdot h_{j}$, but, in the absence of additional information on the $h_{k}$'s, may we really conclude that $\sum_{k} h_{k} \sum_{i} A_{ij}a_{ki} = \Delta \cdot h_{j}$?
Thanks in advance for your reading suggestions, replies, etc.
Using the formula $$\operatorname{adj} A \cdot A = \det A \cdot I,$$ we have $$\sum_k A_{ki} a_{kj} = \begin{cases} \det A &\text{ if } i = j \\ 0 &\text{ if } i \ne j \end{cases}$$ since $\operatorname{adj} A = (A_{ji})_{ij}$ (the indices are reversed!).
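As a quick numerical sanity check (not part of the proof), we can build the adjugate from cofactors for a random matrix and confirm $\operatorname{adj} A \cdot A = \det A \cdot I$, including the index reversal. This is just an illustrative sketch using NumPy:

```python
import numpy as np

def cofactor(a, k, i):
    # Cofactor A_{ki}: delete row k and column i, take the signed minor determinant.
    minor = np.delete(np.delete(a, k, axis=0), i, axis=1)
    return (-1) ** (k + i) * np.linalg.det(minor)

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# (adj A)_{ik} = A_{ki}: the cofactor indices are transposed.
adj = np.array([[cofactor(A, k, i) for k in range(4)] for i in range(4)])

# adj(A) @ A should equal det(A) * I.
assert np.allclose(adj @ A, np.linalg.det(A) * np.eye(4))
```

Entry $(i,j)$ of the product is exactly $\sum_k A_{ki} a_{kj}$, which is why the assertion matches the displayed case distinction.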
After relabeling the indices $(k \mapsto i, i \mapsto j, j \mapsto k)$, $$\sum_i A_{ij} a_{ik} = \begin{cases} \det A &\text{ if } j = k \\ 0 &\text{ if } j \ne k. \end{cases}$$ By the assumption that $A$ is symmetric, i.e. $a_{ij} = a_{ji}$, $$\sum_i A_{ij} a_{ki} = \begin{cases} \det A &\text{ if } j = k \\ 0 &\text{ if } j \ne k. \end{cases}$$ So our big sum is $$\sum_k h_k \sum_i A_{ij} a_{ki} = \det A \cdot h_j,$$ as all the $k \ne j$ terms vanish.
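The full chain can likewise be checked numerically for a random symmetric matrix and an arbitrary vector $h$ (again just a sanity check, not a proof):

```python
import numpy as np

def cofactor(a, k, i):
    # Cofactor A_{ki}: delete row k and column i, take the signed minor determinant.
    minor = np.delete(np.delete(a, k, axis=0), i, axis=1)
    return (-1) ** (k + i) * np.linalg.det(minor)

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
A = M + M.T                       # symmetric matrix of entries a_{ki}
h = rng.standard_normal(4)        # arbitrary h_k, no extra assumptions

# C[k, i] holds the cofactor A_{ki}.
C = np.array([[cofactor(A, k, i) for i in range(4)] for k in range(4)])

det = np.linalg.det(A)
for j in range(4):
    # sum over k of h_k * (sum over i of A_{ij} a_{ki})
    lhs = sum(h[k] * sum(C[i, j] * A[k, i] for i in range(4)) for k in range(4))
    assert np.isclose(lhs, det * h[j])
```

The inner sum is $\det A$ when $k = j$ and $0$ otherwise, so only the $k = j$ term survives, exactly as in the argument above.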
Credit: this proof came to mind after reading page 20 of http://www.math.ku.edu/~slshao/fall2013math290lecture11.pdf