Suppose I have a matrix, and for some $n\in \mathbb{N}$, I insert a row of zeros between the $n$th and $(n+1)$th rows, and a column of zeros between the $n$th and $(n+1)$th columns. For example, for $n=1$ and the matrix: $$A=\begin{bmatrix} 5&7&9 \\ 7&1&1 \\ 9&1&3 \\ \end{bmatrix}$$ I would obtain: $$B=\begin{bmatrix} 5&0&7&9 \\ 0&0&0&0 \\ 7&0&1&1 \\ 9&0&1&3 \\ \end{bmatrix}$$ In general, what effect does this have on the eigenvalues? If we subtract $\lambda I$ from $B$ and cofactor expand along the inserted row (or column), it is clear that zero must be an eigenvalue. From this, and from playing around with some matrices, I believe the following:
If $A$ is a singular matrix, then the eigenvalues of $B$ are the same as the eigenvalues of $A$. If $A$ is an invertible matrix, then the eigenvalues of $B$ are the eigenvalues of $A$ as well as 0.
Is this claim true, and how would I prove it if so?
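For what it's worth, here is the kind of numerical check I was doing while playing around, as a minimal sketch in Python with NumPy (the helper name `insert_zero_cross` is just something I made up for this post):

```python
import numpy as np

def insert_zero_cross(A, n):
    """Insert a row of zeros between rows n and n+1, and a column of zeros
    between columns n and n+1 (n counted from 1, as in the question)."""
    B = np.insert(A, n, 0.0, axis=0)  # new zero row at (0-indexed) position n
    B = np.insert(B, n, 0.0, axis=1)  # new zero column at position n
    return B

A = np.array([[5., 7., 9.],
              [7., 1., 1.],
              [9., 1., 3.]])
B = insert_zero_cross(A, 1)

print(np.sort_complex(np.linalg.eigvals(A)))
print(np.sort_complex(np.linalg.eigvals(B)))  # same values, plus an extra 0
```

Every example I tried gave the eigenvalues of $A$ together with an extra $0$, which is what led me to the claim above.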
So far as I can tell, the comment by superckl is exactly what I need (thanks by the way!). Subtracting $\lambda I$ from $B$ and cofactor expanding along the inserted row (or column) yields the identity $$\det(B-\lambda I)=-\lambda\det(A-\lambda I),$$ since the only nonzero entry in that row is the $-\lambda$ on the diagonal, and deleting its row and column leaves exactly $A-\lambda I$. Setting this equal to zero, clearly $\lambda=0$ must be a solution. That's as far as I got when I posted the question. But as superckl pointed out, if we now suppose $\lambda\neq0$, then $\det(B-\lambda I)=0$ forces $\det(A-\lambda I)=0$, so every nonzero eigenvalue of $B$ is an eigenvalue of $A$. Conversely, any eigenvalue of $A$ makes the right-hand side vanish, so it is also an eigenvalue of $B$. Hence the nonzero eigenvalues of $A$ and $B$ coincide, and $B$ additionally has the eigenvalue $0$.
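To double-check the sign in the identity above, here is a quick symbolic sanity check with SymPy for the example matrix, as a sketch (the construction just mirrors the definition of $B$; nothing here is specific to this particular $A$):

```python
import sympy as sp

lam = sp.symbols('lambda')
n = 1  # insert between the 1st and 2nd rows/columns, as in the example

A = sp.Matrix([[5, 7, 9],
               [7, 1, 1],
               [9, 1, 3]])

# Build B by inserting a zero row and a zero column at position n.
B = A.row_insert(n, sp.zeros(1, A.cols))
B = B.col_insert(n, sp.zeros(B.rows, 1))

pA = (A - lam * sp.eye(A.rows)).det()   # det(A - lam*I)
pB = (B - lam * sp.eye(B.rows)).det()   # det(B - lam*I)

# Expect det(B - lam*I) = -lam * det(A - lam*I), i.e. the sum below is 0.
print(sp.simplify(pB + lam * pA))
```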