I've stumbled upon several solutions of linear algebra problems which use the notion of "continuous dependence" of matrix polynomials on matrix entries. For instance (translated, so any inaccuracies are probably mine):

Let $A$ and $B$ be square matrices of the same size. Prove that the characteristic polynomials of $AB$ and $BA$ are equal.
Proof: Let us first assume that $A$ is invertible. Then $$|AB - \lambda I| = |A^{-1}(AB - \lambda I)A| = |BA - \lambda I|$$

It is also clear that the polynomial $P(\lambda,A,B) = |AB - \lambda I|$ depends continuously on the entries of $A$ and $B$, thus $P(\lambda,A,B) = P(\lambda,B,A)$ holds for singular matrices as well. Indeed, any singular matrix is a limit of invertible matrices, because the matrix $A + \lambda I$ is invertible for all but finitely many $\lambda$.
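This limiting argument can be sanity-checked numerically (this is an illustration, not part of the proof; the setup below is my own): take a deliberately singular $A$, a random $B$, and compare the characteristic polynomial coefficients of $AB$ and $BA$.

```python
import numpy as np

# Illustration only: the characteristic polynomials of AB and BA
# coincide even when A is singular.
rng = np.random.default_rng(0)

A = rng.standard_normal((4, 4))
A[:, 0] = 0.0          # force A to be singular (zero column)
B = rng.standard_normal((4, 4))

# np.poly(M) returns the coefficients of the characteristic
# polynomial det(tI - M), computed from the eigenvalues of M.
p_AB = np.poly(A @ B)
p_BA = np.poly(B @ A)

print(np.allclose(p_AB, p_BA))  # True
```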
Or here is another one:

Let $$X = \left( \begin{matrix} A & B \\ C & D \\ \end{matrix} \right) $$ If $A$ and $D$ are square matrices of the same size and $AC = CA$, then $$|X| = |AD - CB|$$
Proof: First assume that $|A| \ne 0$. Then using the fact that $|X| = |A||D - CA^{-1}B|$ one has $$|X| = |AD - ACA^{-1}B| = |AD - CB|$$ Now let us consider the case where $|A| = 0$. The equation $|X| = |AD - CB|$ is a polynomial equation in the entries of $X$. Thus if there exist invertible matrices $A_{\varepsilon}$ for which $A_{\varepsilon}C = CA_{\varepsilon}$ and $\lim \limits_{\varepsilon \to 0}A_{\varepsilon} = A$, then the former equation holds for $A$ as well.

Let $A_{\varepsilon} = A + \varepsilon I$; note that $A_{\varepsilon}C = CA_{\varepsilon}$ because $AC = CA$. Then $$|A + \varepsilon I| = \varepsilon^{N} + \ldots$$ is a polynomial in $\varepsilon$ with finitely many zeros, so the matrix $A + \varepsilon I$ is invertible for all but finitely many $\varepsilon$. (In particular, it is invertible for all sufficiently small $\varepsilon \ne 0$.)
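This identity can also be checked numerically for a singular $A$ (again an illustration, not part of the proof; the construction is my own). To get a $C$ commuting with $A$, take $C$ to be a polynomial in $A$:

```python
import numpy as np

# Illustration only: |X| = |AD - CB| when AC = CA, even for singular A.
rng = np.random.default_rng(1)
n = 3

A = rng.standard_normal((n, n))
A[0, :] = 0.0                      # make A singular (zero row)
C = A @ A + 2 * A + np.eye(n)      # a polynomial in A, so AC = CA
B = rng.standard_normal((n, n))
D = rng.standard_normal((n, n))

X = np.block([[A, B], [C, D]])
print(np.allclose(np.linalg.det(X), np.linalg.det(A @ D - C @ B)))  # True
```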
I don't quite get the latter parts of both proofs, where the author talks about continuous dependence. What does it mean when we talk about polynomials of matrices?

If a matrix equation "depends continuously" on the matrix entries and holds for invertible matrices, does it also hold for singular matrices?

How can I check that a polynomial "depends continuously" on matrix entries? For instance, why are the polynomials in these proofs considered to do so?
Consider for example the first proof (where I assume you were meant to show that the characteristic polynomial of $AB$ equals the characteristic polynomial of $BA$). By expanding the definitions, you can see that the function $p \colon \mathbb{R} \times M_n(\mathbb{R}) \times M_n(\mathbb{R}) \rightarrow \mathbb{R}$ given by $p(\lambda,A,B) = |AB - \lambda I|$ is a polynomial in the entries of $A,B$ and in $\lambda$. For example if $n = 2$ then
$$ p(\lambda,A,B) = \left| \begin{pmatrix} a_{11}b_{11} + a_{12}b_{21} - \lambda & a_{11}b_{12} + a_{12}b_{22} \\ a_{21} b_{11} + a_{22} b_{21} & a_{21} b_{12} + a_{22} b_{22} - \lambda \end{pmatrix} \right| = (a_{11} b_{11} + a_{12}b_{21} - \lambda)(a_{21}b_{12} + a_{22}b_{22} - \lambda)-(a_{11}b_{12} + a_{12}b_{22})(a_{21}b_{11}+a_{22}b_{21}).$$
More generally, this follows from the definition of the determinant as a sum of products of matrix entries. In particular, $(\lambda,A,B) \mapsto p(\lambda,A,B)$ is continuous. The function $(\lambda,A,B) \mapsto p(\lambda,B,A)$ is also continuous, and you have shown that $p(\lambda,A,B) = p(\lambda,B,A)$ holds for all $\lambda \in \mathbb{R}$, $B \in M_n(\mathbb{R})$ and invertible $A$. The set $S = \{ (\lambda, A, B) \mid \lambda \in \mathbb{R}, A \in GL_n(\mathbb{R}), B \in M_n(\mathbb{R}) \}$ is dense in $\mathbb{R} \times M_n(\mathbb{R}) \times M_n(\mathbb{R})$, and two continuous functions that agree on a dense set agree everywhere, so $p(\lambda,A,B) = p(\lambda,B,A)$ for all $(\lambda,A,B) \in \mathbb{R} \times M_n(\mathbb{R}) \times M_n(\mathbb{R})$.
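The density argument can be watched in action (a sketch under my own setup, not part of the argument above): approximate a singular $A$ by the invertible matrices $A + \varepsilon I$, on which the identity is known to hold, and let $\varepsilon \to 0$.

```python
import numpy as np

# Illustration of the limiting argument: p(lam, A_eps, B) = p(lam, B, A_eps)
# for the invertible approximants A_eps = A + eps*I, and the identity
# survives in the limit at the singular matrix A.
rng = np.random.default_rng(2)
n, lam = 3, 1.5

A = rng.standard_normal((n, n))
A[:, 0] = 0.0                       # singular A
B = rng.standard_normal((n, n))

def p(lam, A, B):
    """p(lambda, A, B) = det(AB - lambda*I)."""
    return np.linalg.det(A @ B - lam * np.eye(len(A)))

for eps in [1e-1, 1e-3, 1e-6]:
    A_eps = A + eps * np.eye(n)     # invertible for small eps != 0
    # the identity holds for the invertible approximants:
    assert np.isclose(p(lam, A_eps, B), p(lam, B, A_eps))

# ... and hence, by continuity, at the singular limit A itself:
print(np.isclose(p(lam, A, B), p(lam, B, A)))  # True
```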