I have come across a formula: $$\det(\lambda^2A + \lambda B +C) = \lambda^{2n} \det(A) + \text{ lower order terms}.$$ Here, $\lambda$ is a scalar and $A,B,C$ are $n \times n$ matrices. Can you help me prove it?
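Before any proof, here is a quick numerical sanity check (not a proof) of the claimed leading term, using SymPy; the concrete $3 \times 3$ matrices below are arbitrary examples of my choosing.

```python
# Sanity check: det(lam^2*A + lam*B + C) should be a polynomial in lam
# of degree 2n whose leading coefficient equals det(A).
import sympy as sp

n = 3
lam = sp.symbols('lam')

# Arbitrary integer matrices, chosen only for illustration.
A = sp.Matrix([[2, 1, 0], [0, 3, 1], [1, 0, 1]])
B = sp.Matrix([[1, 1, 1], [0, 2, 0], [1, 0, 3]])
C = sp.Matrix([[0, 1, 2], [1, 1, 0], [2, 0, 1]])

p = sp.expand((lam**2 * A + lam * B + C).det())
poly = sp.Poly(p, lam)

print(poly.degree())        # degree in lam: 2n = 6
print(poly.LC(), A.det())   # leading coefficient vs. det(A)
```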
determinant of the sum of two matrices det(A+B)
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) on 2026-03-29
No better time than now to learn about the exterior algebra perspective on determinants! Unless you prefer inductive proofs...
Let's view these matrices as linear maps $\alpha, \beta, \gamma$ instead. In the exterior algebra picture, the determinant of a map is its induced action on the top exterior power, so we wish to show that $\Lambda^n ( \lambda^2 \alpha + \lambda \beta + \gamma)$ is $\lambda^{2n} \Lambda^n \alpha$ up to lower order terms in $\lambda$.
Concretely, $\Lambda^n(\lambda^2 \alpha + \lambda \beta + \gamma)$ acts as the $n$-fold wedge $ (\lambda^2 \alpha + \lambda \beta + \gamma )\wedge \cdots \wedge (\lambda^2 \alpha + \lambda \beta + \gamma)$. Now simply distribute, and collect terms with the same degree of $\lambda$. Each term picks one of $\lambda^2 \alpha$, $\lambda \beta$, $\gamma$ in each of the $n$ slots, so its degree in $\lambda$ is at most $2n$, with equality exactly when every slot contributes $\lambda^2 \alpha$. That top term is $\Lambda^n (\lambda^2 \alpha) = \lambda^{2n} \Lambda^n \alpha$, as desired.
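To make the bookkeeping concrete, here is the $n=2$ case of the distribution written out (a small worked example, not part of the general argument):
$$\begin{aligned}
(\lambda^2\alpha + \lambda\beta + \gamma)\wedge(\lambda^2\alpha + \lambda\beta + \gamma)
&= \lambda^4\, \alpha\wedge\alpha + \lambda^3\,(\alpha\wedge\beta + \beta\wedge\alpha) \\
&\quad + \lambda^2\,(\alpha\wedge\gamma + \beta\wedge\beta + \gamma\wedge\alpha) + \lambda\,(\beta\wedge\gamma + \gamma\wedge\beta) + \gamma\wedge\gamma,
\end{aligned}$$
and the unique top-degree term $\lambda^4\,\alpha\wedge\alpha = \lambda^4\,\Lambda^2\alpha$ is exactly $\lambda^4 \det(A)$.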