Eigenvector normalization, poles and residue


I am trying to understand an eigenvector normalization procedure described in an article [1] (Appendix B).

The problem involves a complex-valued matrix $\mathbf{Z}$, each element of which is a function of the complex variable $s$, for which we want to find the non-trivial solutions of

$\mathbf{Z}(s_n) \mathbf{I}_n=0$

and

$\mathbf{K}_n\mathbf{Z}(s_n)=0$

with $\mathbf{I}_n$ and $\mathbf{K}_n$ the right and left eigenvectors.
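For concreteness, here is a minimal numerical sketch (my own, not from the article) with a hypothetical linear pencil $\mathbf{Z}(s)=\mathbf{A}+s\mathbf{B}$: the singular points $s_n$ are then generalized eigenvalues, and the right/left null vectors come from the SVD of $\mathbf{Z}(s_n)$.

```python
import numpy as np

# Hypothetical example: Z(s) = A + s*B, so det Z(s_n) = 0 at the
# generalized eigenvalues of (A, -B). Any analytic Z(s) would do.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
B = np.array([[1.0, 0.0], [0.5, 1.0]])
Z = lambda s: A + s * B

# s_n: the values of s where Z(s) becomes singular
s_all = np.linalg.eigvals(np.linalg.solve(B, -A))
s_n = s_all[0]

# Right null vector I_n and left null vector K_n of Z(s_n),
# taken as the smallest-singular-value directions of the SVD
U, sig, Vh = np.linalg.svd(Z(s_n))
I_n = Vh[-1].conj()      # Z(s_n) @ I_n ≈ 0
K_n = U[:, -1].conj()    # K_n @ Z(s_n) ≈ 0

print(np.linalg.norm(Z(s_n) @ I_n))   # ≈ 0
print(np.linalg.norm(K_n @ Z(s_n)))   # ≈ 0
```

For a general (nonlinear) $\mathbf{Z}(s)$ one would instead locate the roots of $\det\mathbf{Z}(s)$ numerically; the SVD step for the null vectors is unchanged.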

Once the $s_n$ are obtained, it is said that the eigenvectors are normalized so that

$\mathbf{K}_n\mathbf{Z}'(s_n)\mathbf{I}_n=1$

and that this normalization ensures that the dyadic product of the eigenvectors matches the pole residue:

$\mathbf{Z}(s)=\frac{\mathbf{I}_n\mathbf{K}_n}{s-s_n}$ in the vicinity of $s_n$.

I am having difficulty linking the normalization step to the final equation.

Can someone explain to me how this works, or give me a reference describing the method in more detail? Do I have to compute the derivative $\mathbf{Z}'$ of my matrix to conduct this normalization? I know about the residue theorem and Laurent series.
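To illustrate what the normalization step looks like in practice, here is a sketch (my own assumption, again using a hypothetical $\mathbf{Z}(s)=\mathbf{A}+s\mathbf{B}$, for which $\mathbf{Z}'(s)=\mathbf{B}$ exactly): the derivative only enters through the scalar $\mathbf{K}_n\mathbf{Z}'(s_n)\mathbf{I}_n$, which is used to rescale one of the eigenvectors.

```python
import numpy as np

# Hypothetical Z(s) = A + s*B, so Z'(s) = B analytically; for a
# general Z(s) a finite difference (Z(s+h) - Z(s-h)) / (2h) would do.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
B = np.array([[1.0, 0.0], [0.5, 1.0]])
s_n = -2.0                          # Z(s_n) = A + s_n*B is singular
I_n = np.array([1.0, 0.0])          # right null vector of Z(s_n)
K_n = np.array([1.0, -1.0])         # left null vector of Z(s_n)

# Normalize: rescale I_n so that K_n @ Z'(s_n) @ I_n = 1
c = K_n @ B @ I_n
I_n = I_n / c

print(K_n @ B @ I_n)   # → 1.0
```

Splitting the factor $1/c$ between $\mathbf{I}_n$ and $\mathbf{K}_n$ is arbitrary; only the product $\mathbf{I}_n\mathbf{K}_n$ matters in what follows.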

EDIT

After studying this in more detail, I am fairly sure that the last equation should read

$\mathbf{Z}^{-1}(s)=\frac{\mathbf{I}_n\mathbf{K}_n}{s-s_n}$

and that there is a typo in the article. In that case, the normalization $\mathbf{K}_n\mathbf{Z}'(s_n)\mathbf{I}_n=1$ fixes the first-order coefficient of the Taylor expansion of $\mathbf{Z}$ around $s_n$: since $\mathbf{Z}(s_n)\mathbf{I}_n=0$ and $\mathbf{K}_n\mathbf{Z}(s_n)=0$, the projection of $\mathbf{Z}$ onto the eigenvectors behaves like

$\mathbf{K}_n\mathbf{Z}(s)\mathbf{I}_n=(s-s_n)+O\big((s-s_n)^2\big)$ in the vicinity of $s_n$,

and thus the dyad $\mathbf{I}_n\mathbf{K}_n$ is exactly the residue of $\mathbf{Z}^{-1}$ at the simple pole $s_n$:

$\mathbf{Z}^{-1}(s)=\frac{\mathbf{I}_n\mathbf{K}_n}{s-s_n}+O(1)$ in the vicinity of $s_n$.
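This residue claim is easy to check numerically. A sketch, using a hypothetical $\mathbf{Z}(s)=\mathbf{A}+s\mathbf{B}$ with eigenvectors already normalized so that $\mathbf{K}_n\mathbf{Z}'(s_n)\mathbf{I}_n=1$: the product $(s-s_n)\,\mathbf{Z}^{-1}(s)$ should converge to the dyad $\mathbf{I}_n\mathbf{K}_n$ as $s\to s_n$.

```python
import numpy as np

# Hypothetical Z(s) = A + s*B with normalized null vectors,
# i.e. K_n @ B @ I_n = 1 with B = Z'(s_n)
A = np.array([[2.0, 1.0], [1.0, 3.0]])
B = np.array([[1.0, 0.0], [0.5, 1.0]])
Z = lambda s: A + s * B
s_n = -2.0
I_n = np.array([2.0, 0.0])
K_n = np.array([1.0, -1.0])

residue = np.outer(I_n, K_n)  # the dyad I_n K_n

# (s - s_n) * Z^{-1}(s) should approach the dyad as s -> s_n
for eps in (1e-3, 1e-5):
    approx = eps * np.linalg.inv(Z(s_n + eps))
    print(np.max(np.abs(approx - residue)))  # shrinks with eps
```

The printed error shrinks linearly with `eps`, consistent with the $O(1)$ regular part of the Laurent expansion of $\mathbf{Z}^{-1}$ around the simple pole.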

[1] (Paywall): https://journals.aps.org/prapplied/abstract/10.1103/PhysRevApplied.7.034006 (Free): https://arxiv.org/abs/1610.04980