How does a linear operator having constant eigenvalues imply integrability of its eigenvector distribution?


I am reading a paper where the authors have constructed a linear operator, say $A$, on vector fields. They claim that this operator having locally constant eigenvalues is enough to imply that the distribution generated by the eigenvectors of $A$ is integrable. I am having trouble understanding this.

In order to have integrability, I believe we must show that $[v,w]$ again lies in the distribution, i.e. that $[v,w]$ is an eigenvector of $A$, for any eigenvectors $v,w$ of $A$. Say that $\lambda_1,\lambda_2$ are the eigenvalues associated to $v,w$, respectively. The constant-eigenvalue condition seems to imply that $$\nabla(Av) = A(\nabla v) = \lambda_1\nabla v,$$ but I do not understand how this implies the distribution is closed under Lie brackets. In particular, it is clear from the above that $[Av,Aw] = \lambda_1\lambda_2[v,w]$, since the locally constant eigenvalues pull out of the bracket in each argument separately. With that said, why is it that $[v,w]$ is also an eigenvector of $A$? I do not see it.

EDIT: The operator $A$ is (as I understand it) not a priori a homomorphism of Lie algebras. It is related to the shape operator of an immersed submanifold.
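
For what it is worth, here is the kind of argument I would guess is intended, though the paper does not spell it out and the extra hypothesis below is my assumption, not something I can point to in the text. Suppose $\nabla$ is torsion-free and $A$ satisfies a Codazzi-type identity $(\nabla_v A)w = (\nabla_w A)v$ (as the shape operator of a hypersurface does, by the Codazzi equation). Then for $v,w$ in the eigendistribution of a locally constant eigenvalue $\lambda$,
$$(\nabla_v A)w = \nabla_v(Aw) - A(\nabla_v w) = \lambda\,\nabla_v w - A(\nabla_v w) = (\lambda I - A)\nabla_v w,$$
where $\nabla_v(\lambda w) = \lambda\,\nabla_v w$ uses that $\lambda$ is locally constant. The Codazzi symmetry then gives $(\lambda I - A)\nabla_v w = (\lambda I - A)\nabla_w v$, and since $\nabla$ is torsion-free,
$$(\lambda I - A)[v,w] = (\lambda I - A)\left(\nabla_v w - \nabla_w v\right) = 0,$$
so $[v,w]$ lies in the $\lambda$-eigendistribution. But this needs the Codazzi identity, which I do not see stated in the paper; is something like this what the authors have in mind?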