In similar vein to this question, I am trying to understand the proof of the fact that in a $k$-regular graph, the multiplicity of the eigenvalue $k$ equals the number of connected components.
The proof given in my book runs as follows: Let $v=(x_1,\ldots,x_n)^t$ be an eigenvector corresponding to the eigenvalue $k$, and suppose without loss of generality that $|x_1|$ is maximal and $x_1>0$. Then $kx_1=\sum_{(1,j)\in E}x_j\le kx_1$, where $E$ is the edge set.
The next sentence is: This means there is no cancellation in the sum and all the $x_j$'s are equal to $x_1$. This sentence ends the proof.
I don't understand this last statement, nor how the result follows from it.
Can someone explain please?
Thanks
Let $G$ have $\ell$ components $C_1$, $\ldots\,$, $C_{\ell}$. For $i = 1$, $\ldots\,$, $\ell$, define the vector $x^{(i)}$ by $x^{(i)}_j = \chi_{v_j \in C_i}$, that is, $x^{(i)}_j = 1$ if $v_j \in C_i$ and $x^{(i)}_j = 0$ otherwise.
Since the different $x^{(i)}$s are pairwise orthogonal (their supports are disjoint), in order to show that the multiplicity of $k$ is $\ell$, it is enough to show that each $x^{(i)}$ is an eigenvector with eigenvalue $k$ and that every eigenvector $x$ with eigenvalue $k$ can be written as $x = \lambda_1 x^{(1)} + \cdots + \lambda_{\ell} x^{(\ell)}$ for some scalars $\lambda_1$, $\ldots\,$, $\lambda_{\ell}$.
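As a quick sanity check of this setup (not part of the proof), here is a small pure-Python sketch on a hypothetical example: two disjoint triangles, a $2$-regular graph with two components, whose indicator vectors are verified to be orthogonal eigenvectors with eigenvalue $k = 2$.

```python
# Example graph: two disjoint triangles, so k = 2 and ell = 2.
# Vertices 0, 1, 2 form C_1; vertices 3, 4, 5 form C_2.
k = 2
A = [
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 0, 0, 0],
    [0, 0, 0, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
]

# Indicator vectors x^(1), x^(2) of the two components.
x1 = [1, 1, 1, 0, 0, 0]
x2 = [0, 0, 0, 1, 1, 1]

def matvec(M, v):
    """Multiply the adjacency matrix M by the vector v."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

# Each indicator vector is an eigenvector with eigenvalue k ...
assert matvec(A, x1) == [k * c for c in x1]
assert matvec(A, x2) == [k * c for c in x2]
# ... and the two indicator vectors are orthogonal.
assert sum(a * b for a, b in zip(x1, x2)) == 0
```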
The first assertion is easy to see. For the second, let $x$ be an eigenvector with eigenvalue $k$. Without loss of generality, assume that $\lvert x_1 \rvert$ is maximal, that $x_1 > 0$, and that $v_1 \in C_1$.
Because $x$ is an eigenvector with eigenvalue $k$, we have $$k x_1 = \sum_{(v_1,v_j)\in E} x_j.$$ Since $G$ is $k$-regular, this sum has exactly $k$ terms, and since $\lvert x_1 \rvert$ is maximal, each term satisfies $x_j \leq \lvert x_j \rvert \leq x_1$, so $$\sum_{(v_1,v_j)\in E} x_j \leq k x_1.$$ Subtracting the sum from $k x_1$ gives $$\sum_{(v_1,v_j)\in E} (x_1 - x_j) = 0,$$ a sum of non-negative terms, so every term vanishes. This is the "no cancellation" step: each neighbour $v_j$ of $v_1$ satisfies $x_j = x_1 > 0$.
We can apply the same reasoning to each such neighbour $v_j$, then to its neighbours, and so on; since $C_1$ is connected, this shows that $x_i = x_1$ for every $v_i \in C_1$.
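This "and so on" is just a breadth-first spread along edges: once the argument forces a vertex's coordinate to equal $x_1$, it applies again at that vertex, so the conclusion propagates to exactly the component of $v_1$. A minimal sketch, using the same hypothetical two-triangle graph as an adjacency list:

```python
from collections import deque

# Hypothetical example graph: two disjoint triangles, as adjacency lists.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [4, 5], 4: [3, 5], 5: [3, 4]}

def forced_equal(start):
    """Vertices whose coordinate the argument forces to equal x_start:
    exactly the connected component of `start`, found by BFS along edges."""
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return sorted(seen)

assert forced_equal(0) == [0, 1, 2]  # the component C_1 of vertex 0
assert forced_equal(3) == [3, 4, 5]  # the argument never crosses components
```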
It may be that there exists some $j \in [n]$ such that $v_j \notin C_1$ but $x_j \neq 0$. However, we can repeat the argument within each component: for each $i$, pick a vertex of $C_i$ whose coordinate has maximal absolute value, and apply the reasoning above (to $x$ or to $-x$, according to the sign of that coordinate) to conclude that all coordinates of $x$ on $C_i$ equal a common value $\lambda_i$. Thus $x = \lambda_1 x^{(1)} + \cdots + \lambda_{\ell} x^{(\ell)}$, i.e. $x$ is a linear combination of the $x^{(i)}$s, as claimed.
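As a final numerical check (again on the hypothetical two-triangle example, using only the standard library), one can confirm that the multiplicity of the eigenvalue $k$ — computed as $\dim\ker(A - kI) = n - \operatorname{rank}(A - kI)$, with exact rational arithmetic — equals the number of components:

```python
from fractions import Fraction

# Hypothetical example: two disjoint triangles (2-regular, 2 components).
k, n = 2, 6
A = [
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 0, 0, 0],
    [0, 0, 0, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
]

def rank(M):
    """Rank over the rationals, by Gaussian elimination with exact arithmetic."""
    M = [[Fraction(x) for x in row] for row in M]
    r = 0
    for col in range(len(M[0])):
        pivot = next((i for i in range(r, len(M)) if M[i][col] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(len(M)):
            if i != r and M[i][col] != 0:
                f = M[i][col] / M[r][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

# The dimension of the kernel of A - k*I is the multiplicity of k.
B = [[A[i][j] - (k if i == j else 0) for j in range(n)] for i in range(n)]
multiplicity = n - rank(B)
assert multiplicity == 2  # equals the number of connected components
```

(Since $A$ is symmetric, geometric and algebraic multiplicities agree, so this one number is unambiguous.)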