I am reading a proof in which, instead of using the common dot product, they define the inner product as the standard dot product divided by the dimension of the vectors. For example:
$\langle x, y\rangle = \frac{1}{n} (x \cdot y)$
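For concreteness, this is all the definition amounts to (a small numpy illustration I wrote for this question, not something from the proof itself):

```python
import numpy as np

# The inner product from the proof: the standard dot product
# divided by the dimension n of the vectors.
def inner(x, y):
    return np.dot(x, y) / len(x)

# Under this definition the all-ones vector is already a unit vector:
ones = np.ones(4)
print(inner(ones, ones))   # 1.0
print(np.dot(ones, ones))  # 4.0 under the plain dot product
```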
Does this have a common interpretation, or could it just be convenient for the proof? I am looking for a more general understanding, but in case it helps I have written the specific proof below:
Here is the proof, it is from my notes in class.
Let $G$ be an $r$-regular graph with eigenvalues $\lambda_{0} \geq \lambda_{1} \geq \cdots \geq \lambda_{n-1}$. Show that the stability number (the size of a largest independent set) $\lvert S \rvert$ satisfies:
$\lvert S \rvert \leq \frac{-n \lambda _{n-1}}{r - \lambda _{n-1}}$
Let $\langle x,y \rangle$ be the inner product defined by $\langle x,y \rangle = \frac{1}{n}(x \cdot y)$, and let $\left \| x \right \| = \sqrt{\langle x,x\rangle}$. Let $A$ be the adjacency matrix of the graph $G$ and let $\left \{ v_{0}, v_{1}, \ldots, v_{n-1} \right \}$ be a basis of eigenvectors, orthonormal with respect to this inner product, such that $Av_{i} = \lambda_{i} v_{i}$. As $G$ is $r$-regular, $\lambda_{0}=r$ and $v_{0}$ is the all-ones vector $\vec{1}$.
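Note that the normalization is exactly what makes $\vec{1}$ a unit vector here:
$\left \| \vec{1} \right \|^{2} = \langle \vec{1}, \vec{1} \rangle = \frac{1}{n}(\vec{1} \cdot \vec{1}) = \frac{n}{n} = 1$
so it can serve as $v_{0}$ directly, without a $\frac{1}{\sqrt{n}}$ factor.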
Let $S$ be a set of vertices forming an independent set (there are no edges between them). Let $u$ be the characteristic vector of $S$ and let $\alpha = \frac{\lvert S\rvert}{n}$.
Note that
$u^{T}Au = \sum_{x,y \in S}^{} a_{x,y} = 0$
as the entries $a_{x,y}$ with $x,y \in S$ are all $0$ in $A$ (by the definition of an independent set).
As all $v_{i}$ form an orthonormal basis we can express $u$ as a linear combination of $v_{i}$ with coefficients $\left \{ a_{0}, a_{1}, \ldots, a_{n-1} \right \}$.
$u = \sum_{i=0}^{n-1}a_{i}v_{i}$
As the $v_{i}$ form an orthonormal basis, we know that $\langle u, v_{i}\rangle = a_{i}$. Hence $a_{0} = \langle u, v_{0}\rangle = \langle u, \vec{1} \rangle = \frac{\lvert S \rvert}{n} = \alpha$.
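Explicitly, since $u$ has $\lvert S \rvert$ entries equal to $1$ and the rest equal to $0$:
$\langle u, \vec{1} \rangle = \frac{1}{n}(u \cdot \vec{1}) = \frac{1}{n}\sum_{x}u_{x} = \frac{\lvert S \rvert}{n}$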
Using Parseval's identity:
$a_{0}^{2} + a_{1}^{2} + \ldots + a_{n-1}^{2} = \left \| u \right \|^{2} = \alpha$
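The last equality holds because $u$ is a $0$/$1$ vector, so $u_{x}^{2} = u_{x}$ and
$\left \| u \right \|^{2} = \frac{1}{n}\sum_{x}u_{x}^{2} = \frac{1}{n}\sum_{x}u_{x} = \frac{\lvert S \rvert}{n} = \alpha$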
Hence, as $a_{0}^{2} = \alpha^{2}$, we can rearrange to obtain:
$\alpha - \alpha^{2} = \sum_{i=1}^{n-1}a_{i}^{2}$
Now we will put everything together and derive the bound.
Note that:
$\langle u, Au \rangle = \frac{1}{n}u^{T}Au = 0$ (since $u^{T}Au = 0$, as we saw above)
Hence we have:
$0 = \langle u, Au \rangle = \sum_{i=0}^{n-1}\langle u, a_{i}Av_{i} \rangle = \sum_{i=0}^{n-1}\langle u, \lambda_{i}a_{i}v_{i} \rangle = \sum_{i=0}^{n-1}\lambda_{i}a_{i} \langle u, v_{i} \rangle = \sum_{i=0}^{n-1}\lambda_{i}a_{i}^{2} = \lambda_{0}a_{0}^{2} + \sum_{i=1}^{n-1}\lambda_{i}a_{i}^{2}$
Now we establish an inequality by replacing every eigenvalue in the sum over $i \geq 1$ with the smallest eigenvalue $\lambda_{n-1}$, which can only decrease the right-hand side:
$0 = \sum_{i=0}^{n-1}\lambda_{i}a_{i}^{2} \geq \lambda_{0}a_{0}^{2} + \lambda_{n-1}\sum_{i=1}^{n-1}a_{i}^{2}$
Recalling that $\sum_{i=1}^{n-1}a_{i}^{2} = \alpha - \alpha^{2}$, that $\lambda_{0} = r$, and that $a_{0}^{2} = \alpha^{2}$, we have
$0 \geq r\alpha^{2} + \lambda_{n-1}(\alpha - \alpha^{2})$
Dividing by $\alpha$ (assuming $S$ is nonempty), rearranging, and using the fact that $\frac{\lvert S \rvert}{n} = \alpha$, we find:
$\lvert S \rvert \leq \frac{-n\lambda_{n-1}}{r-\lambda_{n-1}}$
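As a quick numerical sanity check of the final bound (a minimal numpy sketch of my own, not from the notes; I use the cycle $C_{8}$, where the bound happens to be tight):

```python
import numpy as np

# Verify |S| <= -n * lambda_{n-1} / (r - lambda_{n-1}) on the cycle C_n.
# C_n is 2-regular; for even n its smallest eigenvalue is -2 and its
# largest independent set has size n/2, so the bound comes out to n/2.
n, r = 8, 2
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1  # edges of the cycle

eigenvalues = np.linalg.eigvalsh(A)  # sorted in ascending order
lam_min = eigenvalues[0]             # lambda_{n-1}, the smallest eigenvalue
bound = -n * lam_min / (r - lam_min)
print(f"lambda_min = {lam_min:.4f}, bound = {bound:.4f}")  # -2.0000, 4.0000
```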
It just saves us manually dividing out powers of $n$ in certain proofs. A vector whose entries are all $1$ in some privileged basis is often denoted $1$ or $e$; let's go with $e$. On the given definition, $\langle e,\,e\rangle=1$, so by Cauchy-Schwarz $|\langle e,\,x\rangle|\le\sqrt{\langle x,\,x\rangle}$, which is just the QM-AM inequality (in fact, a generalization thereof where $x$ can have components $\le0$), i.e. the statement that the entries in $x$ have variance $\ge0$.
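A quick numerical illustration of that inequality (a small numpy sketch; the random vector is just an arbitrary example):

```python
import numpy as np

# Check |<e, x>| <= sqrt(<x, x>) under <x, y> = (x . y)/n:
# the left side is |mean(x)| and the right side is the quadratic mean,
# so this is exactly the (signed) QM-AM inequality.
rng = np.random.default_rng(42)
x = rng.normal(size=1_000)       # entries may be negative
am = abs(np.mean(x))             # |<e, x>|
qm = np.sqrt(np.mean(x ** 2))    # sqrt(<x, x>)
assert am <= qm
print(f"|AM| = {am:.4f} <= QM = {qm:.4f}")
```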
Addendum, just before I post this: I see my example is used early in the proof you've edited in, albeit with an eye to combinatorial inequalities, as with adjacency matrices.