I have an integral operator $T$ defined with respect to a positive semidefinite kernel $k: \mathbb{R} \times \mathbb{R} \rightarrow \mathbb{R}$ and a probability measure $\mu(dx) = p(x)\,dx$: $$ (Tf)(x) = \int k(x, x') f(x') \mu(dx') \tag{1} $$
Let $\phi$ be an eigenfunction of $T$ corresponding to its largest eigenvalue, so that $T\phi = \lambda_{\max} \phi$. I am trying to figure out whether the following inequality is true:
$$ \frac{\langle \phi, T\phi\rangle}{\langle \phi, \phi\rangle} \geq \frac{\langle f, Tf\rangle }{\langle f, f\rangle} \tag{2} $$
where $\langle f, g\rangle = \int \overline{f(x)}g(x)\,\mu(dx)$ and $f$ is some function in $L^2(\mu)$. I thought this inequality would hold by analogy with the corresponding inequality for a symmetric positive semidefinite $d \times d$ matrix $A$: $$ \frac{\langle u, Au\rangle}{\langle u, u\rangle} \geq \frac{\langle x, Ax\rangle}{\langle x, x\rangle} \tag{3} $$
where $u, x \in \mathbb{R}^d$ and $u$ is an eigenvector of $A$ corresponding to its largest eigenvalue, i.e. $A u = \lambda_{\max}(A)\,u$. While (3) is easy to prove (e.g. via the spectral decomposition, or with Schatten $p$-norms), I haven't found a good resource for the analogous inequalities I would need to prove (2), or for whether (2) actually holds.
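For what it's worth, a quick numerical sanity check is consistent with (2). The sketch below is my own setup, not part of the question: it takes a Gaussian kernel and a standard normal density $p$, discretizes $T$ on a grid so it acts as $f \mapsto K\,\mathrm{diag}(w)f$ with quadrature weights $w_i \approx p(x_i)\,\Delta x$, symmetrizes with $D^{1/2} K D^{1/2}$ so `eigh` applies, and compares Rayleigh quotients in the $\mu$-weighted inner product:

```python
import numpy as np

rng = np.random.default_rng(0)

# Grid discretization of mu(dx) = p(x) dx with p the standard normal density
x = np.linspace(-5.0, 5.0, 400)
dx = x[1] - x[0]
w = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi) * dx  # quadrature weights for mu

# Positive semidefinite Gaussian kernel k(x, x') = exp(-(x - x')^2 / 2)
K = np.exp(-0.5 * (x[:, None] - x[None, :])**2)

# T acts on grid values as f -> K @ (w * f); the similar symmetric matrix
# S = D^{1/2} K D^{1/2} has the same spectrum, so use eigh on S.
sqw = np.sqrt(w)
S = sqw[:, None] * K * sqw[None, :]
lam, V = np.linalg.eigh(S)
lam_max = lam[-1]
phi = V[:, -1] / sqw  # corresponding eigenfunction of T on the grid

def rayleigh(f):
    """Rayleigh quotient <f, Tf> / <f, f> in the mu-weighted inner product."""
    Tf = K @ (w * f)
    return np.dot(w * f, Tf) / np.dot(w * f, f)

# The top eigenfunction attains lam_max, and a random f never exceeds it.
print(np.isclose(rayleigh(phi), lam_max))                # True
f = rng.standard_normal(x.size)
print(bool(rayleigh(f) <= lam_max + 1e-12))              # True
```

Repeating the last check over many random draws of $f$ never produced a violation in this discretized setting, which is what one would expect if the finite-dimensional Rayleigh quotient argument carries over.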