If $H$ is a subgraph of a graph $G$, then $\lambda_1(H) \leq \lambda_1(G)$, where $\lambda_1(H)$ and $\lambda_1(G)$ denote the largest eigenvalues of the adjacency matrices of $H$ and $G$, respectively.
This is easy to prove when $G$ is connected and non-bipartite using the Perron–Frobenius theorem, but I cannot prove it in the other cases.
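As a quick numerical sanity check of the inequality (the example graph here, $K_4$ minus an edge, is my own choice, and I assume `numpy` is available):

```python
import numpy as np

# G: K4 minus the edge (0, 2) -- a 4-cycle 0-1-2-3-0 plus the chord (1, 3).
A_G = np.array([[0, 1, 0, 1],
                [1, 0, 1, 1],
                [0, 1, 0, 1],
                [1, 1, 1, 0]], dtype=float)

# H: also delete the chord (1, 3), leaving the plain 4-cycle, a subgraph of G.
A_H = A_G.copy()
A_H[1, 3] = A_H[3, 1] = 0

# eigvalsh returns the eigenvalues of a symmetric matrix in ascending order.
lam1_H = np.linalg.eigvalsh(A_H)[-1]
lam1_G = np.linalg.eigvalsh(A_G)[-1]
print(lam1_H, lam1_G)   # 2.0 and (1 + sqrt(17))/2 ~ 2.5616
assert lam1_H <= lam1_G
```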
A similar question that I cannot solve is:
Let $G$ be a connected graph and $H$ a subgraph of $G$. If $\lambda_1(H) = \lambda_1(G)$, then $H = G$. (In fact, this is an if and only if.)
Any idea on how to do it?
Based on the proof for why $\lambda_1(H) \le \lambda_1(G)$, we can figure out what happens if $\lambda_1(H) = \lambda_1(G)$ by thinking through the implications when the inequalities are tight. (Note that the linked answer assumes $\lambda_n$ is the largest eigenvalue, but here I assume that $\lambda_1$ is the largest to be consistent with the question.)
Let $\mathbf w$ be a unit eigenvector of $A_H$ corresponding to $\lambda_1(H)$. We can assume that $w_i \ge 0$ for all $i$: $\mathbf w$ maximizes the quadratic form associated to $A_H$, and since the entries of $A_H$ are nonnegative, replacing each $w_i$ by $|w_i|$ can only increase that quadratic form. We also assume that $A_H$ and $A_G$ have the same size, by adding isolated vertices to $H$ if necessary.
From the inequality chain $$ \lambda_1(H) = \sup_{\mathbf x \in \mathbb R^n : \|\mathbf x\|=1} \mathbf x^{\mathsf T}\!A_H\mathbf x = \mathbf w^{\mathsf T}\!A_H\mathbf w \le \mathbf w^{\mathsf T}\!A_G\mathbf w \le \sup_{\mathbf x \in \mathbb R^n : \|\mathbf x\|=1} \mathbf x^{\mathsf T}\!A_G\mathbf x = \lambda_1(G) $$ we conclude that when $\lambda_1(H) = \lambda_1(G)$, we must have $\mathbf w^{\mathsf T}\!A_H\mathbf w = \mathbf w^{\mathsf T}\!A_G\mathbf w$ and also $\mathbf w^{\mathsf T}\!A_G\mathbf w = \lambda_1(G)$.
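To see the chain concretely, here is a small numerical sketch on the same example pair of graphs (again my own illustration, not part of the argument):

```python
import numpy as np

# Same pair as before: A_G is K4 minus the edge (0, 2), A_H the 4-cycle.
A_G = np.array([[0, 1, 0, 1],
                [1, 0, 1, 1],
                [0, 1, 0, 1],
                [1, 1, 1, 0]], dtype=float)
A_H = A_G.copy()
A_H[1, 3] = A_H[3, 1] = 0

# Top eigenvector of A_H; taking absolute values keeps it a unit vector
# and can only increase the quadratic form, as argued above.
vals_H, vecs_H = np.linalg.eigh(A_H)
w = np.abs(vecs_H[:, -1])

lam1_H = vals_H[-1]
lam1_G = np.linalg.eigvalsh(A_G)[-1]
qH = w @ A_H @ w   # equals lam1_H
qG = w @ A_G @ w
print(lam1_H, qH, qG, lam1_G)   # the chain lam1_H = qH <= qG <= lam1_G
assert qH <= qG <= lam1_G and np.isclose(qH, lam1_H)
```

Here the two inequalities are strict ($2 \le 2.5 \le 2.56\ldots$), consistent with $H \ne G$.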
Focus on the second equation first; it will show that $\mathbf w$ is a $\lambda_1(G)$-eigenvector of $A_G$. To see this, write $\mathbf w$ in an orthonormal eigenvector basis $\mathbf v^{(1)}, \dots, \mathbf v^{(n)}$ of $A_G$ as $$\mathbf w = c_1 \mathbf v^{(1)} + \dots + c_n \mathbf v^{(n)}.$$ Since $\|\mathbf w\|=1$, we have $c_1^2 + \dots + c_n^2 = 1$. Then $\mathbf w^{\mathsf T} \!A_G \mathbf w = \lambda_1(G) c_1^2 + \lambda_2(G) c_2^2 + \dots + \lambda_n(G) c_n^2$. This is a convex combination of the eigenvalues, and the only way it can equal $\lambda_1(G)$ is if $c_i = 0$ for all $i$ with $\lambda_i(G) < \lambda_1(G)$. But this means that $\mathbf w$ is a linear combination of the $\lambda_1(G)$-eigenvectors of $A_G$, so it is itself such an eigenvector.
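A quick numerical illustration of this convex-combination identity, on the same example $A_G$ with a random unit vector (my own sketch):

```python
import numpy as np

# For any unit vector w, expanding in an orthonormal eigenbasis of A_G gives
# w^T A_G w = sum(lam_i * c_i^2) with sum(c_i^2) = 1 -- a convex combination.
A_G = np.array([[0, 1, 0, 1],
                [1, 0, 1, 1],
                [0, 1, 0, 1],
                [1, 1, 1, 0]], dtype=float)
lams, V = np.linalg.eigh(A_G)     # columns of V: orthonormal eigenvectors

rng = np.random.default_rng(0)
w = rng.standard_normal(4)
w /= np.linalg.norm(w)            # normalize to a unit vector

c = V.T @ w                       # coefficients c_i = <v_i, w>
assert np.isclose((c**2).sum(), 1.0)
assert np.isclose(w @ A_G @ w, (lams * c**2).sum())
```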
Now, return to the first equation. We have $$0 = \mathbf w^{\mathsf T}\!A_G\mathbf w - \mathbf w^{\mathsf T}\!A_H\mathbf w = \sum_{ij \in E(G)} 2w_iw_j - \sum_{ij \in E(H)} 2w_iw_j = \sum_{ij \in E(G) \setminus E(H)} 2w_iw_j.$$ By nonnegativity of $\mathbf w$, for every edge $ij$ that appears in $G$ but not in $H$, $w_iw_j=0$. In particular, either $G=H$ and we are done, or else some $w_i$ must be $0$.
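The middle identity, $\mathbf x^{\mathsf T}\!A\mathbf x = \sum_{ij \in E} 2x_ix_j$, is easy to confirm numerically (again on the running example, my own check):

```python
import numpy as np

# Check x^T A x = sum over edges ij of 2 x_i x_j on the running example G.
A_G = np.array([[0, 1, 0, 1],
                [1, 0, 1, 1],
                [0, 1, 0, 1],
                [1, 1, 1, 0]], dtype=float)
edges_G = [(0, 1), (0, 3), (1, 2), (1, 3), (2, 3)]

rng = np.random.default_rng(1)
x = rng.standard_normal(4)
assert np.isclose(x @ A_G @ x, sum(2 * x[i] * x[j] for i, j in edges_G))
```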
Because $\mathbf w$ is an eigenvector, we know $A_G \mathbf w = \lambda_1(G)\mathbf w$. Then from $w_i = 0$ we conclude $$\sum_{j : ij \in E(G)} w_j = (A_G\mathbf w)_i = \lambda_1(G) w_i = 0.$$ Since all components of $\mathbf w$ are nonnegative, this means that $w_j=0$ for all $j$ such that $ij \in E(G)$: the zero components propagate to adjacent vertices.
This is true for any $i$ such that $w_i=0$: whenever $\mathbf w$ is zero at a vertex, it's zero at the vertex's neighbors. But then it's zero at those neighbors' neighbors, too, and so on. Since $G$ is connected, repeating this argument gives $\mathbf w = \mathbf 0$, which contradicts the assumption that $\mathbf w$ is an eigenvector of anything. So no component of $\mathbf w$ can be zero after all, and we must be in the first case: $H = G$.
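The propagation step is just a breadth-first search from a zero vertex; here is a sketch (the function `propagate_zeros` and the dictionary encoding of the graph are my own, purely for illustration):

```python
from collections import deque

def propagate_zeros(adj, start):
    """Breadth-first propagation: adj maps each vertex to its neighbor list,
    start is a vertex where w vanishes; returns all vertices forced to 0."""
    zero, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in zero:   # w_v = 0 is forced by the eigenvector equation at u
                zero.add(v)
                queue.append(v)
    return zero

# The running example G (K4 minus the edge (0, 2)) is connected, so a single
# zero coordinate spreads to every vertex:
adj_G = {0: [1, 3], 1: [0, 2, 3], 2: [1, 3], 3: [0, 1, 2]}
print(propagate_zeros(adj_G, 0))   # {0, 1, 2, 3} = V(G)
```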