Prove that when $A$ is hermitian then $$\|A\|=\max_{u^\dagger u=1}|u^\dagger Au|=\lambda$$ where $\lambda$ is the maximum of the set $|\lambda_i|$ and $\lambda_i$ are the eigenvalues of the matrix $A$.
Let $\vec{u}$ be a unit eigenvector of $A$ corresponding to an eigenvalue of largest magnitude, so that $A\vec{u}=\lambda_u\vec{u}$ with $|\lambda_u|=\lambda$. Then $$ |u^\dagger Au|=|\lambda_u u^\dagger u|=|\lambda_u|=\lambda. $$ Therefore, $\|A\|=\max_{u^\dagger u=1}|u^\dagger Au|\ge\lambda$.
Fine, but how do we prove that $\|A\|=\lambda$ when $A$ is hermitian?
Attempt
Let $A_{n\times n}$ be a real symmetric matrix and $\vec{x}=\begin{bmatrix}x_1\\x_2\\\vdots\\x_n\end{bmatrix}$ a real vector. Then:
\begin{align} \frac{\partial}{\partial x_k}(x^T Ax)&=\frac{\partial}{\partial x_k}\Big[\sum_j\Big(\sum_ix_ia_{ij}\Big)x_j\Big]=\frac{\partial}{\partial x_k}\Big[\sum_j\sum_ia_{ij}x_ix_j\Big]\\ &=\frac{\partial}{\partial x_k}\Big[\sum_{\substack{j\ne k\\i=k}}a_{kj}x_kx_j+\sum_{\substack{i\ne k\\j=k}}a_{ik}x_ix_k+a_{kk}x_k^2\Big],\quad\text{(terms without $x_k$ differentiate to zero)}\\ &=\sum_{j\ne k}a_{kj}x_j+\sum_{i\ne k}a_{ik}x_i+2a_{kk}x_k\\ &=\sum_ja_{kj}x_j+\sum_ia_{ik}x_i\\ &=k^{th}\text{ entry of }Ax+k^{th}\text{ entry of }x^T A\\ &=k^{th}\text{ entry of }Ax+k^{th}\text{ entry of }A^T x\\ &=k^{th}\text{ entry of }Ax+k^{th}\text{ entry of }Ax,\quad\color{red}{\text{ since $A$ is symmetric}}\\ &=2\times (k^{th}\text{ entry of }Ax)\\ \frac{\partial}{\partial x}(x^T Ax)&=\begin{bmatrix}\frac{\partial}{\partial x_1}(x^T Ax)\\\frac{\partial}{\partial x_2}(x^T Ax)\\\vdots\\\frac{\partial}{\partial x_n}(x^T Ax)\end{bmatrix}=\begin{bmatrix}2\times(1^{st}\text{ entry of }Ax)\\2\times(2^{nd}\text{ entry of }Ax)\\\vdots\\2\times(n^{th}\text{ entry of }Ax)\end{bmatrix}=2Ax \end{align}
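As a quick numerical sanity check (not part of the proof), the gradient identity $\nabla(x^TAx)=2Ax$ for symmetric $A$ can be verified with central finite differences in NumPy; the matrix and vector below are arbitrary test data:

```python
import numpy as np

# Check that the gradient of f(x) = x^T A x equals 2Ax for symmetric A,
# using central finite differences along each coordinate direction.
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                       # symmetrize
x = rng.standard_normal(n)

f = lambda x: x @ A @ x
eps = 1e-6
grad_fd = np.array([
    (f(x + eps * np.eye(n)[k]) - f(x - eps * np.eye(n)[k])) / (2 * eps)
    for k in range(n)
])

assert np.allclose(grad_fd, 2 * A @ x, atol=1e-5)
```

For a quadratic function the central difference is exact up to rounding error, so the agreement here is tight.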
We need to find the critical points of the function $x^T Ax$ subject to the constraint $||x||^2=x^T x=1$, i.e., the critical points of the Lagrangian $L(\vec{x},\lambda)=x^TAx-\lambda(||x||^2-1)$: compute the partial derivatives $\frac{\partial L}{\partial x},\frac{\partial L}{\partial \lambda}$ and set them equal to zero.
\begin{align} \frac{\partial L}{\partial x}&=\frac{\partial }{\partial x}(x^T Ax)-\lambda\frac{\partial }{\partial x}||x||^2=2Ax-\lambda(2x)=0\\ \implies Ax&=\lambda x\\ \frac{\partial L}{\partial \lambda}&=||x||^2-1=0\implies ||x||=1 \end{align} $\therefore (x,\lambda)$ must be an eigenpair of the symmetric matrix $A$. At such a critical point, $x^T Ax=\lambda x^T x=\lambda$, so the maximum of $|x^T Ax|$ over unit vectors is attained at the eigenvalue of largest magnitude, i.e., $||A||=\max_{x^T x=1}|x^T Ax|=\lambda$.
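A minimal numerical sketch of this conclusion, assuming NumPy: every unit eigenvector of a random symmetric $A$ is a critical point of the Lagrangian, and the largest $|x^TAx|$ among these critical points equals $\max_i|\lambda_i|$:

```python
import numpy as np

# Verify that each unit eigenvector of a symmetric A is a critical point
# of L(x, lam) = x^T A x - lam (x^T x - 1), and that the critical values
# x^T A x are the eigenvalues.
rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                        # symmetrize

eigvals, eigvecs = np.linalg.eigh(A)     # columns are unit eigenvectors
for lam, v in zip(eigvals, eigvecs.T):
    grad_x = 2 * A @ v - 2 * lam * v     # dL/dx at the eigenpair
    assert np.allclose(grad_x, 0, atol=1e-10)
    assert np.isclose(v @ v, 1.0)        # constraint dL/dlam = 0 holds

# The maximum of |x^T A x| over the critical points is max_i |lambda_i|.
assert np.isclose(max(abs(v @ A @ v) for v in eigvecs.T),
                  np.max(np.abs(eigvals)))
```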
How can we extend this proof to hermitian matrices?
A general unit vector $v$ can be written as a linear combination $v = \sum_i c_i e_i$, where the $e_i$'s are eigenvectors corresponding to the eigenvalues $\lambda_i$ such that $e_i^\dagger e_j = \delta_{ij}$, where $\delta_{ij}$ is the Kronecker delta. The existence of such an orthonormal basis of eigenvectors is the content of the spectral theorem for hermitian matrices: eigenvectors corresponding to distinct eigenvalues are automatically orthogonal, and within each eigenspace an orthonormal basis can be chosen.
We have $v^\dagger A v = \sum_{i,j} \bar{c}_i c_j\, e_i^\dagger A e_j = \sum_{i,j} \bar{c}_i c_j \lambda_j\, e_i^\dagger e_j = \sum_i |c_i|^2 \lambda_i$ and $\left\| v \right\|^2 = \sum_i |c_i|^2 = 1$.
So $\left\| A \right\|$ is the maximum value of $\left| \sum_i |c_i|^2 \lambda_i \right| $, subject to the constraint that $\sum_i |c_i|^2 = 1$.
Numbering the $\lambda_i$'s and $e_i$'s so that $\lambda_1$ is the eigenvalue with highest magnitude, we get $$\left| \sum_i |c_i|^2 \lambda_i \right| \le \sum_i |c_i|^2 |\lambda_i| \le |\lambda_1| \sum_i |c_i|^2 = |\lambda_1|,$$ with equality when $c_1 = 1$ and $c_2 = \dots = c_n = 0$. Hence the maximum value is $|\lambda_1| = \lambda$, i.e., $\|A\| = \lambda$.
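A short numerical check of the hermitian case, assuming NumPy: for a random hermitian $A$, $|u^\dagger Au|$ over random unit vectors never exceeds $\max_i|\lambda_i|$, and the top eigenvector (the $c_1=1$ case) attains it:

```python
import numpy as np

# Compare max_{|u|=1} |u^dag A u| (sampled over random unit vectors)
# with max_i |lambda_i| for a random hermitian matrix A.
rng = np.random.default_rng(2)
n = 4
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (B + B.conj().T) / 2                 # hermitian by construction

lam = np.max(np.abs(np.linalg.eigvalsh(A)))

# Random unit vectors never exceed lam ...
samples = []
for _ in range(2000):
    u = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    u /= np.linalg.norm(u)
    samples.append(abs((u.conj() @ A @ u).real))  # u^dag A u is real for hermitian A
assert max(samples) <= lam + 1e-12

# ... and the eigenvector of the largest-|lambda| eigenvalue attains it.
vals, vecs = np.linalg.eigh(A)
e1 = vecs[:, np.argmax(np.abs(vals))]
assert np.isclose(abs(e1.conj() @ A @ e1), lam)
```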