Let $\langle \cdot, \cdot \rangle$ be the Hermitian inner product on $\mathbb C^n$ and let $A=(a_{ij})_{1\leqq i,j\leqq n}$ be an $n\times n$ complex matrix.
Prove this claim.
Claim:
If $\langle Ax,x\rangle=\langle x,Ax\rangle$ for all $x\in \mathbb C^n$, then $A$ is a Hermitian matrix.
It suffices to show $a_{jk}=\overline{a_{kj}}$ for all $k,j=1,\cdots,n.$
Let $x\in \mathbb C^n$. Then, with the convention $\langle u,v\rangle=\sum_{j=1}^n u_j\overline{v_j}$ (linear in the first argument), \begin{align} &\langle Ax,x\rangle=\sum_{j,k=1}^n a_{jk}\overline{x_j} x_k\\ &\langle x,Ax\rangle=\sum_{j,k=1}^n \overline{a_{jk}} x_j \overline{x_k}=\sum_{j,k=1}^n \overline{a_{kj}} \ \overline{x_j}x_k \end{align}
From $\langle Ax,x\rangle=\langle x,Ax\rangle$, I get $\displaystyle\sum_{j,k=1}^n (a_{jk}-\overline{a_{kj}})\overline{x_j}x_k=0 \cdots (\ast)$
I want to derive $a_{jk}=\overline{a_{kj}}$ from $(\ast).$
Here is one way.
$(\ast)$ holds for all $x\in \mathbb C^n$, so consider
(i) $x=\begin{pmatrix}0\\ \vdots \\ 1 \\ \vdots \\ 0 \end{pmatrix}$ : the $m$-th component is $1$ and the others are $0$
(ii) $x=\begin{pmatrix}0\\ \vdots \\ 1 \\ \vdots \\ 1 \\ \vdots \\ 0 \end{pmatrix}$ : the $m$-th and the $l$-th components are $1$ ($l\neq m$) and the others are $0$

(iii) $x=\begin{pmatrix}0\\ \vdots \\ i \\ \vdots \\ 1 \\ \vdots \\ 0 \end{pmatrix}$ : the $m$-th component is $i$, the $l$-th component is $1$, and the others are $0$.

Putting (i) into $(\ast)$, I get $a_{mm}=\overline{a_{mm}}$.

Putting (ii) into $(\ast)$ and using $a_{mm}=\overline{a_{mm}}$, $a_{ll}=\overline{a_{ll}}$, I get $(a_{ml}-\overline{a_{lm}})+(a_{lm}-\overline{a_{ml}})=0$; putting (iii) into $(\ast)$ in the same way, I get $-i(a_{ml}-\overline{a_{lm}})+i(a_{lm}-\overline{a_{ml}})=0$, that is, $(a_{ml}-\overline{a_{lm}})-(a_{lm}-\overline{a_{ml}})=0$. Adding the two equations gives $2(a_{ml}-\overline{a_{lm}})=0$, so $a_{ml}=\overline{a_{lm}}$ for $l\neq m$. (Both (ii) and (iii) are needed: (ii) alone only forces the imaginary part of $a_{ml}-\overline{a_{lm}}$ to vanish, and (iii) alone only the real part.)
Thus, $a_{ml}=\overline{a_{lm}}$ for all $m,l=1,\cdots,n.$
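As a sanity check of this substitution strategy, here is a small numerical sketch (my own illustration; the helper names are arbitrary). For an arbitrary, not necessarily Hermitian, matrix $A$, it sets $s(x)=\langle Ax,x\rangle-\langle x,Ax\rangle$ (the left-hand side of $(\ast)$) and verifies that a fixed combination of $s$ on coordinate test vectors isolates $a_{ml}-\overline{a_{lm}}$, so the hypothesis $s(x)=0$ for all $x$ forces each entry condition individually:

```python
# Sanity check of the test-vector strategy (illustration only).
# Writing D_jk = a_jk - conj(a_kj), one computes
#   s(e_m + e_l)   = D_mm + D_ll + D_ml + D_lm
#   s(i e_m + e_l) = D_mm + D_ll - i D_ml + i D_lm
#   s(e_m) = D_mm,   s(e_l) = D_ll
# so a fixed combination of these values equals 2 D_ml.
import random

def inner(u, v):
    # Hermitian inner product, linear in the first slot: <u,v> = sum u_j conj(v_j)
    return sum(uj * vj.conjugate() for uj, vj in zip(u, v))

def s(A, x):
    # s(x) = <Ax, x> - <x, Ax>, the left-hand side of (*)
    Ax = [sum(A[j][k] * x[k] for k in range(len(x))) for j in range(len(A))]
    return inner(Ax, x) - inner(x, Ax)

def e(n, m, coef=1):
    # coordinate vector: coef in slot m, zeros elsewhere
    v = [0j] * n
    v[m] = coef
    return v

def add(u, v):
    return [a + b for a, b in zip(u, v)]

n = 3
random.seed(0)
A = [[complex(random.random(), random.random()) for _ in range(n)] for _ in range(n)]

m, l = 0, 2
# The combination below equals 2 D_ml = 2 (a_ml - conj(a_lm)):
combo = (s(A, add(e(n, m), e(n, l)))
         + 1j * s(A, add(e(n, m, 1j), e(n, l)))
         - (1 + 1j) * (s(A, e(n, m)) + s(A, e(n, l))))
assert abs(combo - 2 * (A[m][l] - A[l][m].conjugate())) < 1e-12
print("isolated a_ml - conj(a_lm):", combo / 2)
```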
But this method needs some calculation. Of course I think this method makes sense, but I wonder whether there is a shorter (cleverer) way.
I think $(\ast)$ resembles the kind of sum that appears in the definition of linear independence.
The definition of linear independence is $$\sum_{k=1}^n c_k v_k=0 \Rightarrow c_k=0\ (\forall k)$$
Now, $\displaystyle\sum_{j,k=1}^n (a_{jk}-\overline{a_{kj}})\overline{x_j}x_k=0$ is similar to $\displaystyle\sum_{k=1}^n c_k v_k=0$, and $c_k=0(\forall k)$ seems to correspond to $a_{jk}=\overline{a_{kj}}(\forall k,j)$.
So, my question: is there a way to prove the claim using linear independence?
Of course, another way to prove the claim is also welcome.
A more general result is that if $\langle x, A x \rangle = 0$ for all $x \in \mathbb{C}^n$ then $A = 0$ (not true for $\mathbb{R}^n$ in general).
To see this expand $\langle x+y, A (x+y) \rangle, \langle x+iy, A (x+iy) \rangle$ to show that $\langle x, A y \rangle = 0$ for all $x,y \in \mathbb{C}^n$.
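Writing the expansions out explicitly (with the convention, as in the question, that $\langle\cdot,\cdot\rangle$ is linear in the first argument and conjugate-linear in the second; with the opposite convention the factors $\pm i$ simply swap):

```latex
\begin{align*}
0 = \langle x+y,\, A(x+y)\rangle
  &= \underbrace{\langle x,Ax\rangle}_{=0} + \langle x,Ay\rangle
     + \langle y,Ax\rangle + \underbrace{\langle y,Ay\rangle}_{=0},\\
0 = \langle x+iy,\, A(x+iy)\rangle
  &= \underbrace{\langle x,Ax\rangle}_{=0} - i\,\langle x,Ay\rangle
     + i\,\langle y,Ax\rangle + \underbrace{\langle y,Ay\rangle}_{=0}.
\end{align*}
```

The first equation gives $\langle y,Ax\rangle=-\langle x,Ay\rangle$, the second gives $\langle y,Ax\rangle=\langle x,Ay\rangle$; together $\langle x,Ay\rangle=0$ for all $x,y$, and taking $x,y$ to be standard basis vectors kills every entry of $A$.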
Hence if $\langle x, A x \rangle = \langle Ax, x \rangle = \langle x, A^*x \rangle$, we have $\langle x, (A-A^*) x \rangle =0$ for all $x$ and hence $A=A^*$.
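The parenthetical about $\mathbb{R}^n$ can be made concrete with the skew-symmetric matrix $A=\begin{pmatrix}0&1\\-1&0\end{pmatrix}$; a quick check (my own illustration, not part of the proof):

```python
# Why the complex hypothesis matters: for skew-symmetric A we have
# <Ax, x> = x . Ax = 0 for every REAL vector x, yet A != 0.
# Over C, a complex test vector detects A.

def inner(u, v):
    # Hermitian inner product, linear in the first slot
    return sum(uj * vj.conjugate() for uj, vj in zip(u, v))

def matvec(A, x):
    return [sum(a * xk for a, xk in zip(row, x)) for row in A]

A = [[0, 1], [-1, 0]]   # A^T = -A, but A != 0

# <Ax, x> vanishes on real vectors ...
for x in ([1, 0], [3, -2], [0.5, 4]):
    assert inner(matvec(A, x), x) == 0

# ... but not on the complex vector (1, i):
x = [1, 1j]
print(inner(matvec(A, x), x))   # 2i, nonzero
```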