Show that $h_1,\ldots,h_n$ are linearly independent in $\mathscr{H}$ (Hilbert space) if and only if the matrix $A$, defined by $A=[a_{ij}]$ where $a_{ij}=\langle h_j, h_i \rangle$, is invertible.
First, I tried to express the matrix $A$ explicitly: $$A=(a_{ij})_{n \times n} = (\langle h_j,h_i \rangle)_{n \times n} =\begin{pmatrix} \langle h_1,h_1 \rangle & \cdots & \langle h_n, h_1 \rangle \\ \vdots & \ddots & \vdots \\ \langle h_1,h_n \rangle & \cdots & \langle h_n,h_n \rangle \end{pmatrix}.$$
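To make the definition concrete, here is a small hypothetical finite-dimensional example (vectors in $\mathbb{R}^3$ with the standard inner product, linear in the first argument); the matrix $H$ and its columns are my own illustration, not part of the problem. With the columns of $H$ playing the role of $h_1,\ldots,h_n$, one has $a_{ij}=\langle h_j,h_i\rangle=(H^*H)_{ij}$:

```python
import numpy as np

# Hypothetical example: columns of H are h_1, h_2 in R^3.
# With <u, v> = v* u (linear in the first argument),
# a_ij = <h_j, h_i> = (H* H)_ij, so the Gram matrix is A = H* H.
H = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])

A = H.conj().T @ H  # the matrix A = [a_ij] above
print(A)            # a Hermitian (here symmetric) 2x2 matrix
```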
At this point, I tried to prove the "only if" direction first:
Suppose $h_1,\ldots,h_n$ are linearly independent. Then whenever $\sum_{k=1}^n c_k h_k = 0$, we must have $c_k=0$ for all $k$. How may I proceed from here? Perhaps, to prove that $A$ is invertible, I can use one of its equivalent conditions listed in the Invertible Matrix Theorem.
For the "if" direction, I tried to write $Ah$ explicitly: $$Ah=A\begin{pmatrix} h_1 \\ \vdots \\ h_n \end{pmatrix}=\begin{pmatrix} \langle h_1,h_1 \rangle & \cdots & \langle h_n, h_1 \rangle \\ \vdots & \ddots & \vdots \\ \langle h_1,h_n \rangle & \cdots & \langle h_n,h_n \rangle \end{pmatrix} \begin{pmatrix} h_1 \\ \vdots \\ h_n \end{pmatrix}=\begin{pmatrix} \langle h_1, h_1 \rangle h_1 + \cdots + \langle h_n, h_1 \rangle h_n \\ \vdots \\ \langle h_1, h_n \rangle h_1 + \cdots + \langle h_n, h_n \rangle h_n \end{pmatrix}.$$ Does this lead to saying $\langle h_k, h_i \rangle=0$ for $k\in \{1,\ldots,n\}$? If so, then I think this means $h_1,\ldots,h_n$ are linearly independent.
For the "only if" direction, consider the homogeneous system $Ax = 0$. The $i$-th equation reads $$\sum_{j=1}^n \langle h_j, h_i \rangle x_j = 0.$$ By linearity of the inner product in its first argument, this is $$\left\langle \sum_{j=1}^n x_j h_j,\; h_i \right\rangle = 0.$$ Since this holds for every $i$, we get $\sum_{j=1}^n x_j h_j \in \operatorname{span}(h_1,h_2,\ldots,h_n)^\perp$. On the other hand, $\sum_{j=1}^n x_j h_j$ is a linear combination of $h_1,h_2,\ldots,h_n$. Hence $$\sum_{j=1}^n x_j h_j \in \operatorname{span}(h_1,h_2,\ldots,h_n) \cap \operatorname{span}(h_1,h_2,\ldots,h_n)^\perp = \{0\},$$ so $\sum_{j=1}^n x_j h_j = 0$, and by the linear independence of $h_1,h_2,\ldots,h_n$ we conclude $x_j = 0$ for $j = 1,2,\ldots,n$. Thus $Ax = 0$ has only the trivial solution. Now you can look at the equivalence theorems for a nonsingular matrix.
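As a numerical sanity check (not a proof), the argument above can be tested on a small hypothetical example: for linearly independent columns, $Ax=0$ has only the trivial solution, so the Gram matrix has full rank and nonzero determinant.

```python
import numpy as np

# Hypothetical example: the columns of H are linearly independent in R^3.
H = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])
A = H.T @ H                      # Gram matrix a_ij = <h_j, h_i>

# Ax = 0 has only the trivial solution, so A is nonsingular.
print(np.linalg.det(A))          # nonzero (here 9)
print(np.linalg.matrix_rank(A))  # full rank: 2
```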
For the "if" direction, I don't think writing $Ah$ explicitly is useful. Instead, you can try the contrapositive statement: if $h_1,\ldots,h_n$ are linearly dependent, then $A$ is not invertible.
You can use the same equivalence theorem as above, and the argument is very similar: a nontrivial dependence relation $\sum_{j=1}^n x_j h_j = 0$ yields a nonzero solution of $Ax = 0$.
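The contrapositive can likewise be checked numerically on a hypothetical example: take a dependence relation such as $h_2 = 2h_1$ and observe that its coefficient vector lies in the kernel of $A$.

```python
import numpy as np

# Hypothetical dependent example: h_2 = 2 h_1, so 2 h_1 - h_2 = 0.
H = np.array([[1.0, 2.0],
              [1.0, 2.0],
              [0.0, 0.0]])
A = H.T @ H                # Gram matrix
c = np.array([2.0, -1.0])  # coefficients of the dependence 2 h_1 - h_2 = 0

print(A @ c)               # the zero vector: A has a nontrivial kernel
print(np.linalg.det(A))    # approximately 0, so A is singular
```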