Let $a < b$ be real numbers and $f_1,\dots,f_n:[a,b] \to \mathbb R$ continuous functions. Define an $n \times n$ matrix $M = (m_{ij})_{i,j = 1,\dots,n}$ where $$m_{ij} = \int_a^b f_i(x)f_j(x)\,dx. $$ Prove that $$\det(M) = 0 \iff f_1,\dots,f_n \text{ are linearly dependent}.$$
Since the matrix $M$ is symmetric, we know that it is orthogonally diagonalizable. If $M$ happens to be diagonal and $\det(M) = 0$, then some $m_{ii} = \int_a^b f_i(x)^2\,dx$ is zero, which means (since $f_i^2$ is nonnegative and continuous) that $f_i$ is identically zero, so the $f_i$'s are linearly dependent. Conversely, if one of the $f_i$ can be expressed as a linear combination of the others, say $f_n = \sum_{k=1}^{n-1}c_kf_k$, then we may use the fact that $M$ is diagonal to deduce algebraically that the determinant is zero.
However, I am not sure how to get from this special case to the general case. Looking at the problem for $n = 2$, it seems like the Cauchy–Schwarz inequality may come up somewhere, but I am not sure exactly how.
I would like to know how I can solve this problem, and also what the motivation would be for someone to come up with it.
EDIT: Thank you for the link given by @AnneBauval. However, I found the answers at the link insufficient for my understanding. When we write $G = A^TA$, are we not assuming that we are simply using the standard inner product as we compute the product $A^TA$ entry by entry? For the matrix $M$ defined above, what would a matrix $A$ satisfying $G = A^TA$ look like? The link from @AnneBauval gave me a greater sense of the problem, though, and I thank her for that.
$M$ is the Gram matrix of $(f_1,\dots,f_n),$ i.e. $m_{i,j}=f_i\cdot f_j,$ for the inner product on $C([a,b])$ defined by $$f\cdot g=\int_a^bf(x)g(x)\,\mathrm dx.$$ Let us now forget about $C([a,b]),$ and prove that for any family $(f_1,\dots,f_n)$ of vectors in any inner product space, its Gram matrix $M\in M_n(\Bbb R)$ is singular iff the $n$ vectors are linearly dependent.
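Before the proof, here is a quick numerical illustration (a sketch, not part of the argument; the sample functions and grid size are my own choices): discretizing the integral with trapezoidal weights $w_k$ gives $M \approx A^TA$, where column $i$ of $A$ holds the samples $\sqrt{w_k}\,f_i(x_k)$ — which also shows, in the discretized picture, what a matrix with $G = A^TA$ can look like.

```python
import numpy as np

# Discretize ∫_0^1 f_i f_j dx with trapezoidal weights, so that M ≈ A^T A
# where column i of A is (sqrt(w_k) * f_i(x_k))_k. Example functions are
# assumptions for illustration, not taken from the problem statement.
a, b, n_pts = 0.0, 1.0, 10_001
x = np.linspace(a, b, n_pts)
w = np.full(n_pts, (b - a) / (n_pts - 1))  # composite trapezoid weights
w[0] *= 0.5
w[-1] *= 0.5

def gram(fs):
    A = np.sqrt(w)[:, None] * np.column_stack([f(x) for f in fs])
    return A.T @ A  # (i, j) entry ≈ ∫ f_i f_j dx

dep = [lambda t: np.ones_like(t), lambda t: t, lambda t: 1 + t]    # f3 = f1 + f2
indep = [lambda t: np.ones_like(t), lambda t: t, lambda t: t**2]   # 1, x, x²

print(np.linalg.det(gram(dep)))    # ≈ 0 (linearly dependent family)
print(np.linalg.det(gram(indep)))  # ≈ 1/2160 (3×3 Hilbert matrix determinant)
```

For the dependent family the third column of $A$ is exactly the sum of the first two, so the determinant vanishes up to rounding; for $1, x, x^2$ the Gram matrix is the $3\times3$ Hilbert matrix, whose determinant is $1/2160$.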
$\Rightarrow:$ if $M$ is singular, let $v=\begin{pmatrix}a_1\\\vdots\\a_n\end{pmatrix}\in M_{n,1}(\Bbb R)$ be a non-zero vector of its kernel, i.e. $Mv=0.$ Then, $$0=v^TMv=\sum_{i,j}a_im_{i,j}a_j=\sum_{i,j}a_i(f_i\cdot f_j)a_j=w\cdot w$$ where $$w:=\sum_ia_if_i.$$ From $\|w\|^2=0,$ we deduce that $w=0,$ which (since $v\ne0$) proves that $f_1,\dots,f_n$ are linearly dependent.
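The $\Rightarrow$ direction can be watched numerically (a sketch with an assumed example family: $M$ below is the hand-computed Gram matrix of $f_1 = 1$, $f_2 = x$, $f_3 = 1+x$ on $[0,1]$). An SVD extracts a kernel vector $v$ of the singular $M$, and the combination $w = \sum_i a_i f_i$ it defines vanishes identically.

```python
import numpy as np

# Assumed example: exact Gram matrix of f1 = 1, f2 = x, f3 = 1 + x on [0, 1],
# e.g. m_13 = ∫(1 + x) dx = 3/2. The family is dependent (f3 = f1 + f2).
M = np.array([[1.0, 1/2, 3/2],
              [1/2, 1/3, 5/6],
              [3/2, 5/6, 7/3]])

_, s, Vt = np.linalg.svd(M)
v = Vt[-1]                          # right-singular vector for the smallest σ
print(s[-1])                        # ≈ 0: M is singular, so v spans its kernel

x = np.linspace(0, 1, 5)
w = v[0] * np.ones_like(x) + v[1] * x + v[2] * (1 + x)
print(w)                            # ≈ 0 at every sample: Σ a_i f_i is the zero function
```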
$\Leftarrow:$ if $f_1,\dots,f_n$ are linearly dependent, then $\sum_ja_jf_j=0$ for some non-zero $v=\begin{pmatrix}a_1\\\vdots\\a_n\end{pmatrix}\in M_{n,1}(\Bbb R).$ Then, $f_i\cdot\sum_ja_jf_j=0$ for all $i,$ i.e. $Mv=0,$ which (since $v\ne0$) proves that $M$ is singular.
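The $\Leftarrow$ direction is just as easy to check numerically (again a sketch with an assumed example family): for $f_1 = 1$, $f_2 = x$, $f_3 = 1+x$ on $[0,1]$, the relation $f_1 + f_2 - f_3 = 0$ gives the coefficient vector $v = (1, 1, -1)^T$, and indeed $Mv = 0$.

```python
import numpy as np

# Assumed example: exact Gram matrix of f1 = 1, f2 = x, f3 = 1 + x on [0, 1]
# (e.g. m_33 = ∫(1 + x)² dx = 7/3); the relation f1 + f2 - f3 = 0 holds.
M = np.array([[1.0, 1/2, 3/2],
              [1/2, 1/3, 5/6],
              [3/2, 5/6, 7/3]])

v = np.array([1.0, 1.0, -1.0])       # coefficients of f1 + f2 - f3 = 0
print(M @ v)                          # ≈ (0, 0, 0): v lies in the kernel of M
print(np.linalg.det(M))               # ≈ 0: M is singular, as claimed
```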