How to show that this symmetric matrix is invertible?


This problem is linked to a physical problem of a linear system (structural analysis of a beam with $n+2$ supports), so this matrix $A$ should be invertible. The matrix is $n\times n$ and has elements $a_{i,j}$ given by $$a_{i,j}=\int_0^L\phi_i(x)\phi_j(x)\,dx$$ for $i,j=1,2,\dots,n$, where $L$ is the length of the beam and $$\phi_i(x)=\begin{cases} \dfrac{x(L-x_i)}{L}, & x\leq x_i\\[4pt] \dfrac{x_i(L-x)}{L}, & x>x_i. \end{cases}$$

Here the $x_i$ are the positions of the "internal supports", with $0<x_1<x_2<\dots<x_n<L$. I already see that $A$ is symmetric, which means that it is diagonalizable. So, if I could prove that $A$ is positive definite, then it would have all eigenvalues positive; since $A$ is diagonalizable, its determinant is the product of its eigenvalues, hence nonzero, which means $A$ is invertible.
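Before looking for a proof, the claim can be sanity-checked numerically. The sketch below (support positions and $L=1$ are assumed for illustration, not taken from the problem) builds $A$ with a Riemann-sum quadrature and confirms it is symmetric with all eigenvalues positive:

```python
import numpy as np

# Assumed setup for illustration: L = 1, four internal supports.
L = 1.0
xs = np.array([0.2, 0.4, 0.6, 0.8])        # 0 < x_1 < ... < x_n < L
x = np.linspace(0.0, L, 200001)
dx = x[1] - x[0]

# phi_i(x) = x(L - x_i)/L for x <= x_i, and x_i(L - x)/L for x > x_i.
Phi = np.array([np.where(x <= xi, x * (L - xi) / L, xi * (L - x) / L)
                for xi in xs])

# A_ij = integral_0^L phi_i(x) phi_j(x) dx, approximated by a Riemann sum.
A = Phi @ Phi.T * dx

print(np.allclose(A, A.T))                  # → True (symmetric)
print(np.all(np.linalg.eigvalsh(A) > 0))    # → True (positive definite)
```

A numerical check is of course no proof, but it makes the positive-definiteness conjecture worth pursuing.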


2 Answers


Yes, $A$ is positive-definite for "reasonable" basis functions $\phi$.

Hint: for any vector $v$,

\begin{align*} v^TAv &= \sum_{i,j} v_i \left(\int_0^L \phi_i(x)\phi_j(x)\,dx \right)v_j \\ &= \int_0^L \sum_{i,j} v_i\phi_i(x)v_j\phi_j(x)\,dx\\ &= \int_0^L \left(\sum_i v_i\phi_i(x)\right)^2\,dx \end{align*}

and can you take it from here? You should now also see what conditions on the $\phi_i$ are necessary for $A$ to be positive-definite.

Stepping back, you should recognize that the matrix $A$ represents the $L^2$ inner product on the finite-dimensional function space spanned by the basis elements $\phi_i$; such a Gram matrix is always positive semi-definite, and it is positive-definite exactly when the $\phi_i$ are linearly independent.
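The hinted identity can also be verified numerically. This sketch (again with $L=1$ and assumed support positions) checks that the quadratic form $v^TAv$ equals the squared $L^2$ norm of $\sum_i v_i\phi_i$ for a random $v$:

```python
import numpy as np

# Assumed setup for illustration: L = 1, three internal supports.
L = 1.0
xs = np.array([0.25, 0.5, 0.75])
x = np.linspace(0.0, L, 100001)
dx = x[1] - x[0]

Phi = np.array([np.where(x <= xi, x * (L - xi) / L, xi * (L - x) / L)
                for xi in xs])
A = Phi @ Phi.T * dx                        # A_ij = <phi_i, phi_j>

rng = np.random.default_rng(1)
v = rng.standard_normal(len(xs))            # a random test vector
lhs = v @ A @ v                             # quadratic form v^T A v
rhs = np.sum((v @ Phi) ** 2) * dx           # ∫ (sum_i v_i phi_i)^2 dx
print(np.isclose(lhs, rhs), lhs > 0)        # → True True
```

The two quantities agree to floating-point accuracy, and the form is strictly positive for this nonzero $v$.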


Let $L=1$, take $0<x_{1}<x_{2}<\dots<x_{n}<1$, and consider the $n$ functions $$\phi_{i}(x)=\begin{cases} x(1-x_{i}), & x\leq x_{i}\\ x_{i}(1-x), & x>x_{i} \end{cases}=x\left(\tfrac{1}{2}-x_{i}\right)+\tfrac{1}{2}x_{i}-\tfrac{1}{2}\vert x-x_{i}\vert.$$

The functions $\{\phi_i\}$ are linearly independent. Were they linearly dependent, some $\phi_k$ would be a linear combination of the others; every $\phi_j$ with $j\neq k$ is differentiable at $x=x_k$, so that combination would be differentiable at $x=x_k$, contradicting the fact that $\phi_k$ itself is not differentiable there.

Introducing the inner product as in your question, we then make use of the key property of the Gram matrix $G_{ij}=\langle \phi_i, \phi_j \rangle$: "a set of vectors is linearly independent if and only if the Gram determinant (the determinant of the Gram matrix) is non-zero" (https://en.wikipedia.org/wiki/Gramian_matrix). Hence $A$ has nonzero determinant and is invertible.
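This Gram-determinant criterion can be checked numerically as well. The sketch below uses the closed form from this answer with $L=1$ and assumed support positions, and confirms the Gram determinant is strictly positive:

```python
import numpy as np

# Assumed setup for illustration: L = 1, three internal supports.
xs = np.array([0.3, 0.5, 0.9])
x = np.linspace(0.0, 1.0, 100001)
dx = x[1] - x[0]

# Closed form from the answer: phi_i(x) = x(1/2 - x_i) + x_i/2 - |x - x_i|/2.
Phi = np.array([x * (0.5 - xi) + 0.5 * xi - 0.5 * np.abs(x - xi)
                for xi in xs])

G = Phi @ Phi.T * dx                        # Gram matrix G_ij = <phi_i, phi_j>
print(np.linalg.det(G) > 0)                 # → True (nonzero: independence)
```

A positive Gram determinant is consistent with the linear-independence argument above, so the matrix in the question is invertible for this configuration.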