Let $x_0, \dots , x_n$ be the roots of the Chebyshev polynomial $T_{n+1}(x)$.
We define:
$$A=\begin{pmatrix} \frac{1}{\sqrt2}T_0(x_0) & \cdots & \frac{1}{\sqrt2}T_0(x_n) \\ T_1(x_0) & \cdots & T_1(x_n) \\ \vdots & \vdots& \vdots \\ T_n(x_0) & \cdots & T_n(x_n) \\ \end{pmatrix}$$
Is $A$ invertible? If so, calculate $A^{-1}$.
I have worked out the case where the $x_k$ are the roots of $T_2(x)$, and in that case the matrix is invertible.
How can I generalize this?
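(For concreteness, the $T_2$ case works out as follows. The roots of $T_2$ are $x_k = \cos\frac{(2k+1)\pi}{4}$, i.e. $x_0 = \tfrac{1}{\sqrt2}$ and $x_1 = -\tfrac{1}{\sqrt2}$, and since $T_1(x) = x$,
$$A = \begin{pmatrix} \tfrac{1}{\sqrt2} & \tfrac{1}{\sqrt2} \\ \tfrac{1}{\sqrt2} & -\tfrac{1}{\sqrt2} \end{pmatrix}, \qquad \det A = -\tfrac12 - \tfrac12 = -1 \neq 0,$$
so $A$ is indeed invertible here.)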
Since $T_0=1$, the first row of $A$ is constant: its $k$-th entry is $\frac{1}{\sqrt2}T_0(x_k)=\frac{1}{\sqrt2}$, evaluated at the root $x_k$ (not all at $x_0$), just like the other rows. The invertibility of the matrix follows directly from a discrete orthogonality condition for the $T$'s, namely
$$ \sum_{k=0}^{N-1}T_i(x_k)T_j(x_k)=\begin{cases} 0 & i\neq j, \\ N & i=j=0, \\ N/2 & i = j \neq 0, \end{cases} $$
which holds provided that $N>\max(i,j)$; here $N = n+1$, since the $x_k$ are the $n+1$ roots of $T_{n+1}$. Each of these orthogonality conditions is realized by taking the dot product of two rows of the matrix above: the dot product of any two distinct rows is zero, so the rows are mutually orthogonal, hence linearly independent, and the matrix is invertible.

This also answers the second part of the question. The factor $\frac{1}{\sqrt2}$ in the first row is exactly the normalization that makes all diagonal entries agree: the dot product of row $0$ with itself becomes $\frac12 \cdot N = N/2$ as well. Therefore
$$ AA^T=\frac{n+1}{2}\,I, \qquad\text{so}\qquad A^{-1}=\frac{2}{n+1}\,A^T. $$
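As a quick sanity check of the orthogonality argument, here is a short NumPy sketch (the function name `chebyshev_matrix` is mine) that builds $A$ from the closed forms $x_k=\cos\frac{(2k+1)\pi}{2(n+1)}$ and $T_j(\cos\theta)=\cos(j\theta)$, and verifies numerically that $AA^T=\frac{n+1}{2}I$, i.e. that $A^{-1}=\frac{2}{n+1}A^T$:

```python
import numpy as np

def chebyshev_matrix(n):
    """Build the (n+1) x (n+1) matrix A with rows T_j(x_k),
    where x_k are the roots of T_{n+1} and row 0 is scaled by 1/sqrt(2)."""
    k = np.arange(n + 1)
    # roots of T_{n+1}: x_k = cos((2k+1) * pi / (2(n+1)))
    theta = (2 * k + 1) * np.pi / (2 * (n + 1))
    j = np.arange(n + 1)[:, None]       # row index = polynomial degree
    A = np.cos(j * theta[None, :])      # T_j(x_k) = cos(j * theta_k)
    A[0, :] /= np.sqrt(2)               # normalize the constant row
    return A

n = 5
A = chebyshev_matrix(n)
print(np.allclose(A @ A.T, (n + 1) / 2 * np.eye(n + 1)))  # True
print(np.allclose(np.linalg.inv(A), 2 / (n + 1) * A.T))   # True
```

The same check passes for any $n$, since each off-diagonal entry of $AA^T$ is one of the zero sums in the orthogonality relation above.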
Related: Prove that the $T_n$ satisfy $ \sum_{k=0}^{N-1}{T_i(x_k)T_j(x_k)} = \begin{cases} 0 &: i\ne j \\ l\neq 0 &: i=j \end{cases} $