I am currently familiarising myself with spectral methods for solving differential equations, and I am confused about how derivatives are calculated. Let $f(x)$ be a function and let $f_N(x)$ be an interpolating polynomial that approximates $f$. One has
$$f(x) \approx f_N(x) = \sum_{i = 0}^N f(x_i) L_i(x) \quad ,$$
where the $L_i$ are Lagrange polynomials with nodes $x_i$, taken to be the extrema of the Chebyshev polynomial $T_N(x)$. Now my lecture notes state that when calculating $f'(x)$, it is numerically faster not (!) to go via
$$ f_N'(x) = \sum_{i=0}^N f(x_i) L_i'(x) $$
but rather to transform into spectral space (via an FFT?),
$$f_N(x) = \sum_{k=0}^N \lambda_k T_k(x)$$
and calculate the coefficients $\mu_k$ such that
$$f_N'(x) = \sum_{k=0}^N \lambda_k T_k'(x) \overset{!}{=} \sum_{k=0}^N \mu_k T_k(x)$$
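To make this concrete, here is a minimal numerical sketch of the two transforms involved (my own check, not from the notes; it assumes the nodes $x_j = \cos(\pi j / N)$, for which the values-to-coefficients map is a type-I DCT, and uses NumPy's Chebyshev class to evaluate the derivative):

```python
import numpy as np
from scipy.fft import dct

# Sketch: Chebyshev-Gauss-Lobatto setup, assuming x_j = cos(pi*j/N),
# i.e. the extrema of T_N, and the expansion f_N(x) = sum_k lambda_k T_k(x).
N = 8
jj = np.arange(N + 1)
x = np.cos(np.pi * jj / N)            # nodes, running from +1 down to -1

f = np.exp(x)                          # test function with known derivative

# values -> coefficients lambda_k: a type-I DCT plus endpoint scaling
lam = dct(f, type=1) / N
lam[0] /= 2
lam[-1] /= 2

# reference: evaluate f_N'(x_j) = sum_k lambda_k T_k'(x_j) directly
fN = np.polynomial.chebyshev.Chebyshev(lam)
fprime_nodes = fN.deriv()(x)

# derivative values -> coefficients mu_k, again via the DCT
mu = dct(fprime_nodes, type=1) / N
mu[0] /= 2
mu[-1] /= 2

# sanity check: sum_k mu_k T_k(x) should approximate f'(x) = exp(x)
err = np.max(np.abs(np.polynomial.chebyshev.Chebyshev(mu)(x) - np.exp(x)))
print(err)
```

Since the derivative of the interpolant has degree $N-1$, the last coefficient $\mu_N$ comes out at round-off level.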
To this end I need to calculate $T_k'(x)$. This can be done easily. I calculated:
$$\mu_i = D_{ij} \lambda_j \quad , \qquad D_{ij} = c_i \sum_{k=0}^N T_i(x_k) T_j'(x_k)$$
where $c_i$ is a normalisation constant that differs between the cases $i = 0$ and $i \neq 0$. However, I do not see how this is numerically faster than going via the Lagrange polynomials, since the $\mu_i$ still have to be computed by a matrix multiplication -- with a matrix that is (as far as I can tell) not tridiagonal or otherwise banded in a way that would suggest the use of, e.g., the Thomas algorithm.
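To look at the structure of $D$ explicitly, one can build it column by column by differentiating each basis polynomial $T_j$ (a sketch using `numpy.polynomial.chebyshev.chebder` for the derivative coefficients; this bypasses my quadrature formula for $D_{ij}$, so it may differ from it in the last row):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Sketch: build the coefficient-space differentiation matrix D (mu = D @ lam)
# column by column: column j holds the Chebyshev coefficients of T_j'.
N = 8
D = np.zeros((N + 1, N + 1))
for jcol in range(N + 1):
    e = np.zeros(N + 1)
    e[jcol] = 1.0                      # coefficients of T_j
    d = C.chebder(e)                   # coefficients of T_j' (degree j-1)
    D[:len(d), jcol] = d

print((np.abs(D) > 1e-12).astype(int))  # sparsity pattern
```

This shows $D$ is strictly upper triangular (nonzero only for $j > i$ with $j - i$ odd), i.e. dense in the upper triangle rather than banded, consistent with the observation above.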