Suppose $f:[-1,1]\to\mathbb{R}$ satisfies $|f(x)-f(y)|\leq |x-y|$ for all $x,y\in[-1,1]$. Then Jackson's theorem asserts that $$ \min_{\deg p \leq k } \| f - p \|_\infty \leq C k^{-1} $$ for some constant $C$ independent of $f$; i.e., the best polynomial approximation to $f$ in the infinity norm converges at least like $O(k^{-1})$.
What about the Chebyshev interpolant (or approximant) of $f$? I have seen a proof that the error is at most $C \log(k)\, k^{-1}$ (the extra $\log k$ coming from the Lebesgue constant of the Chebyshev points), but I haven't been able to find any example showing this logarithmic gap is necessary. Is there a simple function for which the Chebyshev interpolants don't converge as fast as $O(k^{-1})$? I know such a function must have a derivative of unbounded total variation, since Chebyshev interpolants of functions whose derivative has bounded total variation also converge like $O(k^{-1})$.
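For concreteness, here is a quick numerical sketch (my own illustration, not part of the question) showing that the obvious candidate $f(x)=|x|$ is *not* such an example: its derivative $\operatorname{sign}(x)$ has total variation $2$, and indeed $k \cdot \|f - p_k\|_\infty$ appears to stay bounded. It uses NumPy's `chebinterpolate`, which interpolates at Chebyshev points of the first kind.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def f(x):
    return np.abs(x)  # Lipschitz-1 on [-1, 1]; f' = sign(x) has total variation 2

xs = np.linspace(-1.0, 1.0, 10001)  # fine grid to estimate the sup norm
for k in [8, 16, 32, 64, 128]:
    coeffs = C.chebinterpolate(f, k)  # degree-k interpolant at k+1 Chebyshev points
    err = np.max(np.abs(f(xs) - C.chebval(xs, coeffs)))
    print(f"k = {k:4d}   error = {err:.3e}   k * error = {k * err:.3f}")
```

A genuine counterexample, if one exists, would have to make the analogous $k \cdot \text{error}$ column grow without bound.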