By a Stone–Weierstrass argument, odd continuous functions on $[-R,R]$ can be uniformly approximated by polynomials in odd powers of $x$. This means e.g. that we can approximate $x^2$ on $[0,R]$ by an odd polynomial (extend $x^2$ to the odd function $x|x|$ on $[-R,R]$).
Suppose that for fixed $k$ we want to find the coefficients $\alpha_i$ that minimise the $L^\infty$ distance from $\sum_{i=1}^{k} \alpha_i x^{2i-1}$ to $x^2$ on $[0,R]$; these coefficients will be functions of $R$. For instance, I believe the first two approximations are: $$f_1(x) = (2\sqrt{2} - 2)R x$$ $$f_2(x) = \frac{1}{\sqrt{3} R} x^3 + \frac{R}{\sqrt{3}} x$$ To find these, I essentially waded through the messy algebra. I wonder whether there are observations that would make it easy to solve for the best coefficients, or whether all hope is lost and they become non-algebraic for the higher approximations. I have been trying, but I keep getting lost in the algebra.
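For what it's worth, the claimed $f_1$ can be sanity-checked numerically. The sketch below (Python with numpy, taking $R=1$) scans a grid of candidate slopes $\alpha$ and picks the one minimising the sup-norm error; it lands on $2\sqrt2-2\approx0.8284$:

```python
import numpy as np

# Sanity check of f_1 at R = 1: scan candidate slopes alpha and minimise
# the sup-norm error max_{x in [0,1]} |x^2 - alpha*x|.
x = np.linspace(0.0, 1.0, 1001)
alphas = np.linspace(0.5, 1.2, 7001)
# sup-norm error of each candidate alpha, via broadcasting (alphas x grid)
errs = np.abs(x[None, :] ** 2 - alphas[:, None] * x[None, :]).max(axis=1)
best = alphas[np.argmin(errs)]
print(best, 2 * np.sqrt(2) - 2)  # both come out ~ 0.8284
```

Since the error is the maximum of functions affine in $\alpha$, it is convex in $\alpha$, so a grid scan is a reliable (if crude) way to locate the minimiser.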
The solution to the $\,L^\infty\,$ distance problem involves the Chebyshev polynomials of the first kind $\,T_n(x).\,$ As the Wikipedia article on Chebyshev polynomials notes, truncating a function's Chebyshev series expansion yields a polynomial that is close to the best polynomial approximation of that degree under the maximum norm.
Thus, the answer to your problem is to expand the function in an infinite series of Chebyshev polynomials with odd indices. By truncating this infinite series we get a near-best polynomial approximation of a given degree under the maximum norm — close to, though not in general identical with, the true minimax solution.
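A quick check (Python with numpy, taking $R=1$) of how close the truncation comes to the minimax optimum: the one-term truncation has slope $8/(3\pi)\approx0.849$, while your $f_1$ has slope $2\sqrt2-2\approx0.828$, and their sup-norm errors differ only slightly:

```python
import numpy as np

# Compare the one-term truncation of the Chebyshev series (slope 8/(3*pi),
# taking R = 1) against the exact minimax slope 2*sqrt(2) - 2 from the
# question: truncation is close to, but not exactly, the minimax optimum.
x = np.linspace(0.0, 1.0, 100001)
errs = {}
for name, beta in [("truncated", 8 / (3 * np.pi)),
                   ("minimax", 2 * np.sqrt(2) - 2)]:
    errs[name] = np.abs(x ** 2 - beta * x).max()
    print(name, beta, errs[name])  # sup errors ~ 0.180 and ~ 0.172
```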
To determine the coefficients in the Chebyshev series we can use the orthogonality of Chebyshev polynomials. That is, if $$ \langle f(x), g(x)\rangle := \frac4\pi \int_0^1 \frac{f(x)\,g(x)}{\sqrt{1-x^2}}\, dx \tag1 $$ then $\, \langle T_n(x), T_m(x)\rangle = 0\,$ whenever $\,n\ne m\,$ have the same parity (which is all we need here, since only odd indices occur), while $\, \langle T_n(x), T_n(x)\rangle = 1\,$ if $\,n\gt 0\,$ and $\, \langle T_0(x), T_0(x)\rangle = 2.\,$ Using these facts we find $$ x^2 = \frac{8R^2}{\pi} \left(\frac{T_1(x/R)}{3} + \frac{T_3(x/R)}{15} - \frac{T_5(x/R)}{105} + \dots\right). \tag2 $$ The general series using OEIS sequence A061550 is $$ x^2 = \frac{8R^2}{\pi} \left(\sum_{n=0}^\infty \frac{(-1)^{n+1} T_{2n+1}(x/R)}{(2n-1)(2n+1)(2n+3)} \right). \tag3 $$
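The first few coefficients can be verified numerically. The sketch below (Python with numpy, $R=1$) evaluates the inner product $(1)$ after the substitution $x=\cos\theta$, which turns it into $\frac4\pi\int_0^{\pi/2} f(\cos\theta)\cos(n\theta)\,d\theta$ and removes the endpoint singularity:

```python
import numpy as np

# Verify the first odd-index coefficients of x^2 on [0, 1] against
# 8/(3*pi), 8/(15*pi) and -8/(105*pi), via the substitution x = cos(theta):
#   <x^2, T_n> = (4/pi) * int_0^{pi/2} cos(theta)^2 * cos(n*theta) d(theta)
theta, dt = np.linspace(0.0, np.pi / 2, 200001, retstep=True)
f = np.cos(theta) ** 2  # x^2 with x = cos(theta) in [0, 1]

def coeff(n):
    vals = f * np.cos(n * theta)
    # composite trapezoid rule
    return (4 / np.pi) * dt * (vals.sum() - 0.5 * (vals[0] + vals[-1]))

for n, exact in [(1, 8 / (3 * np.pi)),
                 (3, 8 / (15 * np.pi)),
                 (5, -8 / (105 * np.pi))]:
    print(n, coeff(n), exact)
```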