I read that the gradient is an example of a quantity that transforms covariantly, since $$\frac{\partial x^j}{\partial x'^i}$$ appears in the expression below instead of $$\frac{\partial x'^i}{\partial x^j}$$ This amounts to using the inverse of the Jacobian matrix that would be used to transform components in a contravariant context: $$\frac{\partial f}{\partial x'^i}=\sum_j \frac{\partial x^j}{\partial x'^i}\frac{\partial f}{\partial x^j}$$
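To make the transformation law concrete, here is a minimal numerical sketch (my own illustration, not from any source I read): it checks the formula above for a hypothetical scalar field $f(x,y)=x^2+y$ under the change from Cartesian to polar coordinates, where the primed coordinates are $(r,\theta)$.

```python
import math

# Hypothetical scalar field f(x, y) = x^2 + y, chosen only for illustration.
# Unprimed coordinates x^j = (x, y); primed coordinates x'^i = (r, theta).
r, theta = 2.0, 0.7
x, y = r * math.cos(theta), r * math.sin(theta)

# Gradient components in Cartesian coordinates: df/dx = 2x, df/dy = 1.
df_dx, df_dy = 2 * x, 1.0

# Jacobian entries dx^j / dx'^i for the map (r, theta) -> (x, y).
dx_dr, dy_dr = math.cos(theta), math.sin(theta)
dx_dth, dy_dth = -r * math.sin(theta), r * math.cos(theta)

# Covariant transformation: df/dx'^i = sum_j (dx^j/dx'^i) (df/dx^j).
df_dr_transformed = dx_dr * df_dx + dy_dr * df_dy
df_dth_transformed = dx_dth * df_dx + dy_dth * df_dy

# Direct computation from f written in polar coordinates:
# f(r, theta) = r^2 cos^2(theta) + r sin(theta).
df_dr_direct = 2 * r * math.cos(theta) ** 2 + math.sin(theta)
df_dth_direct = -2 * r**2 * math.cos(theta) * math.sin(theta) + r * math.cos(theta)

# Both routes agree, as the covariant transformation law predicts.
assert abs(df_dr_transformed - df_dr_direct) < 1e-12
assert abs(df_dth_transformed - df_dth_direct) < 1e-12
```

The same check works for any smooth $f$ and any invertible coordinate change; only the Jacobian entries change.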
I also read that

> Specifically, every vector has both contravariant and covariant components that transform in predictable ways.
So does this mean it is meaningful to express the gradient in contravariant components with covariant basis vectors? If so, is that useful at all, and if not, why not? And how does one derive the transformation rules in that context?