Gradient vector fields are useful in many applications. They do not, however, generate a Lie algebra; in particular, the composition of two gradient fields is in general not a gradient. I know that a differentiable vector field $X:\mathbb{R}^n\rightarrow\mathbb{R}^n$ is a gradient if and only if its Jacobian is symmetric.
This means that the family of gradient fields is quite small. I've seen results about classifying the vector fields that can be approximated arbitrarily well by a gradient vector field on a compact set.
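To illustrate why composition leaves the class of gradients, here is a small symbolic sanity check of the symmetric-Jacobian criterion, with potentials of my own choosing ($g(x,y)=x^2y$, $f(u,v)=uv$ are just convenient examples, not taken from any reference):

```python
import sympy as sp

x, y = sp.symbols('x y')

# Example potentials (my own choice): g(x, y) = x^2 * y, f(u, v) = u * v
g = x**2 * y
grad_g = [sp.diff(g, x), sp.diff(g, y)]   # (2*x*y, x**2)

# grad f(u, v) = (v, u), so (grad f o grad g)(x, y) = (x**2, 2*x*y)
X = [grad_g[1], grad_g[0]]

# Jacobian of the composed field X = grad f o grad g
J = sp.Matrix([[sp.diff(Xi, var) for var in (x, y)] for Xi in X])
print(J)                 # [[2*x, 0], [2*y, 2*x]]
print(J.is_symmetric())  # False: the composition is not a gradient
```

Each factor $\nabla g$ and $\nabla f$ has a symmetric Jacobian, yet the Jacobian of the composition is lower triangular and non-symmetric, so the composed field is not a gradient.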
I therefore have three questions:
- Are there results about which classes of vector fields can be represented exactly as the composition of two gradients? That is, what properties must $X:\mathbb{R}^n\rightarrow\mathbb{R}^n$ have in order to satisfy $X(x) = \nabla f\circ \nabla g(x)$ for every $x\in E$, where $E\subset\mathbb{R}^n$ is compact, for some $f,g:\mathbb{R}^n\rightarrow\mathbb{R}$?
- What if the requirement is relaxed to finding those $X$ that, for a fixed $\varepsilon>0$, satisfy $\max_{x\in E}\|X(x) - \nabla f\circ \nabla g(x)\|<\varepsilon$?
- If either of the two questions above has been investigated somewhere, has the extension to more compositions, such as $\nabla f\circ\nabla g\circ\nabla h(x)$, also been studied?
It may be that this problem has simply never attracted interest, but I find it mathematically interesting, which is why I am asking. The approximation power gained by composing functions is often surprising, so I expect that already with two compositions the space of approximable vector fields grows considerably.