I have a vector-valued function of three independent variables with three outputs, which we can write as: $f_1(x_1,x_2,x_3)$, $f_2(x_1,x_2,x_3)$, $f_3(x_1,x_2,x_3)$.
The actual functions $f$ are the outputs of a FEM simulation, so I cannot write down their functional form and they are expensive to compute.
My question is the following: Is there a basis, or simply a vector in parameter space, along which only one of the functions, say $f_1$, changes?
I thought to compute the Jacobian, which tells me the effect of changing each $x_i$ on each $f_j$. My thought was that if I diagonalize the Jacobian, the eigenvectors should form a basis in which the above condition is met. That is, if the Jacobian is diagonal in some basis, then each of the new coordinates $x'_1, x'_2$, etc. affects only one of the $f_i$.
However, this did not seem to work -- when I moved along these eigenvectors, it seemed that one of the $f_i$ did NOT change, while the others did. This is rather the reverse of what I need, unfortunately.
Can this be done? Even if there is no exact solution, is there a way to find the vectors in parameter space along which, say, $f_1$ is mostly the only one that changes?
If $f_1$ is nice enough, its gradient points in the direction in which it changes the fastest. As a corollary (still assuming niceness), it doesn't change at all, to first order, in any direction normal to the gradient. Similarly for $f_2$.
This means that if you move in a direction normal both to the gradient of $f_1$ and to the gradient of $f_2$, neither of them should change value. In three dimensions, such a direction is given by the cross product of the two gradients (assuming they are linearly independent). Hopefully $f_3$ does change when moving in that direction.
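As a concrete sketch of this idea (using NumPy, with simple analytic stand-ins for the expensive FEM outputs, since the real ones are not available here): take the two gradients at a point, cross them, and check that $f_1$ and $f_2$ stay put along the resulting direction while $f_3$ moves.

```python
import numpy as np

# Hypothetical, cheap stand-ins for the FEM outputs, chosen so the
# gradients of f1 and f2 are known exactly at any point.
def f1(x): return x[0] + 2.0 * x[1]   # grad f1 = (1, 2,  0)
def f2(x): return x[1] - x[2]         # grad f2 = (0, 1, -1)
def f3(x): return x[0] * x[2]

g1 = np.array([1.0, 2.0, 0.0])   # gradient of f1 at the point of interest
g2 = np.array([0.0, 1.0, -1.0])  # gradient of f2 at the same point

# Direction normal to both gradients: f1 and f2 are stationary along it.
v = np.cross(g1, g2)
v /= np.linalg.norm(v)

x0 = np.array([1.0, 1.0, 1.0])
h = 1e-6  # small step along v

# First-order changes along v: f1 and f2 should be (nearly) unchanged,
# while f3 should show a genuine first-order change.
d1 = f1(x0 + h * v) - f1(x0)
d2 = f2(x0 + h * v) - f2(x0)
d3 = f3(x0 + h * v) - f3(x0)
```

For the real FEM outputs the gradients `g1` and `g2` would of course come from finite differences rather than known formulas, and the "unchanged" functions would only be constant to first order in the step size.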
Numerically, you need to evaluate the function at at least three surrounding points (in addition to the original point of interest) in order to find a finite-difference approximation to the gradient.
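A minimal forward-difference sketch of that gradient estimate (the function name and step size are illustrative; for a noisy FEM output the step `h` would need tuning, and a central difference at twice the cost would be more accurate):

```python
import numpy as np

def grad_fd(f, x, h=1e-5):
    """Forward-difference gradient: one extra evaluation per coordinate,
    i.e. three surrounding points for a function of three variables."""
    x = np.asarray(x, dtype=float)
    f0 = f(x)                 # value at the point of interest
    g = np.empty_like(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += h            # perturb one coordinate at a time
        g[i] = (f(xp) - f0) / h
    return g

# Sanity check on a smooth test function with a known gradient:
# grad = (2*x1, x3*cos(x2), sin(x2))
f = lambda x: x[0]**2 + np.sin(x[1]) * x[2]
x0 = np.array([1.0, 0.5, 2.0])
g = grad_fd(f, x0)
```

With the gradients of $f_1$ and $f_2$ in hand, the cross-product direction above follows directly, at a total cost of roughly seven evaluations per function (one base point plus three perturbed points each).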