Let $k\cdot x$ and $n\cdot x$ be the scalar products of the 3-component vectors $k=(k_1,k_2,k_3)$, $n=(n_1,n_2,n_3)$, and $x=(x_1,x_2,x_3)$. Define two functions $f,g$ which are related via a partial derivative,
$$ g(k\cdot x)=\int^{k\cdot x}\!f(\xi)\, \text{d}\xi, \quad \frac{\partial g}{\partial x_i} =k_i\,f(k\cdot x)$$
My question is: can I define another function $h(k\cdot x)$ whose partial derivative with respect to $x_i$ produces the component $n_i$ of the new vector as the coefficient of $f(k\cdot x)$? That is, I want to find $h$ such that
$$ \frac{\partial h}{\partial x_i} =n_i\,f(k\cdot x)$$
I want to do something like introducing a rotation matrix $R$ which rotates the vector $n$ onto $k$ and write
$$ h(k\cdot x)=\int^{n\cdot x}\!f(R\xi)\, \text{d}\xi$$
However, I'm not sure if this is a valid operation. Can anyone elucidate, or come up with a different way of doing it?
Take $n_1 = 0$, $n_2 = 0$, $n_3 = 1$; then $$ \frac{\partial h}{\partial x_1} = 0 $$
$$ \frac{\partial h}{\partial x_2} = 0 $$
From the first equation you conclude that $h$ does not depend on $x_1$, and from the second that $h$ does not depend on $x_2$, so we conclude that $$ \frac{\partial h(x_3)}{\partial x_3} = f(k\cdot x). $$ The l.h.s. depends only on $x_3$, while the r.h.s. generally depends on all of $x_1, x_2, x_3$ (whenever $f$ is nonconstant and $k_1, k_2$ are not both zero). A contradiction.
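The contradiction above can be checked symbolically. This is a minimal sketch with illustrative choices not taken from the post: $f(\xi)=\sin\xi$, $k=(1,2,3)$, and $n=(0,0,1)$ as in the argument. If $h$ existed, the field $v_i = n_i\,f(k\cdot x)$ would be a gradient, so its mixed partials would have to agree; they do not.

```python
# Sketch: verify that v_i = n_i f(k.x) is not a gradient field
# for n = (0, 0, 1) and a generic k. Choices of f and k are
# illustrative assumptions, not from the original post.
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
k = (1, 2, 3)              # a k with all components nonzero
n = (0, 0, 1)              # the n used in the argument above
f = sp.sin                 # any nonconstant f will do

kx = k[0]*x1 + k[1]*x2 + k[2]*x3
v = [ni * f(kx) for ni in n]   # candidate gradient field v_i = n_i f(k.x)

# If v = grad(h) for some h, mixed partials must agree:
# d v_3 / d x_1 must equal d v_1 / d x_3.
mismatch = sp.simplify(sp.diff(v[2], x1) - sp.diff(v[0], x3))
print(mismatch)            # nonzero, so no such h exists
```

The nonzero mismatch is exactly the failure of the integrability condition $\partial_j(n_i f) = \partial_i(n_j f)$, which is another way to see the contradiction.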
The question is then: for which $n$ is there a solution? Assume there are two linearly independent vectors $n, m$ admitting solutions $h_n, h_m$. Then it is possible to combine $n$ and $m$ linearly so that one coordinate of the combination is zero and another is nonzero. If $k_i$ is nonzero for all $i$, the same argument as above again gives a contradiction, so the only possible $n$ is of the form $c\,k$. If, say, $k_1 = 0$ while $n_1$ is nonzero, then the first equation would give $$ h = x_1\, n_1\, f(k\cdot x) + c(x_2,x_3), $$ and differentiating $h$ in a coordinate $x_j$ with nonzero $k_j$ would make the l.h.s. depend on $x_1$ but not the r.h.s., again a contradiction. Hence every admissible $n$ lies in the linear span of $k$; and for $n = c\,k$ one can simply take $h = c\,g$.