Suppose $S(\mathbf{x})$ is a skew-symmetric $(k+1)$-tensor, that is, $S_{i_0,...,i_a,...,i_b,...,i_{k}}(\mathbf{x})=-S_{i_0,...,i_b,...,i_a,...,i_{k}}(\mathbf{x})$ for every pair of indices $0\le a<b\le k$. The task is to find $S(\mathbf{x})$ given the equation $$S(\mathbf{x})(\mathbf{h}^{[1]}(\mathbf{x}),...,\mathbf{h}^{[k]}(\mathbf{x}))=\mathbf{f}(\mathbf{x}),$$ where $\mathbf{h}^{[i]},\mathbf{f}\in\mathbb{R}^n$ are vectors. Using the summation convention and dropping the explicit dependence on $\mathbf{x}$, this reads in components as $$S_{i_0,i_1,...,i_k}\prod_{\alpha=1}^k h^{[\alpha]}_{i_\alpha}=f_{i_0},$$ for $i_0=1,...,n$. The question is: what are the components $S_{i_0,i_1,...,i_k}$ in terms of $h^{[\alpha]}_{i_\alpha}$ and $f_{i_0}$? I know that in general there are infinitely many solutions, so the full answer will depend on some free parameters. However, any particular solution that covers the general case would be very useful.
Example: Take the $k=1$ case $$S_{i_0,i_1}h^{[1]}_{i_1}=f_{i_0},$$ where $S$ is a skew-symmetric matrix. Skew-symmetry forces the compatibility condition $$(\mathbf{h}^{[1]})^TS \,\mathbf{h}^{[1]}=(\mathbf{h}^{[1]})^T\mathbf{f} = 0,$$ i.e. $\mathbf{f}$ must be orthogonal to $\mathbf{h}^{[1]}$. Given that, $$S = \frac{\mathbf{f}\,(\mathbf{h}^{[1]})^T - \mathbf{h}^{[1]}\mathbf{f}^T}{(\mathbf{h}^{[1]})^T\mathbf{h}^{[1]}}$$ is a solution, since $$S\,\mathbf{h}^{[1]} = \mathbf{f} - \mathbf{h}^{[1]}\,\frac{\mathbf{f}^T\mathbf{h}^{[1]}}{\|\mathbf{h}^{[1]}\|^2} = \mathbf{f}.$$ (The order of the outer products matters: with the two terms swapped one gets $-\mathbf{f}$.)
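A quick numerical sanity check of this $k=1$ particular solution (the variable names are mine; note that the ordering $\mathbf{f}(\mathbf{h}^{[1]})^T - \mathbf{h}^{[1]}\mathbf{f}^T$ is the one that returns $+\mathbf{f}$):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
h = rng.standard_normal(n)

# f must be orthogonal to h for a solution to exist;
# project a random vector onto the orthogonal complement of h.
f = rng.standard_normal(n)
f -= (f @ h) / (h @ h) * h

# Particular solution S = (f h^T - h f^T) / (h^T h)
S = (np.outer(f, h) - np.outer(h, f)) / (h @ h)

assert np.allclose(S, -S.T)   # skew-symmetry
assert np.allclose(S @ h, f)  # S h = f
```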
Can someone generalise this for general $k$?
Whether you have a solution at all depends on your $\mathbf{h}^{[\bullet]}$ and $\mathbf{f}$.
For example, if $k=1$ and $n=2$, a skew-symmetric $(k+1)$-tensor is just a skew-symmetric matrix and, as you already realized, your problem amounts to finding a skew-symmetric matrix $\mathbf{S}$ such that $\mathbf{S} \cdot \mathbf{h}^{[1]} = \mathbf{f}$. In two dimensions every such matrix has the form $$\mathbf{S} = \begin{pmatrix} 0 & a \\ -a & 0\end{pmatrix},$$ so $\mathbf{S}\,\mathbf{h}^{[1]} = a\,(h^{[1]}_2,\, -h^{[1]}_1)^T$, which is always orthogonal to $\mathbf{h}^{[1]}$. Hence you can solve this only for $\mathbf{f}$ orthogonal to $\mathbf{h}^{[1]}$.
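The obstruction in this $2\times 2$ case is easy to see numerically; the particular values of $a$ and $\mathbf{h}^{[1]}$ below are arbitrary choices of mine:

```python
import numpy as np

a = 1.7
S = np.array([[0.0, a],
              [-a, 0.0]])
h = np.array([2.0, 3.0])

# S h = a * (h_2, -h_1), which is orthogonal to h for any a,
# so S h = f is solvable only when f is orthogonal to h.
assert np.allclose(S @ h, a * np.array([h[1], -h[0]]))
assert np.isclose((S @ h) @ h, 0.0)
```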
In higher dimensions, set $v_{0} := \mathbf{f},\ v_1 := \mathbf{h}^{[1]}, \dots, v_{k} := \mathbf{h}^{[k]}$ and suppose these vectors are pairwise orthogonal and non-zero. Then $$ S = \frac{1}{\prod_{\alpha=1}^k \|\mathbf{h}^{[\alpha]}\|^2} \sum_{\sigma \in S_{k+1}} \operatorname{sgn}(\sigma) \, \left(v_{\sigma(0)} \otimes \cdots \otimes v_{\sigma(k)}\right) $$ is a solution: contracting with $\mathbf{h}^{[1]}, \dots, \mathbf{h}^{[k]}$, orthogonality kills every term except the identity permutation, which contributes $\mathbf{f} \prod_{\alpha=1}^k \|\mathbf{h}^{[\alpha]}\|^2$. For $k=1$ this reduces to $S=(\mathbf{f}\,(\mathbf{h}^{[1]})^T - \mathbf{h}^{[1]}\mathbf{f}^T)/\|\mathbf{h}^{[1]}\|^2$. Probably something similar works whenever $\mathbf{f}$ is orthogonal to the span of $\mathbf{h}^{[1]}, \dots, \mathbf{h}^{[k]}$ and the latter vectors are linearly independent: I would guess you then have to replace $\mathbf{h}^{[1]}, \dots, \mathbf{h}^{[k]}$ by their Gram-Schmidt orthogonalization before applying the above formula.
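The orthogonal case can be checked numerically. Here is a sketch for $k=2$, $n=4$ (the dimensions, the QR construction of the orthogonal vectors, and the helper `sgn` are my choices, not part of the question), building the alternating sum scaled by $\prod_\alpha \|\mathbf{h}^{[\alpha]}\|^{-2}$ and contracting it back against the $\mathbf{h}^{[\alpha]}$:

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
n, k = 4, 2

# Orthogonal f, h^[1], h^[2] of different lengths, via QR.
Q, _ = np.linalg.qr(rng.standard_normal((n, k + 1)))
f = 2.0 * Q[:, 0]
hs = [3.0 * Q[:, 1], 0.5 * Q[:, 2]]
vs = [f] + hs  # v_0 = f, v_1.. = h's

def sgn(p):
    # sign of a permutation given as a tuple, by counting inversions
    s = 1
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            if p[i] > p[j]:
                s = -s
    return s

# S = (1 / prod ||h||^2) * sum_sigma sgn(sigma) v_{sigma(0)} x ... x v_{sigma(k)}
S = np.zeros((n,) * (k + 1))
for p in itertools.permutations(range(k + 1)):
    term = vs[p[0]]
    for i in p[1:]:
        term = np.multiply.outer(term, vs[i])
    S += sgn(p) * term
S /= np.prod([h @ h for h in hs])

# Contracting S with h^[1], h^[2] should recover f.
out = np.einsum('ijk,j,k->i', S, hs[0], hs[1])
assert np.allclose(out, f)
assert np.allclose(S, -np.swapaxes(S, 0, 1))  # skew in the first index pair
```

Only the identity permutation survives the contraction because $v_0 \cdot \mathbf{h}^{[\alpha]} = 0$ and $v_\beta \cdot \mathbf{h}^{[\alpha]} = \|\mathbf{h}^{[\alpha]}\|^2 \delta_{\alpha\beta}$, which is exactly why the scaling factor is $\prod_\alpha \|\mathbf{h}^{[\alpha]}\|^2$.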