Given $n$-vector $\bf c$, I would like to solve the following equation in $d$-vector $\bf a$ and $d \times n$ matrix $\bf M$
$$ {\bf a}^\top {\bf M} = {\bf c}^\top $$
Since there might be infinitely many solutions to this equation, I would like to single one out by minimizing
$$\min_{ {\bf a}, {\bf M} } \quad \| {\bf a} \| + \| {\bf M} \|$$
It does not need to be that specific objective function; any objective that lets me find a unique $\bf a$ and $\bf M$ solving the equation would work. The reason is that I'm building a system that takes advantage of the structure of a neural network: I want to perform a weight adjustment from values given by the user, but since the functions I'm using in the neurons are not differentiable, I have to do a different kind of weight adjustment. I ended up with a vector $\bf c$ at the end of the network, and this is where the problem begins: I want the adjustment to be reproducible, so that only the user can adjust the weights (a kind of authentication: only the right weights will output the user's values).
We parameterize this problem via the singular value decomposition of $M$, $M = U\Sigma V^*$, where $U$ and $V$ are unitary and $\Sigma$ is a diagonal matrix with zero padding. In particular, we use this decomposition to write $M$ as a sum of rank-$1$ matrices: $$M = \sum_{i=1}^{\min\{n,d\}}\sigma_iu_iv_i^*,$$ where $u_i$ and $v_i$ are the $i$-th columns of $U$ and $V$, respectively. These are unit vectors. Notice that the constraint $a^*M = c^*$ now reads $$\sum_{i=1}^{\min\{n,d\}}(a^*u_i)\sigma_i v_i^* = c^*.$$ We are free to choose $a$, $\sigma_i$, $u_i$, and $v_i$ such that this equation remains true.
Now note that the Frobenius norm of a matrix can also be computed in terms of its singular values. In particular, we have $\|M\|_F^2 = \sum_{i=1}^{\min\{n,d\}}\sigma_i^2$. This tells us we should try to make the singular values of $M$ as small as possible. Now, given an arbitrary $a\in\mathbb{R}^d$, one can actually make all but one of the singular values of $M$ zero by letting $u_1 = a/\|a\|_2$ and $v_1 = c/\|c\|_2$ and using the fact that all $u_i$'s and $v_i$'s are orthogonal to each other. Using this construction, the constraint becomes
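As a quick sanity check of the Frobenius-norm identity above, here is a small NumPy sketch (the dimensions $d=4$, $n=6$ and the random matrix are just illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 6))  # an arbitrary example matrix, d=4, n=6

# Singular values of M
sigma = np.linalg.svd(M, compute_uv=False)

# ||M||_F^2 should equal the sum of squared singular values
fro_from_svd = np.sqrt(np.sum(sigma**2))
fro_direct = np.linalg.norm(M, "fro")

print(np.isclose(fro_direct, fro_from_svd))  # True
```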
$$ \begin{aligned} (a^*u_1)\sigma_1v_1^* &= c^*\\ a^*\left(\frac{a}{\|a\|}\right)\sigma_1\left(\frac{c^*}{\|c\|}\right) &= c^*\\ \left(\frac{\sigma_1\|a\|}{\|c\|}\right)c^* &= c^* \\ \implies \sigma_1 &= \frac{\|c\|}{\|a\|}. \end{aligned} $$ Notice that the constraint only depends on $\sigma_1$, so all other singular values can be set to zero and the objective becomes $$ \|a\| + \|M\|_F = \|a\| + \sigma_1 = \|a\| + \frac{\|c\|}{\|a\|}. $$
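The rank-1 construction can be checked numerically: picking an arbitrary $a$, setting $u_1 = a/\|a\|$, $v_1 = c/\|c\|$, and $\sigma_1 = \|c\|/\|a\|$ should make $a^*M = c^*$ hold exactly. A minimal sketch (the dimensions and random vectors are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
d, n = 4, 6
a = rng.standard_normal(d)  # arbitrary a
c = rng.standard_normal(n)  # given c

u1 = a / np.linalg.norm(a)
v1 = c / np.linalg.norm(c)
sigma1 = np.linalg.norm(c) / np.linalg.norm(a)

# M is rank 1: all other singular values are zero
M = sigma1 * np.outer(u1, v1)

print(np.allclose(a @ M, c))  # True: the constraint a^T M = c^T holds
```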
We can now minimize this w.r.t. the norm of $a$ using single-variable calculus: setting the derivative of $t + \|c\|/t$ to zero gives $t = \sqrt{\|c\|}$, so the optimal norm of $a$ is $\sqrt{\|c\|}$. This yields $\sigma_1 = \sqrt{\|c\|}$ and $M = \frac{1}{\|c\|}ac^*$.
To summarize, given a vector $c\in\mathbb{R}^n$, $a$ can be an arbitrary vector such that $\|a\| = \sqrt{\|c\|}$, and $M = \frac{1}{\|c\|}ac^*$ is a minimizer of this objective.
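To tie it together, here is a sketch verifying the closed-form minimizer: an arbitrary direction for $a$ scaled to $\|a\| = \sqrt{\|c\|}$, $M = ac^*/\|c\|$, with the constraint satisfied and objective value $2\sqrt{\|c\|}$ (the dimensions and random $c$ are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 6, 4
c = rng.standard_normal(n)  # the given vector c

# a: arbitrary direction, scaled so that ||a|| = sqrt(||c||)
a = rng.standard_normal(d)
a *= np.sqrt(np.linalg.norm(c)) / np.linalg.norm(a)

# The minimizer M = a c^T / ||c||
M = np.outer(a, c) / np.linalg.norm(c)

print(np.allclose(a @ M, c))  # True: the constraint holds

# Objective ||a|| + ||M||_F should equal 2*sqrt(||c||)
obj = np.linalg.norm(a) + np.linalg.norm(M, "fro")
print(np.isclose(obj, 2 * np.sqrt(np.linalg.norm(c))))  # True
```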