Consider several linearly independent matrices $A_k \in \mathbb R^{m \times n}$ and the following equation $$ \operatorname{rank} \left(A_0 + \sum_{k=1}^r c_k A_k\right) = 1. $$ Here the $A_k$ are fixed, the $c_k$ are unknown real numbers, and $r = nm - n - m + 1$.
I wonder what methods may be used to solve this type of equation.
I thought about the relation between singular values and the trace: let $B = A_0 + \sum_{k=1}^r c_k A_k$, and let $\sigma_0$ denote its only nonzero singular value in the rank-one case. Then $$ \|B\|_F^2 = \operatorname{tr}(B^\top B) = \sum_i \sigma_i^2 = \sigma_0^2\\ \|B^\top B\|_F^2 = \operatorname{tr}(B^\top B B^\top B) = \sum_i \sigma_i^4 = \sigma_0^4 = \|B\|_F^4. $$ The latter equation is a fourth-degree equation in the $c_k$. For square matrices it could probably be reduced to a second-degree multivariate equation, but it still remains a complex problem.
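A quick numerical sanity check of this characterization, as a sketch in NumPy (the random matrices below are illustrative, not part of the problem): in general $\|B^\top B\|_F^2 = \sum_i \sigma_i^4 \le \left(\sum_i \sigma_i^2\right)^2 = \|B\|_F^4$, with equality exactly when at most one singular value is nonzero.

```python
import numpy as np

rng = np.random.default_rng(0)

def rank_one_defect(B):
    """||B||_F^4 - ||B^T B||_F^2 = (sum sigma_i^2)^2 - sum sigma_i^4.

    Nonnegative; zero (up to rounding) iff rank(B) <= 1.
    """
    return np.linalg.norm(B) ** 4 - np.linalg.norm(B.T @ B) ** 2

u = rng.standard_normal(4)
v = rng.standard_normal(3)
B1 = np.outer(u, v)                # rank 1: defect numerically zero
B2 = rng.standard_normal((4, 3))   # generic matrix: full rank, defect positive

print(rank_one_defect(B1), rank_one_defect(B2))
```

`np.linalg.norm` with a matrix argument defaults to the Frobenius norm, so no explicit SVD is needed here.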
Given linearly independent matrices ${\rm A}_k \in \mathbb R^{m \times n}$, we have the following equation in ${\rm x} \in \mathbb R^r$
$$\operatorname{rank} \left( {\rm A}_0 + \sum_{k=1}^r x_k {\rm A}_k \right) = 1$$
Since the nuclear norm is a convex proxy for the rank, we could solve the following convex program in ${\rm x} \in \mathbb R^r$
$$\begin{array}{ll} \underset{{\rm x} \in \mathbb R^r}{\text{minimize}} & \left\| {\rm A}_0 + \displaystyle\sum_{k=1}^r x_k {\rm A}_k \right\|_* \end{array}$$
Let ${\rm x}^{\min}$ be a minimizer of this convex program. If
$$\operatorname{rank} \left( {\rm A}_0 + \displaystyle\sum_{k=1}^r x_k^{\min} {\rm A}_k \right) = 1$$
we are done. If not, we wasted a few minutes of our lives.
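This heuristic can be sketched without a dedicated SDP solver by minimizing the nuclear norm (the sum of singular values) directly with a derivative-free method from SciPy. The tiny $2 \times 2$ instance below is an assumption made purely for illustration: here $r = nm - n - m + 1 = 1$, and ${\rm A}_0 + x\,{\rm A}_1 = \operatorname{diag}(1,\, 1+x)$ has rank $1$ exactly at $x = -1$.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative instance: A0 + x*A1 = diag(1, 1 + x), rank 1 at x = -1.
A0 = np.diag([1.0, 1.0])
A = [np.diag([0.0, 1.0])]

def nuclear_norm(x):
    B = A0 + sum(xk * Ak for xk, Ak in zip(x, A))
    return np.linalg.svd(B, compute_uv=False).sum()

# Nelder-Mead tolerates the nonsmoothness of the nuclear norm at rank drops.
res = minimize(nuclear_norm, x0=np.zeros(len(A)), method="Nelder-Mead")

B_min = A0 + res.x[0] * A[0]
sigma = np.linalg.svd(B_min, compute_uv=False)
print(res.x, sigma)
```

On this instance the minimizer lands near $x = -1$ and the second singular value of the resulting matrix is (numerically) zero, so the rank-$1$ check succeeds; in general, of course, the nuclear-norm minimizer need not have rank one, which is exactly the "wasted a few minutes" case.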