I'm having trouble with this proof:
Given two linear systems in $n$ unknowns:
$A=\begin{pmatrix} A_1\\ A_2\\ \vdots\\ A_m \end{pmatrix}$
$B=\begin{pmatrix} B_1\\ B_2\\ \vdots\\ B_p \end{pmatrix}$
Let $S_A$ and $S_B$ be the set of their solutions. Prove that:
$S_A=S_B \Rightarrow A_i=\sum\limits_{k=1}^{p} c_k^i B_k$ and $B_j=\sum\limits_{k=1}^{m} d_k^j A_k \ \ \ 1\leq i\leq m , 1\leq j\leq p $
I had no difficulty proving the converse implication, but this direction is giving me trouble. In practice I have to prove that if the two linear systems have the same solutions, then each equation of the second can be obtained as a linear combination of the equations of the first, and vice versa. Now I'll write my (miserable) attempt:
Attempt
Let's write the systems in a more explicit form:
$$A: \Big(\sum\limits_{j=1}^{n}a_{ij}x_j\Big)-a_i=0 \ \ \ 1\leq i \leq m$$
$$B: \Big(\sum\limits_{j=1}^{n}b_{ij}x_j\Big)-b_i=0 \ \ \ 1\leq i \leq p$$
where $a_i$ and $b_i$ are the constant terms. In practice we have to prove that there exist constants $c_k^i$ such that:
$$\Big(\sum\limits_{j=1}^{n}a_{ij}x_j\Big)-a_i=c_1^i\Big[\Big(\sum\limits_{j=1}^{n}b_{1j}x_j\Big)-b_1\Big]+\dots+c_p^i\Big[\Big(\sum\limits_{j=1}^{n}b_{pj}x_j\Big)-b_p\Big] \ \ 1\leq i \leq m$$
One of my friends proved this equality by substituting $(x_1,\dots,x_n)=(t_1,\dots,t_n)$, where $(t_1,\dots,t_n)$ is a solution of both systems. But I think this is not enough, because it only proves that the equality holds at the common solutions (the hypothesis): I want the equality to hold for a generic $(x_1,\dots,x_n)$. My idea is this: for the two sides to be equal as polynomials, the coefficients of each unknown must agree on both sides. So:
$$\left\{\begin{matrix} a_{i1}=c_1^i b_{11}+c_2^i b_{21}+\dots+c_p^i b_{p1}\\ a_{i2}=c_1^i b_{12}+c_2^i b_{22}+\dots+c_p^i b_{p2}\\ \vdots\\ a_{in}=c_1^i b_{1n}+c_2^i b_{2n}+\dots+c_p^i b_{pn}\\ a_i=c_1^i b_1+c_2^i b_2+\dots+c_p^i b_p \end{matrix}\right.$$
And now I have to prove that this system always has a solution (the unknowns are $c_1^i,c_2^i,\dots,c_p^i$). I still have to use the fact that $A$ and $B$ have the same solutions, but I don't know how to do this.
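As a numerical sanity check (not a proof), one can set up exactly this system for the unknown coefficients and solve it with `numpy.linalg.lstsq`. The two systems below are hypothetical examples with the same unique solution $(1,2)$:

```python
import numpy as np

# Hypothetical example: two systems in n = 2 unknowns whose unique
# common solution is (x, y) = (1, 2).
A = np.array([[1.0, 1.0], [2.0, -1.0]]); a = np.array([3.0, 0.0])  # x+y=3, 2x-y=0
B = np.array([[1.0, 0.0], [0.0, 1.0]]);  b = np.array([1.0, 2.0])  # x=1,  y=2

# Equation i corresponds to the augmented row (a_{i1}, ..., a_{in}, a_i).
aug_A = np.column_stack([A, a])
aug_B = np.column_stack([B, b])

# For each row of aug_A, find c = (c_1^i, ..., c_p^i) with aug_B^T c = row.
coeffs = []
for row in aug_A:
    c, *_ = np.linalg.lstsq(aug_B.T, row, rcond=None)
    assert np.allclose(aug_B.T @ c, row)  # each equation of A is a combination of B's
    coeffs.append(c)
```

Here the first equation of $A$ comes out as $1\cdot B_1 + 1\cdot B_2$ and the second as $2\cdot B_1 - 1\cdot B_2$, which is easy to verify by hand.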
[I know that there is a similar post, but it's about the particular case of a system with 2 unknowns, and I can't understand many of the answers because these are my first lessons in Linear Algebra/Geometry.]
P.S.: The notation we are using is really heavy, am I right?
Moreover, I think that maybe the hypothesis $S_A=S_B$ also implies $m=p$: the two systems must have the same number of independent equations, otherwise the system I obtained during my proof couldn't always be solved.
Let $A\in M_{m,n},u\in\mathbb{C}^m$ and $B\in M_{p,n},v\in \mathbb{C}^p$. We consider the $2$ systems in $x\in\mathbb{C}^n$:
$(S_1)$: $Ax=u$ and $(S_2)$: $Bx=v$.
EDIT. I forgot the case when both sets of solutions are void. The correct result is
$\textbf{Proposition}$. The above systems have the same set of solutions IFF
EITHER $AA^+u\not= u,BB^+v\not= v$ (both sets of solutions are void)
OR $AA^+u=u,BB^+v=v$, $\ker(A)=\ker(B)$ and $AB^+v=u,BA^+u=v$.
$\textbf{Proof}$ (of the second part, when $AA^+u=u,BB^+v=v$). The set of solutions of $(S_1)$ is the non-void affine set
$x=A^+u+(I_n-A^+A)w$ where $w$ is arbitrary in $\mathbb{C}^n$.
The set of solutions of $(S_2)$ is the non-void affine set
$x=B^+v+(I_n-B^+B)z$ where $z$ is arbitrary in $\mathbb{C}^n$.
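These parametrizations are easy to verify numerically. Here is a small sketch with `numpy` on a hypothetical consistent underdetermined system, checking that every $x=A^+u+(I_n-A^+A)w$ indeed solves $Ax=u$:

```python
import numpy as np

# Hypothetical consistent underdetermined system: one equation, three unknowns.
A = np.array([[1.0, 2.0, -1.0]])
u = np.array([4.0])

Ap = np.linalg.pinv(A)             # Moore-Penrose pseudoinverse A^+
assert np.allclose(A @ Ap @ u, u)  # AA^+u = u: the system is solvable

P = np.eye(3) - Ap @ A             # orthogonal projector I_n - A^+A
rng = np.random.default_rng(0)
for _ in range(5):
    w = rng.standard_normal(3)     # arbitrary w in C^n (real here for simplicity)
    x = Ap @ u + P @ w             # a point of the affine solution set
    assert np.allclose(A @ x, u)
```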
These affine sets are the same IFF $im(I-A^+A)=im(I-B^+B)$ and $BA^+u=v,AB^+v=u$.
$A^+A$ is hermitian and, more precisely, is the orthogonal projection onto $im(A^*)=\ker(A)^{\perp}$.
Then $im(I-A^+A)=(im(A^+A))^{\perp}=\ker(A)$ and we are done.$\square$
$\textbf{Remark}$. The last $2$ conditions answer the OP's question positively when both sets of solutions are non-void.
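To make the criterion concrete, here is a numerical check of the consistent case (an illustrative sketch with hypothetical matrices, using `numpy.linalg.pinv`): two systems with the same solution set satisfy $\ker(A)=\ker(B)$, $AB^+v=u$ and $BA^+u=v$.

```python
import numpy as np

# Hypothetical consistent systems with the same solution set {(1, 2)}.
A = np.array([[1.0, 1.0], [2.0, -1.0]]);            u = np.array([3.0, 0.0])
B = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]); v = np.array([1.0, 2.0, 3.0])

Ap, Bp = np.linalg.pinv(A), np.linalg.pinv(B)

# Both systems are consistent: AA^+u = u and BB^+v = v.
assert np.allclose(A @ Ap @ u, u) and np.allclose(B @ Bp @ v, v)

# ker(A) = ker(B): the projectors I_n - A^+A and I_n - B^+B coincide.
n = A.shape[1]
assert np.allclose(np.eye(n) - Ap @ A, np.eye(n) - Bp @ B)

# Cross conditions: AB^+v = u and BA^+u = v.
assert np.allclose(A @ Bp @ v, u)
assert np.allclose(B @ Ap @ u, v)
```

Note that $m=p$ is not required: here $A$ has two equations and $B$ three, yet the criterion holds.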