I am trying to solve a matrix equation of the form $AB = C$, where $A$ is the unknown matrix I must find. I know that $B$ and $C$ are $n \times 1$ matrices, so $A$ must be $n \times n$.
I can't use the inverse of $B$, since it doesn't exist.
For two general matrices $B,C$ of order $m \times n$ (note that in your case $m=n$ and the column count is $1$) over a field $\mathscr{F}$, the equation is solvable if and only if the row space of $C$ is a subspace of the row space of $B$. If this is the case, then you can construct the matrix $A$ as follows: $$A=\begin{bmatrix} C & X \end{bmatrix}\begin{bmatrix} B & Y \end{bmatrix}^R,$$ where $X \in M_{m \times (m-r)}(\mathscr{F})$, $Y \in M_{m \times (m-r)}(\mathscr{F})$, and $\begin{bmatrix} B & Y \end{bmatrix}$ has full row rank. The symbol $R$ indicates a right inverse, and $r$ the rank of $B$.
Further, a matrix $A$ of each possible rank can be constructed by choosing $X$ of rank $$0 \leq \text{r}(X) \leq \min(m-\text{r}(C),m-\text{r}(B))$$ such that $\text{R}(X) \cap \text{R}(C) = \{0\}$. For a full proof and a worked-out example please see this text, specifically the preliminary section on matrix division. You will see in the text that the situation is even more general, as the numbers of rows of $B$ and $C$ may differ — so the idea is to find a "right-quotient" for any two matrices with the same number of columns, whenever this is possible. Your question is just a special case.
Maybe I can just mention the basic principle involved... If $B$ has full row rank, it has a right inverse and we can write $A=CB^R$. If $B$ does not have full row rank, we need the matrix $Y$, which simply consists of columns linearly independent of the columns of $B$, so that together with $B$ they form a matrix of full row rank. This ensures that a right inverse exists for the matrix $\begin{bmatrix} B & Y \end{bmatrix}$. If we augment the columns of $B$, we also need to augment the columns of $C$ to keep the multiplication $$A\begin{bmatrix} B & Y \end{bmatrix}=\begin{bmatrix} C & X \end{bmatrix}$$ valid: this is the matrix $X$. Now $X$ provides a means to vary the rank of $A$.
In practical terms the matrices $X$ and $Y$ can be calculated by applying elementary row/column reduction on $B$ and $C$ and selecting certain columns from the resulting elementary matrices...there is a fully worked example in the text under section 2.2.1.
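As a concrete sketch of this construction (using NumPy; I assume randomly drawn augmenting columns make $\begin{bmatrix} B & Y \end{bmatrix}$ full row rank, which happens with probability 1, rather than computing $X, Y$ via row/column reduction as in the text):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 4
B = rng.standard_normal((n, 1))  # n x 1, rank r = 1
C = rng.standard_normal((n, 1))

# Augment B with Y (n x (n-1)) whose columns are, generically, linearly
# independent of B, and augment C with an arbitrary X of the same shape.
Y = rng.standard_normal((n, n - 1))
X = rng.standard_normal((n, n - 1))

BY = np.hstack([B, Y])  # full row rank, so a right inverse exists
CX = np.hstack([C, X])

# For a full-row-rank matrix the Moore-Penrose pseudoinverse is a
# right inverse: BY @ pinv(BY) = I.
A = CX @ np.linalg.pinv(BY)

print(np.allclose(A @ B, C))  # A solves AB = C
```

Varying $X$ here changes the rank of the resulting $A$, as described above.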
Let me transcribe the excellent answer of @G.Cab in a "visible" way.
Let me rewrite the matrix equation under the form: $AV_1=V_2$.
As there are no constraints on $A$ sending any vector to any vector, one can write $AA_1=A_2$ with:
$$A_1:=\left(\begin{array}{l|cccc} \vdots & * & * & * & *\\ \vdots & * & * & * & *\\ V_1 & * & * & * & *\\ \vdots & * & * & * & *\\ \vdots & * & * & * & * \end{array}\right) \ \ \text{and} \ \ A_2:=\left(\begin{array}{l|cccc} \vdots & * & * & * & *\\ \vdots & * & * & * & *\\ V_2 & * & * & * & *\\ \vdots & * & * & * & *\\ \vdots & * & * & * & * \end{array}\right) $$ where the entries of the $n \times (n-1)$ right blocks are arbitrary (in practice: chosen at random).
Then the general solution to your problem is
$$A=A_2A_1^{-1}$$
under the condition that $A_1$ is invertible.
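A minimal NumPy sketch of this recipe (random right blocks make $A_1$ invertible with probability 1; in the degenerate case one would redraw them):

```python
import numpy as np

rng = np.random.default_rng(1)

n = 5
V1 = rng.standard_normal((n, 1))
V2 = rng.standard_normal((n, 1))

# First column is the given vector; the remaining n-1 columns
# are arbitrary (random entries).
A1 = np.hstack([V1, rng.standard_normal((n, n - 1))])
A2 = np.hstack([V2, rng.standard_normal((n, n - 1))])

A = A2 @ np.linalg.inv(A1)

print(np.allclose(A @ V1, V2))  # A sends V1 to V2
```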
If $B = 0$ and $C \neq 0$ no solutions exist. If $B = C = 0$ any $n \times n$ matrix $A$ is a solution.
So consider the case $B \neq 0.$ Given an $n \times n$ matrix $A=(a_{ij})$, let $\mathbf{a}$ denote the $n^2 \times 1$ column vector $ (a_{11} \dots a_{1n} \ a_{21} \dots a_{2n} \dots a_{n1} \dots a_{nn})^T.$
$\mathbf{a}$ is uniquely determined by $A$ and vice-versa.
Let $\tilde{\mathbf{B}}$ denote the $ n \times n^2$ matrix $\begin{pmatrix} B^T & 0 & 0 & \dots & 0 \\ 0 & B^T & 0 & \dots & 0 \\ \vdots & \vdots & \vdots & \vdots & \vdots \\ 0 & 0 & 0 & \dots & B^T \end{pmatrix}. $
Our problem is equivalent to determining all $\mathbf{a}$ that satisfy $$\tilde{\mathbf{B}} \mathbf{a} = C.$$
Note that $B \neq 0$ implies $\operatorname{rank}(\tilde{\mathbf{B}}) = n$. To see this, let $B=\begin{pmatrix} b_1 & b_2 & \dots & b_n\end{pmatrix}^T$ and note that by permuting the columns of $\tilde{\mathbf{B}}$ we get the matrix $\left[ \begin{matrix} b_1 I_n & b_2I_n & \dots &b_nI_n \end{matrix} \right]$, and some $b_i \neq 0.$ This implies the above equation always has a solution when $B \neq 0.$
In the complex case we then get all solutions of $\mathbf{a}$ as $(\tilde{\mathbf{B}})^\dagger C + (I - (\tilde{\mathbf{B}})^\dagger \tilde{\mathbf{B}}) x$ where $x$ is an arbitrary $ n^2 \times 1$ vector and $(\tilde{\mathbf{B}})^\dagger$ is the Moore-Penrose inverse of $\tilde{\mathbf{B}}$ .
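The vectorized system can be checked numerically (NumPy sketch; note $\tilde{\mathbf{B}} = I_n \otimes B^T$, so `np.kron` builds the block-diagonal matrix above, and `np.linalg.pinv` gives the Moore-Penrose inverse):

```python
import numpy as np

rng = np.random.default_rng(2)

n = 3
B = rng.standard_normal((n, 1))
C = rng.standard_normal((n, 1))

# B_tilde = I_n (x) B^T: block diagonal with B^T in each block.
B_tilde = np.kron(np.eye(n), B.T)

# All solutions: a = B_tilde^+ C + (I - B_tilde^+ B_tilde) x,
# with x an arbitrary n^2 x 1 vector.
Bp = np.linalg.pinv(B_tilde)
x = rng.standard_normal((n * n, 1))
a = Bp @ C + (np.eye(n * n) - Bp @ B_tilde) @ x

# a stacks the rows of A, so reshape row-wise.
A = a.reshape(n, n)
print(np.allclose(A @ B, C))  # each choice of x yields a solution
```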
Here is an explicit solution assuming real matrices. Assume $B = \begin{pmatrix} b_1 & b_2 & \dots & b_n \end{pmatrix}^T \neq 0.$
Also $\|B\|^2 = \sum_{i=1}^n b_i^2 > 0.$
The mapping $A \to AB$ is a linear transformation. The kernel of this transformation is the set of all $n \times n$ matrices $A$ whose rows are orthogonal to $B$, i.e., the kernel consists of all matrices of the form $X\left(I - \dfrac{BB^T}{\|B\|^2}\right)$ where $X$ is an arbitrary $n \times n$ matrix.
It is easy to see $\dfrac{CB^T}{\|B\|^2}$ is a particular solution.
So the set of all solutions consists of matrices of the form $\dfrac{CB^T}{\|B\|^2} + X\left(I - \dfrac{BB^T}{\|B\|^2}\right)$ where $X$ is an arbitrary $n \times n$ matrix.
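This closed form translates directly into a few lines of NumPy (a sketch for real $B \neq 0$; the division by the $1 \times 1$ array $B^T B = \|B\|^2$ broadcasts over the matrix):

```python
import numpy as np

rng = np.random.default_rng(3)

n = 4
B = rng.standard_normal((n, 1))
C = rng.standard_normal((n, 1))

P = (B @ B.T) / (B.T @ B)   # orthogonal projection onto span(B)
A0 = (C @ B.T) / (B.T @ B)  # particular solution: A0 @ B = C
X = rng.standard_normal((n, n))  # arbitrary matrix

# General solution: particular solution plus an element of the kernel.
A = A0 + X @ (np.eye(n) - P)

print(np.allclose(A @ B, C))  # holds for every choice of X
```

The kernel term works because $(I - BB^T/\|B\|^2)B = B - B = 0$.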
Place next to the given column vector $B_1$ another $n-1$ vectors, so that the whole becomes a matrix $B_n$ with $n$ independent columns; then $B_n$ is invertible. Do the same with $C_1$, but in this case you do not need independence. Then $A_n = C_n B_n^{-1}$ will give you "all" the matrices such that $A_n B_1 = C_1$.
Note: the process also applies when $B$ and $C$ have two, ..., $n$ columns.