I want to solve the equation $AB=C$, with $A \in \mathbb{R}^{3\times 3}$, $B \in \mathbb{Z}^{3\times 10}$, $C \in \mathbb{R}^{3\times 10}$, for $A$. The solution should minimize the norm (ideally the Euclidean norm) of the columns of $AB-C$.
My first attempt was to transpose the whole equation ($B^TA^T=C^T$) and apply ordinary linear regression, but this would minimize the norm of the rows of $AB-C$ instead of the columns.
How could one minimize the columns of $AB-C$ instead? Or, equivalently, minimize the norms of the rows of $B^TA^T-C^T$ when solving $B^TA^T=C^T$.
Background: $A$ is the basis of a lattice. $B$ is an integer matrix that selects lattice points. The columns of $C$ are noisy lattice points. So I am searching for the lattice that best describes the noisy lattice points (best in the sense that the norms of the differences between predicted and observed lattice points are minimal).
Let's define $ D = A B - C $.
Then, to minimize the least-squares error of each column of $ D $, one can solve:
$$ \arg \min_{A} \sum_{i} \left\| \left( A B - C \right) \boldsymbol{e}_{i} \right\|_{2}^{2} $$
Where $ \boldsymbol{e}_{i} $ is the $ i $-th column of the identity matrix $ I \in \mathbb{R}^{10 \times 10} $.
The above indeed extracts each column of the matrix $ D $ and computes its squared $ {L}_{2} $ norm, and the optimization problem minimizes the sum of these.
Yet the sum of squared column norms is exactly the squared Frobenius norm. (It also equals the sum of squared row norms, so the transposed regression from the question minimizes the very same objective.)
Hence the problem can be written as:
$$ \arg \min_{A} \left\| A B - C \right\|_{F}^{2} $$
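The column-wise/Frobenius equivalence above is easy to verify numerically. A minimal NumPy sketch (the random matrices and seed are illustrative assumptions, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.integers(-5, 6, size=(3, 10)).astype(float)  # integer entries, as in the question
C = rng.standard_normal((3, 10))

D = A @ B - C

# Sum of squared L2 norms of the columns of D ...
col_sum = sum(np.linalg.norm(D[:, i]) ** 2 for i in range(D.shape[1]))

# ... equals the squared Frobenius norm of D.
frob = np.linalg.norm(D, "fro") ** 2

print(np.isclose(col_sum, frob))  # True
```

The same check with `D[i, :]` over the rows gives the same value, which is why the transposed regression yields an equivalent problem.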
Now, by defining $ f \left( A \right) = \left\| A B - C \right\|_{F}^{2} $ one could see that:
$$ \frac{d}{d A} f \left( A \right) = 2 \left( A B - C \right) {B}^{T} $$
Setting the derivative to zero (and assuming $ B {B}^{T} $ is invertible, i.e. $ B $ has full row rank) yields:
$$ \hat{A} = C {B}^{T} \left( B {B}^{T} \right)^{-1} $$
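A short NumPy sketch of this closed-form solution, under the assumption that $B$ has full row rank so $B B^T$ is invertible (the matrices and noise level below are made up for illustration). It also checks against `np.linalg.lstsq` on the transposed system $B^T A^T = C^T$, which is the numerically preferable route and confirms that the original transposed-regression idea minimizes the same objective:

```python
import numpy as np

rng = np.random.default_rng(1)
A_true = rng.standard_normal((3, 3))          # hypothetical lattice basis
B = rng.integers(-5, 6, size=(3, 10)).astype(float)  # integer selection matrix
C = A_true @ B + 0.01 * rng.standard_normal((3, 10))  # noisy lattice points

# Closed form: A_hat = C B^T (B B^T)^{-1}
A_hat = C @ B.T @ np.linalg.inv(B @ B.T)

# Equivalent, more stable: least squares on B^T A^T = C^T
A_hat_lstsq = np.linalg.lstsq(B.T, C.T, rcond=None)[0].T

print(np.allclose(A_hat, A_hat_lstsq))  # True
```

With small noise, `A_hat` recovers `A_true` closely; in practice one would use the `lstsq` (or a QR-based) route rather than forming the inverse of $B B^T$ explicitly.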