Systematic way to find a matrix that satisfies an equation


Background

It is relatively easy to find a solution to a system of linear equations of the form $A\textbf{v}=\textbf{b}$ given the matrix $A$. But what systematic methods allow us to go the other way and obtain a matrix satisfying a given equation?

For example, consider the following equation with all entries in $\mathbb{R}$:

$$ \begin{bmatrix} a&b&c\\ d&e&f\\ g&h&i\\ \end{bmatrix}\begin{bmatrix}2\\3\\4\end{bmatrix}=\begin{bmatrix}1\\1\\1\end{bmatrix} $$

Although it is easy to see that $a=\frac{1}{2}, e=\frac{1}{3}, i=\frac{1}{4}$ with all other entries being $0$ is a viable solution, I am curious whether there is a more systematic way of finding a matrix that satisfies an equation. Even more importantly, how should these methods be adapted when there are added constraints on the properties of the matrix? For example, what if we require that the matrix of interest be invertible, or of rank $k$?
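The diagonal guess above is easy to verify numerically; here is a quick sketch using NumPy (the array names are my own):

```python
import numpy as np

# The diagonal candidate: a = 1/2, e = 1/3, i = 1/4, all other entries 0.
A = np.diag([1/2, 1/3, 1/4])
v = np.array([2.0, 3.0, 4.0])

print(A @ v)  # -> [1. 1. 1.]

# A diagonal matrix with nonzero diagonal entries has full rank,
# so this particular solution is also invertible.
print(np.linalg.matrix_rank(A) == 3)  # True
```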

Why I am interested in such a question

Consider the vector space $P_2(\mathbb{R})$. The problem of finding a basis $\beta$ such that $[x^2+x+1]_\beta = (2,3,4)^T$ reduces to the problem stated above.
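To make the reduction concrete: writing each polynomial as a coefficient vector, the diagonal solution above yields the basis $\beta = \{\frac{x^2}{2}, \frac{x}{3}, \frac{1}{4}\}$, since $2\cdot\frac{x^2}{2} + 3\cdot\frac{x}{3} + 4\cdot\frac{1}{4} = x^2+x+1$. A sketch of the check (representation convention is my own):

```python
import numpy as np

# Represent p in P_2 by its coefficient vector (c2, c1, c0) for c2*x^2 + c1*x + c0.
# beta = {p1, p2, p3} works iff 2*p1 + 3*p2 + 4*p3 = x^2 + x + 1
# and the p_i are linearly independent.
p1 = np.array([1/2, 0, 0])   # x^2 / 2
p2 = np.array([0, 1/3, 0])   # x / 3
p3 = np.array([0, 0, 1/4])   # 1 / 4

B = np.column_stack([p1, p2, p3])
print(B @ np.array([2, 3, 4]))        # -> [1. 1. 1.], the coefficients of x^2 + x + 1
print(np.linalg.matrix_rank(B) == 3)  # True: the p_i are independent, so beta is a basis
```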

Best answer

Just as $A\mathbf v=\mathbf b$ is equivalent to a system of linear equations in the entries of $\mathbf v$, it is also equivalent to a system of linear equations in the entries of $A$. In particular, you have $$\begin{align}2a+3b+4c&=1\\2d+3e+4f&=1\\2g+3h+4i&=1,\end{align}$$ or in matrix form, $$\begin{bmatrix}2&3&4&0&0&0&0&0&0\\0&0&0&2&3&4&0&0&0\\0&0&0&0&0&0&2&3&4\end{bmatrix}\begin{bmatrix}a\\b\\c\\d\\e\\f\\g\\h\\i\end{bmatrix} = \begin{bmatrix}1\\1\\1\end{bmatrix}.$$ I expect that you know how to find the general solution to these equations. Indeed, it's possible to write one down from the matrix form of this system by inspection.
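The $3\times 9$ system above can be solved mechanically; here is a sketch in NumPy (the Kronecker-product construction and variable names are my own, but they encode exactly the block-diagonal matrix shown):

```python
import numpy as np

v = np.array([2.0, 3.0, 4.0])
b = np.ones(3)

# Each row of A contributes one equation with coefficients v, so the
# 3x9 coefficient matrix is block-diagonal with copies of v^T:
M = np.kron(np.eye(3), v)  # shape (3, 9)

# One particular solution (the minimum-norm one) for the unknowns a..i:
x = np.linalg.lstsq(M, b, rcond=None)[0]
A = x.reshape(3, 3)
print(np.allclose(A @ v, b))  # True

# The general solution adds any element of the 6-dimensional null space of M;
# an orthonormal basis for it comes from the SVD (rows of Vt past the rank):
_, _, Vt = np.linalg.svd(M)
null_basis = Vt[3:]  # shape (6, 9)
A2 = (x + null_basis.T @ np.random.randn(6)).reshape(3, 3)
print(np.allclose(A2 @ v, b))  # True for any null-space perturbation
```

For the constrained versions of the question, one could search this 6-parameter family for a member that is invertible or has a prescribed rank, since almost every perturbation of a particular solution here is full-rank.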