So I have to solve:

I know how to do these basically, but the vectors in the second matrix are not linearly independent. The answer in the textbook is:

If someone could show me how to solve it, that would be great.
Gerry Myerson already gave a nice answer. Here is a way to solve for $X$ all at once (though it's not really slick).
You just need to directly apply what you know from linear algebra. Call the first matrix $A$ and the second matrix $B$. The set of all $3 \times 3$ matrices forms a vector space, $M_3$, and the map $T$ defined by $T(X) = XA$ is a linear map on $M_3$. So we can choose a basis $\beta$ for $M_3$ (an obvious one is the set of matrices with $0$ everywhere except a $1$ in one entry).
Then $[X]_\beta$ (the column vector of $X$ with respect to the basis $\beta$) and $[B]_\beta$ are column vectors with $9$ entries. Also $[T]_\beta$ is a $9 \times 9$ matrix. So we have the equation $$ [T]_\beta [X]_\beta = [B]_\beta $$ which is just a usual equation with vectors and matrices and you can solve it by row reducing the augmented matrix $([T]_\beta | [B]_\beta)$.
This is not exactly slick and obviously it would take too long to do it by hand. But I just wanted to show that linear algebra already tells us how to solve it (theoretically).
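The "too long by hand" part is exactly what a computer is for. Here is a minimal sketch of the $9 \times 9$ construction in NumPy; since the question's matrices aren't reproduced above, the $A$ and $X$ below are made-up placeholders ($A$ is chosen singular, matching the dependence noted in the question, and $B$ is built as $XA$ so the system is consistent). With the column-stacking convention, $\operatorname{vec}(XA) = (A^{\mathsf T} \otimes I_3)\operatorname{vec}(X)$, so $[T]_\beta = A^{\mathsf T} \otimes I_3$.

```python
import numpy as np

# Hypothetical matrices (the question's actual A and B are not shown here).
# A is deliberately rank 2: its third row is the sum of the first two.
A = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [1., 2., 1.]])
X_true = np.array([[1., 0., 2.],
                   [0., 1., 1.],
                   [2., 1., 0.]])
B = X_true @ A                   # guarantees XA = B is consistent

# vec(XA) = (A^T kron I_3) vec(X), column-stacking convention,
# so [T]_beta is the 9x9 Kronecker product below.
M = np.kron(A.T, np.eye(3))
b = B.flatten(order='F')         # vec(B), columns stacked

# M is singular, so lstsq picks one particular solution of the
# underdetermined (but consistent) 9x9 system.
x, *_ = np.linalg.lstsq(M, b, rcond=None)
X = x.reshape(3, 3, order='F')

print(np.allclose(X @ A, B))     # True: X solves XA = B
```

Because $A$ is singular, `lstsq` returns just one member of the solution family; the full family is that particular solution plus anything in the null space of $[T]_\beta$.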
Let the top row of $X$ be $(r\ s\ t)$. Then you get a system of three equations in the three unknowns $r,s,t$. I presume you know how to solve such a system, and, if the textbook is right, a solution is given by $r=a$, $s=2-a$, $t=a$, where $a$ is arbitrary. Then do the same thing with the second row of $X$, and then with the third row of $X$, and (if the book is right) you should get the textbook answer.
There's probably a slick way to do all three rows at once, but the method I outline should be easy to understand.
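The row-by-row method above can also be sketched numerically. A row $(r\ s\ t)$ of $X$ satisfies $(r\ s\ t)A = b$ for the corresponding row $b$ of $B$, i.e. $A^{\mathsf T}x = b^{\mathsf T}$; the one-parameter family of solutions (the textbook's arbitrary $a$) is a particular solution plus multiples of a null-space vector of $A^{\mathsf T}$. The $A$ and $b$ below are placeholders, not the question's values, with $b$ chosen in the row space of $A$ so the system is consistent.

```python
import numpy as np

# Hypothetical singular A and one row b of B (not the question's values);
# b is chosen to lie in the row space of A, so (r s t)A = b is solvable.
A = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [1., 2., 1.]])   # rank 2
b = np.array([1., 3., 2.])

# (r s t) A = b  is the same as  A^T x = b with x = (r, s, t)^T.
x_part, *_ = np.linalg.lstsq(A.T, b, rcond=None)

# The null space of A^T comes from the SVD: the right singular vector
# belonging to the (near-)zero singular value.
U, sigma, Vt = np.linalg.svd(A.T)
n = Vt[-1]                     # spans the 1-dimensional null space

# Every solution row is x_part + a*n, for arbitrary a.
for a in (0.0, 1.0, -2.5):
    x = x_part + a * n
    print(np.allclose(A.T @ x, b))   # True for every a
```

Repeating this for each row of $B$ reproduces the row-at-a-time procedure, with each row of $X$ carrying its own free parameter.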