I'm trying to find an $n \times n$ matrix $A$ that satisfies $V_x = A V_y$, where $V_x$ and $V_y$ are both $n \times 1$ vectors of known quantities. For what it's worth, the sum of $V_x$ and the sum of $V_y$ will always equal 1, and each element of $V_x$ and $V_y$ is nonnegative (they are probabilities...)
Can $A$ be expressed in terms of $V_x$ and $V_y$?
And I suspect $A$ has multiple possible solutions - is there a generalized equation which describes the possible solutions? I'll take as much as you can tell me about the way $A, V_x$, and $V_y$ relate. Thanks!
What does $\mathbf{A} \mathbf{x} = \mathbf{y}$ represent? It's just a system of equations of the form
$$ \begin{array}{c} a_{11} x_1 + a_{12} x_2 + \cdots + a_{1N} x_N = y_1 \\ a_{21} x_1 + a_{22} x_2 + \cdots + a_{2N} x_N = y_2 \\ \vdots \\ a_{M1} x_1 + a_{M2} x_2 + \cdots + a_{MN} x_N = y_M \\ \end{array} $$
Any matrix $\mathbf{A} \in \mathbb{C}^{M\times N}$ with $\operatorname{rank}(\mathbf{A}) = M$ will yield one or more solutions $\mathbf{x}$ to the above system, and this holds for every right-hand side $\mathbf{y}$. You can try rearranging the above to express the $a_{mn}$ in terms of the $x_n$ and $y_m$, but I think you'll find that there are infinitely many ways to do this: each of the $M$ equations involves $N$ unknown entries of $\mathbf{A}$, so you have $MN$ unknowns but only $M$ equations.
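To make the non-uniqueness concrete, here is a small NumPy sketch (my addition, not part of the original answer): for a single known pair $\mathbf{x}, \mathbf{y}$ with $\mathbf{x} \neq \mathbf{0}$, one particular solution is the rank-one matrix $\mathbf{A} = \mathbf{y}\mathbf{x}^T / (\mathbf{x}^T\mathbf{x})$, and it is only one of infinitely many.

```python
import numpy as np

# One known pair (x, y); both sum to 1, as in the question.
x = np.array([0.2, 0.3, 0.5])
y = np.array([0.6, 0.1, 0.3])

# Rank-one particular solution: A = y x^T / (x^T x).
# Then A x = y (x^T x) / (x^T x) = y.
A = np.outer(y, x) / (x @ x)

print(np.allclose(A @ x, y))  # True: A maps x to y
```

Adding to $\mathbf{A}$ any matrix whose rows are orthogonal to $\mathbf{x}$ gives another valid solution, which is exactly the freedom described above.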
EDIT:
Suppose that $\mathbf{x}$ and $\mathbf{y}$ are known but $\mathbf{A}$ is not. Notice that you can rewrite the above system to reflect this. For example, consider the following $2 \times 2$ case:
$$ \left[ \begin{array}{cc} a & b \\ c & d \end{array} \right] \left[ \begin{array}{c} x_0 \\ x_1 \end{array} \right] = \left[ \begin{array}{c} y_0 \\ y_1 \end{array} \right] $$
Since $x_0$, $x_1$, $y_0$, and $y_1$ are known, we would want to solve the following system to get the values of $a$, $b$, $c$, and $d$:
$$ \left[ \begin{array}{cccc} x_0 & x_1 & 0 & 0\\ 0 & 0 & x_0 & x_1 \end{array} \right] \left[ \begin{array}{c} a \\ b \\ c \\ d \end{array} \right] = \left[ \begin{array}{c} y_0 \\ y_1 \end{array} \right] $$
Now it's obvious that $a$, $b$, $c$, and $d$ can't be determined uniquely: two equations, four unknowns. However, what if you had two known cases $\mathbf{A}\mathbf{x}_0 = \mathbf{y}_0$ and $\mathbf{A}\mathbf{x}_1 = \mathbf{y}_1$? Then we could combine them with the above to get the following:
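You can see this numerically with a short NumPy sketch (my addition): the stacked $2 \times 4$ system above is $\mathbf{I}_2 \otimes \mathbf{x}^T$, and a least-squares solver simply picks one (the minimum-norm one) of its infinitely many solutions.

```python
import numpy as np

x = np.array([0.3, 0.7])
y = np.array([0.6, 0.4])

# The stacked system: M @ [a, b, c, d]^T = [y_0, y_1]^T, with M = I_2 kron x^T.
M = np.kron(np.eye(2), x)                     # shape (2, 4): underdetermined
abcd, *_ = np.linalg.lstsq(M, y, rcond=None)  # minimum-norm solution
A = abcd.reshape(2, 2)

print(np.allclose(A @ x, y))  # True, but infinitely many other A's also work
```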
$$ \left[ \begin{array}{cccc} x_{00} & x_{01} & 0 & 0 \\ 0 & 0 & x_{00} & x_{01} \\ x_{10} & x_{11} & 0 & 0 \\ 0 & 0 & x_{10} & x_{11} \end{array} \right] \left[ \begin{array}{c} a \\ b \\ c \\ d \end{array} \right] = \left[ \begin{array}{c} y_{00} \\ y_{01} \\ y_{10} \\ y_{11} \end{array} \right] $$
As long as $\mathbf{x}_0$ and $\mathbf{x}_1$ are linearly independent, the above system is uniquely solvable and gives you $\mathbf{A}$.
The above is easily extended to higher dimensions: for an $n \times n$ matrix $\mathbf{A}$, you need $n$ linearly independent known input vectors to pin it down uniquely.
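A sketch of that $n$-dimensional extension (my addition, using synthetic data): stack the $n$ known input vectors as the columns of a matrix $\mathbf{X}$ and the corresponding outputs as the columns of $\mathbf{Y}$; then $\mathbf{A}\mathbf{X} = \mathbf{Y}$, so $\mathbf{A} = \mathbf{Y}\mathbf{X}^{-1}$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A_true = rng.random((n, n))    # the unknown matrix we want to recover
X = rng.random((n, n))         # columns: n known input vectors x_k
Y = A_true @ X                 # columns: the corresponding outputs y_k

# If the columns of X are linearly independent, X is invertible
# and A is uniquely determined:
A = Y @ np.linalg.inv(X)
print(np.allclose(A, A_true))  # True: A is recovered exactly
```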