I have reduced a computational task to looking for solutions to problems of the form:
$$\mathbb{A}.\mathbb{X}=\mathbb{Y}\,,$$
where $\mathbb{A}$ and $\mathbb{Y}$ are $N\times(L+1)$ matrices. $\mathbb{A}$ is fully given, while $\mathbb{Y}$ is given except for undetermined entries in its last column. Finally, $\mathbb{X}$ is a real $(L+1)\times(L+1)$ square matrix to be solved for, subject to the constraint that its last row is $(0,\dots,0,1)$.
An example problem with $N=4$, $L=2$ might be:
$$ \left(\begin{array}{cc|c} 1&0&0\\ 0&1&2\\ 0&1&1\\ 1&1&1\\ \end{array}\right).\left( \begin{array}{cc|c} \alpha_1 & \alpha_2 & \alpha_3 \\ \beta_1 & \beta_2 & \beta_3 \\\hline 0 & 0 & 1 \end{array} \right) = \left(\begin{array}{cc|c} 1&0&0\\ 0&1&0\\ 0&1&Y\\ 1&1&0\\ \end{array}\right) $$
where the six variables $\alpha_1,\dots,\beta_3$, along with $Y$, need to be solved for.
Question
Has this kind of linear algebra problem been studied before? Or can this problem be reformulated in terms of something known?
Just multiply out the left-hand side and equate entries with the right-hand side. You will obtain a system of $12$ linear equations in $7$ unknowns. This system will have no solution, exactly one solution, or infinitely many solutions (depending on the specific details), and all solutions can be determined by standard methods such as Gaussian elimination.
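Concretely, for the example above, equating the $(i,j)$ entry of the product with the right-hand side gives one linear equation per entry:
$$A_{i1}\,\alpha_j + A_{i2}\,\beta_j + A_{i3}\,\delta_{j,3} = Y_{ij}\,,\qquad 1\le i\le 4,\ 1\le j\le 3\,,$$
where the $\delta_{j,3}$ term accounts for the fixed last row $(0,0,1)$ of $\mathbb{X}$, and the single undetermined entry $Y_{33}=Y$ is itself one of the seven unknowns.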
In this case, for example, multiplying out the top row immediately gives $\alpha_1=1$, $\alpha_2=0$, $\alpha_3=0$. But note that these may be contradicted by other equations, in which case there would be no solution. I haven't pursued the specific calculations any further; I'll leave that to you.
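As a sketch of the "standard methods" step, the example system can be assembled and solved numerically with NumPy. This is only one way to set it up (a least-squares solve, so an inconsistent system shows up as a nonzero residual); the unknown $Y$ is simply moved to the left-hand side as a seventh variable:

```python
import numpy as np

# Example from the question: N = 4, L = 2.
A = np.array([[1, 0, 0],
              [0, 1, 2],
              [0, 1, 1],
              [1, 1, 1]], dtype=float)
N, Lp1 = A.shape  # Lp1 = L + 1

# Right-hand side; None marks the undetermined entry Y (row 3, col 3).
B = [[1, 0, 0],
     [0, 1, 0],
     [0, 1, None],
     [1, 1, 0]]

# Unknown vector u = (alpha_1..alpha_3, beta_1..beta_3, Y): 7 unknowns.
# The (i, j) entry of the product gives the equation
#   A[i,0]*alpha_{j+1} + A[i,1]*beta_{j+1} + A[i,2]*[j == L] = B[i][j],
# with Y moved to the left-hand side where B[i][j] is undetermined.
n_unknowns = 2 * Lp1 + 1          # six X-entries plus Y
M = np.zeros((N * Lp1, n_unknowns))
rhs = np.zeros(N * Lp1)
for i in range(N):
    for j in range(Lp1):
        eq = i * Lp1 + j
        M[eq, j] = A[i, 0]        # coefficient of alpha_{j+1}
        M[eq, Lp1 + j] = A[i, 1]  # coefficient of beta_{j+1}
        const = A[i, 2] if j == Lp1 - 1 else 0.0  # fixed last row of X
        if B[i][j] is None:       # unknown Y: move it to the left
            M[eq, -1] = -1.0
            rhs[eq] = -const
        else:
            rhs[eq] = B[i][j] - const

# Least-squares solve; a residual of (essentially) zero means the
# 12-equation system is consistent, a nonzero residual means it is not.
u, _, rank, _ = np.linalg.lstsq(M, rhs, rcond=None)
residual = np.linalg.norm(M @ u - rhs)
print("solution:", np.round(u, 6))
print("residual:", residual, "rank:", rank)
```

For this particular example the system is in fact inconsistent: the first two rows of $\mathbb{A}$ force $\alpha_3=0$ and $\beta_3=-2$, while the last row requires $\alpha_3+\beta_3+1=0$, i.e. $\alpha_3+\beta_3=-1$. The script accordingly reports a nonzero residual.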