While doing my research, I came across the following matrix equation in $W \in \mathbb{R}^{n \times d}$, which I could not solve.
$$ \sum_{i=0}^{t} \left( X_{i} W Y_{i} + X'_{i} W Y'_{i} \right) = Z $$
where
$X_{i}, X'_{i} \in \mathbb{R}^{n \times d \times n}$
$Y_{i}, Y'_{i} \in \mathbb{R}^{d \times 1}$
$Z \in \mathbb{R}^{n \times d \times 1}$
There is no relation between $t$, $n$ and $d$.
I tried substituting $K_{i} = WY_{i}$ and $J_{i} = WY'_{i}$ and solving for those unknowns by least squares, hoping to recover $W$ from them afterwards, but the resulting $K_{i}$ and $J_{i}$ were not correct. Is there an analytical way to solve for $W$?
$ \def\o{{\tt1}} \def\bbR#1{{\mathbb R}^{#1}} \def\qiq{\quad\implies\quad} \def\LR#1{\left(#1\right)} \def\shape#1{\operatorname{Reshape}\LR{#1}} \def\vc#1{\operatorname{vec}\LR{#1}} $Define a third-order tensor $\nu$ with components $$\eqalign{ \nu_{\ell jk} &= \begin{cases} \o\quad{\rm if}\;\;\ell=j+kn-n \\ 0\quad{\rm otherwise} \end{cases} \qquad \qquad \\ }$$ $$\eqalign{ 1&\le\; j \;&\le n &\qquad\big({\rm the}\;row\;{\rm index}\big) \\ 1&\le\; k \;&\le d &\qquad\big({\rm the}\;column\;{\rm index}\big) \\ 1&\le\; \ell \;&\le n\!\cdot\!d &\qquad\big({\rm the}\;long\;{\rm index}\big) \\ }$$ A double-dot product $(:)$ with this tensor will reshape any matrix $A\in\bbR{n\times d}\,$ into a vector $a\in\bbR{nd\times\o}$ while preserving each component $$\eqalign{ &a = \nu:A = \vc{A} &\qiq a_\ell = \sum_{j=\o}^n\sum_{k=\o}^d \nu_{\ell jk}\,A_{jk} \\ &M_i = \nu:X_i &\qiq (M_i)_{\ell p} = \sum_{j=\o}^n\sum_{k=\o}^d \nu_{\ell jk}\,(X_i)_{jkp} \\ }$$ In the second line, $\nu$ reshapes a tensor $\in\bbR{n\times d\times n}\,$ into a lower-order tensor $\in\bbR{nd\times n}$.
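As a quick sanity check, here is a minimal NumPy sketch of the $\nu$ tensor and the double-dot product (sizes and the variable names are my own, chosen for illustration). Since $\ell=j+kn-n$, the resulting $\vc{A}$ is the column-major flattening of $A$:

```python
import numpy as np

n, d = 3, 2

# Build nu: nu[l, j, k] = 1 iff l = j + k*n (0-based indices),
# which matches l = j + k*n - n in the 1-based notation above.
nu = np.zeros((n * d, n, d))
for j in range(n):
    for k in range(d):
        nu[j + k * n, j, k] = 1.0

A = np.random.randn(n, d)
a = np.einsum('ljk,jk->l', nu, A)   # double-dot product nu : A

# nu : A equals vec(A), i.e. column-major (Fortran-order) flattening
assert np.allclose(a, A.flatten(order='F'))
```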
Use this tensor to reshape the variables $(X_k,\,X_k^\prime,\,Z)$ into ordinary matrix and vector variables $$\eqalign{ M_k &= \nu:X_k,\qquad N_k &= \nu:X_k^\prime,\qquad z &= \nu:Z }$$ This turns the entire equation into an ordinary matrix-vector equation, which can be vectorized via the identity $\vc{MWY}=\LR{Y^T\otimes M}\vc{W}$ to isolate the $W$ matrix as a vector, which can then be reshaped back into a matrix $$\eqalign{ &\vc{\sum_k M_kWY_k + N_kWY_k^\prime = z} \\ &\LR{\sum_k Y_k^T\otimes M_k + Y_k^{\prime T}\otimes N_k}w = z \\ &w = \LR{\sum_k Y_k^T\otimes M_k + Y_k^{\prime T}\otimes N_k}^{-1}z \\ &W = \shape{w,n,d} \;=\; w\cdot \nu \\ }$$ If the bracketed $nd\times nd$ matrix happens to be singular, replace the inverse by a pseudoinverse, i.e. take the least-squares solution. In the last line, note that the $\nu$ tensor can also be used to reshape the $\,\bbR{nd\times\o}$ vector into a $\,\bbR{n\times d}$ matrix using a standard dot product.
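To illustrate the whole pipeline, here is a self-contained NumPy sketch on random data (sizes, seed, and all variable names are arbitrary choices of mine). The $\nu$ double-dot product is implemented as a column-major reshape of the first two axes, and a known $W$ is recovered from the vectorized system:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, t = 4, 3, 5                      # arbitrary illustrative sizes

# Random data with the shapes from the question
X  = rng.standard_normal((t, n, d, n))
Xp = rng.standard_normal((t, n, d, n))
Y  = rng.standard_normal((t, d, 1))
Yp = rng.standard_normal((t, d, 1))
W_true = rng.standard_normal((n, d))   # ground truth to recover

# nu : X_i is a column-major (Fortran-order) reshape of the first two axes
M = [X[i].reshape(n * d, n, order='F') for i in range(t)]
N = [Xp[i].reshape(n * d, n, order='F') for i in range(t)]

# Forward model in reshaped form: z = sum_i (M_i W Y_i + N_i W Y'_i)
z = sum(M[i] @ W_true @ Y[i] + N[i] @ W_true @ Yp[i] for i in range(t))

# Vectorize: (sum_i Y_i^T (x) M_i + Y'_i^T (x) N_i) w = z
S = sum(np.kron(Y[i].T, M[i]) + np.kron(Yp[i].T, N[i]) for i in range(t))
w = np.linalg.solve(S, z)              # assumes S is nonsingular
W = w.reshape(n, d, order='F')         # vec^{-1}: back to an n x d matrix

assert np.allclose(W, W_true)
```

For a singular or non-square system, `np.linalg.lstsq(S, z)` gives the least-squares/pseudoinverse solution in place of `solve`.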