This recent answer contained an interesting Kronecker decomposition of the form $$\eqalign{ &A = \sum_{i,j} C_{ij}\otimes E_{ij} \;\in\;{\mathbb R}^{mp\times nq} \\ &C_{ij} \in{\mathbb R}^{m\times n}\quad\big({\rm Coefficient\,Matrices}\big) \\ &E_{ij} \in{\mathbb R}^{p\times q}\;\quad\big({\rm Standard\,Basis\,Matrices}\big) }$$ This decomposition has two trivial cases.
When $m=n=1$, the coefficients are simply scalars equal to the components of the matrix $$C_{ij}=A_{ij}$$ When $p=q=1$, there is only one matrix-valued coefficient, equal to the whole matrix $$C_{11}=A$$ But what is the algorithm/formula for calculating the coefficient matrices in the general case?
Define the standard basis vectors using an index that acts as a mnemonic for their dimensionality, i.e. $$e_j\in{\mathbb R}^{J},\quad e_k\in{\mathbb R}^{K},\quad{\rm etc.}$$ The Kronecker product of two basis vectors yields a basis vector in a higher-dimensional space $$\eqalign{ e_\ell &= {\rm vec}(e_ke_j^T) = e_j\otimes e_k \\ }$$ and reveals the following relationship among the three indices $$\eqalign{ &\ell = k + (j-1)K,\quad &j = 1 + {\rm div}(\ell-1,K),\quad &k = 1 + {\rm mod}(\ell-1,K) \\ }$$ First, expand an arbitrary vector in the standard basis. $$\eqalign{ a &\in {\mathbb R}^{L},\qquad L=JK \\ a &= \sum_{\ell=1}^{L} a_\ell e_\ell \;=\; \sum_{j=1}^{J}\sum_{k=1}^{K} a_{(jK-K+k)}\; e_j\otimes e_k \\ }$$ Next, expand an arbitrary matrix in terms of the standard matrix basis. $$\eqalign{ A &\in {\mathbb R}^{L\times P},\qquad L=JK,\;P=MN,\quad E_{jm} \in {\mathbb R}^{J\times M},\quad E_{kn} \in {\mathbb R}^{K\times N} \\ A &= \sum_{\ell=1}^{L} \sum_{p=1}^{P} A_{\ell p}\; E_{\ell p} \;=\; \sum_{\ell=1}^{L} \sum_{p=1}^{P} A_{\ell p}\; e_\ell e_p^T \\ &= \left(\sum_{j=1}^{J}\sum_{k=1}^{K}\right)\left(\sum_{m=1}^{M}\sum_{n=1}^{N}\right) A_{(jK-K+k)(mN-N+n)}\; (e_j\otimes e_k) (e_m\otimes e_n)^T \\ &= \sum_{j=1}^{J}\sum_{k=1}^{K} \sum_{m=1}^{M}\sum_{n=1}^{N} A_{(jK-K+k)(mN-N+n)}\; E_{jm}\otimes E_{kn} \\ }$$ So this is the $(JM\times KN)$ decomposition. There are also $(KN\times JM)$, $\,(JN\times KM)$, and $\,(KM\times JN)$ decompositions. In fact, there is a decomposition corresponding to every possible factorization of the integers $L$ and $P$.
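The index relation $\ell = k + (j-1)K$ and its inverses can be checked numerically. Here is a small sketch (not part of the original answer) using numpy, with the one-based indexing of the text:

```python
import numpy as np

J, K = 3, 4
L = J * K

def e(i, n):
    """Standard basis vector e_i in R^n (one-based index i)."""
    v = np.zeros(n)
    v[i - 1] = 1.0
    return v

for j in range(1, J + 1):
    for k in range(1, K + 1):
        ell = k + (j - 1) * K
        # e_j (x) e_k should equal e_ell in R^L
        assert np.array_equal(np.kron(e(j, J), e(k, K)), e(ell, L))
        # inverse maps: j = 1 + div(ell-1, K), k = 1 + mod(ell-1, K)
        assert j == 1 + (ell - 1) // K
        assert k == 1 + (ell - 1) % K
print("index relations verified")
```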
So to answer the question that I posed (with slightly different indexing), the coefficient matrices of the decompositions $$\eqalign{ A &= \sum_{j=1}^J\sum_{m=1}^M E_{jm}\otimes B_{jm} \\ &= \sum_{k=1}^K\sum_{n=1}^N C_{kn}\otimes E_{kn} \\ }$$ are given by $$\eqalign{ B_{jm} &= \left(\sum_{k=1}^{K}\sum_{n=1}^{N} A_{(jK-K+k)(mN-N+n)} \; E_{kn}\right)&\in {\mathbb R}^{K\times N} \\ C_{kn} &= \left(\sum_{j=1}^{J}\sum_{m=1}^{M} A_{(jK-K+k)(mN-N+n)} \; E_{jm}\right)&\in {\mathbb R}^{J\times M} \\ }$$ Often, it is the traces of these coefficients which are of primary interest; these exist when the coefficient matrices are square (i.e. $K=N$ for $B_{jm}$ and $J=M$ for $C_{kn}$). $$\eqalign{ {\rm Tr}(B_{jm}) &= \sum_{k=1}^{K} A_{(jK-K+k)(mN-N+k)} \\ {\rm Tr}(C_{kn}) &= \sum_{j=1}^{J} A_{(jK-K+k)(jN-N+n)} \\ }$$
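In code, both families of coefficient matrices fall out of a single reshape of $A$ into a four-index array. The following numpy sketch (my own, not from the linked answer; the names `B_jm` and `C_kn` mirror the formulas above) extracts the coefficients and verifies that both decompositions rebuild $A$:

```python
import numpy as np

J, K, M, N = 2, 3, 2, 2
A = np.random.rand(J * K, M * N)

# View A[(j-1)K + k, (m-1)N + n] as a 4-index array T[j, k, m, n] (zero-based)
T = A.reshape(J, K, M, N)

def E(i, j, rows, cols):
    """Standard basis matrix E_{ij} (zero-based indices)."""
    Eij = np.zeros((rows, cols))
    Eij[i, j] = 1.0
    return Eij

recon_B = np.zeros_like(A)
recon_C = np.zeros_like(A)
for j in range(J):
    for m in range(M):
        B_jm = T[j, :, m, :]                 # coefficient in R^{K x N}
        recon_B += np.kron(E(j, m, J, M), B_jm)
for k in range(K):
    for n in range(N):
        C_kn = T[:, k, :, n]                 # coefficient in R^{J x M}
        recon_C += np.kron(C_kn, E(k, n, K, N))

assert np.allclose(recon_B, A) and np.allclose(recon_C, A)
print("both decompositions rebuild A")
```

Note that extracting $B_{jm}$ fixes the $(j,m)$ indices and sweeps $(k,n)$, while $C_{kn}$ does the opposite, exactly as in the two summation formulas.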
An important special case occurs when $N=1$ $$\eqalign{ E_{kn} &= e_k \\ C_{kn} &= C_k \\ A &= \sum_{k=1}^K C_{k}\otimes e_{k} \\ }$$ The coefficient matrices and their traces (the latter when $J=M$) reduce to $$\eqalign{ C_{k} &= \sum_{j=1}^{J}\sum_{m=1}^{M} A_{(jK-K+k)(m)}\; E_{jm} \\ {\rm Tr}(C_{k}) &= \sum_{j=1}^{J} A_{(jK-K+k)(j)} \\ }$$ Repeating this analysis for $M=1$ yields $$\eqalign{ E_{jm} &= e_j \\ B_{jm} &= B_j \\ A &= \sum_{j=1}^J e_{j}\otimes B_{j} \\ B_{j} &= \sum_{k=1}^{K}\sum_{n=1}^{N} A_{(jK-K+k)(n)}\; E_{kn} \\ {\rm Tr}(B_{j}) &= \sum_{k=1}^{K} A_{(jK-K+k)(k)} \\ }$$
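The $N=1$ case has a particularly direct implementation: the formula for $C_k$ says that $C_k$ collects every $K$-th row of $A$ starting at row $k$. A short numpy sketch of this (again my own illustration, not from the original answer):

```python
import numpy as np

J, K, M = 3, 4, 2
A = np.random.rand(J * K, M)   # here N = 1, so A has M columns

recon = np.zeros_like(A)
for k in range(K):             # zero-based k
    C_k = A[k::K, :]           # rows k, k+K, ..., k+(J-1)K  ->  R^{J x M}
    e_k = np.zeros((K, 1))
    e_k[k, 0] = 1.0
    recon += np.kron(C_k, e_k)

assert np.allclose(recon, A)
print("N=1 special case verified")
```

The $M=1$ case is the mirror image: $B_j$ is the $j$-th block of $K$ consecutive rows, i.e. `A[j*K:(j+1)*K, :]`.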