Let $A,B$ be real $n\times n$ matrices (for complex entries, replace each transpose below with the Hermitian conjugate). Write them as arrays of columns, $A=\begin{pmatrix} A_1~|&\cdots& |~A_n \end{pmatrix}$ and $B=\begin{pmatrix} B_1~|&\cdots& |~B_n \end{pmatrix}$. The matrix product $A^TB$ is then the array of the pairwise inner products of the columns: $$(A^TB)_{ij}=A_i^TB_j$$

I was wondering if there is something analogous for an array of the outer products $A_iB_j^T$. Each of these is itself an $n\times n$ matrix. Specifically, it can be expressed as a Kronecker product: $A_iB_j^T=B_j^T\otimes A_i$. So the desired array, let's call it $A\odot B$ for lack of better notation, is an $n\times n$ block matrix whose blocks are themselves of dimension $n\times n$: $$(A\odot B)_{ij}:=B_j^T\otimes A_i$$ Overall, we can think of this as a matrix of dimension $n^2\times n^2$.

I was wondering if one can express $A\odot B$ in terms of the Kronecker product of the entire matrices $A$ and $B$. In the case $n=2$, I was able to check by hand that $$A\odot B=\begin{pmatrix}1&0&0&0\\0&0&1&0\\0&1&0&0\\0&0&0&1\end{pmatrix}\big(A\otimes B^T\big)=\big(B^T\otimes A\big)\begin{pmatrix}1&0&0&0\\0&0&1&0\\0&1&0&0\\0&0&0&1\end{pmatrix}$$
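As a sanity check on the definitions above, here is a short numerical sketch (NumPy assumed; all variable names are mine): it verifies that $A^TB$ collects the column inner products, that $A_iB_j^T=B_j^T\otimes A_i$, and it assembles $A\odot B$ block by block.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# The array of pairwise inner products of the columns is A^T B.
G = np.array([[A[:, i] @ B[:, j] for j in range(n)] for i in range(n)])
assert np.allclose(G, A.T @ B)

# Each outer product A_i B_j^T equals the Kronecker product B_j^T ⊗ A_i.
for i in range(n):
    for j in range(n):
        outer = np.outer(A[:, i], B[:, j])
        kron = np.kron(B[:, j].reshape(1, n), A[:, i].reshape(n, 1))
        assert np.allclose(outer, kron)

# Assemble A ⊙ B: the n x n block matrix whose (i, j) block is A_i B_j^T.
AoB = np.block([[np.outer(A[:, i], B[:, j]) for j in range(n)]
                for i in range(n)])
print(AoB.shape)  # an n^2 x n^2 matrix
```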
This is a bit ad hoc: I basically wrote out all the coefficients and reverse-engineered how to get from one arrangement to the other, and I'm not sure how to generalize this to more than $2$ columns. It's also quite possible that I made a mistake, given the huge number of indices involved. Is there a better way to think about this? Although the definition of $A\odot B$ seems to be a natural "dual" to matrix multiplication, I wasn't able to find anything about it.
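One part of the $n=2$ display that does hold for every $n$ is the relation between its two right-hand sides: the permutation appearing there is the commutation matrix $K$, defined by $K\,\mathrm{vec}(X)=\mathrm{vec}(X^T)$, and it satisfies $K(X\otimes Y)K=Y\otimes X$, hence $K(A\otimes B^T)=(B^T\otimes A)K$ since $K^2=I$. A numerical sketch of just this relation (NumPy assumed; the helper name is mine):

```python
import numpy as np

def commutation_matrix(n):
    # K is the n^2 x n^2 permutation with K @ vec(X) = vec(X^T),
    # where vec stacks columns (column-major / Fortran order).
    K = np.zeros((n * n, n * n))
    for i in range(n):
        for j in range(n):
            K[i * n + j, j * n + i] = 1.0
    return K

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
K = commutation_matrix(n)

# K @ vec(X) = vec(X^T), and K is an involution.
X = rng.standard_normal((n, n))
assert np.allclose(K @ X.flatten(order='F'), X.T.flatten(order='F'))
assert np.allclose(K @ K, np.eye(n * n))

# K swaps Kronecker factors, so K (A ⊗ B^T) = (B^T ⊗ A) K.
assert np.allclose(K @ np.kron(A, B.T), np.kron(B.T, A) @ K)
```

For $n=2$, `commutation_matrix(2)` reproduces exactly the $4\times4$ permutation matrix in the display above.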
For context, the reason I became interested in this question is that I was trying to translate orthogonality/completeness relations into matrix form. For example, the set of vectors $\{A_i\}_{i=1}^n$ is orthonormal iff $A_i^TA_j=\delta_{ij}$; in matrix form, this says that $A$ satisfies $A^TA=I$. To be an orthonormal basis, it must also satisfy the completeness relation (resolution of the identity) $I=\sum_{i=1}^nA_iA_i^T$. This equation can be interpreted in matrix form by saying that the partial (block-wise) trace of $A\odot A$, i.e. the sum of its diagonal blocks, equals the identity matrix. In this situation, we could perhaps think of $A\odot A$ as a block version of the concept of a density matrix.
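The orthonormality and completeness statements are easy to check numerically. A minimal sketch (NumPy assumed), using a random orthogonal matrix so that both relations hold:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
# A random orthogonal matrix via QR: its columns form an orthonormal basis.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Orthonormality: Q^T Q = I.
assert np.allclose(Q.T @ Q, np.eye(n))

# Completeness: sum_i Q_i Q_i^T = I. Equivalently, the block-wise partial
# trace of Q ⊙ Q (the sum of its diagonal n x n blocks) is the identity.
QoQ = np.block([[np.outer(Q[:, i], Q[:, j]) for j in range(n)]
                for i in range(n)])
partial_trace = sum(QoQ[i*n:(i+1)*n, i*n:(i+1)*n] for i in range(n))
assert np.allclose(partial_trace, np.eye(n))
```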
As we know: $$ A=\pmatrix{a_{11}&a_{12}\\a_{21}&a_{22}}\,,\quad B=\pmatrix{b_{11}&b_{12}\\b_{21}&b_{22}} \,, $$ $$ A\otimes B=\pmatrix{a_{11}B&a_{12}B\\a_{21}B&a_{22}B} =\pmatrix{a_{11}b_{11}&a_{11}b_{12}&a_{12}b_{11}&a_{12}b_{12}\\ a_{11}b_{21}&a_{11}b_{22}&a_{12}b_{21}&a_{12}b_{22}\\ a_{21}b_{11}&a_{21}b_{12}&a_{22}b_{11}&a_{22}b_{12}\\ a_{21}b_{21}&a_{21}b_{22}&a_{22}b_{21}&a_{22}b_{22}}\,. $$ Writing $$ A=\pmatrix{A_1&A_2}\,,\quad B=\pmatrix{B_1&B_2} \,, $$ where $A_1,A_2,B_1,B_2$ are columns, we have $$ A_1\otimes B_1^T=\pmatrix{a_{11}b_{11}&a_{11}b_{21}\\a_{21}b_{11}&a_{21}b_{21}}\,. $$ But this is not a block of $A\otimes B$: every block of $A\otimes B$ is a scalar multiple $a_{ij}B$ of $B$. If, instead of the column vectors of $B$, we use its row vectors, $$ B=\pmatrix{B_1\\B_2}\,, $$ we get $$ A_1\otimes B_1=\pmatrix{a_{11}b_{11}&a_{11}b_{12}\\a_{21}b_{11}&a_{21}b_{12}}\,, $$ which is not a block of $A\otimes B$ either.
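The two claims above can be confirmed with a short numerical check (NumPy assumed): neither the column–column outer product $A_1\otimes B_1^T$ nor the column–row product $A_1\otimes B_1$ coincides with any $2\times2$ block of $A\otimes B$, since every such block is a scalar multiple of $B$.

```python
import numpy as np

A = np.array([[1., 2.], [3., 4.]])
B = np.array([[5., 6.], [7., 8.]])

# The four 2x2 blocks of A ⊗ B are the scalar multiples a_{ij} * B.
AkB = np.kron(A, B)
blocks = [AkB[2*i:2*i+2, 2*j:2*j+2] for i in range(2) for j in range(2)]
for i in range(2):
    for j in range(2):
        assert np.allclose(blocks[2*i + j], A[i, j] * B)

# First column of A with first column of B: A_1 ⊗ B_1^T = A_1 B_1^T.
outer_cols = np.outer(A[:, 0], B[:, 0])
# First column of A with first *row* of B: A_1 ⊗ B_1 (rows of B).
outer_row = np.outer(A[:, 0], B[0, :])

# Neither matrix appears among the blocks of A ⊗ B.
assert not any(np.allclose(outer_cols, blk) for blk in blocks)
assert not any(np.allclose(outer_row, blk) for blk in blocks)
```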