In full disclosure, I asked this question on Quora anonymously because I thought the answer would be embarrassingly self-evident. Given the single, tangential (albeit very interesting) answer there, I want to maximize my chances by posting the question here, together with an immediately related follow-up question.
- Before convolving a filter with an image, or a kernel with a layer in convolutional neural networks, the filter (or kernel) is flipped along both its rows and its columns. I am looking for the name of this flipped matrix.
The matrix $\begin{bmatrix}a&b\\c&d\end{bmatrix}=\begin{bmatrix}\color{red}{\blacksquare}&\color{blue}{\blacksquare}\\\color{green}{\blacksquare}&\color{aqua}{\blacksquare}\end{bmatrix}$ with flipped columns and rows would be $\begin{bmatrix}d&c\\b&a\end{bmatrix}=\begin{bmatrix}\color{aqua}{\blacksquare}&\color{green}{\blacksquare}\\\color{blue}{\blacksquare}&\color{red}{\blacksquare}\end{bmatrix}.$
To be clear (given prior experience, even if it is probably unnecessary): I am not asking for the transpose:
$\begin{bmatrix}a&b\\c&d\end{bmatrix}^\top=\begin{bmatrix}a&c\\b&d\end{bmatrix}=\begin{bmatrix}\color{red}{\blacksquare}&\color{green}{\blacksquare}\\\color{blue}{\blacksquare}&\color{aqua}{\blacksquare}\end{bmatrix}.$
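For concreteness, this row-and-column flip is just a 180° rotation of the matrix; a quick sketch (NumPy is my assumption here, since the question is framed around CNNs):

```python
import numpy as np

A = np.array([["a", "b"],
              ["c", "d"]])

# Flip both rows and columns (a 180-degree rotation) -- this is the
# kernel flip used in convolution, NOT the transpose.
flipped = np.flip(A)        # same as np.rot90(A, 2)

print(flipped)    # [['d' 'c'] ['b' 'a']]
print(A.T)        # [['a' 'c'] ['b' 'd']] -- the transpose, for contrast
```

Note that `np.flip` with no `axis` argument reverses every axis, which for a 2-D array is exactly the double flip above.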
- Immediately after obtaining this flipped filter or kernel, the convolution consists of summing all the entries of a Hadamard product, which really is a sort of "dot product of matrices". What is the name of this matrix operation in general:
$$\text{elementwise}\sum\left(\begin{bmatrix}a&b\\c&d\end{bmatrix}\circ \begin{bmatrix}z&w\\v &y\end{bmatrix}\right)=\text{elementwise}\sum\begin{bmatrix}az&bw\\cv&dy\end{bmatrix}=az+bw+cv+dy$$
?
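Numerically, the operation is just an elementwise product followed by a total sum; a minimal NumPy sketch (the example matrices are my own):

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])
B = np.array([[5., 6.],
              [7., 8.]])

# Hadamard (elementwise) product, then sum all entries:
result = np.sum(A * B)   # 1*5 + 2*6 + 3*7 + 4*8 = 70
print(result)            # 70.0
```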
Thanks to @Omnomnomnom, the answer to the second question is the Frobenius inner product:

For $A = \begin{bmatrix}a&b\\c&d\end{bmatrix}$ and $B = \begin{bmatrix}z&w\\v &y\end{bmatrix}$, the Frobenius inner product $\langle A,B \rangle_\mathbf F$ takes two matrices and returns a number:
$$\small\sum_{\text{el.wise}}\left(\begin{bmatrix}a&b\\c&d\end{bmatrix}\circ \begin{bmatrix}z&w\\v &y\end{bmatrix}\right) = \operatorname{tr}\left(\begin{bmatrix}a&b\\c&d\end{bmatrix}^\top \begin{bmatrix}z&w\\v &y\end{bmatrix}\right)=\operatorname{tr}\left(\begin{bmatrix}a&c\\b&d\end{bmatrix} \begin{bmatrix}z&w\\v &y\end{bmatrix}\right)= az+cv+bw+dy$$
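The trace identity is easy to spot-check numerically (NumPy again, with random matrices as a stand-in):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2))
B = rng.standard_normal((2, 2))

frobenius = np.sum(A * B)        # elementwise sum of the Hadamard product
via_trace = np.trace(A.T @ B)    # tr(A^T B)

print(np.isclose(frobenius, via_trace))   # True
```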
and thanks to @Hurkyl:
$$\small\sum_{\text{el.wise}}\left(\begin{bmatrix}a&b\\c&d\end{bmatrix}\circ \begin{bmatrix}z&w\\v &y\end{bmatrix}\right)=\small\sum_{\text{el.wise}}\left(\begin{bmatrix}az&bw\\cv&dy\end{bmatrix}\right)=\begin{bmatrix}1&1\end{bmatrix}\begin{bmatrix}az&bw\\cv&dy\end{bmatrix}\begin{bmatrix}1\\1\end{bmatrix}=\tiny az+cv+bw+dy $$
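The all-ones sandwich can be checked the same way (with example matrices of my own choosing):

```python
import numpy as np

A = np.array([[1., 2.], [3., 4.]])
B = np.array([[5., 6.], [7., 8.]])
ones = np.ones(2)

# [1 1] (A o B) [1; 1] sums every entry of the Hadamard product
sandwiched = ones @ (A * B) @ ones
print(sandwiched)   # 70.0
```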
I do not know of any name for the operation, so I will try to make an argument for why it could be called a "reflected transpose".
Consider the $3\times 3$ matrix:
$${\bf A} = \left[\begin{array}{ccc}1&2&3\\4&5&6\\7&8&9\end{array}\right]$$
Using column-stacking (column-major) vectorization, we get:
$$\mathrm{vec}({\bf A}) = \left[\begin{array}{ccccccccc} 1&4&7&2&5&8&3&6&9 \end{array}\right]^T$$
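In NumPy terms, this column-stacking vectorization is a Fortran-order reshape (NumPy being my assumption here):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# Column-stacking vectorization: read the matrix column by column
# ("Fortran order" in NumPy terms).
vecA = A.reshape(-1, order="F")
print(vecA)   # [1 4 7 2 5 8 3 6 9]
```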
We can now define the reflection in the vectorized domain as the "flipped identity" matrix:
$${\bf R} = \left[\begin{array}{ccccccccc}0&0&0&0&0&0&0&0&1\\0&0&0&0&0&0&0&1&0\\0&0&0&0&0&0&1&0&0\\0&0&0&0&0&1&0&0&0\\0&0&0&0&1&0&0&0&0\\0&0&0&1&0&0&0&0&0\\ 0&0&1&0&0&0&0&0&0\\0&1&0&0&0&0&0&0&0\\1&0&0&0&0&0&0&0&0\end{array}\right]$$
The matrix you seek is then:
$$\textrm{vec}^{-1}({\bf R}\,\textrm{vec}({\bf A})) = \left[\begin{array}{ccc}9&8&7\\6&5&4\\3&2&1\end{array}\right],$$
the special reflection $\bf R$ applied inside the vectorization. Taking the transpose instead reflects $\bf A$ across its anti-diagonal:
$$(\textrm{vec}^{-1}({\bf R}\,\textrm{vec}({\bf A})))^T = \left[\begin{array}{ccc}9&6&3\\8&5&2\\7&4&1\end{array}\right]$$
Since the sought matrix is therefore the anti-diagonal reflection of ${\bf A}^T$, one could, in some sense, call it a reflected transpose.
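A quick numerical check of the reflection in the vectorized domain (a NumPy sketch; `vec` and `unvec` are hypothetical helper names for the column-stacking vectorization and its inverse):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# Column-stacking vec and its inverse (hypothetical helpers for this sketch)
def vec(M):
    return M.reshape(-1, order="F")

def unvec(v):
    return v.reshape(3, 3, order="F")

R = np.fliplr(np.eye(9, dtype=int))   # the "flipped identity" reflection

rotated = unvec(R @ vec(A))
print(rotated)
# [[9 8 7]
#  [6 5 4]
#  [3 2 1]]
print(np.array_equal(rotated, np.rot90(A, 2)))   # True
```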
We can of course perform all of these operations inside the vectorization, since the transpose itself corresponds to a permutation matrix $\bf T$:
$${\bf T} = \left[\begin{array}{ccccccccc}1&0&0&0&0&0&0&0&0\\0&0&0&1&0&0&0&0&0\\0&0&0&0&0&0&1&0&0\\0&1&0&0&0&0&0&0&0\\0&0&0&0&1&0&0&0&0\\0&0&0&0&0&0&0&1&0\\0&0&1&0&0&0&0&0&0\\0&0&0&0&0&1&0&0&0\\0&0&0&0&0&0&0&0&1\end{array}\right]$$
And we get the same anti-diagonal reflection, computed entirely in the vectorized domain:
$$\textrm{vec}^{-1}({\bf TR}\textrm{vec}({\bf A})) = \left[\begin{array}{ccc}9&6&3\\8&5&2\\7&4&1\end{array}\right]$$
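The composition can be sketched the same way, with $\bf T$ built as the commutation (transpose) permutation (again a NumPy sketch with the same hypothetical `vec`/`unvec` helper names):

```python
import numpy as np

n = 3
A = np.arange(1, n * n + 1).reshape(n, n)   # [[1,2,3],[4,5,6],[7,8,9]]

def vec(M):
    return M.reshape(-1, order="F")

def unvec(v):
    return v.reshape(n, n, order="F")

# Commutation (transpose) permutation: T @ vec(M) == vec(M.T)
T = np.zeros((n * n, n * n), dtype=int)
for i in range(n):
    for j in range(n):
        T[i * n + j, j * n + i] = 1

R = np.fliplr(np.eye(n * n, dtype=int))   # the reflection from before

print(unvec(T @ R @ vec(A)))
# [[9 6 3]
#  [8 5 2]
#  [7 4 1]]
```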