A ConvNet (convolutional neural network) is a class of neural networks that uses kernels to extract position-dependent features from input data and performs further classification or regression on them.
Simply put, a kernel works like this. For some data $D$ padded with a "border" of zeros:
$D = \begin{bmatrix}0 & 0 & 0 & 0 & 0\\0 & d_{11} & d_{12} & d_{13} & 0\\0 & d_{21} & d_{22} & d_{23} & 0\\0 & d_{31} & d_{32} & d_{33} & 0\\0 & 0 & 0 & 0 & 0\end{bmatrix}$
there may be a kernel $K$:
$K = \begin{bmatrix}k_{11} & k_{12} & k_{13}\\k_{21} & k_{22} & k_{23}\\k_{31} & k_{32} & k_{33}\end{bmatrix}$
so that the result of applying the convolution with kernel $K$ to $D$ is a matrix whose element $(i, j)$ is the sum of the elements of the Hadamard product $K \circ D_{ij}$, where $D_{ij}$ is the $3 \times 3$ block of $D$ centered at $d_{ij}$.
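The operation above can be sketched in a few lines of NumPy. This is a minimal illustrative implementation, not an optimized one; it follows the cross-correlation convention common in ConvNet libraries (the kernel is not flipped), and the function name `conv2d_same` is my own:

```python
import numpy as np

def conv2d_same(D, K):
    """Apply a 3x3 kernel K to D with a zero border, producing an
    output the same shape as D (cross-correlation convention)."""
    P = np.pad(D, 1)                       # "border" of zeros around D
    out = np.empty_like(D, dtype=float)
    for i in range(D.shape[0]):
        for j in range(D.shape[1]):
            block = P[i:i + 3, j:j + 3]    # 3x3 block centered at d_ij
            out[i, j] = np.sum(K * block)  # sum of the Hadamard product
    return out

D = np.arange(1, 10).reshape(3, 3).astype(float)
K = np.ones((3, 3)) / 9.0                  # example: a 3x3 box-blur kernel
print(conv2d_same(D, K))
```

In practice one would use a library routine (e.g. `scipy.signal.convolve2d` with `mode="same"`), but the explicit loop makes the block-by-block structure of the definition visible.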
The question is: can a convolution serve as a hash function meeting the criteria of uniformity and irreversibility? If so, what classes of kernels might provide such behavior? Certainly there are edge cases: if $K$ is all zeros except for a 1 in the central element, the convolution reduces to the identity function, which is reversible and not uniformly distributed.
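The identity-kernel edge case is easy to verify numerically. The sketch below (assuming the same zero-padded $3 \times 3$ setup as above) checks that a kernel with a single 1 at its center reproduces the input exactly:

```python
import numpy as np

# Edge case: a kernel that is all zeros except a 1 at the center
# reduces the convolution to the identity map.
D = np.random.default_rng(0).integers(0, 10, size=(3, 3)).astype(float)
K = np.zeros((3, 3)); K[1, 1] = 1.0        # "identity" kernel

P = np.pad(D, 1)                           # zero border, as before
out = np.array([[np.sum(K * P[i:i + 3, j:j + 3])
                 for j in range(3)] for i in range(3)])
print(np.allclose(out, D))                 # prints True: output equals input
```

Because each $3 \times 3$ block of the padded matrix has $d_{ij}$ at its center, picking out only the central element returns $D$ unchanged, so this kernel is trivially invertible and clearly fails the hash criteria.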