What is the relation between discrete 2D convolution and neural network convolution?


I have a question about how to formalize the definition of neural network convolution.

When it is coded, 2D convolution (in the case of a 3x3 kernel) is usually expressed as

$C[x,y] = \sum_u \sum_v A[x-1+u,\, y-1+v] \cdot B[u,v]$

This is very similar to what I see for multidimensional discrete convolution.

But in that case, it is defined as:

$q(j,n) = \sum_i \sum_m f(i,m) \cdot p(i-j,\, n-m)$ (from the book *Multidimensional Digital Signal Processing* by Dudgeon and Mersereau)

Do you know what the relation between these two formulations is? They look similar and should represent the same thing, but I only understand the first one, not the second.
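For concreteness, here is a small numerical sketch of what I mean (assuming NumPy and SciPy; the array names `A`, `B` match the first formula above). The loop implements the neural-network-style formula directly, and I compare it against SciPy's true discrete convolution with the kernel flipped in both axes:

```python
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))   # input "image"
B = rng.standard_normal((3, 3))   # 3x3 kernel

# Neural-network "convolution": C[x,y] = sum_u sum_v A[x+u, y+v] * B[u, v]
# (valid region only, so the output is 3x3 for a 5x5 input)
C = np.zeros((3, 3))
for x in range(3):
    for y in range(3):
        C[x, y] = np.sum(A[x:x+3, y:y+3] * B)

# True discrete convolution with the kernel flipped in both axes
# gives the same result.
D = convolve2d(A, B[::-1, ::-1], mode='valid')
print(np.allclose(C, D))  # True
```

So numerically the two only seem to differ by a flip of the kernel, and I would like to understand why the definitions are written the way they are.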