Is convolution with the flipped variable the same as without the flip?


Are $(f*g)(t) = \sum_k f(k)g(t-k)$ and $\sum_k f(k)g(-(t-k))$ the same? Note that the second expression simplifies to $\sum_k f(k)g(k-t)$, which is the cross-correlation rather than the convolution. I tested both with the values $f(0)=0, f(1)=1, f(2)=2$ and $g(0)=0, g(1)=2, g(2)=3$ ($0$ elsewhere), and I believe they are NOT the same. However, I've seen a YouTube instructor claiming they are the same in a neural-network setting (source): the instructor states $\frac{\partial L}{\partial W[a',b']} = \sum_i\sum_j\frac{\partial L}{\partial Y[i,j]}X[i-a',j-b'] = X * \frac{\partial L}{\partial Y}$. Am I missing something here?
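To make the comparison concrete, here is a small sketch that evaluates both sums directly for the example values in the question (the function names `conv` and `corr` are my own labels, not from any source):

```python
# f(0)=0, f(1)=1, f(2)=2; g(0)=0, g(1)=2, g(2)=3; zero elsewhere.
f = {0: 0, 1: 1, 2: 2}
g = {0: 0, 1: 2, 2: 3}

def conv(t):
    """(f*g)(t) = sum_k f(k) g(t-k) -- g is flipped."""
    return sum(fk * g.get(t - k, 0) for k, fk in f.items())

def corr(t):
    """sum_k f(k) g(-(t-k)) = sum_k f(k) g(k-t) -- no flip (cross-correlation)."""
    return sum(fk * g.get(k - t, 0) for k, fk in f.items())

print([conv(t) for t in range(-2, 5)])  # [0, 0, 0, 0, 2, 7, 6]
print([corr(t) for t in range(-2, 5)])  # [0, 3, 8, 4, 0, 0, 0]
```

The two output sequences differ, so as pointwise operations on fixed $f$ and $g$ the flipped and unflipped sums are indeed not equal; one is the reversal of the other up to a shift.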