Relationship between summation and convolution


Currently I am reading a paper which writes a matrix product as $AB=I$, where $B$ is the inverse of $A$ and $I$ is the identity matrix. This is shorthand for the summation $$\sum_j a_{ij}b_{jk}=\delta_{ik},$$ where $a_{ij}$ and $b_{jk}$ (which are functions of a variable $h$) are the elements of $A$ and $B$, and $\delta_{ik}$ is the Kronecker delta. The authors claim that the continuous equivalent of this summation is $$\{a*b\}(h)=\delta(h),$$ where $*$ denotes convolution. I cannot see what theorem or property lies behind this claim. Can anyone explain it? Thank you.

On BEST ANSWER

No — the equivalence does not arise because "$a_{ij}$ and $b_{jk}$ are functions of $h$".

In the equivalence with the continuous case, $h$ takes over the role of the indices $i, j, k$. It is easiest to see when we use three continuous variables, one for each index: $$i \to x, \quad j \to y, \quad k \to z.$$ The matrix elements — which here are assumed to depend only on the difference of their indices, a Toeplitz structure — correspond to $$a_{ij} \to a(x - y)\\b_{jk} \to b(y - z)\\\delta_{ik} \to \delta(x - z)$$ and finally, the summation over $j$ becomes integration over $y$:

$$\int_{\Bbb R} a(x-y)b(y-z)dy = \delta(x - z)$$
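The discrete mirror of this identity can be checked numerically. A minimal sketch (my own illustration, not from the paper): if we build *circulant* matrices — the periodic version of Toeplitz, with $A_{ij} = a\big((i-j) \bmod n\big)$ — then the product $AB$ is again circulant, generated by the circular convolution of the two sequences, exactly matching $\sum_j a_{ij}b_{jk} = (a*b)_{i-k}$:

```python
import numpy as np

def circulant(c):
    """Circulant matrix whose (i, j) entry is c[(i - j) % n]."""
    n = len(c)
    return np.array([[c[(i - j) % n] for j in range(n)] for i in range(n)])

rng = np.random.default_rng(0)
n = 8
a = rng.standard_normal(n)
b = rng.standard_normal(n)

A = circulant(a)
B = circulant(b)

# Matrix product entry: (AB)_{ik} = sum_j a[(i-j) % n] * b[(j-k) % n]
AB = A @ B

# Circular convolution of the generating sequences: (a * b)(h)
conv = np.array([sum(a[(h - m) % n] * b[m] for m in range(n)) for h in range(n)])

# (AB)_{ik} depends only on h = (i - k) mod n and equals (a * b)(h)
assert np.allclose(AB, circulant(conv))
```

The circulant (periodic) setting is used here only to avoid boundary effects; the continuous statement above is the same computation with the sum over $j$ replaced by an integral over $y$.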

This doesn't look quite like a convolution yet, so substitute $u = y - z$; then $du = dy$, and the limits are unchanged since $u$ still ranges over all of $\Bbb R$. But $x - y = x - z - u$, so the integral becomes $$\int_{\Bbb R} a(x- z - u)b(u)\,du = \delta(x - z).$$ The expression on the left is, by definition, $\{a*b\}(x - z)$. Substitute $h = x - z$ and you have your equivalence.
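The full claim $AB = I \Leftrightarrow \{a*b\}(h) = \delta(h)$ can also be seen numerically in the discrete setting. A sketch under the same circulant assumption as above (my illustration, not the paper's method): invert a circulant matrix, read off the generating sequence of the inverse, and check that its circular convolution with the original sequence is the Kronecker delta:

```python
import numpy as np

def circulant(c):
    """Circulant matrix whose (i, j) entry is c[(i - j) % n]."""
    n = len(c)
    return np.array([[c[(i - j) % n] for j in range(n)] for i in range(n)])

rng = np.random.default_rng(1)
n = 8
# Delta plus small noise: keeps the circulant well-conditioned and invertible
a = np.eye(n)[0] + 0.1 * rng.standard_normal(n)

A = circulant(a)
B = np.linalg.inv(A)

# The inverse of a circulant matrix is circulant; its first column generates it
b = B[:, 0]
assert np.allclose(B, circulant(b))

# Circular convolution of a and b is the Kronecker delta: (a * b)(h) = delta(h)
conv = np.array([sum(a[(h - m) % n] * b[m] for m in range(n)) for h in range(n)])
delta = np.zeros(n)
delta[0] = 1.0
assert np.allclose(conv, delta)
```

This is the discrete counterpart of $\{a*b\}(h) = \delta(h)$: the identity matrix $\delta_{ik}$ is circulant with generating sequence $\delta_{h0}$, just as the continuous identity operator has kernel $\delta(x - z)$.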