I am performing a convolution of a dataset with a large 2D kernel that is defined as the sum of two Gaussians of the following form:
$$ f(r) = \exp(- r^2 / \sigma_1^2) + \beta \cdot \exp(- r^2 / \sigma_2^2) $$
with $\beta > 0$. To speed up the calculation, I want to decompose the operation into two 1D convolutions (a separable convolution).
To check whether this is feasible, I quickly threw the kernel into numpy and computed the rank of the matrix; it is 1, which indicates that a decomposition is possible. Nevertheless, I could not find a way around ending up with a $\sqrt{-\beta}$ in the two decomposed kernels, which I want to avoid because of the numerical overhead that comes with complex numbers. I also found the obvious solution of convolving with each term ($\exp(- r^2 / \sigma_1^2)$ and $\beta \cdot \exp(- r^2 / \sigma_2^2)$) separately and then summing the two results, but that roughly doubles the computational cost compared to a single separable pass.
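To make the "obvious solution" concrete, here is a minimal numpy sketch of what I mean (the parameter values and the helper names `gauss1d`/`sep_conv2d` are made up for illustration): each Gaussian term gets its own separable pass and the results are summed, i.e. four 1D convolutions instead of two.

```python
import numpy as np

def gauss1d(sigma, radius):
    # unnormalized 1D Gaussian exp(-x^2 / sigma^2), matching the kernel above
    x = np.arange(-radius, radius + 1)
    return np.exp(-x**2 / sigma**2)

def sep_conv2d(img, k):
    # one separable pass: convolve every column, then every row, with k
    tmp = np.apply_along_axis(np.convolve, 0, img, k, mode='same')
    return np.apply_along_axis(np.convolve, 1, tmp, k, mode='same')

# hypothetical parameters for illustration
sigma1, sigma2, beta, radius = 1.0, 3.0, 0.5, 8
img = np.random.default_rng(0).standard_normal((64, 64))

# four 1D convolutions in total: two separable passes, then a weighted sum
out = sep_conv2d(img, gauss1d(sigma1, radius)) \
    + beta * sep_conv2d(img, gauss1d(sigma2, radius))
```

This is still O(N) in the kernel radius per pixel rather than O(N^2), so it is far cheaper than the full 2D convolution; it is only double the cost of a single separable kernel.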
Is there a clever way of decomposing this kernel into a single set of real-valued 1D kernels that I am overlooking? Thanks a lot in advance!
Update: I made a mistake in the calculation of the rank. The rank is not equal to 1, so the kernel is not separable: the sum of two Gaussian kernels cannot be decomposed into a single pair of 1D kernels.
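For completeness, the corrected rank check can be reproduced like this (the $\sigma$ and $\beta$ values are placeholders): a single Gaussian gives rank 1, while the sum of two Gaussians with distinct widths gives rank 2.

```python
import numpy as np

x = np.arange(-8, 9)                    # kernel support (radius chosen arbitrarily)
sigma1, sigma2, beta = 1.0, 3.0, 0.5    # placeholder values
r2 = x[:, None]**2 + x[None, :]**2      # r^2 on the 2D grid

single = np.exp(-r2 / sigma1**2)                                   # one Gaussian
summed = np.exp(-r2 / sigma1**2) + beta * np.exp(-r2 / sigma2**2)  # the kernel f(r)

print(np.linalg.matrix_rank(single))  # 1 -> separable
print(np.linalg.matrix_rank(summed))  # 2 -> not separable into one real 1D pair
```

A rank-2 kernel can still be applied as the sum of two separable passes (one per singular vector pair), which is exactly the "obvious solution" above.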