I have been trying to get my head around a toy problem I made up in order to better understand the convolution theorem: the convolution of two signals in the 'time' domain (or the probability domain for this problem) is equal to the pointwise multiplication of their Fourier transforms in the frequency domain.
The problem
In a sports league (Aussie Rules in my example) there are two teams, both trying to avoid coming last. In order to avoid coming last, Team A must win at least two more games than Team B.
Assumptions
Each team's number of wins from its remaining $N=8$ games can be modelled with a binomial distribution, using some average chance of winning each game ($\rho_a,\rho_b$). That is, Team A's win distribution is $w(\rho_a)$, and likewise for Team B.
The two teams do not play each other, so their win counts are independent.
Attempted solution
Using $\rho_a=\rho_b=0.2$ I come up with this distribution for the number of wins from $0$ to $8$:
$$w(\rho_a)=w(\rho_b)=\begin{bmatrix} 1.68\times10^{-1} \\ 3.36\times10^{-1} \\ 2.94\times10^{-1} \\ 1.47\times10^{-1} \\ 4.59\times10^{-2} \\ 9.18\times10^{-3} \\ 1.15\times10^{-3} \\ 8.19\times10^{-5} \\ 2.56\times10^{-6} \\ \end{bmatrix}$$
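For reference, these values can be generated in a few lines of Python (a minimal sketch using only the standard library):

```python
from math import comb

num_games = 8
p = 0.2  # average chance of winning each game

# binomial pmf: P(k wins) = C(n, k) * p^k * (1 - p)^(n - k)
pmf = [comb(num_games, k) * p**k * (1 - p)**(num_games - k)
       for k in range(num_games + 1)]

for k, prob in enumerate(pmf):
    print(f"P({k} wins) = {prob:.3e}")
```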
I then take the Fourier transforms of both distributions.
$$\mathscr{F}[w(\rho_a)]=W_a$$ $$\mathscr{F}[w(\rho_b)]=W_b$$
Then I take the pointwise product of these transforms and invert:
$$\mathscr{F}^{-1}[W_aW_b]$$
However, this does not return the expected result. I know from solving the same problem directly in the probability domain that the distribution of Team A getting $0$ to $8$ more wins than Team B is
$$\begin{bmatrix} 2.51\times 10^{-1} \\ 2.05\times 10^{-1} \\ 1.13\times 10^{-1} \\ 4.29\times 10^{-2} \\ 1.11\times 10^{-2} \\ 1.95\times 10^{-3} \\ 2.21\times 10^{-4} \\ 1.46\times 10^{-5}\\ 4.29\times 10^{-7} \end{bmatrix}$$
This is nothing like what I get from the inverse Fourier transform.
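For reference, this expected distribution can be reproduced by a direct double sum in the probability domain (a minimal sketch, assuming independence as stated above):

```python
from math import comb

n, p = 8, 0.2
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

# P(A wins exactly d more games than B) = sum over b of P(A = b + d) * P(B = b)
diff_pmf = [sum(pmf[b + d] * pmf[b] for b in range(n + 1 - d))
            for d in range(n + 1)]

for d, prob in enumerate(diff_pmf):
    print(f"P(A wins {d} more than B) = {prob:.2e}")  # ~2.51e-1 at d = 0
```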
I apologise for the long-winded setup, but can anyone point me in the direction of the flaws in my logic?
Edit:
It seems like I had a couple of problems:
1. I wasn't zero-padding the pdfs before taking the FFT. To be honest, I'm not sure why I need to do that, but it doesn't work without it.
2. I wasn't taking the absolute value of the Fourier coefficients before inverting.
Code
Here is the code that got me the answer.
import numpy as np
from math import factorial

num_games = 8
a_ave_win = 0.2
b_ave_win = 0.2

# create zero-padded arrays for the pdfs (length 2*num_games + 1,
# so the convolution result cannot wrap around)
team_a_pdf = np.zeros(num_games * 2 + 1)
team_b_pdf = team_a_pdf.copy()

# compute the pdfs from the binomial distribution
for i in range(num_games + 1):
    n_choose_i = factorial(num_games) / (factorial(i) * factorial(num_games - i))
    team_a_pdf[i] = n_choose_i * a_ave_win**i * (1 - a_ave_win)**(num_games - i)
    team_b_pdf[i] = n_choose_i * b_ave_win**i * (1 - b_ave_win)**(num_games - i)

team_a_fft = np.fft.fft(team_a_pdf)
team_b_fft = np.fft.fft(team_b_pdf)

# pointwise product in frequency space, then invert
conv = np.abs(team_a_fft * team_b_fft)
ifft = np.fft.ifft(conv).real

print(ifft[:num_games + 1])
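On why those two fixes work: the zero-padding is needed because the FFT implements a circular convolution, so without padding the negative-difference part of the result would wrap onto the positive part; and the absolute value only happens to work here because $p_a=p_b$ makes the two transforms equal, in which case $|W_aW_b|=W_a\overline{W_a}$, which is the transform of the distribution of $X-Y$. A variant that also handles $p_a\neq p_b$ multiplies by the complex conjugate instead (a sketch, not part of the original code):

```python
import numpy as np
from math import comb

num_games = 8
p_a, p_b = 0.2, 0.2  # the conjugate trick also works when these differ

size = 2 * num_games + 1  # zero-pad so the circular convolution cannot wrap
pdf_a = np.zeros(size)
pdf_b = np.zeros(size)
for k in range(num_games + 1):
    pdf_a[k] = comb(num_games, k) * p_a**k * (1 - p_a)**(num_games - k)
    pdf_b[k] = comb(num_games, k) * p_b**k * (1 - p_b)**(num_games - k)

# The transform of the pmf of -Y is the conjugate of the transform of Y,
# so W_a * conj(W_b) is the transform of the pmf of Z = X - Y.
diff_pmf = np.fft.ifft(np.fft.fft(pdf_a) * np.conj(np.fft.fft(pdf_b))).real

# indices 0..8 hold P(Z = 0)..P(Z = 8); negative differences wrap to the end
print(diff_pmf[:num_games + 1])
```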
Here is a straightforward approach using the theory you are interested in. You are trying to find the distribution of $Z=X-Y$, where $X$ is the number of wins of team A and $Y$ is the number of wins of team B out of the $n\geq 2$ games each team plays. We assume $X$ is independent of $Y$. Once you have found the distribution of $Z$, the probability that A wins at least two more games than B is $P(Z\geq 2)$. We have $Z \in \{-n,\dots,-1,0,1,\dots,n\}$ a.s.

By independence, the Fourier transform of the distribution of $Z$ is $$E[e^{i\xi Z}]=E[e^{i\xi X}]\,E[e^{-i\xi Y}]=\big((p_a(e^{i\xi}-1)+1)(p_b(e^{-i\xi}-1)+1)\big)^n$$ Under the further assumption that $p_a=p_b=p$ this simplifies to $$E[e^{i\xi Z}]=(2p(1-p)(\cos\xi-1)+1)^n$$

This function is $2\pi$-periodic, so we can restrict attention to $(-\pi,\pi]$ (it is also purely real, so you already know that the distribution of $Z$ is symmetric around $0$). So what you seek is $$P(Z\geq 2)=\sum_{2\leq k \leq n}\underbrace{\frac{1}{2\pi}\int_{(-\pi,\pi]}(2p(1-p)(\cos\xi-1)+1)^n e^{-i\xi k}\,d\xi}_{=P(Z=k)}$$
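To sanity-check the closed form, the inversion integral can be evaluated numerically; since the integrand is periodic, a plain uniform average over $(-\pi,\pi]$ is accurate (a sketch using the illustrative values $n=8$, $p=0.2$ from the question):

```python
import numpy as np

n, p = 8, 0.2

def prob_z(k, num_points=4096):
    """P(Z = k) = (1/2pi) * integral over (-pi, pi] of phi(xi) e^{-i xi k}."""
    xi = 2 * np.pi * np.arange(num_points) / num_points - np.pi
    phi = (2 * p * (1 - p) * (np.cos(xi) - 1) + 1) ** n  # E[e^{i xi Z}]
    # phi is real and even, so only the cos(k xi) part of e^{-i xi k} survives
    return np.mean(phi * np.cos(k * xi))

p_at_least_2 = sum(prob_z(k) for k in range(2, n + 1))
print(f"P(Z >= 2) = {p_at_least_2:.4f}")
```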