I have recently been reading Jason K. Eshraghian's spiking neural network tutorial on Alpha neurons. Here's the link to his tutorial. According to the tutorial, the direct implementation is given as:
$U_{\rm mem}(t) = \sum_i W(\epsilon * S_{\rm in})(t)$,
where the kernel $\epsilon$ is an Alpha function given by
$\epsilon(t) = \frac{t}{\tau}e^{1-t/\tau}\Theta(t)$.
Here $\Theta(t)$ is the Heaviside step function, so this kernel is parameterized by $\tau$.
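In discrete time, the direct (convolutional) form can be sketched as follows; the names, step count, and single scalar weight `W` are just illustrative choices, not from the tutorial:

```python
import numpy as np

def alpha_kernel(t, tau):
    """Alpha kernel eps(t) = (t/tau) * exp(1 - t/tau) * Theta(t)."""
    return np.where(t >= 0, (t / tau) * np.exp(1 - t / tau), 0.0)

T = 100                       # number of time steps (arbitrary)
tau = 5.0
W = 1.0
t = np.arange(T, dtype=float)

s_in = np.zeros(T)            # input spike train S_in
s_in[10] = 1.0                # one spike at t = 10

# U_mem(t) = W * (eps * S_in)(t), truncated to the first T steps
u_mem = W * np.convolve(alpha_kernel(t, tau), s_in)[:T]

# The kernel peaks tau steps after the spike, with peak value eps(tau) = 1
print(np.argmax(u_mem))       # -> 15  (spike time 10 + tau = 5)
```

The point of this form is that the membrane potential is just a weighted sum of kernel-filtered spike trains, with $\tau$ setting the time-to-peak.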
However, in snntorch, the above convolution is realized in a recurrent form:
$I_{exc}[t+1]=(\alpha I_{exc}[t]+I_{in}[t+1])-R(\alpha I_{exc}[t]+I_{in}[t+1])$
$I_{inh}[t+1]=(\beta I_{inh}[t]-I_{in}[t+1])-R(\beta I_{inh}[t]-I_{in}[t+1])$
$U[t+1] = \tau_{\alpha}(I_{exc}[t+1]+I_{inh}[t+1])$
Now, the equations are parameterized by $\alpha$ and $\beta$.
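Ignoring the reset term $R$ (i.e., assuming no spikes are emitted), the recurrence above can be sketched like this; `alpha`, `beta`, and `tau_alpha` are left as free parameters, since how they relate to $\tau$ is exactly my question:

```python
import numpy as np

def alpha_recurrent(i_in, alpha, beta, tau_alpha):
    """Reset-free version of the snntorch-style Alpha recurrence."""
    T = len(i_in)
    i_exc = np.zeros(T)   # excitatory current, decays with alpha
    i_inh = np.zeros(T)   # inhibitory current, decays with beta
    u = np.zeros(T)       # membrane potential
    for t in range(1, T):
        i_exc[t] = alpha * i_exc[t - 1] + i_in[t]
        i_inh[t] = beta * i_inh[t - 1] - i_in[t]
        u[t] = tau_alpha * (i_exc[t] + i_inh[t])
    return u
```

For a single input impulse this produces a difference of two exponentials, $u[t] \propto \alpha^{t}-\beta^{t}$, which is the alpha-like shape the kernel form describes.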
So, what is the relationship among $\alpha$, $\beta$, and $\tau$?
I have checked the documentation of snn.Alpha, where the following equation is given:
$\tau_{\alpha}=\frac{\log(\alpha)}{\log(\beta)}-\log(\alpha)+1$.
But in my experiments, $\tau_{\alpha}$ does not equal $\tau$.
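A minimal version of the check I ran looks like this (the $\alpha$, $\beta$ values are arbitrary examples, and I am reading the documented formula with the parenthesization written above):

```python
import numpy as np

alpha, beta = 0.9, 0.8
# tau_alpha as I read it from the snn.Alpha documentation
tau_alpha = np.log(alpha) / np.log(beta) - np.log(alpha) + 1

# Impulse response of the reset-free recurrence
T = 200
i_exc = i_inh = 0.0
u = []
i_in = [1.0] + [0.0] * (T - 1)
for t in range(T):
    i_exc = alpha * i_exc + i_in[t]
    i_inh = beta * i_inh - i_in[t]
    u.append(tau_alpha * (i_exc + i_inh))

# If u[t] matched eps(t), its peak would sit at t = tau
t_peak = int(np.argmax(u))
print(tau_alpha, t_peak)
```

The time-to-peak of the impulse response does not match the documented $\tau_{\alpha}$, which is what makes me think $\tau_{\alpha}$ and $\tau$ are not the same quantity.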