Deriving the Time of Extinction of a Small Neural Network


I'm trying to derive the expected value of the time of extinction $\tau_{ext}$ of a small stochastic neural network with the following dynamics, where I take $\tau_{ext}$ to be the time of the last spike of the system.

Take $N=2$ neurons. $X_i(t)$ represents the potential of neuron $i$ at time $t$. Suppose $X(0) = (X_1(0),X_2(0)) = (x,x) = (1,1)$.

While there is no spike in the system, each potential decays deterministically: $X_i(t) = x e^{-t}$. When there is one, say neuron $j$ spikes, every other neuron $i \neq j$ jumps by a constant, $X_i(t) = x e^{-t} + C$ (I will take $C = 1$), and neuron $j$ resets to potential $0$. The dynamics then restart afresh, forgetting the past: the same deterministic decay as before, but from the new starting points, until the next spike of the system.

The stochastic part comes with the time of the first spike of the system. The first spike time of neuron $i$, when the initial condition is $X_i(0) = x$, is:

$$\tau_i^x := \inf\,\{t>0 : X_i(t)=0 \},$$ and the $\tau_i^x$ are i.i.d. with law (in this particular case) $P(\tau_i^x > t) = e^{x(e^{-t}-1)}$.

(Observe that $P(\tau_i^x = +\infty) = e^{-x} > 0$.)
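As a sanity check on this law, here is a small Python sketch (my own, not from the model's literature) that samples $\tau_i^x$ by inverting the survival function; the atom at $+\infty$ gets the mass $e^{-x}$:

```python
import math
import random

def sample_tau(x, rng):
    """Sample tau with P(tau > t) = exp(x*(e^{-t} - 1)) by inverse CDF.

    Since P(tau = +inf) = exp(-x), a uniform draw u <= exp(-x) maps to +inf;
    otherwise solve exp(x*(e^{-t} - 1)) = u for t, giving
    t = -log(1 + log(u)/x), which is positive and finite when u > exp(-x).
    """
    if x <= 0:
        return math.inf  # a neuron at potential 0 never spikes on its own
    u = rng.random()
    if u <= math.exp(-x):
        return math.inf
    return -math.log(1.0 + math.log(u) / x)

rng = random.Random(0)
draws = [sample_tau(1.0, rng) for _ in range(100_000)]
frac_inf = sum(math.isinf(d) for d in draws) / len(draws)
print(frac_inf)  # should be close to exp(-1) ~ 0.368
```

The empirical fraction of $+\infty$ draws matches $e^{-x}$, which is the check I'd run before trusting any longer simulation built on this sampler.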

So the time of the first spike of the system is $\tau^x := \min_i \tau_i^x$, with law $P(\tau^x > t) = e^{2x(e^{-t}-1)}$ (in this case).

This would be a draft of the dynamics: [image: dynamics of the network]

And I'm thinking of $\tau_{ext}$ (in the case $x=1$, writing $\tau := \tau^1$ and $\hat{\tau} := \tau^{x_\tau}$) as something like: $$ \tau_{ext} = \tau\, \Bbb{1}_{\{\tau < +\infty \}} + \hat{\tau}\, \Bbb{1}_{\{\tau < +\infty ,\ \hat{\tau} < +\infty \}} + \sum_{i=1}^{G-1} \tau_1^c(i) $$ where the $\tau_1^c(i) \sim \tau_1^c$ are i.i.d. and $G \sim \mathrm{Geom}(p)$ with $p := P(\tau_1^c = +\infty)$.
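Any candidate formula can be cross-checked by simulating the network directly. The following is a sketch under my reading of the dynamics above (spike times resampled afresh after each spike since the dynamics restart, $C = 1$, extinction time taken to be the time of the last spike, and $0$ if there is never a spike); `sample_tau` and `simulate_extinction` are names I made up:

```python
import math
import random

def sample_tau(x, rng):
    """Inverse-CDF sample of the spike law P(tau > t) = exp(x*(e^{-t} - 1))."""
    if x <= 0:
        return math.inf  # potential 0: never spikes spontaneously
    u = rng.random()
    if u <= math.exp(-x):
        return math.inf  # atom at +infinity, with mass exp(-x)
    return -math.log(1.0 + math.log(u) / x)

def simulate_extinction(x0=(1.0, 1.0), C=1.0, seed=None):
    """Run the N=2 network until no neuron will ever spike again.

    Returns the time of the last spike (0.0 if there is never a spike).
    """
    rng = random.Random(seed)
    x = list(x0)
    t = last_spike = 0.0
    while True:
        taus = [sample_tau(xi, rng) for xi in x]
        j = min(range(len(x)), key=lambda k: taus[k])
        if math.isinf(taus[j]):
            return last_spike  # every remaining spike time is +inf: extinct
        dt = taus[j]
        t += dt
        last_spike = t
        # the spiker resets to 0; the others decay for dt, then jump by C
        x = [0.0 if k == j else x[k] * math.exp(-dt) + C
             for k in range(len(x))]

runs = [simulate_extinction(seed=s) for s in range(50_000)]
print(sum(runs) / len(runs))  # Monte Carlo estimate of E[tau_ext]
```

One thing the simulation makes visible: after the first two spikes the state always alternates between $(1,0)$ and $(0,1)$, so each later round is an independent trial that ends the process with probability $e^{-1}$, which is consistent with the geometric number of terms in the formula above.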

Problems with this would be:

a) the second and "third" terms seem difficult to compute.
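For the first term, at least, I believe there is a closed form; a sketch of the computation, assuming the survival law above (with $\gamma$ the Euler–Mascheroni constant and $\mathrm{Ei}$ the exponential integral):

$$ E\big[\tau^x\, \Bbb{1}_{\{\tau^x < +\infty\}}\big] = \int_0^\infty \big(P(\tau^x > t) - P(\tau^x = +\infty)\big)\, dt = \int_0^\infty \big(e^{2x(e^{-t}-1)} - e^{-2x}\big)\, dt, $$

and substituting $u = e^{-t}$ and then $s = 2xu$,

$$ = e^{-2x} \int_0^1 \frac{e^{2xu}-1}{u}\, du = e^{-2x} \big(\mathrm{Ei}(2x) - \ln(2x) - \gamma\big). $$

The same computation with $x$ in place of $2x$ should give the analogous expectations for the single-neuron times $\tau_1^c(i)$; the genuinely hard part remains the dependence in the second term.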

b) I think I should modify the r.v.'s $\tau_1^c(i)$ so that $P(\tau_1^c(i) \in \{0, +\infty\}) = 0$, but I can't figure out how.