Let $N(t)$ denote a counting process, $X_1, X_2, \dots$ the inter-arrival times, and $S_1, S_2, \dots$ the arrival times, so that $S_1 = X_1$, $S_2 = X_1 + X_2$, and so on.
Let $T$ be a constant, so that $N(T)$ denotes the number of arrivals in the time interval $[0, T]$. Note that $N(t)$ is inherently a stochastic process, so $N(T)$ is a random variable and we can compute its expectation $E[N(T)]$. We can calculate the value of $E[N(T)]$ when $T$ and the distribution of the $\{X_i\}$ are known.
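As a concrete instance (my own toy example, not part of the question's setup): if the $X_i$ are i.i.d. $\mathrm{Exp}(\lambda)$, then $N(t)$ is a Poisson process and $E[N(T)] = \lambda T$. A short Monte Carlo sketch checks this:

```python
import random

def count_arrivals(rate, horizon, rng):
    """Count arrivals in [0, horizon] when inter-arrival times are Exp(rate)."""
    t, n = 0.0, 0
    while True:
        t += rng.expovariate(rate)  # next inter-arrival time X_i
        if t > horizon:
            return n
        n += 1

rng = random.Random(0)
lam, T, trials = 2.0, 3.0, 100_000
est = sum(count_arrivals(lam, T, rng) for _ in range(trials)) / trials
print(est)       # Monte Carlo estimate of E[N(T)], close to lam * T = 6.0
```

The estimate should agree with $\lambda T$ up to Monte Carlo noise.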
Now let $T$ be a random variable with a known distribution; $E[N(T)]$ still retains its meaning. I am wondering what the relation is between $E[N(E[T])]$ and $E[N(T)]$. Are they equal? If not, how can I derive an expression for $E[N(T)]$ given that $T$'s distribution is known?
In a slightly more general setting, you have two random variables $N$ and $T$, where the distribution of $N$ depends on the value of $T$: say $\kappa(\mathrm dn \mid t)$ is the conditional distribution of $N$ given $T = t$, and $\mu$ is the distribution of $T$ itself. For example, if $(N_t)$ is a Poisson process with intensity $\lambda$, then $\kappa(\cdot \mid t) = \mathrm{Poi}(\lambda t)$. The law of total expectation gives us $$ \mathsf E[N_T] = \iint n\,\kappa(\mathrm dn \mid t)\,\mu(\mathrm dt), \qquad \mathsf E[N_{\mathsf E[T]}] = \int n\,\kappa\!\left(\mathrm dn \,\middle|\, \int t\,\mu(\mathrm dt)\right). $$ In simpler terms, let $f(t) := \mathsf E[N_t]$ for deterministic $t$; then $$ \mathsf E[N_T] = \int f(t)\,\mu(\mathrm dt), \qquad \mathsf E[N_{\mathsf E[T]}] = f(\mathsf E[T]). $$ These coincide whenever $f$ is linear (as for a Poisson process, where $f(t) = \lambda t$), but not in general.
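A quick numerical check, using a toy renewal process of my own choosing: with deterministic unit inter-arrival times (arrivals at $1, 2, 3, \dots$) we have $f(t) = \lfloor t \rfloor$, which is non-linear, and the two quantities come apart for a simple two-point distribution of $T$:

```python
import math

def f(t):
    """f(t) = E[N_t] for unit deterministic inter-arrivals: arrivals at 1, 2, 3, ..."""
    return math.floor(t)

# T takes the values 0.5 and 1.5, each with probability 1/2, so E[T] = 1.
support = [(0.5, 0.5), (1.5, 0.5)]

E_N_T = sum(p * f(t) for t, p in support)  # integral of f(t) against mu(dt)
E_T = sum(p * t for t, p in support)

print(E_N_T)   # E[N_T]        = 0.5 * 0 + 0.5 * 1 = 0.5
print(f(E_T))  # E[N_{E[T]}]   = f(1)              = 1
```

So here $\mathsf E[N_T] = 0.5$ while $f(\mathsf E[T]) = 1$: plugging $\mathsf E[T]$ into $f$ is not the same as integrating $f$ against the distribution of $T$.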