Given a unit-rate Poisson process $\{N(t):t\ge 0\}$, we generate a process $$A(t) = N\left(\int_0^t \Lambda(\tau)\,d\tau\right)$$ associated with a filtration $\mathcal{F}_t$ such that $A(t)$ is adapted and $\Lambda(t)\ge 0$ is predictable. (We may also assume that $\Lambda(\cdot)\le c<\infty$ a.s.)
I am working on a problem that requires determining whether $$\mathbb{E}[A(t)] = \mathbb{E}\left[\int_0^t \Lambda(\tau)\,d\tau\right].$$
Intuitively this should be true because the rate function $\Lambda(\cdot)$ is predictable.
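As a quick sanity check in the simplest setting, here is a Monte Carlo sketch with a deterministic (hence trivially predictable) rate $\Lambda(\tau)=1+\sin^2\tau$; the rate and all numerical choices are mine, purely for illustration. For a deterministic time change, $A(t)=N(\Theta(t))$ is just a Poisson random variable with mean $\Theta(t)$, so the identity must hold:

```python
import numpy as np

rng = np.random.default_rng(0)

# Deterministic (hence trivially predictable) rate Lambda(tau) = 1 + sin(tau)^2,
# chosen only for illustration.  Its integral has the closed form
# Theta(t) = int_0^t (1 + sin(tau)^2) dtau = 1.5*t - sin(2*t)/4.
def Theta(t):
    return 1.5 * t - np.sin(2.0 * t) / 4.0

t = 5.0
theta_t = Theta(t)

# For a deterministic time change, A(t) = N(Theta(t)) ~ Poisson(Theta(t)).
n_paths = 200_000
A_t = rng.poisson(theta_t, size=n_paths)

print(A_t.mean(), theta_t)  # the sample mean should be close to Theta(t)
```

The question is whether this equality survives when $\Lambda$ is random but predictable.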
Without the predictability assumption this cannot be true: for example, by peeking at the future of $N$ one can define a random time $\xi$ such that $N(\xi)- \xi>1$.
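To make the failure without predictability concrete, here is a simulation with an anticipating random time of the same flavor (my own illustrative choice, not the exact $\xi$ above; the inequality goes the other way here, but it equally breaks the identity): take $\xi = S_1/2$, half the first arrival time of $N$. Defining $\xi$ requires knowing $S_1$, i.e. looking at $N$'s future, and indeed $N(\xi)=0$ a.s. while $\mathbb{E}[\xi]=1/2$:

```python
import numpy as np

rng = np.random.default_rng(1)

# An anticipating random time: xi = S1/2, half the first arrival time of the
# unit-rate Poisson process N.  Defining xi requires peeking at N's future,
# so xi is not a stopping time, and E[N(xi)] = E[xi] fails.
n_paths = 200_000
n_jumps = 20  # enough arrivals per path for this range of xi

# arrival times S_1 < S_2 < ... of a unit-rate Poisson process
gaps = rng.exponential(1.0, size=(n_paths, n_jumps))
S = np.cumsum(gaps, axis=1)

xi = S[:, 0] / 2.0                      # anticipating random time
N_xi = (S <= xi[:, None]).sum(axis=1)   # N(xi): number of arrivals up to xi

print(N_xi.mean(), xi.mean())  # 0.0 versus roughly 0.5
```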
Some motivation for the problem: consider a manager adjusting the future arrival rate $\Lambda(\cdot)$ based on the history (including past arrivals). In this case the arrival process is $A(t)$, and we expect the expected number of future arrivals to equal the mean of the integrated rate.
Additional assumption: as suggested by Kurt, the claim may not be true as stated. To make the motivation work, we may add the assumption that $N\left(t+\int_0^s\Lambda(\tau)\,d\tau\right)-N\left(\int_0^s\Lambda(\tau)\,d\tau\right)\perp \mathcal{F}_s$ for all $t>0$.
After some time, here is my idea for proving the result:
Consider piecewise-constant simple predictable functions that approximate $\Theta(t)=\int_0^t\Lambda(s)\,ds$ by partitioning time:
$\Theta^n(t) = \sum_{k=0}^n\Theta(t_k)\mathbf{1}_{\{t_k\le t<t_{k+1}\}}$, where $0=t_0\le t_1 \le\cdots\le t_n = T$.
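A small numerical illustration (with an arbitrary smooth rate of my own choosing, not anything from the problem) that $\Theta^n(t)$, evaluated on nested uniform partitions, indeed increases to $\Theta(t)$ as the partition is refined:

```python
import numpy as np

def Theta(t):
    # Theta(t) = int_0^t (1 + sin(s)^2) ds, an illustrative rate whose
    # integral has the closed form 1.5*t - sin(2*t)/4
    return 1.5 * t - np.sin(2.0 * t) / 4.0

def Theta_n(t, n, T=10.0):
    # uniform partition 0 = t_0 <= ... <= t_n = T; Theta^n takes the
    # left-endpoint value Theta(t_k) on [t_k, t_{k+1})
    tk = np.linspace(0.0, T, n + 1)
    ell = np.searchsorted(tk, t, side="right") - 1  # largest k with t_k <= t
    return Theta(tk[ell])

t = np.pi
vals = [Theta_n(t, n) for n in (10, 100, 1000, 10_000)]
print(vals, "->", Theta(t))  # nondecreasing, approaching Theta(t) from below
```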
Let $\ell = \max\{k\le n: t_k\le t\}$. Then for arbitrary $\epsilon\ge0$ \begin{align} \mathbb{E}[N(\Theta^n(t)+\epsilon)] &= \mathbb{E}\left[ \mathbb{E}[N(\Theta^n(t)+\epsilon)-N(\Theta^n(t_\ell))\vert\mathcal{F}(t_\ell)]+N(\Theta^n(t_\ell))\right]\\ & = \mathbb{E}\left[\Theta^n(t)-\Theta^n(t_\ell)+\epsilon\right]+\mathbb{E}\left[N(\Theta^n(t_\ell))\right]\\ &=\cdots\\ & = \mathbb{E}[\Theta^n(t)]+\epsilon, \end{align} where the second equality uses the additional independence assumption above. What remains to be checked is $$\mathbb{E}[N(\Theta(t))] = \lim_{n\to\infty}\mathbb{E}[N(\Theta^n(t))].$$ Because $\Theta^n(t)\nearrow \Theta(t)$, $\lim_{n\to\infty}N(\Theta^n(t)) = N(\Theta(t)-)$ along each sample path. Hence
$$\mathbb{E}[N(\Theta(t))]\ge\lim_{n\to\infty}\mathbb{E}[N(\Theta^n(t))] = \lim_{n\to\infty}\mathbb{E}[\Theta^n(t)] = \mathbb{E}[\Theta(t)].$$
On the other hand, $N$ is non-decreasing and $\Theta^n(t)+\epsilon>\Theta(t)$ for $n$ large, so for all $\epsilon>0$ $$N(\Theta(t))\le \lim_{n\to\infty}N(\Theta^n(t)+\epsilon).$$
Hence $$\mathbb{E}[N(\Theta(t))]\le \lim_{n\to\infty}\mathbb{E}[N(\Theta^n(t)+\epsilon)] = \mathbb{E}[\Theta(t)]+\epsilon.$$
Thus, letting $\epsilon\downarrow 0$, the claim follows from a sandwich argument.
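Finally, a Monte Carlo sanity check of the conclusion with a genuinely history-dependent predictable rate (again my own illustrative choice): $\Lambda(s)=1+\min(A(s-),4)$, so the manager raises the rate after each arrival and $\Lambda\le 5<\infty$. With this rate $A$ is a pure-birth process, so exact simulation is straightforward, and the sample means of $A(t)$ and $\int_0^t\Lambda(s)\,ds$ should agree:

```python
import numpy as np

rng = np.random.default_rng(2)

# History-dependent predictable rate: Lambda(s) = 1 + min(A(s-), 4), i.e. the
# rate increases with each past arrival, capped at 5.  A(t) is then a
# pure-birth process: given count k, the holding time is Exp(1 + min(k, 4)).
def one_path(t_end):
    s, count, integral = 0.0, 0, 0.0
    while True:
        rate = 1.0 + min(count, 4)          # Lambda just after the last arrival
        gap = rng.exponential(1.0 / rate)   # next inter-arrival time
        if s + gap >= t_end:
            integral += rate * (t_end - s)  # rate held until the horizon
            return count, integral
        integral += rate * gap
        s += gap
        count += 1

t = 2.0
samples = [one_path(t) for _ in range(50_000)]
A_t = np.array([a for a, _ in samples], dtype=float)
I_t = np.array([i for _, i in samples])

print(A_t.mean(), I_t.mean())  # the two sample means should agree closely
```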