Information theory books discuss entropy and mutual information for discrete-time processes, such as a sequence of symbols sent with symbol period $T_s$ and their received counterpart.
Can we talk about the mutual information between two continuous time stochastic processes $X(t)$ and $Y(t)$ in a way similar to the mutual information between two random variables $X$ and $Y$?
I guess the problem is that the quantities in information theory are defined in terms of densities of random variables. So can we define an instantaneous mutual information $I(X(t);Y(t))$ between the two processes at a given time $t$?
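For context on the random-variable case the question is generalizing from: here is a small sketch (my own illustration, not from any particular textbook) that compares the closed-form mutual information of two jointly Gaussian variables, $I(X;Y) = -\tfrac{1}{2}\ln(1-\rho^2)$ nats, against a crude plug-in estimate from a 2-D histogram. The correlation `rho`, sample size, and bin count are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8        # correlation between X and Y (assumed for the example)
n = 200_000      # number of samples

# Draw jointly Gaussian X, Y with correlation rho
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Closed form for jointly Gaussian variables, in nats
analytic = -0.5 * np.log(1 - rho**2)

# Plug-in estimate from a 2-D histogram (simple but biased for finite n)
bins = 60
counts, _, _ = np.histogram2d(x, y, bins=bins)
pxy = counts / n                       # joint pmf over histogram cells
px = pxy.sum(axis=1, keepdims=True)    # marginal of X
py = pxy.sum(axis=0, keepdims=True)    # marginal of Y
mask = pxy > 0                         # avoid log(0) on empty cells
est = np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask]))

print(f"analytic I(X;Y) = {analytic:.4f} nats, histogram estimate = {est:.4f} nats")
```

At time instants where $X(t)$ and $Y(t)$ are jointly Gaussian, the same closed form would apply to the pair $(X(t), Y(t))$, which is one way an "instantaneous" mutual information could be made concrete.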