Intuition tells me this question has something to do with stochastic processes. Since I'm not quite familiar with them, I might misuse some terms or definitions.
Suppose that $X_i$, $i \in [0,1]$, are i.i.d. random variables, where $X_i \sim \log N(\mu, \sigma^2)$.
Then, is $$ \int_0^1 X_i di$$ well defined?
If yes, then is it a random variable as well, or is it some sort of "expectation" ?
Thanks in advance!
This is an important question related to the law of large numbers for a continuum of random variables. For finitely many i.i.d. random variables we have assertions of the type
$$\frac{1}{N}\sum_{i=1}^N X_i \rightarrow \mu \text{ as } N\rightarrow\infty$$
under suitable conditions, where $\mathbb{E}[X_i]=\mu$. The convergence is almost sure (strong law of large numbers) or in probability (weak law of large numbers).
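As a quick numerical illustration of the finite-$N$ case (a sketch, not part of the continuum setup below): for lognormal $X_i$ with parameters $\mu$ and $\sigma$ we have $\mathbb{E}[X_i]=e^{\mu+\sigma^2/2}$, and the sample mean settles toward this value as $N$ grows.

```python
import random
import math

# Finite-N law of large numbers for lognormal variables:
# the sample mean of N i.i.d. draws approaches E[X] = exp(mu + sigma^2/2).
random.seed(0)
mu, sigma = 0.0, 0.5
expected = math.exp(mu + sigma**2 / 2)  # about 1.1331

for n in (100, 10_000, 1_000_000):
    sample_mean = sum(random.lognormvariate(mu, sigma) for _ in range(n)) / n
    print(n, sample_mean, abs(sample_mean - expected))
```

The gap between the sample mean and $e^{\mu+\sigma^2/2}$ shrinks roughly like $1/\sqrt{N}$, as the central limit theorem suggests.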
The question you are asking is then whether some sort of law of large numbers exists for a continuum of i.i.d. random variables. If we could invoke such a law, we would assert that $$\int_0^1 X_i \, di \quad ``=" \quad\mu, \qquad \qquad(*)$$ where $\mathbb{E}[X_i]=\mu$. I put the equality sign in quotation marks because the left-hand side is random while the right-hand side is a constant, so we need to ask ourselves in what sense we want this equality to hold.
Definitions
To cast this problem formally, we need to define $X_i(\omega)$ as a function on an appropriate measure space. We need an index space $(I,\mathcal{I},\lambda)$ with $I=[0,1]$, where $\lambda$ is Lebesgue measure, and a sample space $(\Omega,\mathcal{F},P)$. Then we define the product probability space in the usual way as $(I\times\Omega,\mathcal{I}\otimes\mathcal{F},\lambda\otimes P)$.
Consider a function $X(i,\omega)$ defined on $I\times\Omega$. Let $X_{\omega}$ denote the function $X(\cdot,\omega)$ on $[0,1]$ (a sample path) and let $X_{i}$ denote the function $X(i,\cdot)$ on $\Omega$ (a random variable). In addition, assume that the $X_i$ are essentially pairwise independent: for $\lambda$-almost all $i\in[0,1]$, $X_i$ is independent of $X_j$ for $\lambda$-almost all $j\in [0,1]$.
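One way to build intuition for the two-argument function $X(i,\omega)$ is a toy computational sketch (all names here are hypothetical, assuming lognormal $X_i$ as in the question): fix one $\omega$ and realize the sample path lazily, drawing $X_i(\omega)$ the first time index $i$ is queried and caching it, so repeated queries at the same index are consistent while distinct indices give independent draws. Of course, any such simulation only ever touches finitely many indices, which is part of why the genuine continuum case is delicate.

```python
import random

class LognormalPath:
    """Lazy realization of one sample path i -> X(i, omega) for a fixed omega.

    Values at distinct indices are drawn independently (modeling pairwise
    independence); a repeated query at the same index returns the cached
    value, so the path is a well-defined function of i for this omega.
    """

    def __init__(self, mu, sigma, seed=None):
        self.mu, self.sigma = mu, sigma
        self.rng = random.Random(seed)  # this RNG state plays the role of omega
        self.cache = {}

    def __call__(self, i):
        if not 0.0 <= i <= 1.0:
            raise ValueError("index must lie in [0, 1]")
        if i not in self.cache:
            self.cache[i] = self.rng.lognormvariate(self.mu, self.sigma)
        return self.cache[i]

path = LognormalPath(mu=0.0, sigma=0.5, seed=1)
print(path(0.25), path(0.25), path(0.75))
```

Two different `LognormalPath` instances correspond to two different draws of $\omega$, i.e. two different sample paths $X_\omega$.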
Almost sure equality fails
Suppose that we want $(*)$ to hold almost surely for $X(i,\omega)$ as defined above. Then we quickly run into two problems described by Judd (1985).
Below, I state two results that point to the problem with defining $(*)$ in the almost-sure sense. The first result is due to Sun (2006): if $X(i,\omega)$ is jointly measurable with respect to the usual product $\sigma$-algebra $\mathcal{I}\otimes\mathcal{F}$ and the $X_i$ are essentially pairwise independent, then for $\lambda$-almost all $i$, the random variable $X_i$ is constant $P$-almost surely.
The second result is due to Uhlig (1996): for non-degenerate i.i.d. $X_i$, almost every sample path $X_\omega$ fails to be Riemann integrable, so the pathwise integral in $(*)$ is not even well defined.
In other words, if a law of large numbers is to hold for a continuum of i.i.d. random variables in the almost-sure sense, then the $X_i(\omega)$ must be essentially constant random variables.
Solutions
There are two ways out of this problem. First, following Uhlig (1996), one can weaken the notion of integration: the Riemann sums $\frac{1}{n}\sum_{k=1}^n X_{i_k}$ converge to $\mu$ in mean square, so $(*)$ can be interpreted as an $L^2$-limit of Riemann sums rather than a pathwise integral. Second, following Sun (2006), one can enrich the measure structure: on a suitable Fubini extension of the product space $(I\times\Omega,\mathcal{I}\otimes\mathcal{F},\lambda\otimes P)$ there exist non-trivial, essentially pairwise independent, jointly measurable processes, and for these an exact law of large numbers holds, i.e. $(*)$ holds $P$-almost surely.
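The mean-square route can be illustrated numerically (a hedged sketch under the question's lognormal assumption, not a construction from the cited papers): each $n$-point Riemann sum averages $n$ fresh i.i.d. draws, so its variance is $\mathrm{Var}(X)/n$, and its mean-square distance from $\mu = e^{\mu_0+\sigma^2/2}$ shrinks as the partition is refined.

```python
import random
import math

# Mean-square convergence of Riemann sums for a continuum of i.i.d. draws:
# an n-point Riemann sum is an average of n independent lognormal values,
# so its mean squared error around E[X] decays like Var(X)/n.
random.seed(2)
mu0, sigma = 0.0, 0.5
mean = math.exp(mu0 + sigma**2 / 2)
reps = 2000  # independent sample paths per partition size

mse = {}
for n in (10, 100, 1000):
    errs = []
    for _ in range(reps):
        riemann_sum = sum(random.lognormvariate(mu0, sigma) for _ in range(n)) / n
        errs.append((riemann_sum - mean) ** 2)
    mse[n] = sum(errs) / reps
    print(n, mse[n])
```

The printed mean squared errors drop by roughly a factor of 10 for each tenfold refinement, consistent with the $\mathrm{Var}(X)/n$ rate; the almost-sure statement, by contrast, is exactly what the results above rule out on the usual product space.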
References
Judd, K. L., "The law of large numbers with a continuum of IID random variables", Journal of Economic Theory, 35(1), February 1985, pp. 19–25.
Sun, Y., "The exact law of large numbers via Fubini extension and characterization of insurable risks", Journal of Economic Theory, 126(1), January 2006, pp. 31–69.
Uhlig, H., "A law of large numbers for large economies", Economic Theory, 8(1), February 1996, pp. 41–50.
Yeneng Sun has several other papers on this topic besides the one cited above.