On random functions taking values in the space of continuous functions.


Here is the passage that is unclear to me (Theoretical statistics by Keener):

In this section we develop a weak law of large numbers for averages of random functions. This is used in the rest of the chapter to establish consistency and asymptotic normality of maximum likelihood and other estimators.

Let $X_1, X_2, \dots$ be i.i.d., let $K$ be a compact set in $\mathbb{R}^p$, and define $$ W_i(t) = h(t, X_i) \,, \quad t \in K$$ where $h(t,x)$ is a continuous function of $t$ for all $x$. Then $W_1, W_2, \dots$ are i.i.d. random functions taking values in $C(K)$, the space of continuous functions on $K$.

I do not understand how $W_1, W_2, \dots$ take values in $C(K)$. Does each realization of the random variable yield a continuous function as its value? So $W_i(t)$ depends not only on $t$ and $i$ but also on the realized outcome in the sample space?

I realise I have probably never fully understood what the definition $W_i(t) = h(t, X_i)$ really means. Could somebody guide me through it?

Best answer

Think about it this way: once we "know" the value of a particular $X_i$, the second argument of $h$ is fixed. Since $h(t, x)$ is continuous in $t$ for every $x$, what remains is a continuous function of $t$ alone; and since $t$ ranges over $K$, it is a continuous function on the domain $K$.

For example, let's say that the $X_i$ are uniformly distributed in $[0, 1]$, $K = [0, 2]$, and $h(t, x) = t + x$.

  • Say that $X_1 = 0$, so $W_1(t) = h(t, 0) = t + 0 = t$.
  • If $X_2 = 0.5$ then $W_2(t) = h(t, 0.5) = t + 0.5$.

You can see that each of these is a continuous function on $[0, 2]$, i.e., an element of $C([0, 2])$.
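The example above can be sketched in code: each draw of $X_i$ fixes the second argument of $h$, and what remains is a plain function of $t$. This is a minimal illustration with the assumed setup $h(t, x) = t + x$, $X_i \sim \mathrm{Uniform}[0, 1]$, and $K = [0, 2]$ from the example, not anything specific to Keener's text.

```python
import random

# Assumed example: h(t, x) = t + x, as in the answer above.
def h(t, x):
    return t + x

# Fixing x = x_i leaves a continuous function of t alone;
# that function is the value of the random function W_i.
def make_W(x_i):
    return lambda t: h(t, x_i)

random.seed(0)
X = [random.random() for _ in range(3)]  # realizations of X_1, X_2, X_3
W = [make_W(x) for x in X]               # each W[i] is an element of C([0, 2])

# Evaluating W_1 at a few points of K = [0, 2]: W_1(t) = t + X_1.
for t in (0.0, 1.0, 2.0):
    print(t, W[0](t))
```

The key point the code makes concrete is that `W[i]` is itself a function: the randomness enters only through the realized value of $X_i$, and once that value is drawn, `W[i]` is an ordinary continuous function on $K$.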