I am reading DeGroot's book titled 'Optimal Statistical Decisions' in which he says the following:
If $S$ is the $n$-dimensional space $\mathbb{R}^n$, then the $\sigma$-field will be taken to be the $\sigma$-field of Borel sets, i.e., the smallest $\sigma$-field containing all $n$-dimensional intervals.
and
If the function $g$ is measurable with respect to a $\sigma$-field and $B$ is any Borel set on the real line, then the subset $g^{-1}(B)$ of $S$, defined by the relation $g^{-1}(B)=\{s:g(s)\in B\}$, also belongs to the $\sigma$-field.
I have no prior exposure to measure theory or topology, so I find these two statements difficult to comprehend. However, I do understand what a $\sigma$-field is and the three properties that a collection of subsets of a sample space $S$ must satisfy in order to be a $\sigma$-field.
I hope someone can provide a simple and concise explanation of what a Borel set is, so that I can develop a deeper understanding of these two statements above. I only want to learn from a probabilistic standpoint right now and would appreciate it if the explanation would leave out measure theory and topology altogether. Thanks.
A Borel set is actually a simple concept: it is any set that can be built from open sets (or their complements, the closed sets) by repeatedly taking countable unions, countable intersections, and complements. It really is nothing more than that.
Why we need this concept is a much more interesting question, though, and it is hard to answer fully without venturing into measure theory. But I will attempt an explanation. Let's start with some motivation (courtesy Shreve Vol. II, Ch. 1): suppose we draw a number $x$ uniformly at random from $[0,1]$, and let $f(x) = 1$ if $x$ is irrational and $f(x) = 0$ if $x$ is rational. What is $E(f(x))$?
Well, for a discrete random variable, expectation is traditionally defined as follows: $$ E(f(x)) = \sum_{i = 1}^\infty f(x_i) P(x_i) \label{def_exp} \tag{1} $$
where the $x_i \in [0,1]$ represent all possible numbers one can draw. A very naïve approach would be to note that $P(x_i) = 0$ for every $x_i \in [0,1]$ and conclude that the expected value of $f(x)$ is zero. But that reasoning is invalid: probabilities can only be summed over a countable set of events (see Kolmogorov's axioms), and the set of all reals in $[0,1]$ is not countable, so definition \ref{def_exp} simply does not apply here.
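For a countable (here, finite) set of outcomes, definition (1) works exactly as written. A minimal sketch of that case, using a fair die as my own illustrative example (not from the question):

```python
from fractions import Fraction

# A fair six-sided die: a finite set of outcomes, so definition (1)
# applies directly as a sum over the outcomes.
outcomes = range(1, 7)
P = {x: Fraction(1, 6) for x in outcomes}  # P(x_i) = 1/6 for each face

def f(x):
    return x * x  # an arbitrary function of the outcome

# E(f(X)) = sum_i f(x_i) * P(x_i)
expectation = sum(f(x) * P[x] for x in outcomes)
print(expectation)  # 91/6
```

The trouble described above is precisely that this sum has no sensible analogue when the outcomes form an uncountable set like $[0,1]$.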
That's where Borel sets come in. What you do is let a random variable $X$ map the outcomes of a sample space $\Omega$ into $\mathbb{R}$, and transfer the probability onto the Borel subsets of $\mathbb{R}$. Specifically, we construct a probability measure $\mu$ as follows: $$ \mu(B) = P\{\omega : X(\omega) \in B \}, \qquad \omega \in \Omega; B \in \mathcal{B}(\mathbb{R}) $$
where $\mathcal{B}(\mathbb{R})$ is the Borel $\sigma$-field on $\mathbb{R}$. Because this definition satisfies Kolmogorov's axioms, it qualifies as a valid probability measure. If, furthermore, we can find such a measure that gives $x$ a uniform distribution on $[0,1]$, then we can use it in our expectation calculation above.
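As a sanity check, $\mu(B) = P\{\omega : X(\omega) \in B\}$ can be estimated by simulation: draw many samples of $X$ and count how often they land in $B$. A sketch for uniform $X$ on $[0,1]$ (the interval $(0.2, 0.7)$ and the sample size are arbitrary choices of mine):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def mu_estimate(a, b, n=100_000):
    """Monte Carlo estimate of mu((a,b)) = P{omega : X(omega) in (a,b)}
    for X uniform on [0,1]; it should approach the length b - a."""
    draws = (random.random() for _ in range(n))
    return sum(a < x < b for x in draws) / n

est = mu_estimate(0.2, 0.7)
print(est)  # close to 0.7 - 0.2 = 0.5
```

This previews the choice made next: for an interval, the natural uniform measure is just its length.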
And we can very easily come up with such a measure for $[0,1]$: for any open interval $(a,b) \subset [0,1]$ we choose $\mu((a,b)) = b - a$ as our probability measure. Now, what does this measure assign to our set of rationals? Let $q_1 = 0, q_2, q_3, \cdots$ be an enumeration of all the rational numbers in $[0,1]$. For any $\epsilon > 0$ and each $i$, define the open interval:
$$ Q_i = \left(q_i - \epsilon/2^i, q_i + \epsilon/2^i\right) $$
Then, $$ Q_{[0,1]} := \mathbb{Q} \cap \left[0,1\right] \subseteq \bigcup_{i = 1}^\infty Q_i \\ \implies \mu \left( Q_{[0,1]} \right) \leq \sum_{i=1}^\infty \epsilon/2^{i-1} = 2\epsilon $$
But $\epsilon > 0$ is arbitrary, which means $\mu\left(Q_{[0,1]}\right) = 0$ (strictly speaking, a few more steps are needed before we can conclude this, but I omit them to keep the scope of this answer limited to probability theory).
But if $\mu\left(Q_{[0,1]}\right) = 0$ then $\mu\left(Q^c_{[0,1]}\right) = 1$. As a last step, realise that $\mu$ plays the role of $P$ in \ref{def_exp}, because it is a valid probability measure for a uniform distribution on $[0,1]$. Therefore, $E(f(x)) = 1 \cdot \mu\left(Q^c_{[0,1]}\right) + 0 \cdot \mu\left(Q_{[0,1]}\right) = 1$.
And that's one reason why we need a measure-theoretic viewpoint to properly appreciate probability. More advanced applications, in financial mathematics for instance, cannot work properly unless one takes the subtleties of measures into account.