Given a $\sigma$-algebra $\mathcal{S}$, is $\mu:\mathcal{S}\to \{0\}$ a valid measure?


Definition. A real-valued function $\mu$ defined on a $\sigma$-algebra $\mathcal{S}$ is a measure if:

  1. $\mu(\emptyset)=0$;
  2. $\mu(A)\geq0$ for all $A\in\mathcal{S}$;
  3. $\mu\left(\bigcup_k A_k\right)=\sum_k\mu(A_k)$ whenever $\{A_k\}$ is a finite or countable sequence of pairwise disjoint sets from $\mathcal{S}$, i.e., $A_i\cap A_j=\emptyset$ for $i\neq j$.

According to this definition, I seem to be allowed to define the measure $\mu:\mathcal{S}\to \{0\}$ with $\mu(A)=0$ for all $A\in\mathcal{S}$. However, it is then unclear what Lebesgue integration means with respect to this measure. For instance, it no longer seems true that two functions having the same integral over a set $A\in\mathcal{S}$ must be equal almost everywhere in $A$.
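On a finite sample space the three axioms can be checked exhaustively. A minimal sketch in Python (the ground set $\Omega=\{1,2,3\}$ and the `powerset` helper are illustrative choices, not part of the question):

```python
from itertools import combinations

def powerset(s):
    # All subsets of s as frozensets; the power set is a sigma-algebra.
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

omega = {1, 2, 3}
S = powerset(omega)
mu = {A: 0 for A in S}  # the zero measure: mu(A) = 0 for every A

# Axiom 1: mu(empty set) = 0
assert mu[frozenset()] == 0
# Axiom 2: nonnegativity
assert all(mu[A] >= 0 for A in S)
# Axiom 3: additivity on disjoint sets (finite additivity suffices here,
# since the sigma-algebra itself is finite)
for A in S:
    for B in S:
        if A & B == frozenset():
            assert mu[A | B] == mu[A] + mu[B]

print("all measure axioms hold for the zero measure")
```

Every check passes trivially, since both sides of each equation are $0$.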

How does one get around this?

Accepted answer:

Every measurable function $f$ is integrable with respect to your $\mu$, with $\int f\,d\mu=0$.

Further, $f=g$ a.e. holds for any two measurable functions, since $\mu(\{x: f(x)\neq g(x)\})=0$. In particular, the property you mention holds vacuously rather than failing.
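To spell out the first claim (a standard computation from the definition of the Lebesgue integral, not specific to this question): for a nonnegative simple function $\varphi=\sum_{i=1}^n c_i\,\mathbf{1}_{A_i}$ with $A_i\in\mathcal{S}$,

$$\int\varphi\,d\mu=\sum_{i=1}^n c_i\,\mu(A_i)=\sum_{i=1}^n c_i\cdot 0=0,$$

so for any nonnegative measurable $f$,

$$\int f\,d\mu=\sup\left\{\int\varphi\,d\mu : 0\le\varphi\le f,\ \varphi\ \text{simple}\right\}=0,$$

and for general measurable $f$, $\int f\,d\mu=\int f^{+}\,d\mu-\int f^{-}\,d\mu=0-0=0$.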