I have come across the following argument concerning the Bayes error:
Let $r(x)=\mathbb{E}[Y|X=x]=\mathbb{P}[Y=1 \mid X=x]$ be the regression function of $Y$ on $X$, where $Y\in\{0,1\}$. Then one can show that the Bayes error is
$L^* = \mathbb{E}[\min(r(X),1-r(X))]$.
Suppose now that $X$ has a density $f$. Denote $p=\mathbb{P}[Y=1]$. Then
$L^* = \int \min (r(x),1-r(x)) \, f(x) \, dx = \int \min ((1-p)f_0(x),pf_1(x)) \, dx$,
where $f_0,f_1$ are the class-conditional densities; the second equality uses Bayes' rule, $r(x)=pf_1(x)/f(x)$ together with $f=pf_1+(1-p)f_0$. Now comes the following claim:
If $f_0$ and $f_1$ are non-overlapping, i.e. $\int f_0(x) f_1(x) \, dx = 0$, then obviously $L^*=0$.
I cannot see at a glance how the property of being non-overlapping is related to orthogonality in $L_2$, which is what the condition $\int f_0(x) f_1(x) \, dx = 0$ literally expresses. Nor is it obvious to me how the expression for $L^*$ can be bounded from above in terms of $\int f_0(x) f_1(x) \, dx$. Is some elementary inequality for $\min$ involved here? What am I missing?
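(For what it's worth, here is a quick numerical sketch I wrote to convince myself that the formula $L^* = \int \min((1-p)f_0(x), pf_1(x)) \, dx$ behaves as the claim says; the uniform densities and the value of $p$ are my own illustrative choices, not part of the original argument.)

```python
import numpy as np

# Numerical sanity check of  L* = ∫ min((1-p) f0(x), p f1(x)) dx.
# Densities and parameter values are illustrative choices only.

p = 0.3                          # P[Y = 1]
x = np.linspace(-1.0, 3.0, 200001)
dx = x[1] - x[0]

def uniform_pdf(x, a, b):
    """Density of Uniform(a, b)."""
    return np.where((x >= a) & (x <= b), 1.0 / (b - a), 0.0)

# Non-overlapping case: supports [0, 1] and [2, 3] are disjoint,
# so f0(x) f1(x) = 0 everywhere, and the claim says L* = 0.
f0 = uniform_pdf(x, 0.0, 1.0)
f1 = uniform_pdf(x, 2.0, 3.0)
L_star = np.sum(np.minimum((1 - p) * f0, p * f1)) * dx
print(L_star)                    # 0.0

# Overlapping case: Uniform(0, 1) vs Uniform(0.5, 1.5); on the overlap
# [0.5, 1] the integrand is min(0.7 * 1, 0.3 * 1) = 0.3, so the exact
# Bayes error is 0.3 * 0.5 = 0.15.
g1 = uniform_pdf(x, 0.5, 1.5)
L_star_overlap = np.sum(np.minimum((1 - p) * f0, p * g1)) * dx
print(L_star_overlap)            # ≈ 0.15
```

So the formula does return a strictly positive error exactly when the weighted densities overlap, which is consistent with the claim, even though the connection to the $L_2$ condition is what I am asking about.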