I am reviewing this paper on the approximating power of neural networks and I came across a definition that I could not quite understand. The definition reads:

We say that $\sigma$ is *discriminatory* if, for a measure $\mu \in M(I_n)$,
$$\int_{I_n} \sigma\left(y^{T}x + \theta\right)\, d\mu(x) = 0 \quad \text{for all } y \in \mathbb{R}^{n} \text{ and } \theta \in \mathbb{R}$$
implies that $\mu = 0$,

where $I_n$ is the $n$-dimensional unit hypercube $[0,1]^n$ and $M(I_n)$ is the space of finite, signed, regular Borel measures on $I_n$.
The only thing I could get from this definition, which again does not seem plausible enough, is: since the integral being $0$ for all $y$ and $\theta$ must imply that the measure $\mu$ is $0$, $\sigma$ must be non-zero. I am not sure if this is right, though.
I literally could not find any other literature or similar definitions on this, and I've looked in a number of textbooks such as Kreyszig, Rudin, and Stein & Shakarchi.
Any insight/help?
EDIT
So, a function $f$ is discriminatory if the condition in the above definition is sufficient for $\mu \in M(I_n)$ to be $0$. In a sense, if the condition is met, then $f$ "discriminates" or "reveals" that the measure $\mu$ is $0$ over all of $I_n$.
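To build some intuition (this is just a numerical sketch of mine, not from the paper): take $\mu$ to be a signed combination of point masses, say $\mu = \delta_{p} - \delta_{q}$ with $p \neq q$ in $I_2$, so $\mu \neq 0$ even though $\mu(I_2) = 0$. The integral in the definition then reduces to a difference of two sigmoid values, and a suitable choice of $y$ and $\theta$ makes it nonzero, which is exactly what "discriminating" the nonzero measure means here:

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# A signed measure on I_2 given by two point masses of opposite sign:
# mu = delta_p - delta_q, so mu != 0 even though mu(I_2) = 0.
p = np.array([0.2, 0.3])
q = np.array([0.7, 0.8])

def integral(y, theta):
    # integral of sigmoid(y.x + theta) d mu(x)
    #   = sigmoid(y.p + theta) - sigmoid(y.q + theta)
    return sigmoid(y @ p + theta) - sigmoid(y @ q + theta)

# With y = 0 the integrand is constant, so the integral vanishes
# for every theta (a constant cannot see where the mass sits):
print(integral(np.zeros(2), 1.5))   # 0.0
# But one direction y that separates p from q already detects mu != 0:
y = np.array([1.0, 1.0])
print(integral(y, 0.0))             # approx -0.195, i.e. nonzero
```

So the point of the definition is the converse direction: if the integral vanishes for *every* $y$ and $\theta$ and the only way that can happen is $\mu = 0$, then $\sigma$ is discriminatory.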
Remark: This would mean that the set $I_n$ on which $\mu$ is defined is $\mu$-negligible.

I am not entirely sure my last remark is correct, though, since $\mu$ is signed and not necessarily positive. I have read that the stronger statement for signed measures would be that every measurable subset $A$ of $I_n$ satisfies $\mu(A) = 0$. I do not think my remark is informative enough, however, and this definition probably deserves a fuller justification.
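For concreteness, the stronger statement for signed measures can be phrased via the Jordan decomposition (this is standard measure theory, e.g. in Rudin, not something specific to the paper):

```latex
\begin{align*}
&\text{Jordan decomposition: } \mu = \mu^{+} - \mu^{-}, \qquad |\mu| = \mu^{+} + \mu^{-}.\\
&\text{The following are equivalent: } \mu(A) = 0 \text{ for every Borel } A \subseteq I_n
  \iff |\mu|(I_n) = 0 \iff \mu = 0 \text{ in } M(I_n).\\
&\text{But } \mu(I_n) = 0 \text{ alone is weaker: } \mu = \delta_{1/4} - \delta_{3/4}
  \text{ on } I_1 \text{ has } \mu(I_1) = 0 \text{ yet } |\mu|(I_1) = 2.
\end{align*}
```

So "$\mu = 0$" in the definition means the vanishing of $\mu$ as an element of $M(I_n)$ (equivalently, of its total variation $|\mu|$), not merely $\mu(I_n) = 0$.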