In theoretical physics one often encounters the following rationale: if $f$ and $g$ are functions on $\mathbf{R}^n$ satisfying some technical conditions, and $\displaystyle\int_\Omega f=\int_\Omega g$ for all open sets $\Omega$, then $f=g$. (For instance, this is how one passes from Gauss's law in "integral form", $\displaystyle\int_\Omega\operatorname{div} E = \int_\Omega \rho$, to its "differential form", $\operatorname{div} E=\rho$.)
Now my problem is: I want to know these technical conditions. The statement seems obvious, but it disturbs me not to know under what conditions this reasoning is legitimate. Is there a theorem that answers this question? I would also be happy with a good reference.
If $ f $ and $ g $ are continuous $ L^{1} $-functions, then you can certainly conclude that $ f = g $. The $ L^{1} $-condition is there to ensure that the integrals $ \displaystyle \int_{\Omega} f $ and $ \displaystyle \int_{\Omega} g $ are defined in the first place (continuity alone does not guarantee this, especially if you let $ \Omega = \mathbb{R}^{n} $).
Now, suppose that $ \displaystyle \int_{\Omega} f = \int_{\Omega} g $ for all open subsets $ \Omega $ of $ \mathbb{R}^{n} $. Assume, for the sake of contradiction, that $ f \neq g $. Without loss of generality, we may suppose that there exists an $ x \in \mathbb{R}^{n} $ such that $ f(x) < g(x) $. Set $ \varepsilon := \dfrac{g(x) - f(x)}{2} > 0 $. By the continuity of $ f $ and $ g $, we can find an open ball $ U $ around $ x $, of positive and finite Lebesgue measure $ \lambda(U) $, such that $ g(y) - f(y) > \varepsilon $ for all $ y \in U $. Then $ \displaystyle \int_{U} g - \int_{U} f = \int_{U} (g - f) \geq \varepsilon \, \lambda(U) > 0 $, which contradicts our initial hypothesis. The assumption is therefore false, so we conclude that $ f = g $.
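As a quick numerical sanity check of the key step (continuous functions that differ at a point have different integrals over a small neighborhood of that point), here is a sketch in Python. The example functions $f(x) = x$ and $g(x) = x + e^{-x^2}$, the neighborhood $U = (-0.1, 0.1)$, and the midpoint-rule integrator are all my own illustrative choices, not part of the argument above.

```python
import math

def integrate(h, a, b, n=10_000):
    """Midpoint-rule approximation of the integral of h over [a, b]."""
    dx = (b - a) / n
    return sum(h(a + (i + 0.5) * dx) for i in range(n)) * dx

# Hypothetical continuous functions that differ at x = 0:
f = lambda x: x
g = lambda x: x + math.exp(-x**2)   # g(0) - f(0) = 1 > 0

# On the small neighborhood U = (-0.1, 0.1) of 0, the integrals differ:
I_f = integrate(f, -0.1, 0.1)
I_g = integrate(g, -0.1, 0.1)
print(I_f < I_g)  # the strict inequality used in the proof
```

Of course, a finite computation proves nothing; it merely illustrates why equality of the integrals over every open set forces $f = g$ pointwise for continuous functions.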