I have seen many variations of this statement in different texts.
$\bullet$ "Two densities with the same cumulative distribution function are equal except on a set of Lebesgue measure zero."
What does this mean?
As a follow-up, what does this mean in the context of maximum entropy?
Colloquially, sets of measure zero in probability are "impossible events". What does it mean for a sequence of "impossible events" to occur if our notion of probability is a frequentist one and we are building a pdf empirically?

I know that maximum entropy distributions are defined as those which are "noncommittal" with respect to missing information: probabilities are assigned to additional points in the sample space so that the entropy of the pdf is maximized. But an exceedingly rare event or an "impossible event" would represent no information, since it violates some unspoken rule of the experiment. Most significantly for me, maximum entropy distributions are unique. How does this work if the CDF is non-unique up to sets of Lebesgue measure zero?
Let $f_1$ and $f_2$ be two density functions on $\mathbb{R}$. The set $A := \{x \in \mathbb{R} : f_1(x) \ne f_2(x)\}$ is a subset of $\mathbb{R}$. The claim is that if $f_1$ and $f_2$ are densities with the same cumulative distribution function (i.e. $\int_{-\infty}^x f_1(t) \, dt = \int_{-\infty}^x f_2(t) \, dt$ for all $x \in \mathbb{R}$), then the Lebesgue measure of $A$ is zero.
For example, it may be that $f_1$ and $f_2$ are exactly the same except at finitely many points, or at countably many points. But the disagreement need not be countable: the two densities can even differ on an uncountable set, such as the Cantor set, as long as that set has Lebesgue measure zero. The claim guarantees only that the set $A$ of disagreement is Lebesgue-null.
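To see numerically why changing a density on a measure-zero set leaves the CDF untouched, here is a small sketch of my own (the uniform density and the perturbation point $0.5$ are my choices, not from the post): perturb the uniform density on $[0,1]$ at the single point $x = 0.5$, then watch a trapezoid-rule approximation of the CDF at $x = 1$ become insensitive to the perturbation as the grid is refined, since the one bad point contributes only $(5 - 1)\,dx \to 0$.

```python
import numpy as np

def f1(x):
    """Uniform density on [0, 1]."""
    return np.where((x >= 0) & (x <= 1), 1.0, 0.0)

def f2(x):
    """Same density, except f2(0.5) = 5.0: a change on a set of measure zero."""
    return np.where(np.isclose(x, 0.5), 5.0, f1(x))

def trap(y, dx):
    """Composite trapezoid rule on an equally spaced grid."""
    return dx * (0.5 * y[0] + y[1:-1].sum() + 0.5 * y[-1])

# Approximate both CDFs at x = 1 on grids that contain the altered
# point 0.5; the single altered point contributes (5 - 1) * dx,
# which vanishes as the grid is refined.
diffs = []
for n in (10, 100, 1000, 10000):
    xs = np.linspace(0.0, 1.0, n + 1)  # 0.5 is a grid point (n even)
    dx = xs[1] - xs[0]
    diffs.append(abs(trap(f1(xs), dx) - trap(f2(xs), dx)))
    print(f"n = {n:6d}   |F1(1) - F2(1)| ~ {diffs[-1]:.6f}")
```

The discrepancy shrinks by a factor of 10 with each refinement, mirroring the fact that in the limit (the Lebesgue integral) a single point contributes nothing at all.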
Regarding your last sentence: no, a probability distribution has a unique CDF. The statement here is that the PDF is "almost" unique too, in a certain sense. (This is made precise by the notion of being equal "almost everywhere.")
I feel your main question has nothing to do with the stuff about maximum entropy...