For a continuous distribution on $\mathbb{R}$, the probability of any single point is $0$, so I'm not sure what it means to sample $m$ elements from a continuous distribution.
Say there is a continuous distribution $D$, a number $z$, and a function $f$ such that:

- $f(x)=1$ for $x < z$, except at finitely many points,
- $f(x)=0$ for $x \ge z$, except at finitely many points,
- $P(X < z) > 0$ and $P(X \ge z) > 0$ for $X \sim D$.
So if I have a random sample $(X_1, \dots, X_m)$ from $D$ and observe $X_1 > z$, can I conclude that $f(X_1)=0$?
And if I observe $f(X_2)=1$, can I conclude that $X_2 < z$?
Sampling $m$ elements simply means obtaining $m$ observed values $x_1, \dots, x_m$ of the random variables $X_1, \dots, X_m$, each distributed according to $D$, exactly as in the discrete case. If you prefer, think of it as choosing a number that has the correct probability of lying in any given interval.
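For concreteness, here is a minimal sketch in Python; NumPy, a standard normal for $D$, and $m = 5$ are all illustrative assumptions, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)

m = 5
# Draw m independent observations x_1, ..., x_m from D,
# taken here to be the standard normal distribution.
x = rng.normal(loc=0.0, scale=1.0, size=m)
print(x)

# "Correct probability of being in any interval": the empirical
# fraction of a large sample falling in [a, b] approaches
# P(a <= X <= b) under D.
a, b = -1.0, 1.0
big = rng.normal(size=1_000_000)
print(np.mean((big >= a) & (big <= b)))  # ~0.6827 for the standard normal
```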
If your observation $x_i$ of $X_i$ satisfies $x_i > z$, then with probability one it satisfies $f(x_i) = 0$: the exceptional points form a finite set, which has probability $0$ under a continuous distribution. Strictly, the statement is $P(f(X_i)=0 \mid X_i > z) = 1$; that is, with probability $1$ your conclusion holds. Similarly for $X_2$: given $f(X_2)=1$, you have $X_2 < z$ with probability $1$.
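A quick simulation makes this concrete; the particular $D$ (again a standard normal), the threshold $z$, and the finite exceptional set below are all hypothetical choices:

```python
import numpy as np

rng = np.random.default_rng(1)

z = 0.5
exceptions = {-2.0, 0.25, 3.0}  # a finite set on which f deviates

def f(x):
    # f(x) = 1 for x < z and f(x) = 0 for x >= z,
    # with the values flipped on the finite exceptional set.
    base = 1 if x < z else 0
    return 1 - base if x in exceptions else base

x = rng.normal(size=100_000)
above = x[x > z]
# With probability 1 a draw from a continuous distribution misses
# any fixed finite set, so f is 0 on every observed x_i > z.
print(all(f(v) == 0 for v in above))  # True
```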
The key points are:

- The fact that something happens with probability $0$ (or $1$) doesn't mean it's impossible (or certain); see the sketch after this list.
- You make statements about specific observed values of random variables by referring to particular observations, and statements about probabilistic deductions you can make about random variables by using conditional probabilities.
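As a rough illustration of the first point (the uniform distribution and the sample size are arbitrary choices): every value you draw was itself a probability-$0$ event, yet it occurred.

```python
import numpy as np

rng = np.random.default_rng(2)

# Each individual real number has probability 0 under a continuous
# distribution, yet every draw produces one such number: probability 0
# is not the same as impossibility.
x1 = rng.uniform(0.0, 1.0)
print(x1)  # this exact value had probability 0 of occurring

# Conversely, a fixed target point is essentially never hit again.
hits = np.sum(rng.uniform(0.0, 1.0, size=1_000_000) == x1)
print(hits)  # 0 with overwhelming probability
```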