Questions:
For random variables $A,B,C$ and some function $g$ (say, Borel-measurable and bounded/integrable), if $A$ and $g(B,C)$ have the same distribution, do there necessarily exist random variables $D$ and $E$ such that $A=g(D,E)$ (at least almost surely)?
If this doesn't hold in general, are there conditions under which such $D$ and $E$ will exist?
Context:
A sum of i.i.d. Bernoulli random variables is binomial. Conversely, my understanding is that for every binomial random variable $X$ there exist i.i.d. Bernoulli random variables whose sum equals $X$, and not just in distribution but (at least) almost surely. I don't think this is trivial, since the definition of the binomial distribution doesn't depend on first having defined the Bernoulli distribution; in fact, I recall learning the binomial before the Bernoulli in undergrad.
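(As a sanity check on the "sum of i.i.d. Bernoullis is binomial" direction, here is a brute-force enumeration, not part of the argument itself: summing the probabilities of all $2^n$ outcomes of $n$ independent Bernoulli($p$) trials recovers the $\operatorname{Binomial}(n,p)$ pmf. The function names are mine, just for illustration.)

```python
from itertools import product
from math import comb, isclose

def sum_of_bernoullis_pmf(n, p):
    """Pmf of the sum of n independent Bernoulli(p) variables,
    computed by enumerating all 2**n outcomes."""
    pmf = [0.0] * (n + 1)
    for bits in product([0, 1], repeat=n):
        prob = 1.0
        for b in bits:
            prob *= p if b == 1 else 1 - p
        pmf[sum(bits)] += prob
    return pmf

def binomial_pmf(n, p):
    """Closed-form Binomial(n, p) pmf."""
    return [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

n, p = 5, 0.3
assert all(isclose(a, b) for a, b in
           zip(sum_of_bernoullis_pmf(n, p), binomial_pmf(n, p)))
```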
This question (This random variable $Z$ seems to have the same distribution as $\min\{X,Y\}$ and $|X-Y|$, for $X,Y \sim \operatorname{Unif}(0,1)$) and that question (Why do $\min(X,Y)$ and $|X-Y|$ have the same distribution when $X,Y\sim U(0,1)$?).
For $R=\min\{X,Y\}$ and $Q=|X-Y|$, I was able to find random variables $I,J,G,H$ such that $R=|I-J|$ and $Q=\min\{G,H\}$. However, neither $(I,J)$ nor $(G,H)$ necessarily stands in the same relationship as $(X,Y)$ (namely, that $X$ and $Y$ are i.i.d.), and none of $I,J,G,H$ is necessarily $\operatorname{Unif}(0,1)$. I was thinking of asking later whether such $(I,J)$ or $(G,H)$ exist, but I wanted to settle the existence of $D,E$ first.
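(For what it's worth, the distributional identity from the linked questions can be checked deterministically: for $X,Y$ i.i.d. $\operatorname{Unif}(0,1)$, both $\min\{X,Y\}$ and $|X-Y|$ have CDF $F(t)=1-(1-t)^2$ on $[0,1]$. The sketch below approximates each CDF by averaging indicators over a midpoint grid on the unit square; the grid size and tolerance are my own ad hoc choices.)

```python
# Grid approximation of P(stat(X, Y) <= t) for X, Y iid Unif(0,1).
N = 500  # grid resolution (assumption: fine enough for ~1e-2 accuracy)
grid = [(i + 0.5) / N for i in range(N)]

def grid_cdf(stat, t):
    """Fraction of midpoint grid cells (x, y) in (0,1)^2 with stat(x, y) <= t."""
    return sum(1 for x in grid for y in grid if stat(x, y) <= t) / N**2

for t in (0.25, 0.5, 0.75):
    exact = 1 - (1 - t)**2          # common CDF of min{X,Y} and |X-Y|
    assert abs(grid_cdf(min, t) - exact) < 1e-2
    assert abs(grid_cdf(lambda x, y: abs(x - y), t) - exact) < 1e-2
```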
The answer to your first question is no, at least not in general. The second question is a bit more subtle: of course a necessary and sufficient condition is that such $D,E$ exist with $A=g(D,E)$, but that is not really useful.
Let me argue why the answer to the first question is no. Consider the probability space $\Omega = \{0,1,2\}$ with the $\sigma$-algebra $\mathcal{P}(\Omega)$ and the probability measure given by $$\mathbb{P}(\{x\}) = \begin{cases} \frac14 & ,x=0 \\ \frac12 & ,x=1 \\ \frac14 &,x=2 \end{cases}.$$ Furthermore, consider the random variable $X:\Omega \rightarrow \mathbb{R}$ defined by $X(\omega)=\omega$. Clearly $\mathbb{P}(X=x)=\mathbb{P}(\{x\})$ for $x=0,1,2$, and therefore $X\sim \operatorname{Binomial}(2,\frac12)$.

Now, since our probability space $\Omega$ is quite small, there are not many ways to construct a $\operatorname{Bernoulli}(\frac12)$ variable on it; in fact, there are only two, because the only events of probability $\frac12$ are $\{1\}$ and $\{0,2\}$: $$Y(\omega)=\begin{cases} 1 &,\omega =0,2 \\ 0 &,\omega =1 \end{cases} \quad \text{and} \quad Z(\omega)=\begin{cases} 1 &,\omega=1 \\ 0 &,\omega=0,2\end{cases}.$$

First of all, $Y$ and $Z$ are not independent, since $Z=1-Y$; moreover, $Y+Z \equiv 1 \neq X$. Hence, on this probability space, $X$ cannot be written as a sum of i.i.d. $\operatorname{Bernoulli}(\frac12)$ variables.
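Since the counterexample lives on a three-point space, it can even be verified exhaustively by machine. The sketch below (my own illustration, using exact rational arithmetic) lists every $\{0,1\}$-valued function on $\Omega$, keeps the $\operatorname{Bernoulli}(\frac12)$ ones, and checks that no pair of them sums pointwise to $X$; since every sample point has positive mass, almost-sure equality here would mean equality everywhere.

```python
from itertools import product
from fractions import Fraction

omega = [0, 1, 2]
P = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
X = {w: w for w in omega}  # X(omega) = omega, distributed Binomial(2, 1/2)

def prob(event):
    """Probability of an event, i.e. a set of sample points."""
    return sum(P[w] for w in event)

# All {0,1}-valued random variables on Omega, as tuples (V(0), V(1), V(2)),
# filtered down to those with P(V = 1) = 1/2.
bernoulli_half = [V for V in product([0, 1], repeat=3)
                  if prob({w for w in omega if V[w] == 1}) == Fraction(1, 2)]
assert len(bernoulli_half) == 2  # exactly the Y and Z above

# No pair (independent or not) sums to X at every sample point; since every
# point has positive mass, this rules out almost-sure equality as well.
for Y, Z in product(bernoulli_half, repeat=2):
    assert any(Y[w] + Z[w] != X[w] for w in omega)
```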