I was working with Bernoulli Mixture models for a Machine Learning class, and I stumbled across a (to me, surprising) identity.
Question
The identity in question is
$$\forall x \in \{0,1\}, a,b \in \mathbb{R} : ax + b(1-x) = a^xb^{(1-x)}$$
It is trivially provable by cases: $x = 1$ gives $a$ on both sides, and $x = 0$ gives $b$.
Interestingly, the form $ax + b(1-x)$ bears a striking resemblance to the logarithm of the right-hand side: $\ln(a^xb^{(1-x)}) = x\ln{a} + (1-x)\ln{b}$.
Is there any actual relation, or is this mere mathematical coincidence?
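The proof by cases can be checked numerically. A minimal sketch (the values of $a$ and $b$ are my own arbitrary choices):

```python
import math

# Check a*x + b*(1-x) == a**x * b**(1-x) for x in {0, 1},
# along with the linear-in-x form of the log.
for a, b in [(0.3, 0.7), (2.5, 4.0)]:
    for x in (0, 1):
        linear = a * x + b * (1 - x)
        product = a**x * b**(1 - x)
        assert math.isclose(linear, product)
        # The log of the product form is the familiar linear expression:
        assert math.isclose(math.log(product),
                            x * math.log(a) + (1 - x) * math.log(b))
```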
Background
Let $x \in \{0,1\}^D$ be a vector of binary random variables, each $x_d$ drawn from a Bernoulli distribution with mean $p_d$, and collect the parameters into $p \in (0,1)^D$. The task was to write an expression for $\Pr(x \mid p)$.
I originally wrote
$$\Pr(x \mid p) = \prod_{d = 1}^D x_dp_d + (1-x_d)(1-p_d)$$
Later, I realized that this was equivalent to
$$\Pr(x \mid p) = \prod_{d=1}^D p_d^{x_d}{(1-p_d)}^{(1-x_d)}$$
The latter plays more nicely with log-likelihoods, which is how I came upon the identity. I pared it down to the one-dimensional case above.
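A quick sanity check of the two expressions (the vector $x$ and parameters $p$ below are my own toy numbers, not from the original problem):

```python
import math

# Toy example: a binary vector x and Bernoulli means p.
x = [1, 0, 1, 1]
p = [0.9, 0.2, 0.6, 0.5]

# Sum form: prod_d [ x_d * p_d + (1 - x_d) * (1 - p_d) ]
sum_form = math.prod(xd * pd + (1 - xd) * (1 - pd) for xd, pd in zip(x, p))

# Product form: prod_d [ p_d^{x_d} * (1 - p_d)^{1 - x_d} ]
prod_form = math.prod(pd**xd * (1 - pd)**(1 - xd) for xd, pd in zip(x, p))

assert math.isclose(sum_form, prod_form)

# Only the product form turns into a sum under log, which is what makes
# it the convenient one for log-likelihoods:
log_lik = sum(xd * math.log(pd) + (1 - xd) * math.log(1 - pd)
              for xd, pd in zip(x, p))
assert math.isclose(math.log(prod_form), log_lik)
```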