Consider the following two-class problem: We have two classes $w_1, w_2$ and binary data $x \in \{0,1\}^d$. Each dimension of $x$ is drawn independently from a Bernoulli distribution, according to the following probabilities: $$P(x_i = 0 \mid w_1) = 1 - p_i$$ $$P(x_i = 1 \mid w_1) = p_i$$ $$P(x_i = 0 \mid w_2) = p_i$$ $$P(x_i = 1 \mid w_2) = 1- p_i$$
Assume now a classifier that classifies $x$ as $w_1$ if
$l=\frac{P(w_1 \mid x)}{P(w_2 \mid x)}>1$
Our task is to show that this classifier can be rewritten as $a^T x + b > 0$ where $a \in \mathbb{R}^d$, $b \in \mathbb{R}$. We may assume $P(w_1) = P(w_2) = 0.5$.
My approach so far is the following: Since the prior class probabilities are equal, Bayes' rule lets us cancel the priors and rewrite $l$ as: $l = \frac{P(x \mid w_1)}{P(x \mid w_2)}$
I now tried to use the fact that $P(x \mid w_1) = \prod_{i=1}^{d} p_i^{x_i} (1 - p_i)^{1-x_i}$ and vice versa $P(x \mid w_2) = \prod_{i=1}^{d} (1 - p_i)^{x_i} p_i^{1-x_i}$ and the natural logarithm to transform the problem, but I am stuck with this.
How can we rewrite $l > 1$ to turn it into a linear classifier of the form $a^T x + b > 0$?
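(For reference, a quick numerical sanity check that the likelihood expression above is a valid distribution; the dimension `d` and the values `p` are arbitrary choices for illustration.)

```python
import itertools
import random

random.seed(0)
d = 3
p = [random.uniform(0.1, 0.9) for _ in range(d)]  # arbitrary Bernoulli parameters

def likelihood_w1(x, p):
    """P(x | w_1) = prod_i p_i^{x_i} (1 - p_i)^{1 - x_i}"""
    out = 1.0
    for xi, pi in zip(x, p):
        out *= pi**xi * (1 - pi)**(1 - xi)
    return out

# Summing P(x | w_1) over all 2^d binary vectors must give 1,
# since each independent factor contributes p_i + (1 - p_i) = 1.
total = sum(likelihood_w1(x, p)
            for x in itertools.product([0, 1], repeat=d))
```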
Taking the logarithm does it, since for $s,t\in\Bbb R^+$ we have $\dfrac st>1~\iff~\ln s-\ln t>0$.
You have correctly found $l$. $$\begin{align}l&=\dfrac{\prod_{i=1}^d p_i^{x_i}(1-p_i)^{1-x_i}}{\prod_{i=1}^d p_i^{1-x_i}(1-p_i)^{x_i}}\\[2ex] \text{So }l>1\text{ iff}&\\[2ex]0&<\ln\prod_{i=1}^d p_i^{x_i}(1-p_i)^{1-x_i} -\ln\prod_{i=1}^d p_i^{1-x_i}(1-p_i)^{x_i}\\[1ex]0&<\sum_{i=1}^d \big(x_i\ln p_i + (1-x_i)\ln (1-p_i)\big)-\sum_{i=1}^d \big((1-x_i)\ln p_i + x_i\ln (1-p_i)\big)\\[1ex]0&<2\sum_{i=1}^d \big(\ln p_i-\ln(1-p_i)\big)x_i - \sum_{i=1}^d \big(\ln p_i-\ln (1-p_i)\big)\end{align}$$
Which is of the form $0<\vec a^\top\vec x+b$, with $a_i = 2\ln\dfrac{p_i}{1-p_i}$ and $b = -\sum_{i=1}^d \ln\dfrac{p_i}{1-p_i}$ (assuming $0<p_i<1$ so the logarithms are defined).
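The identity can be verified numerically. This Python sketch (with arbitrary `d` and `p`) builds $a$ and $b$ from the last line of the derivation and checks that $a^\top x + b$ equals the log-likelihood ratio for every binary $x$:

```python
import itertools
import math
import random

random.seed(0)
d = 4
# arbitrary Bernoulli parameters, kept strictly inside (0, 1)
p = [random.uniform(0.05, 0.95) for _ in range(d)]

# weights read off from the final line of the derivation
a = [2 * (math.log(pi) - math.log(1 - pi)) for pi in p]
b = -sum(math.log(pi) - math.log(1 - pi) for pi in p)

def log_ratio(x):
    # ln P(x | w_1) - ln P(x | w_2), computed directly term by term
    num = sum(xi * math.log(pi) + (1 - xi) * math.log(1 - pi)
              for xi, pi in zip(x, p))
    den = sum((1 - xi) * math.log(pi) + xi * math.log(1 - pi)
              for xi, pi in zip(x, p))
    return num - den

# largest discrepancy between a^T x + b and the log ratio over all 2^d inputs
max_err = max(abs(sum(ai * xi for ai, xi in zip(a, x)) + b - log_ratio(x))
              for x in itertools.product([0, 1], repeat=d))
```

The maximum error is zero up to floating-point rounding, confirming the two expressions agree as functions of $x$, not just in sign.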