I have a random variable $X=a_1X_1+a_2X_2 + \cdots + a_kX_k$ where $X_i \sim \mathrm{Bern}(q)$ and $X_i \perp X_j$ for all $i \neq j$, $i,j\in \{1,2,\ldots,k\}$. Also $\sum_{i=1}^{k} a_i=k$ and $a_i \in \mathbb{N} \cup \{0\}$ (non-negative integers).
In essence, $X$ is a discrete random variable taking values in $\{0,1,2,\ldots,k\}$: a weighted sum of up to $k$ i.i.d. Bernoulli random variables.
I have another binary random variable $Y$ such that $Y=0$ with probability $\frac{0.1}{1+x}$ when $X=x$. Note that the transition probability $\frac{0.1}{1+x}$ is decreasing in $x$.
I have a conjecture that the mutual information $I(X;Y)$ is maximized when $\mathrm{Var}(a_i)$, or equivalently $\sum_i a_i^2$ (since $\sum_i a_i = k$ is fixed), is maximized. More generally, I conjecture that $I(X;Y)$ increases as $\sum_i a_i^2$ increases.
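For reference, since $Y$ is binary, the mutual information decomposes as $I(X;Y)=H(Y)-H(Y\mid X)$, which in this model becomes
$$
I(X;Y) \;=\; h\!\left(\sum_{x} P(X=x)\,\frac{0.1}{1+x}\right) \;-\; \sum_{x} P(X=x)\, h\!\left(\frac{0.1}{1+x}\right),
$$
where $h(p) = -p\log p - (1-p)\log(1-p)$ is the binary entropy function. The weights $a_i$ enter only through the distribution of $X$.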
Can someone help me verify whether this conjecture is correct and, if so, establish it formally?
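As a quick numerical sanity check, the following sketch computes $I(X;Y)$ exactly (by enumerating all $2^k$ outcomes) for several weight vectors with the same $\sum a_i = k$ but different $\sum a_i^2$. The choices $q = 0.3$ and $k = 4$ are arbitrary assumptions, not part of the problem statement.

```python
# Exact I(X; Y) for small k, by brute-force enumeration of the 2^k outcomes.
# Model: X = sum_i a_i X_i with X_i ~ Bern(q) i.i.d., and
# P(Y = 0 | X = x) = 0.1 / (1 + x).
from itertools import product
from math import log2

def mutual_information(a, q):
    """Return I(X; Y) in bits for weight vector a and Bernoulli parameter q."""
    k = len(a)
    # Distribution of X over its support.
    px = {}
    for bits in product((0, 1), repeat=k):
        x = sum(ai * b for ai, b in zip(a, bits))
        p = q ** sum(bits) * (1 - q) ** (k - sum(bits))
        px[x] = px.get(x, 0.0) + p
    # Joint distribution p(x, y) and marginal p(y).
    pxy = {}
    for x, p in px.items():
        p0 = 0.1 / (1 + x)
        pxy[(x, 0)] = p * p0
        pxy[(x, 1)] = p * (1 - p0)
    py = {y: sum(p for (x, yy), p in pxy.items() if yy == y) for y in (0, 1)}
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

# Compare weight vectors with identical sum (= k) but varying sum of squares.
q, k = 0.3, 4  # arbitrary assumptions for the check
for a in [(1, 1, 1, 1), (2, 1, 1, 0), (2, 2, 0, 0), (3, 1, 0, 0), (4, 0, 0, 0)]:
    print(a, "sum a_i^2 =", sum(ai ** 2 for ai in a),
          "I(X;Y) =", round(mutual_information(a, q), 6))
```

Sorting the printed values by $\sum a_i^2$ and checking monotonicity for a few values of $q$ and $k$ would either support the conjecture or produce a counterexample before attempting a formal proof.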