Let $n,d\in\mathbb N^+$ and let $\mathcal D\in\Delta(\{1,\ldots,n\})$ be a probability distribution over $\{1,\ldots,n\}$.
Consider random variables $X_1,\ldots,X_d,\,Y_1,\ldots,Y_d$, each distributed according to $\mathcal D$ (their joint distribution is left unspecified for now).
Define $H_X=-\sum_{x\in\{1,\ldots,n\}^d} \Pr[X = x]\cdot \log_2 \Pr[X=x]$, the entropy of the vector $X=(X_1,\ldots,X_d)$, and define $H_Y=-\sum_{y\in\{1,\ldots,n\}^d} \Pr[Y = y]\cdot \log_2 \Pr[Y=y]$ analogously for $Y=(Y_1,\ldots,Y_d)$.
Assume that the $X_i$'s are mutually independent.
Is it true that $H_X\ge H_Y$ regardless of the dependencies among the $Y_i$'s?
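For concreteness, here is a minimal numerical sketch of the two quantities being compared. The specific values $n=2$, $d=2$, the distribution $\mathcal D=(0.3,0.7)$, and the fully correlated case $Y_1=Y_2$ (as one extreme choice of dependency structure for $Y$) are all illustrative assumptions, not part of the question itself.

```python
import itertools
import math

# Illustrative assumptions: n = 2, d = 2, D = (0.3, 0.7) over {1, 2},
# and Y_1 = Y_2 as one concrete dependency structure for Y.
D = {1: 0.3, 2: 0.7}
d = 2

def entropy(dist):
    """Shannon entropy (base 2) of a dict mapping outcomes to probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# X = (X_1, ..., X_d) with independent coordinates: Pr[X = x] = prod_i D[x_i].
X_dist = {x: math.prod(D[xi] for xi in x)
          for x in itertools.product(D, repeat=d)}

# Y with Y_1 = ... = Y_d (fully correlated): Pr[Y = (y, ..., y)] = D[y].
Y_dist = {(y,) * d: D[y] for y in D}

H_X = entropy(X_dist)  # equals d * H(D) by independence
H_Y = entropy(Y_dist)  # equals H(D), since Y is determined by Y_1

print(f"H_X = {H_X:.4f}, H_Y = {H_Y:.4f}, H_X >= H_Y: {H_X >= H_Y}")
```

In this particular instance $H_X = 2H(\mathcal D) \approx 1.7626$ while $H_Y = H(\mathcal D) \approx 0.8813$, so $H_X \ge H_Y$ holds here; the question is whether this comparison holds for every dependency structure on $Y$.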