I arrived at a (slightly different but hopefully more precise) reformulation of my original question, stated below:
Is $H(A,B)-H(A,B|C)$ maximal for independent $A$ and $B$, in the same sense that $H(A,B)$ alone is maximal for independent $A$ and $B$? Here I am asking under the constraints that $H(A,B|C)=H(A|C)+H(B|C)$ and that both $A$ and $B$ are dependent on $C$. Note that the quantity in question is the mutual information $H(A,B)-H(A,B|C)=I(A,B;C)$, so equivalently: is $I(A,B;C)$ maximised by marginally independent $A$ and $B$ under these constraints?
For random variables $A,B,C$ such that $A$ and $B$ are conditionally independent given $C$, while each is marginally dependent on $C$, it seems that the conditional entropy $$H(A,B|C)=-\sum_c p(c) \sum_{a,b} p(a,b|c) \log p(a,b|c)=H(A|C)+H(B|C)$$ does not depend on whether $p(a,b)$ factorises, i.e. on whether $A$ and $B$ are also marginally independent.
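As a quick numerical sanity check of this identity (a small sketch with made-up distributions of my own, not taken from the question), one can build a joint $p(a,b,c)=p(c)\,p(a|c)\,p(b|c)$, so that conditional independence holds by construction, and verify that $H(A,B|C)=H(A|C)+H(B|C)$ even though the marginal $p(a,b)$ does not factorise:

```python
import numpy as np

# Hypothetical conditionals: A, B conditionally independent given C,
# but each marginally dependent on C (the rows differ across c).
p_c = np.array([0.3, 0.7])                # p(c), |C| = 2
p_a_given_c = np.array([[0.9, 0.1],       # p(a|c), rows indexed by c
                        [0.2, 0.8]])
p_b_given_c = np.array([[0.6, 0.4],       # p(b|c), rows indexed by c
                        [0.1, 0.9]])

# Joint p(a,b,c) = p(c) p(a|c) p(b|c), shape (c, a, b); conditional
# independence of A and B given C is built in.
p_abc = p_c[:, None, None] * p_a_given_c[:, :, None] * p_b_given_c[:, None, :]

def H(p):
    """Shannon entropy (in bits) of a probability array, flattened."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Chain rule: H(A,B|C) = H(A,B,C) - H(C), and similarly for H(A|C), H(B|C).
H_AB_given_C = H(p_abc) - H(p_c)
p_ac = p_abc.sum(axis=2)                  # marginal p(a,c)
p_bc = p_abc.sum(axis=1)                  # marginal p(b,c)
H_A_given_C = H(p_ac) - H(p_c)
H_B_given_C = H(p_bc) - H(p_c)
print(H_AB_given_C, H_A_given_C + H_B_given_C)   # the two values agree

# The marginal p(a,b) does NOT factorise here: A and B are marginally
# dependent, yet this has no bearing on H(A,B|C) above.
p_ab = p_abc.sum(axis=0)
p_a = p_ab.sum(axis=1)
p_b = p_ab.sum(axis=0)
print(np.allclose(p_ab, np.outer(p_a, p_b)))     # False
```

Varying the numbers in `p_c`, `p_a_given_c`, `p_b_given_c` changes how far $p(a,b)$ is from $p(a)p(b)$, but the equality $H(A,B|C)=H(A|C)+H(B|C)$ persists as long as the joint is built from the product of conditionals.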
I'm trying to make this a more rigorous statement (something like: the conditional entropy is unaffected by changes in $\|p_{a,b}-p_a p_b\|$), as well as to gain some intuition.
In particular, are there non-trivial examples where $H(A,B|C)=H(A',B'|C)$, with $A$ and $B$ independent, $A'$ and $B'$ dependent, both pairs conditionally independent given $C$, all of $A,A',B,B'$ dependent on $C$, and $A\sim A'$, $B\sim B'$? Starting from two dependent random variables, it is mind-boggling to me how one could arrive at independent variables with the same marginal distributions while leaving the conditional entropy unchanged.