Let $\tau \in [0,1]$. Define the norm $\Omega(x)=\tau\|x\|_1+(1-\tau)\|x\|_{1,2}$, where $$ \|x\|_{1,2}=\sqrt{\sum_{g \in \mathcal{G}} \left(\sum_{i \in g} |x_i|\right)^2} $$ is the exclusive lasso norm. The vector has a group structure: the groups in $\mathcal{G}$ are pairwise disjoint and their union covers all indices, i.e. $\mathcal{G}$ is a partition of the coordinates. It can be shown [1] that the dual norm of $\|x\|_{1,2}$ is the $\|x\|_{\infty,2}$ norm, defined as $$ \|x\|_{\infty,2}=\sqrt{\sum_{g \in \mathcal{G}} \left(\max_{i \in g} |x_i|\right)^2}. $$ Let the Fenchel conjugate of $f$ be defined as $$ f^*(y)=\sup_x y^Tx-f(x). $$ For example, if $\Omega(x)=\|x\|$, then $\Omega^*(y)=I_{\{\|\cdot\|_*\leq1\}}(y)$, where $$ I_C(x)= \begin{cases} 0,& \text{if } x\in C\\ \infty, & \text{otherwise} \end{cases} $$ and $\|x\|_*$ is the dual norm of $\|x\|$: $$ \|x\|_*=\sup_{\|z\|\leq1} z^Tx. $$
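To convince myself of the duality claim numerically (a sanity check with an arbitrary partition, not a proof), the generalized Cauchy–Schwarz inequality $|x^Ty|\le\|x\|_{1,2}\,\|y\|_{\infty,2}$ should hold, with equality attainable:

```python
import numpy as np

# An arbitrary partition of indices 0..8 into disjoint groups
groups = [[0, 1, 2], [3, 4], [5, 6, 7, 8]]

def norm_1_2(x, groups):
    # exclusive lasso norm: l2 norm of the per-group l1 norms
    return np.sqrt(sum(np.sum(np.abs(x[g]))**2 for g in groups))

def norm_inf_2(x, groups):
    # claimed dual norm: l2 norm of the per-group l-infinity norms
    return np.sqrt(sum(np.max(np.abs(x[g]))**2 for g in groups))

rng = np.random.default_rng(0)
for _ in range(1000):
    x, y = rng.standard_normal(9), rng.standard_normal(9)
    assert abs(x @ y) <= norm_1_2(x, groups) * norm_inf_2(y, groups) + 1e-12

# Tightness: put all of x's per-group l1 mass on each group's
# argmax coordinate of y, with matching sign and magnitude max|y_g|
y = rng.standard_normal(9)
x = np.zeros(9)
for g in groups:
    i = g[np.argmax(np.abs(y[g]))]
    x[i] = np.sign(y[i]) * np.max(np.abs(y[g]))
assert np.isclose(x @ y, norm_1_2(x, groups) * norm_inf_2(y, groups))
```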
My question is, how to derive the Fenchel conjugate for a convex combination of norms?
Using the fact that $(\Omega_1+\Omega_2)^*(y)=\underset{z}{\inf}\,\Omega_1^*(z)+\Omega_2^*(y-z)$ (the infimal convolution of the conjugates), where $\Omega_1, \Omega_2$ are vector norms, together with the scaling rule for $\alpha>0$: $$ f(x)=\alpha g(x) \implies f^*(y)=\alpha g^*(y/\alpha), $$
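As a numeric sanity check of the scaling rule on the $\ell_1$ piece alone (dimensions and $\tau$ here are arbitrary choices of mine): with $f(x)=\tau\|x\|_1$ the rules give $f^*(z)=\tau\, I_{\{\|\cdot\|_\infty\le1\}}(z/\tau)=I_{\{\|\cdot\|_\infty\le\tau\}}(z)$, so $f^*$ should vanish inside the ball and blow up outside it:

```python
import numpy as np

rng = np.random.default_rng(1)
tau = 0.3

# Inside the ball: z^T x - tau*||x||_1 <= 0 for every x,
# so the supremum is 0 (attained at x = 0)
z_in = tau * rng.uniform(-1, 1, size=5)
for _ in range(1000):
    x = rng.standard_normal(5)
    assert z_in @ x - tau * np.abs(x).sum() <= 1e-12

# Outside the ball: a signed coordinate direction already makes the
# objective positive, and scaling x -> t*x sends it to +infinity
z_out = np.array([2 * tau, 0.0, 0.0, 0.0, 0.0])
i = np.argmax(np.abs(z_out))
x = np.sign(z_out[i]) * np.eye(5)[i]
assert z_out @ x - tau * np.abs(x).sum() > 0
```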
we arrive at (with $f(x)=\tau \|x\|_1$ and $g(x)=(1-\tau) \|x\|_{1,2}$): $$ \Omega^*(y)=\inf_z f^*(z)+g^*(y-z)=\inf_z I_{\{\|\cdot\|_{\infty}\leq1\}}\left(\frac{z}{\tau}\right)+I_{\{\|\cdot\|_{\infty,2}\leq1\}}\left(\frac{y-z}{1-\tau}\right). $$ This is where I got stuck. Ndiaye et al. [2] state a result for a convex combination of the $\ell_1$ and $\ell_2$ norms without showing the proof (it would be great if someone could show that proof too, since I may be able to adapt it to my original problem):
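One simplification I can make (not a full answer, just absorbing the scalings into the constraint sets): since $I_{\{\|\cdot\|_\infty\le1\}}(z/\tau)=I_{\{\|\cdot\|_\infty\le\tau\}}(z)$, and likewise for the second term, $$ \Omega^*(y)=\inf_z I_{\{\|\cdot\|_{\infty}\leq\tau\}}(z)+I_{\{\|\cdot\|_{\infty,2}\leq1-\tau\}}(y-z). $$ The infimum is $0$ exactly when $y$ splits as $y=z+u$ with $\|z\|_\infty\le\tau$ and $\|u\|_{\infty,2}\le1-\tau$; in other words, $\Omega^*$ is the indicator of the Minkowski sum $\tau B_{\infty}+(1-\tau)B_{\infty,2}$ of the two scaled dual balls.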
$$ \Omega(x)=\tau\|x\|_1+(1-\tau)\|x\|_2\\ \Omega^*(y)=\inf_z I_{\{\|\cdot\|_{\infty}\leq \tau\}}(z)+I_{\{\|\cdot\|_{2}\leq 1\}}\left(\frac{y-z}{1-\tau}\right)=I_{\{\|\cdot\|_{2}\leq 1\}}\left(\frac{y-\Pi_{\{\|\cdot\|_{\infty}\leq \tau\}}(y)}{1-\tau}\right) $$ where the projection onto the $\ell_{\infty}$ ball of radius $\tau$ is componentwise clipping: $$ \left[\Pi_{\{\|\cdot\|_{\infty}\leq \tau\}}(x)\right]_i= \begin{cases} \tau,& \text{if } x_i\geq\tau\\ -\tau, & \text{if } x_i\leq-\tau\\ x_i & \text{otherwise.} \end{cases} $$
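One direction of their formula is easy to check numerically (a sketch, not a proof; $\tau$ and the dimension are arbitrary choices): whenever $\|y-\Pi_{\{\|\cdot\|_\infty\le\tau\}}(y)\|_2\le1-\tau$, we should have $y^Tx\le\tau\|x\|_1+(1-\tau)\|x\|_2$ for every $x$, i.e. $\Omega^*(y)=0$:

```python
import numpy as np

def proj_linf(y, r):
    # projection onto {v : ||v||_inf <= r} is coordinatewise clipping
    return np.clip(y, -r, r)

rng = np.random.default_rng(2)
tau = 0.4
for _ in range(200):
    y = rng.standard_normal(6)
    # shrink y until it satisfies the membership condition
    while np.linalg.norm(y - proj_linf(y, tau)) > 1 - tau:
        y *= 0.9
    # then y^T x <= tau*||x||_1 + (1-tau)*||x||_2 for every x,
    # since y = clip + residual with clip in the tau-l_inf ball
    # and the residual in the (1-tau)-l2 ball
    for _ in range(50):
        x = rng.standard_normal(6)
        assert y @ x <= tau * np.abs(x).sum() + (1 - tau) * np.linalg.norm(x) + 1e-10
```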
References:
[1] Campbell, Allen, "Within Group Variable Selection through the Exclusive Lasso", p. 4248
[2] Ndiaye et al., "GAP Safe Screening Rules for Sparse-Group Lasso", p. 15
Let's try a solution "by hand". Let $A$ be a closed convex set, $B$ a set, and consider the optimization problem
$\min_z I_A (z) + I_B (y - z)$. Since indicator functions are nonnegative, $0$ is a lower bound on the objective, so any $z$ attaining the value $0$ is optimal. Define $z^{\star} = \Pi_A(y)$, the Euclidean projection of $y$ onto $A$; then $z^\star \in A$, so $I_A(z^{\star}) = 0$.
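Sketching where this goes, under the assumption that $B$ is the Euclidean ball $\{u:\|u\|_2\le 1-\tau\}$ as in the Ndiaye et al. case: the objective takes only the values $0$ and $\infty$, so $$ \inf_z I_A(z)+I_B(y-z)=0 \iff \min_{z\in A}\|y-z\|_2\le 1-\tau \iff \|y-\Pi_A(y)\|_2\le 1-\tau, $$ because $\Pi_A(y)$ is, by definition, the minimizer of $\|y-z\|_2$ over the closed convex set $A$. With $A=\{z:\|z\|_\infty\le\tau\}$, this recovers exactly the stated formula. Note that the last step relies on $B$ being an $\ell_2$ ball; in the exclusive-lasso problem $B$ is a scaled $\|\cdot\|_{\infty,2}$ ball, so the Euclidean projection $\Pi_A$ would have to be replaced by the minimizer of $\|y-z\|_{\infty,2}$ over $A$.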