In some optimization problems we have convex losses such as the convex $\ell_p$-norms. In most cases, computing the convex conjugate is possible and does not cost much more than evaluating the loss function itself.
I wanted to ask: is there any convex loss function whose conjugate is costly or impossible to compute?
Even if you know the conjugate of $f$ and $g$, the conjugate of $f+g$ may not be tractable.
In particular, let $f$ and $g$ be convex, proper, lower-semicontinuous functions on a real Hilbert space $X$. Then, under a suitable qualification condition, the conjugate of a sum of functions is the infimal convolution of their conjugates (this result appears in Bauschke & Combettes' book, 2nd edition, p. 247):
$$(f+g)^*(x)=\inf_{\substack{(z,y)\in X \\ z+y=x}}f^*(z)+g^*(y)$$
The expression on the right-hand side (the "infimal convolution" of $f^*$ and $g^*$) can be expressed in closed form only in special cases. One such instance is the Moreau envelope, which is the infimal convolution with a scaled squared norm; in general, no closed form is available.
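As a concrete sanity check of the identity, here is a small numerical sketch (my own example, not from the cited book) with $f(x)=|x|$ and $g(x)=x^2/2$ on $\mathbb{R}$. Here $f^*$ is the indicator of $[-1,1]$ and $g^*(z)=z^2/2$, so the infimal convolution reduces to $\min_{|z|\le 1}(y-z)^2/2 = \max(|y|-1,0)^2/2$, which we compare against a brute-force evaluation of $(f+g)^*(y)=\sup_x\, xy - f(x) - g(x)$ on a grid:

```python
import numpy as np

# f(x) = |x|, g(x) = x**2 / 2.
# f* = indicator of [-1, 1], g*(z) = z**2 / 2, so the infimal
# convolution gives (f+g)*(y) = max(|y| - 1, 0)**2 / 2 in closed form.

xs = np.linspace(-50, 50, 200001)  # grid for the sup defining the conjugate

def conjugate_numeric(h, y):
    """Brute-force (f+g)*(y) = sup_x (x*y - h(x)) over the grid xs."""
    return np.max(xs * y - h(xs))

h = lambda x: np.abs(x) + x**2 / 2  # the sum f + g

for y in [-2.0, -0.3, 0.0, 0.7, 3.0]:
    closed_form = max(abs(y) - 1.0, 0.0)**2 / 2
    assert abs(conjugate_numeric(h, y) - closed_form) < 1e-3
```

The point of the example is that this pair happens to admit a closed form; for generic $f$ and $g$ the inner minimization over $z+y=x$ has no such expression and must itself be solved numerically.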
I have heard, however, that there are efficient methods to numerically approximate a conjugate evaluation, such as the discrete Legendre transform.
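One simple such scheme (a sketch under my own assumptions, not a reference implementation) is the discrete Legendre transform: sample $f$ on a grid and take the maximum of $sx - f(x)$ for each query slope $s$. The brute-force version below costs $O(nm)$ for $n$ sample points and $m$ slopes; Lucet's Linear-time Legendre Transform exploits convexity to reach $O(n+m)$:

```python
import numpy as np

def discrete_legendre(x, fx, slopes):
    """Approximate f*(s) = sup_x (s*x - f(x)) for each slope s,
    given samples fx = f(x). Brute force: one max per slope."""
    # rows index slopes, columns index sample points
    return np.max(np.outer(slopes, x) - fx, axis=1)

# sanity check: f(x) = x**2 / 2 is self-conjugate, f*(s) = s**2 / 2
x = np.linspace(-10, 10, 4001)
s = np.linspace(-3, 3, 13)
approx = discrete_legendre(x, x**2 / 2, s)
assert np.max(np.abs(approx - s**2 / 2)) < 1e-4
```

The accuracy is limited by the grid resolution and by whether the supremum is attained inside the sampled interval, so the grid must be chosen with the relevant slopes in mind.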