I am trying to solve a minimization problem analytically: $$ g(\tau_1,\dots,\tau_k) \rightarrow \min_{\tau_1,\dots,\tau_k \in (0;1)} $$ where $$ g(\tau_1,\dots,\tau_k) = \frac{ \sum_{i=1}^k \sum_{j=1}^k \left( \min(\tau_i,\tau_j)-\tau_i\tau_j \right) }{ \sum_{i=1}^k \sum_{j=1}^k f(F^{-1}(\tau_i)) f(F^{-1}(\tau_j)) } $$ where $f(\cdot)$ is the probability density function (PDF) of some given distribution (e.g. Normal, $t$, logistic, exponential, Weibull) and $F^{-1}(\cdot)$ is the corresponding quantile function (inverse of the cumulative distribution function, CDF).
The distribution should be taken as given, i.e. $f(F^{-1}(\cdot))$ need not be generic; but I have not decided which distribution to start from, as some are probably easier to work with than others.
I would normally start by taking the derivative of the function being optimized and setting it to zero. But I have trouble handling the $f(F^{-1}(\cdot))$ terms.
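Before attacking the problem analytically, it may help to have a numerical version of $g$ to check candidate solutions against. Below is a minimal sketch (my own, not from Koenker) that evaluates $g$ for an arbitrary `scipy.stats` distribution and minimizes it numerically for the standard normal with $k=2$; the starting point `[0.3, 0.7]` and the bound clipping at `1e-6` are arbitrary choices:

```python
import numpy as np
from scipy import optimize, stats

def g(taus, dist=stats.norm):
    """Ratio of the two double sums defining g(tau_1, ..., tau_k)."""
    taus = np.asarray(taus, dtype=float)
    # Numerator: sum_{i,j} [ min(tau_i, tau_j) - tau_i * tau_j ]
    num = (np.minimum.outer(taus, taus) - np.outer(taus, taus)).sum()
    # Denominator: sum_{i,j} f(F^{-1}(tau_i)) f(F^{-1}(tau_j))
    s = dist.pdf(dist.ppf(taus))  # sparsity terms f(F^{-1}(tau_i))
    den = np.outer(s, s).sum()
    return num / den

# Numerical minimization for k = 2 under the standard normal.
res = optimize.minimize(g, x0=[0.3, 0.7],
                        bounds=[(1e-6, 1 - 1e-6)] * 2)
print(res.x, res.fun)
```

As a sanity check, for $k=1$ and the standard normal, $g(\tau)=\tau(1-\tau)/\varphi(\Phi^{-1}(\tau))^2$ is minimized at $\tau=0.5$ with value $\pi/2$, which the function above reproduces.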
Q1: Where do I even start (for a given distribution)?
Q2: What are some distributions (preferably with support on $\mathbb{R}$ or $\mathbb{R}_{+}$) for which $f(F^{-1}(\cdot))$ is easy to handle analytically?
Answer to Q2: One example is the exponential distribution with rate $\lambda$, for which $f(F^{-1}(\tau))=\lambda(1-\tau)$. The $t$ distributions with 1, 2 and 4 degrees of freedom are also quite tractable, since their quantile functions have closed forms -- see Wikipedia's article on the quantile function.
(The context is choosing the quantiles $\tau_1,\dots,\tau_k$ to minimize the asymptotic variance of an equally-weighted composite quantile regression estimator as per Koenker (1984). There, $g(\cdot)$ is the main term in the expression for the asymptotic variance.)
Koenker, Roger. "A note on L-estimates for linear models." Statistics & Probability Letters 2.6 (1984): 323-325.