The standard definition of mutual information for a pair of random variables $X, Y$ is $I(X; Y) = D_{KL}(P_{X, Y} \| P_X \times P_Y)$. One of the most important properties of this definition is the chain rule, which says that, if we replace $Y$ with a collection of random variables $Y_1, \dots, Y_n$, we get $I(X; Y_1, \dots, Y_n) = \sum\limits_{i \in [n]} I(X; Y_i | Y_1, \dots, Y_{i - 1})$.
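For concreteness, the chain rule for $n = 2$ can be checked numerically on a small discrete example. Below is a quick sketch (my own illustration, not from any reference): it draws a random joint distribution $P_{X, Y_1, Y_2}$ and compares $I(X; Y_1, Y_2)$ against $I(X; Y_1) + I(X; Y_2 \mid Y_1)$, expanding the conditional term as $\sum_{y_1} P(y_1)\, I(X; Y_2 \mid Y_1 = y_1)$.

```python
import numpy as np

rng = np.random.default_rng(1)
# Random joint distribution P(X, Y1, Y2) over a 3 x 4 x 5 alphabet.
p = rng.dirichlet(np.ones(3 * 4 * 5)).reshape(3, 4, 5)

def mi(pxy):
    """Mutual information I(X;Y) of a discrete joint pxy[x, y], in nats."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    return float(np.sum(pxy * np.log(pxy / (px * py))))

# Left-hand side I(X; Y1, Y2): treat (Y1, Y2) as one variable.
lhs = mi(p.reshape(3, 20))

# Right-hand side I(X; Y1) + I(X; Y2 | Y1), with the conditional term
# expanded as sum over y1 of P(y1) * I(X; Y2 | Y1 = y1).
term1 = mi(p.sum(axis=2))
term2 = 0.0
for y1 in range(4):
    py1 = p[:, y1, :].sum()
    term2 += py1 * mi(p[:, y1, :] / py1)
rhs = term1 + term2
print(lhs, rhs)  # equal up to floating-point error
```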
However, this property generally breaks down if we consider variants of mutual information where the KL divergence in the definition is replaced with an arbitrary $f$-divergence (see Section 7.8 of this book). One $f$-divergence whose corresponding $f$-mutual information does enjoy a chain-rule-type property is the symmetric KL divergence: when the $Y_i$ are conditionally independent given $X$, we have $I_{SKL}(X; Y_1, \dots, Y_n) = \sum\limits_{i \in [n]} I_{SKL}(X; Y_i)$. However, SKL mutual information has the drawback that its value is finite only when $P_{X, Y}$ and $P_X \times P_Y$ are mutually absolutely continuous, which generally cannot be taken for granted.
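The SKL additivity above can also be sanity-checked numerically. The sketch below (my own construction, under the assumption that $Y_1, Y_2$ are conditionally independent given $X$) builds such a joint from random conditionals and compares $I_{SKL}(X; Y_1, Y_2)$ with $I_{SKL}(X; Y_1) + I_{SKL}(X; Y_2)$, using $D_{SKL}(P \| Q) = \mathbb{E}_P[\log \frac{P}{Q}] - \mathbb{E}_Q[\log \frac{P}{Q}]$.

```python
import numpy as np

def skl_mi(p_joint):
    """Symmetric-KL mutual information I_SKL(X;Y) of a discrete joint
    p_joint[x, y], i.e. D(P||Q) + D(Q||P) with Q = P_X x P_Y."""
    px = p_joint.sum(axis=1, keepdims=True)
    py = p_joint.sum(axis=0, keepdims=True)
    q = px * py  # product of marginals
    log_ratio = np.log(p_joint / q)
    # D(P||Q) + D(Q||P) = E_P[log P/Q] - E_Q[log P/Q]
    return float(np.sum(p_joint * log_ratio) - np.sum(q * log_ratio))

rng = np.random.default_rng(0)
nx, n1, n2 = 3, 4, 5
px = rng.dirichlet(np.ones(nx))                    # marginal P(X)
p_y1_given_x = rng.dirichlet(np.ones(n1), size=nx)  # rows: P(Y1 | X = x)
p_y2_given_x = rng.dirichlet(np.ones(n2), size=nx)  # rows: P(Y2 | X = x)

# Joint P(x, y1, y2) with Y1, Y2 conditionally independent given X.
p_xy1y2 = px[:, None, None] * p_y1_given_x[:, :, None] * p_y2_given_x[:, None, :]

lhs = skl_mi(p_xy1y2.reshape(nx, n1 * n2))  # I_SKL(X; Y1, Y2)
rhs = skl_mi(p_xy1y2.sum(axis=2)) + skl_mi(p_xy1y2.sum(axis=1))
print(lhs, rhs)  # the two agree up to floating-point error
```

Note that without the conditional independence assumption the identity fails: taking $Y_2 = Y_1$ makes the left side $I_{SKL}(X; Y_1)$ but the right side twice that.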
Do we know of any $f$-divergences, other than the KL and SKL divergences, that lead to $f$-mutual information definitions with additivity properties? What about formulations of mutual information based on Rényi entropy (I understand such definitions can be a bit more technically involved, e.g., here)? More broadly, do we know of any other measures of dependence between random variables (outside mutual information and its variants) that satisfy chain or additivity rules? I'm not counting covariance (which is additive by linearity of expectation) because it captures a very specific type of dependence, which is not as general as I'd hope.