Consider the n-dimensional positive simplex $S_n = \{x | \sum_{i=1}^n x_i = 1, x_i\geq0 \}$.
Let $f:\mathbb{R}^n\to \mathbb{R}$ be an affine function, $f(x) = w^Tx+b$ for some $w \in \mathbb{R}^n$ and $b \in \mathbb{R}$.
Let $g:S_n\to S_n$ be the function such that for $y = g(x)$, we have $$y_i=\frac{r_ix_i}{\sum_{j=1}^n r_jx_j},$$
where the weights $r_i$, with $1 \geq r_i > 0$, determine $g$. Thus $g$ just rescales and renormalizes vectors in $S_n$ using some positive weights $r_i$ (transforms of this type appear frequently in machine learning and probability).
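For concreteness, here is a minimal sketch of $g$ in NumPy; the particular vectors `x` and `r` are made-up example values:

```python
import numpy as np

def g(x, r):
    """Scale x elementwise by the weights r, then renormalize onto the simplex."""
    y = r * x
    return y / y.sum()

# example point on S_3 and example weights with 0 < r_i <= 1
x = np.array([0.2, 0.3, 0.5])
r = np.array([0.5, 1.0, 0.25])
y = g(x, r)
print(y, y.sum())  # y is again a point on the simplex
```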
The question is whether $f(g(x))$ is piecewise linear in $x$. I have spent some time pondering this and it is unclear to me why it should be; however, an old, well-cited paper seems to assert that it is.
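One quick sanity check I can describe (with made-up values of $w$, $b$, and $r$, not taken from the paper): restrict $f \circ g$ to a line segment inside $S_2$ and look at second differences. A piecewise-linear function has vanishing second differences except at finitely many breakpoints, so second differences that are bounded away from zero along a whole segment would rule out piecewise linearity on it:

```python
import numpy as np

# hypothetical parameters, chosen only for illustration
w = np.array([1.0, -2.0])
b = 0.5
r = np.array([0.3, 0.9])

def fg(t):
    """Evaluate f(g(x)) at x = (t, 1-t), a point on S_2."""
    x = np.array([t, 1.0 - t])
    y = r * x / (r * x).sum()   # y = g(x)
    return w @ y + b            # f(y)

ts = np.linspace(0.1, 0.9, 101)
vals = np.array([fg(t) for t in ts])
d2 = np.diff(vals, 2)           # discrete second differences
print(np.abs(d2).min())         # strictly positive on this segment
```

For these parameters the second differences stay strictly positive along the segment, which at least suggests the composition is curved rather than piecewise linear there (in fact, along a segment $f(g(x))$ is a ratio of two affine functions of $t$).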
Any help would be great.