I've proved that $f(x_i): \mathbb{R} \to \mathbb{R}$ is convex (it depends only on the vector component $x_i$); can I then say $g(x)\equiv f(x_i): \mathbb{R}^n \to \mathbb{R}$ (I just expanded the "scope" of $f$) is also convex? More generally, I want to say that a function is convex in its domain element if it's convex in the "components" of the domain element that it actually depends on.
The context is this:
I'm trying to show that the relative entropy $R(u,v)$ is convex in the pair $(u,v)$, $u,v \in \mathbb{R}_{++}^n$. I've shown that each $h_i(u_i,v_i) = u_i \log(u_i/v_i)$ is convex in $(u_i,v_i)$, and want to argue that $R$ is convex in the pair $(u,v)$ because it's the sum $\sum_i h_i(u_i,v_i)$, with each $h_i$ convex in $(u,v)$.
Update:
Now I'm pretty sure this is OK, as my textbook at one point argues "$g(y)$ is log-concave in $(x,y)$" (Boyd's Convex Optimization, p. 106).
Denote $w=(u,v)$, where $u,v\in\mathbb{R}_{++}^{n}$. Also, I write $h_{i}(w)$ with the understanding that $h_{i}$ depends only on the entries $u_i$ and $v_i$ of $w$. This is legitimate: the map $w\mapsto(u_i,v_i)$ is linear, and the composition of a convex function with an affine (in particular, linear) map is convex, so convexity of $h_i$ in $(u_i,v_i)$ gives convexity in $w$.
With this notation you have already proven that $$h_{i}(\alpha w+(1-\alpha)w')\leq\alpha h_{i}(w)+(1-\alpha)h_{i}(w')$$ for all $i$. The result you are after is, given $R(w)=\sum_{i}h_{i}(w)$, $$R(\alpha w+(1-\alpha)w')\leq\alpha R(w)+(1-\alpha) R(w').$$ The latter inequality follows by summing the former over $i$ (a sum of convex functions is convex).
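The argument above can be sanity-checked numerically: sample random pairs $w=(u,v)$, $w'=(u',v')$ in $\mathbb{R}_{++}^n\times\mathbb{R}_{++}^n$ and verify Jensen's inequality for $R$ directly. This is only a spot check of the inequality, not a proof; the sampling ranges and tolerance below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def rel_entropy(u, v):
    # R(u, v) = sum_i u_i * log(u_i / v_i)
    return np.sum(u * np.log(u / v))

# Check R(a*w + (1-a)*w') <= a*R(w) + (1-a)*R(w')
# at many random points, with a small tolerance for floating-point error.
n = 5
ok = True
for _ in range(1000):
    u, v = rng.uniform(0.1, 10, n), rng.uniform(0.1, 10, n)
    up, vp = rng.uniform(0.1, 10, n), rng.uniform(0.1, 10, n)
    a = rng.uniform()
    lhs = rel_entropy(a * u + (1 - a) * up, a * v + (1 - a) * vp)
    rhs = a * rel_entropy(u, v) + (1 - a) * rel_entropy(up, vp)
    ok = ok and (lhs <= rhs + 1e-9)

print(ok)  # prints True: no sampled pair violates the convexity inequality
```

Of course a finite sample can never replace the componentwise proof, but it is a quick way to catch a sign error or a misremembered definition.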