I'm referring to this proof: http://en.wikipedia.org/wiki/Quantum_relative_entropy#The_result
In there it's stated that "Since the matrix $(P_{ij})_{ij}$ is a doubly stochastic matrix and $-\log$ is a convex function," $$\sum_ip_i(\log p_i-\sum_j(\log q_j)P_{ij}) \geq \sum_i p_i(\log p_i - \log(\sum_j q_jP_{ij})).$$
Note that $0<P_{ij},q_j\leq 1$. For $0<x,y,\varepsilon<1$, I think that the correctness of this step relies on $$-\varepsilon\log x \geq -\log(\varepsilon x)$$ and $$-\log x - \log y \geq -\log(x+y).$$
The former looks terribly wrong, while the latter looks about right. Now, I don't see why this step is correct and can't think of any reason. Any ideas?
Edit: A little graph supporting that the former is wrong (copy the whole link) http://www.wolframalpha.com/input/?i=from+0.001+to+1+plot+%28-log%280.1*x%29%2C+-0.1*log%28x%29%29
Alternatively, enter
from 0.001 to 1 plot (-log(0.1*x), -0.1*log(x))
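Instead of plotting, one can also check the first inequality numerically; here is a minimal sketch with arbitrarily chosen values $\varepsilon=0.1$, $x=0.5$:

```python
import math

# Check whether -eps*log(x) >= -log(eps*x) holds for sample values in (0, 1).
# The values eps = 0.1, x = 0.5 are chosen purely for illustration.
eps, x = 0.1, 0.5
lhs = -eps * math.log(x)     # -0.1 * log(0.5), a small positive number
rhs = -math.log(eps * x)     # -log(0.05), much larger
print(lhs, rhs, lhs >= rhs)  # lhs < rhs, so the claimed inequality fails here
```

This agrees with the graph: the inequality goes the wrong way.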
The statement from the proof relies on the fact that $P$ is a doubly stochastic matrix (http://en.wikipedia.org/wiki/Doubly_stochastic_matrix), i.e. each of its rows and columns sums to $1$, and on the fact that $-\log$ is convex. Since each row sums to $1$, $\sum_j{(-\log{q_j})P_{ij}}$ is a convex combination (weighted average) of the values $-\log q_j$. Because $-\log$ is a convex function (http://en.wikipedia.org/wiki/Convex_function), this means that $\sum_j{(-\log{q_j})P_{ij}}\geq-\log(\sum_j{q_jP_{ij}})$.
EDIT: I was asked to explain further. Please read on Wikipedia what a convex set and a convex function are. Perhaps the "convex expression" terminology is nonstandard, but what I meant by it is this: for coefficients adding to $1$, $\sum_ia_i=1$ - they are the weights! - the convex expression is the weighted average $\sum_i{a_ib_i}$ of $\{b_i\}_i$ (which could be numbers, vectors, functions, etc.).
Now a convex function $f$ satisfies $f(\sum_i{a_ib_i})\leq\sum_i{a_if(b_i)}$ (Jensen's inequality). This should make it clear why the statement in the proof is correct.
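One can check Jensen's inequality numerically for $f=-\log$ with a single row of a doubly stochastic matrix; the particular weights and values below are made up for illustration:

```python
import math

# One row of a doubly stochastic matrix: the weights a_j sum to 1.
P_row = [0.2, 0.5, 0.3]
# Values q_j in (0, 1], as in the proof.
q = [0.4, 0.1, 0.7]

# Jensen for the convex function -log:
#   sum_j P_row[j] * (-log q_j)  >=  -log( sum_j P_row[j] * q_j )
lhs = sum(a * (-math.log(qj)) for a, qj in zip(P_row, q))
rhs = -math.log(sum(a * qj for a, qj in zip(P_row, q)))
print(lhs >= rhs)  # True
```

Any nonnegative weights summing to $1$ would do here; this is exactly the role the rows of $P$ play in the proof.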
About your inequalities, I think they are too strong. You can try plotting them on WolframAlpha and see what happens. I've switched letters for convenience: $$-y\log x\geq-\log(yx)\iff y\log x\leq \log(yx) \iff \log x^y\leq \log(yx) \iff x^y - xy\leq 0,$$ and the last condition fails in general for $x,y\in(0,1)$.
EDIT2: Thanks to @Semiclassical I've found a point worth mentioning. $$\sum_i{p_i(\log p_i-\sum_j{(\log q_j)P_{ij}})}=\sum_i{p_i\log p_i}-\sum_i\sum_j{p_i(\log q_j)P_{ij}}=\sum_i{p_i\log p_i}+\sum_{i,j}{p_i(-\log q_j)P_{ij}}$$ and think of the double sum over the matrix entries as one finite series - it doesn't matter in which order you sum, you end up with the same value.
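The order-of-summation point can be sketched with a small made-up example ($2\times 2$ values of $p$, $q$, and a doubly stochastic $P$):

```python
import math

# Made-up data: probability vectors p, q and a doubly stochastic P.
p = [0.6, 0.4]
q = [0.3, 0.7]
P = [[0.8, 0.2], [0.2, 0.8]]
n = 2

# Sum the finite double series in both orders: rows first, then columns first.
row_first = sum(sum(p[i] * (-math.log(q[j])) * P[i][j] for j in range(n))
                for i in range(n))
col_first = sum(sum(p[i] * (-math.log(q[j])) * P[i][j] for i in range(n))
                for j in range(n))
print(abs(row_first - col_first) < 1e-12)  # True: same sum either way
```

For a finite sum this is just rearrangement of finitely many terms, so no convergence issues arise.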