I am reading a book on information theory, in which the Rényi entropy of order $\alpha$ is introduced as
$S_{\alpha} = \frac{1}{1-\alpha}\log(\operatorname{Tr}\rho^{\alpha})$,
where $\rho$ is a density matrix.
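For concreteness, here is a minimal numerical sketch (my own, not from the book) of this definition for a diagonal $\rho$ with eigenvalues $p_i$, including the two special values: $\alpha \to 1$ recovers the von Neumann entropy, and $\alpha = 0$ gives the logarithm of the rank.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy S_alpha = log(sum_i p_i^alpha) / (1 - alpha)
    for a density matrix with eigenvalues p (a probability vector)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 1e-12]                       # restrict to the support
    if np.isclose(alpha, 1.0):             # alpha -> 1: von Neumann entropy
        return -np.sum(p * np.log(p))
    if np.isclose(alpha, 0.0):             # alpha = 0: log of the rank
        return np.log(len(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

p = [0.5, 0.25, 0.25]
print(renyi_entropy(p, 0))   # log(3), the state has rank 3
print(renyi_entropy(p, 2))   # -log(sum p_i^2) = -log(0.375)
```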
It is claimed that for $\alpha = 0$ the following subadditivity relation holds:
$S_{0} (\rho_{AB})\leq S_{0} (\rho_{A})+S_{0} (\rho_{B})$.
I cannot see how one can prove this. Any idea?
For $\alpha = 0$ the definition gives $S_0(\rho) = \log(\operatorname{Tr}\rho^{0})$, where $\rho^{0}$ is understood as the projector onto the support of $\rho$, so that $\operatorname{Tr}\rho^{0} = \operatorname{rank}(\rho)$. The inequality you want to prove therefore reads
$\log(\operatorname{rank}\rho_{AB})\leq\log(\operatorname{rank}\rho_A)+\log(\operatorname{rank}\rho_B)$.
Using the property of sums of logarithms this is equivalent to
$\log(\operatorname{rank}\rho_{AB})\leq\log(\operatorname{rank}\rho_A\cdot \operatorname{rank}\rho_B)$,
which finally comes down to $\operatorname{rank}\rho_{AB}\leq \operatorname{rank}\rho_A\cdot \operatorname{rank}\rho_B$.
This last inequality holds because the support of $\rho_{AB}$ is contained in $\operatorname{supp}(\rho_A)\otimes\operatorname{supp}(\rho_B)$: a positive operator has no weight on vectors annihilated by its marginals, and that tensor-product subspace has dimension $\operatorname{rank}\rho_A\cdot\operatorname{rank}\rho_B$.
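The rank inequality above is easy to check numerically. The sketch below (my own illustration, with an arbitrary $2\times 3$ bipartite system) builds a random mixed state, computes the marginals by partial trace, and verifies $S_0(\rho_{AB}) \leq S_0(\rho_A) + S_0(\rho_B)$, computing each $S_0$ as the log of the number of eigenvalues above a tolerance.

```python
import numpy as np

rng = np.random.default_rng(0)

def rank_of(rho, tol=1e-10):
    # Tr(rho^0) is the rank: count eigenvalues above a numerical tolerance.
    return int(np.sum(np.linalg.eigvalsh(rho) > tol))

def s0(rho):
    # Renyi entropy of order 0 is the log of the rank.
    return np.log(rank_of(rho))

# Random full-rank mixed state on a 2x3 bipartite system.
dA, dB = 2, 3
G = rng.standard_normal((dA * dB, dA * dB)) \
    + 1j * rng.standard_normal((dA * dB, dA * dB))
rho_AB = G @ G.conj().T                   # positive semidefinite
rho_AB /= np.trace(rho_AB).real           # normalize to unit trace

# Partial traces: reshape to (A, B, A', B') and trace out one factor.
rho = rho_AB.reshape(dA, dB, dA, dB)
rho_A = np.trace(rho, axis1=1, axis2=3)   # trace over B
rho_B = np.trace(rho, axis1=0, axis2=2)   # trace over A

print(s0(rho_AB) <= s0(rho_A) + s0(rho_B) + 1e-9)  # subadditivity holds
```

For a generic full-rank state the bound is saturated ($\log 6 = \log 2 + \log 3$); for a pure state $\rho_{AB}$, the left side is $0$ while the right side is twice the log of the Schmidt rank, so the inequality is strict whenever the state is entangled.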