I need to show $f(x)=-\log{\frac{\exp(C_{\{k,:\}}x)}{\sum_j \exp(C_{\{j,:\}}x)}}$ is convex, where $x \in \mathbb{R}^n$, $\exp(v)$ is the element-wise exponential of $v$, and $C \in \mathbb{R}^{d \times n}$. The notation $C_{\{k,:\}}$ means the $k$th row of $C$.
In fact it is the intersection of three functions, $\{-\log(p),\ \frac{\exp(p_k)}{\sum_j \exp(p_j)},\ Cx\}$. I tried to compute the Hessian, but I obtained a complicated matrix with too many terms to show it is PSD.
I know it is PSD because I used MATLAB's Hessian approximation and tried it with all kinds of $x$; the result was always PSD.
Besides proving the Hessian is PSD, is there an easier way to prove its convexity?
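For reference, the numerical experiment described above can be reproduced exactly rather than with a finite-difference approximation: for this $f$, the Hessian has the closed form $C^T(\operatorname{diag}(p) - pp^T)C$ with $p = \operatorname{softmax}(Cx)$ (the linear term contributes nothing). A minimal sketch in Python/NumPy, with arbitrary example dimensions:

```python
import numpy as np

def softmax(z):
    z = z - z.max()            # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def hessian(C, x):
    # Hessian of f(x) = -C[k] @ x + log(sum_j exp(C[j] @ x)).
    # The linear term -C[k] @ x has zero Hessian, so only the
    # log-sum-exp part contributes: C^T (diag(p) - p p^T) C.
    p = softmax(C @ x)
    return C.T @ (np.diag(p) - np.outer(p, p)) @ C

rng = np.random.default_rng(0)
d, n = 5, 3                    # example sizes, chosen arbitrarily
C = rng.standard_normal((d, n))
x = rng.standard_normal(n)

eigs = np.linalg.eigvalsh(hessian(C, x))
print(eigs.min() >= -1e-12)    # prints True: PSD up to round-off
```

This confirms the observation for particular random draws, but of course it is not a proof; the answer below gives the analytic argument.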
If I understand your question correctly, you're asking how to show the function $f(x) = -\log\left(\frac{e^{\theta_i^T x} }{\sum_{j=1}^N e^{\theta_j^T x}} \right)$ is convex in $x\in \mathbf{R}^n$, where $\theta_j^T$ denotes the $j$th row of your matrix $C$. If so, you can easily rewrite this as
$$f(x) = -\theta_i^T x + \log\left(\sum_{j=1}^N e^{\theta_j^T x}\right)$$
The first term is linear in $x$, hence convex, and the second term is the log-sum-exp function, which is convex. I think what you meant to say about the intersection above is that this function is actually the composition of three functions.
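The algebraic rewrite is just $-\log(a/b) = -\log a + \log b$ applied to the softmax ratio; a quick numerical sanity check of the identity, with arbitrary example sizes:

```python
import numpy as np

rng = np.random.default_rng(1)
d, n, k = 4, 3, 2                     # example sizes, chosen arbitrarily
theta = rng.standard_normal((d, n))   # rows theta_j^T (the rows of C)
x = rng.standard_normal(n)

z = theta @ x
# original form: f(x) = -log( exp(z_k) / sum_j exp(z_j) )
f_orig = -np.log(np.exp(z[k]) / np.exp(z).sum())
# rewritten form: f(x) = -theta_k^T x + log-sum-exp(z)
f_split = -z[k] + np.log(np.exp(z).sum())

print(np.isclose(f_orig, f_split))    # prints True
```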
There are some conditions (section 3.2.4 of Boyd) under which compositions of functions are sufficient to yield convex/concave functions, but log-sum-exp is a standard example of a function that cannot be proved convex by these rules (they are sufficient, not necessary).
You can compute the Hessian of log-sum-exp and use the Cauchy-Schwarz inequality to prove its convexity; the sum of a linear function and a convex function is then obviously convex. See page 74 of Stephen Boyd's book on convex optimization.
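For completeness, here is a sketch of that Cauchy-Schwarz step (it follows Boyd's argument; the notation $g$, $p$, $v$ is mine). Writing $g(z) = \log\sum_j e^{z_j}$ and $p_j = e^{z_j}/\sum_l e^{z_l}$, the Hessian is $\nabla^2 g(z) = \operatorname{diag}(p) - pp^T$, so for any $v$:

```latex
v^T \nabla^2 g(z)\, v
  = \sum_j p_j v_j^2 - \Big(\sum_j p_j v_j\Big)^2
  \;\ge\; 0,
```

which is Cauchy-Schwarz applied to the vectors $(\sqrt{p_j}\,v_j)_j$ and $(\sqrt{p_j})_j$, using $\sum_j p_j = 1$. Composition with the affine map $z = Cx$ preserves convexity, which gives convexity of the log-sum-exp term in $x$.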