Let's consider the following convex optimization problem of minimizing the log barrier function:
$$\min_{\textbf{x}\in \Re^n}f(\textbf{x})=\min_{\textbf{x}\in \Re^n}\bigg\{\textbf{c}^T\textbf{x}-\sum\limits_{i=1}^m\log(b_i-\textbf{a}_i^T\textbf{x})\bigg\}$$
where $\textbf{x}\in \Re^{n} \text{, } \textbf{c}\in \Re^{n} \text{, } \textbf{b}\in \Re^{m} \text{, } A\in \Re^{m\times n}$, and $\textbf{a}_i^T$ denotes the $i$-th row of $A$. I am trying to solve this problem in the unconstrained setting, but first I want to prove that $f$ is in fact convex. What I have thought so far: the logarithm restricts us to $\operatorname{dom} f=\{\textbf{x}\in \Re^n \mid b_i-\textbf{a}_i^T\textbf{x}>0 \text{ for } i=1,2,\dots,m\}$. These inequalities define the intersection of $m$ open halfspaces, each of which is convex, and since convexity is preserved under intersection, $\operatorname{dom} f$ is convex.
So on this domain the goal is to prove that the objective function is convex. My reasoning: since the logarithm is concave, $-\log$ is convex; composing it with the affine map $\textbf{x}\mapsto b_i-\textbf{a}_i^T\textbf{x}$ preserves convexity, and summing these convex terms over $i$ preserves convexity as well. Finally, adding the linear term $\textbf{c}^T\textbf{x}$ (an affine function) also preserves convexity. But is there a more rigorous way to prove this?
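As a sanity check (not a proof), I also probed the claim numerically. The sketch below uses a made-up small instance: random $A$, $\textbf{c}$, and a $\textbf{b}$ chosen so that a known point is strictly feasible. It builds a finite-difference Hessian of $f$ at that point and checks that its eigenvalues are non-negative up to numerical error, which is what convexity would predict on the interior of the domain.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small instance: sizes and data are arbitrary choices.
m, n = 6, 3
A = rng.standard_normal((m, n))
x0 = rng.standard_normal(n)
b = A @ x0 + rng.uniform(0.5, 1.5, size=m)  # guarantees b - A x0 > 0
c = rng.standard_normal(n)

def f(x):
    """Log-barrier objective c^T x - sum_i log(b_i - a_i^T x)."""
    s = b - A @ x
    assert np.all(s > 0), "x must be strictly feasible"
    return c @ x - np.sum(np.log(s))

def hessian_fd(x, h=1e-4):
    """Central finite-difference approximation of the Hessian of f at x."""
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei) - f(x + ej) + f(x)) / h**2
    return H

H = hessian_fd(x0)
eigs = np.linalg.eigvalsh((H + H.T) / 2)  # symmetrize before eigendecomposition
print("min eigenvalue:", eigs.min())
```

At one strictly feasible point the smallest eigenvalue comes out positive (up to discretization error), consistent with the composition argument above; of course this checks a single point and is no substitute for a proof.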