For a concave function of a single variable, the following property is classic:
If $f$ is a concave function of a single variable, its derivative $f'$ is monotonically non-increasing on that interval; that is, a concave function has a non-increasing (decreasing) slope. Intuitively, the closer the slope is to zero, the closer we are to the global maximum (when one exists).
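A quick numerical illustration of this single-variable picture (my own toy example, not part of the question): for the concave function $f(x) = -x^2$, the derivative $f'(x) = -2x$ is strictly decreasing, and it equals $0$ exactly at the global maximizer $x = 0$.

```python
import numpy as np

# Toy example: f(x) = -x^2 is concave, with derivative f'(x) = -2x.
xs = np.linspace(-3.0, 3.0, 61)   # grid includes x = 0 exactly
fprime = -2.0 * xs                # exact derivative of -x^2

print(np.all(np.diff(fprime) < 0))    # True: the slope is strictly decreasing
print(xs[np.argmin(np.abs(fprime))])  # 0.0: slope is nearest 0 at the maximizer
```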
Can this property be applied to a concave function of multiple variables?
For example, we know that the function $f(B)=\log \left | B \right |$, where $\left | B \right |$ is the determinant of a positive-definite matrix $B$, is concave. The derivative (gradient) of $\log \left ( \det B \right )$ with respect to $B$ is $$\frac{\partial}{\partial B}\log\left(\det B\right)=\left ( B^{-1} \right )^{T}.$$
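As a sanity check on this gradient formula (my own verification, using numpy, not part of the original question), one can compare $\left(B^{-1}\right)^T$ against a finite-difference approximation of $\partial \log(\det B)/\partial B$:

```python
import numpy as np

# Build a random symmetric positive-definite matrix B.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = A @ A.T + 4 * np.eye(4)

def f(B):
    return np.log(np.linalg.det(B))

analytic = np.linalg.inv(B).T  # claimed gradient: (B^{-1})^T

# Central finite differences, entry by entry.
eps = 1e-6
numeric = np.zeros_like(B)
for i in range(4):
    for j in range(4):
        E = np.zeros_like(B)
        E[i, j] = eps
        numeric[i, j] = (f(B + E) - f(B - E)) / (2 * eps)

print(np.max(np.abs(numeric - analytic)))  # should be tiny
```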
Because the variables are constrained, the feasible solutions are restricted to a set of points in the space. My question is: can we measure the slope via this matrix-valued gradient, and conclude that one feasible point is closest to the global maximum whenever some property of the gradient is observed?
If there is any related literature, I would be very grateful for a pointer. Thank you very much.
A common characterization requires taking two derivatives and working with the Hessian. I'm going to phrase this in the convex case (the concave case is analogous; just multiply everything by $-1$).
By calculus, "$f'$ is always non-decreasing" is the same as saying "$f''$ is always nonnegative" (provided $f''$ exists). You observed the mirror image of this property for concave functions. More generally, the following is a standard result:
If a real-valued function $f$ defined on a real Hilbert space (e.g. $\mathbb{R}^N$) has a well-defined Hessian $H$, then $$f \;\;\text{is convex}\;\;\Leftrightarrow\;\;H\;\;\text{is positive semidefinite everywhere}.$$ This result appears in most convex analysis textbooks (Rockafellar's *Convex Analysis* is a classic). The analogue for your concave case is that $H$ must be negative semidefinite.
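To make this concrete (my own illustration, using the standard convex function $f(x)=\log\sum_i e^{x_i}$, which is not from the original answer): we can build the Hessian by finite differences at some point and check that its eigenvalues are all nonnegative, i.e. that it is positive semidefinite there.

```python
import numpy as np

# Log-sum-exp: a standard example of a convex function on R^n.
def f(x):
    return np.log(np.sum(np.exp(x)))

x0 = np.array([0.3, -1.0, 2.0])
n = x0.size
eps = 1e-4
H = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        e_i = np.zeros(n); e_i[i] = eps
        e_j = np.zeros(n); e_j[j] = eps
        # Central second difference for d^2 f / dx_i dx_j.
        H[i, j] = (f(x0 + e_i + e_j) - f(x0 + e_i - e_j)
                   - f(x0 - e_i + e_j) + f(x0 - e_i - e_j)) / (4 * eps**2)

# Symmetrize against round-off, then inspect the spectrum.
eigs = np.linalg.eigvalsh((H + H.T) / 2)
print(eigs)  # all eigenvalues >= 0 (up to numerical error): H is PSD
```

The smallest eigenvalue of the log-sum-exp Hessian is exactly $0$ (the Hessian annihilates the all-ones direction), so the numerical spectrum sits at or just above zero.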
To your second question: it still holds that $x$ (globally) minimizes a convex function $f$ if and only if $\nabla f(x)=0$. Equivalently, $x$ (globally) maximizes a concave function $f$ if and only if $\nabla f(x)=0$. This is known as Fermat's rule, or a first-order optimality condition. (Note that this characterizes the *unconstrained* optimum; in a constrained problem the condition must be adapted, e.g. via the KKT conditions.)
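Fermat's rule can be seen numerically on a convex quadratic (the matrix $A$ and vector $b$ below are my own example, not from the answer): for $f(x)=\tfrac12 x^T A x - b^T x$ with $A$ positive definite, the gradient is $A x - b$, so the unique stationary point $x^\ast = A^{-1} b$ is the global minimizer.

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3))
A = M @ M.T + np.eye(3)        # positive definite, so f is (strictly) convex
b = rng.standard_normal(3)

def f(x):
    return 0.5 * x @ A @ x - b @ x

x_star = np.linalg.solve(A, b)  # stationary point: grad f = A x - b = 0
grad = A @ x_star - b
print(np.linalg.norm(grad))     # ~0: Fermat's rule holds at x_star

# The stationary point is indeed a *global* minimizer:
trials = rng.standard_normal((1000, 3))
print(all(f(x_star) <= f(x) + 1e-12 for x in trials))  # True
```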