I have a set of points $S$ that has to be clustered into $K$ clusters, say $S_1, \dots, S_K$, by minimizing the following function:
$J = - \sum_{i=1}^{K} \log \det( \mathbf{I} + H_i H_i^T)$,
where $H_i$ is the matrix whose columns are the points in cluster $i$. Note that $S = \cup_k S_k$ and $S_k \cap S_j = \emptyset$ if $k \neq j$. We know the number of clusters $K$. How do I optimize such a function?
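To make the objective concrete, here is a minimal sketch (Python/NumPy rather than MATLAB; the names `objective_J`, `points`, and `labels` are illustrative, not from the question) that evaluates $J$ for a given partition encoded as a label vector:

```python
import numpy as np

def objective_J(points, labels, K):
    """Evaluate J = -sum_i log det(I + H_i H_i^T), where H_i stacks
    the points assigned to cluster i as the columns of a d x |S_i| matrix."""
    d = points.shape[0]
    J = 0.0
    for i in range(K):
        H = points[:, labels == i]  # columns belonging to cluster i
        # slogdet is numerically safer than taking det() and then log()
        _, logdet = np.linalg.slogdet(np.eye(d) + H @ H.T)
        J -= logdet
    return J
```

Since $I + H_i H_i^T \succeq I$, each term $\log\det(I + H_i H_i^T)$ is nonnegative, so $J \le 0$, with strict inequality whenever some cluster contains a nonzero point.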
Do you want to do it efficiently or exactly? Technically, this is a mixed-integer convex MAXDET problem, so it can be solved by branch-and-bound applied to convex semidefinite MAXDET relaxations.
The model can be set up in the MATLAB toolbox YALMIP (disclaimer: developed by me), which has a rudimentary mixed-integer SDP solver. To solve the relaxations, the SDP solver SDPT3 is required, as it is the only available MAXDET solver.
The exact brute-force alternative is to enumerate all assignments of points to clusters and evaluate $J$ on each, which is definitely not the way you want to approach this if you are going for efficiency.
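Such a brute force can be sketched as follows (again in Python/NumPy rather than MATLAB; `brute_force_cluster` and `objective_J` are illustrative names): enumerate all $K^N$ label assignments of the $N$ points and keep the one with the smallest $J$. This is exponential in $N$, so it is only feasible for very small instances.

```python
import itertools
import numpy as np

def objective_J(points, labels, K):
    """J = -sum_i log det(I + H_i H_i^T) for the given assignment."""
    d = points.shape[0]
    J = 0.0
    for i in range(K):
        H = points[:, labels == i]
        _, logdet = np.linalg.slogdet(np.eye(d) + H @ H.T)
        J -= logdet
    return J

def brute_force_cluster(points, K):
    """Enumerate every assignment of the N columns of `points` to K
    clusters and return the minimizer of J. O(K^N) evaluations."""
    N = points.shape[1]
    best_J, best_labels = np.inf, None
    for assignment in itertools.product(range(K), repeat=N):
        labels = np.asarray(assignment)
        J = objective_J(points, labels, K)
        if J < best_J:
            best_J, best_labels = J, labels
    return best_labels, best_J
```

(One could cut the enumeration roughly by a factor of $K!$ by exploiting label-permutation symmetry, e.g. fixing the first point to cluster 0, but the approach stays exponential.)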