Given a vector field $X$ in $M$, defined along the submanifold $\Sigma$, we can define the tangential divergence of $X$ to be
$$\operatorname{div}_\Sigma X = \sum_{i = 1}^n (\nabla_{e_i} X, e_i)$$
where $e_1, \dots, e_n$ is any orthonormal frame of $\Sigma$.
I want to show that for any frame $v_1,\dots,v_n$ for $\Sigma$, we have
$$\operatorname{div}_\Sigma X = \sum_{i,j = 1}^n (v^i,v^j)(\nabla_{v_i} X, v_j)$$
I tried applying Gram–Schmidt to $v_1, \dots, v_n$, but it did not work. I also don't know how to define the divergence as a trace without first choosing a frame. I do know that the claimed formula amounts to summing over all $(i,j)$ entries of the matrix with entries $(v^i,v^j)(\nabla_{v_i} X, v_j)$.
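For what it's worth, I did check the identity numerically in the simplest flat case, $\Sigma = \mathbb{R}^2$ sitting inside $M = \mathbb{R}^2$ with $\nabla$ the ordinary directional derivative, under the (possibly wrong) interpretation that $(v^i, v^j)$ denotes $g^{ij}$, the entries of the inverse Gram matrix $g_{ij} = (v_i, v_j)$. The vector field, the point, and the frame below are arbitrary choices for the test:

```python
import numpy as np

def jacobian(X, p, h=1e-6):
    """Central-difference Jacobian of a vector field X at the point p."""
    n = len(p)
    J = np.zeros((n, n))
    for k in range(n):
        e = np.zeros(n)
        e[k] = h
        J[:, k] = (X(p + e) - X(p - e)) / (2 * h)
    return J

# An arbitrary vector field on R^2 and an arbitrary test point.
X = lambda p: np.array([p[0]**2 * p[1], np.sin(p[0]) + p[1]])
p = np.array([0.7, -1.3])
J = jacobian(X, p)  # nabla_v X = J v in the flat case

# Divergence via the orthonormal frame e_1, e_2: sum_i (nabla_{e_i} X, e_i).
div_orthonormal = np.trace(J)

# Divergence via an arbitrary, non-orthonormal frame v_1, v_2.
V = np.array([[2.0, 1.0],
              [0.5, 3.0]])  # columns are v_1, v_2
G = V.T @ V                 # Gram matrix g_{ij} = (v_i, v_j)
Ginv = np.linalg.inv(G)     # assumed meaning of (v^i, v^j): g^{ij}
div_frame = sum(Ginv[i, j] * (J @ V[:, i]) @ V[:, j]
                for i in range(2) for j in range(2))

print(div_orthonormal, div_frame)  # the two values agree numerically
```

The agreement here boils down to $\sum_{i,j} g^{ij}\, v_j^{\top} J\, v_i = \operatorname{tr}\!\big(J\, V (V^{\top}V)^{-1} V^{\top}\big) = \operatorname{tr}(J)$, since $V$ is invertible, but I don't see how to run this argument intrinsically on a submanifold.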