Some papers on community detection (network clustering) use the following measure of quality for the detected communities (see, for example, https://arxiv.org/abs/0906.0612, Section XV.B, third paragraph):
> Fraction of correctly classified vertices. A vertex is correctly classified if it is in the same cluster with at least half of its “natural” partners. If the partition found by the algorithm has clusters given by the merging of two or more natural groups, all vertices of the cluster are considered incorrectly classified. The number of correctly classified vertices is then divided by the total size of the graph, to yield a number between 0 and 1.
I am very confused by this definition. Could someone explain how to compute this quantity, given a ground-truth partition $C = \{c_1,\dots,c_n\}$ of the vertex set $V = \{1,\dots,v\}$ and the partition $D = \{d_1,\dots,d_m\}$ produced by the community detection algorithm?
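To make my confusion concrete, here is a sketch of how I would tentatively implement it. I am guessing that "natural partners" of a vertex means the other members of its ground-truth cluster, and that a detected cluster counts as a "merger" when it fully contains two or more ground-truth clusters; both of these readings are my assumptions, not something the paper spells out.

```python
def fraction_correct(ground_truth, detected):
    """Tentative implementation of the 'fraction of correctly classified
    vertices' measure. Both arguments are lists of sets partitioning the
    same vertex set. The interpretation of 'merger' and 'natural partners'
    below is my own guess."""
    # Map each vertex to the detected cluster containing it.
    which_detected = {}
    for d in detected:
        for u in d:
            which_detected[u] = frozenset(d)

    # Assumption: a detected cluster is a "merger" if it entirely
    # contains two or more ground-truth clusters.
    merged = set()
    for d in detected:
        d = frozenset(d)
        if sum(1 for c in ground_truth if c <= d) >= 2:
            merged.add(d)

    n_correct = 0
    for c in ground_truth:
        for u in c:
            d = which_detected[u]
            if d in merged:
                continue  # every vertex of a merged cluster is wrong
            # Assumption: "natural partners" of u are the other members
            # of its ground-truth cluster; u is correct if at least half
            # of them share its detected cluster.
            partners = c - {u}
            if not partners or len(partners & d) >= len(partners) / 2:
                n_correct += 1

    return n_correct / sum(len(c) for c in ground_truth)
```

Under this reading, recovering the ground truth exactly gives 1.0, and merging everything into one cluster gives 0.0; is that the intended behavior, and is my reading of "merging" correct?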