There is a set of points in the plane, $(x_i, y_i)$, $i = 1, 2, \dots, n$. Each point has a binary label: $0$ or $1$.
Points labeled $0$ are shown in yellow and points labeled $1$ in blue in the figure.
It is required to build a spatial graph and then find its minimum spanning tree. In the graph, each point is a vertex, and edges are added subject to two conditions:
1. an edge between two vertices labeled $0$ is prohibited;
2. every vertex labeled $0$ must have degree exactly $1$.
To construct the graph, I defined the distance matrix between all pairs of points:
\begin{equation*} d\bigl((x_i, y_i), (x_j, y_j)\bigr) = \begin{cases} \infty, &\text{if $(x_i, y_i)$ and $(x_j, y_j)$ are both labeled $0$,}\\ \sqrt{(x_i - x_j)^2 + (y_i - y_j)^2}, & \text{otherwise.} \end{cases} \end{equation*}
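A minimal sketch of this distance matrix in NumPy, using hypothetical sample coordinates and labels (not from the question's figure):

```python
import numpy as np

# Hypothetical sample data: coordinates and binary labels (0 = yellow, 1 = blue).
points = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 2.0]])
labels = np.array([0, 1, 0, 1])

# Pairwise Euclidean distances via broadcasting.
diff = points[:, None, :] - points[None, :, :]
D = np.sqrt((diff ** 2).sum(axis=-1))

# Condition (1): forbid edges between two 0-labeled vertices
# by setting those distances to infinity.
both_zero = (labels[:, None] == 0) & (labels[None, :] == 0)
D[both_zero] = np.inf
```

The matrix stays symmetric, and any $\infty$ entry simply means "no admissible edge" between that pair.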
Problem: using the formula above, I don't know how to implement condition (2). My current solution is to add a new point and obtain two edges between blue vertices.
Question: how can condition (2) be expressed when calculating the distance matrix?
My idea: the distance matrix is symmetric. For each point labeled $0$, set every element in its column of the distance matrix to infinity except the minimum one (and do the same in the corresponding row, to preserve symmetry). The vertex is then left with exactly one admissible edge.
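This masking idea can be sketched as follows, using SciPy's `minimum_spanning_tree` and hypothetical sample data. Since a $0$-labeled vertex keeps only one incident edge in the graph, and an MST of a connected graph must include the sole edge of a degree-1 vertex, condition (2) is then satisfied automatically:

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

# Hypothetical sample data: coordinates and binary labels (0 = yellow, 1 = blue).
points = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 2.0], [2.0, 0.0]])
labels = np.array([0, 1, 0, 1, 1])

# Distance matrix with condition (1): no edges between 0-labeled vertices.
diff = points[:, None, :] - points[None, :, :]
D = np.sqrt((diff ** 2).sum(axis=-1))
both_zero = (labels[:, None] == 0) & (labels[None, :] == 0)
D[both_zero] = np.inf
np.fill_diagonal(D, np.inf)  # no self-loops

# Condition (2): each 0-labeled vertex keeps only its nearest admissible
# neighbour; mask the rest of its row AND column to preserve symmetry.
for i in np.where(labels == 0)[0]:
    j = np.argmin(D[i])          # nearest admissible neighbour
    keep = D[i, j]
    D[i, :] = np.inf
    D[:, i] = np.inf
    D[i, j] = D[j, i] = keep

# SciPy's convention for dense graph input: 0 means "no edge".
W = np.where(np.isfinite(D), D, 0.0)
mst = minimum_spanning_tree(W)
```

In the resulting tree, every $0$-labeled vertex is a leaf, attached to its nearest $1$-labeled neighbour.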
Edit after @DavidG.Stork's comment.
I followed steps 1-2 and am currently working on step 3. I need to remove the nodes marked with red lines. My problem is the condition: I am looking for a rule that distinguishes marked nodes from unmarked ones.


> I would:
> Repeat step 3 as necessary...