Maximum number of leaves in a spanning tree


We have a graph with $9n^2$ vertices, placed in a $3n \times 3n$ table; two vertices are adjacent in the graph if they are adjacent in the table. What is the maximum number of leaves in a spanning tree of this graph?

I think the answer is $6n^2-2(n-1)$, and I have an example achieving this, but I can't prove that there is no spanning tree with more leaves. Please let me know if you have any idea how to solve this.

This is my example, for $n=2$:

[image of the construction]

It's easy to find the pattern for bigger $n$.
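For $n=1$ the claimed formula gives $6 \cdot 1^2 - 2 \cdot 0 = 6$, which can be checked by exhaustive search over all spanning trees of the $3 \times 3$ grid. Below is a minimal brute-force sketch (the helper names `grid_edges` and `max_leaves` are mine, not from the question); it is only feasible for very small grids:

```python
from itertools import combinations

def grid_edges(rows, cols):
    """All edges of the rows-by-cols grid graph."""
    edges = []
    for r in range(rows):
        for c in range(cols):
            if r + 1 < rows:
                edges.append(((r, c), (r + 1, c)))
            if c + 1 < cols:
                edges.append(((r, c), (r, c + 1)))
    return edges

def max_leaves(rows, cols):
    """Maximum leaf count over all spanning trees, by brute force."""
    V = [(r, c) for r in range(rows) for c in range(cols)]
    E = grid_edges(rows, cols)
    best = 0
    # A spanning tree has |V| - 1 edges; try every edge subset of that size.
    for tree in combinations(E, len(V) - 1):
        parent = {v: v for v in V}

        def find(x):
            # Union-find with path halving.
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        acyclic = True
        for u, v in tree:
            ru, rv = find(u), find(v)
            if ru == rv:          # adding this edge would close a cycle
                acyclic = False
                break
            parent[ru] = rv
        if not acyclic:
            continue
        # |V| - 1 acyclic edges on |V| vertices form a spanning tree.
        deg = {v: 0 for v in V}
        for u, v in tree:
            deg[u] += 1
            deg[v] += 1
        best = max(best, sum(1 for v in V if deg[v] == 1))
    return best

print(max_leaves(3, 3))  # prints 6
```

For $n=1$ the search returns 6, consistent with a degree count: a spanning tree of the $3 \times 3$ grid with 7 leaves would need its two internal vertices to have degree sum $2 \cdot 8 - 7 = 9$, which is impossible since every vertex has degree at most 4.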

Best answer:

Here is a neural network with two hidden layers of 13 nodes each to estimate the maximum number of leaves of a spanning tree of the $n \times m$ grid. The training data are generated from the special known cases mentioned in the paper:

Here is the code:

import math

import torch
from torch import nn, optim

num_samples = 1000


def relu_num(x):
    """Plain ReLU on a Python number."""
    return max(0, x)


# Features per sample:
#   [number_of_rows,
#    number_of_columns,
#    number_of_edges_of_spanning_tree (= rows*cols - 1),
#    number_of_nodes_of_degree_at_most_4,
#    number_of_nodes_of_degree_at_most_3]

# R(2,m) = m for m >= 4; R(2,1) = R(2,2) = 2, R(2,3) = 4
A = [[2.0, i + 1.0, 2.0 * (i + 1.0) - 1.0, 0.0, relu_num(2.0 * (i + 1.0) - 4.0)]
     for i in range(num_samples)]
B = [[float(i) + 1.0] for i in range(num_samples)]
B[0] = [2.0]
B[1] = [2.0]
B[2] = [4.0]

# R(3,m) = 2m
C = [[3.0, i + 1.0, 3.0 * (i + 1.0) - 1.0, relu_num(i - 2.0), relu_num(2.0 * i - 2.0)]
     for i in range(num_samples)]
D = [[2.0 * (i + 1.0)] for i in range(num_samples)]

# R(4,m) = 2m + floor(m/3)
E = [[4.0, i + 1.0, 4.0 * (i + 1.0) - 1.0, relu_num(2.0 * (i - 2.0)), relu_num(2.0 * (i - 2.0) + 4.0)]
     for i in range(num_samples)]
F = [[math.floor((i + 1.0) / 3.0) + 2.0 * (i + 1.0)] for i in range(num_samples)]

# R(6,m) = 4m - 2
G = [[6.0, i + 1.0, 6.0 * (i + 1.0) - 1.0, relu_num(4.0 * (i - 2.0)), relu_num(2.0 * (i - 2.0) + 8.0)]
     for i in range(num_samples)]
H = [[4.0 * (i + 1.0) - 2.0] for i in range(num_samples)]

data = torch.tensor(A + C + E + G)
target = torch.tensor(B + D + F + H)

input_size = 5
hidden_size = 13
num_classes = 1
learning_rate = 0.005


class Model(nn.Module):
    def __init__(self, input_size, hidden_size, num_classes):
        super().__init__()
        self.h1 = nn.Linear(input_size, hidden_size)
        self.h2 = nn.Linear(hidden_size, hidden_size)
        self.h3 = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        x = torch.tanh(self.h1(x))
        x = torch.relu(self.h2(x))
        return self.h3(x)


model = Model(input_size=input_size, hidden_size=hidden_size, num_classes=num_classes)
opt = optim.Adam(params=model.parameters(), lr=learning_rate)
criterion = nn.MSELoss()

for epoch in range(10000):
    out = model(data)
    loss = criterion(out, target)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Optionally compare target[i] with out[i] over the training set here.

test = torch.tensor([[3.0, 1100.0, 3300.0 - 1.0, 1100.0 - 2.0, 2.0 * (1100.0 - 1.0)],
                     [2.0, 1100.0, 2200.0 - 1.0, 0.0, 2.0 * (1100.0 - 2.0)],
                     [4.0, 1100.0, 4400.0 - 1.0, 2.0 * (1100.0 - 2.0), 2.0 * (1100.0 - 2.0) + 4.0],
                     [6.0, 1100.0, 6600.0 - 1.0, 4.0 * (1100.0 - 2.0), 2.0 * (1100.0 - 2.0) + 8.0],
                     [15.0, 15.0, 15.0 * 15.0 - 1.0, 13.0 * 13.0, 2.0 * 13.0 + 26.0],
                     [18.0, 18.0, 18.0 * 18.0 - 1.0, 16.0 * 16.0, 2.0 * 16.0 + 32.0],
                     [21.0, 21.0, 21.0 * 21.0 - 1.0, 19.0 * 19.0, 2.0 * 19.0 + 38.0]])
print(model(test))

The results for $(n, m) \in \{(3,3), (6,6), (9,9), (12,12), (15,15), (18,18), (21,21)\}$ are:

tensor([[  7.3681],
        [ 22.1005],
        [120.1357],
        [125.5403],
        [125.5866],
        [125.5394],
        [125.5266]])

The next step is to incorporate more background knowledge into the training set, for example the known upper and lower bounds for $R(n,m)$, and to show some plots.
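For comparison, the special-case closed forms used above to generate the training data can be evaluated directly. A small sketch (the function name `max_leaves_known` is mine; it covers only the row counts $n \in \{2, 3, 4, 6\}$ that appear in the training set):

```python
def max_leaves_known(n, m):
    """Closed forms for R(n, m), the maximum number of leaves in a
    spanning tree of an n-by-m grid, for the special cases used in
    the training data (n = 2, 3, 4, 6)."""
    if n == 2:
        # R(2,1) = R(2,2) = 2, R(2,3) = 4, otherwise R(2,m) = m
        return {1: 2, 2: 2, 3: 4}.get(m, m)
    if n == 3:
        return 2 * m                 # R(3,m) = 2m
    if n == 4:
        return 2 * m + m // 3        # R(4,m) = 2m + floor(m/3)
    if n == 6:
        return 4 * m - 2             # R(6,m) = 4m - 2
    raise ValueError("no closed form listed here for n = %d" % n)

# Exact values for the large test grids fed to the network:
for n in (2, 3, 4, 6):
    print(n, 1100, max_leaves_known(n, 1100))
```

Comparing these exact values with the network's predictions gives a direct measure of how far the fit is from the formulas it was trained on.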

Another answer:

This is a construction that gives $6n^2 - n$ leaves:

[image of the construction]