I have been bashing my head against the following problem for the past two days. It is a review question in preparation for my exam, and I assume something similar will be on it. The problem states:
Find the minimum spanning tree of a graph with $|V| = n$ and $|E| = n + k$ for some constant $k$ with $n \gg k$ (so the graph is quite sparse), in $O(n + k^{100})$ running time (the $k^{100}$ here stands in for an arbitrary function of $k$). Each comparison costs 1 unit of time (so I must minimize the number of comparisons). The following hint was given: leaf nodes (nodes of degree 1) do not have to be compared against others.
My original thought was to use some form of Borůvka's algorithm, since it can run in time linear in the number of edges (if some edges can be deleted on every iteration). I would love to know which direction to take with this problem, or whether I am missing something obvious.
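For concreteness, here is the kind of Borůvka phase loop I had in mind, as a minimal Python sketch of my own (the function name `boruvka_mst` and the edge format are just my choices), assuming distinct edge weights and vertices numbered $0$ to $n-1$. Each phase scans all edges once to find the cheapest edge leaving each component, so a phase costs $O(|E|)$ comparisons:

```python
def boruvka_mst(n, edges):
    """edges: list of (weight, u, v); returns a list of MST edges."""
    parent = list(range(n))                 # union-find over components

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    mst, components = [], n
    while components > 1:
        # Cheapest edge leaving each component: one scan over all edges.
        cheapest = {}
        for w, u, v in edges:
            ru, rv = find(u), find(v)
            if ru == rv:
                continue                    # internal edge, ignore
            for r in (ru, rv):
                if r not in cheapest or w < cheapest[r][0]:
                    cheapest[r] = (w, u, v)
        if not cheapest:
            break                           # graph is disconnected
        # Add the selected edges, merging components (weights are distinct,
        # so no cycle can be created within a phase).
        for w, u, v in cheapest.values():
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                mst.append((w, u, v))
                components -= 1
    return mst
```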
Here is a sketch of an idea; I am not sure whether it will actually work.
The graph you describe is mostly "tree-like," with just a few cycles. Classify each edge as a "cycle edge" if it lies on some cycle, and as a "tree edge" (i.e., a bridge) otherwise. Every tree edge belongs to every spanning tree, so all of them go into the MST without a single comparison. The subgraph of cycle edges decomposes into a collection of simple paths that meet only at their endpoints (the vertices of degree at least 3 in that subgraph), plus possibly some standalone cycles. For each such path, every edge except its maximum-weight edge must be included in the MST: any cycle through a non-maximal edge of the path traverses the whole path (the interior vertices have degree 2), so that edge is never the heaviest edge of any cycle.

The set $M$ of per-path maximum-weight edges can be computed with $O(n)$ comparisons in one sweep. And $|M|$ should be small, in fact $O(k)$ rather than just some low-degree polynomial $p(k)$: suppressing the degree-2 vertices leaves a multigraph with minimum degree at least 3 and cyclomatic number at most $k+1$, and such a multigraph has only $O(k)$ edges, i.e., there are only $O(k)$ paths...
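To make this concrete, here is a minimal Python rendering of the sketch, under assumptions the sketch does not state: the graph is connected, edge weights are distinct, vertices are numbered 0 to $n-1$, and the input is a list of (weight, u, v) triples. The tree edges are exactly the bridges, found here with one DFS; the final Kruskal pass over $M$ is just one natural way to spend the $k^{100}$ budget on the few remaining candidates, not something claimed above:

```python
from collections import defaultdict
import sys

def find_bridges(n, adj):
    """Bridge edge ids via one DFS (Tarjan); adj[u] = list of (v, edge_id)."""
    sys.setrecursionlimit(max(10000, 2 * n))  # recursive DFS; fine for a sketch
    disc, low = [0] * n, [0] * n
    timer, bridges = [1], set()

    def dfs(u, parent_eid):
        disc[u] = low[u] = timer[0]; timer[0] += 1
        for v, eid in adj[u]:
            if eid == parent_eid:
                continue
            if disc[v]:
                low[u] = min(low[u], disc[v])
            else:
                dfs(v, eid)
                low[u] = min(low[u], low[v])
                if low[v] > disc[u]:        # no back edge spans (u, v)
                    bridges.add(eid)

    for s in range(n):
        if not disc[s]:
            dfs(s, -1)
    return bridges

def sparse_mst(n, edges):
    """edges: list of (weight, u, v), distinct weights; returns MST edge ids."""
    adj = defaultdict(list)
    for eid, (w, u, v) in enumerate(edges):
        adj[u].append((v, eid)); adj[v].append((u, eid))

    bridges = find_bridges(n, adj)
    mst = set(bridges)                      # tree edges are forced

    cadj = defaultdict(list)                # subgraph of cycle edges
    for eid, (w, u, v) in enumerate(edges):
        if eid not in bridges:
            cadj[u].append((v, eid)); cadj[v].append((u, eid))

    used, M = set(), []

    def walk(e0, v):
        """Follow a degree-2 chain starting with edge e0 into vertex v;
        return the id of the heaviest edge on the chain."""
        best = pe = e0
        used.add(e0)
        while len(cadj[v]) == 2:
            (a, ea), (b, eb) = cadj[v]
            v, pe = (a, ea) if ea != pe else (b, eb)
            if pe in used:                  # closed a standalone cycle
                break
            used.add(pe)
            if edges[pe][0] > edges[best][0]:
                best = pe
        return best

    for u in list(cadj):                    # paths between branch vertices
        if len(cadj[u]) != 2:
            for v, eid in cadj[u]:
                if eid not in used:
                    M.append(walk(eid, v))
    for u in list(cadj):                    # leftovers are standalone cycles
        for v, eid in cadj[u]:
            if eid not in used:
                M.append(walk(eid, v))

    # Non-maximal path edges are forced into the MST.
    M_set = set(M)
    mst |= {eid for eid in range(len(edges))
            if eid not in bridges and eid not in M_set}

    # Settle the O(k) candidates in M: contract the forced edges with a
    # union-find and run Kruskal on M (anything polynomial in k fits the budget).
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]; x = parent[x]
        return x
    for eid in mst:
        _, u, v = edges[eid]
        parent[find(u)] = find(v)
    for eid in sorted(M, key=lambda e: edges[e][0]):
        _, u, v = edges[eid]
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            mst.add(eid)
    return mst
```

On a small test (a triangle on vertices 0, 1, 2 with weights 1, 2, 3, plus a pendant edge to vertex 3) this keeps the two lighter triangle edges and the bridge, as expected.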