Proving that a graph of minimum degree $\delta$ and girth $g$ has at least $n_0(\delta, g)$ vertices


I am currently studying the textbook Graph Theory, fifth edition, by Reinhard Diestel. Chapter 1.3 Paths and cycles says the following:

Proposition 1.3.3. A graph $G$ of radius at most $k$ and maximum degree at most $d \ge 3$ has fewer than $\dfrac{d}{d - 2}(d - 1)^k$ vertices.
Proof. Let $z$ be a central vertex in $G$, and let $D_i$ denote the set of vertices of $G$ at distance $i$ from $z$. Then $V(G) = \bigcup_{i = 0}^k D_i$. Clearly $\lvert D_0 \rvert = 1$ and $\lvert D_1 \rvert \le d$. For $i \ge 1$ we have $\lvert D_{i + 1} \rvert \le (d - 1) \lvert D_i \rvert$, because every vertex in $D_{i + 1}$ is a neighbour of a vertex in $D_i$ (why?), and each vertex in $D_i$ has at most $d - 1$ neighbours in $D_{i + 1}$ (since it has another neighbour in $D_{i - 1}$). Thus $\lvert D_{i + 1} \rvert \le d(d - 1)^i$ for all $i < k$ by induction, giving $$\lvert G \rvert \le 1 + d\sum_{i = 0}^{k - 1} (d - 1)^i = 1 + \dfrac{d}{d - 2}((d - 1)^k - 1) < \dfrac{d}{d - 2}(d - 1)^k. \square$$

Similarly, we can bound the order of $G$ from below by assuming that both its minimum degree and girth are large. For $d \in \mathbb{R}$ and $g \in \mathbb{N}$ let $$n_0(d, g) := \begin{cases} 1 + d \sum\limits_{i = 0}^{r - 1} (d - 1)^i & \text{if $g =: 2r + 1$ is odd;} \\ 2 \sum\limits_{i = 0}^{r - 1} (d - 1)^i & \text{if $g =: 2r$ is even.} \end{cases}$$ It is not difficult to prove that a graph of minimum degree $\delta$ and girth $g$ has at least $n_0(\delta, g)$ vertices (Exercise 7). Interestingly, one can obtain the same bound for its average degree:
Theorem 1.3.4. (Alon, Hoory & Linial 2002)
Let $G$ be a graph. If $d(G) \ge d \ge 2$ and $g(G) \ge g \in \mathbb{N}$ then $\lvert G \rvert \ge n_0(d, g)$.
One aspect of Theorem 1.3.4 is that it guarantees the existence of a short cycle compared with $\lvert G \rvert$. Using just the easy minimum degree version of Exercise 7, we get the following rather general bound:
Corollary 1.3.5. If $\delta(G) \ge 3$ then $g(G) < 2 \log \lvert G \rvert$.
Proof. If $g := g(G)$ is even then $$n_0(3, g) = 2 \dfrac{2^{g/2} - 1}{2 - 1} = 2^{g/2} + (2^{g/2} - 2) > 2^{g/2},$$ while if $g$ is odd then $$n_0(3, g) = 1 + 3 \dfrac{2^{(g - 1)/2} - 1}{2 - 1} = \dfrac{3}{\sqrt{2}} 2^{g/2} - 2 > 2^{g/2}.$$ As $\lvert G \rvert \ge n_0(3, g)$, the result follows. $\square$
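As a quick sanity check on both branches of $n_0$ and on the inequality used in this proof, here is a minimal Python sketch (the function name `n0` is mine, not from the book):

```python
def n0(d, g):
    """Diestel's lower bound n0(d, g), split by the parity of the girth g."""
    if g % 2 == 1:                      # g = 2r + 1
        r = (g - 1) // 2
        return 1 + d * sum((d - 1) ** i for i in range(r))
    else:                               # g = 2r
        r = g // 2
        return 2 * sum((d - 1) ** i for i in range(r))

# Spot checks for minimum degree 3 and girths 3..6.
assert [n0(3, g) for g in (3, 4, 5, 6)] == [4, 6, 10, 14]

# The inequality behind Corollary 1.3.5: n0(3, g) > 2^(g/2) for g >= 3.
assert all(n0(3, g) > 2 ** (g / 2) for g in range(3, 40))
```

The values for girths $3$ to $6$ are attained by $K_4$, $K_{3,3}$, the Petersen graph and the Heawood graph respectively, so the bound is tight for small girths.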

The author says that "it is not difficult to prove that a graph of minimum degree $\delta$ and girth $g$ has at least $n_0(\delta, g)$ vertices (Exercise 7)." I've been working on this for 48 hours, but I cannot make sense of what the author has done here.

So we want to prove that $$\lvert G \rvert \ge n_0(\delta, g) := \begin{cases} 1 + \delta \sum\limits_{i = 0}^{r - 1} (\delta - 1)^i & \text{if $g =: 2r + 1$ is odd;} \\ 2 \sum\limits_{i = 0}^{r - 1} (\delta - 1)^i & \text{if $g =: 2r$ is even.} \end{cases}$$ Start with a graph $G$ with minimum degree $\delta(G) \ge \delta \ge 2$ and girth $g(G) \ge g$. So, on the one hand, we have to see where the expression $n_0(\delta, g)$ comes from, and, on the other hand, we have to prove that $\lvert G \rvert \ge n_0(\delta, g)$. The author describes this as "not difficult," but nothing in the information provided up to this point indicates where $n_0(d, g)$ comes from, apart from the $1 + d \sum\limits_{i = 0}^{r - 1} (d - 1)^i$ expression, which appears in the proof of Proposition 1.3.3 but without any such odd-even split. I don't see any way to prove this without using information outside of what the author has introduced so far in the textbook.

There are 2 best solutions below


$g$ is odd case

Suppose that $r$ is such that $g=2r+1$.

Again let $z$ be a vertex of $G$ (any vertex will do; no centrality is needed here) and let $D_i$ denote the set of vertices at distance $i$ from $z$. Now $|D_0|=1$ and clearly $|D_1|\geq \delta$ (since $z$ has $d(z)\geq \delta$ neighbours).

The important fact is that for $i\in[1,r-1]$ the following holds $|D_{i+1}|\geq (\delta-1)|D_i|$. Why?

For any $u\in D_i$ there is an edge to a vertex in $D_{i-1}$, but $u$ has at least $d(u) -1 \geq \delta -1$ edges to vertices not in $\bigcup_{j=0}^i D_j$: if $u$ had a second neighbour $v\in D_j$ for some $j\leq i$ (besides its one neighbour on a shortest $z$–$u$ path), then the two shortest paths from $z$ together with the edge $uv$ would contain a cycle of length $L\leq j + i + 1\leq 2i+1\leq 2(r-1)+1<g$, a contradiction. A similar argument shows that two vertices $u_1,u_2\in D_i$ cannot both be joined to the same vertex $v\in D_{i+1}$.

Since $|D_1|\geq \delta$, the inequality $|D_{i+1}|\geq (\delta-1)|D_i|$ gives $|D_{i+1}|\geq \delta (\delta-1)^i$ by induction.

The result follows: $$|G| \geq 1 + \sum_{i=1}^{r}|D_i| \geq 1 + \delta \sum_{i=0}^{r-1}(\delta-1)^i =n_0(\delta, g),$$ where the first inequality holds because the sets $D_0, \dots, D_r$ are disjoint (there may be further vertices at distance greater than $r$ from $z$).
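As a concrete illustration of the odd case (not part of the exercise), the Petersen graph has minimum degree $3$ and girth $5$, and it attains the bound with exactly $n_0(3,5)=10$ vertices. A self-contained Python sketch, with a BFS-based shortest-cycle routine (all names are mine):

```python
from collections import deque

# Petersen graph: outer 5-cycle 0..4, inner pentagram 5..9, spokes i -- i+5.
adj = {v: set() for v in range(10)}
def add_edge(u, v):
    adj[u].add(v)
    adj[v].add(u)
for i in range(5):
    add_edge(i, (i + 1) % 5)           # outer cycle
    add_edge(5 + i, 5 + (i + 2) % 5)   # inner pentagram
    add_edge(i, i + 5)                 # spokes

def girth(adj):
    """Length of a shortest cycle, found by BFS from every vertex."""
    best = float('inf')
    for s in adj:
        dist, parent = {s: 0}, {s: None}
        q = deque([s])
        while q:
            x = q.popleft()
            for y in adj[x]:
                if y not in dist:
                    dist[y], parent[y] = dist[x] + 1, x
                    q.append(y)
                elif parent[x] != y:
                    # Non-tree edge: it closes a cycle of length
                    # at most dist[x] + dist[y] + 1.
                    best = min(best, dist[x] + dist[y] + 1)
    return best

delta = min(len(adj[v]) for v in adj)    # minimum degree: 3
g = girth(adj)                           # girth: 5, so g = 2r + 1 with r = 2
r = (g - 1) // 2
n0 = 1 + delta * sum((delta - 1) ** i for i in range(r))
assert len(adj) == n0 == 10              # the Petersen graph attains the bound
```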


$g$ is even case

Suppose that $r$ is such that $g=2r$.

Now a single starting vertex no longer gives the right count. So choose two adjacent vertices $z_1, z_2$ and treat the edge $z_1z_2$ as the centre: let $D_0 = \{z_1, z_2\}$, and let $D_i$ be the set of vertices at distance $i$ from $D_0$. Now $|D_0| = 2$ and $|D_1| \geq 2(\delta-1)$, since each of $z_1, z_2$ already spends one edge on $z_1z_2$.

Afterwards a similar argument proves $|D_{i+1}|\geq (\delta -1)|D_i|$, but now, because a potential short cycle may also use the edge $z_1z_2$, $i$ cannot exceed $r-2$: the bound on the cycle length becomes $L\leq 2i+1+2= 2(i+1)+1$, so for $g$ to be surely greater than $L$ we choose $i+1 <r \iff i \leq r-2$, so that $2(i+1)+1\leq 2r - 1< g$.

Combining the above, $|D_{i+1}|\geq 2(\delta-1)^{i+1}$, but only for $i\in[1,r-2]$. So we can write $|D_{i}|\geq 2(\delta-1)^{i}$ for $i\in[2,r-1]$; since also $|D_0|=2(\delta -1)^0$ and $|D_1|\geq 2(\delta-1)^1$, we finally get

$$|G| \geq 2 + \sum_{i=1}^{r-1}|D_i| \geq 2 \sum_{i=0}^{r-1}(\delta-1)^i =n_0(\delta, g)$$
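As a concrete check of the even case, $K_{3,3}$ has minimum degree $3$ and girth $4$ and attains the bound with exactly $n_0(3,4)=6$ vertices (the Heawood graph does the same for girth $6$, with $14$ vertices). A small Python sketch (names are mine):

```python
from itertools import combinations

# K_{3,3}: parts {0, 1, 2} and {3, 4, 5}; every cross pair is an edge.
adj = {v: set() for v in range(6)}
for u in (0, 1, 2):
    for v in (3, 4, 5):
        adj[u].add(v)
        adj[v].add(u)

delta = min(len(adj[v]) for v in adj)        # minimum degree: 3

# K_{3,3} is bipartite, so it has no odd cycles; hence its girth is 4
# as soon as some pair of vertices has two common neighbours.
assert any(len(adj[u] & adj[v]) >= 2 for u, v in combinations(adj, 2))
g = 4

r = g // 2                                    # g = 2r with r = 2
n0 = 2 * sum((delta - 1) ** i for i in range(r))
assert len(adj) == n0 == 6                    # K_{3,3} attains the bound
```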


To recap: In Graph Theory by Diestel, Proposition 1.3.3 uses the radius and maximum degree of a graph $G$ to obtain an upper bound on the number of vertices (the order $|G|$) of $G$. A full proof is given. The reader is then supposed to prove a similar-looking lower bound on $|G|$, using the girth $g$ and the minimum degree $\delta$. The lower bound is given by the expression $n_0(\delta, g)$.

(When Diestel says that this is "not difficult" to do, I think he means that one should not need to invent an entirely new proof method, but can re-use some of the ideas from the proof of Proposition 1.3.3. This is indeed the case.)

We first prove a lower bound which is valid for all $g$, and then improve this bound to obtain the separate expression valid when $g$ is even.

First bound

The general idea is as follows: we start at an arbitrary vertex $v$ of $G$. One step at a time, we walk away from this vertex in all possible directions. For each further step we count how many additional vertices we can be sure that this extra step allows us to reach.

The minimum degree $\delta$ ensures that for most steps (all except the first, and only up to a certain number of steps, see below) each vertex reached lets us reach at least $\delta-1$ new vertices. The "minus one" is there since we are not allowed to go backwards. For the first step there is no backwards, so we have at least $\delta$ different vertices to go to.

The girth $g$ (the length of the shortest cycle) guarantees that as long as we don't walk too far out, we will not double count any vertices. Thus, our expression will indeed be a lower bound. So what is "too far out", exactly? We can not risk double counting any vertex, so just before the risk arises that we might reach and count an already reached vertex, we need to stop! Exactly how many steps out does the risk of double counting start?

Obviously, we cannot go $g$ steps away from our starting vertex, since at that point we risk coming right back to $v$. However, we actually need to stop much earlier, just before $g/2$ steps (which might not be an integer). If we count vertices that are $g/2$ steps or more away from $v$, we run the risk of having walked along one half of a cycle to a vertex $u$ by one path, and along the other half of the same cycle to the same vertex $u$ by another path. That is, we would risk counting $u$ twice.

We are close to the key realization: every unique path of length less than $g/2$ starting at $v$ leads to a unique vertex. We prove this by contradiction. Suppose that we have two different paths, each shorter than $g/2$, which lead from $v$ to $u$. Since the paths differ, they diverge at some point, say at $a$ (which might be the same vertex as $v$), and at some vertex along the way to $u$, say at $b$ (which might or might not be the same as $u$), they reunite. We now have two distinct paths from $a$ to $b$. Put together, these two path segments contain a cycle. Since each of the two segments is strictly shorter than $g/2$, the cycle is strictly shorter than $g$. This is a contradiction.

So, how many unique paths of length strictly less than $g/2$ starting at $v$ are there? The largest integer strictly less than $g/2$ is $\lceil g/2 \rceil -1$. And for each permitted path length $i \ge 1$, we have argued that there are at least $\delta(\delta-1)^{i-1}$ different $i$-paths. Counting $v$ itself as the "zero path", and summing over the permitted lengths, there are at least

$$1 + \delta \sum\limits_{i = 0}^{\lceil g/2 \rceil - 2} (\delta - 1)^i $$

such paths. They all start at $v$, have length less than $g/2$, and, crucially, they each end at a different vertex. Thus, the above expression gives a lower bound for $|G|$. When $g$ is odd, $g=2r+1$, we have $\lceil (2r+1)/2 \rceil - 2 = r-1$, so the expression implies

$$ |G| \geq 1 + \delta \sum\limits_{i = 0}^{r-1} (\delta - 1)^i \,.$$

The right hand side equals our given $n_0(\delta, g)$ when $g=2r+1$. Thus, our bound, which is valid for any $g$, exactly matches what we want for odd $g$.
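The ceiling arithmetic is easy to get wrong, so here is a quick numeric confirmation that for odd $g$ the general bound coincides with $n_0(\delta,g)$ (function names are mine):

```python
import math

def general_bound(delta, g):
    # 1 + delta * sum_{i=0}^{ceil(g/2)-2} (delta-1)^i, valid for every g >= 3
    top = math.ceil(g / 2) - 2
    return 1 + delta * sum((delta - 1) ** i for i in range(top + 1))

def n0_odd(delta, g):
    r = (g - 1) // 2                  # g = 2r + 1
    return 1 + delta * sum((delta - 1) ** i for i in range(r))

# For odd g, ceil(g/2) - 2 = r - 1, so the two expressions agree exactly.
for delta in range(2, 8):
    for g in range(3, 16, 2):
        assert general_bound(delta, g) == n0_odd(delta, g)
```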

Improving the bound when $g$ is even

If $g$ is even, we can be a little clever and improve the bound. Suppose that $g=2r$. We have already shown

$$|G|\geq 1 + \delta \sum\limits_{i = 0}^{r - 2} (\delta - 1)^i\,.$$

Now, instead of starting at the chosen vertex $v$, let us start at any two neighbouring vertices $v_1$ and $v_2$. The key insight now is that every unique path of length $r-1$ or less, starting at either $v_1$ or $v_2$ and not using the edge $v_1v_2$, leads to a unique vertex.

For, like before, suppose that we have two different such paths of length $r-1$ or less, leading to the same vertex $u$. Then the symmetric difference of the edge sets of these paths (all the edges that belong to just one of the paths), possibly with the addition of the edge $v_1v_2$, would contain a cycle. The length of the cycle would be at most $(r-1)+(r-1)+1 = 2r-1 < g$, a contradiction!

Thus, as a lower bound for $|G|$, we obtain a lower bound on the number of paths of length $r-1$ or less which start at either $v_1$ or $v_2$ and avoid the edge $v_1v_2$. For each of the two start vertices, we have one path of length zero; then there are at least $\delta-1$ choices for the first step (the edge $v_1v_2$ being excluded), at least $\delta-1$ for the second, and so on:

$$|G|\geq 2\left(1+ \sum_{i=1}^{r-1}(\delta-1)^i\right) = 2\sum_{i=0}^{r-1}(\delta-1)^i \,.$$

The rhs equals $n_0(\delta, g)$ when $g=2r$, and we are done.

However, we might want to check that the new bound for $g=2r$ is actually stronger than the one we obtained before. Both expressions have $r$ terms. The first term of the new bound is $2$, versus $1$ before. For the later terms, comparing $2(\delta-1)^i$ with $\delta(\delta-1)^{i-1}$ boils down to $2(\delta-1) \geq \delta$, which holds for every $\delta \geq 2$ and is strict when $\delta \geq 3$. So for every $\delta \geq 2$ the new bound is strictly larger, thanks to the first term.
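The comparison can also be verified numerically. The sketch below (function names are mine) checks that for even girth the improved bound is strictly larger for every $\delta \geq 2$:

```python
import math

def odd_form_bound(delta, g):
    # The bound from the "First bound" section, valid for every g >= 3:
    # 1 + delta * sum_{i=0}^{ceil(g/2)-2} (delta-1)^i
    top = math.ceil(g / 2) - 2
    return 1 + delta * sum((delta - 1) ** i for i in range(top + 1))

def even_bound(delta, g):
    # The improved bound for even g = 2r: 2 * sum_{i=0}^{r-1} (delta-1)^i
    r = g // 2
    return 2 * sum((delta - 1) ** i for i in range(r))

# The first term is 2 vs 1, and for the later terms
# 2*(delta-1)**i >= delta*(delta-1)**(i-1) holds for every delta >= 2,
# so the improved bound always wins strictly for even girth.
for delta in range(2, 8):
    for r in range(2, 8):
        assert even_bound(delta, 2 * r) > odd_form_bound(delta, 2 * r)
```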