Question about Minkowski dimension


I'm learning Minkowski and Hausdorff dimensions in order to study Brownian motion, and I'm trying to understand why the Minkowski dimension of the set $\{0, 1, 1/2, 1/3, \ldots\}$ is $1/2$. I am not certain that my understanding of the proof of this statement is correct, so I wanted to verify it here.

So to begin, let us assume that $\epsilon < 1/n^2$. Then we split the set into:

$$\{1, 1/2, 1/3, 1/4, \ldots, 1/n\}\cup \{0, 1/(n+1), 1/(n+2),\ldots\} = A\cup B$$

We know that there must exist precisely $n$ disjoint $\epsilon$-balls covering $A$. Since the elements of $B$ are closer together than the elements of $A$, at most another $n$ $\epsilon$-balls are needed to cover $B$. Then the total number of $\epsilon$-balls (denote this value by $M(E,\epsilon)$) satisfies $M(E,\epsilon)=\mathcal{O}(n)$. Minkowski dimension is defined to be: $$\text{Dim}(E) = \lim_{\epsilon\downarrow 0}\frac{\log(M(E, \epsilon))}{\log(1/\epsilon)}$$

Since $M(E,\epsilon) = \mathcal{O}(n)$, and $\epsilon < 1/n^2$, this implies that $n<\sqrt{1/\epsilon}$, meaning that $M(E,\epsilon) = \mathcal{O}(1/\sqrt{\epsilon})$. From here we can take the limit: $$\text{Dim}(E) = \lim_{\epsilon\downarrow 0}\frac{\log(\mathcal{O}(1/\sqrt{\epsilon}))}{\log(1/\epsilon)}$$

From here, my knowledge that the answer is $1/2$ suggests that I can somehow eliminate the big-O completely and evaluate the limit of $\frac{\log(1/\sqrt{\epsilon})}{\log(1/\epsilon)}$, which is $1/2$. Why can we get rid of the big-O like this? Or is there some flaw in my earlier reasoning?

On BEST ANSWER

The problem with big-O notation is that it gives you only an upper bound. Recall the definition:

Definition: The notation $f(x) = \mathcal{O}(g(x))$ (where $g$ is a positive function) as $x \to a$ means that there exists some constant $M$ and some $\delta > 0$ such that $$ 0 < |x-a| < \delta \implies |f(x)| < M g(x).$$
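For a concrete illustration of the problem: both $f_1(\varepsilon) = 1/\sqrt{\varepsilon}$ and $f_2(\varepsilon) = \log(1/\varepsilon)$ are $\mathcal{O}(1/\sqrt{\varepsilon})$ as $\varepsilon \downarrow 0$, yet $$ \lim_{\varepsilon\downarrow 0} \frac{\log f_1(\varepsilon)}{\log(1/\varepsilon)} = \frac{1}{2}, \qquad \lim_{\varepsilon\downarrow 0} \frac{\log f_2(\varepsilon)}{\log(1/\varepsilon)} = 0. $$ So a covering count that is merely $\mathcal{O}(1/\sqrt{\varepsilon})$ is consistent with any dimension between $0$ and $1/2$.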

In other words, if $x$ is very close to $a$, then $f(x)$ is "dominated by" $g(x)$. This gives only an upper bound for $|f(x)|$, not a precise estimate. In the context of the question asked, the best that can be done is to show that there are $M$ and $\delta$ such that $0 < \varepsilon < \delta$ implies $$ \frac{\log(\mathcal{O}(1/\sqrt{\varepsilon}))}{\log(1/\varepsilon)} \le \frac{\log(M/\sqrt{\varepsilon})}{\log(1/\varepsilon)} = \frac{\log(M) - \frac{1}{2} \log(\varepsilon)}{-\log(\varepsilon)} = \frac{1}{2}- \frac{\log(M)}{\log(\varepsilon)}.$$ As $\varepsilon$ goes to zero, $\log(\varepsilon)$ goes to $-\infty$, and so the last term vanishes. It then follows that $$ \operatorname{Dim}(E) = \lim_{\varepsilon\downarrow 0} \frac{\log(M(E,\varepsilon))}{\log(1/\varepsilon)} \le \lim_{\varepsilon\downarrow 0} \left[ \frac{1}{2} - \frac{\log(M)}{\log(\varepsilon)} \right] =\frac{1}{2}, $$ assuming that the limit exists.

In order to complete the proof that $\operatorname{Dim}(E) = 1/2$, it is necessary to also obtain a lower bound. Alternatively, obtain a more explicit count of the number of $\varepsilon$-balls needed to cover the set.


In this case, why not just use explicit upper and lower bounds? Fix $\varepsilon > 0$ and choose $n$ so that $$\varepsilon = \frac{1}{n^2}. $$ Note that consecutive points $\frac{1}{n-1}$ and $\frac{1}{n}$ are only $\frac{1}{n(n-1)} < 2\varepsilon$ apart, so a single $\varepsilon$-ball can cover both; for a separated set, restrict to $$ \left\{ 1, \frac{1}{2}, \frac{1}{3}, \dotsc, \frac{1}{m} \right\}, \qquad m = \left\lceil \frac{n}{2} \right\rceil. $$ Any two points of this set are at distance at least $\frac{1}{m(m-1)} \ge \frac{4}{n^2 - 1} > 2\varepsilon$ apart, which exceeds the diameter of an $\varepsilon$-ball, so no single $\varepsilon$-ball can cover two of them; hence at least $m \ge \frac{n}{2}$ $\varepsilon$-balls are needed to cover this set. For the upper bound, each of the $n$ points $1, \frac{1}{2}, \dotsc, \frac{1}{n}$ can be covered by its own ball, and the rest of $E$ is contained in the interval $[0,\frac{1}{n}]$, which can be covered by fewer than $n$ balls of radius $\varepsilon$ (about $\frac{n}{2} + 1$ open balls suffice, but this more precise estimate is a little more tedious to work with, so the sloppier estimate is fine).

This implies that it takes at least $\frac{n}{2}$, and at most $2n$, balls to cover all of $E$. That is, $$ \frac{n}{2} \le M(E,\varepsilon) \le 2n. $$ Thus, since $\varepsilon = \frac{1}{n^2}$ (and so $n = \varepsilon^{-1/2}$) $$ -\frac{1}{2} \log(\varepsilon) - \log(2) = \log\left(\frac{n}{2}\right) \le \log(M(E,\varepsilon)) \le \log\left(2n \right) = -\frac{1}{2}\log(\varepsilon) + \log(2). $$ The Minkowski–Bouligand dimension can then be obtained via the squeeze theorem: $$ \frac{1}{2} = \lim_{\varepsilon\downarrow 0} \frac{-\frac{1}{2} \log(\varepsilon) - \log(2)}{\log(1/\varepsilon)} \le \underbrace{\lim_{\varepsilon\downarrow 0} \frac{\log(M(E,\varepsilon))}{\log(1/\varepsilon)}}_{= \operatorname{Dim}(E)} \le \lim_{\varepsilon\downarrow 0} \frac{-\frac{1}{2}\log(\varepsilon) + \log(2)}{\log(1/\varepsilon)} = \frac{1}{2}. $$
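As a numerical sanity check (not part of the proof), one can count how many radius-$\varepsilon$ balls a minimal cover uses for $\varepsilon = 1/n^2$ and watch $\log(M)/\log(1/\varepsilon)$ drift toward $1/2$. A minimal sketch in Python, using the standard leftmost-point greedy cover (which is optimal for intervals of fixed length on a line); truncating $E$ at $1/n^2$ is harmless because every point below $2\varepsilon$ lands in the first ball anyway:

```python
import math

def cover_count(n):
    """Minimal number of closed balls of radius eps = 1/n^2 needed to
    cover {0} ∪ {1/k : k = 1, ..., n^2} (a truncation of E)."""
    eps = 1.0 / n**2
    pts = sorted({0.0} | {1.0 / k for k in range(1, n * n + 1)})
    count, reach = 0, -1.0
    for p in pts:
        if p > reach:            # point not yet covered: open a new ball
            count += 1
            reach = p + 2 * eps  # a radius-eps ball covers [p, p + 2*eps]
    return count

for n in (10, 100, 1000):
    eps = 1.0 / n**2
    M = cover_count(n)
    ratio = math.log(M) / math.log(1.0 / eps)
    print(f"n={n:5d}  n/2={n // 2:5d}  M={M:5d}  2n={2 * n:5d}  "
          f"log(M)/log(1/eps) = {ratio:.3f}")
```

The printed counts stay within the proven bounds $\frac{n}{2} \le M(E,\varepsilon) \le 2n$, and the ratio decreases toward $1/2$ (the convergence is slow because of the $\log(2)$-type error terms, which die off only like $1/\log n$).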