Also known as CMU 15-455, Spring 2017, Homework 2.4.
Before I ask the main questions, let me first give a sketch of my idea. First, recall the definition of big-$O$ and the time complexity class $TIME(t(n))$.
Definition: Let $f$ and $g$ be functions $f, g : \mathbb{N} \rightarrow \mathbb{R}^+$. Say that $f(n) = O(g(n))$ if $\lim_{n \to \infty}\dfrac{f(n)}{g(n)} = 0$. In other words, $f(n) = O(g(n))$ means that for any real number $c > 0$, a number $n_0$ exists, where $f(n) \leq cg(n)$ for all $n \geq n_0$.
Definition: Let $t: \mathbb{N} \rightarrow \mathbb{R}^+$ be a function. Define the time complexity class, $TIME(t(n))$, to be the collection of all languages that are decidable by an $O(t(n))$ time Turing machine.
By definition, $TIME(\sqrt{n})$ and $TIME(1)$ are the collections of all languages that are decidable by an $O(\sqrt{n})$ and an $O(1)$ time TM, respectively. My approach is the following:
Proposition: $TIME(1)$ is the collection of all languages that are decidable by an $O(\sqrt{n})$ time Turing machine.
If the above proposition is true, then $TIME(\sqrt{n}) = TIME(1)$ follows immediately, since $TIME(\sqrt{n})$ is by definition the collection of all languages decidable by an $O(\sqrt{n})$ time TM. If a TM runs in some constant time $c$, we represent its runtime as $O(1)$. Take $c = 1$ as an example: obviously $1 = O(1)$. We wish to show that $1 = O(\sqrt{n})$ as well.
By the definition of big-$O$, we have to prove that $\lim_{n\to\infty} \dfrac{1}{\sqrt{n}} = 0$, which is true. As such, $1$ = $O(1)$ = $O(\sqrt{n})$. As $TIME(1)$ is the collection of all languages that are decidable by an $O(1)$ time TM, and $O(1)$ = $O(\sqrt{n})$, the above proposition is proved. Hence $TIME(\sqrt{n})$ = $TIME(1)$.
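The limit claim above can be sanity-checked numerically (this is just an illustration of the limit-style definition quoted earlier, not part of the proof; the helper name `ratio` is my own): $1/\sqrt{n}$ drops below any fixed $c > 0$ once $n \geq 1/c^2$.

```python
import math

def ratio(n):
    """f(n)/g(n) with f(n) = 1 and g(n) = sqrt(n)."""
    return 1 / math.sqrt(n)

# For any c > 0 the ratio eventually stays below c:
# 1/sqrt(n) <= c  as soon as  n >= 1/c^2,
# consistent with lim_{n -> inf} 1/sqrt(n) = 0.
for c in (0.5, 0.1, 0.01):
    n0 = math.ceil(1 / c**2)
    assert all(ratio(n) <= c for n in range(n0, n0 + 1000))
```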
The proof looks sound to me. However, upon further inspection, I have some problems with it:
The reverse approach, showing that $TIME(\sqrt{n})$ is the collection of all languages decidable by an $O(1)$ time TM, does not seem to work. If I understand the definition of big-$O$ correctly, are we trying to assert that $\sqrt{n} = O(\sqrt{n}) = O(1)$? Then $\sqrt{n} = 1$, which is not true. What did I get wrong?
How can we show that $1 = O(1)$ with the definition of big-$O$ above? $\lim_{n\to\infty} \dfrac{1}{1} = 1 \neq 0$. And if we use the second version of the definition, what if $0 < c < 1$? Then $1 \leq c \cdot 1$ fails.
Those are my main questions. In case my approach is wrong, how would you approach it?
(I have asked this question on the Computer Science Stack Exchange, but no one has given an answer yet. I figure asking here may get more responses, since the question is also math-oriented.)
$f(n)=O(g(n))$ is a dangerous notation. It isn't what it looks like. It really means $f(n)\in O(g(n)),$ that is, $f(n)$ is among the functions that belong to the class $O(g(n))$ (which is defined a little differently than what you wrote). It doesn't even follow the most basic properties of the $=$ relation, such as that if $a=b$ then $b=a,$ but people use this notation anyway because that's how they learned it.
It is true that if $g(n)<h(n)$ for all positive $n$ then any function that is in $O(g(n))$ is in $O(h(n))$ -- but the converse is not true. You found that any member of $TIME(1)$ is also a member of $TIME(\sqrt n)$, but as you noticed, that is not the same thing as showing that any member of $TIME(\sqrt n)$ is a member of $TIME(1)$.
If you could prove both directions you would prove that the classes are equal, but if you can show the existence of even one language that is in $TIME(\sqrt n)$ and not in $TIME(1)$ you would prove that the classes are not equal.
Some of your confusion is due to incorrect statements of definitions. The first "big-O" definition you gave is actually a definition of "little-O", $f(n) \in o(g(n)).$ In the second definition (which is almost the "big-O" definition), there just needs to exist some $c$ for which the rest is true; you do not need the rest of the statement to be true for every positive $c.$
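To make the corrected definition concrete, here is a small numeric check (my own sketch, not part of the proof): with the standard big-O definition, $f \in O(g)$ as soon as *one* pair $c, n_0$ works. Taking $c = 1$ witnesses $1 \in O(1)$, while no fixed $c$ can bound $\sqrt{n}$, so $\sqrt{n} \notin O(1)$.

```python
import math

# Standard big-O: f in O(g) iff there EXIST c > 0 and n0
# with f(n) <= c * g(n) for all n >= n0 (one witness c suffices,
# unlike the "for every c" reading in the question).

# 1 is in O(1): the single witness c = 1, n0 = 1 works.
assert all(1 <= 1 * 1 for n in range(1, 10_000))

# sqrt(n) is NOT in O(1): for any candidate c,
# sqrt(n) > c * 1 once n > c^2, so no witness exists.
for c in (1, 10, 100):
    n = c * c + 1
    assert math.sqrt(n) > c
```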