Recently I’ve been learning about different running times for algorithms, such as $\mathscr{O}(n)$ for linear search, $\mathscr{O}(\log n)$ for binary search, $\mathscr{O}(n^2)$ for selection sort and bubble sort, and $\mathscr{O}(n \log n)$ for merge sort, to name a few examples.
However, I was wondering if there are any nontrivial computational problems (such as searching a list for a particular item, sorting a list of items, etc.) that can be solved in constant time or $\mathscr{O}(1)$ (yes, I realize that “constant time” is not strictly speaking the same as $\mathscr{O}(1)$).
So here is my question: does anyone know of any problems that can be solved by an $\mathscr{O}(1)$ algorithm, for which it’s not obvious that the problem is solvable in a running time independent of the size of the data set?
Some data structures provide $\mathcal{O}(1)$-time probing. For instance, it is well known that a static dictionary holding $n$ arbitrary elements can be built using $\mathcal{O}(n)$ memory while supporting constant-time membership queries. Cuckoo hashing achieves the same bounds, and additionally supports insertions and deletions.
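To make the constant-time guarantee concrete, here is a minimal sketch of a cuckoo hash set in Python. It is illustrative only (the class name, table sizes, and hash choices are my own, and a real implementation would rehash with fresh hash functions when insertion fails): each key can live in exactly one of two fixed slots, so a membership test probes at most two locations regardless of $n$.

```python
class CuckooSet:
    """Toy cuckoo hash set: lookups probe at most two slots, hence O(1)."""

    def __init__(self, capacity=16):
        self.cap = capacity
        self.t1 = [None] * capacity  # table for hash function h1
        self.t2 = [None] * capacity  # table for hash function h2

    def _h1(self, x):
        # Two "independent-ish" hash functions, derived by salting the key.
        return hash((x, 1)) % self.cap

    def _h2(self, x):
        return hash((x, 2)) % self.cap

    def __contains__(self, x):
        # Constant time: only two possible homes for any key.
        return self.t1[self._h1(x)] == x or self.t2[self._h2(x)] == x

    def add(self, x, max_kicks=32):
        if x in self:
            return
        # Insert by evicting ("kicking") whoever occupies the target slot,
        # then re-inserting the evicted key into its other table.
        for _ in range(max_kicks):
            i = self._h1(x)
            x, self.t1[i] = self.t1[i], x
            if x is None:
                return
            j = self._h2(x)
            x, self.t2[j] = self.t2[j], x
            if x is None:
                return
        # A real implementation would pick new hash functions and rehash here.
        raise RuntimeError("insertion failed; rehash needed")


s = CuckooSet()
for word in ["apple", "banana", "cherry"]:
    s.add(word)
print("apple" in s, "durian" in s)
```

Note that the $\mathcal{O}(1)$ bound here is on lookups; insertions are only expected constant time, since the eviction loop can occasionally run long or trigger a rehash.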