I have a function $f(x) = \cos(x)$.
I am given $\lim_{x \to 0} \cos(x) = 1$.
Is it correct to say $f(x)$ is of order $n^2$? What about saying $f(x)$ is $O(n^2)$?
Or should I say $f(x)$ is of order $x^2$, or that $f(x)$ is $O(x^2)$?
I think $O(x^2)$ is correct because the function depends on $x$, but it sounds awkward; I have not seen $O(x^2)$ used elsewhere.
I think $O(n^2)$ is incorrect because $n$ is not defined anywhere (maybe implicitly?).
$f(x) = \cos(x)$ was just an example. If we let $g(x) = x^2$, would it be $O(x^2)$ or $O(n^2)$?
None of this is correct! For a function, Big-O describes how the function grows as its argument approaches infinity or some other value. In your case, $x$ goes to $0$, so:
$$ f(x)= O(1) \text{ as } x \rightarrow 0$$
Interestingly, $\cos(x) = O(1)$ as $x \rightarrow \infty$ as well, even though the function keeps oscillating between $-1$ and $1$ and does not approach anything. This follows from the definition of Big-O: $f(x)$ is $O(g(x))$ as $x \rightarrow \infty$ if there exist constants $C$ and $x_0$ such that $\lvert f(x) \rvert \le C \cdot g(x)$ for all $x \ge x_0$. Choosing $g(x) = 1$ indicates a function that is bounded, i.e. does not grow with $x$, and since $\lvert \cos(x) \rvert \le 1$ for all $x$, the constant $C = 1$ works. We can therefore state $$ \cos(x)= O(1) \text{ as } x \rightarrow \infty$$
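To make the bound concrete, the definition can be checked for $\cos$ with the constants written out explicitly:

```latex
% Definition: f(x) = O(g(x)) as x -> infinity if there exist C, x_0
% with |f(x)| <= C * g(x) for all x >= x_0.
\[
  \lvert \cos(x) \rvert \le 1 = 1 \cdot 1
  \quad \text{for all } x \in \mathbb{R},
\]
% so C = 1 (with any x_0, say x_0 = 0) witnesses
\[
  \cos(x) = O(1) \quad \text{as } x \to \infty.
\]
```

Note that no limit of $\cos(x)$ is needed anywhere; boundedness alone satisfies the definition.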
As for using $x$ versus $n$ in Big-$O$ notation: for mathematical functions, Big-$O$ describes the behavior of the function as its variable approaches some value (possibly $\infty$), so you use the function's own variable, here $x$.
In describing the time complexity of computer algorithms, Big-$O$ is used to describe how the running time grows as the input size grows. Traditionally, the variable $n$ denotes the size of the input, so $n$ is used. For example, the average-case time complexity of QuickSort is $O(n \cdot \log (n))$.
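To illustrate that convention, here is a minimal QuickSort sketch in Python (illustrative only, not tied to any particular textbook version). The $O(n \log n)$ average case comes from the pivot splitting the list roughly in half at each of the $O(\log n)$ recursion levels, with $O(n)$ partitioning work per level; $n$ here is `len(xs)`, the input size.

```python
def quicksort(xs):
    """Three-way partition quicksort; average-case O(n log n) in n = len(xs)."""
    if len(xs) <= 1:
        return xs
    pivot = xs[len(xs) // 2]
    # One O(n) pass splits the list relative to the pivot.
    left = [x for x in xs if x < pivot]
    mid = [x for x in xs if x == pivot]
    right = [x for x in xs if x > pivot]
    # Recurse on the (on average, roughly half-sized) sublists.
    return quicksort(left) + mid + quicksort(right)
```

The worst case (an already-adversarial pivot sequence) degrades to $O(n^2)$, which is why the statement above is qualified as *average case*.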