Say we have an algorithm known to have a worst-case running time of $\Theta(n^2)$. I have been told it is possible that some input could have $O(n)$ running time, but I am not sure why, since by definition $t(n) = \Theta(n^2) \implies t(n) = O(n^2) \text{ and } t(n) = \Omega(n^2)$, which in turn implies that there exist a constant $c > 0$ and a threshold $n_0$ such that $t(n) \ge c \cdot n^2$ for all $n \ge n_0$.
Am I simply to accept that, because the definitions need only hold beyond some threshold, below it there could exist a 'shorter' input on which the algorithm does run in linear time?
The question is not really about the details of big-omega/big-theta notation. Simply put, the $\Theta(n^2)$ is a guarantee on the worst-case running time of the algorithm; it says nothing specific about the running time on particular inputs (beyond the $O(n^2)$ upper bound).
Response to comment:
Simply because you said "the algorithm is known to have a worst-case running time of $\Theta(n^2)$." You seem to be thinking that the big-theta notation bounds the running times on all possible inputs, when in fact it bounds only the worst case for each input length.
As a very concrete example suppose for each input length $n$, there are only two possible inputs, each with respective running times $n$ and $n^2$. The worst-case running time for inputs of length $n$ is thus $n^2$, so the sequence $1^2, 2^2, 3^2, \ldots$ of worst-case running times is $\Theta(n^2)$. This is not saying that all run times are $\Theta(n^2)$; in this example, the sequence of best-case running times is $\Theta(n)$.
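To make this concrete with a real algorithm (my own illustration, not from the question): insertion sort has worst-case running time $\Theta(n^2)$, yet on an already-sorted input it performs only $\Theta(n)$ work. The sketch below counts comparisons to make the gap visible; the function name and counter are hypothetical instrumentation.

```python
def insertion_sort_comparisons(a):
    """Sort a copy of `a` by insertion sort; return the number of comparisons made."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements right until key's position is found.
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

n = 1000
best = insertion_sort_comparisons(range(n))          # sorted input: n - 1 comparisons
worst = insertion_sort_comparisons(range(n, 0, -1))  # reversed input: n(n-1)/2 comparisons
print(best, worst)
```

For $n = 1000$ the sorted input needs $999$ comparisons while the reversed input needs $499{,}500$, matching the best-case $\Theta(n)$ and worst-case $\Theta(n^2)$ behavior described above.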