Assume we have a problem, for example, "finding the best element in a sequence". Also, let's pretend that we do not have an algorithm for it yet.
Is it possible to say, theoretically, what the most efficient algorithm would be, without actually developing one? For example, I could argue that for the problem above I would definitely need to iterate through the whole sequence once and do something with each element, so the best possible algorithm for this problem would be $O(n)$ and never better.
Are there any branches of mathematics that investigate these kinds of questions?
Absolutely: this is the subject of computational complexity theory, in particular the study of lower bounds.
In the given example, if all elements are independent (in the sense that knowing the "bestness" of one element tells you nothing about the bestness of the others), you cannot avoid inspecting every element, and you write that the complexity is $\Omega(n)$ (i.e. at least linear in the number of elements).
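For concreteness, here is a minimal sketch of an algorithm that matches this lower bound (the names `best_element` and `better` are my own for illustration): it inspects each element exactly once, so it runs in $O(n)$, which together with the $\Omega(n)$ bound makes it optimal.

```python
def best_element(seq, better):
    """Return the 'best' element of a non-empty sequence.

    `better(a, b)` is a caller-supplied comparison that returns True
    when a is better than b. Every element is inspected exactly once,
    so the running time is O(n), matching the Omega(n) lower bound.
    """
    it = iter(seq)
    best = next(it)          # raises StopIteration on an empty sequence
    for x in it:
        if better(x, best):
            best = x
    return best

# Example: "best" means "largest".
print(best_element([3, 1, 4, 1, 5, 9, 2, 6], lambda a, b: a > b))  # → 9
```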
Caution: this is just a lower bound, not a proof that the problem can actually be solved in time $O(n)$. To establish that, you either need to exhibit an algorithm with that complexity, or find a non-constructive argument showing that such an algorithm exists.
For example, a combinatorial argument shows that sorting $n$ elements by means of comparisons requires $\Omega(n\log n)$ comparisons. By itself, this does not prove that the bound can be reached. However, we have HeapSort, which is known to run in worst-case $O(n\log n)$, and that settles the question.
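One can even observe the bound empirically. The sketch below (an instrumentation of my own, not any library API) sorts with Python's `heapq` heap while counting comparisons through a wrapper class; for random input the count stays on the order of $n\log n$.

```python
import heapq
import random

class Counted:
    """Wrapper that counts every comparison made on its values."""
    comparisons = 0

    def __init__(self, v):
        self.v = v

    def __lt__(self, other):
        Counted.comparisons += 1
        return self.v < other.v

def heap_sort(items):
    """Heap-based sort: O(n) heapify followed by n pops of O(log n) each."""
    heap = [Counted(x) for x in items]
    heapq.heapify(heap)
    return [heapq.heappop(heap).v for _ in range(len(items))]

random.seed(0)                      # fixed seed for reproducibility
n = 1024
data = random.sample(range(10**6), n)
result = heap_sort(data)
assert result == sorted(data)
print(Counted.comparisons, "comparisons for n =", n)
```

The exact count depends on the input, but it grows like $n\log n$ rather than $n^2$, in line with the lower bound.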
For another example, it can be shown that deciding whether all elements of a list are distinct takes $\Omega(n\log n)$ comparisons. But if the list is sorted, this can be checked in linear time $O(n)$ by comparing adjacent elements. So, virtually without writing the algorithm, we deduce that $O(n\log n)$ is achievable for an unsorted list: sort first in $O(n\log n)$, then scan in $O(n)$.
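The sort-then-scan argument above translates directly into code (a minimal sketch; the name `all_distinct` is my own):

```python
def all_distinct(seq):
    """Decide whether all elements of seq are pairwise distinct.

    Sorting costs O(n log n); the adjacent-pair scan afterwards is O(n),
    so the whole test runs in O(n log n), matching the comparison-based
    lower bound. On input that is already sorted, only the O(n) scan
    would be needed.
    """
    s = sorted(seq)                               # O(n log n)
    return all(a != b for a, b in zip(s, s[1:]))  # O(n) adjacent scan

print(all_distinct([4, 1, 3, 2]))  # → True
print(all_distinct([4, 1, 3, 1]))  # → False
```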
Final remark:
The "best" algorithm for a given task usually does not exist, because choosing an algorithm is a multicriteria decision (not a total ordering): competing algorithms are typically better on some use cases and worse on others.