I'm not sure if what I'm asking even makes sense, but it's a property of big-O that if $T_1(n) = O(f(n))$ and $T_2(n) = O(g(n))$, then
$T_1(n) + T_2(n) = O(f(n) + g(n))$, or, less formally, it's $O(\max(f(n), g(n)))$. Likewise, $T_1(n) \cdot T_2(n) = O(f(n) \cdot g(n))$.
Do similar properties exist for the inverses of these two operations? I've been told, without explanation, that $T_1(n) - T_2(n) = O(f(n) + g(n))$. Why is that so? Is it because time would need to be spent computing each value before the operation could be performed?
That is true, though not for the reason you suggest: it follows directly from the definition, not from any cost of computing the values. When we say a sequence $T(n) = O(f(n))$, where $f(n)$ is asymptotically positive, we mean that there exist $a > 0$ and $N > 0$ such that for all $n > N$ \begin{align*} |T(n)| \leq a f(n). \end{align*} You can check $T_1(n) - T_2(n)$ against this defining property.
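Spelling out that check (writing $a_1, N_1$ for the constants witnessing $T_1(n) = O(f(n))$ and $a_2, N_2$ for those witnessing $T_2(n) = O(g(n))$), the triangle inequality gives, for all $n > \max(N_1, N_2)$,
\begin{align*}
|T_1(n) - T_2(n)| &\leq |T_1(n)| + |T_2(n)| \\
&\leq a_1 f(n) + a_2 g(n) \\
&\leq (a_1 + a_2)\bigl(f(n) + g(n)\bigr),
\end{align*}
so the definition is satisfied with constants $a = a_1 + a_2$ and $N = \max(N_1, N_2)$, i.e. $T_1(n) - T_2(n) = O(f(n) + g(n))$. Note that subtraction does not let you conclude anything stronger, such as $O(f(n) - g(n))$: the bounds on $T_1$ and $T_2$ need not cancel.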