Suppose that $f(t)=0$ for all $|t|\geq a>0$ and that $g(t)=0$ for all $|t|\geq b>0$. Show that $f*g(t)=0$ for all $|t|\geq a+b$.
Without loss of generality, assume $b\leq a$. Then $$f*g(t)=\int_{-\infty}^\infty f(t-x)g(x)dx=\int_{-b}^bf(t-x)g(x)dx$$ which is $0$ if $|t|\geq b$.
Is this the correct way of going about this?
You're on the right track, but there's an error in the last step. Note that $f(x) = 0$ whenever $|x|\ge a$, so $f(t-x)$ can only be nonzero if $|t-x| < a$. This is the same as $-a < t-x < a$, or $x-a < t < x+a$. Since $x$ ranges over $(-b,b)$, the integrand $f(t-x)g(x)$ can only be nonzero if $-a-b < t < a+b$. Outside that range the integrand vanishes identically, so $f*g(t) = 0$ whenever $|t| \ge a+b$.
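If it helps to see the support bound numerically, here is a quick sketch using NumPy's discrete convolution as an approximation of the integral. The specific choices $a = 1$, $b = 0.5$ and the indicator functions for $f$ and $g$ are illustrative assumptions, not part of the problem:

```python
import numpy as np

# Illustrative example: f supported on (-a, a), g supported on (-b, b),
# with a = 1 and b = 0.5 chosen arbitrarily.
a, b = 1.0, 0.5
dx = 0.001
t = np.arange(-4, 4, dx)

f = np.where(np.abs(t) < a, 1.0, 0.0)  # indicator of (-a, a)
g = np.where(np.abs(t) < b, 1.0, 0.0)  # indicator of (-b, b)

# Riemann-sum approximation of (f*g)(t); mode="same" keeps the
# result aligned with the grid t.
conv = np.convolve(f, g, mode="same") * dx

# The convolution should vanish (up to grid effects) for |t| >= a + b.
outside = np.abs(t) >= a + b + 0.01  # small margin for discretization
print(np.max(np.abs(conv[outside])))
```

The printed maximum is zero up to discretization error, matching the claim that $f*g$ is supported inside $[-(a+b),\,a+b]$.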