If $X$ and $Y$ are independent, we would use:
$$ \varphi_{X+Y}(t)=E[e^{it(X+Y)}]=\{\text{independence}\}=E[e^{itX}e^{itY}] = E[e^{itX}]\cdot E[e^{itY}]=\varphi_X(t)\,\varphi_Y(t) $$
But how would I proceed to compute $\varphi_{X+Y}(t)$ if I do not know if $X$ and $Y$ are dependent or not?
I have tried searching for a general answer but cannot seem to find a simple one.
Edit: In the case where the joint distribution of $(X,Y)$ is given, what would be the general approach then?
You can express this in terms of the joint CF $$ \varphi_{X,Y}(t,u)= E(e^{itX +iuY}) $$ as $$\varphi_{X+Y}(t)=E[e^{it(X+Y)}]=E[e^{itX+itY}]=\varphi_{X,Y}(t,t).$$
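Regarding your edit: if the joint distribution of $(X,Y)$ is given explicitly by a density $f_{X,Y}$, the same expectation is just a double integral over the joint law,
$$ \varphi_{X+Y}(t)=\iint e^{it(x+y)}\,f_{X,Y}(x,y)\,dx\,dy, $$
with the integral replaced by a sum over the joint pmf in the discrete case.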
This is the generalization of the product identity you quoted for the independent case. Of course, the difficult part is obtaining the joint CF from the joint distribution; if that were easy, directly computing the expectation you wrote down at the beginning would be easy too. I suppose the same could be said in the case of independence, though.
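As a quick numerical sanity check (a minimal sketch, assuming as the dependent pair a standard bivariate normal with correlation $\rho = 0.5$, so that $X+Y \sim N(0,\, 2+2\rho)$ gives a known closed-form CF to compare against), one can verify by Monte Carlo that evaluating the joint CF on the diagonal reproduces the CF of the sum:

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.5
cov = [[1.0, rho], [rho, 1.0]]

# Draw a dependent pair (X, Y): standard bivariate normal, correlation rho
xy = rng.multivariate_normal([0.0, 0.0], cov, size=200_000)
X, Y = xy[:, 0], xy[:, 1]

def joint_cf(t, u):
    # Monte Carlo estimate of the joint CF E[exp(i t X + i u Y)]
    return np.mean(np.exp(1j * (t * X + u * Y)))

t = 0.7
# CF of the sum, estimated directly from the samples
lhs = np.mean(np.exp(1j * t * (X + Y)))
# Joint CF evaluated on the diagonal (t, t)
rhs = joint_cf(t, t)
# Closed form: X + Y ~ N(0, 2 + 2*rho), so its CF is exp(-(2 + 2*rho) t^2 / 2)
exact = np.exp(-(2 + 2 * rho) * t**2 / 2)

print(abs(lhs - rhs))    # identical up to floating-point error
print(abs(lhs - exact))  # small Monte Carlo error
```

The first difference is essentially zero because the two estimators average the same random quantity, while the second shrinks at the usual $O(1/\sqrt{n})$ Monte Carlo rate.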