I have three random variables, $X, Y$ and $Z$ that are related as follows.
$$X = Y + Z$$
$X \sim Z$ (i.e. $X$ and $Z$ have the same distribution), whereas $Y$ and $Z$ are independent and $Y$ is standard normal. Moreover, $X \geq 0$ a.s.
I want to show that $X = \infty$ a.s. (which is obviously equivalent to $Z = \infty$ a.s.).
Intuitively, if adding a normal r.v. to another r.v. does not change its distribution, then that r.v. cannot be finite with positive probability. But I want to make this formal.
From the above relations I can deduce that the characteristic function of $X$ is zero everywhere except possibly at zero. I don't see where to go from here; I tried the inversion formula as well.
Your idea of working with the characteristic function is good. However, it is not possible that the characteristic function of a random variable is constantly zero, since every characteristic function $\varphi$ has to satisfy $\varphi(0) = 1$ (see Wikipedia). Your remark that the characteristic function is zero everywhere except at one point is misleading, since a characteristic function is always uniformly continuous; hence, if it is zero everywhere except at one point, then it is constantly zero.
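For completeness, the uniform continuity follows from a standard estimate; here is a quick sketch:

```latex
% Uniform continuity of \varphi(t) = \mathbf{E}[e^{itX}]:
% for all t, h \in \mathbb{R},
\begin{align*}
|\varphi(t+h) - \varphi(t)|
  &= \left| \mathbf{E}\!\left[ e^{itX}\bigl(e^{ihX} - 1\bigr) \right] \right| \\
  &\le \mathbf{E}\bigl[\, |e^{ihX} - 1| \,\bigr].
\end{align*}
% The bound does not depend on t, and it tends to 0 as h \to 0
% by dominated convergence (note |e^{ihX} - 1| \le 2).
```

In particular, continuity at any nonzero point forces the value at $0$ to be the limit of the (zero) values nearby, which is the fact used below.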
The reason for this problem lies, in my opinion, in the fact that you are trying to compute the characteristic function of a random variable that takes values in $\mathbb{R} \cup \{ +\infty, -\infty\}$ and not only in $\mathbb{R}$. I'm not sure that is even possible: what should $e^{i \cdot \infty}$ be?
In fact, you can circumvent this issue as follows: Set $$ S := \{ \omega \in \Omega: |Z(\omega)| < \infty \} = \{ \omega \in \Omega: |X(\omega)| < \infty \} \subseteq \Omega $$ and assume that $\mathbf{P}(S) \neq 0$. Now consider the probability space $(S, \mathcal{E}, \mathbf{P}')$, where $\mathcal{E} = \mathcal{F} \cap S := \{A \cap S : A \in \mathcal{F}\}$ is the trace $\sigma$-algebra and $\mathbf{P}' := \frac{1}{\mathbf{P}(S)} \cdot \mathbf{P} \upharpoonright \mathcal{E}$. (Here I use that $X, Y$ and $Z$ are defined on the probability space $(\Omega, \mathcal{F}, \mathbf{P})$.) Then let $$X' := X \upharpoonright S, \quad Y' := Y \upharpoonright S \quad \text{ and } \quad Z' := Z \upharpoonright S$$ be the restrictions of $X, Y, Z$ to $S$.
On the space $(S, \mathcal{E}, \mathbf{P}')$ we still have $$ X' = Y' + Z', \quad X' \sim Z' \quad \text{ and } \quad Y' \sim \mathcal{N}(0,1) \qquad (*) $$ with the advantage that the random variables $X', Y', Z'$ take values only in $\mathbb{R}$. Therefore, it is safe to look at their respective characteristic functions, and we obtain $$ \varphi_{Z'} = \varphi_{X'} = \varphi_{Y'} \varphi_{Z'} = e^{-\frac{t^2}{2}} \varphi_{Z'} \quad \implies \quad 0 = \big(1-e^{-\frac{(\cdot)^2}{2}}\big) \varphi_{Z'}. $$ But then $\varphi_{Z'}$ has to be zero everywhere except at $0$, and, by the uniform continuity noted above, $\varphi_{Z'}$ is zero even at zero. This is a contradiction, since no characteristic function can be zero at zero.
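The key identity used here, $\varphi_{X'} = \varphi_{Y'} \varphi_{Z'}$ for an independent sum, is easy to check numerically. Below is a minimal Monte Carlo sketch: $Y$ is standard normal and $Z$ is an arbitrary finite stand-in (an exponential, chosen only for illustration — of course no finite $Z$ can satisfy the fixed-point relation of the actual problem).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Illustrative finite stand-ins: Y ~ N(0,1), Z ~ Exp(1), independent.
Y = rng.standard_normal(n)
Z = rng.exponential(1.0, n)
X = Y + Z

def emp_cf(samples, t):
    """Empirical characteristic function: the sample mean of exp(i*t*W)."""
    return np.mean(np.exp(1j * t * samples))

t = 1.3
lhs = emp_cf(X, t)                 # phi_X(t)
rhs = emp_cf(Y, t) * emp_cf(Z, t)  # phi_Y(t) * phi_Z(t), valid by independence
print(abs(lhs - rhs))              # small (Monte Carlo error only)

# phi_Y also matches the Gaussian formula exp(-t^2/2):
print(abs(emp_cf(Y, t) - np.exp(-t**2 / 2)))
```

The factor $1 - e^{-t^2/2}$ is strictly positive for every $t \neq 0$, which is why the displayed implication forces $\varphi_{Z'}(t) = 0$ there.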
Therefore, it must be $\mathbf{P}(S) = 0$ and that is (more or less) what we wanted to show.
Note that, in general, $X$ and $Z$ need not be constantly $+\infty$, since they could a priori take both the values $+\infty$ and $-\infty$; it is the additional assumption $X \geq 0$ a.s. that forces $X = Z = +\infty$ a.s.
The first equality in $(*)$ is clear, as it holds for all $\omega \in \Omega$ and thus for all $\omega \in S$ as well. For $X' \sim Z'$, note that for every $B \in \mathcal{B}(\mathbb{R})$ we have $$ \begin{align*} \mathbf{P}'(X' \in B) &= \frac{1}{\mathbf{P}(S)} \cdot \mathbf{P}(\{\omega \in S: X'(\omega) \in B\}) \\ &= \frac{1}{\mathbf{P}(S)} \cdot \mathbf{P}(\{\omega \in \Omega: X(\omega) \in B\}) \\ &= \frac{1}{\mathbf{P}(S)} \cdot \mathbf{P}(\{\omega \in \Omega: Z(\omega) \in B\}) \\ &= \frac{1}{\mathbf{P}(S)} \cdot \mathbf{P}(\{\omega \in S: Z'(\omega) \in B\}) \\ &= \mathbf{P}'(Z' \in B) \end{align*} $$ using that $X \sim Z$ and that $|X(\omega)| = \infty$ (equivalently $|Z(\omega)| = \infty$) holds iff $\omega \not\in S$.
For $Y' \sim \mathcal{N}(0,1)$, have a look at the characteristic function of $Y'$. For every $t \in \mathbb{R}$, $$ \begin{align*} \varphi_{Y'}(t) &= \int_S e^{itY'} \, d\mathbf{P'} \\ &= \int_S e^{itY'} \frac{1}{\mathbf{P}(S)} \, d\mathbf{P} \\ &= \int_S e^{itY} \frac{1}{\mathbf{P}(S)} \, d\mathbf{P} \\ &= \frac{1}{\mathbf{P}(S)}\int_\Omega \mathbb{1}_S e^{itY} \, d\mathbf{P} \\ &= \frac{1}{\mathbf{P}(S)} \int_\Omega \mathbb{1}_S \, d\mathbf{P} \cdot \int_\Omega e^{itY} \, d\mathbf{P} \\ &= \int_\Omega e^{itY} \, d\mathbf{P} = e^{-\frac{t^2}{2}}, \end{align*} $$ where we can split the integral because $S$ is determined by $Z$ alone, so independence of $Y$ and $Z$ yields independence of $\mathbb{1}_S$ and $e^{itY}$.
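This last step — conditioning on an event independent of $Y$ leaves the distribution of $Y$ untouched — can also be illustrated numerically. Here is a small sketch with a stand-in event: since we cannot simulate $\{|Z| < \infty\}$ directly, we use $S = \{Z > 0\}$, which is likewise determined by $Z$ alone and hence independent of $Y$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000

Y = rng.standard_normal(n)   # Y ~ N(0,1)
Z = rng.standard_cauchy(n)   # some r.v. independent of Y (illustrative choice)
S = Z > 0                    # event determined by Z alone, hence independent of Y

t = 0.7
# Empirical characteristic function of Y under the conditional measure P'( . ) = P( . | S):
cf_restricted = np.mean(np.exp(1j * t * Y[S]))

# It should still match the N(0,1) formula exp(-t^2/2), up to Monte Carlo error:
print(abs(cf_restricted - np.exp(-t**2 / 2)))
```

The same computation with any other $Z$-measurable event gives the same match, which is exactly the content of $Y' \sim \mathcal{N}(0,1)$ in $(*)$.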