Suppose that $y_t$ follows an $ARIMA(0,1,1)$ model $y_t=y_{t-1}+\epsilon_t+\theta\epsilon_{t-1}$. Show that $$y_t=\sum_{j=1}^\infty(1+\theta)(-\theta)^{j-1}y_{t-j}+\epsilon_t,$$ where $\epsilon_t$ is white noise with $E[\epsilon_t]=0$ and $Var(\epsilon_t)=\sigma^2$.
Let $x_t=\epsilon_t+\theta\epsilon_{t-1}$ be an $MA(1)$ process, so that $$y_t=y_{t-1}+x_t,$$ where $x_t=(1+\theta B)\epsilon_t$ and $B$ is the lag operator. Assuming $|\theta|<1$, the process is invertible and we can write $x_t$ as $$x_t=-\sum_{j=1}^\infty (-\theta)^jx_{t-j}+\epsilon_t.$$ The problem is that when I substitute $x_t=y_t-y_{t-1}$ I don't get the result.
You have
\begin{equation} y_t=y_{t-1}+\epsilon_{t}+\theta\epsilon_{t-1} \end{equation}
As well as
\begin{equation} \epsilon_{t-j}=y_{t-j}-y_{t-j-1}-\theta\epsilon_{t-j-1}, \ j=1,2,... \end{equation}
Set $j=1$ and substitute $\epsilon_{t-1}$ in the first equation with $\epsilon_{t-1}=y_{t-1}-y_{t-2}-\theta\epsilon_{t-2}$. Then set $j=2$ and similarly substitute $\epsilon_{t-2}$. Keep iterating.
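To make the pattern explicit, the first two substitutions give
\begin{align}
y_t&=y_{t-1}+\epsilon_t+\theta\left(y_{t-1}-y_{t-2}-\theta\epsilon_{t-2}\right)\\
&=(1+\theta)y_{t-1}-\theta y_{t-2}+\epsilon_t-\theta^2\epsilon_{t-2}\\
&=(1+\theta)y_{t-1}+(1+\theta)(-\theta)y_{t-2}+(-\theta)^2y_{t-3}+\epsilon_t+\theta^3\epsilon_{t-3},
\end{align}
and, by induction, after $k$ substitutions
\begin{equation}
y_t=\sum_{j=1}^{k}(1+\theta)(-\theta)^{j-1}y_{t-j}+(-\theta)^k y_{t-k-1}+\epsilon_t-(-\theta)^{k+1}\epsilon_{t-k-1}.
\end{equation}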
After $k$ substitutions the only terms besides $\epsilon_t$ and the weighted sum of $y_{t-1},\dots,y_{t-k}$ are a remainder of order $(-\theta)^k$; assuming $|\theta|<1$, this remainder vanishes as $k\to\infty$, and $y_t$ converges to the desired expression.
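As a quick numerical sanity check (a minimal sketch; the choices $\theta=0.6$, $n=5000$, and truncation order $k=60$ are illustrative, not from the question), one can simulate the model and compare $y_t$ with the truncated AR$(\infty)$ sum:

```python
import numpy as np

# Simulate y_t = y_{t-1} + eps_t + theta * eps_{t-1}  (ARIMA(0,1,1))
rng = np.random.default_rng(0)
theta, sigma = 0.6, 1.0  # |theta| < 1 so the MA part is invertible
n = 5000
eps = rng.normal(0.0, sigma, n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = y[t - 1] + eps[t] + theta * eps[t - 1]

# Truncated AR(infinity) representation:
# y_t ~= sum_{j=1}^{k} (1+theta)(-theta)^(j-1) y_{t-j} + eps_t
k = 60  # (-theta)^k is negligible for |theta| < 1
t = n - 1
coeffs = (1 + theta) * (-theta) ** np.arange(k)  # j = 1, ..., k
approx = coeffs @ y[t - 1 : t - 1 - k : -1] + eps[t]
print(abs(y[t] - approx))  # truncation error, should be tiny
```

The discrepancy is exactly the remainder of order $(-\theta)^k$, so it shrinks geometrically in $k$.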