$$ p(t)=\begin{cases} 0 & t<0\\ 1 & 0<t<1\\ 0 & t>1 \end{cases}$$
I need to compute $$ \int_{-\infty}^{\infty} p(\tau) p(t+\tau) d \tau$$
My thinking is:
First, the integral reduces to
$$ \int_{0}^{1} p(t+\tau) \, d\tau$$
because $p(\tau)$ is nonzero only when $0<\tau<1$, and on that interval $p(\tau)=1$.
Now I substitute $x = t+\tau$ (so $dx = d\tau$) and get $$ \int_{t}^{1+t} p(x) \, dx $$
For $-1<t<0$ I get $$ \int_{0}^{1+t} 1 \, dx = 1+t$$ and for $0<t<1$ I get $$ \int_{t}^{1} 1 \, dx = 1-t $$
So it looks like my solution is linear in $t$ on those ranges. Does that make sense? Did I miss something here?
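As a numerical sanity check, the integral can be approximated on a grid. This is a sketch in Python/NumPy; the helper names `p` and `corr` are my own, and the piecewise-linear values $1+t$ and $1-t$ are the ones derived above.

```python
import numpy as np

def p(x):
    # unit rectangular pulse: 1 on (0, 1), 0 elsewhere
    return np.where((x > 0) & (x < 1), 1.0, 0.0)

def corr(t, n=200001):
    # Riemann-sum approximation of  integral p(tau) * p(t + tau) dtau
    tau = np.linspace(-2.0, 2.0, n)
    dtau = tau[1] - tau[0]
    return float(np.sum(p(tau) * p(t + tau)) * dtau)

# compare the numerical integral with the piecewise-linear expressions
for t in [-0.5, -0.25, 0.25, 0.5]:
    expected = 1 + t if t < 0 else 1 - t
    print(f"t={t:+.2f}  numeric={corr(t):.4f}  expected={expected:.4f}")
```

The printed values agree with $1+t$ on $-1<t<0$ and $1-t$ on $0<t<1$ to within the grid spacing, which supports the calculation.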