$a=bq+r$
I suppose $~~d|a~~\wedge~~d|b~~\Leftrightarrow~~a=dx~~\wedge~~b=dy~~$ for some integers $x, y$.
So
$dx=dyq+r$
$r=d(x-yq)~~\Rightarrow~~d|r$ (since $x-yq$ is an integer)
What I've concluded is $~~d|a~~\wedge~~d|b~~\Rightarrow~~d|r$
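As a quick numeric illustration of this implication (the numbers are chosen arbitrarily for the example): take $a=48$, $b=18$, so $q=2$ and $r=12$. Then $d=6$ divides both $48=6\cdot 8$ and $18=6\cdot 3$, and indeed $r=48-18\cdot 2=6(8-3\cdot 2)=12$, which is divisible by $6$.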
A similar argument shows that $~~s|b~~\wedge~~s|r~~\Rightarrow~~s|a$
But I've stopped here; I still need to prove that if $~~\gcd(a,b)=d~~$ and $~~\gcd(b,r)=s~~$, then $~~d=s$.
Can someone help me? I think I'm almost there.
Yes, you are almost there. Note that you proved that whenever a number divides both $a$ and $b$, it also divides $r$; and that whenever a number divides both $b$ and $r$, it also divides $a$. So you have shown that the common divisors of $a$ and $b$ are exactly the common divisors of $b$ and $r$. Since the two pairs have the same set of common divisors, the greatest elements of those sets coincide: $\gcd(a,b)=\gcd(b,r)$, i.e. $d=s$.
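If it helps to see the identity in action, here is a small numerical sanity check (not a substitute for the proof; the function name `check_gcd_step` is just made up for this sketch):

```python
import math

def check_gcd_step(a: int, b: int) -> bool:
    """Check that gcd(a, b) == gcd(b, r), where a = b*q + r."""
    q, r = divmod(a, b)          # a = b*q + r with 0 <= r < b
    assert a == b * q + r        # the division identity from the post
    return math.gcd(a, b) == math.gcd(b, r)

# Spot-check the identity over many pairs of positive integers.
assert all(check_gcd_step(a, b) for a in range(1, 200) for b in range(1, 200))
```

This is exactly the step that makes the Euclidean algorithm work: replacing the pair $(a, b)$ with $(b, r)$ preserves the gcd.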