Prove that if $d$ is a common divisor of two integers $a$ and $b$, then $d=\gcd(a,b)$ if and only if $\gcd(a/d,b/d)=1$.
So far I used what was given: $a=dk$ and $b=dl$, and since $\gcd(a,b)=d$, Bézout's identity gives integers $x,y$ with $ax+by=d$. But I am unsure how to use this information.
Where do I go from here? Can someone show me how to solve this using Bézout's identity, if possible?
Comment: Dividing $ax+by=d$ by $d$ gives $\frac{a}{d}x+\frac{b}{d}y=1$, but there is more to be said. Yes, since $1$ is the smallest positive integer expressible as such a linear combination, Bézout's identity implies that the gcd is $1$; but a better way, I think, is to proceed by contradiction.
So, you have: $\frac{a}{d}x+\frac{b}{d}y=1$, where $d=\gcd(a,b)$ and $x,y$ come from Bézout's identity $ax+by=d$.
Suppose $\gcd(\frac{a}{d},\frac{b}{d})=e$. We will show that $e=1$.
Here: $e\mid\frac{a}{d}$ and $e\mid\frac{b}{d}$, so $\frac{a}{d}=em$ and $\frac{b}{d}=en$ for some $m,n \in \mathbb{Z}$.
Thus $a=dem$ and $b=den$, so $de$ is a common divisor of $a$ and $b$. If $e>1$, then $de>d$, contradicting the assumption that $d=\gcd(a,b)$ is the *greatest* common divisor. Hence $e=1$.