First question, sorry for poor formatting. This question is from my Calculus 2 class, and I am pretty sure I am supposed to be using the arc length formula for it.
Exact words:
Consider the line segment shown
*(the figure shows a simple right triangle with $y=-mx+b$ along the hypotenuse and the $90$-degree corner sitting at the point $(0,0)$)
a. Use calculus to find its length (show your calculation of the integral involved)
b. Now show that your answer to part a agrees with what you get when you simply use the Pythagorean Theorem (or distance formula)
So I think I got through the first part:
Taking the derivative of $y=-mx+b$ and putting it into the arc length formula, I got $(b-a)\sqrt{1+m^2}$.
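For reference, here is the calculation I did, assuming the limits of integration on $x$ are $a$ and $b$ (those limits might be part of my problem):

$$L=\int_a^b \sqrt{1+\left(\frac{dy}{dx}\right)^2}\,dx=\int_a^b \sqrt{1+(-m)^2}\,dx=(b-a)\sqrt{1+m^2}$$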
The second part is where I am lost... I feel like I am missing something pretty easy, but I don't understand how I would get the Pythagorean theorem or distance formula to agree with my answer. My answer to part a might just be wrong too, but I am not sure how else to do it.
There are a lot more parts to the problem; if I can get past this one, hopefully it will show me how to approach the others.
Appreciate any help I can get, thanks!

Hint: when $x=0$, $y=b$, and when $y=0$, $x=\frac{b}{m}$. Now apply the Pythagorean theorem to the right-angled triangle constructed from the points $(0,0)$, $(0,b)$, and $(b/m, 0)$.
But this will give you an answer different from the one you wrote above. Also, you haven't mentioned anything about $a$.
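To spell out the step the hint points to (assuming the segment runs between the intercepts $(0,b)$ and $(b/m,0)$), the distance formula gives

$$\sqrt{\left(\frac{b}{m}-0\right)^2+(0-b)^2}=\sqrt{\frac{b^2}{m^2}+b^2}=\frac{b}{m}\sqrt{1+m^2},$$

which agrees with your integral only if $a=0$ and the upper limit of integration is $b/m$, not $b$.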