A point on the hypotenuse of a right triangle is at distances $a$ and $b$ from the two legs. Prove that the minimum possible length of the hypotenuse is $(a^{2/3}+b^{2/3})^{3/2}$.
My Attempt
By similar triangles (with $M$, $N$ as in my figure): $$\frac{x}{y}=\frac{a}{CM}=\frac{AN}{b}$$ $$\frac{x}{y}=\frac{AN}{b}\implies y=\frac{xb}{\sqrt{x^2-a^2}}$$ $$h(x)=x+y=x+\frac{xb}{\sqrt{x^2-a^2}}$$ $$h'(x)=1+\frac{\sqrt{x^2-a^2}\cdot b-xb\cdot\frac{x}{\sqrt{x^2-a^2}}}{x^2-a^2}=1+\frac{x^2b-a^2b-x^2b}{(x^2-a^2)^{3/2}}\\ =1+\frac{-a^2b}{(x^2-a^2)^{3/2}}=\frac{(x^2-a^2)^{3/2}-a^2b}{(x^2-a^2)^{3/2}}$$ $$h'(x)=0\implies (x^2-a^2)^{3/2}=a^2b\implies (x^2-a^2)^{3}=a^4b^2\\ \implies x^6-3x^4a^2+3x^2a^4-a^6=a^4b^2\implies x^6-3x^4a^2+3x^2a^4-a^6-a^4b^2=0$$
How do I proceed from here to find $h_{\min}$ without using trigonometry? Or is there something wrong with my calculation?
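As a quick sanity check on the calculus (not part of the proof), the critical point implied by $(x^2-a^2)^{3/2}=a^2b$ can be tested numerically; here $a=2$, $b=3$ are arbitrary sample values of my own choosing:

```python
import math

a, b = 2.0, 3.0  # arbitrary positive sample values (assumption)

def h(x):
    """Hypotenuse length as a function of x, valid for x > a."""
    return x + x * b / math.sqrt(x**2 - a**2)

# Candidate critical point: solve (x^2 - a^2)^{3/2} = a^2 b for x
x_star = math.sqrt((a**2 * b) ** (2 / 3) + a**2)

# Central-difference estimate of h'(x_star); it should be ~0
eps = 1e-6
dh = (h(x_star + eps) - h(x_star - eps)) / (2 * eps)
print(dh)
```

A derivative estimate near zero confirms the expression for $h'(x)$ and the critical-point equation are consistent with each other.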

Following your solution from this point: $$(x^2-a^2)^{3/2}=a^2b \Rightarrow x^2=(a^2b)^{2/3}+a^2 \Rightarrow x=a^{2/3}(a^{2/3}+b^{2/3})^{1/2}.$$ So: $$y=\frac{xb}{\sqrt{x^2-a^2}}=\frac{a^{2/3}(a^{2/3}+b^{2/3})^{1/2}b}{\sqrt{((a^2b)^{2/3}+a^2)-a^2}}=(a^{2/3}+b^{2/3})^{1/2}b^{2/3}.$$ Hence: $$x+y=a^{2/3}(a^{2/3}+b^{2/3})^{1/2}+(a^{2/3}+b^{2/3})^{1/2}b^{2/3}=\\ (a^{2/3}+b^{2/3})^{1/2}(a^{2/3}+b^{2/3})=(a^{2/3}+b^{2/3})^{3/2}$$ This critical point is indeed the global minimum: from your expression for $h'(x)$, we have $h'(x)<0$ for $x$ just above $a$ and $h'(x)>0$ for large $x$, and $h(x)\to\infty$ as $x\to a^+$ and as $x\to\infty$.
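The closed form can also be checked numerically against a brute-force scan of $h(x)$ over $x>a$; again $a=2$, $b=3$ are sample values chosen only for illustration:

```python
import math

a, b = 2.0, 3.0  # sample values (assumption)

def h(x):
    """Hypotenuse length for x > a."""
    return x + x * b / math.sqrt(x**2 - a**2)

# Brute-force minimum over a fine grid on (a, a + 20]
xs = (a + 1e-3 + i * 1e-4 for i in range(200_000))
h_min = min(h(x) for x in xs)

# Closed-form answer derived above
closed_form = (a ** (2 / 3) + b ** (2 / 3)) ** 1.5
print(h_min, closed_form)
```

The grid minimum and $(a^{2/3}+b^{2/3})^{3/2}$ agree to within the grid resolution, which supports the derivation above.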