I was watching a video on machine learning.
The instructor says that maximizing $\frac{2}{||w||}$ is difficult (why?), so instead we prefer to minimize $\frac{1}{2}||w||^2$, where $w$ is a vector.
How are these two functions equivalent?
**ADDENDUM**
I plotted both expressions. I can see that the max of $\frac{2}{||w||}$ occurs as $||w|| \to 0$ (where it tends to infinity), and the min of $\frac{1}{2}||w||^2$ is at $||w|| = 0$. Am I thinking of this correctly? Isn't it obvious to begin with? I could have figured out that the max of $\frac{2}{||w||}$ is at $||w|| = 0$ without doing any fancy conversion.

Let $x = ||w||$, so $\dfrac{2}{||w||} = \dfrac{2}{x}$ with $x > 0$. If $\dfrac{2}{x} \leq M$, then $x \geq \dfrac{2}{M}$, and it follows that $x^2 \geq \dfrac{4}{M^2}$. In other words, upper bounds on $\dfrac{2}{x}$ correspond exactly to lower bounds on $x^2$, so making $\dfrac{2}{x}$ large is the same as making $x^2$ small. This means that instead of finding the max of $\dfrac{2}{x}$, which can be cumbersome, you can equivalently solve for the min of $x^2$ (and hence of $\frac{1}{2}x^2$, which has the same minimizer). This is much simpler because you deal with a second-degree polynomial instead of a rational expression.
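As a quick numerical sanity check of this equivalence (a sketch I'm adding, not part of the original answer, with an arbitrary set of hypothetical candidate norms), you can verify that over any set of positive values, the maximizer of $\frac{2}{x}$ and the minimizer of $\frac{1}{2}x^2$ are the same point, since both simply pick out the smallest $x$:

```python
# Sanity check: over a set of positive candidate values x = ||w||,
# the maximizer of 2/x and the minimizer of (1/2) x^2 coincide,
# because both objectives are monotone in x and pick the smallest x.
candidates = [0.5, 1.0, 1.5, 2.0, 3.0]  # hypothetical candidate norms

best_by_max = max(candidates, key=lambda x: 2 / x)       # maximize 2/x
best_by_min = min(candidates, key=lambda x: 0.5 * x**2)  # minimize (1/2) x^2

print(best_by_max, best_by_min)  # both select x = 0.5
```

The same holds for any positive candidate set: swap in different values and the two selections still agree, which is exactly why the conversion preserves the solution.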