If I take a number made of 128 ones followed by 128 zeros (256 bits) and convert it to a base-10 integer, I get:
115792089237316195423570985008687907852929702298719625575994209400481361428480
and if I raise 2 to the power 256 I get:
115792089237316195423570985008687907853269984665640564039457584007913129639936
But taking a close look, the second number, which should exceed any 256-bit number, is smaller than the first one, which has 256 bits. How can this be explained?
Actually, the second number is larger. I have highlighted in red the first digit where they differ:
$11579208923731619542357098500868790785\color{red}2929702298719625575994209400481361428480$
$11579208923731619542357098500868790785\color{red}3269984665640564039457584007913129639936$