\begin{align*} \frac{0}{0} &= \frac{100-100}{100-100} \\ &= \frac{10^2-10^2}{10(10-10)} \\ &= \frac{(10+10)(10-10)}{10(10-10)} \\ &= \frac{10+10}{10} \\ &= \frac{20}{10} \\ &= 2 \end{align*}
I know that $0/0$ isn't equal to $2$, so what is wrong with this proof? I wasn't satisfied after searching on Google; any help would be appreciated.
The problem with this proof is its very first assumption: that the number $\frac00$ exists, or, in other words, that division is defined when the divisor is $0$. In fact, the operation $(x,y)\mapsto \frac{x}{y}$ is defined only on $$\mathbb R\times (\mathbb R\setminus \{0\}).$$ Concretely, the step where you cancel the common factor $(10-10)$ divides the numerator and denominator by $(10-10)=0$, which is exactly this undefined operation.
The operation is defined like so: the number $a=\frac{x}{y}$ is the unique number for which $a\times y = x$. This uniqueness is what allows you to perform the usual algebraic manipulations with these numbers, so it is quite necessary.
In the case of $\frac00$, this would mean that $\frac00$ is the unique number $a$ for which $0\cdot a = 0$. However, since $0\cdot a = 0$ for every value of $a$, no such unique number exists, so we cannot sensibly define $\frac00$.
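To make the failure of uniqueness explicit, here is a short sketch of the same argument in the notation above. Suppose $a=\frac00$ did exist; by the definition, $a$ would be *the* number satisfying $a\cdot 0 = 0$. But
\begin{align*}
1 \cdot 0 &= 0 &&\Rightarrow& \frac00 &= 1, \\
2 \cdot 0 &= 0 &&\Rightarrow& \frac00 &= 2,
\end{align*}
so $a$ would have to equal both $1$ and $2$ at once, a contradiction. Notice that your proof implicitly uses the first of these lines when it cancels $(10-10)$, i.e. it treats $\frac{10-10}{10-10}$ as $1$.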