There are many unproven conjectures for which, if you assigned a heuristic probability of being true, that probability would come out at essentially $100\%$.
I can see why mathematics must be rigorous, since many conjectures once thought to be true were later proven false. But those conjectures did not have a heuristic $100\%$ chance of being true: in many cases the problem was that some condition was assumed, often without justification, to hold for infinitely many integers.
So what would we lose by saying things like "$\pi^{\pi^{\pi^\pi}}$ is not an integer" and "$e + \pi$ is irrational" just because we haven't proved them?
Let me give one catastrophic example. You may be familiar with the Moebius function $\mu(n)$, which is defined as $0$ if $n$ is divisible by the square of a prime, and for square-free $n$ is $+1$ if $n$ has an even number of prime factors and $-1$ otherwise.
Now let's look at an integer-valued function $M(n)$, the partial sum $$ M(n) = \sum_{k=1}^n \mu(k) $$ (known as the Mertens function), and observe (without proving it) that while $M(1) = 1$, we have $|M(n)| < \sqrt{n}$ for all $n>1$. We try this out for all $n$ up to $1{,}000{,}000$ and it always works, so we call it the Mertens conjecture.
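The numerical experiment described above is easy to sketch: sieve $\mu(k)$, accumulate the partial sums $M(n)$, and check the bound $|M(n)| < \sqrt{n}$. The bound $N$ below is tiny compared to the $10^6$ in the story, purely for illustration; the sieve style is one of several standard choices.

```python
# Numerical check of the Mertens conjecture up to a small bound N:
# sieve mu(k), accumulate M(n) = sum_{k<=n} mu(k), and verify
# |M(n)| < sqrt(n) for all 1 < n <= N.

def moebius_sieve(N):
    """Return a list mu with mu[k] = Moebius mu(k) for 0 <= k <= N."""
    mu = [1] * (N + 1)
    is_comp = [False] * (N + 1)
    primes = []
    for i in range(2, N + 1):
        if not is_comp[i]:
            primes.append(i)
            mu[i] = -1
        for p in primes:
            if i * p > N:
                break
            is_comp[i * p] = True
            if i % p == 0:
                mu[i * p] = 0  # p^2 divides i*p, so mu vanishes
                break
            mu[i * p] = -mu[i]  # one extra distinct prime factor flips the sign
    return mu

N = 100_000  # illustration only; the story checks up to 10**6 and beyond
mu = moebius_sieve(N)
M, ok = 0, True
for n in range(1, N + 1):
    M += mu[n]
    if n > 1 and abs(M) >= n ** 0.5:
        ok = False  # a counterexample to the conjectured bound
        break
print(ok)
```

No counterexample appears at this scale, which is exactly the trap: the smallest known bounds on where the conjecture first fails are astronomically beyond any direct search.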
Being a bit OCD, we try it out for all $n < 10^{16}$, and not only does it always work, $|M(n)|$ stays below $\frac35 \sqrt{n}$ in all those cases. So now we decide: let's assume the conjecture is true; let's add it to our body of axioms that we can use in future proofs.
Well, pretty soon we are happy, because now we can prove the Riemann Hypothesis (a much more important conjecture, but one that is perhaps tougher to convince yourself of by numerical examples). And we continue to develop a bunch of other math, so much so that a century from now we forget which theorems depend on the Mertens axiom and which don't.
Along the way, somebody manages to prove the $3n+1$ conjecture (that if you start with any positive integer and repeatedly replace it with $3n+1$ if it is odd, or divide it by $2$ if it is even, you eventually reach $1$). I can assure you that this is possible starting from theorems derived assuming the Mertens conjecture. And $50$ years later, some student discovers (using the processor on her smartphone) a number which, instead of reaching $1$, goes into a cycle that repeats forever.
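The student's hypothetical search can be sketched as follows: iterate the $3n+1$ map from each starting value, stopping when the trajectory reaches $1$ or revisits a value it has already seen (which would mean a cycle that never reaches $1$). The step cap and search range are arbitrary choices for illustration; no such cycle is actually known for positive integers.

```python
# Brute-force 3n+1 search: follow each trajectory, detecting either
# arrival at 1 or a repeated value (a cycle avoiding 1).

def collatz_reaches_one(n, max_steps=10_000):
    """Return True if the 3n+1 trajectory of n reaches 1 within max_steps."""
    seen = set()
    for _ in range(max_steps):
        if n == 1:
            return True
        if n in seen:
            return False  # trajectory entered a cycle that avoids 1
        seen.add(n)
        n = 3 * n + 1 if n % 2 else n // 2
    return False  # undecided within the step budget

# Check every starting value below a small bound.
results = [collatz_reaches_one(n) for n in range(1, 1000)]
print(all(results))  # every start below 1000 reaches 1
```

Of course, as with the Mertens experiment, a clean run at small scale proves nothing about all integers; that is the whole point of the story.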
Well, now the world is in the unpleasant position of knowing that the mathematics people have been building on contains a contradiction, meaning that anything can be proven. Good luck trying to unravel which pieces of a century's worth of mathematics depend on the Mertens axiom and which don't. What a mess.
(By the way, the Mertens conjecture is known to be false, although we may never have computers strong enough to find the smallest counterexample.)