Suppose an equation is solved by factoring, and doing so leaves you with $x = -1$ and $x = 3$ (the specific numbers don't really matter for this question). Plugging those values back into the original equation, $x = -1$ does not make it equal $0$, while $x = 3$ does; thus $3$ is a solution and $-1$ is not.
This makes me wonder: does this support the idea that math is invented rather than discovered, given that we get some random value that is meaningless?
Or does this value actually mean something? If so, what?
The situation you describe cannot happen, at least not exactly the way you describe it: "factoring" means replacing an expression with a product of expressions equal to it, and if one of those factors is zero, the product must be zero as well.
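To see this concretely in a case where the factoring is legitimate, take
$$x^2 - 2x - 3 = 0, \qquad \text{which factors as} \qquad (x - 3)(x + 1) = 0.$$
Both candidates genuinely work: $3^2 - 2\cdot 3 - 3 = 0$ and $(-1)^2 - 2(-1) - 3 = 0$. When an equation really is solved by factoring, every root of a factor is a root of the original equation, so no extraneous solution can appear at that step.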
However, a situation like the one you describe - in which one solution is extraneous - can happen in slightly different circumstances. For example, consider the equation
$$\log_3{x} + \log_3(x - 2) = 1$$
Using a property of logarithms to combine the left side into $\log_3\big(x(x-2)\big)$, and then raising $3$ to the power of both sides, gives $x^2 - 2x = 3$, which has solutions $x = -1$ and $x = 3$. While $x = 3$ does satisfy the original equation, plugging in $x = -1$ requires taking $\log_3(-1)$, which is undefined.
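A quick numerical check makes the asymmetry concrete. Here is a small sketch in Python using the standard `math` module (the function name `original_lhs` is just for illustration): it evaluates the left side of the original equation at both candidates and reports which one actually satisfies it.

```python
import math

def original_lhs(x):
    """Left side of the original equation: log_3(x) + log_3(x - 2).

    math.log raises ValueError when asked for the log of a
    non-positive number, which is exactly the failure mode
    for the extraneous candidate.
    """
    return math.log(x, 3) + math.log(x - 2, 3)

for candidate in (3, -1):
    try:
        value = original_lhs(candidate)
        verdict = "solution" if math.isclose(value, 1) else "not a solution"
        print(f"x = {candidate}: LHS = {value} ({verdict})")
    except ValueError:
        print(f"x = {candidate}: LHS undefined (log of a non-positive number)")
```

Running this shows $x = 3$ giving a left side of $1$, while $x = -1$ never even produces a value: the expression is undefined there, rather than merely unequal to $1$.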
This doesn't seem to be a reason to believe that math is an invention - though the filter we see it through, and the words we use for it, are. Here, the issue is that the logarithm property we used ($\log_3 x + \log_3 y = \log_3(xy)$) isn't correct in exactly the way we usually state it; it's only true when both logarithms on the left exist.
This is a general rule - when you get an apparent solution that isn't an actual solution, it's because you've made a mistake in your reasoning somewhere, sometimes a very subtle one.