Is there a way to solve a system of equations with more equations than unknowns other than just transposing?


Let's say we have a system of experimentally derived equations with more experiments (equations) than unknowns. This gives me the sense that we have "too much" information about the variables, and an exact solution almost never exists. However, if you left-multiply both sides of the system by the transpose of the coefficient matrix, you convert the rectangular system into a square one whose coefficient matrix is at least positive semidefinite and, hopefully, invertible, so the new system has a unique solution. As I understand it, that would be the "closest" solution in the least-squares sense.
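A minimal sketch of the transpose trick described above, with hypothetical data (the matrix `A` and vector `b` are made up for illustration): left-multiplying $Ax = b$ by $A^T$ gives the square system $A^T A x = A^T b$, whose solution matches what a least-squares routine returns.

```python
import numpy as np

# Hypothetical overdetermined system: 5 equations (experiments), 2 unknowns.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([0.1, 1.1, 1.9, 3.2, 3.9])

# Left-multiply by A^T: the coefficient matrix A^T A is square and
# positive semidefinite (positive definite when A has full column rank,
# which is what makes the solution unique).
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# The same least-squares solution via the library routine.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.allclose(x_normal, x_lstsq))  # the two approaches agree
```

These square systems $A^T A x = A^T b$ are called the normal equations; in practice `lstsq` (which uses an orthogonal factorization) is numerically safer, but the answer is the same.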

Here comes my question. This approach seems to give equal weight to every experiment that generated the initial system of equations. What if, after running the experiments but before solving for the variables, you assign each equation a weight measuring how much you trust that experiment's accuracy? Couldn't you use those weights to give more influence to the equations most likely to be accurate, and less to the ones most likely to be erroneous? And wouldn't that change the "closest" answer you would have gotten by simply applying the transpose trick from the paragraph above? If this is possible, how would you do it in a real-world problem?
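To make the question concrete, here is a sketch of what I mean by weighting, again with made-up data and made-up weights: each equation gets a weight $w_i$, collected into a diagonal matrix $W$, and instead of $A^T A x = A^T b$ you would solve $A^T W A x = A^T W b$. Equivalently, you could scale each row of the system by $\sqrt{w_i}$ and run ordinary least squares.

```python
import numpy as np

# Hypothetical system: 4 experiments, 2 unknowns.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([0.0, 1.0, 2.0, 10.0])  # the last measurement looks suspect

# Hypothetical weights: strongly distrust the fourth experiment.
w = np.array([1.0, 1.0, 1.0, 0.01])
W = np.diag(w)

# Weighted normal equations: A^T W A x = A^T W b.
x_weighted = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)

# Unweighted solution for comparison.
x_unweighted = np.linalg.solve(A.T @ A, A.T @ b)

# Equivalent formulation: scale each row by sqrt(w_i), then ordinary
# least squares on the scaled system.
sw = np.sqrt(w)
x_scaled, *_ = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)

print(x_weighted, x_unweighted)          # the weighted answer differs
print(np.allclose(x_weighted, x_scaled)) # the two weighted forms agree
```

My understanding is that the down-weighted fourth equation pulls the solution much less than it does in the unweighted fit, which is exactly the effect I am asking about.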