I do apologize in advance that my question is going to be wordy, because I'm at a loss as to how to even start coding this. Pseudo-code answers are highly appreciated, if only to help me understand how to solve this (then I can write some actual code and come back for help if necessary).
My problem isn't so much the code as it is understanding the logic I need (which is arguably the harder part of programming).
An informal explanation of my problem is that I want to change a matrix A (which happens to be sparse) so that its row sums equal its column sums. I can do this by adding to A a matrix AS, where S is a matrix of scales.
Formally, I want to find a matrix S such that $(A + AS)\mathbf{1}_n = T$ and $(A + AS)'\mathbf{1}_n = T$, where $\mathbf{1}_n$ is a vector of ones (so multiplying by it produces the vector of row sums, or column sums after transposing) and $T$ is the target vector.
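To make the target concrete, here is a minimal numpy sketch (the matrix `A` and the helper below are purely illustrative) that computes $T$ and measures how far a candidate $S$ is from satisfying both conditions:

```python
import numpy as np

# Toy example; `A` and `residuals` are illustrative, not part of the question.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
n = A.shape[0]
ones = np.ones(n)
T = A.T @ ones               # current column sums = the fixed target T

def residuals(S):
    """Distance of (A + A @ S) from having row sums and column sums equal to T."""
    B = A + A @ S
    return B @ ones - T, B.T @ ones - T

r_row, r_col = residuals(np.zeros((n, n)))   # S = 0 leaves A unchanged
```

With $S = 0$ the column residual is zero by construction (that is just the definition of $T$), and the row residual shows how unbalanced $A$ currently is.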
The vector $T$ is set in stone, as it were: it is the current vector of column sums, and it is the target for the row sums.
I think the way I want to solve this is to walk down the diagonal: for each row $i$ and column $j$ with $i = j$, find the row sum and compute how far it is from the target. Then change each element of that row so that the row sum equals the target (or is at least "close enough", where I can set what "close enough" means).
However, this is subject to the condition that the sum of column j must equal the target as well.
How can I design the logic so that I can start with row 1 and column 1, figure out the values in row 1, and then figure out the values of column 1 subject to the first entry of column 1 being "fixed" by the earlier procedure?
Following that, row 2 would have its first value "fixed" by the above, and similarly the programme then needs to figure out column 2 with the first two entries fixed.
And so on until you get to the final column and row.
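For what it's worth, the alternating row/column sweep described above is close in spirit to iterative proportional fitting (the RAS / Sinkhorn algorithm). A minimal sketch, assuming A is nonnegative with no zero rows or columns; note that it produces the balanced matrix directly rather than going through the $A + AS$ parameterisation:

```python
import numpy as np

def balance(A, T, tol=1e-9, max_iter=10_000):
    """Alternately rescale rows, then columns, of A toward target sums T.

    This is iterative proportional fitting (RAS), not the A + AS form from
    the question: it returns the balanced matrix B itself.  Assumes A is
    nonnegative with no zero rows or columns, and that sum(T) == A.sum().
    """
    B = A.astype(float).copy()
    for _ in range(max_iter):
        B *= (T / B.sum(axis=1))[:, None]   # force row sums to T
        B *= T / B.sum(axis=0)              # force column sums to T
        if np.abs(B.sum(axis=1) - T).max() < tol:   # rows still close enough?
            return B
    raise RuntimeError("does not converge")
```

With `T` set to the current column sums of `A`, the totals automatically match, which is the consistency condition IPF needs.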
I have tried programming a gradient descent, but got stuck on how to make the gradient descent for the columns depend on the gradient descent for the rows iteratively.
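One way around that coupling problem is not to run two separate descents at all, but to put both residuals into a single loss, $\tfrac12\|(A+AS)\mathbf{1}-T\|^2 + \tfrac12\|(A+AS)'\mathbf{1}-T\|^2$, and descend on $S$ jointly. A sketch (the step size and iteration cap are ad-hoc choices for a small, well-scaled example):

```python
import numpy as np

def fit_S(A, lr=0.01, tol=1e-8, max_iter=100_000):
    """Joint gradient descent on S for both sum conditions at once.

    Minimises 0.5*||(A + A@S)@1 - T||^2 + 0.5*||(A + A@S).T@1 - T||^2,
    so the row and column updates are automatically coupled through S.
    The step size `lr` is an ad-hoc choice for a small, well-scaled A.
    """
    n = A.shape[0]
    ones = np.ones(n)
    T = A.T @ ones                       # fixed target: current column sums
    S = np.zeros((n, n))
    for _ in range(max_iter):
        B = A + A @ S
        r_row = B @ ones - T             # row-sum residual
        r_col = B.T @ ones - T           # column-sum residual
        if max(np.abs(r_row).max(), np.abs(r_col).max()) < tol:
            return S
        # gradient of the summed squared residuals with respect to S:
        # d/dS of the row term is (A' r_row) 1', of the column term (A' 1) r_col'
        grad = np.outer(A.T @ r_row, ones) + np.outer(A.T @ ones, r_col)
        S -= lr * grad
    raise RuntimeError("does not converge")
```

Because both residuals feed one gradient, there is no need to hand information from a "row pass" to a "column pass": every step moves $S$ in a direction that trades the two off against each other.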
I've also worked this out by hand (for a 2x2 matrix); I can figure out the answer, but I'm not sure how I managed to do so, which is why I'm struggling to code it.
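It may help to notice why the 2x2 case works out by hand: both conditions are linear in the entries of S, so the whole problem is an ordinary least-squares solve and needs no iteration at all. A sketch using Kronecker products (column-major `vec` convention; `solve_S` is an illustrative name):

```python
import numpy as np

def solve_S(A):
    """Solve the two sum conditions for S directly as linear least squares.

    Row condition:     A @ S @ 1   = T - A @ 1      -> (1' kron A)  vec(S)
    Column condition: (A @ S)' @ 1 = T - A' @ 1 = 0 -> (I  kron T') vec(S)
    where vec() stacks columns (order='F') and T = A' @ 1.
    """
    n = A.shape[0]
    ones = np.ones(n)
    T = A.T @ ones
    M = np.vstack([np.kron(ones, A),          # rows encoding A @ S @ 1
                   np.kron(np.eye(n), T)])    # rows encoding T' @ S[:, j]
    b = np.concatenate([T - A @ ones, np.zeros(n)])
    s, *_ = np.linalg.lstsq(M, b, rcond=None)
    return s.reshape(n, n, order='F')
```

If the least-squares residual is not (numerically) zero, no exact S exists and you can report that instead. For a genuinely sparse A you would swap `lstsq` for a sparse solver such as `scipy.sparse.linalg.lsqr` rather than materialising the Kronecker products.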
Expected results are a matrix $(A + AS)$ such that the row sums equal the column sums, or an error message saying "does not converge" if no such matrix can be found.