I'm not sure if this is the right Stack Exchange site to post this question to, but I was wondering if someone has an answer to an interesting observation I've made.
I've written a program that generates a 6th-order magic square and then finds the standard deviation of each of its columns. Here are a few screenshots of generated squares and their deviations.
First run: (screenshot of the generated square and its column deviations)

Second run: (screenshot)

Third run: (screenshot)
Now, what I was wondering is: why is the highest deviation always in column 1 or 2 and the lowest in column 5 or 6? And why is neither extreme ever found in column 3 or 4? It comes out this way no matter how many times I run the program. At first I thought my standard deviation math was wrong, but I checked it by hand and it all checks out.
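For reference, the per-column deviations can be computed like this. This is a minimal sketch using a well-known 6x6 magic square rather than one produced by my program; statistics.pstdev is the population standard deviation (use statistics.stdev instead if you divide by n-1):

```python
import statistics

# A classic 6x6 magic square: every row, column and diagonal sums to 111.
square = [
    [ 1, 35, 34,  3, 32,  6],
    [30,  8, 28, 27, 11,  7],
    [24, 23, 15, 16, 14, 19],
    [13, 17, 21, 22, 20, 18],
    [12, 26,  9, 10, 29, 25],
    [31,  2,  4, 33,  5, 36],
]

# zip(*square) iterates over columns; take the population
# standard deviation of each one.
col_devs = [statistics.pstdev(col) for col in zip(*square)]
```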
Does this have something to do with magic squares in general or is it just a coincidence?
EDIT:
@taninamdar I generate the magic square by randomly picking one of the 8 possible 3x3 magic squares. I then expand each of its numbers into a 2x2 section, giving the base 6x6 square. Next I build a 6x6 Medjig square out of nine 2x2 blocks, each containing the numbers 0 through 3. Once every row and column of it adds up to 9, I loop through the grid applying grid[x, y] = grid[x, y] + 9 * medjig_grid[x, y], which produces the final magic square you see in the screenshots.
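The construction described above can be sketched as follows. This is an illustration, not the asker's actual code: it assumes the Lo Shu square as the 3x3 base and one hand-picked valid Medjig square, whereas the real program chooses both randomly:

```python
# One of the 8 rotations/reflections of the 3x3 (Lo Shu) magic square.
base3 = [
    [2, 7, 6],
    [9, 5, 1],
    [4, 3, 8],
]

# A valid Medjig square: each 2x2 block is a permutation of {0,1,2,3},
# and every row and column of the 6x6 grid sums to 9.
medjig = [
    [2, 3, 0, 2, 0, 2],
    [1, 0, 3, 1, 3, 1],
    [3, 1, 1, 2, 2, 0],
    [0, 2, 0, 3, 3, 1],
    [3, 2, 2, 0, 0, 2],
    [0, 1, 3, 1, 1, 3],
]

# Expand each entry of the 3x3 square into a 2x2 section...
grid = [[base3[y // 2][x // 2] for x in range(6)] for y in range(6)]

# ...then apply grid[x, y] = grid[x, y] + 9 * medjig_grid[x, y].
for y in range(6):
    for x in range(6):
        grid[y][x] += 9 * medjig[y][x]
```

Each base value n ends up paired once with each of the offsets 0, 9, 18, 27, so the result contains every number from 1 to 36 exactly once, and every row and column sums to 111.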
Your magic squares are generated in a way that ignores the requirement that the diagonals have the same sum as the rows and columns.
For this "subclass" of magic squares we can generate a different magic square by exchanging columns and/or rows.
Exchanging the columns 1/2 or 5/6 with 3/4 disproves your assumption.
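To illustrate the exchange, here is a sketch using a classic 6x6 magic square rather than one produced by the program. Swapping column pairs leaves every row and column sum intact (only the diagonals can break, which this "subclass" ignores), so the per-column deviations are merely permuted and the extremes land in columns 3 and 4:

```python
import statistics

# A classic 6x6 magic square: rows, columns and diagonals sum to 111.
square = [
    [ 1, 35, 34,  3, 32,  6],
    [30,  8, 28, 27, 11,  7],
    [24, 23, 15, 16, 14, 19],
    [13, 17, 21, 22, 20, 18],
    [12, 26,  9, 10, 29, 25],
    [31,  2,  4, 33,  5, 36],
]

# Exchange columns 1/2 with columns 3/4 (0-based indices 0,1 <-> 2,3).
swapped = [[row[2], row[3], row[0], row[1], row[4], row[5]] for row in square]

# Row and column sums are unchanged, so the result is still magic in
# the rows-and-columns sense; the column deviations are just permuted.
devs_before = [statistics.pstdev(col) for col in zip(*square)]
devs_after = [statistics.pstdev(col) for col in zip(*swapped)]
```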
Either your observation happened by chance, or @taninamdar is right and the cause lies in the way you generate the squares.