Graphs are usually given by their "standard" definition: a set of nodes together with a set of edges.
But for me, the adjacency matrix definition has always seemed more intuitive (and less cumbersome). It says we're given an $N \times N$ matrix $A$ such that $A_{ij} = 1 \iff i \sim j$, and $A_{ij} = 0$ otherwise. (Note that this extends easily to weighted and directed graphs.)
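For concreteness, here's how I'd build such a matrix for a small graph (a 4-cycle, as a toy example):

```python
import numpy as np

# A small undirected graph on N = 4 vertices, given by its edge list.
N = 4
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]  # a 4-cycle

# Build the adjacency matrix: A[i, j] = 1 iff i ~ j.
A = np.zeros((N, N), dtype=int)
for i, j in edges:
    A[i, j] = 1
    A[j, i] = 1  # symmetric, since the graph is undirected

print(A)
```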
If we are given two graphs, $A$ and $B$, an isomorphism between them is a permutation matrix $\sigma$ such that $\sigma A = B$ (equivalently, $\sigma^{-1} B = A$, since permutation matrices are always invertible).
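Since everything here leans on $\sigma$ being a permutation matrix, here's a quick sketch of what that means concretely (a toy example; the indexing convention $P[\sigma(i), i] = 1$ is one common choice):

```python
import numpy as np

# A permutation of {0, 1, 2, 3} and its matrix: P[p[i], i] = 1,
# so that (P @ x)[p[i]] = x[i] for any vector x.
p = [2, 0, 3, 1]
P = np.zeros((4, 4), dtype=int)
for i, k in enumerate(p):
    P[k, i] = 1

# Permutation matrices are orthogonal: the inverse is the transpose.
print((P.T @ P == np.eye(4, dtype=int)).all())  # True
```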
But what would an automorphism of a graph $A$ be? Is it a permutation matrix $\sigma$ such that $\sigma A = A$? I don't think this can be correct: $\sigma A = A$ forces every row of $A$ to equal the row it is mapped to, so unless $A$ has repeated rows, the only $\sigma$ that works is the identity matrix.
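To test this suspicion, here's a small brute-force experiment of my own (using the path graph on four vertices, whose rows are all distinct, and comparing $\sigma A = A$ against the conjugation condition $\sigma A \sigma^\top = A$, which relabels both rows and columns):

```python
import itertools
import numpy as np

# Path graph 0-1-2-3; its only nontrivial symmetry reverses the path.
A = np.zeros((4, 4), dtype=int)
for i, j in [(0, 1), (1, 2), (2, 3)]:
    A[i, j] = A[j, i] = 1

def perm_matrix(p):
    """Permutation matrix P with P[p[i], i] = 1."""
    P = np.zeros((4, 4), dtype=int)
    for i, k in enumerate(p):
        P[k, i] = 1
    return P

perms = list(itertools.permutations(range(4)))

# Condition sigma A == A: only the identity survives here,
# because all rows of A are distinct.
left_only = [p for p in perms if (perm_matrix(p) @ A == A).all()]

# Condition sigma A sigma^T == A: relabels rows AND columns,
# and recovers the expected symmetry (identity and reversal).
autos = [p for p in perms
         if (perm_matrix(p) @ A @ perm_matrix(p).T == A).all()]

print(left_only)  # [(0, 1, 2, 3)]
print(autos)      # [(0, 1, 2, 3), (3, 2, 1, 0)]
```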
So how can automorphisms be described in the adjacency matrix definition of graphs?