There are different ways to represent a graph, but the adjacency and Laplacian matrices are two of the most powerful, each with many useful properties.
Recently, a student asked me when exactly one should use the adjacency matrix and when the Laplacian. How can he know?
I gave him some examples, basically picking out a few sample properties of each matrix, but for a student who hasn't yet learned all of these properties, how can s/he know? I wasn't satisfied with my own response and wonder whether people on this forum can provide better answers. Are there any "rules" or "guides" one can use to decide when to use which, or both?
The paper "Limit theorems for eigenvectors of the normalized Laplacian" by Tang and Priebe (2018) gives a satisfactory answer to this question in the context of random dot product graphs, where one wants to recover the latent positions in $\mathbb{R}^d$. As the name suggests, they establish a central limit theorem for the spectral embeddings associated with the Laplacian and the adjacency matrix.
Further, they construct an information-theoretic quantity that measures the performance of community detection based on the spectral embeddings. The conclusion appears to be that the Laplacian is to be preferred over the adjacency matrix whenever the graph is sufficiently sparse.
This conclusion is fairly sensible from the well-known perspective that adjacency-based clustering is a relaxation of the (NP-hard) minimum-cut problem, whereas Laplacian-based clustering is a relaxation of the normalized-cut problem.
My personal viewpoint is that the Laplacian is nearly always the better choice for any large graph one is likely to encounter in practice. On the other hand, adjacency matrices are easier to explain to someone without much mathematical background, and so have an edge in accessibility.