This is a rather general question about the areas of mathematics in which this condition (an eigenvector whose coordinates sum to zero) occurs.
(One implication is that such an eigenvector, viewed as a point in $\mathbb{R}^n$ for some $n \in \mathbb{N}$, lies in the hyperplane $\sum_{i=1}^n x_i = 0$ of dimension $n-1$.)
Is there a necessary and sufficient condition for the coordinate sum to be zero?
What could be inferred in the case of such an eigenvector of the adjacency matrix (or the Laplacian matrix) of a graph?
Your first question is too vague to answer. If we are given the eigenvector, we can compute the sum. So I assume this is not what you mean, but I cannot tell what you really intend.
Laplacians: if the graph is connected, 0 is a simple eigenvalue and the all-ones vector is an eigenvector; since the Laplacian is symmetric, any eigenvector with non-zero eigenvalue will be orthogonal to the all-ones vector, and orthogonality to the all-ones vector is exactly the condition that the coordinate sum is zero.
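To see this numerically, here is a minimal sketch (assuming numpy; the path graph $P_4$ is an arbitrary choice of connected graph):

```python
import numpy as np

# Adjacency matrix of the path P4 (an arbitrary connected graph).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

D = np.diag(A.sum(axis=1))  # degree matrix
L = D - A                   # Laplacian

# L is symmetric, so use eigh; eigenvalues come back in ascending order.
eigvals, eigvecs = np.linalg.eigh(L)

for lam, v in zip(eigvals, eigvecs.T):
    # The coordinate sum is (numerically) zero exactly when lam != 0.
    print(f"lambda = {lam:+.4f}, coordinate sum = {v.sum():+.4e}")
```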
Adjacency matrix: if the graph is connected and $k$-regular, the all-ones vector is an eigenvector and all remaining eigenvectors are orthogonal to it. In general, an eigenvector has coordinate sum zero if and only if it is an eigenvector for both the graph and its complement (the sole exception being the all-ones vector when $G$ is regular). This follows from $A(\overline{G}) = J - I - A(G)$, where $J$ is the all-ones matrix and $Jv = 0$ precisely when the coordinate sum of $v$ is zero; in particular, if the eigenvalue in $G$ is $\lambda$, the eigenvalue in $\overline{G}$ is $-1-\lambda$. This gives a necessary condition: writing $\phi$ for the characteristic polynomial, if $\phi(G,t)$ and $\phi(\overline{G},-t-1)$ have no common factor, then no eigenvector of $G$ has coordinate sum zero.
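A numerical sketch of the complement relation (again assuming numpy; the 5-cycle $C_5$ is an arbitrary choice, and it happens to be self-complementary): each adjacency eigenvector with coordinate sum zero is checked to be an eigenvector of $\overline{G}$ with eigenvalue $-1-\lambda$.

```python
import numpy as np

n = 5
# Adjacency matrix of the 5-cycle C5 (2-regular and connected).
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0

# Complement: A(G-bar) = J - I - A(G).
A_bar = np.ones((n, n)) - np.eye(n) - A

eigvals, eigvecs = np.linalg.eigh(A)
for lam, v in zip(eigvals, eigvecs.T):
    if abs(v.sum()) < 1e-9:  # coordinate sum zero
        # Then v should be an eigenvector of the complement
        # with eigenvalue -1 - lam.
        ok = np.allclose(A_bar @ v, (-1 - lam) * v)
        print(f"lambda = {lam:+.4f} in G  ->  {-1 - lam:+.4f} in G-bar: {ok}")
```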
Finally, any eigenspace of dimension at least two necessarily contains an eigenvector with coordinate sum zero, since the coordinate-sum functional $v \mapsto \sum_i v_i$ is linear and therefore has a non-trivial kernel on any subspace of dimension at least two.
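The construction can be made explicit (a sketch; the helper name and the two-disjoint-edges example are my own illustration): if $v_1, v_2$ are independent eigenvectors for the same eigenvalue with coordinate sums $s_1, s_2$, then $s_2 v_1 - s_1 v_2$ lies in the same eigenspace and has coordinate sum $s_2 s_1 - s_1 s_2 = 0$.

```python
import numpy as np

def sum_zero_eigenvector(v1, v2):
    """Given linearly independent eigenvectors v1, v2 for the same
    eigenvalue, return a non-zero eigenvector with coordinate sum zero."""
    s1, s2 = v1.sum(), v2.sum()
    if abs(s1) < 1e-12:       # v1 already has coordinate sum zero
        return v1
    # Eigenspaces are closed under linear combinations, and
    # (s2 * v1 - s1 * v2).sum() == s2 * s1 - s1 * s2 == 0.
    return s2 * v1 - s1 * v2

# Example: two disjoint edges; eigenvalue 1 has multiplicity 2,
# with eigenvectors (1,1,0,0) and (0,0,1,1), neither of which sums to zero.
A = np.array([[0, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
v1 = np.array([1.0, 1.0, 0.0, 0.0])
v2 = np.array([0.0, 0.0, 1.0, 1.0])
w = sum_zero_eigenvector(v1, v2)
print(w.sum(), np.allclose(A @ w, w))  # 0.0 True
```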