Let $\mathbf{A}$ be a $4 \times 4$ symmetric matrix with real entries, whose elements are given as \begin{equation} \mathbf{A} = \left( \begin{array}{cccc} a & b & c & d \\ b & c & d & e \\ c & d & e & f \\ d & e & f & g \end{array} \right). \end{equation} Let $\mathbf{B}$ be another $4 \times 4$ symmetric matrix whose elements are given as \begin{equation} \mathbf{B} = \left( \begin{array}{cccc} b & c & d & e \\ c & d & e & f \\ d & e & f & g \\ e & f & g & h \end{array} \right). \end{equation} One can see that the elements of $\mathbf{B}$ are shifted by one with respect to those of $\mathbf{A}$. Both $\mathbf{A}$ and $\mathbf{B}$ are Hankel matrices. My query is:
"Does any relationship exist between the eigenvalues of $\mathbf{A}$ and those of $\mathbf{B}$?"
As hickslebummbumm noted, the question as stated (especially with the presence of $B$'s lower-right value $h$, which appears nowhere in $A$) doesn't impose enough structure relating $A$ and $B$ to say anything definite about their respective spectra. It's probably necessary to narrow the question's scope and look at some special cases to say something meaningful.
For example, suppose we look at $A$ and $B$ in the case where $e = a$, $f = b$, $g = c$, and $h = d$. This is actually a natural case to look at since $A$ and $B$ are then row-reversed (i.e., upside-down) circulant matrices with a succinctly expressed relationship $A=PB$, where $P$ is the permutation matrix
\begin{equation} \mathbf{P} = \left( \begin{array}{cccc} 0 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{array} \right). \end{equation}
Rewritten, your Hankel matrices in this case are
\begin{equation} \mathbf{A} = \left( \begin{array}{cccc} a & b & c & d \\ b & c & d & a \\ c & d & a & b \\ d & a & b & c \end{array} \right) \end{equation} and \begin{equation} \mathbf{B} = \left( \begin{array}{cccc} b & c & d & a \\ c & d & a & b \\ d & a & b & c \\ a & b & c & d \end{array} \right). \end{equation}
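If it helps to see this concretely, the relation $A = PB$ is easy to verify numerically. Here is a minimal NumPy sketch; the specific values chosen for $a$, $b$, $c$, $d$ are arbitrary:

```python
import numpy as np

# Arbitrary test values for a, b, c, d (any real numbers work).
a, b, c, d = 1.0, 2.0, -0.5, 3.0

# The row-reversed circulant matrices from the special case e=a, f=b, g=c, h=d.
A = np.array([[a, b, c, d],
              [b, c, d, a],
              [c, d, a, b],
              [d, a, b, c]])
B = np.array([[b, c, d, a],
              [c, d, a, b],
              [d, a, b, c],
              [a, b, c, d]])

# The permutation matrix P given above.
P = np.array([[0., 0., 0., 1.],
              [1., 0., 0., 0.],
              [0., 1., 0., 0.],
              [0., 0., 1., 0.]])

# Check A = P B: P moves B's last row to the top and shifts the rest down.
assert np.allclose(A, P @ B)
```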
This is something that we can analyze in detail. Since all of the row sums of $A$ are equal, and likewise for $B$, with the same common value for both matrices, we immediately see that $A$ and $B$ share the eigenvalue $\lambda_1(A)= \lambda_1(B)=a+b+c+d$ (with eigenvector $(1,1,1,1)^T$). We can also readily observe, using the eigenvector $(1,-1,1,-1)^T$, that $A$ has an eigenvalue $\lambda_2(A)=(a-b)+(c-d)$ while $B$ has an eigenvalue of equal magnitude and opposite sign, $\lambda_2(B)=(b-a)+(d-c).$
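Both eigenpairs can be checked directly: multiplying each matrix by $(1,1,1,1)^T$ recovers the common row-sum eigenvalue, and multiplying by $(1,-1,1,-1)^T$ recovers the second eigenvalue in each case. A quick NumPy sketch (again with arbitrary values for $a$, $b$, $c$, $d$):

```python
import numpy as np

# Arbitrary test values for a, b, c, d.
a, b, c, d = 1.0, 2.0, -0.5, 3.0
A = np.array([[a, b, c, d],
              [b, c, d, a],
              [c, d, a, b],
              [d, a, b, c]])
B = np.array([[b, c, d, a],
              [c, d, a, b],
              [d, a, b, c],
              [a, b, c, d]])

ones = np.ones(4)                   # eigenvector for the common row-sum eigenvalue
alt = np.array([1., -1., 1., -1.])  # eigenvector for the second eigenvalue

assert np.allclose(A @ ones, (a + b + c + d) * ones)    # lambda_1(A)
assert np.allclose(B @ ones, (a + b + c + d) * ones)    # lambda_1(B)
assert np.allclose(A @ alt, ((a - b) + (c - d)) * alt)  # lambda_2(A)
assert np.allclose(B @ alt, ((b - a) + (d - c)) * alt)  # lambda_2(B)
```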
Next, comparing the trace of $A$ with $\lambda_1(A)+\lambda_2(A)$ (both equal $2(a+c)$), we see that its remaining two eigenvalues sum to zero and hence have equal magnitude and opposite sign. The same holds for $B$. We also note that since $A$ can be obtained from $B$ by performing three row interchanges ($P$ is a $4$-cycle, an odd permutation), we have that $\det(A)=-\det(B).$ Using the elementary fact that the determinant of a matrix equals the product of its eigenvalues, together with our explicit determination of $\lambda_1(A),$ $\lambda_2(A),$ $\lambda_1(B),$ and $\lambda_2(B),$ we can then conclude (at least when $\lambda_1\lambda_2 \neq 0$; the degenerate case follows by a continuity argument) that the remaining two eigenvalues of $A$ are the same as the remaining two eigenvalues of $B.$
So, without much work we have shown (in the special case where $e = a$, $f = b$, $g = c$, and $h = d$) that $A$ and $B$ share three eigenvalues, while $A$'s fourth eigenvalue, $\lambda_2(A)$, is equal in magnitude and opposite in sign to $B$'s fourth eigenvalue, $\lambda_2(B)$. With a little additional effort, we can in fact show that the remaining two shared eigenvalues have values $\pm \sqrt{(a-c)^2+(b-d)^2}.$
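All of the above can be confirmed numerically in one go. The sketch below (arbitrary values for $a$, $b$, $c$, $d$; NumPy's `eigvalsh` returns the eigenvalues of a symmetric matrix in ascending order) checks the determinant relation and the full spectra of both matrices:

```python
import numpy as np

# Arbitrary test values for a, b, c, d.
a, b, c, d = 1.0, 2.0, -0.5, 3.0
A = np.array([[a, b, c, d],
              [b, c, d, a],
              [c, d, a, b],
              [d, a, b, c]])
B = np.array([[b, c, d, a],
              [c, d, a, b],
              [d, a, b, c],
              [a, b, c, d]])

# Three row interchanges separate A and B, so their determinants differ by a sign.
assert np.isclose(np.linalg.det(A), -np.linalg.det(B))

# Full spectra: lambda_1 and the pair +/- mu are shared; lambda_2 flips sign.
mu = np.sqrt((a - c)**2 + (b - d)**2)
expected_A = np.sort([a + b + c + d, (a - b) + (c - d), mu, -mu])
expected_B = np.sort([a + b + c + d, (b - a) + (d - c), mu, -mu])

assert np.allclose(np.linalg.eigvalsh(A), expected_A)
assert np.allclose(np.linalg.eigvalsh(B), expected_B)
```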
So, this provides an answer for one class of matrices $A$ and $B$ satisfying your question's broad constraint. Are there other subclasses that you'd be particularly interested in investigating?