Suppose you know there exists an $n\times n$ matrix $A=(a_{ij})$ whose eigenvalues (let's assume they're all distinct) are $\lambda_i$, $i=1,\ldots,n$. You know all those $\lambda_i$, but nothing about the matrix $A$ itself. That is, for each given $\lambda_i$, we know there must exist some corresponding eigenvector $\vec{x}_i$ such that
$$\begin{pmatrix}a_{11}&a_{12}&\cdots&a_{1n}\\a_{21}&a_{22}&\cdots&a_{2n}\\\vdots&\vdots&\ddots&\vdots\\a_{n1}&a_{n2}&\cdots&a_{nn}\end{pmatrix}\begin{pmatrix}x_{i1}\\x_{i2}\\\vdots\\x_{in}\end{pmatrix} = \lambda_i \begin{pmatrix}x_{i1}\\x_{i2}\\\vdots\\x_{in}\end{pmatrix}$$
Note that we also know nothing about the components of those eigenvectors $\vec{x}_i$. We're given only the set of $n$ eigenvalues $\lambda_i$, $i=1,\ldots,n$, and we'd like to determine the matrix $A$ that possesses them. Note that $A$ may or may not be Hermitian.
So, to what extent can the $a_{ij}$'s that comprise $A$ be determined solely from $A$'s eigenvalues? I'm mostly interested in the Hermitian case, but if the case of real $A$ and/or $A=A^T$ is easier, that would be a great help, too.
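For a concrete sense of the ambiguity involved, here's a quick numerical sketch (using numpy; the invertible matrix $P$ below is an arbitrary choice of mine, just for illustration). It builds two matrices with the identical spectrum $\{1,2,3\}$ but completely different entries, since $PDP^{-1}$ has the same eigenvalues as $D$ for any invertible $P$:

```python
import numpy as np

# Two matrices with the same eigenvalues {1, 2, 3}:
# a diagonal matrix D, and a similarity transform P D P^{-1}
# (P is an arbitrarily chosen invertible matrix).
D = np.diag([1.0, 2.0, 3.0])
P = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
A = P @ D @ np.linalg.inv(P)

# Same spectrum...
print(np.sort(np.linalg.eigvals(D).real))  # [1. 2. 3.]
print(np.sort(np.linalg.eigvals(A).real))  # [1. 2. 3.]
# ...but different entries.
print(np.allclose(A, D))                   # False
```

So any answer would presumably have to describe what all matrices sharing a given spectrum have in common, rather than pin down individual entries $a_{ij}$.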