I'm wondering if it is possible to have an eigenvector equation with matrices, i.e. whether it is possible to use "standard methods" to solve an equation like this:
$$ \pmb{A}\pmb{V}=\lambda\pmb{V} $$ where $\pmb{A}$, $\pmb{V}$ are (symmetric) matrices and $\lambda$ is the eigenvalue; $\pmb{V}$ is the "eigenmatrix". Specifically, I have to solve the matrix equation $$ \pmb{A}\pmb{V}\pmb{A}=\lambda\pmb{V} $$ and I would like (if possible) to link the standard eigenvalues and eigenvectors of the matrix $\pmb{A}$ to the constant $\lambda$ and the matrix $\pmb{V}$.
I can't just say that the columns of the matrix $\pmb{V}$ are eigenvectors of $\pmb{A}$, because $\lambda$ is a single constant.
Yes, it's possible.
Note that the map $T:\Bbb R^{n \times n} \to \Bbb R^{n \times n}$ defined by $T(V) = AVA$ is a linear map from a vector space to itself. We can compute the eigenvalues/eigenvectors of this transformation by applying the usual techniques to the matrix of this linear transformation relative to some basis.
More specifically: let $e_1,\dots,e_n$ denote the standard basis of $\Bbb R^n$, so that $e_ie_j^T$ is the matrix with a $1$ in the $(i,j)$ entry and zeros everywhere else. If we select the basis $\mathcal B = \{e_je_i^T : 1 \leq i,j \leq n\}$, where the tuples $(i,j)$ are taken in lexicographical order, then we find that the matrix of this transformation is $A^T \otimes A$, where $\otimes$ denotes the Kronecker product. Equivalently, we can rewrite your equation using the vectorization operator as follows: $$ \operatorname{vec}[AVA] = \operatorname{vec}(\lambda V) \implies\\ \operatorname{vec}[AVA] = \lambda \operatorname{vec}(V) \implies\\ (A^T \otimes A)\operatorname{vec}(V) = \lambda \operatorname{vec}(V) $$ so that $V$ is an eigenmatrix of your transformation if and only if $\operatorname{vec}(V)$ (a column vector) is an eigenvector of the matrix $A^T \otimes A$.
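A quick numerical sanity check of the identity above (using the column-major convention for $\operatorname{vec}$, under which $\operatorname{vec}(AXB) = (B^T \otimes A)\operatorname{vec}(X)$), plus the link to the eigenpairs of $A$ you asked about: if $A$ is symmetric with $Av_i = \mu_i v_i$, then $V = v_iv_j^T$ is an eigenmatrix with $\lambda = \mu_i\mu_j$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
A = A + A.T  # make A symmetric

def vec(V):
    # Stack the columns of V (column-major / Fortran order),
    # matching the identity vec(AXB) = (B^T kron A) vec(X).
    return V.flatten(order="F")

V = rng.standard_normal((n, n))
lhs = vec(A @ V @ A)
rhs = np.kron(A.T, A) @ vec(V)
print(np.allclose(lhs, rhs))  # True

# Link to the eigenpairs of A: if A v_i = mu_i v_i and A is symmetric,
# then V = v_i v_j^T satisfies  A V A = mu_i mu_j V.
mu, Q = np.linalg.eigh(A)
i, j = 0, 2
Vij = np.outer(Q[:, i], Q[:, j])
print(np.allclose(A @ Vij @ A, mu[i] * mu[j] * Vij))  # True
```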
Restricting our search to symmetric matrices, which is to say defining $T$ as a transformation $T:\mathrm{Sym}_n \to \mathrm{Sym}_n$, where $\mathrm{Sym}_n$ denotes the set of symmetric $n \times n$ matrices, is also possible by computing the matrix of the linear transformation, but it doesn't play as nicely with the vectorization trick described above.
If you find the matrix of $T$ relative to the correct choice of basis (in this case, one such option would be $\mathcal B = \{e_je_i^T + e_ie_j^T: 1 \leq i \leq j \leq n\}$, where again the tuples $(i,j)$ are taken in lexicographical order), then you'll find that the matrix of $T$ turns out to be $\vee^2 A$, the second symmetric power of $A$. Concretely, we have $$ \vee^2A[i,j] = \operatorname{perm}\pmatrix{a_{ii} & a_{ij}\\ a_{ji}& a_{jj}} = a_{ii}a_{jj} + a_{ij}^2 $$ where $\operatorname{perm}$ denotes a matrix permanent.
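As a sketch of this restricted version, we can build the matrix of $T$ on $\mathrm{Sym}_n$ in the basis $\mathcal B$ above by hand and check (numerically) that its eigenvalues are the pairwise products $\mu_i\mu_j$ ($i \leq j$) of the eigenvalues of $A$; the `coords` helper below is my own bookkeeping, not part of the answer:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
A = A + A.T  # symmetric A

# Basis of Sym_n from above: B_{ij} = e_j e_i^T + e_i e_j^T for i <= j
# (for i == j this gives 2 e_i e_i^T; scaling doesn't affect eigenvalues).
pairs = [(i, j) for i in range(n) for j in range(i, n)]
basis = []
for i, j in pairs:
    E = np.zeros((n, n))
    E[i, j] += 1.0
    E[j, i] += 1.0
    basis.append(E)

def coords(S):
    """Coordinates of a symmetric matrix S in the basis above."""
    return np.array([S[i, j] if i != j else S[i, i] / 2.0 for i, j in pairs])

# Matrix of T(V) = A V A restricted to Sym_n, one column per basis matrix.
M = np.column_stack([coords(A @ B @ A) for B in basis])

# Its eigenvalues should be the pairwise products mu_i * mu_j (i <= j)
# of the eigenvalues of A.
mu = np.linalg.eigvalsh(A)
products = sorted(mu[i] * mu[j] for i, j in pairs)
print(np.allclose(sorted(np.linalg.eigvals(M).real), products))  # True
```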