I have the following optimization problem in $x \in \mathbb R^n$
$$\begin{array}{ll} \text{minimize} & \lambda_{\max} \left( x x^T \right)\\ \text{subject to} & Ax \leq b\end{array}$$
where matrix $A$ and vector $b$ are given.
I think this problem is non-convex because of the $x x^T$ term. Are there any relaxation techniques, or a way to rewrite the problem in convex optimization form?
Since the symmetric, positive semidefinite, rank-$1$ matrix $\mathrm x \mathrm x^\top$ has only one nonzero eigenvalue, namely $\| \mathrm x \|_2^2$ (note $(\mathrm x \mathrm x^\top) \mathrm x = \| \mathrm x \|_2^2 \, \mathrm x$, and $\lambda_{\max} = \| \mathrm x \|_2^2 = 0$ in the degenerate case $\mathrm x = \mathrm 0$), your optimization problem is equivalent to the following (convex) quadratic program:
$$\begin{array}{ll} \text{minimize} & \| \mathrm x \|_2^2\\ \text{subject to} & \mathrm A \mathrm x \leq \mathrm b\end{array}$$
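As a sanity check, the equivalent QP can be solved numerically with a general-purpose solver. The sketch below uses SciPy's SLSQP method on a small illustrative instance (the particular $A$ and $b$ here are made-up assumptions, chosen so the unconstrained minimizer $\mathrm x = \mathrm 0$ is infeasible):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical problem data: constraints x1 >= 1 and x2 >= 2,
# written in the form A x <= b.
A = np.array([[-1.0, 0.0],
              [0.0, -1.0]])
b = np.array([-1.0, -2.0])

# minimize ||x||_2^2  subject to  A x <= b.
# SLSQP expects inequality constraints as g(x) >= 0, so pass b - A x.
res = minimize(
    fun=lambda x: x @ x,
    x0=np.array([5.0, 5.0]),
    jac=lambda x: 2.0 * x,
    constraints={"type": "ineq", "fun": lambda x: b - A @ x},
    method="SLSQP",
)

x_opt = res.x  # minimum-norm feasible point, here (1, 2)

# The QP objective agrees with the original lambda_max objective:
lam_max = np.linalg.eigvalsh(np.outer(x_opt, x_opt)).max()
print(x_opt, lam_max)
```

For this instance the minimum-norm feasible point is $(1, 2)$, and $\lambda_{\max}(\mathrm x \mathrm x^\top) = \|\mathrm x\|_2^2 = 5$ there, confirming the equivalence on this example.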